Diffstat (limited to 'docs/user_guide/aruco_marker_pipeline')
-rw-r--r--docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md37
-rw-r--r--docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md190
-rw-r--r--docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md132
-rw-r--r--docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md68
-rw-r--r--docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md133
-rw-r--r--docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md168
-rw-r--r--docs/user_guide/aruco_marker_pipeline/aruco_markers_description.md120
-rw-r--r--docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md153
-rw-r--r--docs/user_guide/aruco_marker_pipeline/introduction.md29
-rw-r--r--docs/user_guide/aruco_marker_pipeline/pose_estimation.md84
10 files changed, 1114 insertions, 0 deletions
diff --git a/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md b/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md
new file mode 100644
index 0000000..410e2d7
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md
@@ -0,0 +1,37 @@
+Improve ArUco markers detection
+===============================
+
+As explained in the [OpenCV ArUco documentation](https://docs.opencv.org/4.x/d1/dcd/structcv_1_1aruco_1_1DetectorParameters.html), ArUco marker detection is highly configurable.
+
+## Load ArUcoDetector parameters
+
+[ArUcoCamera.detector.parameters](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector.Parameters) can be loaded thanks to a dedicated JSON entry.
+
+Here is an extract from the JSON [ArUcoCamera](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration file with ArUco detector parameters:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ "aruco_detector": {
+ "dictionary": "DICT_APRILTAG_16h5",
+ "parameters": {
+ "adaptiveThreshConstant": 10,
+ "useAruco3Detection": 1
+ }
+ },
+ ...
+```
+
+## Print ArUcoDetector parameters
+
+```python
+# Assuming ArUcoCamera is loaded
+...
+
+# Print all ArUcoDetector parameters
+print(aruco_camera.aruco_detector.parameters)
+
+# Print only modified ArUcoDetector parameters
+print(f'{aruco_camera.aruco_detector.parameters:modified}')
+```
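
To make the override behaviour concrete, here is a stdlib-only sketch of how such a *parameters* entry can be merged over detector defaults (the default values below are illustrative, not the actual OpenCV ones):

```python
# Illustrative defaults (not the actual OpenCV values)
default_parameters = {
    "adaptiveThreshConstant": 7,
    "useAruco3Detection": 0,
    "cornerRefinementMethod": 0,
}

# The "parameters" entry from the JSON extract above
json_parameters = {
    "adaptiveThreshConstant": 10,
    "useAruco3Detection": 1,
}

# Merge JSON values over defaults, as a detector loader could do
merged = {**default_parameters, **json_parameters}

# Recover which parameters differ from their defaults,
# similar to what the ':modified' format specifier prints
modified = {key: value for key, value in merged.items() if value != default_parameters[key]}
```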
diff --git a/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md b/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md
new file mode 100644
index 0000000..54d0c94
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md
@@ -0,0 +1,190 @@
+Calibrate optic parameters
+==========================
+
+A camera device has to be calibrated to compensate for its optical distortion.
+
+![Optic parameters calibration](../../../img/optic_calibration.png)
+
+## Print calibration board
+
+The first step to calibrate a camera is to create an [ArUcoBoard](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoBoard) like in the code below:
+
+```python
+from argaze.ArUcoMarkers import ArUcoMarkersDictionary, ArUcoBoard
+
+# Create ArUco dictionary
+aruco_dictionary = ArUcoMarkersDictionary.ArUcoMarkersDictionary('DICT_APRILTAG_16h5')
+
+# Create an ArUco board of 7 columns and 5 rows, with 5 cm squares containing 3 cm ArUco markers
+aruco_board = ArUcoBoard.ArUcoBoard(7, 5, 5, 3, aruco_dictionary)
+
+# Export ArUco board with 300 dpi resolution
+aruco_board.save('./calibration_board.png', 300)
+```
+
+!!! note
+    There is an **A3_DICT_APRILTAG_16h5_3cm_35cmx25cm.pdf** file located in the *./src/argaze/ArUcoMarkers/utils/* folder, ready to be printed on an A3 paper sheet.
+
+Let's print the calibration board before going further.
+
+## Capture board pictures
+
+Then, the calibration process needs many different captures of the [ArUcoBoard](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoBoard) through the camera. Each capture is passed to an [ArUcoDetector](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) instance to detect board corners, which are stored as calibration data into an [ArUcoOpticCalibrator](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoOpticCalibrator) for the final calibration process.
+
+![Calibration step](../../../img/optic_calibration_step.png)
+
+The sample of code below illustrates how to:
+
+* load all required ArGaze objects,
+* detect board corners into a Full HD camera video stream,
+* store detected corners as calibration data then,
+* once enough captures are made, process them to find optic parameters and,
+* finally, save optic parameters into a JSON file.
+
+```python
+from argaze.ArUcoMarkers import ArUcoMarkersDictionary, ArUcoOpticCalibrator, ArUcoBoard, ArUcoDetector
+
+# Create ArUco dictionary
+aruco_dictionary = ArUcoMarkersDictionary.ArUcoMarkersDictionary('DICT_APRILTAG_16h5')
+
+# Create ArUco optic calibrator
+aruco_optic_calibrator = ArUcoOpticCalibrator.ArUcoOpticCalibrator()
+
+# Create an ArUco board of 7 columns and 5 rows, with 5 cm squares containing 3 cm ArUco markers
+# Note: This board is the one expected during further board tracking
+expected_aruco_board = ArUcoBoard.ArUcoBoard(7, 5, 5, 3, aruco_dictionary)
+
+# Create ArUco detector
+aruco_detector = ArUcoDetector.ArUcoDetector(dictionary=aruco_dictionary)
+
+# Assuming that live Full HD (1920x1080) video stream is enabled
+...
+
+# Assuming there is a way to escape the while loop
+...
+
+ while video_stream.is_alive():
+
+ # Capture image from video stream
+ image = video_stream.read()
+
+ # Detect all board corners in image
+ aruco_detector.detect_board(image, expected_aruco_board, expected_aruco_board.markers_number)
+
+ # If all board corners are detected
+ if aruco_detector.board_corners_number() == expected_aruco_board.corners_number:
+
+ # Draw board corners to show that board tracking succeeded
+ aruco_detector.draw_board(image)
+
+ # Append tracked board data for further calibration processing
+ aruco_optic_calibrator.store_calibration_data(aruco_detector.board_corners(), aruco_detector.board_corners_identifier())
+
+# Start optic calibration processing for Full HD image resolution
+print('Calibrating optic...')
+optic_parameters = aruco_optic_calibrator.calibrate(expected_aruco_board, dimensions=(1920, 1080))
+
+if optic_parameters:
+
+ # Export optic parameters
+ optic_parameters.to_json('./optic_parameters.json')
+
+ print('Calibration succeeded: optic_parameters.json file exported.')
+
+else:
+
+ print('Calibration failed.')
+```
+
+Below is an *optic_parameters* JSON file example:
+
+```json
+{
+ "rms": 0.6688921504088245,
+ "dimensions": [
+ 1920,
+ 1080
+ ],
+ "K": [
+ [
+ 1135.6524381415752,
+ 0.0,
+ 956.0685325355497
+ ],
+ [
+ 0.0,
+ 1135.9272506869524,
+ 560.059099810324
+ ],
+ [
+ 0.0,
+ 0.0,
+ 1.0
+ ]
+ ],
+ "D": [
+ 0.01655492265003404,
+ 0.1985524264972037,
+ 0.002129965902489484,
+ -0.0019528582922179365,
+ -0.5792910353639452
+ ]
+}
+```
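
To give an intuition of the role of the *K* matrix above, here is a minimal pinhole projection sketch using its focal lengths and principal point (illustration only: distortion coefficients *D* are ignored and this is not ArGaze code):

```python
# Camera matrix values taken from the optic_parameters example above
K = [
    [1135.6524381415752, 0.0, 956.0685325355497],
    [0.0, 1135.9272506869524, 560.059099810324],
    [0.0, 0.0, 1.0],
]

def project(point, K):
    """Project a 3D point in camera space to pixel coordinates
    with the pinhole model, ignoring distortion."""
    x, y, z = point
    u = K[0][0] * x / z + K[0][2]  # fx * x / z + cx
    v = K[1][1] * y / z + K[1][2]  # fy * y / z + cy
    return u, v

# A point on the optical axis projects onto the principal point (cx, cy)
u, v = project((0.0, 0.0, 100.0), K)
```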
+
+## Load and display optic parameters
+
+[ArUcoCamera.detector.optic_parameters](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoOpticCalibrator.OpticParameters) can be loaded thanks to a dedicated JSON entry.
+
+Here is an extract from the JSON [ArUcoCamera](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration file where optic parameters are loaded and displayed:
+
+```json
+{
+ "name": "My FullHD Camera",
+ "size": [1920, 1080],
+ "aruco_detector": {
+ "dictionary": "DICT_APRILTAG_16h5",
+ "optic_parameters": {
+ "rms": 0.6688921504088245,
+ "dimensions": [
+ 1920,
+ 1080
+ ],
+ "K": [
+ [
+ 1135.6524381415752,
+ 0.0,
+ 956.0685325355497
+ ],
+ [
+ 0.0,
+ 1135.9272506869524,
+ 560.059099810324
+ ],
+ [
+ 0.0,
+ 0.0,
+ 1.0
+ ]
+ ],
+ "D": [
+ 0.01655492265003404,
+ 0.1985524264972037,
+ 0.002129965902489484,
+ -0.0019528582922179365,
+ -0.5792910353639452
+ ]
+ }
+ },
+ ...
+ "image_parameters": {
+ ...
+ "draw_optic_parameters_grid": {
+ "width": 192,
+ "height": 108,
+ "z": 100,
+ "point_size": 1,
+ "point_color": [0, 0, 255]
+ }
+ }
+``` \ No newline at end of file
diff --git a/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md b/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md
new file mode 100644
index 0000000..2eb64f8
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md
@@ -0,0 +1,132 @@
+Script the pipeline
+===================
+
+All ArUco marker pipeline objects are accessible from a Python script.
+This could be particularly useful for real-time AR interaction applications.
+
+## Load ArUcoCamera configuration from dictionary
+
+First of all, an [ArUcoCamera](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration can be loaded from a Python dictionary.
+
+```python
+from argaze.ArUcoMarkers import ArUcoCamera
+
+# Edit a dict with ArUcoCamera configuration
+configuration = {
+ "name": "My FullHD camera",
+ "size": (1920, 1080),
+ ...
+ "aruco_detector": {
+ ...
+ },
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ ...
+ }
+ },
+ ...
+ }
+ },
+ ...
+  },
+ "layers": {
+ "MyLayer": {
+ ...
+ },
+ ...
+ },
+ "image_parameters": {
+ ...
+ }
+}
+
+# Load ArUcoCamera
+aruco_camera = ArUcoCamera.ArUcoCamera.from_dict(configuration)
+
+# Do something with ArUcoCamera
+...
+```
+
+## Access ArUcoCamera and ArUcoScene attributes
+
+Then, once the configuration is loaded, it is possible to access its attributes: [read the ArUcoCamera code reference](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) to get a complete list of what is available.
+
+Thus, the [ArUcoCamera.scenes](../../../argaze.md/#argaze.ArFeatures.ArCamera) attribute gives access to each loaded ArUco scene and so to their attributes: [read the ArUcoScene code reference](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) to get a complete list of what is available.
+
+```python
+from argaze import ArFeatures
+
+# Assuming the ArUcoCamera is loaded
+...
+
+# Iterate over each ArUcoCamera scene
+for name, aruco_scene in aruco_camera.scenes.items():
+ ...
+```
+
+## Pipeline execution outputs
+
+The [ArUcoCamera.watch](../../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method gives access to data about pipeline execution.
+
+```python
+# Assuming that timestamped images are available
+...:
+
+ try:
+
+ # Watch image with ArUco camera
+ aruco_camera.watch(image, timestamp=timestamp)
+
+ # Do something with pipeline exception
+ except Exception as e:
+
+ ...
+
+ # Do something with detected_markers
+ ... aruco_camera.aruco_detector.detected_markers()
+
+```
+
+Let's understand the meaning of the available data.
+
+### *aruco_camera.aruco_detector.detected_markers()*
+
+A dictionary containing all detected markers, provided by the [ArUcoDetector](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector) class.
+
+## Setup ArUcoCamera image parameters
+
+Specific [ArUcoCamera.image](../../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can be configured thanks to a Python dictionary.
+
+```python
+# Assuming ArUcoCamera is loaded
+...
+
+# Edit a dict with ArUcoCamera image parameters
+image_parameters = {
+ "draw_detected_markers": {
+ ...
+ },
+ "draw_scenes": {
+ ...
+ },
+ "draw_optic_parameters_grid": {
+ ...
+ },
+ ...
+}
+
+# Pass image parameters to ArUcoCamera
+aruco_camera_image = aruco_camera.image(**image_parameters)
+
+# Do something with ArUcoCamera image
+...
+```
+
+!!! note
+ [ArUcoCamera](../../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) inherits from [ArFrame](../../../argaze.md/#argaze.ArFeatures.ArFrame) and so, benefits from all image parameters described in [gaze analysis pipeline visualisation section](../../gaze_analysis_pipeline/visualisation.md). \ No newline at end of file
diff --git a/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md b/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md
new file mode 100644
index 0000000..23ea550
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md
@@ -0,0 +1,68 @@
+Describe 3D AOI
+===============
+
+Now that the [scene pose is estimated](aruco_markers_description.md) thanks to the ArUco markers description, [areas of interest (AOI)](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) need to be described in the same 3D referential.
+
+In the example scene, the two screens, the control panel and the window are considered as areas of interest.
+
+![3D AOI description](../../img/aoi_3d_description.png)
+
+All AOI need to be described from the same origin as the markers in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
+
+* +X is pointing to the right,
+* +Y is pointing to the top,
+* +Z is pointing backward.
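
A quick way to check that these axes form a right-handed frame: the cross product of the +X and +Y directions must give +Z. A stdlib sketch:

```python
def cross(a, b):
    """Cross product of two 3D vectors."""
    return (
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    )

x_axis = (1, 0, 0)  # +X to the right
y_axis = (0, 1, 0)  # +Y to the top
z_axis = cross(x_axis, y_axis)  # must equal +Z in a right-handed frame
```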
+
+!!! warning
+ All AOI spatial values must be given in **centimeters**.
+
+### Edit OBJ file description
+
+The OBJ file format can be exported from most 3D editors.
+
+``` obj
+o Left_Screen
+v 0.000000 -0.000000 -0.000000
+v 15.000000 -0.000000 -0.000000
+v 0.000000 18.963333 -6.355470
+v 15.000000 18.963333 -6.355470
+f 1 2 4 3
+o Right_Screen
+v 20.000000 0.000000 -0.000000
+v 35.000000 0.000000 -0.000000
+v 20.000000 18.963337 -6.355472
+v 35.000000 18.963337 -6.355472
+f 5 6 8 7
+o Control_Panel
+v 49.500000 30.000000 18.333333
+v 55.500000 30.000000 18.333333
+v 49.500000 38.000000 18.333333
+v 55.500000 38.000000 18.333333
+f 9 10 12 11
+o Window
+v -57.800000 5.500000 -33.500000
+v 46.000000 15.500000 -35.000000
+v 1.500000 53.000000 -1.000000
+v 50.200000 61.000000 6.000000
+v -35.850000 35.000000 -15.000000
+f 13 14 16 15 17
+```
+
+Here are common OBJ file features needed to describe AOI:
+
+* Object lines (starting with the *o* key) indicate the AOI name.
+* Vertex lines (starting with the *v* key) indicate the AOI vertices.
+* Face lines (starting with the *f* key) link vertices together.
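
As an illustration of how these *o*, *v* and *f* lines fit together, here is a minimal parsing sketch mapping each object name to its face vertices (ArGaze ships its own OBJ loader; this is only to clarify the format):

```python
def parse_obj_aoi(obj_text):
    """Minimal parser: map each object name to the vertices of its face.
    Faces reference vertices by their 1-based global index."""
    aoi, vertices, name = {}, [], None
    for line in obj_text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == 'o':
            name = parts[1]
        elif parts[0] == 'v':
            vertices.append(tuple(float(c) for c in parts[1:4]))
        elif parts[0] == 'f' and name:
            aoi[name] = [vertices[int(i) - 1] for i in parts[1:]]
    return aoi

# Parse the Left_Screen object from the example above
aoi = parse_obj_aoi("""o Left_Screen
v 0.0 0.0 0.0
v 15.0 0.0 0.0
v 0.0 18.963333 -6.355470
v 15.0 18.963333 -6.355470
f 1 2 4 3""")
```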
+
+### Edit JSON file description
+
+The JSON file format allows describing AOI vertices directly.
+
+``` json
+{
+ "Left_Screen": [[0, 0, 0], [15, 0, 0], [0, 18.963333, -6.355470], [15, 18.963333, -6.355470]],
+ "Right_Screen": [[20, 0, 0], [35, 0, 0], [20, 18.963337, -6.355472], [35, 18.963337, -6.355472]],
+ "Control_Panel": [[49.5, 30, 18.333333], [55.5, 30, 18.333333], [49.5, 38, 18.333333], [55.5, 38, 18.333333]],
+ "Window": [[-57.8, 5.5, -33.5], [46, 15.5, -35], [1.5, 53, -1], [50.2, 61, 6], [-35.85, 35, -15]]
+}
+```
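
Because this is plain JSON, a description can be sanity-checked with the standard *json* module before being given to ArGaze. A minimal validation sketch (the polygon rule below is an assumption for illustration):

```python
import json

aoi_description = json.loads('''{
  "Left_Screen": [[0, 0, 0], [15, 0, 0], [0, 18.963333, -6.355470], [15, 18.963333, -6.355470]],
  "Window": [[-57.8, 5.5, -33.5], [46, 15.5, -35], [1.5, 53, -1], [50.2, 61, 6], [-35.85, 35, -15]]
}''')

# Each AOI must be a polygon: at least 3 vertices, each with 3 coordinates
for name, vertices in aoi_description.items():
    assert len(vertices) >= 3, f'{name}: not enough vertices'
    assert all(len(vertex) == 3 for vertex in vertices), f'{name}: vertices must be 3D'
```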
diff --git a/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md b/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md
new file mode 100644
index 0000000..cf4a07e
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md
@@ -0,0 +1,133 @@
+Define a 3D AOI as a frame
+==========================
+
+When a 3D AOI of the scene contains other coplanar 3D AOI, like a screen with GUI elements displayed on it, it is better to describe them as 2D AOI inside a 2D coordinate system related to the containing 3D AOI.
+
+![3D AOI frame](../../img/aruco_camera_aoi_frame.png)
+
+## Add ArFrame to ArUcoScene
+
+The [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) class defines a rectangular area where timestamped gaze positions are projected in and inside which they need to be analyzed.
+
+Here is the previous extract where the "Left_Screen" and "Right_Screen" AOI are defined as frames in the [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) configuration:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ "Left_Screen": [[0, 0, 0], [15, 0, 0], [0, 18.963333, -6.355470], [15, 18.963333, -6.355470]],
+ "Right_Screen": [[20, 0, 0], [35, 0, 0], [20, 18.963337 ,-6.355472], [35, 18.963337, -6.355472]],
+ "Control_Panel": [[49.5, 30, 18.333333], [55.5, 30, 18.333333], [49.5, 38, 18.333333], [55.5, 38, 18.333333]],
+ "Window": [[-57.8, 5.5, -33.5], [46, 15.5, -35], [1.5, 53, -1], [50.2, 61, 6], [-35.85, 35, -15]]
+ }
+ }
+ },
+ "frames": {
+ "Left_Screen": {
+ "size": [768, 1024],
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ "LeftPanel": {
+ "Rectangle": {
+ "x": 0,
+ "y": 0,
+ "width": 768,
+ "height": 180
+ }
+ },
+ "CircularWidget": {
+ "Circle": {
+ "cx": 384,
+ "cy": 600,
+ "radius": 180
+ }
+ }
+ }
+ }
+ }
+ },
+ "Right_Screen": {
+ "size": [768, 1024],
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ "GeoSector": [[724, 421], [537, 658], [577, 812], [230, 784], [70, 700], [44, 533], [190, 254], [537, 212]]
+ }
+ }
+ }
+ }
+ }
+ }
+ }
+ ...
+}
+```
+Now, let's understand the meaning of each JSON entry.
+
+### *frames*
+
+An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) instance can contain multiple [ArFrames](../../argaze.md/#argaze.ArFeatures.ArFrame) stored by name.
+
+### Left_Screen & Right_Screen
+
+The names of the 3D AOI **and** their related [ArFrames](../../argaze.md/#argaze.ArFeatures.ArFrame). Basically useful for visualisation purposes.
+
+!!! warning "AOI / Frame names policy"
+
+ An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layer 3D AOI is defined as an [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) frame, **provided they have the same name**.
+
+!!! warning "Layer name policy"
+
+    An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) frame layer is projected into an [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layer, **provided they have the same name**.
+
+!!! note
+
+ [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) frame layers are projected into their dedicated [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layers when the JSON configuration file is loaded.
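
The name-matching policy can be pictured as a simple set intersection over the names used in this example (sketch only):

```python
# The 3D AOI defined in the scene layer...
scene_layer_aoi = {"Left_Screen", "Right_Screen", "Control_Panel", "Window"}

# ...and the frames declared in the scene
scene_frames = {"Left_Screen", "Right_Screen"}

# Only 3D AOI whose name matches a declared frame name act as frame supports
aoi_defined_as_frames = scene_layer_aoi & scene_frames
```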
+
+## Pipeline execution
+
+### Map ArUcoCamera image into ArUcoScenes frames
+
+After the camera image is passed to the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method, it is possible to apply a perspective transformation in order to project the watched image into each [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) [frame background](../../argaze.md/#argaze.ArFeatures.ArFrame) image.
+
+```python
+# Assuming that Full HD (1920x1080) timestamped images are available
+...:
+
+ # Detect ArUco markers, estimate scene pose then, project 3D AOI into camera frame
+ aruco_camera.watch(image, timestamp=timestamp)
+
+ # Map watched image into ArUcoScenes frames background
+ aruco_camera.map(timestamp=timestamp)
+```
+
+### Analyse timestamped gaze positions into ArUcoScene frames
+
+[ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) frames benefit from all the services described in the [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
+
+!!! note
+
+    Timestamped gaze positions passed to the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method are projected into [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) frames if applicable.
+
+### Display each ArUcoScene frame
+
+All [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) frame images can be displayed like any [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image.
+
+```python
+ ...
+
+    # Display all ArUcoScene frames
+ for frame in aruco_camera.scene_frames:
+
+ ... frame.image()
+``` \ No newline at end of file
diff --git a/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md b/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md
new file mode 100644
index 0000000..64f5fc8
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md
@@ -0,0 +1,168 @@
+Project 3D AOI into camera frame
+================================
+
+Once [ArUcoScene pose is estimated](pose_estimation.md) and [3D AOI are described](aoi_3d_description.md), AOI can be projected into [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame.
+
+![3D AOI projection](../../img/aruco_camera_aoi_projection.png)
+
+## Add ArLayer to ArUcoScene to load 3D AOI
+
+The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class allows loading a 3D AOI description.
+
+Here is the previous extract where one layer is added to [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) configuration:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ "Left_Screen": [[0, 0, 0], [15, 0, 0], [0, 18.963333, -6.355470], [15, 18.963333, -6.355470]],
+ "Right_Screen": [[20, 0, 0], [35, 0, 0], [20, 18.963337, -6.355472], [35, 18.963337, -6.355472]],
+ "Control_Panel": [[49.5, 30, 18.333333], [55.5, 30, 18.333333], [49.5, 38, 18.333333], [55.5, 38, 18.333333]],
+ "Window": [[-57.8, 5.5, -33.5], [46, 15.5, -35], [1.5, 53, -1], [50.2, 61, 6], [-35.85, 35, -15]]
+ }
+ }
+ }
+ }
+ }
+ ...
+}
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### *layers*
+
+An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) stored by name.
+
+### MyLayer
+
+The name of an [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically useful for visualisation purposes.
+
+### *aoi_scene*
+
+The set of 3D AOI in the layer, as defined in the [3D AOI description chapter](aoi_3d_description.md).
+
+## Add ArLayer to ArUcoCamera to project 3D AOI into
+
+Here is the previous extract where one layer is added to [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration and displayed:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ ...
+ }
+ }
+ }
+ }
+ },
+ "layers": {
+ "MyLayer": {}
+ }
+ ...
+ "image_parameters": {
+ ...
+ "draw_layers": {
+ "MyLayer": {
+ "draw_aoi_scene": {
+ "draw_aoi": {
+ "color": [255, 255, 255],
+ "border_size": 1
+ }
+ }
+ }
+ }
+ }
+}
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### *layers*
+
+An [ArUcoCamera](../../argaze.md/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) stored by name.
+
+### MyLayer
+
+The name of an [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically useful for visualisation purposes.
+
+!!! warning "Layer name policy"
+
+    An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layer is projected into an [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer, **provided they have the same name**.
+
+!!! note
+
+ [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layers are projected into their dedicated [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layers when calling the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method.
+
+## Add AOI analysis features to ArUcoCamera layer
+
+When a scene layer is projected into a camera layer, the scene's 3D AOI are transformed into the camera's 2D AOI.
+
+Therefore, the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer benefits from all the services described in the [AOI analysis pipeline section](../gaze_analysis_pipeline/aoi_analysis.md).
+
+Here is the previous extract where AOI matcher, AOI scan path and AOI scan path analyzers are added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ ...
+ }
+ }
+ }
+ }
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_matcher": {
+ "DeviationCircleCoverage": {
+ "coverage_threshold": 0.5
+ }
+ },
+ "aoi_scan_path": {
+ "duration_max": 30000
+ },
+ "aoi_scan_path_analyzers": {
+ "Basic": {},
+ "TransitionMatrix": {},
+ "NGram": {
+ "n_min": 3,
+ "n_max": 5
+ }
+ }
+ }
+ }
+ ...
+}
+```
+
+!!! warning
+
+    Adding a scan path and scan path analyzers to an [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer doesn't make sense, as the space viewed through the camera frame doesn't necessarily reflect the space the gaze is covering.
diff --git a/docs/user_guide/aruco_marker_pipeline/aruco_markers_description.md b/docs/user_guide/aruco_marker_pipeline/aruco_markers_description.md
new file mode 100644
index 0000000..66a0581
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/aruco_markers_description.md
@@ -0,0 +1,120 @@
+Set up ArUco markers
+====================
+
+First of all, ArUco markers need to be printed and placed into the scene.
+
+Here is an example scene where markers are surrounding a workspace with two screens, a control panel and a window.
+
+![Scene](../../img/scene.png)
+
+## Print ArUco markers from an ArUco dictionary
+
+ArUco markers always belong to a set of markers called an ArUco markers dictionary.
+
+![ArUco dictionaries](../../img/aruco_dictionaries.png)
+
+Many ArUco dictionaries exist, with properties concerning the format, the number of markers and the difference between markers to avoid detection errors.
+
+Here is the documentation [about ArUco markers dictionaries](https://docs.opencv.org/3.4/d9/d6a/group__aruco.html#gac84398a9ed9dd01306592dd616c2c975).
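
For AprilTag families, the dictionary name itself encodes two of these properties: *16h5* means 16 bits per marker with a minimum Hamming distance of 5 between any two markers. A small parsing sketch of this naming convention (not an ArGaze API):

```python
import re

def parse_apriltag_dictionary_name(name):
    """Split a name like 'DICT_APRILTAG_16h5' into its encoded properties:
    the number of bits per marker and the minimum Hamming distance
    between two markers of the family."""
    match = re.fullmatch(r'DICT_APRILTAG_(\d+)h(\d+)', name)
    if not match:
        raise ValueError(f'not an AprilTag dictionary name: {name}')
    return int(match.group(1)), int(match.group(2))

bits, hamming = parse_apriltag_dictionary_name('DICT_APRILTAG_16h5')
```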
+
+The creation of [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) pictures from a dictionary is illustrated in the code below:
+
+```python
+from argaze.ArUcoMarkers import ArUcoMarkersDictionary
+
+# Create a dictionary of specific April tags
+aruco_dictionary = ArUcoMarkersDictionary.ArUcoMarkersDictionary('DICT_APRILTAG_16h5')
+
+# Export marker n°5 as 3.5 cm picture with 300 dpi resolution
+aruco_dictionary.create_marker(5, 3.5).save('./markers/', 300)
+
+# Export all dictionary markers as 3.5 cm pictures with 300 dpi resolution
+aruco_dictionary.save('./markers/', 3.5, 300)
+```
+
+!!! note
+    There is an **A4_DICT_APRILTAG_16h5_5cm_0-7.pdf** file located in the *./src/argaze/ArUcoMarkers/utils/* folder, ready to be printed on an A4 paper sheet.
+
+Let's print some of them before going further.
+
+!!! warning
+ Print markers with a blank zone around them to help in their detection.
+
+## Describe ArUco marker places
+
+Once [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) pictures are placed into a scene, it is possible to describe their 3D places in a file.
+
+![ArUco markers description](../../img/aruco_markers_description.png)
+
+Wherever the origin point is, all marker places need to be described in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
+
+* +X is pointing to the right,
+* +Y is pointing to the top,
+* +Z is pointing backward.
+
+!!! warning
+ All ArUco markers spatial values must be given in **centimeters**.
+
+### Edit OBJ file description
+
+OBJ file format could be exported from most 3D editors.
+
+``` obj
+o DICT_APRILTAG_16h5#0_Marker
+v 15.000000 0.378741 0.330527
+v 20.000000 0.378741 0.330527
+v 15.000000 5.120359 -1.255996
+v 20.000000 5.120359 -1.255996
+f 1 2 4 3
+o DICT_APRILTAG_16h5#1_Marker
+v 43.500000 31.428055 18.333317
+v 48.500000 31.428055 18.333317
+v 43.500000 36.428055 18.333317
+v 48.500000 36.428055 18.333317
+f 5 6 8 7
+o DICT_APRILTAG_16h5#2_Marker
+v 38.500000 2.678055 5.498381
+v 43.500000 2.678055 5.498381
+v 38.500000 5.178055 1.168253
+v 43.500000 5.178055 1.168253
+f 9 10 12 11
+```
+
+Here are common OBJ file features needed to describe ArUco marker places:
+
+* Object lines (starting with the *o* key) indicate the marker dictionary and id, following this format: **DICTIONARY**#**ID**\_Marker.
+* Vertex lines (starting with the *v* key) indicate the marker corners. The marker size will be automatically deduced from the geometry.
+* Face lines (starting with the *f* key) link vertex indices together.
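
The size deduction rule can be checked against the OBJ example above: the distance between two adjacent corners of marker #0 gives its size, here 5 cm (stdlib sketch):

```python
import math

# Corners of DICT_APRILTAG_16h5#0_Marker from the OBJ example above
corners = [
    (15.000000, 0.378741, 0.330527),
    (20.000000, 0.378741, 0.330527),
    (15.000000, 5.120359, -1.255996),
    (20.000000, 5.120359, -1.255996),
]

# The marker size is the length of one edge, i.e. the distance
# between two adjacent corners
size = math.dist(corners[0], corners[1])
```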
+
+!!! warning
+ Markers have to belong to the same dictionary.
+
+!!! note
+    Markers can have different sizes.
+
+### Edit JSON file description
+
+The JSON file format allows describing marker places using translation and Euler angle rotation vectors.
+
+``` json
+{
+ "dictionary": "DICT_APRILTAG_16h5",
+ "places": {
+ "0": {
+ "translation": [17.5, 2.75, -0.5],
+ "rotation": [-18.5, 0, 0],
+ "size": 5
+ },
+ "1": {
+ "translation": [46, 34, 18.333],
+ "rotation": [0, 70, 0],
+ "size": 5
+ },
+ "2": {
+ "translation": [41, 4, 3.333],
+ "rotation": [-60, 0, 0],
+ "size": 5
+ }
+ }
+}
+```
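
To clarify what such a *rotation* vector means, here is a sketch building a rotation matrix from Euler angles in degrees. The X-then-Y-then-Z composition order is an assumption made for illustration; check the ArGaze loader for the actual convention:

```python
import math

def euler_to_matrix(rx, ry, rz):
    """Build a 3x3 rotation matrix from Euler angles in degrees,
    composing rotations about X, then Y, then Z (assumed order)."""
    ax, ay, az = (math.radians(a) for a in (rx, ry, rz))
    cx, sx = math.cos(ax), math.sin(ax)
    cy, sy = math.cos(ay), math.sin(ay)
    cz, sz = math.cos(az), math.sin(az)
    rot_x = [[1, 0, 0], [0, cx, -sx], [0, sx, cx]]
    rot_y = [[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]
    rot_z = [[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]

    def matmul(a, b):
        return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

    return matmul(matmul(rot_z, rot_y), rot_x)

# A null rotation gives the identity matrix
identity = euler_to_matrix(0, 0, 0)
```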
diff --git a/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md b/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md
new file mode 100644
index 0000000..dd36ed3
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md
@@ -0,0 +1,153 @@
+Load and execute pipeline
+=========================
+
+Once [ArUco markers are placed into a scene](aruco_markers_description.md), they can be detected thanks to [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class.
+
+As [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class also benefits from all the services described in [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
+
+![ArUco camera frame](../../img/aruco_camera_frame.png)
+
+## Load JSON configuration file
+
+An [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) pipeline can be loaded from a JSON configuration file thanks to the [argaze.load](../../argaze.md/#argaze.load) package method.
+
+Here is a simple JSON [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration file example:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ "aruco_detector": {
+ "dictionary": "DICT_APRILTAG_16h5"
+ },
+ "gaze_movement_identifier": {
+ "DispersionThresholdIdentification": {
+ "deviation_max_threshold": 25,
+ "duration_min_threshold": 150
+ }
+ },
+ "image_parameters": {
+ "background_weight": 1,
+ "draw_detected_markers": {
+ "color": [0, 255, 0],
+ "draw_axes": {
+ "thickness": 3
+ }
+ },
+ "draw_gaze_positions": {
+ "color": [0, 255, 255],
+ "size": 2
+ },
+ "draw_fixations": {
+ "deviation_circle_color": [255, 0, 255],
+ "duration_border_color": [127, 0, 127],
+ "duration_factor": 1e-2
+ },
+ "draw_saccades": {
+ "line_color": [255, 0, 255]
+ }
+ }
+}
+```
+
+Then, here is how to load the JSON file:
+
+```python
+import argaze
+
+# Load ArUcoCamera
+with argaze.load('./configuration.json') as aruco_camera:
+
+ # Do something with ArUcoCamera
+ ...
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### *name - inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*
+
+The name of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame. Basically useful for visualisation purposes.
+
+### *size - inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*
+
+The size of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame in pixels. Be aware that gaze positions have to be in the same range of values to be projected into it.
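
For instance, if a gaze device delivers normalized positions in the 0..1 range, they would have to be scaled to this size first. A sketch (the function name is illustrative, not an ArGaze API):

```python
frame_size = (1920, 1080)

def to_frame_pixels(normalized_gaze, frame_size):
    """Scale a normalized (0..1) gaze position, as some eye trackers
    deliver it, to the pixel range of the camera frame."""
    return (normalized_gaze[0] * frame_size[0], normalized_gaze[1] * frame_size[1])

# The center of the tracker range maps to the center of the frame
pixel_gaze = to_frame_pixels((0.5, 0.5), frame_size)
```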
+
+### *aruco_detector*
+
+The first [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) pipeline step is to detect ArUco markers inside the input image.
+
+![ArUco markers detection](../../img/aruco_camera_markers_detection.png)
+
+The [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector) is in charge of detecting all markers from a specific dictionary.
+
+!!! warning "Mandatory"
+ JSON *aruco_detector* entry is mandatory.
+
+### *gaze_movement_identifier - inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*
+
+The first [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step, dedicated to identifying fixations or saccades from consecutive timestamped gaze positions.
+
+![Gaze movement identification](../../img/aruco_camera_gaze_movement_identification.png)
+
+### *image_parameters - inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*
+
+The usual [ArFrame visualisation parameters](../gaze_analysis_pipeline/visualisation.md) plus an additional *draw_detected_markers* entry.
+
+## Pipeline execution
+
+### Detect ArUco markers, estimate scene pose and project 3D AOI
+
+Pass each camera image to the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method to execute the whole pipeline dedicated to ArUco marker detection, scene pose estimation and 3D AOI projection.
+
+!!! warning "Mandatory"
+
+ [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method must be called from a *try* block to catch pipeline exceptions.
+
+```python
+# Assuming that Full HD (1920x1080) timestamped images are available
+for timestamp, image in ...:
+
+    try:
+
+        # Detect ArUco markers, estimate scene pose then project 3D AOI into the camera frame
+        aruco_camera.watch(image, timestamp=timestamp)
+
+    except Exception as e:
+
+        # Do something with pipeline exception
+        ...
+
+    # Display the ArUcoCamera frame image to show detected ArUco markers, scene pose, 2D AOI projection and ArFrame visualisation
+    ... aruco_camera.image()
+```
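
The loop above assumes timestamped images are already available. When frames come from a recorded video with a known frame rate, a millisecond timestamp can be derived from the frame index; the helper name below is ours for illustration, not an ArGaze API:

```python
def frame_timestamp_ms(frame_index, fps):
    """Derive a millisecond timestamp from a video frame index and frame rate."""
    return int(frame_index * 1000 / fps)

# At 25 frames per second, frame 50 is 2 seconds into the video
print(frame_timestamp_ms(50, 25))  # 2000
```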
+
+### Analyse timestamped gaze positions into camera frame
+
+As mentioned above, [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and so, benefits from all the services described in [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
+
+Particularly, timestamped gaze positions can be passed one by one to [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole pipeline dedicated to gaze analysis.
+
+!!! warning "Mandatory"
+
+ [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method must be called from a *try* block to catch pipeline exceptions.
+
+```python
+# Assuming that timestamped gaze positions are available
+for timestamped_gaze_position in ...:
+
+    try:
+
+        # Look at the ArUcoCamera frame with a timestamped gaze position
+        aruco_camera.look(timestamped_gaze_position)
+
+    except Exception as e:
+
+        # Do something with pipeline exception
+        ...
+```
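
Gaze positions must be fed in chronological order. If recorded samples may arrive out of order, sort them by timestamp before feeding the pipeline; a minimal sketch with hypothetical `(timestamp, position)` tuples:

```python
# Hypothetical recorded samples as (timestamp in ms, (x, y) pixel position) tuples
samples = [(120, (640, 320)), (40, (600, 300)), (80, (620, 310))]

# Sort by timestamp so each sample can be passed to look() in order
samples.sort(key=lambda sample: sample[0])

print([timestamp for timestamp, _ in samples])  # [40, 80, 120]
```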
+
+!!! note ""
+
+    At this point, the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method only detects ArUco markers, and the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArCamera.look) method only processes gaze movement identification without any AOI support, as no scene description is provided in the JSON configuration file.
+
+ Read the next chapters to learn [how to estimate scene pose](pose_estimation.md), [how to describe 3D scene's AOI](aoi_3d_description.md) and [how to project them into camera frame](aoi_3d_projection.md). \ No newline at end of file
diff --git a/docs/user_guide/aruco_marker_pipeline/introduction.md b/docs/user_guide/aruco_marker_pipeline/introduction.md
new file mode 100644
index 0000000..7e662f7
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/introduction.md
@@ -0,0 +1,29 @@
+Overview
+========
+
+This section explains how to build augmented reality pipelines based on [ArUco Markers technology](https://www.sciencedirect.com/science/article/abs/pii/S0031320314000235) for various use cases.
+
+The OpenCV library provides a module to detect fiducial markers in a picture and estimate their poses.
+
+![OpenCV ArUco markers](../../img/opencv_aruco.png)
+
+The ArGaze [ArUcoMarkers submodule](../../argaze.md/#argaze.ArUcoMarkers) eases marker creation, marker detection and 3D scene pose estimation through a set of high-level classes.
+
+First, let's look at the schema below: it gives an overview of the main notions involved in the following chapters.
+
+![ArUco marker pipeline](../../img/aruco_marker_pipeline.png)
+
+To build your own ArUco marker pipeline, you need to know:
+
+* [How to setup ArUco markers into a scene](aruco_markers_description.md),
+* [How to load and execute ArUco marker pipeline](configuration_and_execution.md),
+* [How to estimate scene pose](pose_estimation.md),
+* [How to describe scene's AOI](aoi_3d_description.md),
+* [How to project 3D AOI into camera frame](aoi_3d_projection.md),
+* [How to define a 3D AOI as a frame](aoi_3d_frame.md).
+
+More advanced features are also explained like:
+
+* [How to script ArUco marker pipeline](advanced_topics/scripting.md),
+* [How to calibrate optic parameters](advanced_topics/optic_parameters_calibration.md),
+* [How to improve ArUco markers detection](advanced_topics/aruco_detector_configuration.md).
diff --git a/docs/user_guide/aruco_marker_pipeline/pose_estimation.md b/docs/user_guide/aruco_marker_pipeline/pose_estimation.md
new file mode 100644
index 0000000..7f6573c
--- /dev/null
+++ b/docs/user_guide/aruco_marker_pipeline/pose_estimation.md
@@ -0,0 +1,84 @@
+Estimate scene pose
+===================
+
+Once [ArUco markers are placed into a scene](aruco_markers_description.md) and [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) is [configured](configuration_and_execution.md), scene pose can be estimated.
+
+![Scene pose estimation](../../img/aruco_camera_pose_estimation.png)
+
+## Add ArUcoScene to ArUcoCamera JSON configuration file
+
+An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) defines a space with [ArUco markers inside](aruco_markers_description.md), helping to estimate the scene pose when they are watched by an [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera).
+
+Here is an extract from the JSON [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration file with a sample where one scene is added and displayed:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ "dictionary": "DICT_APRILTAG_16h5",
+ "places": {
+ "0": {
+ "translation": [17.5, 2.75, -0.5],
+ "rotation": [-18.5, 0, 0],
+ "size": 5
+ },
+ "1": {
+ "translation": [46, 34, 18.333],
+ "rotation": [0, 70, 0],
+ "size": 5
+ },
+ "2": {
+ "translation": [41, 4, 3.333],
+ "rotation": [-60, 0, 0],
+ "size": 5
+ }
+ }
+ }
+ }
+ },
+ ...
+ "image_parameters": {
+ ...
+ "draw_scenes": {
+ "MyScene": {
+ "draw_aruco_markers_group": {
+ "draw_axes": {
+ "thickness": 3,
+ "length": 10
+ },
+ "draw_places": {
+ "color": [0, 0, 0],
+ "border_size": 1
+ }
+ }
+ }
+ }
+ }
+}
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### *scenes*
+
+An [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) instance can contain multiple [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) instances, stored by name.
+
+### MyScene
+
+The name of an [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene). Mostly useful for visualisation purposes.
+
+### *aruco_markers_group*
+
+The 3D places of the ArUco markers in the scene, as defined in the [ArUco markers description chapter](aruco_markers_description.md). Thanks to this description, it is possible to estimate the pose of the [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) in the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame.
+
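To make the *translation*, *rotation* and *size* entries concrete, the sketch below computes where a marker place's corners end up in scene coordinates. It assumes the rotation values are Euler angles in degrees applied in X, Y, Z order, which may differ from the actual ArGaze convention; the helper is illustrative, not an ArGaze API:

```python
import numpy as np

def place_corners(translation, rotation_deg, size):
    """Return the 3D scene coordinates of a square marker's four corners,
    assuming Euler angles in degrees applied in X, Y, Z order."""
    rx, ry, rz = np.radians(rotation_deg)
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(rx), -np.sin(rx)],
                   [0, np.sin(rx), np.cos(rx)]])
    Ry = np.array([[np.cos(ry), 0, np.sin(ry)],
                   [0, 1, 0],
                   [-np.sin(ry), 0, np.cos(ry)]])
    Rz = np.array([[np.cos(rz), -np.sin(rz), 0],
                   [np.sin(rz), np.cos(rz), 0],
                   [0, 0, 1]])
    R = Rz @ Ry @ Rx
    half = size / 2
    local_corners = np.array([[-half, half, 0],
                              [half, half, 0],
                              [half, -half, 0],
                              [-half, -half, 0]])
    return local_corners @ R.T + np.array(translation)

# Place "1" from the configuration above: a 5-unit marker rotated 70° around Y
corners = place_corners([46, 34, 18.333], [0, 70, 0], 5)
```

Whatever the rotation, the four corners stay centered on the *translation* point and *size* units apart, which is a quick sanity check for a hand-written configuration.
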
+!!! note
+
+ [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) pose estimation is done when calling the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method.
+
+### *draw_scenes*
+
+The drawing parameters of each loaded [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) in [ArUcoCamera.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image).