author    Théo de la Hogue    2023-09-06 08:34:32 +0200
committer Théo de la Hogue    2023-09-06 08:34:32 +0200
commit    3b8681b848fd91989a03d0ff6a03c7deaec4addd (patch)
tree      438e0eb91d4954564c774b16421c94ad2d3bbdbd /docs
parent    3dba9640ad57e48d1979d19cfe8cab8c9be2d621 (diff)
Renaming folder.
Diffstat (limited to 'docs')
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md | 110
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_scene_creation.md (renamed from docs/user_guide/augmented_reality_pipeline/aruco_scene_creation.md) | 27
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/introduction.md | 26
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md (renamed from docs/user_guide/augmented_reality_pipeline/optic_parameters_calibration.md) | 0
-rw-r--r--  docs/user_guide/augmented_reality_pipeline/introduction.md | 19
5 files changed, 149 insertions, 33 deletions
diff --git a/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md b/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
new file mode 100644
index 0000000..a0ec2bf
--- /dev/null
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
@@ -0,0 +1,110 @@
+Configure and execute ArUcoCamera
+=================================
+
+Once [ArUco markers are placed into a scene](aruco_scene_creation.md) and [the camera optics have been calibrated](optic_parameters_calibration.md), everything is ready to set up an ArUco marker pipeline thanks to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class.
+
+As it inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class benefits from all the services of a [gaze analysis pipeline](../gaze_analysis_pipeline/introduction.md).
+
+Besides, the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class projects each [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene)'s layers into its own layers, thanks to the ArUco marker pose estimations made by its [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector).
+
+![ArUco camera frame](../../img/aruco_camera_frame.png)
+
+## Load JSON configuration file
+
+The [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) internal pipeline can be loaded from a JSON configuration file thanks to the [ArUcoCamera.from_json](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera.from_json) class method.
+
+Here is a simple JSON ArUcoCamera configuration file example:
+
+```json
+{
+    "name": "My FullHD camera",
+    "size": [1920, 1080],
+    "aruco_detector": {
+        "dictionary": {
+            "name": "DICT_APRILTAG_16h5"
+        },
+        "marker_size": 5,
+        "optic_parameters": "optic_parameters.json",
+        "parameters": {
+            "cornerRefinementMethod": 1,
+            "aprilTagQuadSigma": 2,
+            "aprilTagDeglitch": 1
+        }
+    },
+    "scenes": {
+
+    },
+    "layers": {
+        "main_layer": {}
+    }
+}
+```
+
+Then, here is how to load the JSON file:
+
+```python
+from argaze.ArUcoMarkers import ArUcoCamera
+
+# Load ArUcoCamera
+aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### Name
+
+The name of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera). Mostly useful for visualisation purposes.
+
+### Size
+
+The size of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) defines the dimension of the rectangular area where gaze positions are projected. Be aware that gaze positions have to be in the same range of values.
+
+!!! warning
+    **ArGaze doesn't impose any spatial unit.** Gaze positions can either be integers or floats, pixels, millimeters or whatever you need. The only concern is that all spatial values used in further configurations have to use the same unit.
+
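+For instance, with the 1920x1080 frame defined above, a gaze position expressed in pixels could be built as sketched below (a minimal illustration, assuming the gaze coordinates already come in pixels from the eye tracker):
+
+```python
+from argaze import GazeFeatures
+
+# A gaze position expressed in pixels, inside the 1920x1080 camera frame
+gaze_position = GazeFeatures.GazePosition((1280, 720))
+```
+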
+### ArUco detector
+
+The first [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) pipeline step is to detect ArUco markers inside incoming camera images thanks to its [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector).
+
+In the example file, the *dictionary* entry tells which ArUco dictionary the printed markers belong to, and the *marker_size* entry sets their size, expressed in the same spatial unit as the rest of the configuration. The *optic_parameters* entry points to the file produced during the [optic parameters calibration](optic_parameters_calibration.md) step, while the *parameters* entry tunes the underlying OpenCV detection (here *cornerRefinementMethod*, *aprilTagQuadSigma* and *aprilTagDeglitch*).
+
+!!! warning
+    All markers must have the same size and belong to the same dictionary.
+
+!!! note
+    The steps below are inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and behave as described in the [gaze analysis pipeline](../gaze_analysis_pipeline/introduction.md) section.
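+
+Since these detection parameters are plain OpenCV ArUco parameters, the sketch below shows what the same tuning looks like with OpenCV directly; it assumes an OpenCV build where `cv2.aruco.DetectorParameters_create()` is available and only illustrates the meaning of the values, not how ArGaze applies them internally.
+
+```python
+import cv2.aruco as aruco
+
+# Build an OpenCV detector parameters object and apply the same tuning
+# as the JSON example above (illustrative sketch, not ArGaze internals)
+detector_parameters = aruco.DetectorParameters_create()
+
+# cornerRefinementMethod value 1 corresponds to subpixel corner refinement
+detector_parameters.cornerRefinementMethod = aruco.CORNER_REFINE_SUBPIX
+
+# AprilTag specific tuning, relevant for the DICT_APRILTAG_16h5 dictionary
+detector_parameters.aprilTagQuadSigma = 2
+detector_parameters.aprilTagDeglitch = 1
+```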
+
+### Scan Path
+
+The next [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step aims to build a [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep), each made of a fixation and the consecutive saccade.
+
+![Scan Path](../../img/ar_frame_scan_path.png)
+
+Once fixations and saccades are identified, they are automatically appended to the ScanPath if required.
+
+The [ScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.ScanPath.duration_max) attribute is the duration from which older scan steps are removed each time new scan steps are added.
+
+!!! note
+    JSON *scan_path* entry is not mandatory. If the *scan_path_analyzers* entry is not empty, the ScanPath step is automatically enabled.
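+
+As a rough illustration of the *duration_max* behaviour, here is a hypothetical sketch; it assumes that [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) accepts *duration_max* at construction, mirroring the JSON entry, and that durations are expressed in milliseconds.
+
+```python
+from argaze import GazeFeatures
+
+# Keep only the last 10 seconds of scan steps (assuming millisecond timestamps);
+# older steps are removed each time new ones are added
+scan_path = GazeFeatures.ScanPath(duration_max=10000)
+```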
+
+### Scan Path Analyzers
+
+Finally, the last [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step consists of passing the previously built [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) to each loaded [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer).
+
+Each analysis algorithm can be selected by instantiating a particular [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
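+
+For example, an analyzer can be instantiated by hand as sketched below; the *Basic* module name is only given to illustrate the pattern, check the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule for the analyzers actually available.
+
+```python
+from argaze.GazeAnalysis import Basic
+
+# Instantiate a scan path analyzer from the argaze.GazeAnalysis submodule
+scan_path_analyzer = Basic.ScanPathAnalyzer()
+```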
+
+## Pipeline execution
+
+Timestamped gaze positions have to be passed one by one to the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole instantiated pipeline.
+
+```python
+# Assuming that timestamped gaze positions are available
+...
+
+    # Look ArUcoCamera frame at a timestamped gaze position
+    aruco_camera.look(timestamp, gaze_position)
+```
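+
+Here is a slightly more complete sketch of such a feeding loop; the *gaze_data_stream* list of (timestamp, x, y) tuples is a hypothetical placeholder for whatever your eye tracker or log file provides.
+
+```python
+from argaze import GazeFeatures
+from argaze.ArUcoMarkers import ArUcoCamera
+
+# Load the ArUcoCamera pipeline described above
+aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')
+
+# A few timestamped gaze positions in pixels (hypothetical sample data)
+gaze_data_stream = [(0, 960, 540), (20, 965, 538), (40, 970, 535)]
+
+for timestamp, x, y in gaze_data_stream:
+
+    # Wrap raw coordinates into a GazePosition then look at it
+    aruco_camera.look(timestamp, GazeFeatures.GazePosition((x, y)))
+```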
diff --git a/docs/user_guide/augmented_reality_pipeline/aruco_scene_creation.md b/docs/user_guide/aruco_markers_pipeline/aruco_scene_creation.md
index d9a7be5..ee5cca7 100644
--- a/docs/user_guide/augmented_reality_pipeline/aruco_scene_creation.md
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_scene_creation.md
@@ -1,11 +1,7 @@
-Setup ArUco markers scene
+Build ArUco markers scene
=========================
-The OpenCV library provides a module to detect fiducial markers into a picture and estimate its pose (cf [OpenCV ArUco tutorial page](https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html)).
-
-![OpenCV ArUco markers](https://pyimagesearch.com/wp-content/uploads/2020/12/aruco_generate_tags_header.png)
-
-The ArGaze [ArUcoMarkers submodule](../../argaze.md/#argaze.ArUcoMarkers) eases markers creation and description of their expected place for further camera pose estimation.
+First of all, ArUco markers need to be printed and placed into the scene.
## Print ArUco markers from an ArUco dictionary
@@ -32,18 +28,21 @@ aruco_dictionary.create_marker(5, 3.5).save('./markers/', 300)
aruco_dictionary.save('./markers/', 3.5, 300)
```
+!!! note
+    There is an **A4_DICT_APRILTAG_16h5_5cm_0-7.pdf** file located in the *./src/argaze/ArUcoMarkers/utils/* folder, ready to be printed on an A4 paper sheet.
+
Let's print some of them before going further.
!!! warning
Print markers with a blank zone around them to help in their detection.
-## Describe expected ArUco markers place
+## Describe ArUco markers place
-Once [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) pictures are placed into a scene it is possible to describe their expected 3D place into a file.
+Once [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) pictures are placed into a scene, it is possible to describe their 3D places in a file.
![ArUco scene](../../img/aruco_scene.png)
-Where ever the origin point is, all expected markers places need to be described in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
+Wherever the origin point is, all marker places need to be described in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
* +X is pointing to the right,
* +Y is pointing to the top,
@@ -91,12 +90,12 @@ s off
f 13//4 14//4 16//4 15//4
```
-Here are common OBJ file features needed to describe ArUco markers place:
+Here are common OBJ file features needed to describe ArUco markers places:
-* Object lines (line starting with *o* key) indicate markers dictionary and id by following a format: **DICTIONARY**#**ID**\_Marker.
-* Vertice lines (lines starting with *v* key) indicate markers corners. The marker size will be automatically deducted from the geometry.
-* Plane normals (lines starting with *vn* key) need to be exported for further pose estimation.
-* Face (lines starting with *f* key) link vertices and normals indexes together.
+* Object lines (starting with the *o* key) indicate the marker dictionary and id, following this format: **DICTIONARY**#**ID**\_Marker.
+* Vertex lines (starting with the *v* key) indicate the marker corners. The marker size is automatically deduced from the geometry.
+* Plane normal lines (starting with the *vn* key) need to be exported for further pose estimation.
+* Face lines (starting with the *f* key) link vertex and normal indexes together.
!!! warning
All markers must have the same size and belong to the same dictionary.
diff --git a/docs/user_guide/aruco_markers_pipeline/introduction.md b/docs/user_guide/aruco_markers_pipeline/introduction.md
new file mode 100644
index 0000000..d2b19eb
--- /dev/null
+++ b/docs/user_guide/aruco_markers_pipeline/introduction.md
@@ -0,0 +1,26 @@
+Overview
+========
+
+This section explains how to build augmented reality pipelines based on ArUco Markers technology for various use cases.
+
+The OpenCV library provides a module to detect fiducial markers in an image and estimate their poses (cf [OpenCV ArUco tutorial page](https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html)).
+
+![OpenCV ArUco markers](https://pyimagesearch.com/wp-content/uploads/2020/12/aruco_generate_tags_header.png)
+
+The ArGaze [ArUcoMarkers submodule](../../argaze.md/#argaze.ArUcoMarkers) eases marker creation, optic calibration, marker detection and 3D scene pose estimation through a set of high-level classes.
+
+First, let's look at the schema below: it gives an overview of the main notions involved in the following chapters.
+
+![ArUco markers pipeline](../../img/aruco_markers_pipeline.png)
+
+To build your own ArUco markers pipeline, you need to know:
+
+* [How to build an ArUco markers scene](aruco_scene_creation.md),
+* [How to calibrate optic parameters](optic_parameters_calibration.md),
+* [How to deal with an ArUcoCamera instance](aruco_camera_configuration_and_execution.md),
+* [How to add an ArScene instance](ar_scene.md),
+* [How to visualize ArCamera and ArScenes](visualisation.md)
+
+More advanced features are also explained like:
+
+* [How to script ArUco markers pipeline](advanced_topics/scripting.md)
diff --git a/docs/user_guide/augmented_reality_pipeline/optic_parameters_calibration.md b/docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md
index 0561112..0561112 100644
--- a/docs/user_guide/augmented_reality_pipeline/optic_parameters_calibration.md
+++ b/docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md
diff --git a/docs/user_guide/augmented_reality_pipeline/introduction.md b/docs/user_guide/augmented_reality_pipeline/introduction.md
deleted file mode 100644
index a06b1e2..0000000
--- a/docs/user_guide/augmented_reality_pipeline/introduction.md
+++ /dev/null
@@ -1,19 +0,0 @@
-Overview
-========
-
-This section explains how to build augmented reality pipelines based on ArUco Markers technology for various use cases.
-
-First, let's look at the schema below: it gives an overview of the main notions involved in the following chapters.
-
-![Augmented reality pipeline](../../img/augmented_reality_pipeline.png)
-
-To build your own augmented reality pipeline, you need to know:
-
-* [How to setup an ArUco markers scene](aruco_scene_creation.md),
-* [How to deal with an ArCamera instance](ar_camera_configuration_and_execution.md),
-* [How to add ArScene instance](ar_scene.md),
-* [How to visualize ArCamera and ArScenes](visualisation.md)
-
-More advanced features are also explained like:
-
-* [How to script augmented reality pipeline](advanced_topics/scripting.md)