author    Théo de la Hogue  2023-09-06 10:01:12 +0200
committer Théo de la Hogue  2023-09-06 10:01:12 +0200
commit    935ca70dfa1cde21b6c3c43659db3dcac3223c44 (patch)
tree      e8f8a984058261ce7cd37eee1c0e0a62aaa4b349
parent    067b3c6dc2588be4c4e416f4f07b7dd45cf1bc67 (diff)
Renaming aruco markers pipeline chapter. Working on aruco camera configuration chapter.
-rw-r--r--  docs/img/aruco_camera_frame.png  bin 0 -> 19406 bytes
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md  86
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md (renamed from docs/user_guide/aruco_markers_pipeline/aruco_scene_creation.md)  4
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/introduction.md  3
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md  2
-rw-r--r--  mkdocs.yml  2
6 files changed, 42 insertions, 55 deletions
diff --git a/docs/img/aruco_camera_frame.png b/docs/img/aruco_camera_frame.png
new file mode 100644
index 0000000..273675a
--- /dev/null
+++ b/docs/img/aruco_camera_frame.png
Binary files differ
diff --git a/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md b/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
index a0ec2bf..9b1db42 100644
--- a/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
@@ -3,9 +3,7 @@ Configure and execute ArUcoCamera
Once [ArUco markers are placed into a scene](aruco_scene_creation.md) and [the camera optics have been calibrated](optic_parameters_calibration.md), everything is ready to set up an ArUco marker pipeline thanks to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class.
-As it inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class benefits from all the services of a [gaze analysis pipeline](./user_guide/gaze_analysis_pipeline/introduction.md).
-
-Besides, the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class projects [ArUcoScenes](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene)'s layers into its own layers thanks to ArUco markers pose estimations made by its [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector).
+As it inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class benefits from all the services of a [gaze analysis pipeline](./user_guide/gaze_analysis_pipeline/introduction.md). In addition, the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class projects the layers of [ArUcoScenes](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) into its own layers, thanks to the ArUco marker pose estimations made by its [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector).
![ArUco camera frame](../../img/aruco_camera_frame.png)
@@ -25,18 +23,21 @@ Here is a simple JSON ArUcoCamera configuration file example:
},
"marker_size": 5,
"optic_parameters": "optic_parameters.json",
- "parameters": {
- "cornerRefinementMethod": 1,
- "aprilTagQuadSigma": 2,
- "aprilTagDeglitch": 1
- }
},
"scenes": {
-
- },
- "layers": {
- "main_layer": {}
+ "main_scene" : {
+ "aruco_markers_group": "aruco_description.json"
+ }
},
+ "image_parameters": {
+ "background_weight": 1,
+ "draw_detected_markers": {
+ "color": [0, 255, 0],
+ "draw_axes": {
+ "thickness": 3
+ }
+ }
+ }
}
```
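Assembling the fragments above into one file gives a configuration that can be sanity-checked with Python's standard `json` module. The snippet below is only a sketch: the `name`, `size` and `dictionary` values are placeholder assumptions, not copied from the actual example file.

```python
import json

# Hypothetical, minimal ArUcoCamera configuration assembled from the
# fragments shown above; "name", "size" and "dictionary" are placeholders.
configuration = """
{
    "name": "ArUco camera",
    "size": [1920, 1080],
    "aruco_detector": {
        "dictionary": "DICT_APRILTAG_16h5",
        "marker_size": 5,
        "optic_parameters": "optic_parameters.json"
    },
    "scenes": {
        "main_scene": {
            "aruco_markers_group": "aruco_description.json"
        }
    },
    "image_parameters": {
        "background_weight": 1,
        "draw_detected_markers": {
            "color": [0, 255, 0],
            "draw_axes": {
                "thickness": 3
            }
        }
    }
}
"""

# Parse the document and list its top-level entries
config = json.loads(configuration)
print(sorted(config))
```

Parsing succeeds only if the assembled file is valid JSON, which catches misplaced commas or braces before the configuration is handed to the pipeline.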
@@ -51,60 +52,45 @@ aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')
Now, let's understand the meaning of each JSON entry.
-### Name
-
-The name of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera). Basically useful for visualisation purpose.
+### Name (inherited from ArFrame)
-### Size
+The name of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame. Mainly useful for visualisation purposes.
-The size of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) defines the dimension of the rectangular area where gaze positions are projected. Be aware that gaze positions have to be in the same range of value.
+### Size (inherited from ArFrame)
-!!! warning
- **ArGaze doesn't impose any spatial unit.** Gaze positions can either be integer or float, pixels, millimeters or what ever you need. The only concern is that all spatial values used in further configurations have to be all the same unit.
+The size of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame in pixels. Be aware that gaze positions have to be in the same range of values.
### ArUco detector
-The first [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step is to identify fixations or saccades from consecutive timestamped gaze positions.
-
-![Gaze Movement Identifier](../../img/ar_frame_gaze_movement_identifier.png)
-
-The identification algorithm can be selected by instantiating a particular [GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
-
-In the example file, the choosen identification algorithm is the [Dispersion Threshold Identification (I-DT)](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification) which has two specific *deviation_max_threshold* and *duration_min_threshold* attributes.
-
-!!! note
- In ArGaze, [Fixation](../../argaze.md/#argaze.GazeFeatures.Fixation) and [Saccade](../../argaze.md/#argaze.GazeFeatures.Saccade) are considered as particular [GazeMovements](../../argaze.md/#argaze.GazeFeatures.GazeMovement).
-
-!!! warning
- JSON *gaze_movement_identifier* entry is mandatory. Otherwise, the ScanPath and ScanPathAnalyzers steps are disabled.
-
-### Scan Path
-
-The second [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step aims to build a [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep) made by a fixation and a consecutive saccade.
-
-![Scan Path](../../img/ar_frame_scan_path.png)
-
-Once fixations and saccades are identified, they are automatically appended to the ScanPath if required.
-
-The [ScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.ScanPath.duration_max) attribute is the duration from which older scan steps are removed each time new scan steps are added.
+...
-!!! note
- JSON *scan_path* entry is not mandatory. If scan_path_analyzers entry is not empty, the ScanPath step is automatically enabled.
+### Scenes
-### Scan Path Analyzers
+...
-Finally, the last [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step consists in passing the previously built [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) to each loaded [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer).
+### Image parameters (inherited from ArFrame)
-Each analysis algorithm can be selected by instantiating a particular [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
+...
## Pipeline execution
Timestamped gaze positions have to be passed one by one to the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole instantiated pipeline.
```python
-# Assuming that timestamped gaze positions are available
+# Assuming that a live Full HD (1920x1080) video stream is enabled
+...
+
+# Assuming there is a way to escape the while loop
...
- # Look ArFrame at a timestamped gaze position
- ar_frame.look(timestamp, gaze_position)
+ while video_stream.is_alive():
+
+ # Capture image from video stream
+ image = video_stream.read()
+
+ # Detect ArUco markers in image
+ aruco_camera.watch(image)
+
+ # Do something with ArUcoCamera frame image
+ ...
```
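The control flow of the loop above can be exercised without any camera by swapping in a stand-in stream. `VideoStreamStub` below is an invented helper (not an ArGaze or OpenCV class), and the `aruco_camera.watch` call is replaced by a counter, so this sketch only demonstrates the loop shape.

```python
class VideoStreamStub:
    """Hypothetical stand-in for a live video stream (not an ArGaze class)."""

    def __init__(self, frame_count, size=(8, 8)):
        self.remaining = frame_count
        self.size = size

    def is_alive(self):
        # The stream "dies" once all frames are served,
        # giving the while loop a way to terminate.
        return self.remaining > 0

    def read(self):
        # Serve a tiny blank frame as a list of pixel rows
        self.remaining -= 1
        return [[0] * self.size[1] for _ in range(self.size[0])]


video_stream = VideoStreamStub(frame_count=3)
watched_frames = 0

while video_stream.is_alive():

    # Capture image from video stream
    image = video_stream.read()

    # Here the real pipeline would call: aruco_camera.watch(image)
    watched_frames += 1

print(watched_frames)  # 3
```

The same `is_alive`/`read` loop shape then works unchanged once a real video source and an `ArUcoCamera` instance are plugged in.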
diff --git a/docs/user_guide/aruco_markers_pipeline/aruco_scene_creation.md b/docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md
index ee5cca7..061fb19 100644
--- a/docs/user_guide/aruco_markers_pipeline/aruco_scene_creation.md
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md
@@ -1,5 +1,5 @@
-Build ArUco markers scene
-=========================
+Setup ArUco markers into a scene
+================================
First of all, ArUco markers need to be printed and placed into the scene.
diff --git a/docs/user_guide/aruco_markers_pipeline/introduction.md b/docs/user_guide/aruco_markers_pipeline/introduction.md
index d2b19eb..ae174f3 100644
--- a/docs/user_guide/aruco_markers_pipeline/introduction.md
+++ b/docs/user_guide/aruco_markers_pipeline/introduction.md
@@ -15,7 +15,7 @@ First, let's look at the schema below: it gives an overview of the main notions
To build your own ArUco markers pipeline, you need to know:
-* [How to build an ArUco markers scene](aruco_scene_creation.md),
+* [How to setup ArUco markers into a scene](aruco_markers_description.md),
* [How to calibrate optic parameters](optic_parameters_calibration.md),
* [How to deal with an ArUcoCamera instance](aruco_camera_configuration_and_execution.md),
* [How to add ArScene instance](ar_scene.md),
@@ -24,3 +24,4 @@ To build your own ArUco markers pipeline, you need to know:
More advanced features are also explained like:
* [How to script ArUco markers pipeline](advanced_topics/scripting.md)
+* [How to improve ArUco markers detection](advanced_topics/aruco_detector_configuration.md)
diff --git a/docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md b/docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md
index 0561112..455d95a 100644
--- a/docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md
+++ b/docs/user_guide/aruco_markers_pipeline/optic_parameters_calibration.md
@@ -63,9 +63,9 @@ aruco_detector = ArUcoDetector.ArUcoDetector(dictionary=aruco_dictionary, marker
# Assuming there is a way to escape the while loop
...
- # Capture images from video stream
while video_stream.is_alive():
+ # Capture image from video stream
image = video_stream.read()
# Detect all board corners in image
diff --git a/mkdocs.yml b/mkdocs.yml
index 89f1ffc..0755961 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -23,7 +23,7 @@ nav:
- user_guide/gaze_analysis_pipeline/advanced_topics/module_loading.md
- ArUco markers pipeline:
- user_guide/aruco_markers_pipeline/introduction.md
- - user_guide/aruco_markers_pipeline/aruco_scene_creation.md
+ - user_guide/aruco_markers_pipeline/aruco_markers_description.md
- user_guide/aruco_markers_pipeline/optic_parameters_calibration.md
- user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
# - ArUco Markers: