-rw-r--r--  docs/img/argaze_load_gui_haiku.png                    bin  0 -> 588929 bytes
-rw-r--r--  docs/use_cases/pilot_gaze_monitoring/context.md         7
-rw-r--r--  docs/use_cases/pilot_gaze_monitoring/introduction.md    5
-rw-r--r--  docs/use_cases/pilot_gaze_monitoring/observers.md       4
-rw-r--r--  docs/use_cases/pilot_gaze_monitoring/pipeline.md       25
5 files changed, 31 insertions, 10 deletions
diff --git a/docs/img/argaze_load_gui_haiku.png b/docs/img/argaze_load_gui_haiku.png
new file mode 100644
index 0000000..0546ae2
--- /dev/null
+++ b/docs/img/argaze_load_gui_haiku.png
Binary files differ
diff --git a/docs/use_cases/pilot_gaze_monitoring/context.md b/docs/use_cases/pilot_gaze_monitoring/context.md
index 417ed13..d6b712e 100644
--- a/docs/use_cases/pilot_gaze_monitoring/context.md
+++ b/docs/use_cases/pilot_gaze_monitoring/context.md
@@ -1,12 +1,11 @@
Live streaming context
======================
-The context handles pipeline inputs.
+The context handles incoming eye tracker data before passing them on to a processing pipeline.
## live_streaming_context.json
-For this use case we need to connect to a Tobii Pro Glasses 2 device.
-**ArGaze** provides a context class to live stream data from this device.
+For this use case we need to connect to a Tobii Pro Glasses 2 device: **ArGaze** provides a [ready-made context](../../../user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2/) class to live stream data from this device.
While *address*, *project*, *participant* and *configuration* entries are specific to the [TobiiProGlasses2.LiveStream](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.LiveStream) class, *name*, *pipeline* and *observers* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.
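For illustration, such a configuration can take the following shape. This is a minimal sketch only: all values below are placeholders, and the *observers.IvyBus* entry name is assumed from the observer described in [observers.py](observers.md), not copied from the experiment's actual file.

```json
{
    "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
        "name": "Tobii Pro Glasses 2 live stream",
        "address": "10.34.0.12",
        "project": "Haiku",
        "participant": "Pilot",
        "configuration": {
            "sys_ec_preset": "Indoor"
        },
        "pipeline": "live_processing_pipeline.json",
        "observers": {
            "observers.IvyBus": {}
        }
    }
}
```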
@@ -39,4 +38,4 @@ While *address*, *project*, *participant* and *configuration* entries are specif
The [live_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
-The observers objects are defined into the [observers.py](observers.md) file that is described in a next chapter.
\ No newline at end of file
+The *IvyBus* observer object is defined in the [observers.py](observers.md) file, which is described in a later chapter.
\ No newline at end of file
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
index 453a443..f5de773 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_monitoring/introduction.md
@@ -30,7 +30,7 @@ Finally, fixation events were sent in real-time through [Ivy bus middleware](htt
## Setup
-The setup to integrate **ArGaze** to the experiment is defined by 3 main files:
+The setup to integrate **ArGaze** into the experiment is defined by 3 main files detailed in the next chapters:
* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
@@ -42,5 +42,6 @@ As any **ArGaze** setup, it is loaded by executing the following command:
python -m argaze load live_streaming_context.json
```
-## Performance
+This command opens a GUI window that allows the user to start gaze calibration, launch recording, and monitor gaze mapping.
+![ArGaze load GUI for Haiku](../../img/argaze_load_gui_haiku.png)
diff --git a/docs/use_cases/pilot_gaze_monitoring/observers.md b/docs/use_cases/pilot_gaze_monitoring/observers.md
index 2e3f394..5f5bc78 100644
--- a/docs/use_cases/pilot_gaze_monitoring/observers.md
+++ b/docs/use_cases/pilot_gaze_monitoring/observers.md
@@ -1,8 +1,12 @@
Fixation events sending
=======================
+Observers are attached to pipeline steps to be notified when a method is called.
+
## observers.py
+For this use case we need to enable [Ivy bus communication](https://gitlab.com/ivybus/ivy-python/) to log ArUco detection results (on each *ArUcoCamera.on_watch* call) and fixation identification with AOI matching (on each *ArUcoCamera.on_look* call).
+
```python
import logging
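
# For illustration, such an observer can take the following shape.
# NOTE: a minimal sketch only; the class name, callback signatures and message
# formats below are simplified assumptions, not the actual observers.py content.

from ivy.std_api import IvyInit, IvyStart, IvySendMsg  # Ivy bus Python API


class IvyBus:
    """Sends pipeline events over the Ivy bus middleware."""

    def __init__(self, bus: str = ''):

        # Join the Ivy bus (an empty string selects the default broadcast address)
        IvyInit('ArGaze', 'ArGaze ready')
        IvyStart(bus)

    def on_watch(self, timestamp, frame, exception):

        # Notified on each ArUcoCamera.on_watch call: log ArUco detection results
        IvySendMsg(f'ArUcoDetection Timestamp={timestamp}')

    def on_look(self, timestamp, frame, exception):

        # Notified on each ArUcoCamera.on_look call: send identified fixations
        # with their AOI matching
        IvySendMsg(f'Fixation Timestamp={timestamp}')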
diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
index 8f8dad0..a083ade 100644
--- a/docs/use_cases/pilot_gaze_monitoring/pipeline.md
+++ b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
@@ -5,8 +5,7 @@ The pipeline processes camera image and gaze data to enable gaze mapping and gaz
## live_processing_pipeline.json
-For this use case we need to detect ArUco markers to enable gaze mapping.
-**ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to setup an [ArUco markers pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).
+For this use case we need to detect ArUco markers to enable gaze mapping: **ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to set up an [ArUco markers pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).
```json
{
@@ -119,7 +118,13 @@ For this use case we need to detect ArUco markers to enable gaze mapping.
}
},
"observers": {
- "observers.ArUcoCameraLogger": {}
+ "observers.ArUcoCameraLogger": {},
+ "argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
+ "path": "_export/look_performance.csv"
+ },
+ "argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
+ "path": "_export/watch_performance.csv"
+ }
}
}
}
@@ -127,10 +132,12 @@ For this use case we need to detect ArUco markers to enable gaze mapping.
All the files mentioned above are described below.
-The observers objects are defined into the [observers.py](observers.md) file that is described in the next chapter.
+The *ArUcoCameraLogger* observer object is defined in the [observers.py](observers.md) file, which is described in the next chapter.
## optic_parameters.json
+This file defines the Tobii Pro Glasses 2 scene camera optic parameters, which have been calculated as explained in [the camera calibration chapter](../../../user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration/).
+
```json
{
"rms": 0.6688921504088245,
@@ -167,6 +174,8 @@ The observers objects are defined into the [observers.py](observers.md) file tha
## detector_parameters.json
+This file defines the ArUco detector parameters, as explained in [the detection improvement chapter](../../../user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration/).
+
```json
{
"adaptiveThreshConstant": 7,
@@ -176,6 +185,8 @@ The observers objects are defined into the [observers.py](observers.md) file tha
## aruco_scene.obj
+This file defines where the ArUco markers are placed in the cockpit geometry. Marker positions have been edited in Blender from a 3D scan of the cockpit, then exported in OBJ format.
+
```obj
# Blender v3.0.1 OBJ File: 'scene.blend'
# www.blender.org
@@ -239,6 +250,8 @@ f 29 30 32 31
## Cockpit.obj
+This file defines where the AOI are placed in the cockpit geometry. AOI positions have been edited in [Blender software](https://www.blender.org/) from a 3D scan of the cockpit, then exported in OBJ format.
+
```obj
# Blender v3.0.1 OBJ File: 'scene.blend'
# www.blender.org
@@ -274,10 +287,14 @@ f 13 14 16 15
## PIC_PFD.png
+This file is a screenshot of the PFD screen, used to monitor where the gaze is projected after gaze mapping processing.
+
![PFD frame background](../../img/haiku_PIC_PFD_background.png)
## PIC_PFD.svg
+This file defines where the AOI are placed in the PFD frame. AOI positions have been edited in [Inkscape software](https://inkscape.org/fr/) from a screenshot of the PFD screen, then exported in SVG format.
+
```svg
<svg>
<rect id="PIC_PFD_Air_Speed" x="93.228" y="193.217" width="135.445" height="571.812"/>