authorThéo de la Hogue2024-07-01 17:19:07 +0200
committerThéo de la Hogue2024-07-01 17:19:07 +0200
commit3fc1abf3ed699c71faf7dec66c53a3a020ec8e16 (patch)
tree34283c8a9977b8c4b7f8a46aed293f72ff50a492
parentb27ec2b1d229110e8800fe4cb6de9bbf82ff5717 (diff)
downloadargaze-3fc1abf3ed699c71faf7dec66c53a3a020ec8e16.zip
argaze-3fc1abf3ed699c71faf7dec66c53a3a020ec8e16.tar.gz
argaze-3fc1abf3ed699c71faf7dec66c53a3a020ec8e16.tar.bz2
argaze-3fc1abf3ed699c71faf7dec66c53a3a020ec8e16.tar.xz
Improving use cases section.
-rw-r--r--docs/img/haiku_PIC_PFD_background.pngbin0 -> 240780 bytes
-rw-r--r--docs/img/haiku_aoi.pngbin0 -> 346960 bytes
-rw-r--r--docs/img/haiku_sa_alert.pngbin0 -> 75417 bytes
-rw-r--r--docs/img/simone_aoi_scene.pngbin141110 -> 0 bytes
-rw-r--r--docs/img/simone_aruco_scene.pngbin160927 -> 0 bytes
-rw-r--r--docs/img/simone_cockpit.pngbin671361 -> 460228 bytes
-rw-r--r--docs/img/simone_cockpit_3d.pngbin745778 -> 0 bytes
-rw-r--r--docs/img/simone_projection.pngbin681649 -> 0 bytes
-rw-r--r--docs/use_cases/pilot_gaze_monitoring/context.md33
-rw-r--r--docs/use_cases/pilot_gaze_monitoring/introduction.md43
-rw-r--r--docs/use_cases/pilot_gaze_monitoring/observers.md99
-rw-r--r--docs/use_cases/pilot_gaze_monitoring/pipeline.md235
-rw-r--r--docs/use_cases/simone_a320_cockpit_simulator.md28
-rw-r--r--mkdocs.yml6
14 files changed, 415 insertions, 29 deletions
diff --git a/docs/img/haiku_PIC_PFD_background.png b/docs/img/haiku_PIC_PFD_background.png
new file mode 100644
index 0000000..9fd48fa
--- /dev/null
+++ b/docs/img/haiku_PIC_PFD_background.png
Binary files differ
diff --git a/docs/img/haiku_aoi.png b/docs/img/haiku_aoi.png
new file mode 100644
index 0000000..1ea2236
--- /dev/null
+++ b/docs/img/haiku_aoi.png
Binary files differ
diff --git a/docs/img/haiku_sa_alert.png b/docs/img/haiku_sa_alert.png
new file mode 100644
index 0000000..42eb358
--- /dev/null
+++ b/docs/img/haiku_sa_alert.png
Binary files differ
diff --git a/docs/img/simone_aoi_scene.png b/docs/img/simone_aoi_scene.png
deleted file mode 100644
index 0273b79..0000000
--- a/docs/img/simone_aoi_scene.png
+++ /dev/null
Binary files differ
diff --git a/docs/img/simone_aruco_scene.png b/docs/img/simone_aruco_scene.png
deleted file mode 100644
index ec165cc..0000000
--- a/docs/img/simone_aruco_scene.png
+++ /dev/null
Binary files differ
diff --git a/docs/img/simone_cockpit.png b/docs/img/simone_cockpit.png
index 4ffb2ad..56fc754 100644
--- a/docs/img/simone_cockpit.png
+++ b/docs/img/simone_cockpit.png
Binary files differ
diff --git a/docs/img/simone_cockpit_3d.png b/docs/img/simone_cockpit_3d.png
deleted file mode 100644
index 92ded51..0000000
--- a/docs/img/simone_cockpit_3d.png
+++ /dev/null
Binary files differ
diff --git a/docs/img/simone_projection.png b/docs/img/simone_projection.png
deleted file mode 100644
index bcbe4c0..0000000
--- a/docs/img/simone_projection.png
+++ /dev/null
Binary files differ
diff --git a/docs/use_cases/pilot_gaze_monitoring/context.md b/docs/use_cases/pilot_gaze_monitoring/context.md
new file mode 100644
index 0000000..42df132
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_monitoring/context.md
@@ -0,0 +1,33 @@
+Live streaming context
+======================
+
+The **live_streaming_context.json** file defines how gaze data and scene camera video are captured live from the Tobii Pro Glasses 2 eye tracker:
+
+```json
+{
+ "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
+ "name": "Tobii Pro Glasses 2 live stream",
+ "address": "10.34.0.17",
+ "project": "HAIKU-XP",
+ "participant": "Pilot-A",
+ "configuration": {
+ "sys_ec_preset": "Indoor",
+ "sys_sc_width": 1920,
+ "sys_sc_height": 1080,
+ "sys_sc_fps": 25,
+ "sys_sc_preset": "Auto",
+ "sys_et_freq": 50,
+ "sys_mems_freq": 100
+ },
+ "pipeline": "live_processing_pipeline.json",
+ "observers": {
+ "observers.IvyBus": {
+ "name": "argaze_haiku",
+ "bus": "10.34.127.255:2023"
+ }
+ }
+ }
+}
+```
+
+The **live_streaming_context.json** file also references **live_processing_pipeline.json**, which is described in the next chapter.
\ No newline at end of file
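Since the context is plain JSON, its settings can be read back programmatically. The helper below is a hypothetical sketch (not part of ArGaze) that extracts the Ivy bus address declared under the observers section; the key names follow the JSON shown above.

```python
import json

# Hypothetical helper (not part of ArGaze): read the JSON context shown above
# and return the Ivy bus address declared under its observers section.
def ivy_bus_address(context_json: str) -> str:
    config = json.loads(context_json)
    context = config["argaze.utils.contexts.TobiiProGlasses2.LiveStream"]
    return context["observers"]["observers.IvyBus"]["bus"]
```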
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
new file mode 100644
index 0000000..0faf4b1
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_monitoring/introduction.md
@@ -0,0 +1,43 @@
+Overview
+========
+
+**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.
+
+This use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map the pilot's gaze onto multiple cockpit instruments in real time, and then enables AOI fixation matching with the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
+
+## Background
+
+The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among others, the pilot during startle or surprise events.
+One of the features provided by the assistant through **ArGaze** is situation awareness support, which ensures that the pilot keeps their awareness of the aircraft state up to date by monitoring their gaze and the flight parameters.
+When this support is active, relevant information is highlighted on the Primary Flight Display (PFD).
+
+![SA alert](../../img/haiku_sa_alert.png)
+
+## Experiment context
+
+Pilot eye tracking data were provided by Tobii Pro Glasses 2, a head-mounted eye tracker.
+The gaze and scene camera video were captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier.
+Since the eye tracker is head-mounted, ArUco markers were placed at various locations within an A320 cockpit simulator so that several of them were always visible in the field of view of the eye tracker camera.
+
+![SimOne cockpit](../../img/simone_cockpit.png)
+
+The [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) enabled real-time gaze mapping onto multiple screens and panels around the pilot-in-command position, while the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md) identified fixations and matched them with dynamic AOIs related to each instrument.
+To identify the relevant AOIs, a 3D model of the cockpit describing the AOIs and the positions of the markers was created.
+
+![ArUco markers and AOI scene](../../img/haiku_aoi.png)
+
+Finally, fixation events were sent in real time through the [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters on the PFD screen.
+
+## ArGaze setup
+
+The project defines three main files to integrate **ArGaze** into the experiment:
+
+* The context that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
+* The pipeline that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
+* The observers that send fixation events to Ivy bus middleware: [observers.py](observers.md)
+
+The project is loaded by executing the following command:
+
+```shell
+python -m argaze load live_streaming_context.json
+```
diff --git a/docs/use_cases/pilot_gaze_monitoring/observers.md b/docs/use_cases/pilot_gaze_monitoring/observers.md
new file mode 100644
index 0000000..3b3f07d
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_monitoring/observers.md
@@ -0,0 +1,99 @@
+Sending fixation events
+=======================
+
+The **observers.py** file implements the observers declared in the context and pipeline files, which report marker detection and fixation events on the Ivy bus:
+
+```python
+import logging
+
+from argaze import DataFeatures, GazeFeatures
+
+from ivy.std_api import *
+from ivy.ivy import IvyIllegalStateError
+
+
+class IvyBus(DataFeatures.PipelineStepObject):
+ """Handle Ivy bus."""
+
+ @DataFeatures.PipelineStepInit
+ def __init__(self, **kwargs):
+
+ self.__bus = None
+
+ @property
+ def bus(self) -> str:
+ return self.__bus
+
+ @bus.setter
+ def bus(self, bus: str):
+ self.__bus = bus
+
+ @DataFeatures.PipelineStepEnter
+ def __enter__(self, parent = None):
+
+ # Enable Ivy bus
+ IvyInit(self.name)
+ IvyStart(self.__bus)
+
+ return self
+
+ @DataFeatures.PipelineStepExit
+ def __exit__(self, exception_type, exception_value, exception_traceback):
+
+ # Stop Ivy bus
+ IvyStop()
+
+
+class ArUcoCameraLogger(DataFeatures.PipelineStepObject):
+ """Log ArUcoCamera activity."""
+
+ @DataFeatures.PipelineStepInit
+ def __init__(self, **kwargs):
+
+ self._last_markers_number = None
+
+ def on_watch(self, timestamp, aruco_camera, exception):
+ """Report ArUco markers detection info on Ivy bus."""
+
+ # Wait for number of detected marker changes
+ if aruco_camera.aruco_detector.detected_markers_number() != self._last_markers_number:
+
+ self._last_markers_number = aruco_camera.aruco_detector.detected_markers_number()
+
+ output = f'ArUcoDetection MarkersNumber={self._last_markers_number}'
+
+ # Send Ivy message
+ IvySendMsg(output)
+
+ logging.debug('%i %s', timestamp, output)
+
+ def on_look(self, timestamp, aruco_camera, exception):
+ """Report fixation and metrics on Ivy bus."""
+
+ # Select 'Main' layer
+ main_layer = aruco_camera.layers['Main']
+
+ if GazeFeatures.is_fixation(aruco_camera.last_gaze_movement()):
+
+ fixation = aruco_camera.last_gaze_movement()
+
+ # Output in progress fixation data
+ if not fixation.is_finished():
+
+ output = f'FixationInProgress Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.last_looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'
+
+ # Send Ivy message
+ IvySendMsg(output)
+
+ logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
+
+ # Output finished fixation data
+ else:
+
+ output = f'FixationEnd Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.aoi_matcher.looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'
+
+ # Send Ivy message
+ IvySendMsg(output)
+
+ logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
+```
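On the receiving side, the situation awareness software has to decode these messages. The sketch below is a hypothetical consumer-side parser: the message layout (an event name followed by `key=value` pairs) is inferred from the f-strings in `on_look()` above, and the field types are assumptions.

```python
import re

# Hypothetical consumer-side sketch: parse the fixation messages emitted by
# ArUcoCameraLogger above. The layout (event name then key=value pairs) is
# inferred from the f-strings in on_look(); field types are assumptions.
FIXATION_MSG = re.compile(
    r'^(?P<event>FixationInProgress|FixationEnd) '
    r'Start=(?P<start>[\d.]+) Duration=(?P<duration>[\d.]+) AOI=(?P<aoi>\S+)'
)

def parse_fixation_message(message: str):
    match = FIXATION_MSG.match(message)
    if match is None:
        return None
    return {
        'event': match['event'],
        'start': float(match['start']),
        'duration': float(match['duration']),
        'aoi': match['aoi'],
    }
```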
diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
new file mode 100644
index 0000000..d41e2c4
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
@@ -0,0 +1,235 @@
+Live processing pipeline
+========================
+
+The **live_processing_pipeline.json** file defines the pipeline that processes gaze data and scene camera video:
+
+```json
+{
+ "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
+ "name": "Camera",
+ "size": [1920, 1080],
+ "aruco_detector": {
+ "dictionary": "DICT_APRILTAG_16h5",
+ "optic_parameters": "optic_parameters.json",
+ "parameters": "detector_parameters.json"
+ },
+ "gaze_movement_identifier": {
+ "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
+ "deviation_max_threshold": 25,
+ "duration_min_threshold": 150
+ }
+ },
+ "filter_in_progress_identification": false,
+ "scenes": {
+ "Cockpit": {
+ "aruco_markers_group": "aruco_scene.obj",
+ "layers": {
+ "Main" : {
+ "aoi_scene": "aoi/Cockpit.obj"
+ }
+ },
+ "frames": {
+ "PIC_PFD": {
+ "size": [960, 1080],
+ "background": "aoi/PIC_PFD.png",
+ "gaze_movement_identifier": {
+ "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
+ "deviation_max_threshold": 50,
+ "duration_min_threshold": 150
+ }
+ },
+ "layers": {
+ "Main": {
+ "aoi_scene": "aoi/PIC_PFD.svg"
+ }
+ },
+ "image_parameters": {
+ "background_weight": 1,
+ "draw_gaze_positions": {
+ "color": [0, 255, 255],
+ "size": 15
+ }
+ }
+ }
+ },
+ "angle_tolerance": 15.0,
+ "distance_tolerance": 10.0
+ }
+ },
+ "layers": {
+ "Main": {
+ "aoi_matcher": {
+ "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
+ "coverage_threshold": 0.25
+ }
+ }
+ }
+ },
+ "image_parameters": {
+ "background_weight": 1,
+ "draw_gaze_positions": {
+ "color": [0, 255, 255],
+ "size": 4
+ },
+ "draw_detected_markers": {
+ "color": [0, 255, 0],
+ "draw_axes": {
+ "thickness": 4
+ }
+ },
+ "draw_fixations": {
+ "deviation_circle_color": [255, 127, 255],
+ "duration_border_color": [127, 0, 127],
+ "duration_factor": 1e-2
+ },
+ "draw_layers": {
+ "Main": {
+ "draw_aoi_scene": {
+ "draw_aoi": {
+ "color": [0, 255, 255],
+ "border_size": 1
+ }
+ },
+ "draw_aoi_matching": {
+ "update_looked_aoi": true,
+ "draw_matched_fixation": {
+ "deviation_circle_color": [255, 255, 255],
+ "draw_positions": {
+ "position_color": [0, 255, 0],
+ "line_color": [0, 0, 0]
+ }
+ },
+ "draw_matched_region": {
+ "color": [0, 255, 0],
+ "border_size": 4
+ },
+ "draw_looked_aoi": {
+ "color": [0, 255, 0],
+ "border_size": 2
+ },
+ "looked_aoi_name_color": [255, 255, 255],
+ "looked_aoi_name_offset": [0, -10]
+ }
+ }
+ }
+ },
+ "observers": {
+ "observers.ArUcoCameraLogger": {}
+ }
+ }
+}
+```
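The `gaze_movement_identifier` entries above rely on a dispersion-threshold principle (I-DT): a fixation is a run of consecutive gaze points whose spatial dispersion stays under `deviation_max_threshold` (pixels) for at least `duration_min_threshold` (milliseconds). The following is a minimal sketch of that principle only; it is not ArGaze's implementation.

```python
# Minimal dispersion-threshold (I-DT) sketch illustrating the principle behind
# the "gaze_movement_identifier" settings above. NOT ArGaze's implementation.
def dispersion(points):
    """Sum of horizontal and vertical extents of a set of (x, y) points."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def identify_fixations(samples, deviation_max=25, duration_min=150):
    """samples: list of (timestamp_ms, x, y); returns (start, end) timestamps."""
    fixations = []
    i = 0
    while i < len(samples):
        # Grow a window until it covers the minimum duration
        j = i
        while j < len(samples) and samples[j][0] - samples[i][0] < duration_min:
            j += 1
        window = samples[i:j]
        if len(window) > 1 and dispersion([(x, y) for _, x, y in window]) <= deviation_max:
            # Extend the window while dispersion stays under the threshold
            while j < len(samples) and dispersion(
                    [(x, y) for _, x, y in samples[i:j + 1]]) <= deviation_max:
                j += 1
            fixations.append((samples[i][0], samples[j - 1][0]))
            i = j
        else:
            i += 1
    return fixations
```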
+
+The **live_processing_pipeline.json** file also references other files, which are described below.
+
+## optic_parameters.json
+
+```json
+{
+ "rms": 0.6688921504088245,
+ "dimensions": [
+ 1920,
+ 1080
+ ],
+ "K": [
+ [
+ 1135.6524381415752,
+ 0.0,
+ 956.0685325355497
+ ],
+ [
+ 0.0,
+ 1135.9272506869524,
+ 560.059099810324
+ ],
+ [
+ 0.0,
+ 0.0,
+ 1.0
+ ]
+ ],
+ "D": [
+ 0.01655492265003404,
+ 0.1985524264972037,
+ 0.002129965902489484,
+ -0.0019528582922179365,
+ -0.5792910353639452
+ ]
+}
+```
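`K` is the camera intrinsic matrix and `D` the distortion coefficients estimated by calibration. As a sketch of what `K` encodes, the plain pinhole model below projects a 3D point (in camera coordinates) to pixel coordinates, ignoring `D` for simplicity:

```python
# Sketch: project a 3D point (camera coordinates) to pixel coordinates with
# the intrinsic matrix K from optic_parameters.json, ignoring distortion D.
# Pinhole model: u = fx * X/Z + cx, v = fy * Y/Z + cy.
K = [[1135.6524381415752, 0.0, 956.0685325355497],
     [0.0, 1135.9272506869524, 560.059099810324],
     [0.0, 0.0, 1.0]]

def project(point, K):
    x, y, z = point
    u = K[0][0] * x / z + K[0][2]
    v = K[1][1] * y / z + K[1][2]
    return u, v

# A point on the optical axis lands at the principal point (cx, cy)
# project((0.0, 0.0, 1.0), K) -> (956.0685325355497, 560.059099810324)
```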
+
+## detector_parameters.json
+
+```json
+{
+ "adaptiveThreshConstant": 7,
+ "useAruco3Detection": 1
+}
+```
+
+## aruco_scene.obj
+
+```obj
+# Blender v3.0.1 OBJ File: 'scene.blend'
+# www.blender.org
+o DICT_APRILTAG_16h5#11_Marker
+v -27.600000 29.075905 -51.042164
+v -24.400000 29.075905 -51.042164
+v -27.600000 31.927124 -52.494930
+v -24.400000 31.927124 -52.494930
+s off
+f 1 2 4 3
+o DICT_APRILTAG_16h5#14_Marker
+v -27.280746 14.890414 -43.814297
+v -24.080746 14.890414 -43.814297
+v -27.280746 17.741634 -45.267063
+v -24.080746 17.741634 -45.267063
+s off
+f 5 6 8 7
+o DICT_APRILTAG_16h5#13_Marker
+v -12.126360 14.872046 -43.804939
+v -8.926359 14.872046 -43.804939
+v -12.126360 17.723267 -45.257706
+v -8.926359 17.723267 -45.257706
+s off
+f 9 10 12 11
+o DICT_APRILTAG_16h5#12_Marker
+v -43.079227 14.890414 -43.814297
+v -39.879230 14.890414 -43.814297
+v -43.079227 17.741634 -45.267063
+v -39.879230 17.741634 -45.267063
+s off
+f 13 14 16 15
+
+```
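Each `o` block above describes one marker as a quad of `v` vertices in scene units (centimeters, by assumption). As a sanity check, a small hypothetical parser can recover the marker size from the first edge of each quad:

```python
import math

# Hypothetical sketch: parse an OBJ excerpt like aruco_scene.obj above and
# estimate each marker's size from the distance between its first two vertices
# (scene units, assumed centimeters here).
def marker_sizes(obj_text):
    sizes = {}
    name, verts = None, []
    for line in obj_text.splitlines():
        if line.startswith('o '):
            if name and len(verts) >= 2:
                sizes[name] = math.dist(verts[0], verts[1])
            name, verts = line[2:], []
        elif line.startswith('v '):
            verts.append(tuple(float(c) for c in line.split()[1:4]))
    if name and len(verts) >= 2:
        sizes[name] = math.dist(verts[0], verts[1])
    return sizes
```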
+
+## aoi/Cockpit.obj
+
+```obj
+# Blender v3.0.1 OBJ File: 'scene.blend'
+# www.blender.org
+o PIC_PFD
+v -43.208000 32.020378 -52.542446
+v -26.000000 32.020378 -52.542446
+v -43.208000 14.779404 -43.757732
+v -26.000000 14.779404 -43.757732
+s off
+f 3 4 2 1
+```
+
+## aoi/PIC_PFD.png
+
+![PFD frame background](../../img/haiku_PIC_PFD_background.png)
+
+## aoi/PIC_PFD.svg
+
+```svg
+<svg>
+ <rect id="PIC_PFD_Air_Speed" x="93.228" y="193.217" width="135.445" height="571.812"/>
+ <rect id="PIC_PFD_Altitude" x="686.079" y="193.217" width="133.834" height="571.812"/>
+ <rect id="PIC_PFD_FMA_Mode" x="93.228" y="85.231" width="772.943" height="107.986"/>
+ <rect id="PIC_PFD_Heading" x="228.673" y="765.029" width="480.462" height="139.255"/>
+ <rect id="PIC_PFD_Attitude" x="228.673" y="193.217" width="457.406" height="571.812"/>
+ <rect id="PIC_PFD_Vertical_Speed" x="819.913" y="193.217" width="85.185" height="609.09"/>
+</svg>
+```
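Each `rect` above defines one 2D AOI on the PFD frame, keyed by its `id`. A minimal sketch of reading such an SVG into plain tuples (a hypothetical helper, not ArGaze's loader) could look like:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch (not ArGaze's loader): read the AOI rectangles from an
# SVG like aoi/PIC_PFD.svg into (x, y, width, height) tuples keyed by id.
def read_aoi_rects(svg_text):
    root = ET.fromstring(svg_text)
    return {
        rect.get('id'): tuple(float(rect.get(k)) for k in ('x', 'y', 'width', 'height'))
        for rect in root.iter('rect')
    }
```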
diff --git a/docs/use_cases/simone_a320_cockpit_simulator.md b/docs/use_cases/simone_a320_cockpit_simulator.md
deleted file mode 100644
index a3f2138..0000000
--- a/docs/use_cases/simone_a320_cockpit_simulator.md
+++ /dev/null
@@ -1,28 +0,0 @@
----
-title: SimOne A320 cockpit simulator
----
-
-SimOne A320 cockpit simulator
-=============================
-
-The [ACHIL platform](http://achil.recherche.enac.fr) have a A320 glass cockpit simulator usually operated by ENAC’s Air Transportation department for system engineering courses to students. It is also used during MCTA training to give them an overview of the pilot’s counterpart. As this cockpit is no longer certified, it can be modified for research purposes and prototyping. It can also be connected to any simulation ran on the platform and integrate the rest of traffic.
-
-In order to track pilots gaze during experimentation, a set of ArUco markers have been positioned to cover most of cockpit workspace.
-
-![Cockpit](../img/simone_cockpit.png)
-
-Then, in order to build AR environment from such complex geometry workspace, a 3D LIDAR scanner have been used to get a 3D scan of cockpit.
-
-![Cockpit 3D](../img/simone_cockpit_3d.png)
-
-The 3D scan have been loaded in a 3D editor to help in ArUco markers and AOI poses reporting.
-
-![ArUco scene](../img/simone_aruco_scene.png) ![AOI scene](../img/simone_aoi_scene.png)
-
-Finally, a python script connect Tobii eyetracker glasses to ArGaze toolkit. The 3D AR environment is loaded then, ArUco markers are detected from Tobii eyetracker field camera stream allowing to estimate pilote head pose. The AOI are projected into camera image then, gaze positions are analyzed to identify fixations and saccades to finally check if fixations matched any projected AOI.
-
-![AOI and gaze projection](../img/simone_projection.png)
-
-A demonstration movie to see ArUco detection and AOI projection in real-time
-
-![type:video](http://achil.recherche.enac.fr/videos/marker_detection_and_aoi_projection.mp4)
diff --git a/mkdocs.yml b/mkdocs.yml
index bdb8603..2ec7046 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -44,7 +44,11 @@ nav:
- user_guide/utils/ready-made_scripts.md
- user_guide/utils/demonstrations_scripts.md
- Use Cases:
- - use_cases/simone_a320_cockpit_simulator.md
+ - Pilot gaze monitoring:
+ - use_cases/pilot_gaze_monitoring/introduction.md
+ - use_cases/pilot_gaze_monitoring/context.md
+ - use_cases/pilot_gaze_monitoring/pipeline.md
+ - use_cases/pilot_gaze_monitoring/observers.md
- Code Reference:
- argaze.md
- Contributor Guide: