Live streaming context
======================

The **live_streaming_context.json** file configures the eye tracker device and the live streaming of its gaze data and scene camera video:

```json
{
    "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
        "name": "Tobii Pro Glasses 2 live stream",
        "address": "10.34.0.17",
        "project": "HAIKU-XP",
        "participant": "Pilot-A",
        "configuration": {
            "sys_ec_preset": "Indoor",
            "sys_sc_width": 1920,
            "sys_sc_height": 1080,
            "sys_sc_fps": 25,
            "sys_sc_preset": "Auto",
            "sys_et_freq": 50,
            "sys_mems_freq": 100
        },
        "pipeline": "live_processing_pipeline.json",
        "observers": {
            "observers.IvyBus": {
                "name": "argaze_haiku",
                "bus": "10.34.127.255:2023"
            }
        }
    }
}
```
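
A wrong device address or bus setting only surfaces at launch time, so the file can be sanity-checked beforehand with Python's standard `json` module. The sketch below embeds the relevant settings as a string for illustration; in practice, read them from **live_streaming_context.json**:

```python
import json

# Illustrative copy of the relevant settings; in practice, load the real file:
# context = json.load(open('live_streaming_context.json'))
context_text = '''
{
    "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
        "name": "Tobii Pro Glasses 2 live stream",
        "address": "10.34.0.17",
        "configuration": {"sys_et_freq": 50, "sys_sc_fps": 25}
    }
}
'''

context = json.loads(context_text)
settings = context["argaze.utils.contexts.TobiiProGlasses2.LiveStream"]

# The eye tracker address and gaze sampling frequency must be set
assert settings["address"], "missing eye tracker address"
print(settings["name"], settings["configuration"]["sys_et_freq"])
```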

The **live_streaming_context.json** file also mentions **live_processing_pipeline.json**, which is described in the next chapter.

Overview
========

**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.

This use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map the pilot's gaze onto multiple cockpit instruments in real time, then matches fixations with AOIs through the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).

## Background

The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among other things, the pilot during startle or surprise events.
One of the features the assistant provides through **ArGaze** is situation awareness support: it ensures that the pilot keeps an up-to-date awareness of the aircraft state by monitoring gaze and flight parameters.
When this support is active, relevant information is highlighted on the Primary Flight Display (PFD).

![SA alert](../../img/haiku_sa_alert.png)

## Experiment context

Pilot eye tracking data were provided by Tobii Pro Glasses 2, a head-mounted eye tracker.
The gaze and scene camera video were captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier.
Since the eye tracker is head-mounted, ArUco markers were placed at various locations within an A320 cockpit simulator so that several of them were always visible in the field of view of the eye tracker camera.

![SimOne cockpit](../../img/simone_cockpit.png)

The [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) enabled real-time gaze mapping onto multiple screens and panels around the pilot-in-command position, while the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md) identified fixations and matched them with dynamic AOIs related to each instrument.
To identify the relevant AOIs, a 3D model of the cockpit describing the AOIs and the marker positions was created.

![ArUco markers and AOI scene](../../img/haiku_aoi.png)

Finally, fixation events were sent in real time through the [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters on the PFD screen.

## ArGaze setup

The project defines 3 main files to integrate **ArGaze** into the experiment:

* The context that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
* The pipeline that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
* The observers that send fixation events to the Ivy bus middleware: [observers.py](observers.md)

The project is loaded by executing the following command:

```shell
python -m argaze load live_streaming_context.json
```

Fixation events sending
=======================

The **observers.py** file implements the observers declared in **live_streaming_context.json** and **live_processing_pipeline.json**: an Ivy bus handler and an **ArUcoCamera** activity logger that reports marker detection and fixation events on the bus.

```python
import logging

from argaze import DataFeatures, GazeFeatures

from ivy.std_api import *
from ivy.ivy import IvyIllegalStateError


class IvyBus(DataFeatures.PipelineStepObject):
    """Handle Ivy bus."""

    @DataFeatures.PipelineStepInit
    def __init__(self, **kwargs):

        self.__bus = None

    @property
    def bus(self) -> str:
        return self.__bus

    @bus.setter
    def bus(self, bus: str):
        self.__bus = bus

    @DataFeatures.PipelineStepEnter
    def __enter__(self, parent = None):

        # Enable Ivy bus
        IvyInit(self.name)
        IvyStart(self.__bus)

        return self

    @DataFeatures.PipelineStepExit
    def __exit__(self, exception_type, exception_value, exception_traceback):

        # Stop Ivy bus
        IvyStop()


class ArUcoCameraLogger(DataFeatures.PipelineStepObject):
    """Log ArUcoCamera activity."""

    @DataFeatures.PipelineStepInit
    def __init__(self, **kwargs):

        self._last_markers_number = None

    def on_watch(self, timestamp, aruco_camera, exception):
        """Report ArUco markers detection info on Ivy bus."""

        # Wait for the number of detected markers to change
        if aruco_camera.aruco_detector.detected_markers_number() != self._last_markers_number:

            self._last_markers_number = aruco_camera.aruco_detector.detected_markers_number()

            output = f'ArUcoDetection MarkersNumber={self._last_markers_number}'

            # Send Ivy message
            IvySendMsg(output)

            logging.debug('%i %s', timestamp, output)

    def on_look(self, timestamp, aruco_camera, exception):
        """Report fixation and metrics on Ivy bus."""

        # Select 'Main' layer
        main_layer = aruco_camera.layers['Main']

        if GazeFeatures.is_fixation(aruco_camera.last_gaze_movement()):

            fixation = aruco_camera.last_gaze_movement()

            # Output in-progress fixation data
            if not fixation.is_finished():

                output = f'FixationInProgress Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.last_looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'

                # Send Ivy message
                IvySendMsg(output)

                logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)

            # Output finished fixation data
            else:

                output = f'FixationEnd Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.aoi_matcher.looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'

                # Send Ivy message
                IvySendMsg(output)

                logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
```

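
On the receiving side, the situation awareness software gets these space-separated `Key=Value` messages. Below is a minimal sketch of how a subscriber might decode them; the regular expression simply mirrors the f-strings above, and the Ivy subscription itself is omitted:

```python
import re

# Pattern mirroring the fixation messages built by ArUcoCameraLogger.on_look
FIXATION_MSG = re.compile(
    r'Fixation(?P<state>InProgress|End) '
    r'Start=(?P<start>\S+) Duration=(?P<duration>\S+) '
    r'AOI=(?P<aoi>\S+) Probabilities=(?P<probabilities>.+)'
)

def parse_fixation(message: str) -> dict:
    """Return the fields of a FixationInProgress/FixationEnd message."""
    match = FIXATION_MSG.match(message)
    if match is None:
        raise ValueError(f'not a fixation message: {message!r}')
    return match.groupdict()

fields = parse_fixation('FixationEnd Start=1024 Duration=256 AOI=PIC_PFD_Attitude Probabilities={}')
print(fields['state'], fields['aoi'])
```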
Live processing pipeline
========================

The **live_processing_pipeline.json** file describes the **ArUcoCamera** pipeline that detects ArUco markers, projects the cockpit AOIs, and identifies and matches gaze fixations:

```json
{
    "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
        "name": "Camera",
        "size": [1920, 1080],
        "aruco_detector": {
            "dictionary": "DICT_APRILTAG_16h5",
            "optic_parameters": "optic_parameters.json",
            "parameters": "detector_parameters.json"
        },
        "gaze_movement_identifier": {
            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                "deviation_max_threshold": 25,
                "duration_min_threshold": 150
            }
        },
        "filter_in_progress_identification": false,
        "scenes": {
            "Cockpit": {
                "aruco_markers_group": "aruco_scene.obj",
                "layers": {
                    "Main": {
                        "aoi_scene": "aoi/Cockpit.obj"
                    }
                },
                "frames": {
                    "PIC_PFD": {
                        "size": [960, 1080],
                        "background": "aoi/PIC_PFD.png",
                        "gaze_movement_identifier": {
                            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                                "deviation_max_threshold": 50,
                                "duration_min_threshold": 150
                            }
                        },
                        "layers": {
                            "Main": {
                                "aoi_scene": "aoi/PIC_PFD.svg"
                            }
                        },
                        "image_parameters": {
                            "background_weight": 1,
                            "draw_gaze_positions": {
                                "color": [0, 255, 255],
                                "size": 15
                            }
                        }
                    }
                },
                "angle_tolerance": 15.0,
                "distance_tolerance": 10.0
            }
        },
        "layers": {
            "Main": {
                "aoi_matcher": {
                    "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
                        "coverage_threshold": 0.25
                    }
                }
            }
        },
        "image_parameters": {
            "background_weight": 1,
            "draw_gaze_positions": {
                "color": [0, 255, 255],
                "size": 4
            },
            "draw_detected_markers": {
                "color": [0, 255, 0],
                "draw_axes": {
                    "thickness": 4
                }
            },
            "draw_fixations": {
                "deviation_circle_color": [255, 127, 255],
                "duration_border_color": [127, 0, 127],
                "duration_factor": 1e-2
            },
            "draw_layers": {
                "Main": {
                    "draw_aoi_scene": {
                        "draw_aoi": {
                            "color": [0, 255, 255],
                            "border_size": 1
                        }
                    },
                    "draw_aoi_matching": {
                        "update_looked_aoi": true,
                        "draw_matched_fixation": {
                            "deviation_circle_color": [255, 255, 255],
                            "draw_positions": {
                                "position_color": [0, 255, 0],
                                "line_color": [0, 0, 0]
                            }
                        },
                        "draw_matched_region": {
                            "color": [0, 255, 0],
                            "border_size": 4
                        },
                        "draw_looked_aoi": {
                            "color": [0, 255, 0],
                            "border_size": 2
                        },
                        "looked_aoi_name_color": [255, 255, 255],
                        "looked_aoi_name_offset": [0, -10]
                    }
                }
            }
        },
        "observers": {
            "observers.ArUcoCameraLogger": {}
        }
    }
}
```
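
Both `gaze_movement_identifier` entries select dispersion-based identification: a window of gaze positions counts as a fixation once it lasts at least `duration_min_threshold` milliseconds while staying within `deviation_max_threshold` pixels. The spatial criterion can be sketched as follows (an illustration of the principle, not ArGaze's actual implementation):

```python
import math

def within_deviation(points, deviation_max_threshold=25.0):
    """True when every gaze position deviates less than the threshold
    (in pixels) from the centroid of the window."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return all(math.hypot(x - cx, y - cy) < deviation_max_threshold for x, y in points)

print(within_deviation([(960, 540), (963, 541), (958, 544)]))  # tight cluster: True
print(within_deviation([(960, 540), (1100, 540)]))             # saccade-like jump: False
```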

The **live_processing_pipeline.json** file also mentions other files, which are described below.

## optic_parameters.json

```json
{
    "rms": 0.6688921504088245,
    "dimensions": [1920, 1080],
    "K": [
        [1135.6524381415752, 0.0, 956.0685325355497],
        [0.0, 1135.9272506869524, 560.059099810324],
        [0.0, 0.0, 1.0]
    ],
    "D": [
        0.01655492265003404,
        0.1985524264972037,
        0.002129965902489484,
        -0.0019528582922179365,
        -0.5792910353639452
    ]
}
```
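
`K` is the 3x3 camera matrix of the scene camera (focal lengths and principal point, in pixels) and `D` its distortion coefficients, both estimated by camera calibration; `rms` is the calibration residual. The sketch below shows how a 3D point expressed in the camera frame maps to a pixel under the pinhole model, ignoring `D`:

```python
# Camera matrix values copied from optic_parameters.json above
K = [[1135.6524381415752, 0.0, 956.0685325355497],
     [0.0, 1135.9272506869524, 560.059099810324],
     [0.0, 0.0, 1.0]]

def project(point_3d):
    """Pinhole projection of a 3D camera-frame point to pixel coordinates
    (distortion coefficients D are ignored in this sketch)."""
    x, y, z = point_3d
    fx, cx = K[0][0], K[0][2]
    fy, cy = K[1][1], K[1][2]
    return (fx * x / z + cx, fy * y / z + cy)

u, v = project((0.0, 0.0, 1.0))  # a point on the optical axis...
print(u, v)                      # ...lands at the principal point
```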

## detector_parameters.json

These detection tuning parameters are passed to the underlying OpenCV ArUco detector.

```json
{
    "adaptiveThreshConstant": 7,
    "useAruco3Detection": 1
}
```

## aruco_scene.obj

```obj
# Blender v3.0.1 OBJ File: 'scene.blend'
# www.blender.org
o DICT_APRILTAG_16h5#11_Marker
v -27.600000 29.075905 -51.042164
v -24.400000 29.075905 -51.042164
v -27.600000 31.927124 -52.494930
v -24.400000 31.927124 -52.494930
s off
f 1 2 4 3
o DICT_APRILTAG_16h5#14_Marker
v -27.280746 14.890414 -43.814297
v -24.080746 14.890414 -43.814297
v -27.280746 17.741634 -45.267063
v -24.080746 17.741634 -45.267063
s off
f 5 6 8 7
o DICT_APRILTAG_16h5#13_Marker
v -12.126360 14.872046 -43.804939
v -8.926359 14.872046 -43.804939
v -12.126360 17.723267 -45.257706
v -8.926359 17.723267 -45.257706
s off
f 9 10 12 11
o DICT_APRILTAG_16h5#12_Marker
v -43.079227 14.890414 -43.814297
v -39.879230 14.890414 -43.814297
v -43.079227 17.741634 -45.267063
v -39.879230 17.741634 -45.267063
s off
f 13 14 16 15
```
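
Each `*_Marker` object is a quad whose four vertices place the corresponding ArUco marker in the cockpit model. A quick check of the first marker's dimensions from its vertex data (units are those of the Blender model):

```python
import math

# Corner vertices of DICT_APRILTAG_16h5#11_Marker, copied from aruco_scene.obj
corners = [
    (-27.600000, 29.075905, -51.042164),
    (-24.400000, 29.075905, -51.042164),
    (-27.600000, 31.927124, -52.494930),
    (-24.400000, 31.927124, -52.494930),
]

def dist(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

# Both edges measure 3.2 model units: the marker is square
width = round(dist(corners[0], corners[1]), 3)
height = round(dist(corners[0], corners[2]), 3)
print(width, height)
```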

## aoi/Cockpit.obj

```obj
# Blender v3.0.1 OBJ File: 'scene.blend'
# www.blender.org
o PIC_PFD
v -43.208000 32.020378 -52.542446
v -26.000000 32.020378 -52.542446
v -43.208000 14.779404 -43.757732
v -26.000000 14.779404 -43.757732
s off
f 3 4 2 1
```

## aoi/PIC_PFD.png

![PFD frame background](../../img/haiku_PIC_PFD_background.png)

## aoi/PIC_PFD.svg

```svg
<svg>
    <rect id="PIC_PFD_Air_Speed" x="93.228" y="193.217" width="135.445" height="571.812"/>
    <rect id="PIC_PFD_Altitude" x="686.079" y="193.217" width="133.834" height="571.812"/>
    <rect id="PIC_PFD_FMA_Mode" x="93.228" y="85.231" width="772.943" height="107.986"/>
    <rect id="PIC_PFD_Heading" x="228.673" y="765.029" width="480.462" height="139.255"/>
    <rect id="PIC_PFD_Attitude" x="228.673" y="193.217" width="457.406" height="571.812"/>
    <rect id="PIC_PFD_Vertical_Speed" x="819.913" y="193.217" width="85.185" height="609.09"/>
</svg>
```
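
Each `rect` defines a named 2D AOI in the coordinates of the PIC_PFD frame declared in the pipeline. As a sketch of how such a file maps to named rectangles, the standard library suffices (ArGaze performs its own SVG parsing internally):

```python
import xml.etree.ElementTree as ET

# Excerpt of aoi/PIC_PFD.svg embedded for illustration
SVG = '''<svg>
    <rect id="PIC_PFD_Air_Speed" x="93.228" y="193.217" width="135.445" height="571.812"/>
    <rect id="PIC_PFD_Altitude" x="686.079" y="193.217" width="133.834" height="571.812"/>
</svg>'''

# Each rect becomes a named rectangular AOI: (x, y, width, height)
aois = {
    rect.get('id'): tuple(float(rect.get(key)) for key in ('x', 'y', 'width', 'height'))
    for rect in ET.fromstring(SVG).iter('rect')
}
print(sorted(aois))
```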