From 93653734702091808aabe637dc727d55f483e72c Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Wed, 11 Sep 2024 10:23:40 +0200
Subject: Changing monitoring to tracking.
---
docs/use_cases/pilot_gaze_monitoring/context.md | 41 ---
.../pilot_gaze_monitoring/introduction.md | 47 ----
docs/use_cases/pilot_gaze_monitoring/observers.md | 103 -------
docs/use_cases/pilot_gaze_monitoring/pipeline.md | 311 ---------------------
docs/use_cases/pilot_gaze_tracking/context.md | 41 +++
docs/use_cases/pilot_gaze_tracking/introduction.md | 47 ++++
docs/use_cases/pilot_gaze_tracking/observers.md | 103 +++++++
docs/use_cases/pilot_gaze_tracking/pipeline.md | 311 +++++++++++++++++++++
8 files changed, 502 insertions(+), 502 deletions(-)
delete mode 100644 docs/use_cases/pilot_gaze_monitoring/context.md
delete mode 100644 docs/use_cases/pilot_gaze_monitoring/introduction.md
delete mode 100644 docs/use_cases/pilot_gaze_monitoring/observers.md
delete mode 100644 docs/use_cases/pilot_gaze_monitoring/pipeline.md
create mode 100644 docs/use_cases/pilot_gaze_tracking/context.md
create mode 100644 docs/use_cases/pilot_gaze_tracking/introduction.md
create mode 100644 docs/use_cases/pilot_gaze_tracking/observers.md
create mode 100644 docs/use_cases/pilot_gaze_tracking/pipeline.md
diff --git a/docs/use_cases/pilot_gaze_monitoring/context.md b/docs/use_cases/pilot_gaze_monitoring/context.md
deleted file mode 100644
index 8839cb6..0000000
--- a/docs/use_cases/pilot_gaze_monitoring/context.md
+++ /dev/null
@@ -1,41 +0,0 @@
-Data capture context
-====================
-
-The context handles incoming eye tracker data before passing it on to a processing pipeline.
-
-## live_streaming_context.json
-
-For this use case we need to connect to a Tobii Pro Glasses 2 device: **ArGaze** provides a [ready-made context](../../user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md) class to capture data from this device.
-
-While *address*, *project*, *participant* and *configuration* entries are specific to the [TobiiProGlasses2.LiveStream](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.LiveStream) class, *name*, *pipeline* and *observers* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.
-
-```json
-{
- "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
- "name": "Tobii Pro Glasses 2 live stream",
- "address": "10.34.0.17",
- "project": "HAIKU-XP",
- "participant": "Pilot-A",
- "configuration": {
- "sys_ec_preset": "Indoor",
- "sys_sc_width": 1920,
- "sys_sc_height": 1080,
- "sys_sc_fps": 25,
- "sys_sc_preset": "Auto",
- "sys_et_freq": 50,
- "sys_mems_freq": 100
- },
- "pipeline": "live_processing_pipeline.json",
- "observers": {
- "observers.IvyBus": {
- "name": "argaze_haiku",
- "bus": "10.34.127.255:2023"
- }
- }
- }
-}
-```
-
-The [live_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
-
-The *IvyBus* observer object is defined in the [observers.py](observers.md) file, which is described in a later chapter.
\ No newline at end of file
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
deleted file mode 100644
index 7e88c69..0000000
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ /dev/null
@@ -1,47 +0,0 @@
-Real time head-mounted eye tracking interactions
-================================================
-
-**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.
-
-The following use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map the pilot's gaze onto many cockpit instruments in real time, and then enables AOI fixation matching with the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
-
-## Background
-
-The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among others, the pilot during startle or surprise events.
-One of the features the assistant provides through **ArGaze** is a situation awareness support that monitors the pilot's gaze and flight parameters to ensure he keeps his awareness of the aircraft state up to date.
-When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).
-
-![SA alert](../../img/haiku_sa_alert.png)
-
-## Environment
-
-Due to the complexity of the cockpit simulator's geometry, the pilot's eyes are tracked with a head-mounted eye tracker (Tobii Pro Glasses 2).
-The gaze and scene camera video were captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier computer.
-ArUco markers were placed at various locations within the cockpit simulator to ensure that several of them were constantly visible in the field of view of the eye tracker camera.
-
-![SimOne cockpit](../../img/simone_cockpit.png)
-
-The [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) enabled real-time gaze mapping onto multiple screens and panels around the pilot-in-command position, while the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md) identified fixations and matched them with dynamic AOIs related to each instrument.
-To identify the relevant AOIs, a 3D model of the cockpit describing the AOIs and the marker positions was created.
-
-![ArUco markers and AOI scene](../../img/haiku_aoi.png)
-
-Finally, fixation events were sent in real time through the [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters on the PFD screen.
-
-## Setup
-
-The setup to integrate **ArGaze** into the experiment is defined by three main files, detailed in the next chapters:
-
-* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
-* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
-* The observers file that sends fixation events via the Ivy bus middleware: [observers.py](observers.md)
-
-As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):
-
-```shell
-python -m argaze load live_streaming_context.json
-```
-
-This command opens a GUI window that allows starting gaze calibration, launching recording, and monitoring gaze mapping. Another window displays gaze mapping onto the PFD screen.
-
-![ArGaze load GUI for Haiku](../../img/argaze_load_gui_haiku.png)
diff --git a/docs/use_cases/pilot_gaze_monitoring/observers.md b/docs/use_cases/pilot_gaze_monitoring/observers.md
deleted file mode 100644
index 5f5bc78..0000000
--- a/docs/use_cases/pilot_gaze_monitoring/observers.md
+++ /dev/null
@@ -1,103 +0,0 @@
-Fixation events sending
-=======================
-
-Observers are attached to pipeline steps to be notified when a method is called.
-
-## observers.py
-
-For this use case we need to enable [Ivy bus communication](https://gitlab.com/ivybus/ivy-python/) to log ArUco detection results (on *ArUcoCamera.on_watch* call) and fixation identification with AOI matching (on *ArUcoCamera.on_look* call).
-
-```python
-import logging
-
-from argaze import DataFeatures, GazeFeatures
-
-from ivy.std_api import *
-from ivy.ivy import IvyIllegalStateError
-
-
-class IvyBus(DataFeatures.PipelineStepObject):
- """Handle Ivy bus."""
-
- @DataFeatures.PipelineStepInit
- def __init__(self, **kwargs):
-
- self.__bus = None
-
- @property
- def bus(self) -> str:
- return self.__bus
-
- @bus.setter
- def bus(self, bus: str):
- self.__bus = bus
-
- @DataFeatures.PipelineStepEnter
- def __enter__(self, parent = None):
-
- # Enable Ivy bus
- IvyInit(self.name)
- IvyStart(self.__bus)
-
- return self
-
- @DataFeatures.PipelineStepExit
- def __exit__(self, exception_type, exception_value, exception_traceback):
-
- # Stop Ivy bus
- IvyStop()
-
-
-class ArUcoCameraLogger(DataFeatures.PipelineStepObject):
- """Log ArUcoCamera activity."""
-
- @DataFeatures.PipelineStepInit
- def __init__(self, **kwargs):
-
- self._last_markers_number = None
-
- def on_watch(self, timestamp, aruco_camera, exception):
- """Report ArUco markers detection info on Ivy bus."""
-
-        # Report only when the number of detected markers changes
- if aruco_camera.aruco_detector.detected_markers_number() != self._last_markers_number:
-
- self._last_markers_number = aruco_camera.aruco_detector.detected_markers_number()
-
- output = f'ArUcoDetection MarkersNumber={self._last_markers_number}'
-
- # Send Ivy message
- IvySendMsg(output)
-
- logging.debug('%i %s', timestamp, output)
-
- def on_look(self, timestamp, aruco_camera, exception):
- """Report fixation and metrics on Ivy bus."""
-
- # Select 'Main' layer
- main_layer = aruco_camera.layers['Main']
-
- if GazeFeatures.is_fixation(aruco_camera.last_gaze_movement()):
-
- fixation = aruco_camera.last_gaze_movement()
-
- # Output in progress fixation data
- if not fixation.is_finished():
-
- output = f'FixationInProgress Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.last_looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'
-
- # Send Ivy message
- IvySendMsg(output)
-
- logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
-
- # Output finished fixation data
- else:
-
- output = f'FixationEnd Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.aoi_matcher.looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'
-
- # Send Ivy message
- IvySendMsg(output)
-
- logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
-```
diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
deleted file mode 100644
index 65fccc3..0000000
--- a/docs/use_cases/pilot_gaze_monitoring/pipeline.md
+++ /dev/null
@@ -1,311 +0,0 @@
-Live processing pipeline
-========================
-
-The pipeline processes camera images and gaze data to enable gaze mapping and gaze analysis.
-
-## live_processing_pipeline.json
-
-For this use case we need to detect ArUco markers to enable gaze mapping: **ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to set up an [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).
-
-```json
-{
- "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
- "name": "Camera",
- "size": [1920, 1080],
- "aruco_detector": {
- "dictionary": "DICT_APRILTAG_16h5",
- "optic_parameters": "optic_parameters.json",
- "parameters": "detector_parameters.json"
- },
- "gaze_movement_identifier": {
- "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
- "deviation_max_threshold": 25,
- "duration_min_threshold": 150
- }
- },
- "filter_in_progress_identification": false,
- "scenes": {
- "Cockpit": {
- "aruco_markers_group": "aruco_scene.obj",
- "layers": {
- "Main" : {
- "aoi_scene": "Cockpit.obj"
- }
- },
- "frames": {
- "PIC_PFD": {
- "size": [960, 1080],
- "background": "PIC_PFD.png",
- "layers": {
- "Main": {
- "aoi_scene": "PIC_PFD.svg"
- }
- },
- "image_parameters": {
- "background_weight": 1,
- "draw_gaze_positions": {
- "color": [0, 255, 255],
- "size": 15
- }
- }
- }
- }
- }
- },
- "layers": {
- "Main": {
- "aoi_matcher": {
- "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
- "coverage_threshold": 0.25
- }
- }
- }
- },
- "image_parameters": {
- "background_weight": 1,
- "draw_gaze_positions": {
- "color": [0, 255, 255],
- "size": 4
- },
- "draw_detected_markers": {
- "color": [0, 255, 0],
- "draw_axes": {
- "thickness": 4
- }
- },
- "draw_fixations": {
- "deviation_circle_color": [255, 127, 255],
- "duration_border_color": [127, 0, 127],
- "duration_factor": 1e-2
- },
- "draw_layers": {
- "Main": {
- "draw_aoi_scene": {
- "draw_aoi": {
- "color": [0, 255, 255],
- "border_size": 1
- }
- },
- "draw_aoi_matching": {
- "update_looked_aoi": true,
- "draw_matched_fixation": {
- "deviation_circle_color": [255, 255, 255],
- "draw_positions": {
- "position_color": [0, 255, 0],
- "line_color": [0, 0, 0]
- }
- },
- "draw_matched_region": {
- "color": [0, 255, 0],
- "border_size": 4
- },
- "draw_looked_aoi": {
- "color": [0, 255, 0],
- "border_size": 2
- },
- "looked_aoi_name_color": [255, 255, 255],
- "looked_aoi_name_offset": [0, -10]
- }
- }
- }
- },
- "observers": {
- "observers.ArUcoCameraLogger": {},
- "argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
- "path": "_export/look_performance.csv"
- },
- "argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
- "path": "_export/watch_performance.csv"
- }
- }
- }
-}
-```
-
-All the files mentioned above are described below.
-
-The *ArUcoCameraLogger* observer object is defined in the [observers.py](observers.md) file, which is described in the next chapter.
-
-## optic_parameters.json
-
-This file defines the Tobii Pro Glasses 2 scene camera optic parameters, which have been calculated as explained in [the camera calibration chapter](../../user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md).
-
-```json
-{
- "rms": 0.6688921504088245,
- "dimensions": [
- 1920,
- 1080
- ],
- "K": [
- [
- 1135.6524381415752,
- 0.0,
- 956.0685325355497
- ],
- [
- 0.0,
- 1135.9272506869524,
- 560.059099810324
- ],
- [
- 0.0,
- 0.0,
- 1.0
- ]
- ],
- "D": [
- 0.01655492265003404,
- 0.1985524264972037,
- 0.002129965902489484,
- -0.0019528582922179365,
- -0.5792910353639452
- ]
-}
-```
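-
-These values follow the OpenCV conventions: *K* is the 3x3 camera matrix and *D* holds the distortion coefficients. As a minimal sketch (not part of the use case files), assuming OpenCV and NumPy are available, they could be used to undistort a scene camera frame:
-
-```python
-import json
-
-import cv2
-import numpy
-
-# Load the calibration file described above
-with open('optic_parameters.json') as file:
-    optic_parameters = json.load(file)
-
-K = numpy.array(optic_parameters['K'])
-D = numpy.array(optic_parameters['D'])
-
-# Undistort one scene camera frame (hypothetical image file)
-image = cv2.imread('scene_frame.png')
-undistorted = cv2.undistort(image, K, D)
-cv2.imwrite('scene_frame_undistorted.png', undistorted)
-```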
-
-## detector_parameters.json
-
-This file defines the ArUco detector parameters, as explained in [the detection improvement chapter](../../user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md).
-
-```json
-{
- "adaptiveThreshConstant": 7,
- "useAruco3Detection": true
-}
-```
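-
-These keys mirror the OpenCV ArUco detector parameter names. As an illustrative sketch (assuming a recent OpenCV version where *cv2.aruco.DetectorParameters* exposes these attributes), the same configuration could be built manually:
-
-```python
-import cv2.aruco as aruco
-
-# Detector parameters equivalent to detector_parameters.json
-parameters = aruco.DetectorParameters()
-parameters.adaptiveThreshConstant = 7
-parameters.useAruco3Detection = True
-
-# Attach them to a detector using the same dictionary as the pipeline
-dictionary = aruco.getPredefinedDictionary(aruco.DICT_APRILTAG_16h5)
-detector = aruco.ArucoDetector(dictionary, parameters)
-```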
-
-## aruco_scene.obj
-
-This file defines where the ArUco markers are placed within the cockpit geometry. Marker positions have been edited in [Blender](https://www.blender.org/) from a 3D scan of the cockpit, then exported in OBJ format.
-
-```obj
-# Blender v3.0.1 OBJ File: 'scene.blend'
-# www.blender.org
-o DICT_APRILTAG_16h5#2_Marker
-v -2.300000 18.573788 -49.271420
-v 2.700000 18.573788 -49.271420
-v -2.300000 23.028820 -51.541370
-v 2.700000 23.028820 -51.541370
-s off
-f 1 2 4 3
-o DICT_APRILTAG_16h5#3_Marker
-v 37.993317 9.909389 -42.172752
-v 42.993317 9.909389 -42.172752
-v 37.993317 14.364422 -44.442703
-v 42.993317 14.364422 -44.442703
-s off
-f 5 6 8 7
-o DICT_APRILTAG_16h5#11_Marker
-v -27.600000 29.075905 -51.042164
-v -24.400000 29.075905 -51.042164
-v -27.600000 31.927124 -52.494930
-v -24.400000 31.927124 -52.494930
-s off
-f 9 10 12 11
-o DICT_APRILTAG_16h5#14_Marker
-v -27.280746 14.890414 -43.814297
-v -24.080746 14.890414 -43.814297
-v -27.280746 17.741634 -45.267063
-v -24.080746 17.741634 -45.267063
-s off
-f 13 14 16 15
-o DICT_APRILTAG_16h5#21_Marker
-v 8.939880 28.459042 -50.445347
-v 12.139881 28.459042 -50.445347
-v 8.939880 31.310265 -51.898113
-v 12.139881 31.310265 -51.898113
-s off
-f 17 18 20 19
-o DICT_APRILTAG_16h5#22_Marker
-v 8.939880 21.949581 -47.128613
-v 12.139881 21.949581 -47.128613
-v 8.939880 24.800800 -48.581379
-v 12.139881 24.800800 -48.581379
-s off
-f 21 22 24 23
-o DICT_APRILTAG_16h5#13_Marker
-v -12.126360 14.872046 -43.804939
-v -8.926359 14.872046 -43.804939
-v -12.126360 17.723267 -45.257706
-v -8.926359 17.723267 -45.257706
-s off
-f 25 26 28 27
-o DICT_APRILTAG_16h5#12_Marker
-v -43.079227 14.890414 -43.814297
-v -39.879230 14.890414 -43.814297
-v -43.079227 17.741634 -45.267063
-v -39.879230 17.741634 -45.267063
-s off
-f 29 30 32 31
-```
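-
-Each object in this file is a quad whose name encodes the marker dictionary and identifier (*DICT_APRILTAG_16h5#2_Marker* is marker 2, for instance). As a minimal parsing sketch (not part of the use case files), the marker identifiers and their corner vertices could be listed like this:
-
-```python
-import re
-
-markers = {}
-vertices = []
-
-with open('aruco_scene.obj') as file:
-    for line in file:
-        if line.startswith('v '):
-            # Collect vertex coordinates in declaration order
-            vertices.append(tuple(float(value) for value in line.split()[1:]))
-        elif line.startswith('o '):
-            # Object names follow the <DICTIONARY>#<ID>_Marker convention
-            identifier = int(re.search(r'#(\d+)_', line).group(1))
-            markers[identifier] = len(vertices)  # index of the first corner
-
-for identifier, start in markers.items():
-    print(identifier, vertices[start:start + 4])
-```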
-
-## Cockpit.obj
-
-This file defines where the AOI are placed within the cockpit geometry. AOI positions have been edited in [Blender](https://www.blender.org/) from a 3D scan of the cockpit, then exported in OBJ format.
-
-```obj
-# Blender v3.0.1 OBJ File: 'scene.blend'
-# www.blender.org
-o PIC_PFD
-v -43.208000 32.020378 -52.542446
-v -26.000000 32.020378 -52.542446
-v -43.208000 14.779404 -43.757732
-v -26.000000 14.779404 -43.757732
-s off
-f 3 4 2 1
-o ECAM_Engine_Fuel_Flaps
-v 8.657453 16.194618 -44.196308
-v 27.672760 16.055838 -44.125595
-v 8.657453 31.527327 -52.008713
-v 27.672760 31.441055 -51.964756
-s off
-f 5 6 8 7
-o AP_ATHR_Plan.033
-v 16.653587 46.982643 -32.403645
-v 21.580402 46.974689 -32.399593
-v 16.653587 52.562916 -35.246937
-v 21.580402 52.554958 -35.242882
-s off
-f 9 10 12 11
-o Exterior_Left
-v -69.756531 46.523575 -40.193161
-v 18.876167 46.523575 -55.821495
-v -69.756531 87.247131 -40.193161
-v 18.876167 87.247131 -55.821495
-s off
-f 13 14 16 15
-```
-
-## PIC_PFD.png
-
-This file is a screenshot of the PFD screen, used to monitor where the gaze is projected after the gaze mapping process.
-
-![PFD frame background](../../img/haiku_PIC_PFD_background.png)
-
-## PIC_PFD.svg
-
-This file defines where the AOI are placed within the PFD frame. AOI positions have been edited with [Inkscape](https://inkscape.org/fr/) from a screenshot of the PFD screen, then exported in SVG format.
-
-```svg
-
-```
-
-## look_performance.csv
-
-This file contains execution logs of the *ArUcoCamera.look* method. It is saved into an *_export* folder relative to the directory from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-
-On a Jetson Xavier computer, the *look* method execution time is 5.7 ms and it is called ~100 times per second.
-
-## watch_performance.csv
-
-This file contains execution logs of the *ArUcoCamera.watch* method. It is saved into an *_export* folder relative to the directory from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-
-On a Jetson Xavier computer with CUDA acceleration, the *watch* method execution time is 46.5 ms and it is called more than 12 times per second.
diff --git a/docs/use_cases/pilot_gaze_tracking/context.md b/docs/use_cases/pilot_gaze_tracking/context.md
new file mode 100644
index 0000000..8839cb6
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_tracking/context.md
@@ -0,0 +1,41 @@
+Data capture context
+====================
+
+The context handles incoming eye tracker data before passing it on to a processing pipeline.
+
+## live_streaming_context.json
+
+For this use case we need to connect to a Tobii Pro Glasses 2 device: **ArGaze** provides a [ready-made context](../../user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md) class to capture data from this device.
+
+While *address*, *project*, *participant* and *configuration* entries are specific to the [TobiiProGlasses2.LiveStream](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.LiveStream) class, *name*, *pipeline* and *observers* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.
+
+```json
+{
+ "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
+ "name": "Tobii Pro Glasses 2 live stream",
+ "address": "10.34.0.17",
+ "project": "HAIKU-XP",
+ "participant": "Pilot-A",
+ "configuration": {
+ "sys_ec_preset": "Indoor",
+ "sys_sc_width": 1920,
+ "sys_sc_height": 1080,
+ "sys_sc_fps": 25,
+ "sys_sc_preset": "Auto",
+ "sys_et_freq": 50,
+ "sys_mems_freq": 100
+ },
+ "pipeline": "live_processing_pipeline.json",
+ "observers": {
+ "observers.IvyBus": {
+ "name": "argaze_haiku",
+ "bus": "10.34.127.255:2023"
+ }
+ }
+ }
+}
+```
+
+The [live_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
+
+The *IvyBus* observer object is defined in the [observers.py](observers.md) file, which is described in a later chapter.
\ No newline at end of file
diff --git a/docs/use_cases/pilot_gaze_tracking/introduction.md b/docs/use_cases/pilot_gaze_tracking/introduction.md
new file mode 100644
index 0000000..7e88c69
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_tracking/introduction.md
@@ -0,0 +1,47 @@
+Real time head-mounted eye tracking interactions
+================================================
+
+**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.
+
+The following use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map the pilot's gaze onto many cockpit instruments in real time, and then enables AOI fixation matching with the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
+
+## Background
+
+The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among others, the pilot during startle or surprise events.
+One of the features the assistant provides through **ArGaze** is a situation awareness support that monitors the pilot's gaze and flight parameters to ensure he keeps his awareness of the aircraft state up to date.
+When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).
+
+![SA alert](../../img/haiku_sa_alert.png)
+
+## Environment
+
+Due to the complexity of the cockpit simulator's geometry, the pilot's eyes are tracked with a head-mounted eye tracker (Tobii Pro Glasses 2).
+The gaze and scene camera video were captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier computer.
+ArUco markers were placed at various locations within the cockpit simulator to ensure that several of them were constantly visible in the field of view of the eye tracker camera.
+
+![SimOne cockpit](../../img/simone_cockpit.png)
+
+The [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) enabled real-time gaze mapping onto multiple screens and panels around the pilot-in-command position, while the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md) identified fixations and matched them with dynamic AOIs related to each instrument.
+To identify the relevant AOIs, a 3D model of the cockpit describing the AOIs and the marker positions was created.
+
+![ArUco markers and AOI scene](../../img/haiku_aoi.png)
+
+Finally, fixation events were sent in real time through the [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters on the PFD screen.
+
+## Setup
+
+The setup to integrate **ArGaze** into the experiment is defined by three main files, detailed in the next chapters:
+
+* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
+* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
+* The observers file that sends fixation events via the Ivy bus middleware: [observers.py](observers.md)
+
+As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):
+
+```shell
+python -m argaze load live_streaming_context.json
+```
+
+This command opens a GUI window that allows starting gaze calibration, launching recording, and monitoring gaze mapping. Another window displays gaze mapping onto the PFD screen.
+
+![ArGaze load GUI for Haiku](../../img/argaze_load_gui_haiku.png)
diff --git a/docs/use_cases/pilot_gaze_tracking/observers.md b/docs/use_cases/pilot_gaze_tracking/observers.md
new file mode 100644
index 0000000..5f5bc78
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_tracking/observers.md
@@ -0,0 +1,103 @@
+Fixation events sending
+=======================
+
+Observers are attached to pipeline steps to be notified when a method is called.
+
+## observers.py
+
+For this use case we need to enable [Ivy bus communication](https://gitlab.com/ivybus/ivy-python/) to log ArUco detection results (on *ArUcoCamera.on_watch* call) and fixation identification with AOI matching (on *ArUcoCamera.on_look* call).
+
+```python
+import logging
+
+from argaze import DataFeatures, GazeFeatures
+
+from ivy.std_api import *
+from ivy.ivy import IvyIllegalStateError
+
+
+class IvyBus(DataFeatures.PipelineStepObject):
+ """Handle Ivy bus."""
+
+ @DataFeatures.PipelineStepInit
+ def __init__(self, **kwargs):
+
+ self.__bus = None
+
+ @property
+ def bus(self) -> str:
+ return self.__bus
+
+ @bus.setter
+ def bus(self, bus: str):
+ self.__bus = bus
+
+ @DataFeatures.PipelineStepEnter
+ def __enter__(self, parent = None):
+
+ # Enable Ivy bus
+ IvyInit(self.name)
+ IvyStart(self.__bus)
+
+ return self
+
+ @DataFeatures.PipelineStepExit
+ def __exit__(self, exception_type, exception_value, exception_traceback):
+
+ # Stop Ivy bus
+ IvyStop()
+
+
+class ArUcoCameraLogger(DataFeatures.PipelineStepObject):
+ """Log ArUcoCamera activity."""
+
+ @DataFeatures.PipelineStepInit
+ def __init__(self, **kwargs):
+
+ self._last_markers_number = None
+
+ def on_watch(self, timestamp, aruco_camera, exception):
+ """Report ArUco markers detection info on Ivy bus."""
+
+        # Report only when the number of detected markers changes
+ if aruco_camera.aruco_detector.detected_markers_number() != self._last_markers_number:
+
+ self._last_markers_number = aruco_camera.aruco_detector.detected_markers_number()
+
+ output = f'ArUcoDetection MarkersNumber={self._last_markers_number}'
+
+ # Send Ivy message
+ IvySendMsg(output)
+
+ logging.debug('%i %s', timestamp, output)
+
+ def on_look(self, timestamp, aruco_camera, exception):
+ """Report fixation and metrics on Ivy bus."""
+
+ # Select 'Main' layer
+ main_layer = aruco_camera.layers['Main']
+
+ if GazeFeatures.is_fixation(aruco_camera.last_gaze_movement()):
+
+ fixation = aruco_camera.last_gaze_movement()
+
+ # Output in progress fixation data
+ if not fixation.is_finished():
+
+ output = f'FixationInProgress Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.last_looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'
+
+ # Send Ivy message
+ IvySendMsg(output)
+
+ logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
+
+ # Output finished fixation data
+ else:
+
+ output = f'FixationEnd Start={fixation[0].timestamp} Duration={fixation.duration} AOI={main_layer.aoi_matcher.looked_aoi_name()} Probabilities={main_layer.aoi_matcher.looked_probabilities()}'
+
+ # Send Ivy message
+ IvySendMsg(output)
+
+ logging.debug('%i %s %s %s', timestamp, aruco_camera.last_gaze_position().value, aruco_camera.name, output)
+```
diff --git a/docs/use_cases/pilot_gaze_tracking/pipeline.md b/docs/use_cases/pilot_gaze_tracking/pipeline.md
new file mode 100644
index 0000000..65fccc3
--- /dev/null
+++ b/docs/use_cases/pilot_gaze_tracking/pipeline.md
@@ -0,0 +1,311 @@
+Live processing pipeline
+========================
+
+The pipeline processes camera images and gaze data to enable gaze mapping and gaze analysis.
+
+## live_processing_pipeline.json
+
+For this use case we need to detect ArUco markers to enable gaze mapping: **ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to set up an [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).
+
+```json
+{
+ "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
+ "name": "Camera",
+ "size": [1920, 1080],
+ "aruco_detector": {
+ "dictionary": "DICT_APRILTAG_16h5",
+ "optic_parameters": "optic_parameters.json",
+ "parameters": "detector_parameters.json"
+ },
+ "gaze_movement_identifier": {
+ "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
+ "deviation_max_threshold": 25,
+ "duration_min_threshold": 150
+ }
+ },
+ "filter_in_progress_identification": false,
+ "scenes": {
+ "Cockpit": {
+ "aruco_markers_group": "aruco_scene.obj",
+ "layers": {
+ "Main" : {
+ "aoi_scene": "Cockpit.obj"
+ }
+ },
+ "frames": {
+ "PIC_PFD": {
+ "size": [960, 1080],
+ "background": "PIC_PFD.png",
+ "layers": {
+ "Main": {
+ "aoi_scene": "PIC_PFD.svg"
+ }
+ },
+ "image_parameters": {
+ "background_weight": 1,
+ "draw_gaze_positions": {
+ "color": [0, 255, 255],
+ "size": 15
+ }
+ }
+ }
+ }
+ }
+ },
+ "layers": {
+ "Main": {
+ "aoi_matcher": {
+ "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
+ "coverage_threshold": 0.25
+ }
+ }
+ }
+ },
+ "image_parameters": {
+ "background_weight": 1,
+ "draw_gaze_positions": {
+ "color": [0, 255, 255],
+ "size": 4
+ },
+ "draw_detected_markers": {
+ "color": [0, 255, 0],
+ "draw_axes": {
+ "thickness": 4
+ }
+ },
+ "draw_fixations": {
+ "deviation_circle_color": [255, 127, 255],
+ "duration_border_color": [127, 0, 127],
+ "duration_factor": 1e-2
+ },
+ "draw_layers": {
+ "Main": {
+ "draw_aoi_scene": {
+ "draw_aoi": {
+ "color": [0, 255, 255],
+ "border_size": 1
+ }
+ },
+ "draw_aoi_matching": {
+ "update_looked_aoi": true,
+ "draw_matched_fixation": {
+ "deviation_circle_color": [255, 255, 255],
+ "draw_positions": {
+ "position_color": [0, 255, 0],
+ "line_color": [0, 0, 0]
+ }
+ },
+ "draw_matched_region": {
+ "color": [0, 255, 0],
+ "border_size": 4
+ },
+ "draw_looked_aoi": {
+ "color": [0, 255, 0],
+ "border_size": 2
+ },
+ "looked_aoi_name_color": [255, 255, 255],
+ "looked_aoi_name_offset": [0, -10]
+ }
+ }
+ }
+ },
+ "observers": {
+ "observers.ArUcoCameraLogger": {},
+ "argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
+ "path": "_export/look_performance.csv"
+ },
+ "argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
+ "path": "_export/watch_performance.csv"
+ }
+ }
+ }
+}
+```
+
+All the files mentioned above are described below.
+
+The *ArUcoCameraLogger* observer object is defined in the [observers.py](observers.md) file, which is described in the next chapter.
+
+## optic_parameters.json
+
+This file defines the Tobii Pro Glasses 2 scene camera optic parameters, which have been calculated as explained in [the camera calibration chapter](../../user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md).
+
+```json
+{
+ "rms": 0.6688921504088245,
+ "dimensions": [
+ 1920,
+ 1080
+ ],
+ "K": [
+ [
+ 1135.6524381415752,
+ 0.0,
+ 956.0685325355497
+ ],
+ [
+ 0.0,
+ 1135.9272506869524,
+ 560.059099810324
+ ],
+ [
+ 0.0,
+ 0.0,
+ 1.0
+ ]
+ ],
+ "D": [
+ 0.01655492265003404,
+ 0.1985524264972037,
+ 0.002129965902489484,
+ -0.0019528582922179365,
+ -0.5792910353639452
+ ]
+}
+```
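+
+These values follow the OpenCV conventions: *K* is the 3x3 camera matrix and *D* holds the distortion coefficients. As a minimal sketch (not part of the use case files), assuming OpenCV and NumPy are available, they could be used to undistort a scene camera frame:
+
+```python
+import json
+
+import cv2
+import numpy
+
+# Load the calibration file described above
+with open('optic_parameters.json') as file:
+    optic_parameters = json.load(file)
+
+K = numpy.array(optic_parameters['K'])
+D = numpy.array(optic_parameters['D'])
+
+# Undistort one scene camera frame (hypothetical image file)
+image = cv2.imread('scene_frame.png')
+undistorted = cv2.undistort(image, K, D)
+cv2.imwrite('scene_frame_undistorted.png', undistorted)
+```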
+
+## detector_parameters.json
+
+This file defines the ArUco detector parameters, as explained in [the detection improvement chapter](../../user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md).
+
+```json
+{
+ "adaptiveThreshConstant": 7,
+ "useAruco3Detection": true
+}
+```
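+
+These keys mirror the OpenCV ArUco detector parameter names. As an illustrative sketch (assuming a recent OpenCV version where *cv2.aruco.DetectorParameters* exposes these attributes), the same configuration could be built manually:
+
+```python
+import cv2.aruco as aruco
+
+# Detector parameters equivalent to detector_parameters.json
+parameters = aruco.DetectorParameters()
+parameters.adaptiveThreshConstant = 7
+parameters.useAruco3Detection = True
+
+# Attach them to a detector using the same dictionary as the pipeline
+dictionary = aruco.getPredefinedDictionary(aruco.DICT_APRILTAG_16h5)
+detector = aruco.ArucoDetector(dictionary, parameters)
+```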
+
+## aruco_scene.obj
+
+This file defines where the ArUco markers are placed within the cockpit geometry. Marker positions have been edited in [Blender](https://www.blender.org/) from a 3D scan of the cockpit, then exported in OBJ format.
+
+```obj
+# Blender v3.0.1 OBJ File: 'scene.blend'
+# www.blender.org
+o DICT_APRILTAG_16h5#2_Marker
+v -2.300000 18.573788 -49.271420
+v 2.700000 18.573788 -49.271420
+v -2.300000 23.028820 -51.541370
+v 2.700000 23.028820 -51.541370
+s off
+f 1 2 4 3
+o DICT_APRILTAG_16h5#3_Marker
+v 37.993317 9.909389 -42.172752
+v 42.993317 9.909389 -42.172752
+v 37.993317 14.364422 -44.442703
+v 42.993317 14.364422 -44.442703
+s off
+f 5 6 8 7
+o DICT_APRILTAG_16h5#11_Marker
+v -27.600000 29.075905 -51.042164
+v -24.400000 29.075905 -51.042164
+v -27.600000 31.927124 -52.494930
+v -24.400000 31.927124 -52.494930
+s off
+f 9 10 12 11
+o DICT_APRILTAG_16h5#14_Marker
+v -27.280746 14.890414 -43.814297
+v -24.080746 14.890414 -43.814297
+v -27.280746 17.741634 -45.267063
+v -24.080746 17.741634 -45.267063
+s off
+f 13 14 16 15
+o DICT_APRILTAG_16h5#21_Marker
+v 8.939880 28.459042 -50.445347
+v 12.139881 28.459042 -50.445347
+v 8.939880 31.310265 -51.898113
+v 12.139881 31.310265 -51.898113
+s off
+f 17 18 20 19
+o DICT_APRILTAG_16h5#22_Marker
+v 8.939880 21.949581 -47.128613
+v 12.139881 21.949581 -47.128613
+v 8.939880 24.800800 -48.581379
+v 12.139881 24.800800 -48.581379
+s off
+f 21 22 24 23
+o DICT_APRILTAG_16h5#13_Marker
+v -12.126360 14.872046 -43.804939
+v -8.926359 14.872046 -43.804939
+v -12.126360 17.723267 -45.257706
+v -8.926359 17.723267 -45.257706
+s off
+f 25 26 28 27
+o DICT_APRILTAG_16h5#12_Marker
+v -43.079227 14.890414 -43.814297
+v -39.879230 14.890414 -43.814297
+v -43.079227 17.741634 -45.267063
+v -39.879230 17.741634 -45.267063
+s off
+f 29 30 32 31
+```
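+
+Each object in this file is a quad whose name encodes the marker dictionary and identifier (*DICT_APRILTAG_16h5#2_Marker* is marker 2, for instance). As a minimal parsing sketch (not part of the use case files), the marker identifiers and their corner vertices could be listed like this:
+
+```python
+import re
+
+markers = {}
+vertices = []
+
+with open('aruco_scene.obj') as file:
+    for line in file:
+        if line.startswith('v '):
+            # Collect vertex coordinates in declaration order
+            vertices.append(tuple(float(value) for value in line.split()[1:]))
+        elif line.startswith('o '):
+            # Object names follow the <DICTIONARY>#<ID>_Marker convention
+            identifier = int(re.search(r'#(\d+)_', line).group(1))
+            markers[identifier] = len(vertices)  # index of the first corner
+
+for identifier, start in markers.items():
+    print(identifier, vertices[start:start + 4])
+```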
+
+## Cockpit.obj
+
+This file defines where the AOI are placed within the cockpit geometry. AOI positions have been edited in [Blender](https://www.blender.org/) from a 3D scan of the cockpit, then exported in OBJ format.
+
+```obj
+# Blender v3.0.1 OBJ File: 'scene.blend'
+# www.blender.org
+o PIC_PFD
+v -43.208000 32.020378 -52.542446
+v -26.000000 32.020378 -52.542446
+v -43.208000 14.779404 -43.757732
+v -26.000000 14.779404 -43.757732
+s off
+f 3 4 2 1
+o ECAM_Engine_Fuel_Flaps
+v 8.657453 16.194618 -44.196308
+v 27.672760 16.055838 -44.125595
+v 8.657453 31.527327 -52.008713
+v 27.672760 31.441055 -51.964756
+s off
+f 5 6 8 7
+o AP_ATHR_Plan.033
+v 16.653587 46.982643 -32.403645
+v 21.580402 46.974689 -32.399593
+v 16.653587 52.562916 -35.246937
+v 21.580402 52.554958 -35.242882
+s off
+f 9 10 12 11
+o Exterior_Left
+v -69.756531 46.523575 -40.193161
+v 18.876167 46.523575 -55.821495
+v -69.756531 87.247131 -40.193161
+v 18.876167 87.247131 -55.821495
+s off
+f 13 14 16 15
+```
+
+## PIC_PFD.png
+
+This file is a screenshot of the PFD screen, used to monitor where the gaze is projected after the gaze mapping process.
+
+![PFD frame background](../../img/haiku_PIC_PFD_background.png)
+
+## PIC_PFD.svg
+
+This file defines where the AOI are placed within the PFD frame. AOI positions have been edited with [Inkscape](https://inkscape.org/fr/) from a screenshot of the PFD screen, then exported in SVG format.
+
+```svg
+
+```
+
+## look_performance.csv
+
+This file contains execution logs of the *ArUcoCamera.look* method. It is saved into an *_export* folder relative to the directory from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+
+On a Jetson Xavier computer, the *look* method execution time is 5.7 ms and it is called ~100 times per second.
+
+## watch_performance.csv
+
+This file contains execution logs of the *ArUcoCamera.watch* method. It is saved into an *_export* folder relative to the directory from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+
+On a Jetson Xavier computer with CUDA acceleration, the *watch* method execution time is 46.5 ms and it is called more than 12 times per second.