Diffstat (limited to 'docs/use_cases/pilot_gaze_monitoring')
-rw-r--r-- | docs/use_cases/pilot_gaze_monitoring/context.md | 13
-rw-r--r-- | docs/use_cases/pilot_gaze_monitoring/introduction.md | 29
-rw-r--r-- | docs/use_cases/pilot_gaze_monitoring/observers.md | 2
-rw-r--r-- | docs/use_cases/pilot_gaze_monitoring/pipeline.md | 81
4 files changed, 96 insertions, 29 deletions
diff --git a/docs/use_cases/pilot_gaze_monitoring/context.md b/docs/use_cases/pilot_gaze_monitoring/context.md
index 42df132..417ed13 100644
--- a/docs/use_cases/pilot_gaze_monitoring/context.md
+++ b/docs/use_cases/pilot_gaze_monitoring/context.md
@@ -1,7 +1,14 @@
 Live streaming context
 ======================
 
-The **live_streaming_context.json** file ...
+The context handles pipeline inputs.
+
+## live_streaming_context.json
+
+For this use case we need to connect to a Tobii Pro Glasses 2 device.
+**ArGaze** provides a context class to live stream data from this device.
+
+While *address*, *project*, *participant* and *configuration* entries are specific to the [TobiiProGlasses2.LiveStream](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.LiveStream) class, *name*, *pipeline* and *observers* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.
 
 ```json
 {
@@ -30,4 +37,6 @@ The **live_streaming_context.json** file ...
 }
 ```
 
-The **live_streaming_context.json** file also mentions **live_processing_pipeline.json** which is described in the next chapter.
\ No newline at end of file
+The [live_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
+
+The observer objects are defined in the [observers.py](observers.md) file, which is described in a later chapter.
\ No newline at end of file
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
index 0faf4b1..453a443 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_monitoring/introduction.md
@@ -1,5 +1,5 @@
-Overview
-========
+Real time head-mounted eye tracking interactions
+================================================
 
 **ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.
 
@@ -9,15 +9,15 @@ The following use case has integrated the [ArUco marker pipeline](../../user_gui
 The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among others, the pilot during startle or surprise events.
 One of the features provided by the assistant through **ArGaze** is a situation awareness support that ensures the pilot updates his awareness of the aircraft state by monitoring his gaze and flight parameters.
-When this support is active, relevant information is highlighted on the Primary Flight Display (PFD).
+When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).
 
 ![SA alert](../../img/haiku_sa_alert.png)
 
-## Experiment context
+## Environment
 
-Pilot eye tracking data were provided by Tobii Pro Glasses 2, a head-mounted eye tracker.
-The gaze and scene camera video were captured through the Tobii SDK and processed in real-time on NVIDIA Jetson Xavier.
-Since the eye tracker model is head-mounted, ArUco markers were placed at various locations within a A320 cockpit simulator to ensure that several of them were constantly visible in the field of view of the eye tracker camera.
+Due to the complexity of the cockpit simulator's geometry, the pilot's eyes are tracked with a head-mounted eye tracker (Tobii Pro Glasses 2).
+The gaze and scene camera video were captured through the Tobii SDK and processed in real-time on an NVIDIA Jetson Xavier computer.
+ArUco markers were placed at various locations within the cockpit simulator to ensure that several of them were constantly visible in the field of view of the eye tracker camera.
 
 ![SimOne cockpit](../../img/simone_cockpit.png)
 
@@ -28,16 +28,19 @@ To identify the relevant AOIs, a 3D model of the cockpit describing the AOI and
 Finally, fixation events were sent in real-time through [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getter onto the PFD screen.
 
-## ArGaze setup
+## Setup
 
-The project defines 3 main files to integrate **ArGaze** to the experiment:
+The setup to integrate **ArGaze** into the experiment is defined by 3 main files:
 
-* The context that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
-* The pipeline that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
-* The observers that send fixation events to Ivy bus middleware: [observers.py](observers.md)
+* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
+* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
+* The observers file that sends fixation events via Ivy bus middleware: [observers.py](observers.md)
 
-The project is loaded by executing the following command:
+As with any **ArGaze** setup, it is loaded by executing the following command:
 
 ```shell
 python -m argaze load live_streaming_context.json
 ```
+
+## Performance
+
diff --git a/docs/use_cases/pilot_gaze_monitoring/observers.md b/docs/use_cases/pilot_gaze_monitoring/observers.md
index 3b3f07d..2e3f394 100644
--- a/docs/use_cases/pilot_gaze_monitoring/observers.md
+++ b/docs/use_cases/pilot_gaze_monitoring/observers.md
@@ -1,7 +1,7 @@
 Fixation events sending
 =======================
 
-The **observers.py** file ...
+## observers.py
 
 ```python
 import logging
diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
index d41e2c4..8f8dad0 100644
--- a/docs/use_cases/pilot_gaze_monitoring/pipeline.md
+++ b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
@@ -1,7 +1,12 @@
 Live processing pipeline
 ========================
 
-The **live_processing_pipeline.json** file ...
+The pipeline processes camera images and gaze data to enable gaze mapping and gaze analysis.
+
+## live_processing_pipeline.json
+
+For this use case we need to detect ArUco markers to enable gaze mapping.
+**ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to set up an [ArUco markers pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).
 
 ```json
 {
@@ -25,13 +30,13 @@ The **live_processing_pipeline.json** file ...
         "aruco_markers_group": "aruco_scene.obj",
         "layers": {
             "Main" : {
-                "aoi_scene": "aoi/Cockpit.obj"
+                "aoi_scene": "Cockpit.obj"
             }
         },
         "frames": {
             "PIC_PFD": {
                 "size": [960, 1080],
-                "background": "aoi/PIC_PFD.png",
+                "background": "PIC_PFD.png",
                 "gaze_movement_identifier": {
                     "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                         "deviation_max_threshold": 50,
@@ -40,7 +45,7 @@ The **live_processing_pipeline.json** file ...
                 },
                 "layers": {
                     "Main": {
-                        "aoi_scene": "aoi/PIC_PFD.svg"
+                        "aoi_scene": "PIC_PFD.svg"
                     }
                 },
                 "image_parameters": {
@@ -120,7 +125,9 @@ The **live_processing_pipeline.json** file ...
 }
 ```
 
-The **live_processing_pipeline.json** also mentions other files which are described below.
+All the files mentioned above are described below.
+
+The observer objects are defined in the [observers.py](observers.md) file, which is described in the next chapter.
 ## optic_parameters.json
 
@@ -172,38 +179,65 @@ The **live_processing_pipeline.json** also mentions other files which are descri
 ```obj
 # Blender v3.0.1 OBJ File: 'scene.blend'
 # www.blender.org
+o DICT_APRILTAG_16h5#2_Marker
+v -2.300000 18.573788 -49.271420
+v 2.700000 18.573788 -49.271420
+v -2.300000 23.028820 -51.541370
+v 2.700000 23.028820 -51.541370
+s off
+f 1 2 4 3
+o DICT_APRILTAG_16h5#3_Marker
+v 37.993317 9.909389 -42.172752
+v 42.993317 9.909389 -42.172752
+v 37.993317 14.364422 -44.442703
+v 42.993317 14.364422 -44.442703
+s off
+f 5 6 8 7
 o DICT_APRILTAG_16h5#11_Marker
 v -27.600000 29.075905 -51.042164
 v -24.400000 29.075905 -51.042164
 v -27.600000 31.927124 -52.494930
 v -24.400000 31.927124 -52.494930
 s off
-f 1 2 4 3
+f 9 10 12 11
 o DICT_APRILTAG_16h5#14_Marker
 v -27.280746 14.890414 -43.814297
 v -24.080746 14.890414 -43.814297
 v -27.280746 17.741634 -45.267063
 v -24.080746 17.741634 -45.267063
 s off
-f 5 6 8 7
+f 13 14 16 15
+o DICT_APRILTAG_16h5#21_Marker
+v 8.939880 28.459042 -50.445347
+v 12.139881 28.459042 -50.445347
+v 8.939880 31.310265 -51.898113
+v 12.139881 31.310265 -51.898113
+s off
+f 17 18 20 19
+o DICT_APRILTAG_16h5#22_Marker
+v 8.939880 21.949581 -47.128613
+v 12.139881 21.949581 -47.128613
+v 8.939880 24.800800 -48.581379
+v 12.139881 24.800800 -48.581379
+s off
+f 21 22 24 23
 o DICT_APRILTAG_16h5#13_Marker
 v -12.126360 14.872046 -43.804939
 v -8.926359 14.872046 -43.804939
 v -12.126360 17.723267 -45.257706
 v -8.926359 17.723267 -45.257706
 s off
-f 9 10 12 11
+f 25 26 28 27
 o DICT_APRILTAG_16h5#12_Marker
 v -43.079227 14.890414 -43.814297
 v -39.879230 14.890414 -43.814297
 v -43.079227 17.741634 -45.267063
 v -39.879230 17.741634 -45.267063
 s off
-f 13 14 16 15
-
+f 29 30 32 31
 ```
 
-## aoi/Cockpit.obj
+## Cockpit.obj
 
 ```obj
 # Blender v3.0.1 OBJ File: 'scene.blend'
@@ -215,13 +249,34 @@
 v -43.208000 14.779404 -43.757732
 v -26.000000 14.779404 -43.757732
 s off
 f 3 4 2 1
+o ECAM_Engine_Fuel_Flaps
+v 8.657453 16.194618 -44.196308
+v 27.672760 16.055838 -44.125595
+v 8.657453 31.527327 -52.008713
+v 27.672760 31.441055 -51.964756
+s off
+f 5 6 8 7
+o AP_ATHR_Plan.033
+v 16.653587 46.982643 -32.403645
+v 21.580402 46.974689 -32.399593
+v 16.653587 52.562916 -35.246937
+v 21.580402 52.554958 -35.242882
+s off
+f 9 10 12 11
+o Exterior_Left
+v -69.756531 46.523575 -40.193161
+v 18.876167 46.523575 -55.821495
+v -69.756531 87.247131 -40.193161
+v 18.876167 87.247131 -55.821495
+s off
+f 13 14 16 15
 ```
 
-## aoi/PIC_PFD.png
+## PIC_PFD.png
 
 ![PFD frame background](../../img/haiku_PIC_PFD_background.png)
 
-## aoi/PIC_PFD.svg
+## PIC_PFD.svg
 
 ```svg
 <svg>
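
As a side note on the configuration files changed above: both the context and pipeline JSON files wrap their entries under a fully qualified class path (e.g. `argaze.utils.contexts.TobiiProGlasses2.LiveStream`), with *name*, *pipeline* and *observers* entries nested inside. The sketch below (plain `json` parsing, not the ArGaze loader; the literal values are illustrative) shows how that nesting resolves:

```python
import json

# Illustrative sketch, not the ArGaze API: inspect an ArContext-style
# configuration to see which class it instantiates and which pipeline
# file it references. The layout (one top-level class path wrapping
# "name" and "pipeline" entries) follows live_streaming_context.json
# as described in the diff above; the values here are examples only.
raw = """
{
    "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
        "name": "Tobii Pro Glasses 2 live stream",
        "pipeline": "live_processing_pipeline.json"
    }
}
"""

config = json.loads(raw)

# A context file holds exactly one top-level entry: the class to load.
(class_path, entries), = config.items()

print(class_path.rsplit(".", 1)[-1])  # → LiveStream
print(entries["pipeline"])            # → live_processing_pipeline.json
```

This is also why `python -m argaze load live_streaming_context.json` is the only command needed: the context file names the pipeline file, which in turn names the OBJ, PNG and SVG resources.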