author    Théo de la Hogue    2024-07-02 07:21:25 +0200
committer Théo de la Hogue    2024-07-02 07:21:25 +0200
commit    b7f28b2d12c65d097607f5941a5d081d94bd83cb (patch)
tree      6b92f05047300a54df59edbad8dd41fb9c1d25de /docs/use_cases/pilot_gaze_monitoring/introduction.md
parent    3fc1abf3ed699c71faf7dec66c53a3a020ec8e16 (diff)
Working on pilot gaze monitoring use case.
Diffstat (limited to 'docs/use_cases/pilot_gaze_monitoring/introduction.md')
-rw-r--r-- docs/use_cases/pilot_gaze_monitoring/introduction.md | 29
1 file changed, 16 insertions(+), 13 deletions(-)
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
index 0faf4b1..453a443 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_monitoring/introduction.md
@@ -1,5 +1,5 @@
-Overview
-========
+Real time head-mounted eye tracking interactions
+================================================
**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.
@@ -9,15 +9,15 @@ The following use case has integrated the [ArUco marker pipeline](../../user_gui
The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among others, the pilot during startle or surprise events.
One of the features provided by the assistant through **ArGaze** is a situation awareness support that ensures the pilot updates his awareness of the aircraft state by monitoring his gaze and flight parameters.
-When this support is active, relevant information is highlighted on the Primary Flight Display (PFD).
+When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).
![SA alert](../../img/haiku_sa_alert.png)
-## Experiment context
+## Environment
-Pilot eye tracking data were provided by Tobii Pro Glasses 2, a head-mounted eye tracker.
-The gaze and scene camera video were captured through the Tobii SDK and processed in real-time on NVIDIA Jetson Xavier.
-Since the eye tracker model is head-mounted, ArUco markers were placed at various locations within a A320 cockpit simulator to ensure that several of them were constantly visible in the field of view of the eye tracker camera.
+Due to the complexity of the cockpit simulator's geometry, the pilot's eyes are tracked with a head-mounted eye tracker (Tobii Pro Glasses 2).
+The gaze and scene camera video were captured through the Tobii SDK and processed in real-time on an NVIDIA Jetson Xavier computer.
+ArUco markers were placed at various locations within the cockpit simulator to ensure that several of them were constantly visible in the field of view of the eye tracker camera.
![SimOne cockpit](../../img/simone_cockpit.png)
@@ -28,16 +28,19 @@ To identify the relevant AOIs, a 3D model of the cockpit describing the AOI and
Finally, fixation events were sent in real-time through [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters onto the PFD screen.
-## ArGaze setup
+## Setup
-The project defines 3 main files to integrate **ArGaze** to the experiment:
+The setup to integrate **ArGaze** into the experiment is defined by 3 main files:
-* The context that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
-* The pipeline that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
-* The observers that send fixation events to Ivy bus middleware: [observers.py](observers.md)
+* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
+* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
+* The observers file that sends fixation events via the Ivy bus middleware: [observers.py](observers.md)
-The project is loaded by executing the following command:
+As with any **ArGaze** setup, it is loaded by executing the following command:
```shell
python -m argaze load live_streaming_context.json
```
+
+## Performance
+
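As a hedged illustration of the observer step described above: Ivy bus messages are plain text strings, so an observer typically serializes each fixation event into a message before sending it. The actual message format lives in the project's observers.py (not shown in this introduction), so the field names and layout below are assumptions, and the sending call itself is only mentioned in a comment:

```python
# Hypothetical sketch only: the real message format is defined in observers.py.
# Field names (timestamp, aoi, duration) are illustrative assumptions.

def format_fixation_message(timestamp_ms: int, aoi: str, duration_ms: float) -> str:
    """Build a plain-text Ivy bus message describing a fixation on an AOI."""
    return f"FixationEvent timestamp={timestamp_ms} aoi={aoi} duration={duration_ms:.0f}"

# With the ivy-python package, such a string would then be sent with
# IvySendMsg(msg) after the agent joined the bus via IvyInit/IvyStart;
# here we only build the message string.
msg = format_fixation_message(1718000000, "PFD", 250.0)
```

This keeps the serialization logic testable independently of the bus connection.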