Real-time head-mounted eye tracking interactions
================================================

**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.

This use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map the pilot's gaze onto multiple cockpit instruments in real time, then matches fixations with AOIs using the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).

## Background

The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by, among other goals, supporting the pilot during startle or surprise events.
One of the features the assistant provides through **ArGaze** is situation awareness support: it monitors the pilot's gaze and flight parameters to ensure the pilot keeps an up-to-date awareness of the aircraft state.
When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).

![SA alert](../../img/haiku_sa_alert.png)

## Environment

Due to the complexity of the cockpit simulator's geometry, the pilot's eyes are tracked with a head-mounted eye tracker (Tobii Pro Glasses 2).
Gaze data and the scene camera video are captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier computer.
ArUco markers were placed at various locations within the cockpit simulator so that several of them are always visible in the field of view of the eye tracker camera.

![SimOne cockpit](../../img/simone_cockpit.png)

The [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) enables real-time gaze mapping onto the multiple screens and panels around the pilot-in-command position, while the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md) identifies fixations and matches them with dynamic AOIs related to each instrument.
To identify the relevant AOIs, a 3D model of the cockpit describing the AOIs and the positions of the markers was built.

![ArUco markers and AOI scene](../../img/haiku_aoi.png)
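
Conceptually, AOI fixation matching boils down to testing whether a fixation centroid, once projected into an instrument's plane, falls inside a named 2D region. The sketch below is a minimal, generic illustration of that idea; it is not the ArGaze API, and the AOI names and coordinates are hypothetical:

```python
# Generic illustration of AOI fixation matching (not the ArGaze API):
# a fixation centroid, already projected into a screen/panel plane,
# is tested against named 2D AOI polygons.

def point_in_polygon(point, polygon):
    """Ray-casting test: is a 2D point inside a polygon given as (x, y) vertices?"""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        if (yi > y) != (yj > y):
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside

def match_fixation(fixation_centroid, aois):
    """Return the name of the first AOI containing the fixation centroid, or None."""
    for name, polygon in aois.items():
        if point_in_polygon(fixation_centroid, polygon):
            return name
    return None

# Hypothetical AOI layout expressed in projected pixel coordinates.
cockpit_aois = {
    "PFD": [(0, 0), (400, 0), (400, 300), (0, 300)],
    "ECAM": [(450, 0), (850, 0), (850, 300), (450, 300)],
}

print(match_fixation((120, 150), cockpit_aois))  # -> "PFD"
```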

Finally, fixation events were sent in real time through the [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters on the PFD screen.
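
As a rough sketch of this forwarding step, the snippet below broadcasts a fixation over the Ivy bus with ivy-python's standard API; the agent name, bus address, and message format are illustrative assumptions, not the ones used in the actual observers file:

```python
from ivy.std_api import IvyInit, IvyStart, IvySendMsg

# Register an agent on the Ivy bus (agent name and ready message are illustrative).
IvyInit("ArGazeFixationForwarder", "ArGazeFixationForwarder ready")

# Join an Ivy UDP broadcast bus (address and port are illustrative).
IvyStart("127.255.255.255:2010")

def send_fixation(aoi_name: str, duration_ms: float):
    """Broadcast a matched fixation so the SA software can highlight the related AOI."""
    # Hypothetical message format: the real one is defined in observers.py.
    IvySendMsg(f"Fixation aoi={aoi_name} duration={duration_ms:.0f}")

send_fixation("PFD", 320)
```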

## Setup

The setup to integrate **ArGaze** into the experiment is defined by three main files, detailed in the next chapters:

* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
* The observers file that sends fixation events via Ivy bus middleware: [observers.py](observers.md)

As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):

```shell
python -m argaze load live_streaming_context.json
```

This command opens a GUI window from which the user can start gaze calibration, launch recording, and monitor gaze mapping.

![ArGaze load GUI for Haiku](../../img/argaze_load_gui_haiku.png)