Real-time head-mounted eye tracking interactions
================================================

**ArGaze** enabled a cognitive assistant to support a pilot's situation awareness.

The following use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map the pilot's gaze onto numerous cockpit instruments in real time, then performs AOI fixation matching with the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).

## Background

The [HAIKU project](https://haikuproject.eu/) aims to pave the way for human-centric intelligent assistants in the aviation domain by supporting, among other things, the pilot during startle or surprise events. One of the features the assistant provides through **ArGaze** is situation awareness support, which ensures that the pilot updates his awareness of the aircraft state by monitoring his gaze and the flight parameters. When this support is active, relevant information is highlighted on the Primary Flight Display (PFD) and the Electronic Centralized Aircraft Monitor (ECAM).

![SA alert](../../img/haiku_sa_alert.png)

## Environment

Due to the complexity of the cockpit simulator's geometry, the pilot's eyes were tracked with a head-mounted eye tracker (Tobii Pro Glasses 2). Gaze data and scene camera video were captured through the Tobii SDK and processed in real time on an NVIDIA Jetson Xavier computer. ArUco markers were placed at various locations within the cockpit simulator so that several of them remained visible in the field of view of the eye tracker camera at all times.

![SimOne cockpit](../../img/simone_cockpit.png)

The [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) enabled real-time gaze mapping onto the multiple screens and panels around the pilot-in-command position, while the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md) identified fixations and matched them with the dynamic AOIs related to each instrument. To define the relevant AOIs, a 3D model of the cockpit describing the AOIs and the positions of the markers was created.

![ArUco markers and AOI scene](../../img/haiku_aoi.png)

Finally, fixation events were sent in real time through the [Ivy bus middleware](https://gitlab.com/ivybus/ivy-python/) to the situation awareness software in charge of displaying attention getters on the PFD screen.

## Setup

The setup integrating **ArGaze** into the experiment is defined by three main files:

* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
* The observers file that sends fixation events over the Ivy bus middleware: [observers.py](observers.md) (a hedged sketch of this mechanism is given after the command below)

As with any **ArGaze** setup, it is loaded by executing the following command:

```shell
python -m argaze load live_streaming_context.json
```
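As an illustration of how such an observer can publish events, here is a minimal sketch using the `ivy.std_api` module of [ivy-python](https://gitlab.com/ivybus/ivy-python/). The class name, callback name, bus address, and message format below are assumptions made for this sketch, not the actual observer interface; refer to [observers.py](observers.md) for the real implementation.

```python
from ivy.std_api import IvyInit, IvyStart, IvySendMsg

class FixationSender:
    """Hypothetical observer sketch: broadcasts AOI fixation events
    on the Ivy bus (names and message format are assumptions)."""

    def __init__(self, bus: str = '127.255.255.255:2010'):
        # Join the Ivy bus; the agent name and ready message are arbitrary.
        IvyInit('ArGazeFixationSender', 'ArGazeFixationSender READY')
        IvyStart(bus)

    def on_fixation_matched(self, timestamp: int, aoi: str, duration: float):
        # Send one plain-text message per fixation matched with an AOI.
        IvySendMsg(f'FixationMatched time={timestamp} aoi={aoi} duration={duration}')
```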
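On the receiving side, the situation awareness software can subscribe to such messages with a regular expression binding; the captured groups are passed to the callback as strings. Again, this is only a sketch under the same message format assumption:

```python
from ivy.std_api import IvyInit, IvyStart, IvyBindMsg, IvyMainLoop

def on_fixation_matched(agent, time, aoi, duration):
    # Called for each message matching the binding below; 'agent'
    # identifies the sender, captured groups arrive as strings.
    print(f'{aoi} fixated for {duration} ms at {time}')

IvyInit('SAMonitor', 'SAMonitor READY')
IvyStart('127.255.255.255:2010')
IvyBindMsg(on_fixation_matched, r'FixationMatched time=(\S+) aoi=(\S+) duration=(\S+)')
IvyMainLoop()
```

## Performance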