Post-processing head-mounted eye tracking records
=================================================

**ArGaze** enabled a study of air traffic controller gaze strategy.

The following use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map air traffic controllers' gaze onto a multi-screen environment in post-processing, then enables scan path study using the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).

## Background

The next-gen air traffic control system (4Flight) aims to enhance the operational capacity of the en-route control center by offering new tools to air traffic controllers. However, it entails significant changes in their working methods, which will consequently have an impact on how they are trained.
Several research projects on the visual patterns of air traffic controllers indicate an urgent need to improve the effectiveness of training in visual information-seeking behavior.
An exploratory study was initiated by a group of trainee air traffic controllers to analyze the visual patterns of novice controllers and instructors, with the goal of proposing visual-pattern guidelines for training.

## Environment

The 4Flight control position consists of two screens. The first displays the radar image along with other information about the observed sector. The second displays the agenda, which allows the controller to link conflicting aircraft by creating data blocks, and the Dyp info, which displays some information about the flight.
During their training, controllers are taught to visually follow all aircraft streams along a given route, focusing on their planned flight paths and potential interactions with other aircraft.

![4Flight Workspace](../../img/4flight_workspace.png)

Air traffic controllers performed a traffic simulation of moderate difficulty, with maxima of 13 and 16 simultaneous aircraft. The controller could encounter lateral conflicts (aircraft at the same altitude) involving two or three aircraft, as well as conflicts between aircraft needing to climb or descend within the sector. After the simulation, a directed interview about the gaze pattern was conducted. Eye tracking data were recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker. The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with the **ArGaze** software library. Because the eye tracker is head-mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker's scene camera.

![4Flight AOIs](../../img/4flight_aoi.png)
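Conceptually, once the detected ArUco markers fix each screen plane relative to the scene camera, a gaze point in scene-camera pixels can be reprojected into screen coordinates through a planar homography. The ArUco marker pipeline handles this estimation and mapping internally; the sketch below only illustrates the projection step itself, using a made-up translation-only matrix:

```python
def project_gaze(h, point):
    """Map a gaze point (x, y) from scene-camera pixels onto
    screen coordinates through a 3x3 planar homography h."""
    x, y = point
    # Homogeneous coordinates: [x', y', w'] = H . [x, y, 1]
    xp = h[0][0] * x + h[0][1] * y + h[0][2]
    yp = h[1][0] * x + h[1][1] * y + h[1][2]
    wp = h[2][0] * x + h[2][1] * y + h[2][2]
    return (xp / wp, yp / wp)

# Made-up homography (pure translation by -100, -50) for illustration.
H = [[1, 0, -100],
     [0, 1, -50],
     [0, 0, 1]]

print(project_gaze(H, (640, 360)))  # gaze near the scene-camera center
```

A real homography also encodes the rotation, scale, and perspective of the screen plane; here only the arithmetic of the mapping is shown.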

## Setup

The setup integrating **ArGaze** into the experiment is defined by three main files, detailed in the next chapters:

* The context file that plays back gaze data and scene camera video records: [data_playback_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [post_processing_pipeline.json](pipeline.md)
* The observers file that exports analysis outputs: [observers.py](observers.md)
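As a rough sketch of what an analysis-export observer could look like, the class below accumulates per-fixation records and writes them to a CSV file. The class name, callback signature, and fields are hypothetical illustrations, not ArGaze's actual observer API:

```python
import csv

class ScanPathCSVExporter:
    """Hypothetical observer-style exporter: collects scan path
    records and dumps them to a CSV file once playback ends."""

    def __init__(self, path):
        self.path = path
        self.records = []

    def on_analysis(self, timestamp, aoi, duration):
        # Called once per fixation; this signature is illustrative only.
        self.records.append((timestamp, aoi, duration))

    def export(self):
        with open(self.path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["timestamp", "aoi", "duration_ms"])
            writer.writerows(self.records)

# Usage sketch with fake data:
exporter = ScanPathCSVExporter("scan_path.csv")
exporter.on_analysis(0, "radar_screen", 320)
exporter.on_analysis(320, "agenda", 180)
exporter.export()
```

The actual callbacks and exported metrics for this study are defined in [observers.py](observers.md).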

As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):

```shell
python -m argaze load data_playback_context.json
```

This command opens one GUI window per frame (one for the scene camera, one for the sector screen, and one for the info screen), allowing gaze mapping to be monitored while processing.

![ArGaze load GUI for PFE study](../../img/argaze_load_gui_pfe.png)