Diffstat (limited to 'docs/use_cases/air_controller_gaze_study/introduction.md')
-rw-r--r-- docs/use_cases/air_controller_gaze_study/introduction.md | 17 ++++++++++++-----
1 file changed, 12 insertions(+), 5 deletions(-)
diff --git a/docs/use_cases/air_controller_gaze_study/introduction.md b/docs/use_cases/air_controller_gaze_study/introduction.md
index 313e492..f188eec 100644
--- a/docs/use_cases/air_controller_gaze_study/introduction.md
+++ b/docs/use_cases/air_controller_gaze_study/introduction.md
@@ -3,7 +3,7 @@ Post-processing head-mounted eye tracking records
**ArGaze** enabled a study of air traffic controller gaze strategy.
-The following use case has integrated the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map air traffic controllers gaze onto multiple screens environment in post-processing then, enable scan path study thanks to the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
+The following use case has integrated the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map air traffic controllers' gaze onto a multiple-screen environment in post-processing, then enable scan path study using the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
## Background
@@ -18,22 +18,29 @@ During their training, controllers are taught to visually follow all aircraft st
![4Flight Workspace](../../img/4flight_workspace.png)
-A traffic simulation of moderate difficulty with a maximum of 13 and 16 aircraft simultaneously was performed by air traffic controllers. The controller could encounter lateral conflicts (same altitude) between 2 and 3 aircraft and conflicts between aircraft that need to ascend or descend within the sector. After the simulation, a directed interview about the gaze pattern was conducted. Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker. The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with **ArGaze** software library. As the eye tracker model is head mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.
+A traffic simulation of moderate difficulty, with a maximum of 13 and 16 aircraft present simultaneously, was performed by air traffic controllers. The controllers could encounter lateral conflicts (same altitude) between 2 and 3 aircraft, as well as conflicts between aircraft that needed to ascend or descend within the sector.
+After the simulation, a directed interview about the gaze patterns was conducted.
+Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker.
+The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with the **ArGaze** software library.
+As the eye tracker is head-mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.
-![4Flight Workspace](../../img/4flight_aoi.png)
+Various metrics were exported with specific pipeline observers, including average fixation duration, explore/exploit ratio, K-coefficient, AOI distribution, transition matrix, entropy and N-grams.
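Among these metrics, the K-coefficient deserves a short aside: it contrasts each fixation duration with the amplitude of the saccade assumed to follow it, both standardised, so that positive values suggest focal viewing and negative values suggest ambient viewing (Krejtz et al.). The sketch below only illustrates that common definition; it is not the study's observer code, and the sample values are invented.

```python
# Illustrative sketch only, not the study's observer code.
# K contrasts each standardised fixation duration with the standardised
# amplitude of the saccade assumed to follow it (Krejtz et al.):
# K > 0 suggests focal viewing, K < 0 suggests ambient viewing.
from statistics import mean, stdev

def k_coefficient(fixation_durations, next_saccade_amplitudes):
    d_mean, d_std = mean(fixation_durations), stdev(fixation_durations)
    a_mean, a_std = mean(next_saccade_amplitudes), stdev(next_saccade_amplitudes)
    return mean(
        (d - d_mean) / d_std - (a - a_mean) / a_std
        for d, a in zip(fixation_durations, next_saccade_amplitudes)
    )

# Invented sample values (durations in ms, saccade amplitudes in degrees)
print(k_coefficient([180, 220, 450, 390], [6.1, 4.8, 1.2, 0.9]))
```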
+Although statistical analysis is not possible due to the small sample size of the study (6 instructors, 5 qualified controllers, and 5 trainees), visual pattern summaries were manually built from the transition matrix exports to produce a qualitative interpretation of what instructors attend to during training and how qualified controllers work. In the figure below, red arcs represent more frequent transitions than blue ones; panel (a) shows instructors and panels (b) to (e) show four different qualified controllers.
+
+![4Flight Visual pattern](../../img/4flight_visual_pattern.png)
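For readers who want to build similar summaries, the sketch below shows one possible way to turn an exported AOI transition matrix into a ranked list of its most frequent transitions (the candidates for red arcs). The file name and CSV layout are assumptions about the observer export, not the study's actual format.

```python
# Illustrative sketch only: rank AOI transitions from an exported matrix.
# "transition_matrix.csv" and its layout (source AOIs as row index,
# destination AOIs as columns, counts as values) are assumptions about
# the observer export, not the study's actual format.
import pandas as pd

matrix = pd.read_csv("transition_matrix.csv", index_col=0)

# Flatten into (from_aoi, to_aoi, count) rows and drop self-transitions
transitions = (
    matrix.stack()
          .rename_axis(["from_aoi", "to_aoi"])
          .reset_index(name="count")
)
transitions = transitions[transitions["from_aoi"] != transitions["to_aoi"]]

# The most frequent transitions are the candidates for the red arcs
print(transitions.sort_values("count", ascending=False).head(10))
```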
## Setup
The setup to integrate **ArGaze** into the experiment is defined by 3 main files, detailed in the next chapters:
-* The context file that reads gaze data and scene camera video records: [post_processing_context.json](context.md)
+* The context file that plays back gaze data and scene camera video records: [data_playback_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [post_processing_pipeline.json](pipeline.md)
* The observers file that exports analysis outputs: [observers.py](observers.md)
As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):
```shell
-python -m argaze load post_processing_context.json
+python -m argaze load segment_playback_context.json
```
This command opens one GUI window per frame (one for the scene camera, one for the sector screen and one for the info screen), allowing gaze mapping to be monitored during processing.