Diffstat (limited to 'docs/use_cases/air_controller_gaze_study')
 -rw-r--r--  docs/use_cases/air_controller_gaze_study/context.md       | 14
 -rw-r--r--  docs/use_cases/air_controller_gaze_study/introduction.md  | 17
 -rw-r--r--  docs/use_cases/air_controller_gaze_study/observers.md     |  4
 -rw-r--r--  docs/use_cases/air_controller_gaze_study/pipeline.md      | 22
 4 files changed, 31 insertions(+), 26 deletions(-)
diff --git a/docs/use_cases/air_controller_gaze_study/context.md b/docs/use_cases/air_controller_gaze_study/context.md
index ca9adf7..8bb4ef8 100644
--- a/docs/use_cases/air_controller_gaze_study/context.md
+++ b/docs/use_cases/air_controller_gaze_study/context.md
@@ -1,22 +1,22 @@
-Live streaming context
+Data playback context
======================
The context handles incoming eye tracker data before passing it to a processing pipeline.
-## post_processing_context.json
+## data_playback_context.json
-For this use case we need to read Tobii Pro Glasses 2 records: **ArGaze** provides a [ready-made context](../../user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md) class to read data from records made by this device.
+For this use case we need to read Tobii Pro Glasses 2 records: **ArGaze** provides a [ready-made context](../../user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md) class to play back data from records made by this device.
-While *segment* entries are specific to the [TobiiProGlasses2.PostProcessing](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.PostProcessing) class, *name* and *pipeline* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.
+While the *segment* entry is specific to the [TobiiProGlasses2.SegmentPlayback](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.SegmentPlayback) class, the *name* and *pipeline* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.
```json
{
- "argaze.utils.contexts.TobiiProGlasses2.PostProcessing": {
- "name": "Tobii Pro Glasses 2 post-processing",
+ "argaze.utils.contexts.TobiiProGlasses2.SegmentPlayback": {
+ "name": "Tobii Pro Glasses 2 segment playback",
"segment": "/Volumes/projects/fbr6k3e/records/4rcbdzk/segments/1",
"pipeline": "post_processing_pipeline.json"
}
}
```
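For readers who prefer to build the context programmatically, the JSON entries above can be mirrored in Python. The sketch below is a minimal illustration assuming the JSON entries map one-to-one onto constructor keyword arguments; the actual argaze API may differ.

```python
# Hypothetical Python equivalent of data_playback_context.json; assumes the
# JSON entries map directly onto constructor keyword arguments, which may
# differ from the actual argaze API.
from argaze.utils.contexts import TobiiProGlasses2

context = TobiiProGlasses2.SegmentPlayback(
    name="Tobii Pro Glasses 2 segment playback",
    segment="/Volumes/projects/fbr6k3e/records/4rcbdzk/segments/1",
    pipeline="post_processing_pipeline.json"
)
```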
-The [post_processing_pipeline.json](pipeline.md) file mentioned aboved is described in the next chapter.
+The [post_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
diff --git a/docs/use_cases/air_controller_gaze_study/introduction.md b/docs/use_cases/air_controller_gaze_study/introduction.md
index 313e492..f188eec 100644
--- a/docs/use_cases/air_controller_gaze_study/introduction.md
+++ b/docs/use_cases/air_controller_gaze_study/introduction.md
@@ -3,7 +3,7 @@ Post-processing head-mounted eye tracking records
**ArGaze** enabled a study of air traffic controller gaze strategy.
-The following use case has integrated the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map air traffic controllers gaze onto multiple screens environment in post-processing then, enable scan path study thanks to the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
+The following use case integrated the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map air traffic controllers' gaze onto a multiple-screen environment in post-processing, then enabled scan path study using the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).
## Background
@@ -18,22 +18,29 @@ During their training, controllers are taught to visually follow all aircraft st
![4Flight Workspace](../../img/4flight_workspace.png)
-A traffic simulation of moderate difficulty with a maximum of 13 and 16 aircraft simultaneously was performed by air traffic controllers. The controller could encounter lateral conflicts (same altitude) between 2 and 3 aircraft and conflicts between aircraft that need to ascend or descend within the sector. After the simulation, a directed interview about the gaze pattern was conducted. Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker. The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with **ArGaze** software library. As the eye tracker model is head mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.
+A traffic simulation of moderate difficulty, with maxima of 13 and 16 simultaneous aircraft, was performed by air traffic controllers. The controllers could encounter lateral conflicts (same altitude) between 2 and 3 aircraft, and conflicts between aircraft that needed to ascend or descend within the sector.
+After the simulation, a directed interview about the gaze pattern was conducted.
+Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker.
+The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with the **ArGaze** software library.
+As the eye tracker is head-mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.
-![4Flight Workspace](../../img/4flight_aoi.png)
+Various metrics were exported with specific pipeline observers, including average fixation duration, explore/exploit ratio, K-coefficient, AOI distribution, transition matrix, entropy and N-grams.
+Although statistical analysis is not possible due to the small sample size of the study (6 instructors, 5 qualified controllers, and 5 trainees), visual pattern summaries were manually built from the transition matrix exports to produce a qualitative interpretation showing what instructors attend to during training and how qualified controllers work. Red arcs denote more frequent transitions than blue ones. The figure below shows instructors (Fig. a) and four different qualified controllers (Fig. b, c, d, e).
+
+![4Flight Visual pattern](../../img/4flight_visual_pattern.png)
## Setup
The setup to integrate **ArGaze** into the experiment is defined by 3 main files detailed in the next chapters:
-* The context file that reads gaze data and scene camera video records: [post_processing_context.json](context.md)
+* The context file that plays back gaze data and scene camera video records: [data_playback_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [post_processing_pipeline.json](pipeline.md)
* The observers file that exports analysis outputs: [observers.py](observers.md)
As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):
```shell
-python -m argaze load post_processing_context.json
+python -m argaze load data_playback_context.json
```
This command opens one GUI window per frame (one for the scene camera, one for the sector screen and one for the info screen), allowing you to monitor gaze mapping while processing.
diff --git a/docs/use_cases/air_controller_gaze_study/observers.md b/docs/use_cases/air_controller_gaze_study/observers.md
index aad870f..500d573 100644
--- a/docs/use_cases/air_controller_gaze_study/observers.md
+++ b/docs/use_cases/air_controller_gaze_study/observers.md
@@ -1,5 +1,5 @@
-Fixation events sending
-=======================
+Metrics and video recording
+===========================
Observers are attached to pipeline steps to be notified when a method is called.
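As an illustration of this notification mechanism, the sketch below outlines a minimal observer. The `on_look` callback name and its signature are assumptions made for the example, not a guaranteed argaze interface; the actual recorders used in this study are shown in the observers.py excerpts.

```python
# Minimal observer sketch: an object attached to a pipeline step whose
# callback fires each time the step's method is called. The callback name
# and signature are illustrative assumptions, not the exact argaze API.
class CallCounter:

    def __init__(self):
        self.count = 0

    def on_look(self, timestamp, frame, exception):
        # Hypothetically called after each ArUcoCamera.look execution.
        self.count += 1
```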
diff --git a/docs/use_cases/air_controller_gaze_study/pipeline.md b/docs/use_cases/air_controller_gaze_study/pipeline.md
index ec1aa59..69fdd2c 100644
--- a/docs/use_cases/air_controller_gaze_study/pipeline.md
+++ b/docs/use_cases/air_controller_gaze_study/pipeline.md
@@ -1,4 +1,4 @@
-Live processing pipeline
+Post processing pipeline
========================
The pipeline processes camera images and gaze data to enable gaze mapping and gaze analysis.
@@ -19,7 +19,7 @@ For this use case we need to detect ArUco markers to enable gaze mapping: **ArGa
"optic_parameters": "optic_parameters.json",
"parameters": {
"adaptiveThreshConstant": 20,
- "useAruco3Detection": 1
+ "useAruco3Detection": true
}
},
"gaze_movement_identifier": {
@@ -182,24 +182,22 @@ For this use case we need to detect ArUco markers to enable gaze mapping: **ArGa
}
}
}
- },
- "angle_tolerance": 15.0,
- "distance_tolerance": 2.54
+ }
}
},
"observers": {
"argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
- "path": "_export/look_performance.csv"
+ "path": "look_performance.csv"
},
"argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
- "path": "_export/watch_performance.csv"
+ "path": "watch_performance.csv"
}
}
}
}
```
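The *parameters* entry shown in the first excerpt maps onto OpenCV's ArUco detector parameters. As a point of comparison, here is a sketch of the equivalent raw OpenCV (≥ 4.7) setup; the marker dictionary is an illustrative assumption, not taken from this configuration.

```python
# Equivalent raw OpenCV detector configuration for the "parameters" entry
# above; the DICT_4X4_50 dictionary is an illustrative assumption.
import cv2.aruco as aruco

parameters = aruco.DetectorParameters()
parameters.adaptiveThreshConstant = 20
parameters.useAruco3Detection = True

detector = aruco.ArucoDetector(aruco.getPredefinedDictionary(aruco.DICT_4X4_50), parameters)
```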
-All the files mentioned aboved are described below.
+All the files mentioned above are described below.
The *ScanPathAnalysisRecorder* and *AOIScanPathAnalysisRecorder* observer objects are defined in the [observers.py](observers.md) file, which is described in the next chapter.
@@ -357,12 +355,12 @@ The video file is a record of the sector screen frame image.
## look_performance.csv
-This file contains the logs of *ArUcoCamera.look* method execution info. It is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+This file contains the logs of the *ArUcoCamera.look* method execution. It is created in the folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a MacBookPro (2,3GHz Intel Core i9 8 cores), the *look* method execution time is ~7ms and it is called ~115 times per second.
+On a MacBook Pro (2.3 GHz Intel Core i9, 8 cores), the *look* method execution time is ~1ms and it is called ~51 times per second.
## watch_performance.csv
-This file contains the logs of *ArUcoCamera.watch* method execution info. It file is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+This file contains the logs of the *ArUcoCamera.watch* method execution. It is created in the folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a MacBookPro (2,3GHz Intel Core i9 8 cores), the *watch* method execution time is ~60ms and it is called ~10 times per second.
+On a MacBook Pro (2.3 GHz Intel Core i9, 8 cores) without CUDA acceleration, the *watch* method execution time is ~52ms and it is called more than 12 times per second.
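To turn these logs into the averages quoted above, a short script can summarize the CSV. The column name used below is an assumption about the header written by the recorders; adjust it to match the actual file.

```python
# Summarize a performance log; the "execution time (ms)" column name is an
# assumption about the CSV header, adjust it to match the recorded file.
import csv
import statistics

with open("look_performance.csv", newline="") as log:
    times = [float(row["execution time (ms)"]) for row in csv.DictReader(log)]

print(f"mean execution time: {statistics.mean(times):.2f} ms over {len(times)} calls")
```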