author | Théo de la Hogue | 2024-07-08 12:05:42 +0200 |
---|---|---|
committer | Théo de la Hogue | 2024-07-08 12:05:42 +0200 |
commit | fecf8b7a54d729d528ba1ab599a9c4b48fdde953 (patch) | |
tree | 7d6539ce97310300f84772e6045f789ccf144a32 /docs/use_cases | |
parent | f6653bcd024caad8fc6070a6b0a306e0625bd8b2 (diff) | |
Adding new use case.
Diffstat (limited to 'docs/use_cases')
5 files changed, 522 insertions, 1 deletion
diff --git a/docs/use_cases/air_controller_gaze_study/context.md b/docs/use_cases/air_controller_gaze_study/context.md
new file mode 100644
index 0000000..ca9adf7
--- /dev/null
+++ b/docs/use_cases/air_controller_gaze_study/context.md
@@ -0,0 +1,22 @@

Post-processing context
=======================

The context handles the eye tracker data before passing it to a processing pipeline.

## post_processing_context.json

For this use case we need to read Tobii Pro Glasses 2 records: **ArGaze** provides a [ready-made context](../../user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md) class to read data from records made by this device.

While the *segment* entry is specific to the [TobiiProGlasses2.PostProcessing](../../argaze.md/#argaze.utils.contexts.TobiiProGlasses2.PostProcessing) class, the *name* and *pipeline* entries are part of the parent [ArContext](../../argaze.md/#argaze.ArFeatures.ArContext) class.

```json
{
    "argaze.utils.contexts.TobiiProGlasses2.PostProcessing": {
        "name": "Tobii Pro Glasses 2 post-processing",
        "segment": "/Volumes/projects/fbr6k3e/records/4rcbdzk/segments/1",
        "pipeline": "post_processing_pipeline.json"
    }
}
```

The [post_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
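Before launching a long post-processing run, it can be useful to check that the *segment* record path and the *pipeline* file referenced above are actually reachable. The following minimal sketch, using only the Python standard library, assumes it is run from the folder containing post_processing_context.json; it is not part of the original use case files.

```python
import json
from pathlib import Path

# Read the context configuration and pick the PostProcessing entry.
with open("post_processing_context.json") as file:
    context = json.load(file)["argaze.utils.contexts.TobiiProGlasses2.PostProcessing"]

# Check that the Tobii Pro Glasses 2 segment folder is mounted and reachable.
segment = Path(context["segment"])
print(f"segment reachable: {segment.exists()} ({segment})")

# Check that the pipeline file referenced by the context sits next to it.
pipeline = Path(context["pipeline"])
print(f"pipeline file found: {pipeline.exists()} ({pipeline})")
```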
diff --git a/docs/use_cases/air_controller_gaze_study/introduction.md b/docs/use_cases/air_controller_gaze_study/introduction.md
new file mode 100644
index 0000000..313e492
--- /dev/null
+++ b/docs/use_cases/air_controller_gaze_study/introduction.md
@@ -0,0 +1,41 @@

Post-processing head-mounted eye tracking records
=================================================

**ArGaze** enabled a study of air traffic controller gaze strategy.

The following use case integrates the [ArUco marker pipeline](../../user_guide/aruco_marker_pipeline/introduction.md) to map air traffic controllers' gaze onto a multiple-screen environment in post-processing, and then enables scan path study thanks to the [gaze analysis pipeline](../../user_guide/gaze_analysis_pipeline/introduction.md).

## Background

The next-gen air traffic control system (4Flight) aims to enhance the operational capacity of the en-route control center by offering new tools to air traffic controllers. However, it entails significant changes in their working method, which will consequently have an impact on how they are trained.
Several research projects on the visual patterns of air traffic controllers indicate the urgent need to improve the effectiveness of training in visual information seeking behavior.
An exploratory study was initiated by a group of trainee air traffic controllers with the aim of analyzing the visual patterns of novice controllers and instructors, intending to propose guidelines regarding the visual pattern for training.

## Environment

The 4Flight control position consists of two screens: the first displays the radar image along with other information regarding the observed sector; the second displays the agenda, which allows the controller to link conflicting aircraft by creating data blocks, and the Dyp info, which displays some information about the flight.
During their training, controllers are taught to visually follow all aircraft streams along a given route, focusing on their planned flight path and potential interactions with other aircraft.

![4Flight Workspace](../../img/4flight_workspace.png)

A traffic simulation of moderate difficulty, with a maximum of 13 and 16 aircraft simultaneously, was performed by air traffic controllers. The controller could encounter lateral conflicts (same altitude) between 2 and 3 aircraft and conflicts between aircraft that need to ascend or descend within the sector. After the simulation, a directed interview about the gaze pattern was conducted. Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker. The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with the **ArGaze** software library. As the eye tracker model is head-mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.

![4Flight Workspace](../../img/4flight_aoi.png)

## Setup

The setup to integrate **ArGaze** into the experiment is defined by 3 main files detailed in the next chapters:

* The context file that reads gaze data and scene camera video records: [post_processing_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [post_processing_pipeline.json](pipeline.md)
* The observers file that exports analysis outputs: [observers.py](observers.md)

As with any **ArGaze** setup, it is loaded by executing the [*load* command](../../user_guide/utils/main_commands.md):

```shell
python -m argaze load post_processing_context.json
```

This command opens one GUI window per frame (one for the scene camera, one for the sector screen and one for the info screen), which allows monitoring gaze mapping while processing.

![ArGaze load GUI for PFE study](../../img/argaze_load_gui_pfe.png)
diff --git a/docs/use_cases/air_controller_gaze_study/observers.md b/docs/use_cases/air_controller_gaze_study/observers.md
new file mode 100644
index 0000000..aad870f
--- /dev/null
+++ b/docs/use_cases/air_controller_gaze_study/observers.md
@@ -0,0 +1,90 @@

Metrics and video recording
===========================

Observers are attached to pipeline steps to be notified when a method is called.

## observers.py

For this use case we need to record gaze analysis metrics on each *ArUcoCamera.on_look* call and to record the sector screen image on each *ArUcoCamera.on_copy_background_into_scenes_frames* signal.

```python
import logging

from argaze.utils import UtilsFeatures

import cv2

class ScanPathAnalysisRecorder(UtilsFeatures.FileWriter):

    def __init__(self, **kwargs):

        super().__init__(**kwargs)

        # Column names of the sector_screen.csv file
        self.header = "Timestamp (ms)", "Path duration (ms)", "Steps number", "Fixation durations average (ms)", "Explore/Exploit ratio", "K coefficient"

    def on_look(self, timestamp, frame, exception):
        """Log scan path metrics."""

        if frame.is_analysis_available():

            analysis = frame.analysis()

            data = (
                int(timestamp),
                analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].path_duration,
                analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].steps_number,
                analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].step_fixation_durations_average,
                analysis['argaze.GazeAnalysis.ExploreExploitRatio.ScanPathAnalyzer'].explore_exploit_ratio,
                analysis['argaze.GazeAnalysis.KCoefficient.ScanPathAnalyzer'].K
            )

            self.write(data)

class AOIScanPathAnalysisRecorder(UtilsFeatures.FileWriter):

    def __init__(self, **kwargs):

        super().__init__(**kwargs)

        # Column names of the aoi_metrics.csv file
        self.header = "Timestamp (ms)", "Path duration (ms)", "Steps number", "Fixation durations average (ms)", "Transition matrix probabilities", "Transition matrix density", "N-Grams count", "Stationary entropy", "Transition entropy"

    def on_look(self, timestamp, layer, exception):
        """Log AOI scan path metrics."""

        if layer.is_analysis_available():

            analysis = layer.analysis()

            data = (
                int(timestamp),
                analysis['argaze.GazeAnalysis.Basic.AOIScanPathAnalyzer'].path_duration,
                analysis['argaze.GazeAnalysis.Basic.AOIScanPathAnalyzer'].steps_number,
                analysis['argaze.GazeAnalysis.Basic.AOIScanPathAnalyzer'].step_fixation_durations_average,
                analysis['argaze.GazeAnalysis.TransitionMatrix.AOIScanPathAnalyzer'].transition_matrix_probabilities,
                analysis['argaze.GazeAnalysis.TransitionMatrix.AOIScanPathAnalyzer'].transition_matrix_density,
                analysis['argaze.GazeAnalysis.NGram.AOIScanPathAnalyzer'].ngrams_count,
                analysis['argaze.GazeAnalysis.Entropy.AOIScanPathAnalyzer'].stationary_entropy,
                analysis['argaze.GazeAnalysis.Entropy.AOIScanPathAnalyzer'].transition_entropy
            )

            self.write(data)

class VideoRecorder(UtilsFeatures.VideoWriter):

    def __init__(self, **kwargs):

        super().__init__(**kwargs)

    def on_copy_background_into_scenes_frames(self, timestamp, frame, exception):
        """Write frame image."""

        logging.debug('VideoRecorder.on_copy_background_into_scenes_frames')

        image = frame.image()

        # Overlay the video timing on the frame image
        cv2.rectangle(image, (0, 0), (550, 50), (63, 63, 63), -1)
        cv2.putText(image, f'Time: {int(timestamp)} ms', (20, 40), cv2.FONT_HERSHEY_SIMPLEX, 1, (255, 255, 255), 1, cv2.LINE_AA)

        self.write(image)
```
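Both recorder classes declare the column headers of the CSV files they produce, so the resulting metrics can be explored with any data analysis tool once a record has been processed. As an illustration, here is a minimal sketch that reads back the sector screen scan path metrics with pandas; it assumes the *FileWriter* output is comma-separated (adjust the *sep* argument otherwise) and uses the column names declared in *ScanPathAnalysisRecorder* above.

```python
import pandas as pd

# Load the scan path metrics written by ScanPathAnalysisRecorder.
metrics = pd.read_csv("sector_screen.csv")

# Overall averages for the whole record.
print("Mean fixation duration (ms):", metrics["Fixation durations average (ms)"].mean())
print("Mean K coefficient:", metrics["K coefficient"].mean())

# Explore/exploit ratio summary indexed by timestamp.
print(metrics.set_index("Timestamp (ms)")["Explore/Exploit ratio"].describe())
```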
diff --git a/docs/use_cases/air_controller_gaze_study/pipeline.md b/docs/use_cases/air_controller_gaze_study/pipeline.md
new file mode 100644
index 0000000..ec1aa59
--- /dev/null
+++ b/docs/use_cases/air_controller_gaze_study/pipeline.md
@@ -0,0 +1,368 @@

Post-processing pipeline
========================

The pipeline processes the camera image and gaze data to enable gaze mapping and gaze analysis.

## post_processing_pipeline.json

For this use case we need to detect ArUco markers to enable gaze mapping: **ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to set up an [ArUco markers pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).

```json
{
    "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
        "name": "ATC_Study",
        "size": [1920, 1080],
        "sides_mask": 420,
        "copy_background_into_scenes_frames": true,
        "aruco_detector": {
            "dictionary": "DICT_APRILTAG_16h5",
            "optic_parameters": "optic_parameters.json",
            "parameters": {
                "adaptiveThreshConstant": 20,
                "useAruco3Detection": 1
            }
        },
        "gaze_movement_identifier": {
            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                "deviation_max_threshold": 25,
                "duration_min_threshold": 150
            }
        },
        "layers": {
            "Main": {
                "aoi_matcher": {
                    "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
                        "coverage_threshold": 0.5
                    }
                },
                "aoi_scan_path": {
                    "duration_max": 60000
                },
                "aoi_scan_path_analyzers": {
                    "argaze.GazeAnalysis.Basic.AOIScanPathAnalyzer": {},
                    "argaze.GazeAnalysis.TransitionMatrix.AOIScanPathAnalyzer": {},
                    "argaze.GazeAnalysis.NGram.AOIScanPathAnalyzer": {
                        "n_min": 3,
                        "n_max": 5
                    },
                    "argaze.GazeAnalysis.Entropy.AOIScanPathAnalyzer": {}
                },
                "observers": {
                    "observers.AOIScanPathAnalysisRecorder": {
                        "path": "aoi_metrics.csv"
                    }
                }
            }
        },
        "image_parameters": {
            "background_weight": 1,
            "draw_gaze_positions": {
                "color": [0, 255, 255],
                "size": 4
            },
            "draw_detected_markers": {
                "color": [0, 255, 0]
            },
            "draw_layers": {
                "Main": {
                    "draw_aoi_scene": {
                        "draw_aoi": {
                            "color": [255, 255, 255],
                            "border_size": 1
                        }
                    },
                    "draw_aoi_matching": {
                        "update_looked_aoi": true,
                        "draw_looked_aoi": {
                            "color": [0, 255, 0],
                            "border_size": 2
                        },
                        "looked_aoi_name_color": [255, 255, 255],
                        "looked_aoi_name_offset": [0, -10]
                    }
                }
            }
        },
        "scenes": {
            "Workspace": {
                "aruco_markers_group": "workspace_markers.obj",
                "layers": {
                    "Main": {
                        "aoi_scene": "workspace_aois.obj"
                    }
                },
                "frames": {
                    "Sector_Screen": {
                        "size": [1080, 1017],
                        "gaze_movement_identifier": {
                            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                                "deviation_max_threshold": 25,
                                "duration_min_threshold": 150
                            }
                        },
                        "scan_path": {
                            "duration_max": 30000
                        },
                        "scan_path_analyzers": {
                            "argaze.GazeAnalysis.Basic.ScanPathAnalyzer": {},
                            "argaze.GazeAnalysis.ExploreExploitRatio.ScanPathAnalyzer": {
                                "short_fixation_duration_threshold": 0
                            },
                            "argaze.GazeAnalysis.KCoefficient.ScanPathAnalyzer": {}
                        },
                        "layers": {
                            "Main": {
                                "aoi_scene": "sector_screen_aois.svg"
                            }
                        },
                        "heatmap": {
                            "size": [80, 60]
                        },
                        "image_parameters": {
                            "background_weight": 1,
                            "heatmap_weight": 0.5,
                            "draw_gaze_positions": {
                                "color": [0, 127, 127],
                                "size": 4
                            },
                            "draw_scan_path": {
                                "draw_fixations": {
                                    "deviation_circle_color": [255, 255, 255],
                                    "duration_border_color": [0, 127, 127],
                                    "duration_factor": 1e-2
                                },
                                "draw_saccades": {
                                    "line_color": [0, 255, 255]
                                },
                                "deepness": 0
                            },
                            "draw_layers": {
                                "Main": {
                                    "draw_aoi_scene": {
                                        "draw_aoi": {
                                            "color": [255, 255, 255],
                                            "border_size": 1
                                        }
                                    },
                                    "draw_aoi_matching": {
                                        "draw_matched_fixation": {
                                            "deviation_circle_color": [255, 255, 255],
                                            "draw_positions": {
                                                "position_color": [0, 255, 0],
                                                "line_color": [0, 0, 0]
                                            }
                                        },
                                        "draw_looked_aoi": {
                                            "color": [0, 255, 0],
                                            "border_size": 2
                                        },
                                        "looked_aoi_name_color": [255, 255, 255],
                                        "looked_aoi_name_offset": [10, 10]
                                    }
                                }
                            }
                        },
                        "observers": {
                            "observers.ScanPathAnalysisRecorder": {
                                "path": "sector_screen.csv"
                            },
                            "observers.VideoRecorder": {
                                "path": "sector_screen.mp4",
                                "width": 1080,
                                "height": 1024,
                                "fps": 25
                            }
                        }
                    },
                    "Info_Screen": {
                        "size": [640, 1080],
                        "layers": {
                            "Main": {
                                "aoi_scene": "info_screen_aois.svg"
                            }
                        }
                    }
                },
                "angle_tolerance": 15.0,
                "distance_tolerance": 2.54
            }
        },
        "observers": {
            "argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
                "path": "_export/look_performance.csv"
            },
            "argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
                "path": "_export/watch_performance.csv"
            }
        }
    }
}
```

All the files mentioned above are described below.

The *ScanPathAnalysisRecorder* and *AOIScanPathAnalysisRecorder* observer objects are defined in the [observers.py](observers.md) file, which is described in the next chapter.

## optic_parameters.json

This file defines the Tobii Pro Glasses 2 scene camera optic parameters, which have been calculated as explained in [the camera calibration chapter](../../user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md).

```json
{
    "rms": 0.6688921504088245,
    "dimensions": [
        1920,
        1080
    ],
    "K": [
        [
            1135.6524381415752,
            0.0,
            956.0685325355497
        ],
        [
            0.0,
            1135.9272506869524,
            560.059099810324
        ],
        [
            0.0,
            0.0,
            1.0
        ]
    ],
    "D": [
        0.01655492265003404,
        0.1985524264972037,
        0.002129965902489484,
        -0.0019528582922179365,
        -0.5792910353639452
    ]
}
```
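The *K* matrix and *D* coefficients stored in this file appear to follow the usual OpenCV camera model, so they can also be reused outside of **ArGaze**, for example to undistort a frame extracted from the scene camera video. The sketch below assumes such a frame has been saved as *frame.png* (a hypothetical file name) and that OpenCV and NumPy are installed.

```python
import json

import cv2
import numpy

# Load the scene camera calibration computed for the Tobii Pro Glasses 2.
with open("optic_parameters.json") as file:
    optic_parameters = json.load(file)

K = numpy.array(optic_parameters["K"])  # 3x3 intrinsic matrix
D = numpy.array(optic_parameters["D"])  # distortion coefficients

# Undistort one scene camera frame (frame.png is a hypothetical extracted image).
image = cv2.imread("frame.png")
undistorted = cv2.undistort(image, K, D)
cv2.imwrite("frame_undistorted.png", undistorted)
```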
## workspace_markers.obj

This file defines where the ArUco markers are placed within the workspace geometry. Marker positions have been edited in Blender software from a 3D model of the workspace built manually, then exported in OBJ format.

```obj
# Blender v3.0.1 OBJ File: 'workspace.blend'
# www.blender.org
o DICT_APRILTAG_16h5#1_Marker
v -2.532475 48.421242 0.081627
v 2.467094 48.355682 0.077174
v 2.532476 53.352734 -0.081634
v -2.467093 53.418293 -0.077182
s off
f 1 2 3 4
o DICT_APRILTAG_16h5#6_Marker
v 88.144676 23.084166 -0.070246
v 93.144661 23.094980 -0.072225
v 93.133904 28.092941 0.070232
v 88.133919 28.082127 0.072211
s off
f 5 6 7 8
o DICT_APRILTAG_16h5#2_Marker
v -6.234516 27.087950 0.176944
v -1.244015 27.005413 -0.119848
v -1.164732 32.004459 -0.176936
v -6.155232 32.086998 0.119855
s off
f 9 10 11 12
o DICT_APRILTAG_16h5#3_Marker
v -2.518053 -2.481743 -0.018721
v 2.481756 -2.518108 0.005601
v 2.518059 2.481743 0.018721
v -2.481749 2.518108 -0.005601
s off
f 13 14 15 16
o DICT_APRILTAG_16h5#5_Marker
v 48.746418 48.319012 -0.015691
v 53.746052 48.374046 0.009490
v 53.690983 53.373741 0.015698
v 48.691349 53.318699 -0.009490
s off
f 17 18 19 20
o DICT_APRILTAG_16h5#4_Marker
v 23.331947 -3.018721 5.481743
v 28.331757 -2.994399 5.518108
v 28.368059 -2.981279 0.518257
v 23.368252 -3.005600 0.481892
s off
f 21 22 23 24

```

## workspace_aois.obj

This file defines where the AOIs are placed within the workspace geometry. AOI positions have been edited in [Blender software](https://www.blender.org/) from a 3D model of the workspace built manually, then exported in OBJ format.

```obj
# Blender v3.0.1 OBJ File: 'workspace.blend'
# www.blender.org
o Sector_Screen
v 0.000000 1.008786 0.000000
v 51.742416 1.008786 0.000000
v 0.000000 52.998108 0.000000
v 51.742416 52.998108 0.000000
s off
f 1 2 4 3
o Info_Screen
v 56.407101 0.000000 0.000000
v 91.407104 0.000000 0.000000
v 56.407101 52.499996 0.000000
v 91.407104 52.499996 0.000000
s off
f 5 6 8 7

```
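Both OBJ files only use plain *o*, *v*, *s* and *f* statements, so their geometry can be inspected without a 3D toolchain. Here is a minimal sketch, not part of the original use case files, that prints the extent of each AOI declared in workspace_aois.obj in the model's own units; the same approach works for the marker file above.

```python
from collections import defaultdict

# Group vertices by the object ("o" statement) they belong to.
objects = defaultdict(list)
current = None

with open("workspace_aois.obj") as file:
    for line in file:
        if line.startswith("o "):
            current = line.split(maxsplit=1)[1].strip()
        elif line.startswith("v ") and current:
            objects[current].append(tuple(float(value) for value in line.split()[1:]))

# Print the width and height of each AOI in the model's own units.
for name, vertices in objects.items():
    xs, ys, _ = zip(*vertices)
    print(f"{name}: {max(xs) - min(xs):.1f} x {max(ys) - min(ys):.1f}")
```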
## sector_screen_aois.svg

This file defines where the AOIs are placed within the sector screen frame. AOI positions have been edited in [Inkscape software](https://inkscape.org/fr/) from a screenshot of the sector screen, then exported in SVG format.

```svg
<svg >
    <path id="Area_1" d="M317.844,198.526L507.431,426.837L306.453,595.073L110.442,355.41L317.844,198.526Z"/>
    <path id="Area_2" d="M507.431,426.837L611.554,563.624L444.207,750.877L306.453,595.073L507.431,426.837Z"/>
    <path id="Area_3" d="M395.175,1017L444.207,750.877L611.554,563.624L1080,954.462L1080,1017L395.175,1017Z"/>
    <path id="Area_4" d="M611.554,563.624L756.528,293.236L562.239,198.526L471.45,382.082L611.554,563.624Z"/>
    <path id="Area_5" d="M0,900.683L306.453,595.073L444.207,750.877L395.175,1017L0,1017L0,900.683Z"/>
    <path id="Area_6" d="M471.45,381.938L557.227,207.284L354.832,65.656L237.257,104.014L471.45,381.938Z"/>
    <path id="Area_7" d="M0,22.399L264.521,24.165L318.672,77.325L237.257,103.625L248.645,118.901L0,80.963L0,22.399Z"/>
</svg>
```

## info_screen_aois.svg

This file defines where the AOI is placed within the info screen frame. AOI positions have been edited in [Inkscape software](https://inkscape.org/fr/) from a screenshot of the info screen, then exported in SVG format.

```svg
<svg >
    <rect id="Strips" x="0" y="880" width="640" height="200"/>
</svg>
```

## aoi_metrics.csv

This file contains all the metrics recorded by the *AOIScanPathAnalysisRecorder* observer, as defined in the [observers.py](observers.md) file.

## sector_screen.csv

This file contains all the metrics recorded by the *ScanPathAnalysisRecorder* observer, as defined in the [observers.py](observers.md) file.

## sector_screen.mp4

This video file is a record of the sector screen frame image.

## look_performance.csv

This file contains the logs of the *ArUcoCamera.look* method execution info. It is created in an *_export* folder in the directory from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.

On a MacBook Pro (2.3 GHz Intel Core i9, 8 cores), the *look* method execution time is ~7ms and it is called ~115 times per second.

## watch_performance.csv

This file contains the logs of the *ArUcoCamera.watch* method execution info. It is created in an *_export* folder in the directory from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.

On a MacBook Pro (2.3 GHz Intel Core i9, 8 cores), the *watch* method execution time is ~60ms and it is called ~10 times per second.

diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
index 74381e8..e50c325 100644
--- a/docs/use_cases/pilot_gaze_monitoring/pipeline.md
+++ b/docs/use_cases/pilot_gaze_monitoring/pipeline.md
@@ -316,4 +316,4 @@ On a Jetson Xavier computer, the *look* method execution time is ~0.5ms and it i
 
 This file contains the logs of *ArUcoCamera.watch* method execution info. It file is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
 
-On a Jetson Xavier computer, the *watch* method execution time is ~~50ms and it is called 10 times per second.
+On a Jetson Xavier computer, the *watch* method execution time is ~50ms and it is called ~10 times per second.