Post-processing pipeline
========================
The pipeline processes the scene camera image and gaze data to enable gaze mapping and gaze analysis.
## post_processing_pipeline.json
For this use case, we need to detect ArUco markers to enable gaze mapping: **ArGaze** provides the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) class to set up an [ArUco markers pipeline](../../user_guide/aruco_marker_pipeline/introduction.md).
```json
{
    "argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera": {
        "name": "ATC_Study",
        "size": [1920, 1080],
        "sides_mask": 420,
        "copy_background_into_scenes_frames": true,
        "aruco_detector": {
            "dictionary": "DICT_APRILTAG_16h5",
            "optic_parameters": "optic_parameters.json",
            "parameters": {
                "adaptiveThreshConstant": 20,
                "useAruco3Detection": true
            }
        },
        "gaze_movement_identifier": {
            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                "deviation_max_threshold": 25,
                "duration_min_threshold": 150
            }
        },
        "layers": {
            "Main": {
                "aoi_matcher": {
                    "argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher": {
                        "coverage_threshold": 0.5
                    }
                },
                "aoi_scan_path": {
                    "duration_max": 60000
                },
                "aoi_scan_path_analyzers": {
                    "argaze.GazeAnalysis.Basic.AOIScanPathAnalyzer": {},
                    "argaze.GazeAnalysis.TransitionMatrix.AOIScanPathAnalyzer": {},
                    "argaze.GazeAnalysis.NGram.AOIScanPathAnalyzer": {
                        "n_min": 3,
                        "n_max": 5
                    },
                    "argaze.GazeAnalysis.Entropy.AOIScanPathAnalyzer": {}
                },
                "observers": {
                    "observers.AOIScanPathAnalysisRecorder": {
                        "path": "aoi_metrics.csv"
                    }
                }
            }
        },
        "image_parameters": {
            "background_weight": 1,
            "draw_gaze_positions": {
                "color": [0, 255, 255],
                "size": 4
            },
            "draw_detected_markers": {
                "color": [0, 255, 0]
            },
            "draw_layers": {
                "Main": {
                    "draw_aoi_scene": {
                        "draw_aoi": {
                            "color": [255, 255, 255],
                            "border_size": 1
                        }
                    },
                    "draw_aoi_matching": {
                        "update_looked_aoi": true,
                        "draw_looked_aoi": {
                            "color": [0, 255, 0],
                            "border_size": 2
                        },
                        "looked_aoi_name_color": [255, 255, 255],
                        "looked_aoi_name_offset": [0, -10]
                    }
                }
            }
        },
        "scenes": {
            "Workspace": {
                "aruco_markers_group": "workspace_markers.obj",
                "layers": {
                    "Main": {
                        "aoi_scene": "workspace_aois.obj"
                    }
                },
                "frames": {
                    "Sector_Screen": {
                        "size": [1080, 1017],
                        "gaze_movement_identifier": {
                            "argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
                                "deviation_max_threshold": 25,
                                "duration_min_threshold": 150
                            }
                        },
                        "scan_path": {
                            "duration_max": 30000
                        },
                        "scan_path_analyzers": {
                            "argaze.GazeAnalysis.Basic.ScanPathAnalyzer": {},
                            "argaze.GazeAnalysis.ExploreExploitRatio.ScanPathAnalyzer": {
                                "short_fixation_duration_threshold": 0
                            },
                            "argaze.GazeAnalysis.KCoefficient.ScanPathAnalyzer": {}
                        },
                        "layers": {
                            "Main": {
                                "aoi_scene": "sector_screen_aois.svg"
                            }
                        },
                        "heatmap": {
                            "size": [80, 60]
                        },
                        "image_parameters": {
                            "background_weight": 1,
                            "heatmap_weight": 0.5,
                            "draw_gaze_positions": {
                                "color": [0, 127, 127],
                                "size": 4
                            },
                            "draw_scan_path": {
                                "draw_fixations": {
                                    "deviation_circle_color": [255, 255, 255],
                                    "duration_border_color": [0, 127, 127],
                                    "duration_factor": 1e-2
                                },
                                "draw_saccades": {
                                    "line_color": [0, 255, 255]
                                },
                                "deepness": 0
                            },
                            "draw_layers": {
                                "Main": {
                                    "draw_aoi_scene": {
                                        "draw_aoi": {
                                            "color": [255, 255, 255],
                                            "border_size": 1
                                        }
                                    },
                                    "draw_aoi_matching": {
                                        "draw_matched_fixation": {
                                            "deviation_circle_color": [255, 255, 255],
                                            "draw_positions": {
                                                "position_color": [0, 255, 0],
                                                "line_color": [0, 0, 0]
                                            }
                                        },
                                        "draw_looked_aoi": {
                                            "color": [0, 255, 0],
                                            "border_size": 2
                                        },
                                        "looked_aoi_name_color": [255, 255, 255],
                                        "looked_aoi_name_offset": [10, 10]
                                    }
                                }
                            }
                        },
                        "observers": {
                            "observers.ScanPathAnalysisRecorder": {
                                "path": "sector_screen.csv"
                            },
                            "observers.VideoRecorder": {
                                "path": "sector_screen.mp4",
                                "width": 1080,
                                "height": 1024,
                                "fps": 25
                            }
                        }
                    },
                    "Info_Screen": {
                        "size": [640, 1080],
                        "layers": {
                            "Main": {
                                "aoi_scene": "info_screen_aois.svg"
                            }
                        }
                    }
                }
            }
        },
        "observers": {
            "argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
                "path": "look_performance.csv"
            },
            "argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
                "path": "watch_performance.csv"
            }
        }
    }
}
```
All the files mentioned above are described below.
The *ScanPathAnalysisRecorder* and *AOIScanPathAnalysisRecorder* observer objects are defined in the [observers.py](observers.md) file, which is described in the next chapter.
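Before looking at each file, it can help to see how this configuration nests scenes, frames and layers. The following sketch is only an illustration (the pipeline itself is executed with the [*load* command](../../user_guide/utils/main_commands.md)): it uses the Python standard library to walk the JSON file above and print its hierarchy.
```python
import json

# Load the pipeline configuration shown above.
with open('post_processing_pipeline.json') as configuration_file:
    configuration = json.load(configuration_file)

# The root object holds a single ArUcoCamera definition.
camera = configuration['argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera']
print(f"camera: {camera['name']} {camera['size']}")

# Each scene owns 3D layers and 2D frames; each frame owns its own layers.
for scene_name, scene in camera.get('scenes', {}).items():
    print(f"  scene: {scene_name} (markers: {scene['aruco_markers_group']})")

    for frame_name, frame in scene.get('frames', {}).items():
        print(f"    frame: {frame_name} {frame['size']}")

        for layer_name, layer in frame.get('layers', {}).items():
            print(f"      layer: {layer_name} (AOI: {layer['aoi_scene']})")
```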
## optic_parameters.json
This file defines the Tobii Pro Glasses 2 scene camera optic parameters, which have been calculated as explained in [the camera calibration chapter](../../user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md).
```json
{
"rms": 0.6688921504088245,
"dimensions": [
1920,
1080
],
"K": [
[
1135.6524381415752,
0.0,
956.0685325355497
],
[
0.0,
1135.9272506869524,
560.059099810324
],
[
0.0,
0.0,
1.0
]
],
"D": [
0.01655492265003404,
0.1985524264972037,
0.002129965902489484,
-0.0019528582922179365,
-0.5792910353639452
]
}
```
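As a reminder of the OpenCV convention used by the calibration chapter: *K* is the 3x3 intrinsic matrix holding the focal lengths and the principal point in pixels, and *D* holds the distortion coefficients. The sketch below is not part of the pipeline; it simply loads the file above and undistorts one arbitrary pixel with OpenCV to illustrate how such parameters are typically used.
```python
import json

import cv2
import numpy as np

# Load the optic parameters shown above.
with open('optic_parameters.json') as optic_parameters_file:
    optic_parameters = json.load(optic_parameters_file)

# 3x3 camera matrix: fx, fy on the diagonal, principal point (cx, cy) in the last column.
K = np.array(optic_parameters['K'])

# Distortion coefficients in the usual OpenCV order (k1, k2, p1, p2, k3).
D = np.array(optic_parameters['D'])

print('focal lengths (px):', K[0, 0], K[1, 1])
print('principal point (px):', K[0, 2], K[1, 2])

# Undistort one arbitrary pixel (illustration only).
pixel = np.array([[[1000.0, 500.0]]], dtype=np.float32)
print('undistorted pixel:', cv2.undistortPoints(pixel, K, D, P=K))
```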
## workspace_markers.obj
This file defines where the ArUco markers lie in the workspace geometry. Marker positions have been edited in [Blender](https://www.blender.org/) from a manually built 3D model of the workspace, then exported in OBJ format.
```obj
# Blender v3.0.1 OBJ File: 'workspace.blend'
# www.blender.org
o DICT_APRILTAG_16h5#1_Marker
v -2.532475 48.421242 0.081627
v 2.467094 48.355682 0.077174
v 2.532476 53.352734 -0.081634
v -2.467093 53.418293 -0.077182
s off
f 1 2 3 4
o DICT_APRILTAG_16h5#6_Marker
v 88.144676 23.084166 -0.070246
v 93.144661 23.094980 -0.072225
v 93.133904 28.092941 0.070232
v 88.133919 28.082127 0.072211
s off
f 5 6 7 8
o DICT_APRILTAG_16h5#2_Marker
v -6.234516 27.087950 0.176944
v -1.244015 27.005413 -0.119848
v -1.164732 32.004459 -0.176936
v -6.155232 32.086998 0.119855
s off
f 9 10 11 12
o DICT_APRILTAG_16h5#3_Marker
v -2.518053 -2.481743 -0.018721
v 2.481756 -2.518108 0.005601
v 2.518059 2.481743 0.018721
v -2.481749 2.518108 -0.005601
s off
f 13 14 15 16
o DICT_APRILTAG_16h5#5_Marker
v 48.746418 48.319012 -0.015691
v 53.746052 48.374046 0.009490
v 53.690983 53.373741 0.015698
v 48.691349 53.318699 -0.009490
s off
f 17 18 19 20
o DICT_APRILTAG_16h5#4_Marker
v 23.331947 -3.018721 5.481743
v 28.331757 -2.994399 5.518108
v 28.368059 -2.981279 0.518257
v 23.368252 -3.005600 0.481892
s off
f 21 22 23 24
```
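Each `o` line names one marker with its dictionary and identifier (for instance, `DICT_APRILTAG_16h5#1_Marker` is marker 1 of the `DICT_APRILTAG_16h5` dictionary) and is followed by the four `v` lines giving its corner vertices in the workspace coordinate system. A minimal, standard-library-only sketch to list the markers declared in this file:
```python
from collections import defaultdict

# Group the 'v' lines of workspace_markers.obj under the 'o' object they belong to.
corners = defaultdict(list)
current_object = None

with open('workspace_markers.obj') as obj_file:
    for line in obj_file:
        if line.startswith('o '):
            current_object = line.split()[1]
        elif line.startswith('v ') and current_object is not None:
            corners[current_object].append(tuple(float(value) for value in line.split()[1:4]))

for object_name, vertices in corners.items():
    # Object names follow the '<dictionary>#<identifier>_Marker' convention shown above.
    dictionary, remainder = object_name.split('#')
    identifier = int(remainder.split('_')[0])
    print(f"marker {identifier} ({dictionary}): {len(vertices)} corners")
```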
## workspace_aois.obj
This file defines where the AOI lie in the workspace geometry. AOI positions have been edited in [Blender](https://www.blender.org/) from a manually built 3D model of the workspace, then exported in OBJ format.
```obj
# Blender v3.0.1 OBJ File: 'workspace.blend'
# www.blender.org
o Sector_Screen
v 0.000000 1.008786 0.000000
v 51.742416 1.008786 0.000000
v 0.000000 52.998108 0.000000
v 51.742416 52.998108 0.000000
s off
f 1 2 4 3
o Info_Screen
v 56.407101 0.000000 0.000000
v 91.407104 0.000000 0.000000
v 56.407101 52.499996 0.000000
v 91.407104 52.499996 0.000000
s off
f 5 6 8 7
```
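Note that the object names in this file (`Sector_Screen`, `Info_Screen`) match the frame names declared under the `Workspace` scene in the JSON configuration above, which is what allows each frame to be placed on its corresponding AOI. A small sketch (standard library only, using the file names from this page) to check that the two stay consistent:
```python
import json

# Frame names declared under the Workspace scene in the pipeline configuration.
with open('post_processing_pipeline.json') as configuration_file:
    camera = json.load(configuration_file)['argaze.ArUcoMarker.ArUcoCamera.ArUcoCamera']

frame_names = set(camera['scenes']['Workspace']['frames'])

# Object names defined in the workspace AOI file.
with open('workspace_aois.obj') as obj_file:
    aoi_names = {line.split()[1] for line in obj_file if line.startswith('o ')}

# Every frame should have an AOI of the same name to be placed on.
missing = frame_names - aoi_names
print('missing AOI for frames:', missing or 'none')
```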
## sector_screen_aois.svg
This file defines where the AOI lie in the sector screen frame. AOI positions have been edited in [Inkscape](https://inkscape.org/fr/) from a screenshot of the sector screen, then exported in SVG format.
```svg
```
## info_screen_aois.svg
This file defines where the AOI lie in the info screen frame. AOI positions have been edited in [Inkscape](https://inkscape.org/fr/) from a screenshot of the info screen, then exported in SVG format.
```svg
```
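In both SVG files, each AOI is expected to be a shape (for example a `rect` or `path`) whose `id` attribute carries the AOI name; this is an assumption based on the AOI description format used elsewhere in the ArGaze documentation, so check the actual files against it. Under that assumption, a short sketch to list the AOI names declared in each file:
```python
import xml.etree.ElementTree as ElementTree

def list_aoi_names(svg_path: str) -> list:
    """List the id attributes of the shapes in an AOI SVG file (assuming ids carry the AOI names)."""
    root = ElementTree.parse(svg_path).getroot()
    return [element.attrib['id'] for element in root.iter() if element is not root and 'id' in element.attrib]

print(list_aoi_names('sector_screen_aois.svg'))
print(list_aoi_names('info_screen_aois.svg'))
```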
## aoi_metrics.csv
This file contains all the metrics recorded by the *AOIScanPathAnalysisRecorder* observer defined in the [observers.py](observers.md) file.
## sector_screen.csv
This file contains all the metrics recorded by the *ScanPathAnalysisRecorder* observer defined in the [observers.py](observers.md) file.
## sector_screen.mp4
This video file is a recording of the sector screen frame image.
## look_performance.csv
This file logs execution information for the *ArUcoCamera.look* method. It is created in the folder from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
On a MacBook Pro (2.3 GHz Intel Core i9, 8 cores), the *look* method execution time is ~1 ms and it is called ~51 times per second.
## watch_performance.csv
This file logs execution information for the *ArUcoCamera.watch* method. It is created in the folder from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
On a MacBook Pro (2.3 GHz Intel Core i9, 8 cores) without CUDA acceleration, the *watch* method execution time is ~52 ms and it is called more than 12 times per second.