Diffstat (limited to 'docs/user_guide/gaze_analysis_pipeline')
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/visualization.md  106
1 file changed, 54 insertions, 52 deletions
diff --git a/docs/user_guide/gaze_analysis_pipeline/visualization.md b/docs/user_guide/gaze_analysis_pipeline/visualization.md
index ed67892..6b9805c 100644
--- a/docs/user_guide/gaze_analysis_pipeline/visualization.md
+++ b/docs/user_guide/gaze_analysis_pipeline/visualization.md
@@ -1,7 +1,7 @@
Visualize pipeline steps
========================
-Visualisation is not a pipeline step but each [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline steps outputs can be drawn in real-time or afterward, depending of application purpose.
+Visualization is not a pipeline step, but each [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step output can be drawn in real time or afterward, depending on the application's purpose.
![ArFrame visualization](../../img/visualization.png)
@@ -13,62 +13,64 @@ Here is an extract from the JSON ArFrame configuration file with a sample where
```json
{
- "name": "My FullHD screen",
- "size": [1920, 1080],
- ...
- "image_parameters": {
- "draw_gaze_positions": {
- "color": [0, 255, 255],
- "size": 2
- },
- "draw_fixations": {
- "deviation_circle_color": [255, 255, 255],
- "duration_border_color": [127, 0, 127],
- "duration_factor": 1e-2,
- "draw_positions": {
- "position_color": [0, 255, 255],
- "line_color": [0, 0, 0]
- }
- },
- "draw_saccades": {
- "line_color": [255, 0, 255]
- },
- "draw_scan_path": {
+ "argaze.ArFeatures.ArFrame": {
+ "name": "My FullHD screen",
+ "size": [1920, 1080],
+ ...
+ "image_parameters": {
+ "draw_gaze_positions": {
+ "color": [0, 255, 255],
+ "size": 2
+ },
"draw_fixations": {
- "deviation_circle_color": [255, 0, 255],
+ "deviation_circle_color": [255, 255, 255],
"duration_border_color": [127, 0, 127],
- "duration_factor": 1e-2
+ "duration_factor": 1e-2,
+ "draw_positions": {
+ "position_color": [0, 255, 255],
+ "line_color": [0, 0, 0]
+ }
},
"draw_saccades": {
"line_color": [255, 0, 255]
- }
- },
- "draw_layers": {
- "MyLayer": {
- "draw_aoi_scene": {
- "draw_aoi": {
- "color": [255, 255, 255],
- "border_size": 1
- }
- },
- "draw_aoi_matching": {
- "draw_matched_fixation": {
- "deviation_circle_color": [255, 255, 255],
- "draw_positions": {
- "position_color": [0, 255, 0],
- "line_color": [0, 0, 0]
+ },
+ "draw_scan_path": {
+ "draw_fixations": {
+ "deviation_circle_color": [255, 0, 255],
+ "duration_border_color": [127, 0, 127],
+ "duration_factor": 1e-2
+ },
+ "draw_saccades": {
+ "line_color": [255, 0, 255]
+ }
+ },
+ "draw_layers": {
+ "MyLayer": {
+ "draw_aoi_scene": {
+ "draw_aoi": {
+ "color": [255, 255, 255],
+ "border_size": 1
}
},
- "draw_matched_region": {
- "color": [0, 255, 0],
- "border_size": 4
- },
- "draw_looked_aoi": {
- "color": [0, 255, 0],
- "border_size": 2
- },
- "looked_aoi_name_color": [255, 255, 255],
- "looked_aoi_name_offset": [0, -10]
+ "draw_aoi_matching": {
+ "draw_matched_fixation": {
+ "deviation_circle_color": [255, 255, 255],
+ "draw_positions": {
+ "position_color": [0, 255, 0],
+ "line_color": [0, 0, 0]
+ }
+ },
+ "draw_matched_region": {
+ "color": [0, 255, 0],
+ "border_size": 4
+ },
+ "draw_looked_aoi": {
+ "color": [0, 255, 0],
+ "border_size": 2
+ },
+ "looked_aoi_name_color": [255, 255, 255],
+ "looked_aoi_name_offset": [0, -10]
+ }
}
}
}
@@ -78,13 +80,13 @@ Here is an extract from the JSON ArFrame configuration file with a sample where
!!! warning
    Most of the *image_parameters* entries only work if the related ArFrame/ArLayer pipeline steps are enabled.
- For example, JSON *draw_scan_path* entry needs GazeMovementIdentifier and ScanPath steps to be enabled.
+    For example, a JSON *draw_scan_path* entry needs the GazeMovementIdentifier and ScanPath steps to be enabled.
Then, the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method can be called in various situations.
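One simple situation, sketched below under a couple of assumptions, is rendering the frame once and saving the result to disk. The snippet assumes the configuration shown above is stored in a `configuration.json` file and that your ArGaze release provides a JSON loading helper (written here as `argaze.load`; the exact loader name may differ between versions); the image itself comes from the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method and is written with OpenCV.

```python
import cv2

import argaze

# Assumption: a top-level load() helper that builds the ArFrame described by
# the "argaze.ArFeatures.ArFrame" entry of the JSON file; older ArGaze
# releases may expose a different loading function.
ar_frame = argaze.load('./configuration.json')

# Render the frame once, applying the image_parameters defined in the JSON
# configuration, and save the result for inspection.
cv2.imwrite('./frame.png', ar_frame.image())
```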
## Live window display
-While timestamped gaze positions are processed by [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, it is possible to display [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image thanks to [OpenCV package](https://pypi.org/project/opencv-python/).
+While timestamped gaze positions are processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, it is possible to display the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image thanks to the [OpenCV package](https://pypi.org/project/opencv-python/).
```python
import cv2
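
# The rest of this example is a minimal sketch, not the library's official
# snippet: it assumes an ArFrame instance named ar_frame has already been
# loaded from the JSON configuration above, and that timestamped gaze
# positions are being fed to its look method from another thread or callback.

# Create a window to display the ArFrame image.
cv2.namedWindow('ArFrame', cv2.WINDOW_AUTOSIZE)

try:

    while True:

        # image() draws gaze positions, fixations, saccades, scan path and
        # layers according to the image_parameters of the configuration.
        cv2.imshow('ArFrame', ar_frame.image())

        # Refresh the display every 10 ms; press Esc to quit.
        if cv2.waitKey(10) == 27:
            break

except KeyboardInterrupt:
    pass

cv2.destroyAllWindows()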