Visualize ArFrame and ArLayers
==============================
The results of all [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline steps can be drawn in real time or afterward.
![ArFrame visualisation](../../img/ar_frame_visualisation.png)
## Export to PNG file
Once timestamped gaze positions have been processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image can be written to a PNG file with the [OpenCV package](https://pypi.org/project/opencv-python/).

```python
import cv2

# Assuming that timestamped gaze positions have been processed by ArFrame.look method
...

# Export ArFrame image
cv2.imwrite('./ar_frame.png', ar_frame.image())
```
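
Note that `cv2.imwrite` returns `False` instead of raising an exception when the image cannot be written (for example, when the target directory does not exist), so it can be worth checking the result. A minimal sketch:

```python
import cv2

# cv2.imwrite returns a boolean success flag rather than raising on failure
if not cv2.imwrite('./ar_frame.png', ar_frame.image()):

    raise IOError('Failed to write ./ar_frame.png')
```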
## Export to MP4 file
While timestamped gaze positions are being processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image can be written into a video file with the [OpenCV package](https://pypi.org/project/opencv-python/).

```python
import cv2

# Assuming ArFrame is loaded
...

# Create a video file to save ArFrame
video = cv2.VideoWriter('ar_frame.mp4', cv2.VideoWriter_fourcc(*'mp4v'), 10, ar_frame.size)

# Assuming that timestamped gaze positions are being processed by ArFrame.look method
...

# Write ArFrame image into video file
video.write(ar_frame.image())

# Close video file
video.release()
```
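
The written video plays back at the declared frame rate (10 fps here), independently of how fast frames are actually produced. Also, `cv2.VideoWriter` does not raise when the codec or path is unsupported; it silently produces an unusable file, so checking `video.isOpened()` right after creation catches this early. A minimal sketch, assuming the same writer settings as above:

```python
import cv2

# Create the video writer with the same settings as above
video = cv2.VideoWriter('ar_frame.mp4', cv2.VideoWriter_fourcc(*'mp4v'), 10, ar_frame.size)

# The writer does not raise on an unsupported codec or path: check it explicitly
if not video.isOpened():

    raise IOError('Failed to open video writer')
```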
## Live window display
While timestamped gaze positions are being processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image can be displayed in a window with the [OpenCV package](https://pypi.org/project/opencv-python/).

```python
import cv2

def main():

    # Assuming ArFrame is loaded
    ...

    # Create a window to display ArFrame
    cv2.namedWindow(ar_frame.name, cv2.WINDOW_AUTOSIZE)

    # Assuming that timestamped gaze positions are being processed by ArFrame.look method
    ...

    # Update ArFrame image display
    cv2.imshow(ar_frame.name, ar_frame.image())

    # Wait 10 ms
    cv2.waitKey(10)

if __name__ == '__main__':
    main()
```
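
In practice the display is refreshed repeatedly. A minimal sketch of such a loop, assuming `ar_frame` is loaded and gaze positions are processed elsewhere (for example in another thread); `cv2.waitKey` returns the pressed key code, which allows a clean exit:

```python
import cv2

# Refresh the ArFrame image until the Escape key (code 27) is pressed
while True:

    cv2.imshow(ar_frame.name, ar_frame.image())

    if cv2.waitKey(10) & 0xFF == 27:
        break

# Close the display window
cv2.destroyAllWindows()
```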
## Edit ArFrame image parameters
[ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can be configured with a dictionary.

```python
# Assuming ArFrame is loaded
...

# Edit ArFrame image parameters
image_parameters = {
    "draw_scan_path": {
        "draw_fixations": {
            "deviation_circle_color": [255, 0, 255],
            "duration_border_color": [127, 0, 127],
            "duration_factor": 1e-2
        },
        "draw_saccades": {
            "line_color": [255, 0, 255]
        },
        "deepness": 0
    },
    "draw_layers": {
        "MyLayer": {
            "draw_aoi_scene": {
                "draw_aoi": {
                    "color": [255, 255, 255],
                    "border_size": 1
                }
            },
            "draw_aoi_matching": {
                "draw_matched_fixation": {
                    "deviation_circle_color": [255, 255, 255]
                },
                "draw_matched_fixation_positions": {
                    "position_color": [0, 255, 255],
                    "line_color": [0, 0, 0]
                },
                "draw_matched_region": {
                    "color": [0, 255, 0],
                    "border_size": 4
                },
                "draw_looked_aoi": {
                    "color": [0, 255, 0],
                    "border_size": 2
                },
                "looked_aoi_name_color": [255, 255, 255],
                "looked_aoi_name_offset": [0, -10]
            }
        }
    },
    "draw_gaze_position": {
        "color": [0, 255, 255]
    }
}

# Pass image parameters to ArFrame
ar_frame_image = ar_frame.image(**image_parameters)

# Do something with ArFrame image
...
```
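
Since the parameters are passed as keyword arguments, only the entries to override need to be provided; presumably, unspecified parameters keep their default values. A minimal sketch:

```python
# Assuming ArFrame is loaded: override only the gaze position color
ar_frame_image = ar_frame.image(draw_gaze_position={"color": [255, 0, 0]})
```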
## Configure ArFrame image parameters
[ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can also be configured through a dedicated JSON entry.
Here is an example JSON ArFrame configuration file with image parameters included:

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    ...
    "image_parameters": {
        "draw_scan_path": {
            "draw_fixations": {
                "deviation_circle_color": [255, 0, 255],
                "duration_border_color": [127, 0, 127],
                "duration_factor": 1e-2
            },
            "draw_saccades": {
                "line_color": [255, 0, 255]
            },
            "deepness": 0
        },
        "draw_layers": {
            "MyLayer": {
                "draw_aoi_scene": {
                    "draw_aoi": {
                        "color": [255, 255, 255],
                        "border_size": 1
                    }
                },
                "draw_aoi_matching": {
                    "draw_matched_fixation": {
                        "deviation_circle_color": [255, 255, 255]
                    },
                    "draw_matched_fixation_positions": {
                        "position_color": [0, 255, 255],
                        "line_color": [0, 0, 0]
                    },
                    "draw_matched_region": {
                        "color": [0, 255, 0],
                        "border_size": 4
                    },
                    "draw_looked_aoi": {
                        "color": [0, 255, 0],
                        "border_size": 2
                    },
                    "looked_aoi_name_color": [255, 255, 255],
                    "looked_aoi_name_offset": [0, -10]
                }
            }
        },
        "draw_gaze_position": {
            "color": [0, 255, 255]
        }
    }
}
```
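
Image parameters stored this way are applied as soon as the ArFrame is loaded from its JSON configuration file. A minimal sketch, assuming `ArFeatures.ArFrame.from_json` is the JSON loading entry point of your ArGaze version (the exact loading call may differ across releases; check the loading documentation):

```python
from argaze import ArFeatures

# Load the ArFrame, image parameters included, from its JSON configuration file
# (assumed entry point: check your ArGaze version's loading documentation)
ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')

# Draw the ArFrame with the configured image parameters
ar_frame_image = ar_frame.image()
```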
!!! warning
    Most *image_parameters* entries only take effect if the related ArFrame/ArLayer pipeline steps are enabled.
    For example, the JSON *draw_scan_path* entry requires both the GazeMovementIdentifier and ScanPath steps to be enabled.