Record gaze analysis
=================

[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) analysis can be recorded by registering observers to their **look** method.

## Export gaze analysis to CSV file

[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) have an **observers** attribute to enable pipeline execution recording.

Here is an extract from the JSON ArFrame configuration file where recording is enabled for the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and for one [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer), with recorder classes loaded from a Python file:

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    "observers": {
        "my_recorders.ScanPathAnalysisRecorder": {
            "path": "./scan_path_metrics.csv"
        }
    },
    ...
    "layers": {
        "MyLayer": {
            "observers": {
                "my_recorders.AOIScanPathAnalysisRecorder": {
                    "path": "./aoi_scan_path_metrics.csv"
                }
            },
            ...
        }
    }
}
```

!!! note

    [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and its [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) automatically notify **look** method observers after each call.

Here is the *my_recorders.py* file:

```python
from argaze.utils import UtilsFeatures

class ScanPathAnalysisRecorder(UtilsFeatures.FileWriter):

    def __init__(self, **kwargs):

        # Init FileWriter
        super().__init__(**kwargs)

        # Edit header line
        self.header = "Timestamp (ms)", "Duration (ms)", "Steps number"

    def on_look(self, timestamp, ar_frame, exception):
        """Record scan path metrics."""

        if ar_frame.is_analysis_available():

            analysis = ar_frame.analysis()

            data = (
                timestamp,
                analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].path_duration,
                analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].steps_number
            )

            # Write to file
            self.write(data)

class AOIScanPathAnalysisRecorder(UtilsFeatures.FileWriter):

    def __init__(self, **kwargs):

        # Init FileWriter
        super().__init__(**kwargs)

        # Edit header line
        self.header = "Timestamp (ms)", "NGram counts"

    def on_look(self, timestamp, ar_layer, exception):
        """Record AOI scan path metrics."""

        if ar_layer.is_analysis_available():

            data = (
                timestamp,
                ar_layer.analysis()['argaze.GazeAnalysis.NGram.AOIScanPathAnalyzer'].ngrams_count
            )

            # Write to file
            self.write(data)
```
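
Once these recorders are declared as observers in the configuration, they only write something when timestamped gaze positions are actually processed. Below is a minimal driving-loop sketch; it assumes the configuration above is saved as *./configuration.json*, that gaze data are available as timestamp, x, y rows in a hypothetical *./gaze_positions.csv* file, and that ArGaze exposes the `load` entry point and the two-argument `ArFrame.look` call described in the [pipeline scripting chapter](./advanced_topics/scripting.md); check that chapter for the exact API of your ArGaze version.

```python
import csv

from argaze import load, GazeFeatures

# Assumption: load() and the two-argument look() follow the pipeline scripting
# chapter; the configuration and gaze data file names are illustrative.
with load('./configuration.json') as ar_frame:

    with open('./gaze_positions.csv') as gaze_file:

        for row in csv.DictReader(gaze_file):

            gaze_position = GazeFeatures.GazePosition((int(row['x']), int(row['y'])))

            # Each look call notifies the observers declared in the configuration:
            # this is where scan_path_metrics.csv and aoi_scan_path_metrics.csv lines get written.
            ar_frame.look(int(row['timestamp']), gaze_position)
```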

Assuming that the [ArGaze.GazeAnalysis.Basic](../../argaze.md/#argaze.GazeAnalysis.Basic) scan path analysis module is enabled for the 'My FullHD screen' ArFrame, a ***scan_path_metrics.csv*** file would be created:

|Timestamp (ms)|Duration (ms)|Steps number|
|:-------------|:------------|:-----------|
|3460 |1750 |2 |
|4291 |2623 |3 |
|4769 |3107 |4 |
|6077 |4411 |5 |
|6433 |4760 |6 |
|7719 |6050 |7 |
|... |... |... |

Assuming that the [ArGaze.GazeAnalysis.NGram](../../argaze.md/#argaze.GazeAnalysis.NGram) AOI scan path analysis module is enabled for the 'MyLayer' ArLayer, an ***aoi_scan_path_metrics.csv*** file would be created:

|Timestamp (ms)|NGram counts|
|:-------------|:-----------|
|5687 |"{3: {}, 4: {}, 5: {}}"|
|6208 |"{3: {('LeftPanel', 'GeoSector', 'CircularWidget'): 1}, 4: {}, 5: {}}"|
|... |... |
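
These CSV files can then be post-processed with standard Python tools. The sketch below is illustrative and not part of ArGaze; it assumes the recorders write comma-separated values with the headers shown above and that the *NGram counts* column stores a quoted Python literal as in the table:

```python
import ast
import csv

# Illustrative post-processing only: assumes comma-separated values with the
# headers shown above and a quoted Python literal in the "NGram counts" column.
with open('./aoi_scan_path_metrics.csv') as metrics_file:

    for row in csv.DictReader(metrics_file):

        # Parse the recorded n-gram counts back into a Python dictionary
        ngrams_count = ast.literal_eval(row['NGram counts'])

        # Print how many distinct 3-grams were observed at each timestamp
        print(row['Timestamp (ms)'], len(ngrams_count[3]))
```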
!!! note ""
Learn to [script the pipeline](./advanced_topics/scripting.md) to know more about [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) attributes.

## Export gaze analysis to video file

As explained in the [pipeline steps visualisation chapter](visualisation.md), it is possible to get the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) once timestamped gaze positions have been processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method.

Here is an extract from the JSON ArFrame configuration file where the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) observers are extended with a new *my_recorders.FrameImageRecorder* instance:

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    "observers": {
        ...
        "my_recorders.FrameImageRecorder": {
            "path": "./video.mp4",
            "width": 1920,
            "height": 1080,
            "fps": 15
        },
        ...
    }
}
```

Here is the *my_recorders.py* file extended with a new FrameImageRecorder class:

```python
...

class FrameImageRecorder(UtilsFeatures.VideoWriter):

    def __init__(self, **kwargs):

        # Init VideoWriter
        super().__init__(**kwargs)

    def on_look(self, timestamp, ar_frame, exception):
        """Record frame image into video file."""

        self.write(ar_frame.image())
```

Assuming that [ArFrame.image_parameters](../../argaze.md/#argaze.ArFeatures.ArFrame.image_parameters) are provided, a ***video.mp4*** file would be created.
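
For reference, such a recorder boils down to writing each frame image into a video container. The sketch below shows this mechanism with OpenCV; it is only an illustration of what a video writer typically does, not the actual *UtilsFeatures.VideoWriter* implementation, and its parameter names simply mirror the configuration above:

```python
import cv2

class IllustrativeVideoWriter:
    """Illustration only: writes BGR image matrices, such as those returned by
    ArFrame.image(), into an MP4 file with OpenCV."""

    def __init__(self, path, width, height, fps):

        # Open the video container with the size and frame rate set in the configuration
        self.__writer = cv2.VideoWriter(path, cv2.VideoWriter_fourcc(*'mp4v'), fps, (width, height))

    def write(self, image):

        # Append one frame; the image is expected to match the configured width and height
        self.__writer.write(image)

    def release(self):

        # Finalize the file so that it can be read by video players
        self.__writer.release()
```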