diff --git a/docs/user_guide/gaze_analysis_pipeline/recording.md b/docs/user_guide/gaze_analysis_pipeline/recording.md
new file mode 100644
index 0000000..72aee58
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/recording.md
@@ -0,0 +1,156 @@
+Record gaze analysis
+=================
+
+[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) analysis can be recorded by attaching observers to their **look** method.
+
+## Export gaze analysis to CSV file
+
+[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) have an **observers** attribute to enable pipeline execution recording.
+
+Here is an extract from the JSON ArFrame configuration file where recording is enabled for the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and for one [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer), using observer classes loaded from a Python file:
+
+```json
+{
+ "name": "My FullHD screen",
+ "size": [1920, 1080],
+ "observers": {
+ "my_recorders.ScanPathAnalysisRecorder": {
+ "path": "./scan_path_metrics.csv"
+ },
+  ...
+ },
+ "layers": {
+ "MyLayer": {
+ "observers": {
+ "my_recorders.AOIScanPathAnalysisRecorder": {
+ "path": "./aoi_scan_path_metrics.csv"
+ }
+ },
+ ...
+ }
+ }
+}
+```
+
+!!! note
+ [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and its [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) automatically notify **look** method observers after each call.
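+
+Judging from the recorder classes shown below, an observer is simply a class exposing an **on_look** method that receives the timestamp passed to **look**, the notifying [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) or [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer), and any exception raised during pipeline execution. As a purely illustrative sketch (the *PrintObserver* class below is not part of ArGaze), the smallest possible observer could look like this:
+
+```python
+class PrintObserver:
+    """Minimal illustrative observer: it only prints when it is notified."""
+
+    def on_look(self, timestamp, ar_frame, exception):
+
+        # timestamp: the timestamp passed to the look method
+        # ar_frame: the ArFrame (or ArLayer) that notified this observer
+        # exception: exception raised during pipeline execution, if any
+        print(f'looked at {timestamp} ms')
+```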
+
+Here is the *my_recorders.py* file:
+
+```python
+from argaze.utils import UtilsFeatures
+
+class ScanPathAnalysisRecorder(UtilsFeatures.FileWriter):
+
+ def __init__(self, **kwargs):
+
+ # Init FileWriter
+ super().__init__(**kwargs)
+
+        # Edit header line
+ self.header = "Timestamp (ms)", "Duration (ms)", "Steps number"
+
+ def on_look(self, timestamp, ar_frame, exception):
+ """Record scan path metrics"""
+
+ if ar_frame.is_analysis_available():
+
+ analysis = ar_frame.analysis()
+
+ data = (
+ timestamp,
+ analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].path_duration,
+ analysis['argaze.GazeAnalysis.Basic.ScanPathAnalyzer'].steps_number
+ )
+
+ # Write to file
+ self.write(data)
+
+class AOIScanPathAnalysisRecorder(UtilsFeatures.FileWriter):
+
+ def __init__(self, **kwargs):
+
+ # Init FileWriter
+ super().__init__(**kwargs)
+
+ # Edit header line
+ self.header = "Timestamp (ms)", "NGram counts"
+
+ def on_look(self, timestamp, ar_layer, exception):
+ """Record aoi scan path metrics."""
+
+ if ar_layer.is_analysis_available():
+
+ data = (
+ timestamp,
+ ar_layer.analysis['argaze.GazeAnalysis.NGram.AOIScanPathAnalyzer'].ngrams_count
+ )
+
+ # Write to file
+ self.write(data)
+```
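+
+With these recorder classes in place, nothing else has to be called explicitly: the files are filled while timestamped gaze positions are processed, since each **look** call notifies the registered observers. The sketch below is only illustrative; *ar_frame* stands for the frame loaded from the configuration above (see the previous chapters of this guide) and *gaze_data_source* is a placeholder for any source of timestamped gaze positions:
+
+```python
+from argaze import GazeFeatures
+
+# Illustrative only: ar_frame is assumed to be already loaded from the JSON
+# configuration above and gaze_data_source is a placeholder iterable of
+# timestamped gaze positions.
+for timestamp, x, y in gaze_data_source:
+
+    # Each look call runs the pipeline steps then notifies the observers,
+    # so the recorders write a new line whenever an analysis is available.
+    ar_frame.look(timestamp, GazeFeatures.GazePosition((x, y)))
+```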
+
+Assuming that the [ArGaze.GazeAnalysis.Basic](../../argaze.md/#argaze.GazeAnalysis.Basic) scan path analysis module is enabled for the 'My FullHD screen' ArFrame, a ***scan_path_metrics.csv*** file would be created:
+
+|Timestamp (ms)|Duration (ms)|Steps number|
+|:-------------|:------------|:-----------|
+|3460 |1750 |2 |
+|4291 |2623 |3 |
+|4769 |3107 |4 |
+|6077 |4411 |5 |
+|6433 |4760 |6 |
+|7719 |6050 |7 |
+|... |... |... |
+
+Assuming that the [ArGaze.GazeAnalysis.NGram](../../argaze.md/#argaze.GazeAnalysis.NGram) AOI scan path analysis module is enabled for the 'MyLayer' ArLayer, an ***aoi_scan_path_metrics.csv*** file would be created:
+
+|Timestamp (ms)|NGram counts|
+|:-------------|:-----------|
+|5687 |"{3: {}, 4: {}, 5: {}}"|
+|6208 |"{3: {('LeftPanel', 'GeoSector', 'CircularWidget'): 1}, 4: {}, 5: {}}"|
+|... |... |
+
+!!! note ""
+
+    Learn how to [script the pipeline](./advanced_topics/scripting.md) to find out more about [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) attributes.
+
+## Export gaze analysis to video file
+
+As explained in the [pipeline steps visualisation chapter](visualisation.md), it is possible to get the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) once timestamped gaze positions have been processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method.
+
+Here is the JSON ArFrame configuration file where [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) observers are extended with a new my_recorders.FrameImageRecorder instance:
+
+```json
+{
+ "name": "My FullHD screen",
+ "size": [1920, 1080],
+ "observers": {
+ ...
+ "my_recorders.FrameImageRecorder": {
+ "path": "./video.mp4",
+ "width": 1920,
+ "height": 1080,
+ "fps": 15
+ },
+  ...
+ }
+}
+```
+
+Here is the *my_recorders.py* file extended with a new FrameImageRecorder class:
+
+```python
+...
+
+class FrameImageRecorder(UtilsFeatures.VideoWriter):
+
+ def __init__(self, **kwargs):
+
+ # Init VideoWriter
+ super().__init__(**kwargs)
+
+ def on_look(self, timestamp, ar_frame, exception):
+ """Record frame image into video file."""
+
+ self.write(ar_frame.image())
+```
+
+Assuming that [ArFrame.image_parameters](../../argaze.md/#argaze.ArFeatures.ArFrame.image_parameters) are provided, a ***video.mp4*** file would be created.
\ No newline at end of file