Diffstat (limited to 'docs/user_guide')
-rw-r--r-- | docs/user_guide/gaze_analysis_pipeline/recording.md | 52 |
1 file changed, 32 insertions, 20 deletions
diff --git a/docs/user_guide/gaze_analysis_pipeline/recording.md b/docs/user_guide/gaze_analysis_pipeline/recording.md
index 26b7e82..826442f 100644
--- a/docs/user_guide/gaze_analysis_pipeline/recording.md
+++ b/docs/user_guide/gaze_analysis_pipeline/recording.md
@@ -1,31 +1,34 @@
 Record gaze analysis
 =================
 
-[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) analysis can be recorded by registering observers to their **look** method.
+[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) analyses can be recorded by registering observers for their **look** method.
 
 ## Export gaze analysis to CSV file
 
 [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) have an **observers** attribute to enable pipeline execution recording.
 
-Here is an extract from the JSON ArFrame configuration file where recording is enabled for the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and for one [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) by loaded classes from Python files:
+Here is an extract from the JSON ArFrame configuration file where recording is enabled for the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and for one [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) by loading classes from the *my_recorders.py* Python file:
 
 ```json
 {
-    "name": "My FullHD screen",
-    "size": [1920, 1080],
-    "observers": {
-        "my_recorders.ScanPathAnalysisRecorder": {
-            "path": "./scan_path_metrics.csv"
+    "argaze.ArFeatures.ArFrame": {
+        "name": "My FullHD screen",
+        "size": [1920, 1080],
+        ...
+        "observers": {
+            "my_recorders.ScanPathAnalysisRecorder": {
+                "path": "scan_path_metrics.csv"
+            }
         },
-        ...
-    "layers": {
-        "MyLayer": {
-            "observers": {
-                "my_recorders.AOIScanPathAnalysisRecorder": {
-                    "path": "./aoi_scan_path_metrics.csv"
+        "layers": {
+            "MyLayer": {
+                ...
+                "observers": {
+                    "my_recorders.AOIScanPathAnalysisRecorder": {
+                        "path": "aoi_scan_path_metrics.csv"
+                    }
                 }
-            },
-            ...
+            }
         }
     }
 }
@@ -34,6 +37,9 @@ Here is an extract from the JSON ArFrame configuration file where recording is e
 !!! note
     [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and its [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) automatically notify **look** method observers after each call.
 
+!!! note
+    The *scan_path_metrics.csv* and *aoi_scan_path_metrics.csv* files are created relative to the directory where the Python script is executed.
+
 Here is *my_recorders.py* file:
 
 ```python
@@ -89,6 +95,9 @@ class AOIScanPathAnalysisRecorder(UtilsFeatures.FileWriter):
 
         self.write(data)
 ```
+!!! note
+    The *my_recorders.py* file has to be in the same folder as the JSON ArFrame configuration file.
+
 Assuming that [ArGaze.GazeAnalysis.Basic](../../argaze.md/#argaze.GazeAnalysis.Basic) scan path analysis module is enabled for 'My FullHD screen' ArFrame, a ***scan_path_metrics.csv*** file would be created:
 
 |Timestamp (ms)|Duration (ms)|Steps number|
@@ -117,7 +126,7 @@ Assuming that [ArGaze.GazeAnalysis.NGram](../../argaze.md/#argaze.GazeAnalysis.N
 
 As explained in [pipeline steps visualization chapter](visualization.md), it is possible to get [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) once timestamped gaze positions have been processed by [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method.
 
-Here is the JSON ArFrame configuration file where [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) observers are extended with a new my_recorders.VideoRecorder instance:
+Here is the JSON ArFrame configuration file where [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) observers are extended with a new my_recorders.FrameImageRecorder instance:
 
 ```json
 {
@@ -126,16 +135,19 @@ Here is the JSON ArFrame configuration file where [ArFrame](../../argaze.md/#arg
     "observers": {
         ...
         "my_recorders.FrameImageRecorder": {
-            "path": "./video.mp4",
+            "path": "video.mp4",
             "width": 1920,
             "height": 1080,
             "fps": 15
-        },
+        }
         ...
 }
 ```
 
-Here is *my_recorders.py* file extended with a new VideoRecorder class:
+!!! note
+    The *video.mp4* file is created relative to the directory where the Python script is executed.
+
+Here is *my_recorders.py* file extended with a new FrameImageRecorder class:
 
 ```python
 ...
@@ -153,4 +165,4 @@ class FrameImageRecorder(UtilsFeatures.VideoWriter):
 
         self.write(ar_frame.image())
 ```
-Assuming that [ArFrame.image_parameters](../../argaze.md/#argaze.ArFeatures.ArFrame.image_parameters) are provided, ***video.mp4*** file would be created.
\ No newline at end of file
+Assuming that [ArFrame.image_parameters](../../argaze.md/#argaze.ArFeatures.ArFrame.image_parameters) are provided, a ***video.mp4*** file would be created.
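For context on the pattern this change documents, below is a minimal, self-contained sketch of what a look-notification recorder does: it is constructed from the keyword arguments of its JSON `observers` entry (here `path`), writes a header once, and appends one row per notification. This is an illustration only, not the ArGaze API: the callback name `on_look`, its argument list, and the `scan_path_metrics()` accessor are assumptions; the real *my_recorders.py* classes derive from `UtilsFeatures.FileWriter` and `UtilsFeatures.VideoWriter` and record through `self.write(...)`, as the hunks above show.

```python
# Minimal sketch of the observer/recorder pattern shown in this change.
# Assumptions (not taken from the diff): the callback name "on_look", its
# argument list, and the scan_path_metrics() accessor on the frame object.
import csv


class ScanPathAnalysisRecorderSketch:
    """Stand-in for a my_recorders.ScanPathAnalysisRecorder-like observer."""

    def __init__(self, path: str):

        # The "path" keyword argument mirrors the JSON observers entry.
        self._file = open(path, "w", newline="")
        self._writer = csv.writer(self._file)

        # Write the metrics header once, matching the generated CSV file.
        self._writer.writerow(("Timestamp (ms)", "Duration (ms)", "Steps number"))

    def on_look(self, timestamp, ar_frame, exception):
        """Assumed look-notification callback: append one metrics row."""

        # Hypothetical accessor: how scan path analysis results are read from
        # the frame depends on the ArGaze version and is not shown in this diff.
        duration, steps_number = ar_frame.scan_path_metrics()

        self._writer.writerow((int(timestamp), duration, steps_number))
        self._file.flush()


if __name__ == "__main__":

    class _FakeFrame:
        """Dummy frame with made-up values so the sketch runs without ArGaze."""

        def scan_path_metrics(self):
            return 1448.0, 2

    recorder = ScanPathAnalysisRecorderSketch(path="scan_path_metrics.csv")
    recorder.on_look(3460, _FakeFrame(), None)
```

In the documented setup, such a class only has to live in *my_recorders.py* next to the JSON configuration file; the pipeline instantiates it from its `observers` entry and notifies it after each **look** call.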