From 01378ae467b6399a13042f02a67010dfc820aee2 Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Mon, 4 Sep 2023 14:49:27 +0200
Subject: Moving scripting features into a dedicated advanced chapter.

---
 .../ar_layer_configuration_and_execution.md | 160 ---------------------
 1 file changed, 160 deletions(-)
 delete mode 100644 docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md

diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md
deleted file mode 100644
index e96bfff..0000000
--- a/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md
+++ /dev/null
@@ -1,160 +0,0 @@

Add and execute ArLayer
=============================

The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOI, and inside which those matches are analyzed.

![Empty layer area](../../img/ar_layer_empty.png)

## Add ArLayer to ArFrame JSON configuration file

An [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer).

Here is a JSON ArFrame configuration file example where one layer is added:

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    ...
    "layers": {
        "MyLayer": {
            "aoi_color": [0, 0, 255],
            "aoi_scene": {
                "upper_left_area": [[0, 0], [960, 0], [960, 540], [0, 540]],
                "upper_right_area": [[960, 0], [1920, 0], [1920, 540], [960, 540]],
                "lower_left_area": [[0, 540], [960, 540], [960, 1080], [0, 1080]],
                "lower_right_area": [[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
            },
            "aoi_matcher": {
                "DeviationCircleCoverage": {
                    "coverage_threshold": 0.5
                }
            },
            "aoi_scan_path": {
                "duration_max": 30000
            },
            "aoi_scan_path_analyzers": {
                "Basic": {},
                "TransitionMatrix": {},
                "NGram": {
                    "n_min": 3,
                    "n_max": 5
                }
            }
        }
    }
}
```

Then, once the JSON file is loaded:

```python
from argaze import ArFeatures

# Assuming the ArFrame is loaded
...

# Print ArLayer attributes
for name, ar_layer in ar_frame.layers.items():

    print("name:", ar_layer.name)
    print("AOI color:", ar_layer.aoi_color)
    print("AOI scene:", ar_layer.aoi_scene)
    print("AOI matcher type:", type(ar_layer.aoi_matcher))
    print("AOI scan path:", ar_layer.aoi_scan_path)

    for module, analyzer in ar_layer.aoi_scan_path_analyzers.items():
        print('AOI scan path analyzer module:', module)
```

Finally, here is what the program writes in console:

```txt
...

name: MyLayer
AOI color: [0, 0, 255]
AOI scene:
    upper_left_area:
[[0, 0], [960, 0], [960, 540], [0, 540]]
    upper_right_area:
[[960, 0], [1920, 0], [1920, 540], [960, 540]]
    lower_left_area:
[[0, 540], [960, 540], [960, 1080], [0, 1080]]
    lower_right_area:
[[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
AOI matcher type: <class 'argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher'>
AOI scan path: []
AOI scan path analyzer module: argaze.GazeAnalysis.Basic
AOI scan path analyzer module: argaze.GazeAnalysis.TransitionMatrix
AOI scan path analyzer module: argaze.GazeAnalysis.NGram
```

Now, let's understand the meaning of each JSON entry.

### Name

The name of the [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Mostly useful for visualisation purposes.
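As the printing loop above suggests, this name is also the key under which the layer is registered in its parent frame. Here is a minimal sketch of retrieving a layer by name, assuming an *ar_frame* instance has already been loaded from the JSON configuration above:

```python
# Each ArLayer is registered in the parent ArFrame layers dictionary under its name,
# so a layer can be fetched back directly by that name.
my_layer = ar_frame.layers["MyLayer"]

print(my_layer.name)       # MyLayer
print(my_layer.aoi_color)  # [0, 0, 255]
```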
### AOI Color

The color of the [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer)'s AOI. Mostly useful for visualisation purposes.

### AOI Scene

The [AOIScene](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AOIScene) defines a set of 2D [AreaOfInterest](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) registered by name.

![AOI Scene](../../img/ar_layer_aoi_scene.png)

### AOI Matcher

The first [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step aims to match each identified gaze movement with an AOI of the scene.

![AOI Matcher](../../img/ar_layer_aoi_matcher.png)

The matching algorithm can be selected by instantiating a particular [AOIMatcher](../../argaze.md/#argaze.GazeFeatures.AOIMatcher) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another Python package](advanced_topics/module_loading.md).

In the example file, the chosen matching algorithm is [Deviation Circle Coverage](../../argaze.md/#argaze.GazeAnalysis.DeviationCircleCoverage), which has one specific *coverage_threshold* attribute.

!!! warning
    The JSON *aoi_matcher* entry is mandatory. Otherwise, the AOIScanPath and AOIScanPathAnalyzers steps are disabled.

### AOI Scan Path

The second [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step aims to build an [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath), defined as a list of [AOIScanSteps](../../argaze.md/#argaze.GazeFeatures.AOIScanStep), where each step gathers the successive fixations/saccades made onto a same AOI.

![AOI Scan Path](../../img/ar_layer_aoi_scan_path.png)

Once an identified gaze movement is matched to an AOI, it is automatically appended to the AOIScanPath if required.

The [AOIScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the maximum path duration: each time new AOI scan steps are added, older steps are removed so that the path does not exceed this duration.

!!! note
    The JSON *aoi_scan_path* entry is not mandatory. If the *aoi_scan_path_analyzers* entry is not empty, the AOIScanPath step is automatically enabled.

### AOI Scan Path Analyzers

Finally, the last [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step consists in passing the previously built [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) to each loaded [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer).

Each analysis algorithm can be selected by instantiating a particular [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another Python package](advanced_topics/module_loading.md).

## Pipeline execution

Timestamped gaze movements identified by the parent ArFrame are passed one by one to each [ArLayer.look](../../argaze.md/#argaze.ArFeatures.ArLayer.look) method in order to execute each layer's instantiated pipeline.

```python
# Assuming that timestamped gaze positions are available
...

    # Look ArFrame at a timestamped gaze position
    movement, _, layers_analysis, _, _ = ar_frame.look(timestamp, gaze_position)

    # Check if a movement has been identified
    if movement.valid and movement.finished:

        # Do something with each layer AOI scan path analysis
        for layer_name, layer_aoi_scan_path_analysis in layers_analysis.items():
            for module, analysis in layer_aoi_scan_path_analysis.items():
                for data, value in analysis.items():
                    ...
```
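For instance, a single analysis can be picked out of *layers_analysis* by layer name and analyzer module. A possible sketch, assuming the nested dictionaries are keyed by the layer names and module paths printed earlier (an assumption, not a documented guarantee):

```python
# Hypothetical access to one specific analysis result:
# the TransitionMatrix analysis produced for "MyLayer", if any.
my_layer_analysis = layers_analysis.get("MyLayer", {})
transition_matrix_analysis = my_layer_analysis.get("argaze.GazeAnalysis.TransitionMatrix", {})

for data, value in transition_matrix_analysis.items():
    print(data, value)
```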