Add and execute ArLayer
=======================

The [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOIs, and inside which those matches are analyzed.

![Empty layer area](../../img/ar_layer_empty.png)

## Add ArLayer to ArFrame JSON configuration file

An [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../../argaze/#argaze.ArFeatures.ArLayer).

Here is the JSON ArFrame configuration file example where one layer is added:

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    ...
    "layers": {
        "MyLayer": {
            "aoi_color": [0, 0, 255],
            "aoi_scene" : {
                "upper_left_corner": [[0, 0], [960, 0], [960, 540], [0, 540]],
                "upper_right_corner": [[960, 0], [1920, 0], [1920, 540], [960, 540]],
                "lower_left_corner": [[0, 540], [960, 540], [960, 1080], [0, 1080]],
                "lower_right_corner": [[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
            },
            "aoi_matcher": {
                "DeviationCircleCoverage": {
                    "coverage_threshold": 0.5
                }
            },
            "aoi_scan_path": {
                "duration_max": 30000
            },
            "aoi_scan_path_analyzers": {
                "Basic": {},
                "TransitionMatrix": {},
                "NGram": {
                    "n_min": 3,
                    "n_max": 5
                }
            }
        }
    }
}
```

Then, once the JSON file is loaded:

```python
from argaze import ArFeatures

# Assuming the ArFrame is loaded
...

# Print ArLayer attributes
for name, ar_layer in ar_frame.layers.items():

    print("name:", ar_layer.name)
    print("AOI color:", ar_layer.aoi_color)
    print("AOI scene:", ar_layer.aoi_scene)
    print("AOI matcher type:", type(ar_layer.aoi_matcher))
    print("AOI scan path:", ar_layer.aoi_scan_path)

    for module, analyzer in ar_layer.aoi_scan_path_analyzers.items():
        print('AOI scan path analyzer module:', module)
```

Finally, here is what the program writes in console:

```txt
...
name: MyLayer
AOI color: [0, 0, 255]
AOI scene:
	upper_left_corner: [[0, 0], [960, 0], [960, 540], [0, 540]]
	upper_right_corner: [[960, 0], [1920, 0], [1920, 540], [960, 540]]
	lower_left_corner: [[0, 540], [960, 540], [960, 1080], [0, 1080]]
	lower_right_corner: [[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
AOI matcher type:
AOI scan path: []
AOI scan path analyzer module: argaze.GazeAnalysis.Basic
AOI scan path analyzer module: argaze.GazeAnalysis.TransitionMatrix
AOI scan path analyzer module: argaze.GazeAnalysis.NGram
```

Now, let's understand the meaning of each JSON entry.

### Name

The name of the [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer). Mostly useful for visualisation purposes.

### AOI Color

The color of the [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer)'s AOIs. Mostly useful for visualisation purposes.

### AOI Scene

The [AOIScene](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AOIScene) defines a set of 2D [AreaOfInterest](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) registered by name.

![AOI Scene](../../img/ar_layer_aoi_scene.png)

### AOI Matcher

The first [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step aims to match each identified gaze movement with an AOI of the scene.

![AOI Matcher](../../img/ar_layer_aoi_matcher.png)

The matching algorithm can be selected by instantiating a particular [AOIMatcher](../../../argaze/#argaze.GazeFeatures.AOIMatcher) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [from another python package](../advanced_topics/module_loading).

In the example file, the chosen matching algorithm is [Deviation Circle Coverage](../../../argaze/#argaze.GazeAnalysis.DeviationCircleCoverage), which has one specific *coverage_threshold* attribute.

!!! warning
    JSON *aoi_matcher* entry is mandatory. Otherwise, the AOIScanPath and AOIScanPathAnalyzers steps are disabled.
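To build intuition for the *coverage_threshold* attribute, here is a self-contained, illustrative sketch — not ArGaze's actual implementation — that estimates the fraction of a fixation's deviation circle falling inside a rectangular AOI, then applies a threshold to decide whether the fixation matches the AOI:

```python
import math

def circle_rectangle_coverage(cx, cy, radius, rect, samples=10000):
    """Estimate the fraction of a circle's area covered by an
    axis-aligned rectangle, using a regular grid of sample points
    over the circle's bounding square."""
    (x_min, y_min), (x_max, y_max) = rect
    inside = total = 0
    n = int(math.sqrt(samples))
    for i in range(n):
        for j in range(n):
            # Midpoint of each grid cell over the bounding square
            x = cx - radius + (2 * radius) * (i + 0.5) / n
            y = cy - radius + (2 * radius) * (j + 0.5) / n
            # Only count points that actually lie inside the circle
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                total += 1
                if x_min <= x <= x_max and y_min <= y <= y_max:
                    inside += 1
    return inside / total if total else 0.0

# A fixation deviation circle lying inside the upper left quadrant
# of a FullHD frame (values are made up for the example)
coverage = circle_rectangle_coverage(480, 270, 50, ((0, 0), (960, 540)))

coverage_threshold = 0.5
matched = coverage >= coverage_threshold
```

With these made-up values the circle lies entirely inside the AOI, so the coverage is close to 1 and the fixation matches. A circle straddling the AOI border would yield a coverage near 0.5 and sit right at the threshold chosen in the example file.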
### AOI Scan Path

The second [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step aims to build an [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath), defined as a list of [AOIScanSteps](../../../argaze/#argaze.GazeFeatures.AOIScanStep), each made of successive fixations and saccades onto the same AOI.

![AOI Scan Path](../../img/ar_layer_aoi_scan_path.png)

Once identified gaze movements are matched to an AOI, they are automatically appended to the AOIScanPath if required.

The [AOIScanPath.duration_max](../../../argaze/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the duration from which older AOI scan steps are removed each time new AOI scan steps are added.

!!! note
    JSON *aoi_scan_path* entry is not mandatory. If the *aoi_scan_path_analyzers* entry is not empty, the AOIScanPath step is automatically enabled.

### AOI Scan Path Analyzers

Finally, the last [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step consists in passing the previously built [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) to each loaded [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer).

Each analysis algorithm can be selected by instantiating a particular [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [from another python package](../advanced_topics/module_loading).

## Pipeline execution

Timestamped gaze movements identified by the parent ArFrame are passed one by one to each [ArLayer.look](../../../argaze/#argaze.ArFeatures.ArLayer.look) method to execute each layer's instantiated pipeline.

```python
# Assuming that timestamped gaze positions are available
...

# Look ArFrame at a timestamped gaze position
movement, _, layers_analysis, _, _ = ar_frame.look(timestamp, gaze_position)

# Check if a movement has been identified
if movement.valid and movement.finished:

    # Do something with each layer AOI scan path analysis
    for layer_name, layer_aoi_scan_path_analysis in layers_analysis.items():
        for module, analysis in layer_aoi_scan_path_analysis.items():
            for data, value in analysis.items():
                ...
```
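As a closing illustration of what the *TransitionMatrix* and *NGram* analyzers compute, here is a self-contained sketch of both ideas over a plain list of fixated AOI names. This is not ArGaze's code, and the AOI sequence is made up for the example:

```python
from collections import Counter

# A hypothetical sequence of fixated AOI names, in the order an
# AOI scan path would record them
aoi_sequence = ["upper_left_corner", "upper_right_corner",
                "upper_left_corner", "lower_left_corner",
                "upper_left_corner", "upper_right_corner"]

# Transition matrix idea: count moves from one AOI to the next
transitions = Counter(zip(aoi_sequence, aoi_sequence[1:]))

# N-gram idea: count sub-sequences of length n (here n = 3,
# matching the "n_min" value of the example file)
n = 3
ngrams = Counter(tuple(aoi_sequence[i:i + n])
                 for i in range(len(aoi_sequence) - n + 1))

# The upper left to upper right transition occurs twice
print(transitions[("upper_left_corner", "upper_right_corner")])  # 2
```

The real analyzers operate on AOI scan steps rather than raw name lists and expose richer results, but the underlying counting is of this kind.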