From e88fe8f8c2b0bffee3d67b73200381cde2351dab Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Wed, 23 Aug 2023 13:58:16 +0200
Subject: Adding gaze analysis pipeline user guide section. Hiding older user guide sections.

---
 docs/img/ar_environment_axis.png                 | Bin 106367 -> 32166 bytes
 docs/img/ar_frame.png                            | Bin 3477 -> 0 bytes
 docs/img/ar_frame_empty.png                      | Bin 0 -> 3477 bytes
 docs/img/ar_layer_aoi_matcher.png                | Bin 0 -> 22948 bytes
 docs/img/ar_layer_aoi_scan_path.png              | Bin 0 -> 14711 bytes
 docs/img/ar_layer_aoi_scene.png                  | Bin 0 -> 9014 bytes
 docs/img/ar_layer_empty.png                      | Bin 0 -> 3477 bytes
 docs/img/argaze_pipeline.png                     | Bin 148893 -> 214562 bytes
 docs/img/gaze_analysis_pipeline.png              | Bin 93401 -> 100908 bytes
 docs/index.md                                    |   2 +-
 .../ar_frame_configuration_and_execution.md      |  10 +-
 .../ar_layer_configuration_and_execution.md      | 151 +++++++++++++++++++++
 .../gaze_analysis_pipeline/introduction.md       |   2 +-
 docs/user_guide/utils/demonstrations_scripts.md  |  32 ++---
 mkdocs.yml                                       |   1 +
 15 files changed, 175 insertions(+), 23 deletions(-)
 delete mode 100644 docs/img/ar_frame.png
 create mode 100644 docs/img/ar_frame_empty.png
 create mode 100644 docs/img/ar_layer_aoi_matcher.png
 create mode 100644 docs/img/ar_layer_aoi_scan_path.png
 create mode 100644 docs/img/ar_layer_aoi_scene.png
 create mode 100644 docs/img/ar_layer_empty.png
 create mode 100644 docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md

diff --git a/docs/img/ar_environment_axis.png b/docs/img/ar_environment_axis.png
index 01c1791..35acaf2 100644
Binary files a/docs/img/ar_environment_axis.png and b/docs/img/ar_environment_axis.png differ
diff --git a/docs/img/ar_frame.png b/docs/img/ar_frame.png
deleted file mode 100644
index f368635..0000000
Binary files a/docs/img/ar_frame.png and /dev/null differ
diff --git a/docs/img/ar_frame_empty.png b/docs/img/ar_frame_empty.png
new file mode 100644
index 0000000..f368635
Binary files /dev/null and b/docs/img/ar_frame_empty.png differ
diff --git a/docs/img/ar_layer_aoi_matcher.png b/docs/img/ar_layer_aoi_matcher.png
new file mode 100644
index 0000000..63caf4d
Binary files /dev/null and b/docs/img/ar_layer_aoi_matcher.png differ
diff --git a/docs/img/ar_layer_aoi_scan_path.png b/docs/img/ar_layer_aoi_scan_path.png
new file mode 100644
index 0000000..1a4dad3
Binary files /dev/null and b/docs/img/ar_layer_aoi_scan_path.png differ
diff --git a/docs/img/ar_layer_aoi_scene.png b/docs/img/ar_layer_aoi_scene.png
new file mode 100644
index 0000000..96bfc12
Binary files /dev/null and b/docs/img/ar_layer_aoi_scene.png differ
diff --git a/docs/img/ar_layer_empty.png b/docs/img/ar_layer_empty.png
new file mode 100644
index 0000000..f368635
Binary files /dev/null and b/docs/img/ar_layer_empty.png differ
diff --git a/docs/img/argaze_pipeline.png b/docs/img/argaze_pipeline.png
index cad7b5e..cbba619 100644
Binary files a/docs/img/argaze_pipeline.png and b/docs/img/argaze_pipeline.png differ
diff --git a/docs/img/gaze_analysis_pipeline.png b/docs/img/gaze_analysis_pipeline.png
index d6140d3..42b8630 100644
Binary files a/docs/img/gaze_analysis_pipeline.png and b/docs/img/gaze_analysis_pipeline.png differ
diff --git a/docs/index.md b/docs/index.md
index af57d2b..8893cd5 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -12,7 +12,7 @@ title: What is ArGaze?
 
 ## Gaze analysis pipeline
 
-Whether in real time or in post-processing, **ArGaze** provides extensible plugins library allowing to select application specific algorithm at each pipeline step:
+Whether in real time or in post-processing, **ArGaze** provides an extensible plugin library that allows selecting application-specific algorithms at each pipeline step:
 
 * **Fixation/Saccade identification**: dispersion threshold identification, velocity threshold identification, ...
 * **Area Of Interest (AOI) matching**: fixation deviation circle matching, ...
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
index 00300a8..f1264c7 100644
--- a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
+++ b/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
@@ -3,7 +3,7 @@ Configure and execute ArFrame
 
 The [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) class defines a rectangular area where timestamped gaze positions are projected in and inside which they need to be analyzed.
 
-![Timestamped Gaze Positions](../../img/ar_frame.png)
+![Empty frame area](../../img/ar_frame_empty.png)
 
 ## Load JSON configuration file
 
@@ -22,7 +22,7 @@ Here is a simple JSON ArFrame configuration file example:
         }
     },
     "scan_path": {
-        "duration_max": 30000,
+        "duration_max": 30000
     },
     "scan_path_analyzers": {
         "Basic": {},
@@ -60,7 +60,7 @@ print("heatmap:", ar_frame.heatmap)
 
 Finally, here is what the program writes in console:
 
-```
+```txt
 name: My FullHD screen
 size: [1920, 1080]
 gaze movement identifier type:
@@ -88,7 +88,7 @@ The first [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) pipeline step is
 
 The identification method can be selected by instantiating a particular [GazeMovementIdentifier](../../../argaze/#argaze.GazeFeatures.GazeMovementIdentifier) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [another python package](../advanced_topics/plugin_loading).
 
-In the example file, the choosen identification method is the [Dispersion Threshold Identification (I-DT)](../../../argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification) which has two specific deviation_max_threshold and duration_min_threshold attributes.
+In the example file, the chosen identification method is the [Dispersion Threshold Identification (I-DT)](../../../argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification) which has two specific *deviation_max_threshold* and *duration_min_threshold* attributes.
 
 !!! note
     In ArGaze, [Fixation](../../../argaze/#argaze.GazeFeatures.Fixation) and [Saccade](../../../argaze/#argaze.GazeFeatures.Saccade) are considered as particular [GazeMovements](../../../argaze/#argaze.GazeFeatures.GazeMovement).
@@ -128,7 +128,7 @@ The Heatmap object have tree attributes to set its size, the sigma point spreadi
 Timestamped gaze positions have to be passed one by one to the [ArFrame.look](../../../argaze/#argaze.ArFeatures.ArFrame.look) method to execute the whole instantiated pipeline.
 
 ```python
-# Assuming that timestamp gaze positions are available
+# Assuming that timestamped gaze positions are available
 ...
 
 	# Look ArFrame at a timestamped gaze position
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md
new file mode 100644
index 0000000..c9ca097
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md
@@ -0,0 +1,151 @@
+Add and execute ArLayer
+=============================
+
+The [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOIs and inside which those matches need to be analyzed.
+
+![Empty layer area](../../img/ar_layer_empty.png)
+
+## Add ArLayer to ArFrame JSON configuration file
+
+An [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../../argaze/#argaze.ArFeatures.ArLayer).
+
+Here is the JSON ArFrame configuration file example with one layer added:
+
+```json
+{
+    "name": "My FullHD screen",
+    "size": [1920, 1080],
+    ...
+    "layers": {
+        "MyLayer": {
+            "aoi_scene": {
+                "upper_left_corner": [[0, 0], [960, 0], [960, 540], [0, 540]],
+                "upper_right_corner": [[960, 0], [1920, 0], [1920, 540], [960, 540]],
+                "lower_left_corner": [[0, 540], [960, 540], [960, 1080], [0, 1080]],
+                "lower_right_corner": [[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
+            },
+            "aoi_matcher": {
+                "DeviationCircleCoverage": {
+                    "coverage_threshold": 0.5
+                }
+            },
+            "aoi_scan_path": {
+                "duration_max": 30000
+            },
+            "aoi_scan_path_analyzers": {
+                "Basic": {},
+                "TransitionMatrix": {},
+                "NGram": {
+                    "n_min": 3,
+                    "n_max": 5
+                }
+            }
+        }
+    }
+}
+```
+
+Then, once the JSON file is loaded:
+
+```python
+from argaze import ArFeatures
+
+# Assuming the ArFrame is loaded
+...
+
+# Print ArLayer attributes
+for name, ar_layer in ar_frame.layers.items():
+
+    print("name:", ar_layer.name)
+    print("AOI scene:", ar_layer.aoi_scene)
+    print("AOI scan path:", ar_layer.aoi_scan_path)
+
+    for module, analyzer in ar_layer.aoi_scan_path_analyzers.items():
+        print('AOI scan path analyzer module:', module)
+```
+
+Finally, here is what the program writes to the console:
+
+```txt
+...
+
+name: MyLayer
+AOI scene:
+    upper_left_corner:
+[[0, 0], [960, 0], [960, 540], [0, 540]]
+    upper_right_corner:
+[[960, 0], [1920, 0], [1920, 540], [960, 540]]
+    lower_left_corner:
+[[0, 540], [960, 540], [960, 1080], [0, 1080]]
+    lower_right_corner:
+[[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
+AOI scan path: []
+AOI scan path analyzer module: argaze.GazeAnalysis.Basic
+AOI scan path analyzer module: argaze.GazeAnalysis.TransitionMatrix
+AOI scan path analyzer module: argaze.GazeAnalysis.NGram
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### Name
+
+The name of the [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer). It is mostly useful for visualisation purposes.
+
+### AOI Scene
+
+The [AOIScene](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AOIScene) defines a set of 2D [AreaOfInterest](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) registered by name.
+
+![AOI Scene](../../img/ar_layer_aoi_scene.png)
+
+### AOI Matcher
+
+The first [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step aims to match the identified gaze movement with an AOI of the scene.
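+
+To get an intuition of what such a matcher computes, here is a small self-contained sketch. It is independent of ArGaze's actual [DeviationCircleCoverage](../../../argaze/#argaze.GazeAnalysis.DeviationCircleCoverage) implementation and every name and value in it is illustrative: it only estimates how much of a fixation's deviation circle falls inside each rectangular AOI defined above, then applies a threshold similar in spirit to *coverage_threshold*.
+
+```python
+import math
+
+# The rectangular AOIs of the JSON example above, reduced to (x_min, y_min, x_max, y_max)
+AOIS = {
+    "upper_left_corner": (0, 0, 960, 540),
+    "upper_right_corner": (960, 0, 1920, 540),
+    "lower_left_corner": (0, 540, 960, 1080),
+    "lower_right_corner": (960, 540, 1920, 1080)
+}
+
+def circle_coverage(center, radius, box, samples=64):
+    """Approximate the fraction of a circle lying inside an axis-aligned box
+    by sampling a regular grid over the circle bounding square."""
+    x_min, y_min, x_max, y_max = box
+    inside = total = 0
+    for i in range(samples):
+        for j in range(samples):
+            x = center[0] - radius + 2 * radius * (i + 0.5) / samples
+            y = center[1] - radius + 2 * radius * (j + 0.5) / samples
+            if math.hypot(x - center[0], y - center[1]) <= radius:
+                total += 1
+                if x_min <= x <= x_max and y_min <= y <= y_max:
+                    inside += 1
+    return inside / total if total else 0.0
+
+# A hypothetical fixation: centroid and deviation radius in pixels
+fixation_center, fixation_deviation = (900, 510), 50
+
+# Keep AOIs whose coverage exceeds a 0.5 threshold, as in the configuration above
+for aoi_name, box in AOIS.items():
+    coverage = circle_coverage(fixation_center, fixation_deviation, box)
+    if coverage > 0.5:
+        print(f'{aoi_name} matched with {coverage:.0%} of the deviation circle covered')
+```
+
+In ArGaze, this kind of decision is made for you by the *aoi_matcher* configured above; the sketch only illustrates the underlying idea.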
+
+![AOI Matcher](../../img/ar_layer_aoi_matcher.png)
+
+The matching method can be selected by instantiating a particular [AOIMatcher](../../../argaze/#argaze.GazeFeatures.AOIMatcher) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [another python package](../advanced_topics/plugin_loading).
+
+In the example file, the chosen matching method is the [Deviation Circle Coverage](../../../argaze/#argaze.GazeAnalysis.DeviationCircleCoverage) which has one specific *coverage_threshold* attribute.
+
+!!! warning
+    The JSON *aoi_matcher* entry is mandatory. Otherwise, the AOIScanPath and AOIScanPathAnalyzers steps are disabled.
+
+### AOI Scan Path
+
+The second [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step aims to build an [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) defined as a list of [AOIScanSteps](../../../argaze/#argaze.GazeFeatures.AOIScanStep) made of successive fixations/saccades onto the same AOI.
+
+![AOI Scan Path](../../img/ar_layer_aoi_scan_path.png)
+
+Once fixations and saccades are identified and fixations are matched to an AOI, they are automatically appended to the AOIScanPath if required.
+
+The [AOIScanPath.duration_max](../../../argaze/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the duration beyond which older AOI scan steps are removed each time new AOI scan steps are added.
+
+### AOI Scan Path Analyzers
+
+Finally, the last [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step consists of passing the previously built [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) to each loaded [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer).
+
+Each analysis algorithm can be selected by instantiating a particular [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [another python package](../advanced_topics/plugin_loading).
+
+!!! note
+    The JSON *aoi_scan_path* entry is not mandatory. If the *aoi_scan_path_analyzers* entry is not empty, the AOIScanPath step is automatically enabled.
+
+## Pipeline execution
+
+Timestamped gaze movements identified by the parent ArFrame are passed one by one to each [ArLayer.look](../../../argaze/#argaze.ArFeatures.ArLayer.look) method to execute each layer's instantiated pipeline.
+
+```python
+# Assuming that timestamped gaze positions are available
+...
+
+    # Look ArFrame at a timestamped gaze position
+    movement, _, layers_analysis, _, _ = ar_frame.look(timestamp, gaze_position)
+
+    # Check if a movement has been identified
+    if movement.valid and movement.finished:
+
+        # Do something with each layer AOI scan path analysis
+        for layer_name, layer_aoi_scan_path_analysis in layers_analysis.items():
+            for module, analysis in layer_aoi_scan_path_analysis.items():
+                for data, value in analysis.items():
+                    ...
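+
+                    # For example (illustrative only, not part of the original example),
+                    # each analysis result can be printed with its origin:
+                    print(f'{layer_name} {module} {data}: {value}')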
+```
diff --git a/docs/user_guide/gaze_analysis_pipeline/introduction.md b/docs/user_guide/gaze_analysis_pipeline/introduction.md
index 568cba2..ee67c9d 100644
--- a/docs/user_guide/gaze_analysis_pipeline/introduction.md
+++ b/docs/user_guide/gaze_analysis_pipeline/introduction.md
@@ -11,7 +11,7 @@ To build your own gaze analysis pipeline, you need to know:
 
 * [How to edit timestamped gaze positions](../timestamped_gaze_positions_edition),
 * [How to deal with an ArFrame instance](../ar_frame_configuration_and_execution),
-* [How to setup Areas Of Interest](../ar_frame_aoi_configuration),
+* [How to deal with an ArLayer instance](../ar_layer_configuration_and_execution),
 * [How to log resulted gaze analysis](../analysis).
 
 More advanced features are also explained like:
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 6666bb0..2bf04aa 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -9,23 +9,9 @@ Collection of command-line scripts for demonstration purpose.
 !!! note
     *Use -h option to get command arguments documentation.*
 
-## AR environment demonstration
+## ArFrame and ArLayer demonstration
 
-Load AR environment from **setup.json** file, detect ArUco markers into a demo video source and estimate environment pose.
-
-```shell
-python ./src/argaze/utils/demo_ar_features_run.py ./src/argaze/utils/demo_environment/demo_ar_features_setup.json -s ./src/argaze/utils/demo_environment/demo.mov
-```
-
-!!! note
-    To reproduce this demonstration with live video source, camera calibration have to be done and exported into **./src/argaze/utils/demo_environment/optic_parameters.json** file.
-
-!!! note
-    To reproduce this demonstration with live video source, print **A3_demo.pdf** file located in *./src/argaze/utils/demo_environment/* folder on A3 paper sheet.
-
-## Gaze features demonstration
-
-Simulate gaze position using mouse pointer to illustrate gaze features.
+Load an ArFrame with a single ArLayer from the **demo_gaze_features_setup.json** file, then simulate gaze positions with the mouse pointer to illustrate gaze features.
 
 ```shell
 python ./src/argaze/utils/demo_gaze_features_run.py ./src/argaze/utils/demo_environment/demo_gaze_features_setup.json
@@ -39,3 +25,17 @@ A picture is saved by pressing escape key.
 ```shell
 python ./src/argaze/utils/demo_heatmap_run.py
 ```
+
+## ArEnvironment demonstration
+
+Load an ArEnvironment from the **demo_ar_features_setup.json** file, then detect ArUco markers in a demo video source and estimate the environment pose.
+
+```shell
+python ./src/argaze/utils/demo_ar_features_run.py ./src/argaze/utils/demo_environment/demo_ar_features_setup.json -s ./src/argaze/utils/demo_environment/demo.mov
+```
+
+!!! note
+    To reproduce this demonstration with a live video source, camera calibration has to be done and exported into the **./src/argaze/utils/demo_environment/optic_parameters.json** file.
+
+!!! note
+    To reproduce this demonstration with a live video source, print the **A3_demo.pdf** file located in the *./src/argaze/utils/demo_environment/* folder on an A3 paper sheet.
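To make the ArFrame and ArLayer demonstration more concrete, here is a rough sketch, not the actual *demo_gaze_features_run.py* code, of how mouse events can be turned into timestamped gaze positions and fed to an ArFrame. The configuration loading helper and the GazePosition construction are assumptions based on other sections of this user guide; check the ArGaze API reference before reusing them.

```python
import time

import cv2
import numpy
from argaze import ArFeatures, GazeFeatures

# Assumption: the ArFrame is built from the demonstration JSON configuration;
# the exact loading helper may differ in your ArGaze version.
ar_frame = ArFeatures.ArFrame.from_json('./src/argaze/utils/demo_environment/demo_gaze_features_setup.json')

start_time = time.time()

def on_mouse_event(event, x, y, flags, param):
    """Forward each mouse move as a timestamped gaze position."""
    if event == cv2.EVENT_MOUSEMOVE:
        timestamp = int((time.time() - start_time) * 1e3)  # milliseconds
        # Assumption: GazePosition wraps an (x, y) tuple as described in the
        # timestamped gaze positions edition section.
        ar_frame.look(timestamp, GazeFeatures.GazePosition((x, y)))

cv2.namedWindow('ArFrame')
cv2.setMouseCallback('ArFrame', on_mouse_event)

# Minimal event loop displaying an empty FullHD frame area; press Escape to quit.
background = numpy.zeros((1080, 1920, 3), dtype=numpy.uint8)
while cv2.waitKey(10) != 27:
    cv2.imshow('ArFrame', background)

cv2.destroyAllWindows()
```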
diff --git a/mkdocs.yml b/mkdocs.yml
index 06e0d2e..1aaae5a 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -8,6 +8,7 @@ nav:
     - user_guide/gaze_analysis_pipeline/introduction.md
    - user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
    - user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
+    - user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md
    - Advanced Topics:
      - user_guide/gaze_analysis_pipeline/advanced_topics/plugin_loading.md
 #    - ArUco Markers:
--
cgit v1.1
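As a complement to the ArFrame page modified above, which selects a Dispersion Threshold Identification (I-DT) method through its *deviation_max_threshold* and *duration_min_threshold* attributes, here is a rough, library-independent sketch of the dispersion-threshold idea. It uses the classic I-DT dispersion measure, which may differ from how ArGaze defines deviation, and every function, name and value in it is illustrative only, not the ArGaze implementation.

```python
def dispersion(points):
    """Classic I-DT dispersion: (max(x) - min(x)) + (max(y) - min(y))."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def identify_fixations(samples, deviation_max_threshold=50, duration_min_threshold=200):
    """Group (timestamp_ms, (x, y)) samples into fixations, I-DT style.

    A window covering at least duration_min_threshold is accepted as a fixation
    and extended while its dispersion stays below deviation_max_threshold.
    """
    fixations = []
    i = 0
    while i < len(samples):
        # Grow a window covering at least the minimal duration
        j = i
        while j < len(samples) and samples[j][0] - samples[i][0] < duration_min_threshold:
            j += 1
        if j >= len(samples):
            break
        points = [position for _, position in samples[i:j + 1]]
        if dispersion(points) <= deviation_max_threshold:
            # Extend the window while the dispersion stays under the threshold
            while j + 1 < len(samples) and dispersion(points + [samples[j + 1][1]]) <= deviation_max_threshold:
                j += 1
                points.append(samples[j][1])
            centroid = (sum(x for x, _ in points) / len(points), sum(y for _, y in points) / len(points))
            fixations.append((samples[i][0], samples[j][0], centroid))
            i = j + 1
        else:
            # Too dispersed: slide the window start by one sample
            i += 1
    return fixations

# Synthetic 100 Hz samples: ~300 ms at (500, 400), a short saccade, then ~300 ms at (900, 400)
samples = [(t * 10, (500.0, 400.0)) for t in range(30)]
samples += [(300 + t * 10, (600.0 + 100 * t, 400.0)) for t in range(3)]
samples += [(330 + t * 10, (900.0, 400.0)) for t in range(30)]

for start, end, centroid in identify_fixations(samples):
    print(f'fixation from {start} ms to {end} ms around {centroid}')
```

In ArGaze itself, this identification is performed inside ArFrame.look by whichever GazeMovementIdentifier the JSON configuration selects, so user code never has to call such a function directly.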