Diffstat (limited to 'docs')
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/advanced_topics/scripting.md   160
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md   47
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/ar_layer.md (renamed from docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md)   71
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/background.md   31
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/heatmap.md   24
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/introduction.md   6
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/logging.md   2
-rw-r--r--   docs/user_guide/gaze_analysis_pipeline/visualisation.md   196
8 files changed, 251 insertions, 286 deletions
diff --git a/docs/user_guide/gaze_analysis_pipeline/advanced_topics/scripting.md b/docs/user_guide/gaze_analysis_pipeline/advanced_topics/scripting.md
new file mode 100644
index 0000000..2db69fc
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/advanced_topics/scripting.md
@@ -0,0 +1,160 @@
+Script the pipeline
+===================
+
+All gaze analysis pipeline objects are accessible from a Python script.
+This is particularly useful for real-time gaze interaction applications.
+
+## Load ArFrame configuration from a dictionary
+
+First of all, the [ArFrame](../../../argaze.md/#argaze.ArFeatures.ArFrame) configuration can be loaded from a Python dictionary.
+
+```python
+from argaze import ArFeatures
+
+# Edit a dict with ArFrame configuration
+configuration = {
+ "name": "My FullHD screen",
+ "size": (1920, 1080),
+ ...
+ "gaze_movement_identifier": {
+ ...
+ },
+ "scan_path": {
+ ...
+ },
+ "scan_path_analyzers": {
+ ...
+ },
+ "heatmap": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ ...
+ },
+ ...
+ },
+ "image_parameters": {
+ ...
+ }
+}
+
+# Load ArFrame
+ar_frame = ArFeatures.ArFrame.from_dict(configuration)
+
+# Do something with ArFrame
+...
+```
+
+## Access to ArFrame and ArLayers attributes
+
+Then, once the configuration is loaded, it is possible to access its attributes: [read the ArFrame code reference](../../../argaze.md/#argaze.ArFeatures.ArFrame) to get a complete list of what is available.
+
+In particular, the [ArFrame.layers](../../../argaze.md/#argaze.ArFeatures.ArFrame) attribute gives access to each loaded layer and, in turn, to its attributes: [read the ArLayer code reference](../../../argaze.md/#argaze.ArFeatures.ArLayer) to get a complete list of what is available.
+
+```python
+from argaze import ArFeatures
+
+# Assuming the ArFrame is loaded
+...
+
+# Iterate over each ArFrame layer
+for name, ar_layer in ar_frame.layers.items():
+ ...
+```
+
+## Pipeline execution outputs
+
+The [ArFrame.look](../../../argaze.md/#argaze.ArFeatures.ArFrame.look) method returns several outputs describing the pipeline execution.
+
+```python
+from argaze import GazeFeatures
+
+# Assuming that timestamped gaze positions are available
+...
+
+ # Look ArFrame at a timestamped gaze position
+ gaze_movement, scan_path_analysis, layers_analysis, execution_times, exception = ar_frame.look(timestamp, gaze_position)
+
+ # Check if a gaze movement has been identified
+ if gaze_movement.valid and gaze_movement.finished:
+
+ # Do something with identified fixation
+ if GazeFeatures.is_fixation(gaze_movement):
+ ...
+
+ # Do something with identified saccade
+ elif GazeFeatures.is_saccade(gaze_movement):
+ ...
+
+ # Do something with scan path analysis
+ for module, analysis in scan_path_analysis.items():
+ for data, value in analysis.items():
+ ...
+
+ # Do something with each layer AOI scan path analysis
+ for layer_name, layer_aoi_scan_path_analysis in layers_analysis.items():
+ for module, analysis in layer_aoi_scan_path_analysis.items():
+ for data, value in analysis.items():
+ ...
+
+ # Do something with pipeline execution times
+ ...
+
+ # Do something with pipeline exception
+ if exception:
+ ...
+```
+
+Let's understand the meaning of each returned output.
+
+### Gaze movement
+
+A [GazeMovement](../../../argaze.md/#argaze.GazeFeatures.GazeMovement) once it has been identified by the [ArFrame.gaze_movement_identifier](../../../argaze.md/#argaze.ArFeatures.ArFrame) object from incoming consecutive timestamped gaze positions. If no gaze movement has been identified, an [UnvalidGazeMovement](../../../argaze.md/#argaze.GazeFeatures.UnvalidGazeMovement) is returned.
+
+This could also be the gaze movement currently in progress if the [ArFrame.filter_in_progress_fixation](../../../argaze.md/#argaze.ArFeatures.ArFrame) attribute is false.
+In that case, the returned gaze movement *finished* flag is false.
+
+Then, the returned gaze movement type can be tested thanks to the [GazeFeatures.is_fixation](../../../argaze.md/#argaze.GazeFeatures.is_fixation) and [GazeFeatures.is_saccade](../../../argaze.md/#argaze.GazeFeatures.is_saccade) functions.
+
+### Scan path analysis
+
+A dictionary with the last scan path analyses, filled in when a new scan step has been added to the [ArFrame.scan_path](../../../argaze.md/#argaze.ArFeatures.ArFrame) object.
+
+### Layers analysis
+
+A dictionary with each layer's AOI scan path analyses, filled in when a new AOI scan step has been added to an [ArLayer.aoi_scan_path](../../../argaze.md/#argaze.ArFeatures.ArLayer) object.
+
+### Execution times
+
+A dictionary with each pipeline step execution time.
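+
+Since it is a plain dictionary, a minimal sketch to inspect it could look like this (the time unit is not assumed here; check the code reference):
+
+```python
+# Print each pipeline step execution time
+for step, time in execution_times.items():
+    print(f'{step}: {time}')
+```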
+
+### Exception
+
+A [Python Exception](https://docs.python.org/3/tutorial/errors.html#exceptions) object raised during pipeline execution, if any.
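+
+For instance, a script could simply re-raise it to stop on any pipeline error:
+
+```python
+# Assuming an exception has been returned by ArFrame.look
+if exception:
+
+    # Stop the script on any pipeline error
+    raise exception
+```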
+
+## Set up ArFrame image parameters
+
+[ArFrame.image](../../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can be configured thanks to a Python dictionary.
+
+```python
+# Assuming ArFrame is loaded
+...
+
+# Edit a dict with ArFrame image parameters
+image_parameters = {
+ "draw_scan_path": {
+ ...
+ },
+ "draw_layers": {
+ "MyLayer": {
+ ...
+ }
+ },
+ ...
+}
+
+# Pass image parameters to ArFrame
+ar_frame_image = ar_frame.image(**image_parameters)
+
+# Do something with ArFrame image
+...
+```
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
index 37100ab..cdc7fe0 100644
--- a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
+++ b/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
@@ -40,26 +40,6 @@ from argaze import ArFeatures
# Load ArFrame
ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')
-
-# Print ArFrame attributes
-print("name:", ar_frame.name)
-print("size:", ar_frame.size)
-print("gaze movement identifier type:", type(ar_frame.gaze_movement_identifier))
-print("scan path:", ar_frame.scan_path)
-
-for module, analyzer in ar_frame.scan_path_analyzers.items():
- print('scan path analyzer module:', module)
-```
-
-Finally, here is what the program writes in console:
-
-```txt
-name: My FullHD screen
-size: [1920, 1080]
-gaze movement identifier type: <class 'argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier'>
-scan path: []
-scan path analyzer module: argaze.GazeAnalysis.Basic
-scan path analyzer module: argaze.GazeAnalysis.ExploitExploreRatio
```
Now, let's understand the meaning of each JSON entry.
@@ -119,30 +99,5 @@ Timestamped gaze positions have to be passed one by one to [ArFrame.look](../../
...
# Look ArFrame at a timestamped gaze position
- movement, scan_path_analysis, _, execution_times, exception = ar_frame.look(timestamp, gaze_position)
-
- # Check if a movement has been identified
- if movement.valid and movement.finished:
-
- # Do something with identified fixation
- if GazeFeatures.is_fixation(movement):
- ...
-
- # Do something with identified saccade
- elif GazeFeatures.is_saccade(movement):
- ...
-
- # Do something with scan path analysis
- for module, analysis in scan_path_analysis.items():
- for data, value in analysis.items():
- ...
-
- # Do something with pipeline execution times
- ...
-
- # Do something with pipeline exception
- if exception:
- ...
+ ar_frame.look(timestamp, gaze_position)
```
-
-
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/ar_layer.md
index e96bfff..893e50c 100644
--- a/docs/user_guide/gaze_analysis_pipeline/ar_layer_configuration_and_execution.md
+++ b/docs/user_guide/gaze_analysis_pipeline/ar_layer.md
@@ -1,5 +1,5 @@
-Add and execute ArLayer
-=============================
+Add an ArLayer
+==============
The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOIs and inside which those matchings are analyzed.
@@ -9,7 +9,7 @@ The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space
An [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer).
-Here is the JSON ArFrame configuration file example where one layer is added:
+Here is an extract from a JSON ArFrame configuration file where one layer is added:
```json
{
@@ -46,49 +46,9 @@ Here is the JSON ArFrame configuration file example where one layer is added:
}
```
-Then, after the JSON file being loaded:
-
-```python
-from argaze import ArFeatures
-
-# Assuming the ArFrame is loaded
-...
-
-# Print ArLayer attributes
-for name, ar_layer in ar_frame.layers.items():
-
- print("name:", ar_layer.name)
- print("AOI color:", ar_layer.aoi_color)
- print("AOI scene:", ar_layer.aoi_scene)
- print("AOI matcher type:", type(ar_layer.aoi_matcher))
- print("AOI scan path:", ar_layer.aoi_scan_path)
-
- for module, analyzer in ar_layer.aoi_scan_path_analyzers.items():
- print('AOI scan path analyzer module:', module)
-```
+!!! note
-Finally, here is what the program writes in console:
-
-```txt
-...
-
-name: MyLayer
-AOI color: [0, 0, 255]
-AOI scene:
- upper_left_corner:
-[[0, 0], [960, 0], [960, 540], [0, 540]]
- upper_right_corner:
-[[960, 0], [1920, 0], [1920, 540], [960, 540]]
- lower_left_corner:
-[[0, 540], [960, 540], [960, 1080], [0, 1080]]
- lower_right_corner:
-[[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
-AOI matcher type: <class 'argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher'>
-AOI scan path: []
-AOI scan path analyzer module: argaze.GazeAnalysis.Basic
-AOI scan path analyzer module: argaze.GazeAnalysis.TransitionMatrix
-AOI scan path analyzer module: argaze.GazeAnalysis.NGram
-```
+    Timestamped gaze movements identified by the parent [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) are passed one by one to each [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). So, all [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) are executed during the parent [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method call, as explained in the [previous chapter](ar_frame_configuration_and_execution.md).
Now, let's understand the meaning of each JSON entry.
@@ -137,24 +97,3 @@ The [AOIScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.
Finally, the last [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step consists in passing the previously built [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) to each loaded [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer).
Each analysis algorithm can be selected by instantiating a particular [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
-
-## Pipeline execution
-
-Timestamped gaze movements identified by parent ArFrame are passed one by one to each [ArLayer.look](../../argaze.md/#argaze.ArFeatures.ArLayer.look) method to execute each layer intanciated pipeline.
-
-```python
-# Assuming that timestamped gaze positions are available
-...
-
- # Look ArFrame at a timestamped gaze position
- movement, _, layers_analysis, _, _ = ar_frame.look(timestamp, gaze_position)
-
- # Check if a movement has been identified
- if movement.valid and movement.finished:
-
- # Do something with each layer AOI scan path analysis
- for layer_name, layer_aoi_scan_path_analysis in layers_analysis.items():
- for module, analysis in layer_aoi_scan_path_analysis.items():
- for data, value in analysis.items():
- ...
-```
diff --git a/docs/user_guide/gaze_analysis_pipeline/background.md b/docs/user_guide/gaze_analysis_pipeline/background.md
index 420dbdf..a7d59f6 100644
--- a/docs/user_guide/gaze_analysis_pipeline/background.md
+++ b/docs/user_guide/gaze_analysis_pipeline/background.md
@@ -1,7 +1,7 @@
-Add Background
-==============
+Add a background
+================
-Background is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) attribute to display any image.
+Background is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) attribute to display any image behind the pipeline visualisation.
![Background](../../img/ar_frame_background.png)
@@ -9,7 +9,7 @@ Background is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)
[ArFrame.background](../../argaze.md/#argaze.ArFeatures.ArFrame.background) can be enabled thanks to a dedicated JSON entry.
-Here is the JSON ArFrame configuration file example where a background picture is loaded and displayed:
+Here is an extract from the JSON ArFrame configuration file where a background picture is loaded and displayed:
```json
{
@@ -25,6 +25,9 @@ Here is the JSON ArFrame configuration file example where a background picture i
}
```
+!!! note
+    As explained in the [visualisation chapter](visualisation.md), the resulting image is accessible thanks to the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method.
+
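+For instance, here is a minimal sketch where the background is set from a script, reusing the *background_weight* parameter described below:
+
+```python
+import numpy
+
+# Assuming an ArFrame is loaded
+...
+
+# Set ArFrame background as a gray image
+ar_frame.background = numpy.full((ar_frame.size[1], ar_frame.size[0], 3), 127).astype(numpy.uint8)
+
+# Get ArFrame image with the background fully opaque
+ar_frame_image = ar_frame.image(background_weight = 1)
+```
+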
Now, let's understand the meaning of each JSON entry.
### Background
@@ -34,23 +37,3 @@ The path to an image file on disk.
### Background weight
The weight of the background overlay in [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image), between 0 and 1.
-
-## Edit ArFrame background
-
-It is also possible to set background image and display it from script:
-
-```python
-import numpy
-
-# Assuming an ArFrame is loaded
-...
-
-# Set ArFrame background as gray
-ar_frame.background = numpy.full((ar_frame.size[1], ar_frame.size[0], 3), 127).astype(numpy.uint8)
-
-# Get ArFrame image with background and any other options
-ar_frame_image = ar_frame.image(background_weight = 1, ...)
-
-# Do something with ArFrame image
-...
-```
diff --git a/docs/user_guide/gaze_analysis_pipeline/heatmap.md b/docs/user_guide/gaze_analysis_pipeline/heatmap.md
index a741810..fe4246e 100644
--- a/docs/user_guide/gaze_analysis_pipeline/heatmap.md
+++ b/docs/user_guide/gaze_analysis_pipeline/heatmap.md
@@ -1,5 +1,5 @@
-Add Heatmap
-===========
+Add a heatmap
+=============
Heatmap is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step. It is executed at each new gaze position to update the heatmap image.
@@ -9,7 +9,7 @@ Heatmap is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pip
[ArFrame.heatmap](../../argaze.md/#argaze.ArFeatures.ArFrame.heatmap) can be enabled thanks to a dedicated JSON entry.
-Here is the JSON ArFrame configuration file example where heatmap is enabled and displayed:
+Here is an extract from the JSON ArFrame configuration file where heatmap is enabled and displayed:
```json
{
@@ -28,22 +28,8 @@ Here is the JSON ArFrame configuration file example where heatmap is enabled and
}
}
```
-
-Then, here is how to access to heatmap object:
-
-```python
-
-# Assuming an ArFrame is loaded
-...
-
-print("heatmap:", ar_frame.heatmap)
-```
-
-Finally, here is what the program writes in console:
-
-```txt
-heatmap: Heatmap(size=[320, 180], buffer=0, sigma=0.025)
-```
+!!! note
+    [ArFrame.heatmap](../../argaze.md/#argaze.ArFeatures.ArFrame.heatmap) is automatically updated each time the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method is called. As explained in the [visualisation chapter](visualisation.md), the resulting image is accessible thanks to the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method.
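+
+For instance, here is a minimal sketch to check the heatmap settings and grab the resulting image from a script:
+
+```python
+# Assuming an ArFrame with heatmap enabled is loaded
+...
+
+# Print heatmap attributes
+print('heatmap:', ar_frame.heatmap)
+
+# Get ArFrame image with heatmap drawn in
+ar_frame_image = ar_frame.image()
+```
+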
Now, let's understand the meaning of each JSON entry.
diff --git a/docs/user_guide/gaze_analysis_pipeline/introduction.md b/docs/user_guide/gaze_analysis_pipeline/introduction.md
index c3f574b..002ba1f 100644
--- a/docs/user_guide/gaze_analysis_pipeline/introduction.md
+++ b/docs/user_guide/gaze_analysis_pipeline/introduction.md
@@ -11,11 +11,13 @@ To build your own gaze analysis pipeline, you need to know:
* [How to edit timestamped gaze positions](timestamped_gaze_positions_edition.md),
* [How to deal with an ArFrame instance](ar_frame_configuration_and_execution.md),
-* [How to deal with an ArLayer instance](ar_layer_configuration_and_execution.md),
+* [How to add an ArLayer instance](ar_layer.md),
* [How to visualize ArFrame and ArLayers](visualisation.md),
* [How to log resulting gaze analysis](logging.md),
-* [How to add heatmap](heatmap.md).
+* [How to make a heatmap image](heatmap.md),
+* [How to add a background image](background.md).
More advanced features are also explained like:
+* [How to script the gaze analysis pipeline](advanced_topics/scripting.md)
* [How to load module from another package](advanced_topics/module_loading.md)
diff --git a/docs/user_guide/gaze_analysis_pipeline/logging.md b/docs/user_guide/gaze_analysis_pipeline/logging.md
index 422b43b..1dea712 100644
--- a/docs/user_guide/gaze_analysis_pipeline/logging.md
+++ b/docs/user_guide/gaze_analysis_pipeline/logging.md
@@ -7,7 +7,7 @@ Log gaze analysis
[ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) have a log attribute to enable analysis logging.
-Here is the JSON ArFrame configuration file example where logging is enabled for the ArFrame and for one ArLayer:
+Here is an extract from the JSON ArFrame configuration file where logging is enabled for the ArFrame and for one ArLayer:
```json
{
diff --git a/docs/user_guide/gaze_analysis_pipeline/visualisation.md b/docs/user_guide/gaze_analysis_pipeline/visualisation.md
index ad59d54..852cdc5 100644
--- a/docs/user_guide/gaze_analysis_pipeline/visualisation.md
+++ b/docs/user_guide/gaze_analysis_pipeline/visualisation.md
@@ -1,10 +1,75 @@
-Visualize ArFrame and ArLayers
-==============================
+Visualize ArFrame
+=================
-All [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and [ArLayers](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline steps result can be drawn in real time or afterward.
+Visualisation is not a pipeline step, but the output of each [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step can be drawn in real time or afterward, depending on the application purpose.
![ArFrame visualisation](../../img/ar_frame_visualisation.png)
+## Add image parameters to ArFrame JSON configuration file
+
+[ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can be configured thanks to a dedicated JSON entry.
+
+Here is an extract from a JSON ArFrame configuration file where image parameters are added:
+
+```json
+{
+ "name": "My FullHD screen",
+ "size": [1920, 1080],
+ ...
+ "image_parameters": {
+ "draw_scan_path": {
+ "draw_fixations": {
+ "deviation_circle_color": [255, 0, 255],
+ "duration_border_color": [127, 0, 127],
+ "duration_factor": 1e-2
+ },
+ "draw_saccades": {
+ "line_color": [255, 0, 255]
+ },
+ "deepness": 0
+ },
+ "draw_layers": {
+ "MyLayer": {
+ "draw_aoi_scene": {
+ "draw_aoi": {
+ "color": [255, 255, 255],
+ "border_size": 1
+ }
+ },
+ "draw_aoi_matching": {
+ "draw_matched_fixation": {
+ "deviation_circle_color": [255, 255, 255]
+ },
+ "draw_matched_fixation_positions": {
+ "position_color": [0, 255, 255],
+ "line_color": [0, 0, 0]
+ },
+ "draw_matched_region": {
+ "color": [0, 255, 0],
+ "border_size": 4
+ },
+ "draw_looked_aoi": {
+ "color": [0, 255, 0],
+ "border_size": 2
+ },
+ "looked_aoi_name_color": [255, 255, 255],
+ "looked_aoi_name_offset": [0, -10]
+ }
+ }
+ },
+ "draw_gaze_position": {
+ "color": [0, 255, 255]
+ }
+ }
+}
+```
+
+!!! warning
+    Most *image_parameters* entries only work if the related ArFrame/ArLayer pipeline steps are enabled.
+    For example, the JSON *draw_scan_path* entry needs the GazeMovementIdentifier and ScanPath steps to be enabled.
+
+Then, the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method can be called in various situations.
+
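+For instance, here is a minimal sketch of a call relying on the parameters set in the JSON configuration:
+
+```python
+# Assuming an ArFrame is loaded from a JSON configuration with image parameters
+...
+
+# Get ArFrame image drawn with the configured parameters
+ar_frame_image = ar_frame.image()
+
+# Do something with ArFrame image
+...
+```
+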
## Export to PNG file
Once timestamped gaze positions have been processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, it is possible to write the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image into a file thanks to the [OpenCV package](https://pypi.org/project/opencv-python/).
@@ -70,128 +135,3 @@ if __name__ == '__main__':
main()
```
-
-## Edit ArFrame image parameters
-
-[ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can be configured thanks to dictionary.
-
-```python
-# Assuming ArFrame is loaded
-...
-
-# Edit ArFrame image parameters
-image_parameters = {
- "draw_scan_path": {
- "draw_fixations": {
- "deviation_circle_color": [255, 0, 255],
- "duration_border_color": [127, 0, 127],
- "duration_factor": 1e-2
- },
- "draw_saccades": {
- "line_color": [255, 0, 255]
- },
- "deepness": 0
- },
- "draw_layers": {
- "MyLayer": {
- "draw_aoi_scene": {
- "draw_aoi": {
- "color": [255, 255, 255],
- "border_size": 1
- }
- },
- "draw_aoi_matching": {
- "draw_matched_fixation": {
- "deviation_circle_color": [255, 255, 255]
- },
- "draw_matched_fixation_positions": {
- "position_color": [0, 255, 255],
- "line_color": [0, 0, 0]
- },
- "draw_matched_region": {
- "color": [0, 255, 0],
- "border_size": 4
- },
- "draw_looked_aoi": {
- "color": [0, 255, 0],
- "border_size": 2
- },
- "looked_aoi_name_color": [255, 255, 255],
- "looked_aoi_name_offset": [0, -10]
- }
- }
- },
- "draw_gaze_position": {
- "color": [0, 255, 255]
- }
-}
-
-# Pass image parameters to ArFrame
-ar_frame_image = ar_frame.image(**image_parameters)
-
-# Do something with ArFrame image
-...
-```
-
-## Configure ArFrame image parameters
-
-[ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can also be configured thanks to a dedicated JSON entry.
-
-Here is the JSON ArFrame configuration file example with image parameters included:
-
-```json
-{
- "name": "My FullHD screen",
- "size": [1920, 1080],
- ...
- "image_parameters": {
- "draw_scan_path": {
- "draw_fixations": {
- "deviation_circle_color": [255, 0, 255],
- "duration_border_color": [127, 0, 127],
- "duration_factor": 1e-2
- },
- "draw_saccades": {
- "line_color": [255, 0, 255]
- },
- "deepness": 0
- },
- "draw_layers": {
- "MyLayer": {
- "draw_aoi_scene": {
- "draw_aoi": {
- "color": [255, 255, 255],
- "border_size": 1
- }
- },
- "draw_aoi_matching": {
- "draw_matched_fixation": {
- "deviation_circle_color": [255, 255, 255]
- },
- "draw_matched_fixation_positions": {
- "position_color": [0, 255, 255],
- "line_color": [0, 0, 0]
- },
- "draw_matched_region": {
- "color": [0, 255, 0],
- "border_size": 4
- },
- "draw_looked_aoi": {
- "color": [0, 255, 0],
- "border_size": 2
- },
- "looked_aoi_name_color": [255, 255, 255],
- "looked_aoi_name_offset": [0, -10]
- }
- }
- },
- "draw_gaze_position": {
- "color": [0, 255, 255]
- }
- }
-}
-```
-
-!!! warning
- Most of *image_parameters* entries work if related ArFrame/ArLayer pipeline steps are enabled.
-    For example, JSON *draw_scan_path* entry needs GazeMovementIdentifier and ScanPath steps to be enabled.
\ No newline at end of file