Diffstat (limited to 'docs/user_guide')
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aoi_description.md                            |   4
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md   |  36
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_scene.md                                | 113
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md (renamed from docs/user_guide/gaze_analysis_pipeline/ar_layer.md) |  20
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md (renamed from docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md) |  25
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/introduction.md                               |   4
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/pipeline_modules/gaze_movement_identifiers.md |   2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/pipeline_modules/scan_path_analyzers.md       |   2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md         |   4
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/visualisation.md                              |   4
10 files changed, 173 insertions, 41 deletions
diff --git a/docs/user_guide/aruco_markers_pipeline/aoi_description.md b/docs/user_guide/aruco_markers_pipeline/aoi_description.md
index 8c57cd1..80ad858 100644
--- a/docs/user_guide/aruco_markers_pipeline/aoi_description.md
+++ b/docs/user_guide/aruco_markers_pipeline/aoi_description.md
@@ -55,8 +55,8 @@ JSON file format allows to describe AOIs vertices.
``` json
{
- "YellowSquare": [[6.2, -7.275252, 25.246159], [31.2, -7.275252, 25.246159], [6.2, 1.275252, 1.753843], [31.2, 1.275252, 1.753843]],
- "GrayRectangle": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [2.5, 27.5, -0.5], [37.5, 27.5, -0.5]],
+ "YellowSquare": [[6.2, -7.275252, 25.246159], [31.2, -7.275252, 25.246159], [31.2, 1.275252, 1.753843], [6.2, 1.275252, 1.753843]],
+ "GrayRectangle": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [37.5, 27.5, -0.5], [2.5, 27.5, -0.5]],
"BlueTriangle": [[12.5, 7.5, -0.5], [27.5, 7.5, -0.5], [20, 22.5, -0.5]]
}
```
diff --git a/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md b/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
index 824e466..ba19e45 100644
--- a/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_camera_configuration_and_execution.md
@@ -3,9 +3,9 @@ Configure and execute ArUcoCamera
Once [ArUco markers are placed into a scene](aruco_markers_description.md) and [areas of interest are described](aoi_description.md), everything is ready to set up an ArUco marker pipeline thanks to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class.
-As it inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class benefits from all the services described in [gaze analysis pipeline section](./user_guide/gaze_analysis_pipeline/introduction.md).
+As [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), it benefits from all the services described in the [gaze analysis pipeline section](./user_guide/gaze_analysis_pipeline/introduction.md).
-![ArUco camera markers detection](../../img/aruco_camera_markers_detection.png)
+![ArUco camera frame](../../img/aruco_camera_frame.png)
## Load JSON configuration file
@@ -68,19 +68,21 @@ The usual [ArFrame visualisation parameters](./user_guide/gaze_analysis_pipeline
## Pipeline execution
-Pass each camera image to [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method to execute the whole intanciated pipeline.
+### Detect ArUco markers, estimate scene pose and project scene
+
+Pass each camera image to the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method to execute the whole pipeline dedicated to ArUco marker detection, scene pose estimation and scene projection.
```python
# Assuming that Full HD (1920x1080) video stream or file is opened
...
-# Assuming there is a way to escape the while loop
-while ...:
+# Assuming that the video reading is handled in a looping code block
+...:
	# Capture image from video stream or file
image = video_capture.read()
- # Detect ArUco markers and more...
+	# Detect ArUco markers, estimate scene pose, then project scene into camera frame
aruco_camera.watch(image)
# Display ArUcoCamera frame image to check that ArUco markers are well detected and scene is well projected
@@ -90,4 +92,24 @@ while ...:
!!! warning
	The ArUco marker pose estimation algorithm can lead to errors due to geometric ambiguities, as explained in [this article](https://ieeexplore.ieee.org/document/1717461). To discard such ambiguous cases, markers should **not be parallel to the camera plane**.
-At this point, the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) only detects ArUco markers as no scene description is provided.
\ No newline at end of file
+
+
+### Analyse timestamped gaze positions into camera frame
+
+As mentioned above, [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and so benefits from all the services described in the [gaze analysis pipeline section](./user_guide/gaze_analysis_pipeline/introduction.md).
+
+In particular, timestamped gaze positions can be passed one by one to the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole pipeline dedicated to gaze analysis.
+
+```python
+# Assuming that timestamped gaze positions are available
+...
+
+ # Look ArUcoCamera frame at a timestamped gaze position
+ aruco_camera.look(timestamp, gaze_position)
+```
+
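+In a typical application, both methods run side by side: [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) handles each new camera image while [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) handles each new gaze position. Here is a minimal sketch combining them, where the two handler names are illustrative assumptions:
+
+```python
+# Assuming that aruco_camera is instantiated as described above
+
+# Hypothetical handler called for each new camera image
+def on_camera_image(image):
+
+    aruco_camera.watch(image)
+
+# Hypothetical handler called for each new timestamped gaze position
+def on_gaze_position(timestamp, gaze_position):
+
+    aruco_camera.look(timestamp, gaze_position)
+```
+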
+!!! warning ""
+
+	At this point, the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method only detects ArUco markers and the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArCamera.look) method is not able to analyze gaze positions, as no scene description is provided in the JSON configuration file.
+
+	Read the next chapters to learn [how to enable scene pose estimation and its projection](aruco_scene.md).
\ No newline at end of file
diff --git a/docs/user_guide/aruco_markers_pipeline/aruco_scene.md b/docs/user_guide/aruco_markers_pipeline/aruco_scene.md
index b47fefb..91d2702 100644
--- a/docs/user_guide/aruco_markers_pipeline/aruco_scene.md
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_scene.md
@@ -58,9 +58,7 @@ The 3D places of ArUco markers into the scene as defined at [ArUco markers descr
## Add ArLayer to ArUcoScene to load AOI
-The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class allows to load areas of interest description.
-
-An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) instance can contains multiples [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer).
+The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class allows loading areas of interest descriptions. An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer).
Here is the previous extract where one layer is added to the [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene):
@@ -77,8 +75,8 @@ Here is the previous extract where one layer is added to the [ArUcoScene](../../
"layers": {
"MyLayer": {
"aoi_scene": {
- "YellowSquare": [[6.2, -7.275252, 25.246159], [31.2, -7.275252, 25.246159], [6.2, 1.275252, 1.753843], [31.2, 1.275252, 1.753843]],
- "GrayRectangle": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [2.5, 27.5, -0.5], [37.5, 27.5, -0.5]],
+ "YellowSquare": [[6.2, -7.275252, 25.246159], [31.2, -7.275252, 25.246159], [31.2, 1.275252, 1.753843], [6.2, 1.275252, 1.753843]],
+ "GrayRectangle": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [37.5, 27.5, -0.5], [2.5, 27.5, -0.5]],
"BlueTriangle": [[12.5, 7.5, -0.5], [27.5, 7.5, -0.5], [20, 22.5, -0.5]]
}
}
@@ -123,7 +121,17 @@ Here is the previous extract where one layer is added to the [ArUcoCamera](../..
}
},
"layers": {
- "MyLayer": {}
+ "MyLayer": {
+ "aoi_matcher": {
+ ...
+ },
+ "aoi_scan_path": {
+ "duration_max": 30000
+ },
+ "aoi_scan_path_analyzers": {
+ ...
+ }
+ }
}
...
}
@@ -142,3 +150,96 @@ The name of the [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically
!!! note
[ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layers are projected into their dedicated [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layers when calling the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method.
+
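+Assuming that [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) exposes its layers through a *layers* dictionary and that each layer holds its projected AOI in an *aoi_scene* attribute (both names follow the JSON configuration entries and are assumptions here), a minimal sketch to check the projection result could look like this:
+
+```python
+# Detect ArUco markers, estimate scene pose, then project scene layers
+aruco_camera.watch(image)
+
+# Accessing a camera layer by name is an assumption here: the "MyLayer"
+# scene layer is expected to have been projected into the ArUcoCamera
+# layer with the same name
+my_layer = aruco_camera.layers["MyLayer"]
+
+# Print the projected AOI (attribute name follows the JSON entry)
+print(my_layer.aoi_scene)
+```
+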
+## Add GazeMovementIdentifier to ArUcoCamera to enable gaze movement identification
+
+Here is the previous extract where a gaze movement identifier is added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera):
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ ...
+ }
+ }
+ }
+ }
+ },
+ "layers": {
+ "MyLayer": {}
+ },
+ "gaze_movement_identifier": {
+ "DispersionThresholdIdentification": {
+ "deviation_max_threshold": 50,
+ "duration_min_threshold": 200
+ }
+ }
+ ...
+}
+```
+
+!!! note
+
+	Timestamped gaze positions processing and [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layers execution are done when calling the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArCamera.look) method.
+
+## Add AOIMatcher, AOIScanPath and AOIScanPathAnalyzers to ArUcoCamera layer to enable gaze analysis
+
+Here is the previous extract where an AOI matcher, an AOI scan path and AOI scan path analyzers are added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer:
+
+```json
+{
+ "name": "My FullHD camera",
+ "size": [1920, 1080],
+ ...
+ "scenes": {
+ "MyScene" : {
+ "aruco_markers_group": {
+ ...
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_scene": {
+ ...
+ }
+ }
+ }
+ }
+ },
+ "layers": {
+ "MyLayer": {
+ "aoi_matcher": {
+ "DeviationCircleCoverage": {
+ "coverage_threshold": 0.5
+ }
+ },
+ "aoi_scan_path": {
+ "duration_max": 30000
+ },
+ "aoi_scan_path_analyzers": {
+ "Basic": {},
+ "TransitionMatrix": {},
+ "NGram": {
+ "n_min": 3,
+ "n_max": 5
+ }
+ }
+ }
+ },
+ "gaze_movement_identifier": {
+ ...
+ }
+ ...
+}
+```
+
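+Once the scene, its layers and the analysis steps are configured, the resulting JSON file can be loaded and executed as in the previous chapters. Here is a minimal sketch, where the *from_json* class method and the file path are assumptions:
+
+```python
+from argaze.ArUcoMarkers import ArUcoCamera
+
+# Load the JSON configuration file (file path is illustrative)
+aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')
+
+# Assuming that image, timestamp and gaze_position are available:
+
+# Detect ArUco markers, estimate scene pose, then project scene and layers
+aruco_camera.watch(image)
+
+# Analyze timestamped gaze position
+aruco_camera.look(timestamp, gaze_position)
+```
+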
+!!! warning
+
+	Adding an AOI scan path and AOI scan path analyzers to an [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer doesn't make sense if the camera is moving.
\ No newline at end of file
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_layer.md b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
index f0291c3..ffc72c7 100644
--- a/docs/user_guide/gaze_analysis_pipeline/ar_layer.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
@@ -1,9 +1,9 @@
-Add an ArLayer
-==============
+Add AOI analysis
+================
-The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space where to make matching of gaze movements and AOIs and inside which those matchings need to be analyzed.
+The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOIs and inside which those matches need to be analyzed.
-![Empty layer area](../../img/ar_layer_empty.png)
+![Layer](../../img/ar_layer.png)
## Add ArLayer to ArFrame JSON configuration file
@@ -47,7 +47,7 @@ Here is an extract from the JSON [ArFrame](../../argaze.md/#argaze.ArFeatures.Ar
!!! note
- Timestamped gaze movements identified by parent [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) are passed one by one to each [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). So, the execution of all [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) is done during parent [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method call as explained in [previous chapter](ar_frame_configuration_and_execution.md).
+	Timestamped gaze movements identified by the parent [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) are passed one by one to each [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). So, the execution of all [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) is done during the parent [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method call, as explained in the [previous chapter](configuration_and_execution.md).
Now, let's understand the meaning of each JSON entry.
@@ -67,11 +67,11 @@ The first [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step ai
![AOI Matcher](../../img/ar_layer_aoi_matcher.png)
-The matching algorithm can be selected by instantiating a particular [AOIMatcher](../../argaze.md/#argaze.GazeFeatures.AOIMatcher) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
+The matching algorithm can be selected by instantiating a particular AOIMatcher [from the GazeAnalysis submodule](pipeline_modules/aoi_matchers.md) or [from another Python package](advanced_topics/module_loading.md).
In the example file, the chosen matching algorithm is the [Deviation Circle Coverage](../../argaze.md/#argaze.GazeAnalysis.DeviationCircleCoverage), which has one specific *coverage_threshold* attribute.
-!!! warning
+!!! warning "Mandatory"
JSON *aoi_matcher* entry is mandatory. Otherwise, the AOIScanPath and AOIScanPathAnalyzers steps are disabled.
### AOI Scan Path
@@ -84,11 +84,13 @@ Once identified gaze movements are matched to AOI, they are automatically append
The [AOIScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the duration beyond which older AOI scan steps are removed each time new AOI scan steps are added.
-!!! note
+!!! note "Optional"
JSON *aoi_scan_path* entry is not mandatory. If aoi_scan_path_analyzers entry is not empty, the AOIScanPath step is automatically enabled.
### AOI Scan Path Analyzers
Finally, the last [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step consists in passing the previously built [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) to each loaded [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer).
-Each analysis algorithm can be selected by instantiating a particular [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
+Each analysis algorithm can be selected by instantiating a particular AOIScanPathAnalyzer [from the GazeAnalysis submodule](pipeline_modules/aoi_scan_path_analyzers.md) or [from another Python package](advanced_topics/module_loading.md).
+
+In the example file, the chosen analysis algorithms are the [Basic](../../argaze.md/#argaze.GazeAnalysis.Basic) module, the [TransitionMatrix](../../argaze.md/#argaze.GazeAnalysis.TransitionMatrix) module and the [NGram](../../argaze.md/#argaze.GazeAnalysis.NGram) module, which has two specific *n_min* and *n_max* attributes.
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
index d649a4b..f4aa2b7 100644
--- a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
+++ b/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
@@ -1,9 +1,9 @@
-Configure and execute ArFrame
-=============================
+Load and execute pipeline
+=========================
The [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) class defines a rectangular area into which timestamped gaze positions are projected and inside which they need to be analyzed.
-![Empty frame area](../../img/ar_frame_empty.png)
+![Frame](../../img/ar_frame.png)
## Load JSON configuration file
@@ -52,8 +52,8 @@ The name of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame). Basically
The size of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) defines the dimension of the rectangular area where gaze positions are projected. Be aware that gaze positions have to be in the same range of values to be projected into it.
-!!! warning
- **ArGaze doesn't impose any spatial unit.** Gaze positions can either be integer or float, pixels, millimeters or what ever you need. The only concern is that all spatial values used in further configurations have to be all the same unit.
+!!! warning "Free spatial unit"
+	Gaze positions can either be integer or float, in pixels, millimeters or whatever you need. The only concern is that all spatial values used in further configurations must use the same unit.
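+
+For instance, if a gaze tracker delivers normalized coordinates, they can be scaled to the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) size before entering the pipeline. A minimal sketch with a hypothetical helper, assuming normalized (0..1) input:
+
+```python
+# Must match the ArFrame "size" JSON entry (1920x1080 here)
+frame_size = (1920, 1080)
+
+def to_frame_coordinates(norm_x: float, norm_y: float) -> tuple:
+
+    # Scale normalized gaze coordinates to frame coordinates
+    return (norm_x * frame_size[0], norm_y * frame_size[1])
+```
+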
### Gaze Movement Identifier
@@ -61,14 +61,14 @@ The first [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step is
![Gaze Movement Identifier](../../img/ar_frame_gaze_movement_identifier.png)
-The identification algorithm can be selected by instantiating a particular [GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
+The identification algorithm can be selected by instantiating a particular GazeMovementIdentifier [from the GazeAnalysis submodule](pipeline_modules/gaze_movement_identifiers.md) or [from another Python package](advanced_topics/module_loading.md).
In the example file, the chosen identification algorithm is the [Dispersion Threshold Identification (I-DT)](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification), which has two specific *deviation_max_threshold* and *duration_min_threshold* attributes.
!!! note
	In ArGaze, [Fixation](../../argaze.md/#argaze.GazeFeatures.Fixation) and [Saccade](../../argaze.md/#argaze.GazeFeatures.Saccade) are considered particular [GazeMovements](../../argaze.md/#argaze.GazeFeatures.GazeMovement).
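+
+Since both movement types share the same base class, a plain `isinstance` check distinguishes them; here is a minimal sketch, assuming a `gaze_movement` result is at hand:
+
+```python
+from argaze import GazeFeatures
+
+# Fixation and Saccade both inherit from GazeMovement,
+# so a plain isinstance check distinguishes them
+if isinstance(gaze_movement, GazeFeatures.Fixation):
+
+    print('fixation')
+
+elif isinstance(gaze_movement, GazeFeatures.Saccade):
+
+    print('saccade')
+```
+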
-!!! warning
+!!! warning "Mandatory"
JSON *gaze_movement_identifier* entry is mandatory. Otherwise, the ScanPath and ScanPathAnalyzers steps are disabled.
### Scan Path
@@ -81,14 +81,16 @@ Once fixations and saccades are identified, they are automatically appended to t
The [ScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.ScanPath.duration_max) attribute is the duration beyond which older scan steps are removed each time new scan steps are added.
-!!! note
+!!! note "Optional"
JSON *scan_path* entry is not mandatory. If scan_path_analyzers entry is not empty, the ScanPath step is automatically enabled.
### Scan Path Analyzers
Finally, the last [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step consists in passing the previously built [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) to each loaded [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer).
-Each analysis algorithm can be selected by instantiating a particular [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).
+Each analysis algorithm can be selected by instantiating a particular ScanPathAnalyzer [from the GazeAnalysis submodule](pipeline_modules/scan_path_analyzers.md) or [from another Python package](advanced_topics/module_loading.md).
+
+In the example file, the chosen analysis algorithms are the [Basic](../../argaze.md/#argaze.GazeAnalysis.Basic) module and the [ExploitExploreRatio](../../argaze.md/#argaze.GazeAnalysis.ExploitExploreRatio) module, which has one specific *short_fixation_duration_threshold* attribute.
## Pipeline execution
@@ -101,3 +103,8 @@ Timestamped gaze positions have to be passed one by one to [ArFrame.look](../../
# Look ArFrame at a timestamped gaze position
ar_frame.look(timestamp, gaze_position)
```
+!!! warning ""
+
+	At this point, the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method only processes gaze movement identification and scan path analysis, without any AOI analysis, logging or visualisation support.
+
+	Read the next chapters to learn how to [add AOI analysis](aoi_analysis.md), [log gaze analysis](logging.md) and [visualize pipeline steps](visualisation.md).
\ No newline at end of file
diff --git a/docs/user_guide/gaze_analysis_pipeline/introduction.md b/docs/user_guide/gaze_analysis_pipeline/introduction.md
index 002ba1f..02aa82e 100644
--- a/docs/user_guide/gaze_analysis_pipeline/introduction.md
+++ b/docs/user_guide/gaze_analysis_pipeline/introduction.md
@@ -10,8 +10,8 @@ First, let's look at the schema below: it gives an overview of the main notions
To build your own gaze analysis pipeline, you need to know:
* [How to edit timestamped gaze positions](timestamped_gaze_positions_edition.md),
-* [How to deal with an ArFrame instance](ar_frame_configuration_and_execution.md),
-* [How to add ArLayer instance](ar_layer.md),
+* [How to load and execute gaze analysis pipeline](configuration_and_execution.md),
+* [How to add AOI analysis](aoi_analysis.md),
* [How to visualize ArFrame and ArLayers](visualisation.md),
* [How to log resulted gaze analysis](logging.md),
* [How to make heatmap image](heatmap.md).
diff --git a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/gaze_movement_identifiers.md b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/gaze_movement_identifiers.md
index 6ae66bf..751cc7b 100644
--- a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/gaze_movement_identifiers.md
+++ b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/gaze_movement_identifiers.md
@@ -3,7 +3,7 @@ Gaze movement identifiers
ArGaze provides ready-to-use gaze movement identification algorithms.
-Here are JSON samples to include a chosen module inside [ArFrame configuration](../ar_frame_configuration_and_execution.md) *gaze_movement_identifier* entry.
+Here are JSON samples to include a chosen module inside [ArFrame configuration](../configuration_and_execution.md) *gaze_movement_identifier* entry.
## Dispersion threshold identification (I-DT)
diff --git a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/scan_path_analyzers.md b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/scan_path_analyzers.md
index 29ee4f2..afba844 100644
--- a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/scan_path_analyzers.md
+++ b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/scan_path_analyzers.md
@@ -3,7 +3,7 @@ Scan path analyzers
ArGaze provides ready-to-use scan path analysis algorithms.
-Here are JSON samples to include a chosen module inside [ArFrame configuration](../ar_frame_configuration_and_execution.md) *scan_path_analyzers* entry.
+Here are JSON samples to include a chosen module inside [ArFrame configuration](../configuration_and_execution.md) *scan_path_analyzers* entry.
## Basic metrics
diff --git a/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md b/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
index 493471e..93d2a65 100644
--- a/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
+++ b/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
@@ -68,7 +68,7 @@ start_time = time.time()
...
```
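+
+For instance, here is a minimal sketch gathering both steps, assuming a hypothetical *on_gaze* device callback and the [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class presented above:
+
+```python
+import time
+
+from argaze import GazeFeatures
+
+start_time = time.time()
+
+# Hypothetical device callback delivering raw gaze coordinates
+def on_gaze(x, y):
+
+    # Edit millisecond timestamp relative to start time (one possible unit choice)
+    timestamp = int((time.time() - start_time) * 1e3)
+
+    # Edit timestamped gaze position
+    gaze_position = GazeFeatures.GazePosition((x, y))
+
+    ...
+```
+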
-!!! warning
- **ArGaze doesn't impose any time unit.** Timestamps can either be integer or float, second, millisecond or what ever you need. The only concern is that all time values used in further configurations have to be all the same unit.
+!!! warning "Free time unit"
+	Timestamps can either be integer or float, in seconds, milliseconds or whatever you need. The only concern is that all time values used in further configurations must use the same unit.
Now that we have timestamped gaze positions in the expected format, let's see how to analyze them.
\ No newline at end of file
diff --git a/docs/user_guide/gaze_analysis_pipeline/visualisation.md b/docs/user_guide/gaze_analysis_pipeline/visualisation.md
index 852cdc5..55254dd 100644
--- a/docs/user_guide/gaze_analysis_pipeline/visualisation.md
+++ b/docs/user_guide/gaze_analysis_pipeline/visualisation.md
@@ -1,5 +1,5 @@
-Visualize ArFrame
-=================
+Visualize pipeline steps
+========================
Visualisation is not a pipeline step, but the outputs of each [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step can be drawn in real time or afterward, depending on the application purpose.