Diffstat (limited to 'docs/user_guide')
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md                  | 33
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aoi_3d_frame.md                        | 30
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md                   |  5
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md           | 30
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/introduction.md                        |  6
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/pose_estimation.md                     |  8
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md                  | 26
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md                        | 26
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/background.md                          |  4
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md         |  4
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/heatmap.md                             |  2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md  |  2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/visualisation.md                       |  2
13 files changed, 100 insertions, 78 deletions
diff --git a/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md b/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md
index 5a1a16e..502f905 100644
--- a/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md
+++ b/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md
@@ -3,7 +3,7 @@ Describe 3D AOI
Once [ArUco markers are placed into a scene](aruco_markers_description.md), [areas of interest (AOI)](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) need to be described in the same 3D referential.
-In the example scene, each screen is considered as an area of interest more the blue triangle area inside the top screen.
+In the example scene, the screen and the sheet are considered areas of interest.
![3D AOI description](../../img/aoi_3d_description.png)
@@ -21,26 +21,20 @@ All AOI need to be described from same origin than markers in a [right-handed 3D
The OBJ file format can be exported from most 3D editors.
``` obj
-o YellowSquare
-v 6.200003 -7.275252 25.246159
-v 31.200003 -7.275252 25.246159
-v 6.200003 1.275252 1.753843
-v 31.200003 1.275252 1.753843
+o Sheet
+v 14.200000 -3.000000 28.350000
+v 35.200000 -3.000000 28.350000
+v 14.200000 -3.000000 -1.350000
+v 35.200000 -3.000000 -1.350000
s off
f 1 2 4 3
-o GrayRectangle
-v 2.500000 2.500000 -0.500000
-v 37.500000 2.500000 -0.500000
-v 2.500000 27.500000 -0.500000
-v 37.500000 27.500000 -0.500000
+o Screen
+v 2.750000 2.900000 -0.500000
+v 49.250000 2.900000 -0.500000
+v 2.750000 29.100000 -0.500000
+v 49.250000 29.100000 -0.500000
s off
f 5 6 8 7
-o BlueTriangle
-v 12.500002 7.500000 -0.500000
-v 27.500002 7.500000 -0.500000
-v 20.000002 22.500000 -0.500000
-s off
-f 9 10 11
```
Here are common OBJ file features needed to describe AOI:
@@ -55,8 +49,7 @@ JSON file format allows to describe AOI vertices.
``` json
{
- "YellowSquare": [[6.2, -7.275252, 25.246159], [31.2, -7.275252, 25.246159], [31.2, 1.275252, 1.753843], [6.2, 1.275252, 1.753843]],
- "GrayRectangle": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [37.5, 27.5, -0.5], [2.5, 27.5, -0.5]],
- "BlueTriangle": [[12.5, 7.5, -0.5], [27.5, 7.5, -0.5], [20, 22.5, -0.5]]
+ "Sheet": [[14.2, -3, 28.35], [35.2, -3, 28.35], [14.2, -3, -1.35], [35.2, -3, -1.35]],
+ "Screen": [[2.75, 2.9, -0.5], [49.25, 2.9, -0.5], [2.75, 29.1, -0.5], [49.25, 29.1, -0.5]]
}
```
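
Whichever format is used, each AOI ends up as a planar polygon expressed in the same referential as the markers. As a quick sanity check, a short sketch in plain Python and NumPy (not ArGaze API; the file name is hypothetical) can load the JSON form above and verify that every AOI is planar:

``` python
# Minimal sketch (not ArGaze API): load a JSON AOI description like the one
# above and check that each AOI is a planar polygon, as expected for 3D AOI.
import json

import numpy as np

def check_planarity(path: str, tolerance: float = 1e-3):
    with open(path) as f:
        aoi_scene = json.load(f)

    for name, vertices in aoi_scene.items():
        points = np.array(vertices, dtype=float)
        # Normal of the plane spanned by the first three vertices
        normal = np.cross(points[1] - points[0], points[2] - points[0])
        normal /= np.linalg.norm(normal)
        # Distances of the remaining vertices to that plane
        distances = np.abs((points[3:] - points[0]) @ normal)
        print(f"{name}: planar={bool((distances < tolerance).all())}")

check_planarity("aoi_3d_description.json")  # hypothetical file name
```
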
diff --git a/docs/user_guide/aruco_markers_pipeline/aoi_3d_frame.md b/docs/user_guide/aruco_markers_pipeline/aoi_3d_frame.md
index 8075426..032e2b6 100644
--- a/docs/user_guide/aruco_markers_pipeline/aoi_3d_frame.md
+++ b/docs/user_guide/aruco_markers_pipeline/aoi_3d_frame.md
@@ -3,11 +3,13 @@ Define a 3D AOI as a frame
When a 3D AOI of the scene contains other coplanar 3D AOI, like a screen with GUI elements displayed on it, it is better to describe them as 2D AOI inside a 2D coordinate system related to the containing 3D AOI.
+![3D AOI frame](../../img/aruco_camera_aoi_frame.png)
+
## Add ArFrame to ArUcoScene
The [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) class defines a rectangular area into which timestamped gaze positions are projected and inside which they need to be analyzed.
-Here is the previous extract where "MyScreen" AOI is defined as a frame into [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) configuration:
+Here is the previous extract where the "Screen" AOI is defined as a frame in the [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) configuration:
```json
{
@@ -22,18 +24,34 @@ Here is the previous extract where "MyScreen" AOI is defined as a frame into [Ar
"layers": {
"MyLayer": {
"aoi_scene": {
- "MyScreen": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [37.5, 27.5, -0.5], [2.5, 27.5, -0.5]]
+ "Sheet": [[14.2, -3, 28.35], [35.2, -3, 28.35], [14.2, -3, -1.35], [35.2, -3, -1.35]],
+ "Screen": [[2.75, 2.9, -0.5], [49.25, 2.9, -0.5], [2.75, 29.1, -0.5], [49.25, 29.1, -0.5]]
}
}
},
"frames": {
- "MyScreen": {
- "size": [350, 250],
+ "Screen": {
+ "size": [1920, 1080],
"layers": {
"MyLayer": {
"aoi_scene": {
- "BlueTriangle": [[100, 50], [250, 50], [175, 200]]
- }
+ "GeoSector": [[860, 160], [1380, 100], [1660, 400], [1380, 740], [1440, 960], [920, 920], [680, 800], [640, 560]],
+ "LeftPanel": {
+ "Rectangle": {
+ "x": 0,
+ "y": 0,
+ "width": 350,
+ "height": 1080
+ }
+ },
+ "CircularWidget": {
+ "Circle": {
+ "cx": 1800,
+ "cy": 120,
+ "radius": 80
+ }
+ }
+ }
}
}
}
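
The 2D AOI of the frame layer can be given either as raw vertex lists or through the Rectangle and Circle shorthands shown above. An illustrative sketch of how such shorthands can be expanded into plain polygons (the 32-segment circle sampling is an arbitrary choice; this is not ArGaze internals):

``` python
# Illustrative sketch (not ArGaze internals): expand the Rectangle and Circle
# shorthands shown above into plain 2D polygon vertex arrays.
import numpy as np

def expand_aoi(aoi):
    if isinstance(aoi, list):                      # already a vertex list
        return np.array(aoi, dtype=float)
    if "Rectangle" in aoi:
        r = aoi["Rectangle"]
        x, y, w, h = r["x"], r["y"], r["width"], r["height"]
        return np.array([[x, y], [x + w, y], [x + w, y + h], [x, y + h]], dtype=float)
    if "Circle" in aoi:
        c = aoi["Circle"]
        angles = np.linspace(0, 2 * np.pi, 32, endpoint=False)
        return np.stack([c["cx"] + c["radius"] * np.cos(angles),
                         c["cy"] + c["radius"] * np.sin(angles)], axis=1)
    raise ValueError(f"unknown AOI shape: {aoi}")

circular_widget = expand_aoi({"Circle": {"cx": 1800, "cy": 120, "radius": 80}})
print(circular_widget.shape)  # (32, 2)
```
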
diff --git a/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md b/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md
index acbe31d..0d58d9a 100644
--- a/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md
+++ b/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md
@@ -24,9 +24,8 @@ Here is the previous extract where one layer is added to [ArUcoScene](../../arga
"layers": {
"MyLayer": {
"aoi_scene": {
- "YellowSquare": [[6.2, -7.275252, 25.246159], [31.2, -7.275252, 25.246159], [31.2, 1.275252, 1.753843], [6.2, 1.275252, 1.753843]],
- "GrayRectangle": [[2.5, 2.5, -0.5], [37.5, 2.5, -0.5], [37.5, 27.5, -0.5], [2.5, 27.5, -0.5]],
- "BlueTriangle": [[12.5, 7.5, -0.5], [27.5, 7.5, -0.5], [20, 22.5, -0.5]]
+ "Sheet": [[14.2, -3, 28.35], [35.2, -3, 28.35], [14.2, -3, -1.35], [35.2, -3, -1.35]],
+ "Screen": [[2.75, 2.9, -0.5], [49.25, 2.9, -0.5], [2.75, 29.1, -0.5], [49.25, 29.1, -0.5]]
}
}
}
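
Under the hood, projecting a 3D AOI into the camera image is a standard perspective projection. A minimal sketch with plain OpenCV (not ArGaze API; the camera intrinsics and pose values below are placeholders):

``` python
# Sketch of the underlying operation (plain OpenCV, not ArGaze API): project
# the 3D "Screen" AOI vertices into the camera image once the scene pose
# (rvec, tvec) has been estimated. Camera intrinsics below are placeholders.
import cv2
import numpy as np

screen_3d = np.array([[2.75, 2.9, -0.5], [49.25, 2.9, -0.5],
                      [2.75, 29.1, -0.5], [49.25, 29.1, -0.5]], dtype=np.float32)

camera_matrix = np.array([[1000., 0., 960.], [0., 1000., 540.], [0., 0., 1.]])
dist_coeffs = np.zeros(5)
rvec = np.zeros(3)               # scene rotation from pose estimation
tvec = np.array([0., 0., 100.])  # scene translation from pose estimation

screen_2d, _ = cv2.projectPoints(screen_3d, rvec, tvec, camera_matrix, dist_coeffs)
print(screen_2d.reshape(-1, 2))  # projected 2D vertices in pixels
```
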
diff --git a/docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md b/docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md
index b3ea2bb..3addcab 100644
--- a/docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md
+++ b/docs/user_guide/aruco_markers_pipeline/aruco_markers_description.md
@@ -3,7 +3,7 @@ Set up ArUco markers
First of all, ArUco markers need to be printed and placed into the scene.
-Here is an example scene where markers are surrounding a multi-screen workspace with a triangle area inside one of them.
+Here is an example scene where markers are surrounding a workspace with a screen and a sheet on the table.
![Scene](../../img/scene.png)
@@ -69,19 +69,19 @@ vn 0.0000 0.0000 1.0000
s off
f 1//1 2//1 4//1 3//1
o DICT_APRILTAG_16h5#1_Marker
-v -1.767767 23.000002 3.767767
-v 1.767767 23.000002 0.232233
-v -1.767767 28.000002 3.767767
-v 1.767767 28.000002 0.232233
-vn 0.7071 0.0000 0.7071
+v -0.855050 24.000002 4.349232
+v 0.855050 24.000002 -0.349231
+v -0.855050 29.000002 4.349232
+v 0.855050 29.000002 -0.349231
+vn 0.9397 0.0000 0.3420
s off
f 5//2 6//2 8//2 7//2
o DICT_APRILTAG_16h5#2_Marker
-v 33.000000 -1.767767 4.767767
-v 38.000000 -1.767767 4.767767
-v 33.000000 1.767767 1.232233
-v 38.000000 1.767767 1.232233
-vn 0.0000 0.7071 0.7071
+v 44.000000 0.000000 9.500000
+v 49.000000 0.000000 9.500000
+v 44.000000 -0.000000 4.500000
+v 49.000000 -0.000000 4.500000
+vn 0.0000 1.0000 -0.0000
s off
f 9//3 10//3 12//3 11//3
```
@@ -110,12 +110,12 @@ JSON file format allows to describe markers places using translation and euler a
"rotation": [0, 0, 0]
},
"1": {
- "translation": [0, 25.5, 2],
- "rotation": [0, 45, 0]
+ "translation": [0, 26.5, 2],
+ "rotation": [0, 70, 0]
},
"2": {
- "translation": [35.5, 0, 3],
- "rotation": [-45, 0, 0]
+ "translation": [46.5, 0, 7],
+ "rotation": [-90, 0, 0]
}
}
}
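
A marker place given as a translation vector and Euler angles, like marker "2" above, can be turned into a 4x4 pose matrix. A short sketch (not ArGaze API; the "xyz" axis order and degree unit are assumptions to check against your own convention):

``` python
# Sketch (not ArGaze API): turn a marker place given as translation + Euler
# angles into a 4x4 pose matrix. The "xyz" axis order and the degree unit are
# assumptions; check the convention used by your pipeline.
import numpy as np
from scipy.spatial.transform import Rotation

translation = [46.5, 0, 7]
rotation = [-90, 0, 0]

pose = np.eye(4)
pose[:3, :3] = Rotation.from_euler("xyz", rotation, degrees=True).as_matrix()
pose[:3, 3] = translation

print(pose.round(3))
```
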
diff --git a/docs/user_guide/aruco_markers_pipeline/introduction.md b/docs/user_guide/aruco_markers_pipeline/introduction.md
index 5a07b49..26294f7 100644
--- a/docs/user_guide/aruco_markers_pipeline/introduction.md
+++ b/docs/user_guide/aruco_markers_pipeline/introduction.md
@@ -1,11 +1,11 @@
Overview
========
-This section explains how to build augmented reality pipelines based on ArUco Markers technology for various use cases.
+This section explains how to build augmented reality pipelines based on [ArUco Markers technology](https://www.sciencedirect.com/science/article/abs/pii/S0031320314000235) for various use cases.
-The OpenCV library provides a module to detect fiducial markers into a picture and estimate their poses (cf [OpenCV ArUco tutorial page](https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html)).
+The OpenCV library provides a module to detect fiducial markers in a picture and estimate their poses.
-![OpenCV ArUco markers](https://pyimagesearch.com/wp-content/uploads/2020/12/aruco_generate_tags_header.png)
+![OpenCV ArUco markers](../../img/opencv_aruco.png)
The ArGaze [ArUcoMarkers submodule](../../argaze.md/#argaze.ArUcoMarkers) eases marker creation, marker detection and 3D scene pose estimation through a set of high-level classes.
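
For readers new to the underlying detection step, here is a minimal sketch with plain OpenCV (the OpenCV >= 4.7 ArucoDetector API is assumed), independent of ArGaze:

``` python
# Minimal detection sketch with plain OpenCV (>= 4.7 API assumed), independent
# of ArGaze, just to show what the underlying module does.
import cv2

image = cv2.imread("scene.png")  # hypothetical capture of the example scene

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_16h5)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

corners, ids, _ = detector.detectMarkers(image)
print(ids)  # identifiers of the markers found in the picture
```
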
diff --git a/docs/user_guide/aruco_markers_pipeline/pose_estimation.md b/docs/user_guide/aruco_markers_pipeline/pose_estimation.md
index 5dcde6f..6027039 100644
--- a/docs/user_guide/aruco_markers_pipeline/pose_estimation.md
+++ b/docs/user_guide/aruco_markers_pipeline/pose_estimation.md
@@ -27,12 +27,12 @@ Here is an extract from the JSON [ArUcoCamera](../../argaze.md/#argaze.ArUcoMark
"rotation": [0, 0, 0]
},
"1": {
- "translation": [0, 25.5, 2],
- "rotation": [0, 45, 0]
+ "translation": [0, 26.5, 2],
+ "rotation": [0, 70, 0]
},
"2": {
- "translation": [35.5, 0, 3],
- "rotation": [-45, 0, 0]
+ "translation": [46.5, 0, 7],
+ "rotation": [-90, 0, 0]
}
}
}
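
Scene pose estimation essentially solves a Perspective-n-Point problem from the known 3D marker corners and their detected 2D image corners. A sketch with plain OpenCV (not ArGaze API; all numeric values are placeholders):

``` python
# Sketch of what pose estimation boils down to (plain OpenCV, not ArGaze API):
# given the known 3D corners of a placed marker and its detected 2D image
# corners, solve for the pose. All numeric values here are placeholders.
import cv2
import numpy as np

object_points = np.array([[0., 0., 0.], [5., 0., 0.], [5., 5., 0.], [0., 5., 0.]],
                         dtype=np.float32)   # 3D corners of marker "0"
image_points = np.array([[610., 420.], [720., 418.], [722., 530.], [612., 532.]],
                        dtype=np.float32)    # matching detected 2D corners

camera_matrix = np.array([[1000., 0., 960.], [0., 1000., 540.], [0., 0., 1.]])
dist_coeffs = np.zeros(5)

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
print(ok, rvec.ravel(), tvec.ravel())
```
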
diff --git a/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md b/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md
index 0d5dbf0..ad8ee74 100644
--- a/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md
@@ -19,9 +19,9 @@ SVG file format could be exported from most vector graphics editors.
``` xml
<svg>
- <path id="Triangle" d="M960,664L1113,971L806,971L960,664Z"/>
- <rect id="RedSquare" x="268" y="203" width="308" height="308"/>
- <circle id="GreenCircle" cx="1497" cy="356" r="153"/>
+ <path id="GeoSector" d="M860,160L1380,100L1660,400L1380,740L1440,960L920,920L680,800L640,560L860,160Z"/>
+ <rect id="LeftPanel" x="0" y="0" width="350" height="1080"/>
+ <circle id="CircularWidget" cx="1800" cy="120" r="80"/>
</svg>
```
@@ -37,20 +37,20 @@ JSON file format allows to describe AOI.
``` json
{
- "BlueTriangle":[[960, 664], [1113, 971], [806, 971]],
- "RedSquare": {
+ "GeoSector": [[860, 160], [1380, 100], [1660, 400], [1380, 740], [1440, 960], [920, 920], [680, 800], [640, 560]],
+ "LeftPanel": {
"Rectangle": {
- "x": 268,
- "y": 203,
- "width": 308,
- "height": 308
+ "x": 0,
+ "y": 0,
+ "width": 350,
+ "height": 1080
}
},
- "GreenCircle": {
+ "CircularWidget": {
"Circle": {
- "cx": 1497,
- "cy": 356,
- "radius": 153
+ "cx": 1800,
+ "cy": 120,
+ "radius": 80
}
}
}
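
Both descriptions carry the same information. As an illustration, a short sketch (Python standard library only, not ArGaze API) can read the SVG form back into a simple name-to-shape mapping:

``` python
# Sketch (not ArGaze API): read the SVG description above back into a simple
# name -> shape mapping with the standard library.
import xml.etree.ElementTree as ET

def read_svg_aoi(path: str) -> dict:
    aoi = {}
    for element in ET.parse(path).getroot().iter():
        tag = element.tag.split("}")[-1]   # drop a possible SVG namespace
        if tag == "rect":
            aoi[element.get("id")] = {"Rectangle": {k: float(element.get(k))
                                                    for k in ("x", "y", "width", "height")}}
        elif tag == "circle":
            aoi[element.get("id")] = {"Circle": {"cx": float(element.get("cx")),
                                                 "cy": float(element.get("cy")),
                                                 "radius": float(element.get("r"))}}
        elif tag == "path":
            aoi[element.get("id")] = element.get("d")  # left as raw path data
    return aoi

print(read_svg_aoi("aoi_2d_description.svg"))  # hypothetical file name
```
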
diff --git a/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
index 9d2b3df..b282f80 100644
--- a/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
@@ -19,10 +19,22 @@ Here is an extract from the JSON [ArFrame](../../argaze.md/#argaze.ArFeatures.Ar
"layers": {
"MyLayer": {
"aoi_scene" : {
- "upper_left_area": [[0, 0], [960, 0], [960, 540], [0, 540]],
- "upper_right_area": [[960, 0], [1920, 0], [1920, 540], [960, 540]],
- "lower_left_area": [[0, 540], [960, 540], [960, 1080], [0, 1080]],
- "lower_right_area": [[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
+ "GeoSector": [[860, 160], [1380, 100], [1660, 400], [1380, 740], [1440, 960], [920, 920], [680, 800], [640, 560]],
+ "LeftPanel": {
+ "Rectangle": {
+ "x": 0,
+ "y": 0,
+ "width": 350,
+ "height": 1080
+ }
+ },
+ "CircularWidget": {
+ "Circle": {
+ "cx": 1800,
+ "cy": 120,
+ "radius": 80
+ }
+ }
},
"aoi_matcher": {
"DeviationCircleCoverage": {
@@ -63,13 +75,13 @@ The name of an [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically
The set of 2D AOI in the layer, as defined in the [2D AOI description chapter](aoi_2d_description.md).
-![AOI Scene](../../img/ar_layer_aoi_scene.png)
+![AOI scene](../../img/aoi_2d_description.png)
### *aoi_matcher*
The first [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step aims to match identified gaze movements with a layer's AOI.
-![AOI Matcher](../../img/ar_layer_aoi_matcher.png)
+![AOI matcher](../../img/aoi_matcher.png)
The matching algorithm can be selected by instantiating a particular [AOIMatcher from GazeAnalysis submodule](pipeline_modules/aoi_matchers.md) or [from another python package](advanced_topics/module_loading.md).
@@ -82,7 +94,7 @@ In the example file, the choosen matching algorithm is the [Deviation Circle Cov
The second [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step aims to build an [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) defined as a list of [AOIScanSteps](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) made by a set of successive fixations/saccades onto the same AOI.
-![AOI Scan Path](../../img/ar_layer_aoi_scan_path.png)
+![AOI scan path](../../img/aoi_scan_path.png)
Once gaze movements are matched to AOI, they are automatically appended to the AOIScanPath if required.
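
The Deviation Circle Coverage idea mentioned above can be pictured with a small Shapely-based sketch, an illustration of the principle rather than the ArGaze implementation: it measures how much of a fixation's deviation circle falls inside an AOI polygon.

``` python
# Rough illustration of the deviation-circle coverage principle (Shapely-based,
# not the ArGaze implementation): ratio of a fixation's deviation circle that
# lies inside an AOI polygon.
from shapely.geometry import Point, Polygon

geo_sector = Polygon([(860, 160), (1380, 100), (1660, 400), (1380, 740),
                      (1440, 960), (920, 920), (680, 800), (640, 560)])

fixation_centroid = (1100, 500)   # placeholder fixation data
fixation_deviation = 60

circle = Point(fixation_centroid).buffer(fixation_deviation)
coverage = circle.intersection(geo_sector).area / circle.area

print(f"coverage: {coverage:.0%}")  # 100% here: the circle lies inside the AOI
```
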
diff --git a/docs/user_guide/gaze_analysis_pipeline/background.md b/docs/user_guide/gaze_analysis_pipeline/background.md
index a7d59f6..ee27495 100644
--- a/docs/user_guide/gaze_analysis_pipeline/background.md
+++ b/docs/user_guide/gaze_analysis_pipeline/background.md
@@ -3,7 +3,7 @@ Add a background
Background is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) attribute to display any image behind pipeline visualisation.
-![Background](../../img/ar_frame_background.png)
+![Background](../../img/background.png)
## Load and display ArFrame background
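
What the background attribute amounts to can be sketched with plain OpenCV (not ArGaze API; file names are placeholders): the pipeline output is simply drawn over a resized copy of the background picture.

``` python
# Sketch (plain OpenCV, not ArGaze API): draw pipeline output over a resized
# copy of a background picture. File names are placeholders.
import cv2

frame_size = (1920, 1080)

background = cv2.resize(cv2.imread("bosch.png"), frame_size)
# ... pipeline drawing calls would paint fixations, AOI, etc. onto this image
cv2.circle(background, (960, 540), 20, (0, 255, 0), 2)  # e.g. a fixation circle

cv2.imwrite("visualisation_with_background.png", background)
```
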
@@ -16,7 +16,7 @@ Here is an extract from the JSON ArFrame configuration file where a background p
"name": "My FullHD screen",
"size": [1920, 1080],
...
- "background": "./joconde.png",
+ "background": "./bosch.png",
...
"image_parameters": {
...
diff --git a/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
index 8ddd97a..3b21cbd 100644
--- a/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
+++ b/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
@@ -59,7 +59,7 @@ The size of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) defines th
The first [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step is to identify fixations or saccades from consecutive timestamped gaze positions.
-![Gaze Movement Identifier](../../img/ar_frame_gaze_movement_identifier.png)
+![Gaze movement identifier](../../img/gaze_movement_identifier.png)
The identification algorithm can be selected by instantiating a particular [GazeMovementIdentifier from GazeAnalysis submodule](pipeline_modules/gaze_movement_identifiers.md) or [from another python package](advanced_topics/module_loading.md).
@@ -75,7 +75,7 @@ In the example file, the choosen identification algorithm is the [Dispersion Thr
The second [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step aims to build a [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep) made by a fixation and a consecutive saccade.
-![Scan Path](../../img/ar_frame_scan_path.png)
+![Scan path](../../img/scan_path.png)
Once fixations and saccades are identified, they are automatically appended to the ScanPath if required.
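
As an aside, the dispersion-threshold identification mentioned above can be pictured with a toy I-DT sketch; this is an illustration of the principle, not the ArGaze implementation.

``` python
# Toy dispersion-threshold (I-DT) sketch, not the ArGaze implementation:
# group consecutive gaze positions into a fixation while their dispersion
# stays under a spatial threshold for at least a minimum duration.
import numpy as np

def identify_fixations(timestamps, positions, max_dispersion=50, min_duration=200):
    """Return (start_time, end_time, centroid) tuples; units follow the inputs."""
    fixations, start = [], 0
    for end in range(1, len(positions)):
        window = np.asarray(positions[start:end + 1], dtype=float)
        dispersion = (window.max(axis=0) - window.min(axis=0)).sum()
        if dispersion > max_dispersion:
            # the window without the breaking point is a fixation candidate
            if timestamps[end - 1] - timestamps[start] >= min_duration:
                centroid = tuple(float(v) for v in window[:-1].mean(axis=0))
                fixations.append((timestamps[start], timestamps[end - 1], centroid))
            start = end
    return fixations

print(identify_fixations(
    [0, 50, 100, 150, 200, 250, 300],
    [(500, 500), (502, 498), (505, 501), (503, 499), (504, 500), (800, 300), (820, 310)]))
# -> [(0, 200, (502.8, 499.6))]
```
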
diff --git a/docs/user_guide/gaze_analysis_pipeline/heatmap.md b/docs/user_guide/gaze_analysis_pipeline/heatmap.md
index fe4246e..5310d64 100644
--- a/docs/user_guide/gaze_analysis_pipeline/heatmap.md
+++ b/docs/user_guide/gaze_analysis_pipeline/heatmap.md
@@ -3,7 +3,7 @@ Add a heatmap
The heatmap is an optional [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step. It is executed at each new gaze position to update the heatmap image.
-![Heatmap](../../img/ar_frame_heatmap.png)
+![Heatmap](../../img/heatmap.png)
## Enable and display ArFrame heatmap
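
The principle of such a heatmap can be sketched with plain OpenCV and NumPy (not the ArGaze Heatmap step): bump a buffer at each gaze position, then blur and colorize it.

``` python
# Sketch of a point-accumulation heatmap (plain OpenCV/NumPy, not the ArGaze
# Heatmap step): bump a buffer at each gaze position, blur and colorize it.
import cv2
import numpy as np

size = (1080, 1920)               # rows, columns for a Full HD frame
buffer = np.zeros(size, dtype=np.float32)

for x, y in [(960, 540), (965, 545), (970, 538)]:   # placeholder gaze positions
    buffer[y, x] += 1.0

blurred = cv2.GaussianBlur(buffer, (0, 0), sigmaX=25)
normalized = cv2.normalize(blurred, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
heatmap = cv2.applyColorMap(normalized, cv2.COLORMAP_JET)

cv2.imwrite("heatmap.png", heatmap)
```
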
diff --git a/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md b/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
index 93d2a65..2156f3b 100644
--- a/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
+++ b/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
@@ -3,7 +3,7 @@ Edit timestamped gaze positions
Whether eye data comes from a file on disk or from a live stream, timestamped gaze positions are required before going further.
-![Timestamped Gaze Positions](../../img/timestamped_gaze_positions.png)
+![Timestamped gaze positions](../../img/timestamped_gaze_positions.png)
## Import gaze positions from CSV file
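
A common way to load such data is the pandas library. A short sketch (not ArGaze API; the column names are hypothetical and must be adapted to the eye tracker's export):

``` python
# Sketch (not ArGaze API): read timestamped gaze positions from a CSV file.
# Column names are hypothetical; adapt them to your eye tracker's export.
import pandas as pd

dataframe = pd.read_csv("gaze_positions.csv")   # hypothetical file name

timestamped_gaze_positions = [
    (row.timestamp, (row.gaze_x, row.gaze_y))
    for row in dataframe.itertuples(index=False)
]

print(timestamped_gaze_positions[:3])
```
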
diff --git a/docs/user_guide/gaze_analysis_pipeline/visualisation.md b/docs/user_guide/gaze_analysis_pipeline/visualisation.md
index c9cbf2c..cf6fa41 100644
--- a/docs/user_guide/gaze_analysis_pipeline/visualisation.md
+++ b/docs/user_guide/gaze_analysis_pipeline/visualisation.md
@@ -3,7 +3,7 @@ Visualize pipeline steps
Visualisation is not a pipeline step, but the outputs of each [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step can be drawn in real time or afterward, depending on the application purpose.
-![ArFrame visualisation](../../img/ar_frame_visualisation.png)
+![ArFrame visualisation](../../img/visualisation.png)
## Add image parameters to ArFrame JSON configuration file