Diffstat (limited to 'docs')
-rw-r--r--  docs/contributor_guide/build_package.md  36
-rw-r--r--  docs/img/4flight_aoi.png  bin 311033 -> 0 bytes
-rw-r--r--  docs/img/4flight_visual_pattern.png  bin 0 -> 331959 bytes
-rw-r--r--  docs/img/4flight_workspace.png  bin 432921 -> 311033 bytes
-rw-r--r--  docs/installation.md  4
-rw-r--r--  docs/use_cases/air_controller_gaze_study/context.md  2
-rw-r--r--  docs/use_cases/air_controller_gaze_study/introduction.md  11
-rw-r--r--  docs/use_cases/air_controller_gaze_study/pipeline.md  14
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/context.md (renamed from docs/use_cases/pilot_gaze_monitoring/context.md)  2
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/introduction.md (renamed from docs/use_cases/pilot_gaze_monitoring/introduction.md)  0
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/observers.md (renamed from docs/use_cases/pilot_gaze_monitoring/observers.md)  0
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/pipeline.md (renamed from docs/use_cases/pilot_gaze_monitoring/pipeline.md)  6
-rw-r--r--  docs/user_guide/eye_tracking_context/configuration_and_execution.md  5
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/file.md  75
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/opencv.md  18
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md  32
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md (renamed from docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md)  10
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md  32
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md  5
-rw-r--r--  docs/user_guide/utils/demonstrations_scripts.md  88
-rw-r--r--  docs/user_guide/utils/estimate_aruco_markers_pose.md  2
-rw-r--r--  docs/user_guide/utils/main_commands.md  3
22 files changed, 308 insertions, 37 deletions
diff --git a/docs/contributor_guide/build_package.md b/docs/contributor_guide/build_package.md
new file mode 100644
index 0000000..fae1730
--- /dev/null
+++ b/docs/contributor_guide/build_package.md
@@ -0,0 +1,36 @@
+Build package
+=============
+
+The ArGaze build system is based on [setuptools](https://setuptools.pypa.io/en/latest/userguide/index.html) and [setuptools-scm](https://setuptools-scm.readthedocs.io/en/latest/), which uses the Git tag as the package version number.
+
+!!! note
+
+    *All the commands below have to be executed at the root of the ArGaze Git repository.*
+
+Install or upgrade the required packages:
+
+```console
+pip install build setuptools setuptools-scm
+```
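+
+Optionally, check which version number *setuptools-scm* derives from the current repository state; the command below simply prints the computed version (the exact value depends on the latest tag and commits):
+
+```console
+python -m setuptools_scm
+```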
+
+Commit the last changes, then tag the Git repository with a VERSION that follows the [setuptools versioning schemes](https://setuptools.pypa.io/en/latest/userguide/distribution.html):
+
+```console
+git tag -a VERSION -m "Version message"
+```
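+
+For example, with a hypothetical version number 1.2.3:
+
+```console
+git tag -a 1.2.3 -m "Version 1.2.3"
+```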
+
+Push commits and tags:
+
+```console
+git push && git push --tags
+```
+
+Then, build the package:
+
+```console
+python -m build
+```
+
+Once the build is done, two files are created in a *dist* folder:
+
+* **argaze-VERSION-py3-none-any.whl**: the built wheel package (*none* means it depends on no specific ABI, *any* means it runs on any platform).
+* **argaze-VERSION.tar.gz**: the source package.
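+
+For example, with the hypothetical 1.2.3 tag above, the *dist* folder would contain something like:
+
+```console
+ls ./dist
+argaze-1.2.3-py3-none-any.whl
+argaze-1.2.3.tar.gz
+```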
diff --git a/docs/img/4flight_aoi.png b/docs/img/4flight_aoi.png
deleted file mode 100644
index f899ab2..0000000
--- a/docs/img/4flight_aoi.png
+++ /dev/null
Binary files differ
diff --git a/docs/img/4flight_visual_pattern.png b/docs/img/4flight_visual_pattern.png
new file mode 100644
index 0000000..0550063
--- /dev/null
+++ b/docs/img/4flight_visual_pattern.png
Binary files differ
diff --git a/docs/img/4flight_workspace.png b/docs/img/4flight_workspace.png
index 1c405c4..f899ab2 100644
--- a/docs/img/4flight_workspace.png
+++ b/docs/img/4flight_workspace.png
Binary files differ
diff --git a/docs/installation.md b/docs/installation.md
index 66b801b..fe4cfa4 100644
--- a/docs/installation.md
+++ b/docs/installation.md
@@ -37,8 +37,8 @@ pip install ./dist/argaze-VERSION.whl
!!! note "As ArGaze package contributor"
- *You should prefer to install the package in developer mode to test live code changes:*
+ *You should prefer to install the package in editable mode to test live code changes:*
```
- pip install -e .
+ pip install --editable .
```
diff --git a/docs/use_cases/air_controller_gaze_study/context.md b/docs/use_cases/air_controller_gaze_study/context.md
index 5b13ca5..8bb4ef8 100644
--- a/docs/use_cases/air_controller_gaze_study/context.md
+++ b/docs/use_cases/air_controller_gaze_study/context.md
@@ -19,4 +19,4 @@ While *segment* entry is specific to the [TobiiProGlasses2.SegmentPlayback](../.
}
```
-The [post_processing_pipeline.json](pipeline.md) file mentioned aboved is described in the next chapter.
+The [post_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
diff --git a/docs/use_cases/air_controller_gaze_study/introduction.md b/docs/use_cases/air_controller_gaze_study/introduction.md
index 5f1c6ac..f188eec 100644
--- a/docs/use_cases/air_controller_gaze_study/introduction.md
+++ b/docs/use_cases/air_controller_gaze_study/introduction.md
@@ -18,9 +18,16 @@ During their training, controllers are taught to visually follow all aircraft st
![4Flight Workspace](../../img/4flight_workspace.png)
-A traffic simulation of moderate difficulty with a maximum of 13 and 16 aircraft simultaneously was performed by air traffic controllers. The controller could encounter lateral conflicts (same altitude) between 2 and 3 aircraft and conflicts between aircraft that need to ascend or descend within the sector. After the simulation, a directed interview about the gaze pattern was conducted. Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker. The gaze and scene camera video were captured with Tobii Pro Lab software and post-processed with **ArGaze** software library. As the eye tracker model is head mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.
+A traffic simulation of moderate difficulty, with a maximum of 13 and 16 aircraft simultaneously, was performed by air traffic controllers. The controllers could encounter lateral conflicts (same altitude) between 2 and 3 aircraft, as well as conflicts between aircraft that needed to ascend or descend within the sector.
+After the simulation, a directed interview about the gaze pattern was conducted.
+Eye tracking data was recorded with a Tobii Pro Glasses 2, a head-mounted eye tracker.
+The gaze and scene camera video were captured with the Tobii Pro Lab software and post-processed with the **ArGaze** software library.
+As the eye tracker is head-mounted, ArUco markers were placed around the two screens to ensure that several of them were always visible in the field of view of the eye tracker camera.
-![4Flight Workspace](../../img/4flight_aoi.png)
+Various metrics were exported with specific pipeline observers, including average fixation duration, explore/exploit ratio, K-coefficient, AOI distribution, transition matrix, entropy and N-grams.
+Although statistical analysis is not possible due to the small sample size of the study (6 instructors, 5 qualified controllers, and 5 trainees), visual pattern summaries have been manually built from the transition matrix exports to produce a qualitative interpretation showing what instructors attend to during training and how qualified controllers work. Red arcs represent more frequent transitions than blue ones; the figure below shows instructors (Fig. a) and four different qualified controllers (Fig. b, c, d, e).
+
+![4Flight Visual pattern](../../img/4flight_visual_pattern.png)
## Setup
diff --git a/docs/use_cases/air_controller_gaze_study/pipeline.md b/docs/use_cases/air_controller_gaze_study/pipeline.md
index 3507038..69fdd2c 100644
--- a/docs/use_cases/air_controller_gaze_study/pipeline.md
+++ b/docs/use_cases/air_controller_gaze_study/pipeline.md
@@ -187,17 +187,17 @@ For this use case we need to detect ArUco markers to enable gaze mapping: **ArGa
},
"observers": {
"argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
- "path": "_export/look_performance.csv"
+ "path": "look_performance.csv"
},
"argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
- "path": "_export/watch_performance.csv"
+ "path": "watch_performance.csv"
}
}
}
}
```
-All the files mentioned aboved are described below.
+All the files mentioned above are described below.
The *ScanPathAnalysisRecorder* and *AOIScanPathAnalysisRecorder* observers objects are defined into the [observers.py](observers.md) file that is described in the next chapter.
@@ -355,12 +355,12 @@ The video file is a record of the sector screen frame image.
## look_performance.csv
-This file contains the logs of *ArUcoCamera.look* method execution info. It is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+This file contains the logs of *ArUcoCamera.look* method execution info. It is created in the folder from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a MacBookPro (2,3GHz Intel Core i9 8 cores), the *look* method execution time is ~7ms and it is called ~115 times per second.
+On a MacBookPro (2.3GHz Intel Core i9 8 cores), the *look* method execution time is ~1ms and it is called ~51 times per second.
## watch_performance.csv
-This file contains the logs of *ArUcoCamera.watch* method execution info. It is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+This file contains the logs of *ArUcoCamera.watch* method execution info. It is created in the folder from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a MacBookPro (2,3GHz Intel Core i9 8 cores), the *watch* method execution time is ~60ms and it is called ~10 times per second.
+On a MacBookPro (2.3GHz Intel Core i9 8 cores) without CUDA acceleration, the *watch* method execution time is ~52ms and it is called more than 12 times per second.
diff --git a/docs/use_cases/pilot_gaze_monitoring/context.md b/docs/use_cases/pilot_gaze_tracking/context.md
index 477276d..8839cb6 100644
--- a/docs/use_cases/pilot_gaze_monitoring/context.md
+++ b/docs/use_cases/pilot_gaze_tracking/context.md
@@ -36,6 +36,6 @@ While *address*, *project*, *participant* and *configuration* entries are specif
}
```
-The [live_processing_pipeline.json](pipeline.md) file mentioned aboved is described in the next chapter.
+The [live_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
The *IvyBus* observer object is defined into the [observers.py](observers.md) file that is described in a next chapter. \ No newline at end of file
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_tracking/introduction.md
index 7e88c69..7e88c69 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_tracking/introduction.md
diff --git a/docs/use_cases/pilot_gaze_monitoring/observers.md b/docs/use_cases/pilot_gaze_tracking/observers.md
index 5f5bc78..5f5bc78 100644
--- a/docs/use_cases/pilot_gaze_monitoring/observers.md
+++ b/docs/use_cases/pilot_gaze_tracking/observers.md
diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_tracking/pipeline.md
index 1450fed..65fccc3 100644
--- a/docs/use_cases/pilot_gaze_monitoring/pipeline.md
+++ b/docs/use_cases/pilot_gaze_tracking/pipeline.md
@@ -122,7 +122,7 @@ For this use case we need to detect ArUco markers to enable gaze mapping: **ArGa
}
```
-All the files mentioned aboved are described below.
+All the files mentioned above are described below.
The *ArUcoCameraLogger* observer object is defined into the [observers.py](observers.md) file that is described in the next chapter.
@@ -302,10 +302,10 @@ This file defines the place of the AOI into the PFD frame. AOI positions have be
This file contains the logs of *ArUcoCamera.look* method execution info. It is saved into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a Jetson Xavier computer, the *look* method execution time is ~0.5ms and it is called ~100 times per second.
+On a Jetson Xavier computer, the *look* method execution time is 5.7ms and it is called ~100 times per second.
## watch_performance.csv
This file contains the logs of *ArUcoCamera.watch* method execution info. It is saved into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a Jetson Xavier computer, the *watch* method execution time is ~50ms and it is called ~10 times per second.
+On a Jetson Xavier computer with CUDA acceleration, the *watch* method execution time is 46.5ms and it is called more than 12 times per second.
diff --git a/docs/user_guide/eye_tracking_context/configuration_and_execution.md b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
index e1123fb..3deeb57 100644
--- a/docs/user_guide/eye_tracking_context/configuration_and_execution.md
+++ b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
@@ -4,7 +4,10 @@ Edit and execute context
The [utils.contexts module](../../argaze.md/#argaze.utils.contexts) provides ready-made contexts like:
* [Tobii Pro Glasses 2](context_modules/tobii_pro_glasses_2.md) data capture and data playback contexts,
-* [Pupil Labs](context_modules/pupil_labs.md) data capture context,
+* [Tobii Pro Glasses 3](context_modules/tobii_pro_glasses_3.md) data capture context,
+* [Pupil Labs Invisible](context_modules/pupil_labs_invisible.md) data capture context,
+* [Pupil Labs Neon](context_modules/pupil_labs_neon.md) data capture context,
+* [File](context_modules/file.md) data playback contexts,
* [OpenCV](context_modules/opencv.md) window cursor position capture and movie playback,
* [Random](context_modules/random.md) gaze position generator.
diff --git a/docs/user_guide/eye_tracking_context/context_modules/file.md b/docs/user_guide/eye_tracking_context/context_modules/file.md
new file mode 100644
index 0000000..5b5c8e9
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/file.md
@@ -0,0 +1,75 @@
+File
+======
+
+ArGaze provides ready-made contexts to read data from various file formats.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## CSV
+
+::: argaze.utils.contexts.File.CSV
+
+### JSON sample: split case
+
+To use when gaze position coordinates are split into two separate columns.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV": {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_splitted.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "x_column": "Gaze Position X (px)",
+ "y_column": "Gaze Position Y (px)",
+ "pipeline": ...
+ }
+}
+```
+
+### JSON sample: joined case
+
+To use when gaze position coordinates are joined as a list in a single column.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV" : {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_xy_joined.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "xy_column": "Gaze Position (px)",
+ "pipeline": ...
+ }
+}
+```
+
+### JSON sample: left and right eyes
+
+To use when gaze position coordinates and validity are given for each eye in six separate columns.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV": {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_left_right_eyes.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "left_eye_x_column": "Left eye X",
+ "left_eye_y_column": "Left eye Y",
+ "left_eye_validity_column": "Left eye validity",
+ "right_eye_x_column": "Right eye X",
+ "right_eye_y_column": "Right eye Y",
+ "right_eye_validity_column": "Right eye validity",
+ "rescale_to_pipeline_size": true,
+ "pipeline": ...
+ }
+}
+```
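+
+As an illustration, a hypothetical excerpt of a CSV file matching the left and right eyes sample above could look like this (the column names are the ones declared in the configuration; the values are made up):
+
+```csv
+Timestamp (ms),Left eye X,Left eye Y,Left eye validity,Right eye X,Right eye Y,Right eye validity
+0,960,540,1,962,538,1
+20,961,542,1,963,540,1
+40,0,0,0,964,541,1
+```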
diff --git a/docs/user_guide/eye_tracking_context/context_modules/opencv.md b/docs/user_guide/eye_tracking_context/context_modules/opencv.md
index 7244cd4..7d73a03 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/opencv.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/opencv.md
@@ -39,9 +39,25 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
```json
{
"argaze.utils.contexts.OpenCV.Movie": {
- "name": "Open CV cursor",
+ "name": "Open CV movie",
"path": "./src/argaze/utils/demo/tobii_record/segments/1/fullstream.mp4",
"pipeline": ...
}
}
```
+
+## Camera
+
+::: argaze.utils.contexts.OpenCV.Camera
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.OpenCV.Camera": {
+ "name": "Open CV camera",
+ "identifier": 0,
+ "pipeline": ...
+ }
+}
+``` \ No newline at end of file
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
new file mode 100644
index 0000000..1f4a94f
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
@@ -0,0 +1,32 @@
+Pupil Labs Invisible
+==========
+
+ArGaze provides a ready-made context to work with the Pupil Labs Invisible device.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.PupilLabsInvisible.LiveStream
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.PupilLabsInvisible.LiveStream": {
+ "name": "Pupil Labs Invisible live stream",
+ "project": "my_experiment",
+ "pipeline": ...
+ }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
index d2ec336..535f5d5 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
@@ -1,7 +1,7 @@
-Pupil Labs
+Pupil Labs Neon
==========
-ArGaze provides a ready-made context to work with Pupil Labs devices.
+ArGaze provides a ready-made context to work with the Pupil Labs Neon device.
To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
Notice that the *pipeline* entry is mandatory.
@@ -17,14 +17,14 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
## Live Stream
-::: argaze.utils.contexts.PupilLabs.LiveStream
+::: argaze.utils.contexts.PupilLabsNeon.LiveStream
### JSON sample
```json
{
- "argaze.utils.contexts.PupilLabs.LiveStream": {
- "name": "Pupil Labs live stream",
+ "argaze.utils.contexts.PupilLabsNeon.LiveStream": {
+ "name": "Pupil Labs Neon live stream",
"project": "my_experiment",
"pipeline": ...
}
diff --git a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
new file mode 100644
index 0000000..3d37fcc
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
@@ -0,0 +1,32 @@
+Tobii Pro Glasses 3
+===================
+
+ArGaze provides a ready-made context to work with Tobii Pro Glasses 3 devices.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.TobiiProGlasses3.LiveStream
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.TobiiProGlasses3.LiveStream": {
+ "name": "Tobii Pro Glasses 3 live stream",
+ "pipeline": ...
+ }
+}
+```
+
diff --git a/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
index 2b64091..c2a6ac3 100644
--- a/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
@@ -100,6 +100,11 @@ The second [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step a
Once gaze movements are matched to AOI, they are automatically appended to the AOIScanPath if required.
+!!! warning "GazeFeatures.OutsideAOI"
+    When a fixation is not looking at any AOI, a step associated with a special AOI called [GazeFeatures.OutsideAOI](../../argaze.md/#argaze.GazeFeatures.OutsideAOI) is added. As long as fixations are not looking at any AOI, all fixations/saccades are stored in this step. In this way, further analyses take those extra [GazeFeatures.OutsideAOI](../../argaze.md/#argaze.GazeFeatures.OutsideAOI) steps into account.
+
+    This is particularly important when calculating transition matrices: without these steps, the matrix could contain an arc between two AOI even though the gaze actually fixated outside of any AOI in the meantime.
+
The [AOIScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the duration from which older AOI scan steps are removed each time new AOI scan steps are added.
!!! note "Optional"
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 1d136f2..c7560eb 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -20,7 +20,32 @@ Load **random_context.json** file to generate random gaze positions:
python -m argaze load ./src/argaze/utils/demo/random_context.json
```
-## OpenCV cursor context
+## CSV file context
+
+Load **csv_file_context_xy_joined.json** file to analyze gaze positions from a CSV file where gaze position coordinates are joined as a list in a single column:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_joined.json
+```
+
+Load **csv_file_context_xy_splitted.json** file to analyze gaze positions from a CSV file where gaze position coordinates are split into two separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_splitted.json
+```
+
+Load **csv_file_context_left_right_eyes.json** file to analyze gaze positions from a CSV file where gaze position coordinates and validity are given for each eye in six separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_left_right_eyes.json
+```
+
+!!! note
+    The left/right eyes context makes it possible to parse Tobii Spectrum data, for example.
+
+## OpenCV
+
+### Cursor context
Load **opencv_cursor_context.json** file to capture cursor pointer positions over OpenCV window:
@@ -28,17 +53,17 @@ Load **opencv_cursor_context.json** file to capture cursor pointer positions ove
python -m argaze load ./src/argaze/utils/demo/opencv_cursor_context.json
```
-## OpenCV movie context
+### Movie context
-Load **opencv_movie_context.json** file to playback movie pictures and also capture cursor pointer positions over OpenCV window:
+Load **opencv_movie_context.json** file to playback a movie and also capture cursor pointer positions over OpenCV window:
```shell
python -m argaze load ./src/argaze/utils/demo/opencv_movie_context.json
```
-## OpenCV camera context
+### Camera context
-Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution and to reduce the value of the *sides_mask*.
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution and to set a consistent *sides_mask* value.
Edit **opencv_camera_context.json** file as to select camera device identifier (default is 0).
@@ -55,7 +80,9 @@ python -m argaze load ./src/argaze/utils/demo/opencv_camera_context.json
!!! note
This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
-Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*, *project* or *participant* names and setup Tobii *configuration* parameters:
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420, as sketched below.
+
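+As an illustration, here is a hedged sketch of the two entries to adapt in **aruco_markers_pipeline.json** (only the relevant keys are shown; the enclosing camera object and the rest of the pipeline are elided):
+
+```json
+{
+    "size": [1920, 1080],
+    "sides_mask": 420,
+    ...
+}
+```
+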
+Edit the **tobii_g2_live_stream_context.json** file to select an existing IP *address*, *project* or *participant* name, and to set up the Tobii *configuration* parameters:
```json
{
@@ -78,15 +105,17 @@ Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*
}
```
-Then, load **tobii_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load the **tobii_g2_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_live_stream_context.json
```
### Segment playback context
-Edit **tobii_segment_playback_context.json** file to select an existing Tobii *segment* folder:
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit the **tobii_g2_segment_playback_context.json** file to select an existing Tobii *segment* folder:
```json
{
@@ -98,12 +127,28 @@ Edit **tobii_segment_playback_context.json** file to select an existing Tobii *s
}
```
-Then, load **tobii_segment_playback_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load the **tobii_g2_segment_playback_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_segment_playback_context.json
+```
+
+## Tobii Pro Glasses 3
+
+### Live stream context
+
+!!! note
+ This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Load the **tobii_g3_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g3_live_stream_context.json
```
+
## Pupil Invisible
### Live stream context
@@ -111,8 +156,25 @@ python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.jso
!!! note
This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
-Load **pupillabs_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1088, 1080]) and to set the *sides_mask* value to 4.
+
+Load the **pupillabs_invisible_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/pupillabs_invisible_live_stream_context.json
+```
+
+## Pupil Neon
+
+### Live stream context
+
+!!! note
+ This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1600, 1200]) and to set the *sides_mask* value to 200.
+
+Load the **pupillabs_neon_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
```shell
-python -m argaze load ./src/argaze/utils/demo/pupillabs_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/pupillabs_neon_live_stream_context.json
```
diff --git a/docs/user_guide/utils/estimate_aruco_markers_pose.md b/docs/user_guide/utils/estimate_aruco_markers_pose.md
index d1fd16e..55bd232 100644
--- a/docs/user_guide/utils/estimate_aruco_markers_pose.md
+++ b/docs/user_guide/utils/estimate_aruco_markers_pose.md
@@ -15,7 +15,7 @@ Firstly, edit **utils/estimate_markers_pose/context.json** file as to select a m
}
```
-Sencondly, edit **utils/estimate_markers_pose/pipeline.json** file to setup ArUco camera *size*, ArUco detector *dictionary*, *pose_size* and *pose_ids* attributes.
+Secondly, edit the **utils/estimate_markers_pose/pipeline.json** file to set up the ArUco camera *size* and the ArUco detector *dictionary*, *pose_size* and *pose_ids* attributes.
```json
{
diff --git a/docs/user_guide/utils/main_commands.md b/docs/user_guide/utils/main_commands.md
index c4887a4..9227d8d 100644
--- a/docs/user_guide/utils/main_commands.md
+++ b/docs/user_guide/utils/main_commands.md
@@ -54,3 +54,6 @@ Modify the content of JSON CONFIGURATION file with another JSON CHANGES file the
```shell
python -m argaze edit CONFIGURATION CHANGES OUTPUT
```
+
+!!! note
+    Use a *null* value to remove an entry.
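+
+    For instance, a hypothetical CHANGES file that removes an *observers* entry from the CONFIGURATION file (the entry name is only illustrative):
+
+    ```json
+    {
+        "observers": null
+    }
+    ```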