Diffstat (limited to 'docs')
-rw-r--r--  docs/contributor_guide/build_package.md | 36
-rw-r--r--  docs/installation.md | 4
-rw-r--r--  docs/use_cases/air_controller_gaze_study/context.md | 2
-rw-r--r--  docs/use_cases/air_controller_gaze_study/pipeline.md | 14
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/context.md (renamed from docs/use_cases/pilot_gaze_monitoring/context.md) | 2
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/introduction.md (renamed from docs/use_cases/pilot_gaze_monitoring/introduction.md) | 0
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/observers.md (renamed from docs/use_cases/pilot_gaze_monitoring/observers.md) | 0
-rw-r--r--  docs/use_cases/pilot_gaze_tracking/pipeline.md (renamed from docs/use_cases/pilot_gaze_monitoring/pipeline.md) | 6
-rw-r--r--  docs/user_guide/eye_tracking_context/configuration_and_execution.md | 5
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/file.md | 75
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md | 32
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md (renamed from docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md) | 10
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md | 32
-rw-r--r--  docs/user_guide/utils/demonstrations_scripts.md | 78
-rw-r--r--  docs/user_guide/utils/main_commands.md | 3
15 files changed, 270 insertions, 29 deletions
diff --git a/docs/contributor_guide/build_package.md b/docs/contributor_guide/build_package.md
new file mode 100644
index 0000000..fae1730
--- /dev/null
+++ b/docs/contributor_guide/build_package.md
@@ -0,0 +1,36 @@
+Build package
+=============
+
+The ArGaze build system is based on [setuptools](https://setuptools.pypa.io/en/latest/userguide/index.html) and [setuptools-scm](https://setuptools-scm.readthedocs.io/en/latest/), which uses the Git tag as the package version number.
+
+!!! note
+
+	*All commands below have to be executed at the root of the ArGaze Git repository.*
+
+Install or upgrade the required packages:
+
+```console
+pip install build setuptools setuptools-scm
+```
+
+Commit the last changes, then tag the Git repository with a VERSION that follows the [setuptools versioning schemes](https://setuptools.pypa.io/en/latest/userguide/distribution.html):
+
+```console
+git tag -a VERSION -m "Version message"
+```
+
+Push commits and tags:
+
+```console
+git push && git push --tags
+```
+
+Then, build the package:
+
+```console
+python -m build
+```
+
+Once the build is done, two files are created in a *dist* folder:
+
+* **argaze-VERSION-py3-none-any.whl**: the built package (*none* means for no specific OS, *any* means for any architecture).
+* **argaze-VERSION.tar.gz**: the source package.
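As a sanity check on the naming scheme above, here is a small sketch (not part of ArGaze) of how a plain release tag maps to the wheel filename; setuptools-scm itself handles many more cases, such as development versions for untagged commits:

```python
import re

def tag_to_version(tag: str) -> str:
    """Strip a leading 'v' and validate a simple X.Y.Z release tag.

    Simplified illustration: setuptools-scm also derives versions like
    X.Y.Z.devN+gHASH when the checkout is ahead of the last tag.
    """
    version = tag.lstrip("v")
    if not re.fullmatch(r"\d+(\.\d+)*", version):
        raise ValueError(f"not a plain release tag: {tag}")
    return version

def wheel_name(version: str) -> str:
    # 'py3-none-any' means: Python 3, no ABI requirement, any architecture.
    return f"argaze-{version}-py3-none-any.whl"

print(wheel_name(tag_to_version("v1.2.3")))  # argaze-1.2.3-py3-none-any.whl
```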
diff --git a/docs/installation.md b/docs/installation.md
index 66b801b..fe4cfa4 100644
--- a/docs/installation.md
+++ b/docs/installation.md
@@ -37,8 +37,8 @@ pip install ./dist/argaze-VERSION.whl
!!! note "As ArGaze package contributor"
- *You should prefer to install the package in developer mode to test live code changes:*
+ *You should prefer to install the package in editable mode to test live code changes:*
```
- pip install -e .
+ pip install --editable .
```
diff --git a/docs/use_cases/air_controller_gaze_study/context.md b/docs/use_cases/air_controller_gaze_study/context.md
index 5b13ca5..8bb4ef8 100644
--- a/docs/use_cases/air_controller_gaze_study/context.md
+++ b/docs/use_cases/air_controller_gaze_study/context.md
@@ -19,4 +19,4 @@ While *segment* entry is specific to the [TobiiProGlasses2.SegmentPlayback](../.
}
```
-The [post_processing_pipeline.json](pipeline.md) file mentioned aboved is described in the next chapter.
+The [post_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
diff --git a/docs/use_cases/air_controller_gaze_study/pipeline.md b/docs/use_cases/air_controller_gaze_study/pipeline.md
index 3cff83a..69fdd2c 100644
--- a/docs/use_cases/air_controller_gaze_study/pipeline.md
+++ b/docs/use_cases/air_controller_gaze_study/pipeline.md
@@ -187,17 +187,17 @@ For this use case we need to detect ArUco markers to enable gaze mapping: **ArGa
},
"observers": {
"argaze.utils.UtilsFeatures.LookPerformanceRecorder": {
- "path": "_export/look_performance.csv"
+ "path": "look_performance.csv"
},
"argaze.utils.UtilsFeatures.WatchPerformanceRecorder": {
- "path": "_export/watch_performance.csv"
+ "path": "watch_performance.csv"
}
}
}
}
```
-All the files mentioned aboved are described below.
+All the files mentioned above are described below.
The *ScanPathAnalysisRecorder* and *AOIScanPathAnalysisRecorder* observers objects are defined into the [observers.py](observers.md) file that is described in the next chapter.
@@ -355,12 +355,12 @@ The video file is a record of the sector screen frame image.
## look_performance.csv
-This file contains the logs of *ArUcoCamera.look* method execution info. It is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+This file contains the logs of *ArUcoCamera.look* method execution info. It is created in the folder from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a MacBookPro (2,3GHz Intel Core i9 8 cores), the *look* method execution time is ~4,8ms and it is called ~44 times per second.
+On a MacBookPro (2.3GHz Intel Core i9 8 cores), the *look* method execution time is ~1ms and it is called ~51 times per second.
## watch_performance.csv
-This file contains the logs of *ArUcoCamera.watch* method execution info. It is created into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
+This file contains the logs of *ArUcoCamera.watch* method execution info. It is created in the folder from which the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a MacBookPro (2,3GHz Intel Core i9 8 cores), the *watch* method execution time is ~52ms and it is called ~11 times per second.
+On a MacBookPro (2.3GHz Intel Core i9 8 cores) without CUDA acceleration, the *watch* method execution time is ~52ms and it is called more than 12 times per second.
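Figures like these can be recomputed from the CSV logs. A minimal sketch, assuming a single execution-time column (the actual header names in look_performance.csv and watch_performance.csv should be checked first):

```python
import csv
import io

def summarize_performance(csv_text: str, column: str) -> dict:
    """Return the number of logged calls and the mean execution time.

    The column name is an assumption: verify it against the actual
    header of look_performance.csv / watch_performance.csv.
    """
    times = [float(row[column]) for row in csv.DictReader(io.StringIO(csv_text))]
    return {"calls": len(times), "mean_ms": sum(times) / len(times)}

sample = "execution time (ms)\n4.1\n5.3\n4.6\n"
print(summarize_performance(sample, "execution time (ms)"))
```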
diff --git a/docs/use_cases/pilot_gaze_monitoring/context.md b/docs/use_cases/pilot_gaze_tracking/context.md
index 477276d..8839cb6 100644
--- a/docs/use_cases/pilot_gaze_monitoring/context.md
+++ b/docs/use_cases/pilot_gaze_tracking/context.md
@@ -36,6 +36,6 @@ While *address*, *project*, *participant* and *configuration* entries are specif
}
```
-The [live_processing_pipeline.json](pipeline.md) file mentioned aboved is described in the next chapter.
+The [live_processing_pipeline.json](pipeline.md) file mentioned above is described in the next chapter.
The *IvyBus* observer object is defined into the [observers.py](observers.md) file that is described in a next chapter. \ No newline at end of file
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_tracking/introduction.md
index 7e88c69..7e88c69 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_tracking/introduction.md
diff --git a/docs/use_cases/pilot_gaze_monitoring/observers.md b/docs/use_cases/pilot_gaze_tracking/observers.md
index 5f5bc78..5f5bc78 100644
--- a/docs/use_cases/pilot_gaze_monitoring/observers.md
+++ b/docs/use_cases/pilot_gaze_tracking/observers.md
diff --git a/docs/use_cases/pilot_gaze_monitoring/pipeline.md b/docs/use_cases/pilot_gaze_tracking/pipeline.md
index 1450fed..65fccc3 100644
--- a/docs/use_cases/pilot_gaze_monitoring/pipeline.md
+++ b/docs/use_cases/pilot_gaze_tracking/pipeline.md
@@ -122,7 +122,7 @@ For this use case we need to detect ArUco markers to enable gaze mapping: **ArGa
}
```
-All the files mentioned aboved are described below.
+All the files mentioned above are described below.
The *ArUcoCameraLogger* observer object is defined into the [observers.py](observers.md) file that is described in the next chapter.
@@ -302,10 +302,10 @@ This file defines the place of the AOI into the PFD frame. AOI positions have be
This file contains the logs of *ArUcoCamera.look* method execution info. It is saved into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a Jetson Xavier computer, the *look* method execution time is ~0.5ms and it is called ~100 times per second.
+On a Jetson Xavier computer, the *look* method execution time is 5.7ms and it is called ~100 times per second.
## watch_performance.csv
This file contains the logs of *ArUcoCamera.watch* method execution info. It is saved into an *_export* folder from where the [*load* command](../../user_guide/utils/main_commands.md) is launched.
-On a Jetson Xavier computer, the *watch* method execution time is ~50ms and it is called ~10 times per second.
+On a Jetson Xavier computer with CUDA acceleration, the *watch* method execution time is 46.5ms and it is called more than 12 times per second.
diff --git a/docs/user_guide/eye_tracking_context/configuration_and_execution.md b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
index e1123fb..3deeb57 100644
--- a/docs/user_guide/eye_tracking_context/configuration_and_execution.md
+++ b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
@@ -4,7 +4,10 @@ Edit and execute context
The [utils.contexts module](../../argaze.md/#argaze.utils.contexts) provides ready-made contexts like:
* [Tobii Pro Glasses 2](context_modules/tobii_pro_glasses_2.md) data capture and data playback contexts,
-* [Pupil Labs](context_modules/pupil_labs.md) data capture context,
+* [Tobii Pro Glasses 3](context_modules/tobii_pro_glasses_3.md) data capture context,
+* [Pupil Labs Invisible](context_modules/pupil_labs_invisible.md) data capture context,
+* [Pupil Labs Neon](context_modules/pupil_labs_neon.md) data capture context,
+* [File](context_modules/file.md) data playback contexts,
* [OpenCV](context_modules/opencv.md) window cursor position capture and movie playback,
* [Random](context_modules/random.md) gaze position generator.
diff --git a/docs/user_guide/eye_tracking_context/context_modules/file.md b/docs/user_guide/eye_tracking_context/context_modules/file.md
new file mode 100644
index 0000000..5b5c8e9
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/file.md
@@ -0,0 +1,75 @@
+File
+======
+
+ArGaze provides ready-made contexts to read data from various file formats.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## CSV
+
+::: argaze.utils.contexts.File.CSV
+
+### JSON sample: split case
+
+To use when gaze position coordinates are split into two separate columns.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV": {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_splitted.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "x_column": "Gaze Position X (px)",
+ "y_column": "Gaze Position Y (px)",
+ "pipeline": ...
+ }
+}
+```
+
+### JSON sample: joined case
+
+To use when gaze position coordinates are joined as a list in a single column.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV" : {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_xy_joined.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "xy_column": "Gaze Position (px)",
+ "pipeline": ...
+ }
+}
+```
+
+### JSON sample: left and right eyes
+
+To use when gaze position coordinates and validity are given for each eye in six separate columns.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV": {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_left_right_eyes.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "left_eye_x_column": "Left eye X",
+ "left_eye_y_column": "Left eye Y",
+ "left_eye_validity_column": "Left eye validity",
+ "right_eye_x_column": "Right eye X",
+ "right_eye_y_column": "Right eye Y",
+ "right_eye_validity_column": "Right eye validity",
+ "rescale_to_pipeline_size": true,
+ "pipeline": ...
+ }
+}
+```
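To make the column mappings above concrete, here is a minimal sketch (illustration only, not the ArGaze implementation) of how the split case turns CSV rows into timestamped gaze positions:

```python
import csv
import io

def read_gaze_positions(csv_text, timestamp_column, x_column, y_column):
    """Yield (timestamp, (x, y)) tuples from a CSV with split coordinate columns.

    Hypothetical helper for illustration: the File.CSV context performs
    this kind of mapping internally.
    """
    for row in csv.DictReader(io.StringIO(csv_text)):
        yield float(row[timestamp_column]), (float(row[x_column]), float(row[y_column]))

sample = (
    "Timestamp (ms),Gaze Position X (px),Gaze Position Y (px)\n"
    "0,960,540\n"
    "20,964,538\n"
)
for ts, xy in read_gaze_positions(sample, "Timestamp (ms)",
                                  "Gaze Position X (px)", "Gaze Position Y (px)"):
    print(ts, xy)
```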
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
new file mode 100644
index 0000000..1f4a94f
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
@@ -0,0 +1,32 @@
+Pupil Labs Invisible
+==========
+
+ArGaze provides a ready-made context to work with the Pupil Labs Invisible device.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.PupilLabsInvisible.LiveStream
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.PupilLabsInvisible.LiveStream": {
+ "name": "Pupil Labs Invisible live stream",
+ "project": "my_experiment",
+ "pipeline": ...
+ }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
index d2ec336..535f5d5 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
@@ -1,7 +1,7 @@
-Pupil Labs
+Pupil Labs Neon
==========
-ArGaze provides a ready-made context to work with Pupil Labs devices.
+ArGaze provides a ready-made context to work with the Pupil Labs Neon device.
To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
Notice that the *pipeline* entry is mandatory.
@@ -17,14 +17,14 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
## Live Stream
-::: argaze.utils.contexts.PupilLabs.LiveStream
+::: argaze.utils.contexts.PupilLabsNeon.LiveStream
### JSON sample
```json
{
- "argaze.utils.contexts.PupilLabs.LiveStream": {
- "name": "Pupil Labs live stream",
+ "argaze.utils.contexts.PupilLabsNeon.LiveStream": {
+ "name": "Pupil Labs Neon live stream",
"project": "my_experiment",
"pipeline": ...
}
diff --git a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
new file mode 100644
index 0000000..3d37fcc
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
@@ -0,0 +1,32 @@
+Tobii Pro Glasses 3
+===================
+
+ArGaze provides a ready-made context to work with Tobii Pro Glasses 3 devices.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.TobiiProGlasses3.LiveStream
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.TobiiProGlasses3.LiveStream": {
+ "name": "Tobii Pro Glasses 3 live stream",
+ "pipeline": ...
+ }
+}
+```
+
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 59df85b..c7560eb 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -20,6 +20,29 @@ Load **random_context.json** file to generate random gaze positions:
python -m argaze load ./src/argaze/utils/demo/random_context.json
```
+## CSV file context
+
+Load the **csv_file_context_xy_joined.json** file to analyze gaze positions from a CSV file where gaze position coordinates are joined as a list in a single column:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_joined.json
+```
+
+Load the **csv_file_context_xy_splitted.json** file to analyze gaze positions from a CSV file where gaze position coordinates are split into two separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_splitted.json
+```
+
+Load the **csv_file_context_left_right_eyes.json** file to analyze gaze positions from a CSV file where gaze position coordinates and validity are given for each eye in six separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_left_right_eyes.json
+```
+
+!!! note
+	The left/right eyes context allows parsing Tobii Spectrum data, for example.
+
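How per-eye data collapses into a single gaze position can be sketched as follows (a hypothetical merging rule that averages the valid eyes; the actual File.CSV context defines its own validity handling):

```python
def merge_eyes(left, right, left_valid, right_valid):
    """Average the coordinates of the valid eyes; return None if neither is valid.

    Hypothetical illustration of per-eye merging, not the ArGaze implementation.
    """
    valid = [p for p, ok in ((left, left_valid), (right, right_valid)) if ok]
    if not valid:
        return None
    return (
        sum(x for x, _ in valid) / len(valid),
        sum(y for _, y in valid) / len(valid),
    )

print(merge_eyes((100, 200), (110, 210), True, True))   # (105.0, 205.0)
print(merge_eyes((100, 200), (110, 210), True, False))  # (100.0, 200.0)
```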
## OpenCV
### Cursor context
@@ -40,7 +63,7 @@ python -m argaze load ./src/argaze/utils/demo/opencv_movie_context.json
### Camera context
-Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution and to reduce the value of the *sides_mask*.
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution and to set a consistent *sides_mask* value.
Edit **opencv_camera_context.json** file as to select camera device identifier (default is 0).
@@ -57,7 +80,9 @@ python -m argaze load ./src/argaze/utils/demo/opencv_camera_context.json
!!! note
This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
-Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*, *project* or *participant* names and setup Tobii *configuration* parameters:
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit the **tobii_g2_live_stream_context.json** file to select existing IP *address*, *project* or *participant* names and set up the Tobii *configuration* parameters:
```json
{
@@ -80,15 +105,17 @@ Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*
}
```
-Then, load **tobii_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load the **tobii_g2_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_live_stream_context.json
```
### Segment playback context
-Edit **tobii_segment_playback_context.json** file to select an existing Tobii *segment* folder:
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit **tobii_g2_segment_playback_context.json** file to select an existing Tobii *segment* folder:
```json
{
@@ -100,12 +127,28 @@ Edit **tobii_segment_playback_context.json** file to select an existing Tobii *s
}
```
-Then, load **tobii_segment_playback_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load the **tobii_g2_segment_playback_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_segment_playback_context.json
```
+## Tobii Pro Glasses 3
+
+### Live stream context
+
+!!! note
+	This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
+
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Load the **tobii_g3_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/tobii_g3_live_stream_context.json
+```
+
## Pupil Invisible
### Live stream context
@@ -113,8 +156,25 @@ python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.jso
!!! note
This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
-Load **pupillabs_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1088, 1080]) and to set the *sides_mask* value to 4.
+
+Load the **pupillabs_invisible_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/pupillabs_invisible_live_stream_context.json
+```
+
+## Pupil Neon
+
+### Live stream context
+
+!!! note
+	This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
+
+Edit the **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1600, 1200]) and to set the *sides_mask* value to 200.
+
+Load the **pupillabs_neon_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into the AOI:
```shell
-python -m argaze load ./src/argaze/utils/demo/pupillabs_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/pupillabs_neon_live_stream_context.json
```
diff --git a/docs/user_guide/utils/main_commands.md b/docs/user_guide/utils/main_commands.md
index c4887a4..9227d8d 100644
--- a/docs/user_guide/utils/main_commands.md
+++ b/docs/user_guide/utils/main_commands.md
@@ -54,3 +54,6 @@ Modify the content of JSON CONFIGURATION file with another JSON CHANGES file the
```shell
python -m argaze edit CONFIGURATION CHANGES OUTPUT
```
+
+!!! note
+	Use a *null* value to remove an entry.
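For instance, a CHANGES file like the following (entry names are placeholders, not real ArGaze keys) would overwrite one entry of the CONFIGURATION file and remove another:

```json
{
	"existing_entry_to_update": "new value",
	"existing_entry_to_remove": null
}
```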