-rw-r--r--  docs/img/ar_frame.png  bin 0 -> 3477 bytes
-rw-r--r--  docs/img/ar_frame_gaze_movement_identifier.png  bin 0 -> 27362 bytes
-rw-r--r--  docs/img/ar_frame_heatmap.png  bin 0 -> 60597 bytes
-rw-r--r--  docs/img/ar_frame_scan_path.png  bin 0 -> 18906 bytes
-rw-r--r--  docs/img/argaze_pipeline.png  bin 0 -> 148893 bytes
-rw-r--r--  docs/img/gaze_analysis_pipeline.png  bin 0 -> 93401 bytes
-rw-r--r--  docs/img/timestamped_gaze_positions.png  bin 0 -> 23134 bytes
-rw-r--r--  docs/index.md  26
-rw-r--r--  docs/requirements.in  1
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/advanced_topics/plugin_loading.md  46
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md  161
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/introduction.md  19
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md  73
-rw-r--r--  docs/user_guide/timestamped_data/introduction.md  2
-rw-r--r--  mkdocs.yml  66
15 files changed, 358 insertions, 36 deletions
diff --git a/docs/img/ar_frame.png b/docs/img/ar_frame.png
new file mode 100644
index 0000000..f368635
--- /dev/null
+++ b/docs/img/ar_frame.png
Binary files differ
diff --git a/docs/img/ar_frame_gaze_movement_identifier.png b/docs/img/ar_frame_gaze_movement_identifier.png
new file mode 100644
index 0000000..8a66cac
--- /dev/null
+++ b/docs/img/ar_frame_gaze_movement_identifier.png
Binary files differ
diff --git a/docs/img/ar_frame_heatmap.png b/docs/img/ar_frame_heatmap.png
new file mode 100644
index 0000000..812cc8f
--- /dev/null
+++ b/docs/img/ar_frame_heatmap.png
Binary files differ
diff --git a/docs/img/ar_frame_scan_path.png b/docs/img/ar_frame_scan_path.png
new file mode 100644
index 0000000..671d6a5
--- /dev/null
+++ b/docs/img/ar_frame_scan_path.png
Binary files differ
diff --git a/docs/img/argaze_pipeline.png b/docs/img/argaze_pipeline.png
new file mode 100644
index 0000000..cad7b5e
--- /dev/null
+++ b/docs/img/argaze_pipeline.png
Binary files differ
diff --git a/docs/img/gaze_analysis_pipeline.png b/docs/img/gaze_analysis_pipeline.png
new file mode 100644
index 0000000..d6140d3
--- /dev/null
+++ b/docs/img/gaze_analysis_pipeline.png
Binary files differ
diff --git a/docs/img/timestamped_gaze_positions.png b/docs/img/timestamped_gaze_positions.png
new file mode 100644
index 0000000..cc08ec0
--- /dev/null
+++ b/docs/img/timestamped_gaze_positions.png
Binary files differ
diff --git a/docs/index.md b/docs/index.md
index 7e679e3..af57d2b 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -2,18 +2,34 @@
title: What is ArGaze?
---
-# Enable gaze tracking in AR environment
+# Enable a modular gaze processing pipeline
**Useful links**: [Installation](installation) | [Source Repository](https://git.recherche.enac.fr/projects/argaze/repository) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:achil-contact@recherche.enac.fr)
-**ArGaze** python toolkit provides solutions to build 3D modeled **Augmented Reality (AR)** environment defining **Areas Of Interest (AOI)** mapped on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a> and so ease experimentation design with wearable eye tracker device.
+**ArGaze** python toolkit provides a set of classes to build custom-made gaze processing pipelines that work with any kind of eye tracker device.
-Further, tracked gaze can be projected onto AR environment for live or post **gaze analysis** thanks to **timestamped data** features.
+![ArGaze pipeline](img/argaze_pipeline.png)
+
+## Gaze analysis pipeline
+
+Whether in real time or in post-processing, **ArGaze** provides an extensible plugin library that lets you select an application-specific algorithm at each pipeline step:
+
+* **Fixation/Saccade identification**: dispersion threshold identification, velocity threshold identification, ...
+* **Area Of Interest (AOI) matching**: fixation deviation circle matching, ...
+* **Scan path analysis**: transition matrix, entropy, exploit/explore ratio, ...
+
+Once incoming data is formatted as required, all those gaze analysis features can be used with any screen-based eye tracker device.
+
+[Learn how to build gaze analysis pipelines for various use cases by reading the dedicated user guide section](./user_guide/gaze_analysis_pipeline/introduction).
+
+## Augmented reality pipeline
+
+Things get harder when gaze data comes from head-mounted eye tracker devices. That's why **ArGaze** enables the description of a 3D modeled **Augmented Reality (AR)** environment including **Areas Of Interest (AOI)** mapped on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>.
![AR environment axis](img/ar_environment_axis.png)
-ArGaze can be combined with any wearable eye tracking device python library like Tobii or Pupil glasses.
+This AR pipeline can be combined with any wearable eye tracking device python library, like Tobii or Pupil glasses.
!!! note
- *This work is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
+ *The AR pipeline is greatly inspired by the [article by Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
diff --git a/docs/requirements.in b/docs/requirements.in
index 647ae23..466edd4 100644
--- a/docs/requirements.in
+++ b/docs/requirements.in
@@ -2,3 +2,4 @@ mkdocs
mkdocstrings[python]
markdown-include
mkdocs-include-markdown-plugin
+mkdocs-video
diff --git a/docs/user_guide/gaze_analysis_pipeline/advanced_topics/plugin_loading.md b/docs/user_guide/gaze_analysis_pipeline/advanced_topics/plugin_loading.md
new file mode 100644
index 0000000..21e1f8b
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/advanced_topics/plugin_loading.md
@@ -0,0 +1,46 @@
+Loading plugins from another package
+====================================
+
+It is possible to load GazeMovementIdentifier, ScanPathAnalyzer or AOIScanPathAnalyzer plugins from another [python package](https://docs.python.org/3/tutorial/modules.html#packages).
+
+To do so, simply prepend the package name to the plugin name in the JSON configuration file:
+
+```
+{
+ ...
+ "gaze_movement_identifier": {
+ "my_package.MyGazeMovementIdentifierMethod": {
+ "specific_plugin_parameter": 0
+ }
+ },
+ ...
+ "scan_path_analyzers": {
+ "my_package.MyScanPathAnalyzerAlgorithm": {
+ "specific_plugin_parameter": 0
+ }
+ },
+ ...
+ "aoi_scan_path_analyzers": {
+ "my_package.MyAOIScanPathAnalyzerAlgorithm": {
+ "specific_plugin_parameter": 0
+ }
+ }
+}
+```
+
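+For reference, here is a minimal sketch of what such a plugin could look like. Everything below is hypothetical: the package and class names simply mirror the JSON example above, and it assumes that ArGaze instantiates the class with the JSON parameters passed as keyword arguments.
+
+```python
+# my_package/__init__.py (hypothetical plugin sketch)
+from argaze import GazeFeatures
+
+class MyGazeMovementIdentifierMethod(GazeFeatures.GazeMovementIdentifier):
+    """Custom identification method loaded from the JSON configuration file."""
+
+    def __init__(self, specific_plugin_parameter: int = 0):
+
+        super().__init__()  # assuming the base class constructor takes no argument
+
+        self.specific_plugin_parameter = specific_plugin_parameter
+
+    # A real plugin would also implement the identification logic
+    # required by the GazeMovementIdentifier base class.
+```
+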
+Then, load your package from the python script where the ArFrame is created.
+
+```python
+from argaze import ArFeatures
+
+# Import your own package
+import my_package
+
+# Load ArFrame
+ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')
+
+# Print the type of each loaded scan path analyzer
+
+for name, scan_path_analyzer in ar_frame.scan_path_analyzers.items():
+ print('scan path analyzer type:', type(scan_path_analyzer))
+```
diff --git a/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
new file mode 100644
index 0000000..00300a8
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
@@ -0,0 +1,161 @@
+Configure and execute ArFrame
+=============================
+
+The [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) class defines a rectangular area where timestamped gaze positions are projected and inside which they are analyzed.
+
+![ArFrame](../../img/ar_frame.png)
+
+## Load JSON configuration file
+
+The [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) internal pipeline is entirely customizable from a JSON configuration file thanks to the [ArFrame.from_json](../../../argaze/#argaze.ArFeatures.ArFrame.from_json) class method.
+
+Here is a simple JSON ArFrame configuration file example:
+
+```json
+{
+ "name": "My FullHD screen",
+ "size": [1920, 1080],
+ "gaze_movement_identifier": {
+ "DispersionThresholdIdentification": {
+ "deviation_max_threshold": 50,
+ "duration_min_threshold": 200
+ }
+ },
+ "scan_path": {
+ "duration_max": 30000
+ },
+ "scan_path_analyzers": {
+ "Basic": {},
+ "ExploitExploreRatio": {
+ "short_fixation_duration_threshold": 0
+ }
+ },
+ "heatmap": {
+ "size": [320, 180],
+ "sigma": 0.025,
+ "buffer": 0
+ }
+}
+```
+
+Then, here is how to load the JSON file:
+
+```python
+from argaze import ArFeatures
+
+# Load ArFrame
+ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')
+
+# Print ArFrame attributes
+print("name:", ar_frame.name)
+print("size:", ar_frame.size)
+print("gaze movement identifier type:", type(ar_frame.gaze_movement_identifier))
+print("scan path:", ar_frame.scan_path)
+
+for module, analyzer in ar_frame.scan_path_analyzers.items():
+ print('scan path analyzer module:', module)
+
+print("heatmap:", ar_frame.heatmap)
+```
+
+Finally, here is what the program writes in console:
+
+```
+name: My FullHD screen
+size: [1920, 1080]
+gaze movement identifier type: <class 'argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier'>
+scan path: []
+scan path analyzer module: argaze.GazeAnalysis.Basic
+scan path analyzer module: argaze.GazeAnalysis.ExploitExploreRatio
+heatmap: Heatmap(size=[320, 180], buffer=0, sigma=0.025)
+```
+
+Now, let's understand the meaning of each JSON entry.
+
+### Name
+
+The name of the [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame). Mostly useful for visualisation purposes.
+
+### Size
+
+The size of the [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) defines the dimensions of the rectangular area where gaze positions are projected. Be aware that gaze positions have to be in the same range of values.
+
+### Gaze Movement Identifier
+
+The first [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) pipeline step is to identify fixations or saccades from consecutive timestamped gaze positions.
+
+![Gaze Movement Identifier](../../img/ar_frame_gaze_movement_identifier.png)
+
+The identification method can be selected by instantiating a particular [GazeMovementIdentifier](../../../argaze/#argaze.GazeFeatures.GazeMovementIdentifier) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [another python package](../advanced_topics/plugin_loading).
+
+In the example file, the chosen identification method is the [Dispersion Threshold Identification (I-DT)](../../../argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification), which has two specific attributes: *deviation_max_threshold* and *duration_min_threshold*.
+
+!!! note
+ In ArGaze, [Fixation](../../../argaze/#argaze.GazeFeatures.Fixation) and [Saccade](../../../argaze/#argaze.GazeFeatures.Saccade) are considered as particular [GazeMovements](../../../argaze/#argaze.GazeFeatures.GazeMovement).
+
+!!! warning
+ JSON *gaze_movement_identifier* entry is mandatory. Without it, the ScanPath and ScanPathAnalyzers steps are disabled.
+
+### Scan Path
+
+The second [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) pipeline step aims to build a [ScanPath](../../../argaze/#argaze.GazeFeatures.ScanPath) defined as a list of [ScanSteps](../../../argaze/#argaze.GazeFeatures.ScanStep), each made of a fixation and the saccade that follows it.
+
+![Scan Path](../../img/ar_frame_scan_path.png)
+
+Once fixations and saccades are identified, they are automatically appended to the ScanPath when this step is enabled.
+
+The [ScanPath.duration_max](../../../argaze/#argaze.GazeFeatures.ScanPath.duration_max) attribute is the duration beyond which older scan steps are removed each time new scan steps are added.
+
+### Scan Path Analyzers
+
+Finally, the last [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) pipeline step consists in passing the previously built [ScanPath](../../../argaze/#argaze.GazeFeatures.ScanPath) to each loaded [ScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.ScanPathAnalyzer).
+
+Each analysis algorithm can be selected by instantiating a particular [ScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.ScanPathAnalyzer) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [another python package](../advanced_topics/plugin_loading).
+
+!!! note
+ JSON *scan_path* entry is not mandatory. If the *scan_path_analyzers* entry is not empty, the ScanPath step is automatically enabled.
+
+### Heatmap
+
+This is an optional [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) pipeline step. It is executed at each new gaze position to update the heatmap image.
+
+![Heatmap](../../img/ar_frame_heatmap.png)
+
+The Heatmap object has three attributes to set its size, the sigma point spreading and an optional buffer size (0 means no buffering).
+
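+As a minimal sketch, the heatmap picture could be displayed with OpenCV. Note that the *image* attribute name is an assumption made for this example; check the ArGaze code reference for the exact attribute name.
+
+```python
+import cv2
+
+# Assuming ar_frame is an ArFrame instance being fed with gaze positions,
+# and assuming (hypothetically) that the heatmap exposes its rendered
+# picture as a numpy array through an image attribute
+cv2.imshow('heatmap', ar_frame.heatmap.image)
+cv2.waitKey(1)
+```
+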
+## Pipeline execution
+
+Timestamped gaze positions have to be passed one by one to the [ArFrame.look](../../../argaze/#argaze.ArFeatures.ArFrame.look) method to execute the whole instantiated pipeline.
+
+```python
+from argaze import GazeFeatures
+
+# Assuming that timestamped gaze positions are available
+...
+
+ # Look ArFrame at a timestamped gaze position
+ movement, scan_path_analysis, _, execution_times, exception = ar_frame.look(timestamp, gaze_position)
+
+ # Check if a movement has been identified
+ if movement.valid and movement.finished:
+
+ # Do something with identified fixation
+ if GazeFeatures.is_fixation(movement):
+ ...
+
+ # Do something with identified saccade
+ elif GazeFeatures.is_saccade(movement):
+ ...
+
+ # Do something with scan path analysis
+ for module, analysis in scan_path_analysis.items():
+ for data, value in analysis.items():
+ ...
+
+ # Do something with pipeline execution times
+ ...
+
+ # Do something with pipeline exception
+ if exception:
+ ...
+```
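+
+Putting the previous chapters together, here is a minimal end-to-end sketch, assuming *ts_gaze_positions* was built as described in the [timestamped gaze positions](../timestamped_gaze_positions_edition) chapter:
+
+```python
+from argaze import ArFeatures, GazeFeatures
+
+# Load ArFrame from its JSON configuration file
+ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')
+
+# Assuming ts_gaze_positions is a GazeFeatures.TimeStampedGazePositions instance
+for timestamp, gaze_position in ts_gaze_positions.items():
+
+    # Execute the whole pipeline at each timestamped gaze position
+    movement, scan_path_analysis, _, execution_times, exception = ar_frame.look(timestamp, gaze_position)
+```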
+
+
diff --git a/docs/user_guide/gaze_analysis_pipeline/introduction.md b/docs/user_guide/gaze_analysis_pipeline/introduction.md
new file mode 100644
index 0000000..568cba2
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/introduction.md
@@ -0,0 +1,19 @@
+Overview
+========
+
+This section explains how to build gaze analysis pipelines for various use cases.
+
+First, let's look at the schema below: it gives an overview of the main notions involved in the following chapters.
+
+![Gaze analysis pipeline](../../img/gaze_analysis_pipeline.png)
+
+To build your own gaze analysis pipeline, you need to know:
+
+* [How to edit timestamped gaze positions](../timestamped_gaze_positions_edition),
+* [How to deal with an ArFrame instance](../ar_frame_configuration_and_execution),
+* [How to setup Areas Of Interest](../ar_frame_aoi_configuration),
+* [How to log resulting gaze analysis](../analysis).
+
+More advanced features are also explained, like:
+
+* [How to load plugins from another package](../advanced_topics/plugin_loading)
diff --git a/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md b/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
new file mode 100644
index 0000000..e7deab2
--- /dev/null
+++ b/docs/user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
@@ -0,0 +1,73 @@
+Edit Timestamped Gaze Positions
+===============================
+
+Whether eye data comes from a file on disk or from a live stream, timestamped gaze positions are required before going further.
+
+![Timestamped Gaze Positions](../../img/timestamped_gaze_positions.png)
+
+## Import gaze positions from CSV file
+
+It is possible to load timestamped gaze positions from a [Pandas DataFrame](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) object which can be loaded from a CSV file.
+
+```python
+from argaze import GazeFeatures
+import pandas
+
+# Load gaze positions from a CSV file into a Pandas DataFrame
+dataframe = pandas.read_csv('gaze_positions.csv', delimiter=",", low_memory=False)
+
+# Convert the Pandas DataFrame into timestamped gaze positions, specifying which column label to use for each field
+ts_gaze_positions = GazeFeatures.TimeStampedGazePositions.from_dataframe(dataframe, timestamp = 'Recording timestamp [ms]', x = 'Gaze point X [px]', y = 'Gaze point Y [px]')
+
+# Iterate over timestamped gaze positions
+for timestamp, gaze_position in ts_gaze_positions.items():
+
+ # Do something with each timestamped gaze position
+ ...
+```
+
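+For reference, a CSV file matching the column labels used above could look like this (values are illustrative):
+
+```
+Recording timestamp [ms],Gaze point X [px],Gaze point Y [px]
+0,960,540
+40,962,538
+80,1103,602
+```
+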
+## Edit gaze positions from live stream
+
+When gaze positions come from a real time input, each gaze position can be created thanks to the [GazePosition](../../../argaze/#argaze.GazeFeatures.GazePosition) class.
+Besides, timestamps can be taken from the incoming data stream or, if not available, created thanks to the python [time package](https://docs.python.org/3/library/time.html).
+
+``` python
+from argaze import GazeFeatures
+
+# Assuming to be inside the function where timestamp_µs, gaze_x and gaze_y values are caught
+...
+ # Convert the microsecond timestamp into a second timestamp
+ timestamp = timestamp_µs * 1e-6
+
+ # Define a basic gaze position
+ gaze_position = GazeFeatures.GazePosition((gaze_x, gaze_y))
+
+ # Do something with each timestamped gaze position
+ ...
+```
+
+``` python
+from argaze import GazeFeatures
+
+import time
+
+# Init timestamp
+start_time = time.time()
+
+# Assuming to be inside the function where only gaze_x and gaze_y values are caught (no timestamp)
+...
+
+ # Create a millisecond timestamp
+ timestamp = int((time.time() - start_time) * 1e3)
+
+ # Define a basic gaze position
+ gaze_position = GazeFeatures.GazePosition((gaze_x, gaze_y))
+
+ # Do something with each timestamped gaze position
+ ...
+```
+
+!!! warning
+ **ArGaze doesn't impose any time unit.** Timestamps can either be integers or floats, in seconds, milliseconds or whatever you need. The only concern is that all time values used in further configurations have to be in the same unit. For example, if timestamps are in milliseconds, a scan path *duration_max* of 30000 means 30 seconds.
+
+Now that we have timestamped gaze positions in the expected format, let's see how to analyze them.
\ No newline at end of file
diff --git a/docs/user_guide/timestamped_data/introduction.md b/docs/user_guide/timestamped_data/introduction.md
index a36daca..df8b9b4 100644
--- a/docs/user_guide/timestamped_data/introduction.md
+++ b/docs/user_guide/timestamped_data/introduction.md
@@ -1,6 +1,6 @@
Timestamped data
================
-Working with wearable eye tracker devices implies to handle various timestamped data like gaze positions, pupils diameter, fixations, saccades, ...
+Working with wearable eye tracker devices implies handling various timestamped data like gaze positions, pupil diameters, fixations, saccades, ...
This section mainly refers to [DataStructures.TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) class.
diff --git a/mkdocs.yml b/mkdocs.yml
index a77d2ad..06e0d2e 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -4,36 +4,42 @@ nav:
- installation.md
- license.md
- User Guide:
- - ArUco Markers:
- - user_guide/aruco_markers/introduction.md
- - user_guide/aruco_markers/dictionary_selection.md
- - user_guide/aruco_markers/markers_creation.md
- - user_guide/aruco_markers/camera_calibration.md
- - user_guide/aruco_markers/markers_detection.md
- - user_guide/aruco_markers/markers_pose_estimation.md
- - user_guide/aruco_markers/markers_scene_description.md
- - Areas Of Interest:
- - user_guide/areas_of_interest/introduction.md
- - user_guide/areas_of_interest/aoi_scene_description.md
- - user_guide/areas_of_interest/aoi_scene_projection.md
- - user_guide/areas_of_interest/vision_cone_filtering.md
- - user_guide/areas_of_interest/aoi_matching.md
- - user_guide/areas_of_interest/heatmap.md
- - Augmented Reality environment:
- - user_guide/ar_environment/introduction.md
- - user_guide/ar_environment/environment_setup.md
- - user_guide/ar_environment/environment_exploitation.md
- - Gaze Analysis:
- - user_guide/gaze_analysis/introduction.md
- - user_guide/gaze_analysis/gaze_position.md
- - user_guide/gaze_analysis/gaze_movement.md
- - user_guide/gaze_analysis/scan_path.md
- - Timestamped data:
- - user_guide/timestamped_data/introduction.md
- - user_guide/timestamped_data/ordered_dictionary.md
- - user_guide/timestamped_data/saving_and_loading.md
- - user_guide/timestamped_data/data_synchronisation.md
- - user_guide/timestamped_data/pandas_dataframe_conversion.md
+ - Gaze Analysis Pipeline:
+ - user_guide/gaze_analysis_pipeline/introduction.md
+ - user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
+ - user_guide/gaze_analysis_pipeline/ar_frame_configuration_and_execution.md
+ - Advanced Topics:
+ - user_guide/gaze_analysis_pipeline/advanced_topics/plugin_loading.md
+# - ArUco Markers:
+# - user_guide/aruco_markers/introduction.md
+# - user_guide/aruco_markers/dictionary_selection.md
+# - user_guide/aruco_markers/markers_creation.md
+# - user_guide/aruco_markers/camera_calibration.md
+# - user_guide/aruco_markers/markers_detection.md
+# - user_guide/aruco_markers/markers_pose_estimation.md
+# - user_guide/aruco_markers/markers_scene_description.md
+# - Areas Of Interest:
+# - user_guide/areas_of_interest/introduction.md
+# - user_guide/areas_of_interest/aoi_scene_description.md
+# - user_guide/areas_of_interest/aoi_scene_projection.md
+# - user_guide/areas_of_interest/vision_cone_filtering.md
+# - user_guide/areas_of_interest/aoi_matching.md
+# - user_guide/areas_of_interest/heatmap.md
+# - Augmented Reality environment:
+# - user_guide/ar_environment/introduction.md
+# - user_guide/ar_environment/environment_setup.md
+# - user_guide/ar_environment/environment_exploitation.md
+# - Gaze Analysis:
+# - user_guide/gaze_analysis/introduction.md
+# - user_guide/gaze_analysis/gaze_position.md
+# - user_guide/gaze_analysis/gaze_movement.md
+# - user_guide/gaze_analysis/scan_path.md
+# - Timestamped data:
+# - user_guide/timestamped_data/introduction.md
+# - user_guide/timestamped_data/ordered_dictionary.md
+# - user_guide/timestamped_data/saving_and_loading.md
+# - user_guide/timestamped_data/data_synchronisation.md
+# - user_guide/timestamped_data/pandas_dataframe_conversion.md
- utils:
- user_guide/utils/ready-made_scripts.md
- user_guide/utils/demonstrations_scripts.md