path: root/docs/index.md
author Théo de la Hogue 2023-08-10 09:04:31 +0200
committer Théo de la Hogue 2023-08-10 09:04:31 +0200
commit 293d1cc9b0fe6d7e871511cd716001f5765d9118 (patch)
tree 444cff250f3a3e9997288dedf1d88c6dd8499209 /docs/index.md
parent 80e122453c1120b4211015a3b7625be089db8a9f (diff)
Working on gaze analysis pipeline documentation. Still in progress...
Diffstat (limited to 'docs/index.md')
-rw-r--r-- docs/index.md | 26
1 file changed, 21 insertions(+), 5 deletions(-)
diff --git a/docs/index.md b/docs/index.md
index 7e679e3..af57d2b 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -2,18 +2,34 @@
title: What is ArGaze?
---
-# Enable gaze tracking in AR environment
+# Enable a modular gaze processing pipeline
**Useful links**: [Installation](installation) | [Source Repository](https://git.recherche.enac.fr/projects/argaze/repository) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:achil-contact@recherche.enac.fr)
-**ArGaze** python toolkit provides solutions to build 3D modeled **Augmented Reality (AR)** environment defining **Areas Of Interest (AOI)** mapped on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a> and so ease experimentation design with wearable eye tracker device.
+The **ArGaze** Python toolkit provides a set of classes to build custom gaze processing pipelines that work with any kind of eye tracker device.
-Further, tracked gaze can be projected onto AR environment for live or post **gaze analysis** thanks to **timestamped data** features.
+![ArGaze pipeline](img/argaze_pipeline.png)
+
+## Gaze analysis pipeline
+
+Whether in real time or in post-processing, **ArGaze** provides an extensible plugin library that lets you select an application-specific algorithm at each pipeline step:
+
+* **Fixation/Saccade identification**: dispersion threshold identification, velocity threshold identification, ...
+* **Area Of Interest (AOI) matching**: fixation deviation circle matching, ...
+* **Scan path analysis**: transition matrix, entropy, exploit/explore ratio, ...
+
+Once incoming data are formatted as required, all these gaze analysis features can be used with any screen-based eye tracker device.
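+
+To make the first pipeline step concrete, here is a minimal sketch of dispersion threshold identification (the classic I-DT algorithm) in plain Python. This is an illustration of the algorithm only, not the ArGaze API; the `(timestamp_ms, x, y)` sample format is a hypothetical one chosen for the example:

```python
def idt_fixations(samples, max_dispersion=25.0, min_duration=100):
    """Dispersion threshold identification (I-DT) sketch.

    samples: iterable of (timestamp_ms, x, y) gaze positions.
    Returns a list of (start_ts, end_ts, centroid_x, centroid_y) fixations.
    """
    def close_window(window):
        # Keep the window as a fixation only if it lasted long enough
        if window and window[-1][0] - window[0][0] >= min_duration:
            n = len(window)
            return (window[0][0], window[-1][0],
                    sum(s[1] for s in window) / n,
                    sum(s[2] for s in window) / n)
        return None

    fixations = []
    window = []
    for sample in samples:
        window.append(sample)
        xs = [s[1] for s in window]
        ys = [s[2] for s in window]
        # Dispersion: (max x - min x) + (max y - min y)
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
            # Window too spread out: close it without the new sample
            window.pop()
            fixation = close_window(window)
            if fixation:
                fixations.append(fixation)
            window = [sample]
    fixation = close_window(window)
    if fixation:
        fixations.append(fixation)
    return fixations
```

For example, feeding it samples that dwell at one point, jump, then dwell at another point yields two fixations separated by the saccade.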
+
+[Learn how to build gaze analysis pipelines for various use cases in the dedicated section of the user guide](./user_guide/gaze_analysis_pipeline/introduction).
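+
+For the scan path analysis step, a transition matrix and its entropy can be sketched from a sequence of fixated AOIs. Again, this is a generic illustration of the technique, not the ArGaze API, and the AOI labels are made up:

```python
from collections import Counter
import math

def transition_matrix(aoi_sequence):
    """Count transitions between consecutive AOI fixations and
    normalize each row into a probability distribution."""
    counts = Counter(zip(aoi_sequence, aoi_sequence[1:]))
    aois = sorted(set(aoi_sequence))
    matrix = {}
    for src in aois:
        row_total = sum(counts[(src, dst)] for dst in aois)
        matrix[src] = {dst: (counts[(src, dst)] / row_total if row_total else 0.0)
                       for dst in aois}
    return matrix

def transition_entropy(matrix):
    """Shannon entropy (in bits) of each row's transition distribution:
    0 means fully predictable transitions from that AOI."""
    return {src: -sum(p * math.log2(p) for p in row.values() if p > 0)
            for src, row in matrix.items()}
```

With the scan path `["A", "B", "A", "C", "A", "B"]`, transitions out of `A` split 2/3 toward `B` and 1/3 toward `C`, while `B` and `C` always return to `A` (zero entropy rows).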
+
+## Augmented reality pipeline
+
+Things get harder when gaze data come from head-mounted eye tracker devices. That's why **ArGaze** enables the description of a 3D modeled **Augmented Reality (AR)** environment, including **Areas Of Interest (AOI)** mapped on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>.
![AR environment axis](img/ar_environment_axis.png)
-ArGaze can be combined with any wearable eye tracking device python library like Tobii or Pupil glasses.
+This AR pipeline can be combined with any wearable eye tracking device Python library, such as those for Tobii or Pupil glasses.
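+
+The core geometric idea can be sketched in a simplified 2D form: once marker corners are detected in the scene camera image and their positions on an AOI plane are known, a homography maps a gaze point from the image onto that plane. This NumPy sketch is illustrative only; the actual pipeline works from full 3D pose estimation, and the corner correspondences below are invented for the example:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src using the
    direct linear transform (null space of the stacked A h = 0 system)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography coefficients span the (near-)null space of A,
    # i.e. the right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(np.array(A))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def project(H, point):
    """Apply homography H to a 2D point (homogeneous coordinates)."""
    x, y, w = H @ np.array([point[0], point[1], 1.0])
    return x / w, y / w
```

Given four detected marker corners in the image and their known AOI-plane coordinates, `project(H, gaze_point)` tells where on the AOI the wearer was looking.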
!!! note
- *This work is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
+ *The AR pipeline is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz's article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*