From 293d1cc9b0fe6d7e871511cd716001f5765d9118 Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Thu, 10 Aug 2023 09:04:31 +0200
Subject: Working on gaze analysis pipeline documentation. Still in progress...

---
 docs/index.md | 26 +++++++++++++++++++++-----
 1 file changed, 21 insertions(+), 5 deletions(-)

diff --git a/docs/index.md b/docs/index.md
index 7e679e3..af57d2b 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -2,18 +2,34 @@ title: What is ArGaze?
 ---
 
-# Enable gaze tracking in AR environment
+# Enable modular gaze processing pipeline
 
 **Useful links**: [Installation](installation) | [Source Repository](https://git.recherche.enac.fr/projects/argaze/repository) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:achil-contact@recherche.enac.fr)
 
-**ArGaze** python toolkit provides solutions to build 3D modeled **Augmented Reality (AR)** environment defining **Areas Of Interest (AOI)** mapped on OpenCV ArUco markers and so ease experimentation design with wearable eye tracker device.
+The **ArGaze** Python toolkit provides a set of classes to build custom gaze processing pipelines that work with any kind of eye tracker device.
 
-Further, tracked gaze can be projected onto AR environment for live or post **gaze analysis** thanks to **timestamped data** features.
+![ArGaze pipeline](img/argaze_pipeline.png)
+
+## Gaze analysis pipeline
+
+Whether in real time or in post-processing, **ArGaze** provides an extensible plugin library that lets you select an application-specific algorithm at each pipeline step:
+
+* **Fixation/Saccade identification**: dispersion threshold identification, velocity threshold identification, ...
+* **Area Of Interest (AOI) matching**: fixation deviation circle matching, ...
+* **Scan path analysis**: transition matrix, entropy, exploit/explore ratio, ...
+
+Once incoming data is formatted as required, all these gaze analysis features can be used with any screen-based eye tracker device.
+
+[Learn how to build gaze analysis pipelines for various use cases in the dedicated section of the user guide](./user_guide/gaze_analysis_pipeline/introduction).
+
+## Augmented reality pipeline
+
+Things get harder when gaze data comes from a head-mounted eye tracker device. That's why **ArGaze** enables the description of a 3D modeled **Augmented Reality (AR)** environment, including **Areas Of Interest (AOI)** mapped on OpenCV ArUco markers.
 
 ![AR environment axis](img/ar_environment_axis.png)
 
-ArGaze can be combined with any wearable eye tracking device python library like Tobii or Pupil glasses.
+This AR pipeline can be combined with any wearable eye tracking device Python library, such as those for Tobii or Pupil glasses.
 
 !!! note
-    *This work is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
+    *The AR pipeline is greatly inspired by the [article by Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
-- 
cgit v1.1
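To make the "dispersion threshold identification" pipeline step added by this patch concrete, here is a minimal sketch of the classic I-DT algorithm that such a plugin typically implements. This is an illustration only, not the ArGaze API: the names `Fixation`, `identify_fixations`, the `(timestamp, x, y)` sample format, and the threshold values are all assumptions made for the example.

```python
# Illustrative I-DT (dispersion-threshold identification) sketch.
# Hypothetical names and sample format; not the ArGaze API.
from dataclasses import dataclass


@dataclass
class Fixation:
    start: float  # timestamp of first sample (s)
    end: float    # timestamp of last sample (s)
    x: float      # gaze centroid x (px)
    y: float      # gaze centroid y (px)


def _dispersion(window):
    """Spread of a gaze sample window: (max x - min x) + (max y - min y)."""
    xs = [s[1] for s in window]
    ys = [s[2] for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def identify_fixations(samples, max_dispersion=25.0, min_duration=0.1):
    """samples: list of (timestamp, x, y) tuples sorted by time.

    Returns the fixations found; inter-fixation samples are saccades.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow an initial window spanning at least min_duration.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break  # not enough remaining samples for a fixation
        if _dispersion(samples[i:j + 1]) <= max_dispersion:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < n and _dispersion(samples[i:j + 2]) <= max_dispersion:
                j += 1
            window = samples[i:j + 1]
            xs = [s[1] for s in window]
            ys = [s[2] for s in window]
            fixations.append(Fixation(window[0][0], window[-1][0],
                                      sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j + 1
        else:
            i += 1  # slide past a saccade sample
    return fixations
```

With 100 Hz samples jittering around (100, 100) for 0.2 s and then around (300, 300), this yields two fixations separated by one saccade. A velocity threshold identification (I-VT) plugin would differ only in its window test, comparing point-to-point speed against a velocity threshold instead of window dispersion.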