Diffstat (limited to 'docs/index.md')
-rw-r--r-- docs/index.md | 18
1 file changed, 15 insertions(+), 3 deletions(-)
diff --git a/docs/index.md b/docs/index.md
index 2d00d16..ca9271a 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -7,20 +7,26 @@ title: What is ArGaze?
**Useful links**: [Installation](installation.md) | [Source Repository](https://gitpub.recherche.enac.fr/argaze) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:argaze-contact@recherche.enac.fr)
**ArGaze** is an open and flexible Python software library designed to provide a unified and modular approach to gaze analysis or gaze interaction.
-**ArGaze** facilitates **real-time and/or post-processing analysis** for both **screen-based and head-mounted** eye tracking systems.
+
By offering a wide array of gaze metrics and supporting easy extension to incorporate additional metrics, **ArGaze** empowers researchers and practitioners to explore novel analytical approaches efficiently.
![ArGaze pipeline](img/argaze_pipeline.png)
+## Eye tracking context
+
+**ArGaze** facilitates the integration of both **screen-based and head-mounted** eye tracking systems, for **live data capture and post-hoc data playback**.
+
+[Learn how to handle various eye tracking contexts by reading the dedicated user guide section](./user_guide/eye_tracking_context/introduction.md).
+
## Gaze analysis pipeline
-**ArGaze** provides an extensible modules library, allowing to select application-specific algorithms at each pipeline step:
+Once incoming eye tracking data is available, **ArGaze** provides an extensible module library, allowing you to select application-specific algorithms at each pipeline step:
* **Fixation/Saccade identification**: dispersion threshold identification, velocity threshold identification, etc.
* **Area Of Interest (AOI) matching**: focus point inside, deviation circle coverage, etc.
* **Scan path analysis**: transition matrix, entropy, explore/exploit ratio, etc.
-Once the incoming data is formatted as required, all those gaze analysis features can be used with any screen-based eye tracker devices.
+All those gaze analysis features can be used with any screen-based eye tracker device.
[Learn how to build gaze analysis pipelines for various use cases by reading the dedicated user guide section](./user_guide/gaze_analysis_pipeline/introduction.md).
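To make the pipeline steps above concrete, here is a minimal sketch of dispersion-threshold identification (I-DT), the first algorithm named in the fixation/saccade identification step. This is an illustrative implementation only, not the ArGaze API: the function names, thresholds, and tuple-based sample format are all hypothetical.

```python
# Illustrative I-DT sketch -- NOT the ArGaze API.
# Samples are (x, y) tuples captured at a fixed rate; thresholds are hypothetical.

def dispersion(window):
    """Bounding-box dispersion (width + height) of a sample window."""
    xs = [x for x, _ in window]
    ys = [y for _, y in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(points, max_dispersion=25.0, min_duration=3):
    """Group gaze samples into fixations.

    Returns a list of (centroid, start_index, end_index) triples, where
    indices delimit the half-open sample range [start, end).
    """
    fixations = []
    i = 0
    while i + min_duration <= len(points):
        window = points[i:i + min_duration]
        if dispersion(window) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            j = i + min_duration
            while j < len(points) and dispersion(points[i:j + 1]) <= max_dispersion:
                j += 1
            window = points[i:j]
            cx = sum(x for x, _ in window) / len(window)
            cy = sum(y for _, y in window) / len(window)
            fixations.append(((cx, cy), i, j))
            i = j
        else:
            i += 1  # Sample belongs to a saccade; slide the window forward.
    return fixations
```

Velocity-threshold identification (I-VT) differs only in its grouping criterion: it classifies each sample by point-to-point velocity instead of window dispersion.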
@@ -37,3 +43,9 @@ This ArUco marker pipeline can be combined with any wearable eye tracking device
!!! note
*ArUco marker pipeline is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
+
+## Demonstration
+
+![type:video](https://achil.recherche.enac.fr/videos/argaze_features.mp4)
+
+[Test **ArGaze** by reading the dedicated user guide section](./user_guide/utils/demonstrations_scripts.md).
\ No newline at end of file