Diffstat (limited to 'docs/index.md')
-rw-r--r-- | docs/index.md | 18 |
1 file changed, 9 insertions, 9 deletions
diff --git a/docs/index.md b/docs/index.md
index 3c398ba..3784bdd 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -2,36 +2,36 @@ title: What is ArGaze?
 ---
 
-# Build real-time or post-processing eye tracking applications
+# Develop post- or real-time gaze processing applications
 
 **Useful links**: [Installation](installation.md) | [Source Repository](https://git.recherche.enac.fr/projects/argaze/repository) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:achil-contact@recherche.enac.fr)
 
-**ArGaze** python toolkit provides a set of classes to build **custom-made gaze processing pipelines** that works with **any kind of eye tracker devices** whether on **live data stream** or for **data post-processing**.
+**ArGaze** is a Python software library that lets you build **custom-made gaze processing pipelines** for **any kind of eye tracker device**, whether for **post- or real-time data processing**.
 
 ![ArGaze pipeline](img/argaze_pipeline.png)
 
 ## Gaze analysis pipeline
 
-First of all, **ArGaze** provides extensible modules library allowing to select application specific algorithms at each pipeline step:
+**ArGaze** provides an extensible module library, allowing you to select application-specific algorithms at each pipeline step:
 
 * **Fixation/Saccade identification**: dispersion threshold identification, velocity threshold identification, ...
 * **Area Of Interest (AOI) matching**: focus point inside, deviation circle coverage, ...
 * **Scan path analysis**: transition matrix, entropy, explore/exploit ratio, ...
 
-Once incoming data are formatted as required, all those gaze analysis features can be used with any screen-based eye tracker devices.
+Once the incoming data is formatted as required, all those gaze analysis features can be used with any screen-based eye tracker device.
 
-[Learn how to build gaze analysis pipelines for various use cases by reading user guide dedicated section](./user_guide/gaze_analysis_pipeline/introduction.md).
+[Learn how to build gaze analysis pipelines for various use cases by reading the dedicated user guide section](./user_guide/gaze_analysis_pipeline/introduction.md).
 
-## Augmented reality based on ArUco markers pipeline
+## Augmented reality based on ArUco marker pipeline
 
 Things goes harder when gaze data comes from head-mounted eye tracker devices. That's why **ArGaze** provides **Augmented Reality (AR)** support to map **Areas Of Interest (AOI)** on [OpenCV ArUco markers](https://www.sciencedirect.com/science/article/abs/pii/S0031320314000235).
 
 ![ArUco pipeline axis](img/aruco_pipeline_axis.png)
 
-This ArUco markers pipeline can be combined with any wearable eye tracking device python library like Tobii or Pupill glasses.
+This ArUco marker pipeline can be combined with any wearable eye tracking device Python library, like Tobii or Pupil glasses.
 
-[Learn how to build ArUco markers pipelines for various use cases by reading user guide dedicated section](./user_guide/aruco_markers_pipeline/introduction.md).
+[Learn how to build ArUco marker pipelines for various use cases by reading the dedicated user guide section](./user_guide/aruco_marker_pipeline/introduction.md).
 
 !!! note
-    *ArUco markers pipeline is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
+    *ArUco marker pipeline is greatly inspired by the [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
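The revised docs list dispersion threshold identification among the fixation/saccade algorithms. As a rough sketch of what that pipeline step does, here is a generic I-DT implementation over `(x, y)` gaze samples; this is an illustration only, not ArGaze's API, and the function names, parameters, and sample format are assumptions:

```python
def dispersion(window):
    """Spread of a window of (x, y) points: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(points, max_dispersion=25.0, min_duration=3):
    """Group (x, y) gaze points, sampled at a fixed rate, into fixations.

    Returns a list of (centroid_x, centroid_y, sample_count) tuples.
    """
    fixations = []
    i = 0
    while i + min_duration <= len(points):
        window = points[i:i + min_duration]
        if dispersion(window) <= max_dispersion:
            # Grow the window while its dispersion stays under the threshold.
            j = i + min_duration
            while j < len(points) and dispersion(points[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in points[i:j]]
            ys = [p[1] for p in points[i:j]]
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys), j - i))
            i = j
        else:
            i += 1
    return fixations
```

Samples between fixations are treated as saccade points; the velocity threshold variant mentioned in the docs would instead classify each sample by point-to-point speed.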
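The docs also name a "focus point inside" strategy for AOI matching: a fixation matches an AOI when its focus point lies inside the AOI shape. A minimal sketch of that idea, using a standard ray-casting point-in-polygon test; again a generic illustration with hypothetical names, not the ArGaze API:

```python
def point_in_polygon(point, polygon):
    """Ray-casting test: is (x, y) inside the polygon (list of vertices)?"""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count crossings of a horizontal ray going right from the point.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def match_aoi(focus_point, aois):
    """Return the name of the first AOI polygon containing the focus point, or None."""
    for name, polygon in aois.items():
        if point_in_polygon(focus_point, polygon):
            return name
    return None
```

The "deviation circle coverage" alternative mentioned alongside it would instead score how much of a circle around the focus point overlaps each AOI, giving a softer match.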