---
title: What is ArGaze?
---

# Develop post- or real-time gaze analysis applications

**Useful links**: [Installation](installation.md) | [Source Repository](https://gitpub.recherche.enac.fr/argaze) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:argaze-contact@recherche.enac.fr)

**ArGaze** is an open and flexible Python software library designed to provide a unified and modular approach to gaze analysis and gaze interaction. By offering a wide array of gaze metrics and supporting easy extension with additional metrics, **ArGaze** empowers researchers and practitioners to explore novel analytical approaches efficiently.

![ArGaze pipeline](img/argaze_pipeline.png)

## Eye tracking context

**ArGaze** facilitates the integration of both **screen-based and head-mounted** eye tracking systems for **real-time and/or post-processing analysis**.

[Learn how to handle various eye tracking contexts by reading the dedicated user guide section](./user_guide/eye_tracking_context/introduction.md).

## Gaze analysis pipeline

Once incoming eye tracking data are available, **ArGaze** provides an extensible module library that lets you select application-specific algorithms at each pipeline step:

* **Fixation/saccade identification**: dispersion threshold identification, velocity threshold identification, etc.
* **Area Of Interest (AOI) matching**: focus point inside, deviation circle coverage, etc.
* **Scan path analysis**: transition matrix, entropy, explore/exploit ratio, etc.

All of these gaze analysis features can be used with any screen-based eye tracking device. A minimal sketch of a dispersion-threshold identifier is given at the end of this page.

[Learn how to build gaze analysis pipelines for various use cases by reading the dedicated user guide section](./user_guide/gaze_analysis_pipeline/introduction.md).

## Augmented reality based on ArUco marker pipeline

Things get harder when gaze data come from a head-mounted eye tracking device. That is why **ArGaze** provides **Augmented Reality (AR)** support to map **Areas Of Interest (AOI)** onto [OpenCV ArUco markers](https://www.sciencedirect.com/science/article/abs/pii/S0031320314000235).

![ArUco pipeline axis](img/aruco_pipeline_axis.png)

This ArUco marker pipeline can be combined with the Python library of any wearable eye tracking device, such as Tobii or Pupil glasses. A minimal marker detection sketch is also given at the end of this page.

[Learn how to build ArUco marker pipelines for various use cases by reading the dedicated user guide section](./user_guide/aruco_marker_pipeline/introduction.md).

!!! note
    *The ArUco marker pipeline is greatly inspired by the [article by Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze onto detected fiducial markers.*

## Demonstration

![type:video](http://achil.recherche.enac.fr/videos/argaze_features.mp4)

[Test **ArGaze** by reading the dedicated user guide section](./user_guide/utils/demonstrations_scripts.md).
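
## Algorithm sketches

To make the pipeline steps above more concrete, here is a minimal sketch of dispersion-threshold (I-DT) fixation identification, the first algorithm family listed under the gaze analysis pipeline. It illustrates the general principle only and is not the ArGaze API; the `(timestamp_ms, x, y)` sample format and the threshold values are assumptions made for the example.

```python
def identify_fixations(gaze_samples, dispersion_max=25.0, duration_min=100.0):
    """Group (timestamp_ms, x, y) gaze samples into fixations.

    A window of consecutive samples is treated as a fixation when its
    spatial dispersion (x range + y range, in pixels) stays below
    dispersion_max for at least duration_min milliseconds.
    """

    def dispersion(samples):
        xs = [x for _, x, _ in samples]
        ys = [y for _, _, y in samples]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    def close(window):
        # Emit a fixation (start, end, centroid) if the window lasted long enough.
        if window and window[-1][0] - window[0][0] >= duration_min:
            cx = sum(x for _, x, _ in window) / len(window)
            cy = sum(y for _, _, y in window) / len(window)
            return {'start': window[0][0], 'end': window[-1][0], 'centroid': (cx, cy)}
        return None

    fixations = []
    window = []

    for sample in gaze_samples:
        window.append(sample)
        if dispersion(window) > dispersion_max:
            # Dispersion exceeded: the fixation (if any) ended at the previous sample.
            fixation = close(window[:-1])
            if fixation:
                fixations.append(fixation)
            window = [sample]

    # A fixation may still be open when the stream ends.
    fixation = close(window)
    if fixation:
        fixations.append(fixation)

    return fixations


# Example: three samples close together followed by a jump elsewhere.
samples = [(0, 100, 100), (50, 102, 101), (120, 99, 103), (170, 400, 300)]
print(identify_fixations(samples))
```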
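
The ArUco marker pipeline relies on detecting fiducial markers in the scene camera image before mapping AOI onto them. As a rough illustration of that first step only (again, not the ArGaze API), the following sketch uses OpenCV's ArUco detector; it assumes OpenCV 4.7 or later and a hypothetical scene camera image file.

```python
import cv2

# Predefined 4x4 marker dictionary and detector (OpenCV >= 4.7 aruco API).
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# Hypothetical frame grabbed from a head-mounted eye tracker scene camera.
frame = cv2.imread('scene_camera_frame.png')
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

corners, ids, rejected = detector.detectMarkers(gray)

if ids is not None:
    # corners[i] holds the four image-plane corners of marker ids[i].
    # With a calibrated camera, cv2.solvePnP can estimate each marker pose,
    # so that AOI defined relative to the marker can be projected into the
    # frame and matched against gaze positions.
    print('Detected markers:', ids.flatten().tolist())
```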