author | Théo de la Hogue | 2023-06-07 14:34:14 +0200
committer | Théo de la Hogue | 2023-06-07 14:34:14 +0200
commit | c4552e04e1271a9210a934233beae5be1943d034 (patch)
tree | a44041e544bc700976237bfea9058ec06f9a2904 /docs/index.md
parent | bd9cd27c9d44c072164f564ffffeb22e37106b89 (diff)
Writing User guide and use cases section.
Diffstat (limited to 'docs/index.md')
-rw-r--r-- | docs/index.md | 20
1 file changed, 19 insertions(+), 1 deletion(-)
diff --git a/docs/index.md b/docs/index.md
index 563ed56..7e679e3 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -1 +1,19 @@
-{!README.md!}
+---
+title: What is ArGaze?
+---
+
+# Enable gaze tracking in AR environments
+
+**Useful links**: [Installation](installation) | [Source Repository](https://git.recherche.enac.fr/projects/argaze/repository) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:achil-contact@recherche.enac.fr)
+
+The **ArGaze** Python toolkit provides solutions for building a 3D-modeled **Augmented Reality (AR)** environment that defines **Areas Of Interest (AOI)** mapped onto <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>, easing experiment design with wearable eye tracker devices.
+
+Further, tracked gaze can be projected onto the AR environment for live or post-hoc **gaze analysis** thanks to its **timestamped data** features.
+
+![AR environment axis](img/ar_environment_axis.png)
+
+ArGaze can be combined with the Python library of any wearable eye tracking device, such as Tobii or Pupil glasses.
+
+!!! note
+
+    *This work is greatly inspired by the [article by Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) on using pose estimation to map gaze to detected fiducial markers.*