path: root/README.md
author    Théo de la Hogue    2023-06-07 14:34:14 +0200
committer Théo de la Hogue    2023-06-07 14:34:14 +0200
commit    c4552e04e1271a9210a934233beae5be1943d034 (patch)
tree      a44041e544bc700976237bfea9058ec06f9a2904 /README.md
parent    bd9cd27c9d44c072164f564ffffeb22e37106b89 (diff)
Writing User guide and use cases section.
Diffstat (limited to 'README.md')
-rw-r--r-- README.md | 18
1 file changed, 2 insertions(+), 16 deletions(-)
diff --git a/README.md b/README.md
index 31a5b63..f33dd18 100644
--- a/README.md
+++ b/README.md
@@ -1,17 +1,3 @@
-# ArGaze documentation
+# Welcome to the ArGaze package
 
-**Useful links**: [Installation](getting_started#installation) | [Source Repository](https://git.recherche.enac.fr/projects/argaze/repository) | [Issue Tracker](https://git.recherche.enac.fr/projects/argaze/issues) | [Contact](mailto:achil-contact@recherche.enac.fr)
-
-![Logo](logo-large.png){ width=640px }
-
-**ArGaze** is a Python toolkit for gaze tracking in **Augmented Reality (AR) environments**.
-
-The ArGaze toolkit provides solutions to build a 3D-modeled AR environment that defines **Areas Of Interest (AOI)** mapped onto <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>, easing experiment design with wearable eye tracker devices.
-
-Further, tracked gaze can be projected onto the AR environment for live or post-hoc **gaze analysis** thanks to **timestamped data** features.
-
-ArGaze can be combined with the Python library of any wearable eye tracking device, such as Tobii or Pupil glasses.
-
-!!! note
-
-    *This work is greatly inspired by [the article by Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz](https://git.recherche.enac.fr/attachments/download/1942/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) on using pose estimation to map gaze to detected fiducial markers.*
+Please visit the [ArGaze documentation website](http://achil.recherche.enac.fr/features/eye/argaze/index.html) to get started.
\ No newline at end of file
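As a minimal illustration of the fiducial detection mentioned in the removed README text above (AOI mapped onto OpenCV ArUco markers), here is a sketch of ArUco marker detection with OpenCV. It is not taken from the ArGaze source or from this commit; it assumes opencv-contrib-python ≥ 4.7, and `frame.png` is a placeholder image path.

```python
# Illustrative sketch only: detect ArUco fiducial markers with OpenCV,
# the kind of markers the README describes mapping AOI onto.
# Assumes opencv-contrib-python >= 4.7; "frame.png" is a placeholder path.
import cv2

image = cv2.imread("frame.png")
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Build a detector for a predefined 4x4 marker dictionary.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# corners: per-marker pixel coordinates; ids: detected marker identifiers.
corners, ids, rejected = detector.detectMarkers(gray)
print(f"Detected {0 if ids is None else len(ids)} ArUco markers: {ids}")
```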