author	Théo de la Hogue	2023-09-06 08:36:39 +0200
committer	Théo de la Hogue	2023-09-06 08:36:39 +0200
commit	38f1e22702bc08a3494728874c59978e868cdf03 (patch)
tree	9c5b42c18fe92b2d61982960cb53e3176a6d1218 /docs/index.md
parent	4a4c8f01edc5a644e2a1018589b37cb0ff74146f (diff)
Improving ArUco markers pipeline description in ArGaze presentation.
Diffstat (limited to 'docs/index.md')
-rw-r--r--	docs/index.md	| 14 ++++++++------
1 file changed, 8 insertions(+), 6 deletions(-)
diff --git a/docs/index.md b/docs/index.md
index c351421..7565c5e 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -8,7 +8,7 @@ title: What is ArGaze?
The **ArGaze** Python toolkit provides a set of classes to build **custom-made gaze processing pipelines** that work with **any kind of eye tracker device**, whether on a **live data stream** or for **data post-processing**.
-![AGaze pipeline](img/argaze_pipeline.png)
+![ArGaze pipeline](img/argaze_pipeline.png)
## Gaze analysis pipeline
@@ -22,14 +22,16 @@ Once incoming data formatted as required, all those gaze analysis features can b
[Learn how to build gaze analysis pipelines for various use cases by reading user guide dedicated section](./user_guide/gaze_analysis_pipeline/introduction.md).
-## Augmented reality pipeline
+## Augmented reality pipeline based on ArUco markers
-Things goes harder when gaze data comes from head-mounted eye tracker devices. That's why **ArGaze** enable 3D modeled **Augmented Reality (AR)** environment description including **Areas Of Interest (AOI)** mapped on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>.
+Things get harder when gaze data comes from head-mounted eye tracker devices. That's why **ArGaze** provides **Augmented Reality (AR)** support to map **Areas Of Interest (AOI)** on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>.
-![AR environment axis](img/ar_environment_axis.png)
+![ArUco pipeline axis](img/aruco_markers_pipeline_axis.png)
-This AR pipeline can be combined with any wearable eye tracking device python library like Tobii or Pupill glasses.
+This ArUco markers pipeline can be combined with any wearable eye tracking device Python library, such as those for Tobii or Pupil glasses.
+
+[Learn how to build ArUco markers pipelines for various use cases by reading user guide dedicated section](./user_guide/aruco_markers_pipeline/introduction.md).
!!! note
- *AR pipeline is greatly inspired by [Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz article](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) about using pose estimation to map gaze to detected fiducial markers.*
+ *The ArUco markers pipeline is greatly inspired by the [article by Andrew T. Duchowski, Vsevolod Peysakhovich and Krzysztof Krejtz](https://git.recherche.enac.fr/attachments/download/1990/Using_Pose_Estimation_to_Map_Gaze_to_Detected_Fidu.pdf) on using pose estimation to map gaze to detected fiducial markers.*