author     Théo de la Hogue  2024-04-17 17:04:38 +0200
committer  Théo de la Hogue  2024-04-17 17:04:38 +0200
commit     ee10651a9aa0d87fa323423d1a7489798e083b90 (patch)
tree       3198f864c5fe26fca64d6ebd37dbc5a83d263f90 /docs/user_guide/aruco_marker_pipeline
parent     9958a965725764aee7150a67d9f63e241b4c345e (diff)
Improving orthography and grammar.
Diffstat (limited to 'docs/user_guide/aruco_marker_pipeline')
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md |  4
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md |  8
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md | 12
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md | 18
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md | 20
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md | 24
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/aruco_marker_description.md | 10
-rw-r--r--  docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md |  4
8 files changed, 50 insertions, 50 deletions
diff --git a/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md b/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md
index 53c137a..975f278 100644
--- a/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md
+++ b/docs/user_guide/aruco_marker_pipeline/advanced_topics/aruco_detector_configuration.md
@@ -1,7 +1,7 @@
-Improve ArUco markers detection
+Improve ArUco marker detection
===============================
-As explain in [OpenCV ArUco documentation](https://docs.opencv.org/4.x/d1/dcd/structcv_1_1aruco_1_1DetectorParameters.html), ArUco markers detection is highly configurable.
+As explained in the [OpenCV ArUco documentation](https://docs.opencv.org/4.x/d1/dcd/structcv_1_1aruco_1_1DetectorParameters.html), ArUco marker detection is highly configurable.
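
For orientation, here is a minimal sketch of tuning detection at the OpenCV level (this only illustrates the underlying cv2.aruco options; the parameter values are arbitrary examples, and the ArGaze way of loading them is shown below):

```python
import cv2

# Detector parameters object (OpenCV >= 4.7 API; older 4.x versions use
# cv2.aruco.DetectorParameters_create() instead).
parameters = cv2.aruco.DetectorParameters()

# Arbitrary illustrative tweaks: sub-pixel corner refinement and a wider
# adaptive thresholding window.
parameters.cornerRefinementMethod = cv2.aruco.CORNER_REFINE_SUBPIX
parameters.adaptiveThreshWinSizeMax = 53

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_16h5)
detector = cv2.aruco.ArucoDetector(dictionary, parameters)
```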
## Load ArUcoDetector parameters
diff --git a/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md b/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md
index 7bbfc63..625f257 100644
--- a/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md
+++ b/docs/user_guide/aruco_marker_pipeline/advanced_topics/optic_parameters_calibration.md
@@ -7,7 +7,7 @@ A camera device have to be calibrated to compensate its optical distorsion.
## Print calibration board
-The first step to calibrate a camera is to create an [ArUcoBoard](../../../argaze.md/#argaze.ArUcoMarker.ArUcoBoard) like in the code below:
+The first step to calibrating a camera is to create an [ArUcoBoard](../../../argaze.md/#argaze.ArUcoMarker.ArUcoBoard) like in the code below:
```python
from argaze.ArUcoMarker import ArUcoMarkerDictionary, ArUcoBoard
@@ -23,13 +23,13 @@ aruco_board.save('./calibration_board.png', 300)
```
!!! note
- There is **A3_DICT_APRILTAG_16h5_3cm_35cmx25cm.pdf** file located in *./src/argaze/ArUcoMarker/utils/* folder ready to be printed on A3 paper sheet.
+    There is an **A3_DICT_APRILTAG_16h5_3cm_35cmx25cm.pdf** file located in the *./src/argaze/ArUcoMarker/utils/* folder, ready to be printed on an A3 paper sheet.
-Let's print the calibration board before to go further.
+Let's print the calibration board before going further.
## Capture board pictures
-Then, the calibration process needs to make many different captures of an [ArUcoBoard](../../../argaze.md/#argaze.ArUcoMarker.ArUcoBoard) through the camera and then, pass them to an [ArUcoDetector](../../../argaze.md/#argaze.ArUcoMarker.ArUcoDetector.ArUcoDetector) instance to detect board corners and store them as calibration data into an [ArUcoOpticCalibrator](../../../argaze.md/#argaze.ArUcoMarker.ArUcoOpticCalibrator) for final calibration process.
+Then, the calibration process needs to make many different captures of an [ArUcoBoard](../../../argaze.md/#argaze.ArUcoMarker.ArUcoBoard) through the camera, and then pass them to an [ArUcoDetector](../../../argaze.md/#argaze.ArUcoMarker.ArUcoDetector.ArUcoDetector) instance to detect board corners and store them as calibration data in an [ArUcoOpticCalibrator](../../../argaze.md/#argaze.ArUcoMarker.ArUcoOpticCalibrator) for the final calibration process.
![Calibration step](../../../img/optic_calibration_step.png)
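
To make the principle concrete, here is a rough sketch of what the final calibration step boils down to at the OpenCV level (it does not use the ArGaze classes named above; `object_points`, `image_points` and `image_size` are assumed to have been accumulated from the many board captures):

```python
import cv2

# object_points: list of (N, 3) float32 arrays of board corner positions in board space (cm)
# image_points: list of (N, 2) float32 arrays of the matching corners detected in each capture
# image_size: (width, height) of the camera images
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None)

print(f'RMS re-projection error: {rms:.3f}')
print('Camera matrix:', camera_matrix)
print('Distortion coefficients:', dist_coeffs.ravel())
```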
diff --git a/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md b/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md
index 4d5d44c..c81d57d 100644
--- a/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md
+++ b/docs/user_guide/aruco_marker_pipeline/advanced_topics/scripting.md
@@ -1,8 +1,8 @@
Script the pipeline
===================
-All aruco markers pipeline objects are accessible from Python script.
-This could be particularly useful for realtime AR interaction applications.
+All ArUco marker pipeline objects are accessible from a Python script.
+This could be particularly useful for real-time AR interaction applications.
## Load ArUcoCamera configuration from dictionary
@@ -61,7 +61,7 @@ with ArUcoCamera.ArUcoCamera(**configuration) as aruco_camera:
Then, once the configuration is loaded, it is possible to access its attributes: [read the ArUcoCamera code reference](../../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) to get a complete list of what is available.
-Thus, the [ArUcoCamera.scenes](../../../argaze.md/#argaze.ArFeatures.ArCamera) attribute allows to access each loaded aruco scene and so, access to their attributes: [read ArUcoScene code reference](../../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) to get a complete list of what is available.
+Thus, the [ArUcoCamera.scenes](../../../argaze.md/#argaze.ArFeatures.ArCamera) attribute allows access to each loaded ArUcoScene and, in turn, to their attributes: [read the ArUcoScene code reference](../../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) to get a complete list of what is available.
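
For instance, a minimal inspection loop might look like the sketch below (assuming, as the "stored by name" wording used elsewhere in these pages suggests, that `scenes` maps scene names to [ArUcoScene](../../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) instances):

```python
# Hypothetical inspection loop: walk through the loaded scenes by name.
for scene_name, scene in aruco_camera.scenes.items():

    print(f'Scene "{scene_name}":', scene)
```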
```python
from argaze import ArFeatures
@@ -76,7 +76,7 @@ from argaze import ArFeatures
## Pipeline execution outputs
-[ArUcoCamera.watch](../../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method returns data about pipeline execution.
+The [ArUcoCamera.watch](../../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method returns data about pipeline execution.
```python
# Assuming that timestamped images are available
@@ -101,7 +101,7 @@ Let's understand the meaning of each returned data.
### *aruco_camera.aruco_detector.detected_markers()*
-A dictionary containing all detected markers provided by [ArUcoDetector](../../../argaze.md/#argaze.ArUcoMarker.ArUcoDetector) class.
+A dictionary containing all detected markers is provided by the [ArUcoDetector](../../../argaze.md/#argaze.ArUcoMarker.ArUcoDetector) class.
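
For example, a quick way to inspect that dictionary could be the following sketch (assuming it maps marker identifiers to detected marker objects):

```python
# Hypothetical inspection of the detection result.
for marker_id, marker in aruco_camera.aruco_detector.detected_markers().items():

    print(f'Marker {marker_id}:', marker)
```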
## Setup ArUcoCamera image parameters
@@ -133,4 +133,4 @@ aruco_camera_image = aruco_camera.image(**image_parameters)
```
!!! note
- [ArUcoCamera](../../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) inherits from [ArFrame](../../../argaze.md/#argaze.ArFeatures.ArFrame) and so, benefits from all image parameters described in [gaze analysis pipeline visualization section](../../gaze_analysis_pipeline/visualization.md). \ No newline at end of file
+    [ArUcoCamera](../../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) inherits from [ArFrame](../../../argaze.md/#argaze.ArFeatures.ArFrame) and so benefits from all image parameters described in the [gaze analysis pipeline visualization section](../../gaze_analysis_pipeline/visualization.md).
\ No newline at end of file
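
As a follow-up, the resulting image can be displayed with plain OpenCV (a minimal sketch, assuming `aruco_camera_image` is the BGR numpy array returned by `aruco_camera.image(**image_parameters)` above):

```python
import cv2

# Show the rendered ArUcoCamera image in an OpenCV window.
cv2.imshow('ArUcoCamera frame', aruco_camera_image)
cv2.waitKey(1)
```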
diff --git a/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md b/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md
index e8342c7..46422b8 100644
--- a/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md
+++ b/docs/user_guide/aruco_marker_pipeline/aoi_3d_description.md
@@ -1,17 +1,17 @@
Describe 3D AOI
===============
-Now [scene pose is estimated](aruco_marker_description.md) thanks to ArUco markers description, [areas of interest (AOI)](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) need to be described into the same 3D referential.
+Now that the [scene pose is estimated](aruco_marker_description.md) thanks to the ArUco marker description, [areas of interest (AOI)](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) need to be described in the same 3D referential.
-In the example scene, the two screens, the control panel and the window are considered as areas of interest.
+In the example scene, the two screens, the control panel, and the window are considered to be areas of interest.
![3D AOI description](../../img/aoi_3d_description.png)
-All AOI need to be described from same origin than markers in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
+All AOI need to be described from the same origin as the markers, in a [right-handed 3D coordinate frame](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/), where:
* +X is pointing to the right,
* +Y is pointing to the top,
-* +Z is pointing to the backward.
+* +Z is pointing backward.
!!! warning
All AOI spatial values must be given in **centimeters**.
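
As an illustration of this convention, a flat rectangular AOI such as a screen could be described by four corner vertices like the ones sketched below (hypothetical values, in centimeters, with +X pointing right, +Y up and +Z backward):

```python
import numpy as np

# Hypothetical 40 cm x 25 cm vertical screen, 15 cm in front of the origin
# (so at Z = -15 cm), described counter-clockwise in the markers' referential.
screen_aoi = np.array([
    [-20.0,  0.0, -15.0],  # bottom-left corner
    [ 20.0,  0.0, -15.0],  # bottom-right corner
    [ 20.0, 25.0, -15.0],  # top-right corner
    [-20.0, 25.0, -15.0],  # top-left corner
], dtype=np.float32)
```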
@@ -48,15 +48,15 @@ v -35.850000 35.000000 -15.000000
f 13 14 16 15 17
```
-Here are common OBJ file features needed to describe AOI:
+Here are some common OBJ file features needed to describe AOI:
-* Object lines (starting with *o* key) indicate AOI name.
-* Vertice lines (starting with *v* key) indicate AOI vertices.
-* Face (starting with *f* key) link vertices together.
+* Object lines (starting with the *o* key) indicate the AOI name.
+* Vertex lines (starting with the *v* key) indicate AOI vertices.
+* Face lines (starting with the *f* key) link vertices together.
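
For intuition, a tiny hand-rolled parser for just these three line types might look like the sketch below (for illustration only; ArGaze performs its own OBJ loading):

```python
def parse_aoi_obj(path):
    """Collect AOI names and their corner vertices from a minimal OBJ file."""
    vertices, aoi, current_name = [], {}, None
    with open(path) as obj_file:
        for line in obj_file:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == 'o':    # object line: AOI name
                current_name = parts[1]
            elif parts[0] == 'v':  # vertex line: x, y, z coordinates in cm
                vertices.append(tuple(float(value) for value in parts[1:4]))
            elif parts[0] == 'f' and current_name:  # face line: 1-based vertex indexes
                aoi[current_name] = [vertices[int(index.split('/')[0]) - 1] for index in parts[1:]]
    return aoi
```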
### Edit JSON file description
-JSON file format allows to describe AOI vertices.
+The JSON file format allows for the description of AOI vertices.
``` json
{
diff --git a/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md b/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md
index 358c412..da8f15c 100644
--- a/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md
+++ b/docs/user_guide/aruco_marker_pipeline/aoi_3d_frame.md
@@ -1,7 +1,7 @@
Define a 3D AOI as a frame
==========================
-When an 3D AOI of the scene contains others coplanar 3D AOI, like a screen with GUI elements displayed on, it is better to described them as 2D AOI inside 2D coordinates system related to the containing 3D AOI.
+When a 3D AOI of the scene contains other coplanar 3D AOI, like a screen with GUI elements displayed on it, it is better to describe them as 2D AOI inside a 2D coordinate system related to the containing 3D AOI.
![3D AOI frame](../../img/aruco_camera_aoi_frame.png)
@@ -78,7 +78,7 @@ Now, let's understand the meaning of each JSON entry.
### *frames*
-An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) instance can contains multiples [ArFrames](../../argaze.md/#argaze.ArFeatures.ArFrame) stored by name.
+An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) instance can contain multiple [ArFrames](../../argaze.md/#argaze.ArFeatures.ArFrame) stored by name.
### Left_Screen & Right_Screen
@@ -90,7 +90,7 @@ The names of 3D AOI **and** their related [ArFrames](../../argaze.md/#argaze.ArF
!!! warning "Layer name policy"
- An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frame layer is projected into [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) layer, **provided they have the same name**.
+ An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frame layer is projected into an [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) layer, **provided they have the same name**.
!!! note
@@ -100,7 +100,7 @@ The names of 3D AOI **and** their related [ArFrames](../../argaze.md/#argaze.ArF
### Map ArUcoCamera image into ArUcoScenes frames
-After camera image is passed to [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method, it is possible to apply a perpective transformation in order to project watched image into each [ArUcoScenes](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) [frames background](../../argaze.md/#argaze.ArFeatures.ArFrame) image.
+After the camera image is passed to the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method, it is possible to apply a perspective transformation in order to project the watched image into each [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) [frame's background](../../argaze.md/#argaze.ArFeatures.ArFrame) image.
```python
# Assuming that Full HD (1920x1080) timestamped images are available
@@ -113,22 +113,22 @@ After camera image is passed to [ArUcoCamera.watch](../../argaze.md/#argaze.ArFe
aruco_camera.map(timestamp=timestamp)
```
-### Analyse timestamped gaze positions into ArUcoScenes frames
+### Analyze timestamped gaze positions into ArUcoScene frames
-[ArUcoScenes](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frames benefits from all the services described in [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
+[ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frames benefit from all the services described in the [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
!!! note
- Timestamped [GazePositions](../../argaze.md/#argaze.GazeFeatures.GazePosition) passed to [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method are projected into [ArUcoScenes](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frames if applicable.
+ Timestamped [GazePositions](../../argaze.md/#argaze.GazeFeatures.GazePosition) passed to the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method are projected into [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frames if applicable.
-### Display each ArUcoScenes frames
+### Display each ArUcoScene frame
-All [ArUcoScenes](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frames image can be displayed as any [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame).
+All [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) frame images can be displayed like any [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image.
```python
...
- # Display all ArUcoScenes frames
+ # Display all ArUcoScene frames
for frame in aruco_camera.scene_frames:
... frame.image()
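    # A minimal display sketch for the loop above (assumptions: cv2 is imported
    # at the top of the script, frame.image() returns a BGR numpy array and
    # each frame exposes a name attribute):
    for frame in aruco_camera.scene_frames:

        cv2.imshow(f'{frame.name} frame', frame.image())

    cv2.waitKey(1)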
diff --git a/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md b/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md
index ae31075..90fdb38 100644
--- a/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md
+++ b/docs/user_guide/aruco_marker_pipeline/aoi_3d_projection.md
@@ -1,15 +1,15 @@
-Project 3D AOI into camera frame
+Project 3D AOI in the camera frame
================================
-Once [ArUcoScene pose is estimated](pose_estimation.md) and [3D AOI are described](aoi_3d_description.md), AOI can be projected into [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) frame.
+Once the [ArUcoScene pose is estimated](pose_estimation.md) and [3D AOI are described](aoi_3d_description.md), the AOI can be projected in the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) frame.
![3D AOI projection](../../img/aruco_camera_aoi_projection.png)
## Add ArLayer to ArUcoScene to load 3D AOI
-The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class allows to load 3D AOI description.
+The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class allows you to load 3D AOI descriptions.
-Here is the previous extract where one layer is added to [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) configuration:
+Here is the previous extract, where one layer is added to the [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) configuration:
```json
{
@@ -43,7 +43,7 @@ Now, let's understand the meaning of each JSON entry.
### *layers*
-An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) instance can contains multiples [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) stored by name.
+An [ArUcoScene](../../argaze.md/#argaze.ArUcoMarker.ArUcoScene) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) stored by name.
### MyLayer
@@ -51,11 +51,11 @@ The name of an [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically,
### *aoi_scene*
-The set of 3D AOI into the layer as defined at [3D AOI description chapter](aoi_3d_description.md).
+The set of 3D AOI in the layer, as defined in the [3D AOI description chapter](aoi_3d_description.md).
## Add ArLayer to ArUcoCamera to project 3D AOI into
-Here is the previous extract where one layer is added to [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) configuration and displayed:
+Here is the previous extract, where one layer is added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) configuration and displayed:
```json
{
@@ -102,7 +102,7 @@ Now, let's understand the meaning of each JSON entry.
### *layers*
-An [ArUcoCamera](../../argaze.md/#argaze.ArFeatures.ArFrame) instance can contains multiples [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) stored by name.
+An [ArUcoCamera](../../argaze.md/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../argaze.md/#argaze.ArFeatures.ArLayer) stored by name.
### MyLayer
@@ -118,11 +118,11 @@ The name of an [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically,
## Add AOI analysis features to ArUcoCamera layer
-When a scene layer is projected into a camera layer, it means that the 3D scene's AOI are transformed into 2D camera's AOI.
+When a scene layer is projected into a camera layer, it means that the scene's 3D AOI are transformed into the camera's 2D AOI.
-Therefore, it means that [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) benefits from all the services described in [AOI analysis pipeline section](../gaze_analysis_pipeline/aoi_analysis.md).
+Therefore, it means that [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) benefits from all the services described in the [AOI analysis pipeline section](../gaze_analysis_pipeline/aoi_analysis.md).
-Here is the previous extract where AOI matcher, AOI scan path and AOI scan path analyzers are added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) layer:
+Here is the previous extract, where an AOI matcher, an AOI scan path, and AOI scan path analyzers are added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) layer:
```json
{
@@ -171,4 +171,4 @@ Here is the previous extract where AOI matcher, AOI scan path and AOI scan path
!!! warning
- Adding scan path and scan path analyzers to an [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) layer doesn't make sense as the space viewed thru camera frame doesn't necessary reflect the space the gaze is covering.
+    Adding scan path and scan path analyzers to an [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) layer doesn't make sense, as the space viewed through the camera frame doesn't necessarily reflect the space the gaze is covering.
diff --git a/docs/user_guide/aruco_marker_pipeline/aruco_marker_description.md b/docs/user_guide/aruco_marker_pipeline/aruco_marker_description.md
index 1d703d4..ac457dd 100644
--- a/docs/user_guide/aruco_marker_pipeline/aruco_marker_description.md
+++ b/docs/user_guide/aruco_marker_pipeline/aruco_marker_description.md
@@ -57,7 +57,7 @@ Wherever the origin point is, all markers places need to be described on a [righ
### Edit OBJ file description
-OBJ file format could be exported from most 3D editors.
+The OBJ file format can be exported from most 3D editors.
``` obj
o DICT_APRILTAG_16h5#0_Marker
@@ -82,9 +82,9 @@ f 9 10 12 11
Here are some common OBJ file features needed to describe ArUco marker places:
-* Object line (starting with *o* key) indicate marker dictionary and id by following this format: **DICTIONARY**#**ID**\_Marker.
-* Vertice line (starting with *v* key) indicate marker corners. The marker size will be automatically deducted from the geometry.
-* Face line (starting with *f* key) link vertice and normal indexes together.
+* Object lines (starting with the *o* key) indicate the marker dictionary and ID, following this format: **DICTIONARY**#**ID**\_Marker.
+* Vertex lines (starting with the *v* key) indicate marker corners. The marker size will be automatically deduced from the geometry.
+* Face lines (starting with the *f* key) link vertex and normal indexes together.
!!! warning
Markers have to belong to the same dictionary.
@@ -94,7 +94,7 @@ Here are some common OBJ file features needed to describe ArUco markers places:
### Edit JSON file description
-JSON file format allows to describe markers places using translation and euler angle rotation vectors.
+The JSON file format allows for the description of marker places using translation and Euler angle rotation vectors.
``` json
{
diff --git a/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md b/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md
index cadaf61..729608f 100644
--- a/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md
+++ b/docs/user_guide/aruco_marker_pipeline/configuration_and_execution.md
@@ -127,9 +127,9 @@ Pass each camera image to the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeat
... aruco_camera.image()
```
-### Analyse time-stamped gaze positions into the camera frame
+### Analyse timestamped gaze positions into the camera frame
-As mentioned above, [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and, so benefits from all the services described in the [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
+As mentioned above, [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarker.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) and so benefits from all the services described in the [gaze analysis pipeline section](../gaze_analysis_pipeline/introduction.md).
Particularly, timestamped gaze positions can be passed one by one to the [ArUcoCamera.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole pipeline dedicated to gaze analysis.
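
As a rough sketch of that step (the gaze stream, the GazePosition construction and the exact look signature are assumptions here; see the gaze analysis pipeline section for the authoritative form):

```python
from argaze import GazeFeatures

# Assumed eye tracker stream of (timestamp, x, y) samples; the GazePosition
# construction below is illustrative, not authoritative.
for timestamp, x, y in gaze_stream:

    aruco_camera.look(GazeFeatures.GazePosition((x, y), timestamp=timestamp))
```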