Diffstat (limited to 'docs/user_guide')
-rw-r--r--  docs/user_guide/ar_environment/environment_exploitation.md    8
-rw-r--r--  docs/user_guide/ar_environment/environment_setup.md          10
-rw-r--r--  docs/user_guide/aruco_markers/introduction.md                 2
-rw-r--r--  docs/user_guide/aruco_markers/markers_scene_description.md   34
-rw-r--r--  docs/user_guide/utils/demonstrations_scripts.md               2
5 files changed, 28 insertions, 28 deletions
diff --git a/docs/user_guide/ar_environment/environment_exploitation.md b/docs/user_guide/ar_environment/environment_exploitation.md
index 28d61b9..9e4b236 100644
--- a/docs/user_guide/ar_environment/environment_exploitation.md
+++ b/docs/user_guide/ar_environment/environment_exploitation.md
@@ -1,19 +1,19 @@
Environment exploitation
========================
-Once loaded, [ArEnvironment](../../argaze.md/#argaze.ArFeatures.ArEnvironment) assets can be exploited as illustrated below:
+Once loaded, [ArCamera](../../argaze.md/#argaze.ArFeatures.ArCamera) assets can be exploited as illustrated below:
```python
# Access the AR environment ArUco detector, passing it an image in which to detect ArUco markers
-ar_environment.aruco_detector.detect_markers(image)
+ar_camera.aruco_detector.detect_markers(image)
# Access an AR environment scene
-my_first_scene = ar_environment.scenes['my first AR scene']
+my_first_scene = ar_camera.scenes['my first AR scene']
try:
# Try to estimate AR scene pose from detected markers
- tvec, rmat, consistent_markers = my_first_scene.estimate_pose(ar_environment.aruco_detector.detected_markers)
+ tvec, rmat, consistent_markers = my_first_scene.estimate_pose(ar_camera.aruco_detector.detected_markers)
# Project AR scene into camera image according to the estimated pose
# Optional visual_hfov argument is set to 160° to clip AOI scene according to a cone of vision
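A minimal sketch of the detection step referenced above, assuming `detected_markers` is a dictionary keyed by marker identifier (as the other snippets in this changeset suggest), could look like this:
```python
# Detect ArUco markers in the current camera image
ar_camera.aruco_detector.detect_markers(image)

# Inspect what was detected before trying to estimate any scene pose
for marker_id, marker in ar_camera.aruco_detector.detected_markers.items():
    print('detected marker:', marker_id)
```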
diff --git a/docs/user_guide/ar_environment/environment_setup.md b/docs/user_guide/ar_environment/environment_setup.md
index f18cc61..1f26d26 100644
--- a/docs/user_guide/ar_environment/environment_setup.md
+++ b/docs/user_guide/ar_environment/environment_setup.md
@@ -1,9 +1,9 @@
Environment Setup
=================
-[ArEnvironment](../../argaze.md/#argaze.ArFeatures.ArEnvironment) setup is loaded from JSON file format.
+[ArCamera](../../argaze.md/#argaze.ArFeatures.ArCamera) setup is loaded from a JSON file.
-Each [ArEnvironment](../../argaze.md/#argaze.ArFeatures.ArEnvironment) defines a unique [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) dedicated to detection of markers from a specific [ArUcoMarkersDictionary](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersDictionary) and with a given size. However, it is possible to load multiple [ArScene](../../argaze.md/#argaze.ArFeatures.ArScene) into a same [ArEnvironment](../../argaze.md/#argaze.ArFeatures.ArEnvironment).
+Each [ArCamera](../../argaze.md/#argaze.ArFeatures.ArCamera) defines a unique [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) dedicated to the detection of markers from a specific [ArUcoMarkersDictionary](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersDictionary) and with a given size. However, it is possible to load multiple [ArScene](../../argaze.md/#argaze.ArFeatures.ArScene) into the same [ArCamera](../../argaze.md/#argaze.ArFeatures.ArCamera).
Here is a JSON environment file example, where the mentioned .obj files are assumed to be located relative to the environment file on disk.
@@ -54,13 +54,13 @@ Here is JSON environment file example where it is assumed that mentioned .obj fi
},
"scenes": {
"my first AR scene" : {
- "aruco_scene": "./first_scene/markers.obj",
+ "aruco_markers_group": "./first_scene/markers.obj",
"aoi_scene": "./first_scene/aoi.obj",
"angle_tolerance": 15.0,
"distance_tolerance": 2.54
},
"my second AR scene" : {
- "aruco_scene": "./second_scene/markers.obj",
+ "aruco_markers_group": "./second_scene/markers.obj",
"aoi_scene": "./second_scene/aoi.obj",
"angle_tolerance": 15.0,
"distance_tolerance": 2.54
@@ -73,5 +73,5 @@ Here is JSON environment file example where it is assumed that mentioned .obj fi
from argaze import ArFeatures
# Load AR environment
-ar_environment = ArFeatures.ArEnvironment.from_json('./environment.json')
+ar_camera = ArFeatures.ArCamera.from_json('./environment.json')
```
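A small follow-up sketch, relying only on the `scenes` dictionary already used in the exploitation example, to check what the JSON setup actually loaded:
```python
# List the scenes declared in the JSON setup file
for scene_name, scene in ar_camera.scenes.items():
    print('loaded scene:', scene_name)
```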
diff --git a/docs/user_guide/aruco_markers/introduction.md b/docs/user_guide/aruco_markers/introduction.md
index dc8d4cb..9d78de0 100644
--- a/docs/user_guide/aruco_markers/introduction.md
+++ b/docs/user_guide/aruco_markers/introduction.md
@@ -12,4 +12,4 @@ The ArGaze [ArUcoMarkers submodule](../../argaze.md/#argaze.ArUcoMarkers) eases
* [ArUcoBoard](../../argaze.md/#argaze.ArUcoMarkers.ArUcoBoard)
* [ArUcoOpticCalibrator](../../argaze.md/#argaze.ArUcoMarkers.ArUcoOpticCalibrator)
* [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector)
-* [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene)
\ No newline at end of file
+* [ArUcoMarkersGroup](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersGroup)
\ No newline at end of file
diff --git a/docs/user_guide/aruco_markers/markers_scene_description.md b/docs/user_guide/aruco_markers/markers_scene_description.md
index e1cd651..c6dbf31 100644
--- a/docs/user_guide/aruco_markers/markers_scene_description.md
+++ b/docs/user_guide/aruco_markers/markers_scene_description.md
@@ -1,11 +1,11 @@
Markers scene description
=========================
-The ArGaze toolkit provides [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) class to describe where [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) are placed into a 3D model.
+The ArGaze toolkit provides the [ArUcoMarkersGroup](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersGroup) class to describe where [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) are placed in a 3D model.
-![ArUco scene](../../img/aruco_scene.png)
+![ArUco scene](../../img/aruco_markers_group.png)
-[ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) is useful to:
+[ArUcoMarkersGroup](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersGroup) is useful to:
* filter markers that belong to this predefined scene,
* check the consistency of detected markers according to the place where each marker is expected to be,
@@ -37,16 +37,16 @@ f 5//2 6//2 8//2 7//2
...
```
-Here is a sample of code to show the loading of an [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) OBJ file description:
+Here is a code sample showing how to load an [ArUcoMarkersGroup](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersGroup) OBJ file description:
``` python
-from argaze.ArUcoMarkers import ArUcoScene
+from argaze.ArUcoMarkers import ArUcoMarkersGroup
# Create an ArUco scene from an OBJ file description
-aruco_scene = ArUcoScene.ArUcoScene.from_obj('./markers.obj')
+aruco_markers_group = ArUcoMarkersGroup.ArUcoMarkersGroup.from_obj('./markers.obj')
# Print loaded marker places
-for place_id, place in aruco_scene.places.items():
+for place_id, place in aruco_markers_group.places.items():
print(f'place {place_id} for marker: ', place.marker.identifier)
print(f'place {place_id} translation: ', place.translation)
@@ -55,7 +55,7 @@ for place_id, place in aruco_scene.places.items():
### from JSON
-[ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) description can also be written in a JSON file format.
+[ArUcoMarkersGroup](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersGroup) description can also be written in JSON format.
``` json
{
@@ -83,13 +83,13 @@ for place_id, place in aruco_scene.places.items():
Here is a more advanced usage where an ArUco scene is built from markers detected in an image:
``` python
-from argaze.ArUcoMarkers import ArUcoScene
+from argaze.ArUcoMarkers import ArUcoMarkersGroup
# Assuming markers have been detected and their pose estimated thanks to ArUcoDetector
...
# Build ArUco scene from detected markers
-aruco_scene = ArUcoScene.ArUcoScene(aruco_detector.marker_size, aruco_detector.dictionary, aruco_detector.detected_markers)
+aruco_markers_group = ArUcoMarkersGroup.ArUcoMarkersGroup(aruco_detector.marker_size, aruco_detector.dictionary, aruco_detector.detected_markers)
```
## Markers filtering
@@ -97,7 +97,7 @@ aruco_scene = ArUcoScene.ArUcoScene(aruco_detector.marker_size, aruco_detector.d
Once markers are detected, here is how to filter them to keep only those which belong to the scene:
``` python
-scene_markers, remaining_markers = aruco_scene.filter_markers(aruco_detector.detected_markers)
+scene_markers, remaining_markers = aruco_markers_group.filter_markers(aruco_detector.detected_markers)
```
## Marker poses consistency
@@ -106,12 +106,12 @@ Then, scene markers poses can be validated by verifying their spatial consistenc
``` python
# Check scene markers consistency with 10° angle tolerance and 1 cm distance tolerance
-consistent_markers, unconsistent_markers, unconsistencies = aruco_scene.check_markers_consistency(scene_markers, 10, 1)
+consistent_markers, unconsistent_markers, unconsistencies = aruco_markers_group.check_markers_consistency(scene_markers, 10, 1)
```
## Scene pose estimation
-Several approaches are available to perform [ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) pose estimation from markers belonging to the scene.
+Several approaches are available to perform [ArUcoMarkersGroup](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarkersGroup) pose estimation from markers belonging to the scene.
The first approach considers that scene pose can be estimated **from a single marker pose**:
@@ -120,20 +120,20 @@ The first approach considers that scene pose can be estimated **from a single ma
marker_id, marker = consistent_markers.popitem()
# Estimate scene pose from a single marker
-tvec, rmat = self.aruco_scene.estimate_pose_from_single_marker(marker)
+tvec, rmat = self.aruco_markers_group.estimate_pose_from_single_marker(marker)
```
The second approach considers that scene pose can be estimated by **averaging several marker poses**:
``` python
# Estimate scene pose from all consistent scene markers
-tvec, rmat = self.aruco_scene.estimate_pose_from_markers(consistent_markers)
+tvec, rmat = self.aruco_markers_group.estimate_pose_from_markers(consistent_markers)
```
The third approach is only available when ArUco markers are placed in such a configuration that it is possible to **define orthogonal axes**:
``` python
-tvec, rmat = self.aruco_scene.estimate_pose_from_axis_markers(origin_marker, horizontal_axis_marker, vertical_axis_marker)
+tvec, rmat = self.aruco_markers_group.estimate_pose_from_axis_markers(origin_marker, horizontal_axis_marker, vertical_axis_marker)
```
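A sketch combining the first two approaches, choosing a strategy from the number of consistent markers; it reuses the `consistent_markers` and `aruco_markers_group` names from the snippets above, without the `self.` prefix:
``` python
# Pick a pose estimation strategy depending on how many consistent markers remain
if len(consistent_markers) == 1:

    # Only one consistent marker left: estimate the scene pose from it alone
    marker_id, marker = consistent_markers.popitem()
    tvec, rmat = aruco_markers_group.estimate_pose_from_single_marker(marker)

else:

    # Several consistent markers left: average their poses
    tvec, rmat = aruco_markers_group.estimate_pose_from_markers(consistent_markers)
```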
## Scene exportation
@@ -142,5 +142,5 @@ As ArUco scene can be exported to OBJ file description to import it into most 3D
``` python
# Export an ArUco scene as OBJ file description
-aruco_scene.to_obj('markers.obj')
+aruco_markers_group.to_obj('markers.obj')
```
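Combining the `to_obj` call above with the `from_obj` loader shown earlier gives a simple round-trip sketch (the file name is illustrative):
``` python
# Export the markers group as an OBJ file description...
aruco_markers_group.to_obj('markers.obj')

# ...then load it back with the loader shown earlier
reloaded_group = ArUcoMarkersGroup.ArUcoMarkersGroup.from_obj('markers.obj')
```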
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 5d2d760..4f73092 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -19,7 +19,7 @@ python ./src/argaze/utils/demo_gaze_analysis_run.py ./src/argaze/utils/demo_envi
## Augmented reality pipeline demonstration
-Load ArEnvironment from **demo_augmented_reality_setup.json** file then, detect ArUco markers into a demo video source and estimate environment pose.
+Load ArCamera from the **demo_augmented_reality_setup.json** file, then detect ArUco markers in a demo video source and estimate the environment pose.
```shell
python ./src/argaze/utils/demo_augmented_reality_run.py ./src/argaze/utils/demo_environment/demo_augmented_reality_setup.json -s ./src/argaze/utils/demo_environment/demo.mov