| author | Théo de la Hogue | 2023-09-06 08:34:32 +0200 |
|---|---|---|
| committer | Théo de la Hogue | 2023-09-06 08:34:32 +0200 |
| commit | 3b8681b848fd91989a03d0ff6a03c7deaec4addd (patch) | |
| tree | 438e0eb91d4954564c774b16421c94ad2d3bbdbd /docs/user_guide/augmented_reality_pipeline | |
| parent | 3dba9640ad57e48d1979d19cfe8cab8c9be2d621 (diff) | |
Renaming folder.
Diffstat (limited to 'docs/user_guide/augmented_reality_pipeline')
3 files changed, 0 insertions, 283 deletions
diff --git a/docs/user_guide/augmented_reality_pipeline/aruco_scene_creation.md b/docs/user_guide/augmented_reality_pipeline/aruco_scene_creation.md
deleted file mode 100644
index d9a7be5..0000000
--- a/docs/user_guide/augmented_reality_pipeline/aruco_scene_creation.md
+++ /dev/null
@@ -1,131 +0,0 @@

Setup ArUco markers scene
=========================

The OpenCV library provides a module to detect fiducial markers in a picture and estimate their pose (cf. [OpenCV ArUco tutorial page](https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html)).

![OpenCV ArUco markers](https://pyimagesearch.com/wp-content/uploads/2020/12/aruco_generate_tags_header.png)

The ArGaze [ArUcoMarkers submodule](../../argaze.md/#argaze.ArUcoMarkers) eases marker creation and the description of their expected placement for further camera pose estimation.

## Print ArUco markers from an ArUco dictionary

ArUco markers always belong to a set of markers called an ArUco markers dictionary.

![ArUco dictionaries](../../img/aruco_dictionaries.png)

Many ArUco dictionaries exist, differing in marker format, number of markers and inter-marker distance, which helps avoid tracking errors.

Here is the documentation [about ArUco markers dictionaries](https://docs.opencv.org/3.4/d9/d6a/group__aruco.html#gac84398a9ed9dd01306592dd616c2c975).

The creation of [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) pictures from a dictionary is illustrated in the code below:

``` python
from argaze.ArUcoMarkers import ArUcoMarkersDictionary

# Create a dictionary of specific April tags
aruco_dictionary = ArUcoMarkersDictionary.ArUcoMarkersDictionary('DICT_APRILTAG_16h5')

# Export marker n°5 as a 3.5 cm picture with 300 dpi resolution
aruco_dictionary.create_marker(5, 3.5).save('./markers/', 300)

# Export all dictionary markers as 3.5 cm pictures with 300 dpi resolution
aruco_dictionary.save('./markers/', 3.5, 300)
```

Let's print some of them before going further.

!!! warning
    Print markers with a blank zone around them to help their detection.
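If you want to double-check the physical size of a printed marker, or generate marker pictures without ArGaze, the minimal sketch below converts a physical size in centimeters and a printing resolution in dpi into a pixel size using plain OpenCV. It assumes OpenCV ≥ 4.7 (`cv2.aruco.generateImageMarker`); the output path mirrors the ArGaze example above and is only illustrative.

``` python
import os

import cv2

# Physical marker size and printing resolution (same values as in the ArGaze example above)
marker_size_cm = 3.5
print_dpi = 300

# Convert centimeters to pixels: 1 inch = 2.54 cm
side_pixels = round(marker_size_cm / 2.54 * print_dpi)  # ~413 px for 3.5 cm at 300 dpi

# Load the same predefined dictionary and generate marker n°5
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_16h5)
marker_image = cv2.aruco.generateImageMarker(dictionary, 5, side_pixels)

# Save the marker picture (printed at 300 dpi, it should measure 3.5 cm)
os.makedirs('./markers/', exist_ok=True)
cv2.imwrite('./markers/opencv_marker_5.png', marker_image)
```

On OpenCV versions older than 4.7, the equivalent call is `cv2.aruco.drawMarker(dictionary, 5, side_pixels)`.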
## Describe expected ArUco markers place

Once [ArUcoMarkers](../../argaze.md/#argaze.ArUcoMarkers.ArUcoMarker) pictures are placed into a scene, it is possible to describe their expected 3D placement in a file.

![ArUco scene](../../img/aruco_scene.png)

Wherever the origin point is, all expected marker places need to be described in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:

* +X is pointing to the right,
* +Y is pointing to the top,
* +Z is pointing backward.

!!! warning
    All ArUco markers spatial values must be given in **centimeters**.

### Edit OBJ file description

The OBJ file format can be exported from most 3D editors.

``` obj
o DICT_APRILTAG_16h5#0_Marker
v -5.000000 14.960000 0.000000
v 0.000000 14.960000 0.000000
v -5.000000 19.959999 0.000000
v 0.000000 19.959999 0.000000
vn 0.0000 0.0000 1.0000
s off
f 1//1 2//1 4//1 3//1
o DICT_APRILTAG_16h5#1_Marker
v 25.000000 14.960000 0.000000
v 30.000000 14.960000 0.000000
v 25.000000 19.959999 0.000000
v 30.000000 19.959999 0.000000
vn 0.0000 0.0000 1.0000
s off
f 5//2 6//2 8//2 7//2
o DICT_APRILTAG_16h5#2_Marker
v -5.000000 -5.000000 0.000000
v 0.000000 -5.000000 0.000000
v -5.000000 0.000000 0.000000
v 0.000000 0.000000 0.000000
vn 0.0000 0.0000 1.0000
s off
f 9//3 10//3 12//3 11//3
o DICT_APRILTAG_16h5#3_Marker
v 25.000000 -5.000000 0.000000
v 30.000000 -5.000000 0.000000
v 25.000000 0.000000 0.000000
v 30.000000 0.000000 0.000000
vn 0.0000 0.0000 1.0000
s off
f 13//4 14//4 16//4 15//4
```

Here are the common OBJ file features needed to describe ArUco markers placement:

* Object lines (lines starting with the *o* key) indicate the marker dictionary and id, following the format **DICTIONARY**#**ID**\_Marker.
* Vertice lines (lines starting with the *v* key) indicate the marker corners. The marker size is automatically deduced from the geometry.
* Plane normal lines (lines starting with the *vn* key) need to be exported for further pose estimation.
* Face lines (lines starting with the *f* key) link vertice and normal indexes together.

!!! warning
    All markers must have the same size and belong to the same dictionary.

### Edit JSON file description

The JSON file format allows describing marker places using translation and Euler angle rotation vectors.

``` json
{
    "dictionary": "DICT_APRILTAG_16h5",
    "marker_size": 5,
    "places": {
        "0": {
            "translation": [-2.5, 17.5, 0],
            "rotation": [0.0, 0.0, 0.0]
        },
        "1": {
            "translation": [27.5, 17.5, 0],
            "rotation": [0.0, 0.0, 0.0]
        },
        "2": {
            "translation": [-2.5, -2.5, 0],
            "rotation": [0.0, 0.0, 0.0]
        },
        "3": {
            "translation": [27.5, -2.5, 0],
            "rotation": [0.0, 0.0, 0.0]
        }
    }
}
```
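Both descriptions encode the same geometry: each JSON place gives the marker center while the OBJ file lists the four corners directly. The minimal sketch below, independent of ArGaze, derives the corners of place "0" from its translation and the `marker_size` value, assuming the translation is the marker center, units are centimeters and the rotation is zero as in the example above (the exact corner ordering used by ArGaze may differ).

``` python
import numpy as np

# Place "0" from the JSON example above (zero rotation, so no rotation matrix is needed)
marker_size = 5
translation = np.array([-2.5, 17.5, 0.0])

# Corner offsets around the marker center, in the marker plane
half = marker_size / 2
offsets = np.array([
    [-half, -half, 0.0],
    [ half, -half, 0.0],
    [-half,  half, 0.0],
    [ half,  half, 0.0],
])

# 3D corners in the scene frame: (-5, 15, 0), (0, 15, 0), (-5, 20, 0), (0, 20, 0),
# which matches the v lines of DICT_APRILTAG_16h5#0_Marker up to rounding in the OBJ export
corners = translation + offsets
print(corners)
```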
diff --git a/docs/user_guide/augmented_reality_pipeline/introduction.md b/docs/user_guide/augmented_reality_pipeline/introduction.md
deleted file mode 100644
index a06b1e2..0000000
--- a/docs/user_guide/augmented_reality_pipeline/introduction.md
+++ /dev/null
@@ -1,19 +0,0 @@

Overview
========

This section explains how to build augmented reality pipelines based on ArUco markers technology for various use cases.

First, let's look at the schema below: it gives an overview of the main notions involved in the following chapters.

![Augmented reality pipeline](../../img/augmented_reality_pipeline.png)

To build your own augmented reality pipeline, you need to know:

* [How to set up an ArUco markers scene](aruco_scene_creation.md),
* [How to deal with an ArCamera instance](ar_camera_configuration_and_execution.md),
* [How to add an ArScene instance](ar_scene.md),
* [How to visualize ArCamera and ArScenes](visualisation.md).

More advanced features are also explained, like:

* [How to script an augmented reality pipeline](advanced_topics/scripting.md).

diff --git a/docs/user_guide/augmented_reality_pipeline/optic_parameters_calibration.md b/docs/user_guide/augmented_reality_pipeline/optic_parameters_calibration.md
deleted file mode 100644
index 0561112..0000000
--- a/docs/user_guide/augmented_reality_pipeline/optic_parameters_calibration.md
+++ /dev/null
@@ -1,133 +0,0 @@

Calibrate optic parameters
==========================

A camera device has to be calibrated to compensate for its optical distortion.

![Optic parameters calibration](../../img/optic_calibration.png)

## Print calibration board

The first step to calibrate a camera is to create an [ArUcoBoard](../../argaze.md/#argaze.ArUcoMarkers.ArUcoBoard), as in the code below:

``` python
from argaze.ArUcoMarkers import ArUcoMarkersDictionary, ArUcoBoard

# Create ArUco dictionary
aruco_dictionary = ArUcoMarkersDictionary.ArUcoMarkersDictionary('DICT_APRILTAG_16h5')

# Create an ArUco board of 7 columns and 5 rows with 5 cm squares and 3 cm ArUco markers inside
aruco_board = ArUcoBoard.ArUcoBoard(7, 5, 5, 3, aruco_dictionary)

# Export ArUco board with 300 dpi resolution
aruco_board.save('./calibration_board.png', 300)
```

!!! note
    There is an **A3_DICT_APRILTAG_16h5_3cm_35cmx25cm.pdf** file located in the *./src/argaze/ArUcoMarkers/utils/* folder, ready to be printed on an A3 paper sheet.

Let's print the calibration board before going further.

## Capture board pictures

Then, the calibration process needs to make many different captures of an [ArUcoBoard](../../argaze.md/#argaze.ArUcoMarkers.ArUcoBoard) through the camera and pass them to an [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) instance to detect board corners and store them as calibration data into an [ArUcoOpticCalibrator](../../argaze.md/#argaze.ArUcoMarkers.ArUcoOpticCalibrator) for the final calibration process.

![Calibration step](../../img/optic_calibration_step.png)
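The full example below assumes that a live Full HD (1920x1080) video stream is already available. As an illustration only (this is not part of the ArGaze API), here is one possible way to obtain such a stream from a webcam with plain OpenCV; the device index and the requested resolution are assumptions that depend on your hardware.

``` python
import cv2

# Open the first camera device and request a Full HD (1920x1080) stream
# (device index and supported resolutions depend on your hardware)
video_capture = cv2.VideoCapture(0)
video_capture.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
video_capture.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

while True:

    # Read one frame; such images are what the ArUco detector consumes in the example below
    success, image = video_capture.read()

    if not success:
        break

    cv2.imshow('Optic calibration', image)

    # Press Escape to leave the capture loop
    if cv2.waitKey(1) == 27:
        break

video_capture.release()
cv2.destroyAllWindows()
```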
The sample of code below illustrates how to:

* load all required ArGaze objects,
* detect board corners in a Full HD camera video stream,
* store detected corners as calibration data,
* once enough captures are made, process them to find the optic parameters and,
* finally, save the optic parameters into a JSON file.

``` python
from argaze.ArUcoMarkers import ArUcoMarkersDictionary, ArUcoOpticCalibrator, ArUcoBoard, ArUcoDetector

# Create ArUco dictionary
aruco_dictionary = ArUcoMarkersDictionary.ArUcoMarkersDictionary('DICT_APRILTAG_16h5')

# Create ArUco optic calibrator
aruco_optic_calibrator = ArUcoOpticCalibrator.ArUcoOpticCalibrator()

# Create an ArUco board of 7 columns and 5 rows with 5 cm squares and 3 cm ArUco markers inside
# Note: this board is the one expected during further board tracking
expected_aruco_board = ArUcoBoard.ArUcoBoard(7, 5, 5, 3, aruco_dictionary)

# Create ArUco detector
aruco_detector = ArUcoDetector.ArUcoDetector(dictionary=aruco_dictionary, marker_size=3)

# Assuming that a live Full HD (1920x1080) video stream is enabled
...

# Assuming there is a way to escape the while loop
...

# Capture images from the video stream
while video_stream.is_alive():

    image = video_stream.read()

    # Detect all board corners in image
    aruco_detector.detect_board(image, expected_aruco_board, expected_aruco_board.markers_number)

    # If all board corners are detected
    if aruco_detector.board_corners_number == expected_aruco_board.corners_number:

        # Draw board corners to show that board tracking succeeded
        aruco_detector.draw_board(image)

        # Append tracked board data for further calibration processing
        aruco_optic_calibrator.store_calibration_data(aruco_detector.board_corners, aruco_detector.board_corners_identifier)

# Start optic calibration processing for Full HD image resolution
print('Calibrating optic...')
optic_parameters = aruco_optic_calibrator.calibrate(expected_aruco_board, dimensions=(1920, 1080))

if optic_parameters:

    # Export optic parameters
    optic_parameters.to_json('./optic_parameters.json')

    print('Calibration succeeded: optic_parameters.json file exported.')

else:

    print('Calibration failed.')
```

Below is an optic_parameters JSON file example:

``` json
{
    "rms": 0.6688921504088245,
    "dimensions": [
        1920,
        1080
    ],
    "K": [
        [
            1135.6524381415752,
            0.0,
            956.0685325355497
        ],
        [
            0.0,
            1135.9272506869524,
            560.059099810324
        ],
        [
            0.0,
            0.0,
            1.0
        ]
    ],
    "D": [
        0.01655492265003404,
        0.1985524264972037,
        0.002129965902489484,
        -0.0019528582922179365,
        -0.5792910353639452
    ]
}
```
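Once exported, the optic parameters can also be reused outside ArGaze. The minimal sketch below (not part of the ArGaze API) loads the JSON file produced above and undistorts a picture with plain OpenCV, using the camera matrix `K` and the distortion coefficients `D`; the image paths are only illustrative.

``` python
import json

import cv2
import numpy as np

# Load the optic parameters exported by the calibration example above
with open('./optic_parameters.json') as file:
    optic_parameters = json.load(file)

K = np.array(optic_parameters['K'])  # 3x3 camera matrix
D = np.array(optic_parameters['D'])  # distortion coefficients (k1, k2, p1, p2, k3)

# Undistort a picture captured with the calibrated camera
# (its resolution should match the calibration dimensions, here 1920x1080)
image = cv2.imread('./capture.png')
undistorted_image = cv2.undistort(image, K, D)

cv2.imwrite('./capture_undistorted.png', undistorted_image)
```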