Configure and execute ArUcoCamera
=================================

Once [ArUco markers are placed into a scene](aruco_scene_creation.md) and [the camera optic parameters have been calibrated](optic_parameters_calibration.md), everything is ready to set up an ArUco marker pipeline thanks to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class.

As it inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) class benefits from all the services described in the [gaze analysis pipeline section](./user_guide/gaze_analysis_pipeline/introduction.md).

![ArUco camera frame](../../img/aruco_camera_frame.png)

## Load JSON configuration file

The [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) internal pipeline is loaded from a JSON configuration file thanks to the [ArUcoCamera.from_json](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera.from_json) class method.

Here is a simple JSON [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) configuration file example:

```json
{
    "name": "My FullHD camera",
    "size": [1920, 1080],
    "aruco_detector": {
        "dictionary": "DICT_APRILTAG_16h5",
        "marker_size": 5,
        "optic_parameters": "optic_parameters.json"
    },
    "image_parameters": {
        "background_weight": 1,
        "draw_detected_markers": {
            "color": [0, 255, 0],
            "draw_axes": {
                "thickness": 3
            }
        }
    }
}
```

Then, here is how to load the JSON file:

```python
from argaze.ArUcoMarkers import ArUcoCamera

# Load ArUcoCamera
aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')
```

Now, let's understand the meaning of each JSON entry.

### Name - *inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*

The name of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame. It is mostly useful for visualisation purposes.

### Size - *inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*

The size of the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame in pixels. Be aware that gaze positions have to be in the same range of values to be projected into it.

### ArUco Detector

The first [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) pipeline step is to detect ArUco markers inside the input image.

The [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector) is in charge of detecting all markers from the configured *dictionary*, taking the expected *marker_size* and the calibrated *optic_parameters* into account.

!!! warning
    JSON *aruco_detector* entry is mandatory.

### Image parameters - *inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame)*

...

## Pipeline execution

Camera images have to be passed one by one to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) *watch* method to execute the whole instantiated pipeline. As [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) inherits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame), timestamped gaze positions can still be passed to the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, as described in the [gaze analysis pipeline section](./user_guide/gaze_analysis_pipeline/introduction.md).

```python
# Assuming that live Full HD (1920x1080) video stream is enabled
...

# Assuming there is a way to escape the while loop
...

while video_stream.is_alive():

    # Capture image from video stream
    image = video_stream.read()

    # Detect ArUco markers in image
    aruco_camera.watch(image)

    # Do something with ArUcoCamera frame image
    ...
```
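For instance, a way to *do something* with the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame image is to display it. The sketch below is only an illustration under assumptions: it relies on the *image* method and *name* attribute inherited from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) to get the frame image drawn according to the *image_parameters* entry, and on the same hypothetical `video_stream` object as in the loop above.

```python
import cv2

from argaze.ArUcoMarkers import ArUcoCamera

# Load ArUcoCamera (same configuration file as above)
aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')

# Assuming that live Full HD (1920x1080) video stream is enabled ...
while video_stream.is_alive():

    # Capture image from video stream
    image = video_stream.read()

    # Detect ArUco markers in image
    aruco_camera.watch(image)

    # Display ArUcoCamera frame image drawn according to the image_parameters
    # entry (assumes the image method inherited from ArFrame)
    cv2.imshow(aruco_camera.name, aruco_camera.image())

    # Escape the while loop with the 'Esc' key
    if cv2.waitKey(10) == 27:
        break

cv2.destroyAllWindows()
```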
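Similarly, because [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) benefits from [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) services, timestamped gaze positions can be projected into the same frame through the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method. The sketch below is only an illustration under assumptions: the `gaze_data_stream` object and its normalized coordinates are hypothetical, and the *look* call shown here (a timestamp plus a `GazeFeatures.GazePosition`) is an assumption; refer to the [gaze analysis pipeline section](./user_guide/gaze_analysis_pipeline/introduction.md) for the exact API.

```python
from argaze import GazeFeatures
from argaze.ArUcoMarkers import ArUcoCamera

# Load ArUcoCamera (same configuration file as above)
aruco_camera = ArUcoCamera.ArUcoCamera.from_json('./configuration.json')

# Hypothetical gaze data stream delivering (timestamp, x, y) tuples
# with coordinates normalized into the 0.0 - 1.0 range
for timestamp, norm_x, norm_y in gaze_data_stream:

    # Scale the gaze position to the frame size declared in the configuration,
    # as gaze positions have to be in the same pixel range as the frame (see Size entry)
    pixel_position = (int(norm_x * 1920), int(norm_y * 1080))

    # Project the timestamped gaze position into the ArUcoCamera frame
    # (assumed look signature; see the gaze analysis pipeline section)
    aruco_camera.look(timestamp, GazeFeatures.GazePosition(pixel_position))
```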
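Finally, to build intuition about what the marker detection step (the *ArUco Detector* entry above) does underneath, here is a short sketch using plain OpenCV rather than the ArGaze API; the [ArUcoDetector](../../argaze.md/#argaze.ArUcoMarkers.ArUcoDetector) additionally takes *marker_size* and *optic_parameters* into account. The image file name is hypothetical and the OpenCV >= 4.7 ArUco API is assumed.

```python
import cv2

# Plain OpenCV sketch, NOT the ArGaze API: it only illustrates marker detection
# with the same dictionary name as in the configuration above
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_16h5)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

# Load any image containing markers (hypothetical file name)
image = cv2.imread('./scene_snapshot.png')

# Detect marker corners and identifiers
corners, ids, rejected = detector.detectMarkers(image)

print(f'{len(corners)} marker(s) detected')
```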