Demonstration scripts
=====================

Collection of command-line scripts for demonstration purposes.

!!! note
    *All inline commands below have to be executed at the root of the ArGaze package folder.*

!!! note
    *Use the -h option to get a command's arguments documentation.*

## Gaze analysis pipeline demonstration

Load an ArFrame with a single ArLayer from the **demo_gaze_analysis_setup.json** file, then simulate gaze positions with the mouse pointer to illustrate gaze features.

```shell
python ./src/argaze/utils/demo_gaze_analysis_run.py ./src/argaze/utils/demo/gaze_analysis_pipeline.json
```

## ArUco markers pipeline demonstration

Load an ArUcoCamera from the **demo_aruco_markers_setup.json** file, then detect ArUco markers in a demo video source and estimate the camera pose.

```shell
python ./src/argaze/utils/demo_aruco_markers_run.py ./src/argaze/utils/demo/aruco_markers_pipeline.json -s ./src/argaze/utils/demo_data/demo.mov
```

!!! note
    To reproduce this demonstration with a live video source (**-s** option), print the **A3_demo.pdf** file located in the *./src/argaze/utils/demo_data/* folder on an A3 paper sheet.

## Worn device stream demonstration

Load an ArUcoCamera from a configuration file, then stream and process gaze positions and images from a worn eye-tracker device.

### With Tobii Pro Glasses 2 device

To use a Tobii Pro Glasses 2 device, edit the **eyetracker_setup.json** file as below with your own parameter values:

```json
{
    "argaze.utils.eyetrackers.TobiiProGlasses2.LiveStream" : {
        "address": "10.34.0.12",
        "project": "MyProject",
        "participant": "NewParticipant",
        "configuration": {
            "sys_ec_preset": "Indoor",
            "sys_sc_width": 1920,
            "sys_sc_height": 1080,
            "sys_sc_fps": 25,
            "sys_sc_preset": "Auto",
            "sys_et_freq": 50,
            "sys_mems_freq": 100
        },
        "pipeline": "demo_aruco_markers_setup.json"
    }
}
```

```shell
python ./src/argaze/utils/pipeline_run.py ./src/argaze/utils/demo/eyetracker_setup.json
```
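
As noted above, each demonstration script documents its own arguments through the **-h** option. For example, before editing the configuration file you can check which arguments the worn device stream script expects (the exact output depends on the installed ArGaze version):

```shell
python ./src/argaze/utils/pipeline_run.py -h
```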