From 58ef2e8276318d76fa2b11a257ed65850352c7fd Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Wed, 27 Mar 2024 18:08:30 +0100
Subject: Updating demonstration.

---
 docs/user_guide/utils/demonstrations_scripts.md | 41 ++++++++++++-------------
 docs/user_guide/utils/ready-made_scripts.md     |  6 ++--
 2 files changed, 22 insertions(+), 25 deletions(-)

diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index c3a5c9b..a230678 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -9,37 +9,26 @@ Collection of command-line scripts for demonstration purpose.
 !!! note
     *Use -h option to get command arguments documentation.*
 
-## Gaze analysis pipeline demonstration
+## OpenCV window context
 
 Load ArFrame with a single ArLayer from **demo_gaze_analysis_setup.json** file then, simulate gaze position using mouse pointer to illustrate gaze features.
 
 ```shell
-python ./src/argaze/utils/demo_gaze_analysis_run.py ./src/argaze/utils/demo/gaze_analysis_pipeline.json
+python ./src/argaze/utils/context_run.py ./src/argaze/utils/demo/opencv_window_context_setup.json
 ```
 
-## ArUco markers pipeline demonstration
-
-Load ArUcoCamera from **demo_aruco_markers_setup.json** file then, detect ArUco markers into a demo video source and estimate camera pose.
-
-```shell
-python ./src/argaze/utils/demo_aruco_markers_run.py ./src/argaze/utils/demo/aruco_markers_pipeline.json -s ./src/argaze/utils/demo_data/demo.mov
-```
+## Tobii live stream context demonstration
 
 !!! note
-    To reproduce this demonstration with live video source (-s ), print **A3_demo.pdf** file located in *./src/argaze/utils/demo_data/* folder on A3 paper sheet.
-
-## Worn device stream demonstration
+    this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo_data/* folder on A3 paper sheet.
 
-Load ArUcoCamera from a configuration file then, stream and process gaze positions and image from any worn eye-tracker device.
-
-### With Tobii Pro Glasses 2 device
-
-To use a Tobii Pro Glasses 2 device, you need to edit **eyetracker_setup.json** file as below with your own parameters values:
+Edit **tobii_live_stream_context_setup.json** file as below with your own parameters values:
 
 ```json
 {
-    "argaze.utils.eyetrackers.TobiiProGlasses2.LiveStream" : {
-        "address": "10.34.0.12",
+    "argaze.utils.contexts.TobiiProGlasses2.LiveStream" : {
+        "name": "Tobii Pro Glasses 2 live stream",
+        "address": "10.34.0.17",
         "project": "MyProject",
         "participant": "NewParticipant",
         "configuration": {
@@ -51,11 +40,19 @@ To use a Tobii Pro Glasses 2 device, you need to edit **eyetracker_setup.json**
             "sys_et_freq": 50,
             "sys_mems_freq": 100
         },
-        "pipeline": "demo_aruco_markers_setup.json"
+        "pipeline": "aruco_markers_pipeline.json",
+        "image_parameters": {
+            "draw_something": false,
+            "draw_times": true,
+            "draw_exceptions": true
+        }
     }
 }
 ```
 
+Then, execute this command:
+
 ```shell
-python ./src/argaze/utils/pipeline_run.py ./src/argaze/utils/demo/eyetracker_setup.json
-```
\ No newline at end of file
+python ./src/argaze/utils/context_run.py ./src/argaze/utils/demo/tobii_live_stream_context_setup.json
+```
+
diff --git a/docs/user_guide/utils/ready-made_scripts.md b/docs/user_guide/utils/ready-made_scripts.md
index 262a0ef..4767969 100644
--- a/docs/user_guide/utils/ready-made_scripts.md
+++ b/docs/user_guide/utils/ready-made_scripts.md
@@ -9,12 +9,12 @@ Collection of command-line scripts to provide useful features.
 !!! note
     *Use -h option to get command arguments documentation.*
 
-## Eyetracker pipeline handler
+## ArGaze context handler
 
-Load and execute eyetracker pipeline.
+Load and execute any ArGaze context from a JSON CONFIGURATION file.
 
 ```shell
-python ./src/argaze/utils/pipeline_run.py CONFIGURATION
+python ./src/argaze/utils/context_run.py CONFIGURATION
 ```
 
 ## ArUco markers group exporter
-- 
cgit v1.1