| author | Théo de la Hogue | 2024-04-23 07:58:41 +0200 |
|---|---|---|
| committer | Théo de la Hogue | 2024-04-23 07:58:41 +0200 |
| commit | 1487defacef6ba3e63d92f46d0e54a8339a37897 (patch) | |
| tree | c990719cbdf1c923fe28465f1f9dc70f1e4029fc /docs/user_guide/utils/demonstrations_scripts.md | |
| parent | 95857cf4f31bf529bfdd3921150262b12b444888 (diff) | |
Updating ArContext documentation.
Diffstat (limited to 'docs/user_guide/utils/demonstrations_scripts.md')
-rw-r--r-- | docs/user_guide/utils/demonstrations_scripts.md | 57 |
1 file changed, 15 insertions, 42 deletions
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index ed2f8d9..d83915f 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -11,7 +11,7 @@ Collection of command-line scripts for demonstration purpose.

## Random context

-Load **random_context.json** file to analyze random gaze positions.
+Load **random_context.json** file to analyze random gaze positions:

```shell
python -m argaze ./src/argaze/utils/demo/random_context.json
@@ -19,7 +19,7 @@ python -m argaze ./src/argaze/utils/demo/random_context.json

## OpenCV window context

-Load **opencv_window_context.json** file to analyze mouse pointer positions over OpenCV window.
+Load **opencv_window_context.json** file to analyze mouse pointer positions over OpenCV window:

```shell
python -m argaze ./src/argaze/utils/demo/opencv_window_context.json
@@ -27,12 +27,12 @@ python -m argaze ./src/argaze/utils/demo/opencv_window_context.json

## Tobii Pro Glasses 2

-### Tobii live stream context
+### Live stream context

!!! note
-    this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+    This demonstration requires printing **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.

-Edit **tobii_live_stream_context.json** file as below with your own parameters values:
+Edit **tobii_live_stream_context.json** file to select existing IP *address*, *project* or *participant* names and to set up Tobii *configuration* parameters:

```json
{
@@ -50,45 +50,35 @@ Edit **tobii_live_stream_context.json** file as below with your own parameters v
            "sys_et_freq": 50,
            "sys_mems_freq": 100
        },
-        "pipeline": "aruco_markers_pipeline.json",
-        "catch_exceptions": true,
-        "image_parameters": {
-            "draw_times": true,
-            "draw_exceptions": true
-        }
+        "pipeline": "aruco_markers_pipeline.json"
    }
}
```

-Then, execute this command:
+Then, load **tobii_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:

```shell
python -m argaze ./src/argaze/utils/demo/tobii_live_stream_context.json
```

-### Tobii post-processing context
+### Post-processing context

!!! note
-    this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+    This demonstration requires printing **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.

-Edit **tobii_post_processing_context.json** file as below with your own parameters values:
+Edit **tobii_post_processing_context.json** file to select an existing Tobii *segment* folder:

```json
{
    "argaze.utils.contexts.TobiiProGlasses2.PostProcessing" : {
        "name": "Tobii Pro Glasses 2 post-processing",
        "segment": "record/segments/1",
-        "pipeline": "aruco_markers_pipeline.json",
-        "catch_exceptions": true,
-        "image_parameters": {
-            "draw_times": true,
-            "draw_exceptions": true
-        }
+        "pipeline": "aruco_markers_pipeline.json"
    }
}
```

-Then, execute this command:
+Then, load **tobii_post_processing_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:

```shell
python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json
@@ -96,30 +86,13 @@ python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json

## Pupil Invisible

-### Pupil Invisible live stream context
+### Live stream context

!!! note
-    this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+    This demonstration requires printing **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.

-Edit **pupillabs_live_stream_context.json** file as below with your own parameters values:
-
-```json
-{
-    "argaze.utils.contexts.PupilLabs.LiveStream" : {
-        "name": "PupilLabs",
-        "pipeline": "aruco_markers_pipeline.json",
-        "catch_exceptions": true,
-        "image_parameters": {
-            "draw_times": true,
-            "draw_exceptions": true
-        }
-    }
-}
-```
-
-Then, execute this command:
+Load **pupillabs_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:

```shell
python -m argaze ./src/argaze/utils/demo/pupillabs_live_stream_context.json
```
-
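Every demo context in this page is launched through the same `python -m argaze <context file>` command line. As a minimal shell sketch (not part of the original page), assuming the commands are run from the repository root as in the examples above, the two contexts that need no eye-tracking hardware can be chained like this:

```shell
#!/bin/sh
# Run the two demo contexts that do not require an eye tracker,
# using the same command line shown in the documentation above.
# Paths are the demo files referenced in this page; run from the repository root.
for context in \
    ./src/argaze/utils/demo/random_context.json \
    ./src/argaze/utils/demo/opencv_window_context.json
do
    python -m argaze "$context"
done
```

The Tobii Pro Glasses 2 and Pupil Invisible contexts are launched the same way once their JSON files point to a reachable device or to a recorded segment.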