Diffstat (limited to 'docs/user_guide/utils/demonstrations_scripts.md')
 docs/user_guide/utils/demonstrations_scripts.md | 57 +++++----------------
 1 file changed, 15 insertions(+), 42 deletions(-)
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index ed2f8d9..d83915f 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -11,7 +11,7 @@ Collection of command-line scripts for demonstration purpose.
## Random context
-Load **random_context.json** file to analyze random gaze positions.
+Load the **random_context.json** file to analyze random gaze positions:
```shell
python -m argaze ./src/argaze/utils/demo/random_context.json
@@ -19,7 +19,7 @@ python -m argaze ./src/argaze/utils/demo/random_context.json
## OpenCV window context
-Load **opencv_window_context.json** file to analyze mouse pointer positions over OpenCV window.
+Load the **opencv_window_context.json** file to analyze mouse pointer positions over an OpenCV window:
```shell
python -m argaze ./src/argaze/utils/demo/opencv_window_context.json
@@ -27,12 +27,12 @@ python -m argaze ./src/argaze/utils/demo/opencv_window_context.json
## Tobii Pro Glasses 2
-### Tobii live stream context
+### Live stream context
!!! note
- this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+ This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
-Edit **tobii_live_stream_context.json** file as below with your own parameters values:
+Edit the **tobii_live_stream_context.json** file to select an existing IP *address*, *project* and *participant* names, and to set up the Tobii *configuration* parameters:
```json
{
@@ -50,45 +50,35 @@ Edit **tobii_live_stream_context.json** file as below with your own parameters v
"sys_et_freq": 50,
"sys_mems_freq": 100
},
- "pipeline": "aruco_markers_pipeline.json",
- "catch_exceptions": true,
- "image_parameters": {
- "draw_times": true,
- "draw_exceptions": true
- }
+ "pipeline": "aruco_markers_pipeline.json"
}
}
```
-Then, execute this command:
+Then, load the **tobii_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/tobii_live_stream_context.json
```
-### Tobii post-processing context
+### Post-processing context
!!! note
- this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+ This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
-Edit **tobii_post_processing_context.json** file as below with your own parameters values:
+Edit the **tobii_post_processing_context.json** file to select an existing Tobii *segment* folder:
```json
{
"argaze.utils.contexts.TobiiProGlasses2.PostProcessing" : {
"name": "Tobii Pro Glasses 2 post-processing",
"segment": "record/segments/1",
- "pipeline": "aruco_markers_pipeline.json",
- "catch_exceptions": true,
- "image_parameters": {
- "draw_times": true,
- "draw_exceptions": true
- }
+ "pipeline": "aruco_markers_pipeline.json"
}
}
```
-Then, execute this command:
+Then, load the **tobii_post_processing_context.json** file to detect ArUco markers in the camera image and project gaze positions onto the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json
@@ -96,30 +86,13 @@ python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json
## Pupil Invisible
-### Pupil Invisible live stream context
+### Live stream context
!!! note
- this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+ This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
-Edit **pupillabs_live_stream_context.json** file as below with your own parameters values:
-
-```json
-{
- "argaze.utils.contexts.PupilLabs.LiveStream" : {
- "name": "PupilLabs",
- "pipeline": "aruco_markers_pipeline.json",
- "catch_exceptions": true,
- "image_parameters": {
- "draw_times": true,
- "draw_exceptions": true
- }
- }
-}
-```
-
-Then, execute this command:
+Load the **pupillabs_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/pupillabs_live_stream_context.json
```
-