Demonstration scripts
======================
A collection of command-line scripts for demonstration purposes.
!!! note
    *All inline commands below have to be executed from the root of the ArGaze package folder.*

!!! note
    *Use the -h option to get documentation about the command arguments.*

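As mentioned in the note above, the command-line help can be displayed with the *-h* option:
```shell
# print the documentation of the available command arguments
python -m argaze -h
```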
## Random context
Load the **random_context.json** file to analyze random gaze positions:
```shell
python -m argaze ./src/argaze/utils/demo/random_context.json
```
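On a Unix-like system, the content of this demonstration context file can be displayed beforehand to see which context class and pipeline it uses:
```shell
# display the demonstration context file shipped with the package
cat ./src/argaze/utils/demo/random_context.json
```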
## OpenCV window context
Load the **opencv_window_context.json** file to analyze mouse pointer positions over an OpenCV window:
```shell
python -m argaze ./src/argaze/utils/demo/opencv_window_context.json
```
## Tobii Pro Glasses 2
### Live stream context
!!! note
    This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.

Edit the **tobii_live_stream_context.json** file to select an existing IP *address*, *project* or *participant* name, and to set up the Tobii *configuration* parameters:
```json
{
"argaze.utils.contexts.TobiiProGlasses2.LiveStream" : {
"name": "Tobii Pro Glasses 2 live stream",
"address": "10.34.0.17",
"project": "MyProject",
"participant": "NewParticipant",
"configuration": {
"sys_ec_preset": "Indoor",
"sys_sc_width": 1920,
"sys_sc_height": 1080,
"sys_sc_fps": 25,
"sys_sc_preset": "Auto",
"sys_et_freq": 50,
"sys_mems_freq": 100
},
"pipeline": "aruco_markers_pipeline.json"
}
}
```
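After editing, it can be worth checking that the file still parses as valid JSON before launching the pipeline; a minimal check relying on Python's built-in *json.tool* module (not specific to ArGaze):
```shell
# pretty-print the edited context file, or report a parsing error if the JSON is invalid
python -m json.tool ./src/argaze/utils/demo/tobii_live_stream_context.json
```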
Then, load the **tobii_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/tobii_live_stream_context.json
```
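If the live stream does not start, a first check is to verify that the glasses are reachable over the network; for instance, on a Unix-like system and assuming the example *address* above:
```shell
# send a few ICMP echo requests to the Tobii Pro Glasses 2 network interface
ping -c 3 10.34.0.17
```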
### Post-processing context
!!! note
    This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.

Edit the **tobii_post_processing_context.json** file to select an existing Tobii *segment* folder:
```json
{
"argaze.utils.contexts.TobiiProGlasses2.PostProcessing" : {
"name": "Tobii Pro Glasses 2 post-processing",
"segment": "record/segments/1",
"pipeline": "aruco_markers_pipeline.json"
}
}
```
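Before launching the post-processing, you can verify that the configured *segment* folder exists; for instance, assuming the example *segment* path above is relative to the folder where the command is run:
```shell
# list the content of the configured Tobii segment folder
ls record/segments/1
```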
Then, load the **tobii_post_processing_context.json** file to detect ArUco markers in the camera image and project gaze positions onto the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json
```
## Pupil Invisible
### Live stream context
!!! note
    This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.

Load the **pupillabs_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/pupillabs_live_stream_context.json
```