-rw-r--r--  src/argaze/utils/README.md  |  8
1 files changed, 4 insertions, 4 deletions
diff --git a/src/argaze/utils/README.md b/src/argaze/utils/README.md
index abc44dc..321e83a 100644
--- a/src/argaze/utils/README.md
+++ b/src/argaze/utils/README.md
@@ -72,16 +72,16 @@ python ./src/argaze/utils/replay_tobii_session.py -s SEGMENT_PATH -r IN OUT
python ./src/argaze/utils/export_tobii_segment_movements.py -s SEGMENT_PATH -r IN OUT
```
-- Track ArUco markers into a Tobii camera video segment (-s SEGMENT_PATH) into a time range selection (-r IN OUT). Load aoi scene .obj file related to each marker (-mi MARKER_ID_SCENE), position it virtually relatively to the detected ArUco markers and project the scene into camera frame. Then, detect if Tobii gaze point is focusing onto AOIs to build the segment visual scan and export it as a visual_scan.csv, visual_scan.jpg, visual_scan.mp4 files:
+- Track ArUco markers in a Tobii camera video segment (-s SEGMENT_PATH) within a time range selection (-r IN OUT). Load the AOI scene .obj file related to each marker (-mi MARKER_ID, PATH_TO_AOI_SCENE), position each scene virtually relative to its detected ArUco marker, then project the scene into the camera frame. Then, detect whether the Tobii gaze point falls inside the AOIs to build the segment visual scan and export it as visual_scan.csv, visual_scan.jpg and visual_scan.mp4 files:
```
-python ./src/argaze/utils/export_tobii_segment_aruco_visual_scan.py -s SEGMENT_PATH -c export/tobii_camera.json -ms 5 -mi MARKER_ID_SCENE -r IN OUT
+python ./src/argaze/utils/export_tobii_segment_aruco_visual_scan.py -s SEGMENT_PATH -c export/tobii_camera.json -r IN OUT -ms 5 -mi '{"MARKER_ID":"PATH_TO_AOI_SCENE.obj",...}'
```
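With this change the -mi option takes a JSON object mapping each ArUco marker id to the path of its AOI scene .obj file. As a minimal sketch of how such an argument could be parsed with Python's argparse and json modules (the actual handling inside export_tobii_segment_aruco_visual_scan.py is not shown in this diff, so the option name, default and conversion below are assumptions):
```
import argparse
import json

# Sketch only: parse a -mi argument of the form
# '{"MARKER_ID":"PATH_TO_AOI_SCENE.obj",...}' into a {marker id: scene path} dict.
# The real script's option handling may differ.
parser = argparse.ArgumentParser()
parser.add_argument('-mi', '--marker_id_scene', type=json.loads, default={},
                    help='JSON mapping of ArUco marker id to AOI scene .obj path')
args = parser.parse_args()

# JSON object keys arrive as strings; convert them to integer marker ids.
marker_scenes = {int(marker_id): scene_path
                 for marker_id, scene_path in args.marker_id_scene.items()}
print(marker_scenes)
```
Passed -mi '{"5":"PATH_TO_AOI_SCENE.obj"}', this sketch would yield {5: 'PATH_TO_AOI_SCENE.obj'}.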
-- Track ArUco markers into Tobii camera video stream (-t IP_ADDRESS). Load aoi scene .obj file related to each marker (-mi MARKER_ID_SCENE), position it virtually relatively to any detected ArUco markers and project the scene into camera frame. Then, detect if Tobii gaze point is inside any AOI and send the look at pointer over Ivy default bus:
+- Track ArUco markers in the Tobii camera video stream (-t IP_ADDRESS). Load the AOI scene .obj file related to each marker (-mi MARKER_ID, PATH_TO_AOI_SCENE), position each scene virtually relative to its detected ArUco marker, then project the scene into the camera frame. Then, detect whether the Tobii gaze point is inside any AOI and send the look at pointer over the Ivy default bus:
```
-python ./src/argaze/utils/live_tobii_aruco_aoi_ivy_controller.py -t IP_ADDRESS -c export/tobii_camera.json -ms 5 -mi MARKER_ID_SCENE
+python ./src/argaze/utils/live_tobii_aruco_aoi_ivy_controller.py -t IP_ADDRESS -c export/tobii_camera.json -ms 5 -mi '{"MARKER_ID":"PATH_TO_AOI_SCENE.obj",...}'
```
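When the gaze point enters an AOI, the live script publishes a look at pointer message on the Ivy default bus. The exact message format is not visible in this diff, so the following sketch only illustrates sending a hypothetical message with the ivy-python std_api:
```
import time
from ivy.std_api import IvyInit, IvyStart, IvySendMsg, IvyStop

# Sketch only: publish a hypothetical 'looking_at' message on the Ivy default bus.
# The payload layout is an assumption, not the ArGaze wire format.
IvyInit('gaze_publisher_sketch', 'gaze_publisher_sketch ready')
IvyStart('')   # empty string selects the Ivy default bus
time.sleep(1)  # leave time for bus connections to come up
IvySendMsg('looking_at AOI_NAME 0.42 0.13')  # hypothetical AOI name and gaze coordinates
IvyStop()
```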
- Define an AOI scene from an ArUco marker (-a AOI_SCENE) and bind to the Ivy default bus to receive live look at pointer data:
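On the receiving side, such a script would bind a regular expression on the Ivy default bus and handle matching messages in a callback. Again a sketch under the same assumed, hypothetical message format, not the ArGaze wire format:
```
from ivy.std_api import IvyInit, IvyStart, IvyBindMsg, IvyMainLoop

# Sketch only: subscribe to the hypothetical 'looking_at' message used above.
def on_looking_at(agent, aoi_name, x, y):
    print(f'{agent} reports gaze at ({x}, {y}) inside {aoi_name}')

IvyInit('aoi_scene_listener_sketch', 'aoi_scene_listener_sketch ready')
IvyBindMsg(on_looking_at, r'looking_at (\S+) (\S+) (\S+)')
IvyStart('')   # empty string selects the Ivy default bus
IvyMainLoop()  # block and dispatch incoming messages
```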