Diffstat (limited to 'docs/user_guide/utils')
-rw-r--r--  docs/user_guide/utils/demonstrations_scripts.md | 88
-rw-r--r--  docs/user_guide/utils/main_commands.md          |  3
2 files changed, 78 insertions, 13 deletions
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 1d136f2..c7560eb 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -20,7 +20,32 @@ Load **random_context.json** file to generate random gaze positions:
 ```shell
 python -m argaze load ./src/argaze/utils/demo/random_context.json
 ```
 
-## OpenCV cursor context
+## CSV file context
+
+Load **csv_file_context_xy_joined.json** file to analyze gaze positions from a CSV file where gaze position coordinates are joined as a list in a single column:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_joined.json
+```
+
+Load **csv_file_context_xy_splitted.json** file to analyze gaze positions from a CSV file where gaze position coordinates are split into two separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_splitted.json
+```
+
+Load **csv_file_context_left_right_eyes.json** file to analyze gaze positions from a CSV file where gaze position coordinates and validity are given for each eye in six separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_left_right_eyes.json
+```
+
+!!! note
+    The left/right eyes context allows parsing Tobii Spectrum data, for example.
+
+## OpenCV
+
+### Cursor context
 
 Load **opencv_cursor_context.json** file to capture cursor pointer positions over OpenCV window:
@@ -28,17 +53,17 @@ Load **opencv_cursor_context.json** file to capture cursor pointer positions ove
 ```shell
 python -m argaze load ./src/argaze/utils/demo/opencv_cursor_context.json
 ```
 
-## OpenCV movie context
+### Movie context
 
-Load **opencv_movie_context.json** file to playback movie pictures and also capture cursor pointer positions over OpenCV window:
+Load **opencv_movie_context.json** file to play back a movie and also capture cursor pointer positions over OpenCV window:
 
 ```shell
 python -m argaze load ./src/argaze/utils/demo/opencv_movie_context.json
 ```
 
-## OpenCV camera context
+### Camera context
 
-Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution and to reduce the value of the *sides_mask*.
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution and to set a consistent *sides_mask* value.
 
 Edit **opencv_camera_context.json** file as to select camera device identifier (default is 0).
@@ -55,7 +80,9 @@ python -m argaze load ./src/argaze/utils/demo/opencv_camera_context.json
 ```
 
 !!! note
     This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
 
-Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*, *project* or *participant* names and setup Tobii *configuration* parameters:
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit **tobii_g2_live_stream_context.json** file to select existing IP *address*, *project* or *participant* names and to set up Tobii *configuration* parameters:
 
 ```json
 {
@@ -78,15 +105,17 @@ Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*
 }
 ```
 
-Then, load **tobii_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load **tobii_g2_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_live_stream_context.json
 ```
 
 ### Segment playback context
 
-Edit **tobii_segment_playback_context.json** file to select an existing Tobii *segment* folder:
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit **tobii_g2_segment_playback_context.json** file to select an existing Tobii *segment* folder:
 
 ```json
 {
@@ -98,12 +127,28 @@ Edit **tobii_segment_playback_context.json** file to select an existing Tobii *s
 }
 ```
 
-Then, load **tobii_segment_playback_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load **tobii_g2_segment_playback_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_segment_playback_context.json
+```
+
+## Tobii Pro Glasses 3
+
+### Live stream context
+
+!!! note
+    This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Load **tobii_g3_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g3_live_stream_context.json
 ```
+
 ## Pupil Invisible
 
 ### Live stream context
@@ -111,8 +156,25 @@ python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.jso
 !!! note
     This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
 
-Load **pupillabs_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1088, 1080]) and to set the *sides_mask* value to 4.
+
+Load **pupillabs_invisible_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/pupillabs_invisible_live_stream_context.json
+```
+
+## Pupil Neon
+
+### Live stream context
+
+!!! note
+    This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1600, 1200]) and to set the *sides_mask* value to 200.
+
+Load **pupillabs_neon_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/pupillabs_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/pupillabs_neon_live_stream_context.json
 ```
diff --git a/docs/user_guide/utils/main_commands.md b/docs/user_guide/utils/main_commands.md
index c4887a4..9227d8d 100644
--- a/docs/user_guide/utils/main_commands.md
+++ b/docs/user_guide/utils/main_commands.md
@@ -54,3 +54,6 @@ Modify the content of JSON CONFIGURATION file with another JSON CHANGES file the
 ```shell
 python -m argaze edit CONFIGURATION CHANGES OUTPUT
 ```
+
+!!! note
+    Use a *null* value to remove an entry.
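The note added to main_commands.md documents that a *null* value in the CHANGES file removes an entry from the CONFIGURATION. A minimal sketch of that merge semantic is shown below; this is an illustration only, not argaze's actual implementation, and the configuration keys used in the example are made up for demonstration:

```python
import json

def merge(config: dict, changes: dict) -> dict:
    """Recursively merge `changes` into `config`.

    A None value (JSON null) removes the corresponding entry;
    nested objects are merged key by key; other values overwrite.
    """
    result = dict(config)
    for key, value in changes.items():
        if value is None:
            result.pop(key, None)                        # null -> remove the entry
        elif isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = merge(result[key], value)      # merge nested objects
        else:
            result[key] = value                          # add or overwrite
    return result

# Hypothetical configuration and changes files (keys are illustrative).
configuration = json.loads('{"address": "10.34.0.17", "configuration": {"sys_sc_preset": "Auto", "sys_et_freq": 50}}')
changes = json.loads('{"configuration": {"sys_et_freq": 100, "sys_sc_preset": null}}')

print(merge(configuration, changes))
# → {'address': '10.34.0.17', 'configuration': {'sys_et_freq': 100}}
```

Here the null on *sys_sc_preset* deletes that entry while *sys_et_freq* is simply overwritten, matching the behavior the note describes.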