Diffstat (limited to 'docs/user_guide')
-rw-r--r--  docs/user_guide/eye_tracking_context/configuration_and_execution.md               |  5
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/file.md                      | 75
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md      | 32
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md (renamed from docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md) | 10
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md       | 32
-rw-r--r--  docs/user_guide/utils/demonstrations_scripts.md                                   | 78
-rw-r--r--  docs/user_guide/utils/main_commands.md                                            |  3
7 files changed, 220 insertions, 15 deletions
diff --git a/docs/user_guide/eye_tracking_context/configuration_and_execution.md b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
index e1123fb..3deeb57 100644
--- a/docs/user_guide/eye_tracking_context/configuration_and_execution.md
+++ b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
@@ -4,7 +4,10 @@ Edit and execute context
 
 The [utils.contexts module](../../argaze.md/#argaze.utils.contexts) provides ready-made contexts like:
 
 * [Tobii Pro Glasses 2](context_modules/tobii_pro_glasses_2.md) data capture and data playback contexts,
-* [Pupil Labs](context_modules/pupil_labs.md) data capture context,
+* [Tobii Pro Glasses 3](context_modules/tobii_pro_glasses_3.md) data capture context,
+* [Pupil Labs Invisible](context_modules/pupil_labs_invisible.md) data capture context,
+* [Pupil Labs Neon](context_modules/pupil_labs_neon.md) data capture context,
+* [File](context_modules/file.md) data playback contexts,
 * [OpenCV](context_modules/opencv.md) window cursor position capture and movie playback,
 * [Random](context_modules/random.md) gaze position generator.
diff --git a/docs/user_guide/eye_tracking_context/context_modules/file.md b/docs/user_guide/eye_tracking_context/context_modules/file.md
new file mode 100644
index 0000000..5b5c8e9
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/file.md
@@ -0,0 +1,75 @@
+File
+======
+
+ArGaze provides ready-made contexts to read data from various file formats.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+    JSON sample
+    "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## CSV
+
+::: argaze.utils.contexts.File.CSV
+
+### JSON sample: split case
+
+To use when gaze position coordinates are split into two separate columns.
+
+```json
+{
+    "argaze.utils.contexts.File.CSV": {
+        "name": "CSV file data playback",
+        "path": "./src/argaze/utils/demo/gaze_positions_splitted.csv",
+        "timestamp_column": "Timestamp (ms)",
+        "x_column": "Gaze Position X (px)",
+        "y_column": "Gaze Position Y (px)",
+        "pipeline": ...
+    }
+}
+```
+
+### JSON sample: joined case
+
+To use when gaze position coordinates are joined as a list in a single column.
+
+```json
+{
+    "argaze.utils.contexts.File.CSV": {
+        "name": "CSV file data playback",
+        "path": "./src/argaze/utils/demo/gaze_positions_xy_joined.csv",
+        "timestamp_column": "Timestamp (ms)",
+        "xy_column": "Gaze Position (px)",
+        "pipeline": ...
+    }
+}
+```
+
+### JSON sample: left and right eyes
+
+To use when gaze position coordinates and validity are given for each eye in six separate columns.
+
+```json
+{
+    "argaze.utils.contexts.File.CSV": {
+        "name": "CSV file data playback",
+        "path": "./src/argaze/utils/demo/gaze_positions_left_right_eyes.csv",
+        "timestamp_column": "Timestamp (ms)",
+        "left_eye_x_column": "Left eye X",
+        "left_eye_y_column": "Left eye Y",
+        "left_eye_validity_column": "Left eye validity",
+        "right_eye_x_column": "Right eye X",
+        "right_eye_y_column": "Right eye Y",
+        "right_eye_validity_column": "Right eye validity",
+        "rescale_to_pipeline_size": true,
+        "pipeline": ...
+    }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
new file mode 100644
index 0000000..1f4a94f
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
@@ -0,0 +1,32 @@
+Pupil Labs Invisible
+====================
+
+ArGaze provides a ready-made context to work with the Pupil Labs Invisible device.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+    JSON sample
+    "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.PupilLabsInvisible.LiveStream
+
+### JSON sample
+
+```json
+{
+    "argaze.utils.contexts.PupilLabsInvisible.LiveStream": {
+        "name": "Pupil Labs Invisible live stream",
+        "project": "my_experiment",
+        "pipeline": ...
+    }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
index d2ec336..535f5d5 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
@@ -1,7 +1,7 @@
-Pupil Labs
+Pupil Labs Neon
 ==========
 
-ArGaze provides a ready-made context to work with Pupil Labs devices.
+ArGaze provides a ready-made context to work with the Pupil Labs Neon device.
 
 To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
 Notice that the *pipeline* entry is mandatory.
@@ -17,14 +17,14 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
 
 ## Live Stream
 
-::: argaze.utils.contexts.PupilLabs.LiveStream
+::: argaze.utils.contexts.PupilLabsNeon.LiveStream
 
 ### JSON sample
 
 ```json
 {
-    "argaze.utils.contexts.PupilLabs.LiveStream": {
-        "name": "Pupil Labs live stream",
+    "argaze.utils.contexts.PupilLabsNeon.LiveStream": {
+        "name": "Pupil Labs Neon live stream",
         "project": "my_experiment",
         "pipeline": ...
     }
diff --git a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
new file mode 100644
index 0000000..3d37fcc
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
@@ -0,0 +1,32 @@
+Tobii Pro Glasses 3
+===================
+
+ArGaze provides a ready-made context to work with Tobii Pro Glasses 3 devices.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+    JSON sample
+    "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.TobiiProGlasses3.LiveStream
+
+### JSON sample
+
+```json
+{
+    "argaze.utils.contexts.TobiiProGlasses3.LiveStream": {
+        "name": "Tobii Pro Glasses 3 live stream",
+        "pipeline": ...
+    }
+}
+```
+
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 59df85b..c7560eb 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -20,6 +20,29 @@ Load **random_context.json** file to generate random gaze positions:
 python -m argaze load ./src/argaze/utils/demo/random_context.json
 ```
 
+## CSV file context
+
+Load **csv_file_context_xy_joined.json** file to analyze gaze positions from a CSV file where gaze position coordinates are joined as a list in a single column:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_joined.json
+```
+
+Load **csv_file_context_xy_splitted.json** file to analyze gaze positions from a CSV file where gaze position coordinates are split into two separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_splitted.json
+```
+
+Load **csv_file_context_left_right_eyes.json** file to analyze gaze positions from a CSV file where gaze position coordinates and validity are given for each eye in six separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_left_right_eyes.json
+```
+
+!!! note
+    The left/right eyes context makes it possible to parse Tobii Spectrum data, for example.
+
 ## OpenCV
 
 ### Cursor context
@@ -40,7 +63,7 @@ python -m argaze load ./src/argaze/utils/demo/opencv_movie_context.json
 
 ### Camera context
 
-Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution and to reduce the value of the *sides_mask*.
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution and to set a consistent *sides_mask* value.
 
 Edit **opencv_camera_context.json** file as to select camera device identifier (default is 0).
 
@@ -57,7 +80,9 @@ python -m argaze load ./src/argaze/utils/demo/opencv_camera_context.json
 !!! note
     This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
 
-Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*, *project* or *participant* names and setup Tobii *configuration* parameters:
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit **tobii_g2_live_stream_context.json** file to select an existing IP *address*, *project* or *participant* names, and to set up Tobii *configuration* parameters:
 
 ```json
 {
@@ -80,15 +105,17 @@ Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*
 }
 ```
 
-Then, load **tobii_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load **tobii_g2_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_live_stream_context.json
 ```
 
 ### Segment playback context
 
-Edit **tobii_segment_playback_context.json** file to select an existing Tobii *segment* folder:
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Edit **tobii_g2_segment_playback_context.json** file to select an existing Tobii *segment* folder:
 
 ```json
 {
@@ -100,12 +127,28 @@ Edit **tobii_segment_playback_context.json** file to select an existing Tobii *s
 }
 ```
 
-Then, load **tobii_segment_playback_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load **tobii_g2_segment_playback_context.json** file to detect ArUco markers in the camera image and project gaze positions onto AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_segment_playback_context.json
 ```
 
+## Tobii Pro Glasses 3
+
+### Live stream context
+
+!!! note
+    This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1920, 1080]) and to set the *sides_mask* value to 420.
+
+Load **tobii_g3_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/tobii_g3_live_stream_context.json
+```
+
 ## Pupil Invisible
 
 ### Live stream context
@@ -113,8 +156,25 @@ python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.jso
 !!! note
     This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
 
-Load **pupillabs_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1088, 1080]) and to set the *sides_mask* value to 4.
+
+Load **pupillabs_invisible_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/pupillabs_invisible_live_stream_context.json
+```
+
+## Pupil Neon
+
+### Live stream context
+
+!!! note
+    This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit **aruco_markers_pipeline.json** file to adapt the *size* to the camera resolution ([1600, 1200]) and to set the *sides_mask* value to 200.
+
+Load **pupillabs_neon_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions onto AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/pupillabs_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/pupillabs_neon_live_stream_context.json
 ```
diff --git a/docs/user_guide/utils/main_commands.md b/docs/user_guide/utils/main_commands.md
index c4887a4..9227d8d 100644
--- a/docs/user_guide/utils/main_commands.md
+++ b/docs/user_guide/utils/main_commands.md
@@ -54,3 +54,6 @@ Modify the content of JSON CONFIGURATION file with another JSON CHANGES file the
 ```shell
 python -m argaze edit CONFIGURATION CHANGES OUTPUT
 ```
+
+!!! note
+    Use *null* value to remove an entry.
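The left/right-eyes CSV layout documented in the new File.CSV context above (six columns: X, Y, and a validity flag per eye) can be sketched in plain Python. This is an illustrative sketch only: the column names are taken from the JSON sample, while the merging rule (average whichever eyes are flagged valid) is an assumption for demonstration, not ArGaze's actual implementation.

```python
import csv
import io

# Hypothetical data in the left/right-eyes layout described by the
# File.CSV JSON sample; column names match that sample.
SAMPLE = """Timestamp (ms),Left eye X,Left eye Y,Left eye validity,Right eye X,Right eye Y,Right eye validity
0,100,200,1,110,210,1
20,,,0,115,215,1
40,,,0,,,0
"""

def merge_eyes(row):
    """Return one (x, y) gaze position built from the valid eyes, or None."""
    eyes = []
    for side in ("Left", "Right"):
        if row[f"{side} eye validity"] == "1":
            eyes.append((float(row[f"{side} eye X"]), float(row[f"{side} eye Y"])))
    if not eyes:
        return None  # no valid eye data at this timestamp
    # Average the available eyes (both when valid, otherwise the single one)
    x = sum(e[0] for e in eyes) / len(eyes)
    y = sum(e[1] for e in eyes) / len(eyes)
    return (x, y)

positions = {}
for row in csv.DictReader(io.StringIO(SAMPLE)):
    positions[int(row["Timestamp (ms)"])] = merge_eyes(row)

print(positions)
```

A context reading such a file would typically also rescale the resulting positions to the pipeline frame size, which is what the `rescale_to_pipeline_size` entry in the sample suggests.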