Diffstat (limited to 'docs/user_guide')

 docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md   | 12
 docs/user_guide/eye_tracking_context/configuration_and_execution.md          |  5
 docs/user_guide/eye_tracking_context/context_modules/file.md                 | 75
 docs/user_guide/eye_tracking_context/context_modules/opencv.md               | 18
 docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md | 32
 docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md (renamed from docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md) | 10
 docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md  | 32
 docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md                       |  5
 docs/user_guide/utils/demonstrations_scripts.md                              | 88
 docs/user_guide/utils/estimate_aruco_markers_pose.md                         |  2
 docs/user_guide/utils/main_commands.md                                       |  3

 11 files changed, 256 insertions(+), 26 deletions(-)
diff --git a/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md b/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md
index 17dfea9..a543bc7 100644
--- a/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md
+++ b/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md
@@ -7,7 +7,7 @@ The [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) class interface
 
 Besides, there is also a [DataCaptureContext](../../../argaze.md/#argaze.ArFeatures.DataCaptureContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines an abstract *calibrate* method to write specific device calibration process.
 
-In the same way, there is a [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines abstract *previous* and *next* playback methods to move into record's frames and also defines *duration* and *progression* properties to get information about a record length and playback advancement.
+In the same way, there is a [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines *duration* and *progression* properties to get information about a record length and playback advancement.
 
 Finally, a specific eye tracking context can be defined into a Python file by writing a class that inherits either from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext), [DataCaptureContext](../../../argaze.md/#argaze.ArFeatures.DataCaptureContext) or [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext) class.
@@ -182,12 +182,14 @@ class DataPlaybackExample(ArFeatures.DataPlaybackContext):
 
             # Stop playback threads
             threading.Thread.join(self.__data_thread)
 
-    def previous(self):
-        """Go to previous camera image frame."""
+    @property
+    def duration(self) -> int|float:
+        """Get data duration."""
         ...
 
-    def next(self):
-        """Go to next camera image frame."""
+    @property
+    def progression(self) -> float:
+        """Get data playback progression between 0 and 1."""
         ...
 ```
diff --git a/docs/user_guide/eye_tracking_context/configuration_and_execution.md b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
index e1123fb..3deeb57 100644
--- a/docs/user_guide/eye_tracking_context/configuration_and_execution.md
+++ b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
@@ -4,7 +4,10 @@ Edit and execute context
 The [utils.contexts module](../../argaze.md/#argaze.utils.contexts) provides ready-made contexts like:
 
 * [Tobii Pro Glasses 2](context_modules/tobii_pro_glasses_2.md) data capture and data playback contexts,
-* [Pupil Labs](context_modules/pupil_labs.md) data capture context,
+* [Tobii Pro Glasses 3](context_modules/tobii_pro_glasses_3.md) data capture context,
+* [Pupil Labs Invisible](context_modules/pupil_labs_invisible.md) data capture context,
+* [Pupil Labs Neon](context_modules/pupil_labs_neon.md) data capture context,
+* [File](context_modules/file.md) data playback contexts,
 * [OpenCV](context_modules/opencv.md) window cursor position capture and movie playback,
 * [Random](context_modules/random.md) gaze position generator.
diff --git a/docs/user_guide/eye_tracking_context/context_modules/file.md b/docs/user_guide/eye_tracking_context/context_modules/file.md
new file mode 100644
index 0000000..5b5c8e9
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/file.md
@@ -0,0 +1,75 @@
+File
+======
+
+ArGaze provides ready-made contexts to read data from various file formats.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+    JSON sample
+    "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## CSV
+
+::: argaze.utils.contexts.File.CSV
+
+### JSON sample: split case
+
+To use when gaze position coordinates are split into two separate columns.
+
+```json
+{
+    "argaze.utils.contexts.File.CSV": {
+        "name": "CSV file data playback",
+        "path": "./src/argaze/utils/demo/gaze_positions_splitted.csv",
+        "timestamp_column": "Timestamp (ms)",
+        "x_column": "Gaze Position X (px)",
+        "y_column": "Gaze Position Y (px)",
+        "pipeline": ...
+    }
+}
+```
+
+### JSON sample: joined case
+
+To use when gaze position coordinates are joined as a list in a single column.
+
+```json
+{
+    "argaze.utils.contexts.File.CSV" : {
+        "name": "CSV file data playback",
+        "path": "./src/argaze/utils/demo/gaze_positions_xy_joined.csv",
+        "timestamp_column": "Timestamp (ms)",
+        "xy_column": "Gaze Position (px)",
+        "pipeline": ...
+    }
+}
+```
+
+### JSON sample: left and right eyes
+
+To use when gaze position coordinates and validity are given for each eye in six separate columns.
+
+```json
+{
+    "argaze.utils.contexts.File.CSV": {
+        "name": "CSV file data playback",
+        "path": "./src/argaze/utils/demo/gaze_positions_left_right_eyes.csv",
+        "timestamp_column": "Timestamp (ms)",
+        "left_eye_x_column": "Left eye X",
+        "left_eye_y_column": "Left eye Y",
+        "left_eye_validity_column": "Left eye validity",
+        "right_eye_x_column": "Right eye X",
+        "right_eye_y_column": "Right eye Y",
+        "right_eye_validity_column": "Right eye validity",
+        "rescale_to_pipeline_size": true,
+        "pipeline": ...
+    }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/opencv.md b/docs/user_guide/eye_tracking_context/context_modules/opencv.md
index 7244cd4..7d73a03 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/opencv.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/opencv.md
@@ -39,9 +39,25 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
 
 ```json
 {
     "argaze.utils.contexts.OpenCV.Movie": {
-        "name": "Open CV cursor",
+        "name": "Open CV movie",
         "path": "./src/argaze/utils/demo/tobii_record/segments/1/fullstream.mp4",
         "pipeline": ...
     }
 }
 ```
+
+## Camera
+
+::: argaze.utils.contexts.OpenCV.Camera
+
+### JSON sample
+
+```json
+{
+    "argaze.utils.contexts.OpenCV.Camera": {
+        "name": "Open CV camera",
+        "identifier": 0,
+        "pipeline": ...
+    }
+}
+```
\ No newline at end of file
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
new file mode 100644
index 0000000..1f4a94f
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
@@ -0,0 +1,32 @@
+Pupil Labs Invisible
+==========
+
+ArGaze provides a ready-made context to work with the Pupil Labs Invisible device.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+    JSON sample
+    "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.PupilLabsInvisible.LiveStream
+
+### JSON sample
+
+```json
+{
+    "argaze.utils.contexts.PupilLabsInvisible.LiveStream": {
+        "name": "Pupil Labs Invisible live stream",
+        "project": "my_experiment",
+        "pipeline": ...
+    }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
index d2ec336..535f5d5 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
@@ -1,7 +1,7 @@
-Pupil Labs
+Pupil Labs Neon
 ==========
 
-ArGaze provides a ready-made context to work with Pupil Labs devices.
+ArGaze provides a ready-made context to work with the Pupil Labs Neon device.
 
 To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
 Notice that the *pipeline* entry is mandatory.
@@ -17,14 +17,14 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
 
 ## Live Stream
 
-::: argaze.utils.contexts.PupilLabs.LiveStream
+::: argaze.utils.contexts.PupilLabsNeon.LiveStream
 
 ### JSON sample
 
 ```json
 {
-    "argaze.utils.contexts.PupilLabs.LiveStream": {
-        "name": "Pupil Labs live stream",
+    "argaze.utils.contexts.PupilLabsNeon.LiveStream": {
+        "name": "Pupil Labs Neon live stream",
         "project": "my_experiment",
         "pipeline": ...
     }
diff --git a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
new file mode 100644
index 0000000..3d37fcc
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
@@ -0,0 +1,32 @@
+Tobii Pro Glasses 3
+===================
+
+ArGaze provides a ready-made context to work with Tobii Pro Glasses 3 devices.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+    JSON sample
+    "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.TobiiProGlasses3.LiveStream
+
+### JSON sample
+
+```json
+{
+    "argaze.utils.contexts.TobiiProGlasses3.LiveStream": {
+        "name": "Tobii Pro Glasses 3 live stream",
+        "pipeline": ...
+    }
+}
+```
+
diff --git a/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
index 2b64091..c2a6ac3 100644
--- a/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
@@ -100,6 +100,11 @@ The second [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) pipeline step a
 
 Once gaze movements are matched to AOI, they are automatically appended to the AOIScanPath if required.
 
+!!! warning "GazeFeatures.OutsideAOI"
+    When a fixation is not looking at any AOI, a step associated with an AOI called [GazeFeatures.OutsideAOI](../../argaze.md/#argaze.GazeFeatures.OutsideAOI) is added. As long as fixations are not looking at any AOI, all fixations/saccades are stored in this step. In this way, further analyses are calculated considering those extra [GazeFeatures.OutsideAOI](../../argaze.md/#argaze.GazeFeatures.OutsideAOI) steps.
+
+    This is particularly important when calculating transition matrices, because otherwise we could have arcs between two AOIs when in fact the gaze may have fixated outside them in the meantime.
+
 The [AOIScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the duration from which older AOI scan steps are removed each time new AOI scan steps are added.
 
 !!! note "Optional"
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index 1d136f2..c7560eb 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -20,7 +20,32 @@ Load **random_context.json** file to generate random gaze positions:
 python -m argaze load ./src/argaze/utils/demo/random_context.json
 ```
 
-## OpenCV cursor context
+## CSV file context
+
+Load **csv_file_context_xy_joined.json** file to analyze gaze positions from a CSV file where gaze position coordinates are joined as a list in a single column:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_joined.json
+```
+
+Load **csv_file_context_xy_splitted.json** file to analyze gaze positions from a CSV file where gaze position coordinates are split into two separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_xy_splitted.json
+```
+
+Load **csv_file_context_left_right_eyes.json** file to analyze gaze positions from a CSV file where gaze position coordinates and validity are given for each eye in six separate columns:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/csv_file_context_left_right_eyes.json
+```
+
+!!! note
+    The left/right eyes context makes it possible to parse Tobii Spectrum data, for example.
+
+## OpenCV
+
+### Cursor context
 
 Load **opencv_cursor_context.json** file to capture cursor pointer positions over OpenCV window:
 
@@ -28,17 +53,17 @@ Load **opencv_cursor_context.json** file to capture cursor pointer positions ove
 python -m argaze load ./src/argaze/utils/demo/opencv_cursor_context.json
 ```
 
-## OpenCV movie context
+### Movie context
 
-Load **opencv_movie_context.json** file to playback movie pictures and also capture cursor pointer positions over OpenCV window:
+Load **opencv_movie_context.json** file to play back a movie and also capture cursor pointer positions over OpenCV window:
 
 ```shell
 python -m argaze load ./src/argaze/utils/demo/opencv_movie_context.json
 ```
 
-## OpenCV camera context
+### Camera context
 
-Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution and to reduce the value of the *sides_mask*.
+Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution and to set a consistent *sides_mask* value.
 
 Edit **opencv_camera_context.json** file as to select camera device identifier (default is 0).
 
@@ -55,7 +80,9 @@ python -m argaze load ./src/argaze/utils/demo/opencv_camera_context.json
 !!! note
     This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
 
-Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*, *project* or *participant* names and setup Tobii *configuration* parameters:
+Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution ([1920, 1080]) and to set *sides_mask* value to 420.
+
+Edit **tobii_g2_live_stream_context.json** file as to select existing IP *address*, *project* or *participant* names and setup Tobii *configuration* parameters:
 
 ```json
 {
@@ -78,15 +105,17 @@ Edit **tobii_live_stream_context.json** file as to select exisiting IP *address*
 }
 ```
 
-Then, load **tobii_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load **tobii_g2_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_live_stream_context.json
 ```
 
 ### Segment playback context
 
-Edit **tobii_segment_playback_context.json** file to select an existing Tobii *segment* folder:
+Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution ([1920, 1080]) and to set *sides_mask* value to 420.
+
+Edit **tobii_g2_segment_playback_context.json** file to select an existing Tobii *segment* folder:
 
 ```json
 {
@@ -98,12 +127,28 @@ Edit **tobii_segment_playback_context.json** file to select an existing Tobii *s
 }
 ```
 
-Then, load **tobii_segment_playback_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Then, load **tobii_g2_segment_playback_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/tobii_g2_segment_playback_context.json
+```
+
+## Tobii Pro Glasses 3
+
+### Live stream context
+
+!!! note
+    This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution ([1920, 1080]) and to set *sides_mask* value to 420.
+
+Load **tobii_g3_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.json
+python -m argaze load ./src/argaze/utils/demo/tobii_g3_live_stream_context.json
 ```
+
 ## Pupil Invisible
 
 ### Live stream context
@@ -111,8 +156,25 @@ python -m argaze load ./src/argaze/utils/demo/tobii_segment_playback_context.jso
 !!! note
     This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
 
-Load **pupillabs_live_stream_context.json** file to find ArUco marker into camera image and, project gaze positions into AOI:
+Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution ([1088, 1080]) and to set *sides_mask* value to 4.
+
+Load **pupillabs_invisible_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
+
+```shell
+python -m argaze load ./src/argaze/utils/demo/pupillabs_invisible_live_stream_context.json
+```
+
+## Pupil Neon
+
+### Live stream context
+
+!!! note
+    This demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+
+Edit **aruco_markers_pipeline.json** file as to adapt the *size* to the camera resolution ([1600, 1200]) and to set *sides_mask* value to 200.
+
+Load **pupillabs_neon_live_stream_context.json** file to find ArUco markers in the camera image and project gaze positions into AOI:
 
 ```shell
-python -m argaze load ./src/argaze/utils/demo/pupillabs_live_stream_context.json
+python -m argaze load ./src/argaze/utils/demo/pupillabs_neon_live_stream_context.json
 ```
diff --git a/docs/user_guide/utils/estimate_aruco_markers_pose.md b/docs/user_guide/utils/estimate_aruco_markers_pose.md
index d1fd16e..55bd232 100644
--- a/docs/user_guide/utils/estimate_aruco_markers_pose.md
+++ b/docs/user_guide/utils/estimate_aruco_markers_pose.md
@@ -15,7 +15,7 @@ Firstly, edit **utils/estimate_markers_pose/context.json** file as to select a m
 }
 ```
 
-Sencondly, edit **utils/estimate_markers_pose/pipeline.json** file to setup ArUco camera *size*, ArUco detector *dictionary*, *pose_size* and *pose_ids* attributes.
+Secondly, edit **utils/estimate_markers_pose/pipeline.json** file to setup ArUco camera *size*, ArUco detector *dictionary*, *pose_size* and *pose_ids* attributes.
 
 ```json
 {
diff --git a/docs/user_guide/utils/main_commands.md b/docs/user_guide/utils/main_commands.md
index c4887a4..9227d8d 100644
--- a/docs/user_guide/utils/main_commands.md
+++ b/docs/user_guide/utils/main_commands.md
@@ -54,3 +54,6 @@ Modify the content of JSON CONFIGURATION file with another JSON CHANGES file the
 ```shell
 python -m argaze edit CONFIGURATION CHANGES OUTPUT
 ```
+
+!!! note
+    Use *null* value to remove an entry.