Diffstat (limited to 'docs/user_guide/eye_tracking_context')
-rw-r--r--  docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md  |  82
-rw-r--r--  docs/user_guide/eye_tracking_context/advanced_topics/scripting.md  |   8
-rw-r--r--  docs/user_guide/eye_tracking_context/advanced_topics/timestamped_gaze_positions_edition.md  |   4
-rw-r--r--  docs/user_guide/eye_tracking_context/configuration_and_execution.md  |   9
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/file.md  |  75
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/opencv.md  |  18
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md  |  32
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md (renamed from docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md)  |  10
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md  |   8
-rw-r--r--  docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md  |  32
10 files changed, 219 insertions, 59 deletions
diff --git a/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md b/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md
index c163696..a543bc7 100644
--- a/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md
+++ b/docs/user_guide/eye_tracking_context/advanced_topics/context_definition.md
@@ -3,27 +3,27 @@ Define a context class
The [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) class defines a generic base class interface to handle incoming eye tracker data before passing them to a processing pipeline, following the [Python context manager protocol](https://docs.python.org/3/reference/datamodel.html#context-managers).
-The [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) class interface provides playback features to stop or pause processings, performance assement features to measure how many times processings are called and the time spent by the process.
+The [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) class interface provides control features to stop or pause working threads, and performance assessment features to measure how often processing steps are called and how much time they take.
-Besides, there is also a [LiveProcessingContext](../../../argaze.md/#argaze.ArFeatures.LiveProcessingContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines an abstract *calibrate* method to write specific device calibration process.
+Besides, there is also a [DataCaptureContext](../../../argaze.md/#argaze.ArFeatures.DataCaptureContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines an abstract *calibrate* method to implement a device-specific calibration process.
-In the same way, there is a [PostProcessingContext](../../../argaze.md/#argaze.ArFeatures.PostProcessingContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines abstract *previous* and *next* playback methods to move into record's frames and also defines *duration* and *progression* properties to get information about a record length and processing advancement.
+In the same way, there is a [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and that defines *duration* and *progression* properties to get information about the record length and playback progress.
-Finally, a specific eye tracking context can be defined into a Python file by writing a class that inherits either from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext), [LiveProcessingContext](../../../argaze.md/#argaze.ArFeatures.LiveProcessingContext) or [PostProcessingContext](../../../argaze.md/#argaze.ArFeatures.PostProcessingContext) class.
+Finally, a specific eye tracking context can be defined in a Python file by writing a class that inherits from either the [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext), [DataCaptureContext](../../../argaze.md/#argaze.ArFeatures.DataCaptureContext) or [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext) class.
-## Write live processing context
+## Write data capture context
-Here is a live processing context example that processes gaze positions and camera images in two separated threads:
+Here is a data capture context example that captures gaze positions and camera images in two separate threads:
```python
from argaze import ArFeatures, DataFeatures
-class LiveProcessingExample(ArFeatures.LiveProcessingContext):
+class DataCaptureExample(ArFeatures.DataCaptureContext):
@DataFeatures.PipelineStepInit
def __init__(self, **kwargs):
- # Init LiveProcessingContext class
+ # Init DataCaptureContext class
super().__init__()
# Init private attribute
@@ -45,23 +45,23 @@ class LiveProcessingExample(ArFeatures.LiveProcessingContext):
# Start context according to any specific parameter
... self.parameter
- # Start a gaze position processing thread
- self.__gaze_thread = threading.Thread(target = self.__gaze_position_processing)
+ # Start a gaze position capture thread
+ self.__gaze_thread = threading.Thread(target = self.__gaze_position_capture)
self.__gaze_thread.start()
- # Start a camera image processing thread if applicable
- self.__camera_thread = threading.Thread(target = self.__camera_image_processing)
+ # Start a camera image capture thread if applicable
+ self.__camera_thread = threading.Thread(target = self.__camera_image_capture)
self.__camera_thread.start()
return self
- def __gaze_position_processing(self):
- """Process gaze position."""
+ def __gaze_position_capture(self):
+ """Capture gaze position."""
- # Processing loop
+ # Capture loop
while self.is_running():
- # Pause processing
+ # Pause capture
if not self.is_paused():
# Assuming that timestamp, x and y values are available
@@ -73,13 +73,13 @@ class LiveProcessingExample(ArFeatures.LiveProcessingContext):
# Optionally wait some time
...
- def __camera_image_processing(self):
- """Process camera image if applicable."""
+ def __camera_image_capture(self):
+ """Capture camera image if applicable."""
- # Processing loop
+ # Capture loop
while self.is_running():
- # Pause processing
+ # Pause capture
if not self.is_paused():
# Assuming that timestamp, camera_image are available
@@ -95,10 +95,10 @@ class LiveProcessingExample(ArFeatures.LiveProcessingContext):
def __exit__(self, exception_type, exception_value, exception_traceback):
"""End context."""
- # Stop processing loops
+ # Stop capture loops
self.stop()
- # Stop processing threads
+ # Stop capture threads
threading.Thread.join(self.__gaze_thread)
threading.Thread.join(self.__camera_thread)
@@ -108,19 +108,19 @@ class LiveProcessingExample(ArFeatures.LiveProcessingContext):
...
```
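The start/pause/stop pattern above can be tried out in plain Python. The sketch below uses only the standard library; the `CaptureLoop` class, its queue and its method names are illustrative stand-ins, not part of the ArGaze API:

```python
import queue
import threading
import time


class CaptureLoop:
    """Minimal capture loop mirroring the is_running()/is_paused()
    pattern of the context example above."""

    def __init__(self):
        self.__running = threading.Event()
        self.__paused = threading.Event()
        self.__samples = queue.Queue()

    def __enter__(self):
        # Start the capture thread, as in the context's __enter__ method
        self.__running.set()
        self.__thread = threading.Thread(target=self.__capture)
        self.__thread.start()
        return self

    def __capture(self):
        # Capture loop: runs until stopped, skips work while paused
        while self.__running.is_set():
            if not self.__paused.is_set():
                # Stand-in for reading one gaze sample from a device
                self.__samples.put((time.time(), 0, 0))
            time.sleep(0.01)

    def pause(self):
        self.__paused.set()

    def sample_count(self):
        return self.__samples.qsize()

    def __exit__(self, exception_type, exception_value, exception_traceback):
        # Stop the loop and join the thread, as in the context's __exit__
        self.__running.clear()
        self.__thread.join()


with CaptureLoop() as loop:
    time.sleep(0.1)
    loop.pause()
    captured = loop.sample_count()
```

Using `threading.Event` for the running and paused flags keeps the checks thread-safe without an explicit lock.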
-## Write post processing context
+## Write data playback context
-Here is a post processing context example that processes gaze positions and camera images in a same thread:
+Here is a data playback context example that reads gaze positions and camera images in the same thread:
```python
from argaze import ArFeatures, DataFeatures
-class PostProcessingExample(ArFeatures.PostProcessingContext):
+class DataPlaybackExample(ArFeatures.DataPlaybackContext):
@DataFeatures.PipelineStepInit
def __init__(self, **kwargs):
- # Init LiveProcessingContext class
+        # Init DataPlaybackContext class
super().__init__()
# Init private attribute
@@ -142,19 +142,19 @@ class PostProcessingExample(ArFeatures.PostProcessingContext):
# Start context according to any specific parameter
... self.parameter
- # Start a reading data thread
- self.__read_thread = threading.Thread(target = self.__data_reading)
- self.__read_thread.start()
+ # Start a data playback thread
+ self.__data_thread = threading.Thread(target = self.__data_playback)
+ self.__data_thread.start()
return self
- def __data_reading(self):
- """Process gaze position and camera image if applicable."""
+ def __data_playback(self):
+        """Play back gaze position and camera image if applicable."""
- # Processing loop
+ # Playback loop
while self.is_running():
- # Pause processing
+ # Pause playback
if not self.is_paused():
# Assuming that timestamp, camera_image are available
@@ -176,18 +176,20 @@ class PostProcessingExample(ArFeatures.PostProcessingContext):
def __exit__(self, exception_type, exception_value, exception_traceback):
"""End context."""
- # Stop processing loops
+ # Stop playback loop
self.stop()
- # Stop processing threads
- threading.Thread.join(self.__read_thread)
+ # Stop playback threads
+ threading.Thread.join(self.__data_thread)
- def previous(self):
- """Go to previous camera image frame."""
+ @property
+ def duration(self) -> int|float:
+ """Get data duration."""
...
- def next(self):
- """Go to next camera image frame."""
+ @property
+ def progression(self) -> float:
+ """Get data playback progression between 0 and 1."""
...
```
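For a file-backed playback source, the two properties can often be derived from the recorded timestamps alone; here is a standard-library sketch under that assumption (class and attribute names are illustrative, not the ArGaze implementation):

```python
class PlaybackProgress:
    """Derive duration and progression from recorded timestamps (ms)."""

    def __init__(self, timestamps):
        self.__timestamps = timestamps
        self.__index = 0

    def advance(self):
        # Move to the next recorded sample, stopping at the last one
        if self.__index < len(self.__timestamps) - 1:
            self.__index += 1

    @property
    def duration(self):
        """Total record length in milliseconds."""
        return self.__timestamps[-1] - self.__timestamps[0]

    @property
    def progression(self):
        """Playback advancement between 0 and 1."""
        elapsed = self.__timestamps[self.__index] - self.__timestamps[0]
        return elapsed / self.duration


progress = PlaybackProgress([0, 40, 80, 120])
progress.advance()
progress.advance()
```

Note that `duration` depends only on the first and last timestamps, while `progression` tracks the current playback index.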
diff --git a/docs/user_guide/eye_tracking_context/advanced_topics/scripting.md b/docs/user_guide/eye_tracking_context/advanced_topics/scripting.md
index 8753eb6..d8eb389 100644
--- a/docs/user_guide/eye_tracking_context/advanced_topics/scripting.md
+++ b/docs/user_guide/eye_tracking_context/advanced_topics/scripting.md
@@ -68,12 +68,12 @@ from argaze import ArFeatures
# Check context type
- # Live processing case: calibration method is available
- if issubclass(type(context), ArFeatures.LiveProcessingContext):
+ # Data capture case: calibration method is available
+ if issubclass(type(context), ArFeatures.DataCaptureContext):
...
- # Post processing case: more playback methods are available
- if issubclass(type(context), ArFeatures.PostProcessingContext):
+ # Data playback case: playback methods are available
+ if issubclass(type(context), ArFeatures.DataPlaybackContext):
...
# Check pipeline type
diff --git a/docs/user_guide/eye_tracking_context/advanced_topics/timestamped_gaze_positions_edition.md b/docs/user_guide/eye_tracking_context/advanced_topics/timestamped_gaze_positions_edition.md
index 340dbaf..959d955 100644
--- a/docs/user_guide/eye_tracking_context/advanced_topics/timestamped_gaze_positions_edition.md
+++ b/docs/user_guide/eye_tracking_context/advanced_topics/timestamped_gaze_positions_edition.md
@@ -28,8 +28,8 @@ for timestamped_gaze_position in ts_gaze_positions:
## Edit timestamped gaze positions from live stream
-Real-time gaze positions can be edited thanks to the [GazePosition](../../../argaze.md/#argaze.GazeFeatures.GazePosition) class.
-Besides, timestamps can be edited from the incoming data stream or, if not available, they can be edited thanks to the Python [time package](https://docs.python.org/3/library/time.html).
+Real-time gaze positions can be edited using directly the [GazePosition](../../../argaze.md/#argaze.GazeFeatures.GazePosition) class.
+Besides, timestamps can be edited from the incoming data stream or, if not available, they can be edited using the Python [time package](https://docs.python.org/3/library/time.html).
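For instance, millisecond timestamps can be derived from `time.time()` relative to the stream start. The loop below is a standard-library sketch; the `(timestamp, x, y)` tuples only illustrate the idea and are not the `GazePosition` signature:

```python
import time

start = time.time()
timestamped_samples = []

# Stand-in for an incoming stream of (x, y) gaze coordinates
for x, y in [(10, 10), (20, 20), (30, 30)]:
    # Millisecond timestamp relative to the start of the stream
    timestamp = int((time.time() - start) * 1e3)
    timestamped_samples.append((timestamp, x, y))
```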
```python
from argaze import GazeFeatures
diff --git a/docs/user_guide/eye_tracking_context/configuration_and_execution.md b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
index f13c6a2..3deeb57 100644
--- a/docs/user_guide/eye_tracking_context/configuration_and_execution.md
+++ b/docs/user_guide/eye_tracking_context/configuration_and_execution.md
@@ -3,9 +3,12 @@ Edit and execute context
The [utils.contexts module](../../argaze.md/#argaze.utils.contexts) provides ready-made contexts like:
-* [Tobii Pro Glasses 2](context_modules/tobii_pro_glasses_2.md) live stream and post processing contexts,
-* [Pupil Labs](context_modules/pupil_labs.md) live stream context,
-* [OpenCV](context_modules/opencv.md) window cursor position and movie processing,
+* [Tobii Pro Glasses 2](context_modules/tobii_pro_glasses_2.md) data capture and data playback contexts,
+* [Tobii Pro Glasses 3](context_modules/tobii_pro_glasses_3.md) data capture context,
+* [Pupil Labs Invisible](context_modules/pupil_labs_invisible.md) data capture context,
+* [Pupil Labs Neon](context_modules/pupil_labs_neon.md) data capture context,
+* [File](context_modules/file.md) data playback contexts,
+* [OpenCV](context_modules/opencv.md) window cursor position capture and movie playback,
* [Random](context_modules/random.md) gaze position generator.
## Edit JSON configuration
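A configuration file pairs one of the context entries above with the mandatory *pipeline* entry. The skeleton below only illustrates the shape: the name and paths are placeholders, and a pipeline file reference is assumed here (whether *pipeline* holds a nested object or a file reference is covered by the pipeline documentation):

```json
{
    "argaze.utils.contexts.OpenCV.Movie": {
        "name": "My movie playback",
        "path": "./my_movie.mp4",
        "pipeline": "my_pipeline.json"
    }
}
```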
diff --git a/docs/user_guide/eye_tracking_context/context_modules/file.md b/docs/user_guide/eye_tracking_context/context_modules/file.md
new file mode 100644
index 0000000..5b5c8e9
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/file.md
@@ -0,0 +1,75 @@
+File
+======
+
+ArGaze provides ready-made contexts to read data from various file formats.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## CSV
+
+::: argaze.utils.contexts.File.CSV
+
+### JSON sample: split case
+
+To use when gaze position coordinates are split into two separate columns.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV": {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_splitted.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "x_column": "Gaze Position X (px)",
+ "y_column": "Gaze Position Y (px)",
+ "pipeline": ...
+ }
+}
+```
+
+### JSON sample: joined case
+
+To use when gaze position coordinates are joined as a list in a single column.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV" : {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_xy_joined.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "xy_column": "Gaze Position (px)",
+ "pipeline": ...
+ }
+}
+```
+
+### JSON sample: left and right eyes
+
+To use when gaze position coordinates and validity are given for each eye in six separate columns.
+
+```json
+{
+ "argaze.utils.contexts.File.CSV": {
+ "name": "CSV file data playback",
+ "path": "./src/argaze/utils/demo/gaze_positions_left_right_eyes.csv",
+ "timestamp_column": "Timestamp (ms)",
+ "left_eye_x_column": "Left eye X",
+ "left_eye_y_column": "Left eye Y",
+ "left_eye_validity_column": "Left eye validity",
+ "right_eye_x_column": "Right eye X",
+ "right_eye_y_column": "Right eye Y",
+ "right_eye_validity_column": "Right eye validity",
+ "rescale_to_pipeline_size": true,
+ "pipeline": ...
+ }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/opencv.md b/docs/user_guide/eye_tracking_context/context_modules/opencv.md
index 7244cd4..7d73a03 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/opencv.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/opencv.md
@@ -39,9 +39,25 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
```json
{
"argaze.utils.contexts.OpenCV.Movie": {
- "name": "Open CV cursor",
+ "name": "Open CV movie",
"path": "./src/argaze/utils/demo/tobii_record/segments/1/fullstream.mp4",
"pipeline": ...
}
}
```
+
+## Camera
+
+::: argaze.utils.contexts.OpenCV.Camera
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.OpenCV.Camera": {
+ "name": "Open CV camera",
+ "identifier": 0,
+ "pipeline": ...
+ }
+}
+```
\ No newline at end of file
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
new file mode 100644
index 0000000..1f4a94f
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_invisible.md
@@ -0,0 +1,32 @@
+Pupil Labs Invisible
+==========
+
+ArGaze provides a ready-made context to work with the Pupil Labs Invisible device.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.PupilLabsInvisible.LiveStream
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.PupilLabsInvisible.LiveStream": {
+ "name": "Pupil Labs Invisible live stream",
+ "project": "my_experiment",
+ "pipeline": ...
+ }
+}
+```
diff --git a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
index d2ec336..535f5d5 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/pupil_labs.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/pupil_labs_neon.md
@@ -1,7 +1,7 @@
-Pupil Labs
+Pupil Labs Neon
==========
-ArGaze provides a ready-made context to work with Pupil Labs devices.
+ArGaze provides a ready-made context to work with the Pupil Labs Neon device.
To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
Notice that the *pipeline* entry is mandatory.
@@ -17,14 +17,14 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
## Live Stream
-::: argaze.utils.contexts.PupilLabs.LiveStream
+::: argaze.utils.contexts.PupilLabsNeon.LiveStream
### JSON sample
```json
{
- "argaze.utils.contexts.PupilLabs.LiveStream": {
- "name": "Pupil Labs live stream",
+ "argaze.utils.contexts.PupilLabsNeon.LiveStream": {
+ "name": "Pupil Labs Neon live stream",
"project": "my_experiment",
"pipeline": ...
}
diff --git a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md
index fba6931..6ff44bd 100644
--- a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md
+++ b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_2.md
@@ -42,16 +42,16 @@ Read more about [ArContext base class in code reference](../../../argaze.md/#arg
}
```
-## Post Processing
+## Segment Playback
-::: argaze.utils.contexts.TobiiProGlasses2.PostProcessing
+::: argaze.utils.contexts.TobiiProGlasses2.SegmentPlayback
### JSON sample
```json
{
- "argaze.utils.contexts.TobiiProGlasses2.PostProcessing" : {
- "name": "Tobii Pro Glasses 2 post-processing",
+ "argaze.utils.contexts.TobiiProGlasses2.SegmentPlayback" : {
+ "name": "Tobii Pro Glasses 2 segment playback",
"segment": "./src/argaze/utils/demo/tobii_record/segments/1",
"pipeline": ...
}
diff --git a/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
new file mode 100644
index 0000000..3d37fcc
--- /dev/null
+++ b/docs/user_guide/eye_tracking_context/context_modules/tobii_pro_glasses_3.md
@@ -0,0 +1,32 @@
+Tobii Pro Glasses 3
+===================
+
+ArGaze provides a ready-made context to work with Tobii Pro Glasses 3 devices.
+
+To select a desired context, the JSON samples have to be edited and saved inside an [ArContext configuration](../configuration_and_execution.md) file.
+Notice that the *pipeline* entry is mandatory.
+
+```json
+{
+ JSON sample
+ "pipeline": ...
+}
+```
+
+Read more about [ArContext base class in code reference](../../../argaze.md/#argaze.ArFeatures.ArContext).
+
+## Live Stream
+
+::: argaze.utils.contexts.TobiiProGlasses3.LiveStream
+
+### JSON sample
+
+```json
+{
+ "argaze.utils.contexts.TobiiProGlasses3.LiveStream": {
+ "name": "Tobii Pro Glasses 3 live stream",
+ "pipeline": ...
+ }
+}
+```
+