author    Théo de la Hogue  2024-04-23 07:58:41 +0200
committer Théo de la Hogue  2024-04-23 07:58:41 +0200
commit    1487defacef6ba3e63d92f46d0e54a8339a37897 (patch)
tree      c990719cbdf1c923fe28465f1f9dc70f1e4029fc
parent    95857cf4f31bf529bfdd3921150262b12b444888 (diff)
Updating ArContext documentation.
-rw-r--r--  docs/user_guide/pipeline_input_context/context_definition.md    2
-rw-r--r--  docs/user_guide/pipeline_input_context/introduction.md           8
-rw-r--r--  docs/user_guide/utils/demonstrations_scripts.md                 57
3 files changed, 20 insertions, 47 deletions
diff --git a/docs/user_guide/pipeline_input_context/context_definition.md b/docs/user_guide/pipeline_input_context/context_definition.md
index 9f4981c..5456243 100644
--- a/docs/user_guide/pipeline_input_context/context_definition.md
+++ b/docs/user_guide/pipeline_input_context/context_definition.md
@@ -44,8 +44,6 @@ class Example(ArFeatures.ArContext):
# Process timestamped gaze position
self._process_gaze_position(timestamp = timestamp, x = x, y = y)
- return self
-
@DataFeatures.PipelineStepExit
def __exit__(self, exception_type, exception_value, exception_traceback):
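
For orientation, here is a minimal sketch of such a context class after this change; the `_read` method name and the `DataFeatures.PipelineStepEnter` decorator are assumptions inferred by symmetry with the visible `PipelineStepExit`, not confirmed API:

```python
# Minimal sketch of a custom ArContext subclass; _read and
# PipelineStepEnter are assumed names, inferred from the diff context.
from argaze import ArFeatures, DataFeatures

class Example(ArFeatures.ArContext):

    @DataFeatures.PipelineStepEnter
    def __enter__(self):

        # Start the gaze position source here (thread, callback, ...)
        return self  # returning self belongs here, not in the reading step

    def _read(self, timestamp, x, y):

        # Process timestamped gaze position
        self._process_gaze_position(timestamp = timestamp, x = x, y = y)

    @DataFeatures.PipelineStepExit
    def __exit__(self, exception_type, exception_value, exception_traceback):

        # Stop the gaze position source here
        pass
```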
diff --git a/docs/user_guide/pipeline_input_context/introduction.md b/docs/user_guide/pipeline_input_context/introduction.md
index 002f1e2..59c723b 100644
--- a/docs/user_guide/pipeline_input_context/introduction.md
+++ b/docs/user_guide/pipeline_input_context/introduction.md
@@ -1,7 +1,7 @@
Overview
========
-This section explains how to wrap any pipeline detailled in previous sections into various context.
+This section explains how to connect [gaze analysis](../gaze_analysis_pipeline/introduction.md) or [augmented reality](../aruco_marker_pipeline/introduction.md) pipelines with various input contexts.
First, let's look at the schema below: it gives an overview of the main notions involved in the following chapters.
@@ -10,6 +10,8 @@ First, let's look at the schema below: it gives an overview of the main notions
To build your own input context, you need to know:
* [How to define a context class](context_definition.md),
-* [How to load and connect a context](configuration_and_connection.md),
+* [How to load a context to connect with a pipeline](configuration_and_connection.md),
+* [How to stop a context](stop.md),
+* [How to pause and resume a context](pause_and_resume.md),
* [How to visualize a context](visualization.md),
-* [How to pause and resume a context](pause_and_resume.md)
+* [How to handle pipeline exceptions](exceptions.md)
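
For orientation, a minimal sketch of loading a context to connect it with a pipeline, assuming the `argaze.load()` helper that the configuration chapter introduces (the waiting loop is only illustrative):

```python
# Minimal sketch, assuming an argaze.load() helper; the sleep is a
# placeholder for whatever keeps the application alive.
import time

from argaze import load

with load("./src/argaze/utils/demo/random_context.json") as context:

    # The context feeds its pipeline while the with block runs;
    # leaving the block stops the context (see the stop chapter).
    time.sleep(10)
```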
diff --git a/docs/user_guide/utils/demonstrations_scripts.md b/docs/user_guide/utils/demonstrations_scripts.md
index ed2f8d9..d83915f 100644
--- a/docs/user_guide/utils/demonstrations_scripts.md
+++ b/docs/user_guide/utils/demonstrations_scripts.md
@@ -11,7 +11,7 @@ Collection of command-line scripts for demonstration purpose.
## Random context
-Load **random_context.json** file to analyze random gaze positions.
+Load the **random_context.json** file to analyze random gaze positions:
```shell
python -m argaze ./src/argaze/utils/demo/random_context.json
@@ -19,7 +19,7 @@ python -m argaze ./src/argaze/utils/demo/random_context.json
## OpenCV window context
-Load **opencv_window_context.json** file to analyze mouse pointer positions over OpenCV window.
+Load the **opencv_window_context.json** file to analyze mouse pointer positions over an OpenCV window:
```shell
python -m argaze ./src/argaze/utils/demo/opencv_window_context.json
@@ -27,12 +27,12 @@ python -m argaze ./src/argaze/utils/demo/opencv_window_context.json
## Tobii Pro Glasses 2
-### Tobii live stream context
+### Live stream context
!!! note
- this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+ This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
-Edit **tobii_live_stream_context.json** file as below with your own parameters values:
+Edit the **tobii_live_stream_context.json** file to select an existing IP *address*, *project* or *participant* name, and to set up the Tobii *configuration* parameters:
```json
{
@@ -50,45 +50,35 @@ Edit **tobii_live_stream_context.json** file as below with your own parameters v
"sys_et_freq": 50,
"sys_mems_freq": 100
},
- "pipeline": "aruco_markers_pipeline.json",
- "catch_exceptions": true,
- "image_parameters": {
- "draw_times": true,
- "draw_exceptions": true
- }
+ "pipeline": "aruco_markers_pipeline.json"
}
}
```
-Then, execute this command:
+Then, load the **tobii_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions into the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/tobii_live_stream_context.json
```
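
For orientation, a programmatic equivalent of the command above; `context.image()` and `context.name` are assumptions based on the visualization chapter, not confirmed API:

```python
# Hedged sketch: displaying the live stream context image with OpenCV;
# image() and name are assumed ArContext members.
import cv2

from argaze import load

with load("./src/argaze/utils/demo/tobii_live_stream_context.json") as context:

    # Refresh the displayed image every 40 ms until Esc is pressed
    while cv2.waitKey(40) != 27:
        cv2.imshow(context.name, context.image())

cv2.destroyAllWindows()
```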
-### Tobii post-processing context
+### Post-processing context
!!! note
- this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+ This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
-Edit **tobii_post_processing_context.json** file as below with your own parameters values:
+Edit the **tobii_post_processing_context.json** file to select an existing Tobii *segment* folder:
```json
{
"argaze.utils.contexts.TobiiProGlasses2.PostProcessing" : {
"name": "Tobii Pro Glasses 2 post-processing",
"segment": "record/segments/1",
- "pipeline": "aruco_markers_pipeline.json",
- "catch_exceptions": true,
- "image_parameters": {
- "draw_times": true,
- "draw_exceptions": true
- }
+ "pipeline": "aruco_markers_pipeline.json"
}
}
```
-Then, execute this command:
+Then, load the **tobii_post_processing_context.json** file to detect ArUco markers in the camera image and project gaze positions into the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json
@@ -96,30 +86,13 @@ python -m argaze ./src/argaze/utils/demo/tobii_post_processing_context.json
## Pupil Invisible
-### Pupil Invisible live stream context
+### Live stream context
!!! note
- this demonstration requires to print **A3_demo.pdf** file located in *./src/argaze/utils/demo/* folder on A3 paper sheet.
+ This demonstration requires printing the **A3_demo.pdf** file located in the *./src/argaze/utils/demo/* folder on an A3 paper sheet.
-Edit **pupillabs_live_stream_context.json** file as below with your own parameters values:
-
-```json
-{
- "argaze.utils.contexts.PupilLabs.LiveStream" : {
- "name": "PupilLabs",
- "pipeline": "aruco_markers_pipeline.json",
- "catch_exceptions": true,
- "image_parameters": {
- "draw_times": true,
- "draw_exceptions": true
- }
- }
-}
-```
-
-Then, execute this command:
+Load the **pupillabs_live_stream_context.json** file to detect ArUco markers in the camera image and project gaze positions into the AOI:
```shell
python -m argaze ./src/argaze/utils/demo/pupillabs_live_stream_context.json
```
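
For orientation, a hedged sketch of pausing and resuming such a live stream context; the `pause()` and `resume()` method names are assumptions suggested by the pause_and_resume chapter:

```python
# Hedged sketch: pause() and resume() are assumed method names.
import time

from argaze import load

with load("./src/argaze/utils/demo/pupillabs_live_stream_context.json") as context:

    time.sleep(5)
    context.pause()   # temporarily stop feeding the pipeline
    time.sleep(5)
    context.resume()  # start feeding the pipeline again
    time.sleep(5)
```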
-