 docs/use_cases/pilot_gaze_monitoring/introduction.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)
diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
index 453a443..f5de773 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_monitoring/introduction.md
@@ -30,7 +30,7 @@ Finally, fixation events were sent in real-time through [Ivy bus middleware](htt
## Setup
-The setup to integrate **ArGaze** to the experiment is defined by 3 main files:
+The setup to integrate **ArGaze** into the experiment is defined by 3 main files, detailed in the following chapters:
* The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
* The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
@@ -42,5 +42,6 @@ As any **ArGaze** setup, it is loaded by executing the following command:
python -m argaze load live_streaming_context.json
```
-## Performance
+This command opens a GUI window from which you can start gaze calibration, launch the recording, and monitor gaze mapping.
+![ArGaze load GUI for Haiku](../../img/argaze_load_gui_haiku.png)
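For orientation beyond the diff, the sketch below suggests what a context file like live_streaming_context.json could contain. It is a minimal, hypothetical illustration: the class path and every field name are assumptions rather than the authoritative ArGaze schema (see [context.md](context.md) for the real file), and the `//` comments marking those assumptions would have to be stripped before use, since plain JSON forbids comments.

```json
{
    // Hypothetical entry point: the eye tracker context class to instantiate.
    "argaze.utils.contexts.TobiiProGlasses2.LiveStream": {
        "name": "Pilot glasses live stream",         // illustrative label
        "address": "10.34.0.17",                     // illustrative device network address
        "pipeline": "live_processing_pipeline.json"  // the pipeline file listed above
    }
}
```

Loading such a file with `python -m argaze load live_streaming_context.json` would then start the capture context and feed gaze data and scene camera video into the processing pipeline.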