From a4d35335db238955baac2723eba9edece805b83a Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Mon, 8 Jul 2024 09:50:13 +0200
Subject: Improving pilot monitoring use case.

---
 docs/use_cases/pilot_gaze_monitoring/introduction.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/use_cases/pilot_gaze_monitoring/introduction.md b/docs/use_cases/pilot_gaze_monitoring/introduction.md
index 453a443..f5de773 100644
--- a/docs/use_cases/pilot_gaze_monitoring/introduction.md
+++ b/docs/use_cases/pilot_gaze_monitoring/introduction.md
@@ -30,7 +30,7 @@ Finally, fixation events were sent in real-time through [Ivy bus middleware](htt
 
 ## Setup
 
-The setup to integrate **ArGaze** to the experiment is defined by 3 main files:
+The setup to integrate **ArGaze** into the experiment is defined by 3 main files, detailed in the next chapters:
 
 * The context file that captures gaze data and scene camera video: [live_streaming_context.json](context.md)
 * The pipeline file that processes gaze data and scene camera video: [live_processing_pipeline.json](pipeline.md)
@@ -42,5 +42,6 @@ As any **ArGaze** setup, it is loaded by executing the following command:
 python -m argaze load live_streaming_context.json
 ```
 
-## Performance
+This command opens a GUI window from which to start gaze calibration, launch the recording, and monitor gaze mapping.
 
+![ArGaze load GUI for Haiku](../../img/argaze_load_gui_haiku.png)