From 91857bacd4f82e48d8a06b43bcc48294701d6a1d Mon Sep 17 00:00:00 2001
From: Théo de la Hogue
Date: Mon, 19 Jun 2023 10:01:10 +0200
Subject: Using relative paths to code references to make them work online.

---
 .../ar_environment/environment_exploitation.md     |  2 +-
 .../user_guide/ar_environment/environment_setup.md |  4 +-
 docs/user_guide/areas_of_interest/aoi_frame.md     |  2 +-
 docs/user_guide/areas_of_interest/aoi_matching.md  |  4 +-
 .../areas_of_interest/aoi_scene_description.md     |  4 +-
 .../areas_of_interest/aoi_scene_projection.md      |  2 +-
 docs/user_guide/areas_of_interest/introduction.md  |  8 ++--
 .../areas_of_interest/vision_cone_filtering.md     |  2 +-
 .../user_guide/aruco_markers/camera_calibration.md |  6 +--
 docs/user_guide/aruco_markers/introduction.md      | 16 +++----
 docs/user_guide/aruco_markers/markers_creation.md  |  2 +-
 docs/user_guide/aruco_markers/markers_detection.md |  6 +--
 .../aruco_markers/markers_pose_estimation.md       |  2 +-
 .../aruco_markers/markers_scene_description.md     | 10 ++---
 docs/user_guide/gaze_analysis/gaze_movement.md     | 28 ++++++------
 docs/user_guide/gaze_analysis/gaze_position.md     |  8 ++--
 docs/user_guide/gaze_analysis/introduction.md      |  4 +-
 docs/user_guide/gaze_analysis/scan_path.md         | 50 +++++++++++-----------
 .../timestamped_data/data_synchronisation.md       | 10 ++---
 docs/user_guide/timestamped_data/introduction.md   |  2 +-
 .../timestamped_data/ordered_dictionary.md         |  2 +-
 .../pandas_dataframe_conversion.md                 |  2 +-
 .../timestamped_data/saving_and_loading.md         |  2 +-
 23 files changed, 89 insertions(+), 89 deletions(-)

(limited to 'docs/user_guide')

diff --git a/docs/user_guide/ar_environment/environment_exploitation.md b/docs/user_guide/ar_environment/environment_exploitation.md
index a28b74d..f07d150 100644
--- a/docs/user_guide/ar_environment/environment_exploitation.md
+++ b/docs/user_guide/ar_environment/environment_exploitation.md
@@ -1,7 +1,7 @@
 Environment exploitation
 ========================
 
-Once loaded, [ArEnvironment](/argaze/#argaze.ArFeatures.ArEnvironment) assets can be exploited as illustrated below:
+Once loaded, [ArEnvironment](../../../argaze/#argaze.ArFeatures.ArEnvironment) assets can be exploited as illustrated below:
 
 ```python
 # Access the AR environment ArUco detector, passing it a frame in which to detect ArUco markers
diff --git a/docs/user_guide/ar_environment/environment_setup.md b/docs/user_guide/ar_environment/environment_setup.md
index bbfdbd6..1147181 100644
--- a/docs/user_guide/ar_environment/environment_setup.md
+++ b/docs/user_guide/ar_environment/environment_setup.md
@@ -1,9 +1,9 @@
 Environment Setup
 =================
 
-[ArEnvironment](/argaze/#argaze.ArFeatures.ArEnvironment) setup is loaded from JSON file format.
+[ArEnvironment](../../../argaze/#argaze.ArFeatures.ArEnvironment) setup is loaded from a JSON file.
 
-Each [ArEnvironment](/argaze/#argaze.ArFeatures.ArEnvironment) defines a unique [ArUcoDetector](/argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) dedicated to detection of markers from a specific [ArUcoMarkersDictionary](/argaze/#argaze.ArUcoMarkers.ArUcoMarkersDictionary) and with a given size. However, it is possible to load multiple [ArScene](/argaze/#argaze.ArFeatures.ArScene) into a same [ArEnvironment](/argaze/#argaze.ArFeatures.ArEnvironment).
+Each [ArEnvironment](../../../argaze/#argaze.ArFeatures.ArEnvironment) defines a unique [ArUcoDetector](../../../argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) dedicated to the detection of markers from a specific [ArUcoMarkersDictionary](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarkersDictionary) and with a given size. However, it is possible to load multiple [ArScene](../../../argaze/#argaze.ArFeatures.ArScene) into the same [ArEnvironment](../../../argaze/#argaze.ArFeatures.ArEnvironment).
 
 Here is a JSON environment file example where it is assumed that the mentioned .obj files are located relative to the environment file on disk.
 
diff --git a/docs/user_guide/areas_of_interest/aoi_frame.md b/docs/user_guide/areas_of_interest/aoi_frame.md
index 855e302..350efa8 100644
--- a/docs/user_guide/areas_of_interest/aoi_frame.md
+++ b/docs/user_guide/areas_of_interest/aoi_frame.md
@@ -5,7 +5,7 @@ title: AOI frame
 AOI Frame
 =========
 
-[AOIFeatures](/argaze/#argaze/AreaOfInterest.AOIFeatures) provides [AOIFrame](/argaze/#argaze/AreaOfInterest.AOIFeatures.AOIFrame) class to draw into an 2D AOI.
+[AOIFeatures](../../../argaze/#argaze/AreaOfInterest.AOIFeatures) provides the [AOIFrame](../../../argaze/#argaze/AreaOfInterest.AOIFeatures.AOIFrame) class to draw into a 2D AOI.
 
 ## Point spread
 
diff --git a/docs/user_guide/areas_of_interest/aoi_matching.md b/docs/user_guide/areas_of_interest/aoi_matching.md
index 299a212..1e18238 100644
--- a/docs/user_guide/areas_of_interest/aoi_matching.md
+++ b/docs/user_guide/areas_of_interest/aoi_matching.md
@@ -5,9 +5,9 @@ title: AOI matching
 AOI matching
 ============
 
-Once [AOI3DScene](/argaze/#argaze.AreaOfInterest.AOI3DScene) is projected into a frame as [AOI2DScene](/argaze/#argaze.AreaOfInterest.AOI2DScene), it could be needed to know which AOI is looked.
+Once an [AOI3DScene](../../../argaze/#argaze.AreaOfInterest.AOI3DScene) is projected into a frame as an [AOI2DScene](../../../argaze/#argaze.AreaOfInterest.AOI2DScene), it may be necessary to know which AOI is being looked at.
 
-The [AreaOfInterest](/argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) class in [AOIFeatures](/argaze/#argaze.AreaOfInterest.AOIFeatures) provides two ways to accomplish such task.
+The [AreaOfInterest](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) class in [AOIFeatures](../../../argaze/#argaze.AreaOfInterest.AOIFeatures) provides two ways to accomplish such a task.
 
 ## Pointer-based matching
 
diff --git a/docs/user_guide/areas_of_interest/aoi_scene_description.md b/docs/user_guide/areas_of_interest/aoi_scene_description.md
index e6a8a16..d50f061 100644
--- a/docs/user_guide/areas_of_interest/aoi_scene_description.md
+++ b/docs/user_guide/areas_of_interest/aoi_scene_description.md
@@ -5,7 +5,7 @@ title: AOI scene description
 AOI scene description
 =====================
 
-An [AOI3DScene](/argaze/#argaze.AreaOfInterest.AOI3DScene) is built from a 3D model with all AOI as 3D planes and loaded through OBJ file format.
+An [AOI3DScene](../../../argaze/#argaze.AreaOfInterest.AOI3DScene) is built from a 3D model with all AOI as 3D planes and is loaded through the OBJ file format.
 
 Notice that plane normals are not needed and that planes are not necessarily 4-vertex shapes.
 
 ``` obj
@@ -47,7 +47,7 @@ f 185 190 186 188 191 187 189
 ...
 ```
 
-Here is a sample of code to show the loading of an [AOI3DScene](/argaze/#argaze.AreaOfInterest.AOI3DScene) from an OBJ file description:
+Here is a sample of code to show the loading of an [AOI3DScene](../../../argaze/#argaze.AreaOfInterest.AOI3DScene) from an OBJ file description:
 
 ``` python
 from argaze.AreaOfInterest import AOI3DScene
diff --git a/docs/user_guide/areas_of_interest/aoi_scene_projection.md b/docs/user_guide/areas_of_interest/aoi_scene_projection.md
index 4d06e87..ad50f6f 100644
--- a/docs/user_guide/areas_of_interest/aoi_scene_projection.md
+++ b/docs/user_guide/areas_of_interest/aoi_scene_projection.md
@@ -5,7 +5,7 @@ title: AOI scene projection
 AOI scene projection
 ====================
 
-An [AOI3DScene](/argaze/#argaze.AreaOfInterest.AOI3DScene) can be rotated and translated according to a pose estimation before to project it onto camera frame as an [AOI2DScene](/argaze/#argaze.AreaOfInterest.AOI2DScene).
+An [AOI3DScene](../../../argaze/#argaze.AreaOfInterest.AOI3DScene) can be rotated and translated according to a pose estimation before projecting it onto the camera frame as an [AOI2DScene](../../../argaze/#argaze.AreaOfInterest.AOI2DScene).
 
 ![AOI projection](../../img/aoi_projection.png)
 
diff --git a/docs/user_guide/areas_of_interest/introduction.md b/docs/user_guide/areas_of_interest/introduction.md
index 201e4b2..ce0a6ef 100644
--- a/docs/user_guide/areas_of_interest/introduction.md
+++ b/docs/user_guide/areas_of_interest/introduction.md
@@ -1,8 +1,8 @@
 About Areas Of Interest (AOI)
 =============================
 
-The [AreaOfInterest submodule](/argaze/#argaze.AreaOfInterest) allows to deal with AOI in a AR environment through a set of high level classes:
-
-* [AOIFeatures](/argaze/#argaze.AreaOfInterest.AOIFeatures)
-* [AOI3DScene](/argaze/#argaze.AreaOfInterest.AOI3DScene)
-* [AOI2DScene](/argaze/#argaze.AreaOfInterest.AOI2DScene)
\ No newline at end of file
+The [AreaOfInterest submodule](../../../argaze/#argaze.AreaOfInterest) allows dealing with AOI in an AR environment through a set of high-level classes:
+
+* [AOIFeatures](../../../argaze/#argaze.AreaOfInterest.AOIFeatures)
+* [AOI3DScene](../../../argaze/#argaze.AreaOfInterest.AOI3DScene)
+* [AOI2DScene](../../../argaze/#argaze.AreaOfInterest.AOI2DScene)
\ No newline at end of file
diff --git a/docs/user_guide/areas_of_interest/vision_cone_filtering.md b/docs/user_guide/areas_of_interest/vision_cone_filtering.md
index e1f4f82..ddd83bd 100644
--- a/docs/user_guide/areas_of_interest/vision_cone_filtering.md
+++ b/docs/user_guide/areas_of_interest/vision_cone_filtering.md
@@ -1,7 +1,7 @@
 Vision cone filtering
 =====================
 
-The [AOI3DScene](/argaze/#argaze.AreaOfInterest.AOI3DScene) provides cone clipping support in order to select only [AOI](/argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) which are inside vision cone field.
+The [AOI3DScene](../../../argaze/#argaze.AreaOfInterest.AOI3DScene) provides cone clipping support in order to select only [AOI](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) which are inside the vision cone field.
 
 ![Vision cone](../../img/vision_cone.png)
 
diff --git a/docs/user_guide/aruco_markers/camera_calibration.md b/docs/user_guide/aruco_markers/camera_calibration.md
index ea2c51a..7bff480 100644
--- a/docs/user_guide/aruco_markers/camera_calibration.md
+++ b/docs/user_guide/aruco_markers/camera_calibration.md
@@ -5,7 +5,7 @@ Any camera device has to be calibrated to compensate for its optical distortion.
 
 ![Camera calibration](../../img/camera_calibration.png)
 
-The first step to calibrate a camera is to create an [ArUcoBoard](/argaze/#argaze.ArUcoMarkers.ArUcoBoard) like in the code below:
+The first step to calibrate a camera is to create an [ArUcoBoard](../../../argaze/#argaze.ArUcoMarkers.ArUcoBoard) like in the code below:
 
 ``` python
 from argaze.ArUcoMarkers import ArUcoMarkersDictionary, ArUcoBoard
@@ -20,7 +20,7 @@ aruco_board = ArUcoBoard.ArUcoBoard(7, 5, 5, 3, aruco_dictionary)
 aruco_board.save('./calibration_board.png', 300)
 ```
 
-Then, the calibration process needs to make many different captures of an [ArUcoBoard](/argaze/#argaze.ArUcoMarkers.ArUcoBoard) through the camera and then, pass them to an [ArUcoDetector](/argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) instance to detect board corners and store them as calibration data to an [ArUcoOpticCalibrator](/argaze/#argaze.ArUcoMarkers.ArUcoOpticCalibrator) for final calibration process.
+Then, the calibration process needs many different captures of an [ArUcoBoard](../../../argaze/#argaze.ArUcoMarkers.ArUcoBoard) through the camera; they are passed to an [ArUcoDetector](../../../argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) instance to detect board corners and store them as calibration data in an [ArUcoOpticCalibrator](../../../argaze/#argaze.ArUcoMarkers.ArUcoOpticCalibrator) for the final calibration process.
 
 ![Calibration step](../../img/camera_calibration_step.png)
 
@@ -81,7 +81,7 @@ else:
     print('\nCalibration error.')
 ```
 
-Then, the camera calibration data are loaded to compensate optical distorsion during [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker) detection:
+Then, the camera calibration data are loaded to compensate for optical distortion during [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker) detection:
 
 ``` python
 from argaze.ArUcoMarkers import ArUcoOpticCalibrator
diff --git a/docs/user_guide/aruco_markers/introduction.md b/docs/user_guide/aruco_markers/introduction.md
index 7da045c..8f4baf9 100644
--- a/docs/user_guide/aruco_markers/introduction.md
+++ b/docs/user_guide/aruco_markers/introduction.md
@@ -5,11 +5,11 @@ About ArUco markers
 
 The OpenCV library provides a module to detect fiducial markers in a picture and estimate their pose (cf. the [OpenCV ArUco tutorial page](https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html)).
 
-The ArGaze [ArUcoMarkers submodule](/argaze/#argaze.ArUcoMarkers) eases markers creation, camera calibration, markers detection and 3D scene pose estimation through a set of high level classes:
-
-* [ArUcoMarkersDictionary](/argaze/#argaze.ArUcoMarkers.ArUcoMarkersDictionary)
-* [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker)
-* [ArUcoBoard](/argaze/#argaze.ArUcoMarkers.ArUcoBoard)
-* [ArUcoOpticCalibrator](/argaze/#argaze.ArUcoMarkers.ArUcoOpticCalibrator)
-* [ArUcoDetector](/argaze/#argaze.ArUcoMarkers.ArUcoDetector)
-* [ArUcoScene](/argaze/#argaze.ArUcoMarkers.ArUcoScene)
\ No newline at end of file
+The ArGaze [ArUcoMarkers submodule](../../../argaze/#argaze.ArUcoMarkers) eases marker creation, camera calibration, marker detection and 3D scene pose estimation through a set of high-level classes:
+
+* [ArUcoMarkersDictionary](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarkersDictionary)
+* [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker)
+* [ArUcoBoard](../../../argaze/#argaze.ArUcoMarkers.ArUcoBoard)
+* [ArUcoOpticCalibrator](../../../argaze/#argaze.ArUcoMarkers.ArUcoOpticCalibrator)
+* [ArUcoDetector](../../../argaze/#argaze.ArUcoMarkers.ArUcoDetector)
+* [ArUcoScene](../../../argaze/#argaze.ArUcoMarkers.ArUcoScene)
\ No newline at end of file
diff --git a/docs/user_guide/aruco_markers/markers_creation.md b/docs/user_guide/aruco_markers/markers_creation.md
index 1725fe4..89c7fc6 100644
--- a/docs/user_guide/aruco_markers/markers_creation.md
+++ b/docs/user_guide/aruco_markers/markers_creation.md
@@ -1,7 +1,7 @@
 Markers creation
 ================
 
-The creation of [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker) from a dictionary is illustrated in the code below:
+The creation of [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker) from a dictionary is illustrated in the code below:
 
 ``` python
 from argaze.ArUcoMarkers import ArUcoMarkersDictionary
diff --git a/docs/user_guide/aruco_markers/markers_detection.md b/docs/user_guide/aruco_markers/markers_detection.md
index f8a23f9..3851cb4 100644
--- a/docs/user_guide/aruco_markers/markers_detection.md
+++ b/docs/user_guide/aruco_markers/markers_detection.md
@@ -3,7 +3,7 @@ Markers detection
 
 ![Detected markers](../../img/detected_markers.png)
 
-Firstly, the [ArUcoDetector](/argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) needs to know the expected dictionary and size (in centimeter) of the [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker) it have to detect.
+Firstly, the [ArUcoDetector](../../../argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) needs to know the expected dictionary and size (in centimeters) of the [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker) it has to detect.
 
 Notice that extra parameters are passed to the detector: see the [OpenCV ArUco markers detection parameters documentation](https://docs.opencv.org/4.x/d1/dcd/structcv_1_1aruco_1_1DetectorParameters.html) to learn more.
 
@@ -19,7 +19,7 @@ extra_parameters = ArUcoDetector.DetectorParameters.from_json('./detector_parame
 aruco_detector = ArUcoDetector.ArUcoDetector(optic_parameters=optic_parameters, dictionary='DICT_APRILTAG_16h5', marker_size=5, parameters=extra_parameters)
 ```
 
-Here is [DetectorParameters](/argaze/#argaze.ArUcoMarkers.ArUcoDetector.DetectorParameters) JSON file example:
+Here is a [DetectorParameters](../../../argaze/#argaze.ArUcoMarkers.ArUcoDetector.DetectorParameters) JSON file example:
 
 ```
 {
@@ -29,7 +29,7 @@ Here is [DetectorParameters](/argaze/#argaze.ArUcoMarkers.ArUcoDetector.Detector
 }
 ```
 
-The [ArUcoDetector](/argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) processes frame to detect markers and allows to draw detection results onto it:
+The [ArUcoDetector](../../../argaze/#argaze.ArUcoMarkers.ArUcoDetector.ArUcoDetector) processes a frame to detect markers and allows drawing the detection results onto it:
 
 ``` python
 # Detect markers in a frame and draw them
diff --git a/docs/user_guide/aruco_markers/markers_pose_estimation.md b/docs/user_guide/aruco_markers/markers_pose_estimation.md
index 09db325..45ef70d 100644
--- a/docs/user_guide/aruco_markers/markers_pose_estimation.md
+++ b/docs/user_guide/aruco_markers/markers_pose_estimation.md
@@ -1,7 +1,7 @@
 Markers pose estimation
 =======================
 
-After [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker) detection, it is possible to estimate [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker) pose in camera axis.
+After [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker) detection, it is possible to estimate the [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker) pose in the camera axis.
 
 ![Pose estimation](../../img/pose_estimation.png)
 
diff --git a/docs/user_guide/aruco_markers/markers_scene_description.md b/docs/user_guide/aruco_markers/markers_scene_description.md
index 6dbc4fd..3ea962e 100644
--- a/docs/user_guide/aruco_markers/markers_scene_description.md
+++ b/docs/user_guide/aruco_markers/markers_scene_description.md
@@ -1,11 +1,11 @@
 Markers scene description
 =========================
 
-The ArGaze toolkit provides [ArUcoScene](/argaze/#argaze.ArUcoMarkers.ArUcoScene) class to describe where [ArUcoMarkers](/argaze/#argaze.ArUcoMarkers.ArUcoMarker) are placed into a 3D model.
+The ArGaze toolkit provides the [ArUcoScene](../../../argaze/#argaze.ArUcoMarkers.ArUcoScene) class to describe where [ArUcoMarkers](../../../argaze/#argaze.ArUcoMarkers.ArUcoMarker) are placed in a 3D model.
 
 ![ArUco scene](../../img/aruco_scene.png)
 
-[ArUcoScene](/argaze/#argaze.ArUcoMarkers.ArUcoScene) is useful to:
+[ArUcoScene](../../../argaze/#argaze.ArUcoMarkers.ArUcoScene) is useful to:
 
 * filter markers that belong to this predefined scene,
 * check the consistency of detected markers according to the place where each marker is expected to be,
@@ -33,7 +33,7 @@ f 5//2 6//2 8//2 7//2
 ...
 ```
 
-[ArUcoScene](/argaze/#argaze.ArUcoMarkers.ArUcoScene) description can also be written in a JSON file format.
+An [ArUcoScene](../../../argaze/#argaze.ArUcoMarkers.ArUcoScene) description can also be written in JSON file format.
 ``` json
 {
@@ -56,7 +56,7 @@ f 5//2 6//2 8//2 7//2
 }
 ```
 
-Here is a sample of code to show the loading of an [ArUcoScene](/argaze/#argaze.ArUcoMarkers.ArUcoScene) OBJ file description:
+Here is a sample of code to show the loading of an [ArUcoScene](../../../argaze/#argaze.ArUcoMarkers.ArUcoScene) OBJ file description:
 
 ``` python
 from argaze.ArUcoMarkers import ArUcoScene
@@ -91,7 +91,7 @@ consistent_markers, unconsistent_markers, unconsistencies = aruco_scene.check_ma
 
 ## Scene pose estimation
 
-Several approaches are available to perform [ArUcoScene](/argaze/#argaze.ArUcoMarkers.ArUcoScene) pose estimation from markers belonging to the scene.
+Several approaches are available to perform [ArUcoScene](../../../argaze/#argaze.ArUcoMarkers.ArUcoScene) pose estimation from markers belonging to the scene.
 
 The first approach considers that scene pose can be estimated **from a single marker pose**:
 
diff --git a/docs/user_guide/gaze_analysis/gaze_movement.md b/docs/user_guide/gaze_analysis/gaze_movement.md
index 932afac..e022be0 100644
--- a/docs/user_guide/gaze_analysis/gaze_movement.md
+++ b/docs/user_guide/gaze_analysis/gaze_movement.md
@@ -9,26 +9,26 @@ Gaze movement
 
 Citation from the ["One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms"](https://link.springer.com/article/10.3758/s13428-016-0738-9) article.
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines abstract [GazeMovement](/argaze/#argaze.GazeFeatures.GazeMovement) class, then abstract [Fixation](/argaze/#argaze.GazeFeatures.Fixation) and [Saccade](/argaze/#argaze.GazeFeatures.Saccade) classes which inherit from [GazeMovement](/argaze/#argaze.GazeFeatures.GazeMovement).
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) defines an abstract [GazeMovement](../../../argaze/#argaze.GazeFeatures.GazeMovement) class, then abstract [Fixation](../../../argaze/#argaze.GazeFeatures.Fixation) and [Saccade](../../../argaze/#argaze.GazeFeatures.Saccade) classes which inherit from [GazeMovement](../../../argaze/#argaze.GazeFeatures.GazeMovement).
 
-The **positions** [GazeMovement](/argaze/#argaze.GazeFeatures.GazeMovement) attribute contain all [GazePositions](/argaze/#argaze.GazeFeatures.GazePosition) belonging to itself.
+The **positions** attribute of a [GazeMovement](../../../argaze/#argaze.GazeFeatures.GazeMovement) contains all the [GazePositions](../../../argaze/#argaze.GazeFeatures.GazePosition) belonging to it.
 
 ![Fixation and Saccade](../../img/fixation_and_saccade.png)
 
 ## Identification
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines abstract [GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) classe to let add various identification algorithms.
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) defines an abstract [GazeMovementIdentifier](../../../argaze/#argaze.GazeFeatures.GazeMovementIdentifier) class to allow adding various identification algorithms.
 
-Some gaze movement identification algorithms are available thanks to [GazeAnalysis](/argaze/#argaze.GazeAnalysis) submodule:
-
-* [Dispersion threshold identification (I-DT)](/argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification)
-* [Velocity threshold identification (I-VT)](/argaze/#argaze.GazeAnalysis.VelocityThresholdIdentification)
+Some gaze movement identification algorithms are available thanks to the [GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule:
+
+* [Dispersion threshold identification (I-DT)](../../../argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification)
+* [Velocity threshold identification (I-VT)](../../../argaze/#argaze.GazeAnalysis.VelocityThresholdIdentification)
 
 ### Identify method
 
-[GazeMovementIdentifier.identify](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier.identify) method allows to fed its identification algorithm with successive gaze positions to output Fixation, Saccade or any kind of GazeMovement instances.
+The [GazeMovementIdentifier.identify](../../../argaze/#argaze.GazeFeatures.GazeMovementIdentifier.identify) method allows feeding the identification algorithm with successive gaze positions to output Fixation, Saccade or any kind of GazeMovement instances.
 
-Here is a sample of code based on [I-DT](/argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification) algorithm to illustrate how to use it:
+Here is a sample of code based on the [I-DT](../../../argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification) algorithm to illustrate how to use it:
 
 ``` python
 from argaze import GazeFeatures
@@ -81,13 +81,13 @@ gaze_movement_identifier = DispersionThresholdIdentification.GazeMovementIdentif
 
 ### Browse method
 
-[GazeMovementIdentifier.browse](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier.browse) method allows to pass a [TimeStampedGazePositions](/argaze/#argaze.GazeFeatures.TimeStampedGazePositions) buffer to apply identification algorithm on all gaze positions inside.
+The [GazeMovementIdentifier.browse](../../../argaze/#argaze.GazeFeatures.GazeMovementIdentifier.browse) method allows passing a [TimeStampedGazePositions](../../../argaze/#argaze.GazeFeatures.TimeStampedGazePositions) buffer to apply the identification algorithm to all gaze positions inside.
 
 Identified gaze movements are returned through:
 
-* [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all fixations are stored by starting gaze position timestamp.
-* [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all saccades are stored by starting gaze position timestamp.
-* [TimeStampedGazeStatus](/argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) instance where all gaze positions are linked to a fixation or saccade index.
+* a [TimeStampedGazeMovements](../../../argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all fixations are stored by starting gaze position timestamp.
+* a [TimeStampedGazeMovements](../../../argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all saccades are stored by starting gaze position timestamp.
+* a [TimeStampedGazeStatus](../../../argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) instance where all gaze positions are linked to a fixation or saccade index.
 
 ``` python
 # Assuming that timestamped gaze positions are provided through data reading
@@ -136,13 +136,13 @@ ts_fixations, ts_saccades, ts_status = gaze_movement_identifier.browse(ts_gaze_p
 
 !!! note
-    [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements), [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) and [TimeStampedGazeStatus](/argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) classes inherit from [TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) class.
+    [TimeStampedGazeMovements](../../../argaze/#argaze.GazeFeatures.TimeStampedGazeMovements), [TimeStampedGazeMovements](../../../argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) and [TimeStampedGazeStatus](../../../argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) classes inherit from the [TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) class.
 
     Read the [Timestamped data](../timestamped_data/introduction.md) section to understand all the features it provides.
 
 ### Generator method
 
-[GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) can be called with a [TimeStampedGazePositions](/argaze/#argaze.GazeFeatures.TimeStampedGazePositions) buffer in argument to generate gaze movement each time one is identified.
+A [GazeMovementIdentifier](../../../argaze/#argaze.GazeFeatures.GazeMovementIdentifier) can be called with a [TimeStampedGazePositions](../../../argaze/#argaze.GazeFeatures.TimeStampedGazePositions) buffer as argument to generate a gaze movement each time one is identified.
 
 ``` python
 # Assuming that timestamped gaze positions are provided through data reading
diff --git a/docs/user_guide/gaze_analysis/gaze_position.md b/docs/user_guide/gaze_analysis/gaze_position.md
index 67f15f8..9cc7f85 100644
--- a/docs/user_guide/gaze_analysis/gaze_position.md
+++ b/docs/user_guide/gaze_analysis/gaze_position.md
@@ -1,7 +1,7 @@
 Gaze position
 =============
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines a [GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) class to handle point coordinates with a precision value.
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) defines a [GazePosition](../../../argaze/#argaze.GazeFeatures.GazePosition) class to handle point coordinates with a precision value.
 
 ``` python
 from argaze import GazeFeatures
@@ -20,7 +20,7 @@ print(f'precision: {gaze_position.precision}')
 
 ## Validity
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines also a [UnvalidGazePosition](/argaze/#argaze.GazeFeatures.UnvalidGazePosition) class that inherits from [GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) to handle case where no gaze position exists because of any specific device reason.
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) also defines an [UnvalidGazePosition](../../../argaze/#argaze.GazeFeatures.UnvalidGazePosition) class that inherits from [GazePosition](../../../argaze/#argaze.GazeFeatures.GazePosition) to handle cases where no gaze position exists for any device-specific reason.
 
 ``` python
 from argaze import GazeFeatures
@@ -38,7 +38,7 @@ print(f'validity: {gaze_position.valid}')
 
 ## Distance
 
-[GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) class provides a **distance** method to calculate the distance to another gaze position instance.
+The [GazePosition](../../../argaze/#argaze.GazeFeatures.GazePosition) class provides a **distance** method to calculate the distance to another gaze position instance.
 
 ![Distance](../../img/distance.png)
 
@@ -49,7 +49,7 @@ d = gaze_position_A.distance(gaze_position_B)
 ```
 
 ## Overlapping
 
-[GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) class provides an **overlap** method to test if a gaze position overlaps another one considering their precisions.
+The [GazePosition](../../../argaze/#argaze.GazeFeatures.GazePosition) class provides an **overlap** method to test whether a gaze position overlaps another one, considering their precisions.
 
 ![Gaze overlapping](../../img/overlapping.png)
 
diff --git a/docs/user_guide/gaze_analysis/introduction.md b/docs/user_guide/gaze_analysis/introduction.md
index d1bb122..c888181 100644
--- a/docs/user_guide/gaze_analysis/introduction.md
+++ b/docs/user_guide/gaze_analysis/introduction.md
@@ -3,5 +3,5 @@ Gaze analysis
 
 This section refers to:
 
-* [GazeFeatures](/argaze/#argaze.GazeFeatures)
-* [GazeAnalysis](/argaze/#argaze.GazeAnalysis)
\ No newline at end of file
+* [GazeFeatures](../../../argaze/#argaze.GazeFeatures)
+* [GazeAnalysis](../../../argaze/#argaze.GazeAnalysis)
\ No newline at end of file
diff --git a/docs/user_guide/gaze_analysis/scan_path.md b/docs/user_guide/gaze_analysis/scan_path.md
index c84c879..60e9d31 100644
--- a/docs/user_guide/gaze_analysis/scan_path.md
+++ b/docs/user_guide/gaze_analysis/scan_path.md
@@ -1,27 +1,27 @@
 Scan path
 =========
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines classes to handle successive fixations/saccades and analyse their spatial or temporal properties.
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) defines classes to handle successive fixations/saccades and analyse their spatial or temporal properties.
 
 ## Fixation based scan path
 
 ### Definition
 
-The [ScanPath](/argaze/#argaze.GazeFeatures.ScanPath) class is defined as a list of [ScanSteps](/argaze/#argaze.GazeFeatures.ScanStep) which are defined as a fixation and a consecutive saccade.
+The [ScanPath](../../../argaze/#argaze.GazeFeatures.ScanPath) class is defined as a list of [ScanSteps](../../../argaze/#argaze.GazeFeatures.ScanStep), each defined as a fixation and a consecutive saccade.
 
 ![Fixation based scan path](../../img/scan_path.png)
 
-As fixations and saccades are identified, the scan path is built by calling respectively [append_fixation](/argaze/#argaze.GazeFeatures.ScanPath.append_fixation) and [append_saccade](/argaze/#argaze.GazeFeatures.ScanPath.append_saccade) methods.
+As fixations and saccades are identified, the scan path is built by calling respectively the [append_fixation](../../../argaze/#argaze.GazeFeatures.ScanPath.append_fixation) and [append_saccade](../../../argaze/#argaze.GazeFeatures.ScanPath.append_saccade) methods.
 
 ### Analysis
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines abstract [ScanPathAnalyzer](/argaze/#argaze.GazeFeatures.ScanPathAnalyzer) classe to let add various analysis algorithms.
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) defines an abstract [ScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.ScanPathAnalyzer) class to allow adding various analysis algorithms.
 
-Some scan path analysis are available thanks to [GazeAnalysis](/argaze/#argaze.GazeAnalysis) submodule:
-
-* [K-Coefficient](/argaze/#argaze.GazeAnalysis.KCoefficient)
-* [Nearest Neighbor Index](/argaze/#argaze.GazeAnalysis.NearestNeighborIndex)
-* [Exploit Explore Ratio](/argaze/#argaze.GazeAnalysis.ExploitExploreRatio)
+Some scan path analyses are available thanks to the [GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule:
+
+* [K-Coefficient](../../../argaze/#argaze.GazeAnalysis.KCoefficient)
+* [Nearest Neighbor Index](../../../argaze/#argaze.GazeAnalysis.NearestNeighborIndex)
+* [Exploit Explore Ratio](../../../argaze/#argaze.GazeAnalysis.ExploitExploreRatio)
 
 ### Example
 
@@ -65,23 +65,23 @@ kc_analyzer = KCoefficient.ScanPathAnalyzer()
 
 ### Definition
 
-The [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) class is defined as a list of [AOIScanSteps](/argaze/#argaze.GazeFeatures.AOIScanStep) which are defined as set of consecutives fixations looking at a same Area Of Interest (AOI) and a consecutive saccade.
+The [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) class is defined as a list of [AOIScanSteps](../../../argaze/#argaze.GazeFeatures.AOIScanStep), each defined as a set of consecutive fixations looking at the same Area Of Interest (AOI) and a consecutive saccade.
 
 ![AOI based scan path](../../img/aoi_scan_path.png)
 
-As fixations and saccades are identified, the scan path is built by calling respectively [append_fixation](/argaze/#argaze.GazeFeatures.AOIScanPath.append_fixation) and [append_saccade](/argaze/#argaze.GazeFeatures.AOIScanPath.append_saccade) methods.
+As fixations and saccades are identified, the scan path is built by calling respectively the [append_fixation](../../../argaze/#argaze.GazeFeatures.AOIScanPath.append_fixation) and [append_saccade](../../../argaze/#argaze.GazeFeatures.AOIScanPath.append_saccade) methods.
 
 ### Analysis
 
-[GazeFeatures](/argaze/#argaze.GazeFeatures) defines abstract [AOIScanPathAnalyzer](/argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer) classe to let add various analysis algorithms.
+[GazeFeatures](../../../argaze/#argaze.GazeFeatures) defines an abstract [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer) class to allow adding various analysis algorithms.
 
-Some scan path analysis are available thanks to [GazeAnalysis](/argaze/#argaze.GazeAnalysis) submodule:
-
-* [Transition matrix](/argaze/#argaze.GazeAnalysis.TransitionMatrix)
-* [Entropy](/argaze/#argaze.GazeAnalysis.Entropy)
-* [Lempel-Ziv complexity](/argaze/#argaze.GazeAnalysis.LempelZivComplexity)
-* [N-Gram](/argaze/#argaze.GazeAnalysis.NGram)
-* [K-modified coefficient](/argaze/#argaze.GazeAnalysis.KCoefficient)
+Some scan path analyses are available thanks to the [GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule:
+
+* [Transition matrix](../../../argaze/#argaze.GazeAnalysis.TransitionMatrix)
+* [Entropy](../../../argaze/#argaze.GazeAnalysis.Entropy)
+* [Lempel-Ziv complexity](../../../argaze/#argaze.GazeAnalysis.LempelZivComplexity)
+* [N-Gram](../../../argaze/#argaze.GazeAnalysis.NGram)
+* [K-modified coefficient](../../../argaze/#argaze.GazeAnalysis.KCoefficient)
 
 ### Example
 
@@ -130,13 +130,13 @@ lzc_analyzer = LempelZivComplexity.AOIScanPathAnalyzer()
 
 ### Advanced
 
-The [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) class provides some advanced features to analyse it.
+The [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) class provides some advanced analysis features.
 
 #### String representation
 
-When a new [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) internally affects a unique letter index related to its AOI to ease pattern analysis.
-Then, the [AOIScanPath str](/argaze/#argaze.GazeFeatures.AOIScanPath.__str__) representation returns the concatenation of each [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) letter.
-The [AOIScanPath get_letter_aoi](/argaze/#argaze.GazeFeatures.AOIScanPath.get_letter_aoi) method helps to get back the AOI related to a letter index.
+When a new [AOIScanStep](../../../argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) internally assigns a unique letter index related to its AOI to ease pattern analysis.
+Then, the [AOIScanPath str](../../../argaze/#argaze.GazeFeatures.AOIScanPath.__str__) representation returns the concatenation of each [AOIScanStep](../../../argaze/#argaze.GazeFeatures.AOIScanStep) letter.
+The [AOIScanPath get_letter_aoi](../../../argaze/#argaze.GazeFeatures.AOIScanPath.get_letter_aoi) method helps to retrieve the AOI related to a letter index.
 
 ``` python
 # Assuming the following AOI scan path is built: Foo > Bar > Shu > Foo
@@ -152,10 +152,10 @@ print(aoi_scan_path.get_letter_aoi('B'))
 
 #### Transition matrix
 
-When a new [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) internally counts the number of transitions from an AOI to another AOI to ease Markov chain analysis.
-Then, the [AOIScanPath transition_matrix](/argaze/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) where indexes are transition departures and columns are transition destinations.
+When a new [AOIScanStep](../../../argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) internally counts the number of transitions from one AOI to another to ease Markov chain analysis.
+Then, the [AOIScanPath transition_matrix](../../../argaze/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) where indexes are transition departures and columns are transition destinations.
 
-Here is an exemple of transition matrix for the following [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath): Foo > Bar > Shu > Foo > Bar
+Here is an example of a transition matrix for the following [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath): Foo > Bar > Shu > Foo > Bar
 
 | |Foo|Bar|Shu|
 |:--|:--|:--|:--|
@@ -166,4 +166,4 @@ Here is an exemple of transition matrix for the following [AOIScanPath](/argaze/
 
 #### Fixations count
 
-The [AOIScanPath fixations_count](/argaze/#argaze.GazeFeatures.AOIScanPath.fixations_count) method returns the total number of fixations in the whole scan path and a dictionary to get the fixations count per AOI.
+The [AOIScanPath fixations_count](../../../argaze/#argaze.GazeFeatures.AOIScanPath.fixations_count) method returns the total number of fixations in the whole scan path and a dictionary to get the fixations count per AOI.
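
To make the `fixations_count` output concrete, here is a minimal sketch under stated assumptions: `aoi_scan_path` is the AOI scan path built in the examples above, and the method returns the pair described in the paragraph (a total count and a per-AOI dictionary); the exact return signature should be checked against the GazeFeatures API reference.

``` python
# Assuming the AOI scan path built above: Foo > Bar > Shu > Foo,
# and assuming fixations_count() returns (total, per-AOI dictionary)
scan_fixations_count, aoi_fixations_count = aoi_scan_path.fixations_count()

print(f'total fixations: {scan_fixations_count}')

# Print the fixations count for each AOI
for aoi_name, count in aoi_fixations_count.items():

    print(f'{aoi_name}: {count} fixation(s)')
```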
diff --git a/docs/user_guide/timestamped_data/data_synchronisation.md b/docs/user_guide/timestamped_data/data_synchronisation.md
index de861a9..24a474c 100644
--- a/docs/user_guide/timestamped_data/data_synchronisation.md
+++ b/docs/user_guide/timestamped_data/data_synchronisation.md
@@ -3,13 +3,13 @@ Data synchronisation
 
 Recorded data need to be synchronised to link them before further processing.
 
-The [TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) class provides various methods to help in such task.
+The [TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) class provides various methods to help with such a task.
 
 ## Pop last before
 
 ![Pop last before](../../img/pop_last_before.png)
 
-The code below shows how to use [pop_last_before](/argaze/#argaze.DataStructures.TimeStampedBuffer.pop_last_before) method in order to synchronise two timestamped data buffers with different timestamps:
+The code below shows how to use the [pop_last_before](../../../argaze/#argaze.DataStructures.TimeStampedBuffer.pop_last_before) method in order to synchronise two timestamped data buffers with different timestamps:
 
 ``` python
 from argaze import DataStructures
@@ -34,7 +34,7 @@ for A_ts, A_data in A_data_record.items():
 
 ![Pop last until](../../img/pop_last_until.png)
 
-The code below shows how to use [pop_last_until](/argaze/#argaze.DataStructures.TimeStampedBuffer.pop_last_until) method in order to synchronise two timestamped data buffers with different timestamps:
+The code below shows how to use the [pop_last_until](../../../argaze/#argaze.DataStructures.TimeStampedBuffer.pop_last_until) method in order to synchronise two timestamped data buffers with different timestamps:
 
 ``` python
 from argaze import DataStructures
@@ -59,7 +59,7 @@ for A_ts, A_data in A_data_record.items():
 
 ![Get last before](../../img/get_last_before.png)
 
-The code below shows how to use [get_last_before](/argaze/#argaze.DataStructures.TimeStampedBuffer.get_last_before) method in order to synchronise two timestamped data buffers with different timestamps:
+The code below shows how to use the [get_last_before](../../../argaze/#argaze.DataStructures.TimeStampedBuffer.get_last_before) method in order to synchronise two timestamped data buffers with different timestamps:
 
 ``` python
 from argaze import DataStructures
@@ -84,7 +84,7 @@ for A_ts, A_data in A_data_record.items():
 
 ![Get last until](../../img/get_last_until.png)
 
-The code below shows how to use [get_last_until](/argaze/#argaze.DataStructures.TimeStampedBuffer.get_last_until) method in order to synchronise two timestamped data buffers with different timestamps:
+The code below shows how to use the [get_last_until](../../../argaze/#argaze.DataStructures.TimeStampedBuffer.get_last_until) method in order to synchronise two timestamped data buffers with different timestamps:
 
 ``` python
 from argaze import DataStructures
diff --git a/docs/user_guide/timestamped_data/introduction.md b/docs/user_guide/timestamped_data/introduction.md
index 2cee263..ed13d85 100644
--- a/docs/user_guide/timestamped_data/introduction.md
+++ b/docs/user_guide/timestamped_data/introduction.md
@@ -3,4 +3,4 @@ Timestamped data
 
 Working with wearable eye tracker devices implies handling various timestamped data like frames, gaze positions, pupil diameters, fixations, saccades, ...
 
-This section mainly refers to [DataStructures.TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) class.
+This section mainly refers to the [DataStructures.TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) class.
diff --git a/docs/user_guide/timestamped_data/ordered_dictionary.md b/docs/user_guide/timestamped_data/ordered_dictionary.md
index 8c93fc6..a3154eb 100644
--- a/docs/user_guide/timestamped_data/ordered_dictionary.md
+++ b/docs/user_guide/timestamped_data/ordered_dictionary.md
@@ -1,7 +1,7 @@
 Ordered dictionary
 ==================
 
-[TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) class inherits from [OrderedDict](https://docs.python.org/3/library/collections.html#collections.OrderedDict) as data are de facto ordered by time.
+The [TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) class inherits from [OrderedDict](https://docs.python.org/3/library/collections.html#collections.OrderedDict) as data are de facto ordered by time.
 
 Any data type can be stored using int or float keys as timestamps.
 
diff --git a/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md b/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
index 3e6c0a4..839460a 100644
--- a/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
+++ b/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
@@ -7,7 +7,7 @@ Pandas DataFrame conversion
 
 A [Pandas DataFrame](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) is a Python data structure allowing powerful table processing.
 
-[TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) instance can be converted into dataframe provided that data values are stored as dictionaries.
+A [TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) instance can be converted into a dataframe provided that data values are stored as dictionaries.
 
 ```python
 from argaze import DataStructures
diff --git a/docs/user_guide/timestamped_data/saving_and_loading.md b/docs/user_guide/timestamped_data/saving_and_loading.md
index d3f2b9c..ae26052 100644
--- a/docs/user_guide/timestamped_data/saving_and_loading.md
+++ b/docs/user_guide/timestamped_data/saving_and_loading.md
@@ -1,7 +1,7 @@
 Saving and loading
 ==================
 
-[TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) instance can be saved as and loaded from JSON file format.
+A [TimeStampedBuffer](../../../argaze/#argaze.DataStructures.TimeStampedBuffer) instance can be saved to and loaded from JSON file format.
 
 ```python
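
The saving and loading hunk above is truncated right after its opening code fence, so here is a minimal sketch of the intended JSON round trip; the `to_json` and `from_json` method names are assumptions to be checked against the DataStructures API reference.

``` python
from argaze import DataStructures

# Build a buffer using int or float timestamps as keys (see the ordered dictionary section)
data_record = DataStructures.TimeStampedBuffer()
data_record[0] = {'value': 1}
data_record[0.1] = {'value': 2}

# Save to a JSON file, then reload it (assumed method names)
data_record.to_json('./data_record.json')
reloaded_record = DataStructures.TimeStampedBuffer.from_json('./data_record.json')
```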