authorThéo de la Hogue2023-09-27 23:08:38 +0200
committerThéo de la Hogue2023-09-27 23:08:38 +0200
commit2d59cfc56590ed356a30d28cc52c00b533ab7a9e (patch)
treed5e94d4d99f97f1d1c2c2833feb8c9c11c23cbf1
parent66b84b019fe760a2cb9901a9f17b2d202d935ba4 (diff)
Removing hidden sections and chapters.
-rw-r--r-- docs/index.md | 2
-rw-r--r-- docs/user_guide/areas_of_interest/aoi_matching.md | 48
-rw-r--r-- docs/user_guide/areas_of_interest/heatmap.md | 40
-rw-r--r-- docs/user_guide/areas_of_interest/introduction.md | 8
-rw-r--r-- docs/user_guide/areas_of_interest/vision_cone_filtering.md | 18
-rw-r--r-- docs/user_guide/gaze_features/gaze_movement.md | 163
-rw-r--r-- docs/user_guide/gaze_features/gaze_position.md | 98
-rw-r--r-- docs/user_guide/gaze_features/introduction.md | 7
-rw-r--r-- docs/user_guide/gaze_features/scan_path.md | 169
-rw-r--r-- docs/user_guide/timestamped_data/data_synchronisation.md | 106
-rw-r--r-- docs/user_guide/timestamped_data/introduction.md | 6
-rw-r--r-- docs/user_guide/timestamped_data/ordered_dictionary.md | 19
-rw-r--r-- docs/user_guide/timestamped_data/pandas_dataframe_conversion.md | 41
-rw-r--r-- docs/user_guide/timestamped_data/saving_and_loading.md | 14
-rw-r--r-- mkdocs.yml | 19
15 files changed, 1 insertion(+), 757 deletions(-)
diff --git a/docs/index.md b/docs/index.md
index f234a94..00e2e29 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -24,7 +24,7 @@ Once incoming data are formatted as required, all those gaze analysis features c
## Augmented reality based on ArUco markers pipeline
-Things goes harder when gaze data comes from head-mounted eye tracker devices. That's why **ArGaze** provides **Augmented Reality (AR)** support to map **Areas Of Interest (AOI)** on <a href="https://docs.opencv.org/4.x/d5/dae/tutorial_aruco_detection.html" target="_blank">OpenCV ArUco markers</a>.
+Things get harder when gaze data comes from head-mounted eye tracker devices. That's why **ArGaze** provides **Augmented Reality (AR)** support to map **Areas Of Interest (AOI)** on [OpenCV ArUco markers](https://www.sciencedirect.com/science/article/abs/pii/S0031320314000235).
![ArUco pipeline axis](img/aruco_pipeline_axis.png)
diff --git a/docs/user_guide/areas_of_interest/aoi_matching.md b/docs/user_guide/areas_of_interest/aoi_matching.md
deleted file mode 100644
index 60467f9..0000000
--- a/docs/user_guide/areas_of_interest/aoi_matching.md
+++ /dev/null
@@ -1,48 +0,0 @@
----
-title: AOI matching
----
-
-AOI matching
-============
-
-Once [AOI3DScene](../../argaze.md/#argaze.AreaOfInterest.AOI3DScene) is projected as [AOI2DScene](../../argaze.md/#argaze.AreaOfInterest.AOI2DScene), it may be necessary to know which AOI is being looked at.
-
-The [AreaOfInterest](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) class in [AOIFeatures](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures) provides two ways to accomplish such a task.
-
-## Pointer-based matching
-
-Test whether a 2D pointer is inside an AOI using the contains_point() method, as illustrated below.
-
-![Contains point](../../img/contains_point.png)
-
-``` python
-pointer = (x, y)
-
-for name, aoi in aoi2D_scene.items():
-
- if aoi.contains_point(pointer):
-
- # Do something with looked aoi
- ...
-
-```
-
-It is also possible to get where a pointer is looking inside an AOI, provided that the AOI is a rectangular plane:
-
-``` python
-
-inner_x, inner_y = aoi.inner_axis(pointer)
-
-```
-
-## Circle-based matching
-
-As positions have limited accuracy, it is possible to define a radius around a pointer to test circle intersection with AOI.
-
-![Circle intersection](../../img/circle_intersection.png)
-
-``` python
-
-intersection_shape, intersection_aoi_ratio, intersection_circle_ratio = aoi.circle_intersection(pointer, radius)
-
-```
diff --git a/docs/user_guide/areas_of_interest/heatmap.md b/docs/user_guide/areas_of_interest/heatmap.md
deleted file mode 100644
index 450c033..0000000
--- a/docs/user_guide/areas_of_interest/heatmap.md
+++ /dev/null
@@ -1,40 +0,0 @@
----
-title: Heatmap
----
-
-Heatmap
-=========
-
-[AOIFeatures](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures) provides the [Heatmap](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.Heatmap) class to draw a heatmap image.
-
-## Point spread
-
-The **point_spread** method draws a Gaussian point spread into the heatmap image at a given pointer position.
-
-![Point spread](../../img/point_spread.png)
-
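The point spread can be illustrated with a minimal pure-Python sketch. This is an illustration of the idea only, not the actual Heatmap implementation; here sigma is assumed to be expressed as a fraction of the image width, mirroring the `sigma` parameter of the update() call shown below.

``` python
import math

def point_spread(width, height, point, sigma=0.05):
    """Return an image holding a normalised Gaussian spread
    centred on a pointer position (pixel coordinates)."""
    px, py = point
    scale = sigma * width
    image = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            # Squared distance to the pointer position
            d2 = (x - px) ** 2 + (y - py) ** 2
            image[y][x] = math.exp(-d2 / (2 * scale ** 2))
    return image

# The spread peaks at the pointer position and decays with distance
image = point_spread(8, 8, (4, 4), sigma=0.25)
```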
-## Heatmap
-
-Heatmap visualisation shows where a pointer spends most of its time.
-
-![Heatmap](../../img/heatmap.png)
-
-```python
-from argaze.AreaOfInterest import AOIFeatures
-
-# Create heatmap of 800px * 600px resolution
-heatmap = AOIFeatures.Heatmap((800, 600))
-
-# Initialize heatmap
-heatmap.init()
-
-# Assuming a pointer position (x, y) is moving inside frame
-...:
-
- # Update heatmap at pointer position
- heatmap.update((x, y), sigma=0.05)
-
- # Do something with heatmap image
- ... heatmap.image
-
-``` \ No newline at end of file
diff --git a/docs/user_guide/areas_of_interest/introduction.md b/docs/user_guide/areas_of_interest/introduction.md
deleted file mode 100644
index 9467963..0000000
--- a/docs/user_guide/areas_of_interest/introduction.md
+++ /dev/null
@@ -1,8 +0,0 @@
-About Areas Of Interest (AOI)
-=============================
-
-The [AreaOfInterest submodule](../../argaze.md/#argaze.AreaOfInterest) allows dealing with AOI through a set of high-level classes:
-
-* [AOIFeatures](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures)
-* [AOI3DScene](../../argaze.md/#argaze.AreaOfInterest.AOI3DScene)
-* [AOI2DScene](../../argaze.md/#argaze.AreaOfInterest.AOI2DScene) \ No newline at end of file
diff --git a/docs/user_guide/areas_of_interest/vision_cone_filtering.md b/docs/user_guide/areas_of_interest/vision_cone_filtering.md
deleted file mode 100644
index 5c377bf..0000000
--- a/docs/user_guide/areas_of_interest/vision_cone_filtering.md
+++ /dev/null
@@ -1,18 +0,0 @@
-Vision cone filtering
-=====================
-
-The [AOI3DScene](../../argaze.md/#argaze.AreaOfInterest.AOI3DScene) provides cone clipping support in order to select only the AOI which are inside the vision cone field.
-
-![Vision cone](../../img/vision_cone.png)
-
-``` python
-# Transform scene into camera referential
-aoi3D_camera = aoi3D_scene.transform(tvec, rmat)
-
-# Get aoi inside vision cone field
-# The vision cone tip is positioned behind the head
-aoi3D_inside, aoi3D_outside = aoi3D_camera.vision_cone(cone_radius=300, cone_height=150, cone_tip=[0., 0., -20.])
-
-# Keep only aoi inside vision cone field
-aoi3D_scene = aoi3D_scene.copy(exclude=aoi3D_outside.keys())
-```
diff --git a/docs/user_guide/gaze_features/gaze_movement.md b/docs/user_guide/gaze_features/gaze_movement.md
deleted file mode 100644
index 83f67e1..0000000
--- a/docs/user_guide/gaze_features/gaze_movement.md
+++ /dev/null
@@ -1,163 +0,0 @@
-Gaze movement
-=============
-
-## Definition
-
-!!! note
-
- *"The act of classifying eye movements into distinct events is, on a general level, driven by a desire to isolate different intervals of the data stream strongly correlated with certain oculomotor or cognitive properties."*
-
- Citation from ["One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms"](https://link.springer.com/article/10.3758/s13428-016-0738-9) article.
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines abstract [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement) class, then abstract [Fixation](../../argaze.md/#argaze.GazeFeatures.Fixation) and [Saccade](../../argaze.md/#argaze.GazeFeatures.Saccade) classes which inherit from [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement).
-
-The **positions** attribute of a [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement) contains all the [GazePositions](../../argaze.md/#argaze.GazeFeatures.GazePosition) belonging to the movement.
-
-![Fixation and Saccade](../../img/fixation_and_saccade.png)
-
-## Identification
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines the abstract [GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) class to allow adding various identification algorithms.
-
-Some gaze movement identification algorithms are available thanks to [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule:
-
-* [Dispersion threshold identification (I-DT)](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification)
-* [Velocity threshold identification (I-VT)](../../argaze.md/#argaze.GazeAnalysis.VelocityThresholdIdentification)
-
-### Identify method
-
-The [GazeMovementIdentifier.identify](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier.identify) method feeds the identification algorithm with successive gaze positions and outputs Fixation, Saccade or any other kind of GazeMovement instance.
-
-Here is a code sample based on the [I-DT](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification) algorithm to illustrate how to use it:
-
-``` python
-from argaze import GazeFeatures
-from argaze.GazeAnalysis import DispersionThresholdIdentification
-
-# Create a gaze movement identifier based on dispersion algorithm with 50px maximum deviation and 200 ms maximum duration thresholds
-gaze_movement_identifier = DispersionThresholdIdentification.GazeMovementIdentifier(50, 200)
-
-# Assuming that timestamped gaze positions are provided through live stream or later data reading
-...:
-
- gaze_movement = gaze_movement_identifier.identify(timestamp, gaze_position)
-
- # Fixation identified
- if GazeFeatures.is_fixation(gaze_movement):
-
- # Access to first gaze position of identified fixation
- start_ts, start_position = gaze_movement.positions.first
-
- # Access to fixation duration
- print(f'duration: {gaze_movement.duration}')
-
- # Iterate over all gaze positions of identified fixation
- for ts, position in gaze_movement.positions.items():
-
- # Do something with each fixation position
- ...
-
- # Saccade identified
- elif GazeFeatures.is_saccade(gaze_movement):
-
- # Access to first gaze position of identified saccade
- start_ts, start_position = gaze_movement.positions.first
-
- # Access to saccade amplitude
- print(f'amplitude: {gaze_movement.amplitude}')
-
- # Iterate over all gaze positions of identified saccade
- for ts, position in gaze_movement.positions.items():
-
- # Do something with each saccade position
- ...
-
- # No gaze movement identified
- else:
-
- continue
-
-```
-
-### Browse method
-
-The [GazeMovementIdentifier.browse](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier.browse) method allows passing a [TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) buffer to apply the identification algorithm to all the gaze positions it contains.
-
-Identified gaze movements are returned through:
-
-* [TimeStampedGazeMovements](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all fixations are stored by starting gaze position timestamp.
-* [TimeStampedGazeMovements](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all saccades are stored by starting gaze position timestamp.
-* [TimeStampedGazeStatus](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeStatus) instance where all gaze positions are linked to a fixation or saccade index.
-
-``` python
-# Assuming that timestamped gaze positions are provided through data reading
-
-ts_fixations, ts_saccades, ts_status = gaze_movement_identifier.browse(ts_gaze_positions)
-
-```
-
-* ts_fixations would look like:
-
-|timestamp|positions |duration|dispersion|focus |
-|:--------|:-------------------------------------------------------------|:-------|:---------|:--------|
-|60034 |{"60034":[846,620], "60044":[837,641], "60054":[835,649], ...}|450 |40 |(840,660)|
-|60504 |{"60504":[838,667], "60514":[838,667], "60524":[837,669], ...}|100 |38 |(834,651)|
-|... |... |... |.. |... |
-
-* ts_saccades would look like:
-
-|timestamp|positions |duration|
-|:--------|:---------------------------------------|:-------|
-|60484 |{"60484":[836, 669], "60494":[837, 669]}|10 |
-|60594 |{"60594":[833, 613], "60614":[927, 601]}|20 |
-|... |... |... |
-
-* ts_status would look like:
-
-|timestamp|position |type |index|
-|:--------|:---------|:-------|:----|
-|60034 |(846, 620)|Fixation|1 |
-|60044 |(837, 641)|Fixation|1 |
-|... |... |... |. |
-|60464 |(836, 668)|Fixation|1 |
-|60474 |(836, 668)|Fixation|1 |
-|60484 |(836, 669)|Saccade |1 |
-|60494 |(837, 669)|Saccade |1 |
-|60504 |(838, 667)|Fixation|2 |
-|60514 |(838, 667)|Fixation|2 |
-|... |... |... |. |
-|60574 |(825, 629)|Fixation|2 |
-|60584 |(829, 615)|Fixation|2 |
-|60594 |(833, 613)|Saccade |2 |
-|60614 |(927, 601)|Saccade |2 |
-|60624 |(933, 599)|Fixation|3 |
-|60634 |(934, 603)|Fixation|3 |
-|... |... |... |. |
-
-
-!!! note
- [TimeStampedGazeMovements](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeMovements) and [TimeStampedGazeStatus](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeStatus) classes inherit from the [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class.
-
- Read [Timestamped data](../timestamped_data/introduction.md) section to understand all features it provides.
-
-### Generator method
-
-[GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) can be called with a [TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) buffer as argument to generate a gaze movement each time one is identified.
-
-``` python
-# Assuming that timestamped gaze positions are provided through data reading
-
-for ts, gaze_movement in gaze_movement_identifier(ts_gaze_positions):
-
- # Fixation identified
- if GazeFeatures.is_fixation(gaze_movement):
-
- # Do something with each fixation
- ...
-
- # Saccade identified
- elif GazeFeatures.is_saccade(gaze_movement):
-
- # Do something with each saccade
- ...
-``` \ No newline at end of file
diff --git a/docs/user_guide/gaze_features/gaze_position.md b/docs/user_guide/gaze_features/gaze_position.md
deleted file mode 100644
index 48495b4..0000000
--- a/docs/user_guide/gaze_features/gaze_position.md
+++ /dev/null
@@ -1,98 +0,0 @@
-Gaze position
-=============
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines a [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class to handle point coordinates with a precision value.
-
-``` python
-from argaze import GazeFeatures
-
-# Define a basic gaze position
-gaze_position = GazeFeatures.GazePosition((123, 456))
-
-# Define a gaze position with a precision value
-gaze_position = GazeFeatures.GazePosition((789, 765), precision=10)
-
-# Access to gaze position value and precision
-print(f'position: {gaze_position.value}')
-print(f'precision: {gaze_position.precision}')
-
-```
-
-## Validity
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) also defines an [UnvalidGazePosition](../../argaze.md/#argaze.GazeFeatures.UnvalidGazePosition) class that inherits from [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) to handle cases where no gaze position exists for some device-specific reason.
-
-``` python
-from argaze import GazeFeatures
-
-# Define a basic unvalid gaze position
-gaze_position = GazeFeatures.UnvalidGazePosition()
-
-# Define a basic unvalid gaze position with a message value
-gaze_position = GazeFeatures.UnvalidGazePosition("Something bad happened")
-
-# Access to gaze position validity
-print(f'validity: {gaze_position.valid}')
-
-```
-
-## Distance
-
-[GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class provides a **distance** method to calculate the distance to another gaze position instance.
-
-![Distance](../../img/distance.png)
-
-``` python
-# Distance between A and B positions
-d = gaze_position_A.distance(gaze_position_B)
-```
-
-## Overlapping
-
-[GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class provides an **overlap** method to test if a gaze position overlaps another one considering their precisions.
-
-![Gaze overlapping](../../img/overlapping.png)
-
-``` python
-# Check that A overlaps B
-if gaze_position_A.overlap(gaze_position_B):
-
- # Do something if A overlaps B
- ...
-
-# Check that A overlaps B and B overlaps A
-if gaze_position_A.overlap(gaze_position_B, both=True):
-
- # Do something if A overlaps B AND B overlaps A
- ...
-```
-
-## Timestamped gaze positions
-
-[TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) inherits from the [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class to specifically handle gaze positions.
-
-### Import from dataframe
-
-It is possible to load timestamped gaze positions from a [Pandas DataFrame](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) object.
-
-```python
-import pandas
-
-# Load gaze positions from a CSV file into a Pandas DataFrame
-dataframe = pandas.read_csv('gaze_positions.csv', delimiter="\t", low_memory=False)
-
-# Convert the Pandas DataFrame into a TimeStampedGazePositions buffer, specifying which column labels to use
-ts_gaze_positions = GazeFeatures.TimeStampedGazePositions.from_dataframe(dataframe, timestamp = 'Recording timestamp [ms]', x = 'Gaze point X [px]', y = 'Gaze point Y [px]')
-
-```
-### Iterator
-
-Like [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer), the [TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) class provides an iterator feature:
-
-```python
-for timestamp, gaze_position in ts_gaze_positions.items():
-
- # Do something with each gaze position
- ...
-
-```
diff --git a/docs/user_guide/gaze_features/introduction.md b/docs/user_guide/gaze_features/introduction.md
deleted file mode 100644
index bf818ba..0000000
--- a/docs/user_guide/gaze_features/introduction.md
+++ /dev/null
@@ -1,7 +0,0 @@
-Gaze analysis
-=============
-
-This section refers to:
-
-* [GazeFeatures](../../argaze.md/#argaze.GazeFeatures)
-* [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) \ No newline at end of file
diff --git a/docs/user_guide/gaze_features/scan_path.md b/docs/user_guide/gaze_features/scan_path.md
deleted file mode 100644
index 46af28b..0000000
--- a/docs/user_guide/gaze_features/scan_path.md
+++ /dev/null
@@ -1,169 +0,0 @@
-Scan path
-=========
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines classes to handle successive fixations/saccades and analyse their spatial or temporal properties.
-
-## Fixation based scan path
-
-### Definition
-
-The [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) class is defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep) which are defined as a fixation and a consecutive saccade.
-
-![Fixation based scan path](../../img/scan_path.png)
-
-As fixations and saccades are identified, the scan path is built by calling respectively [append_fixation](../../argaze.md/#argaze.GazeFeatures.ScanPath.append_fixation) and [append_saccade](../../argaze.md/#argaze.GazeFeatures.ScanPath.append_saccade) methods.
-
-### Analysis
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines the abstract [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer) class to allow adding various analysis algorithms.
-
-Some scan path analyses are available thanks to the [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule:
-
-* [K-Coefficient](../../argaze.md/#argaze.GazeAnalysis.KCoefficient)
-* [Nearest Neighbor Index](../../argaze.md/#argaze.GazeAnalysis.NearestNeighborIndex)
-* [Exploit Explore Ratio](../../argaze.md/#argaze.GazeAnalysis.ExploitExploreRatio)
-
-### Example
-
-Here is a code sample to illustrate how to build a scan path and analyse it:
-
-``` python
-from argaze import GazeFeatures
-from argaze.GazeAnalysis import KCoefficient
-
-# Create an empty scan path
-scan_path = GazeFeatures.ScanPath()
-
-# Create a K coefficient analyzer
-kc_analyzer = KCoefficient.ScanPathAnalyzer()
-
-# Assuming a gaze movement is identified at ts time
-...:
-
- # Fixation identified
- if GazeFeatures.is_fixation(gaze_movement):
-
- # Append fixation to scan path: no step is created
- scan_path.append_fixation(ts, gaze_movement)
-
- # Saccade identified
- elif GazeFeatures.is_saccade(gaze_movement):
-
- # Append saccade to scan path: a new step should be created
- new_step = scan_path.append_saccade(data_ts, gaze_movement)
-
- # Analyse scan path
- if new_step:
-
- K = kc_analyzer.analyze(scan_path)
-
- # Do something with K metric
- ...
-```
-
-## AOI based scan path
-
-### Definition
-
-The [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) class is defined as a list of [AOIScanSteps](../../argaze.md/#argaze.GazeFeatures.AOIScanStep), each of which is defined as a set of consecutive fixations looking at the same Area Of Interest (AOI) followed by a saccade.
-
-![AOI based scan path](../../img/aoi_scan_path.png)
-
-As fixations and saccades are identified, the scan path is built by calling respectively [append_fixation](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.append_fixation) and [append_saccade](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.append_saccade) methods.
-
-### Analysis
-
-[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines the abstract [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer) class to allow adding various analysis algorithms.
-
-Some scan path analyses are available thanks to the [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule:
-
-* [Transition matrix](../../argaze.md/#argaze.GazeAnalysis.TransitionMatrix)
-* [Entropy](../../argaze.md/#argaze.GazeAnalysis.Entropy)
-* [Lempel-Ziv complexity](../../argaze.md/#argaze.GazeAnalysis.LempelZivComplexity)
-* [N-Gram](../../argaze.md/#argaze.GazeAnalysis.NGram)
-* [K-modified coefficient](../../argaze.md/#argaze.GazeAnalysis.KCoefficient)
-
-### Example
-
-Here is a code sample to illustrate how to build an AOI scan path and analyse it:
-
-``` python
-from argaze import GazeFeatures
-from argaze.GazeAnalysis import LempelZivComplexity
-
-# Assuming all AOI names are listed
-...
-
-# Create an empty AOI scan path
-aoi_scan_path = GazeFeatures.AOIScanPath(aoi_names)
-
-# Create a Lempel-Ziv complexity analyzer
-lzc_analyzer = LempelZivComplexity.AOIScanPathAnalyzer()
-
-# Assuming a gaze movement is identified at ts time
-...:
-
- # Fixation identified
- if GazeFeatures.is_fixation(gaze_movement):
-
- # Assuming fixation is detected as inside an AOI
- ...
-
- # Append fixation to AOI scan path: a new step should be created
- new_step = aoi_scan_path.append_fixation(ts, gaze_movement, looked_aoi_name)
-
- # Analyse AOI scan path
- if new_step:
-
- LZC = lzc_analyzer.analyze(aoi_scan_path)
-
- # Do something with LZC metric
- ...
-
- # Saccade identified
- elif GazeFeatures.is_saccade(gaze_movement):
-
- # Append saccade to AOI scan path: no step is created
- aoi_scan_path.append_saccade(data_ts, gaze_movement)
-
-```
-
-### Advanced
-
-The [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) class provides some advanced features to analyse it.
-
-#### Letter sequence
-
-When a new [AOIScanStep](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) internally assigns a unique letter index related to its AOI to ease pattern analysis.
-Then, the [AOIScanPath letter_sequence](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.letter_sequence) property returns the concatenation of each [AOIScanStep](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) letter.
-The [AOIScanPath get_letter_aoi](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.get_letter_aoi) method returns the AOI related to a letter index.
-
-``` python
-# Assuming the following AOI scan path is built: Foo > Bar > Shu > Foo
-aoi_scan_path = ...
-
-# Letter sequence representation should be: 'ABCA'
-print(aoi_scan_path.letter_sequence)
-
-# Output should be: 'Bar'
-print(aoi_scan_path.get_letter_aoi('B'))
-
-```
-
-#### Transition matrix
-
-When a new [AOIScanStep](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) internally counts the number of transitions from an AOI to another AOI to ease Markov chain analysis.
-Then, the [AOIScanPath transition_matrix](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) where indexes are transition departures and columns are transition destinations.
-
-Here is an example of transition matrix for the following [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath): Foo > Bar > Shu > Foo > Bar
-
-| |Foo|Bar|Shu|
-|:--|:--|:--|:--|
-|Foo|0 |2 |0 |
-|Bar|0 |0 |1 |
-|Shu|1 |0 |0 |
-
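The counts in the matrix above can be reproduced with a short sketch. Note this uses a plain Counter for illustration, not the Pandas DataFrame that the transition_matrix property actually returns:

``` python
from collections import Counter

# AOI fixation sequence from the example above: Foo > Bar > Shu > Foo > Bar
sequence = ['Foo', 'Bar', 'Shu', 'Foo', 'Bar']

# Count transitions from each AOI to the next one
transitions = Counter(zip(sequence, sequence[1:]))

print(transitions[('Foo', 'Bar')])  # 2, as in the matrix above
```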
-
-#### Fixations count
-
-The [AOIScanPath fixations_count](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.fixations_count) method returns the total number of fixations in the whole scan path and a dictionary to get the fixations count per AOI.
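The shape of the returned values can be sketched as follows. The step structure here is hypothetical and for illustration only; the actual AOIScanStep API may differ:

``` python
from collections import Counter

# Hypothetical scan steps: (AOI name, number of fixations in the step)
steps = [('Foo', 2), ('Bar', 1), ('Shu', 3), ('Foo', 1)]

# Total number of fixations in the whole scan path
total = sum(count for _, count in steps)

# Fixations count per AOI
per_aoi = Counter()
for aoi, count in steps:
    per_aoi[aoi] += count

print(total)           # 7
print(per_aoi['Foo'])  # 3
```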
diff --git a/docs/user_guide/timestamped_data/data_synchronisation.md b/docs/user_guide/timestamped_data/data_synchronisation.md
deleted file mode 100644
index 5190eab..0000000
--- a/docs/user_guide/timestamped_data/data_synchronisation.md
+++ /dev/null
@@ -1,106 +0,0 @@
-Data synchronisation
-====================
-
-Recorded data needs to be synchronised so that they can be linked before further processing.
-
-The [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class provides various methods to help with such a task.
-
-## Pop last before
-
-![Pop last before](../../img/pop_last_before.png)
-
-The code below shows how to use [pop_last_before](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer.pop_last_before) method in order to synchronise two timestamped data buffers with different timestamps:
-
-``` python
-from argaze import DataStructures
-
-# Assuming A_data_record and B_data_record are TimeStampedBuffer instances with different timestamps
-
-for A_ts, A_data in A_data_record.items():
-
- try:
-
- # Get nearest B data before current A data and remove all B data before (including the returned one)
- B_ts, B_data = B_data_record.pop_last_before(A_ts)
-
- # No data stored before A_ts timestamp
- except KeyError:
-
- pass
-
-```
-
-## Pop last until
-
-![Pop last until](../../img/pop_last_until.png)
-
-The code below shows how to use [pop_last_until](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer.pop_last_until) method in order to synchronise two timestamped data buffers with different timestamps:
-
-``` python
-from argaze import DataStructures
-
-# Assuming A_data_record and B_data_record are TimeStampedBuffer instances with different timestamps
-
-for A_ts, A_data in A_data_record.items():
-
- try:
-
- # Get nearest B data after current A data and remove all B data before
- B_ts, B_data = B_data_record.pop_last_until(A_ts)
-
- # No data stored until A_ts timestamp
- except KeyError:
-
- pass
-
-```
-
-## Get last before
-
-![Get last before](../../img/get_last_before.png)
-
-The code below shows how to use [get_last_before](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer.get_last_before) method in order to synchronise two timestamped data buffers with different timestamps:
-
-``` python
-from argaze import DataStructures
-
-# Assuming A_data_record and B_data_record are TimeStampedBuffer instances with different timestamps
-
-for A_ts, A_data in A_data_record.items():
-
- try:
-
- # Get nearest B data before current A data
- B_ts, B_data = B_data_record.get_last_before(A_ts)
-
- # No data stored before A_ts timestamp
- except KeyError:
-
- pass
-
-```
-
-## Get last until
-
-![Get last until](../../img/get_last_until.png)
-
-The code below shows how to use [get_last_until](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer.get_last_until) method in order to synchronise two timestamped data buffers with different timestamps:
-
-``` python
-from argaze import DataStructures
-
-# Assuming A_data_record and B_data_record are TimeStampedBuffer instances with different timestamps
-
-for A_ts, A_data in A_data_record.items():
-
- try:
-
- # Get nearest B data after current A data
- B_ts, B_data = B_data_record.get_last_until(A_ts)
-
- # No data stored until A_ts timestamp
- except KeyError:
-
- pass
-
-```
diff --git a/docs/user_guide/timestamped_data/introduction.md b/docs/user_guide/timestamped_data/introduction.md
deleted file mode 100644
index 974e2be..0000000
--- a/docs/user_guide/timestamped_data/introduction.md
+++ /dev/null
@@ -1,6 +0,0 @@
-Timestamped data
-================
-
-Working with wearable eye tracker devices implies handling various timestamped data like gaze positions, pupil diameters, fixations, saccades, ...
-
-This section mainly refers to [DataStructures.TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class.
diff --git a/docs/user_guide/timestamped_data/ordered_dictionary.md b/docs/user_guide/timestamped_data/ordered_dictionary.md
deleted file mode 100644
index 64dd899..0000000
--- a/docs/user_guide/timestamped_data/ordered_dictionary.md
+++ /dev/null
@@ -1,19 +0,0 @@
-Ordered dictionary
-==================
-
-[TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class inherits from [OrderedDict](https://docs.python.org/3/library/collections.html#collections.OrderedDict) as data are de facto ordered by time.
-
-Any data type can be stored using int or float keys as timestamps.
-
-```python
-from argaze import DataStructures
-
-# Create a timestamped data buffer
-ts_data = DataStructures.TimeStampedBuffer()
-
-# Store any data type using numeric keys
-ts_data[0] = 123
-ts_data[0.1] = "message"
-ts_data[0.23] = {"key": "value"}
-...
-```
diff --git a/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md b/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
deleted file mode 100644
index 7614e73..0000000
--- a/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
+++ /dev/null
@@ -1,41 +0,0 @@
----
-title: Pandas DataFrame conversion
----
-
-Pandas DataFrame conversion
-===========================
-
-A [Pandas DataFrame](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) is a Python data structure that allows powerful table processing.
-
-## Export as dataframe
-
-A [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) instance can be converted into a dataframe provided that data values are stored as dictionaries.
-
-```python
-from argaze import DataStructures
-
-# Create a timestamped data buffer
-ts_data = DataStructures.TimeStampedBuffer()
-
-# Store various data as dictionary
-ts_data[10] = {"A_key": 0, "B_key": 0.123}
-ts_data[20] = {"A_key": 4, "B_key": 0.567}
-ts_data[30] = {"A_key": 8, "B_key": 0.901}
-...
-
-# Convert timestamped data buffer into dataframe
-ts_buffer_dataframe = ts_data.as_dataframe()
-```
-
-ts_buffer_dataframe would look like:
-
-|timestamp|A_key|B_key|
-|:--------|:----|:----|
-|10 |0 |0.123|
-|20 |4 |0.567|
-|30 |8 |0.901|
-|... |... |... |
-
-## Import from dataframe
-
-Conversely, a [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) instance can be created from a dataframe, in which case each dataframe column label becomes a key of the data value dictionary. Note that the column containing timestamp values has to be called 'timestamp'.
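The expected dataframe shape can be sketched with Pandas alone. This only illustrates how a 'timestamp' column keys the data value dictionaries; the actual argaze conversion call is not shown here:

``` python
import pandas

# A dataframe whose timestamp column is called 'timestamp', as required
dataframe = pandas.DataFrame({
    'timestamp': [10, 20, 30],
    'A_key': [0, 4, 8],
    'B_key': [0.123, 0.567, 0.901],
})

# Each row becomes a data value dictionary keyed by its timestamp
records = dataframe.set_index('timestamp').to_dict(orient='index')

print(records[10])  # the row recorded at timestamp 10
```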
diff --git a/docs/user_guide/timestamped_data/saving_and_loading.md b/docs/user_guide/timestamped_data/saving_and_loading.md
deleted file mode 100644
index 4e6a094..0000000
--- a/docs/user_guide/timestamped_data/saving_and_loading.md
+++ /dev/null
@@ -1,14 +0,0 @@
-Saving and loading
-==================
-
-A [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) instance can be saved to and loaded from JSON file format.
-
-```python
-
-# Save
-ts_data.to_json('./data.json')
-
-# Load
-ts_data = DataStructures.TimeStampedBuffer.from_json('./data.json')
-
-```
diff --git a/mkdocs.yml b/mkdocs.yml
index 784c9e2..d00d6e7 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -34,25 +34,6 @@ nav:
- user_guide/aruco_markers_pipeline/advanced_topics/scripting.md
- user_guide/aruco_markers_pipeline/advanced_topics/optic_parameters_calibration.md
- user_guide/aruco_markers_pipeline/advanced_topics/aruco_detector_configuration.md
-
-# - Areas Of Interest:
-# - user_guide/areas_of_interest/introduction.md
-# - user_guide/areas_of_interest/aoi_scene_description.md
-# - user_guide/areas_of_interest/aoi_scene_projection.md
-# - user_guide/areas_of_interest/vision_cone_filtering.md
-# - user_guide/areas_of_interest/aoi_matching.md
-# - user_guide/areas_of_interest/heatmap.md
-# - Gaze Features:
-# - user_guide/gaze_features/introduction.md
-# - user_guide/gaze_features/gaze_position.md
-# - user_guide/gaze_features/gaze_movement.md
-# - user_guide/gaze_features/scan_path.md
-# - Timestamped data:
-# - user_guide/timestamped_data/introduction.md
-# - user_guide/timestamped_data/ordered_dictionary.md
-# - user_guide/timestamped_data/saving_and_loading.md
-# - user_guide/timestamped_data/data_synchronisation.md
-# - user_guide/timestamped_data/pandas_dataframe_conversion.md
- utils:
- user_guide/utils/ready-made_scripts.md
- user_guide/utils/demonstrations_scripts.md