Diffstat (limited to 'docs/user_guide')
 docs/user_guide/gaze_analysis/gaze_movement.md                  | 8 ++++----
 docs/user_guide/gaze_analysis/scan_path.md                      | 6 +++---
 docs/user_guide/timestamped_data/pandas_dataframe_conversion.md | 8 ++++++--
 3 files changed, 13 insertions(+), 9 deletions(-)
diff --git a/docs/user_guide/gaze_analysis/gaze_movement.md b/docs/user_guide/gaze_analysis/gaze_movement.md
index 6c7ab76..377e787 100644
--- a/docs/user_guide/gaze_analysis/gaze_movement.md
+++ b/docs/user_guide/gaze_analysis/gaze_movement.md
@@ -26,9 +26,9 @@ Some gaze movement identification algorithms are available thanks to [GazeAnalys
### Identify method
-[GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) **identify** method allows to fed its identification algorithm with successive gaze positions to output Fixation, Saccade or any kind of GazeMovement instances.
+The [GazeMovementIdentifier.identify](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier.identify) method feeds the identification algorithm with successive gaze positions to output Fixation, Saccade or any other kind of GazeMovement instances.
-Here is a sample of code based on I-DT algorithm to illustrate how to use it:
+Here is a code sample based on the [I-DT](/argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification) algorithm to illustrate how to use it:
``` python
from argaze import GazeFeatures
@@ -81,7 +81,7 @@ gaze_movement_identifier = DispersionThresholdIdentification.GazeMovementIdentif
### Browse method
-[GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) **browse** method allows to pass a TimeStampedGazePositions buffer to apply identification algorithm on all gaze positions inside.
+The [GazeMovementIdentifier.browse](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier.browse) method takes a [TimeStampedGazePositions](/argaze/#argaze.GazeFeatures.TimeStampedGazePositions) buffer and applies the identification algorithm to all gaze positions inside.
Identified gaze movements are returned through:
@@ -138,4 +138,4 @@ ts_fixations, ts_saccades, ts_status = gaze_movement_identifier.browse(ts_gaze_p
!!! note
    [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) and [TimeStampedGazeStatus](/argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) classes inherit from the [TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) class.
-    Read [timestamped data] page to understand all features it provides.
\ No newline at end of file
+    Read [Timestamped data](../timestamped_data/introduction.md) section to understand all features it provides.
\ No newline at end of file
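To make the two hunks above concrete, here is a minimal sketch of the identify/browse workflow they document; the identifier parameter names (`deviation_max_threshold`, `duration_min_threshold`) and the `GazePosition` constructor arguments are assumptions not confirmed by this diff:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import DispersionThresholdIdentification

# Build a timestamped buffer of gaze positions (timestamp -> GazePosition).
ts_gaze_positions = GazeFeatures.TimeStampedGazePositions()
for ts, point in [(0, (123, 456)), (20, (125, 458)), (40, (126, 457))]:
    ts_gaze_positions[ts] = GazeFeatures.GazePosition(point)

# Assumed parameter names for the I-DT identifier (not shown in this diff).
gaze_movement_identifier = DispersionThresholdIdentification.GazeMovementIdentifier(
    deviation_max_threshold=50, duration_min_threshold=200)

# browse applies the identification algorithm to the whole buffer at once and
# returns the three timestamped buffers named in the hunk above.
ts_fixations, ts_saccades, ts_status = gaze_movement_identifier.browse(ts_gaze_positions)
```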
diff --git a/docs/user_guide/gaze_analysis/scan_path.md b/docs/user_guide/gaze_analysis/scan_path.md
index e00682f..67600ac 100644
--- a/docs/user_guide/gaze_analysis/scan_path.md
+++ b/docs/user_guide/gaze_analysis/scan_path.md
@@ -11,7 +11,7 @@ The [ScanPath](/argaze/#argaze.GazeFeatures.ScanPath) class is defined as a list
![Fixation based scan path](../../img/scan_path.png)
-As fixations and saccades are identified, the scan path is built by calling respectively **append_fixation** and **append_saccade** methods.
+As fixations and saccades are identified, the scan path is built by calling the [append_fixation](/argaze/#argaze.GazeFeatures.ScanPath.append_fixation) and [append_saccade](/argaze/#argaze.GazeFeatures.ScanPath.append_saccade) methods respectively.
### Analysis
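For context, here is a minimal sketch of the identify-then-append loop behind these methods, reusing the buffer and identifier from the previous sketch; the `(timestamp, movement)` signatures and the `isinstance` checks against `GazeFeatures.Fixation` / `GazeFeatures.Saccade` are assumptions, as the hunk above only confirms the method names:

``` python
from argaze import GazeFeatures

scan_path = GazeFeatures.ScanPath()

# Feed gaze positions one by one; identify is assumed to return a gaze
# movement once one completes, matching the identify method described above.
for ts, gaze_position in ts_gaze_positions.items():
    gaze_movement = gaze_movement_identifier.identify(ts, gaze_position)

    if isinstance(gaze_movement, GazeFeatures.Fixation):
        scan_path.append_fixation(ts, gaze_movement)
    elif isinstance(gaze_movement, GazeFeatures.Saccade):
        scan_path.append_saccade(ts, gaze_movement)
```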
@@ -68,7 +68,7 @@ The [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) class is defined as
![AOI based scan path](../../img/aoi_scan_path.png)
-As fixations and saccades are identified, the scan path is built by calling respectively **append_fixation** and **append_saccade** methods.
+As fixations and saccades are identified, the scan path is built by calling the [append_fixation](/argaze/#argaze.GazeFeatures.AOIScanPath.append_fixation) and [append_saccade](/argaze/#argaze.GazeFeatures.AOIScanPath.append_saccade) methods respectively.
### Analysis
@@ -152,7 +152,7 @@ print(aoi_scan_path.get_letter_aoi('B'))
#### Transition matrix
When a new [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) internally counts the number of transitions from one AOI to another to ease Markov chain analysis.
-Then, the [AOIScanPath transition_matrix](/argaze/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a *Pandas DataFrame* where indexes are transition departures and columns are transition destinations.
+Then, the [AOIScanPath transition_matrix](/argaze/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) where row indexes are transition departures and columns are transition destinations.
Here is an example of a transition matrix for the following [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath): Foo > Bar > Shu > Foo > Bar
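To make the row/column convention concrete, here is a hedged sketch that rebuilds the matrix for that path with plain pandas; it mimics the described output shape and is not the [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) internals:

``` python
import pandas

# The example path from above: Foo > Bar > Shu > Foo > Bar.
path = ['Foo', 'Bar', 'Shu', 'Foo', 'Bar']
transitions = pandas.DataFrame({'from': path[:-1], 'to': path[1:]})

# Count each (from, to) pair; rows are departures, columns are destinations.
matrix = pandas.crosstab(transitions['from'], transitions['to'])
print(matrix)
# to    Bar  Foo  Shu
# from
# Bar     0    0    1
# Foo     2    0    0
# Shu     0    1    0
```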
diff --git a/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md b/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
index caddb11..3e6c0a4 100644
--- a/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
+++ b/docs/user_guide/timestamped_data/pandas_dataframe_conversion.md
@@ -1,7 +1,11 @@
-Pandas dataframe conversion
+---
+title: Pandas DataFrame conversion
+---
+
+Pandas DataFrame conversion
===========================
-A [Pandas dataframe](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) is a python data structure allowing powerful table processings.
+A [Pandas DataFrame](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) is a Python data structure that enables powerful table processing.
A [TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) instance can be converted into a DataFrame provided that data values are stored as dictionaries.
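Beyond the end of this hunk, here is a minimal sketch of the dictionary-valued conversion that sentence describes, using the generic pandas equivalent since the exact argaze conversion method is not visible in this diff:

``` python
import pandas

from argaze import DataStructures

# A buffer whose values are dictionaries, as required above.
ts_buffer = DataStructures.TimeStampedBuffer()
ts_buffer[0] = {'x': 123, 'y': 456}
ts_buffer[20] = {'x': 125, 'y': 458}

# Timestamps become the index, dictionary keys become the columns
# (assumes TimeStampedBuffer behaves like a timestamp-keyed mapping).
dataframe = pandas.DataFrame.from_dict(dict(ts_buffer), orient='index')
print(dataframe)
#       x    y
# 0   123  456
# 20  125  458
```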