From 23fa1a7835b3c7cfd976b1d160878289b1f0657c Mon Sep 17 00:00:00 2001
From: Theo De La Hogue
Date: Sat, 23 Sep 2023 07:22:23 +0200
Subject: Fixing code annotation. Removing useless documentation section. Fixing documentation cross reference.
---
 docs/user_guide/gaze_features/gaze_movement.md | 163 +++++++++++++++++++++++
 docs/user_guide/gaze_features/gaze_position.md |  98 ++++++++++++++
 docs/user_guide/gaze_features/introduction.md  |   7 +
 docs/user_guide/gaze_features/scan_path.md     | 169 +++++++++++++++++++++++
 4 files changed, 437 insertions(+)
 create mode 100644 docs/user_guide/gaze_features/gaze_movement.md
 create mode 100644 docs/user_guide/gaze_features/gaze_position.md
 create mode 100644 docs/user_guide/gaze_features/introduction.md
 create mode 100644 docs/user_guide/gaze_features/scan_path.md

diff --git a/docs/user_guide/gaze_features/gaze_movement.md b/docs/user_guide/gaze_features/gaze_movement.md
new file mode 100644
index 0000000..83f67e1
--- /dev/null
+++ b/docs/user_guide/gaze_features/gaze_movement.md
@@ -0,0 +1,163 @@

Gaze movement
=============

## Definition

!!! note

    *"The act of classifying eye movements into distinct events is, on a general level, driven by a desire to isolate different intervals of the data stream strongly correlated with certain oculomotor or cognitive properties."*

    Citation from the ["One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms"](https://link.springer.com/article/10.3758/s13428-016-0738-9) article.

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines an abstract [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement) class, then abstract [Fixation](../../argaze.md/#argaze.GazeFeatures.Fixation) and [Saccade](../../argaze.md/#argaze.GazeFeatures.Saccade) classes which inherit from [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement).

The **positions** attribute of a [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement) contains all the [GazePositions](../../argaze.md/#argaze.GazeFeatures.GazePosition) belonging to it.

![Fixation and Saccade](../../img/fixation_and_saccade.png)

## Identification

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines an abstract [GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) class so that various identification algorithms can be added.

Some gaze movement identification algorithms are available thanks to the [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule:

* [Dispersion threshold identification (I-DT)](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification)
* [Velocity threshold identification (I-VT)](../../argaze.md/#argaze.GazeAnalysis.VelocityThresholdIdentification)

### Identify method

The [GazeMovementIdentifier.identify](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier.identify) method feeds the identification algorithm with successive gaze positions and outputs Fixation, Saccade or any other kind of GazeMovement instances.

Here is a sample of code based on the [I-DT](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification) algorithm to illustrate how to use it:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import DispersionThresholdIdentification

# Create a gaze movement identifier based on the dispersion algorithm with 50px maximum deviation and 200 ms maximum duration thresholds
gaze_movement_identifier = DispersionThresholdIdentification.GazeMovementIdentifier(50, 200)

# Assuming that timestamped gaze positions are provided through live stream or later data reading
...:

    gaze_movement = gaze_movement_identifier.identify(timestamp, gaze_position)

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Access the first gaze position of the identified fixation
        start_ts, start_position = gaze_movement.positions.first

        # Access the fixation duration
        print(f'duration: {gaze_movement.duration}')

        # Iterate over all gaze positions of the identified fixation
        for ts, position in gaze_movement.positions.items():

            # Do something with each fixation position
            ...

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Access the first gaze position of the identified saccade
        start_ts, start_position = gaze_movement.positions.first

        # Access the saccade amplitude
        print(f'amplitude: {gaze_movement.amplitude}')

        # Iterate over all gaze positions of the identified saccade
        for ts, position in gaze_movement.positions.items():

            # Do something with each saccade position
            ...

    # No gaze movement identified
    else:

        continue

```

### Browse method

The [GazeMovementIdentifier.browse](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier.browse) method takes a [TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) buffer and applies the identification algorithm to all the gaze positions inside it.

Identified gaze movements are returned through:

* a [TimeStampedGazeMovements](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all fixations are stored by starting gaze position timestamp,
* a [TimeStampedGazeMovements](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all saccades are stored by starting gaze position timestamp,
* a [TimeStampedGazeStatus](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeStatus) instance where all gaze positions are linked to a fixation or saccade index.

``` python
# Assuming that timestamped gaze positions are provided through data reading

ts_fixations, ts_saccades, ts_status = gaze_movement_identifier.browse(ts_gaze_positions)

```
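
The returned buffers can be iterated like any other timestamped buffer. As a minimal sketch (assuming the identification above found at least one fixation, and using only the `duration` attribute shown in the tables below):

``` python
# Sketch: count identified fixations and compute their mean duration (assuming at least one fixation)
durations = [fixation.duration for _, fixation in ts_fixations.items()]

print(f'{len(durations)} fixations, mean duration: {sum(durations) / len(durations)} ms')
```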

* ts_fixations would look like:

|timestamp|positions                                                     |duration|dispersion|focus    |
|:--------|:-------------------------------------------------------------|:-------|:---------|:--------|
|60034    |{"60034":[846,620], "60044":[837,641], "60054":[835,649], ...} |450     |40        |(840,660)|
|60504    |{"60504":[838,667], "60514":[838,667], "60524":[837,669], ...} |100     |38        |(834,651)|
|...      |...                                                            |...     |...       |...      |

* ts_saccades would look like:

|timestamp|positions                                |duration|
|:--------|:----------------------------------------|:-------|
|60484    |{"60484":[836, 669], "60494":[837, 669]} |10      |
|60594    |{"60594":[833, 613], "60614":[927, 601]} |20      |
|...      |...                                      |...     |

* ts_status would look like:

|timestamp|position  |type    |index|
|:--------|:---------|:-------|:----|
|60034    |(846, 620)|Fixation|1    |
|60044    |(837, 641)|Fixation|1    |
|...      |...       |...     |...  |
|60464    |(836, 668)|Fixation|1    |
|60474    |(836, 668)|Fixation|1    |
|60484    |(836, 669)|Saccade |1    |
|60494    |(837, 669)|Saccade |1    |
|60504    |(838, 667)|Fixation|2    |
|60514    |(838, 667)|Fixation|2    |
|...      |...       |...     |...  |
|60574    |(825, 629)|Fixation|2    |
|60584    |(829, 615)|Fixation|2    |
|60594    |(833, 613)|Saccade |2    |
|60614    |(927, 601)|Saccade |2    |
|60624    |(933, 599)|Fixation|3    |
|60634    |(934, 603)|Fixation|3    |
|...      |...       |...     |...  |

!!! note
    The [TimeStampedGazeMovements](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeMovements) and [TimeStampedGazeStatus](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazeStatus) classes inherit from the [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class.

    Read the [Timestamped data](../timestamped_data/introduction.md) section to understand all the features it provides.

### Generator method

A [GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) can be called with a [TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) buffer as argument to generate each gaze movement as soon as it is identified.

``` python
# Assuming that timestamped gaze positions are provided through data reading

for ts, gaze_movement in gaze_movement_identifier(ts_gaze_positions):

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Do something with each fixation
        ...

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Do something with each saccade
        ...
```

diff --git a/docs/user_guide/gaze_features/gaze_position.md b/docs/user_guide/gaze_features/gaze_position.md
new file mode 100644
index 0000000..48495b4
--- /dev/null
+++ b/docs/user_guide/gaze_features/gaze_position.md
@@ -0,0 +1,98 @@

Gaze position
=============

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines a [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class to handle point coordinates with a precision value.

``` python
from argaze import GazeFeatures

# Define a basic gaze position
gaze_position = GazeFeatures.GazePosition((123, 456))

# Define a gaze position with a precision value
gaze_position = GazeFeatures.GazePosition((789, 765), precision=10)

# Access gaze position value and precision
print(f'position: {gaze_position.value}')
print(f'precision: {gaze_position.precision}')

```

## Validity

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) also defines an [UnvalidGazePosition](../../argaze.md/#argaze.GazeFeatures.UnvalidGazePosition) class that inherits from [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) to handle the case where no gaze position exists for some device-specific reason.

``` python
from argaze import GazeFeatures

# Define a basic unvalid gaze position
gaze_position = GazeFeatures.UnvalidGazePosition()

# Define a basic unvalid gaze position with a message value
gaze_position = GazeFeatures.UnvalidGazePosition("Something bad happened")

# Access gaze position validity
print(f'validity: {gaze_position.valid}')

```

## Distance

The [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class provides a **distance** method to calculate the distance to another gaze position instance.

![Distance](../../img/distance.png)

``` python
# Distance between A and B positions
d = gaze_position_A.distance(gaze_position_B)
```
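
This presumably amounts to the Euclidean distance between the two position values (an assumption based on the usual definition rather than on the ArGaze source):

``` python
import math

# Equivalent computation by hand, assuming 2D pixel coordinates (sketch only)
(ax, ay), (bx, by) = gaze_position_A.value, gaze_position_B.value
d = math.sqrt((bx - ax) ** 2 + (by - ay) ** 2)
```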

## Overlapping

The [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) class provides an **overlap** method to test whether a gaze position overlaps another one, considering their precisions.

![Gaze overlapping](../../img/overlapping.png)

``` python
# Check that A overlaps B
if gaze_position_A.overlap(gaze_position_B):

    # Do something if A overlaps B
    ...

# Check that A overlaps B and B overlaps A
if gaze_position_A.overlap(gaze_position_B, both=True):

    # Do something if A overlaps B AND B overlaps A
    ...
```

## Timestamped gaze positions

[TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) inherits from the [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer) class to handle gaze positions specifically.

### Import from dataframe

It is possible to load timestamped gaze positions from a [Pandas DataFrame](https://pandas.pydata.org/docs/getting_started/intro_tutorials/01_table_oriented.html#min-tut-01-tableoriented) object.

```python
import pandas

from argaze import GazeFeatures

# Load gaze positions from a CSV file into a Pandas dataframe
dataframe = pandas.read_csv('gaze_positions.csv', delimiter="\t", low_memory=False)

# Convert the Pandas dataframe into a TimeStampedGazePositions buffer, specifying which column label to use for each field
ts_gaze_positions = GazeFeatures.TimeStampedGazePositions.from_dataframe(dataframe, timestamp = 'Recording timestamp [ms]', x = 'Gaze point X [px]', y = 'Gaze point Y [px]')

```

### Iterator

Like [TimeStampedBuffer](../../argaze.md/#argaze.DataStructures.TimeStampedBuffer), the [TimeStampedGazePositions](../../argaze.md/#argaze.GazeFeatures.TimeStampedGazePositions) class provides an iterator feature:

```python
for timestamp, gaze_position in ts_gaze_positions.items():

    # Do something with each gaze position
    ...

```

diff --git a/docs/user_guide/gaze_features/introduction.md b/docs/user_guide/gaze_features/introduction.md
new file mode 100644
index 0000000..bf818ba
--- /dev/null
+++ b/docs/user_guide/gaze_features/introduction.md
@@ -0,0 +1,7 @@

Gaze analysis
=============

This section refers to:

* [GazeFeatures](../../argaze.md/#argaze.GazeFeatures)
* [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis)

diff --git a/docs/user_guide/gaze_features/scan_path.md b/docs/user_guide/gaze_features/scan_path.md
new file mode 100644
index 0000000..46af28b
--- /dev/null
+++ b/docs/user_guide/gaze_features/scan_path.md
@@ -0,0 +1,169 @@

Scan path
=========

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines classes to handle successive fixations/saccades and analyze their spatial or temporal properties.

## Fixation based scan path

### Definition

The [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) class is defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep), each of which is defined as a fixation and the consecutive saccade.

![Fixation based scan path](../../img/scan_path.png)

As fixations and saccades are identified, the scan path is built by calling respectively the [append_fixation](../../argaze.md/#argaze.GazeFeatures.ScanPath.append_fixation) and [append_saccade](../../argaze.md/#argaze.GazeFeatures.ScanPath.append_saccade) methods.

### Analysis

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines an abstract [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer) class so that various analysis algorithms can be added.

Some scan path analyses are available thanks to the [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule:

* [K-Coefficient](../../argaze.md/#argaze.GazeAnalysis.KCoefficient)
* [Nearest Neighbor Index](../../argaze.md/#argaze.GazeAnalysis.NearestNeighborIndex)
* [Exploit Explore Ratio](../../argaze.md/#argaze.GazeAnalysis.ExploitExploreRatio)

### Example

Here is a sample of code to illustrate how to build a scan path and analyze it:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import KCoefficient

# Create an empty scan path
scan_path = GazeFeatures.ScanPath()

# Create a K coefficient analyzer
kc_analyzer = KCoefficient.ScanPathAnalyzer()

# Assuming a gaze movement is identified at ts time
...:

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Append fixation to scan path: no step is created
        scan_path.append_fixation(ts, gaze_movement)

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Append saccade to scan path: a new step should be created
        new_step = scan_path.append_saccade(ts, gaze_movement)

        # Analyze the scan path when a new step is created
        if new_step:

            K = kc_analyzer.analyze(scan_path)

            # Do something with the K metric
            ...
```

## AOI based scan path

### Definition

The [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) class is defined as a list of [AOIScanSteps](../../argaze.md/#argaze.GazeFeatures.AOIScanStep), each of which is defined as a set of consecutive fixations looking at the same Area Of Interest (AOI) plus the consecutive saccade.

![AOI based scan path](../../img/aoi_scan_path.png)

As fixations and saccades are identified, the scan path is built by calling respectively the [append_fixation](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.append_fixation) and [append_saccade](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.append_saccade) methods.

### Analysis

[GazeFeatures](../../argaze.md/#argaze.GazeFeatures) defines an abstract [AOIScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.AOIScanPathAnalyzer) class so that various analysis algorithms can be added.

Some AOI scan path analyses are available thanks to the [GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule:

* [Transition matrix](../../argaze.md/#argaze.GazeAnalysis.TransitionMatrix)
* [Entropy](../../argaze.md/#argaze.GazeAnalysis.Entropy)
* [Lempel-Ziv complexity](../../argaze.md/#argaze.GazeAnalysis.LempelZivComplexity)
* [N-Gram](../../argaze.md/#argaze.GazeAnalysis.NGram)
* [K-modified coefficient](../../argaze.md/#argaze.GazeAnalysis.KCoefficient)

### Example

Here is a sample of code to illustrate how to build an AOI scan path and analyze it:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import LempelZivComplexity

# Assuming all AOI names are listed
...

# Create an empty AOI scan path
aoi_scan_path = GazeFeatures.AOIScanPath(aoi_names)

# Create a Lempel-Ziv complexity analyzer
lzc_analyzer = LempelZivComplexity.AOIScanPathAnalyzer()

# Assuming a gaze movement is identified at ts time
...:

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Assuming fixation is detected as inside an AOI
        ...
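
        # For illustration only: one hypothetical way to find the looked AOI, assuming a
        # dict named aoi_rectangles that maps each AOI name to an (x, y, width, height) tuple.
        # This is a sketch, not the ArGaze API.
        looked_aoi_name = None

        for name, (x, y, width, height) in aoi_rectangles.items():

            # Check if the fixation focus point falls inside this rectangle
            if x <= gaze_movement.focus[0] <= x + width and y <= gaze_movement.focus[1] <= y + height:

                looked_aoi_name = name
                break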

        # Append fixation to AOI scan path: a new step should be created
        new_step = aoi_scan_path.append_fixation(ts, gaze_movement, looked_aoi_name)

        # Analyze the AOI scan path when a new step is created
        if new_step:

            LZC = lzc_analyzer.analyze(aoi_scan_path)

            # Do something with the LZC metric
            ...

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Append saccade to scan path: no step is created
        aoi_scan_path.append_saccade(ts, gaze_movement)

```

### Advanced

The [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) class provides some advanced features to analyze it.

#### Letter sequence

When a new [AOIScanStep](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) internally assigns a unique letter index related to its AOI to ease pattern analysis.
Then, the [AOIScanPath letter_sequence](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.letter_sequence) property returns the concatenation of each [AOIScanStep](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) letter.
The [AOIScanPath get_letter_aoi](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.get_letter_aoi) method helps to get back the AOI related to a letter index.

``` python
# Assuming the following AOI scan path is built: Foo > Bar > Shu > Foo
aoi_scan_path = ...

# Letter sequence representation should be: 'ABCA'
print(aoi_scan_path.letter_sequence)

# Output should be: 'Bar'
print(aoi_scan_path.get_letter_aoi('B'))

```

#### Transition matrix

When a new [AOIScanStep](../../argaze.md/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath) internally counts the number of transitions from one AOI to another to ease Markov chain analysis.
Then, the [AOIScanPath transition_matrix](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a [Pandas DataFrame](https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.html) where indexes are transition departures and columns are transition destinations.

Here is an example of transition matrix for the following [AOIScanPath](../../argaze.md/#argaze.GazeFeatures.AOIScanPath): Foo > Bar > Shu > Foo > Bar

|   |Foo|Bar|Shu|
|:--|:--|:--|:--|
|Foo|0  |2  |0  |
|Bar|0  |0  |1  |
|Shu|1  |0  |0  |

#### Fixations count

The [AOIScanPath fixations_count](../../argaze.md/#argaze.GazeFeatures.AOIScanPath.fixations_count) method returns the total number of fixations in the whole scan path and a dictionary giving the fixations count per AOI.
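
As a minimal sketch (assuming the `aoi_scan_path` built above, and that the method returns the total count first and the per-AOI dictionary second):

``` python
# Get the total fixations count and the fixations count per AOI
scan_fixations_count, aoi_fixations_count = aoi_scan_path.fixations_count()

print(f'{scan_fixations_count} fixations in the whole scan path')

for aoi_name, count in aoi_fixations_count.items():

    print(f'{count} fixations on {aoi_name}')
```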