Diffstat (limited to 'docs/user_guide/gaze_analysis')
-rw-r--r-- | docs/user_guide/gaze_analysis/gaze_movement.md | 141 |
-rw-r--r-- | docs/user_guide/gaze_analysis/gaze_position.md |  68 |
-rw-r--r-- | docs/user_guide/gaze_analysis/introduction.md  |   7 |
-rw-r--r-- | docs/user_guide/gaze_analysis/scan_path.md     | 168 |
4 files changed, 384 insertions, 0 deletions
diff --git a/docs/user_guide/gaze_analysis/gaze_movement.md b/docs/user_guide/gaze_analysis/gaze_movement.md
new file mode 100644
index 0000000..6c7ab76
--- /dev/null
+++ b/docs/user_guide/gaze_analysis/gaze_movement.md
@@ -0,0 +1,141 @@
Gaze movement
=============

## Definition

!!! note

    *"The act of classifying eye movements into distinct events is, on a general level, driven by a desire to isolate different intervals of the data stream strongly correlated with certain oculomotor or cognitive properties."*

    Quoted from the article ["One algorithm to rule them all? An evaluation and discussion of ten eye movement event-detection algorithms"](https://link.springer.com/article/10.3758/s13428-016-0738-9).

[GazeFeatures](/argaze/#argaze.GazeFeatures) defines an abstract [GazeMovement](/argaze/#argaze.GazeFeatures.GazeMovement) class, then abstract [Fixation](/argaze/#argaze.GazeFeatures.Fixation) and [Saccade](/argaze/#argaze.GazeFeatures.Saccade) classes which inherit from [GazeMovement](/argaze/#argaze.GazeFeatures.GazeMovement).

The **positions** attribute of a [GazeMovement](/argaze/#argaze.GazeFeatures.GazeMovement) contains all the [GazePositions](/argaze/#argaze.GazeFeatures.GazePosition) belonging to it.

![Fixation and Saccade](../../img/fixation_and_saccade.png)

## Identification

[GazeFeatures](/argaze/#argaze.GazeFeatures) defines an abstract [GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) class so that various identification algorithms can be added.

Some gaze movement identification algorithms are available in the [GazeAnalysis](/argaze/#argaze.GazeAnalysis) submodule:

* [Dispersion threshold identification (I-DT)](/argaze/#argaze.GazeAnalysis.DispersionThresholdIdentification)
* [Velocity threshold identification (I-VT)](/argaze/#argaze.GazeAnalysis.VelocityThresholdIdentification)

### Identify method

The [GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) **identify** method feeds the identification algorithm with successive gaze positions and outputs Fixation, Saccade or any other kind of GazeMovement instances.

Here is a code sample based on the I-DT algorithm to illustrate how to use it:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import DispersionThresholdIdentification

# Create a gaze movement identifier based on the dispersion algorithm with 50px maximum deviation and 200 ms maximum duration thresholds
gaze_movement_identifier = DispersionThresholdIdentification.GazeMovementIdentifier(50, 200)

# Assuming that timestamped gaze positions are provided through live stream or later data reading
...:

    gaze_movement = gaze_movement_identifier.identify(timestamp, gaze_position)

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Access to first gaze position of identified fixation
        start_ts, start_position = gaze_movement.positions.first

        # Access to fixation duration
        print(f'duration: {gaze_movement.duration}')

        # Iterate over all gaze positions of identified fixation
        for ts, position in gaze_movement.positions.items():

            # Do something with each fixation position
            ...

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Access to first gaze position of identified saccade
        start_ts, start_position = gaze_movement.positions.first

        # Access to saccade amplitude
        print(f'amplitude: {gaze_movement.amplitude}')

        # Iterate over all gaze positions of identified saccade
        for ts, position in gaze_movement.positions.items():

            # Do something with each saccade position
            ...

    # No gaze movement identified
    else:

        continue

```

### Browse method

The [GazeMovementIdentifier](/argaze/#argaze.GazeFeatures.GazeMovementIdentifier) **browse** method takes a TimeStampedGazePositions buffer and applies the identification algorithm to all the gaze positions inside it.

Identified gaze movements are returned through:

* a [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all fixations are stored by starting gaze position timestamp,
* a [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) instance where all saccades are stored by starting gaze position timestamp,
* a [TimeStampedGazeStatus](/argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) instance where all gaze positions are linked to a fixation or saccade index.

``` python
# Assuming that timestamped gaze positions are provided through data reading

ts_fixations, ts_saccades, ts_status = gaze_movement_identifier.browse(ts_gaze_positions)

```

* ts_fixations would look like:

|timestamp|positions                                                     |duration|dispersion|focus    |
|:--------|:-------------------------------------------------------------|:-------|:---------|:--------|
|60034    |{"60034":[846,620], "60044":[837,641], "60054":[835,649], ...} |450     |40        |(840,660)|
|60504    |{"60504":[838,667], "60514":[838,667], "60524":[837,669], ...} |100     |38        |(834,651)|
|...      |...                                                            |...     |...       |...      |

* ts_saccades would look like:

|timestamp|positions                                |duration|
|:--------|:----------------------------------------|:-------|
|60484    |{"60484":[836, 669], "60494":[837, 669]} |10      |
|60594    |{"60594":[833, 613], "60614":[927, 601]} |20      |
|...      |...                                      |...     |

* ts_status would look like:

|timestamp|position  |type    |index|
|:--------|:---------|:-------|:----|
|60034    |(846, 620)|Fixation|1    |
|60044    |(837, 641)|Fixation|1    |
|...      |...       |...     |...  |
|60464    |(836, 668)|Fixation|1    |
|60474    |(836, 668)|Fixation|1    |
|60484    |(836, 669)|Saccade |1    |
|60494    |(837, 669)|Saccade |1    |
|60504    |(838, 667)|Fixation|2    |
|60514    |(838, 667)|Fixation|2    |
|...      |...       |...     |...  |
|60574    |(825, 629)|Fixation|2    |
|60584    |(829, 615)|Fixation|2    |
|60594    |(833, 613)|Saccade |2    |
|60614    |(927, 601)|Saccade |2    |
|60624    |(933, 599)|Fixation|3    |
|60634    |(934, 603)|Fixation|3    |
|...      |...       |...     |...  |

!!! note
    [TimeStampedGazeMovements](/argaze/#argaze.GazeFeatures.TimeStampedGazeMovements) and [TimeStampedGazeStatus](/argaze/#argaze.GazeFeatures.TimeStampedGazeStatus) classes inherit from the [TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer) class.

    Read the [timestamped data] page to understand all the features it provides.
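Since the returned buffers inherit from [TimeStampedBuffer](/argaze/#argaze.DataStructures.TimeStampedBuffer), they can be iterated just like the **positions** buffer in the identify example above. Here is a minimal sketch, assuming the ts_fixations and ts_saccades buffers returned by the **browse** call above:

``` python
# Iterate identified fixations by their starting timestamp
for start_ts, fixation in ts_fixations.items():

    print(f'fixation at {start_ts}: duration {fixation.duration}')

# Iterate identified saccades the same way
for start_ts, saccade in ts_saccades.items():

    print(f'saccade at {start_ts}: duration {saccade.duration}')

```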
diff --git a/docs/user_guide/gaze_analysis/gaze_position.md b/docs/user_guide/gaze_analysis/gaze_position.md
new file mode 100644
index 0000000..67f15f8
--- /dev/null
+++ b/docs/user_guide/gaze_analysis/gaze_position.md
@@ -0,0 +1,68 @@
Gaze position
=============

[GazeFeatures](/argaze/#argaze.GazeFeatures) defines a [GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) class to handle point coordinates with a precision value.

``` python
from argaze import GazeFeatures

# Define a basic gaze position
gaze_position = GazeFeatures.GazePosition((123, 456))

# Define a gaze position with a precision value
gaze_position = GazeFeatures.GazePosition((789, 765), precision=10)

# Access to gaze position value and precision
print(f'position: {gaze_position.value}')
print(f'precision: {gaze_position.precision}')

```

## Validity

[GazeFeatures](/argaze/#argaze.GazeFeatures) also defines an [UnvalidGazePosition](/argaze/#argaze.GazeFeatures.UnvalidGazePosition) class that inherits from [GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) to handle cases where no gaze position exists for some device-specific reason.

``` python
from argaze import GazeFeatures

# Define an invalid gaze position
gaze_position = GazeFeatures.UnvalidGazePosition()

# Define an invalid gaze position with a message value
gaze_position = GazeFeatures.UnvalidGazePosition("Something bad happened")

# Access to gaze position validity
print(f'validity: {gaze_position.valid}')

```

## Distance

The [GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) class provides a **distance** method to calculate the distance to another gaze position instance.

![Distance](../../img/distance.png)

``` python
# Distance between A and B positions
d = gaze_position_A.distance(gaze_position_B)
```

## Overlapping

The [GazePosition](/argaze/#argaze.GazeFeatures.GazePosition) class provides an **overlap** method to test if a gaze position overlaps another one considering their precisions.

![Gaze overlapping](../../img/overlapping.png)

``` python
# Check that A overlaps B
if gaze_position_A.overlap(gaze_position_B):

    # Do something if A overlaps B
    ...

# Check that A overlaps B and B overlaps A
if gaze_position_A.overlap(gaze_position_B, both=True):

    # Do something if A overlaps B AND B overlaps A
    ...
```
diff --git a/docs/user_guide/gaze_analysis/introduction.md b/docs/user_guide/gaze_analysis/introduction.md
new file mode 100644
index 0000000..d1bb122
--- /dev/null
+++ b/docs/user_guide/gaze_analysis/introduction.md
@@ -0,0 +1,7 @@
Gaze analysis
=============

This section refers to:

* [GazeFeatures](/argaze/#argaze.GazeFeatures)
* [GazeAnalysis](/argaze/#argaze.GazeAnalysis)
diff --git a/docs/user_guide/gaze_analysis/scan_path.md b/docs/user_guide/gaze_analysis/scan_path.md
new file mode 100644
index 0000000..e00682f
--- /dev/null
+++ b/docs/user_guide/gaze_analysis/scan_path.md
@@ -0,0 +1,168 @@
Scan path
=========

[GazeFeatures](/argaze/#argaze.GazeFeatures) defines classes to handle successive fixations/saccades and analyse their spatial or temporal properties.

## Fixation based scan path

### Definition

The [ScanPath](/argaze/#argaze.GazeFeatures.ScanPath) class is defined as a list of [ScanSteps](/argaze/#argaze.GazeFeatures.ScanStep), each made of a fixation and the consecutive saccade.

![Fixation based scan path](../../img/scan_path.png)

As fixations and saccades are identified, the scan path is built by calling the **append_fixation** and **append_saccade** methods respectively.

### Analysis

[GazeFeatures](/argaze/#argaze.GazeFeatures) defines an abstract [ScanPathAnalyzer](/argaze/#argaze.GazeFeatures.ScanPathAnalyzer) class so that various analysis algorithms can be added.

Some scan path analyses are available in the [GazeAnalysis](/argaze/#argaze.GazeAnalysis) submodule:

* [K-Coefficient](/argaze/#argaze.GazeAnalysis.KCoefficient)
* [Nearest Neighbor Index](/argaze/#argaze.GazeAnalysis.NearestNeighborIndex)

### Example

Here is a code sample to illustrate how to build a scan path and analyze it:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import KCoefficient

# Create an empty scan path
scan_path = GazeFeatures.ScanPath()

# Create a K coefficient analyzer
kc_analyzer = KCoefficient.ScanPathAnalyzer()

# Assuming a gaze movement is identified at ts time
...:

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Append fixation to scan path : no step is created
        scan_path.append_fixation(ts, gaze_movement)

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Append saccade to scan path : a new step should be created
        new_step = scan_path.append_saccade(ts, gaze_movement)

        # Analyse scan path
        if new_step:

            K = kc_analyzer.analyze(scan_path)

            # Do something with K metric
            ...
```

## AOI based scan path

### Definition

The [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) class is defined as a list of [AOIScanSteps](/argaze/#argaze.GazeFeatures.AOIScanStep), each made of a set of consecutive fixations looking at the same Area Of Interest (AOI) and the consecutive saccade.

![AOI based scan path](../../img/aoi_scan_path.png)

As fixations and saccades are identified, the scan path is built by calling the **append_fixation** and **append_saccade** methods respectively.

### Analysis

[GazeFeatures](/argaze/#argaze.GazeFeatures) defines an abstract [AOIScanPathAnalyzer](/argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer) class so that various analysis algorithms can be added.

Some AOI scan path analyses are available in the [GazeAnalysis](/argaze/#argaze.GazeAnalysis) submodule:

* [Transition matrix](/argaze/#argaze.GazeAnalysis.TransitionMatrix)
* [Entropy](/argaze/#argaze.GazeAnalysis.Entropy)
* [Lempel-Ziv complexity](/argaze/#argaze.GazeAnalysis.LempelZivComplexity)
* [N-Gram](/argaze/#argaze.GazeAnalysis.NGram)
* [K-modified coefficient](/argaze/#argaze.GazeAnalysis.KCoefficient)

### Example

Here is a code sample to illustrate how to build an AOI scan path and analyze it:

``` python
from argaze import GazeFeatures
from argaze.GazeAnalysis import LempelZivComplexity

# Assuming all AOI names are listed
...

# Create an empty AOI scan path
aoi_scan_path = GazeFeatures.AOIScanPath(aoi_names)

# Create a Lempel-Ziv complexity analyzer
lzc_analyzer = LempelZivComplexity.AOIScanPathAnalyzer()

# Assuming a gaze movement is identified at ts time
...:

    # Fixation identified
    if GazeFeatures.is_fixation(gaze_movement):

        # Assuming fixation is detected as inside an AOI
        ...

        # Append fixation to AOI scan path : a new step should be created
        new_step = aoi_scan_path.append_fixation(ts, gaze_movement, looked_aoi_name)

        # Analyse AOI scan path
        if new_step:

            LZC = lzc_analyzer.analyze(aoi_scan_path)

            # Do something with LZC metric
            ...

    # Saccade identified
    elif GazeFeatures.is_saccade(gaze_movement):

        # Append saccade to scan path : no step is created
        aoi_scan_path.append_saccade(ts, gaze_movement)

```

### Advanced

The [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) class provides some advanced features for analysis.

#### String representation

When a new [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) internally assigns a unique letter to its AOI to ease pattern analysis.
Then, the [AOIScanPath str](/argaze/#argaze.GazeFeatures.AOIScanPath.__str__) representation returns the concatenation of each [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) letter.
The [AOIScanPath get_letter_aoi](/argaze/#argaze.GazeFeatures.AOIScanPath.get_letter_aoi) method returns the AOI related to a given letter.

``` python
# Assuming the following AOI scan path is built: Foo > Bar > Shu > Foo
aoi_scan_path = ...

# String representation should be: 'ABCA'
print(str(aoi_scan_path))

# Output should be: 'Bar'
print(aoi_scan_path.get_letter_aoi('B'))

```

#### Transition matrix

When a new [AOIScanStep](/argaze/#argaze.GazeFeatures.AOIScanStep) is created, the [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath) internally counts the number of transitions from one AOI to another to ease Markov chain analysis.
Then, the [AOIScanPath transition_matrix](/argaze/#argaze.GazeFeatures.AOIScanPath.transition_matrix) property returns a *Pandas DataFrame* where indexes are transition departures and columns are transition destinations.

Here is an example of a transition matrix for the following [AOIScanPath](/argaze/#argaze.GazeFeatures.AOIScanPath): Foo > Bar > Shu > Foo > Bar

|   |Foo|Bar|Shu|
|:--|:--|:--|:--|
|Foo|0  |2  |0  |
|Bar|0  |0  |1  |
|Shu|1  |0  |0  |

#### Fixations count

The [AOIScanPath fixations_count](/argaze/#argaze.GazeFeatures.AOIScanPath.fixations_count) method returns the total number of fixations in the whole scan path and a dictionary giving the fixations count per AOI.
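As a recap, here is a minimal sketch combining the transition matrix and fixations count features; the tuple-style return of **fixations_count** is assumed from the description above, and the AOI names come from the earlier Foo > Bar > Shu example:

``` python
# Assuming an AOI scan path built as in the previous examples: Foo > Bar > Shu > Foo > Bar
aoi_scan_path = ...

# Transition matrix as a Pandas DataFrame (departures as rows, destinations as columns)
print(aoi_scan_path.transition_matrix)

# Total number of fixations and a dictionary of fixations count per AOI
# (return values assumed from the description above)
total_count, count_per_aoi = aoi_scan_path.fixations_count()

print(f'total fixations: {total_count}')
print(f'fixations on Foo: {count_per_aoi["Foo"]}')

```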