Configure and execute ArFrame
=============================

The [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) class defines a rectangular area where timestamped gaze positions are projected in and inside which they need to be analyzed.

![Empty frame area](../../img/ar_frame_empty.png)

## Load JSON configuration file

The [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) internal pipeline is entirely customizable from a JSON configuration file, thanks to the [ArFrame.from_json](../../argaze.md/#argaze.ArFeatures.ArFrame.from_json) class method.

Here is a simple JSON ArFrame configuration file example:

```json
{
	"name": "My FullHD screen",
	"size": [1920, 1080],
	"gaze_movement_identifier": {
		"DispersionThresholdIdentification": {
			"deviation_max_threshold": 50,
			"duration_min_threshold": 200
		}
	},
	"scan_path": {
		"duration_max": 30000
	},
	"scan_path_analyzers": {
		"Basic": {},
		"ExploitExploreRatio": {
			"short_fixation_duration_threshold": 0
		}
	}
}
```

Then, here is how to load the JSON file:

```python
from argaze import ArFeatures

# Load ArFrame
ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')

# Print ArFrame attributes
print("name:", ar_frame.name)
print("size:", ar_frame.size)
print("gaze movement identifier type:", type(ar_frame.gaze_movement_identifier))
print("scan path:", ar_frame.scan_path)

for module, analyzer in ar_frame.scan_path_analyzers.items():
	print('scan path analyzer module:', module)
```

Finally, here is what the program writes in console:

```txt
name: My FullHD screen
size: [1920, 1080]
gaze movement identifier type: <class 'argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier'>
scan path: []
scan path analyzer module: argaze.GazeAnalysis.Basic
scan path analyzer module: argaze.GazeAnalysis.ExploitExploreRatio
```

Now, let's understand the meaning of each JSON entry.

### Name

The name of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame). Mainly useful for visualisation purposes.

### Size

The size of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) defines the dimension of the rectangular area where gaze positions are projected. Be aware that gaze positions have to be in the same range of values.

!!! warning
	**ArGaze doesn't impose any spatial unit.** Gaze positions can be integers or floats, in pixels, millimeters or whatever you need. The only constraint is that all spatial values used in further configurations must use the same unit.
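
For instance, if an eye tracker delivers gaze positions in normalized coordinates (0 to 1), they would have to be scaled into the frame's unit before being passed to the pipeline. Here is a minimal sketch of such a conversion; the `to_frame_space` helper is hypothetical and not part of ArGaze:

```python
# Hypothetical helper: convert a normalized gaze coordinate (0..1 range, as
# many eye trackers output) into the frame pixel space, so that all spatial
# values share the same unit as the ArFrame size.
def to_frame_space(norm_x: float, norm_y: float, frame_size: tuple) -> tuple:
	width, height = frame_size
	return (norm_x * width, norm_y * height)

# A normalized position at the screen center maps to (960.0, 540.0) on a FullHD frame.
print(to_frame_space(0.5, 0.5, (1920, 1080)))
```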

### Gaze Movement Identifier

The first [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step is to identify fixations or saccades from consecutive timestamped gaze positions.

![Gaze Movement Identifier](../../img/ar_frame_gaze_movement_identifier.png)

The identification algorithm can be selected by instantiating a particular [GazeMovementIdentifier](../../argaze.md/#argaze.GazeFeatures.GazeMovementIdentifier) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).

In the example file, the chosen identification algorithm is [Dispersion Threshold Identification (I-DT)](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification), which exposes two specific attributes: *deviation_max_threshold* and *duration_min_threshold*.

!!! note
	In ArGaze, [Fixation](../../argaze.md/#argaze.GazeFeatures.Fixation) and [Saccade](../../argaze.md/#argaze.GazeFeatures.Saccade) are particular kinds of [GazeMovement](../../argaze.md/#argaze.GazeFeatures.GazeMovement).

!!! warning
	The JSON *gaze_movement_identifier* entry is mandatory: without it, the ScanPath and ScanPathAnalyzers steps are disabled.
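
To make the meaning of the two I-DT attributes concrete, here is an illustrative sketch of the dispersion-threshold principle only, not ArGaze's actual implementation: a run of timestamped gaze positions counts as a fixation when it lasts at least *duration_min_threshold* while its dispersion (here measured as the maximum distance to the centroid) stays within *deviation_max_threshold*:

```python
import math

# Sketch of the I-DT principle: a window of (timestamp, (x, y)) samples is a
# fixation when it is long enough AND spatially compact enough.
def looks_like_fixation(samples, deviation_max_threshold=50, duration_min_threshold=200):
	timestamps = [t for t, _ in samples]
	xs = [p[0] for _, p in samples]
	ys = [p[1] for _, p in samples]

	# Window duration in the same time unit as the thresholds (ms here)
	duration = timestamps[-1] - timestamps[0]

	# Dispersion as the maximum distance of any sample to the centroid
	cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
	deviation = max(math.hypot(x - cx, y - cy) for x, y in zip(xs, ys))

	return duration >= duration_min_threshold and deviation <= deviation_max_threshold

# Samples gathered over 240 ms within a few pixels: identified as a fixation.
samples = [(0, (960, 540)), (80, (962, 541)), (160, (959, 538)), (240, (961, 540))]
print(looks_like_fixation(samples))  # True
```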

### Scan Path

The second [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step aims to build a [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep) made by a fixation and a consecutive saccade.

![Scan Path](../../img/ar_frame_scan_path.png)

Once fixations and saccades are identified, they are automatically appended to the [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) when this step is enabled.

The [ScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.ScanPath.duration_max) attribute sets the maximum duration of the scan path: each time new scan steps are added, older scan steps are removed until this duration is respected.
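
The sliding-window behaviour can be sketched as follows; this is a simplified illustration of the *duration_max* principle, not ArGaze's implementation, and `prune_scan_path` is a hypothetical helper working on scan step durations only:

```python
# Hypothetical sketch of the duration_max behaviour: once the total duration of
# the recorded scan steps exceeds duration_max, the oldest steps are dropped first.
def prune_scan_path(step_durations, duration_max=30000):
	steps = list(step_durations)
	while steps and sum(steps) > duration_max:
		steps.pop(0)  # discard the oldest scan step
	return steps

# Four steps totalling 34000 ms: the oldest (12000 ms) is removed to fit 30000 ms.
print(prune_scan_path([12000, 8000, 9000, 5000]))  # [8000, 9000, 5000]
```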

!!! note
	The JSON *scan_path* entry is optional: if the *scan_path_analyzers* entry is not empty, the ScanPath step is automatically enabled.

### Scan Path Analyzers

Finally, the last [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step consists of passing the previously built [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) to each loaded [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer).

Each analysis algorithm can be selected by instantiating a particular [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer) from the [argaze.GazeAnalysis](../../argaze.md/#argaze.GazeAnalysis) submodule or [from another python package](advanced_topics/module_loading.md).

## Pipeline execution

Timestamped gaze positions have to be passed one by one to the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole instantiated pipeline.

```python
from argaze import GazeFeatures

# Assuming that timestamped gaze positions are available
...

	# Look ArFrame at a timestamped gaze position
	movement, scan_path_analysis, _, execution_times, exception = ar_frame.look(timestamp, gaze_position)

	# Check if a movement has been identified
	if movement.valid and movement.finished:

		# Do something with identified fixation
		if GazeFeatures.is_fixation(movement):
			...

		# Do something with identified saccade
		elif GazeFeatures.is_saccade(movement):
			...

	# Do something with scan path analysis
	for module, analysis in scan_path_analysis.items():
		for data, value in analysis.items():
			...

	# Do something with pipeline execution times
	...

	# Do something with pipeline exception
	if exception:
		...
```