Load and execute pipeline
=========================

The [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) class defines a rectangular area into which timestamped gaze positions are projected and inside which they are analyzed.

![Frame](../../img/ar_frame.png)

## Load JSON configuration file

An [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline can be loaded from a JSON configuration file with the [argaze.load](../../argaze.md/#argaze.load) package method.

Here is a simple JSON [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) configuration file example:

```json
{
	"argaze.ArFeatures.ArFrame": {
		"name": "My FullHD screen",
		"size": [1920, 1080],
		"gaze_movement_identifier": {
			"argaze.GazeAnalysis.DispersionThresholdIdentification.GazeMovementIdentifier": {
				"deviation_max_threshold": 50,
				"duration_min_threshold": 200
			}
		},
		"scan_path": {
			"duration_max": 30000
		},
		"scan_path_analyzers": {
			"argaze.GazeAnalysis.Basic.ScanPathAnalyzer": {},
			"argaze.GazeAnalysis.ExploreExploitRatio.ScanPathAnalyzer": {
				"short_fixation_duration_threshold": 0
			}
		}
	}
}
```

Then, here is how to load the JSON file:

```python
import argaze

# Load ArFrame
with argaze.load('./configuration.json') as ar_frame:

	# Do something with ArFrame
	...
```
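
Once loaded, the configured attributes can be checked directly on the returned object. Here is a minimal sketch, assuming the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) *name* and *size* attributes are readable:

```python
import argaze

# Load ArFrame
with argaze.load('./configuration.json') as ar_frame:

	# Check the loaded configuration (assuming name and size are readable attributes)
	print(ar_frame.name)
	print(ar_frame.size)
```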

Now, let's understand the meaning of each JSON entry.

### argaze.ArFeatures.ArFrame

The class name of the object being loaded.

### *name*

The name of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame). It is mainly useful for visualization purposes.

### *size*

The size of the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) defines the dimensions of the rectangular area where gaze positions are projected. Be aware that gaze positions have to lie in the same range of values to be projected.

!!! warning "Free spatial unit"
	Gaze positions can be expressed as integers or floats, in pixels, millimeters or whatever unit you need. The only requirement is that all spatial values used in further configuration share the same unit.
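
For instance, with the 1920x1080 frame configured above, incoming gaze positions are expected to be expressed in pixels within that range. Here is a minimal sketch, assuming the [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) constructor accepts an (x, y) tuple and a *timestamp* keyword (to be checked against the GazePosition documentation):

```python
from argaze import GazeFeatures

# A gaze position expressed in the same unit as the frame size (pixels here);
# the constructor signature below is an assumption
gaze_position = GazeFeatures.GazePosition((960, 540), timestamp=0.1)
```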

### *gaze_movement_identifier*

The first [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step is to identify fixations or saccades from consecutive timestamped gaze positions.

![Gaze movement identifier](../../img/gaze_movement_identifier.png)

The identification algorithm can be selected by instantiating a particular [GazeMovementIdentifier from the GazeAnalysis submodule](pipeline_modules/gaze_movement_identifiers.md) or [from another Python package](advanced_topics/module_loading.md).

In the example file, the chosen identification algorithm is the [Dispersion Threshold Identification (I-DT)](../../argaze.md/#argaze.GazeAnalysis.DispersionThresholdIdentification), which has two specific attributes: *deviation_max_threshold* and *duration_min_threshold*.

!!! note
	In ArGaze, [Fixation](../../argaze.md/#argaze.GazeFeatures.Fixation) and [Saccade](../../argaze.md/#argaze.GazeFeatures.Saccade) are considered as particular [GazeMovements](../../argaze.md/#argaze.GazeFeatures.GazeMovement).

!!! warning "Mandatory"
	The JSON *gaze_movement_identifier* entry is mandatory: without it, the ScanPath and ScanPathAnalyzers steps are disabled.
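
To illustrate swapping the identification algorithm, here is a hedged sketch of the same entry using the [Velocity Threshold Identification (I-VT)](../../argaze.md/#argaze.GazeAnalysis.VelocityThresholdIdentification) module instead; the attribute names shown are assumptions to be checked against the module documentation:

```json
"gaze_movement_identifier": {
	"argaze.GazeAnalysis.VelocityThresholdIdentification.GazeMovementIdentifier": {
		"velocity_max_threshold": 1,
		"duration_min_threshold": 200
	}
}
```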

### *scan_path*

The second [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step aims to build a [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath), defined as a list of [ScanSteps](../../argaze.md/#argaze.GazeFeatures.ScanStep) each made of a fixation and the consecutive saccade.

![Scan path](../../img/scan_path.png)

Once fixations and saccades are identified, they are automatically appended to the ScanPath if required.

The [ScanPath.duration_max](../../argaze.md/#argaze.GazeFeatures.ScanPath.duration_max) attribute sets the maximum duration kept in the scan path: each time new scan steps are added, older scan steps beyond this duration are removed.

!!! note "Optional"
	The JSON *scan_path* entry is not mandatory. If the *scan_path_analyzers* entry is not empty, the ScanPath step is automatically enabled.

### *scan_path_analyzers*

Finally, the last [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step consists of passing the previously built [ScanPath](../../argaze.md/#argaze.GazeFeatures.ScanPath) to each loaded [ScanPathAnalyzer](../../argaze.md/#argaze.GazeFeatures.ScanPathAnalyzer).

Each analysis algorithm can be selected by instantiating a particular [ScanPathAnalyzer from the GazeAnalysis submodule](pipeline_modules/scan_path_analyzers.md) or [from another Python package](advanced_topics/module_loading.md).

In the example file, the chosen analysis algorithms are the [Basic](../../argaze.md/#argaze.GazeAnalysis.Basic) module and the [ExploreExploitRatio](../../argaze.md/#argaze.GazeAnalysis.ExploreExploitRatio) module, the latter having a specific *short_fixation_duration_threshold* attribute.
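
Several analyzers can be chained in the same entry and each one receives the previously built scan path. For example, here is a hedged sketch extending the entry above with the [KCoefficient](../../argaze.md/#argaze.GazeAnalysis.KCoefficient) module, assuming it needs no specific attribute:

```json
"scan_path_analyzers": {
	"argaze.GazeAnalysis.Basic.ScanPathAnalyzer": {},
	"argaze.GazeAnalysis.ExploreExploitRatio.ScanPathAnalyzer": {
		"short_fixation_duration_threshold": 0
	},
	"argaze.GazeAnalysis.KCoefficient.ScanPathAnalyzer": {}
}
```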

## Pipeline execution

Timestamped [GazePositions](../../argaze.md/#argaze.GazeFeatures.GazePosition) have to be passed one by one to the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method to execute the whole instantiated pipeline. 

!!! warning "Mandatory"

	The [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method must be called from a *try* block to catch pipeline exceptions.

```python
# Assuming that timestamped gaze positions are available, e.g. inside a processing loop
...

	try:

		# Look ArFrame at a timestamped gaze position
		ar_frame.look(timestamped_gaze_position)

	# Do something with pipeline exception
	except Exception as e:
		
		...
```
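
Putting loading and execution together, here is a minimal end-to-end sketch; the gaze sample list and the [GazePosition](../../argaze.md/#argaze.GazeFeatures.GazePosition) constructor arguments are assumptions to be adapted to the actual gaze data source:

```python
import argaze
from argaze import GazeFeatures

# Hypothetical gaze samples as (timestamp, x, y) tuples; replace with a real data source
gaze_samples = [(0.00, 960, 540), (0.01, 962, 538), (0.02, 961, 541)]

# Load ArFrame
with argaze.load('./configuration.json') as ar_frame:

	for timestamp, x, y in gaze_samples:

		try:

			# Look ArFrame at a timestamped gaze position
			ar_frame.look(GazeFeatures.GazePosition((x, y), timestamp=timestamp))

		# Do something with pipeline exception
		except Exception as e:

			print(e)
```
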
!!! note ""

	At this point, the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method only processes gaze movement identification and scan path analysis, without any AOI analysis nor any recording or visualization support.

	Read the next chapters to learn how to [describe AOI](aoi_2d_description.md), [add AOI analysis](aoi_analysis.md), [record gaze analysis](recording.md) and [visualize pipeline steps](visualization.md).