Add and execute ArLayer
=============================

The [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOIs, and inside which those matches are analyzed.

![Empty layer area](../../img/ar_layer_empty.png)

## Add ArLayer to ArFrame JSON configuration file

An [ArFrame](../../../argaze/#argaze.ArFeatures.ArFrame) instance can contain multiple [ArLayers](../../../argaze/#argaze.ArFeatures.ArLayer).

Here is an ArFrame JSON configuration file example where one layer is added:

```json
{
	"name": "My FullHD screen",
	"size": [1920, 1080],
	...
	"layers": {
		"MyLayer": {
			"aoi_color": [0, 0, 255],
			"aoi_scene": {
				"upper_left_corner": [[0, 0], [960, 0], [960, 540], [0, 540]],
				"upper_right_corner": [[960, 0], [1920, 0], [1920, 540], [960, 540]],
				"lower_left_corner": [[0, 540], [960, 540], [960, 1080], [0, 1080]],
				"lower_right_corner": [[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
			},
			"aoi_matcher": {
				"DeviationCircleCoverage": {
					"coverage_threshold": 0.5
				}
			},
			"aoi_scan_path": {
				"duration_max": 30000
			},
			"aoi_scan_path_analyzers": {
				"Basic": {},
				"TransitionMatrix": {},
				"NGram": {
					"n_min": 3,
					"n_max": 5
				}
			}
		}
	}
}
```

Then, once the JSON file is loaded:

```python
from argaze import ArFeatures

# Assuming the ArFrame is loaded
...

# Print ArLayer attributes
for name, ar_layer in ar_frame.layers.items():

	print("name:", ar_layer.name)
	print("AOI color:", ar_layer.aoi_color)
	print("AOI scene:", ar_layer.aoi_scene)
	print("AOI matcher type:", type(ar_layer.aoi_matcher))
	print("AOI scan path:", ar_layer.aoi_scan_path)

	for module, analyzer in ar_layer.aoi_scan_path_analyzers.items():
		print('AOI scan path analyzer module:', module)
```

Finally, here is what the program writes to the console:

```txt
...

name: MyLayer
AOI color: [0, 0, 255]
AOI scene: 
	upper_left_corner:
[[0, 0], [960, 0], [960, 540], [0, 540]]
	upper_right_corner:
[[960, 0], [1920, 0], [1920, 540], [960, 540]]
	lower_left_corner:
[[0, 540], [960, 540], [960, 1080], [0, 1080]]
	lower_right_corner:
[[960, 540], [1920, 540], [1920, 1080], [960, 1080]]
AOI matcher type: <class 'argaze.GazeAnalysis.DeviationCircleCoverage.AOIMatcher'>
AOI scan path: []
AOI scan path analyzer module: argaze.GazeAnalysis.Basic
AOI scan path analyzer module: argaze.GazeAnalysis.TransitionMatrix
AOI scan path analyzer module: argaze.GazeAnalysis.NGram
```

Now, let's understand the meaning of each JSON entry.

### Name

The name of the [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer). Mainly useful for visualisation purposes.

### AOI Color

The color of the [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer)'s AOI. Mainly useful for visualisation purposes.

### AOI Scene

The [AOIScene](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AOIScene) defines a set of 2D [AreaOfInterest](../../../argaze/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) registered by name.

![AOI Scene](../../img/ar_layer_aoi_scene.png)
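To picture what an AOI is geometrically, here is a minimal point-in-polygon sketch in plain Python, using the same four quadrant rectangles as the configuration above. This is a conceptual illustration only; the `point_in_aoi` helper is hypothetical and is not part of the ArGaze API.

```python
# Conceptual sketch: test a gaze position against 2D AOI polygons
# using ray casting. Not the ArGaze AOIScene implementation.

def point_in_aoi(point, polygon):
    """Return True if point lies inside the polygon (list of [x, y] vertices)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does the edge cross the horizontal ray cast from the point?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Same AOI scene as in the JSON configuration above
aoi_scene = {
    "upper_left_corner": [[0, 0], [960, 0], [960, 540], [0, 540]],
    "upper_right_corner": [[960, 0], [1920, 0], [1920, 540], [960, 540]],
    "lower_left_corner": [[0, 540], [960, 540], [960, 1080], [0, 1080]],
    "lower_right_corner": [[960, 540], [1920, 540], [1920, 1080], [960, 1080]],
}

gaze_position = (1200, 300)
matched = [name for name, aoi in aoi_scene.items()
           if point_in_aoi(gaze_position, aoi)]
# A position at (1200, 300) falls into the upper right quadrant
```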

### AOI Matcher

The first [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step aims to match the identified gaze movement with an AOI of the scene.

![AOI Matcher](../../img/ar_layer_aoi_matcher.png)

The matching algorithm can be selected by instantiating a particular [AOIMatcher](../../../argaze/#argaze.GazeFeatures.AOIMatcher) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [from another Python package](../advanced_topics/module_loading).

In the example file, the chosen matching algorithm is [Deviation Circle Coverage](../../../argaze/#argaze.GazeAnalysis.DeviationCircleCoverage), which has one specific *coverage_threshold* attribute.

!!! warning
	The JSON *aoi_matcher* entry is mandatory; otherwise, the AOIScanPath and AOIScanPathAnalyzers steps are disabled.
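To build an intuition for what a coverage threshold means, the sketch below estimates which fraction of a fixation's deviation circle falls inside a rectangular AOI by Monte Carlo sampling. This is a hedged illustration of the idea only, not the ArGaze DeviationCircleCoverage implementation; the `circle_coverage_by_rect` helper is hypothetical.

```python
import math
import random

def circle_coverage_by_rect(center, radius, rect_min, rect_max,
                            samples=20000, seed=0):
    """Estimate the fraction of a circle's area covered by an axis-aligned
    rectangle, by uniform sampling inside the circle (Monte Carlo)."""
    rng = random.Random(seed)
    cx, cy = center
    inside = 0
    for _ in range(samples):
        # Draw a uniform point inside the circle (sqrt for uniform area density)
        r = radius * math.sqrt(rng.random())
        a = rng.random() * 2 * math.pi
        x, y = cx + r * math.cos(a), cy + r * math.sin(a)
        if rect_min[0] <= x <= rect_max[0] and rect_min[1] <= y <= rect_max[1]:
            inside += 1
    return inside / samples

# A fixation deviation circle mostly inside the upper-left quadrant AOI:
# center 20 px left of the x = 960 boundary, radius 50 px
coverage = circle_coverage_by_rect(center=(940, 300), radius=50,
                                   rect_min=(0, 0), rect_max=(960, 540))

# With a threshold of 0.5, as in the configuration above, this movement
# would be considered matched to the AOI
matched = coverage >= 0.5
```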

### AOI Scan Path

The second [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step aims to build an [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath), defined as a list of [AOIScanSteps](../../../argaze/#argaze.GazeFeatures.AOIScanStep), each made of successive fixations/saccades onto the same AOI.

![AOI Scan Path](../../img/ar_layer_aoi_scan_path.png)

Once an identified gaze movement is matched to an AOI, it is automatically appended to the AOIScanPath if required.

The [AOIScanPath.duration_max](../../../argaze/#argaze.GazeFeatures.AOIScanPath.duration_max) attribute is the duration beyond which older AOI scan steps are removed each time new AOI scan steps are added.
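This pruning can be pictured as a sliding time window. The simplified sketch below (plain Python, not the ArGaze implementation; `append_step` is a hypothetical helper) drops steps older than `duration_max` relative to the newest step:

```python
DURATION_MAX = 30000  # milliseconds, as in the configuration above

def append_step(scan_path, timestamp, aoi_name, duration_max=DURATION_MAX):
    """Append a (timestamp, aoi_name) step, then drop steps older than
    duration_max relative to the newest timestamp (sliding window)."""
    scan_path.append((timestamp, aoi_name))
    newest = scan_path[-1][0]
    scan_path[:] = [(t, name) for t, name in scan_path
                    if newest - t <= duration_max]

scan_path = []
for t, aoi in [(0, "upper_left_corner"), (15000, "upper_right_corner"),
               (31000, "lower_left_corner"), (45000, "lower_right_corner")]:
    append_step(scan_path, t, aoi)

# The step at t=0 is dropped: it is more than 30000 ms older
# than the newest step at t=45000
```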

!!! note
	The JSON *aoi_scan_path* entry is not mandatory. If the *aoi_scan_path_analyzers* entry is not empty, the AOIScanPath step is automatically enabled.

### AOI Scan Path Analyzers

Finally, the last [ArLayer](../../../argaze/#argaze.ArFeatures.ArLayer) pipeline step consists of passing the previously built [AOIScanPath](../../../argaze/#argaze.GazeFeatures.AOIScanPath) to each loaded [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer).

Each analysis algorithm can be selected by instantiating a particular [AOIScanPathAnalyzer](../../../argaze/#argaze.GazeFeatures.AOIScanPathAnalyzer) from the [argaze.GazeAnalysis](../../../argaze/#argaze.GazeAnalysis) submodule or [from another python package](../advanced_topics/module_loading).
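To give an idea of what such analyzers compute, here is a sketch of the underlying metrics, not the ArGaze classes: a transition count matrix and n-gram extraction over a sequence of AOI names, mirroring the *TransitionMatrix* and *NGram* entries (with their `n_min`/`n_max` parameters) from the configuration above.

```python
from collections import Counter

def transition_counts(aoi_sequence):
    """Count transitions between successive AOIs, as a transition
    matrix analyzer conceptually does."""
    return Counter(zip(aoi_sequence, aoi_sequence[1:]))

def ngrams(aoi_sequence, n_min=3, n_max=5):
    """Collect all n-grams of AOI names for n in [n_min, n_max]."""
    grams = []
    for n in range(n_min, n_max + 1):
        grams += [tuple(aoi_sequence[i:i + n])
                  for i in range(len(aoi_sequence) - n + 1)]
    return grams

# A toy AOI scan path over the quadrants defined above
sequence = ["upper_left_corner", "upper_right_corner",
            "upper_left_corner", "upper_right_corner", "lower_left_corner"]

transitions = transition_counts(sequence)
trigrams = ngrams(sequence, n_min=3, n_max=3)
```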

## Pipeline execution

Timestamped gaze movements identified by the parent ArFrame are passed one by one to each layer's [ArLayer.look](../../../argaze/#argaze.ArFeatures.ArLayer.look) method to execute its instantiated pipeline.

```python
# Assuming that timestamped gaze positions are available
...

	# Look ArFrame at a timestamped gaze position
	movement, _, layers_analysis, _, _ = ar_frame.look(timestamp, gaze_position)

	# Check if a movement has been identified
	if movement.valid and movement.finished:

		# Do something with each layer AOI scan path analysis
		for layer_name, layer_aoi_scan_path_analysis in layers_analysis.items():
			for module, analysis in layer_aoi_scan_path_analysis.items():
				for data, value in analysis.items():
					...
```