Visualize pipeline steps
========================

Visualisation is not a pipeline step, but the output of each [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) pipeline step can be drawn in real time or afterward, depending on the application purpose.

![ArFrame visualisation](../../img/visualisation.png)

## Add image parameters to ArFrame JSON configuration file

The [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method parameters can be configured thanks to a dedicated JSON entry.

Here is an extract from a JSON ArFrame configuration file where image parameters are added:

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    ...
    "image_parameters": {
        "draw_gaze_positions": {
            "color": [0, 255, 255],
            "size": 2
        },
        "draw_fixations": {
            "deviation_circle_color": [255, 255, 255],
            "duration_border_color": [127, 0, 127],
            "duration_factor": 1e-2,
            "draw_positions": {
                "position_color": [0, 255, 255],
                "line_color": [0, 0, 0]
            }
        }, 
        "draw_saccades": {
            "line_color": [255, 0, 255]
        },
        "draw_scan_path": {
            "draw_fixations": {
                "deviation_circle_color": [255, 0, 255],
                "duration_border_color": [127, 0, 127],
                "duration_factor": 1e-2
            }, 
            "draw_saccades": {
                "line_color": [255, 0, 255]
            }
        },
        "draw_layers": {
            "MyLayer": {
                "draw_aoi_scene": {
                    "draw_aoi": {
                        "color": [255, 255, 255],
                        "border_size": 1
                    }
                },
                "draw_aoi_matching": {
                    "draw_matched_fixation": {
                        "deviation_circle_color": [255, 255, 255],
                        "draw_positions": {
                            "position_color": [0, 255, 0],
                            "line_color": [0, 0, 0]
                        }
                    },
                    "draw_matched_region": {
                        "color": [0, 255, 0],
                        "border_size": 4
                    }, 
                    "draw_looked_aoi": {
                        "color": [0, 255, 0],
                        "border_size": 2
                    },
                    "looked_aoi_name_color": [255, 255, 255],
                    "looked_aoi_name_offset": [0, -10]
                }
            }
        }
    }
}
```

!!! warning
    Most *image_parameters* entries only take effect if the related ArFrame/ArLayer pipeline steps are enabled.  
    For example, the JSON *draw_scan_path* entry requires the GazeMovementIdentifier and ScanPath steps to be enabled, as illustrated below.
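
Here is a sketch of what such a configuration could declare in addition to *image_parameters* (the identifier choice and the threshold values are illustrative only and must match your own pipeline configuration):

```json
{
    "name": "My FullHD screen",
    "size": [1920, 1080],
    "gaze_movement_identifier": {
        "DispersionThresholdIdentification": {
            "deviation_max_threshold": 50,
            "duration_min_threshold": 200
        }
    },
    "scan_path": {
        "duration_max": 30000
    },
    "image_parameters": {
        ...
    }
}
```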

Then, the [ArFrame.image](../../argaze.md/#argaze.ArFeatures.ArFrame.image) method can be called in various situations.
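
All the code samples below assume that an [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) has been loaded beforehand, along these lines (a minimal sketch, assuming the *ArFrame.from_json* loading method and a hypothetical *./configuration.json* path; check the loading API of your ArGaze version):

```python
from argaze import ArFeatures

# Load ArFrame and its image_parameters from a JSON configuration file
# (hypothetical path; the loading API may differ between ArGaze versions)
ar_frame = ArFeatures.ArFrame.from_json('./configuration.json')
```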

## Export to PNG file

Once timestamped gaze positions have been processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, it is possible to write the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image into a file thanks to the [OpenCV package](https://pypi.org/project/opencv-python/).

```python
import cv2

# Assuming that timestamped gaze positions have been processed by ArFrame.look method
...

# Export ArFrame image
cv2.imwrite('./ar_frame.png', ar_frame.image())
```
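
Note that *cv2.imwrite* selects the encoder from the file extension, so the same call can also export to *.jpg* or *.bmp*, and it returns *False* when the image cannot be written.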

## Export to MP4 file

While timestamped gaze positions are processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, it is possible to write the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image into a video file thanks to the [OpenCV package](https://pypi.org/project/opencv-python/).

```python
import cv2

# Assuming ArFrame is loaded
...

# Create a video file to save ArFrame
video = cv2.VideoWriter('ar_frame.mp4', cv2.VideoWriter_fourcc(*'mp4v'), 10, ar_frame.size)

# Assuming a loop processes timestamped gaze positions through ArFrame.look method
...

    # Write ArFrame image into video file at each loop iteration
    video.write(ar_frame.image())

# Close video file
video.release()
```
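
The *mp4v* FourCC requires an OpenCV build with video codec support; if the resulting file stays empty, a common fallback is the *MJPG* codec with an *.avi* extension. Also note that every frame passed to *video.write* must match the size declared when the *cv2.VideoWriter* was created.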

## Live window display

While timestamped gaze positions are processed by the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method, it is possible to display the [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) image thanks to the [OpenCV package](https://pypi.org/project/opencv-python/).

```python
import cv2

def main():

    # Assuming ArFrame is loaded
    ...

    # Create a window to display ArFrame
    cv2.namedWindow(ar_frame.name, cv2.WINDOW_AUTOSIZE)

    # Assuming a loop processes timestamped gaze positions through ArFrame.look method
    ...

        # Update ArFrame image display at each loop iteration
        cv2.imshow(ar_frame.name, ar_frame.image())

        # Wait 10 ms so the window content can refresh
        cv2.waitKey(10)

    # Close window once processing is done
    cv2.destroyAllWindows()

if __name__ == '__main__':

    main()
```
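
Note that the *cv2.waitKey* call is not only a delay: it also services the window event loop, so the *cv2.imshow* display would not refresh without it.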