Define a context class
======================

The [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) class defines a generic base class interface to handle incoming eye tracker data before passing it to a processing pipeline, following the [Python context manager protocol](https://docs.python.org/3/reference/datamodel.html#context-managers).

The [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) interface provides control features to stop or pause working threads, and performance assessment features to measure how often processing steps are called and how much time they take.
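These control features boil down to thread-safe flags that worker loops poll. Here is a standalone sketch of that pattern (illustrative only, not the argaze implementation — class and attribute names are invented for the example):

```python
import threading
import time

class ControlSketch:
	"""Illustrative stand-in for ArContext control features (not the argaze implementation)."""

	def __init__(self):
		self._running = threading.Event()
		self._paused = threading.Event()
		self.processed = 0

	def is_running(self) -> bool:
		return self._running.is_set()

	def is_paused(self) -> bool:
		return self._paused.is_set()

	def pause(self):
		self._paused.set()

	def resume(self):
		self._paused.clear()

	def start(self):
		self._running.set()
		self._thread = threading.Thread(target = self._loop)
		self._thread.start()

	def stop(self):
		# Ask the worker loop to exit, then wait for it to finish
		self._running.clear()
		self._thread.join()

	def _loop(self):
		# Worker loop polls the flags, as the capture and playback loops below do
		while self.is_running():
			if not self.is_paused():
				self.processed += 1  # stand-in for processing incoming data
			time.sleep(0.001)

sketch = ControlSketch()
sketch.start()
time.sleep(0.05)
sketch.stop()
```

Polling flags this way keeps the worker loop responsive to pause and stop requests without locking around every data sample.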

Besides, there is also a [DataCaptureContext](../../../argaze.md/#argaze.ArFeatures.DataCaptureContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and defines an abstract *calibrate* method to implement a device-specific calibration process.

In the same way, there is a [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext) class that inherits from [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext) and defines abstract *previous* and *next* playback methods to move through a recording's frames, as well as *duration* and *progression* properties that report the recording length and the playback advancement.

Finally, a specific eye tracking context can be defined in a Python file by writing a class that inherits from either [ArContext](../../../argaze.md/#argaze.ArFeatures.ArContext), [DataCaptureContext](../../../argaze.md/#argaze.ArFeatures.DataCaptureContext) or [DataPlaybackContext](../../../argaze.md/#argaze.ArFeatures.DataPlaybackContext).
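As a reminder of the context-manager mechanics these classes rely on, here is a minimal plain-Python sketch (the real ArContext classes add threading, pipeline wiring and decorators on top):

```python
class MinimalContext:

	def __enter__(self):
		# Acquire resources and start worker threads here
		self.started = True
		return self

	def __exit__(self, exception_type, exception_value, exception_traceback):
		# Release resources and stop worker threads here, even if an exception occurred
		self.started = False
		return False  # do not suppress exceptions

with MinimalContext() as context:
	print(context.started)
```

The `with` statement guarantees that `__exit__` runs whether the body completes normally or raises, which is why the contexts below stop their threads there.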

## Write data capture context

Here is a data capture context example that processes gaze positions and camera images in two separate threads:

```python
import threading

from argaze import ArFeatures, DataFeatures

class DataCaptureExample(ArFeatures.DataCaptureContext):

	@DataFeatures.PipelineStepInit
	def __init__(self, **kwargs):

		# Init DataCaptureContext class
		super().__init__()

		# Init private attribute
		self.__parameter = ...

	@property
	def parameter(self):
		"""Any context specific parameter."""
		return self.__parameter

	@parameter.setter
	def parameter(self, parameter):
		self.__parameter = parameter
	
	@DataFeatures.PipelineStepEnter
	def __enter__(self):
		"""Start context."""

		# Start context according to any specific parameter
		# e.g. use self.parameter here
		...

		# Start a gaze position capture thread
		self.__gaze_thread = threading.Thread(target = self.__gaze_position_capture)
		self.__gaze_thread.start()

		# Start a camera image capture thread if applicable
		self.__camera_thread = threading.Thread(target = self.__camera_image_capture)
		self.__camera_thread.start()

		return self

	def __gaze_position_capture(self):
		"""Capture gaze position."""

		# Capture loop
		while self.is_running():

			# Skip processing while paused
			if not self.is_paused():

				# Assuming that timestamp, x and y values are available
				...

				# Process timestamped gaze position
				self._process_gaze_position(timestamp = timestamp, x = x, y = y)

			# Optionally wait some time
			...

	def __camera_image_capture(self):
		"""Capture camera image if applicable."""

		# Capture loop
		while self.is_running():

			# Skip processing while paused
			if not self.is_paused():

				# Assuming that timestamp and camera_image values are available
				...

				# Process timestamped camera image
				self._process_camera_image(timestamp = timestamp, image = camera_image)

			# Optionally wait some time
			...

	@DataFeatures.PipelineStepExit
	def __exit__(self, exception_type, exception_value, exception_traceback):
		"""End context."""
		
		# Stop capture loops
		self.stop()

		# Wait for capture threads to finish
		self.__gaze_thread.join()
		self.__camera_thread.join()

	def calibrate(self):
		"""Handle device calibration process."""

		...
```
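Once written in a Python file (say `data_capture_example.py` — the file and pipeline names here are illustrative), such a context class is typically referenced from a JSON context configuration; treat this fragment as a sketch whose exact keys depend on your setup:

```json
{
	"data_capture_example.DataCaptureExample": {
		"name": "Data capture example",
		"parameter": "...",
		"pipeline": "gaze_analysis_pipeline.json"
	}
}
```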

## Write data playback context

Here is a data playback context example that reads gaze positions and camera images in a single thread:

```python
import threading

from argaze import ArFeatures, DataFeatures

class DataPlaybackExample(ArFeatures.DataPlaybackContext):

	@DataFeatures.PipelineStepInit
	def __init__(self, **kwargs):

		# Init DataPlaybackContext class
		super().__init__()

		# Init private attribute
		self.__parameter = ...

	@property
	def parameter(self):
		"""Any context specific parameter."""
		return self.__parameter

	@parameter.setter
	def parameter(self, parameter):
		self.__parameter = parameter
	
	@DataFeatures.PipelineStepEnter
	def __enter__(self):
		"""Start context."""

		# Start context according to any specific parameter
		# e.g. use self.parameter here
		...

		# Start a data playback thread
		self.__data_thread = threading.Thread(target = self.__data_playback)
		self.__data_thread.start()

		return self

	def __data_playback(self):
		"""Playback gaze position and camera image if applicable."""

		# Playback loop
		while self.is_running():

			# Skip processing while paused
			if not self.is_paused():

				# Assuming that timestamp and camera_image values are available
				...

				# Process timestamped camera image
				self._process_camera_image(timestamp = timestamp, image = camera_image)

				# Assuming that timestamp, x and y values are available
				...

				# Process timestamped gaze position
				self._process_gaze_position(timestamp = timestamp, x = x, y = y)

			# Optionally wait some time
			...

	@DataFeatures.PipelineStepExit
	def __exit__(self, exception_type, exception_value, exception_traceback):
		"""End context."""
		
		# Stop playback loop
		self.stop()

		# Wait for playback thread to finish
		self.__data_thread.join()

	def previous(self):
		"""Go to previous camera image frame."""
		...

	def next(self):
		"""Go to next camera image frame."""
		...
```