Script the context
==================

Context objects are accessible from a Python script.
## Load configuration from JSON file
A context configuration can be loaded from a JSON file using the [*load*](../../../argaze.md/#argaze.load) function.
```python
from argaze import load

# Load a context
with load(configuration_filepath) as context:

    while context.is_running():

        # Do something with context
        ...

        # Wait some time eventually
        ...
```
!!! note
    The **with** statement enables the context by calling its **enter** method, then ensures that its **exit** method is always called at the end.
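For reference, here is a minimal sketch of the equivalent explicit calls, assuming *load* returns the context object and that its **enter** method follows the standard Python context-manager protocol (this is only an illustration of what the **with** statement does automatically):

```python
from argaze import load

# Equivalent explicit calls (sketch): the with statement does this automatically
context = load(configuration_filepath)
context.__enter__()

try:
    while context.is_running():

        # Do something with context
        ...

finally:
    # Always release the context, even if an error occurred
    context.__exit__(None, None, None)
```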
## Load configuration from dictionary
A context configuration can be loaded from a Python dictionary using the [*from_dict*](../../../argaze.md/#argaze.DataFeatures.from_dict) function.
```python
from argaze import DataFeatures

import my_package

# Set working directory to enable relative file path loading
DataFeatures.set_working_directory('path/to/folder')

# Edit a dict with context configuration
configuration = {
    "name": "My context",
    "parameter": ...,
    "pipeline": ...
}

# Load a context from a package
with DataFeatures.from_dict(my_package.MyContext, configuration) as context:

    while context.is_running():

        # Do something with context
        ...

        # Wait some time eventually
        ...
```
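As an illustration of the working directory setting, a relative file path inside the configuration (here a hypothetical *pipeline.json* file name) would be resolved against the folder passed to *set_working_directory*:

```python
from argaze import DataFeatures

import my_package

# Hypothetical folder layout: 'path/to/folder/pipeline.json' exists on disk
DataFeatures.set_working_directory('path/to/folder')

configuration = {
    "name": "My context",
    "pipeline": "pipeline.json"  # resolved as 'path/to/folder/pipeline.json'
}

with DataFeatures.from_dict(my_package.MyContext, configuration) as context:
    ...
```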
## Manage context
Check the context or the pipeline type to adapt the script features accordingly.
```python
from argaze import ArFeatures

# Assuming the context is loaded and is running
...

# Check context type

# Live processing case: calibration method is available
if issubclass(type(context), ArFeatures.LiveProcessingContext):
    ...

# Post processing case: more playback methods are available
if issubclass(type(context), ArFeatures.PostProcessingContext):
    ...

# Check pipeline type

# Screen-based case: only gaze positions are processed
if issubclass(type(context.pipeline), ArFeatures.ArFrame):
    ...

# Head-mounted case: camera images are also processed
if issubclass(type(context.pipeline), ArFeatures.ArCamera):
    ...
```
## Display context
The context image can be displayed at low priority so as not to block pipeline processing.
```python
# Assuming the context is loaded and is running
...

# Display context if the pipeline is available
try:
    image = context.image(wait = False)

except DataFeatures.SharedObjectBusy:
    pass
```
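For example, the image could be shown in an OpenCV window. This is only a sketch under assumptions not stated above: that *context.image()* returns an array compatible with *cv2.imshow*, and the window name is arbitrary.

```python
import cv2

from argaze import DataFeatures

# Assuming the context is loaded and is running
...

# Try to grab and display the context image without blocking the pipeline
try:
    image = context.image(wait = False)
    cv2.imshow('ArGaze context', image)
    cv2.waitKey(1)

except DataFeatures.SharedObjectBusy:
    # The pipeline is busy: skip this refresh and try again later
    pass
```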