Diffstat:
-rw-r--r--  docs/index.md                                                        |  2
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md         |  8
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md          |  8
-rw-r--r--  docs/user_guide/aruco_markers_pipeline/introduction.md               |  2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md         | 52
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md (renamed from docs/user_guide/gaze_analysis_pipeline/aoi_2d_analysis.md) |  4
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md |  2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/introduction.md               |  2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_matchers.md |  2
-rw-r--r--  docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_scan_path_analyzers.md |  2
-rw-r--r--  mkdocs.yml                                                           |  2
-rw-r--r--  src/argaze.test/AreaOfInterest/AOI2DScene.py                         |  6
-rw-r--r--  src/argaze.test/AreaOfInterest/AOI3DScene.py                         |  6
-rw-r--r--  src/argaze.test/GazeFeatures.py                                      |  4
-rw-r--r--  src/argaze/ArFeatures.py                                             | 24
-rw-r--r--  src/argaze/AreaOfInterest/AOI2DScene.py                              |  6
-rw-r--r--  src/argaze/AreaOfInterest/AOI3DScene.py                              | 10
-rw-r--r--  src/argaze/GazeAnalysis/DeviationCircleCoverage.py                   | 14
-rw-r--r--  src/argaze/GazeAnalysis/TransitionMatrix.py                          |  2
-rw-r--r--  src/argaze/GazeFeatures.py                                           | 22
20 files changed, 92 insertions(+), 88 deletions(-)
diff --git a/docs/index.md b/docs/index.md
index 2306490..f234a94 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -18,7 +18,7 @@ First of all, **ArGaze** provides extensible modules library allowing to select
* **Area Of Interest (AOI) matching**: focus point inside, deviation circle coverage, ...
* **Scan path analysis**: transition matrix, entropy, exploit/explore ratio, ...
-Once incoming data formatted as required, all those gaze analysis features can be used with any screen-based eye tracker devices.
+Once incoming data are formatted as required, all those gaze analysis features can be used with any screen-based eye tracker devices.
[Learn how to build gaze analysis pipelines for various use cases by reading user guide dedicated section](./user_guide/gaze_analysis_pipeline/introduction.md).
diff --git a/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md b/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md
index 13f9c86..a2bb8d7 100644
--- a/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md
+++ b/docs/user_guide/aruco_markers_pipeline/aoi_3d_description.md
@@ -7,14 +7,14 @@ In the example scene, each screen is considered as an area of interest more the
![3D AOI description](../../img/aoi_3d_description.png)
-All AOIs need to be described from same origin than markers in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
+All AOI need to be described from the same origin as the markers in a [right-handed 3D axis](https://robotacademy.net.au/lesson/right-handed-3d-coordinate-frame/) where:
* +X is pointing to the right,
* +Y is pointing to the top,
* +Z is pointing backward.
!!! warning
- All AOIs spatial values must be given in **centimeters**.
+ All AOI spatial values must be given in **centimeters**.
### Edit OBJ file description
@@ -43,7 +43,7 @@ s off
f 9 10 11
```
-Here are common OBJ file features needed to describe AOIs:
+Here are common OBJ file features needed to describe AOI:
* Object lines (starting with *o* key) indicate AOI name.
* Vertex lines (starting with *v* key) indicate AOI vertices.
@@ -51,7 +51,7 @@ Here are common OBJ file features needed to describe AOIs:
### Edit JSON file description
-JSON file format allows to describe AOIs vertices.
+The JSON file format allows describing AOI vertices.
``` json
{
diff --git a/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md b/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md
index bdebd6c..d7df765 100644
--- a/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md
+++ b/docs/user_guide/aruco_markers_pipeline/aoi_3d_projection.md
@@ -1,7 +1,7 @@
Project AOI into camera frame
=============================
-Once [ArUcoScene pose is estimated](pose_estimation.md) and [3D AOIs are described](aoi_3d_description.md), AOIs can be projected into [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame.
+Once [ArUcoScene pose is estimated](pose_estimation.md) and [3D AOI are described](aoi_3d_description.md), AOI can be projected into [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) frame.
![3D AOI projection](../../img/aruco_camera_aoi_projection.png)
@@ -46,7 +46,7 @@ The name of the [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically
The [AOIScene](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AOIScene) defines a set of 3D [AreaOfInterest](../../argaze.md/#argaze.AreaOfInterest.AOIFeatures.AreaOfInterest) registered by name.
-## Add ArLayer to ArUcoCamera to project 3D AOIs
+## Add ArLayer to ArUcoCamera to project 3D AOI
Here is the previous extract where one layer is added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) and displayed:
@@ -103,11 +103,11 @@ The name of the [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer). Basically
[ArUcoScene](../../argaze.md/#argaze.ArUcoMarkers.ArUcoScene) layers are projected into their dedicated [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layers when calling the [ArUcoCamera.watch](../../argaze.md/#argaze.ArFeatures.ArCamera.watch) method.
-## Add 2D AOIs analysis
+## Add 2D AOI analysis
When a scene layer is projected into a camera layer, it means that the 3D [ArLayer.aoi_scene](../../argaze.md/#argaze.ArFeatures.ArLayer.aoi_scene) description of the scene becomes the 2D camera's [ArLayer.aoi_scene](../../argaze.md/#argaze.ArFeatures.ArLayer.aoi_scene) description of the camera.
-Therefore, it means that [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) benefits from all the services described in [2D AOIs analysis pipeline section](../gaze_analysis_pipeline/aoi_2d_analysis.md).
+Therefore, it means that [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) benefits from all the services described in [2D AOI analysis pipeline section](../gaze_analysis_pipeline/aoi_analysis.md).
Here is the previous extract where AOI matcher, AOI scan path and AOI scan path analyzers are added to the [ArUcoCamera](../../argaze.md/#argaze.ArUcoMarkers.ArUcoCamera) layer:
diff --git a/docs/user_guide/aruco_markers_pipeline/introduction.md b/docs/user_guide/aruco_markers_pipeline/introduction.md
index 917245d..dc3aa4a 100644
--- a/docs/user_guide/aruco_markers_pipeline/introduction.md
+++ b/docs/user_guide/aruco_markers_pipeline/introduction.md
@@ -19,7 +19,7 @@ To build your own ArUco markers pipeline, you need to know:
* [How to describe scene's AOI](aoi_3d_description.md),
* [How to load and execute ArUco markers pipeline](configuration_and_execution.md),
* [How to estimate scene pose](pose_estimation.md),
-* [How to project AOIs into camera frame](aoi_3d_projection.md),
+* [How to project AOI into camera frame](aoi_3d_projection.md),
* [How to define an AOI as a frame](aoi_frame.md)
More advanced features are also explained like:
diff --git a/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md b/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md
index b2f0b90..6cca7ce 100644
--- a/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_2d_description.md
@@ -1,17 +1,17 @@
Describe 2D AOI
-===============
+================
-Once [frame is configured](configuration_and_execution.md), areas of interest need to be described into the same 2D referential.
+Once [ArFrame](../../argaze.md/#argaze.ArFeatures.ArFrame) is [configured](configuration_and_execution.md), areas of interest need to be described to define what is looked at in the frame.
![2D AOI description](../../img/aoi_2d_description.png)
-According a common computer graphics coordinates convention, all AOIs need to be described from a top left frame corner origin in a coordinate system where:
+According to the common computer graphics coordinates convention, all AOI need to be described from a top-left frame corner origin with a coordinate system where:
* +X is pointing to the right,
* +Y is pointing downward.
!!! warning
- All AOIs spatial values must be given in **pixels**.
+ All AOI spatial values must be given in **pixels**.
### Edit SVG file description
@@ -27,7 +27,7 @@ SVG file format could be exported from most vector graphics editors.
</svg>
```
-Here are common SVG file features needed to describe AOIs:
+Here are common SVG file features needed to describe AOI:
* *id* attribute indicates AOI name.
* *path* element describes any polygon using only [M, L and Z path instructions](https://www.w3.org/TR/SVG2/paths.html#PathData)
@@ -35,36 +35,40 @@ Here are common SVG file features needed to describe AOIs:
### Edit JSON file description
-JSON file format allows to describe AOIs.
+The JSON file format allows describing AOI.
``` json
{
"Triangle" : [[1288.1, 189.466], [1991.24, 3399.34], [584.958, 3399.34]],
"BlueRectangle": {
- "shape": "rectangle",
- "x": 1257,
- "y": 1905.18,
- "width": 604.169,
- "height": 988.564
+ "Rectangle": {
+ "x": 1257,
+ "y": 1905.18,
+ "width": 604.169,
+ "height": 988.564
+ }
},
"RedSquare": {
- "shape": "rectangle",
- "x": 623.609,
- "y": 658.357,
- "width": 803.15,
- "height": 803.15
+ "Rectangle": {
+ "x": 623.609,
+ "y": 658.357,
+ "width": 803.15,
+ "height": 803.15
+ }
},
"GreenCircle": {
- "shape": "circle",
- "cx": 675.77,
- "cy": 2163.5,
- "radius": 393.109
+ "Circle": {
+ "cx": 675.77,
+ "cy": 2163.5,
+ "radius": 393.109
+ }
},
"PinkCircle": {
- "shape": "circle",
- "cx": 1902.02,
- "cy": 879.316,
- "radius": 195.313
+ "Circle": {
+ "cx": 1902.02,
+ "cy": 879.316,
+ "radius": 195.313
+ }
}
}
```
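As an editorial aside on the new JSON layout introduced above: the shape name moves from a `"shape"` attribute to a nesting key (`"Rectangle"`, `"Circle"`). Here is a minimal, hypothetical parsing sketch of that layout — it is not ArGaze's actual loader, and the `parse_aoi` helper and sample data are illustrative only:

```python
import json

# Hypothetical parser for the nested shape layout shown above;
# ArGaze's real JSON loader may differ.
def parse_aoi(description: str) -> dict:
    aoi = {}
    for name, value in json.loads(description).items():
        if isinstance(value, list):
            # A bare list is a polygon given by its vertices
            aoi[name] = ('Polygon', value)
        else:
            # Otherwise the single nesting key names the shape
            shape, params = next(iter(value.items()))
            aoi[name] = (shape, params)
    return aoi

description = '''{
    "Triangle": [[0, 0], [10, 0], [5, 10]],
    "BlueRectangle": {"Rectangle": {"x": 1, "y": 2, "width": 3, "height": 4}},
    "GreenCircle": {"Circle": {"cx": 5, "cy": 5, "radius": 2}}
}'''

aoi = parse_aoi(description)
```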
diff --git a/docs/user_guide/gaze_analysis_pipeline/aoi_2d_analysis.md b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
index 66763ad..cce3fcb 100644
--- a/docs/user_guide/gaze_analysis_pipeline/aoi_2d_analysis.md
+++ b/docs/user_guide/gaze_analysis_pipeline/aoi_analysis.md
@@ -1,7 +1,7 @@
-Enable 2D AOIs analysis
+Enable AOI analysis
===================
-The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space where to make matching of gaze movements with AOIs and inside which those matchings need to be analyzed.
+The [ArLayer](../../argaze.md/#argaze.ArFeatures.ArLayer) class defines a space where gaze movements are matched with AOI, and inside which those matchings need to be analyzed.
![Layer](../../img/ar_layer.png)
diff --git a/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md b/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
index 7657935..bb8eeaa 100644
--- a/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
+++ b/docs/user_guide/gaze_analysis_pipeline/configuration_and_execution.md
@@ -107,4 +107,4 @@ Timestamped gaze positions have to be passed one by one to [ArFrame.look](../../
At this point, the [ArFrame.look](../../argaze.md/#argaze.ArFeatures.ArFrame.look) method only processes gaze movement identification and scan path analysis, without any AOI analysis nor any logging or visualisation support.
- Read the next chapters to learn how to [add AOI analysis](aoi_2d_analysis.md), [log gaze analysis](logging.md) and [visualize pipeline steps](visualisation.md).
\ No newline at end of file
+ Read the next chapters to learn how to [describe frame's AOI](aoi_2d_description.md), [add AOI analysis](aoi_analysis.md), [log gaze analysis](logging.md) and [visualize pipeline steps](visualisation.md).
\ No newline at end of file
diff --git a/docs/user_guide/gaze_analysis_pipeline/introduction.md b/docs/user_guide/gaze_analysis_pipeline/introduction.md
index d33d308..76a146c 100644
--- a/docs/user_guide/gaze_analysis_pipeline/introduction.md
+++ b/docs/user_guide/gaze_analysis_pipeline/introduction.md
@@ -12,7 +12,7 @@ To build your own gaze analysis pipeline, you need to know:
* [How to edit timestamped gaze positions](timestamped_gaze_positions_edition.md),
* [How to load and execute gaze analysis pipeline](configuration_and_execution.md),
* [How to describe frame's AOI](aoi_2d_description.md),
-* [How to enable AOIs analysis](aoi_2d_analysis.md),
+* [How to enable AOI analysis](aoi_analysis.md),
* [How to visualize ArFrame and ArLayers](visualisation.md),
* [How to log resulted gaze analysis](logging.md),
* [How to make heatmap image](heatmap.md).
diff --git a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_matchers.md b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_matchers.md
index 8ba751f..61338cc 100644
--- a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_matchers.md
+++ b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_matchers.md
@@ -3,7 +3,7 @@ AOI matchers
ArGaze provides ready-to-use AOI matching algorithms.
-Here are JSON samples to include the chosen module inside [ArLayer configuration](../aoi_2d_analysis.md) *aoi_matcher* entry.
+Here are JSON samples to include the chosen module inside [ArLayer configuration](../aoi_analysis.md) *aoi_matcher* entry.
## Deviation circle coverage
diff --git a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_scan_path_analyzers.md b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_scan_path_analyzers.md
index e395750..ad1832d 100644
--- a/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_scan_path_analyzers.md
+++ b/docs/user_guide/gaze_analysis_pipeline/pipeline_modules/aoi_scan_path_analyzers.md
@@ -3,7 +3,7 @@ AOI scan path analyzers
ArGaze provides ready-to-use AOI scan path analysis algorithms.
-Here are JSON samples to include a chosen module inside [ArLayer configuration](../aoi_2d_analysis.md) *aoi_scan_path_analyzers* entry.
+Here are JSON samples to include a chosen module inside [ArLayer configuration](../aoi_analysis.md) *aoi_scan_path_analyzers* entry.
## Basic metrics
diff --git a/mkdocs.yml b/mkdocs.yml
index 385ebef..3c5f10c 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -9,7 +9,7 @@ nav:
- user_guide/gaze_analysis_pipeline/timestamped_gaze_positions_edition.md
- user_guide/gaze_analysis_pipeline/configuration_and_execution.md
- user_guide/gaze_analysis_pipeline/aoi_2d_description.md
- - user_guide/gaze_analysis_pipeline/aoi_2d_analysis.md
+ - user_guide/gaze_analysis_pipeline/aoi_analysis.md
- user_guide/gaze_analysis_pipeline/visualisation.md
- user_guide/gaze_analysis_pipeline/logging.md
- user_guide/gaze_analysis_pipeline/heatmap.md
diff --git a/src/argaze.test/AreaOfInterest/AOI2DScene.py b/src/argaze.test/AreaOfInterest/AOI2DScene.py
index 4e96e98..10ff430 100644
--- a/src/argaze.test/AreaOfInterest/AOI2DScene.py
+++ b/src/argaze.test/AreaOfInterest/AOI2DScene.py
@@ -187,14 +187,14 @@ class TestTimeStampedAOIScenesClass(unittest.TestCase):
aoi_2D_B = AOIFeatures.AreaOfInterest([[1, 1], [1, 2], [2, 2], [2, 1]])
aoi_2d_scene = AOI2DScene.AOI2DScene({"A": aoi_2D_A, "B": aoi_2D_B})
- ts_aois_scenes = AOIFeatures.TimeStampedAOIScenes()
+ ts_aoi_scenes = AOIFeatures.TimeStampedAOIScenes()
- ts_aois_scenes[0] = aoi_2d_scene
+ ts_aoi_scenes[0] = aoi_2d_scene
# Check that only AOIScene can be added
with self.assertRaises(AssertionError):
- ts_aois_scenes[1] = "This string is not an AOI2DScene"
+ ts_aoi_scenes[1] = "This string is not an AOI2DScene"
if __name__ == '__main__':
diff --git a/src/argaze.test/AreaOfInterest/AOI3DScene.py b/src/argaze.test/AreaOfInterest/AOI3DScene.py
index b386432..d09f2a8 100644
--- a/src/argaze.test/AreaOfInterest/AOI3DScene.py
+++ b/src/argaze.test/AreaOfInterest/AOI3DScene.py
@@ -107,14 +107,14 @@ class TestTimeStampedAOIScenesClass(unittest.TestCase):
aoi_3D_B = AOIFeatures.AreaOfInterest([[1, 1, 0], [1, 2, 0], [2, 2, 0], [2, 1, 0]])
aoi_3d_scene = AOI3DScene.AOI3DScene({"A": aoi_3D_A, "B": aoi_3D_B})
- ts_aois_scenes = AOIFeatures.TimeStampedAOIScenes()
+ ts_aoi_scenes = AOIFeatures.TimeStampedAOIScenes()
- ts_aois_scenes[0] = aoi_3d_scene
+ ts_aoi_scenes[0] = aoi_3d_scene
# Check that only AOIScene can be added
with self.assertRaises(AssertionError):
- ts_aois_scenes[1] = "This string is not an AOI3DScene"
+ ts_aoi_scenes[1] = "This string is not an AOI3DScene"
if __name__ == '__main__':
diff --git a/src/argaze.test/GazeFeatures.py b/src/argaze.test/GazeFeatures.py
index d609dd2..b41c7c7 100644
--- a/src/argaze.test/GazeFeatures.py
+++ b/src/argaze.test/GazeFeatures.py
@@ -497,10 +497,10 @@ class TestAOIScanStepClass(unittest.TestCase):
aoi_scan_step = GazeFeatures.AOIScanStep(movements, 'Test')
-def build_aoi_scan_path(expected_aois, aoi_path):
+def build_aoi_scan_path(expected_aoi, aoi_path):
"""Build AOI scan path"""
- aoi_scan_path = GazeFeatures.AOIScanPath(expected_aois)
+ aoi_scan_path = GazeFeatures.AOIScanPath(expected_aoi)
# Append a hidden last step to allow last given step creation
aoi_path.append(aoi_path[-2])
diff --git a/src/argaze/ArFeatures.py b/src/argaze/ArFeatures.py
index 0750cb5..122efe8 100644
--- a/src/argaze/ArFeatures.py
+++ b/src/argaze/ArFeatures.py
@@ -96,7 +96,7 @@ DEFAULT_ARLAYER_DRAW_PARAMETERS = {
@dataclass
class ArLayer():
"""
- Defines a space where to make matching of gaze movements and AOIs and inside which those matchings need to be analyzed.
+ Defines a space where gaze movements are matched with AOI, and inside which those matchings need to be analyzed.
Parameters:
name: name of the layer
@@ -203,10 +203,10 @@ class ArLayer():
new_aoi_scene = AOI2DScene.AOI2DScene()
# Edit expected AOI list by removing AOI with name equals to layer name
- expected_aois = list(new_aoi_scene.keys())
+ expected_aoi = list(new_aoi_scene.keys())
- if new_layer_name in expected_aois:
- expected_aois.remove(new_layer_name)
+ if new_layer_name in expected_aoi:
+ expected_aoi.remove(new_layer_name)
# Load aoi matcher
try:
@@ -230,13 +230,13 @@ class ArLayer():
try:
new_aoi_scan_path_data = layer_data.pop('aoi_scan_path')
- new_aoi_scan_path_data['expected_aois'] = expected_aois
+ new_aoi_scan_path_data['expected_aoi'] = expected_aoi
new_aoi_scan_path = GazeFeatures.AOIScanPath(**new_aoi_scan_path_data)
except KeyError:
new_aoi_scan_path_data = {}
- new_aoi_scan_path_data['expected_aois'] = expected_aois
+ new_aoi_scan_path_data['expected_aoi'] = expected_aoi
new_aoi_scan_path = None
# Load AOI scan path analyzers
@@ -1208,7 +1208,7 @@ class ArScene():
# Check that the frame have a layer named like this scene layer
aoi_2d_scene = new_frame.layers[scene_layer_name].aoi_scene
- # Transform 2D frame layer AOIs into 3D scene layer AOIs
+ # Transform 2D frame layer AOI into 3D scene layer AOI
# Then, add them to scene layer
scene_layer.aoi_scene |= aoi_2d_scene.dimensionalize(frame_3d, new_frame.size)
@@ -1228,12 +1228,12 @@ class ArScene():
if frame_layer.aoi_scan_path is not None:
# Edit expected AOI list by removing AOI with name equals to frame layer name
- expected_aois = list(layer.aoi_scene.keys())
+ expected_aoi = list(layer.aoi_scene.keys())
- if frame_layer_name in expected_aois:
- expected_aois.remove(frame_layer_name)
+ if frame_layer_name in expected_aoi:
+ expected_aoi.remove(frame_layer_name)
- frame_layer.aoi_scan_path.expected_aois = expected_aois
+ frame_layer.aoi_scan_path.expected_aoi = expected_aoi
except KeyError:
@@ -1353,7 +1353,7 @@ class ArCamera(ArFrame):
continue
- layer.aoi_scan_path.expected_aois = all_aoi_list
+ layer.aoi_scan_path.expected_aoi = all_aoi_list
# Init a lock to share scene projections into camera frame between multiple threads
self._frame_lock = threading.Lock()
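The `expected_aoi` edits in this hunk all apply the same rule: a layer's own name must not appear in its list of expected AOI. A standalone sketch of that filtering, with illustrative names (the `filter_expected_aoi` helper is not part of ArGaze):

```python
# Remove the layer's own name from the AOI names expected by its scan path.
def filter_expected_aoi(aoi_scene_names: list, layer_name: str) -> list:
    expected_aoi = list(aoi_scene_names)
    if layer_name in expected_aoi:
        expected_aoi.remove(layer_name)
    return expected_aoi

expected = filter_expected_aoi(['Screen', 'Button', 'Screen_layer'], 'Screen_layer')
```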
diff --git a/src/argaze/AreaOfInterest/AOI2DScene.py b/src/argaze/AreaOfInterest/AOI2DScene.py
index 4dc47f4..a726b23 100644
--- a/src/argaze/AreaOfInterest/AOI2DScene.py
+++ b/src/argaze/AreaOfInterest/AOI2DScene.py
@@ -26,9 +26,9 @@ AOI3DSceneType = TypeVar('AOI3DScene', bound="AOI3DScene")
class AOI2DScene(AOIFeatures.AOIScene):
"""Define AOI 2D scene."""
- def __init__(self, aois_2d: dict = None):
+ def __init__(self, aoi_2d: dict = None):
- super().__init__(2, aois_2d)
+ super().__init__(2, aoi_2d)
@classmethod
def from_svg(self, svg_filepath: str) -> AOI2DSceneType:
@@ -121,7 +121,7 @@ class AOI2DScene(AOIFeatures.AOIScene):
yield name, aoi, matching
def draw_raycast(self, image: numpy.array, pointer:tuple, exclude=[], base_color=(0, 0, 255), matching_color=(0, 255, 0)):
- """Draw AOIs with their matching status."""
+ """Draw AOI with their matching status."""
for name, aoi, matching in self.raycast(pointer):
diff --git a/src/argaze/AreaOfInterest/AOI3DScene.py b/src/argaze/AreaOfInterest/AOI3DScene.py
index bfe189a..33a815c 100644
--- a/src/argaze/AreaOfInterest/AOI3DScene.py
+++ b/src/argaze/AreaOfInterest/AOI3DScene.py
@@ -38,15 +38,15 @@ AOI2DSceneType = TypeVar('AOI2DScene', bound="AOI2DScene")
class AOI3DScene(AOIFeatures.AOIScene):
"""Define AOI 3D scene."""
- def __init__(self, aois_3d: dict = None):
+ def __init__(self, aoi_3d: dict = None):
- super().__init__(3, aois_3d)
+ super().__init__(3, aoi_3d)
@classmethod
def from_obj(self, obj_filepath: str) -> AOI3DSceneType:
"""Load AOI3D scene from .obj file."""
- aois_3d = {}
+ aoi_3d = {}
# regex rules for .obj file parsing
OBJ_RX_DICT = {
@@ -111,12 +111,12 @@ class AOI3DScene(AOIFeatures.AOIScene):
# retrieve all aoi3D vertices and sort them in clockwise order
for name, face in faces.items():
aoi3D = AOIFeatures.AreaOfInterest([ vertices[i-1] for i in reversed(face) ])
- aois_3d[name] = aoi3D
+ aoi_3d[name] = aoi3D
except IOError:
raise IOError(f'File not found: {obj_filepath}')
- return AOI3DScene(aois_3d)
+ return AOI3DScene(aoi_3d)
def to_obj(self, obj_filepath: str):
"""Save AOI3D scene into .obj file."""
diff --git a/src/argaze/GazeAnalysis/DeviationCircleCoverage.py b/src/argaze/GazeAnalysis/DeviationCircleCoverage.py
index f0decfc..6dadaba 100644
--- a/src/argaze/GazeAnalysis/DeviationCircleCoverage.py
+++ b/src/argaze/GazeAnalysis/DeviationCircleCoverage.py
@@ -34,7 +34,7 @@ class AOIMatcher(GazeFeatures.AOIMatcher):
self.__look_count = 0
self.__looked_aoi_data = (None, None)
self.__circle_ratio_sum = {}
- self.__aois_coverages = {}
+ self.__aoi_coverages = {}
self.__matched_gaze_movement = None
self.__matched_region = None
@@ -79,14 +79,14 @@ class AOIMatcher(GazeFeatures.AOIMatcher):
self.__looked_aoi_data = most_likely_looked_aoi_data
# Calculate looked aoi circle ratio means
- self.__aois_coverages = {}
+ self.__aoi_coverages = {}
for aoi_name, circle_ratio_sum in self.__circle_ratio_sum.items():
circle_ratio_mean = circle_ratio_sum / self.__look_count
# clamp circle ratio means greater than 1
- self.__aois_coverages[aoi_name] = circle_ratio_mean if circle_ratio_mean < 1 else 1
+ self.__aoi_coverages[aoi_name] = circle_ratio_mean if circle_ratio_mean < 1 else 1
# Update matched gaze movement
self.__matched_gaze_movement = gaze_movement
@@ -95,7 +95,7 @@ class AOIMatcher(GazeFeatures.AOIMatcher):
self.__matched_region = matched_region
# Return
- if self.__aois_coverages[most_likely_looked_aoi_data[0]] > self.coverage_threshold:
+ if self.__aoi_coverages[most_likely_looked_aoi_data[0]] > self.coverage_threshold:
return self.__looked_aoi_data
@@ -179,8 +179,8 @@ class AOIMatcher(GazeFeatures.AOIMatcher):
return self.__looked_aoi_data[0]
@property
- def aois_coverages(self) -> dict:
- """Get all aois coverage means for current fixation.
+ def aoi_coverages(self) -> dict:
+ """Get all aoi coverage means for current fixation.
It represents the ratio of the fixation deviation circle surface that is used to cover the aoi."""
- return self.__aois_coverages
\ No newline at end of file
+ return self.__aoi_coverages
\ No newline at end of file
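For context on the renamed `aoi_coverages` property: the hunks above average each AOI's accumulated circle ratio over the look count and clamp the mean to 1. A minimal sketch of that computation, with illustrative AOI names and values (the standalone `aoi_coverages` function is not ArGaze API):

```python
# Per-AOI coverage mean: accumulated circle ratio divided by look count,
# clamped to 1 so a coverage can never exceed the full circle surface.
def aoi_coverages(circle_ratio_sum: dict, look_count: int) -> dict:
    return {
        name: min(ratio_sum / look_count, 1)
        for name, ratio_sum in circle_ratio_sum.items()
    }

coverages = aoi_coverages({'Screen': 2.5, 'Button': 0.25}, 2)
```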
diff --git a/src/argaze/GazeAnalysis/TransitionMatrix.py b/src/argaze/GazeAnalysis/TransitionMatrix.py
index 6f408e4..b346b5a 100644
--- a/src/argaze/GazeAnalysis/TransitionMatrix.py
+++ b/src/argaze/GazeAnalysis/TransitionMatrix.py
@@ -42,7 +42,7 @@ class AOIScanPathAnalyzer(GazeFeatures.AOIScanPathAnalyzer):
row_sum = aoi_scan_path.transition_matrix.apply(lambda row: row.sum(), axis=1)
# Editing transition matrix probabilities
- # Note: when no transiton starts from an aoi, destination probabilites is equal to 1/S where S is the number of aois
+# Note: when no transition starts from an aoi, destination probabilities are equal to 1/S where S is the number of aoi
self.__transition_matrix_probabilities = aoi_scan_path.transition_matrix.apply(lambda row: row.apply(lambda p: p / row_sum[row.name] if row_sum[row.name] > 0 else 1 / row_sum.size), axis=1)
# Calculate matrix density
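The normalization the hunk above comments on can be checked in isolation: each row of the transition matrix is divided by its sum, and a row with no outgoing transition falls back to a uniform 1/S distribution. A sketch with illustrative two-AOI data, using the same pandas expression as the source:

```python
import pandas

# Two AOI, one observed transition A -> B (count 2), none leaving B.
transition_matrix = pandas.DataFrame(
    [[0, 2], [0, 0]], index=['A', 'B'], columns=['A', 'B'])

row_sum = transition_matrix.apply(lambda row: row.sum(), axis=1)

# Normalize each row; empty rows get uniform probability 1/S.
probabilities = transition_matrix.apply(
    lambda row: row.apply(
        lambda p: p / row_sum[row.name] if row_sum[row.name] > 0 else 1 / row_sum.size),
    axis=1)
```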
diff --git a/src/argaze/GazeFeatures.py b/src/argaze/GazeFeatures.py
index 2dd1cab..814753e 100644
--- a/src/argaze/GazeFeatures.py
+++ b/src/argaze/GazeFeatures.py
@@ -842,13 +842,13 @@ AOIScanPathType = TypeVar('AOIScanPathType', bound="AOIScanPathType")
class AOIScanPath(list):
"""List of aoi scan steps over successive aoi."""
- def __init__(self, expected_aois: list[str] = [], duration_max: int|float = 0):
+ def __init__(self, expected_aoi: list[str] = [], duration_max: int|float = 0):
super().__init__()
self.duration_max = duration_max
- self.expected_aois = expected_aois
+ self.expected_aoi = expected_aoi
self.__duration = 0
@property
@@ -903,13 +903,13 @@ class AOIScanPath(list):
return sequence
@property
- def expected_aois(self):
+ def expected_aoi(self):
"""List of all expected aoi."""
- return self.__expected_aois
+ return self.__expected_aoi
- @expected_aois.setter
- def expected_aois(self, expected_aois: list[str] = []):
+ @expected_aoi.setter
+ def expected_aoi(self, expected_aoi: list[str] = []):
"""Edit list of all expected aoi.
!!! warning
@@ -917,15 +917,15 @@ class AOIScanPath(list):
"""
self.clear()
- self.__expected_aois = expected_aois
+ self.__expected_aoi = expected_aoi
self.__movements = TimeStampedGazeMovements()
self.__current_aoi = ''
self.__index = ord('A')
self.__aoi_letter = {}
self.__letter_aoi = {}
- size = len(self.__expected_aois)
- self.__transition_matrix = pandas.DataFrame(numpy.zeros((size, size)), index=self.__expected_aois, columns=self.__expected_aois)
+ size = len(self.__expected_aoi)
+ self.__transition_matrix = pandas.DataFrame(numpy.zeros((size, size)), index=self.__expected_aoi, columns=self.__expected_aoi)
@property
def current_aoi(self):
@@ -953,7 +953,7 @@ class AOIScanPath(list):
!!! warning
It could raise AOIScanStepError"""
- if looked_aoi not in self.__expected_aois:
+ if looked_aoi not in self.__expected_aoi:
raise AOIScanStepError('AOI not expected', looked_aoi)
@@ -1013,7 +1013,7 @@ class AOIScanPath(list):
"""Get how many fixations are there in the scan path and how many fixation are there in each aoi."""
scan_fixations_count = 0
- aoi_fixations_count = {aoi: 0 for aoi in self.__expected_aois}
+ aoi_fixations_count = {aoi: 0 for aoi in self.__expected_aoi}
for aoi_scan_step in self:
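The truncated hunk above initializes per-AOI fixation counters from the renamed `expected_aoi` list before iterating over the scan steps. A self-contained sketch of that counting, with illustrative step data (the `fixations_count` helper signature here is an assumption, not ArGaze's method):

```python
# Count fixations over a whole scan path and per expected AOI.
# Each step is modeled as an (aoi_name, fixation_count) pair.
def fixations_count(expected_aoi: list, steps: list) -> tuple:
    scan_fixations_count = 0
    aoi_fixations_count = {aoi: 0 for aoi in expected_aoi}
    for aoi_name, step_fixations in steps:
        scan_fixations_count += step_fixations
        aoi_fixations_count[aoi_name] += step_fixations
    return scan_fixations_count, aoi_fixations_count

total, per_aoi = fixations_count(['A', 'B'], [('A', 2), ('B', 1), ('A', 1)])
```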