author      Théo de la Hogue    2022-11-16 20:15:09 +0100
committer   Théo de la Hogue    2022-11-16 20:15:09 +0100
commit      02bd20b7cf914709e64980c1ce5faa9af2ad15f2 (patch)
tree        fd613ab7057946efebae3a8974a46a7f51c596db
parent      33c8d6b7ed9f1ab2993da405430cca2910c5c636 (diff)
Improving heading size.
-rw-r--r--  README.md                   16
-rw-r--r--  src/argaze/utils/README.md  40
2 files changed, 27 insertions(+), 29 deletions(-)
diff --git a/README.md b/README.md
index 2b9c0b7..cd6b147 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,6 @@
An open-source python toolkit to deal with dynamic Areas Of Interest (AOI) and gaze tracking in Augmented Reality (AR) environments.
-## Architecture
+# Architecture
The ArGaze toolkit provides generic data structures and algorithms to build AR environments with dynamic AOI, allowing gaze tracking with mobile eye tracker devices. It is divided into submodules dedicated to various specific features:
@@ -10,7 +10,7 @@ The ArGaze toolkit provides some generics data structures and algorithms to buil
* `argaze.TobiiGlassesPro2`: A gaze tracking device interface.
* `argaze.utils`: Collection of command-line high level features scripts based on ArGaze toolkit.
-## Installation
+# Installation
Note that all inline commands below need to be executed in the ArGaze root folder.
@@ -39,17 +39,17 @@ pip install ./dist/argaze-VERSION.whl
pip install -e .
```
-## Documentation
+# Documentation
-### Wiki
+## Wiki
The [wiki](https://git.recherche.enac.fr/projects/argaze/wiki) provides explanations about how ArGaze works and what it can do, along with code samples.
-### Cookbook
+## Cookbook
The `argaze.utils` submodule is a good place to get ready-made code examples.
-### Code
+## Code
ArGaze code documentation is based on [pdoc](https://pdoc.dev/).
To generate html documentation:
@@ -73,9 +73,9 @@ pdoc -o ./doc ./src/argaze/
pdoc ./src/argaze/
```
-## Test
+# Test
-ArGaze package unitary tests are based on *unittest* module.
+ArGaze package unit tests are based on the [unittest](https://docs.python.org/fr/3.10/library/unittest.html) module.
The test file tree mirrors the structure of the src/argaze folder.

To run all unit tests:
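The hunk ends before the command itself appears. As an illustrative sketch of the mirrored test layout described above (the class and test names below are assumptions, not taken from the repository), a test file might look like:

```python
import io
import unittest

# Hypothetical test case mirroring a module under src/argaze;
# class and test names are illustrative assumptions.
class TestTimeStampedBuffer(unittest.TestCase):

    def test_append_keeps_order(self):
        buffer = []
        for timestamp in (0, 20, 40):
            buffer.append(timestamp)
        self.assertEqual(buffer, [0, 20, 40])

# Equivalent of what a `python -m unittest` run does for one test case.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestTimeStampedBuffer)
result = unittest.TextTestRunner(stream=io.StringIO()).run(suite)
```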
diff --git a/src/argaze/utils/README.md b/src/argaze/utils/README.md
index 2fc3bab..7bc56bd 100644
--- a/src/argaze/utils/README.md
+++ b/src/argaze/utils/README.md
@@ -1,6 +1,4 @@
-Collection of command-line high level features based on ArGaze toolkit.
-
-## Ready-to-use commands
+Collection of ready-to-use commands based on the ArGaze toolkit.
.. note::
*Note that all inline commands below need to be executed in the ArGaze root folder.*
@@ -8,51 +6,51 @@ Collection of command-line high level features based on ArGaze toolkit.
.. note::
*Use -h option to get command arguments documentation.*
-### ArUco factory
+# ArUco factory
-- Export all markers from DICT_APRILTAG_16h5 dictionary as 5 cm pictures with 300 dpi resolution into an export/markers folder:
+Export all markers from the DICT_APRILTAG_16h5 dictionary as 5 cm pictures at 300 dpi into an export/markers folder:
```
python ./src/argaze/utils/aruco_markers_export.py -o export/markers -d DICT_APRILTAG_16h5 -s 5 -r 300
```
-- Export a 7 columns and 5 rows calibration board made of 5cm squares with 3cm markers from DICT_APRILTAG_16h5 dictionary at 50 dpi into an export folder:
+Export a 7-column, 5-row calibration board made of 5 cm squares with 3 cm markers from the DICT_APRILTAG_16h5 dictionary at 300 dpi into an export folder:
```
-python ./src/argaze/utils/aruco_calibration_board_export.py 7 5 5 3 -o export -d DICT_APRILTAG_16h5
+python ./src/argaze/utils/aruco_calibration_board_export.py 7 5 5 3 -o export -d DICT_APRILTAG_16h5 -r 300
```
-### Tobii calibration
+# Tobii calibration
-- Calibrate Tobii Glasses Pro 2 camera (-t IP_ADDRESS) using a 7 columns and 5 rows calibration board made of 5cm squares with 3cm markers from DICT_APRILTAG_16h5 dictionary. Then, export its optical parameters into an tobii_camera.json file:
+Calibrate the Tobii Glasses Pro 2 camera (-t IP_ADDRESS) using a 7-column, 5-row calibration board made of 5 cm squares with 3 cm markers from the DICT_APRILTAG_16h5 dictionary, then export its optical parameters into a tobii_camera.json file:
```
python ./src/argaze/utils/tobii_camera_calibrate.py 7 5 5 3 -t IP_ADDRESS -d DICT_APRILTAG_16h5 -o export/tobii_camera.json
```
-- Calibrate Tobii Glasses Pro 2 inertial measure unit (-t IP_ADDRESS) then, export calibration parameters into an imu.json file:
+Calibrate the Tobii Glasses Pro 2 inertial measurement unit (-t IP_ADDRESS), then export its calibration parameters into an imu.json file:
```
python ./src/argaze/utils/tobii_imu_calibrate.py -t IP_ADDRESS -o export/imu.json
```
-### Tobii session
+# Tobii session
-- Display Tobii Glasses Pro 2 camera video stream (-t IP_ADDRESS) with a live gaze pointer. Loading calibration file to display inertial sensors data:
+Display the Tobii Glasses Pro 2 camera video stream (-t IP_ADDRESS) with a live gaze pointer. Load a calibration file to display inertial sensor data:
```
python ./src/argaze/utils/tobii_stream_display.py -t IP_ADDRESS -i export/imu.json
```
-- Record a Tobii Glasses Pro 2 'myProject' session for a 'myUser' participant on Tobii interface's SD card (-t IP_ADDRESS):
+Record a Tobii Glasses Pro 2 'myProject' session for a 'myUser' participant on the Tobii interface's SD card (-t IP_ADDRESS):
```
python ./src/argaze/utils/tobii_segment_record.py -t IP_ADDRESS -p myProject -u myUser
```
-### Tobii drive
+# Tobii drive
-- Explore Tobii Glasses Pro 2 interface's SD Card (-d DRIVE_PATH, -p PROJECT_PATH, -r RECORDING_PATH, -s SEGMENT_PATH):
+Explore the Tobii Glasses Pro 2 interface's SD card (-d DRIVE_PATH, -p PROJECT_PATH, -r RECORDING_PATH, -s SEGMENT_PATH):
```
python ./src/argaze/utils/tobii_sdcard_explore.py -d DRIVE_PATH
@@ -70,29 +68,29 @@ python ./src/argaze/utils/tobii_sdcard_explore.py -r RECORDING_PATH
python ./src/argaze/utils/tobii_sdcard_explore.py -s SEGMENT_PATH
```
-### Tobii post-processing
+# Tobii post-processing
-- Replay a time range selection (-r IN OUT) Tobii Glasses Pro 2 session (-s SEGMENT_PATH) synchronizing video and some data together:
+Replay a time range selection (-r IN OUT) of a Tobii Glasses Pro 2 session (-s SEGMENT_PATH), synchronizing video and data:
```
python ./src/argaze/utils/tobii_segment_display.py -s SEGMENT_PATH -r IN OUT
```
-- Export Tobii segment fixations and saccades (-s SEGMENT_PATH) from a time range selection (-r IN OUT) as fixations.csv and saccades.csv files saved into the segment folder:
+Export Tobii segment fixations and saccades (-s SEGMENT_PATH) from a time range selection (-r IN OUT) as fixations.csv and saccades.csv files saved into the segment folder:
```
python ./src/argaze/utils/tobii_segment_gaze_movements_export.py -s SEGMENT_PATH -r IN OUT
```
-### Tobii with ArUco
+# Tobii with ArUco
-- Track ArUco markers into Tobii camera video stream (-t IP_ADDRESS). Load aoi scene .obj file related to each marker (-mi MARKER_ID, PATH_TO_AOI_SCENE), position each scene virtually relatively to its detected ArUco markers then project the scene into camera frame:
+Track ArUco markers in the Tobii camera video stream (-t IP_ADDRESS). Load the AOI scene .obj file related to each marker (-mi MARKER_ID, PATH_TO_AOI_SCENE), position each scene virtually relative to its detected ArUco markers, then project the scene into the camera frame:
```
python ./src/argaze/utils/tobii_stream_aruco_aoi_display.py -t IP_ADDRESS -c export/tobii_camera.json -ms 5 -mi '{"MARKER_ID":"PATH_TO_AOI_SCENE.obj",...}'
```
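The -mi option takes a JSON object mapping marker identifiers to AOI scene files. A minimal sketch of how such a mapping can be parsed on the receiving side (the ids and paths below are illustrative assumptions, not values from the repository):

```python
import json

# Example value for the -mi option: marker id -> AOI scene .obj path
# (ids and paths are illustrative assumptions).
mi_argument = '{"0": "export/screen.obj", "1": "export/keyboard.obj"}'

# Parse the JSON mapping and normalize marker identifiers to integers,
# so each detected marker id can be looked up directly.
marker_scenes = {int(marker_id): scene_path
                 for marker_id, scene_path in json.loads(mi_argument).items()}
```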
-- Track ArUco markers into a Tobii camera video segment (-s SEGMENT_PATH) into a time range selection (-r IN OUT). Load aoi scene .obj file related to each marker (-mi MARKER_ID, PATH_TO_AOI_SCENE), position each scene virtually relatively to its detected ArUco markers then project the scene into camera frame. Export aoi video and data as a aruco_aoi.csv, aruco_aoi.mp4 files:
+Track ArUco markers in a Tobii camera video segment (-s SEGMENT_PATH) within a time range selection (-r IN OUT). Load the AOI scene .obj file related to each marker (-mi MARKER_ID, PATH_TO_AOI_SCENE), position each scene virtually relative to its detected ArUco markers, then project the scene into the camera frame. Export AOI video and data as aruco_aoi.csv and aruco_aoi.mp4 files saved into the segment folder:
```
python ./src/argaze/utils/tobii_segment_aruco_aoi_export.py -s SEGMENT_PATH -c export/tobii_camera.json -r IN OUT -ms 5 -mi '{"MARKER_ID":"PATH_TO_AOI_SCENE.obj",...}'
```