Mixed Reality Eye Tracking
Updated for version 1.9.9 (contact support@worldviz.com if you need the latest version)
To use eye tracking on top of real-world objects, see the example script MixedReality_EyeTracking.py in the ExampleScripts - Mixed Reality folder.
Keys:
Spacebar - start and stop data collection
Trigger - grab regions
t - print region 1 position
y - print region 2 position
r - reset position
l - hide all regions
o - toggle origin
c, g - measuring tape start and end points (desktop)
RH grip and RH B button - measuring tape start and end points (headset; see the distance sketch after this list)
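As a rough sketch of what the measuring tape computes (illustrative, not SightLab's actual implementation), the distance between the two marked points gives a real-world dimension you can use when setting a region's scale. Here the main viewpoint position stands in for the controller position used in the headset:

import viz
import vizact
import vizmat

viz.go()

# Record a start point with 'c' and an end point with 'g', then print the
# distance between them once both are set.
points = {}

def markPoint(label):
    points[label] = viz.MainView.getPosition()
    if 'start' in points and 'end' in points:
        print('Measured distance (m):', vizmat.Distance(points['start'], points['end']))

vizact.onkeydown('c', markPoint, 'start')
vizact.onkeydown('g', markPoint, 'end')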
To collect eye tracking data on real-world objects and then use visualizations such as a heatmap, follow these steps:
Edit MR_Config.py in the ExampleScripts - Mixed Reality folder by right-clicking and choosing "Edit", or open it in Vizard.
Adjust these options:
Depending on how many regions and objects you want to add, copy and paste the region code (i.e., copy Region2Name, Region2Position, and Region2Scale and create new entries in the config file). You can then change the names ('table', 'floor', etc.) to whatever you want. Two regions are included by default; a sketch of adding a third is shown below.
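For example, the entries in MR_Config.py might look like this (a minimal sketch: only the Region2Name/Region2Position/Region2Scale naming pattern comes from the docs; all values and the extra region are illustrative):

# Two regions ship with the example config (values here are illustrative).
Region1Name = 'table'
Region1Position = [0.0, 0.75, 1.2]   # meters; printed with the 't' key
Region1Scale = [1.5, 0.05, 0.8]      # match the real object's dimensions

Region2Name = 'floor'
Region2Position = [0.0, 0.0, 0.0]    # printed with the 'y' key
Region2Scale = [4.0, 0.01, 4.0]

# To track a third object, copy a block and increment the index:
Region3Name = 'door'
Region3Position = [2.0, 1.0, 0.0]
Region3Scale = [0.9, 2.0, 0.05]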
Run the script and use the left and right trigger buttons on the controller to grab the regions and place them over the real-world objects you want to track. Press the 't' and 'y' keys to print each region's position and orientation in the interactive window in the Vizard editor; these are the values you paste into the config file so the regions overlay the real-world objects. You can adjust the scale manually if you know the dimensions of the real-world objects, or use the virtual measuring tool.
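A minimal sketch of how such a print hotkey can be wired up in Vizard (the example script already does this for you; this is just to illustrate the API, and the region model is a stand-in):

import viz
import vizact

viz.go()

# Stand-in region node; in the example script the regions are the grabbable
# boxes you position over real-world objects.
region1 = viz.addChild('box.wrl')

def printRegion(region, name):
    # Round the values so they are easy to paste into MR_Config.py.
    pos = [round(v, 3) for v in region.getPosition()]
    ori = [round(v, 3) for v in region.getEuler()]
    print(name, 'position:', pos, 'euler:', ori)

vizact.onkeydown('t', printRegion, region1, 'Region1')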
If you plan to run multiple participants and want to keep the positioning stable, it's a good idea to place a marker or put some tape down on the ground where your starting point will be. There is a virtual object that shows the origin.
Use the 'o' key to toggle the virtual origin object on or off.
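In Vizard, toggling a node's visibility is a one-liner; a sketch of how the 'o' binding could work (the origin model here is a placeholder):

import viz
import vizact

viz.go()

# Stand-in model for the virtual origin object.
originMarker = viz.addChild('box.wrl')

# viz.TOGGLE flips the node's current visibility each time 'o' is pressed.
vizact.onkeydown('o', originMarker.visible, viz.TOGGLE)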
If gaze-based ID is on, you will see 3D text names on top of the objects when they are viewed for longer than the gaze time threshold (500 ms by default). Views are then collected alongside the other eye tracking data in the data folder.
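The dwell logic can be sketched as a simple timer: accumulate time while the gaze intersects an object and show its label once the threshold is crossed. This is illustrative pseudologic, not SightLab's actual code; the view-center ray stands in for the real gaze ray:

import viz
import vizact

viz.go()

GAZE_TIME_THRESHOLD = 0.5  # seconds (500 ms default)

target = viz.addChild('box.wrl')
label = viz.addText3D('table', pos=[0, 1.5, 0])
label.visible(False)

dwell = 0.0

def checkGaze():
    global dwell
    # Cast a ray through the center of the view as a gaze stand-in.
    line = viz.MainWindow.screenToWorld([0.5, 0.5])
    info = viz.intersect(line.begin, line.end)
    if info.valid and info.object is target:
        dwell += viz.elapsed()
        if dwell >= GAZE_TIME_THRESHOLD:
            label.visible(True)  # show the 3D text name once dwell exceeds threshold
    else:
        dwell = 0.0
        label.visible(False)

vizact.ontimer(0, checkGaze)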
You can also view a full Session Replay with heatmaps, scan paths, and overlays by running SessionReplay_passthrough.py in the MixedReality folder.
To view the replay in passthrough, choose "OpenXR". You will also need to edit the attention_map.py file in sightlab_utils (you may have to turn on write permissions) and change line 33 to "gl_FragColor.a = 0.5;" in order to see the passthrough scene in the session replay.
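For context, that line sits inside the heatmap's fragment shader; a sketch of how it might appear in attention_map.py (the surrounding structure is assumed, only the replacement line itself comes from the docs):

# Sketch of the fragment shader embedded as a Python string in attention_map.py.
# An alpha below 1.0 lets the passthrough scene show through the heatmap overlay.
fragment_shader = """
uniform sampler2D heatmap;
varying vec2 uv;
void main()
{
    gl_FragColor = texture2D(heatmap, uv);
    gl_FragColor.a = 0.5;  // line 33: alpha 0.5 makes the heatmap translucent
}
"""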