Session Replay 

To view the tracking data after a simulation has run, start the Session Replay file (also referred to as "SightLab Replay"). You will be presented with a list of options for session playback. Note that you must first run an experiment session for the session replay to show anything. For 360 video/photo playback, the viewpoint will be from a fixed position. Press the spacebar to start/stop and 'r' to reset.


The playback can be viewed either in desktop mode or in a headset if you choose "Meta, SteamVR or OpenXR".

Note: Session Replay uses a "Free Camera" mode, meaning you can use the WASD and ZX keys to move the camera around, or use the VR controls or walk freely if viewing in a headset. 

Trial Number - Select which of the trials in your session to replay.

Open Data Folder - Opens the folder containing the data files associated with this session.

Avatar - Toggle visualization of the avatar (or avatars if multi-user). The head and hands can be toggled separately.

Gaze Visualizations - Toggle the gaze point, gaze ray, and dwell spheres. Dwell spheres are objects that change in color and size depending on the time spent dwelling on an object or region of interest.

Fixation Spheres - Toggle the viewing of fixation spheres. These objects mark when the user was in fixation, detected using the identification by dispersion method (by default, less than one degree of angular dispersion held for more than 100 ms; the thresholds can be modified). See the page on eye tracking metrics for more detail, and the conceptual sketch after this list. Click "Static" to view all fixations at once.

Heat Map - View a dynamic heatmap of the gaze data. "Static" shows all of it at once. Checking "Occlusion" produces a heatmap that accurately reflects occlusion by objects in the scene (such as walls). Adjust the sliders to control how the heatmap looks.

Scan Path - Toggle a scan path of the scene using either a gaze path or a dispersion path, shown either statically or dynamically.

Console - Toggle the various text overlays on the replay.

Views - Toggle between a first and third person view in 3D model mode. For 360 media, this toggles whether the view follows the user's recorded head movement or you can look around freely.

Reset to Default - Resets all the settings to default. 
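For reference, here is a simplified, illustrative sketch of dispersion-based fixation detection. This is not SightLab's internal implementation; the data structure, function names, and windowing logic are assumptions for illustration, with the thresholds mirroring the defaults mentioned above (1 degree, 100 ms).

import math

DISPERSION_DEG = 1.0   # maximum angular spread allowed within a fixation (assumed default)
MIN_DURATION = 0.1     # minimum fixation duration in seconds (assumed default)

def angle_deg(v1, v2):
    """Angle in degrees between two unit-length gaze direction vectors."""
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v1, v2))))
    return math.degrees(math.acos(dot))

def detect_fixations(samples):
    """samples: list of (timestamp, direction) pairs with normalized directions.
    Returns (start_time, end_time) for each detected fixation (the trailing
    window at the end of the data is ignored in this simplified sketch)."""
    fixations = []
    start = 0
    for i in range(1, len(samples)):
        # Dispersion of the current window: widest angle between any two samples
        window = [s[1] for s in samples[start:i + 1]]
        dispersion = max(angle_deg(a, b) for a in window for b in window)
        if dispersion > DISPERSION_DEG:
            # Spread threshold exceeded; keep the previous window as a fixation
            # if it lasted long enough, then start a new window here
            if samples[i - 1][0] - samples[start][0] >= MIN_DURATION:
                fixations.append((samples[start][0], samples[i - 1][0]))
            start = i
    return fixations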

Use the slider along the bottom to scrub through the playback.


Additional keys:

If using a headset, the default controls apply, except in the case of the Vive Pro/2/Vive Pro Eye, where you hold down the right system button (the one above the trackpad) while using the left trackpad to move, and use the left system button for the arc teleport. For the other headsets, navigation uses their defaults.

There is also an option for an aggregated heatmap, which can show all the data in the folder, the data for a specific session, or the data for a specific condition.

As of version 1.9.9 you can also view a walk path, found in the ExampleScripts folder. This feature is in preview mode and may affect loading time. It can be toggled on or off with the 'p' key.

Noise Filter

The main purpose of this function is to smooth the eye tracking data by averaging the gaze points over a specified window of time. This helps to reduce the impact of random noise or fluctuations in the eye tracking data, resulting in more stable and reliable gaze points. The moving average algorithm works by maintaining a rolling window of the most recent gaze points and calculating the average for each new point as it becomes available. 

replay = Replay.SightLabReplay(noiseFilter = True)
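For illustration, the smoothing concept resembles a rolling-window average like the minimal sketch below. This is not SightLab's internal code; the window size, function name, and gaze point format are arbitrary assumptions.

from collections import deque

WINDOW_SIZE = 10                      # arbitrary example window length
window = deque(maxlen=WINDOW_SIZE)    # holds only the most recent gaze points

def smooth(gaze_point):
    """gaze_point: an (x, y, z) tuple; returns the average over the current window."""
    window.append(gaze_point)
    return tuple(sum(axis) / len(window) for axis in zip(*window))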

Data Directory

If you have a custom location for your data files, you can use the "dataDirectory" attribute to give the path to your custom data folder.

replay = Replay.SightLabReplay(dataDirectory = 'data')
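For example, to point at a data folder elsewhere on disk (hypothetical path):

replay = Replay.SightLabReplay(dataDirectory = 'C:/Studies/MyExperiment/data')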

Advanced Configuration/Settings

Change the look and settings for the scan path. Add this code to the SightLabVR_Replay script and adjust as needed:

replay.replayVisualisations.scanPathConfig = {
    POINT_COLOR: viz.GREEN,
    POINT_SIZE: 2,
    LINE_COLOR: viz.BLUE,
    LINE_WIDTH: 2
}

Set position

replay.transportNode.setPosition([x,y,z])

Accessing the Environment in Session Replay

env = replay.getEnvironmentObject()
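If the returned environment object is a standard Vizard node (an assumption here), the usual node methods apply, for example:

env.visible(viz.OFF)  # hide the environment, e.g. to inspect the gaze data on its own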

Accessing Scene Objects

screen = replay.sceneObjects[GAZE_OBJECTS]['screen']

Matching up 2D media playback on the Replay

360 Video will automatically synchronize with the replay. To get traditional video to match up (from a virtual screen for instance), use this code:

def reset():
    video.setTime(0)
    video.play()

def play():
    video.play()

def pause():
    video.pause()

def scrub(time):
    video.setTime(time)

viz.callback(REPLAY_RESET, reset)
viz.callback(REPLAY_PLAYING, play)
viz.callback(REPLAY_PAUSED, pause)
viz.callback(REPLAY_SCRUB, scrub)
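The video object above is assumed to have been created earlier in the replay script and applied to the virtual screen. A minimal, hypothetical setup (the file path is a placeholder) might look like this:

import viz

video = viz.addVideo('media/sample_video.mp4')         # placeholder path
screen = replay.sceneObjects[GAZE_OBJECTS]['screen']   # virtual screen from the scene
screen.texture(video)                                  # apply the video as a texture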

Loading older SightLab replay files (SightLab 1.x)

replay = Replay.SightLabReplay(sightlab_1_9 = True)

Setting playback of custom functions and Events

In your main script, use sightlab.setCustomReplayFlag('name of flag', variable_to_track), calling it from a function whenever you want to record a change to that variable, triggered by something like a key press or any other event.

# Example (in the main script):
import random

def changeColor():
    color_list = [viz.RED, viz.BLUE, viz.GREEN, viz.YELLOW]
    target_color = random.choice(color_list)
    basketball.color(target_color)
    sightlab.setCustomReplayFlag('color', target_color)

vizact.onkeydown('a', changeColor)

Then, in the replay script, add code similar to this:

import vizact

def changeColor():
    basketball = replay.sceneObjects[GAZE_OBJECTS]['basketball']
    flag_data = replay.getCurrFlag()
    if "color" in flag_data:
        print(flag_data)
        basketball.color(flag_data["color"])

vizact.onupdate(0, changeColor)