How it works

In phase 2 of our live volumetric R&D lab, our main technology focus is live capture rather than fidelity. This meant we needed two separate spaces: one for volumetric live recording, and another for the audience to experience the work remotely through VR headsets.

Our capture space is 2x2 metres, and we use an array of calibrated depth sensors networked together. The depth data is triangulated to generate a 3D mesh, which is then processed, composited with the overlaid video texture, and streamed into the game engine in real time.
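The fusion step above can be sketched in miniature. This is a hypothetical illustration, not the lab's actual code: the function names, the per-sensor calibration format (a rotation matrix and translation vector), and the simple point lists are all assumptions. It shows only the first stage, mapping each calibrated sensor's points into one shared capture space before meshing and texturing.

```python
def transform(points, R, t):
    """Map sensor-local 3D points into the shared capture space
    using that sensor's calibration (rotation R, translation t)."""
    return [
        tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3))
        for p in points
    ]

def fuse_sensors(frames, calibrations):
    """Merge each sensor's point cloud into one cloud in shared space."""
    cloud = []
    for sensor_id, points in frames.items():
        R, t = calibrations[sensor_id]  # hypothetical calibration lookup
        cloud.extend(transform(points, R, t))
    return cloud
```

In the real pipeline this fused cloud would then be triangulated into a mesh and composited with the video texture once per frame, fast enough for the game engine to render live.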

In this iteration of the lab, the experience space was equipped with two networked HTC Vive headsets that were placed in the same instance.
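Placing two headsets in one shared instance means each client needs to see the other's latest head pose. A minimal sketch of that relay, under assumptions of ours (the `SharedInstance` class, client IDs, and the position/rotation pose format are all illustrative, not the lab's networking code):

```python
class SharedInstance:
    """Minimal relay: each headset publishes its pose, reads the others'."""

    def __init__(self):
        self.poses = {}  # client_id -> latest (position, rotation) pose

    def publish(self, client_id, pose):
        """Record this client's most recent head pose."""
        self.poses[client_id] = pose

    def peers(self, client_id):
        """Latest poses of every other participant in the instance."""
        return {cid: p for cid, p in self.poses.items() if cid != client_id}
```

In practice each headset would publish at the display refresh rate over the network, and render the other participant from the most recent pose it has received.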



Ben has produced multiple XR applications for clients in Sheffield and London, leading small teams of creatives and technologists to create unique digital experiences for the arts, marketing and healthcare sectors. He has a First Class (Hons) degree in Drama from Kingston University and is currently developing a new digital course for National Youth Theatre, which will run later this year.