Two main strands of innovation are being explored in this Accelerator:
1. Creative Innovation:
How can broadcast experiences progress from flat, linear, proscenium representations to real-time, 6DOF, emergent and interactive representations of curated (narrative) as well as ‘real-world’ events?
The consortium will draw on its members' diverse experience in immersive theatre production, videogame development, cinema, music and live events production to explore how next-gen 6DOF spatial broadcast experiences might be created, staged and distributed.
2. Technical Innovation:
How can XR technology be leveraged to enable realistic shared first-person experiences for groups of people? How can XR technology be utilised to bridge the digital divide between IRL (‘in real life’) and online experiences?
A number of technologies are being combined and deployed to produce the proof-of-concept experiences:
- Positional tracking: real-time positional capture technology (external camera tracking) and consumer head-tracking using internal camera systems (webcam tracking).
- Multiplayer game-engine rendering: real-time production and rendering using game-engine technology
- Volumetric audio synthesis: MagicBeans’ volumetric audio synthesis platform Roundhead provides the tech framework and audio synthesis models that will allow audiences to freely move around and interact with the volumetric scenes.
- Acoustic modelling technology: braud.io's patented artificial reverberation and room acoustic simulation technology provide realistic room simulation integrated into Roundhead.
- Audio Definition Model (ADM): exploring to what extent current standardised metadata models for describing the technical properties of audio can be utilised for 6DOF spatial broadcasts.
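To make the combination of positional tracking and volumetric audio concrete, the sketch below shows a toy 6DOF audio cue: given a listener position (as a head-tracker might report it) and a sound-object position, it derives an inverse-distance gain and a horizontal azimuth. The function name, parameters and simple 1/r law are illustrative assumptions only; a real renderer such as Roundhead or braud.io's acoustic simulation models room reflections, occlusion and binaural cues far beyond this.

```python
import math

def object_gain_and_azimuth(listener_pos, object_pos, ref_dist=1.0, min_dist=0.1):
    """Toy 6DOF cue: inverse-distance gain and horizontal azimuth of a
    sound object relative to a listener facing +z. Illustrative only;
    this is not the Roundhead API."""
    dx = object_pos[0] - listener_pos[0]
    dy = object_pos[1] - listener_pos[1]
    dz = object_pos[2] - listener_pos[2]
    dist = max(math.sqrt(dx * dx + dy * dy + dz * dz), min_dist)
    gain = min(ref_dist / dist, 1.0)            # clamped 1/r attenuation
    azimuth = math.degrees(math.atan2(dx, dz))  # 0 degrees = straight ahead
    return gain, azimuth

# Listener at the origin; object 2 m ahead and 2 m to the right.
g, az = object_gain_and_azimuth((0, 0, 0), (2, 0, 2))
print(round(g, 3), round(az, 1))  # 0.354 45.0
```

Recomputing these values each frame from the tracked head position is what lets the audience move freely through a volumetric scene while the audio stays spatially anchored.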
This proof of concept explores a future in which highly precise geolocated content can be deployed for audiences who are free to explore it as they see fit.
It seeks to understand how virtual content connected to physical spaces will affect linear workflows, and how broadcast content might be adapted to satisfy demands for both real-time and linear outputs.