6DOF Audio-Led Narrative and Music Experiences in the Metaverse

Creating first-person-perspective music and narrative experiences that audiences are free to move around and interact with, and understanding how 6DOF (six degrees of freedom) audio content in the metaverse will impact content creation and delivery. This Challenge also explored interactive sound experiences and took a deep dive into the Audio Definition Model (ADM).


Champions: BBC, Paramount, King's College London

Participants: Magic Beans, TuneURL, White Light

The Challenge Objective:

The objectives of this Challenge started from the core premise that wearable tech and geo-located content (‘the real-world metaverse’) are set to become crucial drivers of content innovation.

It sought to answer some of the questions that arise from this understanding:

* What happens when audiences are free to move around audiovisual content as they see fit, and how can broadcasters create compelling interactive experiences that are manageable across new and old distribution chains?

* How can broadcast content be created and adapted for a world that enables such 6DOF content, and how can this material be utilised downstream in more traditional workflows - as reusable transmedia assets that complement ‘real-world’ experiences with versions deployable across more standard distribution platforms, allowing broadcast audiences to interact with the content?

This 2022 Challenge built on themes first explored in the 2021 IBC Accelerator ‘Immersive Audio and Sound Imagery’ project, where the consortium - including Audible/Amazon, BBC, King's College London/Braud.io, Omnilive, MagicBeans, CTOiC and Twickenham Film Studios - explored how immersive audio (via spoken word and classical music) can best be deployed to increase engagement and immersion for audiences using fixed and dynamic (head-tracked) binaural audio.

The Aim of the POC:

This Accelerator sought to explore the challenge objectives through the creation of a multi-user location-based audiovisual experience that allows an audience to explore a 6DOF audio experience that is overlaid onto a physical set.
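To make the 6DOF idea concrete, the sketch below shows how a tracked listener pose (position plus head rotation) turns a world-anchored sound source into the listener-relative direction, distance and gain a spatial renderer needs. It is a minimal illustration assuming a y-up coordinate system, yaw-only head rotation and a simple inverse-distance gain law; none of these details are taken from the project's actual pipeline.

```python
import numpy as np

def yaw_rotation(yaw_rad: float) -> np.ndarray:
    """Rotation about the vertical (y-up) axis; pitch and roll omitted for brevity."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

def listener_relative(source_pos, listener_pos, listener_yaw_rad):
    """Return (azimuth_deg, elevation_deg, distance_m, gain) of a world-anchored
    source as heard by a listener who can translate and rotate freely."""
    # Move into the listener's frame: subtract the position, undo the rotation.
    rel = yaw_rotation(listener_yaw_rad).T @ (np.asarray(source_pos, float) -
                                              np.asarray(listener_pos, float))
    x, y, z = rel                                   # x: right, y: up, z: forward (assumed)
    distance = float(np.linalg.norm(rel))
    azimuth = float(np.degrees(np.arctan2(x, z)))   # 0 deg straight ahead, positive to the right
    elevation = float(np.degrees(np.arcsin(y / max(distance, 1e-6))))
    gain = 1.0 / max(distance, 1.0)                 # simple inverse-distance attenuation
    return azimuth, elevation, distance, gain

# Example: a source 2 m ahead and 1 m to the left in world space,
# heard by a listener at the origin who has turned 90 degrees to the right.
az, el, dist, gain = listener_relative((-1.0, 0.0, 2.0), (0.0, 0.0, 0.0), np.radians(90))
print(round(az, 1), round(el, 1), round(dist, 2), round(gain, 2))
```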

A bespoke multi-user narrative experience was created, alongside a reimagined live music performance - which was captured and virtually reconstructed in this new physical space.

A virtual ‘digital twin’ of the core location-based experience was then created - providing a ‘metaverse-style’ representation of the location-based experience that audiences can explore in the cloud and on their own devices.

This POC also explored ways to repurpose the core 6DOF content for more traditional broadcast outputs. Linear, stereo and fixed binaural outputs were created and tested, bringing aspects of the core location-based experience (LBE) to life for audiences ‘at home’ and letting them engage with broadcast content in the moment, in an easy and entertaining way.
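As one hedged illustration of such repurposing (not the Accelerator's actual downmix chain), the sketch below folds a handful of positioned audio objects into a fixed stereo output using a constant-power pan law driven by each object's azimuth.

```python
import numpy as np

def constant_power_pan(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Pan a mono signal to stereo with an equal-power law.
    Azimuth is clamped to +/-90 degrees: 0 = centre, +90 = hard right."""
    az = np.clip(azimuth_deg, -90.0, 90.0)
    theta = (az + 90.0) / 180.0 * (np.pi / 2.0)   # map -90..+90 deg onto 0..pi/2
    left, right = np.cos(theta), np.sin(theta)    # L^2 + R^2 == 1 at every angle
    return np.stack([mono * left, mono * right], axis=-1)

def stereo_downmix(objects):
    """objects: iterable of (mono_samples, azimuth_deg) pairs -> (N, 2) stereo mix."""
    objects = list(objects)
    length = max(len(sig) for sig, _ in objects)
    mix = np.zeros((length, 2))
    for sig, az in objects:
        sig = np.asarray(sig, dtype=float)
        mix[: len(sig)] += constant_power_pan(sig, az)
    return mix

# Example: a narrator at centre and an instrument 45 degrees to the left.
sr = 48_000
t = np.linspace(0.0, 1.0, sr, endpoint=False)
stereo = stereo_downmix([(0.2 * np.sin(2 * np.pi * 220 * t), 0.0),
                         (0.2 * np.sin(2 * np.pi * 440 * t), -45.0)])
print(stereo.shape)   # (48000, 2)
```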

Other outputs that retain elements of the original immersive intent and can be distributed to standard consumer devices were also explored - including multi-device audio orchestration and audio-triggered interactivity.

The Innovation:

Three main strands of innovation were explored in this Accelerator:

1. Creative Innovation:
How can broadcast experiences progress from flat, linear, proscenium representations to real-time, 6DOF, emergent and interactive representations of curated (narrative) as well as ‘real-world’ events?

The consortium drew on the group’s diverse experience in immersive theatre production, videogame development, cinema, music and live events production to explore how next-gen 6DOF spatial broadcast experiences might be created, staged and distributed.

2. Technical Innovation:
How can XR technology be leveraged to enable realistic shared first-person experiences for groups of people? How can XR technology be utilised to bridge the divide between IRL (‘in real life’) and online experiences? How can interactive audio technology be utilised to bridge the divide between broadcasting and online experiences?

A number of technologies were combined and deployed to produce the proof-of-concept experiences:
* Positional tracking: real-time positional capture technology (external camera tracking) and consumer head-tracking using internal camera systems (webcam tracking).
* Multiplayer game-engine rendering: real-time production and rendering using game-engine technology.
* Volumetric audio synthesis: MagicBeans' volumetric audio synthesis platform Roundhead provided the tech framework and audio synthesis models that allowed audiences to freely move around and interact with the volumetric scenes.
* Acoustic modelling technology: braud.io's patented artificial reverberation and room acoustic simulation technology provided realistic room simulation integrated into Roundhead.
* Audio Definition Model (ADM): exploring to what extent current standardised metadata models for describing the technical properties of audio can be utilised for 6DOF spatial broadcasts (a minimal metadata sketch follows this list).
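The sketch below shows the kind of object metadata the ADM describes, using element and attribute names from ITU-R BS.2076. It is a hand-written, illustrative fragment built in Python rather than a validated ADM document, and it omits most of the model (audioProgramme, audioContent, audioPackFormat, track UIDs and so on).

```python
import xml.etree.ElementTree as ET

# One object-type audioChannelFormat with a single positional metadata block.
channel = ET.Element("audioChannelFormat", {
    "audioChannelFormatID": "AC_00031001",
    "audioChannelFormatName": "NarratorObject",   # hypothetical object name
    "typeLabel": "0003",
    "typeDefinition": "Objects",
})
block = ET.SubElement(channel, "audioBlockFormat", {
    "audioBlockFormatID": "AB_00031001_00000001",
    "rtime": "00:00:00.00000",
    "duration": "00:00:05.00000",
})
# Polar position of the object for this time block (degrees / normalised distance).
for coord, value in (("azimuth", "-30.0"), ("elevation", "0.0"), ("distance", "1.0")):
    ET.SubElement(block, "position", {"coordinate": coord}).text = value
ET.SubElement(block, "gain").text = "1.0"

ET.indent(channel)                                # requires Python 3.9+
print(ET.tostring(channel, encoding="unicode"))
```

ADM positions like these are conventionally expressed around a single reference listening point, which is one reason the Challenge asked how far the model stretches once listeners can translate freely through a scene.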

3. Interactive Audio:
TuneURL's patented technology was used to enable test audiences to react in the moment to broadcast content (e.g., video, FM radio, podcasts) with a simple, safe second-screen interaction, while capturing engagement and attribution data on each call-to-action, using Paramount's Isle of MTV music content.
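TuneURL's patented method is not described in this document, so the sketch below illustrates only the general concept behind audio-triggered interactivity: a second-screen device matching a known trigger snippet against its microphone buffer with normalised cross-correlation. It is a stand-in technique and hypothetical code, not TuneURL's implementation or API.

```python
import numpy as np

def detect_trigger(mic: np.ndarray, trigger: np.ndarray, threshold: float = 0.6):
    """Return (found, offset) for the best normalised match of `trigger`
    inside the microphone buffer `mic`."""
    mic = np.asarray(mic, dtype=float)
    trigger = np.asarray(trigger, dtype=float)
    n = len(trigger)
    t_norm = np.linalg.norm(trigger)
    best_score, best_offset = 0.0, -1
    for offset in range(len(mic) - n + 1):
        window = mic[offset:offset + n]
        w_norm = np.linalg.norm(window)
        if w_norm == 0.0:
            continue
        score = float(np.dot(window, trigger) / (w_norm * t_norm))
        if score > best_score:
            best_score, best_offset = score, offset
    return best_score >= threshold, best_offset

# Example: hide a chirp-like trigger inside low-level noise and find it again.
rng = np.random.default_rng(0)
trigger = np.sin(2 * np.pi * np.linspace(0.0, 20.0, 2_000) ** 2)
mic = 0.05 * rng.standard_normal(20_000)
mic[7_000:9_000] += trigger
print(detect_trigger(mic, trigger))   # expected: (True, 7000)
```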

Additional Innovation:
This POC explored the future of content delivery, where highly precise geolocated content can be deployed for audiences who are free to explore it as they see fit.
It sought to understand how virtual content connected to physical spaces will affect linear workflows, and how broadcast content might be adapted to satisfy demands for both real-time and linear outputs.
