6DOF Audio-Led Narrative and Music Experiences in the Metaverse

6DOF (six degrees of freedom) & interactive audio are explored for broadcasting, XR, metaverse and more

Champions: BBC, Paramount, King's College London

Participants: Magic Beans, TuneURL

The Challenge Objective:

This Challenge starts from a core observation: wearable tech is set to become a crucial driver of content innovation. So what happens when audiences are free to move around audiovisual content as they see fit, and how can broadcasters create compelling, interactive experiences that can be managed across new and old distribution chains?

With this in mind, how can broadcast content be created and adapted for a world of wearables that enable such 6DOF content? And how can this material be utilised ‘downstream’, both in more traditional workflows and as reusable transmedia assets that complement the experience in real life and extend across platforms into the Metaverse?

This 2022 Challenge builds on themes first explored in the 2021 IBC Accelerator project ‘Immersive Audio and Sound Imagery’, in which a consortium including Audible/Amazon, BBC, King's College London/Braud.io, Omnilive, MagicBeans, CTOiC and Twickenham Film Studios explored how immersive audio (spoken word and classical music) can best be deployed to increase engagement and immersion for audiences, using fixed and dynamic (head-tracked) binaural audio.

The Aim of the POC:

This Accelerator will address the Challenge objectives by creating a multi-user, location-based audiovisual experience in which audiences can freely navigate a 6DOF audio experience overlaid onto a physical set.

A bespoke multi-user narrative experience will be created, alongside a reimagined live music performance that will be captured and virtually reconstructed in this new physical space.

A virtual ‘digital twin’ of the core location-based experience will then be created, providing a ‘metaverse-style’ representation that audiences can explore in the cloud and on their own devices.

This POC will also explore ways to repurpose the core 6DOF content for more traditional broadcast outputs: linear, stereo and fixed binaural versions will all be created and tested, bringing aspects of the core location-based experience (LBE) to life for audiences ‘at home’.
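
As a concrete illustration of that repurposing step, here is a minimal sketch that downmixes a scene of positioned mono objects to a fixed-listener stereo bed using a simple 1/r distance law and constant-power panning. All names and the gain model are illustrative assumptions, not the project's actual rendering pipeline.

```python
import numpy as np

def render_fixed_stereo(objects, listener_pos=(0.0, 0.0)):
    """Downmix positioned mono objects to a fixed-listener stereo bed.

    `objects` is a list of (signal, (x, y)) pairs: a mono numpy array and a
    position in metres, with the listener at `listener_pos` facing +y.
    Uses a 1/r distance law and constant-power panning; rear sources simply
    fold to the nearer side. Illustrative only.
    """
    n = max(len(sig) for sig, _ in objects)
    out = np.zeros((n, 2))
    for sig, (x, y) in objects:
        dx, dy = x - listener_pos[0], y - listener_pos[1]
        dist = max(np.hypot(dx, dy), 0.5)                       # clamp near field
        azimuth = np.arctan2(dx, dy)                            # 0 rad = dead ahead
        pan = (np.clip(azimuth / (np.pi / 2), -1, 1) + 1) / 2   # 0 = L, 1 = R
        out[:len(sig), 0] += sig * np.cos(pan * np.pi / 2) / dist
        out[:len(sig), 1] += sig * np.sin(pan * np.pi / 2) / dist
    peak = max(np.max(np.abs(out)), 1.0)                        # attenuate only if clipping
    return out / peak
```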

Other outputs that retain elements of the original immersive intent and can be distributed to standard consumer devices will also be explored, including multi-device audio orchestration and audio-triggered interactivity.
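
For audio-triggered interactivity, a hedged sketch: the Goertzel algorithm scans a mono buffer for a pilot tone and reports the blocks where it appears. The tone frequency and threshold are illustrative assumptions; production systems (TuneURL's among them) typically rely on more robust inaudible markers than a plain tone.

```python
import numpy as np

def goertzel_power(block, sr, freq):
    """Return the power of `freq` (Hz) in `block` via the Goertzel algorithm."""
    k = round(len(block) * freq / sr)          # nearest DFT bin
    coeff = 2.0 * np.cos(2.0 * np.pi * k / len(block))
    s_prev, s_prev2 = 0.0, 0.0
    for x in block:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2**2 + s_prev**2 - coeff * s_prev * s_prev2

def detect_trigger(audio, sr=48000, freq=19000.0, block=1024, threshold=1e4):
    """Yield block indices where the (assumed) trigger tone exceeds threshold."""
    for i in range(0, len(audio) - block, block):
        if goertzel_power(audio[i:i + block], sr, freq) > threshold:
            yield i // block
```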

The Innovation:

Two main strands of innovation are being explored in this Accelerator:

1. Creative Innovation:
How can broadcast experiences progress from flat, linear, proscenium representations to real-time, 6DOF, emergent and interactive representations of curated (narrative) as well as ‘real-world’ events?

The consortium will draw on the group’s diverse experience in immersive theatre production, videogame development, cinema, music and live events production to explore how next-gen 6DOF spatial broadcast experiences might be created, staged and distributed.

2. Technical Innovation:
How can XR technology be leveraged to enable realistic shared first-person experiences for groups of people? How can it be utilised to bridge the divide between IRL (‘in real life’) and online experiences?

A number of technologies are being combined and deployed to produce the proof-of-concept experiences:
  • Positional tracking: real-time positional capture (external camera tracking) and consumer head-tracking using internal camera systems (webcam tracking).
  • Multiplayer game-engine rendering: real-time production and rendering using game-engine technology.
  • Volumetric audio synthesis: MagicBeans’ volumetric audio synthesis platform Roundhead provides the tech framework and audio synthesis models that allow audiences to freely move around and interact with the volumetric scenes (see the listener-update sketch after this list).
  • Acoustic modelling technology: braud.io's patented artificial reverberation and room acoustic simulation technology provides realistic room simulation integrated into Roundhead.
  • Audio Definition Model (ADM): to what extent can current standardised metadata models for describing the technical properties of audio be utilised for 6DOF spatial broadcasts? (An illustrative metadata sketch follows below.)
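
To make the volumetric audio item above concrete, here is a minimal per-frame listener update: as a tracked listener moves, each source's gain and relative azimuth are recomputed and handed to a spatialiser. The function name, coordinate conventions and 1/r gain law are illustrative assumptions, not Roundhead's actual API.

```python
import numpy as np

def update_sources(listener_pos, listener_yaw, sources):
    """Recompute per-source gain and relative azimuth for a moving listener.

    `listener_pos` is (x, y) in metres, `listener_yaw` the facing angle in
    radians (0 = facing +y), and `sources` maps a source id to its (x, y)
    position. Returns {source_id: (gain, relative_azimuth)}.
    """
    params = {}
    for sid, (sx, sy) in sources.items():
        dx, dy = sx - listener_pos[0], sy - listener_pos[1]
        dist = max(np.hypot(dx, dy), 0.5)            # clamp near-field distance
        world_az = np.arctan2(dx, dy)                # angle from +y axis
        rel_az = (world_az - listener_yaw + np.pi) % (2 * np.pi) - np.pi
        params[sid] = (1.0 / dist, rel_az)           # simple 1/r gain law
    return params

# Example frame: listener at the origin facing +y, two sources in the scene.
print(update_sources((0.0, 0.0), 0.0, {"cello": (1.0, 1.0), "voice": (-2.0, 0.0)}))
```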
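
For the ADM question above, a simplified sketch of the kind of metadata involved: one audioBlockFormat element carrying a spherical object position, loosely following ITU-R BS.2076. The element and attribute names come from the standard, but the structure is heavily trimmed for illustration and omits the audioObject/audioChannelFormat hierarchy a real ADM document requires.

```python
import xml.etree.ElementTree as ET

def adm_block(block_id, rtime, duration, azimuth, elevation, distance):
    """Build a simplified ADM-style audioBlockFormat with a spherical position."""
    block = ET.Element("audioBlockFormat", audioBlockFormatID=block_id,
                       rtime=rtime, duration=duration)
    for coord, value in (("azimuth", azimuth),
                         ("elevation", elevation),
                         ("distance", distance)):
        pos = ET.SubElement(block, "position", coordinate=coord)
        pos.text = f"{value:.2f}"
    return block

# One metadata block: object 30 degrees left, slightly raised, 2 m away.
print(ET.tostring(adm_block("AB_00031001_00000001", "00:00:00.00000",
                            "00:00:01.00000", 30.0, 5.0, 2.0), encoding="unicode"))
```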

Additional Innovation:

This POC explores a future in which highly precise geolocated content can be deployed for audiences who are free to explore it as they see fit.

It seeks to understand how virtual content connected to physical spaces will affect linear workflows, and how broadcast content might be adapted to satisfy demand for both real-time and linear outputs.