Continuing R&D from the award-winning 2021 Accelerator project that began exploring 5G Location-Based XR, this revived 2023 Accelerator aimed to build on cutting-edge innovations in interactive digital athletes, motion capture, AI, and high-speed edge content delivery for live XR sports workflows in virtual 3D worlds, and to explore potential use cases for OTT platforms.
The original aim of this 2023 Challenge was to broadcast digital twin XR sports: high-speed combat MMA (Mixed Martial Arts) athletes rendered with live, immersive, high-end graphics and spatial and social audio, delivered in real time via edge compute to fans in a 3D virtual world on VR headsets, computers, and mobile devices, and simultaneously to an outdoor Location-Based Experience (LBE) venue, The Outernet in London.
Pitched by Prima Terra Labs at IBC Kickstart Day, the Final POC Results were created by an all-star consortium of Champions and Participants with expertise in sports broadcasting, virtual production, 3D animation, spatial & social audio, edge engineering, academia, and talent identification.
Champions: King's College London, Trinity College Dublin, Prima Terra Labs, Outernet Global, Vodafone Group, University of Southampton, RVPA, Bowie State University
Participants: D&B Solutions, Salsa Sound, Sparkup, Hand, AMD, HearMeCheer, Movrs
Core POC Learnings
Concept and Capture:
MMA (Mixed Martial Arts) athletes were captured in the real world using markerless motion capture technology from Movrs at the White Light/D&B Solutions studios in Reading, UK. 3D digital twins of the athletes, created by Prima Terra Labs, were animated by this live motion capture data and then placed into live, immersive, photorealistic environments designed to emulate the culture in which the sport originated. Additional and crucial animation, logistical, and virtual production support was provided by the Regional Virtual Production Academy (RVPA), White Light/D&B Solutions, and Bowie State University.
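The capture-to-avatar step described above can be sketched in miniature: a stream of per-joint rotations from a mocap frame drives a rigged digital twin, with simple blending to smooth jitter and mask dropped frames. All names and the blending scheme here are illustrative assumptions, not the actual Movrs or Prima Terra Labs pipeline.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: these names and this blending scheme are
# illustrative only, not the real Movrs / Prima Terra Labs pipeline.

@dataclass
class Joint:
    name: str
    rotation: tuple = (0.0, 0.0, 0.0)  # Euler angles in degrees

@dataclass
class AvatarRig:
    joints: dict = field(default_factory=dict)

    def apply_frame(self, frame: dict, blend: float = 0.7):
        """Blend an incoming mocap frame into the rig.

        A blend factor below 1.0 keeps part of the previous pose,
        which smooths jitter and masks the occasional dropped frame.
        """
        for name, target in frame.items():
            joint = self.joints.setdefault(name, Joint(name))
            joint.rotation = tuple(
                (1 - blend) * old + blend * new
                for old, new in zip(joint.rotation, target)
            )

rig = AvatarRig()
rig.apply_frame({"left_elbow": (0.0, 0.0, 90.0)})
rig.apply_frame({"left_elbow": (0.0, 0.0, 90.0)})
```

After two identical frames the joint converges toward the target pose rather than snapping to it, which is the usual trade-off between responsiveness and stability in live capture.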
Edge Computing, Connectivity, and Processing Power:
Edge computing and high-speed connections were provided by Vodafone to accommodate the broadcast of the highly detailed 3D virtual world and live combat sports matches to four-story, 4K screens on four surrounding walls at our Location-Based Experience (LBE) venue, The Outernet in London. The processing of this data at the Outernet was enabled by generous hardware support from AMD. In future iterations, the experience will also be delivered to VR headsets, computers, mobile devices, and consoles.
The Immersive Soundscape and Audio Fan Engagement:
To create greater immersion, spatial audio of the live event was captured by Salsa Sound and delivered alongside commentary by two seasoned voices from the combat sports community, Dorian Price (UFC veteran and coach) and Rhett Butler (UFC fight promoter and journalist for the Boxing Writers Association of America). Through the HearMeCheer second-screen app, the real-world audio and cheering of the 'demo fans' could also be heard by the test audience within the app. This delicately balanced soundscape was engineered by a combination of audio professionals and award-winning scholars from our dedicated Spatial Audio workstream group, comprising Salsa Sound, White Light/D&B Solutions, King's College London, the University of Southampton, and HearMeCheer.
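The balancing act described above, combining venue audio, commentary, and remote fan cheering into one program feed, can be sketched as a gain-weighted sum with a hard limiter. The source names and gain values are illustrative assumptions, not the actual Salsa Sound / HearMeCheer signal chain.

```python
# Hypothetical sketch of mixing several live sources into one feed.
# Source names and gains are illustrative, not the real signal chain.

def mix_sources(sources: dict[str, list[float]],
                gains: dict[str, float]) -> list[float]:
    """Sum gain-weighted sample streams, hard-limiting to [-1, 1]."""
    length = max(len(samples) for samples in sources.values())
    out = [0.0] * length
    for name, samples in sources.items():
        gain = gains.get(name, 1.0)
        for i, sample in enumerate(samples):
            out[i] += gain * sample
    # Hard limiter: a real broadcast chain would use a compressor,
    # but clamping shows where the headroom constraint lives.
    return [max(-1.0, min(1.0, s)) for s in out]

program = mix_sources(
    {"venue": [0.4, 0.5], "commentary": [0.5, 0.5], "fans": [0.9, 1.0]},
    {"venue": 0.5, "commentary": 1.0, "fans": 0.2},
)
```

Keeping the crowd feed at a low gain relative to commentary reflects the kind of balance the workstream had to engineer by ear.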
Visual Fan Engagement Broadcast:
Additional fan engagement was provided through a second-screen experience via the Sparkup app, also broadcast onto several of The Outernet's giant, four-story LED walls. It allowed fans to participate by capturing and streaming videos of their reactions, sending emojis and comments to support their favorite fighter, taking part in "Meet the Fighter" activities, and connecting with others in the fan community.
Digital Twins and Talent Identity for Athletes:
Prima Terra Labs captured the likeness of six fighters from the Surrey Martial Arts Club and 21st Century Combat, including Pro K1 fighter, coach, and gym owner David Zetolofsky. Over the course of two days at the White Light/D&B Solutions studio in Reading, UK, the fighters were captured through the Live Link Face for Unreal Engine app and a combination of photos and videos so that the fighters' digital twin MetaHumans could be created. Their new MetaHuman images were then added to the HAND (Human & Digital) talent identity registry database, along with their photographic (legal) images, to provide reliable verification of the fighters across their real, virtual, and fictional identities.
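A registry of the kind described above can be sketched as a mapping from a talent ID to fingerprints of a person's legal image and their MetaHuman asset, so a digital twin can later be checked against the registered original. The record fields and hashing scheme are illustrative assumptions, not HAND's actual schema.

```python
import hashlib

# Hypothetical sketch in the spirit of the HAND registry described
# above; the fields and hashing scheme are illustrative assumptions.

class TalentRegistry:
    def __init__(self):
        self._records = {}

    def register(self, name: str, legal_image: bytes,
                 metahuman_asset: bytes) -> str:
        """Store fingerprints of the legal image and the MetaHuman
        asset under one talent ID, linking person and digital twin."""
        talent_id = hashlib.sha256(
            name.encode() + legal_image).hexdigest()[:12]
        self._records[talent_id] = {
            "name": name,
            "legal_image_hash": hashlib.sha256(legal_image).hexdigest(),
            "metahuman_hash": hashlib.sha256(metahuman_asset).hexdigest(),
        }
        return talent_id

    def verify(self, talent_id: str, metahuman_asset: bytes) -> bool:
        """Check that a digital twin asset matches the registered one."""
        record = self._records.get(talent_id)
        return (record is not None and
                record["metahuman_hash"] ==
                hashlib.sha256(metahuman_asset).hexdigest())

registry = TalentRegistry()
tid = registry.register("David Zetolofsky", b"<photo>", b"<metahuman>")
```

Hashing rather than storing the assets themselves keeps the registry small while still allowing tamper-evident provenance checks.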
Academic White Paper [future delivery]:
To detail our processes and progress on this project, an academic white paper is currently being produced by our academic champions Trinity College Dublin, Bowie State University, the Regional Virtual Production Academy (RVPA), King's College London, and the University of Southampton, led by Assistant Professor Gareth W. Young at Trinity College Dublin and supported by all of 2023's Champions and Participants. Stay posted here for developments in our continuing research and development on this project and the publication date of our white paper.
Core Objectives & Outcomes:
- Real-time motion capture and broadcast of high-speed XR sports (Mixed Martial Arts) with two or more athletes
- Low-to-no latency connectivity and delivery via edge technologies
- Create and use virtual avatars of real-world athletes, augmented with VFX
- High-quality, photorealistic graphics in immersive environments
- Audiences in virtual 3D worlds can spectate and participate on multiple devices (VR headsets, mobiles, computers)
- Athlete IP consideration via interoperable talent ID labeling for provenance verification and automation