Accelerator Project

RESPONSIVE NARRATIVE FACTORY

FINAL SHOWCASE SESSION

 

Watch the Responsive Narrative Factory project's Final Showcase Session in the hero video above, and also via IBC365.  


Delivering customised video experiences at scale. 

The success of future streaming services will depend on how effectively video content is tailored to individual preferences and needs.  

If a viewer feels that a programme was made personally for them, engagement improves and the chances of repeat viewing and personal recommendation increase. The Responsive Narrative Factory Accelerator project demonstrated at IBC 2023 that dynamically aligning rich media metadata with viewing-preference metadata creates highly customised and deeply engaging viewing experiences. The Responsive Narrative Factory approach surpassed previous demonstrations of personalised video, which have had limited industry traction to date.  

Fully realising the potential for customisation at scale requires rich datasets, more efficient ways of generating and transporting metadata, and a modular approach to content production and delivery.  

This project brought together subject experts across the supply chain who collaborated to identify and overcome the many challenges of broadcasting personalised video.  

This proof of concept was delivered using existing CDN technology by a multi-disciplinary team of over 40, including: 

Champions: BBC, IET

Participants: Infuse Video, Metarex, Cuvo.io, JPB Media Solutions, EZDRM


Kickstart Day Pitch:


 

The Objectives

Develop a scalable solution for broadcasters to deliver user-defined storytelling using modular video components. This will improve engagement, value, and accessibility by tailoring content to each viewer. By combining emerging technologies, the project aims to transform custom video delivery from experimental one-off projects into a commercially viable and mainstream approach. The key goals of the project are to achieve the following: 

Increase engagement and value  

Audiences receive greater value by choosing their own viewing experience, which increases demand and engagement time. 

Improve inclusivity 

Dynamically create age-appropriate, inclusive, and accessible content by compiling metadata-tagged video segments. 

Reduce operational costs 

Streamline production and versioning workflows, reducing content duplication and storage usage. 

The Challenges

Customised video experiences have not been widely available because of the limitations of current workflows and technology. To make the transition from linear to multi-version content, the following challenges needed to be addressed. 

Metadata enrichment, transport and use 

More efficient processes and technologies are needed to produce the granular level of metadata required for customisation. A mechanism must be developed to ensure existing metadata is preserved and easily combined with generated metadata. It then needs to be delivered in a format that identifies the video segments aligning with viewing preferences. 
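As a minimal sketch of the idea, preserved production metadata and machine-generated enrichment could be carried together in one per-segment record. The field names below are illustrative only; the project's actual MetaRex schema is not reproduced here.

```python
from dataclasses import dataclass, field

@dataclass
class SegmentMetadata:
    """One temporally bounded segment of a programme.

    Hypothetical structure for illustration; not the MetaRex format.
    """
    start_s: float  # segment start, seconds from programme start
    end_s: float    # segment end
    source: dict = field(default_factory=dict)     # metadata preserved from production
    generated: dict = field(default_factory=dict)  # ML-generated enrichment (text, scenes, images)

    def merged(self) -> dict:
        # Preserved production metadata takes precedence over generated
        # values, so upstream editorial decisions are never overwritten.
        return {**self.generated, **self.source}

seg = SegmentMetadata(
    0.0, 92.5,
    source={"title": "Ospreys at dawn"},
    generated={"title": "birds", "tags": ["wildlife", "birds"]},
)
print(seg.merged())  # generated tags kept, production title wins
```

Merging generated fields first and production fields second is one simple way to honour the requirement that existing metadata is preserved when the two sets are combined.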

Production and workflows 

Production, content management, preparation, and storage workflows need to be adapted to accommodate componentised rather than full-length content. 

Segment compilation and delivery 

Systems and rules need to be developed to combine video segments seamlessly while retaining editorial control over the narrative. 
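One way such rules might work, sketched under assumptions (the project's actual rule engine is not described in detail): segments are filtered by viewer-preference tags, editorially mandatory segments are always kept, and the original broadcast order is preserved so the narrative remains under producer control.

```python
def compile_playlist(segments, preferences,
                     mandatory_tags=frozenset({"narrative-anchor"})):
    """Select segments matching viewer preferences.

    Segments carrying a mandatory tag (e.g. narrative anchors set by the
    editor) are always included, and broadcast order is preserved.
    Illustrative only; tag names and rules are hypothetical.
    """
    playlist = []
    for seg in segments:  # segments arrive in editorial order
        tags = set(seg["tags"])
        if tags & mandatory_tags or tags & preferences:
            playlist.append(seg)
    return playlist

segments = [
    {"id": 1, "tags": ["narrative-anchor"]},  # always kept
    {"id": 2, "tags": ["birds"]},
    {"id": 3, "tags": ["insects"]},
]
playlist = compile_playlist(segments, preferences={"birds"})
print([s["id"] for s in playlist])  # segment 3 dropped for this viewer
```

Keeping selection declarative (tags plus a small rule set) is what lets one content package serve many viewer profiles without re-editing.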

Project Delivery

The team successfully developed a new methodology and technical solution to enable customised content to be delivered at scale. These are the key elements of the proof of concept. 

Content selection 

Video customisation is particularly well suited to magazine programmes, where the programme is made up of standalone segments. For this reason, an episode of the BBC nature programme Springwatch was selected to provide a practical demonstration of how content can be customised retrospectively. 

Metadata enrichment 

Temporal metadata is required to segment the programme and to identify the required sections of video. To generate metadata cost-effectively and at scale, machine learning was applied to identify text, scene timings, and images. All data was displayed on a single timeline for manual verification, ensuring editorial context was maintained. 

Metadata preservation 

Current broadcast workflows lose metadata that can be used downstream for customisation. To preserve it, the new MetaRex metadata framework was used to collate, transport, and deliver the metadata that configured the video segments. 

Platform and player 

The Infuse video player was built to take temporal metadata, identify video segments, and assemble them into playlists using predefined rules to produce tailored videos. The media assets were taken from IMF packages to create the customised versions.  

The result 

A 59-minute episode of the BBC nature programme Springwatch was used to demonstrate the approach, and the technology retrospectively produced over 3.6 million versions at a cost comparable to creating just a few versions using traditional workflows.  
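The version count grows combinatorially with the number of modular choices, which is why millions of versions can come from one content package. The segment counts below are purely hypothetical, chosen to illustrate the arithmetic; they are not the project's actual Springwatch segmentation.

```python
from math import prod

# Hypothetical segmentation for illustration only:
optional_segments = 20   # each segment may be included or skipped: 2^20 combinations
variant_choices = [2, 3] # e.g. two intro variants and three endings

versions = (2 ** optional_segments) * prod(variant_choices)
print(f"{versions:,}")  # 6,291,456 versions from one content package
```

Even modest per-segment choices multiply into numbers of this order, so the per-version cost of a compiled playlist approaches zero once the enriched metadata exists.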

The Business case for widespread adoption

Engagement and Value

Why do audiences care?

  • Audiences return to content that feels just right
  • One content package gives millions of audience options
  • Editorial (& Easter Egg) control retained by producer
  • Per-version costs drop rapidly with scale, with up to 90% saving

Social responsibility

Why is this a better supply chain?

  • Audiences can modify content to suit a broader set of personal sensitivities
  • More personalisation, with no redundant rendering
  • More accessibility, inclusivity and localisation at a lower cost

New markets

Why invest?

  • New small markets become viable as localisation costs fall
  • Long-tail library content becomes affordable
  • Targeted opt-in advertising related to opt-in content

FINAL POC RESULTS

What industry leaders have to say about the Responsive Narrative Factory

"The RNF accelerator programme is a very interesting project and one Clearcast followed closely.

As we move to more complex dynamic, multi-version and addressable advertising workflows driven by user data, it's important that we're able to improve the end-to-end supply chain by incorporating containerisation formats such as IMF." 

Head Of Technology, Clearcast Ltd 

"I am enthusiastic about the Responsive Narrative Factory (RNF) proof of concept as it has the potential to revolutionize the localization process and experience for AMC Studios International digital streaming audiences worldwide. 

It is good to see the implementation of Metadata driving IMF elements to deliver tailor-made experiences to meet a diverse set of individual customer preferences, cultural sensitivities, accessibility and regulatory requirements.

I believe my colleagues who also lead media distribution supply chain operations would also be thrilled to be able to achieve higher levels of customization while reducing the cost of editing, storing and transporting full length regional versions."

VP - Program Operations at AMC Studios International 

"This tops my list of favourite things at IBC."

Director of Technology - Hollywood Studio.

 

"How are you doing that? It's not possible."

CTO - Major Film School

"This changes the way can personalise adverts." 

CEO - Advert Metadata Broker

 

“The MOST exciting new concept at IBC this year... a long time coming, but this looks like the dawn of a new type of both video production and consumption.” 

Product Manager – BroadScreen.


The team have identified opportunities in advertising, sport, factual, movie and episodic production, content revisioning, asset management and storage, metadata enrichment, QC, social media, and streaming. Please reach out if you would like to explore how our work can benefit your business.