Creating the future of production: MyWorld is pushing the boundaries of today's cutting-edge digital content production tools to enable the immersive environments of tomorrow

Web 1.0, or the first generation of the internet, provided connectivity and mostly text-based material to largely passive users. Today, Web 2.0 is more visual, social and interactive, and predominantly based on user-generated content.

What will Web 3.0 look like? Some emphasise that it will be decentralised. Others say artificial intelligence (AI) and the blockchain will be key. In truth, no one knows for sure yet.

One thing that can be said about tomorrow’s internet, and digital experiences in general, is that they will be more immersive. To be more specific, many experiences that are currently purely physical will be extended or replaced by absorbing, digitally generated visual, audio and touch-based content.

But how precisely will that come about? More specifically, how will the immersive content of the future be produced? That’s a question that MyWorld is addressing in a major programme of research and experimental productions. Here we look at some of the primary technologies we are working to develop.

Capturing motion on screen and stage

Andy Serkis’s 2001 portrayal of Gollum in The Lord of the Rings: The Fellowship of the Ring was many people’s first experience of motion capture. Serkis was filmed wearing a tight suit studded with reflective markers, providing the data to drive the movements of the digitally-generated on-screen Gollum.

The technique has evolved greatly since then, with the use of facial markers for more detailed portrayals, LED markers for outdoor filming, and non-optical systems for camera-free applications, for example.
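
To give a flavour of how marker data drives a digital character, here is a minimal, illustrative Python sketch rather than any particular studio's pipeline: given the captured 3D positions of three hypothetical markers on a performer's arm, it computes the elbow angle that could be applied to the matching joint of a digital rig. The marker names and positions are invented purely for illustration.

```python
import numpy as np

def joint_angle(parent, joint, child):
    """Angle (degrees) at `joint` between the segments to `parent` and `child`.

    Each argument is a 3D marker position reported by the tracking system.
    """
    a = np.asarray(parent, dtype=float) - np.asarray(joint, dtype=float)
    b = np.asarray(child, dtype=float) - np.asarray(joint, dtype=float)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical marker positions (in metres) for one captured frame:
shoulder = [0.00, 1.45, 0.10]
elbow    = [0.05, 1.20, 0.32]
wrist    = [0.30, 1.18, 0.45]

# The resulting angle would be applied to the elbow joint of a digital character.
print(f"elbow flexion: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```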

It has also been applied beyond screen-based entertainment. The Imaginarium, a production company co-founded by Serkis, worked with the Royal Shakespeare Company to produce a version of The Tempest in which 3D avatars created using motion capture were projected onto the stage.

Researchers from the Centre for the Analysis of Motion, Entertainment Research and Applications (CAMERA) at the University of Bath used motion capture to help make Magic Butterfly, an immersive, animated virtual reality opera experience in 2017.

Musicians are also getting in on the act. The band Twenty One Pilots, for example, used motion capture to perform a gig in the online gaming platform Roblox last year.

From 2D to 3D

The need for performers to wear suits and markers limits the use of motion capture. A newer technique called volumetric capture does away with that requirement. Multiple cameras are used to take images of people, objects and scenes, which are then stitched together to produce dynamic 3D representations. This opens up new applications, such as allowing viewers of sporting events to choose the angle they view from.
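
Underlying volumetric capture is multi-view geometry: if the same point on a subject is seen by two or more calibrated cameras, its 3D position can be recovered by triangulation. The Python sketch below is a deliberately simplified illustration of that idea, using the standard linear (DLT) triangulation of a single point from two views; the camera matrices and image coordinates are invented, and a production system would do this for vast numbers of points across many cameras before fusing the results into a dynamic, textured 3D model.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its projections in two calibrated cameras.

    P1, P2 are 3x4 camera projection matrices; x1, x2 are the (u, v) image
    coordinates of the same physical point as seen by each camera. Uses the
    standard linear (DLT) method: stack the projection constraints and take
    the least-squares solution via SVD.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]          # convert from homogeneous coordinates

# Two hypothetical cameras: one at the origin, one shifted 1 m along x,
# both looking down the z axis with identity intrinsics.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

x1 = np.array([0.05, 0.025])     # where the point appears in camera 1
x2 = np.array([-0.20, 0.025])    # where the same point appears in camera 2

print(triangulate(P1, P2, x1, x2))   # ~[0.2, 0.1, 4.0]: the recovered 3D point
```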

Thanks to volumetric capture, sometimes called volumetric video, users could potentially see and interact with highly realistic 3D representations of other people rather than avatars. Until recently, the technique required fixed studios, green screens and large numbers of precisely calibrated cameras. Bristol-based start-up Condense Reality is aiming to be the first company able to stream broadcast-quality volumetric video of live events, such as boxing matches, in real time. Its system uses deep learning to reduce the number of cameras and the processing time needed, and can be used outside studios.

Beyond the green screen

A technique called virtual production is transforming high-end filmmaking. Unless they are working in real-world locations, film actors generally perform on set in front of giant green screens, with backgrounds added later in post-production. Working this way can be time-consuming, because it often requires multiple takes as actors have to look towards where they expect CGI (computer-generated imagery) elements to appear, for example.

Virtual production replaces green screens with walls of LED screens, typically 60-70ft wide and 20ft high, displaying backdrops and CGI effects in real time. This provides a more immersive environment for filmmakers and enables lighting and camera angles to be manipulated digitally during shooting. The approach is currently most viable for large-scale productions such as films, but as the technology matures its viability for smaller-scale uses in advertising, TV and games is increasing. It can be combined with motion capture, enabling real-time adjustment of backgrounds based on the movement of cameras and props.
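
The geometry behind that real-time adjustment can be illustrated simply. Because the physical camera is tracked, the image on the LED wall can be recomputed every frame so that distant virtual objects appear with the correct parallax from wherever the camera is. The Python sketch below is a simplified illustration of that principle, using a single virtual point, a flat wall and invented numbers, rather than a description of how any particular virtual production system is implemented.

```python
import numpy as np

def wall_pixel(camera_pos, virtual_point, wall_z=0.0):
    """Where a virtual scene point should be drawn on the LED wall.

    The wall is modelled as the plane z = wall_z, with the virtual point
    lying 'behind' it in the digital set. Intersecting the line from the
    tracked camera position to the virtual point with the wall plane gives
    the spot on the wall where the point must be drawn so the backdrop
    shows correct parallax for the camera's current position.
    """
    cam = np.asarray(camera_pos, dtype=float)
    pt = np.asarray(virtual_point, dtype=float)
    t = (wall_z - cam[2]) / (pt[2] - cam[2])   # parameter along the camera-to-point ray
    return cam + t * (pt - cam)                # 3D point on the wall plane

# Hypothetical set-up (metres): a virtual mountain 50 m behind the LED wall.
mountain = [10.0, 6.0, -50.0]

# As the physical camera dollies sideways, the mountain must shift on the wall:
for cam_x in (0.0, 1.0, 2.0):
    p = wall_pixel([cam_x, 1.7, 8.0], mountain)
    print(f"camera at x={cam_x:.1f} m -> draw mountain at x={p[0]:.2f} m on the wall")
```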

The technique was used to make The Mandalorian, the Star Wars live-action series released in 2019. It makes bridging the real and the virtual easier, and can be applied to create video games, virtual events and metaverse environments.

Not just games

Virtual production requires the use of games engines such as Unreal Engine. These enable developers to create and manipulate graphics, sound, movement, lighting and other elements of 3D production without the need for multiple separate software tools. Originally developed for gaming, they can also be used in films, virtual and mixed reality and other 3D environments.
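
At their core, games engines are built around a real-time loop that updates the state of the scene and redraws it many times a second, which is what makes them suitable for live, interactive uses such as virtual production. The Python sketch below is a bare-bones, illustrative version of such a loop with an invented toy scene; a real engine such as Unreal Engine or Unity layers rendering, physics, audio, lighting and asset pipelines on top of the same basic structure.

```python
import time

class Scene:
    """A toy 'scene': one prop drifting sideways and a light that slowly dims."""
    def __init__(self):
        self.prop_x = 0.0         # position of a single prop, in metres
        self.light_level = 1.0    # 1.0 = fully lit

    def update(self, dt):
        """Advance the simulation by dt seconds (movement, lighting, etc.)."""
        self.prop_x += 0.5 * dt                               # prop moves at 0.5 m/s
        self.light_level = max(0.0, self.light_level - 0.1 * dt)

    def render(self):
        """Stand-in for drawing a frame; an engine would rasterise the scene here."""
        print(f"prop at x={self.prop_x:.2f} m, light={self.light_level:.2f}")

def run(scene, fps=30, duration=1.0):
    """A minimal fixed-rate loop: update the world, then draw it, every frame."""
    dt = 1.0 / fps
    for _ in range(int(duration * fps)):
        scene.update(dt)
        scene.render()
        time.sleep(dt)   # keep the loop at roughly real-time speed

run(Scene(), fps=30, duration=0.2)
```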

Games engines are increasingly being used alongside other advanced digital production techniques. Bristol-based Aardman Animations and French developer DigixArt used both the motion capture facilities at CAMERA and the games engine Unity in the production of 11-11: Memories Retold, their innovative World War One console game.

Combining at the cutting edge

A series of MyWorld research projects is investigating the future of these advanced production techniques. These sit alongside other work on, for example, how the content produced can be disseminated using deployable 5G networks in remote locations. That’s something the team at Bristol-based Lost Horizon, who hosted an immersive Shangri-La Glastonbury VR event in 2020, is exploring.

MyWorld researchers are combining different cutting-edge tools and techniques to push forward the boundaries of what can be achieved.

Showcasing to the world

Further still, MyWorld will be commissioning a number of experimental productions itself, through a collaboration between the region’s world-leading creative organisations and research teams. Details are still being finalised, but the productions will vary in size, will be themed around social, cultural and creative challenges and opportunities relevant to the West of England, and will be produced alongside industrial partners. Watch this space!

“The possibilities arising from the R&D into various production pipelines being delivered through MyWorld are so exciting. There is an opportunity to help break down production barriers of cost, access to specialised facilities and skills and to enable producers of all backgrounds and resources to engage with these new processes and products. As well as funding research projects, MyWorld is investing in a range of production facilities and equipment to try and build capacity in the West of England and make sure that our region’s truly amazing producers can get access and grow our already stellar reputation for great quality creative content and research.”
Oscar De Mello -- Operations Director, MyWorld