Catalysts and Connectors: Tools for the Creative Industries is the first of the MyWorld Challenge Calls led by Digital Catapult. We are delighted to share details of the nine companies that have been awarded £50,000 funding to address the industry challenge set by Industry Partner NVIDIA or the open challenge, exploring innovative tooling solutions relating to the creation, delivery and assessment of experiences. The Challenge teams will also benefit from a 16-week support programme delivered by Digital Catapult with NVIDIA.
TLG: Live Puppetry Performance for Digital Character Animation
The Black Laboratory working with NVIDIA
The Black Laboratory is developing a system that allows digitally animated and digitally controlled characters to be performed using the techniques of physical puppet performance.
Otto: A real-time immersive visual-media solution for the performing arts
Larkhall will be working to advance immersive lighting technologies by creating a prototype system which supports real-time immersive visual-media solutions for the performing arts.
IMPRESS Launchpad
Force of Habit working with NVIDIA
IMPRESS Launchpad is a machine learning-assisted gaming influencer discovery and outreach tool, empowering SME ("indie") game development studios to reach larger audiences. It is a video game marketing technology that will help studios gain insights into content creators, influencers and market data, and will offer frictionless workflows to assist in reaching niche and segmented gaming audiences.
Future Places Toolkit
Zubr.co in partnership with Uninvited Guests
Future Places Toolkit will be a powerful and flexible augmented reality application for use in participatory architectural design and creative consultation. By combining augmented reality with real-time participant feedback, it will be possible to see live drawings, 3D models and plans overlaid onto existing buildings, allowing discussions with communities to take place in situ where a development is proposed.
Feel sound anytime, anywhere with GroundWaves haptic shoes
GroundWaves is developing an innovative haptic calibration tool that enables an enhanced haptic experience.
Feel Learn Do
Creating a prototype of Feel Learn Do: an application comprising a unique suite of technical tools that monitor, log, aggregate and translate a user's physiological and behavioural data during their experience of a piece of narrative VR or MR.
Enhancing Safety and Creativity in AGITO Operations: Centralized Software Control and Sensor Integration
Motion Impossible working with NVIDIA
The project aims to develop lightweight control and visualisation software for AGITO systems, eliminating the need for a one-to-one Master Controller relationship while adding motion control features and libraries for the AGITO Sports, AGITO Trax and AGITO MagTrax configurations.
AI toolsets for VFX
Lux Aeterna working with NVIDIA
Lux Aeterna proposes bringing the power of generative AI to its 3D VFX tools, creating a malleable exchange between these two worlds that can be applied to a wide range of use cases, providing a new way to generate and improve assets and to achieve unique visuals.
“Active Agent” NPC Tooling
Meaning Machine working with NVIDIA
“Active Agent” NPC Tooling helps developers move away from rigid, pre-defined in-game characters (NPCs) and towards the concept of “Active Agents” – who can engage in freeform, improvised conversations that are meaningful to every player’s playthrough.
Collaborative Research and Development (CR&D)
As part of MyWorld’s Collaborative Research and Development (CR&D) Open Call, led by Digital Catapult, five projects have been awarded a share of £1 million. The projects represent a mix of early-stage and award-winning companies whose work is steadily gaining momentum. The projects will explore a range of ambitious ideas addressing emerging challenges from across different areas of the Creative Industries.
In collaboration with the region's globally recognised universities, the project teams will develop an exciting mix of innovative prototypes. These prototypes will drive cutting-edge research and industry impact, further fuelling the West of England's creative technology sector and beyond.
MLF and All Seeing Eye in collaboration with UWE
STREAM (Synchronous Tools for Realtime Experiential Activity Management) will develop a suite of tools designed to support the streamlined delivery of co-present, high-fidelity XR experiences to mass audiences. The project is being developed by Marshmallow Laser Feast (MLF), All Seeing Eye and The VR Lab at the Digital Cultures Research Centre, University of the West of England (UWE Bristol).
MetaversEngine
Awarri in collaboration with UWE
MetaversEngine is a cross-platform avatar system that allows users to commission avatars from 3D artists and uses NFT technology to give users control over their identity, representation, appearance and personal data in virtual worlds.
Giant Tactile Robots
Air Giants in collaboration with University of Bristol
Air Giants is pairing up with the Bristol Interaction Group, an academic group specialising in interaction between humans and technology. The Bristol Interaction Group will study how users interact with Air Giants' soft creative robots, examining the vocabulary of touch and the emotional power these interactions can carry, helping Air Giants understand what these new capabilities can be used for.
Digibeat: Immersive Feedback Using Inertial Sensors
Zero Point Motion in collaboration with UWE
This project will develop fine motion finger tracking and heart monitoring via ballistocardiography using a single sensor. Through this physiological data, the team hopes to provide creatives with insight into the immersion or emotional state of the user. The team is led by Founder and CEO Dr Lia Li, a 2022 recipient of Innovate UK's Women in Innovation Award.
Cloud Compression for Live Volumetric Video
Condense Reality in collaboration with University of Bristol
Condense is a Bristol-based startup focusing on the delivery of live immersive experiences to end users.
This project will look at the novel application of deep learning to efficiently compress live-streamed volumetric video in order to host live music events in the metaverse. The project will make a major contribution to the state of the art in volumetric video delivery and generate a significant impact on a wide range of immersive video applications.
Celestial Live
Celestials Labs in collaboration with Bath Spa University
Celestial Live is a project that will enable the creation of new drone light shows that astound and amaze audiences. Unlike current pre-recorded drone animations, Celestial Live's drones will respond to performers in real time, expressing their music or dance through an immersive fusion of colour, scale and movement.
Six creative teams have each been awarded £45,000 to build an urban prototype that connects people to their city, thanks to Watershed's Playable City Sandbox as part of MyWorld's IDEAS programme. The projects below place play at the heart of the city, sparking imagination and conversation about inclusion, sustainability, surveillance and the future of cities. The prototypes will be showcased in Bristol in July 2023.
Jack Wates & Thomas Blackburn
A zoetropic light experience, designed to be viewed from moving train windows upon arrival at and departure from Bristol Temple Meads station. The proposal aims to capitalise on the railway as a cinematic space of arrival in the city to deliver a playful and memorable experience.
The House of Weaving Songs
The House of Weaving Songs is inspired by the Somali-style nomadic structure called the Aqal. The interactive installation will integrate Somali weaving songs and woven tapestries in an experiment to connect the city to cultural practices that can inspire us in our fight to tackle climate change.
Street Pixel
With a network of modules that replace existing paving slabs, Street Pixel transforms the pavement beneath your feet into a dynamic system of lights: drawing a path in front of you, linking you to another person or inviting you to play a game. Street Pixel is being developed with the general public and the environment in mind. It is a project that celebrates creative technology and a sustainable approach to materials and electronic hardware, and the team will be searching for sustainable solutions and opportunities at every step of the way.
Squeeze Me
Squeeze Me is an installation designed to provoke play and connection between strangers in an urban environment, using Air Giants' novel inflatable soft robotic technology to provide a compelling and charming tactile and visual experience.
How (Not) to Get Hit by a Self-driving Car
Tomo Kihara + Playfool
In this intervention, passers-by are challenged to avoid being detected as humans by an AI-powered image recognition system. Every three seconds, a camera running a machine learning model tries to detect human activity. Anyone can join the game, attempting to get from one end of the course to the other without being detected.
Fireflies, a Glitch by Screaming Color and Arcane
Glitch AR, Screaming Color
Fireflies, a Glitch by Screaming Color and Arcane is next-generation street art brought to life through augmented reality (AR). Glitch AR are creating a transformative experience that turns Bristol’s streets into a colourful, sci-fi-infused digital jungle awash with spectacular visuals and music from local artists.
Fellow in Residence at Bristol Old Vic
Ben Samuels is Watershed's Fellow in Residence as part of the MyWorld IDEAS programme. He is spending seven months in residence at Bristol Old Vic, embedded in the digital presentation of Complicité's production. This is a practical Fellowship that will develop approaches to the digital presentation of theatrical performance, focusing on liveness, immersion in digital formats and connecting live and digital audiences.
Fellows in Residence at Ultraleap
Joseph Wilk is one of Watershed’s Fellows in Residence as part of the MyWorld IDEAS programme. He is spending ten months in residence at Ultraleap and the Pervasive Media Studio, prototyping hand-tracking gestures for home VR gaming. This is a practical Fellowship that will create guidance and tooling for game developers through prototyping and experimentation.
Fellow in Residence at Condense Reality
Ellie Chadwick is working at Condense in a practice-based Fellowship exploring ways of performing in their virtual venue and creating guidance for performers through experimentation and user experience research. She will also work with an Academic Fellow who is evaluating different ways of performing in Condense's virtual venue.
Fellow in Residence at Condense Reality
Helen Brown is working at Condense as an Academic Fellow in collaboration with the MyWorld Understanding Audiences Research team. She is focusing on evaluating different ways of performing in Condense's virtual venue and shaping guidance for performers through experimentation and user experience research.
Fellow in Residence at Celestial
Shakara Thompson is working at Celestial in a practical and analytical Fellowship exploring the world of drone light shows and the possibilities for artificial intelligence to influence the process of creating them, both technically and creatively.
Fellow in Residence at Zero Point Motion
Harry Willmott is working at Zero Point Motion in a practical and experimental Fellowship focused on designing and conducting user research using VR enriched with finger tracking and physiological monitoring, to understand how these features affect the user's presence and immersion in the experience.
Fellow in Residence at Air Giants
Amy Rose is working at Air Giants in a practice-based Fellowship exploring how the context of interaction with Giant Tactile Robots shapes participant journeys, and how to devise stories that invite engagement and play.
Fellow in Residence at UWE Bristol, working with Marshmallow Laser Feast and All Seeing Eye
Clarice Hilton is working at UWE Bristol with Marshmallow Laser Feast and All Seeing Eye, in a practical Fellowship exploring how people with different disabilities might experience and value various modalities (such as type of headset, controller and environmental interaction) in location-based VR, and how this should inform the design process.