What are MetaHumans?
MetaHumans are high-fidelity digital characters created with MetaHuman Creator, a cloud-streamed application from Epic Games. They bridge the gap between cartoonish characters and fully photorealistic humans in digital environments. These characters are not just visually impressive; they are also equipped with a range of realistic facial expressions and movements, bringing the technology closer to replicating live performance.
R&D processes
Over several years, we ran short sprints to explore the tools, technologies and formats for hybrid production. Working with our partners, creatives and technologists, this period of activity tested viable workflows with the potential to be incorporated into future RSC productions.
Research Questions:
- How can we manage real-time live data feeds from actors on stage and serve them across multiple channels with low latency and suitable quality of service (QoS) for creative work?
- How can physical worlds (sets, environments, objects) be successfully replicated digitally without affecting the live experience? How can those objects respond to interaction both live and digitally?
- What are the challenges and limitations of current technology in achieving our aims?
We toured a demonstration of outcomes to SXSW festival (2022), Beyond Conference (2022) and, more recently, Mozfest, the Mozilla Foundation’s convening in Amsterdam (2024), which you can watch online below:
How has The RSC used performance capture and rigs in performance?
The RSC has used performance capture technology in previous productions and digital projects – from The Tempest in 2016 (a real-time avatar of Ariel on stage) to Dream in 2021 (using motion capture and facial data to enable performers to animate digital characters in a game engine).
Capacities of MetaHumans
Visual Realism
With MetaHuman Creator, users can design characters that range from absolutely lifelike to any imaginative variation. The skin, eyes, teeth, hair and every other detail are intricately designed, and the tool allows extensive customisation, from facial features, skin tones and hairstyles down to deliberate imperfections.
Rigging and Animation
MetaHumans come with a predefined skeleton and facial rig, making them easier to animate. With the integration of motion capture technology, their movements can mimic those of a real actor without the considerable development costs previously required. As they are part of the Unreal Engine ecosystem, they can easily be imported into games and other UE5 applications.
Motion Capture
Actors can wear motion capture suits and perform live. Their movements are then translated in real time to their MetaHuman counterparts in the game environment. We can use this data as a recorded performance, or enable performers to interact with audiences in game.
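One common pattern for keeping live capture data low-latency is "latest frame wins": stale or out-of-order frames are dropped rather than queued, so the digital character never lags behind the performer. The sketch below illustrates that idea in Python. It is illustrative only; the class and frame names are hypothetical, and production pipelines typically rely on dedicated tools such as Unreal Engine's Live Link rather than hand-rolled code.

```python
# Minimal sketch of a "latest-frame-wins" buffer for live mocap data.
# All names (MocapFrame, LatestFrameBuffer) are hypothetical.
from dataclasses import dataclass, field

@dataclass
class MocapFrame:
    timestamp: float                              # capture time in seconds
    joints: dict = field(default_factory=dict)    # joint name -> rotation

class LatestFrameBuffer:
    """Keeps only the newest frame, so a slow consumer never falls behind.
    Dropped frames are the price paid for low latency."""
    def __init__(self):
        self._frame = None

    def push(self, frame: MocapFrame):
        # Discard out-of-order or stale frames; keep the freshest data.
        if self._frame is None or frame.timestamp > self._frame.timestamp:
            self._frame = frame

    def latest(self):
        return self._frame

# Frames may arrive out of order over a network.
buf = LatestFrameBuffer()
buf.push(MocapFrame(1.0, {"head": (0, 0, 0)}))
buf.push(MocapFrame(0.5, {"head": (10, 0, 0)}))   # stale, ignored
buf.push(MocapFrame(2.0, {"head": (0, 5, 0)}))
print(buf.latest().timestamp)  # → 2.0
```

The same trade-off (freshness over completeness) underpins the low-latency, suitable-QoS requirement in the research questions above.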
The blend of MetaHuman technology, motion capture and real-time game engines like Unreal Engine has the potential to reshape the boundaries of what’s possible in live performance and gaming. As these tools evolve and become more accessible, we can explore how and where we might use them in our creative work.