Unreal’s new MetaHuman Animator can turn an iPhone video into a scarily accurate game animation in three minutes

Footage from Ninja Theory’s upcoming Hellblade 2 is nothing short of astonishing.

A close up of Senua's face, showing off the facial animation work being done for Hellblade 2.

Yesterday’s State of Unreal 2023 presentation was rather overshadowed by a certain surprise game announcement, but Epic showed off some genuinely amazing tech coming to Unreal Engine 5 in the near future. We got to see impressive new procedural environment tools, Lords of the Fallen showing off the engine’s enhancements to character creation and realistic armour and clothing, and more. But what really blew my socks off was the MetaHuman Animator stage demo.

We’ve seen MetaHuman before—it’s a tool that allows developers to generate highly realistic human faces, fully rigged for animation. Actually animating those faces in a way that matches their realistic look—avoiding the uncanny valley—has been a difficult and time-consuming process thus far, however. Motion capture—that is, getting actors to perform the movements and then turning that footage into animation—has produced the best results, but requires specialised equipment and months of expert work.

Well, until now. On stage, Epic showed off a new feature called MetaHuman Animator, designed to completely streamline the process. During the stage demo, you can see actor and developer Melina Juergens—star of the Hellblade games—recording a video of her face on an iPhone, before a technician uploads it, processes it, and turns it automatically into a fully animated sequence in the space of literally three minutes.

The final video isn’t completely finished—an animator would ideally go in and touch up elements of it, and obviously work it into the in-game scene—but it looks 90% of the way there, and that’s astonishing. You don’t even have to map it onto a MetaHuman modelled on that person’s face—you can use it with any faces you have ready, whether photorealistic or stylised and cartoonish.

The potential here is huge. Not only will major studios be able to create facial animations in a fraction of the time, allowing for increasingly realistic interactions in upcoming games, but smaller developers will be able to create mocap-quality scenes with just a phone and a PC, instead of an entire studio full of 4D cameras and those little white dot things. 

Then, just to show off, Epic revealed some footage from the development of Hellblade 2 using the tech with Ninja Theory’s mocap equipment, a video it claims “hasn’t been polished or edited in any way” beyond the program’s automatic processing. It’s a brief look, but I think it’s probably the most photorealistic videogame animation I’ve ever seen.

If developers are as excited about this as I am, get ready for a whole generation of games that are mostly about characters delivering Shakespearean monologues in extreme close-up. By 2030 you’ll be intimately familiar with every wrinkle and pore on Doom Guy’s face.  
