Facial animation made easy with Unreal Engine's 'Live Link Face' iPhone app - TechSpot

The big picture: Streamlining work in any business reduces overhead and is good for the bottom line. Epic thinks it has an Unreal Engine tool that can help developers and motion capture artists reduce the production time of animating facial expressions in games and CGI scenes. Surprisingly, the software leverages tech created by Apple.

Epic Games has already shown that the upcoming Unreal Engine 5 has new tools to take advantage of next-generation hardware with an impressive tech demo on a PlayStation 5 devkit. The middleware is capable of graphical scaling at an unprecedented level, with better dynamic lighting effects among other enhancements.

Aside from the tools that will make gamers' experiences more immersive, Epic revealed a new iOS app on Thursday called "Live Link Face," which allows developers and motion-capture (mo-cap) artists to record facial gestures on an iPhone and import them directly onto a character in Unreal Engine 5. The app captures from the neck up, so head tilts and turns are recognized just as easily as expressions.

Using the iPhone's front-facing camera and TrueDepth sensors, Live Link Face can stream an actor's facial animations directly into Unreal in real time. It also uses Apple's augmented reality framework, ARKit, to track the actor's face. ARKit is the same software behind Apple's Memoji and Animoji avatar animations, and it has been used before as a cheap alternative to full motion-capture studios.
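For context on how an iPhone can read these expressions in the first place, here is a minimal sketch of the ARKit face-tracking layer that an app like Live Link Face builds on. The ARSession, ARFaceTrackingConfiguration, and ARFaceAnchor APIs are Apple's; the FaceCaptureSession class and the idea of serializing values for the engine are illustrative assumptions, not Epic's actual implementation.

```swift
import ARKit

// Minimal sketch of ARKit face capture on a TrueDepth-equipped iPhone.
// The delegate receives ARFaceAnchor updates carrying blend-shape
// coefficients plus the head transform each frame.
final class FaceCaptureSession: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking requires a TrueDepth camera")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called whenever tracked anchors are updated.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Blend-shape coefficients (0.0–1.0) describe expressions such as
            // jawOpen or browInnerUp; the 4x4 transform carries head position
            // and rotation, which is why tilts and turns are captured too.
            let jawOpen = face.blendShapes[.jawOpen]?.floatValue ?? 0
            let headTransform = face.transform
            // A streaming layer (hypothetical here) would serialize these
            // values and send them over the network to the engine each frame.
            _ = (jawOpen, headTransform)
        }
    }
}
```

This only illustrates the capture side; how Live Link Face packages and transmits the data to Unreal's Live Link plugin is Epic's own protocol and is not shown here.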

"The Live Link Face app harnesses the amazing facial capture quality of iPhone ARKit and turns it into a streamlined production tool," noted Verizon Media Director of Animation Technology Addy Ghani. "At RYOT [a Verizon-owned studio], we believe in the democratization of capture technology and real-time content, and this solution is perfect for a creator at home or a professional studio team like ours."

Epic sees the tool being used right at a developer's desk rather than requiring a session in a mo-cap studio. Of course, this use case is only ideal for animating the face; a studio will still be necessary for choreographed scenes involving the whole body.

Fortunately, using a dedicated iPhone head mount, actors can perform full-body scenes in the studio, with UE5 simultaneously handling both body and face capture. Additionally, each mo-cap artist can wear their own iPhone, allowing several in-engine models to be animated at once. A single tap syncs and initiates recording on all iPhones connected to the Live Link network.

Whether you are a professional developer or an Unreal hobbyist, you don't have to wait to try out this tech. As long as you have access to the UE5 beta and an iPhone, the Live Link Face app is already available on Apple's App Store.
