Reimagining Characters with Unreal Engine's MetaHuman Creator

Elevate your films with cinema-quality character designs and motion capture animation
Product type: Paperback
Published: Dec 2022
Publisher: Packt
ISBN-13: 9781801817721
Length: 356 pages
Edition: 1st Edition
Authors: Ciaran Kavanagh, Brian Rossney
Table of Contents

Preface
Part 1: Creating a Character
Chapter 1: Getting Started with Unreal
Chapter 2: Creating Characters in the MetaHuman Interface
Part 2: Exploring Blueprints, Body Motion Capture, and Retargeting
Chapter 3: Diving into the MetaHuman Blueprint
Chapter 4: Retargeting Animations
Chapter 5: Retargeting Animations with Mixamo
Chapter 6: Adding Motion Capture with DeepMotion
Part 3: Exploring the Level Sequencer, Facial Motion Capture, and Rendering
Chapter 7: Using the Level Sequencer
Chapter 8: Using an iPhone for Facial Motion Capture
Chapter 9: Using Faceware for Facial Motion Capture
Chapter 10: Blending Animations and Advanced Rendering with the Level Sequencer
Chapter 11: Using the Mesh to MetaHuman Plugin
Index
Other Books You May Enjoy

Calibrating and capturing live data

To get a better result, the ARKit capture process needs to acquire data that takes the performer's characteristics into account. As you can see in Figure 8.13, the app has done a good job of capturing the performer's facial proportions, clearly outlining the eyes, nose, mouth, and jaw. However, giving the app a baseline of information, such as a neutral pose, helps refine the overall outcome.

Figure 8.13: Calibrating the Live Link Face app

To understand how and why baselines and neutral poses work, let's take the eyebrows as an example, using an arbitrary unit of measurement. If the eyebrows sit at 0 in the resting (neutral) position, drop to -10 in a frown, and rise to +15 in a surprised expression, then knowing where the neutral pose sits gives the app a much clearer reference point for deciding when its tracking data corresponds to a frown or a surprised expression...
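To make that arithmetic concrete, here is a minimal sketch in Python of the underlying idea: capture the performer's neutral pose once, then subtract that baseline from each incoming frame so that 0 always means "at rest" for this particular performer. The curve names and numbers are hypothetical and are not the Live Link Face app's actual data format.

```python
# Illustrative sketch only: how a captured neutral baseline helps interpret
# raw per-frame tracking values. Curve names and values are hypothetical.

# Baseline recorded while the performer holds a neutral expression
NEUTRAL = {"brow_raise": 0.12, "jaw_open": 0.05}

def calibrate(raw_frame, neutral=NEUTRAL):
    """Subtract the neutral baseline so 0 means 'at rest' for this performer."""
    return {curve: value - neutral.get(curve, 0.0) for curve, value in raw_frame.items()}

# A surprised expression reads above the baseline, a frown reads below it,
# mirroring the 0 / -10 / +15 example above (here on a 0..1 scale).
print(calibrate({"brow_raise": 0.62, "jaw_open": 0.05}))  # brow_raise ~ +0.50 -> raised
print(calibrate({"brow_raise": 0.02, "jaw_open": 0.05}))  # brow_raise ~ -0.10 -> lowered
```

The point of the sketch is simply that, without the neutral baseline, a raw value of 0.12 on the brow curve could mean either "relaxed" for one performer or "slightly raised" for another; calibrating against the neutral pose removes that ambiguity.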
