AnaConDaR: anatomically-constrained data-adaptive facial retargeting
Date
2024-06-27
Abstract
Offline facial retargeting, i.e., transferring facial expressions from a source to a target character, is a common production task that still regularly poses considerable algorithmic challenges. The task can be roughly split into the transfer of sequential facial animations and non-sequential blendshape personalization. Both problems are typically solved by data-driven methods that require an extensive corpus of costly target examples. Geometrically motivated approaches, by contrast, do not require intensive data collection, but they cannot account for character-specific deformations and are known to cause numerous visual artifacts.
We present AnaConDaR, a novel method for offline facial retargeting, as a hybrid of data-driven and geometry-driven methods that incorporates anatomical constraints through a physics-based simulation. As a result, our approach combines the advantages of both paradigms while balancing out the respective disadvantages. In contrast to other recent concepts, AnaConDaR achieves substantially individualized results even when only a handful of target examples are available. At the same time, we do not make the common assumption that for each target example a matching source expression must be known. Instead, AnaConDaR establishes correspondences between the source and the target character by a data-driven embedding of the target examples in the source domain. We evaluate our offline facial retargeting algorithm visually, quantitatively, and in two user studies.
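To make the blendshape-based retargeting setting the abstract refers to concrete, the following is a toy sketch of the generic weight-transfer paradigm (not the AnaConDaR algorithm itself): a captured source expression is projected onto the source blendshape basis by least squares, and the recovered weights are reused to drive the target character's blendshapes. All mesh and basis sizes here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sizes: a small mesh with 3*n_verts coordinates and a
# basis of n_shapes blendshape deltas per character.
rng = np.random.default_rng(0)
n_verts, n_shapes = 300, 8

src_neutral = rng.normal(size=3 * n_verts)
src_deltas = rng.normal(size=(3 * n_verts, n_shapes))  # source blendshape deltas
tgt_neutral = rng.normal(size=3 * n_verts)
tgt_deltas = rng.normal(size=(3 * n_verts, n_shapes))  # target blendshape deltas

# A "captured" source expression with known ground-truth weights.
w_true = rng.uniform(0.0, 1.0, size=n_shapes)
src_expr = src_neutral + src_deltas @ w_true

# Recover the weights by least squares on the source character ...
w, *_ = np.linalg.lstsq(src_deltas, src_expr - src_neutral, rcond=None)

# ... and reuse them to drive the target character's blendshapes.
tgt_expr = tgt_neutral + tgt_deltas @ w
```

Note that this naive transfer assumes semantically matching blendshape bases on both characters; the data-scarce personalization of the target basis is precisely the part that methods like the one described above address.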
Keywords
Facial animation, Offline performance retargeting, Physics-based simulation
