Photorealistic facial synthesis in the dimensional affect space
Author(s)
Kollias, D
Cheng, S
Pantic, M
Zafeiriou, S
Type
Conference Paper
Abstract
This paper presents a novel approach for synthesizing facial affect, based on our annotation of 600,000 frames of the 4DFAB database in terms of valence and arousal. The approach takes as input a pair of these emotional-state descriptors and a neutral 2D image of the person on whom the corresponding affect will be synthesized. Given this target pair, a set of 3D facial meshes is selected and used to build a blendshape model and generate the new facial affect. To synthesize the affect on the 2D neutral image, 3DMM fitting is performed and the reconstructed face is deformed to generate the target facial expressions. Finally, the new face is rendered into the original image. Both qualitative and quantitative experimental studies illustrate the generation of realistic images when the neutral image is sampled from a variety of well-known databases, such as Aff-Wild, AFEW, Multi-PIE, AFEW-VA, BU-3DFE, and Bosphorus.
Date Issued
2019-01-29
Date Acceptance
2018-09-08
Citation
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2019, 11130 LNCS, pp.475-491
ISBN
9783030110116
ISSN
0302-9743
Publisher
Springer
Start Page
475
End Page
491
Journal / Book Title
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume
11130 LNCS
Copyright Statement
© 2019, Springer Nature Switzerland AG.
Sponsor
Engineering & Physical Science Research Council (EPSRC)
Grant Number
EP/J017787/1
EP/N007743/1
Source
European Conference on Computer Vision
Subjects
08 Information and Computing Sciences
Artificial Intelligence & Image Processing
Publication Status
Published
Start Date
2018-09-08
Finish Date
2018-09-14
Coverage Spatial
Munich, Germany
Date Publish Online
2019-01-29