Cardiff University | Prifysgol Caerdydd ORCA
Online Research @ Cardiff 

Line drawings for face portraits from photos using global and local structure based GANs

Yi, Ran, Xia, Mengfei, Liu, Yong-Jin, Lai, Yu-Kun and Rosin, Paul L. 2020. Line drawings for face portraits from photos using global and local structure based GANs. IEEE Transactions on Pattern Analysis and Machine Intelligence.

PDF - Accepted Post-Print Version (20MB)

Abstract

Despite significant effort and notable success of neural style transfer, it remains challenging for highly abstract styles, in particular line drawings. In this paper, we propose APDrawingGAN++, a generative adversarial network (GAN) for transforming face photos to artistic portrait drawings (APDrawings), which addresses substantial challenges including highly abstract style, different drawing techniques for different facial features, and high perceptual sensitivity to artifacts. To address these, we propose a composite GAN architecture that consists of local networks (to learn effective representations for specific facial features) and a global network (to capture the overall content). We provide a theoretical explanation for the necessity of this composite GAN structure by proving that any GAN with a single generator cannot generate artistic styles like APDrawings. We further introduce a classification-and-synthesis approach for lips and hair where different drawing styles are used by artists, which applies suitable styles for a given input. To capture the highly abstract art form inherent in APDrawings, we address two challenging operations — (1) coping with lines with small misalignments while penalizing large discrepancy and (2) generating more continuous lines — by introducing two novel loss terms: one is a novel distance transform loss with nonlinear mapping and the other is a novel line continuity loss, both of which improve the line quality. We also develop dedicated data augmentation and pre-training to further improve results. Extensive experiments, including a user study, show that our method outperforms state-of-the-art methods, both qualitatively and quantitatively.
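The distance transform loss described in the abstract tolerates small line misalignments while penalizing large discrepancies. The paper's exact formulation is not given here, so the following is only a minimal illustrative sketch: the function names, the brute-force Euclidean distance computation, and the saturating mapping `1 - exp(-d / tau)` are assumptions, not the authors' implementation.

```python
import numpy as np

def dt_line_loss(pred, target, tau=5.0):
    """Illustrative distance-transform-style line loss (hypothetical form).

    pred, target: boolean (H, W) masks, True where a line pixel is drawn.
    For each predicted line pixel we take its distance to the nearest
    target line pixel, then apply a nonlinear mapping that is near zero
    for tiny offsets and saturates for large discrepancies.
    """
    p = np.stack(np.nonzero(pred), axis=1).astype(float)    # (N, 2) predicted line pixels
    t = np.stack(np.nonzero(target), axis=1).astype(float)  # (M, 2) target line pixels
    if len(p) == 0 or len(t) == 0:
        return 0.0
    # Brute-force nearest-neighbor distance from each predicted pixel to the target lines.
    d = np.sqrt(((p[:, None, :] - t[None, :, :]) ** 2).sum(-1)).min(axis=1)
    # Saturating nonlinear mapping: forgiving of small shifts, bounded for large ones.
    return float((1.0 - np.exp(-d / tau)).mean())
```

A perfectly aligned prediction yields zero loss, while a one-pixel shift yields a small but nonzero cost; in practice such a loss would be symmetrized (also measuring target-to-prediction distances) and combined with the line continuity term the abstract mentions.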

Item Type: Article
Status: In Press
Schools: Computer Science & Informatics
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN: 0162-8828
Funders: The Royal Society
Date of First Compliant Deposit: 12 April 2020
Date of Acceptance: 7 April 2020
Last Modified: 14 Apr 2020 14:21
URI: http://orca-mwe.cf.ac.uk/id/eprint/130961

