Modeling and forecasting art movements with CGANs
File(s)
rsos.191569.pdf (971.21 KB)
Published version
Author(s)
Lisi, Edoardo
Malekzadeh, Mohammad
Haddadi, Hamed
Lau, Din-Houn
Flaxman, Seth
Type
Journal Article
Abstract
Conditional generative adversarial networks (CGANs) are a recent and popular method for generating samples from a probability distribution conditioned on latent information. The latent information often comes in the form of a discrete label from a small set. We propose a novel method for training CGANs which allows us to condition on a sequence of continuous latent distributions f(1), …, f(K). This training allows CGANs to generate samples from a sequence of distributions. We apply our method to paintings from a sequence of artistic movements, where each movement is considered to be its own distribution. Exploiting the temporal aspect of the data, a vector autoregressive (VAR) model is fitted to the means of the latent distributions that we learn, and used for one-step-ahead forecasting, to predict the latent distribution of a future art movement f(K+1). Realizations from this distribution can be used by the CGAN to generate ‘future’ paintings. In experiments, this novel methodology generates accurate predictions of the evolution of art. The training set consists of a large dataset of past paintings. While there is no agreement on exactly what current art period we find ourselves in, we test on plausible candidate sets of present art, and show that the mean distance to our predictions is small.
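The abstract describes fitting a vector autoregressive (VAR) model to the per-movement latent means and using a one-step-ahead forecast to define the latent distribution of a future movement f(K+1). The following is a minimal sketch of that forecasting step only, not the authors' code: the array `latent_means`, the placeholder dimensions, the isotropic Gaussian used for sampling, and the `generator` call are all assumptions introduced for illustration.

```python
# Minimal sketch (hypothetical, not the paper's implementation):
# fit a VAR(1) to movement-level latent means and forecast the mean of f(K+1).
import numpy as np
from statsmodels.tsa.api import VAR

K, d = 12, 3                                   # placeholder: 12 past movements, 3-dim latent means
latent_means = np.random.randn(K, d)           # placeholder for the learned mean of each f(k)

var_fit = VAR(latent_means).fit(maxlags=1)     # VAR(1), matching the one-step-ahead use in the abstract

# One-step-ahead forecast: predicted mean of the latent distribution f(K+1)
mu_next = var_fit.forecast(latent_means[-var_fit.k_ar:], steps=1)[0]

# Draw realizations around the forecast mean (isotropic Gaussian assumed here)
z_future = mu_next + 0.1 * np.random.randn(64, d)
# paintings = generator(z_future)              # hypothetical trained CGAN generator producing 'future' paintings
```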
Date Issued
2020-03-31
Date Acceptance
2020-03-16
Citation
Royal Society Open Science, 2020, 7 (4)
ISSN
2054-5703
Publisher
Royal Society, The
Journal / Book Title
Royal Society Open Science
Volume
7
Issue
4
Copyright Statement
© 2020 The Authors.
Published by the Royal Society under the terms of the Creative Commons Attribution License http://creativecommons.org/licenses/by/4.0/, which permits unrestricted use, provided the original author and source are credited.
Publication Status
Published
Date Publish Online
2020-04-22