Keras example — style transfer

An extension of the deep dreaming idea was described in Image Style Transfer Using Convolutional Neural Networks, by L. A. Gatys, A. S. Ecker, and M. Bethge (Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016). The paper showed that a trained neural network, such as VGG-16, learns representations of both content and style, and that the two can be manipulated independently. Thus an image of an object (the content) can be restyled to look like a painting by combining it with the image of a painting (the style).

Let us start, as usual, by importing our libraries:

from keras.applications import vgg16
from keras import backend as K
from scipy.misc import imresize
import matplotlib.pyplot ...
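To make the separation of content and style concrete before building the full pipeline, here is a minimal NumPy sketch (not the chapter's Keras code) of the Gram-matrix style loss from the Gatys et al. paper: style is captured by the correlations between channels of a convolutional feature map, so matching Gram matrices matches style while ignoring spatial arrangement. The function names and shapes here are illustrative assumptions.

```python
import numpy as np

def gram_matrix(features):
    # features: a (height, width, channels) activation map from one
    # convolutional layer. Flatten the spatial dimensions so that each
    # column of `flat` holds one channel's activations.
    h, w, c = features.shape
    flat = features.reshape(h * w, c)
    # Channel-by-channel correlations: this (c, c) matrix discards
    # spatial layout and is what Gatys et al. use to represent style.
    return flat.T @ flat

def style_loss(style_features, combo_features):
    # Mean squared difference between the two Gram matrices, with the
    # 1 / (4 * N^2 * M^2) scaling from the paper (N = channels,
    # M = height * width).
    h, w, c = style_features.shape
    s = gram_matrix(style_features)
    g = gram_matrix(combo_features)
    return np.sum((s - g) ** 2) / (4.0 * (c ** 2) * ((h * w) ** 2))
```

The content loss, by contrast, is simply the squared difference between raw feature maps of the content image and the generated image; minimizing a weighted sum of the two losses with respect to the pixels of the generated image yields the styled result.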
