An extension of deep dreaming was described by Gatys et al. (for more information refer to: Image Style Transfer Using Convolutional Neural Networks, by L. A. Gatys, A. S. Ecker, and M. Bethge, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2016), who showed that trained neural networks, such as VGG-16, learn representations of both content and style, and that the two can be manipulated independently. Thus an image of an object (the content) can be restyled to look like a painting by combining it with the image of a painting (the style).
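In the Gatys et al. paper, the style of an image is captured by the Gram matrices of feature maps taken from several convolutional layers, which record correlations between channels while discarding spatial layout. As a minimal sketch of this idea (using NumPy on a toy feature tensor, not the chapter's actual code), a Gram matrix can be computed like this:

```python
import numpy as np

def gram_matrix(features):
    # features: (channels, height, width) feature maps from one conv layer
    c, h, w = features.shape
    f = features.reshape(c, h * w)  # flatten the spatial dimensions
    return f @ f.T                  # (channels, channels) channel correlations

# Toy example: 3 channels of 2x2 activations
feats = np.arange(12, dtype=float).reshape(3, 2, 2)
g = gram_matrix(feats)
print(g.shape)  # (3, 3)
```

The style loss then penalizes the difference between the Gram matrices of the generated image and of the style image, layer by layer.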
Let us start, as usual, by importing our libraries:
from keras.applications import vgg16
from keras import backend as K
from scipy.misc import imresize
import matplotlib.pyplot ...