
Leon A. Gatys

Researcher at University of Tübingen

Publications: 36
Citations: 10,110

Leon A. Gatys is an academic researcher from the University of Tübingen. The author has contributed to research on topics including convolutional neural networks and artificial neural networks. He has an h-index of 20 and has co-authored 36 publications receiving 7,559 citations. Previous affiliations of Leon A. Gatys include Apple Inc.

Papers
Proceedings ArticleDOI

Image Style Transfer Using Convolutional Neural Networks

TL;DR: A Neural Algorithm of Artistic Style is introduced that can separate and recombine the image content and style of natural images; the work provides new insights into the deep image representations learned by convolutional neural networks and demonstrates their potential for high-level image synthesis and manipulation.
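
The method summarised above reduces to two loss terms computed on the feature maps of a pre-trained CNN (VGG in the paper): a content loss that compares feature activations directly, and a style loss that compares Gram matrices of those activations. The following is a minimal NumPy sketch of those two terms; the random feature maps, the single-layer setup, and the weights alpha and beta are illustrative placeholders, not the paper's actual configuration.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map of shape (channels, height, width)."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)   # flatten spatial dimensions
    return f @ f.T                   # (c, c) channel co-activation statistics

def content_loss(generated, content):
    """Squared-error distance between feature maps at one layer."""
    return 0.5 * np.sum((generated - content) ** 2)

def style_loss(generated, style):
    """Squared-error distance between Gram matrices at one layer."""
    c, h, w = generated.shape
    g_gen, g_sty = gram_matrix(generated), gram_matrix(style)
    return np.sum((g_gen - g_sty) ** 2) / (4.0 * c ** 2 * (h * w) ** 2)

# Hypothetical feature maps for one layer (in practice taken from a pre-trained VGG network).
gen_feat = np.random.rand(64, 32, 32)
content_feat = np.random.rand(64, 32, 32)
style_feat = np.random.rand(64, 32, 32)

# Total objective: alpha and beta trade off content fidelity against style match.
alpha, beta = 1.0, 1e3
total = alpha * content_loss(gen_feat, content_feat) + beta * style_loss(gen_feat, style_feat)
print(total)
```

In the paper, the generated image itself is optimised by gradient-based methods to minimise a weighted sum of these losses accumulated over several network layers.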
Proceedings Article

Texture synthesis using convolutional neural networks

TL;DR: A new model of natural textures based on the feature spaces of convolutional neural networks optimised for object recognition is introduced; across layers, the texture representations increasingly capture the statistical properties of natural images while making object information progressively more explicit.
Posted Content

A Neural Algorithm of Artistic Style

TL;DR: This work introduces an artificial system based on a Deep Neural Network that creates artistic images of high perceptual quality and offers a path forward to an algorithmic understanding of how humans create and perceive artistic imagery.
Journal ArticleDOI

A Neural Algorithm of Artistic Style

TL;DR: In this article, an artificial system based on a Deep Neural Network that creates artistic images of high perceptual quality is presented. The system uses neural representations to separate and recombine the content and style of arbitrary images, providing a neural algorithm for the creation of artistic images.
Proceedings ArticleDOI

Controlling Perceptual Factors in Neural Style Transfer

TL;DR: The existing Neural Style Transfer method is extended to introduce control over spatial location, colour information, and spatial scale, enabling the combination of style information from multiple sources to generate new, perceptually appealing styles from existing ones.
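
The spatial control described in this paper is based, roughly, on guidance masks: feature maps are weighted by a per-region mask before the Gram matrix is computed, so different regions of the output can be matched to different style sources. The sketch below illustrates such a region-weighted (guided) Gram matrix in NumPy; the masks, shapes, and omitted normalisation are simplifying assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def guided_gram(features, mask):
    """Gram matrix restricted to a spatial region given by a guidance mask.

    features: (channels, height, width) feature map
    mask:     (height, width) weights in [0, 1] selecting the region
    """
    c, h, w = features.shape
    guided = features * mask[None, :, :]   # weight each spatial position by the mask
    f = guided.reshape(c, h * w)
    return f @ f.T

# Hypothetical example: match the top half of the image to one style
# and the bottom half to another by using complementary masks.
feat = np.random.rand(64, 32, 32)
top_mask = np.zeros((32, 32))
top_mask[:16, :] = 1.0
bottom_mask = 1.0 - top_mask

gram_top = guided_gram(feat, top_mask)        # compared against style A's guided Gram matrix
gram_bottom = guided_gram(feat, bottom_mask)  # compared against style B's guided Gram matrix
```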