Meta Networks for Neural Style Transfer
Peking University
arXiv:1709.04111 [cs.CV], 13 Sep 2017
@article{shen2017meta,
  title={Meta Networks for Neural Style Transfer},
  author={Shen, Falong and Yan, Shuicheng and Zeng, Gang},
  year={2017},
  month={sep},
  eprint={1709.04111},
  archivePrefix={arXiv},
  primaryClass={cs.CV}
}
In this paper, we propose a new method to obtain the specified network parameters through a single feed-forward propagation of a meta network, and explore its application to neural style transfer. Recent works on style transfer typically need to train an image transformation network for every new style, with the style encoded into the network parameters through enormous iterations of stochastic gradient descent. To tackle these issues, we build a meta network that takes in a style image and directly produces a corresponding image transformation network. Compared with optimization-based methods that retrain for every style, our meta network can handle an arbitrary new style within 19 ms on one modern GPU card. The fast image transformation network generated by our meta network is only 449 KB, small enough to run in real time on a mobile device. We also investigate the manifold of style transfer networks by operating on the hidden features of the meta network. Experiments validate the effectiveness of our method. Code and trained models have been released.
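To make the core idea concrete, here is a minimal PyTorch sketch of a hypernetwork-style setup like the one the abstract describes: a meta network maps style features to the weights of a (here deliberately tiny) image transformation network, so a new style costs one forward pass instead of a full SGD training run. This is not the paper's architecture; the layer sizes, the single generated conv layer, and all names (`MetaNet`, `TransformNet`, `feat_dim`, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformNet(nn.Module):
    """Tiny image transformation network. Only the middle conv layer's
    parameters are generated by the meta network (simplified sketch)."""
    def __init__(self):
        super().__init__()
        # Fixed (non-generated) input/output convolutions, for brevity.
        self.head = nn.Conv2d(3, 32, kernel_size=9, padding=4)
        self.tail = nn.Conv2d(32, 3, kernel_size=9, padding=4)

    def forward(self, x, w, b):
        # w, b: weight/bias for the middle conv, produced by MetaNet.
        h = torch.relu(self.head(x))
        h = torch.relu(F.conv2d(h, w, b, padding=1))
        return self.tail(h)

class MetaNet(nn.Module):
    """Maps a style-feature vector to conv parameters in one forward pass."""
    def __init__(self, feat_dim=512, out_ch=32, in_ch=32, k=3):
        super().__init__()
        self.out_ch, self.in_ch, self.k = out_ch, in_ch, k
        n_params = out_ch * in_ch * k * k + out_ch  # weights + biases
        self.fc = nn.Sequential(
            nn.Linear(feat_dim, 1024), nn.ReLU(),
            nn.Linear(1024, n_params))

    def forward(self, style_feat):
        # style_feat: 1-D tensor of shape (feat_dim,), e.g. pooled
        # VGG features of the style image (assumption).
        p = self.fc(style_feat)
        n_w = self.out_ch * self.in_ch * self.k * self.k
        w = p[:n_w].view(self.out_ch, self.in_ch, self.k, self.k)
        b = p[n_w:]
        return w, b

# Usage sketch: one meta-network pass per style, then the generated
# parameters stylize any number of content images.
meta, net = MetaNet(), TransformNet()
style_feat = torch.randn(512)          # stand-in for real style features
w, b = meta(style_feat)                # single feed-forward "training"
content = torch.randn(1, 3, 256, 256)  # a content image batch
stylized = net(content, w, b)
```

In the paper's full setting, the meta network would be trained end-to-end with the usual perceptual content and style losses over many styles, so that the generated parameters transfer the given style; the sketch above only shows the parameter-generation mechanics.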
September 16, 2017 by hgpu