Published on Wed Feb 28 2018

Stereoscopic Neural Style Transfer

Dongdong Chen, Lu Yuan, Jing Liao, Nenghai Yu, Gang Hua

Abstract

This paper presents the first attempt at stereoscopic neural style transfer, which responds to the emerging demand for 3D movies and AR/VR. We start with a careful examination of applying existing monocular style transfer methods to the left and right views of stereoscopic images separately. This reveals that the original disparity consistency cannot be well preserved in the final stylization results, which causes 3D fatigue for viewers. To address this issue, we incorporate a new disparity loss into the widely adopted style loss function by enforcing a bidirectional disparity constraint in non-occluded regions. For a practical real-time solution, we propose the first feed-forward network that jointly trains a stylization sub-network and a disparity sub-network, integrating them in a feature-level middle domain. Our disparity sub-network is also the first end-to-end network for simultaneous bidirectional disparity and occlusion mask estimation. Finally, our network is effectively extended to stereoscopic videos by considering both temporal coherence and disparity consistency. We will show that the proposed method clearly outperforms the baseline algorithms both quantitatively and qualitatively.
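The core idea of the disparity loss can be sketched as follows: warp each view's stylization into the other view using the (pre-computed) disparity, and penalize differences only where pixels are visible in both views. This is a minimal NumPy sketch under stated assumptions, not the authors' implementation: the function names are illustrative, the warp is nearest-neighbor along the horizontal epipolar lines of rectified stereo, and the masks are assumed to mark non-occluded pixels with 1.

```python
import numpy as np

def warp_horizontal(image, disparity):
    """Warp `image` along x by a per-pixel horizontal disparity
    (nearest-neighbor sampling, clamped at the image borders)."""
    h, w = disparity.shape
    xs = np.arange(w)[None, :] - np.rint(disparity).astype(int)
    xs = np.clip(xs, 0, w - 1)
    rows = np.arange(h)[:, None]
    return image[rows, xs]

def disparity_loss(stylized_l, stylized_r, disp_l, disp_r, mask_l, mask_r):
    """Bidirectional disparity loss: each view's stylization should match
    the other view warped by the disparity, restricted to non-occluded
    pixels (mask == 1). Returns the mean squared mismatch over both views."""
    # Sample the right stylization at the pixels the left view sees, and vice versa.
    warped_r_to_l = warp_horizontal(stylized_r, disp_l)
    warped_l_to_r = warp_horizontal(stylized_l, -disp_r)
    loss_l = np.sum(mask_l * (stylized_l - warped_r_to_l) ** 2) / max(mask_l.sum(), 1)
    loss_r = np.sum(mask_r * (stylized_r - warped_l_to_r) ** 2) / max(mask_r.sum(), 1)
    return loss_l + loss_r
```

In training, this term would be added to the usual content and style losses so the stylization network is penalized whenever corresponding pixels in the two views are stylized inconsistently.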

Tue Feb 27 2018
Computer Vision
Neural Stereoscopic Image Style Transfer
Neural style transfer is an emerging technique that can endow everyday images with attractive artistic styles. Previous work has succeeded in applying convolutional neural networks (CNNs) to style transfer for monocular images or videos. However, style transfer for stereoscopic images is still a missing piece.
Tue Nov 19 2019
Computer Vision
Two-Stream FCNs to Balance Content and Style for Style Transfer
Style transfer renders given image content in given styles, and it plays an important role in both fundamental computer vision research and industrial applications. We propose end-to-end two-stream Fully Convolutional Networks (FCNs) that balance the contributions of content and style in the rendered images.
Thu Sep 17 2020
Computer Vision
Arbitrary Video Style Transfer via Multi-Channel Correlation
Video style transfer is attracting increasing attention in the AI community for its numerous applications, such as augmented reality and animation production. A Multi-Channel Correlation network can be trained to fuse exemplar style features and input content features for efficient style transfer while naturally maintaining temporal coherence.
Mon Mar 27 2017
Computer Vision
Coherent Online Video Style Transfer
We propose the first end-to-end network for online video style transfer. It generates temporally coherent stylized video sequences in near real-time. Our network can incorporate different image stylization networks.
Thu May 27 2021
Computer Vision
Stylizing 3D Scene via Implicit Representation and HyperNetwork
In this work, we aim to address the 3D scene stylization problem. A straightforward solution is to combine existing novel view synthesis and image/video style transfer approaches. Inspired by the high-quality results of the neural radiance fields (NeRF) method, we propose a joint framework.
Sat Jul 06 2019
Computer Vision
Fast Universal Style Transfer for Artistic and Photorealistic Rendering
ArtNet and PhotoNet achieve a 3X to 100X speed-up over state-of-the-art algorithms, which is a major advantage for large content images. ArtNet generates images with fewer artifacts and distortions than state-of-the-art artistic style transfer algorithms.