Learning a Neural 3D Texture Space from 2D Exemplars

University College London · Adobe

Our approach allows casually captured 2D textures (blue) to be mapped to latent texture codes which can be decoded in 3D for synthesis, design, and interpolation (blue-red).


Abstract

We propose a generative model of 2D and 3D natural textures with high diversity, visual fidelity, and computational efficiency. This is enabled by a family of methods that extend ideas from classic stochastic procedural texturing (Perlin noise) to learned, deep non-linearities. The key idea is a hard-coded, tunable and differentiable step that feeds multiple transformed random 2D or 3D fields into an MLP that can be sampled over infinite domains. Our model encodes all exemplars from a diverse set of textures without needing to be re-trained for each exemplar. Applications include texture interpolation and learning 3D textures from 2D exemplars.
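The core sampling step can be sketched in a few lines of PyTorch. This is a minimal illustration under assumptions, not the paper's code: the function name, the tiling scheme, the number of fields, and the MLP widths are all hypothetical.

```python
import torch
import torch.nn.functional as F

def sample_noise_fields(coords, transforms, noise_grids):
    """Evaluate K transformed random 3D fields at arbitrary query points.

    coords:      (N, 3) query positions; the domain is effectively
                 unbounded because the noise is tiled, echoing
                 Perlin-style procedural noise.
    transforms:  (K, 3, 3) learned linear transformations (scale/rotation).
    noise_grids: list of K fixed random volumes, each (1, 1, D, H, W).
    """
    feats = []
    for T, grid in zip(transforms, noise_grids):
        warped = coords @ T.t()                            # transform the domain
        warped = torch.remainder(warped + 1.0, 2.0) - 1.0  # tile into [-1, 1]
        s = F.grid_sample(grid, warped.view(1, -1, 1, 1, 3),
                          align_corners=True)              # trilinear lookup
        feats.append(s.view(-1, 1))
    return torch.cat(feats, dim=1)                         # (N, K) noise features

# A point-wise MLP maps the K noise values at each position to RGB,
# so texture can be evaluated densely at any 2D or 3D coordinate.
mlp = torch.nn.Sequential(
    torch.nn.Linear(8, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 128), torch.nn.ReLU(),
    torch.nn.Linear(128, 3), torch.nn.Sigmoid())
```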


Overview

Overview of our approach, comprising three main parts: The first is an encoder g that takes as input texture images y and generates a compact latent code z (orange). A small translation network h converts this latent code into parameters p that condition a non-convolutional (MLP) decoder f (dotted), which takes noise sampled with learned transformations (green) and maps it to appearance (pink) with the same statistics as the exemplar (blue).
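A condensed PyTorch sketch may make the data flow through the three parts concrete. The module sizes and names are assumptions, and conditioning p into f by concatenation is a simplification of the paper's conditioning, chosen here only for brevity:

```python
import torch
import torch.nn as nn

class TextureSpace(nn.Module):
    def __init__(self, z_dim=64, n_fields=8, hidden=128):
        super().__init__()
        # g: convolutional encoder, 2D exemplar image y -> latent code z
        self.g = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, z_dim))
        # h: small translation network, z -> decoder parameters p
        self.h = nn.Linear(z_dim, hidden)
        # f: point-wise MLP decoder on sampled noise features
        self.f = nn.Sequential(
            nn.Linear(n_fields + hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid())

    def forward(self, y, noise_features):
        # noise_features: (N, n_fields), random fields sampled at N query
        # points with learned transformations (see sketch above)
        z = self.g(y)                       # (1, z_dim) latent texture code
        p = self.h(z)                       # (1, hidden) conditioning params
        p = p.expand(noise_features.size(0), -1)
        return self.f(torch.cat([noise_features, p], dim=1))  # (N, 3) RGB
```

Because f is evaluated point-wise on transformed noise, the same decoder can synthesize texture at any 2D or 3D coordinate, and interpolating two latent codes z interpolates the corresponding textures.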

3D Results
Texture classes (Wood, Marble, Grass, Rust) synthesized in 3D and applied to shapes (Cube, Sphere, Figurine).
2D Results
Comparisons against competing methods for the Wood, Grass, Marble, and Rust texture classes.
Links