Deep Point Correlation Design

Thomas Leimkühler1, Gurprit Singh1, Karol Myszkowski1, Hans-Peter Seidel1, Tobias Ritschel2
1Max Planck Institute for Informatics, Saarbrücken, 2University College London, UK
In ACM Transactions on Graphics (Proceedings of SIGGRAPH Asia 2019, Volume 38, Issue 6)
We propose a framework for designing correlated point patterns. A design specification (left) is used to train an architecture mapping random points to correlated ones (middle). The resultant points are useful in rendering, object placement or dithering (right).

Abstract

Designing point patterns with desired properties can require substantial effort, both in hand-crafting code and in mathematical derivation. Retaining these properties in multiple dimensions or for a large number of points can be challenging and computationally expensive. Tackling both issues, we suggest automatically generating scalable point patterns from design goals using deep learning. We phrase pattern generation as a deep composition of weighted distance-based unstructured filters. Deep point pattern design means optimizing over the space of all such compositions according to a user-provided point correlation loss, a small program which measures a pattern's fidelity with respect to its spatial or spectral statistics, linear or non-linear (e.g., radial) projections, or any arbitrary combination thereof. Our analysis shows that we can emulate a large set of existing patterns (blue-, green-, step-, projective-, stair-noise, etc.), generalize them to countless new combinations in a systematic way, and leverage existing error-estimation formulations to generate novel point patterns for a user-provided class of integrand functions. Our point patterns scale favorably to multiple dimensions and numbers of points: we demonstrate nearly 10k points in 10-D produced in one second on one GPU.
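To make the notion of a "point correlation loss" concrete, the following is a minimal NumPy sketch of one such small program: a periodogram-based radially averaged power spectrum of a 2-D point set, compared against a user-provided target spectrum. The function names and the specific binning scheme are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def radial_power_spectrum(points, n_freq=32):
    """Radially averaged periodogram of a 2-D point set in [0,1)^2.

    Hypothetical helper; the paper's actual loss may be defined differently.
    """
    n = points.shape[0]
    # Integer frequency grid, excluding the DC term at (0, 0).
    fx, fy = np.meshgrid(np.arange(-n_freq, n_freq + 1),
                         np.arange(-n_freq, n_freq + 1))
    freqs = np.stack([fx.ravel(), fy.ravel()], axis=1)
    freqs = freqs[np.any(freqs != 0, axis=1)]
    # Periodogram: |sum_j exp(-2 pi i <f, x_j>)|^2 / n for each frequency f.
    phase = -2j * np.pi * freqs @ points.T              # shape (F, n)
    power = np.abs(np.exp(phase).sum(axis=1)) ** 2 / n  # shape (F,)
    # Average power over annular frequency bins of unit width.
    radii = np.linalg.norm(freqs, axis=1)
    bins = np.clip(radii.astype(int), 0, n_freq)
    counts = np.bincount(bins, minlength=n_freq + 1)
    sums = np.bincount(bins, weights=power, minlength=n_freq + 1)
    spectrum = np.zeros(n_freq + 1)
    nonzero = counts > 0
    spectrum[nonzero] = sums[nonzero] / counts[nonzero]
    return spectrum

def correlation_loss(points, target_spectrum):
    """L2 distance between a pattern's radial spectrum and a design target."""
    spectrum = radial_power_spectrum(points, len(target_spectrum) - 1)
    return np.mean((spectrum - target_spectrum) ** 2)
```

A target spectrum of all ones corresponds to uncorrelated white noise, while a target that vanishes at low frequencies encodes a blue-noise design goal; optimizing the generating network against such a loss is what the abstract calls deep point pattern design.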

Material

Paper
Supplemental document
Source code
Powerpoint Slides (78MB) PDF Slides (20MB)
Fast Forward Video (100MB)

Pre-trained network

Fig8-BNOT (23MB) Fig8-Jitter (23MB) Fig8-Step (23MB) Fig10-BNOT-Step-Jitter (34MB)
All trainings are provided for N=1024 samples. Each .zip file contains a training-parameters file (env.txt), the trained kernels, and 1000 realizations of the respective sampling pattern.

Acknowledgements

We would like to thank all the reviewers for their detailed and constructive feedback. This work was partly supported by the Fraunhofer and the Max Planck cooperation program within the framework of the German pact for research and innovation (PFI).

Copyright Disclaimer

The Author(s) / ACM. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive Version of Record is available at doi.acm.org.
