Neural Wavelet-domain Diffusion for 3D Shape Generation, Inversion, and Manipulation
Publication in refereed journal


Other information
Abstract: This paper presents a new approach for 3D shape generation, inversion, and manipulation, through direct generative modeling on a continuous implicit representation in the wavelet domain. Specifically, we propose a compact wavelet representation with a pair of coarse and detail coefficient volumes to implicitly represent 3D shapes via truncated signed distance functions and multi-scale biorthogonal wavelets. We then design a pair of neural networks: a diffusion-based generator that produces diverse shapes in the form of coarse coefficient volumes, and a detail predictor that produces compatible detail coefficient volumes to introduce fine structures and details. Further, we may jointly train an encoder network to learn a latent space for inverting shapes, enabling a rich variety of whole-shape and region-aware shape manipulations. Both quantitative and qualitative experimental results demonstrate the compelling shape generation, inversion, and manipulation capabilities of our approach over state-of-the-art methods.
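To make the wavelet-domain representation described in the abstract concrete, below is a minimal sketch, not the authors' code: it uses the PyWavelets library to split a dummy truncated signed distance field into a coarse coefficient volume and detail coefficient volumes, and to invert that decomposition. The grid size, wavelet choice ("bior6.8"), and decomposition level are illustrative assumptions rather than the paper's exact settings; the diffusion generator and detail predictor are only indicated in comments.

```python
# Sketch only (assumed settings): coarse/detail wavelet coefficient volumes
# for a truncated signed distance field, using biorthogonal wavelets.
import numpy as np
import pywt

WAVELET = "bior6.8"   # a multi-scale biorthogonal wavelet (assumed choice)
LEVEL = 2             # decomposition depth (assumed; the paper may differ)

# Dummy truncated SDF sampled on a dense 3D grid (stand-in for a real shape).
tsdf = np.random.randn(128, 128, 128).astype(np.float32)

# Multi-level 3D wavelet decomposition:
#   coeffs[0]  -> coarse coefficient volume
#   coeffs[1:] -> detail coefficient volumes, one dict per scale
coeffs = pywt.wavedecn(tsdf, WAVELET, level=LEVEL)
coarse = coeffs[0]
details = coeffs[1:]

# In the paper's pipeline (indicated here, not implemented): a diffusion model
# would generate a new `coarse` volume, a detail predictor would regress
# compatible `details`, and the shape is recovered by the inverse transform.
recon = pywt.waverecn([coarse] + list(details), WAVELET)
print(coarse.shape, recon.shape)
```

Operating on the compact coarse coefficient volume keeps the diffusion model's target small, while the predicted detail coefficients restore fine structures through the inverse wavelet transform, matching the two-network design described above.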
All Author(s) List: Jingyu Hu, Ka-Hei Hui, Zhengzhe Liu, Ruihui Li, Chi-Wing Fu
Journal name: ACM Transactions on Graphics
Detailed description: Jingyu Hu^, Ka-Hei Hui^, Zhengzhe Liu, Ruihui Li, and Chi-Wing Fu (^ joint first authors)
Year: 2024
Month: 4
Volume Number: 43
Issue Number: 2
Publisher: ACM
Article number: 16
Pages: 1 - 18
ISSN: 0730-0301
Languages: English-United States
