Meta-PU: An Arbitrary-Scale Upsampling Network for Point Cloud

Research output: Journal Publications and Reviews; RGC 21 - Publication in refereed journal; peer-reviewed

58 Scopus Citations

Detail(s)

Original language: English
Pages (from-to): 3206-3218
Journal / Publication: IEEE Transactions on Visualization and Computer Graphics
Volume: 28
Issue number: 9
Online published: 9 Feb 2021
Publication status: Published - 1 Sept 2022

Abstract

Point cloud upsampling is vital for the quality of the mesh in three-dimensional reconstruction. Recent research on point cloud upsampling has achieved great success owing to the development of deep learning. However, existing methods treat point cloud upsampling at different scale factors as independent tasks, so a separate model must be trained for each scale factor, which is inefficient and impractical in terms of storage and computation in real applications. To address this limitation, we propose a novel method called "Meta-PU", the first to support point cloud upsampling with arbitrary scale factors using a single model. In Meta-PU, besides a backbone network composed of residual graph convolution (RGC) blocks, a meta-subnetwork is learned to adjust the weights of the RGC blocks dynamically according to the scale factor, and a farthest sampling block is adopted to select the required number of output points. Together, these components enable Meta-PU to continuously upsample a point cloud at arbitrary scale factors with only a single model. In addition, experiments reveal that training on multiple scales simultaneously benefits each individual scale. As a result, Meta-PU even outperforms existing methods trained for a single specific scale factor.
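To make the two ideas in the abstract concrete, the sketch below illustrates (1) a meta-subnetwork that predicts the weights of a graph-convolution layer from the scale factor r, and (2) farthest point sampling used to keep exactly the requested number of output points. This is a minimal PyTorch sketch under assumed shapes and layer sizes, not the authors' implementation; all class and function names here are illustrative.

```python
# Illustrative sketch of scale-conditioned ("meta") graph convolution and
# farthest point sampling; not the Meta-PU reference code.
import torch
import torch.nn as nn

class MetaGraphConv(nn.Module):
    """Graph convolution whose weight matrix is predicted from the scale factor r."""
    def __init__(self, in_dim, out_dim, hidden=64):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        # Meta-subnetwork: scalar r -> flattened (in_dim x out_dim) weight matrix.
        self.meta = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, in_dim * out_dim),
        )

    def forward(self, feats, neighbors, r):
        # feats: (B, N, in_dim); neighbors: (B, N, K) indices of K nearest points; r: scalar tensor.
        w = self.meta(r.view(1, 1)).view(self.in_dim, self.out_dim)
        b_idx = torch.arange(feats.size(0)).view(-1, 1, 1)
        agg = feats[b_idx, neighbors].mean(dim=2)   # aggregate neighbor features, (B, N, in_dim)
        return torch.relu(agg @ w)                  # apply predicted weights, (B, N, out_dim)

def farthest_point_sample(xyz, m):
    """Greedy FPS: pick m indices from xyz (B, N, 3) so the chosen points are mutually far apart."""
    B, N, _ = xyz.shape
    idx = torch.zeros(B, m, dtype=torch.long)
    dist = torch.full((B, N), float("inf"))
    farthest = torch.zeros(B, dtype=torch.long)     # start from point 0 in each batch
    for i in range(m):
        idx[:, i] = farthest
        centroid = xyz[torch.arange(B), farthest].unsqueeze(1)        # (B, 1, 3)
        dist = torch.minimum(dist, ((xyz - centroid) ** 2).sum(-1))   # distance to nearest chosen point
        farthest = dist.argmax(dim=1)                                 # next point is the farthest one
    return idx
```

In this reading, a continuous scale factor such as r = 2.5 applied to N input points would drive the meta-subnetwork to generate layer weights for that scale, and farthest point sampling would then reduce an over-generated set to round(r * N) points, which is how a single model can serve arbitrary scales.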

Research Area(s)

  • Computational modeling, Convolution, Deep learning, Feature extraction, Meta-learning, Neural networks, Point cloud, Task analysis, Three-dimensional displays, Upsampling