CPFN: Cascaded Primitive Fitting Networks for High-Resolution Point Clouds

Eric-Tuan Lê1         Minhyuk Sung2         Duygu Ceylan3         Radomir Mech3         Tamy Boubekeur3         Niloy J. Mitra1,3

1University College London     2 KAIST     3 Adobe Research

ICCV 2021

A side-by-side comparison between SPFN and our CPFN. Our cascaded networks (CPFN) are designed to accurately detect and fit small primitives in a high-resolution point cloud. It includes two SPFNs: one for the entire object and the other for local patches. A Patch Selection network takes a downsampled point cloud as input and determines where the local patches should be sampled at test time. The per-point predictions from both SPFNs are then integrated in the merging step.


Representing human-made objects as a collection of base primitives has a long history in computer vision and reverse engineering. For high-resolution point cloud scans, the challenge is to detect both large primitives and those explaining the fine-detail parts. While the classical RANSAC approach requires case-specific parameter tuning, state-of-the-art networks are limited by the memory consumption of their backbone modules, such as PointNet++, and hence fail to detect fine-scale primitives. We present Cascaded Primitive Fitting Networks (CPFN) that rely on an adaptive patch sampling network to assemble the detection results of global and local primitive detection networks. As a key enabler, we present a merging formulation that dynamically aggregates primitives across global and local scales. Our evaluation demonstrates that CPFN improves the state-of-the-art SPFN performance by 13-14% on high-resolution point cloud datasets and, specifically, improves the detection of fine-scale primitives by 20-22%. Our code is available at: https://github.com/erictuanle/CPFN.

Pipeline Architecture

Given an input point cloud with N points (N = 128k in our experiments), our networks operate at two levels, namely global and local. Because of the high memory footprint of point cloud processing backbones such as PointNet++, we train a global primitive fitting network, SPFN, on downsampled versions of the input point clouds (i.e., on point clouds of size n < N). We train an additional SPFN that operates on local patches of the high-resolution point clouds. Given the local predictions for the patches and the global predictions for the rest of the point cloud, the core of our method is a novel merging step that consolidates all the predictions. To ensure that the capacity of the local network is spent on learning to predict small primitives, during training we select training patches from regions of the point cloud that contain such primitives. At inference time, a patch selection network predicts the regions that are likely to contain small primitives and should therefore be processed by the local network.
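The two-level inference described above can be sketched as follows. This is a minimal, illustrative pipeline, not the authors' implementation: `global_net`, `local_net`, and `select_net` are hypothetical stand-ins for the global SPFN, the patch-level SPFN, and the patch selection network, the downsampling is plain random subsampling, and the merge simply lets local predictions override global ones on covered points.

```python
import numpy as np

rng = np.random.default_rng(0)

def downsample(points, n):
    """Random subsampling stand-in for the downsampling used before the global SPFN."""
    idx = rng.choice(len(points), size=n, replace=False)
    return points[idx], idx

def extract_patch(points, center, k):
    """Indices of the k nearest neighbours around a predicted patch center."""
    d = np.linalg.norm(points - center, axis=1)
    return np.argsort(d)[:k]

def cascaded_inference(points, global_net, local_net, select_net,
                       n_low=8192, n_patches=4, patch_size=2048):
    """Global SPFN on a downsampled cloud, local SPFN on patches picked by the
    patch-selection scores, then a naive merge of per-point primitive labels."""
    low_pts, _ = downsample(points, n_low)
    global_labels_low = global_net(low_pts)          # (n_low,) primitive ids
    # Propagate global labels to full resolution via nearest low-res point.
    nn = np.argmin(np.linalg.norm(points[:, None] - low_pts[None], axis=2), axis=1)
    labels = global_labels_low[nn]
    # Patch-selection scores mark low-res points likely to lie on small primitives.
    scores = select_net(low_pts)                     # (n_low,)
    centers = low_pts[np.argsort(scores)[-n_patches:]]
    next_id = labels.max() + 1
    for c in centers:
        patch_idx = extract_patch(points, c, patch_size)
        local_labels = local_net(points[patch_idx])  # (patch_size,) patch-local ids
        labels[patch_idx] = local_labels + next_id   # keep local ids distinct
        next_id = labels.max() + 1
    return labels
```

The paper's actual merging step is a learned, dynamic aggregation across scales; the label-override rule here is only a placeholder showing where that consolidation happens.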

Evaluation of the Quality of Primitive Fitting on TraceParts

By improving the detection of smaller primitives, CPFN achieves a substantial boost in all metrics. Specifically, we obtain significant improvements in mIoU (+13.35%), primitive type accuracy (+6.95%), point normal accuracy (a 38.81% reduction in normal error), and {Sk} coverage (+3.70% at ε = 0.01).
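The segment mIoU reported above measures overlap between predicted and ground-truth primitive segments under the best one-to-one matching. The sketch below is an illustrative re-implementation of that idea, not the authors' evaluation code; it brute-forces the matching with permutations, which is fine for the small segment counts of an example.

```python
from itertools import permutations

import numpy as np

def segment_miou(pred, gt):
    """Mean IoU between predicted and ground-truth primitive segments under the
    best one-to-one matching (brute force over permutations for small inputs).

    pred, gt: integer label per point; label values themselves need not agree,
    only the induced segmentations matter."""
    p_ids, g_ids = np.unique(pred), np.unique(gt)
    G, P = len(g_ids), len(p_ids)
    iou = np.zeros((G, max(P, G)))  # zero-pad columns if fewer predictions
    for i, g in enumerate(g_ids):
        gm = gt == g
        for j, p in enumerate(p_ids):
            pm = pred == p
            iou[i, j] = np.logical_and(gm, pm).sum() / np.logical_or(gm, pm).sum()
    # Best injective assignment of ground-truth segments to predicted segments.
    best = max(iou[np.arange(G), list(m)].sum()
               for m in permutations(range(max(P, G)), G))
    return best / G
```

For example, a prediction that reproduces the ground-truth partition under renamed labels scores 1.0, while misassigned points lower the mean IoU of their matched segments.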

Generalization to Unseen ABC Dataset

While the use of local patches improves the generalization capability of CPFN, performance degrades as the shapes become significantly different from those seen during training.

Evaluation on Real Scans

The resolution of the provided scans is much lower than that of our synthetic dataset: only 20k points compared to the 128k points used in our experiments. The pattern and scale of the noise also differ significantly from our previous experiments. Yet, CPFN produces reasonable results.


@inproceedings{le2021cpfn,
  title={CPFN: Cascaded Primitive Fitting Networks for High-Resolution Point Clouds},
  author={Lê, Eric-Tuan and Sung, Minhyuk and Ceylan, Duygu and Mech, Radomir and Boubekeur, Tamy and Mitra, Niloy J.},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  year={2021}
}