PartDistillation: Learning Parts from Instance Segmentation

Jang Hyun Cho1,2 Philipp Krähenbühl2 Vignesh Ramanathan1
1FAIR, Meta AI 2The University of Texas at Austin

Overview

We present a scalable framework for learning part segmentation from object instance labels. State-of-the-art instance segmentation models contain a surprising amount of part information, but much of it is hidden from plain view: for each object instance, the part information is noisy, inconsistent, and incomplete. PartDistillation transfers the part information of an instance segmentation model into a part segmentation model through self-supervised self-training on a large dataset. The resulting segmentation model is robust, accurate, and generalizes well. We evaluate the model on various part segmentation datasets. Our model outperforms supervised part segmentation in zero-shot generalization by a large margin, and when finetuned on target datasets it outperforms the supervised counterpart and other baselines, especially in the few-shot regime. Finally, our model provides wider coverage of rare parts when evaluated over 10K object classes.
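The overview above describes the pipeline only at a high level. As a rough illustration, the Python sketch below forms pseudo part labels by clustering per-pixel features of an instance segmentation model inside a predicted object mask; the function name, feature shapes, and the use of k-means here are illustrative assumptions, not the exact procedure from the paper.

import torch
from sklearn.cluster import KMeans

def propose_parts(pixel_features: torch.Tensor,
                  object_mask: torch.Tensor,
                  num_parts: int = 4) -> torch.Tensor:
    """Cluster per-pixel features inside one object mask into pseudo part labels.

    pixel_features: (C, H, W) feature map from an instance segmentation model.
    object_mask:    (H, W) boolean mask of a single detected object instance.
    Returns an (H, W) map with -1 outside the object and a part id inside.
    """
    C, H, W = pixel_features.shape
    feats = pixel_features.permute(1, 2, 0).reshape(-1, C)   # (H*W, C)
    inside = object_mask.reshape(-1)                          # (H*W,)

    # Cluster only the pixels that belong to this object instance.
    obj_feats = feats[inside].detach().cpu().numpy()
    labels = KMeans(n_clusters=num_parts, n_init=10).fit_predict(obj_feats)

    part_map = torch.full((H * W,), -1, dtype=torch.long)
    part_map[inside] = torch.from_numpy(labels).long()
    return part_map.reshape(H, W)

In the actual framework, such pseudo part masks would serve as targets for self-training a part segmentation model over a large unlabeled dataset.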

Why PartDistillation?

Main Results

State-of-the-art unsupervised part segmentation model. Better zero-shot part segmentation than supervised baselines. More label-efficient than supervised baselines in the few-shot regime.

Manual Evaluation

PartDistillation pairs with a strong open-vocabulary object detector and covers over 10K object categories. However, standard benchmarks such as PartImageNet and PascalPart only cover common object categories. Therefore, we evaluate our model outside the coverage of annotated datasets through extensive manual evaluation. We compare PartDistillation against a supervised model trained on PartImageNet as a strong baseline. PartDistillation is more precise, diverse, and general than the supervised part segmentation model.

Qualitative Results

Random Examples of Discovered Parts

[More examples]

People


Jang Hyun Cho

Philipp Krähenbühl

Vignesh Ramanathan

Paper

J. H. Cho, P. Krähenbühl, V. Ramanathan
PartDistillation: Learning Parts from Instance Segmentation
CVPR 2023
[paper] [supp] [code/models (coming soon)] [bibtex]

Acknowledgement

This material is based in part upon work supported by the National Science Foundation under Grant Nos. IIS-1845485 and IIS-2006820.