On Iterative Neural Network Pruning, Reinitialization, and the Similarity of Masks

ICLR workshop on Practical ML for Developing Countries

Abstract

Iterative pruning methods, such as lottery ticket pruning, provide evidence that models with state-of-the-art properties can be used on low-compute devices such as mobile phones. We examine how recently documented, fundamental phenomena in pruned deep learning models are affected by changes in the pruning procedure. We address questions of the uniqueness of high-sparsity sub-networks and their dependence on pruning method by analyzing differences in connectivity structure and learning dynamics. In convolutional layers, we document the emergence of structure induced by magnitude-based unstructured pruning in conjunction with weight rewinding that resembles the effects of structured pruning.
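The iterative magnitude pruning with weight rewinding referenced above can be sketched roughly as follows. This is a minimal illustration only, not the paper's implementation: `train_step` is a hypothetical stand-in for a full training run, and the round count and per-round sparsity are arbitrary choices.

```python
import numpy as np

def iterative_prune_with_rewind(init_weights, train_step,
                                rounds=3, per_round_sparsity=0.5):
    """Iterative magnitude pruning with weight rewinding (lottery-ticket style).

    Each round: train the masked network, prune the smallest-magnitude
    surviving weights, then rewind survivors to their initial values.
    `train_step` stands in for a full training run (hypothetical here).
    """
    mask = np.ones_like(init_weights, dtype=bool)
    weights = init_weights.copy()
    for _ in range(rounds):
        weights = train_step(weights * mask)  # train the masked network
        # prune a fraction of the remaining weights by magnitude
        surviving = np.abs(weights[mask])
        k = int(round(per_round_sparsity * surviving.size))
        if k > 0:
            threshold = np.sort(surviving)[k - 1]
            mask &= np.abs(weights) > threshold
        weights = init_weights.copy()  # rewind to the original initialization
    return mask, init_weights * mask
```

The mask returned after several rounds is the sparse connectivity structure whose uniqueness and dependence on pruning method the abstract discusses.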
