Abstract
Pre-trained large-scale language models have increasingly demonstrated high accuracy on many natural language processing (NLP) tasks. However, limited weight storage and computational speed on hardware platforms have impeded the adoption of pre-trained models, especially in the era of edge computing. In this work, we propose an efficient transformer-based large-scale language representation using hardware-friendly block-structured pruning. We incorporate the reweighted group Lasso into block-structured pruning for optimization. Besides significantly reducing weight storage and computation, the proposed approach achieves a high compression rate. Experimental results on different models (BERT, RoBERTa, and DistilBERT) on the General Language Understanding Evaluation (GLUE) benchmark tasks show that we achieve up to a 5.0× compression rate with zero or minor accuracy degradation on certain task(s). Our proposed method is also orthogonal to existing compact pre-trained language models such as DistilBERT, which uses knowledge distillation: a further 1.79× average compression rate can be achieved on top of DistilBERT with zero or minor accuracy degradation. The final compressed model is suitable for deployment on resource-constrained edge devices. We share the related codes and models at: https://bit.ly/3cvs2N2
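
For readers unfamiliar with how block-structured pruning and a reweighted group Lasso penalty fit together, the sketch below illustrates the general idea in PyTorch. It is not the authors' implementation (see the linked repository for that); the block size, the reweighting constant `eps`, the keep ratio, and the helper names `block_group_norms`, `reweighted_group_lasso`, and `prune_blocks` are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F


def block_group_norms(weight: torch.Tensor, block: int) -> torch.Tensor:
    """Frobenius norm of every (block x block) tile of a 2-D weight matrix."""
    rows, cols = weight.shape
    # Zero-pad so both dimensions divide evenly into blocks.
    w = F.pad(weight, (0, (-cols) % block, 0, (-rows) % block))
    tiles = w.unfold(0, block, block).unfold(1, block, block)  # (R, C, block, block)
    return tiles.reshape(tiles.shape[0], tiles.shape[1], -1).norm(dim=-1)


def reweighted_group_lasso(weight: torch.Tensor, block: int = 16, eps: float = 1e-3) -> torch.Tensor:
    """Reweighted group-Lasso penalty: sum_g ||W_g|| / (||W_g||_detached + eps).
    Detaching the coefficients keeps them fixed within a reweighting round,
    so blocks that are already small get pushed harder toward zero."""
    norms = block_group_norms(weight, block)
    coeffs = 1.0 / (norms.detach() + eps)
    return (coeffs * norms).sum()


def prune_blocks(weight: torch.Tensor, block: int = 16, keep_ratio: float = 0.2) -> torch.Tensor:
    """Zero out the blocks with the smallest norms, keeping `keep_ratio` of them."""
    norms = block_group_norms(weight, block)
    k = max(1, int(keep_ratio * norms.numel()))
    threshold = norms.flatten().topk(k).values.min()
    mask = (norms >= threshold).repeat_interleave(block, 0).repeat_interleave(block, 1)
    mask = mask[: weight.shape[0], : weight.shape[1]]
    with torch.no_grad():
        weight.mul_(mask.to(weight.dtype))
    return mask


if __name__ == "__main__":
    layer = torch.nn.Linear(768, 768)          # stand-in for one transformer weight matrix
    x = torch.randn(4, 768)
    loss = layer(x).pow(2).mean()              # placeholder task loss
    loss = loss + 1e-4 * reweighted_group_lasso(layer.weight, block=16)
    loss.backward()                            # a real run would repeat this with an optimizer
    mask = prune_blocks(layer.weight, block=16, keep_ratio=0.2)
    print(f"kept {mask.float().mean().item():.1%} of weights in 16x16 blocks")
```

The reweighting term 1/(||W_g|| + eps) penalizes small-norm blocks more heavily, pushing them toward exact zero so that entire blocks can be removed; pruning at block granularity rather than per weight is what keeps the resulting sparsity pattern hardware-friendly.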
| Original language | English (US) |
| --- | --- |
| Title of host publication | Findings of the Association for Computational Linguistics: Findings of ACL |
| Subtitle of host publication | EMNLP 2020 |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 3187-3199 |
| Number of pages | 13 |
| ISBN (Electronic) | 9781952148903 |
| State | Published - 2020 |
| Externally published | Yes |
| Event | Findings of the Association for Computational Linguistics, ACL 2020: EMNLP 2020 - Virtual, Online; Nov 16 2020 → Nov 20 2020 |
Publication series

| Name | Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2020 |
| --- | --- |
Conference

| Conference | Findings of the Association for Computational Linguistics, ACL 2020: EMNLP 2020 |
| --- | --- |
| City | Virtual, Online |
| Period | 11/16/20 → 11/20/20 |
Bibliographical note

Publisher Copyright: © 2020 Association for Computational Linguistics