VCC: Scaling Transformers to 128K Tokens or More by Prioritizing Important Tokens

Zhanpeng Zeng, Mingyi Hong, Cole Hawkins, Aston Zhang, Nikolaos Pappas, Vikas Singh, Shuai Zheng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Transformers are central in modern natural language processing and computer vision applications. Despite recent works devoted to reducing the quadratic cost of such models with respect to sequence length, dealing with ultra long sequences (e.g., >16K tokens) remains challenging. Applications such as answering questions based on a book or summarizing a scientific article are inefficient or infeasible. Here, we propose to significantly improve the efficiency of Transformers for ultra long sequences, by compressing the sequence into a much smaller representation at each layer. Specifically, by exploiting the fact that in many tasks, only a small subset of special tokens, which we call VIP-tokens, are most relevant to the final prediction, we propose a VIP-token centric compression (VCC) scheme which selectively compresses the sequence based on their impact on approximating the representation of the VIP-tokens. Compared with competitive baselines, our algorithm is not only efficient (achieving more than 3× compute efficiency gain compared to baselines on 4K and 16K lengths), but also offers competitive/better performance on a large number of tasks. Further, we show that our algorithm scales to 128K tokens (or more) while consistently offering accuracy improvement. Code is available at https://github.com/mlpen/VCC.
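
The abstract describes the idea only at a high level. The toy sketch below illustrates one plausible reading of VIP-token centric compression: keep the VIP tokens exact, keep the non-VIP tokens they attend to most, and coarsen the rest. It is not the paper's actual VCC algorithm (see the linked repository for that); the function name `vip_guided_compression`, the `keep_ratio` parameter, the attention-based importance score, and the mean-pooled summaries are all illustrative assumptions.

```python
import numpy as np


def vip_guided_compression(x, vip_idx, keep_ratio=0.25):
    """Toy sketch: shrink a long sequence while keeping VIP tokens exact.

    x          : (seq_len, d) array of token representations
    vip_idx    : indices of VIP tokens (e.g., the question in a QA task)
    keep_ratio : fraction of non-VIP tokens kept at full resolution

    Non-VIP tokens are scored by how much attention the VIP tokens pay to
    them; the lowest-scoring tokens are mean-pooled into coarse segment
    summaries so later layers see a much shorter sequence.
    """
    seq_len, d = x.shape
    vip = x[vip_idx]                                    # (v, d)

    # Softmax attention from the VIP tokens to every position.
    logits = vip @ x.T / np.sqrt(d)                     # (v, seq_len)
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    importance = weights.sum(axis=0)                    # (seq_len,)

    # Split non-VIP tokens into "keep as-is" and "compress".
    non_vip = np.setdiff1d(np.arange(seq_len), vip_idx)
    order = non_vip[np.argsort(-importance[non_vip])]
    n_keep = max(1, int(keep_ratio * len(non_vip)))
    kept, squeezed = order[:n_keep], np.sort(order[n_keep:])

    # Mean-pool the unimportant tokens into a handful of summaries.
    n_summaries = max(1, len(squeezed) // 8)
    summaries = np.stack([x[chunk].mean(axis=0)
                          for chunk in np.array_split(squeezed, n_summaries)])

    return np.concatenate([x[vip_idx], x[kept], summaries], axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    tokens = rng.normal(size=(4096, 64))                # a "long" toy sequence
    out = vip_guided_compression(tokens, vip_idx=np.arange(16))
    print(tokens.shape, "->", out.shape)                # (4096, 64) -> (1418, 64)
```

Unlike this one-shot sketch, the paper applies compression at each layer of the Transformer and chooses what to compress based on how well the VIP-token representations are preserved; the snippet is only meant to convey the shape of the idea.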

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 36 - 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Editors: A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, S. Levine
Publisher: Neural Information Processing Systems Foundation
ISBN (Electronic): 9781713899921
State: Published - 2023
Event: 37th Conference on Neural Information Processing Systems, NeurIPS 2023 - New Orleans, United States
Duration: Dec 10, 2023 - Dec 16, 2023

Publication series

Name: Advances in Neural Information Processing Systems
Volume: 36
ISSN (Print): 1049-5258

Conference

Conference: 37th Conference on Neural Information Processing Systems, NeurIPS 2023
Country/Territory: United States
City: New Orleans
Period: 12/10/23 - 12/16/23

Bibliographical note

Publisher Copyright:
© 2023 Neural information processing systems foundation. All rights reserved.
