PAUSE: Positive and Annealed Unlabeled Sentence Embedding

Lele Cao, Emil Larsson, Vilhelm von Ehrenheim, Dhiana Deva Cavalcanti Rocha, Anna Martin, Sonja Horn

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Scopus citation

Abstract

Sentence embedding refers to a set of effective and versatile techniques for converting raw text into numerical vector representations that can be used in a wide range of natural language processing (NLP) applications. The majority of these techniques are either supervised or unsupervised. Compared to the unsupervised methods, the supervised ones make fewer assumptions about optimization objectives and usually achieve better results. However, their training requires a large number of labeled sentence pairs, which are not available in many industrial scenarios. To that end, we propose a generic and end-to-end approach - PAUSE (Positive and Annealed Unlabeled Sentence Embedding), capable of learning high-quality sentence embeddings from a partially labeled dataset. We experimentally show that PAUSE achieves, and sometimes surpasses, state-of-the-art results using only a small fraction of labeled sentence pairs on various benchmark tasks. When applied to a real industrial use case where labeled samples are scarce, PAUSE encourages us to extend our dataset without the burden of extensive manual annotation work.
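For context, the abstract's notion of sentence embedding (mapping raw text to fixed-size numerical vectors whose geometry reflects meaning) can be illustrated with the minimal sketch below. It uses the off-the-shelf sentence-transformers library and the all-MiniLM-L6-v2 checkpoint purely as an assumed stand-in encoder; it is not the PAUSE model described in the paper.

# Minimal sketch of sentence embedding in general (stand-in encoder, not PAUSE):
# sentences are encoded into fixed-size vectors, and semantic similarity is
# measured as cosine similarity between those vectors.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed off-the-shelf checkpoint

sentences = [
    "The firm raised a new growth fund.",
    "A new growth fund was raised by the company.",
    "The weather in Stockholm is cold today.",
]
embeddings = model.encode(sentences)  # one vector per sentence

# Semantically similar sentences (the first two) should score higher
# than the unrelated third one.
print(util.cos_sim(embeddings[0], embeddings[1]).item())
print(util.cos_sim(embeddings[0], embeddings[2]).item())

In the same spirit, PAUSE trains such an encoder from sentence pairs that are only partially labeled, rather than requiring every pair to be annotated.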

Original language: English (US)
Title of host publication: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Pages: 10096-10107
Number of pages: 12
ISBN (Electronic): 9781955917094
State: Published - 2021
Event: 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021 - Virtual, Punta Cana, Dominican Republic
Duration: Nov 7, 2021 - Nov 11, 2021

Publication series

Name: EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings

Conference

Conference: 2021 Conference on Empirical Methods in Natural Language Processing, EMNLP 2021
Country/Territory: Dominican Republic
City: Virtual, Punta Cana
Period: 11/7/21 - 11/11/21

Bibliographical note

Funding Information:
EQT Group and the Motherbrain team have provided great support along the journey of accomplishing this work; particularly, we would like to appreciate the insights/support of all sorts from (alphabetically ordered) Alex Patow, Andjela Kusmuk, Andreas Beccau, Andrey Melentyev, Anton Andersson Andrejic, Anton Ask Åström, Daniel Ström, Daniel Wroblewski, Elin Bäcklund, Emil Broman, Emma Sjöström, Erik Ferm, Filip Byrén, Guillermo Rodas, Hannes Ingelhag, Henrik Landgren, Joar Wandborg, Love Larsson, Lucas Magnum, Niklas Skaar, Peter Finnman, Sarah Bernelind, Olof Hernell, Pietro Casella, Richard Stahl, Sebastian Lindblom, Sven Törnkvist, Ylva Lundegård. Additionally, the first author would also like to thank Xiaolong Liu (Intel Labs) and Xiaoxue Li (Shanghai University of Finance and Economics) for the initial discussion around PU learning methodologies and their applications.

Publisher Copyright:
© 2021 Association for Computational Linguistics
