SPABERT: A Pretrained Language Model from Geographic Data for Geo-Entity Representation

Zekun Li, Jina Kim, Yao-Yi Chiang, Muhao Chen

Research output: Contribution to conference › Paper › peer-review


Abstract

Named geographic entities (geo-entities for short) are the building blocks of many geographic datasets. Characterizing geo-entities is integral to various application domains, such as geo-intelligence and map comprehension, while a key challenge is to capture the spatially varying context of an entity. We hypothesize that we shall know the characteristics of a geo-entity by its surrounding entities, similar to knowing word meanings by their linguistic context. Accordingly, we propose a novel spatial language model, SPABERT, which provides a general-purpose geo-entity representation based on neighboring entities in geospatial data. SPABERT extends BERT to capture linearized spatial context, while incorporating a spatial coordinate embedding mechanism to preserve spatial relations of entities in the 2-dimensional space. SPABERT is pretrained with masked language modeling and masked entity prediction tasks to learn spatial dependencies. We apply SPABERT to two downstream tasks: geo-entity typing and geo-entity linking. Compared with existing language models that do not use spatial context, SPABERT shows significant performance improvement on both tasks. We also analyze the entity representation from SPABERT in various settings and the effect of spatial coordinate embedding.
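The two ideas the abstract names, linearizing an entity's spatial context and embedding continuous spatial relations, can be illustrated with a minimal sketch. The function and data names below are hypothetical and this is not SPABERT's actual implementation: it only shows one plausible way to order neighbors by distance from a center entity and to map a real-valued distance to a sinusoidal embedding, analogous in spirit to the paper's spatial coordinate embedding.

```python
import math

def linearize_neighbors(center, neighbors):
    """Order neighboring geo-entities by Euclidean distance from the
    center entity, producing a 1-D sequence (a sketch of the
    'linearized spatial context' idea)."""
    cx, cy = center["coord"]

    def dist(entity):
        x, y = entity["coord"]
        return math.hypot(x - cx, y - cy)

    ordered = sorted(neighbors, key=dist)
    return [(e["name"], dist(e)) for e in ordered]

def distance_embedding(d, dim=8, scale=10000.0):
    """Continuous sinusoidal embedding of a real-valued distance,
    one simple way to encode spatial relations as a vector that can
    be added to token embeddings."""
    emb = []
    for i in range(dim // 2):
        freq = 1.0 / (scale ** (2 * i / dim))
        emb.append(math.sin(d * freq))
        emb.append(math.cos(d * freq))
    return emb

# Toy example with made-up entities and planar coordinates.
center = {"name": "Center City", "coord": (0.0, 0.0)}
neighbors = [
    {"name": "Farville", "coord": (3.0, 4.0)},   # distance 5.0
    {"name": "Neartown", "coord": (1.0, 0.0)},   # distance 1.0
]
context = linearize_neighbors(center, neighbors)
```

Sorting by distance gives the nearest entity first, so `context` is `[("Neartown", 1.0), ("Farville", 5.0)]`; a distance of zero embeds to alternating `[0, 1, 0, 1, ...]`, so the center entity itself is distinguishable from any neighbor.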

Original language: English (US)
Pages: 2757-2769
Number of pages: 13
State: Published - 2022
Event: 2022 Findings of the Association for Computational Linguistics: EMNLP 2022 - Abu Dhabi, United Arab Emirates
Duration: Dec 7, 2022 - Dec 11, 2022

Conference

Conference: 2022 Findings of the Association for Computational Linguistics: EMNLP 2022
Country/Territory: United Arab Emirates
City: Abu Dhabi
Period: 12/7/22 - 12/11/22

Bibliographical note

Funding Information:
We thank the reviewers for their insightful comments and suggestions. We thank Dr. Valeria Vitale for her valuable input. This material is based upon work supported in part by NVIDIA Corporation, the National Endowment for the Humanities under Award No. HC-278125-21 and Council Reference AH/V009400/1, the United States National Science Foundation Grant IIS 2105329, a Cisco Faculty Research Award (72953213), and the University of Minnesota, Computer Science & Engineering Faculty startup funds.

Publisher Copyright:
© 2022 Association for Computational Linguistics.
