Abstract
Named geographic entities (geo-entities for short) are the building blocks of many geographic datasets. Characterizing geo-entities is integral to various application domains, such as geo-intelligence and map comprehension, yet a key challenge is capturing the spatially varying context of an entity. We hypothesize that we shall know the characteristics of a geo-entity by its surrounding entities, similar to knowing word meanings by their linguistic context. Accordingly, we propose a novel spatial language model, SPABERT, which provides a general-purpose geo-entity representation based on neighboring entities in geospatial data. SPABERT extends BERT to capture linearized spatial context, while incorporating a spatial coordinate embedding mechanism to preserve spatial relations of entities in two-dimensional space. SPABERT is pretrained with masked language modeling and masked entity prediction tasks to learn spatial dependencies. We apply SPABERT to two downstream tasks: geo-entity typing and geo-entity linking. Compared with existing language models that do not use spatial context, SPABERT shows significant performance improvement on both tasks. We also analyze the entity representation from SPABERT in various settings and the effect of the spatial coordinate embedding.
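To make the architectural idea concrete, below is a minimal sketch of how a spatial coordinate embedding could be combined with a BERT backbone over a linearized sequence of neighboring geo-entities. It assumes a HuggingFace-style `BertModel`; the module names (`SpatialCoordinateEmbedding`, `SpatialLM`), the MLP coordinate encoder, and the input shapes are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Illustrative sketch only: adds a continuous 2-D coordinate embedding to BERT
# token embeddings for a linearized neighbor sequence. Module names and the
# MLP encoder are hypothetical, not SPABERT's actual code.
import torch
import torch.nn as nn
from transformers import BertModel


class SpatialCoordinateEmbedding(nn.Module):
    """Map continuous 2-D offsets (dx, dy) of neighboring geo-entities
    relative to the center entity into dense vectors via a small MLP."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2, hidden_size),
            nn.GELU(),
            nn.Linear(hidden_size, hidden_size),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        # coords: (batch, seq_len, 2) relative spatial offsets per token
        return self.mlp(coords)


class SpatialLM(nn.Module):
    """BERT encoder over the linearized spatial context, with coordinate
    embeddings added to token embeddings before the transformer layers."""

    def __init__(self, model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        self.coord_emb = SpatialCoordinateEmbedding(self.bert.config.hidden_size)

    def forward(self, input_ids, attention_mask, coords):
        token_emb = self.bert.embeddings.word_embeddings(input_ids)
        inputs_embeds = token_emb + self.coord_emb(coords)
        out = self.bert(inputs_embeds=inputs_embeds, attention_mask=attention_mask)
        return out.last_hidden_state  # contextual geo-entity representations
```

In this sketch, the resulting contextual representations could then feed masked language modeling / masked entity prediction heads for pretraining, or classification and similarity heads for the geo-entity typing and linking tasks described in the abstract.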
Original language | English (US)
---|---
Pages | 2757-2769
Number of pages | 13
State | Published - 2022
Event | 2022 Findings of the Association for Computational Linguistics: EMNLP 2022 - Abu Dhabi, United Arab Emirates. Duration: Dec 7 2022 → Dec 11 2022
Conference
Conference | 2022 Findings of the Association for Computational Linguistics: EMNLP 2022
---|---
Country/Territory | United Arab Emirates
City | Abu Dhabi
Period | 12/7/22 → 12/11/22
Bibliographical note
Publisher Copyright: © 2022 Association for Computational Linguistics.