Traffic volume prediction for scenic spots based on multi-source and heterogeneous data

Yuan Gao, Yao Yi Chiang, Xiaoxi Zhang, Min Zhang

Research output: Contribution to journal › Article › peer-review

3 Scopus citations


Traffic prediction for scenic spots is an important topic in modeling an urban traffic system. Existing traffic prediction approaches typically use raw traffic data and road networks without considering the physical environment and human–environment interaction. This article presents a novel traffic prediction model that considers: (1) the topological structure of the city road network; (2) the popularity and accessibility of each scenic spot in the city; and (3) the traffic volumes of nearby scenic spots. The proposed model first learns a series of traffic dependency graphs with a Multi-graph Convolutional Network, using multiple data sources describing historical traffic volumes, scenic spot popularity, land function, location, and accessibility. The graph nodes represent the scenic spots, and the links between them represent their traffic dependency, considering all traffic and geographic features. The proposed model then uses a Gated Recurrent Unit (GRU) to capture the temporal dependency between multiple fused graphs for traffic volume prediction. The experiments show that the proposed model (M-GCNGRU) can effectively exploit and integrate geographic data with historical traffic data for traffic volume prediction, outperforming several classical and state-of-the-art methods.
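The spatial–temporal idea described in the abstract can be sketched in a few lines of NumPy: run one graph convolution per dependency graph, fuse the outputs, and feed the fused node representations through a GRU over time. This is a minimal illustrative sketch, not the paper's actual M-GCNGRU; the number of graphs, fusion by summation, layer sizes, and all weights are assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(A):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, standard in GCNs
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_graph_gcn(X, adjs, weights):
    # One GCN layer per dependency graph; fuse per-graph outputs by summation
    # (summation is one possible fusion choice, assumed for this sketch)
    return np.tanh(sum(normalize_adj(A) @ X @ W for A, W in zip(adjs, weights)))

def gru_step(h, x, Wz, Wr, Wh):
    # Standard GRU cell on the concatenation [h, x]
    hx = np.concatenate([h, x], axis=-1)
    z = 1.0 / (1.0 + np.exp(-(hx @ Wz)))   # update gate
    r = 1.0 / (1.0 + np.exp(-(hx @ Wr)))   # reset gate
    h_tilde = np.tanh(np.concatenate([r * h, x], axis=-1) @ Wh)
    return (1 - z) * h + z * h_tilde

# Toy setup: 4 scenic spots, 2 dependency graphs (e.g. road topology and
# popularity similarity), 5 historical time steps, 3 features, hidden size 6.
n, T, f, hid = 4, 5, 3, 6
adjs = [rng.random((n, n)) for _ in range(2)]
adjs = [(A + A.T) / 2 for A in adjs]                 # symmetric dependencies
gcn_w = [rng.standard_normal((f, hid)) * 0.1 for _ in adjs]
Wz, Wr, Wh = (rng.standard_normal((2 * hid, hid)) * 0.1 for _ in range(3))

h = np.zeros((n, hid))
for t in range(T):
    X_t = rng.standard_normal((n, f))                # node features at time t
    g_t = multi_graph_gcn(X_t, adjs, gcn_w)          # spatial fusion
    h = gru_step(h, g_t, Wz, Wr, Wh)                 # temporal update

W_out = rng.standard_normal((hid, 1)) * 0.1
pred = h @ W_out                                     # next-step volume per spot
print(pred.shape)
```

In a trained model the weights would be learned jointly, and each adjacency matrix would encode one of the dependency sources listed above (road topology, popularity, land function, location, accessibility) rather than random values.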

Original language: English (US)
Pages (from-to): 2415-2439
Number of pages: 25
Journal: Transactions in GIS
Issue number: 6
State: Published - Sep 2022
Externally published: Yes

Bibliographical note

Funding Information:
We hereby acknowledge the support received from the National Social Science Foundation of China (20BTJ047).

Publisher Copyright:
© 2022 John Wiley & Sons Ltd.
