Leveraging video data to better understand driver-pedestrian interactions in a smart city environment

Tianyi Li, John Cullom, Raphael Stern

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

New data sources such as video promise to provide insights into how humans navigate urban infrastructure and enable analysis of human-in-the-loop interactions. This work considers the deployment of portable video data collection units to understand human-driver interactions at unsignalized intersections. Specifically, we present preliminary data collection and results that highlight the value of video data in capturing the nuanced interactions of pedestrians with vehicles when navigating urban streets.

Original language: English (US)
Title of host publication: DICPS 2021 - Proceedings of the ACM 1st Workshop on Data-Driven and Intelligent Cyber-Physical Systems, Part of CPS-IoT Week 2021
Publisher: Association for Computing Machinery, Inc
Pages: 6-11
Number of pages: 6
ISBN (Electronic): 9781450384452
DOIs
State: Published - May 18 2021
Event: 1st ACM Workshop on Data-Driven and Intelligent Cyber-Physical Systems, DICPS 2021 - Part of CPS-IoT Week 2021 - Virtual, Online, United States
Duration: May 18 2021 → …

Publication series

Name: DICPS 2021 - Proceedings of the ACM 1st Workshop on Data-Driven and Intelligent Cyber-Physical Systems, Part of CPS-IoT Week 2021

Conference

Conference: 1st ACM Workshop on Data-Driven and Intelligent Cyber-Physical Systems, DICPS 2021 - Part of CPS-IoT Week 2021
Country/Territory: United States
City: Virtual, Online
Period: 5/18/21 → …

Bibliographical note

Funding Information:
This work is supported by the Minnesota Department of Transportation under contract No. 1036210.

Publisher Copyright:
© 2021 ACM.

Keywords

  • Urban sensing
  • transportation data analysis
  • video as a sensor
