Sequential Change-Point Detection for Mutually Exciting Point Processes

Haoyun Wang, Liyan Xie, Yao Xie, Alex Cuozzo, Simon Mak

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

We present a new CUSUM procedure for sequential change-point detection in self- and mutually exciting point processes (specifically, Hawkes networks) using discrete events data. Hawkes networks have become a popular model in statistics and machine learning, primarily due to their capability in modeling irregularly observed data where the timing between events carries substantial information. The problem of detecting abrupt changes in Hawkes networks arises in various applications, including neuroengineering, sensor networks, and social network monitoring. Nevertheless, there has not been an efficient online algorithm for detecting such changes from sequential data. To this end, we propose an online recursive implementation of the CUSUM statistic for Hawkes processes, which is computationally and memory-efficient and can be decentralized for distributed computing. We first prove theoretical properties of this new CUSUM procedure, then show the improved performance of this approach over existing methods, including the Shewhart procedure based on count data, the generalized likelihood ratio statistic, and the standard score statistic. This is demonstrated via simulation studies and an application to population code change-detection in neuroengineering.
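To illustrate the kind of recursion the abstract describes, the following is a minimal sketch of a CUSUM statistic computed online over a univariate Hawkes event stream with an exponential excitation kernel, where the change is a shift in the baseline rate from `mu0` to `mu1`. This is a generic textbook-style CUSUM recursion, not the paper's exact multivariate procedure; all parameter names and values (`mu0`, `mu1`, `alpha`, `beta`, the threshold `b`) are hypothetical illustrations. Because the kernel itself is assumed unchanged, the excitation term cancels in the compensator, so each event contributes a closed-form log-likelihood-ratio increment and the statistic needs only constant memory.

```python
import math

def hawkes_cusum(event_times, mu0, mu1, alpha, beta, b):
    """Sketch: CUSUM over a univariate Hawkes stream, exponential kernel.

    Pre-change intensity:  lam0(t) = mu0 + g(t)
    Post-change intensity: lam1(t) = mu1 + g(t)
    with excitation g(t) = alpha * sum_{t_i < t} exp(-beta * (t - t_i)),
    maintained recursively so each event costs O(1) time and memory.

    Returns (alarm_index, W): the index of the first event at which the
    statistic crosses b (or None), and the final statistic value.
    """
    W = 0.0        # CUSUM statistic, floored at 0
    g = 0.0        # excitation term just after the previous event
    t_prev = 0.0
    for k, t in enumerate(event_times):
        dt = t - t_prev
        g *= math.exp(-beta * dt)   # decay excitation down to time t
        # Log-likelihood-ratio increment over (t_prev, t]:
        # event term log(lam1/lam0) minus the compensator difference;
        # the excitation integral cancels since only the baseline shifts.
        ell = math.log((mu1 + g) / (mu0 + g)) - (mu1 - mu0) * dt
        W = max(0.0, W + ell)       # CUSUM recursion
        if W >= b:
            return k, W             # raise an alarm at event index k
        g += alpha                  # self-excitation jump at the event
        t_prev = t
    return None, W

# Hypothetical usage: a burst of closely spaced events under a low
# baseline (mu0 = 0.5) looks much more likely under mu1 = 2.0, so the
# statistic climbs and crosses the threshold within a few events.
events = [0.2 * i for i in range(1, 11)]
idx, W = hawkes_cusum(events, mu0=0.5, mu1=2.0, alpha=0.3, beta=1.0, b=3.0)
```

The `max(0, ·)` floor is what makes the statistic a CUSUM rather than a plain cumulative log-likelihood ratio: it restarts the accumulation whenever the evidence for a change drops to zero, which is what allows a recursive, memory-efficient online implementation.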

Original language: English (US)
Pages (from-to): 44-56
Number of pages: 13
Journal: Technometrics
Volume: 65
Issue number: 1
DOIs: Yes
State: Published - 2023
Externally published: Yes

Bibliographical note

Publisher Copyright:
© 2022 American Statistical Association and the American Society for Quality.

Keywords

  • CUSUM
  • Change-point detection
  • Hawkes processes
  • Neuroengineering
  • Online monitoring

