Message Propagation Through Time: An Algorithm for Sequence Dependency Retention in Time Series Modeling

Shaoming Xu, Ankush Khandelwal, Arvind Renganathan, Vipin Kumar

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Time series modeling, a crucial area in science, often encounters challenges when training machine learning (ML) models such as Recurrent Neural Networks (RNNs) with the conventional mini-batch strategy, which assumes independent and identically distributed (IID) samples and initializes RNNs with zero hidden states. The IID assumption ignores temporal dependencies among samples, resulting in poor performance. This paper proposes the Message Propagation Through Time (MPTT) algorithm to effectively incorporate long temporal dependencies while preserving faster training times than stateful algorithms. MPTT utilizes two memory modules to asynchronously manage initial hidden states for RNNs, fostering seamless information exchange between samples and allowing diverse mini-batches across epochs. MPTT further implements three policies to filter outdated information and preserve essential information in the hidden states, generating informative initial hidden states for RNNs and facilitating robust training. Experimental results demonstrate that MPTT outperforms seven baseline strategies on four climate datasets with varying levels of temporal dependencies.
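The core idea the abstract describes can be illustrated with a minimal sketch: rather than resetting every mini-batch sample to a zero hidden state (the IID assumption), a memory module keyed by sequence identity supplies each segment's initial hidden state and records the final one for the next segment. This is not the authors' implementation; all names (`HiddenStateMemory`, `rnn_step`, `train_segment`, `"site-1"`) are hypothetical, and the "keep the latest state" policy stands in for MPTT's three filtering policies.

```python
import numpy as np

class HiddenStateMemory:
    """Hypothetical memory module: stores the last hidden state per sequence id."""
    def __init__(self, hidden_size):
        self.hidden_size = hidden_size
        self.store = {}

    def read(self, seq_id):
        # Sequences not yet seen fall back to a zero initial state.
        return self.store.get(seq_id, np.zeros(self.hidden_size))

    def write(self, seq_id, h):
        # Simplest possible policy: keep the latest state. MPTT instead
        # applies three policies to filter outdated information.
        self.store[seq_id] = h

def rnn_step(h, x, W_h, W_x):
    # One vanilla tanh RNN cell update: h' = tanh(W_h h + W_x x).
    return np.tanh(W_h @ h + W_x @ x)

def train_segment(memory, seq_id, xs, W_h, W_x):
    # Initialize from the propagated state rather than zeros,
    # then write the final state back for the sequence's next segment.
    h = memory.read(seq_id)
    for x in xs:
        h = rnn_step(h, x, W_h, W_x)
    memory.write(seq_id, h)
    return h

rng = np.random.default_rng(0)
H, D = 4, 3
W_h = rng.normal(scale=0.5, size=(H, H))
W_x = rng.normal(scale=0.5, size=(H, D))
memory = HiddenStateMemory(H)

seq = rng.normal(size=(6, D))  # one long sequence, split into two segments
h_a = train_segment(memory, "site-1", seq[:3], W_h, W_x)
h_b = train_segment(memory, "site-1", seq[3:], W_h, W_x)

# Sanity check: processing the whole sequence at once from a zero state
# yields the same final hidden state as the two propagated segments.
h_full = np.zeros(H)
for x in seq:
    h_full = rnn_step(h_full, x, W_h, W_x)
assert np.allclose(h_b, h_full)
```

Because the memory is read and written per sequence id, segments of different sequences can be shuffled freely into diverse mini-batches, which is the training-speed advantage the abstract claims over strictly stateful (fixed-order) schemes.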

Original language: English (US)
Title of host publication: Proceedings of the 2024 SIAM International Conference on Data Mining, SDM 2024
Editors: Shashi Shekhar, Vagelis Papalexakis, Jing Gao, Zhe Jiang, Matteo Riondato
Publisher: Society for Industrial and Applied Mathematics Publications
Pages: 307-315
Number of pages: 9
ISBN (Electronic): 9781611978032
State: Published - 2024
Event: 2024 SIAM International Conference on Data Mining, SDM 2024 - Houston, United States
Duration: Apr 18 2024 - Apr 20 2024

Publication series

Name: Proceedings of the 2024 SIAM International Conference on Data Mining, SDM 2024

Conference

Conference: 2024 SIAM International Conference on Data Mining, SDM 2024
Country/Territory: United States
City: Houston
Period: 4/18/24 - 4/20/24

Bibliographical note

Publisher Copyright:
Copyright © 2024 by SIAM.

Keywords

  • Long-Term Dependencies
  • Mini-Batch Training
  • Neural Networks
  • Time Series Modeling
