Abstract
The deep operator network (DeepONet) has demonstrated great success in various learning tasks, including learning solution operators of partial differential equations. In particular, it provides an efficient approach to predicting evolution equations over a finite time horizon. Nevertheless, the vanilla DeepONet suffers from stability degradation in long-time prediction. This paper proposes a transfer-learning-aided DeepONet to enhance stability. Our idea is to use transfer learning to sequentially update the DeepONets as surrogates for the propagators learned in different time frames. The evolving DeepONets can better track the varying complexities of the evolution equations, while only a tiny fraction of each operator network needs to be retrained, keeping the updates efficient. Through systematic experiments, we show that the proposed method not only improves the long-time accuracy of DeepONet at similar computational cost but also substantially reduces the required training sample size.
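For intuition, below is a minimal PyTorch sketch of the sequential transfer-learning update described in the abstract. It assumes a standard branch-trunk DeepONet and uses last-layer fine-tuning as the transfer step; the names (`DeepONet`, `transfer_update`), the network widths, and the choice of which layers to freeze are illustrative assumptions, not the exact architecture or training procedure of the paper.

```python
import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Vanilla DeepONet: a branch net encodes the sampled input function,
    a trunk net encodes the query location, and their inner product
    (plus a scalar bias) gives the operator output."""
    def __init__(self, n_sensors, width=64, p=64):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, p),
        )
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u, y):
        # u: (batch, n_sensors) sampled input function; y: (batch, 1) query point
        return (self.branch(u) * self.trunk(y)).sum(dim=-1, keepdim=True) + self.bias

def transfer_update(model, u_frame, y_frame, s_frame, epochs=200, lr=1e-3):
    """Transfer step (hypothetical variant): freeze the pretrained network and
    retrain only the last branch/trunk layers on data from the new time frame."""
    for p in model.parameters():
        p.requires_grad_(False)
    trainable = (list(model.branch[-1].parameters())
                 + list(model.trunk[-1].parameters()) + [model.bias])
    for p in trainable:
        p.requires_grad_(True)
    opt = torch.optim.Adam(trainable, lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(u_frame, y_frame), s_frame)
        loss.backward()
        opt.step()
    return model

if __name__ == "__main__":
    # Frame 0: train a full DeepONet (omitted); later frames: cheap transfer updates.
    net = DeepONet(n_sensors=100)
    u = torch.randn(256, 100)   # sampled states for the current time frame (placeholder data)
    y = torch.rand(256, 1)      # query locations
    s = torch.randn(256, 1)     # propagator outputs (placeholder data)
    transfer_update(net, u, y, s)
```

In a long-time rollout under these assumptions, the surrogate for each new time frame would be obtained by applying such an update to the surrogate from the previous frame, so only a small fraction of the parameters is retrained per frame.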
| Original language | English (US) |
| --- | --- |
| Title of host publication | AAAI-23 Technical Tracks 9 |
| Editors | Brian Williams, Yiling Chen, Jennifer Neville |
| Publisher | AAAI Press |
| Pages | 10629-10636 |
| Number of pages | 8 |
| ISBN (Electronic) | 9781577358800 |
| DOIs | |
| State | Published - Jun 27 2023 |
| Event | 37th AAAI Conference on Artificial Intelligence, AAAI 2023, Washington, United States. Duration: Feb 7 2023 → Feb 14 2023 |
Publication series

| Name | Proceedings of the 37th AAAI Conference on Artificial Intelligence, AAAI 2023 |
| --- | --- |
| Volume | 37 |
Conference

| Conference | 37th AAAI Conference on Artificial Intelligence, AAAI 2023 |
| --- | --- |
| Country/Territory | United States |
| City | Washington |
| Period | 2/7/23 → 2/14/23 |
Bibliographical note
Publisher Copyright: Copyright © 2023, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.