Abstract
Long-term time series forecasting has widespread real-world applications, including finance, traffic, weather, and sensor data analysis. Time series comprise seasonal, trend, and irregular components, yet current MLP-based mixing models fall short in capturing multi-scale sequential information and achieving comprehensive feature fusion: the sequence information is non-stationary, and as network depth increases, information flow may be lost or distorted. In this study, we propose a novel architecture called UNetPlusTS that combines the strengths of Mixer models and Linear models to enhance forecasting capability. Specifically, we split the multivariate series into individual channels, apply seasonal-trend decomposition to each channel, and process the components independently with our carefully designed UNet++-based architecture, the UNetPlus Mixing Module (UPM). Combined with our sampling strategy, the UPM promotes deep integration of seasonal and trend components, thereby alleviating non-stationarity. Experimental results on multiple real-world datasets show that UNetPlusTS significantly improves forecasting accuracy, demonstrating its effectiveness and robustness.
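The seasonal-trend decomposition step described above can be illustrated with a minimal sketch. The snippet below uses the common moving-average decomposition (in the style of Autoformer/DLinear); it is an assumption for illustration, not the paper's exact UPM implementation, and the function name `series_decomp` and kernel size are hypothetical choices.

```python
import numpy as np

def series_decomp(x: np.ndarray, kernel_size: int = 25):
    """Moving-average seasonal-trend decomposition of a 1-D series.

    Returns (seasonal, trend), each the same length as x.
    """
    pad = (kernel_size - 1) // 2
    # Replicate-pad both ends so the moving average keeps the original length.
    padded = np.concatenate([
        np.full(pad, x[0]), x, np.full(kernel_size - 1 - pad, x[-1])
    ])
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")   # smooth trend component
    seasonal = x - trend                                # residual seasonal component
    return seasonal, trend

# Example: sine seasonality superimposed on a linear trend
t = np.arange(200, dtype=float)
x = 0.05 * t + np.sin(2 * np.pi * t / 20)
seasonal, trend = series_decomp(x)
```

In a channel-independent setup, each variable of the multivariate series would be decomposed this way before the seasonal and trend parts are mixed by the downstream module.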
Acknowledgments
This work is supported by the Joint Innovation Laboratory for Future Observation, Zhejiang University, focusing on the research of cloud-native observability technology based on Guance Cloud (JS20220726-0016).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Cheng, X., Chen, X., Wu, B., Zou, X., Yang, H., Zhao, R. (2024). UNetPlusTS: Decomposition-Mixing UNet++ Architecture for Long-Term Time Series Forecasting. In: Huang, DS., Zhang, C., Pan, Y. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2024. Lecture Notes in Computer Science(), vol 14876. Springer, Singapore. https://doi.org/10.1007/978-981-97-5666-7_4
DOI: https://doi.org/10.1007/978-981-97-5666-7_4
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-5665-0
Online ISBN: 978-981-97-5666-7