Prediction Study Based on TCN-BiLSTM-SA Time Series Model
- DOI
- 10.2991/978-94-6463-266-8_21
- Keywords
- Time series prediction; TCN; BiLSTM; Self-Attention
- Abstract
To enhance the accuracy of time series prediction, this study proposes a hybrid network model, TCN-BiLSTM-SA, which combines a Temporal Convolutional Network (TCN), Bidirectional Long Short-Term Memory (BiLSTM), and Self-Attention (SA). The TCN learns local sequence features, while the BiLSTM captures both preceding and succeeding states to extract more information for prediction. The self-attention mechanism computes a weight for each time step's output, making effective use of the BiLSTM cell memory to capture global features and improve prediction accuracy. Experimental results on the Beijing PM2.5 dataset show that the TCN-BiLSTM-SA network outperforms the BiLSTM model in RMSE, MAE, and MAPE while also exhibiting greater stability. The model therefore has broad application prospects in time series prediction.
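The abstract's pipeline (TCN front end, bidirectional LSTM, attention over the per-step outputs) can be illustrated with a minimal PyTorch sketch. The layer sizes, kernel widths, and the simple score-based attention pooling below are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of a TCN-BiLSTM-SA forecaster; hyperparameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CausalConvBlock(nn.Module):
    """One dilated causal convolution block for the TCN front end."""

    def __init__(self, in_ch, out_ch, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation          # left-only padding keeps causality
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)
        self.downsample = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x):                                # x: (batch, channels, time)
        out = self.conv(F.pad(x, (self.pad, 0)))         # pad on the left only
        return F.relu(out + self.downsample(x))          # residual connection


class TCNBiLSTMSA(nn.Module):
    def __init__(self, n_features, tcn_channels=64, lstm_hidden=64):
        super().__init__()
        # TCN: stacked dilated causal convolutions extract local sequence features.
        self.tcn = nn.Sequential(
            CausalConvBlock(n_features, tcn_channels, dilation=1),
            CausalConvBlock(tcn_channels, tcn_channels, dilation=2),
        )
        # BiLSTM: reads the TCN features in both directions.
        self.bilstm = nn.LSTM(tcn_channels, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # Attention: scores every time step's BiLSTM output (stands in for the
        # paper's self-attention mechanism).
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        self.head = nn.Linear(2 * lstm_hidden, 1)        # one-step-ahead forecast

    def forward(self, x):                                # x: (batch, time, features)
        h = self.tcn(x.transpose(1, 2)).transpose(1, 2)  # (batch, time, tcn_channels)
        h, _ = self.bilstm(h)                            # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)           # weight per time step
        context = (w * h).sum(dim=1)                     # weighted global summary
        return self.head(context).squeeze(-1)


# Example: predict the next PM2.5 value from 24 hourly steps of 8 features.
model = TCNBiLSTMSA(n_features=8)
y_hat = model(torch.randn(16, 24, 8))                    # -> shape (16,)
```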
- Copyright
- © 2023 The Author(s)
- Open Access
- This chapter is licensed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
Cite this article
TY  - CONF
AU  - He Zhang
AU  - Peng Chu
PY  - 2023
DA  - 2023/10/10
TI  - Prediction Study Based on TCN-BiLSTM-SA Time Series Model
BT  - Proceedings of the 2nd International Conference on Intelligent Design and Innovative Technology (ICIDIT 2023)
PB  - Atlantis Press
SP  - 192
EP  - 197
SN  - 2589-4919
UR  - https://doi.org/10.2991/978-94-6463-266-8_21
DO  - 10.2991/978-94-6463-266-8_21
ID  - Zhang2023
ER  -