Comparison of Prediction Effectiveness in Deep Learning Perspective of China’s Data Finance
DOI: https://doi.org/10.61173/xv2z5k58

Keywords:
Deep Learning, Data Finance, Transformer-Encoder model

Abstract
This study develops a financial time series forecasting model built on the attention mechanism, the Transformer-Encoder model, and applies it to predict the closing price of the Shanghai Stock Exchange (SSE) index, a widely used indicator of financial trends. The study further conducts a comparative analysis, evaluating the proposed model against other deep learning models, machine learning models, and traditional time series forecasting models over short-, medium-, and long-term forecasting horizons. The key findings are as follows. First, the Transformer-Encoder model performs strongly in predicting closing prices across all three horizons, indicating that it can handle non-stationary financial data and has potential as a forecasting tool for time series prediction problems in economics. Second, the proposed model consistently outperforms the alternative deep learning models, machine learning models, and traditional time series forecasting models in the comparison.
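The core of the Transformer-Encoder approach described above is scaled dot-product self-attention, which lets each time step in a window of past prices attend to every other step. The following is a minimal NumPy sketch of that mechanism, not the paper's implementation; the window length, embedding dimension, and weight matrices are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention.

    X: (T, d) window of T time-step embeddings of dimension d.
    Wq, Wk, Wv: (d, d) projection matrices (hypothetical parameters).
    Returns a (T, d) array where each row is a weighted mix of all steps.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (T, T) pairwise attention logits
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V

rng = np.random.default_rng(0)
T, d = 8, 16  # e.g. 8 past closing-price embeddings of dimension 16 (illustrative)
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (8, 16)
```

In a full encoder, this attention output would be followed by a residual connection, layer normalization, and a position-wise feed-forward network, with a final regression head producing the predicted closing price.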