Indonesian Gold Price Forecasting Using Simple and Stacked LSTM with Expanding Window
DOI: https://doi.org/10.30871/jaic.v10i1.12148

Keywords: Deep Learning, Expanding Window, Forecasting, Gold Price, LSTM

Abstract
This study investigates the performance of two deep learning architectures, namely Simple LSTM and Stacked LSTM, for Indonesian gold price forecasting, with a particular focus on evaluating the effect of optimizer selection and learning rate configurations. An experimental framework is implemented using daily Indonesian gold price data from 2021 to 2024. Model performance is assessed using five-fold expanding window time series cross-validation to ensure robustness and avoid data leakage. Four adaptive training optimizers (Adam, Nadam, Adamax, and RMSprop) are evaluated across three learning-rate settings as part of a systematic sensitivity analysis of training hyperparameters. The results indicate that the Simple LSTM consistently outperforms the Stacked LSTM. The best performance is achieved by the Simple LSTM using the Adam optimizer with a learning rate of 0.01, yielding an RMSE of 9.235, MAE of 7.060, and MAPE of 0.71%. These findings demonstrate that simpler architectures combined with appropriate training configurations can provide superior forecasting accuracy for volatile financial time series.
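To make the evaluation protocol concrete, the sketch below shows five-fold expanding-window cross-validation for a one-layer (Simple) versus two-layer (Stacked) LSTM in Keras. It is an illustration only, not the authors' implementation: the 30-day look-back window, 64 hidden units, 50 training epochs, and the helper names (make_windows, build_lstm, expanding_window_cv) are assumptions, while the expanding-window splits, the Adam optimizer at a learning rate of 0.01, and the RMSE/MAE/MAPE metrics follow the abstract. Fitting the scaler on the training split only is one standard way to avoid the data leakage the abstract mentions.

```python
# Minimal sketch (not the authors' code) of 5-fold expanding-window CV
# for Simple vs Stacked LSTM forecasting of a univariate price series.
import numpy as np
import tensorflow as tf
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import mean_absolute_error, mean_squared_error


def make_windows(series, lookback=30):
    """Turn a 1-D series into (samples, lookback, 1) inputs and next-step targets."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X)[..., np.newaxis], np.array(y)


def build_lstm(stacked=False, lr=0.01, lookback=30):
    """One recurrent layer (Simple) or two recurrent layers (Stacked); 64 units assumed."""
    layers = [tf.keras.Input(shape=(lookback, 1))]
    if stacked:
        layers.append(tf.keras.layers.LSTM(64, return_sequences=True))
    layers.append(tf.keras.layers.LSTM(64))
    layers.append(tf.keras.layers.Dense(1))
    model = tf.keras.Sequential(layers)
    model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr), loss="mse")
    return model


def expanding_window_cv(series, n_folds=5, lookback=30, stacked=False, lr=0.01):
    """Each fold trains on all earlier data and tests on the next block (no shuffling)."""
    fold_size = len(series) // (n_folds + 1)
    scores = []
    for k in range(1, n_folds + 1):
        train = series[: k * fold_size]
        test = series[k * fold_size:(k + 1) * fold_size]
        # Fit the scaler on the training split only, so the test fold never leaks in.
        scaler = MinMaxScaler().fit(train.reshape(-1, 1))
        train_s = scaler.transform(train.reshape(-1, 1)).ravel()
        test_s = scaler.transform(test.reshape(-1, 1)).ravel()
        X_tr, y_tr = make_windows(train_s, lookback)
        # Prepend the last `lookback` training points so every test step has a full window.
        X_te, y_te = make_windows(np.concatenate([train_s[-lookback:], test_s]), lookback)
        model = build_lstm(stacked=stacked, lr=lr, lookback=lookback)
        model.fit(X_tr, y_tr, epochs=50, batch_size=32, verbose=0)
        pred = scaler.inverse_transform(model.predict(X_te, verbose=0)).ravel()
        true = scaler.inverse_transform(y_te.reshape(-1, 1)).ravel()
        rmse = np.sqrt(mean_squared_error(true, pred))
        mae = mean_absolute_error(true, pred)
        mape = np.mean(np.abs((true - pred) / true)) * 100.0
        scores.append((rmse, mae, mape))
    return np.mean(scores, axis=0)  # mean RMSE, MAE, MAPE across the five folds
```

With the daily prices loaded as a one-dimensional NumPy array, calling expanding_window_cv(prices, stacked=False, lr=0.01) mirrors the best-performing configuration reported above (Simple LSTM, Adam, learning rate 0.01), while stacked=True gives the two-layer variant for comparison.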
License
Copyright (c) 2026 Rahmat Tegar Patriot Hari Lambang, Ifnu Wisma Dwi Prastya, Mula Agung Barata Barata

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.