Enhancing Demand Forecasting Performance Using Deep Learning and Time Series Data Augmentation Techniques

Authors

  • Jinseop Yun, Yejun Park, Doohee Chung

Keywords

Time Series Data Augmentation; Deep Learning; Demand Forecasting; ADI-CV

Abstract

Accurately forecasting demand remains a persistent challenge for organizations, especially under high uncertainty and data scarcity. Although machine learning and deep learning methods have advanced beyond traditional statistical approaches, their effectiveness is often constrained by limited data availability. To address this issue, this paper introduces a systematic framework that integrates time series data augmentation techniques, the Long Short-Term Memory (LSTM) deep learning model, and the Average Demand Interval-Coefficient of Variation (ADI-CV) methodology. The framework uses ADI-CV to categorize time series demand patterns and then applies an augmentation technique tailored to each pattern: Moving Block Bootstrap (MBB), Time-Conditional GAN (T-CGAN), or Transformer-based Time-Series Conditional GAN (TTS-CGAN). The resulting synthetic data preserves the temporal characteristics and market conditions of the original series, mitigating the limitations imposed by data scarcity. Experimental results show that augmented time series data significantly improves forecasting performance across diverse and complex demand scenarios. The framework thus addresses a critical gap in demand forecasting methodology and offers a scalable, adaptable solution for enterprises operating in volatile and dynamic markets, providing a robust tool for improving predictive accuracy and supporting business decision-making.
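
To make the categorize-then-augment idea concrete, the following sketch is a minimal illustration (not the authors' implementation): it computes the ADI and squared coefficient of variation of a demand series, assigns one of the four Syntetos-Boylan categories (smooth, erratic, intermittent, lumpy) using the conventional 1.32 and 0.49 cutoffs, and augments a smooth series with a plain Moving Block Bootstrap. The cutoffs, the block length, the use of CV² rather than CV, and the routing of the other categories to T-CGAN or TTS-CGAN are assumptions here; in practice, MBB is often applied to the remainder of an STL decomposition (Bergmeir et al., 2016) rather than to the raw series.

```python
# Illustrative sketch only: ADI/CV^2 demand-pattern categorization plus a
# Moving Block Bootstrap augmenter. Thresholds follow the Syntetos-Boylan
# scheme; everything else (block length, toy data) is an assumption.
import numpy as np


def adi_cv2(demand):
    """Average Demand Interval and squared coefficient of variation
    of the non-zero demand sizes."""
    nonzero = demand[demand > 0]
    if nonzero.size == 0:
        raise ValueError("series contains no demand")
    adi = len(demand) / nonzero.size          # average periods per demand occurrence
    cv2 = (nonzero.std(ddof=0) / nonzero.mean()) ** 2
    return adi, cv2


def categorize(adi, cv2):
    """Map (ADI, CV^2) to a Syntetos-Boylan demand category."""
    if adi < 1.32:
        return "smooth" if cv2 < 0.49 else "erratic"
    return "intermittent" if cv2 < 0.49 else "lumpy"


def moving_block_bootstrap(series, block_len, n_samples, rng):
    """Create synthetic series by drawing overlapping blocks at random
    and concatenating them up to the original length."""
    n = len(series)
    max_start = n - block_len
    samples = []
    for _ in range(n_samples):
        blocks = []
        while sum(len(b) for b in blocks) < n:
            start = rng.integers(0, max_start + 1)
            blocks.append(series[start:start + block_len])
        samples.append(np.concatenate(blocks)[:n])
    return np.stack(samples)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(60)
    # toy monthly demand: trend + yearly seasonality + noise
    demand = 100 + 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 60)
    adi, cv2 = adi_cv2(demand)
    print(f"ADI={adi:.2f}, CV^2={cv2:.2f}, pattern={categorize(adi, cv2)}")
    synthetic = moving_block_bootstrap(demand, block_len=12, n_samples=5, rng=rng)
    print("augmented set shape:", synthetic.shape)   # (5, 60) synthetic series
```

The synthetic series produced this way can be appended to the training set of an LSTM forecaster; for intermittent or lumpy patterns, a conditional generative model such as T-CGAN or TTS-CGAN would take the place of the bootstrap step.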


References

Bandara, K., Bergmeir, C., & Hewamalage, H. (2020). LSTM-MSNet: Leveraging forecasts on sets of related time series with multiple seasonal patterns. IEEE Transactions on Neural Networks and Learning Systems, 32(4), 1586-1599.

Bohanec, M., Borštnar, M. K., & Robnik-Šikonja, M. (2017). Explaining machine learning models in sales predictions. Expert Systems with Applications, 71, 416-428.

Wheelwright, S., Makridakis, S., & Hyndman, R. J. (1998). Forecasting: methods and applications. John Wiley & Sons.

Ramanathan, U. (2012). Supply chain collaboration for improved forecast accuracy of promotional sales. International Journal of Operations & Production Management, 32(6), 676-695.

Seeger, M. W., Salinas, D., & Flunkert, V. (2016). Bayesian intermittent demand forecasting for large inventories. Advances in Neural Information Processing Systems, 29.

Pinkus, A. (1999). Approximation theory of the MLP model in neural networks. Acta Numerica, 8, 143-195.

Elman, J. L. (1990). Finding structure in time. Cognitive Science, 14(2), 179-211.

Hong, J. K. (2021). LSTM-based Sales Forecasting Model. KSII Transactions on Internet & Information Systems, 15(4).

Pliszczuk, D., Lesiak, P., Zuk, K., & Cieplak, T. (2021). Forecasting sales in the supply chain based on the LSTM network: the case of furniture industry.

Salamanis, A., Xanthopoulou, G., Kehagias, D., & Tzovaras, D. (2022). LSTM-based deep learning models for long-term tourism demand forecasting. Electronics, 11(22), 3681.

Sina, L. B., Secco, C. A., Blazevic, M., & Nazemi, K. (2023). Hybrid Forecasting Methods—A Systematic Review. Electronics, 12(9), 2019.

Noh, J., Park, H. J., Kim, J. S., & Hwang, S. J. (2020). Gated recurrent unit with genetic algorithm for product demand forecasting in supply chain management. Mathematics, 8(4), 565.

Makridakis, S., Spiliotis, E., & Assimakopoulos, V. (2018). The M4 Competition: Results, findings, conclusion and way forward. International Journal of Forecasting, 34(4), 802-808.

Chen, Y., Xie, X., Pei, Z., Yi, W., Wang, C., Zhang, W., & Ji, Z. (2024). Development of a Time Series E-Commerce Sales Prediction Method for Short-Shelf-Life Products Using GRU-LightGBM. Applied Sciences, 14(2), 866.

Schmidt, A., Kabir, M. W. U., & Hoque, M. T. (2022). Machine learning based restaurant sales forecasting. Machine Learning and Knowledge Extraction, 4(1), 105-130.

Smirnov, P. S., & Sudakov, V. A. (2021, May). Forecasting new product demand using machine learning. In Journal of Physics: Conference Series (Vol. 1925, No. 1, p. 012033). IOP Publishing.

Fourkiotis, K. P., & Tsadiras, A. (2024). Applying Machine Learning and Statistical Forecasting Methods for Enhancing Pharmaceutical Sales Predictions. Forecasting, 6(1), 170-186.

The HR Director (2024). DEI initiatives on the rise, but strategic execution is lacking.

Shorten, C., & Khoshgoftaar, T. M. (2019). A survey on image data augmentation for deep learning. Journal of Big Data, 6(1), 1-48.

Wen, Q., Sun, L., Yang, F., Song, X., Gao, J., Wang, X., & Xu, H. (2020). Time series data augmentation for deep learning: A survey. arXiv preprint arXiv:2002.12478.

Chen, M., Xu, Z., Zeng, A., & Xu, Q. (2023). FrAug: Frequency Domain Augmentation for Time Series Forecasting. arXiv preprint arXiv:2302.09292.

Pouyanfar, S., Sadiq, S., Yan, Y., Tian, H., Tao, Y., Reyes, M. P., ... & Iyengar, S. S. (2018). A survey on deep learning: Algorithms, techniques, and applications. ACM Computing Surveys (CSUR), 51(5), 1-36.

Gamboa, J. C. B. (2017). Deep learning for time-series analysis. arXiv preprint arXiv:1701.01887.

Najafabadi, M. M., Villanustre, F., Khoshgoftaar, T. M., Seliya, N., Wald, R., & Muharemagic, E. (2015). Deep learning applications and challenges in big data analytics. Journal of Big Data, 2(1), 1-21.

Zhang, T., Zhang, Y., Cao, W., Bian, J., Yi, X., Zheng, S., & Li, J. (2022). Less is more: Fast multivariate time series forecasting with light sampling-oriented mlp structures. arXiv preprint arXiv:2207.01186.

Pascanu, R., Mikolov, T., & Bengio, Y. (2013). On the difficulty of training recurrent neural networks. arXiv preprint arXiv:1211.5063.

Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.

Che, Z., Purushotham, S., Cho, K., Sontag, D., & Liu, Y. (2018). Recurrent neural networks for multivariate time series with missing values. Scientific reports, 8(1), 6085.

Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735-1780.

Abbasimehr, H., Shabani, M., & Yousefi, M. (2020). An optimized model using LSTM network for demand forecasting. Computers & Industrial Engineering, 143, 106435.

Alzubaidi, L., Zhang, J., Humaidi, A. J., Al-Dujaili, A., Duan, Y., Al-Shamma, O., ... & Farhan, L. (2021). Review of deep learning: Concepts, CNN architectures, challenges, applications, future directions. Journal of Big Data, 8, 1-74.

Li, X., Xiong, H., Li, X., Wu, X., Zhang, X., Liu, J., ... & Dou, D. (2022). Interpretable deep learning: Interpretation, interpretability, trustworthiness, and beyond. Knowledge and Information Systems, 64(12), 3197-3234.

Zhang, Y., Li, G., Muskat, B., & Law, R. (2021). Tourism demand forecasting: A decomposed deep learning approach. Journal of Travel Research, 60(5), 981-997.

Kim, M., Choi, W., Jeon, Y., & Liu, L. (2019). A hybrid neural network model for power demand forecasting. Energies, 12(5), 931.

Salman, S., & Liu, X. (2019). Overfitting mechanism and avoidance in deep neural networks. arXiv preprint arXiv:1901.06566.

Giri, C., & Chen, Y. (2022). Deep learning for demand forecasting in the fashion and apparel retail industry. Forecasting, 4(2), 565-581.

Alzubaidi, L., Bai, J., Al-Sabaawi, A., Santamaría, J., Albahri, A. S., Al-dabbagh, B. S. N., ... & Gu, Y. (2023). A survey on deep learning tools dealing with data scarcity: definitions, challenges, solutions, tips, and applications. Journal of Big Data, 10(1), 46.

Goodfellow, I., Pouget-Abadie, J., Mirza, M., Xu, B., Warde-Farley, D., Ozair, S., ... & Bengio, Y. (2014). Generative adversarial nets. Advances in Neural Information Processing Systems, 27.

Bowles, C., Chen, L., Guerrero, R., Bentley, P., Gunn, R., Hammers, A., ... & Rueckert, D. (2018). GAN augmentation: Augmenting training data using generative adversarial networks. arXiv preprint arXiv:1810.10863.

Han, Z., Zhao, J., Leung, H., Ma, K. F., & Wang, W. (2019). A review of deep learning models for time series prediction. IEEE Sensors Journal, 21(6), 7833-7848.

Lee, S. W., & Kim, H. Y. (2020). Stock market forecasting with super-high dimensional time-series data using ConvLSTM, trend sampling, and specialized data augmentation. Expert Systems with Applications, 161, 113704.

Iglesias, G., Talavera, E., González-Prieto, Á., Mozo, A., & Gómez-Canaval, S. (2023). Data Augmentation techniques in time series domain: a survey and taxonomy. Neural Computing and Applications, 35(14), 10123-10145.

Bergmeir, C., Hyndman, R. J., & Benítez, J. M. (2016). Bagging exponential smoothing methods using STL decomposition and Box–Cox transformation. International Journal of Forecasting, 32(2), 303-312.

Petitjean, F., Forestier, G., Webb, G. I., Nicholson, A. E., Chen, Y., & Keogh, E. (2014, December). Dynamic time warping averaging of time series allows faster and more accurate classification. In 2014 IEEE International Conference on Data Mining (pp. 470-479). IEEE.

Denaxas, E. A., Bandyopadhyay, R., Patiño-Echeverri, D., & Pitsianis, N. (2015, April). SynTiSe: A modified multi-regime MCMC approach for generation of wind power synthetic time series. In 2015 Annual IEEE Systems Conference (SysCon) Proceedings (pp. 668-674). IEEE.

Kang, Y., Hyndman, R. J., & Li, F. (2020). GRATIS: GeneRAting TIme Series with diverse and controllable characteristics. Statistical Analysis and Data Mining: The ASA Data Science Journal, 13(4), 354-376.

Kingma, D. P., & Welling, M. (2013). Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114.

Yoon, J., Jarrett, D., & Van der Schaar, M. (2019). Time-series generative adversarial networks. Advances in Neural Information Processing Systems, 32.

Bandara, K., Hewamalage, H., Liu, Y. H., Kang, Y., & Bergmeir, C. (2021). Improving the accuracy of global forecasting models using time series data augmentation. Pattern Recognition, 120, 108148.

Iwana, B. K., & Uchida, S. (2021). An empirical survey of data augmentation for time series classification with neural networks. PLOS ONE, 16(7), e0254841.

Deng, Y., Liang, R., Wang, D., Li, A., & Xiao, F. (2023, November). Decomposition-based Data Augmentation for Time-series Building Load Data. In Proceedings of the 10th ACM International Conference on Systems for Energy-Efficient Buildings, Cities, and Transportation (pp. 51-60).

Syntetos, A. A., Boylan, J. E., & Croston, J. D. (2005). On the categorization of demand patterns. Journal of the Operational Research Society, 56, 495-503.

Kim, J. S., Hwang, J. S., & Jung, J. W. (2020). A New LSTM Method Using Data Decomposition of Time Series for Forecasting the Demand of Aircraft Spare Parts. Korean Management Science Review, 37(2), 1-18.

Yu, M., Tian, X., & Tao, Y. (2022). Dynamic Model Selection Based on Demand Pattern Classification in Retail Sales Forecasting. Mathematics, 10(17), 3179.

Costantino, F., Di Gravio, G., Patriarca, R., & Petrella, L. (2018). Spare parts management for irregular demand items. Omega, 81, 57-66.

Kuncoro, E. G. B., Aurachman, R., & Santosa, B. (2018, November). Inventory policy for relining roll spare parts to minimize total cost of inventory with periodic review (R, s, Q) and periodic review (R, S) (Case study: PT. Z). In IOP conference series: Materials science and engineering (Vol. 453, No. 1, p. 012021). IOP Publishing.

Ramponi, G., Protopapas, P., Brambilla, M., & Janssen, R. (2018). T-CGAN: Conditional generative adversarial network for data augmentation in noisy time series with irregular sampling. arXiv preprint arXiv:1811.08295.

Li, X., Ngu, A. H. H., & Metsis, V. (2022). TTS-CGAN: A transformer time-series conditional GAN for biosignal data augmentation. arXiv preprint arXiv:2206.13676.

Efron, B. (1992). Bootstrap methods: Another look at the jackknife. In Breakthroughs in statistics: Methodology and distribution (pp. 569-593). Springer.

Kunsch, H. R. (1989). The jackknife and the bootstrap for general stationary observations. The Annals of Statistics, 17(3), 1217-1241.

Carlstein, E., Do, K. A., Hall, P., Hesterberg, T., & Künsch, H. R. (1998). Matched-block bootstrap for dependent data. Bernoulli, 305-328.

Tian, X., Wang, H., & Erjiang, E. (2021). Forecasting intermittent demand for inventory management by retailers: A new approach. Journal of Retailing and Consumer Services, 62, 102662.

Published

18.10.2025

How to Cite

Yun, J., Park, Y., & Chung, D. (2025). Enhancing Demand Forecasting Performance Using Deep Learning and Time Series Data Augmentation Techniques. International Journal of Intelligent Systems and Applications in Engineering, 13(1), 526–538. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/7895

Issue

Vol. 13 No. 1 (2025)

Section

Research Article