LSTM Inefficiency in Long-Term Dependencies Regression Problems

Al-Selwi, S.M. and Hassan, M.F. and Abdulkadir, S.J. and Muneer, A. (2023) LSTM Inefficiency in Long-Term Dependencies Regression Problems. Journal of Advanced Research in Applied Sciences and Engineering Technology, 30 (3). pp. 16-31.

Full text not available from this repository.
Official URL: https://www.scopus.com/inward/record.uri?eid=2-s2....

Abstract

Recurrent neural networks (RNNs) are an excellent fit for regression problems where sequential data are the norm, since their recurrent internal structure can analyse and process data over long spans. However, RNNs are prone to the well-known vanishing gradient problem (VGP), which causes the network to stop learning and produces poor prediction accuracy, especially on long-term dependencies. Gated units such as long short-term memory (LSTM) and the gated recurrent unit (GRU) were originally created to address this problem. However, VGP was and still is an unsolved problem, even in gated units. It occurs during the backpropagation process, when the recurrent network gradients tend to shrink towards zero and hinder the network from learning the correlation between temporally distant events (long-term dependencies), which results in slow or no network convergence. This study aims to provide an empirical analysis of LSTM networks, with an emphasis on their inefficiency in converging on long-term dependencies because of VGP. Case studies on NASA's turbofan engine degradation data are examined and empirically analysed. © 2023, Penerbit Akademia Baru. All rights reserved.
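The mechanism the abstract describes can be illustrated numerically. A minimal sketch (not from the paper, and using a linear recurrence for simplicity): backpropagation through time multiplies the incoming gradient by the recurrent Jacobian at every step, so when the recurrent weight matrix has spectral norm below 1, the gradient norm shrinks exponentially with temporal distance. A saturating nonlinearity such as tanh, whose derivative is at most 1, would only shrink it further.

```python
import numpy as np

# Sketch of the vanishing gradient problem (VGP) in a plain RNN.
# Assumption: linear recurrence h_t = W h_{t-1}; backpropagation through
# time then multiplies the gradient by W^T once per time step.

rng = np.random.default_rng(0)
hidden = 32

# Recurrent weight matrix rescaled so its spectral norm is 0.9 (< 1).
W = rng.standard_normal((hidden, hidden))
W *= 0.9 / np.linalg.norm(W, ord=2)

grad = rng.standard_normal(hidden)   # gradient arriving at the final step
norms = []
for t in range(100):                 # backpropagate 100 time steps
    grad = W.T @ grad                # chain rule through the recurrence
    norms.append(np.linalg.norm(grad))

print(f"gradient norm after 10 steps:  {norms[9]:.3e}")
print(f"gradient norm after 100 steps: {norms[99]:.3e}")
```

Because the gradient contracts by roughly the spectral norm (0.9) per step, the signal reaching distant time steps is orders of magnitude weaker, which is why the network fails to learn long-term dependencies. LSTM gating mitigates, but as the paper argues does not eliminate, this decay.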

Item Type: Article
Additional Information: Cited by 9
Depositing User: Mr Ahmad Suhairi UTP
Date Deposited: 04 Jun 2024 14:10
Last Modified: 04 Jun 2024 14:10
URI: https://khub.utp.edu.my/scholars/id/eprint/18553
