Author
Sajjad Ravakhah
Bio: Sajjad Ravakhah is an academic researcher from Iran University of Science and Technology. The author has contributed to research in the topics of artificial neural networks and perceptrons, has an h-index of 1, and has co-authored 1 publication receiving 17 citations.
Topics: Artificial neural network, Perceptron
Papers
TL;DR: The findings show that the modified optimizer and the designed classifier using mWOA significantly outperform the other benchmark classifiers.
86 citations
Cited by
TL;DR: A novel combination prediction model is proposed based on wavelet transform (WT), long short-term memory (LSTM), and stacked autoencoder (SAE) that exceeds the benchmark models and outperforms the EMD, EEMD-SAE-LSTM, and SAE.
82 citations
TL;DR: In this paper, an innovative coupled model based on wavelet transform (WT), long short-term memory (LSTM), and stacked autoencoder (SAE) is proposed.
80 citations
TL;DR: A crossover experiment is conducted with 160 components of each WBF, forecasting 320 schemes with a sparse autoencoder and long short-term memory; the resulting combination model of WT, SAE, and LSTM indicates that SAE-LSTM exceeds other AI models and outperforms other preprocessing algorithms in forecasting accuracy.
72 citations
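The three entries above all share a wavelet-transform preprocessing stage before the SAE/LSTM forecasters. As a rough illustration of that shared first step only (a minimal stdlib sketch of my own, not code from any of the cited papers), a one-level Haar discrete wavelet transform splits a series into a low-frequency approximation and a high-frequency detail component, which such models then feed to the downstream networks:

```python
import math


def haar_dwt(series):
    """One-level Haar DWT of an even-length series.

    Returns (approximation, detail): pairwise scaled sums capture the
    trend, pairwise scaled differences capture the fluctuations.
    """
    s = math.sqrt(2.0)
    approx = [(series[i] + series[i + 1]) / s for i in range(0, len(series), 2)]
    detail = [(series[i] - series[i + 1]) / s for i in range(0, len(series), 2)]
    return approx, detail


def haar_idwt(approx, detail):
    """Inverse one-level Haar DWT: reconstructs the original series exactly."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / s)  # even-index sample
        out.append((a - d) / s)  # odd-index sample
    return out
```

The decomposition is lossless, so each sub-series can be forecast separately and the predictions recombined; the cited papers use deeper, library-based wavelet decompositions, which this sketch does not attempt to reproduce.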
••
TL;DR: In this paper, Wang et al. implemented two ANN-based scenarios to approximate the uniaxial compressive strength of manufactured-sand concrete; two improved ANNs were created with metaheuristic algorithms, namely biogeography-based optimization (BBO) and the multi-tracker optimization algorithm (MTOA).
58 citations
TL;DR: In this paper, the effect of different interlayer content on the stability and usability of the underground energy storage caverns in bedded salt of China was investigated, and a method to increase the effective cavern volume was discussed.
44 citations