scispace - formally typeset
Topic

Hybrid neural network

About: Hybrid neural network is a research topic. Over the lifetime, 1305 publications have been published within this topic receiving 18223 citations.


Papers
Journal ArticleDOI
TL;DR: The recoveries predicted by the ACO-ANN model were in good agreement with those measured from simulations and comparable to those estimated by the other models proposed in the literature.
Abstract: Hybrid systems are a promising tool for nonlinear regression problems. The authors present an efficient prediction model for the gas-assisted gravity drainage injection recovery process based on an artificial neural network (ANN) and dimensionless groups. Ant colony optimization (ACO) is applied to determine the network parameters. Results show that the ACO algorithm can find optimal parameters for the ANN model with very high predictive accuracy. The recoveries predicted by the ACO-ANN model were in good agreement with those measured from simulations and comparable to those estimated by the other models proposed in the literature.
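The abstract does not give implementation details, but the ACO-over-hyperparameters idea can be sketched as a pheromone-weighted discrete search. The sketch below is illustrative only: the candidate list, the stand-in fitness function (a proxy for the ANN's validation error), and all parameter values are assumptions, not the authors' setup.

```python
import numpy as np

def aco_select(candidates, fitness, n_ants=10, n_iters=30, rho=0.1, seed=0):
    """Ant colony optimization over a discrete set of network-parameter
    choices. `fitness` is minimized (e.g. validation error of the ANN)."""
    rng = np.random.default_rng(seed)
    tau = np.ones(len(candidates))            # one pheromone value per candidate
    best, best_f = None, np.inf
    for _ in range(n_iters):
        probs = tau / tau.sum()               # sampling biased by pheromone
        for _ in range(n_ants):
            i = rng.choice(len(candidates), p=probs)
            f = fitness(candidates[i])
            if f < best_f:
                best, best_f = candidates[i], f
            tau[i] += 1.0 / (1.0 + f)         # deposit more on good choices
        tau *= (1.0 - rho)                    # evaporation
    return best, best_f

# Hypothetical search over hidden-layer widths; the lambda stands in for
# training the ANN and measuring validation error (minimum at 32 units).
hidden_sizes = [4, 8, 16, 32, 64]
val_error = lambda h: abs(np.log2(h) - 5) + 0.01
best_h, best_e = aco_select(hidden_sizes, val_error)
```

The same loop extends to multiple parameters by keeping one pheromone vector per parameter; only the candidate encoding changes.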

1 citations

Posted Content
TL;DR: In this article, a hybrid explicit-implicit (HEI) learning method is proposed to learn the explicit and implicit parts of a spatio-temporal solution: the implicit part, which is more difficult to solve, is learned, while the explicit part can be computed directly.
Abstract: Splitting is a method for handling application problems by splitting physics, scales, domains, and so on. Many splitting algorithms have been designed for efficient temporal discretization. In this paper, our goal is to use temporal splitting concepts in designing machine learning algorithms and, at the same time, to help splitting algorithms by incorporating data and speeding them up. Since the splitting solution usually has an explicit and an implicit part, we call our method hybrid explicit-implicit (HEI) learning. We consider a recently introduced multiscale splitting algorithm in which, to approximate the dynamics, only a few degrees of freedom are solved implicitly while the others are treated explicitly. We use this splitting concept in machine learning and propose several strategies. First, the implicit part of the solution can be learned, as it is more difficult to solve, while the explicit part can be computed; this provides a speed-up and data incorporation for splitting approaches. Second, one can design a hybrid neural network architecture, because handling the explicit parts requires far fewer communications among neurons and can be done efficiently. Third, one can solve the coarse-grid component via PDEs or other approximation methods and construct simpler neural networks for the explicit part of the solution. We discuss these options and implement one of them by interpreting it as a machine translation task. This interpretation enables us to use the Transformer, since it can perform model reduction for multiple time series and learn their connection. We also find that the splitting scheme is a good platform for predicting the coarse solution with insufficient information about the target model: the target problem is only partially given, and we solve it through a known problem. We conduct four numerical examples, and the results show that our method is stable and accurate.
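The explicit-implicit splitting the abstract builds on can be illustrated with a minimal IMEX-style time step: a stiff part advanced implicitly and a non-stiff part advanced explicitly. In the HEI idea, the implicit solve below is the piece one would replace with a learned surrogate. The model problem, matrices, and function names here are illustrative assumptions, not the paper's multiscale algorithm.

```python
import numpy as np

def hei_step(u, A_stiff, A_soft, dt):
    """One splitting step for u' = (A_stiff + A_soft) u.
    Explicit (forward Euler) update for the soft part, implicit
    (backward Euler) solve for the stiff part. In HEI learning, the
    implicit solve is the expensive piece a network would learn."""
    u = u + dt * (A_soft @ u)                          # explicit part
    n = len(u)
    u = np.linalg.solve(np.eye(n) - dt * A_stiff, u)   # implicit part
    return u

# Toy system: fast (stiff) decay coupled with a slow rotation.
A_stiff = np.array([[-100.0, 0.0], [0.0, -50.0]])
A_soft  = np.array([[0.0, 1.0], [-1.0, 0.0]])
u = np.array([1.0, 1.0])
for _ in range(100):
    u = hei_step(u, A_stiff, A_soft, dt=0.01)
# The scheme stays stable at dt = 0.01 despite the stiff eigenvalues,
# which a fully explicit step of this size would not.
```

The split makes the efficiency argument concrete: the explicit update is a cheap matrix-vector product, while the implicit update requires a solve, which is exactly the part worth learning.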

1 citations

Journal Article
TL;DR: Comparing the GSA network with a standard BP network, simulation analysis demonstrates that the model attains higher classification precision, and that the optimized BP network does not easily become trapped in local minima and generalizes well.
Abstract: After studying the disadvantages of the BP neural network, which converges slowly and is easily trapped in local minima, a new hybrid neural network model based on numerical optimization is presented. A Genetic-Simulated Annealing algorithm (GSA) is used to expand the search space of the weights; the best values it finds are then taken as the initial weights of the BP neural network. The optimized BP network is not easily trapped in local minima and has good generalization characteristics. Comparing the GSA network with a standard BP network, simulation analysis demonstrates that this network model attains higher classification precision.
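The GSA idea in this abstract, a genetic search whose replacement step uses a simulated-annealing acceptance rule, can be sketched generically. The code below is a hedged illustration: the operators, the cooling schedule, and the Rastrigin stand-in for the network's weight-space loss are assumptions, not the authors' algorithm.

```python
import numpy as np

def gsa_search(loss, dim, pop=20, gens=50, T0=1.0, seed=0):
    """Genetic algorithm with a simulated-annealing acceptance rule.
    The returned vector would seed BP (gradient) training, giving it a
    start outside poor local minima."""
    rng = np.random.default_rng(seed)
    popu = rng.normal(size=(pop, dim))
    fit = np.array([loss(w) for w in popu])
    i0 = int(np.argmin(fit))
    best_w, best_f = popu[i0].copy(), float(fit[i0])
    T = T0
    for _ in range(gens):
        for i in range(pop):
            j, k = rng.choice(pop, size=2, replace=False)
            # crossover (average of two parents) plus Gaussian mutation
            child = 0.5 * (popu[j] + popu[k]) + 0.1 * rng.normal(size=dim)
            fc = loss(child)
            # SA acceptance: always keep improvements, sometimes keep
            # worse candidates so the search can escape local minima
            if fc < fit[i] or rng.random() < np.exp((fit[i] - fc) / max(T, 1e-8)):
                popu[i], fit[i] = child, fc
            if fc < best_f:
                best_w, best_f = child.copy(), fc
        T *= 0.95  # cooling schedule
    return best_w, best_f

# Multimodal stand-in for a BP network's loss surface (Rastrigin, min 0 at 0).
rastrigin = lambda w: 10 * len(w) + np.sum(w**2 - 10 * np.cos(2 * np.pi * w))
w0, f0 = gsa_search(rastrigin, dim=2)
```

Seeding gradient descent with `w0` rather than a random draw is the paper's core claim: the global search picks the basin, and BP then refines within it.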

1 citations


Network Information
Related Topics (5)
Artificial neural network
207K papers, 4.5M citations
89% related
Feature extraction
111.8K papers, 2.1M citations
88% related
Fuzzy logic
151.2K papers, 2.3M citations
85% related
Convolutional neural network
74.7K papers, 2M citations
84% related
Deep learning
79.8K papers, 2.1M citations
83% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    3
2022    8
2021    128
2020    119
2019    104
2018    63