Proceedings ArticleDOI

A Review of Soft Computing Based on Deep Learning

01 Dec 2016-pp 136-144
TL;DR: This review not only reveals a promising direction for soft computing by incorporating deep learning, but also gives some suggestions for improving the performance of deep learning with soft computing techniques.
Abstract: A review of deep learning based soft computing techniques in several applications is presented. On one hand, soft computing, defined as a group of methodologies, is an important element for constructing a new generation of computational intelligent system and has gained great success in solving practical computing problems. On the other hand, deep learning has become one of the most promising techniques in artificial intelligence in the past decade. Since soft computing is an evolving collection of methodologies, by presenting the latest research results of soft computing based on deep learning, this review not only reveals a promising direction for soft computing by incorporating deep learning, but also gives some suggestions for improving the performance of deep learning with soft computing techniques.
Citations
Journal ArticleDOI
TL;DR: A Deep Rule-Based Fuzzy System (DRBFS) is proposed for accurate in-hospital mortality prediction in intensive care unit (ICU) patients using a large number of input variables, and it scales well to large heterogeneous data sets.

71 citations

Proceedings ArticleDOI
01 Dec 2017
TL;DR: To make the application of ML in SDM more efficient, a framework for dividing the workload between the two scientific communities is proposed; better predictive power can be derived from hybrid ML approaches such as ensemble learning.
Abstract: This paper introduces four representative machine learning (ML) methods, i.e., random forest, MaxEnt, support vector machine and artificial neural network, and reviews their application in species distribution modelling (SDM), an inherently interdisciplinary field that requires close collaboration between ecologists and data scientists. The benefits and flaws of these ML methods are discussed in detail with several examples from contemporary SDM studies. To make the application of ML in SDM more efficient, a framework for dividing the workload between the two scientific communities is proposed. Better predictive power can be derived from hybrid ML approaches such as ensemble learning. The deep learning method, however, remains largely underdeveloped in this field. Albeit challenging, the vast potential of deep learning in SDM remains to be realized, especially in an increasingly data-rich and open-access world.
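As a concrete illustration of the ensemble-learning idea mentioned in this abstract, below is a minimal sketch (not from the paper) that combines three of the reviewed methods, random forest, SVM, and a small neural network, in a soft-voting ensemble. The synthetic data, feature counts, and hyperparameters are illustrative stand-ins for environmental predictors and presence/absence labels, and MaxEnt is omitted because it has no standard scikit-learn implementation.

```python
# Hedged sketch: soft-voting ensemble for a hypothetical presence/absence SDM task.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Synthetic "occurrence" data: rows = sites, columns = environmental variables.
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("svm", SVC(probability=True, random_state=0)),
        ("ann", MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                              random_state=0)),
    ],
    voting="soft",  # average predicted probabilities across the three models
)
ensemble.fit(X_train, y_train)
print("held-out accuracy:", ensemble.score(X_test, y_test))
```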

19 citations


Cites background from "A Review of Soft Computing Based on..."

  • ...In fact, in-depth research has been conducted in this area and considerable progress has been achieved [63], [64]....

    [...]

Journal ArticleDOI
TL;DR: An α-cut-induced fuzzy layer is added to the deep neural network to generate the final change map, yielding a representation of the data that is better suited to classification and clustering.

13 citations

Journal ArticleDOI
TL;DR: The results demonstrated that the proposed nonlinear connectivity estimator was capable of detecting novel correlates: significant differences were observed among different states of consciousness not only in the presence of attention, which the linear method also detected, but also in the absence of attention.

6 citations

References
Book
01 Aug 1996
TL;DR: A separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
Abstract: A fuzzy set is a class of objects with a continuum of grades of membership. Such a set is characterized by a membership (characteristic) function which assigns to each object a grade of membership ranging between zero and one. The notions of inclusion, union, intersection, complement, relation, convexity, etc., are extended to such sets, and various properties of these notions in the context of fuzzy sets are established. In particular, a separation theorem for convex fuzzy sets is proved without requiring that the fuzzy sets be disjoint.
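As a sketch of the definitions in this abstract, the snippet below implements membership grades, union, intersection, complement, and inclusion for fuzzy sets on a small discrete universe, together with the standard α-cut (the crisp set of elements whose grade is at least α), the notion behind the α-cut fuzzy layer cited above. The universe and membership values are made up for illustration.

```python
# Hedged sketch of basic fuzzy-set operations on a discrete universe.
U = ["x1", "x2", "x3", "x4"]                       # universe of discourse
A = {"x1": 0.2, "x2": 0.7, "x3": 1.0, "x4": 0.0}   # membership grades in [0, 1]
B = {"x1": 0.5, "x2": 0.8, "x3": 1.0, "x4": 0.9}

union        = {x: max(A[x], B[x]) for x in U}     # grade in A OR B
intersection = {x: min(A[x], B[x]) for x in U}     # grade in A AND B
complement_A = {x: 1.0 - A[x] for x in U}          # grade in NOT A
inclusion    = all(A[x] <= B[x] for x in U)        # A is contained in B

def alpha_cut(fuzzy_set, alpha):
    """Crisp set of elements whose membership grade is at least alpha."""
    return {x for x, mu in fuzzy_set.items() if mu >= alpha}

print(union, intersection, complement_A, inclusion, alpha_cut(A, 0.5))
```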

52,705 citations

Journal ArticleDOI
28 May 2015-Nature
TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Abstract: Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These methods have dramatically improved the state-of-the-art in speech recognition, visual object recognition, object detection and many other domains such as drug discovery and genomics. Deep learning discovers intricate structure in large data sets by using the backpropagation algorithm to indicate how a machine should change its internal parameters that are used to compute the representation in each layer from the representation in the previous layer. Deep convolutional nets have brought about breakthroughs in processing images, video, speech and audio, whereas recurrent nets have shone light on sequential data such as text and speech.
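The following minimal NumPy sketch illustrates the backpropagation idea described in this abstract on a toy problem: a two-layer network whose parameters in each layer are updated from the error signal propagated back from the layer above. The data, architecture, loss, and learning rate are arbitrary illustrative choices, not anything taken from the paper.

```python
# Hedged sketch: one hidden layer, sigmoid output, cross-entropy loss.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                  # 64 samples, 3 input features
y = (X.sum(axis=1, keepdims=True) > 0) * 1.0  # toy binary target

W1, b1 = rng.normal(size=(3, 8)) * 0.1, np.zeros(8)   # hidden layer parameters
W2, b2 = rng.normal(size=(8, 1)) * 0.1, np.zeros(1)   # output layer parameters
lr = 0.1

for step in range(500):
    # Forward pass: each layer's representation is computed from the previous one.
    h = np.tanh(X @ W1 + b1)
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output probability

    # Backward pass: the chain rule propagates the error signal layer by layer.
    grad_out = (p - y) / len(X)                # cross-entropy + sigmoid gradient
    grad_W2 = h.T @ grad_out
    grad_b2 = grad_out.sum(axis=0)
    grad_h = grad_out @ W2.T * (1 - h ** 2)    # back through the tanh nonlinearity
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    # Gradient descent: each layer's internal parameters move against its gradient.
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

h = np.tanh(X @ W1 + b1)
p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))
print("training accuracy:", ((p > 0.5) == y).mean())
```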

46,982 citations

Journal ArticleDOI
01 Jan 1998
TL;DR: Gradient-based learning is shown to synthesize complex decision surfaces that classify high-dimensional patterns such as handwritten characters with minimal preprocessing, and a graph transformer network (GTN) is proposed so that multi-module recognition systems can be trained globally.
Abstract: Multilayer neural networks trained with the back-propagation algorithm constitute the best example of a successful gradient based learning technique. Given an appropriate network architecture, gradient-based learning algorithms can be used to synthesize a complex decision surface that can classify high-dimensional patterns, such as handwritten characters, with minimal preprocessing. This paper reviews various methods applied to handwritten character recognition and compares them on a standard handwritten digit recognition task. Convolutional neural networks, which are specifically designed to deal with the variability of 2D shapes, are shown to outperform all other techniques. Real-life document recognition systems are composed of multiple modules including field extraction, segmentation recognition, and language modeling. A new learning paradigm, called graph transformer networks (GTN), allows such multimodule systems to be trained globally using gradient-based methods so as to minimize an overall performance measure. Two systems for online handwriting recognition are described. Experiments demonstrate the advantage of global training, and the flexibility of graph transformer networks. A graph transformer network for reading a bank cheque is also described. It uses convolutional neural network character recognizers combined with global training techniques to provide record accuracy on business and personal cheques. It is deployed commercially and reads several million cheques per day.
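To make the convolutional ingredients of this abstract concrete, here is a small NumPy sketch of a single shared-kernel convolution followed by a pooling (subsampling) step, the two operations that give convolutional nets their tolerance to the variability of 2D shapes. It uses max pooling and a ReLU as modern stand-ins rather than the original average subsampling and sigmoid units, and the 8x8 image and edge kernel are hypothetical.

```python
# Hedged sketch: shared-kernel convolution + subsampling on a stand-in "digit".
import numpy as np

def conv2d_valid(image, kernel):
    """Slide one shared kernel over the image ('valid' convolution, stride 1)."""
    kh, kw = kernel.shape
    out_h, out_w = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool2x2(feature_map):
    """Downsample by taking the maximum of each non-overlapping 2x2 block."""
    h, w = feature_map.shape
    trimmed = feature_map[:h - h % 2, :w - w % 2]
    return trimmed.reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

image = np.random.default_rng(0).random((8, 8))     # stand-in for a digit image
edge_kernel = np.array([[1.0, -1.0], [1.0, -1.0]])  # one 2x2 filter shared everywhere
feature_map = np.maximum(conv2d_valid(image, edge_kernel), 0.0)  # ReLU nonlinearity
pooled = max_pool2x2(feature_map)
print(feature_map.shape, pooled.shape)              # (7, 7) -> (3, 3)
```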

42,067 citations

Proceedings ArticleDOI
06 Aug 2002
TL;DR: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced; the evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed.
Abstract: A concept for the optimization of nonlinear functions using particle swarm methodology is introduced. The evolution of several paradigms is outlined, and an implementation of one of the paradigms is discussed. Benchmark testing of the paradigm is described, and applications, including nonlinear function optimization and neural network training, are proposed. The relationships between particle swarm optimization and both artificial life and genetic algorithms are described.
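A minimal sketch of the particle swarm idea summarized in this abstract: each particle keeps its own best position and is attracted toward both that and the swarm-wide best. The sphere objective, bounds, and coefficient values are common illustrative choices, not taken from the paper.

```python
# Hedged sketch of particle swarm optimization on a simple nonlinear benchmark.
import numpy as np

def sphere(x):
    """Benchmark objective: sum of squared coordinates (minimum at the origin)."""
    return np.sum(x ** 2, axis=-1)

rng = np.random.default_rng(0)
n_particles, dim = 30, 5
pos = rng.uniform(-5.0, 5.0, size=(n_particles, dim))
vel = np.zeros_like(pos)
personal_best = pos.copy()
personal_best_val = sphere(pos)
global_best = personal_best[personal_best_val.argmin()].copy()

w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social coefficients
for _ in range(200):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    # Velocity update: keep momentum, pull toward own best and swarm best.
    vel = w * vel + c1 * r1 * (personal_best - pos) + c2 * r2 * (global_best - pos)
    pos = pos + vel
    vals = sphere(pos)
    improved = vals < personal_best_val
    personal_best[improved] = pos[improved]
    personal_best_val[improved] = vals[improved]
    global_best = personal_best[personal_best_val.argmin()].copy()

print("best value found:", personal_best_val.min())
```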

35,104 citations

Book
01 Jan 1975
TL;DR: The founding work in the area of adaptation and modification, which aims to mimic biological optimization, with notes on some (non-GA) branches of AI.
Abstract: The founding work in the area. Adaptation is key to survival and evolution, and evolution implicitly optimizes organisms. AI wants to mimic biological optimization: survival of the fittest; exploration and exploitation; niche finding; robustness across changing environments (mammals vs. dinosaurs); self-regulation, self-repair and self-reproduction. Some definitions of artificial intelligence: "making computers do what they do in the movies"; "making computers do what humans (currently) do best"; "giving computers common sense; letting them make simple decisions" (do as I want, not what I say); "anything too new to be pigeonholed". Adaptation and modification are at the root of intelligence. Some (non-GA) branches of AI: expert systems (rule-based deduction).
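A minimal sketch of the genetic-algorithm loop this book founded: a population of bit strings evolves through selection, crossover, and mutation. The OneMax fitness function, tournament selection, and parameter values are standard textbook choices for illustration, not Holland's original formulation (which used fitness-proportional selection).

```python
# Hedged sketch of a simple genetic algorithm on the OneMax toy problem.
import random

random.seed(0)
GENES, POP, GENERATIONS, MUT_RATE = 40, 50, 60, 0.02

def fitness(chromosome):
    """OneMax: count of 1-bits; the optimum is the all-ones string."""
    return sum(chromosome)

def tournament(population):
    """Pick the fitter of two randomly chosen individuals as a parent."""
    a, b = random.sample(population, 2)
    return a if fitness(a) >= fitness(b) else b

population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for _ in range(GENERATIONS):
    next_gen = []
    while len(next_gen) < POP:
        p1, p2 = tournament(population), tournament(population)
        cut = random.randrange(1, GENES)               # one-point crossover
        child = p1[:cut] + p2[cut:]
        child = [1 - g if random.random() < MUT_RATE else g for g in child]  # mutation
        next_gen.append(child)
    population = next_gen

print("best fitness:", max(fitness(c) for c in population))
```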

32,573 citations


Additional excerpts

  • ...Two members of EC, which are most frequently used in soft computing, are genetic algorithm (GA) [25] proposed by Holland in 1975 and differential...

    [...]

Trending Questions (1)
What is the relationship between soft computing and deep reinforcement learning?

The paper presents deep learning, which encompasses deep reinforcement learning, as a promising direction for soft computing, and suggests that soft computing techniques can in turn improve the performance of deep learning.