Journal ArticleDOI

Hybrid credit ranking intelligent system using expert system and artificial neural networks

01 Feb 2011 - Applied Intelligence (Springer US) - Vol. 34, Iss. 1, pp. 28-46
TL;DR: A hybrid intelligent system for credit ranking is built with reasoning-transformational models, combining an expert system as the symbolic module and an artificial neural network as the non-symbolic module.
Abstract: The main goal of all commercial banks is to collect the savings of legal and real persons and to allocate them as credit to industrial, service and production companies. Non-repayment of such credits causes many problems for banks, such as the inability to repay the central bank's loans, a growing gap between credit allocated and credit repaid, and the inability to extend further credit to customers. The importance of credit allocation in the banking industry, and its central role in economic growth and employment creation, has led to the development of many models for evaluating the credit risk of applicants. However, many of these models are classical and cannot carry out credit evaluation completely and efficiently, so the demand for artificial intelligence in this field has grown. In this paper, after building an appropriate credit ranking model and collecting experts' knowledge, we design a hybrid intelligent system for credit ranking based on reasoning-transformational models. An expert system serves as the symbolic module and an artificial neural network as the non-symbolic module of this hybrid system. Such models combine the distinctive strengths of each component: the reasoning and explanation capability of the expert system and the generalization and adaptability of the artificial neural network. The results demonstrate that the hybrid intelligent system is more accurate and more powerful at credit ranking than expert systems and traditional banking models.
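
The paper's rule base and network configuration are not reproduced on this page, so the sketch below is only a minimal illustration of the described architecture: hypothetical screening rules act as the symbolic module and a small back-propagation network as the non-symbolic module. All feature names, thresholds and network sizes are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch of a symbolic + non-symbolic hybrid credit ranker.
# Rules, features and network size are illustrative assumptions only.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic applicants: [income, debt_ratio, years_employed, past_defaults]
X = rng.random((500, 4)) * [100_000, 1.0, 20, 5]
y = ((X[:, 1] < 0.5) & (X[:, 3] < 2)).astype(int)   # toy "creditworthy" label

def expert_system(applicant):
    """Symbolic module: hand-written screening rules (hypothetical)."""
    income, debt_ratio, years_employed, past_defaults = applicant
    if past_defaults >= 3:
        return "reject", "too many past defaults"
    if debt_ratio > 0.8:
        return "reject", "debt ratio above 0.8"
    return "refer", "no rule fired; refer to neural module"

# Non-symbolic module: a small feedforward network trained by back-propagation.
ann = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
ann.fit(X, y)

def hybrid_rank(applicant):
    decision, reason = expert_system(applicant)
    if decision == "reject":
        return 0.0, reason                        # explainable rejection
    score = ann.predict_proba([applicant])[0, 1]  # generalization from data
    return float(score), "scored by ANN"

print(hybrid_rank([45_000, 0.35, 6, 0]))
```
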
Citations
Journal ArticleDOI
TL;DR: A comparative review of three well-known artificial intelligence techniques in financial markets shows that their accuracy is superior to that of traditional statistical methods in dealing with financial problems, especially regarding nonlinear patterns.
Abstract: Nowadays, many real financial applications exhibit nonlinear and uncertain behaviors that change over time. Therefore, the need to solve highly nonlinear, time-variant problems has been growing rapidly. These problems, along with other shortcomings of traditional models, have led to growing interest in artificial intelligence techniques. In this paper, a comparative review of three well-known artificial intelligence techniques, i.e., artificial neural networks, expert systems and hybrid intelligence systems, in the financial market is carried out. The financial market is divided into three domains: credit evaluation, portfolio management, and financial prediction and planning. For each technique, the most notable and especially the most recent studies are discussed comparatively. The results show that the accuracy of these artificial intelligence methods is superior to that of traditional statistical methods in dealing with financial problems, especially regarding nonlinear patterns. However, this outperformance is not absolute.

404 citations


Cites methods from "Hybrid credit ranking intelligent s..."

  • ...sive partitioning algorithm have been suggested to be used in credit scoring [27, 28] but with the growth and development of the credit industry and the large loan portfolios under management nowadays, the industry is frequently developing and using more accurate credit scoring models....

  • ...[28] | Credit ranking | ES and BPNN | ES and conventional banking methods | Hybrid model performs better...

  • ...Finally in 2009 [28] designed a credit ranking HIS using ES and ANNs....

Journal ArticleDOI
TL;DR: The generalized hidden-mapping transductive learning method is proposed to realize transfer learning for several classical intelligent models, including feedforward neural networks, fuzzy systems, and kernelized linear models, which can be trained effectively even though the data available are insufficient for model training.
Abstract: Electroencephalogram (EEG) signal identification based on intelligent models is an important means of epilepsy detection. In the recognition of epileptic EEG signals, traditional intelligent methods usually assume that the training dataset and testing dataset have the same distribution and that the data available for training are adequate. However, these two conditions cannot always be met in practice, which reduces the ability of the resulting recognition model to detect epileptic EEG signals. To overcome this issue, an effective strategy is to introduce transfer learning into the construction of the intelligent models, where knowledge is learned from related scenes (source domains) to enhance the performance of the model trained in the current scene (target domain). Although transfer learning has been used in EEG signal identification, many existing transfer learning techniques are designed only for a specific intelligent model, which limits their applicability to other classical intelligent models. To extend the scope of application, the generalized hidden-mapping transductive learning method is proposed to realize transfer learning for several classical intelligent models, including feedforward neural networks, fuzzy systems, and kernelized linear models. These intelligent models can be trained effectively by the proposed method even when the available data are insufficient for model training, and the generalization ability of the trained model is also enhanced by transductive learning. A number of experiments are carried out to demonstrate the effectiveness of the proposed method in epileptic EEG recognition. The results show that the method is highly competitive with or superior to some existing state-of-the-art methods.
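
As a point of reference only, the snippet below illustrates the general transfer-learning idea the abstract appeals to: borrowing plentiful source-domain data when target-domain labels are scarce. It is a generic instance-pooling baseline on synthetic data, not the generalized hidden-mapping transductive method proposed in the paper.

```python
# Generic transfer-learning illustration: pool abundant source-domain data
# with a small target-domain set and compare against target-only training.
# NOT the paper's generalized hidden-mapping transductive method; the data
# and the classifier are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)

def make_domain(n, shift):
    """Two-class Gaussian data; `shift` mimics a distribution mismatch."""
    X0 = rng.normal(loc=0.0 + shift, scale=1.0, size=(n, 8))
    X1 = rng.normal(loc=1.0 + shift, scale=1.0, size=(n, 8))
    return np.vstack([X0, X1]), np.array([0] * n + [1] * n)

Xs, ys = make_domain(500, shift=0.3)   # related source domain (plentiful)
Xt, yt = make_domain(15, shift=0.0)    # target domain (scarce labels)
Xtest, ytest = make_domain(200, shift=0.0)

target_only = LogisticRegression(max_iter=1000).fit(Xt, yt)
pooled = LogisticRegression(max_iter=1000).fit(
    np.vstack([Xs, Xt]), np.concatenate([ys, yt])
)

print("target-only accuracy :", accuracy_score(ytest, target_only.predict(Xtest)))
print("source+target accuracy:", accuracy_score(ytest, pooled.predict(Xtest)))
```
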

50 citations


Cites background from "Hybrid credit ranking intelligent s..."

  • ..., speech recognition [12], signal processing [13] and robotics control [14]....

Journal ArticleDOI
TL;DR: The proposed model (RST–ANN–CBR) is able to achieve more accurate credit scoring than four other methods and is validated to recover potentially lost customers and to increase business revenues.
Abstract: The development of an effective credit scoring model has become a very important issue as the credit industry is confronted with ever-intensifying competition and aggravating bad debt problems. During the past few years, a substantial number of studies in the field of statistics have been conducted to improve the accuracy of credit scoring models. In order to refine the classification and decrease misclassification, this paper presents a two-stage model. Focusing on classification, the first stage aims at constructing an artificial neural network (ANN)-based credit scoring model to categorize applicants into the group of accepted (good) credit and the group of rejected (bad) credit. Switching from classification to reassignment, the second stage proceeds to reduce the Type I error by retrieving the originally rejected good credit applicants to conditional acceptance using the Case-Based Reasoning (CBR) classification technique. The proposed model (RST–ANN–CBR) is applied to a credit card dataset to verify its effectiveness. As the results indicate, the proposed model is able to achieve more accurate credit scoring than four other methods; more importantly, it is validated to recover potentially lost customers and to increase business revenues.
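
The page does not reproduce the authors' pipeline, so the following is a hypothetical sketch of the two-stage idea: an ANN first accepts or rejects applicants, and rejected cases are re-examined against a base of known good cases, with a simple nearest-neighbour retrieval standing in for case-based reasoning. The data, thresholds and the k-NN stand-in are assumptions, not the RST–ANN–CBR model.

```python
# Two-stage sketch: ANN classification, then CBR-style reassignment of
# rejected applicants via nearest-neighbour case retrieval. Synthetic data
# and a k-NN stand-in for CBR are assumptions, not the RST-ANN-CBR model.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(2)
X = rng.random((600, 5))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)        # 1 = good credit (toy rule)

# Stage 1: an ANN accepts or rejects applicants.
ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
ann.fit(X, y)

case_base = X[y == 1]                            # known good-credit cases

def score(applicant, k=5, radius=0.35):
    applicant = np.asarray(applicant).reshape(1, -1)
    if ann.predict(applicant)[0] == 1:
        return "accept"
    # Stage 2: retrieve similar good cases; conditionally accept if the
    # rejected applicant closely resembles them (reducing the Type I error).
    nn = NearestNeighbors(n_neighbors=k).fit(case_base)
    dist, _ = nn.kneighbors(applicant)
    return "conditional accept" if dist.mean() < radius else "reject"

print(score(rng.random(5)))
```
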

48 citations

Journal ArticleDOI
TL;DR: The improved performance of FS-HB is attributed to building the classifier on the most important features, which reduces the complexity of the algorithm, and to the ensemble methodology, which improves the classical bias-variance trade-off and outperforms standalone classifiers.
Abstract: Hybrid models based on feature selection and machine learning techniques have significantly enhanced the accuracy of standalone models. This paper presents a feature selection-based hybrid-bagging algorithm (FS-HB) for improved credit risk evaluation. Two feature selection methods, chi-square and principal component analysis, were used for ranking and selecting the important features from the datasets. The classifiers were built on five training and test data partitions of the input dataset. The performance of the hybrid algorithm was compared with that of the standalone classifiers: feature selection-based classifiers and bagging. The hybrid FS-HB algorithm performed best for the qualitative dataset with fewer features and a tree-based unstable base classifier. Its performance on numeric data was also better than that of the other standalone classifiers, and comparable to bagging with only selected features. Its performance was better on the 70:30 data partition, and the Type II error, which is very significant in risk evaluation, was also reduced substantially. The improved performance of FS-HB is attributed to building the classifier on the most important features, which reduces the complexity of the algorithm, and to the ensemble methodology, which improves the classical bias-variance trade-off and outperforms standalone classifiers.
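
A rough sketch in the spirit of FS-HB follows: chi-square feature ranking feeding a bagged decision-tree ensemble evaluated on a 70:30 partition. The dataset, the number of selected features and the ensemble size are illustrative assumptions, not the paper's experimental setup.

```python
# Feature-selection-based bagging sketch: chi-square ranking, then a bagged
# tree ensemble, evaluated on a 70:30 split. Parameters are assumptions.
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(3)
X = rng.random((1000, 20))                       # chi2 needs non-negative X
y = (X[:, 0] + 0.5 * X[:, 3] > 0.9).astype(int)  # toy credit-risk label

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.30, random_state=0)        # 70:30 partition

fs_hb = make_pipeline(
    SelectKBest(chi2, k=8),                      # keep the top-ranked features
    BaggingClassifier(DecisionTreeClassifier(),  # unstable tree base learner
                      n_estimators=25, random_state=0),
)
fs_hb.fit(X_tr, y_tr)
print("test accuracy:", fs_hb.score(X_te, y_te))
```
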

34 citations


Cites background from "Hybrid credit ranking intelligent s..."

  • ...Bahrammirzaee and Ghatari (2011) categorized the hybrid models into four types....

References
Book
16 Jul 1998
TL;DR: Thorough, well-organized, and completely up to date, this book examines all the important aspects of this emerging technology, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks.
Abstract: From the Publisher: This book represents the most comprehensive treatment available of neural networks from an engineering perspective. Thorough, well-organized, and completely up to date, it examines all the important aspects of this emerging technology, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks. Written in a concise and fluid manner, by a foremost engineering textbook author, to make the material more accessible, this book is ideal for professional engineers and graduate students entering this exciting field. Computer experiments, problems, worked examples, a bibliography, photographs, and illustrations reinforce key concepts.

29,130 citations

Book ChapterDOI
01 Jan 1988
TL;DR: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion.
Abstract: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion
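
For reference, the generalized delta rule named in these section titles is the familiar back-propagation weight update, which in the usual notation (learning rate $\eta$, unit outputs $o$, targets $t$, net inputs $\mathrm{net}$, activation $f$) reads:

```latex
\Delta w_{ji} = \eta \, \delta_j \, o_i, \qquad
\delta_j =
\begin{cases}
  (t_j - o_j)\, f'(\mathrm{net}_j) & \text{if } j \text{ is an output unit},\\[4pt]
  f'(\mathrm{net}_j) \sum_k \delta_k \, w_{kj} & \text{if } j \text{ is a hidden unit}.
\end{cases}
```
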

17,604 citations

Book
03 Jan 1986
TL;DR: This chapter introduces the generalized delta rule, presents simulation results, and discusses some further generalizations.
Abstract: This chapter contains sections titled: The Problem, The Generalized Delta Rule, Simulation Results, Some Further Generalizations, Conclusion

13,579 citations

Book
01 Jan 1984
TL;DR: The purpose and nature of biological memory, as well as its various aspects, are explained.
Abstract: 1. Various Aspects of Memory.- 1.1 On the Purpose and Nature of Biological Memory.- 1.1.1 Some Fundamental Concepts.- 1.1.2 The Classical Laws of Association.- 1.1.3 On Different Levels of Modelling.- 1.2 Questions Concerning the Fundamental Mechanisms of Memory.- 1.2.1 Where Do the Signals Relating to Memory Act Upon?.- 1.2.2 What Kind of Encoding is Used for Neural Signals?.- 1.2.3 What are the Variable Memory Elements?.- 1.2.4 How are Neural Signals Addressed in Memory?.- 1.3 Elementary Operations Implemented by Associative Memory.- 1.3.1 Associative Recall.- 1.3.2 Production of Sequences from the Associative Memory.- 1.3.3 On the Meaning of Background and Context.- 1.4 More Abstract Aspects of Memory.- 1.4.1 The Problem of Infinite-State Memory.- 1.4.2 Invariant Representations.- 1.4.3 Symbolic Representations.- 1.4.4 Virtual Images.- 1.4.5 The Logic of Stored Knowledge.- 2. Pattern Mathematics.- 2.1 Mathematical Notations and Methods.- 2.1.1 Vector Space Concepts.- 2.1.2 Matrix Notations.- 2.1.3 Further Properties of Matrices.- 2.1.4 Matrix Equations.- 2.1.5 Projection Operators.- 2.1.6 On Matrix Differential Calculus.- 2.2 Distance Measures for Patterns.- 2.2.1 Measures of Similarity and Distance in Vector Spaces.- 2.2.2 Measures of Similarity and Distance Between Symbol Strings.- 2.2.3 More Accurate Distance Measures for Text.- 3. Classical Learning Systems.- 3.1 The Adaptive Linear Element (Adaline).- 3.1.1 Description of Adaptation by the Stochastic Approximation.- 3.2 The Perceptron.- 3.3 The Learning Matrix.- 3.4 Physical Realization of Adaptive Weights.- 3.4.1 Perceptron and Adaline.- 3.4.2 Classical Conditioning.- 3.4.3 Conjunction Learning Switches.- 3.4.4 Digital Representation of Adaptive Circuits.- 3.4.5 Biological Components.- 4. A New Approach to Adaptive Filters.- 4.1 Survey of Some Necessary Functions.- 4.2 On the "Transfer Function" of the Neuron.- 4.3 Models for Basic Adaptive Units.- 4.3.1 On the Linearization of the Basic Unit.- 4.3.2 Various Cases of Adaptation Laws.- 4.3.3 Two Limit Theorems.- 4.3.4 The Novelty Detector.- 4.4 Adaptive Feedback Networks.- 4.4.1 The Autocorrelation Matrix Memory.- 4.4.2 The Novelty Filter.- 5. Self-Organizing Feature Maps.- 5.1 On the Feature Maps of the Brain.- 5.2 Formation of Localized Responses by Lateral Feedback.- 5.3 Computational Simplification of the Process.- 5.3.1 Definition of the Topology-Preserving Mapping.- 5.3.2 A Simple Two-Dimensional Self-Organizing System.- 5.4 Demonstrations of Simple Topology-Preserving Mappings.- 5.4.1 Images of Various Distributions of Input Vectors.- 5.4.2 "The Magic TV".- 5.4.3 Mapping by a Feeler Mechanism.- 5.5 Tonotopic Map.- 5.6 Formation of Hierarchical Representations.- 5.6.1 Taxonomy Example.- 5.6.2 Phoneme Map.- 5.7 Mathematical Treatment of Self-Organization.- 5.7.1 Ordering of Weights.- 5.7.2 Convergence Phase.- 5.8 Automatic Selection of Feature Dimensions.- 6. 
Optimal Associative Mappings.- 6.1 Transfer Function of an Associative Network.- 6.2 Autoassociative Recall as an Orthogonal Projection.- 6.2.1 Orthogonal Projections.- 6.2.2 Error-Correcting Properties of Projections.- 6.3 The Novelty Filter.- 6.3.1 Two Examples of Novelty Filter.- 6.3.2 Novelty Filter as an Autoassociative Memory.- 6.4 Autoassociative Encoding.- 6.4.1 An Example of Autoassociative Encoding.- 6.5 Optimal Associative Mappings.- 6.5.1 The Optimal Linear Associative Mapping.- 6.5.2 Optimal Nonlinear Associative Mappings.- 6.6 Relationship Between Associative Mapping, Linear Regression, and Linear Estimation.- 6.6.1 Relationship of the Associative Mapping to Linear Regression.- 6.6.2 Relationship of the Regression Solution to the Linear Estimator.- 6.7 Recursive Computation of the Optimal Associative Mapping.- 6.7.1 Linear Corrective Algorithms.- 6.7.2 Best Exact Solution (Gradient Projection).- 6.7.3 Best Approximate Solution (Regression).- 6.7.4 Recursive Solution in the General Case.- 6.8 Special Cases.- 6.8.1 The Correlation Matrix Memory.- 6.8.2 Relationship Between Conditional Averages and Optimal Estimator.- 7. Pattern Recognition.- 7.1 Discriminant Functions.- 7.2 Statistical Formulation of Pattern Classification.- 7.3 Comparison Methods.- 7.4 The Subspace Methods of Classification.- 7.4.1 The Basic Subspace Method.- 7.4.2 The Learning Subspace Method (LSM).- 7.5 Learning Vector Quantization.- 7.6 Feature Extraction.- 7.7 Clustering.- 7.7.1 Simple Clustering (Optimization Approach).- 7.7.2 Hierarchical Clustering (Taxonomy Approach).- 7.8 Structural Pattern Recognition Methods.- 8. More About Biological Memory.- 8.1 Physiological Foundations of Memory.- 8.1.1 On the Mechanisms of Memory in Biological Systems.- 8.1.2 Structural Features of Some Neural Networks.- 8.1.3 Functional Features of Neurons.- 8.1.4 Modelling of the Synaptic Plasticity.- 8.1.5 Can the Memory Capacity Ensue from Synaptic Changes?.- 8.2 The Unified Cortical Memory Model.- 8.2.1 The Laminar Network Organization.- 8.2.2 On the Roles of Interneurons.- 8.2.3 Representation of Knowledge Over Memory Fields.- 8.2.4 Self-Controlled Operation of Memory.- 8.3 Collateral Reading.- 8.3.1 Physiological Results Relevant to Modelling.- 8.3.2 Related Modelling.- 9. Notes on Neural Computing.- 9.1 First Theoretical Views of Neural Networks.- 9.2 Motives for the Neural Computing Research.- 9.3 What Could the Purpose of the Neural Networks be?.- 9.4 Definitions of Artificial "Neural Computing" and General Notes on Neural Modelling.- 9.5 Are the Biological Neural Functions Localized or Distributed?.- 9.6 Is Nonlinearity Essential to Neural Computing?.- 9.7 Characteristic Differences Between Neural and Digital Computers.- 9.7.1 The Degree of Parallelism of the Neural Networks is Still Higher than that of any "Massively Parallel" Digital Computer.- 9.7.2 Why the Neural Signals Cannot be Approximated by Boolean Variables.- 9.7.3 The Neural Circuits do not Implement Finite Automata.- 9.7.4 Undue Views of the Logic Equivalence of the Brain and Computers on a High Level.- 9.8 "Connectionist Models".- 9.9 How can the Neural Computers be Programmed?.- 10. Optical Associative Memories.- 10.1 Nonholographic Methods.- 10.2 General Aspects of Holographic Memories.- 10.3 A Simple Principle of Holographic Associative Memory.- 10.4 Addressing in Holographic Memories.- 10.5 Recent Advances of Optical Associative Memories.- Bibliography on Pattern Recognition.- References.

8,197 citations
