Proceedings ArticleDOI

Stock Market Prediction Using Multi Expression Programming

TL;DR: A genetic programming technique (called multi-expression programming) for the prediction of two stock indices is introduced and the performance is compared with an artificial neural network trained using Levenberg-Marquardt algorithm, support vector machine, Takagi-Sugeno Neuro-Fuzzy model and difference boosting neural network.
Abstract: The use of intelligent systems for stock market prediction has been widely established. In this paper, we introduce a genetic programming technique (called multi-expression programming) for the prediction of two stock indices. The performance is then compared with an artificial neural network trained using the Levenberg-Marquardt algorithm, a support vector machine, a Takagi-Sugeno neuro-fuzzy model and a difference boosting neural network. We considered the Nasdaq-100 index of the Nasdaq Stock Market and the S&P CNX NIFTY stock index as test data.
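To make the headline technique concrete: a multi-expression programming (MEP) chromosome encodes several candidate expressions at once and is scored by the best of them. The sketch below illustrates only that decode-and-score step; the gene encoding, operator set and MAE fitness are assumptions for illustration, not the authors' exact implementation.

```python
# Minimal, hedged sketch of decoding and scoring an MEP chromosome.
import numpy as np

FUNCTIONS = {"+": np.add, "-": np.subtract, "*": np.multiply}

def evaluate_chromosome(genes, X, y):
    """Each gene is either ("var", i) -- input variable i -- or (op, j, k),
    applying op to the outputs of earlier genes j and k (j, k < current index).
    Every gene encodes one expression; the chromosome is scored by the error
    of its best expression on the training data."""
    values, errors = [], []
    for gene in genes:
        if gene[0] == "var":
            out = X[:, gene[1]]
        else:
            op, j, k = gene
            out = FUNCTIONS[op](values[j], values[k])
        values.append(out)
        errors.append(np.mean(np.abs(out - y)))  # MAE of this expression
    best = int(np.argmin(errors))
    return errors[best], best

# Toy usage: two input features, target equals their sum
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = X[:, 0] + X[:, 1]
chromosome = [("var", 0), ("var", 1), ("+", 0, 1), ("*", 0, 1)]
print(evaluate_chromosome(chromosome, X, y))  # best expression is gene 2 (the sum), error 0.0
```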


Citations
Journal ArticleDOI
TL;DR: This paper attempts to model and predict the return on stock price index of the Istanbul Stock Exchange (ISE) with ANFIS and reveals that the model successfully forecasts the monthly return of ISE National 100 Index with an accuracy rate of 98.3%.
Abstract: Stock market prediction is important and of great interest because successful prediction of stock prices may promise attractive benefits. These tasks are highly complicated and very difficult. In this paper, we investigate the predictability of stock market return with Adaptive Network-Based Fuzzy Inference System (ANFIS). The objective of this study is to determine whether an ANFIS algorithm is capable of accurately predicting stock market return. We attempt to model and predict the return on stock price index of the Istanbul Stock Exchange (ISE) with ANFIS. We use six macroeconomic variables and three indices as input variables. The experimental results reveal that the model successfully forecasts the monthly return of ISE National 100 Index with an accuracy rate of 98.3%. ANFIS provides a promising alternative for stock market prediction. ANFIS can be a useful tool for economists and practitioners dealing with the forecasting of the stock price index return.
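For readers unfamiliar with the model family ANFIS tunes, the sketch below shows a bare first-order Takagi-Sugeno (Sugeno-type) inference step. The membership parameters, rules and inputs are hypothetical placeholders; a real ANFIS would learn them from the macroeconomic variables and indices mentioned above.

```python
# Hedged sketch of first-order Takagi-Sugeno fuzzy inference (the model ANFIS adapts).
import numpy as np

def gauss(x, c, s):
    """Gaussian membership value(s) of x for centre c and width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno_predict(x, rules):
    """Each rule is (centres, widths, (p, r)) with a linear consequent p.x + r;
    the output is the firing-strength-weighted average of the consequents."""
    weights, outputs = [], []
    for centres, widths, (p, r) in rules:
        w = np.prod(gauss(x, centres, widths))  # product T-norm firing strength
        weights.append(w)
        outputs.append(float(np.dot(p, x) + r))
    weights = np.array(weights)
    return float(np.dot(weights, outputs) / weights.sum())

# Toy usage: two inputs, two hypothetical rules
rules = [
    (np.array([0.0, 0.0]), np.array([1.0, 1.0]), (np.array([0.5, -0.2]), 0.1)),
    (np.array([1.0, 1.0]), np.array([1.0, 1.0]), (np.array([0.1, 0.3]), -0.05)),
]
print(sugeno_predict(np.array([0.4, 0.7]), rules))
```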

357 citations

Proceedings ArticleDOI
16 Apr 2013
TL;DR: An empirical analysis is conducted to study the effect of three model selection criteria across two data transformations on the performance of GP while modeling stocks indexed in the New York Stock Exchange (NYSE).
Abstract: Genetic programming (GP) and its variants have been extensively applied to modeling of stock markets. To improve the generalization ability of the model, GP has been hybridized with its own variants (gene expression programming (GEP), multi expression programming (MEP)) or with other methods such as neural networks and boosting. The generalization ability of a GP model can also be improved by an appropriate choice of model selection criterion. In the past, several model selection criteria have been applied. In addition, data transformations have a significant impact on the performance of GP models. The literature reveals that few researchers have paid attention to the model selection criterion and data transformation while modeling stock markets using GP. The objective of this paper is to identify the most appropriate model selection criterion and transformation that give better generalized GP models. The present work therefore conducts an empirical analysis to study the effect of three model selection criteria across two data transformations on the performance of GP while modeling stocks indexed in the New York Stock Exchange (NYSE). It was found that the FPE criterion showed a better fit for the GP model on both data transformations as compared to the other model selection criteria.
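Assuming FPE here denotes Akaike's final prediction error, a small worked comparison shows how such a criterion trades training error against model size. The candidate models, sample count and MSE values below are invented purely for illustration.

```python
# Hedged sketch: comparing hypothetical candidate models with Akaike's FPE.
def fpe(mse: float, n_samples: int, n_params: int) -> float:
    """Akaike's final prediction error: FPE = MSE * (N + k) / (N - k)."""
    return mse * (n_samples + n_params) / (n_samples - n_params)

# Hypothetical candidates: (training MSE, effective parameter count)
candidates = {
    "small GP tree":  (0.042, 7),
    "medium GP tree": (0.031, 15),
    "large GP tree":  (0.029, 40),
}

N = 500  # illustrative number of training samples
scores = {name: fpe(mse, N, k) for name, (mse, k) in candidates.items()}
best = min(scores, key=scores.get)
print(scores, "->", best)  # the largest tree's lower MSE need not survive the complexity penalty
```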

37 citations


Cites methods from "Stock Market Prediction Using Multi..."

  • ...In these applications, the GP method applied is either its variant (gene expression programming (GEP) [8-10], multi expression programming (MEP) [11]) or methods of ensembles [8, 12]....


Journal ArticleDOI
TL;DR: This paper reviews the main GP variants with linear representation, namely Linear Genetic Programming, Gene Expression Programming, Multi Expression Programming, Grammatical Evolution, Cartesian Genetic Programming and Stack-Based Genetic Programming.
Abstract: Genetic Programming (GP) is an automated method for creating computer programs starting from a high-level description of the problem to be solved. Many variants of GP have been proposed in recent years. In this paper we review the main GP variants with linear representation, namely Linear Genetic Programming, Gene Expression Programming, Multi Expression Programming, Grammatical Evolution, Cartesian Genetic Programming and Stack-Based Genetic Programming. A complete description is provided for each method. The set of applications where the methods have been applied and several Internet sites with more information about them are also given.

36 citations

01 Jan 2009
TL;DR: The results showed that the translated NSMP prediction approach was superior to untranslated NSMP prediction, with a much lower mean relative percentage error across all hidden-layer topologies of the error back propagation network.
Abstract: This paper used an error back propagation algorithm and regression analysis to analyze and predict untranslated and translated Nigeria Stock Market Price (NSMP). Nigeria stock market prices were collected for a period of seven hundred and twenty days and grouped into untranslated and translated training, validation and test data. A zero-mean, unit-variance transformation was used to normalize the input variables so that they share the same range, since they differ by orders of magnitude. A 5-j-1 network topology was adopted because there were five input variables; j, the number of hidden neurons, was determined during network selection. The untranslated and translated data served as input to the error back propagation algorithm and the regression model, both of which were written in the Java programming language. The results for the untranslated and translated cases were analyzed and compared. The performance of translated NSMP using regression analysis or error back propagation was superior to that of untranslated NSMP. The results also showed that the percentage prediction accuracy of the error back propagation model was around 11.3% for untranslated NSMP, as against 2.7% for translated NSMP; the 2.7% figure indicates the relative stability of translated NSMP prediction compared with untranslated NSMP. The mean relative percentage error was also much lower for translated NSMP than for untranslated NSMP across all hidden topologies of the error back propagation network. This indicates that the translated NSMP prediction approach is superior to untranslated NSMP prediction.
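A rough sketch of the preprocessing and 5-j-1 topology described above, with scikit-learn standing in for the paper's hand-written Java back-propagation code; the hidden-layer size j, the train/hold-out split and the synthetic data are placeholders.

```python
# Hedged sketch: zero-mean, unit-variance scaling plus a 5-j-1 back-propagation network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler  # zero mean, unit variance

rng = np.random.default_rng(0)
X = rng.normal(size=(720, 5))                              # 720 days, 5 input variables
y = 5.0 + X @ np.array([0.3, -0.1, 0.2, 0.05, 0.4]) + rng.normal(scale=0.1, size=720)

X_scaled = StandardScaler().fit_transform(X)               # the zero-mean, unit-variance step

j = 8                                                      # hidden neurons, chosen during model selection
model = MLPRegressor(hidden_layer_sizes=(j,), max_iter=2000, random_state=0)  # 5-j-1 topology
model.fit(X_scaled[:500], y[:500])                         # simple train / hold-out split
pred = model.predict(X_scaled[500:])
mrpe = np.mean(np.abs((y[500:] - pred) / y[500:])) * 100   # mean relative percentage error
print(f"MRPE: {mrpe:.2f}%")
```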

29 citations


Cites background from "Stock Market Prediction Using Multi..."

  • ...These data might have been affected by inflation or fluctuation of exchange rates especially in developed countries such as Nigeria....


Journal ArticleDOI
TL;DR: The motif tracking algorithm (MTA), a novel immune-inspired (IS) pattern identification tool that is able to identify unknown motifs of a non-specified length which repeat within time series data, is introduced.
Abstract: The search for patterns or motifs in data represents a problem area of key interest to finance and economics researchers. In this paper, we introduce the motif tracking algorithm (MTA), a novel immune-inspired (IS) pattern identification tool that is able to identify unknown motifs of a non-specified length which repeat within time series data. The power of the algorithm comes from the fact that it uses a small number of parameters with minimal assumptions regarding the data being examined or the underlying motifs. Our interest lies in applying the algorithm to financial time series data to identify unknown patterns that exist. The algorithm is tested using three separate data sets. Particular suitability to financial data is shown by applying it to oil price data. In all cases, the algorithm identifies the presence of a motif population in a fast and efficient manner due to the utilization of an intuitive symbolic representation. The resulting population of motifs is shown to have considerable potential value for other applications such as forecasting and algorithm seeding.
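The symbolic-representation idea behind motif detection can be sketched with a toy stand-in: discretize the series into letters, then count repeated fixed-length words. This is not the immune-inspired MTA itself (which also handles motifs of unspecified length); the binning scheme, word length and test series below are assumptions for illustration.

```python
# Toy sketch of symbolic discretization plus repeated-word counting for motif candidates.
from collections import Counter
import numpy as np

def symbolize(series, n_bins=4):
    """Map each point to a letter by equal-frequency binning (a SAX-like step)."""
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    return "".join("abcd"[int(np.searchsorted(edges, x))] for x in series)

def repeated_words(symbols, word_len=4, min_count=2):
    """Fixed-length symbolic words occurring at least min_count times are motif candidates."""
    counts = Counter(symbols[i:i + word_len] for i in range(len(symbols) - word_len + 1))
    return {w: c for w, c in counts.items() if c >= min_count}

rng = np.random.default_rng(1)
t = np.arange(200)
series = np.sin(2 * np.pi * t / 25) + 0.1 * rng.normal(size=200)  # a repeating pattern plus noise
print(repeated_words(symbolize(series)))
```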

26 citations


Additional excerpts

  • ...Neural networks[1, 2] genetic programming[ 3 ] and genetic algorithms[4] are all examples of methods that have been so far applied to time series evaluation and prediction....


References
Book
Vladimir Vapnik1
01 Jan 1995
TL;DR: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?
Abstract: Setting of the learning problem; consistency of learning processes; bounds on the rate of convergence of learning processes; controlling the generalization ability of learning processes; constructing learning algorithms; what is important in learning theory?

40,147 citations


"Stock Market Prediction Using Multi..." refers background or methods in this paper

  • ...The four techniques used in experiments are: an artificial neural network trained using the Levenberg-Marquardt algorithm, support vector machine [17], difference boosting neural network [14], a Takagi-Sugeno fuzzy inference system learned using a neural network algorithm (neuro-fuzzy model) [8]....


  • ...When µ is large, this becomes gradient descent with a small step size....

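For context on the excerpt above about the damping parameter µ: a standard textbook statement of the Levenberg-Marquardt weight update (not quoted from the paper) is given below.

```latex
% Standard Levenberg-Marquardt weight update (textbook form, not quoted from the paper):
% J is the Jacobian of the error vector e with respect to the network weights w.
\[
  \Delta w \;=\; -\bigl(J^{\top} J + \mu I\bigr)^{-1} J^{\top} e .
\]
% As \mu \to 0 the step approaches the Gauss-Newton update; for large \mu,
% \Delta w \approx -\tfrac{1}{\mu} J^{\top} e, i.e. gradient descent with a
% small step size, which is the behaviour the excerpt describes.
```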

Book
01 Jan 1996
TL;DR: This text provides a comprehensive treatment of the methodologies underlying neuro-fuzzy and soft computing with equal emphasis on theoretical aspects of covered methodologies, empirical observations, and verifications of various applications in practice.
Abstract: Included in Prentice Hall's MATLAB Curriculum Series, this text provides a comprehensive treatment of the methodologies underlying neuro-fuzzy and soft computing. The book places equal emphasis on theoretical aspects of covered methodologies, empirical observations, and verifications of various applications in practice.

4,082 citations

Journal ArticleDOI
Neuro-Fuzzy and Soft Computing: A Computational Approach to Learning and Machine Intelligence.

3,932 citations

Journal ArticleDOI
The Nature of Statistical Learning Theory.

2,716 citations


"Stock Market Prediction Using Multi..." refers methods in this paper

  • ...Results analysis and discussions Table 6 summarizes the results achieved for the two stock indices using the five intelligent paradigms (SVM, NF, ANN, DBNN, MEP)....


  • ...The SVM approach transforms data into a feature space F that usually has a huge dimension....


  • ...Both SVM (Gaussian kernel with γ = 3) and DBNN took less than one second to learn the two data sets [2]....


  • ...Vapnik [16] shows how training a SVM for the pattern recognition problem leads to the following quadratic optimization problem: minimize $W(\alpha) = -\sum_{i=1}^{l} \alpha_i + \frac{1}{2}\sum_{i=1}^{l}\sum_{j=1}^{l} y_i y_j \alpha_i \alpha_j k(x_i, x_j)$ (2), subject to $\sum_{i=1}^{l} y_i \alpha_i = 0$ and $\forall i: 0 \le \alpha_i \le C$ (3), where $l$ is the number of training examples, $\alpha$ is a vector of $l$ variables and each component $\alpha_i$ corresponds to a training example $(x_i, y_i)$....


  • ...DBNN is based on the Bayes principle that assumes the clustering of attribute values while boosting the attribute differences [17]....

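The excerpts above quote the SVM dual problem (2)-(3) and mention a Gaussian kernel with γ = 3. As a minimal, hedged sketch, scikit-learn's SVC solves the same dual internally; the toy data and the value of C below are placeholders.

```python
# Hedged sketch: SVM with a Gaussian (RBF) kernel, gamma = 3, on placeholder data.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(int)       # a simple non-linearly-separable target

# k(x_i, x_j) = exp(-gamma * ||x_i - x_j||^2); gamma = 3 as in the excerpt, C is a placeholder
clf = SVC(kernel="rbf", gamma=3.0, C=1.0)
clf.fit(X, y)                                  # internally solves the dual problem (2)-(3)
print("support vectors:", clf.support_vectors_.shape[0])
print("dual coefficients y_i * alpha_i:", clf.dual_coef_.shape)
print("training accuracy:", clf.score(X, y))
```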

Journal ArticleDOI
TL;DR: Experimental results show that SVM outperforms a multi-layer back-propagation neural network in forecasting financial time series, and, since there is no structured way to choose the free parameters of SVMs, the variability in performance with respect to those parameters is also investigated.
Abstract: This paper deals with the application of a novel neural network technique, support vector machine (SVM), in financial time series forecasting. The objective of this paper is to examine the feasibility of SVM in financial time series forecasting by comparing it with a multi-layer back-propagation (BP) neural network. Five real futures contracts that are collated from the Chicago Mercantile Market are used as the data sets. The experiment shows that SVM outperforms the BP neural network based on the criteria of normalized mean square error (NMSE), mean absolute error (MAE), directional symmetry (DS) and weighted directional symmetry (WDS). Since there is no structured way to choose the free parameters of SVMs, the variability in performance with respect to the free parameters is investigated in this study. Analysis of the experimental results proved that it is advantageous to apply SVMs to forecast financial time series.
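The evaluation metrics named in this abstract can be sketched directly; the formulas below follow their common definitions in the forecasting literature (weighted directional symmetry is omitted), and the toy arrays are placeholders.

```python
# Hedged sketch of NMSE, MAE and directional symmetry on placeholder data.
import numpy as np

def nmse(y, yhat):
    """Normalized MSE: sum of squared errors over sum of squared deviations of y from its mean."""
    return np.sum((y - yhat) ** 2) / np.sum((y - np.mean(y)) ** 2)

def mae(y, yhat):
    return np.mean(np.abs(y - yhat))

def directional_symmetry(y, yhat):
    """Percentage of periods in which predicted and actual changes share the same sign."""
    dy, dyhat = np.diff(y), np.diff(yhat)
    return np.mean(np.sign(dy) == np.sign(dyhat)) * 100.0

y    = np.array([1.00, 1.02, 1.01, 1.05, 1.04, 1.08])
yhat = np.array([1.01, 1.03, 1.02, 1.04, 1.05, 1.07])
print(nmse(y, yhat), mae(y, yhat), directional_symmetry(y, yhat))
```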

1,104 citations