
Showing papers in "Computing in Economics and Finance in 2016"


Journal ArticleDOI
TL;DR: Overall, it is found that network based representations of correlations within a broad market index are useful in providing insights about the growth dynamics of an economy.
Abstract: In this paper, we consider three methods for filtering pertinent information from a series of complex networks modelling the correlations between stock price returns of the DAX 30 stocks for the time period 2001–2012, using the Thomson Reuters Datastream database and also the FNA platform to create the visualizations of the correlation-based networks. These methods reduce the complete 30×30 correlation coefficient matrix to a simpler network structure consisting only of the most relevant edges. The chosen network structures include the minimum spanning tree, the asset graph and the planar maximally filtered graph. The resulting networks and the extracted information are analysed and compared, looking at the clusters, cliques and connectivity. Finally, we consider some specific time periods: (a) a period of crisis (October–December 2008) and (b) a period of recovery (May–August 2010), where we discuss the possible underlying economic reasoning for some aspects of the network structures produced. Overall, we find that network-based representations of correlations within a broad market index are useful in providing insights about the growth dynamics of an economy.

71 citations
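
A minimal sketch of the core filtering step described above, assuming networkx and simulated placeholder returns rather than the DAX data: the correlation matrix is converted into the usual distance metric and only the minimum spanning tree is kept.

```python
# Sketch: filter a correlation matrix of stock returns down to its minimum
# spanning tree (MST), as in correlation-based network analysis.
# Assumes numpy and networkx; the returns below are simulated placeholders.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
tickers = [f"STOCK_{i:02d}" for i in range(30)]        # stand-in for the DAX 30
returns = rng.normal(size=(500, len(tickers)))         # daily returns (simulated)

corr = np.corrcoef(returns, rowvar=False)              # 30 x 30 correlation matrix
dist = np.sqrt(2.0 * (1.0 - corr))                     # common metric: d_ij = sqrt(2(1 - rho_ij))

G = nx.Graph()
for i in range(len(tickers)):
    for j in range(i + 1, len(tickers)):
        G.add_edge(tickers[i], tickers[j], weight=dist[i, j])

mst = nx.minimum_spanning_tree(G, weight="weight")     # keeps the 29 most informative links
print(sorted(mst.degree, key=lambda kv: -kv[1])[:5])   # most connected stocks in the tree
```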


Journal ArticleDOI
TL;DR: Through Monte Carlo simulations, it is shown that the PSTE is directly applicable to time series that are non-stationary in mean and variance, and that it is not affected by the presence of outliers or by VAR filtering.
Abstract: In this paper, a framework is developed for the identification of causal effects from non-stationary time series. Focusing on causality measures that make use of delay vectors from time series, the idea is to account for non-stationarity by considering the ranks of the components of the delay vectors rather than the components themselves. As an exemplary measure, we introduce the partial symbolic transfer entropy (PSTE), which is an extension of the bivariate symbolic transfer entropy quantifying only the direct causal effects among the variables of a multivariate system. Through Monte Carlo simulations it is shown that the PSTE is directly applicable to time series that are non-stationary in mean and variance, and that it is not affected by the presence of outliers or by VAR filtering. For stationary time series, the PSTE is also compared to the linear conditional Granger causality index (CGCI). Finally, the causal effects among three financial variables are investigated. Computations of the PSTE and the CGCI on both the initial returns and the VAR-filtered returns, and of the PSTE on the original non-stationary time series, show the consistency of the PSTE in estimating the causal effects.

61 citations
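
The following rough sketch illustrates the rank-based idea behind such measures under simplifying assumptions: delay vectors are mapped to ordinal (rank) patterns and a plug-in estimate of the bivariate symbolic transfer entropy is computed from symbol frequencies. The partial (conditional) variant and the paper's exact estimator are not reproduced.

```python
# Sketch: ordinal-pattern symbolization of delay vectors and a plug-in
# estimate of the bivariate symbolic transfer entropy T(Y -> X).
# Illustrative only; the partial (conditional) version is not shown.
from collections import Counter
import numpy as np

def symbolize(x, m=3, tau=1):
    """Map each delay vector of length m to the tuple of its component ranks."""
    n = len(x) - (m - 1) * tau
    return [tuple(np.argsort(x[t:t + m * tau:tau])) for t in range(n)]

def symbolic_te(x, y, m=3, tau=1):
    """Plug-in estimate of T(Y -> X) in bits from joint symbol frequencies."""
    sx, sy = symbolize(x, m, tau), symbolize(y, m, tau)
    n = min(len(sx), len(sy)) - 1
    triples = Counter((sx[t + 1], sx[t], sy[t]) for t in range(n))
    pairs_xx = Counter((sx[t + 1], sx[t]) for t in range(n))
    pairs_xy = Counter((sx[t], sy[t]) for t in range(n))
    singles = Counter(sx[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_xy[(x0, y0)]        # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]  # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_full / p_cond_x)
    return te

rng = np.random.default_rng(1)
y = rng.normal(size=2000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=2000)      # x is driven by lagged y
print("TE(Y->X):", symbolic_te(x, y), "TE(X->Y):", symbolic_te(y, x))
```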


Journal ArticleDOI
TL;DR: The proposed method circumvents the need for minimum-phase transfer functions and is able to localize causality in time and frequency suitably, finding that financial stress has been causing economic activity particularly during the unwinding financial and economic distress and not the other way around.
Abstract: This paper proposes a continuous wavelet transform causality method that dispenses with minimum-phase spectral density matrix factorization. Extant methods based on minimum-phase functions are computationally intensive, and those utilizing the discrete wavelet transform also fail to unfold causal effects over time and frequency. The proposed method circumvents the need for minimum-phase transfer functions and is able to localize causality in time and frequency suitably. We study the ability of the proposed method using simulated data and find that it performs excellently in identifying the causal islands. We then use the method to analyze the time–frequency causal effects in the relationship between US financial stress and economic activity and find that financial stress has been causing economic activity, particularly during the unwinding of financial and economic distress, and not the other way around.

48 citations


Journal ArticleDOI
TL;DR: It is shown that the Bayesian network model performs well against competing models (a logistic regression model and a neural network model) along several dimensions such as accuracy, sensitivity, precision and the receiver operating characteristic curve.
Abstract: This paper proposes a Bayesian network model to address censoring, class imbalance and real-time implementation issues in credit risk scoring. It shows that the Bayesian network model performs well against competing models (a logistic regression model and a neural network model) along several dimensions such as accuracy, sensitivity, precision and the receiver operating characteristic curve. The better performance of the Bayesian network model is particularly salient with class imbalance, higher dimensions and a rejection sample. Furthermore, the Bayesian network model can be scaled efficiently when implemented on a larger dataset, thus making it amenable to real-time implementation.

46 citations


Journal ArticleDOI
TL;DR: In this paper, a genetic algorithm (GA) was used to select an optimal combination of technical indicators, fundamental indicators and volatility indicators for improving out-of-sample trading performance.
Abstract: Recurrent reinforcement learning (RRL) has been found to be a successful machine learning technique for building financial trading systems. In this paper, we use a genetic algorithm (GA) to improve the trading results of an RRL-type equity trading system. The proposed trading system takes advantage of the GA's capability to select an optimal combination of technical indicators, fundamental indicators and volatility indicators for improving out-of-sample trading performance. In our experiment, we use the daily data of 180 S&P stocks (for the period January 2009 to April 2014) to examine the profitability and the stability of the proposed GA-RRL trading system. We find that, after feeding the indicators selected by the GA into the RRL trading system, the out-of-sample trading performance improves, as the number of companies with a significantly positive Sharpe ratio increases.

40 citations
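
As a toy illustration of the GA selection stage (not the authors' implementation), the sketch below evolves a binary mask over simulated candidate indicators and scores each mask by the out-of-sample Sharpe ratio of a naive sign rule standing in for the RRL trader.

```python
# Toy sketch of GA-based indicator selection: individuals are binary masks over
# candidate indicators, fitness is the out-of-sample Sharpe ratio of a naive
# strategy built from the selected indicators. The RRL trader is not implemented.
import numpy as np

rng = np.random.default_rng(2)
n_days, n_ind = 1000, 12
indicators = rng.normal(size=(n_days, n_ind))               # placeholder indicator panel
returns = 0.3 * indicators[:, 0] + 0.2 * indicators[:, 3] + rng.normal(size=n_days)

def fitness(mask, train=700):
    """Out-of-sample Sharpe ratio of a sign rule on the selected indicators."""
    if mask.sum() == 0:
        return -np.inf
    signal = np.sign(indicators[:, mask.astype(bool)].sum(axis=1))
    pnl = signal[train:] * returns[train:]
    return pnl.mean() / (pnl.std() + 1e-9)

pop = rng.integers(0, 2, size=(40, n_ind))                   # population of binary masks
for _ in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-20:]]                   # keep the fittest half
    children = parents[rng.integers(0, 20, size=40)].copy()   # resample parents
    mutate = rng.random(children.shape) < 0.05
    children[mutate] ^= 1                                     # bit-flip mutation
    pop = children

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected indicator columns:", np.flatnonzero(best))
```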


Journal ArticleDOI
TL;DR: In this paper, empirical research on the instability of complex interbank systems is reviewed and three network approaches are distinguished: descriptions of interbank exposure networks; simulation and modelling; and the development of new metrics to describe network topology and individual banks' relative importance.
Abstract: Financial institutions are highly interconnected. Consequently, they form complex systems which are inherently unstable. This paper reviews empirical research on the instability of complex interbank systems. Three network approaches are distinguished: descriptions of interbank exposure networks; simulation and modelling; and the development of new metrics to describe network topology and individual banks' relative importance. The paper concludes by inferring policy implications and priorities for future research.

26 citations


Journal ArticleDOI
TL;DR: The results show that the radial basis function neural network statistically outperforms all models’ individual performances and support vector regression is found to be the superior model of the forecasting competition.
Abstract: This study investigates the efficiency of radial basis function neural networks in forecasting US unemployment and explores the utility of the Kalman filter and support vector regression as forecast combination techniques. On one hand, an autoregressive moving average model, a smooth transition autoregressive model and three different neural network architectures, namely a multi-layer perceptron, a recurrent neural network and a psi sigma network, are used as benchmarks for our radial basis function neural network. On the other hand, our forecast combination methods are benchmarked with a simple average and a least absolute shrinkage and selection operator. The statistical performance of our models is estimated over the period 1972–2012, using the last 7 years for out-of-sample testing. The results show that the radial basis function neural network statistically outperforms all models' individual performances. The forecast combinations are successful, since both the Kalman filter and support vector regression techniques improve the statistical accuracy. Finally, support vector regression is found to be the superior model of the forecasting competition. The empirical evidence of this application is further validated by the use of the modified Diebold–Mariano test.

25 citations
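
A small sketch of the forecast-combination step only, assuming scikit-learn and simulated stand-in forecasts: the individual model forecasts are stacked as regressors for a support vector regression and benchmarked against a simple average.

```python
# Sketch of SVR-based forecast combination: stack individual model forecasts as
# regressors and let an SVR learn the combination. All inputs are simulated.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(3)
T = 160                                                     # e.g. quarterly observations
truth = np.cumsum(rng.normal(size=T)) * 0.1 + 6.0           # placeholder unemployment rate
forecasts = np.column_stack([truth + rng.normal(scale=s, size=T)
                             for s in (0.3, 0.4, 0.5, 0.6)])  # four competing models

train, test = slice(0, 132), slice(132, T)                  # last 28 periods out-of-sample

svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(forecasts[train], truth[train])
combo = svr.predict(forecasts[test])
simple_avg = forecasts[test].mean(axis=1)

print("SVR combination MSE:", mean_squared_error(truth[test], combo))
print("Simple average MSE :", mean_squared_error(truth[test], simple_avg))
```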


Journal ArticleDOI
TL;DR: In this paper, the authors examined the co-movement patterns of European business cycles during the period 1986-2011, with an obvious focal point the year 1999 that marked the introduction of the common currency, the euro.
Abstract: We examine the co-movement patterns of European business cycles during the period 1986–2011, with an obvious focal point being the year 1999 that marked the introduction of the common currency, the euro. The empirical analysis is performed within the context of Graph Theory, where we apply a rolling window approach in order to dynamically analyze the evolution of the network that corresponds to the GDP growth rate cross-correlations of 22 European economies. The main innovation of our study is that the analysis is performed by introducing what we call the threshold-minimum dominating set (T-MDS). We provide evidence at the network level and analyze its structure and evolution by the metrics of total network edges, network density, isolated nodes and the cardinality of the T-MDS set. Next, focusing on the country level, we analyze each individual country's neighborhood set (economies with similar growth patterns) in the pre- and post-euro era in order to assess the degree of convergence to the rest of the economies in the network. Our empirical results indicate that despite a few economies' idiosyncratic behavior, the business cycles of the European countries display an overall increased degree of synchronization, and thus convergence, in the single currency era.

25 citations
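
A compact sketch of the construction, under assumed details that may differ from the paper's exact T-MDS procedure: build a graph whose edges are growth-rate cross-correlations above a threshold, then approximate a minimum dominating set greedily with networkx.

```python
# Sketch: threshold network from GDP-growth cross-correlations plus a greedy
# approximation of a minimum dominating set (a stand-in for the T-MDS).
import numpy as np
import networkx as nx

rng = np.random.default_rng(4)
n_countries, T = 22, 60
growth = rng.normal(size=(T, n_countries))             # placeholder GDP growth rates
corr = np.corrcoef(growth, rowvar=False)

threshold = 0.4
G = nx.Graph()
G.add_nodes_from(range(n_countries))
for i in range(n_countries):
    for j in range(i + 1, n_countries):
        if corr[i, j] >= threshold:
            G.add_edge(i, j)

def greedy_dominating_set(graph):
    """Greedily pick the node covering the most uncovered nodes until all are covered."""
    uncovered, dom = set(graph.nodes), set()
    while uncovered:
        best = max(graph.nodes,
                   key=lambda v: len(({v} | set(graph[v])) & uncovered))
        dom.add(best)
        uncovered -= {best} | set(graph[best])
    return dom

tmds = greedy_dominating_set(G)
print("edges:", G.number_of_edges(), "density:", nx.density(G),
      "isolated:", len(list(nx.isolates(G))), "|T-MDS|:", len(tmds))
```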


Journal ArticleDOI
TL;DR: In this article, the authors consider a cost function based automated market maker aggregating the beliefs of risk-averse traders with finite budgets, and show that the resulting sequence of prices is convergent under general conditions.
Abstract: We consider the properties of a cost function based automated market maker aggregating the beliefs of risk-averse traders with finite budgets. Individuals can interact with the market maker an arbitrary number of times before the state of the world is revealed. We show that the resulting sequence of prices is convergent under general conditions, and explore the properties of the limiting price and trader portfolios. The limiting price cannot be expressed as a function of trader beliefs, since it is sensitive to the market maker's cost function as well as the order in which traders interact with the market. For a range of trader preferences, however, we show numerically that the limiting price provides a good approximation to a weighted average of beliefs, inclusive of the market designer's prior belief as reflected in the initial contract price. This average is computed by weighting trader beliefs by their respective budgets, and weighting the initial contract price by the market maker's worst-case loss, implicit in the cost function. Since cost function parameters are chosen by the market designer, this allows for an inference regarding the budget-weighted average of trader beliefs.

22 citations
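
As a concrete illustration of a cost-function market maker, the sketch below uses the standard logarithmic market scoring rule (LMSR), which is one textbook choice of cost function and not necessarily the paper's specification: prices are the softmax of outstanding quantities and the designer's worst-case loss is b*log(n).

```python
# Illustration with the standard LMSR cost function (a common cost-function
# market maker, not necessarily the paper's): prices are the softmax of
# outstanding share quantities; the worst-case loss is b * log(n_outcomes).
import numpy as np

def cost(q, b):
    return b * np.log(np.sum(np.exp(q / b)))

def prices(q, b):
    e = np.exp(q / b)
    return e / e.sum()

def buy(q, outcome, shares, b):
    """Price a purchase of `shares` of one outcome as a cost-function difference."""
    q_new = q.copy()
    q_new[outcome] += shares
    return q_new, cost(q_new, b) - cost(q, b)

b = 100.0                                  # liquidity parameter chosen by the designer
q = np.zeros(2)                            # binary market, no shares outstanding
print("initial prices:", prices(q, b))     # [0.5, 0.5] reflects the designer's prior
q, paid = buy(q, outcome=0, shares=50.0, b=b)
print("after a 50-share buy:", prices(q, b), "cost paid:", round(paid, 3))
print("worst-case loss bound:", b * np.log(2))
```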


Journal ArticleDOI
TL;DR: In this article, the authors developed an evolving fuzzy-GARCH modeling approach for stock market asset returns forecasting, which uses time-varying data streams to continuously and simultaneously adapt the structure and functionality of fuzzy models.
Abstract: Volatility modeling and forecasting play a key role in asset allocation, risk management, derivatives pricing and policy making. The purpose of this paper is to develop an evolving fuzzy-GARCH modeling approach for stock market asset returns forecasting. The method addresses GARCH volatility modeling within the framework of evolving fuzzy systems. This hybrid methodology aims to account for time-varying volatility, from the GARCH approach, as well as volatility clustering and nonlinear time series identification, from evolving fuzzy systems, which use time-varying data streams to continuously and simultaneously adapt the structure and functionality of fuzzy models. The motivation is to improve model performance as new data are input, through gradual model construction that induces model adaptation and refinement without catastrophic forgetting, while keeping the current model useful. An empirical application includes the forecasting of the S&P 500 and Ibovespa indexes by the evolving fuzzy-GARCH model against traditional GARCH-family models and a fuzzy GJR-GARCH methodology. The results indicate the high potential of the evolving fuzzy-GARCH model to forecast stock return volatility: it outperforms GARCH-type models and produces forecasts comparable with the fuzzy GJR-GARCH methodology.

21 citations
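
The evolving fuzzy-GARCH model itself is not reproduced here; as a sketch of the benchmark class it is compared against, the following fits a plain GARCH(1,1) by Gaussian quasi-maximum likelihood with scipy on simulated returns.

```python
# Minimal GARCH(1,1) benchmark fit by Gaussian quasi-maximum likelihood
# (the paper's evolving fuzzy-GARCH model is not reproduced here).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
T = 2000
omega_t, alpha_t, beta_t = 1e-5, 0.08, 0.90            # "true" parameters for simulation
r = np.empty(T)
s2 = omega_t / (1 - alpha_t - beta_t)
for t in range(T):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega_t + alpha_t * r[t] ** 2 + beta_t * s2

def neg_loglik(params, r):
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return 1e10                                    # crude positivity/stationarity penalty
    sigma2 = np.empty_like(r)
    sigma2[0] = r.var()
    for t in range(1, len(r)):
        sigma2[t] = omega + alpha * r[t - 1] ** 2 + beta * sigma2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * sigma2) + r ** 2 / sigma2)

res = minimize(neg_loglik, x0=np.array([5e-6, 0.05, 0.85]), args=(r,), method="Nelder-Mead")
omega, alpha, beta = res.x
print("fitted omega, alpha, beta:", omega, alpha, beta)
print("implied unconditional variance:", omega / (1 - alpha - beta))
```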


Journal ArticleDOI
TL;DR: In this article, the authors examine whether or not anomalies such as intraday or time of the day effects give rise to exploitable profit opportunities by replicating the actions of traders.
Abstract: One of the leading criticisms of the efficient market hypothesis is the presence of so-called "anomalies", i.e. empirical evidence of abnormal behaviour of asset prices which is inconsistent with market efficiency. However, most studies do not take into account transaction costs. Their existence implies that in fact traders might not be able to make abnormal profits. This paper examines whether or not anomalies such as intraday or time-of-the-day effects give rise to exploitable profit opportunities by replicating the actions of traders. Specifically, the analysis is based on a trading robot which simulates their behaviour and incorporates variable transaction costs (spreads). The results suggest that trading strategies aimed at exploiting daily patterns do not generate extra profits. Further, there are no significant differences between sub-periods (2005–2006, "normal"; 2007–2009, "crisis"; 2010–2011, "post-crisis").

Journal ArticleDOI
TL;DR: In this paper, the authors obtained a recursive formula for the price of discrete single barrier option based on the Black-Scholes framework in which drift, dividend yield and volatility assumed as deterministic functions of time.
Abstract: In this article, the researchers obtained a recursive formula for the price of a discrete single barrier option based on the Black–Scholes framework, in which the drift, dividend yield and volatility are assumed to be deterministic functions of time. With some general transformations, the partial differential equations (PDEs) corresponding to the option value problem, in each monitoring time interval, were converted into the well-known Black–Scholes PDE with constant coefficients. Finally, an innovative numerical approach was proposed to utilize the obtained recursive formula efficiently. It has a considerably low computational cost and could be competitive with the other methods introduced. In addition, one advantage of this method is that the Greeks of the contracts can also be calculated.
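
The recursive formula itself is not reproduced; the sketch below is the kind of Monte Carlo cross-check such pricers are typically validated against, for a discretely monitored up-and-out call under Black–Scholes with constant parameters (a simplification of the time-dependent setting).

```python
# Monte Carlo cross-check for a discretely monitored up-and-out call under
# Black-Scholes with constant parameters (the paper's recursive PDE-based
# formula is not reproduced here).
import numpy as np

rng = np.random.default_rng(6)
S0, K, B, r, q, sigma, T = 100.0, 100.0, 120.0, 0.03, 0.01, 0.2, 1.0
n_monitor, n_paths = 12, 200_000                     # monthly monitoring dates
dt = T / n_monitor

S = np.full(n_paths, S0)
alive = np.ones(n_paths, dtype=bool)
for _ in range(n_monitor):
    z = rng.standard_normal(n_paths)
    S *= np.exp((r - q - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z)
    alive &= S < B                                   # knocked out if the barrier is breached

payoff = np.where(alive, np.maximum(S - K, 0.0), 0.0)
price = np.exp(-r * T) * payoff.mean()
stderr = np.exp(-r * T) * payoff.std(ddof=1) / np.sqrt(n_paths)
print(f"up-and-out call ~ {price:.4f} +/- {1.96 * stderr:.4f}")
```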

Journal ArticleDOI
TL;DR: The participant-to-participant layer reveals the behaviour of banks settling their own obligations, which proved to be sensitive to the failure of Lehman Brothers; the participant-to-third-party payments layer presented stable properties; and the third-party-to-third-party layer resulted in an increasingly dense network, since the system has been adopted for the settlement of low-value obligations between accountholders.
Abstract: With the purpose of going further in the understanding of the payment flows among the participants in the large value payment system in Mexico, SPEI, we elaborate payment networks using historical data for a period of seven years. We conceptualize the SPEI large value payment system as a multiplex network and we study it accordingly. Based on transactions performed on a daily basis, we present three layers built on the following types of payments, i.e. transactions sent from participant to participant, from participant to third party and from third party to third party. We observe that those layers exhibit dissimilar topology: the participant-to-participant layer reveals the behaviour of banks settling their own obligations, which proved to be sensitive to the failure of Lehman Brothers; the participant-to-third-party payments layer presented stable properties; and the third-party-to-third-party layer resulted in an increasingly dense network, since the system has been adopted for the settlement of low-value obligations between accountholders. In order to identify relevant players in those layers, we compare some well-known centrality measures and also a novel centrality measure specifically designed for payment systems, SinkRank. The rankings assigned by SinkRank show a low degree of coincidence across layers.

Journal ArticleDOI
TL;DR: In this article, robust designs for an applied macroeconomic discrete-time LQ tracking model with perfect state measurements were explored for the United States for the period 1947-2012. And the results from both weighting schemes show that fiscal policy remains more aggressive under the robust designs than the deterministic model.
Abstract: This analysis explores robust designs for an applied macroeconomic discrete-time LQ tracking model with perfect state measurements. We develop a procedure that reframes the tracking problem as a regulator problem that is then used to simulate the deterministic, stochastic LQG, H-infinity, multiple-parameter minimax, and mixed stochastic/H-infinity control, for quarterly fiscal policy. We compare the results of the five different design structures within a closed-economy accelerator model using data for the United States for the period 1947–2012. When the consumption and investment tracking errors are more heavily emphasized, the H-infinity design renders the most aggressive fiscal policy, followed by the multiple-parameter minimax, mixed, LQG, and deterministic versions. When the control tracking errors are heavily weighted, the resulting fiscal policy is initially more aggressive under the multi-parameter specification than under the H-infinity and mixed designs. The results from both weighting schemes show that fiscal policy remains more aggressive under the robust designs than the deterministic model. The simulations show that the multi-parameter minimax and mixed designs provide a balancing compromise between the stochastic and robust methods when the worst-case concerns can be primarily limited to a subset of the state-space.
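
A bare-bones sketch of the deterministic building block, with illustrative matrices rather than the paper's accelerator model or its robust variants: a finite-horizon LQ problem in deviation (regulator) form solved by a backward Riccati recursion.

```python
# Bare-bones finite-horizon LQ regulator solved by backward Riccati recursion,
# applied to deviations from a target path (illustrative matrices only; the
# paper's accelerator model and robust designs are not reproduced).
import numpy as np

A = np.array([[1.00, 0.30],
              [0.10, 0.85]])            # state transition (deviation form)
B = np.array([[0.20],
              [0.05]])                  # fiscal instrument loading
Q = np.diag([1.0, 1.0])                 # weights on state tracking errors
R = np.array([[0.5]])                   # weight on control tracking error
T = 40

# Backward pass: value matrices P_t and feedback gains K_t.
P = Q.copy()
gains = []
for _ in range(T):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)
    gains.append(K)
gains = gains[::-1]

# Forward pass: simulate the closed loop from an initial deviation.
x = np.array([1.0, -0.5])
for t in range(T):
    u = -gains[t] @ x
    x = A @ x + B @ u
print("terminal deviation from target:", x)
```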

Journal ArticleDOI
TL;DR: In this article, the authors analyzed cascades of defaults in an interbank loan market and found that the ability of a defaulted institution to start a cascade depends on an interplay of shock size and connectivity.
Abstract: We analyze cascades of defaults in an interbank loan market. The novel feature of this study is that the network structure and the size distribution of banks are derived from empirical data. We find that the ability of a defaulted institution to start a cascade depends on an interplay of shock size and connectivity. Further results indicate that the interbank loan network is structurally less stable after the financial crisis than it was before. To evaluate the influence of the network structure on market stability, we compare simulated cascades from the empirical network with results from different network models. The results show that the empirical network has non-random features, which cannot be captured by randomized networks. The analysis also reveals that simulations that assume homogeneity for banks and loan size tend to overestimate the fragility of the interbank market.
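
A stripped-down version of the cascade mechanism, with random placeholder exposures instead of the empirical network: an initial default imposes losses on creditors equal to their exposures, and any bank whose capital is exhausted defaults in turn.

```python
# Stripped-down default cascade on an interbank exposure network: a bank's
# failure imposes losses on its creditors equal to their exposures, and any
# creditor whose capital is exhausted fails in turn. Exposures are placeholders.
import numpy as np

rng = np.random.default_rng(7)
n = 50
exposure = rng.pareto(2.0, size=(n, n)) * (rng.random((n, n)) < 0.1)  # exposure[i, j]: i's claim on j
np.fill_diagonal(exposure, 0.0)
capital = 0.3 * exposure.sum(axis=1) + 1.0                            # crude capital buffers

def cascade(initial_default, exposure, capital, recovery=0.0):
    capital = capital.copy()
    defaulted = {initial_default}
    frontier = {initial_default}
    while frontier:
        losses = (1 - recovery) * exposure[:, sorted(frontier)].sum(axis=1)
        capital -= losses
        frontier = {i for i in range(len(capital))
                    if capital[i] <= 0 and i not in defaulted}
        defaulted |= frontier
    return defaulted

sizes = [len(cascade(k, exposure, capital)) for k in range(n)]
print("largest cascade:", max(sizes), "banks; mean:", np.mean(sizes))
```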

Journal ArticleDOI
TL;DR: In this article, the radial basis function (RBF) interpolation was used to solve the partial integro-differential equation for American and European options on non-dividend-paying stocks in the Merton jump-diffusion model.
Abstract: The aim of this paper is to show that option prices in jump-diffusion models can be computed using meshless methods based on radial basis function (RBF) interpolation instead of traditional mesh-based methods like finite differences or finite elements. The RBF technique is demonstrated by solving the partial integro-differential equation for American and European options on non-dividend-paying stocks in the Merton jump-diffusion model, using the inverse multiquadric radial basis function. The method can in principle be extended to Lévy models. Moreover, an adaptive method is proposed to tackle the accuracy problem caused by a singularity in the initial condition so that the accuracy in option pricing, in particular for small time to maturity, can be improved.
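
The PIDE time-stepping for the Merton model is not reproduced; the sketch below shows only the underlying building block, inverse multiquadric RBF interpolation of a call payoff on scattered nodes, and assumes scipy >= 1.7 for RBFInterpolator.

```python
# Building-block illustration: inverse multiquadric RBF interpolation of a
# vanilla call payoff on scattered asset-price nodes (scipy >= 1.7 assumed;
# the paper's PIDE time-stepping for the Merton model is not reproduced).
import numpy as np
from scipy.interpolate import RBFInterpolator

K = 100.0
nodes = np.sort(np.random.default_rng(8).uniform(20.0, 200.0, size=60))
payoff = np.maximum(nodes - K, 0.0)

interp = RBFInterpolator(nodes[:, None], payoff,
                         kernel="inverse_multiquadric", epsilon=0.05)

grid = np.linspace(20.0, 200.0, 7)
approx = interp(grid[:, None])
exact = np.maximum(grid - K, 0.0)
for s, a, e in zip(grid, approx, exact):
    print(f"S={s:7.2f}  RBF={a:8.3f}  payoff={e:8.3f}")
```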

Journal ArticleDOI
TL;DR: A combination of wavelet and Postfix-GP, a postfix notation based genetic programming system, is proposed for financial time series prediction and the results are compared with those obtained using ECJ, a Java based evolutionary framework.
Abstract: Financial time series prediction is considered a challenging task. The task becomes difficult due to the inherent nonlinear and non-stationary characteristics of financial time series. This article proposes a combination of wavelets and Postfix-GP, a postfix notation based genetic programming system, for financial time series prediction. The discrete wavelet transform approach is used to smooth the time series by separating the fluctuations from the trend of the series. Postfix-GP is then employed to evolve models for the smoothed series. The out-of-sample prediction capability of the evolved solutions is tested on two stock price series and two stock index series. The results are compared with those obtained using ECJ, a Java based evolutionary framework. Nonparametric statistical tests are applied to evaluate the significance of the obtained results.

Journal ArticleDOI
Kazuhiko Kakamu
TL;DR: In this paper, the root mean square errors of the Gini coefficients and top income shares of the two distributions were investigated by means of Monte Carlo experiments, and it was shown that the fit of the distributions depends on the relationships and magnitudes of the parameters.
Abstract: The Dagum and Singh–Maddala distributions have been widely assumed as models for the income distribution in empirical analyses. The properties of these distributions are well known, and several estimation methods for these distributions from grouped data have been discussed widely. Moreover, previous studies argue that the Dagum distribution gives a better fit than the Singh–Maddala distribution in empirical analyses. This study explores the reason why the Dagum distribution is preferred to the Singh–Maddala distribution in terms of the Akaike information criterion through Monte Carlo experiments. In addition, the properties of the Gini coefficients and the top income shares from these distributions are examined by means of root mean square errors. From the experiments, we confirm that the fit of the distributions depends on the relationships and magnitudes of the parameters. Furthermore, we confirm that the root mean square errors of the Gini coefficients and top income shares depend on the relationships of the parameters when the data-generating process is a generalized beta distribution of the second kind.
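
A small sketch of the kind of Monte Carlo experiment described, with illustrative parameter values: incomes are drawn from the Dagum and Singh–Maddala distributions by inverse-CDF sampling and sample Gini coefficients are compared.

```python
# Sketch of the Monte Carlo idea: draw incomes from Dagum and Singh-Maddala
# distributions via inverse-CDF sampling and compare sample Gini coefficients.
# Parameter values are illustrative only.
import numpy as np

rng = np.random.default_rng(9)

def dagum_sample(a, b, p, size):
    u = rng.uniform(size=size)
    return b * (u ** (-1.0 / p) - 1.0) ** (-1.0 / a)         # inverse of F(x) = (1 + (x/b)^-a)^-p

def singh_maddala_sample(a, b, q, size):
    u = rng.uniform(size=size)
    return b * ((1.0 - u) ** (-1.0 / q) - 1.0) ** (1.0 / a)  # inverse of F(x) = 1 - (1 + (x/b)^a)^-q

def gini(x):
    x = np.sort(x)
    n = len(x)
    return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1.0) / n

n = 100_000
print("Dagum(3.5, 1, 0.8) sample Gini        :", round(gini(dagum_sample(3.5, 1.0, 0.8, n)), 4))
print("Singh-Maddala(2.5, 1, 1.5) sample Gini:", round(gini(singh_maddala_sample(2.5, 1.0, 1.5, n)), 4))
```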

Journal ArticleDOI
TL;DR: In this article, the solutions to several variants of the so-called dividend-distribution problem in a multi-dimensional, diffusion setting are studied, where the state variables are the current levels of cash reserves and of the stochastic short-rate, as well as time.
Abstract: In this paper the solutions to several variants of the so-called dividend-distribution problem in a multi-dimensional, diffusion setting are studied. In a nutshell, the manager of a firm must balance the retention of earnings (so as to ward off bankruptcy and earn interest) and the distribution of dividends (so as to please the shareholders). A dynamic-programming approach is used, where the state variables are the current levels of cash reserves and of the stochastic short-rate, as well as time. This results in a family of Hamilton–Jacobi–Bellman variational inequalities whose solutions must be approximated numerically. To do so, a finite element approximation and a time-marching scheme are employed.

Journal ArticleDOI
TL;DR: In this paper, the authors adopt differential evolution (DE) in the context of nonlinear stochastic optimal control problems, thus ensuring better convergence to a global optimum and explicitly considering parameter uncertainty by evaluating the expected objective function.
Abstract: Policy makers constantly face optimal control problems: what controls allow them to achieve certain targets in, e.g., GDP growth or inflation? Conventionally this is done by applying certain linear-quadratic optimization algorithms to dynamic econometric models. Several algorithms extend this baseline framework to nonlinear stochastic problems. However, those algorithms are limited in a variety of ways, including, most importantly, their restriction to local best solutions only and the symmetry of the objective function. The contribution of the current study is that we adopt differential evolution (DE) in the context of nonlinear stochastic optimal control problems, thus ensuring better convergence to a global optimum and explicitly considering parameter uncertainty by evaluating the expected objective function. The latter is done by minimizing the median over a set of multiple Monte Carlo draws of uncertain parameters and by separately evaluating the random parameter draws, looking particularly at extreme cases. Comparing DE with more traditional methods, which make use of linear-quadratic optimization, in two economic models, we find that the solutions obtained for expected and ex-post objective functions differ consistently, raising doubts about the optimality of ex-post solutions. This research aims to broaden the range of decision support information used by policy makers when choosing an optimal strategy under much more realistic conditions.
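
A compact sketch of the robust-DE idea in a deliberately simple setting (a scalar linear law of motion rather than the paper's econometric models): scipy's differential evolution chooses a constant control that minimizes the median of a quadratic tracking loss over Monte Carlo draws of an uncertain parameter.

```python
# Sketch: choose a constant control minimizing the MEDIAN of a quadratic
# tracking loss over Monte Carlo draws of an uncertain model parameter,
# using scipy's differential evolution (far simpler than the paper's models).
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(10)
a_draws = rng.normal(0.8, 0.1, size=200)       # uncertain persistence parameter
x0, target, T = 2.0, 0.0, 20

def loss_for_draw(u, a):
    x, total = x0, 0.0
    for _ in range(T):
        x = a * x + u                          # simple linear law of motion
        total += (x - target) ** 2 + 0.1 * u ** 2
    return total

def median_loss(u_vec):
    u = u_vec[0]
    return np.median([loss_for_draw(u, a) for a in a_draws])

result = differential_evolution(median_loss, bounds=[(-3.0, 3.0)], seed=1, tol=1e-8)
print("robust control:", result.x[0], "median loss:", result.fun)
```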

Journal ArticleDOI
TL;DR: A Markov Chain Monte Carlo Bayesian Inference approach is proposed, which estimates conditional probability distributions in structures obtained from a Tree-Augmented Naïve Bayes algorithm, which takes into account a large number of technical indices, accompanied with features that are extracted by a text mining methodology, from financial news articles and opinions posted in different social media platforms.
Abstract: Stock market analysis by using Information and Communication Technology methods is a dynamic and volatile domain. Over the past years, there has been an increasing focus on the development of modeling tools, especially when the expected outcomes appear to yield significant profits to the investors' portfolios. In alignment with the modern globalized economy, the available resources are becoming gradually more plentiful, and thus difficult to analyze with standard statistical tools. Thus far, there have been a number of research papers that rely solely on past data from stock and bond prices and other technical indicators. Nevertheless, in recent studies prediction is also based on textual information, on the logical assumption that the course of a stock price can also be affected by news articles and perhaps by public opinions, as posted on various Web 2.0 platforms. Despite the recent advances in Natural Language Processing and Data Mining, when data tend to grow both in number of records and attributes, numerous mining algorithms face significant difficulties, resulting in poor forecast ability. The aim of this study is to propose a potential answer to the problem, by considering a Markov Chain Monte Carlo Bayesian Inference approach, which estimates conditional probability distributions in structures obtained from a Tree-Augmented Naive Bayes algorithm. The novelty of this study is based on the fact that technical analysis contains the event and not the cause of the change, while textual data may interpret that cause. The paper takes into account a large number of technical indices, accompanied by features that are extracted by a text mining methodology from financial news articles and opinions posted in different social media platforms. Previous research has demonstrated that due to the high-dimensionality and sparseness of such data, the majority of widespread Data Mining algorithms suffer from either convergence or accuracy problems. Results acquired from the experimental phase, including a virtual trading experiment, are promising. Certainly, as it is tedious for a human investor to read all daily news concerning a company and other financial information, a prediction system that could analyze such textual resources and find relations with price movement at future time frames is valuable.

Journal ArticleDOI
TL;DR: In this paper, a multicolor contact system is applied to model a random stock price process for investigating the fluctuation dynamics of financial market, and a volatility duration analysis is introduced to detect the duration and intensity relationship of time series for both SSECI and the financial model, and empirical research is also presented to study the nonlinear behaviors of returns for the actual data and the simulation data.
Abstract: A financial agent-based time series model is developed and investigated using stochastic contact systems. The multicolor contact system, one of the systems of statistical physics, is applied to model a random stock price process for investigating the fluctuation dynamics of a financial market. The interaction and dispersal of different types of investment attitudes in a financial market are imitated by viruses spreading in a multicolor contact system, and we suppose that the investment attitudes of market participants contribute to the volatilities of financial time series. We introduce a volatility duration analysis to detect the duration and intensity relationship of time series for both the SSECI and the financial model. Furthermore, empirical research is also presented to study the nonlinear behaviors of returns for the actual data and the simulation data.

Journal ArticleDOI
TL;DR: In this paper, the authors study the financial stability implications of dependency on syndicate partners in the presence of shocks to banks' capital and show that such shocks can produce rare events in this market when banks have shared loan exposures while also relying on a common risk management tool such as value-at-risk.
Abstract: Loan syndication increases bank interconnectedness through co-lending relationships. We study the financial stability implications of such dependency on syndicate partners in the presence of shocks to banks' capital. Model simulations in a network setting show that such shocks can produce rare events in this market when banks have shared loan exposures while also relying on a common risk management tool such as value-at-risk (VaR). This is because a withdrawal of a bank from a syndicate can cause ripple effects through the market, as the loan arranger scrambles to commit more of its own funds by also pulling back from other syndicates or has to dissolve the syndicate it had arranged. However, simulations also show that the core-periphery structure observed in the empirical network may reduce the probability of such contagion. In addition, simulations with tighter VaR constraints show banks taking on less risk ex-ante.

Journal ArticleDOI
TL;DR: A variety of filters that are commonly employed by econometricians are analysed with a view to determining their effectiveness in extracting well-defined components of economic data sequences.
Abstract: A variety of filters that are commonly employed by econometricians are analysed with a view to determining their effectiveness in extracting well-defined components of economic data sequences. These components can be defined in terms of their spectral structures, i.e., their frequency content, and it is argued that the process of econometric signal extraction should be guided by a careful appraisal of the periodogram of the detrended data sequence. Whereas it is true that many annual and quarterly economic data sequences are amenable to relatively unsophisticated filtering techniques, it is often the case that monthly data that exhibit strong seasonal fluctuations require a far more delicate approach. In such cases, it may be appropriate to use filters that work directly in the frequency domain by selecting or modifying the spectral ordinates of a Fourier decomposition of data that have been subject to a preliminary detrending.
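
A brief sketch of the frequency-domain approach advocated here, on a simulated monthly series: inspect the periodogram of the detrended data, then extract a component by keeping only the Fourier ordinates in a chosen frequency band.

```python
# Sketch of frequency-domain extraction: inspect the periodogram of a detrended
# series, then keep only the Fourier ordinates in a chosen frequency band.
# The monthly series here is simulated.
import numpy as np
from scipy.signal import periodogram, detrend

rng = np.random.default_rng(11)
n = 240                                                   # 20 years of monthly data
t = np.arange(n)
series = (0.02 * t + np.sin(2 * np.pi * t / 60)           # trend + 5-year cycle
          + 0.5 * np.sin(2 * np.pi * t / 12)              # seasonal component
          + 0.3 * rng.normal(size=n))                     # noise

y = detrend(series)                                       # preliminary detrending
freqs, spec = periodogram(y)                              # appraise the spectral structure
print("dominant frequencies:", freqs[np.argsort(spec)[-3:]])

# Keep only the band associated with the business cycle (periods of 32-96 months).
F = np.fft.rfft(y)
f = np.fft.rfftfreq(n)
band = (f >= 1 / 96) & (f <= 1 / 32)
cycle = np.fft.irfft(np.where(band, F, 0.0), n)
print("extracted cycle variance:", cycle.var())
```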

Journal ArticleDOI
TL;DR: The main contribution of this paper is in utilizing this solution to implement the LU decomposition technique on the basic DEA models, which is more accurate and numerically stable; the number of computations in applying the Gaussian elimination method is fairly reduced.
Abstract: A fundamental problem that usually appears in linear systems is to find a vector x satisfying Bx = b. This linear system is encountered in many research applications and, more importantly, it is required to be solved in many contexts in applied mathematics. The LU decomposition method, based on Gaussian elimination, is particularly well suited for sparse and large-scale problems. Linear programming (LP) is a mathematical method to obtain optimal solutions for a linear system and has been increasingly considered in various fields of study in recent decades. The simplex algorithm is one of the most widely used mathematical techniques for solving LP problems. Data envelopment analysis (DEA) is a non-parametric approach based on linear programming to evaluate the relative efficiency of decision making units (DMUs). The number of LP models that has to be solved in DEA is at least the same as the number of DMUs. Toloo et al. (Comput Econ 45(2):323–326, 2015) proposed an initial basic feasible solution for DEA models which practically reduces at least 50 % of the whole computations. The main contribution of this paper is in utilizing this solution to implement the LU decomposition technique on the basic DEA models, which is more accurate and numerically stable. It is shown that the number of computations in applying the Gaussian elimination method will be fairly reduced due to the special structure of basic DEA models. Potential uses are illustrated with applications to a hospital data set.
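
A minimal illustration of the computational argument, without the DEA-specific basis structure: factor the matrix once with scipy's LU routines and reuse the factors across many right-hand sides instead of re-running Gaussian elimination.

```python
# Minimal illustration of the computational point: factor B once (LU with
# partial pivoting) and reuse the factors for many right-hand sides. The
# DEA-specific basis structure exploited in the paper is not reproduced.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(12)
m = 200
B = rng.normal(size=(m, m)) + m * np.eye(m)     # a well-conditioned stand-in basis matrix
rhs = rng.normal(size=(m, 50))                  # 50 right-hand sides (e.g. one per DMU)

lu, piv = lu_factor(B)                          # O(m^3) factorization, done once
X = lu_solve((lu, piv), rhs)                    # each solve is only O(m^2)

print("max residual:", np.max(np.abs(B @ X - rhs)))
```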

Journal ArticleDOI
TL;DR: In this article, the ability of various learning techniques with the conventional statistical method in predicting sovereign credit ratings was compared by employing classification and regression trees, multilayer perceptron, support vector machines (SVM), Bayes net, and naive Bayes.
Abstract: Sovereign credit ratings have been a controversial issue since the outbreak of the 2008 financial crisis. Among the debates, the inaccuracies stay at the centre. By employing classification and regression trees, multilayer perceptrons, support vector machines (SVM), Bayes nets and naive Bayes, we compare the ability of various learning techniques with the conventional statistical method in predicting sovereign credit ratings. Experimental results suggest that all the techniques excluding SVM achieve over 90 % prediction accuracy. According to the within-one-notch and within-two-notch accuracy measures, the prediction performance of SVM also rises above 90 %. These findings indicate a clear outperformance of AI methods over the conventional statistical method. The results have many implications for practitioners in the credit scoring industry. Amidst regulatory measures that encourage individual credit scoring for international financial institutions, these findings suggest that up-to-date AI methods serve as quite reliable technical tools to predict sovereign credit ratings.

Journal ArticleDOI
TL;DR: In this article, the authors demonstrate that the maximum entropy bootstrap (meboot) data generation process can provide accurate and narrow parameter confidence intervals in models with combinations of stationary and nonstationary variables, under both low and high degrees of autocorrelation.
Abstract: By undertaking a large-scale simulation study, we demonstrate that the maximum entropy bootstrap (meboot) data generation process can provide accurate and narrow parameter confidence intervals in models with combinations of stationary and nonstationary variables, under both low and high degrees of autocorrelation. The relatively small sample sizes in which meboot performs particularly well make it a useful tool for rolling window estimation. As a case study, we analyze the evolution of the price and income elasticities of import demand for crude oil in Turkey using quarterly data for 1996–2011. Our approach can be employed to tackle a wide range of macroeconometric estimation problems where small sample sizes are a common issue.

Journal ArticleDOI
TL;DR: The results, based upon a 2001–2010 sample of 31 provinces in mainland China, indicate that during this period, China could be classified into three categories according to the talent environment.
Abstract: We set out in this study to develop an intelligent computing method for the evaluation of the 'economic contribution rate of talent' (ECRT). We begin by constructing an indicator system for comprehensive evaluation of the talent environment and then go on to classify our (country or region) target system using our proposed GA-DE-FCM methodology. We subsequently identify total 'human capital' as comprising 'talent capital' and 'general labor', which, along with 'fixed assets', are used as the input variables of the economic system, whilst the corresponding gross domestic product is used as the output variable. The mapping between the inputs and the output is modeled in this study by a 'fuzzy artificial neural network' from which several fuzzy rules can be extracted. Having extracted these fuzzy rules, we subsequently go on to investigate the effect of each input factor (fixed assets, talent capital and general labor) on the level of economic growth within each category (obtained in Step 1), and then carry out an examination of the ECRT within each category, as well as that within the whole target system. The traditional methods of evaluating the ECRT are not regarded as satisfactory, given that the ECRT problem is non-linear and involves lags; however, we argue that, based upon intelligent computing, the model proposed here can effectively deal with these issues. The results, based upon a 2001–2010 sample of 31 provinces in mainland China, indicate that during this period China could be classified into three categories according to the talent environment. The first category (high-level talent environment) comprises just two regions, with an average ECRT of 44.61 per cent, whilst the second category (median-level talent environment) comprises five regions, with an average ECRT of 37.57 per cent, and the third category (low-level talent environment) comprises 24 regions, with an average ECRT of 14.8 per cent. The average ECRT for China as a whole is 25.67 per cent.

Journal ArticleDOI
TL;DR: The main objective of this research is to propose a new hybrid model called genetic algorithms–support vector regression (GA–SVR), in which support vector machines are used whose parameters are tuned by a GA.
Abstract: The main objective of this research is to propose a new hybrid model called genetic algorithms–support vector regression (GA–SVR). The proposed model consists of three stages. In the first stage, after lag selection, the most efficient features are selected using a stepwise regression algorithm (SRA). Afterward, these variables are used to develop the proposed model, in which support vector machines are used whose parameters are tuned by the GA. Finally, evaluation of the proposed model is carried out by applying it to the test data set.
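
A condensed sketch of the GA–SVR tuning stage under assumed details (the stepwise regression stage and the paper's exact GA operators are omitted): a small evolutionary loop searches over the SVR hyperparameters C, epsilon and gamma against validation error, using scikit-learn and simulated data.

```python
# Condensed sketch of the GA-SVR idea: a small evolutionary loop tunes the SVR
# hyperparameters (C, epsilon, gamma) against validation error. The
# stepwise-regression feature-selection stage is omitted; data are simulated.
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(13)
X = rng.normal(size=(400, 5))                               # placeholder lagged features
y = X @ np.array([0.5, -0.3, 0.2, 0.0, 0.1]) + 0.1 * rng.normal(size=400)
X_tr, y_tr, X_val, y_val = X[:300], y[:300], X[300:], y[300:]

def fitness(params):
    C, eps, gamma = params
    model = SVR(C=C, epsilon=eps, gamma=gamma).fit(X_tr, y_tr)
    return mean_squared_error(y_val, model.predict(X_val))

# Log-uniform initial population over (C, epsilon, gamma).
pop = 10 ** rng.uniform([-1, -3, -3], [3, 0, 1], size=(20, 3))
for _ in range(15):
    scores = np.array([fitness(p) for p in pop])
    parents = pop[np.argsort(scores)[:10]]                   # keep the best half
    children = parents[rng.integers(0, 10, 20)] * 10 ** rng.normal(0, 0.1, size=(20, 3))
    pop = children

best = pop[np.argmin([fitness(p) for p in pop])]
print("tuned (C, epsilon, gamma):", best)
```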

Journal ArticleDOI
TL;DR: In this article, a visual framework for displaying interregional trading patterns in multi-region, multi-activity economies is introduced, in which the analysis of inter-region, inter-activity feedback loops based on trade flows data and the associated trading patterns can be depicted using programmed functions in MATHEMATICA.
Abstract: This paper introduces a visual framework in the environment of a computer algebra system for displaying interregional trading patterns in multi-region, multi-activity economies. Interregional input–output tables illustrate spatial and economic interdependencies within interregional trade and are considered a key component in input–output modeling. The study and analysis of the interregional trade flows in interregional economic activities often reveal interregional and inter-activity linkages, referred to as "feedback loops" and/or spatial production cycles at the interregional level. Trade theory considering feedback loops is a relatively new approach to the detailed analysis of the vertical specialization of trade flows. This approach leads to the decomposition of global trade into feedback loops. Given the analysis of interregional inter-activity feedback loops based on the trade flows data, interregional input–output tables and associated trading patterns can be depicted using programmed functions in MATHEMATICA. Our programmed functions create static and dynamic images presenting the structure and the intensity of the feedback loops connecting the regions and the activities of an economy. The generated visual schemes succeed in picturing the multilateral trade connections between all regions. The programming codes, along with their application in examples from the relevant literature, are our methodological contribution to the visualization of trading tables.