
Showing papers in "Economics: The Open-Access, Open-Assessment E-Journal" in 2007


Journal ArticleDOI
TL;DR: In this article, the shadow economies of 145 countries, including developing, transition and highly developed OECD economies over 1999 to 2005, are estimated, and the various estimation methods are discussed and critically evaluated.
Abstract: Estimates of the shadow economies for 145 countries, including developing, transition and highly developed OECD economies over 1999 to 2005, are presented. The average size of the shadow economy (as a percent of "official" GDP) in 2004/05 is 36.7% in 96 developing countries, 38.8% in 25 transition countries and 14.8% in 21 OECD countries. An increased burden of taxation and social security contributions, combined with labour market regulation, is the driving force of the shadow economy. Furthermore, the results show that the shadow economy reduces corruption in high-income countries, but increases corruption in low-income countries. Finally, the various estimation methods are discussed and critically evaluated.

415 citations


Journal ArticleDOI
TL;DR: In this paper, the authors outline a method for translating the assumptions underlying a DSGE model into a set of testable assumptions on a cointegrated VAR model and illustrate the ideas with the RBC model of Ireland (2004).
Abstract: All economists say that they want to take their model to the data. But with incomplete and highly imperfect data, doing so is difficult and requires carefully matching the assumptions of the model with the statistical properties of the data. The cointegrated VAR (CVAR) offers a way of doing so. In this paper we outline a method for translating the assumptions underlying a DSGE model into a set of testable assumptions on a cointegrated VAR model and illustrate the ideas with the RBC model of Ireland (2004). Accounting for unit roots (near unit roots) in the model is shown to provide a powerful robustification of the statistical and economic inference about persistent and less persistent movements in the data. We propose that all basic assumptions underlying the theory model should be formulated as a set of testable hypotheses on the long-run structure of a CVAR model, a so-called ‘theory-consistent hypothetical scenario’. The advantage of such a scenario is that it forces us to formulate all testable implications of the basic hypotheses underlying a theory model. We demonstrate that most assumptions underlying the DSGE model and, hence, the RBC model are rejected when properly tested. Leaving the RBC model aside, we then report a structured CVAR analysis that summarizes the main features of the data in terms of long-run relations and common stochastic trends. We argue that structuring the data in this way offers a number of ‘sophisticated’ stylized facts that a theory model has to replicate in order to claim empirical relevance.
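The testable-hypotheses program described above boils down to checking whether candidate long-run relations between nonstationary series are themselves stationary. A minimal numpy-only sketch of that core idea, using an Engle-Granger-style residual test on simulated data rather than the paper's full CVAR machinery (all series and parameter values below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500

# Two I(1) series sharing one common stochastic trend, so that
# y - b*x is stationary: a cointegrating (long-run) relation.
trend = np.cumsum(rng.normal(size=T))
x = trend + rng.normal(size=T)
y = trend + rng.normal(size=T)

# Step 1: estimate the long-run relation y = a + b*x by OLS.
X = np.column_stack([np.ones(T), x])
a, b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - a - b * x

# Step 2: Dickey-Fuller regression on the residuals,
#   delta_u[t] = rho * u[t-1] + e[t].
# A strongly negative t-statistic on rho indicates the residuals
# are stationary, i.e. the series are cointegrated.
du = np.diff(resid)
u_lag = resid[:-1]
rho = (u_lag @ du) / (u_lag @ u_lag)
se = np.sqrt(np.mean((du - rho * u_lag) ** 2) / (u_lag @ u_lag))
t_stat = rho / se

print(f"long-run slope b = {b:.2f}, DF t-statistic = {t_stat:.2f}")
```

A full CVAR analysis would instead test restrictions on the cointegration space of a vector model, but the stationarity-of-residuals logic is the same starting point.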

105 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present tests of long run macroeconomic relations involving interest rates, equity prices and exchange rates suggested by arbitrage in financial and goods markets, using the global vector autoregressive (GVAR) model.
Abstract: This paper presents tests of long run macroeconomic relations involving interest rates, equity prices and exchange rates suggested by arbitrage in financial and goods markets. It uses the global vector autoregressive (GVAR) model to test for long run restrictions in each country/region conditioning on the rest of the world. Bootstrapping is used to compute both the empirical distribution of the impulse responses and the log-likelihood ratio statistic for over-identifying restrictions. The paper also examines the speed with which adjustments to the long run relations take place via the persistence profiles. It finds strong evidence in favour of a long run version of uncovered interest parity and, to a lesser extent, the Fisher equation across a number of countries, but the test results for the purchasing power parity relation are much weaker. Also, the transmission of shocks and subsequent adjustments in financial markets are much faster than those in goods markets.

41 citations


Journal ArticleDOI
TL;DR: This paper provided a survey of three families of flexible parametric probability density functions (the skewed generalized t, the exponential generalized beta of the second kind, and the inverse hyperbolic sine distributions) which can be used in modeling a wide variety of econometric problems.
Abstract: This paper provides a survey of three families of flexible parametric probability density functions (the skewed generalized t, the exponential generalized beta of the second kind, and the inverse hyperbolic sine distributions) which can be used in modeling a wide variety of econometric problems. A figure, which can facilitate model selection, summarizing the admissible combinations of skewness and kurtosis spanned by the three distributional families is included. Applications of these families to estimating regression models demonstrate that they may exhibit significant efficiency gains relative to conventional regression procedures, such as ordinary least squares estimation, when modeling nonnormal errors with skewness and/or leptokurtosis, without suffering large efficiency losses when errors are normally distributed. A second example illustrates the application of flexible parametric density functions as conditional distributions in a GARCH formulation of the distribution of returns on the S&P500. The skewed generalized t can be an important model for econometric analysis.
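The efficiency-gain claim above — maximum likelihood under a heavy-tailed density beats least squares when errors are leptokurtic, at little cost otherwise — can be illustrated without the SGT itself. A minimal sketch with a Laplace density standing in for the flexible families (the distributions and sample sizes here are illustrative assumptions, not the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 200, 2000

# Laplace (double-exponential) errors: heavy-tailed relative to the
# normal.  The sample mean is the OLS/Gaussian-ML estimator of location;
# the sample median is the ML estimator under the Laplace density.
errs = rng.laplace(size=(reps, n))
mean_est = errs.mean(axis=1)
median_est = np.median(errs, axis=1)

var_mean = mean_est.var()
var_median = median_est.var()
print(f"var(mean) = {var_mean:.4f}, var(median) = {var_median:.4f}")
# Asymptotically var(median)/var(mean) -> 1/2 under Laplace errors,
# mirroring the efficiency gains reported for flexible parametric
# densities over least squares under non-normal errors.
```

Under normal errors the ranking roughly reverses (the median's asymptotic variance is pi/2 times the mean's), which mirrors the abstract's point that the gains come without large losses under normality.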

41 citations


Journal ArticleDOI
TL;DR: It is shown that while an unconstrained VAR model does not imply any causal orders in the variables, a TSCM that contains some empirically testable causal orders implies a restricted SVAR model.
Abstract: Applying a probabilistic causal approach, we define a class of time series causal models (TSCM) based on stationary Bayesian networks. A TSCM can be seen as a structural VAR identified by the causal relations among the variables. We classify TSCMs into observationally equivalent classes by providing a necessary and sufficient condition for the observational equivalence. Applying an automated learning algorithm, we are able to consistently identify the data-generating causal structure up to the class of observational equivalence. In this way we can characterize the empirically testable causal orders among variables based on their observed time series data. It is shown that while an unconstrained VAR model does not imply any causal orders in the variables, a TSCM that contains some empirically testable causal orders implies a restricted SVAR model. We also discuss the relation between the probabilistic causal concept presented in TSCMs and the concept of Granger causality. It is demonstrated in an application example that this methodology can be used to construct structural equations with causal interpretations.

21 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide an exact characterization in terms of marginal revenues of when such a strategy is profitable, which, remarkably, does not depend on the distribution of customer valuations, but only on the value of the damaged product relative to the undamaged product.
Abstract: Companies with market power occasionally engage in intentional quality reduction of a portion of their output as a means of offering two qualities of goods for the purpose of price discrimination, even absent a cost saving. This paper provides an exact characterization in terms of marginal revenues of when such a strategy is profitable, which, remarkably, does not depend on the distribution of customer valuations, but only on the value of the damaged product relative to the undamaged product. In particular, when the damaged product provides a constant proportion of the value of the full product, selling a damaged good is unprofitable. One quality reduction produces higher profits than another if the former has higher marginal revenue than the latter.
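The constant-proportion result above can be checked by hand in a two-type example. A small sketch (the valuations v_H, v_L and the damage proportion lam are invented for illustration; the paper's characterization covers general valuation distributions):

```python
# Two customer types value the full product at v_H and v_L.  A "damaged"
# version is worth a constant proportion lam of the full value to every
# type (illustrative numbers, not from the paper).
v_H, v_L, lam = 10.0, 6.0, 0.5

# Best profit selling only the full product: either price at v_L and
# serve both types, or price at v_H and serve only the high type.
full_only = max(2 * v_L, v_H)

# Screening: damaged good priced at lam * v_L for the low type; the
# high type's incentive constraint caps the full-product price at
# v_H - lam * (v_H - v_L).
screening = lam * v_L + (v_H - lam * (v_H - v_L))

print(f"full-only profit = {full_only}, screening profit = {screening}")
```

Because the screening profit equals (1 - lam) * v_H + lam * (2 * v_L), a convex combination of the two full-only options, it can never exceed full_only for any lam in [0, 1] — the distribution-free flavour of the result that constant-proportion damage is unprofitable.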

17 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine asymmetry in the return and volatility spillover effects from the US equity market into the Canadian and Mexican equity markets, finding that negative shocks from the US market impact the conditional volatility of the Canadian and Mexican markets more deeply than positive shocks.
Abstract: In this paper we examine the issue of asymmetry in the return and volatility spillover effects from the US equity market into the Canadian and Mexican equity markets. We model the conditional volatility of the returns in each of the three markets using the asymmetric power model of Ding, Granger and Engle (1993). The empirical findings indicate that the US market has a significant impact on the returns in the Canadian and Mexican markets. However, the findings for Canada vary considerably from those for Mexico. In particular, the empirical results indicate that volatility spillover effects, but not return spillover effects, exhibit an asymmetric behavior, with negative shocks from the US equity market impacting on the conditional volatility of the Canadian and Mexican equity markets more deeply than positive shocks. Moreover, while the impact of positive shocks from the US equity market is not much different between the two markets, this is not the case with negative shocks, which affect the volatility of the Mexican market more intensely than the volatility of the Canadian market.
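The asymmetric power model of Ding, Granger and Engle (1993) referenced above makes this asymmetry explicit in the volatility recursion. A minimal simulation sketch (the parameter values are illustrative assumptions, not estimates from the paper's data):

```python
import numpy as np

rng = np.random.default_rng(2)
T = 1000

# Asymmetric power ARCH(1,1) of Ding, Granger and Engle (1993):
#   sigma[t]**delta = omega
#                     + alpha * (|eps[t-1]| - gamma * eps[t-1])**delta
#                     + beta * sigma[t-1]**delta
# gamma > 0 makes negative shocks raise volatility more than positive ones.
omega, alpha, gamma, beta, delta = 0.05, 0.08, 0.5, 0.85, 1.5

z = rng.normal(size=T)
sigma = np.empty(T)
eps = np.empty(T)
sigma[0] = 1.0
eps[0] = sigma[0] * z[0]
for t in range(1, T):
    s_pow = (omega
             + alpha * (abs(eps[t - 1]) - gamma * eps[t - 1]) ** delta
             + beta * sigma[t - 1] ** delta)
    sigma[t] = s_pow ** (1.0 / delta)
    eps[t] = sigma[t] * z[t]

# One-step volatility response to a negative vs positive unit shock:
shock = 1.0
neg = omega + alpha * (abs(-shock) + gamma * shock) ** delta
pos = omega + alpha * (abs(shock) - gamma * shock) ** delta
print(f"response to -1 shock: {neg:.3f}, to +1 shock: {pos:.3f}")
```

With gamma > 0, a negative shock enters the recursion with weight (1 + gamma)**delta against (1 - gamma)**delta for a positive shock of the same size, the leverage-style asymmetry the abstract reports.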

10 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed to extend the stability tests to dependent cointegrated panels through the stationary bootstrap and showed that the proposed panel tests improve considerably on asymptotic tests applied to individual series.
Abstract: Stability tests for cointegrating coefficients are known to have very low power with small to medium sample sizes. In this paper we propose to solve this problem by extending the tests to dependent cointegrated panels through the stationary bootstrap. Simulation evidence shows that the proposed panel tests improve considerably on asymptotic tests applied to individual series. As an empirical illustration we examine investment and saving for a panel of European countries over the 1960-2002 period. While the individual stability tests, contrary to expectations and graphical evidence, in almost all cases do not reject the null of stability, the bootstrap panel tests lead to the more plausible conclusion that the long-run relationship between these two variables is likely to have undergone a break.
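The stationary bootstrap the authors extend to panels resamples blocks of random (geometrically distributed) length, so that the pseudo-series remains stationary while preserving serial dependence. A minimal single-series sketch after Politis and Romano (1994) (the AR(1) data and the choice p = 0.1 are illustrative assumptions):

```python
import numpy as np

def stationary_bootstrap(x, p, rng):
    """One stationary-bootstrap resample of the series x: blocks start
    at uniform random points and end with probability p at each step,
    so block lengths are geometric with mean 1/p."""
    n = len(x)
    idx = np.empty(n, dtype=int)
    t = rng.integers(n)
    for i in range(n):
        idx[i] = t
        # with prob. p start a fresh block; otherwise continue (wrapping)
        t = rng.integers(n) if rng.random() < p else (t + 1) % n
    return x[idx]

rng = np.random.default_rng(3)

# An AR(1) series: the resampling should respect its autocorrelation.
e = rng.normal(size=500)
x = np.empty(500)
x[0] = e[0]
for t in range(1, 500):
    x[t] = 0.7 * x[t - 1] + e[t]

resample = stationary_bootstrap(x, p=0.1, rng=rng)
print(len(resample), resample[:3])
```

The panel tests in the paper apply this kind of resampling jointly across the cross-section, preserving dependence between countries as well as over time.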

7 citations


Journal ArticleDOI
TL;DR: The authors presented a simple model of Reder competition that depicts wages as driven by labor heterogeneity, rather than scarcity, which may give rise to a simultaneous increase in wage differentials and over-qualification.
Abstract: The expansion of higher education in the Western countries has been accompanied by a marked widening of wage differentials and increasing over-qualification. While the increase in wage differentials has been attributed to skill-biased technological change that made advanced skills scarce, this explanation does not fit well with the observed increase in over-qualification which suggests that advanced skills are in excess supply. By “Reder-competition” I refer to the simultaneous adjustment of wage offers and hiring standards in response to changing labor market conditions. I present a simple model of Reder competition that depicts wages as driven by labor heterogeneity, rather than scarcity. The mechanism may give rise to a simultaneous increase in wage differentials and over-qualification.

4 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that default-free intermediaries who issue credit lines to consumers can resolve the monetary problem and make it possible for the economy to reach a Pareto optimum.
Abstract: Using the monetary model developed in Sissoko (2007), where the general equilibrium assumption that every agent buys and sells simultaneously is relaxed, we observe that in this environment fiat money can implement a Pareto optimum only if taxes are type-specific. We then consider intermediated money by assuming that financial intermediaries whose liabilities circulate as money have an important identifying characteristic: they are widely viewed as default-free. The paper demonstrates that default-free intermediaries who issue credit lines to consumers can resolve the monetary problem and make it possible for the economy to reach a Pareto optimum. We argue that our idealized concept of financial intermediation is a starting point for studying the monetary use of credit.

2 citations


Journal ArticleDOI
TL;DR: In this paper, the authors construct a Blanchard-style overlapping generations model consisting of long-lived individuals who have uninsurable idiosyncratic risk resulting from uncertain retirement periods and medical costs in retirement.
Abstract: The authors construct a Blanchard-style overlapping generations model consisting of long-lived individuals who have uninsurable idiosyncratic risk resulting from uncertain retirement periods and medical costs in retirement. Without social insurance, such individuals must save for these eventualities. The authors examine the impact of pay-as-you-go social insurance policies (public pensions and Medicare coverage) on individual and aggregate consumption, saving, and wealth levels as well as wealth distribution. They also derive expressions for optimal (Pareto improving) social insurance policies.

Journal ArticleDOI
TL;DR: In this paper, the authors used differential tax analysis to show how the socially optimal fiscal-tax to liquidity-tax ratio changes with the relative size of the tax-evading hidden economy.
Abstract: Differential tax analysis is used to show how the socially optimal fiscal-tax to liquidity-tax ratio changes with the relative size of the tax-evading hidden economy. The smaller the relative size of the hidden economy, the larger the optimal fiscal-tax to liquidity-tax ratio. The empirical cross-section and panel evidence supports this theoretical result.

Journal ArticleDOI
TL;DR: In this paper, a two-country model is used to demonstrate that the open economy dimension can enhance the ability of sticky price models to account for the empirical evidence that a positive technology shock leads to a temporary decline in employment.
Abstract: A growing body of empirical evidence suggests that a positive technology shock leads to a temporary decline in employment. A two-country model is used to demonstrate that the open economy dimension can enhance the ability of sticky price models to account for the evidence. The reasoning is as follows. An improvement in technology appreciates the nominal exchange rate. Under producer-currency pricing, the exchange rate appreciation shifts global demand away from domestic goods toward foreign goods. This causes a temporary decline in domestic employment. If the expenditure-switching effect is sufficiently strong, a technology shock also has a negative effect on output in the short run.
