Journal ArticleDOI

A consistent nonparametric test of ergodicity for time series with applications

01 Jun 2001, Journal of Econometrics (North-Holland), Vol. 102, Iss. 2, pp. 365-398
TL;DR: This article proposes a set of algorithms for testing the ergodicity of empirical time series without reliance on a specific parametric framework, and shows that the resulting test asymptotically attains the correct size for stationary and nonstationary processes, and maximal power against non-ergodic but stationary alternatives.
About: This article was published in the Journal of Econometrics on 2001-06-01. It has received 21 citations to date. The article focuses on the topics: Ergodicity & Autocorrelation.
Citations
Journal ArticleDOI
TL;DR: In this article, it is shown that the direct consequences of the classical ergodic theorems for psychology and psychometrics invalidate the conjectured generalizability from interindividual to intraindividual variation: only under very strict conditions, which are hardly obtained in real psychological processes, can a generalization be made from a structure of interindividual variation to the analogous structure of intraindividual variation.

1,344 citations

Journal ArticleDOI
TL;DR: In this article, the authors show how to consistently estimate ergodic models by simulated minimum distance techniques, both in a long-run equilibrium and during an adjustment phase, under a variety of conditions.

122 citations

Posted Content
TL;DR: This paper illustrates the use of the nonparametric Wald-Wolfowitz test to detect stationarity and ergodicity in agent-based models and shows that, with appropriate settings, the tests can detect non-stationarity and non-ergodicity.
Abstract: This paper illustrates the use of the nonparametric Wald-Wolfowitz test to detect stationarity and ergodicity in agent-based models. A nonparametric test is needed because, in many agent-based models, it is practically impossible to understand how the random component influences the emergent properties of the model. Nonparametric tests on real data often lack power, and this problem is addressed by applying the Wald-Wolfowitz test to the simulated data. The performance of the tests is evaluated using Monte Carlo simulations of a stochastic process with known properties. It is shown that with appropriate settings the tests can detect non-stationarity and non-ergodicity. Knowing whether a model is ergodic and stationary is essential in order to understand its behavior and the real system it is intended to represent; quantitative analysis of the artificial data helps to acquire such knowledge.

58 citations
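The Wald-Wolfowitz runs test applied in the paper above is simple to implement. A minimal sketch (hypothetical function and variable names, not the paper's code): dichotomize the series around its median, count runs of consecutive same-sign observations, and compare the count with its normal approximation.

```python
import math
import statistics

def runs_test(series):
    """Wald-Wolfowitz runs test z-statistic for the number of runs
    above/below the median.  A large |z| suggests the sequence is
    not random (e.g. trending or oscillating)."""
    med = statistics.median(series)
    # Dichotomize; observations equal to the median are dropped.
    signs = [x > med for x in series if x != med]
    n1 = sum(signs)            # observations above the median
    n2 = len(signs) - n1       # observations below the median
    # A run is a maximal block of identical signs.
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    return (runs - mean) / math.sqrt(var)

# A perfectly alternating series has the maximum number of runs,
# so the z-statistic is large and positive.
z = runs_test([1, 2, 1, 2, 1, 2, 1, 2])
```

In the paper's setting the input would be simulated model output rather than real data, which is what restores the power the test lacks on short empirical series.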

Journal ArticleDOI
TL;DR: In this article, the authors conducted a three-wave longitudinal study testing the temporal precedence between calling and engagement in learning activities, clarity of professional identity, and the presence of a supportive social environment.

32 citations

Journal ArticleDOI
TL;DR: Combining time series analysis and dynamic cluster analysis is a useful way to evaluate longitudinal patterns at both the individual level and subgroup level and represents a Typology of Temporal Patterns (TTP) approach.
Abstract: To improve complex behaviors such as adherence to medical recommendations, a better understanding of behavior change over time is needed. The focus of this study was adherence to treatment for obstructive sleep apnea (OSA). Adherence to the most common treatment for OSA is poor. This study involved a sample of 161 participants, each with approximately 180 nights of data. First, a time series analysis was performed for each individual. Time series parameters included the mean (average hours of use per night), level, slope, variance, and autocorrelation. Second, a dynamic cluster analysis was performed to find homogenous subgroups of individuals with similar adherence patterns. A four-cluster solution was found, and the subgroups were labeled: Great Users (17.2%; high mean and level, no slope), Good Users (32.8%; moderate mean and level, no slope), Low Users (22.7%; low mean and level, negative slope), and Slow Decliners (moderate mean and level, negative slope, high variance). Third, participants in the id...

31 citations
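The per-individual time-series parameters used in the study above (mean, slope, variance, lag-1 autocorrelation) are straightforward to compute before any clustering step. A minimal sketch with a hypothetical nightly-usage series (the study's own estimation details may differ):

```python
import statistics

def ts_features(y):
    """Summary features of one individual's series: mean, OLS slope
    against time, variance, and lag-1 autocorrelation."""
    n = len(y)
    mean_y = statistics.fmean(y)
    mean_t = (n - 1) / 2  # mean of time index 0..n-1
    # OLS slope of y on the time index.
    slope = (sum((t - mean_t) * (yt - mean_y) for t, yt in enumerate(y))
             / sum((t - mean_t) ** 2 for t in range(n)))
    var = statistics.pvariance(y)
    # Lag-1 autocorrelation: covariance of adjacent deviations
    # over total squared deviation.
    dev = [yt - mean_y for yt in y]
    r1 = sum(a * b for a, b in zip(dev, dev[1:])) / sum(d * d for d in dev)
    return {"mean": mean_y, "slope": slope, "variance": var, "autocorr": r1}

# Steadily increasing usage: positive slope, positive lag-1 autocorrelation.
feats = ts_features([1.0, 2.0, 3.0, 4.0])
```

These feature vectors, one per participant, are what a dynamic cluster analysis (e.g. k-means) would then group into the adherence subgroups described above.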

References
Journal ArticleDOI
TL;DR: In this article, the authors explore the dynamics of allocation under increasing returns in a context where increasing returns arise naturally, agents choosing between technologies competing for adoption, and examine how random historical events influence the selection of the outcome.
Abstract: This paper explores the dynamics of allocation under increasing returns in a context where increasing returns arise naturally: agents choosing between technologies competing for adoption. Modern, complex technologies often display increasing returns to adoption in that the more they are adopted, the more experience is gained with them, and the more they are improved. When two or more increasing-return technologies 'compete' then, for a 'market' of potential adopters, insignificant events may by chance give one of them an initial advantage in adoptions. This technology may then improve more than the others, so it may appeal to a wider proportion of potential adopters. It may therefore become further adopted and further improved. Thus a technology that by chance gains an early lead in adoption may eventually 'corner the market' of potential adopters, with the other technologies becoming locked out. Of course, under different 'insignificant events' - unexpected successes in the performance of prototypes, whims of early developers, political circumstances - a different technology might achieve sufficient adoption and improvement to come to dominate. Competitions between technologies may have multiple potential outcomes. It is well known that allocation problems with increasing returns tend to exhibit multiple equilibria, and so it is not surprising that multiple outcomes should appear here. Static analysis can typically locate these multiple equilibria, but usually it cannot tell us which one will be 'selected'. A dynamic approach might be able to say more. By allowing the possibility of 'random events' occurring during adoption, it might examine how these influence 'selection' of the outcome - how some sets of random 'historical events' might cumulate to drive the process towards one market-share outcome, others to drive it towards another.
It might also reveal how the two familiar increasing-returns properties of non-predictability and potential inefficiency come about: how increasing returns act to magnify chance events as adoptions take place, so that

5,583 citations
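The lock-in dynamic described above can be illustrated with a simple nonlinear urn scheme. This is a sketch of the general mechanism, not the paper's exact model: each adopter picks technology A with a probability that rises more than proportionally with A's current share, so an early chance lead tends to snowball into near-monopoly.

```python
import random

def simulate_adoption(steps, seed, gamma=2.0):
    """Sequential adoption of two technologies, A and B.  The chance
    the next adopter picks A rises nonlinearly (exponent gamma) with
    A's current adoption share, a stand-in for increasing returns."""
    rng = random.Random(seed)
    a, b = 1, 1  # one seed adoption each, so shares are well defined
    for _ in range(steps):
        share_a = a / (a + b)
        p_a = share_a ** gamma / (share_a ** gamma + (1 - share_a) ** gamma)
        if rng.random() < p_a:
            a += 1
        else:
            b += 1
    return a / (a + b)

# Different seeds play the role of different 'insignificant historical
# events'; most runs end with one technology cornering the market.
shares = [simulate_adoption(2000, seed) for seed in range(10)]
locked = sum(min(s, 1 - s) < 0.1 for s in shares)
```

With gamma = 1 this reduces to a standard Polya urn (shares converge but to a random interior limit); gamma > 1 produces the winner-take-all lock-in the abstract describes.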

Journal ArticleDOI
TL;DR: In this paper, efficient estimators of cointegrating vectors are presented for systems involving deterministic components and variables of differing, higher orders of integration. The estimators are computed using GLS or OLS, and Wald statistics constructed from these estimators have asymptotic χ2 distributions.
Abstract: Efficient estimators of cointegrating vectors are presented for systems involving deterministic components and variables of differing, higher orders of integration. The estimators are computed using GLS or OLS, and Wald statistics constructed from these estimators have asymptotic χ2 distributions. These and previously proposed estimators of cointegrating vectors are used to study long-run U.S. money (M1) demand. M1 demand is found to be stable over 1900-1989; the 95% confidence intervals for the income elasticity and interest rate semielasticity are (.88, 1.06) and (-.13, -.08), respectively. Estimates based on the postwar data alone, however, are unstable, with variances which indicate substantial sampling uncertainty.

4,088 citations

Journal ArticleDOI
TL;DR: In this paper, the authors present a unified approach to impulse response analysis which can be used for both linear and nonlinear multivariate models and demonstrate the use of these measures for a nonlinear bivariate model of US output and the unemployment rate.

3,821 citations

Journal ArticleDOI
TL;DR: In this paper, the authors propose a multiple regime alternative in which different economies obey different linear models when grouped according to initial conditions, and the marginal product of capital is shown to vary with the level of economic development.
Abstract: This paper provides some new evidence on the behaviour of cross-country growth rates. We reject the linear model commonly used to study cross-country growth behaviour in favour of a multiple regime alternative in which different economies obey different linear models when grouped according to initial conditions. Further, the marginal product of capital is shown to vary with the level of economic development. These results are consistent with growth models which exhibit multiple steady states. Our results call into question inferences that have been made in favour of the convergence hypothesis and further suggest that the explanatory power of the Solow growth model may be enhanced with a theory of aggregate production function differences.

1,417 citations

Journal ArticleDOI
TL;DR: In this article, the authors recommend a "solve-the-equation" plug-in bandwidth selector as being most reliable in terms of overall performance for kernel density estimation.
Abstract: There has been major progress in recent years in data-based bandwidth selection for kernel density estimation. Some “second generation” methods, including plug-in and smoothed bootstrap techniques, have been developed that are far superior to well-known “first generation” methods, such as rules of thumb, least squares cross-validation, and biased cross-validation. We recommend a “solve-the-equation” plug-in bandwidth selector as being most reliable in terms of overall performance. This article is intended to provide easy accessibility to the main ideas for nonexperts.

1,340 citations
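For context, the "first generation" rules of thumb this abstract contrasts with are one-liners. Silverman's normal-reference rule for a Gaussian kernel, for instance, is h = (4 / (3n))^(1/5) · σ̂. The recommended solve-the-equation plug-in selector is considerably more involved (it solves a fixed-point equation in h), so only the simple baseline is sketched here:

```python
import statistics

def silverman_bandwidth(data):
    """Normal-reference ('rule of thumb') bandwidth for kernel density
    estimation with a Gaussian kernel: h = (4 / (3 n)) ** (1 / 5) * sd.
    A 'first generation' selector; the paper recommends the more
    elaborate solve-the-equation plug-in method instead."""
    n = len(data)
    sd = statistics.stdev(data)  # sample standard deviation
    return (4 / (3 * n)) ** 0.2 * sd

h = silverman_bandwidth(list(range(1, 11)))
```

The rule is optimal when the true density is Gaussian; for multimodal or skewed densities it oversmooths, which is the failure mode that motivated the second-generation selectors surveyed in the article.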