Journal Article

Spectral Analysis and Time Series

21 Mar 1991
TL;DR: The authors introduce stationary random processes and spectral analysis, treat estimation in both the time domain and the frequency domain, and present an analysis of processes with mixed spectra.
Abstract: Preface. Preface to Volume 2. Contents of Volume 2. List of Main Notation. Basic Concepts. Elements of Probability Theory. Stationary Random Processes. Spectral Analysis. Estimation in the Time Domain. Estimation in the Frequency Domain. Spectral Analysis in Practice. Analysis of Processes with Mixed Spectra.
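
As an illustrative aside (not taken from the book), the short NumPy sketch below estimates the spectrum of a simulated stationary AR(1) process with a raw periodogram and compares it with the theoretical spectral density; the coefficient `phi`, the sample size, and the unit noise variance are arbitrary choices.

```python
import numpy as np

# Simulate a stationary AR(1) process x[t] = phi * x[t-1] + e[t]
# (phi and n are arbitrary illustrative choices, not from the book)
rng = np.random.default_rng(0)
phi, n = 0.6, 1024
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Raw periodogram: squared magnitude of the DFT, scaled by 1/(2*pi*n)
x = x - x.mean()
freqs = np.fft.rfftfreq(n, d=1.0)                     # frequencies in cycles per sample
pgram = (np.abs(np.fft.rfft(x)) ** 2) / (2 * np.pi * n)

# Theoretical spectral density of an AR(1) with unit innovation variance, for comparison
omega = 2 * np.pi * freqs
true_spec = 1.0 / (2 * np.pi * (1 - 2 * phi * np.cos(omega) + phi ** 2))
```

In practice the raw periodogram is smoothed or averaged, which is the kind of issue the estimation chapters address.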
Citations
Journal Article
TL;DR: In this paper, the authors consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined, and derive a measure pD for the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest; adding pD to the posterior mean deviance gives a deviance information criterion that is related to other information criteria and has an approximate decision-theoretic justification.
Abstract: Summary. We consider the problem of comparing complex hierarchical models in which the number of parameters is not clearly defined. Using an information theoretic argument we derive a measure pD for the effective number of parameters in a model as the difference between the posterior mean of the deviance and the deviance at the posterior means of the parameters of interest. In general pD approximately corresponds to the trace of the product of Fisher's information and the posterior covariance, which in normal models is the trace of the ‘hat’ matrix projecting observations onto fitted values. Its properties in exponential families are explored. The posterior mean deviance is suggested as a Bayesian measure of fit or adequacy, and the contributions of individual observations to the fit and complexity can give rise to a diagnostic plot of deviance residuals against leverages. Adding pD to the posterior mean deviance gives a deviance information criterion for comparing models, which is related to other information criteria and has an approximate decision theoretic justification. The procedure is illustrated in some examples, and comparisons are drawn with alternative Bayesian and classical proposals. Throughout it is emphasized that the quantities required are trivial to compute in a Markov chain Monte Carlo analysis.
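
A minimal sketch of how pD and DIC can be computed from MCMC output, assuming a normal model with known variance and a hypothetical array of posterior draws `theta_samples`; none of the names or data come from the paper.

```python
import numpy as np

def deviance(theta, y, sigma=1.0):
    """Deviance -2*log-likelihood for an assumed normal model y ~ N(theta, sigma^2)."""
    return np.sum((y - theta) ** 2) / sigma ** 2 + len(y) * np.log(2 * np.pi * sigma ** 2)

def dic(theta_samples, y, sigma=1.0):
    """Compute pD and DIC from posterior draws of theta (hypothetical MCMC output)."""
    # Posterior mean of the deviance, Dbar
    d_bar = np.mean([deviance(t, y, sigma) for t in theta_samples])
    # Deviance at the posterior mean of the parameter, D(theta_bar)
    d_hat = deviance(np.mean(theta_samples), y, sigma)
    p_d = d_bar - d_hat          # effective number of parameters
    return p_d, d_bar + p_d      # DIC = Dbar + pD

# Illustrative use with invented data and invented posterior draws
y = np.array([1.2, 0.8, 1.1, 0.9])
theta_samples = np.random.default_rng(1).normal(1.0, 0.2, size=2000)
p_d, dic_value = dic(theta_samples, y)
```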

11,691 citations

Book Chapter
TL;DR: This paper provides a concise overview of time series analysis in the time and frequency domains, with extensive references for further reading.
Abstract: Any series of observations ordered along a single dimension, such as time, may be thought of as a time series. The emphasis in time series analysis is on studying the dependence among observations at different points in time. What distinguishes time series analysis from general multivariate analysis is precisely the temporal order imposed on the observations. Many economic variables, such as GNP and its components, price indices, sales, and stock returns are observed over time. In addition to being interested in the contemporaneous relationships among such variables, we are often concerned with relationships between their current and past values, that is, relationships over time.
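
As a toy illustration of that dependence over time (not from the chapter), the sketch below simulates a series whose current value depends on its previous value and computes sample autocorrelations at a few lags; the coefficient and the lags are arbitrary choices.

```python
import numpy as np

def sample_autocorrelation(x, lag):
    """Sample autocorrelation of series x at the given lag (lag >= 0)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x) if lag > 0 else 1.0

# Simulated example: each observation depends on the one before it
rng = np.random.default_rng(42)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.7 * x[t - 1] + rng.standard_normal()

acf = {lag: sample_autocorrelation(x, lag) for lag in (1, 2, 5, 10)}
```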

9,919 citations

Journal Article
06 Mar 2002 - JAMA
TL;DR: Fine particulate and sulfur oxide–related pollution were associated with all-cause, lung cancer, and cardiopulmonary mortality; long-term exposure to combustion-related fine particulate air pollution is an important environmental risk factor for cardiopulmonary and lung cancer mortality.
Abstract: Context: Associations have been found between day-to-day particulate air pollution and increased risk of various adverse health outcomes, including cardiopulmonary mortality. However, studies of health effects of long-term particulate air pollution have been less conclusive. Objective: To assess the relationship between long-term exposure to fine particulate air pollution and all-cause, lung cancer, and cardiopulmonary mortality. Design, Setting, and Participants: Vital status and cause of death data were collected by the American Cancer Society as part of the Cancer Prevention II study, an ongoing prospective mortality study, which enrolled approximately 1.2 million adults in 1982. Participants completed a questionnaire detailing individual risk factor data (age, sex, race, weight, height, smoking history, education, marital status, diet, alcohol consumption, and occupational exposures). The risk factor data for approximately 500 000 adults were linked with air pollution data for metropolitan areas throughout the United States and combined with vital status and cause of death data through December 31, 1998. Main Outcome Measure: All-cause, lung cancer, and cardiopulmonary mortality. Results: Fine particulate and sulfur oxide–related pollution were associated with all-cause, lung cancer, and cardiopulmonary mortality. Each 10-µg/m³ elevation in fine particulate air pollution was associated with approximately a 4%, 6%, and 8% increased risk of all-cause, cardiopulmonary, and lung cancer mortality, respectively. Measures of coarse particle fraction and total suspended particles were not consistently associated with mortality. Conclusion: Long-term exposure to combustion-related fine particulate air pollution is an important environmental risk factor for cardiopulmonary and lung cancer mortality.
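
The reported effects are per 10-µg/m³ increment of fine particulates; the hedged snippet below (not from the paper) shows how such relative risks are commonly rescaled to other exposure differences under an assumed log-linear exposure-response relationship.

```python
# Rescale a per-10-ug/m3 relative risk to another exposure difference,
# assuming a log-linear exposure-response (an illustrative assumption).
def scale_relative_risk(rr_per_10, delta_ugm3):
    return rr_per_10 ** (delta_ugm3 / 10.0)

# Approximate relative risks reported per 10 ug/m3 of fine particulate pollution
rr_all_cause, rr_cardiopulmonary, rr_lung_cancer = 1.04, 1.06, 1.08

# Example: implied relative risk for a 15 ug/m3 difference in long-term exposure
rr_lung_cancer_15 = scale_relative_risk(rr_lung_cancer, 15.0)   # roughly 1.12
```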

7,803 citations

Posted Content
TL;DR: The authors describe the advantages of these studies, suggest how they can be improved, and provide aids for judging the validity of the inferences they draw; design complications such as multiple treatment and comparison groups and multiple pre- or post-intervention observations are advocated.
Abstract: Using research designs patterned after randomized experiments, many recent economic studies examine outcome measures for treatment groups and comparison groups that are not randomly assigned. By using variation in explanatory variables generated by changes in state laws, government draft mechanisms, or other means, these studies obtain variation that is readily examined and is plausibly exogenous. This paper describes the advantages of these studies and suggests how they can be improved. It also provides aids in judging the validity of inferences they draw. Design complications such as multiple treatment and comparison groups and multiple pre- or post-intervention observations are advocated.
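
As a minimal sketch of the treatment-versus-comparison-group logic the paper discusses (not code from the paper), the snippet below computes a simple two-group, two-period difference-in-differences estimate from invented outcome data.

```python
import numpy as np

def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Simple two-group, two-period difference-in-differences estimate."""
    treated_change = np.mean(treat_post) - np.mean(treat_pre)
    control_change = np.mean(control_post) - np.mean(control_pre)
    return treated_change - control_change

# Invented outcomes for a treatment group and a non-randomly-assigned comparison group
treat_pre, treat_post = [2.0, 2.2, 1.9], [3.1, 3.4, 3.0]
control_pre, control_post = [2.1, 2.0, 2.3], [2.5, 2.4, 2.6]

did_estimate = diff_in_diff(treat_pre, treat_post, control_pre, control_post)
```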

7,222 citations


Cites result from "Spectral Analysis and Time Series"

  • ...Our approach is similar in spirit to that of Vuong (1989) in the sense that we propose methods for measuring and assessing the significance of divergences between models and data....

    [...]

Journal Article
TL;DR: The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF.
Abstract: A new Ensemble Empirical Mode Decomposition (EEMD) is presented. This new approach consists of sifting an ensemble of white noise-added signal (data) and treats the mean as the final true result. Finite, not infinitesimal, amplitude white noise is necessary to force the ensemble to exhaust all possible solutions in the sifting process, thus making the different-scale signals collate in the proper intrinsic mode functions (IMF) dictated by the dyadic filter banks. As EEMD is a time–space analysis method, the added white noise is averaged out with a sufficient number of trials; the only persistent part that survives the averaging process is the component of the signal (original data), which is then treated as the true and more physically meaningful answer. The effect of the added white noise is to provide a uniform reference frame in the time–frequency space; therefore, the added noise collates the portion of the signal of comparable scale in one IMF. With this ensemble mean, one can separate scales naturally...
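
A hedged sketch of the ensemble-averaging idea described above: add finite-amplitude white noise to the signal, decompose each noisy copy, and average the resulting IMFs across trials. The `emd` argument is a placeholder for whatever EMD routine is available (assumed here to return a fixed number of IMFs per call); this is not the authors' implementation.

```python
import numpy as np

def eemd(signal, emd, noise_std=0.2, n_trials=100, seed=0):
    """Ensemble EMD sketch: average IMFs over noise-added copies of the signal.

    `emd` is a placeholder callable returning an array of IMFs with shape
    (n_imfs, len(signal)); it is assumed to return the same number of IMFs each trial.
    """
    rng = np.random.default_rng(seed)
    imf_sum = None
    for _ in range(n_trials):
        # Add finite-amplitude white noise so that all scales are populated in the sifting
        noisy = signal + noise_std * np.std(signal) * rng.standard_normal(len(signal))
        imfs = emd(noisy)
        imf_sum = imfs if imf_sum is None else imf_sum + imfs
    # The added noise averages out over trials; the ensemble mean is taken as the result
    return imf_sum / n_trials
```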

6,437 citations
