
Persistence and Cycles in US Hours Worked

TL;DR: This paper analyses the relationship between hours worked and technology shocks using a cyclical long memory model based on Gegenbauer processes. Hours worked are found to increase on impact in response to a technology shock, with the effect dying away rapidly, consistently with Real Business Cycle (RBC) models.
Abstract: This paper analyses monthly hours worked in the US over the sample period 1939m1 - 2011m10 using a cyclical long memory model; this is based on Gegenbauer processes and characterised by autocorrelations decaying to zero cyclically and at a hyperbolic rate along with a spectral density that is unbounded at a non-zero frequency. The reason for choosing this specification is that the periodogram of the hours worked series has a peak at a frequency away from zero. The empirical results confirm that this model works extremely well for hours worked, and it is then employed to analyse their relationship with technology shocks. It is found that hours worked increase on impact in response to a technology shock (though the effect dies away rapidly), consistently with Real Business Cycle (RBC) models.

Summary

1. Introduction

  • This paper proposes a modelling approach for US hours worked, specifically average weekly hours in manufacturing.
  • Both types of studies use similar empirical (VAR) frameworks, the crucial difference between them being in the treatment of the hours worked variable.
  • The reason for choosing this specification is that the periodogram of the hours worked series is found not to exhibit a peak at the zero frequency but at a frequency away from zero.
  • Section 2 briefly describes the different types of long range dependence or long memory models used here, and Section 3 presents the data.
  • Section 4 discusses the empirical results and their implications for the debate on the relationship between hours worked and technology shocks, while Section 5 contains some concluding remarks.

2. A cyclical I(d) model

  • The I(0) class includes a wide range of model specifications such as white noise, the stationary autoregression (AR), moving average (MA), and stationary ARMA models.
  • For nonstationary series, specifications with stochastic trends have usually been adopted, under the assumption that the first differenced process is stationary I(0), so that valid statistical inference can be drawn after differencing once.
  • For an alternative definition (Type I) see Marinucci and Robinson (1999).
  • Most of the empirical literature has focused on the case when the singularity or pole in the spectrum occurs at the zero frequency (λ * = 0).

3. The dataset

  • The series examined here is the average number of hours worked per week by production workers in US manufacturing industries, monthly, over the sample period 1939m1 – 2011m10; the source is the Current Employment Statistics (CES) monthly survey of the US Bureau of Labor Statistics.
  • The authors analyse both seasonally adjusted and unadjusted data (HWSA11 and HWNSA11 respectively) for the whole sample period and also for a shorter sample ending in 2007m4 (HWSA07 and HWNSA07) in order to establish whether the 2007/8 crisis had an impact on hours worked.
  • The corresponding figure also shows the correlograms, which exhibit a clearly cyclical pattern.
  • The periodograms, also displayed in the same figure, have their highest peak at the seventh Fourier frequency (k = 7) rather than at the zero frequency, which suggests that the I(d) and I(1) specifications estimated by other authors are not appropriate, and also that cycles have a length of approximately T/7 = 124.85 months, i.e. around ten and a half years.
  • LRD also admits processes with multiple poles or singularities in the spectrum (k-factor Gegenbauer processes; see Giraitis and Leipus, 1995, and Woodward et al., 1998), but these are beyond the scope of the present study.
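The periodogram reasoning in the bullets above can be reproduced in a few lines. The sketch below is a hypothetical illustration (the series is simulated, not the CES hours data): it computes the periodogram at the Fourier frequencies λ_k = 2πk/T, locates the largest ordinate away from zero, and converts the winning index k into a cycle length T/k.

```python
import numpy as np

def periodogram_peak(x):
    """Locate the dominant frequency of a series via the periodogram.

    Returns (k, cycle_length): the index k of the largest periodogram
    ordinate at the Fourier frequencies lambda_k = 2*pi*k/T, k = 1..T/2,
    and the implied cycle length T/k in the series' time units.
    """
    x = np.asarray(x, dtype=float)
    T = x.size
    x = x - x.mean()  # demean, so the zero-frequency ordinate carries no mass
    I = (np.abs(np.fft.rfft(x)) ** 2) / (2 * np.pi * T)  # periodogram ordinates
    k = int(np.argmax(I[1:])) + 1  # skip k = 0 (the zero frequency)
    return k, T / k

# Synthetic monthly series with a cycle at the seventh Fourier frequency,
# mimicking the T = 874, k = 7 finding for hours worked (hypothetical data):
T = 874
t = np.arange(T)
rng = np.random.default_rng(0)
y = np.cos(2 * np.pi * 7 * t / T) + 0.5 * rng.standard_normal(T)
k, length = periodogram_peak(y)
print(k, round(length, 2))  # the peak lands at k = 7, i.e. ~124.86 months
```

Note that the peak index only identifies the cycle up to the Fourier grid spacing 2π/T; the paper estimates the pole frequency parametrically rather than reading it off the periodogram.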

4. Empirical results

  • As a first step the authors estimate the order of integration of the series using a standard I(d) model, i.e. assuming that the peak of the spectrum occurs at the long run or zero frequency.
  • The latter is a non-parametric approach to modelling the I(0) disturbances that approximates ARMA structures with a small number of parameters and has been widely employed in the context of fractional integration (see Gil-Alana, 2004).
  • The above evidence supports the cyclical I(d) specification for hours worked; Box-Pierce Q-statistics indicate that the models including AR(1) disturbances (see Table 3) are free of additional serial correlation.
  • The estimated value of c (not reported) is 38 in all cases, consistently with the periodograms displayed in Figure 4, whilst the estimated values of d are in all cases in the interval (0, 1) but smaller than for the seasonally unadjusted data (in Table 7), implying long memory and mean reverting behaviour.
  • Next the authors investigate which of the potential models for the disturbances is the most adequate for the two series examined.

5. Conclusions

  • This paper analyses monthly hours worked in the US over the sample period 1939m1 – 2011m10 using a cyclical I(d) model based on Gegenbauer processes, which are characterised by a spectral density function unbounded at a non-zero frequency.
  • For the seasonally unadjusted data, the estimated values of d range between 0.123 and 0.272, whilst for the seasonally adjusted ones the variability is much higher, the values ranging between 0.068 and 0.705.
  • This is in contrast to the models normally found in the literature (e.g., Gali, 1999; Christiano, Eichenbaum and Vigfusson, 2003; Gil-Alana and Moreno, 2009) that, although differing in the degree of integration assumed for hours worked, are all based on hours worked being a highly persistent series with a peak at the zero frequency in the spectrum.
  • When including productivity as a weakly exogenous variable further evidence is obtained supporting the Gegenbauer model, the order of integration again being in the interval (0, 0.5).
  • Moreover, hours worked are found to increase on impact in response to a technology shock (although the effect disappears after two years).


Department of
Economics and Finance
Working Paper No. 12-07
http://www.brunel.ac.uk/economics
Economics and Finance Working Paper Series
Guglielmo Maria Caporale and Luis A. Gil-Alana
Persistence and Cycles in US Hours
Worked
March 2012

PERSISTENCE AND CYCLES IN US HOURS WORKED
Guglielmo Maria Caporale
Brunel University, London, United Kingdom
and
Luis A. Gil-Alana*
University of Navarra, Pamplona, Spain
March 2012
Abstract
This paper analyses monthly hours worked in the US over the sample period 1939m1 –
2011m10 using a cyclical long memory model; this is based on Gegenbauer processes and
characterised by autocorrelations decaying to zero cyclically and at a hyperbolic rate along with
a spectral density that is unbounded at a non-zero frequency. The reason for choosing this
specification is that the periodogram of the hours worked series has a peak at a frequency away
from zero. The empirical results confirm that this model works extremely well for hours
worked, and it is then employed to analyse their relationship with technology shocks. It is
found that hours worked increase on impact in response to a technology shock (though the
effect dies away rapidly), consistently with Real Business Cycle (RBC) models.
Keywords: hours worked, fractional integration, cycles, technology shocks
JEL classification: C32, E24
Corresponding author: Professor Guglielmo Maria Caporale, Centre for Empirical Finance,
Brunel University, West London, UB8 3PH, UK. Tel.: +44 (0)1895 266713. Fax: +44 (0)1895
269770. Email: Guglielmo-Maria.Caporale@brunel.ac.uk
* We are grateful to Tommaso Proietti for kindly supplying the dataset. The second-named author gratefully
acknowledges financial support from the Ministry of Education of Spain (ECO2011-2014 ECON Y FINANZAS,
Spain) and from a Jeronimo de Ayanz project of the Government of Navarra.

1. Introduction
This paper proposes a modelling approach for US hours worked, specifically average weekly
hours in manufacturing. This is an important variable since it can be seen as an indicator of the
state of the economy. Authors such as Glosser and Golden (1997) argue that firms tend to
respond to business cycle conditions by decreasing or increasing hours worked, before hiring or
laying off workers.
Although the relationship between business cycles and hours worked and their response
to technology shocks has been extensively investigated, this is still a controversial issue. Gali
(1999), Francis and Ramey (2005) and Gali and Rabanal (2004 - GR) found that, contrary to
the implications of Real Business Cycle (RBC) models, hours worked decline in response to a
technology shock. These results were challenged, among others, by Christiano, Eichenbaum
and Vigfusson (2003 - CEV) who presented evidence that instead hours worked increase
following a technology shock.¹ Both types of studies use similar empirical (VAR) frameworks,
the crucial difference between them being in the treatment of the hours worked variable. In
particular, the former authors model it as a nonstationary I(1) variable whilst the latter assume
that it is a stationary I(0) process. More recently, Gil-Alana and Moreno (2009) allow the order
of integration of hours worked to be fractional, i.e. I(d), and find that the value of d depends on
the specific series examined, although in general it lies in the interval between 0 and 1. They
also find that per capita hours fall on impact in response to a technology shock.
All three approaches taken in the studies mentioned above implicitly assume a high
degree of persistence in hours worked that should result in a large peak in the periodogram (or
in any other estimate of the spectral density function) at the zero frequency. The model used in
the present study is instead based on Gegenbauer processes and is characterised by
autocorrelations decaying to zero cyclically and at a hyperbolic rate along with a spectral
density that is unbounded at a non-zero frequency. The reason for choosing this specification is
that the periodogram of the hours worked series is found not to exhibit a peak at the zero
frequency, as implied by the previous models, but instead at a frequency away from zero,
which can be captured by Gegenbauer processes as explained in the following section. Our
results confirm that this model works extremely well for hours worked, and it is then employed
to analyse their relationship with technology shocks, finding a positive (though rapidly dying
away) effect of such shocks, as suggested by Real Business Cycle (RBC) models.

¹ For further evidence, see Gambetti (2005) and Pesavento and Rossi (2005).
The outline of the paper is as follows. Section 2 briefly describes the different types of
long range dependence or long memory models used here. Section 3 presents the data. Section
4 discusses the empirical results and their implications for the debate on the relationship
between hours worked and technology shocks, while Section 5 contains some concluding
remarks.
2. A cyclical I(d) model
For the purposes of the present study, we define an I(0) process {x_t, t = 0, ±1, …} as a
covariance stationary process with spectral density function, f(λ), that is positive and finite at
any frequency. Alternatively, it can be defined in the time domain as a process such that the
infinite sum of the autocovariances is finite. This includes a wide range of model specifications
such as the white noise case, the stationary autoregression (AR), moving average (MA), and
stationary ARMA models.
In general, the I(0) condition is a pre-requisite for statistical inference in time series
analysis. However, a series might be nonstationary, i.e. the mean, the variance or the
autocovariances may change over time. For this case specifications with stochastic trends have
usually been adopted, under the assumption that the first differenced process is stationary I(0),
and thus valid statistical inference can be drawn after differencing once. More specifically, x_t
is said to be I(1) if:

    (1 - L) x_t = u_t,   t = 1, 2, …,                                        (1)

where L is the lag operator (L x_t = x_{t-1}) and u_t is I(0) as defined above. If u_t is
ARMA(p, q), then x_t is said to be an ARIMA(p, 1, q) process.

The above model has been extended in recent years to the fractional case, since the
differencing parameter required to render a series stationary I(0) is not necessarily an integer
(usually 1) but might also have a fractional value. In this context, x_t is said to be I(d) if:

    (1 - L)^d x_t = u_t,   t = 1, 2, …,                                      (2)

with x_t = 0 for t ≤ 0,² and u_t is again I(0). Note that the polynomial on the left-hand side of
equation (2) can be expanded, for all real d, as

    (1 - L)^d = Σ_{j=0}^{∞} (d choose j) (-1)^j L^j = 1 - d L + [d(d-1)/2] L² - … .

Thus, if d in (2) is an integer value, x_t will be a function of a finite number of past observations,
while, if d is not an integer, x_t depends upon values of the time series in the distant past, and the
higher the value of d is, the higher the level of dependence is between the observations.
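The expansion above translates directly into a recursion for the coefficients of (1 - L)^d: π_0 = 1 and π_j = π_{j-1}(j - 1 - d)/j. The sketch below (the function names are my own, not from the paper) generates these weights and applies the filter under the Type II convention x_t = 0 for t ≤ 0.

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients pi_j of (1 - L)^d = sum_j pi_j L^j, truncated at lag n.

    Uses the recursion pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j, which
    reproduces the expansion 1 - d*L + d(d-1)/2 * L^2 - ...
    """
    w = np.empty(n + 1)
    w[0] = 1.0
    for j in range(1, n + 1):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to x under the Type II convention (x_t = 0, t <= 0)."""
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x) - 1)
    # y_t = sum_{j=0}^{t} pi_j * x_{t-j}
    return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])

w = frac_diff_weights(0.3, 3)
print(w)  # coefficients 1, -0.3, -0.105, -0.0595, matching the expansion
```

With an integer d the weights truncate (d = 1 gives 1, -1, 0, 0, …, the ordinary first difference), whereas a fractional d yields infinitely many hyperbolically decaying weights, which is exactly the "dependence on the distant past" described above.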
If d > 0 in (2), x_t displays long range dependence (LRD) or long memory. There are two
definitions of LRD, one in the time domain and the other in the frequency domain. The former
states that, given a covariance stationary process {x_t, t = 0, ±1, …} with autocovariance
function E[(x_t - E x_t)(x_{t-j} - E x_t)] = γ_j, x_t displays LRD if

    lim_{T→∞} Σ_{j=-T}^{T} |γ_j|

is infinite. A frequency domain definition may be as follows. Suppose that x_t has an absolutely
continuous spectral distribution, and therefore a spectral density function, denoted by f(λ), and
defined as

    f(λ) = (1/2π) Σ_{j=-∞}^{∞} γ_j cos(λ j),   -π < λ ≤ π.

Then, x_t displays LRD if the spectral density function has a pole at some frequency λ* in the
interval [0, π], i.e.,

    f(λ) → ∞ as λ → λ*,   λ* ∈ [0, π].                                       (3)

² This condition is required for the Type II definition of fractional integration. For an alternative definition (Type
I) see Marinucci and Robinson (1999).
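For intuition about the cyclical long memory behaviour the paper builds on, a Gegenbauer process can be simulated by expanding the filter (1 - 2 cos(w0) L + L²)^(-d) into its moving-average weights, which satisfy the classical Gegenbauer-polynomial recursion. This is only an illustrative sketch with hypothetical parameter values chosen to mimic the paper's setting (d = 0.3, pole near the seventh Fourier frequency); it is not the authors' estimation procedure, which relies on Robinson's (1994) parametric tests.

```python
import numpy as np

def gegenbauer_weights(d, w0, n):
    """Coefficients C_j in (1 - 2*cos(w0)*L + L^2)^(-d) = sum_j C_j L^j,
    computed with the standard Gegenbauer-polynomial recursion."""
    x = np.cos(w0)
    c = np.zeros(n + 1)
    c[0] = 1.0
    if n >= 1:
        c[1] = 2.0 * d * x
    for j in range(2, n + 1):
        c[j] = (2.0 * x * (j + d - 1.0) * c[j - 1]
                - (j + 2.0 * d - 2.0) * c[j - 2]) / j
    return c

# Sanity check: at w0 = 0 the filter collapses to (1 - L)^(-2d), so with
# d = 0.5 it is (1 - L)^(-1) and every weight equals 1.
print(gegenbauer_weights(0.5, 0.0, 4))  # -> [1. 1. 1. 1. 1.]

# Simulate a cyclical long-memory series by filtering white noise, with the
# spectral pole placed near the k = 7 Fourier frequency (hypothetical values):
rng = np.random.default_rng(2)
T, d, w0 = 874, 0.3, 2 * np.pi * 7 / 874
u = rng.standard_normal(T)
c = gegenbauer_weights(d, w0, T - 1)
x = np.convolve(u, c)[:T]  # x_t = sum_{j<=t} C_j u_{t-j} (Type II truncation)
```

The resulting series has autocorrelations that oscillate with period roughly 2π/w0 while decaying hyperbolically, and a spectral density that diverges at w0 rather than at zero, which is the defining feature of the model described above.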

References
Canova, F. (1998), "Detrending and business cycle facts", Journal of Monetary Economics, 41, 475-512.
Gali, J. (1999), "Technology, employment, and the business cycle: do technology shocks explain aggregate fluctuations?", American Economic Review, 89, 249-271.
King, R.G. and S.T. Rebelo (1999), "Resuscitating real business cycles", in J.B. Taylor and M. Woodford (eds.), Handbook of Macroeconomics, Vol. 1, North-Holland, Amsterdam, 927-1007.
Robinson, P.M. (1994), "Efficient tests of nonstationary hypotheses", Journal of the American Statistical Association, 89, 1420-1437.