
Showing papers in "The Review of Economics and Statistics in 1990"


Journal ArticleDOI
TL;DR: In this article, a multivariate time series model with time varying conditional variances and covariances but with constant conditional correlations is proposed, which is readily interpreted as an extension of the seemingly unrelated regression (SUR) model allowing for heteroskedasticity.
Abstract: A multivariate time series model with time varying conditional variances and covariances but with constant conditional correlations is proposed. In a multivariate regression framework, the model is readily interpreted as an extension of the seemingly unrelated regression (SUR) model allowing for heteroskedasticity. Each of the conditional variances is parameterized as a univariate generalized autoregressive conditional heteroskedastic (GARCH) process. The descriptive validity of the model is illustrated for a set of 5 nominal European-US dollar exchange rates following the inception of the European Monetary System (EMS). EMS results are compared to estimates obtained for the same model using data over the pre-EMS period, July 1973 to March 1979. When compared to the pre-EMS free float period, the comovements between the currencies are found to be significantly higher over the later period.
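The constant-conditional-correlation idea can be sketched numerically (an illustrative reconstruction with assumed GARCH(1,1) parameters and simulated data, not the authors' estimation code): run a univariate GARCH variance recursion on each series, then estimate a single constant correlation from the standardized residuals.

```python
import numpy as np

def garch_variances(returns, omega, alpha, beta):
    """GARCH(1,1) conditional variance recursion:
       h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}."""
    h = np.empty_like(returns)
    h[0] = returns.var()  # initialize at the sample variance
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t - 1] ** 2 + beta * h[t - 1]
    return h

def ccc_covariances(r1, r2, params1, params2):
    """Constant-conditional-correlation model: time-varying variances,
       one constant correlation estimated from standardized residuals."""
    h1 = garch_variances(r1, *params1)
    h2 = garch_variances(r2, *params2)
    z1, z2 = r1 / np.sqrt(h1), r2 / np.sqrt(h2)
    rho = np.corrcoef(z1, z2)[0, 1]     # the constant conditional correlation
    cov_t = rho * np.sqrt(h1 * h2)      # implied time-varying covariance
    return h1, h2, rho, cov_t

# Simulated pair of "exchange rate" returns with true correlation 0.6
rng = np.random.default_rng(0)
n = 2000
e = rng.standard_normal((2, n))
r1 = e[0]
r2 = 0.6 * e[0] + 0.8 * e[1]
# GARCH parameters below are assumed, purely for illustration
h1, h2, rho, cov_t = ccc_covariances(r1, r2, (0.05, 0.1, 0.85), (0.05, 0.1, 0.85))
print(round(rho, 2))
```

Note how the covariance inherits all of its time variation from the two univariate variance processes, which is exactly what makes the model an easy SUR-style extension to estimate.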

3,662 citations


Journal ArticleDOI
TL;DR: The authors illustrate the danger of spurious regression from this kind of misspecification, using as an example a wage regression estimated on data for individual workers that includes in the specification aggregate regressors for characteristics of geographical states.
Abstract: Many economic researchers have attempted to measure the effect of aggregate market or public policy variables on micro units by merging aggregate data with micro observations by industry, occupation, or geographical location, then using multiple regression or similar statistical models to measure the effect of the aggregate variable on the micro units. The methods are usually based upon the assumption of independent disturbances, which is typically not appropriate for data from populations with grouped structure. Incorrectly using ordinary least squares can lead to standard errors that are seriously biased downward. This note illustrates the danger of spurious regression from this kind of misspecification, using as an example a wage regression estimated on data for individual workers that includes in the specification aggregate regressors for characteristics of geographical states. Copyright 1990 by MIT Press.
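The downward bias in conventional standard errors can be seen in a small simulation (my own sketch with made-up data, not the paper's wage regression): a regressor that varies only at the group level, combined with a shared group-level error component, makes iid standard errors far too small relative to cluster-robust ones.

```python
import numpy as np

rng = np.random.default_rng(1)
G, n_g = 50, 40                            # 50 groups ("states"), 40 workers each
group = np.repeat(np.arange(G), n_g)
x = rng.standard_normal(G)[group]          # aggregate regressor, constant within group
u_group = rng.standard_normal(G)[group]    # shared group-level error component
y = 1.0 + 0.5 * x + u_group + rng.standard_normal(G * n_g)

X = np.column_stack([np.ones_like(x), x])
XtX_inv = np.linalg.inv(X.T @ X)
beta = XtX_inv @ X.T @ y
resid = y - X @ beta

# Conventional standard errors that (incorrectly) assume independent errors
sigma2 = resid @ resid / (len(y) - 2)
se_iid = np.sqrt(sigma2 * np.diag(XtX_inv))

# Cluster-robust (grouped) sandwich standard errors
meat = np.zeros((2, 2))
for g in range(G):
    s = X[group == g].T @ resid[group == g]
    meat += np.outer(s, s)
se_cluster = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))

print(se_iid[1], se_cluster[1])   # the iid SE is badly biased downward
```

With equal group and idiosyncratic error variances and 40 members per group, the cluster-robust standard error on the aggregate regressor is several times the conventional one, which is the spurious-significance danger the note describes.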

2,859 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated small business longevity using a nationwide random sample of males who entered self-employment between 1976 and 1982 and found that highly educated entrepreneurs are most likely to create firms that remained in operation through 1986.
Abstract: Small business longevity is investigated utilizing a nationwide random sample of males who entered self-employment between 1976 and 1982. Highly educated entrepreneurs are most likely to create firms that remained in operation through 1986. Owner educational background, further, is a major determinant of the financial capital structure of small business startups. Financial capital endogeneity notwithstanding, firms with the larger financial investments at startup are consistently overrepresented in the survivor column. Firm leverage, finally, is trivial for delineating active from discontinued businesses. Reliance upon debt capital to finance business startup is clearly not associated with heightened risk of failure. Copyright 1990 by MIT Press.

1,033 citations


Journal ArticleDOI
TL;DR: In this paper, a nonparametric frontier approach is used to calculate the overall, technical, pure technical, allocative, and scale efficiencies for a sample of 322 independent banks, drawn from the Federal Deposit Insurance Corporation tapes on the Reports of Condition and Reports of Income (Call Reports) for the year 1986.
Abstract: A nonparametric frontier approach is used to calculate the overall, technical, pure technical, allocative, and scale efficiencies for a sample of 322 independent banks. The sample was drawn from the Federal Deposit Insurance Corporation tapes on the Reports of Condition and Reports of Income (Call Reports) for the year 1986. The results indicated a low level of overall efficiency. The main source of inefficiency was technical in nature, rather than allocative. Separate efficiency frontiers were constructed to test the effect of branching. However, the distributions of efficiency measures for branching and non-branching banks were not found to be different.
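The kind of nonparametric frontier score used here can be illustrated with a minimal input-oriented DEA linear program (a textbook constant-returns CCR formulation on toy data, not the authors' specification):

```python
import numpy as np
from scipy.optimize import linprog

def dea_input_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency score for unit j0.
    X: inputs (m x n), Y: outputs (s x n). Solve
      min theta  s.t.  X @ lam <= theta * x0,  Y @ lam >= y0,  lam >= 0,
    with decision vector [theta, lam_1, ..., lam_n]."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                  # minimize theta
    A1 = np.hstack([-X[:, [j0]], X])            # X lam - theta x0 <= 0
    A2 = np.hstack([np.zeros((s, 1)), -Y])      # -Y lam <= -y0
    A_ub = np.vstack([A1, A2])
    b_ub = np.concatenate([np.zeros(m), -Y[:, j0]])
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    return res.fun

# Toy data: 4 "banks", 2 inputs, 1 output (illustrative only)
X = np.array([[2.0, 4.0, 4.0, 8.0],
              [4.0, 2.0, 8.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
scores = [dea_input_efficiency(X, Y, j) for j in range(4)]
print([round(v, 2) for v in scores])
```

Units on the frontier score 1; here the last two units use exactly twice the inputs of a frontier unit for the same output, so their technical efficiency is 0.5.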

554 citations


Journal ArticleDOI
TL;DR: No abstract is available for this article; the record lists only funding acknowledgments (National Science Foundation grant SES-8412971 and the Center for Energy Policy Research of the M.I.T. Energy Laboratory).
Abstract: National Science Foundation, SES-8412971 and Center for Energy Policy Research of the M.I.T. Energy Laboratory

271 citations


Journal ArticleDOI
TL;DR: In this paper, the asymptotic distribution theory relevant for impulse responses and forecast error variance decompositions, which is distributed over various publications, is summarized and the missing links are provided in order to facilitate the computation of standard errors and test statistics.
Abstract: In recent years, vector autoregressive models have become standard tools for economic analyses. Impulse response functions and forecast error variance decompositions are usually computed from these models in order to investigate the interrelationships within the system. However, sometimes no measures of estimation uncertainty are provided by authors. One reason may be that the relevant asymptotic distribution theory is distributed over various publications. In this article, the available results are summarized and the missing links are provided in order to facilitate the computation of standard errors and test statistics. Copyright 1990 by MIT Press.
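The objects whose estimation uncertainty the article addresses can be sketched as follows (a simplified VAR(1) on simulated data; the asymptotic standard-error formulas the article collects are omitted here): estimate the VAR by OLS and compute impulse responses as powers of the coefficient matrix.

```python
import numpy as np

def fit_var1(Y):
    """OLS for a VAR(1): y_t = c + A y_{t-1} + u_t."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])
    B = np.linalg.lstsq(X, Y[1:], rcond=None)[0]
    return B[0], B[1:].T              # intercept, coefficient matrix A

def impulse_responses(A, horizons):
    """MA coefficients of a VAR(1): Phi_h = A^h, h = 0..horizons."""
    k = A.shape[0]
    Phi = [np.eye(k)]
    for _ in range(horizons):
        Phi.append(A @ Phi[-1])
    return np.array(Phi)

# Simulate a bivariate VAR(1) with a known coefficient matrix
rng = np.random.default_rng(2)
A_true = np.array([[0.5, 0.2],
                   [0.0, 0.4]])
Y = np.zeros((500, 2))
for t in range(1, 500):
    Y[t] = A_true @ Y[t - 1] + rng.standard_normal(2)

c, A_hat = fit_var1(Y)
Phi = impulse_responses(A_hat, horizons=10)
print(Phi.shape)    # responses at horizons 0..10 for each variable pair
```

Since each Phi_h is a nonlinear function of the estimated coefficients, its standard error requires the delta-method results the article assembles (or, in practice, a bootstrap) rather than the OLS standard errors alone.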

263 citations


Journal ArticleDOI
TL;DR: In this paper, a two-country micro-theoretic model consistent with the Ricardian equivalence hypothesis (REH) is developed and tested on U.S. data; under the REH, tax increases used to retire government debt will not affect private spending or the current account balance.
Abstract: The paper develops a two-country micro-theoretic model consistent with the Ricardian equivalence hypothesis (REH). Specifically, tax increases used to retire government debt will not affect private spending or the current account balance. However, increases in government spending, regardless of the means of finance, can be expected to induce a current account deficit. An unconstrained vector autoregression shows some patterns in the recent U.S. data which appear to be inconsistent with the REH. Rigorous testing of the model, however, does not allow us to reject the independence of the record federal government budget and current account deficits.

220 citations



Journal ArticleDOI
TL;DR: The results show that information increases the probability that a consumer uses medical care but that, conditional on use, the quantity of care consumed is not related to information; these findings contradict specific implications of models where physicians can create or induce demand for their own services.
Abstract: This paper is an empirical investigation of consumer health information. Using a new direct measure of information, the econometric approach treats both information and physician visits as endogenous variables when estimating the demand for medical care. The results show that information increases the probability that a consumer uses medical care, but that conditional on use the quantity of care consumed is not related to information. The results contradict specific implications of models where physicians can create or induce demand for their own services. Several results suggest that poorly informed consumers tend to underestimate the productivity of medical care in treating illness. Copyright 1990 by MIT Press.

197 citations


Journal ArticleDOI
TL;DR: In this article, the authors test if variations in the treatment of expenditures by state and local governments are an explanation for the inconsistent results of previous tax studies, and find evidence of structural linkages implicit in previous results for growth in state personal income.
Abstract: This paper tests if variations in the treatment of expenditures by state and local governments are an explanation for the inconsistent results of previous tax studies. Estimates for net investment and employment in manufacturing for 1962-82 support this conjecture, indicating that state and local taxes have a negative effect when the revenues are devoted to transfer-payment programs and that (with taxes held constant) increases in expenditures on health and education have a positive effect. These results, consistent with the "vicious circle" phenomenon, do not appear simply to reflect common cyclical movements, and provide evidence of structural linkages implicit in previous results for growth in state personal income. Copyright 1990 by MIT Press.

197 citations


Journal ArticleDOI
TL;DR: The authors showed that as the average level of schooling increases, educational inequality first increases, and, after reaching a peak, starts to decline in later phases of educational expansion, and the turning point occurs when average schooling is about seven years.
Abstract: Fairly recent data for about 100 countries indicate that as the average level of schooling increases, educational inequality first increases, and, after reaching a peak, starts declining in later phases of educational expansion. The turning point occurs when average schooling is about seven years. The observed empirical generalization, which seems quite robust, appears to have important implications for educational and distributional policies and for research on the linkage between education and income inequality.

Journal ArticleDOI
TL;DR: In this article, a research strategy is suggested that separates the issue of inference from the problems of prediction or of quantitative policy analysis of an empirical parametric model, and a new methodology that enables this separation is illustrated in a test for prima facie causality.
Abstract: A research strategy is suggested that separates the issue of inference from the problems of prediction or of quantitative policy analysis of an empirical parametric model and illustrates a new methodology that enables this in a test for prima facie causality. Unlike the conventional parametric test, the more powerful multiple rank F test is invariant to monotonic transformations of the variables and independent of the error distribution. Employing this test, the Wagnerian hypothesis, supported by conventional parametric analysis, is rejected and the conventional Keynesian theory is accepted. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: In this article, the authors used the band spectrum estimator to estimate income and price elasticities for bilateral world trade and found that the direction of trade is sensitive to changes in income and prices.
Abstract: This paper estimates income and price elasticities for bilateral world trade. In addition to testing the properties of the error terms, the dynamic specification, and the assumption of parameter constancy, the analysis presents the first application of the band spectrum estimator to bilateral trade flows for all countries. The paper finds that bilateral trade elasticities exhibit enough of a dispersion to suggest that the direction of trade is sensitive to changes in income and prices. Using the bilateral elasticities as raw data, the analysis obtains the associated multilateral estimates and finds that they are both consistent with the literature and suitable for addressing questions involving multilateral trade. But the evidence also reveals that sole reliance on multilateral elasticities conceals valuable information for both policy applications and empirical analyses of international trade. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: In this article, a trade-off between functional flexibility and functional properness has been identified for the case of the quadratic cost function, and the proposed procedure is illustrated with an application to the Bell System.
Abstract: When choosing a flexible functional form to model multioutput cost structures, one is quickly confronted with severe violations of certain regularity conditions over large regions of output space. This paper explicitly imposes regional properness on the parameter space of flexible functional forms. The apparent trade-off between functional flexibility and functional properness has been identified for the case of the quadratic cost function. Using the quadratic cost function, the proposed procedure is illustrated with an application to the Bell System. The results suggest that the telecommunication industry in the United States--prior to the Bell System break-up--was a natural monopoly. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: The authors investigated the connection between credit rationing and private intergenerational transfers and found that a substantial fraction of U.S. consumers are liquidity-constrained even if one allows for the possibility of private transfers.
Abstract: This paper investigates the connection between credit rationing and private intergenerational transfers. The research is motivated by the idea that private transfers may be a source of funds for consumers who have difficulty borrowing from financial intermediaries. This idea has important implications for consumer behavior, and economists have begun to think about it, but they have given it little empirical attention. Using the 1983 Survey of Consumer Finances, we find that private transfers do tend to be targeted toward consumers who face credit rationing. But we also find that a substantial fraction of U.S. consumers are liquidity-constrained even if one allows for the possibility of private transfers.

Journal ArticleDOI
TL;DR: In this article, a model of individual tax compliance behavior, including evasion and avoidance, is developed and estimated using individual-level data, and the model recognizes the importance of marginal income tax rates, payroll tax contributions and benefits.
Abstract: A model of individual tax compliance behavior, including evasion and avoidance, is developed and estimated. The model recognizes the importance of marginal income tax rates, payroll tax contributions and benefits, and the probability of detection and the penalty on unpaid taxes. Share equations for avoidance, evasion and reported income are estimated using individual-level data. The estimation results indicate that the tax base rises with higher benefits for payroll tax contributions and falls with higher marginal tax rates; the base also falls with more severe penalties and more certain detection of evasion as individuals substitute towards avoidance income.

The methods by which individuals reduce their tax liabilities take a variety of legal and illegal forms, all of which are influenced at least in part by incentives created by the tax structure. These methods can be broadly classified as avoidance and evasion. Tax avoidance is any legal activity that lowers taxes, such as worker substitution between wage and nonwage compensation. Tax evasion is the reduction in tax liabilities by illegal means, such as underreporting income on tax returns. Despite extensive, but separate, literatures on avoidance and evasion, we know very little about the tax base response to changes in tax structure. It is the purpose of this paper to examine the role that the tax structure plays in compliance.

There are several reasons for the persistence of the compliance puzzle. Most prominent is the absence of detailed individual data that would allow a full empirical specification of all factors affecting compliance. This difficulty is most evident when searching for individual data on the evasion-compliance decision. Even when available, data have never allowed in the same work the construction of both tax and audit variables. Clearly, data availability is even more problematic when looking at the choice among evasion, avoidance, and reported income. In addition, previous work has not looked at the avoidance-evasion-compliance decision as a joint process, even though these decisions are made simultaneously. Further, both strands of the literature have ignored another factor that may affect the compliance decision: the benefits that accrue from participation in payroll programs. If benefits are tied only to taxable income, then their presence gives individuals an incentive to pay taxes.

In short, there has been no empirical work that analyzes the effects of tax rates, probabilities, penalties, and payroll benefits on avoidance and evasion choices of individuals. In this paper we provide such an analysis. We first develop a theory of individual choice among the three types of compensation. We then estimate the resulting share demand equations using a unique data set, which has detailed information on the compensation paid to roughly one-quarter of the labor force in Jamaica in 1983. From these data we are able to derive measures of reported taxable income, evasion income, and avoidance income for individual workers in the formal sector. We are also able to construct measures of the marginal income tax rate, marginal payroll taxes and benefits, the probability of detection, and the penalty on evasion for individual workers. We are therefore able to estimate for the first time the responses of workers to the full range of tax structure parameters.

Section I presents the theoretical model of worker compensation choice and the empirical specification of the model. Section II discusses the Jamaican tax system. Data and variable construction are discussed in section III. Estimation results are presented in section IV. The final section summarizes the main results.

ReportDOI
TL;DR: In this paper, a maximum likelihood and an asymptotically efficient two-step generalized least squares estimator are proposed for grouped data, and the authors argue that for most applications in economics the assumption that errors are independent within groups is inappropriate.
Abstract: When estimating linear models using grouped data researchers typically weight each observation by the group size. Under the assumption that the regression errors for the underlying micro data have expected values of zero, are independent and are homoscedastic, this procedure produces best linear unbiased estimates. This note argues that for most applications in economics the assumption that errors are independent within groups is inappropriate. Since grouping is commonly done on the basis of common observed characteristics, it is inappropriate to assume that there are no unobserved characteristics in common. If group members have unobserved characteristics in common, individual errors will be correlated. If errors are correlated within groups and group sizes are large then heteroscedasticity may be relatively unimportant and weighting by group size may exacerbate heteroscedasticity rather than eliminate it. Two examples presented here suggest that this may be the effect of weighting in most non-experimental applications. In many situations unweighted ordinary least squares may be a preferred alternative. For those cases where it is not, a maximum likelihood and an asymptotically efficient two-step generalized least squares estimator are proposed. An extension of the two-step estimator for grouped binary data is also presented.
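The core point about weighting can be made with the efficient GLS weights themselves (a sketch under an assumed group-random-effects error structure, not the note's proposed estimator in full): with a common group component, the efficient weight for a group mean is 1/(sigma_u^2 + sigma_e^2/n_g), which flattens out as groups grow, so weighting by group size over-weights large groups.

```python
import numpy as np

def gls_weights(n_g, sigma_e2, sigma_u2):
    """Efficient weights for group means under
       y_bar_g = x_g'b + u_g + e_bar_g,
    where Var(error_g) = sigma_u2 + sigma_e2 / n_g; weight = 1 / Var."""
    return 1.0 / (sigma_u2 + sigma_e2 / n_g)

n_g = np.array([10, 100, 1000, 10000])

# With no group effect, the efficient weights ARE proportional to group size
print(gls_weights(n_g, sigma_e2=1.0, sigma_u2=0.0))

# With a group effect, the weights flatten out: weighting by n_g then
# over-weights big groups relative to the efficient scheme
print(gls_weights(n_g, sigma_e2=1.0, sigma_u2=0.5))
```

When the group variance component dominates (large n_g), the efficient weights are nearly equal, which is why unweighted OLS can be the preferred simple alternative in the cases the note describes.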

Journal ArticleDOI
TL;DR: This paper used three duration models to estimate the effects of disability benefits on the hazard of returning to work and on the expected duration of work absences and found that disincentives exist even when disability benefits are not conditioned on the recipient remaining out of work.
Abstract: The authors use three duration models to estimate the effects of disability benefits on the hazard of returning to work and on the expected duration of work absences. The results show that disincentives exist even when disability benefits are not conditioned on the recipient remaining out of work. In addition, blacks and women are found to be absent longer than white men. Durations of work absences are also influenced by available wages, the type and severity of injury, the physical demands of the jobs for which the worker is qualified, and the willingness of employers to help the worker return to work. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: This article used hedonic regression techniques to estimate the value of a statistical life from data on the market for automobiles and found that the estimated value for the sample as a whole is $3.357 million 1986 dollars.
Abstract: Using hedonic regression techniques, estimates of the willingness-to-pay for changes in the risks of dying can be inferred from actual behavior in market situations involving risk-dollar tradeoffs. Thaler and Rosen (1975) pioneered this approach, obtaining estimates of the value of a statistical life using labor market data. In this paper we use the hedonic technique to obtain the first estimates of the value of a statistical life from data on the market for automobiles. Our estimated value of a statistical life for the sample as a whole is $3.357 million 1986 dollars. Copyright 1990 by MIT Press.
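The hedonic calculation reduces to the derivative of price with respect to fatality risk. A stylized sketch with simulated data follows (the paper's $3.357 million figure comes from actual automobile market data, not from anything like this; the attributes and "true" value below are assumed):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000
weight = rng.uniform(2000, 4000, n)        # an illustrative car attribute
risk = rng.uniform(1e-4, 4e-4, n)          # annual fatality risk per car
# Simulated prices: buyers pay a premium for safer cars,
# consistent with an assumed value of a statistical life of $3 million
vsl_true = 3_000_000
price = 5000 + 2.0 * weight - vsl_true * risk + rng.normal(0, 500, n)

# Hedonic regression of price on attributes and risk
X = np.column_stack([np.ones(n), weight, risk])
beta = np.linalg.lstsq(X, price, rcond=None)[0]
vsl_hat = -beta[2]    # minus the price gradient with respect to risk
print(f"${vsl_hat:,.0f}")
```

The negative of the estimated risk coefficient is the marginal willingness to pay for a one-unit risk reduction, i.e. the value of a statistical life implied by the market data.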

Journal ArticleDOI
TL;DR: In this article, the authors provide an exploratory assessment of the theory of contestability that is based on a diverse sample of industries and introduces a new important determinant of market concentration.
Abstract: This paper provides an exploratory assessment of the theory of contestability that is based on a diverse sample of industries and introduces a new important determinant of market concentration. The results of the paper indicate that the variables determining the degree of contestability of markets are significant correlates of market concentration. Sunk costs account for a substantial portion of the sample variance of concentration even after technological factors are controlled for. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: McDonald et al. as mentioned in this paper applied a partially adaptive technique to estimate the parameters of William F. Sharpe's market model based on a generalized t-distribution and included as special cases least squares, least absolute deviation, and L^p, as well as some estimation procedures that have bounded and redescending influence functions.
Abstract: It is well known that least squares estimates can be very sensitive to departures from normality. Various robust estimators, such as least absolute deviations, L^p estimators or M-estimators provide possible alternatives to least squares when such departures occur. This paper applies a partially adaptive technique to estimate the parameters of William F. Sharpe's market model. This methodology is based on a generalized t-distribution and includes as special cases least squares, least absolute deviation, and L^p, as well as some estimation procedures that have bounded and redescending influence functions. Coauthors are James B. McDonald, Ray D. Nelson, and Steven B. White. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: In this article, the results of forecasting experiments based on an error correction mechanism (ECM) model and various types of vector autoregressive (VAR) and BVAR models are presented.
Abstract: The results of forecasting experiments based on an error correction mechanism (ECM) model and various types of vector autoregressive (VAR) and Bayesian vector autoregressive (BVAR) models are presented. A Bayesian error correction mechanism (BECM) model is also tested. This model represents a hybrid of the BVAR and ECM models. The results from experiments using fifty industries and monthly Ohio labor market data demonstrate that the ECM model produces forecasts with much lower errors than any of the alternative VAR or BVAR models when the variables used in the model pass the statistical tests for cointegration. The findings confirm many of the beliefs expressed by Granger (1986) and Engle and Yoo (1987) based on theoretical consideration of the ECM model versus the VAR model. A result contradictory to the contentions of Engle and Yoo is that the BECM model performs well at the longer forecast horizons for both cointegrated and non-cointegrated industries.
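An error correction mechanism of the kind compared here can be sketched with the Engle-Granger two-step procedure on simulated data (illustrative parameters, not the Ohio labor market specification): first a cointegrating regression in levels, then a regression of differences on the lagged error-correction term.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 600
x = np.cumsum(rng.standard_normal(T))        # an I(1) driving series
y = np.empty(T)
y[0] = x[0]
for t in range(1, T):
    # y error-corrects toward x with speed 0.3 (assumed, for illustration)
    y[t] = y[t - 1] + 0.3 * (x[t - 1] - y[t - 1]) + 0.5 * rng.standard_normal()

# Step 1: cointegrating regression in levels
slope, intercept = np.polyfit(x, y, 1)
ect = y - (intercept + slope * x)            # error-correction term

# Step 2: ECM regression  dy_t = a0 + a1 * ect_{t-1} + a2 * dx_{t-1}
dy, dx = np.diff(y), np.diff(x)
Z = np.column_stack([np.ones(T - 2), ect[1:-1], dx[:-1]])
coef = np.linalg.lstsq(Z, dy[1:], rcond=None)[0]
print(round(coef[1], 2))                     # adjustment speed, near -0.3
```

A significantly negative coefficient on the lagged error-correction term is what distinguishes an ECM from an unrestricted VAR in differences, and it is this restriction that the forecasting comparisons in the paper exploit.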

Journal ArticleDOI
TL;DR: The authors show that workers' willingness to pay for risk reduction (safety) in the workplace through diminished wages and market valuations of the "price" of these reductions are not equivalent: within the manufacturing sector, willingness to pay exceeds the price (cost) of risk reduction at current levels of risk exposure.
Abstract: The theory of compensating wage differentials, attributable to Adam Smith, suggests that jobs with disagreeable characteristics will command high wages, ceteris paribus. Most empirical tests of this theory with hedonic wage equations implicitly assume that workers' willingness to pay for risk reduction (safety) in the workplace through diminished wages and market valuations of the "price" of these reductions are equivalent. It is shown that this is not the case within the manufacturing sector where willingness to pay exceeds the price (cost) of risk reduction at current levels of risk exposure. Implications for implied value of life estimates are also examined. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: In this article, a model of neighborhood turnover drawn from Bond and Coulson (1989) is proposed and the type of turnover process that is obtained is shown to depend mainly on the hedonic bid functions for housing and neighborhood quality.
Abstract: A model of neighborhood turnover drawn from Bond and Coulson (1989) is proposed. The type of turnover process that is obtained is shown to depend mainly on the hedonic bid functions for housing and neighborhood quality. A demand system of four hedonic attributes is estimated. The main results are that the traditional model of filtering by age of unit does not occur and that filtering by housing size does. Tipping due to changes in median neighborhood income is also quite possible. Tipping through changes in racial composition appears less likely. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the impact of macroeconomic policies on the distribution of income in the Philippines and found that underemployment, inflation, and government spending worsen income distribution, while productivity gains, the real interest rate, and the real exchange rate were found to improve distribution.
Abstract: There has been a growing awareness of the income distribution dimension of macroeconomic policies. This paper studies this issue empirically, considering the case of the Philippines and using data available from integrated surveys of households. After estimating a reduced-form equation, it was found that underemployment, inflation, and government spending worsen income distribution, while productivity gains, the real interest rate, and the real exchange rate were found to improve distribution. A similar pattern emerges when the effects of these variables on the absolute incidence of poverty are estimated.

Posted Content
TL;DR: In this article, a double self-selection system was proposed to obtain the labor supply functions of moonlighters using a cross-section of 4,448 married couples from the Survey of Income and Program Participation, Wave 2.
Abstract: The model proposed here for obtaining the labor supply functions of moonlighters uses a double self-selection system to explore the husband's decision to moonlight together with his wife's decision to work. Subsequently, the labor functions are classified under two regimes depending on whether the wife works. The model is estimated based on a cross-section of 4,448 married couples from the Survey of Income and Program Participation, Wave 2. I find that the household production time of husbands and wives are substitutes and that specific human capital deters moonlighting. Copyright 1990 by MIT Press.


Journal ArticleDOI
TL;DR: In this paper, a sample selection bias that may occur even where initial assignment to the control and experimental groups is random was identified and adjustment for this bias was shown to influence the estimated effectiveness of programs aimed at increasing student learning of economics.
Abstract: When estimating regression models of educational achievement with pre- and posttest data, researchers have overlooked a sample selection bias that may occur even where initial assignment to the control and experimental groups is random. The bias arises because students who take the pretest but do not take the posttest are excluded from the regression analysis. Using data from a nationally normed test of high school student knowledge of economics, adjustment for this bias is shown to influence the estimated effectiveness of programs aimed at increasing student learning of economics. Copyright 1990 by MIT Press.

Journal ArticleDOI
TL;DR: In this paper, the authors used the bootstrap resampling method to construct confidence intervals for marginal costs, output-cost elasticities, economies of scale and scope, and Allen elasticities of substitution.
Abstract: A multi-product cost function system is estimated for 387 banks in states that allow branch banking. The bootstrap resampling method is used to construct confidence intervals for marginal costs, output-cost elasticities, economies of scale and scope, and Allen elasticities of substitution. Confidence intervals for these measures are usually constructed using a first-order variance approximation under a normality assumption, but such confidence intervals are inexact if the measures are not normally distributed or the variance approximations are imprecise. We find that the bootstrap standard error estimates can differ significantly from the usual estimates. Furthermore, we use the bootstrap to expand the analysis of cost function regularity properties. Copyright 1990 by MIT Press.
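The pairs bootstrap behind such confidence intervals can be sketched on a simple stand-in statistic (a ratio of OLS coefficients on simulated data, rather than an actual cost-function scale or scope measure): resample observations with replacement, recompute the statistic, and read off percentiles.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 300
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.standard_normal(n)
X = np.column_stack([np.ones(n), x1, x2])

def ratio_stat(X, y):
    """A nonlinear function of OLS coefficients (stand-in for a scale
    or scope measure): b1 / b2."""
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return b[1] / b[2]

# Pairs bootstrap: resample (X, y) rows jointly, recompute the statistic
B = 999
stats = np.empty(B)
for i in range(B):
    idx = rng.integers(0, n, n)
    stats[i] = ratio_stat(X[idx], y[idx])
lo, hi = np.percentile(stats, [2.5, 97.5])
print(round(lo, 1), round(hi, 1))   # percentile 95% interval (true ratio is 4)
```

For a ratio like this, the sampling distribution can be skewed, which is exactly the situation where the first-order normal approximation criticized in the paper is inexact and the bootstrap interval can differ noticeably.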

Journal ArticleDOI
TL;DR: In this article, the authors show that regulatory reform in trucking produced substantial cost savings that increased over time, from 1 percent in 1981 to 23 percent in 1984, with a cumulative effect amounting to a significant 16 percent productivity gain by 1984.
Abstract: This study confirms the higher productivity levels predicted by advocates of regulatory reform in trucking and shows that these gains have been substantial. Cost simulations suggest that, following a year of higher expenditures, efforts to remain competitive have yielded considerable cost savings that increase over time, from 1 percent in 1981 to 23 percent in 1984. The indirect effects of reform through the independent variables initially decrease costs, but later lead to higher costs. The cumulative effect has been a less than 1 percent increase in costs in 1980, becoming, by 1984, a significant 16 percent productivity gain. Copyright 1990 by MIT Press.