
Showing papers by "London School of Economics and Political Science published in 1984"


Journal ArticleDOI
TL;DR: A requisite decision model is defined as one whose form and content are sufficient to solve a particular problem; such a model is developed through an interactive and consultative process between problem owners and specialists (decision analysts).

470 citations


Journal ArticleDOI
TL;DR: In this paper, two related problems are considered: maximum likelihood estimation of the parameters of an ARIMA model when some of the observations are missing or subject to temporal aggregation, and estimation of the missing observations themselves. Both problems can be solved by setting up the model in state space form and applying the Kalman filter.
Abstract: Two related problems are considered. The first concerns the maximum likelihood estimation of the parameters in an ARIMA model when some of the observations are missing or subject to temporal aggregation. The second concerns the estimation of the missing observations. Both problems can be solved by setting up the model in state space form and applying the Kalman filter.
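The state space approach the authors describe is now standard in software. A minimal sketch of the idea (my illustration using the statsmodels library, not the authors' code): SARIMAX builds the Gaussian likelihood from the Kalman filter's prediction errors and treats NaN entries as missing observations.

import numpy as np
import statsmodels.api as sm

# simulate a series and delete a stretch of observations
rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))
y[40:45] = np.nan                      # NaNs are treated as missing

# maximum likelihood via the Kalman filter; the likelihood is built from
# the prediction errors of the observed points only
res = sm.tsa.SARIMAX(y, order=(1, 1, 1)).fit(disp=False)
print(res.params)

# in-sample predictions at the missing dates estimate the missing values
print(res.predict(start=40, end=44))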

336 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider the problem of minimizing the expected value of |X − Y|² by finding the joint distribution of the random variable (X, Y) with specified marginal distributions for X and Y, and give a sufficient condition for the minimizing joint distribution and supply numerical results for two special cases.
Abstract: We consider the problem of mapping X → Y, where X and Y have given distributions, so as to minimize the expected value of |X − Y|². This is equivalent to finding the joint distribution of the random variable (X, Y), with specified marginal distributions for X and Y, such that the expected value of |X − Y|² is minimized. We give a sufficient condition for the minimizing joint distribution and supply numerical results for two special cases.
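In one dimension, the joint distribution that minimizes E|X − Y|² with fixed marginals is the monotone "quantile" coupling Y = F_Y⁻¹(F_X(X)). The sketch below (my numerical illustration with arbitrary marginals, not one of the paper's special cases) checks this against an independent pairing:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=100_000)                 # X ~ N(0, 1)

# monotone coupling: push X through its CDF, then through the quantile
# function of the target marginal (here Exponential(1))
y = stats.expon.ppf(stats.norm.cdf(x))

print("E|X-Y|^2, monotone coupling:  ", np.mean((x - y) ** 2))
# shuffling breaks the coupling while preserving both marginals, and
# gives a larger expected squared distance
print("E|X-Y|^2, independent pairing:", np.mean((x - rng.permutation(y)) ** 2))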

264 citations


Journal ArticleDOI
TL;DR: This paper sets out to show the relationship between various procedures for univariate time series forecasting by adopting a framework in which a time series model is viewed in terms of trend, seasonal and irregular components.
Abstract: A large number of statistical forecasting procedures for univariate time series have been proposed in the literature. These range from simple methods, such as the exponentially weighted moving average, to more complex procedures such as Box–Jenkins ARIMA modelling and Harrison–Stevens Bayesian forecasting. This paper sets out to show the relationship between these various procedures by adopting a framework in which a time series model is viewed in terms of trend, seasonal and irregular components. The framework is then extended to cover models with explanatory variables. From the technical point of view the Kalman filter plays an important role in allowing an integrated treatment of these topics.
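The trend/seasonal/irregular framework of the paper corresponds to what are now called structural or unobserved-components models. A minimal sketch (my illustration via the statsmodels library, which implements the Kalman filter treatment described here, not the paper's own software):

import numpy as np
import statsmodels.api as sm

# ten years of monthly data: linear trend + annual cycle + noise
rng = np.random.default_rng(2)
t = np.arange(120)
y = 0.05 * t + 2.0 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=0.5, size=t.size)

# local linear trend plus monthly seasonal, estimated by the Kalman filter
model = sm.tsa.UnobservedComponents(y, level="local linear trend", seasonal=12)
res = model.fit(disp=False)
print(res.summary())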

251 citations


Book ChapterDOI
TL;DR: This chapter argues that, given the paucity of dynamic theory and the small sample sizes currently available for most time series of interest, as against the manifest complexity of the data processes, all sources of information have to be utilized.
Abstract: Dynamic specification denotes the problem of appropriately matching the lag reactions of a postulated theoretical model to the autocorrelation structure of the associated observed time-series data. As such, the issue is inseparable from that of stochastic specification if the finally chosen model is to have a purely random error process as its basic innovation. The subject matter has advanced rapidly and offers an opportunity for critically examining the main themes and integrating previously disparate developments. A statistical-theory based model considers the joint density of the observables and seeks to characterize the processes whereby the data were generated; the focus is thus on means of simplifying the analysis to allow valid inference from submodels. Given the paucity of dynamic theory and the small sample sizes currently available for most time series of interest, as against the manifest complexity of the data processes, all sources of information have to be utilized. Attempting to resolve the issue of dynamic specification first involves developing the relevant concepts, models, and methods (that is, the deductive aspect of statistical analysis) prior to formulating inference techniques. An alternative interpretation is that, by emphasizing the econometric aspect of time-series modelling, the analysis applies howsoever the model is obtained and seeks to be relatively neutral as to the economic theory content.

244 citations


Journal ArticleDOI
TL;DR: It was found that training resulted in more effective decision making only under the 'no time pressure' condition; under time pressure, training did not improve the quality of decision making at all, and the effectiveness of the decisions was significantly lower than under no time pressure.
Abstract: An experiment was carried out in order to evaluate the effects of time pressure and of training on the utilization of compensatory multi-attribute (MAU) decision processes. Sixty subjects made buying decisions with and without training in the process of compensatory MAU decision-making. This was repeated with and without time pressure. It was found that training resulted in more effective decision making only under the ‘no time pressure’ condition. Under time pressure the training did not improve the quality of decision making at all, and the effectiveness of the decisions was significantly lower than under no time pressure. It was concluded that specific training methods should be designed to help decision makers improve their decisions under time pressure.

198 citations


Journal ArticleDOI
TL;DR: This article examines political parties in post-junta Greece as a case of 'bureaucratic clientelism'. It appeared in West European Politics, Vol. 7 (The New Mediterranean Democracies: Regime Transition in Spain, Greece and Portugal), pp. 99-118.
Abstract: (1984). Political parties in post‐junta Greece: A case of ‘bureaucratic clientelism'? West European Politics: Vol. 7, The New Mediterranean Democracies: Regime Transition in Spain, Greece and Portugal, pp. 99-118.

165 citations


Journal ArticleDOI
TL;DR: This paper presents and estimates an adjustment cost model of industry employment which takes explicit account of both expectations and aggregation over different labour types; the resulting model is subjected to a large number of tests and is a highly robust representation of the data.
Abstract: In this paper we present and estimate an adjustment cost model of industry employment which takes explicit account of both expectations and aggregation over different labour types. The resulting model is subject to a large number of tests and is a highly robust representation of the data. Finally forecasts are produced for manufacturing employment up to 1990.

162 citations


Journal ArticleDOI
TL;DR: The authors examine the evidence from such models concerning the effects of unemployment benefits on incentives to work in Britain, and find no benefit effect when benefit receipt is assumed to follow a hypothetical pattern which is shown to be unrealistic and overgenerous.

Book
01 Jan 1984
TL;DR: In this book, the authors explore the field of law through which government and its agencies give practical effect to the law, and provide a theoretical framework for administrative law that allows the student to develop the broadest possible perspective.
Abstract: This definitive textbook explores the field of law through which government and its agencies give practical effect to the law. The subject, affected by policy and political factors, can challenge even the more advanced student. In response, this title looks at both the law and the factors informing it, laying down the foundations of the subject. This contextualised approach also allows the student to develop the broadest possible perspective. Case law and legislation are set out and discussed, and the authors have built in a range of case studies to give a practical emphasis to the study. It is, however, the distinctive theoretical framework for administrative law that the authors develop that distinguishes this title from others and allows for real understanding of the subject. This updated edition will cement the title's seminal status.

Journal ArticleDOI
TL;DR: In this paper, a new approach to factor analysis and related latent variable methods is proposed which is based on data reduction using the idea of Bayesian sufficiency, and considerations of symmetry, invariance and independence are used to determine an appropriate family of models.
Abstract: SUMMARY A new approach to factor analysis and related latent variable methods is proposed which is based on data reduction using the idea of Bayesian sufficiency. Considerations of symmetry, invariance and independence are used to determine an appropriate family of models. The results are expressed in terms of linear functions of the manifest variables after the manner of principal components analysis. The approach justifies some of the practices based on the normal theory factor model and lays a foundation for the treatment of nonnormal, including categorical, variables.

1. BACKGROUND Factor analysis is a widely used statistical technique but its theoretical foundations are somewhat obscure and subject to dispute. It is one of a family of multivariate methods which also includes latent structure and latent trait analysis. The common feature of the models underlying these methods is that the observed random variables are assumed to depend on latent, that is unobserved, random variables. There is sometimes debate about whether these latent variables are 'real' in any sense but they can be viewed simply as constructs designed to simplify and summarize the complex web of interrelated variables with which nature confronts us. By expressing these relationships in terms of a small number of latent variables, or factors, the models effect a reduction in dimensionality which aids comprehension. An early theoretical account of the subject is by Anderson & Rubin (1956) and more recent and comprehensive treatments are provided by Lawley & Maxwell (1971) and Harman (1968).

The present paper is an attempt to provide a coherent framework within which existing methodology can be evaluated and a base from which new methods can be developed. Our approach is to start from a very general statement of the problem in terms of the distributions of the random variables involved and then to invoke ideas of symmetry, invariance and conditional independence to determine the class of models that it is reasonable to consider. This not only exhibits the unity of the various models already in existence but resolves many of the ambiguities and obscurities with which the subject has been bedevilled. An early approach on these lines is given by Anderson (1959) and was recognized by Birnbaum in his contribution to Lord & Novick (1968). The same line was followed by Bartholomew (1980). In spite of this the practical implications do not seem to have been made fully explicit or generally recognized.
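Under the normal-theory model, the posterior factor scores are indeed linear functions of the manifest variables, which is the sense in which the results resemble principal components. A brief sketch (my illustration with simulated data, using scikit-learn rather than anything from the paper):

import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
Z = rng.normal(size=(500, 2))                       # latent factors
L = rng.normal(size=(2, 6))                         # loadings
X = Z @ L + rng.normal(scale=0.5, size=(500, 6))    # manifest variables

fa = FactorAnalysis(n_components=2).fit(X)
scores = fa.transform(X)   # E(z | x): linear in the manifest variables
print(fa.components_)      # estimated loadings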

Journal ArticleDOI
01 May 1984
TL;DR: Georg Rasch's life prior to his development of the Rasch models is described in this paper, where it is described how Rasch in his youth studied mathematics, and how he at the quite young age of 30 defended his doctoral thesis.
Abstract: This article explores Georg Rasch’s life prior to his development of the Rasch models. It will be described how Rasch in his youth studied mathematics, and how he at the quite young age of 30 defended his doctoral thesis. As it was, there was no available positions in mathematics for Rasch and he turned towards statistics. He was granted a scholarship to study statistics with R.A. Fisher; a circumstance that influenced the progress of statistics in Denmark. Rasch’s main occupation before he published the Rasch models will also be described at some length. He worked as a statistical consultant, and through his empirical work he developed a habit of developing whatever statistical tools he needed for the analysis.

Journal ArticleDOI
TL;DR: The results of a Principal Components analysis of the Effectiveness of Coping Behaviours Inventory (ECBI), administered to 256 hospitalized alcoholic patients, are compared with the results of the Coping Behaviours Inventory (CBI) administered to the same sample, and also with a reanalysis of data obtained from a different sample 5 years earlier.
Abstract: Summary The results of a Principal Components analysis of the Effectiveness of Coping Behaviours Inventory (ECBI) administered to 256 hospitalized alcoholic patients are compared with the results of the Coping Behaviours Inventory (CBI) administered to the same sample, and also compared with a reanalysis of data obtained from a different sample 5 years earlier. The results indicate that the factor structures of the ECBI and CBI are similar. The four factors emerging from the present study, accounting for 59 per cent of the variance, were: (1) Positive Thinking, (2) Negative Thinking, (3) Avoidance/Distraction, and (4) Seeking Social Supports. While the scores on the CBI at intake did not discriminate between subsequent relapsers and survivors, the factor scores on the ECBI at intake on 'Positive Thinking' and 'Avoidance/Distraction' were found to be predictive of subsequent outcome 6 to 12 months later. The clinical implications are discussed in terms of coping strategies for relapse prevention treatment.

Journal ArticleDOI
TL;DR: This paper proposes an approach which uses individual-level data and thus permits regression analyses as well as analyses for sub-groups, and treats both unconditional (or additive) and conditional analyses.
Abstract: Summary Several recent papers have dealt with the problem of assessing the impact of the proximate determinants on fertility. All these approaches rely on combining a series of separately estimated aggregate-level indicators. This paper proposes an approach which uses individual-level data and thus permits regression analyses as well as analyses for sub-groups. In the course of development it became clear that there are several deficiencies and inconsistencies in the measurement and formation of indices proposed elsewhere, which are overcome. We illustrate our approach with data from the Dominican Republic. The approach used involves attributing exposure to one or more of several states, including pregnancy, lactational and non-lactational components of post-partum amenorrhoea, absence of sexual relations and contraception. Key elements are efficacies of contraception and components of post-partum infecundity and the treatment of overlaps through an explicit hierarchy. We treat both unconditional (or additive) and conditional analyses.
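The "explicit hierarchy" works like a priority rule applied to each woman-month of exposure, so that overlapping states resolve to a single one. A schematic sketch (entirely my illustration, with hypothetical field names, not the authors' coding scheme):

def classify_month(m):
    # states are checked in priority order, so an overlap (e.g. amenorrhoeic
    # and contracepting) resolves to the higher-ranked state
    if m["pregnant"]:
        return "pregnancy"
    if m["amenorrhoeic"]:
        return "post-partum amenorrhoea"       # lactational or otherwise
    if not m["sexually_active"]:
        return "absence of sexual relations"
    if m["contracepting"]:
        return "contraception"
    return "exposed to conception"

example = {"pregnant": False, "amenorrhoeic": True,
           "sexually_active": True, "contracepting": True}
print(classify_month(example))                  # -> post-partum amenorrhoea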

Journal ArticleDOI
TL;DR: In this paper, the authors present a class of models which are designed for forecasting the net sales of a product when the stock of that product is believed to be subject to a saturation level.
Abstract: This paper presents a class of models which are designed for forecasting the net sales of a product when the stock of that product is believed to be subject to a saturation level. The forecast function for the stock takes the form of a general modified exponential, a family which includes the logistic as a special case. However, framing the model in terms of the net increase in the product enables a link to be made between the traditional approach to forecasting based on non-linear trend curves and the approach based on ARIMA models.
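For reference, the trend-curve family referred to has the standard textbook form (my notation; the paper's parameterization may differ). The stock forecast function

μ_t = α + β·γ^t, 0 < γ < 1,

approaches the saturation level α as t grows; the logistic, μ_t = α / (1 + β·e^(−γt)), arises as the special case in which the reciprocal 1/μ_t follows a modified exponential. Modelling the net increase μ_t − μ_(t−1), rather than the stock itself, is what links these curves to ARIMA-type specifications.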

Journal ArticleDOI
TL;DR: Gardner in his [1982] assesses the ability of a number of methodological theories to explain the phenomenon, or alleged phenomenon, of the high supportive power, relative to some hypothesis, of any novel facts it predicts.
Abstract: Gardner in his [1982] assesses the ability of a number of methodological theories to explain the phenomenon, or alleged phenomenon, of the high supportive power, relative to some hypothesis, of any novel facts it predicts. There are, perhaps surprisingly, several definitions of 'novelty' around in the literature, and Gardner himself provides one more. They all have in common, however, that the 'novel' fact is not one which the hypothesis was deliberately constructed in order to explain (cf. Zahar [1973], 2.2; Gardner, op. cit., p. 11). One of the methodological theories cited by Gardner is Bayesianism, and it is charged by him with failure to explain how any fact 'novel' in this sense but known at the time a hypothesis was formulated can support that hypothesis. My aims in this note are twofold: first, to correct the elementary misunderstanding of the Bayesian theory responsible for that verdict and, second, to show how a Bayesian analysis provides a more accurate criterion than that of Zahar and others (according to which it is whether they are 'novel' or not, in the minimal sense above) for determining which known facts support subsequently-formulated hypotheses and which do not.
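The Bayesian machinery at issue is compact enough to state (standard formulation, my addition): evidence e supports hypothesis h exactly when P(h | e) > P(h), and by Bayes' theorem P(h | e) = P(h)·P(e | h)/P(e). The alleged difficulty with known evidence is that if e is already certain then P(e) = P(e | h) = 1, giving P(h | e) = P(h), so that old facts appear to confirm nothing; the note argues that this verdict rests on a misreading of how the relevant probabilities should be assigned.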

Journal ArticleDOI
01 Jan 1984
TL;DR: In this paper, a detailed investigation of a valley section in the Sussex Ouse basin has provided useful information concerning the litho- and bio-stratigraphy of the alluvial fill and floodplain deposits.
Abstract: Despite the apparent familiarity of geological and geomorphological events in southern England during the last millennia, there remains much uncertainty regarding the causal factors responsible for valley alluviation and floodplain development. A detailed investigation of a valley section in the Sussex Ouse basin has provided useful information concerning the litho- and bio-stratigraphy of the alluvial fill and floodplain deposits. The fine-grained fill sequence dates from the early Flandrian, and palynological analysis indicates that anthropogenic factors appear to have played a significant role in its development. It is argued that lateral accretion processes have probably played a limited part in floodplain construction, whilst overbank processes alone are also considered unlikely to have been responsible for their formation. Rather, other processes associated with silty braided and/or anastomosing channels are considered a more likely means of floodplain genesis. Finally, a series of conclusions is presented concerning Flandrian floodplain and alluvial fill development in southern England.


Posted Content
TL;DR: In this paper, the authors examine a new survey of 6,010 U.S. households and estimate a model for the allocation of total net worth among different assets, investigating the extent to which a conventional portfolio choice model can explain the differences in portfolio composition among households.
Abstract: In this paper, we examine a new survey of 6,010 U.S. households and estimate a model for the allocation of total net worth among different assets. The paper has three main aims. The first is to investigate the extent to which a conventional portfolio choice model can explain the differences in portfolio composition among households. Our survey data show that most households hold only a subset of the available assets. Hence we analyze a model in which investors choose to hold incomplete portfolios. We show that the empirical specification of the joint discrete and continuous choice that characterizes household portfolio behavior is a switching regressions model with endogenous switching. The second aim is to examine the impact of taxes on portfolio composition. The survey contains a great deal of information on taxable incomes and deductions, which enables us to calculate rather precisely the marginal tax rate facing each household. The third aim is to estimate wealth elasticities of demand for a range of assets and liabilities. We test the frequently made assumption of constant relative risk aversion.
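The estimator referred to is the classical two-regime switching regression with endogenous switching: regime choice d_i = 1 if w_i'γ + u_i > 0, regime-specific outcome equations, and outcome errors correlated with u_i. The sketch below is my own illustration of the standard textbook log-likelihood, not the authors' code:

import numpy as np
from scipy.stats import norm

def switching_loglik(params, y, X, W, d):
    # two-regime switching regression with endogenous switching
    k, m = X.shape[1], W.shape[1]
    b1, b0 = params[:k], params[k:2 * k]                # regime slopes
    g = params[2 * k:2 * k + m]                          # selection equation
    s1, s0 = np.exp(params[-4]), np.exp(params[-3])      # std devs > 0
    r1, r0 = np.tanh(params[-2]), np.tanh(params[-1])    # correlations in (-1, 1)
    z = W @ g
    e1 = (y - X @ b1) / s1
    e0 = (y - X @ b0) / s0
    # regime 1: density of y times P(regime 1 | y); regime 0 is complementary
    l1 = norm.logpdf(e1) - np.log(s1) + norm.logcdf((z + r1 * e1) / np.sqrt(1 - r1 ** 2))
    l0 = norm.logpdf(e0) - np.log(s0) + norm.logcdf(-(z + r0 * e0) / np.sqrt(1 - r0 ** 2))
    return np.sum(np.where(d == 1, l1, l0))

# the parameters would be estimated by maximizing this function, e.g. with
# scipy.optimize.minimize applied to its negative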

Journal ArticleDOI
01 Oct 1984
TL;DR: The purpose has been to provide reasonably comprehensive and up-to-date surveys of recent developments and the state of various aspects of econometrics as of the early 1980s, written at a level intended for professional use by economists, econometricians, and statisticians and for use in advanced graduate econometrics courses.
Abstract: Volume One of Elsevier's Handbook of Econometrics covers mathematical and statistical methods in econometrics, econometric models, and estimation and computation. The Handbook of Econometrics aims to serve as a source, reference, and teaching supplement for the field of econometrics, the branch of economics concerned with the empirical estimation of economic relationships. Econometrics is conceived broadly to include not only econometric models and estimation theory but also econometric data analysis, econometric applications in various substantive fields, and the uses of estimated econometric models. Our purpose has been to provide reasonably comprehensive and up-to-date surveys of recent developments and the state of various aspects of econometrics as of the early 1980s, written at a level intended for professional use by economists, econometricians, and statisticians and for use in advanced graduate econometrics courses.

Journal ArticleDOI
TL;DR: In this article, the author examines the effect of Sunday trading in the context of a standard model of imperfect competition and concludes that the removal of the restrictions may result in a social loss, with a loss being less likely the more competitive the market.
Abstract: A Bill to legalise Sunday trading has recently been defeated in the British parliament. An argument commonly advanced by traders opposing liberalisation starts by observing that if a few shops were to choose to open on Sunday they would attract so much business from retailers trading only on weekdays that soon all shops would be forced to open on Sunday. Some customers might find this convenient, but, to the extent they were merely induced to change their shopping day, no new demand would be created. Without an increase in weekly turnover the additional costs of Sunday opening would bankrupt some shops and the survivors would have to raise prices. Thus, there is no guarantee that, on balance, consumers will gain.

A counter-argument denies that all shops would be forced to open on Sundays. Insofar as consumers really prefer lower prices to the convenience of Sunday opening, then a shop which closed on Sunday and used a fraction of the cost saving to reduce prices for the rest of the week could maintain its sales and increase its profit. Giving retailers the option of Sunday opening therefore enables them to respond more sensitively to consumers' preferences as between the convenience of Sunday shopping and lower prices. This seems to be the position taken by Tullock (1975, pp. 673-4); my guess is that most economists would feel sympathetic to this argument, as I certainly am. However, matters are not entirely straightforward. The retail trade can hardly be regarded as perfectly competitive and, starting from a second-best equilibrium, there is no guarantee that permitting more competition in one dimension of the characteristic space yields a potential Pareto gain. These considerations suggest that the argument be pursued at a more formal level. The following section examines Sunday opening in the context of a standard model of imperfect competition. The principal results are that the removal of the restrictions may result in a social loss, with a loss being less likely the more competitive is the market.

An informal preview of the argument may be helpful. Given that travel costs must be incurred to make a purchase, customers who live close to a shop cannot easily be enticed away by competitors. The real fight is for buyers located approximately equidistant from two neighbouring stores. Ideally, each shop would like to offer price cuts to these distant customers only, but this is not feasible. However, for customers who go out to work, the opportunity cost of shopping time may well be lowest on Sunday. So, for at least some customers, Sunday opening results in a fall in transport costs, thereby offering more to

Book ChapterDOI
01 Jan 1984
TL;DR: Earlier versions of this paper were presented to the World Congress of the Econometric Society at Toronto (1975) and to the AUFE meeting at Exeter (1979), as well as at seminars in Cambridge, Southampton and Leeds.
Abstract: Earlier versions of the paper were presented to the World Congress of the Econometric Society at Toronto (1975) and to the AUFE meeting at Exeter (1979), as well as at seminars in Cambridge, Southampton and Leeds. I am grateful to David Blake for much research help, and to Stephen Glaister, Steve Nickell and Malcolm Pemberton for clarification on questions of theory. This research was supported over its long gestation period by the UK SSRC in a series of programmes to the LSE econometrics group: SEPDEM (1973–1976), PQE (1976–1979), MIME (1979–1982) and DEMEIC (1982–1985).

Posted Content
TL;DR: The authors developed a seniority model of union behaviour that attempts to resolve a number of long-standing puzzles in the literature, such as the shrinking union problem, bargaining over pay but not employment, unions having no direct interest in the elasticity of labour demand and extreme wage rigidity.
Abstract: The paper develops a seniority model of union behaviour that attempts to resolve a number of long-standing puzzles in the literature. The model predicts that (i) efficient contracts will lie on the labour demand curve, (ii) there will be negotiations over pay but not employment, (iii) unions will have no direct interest in the elasticity of labour demand, (iv) in certain circumstances there will be extreme wage rigidity, (v) exceptional recessions will produce concession bargaining, and (vi) there will be no ‘shrinking union’ problem. It is also shown that the introduction of layoffs by seniority into implicit contract theory eliminates its famous wage-rigidity theorem. The paper discusses the form of real labour contracts, documents the extent of layoffs by seniority, and reports the results of a survey of the largest British and US trade unions.

Journal ArticleDOI
TL;DR: In this article, the authors generalize the well-known maximum theorem to include discontinuous objective functions without any loss in the structure of the derived choice correspondence, and show that the maximum theorem can be generalized to cases with discontinuous objectives.

Journal ArticleDOI
TL;DR: There was a difference between the normal and maladjusted boys' knowledge of strategies for control of behaviour in a conflict situation, and there was a developmental change in the spontaneous mention of a display rule.
Abstract: Younger (7-8 years) and older (10-11 years) boys from normal schools and from schools for the maladjusted were given two tests to assess their knowledge of strategies of control for both facial and overt behavioural expression of a negative emotion. The results indicated firstly that there is a developmental change in the spontaneous mention of a display rule with this developmental trend being retarded amongst the maladjusted boys. Secondly, there was a difference between the normal and maladjusted boys' knowledge of strategies for control of behaviour in a conflict situation.

Journal ArticleDOI
TL;DR: In this paper, a simple method for scaling a set of binary responses is proposed using the logit factor model of Bartholomew (1980), where individuals may be ranked on the basis of a linear combination, Σᵢ αᵢxᵢ, of the responses (xᵢ = 0 or 1), where the αᵢ are the appropriate factor loadings.
Abstract: SUMMARY A simple method for scaling a set of binary responses is proposed using the logit factor model of Bartholomew (1980). It is shown that individuals may be ranked on the basis of a linear combination, Σᵢ αᵢxᵢ, of the responses (xᵢ = 0 or 1), where the αᵢ are the appropriate factor loadings. This completely avoids the need to calculate the y-scores suggested in Bartholomew (1980) and hence the heavy numerical integration which that method involves.
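Computing the proposed scale is a one-line operation once the loadings are in hand. A toy sketch (my illustration with made-up loadings, not values from the paper):

import numpy as np

alpha = np.array([0.9, 1.4, 0.3, 2.1])    # hypothetical logit factor loadings
X = np.array([[1, 0, 1, 1],               # one row of binary responses
              [0, 1, 0, 1],               # per individual
              [1, 1, 1, 0]])

scores = X @ alpha                        # sum of alpha_i * x_i per person
ranking = np.argsort(-scores)             # rank individuals by score
print(scores, ranking)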

Journal ArticleDOI
TL;DR: The time-dependent proportional hazards model is used to analyse first achievement by married couples of a home in one of the two major British housing tenures, owner-occupation and local authority accommodation, and conclusions are drawn about the extent to which the above factors have changed in importance over time.
Abstract: The time-dependent proportional hazards model is used to analyse first achievement by married couples of a home in one of the two major British housing tenures, owner-occupation and local authority accommodation. The effect of demographic and socioeconomic influences such as age at marriage, social class, and previous housing and fertility histories are estimated using a combination of life-table and regression approaches. All these factors are found to have substantial independent effects. The model is applied to data for three marriage cohorts in the Office of Population Censuses and Surveys' 1976 Family Formation Survey, and conclusions are drawn about the extent to which the above factors have changed in importance over time.
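A modern equivalent of this analysis can be run with the lifelines library (my sketch; the software postdates the paper and the column names and data are hypothetical), fitting a proportional hazards model to episode-format data with time-dependent covariates:

import pandas as pd
from lifelines import CoxTimeVaryingFitter

# one row per episode (start, stop] per couple; covariates such as the
# arrival of a child can change between episodes
df = pd.DataFrame({
    "id":              [1, 1, 2, 2],
    "start":           [0, 24, 0, 12],
    "stop":            [24, 60, 12, 48],
    "age_at_marriage": [22, 22, 27, 27],
    "has_child":       [0, 1, 0, 1],
    "event":           [0, 1, 0, 1],      # 1 = first entered the tenure
})

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()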