Showing papers in "Quality & Quantity in 1997"


Journal ArticleDOI
TL;DR: In this article, the authors focus on the opportunities and strengths of a multi-method approach, widely called methodological triangulation, in which different investigative methods are applied to one research object.
Abstract: The essay focuses on the opportunities and strengths of a multi-method approach, widely called methodological triangulation, in which different investigative methods are applied to one research object. In practice, this can be realized with the coupling of quantitative structural data concerning the life course and the interpretation and evaluation of life course data collected with qualitative methods. This approach is examined in order to shed light on the problem that research findings often show different phenomena and not the different aspects of one phenomenon. The discussion of the relationships of the findings to one another (congruent, complementary or divergent) shows that in this context a multi-method approach can nevertheless be used to increase validity and to test hypotheses. Further, its particular strengths are the empirically induced modification of existing models and theories, as well as the development of new explanations.

254 citations


Journal ArticleDOI
TL;DR: In this article, the identification of principles for a type of case study evolves out of the type of knowledge and information the researcher is seeking to gather, and different types of case studies need to be judged by unique principles or assumptions.
Abstract: Experimental research practices encouraging scientific control and explicit measurement have been criticized for dealing with management and organizational dynamics as if they were hypothetical, static, and unreal. There have been a number of calls for a more qualitative and relevant approach which provides more in-depth knowledge of cases. Certain principles and practices have been offered to assist researchers, yet there has been much debate on their application. Some researchers have suggested case study principles supporting intensive case analysis while others have indicated the importance of comparisons. This paper first provides a listing of common case study approaches, each of which is used for different purposes. Some of these approaches are used for descriptive research, some for encouraging discovery, and others for establishing proof. The identification of principles for a type of case study evolves out of the type of knowledge and information the researcher is seeking to gather. Narrative, explanatory, and interpretative cases tend to use historical information focused around questions, criteria, a sequence of occurrence, or testimonials. Tabulations, comparative studies, and diagnostic and experimental action research cases seem to be more complex in the variety of data they summarize. Survey cases stand on their own as researchers use them much like they are gathering survey data from a large sample. This paper suggests that different types of case studies need to be judged by unique principles or assumptions.

170 citations


Journal ArticleDOI
TL;DR: In this article, the statistical power and Type I error rates of several homogeneity tests usually applied in meta-analysis are compared using Monte Carlo simulation: the chi-square test applied to standardized mean differences, correlation coefficients, and Fisher's r-to-Z transformations, and the S&H procedures, which presented greater statistical power.
Abstract: The statistical power and Type I error rate of several homogeneity tests, usually applied in meta-analysis, are compared using Monte Carlo simulation: (1) the chi-square test applied to standardized mean differences, correlation coefficients, and Fisher's r-to-Z transformations, and (2) the S&H procedures, which presented greater statistical power. In all conditions, the statistical power was very low, particularly when the sample had few studies, small sample sizes, and small differences between the parametric effect sizes. Finally, the criteria for selecting homogeneity tests are discussed.
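
As a concrete reference point, the chi-square (Q) homogeneity test for standardized mean differences can be sketched in a few lines. This is a generic textbook implementation, not the authors' simulation code, and the example data are made up.

```python
import numpy as np
from scipy import stats

def q_homogeneity_test(d, n1, n2):
    """Chi-square (Q) homogeneity test for standardized mean differences.

    d: observed effect sizes; n1, n2: per-study group sample sizes.
    Uses the usual large-sample variance approximation for each d_i.
    """
    d, n1, n2 = map(np.asarray, (d, n1, n2))
    var_d = (n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2))
    w = 1.0 / var_d                        # inverse-variance weights
    d_bar = np.sum(w * d) / np.sum(w)      # weighted mean effect size
    q = np.sum(w * (d - d_bar) ** 2)       # Q statistic
    p = stats.chi2.sf(q, df=len(d) - 1)    # reference: chi-square, k-1 df
    return q, p

# Hypothetical three-study meta-analysis
q, p = q_homogeneity_test(d=[0.2, 0.5, 0.8], n1=[20, 30, 25], n2=[20, 30, 25])
print(f"Q = {q:.2f}, p = {p:.3f}")
```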

66 citations


Journal ArticleDOI
TL;DR: In this article, the concept of lifestyle is defined with an eye to establishing a generic conception sufficient for guiding ethnographic exploration in a wide range of areas, with reference to the concepts of culture, status, status group, subculture, idioculture, everyday life, and social world.
Abstract: In effect, one of the primary missions of ethnographic research is to explore the lifestyles of the people falling within its purview. Yet, rare indeed it is to find a study in the several disciplines presently conducting such research where this idea serves as the avowed focus of data collection. The concept of lifestyle is first reviewed, then defined with an eye to establishing a generic conception sufficient for guiding ethnographic exploration in a wide range of areas. Next, lifestyle is located theoretically with reference to the concepts of culture, status, status group, subculture, idioculture, everyday life, and social world. The many different types of lifestyles in modern life are then briefly examined. Finally, we consider certain methodological approaches thought to be especially appropriate for exploring lifestyles.

46 citations


Journal ArticleDOI
TL;DR: The authors discuss the role of time in causal inferences in the social sciences, compare in detail how panel and event history observation designs affect causal analysis, and conclude that the collection of event history data is an extremely useful approach for uncovering causal relationships or mapping out systems of causal relations.
Abstract: This paper first discusses the role of time in causal inferences in the social sciences. It then compares in detail how panel and event history observation designs affect causal analysis. It shows that the collection of event history data is an extremely useful approach for uncovering causal relationships or mapping out systems of causal relations. It concludes that event history data provide an optimal basis for a causal understanding of social processes because they allow the social researcher to relate the change in future outcomes to conditions in the past at each point in time.
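
The argument turns on the structure of event history data: each person contributes one record per time unit at risk, with conditions measured before the outcome they are meant to explain. A minimal sketch of such a person-period layout, with hypothetical data not taken from the paper:

```python
import pandas as pd

# Hypothetical event-history records: one row per person-month at risk,
# the covariate measured at the start of the interval and the event
# indicator at its end, i.e. past conditions predicting future outcomes.
person_period = pd.DataFrame({
    "person":   [1, 1, 1, 2, 2],
    "month":    [1, 2, 3, 1, 2],
    "employed": [0, 0, 1, 0, 0],   # condition at start of interval
    "event":    [0, 0, 1, 0, 1],   # transition observed during interval
})

# A discrete-time hazard model is then simply a binary regression of
# `event` on the lagged conditions over this stacked layout.
print(person_period)
```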

43 citations


Journal ArticleDOI
TL;DR: In this paper, the validity and reliability of the self-reported vote were evaluated using data from seven Swedish election study panels, 1973-1994; eight categories were created, based on self-report at Time 1, an external check, and self-report at Time 2.
Abstract: Data from seven Swedish election study panels, 1973-1994, were analyzed to assess simultaneously the validity and reliability of self-reported vote. Eight categories were created, based on self-report at Time 1, an external check, and self-report at Time 2. Overall the validity and reliability of this measure were quite high. However, the cases in which the measure was valid but not reliable outnumbered those in which it was reliable but not valid. Subsequent turnout behavior was most strongly predicted by what people had done previously, but the two self-report measures were also significant predictors in the regression analysis. The eight categories were then compared on a series of demographic and political variables.
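
A sketch of how the eight categories could be built from the three dichotomous indicators; the coding below is illustrative, not the authors' scheme.

```python
from itertools import product

# Each respondent yields three dichotomous indicators of voting:
# self-report at Time 1, the external (register) check, self-report at
# Time 2. Crossing them gives the 2**3 = 8 categories.
def classify(self_t1, register, self_t2):
    return {
        "valid": self_t1 == register,     # Time-1 report matches the register
        "reliable": self_t1 == self_t2,   # the two self-reports agree
    }

for self_t1, register, self_t2 in product((1, 0), repeat=3):
    print((self_t1, register, self_t2), classify(self_t1, register, self_t2))
```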

17 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that life history techniques can develop research on housing careers, particularly of Pakistanis, provided certain of their properties are recognised in the process of analysis, which can increase awareness and understanding of social actor agency.
Abstract: The discussion centers on three key criticisms of life history techniques, which are reviewed and evaluated with reference to data from research on Pakistani housing histories in Glasgow. Firstly, the ‘problem of bias’ is reframed as an issue of reflexivity, which enhances research findings. Secondly, awareness of the cultural construction of life histories allows greater depth of analysis. Thirdly, the problem of the relationship between the life history and wider social forces is addressed, which is argued to increase awareness and understanding of social actor agency. In conclusion, we argue that life history techniques can develop research on housing careers, particularly of Pakistanis, provided certain of their properties are recognised in the process of analysis.

16 citations


Journal ArticleDOI
TL;DR: In this paper, the authors argue that two main causes of the failure to create interdisciplinary social science can be distinguished, i.e., methodological and theoretical problems, and that the validity and reliability of explanations of macro social phenomena, which are provided by disciplines such as sociology and macro economics, are seriously at stake.
Abstract: Since the 1930s, interdisciplinarity has been advocated in the social sciences for the purpose of achieving more comprehensive explanations of observable social phenomena. However, the realization of this promising perspective has been rather poor. This article argues that two main causes of the failure to create interdisciplinary social science can be distinguished, i.e., methodological and theoretical problems. Methodological problems stem either from taking a reductionist approach towards interdisciplinarity or from mistaking measurement issues for theoretical topics. Theoretical problems result from the poor state and rate of theory formation within psychology. The implications of these problems are that the validity and reliability of explanations of macro social phenomena, which are provided by disciplines such as sociology and macro economics, are seriously at stake.

16 citations


Journal ArticleDOI
TL;DR: This paper shows that respondents do base their preferences on the information provided in the Choice Questionnaire: different information resulted in significantly different choices.
Abstract: In the Choice Questionnaire (Neijens et al. 1992) respondents have to choose between several policy options. Within this questionnaire they are provided with information about the consequences of each option. Until now, only indirect evidence as to whether or not respondents base their preferences on the information provided was available and plausible alternative explanations for the Choice Questionnaire's effect could not be ruled out. In the present study, we demonstrate that Choice Questionnaire respondents do base their preferences on the information provided: different information resulted in significantly different choices.

14 citations


Journal ArticleDOI
TL;DR: The paper will show that when evaluating attitude stability, researchers' models must take into account not only the structure of the variances/covariances but also the structure of the means.
Abstract: Some researchers suggest that panel data be used with simplex models in order to evaluate stability of opinions before drawing the conclusion that an attitude is detected. They have carried out studies showing that it is not always true that an attitude exists, mainly because the opinion is unstable. This paper proposes to continue this line of research, presenting both a new conceptualization of attitude stability and a way to evaluate it by using simplex models. The paper will show that when evaluating attitude stability, researchers' models must take into account not only the structure of the variances/covariances but also the structure of the means. The authors demonstrate that this new definition of change makes a difference in the conclusions about the stability of opinions concerning the role of women in society.
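
For reference, the quasi-simplex setup the abstract builds on can be written as follows; the notation is mine and not necessarily the authors' exact specification.

```latex
% Quasi-simplex model for an opinion y_t measured at waves t = 1, ..., T:
%   measurement part: observed score = latent opinion + error
%   structural part:  first-order autoregression of the latent opinion
y_t = \eta_t + \varepsilon_t, \qquad
\eta_t = \alpha_t + \beta_t \, \eta_{t-1} + \zeta_t .
% Evaluating stability via the covariance structure alone checks the
% \beta_t; the paper's point is that the mean structure (the \alpha_t,
% and hence E[\eta_t]) must be modeled as well before an attitude is
% declared stable.
```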

13 citations


Journal ArticleDOI
TL;DR: The authors argue that there are a variety of "implicit" issues in qualitative inquiry that need to be addressed if the area is to develop in some "normal science" sense, and make suggestions for addressing these issues.
Abstract: The paper argues that there are a variety of "implicit" issues in qualitative inquiry that need to be addressed if the area is to develop in some "normal science" sense. This "unfinished business" is concerned with a deeper investigation of basic terms that are now simply taken for granted, such as "theme" and "pattern". It also includes the need to develop "rules" which will assist in making and justifying the qualitative interpretations drawn from the implicit processes of inference. Specific suggestions are made for addressing these issues.

Journal ArticleDOI
TL;DR: In this article, the complementary nature of system integration and conflict is discussed, and it is shown that conflict and integration can coexist simultaneously within a system, and that they are symbiotic and need each other.
Abstract: The systems and conflict approaches are often viewed as incompatible, if not contradictory. While the former emphasizes system integration, consensus, and harmony, the latter connotes lack of consensus, and perhaps even system dissolution. This paper shows that rather than being contradictory, consensus and conflict are in fact complementary in some ways. Further, they can coexist simultaneously within a system. Every system has, at a given time, some level of both consensus and conflict (although one or the other may be very low, it is still probably above zero). While functionalists have long viewed system integration as "functional" and conflict as "dysfunctional," we also see conflict as "functional," as it combats lethargy and obsolescence, and spurs needed change and growth. However, while both conflict and integration coexist in a system, their interrelationship is complex, and sometimes very difficult to analyze. This paper demonstrates the complementarity of system integration and conflict through explication of the simultaneous interrelationships of three analytical models: the global-mutable-immutable distinction, the three-level model, and the Q-R distinction. Through this analysis we show that integration and conflict not only are complementary, but are in fact symbiotic, and need each other.

Journal ArticleDOI
TL;DR: Two marketing research techniques, Conjoint Analysis and Quality Function Deployment (QFD), are imported into the political world and presented as superior substitutes for the conventional popularity contest surveys.
Abstract: Two marketing research techniques, Conjoint Analysis and Quality Function Deployment (QFD), are imported into the political world and presented as superior substitutes for the conventional popularity contest surveys. Both techniques go deeper than "horse-race" popularity contests to analyze the roots of political success. Political Conjoint Analysis (PCA) and Political QFD (PQFD) identify and quantify consistent theoretical and operational features which statistically determine electoral victors. The successful application of such marketing techniques in the political world will hopefully eventually lead to Total Quality Politics (TQP) and intensify competition over voter satisfaction, rather than voter manipulation. Possible beneficial results of such techniques are a refinement of candidates' qualities in light of public demands and constant change in candidates' reservoirs due to the inability of existing pools to match voter preferences.
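
The conjoint machinery behind such an approach is standard: candidate profiles are dummy-coded on their attributes and part-worth utilities are estimated by ordinary least squares. A toy sketch, with attributes and ratings invented for illustration:

```python
import numpy as np

# Hypothetical candidate profiles: each row dummy-codes two attributes
# (experience: high vs low; stance: centrist vs partisan) plus an intercept.
X = np.array([
    [1, 1, 1],   # high experience, centrist
    [1, 1, 0],   # high experience, partisan
    [1, 0, 1],   # low experience, centrist
    [1, 0, 0],   # low experience, partisan
])
# One voter's preference ratings for the four profiles (illustrative data).
y = np.array([9.0, 6.0, 7.0, 3.0])

# OLS part-worths: the attribute-level utilities of conjoint analysis.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["baseline", "experience", "centrism"], beta.round(2))))
```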

Journal ArticleDOI
TL;DR: In this article, the authors assess the consistency of the four scales of the inventory by means of a confirmatory factor analysis model and version VIII of the Lisrel computer package, and also its stability at different times of day.
Abstract: Of the instruments devised for measuring self-reported activation, those that consider arousal dimensions (energy and tension) and a third dimension (hedonic tone, or pleasure/displeasure) allow a complete evaluation of the subjective mood. The inventory prepared by Matthews (1987a), which also incorporates a composite general arousal scale made up of items from the principal arousal scales, is one of the most frequently used. The present study assesses the consistency of the four scales of the inventory by means of a confirmatory factor analysis model and version VIII of the Lisrel computer package, and also its stability at different times of day (09:00, 13:00, 17:00 and 21:00). The study sample was made up of 671 healthy university students (236 men/435 women) aged between 17 and 38. Our results confirm the goodness of fit of the inventory's three principal factors. However, in our view, the secondary arousal factor should be redefined or omitted, since its configuration was poor at all times of day that the inventory was tested.

Journal ArticleDOI
TL;DR: In this article, the authors deal with the issue of the detection of selective nonresponse in discrete-time, multi-wave panel studies, and discuss ways to detect and quantify the amount of selectiveness by means of discretetime Markov models.
Abstract: The current paper deals with the issue of the detection of selective nonresponse in discrete-time, multi-wave panel studies. If groups in a population differ with respect to the chances that they will be (and remain) in a longitudinal sample, we speak of selective nonresponse. Ultimately, selective nonresponse may lead to a sample that is very different from the target population. We discuss ways to detect and quantify the amount of selectiveness by means of discrete-time Markov models. Then we proceed by addressing how a researcher may gain understanding of how to solve the problems caused by selective nonresponse, and the degree to which these solutions will be effective, by means of data on the nonresponse during a three-wave panel study involving 2800 young Dutch adults.
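
A minimal sketch of the underlying idea, assuming a respond/drop-out state per wave: estimate wave-to-wave retention probabilities separately per covariate group, and read persistently differing rates as evidence of selectiveness. The data and group labels below are hypothetical.

```python
import numpy as np

# Hypothetical response histories over three waves, one row per respondent:
# 1 = responded, 0 = dropped out (treated as absorbing).
histories = {
    "employed":   np.array([[1, 1, 1], [1, 1, 0], [1, 1, 1], [1, 0, 0]]),
    "unemployed": np.array([[1, 0, 0], [1, 1, 0], [1, 0, 0], [1, 1, 1]]),
}

# Estimated Markov retention probability P(respond at t+1 | respond at t).
# Selective nonresponse shows up as retention rates differing across groups.
for group, h in histories.items():
    at_risk = h[:, :-1] == 1
    retained = (h[:, 1:] == 1) & at_risk
    print(f"{group}: retention = {retained.sum() / at_risk.sum():.2f}")
```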

Journal ArticleDOI
TL;DR: The authors empirically demonstrate what can happen when difference scores are used as dependent variables in research and present an alternative method of data analysis that can substantially reduce the problems of difference scores.
Abstract: Many management researchers use difference scores to report results of empirical studies. Yet difference scores create known problems of reliability, spurious correlations, and variance restriction. Reframing a research model can substantially reduce the problems of difference scores. This note empirically demonstrates what can happen when difference scores are used as dependent variables in research and presents an alternative method of data analysis.
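
One common reframing (assumed here, since the abstract does not spell out the note's alternative) is to replace the difference-score regression with a lagged-dependent-variable regression: model the post score, controlling for the pre score, instead of modeling post minus pre. A sketch on simulated data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
pre = rng.normal(size=n)
x = rng.normal(size=n)
post = 0.6 * pre + 0.4 * x + rng.normal(scale=0.5, size=n)

# Problematic: the difference score as the dependent variable.
diff_model = sm.OLS(post - pre, sm.add_constant(x)).fit()

# Reframed: model the post score, with the pre score as a covariate.
lagged_model = sm.OLS(post, sm.add_constant(np.column_stack([pre, x]))).fit()

print("difference-score model:", diff_model.params.round(2))
print("lagged-DV model:       ", lagged_model.params.round(2))
```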

Journal ArticleDOI
TL;DR: The authors showed that the notion of rationality does not have any meaningful role to play in behavioral inquiry, and that there is no sense in distinguishing rational from non-rational or irrational behavior.
Abstract: Almost every theory of human behavior is based upon some assumption of rationality. Such an assumption is commonly believed to be necessary in order to distinguish rational behavior, which is amenable to scientific investigation, from non-rational behavior, which is not. This article presents a thorough re-examination of this assumption, an inquiry which turns out to raise all the central issues of both the methodology and the theory of behavioral inquiry generally. It leads to the somewhat surprising conclusion that the notion of rationality does not have any meaningful role to play in behavioral inquiry, and that there is no sense in distinguishing rational from non-rational or irrational behavior. It also shows that the generalization of the utility notion in terms of information makes it a much more powerful and subtle tool of analysis than it is commonly taken to be.

Journal ArticleDOI
TL;DR: The issue of the sensitivity of the structural equation modeling (SEM) methodology to violations of the underlying hypothesis of linear latent relationships is the focus of this paper; the results of a simulation study demonstrate that latent correlations and percentage of explained variance, as well as parameter standard errors and model residuals, can provide critical information about violation of latent linearity.
Abstract: The issue of sensitivity of the structural equation modeling (SEM) methodology to violations of the underlying hypothesis of linear latent relationships is the focus of this paper. The identity of overall goodness-of-fit indices of an initially considered linear latent pattern model and of an equivalent model not making this assumption exemplifies the lack of routinely available global means within the methodology to evaluate the linearity assumption. The paper next focuses on the sensitivity of SEM to violations of presumed linearity for a general, nonlinear pattern of true relationship. The results of a simulation study are then presented which demonstrate that latent correlations and percentage of explained variance, as well as parameter standard errors and model residuals, can provide critical information about violation of latent linearity, and should therefore also be examined when checking for departures from linear relationships at the latent level in applications of the SEM methodology in social and behavioral research.
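
A tiny simulation in the same spirit, assuming a quadratic true latent relation: the linear latent correlation is near zero even though the dependence is strong, which is exactly the kind of diagnostic signal the paper says should be inspected.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
eta1 = rng.normal(size=n)
# True latent relation is quadratic, so a linear latent model is misspecified.
eta2 = eta1**2 + rng.normal(scale=0.5, size=n)

r_linear = np.corrcoef(eta1, eta2)[0, 1]           # near zero by symmetry
r_quadratic = np.corrcoef(eta1**2, eta2)[0, 1]     # strong

print(f"linear latent correlation:  {r_linear: .3f}")
print(f"correlation with eta1**2:   {r_quadratic: .3f}")
```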

Journal ArticleDOI
TL;DR: In this article, a model that incorporates water quantity and quality aspects and a market-based system is developed to characterize optimum water allocations between two regions or countries, should the relevant authorities of both regions agree to impose it.
Abstract: A model is presented that incorporates water quantity and quality aspects, and a market-based system is developed to characterize optimum water allocations between two regions or countries. A methodology is developed to compute an optimal policy that could support an interregional optimum water quantity and quality allocation, should the relevant authorities of both regions agree to impose it. The methodology is illustrated using the case of the Nestos river in the Balkans.
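
As a toy illustration of the quantity side of such a model (the quality dimension is omitted and every number below is hypothetical), an interregional allocation can be computed by maximizing joint benefits subject to the total flow constraint:

```python
from scipy.optimize import minimize

TOTAL_FLOW = 100.0  # total river flow available to the two regions

# Hypothetical concave regional benefit functions of water quantity q.
def benefit_upstream(q):
    return 12 * q - 0.08 * q**2

def benefit_downstream(q):
    return 10 * q - 0.05 * q**2

# Maximize joint benefit subject to q_up + q_down <= TOTAL_FLOW.
res = minimize(
    lambda q: -(benefit_upstream(q[0]) + benefit_downstream(q[1])),
    x0=[50.0, 50.0],
    bounds=[(0, TOTAL_FLOW)] * 2,
    constraints=[{"type": "ineq", "fun": lambda q: TOTAL_FLOW - q[0] - q[1]}],
)
print("optimal allocation (upstream, downstream):", res.x.round(1))
```

At the optimum the two regions' marginal benefits are equalized, which is what a well-functioning water market would also achieve.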

Journal ArticleDOI
TL;DR: In this paper, the authors argue that conventional statistics is built upon unwarranted assumptions, and therefore it must be replaced by the new statistical thinking which is based on four axioms.
Abstract: Every science is built upon a few axioms or general truths, which cannot be logically proved, but accepted as obviously true to actual fact. Conventional statistics is built upon unwarranted assumptions, and therefore it must be replaced by the new statistical thinking which is based on four axioms.

Journal ArticleDOI
TL;DR: In this article, the authors study the sensitivity of the conclusions reached with respect to the model used and find that the hazard is inversely U-shaped, which means that models that cannot allow for this type of hazard run into difficulties.
Abstract: The survival pattern of Swedish commercial banks during the period 1830-1990 is studied by parametric and non-parametric event-history methods. In particular, we study the sensitivity of the conclusions reached with respect to the model used. It is found that the hazard is inversely U-shaped, which means that models that cannot allow for this type of hazard run into difficulties. Thus two of the most popular approaches in the analysis of event history data, the Gompertz and the Weibull models, produce misleading results regarding the development of the death risk of banks over time. As regards the effect of explanatory variables on survival, on the other hand, most models are found to be robust, and even in cases of misspecified baseline hazards the estimated effects of the explanatory variables do not seem to be seriously wrong.
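
The shape argument is easy to verify directly: Weibull and Gompertz hazards are monotone in time, while, for example, a log-logistic hazard can be inverse U-shaped. A sketch with illustrative parameter values (not estimates from the paper):

```python
import numpy as np

t = np.linspace(0.1, 10, 5)

# Weibull hazard h(t) = (k/lam) * (t/lam)**(k-1): monotone in t.
def weibull_hazard(t, k=1.5, lam=3.0):
    return (k / lam) * (t / lam) ** (k - 1)

# Gompertz hazard h(t) = a * exp(b*t): monotone in t.
def gompertz_hazard(t, a=0.05, b=0.3):
    return a * np.exp(b * t)

# Log-logistic hazard: rises then falls (inverse U) when beta > 1.
def loglogistic_hazard(t, alpha=3.0, beta=2.0):
    u = (t / alpha) ** beta
    return (beta / alpha) * (t / alpha) ** (beta - 1) / (1 + u)

for h in (weibull_hazard, gompertz_hazard, loglogistic_hazard):
    print(h.__name__, np.round(h(t), 3))
```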

Journal ArticleDOI
TL;DR: The collapsibility theorem describes both the circumstances in which the effects of hierarchical models change when additional variables are introduced and the circumstances in which the exclusion of certain variables and the analysis of specific marginal tables may lead to different conclusions.
Abstract: The collapsibility theorem describes both the circumstances in which the effects of hierarchical models change when additional variables are introduced and the circumstances in which the exclusion of certain variables and the analysis of specific marginal tables may lead to different conclusions.
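
A worked numerical illustration of non-collapsibility, using the classic kidney-stone numbers purely as an example: the conditional odds ratios both favor treatment A, yet the table collapsed over the stratifying variable reverses the association.

```python
# Success/failure counts for treatments A and B within each stratum.
strata = {
    "small stones": {"A": (81, 6),   "B": (234, 36)},
    "large stones": {"A": (192, 71), "B": (55, 25)},
}

def odds_ratio(a, b):
    (s1, f1), (s2, f2) = a, b
    return (s1 * f2) / (f1 * s2)

for z, cell in strata.items():
    print(z, "OR(A vs B) =", round(odds_ratio(cell["A"], cell["B"]), 2))

# Collapsing over the stratum gives the marginal table -- and a reversal.
marg_A = tuple(sum(c["A"][i] for c in strata.values()) for i in (0, 1))
marg_B = tuple(sum(c["B"][i] for c in strata.values()) for i in (0, 1))
print("collapsed OR(A vs B) =", round(odds_ratio(marg_A, marg_B), 2))
```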

Journal ArticleDOI
TL;DR: In this paper, musical notation or modifications of musical notation may be used to register courses (or cross-sectional data) with more variables than usual, such as the duration of components and the time scale.
Abstract: Graphic representation of complicated courses is often necessary to detect patterns that may be worth analysing. Examples are given to show how musical notation, or modifications of it, may be used to register courses (or cross-sectional data) with more variables than usual. One can register courses with known duration of components (and then also simultaneities); the time scale may be defined according to the data. One can also register sequences without known duration of components. Finally, the method can be modified so as to suit cross-sectional data. The method can be used to register a single case but also a group of cases that are thus rendered comparable. It is a method of registration, not of analysis, but one that may help prepare a refined analysis.

Journal ArticleDOI
TL;DR: Some results of an investigation are presented in which answers on some basic variables (occupation and education) are registered in several ways, and causes of differences in assignments are discussed, and recommendations for improvement are given.
Abstract: Some results are presented of an investigation in which answers on some basic variables (occupation and education) were registered in several ways. The variables were measured in a setting in which the registration was to occur as fast as possible. All registrations were done using a computer and were checked a few weeks afterwards, which gives information about the reliability of the registrations. About 70% of those having a job reported the same occupation during the registration and the check. For educational level the same answer was given in about 80% of the cases. People not in the labour process reported nearly identically both times. Typing the labels most closely resembles the way employees at registration desks work; choosing from a list, however, gives more reliable results. Causes of differences in assignments are discussed, and recommendations for improvement are given.

Journal ArticleDOI
TL;DR: In this article, a simple efficiency wage model is proposed for identical workers in which each worker chooses an effort level lower than that attributed to others, the latter being estimated as the lowest effort that allows a worker to pass the monitoring test.
Abstract: Empirical evidence has shown that people systematically overrate their own performance relative to others. This paper investigates production with identical workers where each one believes himself to be more productive than the other workers. In a simple efficiency wage model, we ask how these seemingly incompatible beliefs can be made compatible with one another. We suggest that to compensate for the subjectively perceived productivity gap, each worker chooses an effort level lower than that attributed to others. The latter is estimated as the lowest effort that allows a worker to pass the firm's monitoring test. Since rational agents will not maintain expectations which turn out to be systematically wrong, we introduce a "consistency requirement for false beliefs". Accordingly, predictions based on the "wrong" model must agree with the observations of the "true" model. We show that even with consistency, less effort is supplied than in the full information setting. Hence, the wage-effort relationship becomes less efficient from the firm's viewpoint. At first sight, workers at the firm level gain from holding false beliefs, while profits unambiguously fall. At the aggregate market outcome, however, the firms' labor demand declines, total output falls, and the rate of unemployment rises, decreasing workers' utility again.

Journal ArticleDOI
TL;DR: In this article, the authors introduce novel cumulative logit models for the panel-data analysis of transitions among ordered states of a polytomous dependent variable, which can distinguish between covariate effects on the odds of having an upward transition and covariate effect on the likelihood of having a downward transition.
Abstract: This paper introduces novel cumulative logit models for the panel-data analysis of transitions among ordered states of a polytomous dependent variable. The models differ from conventional cumulative logit models in that they can distinguish between covariate effects on the odds of having an upward transition and covariate effects on the odds of having a downward transition in the ordered states of the dependent variable.
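
For orientation, the conventional cumulative logit model reads as follows; the asymmetric variant is sketched in my own notation and should not be read as the authors' exact specification.

```latex
% Conventional cumulative logit for an ordered response Y_t with states 1..J:
\log \frac{P(Y_t \le j \mid x)}{P(Y_t > j \mid x)} = \alpha_j - \beta' x ,
\qquad j = 1, \dots, J - 1 .
% A single \beta forces a covariate to shift the odds of upward and downward
% moves symmetrically. A schematic asymmetric variant conditions on the
% previous state Y_{t-1} and uses separate coefficient vectors:
\log \frac{P(Y_t \le j \mid Y_{t-1}, x)}{P(Y_t > j \mid Y_{t-1}, x)}
  = \alpha_j - \beta_{\text{up}}' x \,[\, j \ge Y_{t-1} \,]
             - \beta_{\text{down}}' x \,[\, j < Y_{t-1} \,] .
```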

Journal ArticleDOI
TL;DR: A methodology by which judges themselves can be evaluated in a manner that describes the relative standards they use in the evaluation process, their consistency, and how well they compare to a hypothetical ideal judge is presented.
Abstract: There are many situations that exist in science and technology where the performance of a set of subjects that are in competition with one another is evaluated in some way by a set of evaluators who act as judges. This paper presents a methodology by which these judges themselves can be evaluated in a manner that describes the relative standards they use in the evaluation process, their consistency, and how well they compare to a hypothetical ideal judge. The methodology is computationally simple, can be justified in a theoretical manner, and is easy to apply across a wide range of problem types. It can also be used in a predictive manner to indicate how absent judges would have most likely evaluated subjects if they were able to make such an evaluation. A number of empirical examples which fully describe the methodology are discussed in this paper along with the results of applying it to a sizeable real-world problem. The methodology is applicable to a number of areas of physical and social sciences and can be extended, as presented in this manuscript, in a manner which can be applied to other diverse problems in mathematical sociology, computer engineering, and graph theory.