
Showing papers in "Quality & Quantity in 2006"


Journal ArticleDOI
TL;DR: This methods review shows that most challenges are resolved when taking into account the principles that guide the conduct of conventional surveys.
Abstract: The World Wide Web (WWW) is increasingly being used as a tool and platform for survey research. Two types of electronic or online surveys available for data collection are the email and Web based survey, and they constitute the focus of this paper. We address a multitude of issues researchers should consider before and during the use of this method of data collection: advantages and liabilities with this form of survey research, sampling problems, questionnaire design considerations, suggestions in approaching potential respondents, response rates and aspects of data processing. Where relevant, the methodological issues involved are illustrated with examples from our own research practice. This methods review shows that most challenges are resolved when taking into account the principles that guide the conduct of conventional surveys.

758 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a study of certain specific factors that impact the success of TQM implementation, based purely on secondary research, and point out that organizations are frequently exposed to factors that may cause their TQM efforts to stall or even fail.
Abstract: The past decade has seen many firms focusing on Total Quality Management (TQM) as a means of improving profits, market share and competitiveness. Although TQM is a proven approach for success in manufacturing, services and the public sector, several organizations have failed in their campaigns for many reasons, such as a lack of top management commitment or ignoring customers. This paper presents a study on certain specific factors that impact the success of TQM implementation, based purely on secondary research. The research points out that, though most organizations start TQM efforts in pursuit of success, they are frequently exposed to factors which may cause their TQM efforts to stall or even fail. TQM is often regarded as a sure way to reverse poor performance, but when it does not yield the expected results it is deemed a failure. The review identifies the common problems that lead to the failure of TQM implementation in organizations and points out the critical success factors of TQM. Overall, the results of this research imply that understanding the elements that cause TQM implementation to fail can provide needed help for companies involved in long-term continuous improvement efforts. If an advanced TQM approach is properly followed, it will help companies to achieve organizational excellence.

110 citations


Journal ArticleDOI
TL;DR: The Q methodology as mentioned in this paper is an innovation in factor theory that was marginalized by 20th century developments in psychometrics but has more recently experienced a revival in interest, and which is particularly suited to illuminating and clarifying perspectives of marginalized populations.
Abstract: Marginalization consists in not taking others into account on any number of valued outcomes, resulting in powerlessness, ignorance, poverty, illness, insecurity, and other manifestations of devaluation. Two illustrations are presented, including a brief summary of Q methodology, an innovation in factor theory that was marginalized by 20th century developments in psychometrics but has more recently experienced a revival in interest, and which is particularly suited to illuminating and clarifying perspectives, including those of marginalized populations. The procedures associated with Q methodology are demonstrated in terms of clarifying the goals of a Central American initiative to foster development. A second illustration focuses on the applicability of Q methodology in revealing marginalized tendencies within single individuals. A variety of other recent and current applications are briefly summarized.

96 citations


Journal ArticleDOI
TL;DR: In this article, the authors modifies two assumptions of the classical EOQ model to reflect real-life situations, such as the payment of an order is made on the receipt of items by the inventory system.
Abstract: This paper modifies two assumptions of the classical EOQ model to reflect real-life situations. First, the classical EOQ model assumes that all units produced or purchased are of good quality. Second, the payment of an order is made on the receipt of items by the inventory system. So, we incorporate both Goyal [Journal of the Operational Research Society 36: 335–338 (1985)] and Salameh and Jaber [International Journal of Production Economics 64: 59–64 (2000)] to develop a production/inventory model of the retailer to allow items with imperfect quality under permissible delay in payments. In addition, the objective function is modeled as an expected total annual profit maximization problem. Then, two theorems are developed to efficiently determine the optimal cycle time and the optimal order quantity for the retailer. Numerical examples are given to illustrate these theorems. Finally, we deduce some previously published results of other researchers as special cases.
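
The sketch below is not the authors' model; it only illustrates the kind of optimization the abstract describes. A simplified expected annual profit, with an assumed fraction of imperfect-quality items sold at a discount and with permissible delay in payments and screening time deliberately omitted, is maximized by a grid search over the order quantity; all parameter values are hypothetical.

```python
# Minimal sketch (not the authors' model): grid search for the order quantity
# that maximizes a simplified expected annual profit when a fraction p of each
# lot is of imperfect quality and is sold off at a discount.
import numpy as np

D, K, h = 50_000, 100.0, 5.0         # annual demand, ordering cost, holding cost/unit/year (assumed)
c, s, v, p = 25.0, 50.0, 20.0, 0.02  # unit cost, selling price, salvage price, defective fraction (assumed)

def expected_annual_profit(Q):
    cycles = D / ((1 - p) * Q)             # cycles per year; good units serve demand
    revenue = (s * (1 - p) * Q + v * p * Q) * cycles
    ordering = K * cycles
    purchase = c * Q * cycles
    holding = h * (1 - p) * Q / 2.0        # average inventory of good items (approximation)
    return revenue - ordering - purchase - holding

grid = np.arange(100, 10_000, 10)
Q_star = grid[np.argmax([expected_annual_profit(Q) for Q in grid])]
print(f"approximate optimal order quantity: {Q_star}")
```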

59 citations


Journal ArticleDOI
TL;DR: In this paper, the behavior of three coefficients of reliability among coders (Cohen's K, Krippendorff's α and Perreault and Leigh's Ir) is patterned in terms of the number of judges involved and the categories of answer defined.
Abstract: In the process of coding open-ended questions, the evaluation of interjudge reliability is a critical issue. In this paper, using real data, the behavior of three coefficients of reliability among coders, Cohen’s K, Krippendorff’s α and Perreault and Leigh’s Ir, is patterned in terms of the number of judges involved and the categories of answer defined. The outcome underlines the importance of both variables in the evaluation of interjudge reliability, as well as the higher adequacy of Perreault and Leigh’s Ir and Krippendorff’s α for marketing and opinion research.
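
As a pointer to what these coefficients compute, here is a minimal Python sketch for two coders: Cohen's kappa from observed and chance agreement, and Perreault and Leigh's Ir from the raw agreement rate and the number of categories. Krippendorff's alpha requires a coincidence-matrix computation and is omitted; the codings are hypothetical.

```python
# Minimal sketch, assuming two coders and nominal categories.
import numpy as np

codes_a = np.array([0, 1, 2, 1, 0, 2, 1, 1, 0, 2])  # hypothetical codings by judge A
codes_b = np.array([0, 1, 2, 0, 0, 2, 1, 2, 0, 2])  # hypothetical codings by judge B
k = 3                                               # number of answer categories

n = len(codes_a)
p_o = np.mean(codes_a == codes_b)                   # observed agreement rate

# chance agreement from the two coders' marginal distributions
p_a = np.bincount(codes_a, minlength=k) / n
p_b = np.bincount(codes_b, minlength=k) / n
p_e = float(np.sum(p_a * p_b))

kappa = (p_o - p_e) / (1 - p_e)
i_r = np.sqrt((p_o - 1.0 / k) * k / (k - 1)) if p_o >= 1.0 / k else 0.0

print(f"observed agreement = {p_o:.2f}, kappa = {kappa:.2f}, Ir = {i_r:.2f}")
```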

59 citations


Journal ArticleDOI
TL;DR: In this article, two major methodological risk factors in interpretive research are identified as horns of a dilemma, and a practical solution is offered; the focus is on subjective social construct formation processes (such as business strategy formation).
Abstract: Methodological challenges in researching subjective social construct formation processes (such as business strategy formation) are described. A solution is derived step by step, based on classic contributions to the literature and first principles. Two major methodological risk factors in interpretive research are identified as horns of a dilemma, and a practical solution is offered. In the process, a basis is offered for carrying out interpretive research in which meaningful data coding can occur across multiple case studies. The methodology developed for the author's own research is used as an illustration.

52 citations


Journal ArticleDOI
TL;DR: In this paper, the authors evaluate the suitability of pooled time series (TSCS) cross-section models for modeling welfare state evolution and conclude that the single equation error correction model is the best pooled TSCS model for modeling long-run effects even in the presence of nonstationary processes.
Abstract: In recent years, an impressive number of pooled time series (TSCS) cross-section models have been estimated in order to test hypotheses on welfare state development. Although most of these models share several of the variables, they can often be distinguished by the model specification adopted. This begs the question: what is the appropriate specification for modeling welfare state development? In order to answer this question some leading specifications are evaluated with respect to their ability to meet the theoretical assumptions about the theory of welfare state evolution in addition to the econometric canons on panel analysis. The main conclusions of this paper are the following. First, all specifications in levels are econometrically unfounded because most of the variables typically used for analyzing this topic cannot be considered to be stationary. Second, although a first difference model performs better from an econometric point of view, it is unable to test the hypothesized long-term relationships underlying welfare state dynamics. Third, and more importantly, the single equation error correction model represents the best pooled TSCS specification for modeling welfare state development since it is able to capture long-run effects even in the presence of nonstationary processes.
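
A minimal sketch of a single-equation error correction model of the kind the abstract favors, fit by OLS on simulated pooled time-series cross-section data. The variable names and the data-generating process are illustrative, not taken from the paper; the long-run effect is recovered from the coefficients on the lagged levels.

```python
# Minimal sketch of a single-equation error-correction model (ECM) for pooled
# time-series cross-section data, fit by OLS on simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
units, periods = 15, 30
rows = []
for i in range(units):
    x, y = 0.0, 0.0
    for t in range(periods):
        x += rng.normal()                        # nonstationary regressor (random walk)
        y += 0.5 * (0.8 * x - y) + rng.normal()  # y error-corrects toward 0.8 * x
        rows.append({"unit": i, "t": t, "y": y, "x": x})
df = pd.DataFrame(rows).sort_values(["unit", "t"])

# first differences and lagged levels within each unit
df["dy"] = df.groupby("unit")["y"].diff()
df["dx"] = df.groupby("unit")["x"].diff()
df["y_lag"] = df.groupby("unit")["y"].shift(1)
df["x_lag"] = df.groupby("unit")["x"].shift(1)

ecm = smf.ols("dy ~ dx + y_lag + x_lag", data=df.dropna()).fit()
print(ecm.params)
# long-run effect of x on y is recovered as -coef(x_lag) / coef(y_lag)
print("long-run multiplier:", -ecm.params["x_lag"] / ecm.params["y_lag"])
```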

49 citations


Journal ArticleDOI
TL;DR: It is concluded that certain varieties of MM are potentially more robust than others in fulfilling the diverse and severe criteria and that MM must concern itself with formulating “procedural rules” which guide the researcher in choosing and applying appropriate strategies for specific research problems.
Abstract: The article explores some of the emerging issues in the newly developing area of Mixed Methods (MM) research. Two of these issues concern the possibility of whether MM can provide for both “diverse” and “severe” testing. Based on a model of Placeholder Effects and utilizing an example of current empirical research, it is concluded that certain varieties of MM are potentially more robust than others in fulfilling the diverse and severe criteria. It is also argued that MM must concern itself with formulating “procedural rules” which guide the researcher in choosing and applying appropriate strategies for specific research problems.

46 citations


Journal ArticleDOI
TL;DR: In this paper, the authors deploy the unique opportunity of a dataset of Flemish school leavers to measure the incidence of over- and undereducation on the basis of the six applied measures in the literature.
Abstract: We deploy the unique opportunity of a dataset of Flemish school leavers to measure the incidence of over- and undereducation on the basis of the six measures applied in the literature. The incidence of overeducation in the first job after leaving school ranges from only 8% to 51%, while undereducation ranges from 3% to 21%. While 66% are overeducated on the basis of at least one measure, only 3% are overeducated on the basis of every measure. Mismatch correlations range from 5% to 82%. The categories (in terms of gender, educational level and region of residence) with the highest likelihood of being overeducated also depend on the measure. These findings clearly underline the weakness of the literature on this subject. However, measuring overeducation in different ways enables us to derive some alternative concepts. Genuine overeducation amounts to about 20%. The incidence of over- and undereducation is attributed both to qualification inflation and deflation and to a credential gap. Finally, about 80% of the incidence of overeducation is classified as being structural.
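
The sketch below is only a gloss on why different measures can disagree: it contrasts two commonly used overeducation measures (a realized-matches rule and a stated job-requirement rule) on simulated school-leaver data. The rules, thresholds and data are assumptions for illustration, not the six measures used in the paper.

```python
# Minimal sketch contrasting two overeducation measures on simulated data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
n = 2000
df = pd.DataFrame({
    "occupation": rng.integers(0, 20, n),
    "schooling": rng.integers(10, 19, n),                     # years of schooling
})
df["required"] = df["occupation"].map(lambda o: 10 + o % 8)   # stated requirement (hypothetical)

occ_stats = df.groupby("occupation")["schooling"].agg(["mean", "std"])
df = df.join(occ_stats, on="occupation")
df["over_realized"] = df["schooling"] > df["mean"] + df["std"]   # realized-matches rule
df["over_required"] = df["schooling"] > df["required"]           # job-requirement rule

print("incidence (realized matches):", f"{df['over_realized'].mean():.0%}")
print("incidence (job requirement): ", f"{df['over_required'].mean():.0%}")
print("agreement between measures:  ", f"{(df['over_realized'] == df['over_required']).mean():.0%}")
```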

45 citations


Journal ArticleDOI
TL;DR: A flexible geo-additive Bayesian survival model that controls, simultaneously, for spatial dependence and possible nonlinear or time-varying effects of other variables is described, based on recently developed Markov Chain Monte Carlo techniques.
Abstract: We describe a flexible geo-additive Bayesian survival model that controls, simultaneously, for spatial dependence and possible nonlinear or time-varying effects of other variables. Inference is fully Bayesian and is based on recently developed Markov Chain Monte Carlo techniques. In illustrating the model we introduce a spatial dimension in modelling under-five mortality among Malawian children using data from Malawi Demographic and Health Survey of 2000. The results show that district-level socioeconomic characteristics are important determinants of childhood mortality. More importantly, a separate spatial process produces district clustering of childhood mortality indicating the importance of spatial effects. The visual nature of the maps presented in this paper highlights relationships that would, otherwise, be overlooked in standard methods.

45 citations


Journal ArticleDOI
TL;DR: In this paper, the authors develop the economic design of the variable sampling intervals (VSI) exponentially weighted moving average (EWMA) chart to determine the values of the six test parameters of the chart (i.e., the sample size, the long sampling interval, the short sampling interval, the warning limit coefficient, the control limit coefficient, and the exponential weight constant) such that the expected total cost is minimized.
Abstract: Control charting is a graphical expression and operation of statistical hypothesis testing. In the present paper, we develop the economic design of the variable sampling intervals (VSI) exponentially weighted moving average (EWMA) charts to determine the values of the six test parameters of the charts (i.e., the sample size, the long sampling interval, the short sampling interval, the warning limit coefficient, the control limit coefficient, and exponential weight constant) such that the expected total cost is minimized. The genetic algorithm (GA) is applied to search for the optimal values of the six test parameters of the VSI EWMA chart, and an example is provided to illustrate the solution procedure. A sensitivity analysis is then carried out to investigate the effects of model parameters on the solution of the economic design.
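
The following sketch shows only how a VSI EWMA chart operates once its six parameters have been fixed; the economic design and the genetic algorithm search described in the abstract are not reproduced. All parameter values and the simulated process shift are assumptions.

```python
# Minimal sketch of VSI EWMA chart operation: the EWMA statistic is updated per
# sample, the control limit signals, and the warning limit switches between the
# long and the short sampling interval.
import numpy as np

rng = np.random.default_rng(1)
n, lam = 4, 0.2                  # sample size, EWMA weight constant
h_long, h_short = 4.0, 1.0       # long / short sampling intervals (hours)
w, L = 1.0, 3.0                  # warning and control limit coefficients
mu0, sigma = 0.0, 1.0            # in-control mean and standard deviation

sigma_z = sigma / np.sqrt(n) * np.sqrt(lam / (2 - lam))  # asymptotic EWMA std. dev.
z, clock = mu0, 0.0
for i in range(200):
    xbar = rng.normal(mu0 + (0.5 if i > 100 else 0.0), sigma / np.sqrt(n))
    z = lam * xbar + (1 - lam) * z
    if abs(z - mu0) > L * sigma_z:
        print(f"out-of-control signal at time {clock:.1f} h (sample {i})")
        break
    # variable sampling interval: sample sooner when z is in the warning region
    clock += h_short if abs(z - mu0) > w * sigma_z else h_long
else:
    print("no out-of-control signal within 200 samples")
```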

Journal ArticleDOI
TL;DR: This paper found that respondents with more favorable evaluations of surveys had lower values on all kinds of nonresponse indicators; except for the strong effect on the prevalence of don't knows, survey attitudes were increasingly more predictive of all other aspects of nonresponse when the attitude answers were given faster and were thus cognitively more accessible.
Abstract: This paper analyzes whether respondents' attitudes toward surveys explain their susceptibility to item nonresponse. In contrast to previous studies, the decision to refuse to provide income information, the decision not to answer other questions and the probability of "don't know" responses are tested separately. Furthermore, the interviewers' overall judgments of response willingness were included as well. Respondents with a positive and cognitively accessible attitude toward surveys were expected to adopt a cooperative orientation and were thus deemed more likely to answer difficult as well as sensitive questions. Attitudes were measured with a 16-item instrument and response latencies were used as an indicator of attitude accessibility. We found that respondents with more favorable evaluations of surveys had lower values on all kinds of nonresponse indicators. Except for the strong effect on the prevalence of don't knows, survey attitudes were increasingly more predictive of all other aspects of nonresponse when the attitude answers were given faster and were thus cognitively more accessible. This accessibility, and thus how relevant survey attitudes are for nonresponse, was found to increase with the subjects' exposure to surveys in the past.

Journal ArticleDOI
TL;DR: The authors discusses consequences of violating the normal distribution assumption imbedded in Structural Equation Modeling (SEM) based on real data from a large sample customer satisfaction survey and discusses its impact on decision making in marketing.
Abstract: This paper discusses consequences of violating the normal distribution assumption imbedded in Structural Equation Modeling (SEM). Based on real data from a large sample customer satisfaction survey we follow the procedures as suggested in leading textbooks. We document consequences of this practice and discuss its impact on decision making in marketing.

Journal ArticleDOI
TL;DR: A statistical method using the relationship between GR&R and process capability indices is proposed for evaluating the adequacy of the acceptance criteria for the P/T ratio.
Abstract: Measurement plays a significant role in a Six Sigma program. Usually, a gauge repeatability and reproducibility (GR&R) study needs to be conducted prior to the process capability analysis to verify the accuracy of measuring equipment and to help organizations improve their product and service quality. Therefore, how to ensure the quality of measurement becomes an important task for quality practitioners. In performing the GR&R study, most industries use the acceptance criteria for the Precision-to-Tolerance (P/T) ratio as stipulated by QS9000. However, the adequacy of applying the same acceptance criteria to different manufacturing processes is very questionable. In this paper, a statistical method using the relationship between GR&R and process capability indices is proposed for evaluating the adequacy of the acceptance criteria for the P/T ratio. Finally, a comparative analysis has also been performed to evaluate the accuracy of GR&R among three methods (ANOVA, Classical GR&R, and Long Form). Hopefully, the results of this research can provide a useful reference for quality practitioners in various industries.
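
For orientation, a minimal sketch of the quantities such a GR&R study reports, the precision-to-tolerance (P/T) ratio and %GR&R, computed from assumed gauge and process standard deviations. The 6-sigma spread convention is used here (some references use 5.15 instead), and the thresholds quoted in the comment are the usual rules of thumb, not the paper's proposed criteria.

```python
# Minimal sketch of P/T ratio, %GR&R and Cp from assumed values.
usl, lsl = 10.6, 9.4          # specification limits (assumed)
sigma_gauge = 0.04            # repeatability + reproducibility std. dev. (assumed)
sigma_total = 0.15            # total observed process std. dev. (assumed)

pt_ratio = 6 * sigma_gauge / (usl - lsl)
pct_grr = sigma_gauge / sigma_total
cp = (usl - lsl) / (6 * sigma_total)

print(f"P/T = {pt_ratio:.2%}, %GR&R = {pct_grr:.2%}, Cp = {cp:.2f}")
# common rule of thumb: P/T below 10% acceptable, 10-30% marginal, above 30% inadequate
```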

Journal ArticleDOI
TL;DR: In this paper, an interaction analysis was used to analyze a total of 14,265 question-answer sequences (Q-A sequences) for 80 questions that originated from two face-to-face and three telephone surveys.
Abstract: Interaction analysis was used to analyze a total of 14,265 question–answer sequences (Q-A sequences) for 80 questions that originated from two face-to-face and three telephone surveys. The analysis was directed towards the causes and effects of particular interactional problems. Our results showed that problematic respondent behavior is affected by the questionnaire design, whereas inadequate interviewer behavior is affected by respondent behavior, rather than directly by the questionnaire design. Two surveys used questions for which validating information was available. It appeared that the occurrence of such irregularities of interviewer and respondent behavior was related to the validity of the eventual responses. Explanations for the occurrence of problematic respondent behavior were proposed, concerning both cognitive and conversational factors, related to the wording of questions and response alternatives.

Journal ArticleDOI
TL;DR: In this paper, a structural equation model was used to determine the factors that affect student performance in the statistics course in the Faculty of Psychology at the University of Barcelona.
Abstract: Many studies have examined the factors that influence academic performance in primary and secondary education as well as at university, with the purpose of enhancing learning at these stages and reducing drop-out rates. It is within this research framework that we want to emphasise the deficient performance of students enrolled on the statistics course in the Faculty of Psychology at the University of Barcelona. Consequently, this paper attempts to determine the factors that affect student performance in this subject by undertaking an analysis of a structural equation model and determining its stability over time. In order to accomplish our objective, we worked with two samples of students enrolled in statistics classes. The first group comprised 211 students enrolled in the academic year 2000–2001, while the second comprised 287 students enrolled in the academic year 2001–2002. By administering a questionnaire, we obtained information concerning such variables as demographic data, previous academic record, information related to the subject and the degree of satisfaction with it, and the final mark obtained by the students in the subject. The parameters for each group of students were estimated separately and the goodness of fit of the proposed structural model was assessed. The data analysis showed a good fit with both databases, but the set of estimated parameters differed in the two academic years under consideration.

Journal ArticleDOI
TL;DR: This article explored respondents' motives for failing to reveal earnings using the British Household Panel Study (BHPS) and demonstrated the importance of distinguishing between a refusal to state income and a 'don't know' answer.
Abstract: Many validation studies deal with item nonresponse and measurement error in earnings data. In this paper, we explore respondents' motives for failing to reveal earnings using the British Household Panel Study (BHPS). The BHPS collects socio-economic information on private households in Great Britain. We explain the evolution of income nonresponse in the BHPS and demonstrate the importance of distinguishing between a refusal to state income and a 'don't know' answer.

Journal ArticleDOI
TL;DR: In this article, the authors present a survey method for studying social representations, which creatively utilizes the elicitation and elaboration of free associations on a given topic in order to shed light on the semantic field and cognitive organization of a given social representation.
Abstract: The current article is concerned with the presentation of a novel method for studying social representations. Social representations are a key concept within social science but, as with many other social phenomena, notoriously difficult to grasp and study in a systematic way. The survey method presented, though a formal one, does not depend on pre-specified and pre-graded answers as most questionnaires do. Instead it creatively utilizes the elicitation and elaboration of free associations on a given topic in order to shed light on the semantic field and cognitive organization of a given social representation. As a necessary complement to such a methodological exposition the article also presents some of the theoretical background that informs the method as well as the tradition of studying social representations in general. In the course of the article a research design for the method is presented accompanied by a complete script of the survey instrument. This is followed by a discussion of the merits of the method as well as some suggestions of analytical approaches that can be applied to the results. The article concludes with a short discussion of potential areas of application beyond the field of social representations.

Journal ArticleDOI
TL;DR: A mathematical model is developed to derive the optimal periodical preventive maintenance policy for a leased facility with Weibull life-time such that the expected total maintenance cost is minimized.
Abstract: This paper develops a mathematical model to derive the optimal periodical preventive maintenance (PM) policy for a leased facility with Weibull life-time. Within a lease period, any failures of the facility are rectified by minimal repairs and a penalty may occur to the lessor when the time required to perform a minimal repair exceeds a reasonable time limit. To reduce failures of the facility, additional PM actions are carried out periodically during the lease period. When the life-time distribution of a product is Weibull, the optimal number of PM actions and the corresponding maintenance degrees are derived such that the expected total maintenance cost is minimized. The structural properties of the optimal policy are investigated and an efficient algorithm is provided to search for the optimal policy. Finally, numerical examples are provided to illustrate the features of the proposed model.
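
The sketch below is a simplified stand-in, not the authors' model: it evaluates the expected maintenance cost over a lease for a Weibull facility under minimal repair when each of N equally spaced PM actions is assumed to restore the facility to as-good-as-new. The paper's partial-restoration (maintenance degree) decision and the repair-time penalty are omitted, and all costs are hypothetical.

```python
# Minimal sketch: expected lease-period maintenance cost under minimal repair
# with N equally spaced, perfect PM actions (a simplification of the paper's model).
import numpy as np

beta, eta = 2.5, 1.0        # Weibull shape and scale (assumed)
L = 5.0                     # lease period in years (assumed)
c_pm, c_mr = 200.0, 500.0   # cost per PM action, cost per minimal repair (assumed)

def expected_cost(n_pm):
    segment = L / (n_pm + 1)                          # time between renewals
    failures = (n_pm + 1) * (segment / eta) ** beta   # cumulative hazard per segment
    return n_pm * c_pm + c_mr * failures

costs = {n: expected_cost(n) for n in range(0, 20)}
n_star = min(costs, key=costs.get)
print(f"optimal number of PM actions (under these assumptions): {n_star}")
```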

Journal ArticleDOI
TL;DR: In this article, the authors investigated how much importance is given to mathematics and its educational values in 6th and 7th grade primary school mathematics textbooks in Turkey, and found that the values of rationalism, control and openness among the mathematical values are emphasized more than the complementary pairs of formalistic view, theoretical knowledge, instrumental understanding, accessibility and evaluation.
Abstract: Mathematics is usually seen as a field in which there are no values. As a result, only a few studies on the teaching of values have been carried out in mathematics education. Mathematics is, however, a field that embodies various values, and it must be considered seriously from this point of view. Compared with other subjects, values are taught implicitly rather than explicitly in mathematics classes, and the same situation can be seen in textbooks of other subjects. In this paper, therefore, we investigate how much importance is given to mathematics and its educational values in 6th and 7th grade primary school mathematics textbooks in Turkey. For this purpose, a total of eight 6th and 7th grade mathematics textbooks, chosen at random, were analysed using semantic content analysis. The analysis showed that, among the mathematical values, rationalism, control and openness are emphasized more than the complementary pairs of formalistic view, theoretical knowledge, instrumental understanding, accessibility and evaluation in both the 6th and 7th grade mathematics textbooks.

Journal ArticleDOI
TL;DR: In this article, the effect of sunn pest damage on wheat price in Turkey is analyzed by using hedonic price function, and the results show that sunn pests damage is the most effective factor and causes a significant decrease in wheat prices.
Abstract: Wheat is a very strategic crop for Turkey as well as many other countries. Sunn pest is one of the most important pests of cereals particularly for wheat and barley in Turkey. Turkish governments have conducted sunn pest management (SPM) program, mainly based on chemical control since 1927. Neither farmers nor technical consultants have been satisfied with the SPM program conducted by the government. Therefore, the government purpose is to transfer SPM program to the farmers by providing technical information and equipments gradually. In this paper, effect of sunn pest damage on wheat price in Turkey is analyzed by using hedonic price function. The results show that sunn pest damage is the most effective factor and causes a significant decrease in wheat prices.
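
A minimal sketch of a hedonic price regression of the kind the abstract describes: the (log) wheat price is regressed on quality attributes, including the share of sunn-pest-damaged kernels. The variables, functional form and simulated data are illustrative assumptions, not the paper's specification or data.

```python
# Minimal sketch of a hedonic price regression on simulated wheat-lot data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 400
df = pd.DataFrame({
    "sunn_damage": rng.uniform(0, 0.15, n),   # share of damaged kernels (hypothetical)
    "protein": rng.normal(12, 1.5, n),        # protein content (%)
    "foreign_matter": rng.uniform(0, 0.05, n),
})
# hypothetical data-generating process: damage depresses price strongly
df["log_price"] = (5.0 - 2.5 * df["sunn_damage"] + 0.03 * df["protein"]
                   - 0.8 * df["foreign_matter"] + rng.normal(0, 0.05, n))

hedonic = smf.ols("log_price ~ sunn_damage + protein + foreign_matter", data=df).fit()
# the sunn_damage coefficient is the implicit (hedonic) price of damage
print(hedonic.params)
```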

Journal ArticleDOI
TL;DR: In this article, the authors discuss the role of the normal distribution in psychological research and practice and show how it can be dangerous to treat the bell curve as a God or an Idol.
Abstract: The expression “the bell curve” designates both a kind of statistical distribution and the title of a famous and controversial book by Herrnstein and Murray. The first is so attractive that the second refers to it to give more credibility to its questionable theories on intelligence. The point is that, during the 20th century, the bell curve has assumed a more and more important role in psychological research and practice and has become both a reality and a myth. In the first case (reality) we see appropriate applications of a genuinely useful statistical concept. In the second (myth) we find two kinds of attitudes: one is typical of those researchers who search for normality in all their data and variables, just as Parsifal searched for the Holy Grail (we call this “the Parsifal attitude”); the other is typical of those researchers who take normality for granted and act as if it were a Platonic Idea (we call this “the Plato attitude”). The article discusses the role of the normal distribution in psychological research and practice and shows how it can be dangerous to treat the bell curve as a God or an Idol.

Journal ArticleDOI
TL;DR: As part of a longitudinal mixed-method study on low back pain (LBP), free-text comments were invited at the end of the 12 month follow-up survey questionnaire and provided a wealth of material so that comparisons could be made with survey and interview results.
Abstract: As part of a longitudinal mixed-method study on low back pain (LBP), free-text comments were invited at the end of the 12 month follow-up survey questionnaire. 80% of respondents used this option and provided a wealth of material so that comparisons could be made with survey and interview results. People reported pain in other parts of the body apart from LBP, and illustrated the impact of LBP on physical ability and work, psychological well-being and social activities. Free-text material offers valuable insights that strengthen the findings of the survey, and the study as a whole.

Journal ArticleDOI
A. Akin Aksu
TL;DR: In this paper, the author argues that the customer loyalty topic has changed the way establishments look at their customers: establishments now try to satisfy customers and make them loyal, and meeting their expectations and being different from rivals have become important.
Abstract: Today everybody knows that, in the balance of the establishment–customer relationship, customers have gained a big advantage. In order to maintain long-term relationships with customers, understanding them, meeting their expectations and being different from rivals are important. This is especially vital for establishments offering similar goods and services. To gain an advantage and be unique, it is necessary to offer suitable goods and services and to meet, or even exceed, the expectations and desires of customers. The customer loyalty topic has changed the way establishments look at their customers: establishments now try to satisfy customers and make them loyal. Loyal customers bring both financial and non-material benefits to establishments. It is generally known that, in the tourism sector in particular, there is little research on customer loyalty. In this context, the author believes that this research will make a positive contribution to the related literature.

Journal ArticleDOI
TL;DR: This article finds that respondents translate similar attitudes differently into the answering options of forbid/allow questions: the way an attitude is expressed on the answering scale differs depending on whether 'forbid' or 'allow' is used in the question.
Abstract: Survey questions worded with the verb ‘forbid’ prove not to elicit opposite answers to equivalent questions worded with the verb ‘allow’ (Rugg 1941). Although ‘forbid’ and ‘allow’ are generally considered each other’s counterparts, respondents rather answer ‘no, not forbid’ than ‘yes, allow’. In order to find out which question is a more valid measure of the underlying attitude, this asymmetry in the answers has to be explained. Experiments show that the asymmetry arises because respondents translate similar attitudes differently into the answering options to forbid/allow questions: the underlying attitudes are equally valid, but the way the attitudes are expressed on the answering scale differs due to the use of ‘forbid’ or ‘allow’. How does this translation process work? The leading hypothesis in forbid/allow research predicts that respondents holding moderate opinions feel that ‘yes forbid’ and ‘yes allow’ are very extreme, causing moderate respondents to prefer answering ‘not forbid’ or ‘not allow’. This article presents the results of 10 experiments investigating the meanings of the answering options to forbid/allow questions. Extreme connotations are shown to provide only part of the explanation for the occurrence of the forbid/allow asymmetry. In order to describe the answering process for forbid/allow questions, well-definedness of meanings proves to be an important additional factor. The meanings of answering options to allow questions are ill-defined compared to those of forbid questions, which causes allow questions to be less homogeneous measures of the underlying attitude than forbid questions.

Journal ArticleDOI
TL;DR: The blocked-error regression R2 (beR2) as mentioned in this paper uses a minimal hypothetical causal intervention to resolve the variance-partitioning ambiguities created by loops and correlated errors.
Abstract: Bentler and Raykov (2000, Journal of Applied Psychology 85: 125–131), and Joreskog (1999a, http://www.ssicentral.com/lisrel/column3.htm; 1999b, http://www.ssicentral.com/lisrel/column5.htm) proposed procedures for calculating R2 for dependent variables involved in loops or possessing correlated errors. This article demonstrates that Bentler and Raykov’s procedure cannot be routinely interpreted as a “proportion” of explained variance, while Joreskog’s reduced-form calculation is unnecessarily restrictive. The new blocked-error-R2 (beR2) uses a minimal hypothetical causal intervention to resolve the variance-partitioning ambiguities created by loops and correlated errors. Hayduk (1996) discussed how stabilising feedback models – models capable of counteracting external perturbations – can result in an acceptable error variance which exceeds the variance of the dependent variable to which that error is attached. For variables included within loops, whether stabilising or not, beR2 provides the same value as Hayduk’s (1996) loop-adjusted-R2. For variables not involved in loops and not displaying correlated residuals, beR2 reports the same value as the traditional regression R2. Thus, beR2 provides a conceptualisation of the proportion of explained variance that spans both recursive and nonrecursive structural equation models. A procedure for calculating beR2 in any SEM program is provided.

Journal ArticleDOI
TL;DR: This paper suggests an alternative imputation procedure for incomplete data for which no true score exists: multiple complete random imputation, which overcomes the biasing effects of missing data and allows analysts to model respondents’ valid ‘I don’t know’ answers.
Abstract: Incomplete data is a common problem in survey research. Recent work on multiple imputation techniques has increased analysts' awareness of the biasing effects of missing data and has also provided a convenient solution. Imputation methods replace non-response with estimates of the unobserved scores. In many instances, however, non-response to a stimulus does not result from measurement problems that inhibit accurate surveying of empirical reality, but from the inapplicability of the survey question. In such cases, existing imputation techniques replace valid non-response with counterfactual estimates of a situation in which the stimulus is applicable to all respondents. This paper suggests an alternative imputation procedure for incomplete data for which no true score exists: multiple complete random imputation, which overcomes the biasing effects of missing data and allows analysts to model respondents' valid 'I don't know' answers.
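
A minimal sketch of one plausible reading of multiple complete random imputation (not necessarily the author's exact procedure): valid 'don't know' answers are replaced by purely random draws over the response scale, several completed datasets are analysed, and the estimates are pooled.

```python
# Minimal sketch of multiple complete random imputation on a 5-point item.
import numpy as np

rng = np.random.default_rng(3)
scale = np.arange(1, 6)                        # 5-point response scale
y = rng.choice(scale, size=200).astype(float)
y[rng.random(200) < 0.2] = np.nan              # 20% valid "don't know" answers

M, means = 10, []
for _ in range(M):
    completed = y.copy()
    miss = np.isnan(completed)
    completed[miss] = rng.choice(scale, size=miss.sum())  # complete random draws
    means.append(completed.mean())

print(f"pooled mean over {M} imputations: {np.mean(means):.2f} "
      f"(between-imputation sd: {np.std(means):.3f})")
```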

Journal ArticleDOI
TL;DR: In this paper, the distribution of the ratio X/Y is derived when X and Y are independent Frechet random variables and extensive tabulations of the associated percentage points are also given.
Abstract: The distribution of the ratio X/Y is derived when X and Y are independent Frechet random variables. Extensive tabulations of the associated percentage points are also given.
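
A minimal Monte Carlo sketch that approximates such percentage points numerically, useful as a check against tabulated values. scipy's invweibull is the Frechet distribution; the shape parameters chosen here are assumptions.

```python
# Minimal sketch: Monte Carlo percentage points of X/Y for independent Frechet variables.
import numpy as np
from scipy.stats import invweibull

rng = np.random.default_rng(4)
alpha_x, alpha_y = 3.0, 2.0          # Frechet shape parameters (assumed)
x = invweibull.rvs(alpha_x, size=1_000_000, random_state=rng)
y = invweibull.rvs(alpha_y, size=1_000_000, random_state=rng)
ratio = x / y

for q in (0.90, 0.95, 0.99):
    print(f"{q:.0%} point of X/Y: {np.quantile(ratio, q):.3f}")
```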

Journal ArticleDOI
TL;DR: In this paper, a computer simulation study was conducted to compare the effect of using discriminant logistic analysis (DLA) and multinomial logistic regression (MLR), applying either an iterative test purification procedure or a non-iterative one, to detect nonuniform polytomous item DIF.
Abstract: This study focused on the effectiveness of nonuniform polytomous item DIF detection using Discriminant Logistic Analysis (DLA) and Multinomial Logistic Regression (MLR). A computer simulation study was conducted to compare the effect of using DLA and MLR, applying either an iterative test purification procedure or a non-iterative one, to detect nonuniform DIF. The conditions under study were: DIF effect size (0.5, 1.0 and 1.5), sample size (500 and 1000), percentage of DIF items in the test (0, 10 and 20%) and DIF type (nonuniform). The results suggest that DLA is more accurate than MLR in detecting DIF. However, the purification process only improved the correct detection rate when MLR was applied. The false positive rates for both procedures were similar. Moreover, when the test purification procedure was used, the proportion of non-DIF items that were detected as DIF decreased for both procedures, although the false positive rates were smaller for DLA than for MLR.
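
A minimal sketch of the regression-based side of such a DIF analysis: a polytomous item response is modelled with multinomial logistic regression from the rest score, group membership and their interaction, and a likelihood-ratio test on the interaction term flags nonuniform DIF. The data are simulated and the setup is illustrative; the paper's DLA procedure and purification steps are not reproduced.

```python
# Minimal sketch of nonuniform DIF screening via multinomial logistic regression.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2

rng = np.random.default_rng(5)
n = 1000
group = rng.integers(0, 2, n)                  # focal vs. reference group
rest = rng.normal(0, 1, n)                     # rest score (ability proxy)
# item with 3 ordered categories; slope differs by group -> nonuniform DIF
latent = (1.0 + 0.6 * group) * rest + rng.logistic(size=n)
item = np.digitize(latent, [-0.5, 1.0])        # responses 0, 1, 2

df = pd.DataFrame({"item": item, "rest": rest, "group": group})
df["rest_x_group"] = df["rest"] * df["group"]

X_full = sm.add_constant(df[["rest", "group", "rest_x_group"]])
X_red = sm.add_constant(df[["rest", "group"]])
full = sm.MNLogit(df["item"], X_full).fit(disp=0)
reduced = sm.MNLogit(df["item"], X_red).fit(disp=0)

# likelihood-ratio test of the score-by-group interaction (nonuniform DIF)
lr = 2 * (full.llf - reduced.llf)
dof = full.params.size - reduced.params.size
print(f"LR = {lr:.2f}, df = {dof}, p = {chi2.sf(lr, dof):.4f}")
```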

Journal ArticleDOI
TL;DR: It is maintained that log-linear topological models, especially in their multi-matrix variant, are extremely useful in integrating sociological theory with empirical quantitative analysis, and shown that the principal shortcoming of these models is that they only partially allow the accurate modeling of the generative mechanisms underlying all the empirical regularities observed in aggregate data.
Abstract: Among techniques for the quantitative analysis of categorical data, log-linear models at present occupy a central place in social statistics, their sophistication and complexity having rapidly evolved over the past three decades. The article examines a specific variant of this approach to modeling which consists of log-linear topological models. It starts from the debate which followed the introduction of the latter at the end of the 1970s to offer a new evaluation of the heuristic and methodological utility of this technique in light of recent discussion more generally concerned with the quantitative variables-based approach. In this regard, the article puts forward two arguments. It first maintains that log-linear topological models, especially in their multi-matrix variant, are extremely useful in integrating sociological theory with empirical quantitative analysis. It then shows that the principal shortcoming of these models is that they only partially allow the accurate modeling of the generative mechanisms underlying all the empirical regularities observed in aggregate data. These models are thus very attractive in that they go beyond the descriptive level of numerous works in quantitative sociology, and yet they are incapable of yielding explanations founded on the notion of generative mechanisms. In order not to remain at the abstract level of epistemological reflection, the article will attempt to show the well-foundedness of this thesis by constructing a multi-matrix log-linear topological model for the analysis of a contingency table which cross-classifies social origin with educational qualification. The model is then tested against French survey data. To the extent that this model attempts to express ideas drawn from a specific theoretical approach – that of ‘rational educational choice’ – the analysis can contribute to both the study and understanding of inequalities in educational opportunity.
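
A minimal sketch of a log-linear topological (levels) model for an origin-by-education contingency table: the full origin-education interaction is replaced by a small matrix of interaction levels fitted as a Poisson log-linear model. The counts and the level matrix are purely illustrative, not the French survey data or the paper's multi-matrix specification.

```python
# Minimal sketch of a log-linear topological (levels) model on a hypothetical 3x3 table.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
import statsmodels.api as sm

counts = np.array([[200,  90,  30],     # hypothetical table:
                   [120, 150,  70],     # rows = social origin,
                   [ 40, 100, 180]])    # columns = educational qualification
levels = np.array([[1, 2, 3],           # topological matrix: cells sharing a number
                   [2, 1, 2],           # share one interaction parameter
                   [3, 2, 1]])

rows, cols = np.indices(counts.shape)
df = pd.DataFrame({"n": counts.ravel(), "origin": rows.ravel(),
                   "educ": cols.ravel(), "level": levels.ravel()})

topo = smf.glm("n ~ C(origin) + C(educ) + C(level)", data=df,
               family=sm.families.Poisson()).fit()
print(topo.params)
print("deviance:", round(topo.deviance, 2), "df_resid:", int(topo.df_resid))
```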