
Showing papers in "Decision Sciences in 2001"


Journal ArticleDOI
TL;DR: Results of the study highlight several plausible limitations of TAM and TPB in explaining or predicting technology acceptance by individual professionals and suggest that instruments that have been developed and repeatedly tested in previous studies involving end users and business managers in ordinary business settings may not be equally valid in a professional setting.
Abstract: The proliferation of innovative and exciting information technology applications that target individual “professionals” has made the examination or re-examination of existing technology acceptance theories and models in a “professional” setting increasingly important. The current research represents a conceptual replication of several previous model comparison studies. The particular models under investigation are the Technology Acceptance Model (TAM), the Theory of Planned Behavior (TPB), and a decomposed TPB model, potentially adequate in the targeted healthcare professional setting. These models are empirically examined and compared, using the responses to a survey on telemedicine technology acceptance collected from more than 400 physicians practicing in public tertiary hospitals in Hong Kong. Results of the study highlight several plausible limitations of TAM and TPB in explaining or predicting technology acceptance by individual professionals. In addition, findings from the study also suggest that instruments that have been developed and repeatedly tested in previous studies involving end users and business managers in ordinary business settings may not be equally valid in a professional setting. Several implications for technology acceptance/adoption research and technology management practices are discussed.

1,386 citations


Journal ArticleDOI
TL;DR: It is suggested that teams make more effective decisions than individuals, and virtual teams make the most effective decisions.
Abstract: A total of 411 subjects participated in two decision-making experiments in order to examine the effectiveness of new product development project continuation decisions. Using escalation of commitment theory, in Study 1, individual versus face-to-face team decision-making effectiveness was compared. Study 2, an extension of Study 1, compared the new product development decision-making effectiveness of individuals, face-to-face teams, and virtual teams. A virtual team is a geographically and temporally dispersed and electronically communicating work group. In Study 2, the virtual teams communicated asynchronously via groupware technology. Our findings suggest that teams make more effective decisions than individuals, and virtual teams make the most effective decisions.

282 citations


Journal ArticleDOI
TL;DR: It is found that IT personnel skills do affect IS success, that technical skills are viewed as the most important skill set in affecting IS infrastructure flexibility and competitive advantage, and that modularity is viewed as more valuable to competitive advantage than integration.
Abstract: Determining and assessing the requisite skills of information technology (IT) personnel have become critical as the value of IT has risen in modern organizations. In addition to technical skills traditionally expected of IT personnel, softer skills like managerial, business, and interpersonal skills have been increasingly cited in previous studies as mandatory for these employees. This paper uses a typology of IT personnel skills—technology management skills, business functional skills, interpersonal skills, and technical skills—and investigates their relationships to two information systems (IS) success variables, IS infrastructure flexibility and the competitive advantage provided by IS. The study investigates these relationships using the perceptions of chief information officers (CIOs) from mostly Fortune 2000 companies. The contributions of this study are: IT personnel skills do affect IS success, technical skills are viewed as the most important skill set in affecting IS infrastructure flexibility and competitive advantage, and modularity is viewed as more valuable to competitive advantage than integration. Several explanations are offered for the lack of positive relationships between the softer IT personnel skills and the dimensions of IS success used in this study.

268 citations


Journal ArticleDOI
TL;DR: The results show that behavior modeling outperforms lecture-based training on a measure of final performance when task complexity is high, and that computer self-efficacy has a greater positive effect on performance when task complexity is high than when task complexity is low.
Abstract: Using a Modified Social Cognitive Theory framework, this study examines the behavior modeling and lecture-based training approaches to computer training. It extends the existing Social Cognitive Model for computer training by adding the task complexity construct to training method, prior performance, computer self-efficacy, outcome expectations, and performance. A sample of 249 students from a large state university served as participants in a laboratory experiment that was conducted to determine the task complexity × training method and task complexity × self-efficacy interaction effects on performance. Structural equation modeling with interaction effects was used to analyze the data. The results show that behavior modeling outperforms lecture-based training in a measure of final performance when task complexity is high. Further, it is found that computer self-efficacy has a greater positive effect on performance when task complexity is high than when task complexity is low. Prior performance is also found to be an important variable in the model.

152 citations


Journal ArticleDOI
TL;DR: A rigorous and comprehensive framework is developed that sharpens the theoretical contributions of loose coupling to the understanding of structural relationships and testable hypotheses are proposed with respect to three key independent variables that may affect patterns of coupling.
Abstract: Organizational theories frequently rely on notions of sharing and dependence among organizational participants, but researchers usually focus on characteristics of the actors themselves instead of the relational patterns among the actors. Loose coupling is one conceptual tool that emphasizes relational patterns. Loose coupling, however, is an abstract metaphor that is simultaneously fertile and ambiguous. This paper develops a rigorous and comprehensive framework that sharpens the theoretical contributions of loose coupling to our understanding of structural relationships. Characteristics of loose coupling capture some important and underexplored features of multidimensional fit and interdependence in organizations. The proposed framework clarifies these theoretical contributions of loose coupling with concepts and equations modified from network analysis. Testable hypotheses are proposed with respect to three key independent variables that may affect patterns of coupling: organization strategy, technology, and environmental turbulence. Additional hypotheses are advanced with respect to the use of the multidimensional approach to loose coupling in studying new organizational forms. Initial psychometric and empirical evidence is presented.

122 citations


Journal ArticleDOI
TL;DR: The data showed that supplier assessment and Just-In-Time strategies were correlated and affected the quality management strategy used, which in turn influenced the new product design and development strategy.
Abstract: This research uses structural equation modeling to analyze the effects of supplier assessment, Just-In-Time, and quality management strategies on new product design and development. A survey of senior managers who are members of the American Production and Inventory Control Society in the United States was used to test the relationships between the constructs in the model. In general, the survey results supported the proposed structural equation model. The data showed that supplier assessment and Just-In-Time strategies were correlated and affected the quality management strategy used, which in turn influenced the new product design and development strategy. The data also showed that the Just-In-Time strategy directly influenced the new product design and development strategy.

122 citations


Journal ArticleDOI
TL;DR: The results indicate that data mining methods and data proportion have a significant impact on classification accuracy, and rough sets provide better accuracy, followed by neural networks and inductive learning.
Abstract: Intrusion detection systems help network administrators prepare for and deal with network security attacks. These systems collect information from a variety of systems and network sources, and analyze it for signs of intrusion and misuse. A variety of techniques have been employed for analysis ranging from traditional statistical methods to new data mining approaches. In this study, the performance of three data mining methods in detecting network intrusion is examined. An experimental design (3 × 2 × 2) is created to evaluate the impact of three data mining methods, two data representation formats, and two data proportion schemes on the classification accuracy of intrusion detection systems. The results indicate that data mining methods and data proportion have a significant impact on classification accuracy. Within data mining methods, rough sets provide better accuracy, followed by neural networks and inductive learning. Balanced data proportion performs better than unbalanced data proportion. There are no major differences in performance between binary and integer data representation.

100 citations
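The study above reports a 3 × 2 × 2 comparison of data mining methods, data representations, and data proportions, but no implementation details. The following is a minimal, hypothetical sketch of that kind of accuracy comparison using scikit-learn stand-ins on synthetic data; the paper's rough-set and inductive-learning implementations, its data representations, and its intrusion dataset are not reproduced here, and the classifiers, sample sizes, and class proportions below are assumptions for illustration only.

```python
# Hypothetical sketch: comparing classifiers under balanced vs. unbalanced
# class proportions, in the spirit of the factorial design described above.
# scikit-learn stand-ins are used; no rough-set implementation is assumed.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

methods = {
    "neural_network": MLPClassifier(max_iter=1000, random_state=0),
    "decision_tree": DecisionTreeClassifier(random_state=0),  # rule-learning stand-in
}
proportions = {"balanced": [0.5, 0.5], "unbalanced": [0.9, 0.1]}

for prop_name, weights in proportions.items():
    # Synthetic data: class 1 = intrusion, class 0 = normal traffic.
    X, y = make_classification(n_samples=2000, n_features=20,
                               weights=weights, random_state=0)
    for method_name, clf in methods.items():
        acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
        print(f"{prop_name:10s} {method_name:15s} accuracy = {acc:.3f}")
```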


Journal ArticleDOI
TL;DR: This paper developed and used a comprehensive model consisting of four evaluation criteria: decision quality, user satisfaction, user learning, and decision-making efficiency, and found that deliberate decisional guidance was more effective on all four criteria.
Abstract: Decisional guidance is defined as how a decision support system (DSS) influences its users as they structure and execute the decision-making process. It is assumed that decisional guidance has profound effects on decision making, but these effects are understudied and empirically unproven. This paper describes an empirical, laboratory-experiment-based evaluation of the effectiveness of deliberate decisional guidance and its four types. We developed and used a comprehensive model consisting of four evaluation criteria: decision quality, user satisfaction, user learning, and decision-making efficiency. On these criteria, we compared decisional guidance versus no guidance, informative versus suggestive decisional guidance, and predefined versus dynamic decisional guidance. We found that deliberate decisional guidance was more effective than no guidance on all four criteria; suggestive guidance was more effective in improving decision quality and user satisfaction, and informative guidance was more effective in user learning about the problem domain, whereas dynamic guidance was more effective than predefined guidance in improving decision quality and user learning; and both suggestive guidance and dynamic guidance reduced the decision time.

97 citations


Journal ArticleDOI
TL;DR: This study extends cognitive fit to accounting models and integrates cognitive fit theory with the concept of localization to provide additional evidence for how cognitive fit works.
Abstract: Cognitive fit, a correspondence between task and data representation format, has been demonstrated to lead to superior task performance by individual users and has been posited as an explanation for performance differences among users of various problem representations such as tables, graphs, maps, and schematic faces. The current study extends cognitive fit to accounting models and integrates cognitive fit theory with the concept of localization to provide additional evidence for how cognitive fit works. Two accounting model representations are compared in this study, the traditional DCA (Debit-Credit-Account) accounting model and the REA (Resources-Events-Agents) accounting model. Results indicate that the localization of relevant objects or linkages is important in establishing cognitive fit.

86 citations


Journal ArticleDOI
TL;DR: Results obtained indicate that a combination of retention enhancement and practice led to significantly better cognitive learning than practice alone, and the initial difference in cognitive achievement was still evident one week after training.
Abstract: Managers and analysts increasingly need to master the hands-on use of computer-based decision technologies including spreadsheet models. Effective training can prevent the lack of skill from impeding potential effectiveness gains from decision technologies. Among the wide variety of software training approaches in use today, recent research indicates that techniques based on behavior modeling, which consists of computer skill demonstration and hands-on practice, are among the most effective for achieving positive training outcomes. The present research examines whether the established behavior-modeling approach to software training can be improved by adding a retention enhancement intervention as a substitute for, or complement to, hands-on practice. One hundred and eleven trainees were randomly assigned to one of three versions of a training program for spreadsheets: retention enhancement only, practice only, and retention enhancement plus practice. Results obtained while controlling for total training time indicate that a combination of retention enhancement and practice led to significantly better cognitive learning than practice alone. The initial difference in cognitive achievement was still evident one week after training. Implications for future computer training research and practice are discussed.

82 citations


Journal ArticleDOI
TL;DR: An integrated framework is presented for designing profit-maximizing products/services that can also be produced at reasonable operating difficulty levels; optimum profit, market share, cost, and product profiles are shown to depend on the operating difficulty level.
Abstract: This paper presents an integrated framework for designing profit-maximizing products/services, which can also be produced at reasonable operating difficulty levels. Operating difficulty is represented as a function of product and process attributes, and measures a firm's relative ease or difficulty in meeting customer demand patterns under specified operating conditions. Earlier optimum product design procedures have not considered operational difficulty. We show that optimum profit, market share, cost, and product profiles are dependent on the operating difficulty level. Empirical results from the pizza delivery industry demonstrate the value of the proposed Effective Product/Service Design approach.

Journal ArticleDOI
TL;DR: It is found that schedule integration can lead to overall cost savings in a supply chain, but some firms may have to absorb costs in excess of those they would incur with independent scheduling.
Abstract: This study explores the value of integrated production schedules for reducing the negative effects of schedule revisions in supply chains involving buyer and supplier firms. A stochastic cost model is developed to evaluate the total supply chain cost with integrated purchasing and scheduling policies. The model minimizes the costs associated with assembly rate adjustment, safety stock, and schedule changes for all supply chain members. Through experimentation, the paper examines the impact of several environmental factors on the value of schedule integration. This study finds that schedule integration can lead to overall cost savings in a supply chain, but some firms may have to absorb costs in excess of those they would incur with independent scheduling. Environments with high inventory holding costs and long supplier lead times may not find it beneficial to adopt an integrated schedule. Forecast effectiveness plays a critical role in realizing the benefits of schedule integration. The paper concludes with suggestions for future research.

Journal ArticleDOI
TL;DR: Among the six evaluation models developed, one (MHDIS) correctly classifies all countries into the appropriate groups and outperformed the other five methods in a 10-fold validation experiment; the results are promising for the study of emerging markets in fast-growing regions.
Abstract: Mathematical programming and multicriteria approaches to classification and discrimination are reviewed, with an emphasis on preference disaggregation. The latter include the UTADIS family and a new method, Multigroup Hierarchical DIScrimination (MHDIS). They are used to assess investing risk in 51 countries that have stock exchanges, according to 27 criteria. These criteria include quantitative and qualitative measures of market risk (volatility and currency fluctuations); range of investment opportunities; quantity and quality of market information; investor protection (securities regulation and treatment of minority shareholders); and administrative “headaches” (custody, settlement, and taxes). The model parameters are determined so that the results best match the risk level assigned to those countries by experienced international investment managers commissioned by The Wall Street Journal. Among the six evaluation models developed, one (MHDIS) correctly classifies all countries into the appropriate groups. Thus, this model is able to reproduce consistently the evaluation of the expert investment analysts. The most significant criteria and their weights for assessing global investing risk are also presented, along with their marginal utilities, leading to identifiers of risk groups and global utilities portraying the strength of each country's risk classification. The same method, MHDIS, outperformed the other five methods in a 10-fold validation experiment. These results are promising for the study of emerging markets in fast-growing regions, which present fertile areas for investment growth but also carry higher risk.

Journal ArticleDOI
TL;DR: A heuristic for the NP-hard service design problem that integrates realistic service delivery cost models with conjoint analysis is proposed and the resulting seller's utility function links expected profits to the intensity of a service's influential attributes and also reveals an ideal setting or level for each service attribute.
Abstract: Service designers predict market share and sales for their new designs by estimating consumer utilities. The service's technical features (for example, overnight parcel delivery), its price, and the nature of consumer interactions with the service delivery system influence those utilities. Price and the service's technical features are usually quite objective and readily ascertained by the consumer. However, consumer perceptions about their interactions with the service delivery system are usually far more subjective. Furthermore, service designers can only hope to influence those perceptions indirectly through their decisions about nonlinear processes such as employee recruiting, training, and scheduling policies. Like the service's technical features, these process choices affect quality perceptions, market share, revenues, costs, and profits. We propose a heuristic for the NP-hard service design problem that integrates realistic service delivery cost models with conjoint analysis. The resulting seller's utility function links expected profits to the intensity of a service's influential attributes and also reveals an ideal setting or level for each service attribute. In tests with simulated service design problems, our proposed configurations compare quite favorably with the designs suggested by other normative service design heuristics.

Journal ArticleDOI
TL;DR: The main contribution of this paper is to provide the production manager with a way of splitting a lot in order to optimize performance under various measures of performance and setup time considerations.
Abstract: Lot streaming is the process of splitting a production lot into sublots and then scheduling the sublots in overlapping fashion on the machines in order to improve the performance of the production system. Implementation of this concept arises in several batch production environments. These include, among others, printed circuit board assembly and semiconductor fabrication. There are several limitations in the lot-streaming models available in the literature which affect their usefulness in reality. In this paper, we consider the single batch, flow shop, lot-streaming problem but relax several of these limitations. The main contribution of this paper is to provide the production manager with a way of splitting a lot in order to optimize performance under various measures of performance and setup time considerations. In addition, the insights gained from the proposed procedure can be used to tackle more general versions of the problem considered.
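To make the idea concrete, here is a small numeric illustration with hypothetical processing times and lot size, covering only the basic two-machine case without the setup-time and performance-measure extensions the paper addresses: splitting a lot into sublots that are transferred to the second machine as soon as they finish on the first shortens the makespan.

```python
# Tiny lot-streaming illustration on a two-machine flow shop.
# Hypothetical data: a lot of 100 units, 1 min/unit on machine 1,
# 2 min/unit on machine 2; equal sublots, transferred when complete.
def makespan(sublot_sizes, p1=1.0, p2=2.0):
    done_m1 = done_m2 = 0.0
    for size in sublot_sizes:
        done_m1 += size * p1                         # sublot finishes on machine 1
        done_m2 = max(done_m2, done_m1) + size * p2  # then runs on machine 2
    return done_m2

for n_sublots in (1, 2, 4, 10):
    sublots = [100 / n_sublots] * n_sublots
    print(f"{n_sublots:2d} sublots -> makespan {makespan(sublots):.0f} min")
```

With a single sublot (no streaming) the makespan in this toy example is 300 minutes; with ten equal sublots it falls to 210 minutes.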

Journal ArticleDOI
TL;DR: This study defines a new security requirement that achieves the objective of providing access to legitimate users without an increase in the ability of a snooper to predict confidential information and derives the specifications under which perturbation methods can achieve this objective.
Abstract: With the rapid increase in the ability to store and analyze large amounts of data, organizations are gathering extensive data regarding their customers, vendors, and other entities. There has been a concurrent increase in the demand for preserving the privacy of confidential data that may be collected. The rapid growth of e-commerce has also increased calls for maintaining privacy and confidentiality of data. For numerical data, data perturbation methods offer an easy yet effective solution to the dilemma of providing access to legitimate users while protecting the data from snoopers (legitimate users who perform illegitimate analysis). In this study, we define a new security requirement that achieves the objective of providing access to legitimate users without an increase in the ability of a snooper to predict confidential information. We also derive the specifications under which perturbation methods can achieve this objective. Numerical examples are provided to show that the use of the new specification achieves the objective of no additional information to the snooper. Implications of the new specification for e-commerce are discussed.
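The abstract does not state a particular perturbation scheme or reproduce the new security specification it derives. As background only, the snippet below sketches plain additive-noise perturbation of a confidential numerical attribute, the simplest member of the family of methods discussed; the data, the attribute, and the noise level are illustrative assumptions.

```python
# Minimal sketch of additive-noise data perturbation for a confidential
# numerical attribute. This illustrates the general class of perturbation
# methods, not the specific security specification derived in the paper.
import numpy as np

rng = np.random.default_rng(0)
income = rng.lognormal(mean=10.5, sigma=0.6, size=1000)  # hypothetical confidential values

noise_level = 0.2                                         # assumed perturbation magnitude
noise = rng.normal(0.0, noise_level * income.std(), size=income.size)
released = income + noise                                 # values made available to users

# Aggregate analyses stay close to the originals while individual values are
# masked; the correlation indicates how well a snooper could predict a
# confidential value from its released counterpart.
print(round(income.mean(), 1), round(released.mean(), 1))
print(round(float(np.corrcoef(income, released)[0, 1]), 3))
```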

Journal ArticleDOI
TL;DR: This paper describes one such evaluation and proposes a set of evaluation criteria for embedded intelligent real-time systems (EIRTS).
Abstract: Over the past two decades, questions have surfaced about the effectiveness and contribution of intelligent systems to decision makers in a variety of settings. This paper focuses on the evaluation challenges associated with intelligent real-time software systems that are embedded in larger host systems. With the proliferation of such systems in operational settings such as aerospace, medical, manufacturing, and transportation systems, increased attention to evaluations of such systems, and to resulting software safety, is warranted. This paper describes one such evaluation and proposes a set of evaluation criteria for embedded intelligent real-time systems (EIRTS). Implications of the evaluation and the evaluation criteria are discussed.

Journal ArticleDOI
TL;DR: Using measures of performance from income statement and balance sheet data, and stock returns, it is found that the adoption of this labor practice is associated with superior subsequent performance.
Abstract: This paper provides the first systematic examination of the financial implications associated with increased reliance on contingent (i.e., temporary/part-time) labor. Using measures of performance from income statement and balance sheet data, and stock returns, we find that the adoption of this labor practice is associated with superior subsequent performance. Concurrently, no increase in systematic risk and standard deviation of stock returns is observed. The increase in performance with no concurrent increase in systematic risk and standard deviation of returns perhaps explains the increasing popularity of this labor practice.

Journal ArticleDOI
TL;DR: It is suggested that the impact of source reliability attributes may be more complex than portrayed in the auditing standards and that recognizing these subtleties may lead to greater efficiency and effectiveness.
Abstract: This paper provides a normative framework for how external auditors should evaluate internal audit (IA) work, with a view to assessing the risk of material misstatement. The central issue facing the external auditor when evaluating IA work is the reliability of IA work. Reliability assessments are structured using the cascaded inference framework from behavioral decision theory, in which attributes of source reliability are explicitly modeled and combined using Bayes' rule in order to determine the inferential value of IA work. Results suggest that the inferential value of an IA report is highly sensitive to internal auditor reporting bias, but relatively insensitive to reporting veracity. Veracity refers to internal auditors' propensity to report truthfully, whereas bias refers to the propensity to misreport findings. Results also indicate that this sensitivity to reporting bias is conditional on the level of internal auditor competence, thus suggesting significant interaction effects between the objectivity and competence factors. Collectively, these findings suggest that the impact of source reliability attributes may be more complex than portrayed in the auditing standards and that recognizing these subtleties may lead to greater efficiency and effectiveness.
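For readers unfamiliar with cascaded inference, the worked example below (hypothetical numbers, not the paper's model or parameters) shows how Bayes' rule combines competence and reporting bias into the inferential value of a "clean" internal audit report.

```latex
% M = material misstatement; r = "clean" IA report;
% c = competence (probability IA detects a misstatement that exists);
% b = reporting bias (probability a detected misstatement is reported as clean).
\[
P(r \mid M) = \underbrace{(1-c)}_{\text{missed}} + \underbrace{c\,b}_{\text{detected, misreported}},
\qquad
P(M \mid r) = \frac{P(r \mid M)\,P(M)}{P(r \mid M)\,P(M) + P(r \mid \neg M)\,P(\neg M)} .
\]
% Illustrative values: P(M) = 0.10, c = 0.8, P(r | not M) = 0.95.
%   b = 0.3:  P(r|M) = 0.44,  P(M|r) ~ 0.049
%   b = 0.6:  P(r|M) = 0.68,  P(M|r) ~ 0.074
% Doubling the misreporting probability noticeably weakens the assurance a
% clean report provides, consistent with the sensitivity to bias noted above.
```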

Journal ArticleDOI
TL;DR: This paper proposes two mathematical problem formulations and optimization algorithms for lot-sizing problems with additional fixed charges associated with each time period's production, a cost structure common in the food, chemical, and pharmaceutical industries.
Abstract: Traditional approaches for modeling economic production lot-sizing problems assume that a single, fixed equipment setup cost is incurred each time a product is run, regardless of the quantity manufactured. This permits multiple days of production from one production setup. In this paper, we extend the model to consider additional fixed charges, such as cleanup or inspection costs, that are associated with each time period's production. This manufacturing cost structure is common in the food, chemical, and pharmaceutical industries, where process equipment must be sanitized between item changeovers and at the end of each day's production. We propose two mathematical problem formulations and optimization algorithms. The models' unique features include regular time production constraints, a fixed charge for each time period's production, and the availability of overtime production capacity. Experimental results indicate the conditions under which our algorithms' performance is superior to traditional approaches. We also test the procedures on a set of lot-sizing problems facing a national food processor and document their potential economic benefit.
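The formulations themselves are not given in the abstract. The sketch below, in generic notation introduced here (not the paper's), shows the kind of objective and constraints such a model would contain: per-item setup costs, a per-period fixed charge for cleanup or inspection, inventory balance, regular-time capacity, and overtime.

```latex
\[
\min \; \sum_{t}\Big[\sum_{i}\big(s_i\,y_{it} + h_i\,I_{it}\big) + f\,z_t + c^{o}\,O_t\Big]
\]
\[
\text{s.t.}\quad
I_{i,t-1} + Q_{it} - I_{it} = d_{it},\qquad
\sum_i a_i\,Q_{it} \le R_t + O_t,\qquad
Q_{it} \le M\,y_{it},\qquad
y_{it} \le z_t,\qquad
O_t \le \bar{O}_t .
\]
% Q_it = production of item i in period t; y_it, z_t = binary setup and
% "production occurs in period t" indicators; s_i = setup cost; f = fixed
% per-period (cleanup/inspection) charge; h_i = holding cost; I_it = ending
% inventory; d_it = demand; a_i = capacity usage per unit; R_t = regular-time
% capacity; O_t = overtime at unit cost c^o; M = a sufficiently large constant.
```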

Journal ArticleDOI
TL;DR: The study uses an experiment to examine the separate and combined effects of managers' loss aversion and their causal attributions about their divisions' performance on tendencies to make goal-incongruent capital budget recommendations, finding that managers' recommendations are biased by their loss aversion.
Abstract: This study uses an experiment to examine the separate and combined effects of managers' loss aversion and their causal attributions about their divisions' performance on tendencies to make goal-incongruent capital budget recommendations. We find that managers' recommendations are biased by their loss aversion. In particular, managers of high-performing divisions are more likely than managers of low-performing divisions to propose investments that maximize their division's short-term profits at the expense of the firm's long-term value. We also find that managers' recommendations are biased by their causal attributions. In particular, managers are more likely to propose investments that maximize their division's short-term profits at the expense of the firm's long-term value when they attribute their division's performance to external causes (e.g., task difficulty or luck) rather than to internal causes (e.g., managerial ability or effort). Further, the effects of causal attributions are greater for managers of high-performing divisions than for managers of low-performing divisions. The study's findings are important because loss aversion and causal attributions are often manifested in firms. Thus, they may bias managers' decisions, which in turn may be detrimental to the firms' long-term value.

Journal ArticleDOI
TL;DR: The proposed brand demand system is a set of interrelated demand functions derived explicitly from a utility function describing consumer preferences; the model is generalized through the integration of category expenditures, which are determined endogenously.
Abstract: This paper introduces the design and implementation of utility-consistent brand and category demand systems. It extends formal demand analysis to the area of brand and category demand, which directly concerns marketing researchers and managers. The proposed brand demand system is a set of interrelated demand functions that are derived explicitly from a utility function describing consumer preferences. The model is generalized through the integration of category expenditures, which are determined endogenously. The theoretical plausibility of the proposed demand model is demonstrated first and, subsequently, brand and category level systems are derived. Econometric methods for estimating the systems are also developed and illustrated with empirical data. The results yield empirically determined, quantitative insights into the structure of consumer demand for brands and product categories. The proposed approach has the attractive feature of structuring the interdependencies of consumer decisions and ensuring an explicit role for theory in applied research.
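As a simple illustration of what a utility-consistent brand demand system looks like (an illustrative special case, not the paper's specification), a Cobb-Douglas utility over the brands in a category yields interrelated demand functions in which category expenditure enters every equation and brand expenditure shares equal the utility weights:

```latex
\[
\max_{q_1,\dots,q_n}\; \sum_i \alpha_i \ln q_i
\quad \text{s.t.} \quad \sum_i p_i q_i = E,\;\; \sum_i \alpha_i = 1
\;\;\Longrightarrow\;\;
q_i^{*} = \frac{\alpha_i E}{p_i},
\qquad
w_i \equiv \frac{p_i q_i^{*}}{E} = \alpha_i .
\]
```

In the paper's framework, the category expenditure E is itself determined endogenously at a higher level rather than taken as given.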

Journal ArticleDOI
TL;DR: Judgments were predicted to be more accurate when (1) diagnostic information is presented late rather than early in the information sequence and (2) no irrelevant distractor information is present; the experimental data support all three predictions.
Abstract: Computer-based decision aids are intended to support and improve human judgments. Frequently, the largest portion of the design effort is devoted to the technical aspects of the system; behavioral aspects are often overlooked. As a result, the decision aid may be ineffective. An experiment was conducted to examine the effects of two information structure variables that theoretically affect judgments: information sequence and irrelevant distractor information. Auditor subjects made continuing existence judgments for client-banks after interacting with one of four alternative decision aids. The decision aids are modifications of a system developed by an international CPA firm. Judgments were predicted to be more accurate when: (1) diagnostic information is presented late rather than early in the information sequence and (2) when no irrelevant distractor information is presented. Further, judgment confidence was predicted to be unrelated to either information sequence or irrelevant distractor information. The experimental data support all three predictions.

Journal ArticleDOI
TL;DR: The utility of the single-item discrete sequential search model is increased by developing a formulation that includes simple precedence relationships as well as sequence dependent relationships defined by group activities.
Abstract: Equipment failures can have significant implications in terms of cost and customer satisfaction. Reducing the time required to find the cause of a failure can provide large cost savings and help preserve customer goodwill. Single-item discrete sequential search models can be used to sequence the tasks in diagnostic search to minimize the expected time required to find the cause of the failure. We increase the utility of the single-item discrete sequential search model by developing a formulation that includes simple precedence relationships as well as sequence dependent relationships defined by group activities. This formulation can be applied to a number of other problems including determining the sequence for multiple quality control tests on an item, scheduling oil well workovers to maximize the expected increase in oil production, and sequencing tasks in a research project where there is a technological risk associated with each task.
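The classical single-item discrete sequential search rule (the base model, without the precedence and group-activity extensions this paper develops) orders tasks by the ratio of detection probability to task time. The sketch below applies that base rule to hypothetical diagnostic data.

```python
# Minimal sketch of the base single-item discrete sequential search model:
# order diagnostic tasks by p_i / t_i (descending) to minimize the expected
# time to locate the failure cause. Precedence and group-activity constraints
# from the paper are not modeled; the task data are hypothetical.
from itertools import accumulate

# (task name, probability the task reveals the cause, time in minutes)
tasks = [("check_sensor", 0.15, 5.0),
         ("swap_controller", 0.40, 30.0),
         ("inspect_wiring", 0.30, 10.0),
         ("review_logs", 0.15, 2.0)]

def expected_search_time(seq):
    # E[time] = sum over positions of p_k * (cumulative time through task k),
    # assuming exactly one task reveals the cause (probabilities sum to 1).
    cum_times = list(accumulate(t for _, _, t in seq))
    return sum(p * ct for (_, p, _), ct in zip(seq, cum_times))

ordered = sorted(tasks, key=lambda x: x[1] / x[2], reverse=True)
print([name for name, _, _ in ordered], round(expected_search_time(ordered), 2))
print(round(expected_search_time(tasks), 2))  # original order, for comparison
```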

Journal ArticleDOI
TL;DR: Results of a pilot implementation using actual data show potential for significant savings for the company, and the DSS is designed for use in a multiproduct environment with overlapping raw materials and processing requirements.
Abstract: This paper presents a decision support system (DSS) for managing production/distribution planning in a continuous manufacturing environment. The vendor has multiple plants and distribution centers (DCs). The trading partners have widely varying independent demand patterns. The DSS is designed for use in a multiproduct environment with overlapping raw materials and processing requirements. The production and distribution lead times at plants may span multiple planning periods. The impact of any manual override of a suggested solution can also be evaluated. The DSS is based on a linear programming model with a rolling horizon and was originally designed for a large process industry. Results of a pilot implementation using actual data are also presented, which show potential for significant savings for the company.
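As a toy illustration of the kind of optimization underlying such a DSS, the PuLP sketch below solves a single-period transportation-style LP with hypothetical plants, distribution centers, costs, and capacities; the paper's actual model is multi-product, multi-period, and run on a rolling horizon.

```python
# Toy single-period production/distribution LP (hypothetical data), sketched
# in the spirit of the LP-based DSS described above.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, value

plants = {"P1": 600, "P2": 400}                      # capacity
dcs = {"DC1": 350, "DC2": 300, "DC3": 250}           # demand
cost = {("P1", "DC1"): 4, ("P1", "DC2"): 6, ("P1", "DC3"): 9,
        ("P2", "DC1"): 7, ("P2", "DC2"): 5, ("P2", "DC3"): 3}

prob = LpProblem("prod_dist_plan", LpMinimize)
ship = LpVariable.dicts("ship", cost.keys(), lowBound=0)
prob += lpSum(cost[k] * ship[k] for k in cost)                 # total shipping cost
for p, cap in plants.items():
    prob += lpSum(ship[(p, d)] for d in dcs) <= cap            # plant capacity
for d, dem in dcs.items():
    prob += lpSum(ship[(p, d)] for p in plants) >= dem         # DC demand
prob.solve()
print(value(prob.objective), {k: ship[k].value() for k in cost})
```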

Journal ArticleDOI
TL;DR: It is analytically demonstrated that the lot sizes derived using an average annual cost approach for the different variants of the problem are, in general, larger than the DCF optimum.
Abstract: We consider the optimal lot-sizing policy for an inventoried item when the vendor offers a limited-time price reduction. We use the discounted cash flow (DCF) approach in our analysis, thereby eliminating the sources of approximation found in most of the earlier studies that use an average annual cost approach. We first characterize the optimal lot-sizing policies and their properties, then develop an algorithm for determining the optimal lot sizes. We analytically demonstrate that the lot sizes derived using an average annual cost approach for the different variants of the problem are, in general, larger than the DCF optimum. While DCF analysis is more rigorous and yields precise lot sizes, we recognize that the associated mathematical models and the solution procedure are rather complex. Since simple and easy-to-understand policies have a strong practical appeal to decision makers, we propose a DCF version of a simple and easy-to-implement heuristic called the “Early Purchase” (EP) strategy and discuss its performance. We supplement our analytical developments with a detailed computational analysis and discuss the implications of our findings for decision making.
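For context, the two objectives being contrasted can be written down for the standard constant-demand case without a price reduction (textbook notation introduced here, not the paper's limited-time-discount model, and with out-of-pocket holding costs omitted from the DCF stream for brevity):

```latex
\[
\mathrm{AC}(Q) \;=\; \frac{KD}{Q} + \frac{hQ}{2} + cD,
\qquad\qquad
\mathrm{PV}(Q) \;=\; \frac{K + cQ}{1 - e^{-rQ/D}} .
\]
% D = demand rate, K = ordering cost, c = unit cost, h = annual holding cost
% (often taken as h ~ rc), r = continuous discount rate, Q = lot size.
```

The average annual cost approach minimizes AC(Q), whereas the DCF approach minimizes the present value PV(Q) of the infinite stream of ordering cash flows; the paper's finding is that lot sizes obtained from the former are generally larger than the DCF optimum.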

Journal ArticleDOI
TL;DR: The results from the experiment showed that ratio judgments were less effective than equivalence judgments, and that useful techniques should be devised to incorporate different commonly used techniques, such as multiattribute utility theory and the Analytic Hierarchy Process, to elicit and consolidate equivalence trade-off judgments.
Abstract: Two commonly used elicitation modes for strength of preference, equivalence and ratio judgments, were compared in an experiment. The result from the experiment showed that ratio judgments were less effective than equivalence judgments. Based on an iterative design for eliciting multiattribute preference structures, equivalence judgments outperformed ratio judgments in estimating single-attribute measurable value functions, while being nearly more effective than ratio judgments in assessing multiattribute preference structures. The implications of the results from the experiment are that multiattribute decision-making techniques should take advantage of the decision maker's inclination to make effective equivalence trade-off judgments, and that useful techniques should be devised to incorporate different commonly used techniques, such as multiattribute utility theory and the Analytic Hierarchy Process, to elicit and consolidate equivalence trade-off judgments.

Journal ArticleDOI
TL;DR: This paper proposes a simple sequential updating alternative method based on function approximation for consensus forecasting based on a convex combination of individual forecast densities.
Abstract: In this paper we propose a consensus forecasting method based on a convex combination of individual forecast densities. The exact Bayesian updating of the convex combination weights is very complex and practically prohibitive. We propose a simple sequential updating alternative method based on function approximation. Several examples illustrate the method.
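The paper's exact weight-updating scheme relies on function approximation and is not reproduced here. The sketch below shows only the underlying idea of a convex combination (linear opinion pool) of forecast densities with a simple multiplicative weight update; the experts, their densities, and the data-generating process are hypothetical.

```python
# Minimal sketch of a convex-combination ("linear opinion pool") consensus
# forecast with a simple sequential weight update: each expert's weight is
# scaled by the density it assigned to the realized outcome, then normalized.
# This illustrates the general idea, not the paper's specific scheme.
import numpy as np
from scipy.stats import norm

# Two hypothetical experts issuing normal forecast densities each period.
experts = [dict(mu=0.0, sigma=1.0), dict(mu=0.5, sigma=2.0)]
weights = np.array([0.5, 0.5])                 # initial convex-combination weights

rng = np.random.default_rng(1)
for t in range(20):
    y = rng.normal(0.2, 1.0)                   # realized outcome (synthetic)
    dens = np.array([norm.pdf(y, e["mu"], e["sigma"]) for e in experts])
    pooled_density_at_y = weights @ dens       # consensus density evaluated at y
    weights = weights * dens                   # reward experts that predicted well
    weights /= weights.sum()                   # keep the combination convex

print(weights)                                 # weight shifts toward the better-calibrated expert
```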