Journal ArticleDOI

Robust minimum cost consensus models with aggregation operators under individual opinion uncertainty

TL;DR: A robust optimization method is used to construct three uncertainty sets that better characterize the uncertainty of individual initial opinions, and three different aggregation operators are used to obtain collective opinions instead of fixed values.
Abstract: Individual opinions are one of the vital factors influencing consensus in group decision-making, and they are often uncertain. Previous studies mostly used probability distributions, interval distributions or uncertainty distribution functions to describe the uncertainty of individual opinions. However, this requires accurate knowledge of the distribution of individual opinions, which is often difficult to obtain in real life. To overcome this shortcoming, this paper uses a robust optimization method to construct three uncertainty sets that better characterize the uncertainty of individual initial opinions. In addition, three different aggregation operators are used to obtain collective opinions instead of fixed values. Furthermore, numerical simulations on a flood disaster assessment in southern China are carried out to evaluate the robustness of the solutions obtained by the proposed robust consensus models. The results show that the proposed models are more robust than previous models. Finally, a sensitivity analysis of the uncertain parameters is presented and compared, revealing the characteristics of the proposed models.
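The abstract describes the robust minimum-cost consensus idea only at a high level. The sketch below illustrates the general mechanism under stated assumptions, not the authors' exact formulation: a box uncertainty set around each initial opinion, a weighted-average (WA) aggregation of the adjusted opinions as the collective opinion, and minimization of the worst-case adjustment cost. All data values are hypothetical.

```python
# Minimal sketch (not the paper's exact model): robust minimum-cost consensus
# with a box uncertainty set on initial opinions and a WA aggregation operator.
import cvxpy as cp
import numpy as np

o0    = np.array([0.3, 0.5, 0.8, 0.6])      # nominal initial opinions (hypothetical)
c     = np.array([1.0, 2.0, 1.5, 1.0])      # unit adjustment costs (hypothetical)
w     = np.array([0.25, 0.25, 0.25, 0.25])  # WA aggregation weights
delta = 0.05                                # half-width of the box uncertainty set
eps   = 0.1                                 # consensus threshold

o     = cp.Variable(4)                      # adjusted individual opinions
o_bar = w @ o                               # collective opinion via the WA operator

# For a box set, the worst case of |o_i - (o0_i + u_i)| over |u_i| <= delta
# equals |o_i - o0_i| + delta, so the robust objective remains convex.
worst_case_cost = cp.sum(cp.multiply(c, cp.abs(o - o0) + delta))

prob = cp.Problem(cp.Minimize(worst_case_cost),
                  [cp.abs(o - o_bar) <= eps])  # each opinion close to the collective one
prob.solve()
print("robust minimum cost:", prob.value)
print("adjusted opinions:  ", o.value)
```

Other uncertainty sets (and other aggregation operators) would change the worst-case term and the definition of the collective opinion, but the overall structure of the model stays the same.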
Citations
Journal ArticleDOI
TL;DR: In this article, the authors propose three new minimum-cost consensus models based on a distributionally robust method; the models are transformed into second-order cone programming problems to simplify the calculations.
Abstract: When solving the minimum-cost consensus problem with asymmetric adjustment costs, decision makers face various uncertainties (such as individual opinions and the unit adjustment costs for modifying opinions in the upward and downward directions). Among existing methods for this problem, however, robust optimization can lead to overly conservative results, while stochastic programming requires the exact probability distribution to be known. To overcome these shortcomings, it is essential to develop a novel consensus model. We therefore propose three new minimum-cost consensus models based on a distributionally robust method, each capturing a different set of uncertain parameters (individual opinions, unit adjustment costs for upward and downward opinion modifications, the degree of tolerance, and the range of thresholds). In the distributionally robust method, the construction of the ambiguity set is crucial; based on historical data, this study adopts a Wasserstein ambiguity set defined through the Wasserstein distance. The three new models are then transformed into second-order cone programming problems to simplify the calculations. A case from the EU Trade and Animal Welfare (TAW) program policy consultation is used to verify the practicability of the proposed models. Through comparison and sensitivity analysis, the numerical results show that the three new models fit complex decision environments better.
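As a point of reference for the asymmetric-cost setting described above, the following sketch shows only the nominal (non-robust) minimum-cost consensus problem with different unit costs for raising and lowering opinions, written as a small convex program. The Wasserstein distributionally robust reformulation into a second-order cone program is not reproduced here, and all data values are hypothetical.

```python
# Minimal sketch: nominal minimum-cost consensus with asymmetric unit costs
# for upward (c_up) and downward (c_dn) opinion adjustments.
import cvxpy as cp
import numpy as np

o0   = np.array([0.2, 0.5, 0.7, 0.9])   # initial opinions (hypothetical)
c_up = np.array([1.0, 1.5, 2.0, 1.0])   # unit costs for raising an opinion
c_dn = np.array([2.0, 1.0, 1.0, 1.5])   # unit costs for lowering an opinion
eps  = 0.1                               # consensus threshold

o     = cp.Variable(4)                   # adjusted opinions
o_bar = cp.Variable()                    # collective opinion (a free consensus value here)

up   = cp.pos(o - o0)                    # upward adjustment amounts
down = cp.pos(o0 - o)                    # downward adjustment amounts
cost = c_up @ up + c_dn @ down

prob = cp.Problem(cp.Minimize(cost), [cp.abs(o - o_bar) <= eps])
prob.solve()
print("minimum adjustment cost:", prob.value)
print("adjusted opinions:      ", o.value)
```

In the distributionally robust versions, the fixed vector `o0` (and the cost vectors) would be replaced by random parameters whose distribution is only known to lie in the Wasserstein ambiguity set.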

1 citation

Journal ArticleDOI
TL;DR: Wang et al. propose a decision support framework for the evaluation and selection of alternative products based on OCR, comprising three parts: (1) data preprocessing, using Python to collect online consumer reviews, clean and preprocess the data, and extract key features as evaluation criteria; (2) sentiment analysis, using Naive Bayes to analyze the sentiment of OCR; and (3) benchmark analysis, ranking alternatives with a new IFMBWM-DEA model.
Abstract: The sudden COVID-19 epidemic has caused consumers to gradually switch to online shopping, and the growing number of online consumer reviews (OCR) on Web 2.0 sites has made it difficult for consumers and merchants to make decisions by analyzing OCR. Much of the current literature on ranking products based on OCR ignores neutral reviews, relies mostly on pre-given criteria while ignoring consumers’ own purchasing preferences, or ranks products based on star ratings alone. This study proposes a new decision support framework for the evaluation and selection of alternative products based on OCR. The framework consists of three parts: 1) data preprocessing: using Python to collect online consumer reviews, clean and preprocess the data, and extract key features as evaluation criteria; 2) sentiment analysis: using Naive Bayes to analyze the sentiment of OCR and intuitionistic fuzzy sets to describe the sentiment scores; 3) benchmark analysis: a new IFMBWM-DEA model that accounts for decision makers’ preferences is proposed to calculate efficiency scores for the alternatives and rank them accordingly. The OCR of 15 laptops crawled from the JD.com platform are then used to demonstrate the usefulness and applicability of the proposed framework in two respects: a comparison of results with and without decision makers’ preferences, and a comparison with existing ranking methods. The comparisons also show that the proposed method is more realistic, yields better-grounded recommendations, and reduces the complexity of the decision.
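The sentiment-analysis step of the framework is the most mechanical part. The sketch below shows only that step under simple assumptions (a small labelled set of review texts and a bag-of-words Naive Bayes classifier); the feature extraction, intuitionistic-fuzzy scoring and IFMBWM-DEA ranking stages are not reproduced, and all review texts are hypothetical.

```python
# Minimal sketch of Naive Bayes sentiment classification for review texts.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labelled reviews (1 = positive, 0 = negative).
reviews = ["battery life is great", "screen broke after a week",
           "fast and light laptop", "keyboard feels cheap"]
labels = [1, 0, 1, 0]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reviews, labels)

# The predicted class probabilities could then be mapped to the membership /
# non-membership degrees of an intuitionistic fuzzy sentiment score.
print(model.predict_proba(["solid build but noisy fan"]))
```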
References
Journal ArticleDOI
Ronald R. Yager
03 Jan 1988
TL;DR: A type of operator for aggregation called an ordered weighted averaging (OWA) operator is introduced, and its performance is found to lie between those obtained using the AND operator and the OR operator.
Abstract: The author is primarily concerned with the problem of aggregating multiple criteria to form an overall decision function. He introduces a type of operator for aggregation called an ordered weighted averaging (OWA) operator and investigates its properties. The OWA operator's performance is found to lie between those obtained using the AND operator, which requires all criteria to be satisfied, and the OR operator, which requires at least one criterion to be satisfied.
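Since the OWA operator is one of the aggregation operators used in the citing work, a short sketch may help: the arguments are sorted in descending order and then combined with a fixed weight vector, so the weights attach to ranked positions rather than to particular sources. The example data are hypothetical.

```python
# Sketch of an OWA aggregation: sort descending, then take a weighted sum.
import numpy as np

def owa(values, weights):
    """Ordered weighted averaging: weights apply to ranked positions, not to sources."""
    v = np.sort(np.asarray(values, dtype=float))[::-1]   # descending order
    w = np.asarray(weights, dtype=float)
    assert np.isclose(w.sum(), 1.0) and (w >= 0).all()   # weights form a convex combination
    return float(v @ w)

scores = [0.7, 0.4, 0.9]
print(owa(scores, [1.0, 0.0, 0.0]))   # w = (1,0,...,0) recovers the OR-like max
print(owa(scores, [0.0, 0.0, 1.0]))   # w = (0,...,0,1) recovers the AND-like min
print(owa(scores, [1/3, 1/3, 1/3]))   # uniform weights give the plain average
```

The choice of weight vector is exactly what places the operator's behavior between the AND (min) and OR (max) extremes.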

6,534 citations

Journal ArticleDOI
TL;DR: If U is an ellipsoidal uncertainty set, then for some of the most important generic convex optimization problems (linear programming, quadratically constrained programming, semidefinite programming and others) the corresponding robust convex program is either exactly or approximately a tractable problem that lends itself to efficient algorithms such as polynomial-time interior point methods.
Abstract: We study convex optimization problems for which the data is not specified exactly and is only known to belong to a given uncertainty set U, yet the constraints must hold for all possible values of the data from U. The ensuing optimization problem is called robust optimization. In this paper we lay the foundation of robust convex optimization. In the main part of the paper we show that if U is an ellipsoidal uncertainty set, then for some of the most important generic convex optimization problems (linear programming, quadratically constrained programming, semidefinite programming and others) the corresponding robust convex program is either exactly, or approximately, a tractable problem which lends itself to efficient algorithms such as polynomial-time interior point methods.
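To make the tractability claim concrete, the sketch below handles a single uncertain linear constraint a^T x ≤ b with a = a0 + P u, ‖u‖₂ ≤ 1 (an ellipsoidal uncertainty set). Its robust counterpart a0^T x + ‖Pᵀx‖₂ ≤ b is a second-order cone constraint, which is what makes the robust LP tractable. The data here are hypothetical and the example is only an illustration of the standard reformulation, not code from the referenced paper.

```python
# Sketch: robust counterpart of one uncertain LP constraint under
# ellipsoidal uncertainty, written as a second-order cone constraint.
import cvxpy as cp
import numpy as np

np.random.seed(0)
n  = 3
a0 = np.array([1.0, 2.0, 0.5])      # nominal constraint coefficients (hypothetical)
P  = 0.1 * np.random.randn(n, n)    # shape matrix of the ellipsoidal uncertainty set
b  = 1.0
c  = np.array([-1.0, -1.0, -1.0])   # objective: maximize the sum of x

x = cp.Variable(n, nonneg=True)
# Worst case of (a0 + P u)^T x over ||u||_2 <= 1 is a0^T x + ||P^T x||_2.
robust_constraint = a0 @ x + cp.norm(P.T @ x, 2) <= b

prob = cp.Problem(cp.Minimize(c @ x), [robust_constraint])
prob.solve()
print("robust optimal x:", x.value)
```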

2,501 citations

Journal ArticleDOI
TL;DR: It is shown that the RC of an LP with an ellipsoidal uncertainty set is computationally tractable, since it leads to a conic quadratic program, which can be solved in polynomial time.

1,809 citations

Journal ArticleDOI
TL;DR: The Robust Optimization methodology is applied to produce “robust” solutions of the NETLIB LPs that are in a sense immune to uncertainty.
Abstract: Optimal solutions of Linear Programming problems may become severely infeasible if the nominal data is slightly perturbed. We demonstrate this phenomenon by studying 90 LPs from the well-known NETLIB collection. We then apply the Robust Optimization methodology (Ben-Tal and Nemirovski [1–3]; El Ghaoui et al. [5, 6]) to produce “robust” solutions of the above LPs which are, in a sense, immune to uncertainty. Surprisingly, for the NETLIB problems these robust solutions lose almost nothing in optimality.

1,674 citations