Topic

Decision rule

About: Decision rule is a research topic. Over the lifetime, 10,636 publications have been published within this topic, receiving 328,309 citations.


Papers
Book
31 Oct 1991
TL;DR: A monograph on rough set theory, covering theoretical foundations (knowledge and classification, approximations, reduction of knowledge, decision tables, decision logic) and applications (decision making, data analysis, dissimilarity analysis, switching circuits, machine learning).
Abstract: I. Theoretical Foundations.
1. Knowledge: Introduction; Knowledge and Classification; Knowledge Base; Equivalence, Generalization and Specialization of Knowledge.
2. Imprecise Categories, Approximations and Rough Sets: Introduction; Rough Sets; Approximations of Set; Properties of Approximations; Approximations and Membership Relation; Numerical Characterization of Imprecision; Topological Characterization of Imprecision; Approximation of Classifications; Rough Equality of Sets; Rough Inclusion of Sets.
3. Reduction of Knowledge: Introduction; Reduct and Core of Knowledge; Relative Reduct and Relative Core of Knowledge; Reduction of Categories; Relative Reduct and Core of Categories.
4. Dependencies in Knowledge Base: Introduction; Dependency of Knowledge; Partial Dependency of Knowledge.
5. Knowledge Representation: Introduction; Examples; Formal Definition; Significance of Attributes; Discernibility Matrix.
6. Decision Tables: Introduction; Formal Definition and Some Properties; Simplification of Decision Tables.
7. Reasoning about Knowledge: Introduction; Language of Decision Logic; Semantics of Decision Logic Language; Deduction in Decision Logic; Normal Forms; Decision Rules and Decision Algorithms; Truth and Indiscernibility; Dependency of Attributes; Reduction of Consistent Algorithms; Reduction of Inconsistent Algorithms; Reduction of Decision Rules; Minimization of Decision Algorithms.
II. Applications.
8. Decision Making: Introduction; Optician's Decisions Table; Simplification of Decision Table; Decision Algorithm; The Case of Incomplete Information.
9. Data Analysis: Introduction; Decision Table as Protocol of Observations; Derivation of Control Algorithms from Observation; Another Approach; The Case of Inconsistent Data.
10. Dissimilarity Analysis: Introduction; The Middle East Situation; Beauty Contest; Pattern Recognition; Buying a Car.
11. Switching Circuits: Introduction; Minimization of Partially Defined Switching Functions; Multiple-Output Switching Functions.
12. Machine Learning: Introduction; Learning From Examples; The Case of an Imperfect Teacher; Inductive Learning.
Each chapter closes with a Summary, Exercises, and References.
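The lower and upper approximations introduced in the rough-set chapters above can be computed directly from the indiscernibility relation induced by condition attributes. A minimal sketch in Python, using a toy decision table with hypothetical attribute values (not an example from the book):

```python
# Tiny rough-set example: lower/upper approximations of a concept X
# under the indiscernibility relation induced by condition attributes.
# All objects and attribute values below are illustrative assumptions.

objects = {
    "o1": ("young", "myope"),
    "o2": ("young", "myope"),
    "o3": ("old", "myope"),
    "o4": ("old", "myope"),
    "o5": ("old", "hypermetrope"),
}

# Partition the universe into indiscernibility classes
# (objects with identical attribute vectors are indiscernible).
classes = {}
for name, attrs in objects.items():
    classes.setdefault(attrs, set()).add(name)
blocks = list(classes.values())  # here: {o1,o2}, {o3,o4}, {o5}

X = {"o1", "o2", "o3"}  # target concept, e.g. objects with a given decision value

lower = set().union(*(b for b in blocks if b <= X))  # classes certainly inside X
upper = set().union(*(b for b in blocks if b & X))   # classes possibly inside X

print("lower approximation:", lower)      # {'o1', 'o2'}
print("upper approximation:", upper)      # {'o1', 'o2', 'o3', 'o4'}
print("boundary region:", upper - lower)  # {'o3', 'o4'} -> X is a rough set
```

A non-empty boundary region is exactly the "imprecise category" case: the available attributes cannot decide membership of o3 and o4, so any decision rule over these attributes is only approximate.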

7,826 citations

Posted Content
TL;DR: In contrast to the economic theory of the firm, which sees firms as profit-maximizing entities, the authors advocate a theory of decision making within business organizations based on empirical observation of actual firm decision-making.
Abstract: Provides a theory of decision making within business organizations. Contrary to the economic theory of the firm, which sees firms as profit-maximizing entities, the authors advocate a theory based on empirical observation of actual firm decision-making. Various features of firm decision-making are identified. First, firms are coalitions of participants whose individual goals may and often do conflict. How this conflict is resolved is determined by the firm's bargaining process. This process is constrained by past behavior and decisions. Second, the authors reject the notion that firms are one-dimensional profit-maximizers in favor of a view of firms as entities with many different objectives that will accept suboptimal outcomes if they are above a minimum level. Congruent with this, a firm's search activity occurs in response to a perceived problem and is limited in scope. The effect is that firm policies will change only incrementally. Also impeding radical policy change is the fact that firms react to uncertainty using standardized decision rules. Using this theory, two computer models of business decision-making are presented and compared with actual results. These models are shown to have especially good predictive power.

6,698 citations

Journal ArticleDOI
TL;DR: This work shows that the seemingly mysterious phenomenon of boosting can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood, and develops more direct approximations that exhibit nearly identical results to boosting.
Abstract: Boosting is one of the most important recent developments in classification methodology. Boosting works by sequentially applying a classification algorithm to reweighted versions of the training data and then taking a weighted majority vote of the sequence of classifiers thus produced. For many classification algorithms, this simple strategy results in dramatic improvements in performance. We show that this seemingly mysterious phenomenon can be understood in terms of well-known statistical principles, namely additive modeling and maximum likelihood. For the two-class problem, boosting can be viewed as an approximation to additive modeling on the logistic scale using maximum Bernoulli likelihood as a criterion. We develop more direct approximations and show that they exhibit nearly identical results to boosting. Direct multiclass generalizations based on multinomial likelihood are derived that exhibit performance comparable to other recently proposed multiclass generalizations of boosting in most situations, and far superior in some. We suggest a minor modification to boosting that can reduce computation, often by factors of 10 to 50. Finally, we apply these insights to produce an alternative formulation of boosting decision trees. This approach, based on best-first truncated tree induction, often leads to better performance, and can provide interpretable descriptions of the aggregate decision rule. It is also much faster computationally, making it more suitable to large-scale data mining applications.
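The reweight-and-vote procedure the abstract describes is easiest to see in discrete AdaBoost form. Below is a minimal sketch, assuming scikit-learn decision stumps as the base classifier and labels in {-1, +1}; it illustrates the generic boosting loop, not the additive-logistic (LogitBoost) variants the paper derives:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_boosted_stumps(X, y, n_rounds=50):
    """Discrete AdaBoost sketch: refit a stump on reweighted data each round,
    then combine the stumps by a weighted majority vote. y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                       # start with uniform weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)  # weighted training error
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)      # vote weight for this stump
        w *= np.exp(-alpha * y * pred)             # upweight misclassified points
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict_boosted(stumps, alphas, X):
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(votes)                          # weighted majority vote

# Usage on a synthetic toy problem (purely illustrative data):
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)         # XOR-like target
stumps, alphas = fit_boosted_stumps(X, y)
print("training accuracy:", np.mean(predict_boosted(stumps, alphas, X) == y))
```

The paper's statistical reading is that the weighted vote `sum(alpha_m * h_m(x))` is a stagewise-fitted additive model on the logistic scale, which is what motivates its faster, more interpretable alternatives.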

6,598 citations

Journal ArticleDOI
TL;DR: In this article, the authors analyze a sequential decision model in which each decision maker looks at the decisions made by previous decision makers when making her own decision, and they show that the decision rules chosen by optimizing individuals are characterized by herd behavior.
Abstract: We analyze a sequential decision model in which each decision maker looks at the decisions made by previous decision makers when making her own decision. This is rational for her because these other decision makers may have some information that is important for her. We then show that the decision rules chosen by optimizing individuals will be characterized by herd behavior; i.e., people will do what others are doing rather than use their own information. We then show that the resulting equilibrium is inefficient.
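The flavor of the result can be reproduced with a stylized simulation. The sketch below is not the paper's Bayesian model; it uses a simple counting heuristic (follow the observed majority once its margin outweighs your private signal) as a hypothetical stand-in, just to show how early choices can swamp later agents' own information:

```python
import random

def simulate_cascade(n_agents=100, signal_accuracy=0.7, seed=1):
    """Stylized herding: each agent receives a private binary signal about the
    better option (correct with prob. signal_accuracy) and observes all earlier
    choices. Hypothetical decision rule: herd on the majority of earlier choices
    when its margin is at least 2, otherwise follow your own signal."""
    random.seed(seed)
    true_option = 1                      # option 1 is objectively better
    choices = []
    for _ in range(n_agents):
        signal = true_option if random.random() < signal_accuracy else 1 - true_option
        margin = sum(1 if c == 1 else -1 for c in choices)
        if margin >= 2:
            choice = 1                   # herd on option 1, ignoring the signal
        elif margin <= -2:
            choice = 0                   # herd on option 0, ignoring the signal
        else:
            choice = signal              # rely on own information
        choices.append(choice)
    return choices

choices = simulate_cascade()
print("fraction choosing the better option:", sum(choices) / len(choices))
# With some seeds the whole population locks onto the wrong option even though
# most private signals point the right way -- the inefficiency the paper formalizes.
```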

5,956 citations

BookDOI
TL;DR: In this article, a survey of elementary applications of probability theory can be found, including the following: 1. Plausible reasoning 2. The quantitative rules 3. Elementary sampling theory 4. Elementary hypothesis testing 5. Queer uses for probability theory 6. Elementary parameter estimation 7. The central, Gaussian or normal distribution 8. Sufficiency, ancillarity, and all that 9. Repetitive experiments, probability and frequency 10. Advanced applications: 11. Discrete prior probabilities, the entropy principle 12. Simple applications of decision theory 15.
Abstract: Foreword. Preface.
Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'.
Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory.
References. Appendix A. Other approaches to probability theory. Appendix B. Mathematical formalities and style. Appendix C. Convolutions and cumulants.
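The "simple applications of decision theory" in Part II reduce to one recipe: update a prior with the likelihood of the observed data, then choose the action with the smallest posterior expected loss. A minimal worked sketch, where the prior, likelihoods and loss matrix are made-up numbers rather than an example from the book:

```python
# Bayesian decision rule: pick the action minimizing posterior expected loss.
# All numbers below are illustrative assumptions.

prior = {"defective": 0.05, "ok": 0.95}   # P(state)
likelihood = {"defective": 0.9, "ok": 0.1}  # P(test fails | state)
observed_test_failed = True

def posterior_of(state):
    """Posterior P(state | observation) via Bayes' theorem."""
    p_obs = likelihood[state] if observed_test_failed else 1 - likelihood[state]
    evidence = sum(
        (likelihood[s] if observed_test_failed else 1 - likelihood[s]) * prior[s]
        for s in prior
    )
    return p_obs * prior[state] / evidence

posterior = {s: posterior_of(s) for s in prior}

# Loss of each action in each state (e.g. cost of scrapping vs. shipping a part).
loss = {
    "scrap": {"defective": 1.0, "ok": 10.0},
    "ship":  {"defective": 50.0, "ok": 0.0},
}

expected_loss = {a: sum(loss[a][s] * posterior[s] for s in prior) for a in loss}
best_action = min(expected_loss, key=expected_loss.get)

print(posterior)       # ~{'defective': 0.321, 'ok': 0.679}
print(expected_loss)   # scrap ~7.1, ship ~16.1
print("decision rule picks:", best_action)  # 'scrap'
```

The decision rule itself is just the argmin over actions of the posterior-weighted loss; changing the prior, the likelihoods, or the loss matrix changes which action that argmin selects.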

4,641 citations


Network Information
Related Topics (5)
Regression analysis: 31K papers, 1.7M citations, 82% related
Empirical research: 51.3K papers, 1.9M citations, 82% related
Estimator: 97.3K papers, 2.6M citations, 81% related
Information system: 107.5K papers, 1.8M citations, 79% related
Robustness (computer science): 94.7K papers, 1.6M citations, 79% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2024    1
2023    33
2022    76
2021    266
2020    293
2019    268