Author
Nandan Sudarsanam
Other affiliations: Massachusetts Institute of Technology, Indian Institutes of Technology
Bio: Nandan Sudarsanam is an academic researcher from Indian Institute of Technology Madras. The author has contributed to research in topics including Medicine and Business, has an h-index of 5, and has co-authored 19 publications receiving 174 citations. Previous affiliations of Nandan Sudarsanam include Massachusetts Institute of Technology and Indian Institutes of Technology.
Papers
[...]
TL;DR: A meta-analysis of 113 data sets from published factorial experiments shows that a preponderance of active two-factor interaction effects are synergistic, meaning that when main effects are used to increase the system response, the interaction provides an additional increase, and that when main effects are used to decrease the response, the interactions generally counteract the main effects.
Abstract: This article documents a meta-analysis of 113 data sets from published factorial experiments. The study quantifies regularities observed among factor effects and multifactor interactions. Such regularities are known to be critical to efficient planning and analysis of experiments and to robust design of engineering systems. Three previously observed properties are analyzed: effect sparsity, hierarchy, and heredity. A new regularity is introduced and shown to be statistically significant. It is shown that a preponderance of active two-factor interaction effects are synergistic, meaning that when main effects are used to increase the system response, the interaction provides an additional increase and that when main effects are used to decrease the response, the interactions generally counteract the main effects. © 2006 Wiley Periodicals, Inc. Complexity 11: 32–45, 2006
98 citations
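To make the synergy regularity concrete, the sketch below shows one common way to estimate main effects and two-factor interaction effects from a two-level full factorial and to classify an interaction as synergistic: its sign matches the product of the signs of its parent main effects, so it reinforces the response change those main effects suggest. This is an illustrative sketch, not code or data from the paper; the response values are made up.

```python
import itertools
import numpy as np

def two_level_effects(y, k):
    """Main-effect and two-factor-interaction coefficients for a full 2^k
    factorial, with factors coded -1/+1 and runs ordered as generated by
    itertools.product (last factor varies fastest)."""
    X = np.array(list(itertools.product([-1, 1], repeat=k)))   # run matrix
    y = np.asarray(y, dtype=float)
    mains = {i: float(np.mean(y * X[:, i])) for i in range(k)}
    twofis = {(i, j): float(np.mean(y * X[:, i] * X[:, j]))
              for i, j in itertools.combinations(range(k), 2)}
    return mains, twofis

def is_synergistic(mains, pair, value):
    """A two-factor interaction is synergistic when its sign equals the product
    of the signs of its parent main effects, i.e. it adds to the response change
    obtained by setting both factors according to their main effects."""
    i, j = pair
    return np.sign(value) == np.sign(mains[i]) * np.sign(mains[j])

# Illustrative (made-up) responses for a 2^3 experiment.
y = [54, 60, 52, 66, 57, 65, 55, 74]
mains, twofis = two_level_effects(y, k=3)
for pair, value in twofis.items():
    label = "synergistic" if is_synergistic(mains, pair, value) else "antagonistic"
    print(pair, round(value, 2), label)
```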
[...]
TL;DR: In this paper, an adaptive one-factor-at-a-time (aOFAT) approach is proposed for robust parameter design, with noise factors varied using a two-level resolution III fractional factorial array.
Abstract: This paper presents a conceptually simple and resource-efficient method for robust parameter design. The proposed method varies control factors according to an adaptive one-factor-at-a-time plan while varying noise factors using a two-level resolution III fractional factorial array. This method is compared with crossed arrays by analyzing a set of four case studies to which both approaches were applied. The proposed method improves system robustness effectively, attaining more than 80% of the potential improvement on average if experimental error is low. This figure improves to about 90% if prior knowledge of the system is used to define a promising starting point for the search. The results vary across the case studies, but, in general, both the average amount of improvement and the consistency of the results are better than those provided by crossed arrays if experimental error is low or if the system contains some large interactions involving two or more control factors. This is true despite the fact that the proposed method generally uses fewer experiments than crossed arrays. The case studies reveal that the proposed method provides these benefits by exploiting, with high probability, both control-by-noise interactions and also higher-order effects involving two control factors and a noise factor. The overall conclusion is that adaptive one-factor-at-a-time, used in concert with factorial outer arrays, is an effective approach to robust parameter design, providing significant practical advantages as compared to commonly used alternatives.
19 citations
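As a rough illustration of the procedure described above, the sketch below runs an adaptive one-factor-at-a-time search over two-level control factors, evaluating each candidate setting across a two-level resolution III fractional factorial "outer array" of noise factors. The black-box response function, its control-by-noise structure, the noise level, and the use of the mean over the noise array as the (larger-is-better) robustness measure are all assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
TRUE_BETA = rng.normal(size=(3, 3))      # hidden control-by-noise structure (made up)

def response(control, noise):
    """Hypothetical system: main effects plus control-by-noise interactions."""
    control, noise = np.asarray(control), np.asarray(noise)
    return (2.0 * control.sum() + noise.sum()
            + control @ TRUE_BETA @ noise
            + rng.normal(scale=0.5))     # experimental error

# 2^(3-1) resolution III noise array with generator C = AB.
NOISE_ARRAY = np.array([[-1, -1, +1],
                        [+1, -1, -1],
                        [-1, +1, -1],
                        [+1, +1, +1]])

def robustness(control):
    """Mean response over the noise outer array (larger is better, by assumption)."""
    return np.mean([response(control, z) for z in NOISE_ARRAY])

def aofat(start):
    """Adaptive OFAT: toggle one control factor at a time, keep improvements."""
    current = list(start)
    best = robustness(current)
    for i in range(len(current)):
        trial = list(current)
        trial[i] = -trial[i]
        score = robustness(trial)
        if score > best:                 # keep the change only if robustness improves
            current, best = trial, score
    return current, best

setting, score = aofat(start=[-1, -1, -1])
print("chosen control levels:", setting, "estimated robustness:", round(score, 2))
```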
Proceedings Article
[...]
TL;DR: In this paper, a novel variant of the UCB algorithm, referred to as Efficient-UCB-Variance (EUCBV), is proposed for minimizing cumulative regret in the stochastic multi-armed bandit (MAB) setting.
Abstract: We propose a novel variant of the UCB algorithm (referred to as Efficient-UCB-Variance (EUCBV)) for minimizing cumulative regret in the stochastic multi-armed bandit (MAB) setting. EUCBV incorporates the arm elimination strategy proposed in UCB-Improved, while taking into account the variance estimates to compute the arms' confidence bounds, similar to UCBV. Through a theoretical analysis we establish that EUCBV incurs a gap-dependent regret bound which is an improvement over that of existing state-of-the-art UCB algorithms (such as UCB1, UCB-Improved, UCBV, MOSS). Further, EUCBV incurs a gap-independent regret bound which is an improvement over that of UCB1, UCBV and UCB-Improved, while being comparable with that of MOSS and OCUCB. Through an extensive numerical study we show that EUCBV significantly outperforms the popular UCB variants (like MOSS, OCUCB, etc.) as well as Thompson sampling and Bayes-UCB algorithms.
17 citations
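The sketch below is a simplified, variance-aware index policy with arm elimination in the spirit of the abstract: it uses an empirical-Bernstein (UCBV-type) confidence bound and drops arms whose upper confidence bound falls below the best lower bound. It is not the EUCBV algorithm itself; the exact exploration constants, round structure, and elimination schedule are given in the paper. The Bernoulli bandit instance is made up.

```python
import math
import numpy as np

def bernstein_bonus(var, n, t):
    """Variance-aware (empirical-Bernstein / UCBV-type) exploration bonus."""
    return math.sqrt(2.0 * var * math.log(t) / n) + 3.0 * math.log(t) / n

def variance_ucb_with_elimination(means, horizon, seed=0):
    """Run one Bernoulli bandit instance; return cumulative (pseudo-)regret."""
    rng = np.random.default_rng(seed)
    k = len(means)
    active = list(range(k))
    counts, sums, sq_sums = np.zeros(k), np.zeros(k), np.zeros(k)
    regret, best_mean = 0.0, max(means)
    for t in range(1, horizon + 1):
        if t <= k:
            arm = t - 1                              # initialise: pull each arm once
        else:
            def index(i, sign=+1):
                mu = sums[i] / counts[i]
                var = max(sq_sums[i] / counts[i] - mu ** 2, 0.0)
                return mu + sign * bernstein_bonus(var, counts[i], t)
            arm = max(active, key=index)
        reward = float(rng.random() < means[arm])    # Bernoulli reward
        counts[arm] += 1
        sums[arm] += reward
        sq_sums[arm] += reward ** 2
        regret += best_mean - means[arm]
        if t > k and len(active) > 1:
            # Eliminate arms whose upper bound falls below the best lower bound.
            best_lower = max(index(i, sign=-1) for i in active)
            active = [i for i in active if index(i, sign=+1) >= best_lower]
    return regret

print("cumulative regret:", round(
    variance_ucb_with_elimination([0.5, 0.45, 0.4, 0.3], horizon=5000), 1))
```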
[...]
TL;DR: Through simulation work, it is established that AugUCB, owing to its utilization of variance estimates, performs significantly better than the state-of-the-art APT, CSAR and other non-variance-based algorithms.
Abstract: In this paper we propose the Augmented-UCB (AugUCB) algorithm for a fixed-budget version of the thresholding bandit problem (TBP), where the objective is to identify a set of arms whose quality is above a threshold. A key feature of AugUCB is that it uses both mean and variance estimates to eliminate arms that have been sufficiently explored; to the best of our knowledge this is the first algorithm to employ such an approach for the considered TBP. Theoretically, we obtain an upper bound on the loss (probability of mis-classification) incurred by AugUCB. Although UCBEV in the literature provides a better guarantee, it is important to emphasize that UCBEV has access to problem complexity (whose computation requires arms' means and variances), and hence is not realistic in practice; this is in contrast to AugUCB, whose implementation does not require any such complexity inputs. We conduct extensive simulation experiments to validate the performance of AugUCB. Through our simulation work, we establish that AugUCB, owing to its utilization of variance estimates, performs significantly better than the state-of-the-art APT, CSAR and other non-variance-based algorithms.
11 citations
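For illustration, here is a hedged sketch of a fixed-budget thresholding procedure that, like the description above, uses both mean and variance estimates to stop sampling arms whose relation to the threshold is already settled. It is not the AugUCB algorithm itself (its arm-selection rule, confidence radii, and elimination schedule are specified in the paper); the Bernoulli instance, threshold, and budget are made up, and the budget is assumed to be at least the number of arms.

```python
import math
import numpy as np

def threshold_bandit(means, tau, budget, seed=0):
    """Classify arms as above/below tau within a fixed budget (budget >= #arms)."""
    rng = np.random.default_rng(seed)
    k = len(means)
    counts, sums, sq_sums = np.zeros(k), np.zeros(k), np.zeros(k)
    active, above = set(range(k)), set()

    def radius(i, t):
        mu = sums[i] / counts[i]
        var = max(sq_sums[i] / counts[i] - mu ** 2, 0.0)
        # Empirical-Bernstein-style radius: shrinks faster for low-variance arms.
        return (math.sqrt(2.0 * var * math.log(t + 2) / counts[i])
                + 3.0 * math.log(t + 2) / counts[i])

    for t in range(budget):
        if not active:
            break                                    # every arm already classified
        if t < k:
            arm = t                                  # one initial pull per arm
        else:
            # Sample the active arm that is most ambiguous about the threshold.
            arm = min(active,
                      key=lambda i: abs(sums[i] / counts[i] - tau) - radius(i, t))
        x = float(rng.random() < means[arm])         # Bernoulli reward
        counts[arm] += 1
        sums[arm] += x
        sq_sums[arm] += x * x
        if t + 1 < k:
            continue                                 # wait until every arm has one pull
        # Classify-and-eliminate arms whose confidence interval excludes tau.
        for i in list(active):
            mu, r = sums[i] / counts[i], radius(i, t)
            if mu - r > tau:
                active.discard(i)
                above.add(i)
            elif mu + r < tau:
                active.discard(i)
    # Arms still undecided when the budget runs out go by their empirical mean.
    return above | {i for i in active if sums[i] / counts[i] >= tau}

print(sorted(threshold_bandit(means=[0.2, 0.45, 0.55, 0.8], tau=0.5, budget=3000)))
```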
[...]
TL;DR: It is suggested that running multiple adaptive experiments in parallel can be an effective way to improve the quality and performance of engineering systems, and that the proposed ensemble approach provides a reasonable procedure for aggregating the results of the many separate experiments.
Abstract: Ensemble Methods are proposed as a means to extend Adaptive One-Factor-at-a-Time (aOFAT) experimentation. The proposed method executes multiple aOFAT experiments on the same system with minor differences in experimental setup, such as ‘starting points'. Experimental conclusions are arrived at by aggregating the multiple, individual aOFATs. A comparison is made to test the performance of the new method against that of a traditional form of experimentation, namely a single fractional factorial design which is equally resource intensive. The comparisons between the two experimental algorithms are conducted using a hierarchical probability meta-model and an illustrative case study. The case is a wet clutch system with the goal of minimizing drag torque. In this study, the proposed procedure was consistently superior in performance to fractional factorial arrays across various experimental settings. At best, the proposed algorithm provides an expected value of improvement that is 15% higher than the traditional approach; at worst, the two methods are equally effective; and on average the improvement is about 10% higher with the new method. These findings suggest that running multiple adaptive experiments in parallel can be an effective way to make improvements in quality and performance of engineering systems, and the approach also provides a reasonable aggregation procedure by which to bring together the results of the many separate experiments. Copyright © 2011 John Wiley & Sons, Ltd.
7 citations
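A minimal sketch of the ensemble idea follows: several aOFAT searches are run on the same simulated system from different random starting points, and the individual recommendations are aggregated by a per-factor majority vote. The quadratic response with hidden two-factor interactions, the noise level, and the number of runs are made-up stand-ins, not the hierarchical probability meta-model or the wet clutch model used in the paper.

```python
import numpy as np

rng = np.random.default_rng(7)
K = 5                                       # number of two-level factors
MAIN = rng.normal(size=K)                   # hidden main effects (made up)
TWO_FI = np.triu(rng.normal(scale=0.5, size=(K, K)), 1)  # hidden interactions

def observe(x):
    """One noisy experimental observation of the system response."""
    x = np.asarray(x, dtype=float)
    return MAIN @ x + x @ TWO_FI @ x + rng.normal(scale=1.0)

def aofat(start):
    """Adaptive OFAT: toggle each factor once, keep a change if it helps."""
    current = list(start)
    best = observe(current)
    for i in range(K):
        trial = list(current)
        trial[i] = -trial[i]
        y = observe(trial)
        if y > best:
            current, best = trial, y
    return current

def ensemble_aofat(n_runs=9):
    """Aggregate several aOFAT runs by a per-factor majority vote."""
    runs = [aofat(rng.choice([-1, 1], size=K)) for _ in range(n_runs)]
    votes = np.sign(np.sum(runs, axis=0))
    return [int(v) if v != 0 else 1 for v in votes]   # break ties arbitrarily

print("ensemble recommendation:", ensemble_aofat())
```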
Cited by
Journal Article
[...]
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON
12,326 citations
[...]
TL;DR: I am moved by Professor Allan's elegy to bygone NHS virtues of ‘calm caring and gentle pace of clinical life… and all the time in the world to deliver compassionate care'.
Abstract: Editor – I am moved by Professor Allan's elegy to bygone NHS virtues of ‘calm caring and gentle pace of clinical life… and all the time in the world to deliver compassionate care' (Clin Med October 2009, p 407). One's immediate instinct would be to say ‘Ah, but times have changed' – only
396 citations
[...]
TL;DR: It is shown that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample.
Abstract: Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
314 citations
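The power ceiling described in this abstract is easy to reproduce by simulation. The sketch below is not the authors' method or their Web application; it assumes a simple design in which each stimulus belongs to one of two conditions, every participant responds to every stimulus, and the condition effect is tested with a two-sample t-test on stimulus means. All parameter values are made up. Adding participants averages away participant and residual noise in the stimulus means but leaves stimulus variance untouched, so power plateaus below 1.

```python
import numpy as np
from scipy import stats

def simulated_power(n_participants, n_stimuli_per_cond, effect=0.4,
                    sd_participant=1.0, sd_stimulus=1.0, sd_resid=1.0,
                    n_sims=2000, alpha=0.05, seed=0):
    """Monte Carlo power estimate for a by-stimulus t-test in a crossed design."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        # Random intercepts for participants and stimuli (crossed random factors).
        p_int = rng.normal(0, sd_participant, size=n_participants)
        s_int = rng.normal(0, sd_stimulus, size=2 * n_stimuli_per_cond)
        cond = np.repeat([0.0, 1.0], n_stimuli_per_cond)   # stimulus condition
        # Response of every participant to every stimulus.
        y = ((effect * cond + s_int)[None, :] + p_int[:, None]
             + rng.normal(0, sd_resid, size=(n_participants, 2 * n_stimuli_per_cond)))
        stim_means = y.mean(axis=0)                        # average over participants
        _, p = stats.ttest_ind(stim_means[cond == 1], stim_means[cond == 0])
        hits += p < alpha
    return hits / n_sims

for n_p in (10, 50, 500):
    print(n_p, "participants:", round(simulated_power(n_p, n_stimuli_per_cond=12), 3))
```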