Author

Bovas Abraham

Bio: Bovas Abraham is an academic researcher from the University of Waterloo. The author has contributed to research on topics including autoregressive models and time series. The author has an h-index of 23 and has co-authored 72 publications receiving 2309 citations. Previous affiliations of Bovas Abraham include the University of British Columbia and Dalhousie University.


Papers
Journal ArticleDOI
TL;DR: A group of practitioners and researchers discusses the role of parameter design, Taguchi's methodology for implementing it, and efforts to integrate parameter-design principles with well-established statistical techniques.
Abstract: It is more than a decade since Genichi Taguchi's ideas on quality improvement were introduced in the United States. His parameter-design approach for reducing variation in products and processes has generated a great deal of interest among both quality practitioners and statisticians. The statistical techniques used by Taguchi to implement parameter design have been the subject of much debate, however, and there has been considerable research aimed at integrating the parameter-design principles with well-established statistical techniques. On the other hand, Taguchi and his colleagues feel that these research efforts by statisticians are misguided and reflect a lack of understanding of the engineering principles underlying Taguchi's methodology. This panel discussion provides a forum for a technical discussion of these diverse views. A group of practitioners and researchers discuss the role of parameter design and Taguchi's methodology for implementing it. The topics covered include the importance of vari...

654 citations

01 Oct 2001
TL;DR: In published discussions of Six Sigma improvement methodology, the terms "Black Belt," "Master Black Belt," and "Green Belt" are often used indiscriminately without clear d..
Abstract: [This abstract is based on the author's abstract.] In published discussions of Six Sigma improvement methodology, the terms "Black Belt," "Master Black Belt," and "Green Belt" are often used indiscriminately without clear d..

193 citations

Journal ArticleDOI
TL;DR: In this article, the authors proposed a novel adjustment to the empirical likelihood that retains all the optimality properties and guarantees a sensible value of the likelihood at any parameter value, and introduced an iterative algorithm that is guaranteed to converge.
Abstract: Computing a profile empirical likelihood function, which involves constrained maximization, is a key step in applications of empirical likelihood. However, in some situations, the required numerical problem has no solution. In this case, the convention is to assign a zero value to the profile empirical likelihood. This strategy has at least two limitations. First, it is numerically difficult to determine that there is no solution; second, no information is provided on the relative plausibility of the parameter values where the likelihood is set to zero. In this article, we propose a novel adjustment to the empirical likelihood that retains all the optimality properties, and guarantees a sensible value of the likelihood at any parameter value. Coupled with this adjustment, we introduce an iterative algorithm that is guaranteed to converge. Our simulation indicates that the adjusted empirical likelihood is much faster to compute than the profile empirical likelihood. The confidence regions constructed via t...

189 citations
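The adjustment described above can be sketched for the simplest setting, empirical likelihood for a scalar mean. The pseudo-observation g_{n+1} = -a_n * ḡ with a_n = max(1, log(n)/2) follows the published adjusted-empirical-likelihood construction; the bisection solver and all function names below are illustrative, not code from the paper.

```python
import math

def el_log_ratio(g):
    """Empirical log-likelihood ratio statistic for E[g] = 0, given centred
    data g. Solves sum g_i/(1 + lam*g_i) = 0 for the Lagrange multiplier by
    bisection; returns +inf when 0 is outside the convex hull of g."""
    lo_g, hi_g = min(g), max(g)
    if not (lo_g < 0.0 < hi_g):
        return float("inf")          # no solution: hull condition fails
    eps = 1e-10
    lo = -1.0 / hi_g + eps           # keep all weights 1 + lam*g_i > 0
    hi = -1.0 / lo_g - eps

    def h(lam):
        return sum(gi / (1.0 + lam * gi) for gi in g)

    for _ in range(200):             # h is strictly decreasing in lam
        mid = 0.5 * (lo + hi)
        if h(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log(1.0 + lam * gi) for gi in g)

def adjusted_el_log_ratio(x, theta):
    """Adjusted empirical likelihood: append the pseudo-point -a_n * gbar,
    which guarantees the hull contains 0, so the ratio is finite everywhere."""
    n = len(x)
    g = [xi - theta for xi in x]
    gbar = sum(g) / n
    a_n = max(1.0, math.log(n) / 2.0)
    return el_log_ratio(g + [-a_n * gbar])
```

At the sample mean the multiplier is zero and the statistic vanishes; away from the data's convex hull the ordinary profile likelihood is degenerate while the adjusted version still returns a finite, comparable value.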

Journal ArticleDOI
TL;DR: In this article, the aberrant innovation model and aberrant observation model are considered to characterize outliers in time series, allowing for a small probability that any given observation is "bad" and in this set-up the inference about the parameters of an autoregressive model is considered.
Abstract: Two models, the aberrant innovation model and the aberrant observation model, are considered to characterize outliers in time series. The approach adopted here allows for a small probability α that any given observation is 'bad', and in this set-up inference about the parameters of an autoregressive model is considered.

172 citations

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a method for distinguishing an observational outlier from an innovational one using regression analysis techniques, and a four-step procedure for modeling time series in the presence of outliers.
Abstract: Some statistics used in regression analysis are considered for detection of outliers in time series. Approximations and asymptotic distributions of these statistics are considered. A method is proposed for distinguishing an observational outlier from an innovational one. A four-step procedure for modeling time series in the presence of outliers is also proposed, and an example is presented to illustrate the methodology.

128 citations
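The distinction drawn in the two outlier papers above can be made concrete with a small simulation (an illustrative sketch of the AO/IO contrast, not the detection procedure from the papers; the function name and parameters are hypothetical). An additive, or observational, outlier perturbs a single observed value; an innovational outlier enters the shock at one time point and propagates through the autoregressive dynamics.

```python
import random

def ar1_with_outlier(n, phi, omega, t0, kind, seed=0):
    """Simulate an AR(1) series z_t = phi*z_{t-1} + e_t with one outlier of
    size omega at time t0.
    kind='IO': shock enters the innovation e_{t0} and propagates via phi.
    kind='AO': shock is added to the single observed value y_{t0} only.
    Returns (clean state path, observed series)."""
    rng = random.Random(seed)
    z = 0.0
    clean, observed = [], []
    for t in range(n):
        e = rng.gauss(0.0, 1.0)
        if kind == "IO" and t == t0:
            e += omega               # innovational outlier: contaminated shock
        z = phi * z + e
        clean.append(z)
        # additive outlier: the state is untouched, only this observation moves
        y = z + (omega if kind == "AO" and t == t0 else 0.0)
        observed.append(y)
    return clean, observed
```

With phi = 0.5, an innovational outlier of size omega shifts the observation k steps later by omega * phi**k, while the additive outlier leaves every observation other than y_{t0} unchanged; this decaying-versus-isolated footprint is what an outlier-discrimination procedure exploits.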


Cited by
Book
08 Sep 2000
TL;DR: This book presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects, and provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.
Abstract: The increasing volume of data in modern business and science calls for more complex and sophisticated tools. Although advances in data mining technology have made extensive data collection much easier, the field is still evolving, and there is a constant need for new techniques and tools that can help us transform this data into useful information and knowledge. Since the previous edition's publication, great advances have been made in the field of data mining. Not only does the third edition of Data Mining: Concepts and Techniques continue the tradition of equipping you with an understanding and application of the theory and practice of discovering patterns hidden in large data sets, it also focuses on new, important topics in the field: data warehouses and data cube technology, stream mining, social network mining, and the mining of spatial, multimedia, and other complex data. Each chapter is a stand-alone guide to a critical topic, presenting proven algorithms and sound implementations ready to be used directly or with strategic modification against live data. This is the resource you need if you want to apply today's most powerful data mining techniques to meet real business challenges. * Presents dozens of algorithms and implementation examples, all in pseudo-code and suitable for use in real-world, large-scale data mining projects. * Addresses advanced topics such as mining object-relational databases, spatial databases, multimedia databases, time-series databases, text databases, the World Wide Web, and applications in several fields. * Provides a comprehensive, practical look at the concepts and techniques you need to get the most out of real business data.

23,600 citations

Journal Article
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON

13,333 citations

Journal ArticleDOI
TL;DR: This survey tries to provide a structured and comprehensive overview of the research on anomaly detection by grouping existing techniques into different categories based on the underlying approach adopted by each technique.
Abstract: Anomaly detection is an important problem that has been researched within diverse research areas and application domains. Many anomaly detection techniques have been specifically developed for certain application domains, while others are more generic. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. We have grouped existing techniques into different categories based on the underlying approach adopted by each technique. For each category we have identified key assumptions, which are used by the techniques to differentiate between normal and anomalous behavior. When applying a given technique to a particular domain, these assumptions can be used as guidelines to assess the effectiveness of the technique in that domain. For each category, we provide a basic anomaly detection technique, and then show how the different existing techniques in that category are variants of the basic technique. This template provides an easier and more succinct understanding of the techniques belonging to each category. Further, for each category, we identify the advantages and disadvantages of the techniques in that category. We also provide a discussion on the computational complexity of the techniques since it is an important issue in real application domains. We hope that this survey will provide a better understanding of the different directions in which research has been done on this topic, and how techniques developed in one area can be applied in domains for which they were not intended to begin with.

9,627 citations

Posted Content
TL;DR: Deming's theory of management, based on his famous 14 Points for Management, is described in Out of the Crisis, originally published in 1982, in which he explains the principles of management transformation and how to apply them.
Abstract: According to W. Edwards Deming, American companies require nothing less than a transformation of management style and of governmental relations with industry. In Out of the Crisis, originally published in 1982, Deming offers a theory of management based on his famous 14 Points for Management. Management's failure to plan for the future, he claims, brings about loss of market, which brings about loss of jobs. Management must be judged not only by the quarterly dividend, but by innovative plans to stay in business, protect investment, ensure future dividends, and provide more jobs through improved product and service. In simple, direct language, he explains the principles of management transformation and how to apply them.

9,241 citations

Posted Content
TL;DR: In this paper, the authors provide a unified and comprehensive theory of structural time series models, including a detailed treatment of the Kalman filter for modeling economic and social time series, and address the special problems which the treatment of such series poses.
Abstract: In this book, Andrew Harvey sets out to provide a unified and comprehensive theory of structural time series models. Unlike the traditional ARIMA models, structural time series models consist explicitly of unobserved components, such as trends and seasonals, which have a direct interpretation. As a result the model selection methodology associated with structural models is much closer to econometric methodology. The link with econometrics is made even closer by the natural way in which the models can be extended to include explanatory variables and to cope with multivariate time series. From the technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. The book includes a detailed treatment of the Kalman filter. This technique was originally developed in control engineering, but is becoming increasingly important in fields such as economics and operations research. This book is concerned primarily with modelling economic and social time series, and with addressing the special problems which the treatment of such series poses. The properties of the models and the methodological techniques used to select them are illustrated with various applications. These range from the modelling of trends and cycles in US macroeconomic time series to an evaluation of the effects of seat belt legislation in the UK.

4,252 citations
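The local level model, the simplest structural time series model of the kind the abstract above describes, makes the state space/Kalman filter machinery concrete: an unobserved level follows a random walk and is observed with noise. The recursion below is the standard textbook filter for that model (a sketch under the stated Gaussian noise assumptions, not code from the book).

```python
def local_level_filter(y, sigma_eps2, sigma_eta2, a0=0.0, p0=1e7):
    """Kalman filter for the local level model
        y_t  = mu_t + eps_t,       eps_t ~ N(0, sigma_eps2)   (observation)
        mu_t = mu_{t-1} + eta_t,   eta_t ~ N(0, sigma_eta2)   (state)
    a0, p0 give the initial level and its variance (large p0 = diffuse prior).
    Returns filtered level estimates and their variances."""
    a, p = a0, p0
    levels, variances = [], []
    for yt in y:
        # prediction step: random-walk state, variance grows by sigma_eta2
        p = p + sigma_eta2
        # update step: fold in the new observation
        f = p + sigma_eps2           # innovation (one-step forecast) variance
        k = p / f                    # Kalman gain
        a = a + k * (yt - a)         # filtered level
        p = p * (1.0 - k)            # filtered variance
        levels.append(a)
        variances.append(p)
    return levels, variances
```

On a constant series the filtered level locks onto the observed value and the filtered variance shrinks from the diffuse prior toward a steady state determined by the two noise variances.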