Author

Shelemyahu Zacks

Other affiliations: Fox Chase Cancer Center
Bio: Shelemyahu Zacks is an academic researcher from Binghamton University. The author has contributed to research on topics including estimators and the Poisson distribution, has an h-index of 22, and has co-authored 105 publications receiving 2,511 citations. Previous affiliations of Shelemyahu Zacks include Fox Chase Cancer Center.


Papers
Journal ArticleDOI
TL;DR: Describes an adaptive dose escalation scheme for use in cancer phase I clinical trials that makes use of all the information available at the time of each dose assignment and directly addresses the ethical need to control the probability of overdosing.
Abstract: We describe an adaptive dose escalation scheme for use in cancer phase I clinical trials. The method is fully adaptive, makes use of all the information available at the time of each dose assignment, and directly addresses the ethical need to control the probability of overdosing. It is designed to approach the maximum tolerated dose as fast as possible subject to the constraint that the predicted proportion of patients who receive an overdose does not exceed a specified value. We conducted simulations to compare the proposed method with four up-and-down designs, two stochastic approximation methods, and with a variant of the continual reassessment method. The results showed the proposed method to be effective as a means of controlling the frequency of overdosing. Relative to the continual reassessment method, our scheme overdosed a smaller proportion of patients, exhibited fewer toxicities and estimated the maximum tolerated dose with comparable accuracy. When compared to the non-parametric schemes, our method treated fewer patients at either subtherapeutic or severely toxic dose levels, treated more patients at optimal dose levels and estimated the maximum tolerated dose with smaller average bias and mean squared error. Hence, the proposed method is a promising alternative to currently used cancer phase I clinical trial designs.
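The overdose-control idea described above can be sketched in a few lines. This is a simplified illustration with hypothetical names, not the paper's actual algorithm: given posterior samples of the maximum tolerated dose (MTD), assign the highest candidate dose whose estimated probability of exceeding the MTD stays within a feasibility bound.

```python
import numpy as np

# Sketch of the overdose-control idea (hypothetical names, not the paper's
# implementation): pick the highest candidate dose whose posterior
# probability of exceeding the maximum tolerated dose (MTD) is <= alpha.

def select_dose(mtd_posterior_samples, candidate_doses, alpha=0.25):
    """Highest dose d with P(d > MTD | data) <= alpha; else the lowest dose."""
    feasible = [d for d in candidate_doses
                if np.mean(d > mtd_posterior_samples) <= alpha]
    return max(feasible) if feasible else min(candidate_doses)

# Stand-in posterior over the MTD (in the published method this posterior
# is updated after each patient's toxicity outcome)
rng = np.random.default_rng(0)
samples = rng.normal(100.0, 15.0, size=10_000)
dose = select_dose(samples, candidate_doses=[40, 60, 80, 100, 120])
```

In the published scheme the feasibility bound is what controls the predicted proportion of overdosed patients; here a fixed normal posterior stands in purely for illustration.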

580 citations

Book
01 Jan 1971

572 citations

Journal ArticleDOI
TL;DR: The methods of statistical catastrophe theory draw upon the theories of parameter estimation for multiparameter exponential families, nonlinear time-series analysis, and stochastic differential equations.
Abstract: Although catastrophe theory has been applied with mixed success to many problems in the biosciences, very few of these applications have used any form of statistical modeling. We present examples of the applications of statistical catastrophe theory in the analysis of experimental data. These include examples of hysteresis effects, bifurcation effects, and the full cusp catastrophe model. The methods of statistical catastrophe theory draw upon the theories of parameter estimation for multiparameter exponential families, nonlinear time-series analysis, and stochastic differential equations. We discuss the application of these methods to both canonical and noncanonical catastrophe models.
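The cusp model mentioned above has a standard canonical form, V(x) = x^4/4 - b*x^2/2 - a*x, whose equilibria are the real roots of x^3 - b*x - a = 0; bistability (and hence hysteresis) occurs inside the bifurcation set 27a^2 < 4b^3. A minimal sketch using only the canonical form (not the paper's fitted statistical models):

```python
import numpy as np

# Canonical cusp catastrophe (standard normal form, assumed for
# illustration): V(x) = x**4/4 - b*x**2/2 - a*x,
# so equilibria solve dV/dx = x**3 - b*x - a = 0.

def equilibria(a, b):
    """Real roots of x^3 - b x - a = 0 (stationary points of V)."""
    roots = np.roots([1.0, 0.0, -b, -a])
    return sorted(r.real for r in roots if abs(r.imag) < 1e-8)

def bistable(a, b):
    """True inside the bifurcation set 27 a^2 < 4 b^3 (two stable states)."""
    return 27 * a**2 < 4 * b**3

inside = equilibria(a=0.1, b=1.0)    # three equilibria: hysteresis possible
outside = equilibria(a=2.0, b=1.0)   # single equilibrium
```

Sweeping the control parameter a back and forth through the bistable region while following the nearest stable root reproduces the hysteresis loop the abstract refers to.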

112 citations

Book
14 May 1992
TL;DR: The monograph's approach is that of superpopulation models, in which values of the population elements are considered as random variables having joint distributions; the emphasis is on the analysis of data rather than on the design of samples.
Abstract: A large number of papers have appeared in the past 20 years on estimating and predicting characteristics of finite populations. This monograph is designed to present this modern theory in a systematic and consistent manner. The author's approach is that of superpopulation models in which values of the population elements are considered as random variables having joint distributions. Throughout, the emphasis is on the analysis of data rather than on the design of samples. Topics covered include: optimal predictors for various superpopulation models; Bayes, minimax, and maximum likelihood predictors; classical and Bayesian prediction intervals; model robustness; and models with measurement errors. Each chapter contains numerous examples and exercises that extend and illustrate the themes in the text. As a result, this book will be ideal for all those research workers seeking an up-to-date and well-referenced introduction to the subject.
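The prediction viewpoint described above can be illustrated with a toy ratio model (the model and all names here are assumptions of this sketch, in the spirit of Royall-type model-based predictors): the finite-population total is predicted as the observed sample sum plus the model-based prediction for the unsampled units.

```python
import numpy as np

# Toy ratio-model sketch of model-based prediction of a finite-population
# total: assume y_i = beta * x_i + eps_i, and predict T = sum(y) by the
# observed sum plus the model prediction for the unsampled units.

rng = np.random.default_rng(1)
N, n = 1_000, 100
x = rng.uniform(1.0, 5.0, size=N)              # auxiliary variable, known for all units
y = 2.0 * x + rng.normal(0.0, 0.5, size=N)     # superpopulation model, beta = 2

sampled = np.zeros(N, dtype=bool)
sampled[rng.choice(N, size=n, replace=False)] = True

beta_hat = y[sampled].sum() / x[sampled].sum()            # ratio estimate of beta
total_pred = y[sampled].sum() + beta_hat * x[~sampled].sum()
rel_err = abs(total_pred - y.sum()) / y.sum()
```

The key point of the superpopulation view is visible here: only the unsampled units need prediction, so the quality of the predictor rests on the model for y given x rather than on the sampling design.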

106 citations

Book
28 Jan 2014
TL;DR: Graduate and post-graduate students in the areas of statistical quality and engineering, as well as industrial statisticians, researchers and practitioners in these fields, will all benefit from the comprehensive combination of theoretical and practical information provided in this single volume.
Abstract: Fully revised and updated, this book combines a theoretical background with examples and references to R, MINITAB and JMP, enabling practitioners to find state-of-the-art material on both foundation and implementation tools to support their work. Topics addressed include computer-intensive data analysis, acceptance sampling, univariate and multivariate statistical process control, design of experiments, quality by design, and reliability using classical and Bayesian methods. The book can be used for workshops or courses on acceptance sampling, statistical process control, design of experiments, and reliability. Graduate and post-graduate students in the areas of statistical quality and engineering, as well as industrial statisticians, researchers and practitioners in these fields will all benefit from the comprehensive combination of theoretical and practical information provided in this single volume. Modern Industrial Statistics: With Applications in R, MINITAB and JMP:
- Combines a practical approach with theoretical foundations and computational support.
- Provides examples in R using a dedicated package called MISTAT, and also refers to MINITAB and JMP.
- Includes exercises at the end of each chapter to aid learning and test knowledge.
- Provides over 40 data sets representing real-life case studies.
- Is complemented by a comprehensive website providing an introduction to R, and installations of JMP scripts and MINITAB macros, including effective tutorials with introductory material: www.wiley.com/go/modern_industrial_statistics.

82 citations


Cited by
Journal ArticleDOI
TL;DR: In this article, a classical confidence belt construction is proposed to unify the treatment of upper confidence limits for null results and two-sided confidence intervals for non-null results, applied to Poisson processes with background and Gaussian errors with a bounded physical region.
Abstract: We give a classical confidence belt construction which unifies the treatment of upper confidence limits for null results and two-sided confidence intervals for non-null results. The unified treatment solves a problem (apparently not previously recognized) that the choice of upper limit or two-sided intervals leads to intervals which are not confidence intervals if the choice is based on the data. We apply the construction to two related problems which have recently been a battle-ground between classical and Bayesian statistics: Poisson processes with background, and Gaussian errors with a bounded physical region. In contrast with the usual classical construction for upper limits, our construction avoids unphysical confidence intervals. In contrast with some popular Bayesian intervals, our intervals eliminate conservatism (frequentist coverage greater than the stated confidence) in the Gaussian case and reduce it to a level dictated by discreteness in the Poisson case. We generalize the method in order to apply it to analysis of experiments searching for neutrino oscillations. We show that this technique both gives correct coverage and is powerful, while other classical techniques that have been used by neutrino oscillation search experiments fail one or both of these criteria.
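The unified construction for a Poisson count with known background can be sketched directly: for each candidate signal mu, rank the possible counts n by the likelihood ratio relative to the best physically allowed fit, accept counts in decreasing rank until coverage reaches the confidence level, then report the set of mu whose acceptance region contains the observed count. This is a simplified grid implementation for illustration, not the authors' code:

```python
import math

# Grid sketch of the likelihood-ratio-ordered confidence belt for a
# Poisson count n with known background b -- an illustration only.

def pois(n, lam):
    return math.exp(-lam) * lam**n / math.factorial(n)

def acceptance_region(mu, b, cl=0.90, n_max=50):
    """Counts n accepted for signal mu, ranked by R = P(n|mu+b)/P(n|mu_best+b)."""
    ranked = []
    for n in range(n_max + 1):
        mu_best = max(0.0, n - b)        # best physically allowed signal
        p = pois(n, mu + b)
        ranked.append((p / pois(n, mu_best + b), n, p))
    ranked.sort(reverse=True)            # add counts in decreasing rank
    accepted, total = set(), 0.0
    for _, n, p in ranked:
        accepted.add(n)
        total += p
        if total >= cl:
            break
    return accepted

def fc_interval(n_obs, b, cl=0.90, mu_max=20.0, step=0.01):
    """Confidence interval: all mu on the grid whose region contains n_obs."""
    mus = [i * step for i in range(int(mu_max / step) + 1)]
    inside = [mu for mu in mus if n_obs in acceptance_region(mu, b, cl)]
    return min(inside), max(inside)

lo, hi = fc_interval(n_obs=0, b=3.0)     # no events seen over background 3
```

Because the ordering is anchored at the physically allowed best fit, the interval for n_obs = 0 starts at zero rather than going unphysical, which is exactly the property the abstract highlights.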

2,830 citations

Journal ArticleDOI
TL;DR: In this paper, the authors review maximum likelihood (ML) and restricted maximum likelihood (REML) estimation of variance components; REML, proposed by Patterson and Thompson (1971), takes into account the loss in degrees of freedom resulting from estimating fixed effects.
Abstract: Recent developments promise to increase greatly the popularity of maximum likelihood (ml) as a technique for estimating variance components. Patterson and Thompson (1971) proposed a restricted maximum likelihood (reml) approach which takes into account the loss in degrees of freedom resulting from estimating fixed effects. Miller (1973) developed a satisfactory asymptotic theory for ml estimators of variance components. There are many iterative algorithms that can be considered for computing the ml or reml estimates. The computations on each iteration of these algorithms are those associated with computing estimates of fixed and random effects for given values of the variance components.
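The degrees-of-freedom point is easiest to see in the simplest possible case (an illustration, not the paper's mixed-model setting): when a single mean is estimated from N observations, the ML variance estimate divides the sum of squares by N, while REML divides by N - 1, charging one degree of freedom for the estimated fixed effect.

```python
import numpy as np

# Toy illustration of the REML degrees-of-freedom correction:
# for y_i ~ N(mu, s2) with mu estimated by the sample mean,
# ML divides the sum of squares by N, REML by N - 1.

rng = np.random.default_rng(42)
y = rng.normal(5.0, 2.0, size=200)

ss = np.sum((y - y.mean()) ** 2)
sigma2_ml = ss / len(y)            # ML: ignores that mu was estimated
sigma2_reml = ss / (len(y) - 1)    # REML: charges one df for the mean
```

In the general mixed model the same correction appears implicitly: REML maximizes the likelihood of error contrasts that are invariant to the fixed effects, which is why it removes the downward bias of ML variance estimates.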

2,440 citations

Journal ArticleDOI
TL;DR: A procedure is derived for extracting the observed information matrix when the EM algorithm is used to find maximum likelihood estimates in incomplete data problems and a method useful in speeding up the convergence of the EM algorithms is developed.
Abstract: A procedure is derived for extracting the observed information matrix when the EM algorithm is used to find maximum likelihood estimates in incomplete data problems. The technique requires computation of a complete-data gradient vector or second derivative matrix, but not those associated with the incomplete data likelihood. In addition, a method useful in speeding up the convergence of the EM algorithm is developed. Two examples are presented.
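The identity behind this procedure (observed information = complete-data information minus missing information) can be demonstrated on the classic genetic-linkage multinomial often used to illustrate it; this example is an assumption of the sketch, not necessarily one of the paper's two examples.

```python
import math

# Louis-style observed information for the classic genetic-linkage
# multinomial: counts y = (125, 18, 20, 34) with cell probabilities
# (1/2 + t/4, (1-t)/4, (1-t)/4, t/4).  Splitting the first cell into a
# latent binomial part gives a simple EM.

y1, y2, y3, y4 = 125, 18, 20, 34

t = 0.5
for _ in range(200):
    # E-step: expected latent count from the first cell carrying t/4
    x12 = y1 * (t / 4) / (0.5 + t / 4)
    # M-step: complete-data MLE of t
    t = (x12 + y4) / (x12 + y2 + y3 + y4)

# Observed information = complete information - missing information
p = (t / 4) / (0.5 + t / 4)
x12 = y1 * p
info_complete = (x12 + y4) / t**2 + (y2 + y3) / (1 - t)**2
info_missing = y1 * p * (1 - p) / t**2   # Var of complete-data score given y
info_observed = info_complete - info_missing
se = 1 / math.sqrt(info_observed)
```

Only complete-data quantities appear in the two terms, which is the practical appeal the abstract describes: the incomplete-data second derivatives are never needed.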

2,145 citations

Journal ArticleDOI
TL;DR: Ceritinib was highly active in patients with advanced, ALK-rearranged NSCLC, including those who had had disease progression during crizotinib treatment, regardless of the presence of resistance mutations in ALK.
Abstract: Background: Non–small-cell lung cancer (NSCLC) harboring the anaplastic lymphoma kinase gene (ALK) rearrangement is sensitive to the ALK inhibitor crizotinib, but resistance invariably develops. Ceritinib (LDK378) is a new ALK inhibitor that has shown greater antitumor potency than crizotinib in preclinical studies. Methods: In this phase 1 study, we administered oral ceritinib in doses of 50 to 750 mg once daily to patients with advanced cancers harboring genetic alterations in ALK. In an expansion phase of the study, patients received the maximum tolerated dose. Patients were assessed to determine the safety, pharmacokinetic properties, and antitumor activity of ceritinib. Tumor biopsies were performed before ceritinib treatment to identify resistance mutations in ALK in a group of patients with NSCLC who had had disease progression during treatment with crizotinib. Results: A total of 59 patients were enrolled in the dose-escalation phase. The maximum tolerated dose of ceritinib was 750 mg once daily; dose-l...

1,297 citations

Journal ArticleDOI
TL;DR: A review of the book Applied Probability and Queues, a text on applied probability and queueing theory.
Abstract: (1987). Applied Probability and Queues. Journal of the Operational Research Society: Vol. 38, No. 11, pp. 1095-1096.

1,121 citations