scispace - formally typeset
Author

Yulia R. Gel

Bio: Yulia R. Gel is an academic researcher at the University of Texas at Dallas. She has contributed to research in topics including computer science and cryptocurrency, has an h-index of 21, and has co-authored 123 publications receiving 3,202 citations. Her previous affiliations include George Washington University and the University of Waterloo.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors introduce a new R package, nparLD, which gives statisticians and researchers from other disciplines easy, user-friendly access to the most up-to-date robust rank-based methods for the analysis of longitudinal data in factorial settings.
Abstract: Longitudinal data from factorial experiments frequently arise in various fields of study, ranging from medicine and biology to public policy and sociology. In most practical situations, the distribution of observed data is unknown, and there may exist a number of atypical measurements and outliers. Hence, the use of parametric and semi-parametric procedures that impose restrictive distributional assumptions on observed longitudinal samples becomes questionable. This, in turn, has led to a substantial demand for statistical procedures that enable us to accurately and reliably analyze longitudinal measurements in factorial experiments with minimal conditions on available data, and robust nonparametric methodology offering such a possibility becomes of particular practical importance. In this article, we introduce a new R package, nparLD, which provides statisticians and researchers from other disciplines easy, user-friendly access to the most up-to-date robust rank-based methods for the analysis of longitudinal data in factorial settings. We illustrate the implemented procedures with case studies from dentistry, biology, and medicine.
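nparLD itself is an R package, but the core idea of rank-based repeated-measures analysis, replacing raw observations with within-subject ranks so that no distributional assumptions are needed, can be illustrated in Python with the classical Friedman test. This is a simpler rank-based test, not nparLD's methodology, and the data below are made up:

```python
import numpy as np
from scipy import stats

# Hypothetical repeated-measures data: 8 subjects measured at 3 time points.
rng = np.random.default_rng(0)
baseline = rng.normal(10, 2, size=8)
t1 = baseline + rng.normal(1.5, 1, size=8)  # moderate shift at time 1
t2 = baseline + rng.normal(3.0, 1, size=8)  # larger shift at time 2

# Friedman's test ranks the measurements within each subject, so it needs
# no normality assumption -- the same motivation as nparLD's rank methods.
stat, p = stats.friedmanchisquare(baseline, t1, t2)
print(f"chi-square = {stat:.2f}, p = {p:.4f}")
```

With a clear upward shift across time points, the within-subject rankings are nearly identical for every subject and the test rejects the no-effect hypothesis.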

1,181 citations

Journal ArticleDOI
TL;DR: In this article, a modification of Levene-type tests that increases their power to detect monotonic trends in variances is discussed; it is useful when one is concerned with an alternative of increasing or decreasing variability, for example, increasing volatility of stock prices or "open or closed gramophones" in regression residual analysis.
Abstract: In many applications, the underlying scientific question concerns whether the variances of $k$ samples are equal. There are a substantial number of tests for this problem. Many of them rely on the assumption of normality and are not robust to its violation. In 1960 Professor Howard Levene proposed a new approach to this problem by applying the $F$-test to the absolute deviations of the observations from their group means. Levene's approach is powerful and robust to nonnormality and became a very popular tool for checking the homogeneity of variances. This paper reviews the original method proposed by Levene and subsequent robust modifications. A modification of Levene-type tests to increase their power to detect monotonic trends in variances is discussed. This procedure is useful when one is concerned with an alternative of increasing or decreasing variability, for example, increasing volatility of stock prices or "open or closed gramophones" in regression residual analysis. A major section of the paper is devoted to discussion of various scientific problems where Levene-type tests have been used, for example, economic anthropology, accuracy of medical measurements, volatility of the price of oil, studies of the consistency of jury awards in legal cases and the effect of hurricanes on ecological systems.
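Levene's original test and its robust Brown-Forsythe modification (though not the trend-alternative version discussed in the paper) are available in SciPy. A minimal sketch on synthetic data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Three samples: the third has a clearly larger spread.
a = rng.normal(0, 1.0, size=100)
b = rng.normal(0, 1.0, size=100)
c = rng.normal(0, 3.0, size=100)

# center='mean' is Levene's original 1960 proposal (F-test on absolute
# deviations from group means); center='median' is the Brown-Forsythe
# variant, which is more robust for skewed data.
stat, p = stats.levene(a, b, c, center='median')
print(f"W = {stat:.2f}, p = {p:.2e}")  # small p: variances differ
```

Note that `scipy.stats.levene` tests only the omnibus hypothesis of equal variances; the modification against monotonic trends described in the paper is a separate procedure.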

355 citations

Journal ArticleDOI
14 Feb 2013-PLOS ONE
TL;DR: An accessible and flexible forecast model, based on real-time, geographically focused, and easy-to-access data, provides individual medical centers with advanced warning of the expected number of influenza cases, thus allowing sufficient time to implement an intervention.
Abstract: Background We developed a practical influenza forecast model based on real-time, geographically focused, and easy-to-access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Second, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Methods Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004–2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear model (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and to assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. Results A GARMA(3,0) forecast model with a negative binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trends data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Conclusions Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search-query-based syndromic surveillance.
This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.
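The study's final model is a GARMA(3,0) with a negative binomial distribution. As a rough, hypothetical stand-in for the integer-valued autoregressive base model, one can regress log-transformed weekly counts on their three previous values; the data below are synthetic, not the study's:

```python
import numpy as np

# Synthetic weekly influenza-like counts: a noisy seasonal curve
# standing in for several seasons of confirmed cases.
rng = np.random.default_rng(1)
t = np.arange(200)
mean = 20 + 15 * np.sin(2 * np.pi * t / 52)
counts = rng.poisson(np.clip(mean, 1, None))

# In the spirit of the paper's base model: an AR(3) on the log scale,
# a crude stand-in for GARMA(3,0) with a count distribution.
y = np.log1p(counts)
X = np.column_stack([np.ones(len(y) - 3), y[2:-1], y[1:-2], y[:-3]])
beta, *_ = np.linalg.lstsq(X, y[3:], rcond=None)

# One-week-ahead forecast from the last three observed weeks.
x_new = np.array([1.0, y[-1], y[-2], y[-3]])
forecast = np.expm1(x_new @ beta)
print(f"next-week forecast: {forecast:.1f} cases")
```

A proper GARMA fit would model the counts directly with a negative binomial likelihood and could take Google Flu Trends as an exogenous regressor; this sketch only conveys the autoregressive structure.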

312 citations


Journal ArticleDOI
TL;DR: This paper presents lawstat, an R software package containing statistical tests and procedures utilized in various litigations on securities law, antitrust law, and equal employment and discrimination, as well as in public policy and biostatistics.
Abstract: We present a new R software package, lawstat, that contains statistical tests and procedures utilized in various litigations on securities law, antitrust law, and equal employment and discrimination, as well as in public policy and biostatistics. Along with well-known tests such as the Bartels test, the runs test, tests of homogeneity of several sample proportions, the Brunner-Munzel test, the Lorenz curve, the Cochran-Mantel-Haenszel test, and others, the package contains new distribution-free robust tests for symmetry, robust tests for normality that are more sensitive to heavy-tailed departures, measures of relative variability, Levene-type tests against trends in variances, etc. All implemented tests and methods are illustrated by simulations and real-life examples from legal cases, economics, and biostatistics. Although the package is called lawstat, it implements and discusses statistical procedures and tests that are also employed in a variety of other applications, e.g., biostatistics, environmental studies, and the social sciences; in other words, all applications utilizing statistical data analysis. Hence, the name of the package should not be considered a restriction to legal statistics. The package will be useful to applied statisticians and "quantitatively alert practitioners" of other subjects, as well as an asset in teaching statistics courses.
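As one illustration of the kind of procedure lawstat collects, here is a hand-rolled Python sketch of the classical Wald-Wolfowitz runs test of randomness about the median (lawstat's own implementation and the Bartels rank version differ in details):

```python
import math
import numpy as np

def runs_test(x):
    """Large-sample Wald-Wolfowitz runs test of randomness about the median.

    Returns the z-statistic and a two-sided normal-approximation p-value.
    """
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    signs = x[x != med] > med           # drop ties with the median
    n1, n2 = signs.sum(), (~signs).sum()
    runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
    n = n1 + n2
    mu = 2 * n1 * n2 / n + 1            # expected number of runs
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n**2 * (n - 1))
    z = (runs - mu) / math.sqrt(var)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

rng = np.random.default_rng(7)
z_iid, p_iid = runs_test(rng.normal(size=200))        # random sequence
z_trend, p_trend = runs_test(np.linspace(0, 1, 200))  # monotone trend: 2 runs
print(p_iid, p_trend)
```

A monotone trend produces only two runs (all-below-median, then all-above), far fewer than expected under randomness, so the test rejects decisively.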

140 citations


Cited by
Posted Content
TL;DR: In Out of the Crisis, originally published in 1982, Deming describes a theory of management based on his famous 14 Points for Management and explains the principles of management transformation and how to apply them.
Abstract: According to W. Edwards Deming, American companies require nothing less than a transformation of management style and of governmental relations with industry. In Out of the Crisis, originally published in 1982, Deming offers a theory of management based on his famous 14 Points for Management. Management's failure to plan for the future, he claims, brings about loss of market, which brings about loss of jobs. Management must be judged not only by the quarterly dividend, but by innovative plans to stay in business, protect investment, ensure future dividends, and provide more jobs through improved product and service. In simple, direct language, he explains the principles of management transformation and how to apply them.

9,241 citations

Journal ArticleDOI

6,278 citations

01 Jan 2016
Modern Applied Statistics with S

5,249 citations

Journal ArticleDOI
TL;DR: The theory of proper scoring rules on general probability spaces is reviewed and developed, and the intuitively appealing interval score is proposed as a utility function in interval estimation that addresses width as well as coverage.
Abstract: Scoring rules assess the quality of probabilistic forecasts, by assigning a numerical score based on the predictive distribution and on the event or value that materializes. A scoring rule is proper if the forecaster maximizes the expected score for an observation drawn from the distribution F if he or she issues the probabilistic forecast F, rather than G ≠ F. It is strictly proper if the maximum is unique. In prediction problems, proper scoring rules encourage the forecaster to make careful assessments and to be honest. In estimation problems, strictly proper scoring rules provide attractive loss and utility functions that can be tailored to the problem at hand. This article reviews and develops the theory of proper scoring rules on general probability spaces, and proposes and discusses examples thereof. Proper scoring rules derive from convex functions and relate to information measures, entropy functions, and Bregman divergences. In the case of categorical variables, we prove a rigorous version of the ...
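The interval score mentioned in the TL;DR has a simple closed form: for a central (1 − α) prediction interval [l, u] and realization x, it equals (u − l) + (2/α)(l − x)·1{x < l} + (2/α)(x − u)·1{x > u}, rewarding narrow intervals while penalizing observations that fall outside. A direct implementation:

```python
import numpy as np

def interval_score(lower, upper, x, alpha):
    """Interval score for a central (1 - alpha) prediction interval.

    Negatively oriented: smaller is better.  The width term rewards
    sharpness; the 2/alpha terms penalize observations outside [lower, upper].
    """
    lower, upper, x = map(np.asarray, (lower, upper, x))
    width = upper - lower
    below = (2 / alpha) * np.clip(lower - x, 0, None)  # x fell below lower
    above = (2 / alpha) * np.clip(x - upper, 0, None)  # x fell above upper
    return width + below + above

# 90% interval [0, 10] (alpha = 0.1):
print(interval_score(0, 10, 5, 0.1))   # inside: score = width = 10
print(interval_score(0, 10, 12, 0.1))  # 2 units above: 10 + 20*2 = 50
```

Because the penalty scales as 2/α, narrower nominal coverage (larger α) penalizes misses less, which is what makes the score a proper way to trade off width against coverage.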

4,644 citations

Journal ArticleDOI
TL;DR: In this article, the authors use a stochastic transmission model to assess whether isolation and contact tracing can control onward transmission from imported cases of COVID-19, quantifying the potential effectiveness of these measures against a severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2)-like pathogen.

2,068 citations