Open Access Book

A calculus for factorial arrangements

TLDR
In this paper, a factorial experiment involving n factors F1, F2, ..., Fn at m1, m2, ..., mn (≥ 2) levels respectively is considered; the effect due to a typical treatment combination j = (j1, j2, ..., jn) is denoted by τ(j1, j2, ..., jn).
Abstract
Consider a factorial experiment involving n factors, F1, F2, ..., Fn, at m1, m2, ..., mn (≥ 2) levels respectively. Let the levels of Fi be coded as 0, 1, ..., mi − 1 (1 ≤ i ≤ n). A typical selection of levels j = (j1, j2, ..., jn), 0 ≤ ji ≤ mi − 1, 1 ≤ i ≤ n, will be termed the jth treatment combination, and the effect due to this treatment combination will be denoted by τ(j1, j2, ..., jn).
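The setup above is easy to make concrete. The sketch below (not from the book; a minimal Python illustration under hypothetical level counts m = (2, 3)) enumerates the treatment combinations j = (j1, ..., jn), with the printed labels standing in for the effects τ(j1, ..., jn).

```python
# Minimal sketch (not from the book): enumerate the treatment combinations
# of a factorial arrangement with n factors at m1, ..., mn levels.
from itertools import product

def treatment_combinations(levels):
    """Yield every combination j = (j1, ..., jn) with 0 <= ji <= mi - 1."""
    return product(*(range(m) for m in levels))

# Hypothetical example: n = 2 factors at m = (2, 3) levels gives 2 * 3 = 6 combinations.
levels = (2, 3)
for j in treatment_combinations(levels):
    print(f"tau{j}")  # placeholder label for the effect tau(j1, ..., jn)
```

For m = (2, 3) this prints the six combinations (0, 0) through (1, 2), one label per treatment combination.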


Citations
Journal Article

Generalized Entropy Power Inequalities and Monotonicity Properties of Information

TL;DR: In this paper, a new family of Fisher information and entropy power inequalities for sums of independent random variables is presented; these relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets.
Journal Article

Classification of two-level factorial fractions

TL;DR: As discussed in this paper, the problem of finding a fraction of a two-level factorial design with specific properties is usually solved within special classes, such as regular or Plackett-Burman designs.
Journal Article

Using Standard Tools From Finite Population Sampling to Improve Causal Inference for Complex Experiments

TL;DR: In this paper, the authors consider causal inference for treatment contrasts from a randomized experiment using potential outcomes in a finite population setting and develop an inferential framework for general mechanisms of assigning experimental units to multiple treatments.
Proceedings Article

The Monotonicity of Information in the Central Limit Theorem and Entropy Power Inequalities

TL;DR: A simple proof of the monotonicity of information in the central limit theorem for i.i.d. summands is provided and new families of Fisher information and entropy power inequalities are discussed.
Journal Article

Experimental design: methods and applications: an updated bibliography of books in English

TL;DR: This paper provides a comprehensive listing of books, mainly in English, on experimental design, covering historical development, classification, construction, experimental layout, and statistical analysis.