
Showing papers by "Rahul Mukerjee" published in 2010


Journal ArticleDOI
TL;DR: It is shown how block designs can be used to construct VCS which achieve optimality with respect to the average and minimum relative contrasts but require much smaller pixel expansions than the existing ones.
Abstract: In (k, n) visual cryptographic schemes (VCS), a secret image is encrypted into n pages of ciphertext, each printed on a transparency sheet, which are distributed among n participants. The image can be visually decoded if any k (≥ 2) of these sheets are stacked on top of one another, while this is not possible by stacking any k − 1 or fewer sheets. We employ a Kronecker algebra to obtain necessary and sufficient conditions for the existence of a (k, n) VCS with a prior specification of relative contrasts that quantify the clarity of the recovered image. The connection of these conditions with an L1-norm formulation as well as a convenient linear programming formulation is explored. These are employed to settle certain conjectures on contrast-optimal VCS for the cases k = 4 and 5. Furthermore, for k = 3, we show how block designs can be used to construct VCS which achieve optimality with respect to the average and minimum relative contrasts but require much smaller pixel expansions than the existing ones.
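
To illustrate the stacking mechanism underlying such schemes, here is a minimal sketch of the classical (2, 2) VCS with pixel expansion 2; it shows only the basic OR principle, not the block-design construction of the paper, and all names in the code are ours.

```python
import random

# Basis matrices of the classical (2, 2) VCS with pixel expansion 2.
# Row i of the chosen matrix, with its columns randomly permuted,
# becomes the pair of subpixels printed on share i.
S0 = [[0, 1], [0, 1]]  # white secret pixel: the two shares agree
S1 = [[0, 1], [1, 0]]  # black secret pixel: the two shares are complementary

def encrypt_pixel(bit):
    """Split one secret pixel (0 = white, 1 = black) into two shares."""
    S = S1 if bit else S0
    cols = [0, 1]
    random.shuffle(cols)  # the random column permutation hides the secret
    return tuple(tuple(row[c] for c in cols) for row in S)

def stack(share1, share2):
    """Stacking transparencies = subpixel-wise Boolean OR."""
    return tuple(a | b for a, b in zip(share1, share2))

secret = [1, 0, 1, 1, 0]
for bit in secret:
    s1, s2 = encrypt_pixel(bit)
    print(bit, s1, s2, stack(s1, s2))
# A black pixel stacks to (1, 1); a white pixel stacks to one dark and one
# light subpixel. Each share alone is uniformly random, and the relative
# contrast of the recovered image is 1/2.
```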

39 citations


Journal ArticleDOI
TL;DR: In this paper, the variance of the difference between estimated responses at two points, maximized over all pairs of points in the factor space, is taken as the design criterion and optimal designs under this criterion are derived, via a combination of algebraic and numerical techniques, for the full second-order regression model over cuboidal regions.
Abstract: Minimization of the variance of the difference between estimated responses at two points, maximized over all pairs of points in the factor space, is taken as the design criterion. Optimal designs under this criterion are derived, via a combination of algebraic and numerical techniques, for the full second-order regression model over cuboidal regions. Use of a convexity argument and a surrogate objective function significantly reduces the computational burden.
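
A brute-force reading of the criterion (our illustration, not the authors' algebraic-numerical method): for a design with moment matrix M, the variance of the difference between estimated responses at x and y is proportional to (f(x) − f(y))′M⁻¹(f(x) − f(y)), where f collects the second-order regressors, and the criterion maximizes this over all pairs of points.

```python
import itertools
import numpy as np

def f(x):
    """Regressors of the full second-order model in two factors."""
    x1, x2 = x
    return np.array([1.0, x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

def max_pair_variance(design, grid):
    """Criterion value: (f(x) - f(y))' M^{-1} (f(x) - f(y)) maximized
    over pairs of grid points, M being the per-point moment matrix."""
    X = np.array([f(x) for x in design])
    M = X.T @ X / len(design)
    Minv = np.linalg.inv(M)
    worst = 0.0
    for x, y in itertools.combinations(grid, 2):
        d = f(x) - f(y)
        worst = max(worst, float(d @ Minv @ d))
    return worst

# Candidate design: the 3^2 factorial on the square [-1, 1]^2,
# evaluated over an 11 x 11 grid approximating the cuboidal region.
design = list(itertools.product([-1.0, 0.0, 1.0], repeat=2))
grid = list(itertools.product(np.linspace(-1, 1, 11), repeat=2))
print(max_pair_variance(design, grid))
```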

9 citations


Journal ArticleDOI
TL;DR: This work considers second-order probability matching priors that ensure frequentist validity of posterior quantiles with margin of error o(n⁻¹), where n is the sample size, and explores how the nonexistence of data-free priors of this kind in many models can be resolved via data-dependent priors.
Abstract: We consider second-order probability matching priors that ensure frequentist validity of posterior quantiles with margin of error o(n⁻¹), where n is the sample size. It is well known that there are many models of interest where data-free second-order probability matching priors do not exist. We explore how this problem can be resolved via consideration of data-dependent priors. This is done both in the absence and presence of nuisance parameters.
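
Matching of this kind can be checked by simulation: draw data under a fixed parameter value, compute the posterior quantile under the candidate prior, and record how often it exceeds the true value. A minimal sketch for the normal mean with known variance, where the flat prior is exactly matching (this baseline example is ours, not the paper's):

```python
import numpy as np
from scipy import stats

# Frequentist coverage of the posterior 0.95 quantile of a normal mean
# under the flat prior: given n observations with mean xbar, the
# posterior is N(xbar, 1/n), so the quantile is xbar + z_{0.95}/sqrt(n).
rng = np.random.default_rng(0)
mu, n, reps = 2.0, 20, 100_000
z = stats.norm.ppf(0.95)

xbar = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
upper = xbar + z / np.sqrt(n)
print((mu <= upper).mean())  # close to 0.95, up to Monte Carlo error
```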

6 citations


01 Jan 2010
TL;DR: In this paper, the authors propose new factorial and fractional factorial designs that minimize the number of factorial effects whose significance cannot be assessed when the randomization defining contrast subspaces overlap.
Abstract: Factorial and fractional factorial designs are widely used for assessing the impact of several factors on a process. Frequently, restrictions are placed on the randomization of the experimental trials. The randomization structure of such a factorial design can be characterized by its set of randomization defining contrast subspaces. It turns out that in many practical situations, these subspaces will overlap, thereby making it impossible to assess the significance of some of the factorial effects. In this article, we propose new designs that minimize the number of effects that have to be sacrificed. We also propose new designs, called stars, that are easy to construct and allow the assessment of a large number of factorial effects under an appropriately chosen overlapping strategy.
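
The overlap phenomenon is easy to exhibit computationally: treat each factorial effect as a set of factor letters (equivalently, a vector over GF(2)), span each randomization defining contrast subspace, and intersect. A small sketch for a two-level factorial in the factors A, B, C, D, with hypothetical subspace choices:

```python
# The "product" of two factorial effects is the symmetric difference
# of their factor-letter sets (addition over GF(2)).
def span(generators):
    """All nonzero elements of the subspace spanned by the generators."""
    elements = {frozenset()}
    for g in generators:
        elements |= {e ^ g for e in elements}
    return elements - {frozenset()}

# Two randomization defining contrast subspaces (hypothetical choices):
S1 = span([frozenset("AB"), frozenset("CD")])  # {AB, CD, ABCD}
S2 = span([frozenset("AC"), frozenset("BD")])  # {AC, BD, ABCD}

overlap = S1 & S2
print([''.join(sorted(e)) for e in overlap])  # ['ABCD']
# An effect in the overlap lies in more than one randomization stratum,
# so its significance cannot be assessed; good designs keep such
# overlaps as small as possible.
```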

6 citations


Journal ArticleDOI
TL;DR: The role of data-dependent priors is investigated, and it is seen that the resulting probability matching condition readily allows solutions, in contrast to what happens with data-free priors.

4 citations


Journal ArticleDOI
TL;DR: In this paper, a negative binomial sampling scheme was used to obtain a uniformly most accurate upper confidence limit for a small but unknown proportion, such as the proportion of defectives in a manufacturing process.
Abstract: On the basis of a negative binomial sampling scheme, we consider a uniformly most accurate upper confidence limit for a small but unknown proportion, such as the proportion of defectives in a manufacturing process. The optimal stopping rule, with reference to the twin criteria of the expected length of the confidence interval and the expected sample size, is investigated. The proposed confidence interval is also compared with several others that have received attention in the recent literature.
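
For intuition, an upper limit of this general type can be computed by inverting a tail probability. The sketch below implements the standard inverse-binomial construction under our reading of the setup (sampling stops at the m-th defective, with x non-defectives observed); it need not coincide exactly with the interval studied in the paper.

```python
from scipy import stats
from scipy.optimize import brentq

def upper_limit(m, x, alpha=0.05):
    """Upper (1 - alpha) confidence limit for the proportion p of
    defectives under inverse binomial sampling: stop at the m-th
    defective, having seen x non-defectives. Large x is evidence for
    small p, so the limit solves P_p(X >= x) = alpha, where
    X ~ NegBin(m, p) counts non-defectives before the m-th defective."""
    if x == 0:
        return 1.0
    return brentq(lambda p: stats.nbinom.sf(x - 1, m, p) - alpha,
                  1e-12, 1 - 1e-12)

# e.g. the 2nd defective turns up after 98 good items:
print(upper_limit(m=2, x=98))  # about 0.047
```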

2 citations


Posted Content
TL;DR: Partially cover-free families of sets are considered and employed to obtain anti-collusion digital fingerprinting codes; compared with existing constructions, these methods ensure gains in terms of accommodating more users and/or reducing the number of basis vectors.
Abstract: Anti-collusion digital fingerprinting codes have been of significant current interest in the context of deterring unauthorized use of multimedia content by a coalition of users. In this article, partially cover-free families of sets are considered and these are employed to obtain such codes. Compared to the existing methods of construction, our methods ensure gains in terms of accommodating more users and/or reducing the number of basis vectors.
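
The role cover-freeness plays in tracing can be seen in a toy example (a generic 2-cover-free family, not the paper's partially cover-free construction): give each user the incidence vector of a set, model the coalition's combined mark as the union of their sets, and decode by containment.

```python
from itertools import combinations

# Lines of the Fano plane form a 2-cover-free family: distinct lines
# share exactly one point, so no line lies in the union of two others.
FANO = [frozenset(s) for s in
        ({1, 2, 4}, {2, 3, 5}, {3, 4, 6}, {4, 5, 0},
         {5, 6, 1}, {6, 0, 2}, {0, 1, 3})]

def trace(coalition):
    """Recover a coalition of at most 2 users from the combined mark,
    modeled here as the union of their sets."""
    marked = frozenset().union(*(FANO[u] for u in coalition))
    return {u for u, B in enumerate(FANO) if B <= marked}

for coalition in combinations(range(7), 2):
    assert trace(coalition) == set(coalition)
print("every coalition of two users is identified exactly")
```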

Journal ArticleDOI
TL;DR: In this paper, a very general class of empirical-type likelihoods, including the usual empirical likelihood and all its major variants, is considered, and it is known that none of these likelihoods admits a data-free probability matching prior for the highest posterior density region.
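
For context, the usual empirical likelihood for a mean maximizes the product of the n·wᵢ over weights wᵢ ≥ 0 summing to one with Σ wᵢ(xᵢ − μ) = 0. The sketch below computes the log empirical likelihood ratio through the standard dual representation; it illustrates the baseline likelihood only, not the variants or the matching-prior analysis of the paper.

```python
import numpy as np
from scipy.optimize import brentq

def log_el_ratio(x, mu):
    """Log empirical likelihood ratio for the mean: maximize
    sum(log(n * w_i)) subject to sum(w_i) = 1 and
    sum(w_i * (x_i - mu)) = 0. The dual gives
    w_i = 1 / (n * (1 + lam * (x_i - mu))), with lam solving the
    estimating equation below."""
    z = np.asarray(x) - mu
    if z.min() >= 0 or z.max() <= 0:
        return -np.inf  # mu outside the convex hull of the data
    # Positivity of the weights restricts lam to an open interval:
    lo = -1.0 / z.max() + 1e-10
    hi = -1.0 / z.min() - 1e-10
    lam = brentq(lambda l: np.sum(z / (1.0 + l * z)), lo, hi)
    return -np.sum(np.log(1.0 + lam * z))

rng = np.random.default_rng(1)
x = rng.normal(0.3, 1.0, size=50)
print(log_el_ratio(x, mu=0.3))
# At the true mean, -2 times this is asymptotically chi-square(1).
```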