
Showing papers in "Metrika in 2012"


Journal ArticleDOI
01 Jan 2012-Metrika
TL;DR: In this paper, the authors discuss the measurement of environmental performance (EP) in quantitative empirical research, and they mainly refer to the framework of Wood (1991) and conceive EP as a multidimensional construct representing the extent to which companies meet the environmental expectations of their stakeholders.
Abstract: We discuss the measurement of environmental performance (EP) in quantitative empirical research. Initially, we review and classify existing EP measures. Based on that, we analyze their validity and reliability. To provide a clear conceptualization of EP, we mainly refer to the framework of Wood (1991) and conceive EP as a multidimensional construct representing the extent to which companies meet the environmental expectations of their stakeholders. Finally, we discuss the operationalization of EP by examining stakeholders’ expectations in detail and investigating qualitative characteristics of EP measures used within empirical research. Our analysis leads to the conclusion that measures based on inputs and outputs, operational processes and strategic EP provide construct validity. Generic EP measures used in large-scale studies should adequately represent stakeholders’ environmental expectations, in particular referring to prospective indicators. Our study contributes to the research on EP measurement by providing an extensive literature overview, improving the theoretical understanding of the EP construct and providing basic recommendations for coherent EP measurement for empirical analysis.

62 citations


Journal ArticleDOI
01 Jul 2012-Metrika
TL;DR: In this paper, bootstrap methods for sequential change-point detection in linear regression models are proposed; the bootstrap critical values are updated constantly with new observations obtained from the monitoring, and the corresponding monitoring procedures are designed to control the overall significance level.
Abstract: Bootstrap methods for sequential change-point detection procedures in linear regression models are proposed. The corresponding monitoring procedures are designed to control the overall significance level. The bootstrap critical values are updated constantly by including new observations obtained from the monitoring. The theoretical properties of these sequential bootstrap procedures are investigated, showing their asymptotic validity. Bootstrap and asymptotic methods are compared in a simulation study, showing that the studentized bootstrap tests hold the overall level better, especially for small historic sample sizes, while having comparable power and run length.
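
For readers who want the mechanics, the following is a minimal Python sketch of residual-based sequential monitoring with a bootstrap critical value. It is an illustration under simplifying assumptions, not the paper's procedure: the detector is a plain CUSUM of prediction residuals, the statistic is not studentized, and the critical value is computed once from the historic sample rather than updated during monitoring as the paper proposes. All names and data are made up.

```python
# Simplified sketch: monitor a linear regression fitted on a stable
# historic sample; signal a change when a CUSUM of prediction residuals
# crosses a bootstrap critical value calibrated to a 5% overall level.
import numpy as np

rng = np.random.default_rng(0)

def ols_fit(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def detector_path(resid_new, sigma, m):
    # CUSUM of prediction residuals, scaled by the historic sample size
    return np.abs(np.cumsum(resid_new)) / (sigma * np.sqrt(m))

# historic (stable) sample of size m
m, k = 100, 2
X_h = np.column_stack([np.ones(m), rng.normal(size=m)])
beta_true = np.array([1.0, 2.0])
y_h = X_h @ beta_true + rng.normal(size=m)
beta_hat = ols_fit(X_h, y_h)
resid_h = y_h - X_h @ beta_hat
sigma_hat = resid_h.std(ddof=k)

# bootstrap critical value: simulate monitoring paths under "no change"
# by resampling historic residuals
n_mon, B, paths = 50, 999, []
for _ in range(B):
    Xb = np.column_stack([np.ones(n_mon), rng.normal(size=n_mon)])
    yb = Xb @ beta_hat + rng.choice(resid_h, size=n_mon, replace=True)
    paths.append(detector_path(yb - Xb @ beta_hat, sigma_hat, m).max())
crit = np.quantile(paths, 0.95)   # 5% overall level over the horizon

# monitor new data containing a level shift halfway through
X_n = np.column_stack([np.ones(n_mon), rng.normal(size=n_mon)])
y_n = X_n @ beta_true + rng.normal(size=n_mon)
y_n[25:] += 3.0
stat = detector_path(y_n - X_n @ beta_hat, sigma_hat, m)
hits = np.nonzero(stat > crit)[0]
print("first signal at monitoring step:", hits[0] + 1 if hits.size else None)
```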

59 citations


Journal ArticleDOI
01 Jan 2012-Metrika
TL;DR: In this paper, the authors studied how to allocate the GHG volume of a transportation process (delivery tour) to the individual shipments moved by the process; they identified classes of generic allocation schemes and presented 15 allocation methods.
Abstract: Logistic activity, in particular transportation, produces greenhouse gases (GHG). For different purposes GHG need to be allocated to objects. This paper studies how to allocate the GHG volume of a transportation process (delivery tour) to the individual shipments moved by the process. First, it identifies classes of generic allocation schemes and presents 15 allocation methods. Second, since the majority of these methods have not been designed for allocating GHG, we apply and compare them in the short-distance transport context within a numerical example. The aim is to study how the schemes perform according to a set of appraisal criteria. We suggest using causality, efficiency, empty core robustness, symmetry, individual rationality, coalition stability, ease of application, and set robustness as appraisal criteria and attempt to mainstream the discussion by recommending selected allocation methods.
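
As a concrete illustration of what such allocation schemes do, here is a toy Python sketch of two generic proportional schemes: allocating a tour's measured GHG volume to shipments in proportion to (a) shipment weight and (b) each shipment's stand-alone emissions. All figures are invented; the paper compares 15 methods, including game-theoretic ones, against the criteria listed above.

```python
# Toy example of two proportional GHG allocation schemes for one tour.
# The shipment data and emission figures are made up for illustration.
from dataclasses import dataclass

@dataclass
class Shipment:
    name: str
    weight_t: float        # tonnes
    standalone_kg: float   # GHG if the shipment had its own dedicated tour

tour_total_kg = 120.0      # measured GHG of the joint delivery tour
shipments = [Shipment("A", 2.0, 60.0),
             Shipment("B", 5.0, 90.0),
             Shipment("C", 1.0, 30.0)]

def allocate(shipments, total, key):
    # split `total` proportionally to the chosen allocation basis
    basis = sum(key(s) for s in shipments)
    return {s.name: total * key(s) / basis for s in shipments}

by_weight = allocate(shipments, tour_total_kg, lambda s: s.weight_t)
by_standalone = allocate(shipments, tour_total_kg, lambda s: s.standalone_kg)
print("proportional to weight:    ", by_weight)
print("proportional to standalone:", by_standalone)
```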

42 citations


Journal ArticleDOI
01 Nov 2012-Metrika
TL;DR: In this article, a test to determine whether variances of time series are constant over time is presented, where the test statistic is a suitably standardized maximum of cumulative first and second moments.
Abstract: We present a test to determine whether variances of time series are constant over time. The test statistic is a suitably standardized maximum of cumulative first and second moments. We apply the test to time series of various assets and find that the test performs well in applications. Moreover, we propose a portfolio strategy based on our test which hedges against potential financial crises and show that it works in practice.
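
The statistic can be illustrated in a few lines of Python. The sketch below forms running variance estimates from cumulative first and second moments and takes the maximal standardized deviation from the full-sample variance; the crude iid scale estimate stands in for the suitable long-run variance estimator the actual test requires, and the data are simulated.

```python
# Simplified CUSUM-type statistic for constant variance: compare running
# variance estimates (built from cumulative first and second moments)
# against the full-sample variance and take the maximal deviation.
import numpy as np

def variance_cusum(x):
    T = len(x)
    ks = np.arange(1, T + 1)
    m1 = np.cumsum(x) / ks          # running first moment
    m2 = np.cumsum(x**2) / ks       # running second moment
    var_k = m2 - m1**2              # running variance estimate
    dev = ks / np.sqrt(T) * (var_k - var_k[-1])
    # crude scale estimate (iid case); the actual test uses a suitable
    # long-run variance estimator here
    scale = np.std((x - x.mean())**2, ddof=1)
    return np.max(np.abs(dev)) / scale

rng = np.random.default_rng(1)
calm = rng.normal(0, 1.0, 500)
crisis = rng.normal(0, 2.0, 500)                 # variance quadruples
print(variance_cusum(calm))                      # small under constancy
print(variance_cusum(np.r_[calm, crisis]))       # large when variance breaks
```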

38 citations


Journal ArticleDOI
01 Aug 2012-Metrika
TL;DR: In this paper, a unified procedure for parameter estimation and variable selection for joint mean and dispersion models of the inverse Gaussian distribution is proposed, which can simultaneously select significant variables.
Abstract: The choice of distribution is often made on the basis of how well the data appear to be fitted by the distribution. The inverse Gaussian distribution is one of the basic models for describing positively skewed data which arise in a variety of applications. In this paper, the problem of interest is simultaneous parameter estimation and variable selection for joint mean and dispersion models of the inverse Gaussian distribution. We propose a unified procedure which can simultaneously select significant variables in the mean and dispersion models. With an appropriate selection of the tuning parameters, we establish the consistency of this procedure and the oracle property of the regularized estimators. Simulation studies and a real example are used to illustrate the proposed methodologies.

36 citations


Journal ArticleDOI
12 Oct 2012-Metrika
TL;DR: In this article, the authors examined the determinants and effects of the use of budgets in German manufacturing firms and analyzed a broad set of potential influencing factors based on Simon's distinction between diagnostic and interactive control approaches.
Abstract: Based on Simon’s distinction between diagnostic and interactive control approaches, this study examines the determinants and effects of the use of budgets. Using survey data from German manufacturing firms, the study analyzes a broad set of potential influencing factors. We test a structural equation model which explains a significant part of the variance of the interactive and of the diagnostic use of budgets. The way budgets are used matters greatly with respect to controlling business strategy and influencing firm performance. Interactive and diagnostic uses of budgets significantly contribute to the formation of emergent strategies and to the implementation of intended strategies. The study extends contingency research by analyzing different aspects of the external and internal business environment and their influence on the use of budgets. We contribute to the literature in two ways: first, the results on the effects of different influencing factors may be used for theory development; second, the study provides a basis for further empirical investigations which may focus on one particular factor.

32 citations


Journal ArticleDOI
07 Jan 2012-Metrika
TL;DR: A comprehensive overview and synthesis of the various definitions of corporate environmental performance (CEP) in conceptual and empirical papers is provided in this paper; only a few studies are found to provide a clear definition of CEP.
Abstract: This paper provides a comprehensive overview and synthesis of the various definitions of corporate environmental performance (CEP) in conceptual and empirical papers. Based on an overview of existing conceptual and empirical studies of CEP, we analyze the complex nature of this multidimensional construct. In a first step, we apply content analysis to the relevant literature to identify definitions of CEP and conduct a bibliometric analysis using the software HistCite. We found only a few studies that provide a clear definition of CEP. In a second step, we use a semantic mapping methodology by applying Leximancer, a computer-aided qualitative data analysis tool, to organize the large literature on CEP and to explore the definitional and conceptual complexity of CEP. To our knowledge, this is a new and unique approach in the field of environmental management. This paper contributes to research on CEP in three ways. First, it collects and summarizes definitions and measurements of CEP used in the organizational literature so far. Second, it provides a bird’s eye view on the different contexts in which CEP is discussed. Third, a parsimonious model of CEP derived from computer-aided qualitative data analysis and consisting of five major elements is presented and discussed.

27 citations


Journal ArticleDOI
01 Feb 2012-Metrika
TL;DR: In this article, an optimal allocation in the Neyman-Tschuprov sense is developed which satisfies upper and lower bounds of the sample sizes within strata, and a stable algorithm is given which ensures optimality.
Abstract: In stratified random sampling without replacement, boundary conditions, such as the requirement that the sample sizes within strata must not exceed the population sizes of the respective strata, have to be considered. Stenger and Gabler (Metrika, 61:137–156, 2005) have shown a solution that satisfies upper boundaries of the sample fractions within the strata. However, in modern applications one may also wish to guarantee minimal sampling fractions within strata in order to allow for reasonable separate estimations. Within this paper, an optimal allocation in the Neyman-Tschuprov sense is developed which satisfies upper and lower bounds on the sample sizes within strata. Further, a stable algorithm is given which ensures optimality. The resulting sample allocation enables users to bound design weights within stratified random sampling while considering optimality in allocation.
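
A rough Python sketch of the idea: allocate proportionally to N_h·S_h over the free strata, pin the worst bound violator at its bound, and repeat. This one-at-a-time fixing heuristic only illustrates the mechanics; the paper's algorithm comes with the stability and optimality guarantees that this sketch lacks. All numbers are invented and a feasible problem is assumed.

```python
# Box-constrained Neyman allocation by iteratively pinning the worst
# bound violator; a heuristic illustration, not the paper's algorithm.
import numpy as np

def boxed_neyman(n, N, S, lower, upper):
    """Neyman allocation of total size n with lower <= n_h <= upper.
    Assumes sum(lower) <= n <= sum(upper)."""
    alloc = np.zeros(len(N))
    fixed = np.zeros(len(N), dtype=bool)
    while True:
        free = np.flatnonzero(~fixed)
        budget = n - alloc[fixed].sum()
        w = N[free] * S[free]
        cand = budget * w / w.sum()          # Neyman on the free strata
        ratio = np.maximum(cand / upper[free], lower[free] / cand)
        worst = np.argmax(ratio)
        if ratio[worst] <= 1.0:              # all bounds respected
            alloc[free] = cand
            return alloc
        j = free[worst]                      # pin worst violator at its bound
        alloc[j] = np.clip(cand[worst], lower[j], upper[j])
        fixed[j] = True

N = np.array([5000., 3000., 1500., 500.])   # stratum population sizes
S = np.array([4.0, 9.0, 2.0, 30.0])         # stratum standard deviations
lower = np.full(4, 50.0)                    # minimum for separate estimation
upper = 0.1 * N                             # cap design weights at 10 percent
print(boxed_neyman(800, N, S, lower, upper))
# -> approximately [391.3, 300.0, 58.7, 50.0]
```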

26 citations


Journal ArticleDOI
01 Apr 2012-Metrika
TL;DR: In this paper, the authors discuss the statistical inference of the lifetime distribution of components based on observing the system lifetimes when the system structure is known, and provide a general proportional hazard rate model for the lifetime of the components, which includes some commonly used lifetime distributions.
Abstract: In this paper, we discuss the statistical inference of the lifetime distribution of components based on observing the system lifetimes when the system structure is known. A general proportional hazard rate model for the lifetime of the components is considered, which includes some commonly used lifetime distributions. Different estimation methods—method of moments, maximum likelihood method and least squares method—for the proportionality parameter are discussed. The conditions for existence and uniqueness of method of moments and maximum likelihood estimators are presented. Then, we focus on a special case when the lifetime distributions of the components are exponential. Computational formulas for point and interval estimations of the unknown mean lifetime of the components are provided. A Monte Carlo simulation study is used to compare the performance of these estimation methods and recommendations are made based on these results. Finally, an example is provided to illustrate the methods proposed in this paper.
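
A worked Python example of the exponential special case for the simplest structure, a series system: the lifetime of a series system of k i.i.d. exponential components is again exponential with mean θ/k, so θ is estimated by k times the average observed system lifetime, and an exact chi-square interval is available. This illustrates only the simplest instance of the paper's general proportional hazard rate setup; the data are simulated.

```python
# Point and interval estimation of the component mean lifetime from
# observed series-system lifetimes, exponential case.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(7)
theta, k, m = 10.0, 4, 200      # component mean, components per system, systems
components = rng.exponential(theta, size=(m, k))
system_lifetimes = components.min(axis=1)   # series system fails at first failure

theta_hat = k * system_lifetimes.mean()     # moment and ML estimators coincide here
# exact 95% CI: 2*m*theta_hat/theta follows a chi-square with 2m d.f.
lo = 2 * m * theta_hat / chi2.ppf(0.975, 2 * m)
hi = 2 * m * theta_hat / chi2.ppf(0.025, 2 * m)
print(f"theta_hat = {theta_hat:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```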

25 citations


Journal ArticleDOI
01 Jun 2012-Metrika
TL;DR: Corporate venturing (CV) is defined in this paper as a process of entrepreneurial effort that leads to creating new ventures within or outside established corporate organizations.
Abstract: Corporate venturing (CV) is defined as a process of entrepreneurial effort that leads to creating new ventures within or outside established corporate organizations. This paper contributes to the theoretical foundation of CV research by offering an extended typology along the three dimensions of (1) the focus of corporate ventures, (2) the degree of intermediation within the CV process and (3) the explorative or exploitative orientation of venturing activities. Resolving terminological ambiguity, current research on corporate venturing is reviewed, and then avenues are identified for future research.

25 citations


Journal ArticleDOI
01 Nov 2012-Metrika
TL;DR: In this article, the authors proposed an estimator and variance estimator for the Zenga index when estimated from a complex sampling design, based on linearization techniques and more specifically on the direct approach presented by Demnati and Rao.
Abstract: Zenga’s new inequality curve and index are two recent tools for measuring inequality. Proposed in 2007, they should thus not be mistaken for earlier measures suggested by the same author. This paper focuses on the new measures only, which are hereafter referred to simply as the Zenga curve and Zenga index. The Zenga curve Z(α) involves the ratio of the mean income of the 100α % poorest to that of the 100(1−α)% richest. The Zenga index can also be expressed by means of the Lorenz curve, and some of its properties make it an interesting alternative to the Gini index. Like most other inequality measures, inference on the Zenga index is not straightforward. Some research on its properties and on estimation has already been conducted, but inference in the sampling framework is still needed. In this paper, we propose an estimator and variance estimator for the Zenga index when estimated from a complex sampling design. The proposed variance estimator is based on linearization techniques and more specifically on the direct approach presented by Demnati and Rao. The quality of the resulting estimators is evaluated in Monte Carlo simulation studies on real sets of income data. Finally, the advantages of the Zenga index relative to the Gini index are discussed.
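
The curve and index are easy to compute from a sample under equal weights, which makes the definition above concrete. The Python sketch below does exactly that; it does not attempt the paper's actual contribution, namely variance estimation under a complex sampling design, and the income data are simulated.

```python
# Empirical Zenga curve and index: at each level alpha, compare the mean
# income of the poorest 100*alpha percent with that of the richest
# 100*(1-alpha) percent.  Equal weights; simulated lognormal incomes.
import numpy as np

def zenga_curve(y):
    y = np.sort(np.asarray(y, dtype=float))
    n = len(y)
    csum = np.cumsum(y)
    lower_mean = csum[:-1] / np.arange(1, n)                       # k poorest
    upper_mean = (csum[-1] - csum[:-1]) / np.arange(n - 1, 0, -1)  # n-k richest
    return 1.0 - lower_mean / upper_mean                           # Z at alpha=k/n

def zenga_index(y):
    return zenga_curve(y).mean()     # grid approximation of the integral of Z

rng = np.random.default_rng(3)
incomes = rng.lognormal(mean=10, sigma=0.8, size=10_000)
print(f"Zenga index: {zenga_index(incomes):.3f}")
```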

Journal ArticleDOI
01 Nov 2012-Metrika
TL;DR: In this article, the authors consider the linear regression problem in the presence of one or more imprecise random elements and propose a fuzzy random variable (FRV) based model.
Abstract: In standard regression analysis the relationship between the (response) variable and a set of (explanatory) variables is investigated. In the classical framework the response is affected by probabilistic uncertainty (randomness) and, thus, treated as a random variable. However, the data can also be subjected to other kinds of uncertainty such as imprecision. A possible way to manage all of these uncertainties is represented by the concept of fuzzy random variable (FRV). The most common class of FRVs is the LR family (LR FRV), which allows us to express every FRV in terms of three random variables, namely, the center, the left spread and the right spread. In this work, limiting our attention to the LR FRV class, we consider the linear regression problem in the presence of one or more imprecise random elements. The procedure for estimating the model parameters and the determination coefficient are discussed and the hypothesis testing problem is addressed following a bootstrap approach. Furthermore, in order to illustrate how the proposed model works in practice, the results of a real-life example are given together with a comparison with those obtained by applying classical regression analysis.

Journal ArticleDOI
01 Jan 2012-Metrika
TL;DR: In this paper, the authors proposed a new method to calibrate the estimator of the general parameter of interest in survey sampling and showed that the linear regression estimator due to Hansen et al. (Sample Survey Method and Theory) is a special case of this.
Abstract: In the present investigation, we propose a new method to calibrate the estimator of the general parameter of interest in survey sampling. We demonstrate that the linear regression estimator due to Hansen et al. (Sample Survey Methods and Theory. Wiley, NY, 1953) is a special case of this. We reconfirm that the sum of calibrated weights has to be set equal to the sum of the design weights within a given sample, as shown in Singh (Advanced sampling theory with applications: How Michael ‘selected’ Amy, Vol. 1 and 2. Kluwer, The Netherlands, pp 1–1247, 2003; Proceedings of the American Statistical Association, Survey Method Section [CD-ROM], Toronto, Canada: American Statistical Association, pp 4382–4389, 2004; Metrika:1–18, 2006a; Presented at INTERFACE 2006, Pasadena, CA, USA, 2006b) and Stearns and Singh (Presented at Joint Statistical Meeting, MN, USA (Available on the CD), 2005; Comput Stat Data Anal 52:4253–4271, 2008). Thus, it shows that Sir R. A. Fisher’s brilliant idea of keeping the sum of observed frequencies equal to that of expected frequencies leads to an “Honest-Balance” while weighting design weights in survey sampling. The major benefit of the proposed new estimator is that it always works, unlike the pseudo empirical likelihood estimators listed in Owen (Empirical Likelihood. Chapman & Hall, London, 2001), Chen and Sitter (Stat Sin 9:385–406, 1999) and Wu (Surv Methodol 31(2):239–243, 2005). The main endeavor of this paper is to bring a change in the existing calibration technology, which is based on only positive distance functions, with a displacement function that has the flexibility of taking a positive, negative, or zero value. At the end, the proposed technology is compared with its competitors under several kinds of linear and non-linear non-parametric models using an extensive simulation study. A couple of open questions are raised.
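
To make the classical baseline concrete, here is a Python sketch of chi-square-distance calibration with the constraint that the calibrated weights keep the sum of the design weights, which reproduces the regression estimator mentioned above. The data are synthetic, and the paper's new displacement-function technology is not shown.

```python
# Chi-square calibration: adjust design weights d_i as little as
# possible subject to reproducing a known auxiliary total, with the
# extra constraint sum(w) = sum(d).  Closed form: w = d * (1 + z'lambda).
import numpy as np

rng = np.random.default_rng(5)
n = 200
d = np.full(n, 50.0)                 # design weights (SRS from N = 10000)
x = rng.gamma(4.0, 2.0, size=n)      # auxiliary variable with known total
y = 3.0 * x + rng.normal(0, 4, n)    # study variable
tx = 10_000 * 8.0                    # known population total of x (mean 8)

# constraints: sum(w) = sum(d) and sum(w*x) = tx
Z = np.column_stack([np.ones(n), x])
t = np.array([d.sum(), tx])
A = (Z * d[:, None]).T @ Z           # sum_i d_i z_i z_i'
lam = np.linalg.solve(A, t - Z.T @ d)
w = d * (1.0 + Z @ lam)

print("sum d :", d.sum(), " sum w :", w.sum())   # equal by construction
print("HT estimate of ty  :", d @ y)
print("calibrated estimate:", w @ y)             # the regression estimator
```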

Journal ArticleDOI
Yan Liu1, Min-Qian Liu1
01 Jan 2012-Metrika
TL;DR: In this paper, the method of Liu and Lin is extended to construct more equidistant optimal SSDs by replacing the multi-level SSDs and transposed orthogonal arrays with mixed-level SSDs and general transposed difference matrices.
Abstract: Supersaturated designs (SSDs) have been highly valued in recent years for their ability of screening out important factors in the early stages of experiments. Recently, Liu and Lin (in Statist Sinica 19:197–211, 2009) proposed a method to construct optimal mixed-level SSDs from smaller multi-level SSDs and transposed orthogonal arrays (OAs). This paper extends their method to construct more equidistant optimal SSDs by replacing the multi-level SSDs and transposed OAs with mixed-level SSDs and general transposed difference matrices, respectively, and then proposes two practical methods for constructing weak equidistant SSDs based on this extended method. A large number of new optimal SSDs can be constructed from these three methods. Some examples are provided and more new designs are listed in “Appendix” for practical use.

Journal ArticleDOI
07 Nov 2012-Metrika
TL;DR: This article found that the order in which management provides information about a firm's risks and chances has a significant influence on individuals' assessment of the economic position and prospects of the firm, whether the last pieces of information presented are positive or negative, financial statement users weight these items more heavily than the initially obtained ones.
Abstract: The exploration of information order effects has been a prominent topic in judgment and decision-making research in accounting in the last decades. While the vast majority of this research has focused on auditors’ and tax professionals’ judgments, the effects of information order on nonprofessionals’ belief revisions in a financial reporting context has largely remained unexamined. In the present paper, we provide initial experimental evidence on the impact that order effects have on the processing and evaluation of information provided in the management commentary. We find that the order in which management provides information about a firm’s risks and chances has a significant influence on individuals’ assessment of the economic position and prospects of the firm. In particular, our results show that whether the last pieces of information presented are positive or negative, financial statement users weight these items more heavily than the initially obtained ones. The paper outlines the major implications of these results as well as some opportunities for future research.

Journal ArticleDOI
01 Feb 2012-Metrika
TL;DR: In this paper, the log-concavity and monotonicity of the hazard and reversed hazard functions of series and parallel systems of components is discussed. And the corresponding results for the multivariate normal distribution follow readily as special cases.
Abstract: This paper establishes the log-concavity property of several forms of univariate and multivariate skew-normal distributions. This property is then used to prove the monotonicity of the hazard as well as reversed hazard functions. The log-concavity and monotonicity of the hazard and reversed hazard functions of series and parallel systems of components is then discussed. The corresponding results for the multivariate normal distribution follow readily as special cases.
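
The claimed properties are easy to check numerically for the univariate case. The following Python snippet verifies, on a grid, that the skew-normal log-density is concave and that the hazard rate is nondecreasing while the reversed hazard rate is nonincreasing. This is an illustration on a finite grid with a made-up skewness parameter, not a substitute for the paper's proofs.

```python
# Numerical check of log-concavity and hazard monotonicity for a
# univariate skew-normal distribution.
import numpy as np
from scipy.stats import skewnorm

a = 3.0                                   # skewness parameter (illustrative)
t = np.linspace(-1.5, 4, 1101)            # grid kept away from extreme tails
logpdf = skewnorm.logpdf(t, a)
print("log-density concave:", np.all(np.diff(logpdf, 2) <= 1e-10))

hazard = skewnorm.pdf(t, a) / skewnorm.sf(t, a)
rev_hazard = skewnorm.pdf(t, a) / skewnorm.cdf(t, a)
print("hazard nondecreasing:        ", np.all(np.diff(hazard) >= -1e-8))
print("reversed hazard nonincreasing:", np.all(np.diff(rev_hazard) <= 1e-8))
```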

Journal ArticleDOI
01 Apr 2012-Metrika
TL;DR: In this article, the analytical properties of an additive hazards model are studied; the ageing properties of the baseline random variable and the induced random variable are compared, and various stochastic orders that relate these two variables are explored.
Abstract: In the present paper, we study the analytical properties of an additive hazards model. The ageing properties of the baseline random variable and the induced random variable are compared. Various stochastic orders that relate these two variables are also explored.

Journal ArticleDOI
01 Jul 2012-Metrika
TL;DR: In this article, a strongly consistent and asymptotically normal estimator for the parameters in such models is derived, which minimizes the Euclidean distance between certain empirical and theoretical functionals of the distribution.
Abstract: Associated with any parametric family of Lévy subordinators there is a parametric family of extendible Marshall-Olkin copulas, which shares the dependence structure with the vector of first passage times of the Lévy subordinator across i.i.d. exponential threshold levels. The present article derives a strongly consistent and asymptotically normal estimator for the parameters in such models. The estimation strategy is to minimize the Euclidean distance between certain empirical and theoretical functionals of the distribution. As a byproduct, the covariance structure of the order statistics of a d-dimensional extendible Marshall-Olkin distribution is computed.

Journal ArticleDOI
01 Apr 2012-Metrika
TL;DR: In this paper, the authors considered the theory of R-estimation of the regression parameters of multiple regression models with measurement errors and proposed an estimator based on linear rank statistics.
Abstract: We consider the theory of R-estimation of the regression parameters of multiple regression models with measurement errors. Using standard linear rank statistics, R-estimators are defined and their asymptotic properties are studied as robust alternatives to the least squares estimator. This paper fills a gap in the rank theory for the estimation of regression parameters in measurement error models. Some simulation results are presented to show the effectiveness of the R-estimators.

Journal ArticleDOI
10 Oct 2012-Metrika
TL;DR: In this paper, the authors examined the role of behavioral and organizational factors in comparison to planning and control (hard) factors in cost reduction projects and found that cost culture, top management commitment, and participation are of particular importance for the success of cost reductions.
Abstract: Cost reduction is usually confronted with conflicts and resistance. Besides planning and controlling measures, the management accounting literature discusses behavioral and organizational factors (e.g., top management commitment, participation, cost culture) in order to overcome this resistance. Thus, from a theoretical perspective, different concepts exist for implementing an effective long-term cost reduction. However, only little empirical research can be found that investigates the relative importance of “soft” behavioral and implementation factors compared to general planning and control measures. This study examines the role of behavioral and organizational (“soft”) factors in comparison to planning and control (“hard”) factors in cost reduction projects. Target costing and activity-based costing projects represent examples of the strategic cost reduction projects considered in this study. The sample comprises 131 chief management accountants of medium-sized and large German companies which were involved in such strategic cost reduction projects. Structural equation modeling is used for deriving the results. The results show that cost culture, top management commitment, and participation are of particular importance for the success of cost reductions. Their influence significantly drives planning and controlling measures, which in turn determine the effectiveness of cost reduction measures.

Journal ArticleDOI
01 Aug 2012-Metrika
TL;DR: In this paper, two possible business models for the direct marketing of electricity from biogas and biomethane are compared to two state-of-the-art business models based on fixed feed-in tariffs.
Abstract: At the beginning of 2012 the legal framework for the promotion of electricity generation from renewable energies in Germany was amended. The compensation scheme for electricity generated from biogas and biomethane, previously based on a fixed feed-in tariff, will henceforth be supplemented by an alternative direct marketing option, which consists of market-based and flexibility premiums. In this article two possible business models for the direct marketing of electricity from biogas and biomethane are economically compared to two state-of-the-art business models based on fixed feed-in tariffs. A linear optimization model taking into account economic and technical restrictions is developed and applied to the two business models based on direct marketing, in order to compare the costs and revenues of these two models with those of state-of-the-art biogas plants generating base-load electricity. The key findings are that by applying direct marketing, additional income can be generated that more than compensates for the necessary additional investments. The results also show that the income from direct marketing of electricity generated from biogas consists mainly of the market and flexibility premiums, while the revenues generated on the spot exchange are of minor importance. Hence the direct marketing of electricity from biogas and biomethane remains heavily dependent on subsidies. Since the developed model assumes perfect foresight regarding the development of market prices, the determined additional incomes represent an upper limit.

Journal ArticleDOI
01 Jul 2012-Metrika
TL;DR: It turns out that on a finite experimental domain, the maximin efficient design can be computed by the methods of semidefinite programming by dealing with the non-differentiability inherent in the problem, due to which the standard iterative procedures cannot be applied.
Abstract: In the paper, we solve the problem of computing the maximin efficient design with respect to the class of all orthogonally invariant criteria. It turns out that on a finite experimental domain, the maximin efficient design can be computed by the methods of semidefinite programming. Using this approach, we can deal with the non-differentiability inherent in the problem, due to which the standard iterative procedures cannot be applied. We illustrate the results on the models of polynomial regression on a line segment and quadratic regression on a cube.
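
As a taste of the semidefinite programming machinery, the Python sketch below solves one member of the orthogonally invariant class, the E-optimal design problem (maximize the smallest eigenvalue of the information matrix), for quadratic regression on a grid of candidate points. The paper's maximin-efficient formulation is more elaborate; this sketch assumes the cvxpy package is available.

```python
# E-optimal design via SDP: maximize the smallest eigenvalue of the
# information matrix M(w) = sum_i w_i f(x_i) f(x_i)' over design weights w.
import numpy as np
import cvxpy as cp

xs = np.linspace(-1, 1, 21)                           # candidate design points
F = np.column_stack([np.ones_like(xs), xs, xs**2])    # quadratic model f(x)

w = cp.Variable(len(xs), nonneg=True)
M = sum(w[i] * np.outer(F[i], F[i]) for i in range(len(xs)))
prob = cp.Problem(cp.Maximize(cp.lambda_min(M)), [cp.sum(w) == 1])
prob.solve()

# mass should concentrate on x = -1, 0, 1 (weights near 0.2, 0.6, 0.2)
print("E-optimal design (point, weight):")
for x, wi in zip(xs, w.value):
    if wi > 1e-6:
        print(f"  {x:+.2f}  {wi:.3f}")
```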

Journal ArticleDOI
25 Oct 2012-Metrika
TL;DR: This paper examined how strategy maps affect balanced scorecard (BSC) evaluators' assessments of managerial performance and found that the use of strategy maps may actually have detrimental effects for organizations whose outcomes are influenced significantly by uncontrollable factors.
Abstract: This study examines how strategy maps affect balanced scorecard (BSC) evaluators’ assessments of managerial performance. We examine a setting in which managers achieve target levels of performance on driver measures but not outcome measures. Without a strategy map, the more evaluators believe the outcome was beyond the manager’s control, the more they indemnify the manager. In contrast, evaluators with strategy maps do not use their beliefs about the uncontrollability of the outcome when making evaluation decisions. Rather, evaluators with strategy maps evaluate the manager without regard to the extent to which they believed the poor outcome was due to uncontrollable factors. Thus, strategy maps affect how evaluators implement control over the firm’s strategy. The finding suggests that the use of strategy maps, which is an integral part of the BSC, may actually have detrimental effects for organizations whose outcomes are influenced significantly by uncontrollable factors.

Journal ArticleDOI
01 Nov 2012-Metrika
TL;DR: In this paper, the authors considered the uniqueness properties and moment relations for the estimators of the second model and presented proofs of uniqueness for linear combinations of estimators for both models and are simplifications of proofs given in Kollo and von Rosen (Advanced multivariate statistics with matrices).
Abstract: In this paper the extended growth curve model is considered. The literature comprises two versions of the model. These models can be connected by one-to-one reparameterizations but since estimators are non-linear it is not obvious how to transmit properties of estimators from one model to another. Since it is only for one of the models where detailed knowledge concerning estimators is available (Kollo and von Rosen, Advanced multivariate statistics with matrices. Springer, Dordrecht, 2005) the object in this paper is therefore to present uniqueness properties and moment relations for the estimators of the second model. One aim of the paper is also to complete the results for the model presented in Kollo and von Rosen (Advanced multivariate statistics with matrices. Springer, Dordrecht, 2005). The presented proofs of uniqueness for linear combinations of estimators are valid for both models and are simplifications of proofs given in Kollo and von Rosen (Advanced multivariate statistics with matrices. Springer, Dordrecht, 2005).

Journal ArticleDOI
01 Oct 2012-Metrika
TL;DR: In this article, the sufficient and necessary conditions for an FFSP design with resolution III or IV to have various clear factorial effects, including two types of main effects and three types of two-factor interaction components, were investigated.
Abstract: Mixed-level designs are widely used in practical experiments. When the levels of some factors are difficult to change or control, fractional factorial split-plot (FFSP) designs are often used. This paper investigates the sufficient and necessary conditions for a $2^{(n_1+n_2)-(k_1+k_2)}4_s^1$ FFSP design with resolution III or IV to have various clear factorial effects, including two types of main effects and three types of two-factor interaction components. The structures of such designs are shown and illustrated with examples.

Journal ArticleDOI
01 Oct 2012-Metrika
TL;DR: In this article, characterizations of the generalized Pareto distributions based on generalized order statistics are studied; the results extend and unify some existing results in the literature.
Abstract: In the present study we extend and unify some existing results in the literature on characterization of the generalized Pareto distributions based on generalized order statistics.

Journal ArticleDOI
01 Nov 2012-Metrika
TL;DR: In this paper, a re-weighted estimator of the diffusion coefficient in the second-order diffusion model is proposed; its consistency is proved under appropriate conditions, and conditions that ensure asymptotic normality are also stated.
Abstract: Second-order diffusion processes can not only model integrated and differentiated diffusion processes but also overcome the difficulties associated with the nondifferentiability of Brownian motion, so these models play an important role in econometric analysis. In this paper, we propose a re-weighted estimator of the diffusion coefficient in the second-order diffusion model. Consistency of the estimator is proved under appropriate conditions, and the conditions that ensure asymptotic normality are also stated. The performance of the proposed estimator is assessed by a simulation study.

Journal ArticleDOI
01 May 2012-Metrika
TL;DR: In this paper, an (n − k + 1)-out-of-n system with independent and non-identical components is considered; the mean past lifetime of the components is defined and some of its properties are investigated.
Abstract: We consider an (n − k + 1)-out-of-n system with independent and non-identical components. Under the condition that at time t the system has failed, we study the past lifetime of the components of the system. The mean past lifetime of the components is defined and some of its properties are investigated. Stochastic comparisons are also made between the past lifetimes of different systems.

Journal ArticleDOI
01 Jul 2012-Metrika
TL;DR: In this paper, a semiparametric method to estimate logistic regression models with both covariates and an outcome variable missing is proposed; it requires neither model assumptions about the missing data mechanism nor specification of the conditional distribution of the missing covariates given the observed covariates.
Abstract: We consider a semiparametric method to estimate logistic regression models with both covariates and an outcome variable missing, and propose two new estimators. The first, which is based solely on the validation set, is an extension of the validation likelihood estimator of Breslow and Cain (Biometrika 75:11–20, 1988). The second is a joint conditional likelihood estimator based on the validation and non-validation data sets. Both estimators are semiparametric as they require neither model assumptions regarding the missing data mechanism nor the specification of the conditional distribution of the missing covariates given the observed covariates. The asymptotic distribution theory is developed under the assumption that all covariate variables are categorical. The finite-sample properties of the proposed estimators are investigated through simulation studies, showing that the joint conditional likelihood estimator is the most efficient. A cable TV survey data set from Taiwan is used to illustrate the practical use of the proposed methodology.

Journal ArticleDOI
01 Jan 2012-Metrika
TL;DR: In this article, a length-biased past inaccuracy measure between two past lifetime distributions over the interval (0, t) is introduced, and a characterization problem based on the proportional reversed hazard model is studied.
Abstract: In the present communication we introduce a length-biased past inaccuracy measure between two past lifetime distributions over the interval (0, t). Based on the proportional reversed hazard model, a characterization problem for the length-biased inaccuracy measure has been studied. An upper bound to the weighted past inaccuracy measure has also been derived, which reduces to the upper bound obtained in the case of weighted past entropy.
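
For orientation, the quantities involved can be written down as follows. This is a hedged reconstruction using the usual conventions of this literature; the paper's exact definitions and normalizations may differ.

```latex
% F, G are lifetime cdfs with densities f, g.  Kerridge's inaccuracy:
\[
  K(F,G) = -\int_0^{\infty} f(x)\,\log g(x)\,dx
\]
% Past inaccuracy over (0,t), built from the past-lifetime densities:
\[
  K(F,G;t) = -\int_0^{t} \frac{f(x)}{F(t)}\,\log\frac{g(x)}{G(t)}\,dx
\]
% Length-biased version, with weight w(x) = x (assumed form):
\[
  K^{w}(F,G;t) = -\int_0^{t} x\,\frac{f(x)}{F(t)}\,\log\frac{g(x)}{G(t)}\,dx
\]
```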