scispace - formally typeset
Author

Mohammad Saber Fallahnezhad

Bio: Mohammad Saber Fallahnezhad is an academic researcher from Yazd University. The author has contributed to research in the topics of Control chart and Acceptance sampling. The author has an h-index of 9, has co-authored 48 publications, and has received 355 citations.

Papers
Journal ArticleDOI
TL;DR: The results show that the proposed model can be applied as a flexible decision support tool for decision makers to adopt appropriate strategies and policies in designing a reliable and robust health service network.

64 citations

Journal ArticleDOI
TL;DR: A novel reliable hierarchical location-allocation model is proposed for a real-world health service network design problem, and a robust scenario-based stochastic programming approach is employed to solve the model.

43 citations

Journal ArticleDOI
TL;DR: In this paper, the median lifetime is used as the quality parameter and a decision-making framework is developed; based on type-I and type-II errors, tables are obtained for selecting the parameters of the proposed decision-making framework.

43 citations

Journal ArticleDOI
01 Dec 2016
TL;DR: A model for software effort (person-month) estimation is presented, based on a three-level Bayesian network, the 15 cost drivers of COCOMO, and software size; the results indicate that the model is more accurate than the other models.
Abstract: Highlights:
- Presenting an updatable Bayesian belief network for software effort estimation.
- Considering all intervals of the network's nodes as fuzzy numbers.
- Applying optimal control via a genetic algorithm to obtain an accurate estimate.
- Considering the effective components and steps of software development in software effort estimation.
- Considering software quality in terms of the number of defects detected and removed during the steps of software development.

In this paper, we present a model for software effort (person-month) estimation based on a three-level Bayesian network, the 15 cost drivers of COCOMO, and software size. The Bayesian network works with discrete intervals for its nodes; however, we treat the intervals of all nodes as fuzzy numbers. We also obtain the optimal updating coefficient for the effort estimate, based on the concept of optimal control, using a genetic algorithm and particle swarm optimization on the COCOMO NASA database; in other words, the estimated effort is modified by this optimal coefficient. In addition, we estimate software effort while accounting for software quality in terms of the number of defects detected and removed in the three steps of requirements specification, design, and coding. If the number of defects exceeds a specified threshold, the model returns to the current step and additional effort is added to the estimate. The results indicate that the optimal updating coefficient obtained by the genetic algorithm increases estimation accuracy significantly, and comparisons with other models indicate that the proposed model is more accurate.

40 citations
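The entry above builds on COCOMO. As a rough illustration of the quantity the paper's Bayesian network and updating coefficient operate on, the sketch below computes a nominal intermediate-COCOMO effort and scales it by a correction coefficient. The flat multipliers, project size, and coefficient value are made up, and the paper's fuzzy/Bayesian machinery is not reproduced here.

```python
import math

# Illustrative sketch of the intermediate-COCOMO effort formula the paper
# builds on. Constants a = 3.2, b = 1.05 are the published intermediate-COCOMO
# "organic mode" values; `multipliers` stands in for the 15 cost-driver ratings.

def cocomo_effort(kloc, multipliers, a=3.2, b=1.05):
    """Nominal effort in person-months: a * KLOC^b * product of multipliers."""
    eaf = math.prod(multipliers)  # effort adjustment factor
    return a * kloc ** b * eaf

def corrected_effort(nominal, coefficient):
    """Apply an updating coefficient (in the paper, tuned by GA/PSO)."""
    return coefficient * nominal

# Hypothetical project: 32 KLOC, all cost drivers rated nominal (1.0).
nominal = cocomo_effort(kloc=32, multipliers=[1.0] * 15)
print(round(corrected_effort(nominal, coefficient=0.9), 1))
```

The updating coefficient simply rescales the nominal estimate; in the paper this scalar is what the genetic algorithm optimizes against the COCOMO NASA database.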

Journal ArticleDOI
TL;DR: In this paper, a new acceptance sampling design is proposed for accepting or rejecting a batch, based on Bayesian modeling to update the distribution function of the percentage of nonconforming items; the backwards-induction methodology of the decision-tree approach is used to determine the required sample size.
Abstract: In acceptance sampling plans, the decision to accept or reject a specific batch is still a challenging problem. In order to provide a desired level of protection for customers as well as manufacturers, this paper proposes a new acceptance sampling design that accepts or rejects a batch based on Bayesian modeling, which updates the distribution function of the percentage of nonconforming items. Moreover, the backwards-induction methodology of the decision-tree approach is utilized to determine the required sample size. A sensitivity analysis carried out on the parameters of the proposed methodology shows that the optimal solution is affected by the initial parameter values. Furthermore, an optimal (n, c) design is determined for the case where time and budget are limited and the maximum sample size is therefore specified in advance.

25 citations
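The Bayesian updating step described in the abstract above can be illustrated with the standard Beta-Binomial conjugate update for the nonconforming fraction. This is a generic sketch, not the paper's exact model; the prior, acceptance threshold, and function names are hypothetical.

```python
# Illustrative sketch: Bayesian updating of the nonconforming fraction p with
# a Beta prior, and a simple accept/reject rule based on the posterior mean.

def update_beta(alpha, beta, defects, sample_size):
    """Beta(alpha, beta) prior -> posterior after observing `defects`
    nonconforming items among `sample_size` inspected items."""
    return alpha + defects, beta + (sample_size - defects)

def decide(alpha, beta, p_max):
    """Accept the batch if the posterior mean of p is at most p_max."""
    posterior_mean = alpha / (alpha + beta)
    return "accept" if posterior_mean <= p_max else "reject"

# Start from a uniform prior Beta(1, 1); inspect n = 50 items, find 2 defects.
a, b = update_beta(1.0, 1.0, defects=2, sample_size=50)
print(decide(a, b, p_max=0.10))  # posterior mean = 3/52, roughly 0.058
```

The paper goes further, using backwards induction on a decision tree to choose the sample size itself; the sketch only shows the distribution update that drives that computation.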


Cited by
Journal ArticleDOI
TL;DR: A review of the book Mathematical Theory of Reliability, published in the Journal of the Operational Research Society.
Abstract: (1966). Mathematical Theory of Reliability. Journal of the Operational Research Society: Vol. 17, No. 2, pp. 213-215.

578 citations

Journal ArticleDOI
TL;DR: Stuart Coles's book on the modeling of extreme values provides an introductory, modeling-oriented text on the topic, with an emphasis on different types of data and analytical approaches, meant for readers with a moderate statistical background.
Abstract: The modeling of extreme values is important to scientists in such fields as hydrology, civil engineering, environmental science, oceanography and finance. Stuart Coles's book on the modeling of extreme values provides an introductory text on the topic. It is a modeling-oriented text with an emphasis on different types of data and analytical approaches. The book is laid out in nine chapters. Following introductory material and discussion of the necessary theoretical background are chapters on approaches to extreme values that focus on the different types of data that might be used in an extreme value analysis. These include models for block maxima, threshold models, models for data from stationary and nonstationary processes, and approaches based on point processes. A chapter covers the analysis of multivariate extremes, and the final chapter briefly covers such topics as Bayesian inference, Markov chains, and spatial extremes. Although this is not a data-driven text, it does contain numerous examples and analyses. These examples are used to illustrate the methodology; I would have preferred to see more motivation and interpretation of the results of the analyses. Datasets and S-PLUS programs for the analyses in the text are available at a website. These are easy to use for those slightly familiar with S-PLUS. The appendix describes the programs and illustrates how to access the data and use the programs. It also gives links to sites that provide other software. The text does not include problem sets; these would have been useful, especially if the text is to be used in coursework. The text by Reiss and Thomas (2001) contains more thoroughly analyzed datasets, although it is twice the length and not as streamlined as the text under review. The book is meant for individuals with a moderate statistical background. Those with coursework in maximum likelihood methods should have no difficulty reading and comprehending the text.
Overall, this is a good text for someone getting started in extreme value methods.

402 citations

Posted Content
TL;DR: In this article, the authors present a review of lot-size models which focus on coordinated inventory replenishment decisions between buyer and vendor and their impact on the performance of the supply chain.
Abstract: This article reviews lot-size models which focus on coordinated inventory replenishment decisions between buyer and vendor and their impact on the performance of the supply chain. These so-called joint economic lot size (JELS) models determine order, production and shipment quantities from the perspective of the supply chain with the objective of minimizing total system costs. This paper first describes the problem studied, introduces the methodology of the review and presents a descriptive analysis of the selected papers. Subsequently, papers are categorized and analyzed with respect to their contribution to the coordination of different echelons in the supply chain. Finally, the review highlights gaps in the existing literature and suggests interesting areas for future research.

257 citations
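The coordination idea behind JELS models can be illustrated with a deliberately simplified single-setup lot-size calculation: the chain-optimal quantity pools both parties' setup and holding costs instead of minimizing the buyer's cost alone. The cost structure and numbers below are made up for illustration and do not correspond to any specific model in the review.

```python
import math

# Toy illustration of "joint" lot sizing: one order quantity chosen to
# minimize the combined buyer + vendor cost D/Q*(S_b+S_v) + Q/2*(h_b+h_v).

def eoq(demand, setup, holding):
    """Classic single-party economic order quantity."""
    return math.sqrt(2 * demand * setup / holding)

def joint_eoq(demand, setup_buyer, setup_vendor, hold_buyer, hold_vendor):
    """Joint lot size when both parties' setup and holding costs are pooled."""
    return math.sqrt(2 * demand * (setup_buyer + setup_vendor)
                     / (hold_buyer + hold_vendor))

# The buyer alone would order eoq(1000, 50, 2); the chain-optimal quantity
# differs because it also accounts for the vendor's setup and holding costs.
print(round(joint_eoq(1000, 50, 400, 2, 1), 1))
```

The gap between the two quantities is exactly what JELS coordination mechanisms (quantity discounts, side payments, shipment policies) are designed to bridge.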

01 Jan 2012
TL;DR: In this article, the authors study a reliable joint inventory-location problem that optimizes facility locations, customer allocations, and inventory management decisions when facilities are subject to disruption risks (e.g., due to natural or man-made hazards).
Abstract: This paper studies a reliable joint inventory-location problem that optimizes facility locations, customer allocations, and inventory management decisions when facilities are subject to disruption risks (e.g., due to natural or man-made hazards). When a facility fails, its customers may be reassigned to other operational facilities in order to avoid the high penalty costs associated with losing service. The authors propose an integer programming model that minimizes the sum of facility construction costs, expected inventory holding costs and expected customer costs under normal and failure scenarios. The authors develop a Lagrangian relaxation solution framework for this problem, including a polynomial-time exact algorithm for the relaxed nonlinear subproblems. Numerical experiments show that the proposed model is capable of providing a near-optimum solution within a short computation time. Managerial insights on the optimal facility deployment, inventory control strategies, and the corresponding cost structures are drawn.

128 citations
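The Lagrangian relaxation framework mentioned above can be sketched on a much simpler cousin of this model, the plain uncapacitated facility-location problem: relax the one-facility-per-customer constraints, solve each facility's subproblem by inspection, and improve the multipliers with a subgradient step. The instance, step sizes, and names below are made up; this is a generic textbook-style sketch, not the authors' algorithm.

```python
# Lagrangian relaxation for uncapacitated facility location:
#   min sum_j f_j y_j + sum_ij c_ij x_ij  s.t.  sum_j x_ij = 1,  x_ij <= y_j.
# Relaxing the equality constraints with multipliers lam[i] decomposes the
# problem by facility, and any lam gives a valid lower bound (weak duality).

def lagrangian_bound(fixed, cost, lam):
    """Lower bound L(lam). fixed[j]: opening cost; cost[i][j]: assignment
    cost of customer i to facility j; lam[i]: multiplier for customer i."""
    bound = sum(lam)
    assign = []  # assign[j][i] = 1 if serving i is profitable for open j
    for j in range(len(fixed)):
        reduced = [min(0.0, cost[i][j] - lam[i]) for i in range(len(lam))]
        if fixed[j] + sum(reduced) < 0:      # open facility j in relaxation
            bound += fixed[j] + sum(reduced)
            assign.append([1 if r < 0 else 0 for r in reduced])
        else:
            assign.append([0] * len(lam))
    return bound, assign

def subgradient(fixed, cost, iters=100, step=1.0):
    """Raise the lower bound by adjusting multipliers along the subgradient."""
    lam = [0.0] * len(cost)
    best = float("-inf")
    for _ in range(iters):
        bound, assign = lagrangian_bound(fixed, cost, lam)
        best = max(best, bound)
        # subgradient of the relaxed constraint sum_j x_ij = 1
        g = [1 - sum(assign[j][i] for j in range(len(fixed)))
             for i in range(len(lam))]
        lam = [l + step * gi for l, gi in zip(lam, g)]
        step *= 0.95                          # simple diminishing step size
    return best

fixed = [10.0, 12.0]                          # facility opening costs
cost = [[1.0, 6.0], [2.0, 5.0], [7.0, 1.0]]   # cost[i][j] for 3 customers
print(round(subgradient(fixed, cost), 2))
```

The reliable inventory-location model adds failure scenarios and nonlinear inventory terms to the subproblems, but the decompose-and-update structure is the same.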

Proceedings ArticleDOI
A. Blanc
01 Jan 1993

127 citations