
Showing papers in "American Journal of Applied Mathematics and Statistics in 2017"


Journal ArticleDOI
TL;DR: In this article, a new and efficient approach for finding an initial basic feasible solution to transportation problems is proposed, named the Inverse Coefficient of Variation Method (ICVM) and illustrated with seven numerical examples.
Abstract: In this research, a new and efficient approach for finding an initial basic feasible solution to transportation problems is proposed. The proposed approach is named the “Inverse Coefficient of Variation Method (ICVM)”, and the method is illustrated with seven numerical examples. Six existing methods, the North West Corner Method (NWCM), Column Minimum Method (CMM), Least Cost Method (LCM), Row Minimum Method (RMM), Vogel’s Approximation Method (VAM), and Allocation Table Method (ATM), were compared with the proposed approach. It can be said conclusively that the proposed Inverse Coefficient of Variation Method (ICVM) provides an improved initial basic feasible solution for all the transportation problems used in the experiment. Further, the new method leads to the optimal solution for many of the problems considered.
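The abstract does not spell out the allocation rule behind ICVM, so it is not reproduced here. As context, the following is a minimal sketch of the North West Corner Method, the simplest of the baseline methods listed above; NWCM ignores the cost matrix, and the supply/demand figures below are hypothetical and assumed balanced.

```python
# Minimal sketch of the North West Corner Method (NWCM), one of the baselines
# compared in the paper; the proposed ICVM itself is not reproduced here.
# Supply and demand are assumed to be balanced (hypothetical figures).
def north_west_corner(supply, demand):
    supply, demand = list(supply), list(demand)
    allocation = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])   # allocate as much as possible here
        allocation[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:                # row exhausted: move to next source
            i += 1
        else:                             # column satisfied: move to next sink
            j += 1
    return allocation

# Example: 3 sources with supplies 20, 30, 25 and 4 destinations with
# demands 10, 25, 20, 20 (totals balanced at 75).
print(north_west_corner([20, 30, 25], [10, 25, 20, 20]))
```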

7 citations


Journal ArticleDOI
TL;DR: Based on Type-I and Type-II generalized progressive hybrid censoring schemes, the maximum likelihood estimators and Bayes estimators for the unknown parameters of exponentiated Weibull lifetime model are derived as mentioned in this paper.
Abstract: Based on Type-I and Type-II generalized progressive hybrid censoring schemes, the maximum likelihood estimators and Bayes estimators for the unknown parameters of the exponentiated Weibull lifetime model are derived. The approximate asymptotic variance-covariance matrix and approximate confidence intervals based on the asymptotic normality of the classical estimators are obtained. Independent non-informative priors are considered for the unknown parameters to develop the Bayes estimators and the corresponding Bayes risks under a squared error loss function. The proposed estimators cannot be expressed in closed form and are evaluated numerically by a suitable iterative procedure. Finally, one real data set is analyzed for illustrative purposes.
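As a rough illustration of the uncensored special case only, the sketch below fits the exponentiated Weibull model by maximum likelihood using SciPy's built-in exponweib distribution on simulated data; the generalized progressive hybrid censoring and the Bayesian analysis of the paper would require a custom likelihood and are not reproduced.

```python
# Minimal sketch: MLE for the exponentiated Weibull model on complete
# (uncensored) simulated lifetimes.  The paper's censoring schemes and Bayes
# estimators are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical lifetimes from an exponentiated Weibull distribution.
data = stats.exponweib.rvs(a=1.5, c=2.0, scale=3.0, size=200, random_state=rng)

# MLEs of the two shape parameters and the scale (location fixed at zero).
a_hat, c_hat, loc_hat, scale_hat = stats.exponweib.fit(data, floc=0)
print(a_hat, c_hat, scale_hat)
```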

7 citations


Journal ArticleDOI
TL;DR: In this article, the authors measured the relationship between strategic alignment of IT investment returns and corporate performance and provided empirical evidence that closer alignment between corporate and IT strategies leads to increased IT ROI and improved corporate performance.
Abstract: Information technology (IT) investment and alignment methodologies require a thorough understanding of analyses of different parallel present values and strong internal rates of return. E-commerce has given a new dimension to IT investing that elevates the role of strong IT performance as a driver of corporate strategy. Stakeholders concerned with maximizing IT return on investment (ROI) recognize the importance of central, comprehensive information resources to effective strategic business planning. Alignment of corporate and IT strategies is now a vital element of business success. To empirically support this conclusion, this study measures the relationship between strategic alignment of IT investment returns and corporate performance. A descriptive research design using survey methodology was employed. The study included analyses of variable values involving stakeholders in banks, such as new customers and employees. Simple percentages, chi-square tests, tables, and weighted averages were used to analyze data from at least five (5) banks in the Ajman Emirate of the UAE to determine the degree of alignment and its impact on the two strategic dimensions. A binary logistic regression analysis using Chan’s STROIS model combined with Venkatraman’s STROBE model was proposed to collect survey data and determine the extent of the strategic alignment. The research results provide empirical evidence supporting the hypothesis that closer alignment between corporate and IT strategies leads to increased IT ROI and improved corporate performance. This relationship holds true for all firms regardless of strategic intent for IT. The study also shows a positive correlation between early adoption of newly emergent technologies and business competitive advantage, which supports the conclusion that strategic competition is imperative for corporate performance.

6 citations


Journal ArticleDOI
TL;DR: In this article, a generalized moment generating function is developed from the existing theory of moment generating functions as the expected value of powers of the exponential constant, which can be used to generate moments of positive and negative powers.
Abstract: This paper seeks to develop a generalized method of generating the moments of random variables and their probability distributions. The Generalized Moment Generating Function is developed from the existing theory of moment generating functions as the expected value of powers of the exponential constant. The methods were illustrated with the Beta and Gamma families of distributions and the Normal distribution. The methods were found to be able to generate moments of powers of random variables, enabling the generation of moments of not only integer powers but also real positive and negative powers. Unlike the traditional moment generating function, the generalized moment generating function can generate central moments and always exists for all continuous distributions, but it has not been developed for any discrete distribution.
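The paper's Generalized MGF is not spelled out in the abstract, so only the classical special case it generalizes is illustrated here: a numerical check that the derivative of M(t) = E[exp(tX)] at t = 0 recovers the first raw moment, using a Gamma example.

```python
# Minimal numerical sketch of the classical MGF M(t) = E[exp(tX)]; its
# derivative at t = 0 gives the first raw moment E[X].  The paper's
# Generalized MGF (real powers, central moments) is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
k, theta = 3.0, 2.0                        # Gamma(k, theta), so E[X] = k * theta
x = rng.gamma(k, theta, size=1_000_000)

def mgf(t):
    return np.mean(np.exp(t * x))          # Monte Carlo estimate of E[exp(tX)]

h = 1e-4
first_moment = (mgf(h) - mgf(-h)) / (2 * h)   # central difference for M'(0)
print(first_moment, k * theta)                 # both close to 6.0
```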

4 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide probabilistic-mechanistic models for describing cell kill (K) and cell sub-lethal damage (SL) for one fraction of a radiation dose absorbed by a living tissue; the article also provides the K and SL formalisms for fractionated irradiation regimens.
Abstract: This document provides probabilistic-mechanistic models for describing cell kill (K) and cell sub-lethal damage (SL) for one fraction of a radiation dose absorbed by a living tissue; it also provides the K and SL formalisms for fractionated irradiation regimens. These models and formalisms are based on the real mean behavior of cell survival (S), the complement of K, and on strong probabilistic-radiobiological foundations. The K and SL formalisms include all possible factors affecting the biological radiation effects: dose (d), fractionation (n), SL, and the temporal factors of cell repair and cell repopulation. Some aspects of the widely used linear-quadratic (LQ) S(d) model and LQ S(n,d) formalism, and of one of their derivations, the BED (biologically effective dose), are discussed. The SMp K(d) parameters can be obtained from S data or by using graphical/analytical tools developed in this study. These new formalisms will be useful for simulations of treatments and, together with regional damage distributions, for optimization of treatment planning.
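For reference, these are the standard linear-quadratic survival model and the BED formula that the abstract discusses; the SMp K(d) and SL formalisms themselves are not reproduced here.

```latex
% Standard LQ cell-survival model for a single fraction of dose d, its
% n-fraction extension, and the biologically effective dose (BED):
\[
  S(d) = e^{-(\alpha d + \beta d^{2})}, \qquad
  S(n,d) = e^{-n(\alpha d + \beta d^{2})}, \qquad
  \mathrm{BED} = n\,d\left(1 + \frac{d}{\alpha/\beta}\right),
\]
% where n is the number of fractions, d the dose per fraction, and
% \alpha/\beta the tissue-specific LQ parameter ratio.
```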

3 citations


Journal ArticleDOI
TL;DR: An extended version of the well-known Chapman-Kolmogorov Equations (CKEs) is used to model the state transition of the probability mass function of each side of the dice during the game and represent the time-dependent propensity of the game by a simple regression process.
Abstract: We present a mathematical formulation of the Multiple Dice Rolling (MDR) game and develop an adaptive computational algorithm to simulate such a game over time. We use an extended version of the well-known Chapman-Kolmogorov Equations (CKEs) to model the state transition of the probability mass function of each side of the dice during the game, and we represent the time-dependent propensity of the game by a simple regression process, which enables us to capture the change in the expectation over time. Furthermore, we perform a quantitative analysis of the outcome of the game in a framework of the Average Probability Value (APV) of appearance of a side of the dice over trials. The power of our approach is demonstrated. Our results also suggest that in the MDR game, the APV of appearance of a side of a die can be appropriately predicted independently of the number of sides and trials.
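As a toy illustration of the Chapman-Kolmogorov machinery referred to above, the sketch below propagates the probability mass function over the sides of a die with a hypothetical row-stochastic transition matrix and computes the average probability value of each side across trials; the paper's extended CKE and regression-based propensity model are not reproduced.

```python
# Minimal sketch: propagate a pmf over die sides with the discrete
# Chapman-Kolmogorov step p_{t+1} = p_t P, then compute each side's APV.
# The transition matrix is purely hypothetical.
import numpy as np

sides = 6
P = np.full((sides, sides), 0.04)          # small drift to every other side
np.fill_diagonal(P, 0.80)                  # rows sum to 1: 0.80 + 5 * 0.04

p = np.full(sides, 1.0 / sides)            # uniform pmf at the first trial
history = [p]
for _ in range(50):                        # 50 trials
    p = p @ P                              # Chapman-Kolmogorov step
    history.append(p)

apv = np.mean(history, axis=0)             # Average Probability Value per side
print(apv)
```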

3 citations


Journal ArticleDOI
TL;DR: The exponential distribution is considered in situations where intervals between events are of interest as well as where a skewed distribution is appropriate, as mentioned in this paper, and the exponential distribution also plays a key role in survival analysis.
Abstract: The exponential distribution is considered in situations where intervals between events are of interest as well as where a skewed distribution is appropriate. The exponential distribution also plays a key role in survival analysis. Goodness-of-fit testing for exponentiality is crucial because, in the natural sciences, some commonly used distributions such as the gamma and Weibull distributions are generalizations of the exponential distribution. Several well-known exponentiality tests are reviewed. A power comparison is performed using simulation.
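A minimal sketch of the kind of simulation-based power comparison described, using only the Kolmogorov-Smirnov test with the rate estimated from the sample; the dedicated exponentiality tests reviewed in the paper are not reproduced, and estimating the parameter makes the nominal KS level only approximate.

```python
# Estimate size and power of a KS-based exponentiality check by simulation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, reps, alpha = 50, 2000, 0.05

def rejection_rate(sampler):
    """Fraction of simulated samples for which exponentiality is rejected."""
    rejections = 0
    for _ in range(reps):
        x = sampler(n)
        # KS test against an exponential with scale estimated from the sample.
        p = stats.kstest(x, 'expon', args=(0, x.mean())).pvalue
        rejections += p < alpha
    return rejections / reps

print("size under exponential ≈", rejection_rate(lambda m: rng.exponential(1.0, m)))
print("power under Weibull(1.5) ≈", rejection_rate(lambda m: rng.weibull(1.5, m)))
```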

3 citations


Journal ArticleDOI
TL;DR: In this article, the authors study the effect of slip velocity on blood flow through an arterial tube in the presence of multiple stenoses, and the results are shown in graphical form and discussed.
Abstract: The aim of the present analysis is to study the effect of slip velocity on blood flow through an arterial tube in the presence of multiple stenoses. The effects of the length of the stenosis, the shape parameter, and the parameter γ on the resistance to flow and the shear stress have been incorporated here. The results have been shown in graphical form and discussed.

3 citations


Journal ArticleDOI
TL;DR: In this article, the authors compared the accuracies of Discriminant Analysis model (DA) and Artificial Neural Networks model (ANN) for classification and prediction of Friesian cattle fertility status by using its reproductive traits.
Abstract: Background & objectives: This study was undertaken to compare the accuracies of a discriminant analysis model (DA) and an artificial neural network model (ANN) for classification and prediction of Friesian cattle fertility status using reproductive traits. Methods: Data were collected through a field survey of 2843 animal records of the Friesian breed belonging to farms in El Dakhalia province, Egypt, covering the period from 2010 to 2013. The samples of dairy production sectors were selected randomly. Data were collected from valid farm records or from structured questionnaires established by the researcher. Results: The classification results indicated that the artificial neural network (ANN) model is more efficient than the discriminant analysis (DA) model in terms of overall classification accuracy and the accuracies of correctly classified cases of fertility status for Friesian cattle. The ANN model showed the highest classification accuracy (93.6%) for the year 2010, compared with 79.9% for DA. The comparison of overall classification accuracies clearly favored the supremacy of ANN over DA. The results were also confirmed by the areas under the Receiver Operating Characteristic (ROC) curves obtained for ANN and DA; ROC curves are used mainly for comparing different discriminating rates. The areas under the ROC curves were higher for the ANN models across the different years compared with the DA models. The differences in accuracies were also significant at the 5% level of significance (p-value 0.005, paired-sample t-test). From all of the above we can conclude that the artificial neural network model was more accurate in prediction and classification of fertility status than the traditional statistical model (discriminant analysis).
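A minimal sketch of the DA-versus-ANN comparison on synthetic data (the cattle records are not available): linear discriminant analysis and a small multilayer perceptron are fitted and scored by the area under the ROC curve, the same criterion used in the study.

```python
# Compare a discriminant analysis model and a small neural network by ROC AUC
# on a synthetic binary classification problem standing in for the traits.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=2000, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

da = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)

for name, model in [("DA", da), ("ANN", ann)]:
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(name, round(auc, 3))
```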

3 citations


Journal ArticleDOI
TL;DR: The most current results of the statistical models project (SMp) related to probabilistic-mechanistic NTCP models for evaluating radiation injury (RI) are provided and some problems of the current radiobiological concepts and models are discussed.
Abstract: We provide the most current results of the statistical models project (SMp) related to probabilistic-mechanistic NTCP models for evaluating radiation injury (RI), and we discuss some problems of the current radiobiological concepts and models. The SMp NTCP models as a function of a reference dose (Dref) were formulated, and the SMp (Dref) models were applied to two QUANTEC studies. The SMp is expected to provide less complex and more advantageous models and parameters. Given their unquestionable common elements with the Lyman-Kutcher-Burman (LKB) and sigmoid models derived from logistic functions, which are well-established, clinically validated NTCP models in clinical radiobiology, the SMp NTCP models will be applicable and, in view of the negative remarks pointed out, could even replace them. To date, real datasets concerning radiation injuries have been fitted with mathematical models that have some similarities with probabilistic models. All datasets that have been fitted with mathematical models that have similarities with probabilistic functions (PFs) can also be fitted with PFs.
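For reference, this is the standard Lyman-Kutcher-Burman NTCP model mentioned above, with the usual generalized effective-dose reduction; the SMp NTCP models themselves are not reproduced here.

```latex
% Lyman-Kutcher-Burman (LKB) NTCP model with DVH reduction to an effective dose:
\[
  \mathrm{NTCP} = \Phi(t), \qquad
  t = \frac{D_{\mathrm{eff}} - TD_{50}}{m\, TD_{50}}, \qquad
  D_{\mathrm{eff}} = \Bigl(\sum_i v_i D_i^{1/n}\Bigr)^{\!n},
\]
% where \Phi is the standard normal CDF, TD_{50} the uniform dose giving a 50%
% complication probability, m the slope parameter, n the volume-effect
% parameter, and (v_i, D_i) the dose-volume histogram bins.
```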

3 citations


Journal ArticleDOI
TL;DR: In this article, the Generalized Binomial Distribution (GBD) combined with some basic financial concepts is applied to generate a model for determining the prices of European call and put options.
Abstract: In this work, the Generalized Binomial Distribution (GBD) combined with some basic financial concepts is applied to generate a model for determining the prices of European call and put options. To demonstrate the behavior of the option prices (call and put) with respect to the model variables, some numerical examples and graphical illustrations are given in a concrete setting to illustrate the application of the results of the study. It was observed that an increase in the strike price leads to a decrease in the call option price C(0) and an increase in the put option price P(0); a decrease in the interest rate leads to a decrease in the call option price C(0) and an increase in the put option price P(0); and a shorter time to expiration leads to a decrease in both the call option price C(0) and the put option price P(0). It was also found that the problem of option pricing can be approached using the Generalized Binomial Distribution (GBD) together with financial terms.
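A minimal sketch of standard binomial option pricing (Cox-Ross-Rubinstein style) that reproduces the qualitative strike-price effect described above; the paper's Generalized Binomial Distribution model itself is not reproduced, and all parameters below are hypothetical.

```python
# Price European call and put options with an n-step binomial tree, summing
# payoffs over the binomial distribution of terminal prices.
import math
from scipy.stats import binom

def european_prices(S0, K, r, u, d, T, n):
    """European call and put prices with n binomial steps over maturity T."""
    dt = T / n
    q = (math.exp(r * dt) - d) / (u - d)          # risk-neutral up probability
    disc = math.exp(-r * T)
    k = range(n + 1)
    ST = [S0 * u**j * d**(n - j) for j in k]      # terminal stock prices
    pmf = binom.pmf(list(k), n, q)
    call = disc * sum(p * max(s - K, 0.0) for p, s in zip(pmf, ST))
    put = disc * sum(p * max(K - s, 0.0) for p, s in zip(pmf, ST))
    return call, put

# Higher strike: the call price falls and the put price rises.
print(european_prices(100, 95, 0.05, 1.1, 0.9, 1.0, 50))
print(european_prices(100, 105, 0.05, 1.1, 0.9, 1.0, 50))
```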

Journal ArticleDOI
TL;DR: Results from this study indicate that various types of PA protect adults from all-cause mortality and a dose-response relationship exists between the number of PA modes adopted and risk of mortality.
Abstract: Purpose: The purpose of this study was to examine the protective effects of different modes of physical activity (PA) on all-cause mortality in adults. Methods: Data for this research came from the 2001-2002 National Health and Nutrition Examination Survey (NHANES). Participants 18+ years of age who were eligible for mortality linkage were used in the analysis. Different modes of PA were determined from a series of questions asking respondents if they participated in transportation (TPA), home/yard (HPA), moderate recreational (MPA), vigorous recreational (VPA), or muscle strengthening (MSPA) physical activity. Respondents answering “yes” to a given question were considered to participate in that PA mode. Cox proportional hazards regression was used to model the effects of PA mode on mortality while controlling for age, sex, race, and income. Results: Adults were at less risk of mortality if they participated in TPA (Hazard Ratio (HR) = 0.72, 95% CI: 0.57-0.90), HPA (HR = 0.43, 95% CI: 0.33-0.55), VPA (HR = 0.30, 95% CI: 0.23-0.38), MPA (HR = 0.53, 95% CI: 0.45-0.62), and MSPA (HR = 0.44, 95% CI: 0.32-0.60). The adjusted model showed a 24.0% decrease in mortality (HR = 0.76, 95% CI: 0.67-0.85) for each additional PA mode adopted. Conclusions: Results from this study indicate that various types of PA protect adults from all-cause mortality. Additionally, a dose-response relationship exists between the number of PA modes adopted and the risk of mortality.
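A minimal sketch of the Cox proportional hazards modelling described, using the lifelines package on hypothetical data; the NHANES mortality linkage, survey design, and the full set of covariates are not reproduced.

```python
# Fit a Cox proportional hazards model on hypothetical follow-up data, with
# the number of PA modes adopted as a covariate, adjusted for age and sex.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "followup_years": rng.exponential(10, n),   # time to death or censoring
    "died": rng.integers(0, 2, n),              # event indicator
    "n_pa_modes": rng.integers(0, 6, n),        # number of PA modes adopted
    "age": rng.integers(18, 85, n),
    "male": rng.integers(0, 2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()   # hazard ratios for each covariate (hypothetical data)
```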

Journal ArticleDOI
TL;DR: The effectiveness of mathematics in the natural sciences was characterized by the famous Nobel laureate E. P. Wigner as being unreasonable, as discussed by the authors, and it is not difficult for one to understand that this characterization is related to a question that has occupied the interest of philosophers, mathematicians and other scientists at least from Plato's era in antiquity until today: “Is mathematics discovered or invented by humans?”
Abstract: The effectiveness of mathematics in the natural sciences was characterized by the famous Nobel laureate E. P. Wigner as being unreasonable. It is not difficult for one to understand that this characterization is related to a question that has occupied the interest of philosophers, mathematicians and other scientists at least from Plato’s era in antiquity until today: “Is mathematics discovered or invented by humans?” In the present work, in an effort to obtain a convincing explanation of Wigner’s “enigma”, the existing philosophical views about this question are critically examined and discussed in connection with the advances in the history of mathematics that affected human beliefs about it.

Journal ArticleDOI
TL;DR: In this paper, a higher-order modification of Newton's method for solving nonlinear equations based on the undetermined coefficients is presented, which can be applied to any iteration formula.
Abstract: In this article we construct some higher-order modifications of Newton’s method for solving nonlinear equations, based on undetermined coefficients. This construction can be applied to any iteration formula. Per iteration, the resulting methods add only one additional function evaluation, while their order of convergence is increased by two or three units. The higher-order convergence of our methods is proved, and the corresponding asymptotic error constants are expressed. Numerical examples, obtained using Matlab with high-precision arithmetic, are shown to demonstrate the convergence and efficiency of the combined iterative methods. It is found that the combined iterative methods produce very good results on the tested examples, compared to the results produced by existing higher-order schemes in the related literature.
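The authors' undetermined-coefficients construction is not given in the abstract, so the sketch below only illustrates the general idea of "one extra function evaluation, higher order": a two-step Newton variant that reuses the derivative from the predictor step and converges cubically, compared with plain Newton. It is not the paper's specific scheme.

```python
# Plain Newton versus a generic two-step variant with a frozen derivative:
# one extra function evaluation per iteration, cubic convergence.
def newton(f, fprime, x, tol=1e-12, maxit=50):
    for _ in range(maxit):
        dx = f(x) / fprime(x)
        x -= dx
        if abs(dx) < tol:
            break
    return x

def two_step_newton(f, fprime, x, tol=1e-12, maxit=50):
    for _ in range(maxit):
        fp = fprime(x)
        y = x - f(x) / fp          # Newton predictor
        x_new = y - f(y) / fp      # corrector reusing the same derivative
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

f = lambda x: x**3 - 2.0
fp = lambda x: 3.0 * x**2
print(newton(f, fp, 1.5), two_step_newton(f, fp, 1.5))   # both ≈ 2 ** (1/3)
```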

Journal ArticleDOI
TL;DR: In this paper, the authors apply new numerical techniques based on the discontinuous boundary element method (DBEM) to higher-order quantity elements for the solution of the Laplace equation.
Abstract: This paper deals with solving for the quantity over elements using new numerical techniques based on the discontinuous boundary element method (DBEM). The common practice when obtaining a solution with BEM is to use constant elements, so that, in a sub-parametric element, the quantity has a constant value along the element while the geometry discretization is assumed to have a linear variation. However, using a higher-order (polynomial) distribution of the quantity over the elements can give a better description of the physical process. For this purpose, the corresponding discretized expressions based on the new techniques are derived and used for the solution of the Laplace equation. Many results for the quantity elements are presented and discussed for an ellipse at various diameters and mesh numbers.

Journal ArticleDOI
TL;DR: In this paper, a simulation study was carried out in SAS to examine the performance of two methods commonly used with general sample sizes, the Generalized Estimating Equations (GEE) method and Generalized Linear Mixed Models (GLMM), for the analysis of binary RMD with small sample sizes, after adjusting for the bias that occurs in small samples.
Abstract: Binary repeated measurements occur often in a variety of fields. In medicine in particular, small samples are used in the early phases (phase I and II) of clinical trials, in bioequivalence studies, and in crossover trials involving human participants. Hence, it is vital to develop a precise method to analyze binary Repeated Measures Data (RMD) with small sample sizes, whether the data relate to humans or to animals. As a result, this simulation study was carried out in SAS to examine the performance of two methods commonly used with general sample sizes, the Generalized Estimating Equations (GEE) method and Generalized Linear Mixed Models (GLMM), for the analysis of binary RMD with small sample sizes, after adjusting for the bias that occurs in small samples. Motivated by the literature, large-scale simulations were carried out for each method using the PROC GENMOD and PROC GLIMMIX procedures respectively, along with the varying small-sample bias correction options available in SAS, namely the Sandwich Variance Estimation (SVE) technique and its variants. Each method, with all possible SVE techniques available in SAS, was compared and contrasted with respect to the following properties: Type I error, power, unbiasedness, consistency, sufficiency, convergence, speed of computation, and efficiency. The results of the simulation study showed that, for binary RMD that follow an AR(1) process, with no missing values and no covariates, GLMM with the SVE techniques FIROEEQ and ROOT performs equally and exceptionally well for small-sample binary repeated measures with respect to all the properties of the parameter estimates considered except sufficiency. However, the GEE method with the naive option, while marginal with respect to Type I error, performs well in analyzing very small sample sizes and satisfies all the properties including sufficiency.
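A minimal Python analogue of the GEE side of this comparison (the SAS PROC GENMOD / PROC GLIMMIX workflow and the small-sample bias corrections are not reproduced): hypothetical binary repeated measures for a small number of subjects are fitted with a marginal GEE model; statsmodels also offers an autoregressive working correlation structure that could be swapped in to mirror the AR(1) setting.

```python
# Fit a marginal GEE model to small-sample binary repeated measures
# (hypothetical data, exchangeable working correlation).
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_subjects, n_times = 10, 4                      # deliberately small sample
df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_times),
    "time": np.tile(np.arange(n_times), n_subjects),
    "treat": np.repeat(rng.integers(0, 2, n_subjects), n_times),
})
logit = -0.5 + 0.8 * df["treat"]
df["y"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # binary responses

model = sm.GEE.from_formula("y ~ treat + time", groups="subject", data=df,
                            family=sm.families.Binomial(),
                            cov_struct=sm.cov_struct.Exchangeable())
print(model.fit().summary())
```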

Journal ArticleDOI
TL;DR: In this article, the effects of gender, age, and environment on the academic performance of pupils in primary school were studied using factor analysis, and the analysis identified three factors (gender, age, and environment) as the cause of the variation in the pupils' performance in primary school.
Abstract: Over the years, efforts have been made by researchers to study the effects of certain factors on the academic performance of students, though with little attention to the primary section of education. This research therefore examined the academic performance of pupils in primary school. The variation in performance and the factor(s) causing it were studied using factor analysis. The data used are secondary data collected from the Federal University Wukari Staff School, consisting of the terminal examination scores of the pupils in seven selected subjects over one academic session (2015/2016). From both the unrotated and rotated factor analysis results, we observed a fair relationship between the mathematical and less mathematical subjects, although the major variation in the pupils’ performance lies in the less mathematical subjects such as English Language, Verbal Aptitude, Social Studies, Creative Art, and Religious Studies. Also, the analysis identified three factors (gender, age, and environment) as the cause of the variation in the pupils’ performance in primary school.
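A minimal sketch of an exploratory factor analysis of subject scores in the spirit of the study; the staff-school data are not available, so random scores stand in, and two of the seven subject names below ("Mathematics" and "Quantitative Reasoning") are hypothetical placeholders for the mathematical subjects. Only an unrotated solution is shown.

```python
# Extract three factors from hypothetical scores in seven subjects and
# inspect the (unrotated) factor loadings.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(3)
subjects = ["Mathematics", "Quantitative Reasoning", "English Language",
            "Verbal Aptitude", "Social Studies", "Creative Art",
            "Religious Studies"]
# Hypothetical terminal-examination scores for 120 pupils.
scores = pd.DataFrame(rng.normal(60, 15, size=(120, len(subjects))),
                      columns=subjects)

fa = FactorAnalysis(n_components=3, random_state=0).fit(scores)
loadings = pd.DataFrame(fa.components_.T, index=subjects,
                        columns=["Factor1", "Factor2", "Factor3"])
print(loadings.round(2))   # loadings of each subject on the three factors
```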