Showing papers by "YMCA University of Science and Technology published in 2014"


Proceedings ArticleDOI
17 Apr 2014
TL;DR: The paper surveys research work in Agile software development and estimation in Agile, examines the problems in current Agile practices, and thereby proposes a method for accurate cost and effort estimation.
Abstract: Projects that are over-budget, delivered late, and fall short of users' expectations have been a common problem for software development efforts for years. Agile methods, which represent an emerging set of software development methodologies based on the concepts of adaptability and flexibility, are currently touted as a way to alleviate these recurring problems and pave the way for the future of development. Estimation in Agile software development methods depends on expert opinion and historical project data for estimating cost, size, effort and duration. In the absence of historical data and experts, previous methods like analogy and planning poker are not useful. This paper focuses on the research work in Agile software development and estimation in Agile. It also examines the problems in current Agile practices and thereby proposes a method for accurate cost and effort estimation.

67 citations


Journal ArticleDOI
TL;DR: In this article, a graph theoretic approach has been applied to find the intensity of these barriers through an index which is computed through a permanent function obtained from the digraph of TPM barriers.
Abstract: Total productive maintenance (TPM) is an innovative approach to maintenance which holds the potential for enhancing the effectiveness of production facilities. But implementation of TPM is not an easy task. Innumerable barriers are encountered in real-life cases during TPM implementation. It is essential to evaluate the nature and impact of these barriers so that production and maintenance managers can devise strategies to overcome them. In the present work, a graph theoretic approach has been applied to find the intensity of these barriers through an index which is computed from a permanent function obtained from the digraph of TPM barriers.

52 citations


Journal ArticleDOI
Abstract: The object of this paper is to establish the existence and uniqueness of coupled fixed points under a ( , )-contractive condition for mixed monotone operators in the setup of partially ordered metric spaces. The presented work generalizes the recent results of Berinde (2011, 2012) and weakens the contractive conditions involved in the well-known results of Bhaskar and Lakshmikantham (2006), and Luong and Thuan (2011). The effectiveness of our work is validated with the help of a suitable example. As an application, we give a result of existence and uniqueness for the solutions of a class of nonlinear integral equations.

50 citations


Journal ArticleDOI
TL;DR: It was concluded that addition of vermicompost, in appropriate quantities, to potting media has synergistic effects on growth and yield of marigold.
Abstract: This paper reports the influence of vermicomposts prepared from cow dung and household waste on the growth and flowering of the marigold crop. A total of seven potting media were prepared containing soil, cow dung vermicompost and cow dung + household waste vermicompost. The fertility status of the soil and vermicomposts was quantified. In these media, growth and flowering of marigold plant seedlings were studied for 60 days. The results showed that the vermicomposting process converted the cow dung and household waste into a highly stabilized product having a C:N ratio < 20.0. The NPK content of the vermicomposts was higher than that of soil. The plants grown in vermicompost-containing potting media had 2.3 times the plant height of the control. Results showed that the addition of vermicompost, in appropriate quantities, to potting media has significantly positive effects on the growth and flowering of marigold seedlings, including plant biomass, plant height, and number of buds and flowers. It was concluded that the addition of vermicompost, in appropriate quantities, to potting media has synergistic effects on the growth and yield of marigold.

45 citations


Journal ArticleDOI
TL;DR: Application of multi-objective optimization on the basis of ratio analysis approach is explored to solve such type of decision making problems in robust and flexible production systems.
Abstract: Decisions in robust and flexible production systems are made in an environment often characterized by complexity, need for flexibility, and inclusion of a decision-maker's subjectivity. Typically in the production system life cycle, decisions on product design, facility location, facility layout, supplier, material, technology, and so forth have to be made in an efficient and timely manner. These decisions are complex because the decision makers have to assess a wide range of alternatives based on a set of conflicting criteria. In this paper, the application of the multi-objective optimization on the basis of ratio analysis approach is explored to solve such decision-making problems. Moreover, the performance of the reference point approach is also tested for the considered decision-making problems.

44 citations


Journal ArticleDOI
30 Sep 2014
TL;DR: The main focus of the research was to find a technique that can efficiently perform sentiment analysis on big data sets using Hadoop; the experimental results show that the technique exhibits very good efficiency in handling big sentiment data sets.
Abstract: The rapid increase in the volume of sentiment-rich social media on the web has resulted in an increased interest among researchers in sentiment analysis and opinion mining. However, with so much social media available on the web, sentiment analysis is now considered a big data task. Hence the conventional sentiment analysis approaches fail to efficiently handle the vast amount of sentiment data available nowadays. The main focus of the research was to find a technique that can efficiently perform sentiment analysis on big data sets: a technique that can categorize text as positive, negative or neutral in a fast and accurate manner. In the research, sentiment analysis was performed on a large data set of tweets using Hadoop, and the performance of the technique was measured in terms of speed and accuracy. The experimental results show that the technique exhibits very good efficiency in handling big sentiment data sets.
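The abstract does not reproduce the classification rule itself, so the sketch below only illustrates the per-tweet step of labelling text as positive, negative or neutral, using a tiny hand-made word lexicon; the word lists and example tweets are invented for illustration, not taken from the paper. In the Hadoop setting the paper describes, a function like `classify` would typically run inside the map phase over the tweet corpus.

```python
# Minimal lexicon-based sentiment labelling, sketching the
# positive / negative / neutral step described above.
# The lexicons below are illustrative placeholders.
POSITIVE = {"good", "great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "terrible", "hate", "poor", "sad"}

def classify(tweet: str) -> str:
    words = tweet.lower().split()
    # net count of positive minus negative lexicon hits
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(classify("I love this great phone"))       # positive
print(classify("terrible battery bad screen"))   # negative
print(classify("the sky is blue"))               # neutral
```

In a MapReduce job, the mapper would emit `(label, 1)` pairs and a reducer would aggregate counts per label; the lexicon approach here stands in for whatever classifier the paper actually used.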

37 citations


Journal ArticleDOI
TL;DR: A selective algorithm for allocation of cloud resources to end-users on an on-demand basis is discussed, based on the min-min and max-min algorithms; experiments show that the overall makespan of tasks on a given set of VMs decreases significantly in different scenarios.
Abstract: The modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of Cloud computing. A Cloud computing environment involves high-cost infrastructure on one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selective algorithm for allocation of cloud resources to end-users on an on-demand basis. This algorithm is based on the min-min and max-min algorithms, two conventional task scheduling algorithms. The selective algorithm uses certain heuristics to select between the two so that the overall makespan of tasks on the machines is minimized. The tasks are scheduled on machines in either a space-shared or time-shared manner. We evaluate our provisioning heuristics using a cloud simulator called CloudSim. We also compare our approach to the statistics obtained when provisioning of resources was done in a First-Come-First-Serve (FCFS) manner. The experimental results show that the overall makespan of tasks on a given set of VMs decreases significantly in different scenarios.
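The abstract names min-min and max-min without reproducing them, so here is a minimal sketch of both greedy heuristics and of the selective idea of taking whichever yields the smaller makespan. The task lengths and the two identical VMs are illustrative assumptions; the paper's CloudSim experiments additionally model VM speeds and space-shared vs. time-shared policies.

```python
# Sketch of the min-min / max-min list-scheduling heuristics the
# selective algorithm chooses between. Tasks are plain run lengths
# and all VMs are assumed identical (an illustrative simplification).
def schedule(tasks, n_vms, pick_max=False):
    """Greedy scheduling; returns the makespan.
    pick_max=False -> min-min, pick_max=True -> max-min."""
    ready = [0.0] * n_vms            # time at which each VM becomes free
    remaining = list(tasks)
    while remaining:
        # best completion time of each remaining task over all VMs
        best = [(min(ready[v] + t for v in range(n_vms)), t) for t in remaining]
        # min-min takes the task finishing soonest; max-min the one
        # whose earliest possible finish is latest
        ct, task = max(best) if pick_max else min(best)
        vm = min(range(n_vms), key=lambda v: ready[v] + task)
        ready[vm] += task
        remaining.remove(task)
    return max(ready)

tasks = [8, 7, 6, 2, 2, 1]
mm = schedule(tasks, 2)                  # min-min makespan: 16.0
xm = schedule(tasks, 2, pick_max=True)   # max-min makespan: 13.0
# a selective policy keeps whichever heuristic wins for this task mix
print(min(mm, xm))
```

For this mix of a few long and several short tasks, max-min wins because it spreads the long tasks across VMs first; with mostly short tasks min-min tends to win, which is exactly the trade-off a selective heuristic exploits.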

34 citations


Journal ArticleDOI
TL;DR: In this article, the authors develop a model and analysis of the productivity variables of flexible manufacturing system (FMS) performance; actions to increase productivity are said to improve profitability and the wage earning capacity of employees.
Abstract: Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to develop a model and analysis of the productivity variables of FMS. This study was performed using different approaches, viz. interpretive structural modelling (ISM), structural equation modelling (SEM), graph theory and matrix approach (GTMA), and a cross-sectional survey within manufacturing firms in India. ISM has been used to develop a model of productivity variables, which has then been analyzed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out through SEM. EFA is applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors are confirmed by CFA through the Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables are identified through the literature, and four factors which constitute the productivity of FMS are extracted: people, quality, machine and flexibility. SEM using AMOS 20 was used to fit the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to find the intensity/quantification of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors which affect FMS.

30 citations


Proceedings ArticleDOI
05 Mar 2014
TL;DR: The effectiveness and feasibility of the proposed algorithm are shown by considering three cases in which different levels of factors are taken and compared, estimating a more accurate release date, cost, effort and duration for the project.
Abstract: The key measures of any project's soundness are its ability to be completed in the scheduled time and within the estimated cost. Cost and effort are important factors whenever a project is developed, and their estimation is a difficult task in an Agile environment. It has been observed that current Agile methods mostly depend on expert opinion, planning poker and historical project data for estimation of cost, size, effort and duration. In the absence of historical data and experts, these methods are not efficient. So there is a need for an algorithmic method which can calculate the cost and effort of a project. In this direction, S. Bhalero and Maya Ingel [2] have considered some project-related factors to calculate the size as well as the duration of a project. However, several other project- as well as people-related factors may affect the Agile environment. In this work an algorithmic estimation method is proposed that considers various factors, thereby estimating a more accurate release date, cost, effort and duration for the project. The effectiveness and feasibility of the proposed algorithm have been shown by considering three cases in which different levels of factors are taken and compared.

29 citations


Journal ArticleDOI
31 Aug 2014
TL;DR: WPA2 is a more robust security protocol than WPA because it uses Advanced Encryption Standard (AES) encryption, but it still has weaknesses: it is vulnerable to brute-force attack, and the MIC bits could be used by a hacker to compare against the decrypted content.
Abstract: In recent years there has been huge development of wireless technology, and we are becoming increasingly dependent on it. Since wireless networks are broadcast in nature, there are various security issues in wireless communication. The security protocols designed for wired systems cannot be extrapolated to wireless systems, and hackers and intruders can make use of the loopholes in wireless communication. In this paper we study the various security threats to wireless networks and the protocols currently available, such as Wired Equivalent Privacy (WEP), Wi-Fi Protected Access (WPA) and Wi-Fi Protected Access 2 (WPA2). WPA2 is a more robust security protocol than WPA because it uses Advanced Encryption Standard (AES) encryption. There are still issues in WPA2: it is vulnerable to brute-force attack, and the MIC bits could be used by a hacker to compare against the decrypted content. So in this paper we concentrate on the different sorts of wireless security threats.

27 citations


Journal ArticleDOI
TL;DR: In this article, a systematic approach based on graph theory and the matrix method is developed for the evaluation of a reliability index for a Combined Cycle Power Plant (CCPP); the system is divided into six subsystems, and their interrelations are considered in evaluating the index.

Book
24 Jul 2014
TL;DR: A new version of the LEACH protocol called OP-LEACH is presented which aims to reduce energy consumption within the wireless sensor network; it is evaluated through extensive simulations using the OMNET++ simulator, which show that OP-LEACH performs better than the LEACH protocol.
Abstract: A Wireless Sensor Network (WSN) is a collection of small, self-contained electro-mechanical devices that monitor environmental conditions. There are many design issues for WSNs, such as deployment, mobility, infrastructure, network topology, network size and density, connectivity, lifetime, node addressability, data aggregation, etc. LEACH (Low-Energy Adaptive Clustering Hierarchy) is one of the hierarchical routing protocols designed for communication in WSNs. LEACH is a clustering-based protocol that utilizes randomized rotation of local cluster-heads to evenly distribute the energy load among the sensors in the network. LEACH uses localized coordination to enable scalability and robustness for dynamic networks, and incorporates data fusion into the routing protocol to reduce the amount of information that must be transferred to the base station. But LEACH is based on the assumption that each sensor node contains an equal amount of energy, which is not valid in real scenarios. LEACH uses a TDMA-based MAC protocol in order to maintain balanced energy consumption, and a number of these TDMA slots are wasted when the nodes have a random data distribution. A modification to the existing LEACH protocol is needed in order to use the slots corresponding to nodes that do not have data to send in their scheduled slots. This paper presents a new version of the LEACH protocol called OP-LEACH which aims to reduce energy consumption within the wireless sensor network. Both the existing LEACH and the proposed OP-LEACH are evaluated through extensive simulations using the OMNET++ simulator, which show that OP-LEACH performs better than the LEACH protocol.
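For context, LEACH's randomized cluster-head rotation (the mechanism OP-LEACH builds on) elects heads each round using the threshold T(n) = P / (1 − P·(r mod 1/P)) for nodes that have not served as head within the last 1/P rounds. The sketch below illustrates only that generic election, not OP-LEACH's slot-reuse extension; all parameter values are assumptions.

```python
import random

# Sketch of LEACH's per-round cluster-head election.
# P: desired fraction of cluster-heads; r: current round number.
def threshold(P, r, eligible):
    if not eligible:                     # served as head in the last 1/P rounds
        return 0.0
    # grows toward 1 late in each 1/P-round cycle, so every eligible
    # node eventually becomes head exactly once per cycle
    return P / (1 - P * (r % round(1 / P)))

def elect(n_nodes, P, r, last_ch_round):
    """Each node draws a random number; below-threshold nodes become heads."""
    heads = []
    for n in range(n_nodes):
        eligible = last_ch_round[n] is None or r - last_ch_round[n] >= 1 / P
        if random.random() < threshold(P, r, eligible):
            heads.append(n)
            last_ch_round[n] = r
    return heads

random.seed(1)
last = [None] * 20
heads = elect(20, 0.1, 0, last)  # roughly P * n_nodes heads expected
```

At round r = 9 of a 10-round cycle (P = 0.1), the threshold reaches 1, so every still-eligible node is forced to take a turn as head; this even rotation is what balances energy drain across nodes.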

Journal ArticleDOI
TL;DR: In this paper, an experimental study of rough and trim cutting operation in wire electrical discharge machining (WEDM) has been presented on four hard to machine materials namely WC-Co composite, HCHCr steel alloy, Nimonic-90 and Monel-400.

Proceedings ArticleDOI
01 Feb 2014
TL;DR: There are many parameters which must be considered while analyzing any social network; this analysis spans from pure mathematical modeling to graphical representations, and the implications associated with it are also discussed.
Abstract: Since the late 1970s, Social Network Analysis (SNA) techniques have evolved as one of the successful applications of the internet. There are numerous reasons that demand a better understanding of the structure of a social network, justify the need for its analysis, and motivate studying its impact on the future internet. For instance, finding shared interests and trust could be one reason to study a social network. Moreover, if future distributed online social networks are popular and bandwidth-intensive, they can have a significant impact on Internet traffic, just as current peer-to-peer content distribution networks do. Regardless of one's stance on these phenomena, a better understanding of the structure of social networks is likely to improve our understanding of the opportunities, limitations, and threats associated with these ideas. The paper presents a detailed review of the need for social network analysis and the implications associated with it. It is found that there are many parameters which must be considered while analyzing any social network, and that this analysis spans from pure mathematical modeling to graphical representations.

Proceedings ArticleDOI
03 Apr 2014
TL;DR: In this research work some importance-related and effort-related factors are considered, on the basis of which the prioritization of user-stories is done; the approach is validated with a case study of a quizzing solution that allows companies to better screen job candidates and assess their internal talent for skills development.
Abstract: In the last few years Agile methodologies have appeared as a reaction to traditional software development methodologies. In an Agile environment the requirements from the client are always taken in the form of user-stories, and prioritization of requirements is done by the MoSCoW method, validated learning and walking skeleton methods. From the literature survey it has been observed that these methods are not efficient because they do not consider the importance of user-stories to the client. In this research work some importance-related and effort-related factors are considered, on the basis of which the prioritization of user-stories is done. Further, the feasibility of the work has been validated by a case study of Enable Quiz, a lightweight technical quizzing solution for companies that hire engineers. The research work allows companies to better screen job candidates and assess their internal talent for skills development.

Proceedings ArticleDOI
17 Apr 2014
TL;DR: The proposed protocol is structured to overcome QoS problems related to the transport layer, such as packet handling, reliable packet transmission with loss recovery, and congestion control; it also makes an effort to reduce retransmission of duplicate packets.
Abstract: Wireless Body Area Network (WBAN) applications are looking forward to a better and more effective environment because of their heterogeneous and wearable nature. Recent applications of WBAN need to support both real-time traffic (sensitive to end-to-end packet delay) and non-real-time traffic (sensitive to packet loss), which further gives rise to the problem of diverse QoS requirements. So the first step of this paper is to study the QoS issues related to each layer and explain why the transport layer is most relevant to QoS. Then we inspect the limitations of the existing transport protocols for WBAN systems. Considering these limitations, we have designed a protocol which tries to handle QoS in an efficient way. As the transport layer deals with QoS significantly, the proposed protocol is structured to overcome QoS problems related to this layer, such as packet handling, reliable packet transmission with loss recovery, and congestion control. The intention of the proposed scheme is to provide an end-to-end bidirectional (both upstream and downstream) and bi-functional (both packet-based and event-based) reliability module, which also takes care of every kind of packet loss. Its intelligent packet handling method provides priority fairness to overcome the starvation problem. The proposed work also makes an effort to reduce retransmission of duplicate packets and to control congestion.

Journal ArticleDOI
28 Feb 2014
TL;DR: In this paper, the authors discuss a selective algorithm for allocation of cloud resources to end-users on an on-demand basis, which is based on the min-min and max-min algorithms.
Abstract: The modern-day continued demand for resource-hungry services and applications in the IT sector has led to the development of Cloud computing. A Cloud computing environment involves high-cost infrastructure on one hand and needs large-scale computational resources on the other. These resources need to be provisioned (allocated and scheduled) to end users in the most efficient manner so that the tremendous capabilities of the cloud are utilized effectively and efficiently. In this paper we discuss a selective algorithm for allocation of cloud resources to end-users on an on-demand basis. This algorithm is based on the min-min and max-min algorithms, two conventional task scheduling algorithms. The selective algorithm uses certain heuristics to select between the two so that the overall makespan of tasks on the machines is minimized. The tasks are scheduled on machines in either a space-shared or time-shared manner. We evaluate our provisioning heuristics using a cloud simulator called CloudSim. We also compare our approach to the statistics obtained when provisioning of resources was done in a First-Come-First-Serve (FCFS) manner. The experimental results show that the overall makespan of tasks on a given set of VMs decreases significantly in different scenarios.

Journal ArticleDOI
TL;DR: A logical approach based on graph theory and matrix method (GTMM) is developed for assessment of reliability index for a co-generation cycle power plant (CGCPP), where a higher value of index implies that the plant is available with better reliability.
Abstract: A logical approach based on graph theory and the matrix method (GTMM) is developed for assessment of a reliability index for a co-generation cycle power plant (CGCPP). For a huge and multipart system such as a CGCPP, the reliability of its components or subsystems is closely intertwined and inseparable from the effects of the others. For ease of analysis, the CGCPP system is divided into four sub-systems. Reliability of the CGCPP is modeled in terms of a reliability attributes digraph which is developed from the system reliability digraph. Nodes in the digraph represent sub-system reliability, and the reliability of interrelations is represented by the directed edges. The digraph is represented by a one-to-one matrix called the variable system reliability permanent matrix (VSRPM). A step-by-step procedure is developed for calculating the variable permanent function for reliability (VPF-r) from the VSRPM. A higher value of the index implies that the plant is available with better reliability. The developed methodology is illustrated step by step with the help of an example.
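The permanent function at the heart of VPF-r (and of the related graph-theoretic indices in the other entries above) can be computed with Ryser's formula. The sketch below is a generic implementation; the small matrix in the usage lines is an invented stand-in for a VSRPM, with diagonal entries playing the role of sub-system reliabilities and off-diagonal entries their interrelations.

```python
from itertools import combinations

# Permanent of a square matrix via Ryser's formula:
#   per(M) = (-1)^n * sum over non-empty column subsets S of
#            (-1)^|S| * prod_i (sum of row i over the columns in S)
# Unlike the determinant, all products are added (no sign flips per
# permutation), which is why it aggregates every interaction path.
def permanent(M):
    n = len(M)
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1.0
            for row in M:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# Hypothetical 2-sub-system "VSRPM": diagonal = sub-system reliabilities,
# off-diagonal = interrelation reliabilities (values are made up).
vsrpm = [[0.92, 0.10],
         [0.10, 0.88]]
print(permanent(vsrpm))  # for 2x2: a*d + b*c
```

For a 2x2 matrix the permanent is simply ad + bc, so the interrelation term adds to, rather than subtracts from, the product of the sub-system reliabilities; this additive structure is what the papers use to quantify how interdependencies raise or lower the overall index.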

Proceedings ArticleDOI
17 Apr 2014
TL;DR: A review of different social network aggregators and of the issues in integrating social networks is given; applications include exposing criminal behaviours in e-commerce, identifying computer intrusions, detecting health problems, and analysing satellite images.
Abstract: Organizing contacts and friends, sharing thoughts and emotions, and searching content and data are the basic means that a social networking site provides. By sharing and managing content, the users form a social network, for example Facebook, LinkedIn, Twitter, Orkut, etc. A user may exist on several different social networking sites, and the problem of maintaining accounts on all these networks prevails. This brings us to define social network aggregation, i.e. collecting the social content from different social networks and integrating it at a single location/site. It is an attempt to organize a user's social networking experience as a whole. This paper gives a review of different social network aggregators and the issues in integrating social networks. Applications of such analysis include exposing criminal behaviours in e-commerce, identifying computer intrusions, detecting health problems, and analysing satellite images.

Journal ArticleDOI
TL;DR: In this article, ten operational risk factors are identified through a literature review and further analysed using a weighted interpretive structural modelling (W-ISM) approach to develop a structural model of these factors, and a method of effectiveness index (EI) is used to identify the key areas.
Abstract: Risk in the context of supply chain management plays an important role, and risk management is a key area for the competitiveness and growth of industries. In this paper, ten operational risk factors have been identified through a literature review; they are further analysed using a weighted interpretive structural modelling (W-ISM) approach. The ISM approach is used to develop a structural model of these factors, and a method of effectiveness index (EI) is used to identify the key areas. The effectiveness index evaluated in this paper will help industries to benchmark their performance by managing the risks reported in this study.
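The ISM step of the W-ISM approach builds a reachability matrix, the transitive closure of the pairwise "factor i influences factor j" relation, before partitioning the factors into levels. A minimal sketch of that closure, assuming a Warshall-style pass over an illustrative 4-factor adjacency matrix (not the ten risk factors of the paper):

```python
# Transitive closure of a binary influence relation, as used in the
# ISM reachability-matrix step. adj[i][j] = 1 means factor i directly
# influences factor j; the result includes indirect influence too.
def reachability(adj):
    n = len(adj)
    # start from the direct relation plus self-reachability
    R = [[bool(adj[i][j]) or i == j for j in range(n)] for i in range(n)]
    # Warshall's algorithm: allow paths through intermediate factor k
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return [[int(x) for x in row] for row in R]

# illustrative chain: factor 0 -> 1 -> 2 -> 3
adj = [[0, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 0, 1],
       [0, 0, 0, 0]]
print(reachability(adj))
```

From the closed matrix, ISM derives each factor's reachability and antecedent sets to assign levels; W-ISM then attaches weights on top of this structure.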

Journal ArticleDOI
01 Oct 2014
TL;DR: In this article, two novel defected ground structures (DGS) are proposed to improve the return loss, compactness, gain and radiation efficiency of rectangular microstrip patch antenna.
Abstract: In this paper, two novel defected ground structures (DGS) are proposed to improve the return loss, compactness, gain and radiation efficiency of a rectangular microstrip patch antenna. The performance of the antenna is characterized by the shape, dimensions and location of the DGS at a specific position on the ground plane. By incorporating a peacock-shaped slot of optimum geometry at a suitable location on the ground plane, return loss is enhanced from -23.89 dB to -43.79 dB, radiation efficiency is improved from 97.66% to 100%, and a compactness of 9.83% is obtained over the traditional antenna. Simulation results show that the patch antenna with a star-shaped DGS can improve the impedance matching, with a better return loss of -35.053 dB from -23.89 dB, and a compactness of 9% is achieved. In the end, a comparison of both DGS shapes is carried out to choose the better optimized one. The proposed antennas are simulated and analyzed using Ansoft HFSS (version 11.1) software.

Proceedings ArticleDOI
06 Mar 2014
TL;DR: The main purpose of the proposed schema is to provide Genetic Binary Decision Tree based Packet Handling protocol, which classifies the heterogeneous traffic flow according to rule sets, which helps in improving the quality of service required for different kinds of flows.
Abstract: New generation applications related to Wireless Body Area Network (WBAN) are responsible for gathering and managing heterogeneous data for both real time and non-real time traffic. The proposed algorithm can handle heterogeneous packets by considering characteristics in terms of required bandwidth, required buffer, transmission delay, packet loss, and reliability. The main purpose of the proposed schema is to provide Genetic Binary Decision Tree based Packet Handling (GBDTPH) protocol, which classifies the heterogeneous traffic flow according to rule sets. Decision tree based data classification and prioritization helps in improving the quality of service required for different kinds of flows. The newly designed Prioritized Earliest Deadline Scheduling algorithm provides fairness to low priority packets and helps in overcoming starvation problem. Proposed packet drop module smartly drops deadline exceeded and least frequent used low priority packets.

Journal ArticleDOI
TL;DR: In this article, an R-152a ejector-jet pump refrigeration cycle and a LiBr-H2O absorption cooling cycle have been integrated with a renewable energy power generator for making a novel compact cogeneration cycle.
Abstract: An R-152a ejector-jet pump refrigeration cycle and a LiBr-H2O absorption refrigeration cycle have been integrated with a renewable energy power generator to make the proposed 'novel compact cogeneration cycle'. The exergy analysis of this proposed cycle points to a possible performance improvement. Nearly 71.12% of the input exergy is destroyed due to irreversibilities in the different components. The useful exergy output is around 7.12%. The exhaust exergy lost to the environment is 21.76%, which is lower than the exhaust energy lost, 37.6% of the input energy, while the useful energy output is approximately 19.3%. The refrigerants used and the exhaust gas emission samples are found to be favourable for reducing global environmental problems. The results also show that the coupling of the entrainment ratios of the ejector and jet pump has a great effect on the exergy and energy efficiency.

Journal ArticleDOI
TL;DR: In the present work, these barriers have been identified through the literature, their ranking is done by a questionnaire-based survey and interpretive structural modelling approach has been utilised in analysing their mutual interaction.
Abstract: Today's volatile market, which is influenced by global competition and changing customers' demands, has made manufacturing companies look for new manufacturing systems which can fulfil their requirements for global competition. Reconfigurable manufacturing systems (RMSs) are systems which are capable of meeting the requirements of modern manufacturing industries. But adoption and implementation of RMSs is not an easy task. There are certain barriers which not only influence the implementation process but also influence each other. The main objective of this paper is to identify and analyse these barriers. In the present work, these barriers have been identified through the literature, their ranking is done by a questionnaire-based survey, and the interpretive structural modelling (ISM) approach has been utilised in analysing their mutual interaction.

Journal ArticleDOI
TL;DR: A literature review of papers published in various reputed journals on the concept of Lean manufacturing is presented in this article, which reveals that the goal of Lean is the creation and maintenance of a production system which runs repetitively, day after day, week after week, in a manner identical to the previous time period.
Abstract: In India, the manufacturing sector has been striving hard to generate revenues for the last decade. This aspect of generating funds depends on two major factors, i.e. reduction in product manufacturing cost, and enhancement of product quality to satisfy customer needs. The production cost can be reduced by improving the design and by incorporating the newest materials. The improved materials will positively add value to the product quality and thereby attract customers. On the other hand, the product quality can be increased through the application of systematic and statistical data analysis tools, especially Lean, Six Sigma, SCM, JIT, TPM, etc., in the manufacturing sector. This may further result in high labour efficiency. The paper comprises a literature review of papers published in various reputed journals on the concept of Lean manufacturing. "Lean" is a production practice which scrutinizes the expenditure of all resources against the goal of creating value for the end customer, and eliminates the waste. The paper reveals that the goal of Lean is the creation and maintenance of a production system which runs repetitively, day after day, week after week, in a manner identical to the previous time period. Continuous and smooth flow, earlier delivery, reduced cost and better design are the outcomes of such systems.

Journal ArticleDOI
TL;DR: The work presented in this paper provides alternate methods for fatigue life assessment of leaf springs and the method which provides fatigue life closer to experimental value and consumes less time is suggested.
Abstract: The experimental fatigue life prediction of leaf springs is a time-consuming process, and engineers working in the field always face the challenge of formulating alternate methods of fatigue life assessment. The work presented in this paper provides such alternate methods. A 65Si7 light commercial vehicle leaf spring is chosen for this study. The experimental fatigue life and load rate are determined on a full-scale leaf spring testing machine. Four alternate methods of fatigue life assessment are depicted. First, by the SAE spring design manual approach, the fatigue test stroke is established and the fatigue life is predicted from the intersection of the maximum and initial stress. The second method constitutes a graphical method based on the modified Goodman criteria. In the third method, codes are written in FORTRAN for fatigue life assessment based on an analytical technique. The fourth method uses computer-aided engineering tools: the CAD model of the leaf spring has been prepared in SolidWorks and analyzed using ANSYS, and the ideal type of contact and meshing elements are proposed. The method which provides a fatigue life closer to the experimental value and consumes less time is suggested.

Journal ArticleDOI
TL;DR: It is observed that the relative rankings of the alternative cutting fluid as obtained using PSI method match quite well with those as derived by the past researchers.
Abstract: This paper presents a simple and systematic multi-criteria decision making methodology for selecting a cutting fluid for a given application using the preference selection index (PSI) method. In this methodology, a cutting fluid is selected for a given machining application without assigning relative importance weights to the selection attributes. Two real-world examples are cited from the literature to demonstrate and validate the applicability and potential of the PSI method in solving the cutting-fluid selection problem. It is observed that the relative rankings of the alternative cutting fluids obtained using the PSI method match quite well with those derived by past researchers.
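The PSI procedure is short enough to sketch in full: normalise the decision matrix, measure how much each attribute varies across alternatives, derive attribute weights from that variation (rather than from expert judgement), and score each alternative. The decision matrix in the test is hypothetical, not taken from the paper's cited examples:

```python
def psi_rank(matrix, beneficial):
    """Preference Selection Index: higher score = better alternative.
    matrix[i][j] is the value of attribute j for alternative i;
    beneficial[j] is True if larger values of attribute j are better."""
    m, n = len(matrix), len(matrix[0])
    # Step 1: normalise (beneficial: x / x_max, non-beneficial: x_min / x).
    norm = [[0.0] * n for _ in range(m)]
    for j in range(n):
        col = [row[j] for row in matrix]
        for i in range(m):
            norm[i][j] = (matrix[i][j] / max(col) if beneficial[j]
                          else min(col) / matrix[i][j])
    # Step 2: preference variation of each attribute around its mean.
    pv = []
    for j in range(n):
        mean = sum(norm[i][j] for i in range(m)) / m
        pv.append(sum((norm[i][j] - mean) ** 2 for i in range(m)))
    # Step 3: overall preference value (weight) of each attribute.
    phi = [1.0 - v for v in pv]
    weights = [p / sum(phi) for p in phi]
    # Step 4: preference selection index of each alternative.
    return [sum(weights[j] * norm[i][j] for j in range(n)) for i in range(m)]
```

Because the weights emerge from the data in step 3, no pairwise importance judgements are needed, which is the feature the abstract highlights.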

Proceedings ArticleDOI
09 May 2014
TL;DR: An Earliest Deadline based flexible dynamic scheduling algorithm is designed, which has been proven to be an optimal prioritized scheduling for problem like starvation and is evaluated in terms of binary decision tree and genetic algorithm.
Abstract: A Wireless Body Area Network (WBAN) is a special kind of autonomous sensor network evolved to provide a wide variety of services. Nowadays WBAN has become an integral component of healthcare management systems in which a patient needs to be monitored both inside and outside the home or hospital. These applications are responsible for gathering and managing heterogeneous data comprising both real-time and non-real-time traffic. Heterogeneous traffic classification plays an important role in various applications of WBAN. Due to the ineffectiveness of traditional port-based and payload-based methods, recent works have proposed machine learning methods that classify flows based on statistical characteristics. In this paper, we evaluate the effectiveness of combining machine learning techniques, in the form of a binary decision tree and a genetic algorithm, for rule-based classification of heterogeneous traffic flows. We have also designed an Earliest Deadline based flexible dynamic scheduling algorithm, which has been proven to provide optimal prioritized scheduling while avoiding problems such as starvation.
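The abstract does not give the details of its flexible dynamic scheduler, but the earliest-deadline-first core it builds on can be sketched as a minimal simulation. Packets (whether real-time or non-real-time) are released into a ready heap as simulated time advances, and the packet with the smallest absolute deadline is always served next; since every waiting deadline eventually becomes the earliest, no traffic class starves:

```python
import heapq

def edf_schedule(packets):
    """Serve packets in earliest-absolute-deadline order.
    Each packet is (arrival_time, deadline, packet_id); service takes
    one time unit. Returns the order in which packets are served."""
    pending = sorted(packets)              # sorted by arrival time
    ready, order, clock, k = [], [], 0, 0
    while ready or k < len(pending):
        # Release everything that has arrived by the current clock.
        while k < len(pending) and pending[k][0] <= clock:
            arrival, deadline, pid = pending[k]
            heapq.heappush(ready, (deadline, arrival, pid))
            k += 1
        if not ready:                      # idle until the next arrival
            clock = pending[k][0]
            continue
        deadline, arrival, pid = heapq.heappop(ready)
        order.append(pid)
        clock += 1                         # unit service time
    return order
```

This is a generic EDF sketch, not the paper's algorithm; the authors' "flexible" variant presumably layers traffic-class rules (from the decision tree/genetic classifier) on top of this deadline ordering.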

Journal ArticleDOI
TL;DR: In this paper, a multiple attribute decision making methodology is structured to evaluate the most appropriate flexibility in the manufacturing sector, one of the strategic issues that may affect the flexible manufacturing system (FMS).
Abstract: The evaluation of the most appropriate flexibility in the manufacturing sector is one of the strategic issues that may affect a flexible manufacturing system (FMS). In this paper, a multiple attribute decision making methodology is structured to resolve this problem. Two decision making methods, the analytic hierarchy process (AHP) and the compromise solution method known as the VIsekriterijumsko KOmpromisno Rangiranje (VIKOR) method, are integrated in order to make the best use of the available information. AHP is used to derive the weights of the attributes, and the VIKOR method is used to rank the flexibilities of the FMS. Furthermore, the methodology uses fuzzy logic to convert qualitative attributes into quantitative ones. Fifteen factors are taken for the evaluation of 15 flexibilities. The paper concludes that, by this methodology, production flexibility has the most impact and programme flexibility the least impact among the 15 flexibilities.
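The VIKOR ranking step can be sketched compactly: each alternative gets a group-utility score S (weighted sum of normalised distances from the best values), an individual-regret score R (the worst single distance), and a compromise index Q blending the two; the lowest Q wins. The weights here would come from AHP in the paper's methodology, but the matrix and weights below are hypothetical:

```python
def vikor(matrix, weights, beneficial, v=0.5):
    """VIKOR compromise ranking: lower Q = better alternative.
    v balances group utility (S) against individual regret (R).
    Assumes each attribute has distinct best/worst values and that
    the S and R scores are not all identical."""
    m, n = len(matrix), len(matrix[0])
    best = [max(r[j] for r in matrix) if beneficial[j]
            else min(r[j] for r in matrix) for j in range(n)]
    worst = [min(r[j] for r in matrix) if beneficial[j]
             else max(r[j] for r in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        # Weighted normalised distance of this alternative from the ideal.
        terms = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
                 for j in range(n)]
        S.append(sum(terms))
        R.append(max(terms))
    s_lo, s_hi, r_lo, r_hi = min(S), max(S), min(R), max(R)
    return [v * (S[i] - s_lo) / (s_hi - s_lo)
            + (1 - v) * (R[i] - r_lo) / (r_hi - r_lo) for i in range(m)]
```

With v = 0.5 the consensus between "majority of criteria" and "worst criterion" is weighted evenly, which is the conventional default.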

Proceedings ArticleDOI
01 Nov 2014
TL;DR: A hierarchical test case prioritization technique is proposed wherein various factors that affect error propagation in the inheritance hierarchy have been considered.
Abstract: Software reuse is the use of existing artifacts to create new software, and inheritance is the foremost technique of reuse. However, the inherent complexity of the inheritance hierarchies found in the object-oriented paradigm also affects testing. Every time the software changes, new test cases are added to the existing test suite, so there is a need for effective regression testing with fewer test cases in order to reduce cost and time. In this paper a hierarchical test case prioritization technique is proposed wherein various factors that affect error propagation in the inheritance hierarchy are considered. Prioritization takes place at two levels: in the first level the classes are prioritized, and in the second level the test cases of the prioritized classes are ordered. To show the effectiveness of the proposed technique, it was applied to and analysed on a C++ program.
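The two-level scheme described above can be sketched as follows. The paper does not list its exact factors or weights, so the scoring function below (inheritance depth, number of derived classes, count of changed methods, with made-up weights) and the within-class ordering by historical fault detection are purely illustrative assumptions:

```python
def prioritize(classes, tests):
    """Two-level regression test prioritization.
    classes: maps class name -> factor scores (hypothetical factors:
             'depth', 'children', 'changes').
    tests:   maps class name -> list of (test_id, faults_detected).
    Level 1 ranks classes by an error-propagation score; level 2 orders
    each class's test cases by past fault-detection ability."""
    def class_score(f):
        # Hypothetical weighting of error-propagation factors.
        return 0.4 * f["depth"] + 0.3 * f["children"] + 0.3 * f["changes"]

    ordered = []
    for name in sorted(classes, key=lambda c: class_score(classes[c]),
                       reverse=True):
        for tid, faults in sorted(tests.get(name, []), key=lambda t: -t[1]):
            ordered.append(tid)
    return ordered
```

Running the full suite still executes every test; the payoff of the ordering is that, under a time budget, the tests most likely to expose propagated errors run first.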