Showing papers by "YMCA University of Science and Technology published in 2013"


Journal ArticleDOI
TL;DR: An ISM model is developed to identify key enablers of TPM implementation and their managerial implications; the enablers are ranked through a questionnaire-based survey, and the interpretive structural modelling approach is used to analyse their mutual interaction.
Abstract: Total productive maintenance (TPM) is increasingly implemented by many organizations to improve equipment efficiency and to gain competitive advantage in the global market in terms of cost and quality. However, implementing TPM is not an easy task. Certain enablers help in the implementation of TPM, and it is essential to analyse their behaviour so that they can be used effectively. The main objective of this paper is to understand the mutual interaction of these enablers and to identify the ‘driving enablers’ (i.e. those which influence the other enablers) and the ‘dependent enablers’ (i.e. those which are influenced by others). In the present work, these enablers have been identified through the literature, ranked by a questionnaire-based survey, and analysed for mutual interaction using the interpretive structural modelling (ISM) approach. An ISM model has been prepared to identify some key enablers and their managerial implications in the implementation of TPM.
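For readers unfamiliar with the ISM step summarized above, the sketch below shows how driving and dependence power are typically computed from a final reachability matrix. The enabler labels and matrix entries are invented for illustration and are not the paper's data.

```python
import numpy as np

# Illustrative final reachability matrix for 4 hypothetical TPM enablers
# (entry [i][j] = 1 means enabler i influences, i.e. reaches, enabler j).
R = np.array([
    [1, 1, 1, 1],   # e.g. top management commitment
    [0, 1, 1, 1],   # e.g. training
    [0, 0, 1, 1],   # e.g. employee involvement
    [0, 0, 0, 1],   # e.g. equipment effectiveness
])

driving = R.sum(axis=1)     # row sums: how many enablers each one influences
dependence = R.sum(axis=0)  # column sums: by how many enablers each is influenced

for i, (d, dep) in enumerate(zip(driving, dependence), start=1):
    print(f"enabler {i}: driving power = {d}, dependence power = {dep}")
# Enablers with high driving and low dependence power sit at the base of
# the ISM hierarchy; these are the 'driving enablers' the abstract refers to.
```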

99 citations


Journal ArticleDOI
TL;DR: A questionnaire-based survey and the interpretive structural modelling approach are used to model and analyse key barriers and derive managerial insights.
Abstract: In a highly competitive environment, to be successful and to achieve world-class manufacturing, organizations must possess both efficient maintenance and effective manufacturing strategies. A strategic approach to improving the performance of maintenance activities is to effectively adapt and implement strategic TPM initiatives in manufacturing organizations. Total productive maintenance (TPM) is not easy to adopt and implement, due to the presence of many barriers. The purpose of this paper is to identify and analyse these barriers. A questionnaire-based survey was conducted to rank them. The results of this survey and the interpretive structural modelling approach have been used to model and analyse key barriers and derive managerial insights.

77 citations


Journal ArticleDOI
TL;DR: In this paper, two vermicomposting units containing cow dung (CD) and biogas plant slurry (BPS) were established, inoculated with the earthworm Eisenia fetida, and left to vermicompost for three months, after which the vermicompost was harvested and characterized.
Abstract: Vermicomposting is a biological process which may be a future technology for the management of animal excreta. This study was undertaken to produce vermicompost from cow dung and biogas plant slurry under field conditions. To achieve the objectives, two vermicomposting units containing cow dung (CD) and biogas plant slurry (BPS) were established, inoculated with the earthworm Eisenia fetida, and left to vermicompost for 3 months. After 3 months, the vermicompost was harvested and characterized. The results showed that the vermicompost had lower pH, total organic carbon (TOC), organic matter (OM) and carbon/nitrogen (C/N) ratio, but higher electrical conductivity (EC) and nitrogen, phosphorus and potassium (NPK) content than the raw substrate. The heavy metal content in the vermicomposts was higher than in the raw substrates. During vermicomposting, the CD and BPS were converted into a homogeneous, odourless and stabilized humus-like material. This experiment demonstrates that vermicomposting is an environmentally sustainable method for the management of animal excreta.

74 citations


Journal ArticleDOI
TL;DR: A multiple attribute decision making method is structured to solve the problem; applying the combined method to the factors that affect flexibility, it is concluded that production flexibility has the most impact and programme flexibility the least impact in FMS.
Abstract: A flexible manufacturing system (FMS) is so called because flexibility is built into the manufacturing system, but an FMS incorporates several different types of flexibility. Which type of flexibility has more impact and which has less impact in FMS is therefore decided by a combined multiple attribute decision making method comprising the analytic hierarchy process (AHP), the technique for order preference by similarity to ideal solution (TOPSIS), and the improved preference ranking organization method for enrichment evaluations (PROMETHEE). The criteria weights are calculated using AHP. Furthermore, the method uses fuzzy logic to convert the qualitative attributes into quantitative attributes. In this paper, a multiple attribute decision making method is structured to solve this problem; based on the factors that affect flexibility in FMS, it is concluded that production flexibility has the most impact and programme flexibility the least.
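As a rough illustration of the AHP weighting step mentioned above, criteria weights can be derived from the principal eigenvector of a pairwise comparison matrix. The comparison values below are invented, not the paper's data.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for 3 criteria (Saaty's 1-9 scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)            # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized criteria weights

n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)   # consistency index of the judgements
print("weights:", w.round(3), " CI:", round(CI, 4))
```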

32 citations


Journal ArticleDOI
TL;DR: The proposed methodology is able to accommodate the imprecise and inexact data involved in ranking software engineering metrics and the vagueness and ambiguity occurring during expert (human) decision making, and it avoids the complexity of formulating the objective and constraint functions.
Abstract: This research paper presents a framework for ranking software engineering metrics based on expert opinion elicitation and a fuzzy-based matrix methodology. The proposed methodology is able to accommodate the imprecise and inexact data involved in ranking software engineering metrics and the vagueness and ambiguity occurring during expert (human) decision making, and it avoids the complexity of formulating the objective and constraint functions. The matrices lend themselves to mechanical manipulation and are useful for analyzing and deriving system functions expeditiously to meet the objectives. The current research is based on software engineering metrics identified in an earlier study conducted by Lawrence Livermore National Laboratory. A set of ranking criteria was identified. Software engineering metrics are then ranked in ascending order using experts' opinion, in accordance with the value of the permanent function of their criteria matrix. The proposed methodology has also been compared with other known methodologies.
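The permanent function mentioned here is the matrix permanent: like a determinant, but with every expansion term added rather than alternating in sign. A minimal sketch of its computation on a made-up criteria matrix:

```python
def permanent(M):
    """Matrix permanent by expansion along the first row
    (all cofactor terms are added, unlike the determinant)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    total = 0
    for j in range(n):
        minor = [row[:j] + row[j + 1:] for row in M[1:]]
        total += M[0][j] * permanent(minor)
    return total

# Illustrative 3x3 criteria matrix (values are made up).
C = [[0.5, 0.2, 0.3],
     [0.4, 0.6, 0.1],
     [0.3, 0.2, 0.7]]
print(permanent(C))   # single scalar value used to rank alternatives
```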

29 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the effect of polarity and other SAW parameters on the heat affected zone size and dilution and established their correlations by using statistical techniques and empirical models.
Abstract: Submerged arc welding (SAW) is a fusion joining process known for its high deposition capability. The process is useful in joining the thick-section components used in various industries. Besides joining, SAW can also be used for surfacing applications. The heat affected zone (HAZ) produced within the base metal by the intense heat of the arc is of major concern, as metallurgical changes in the affected region degrade the performance of the welded/surfaced structure in service. This work was carried out to investigate the effect of polarity and other SAW parameters on HAZ size and dilution and to establish their correlations. The influence of heat input on dilution and the heat affected zone was then examined. Four levels of heat input were used to study their effect on % dilution and HAZ area at both electrode positive and electrode negative polarities. Proper management of heat input in welding is important, because power sources can be used more efficiently if one knows how the same heat input can be applied to obtain better results. Empirical models have been developed using statistical techniques.
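For context, the heat input that the abstract varies across four levels is conventionally computed in welding practice (a standard relation, not quoted from the paper) as

$$Q = \eta \,\frac{60\,V I}{1000\,S}\ \ \text{kJ/mm},$$

where $V$ is the arc voltage (V), $I$ the welding current (A), $S$ the travel speed (mm/min), and $\eta$ the arc efficiency factor, which is close to unity for SAW.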

27 citations


Journal ArticleDOI
TL;DR: In this paper, EFA is applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors are confirmed by CFA through the Analysis of Moment Structures (AMOS 18) software.
Abstract: The purpose of this paper is to investigate the factors, among the variables of the flexible manufacturing system (FMS), which affect flexibility in FMS. The study was performed by conducting a cross-sectional survey within manufacturing firms in India using exploratory factor analysis (EFA) and confirmatory factor analysis (CFA). EFA identifies the factor structure, whereas CFA verifies the factor structure of a set of observed variables. CFA is carried out with the structural equation modeling (SEM) statistical technique. In this paper, EFA is applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors are confirmed by CFA through the Analysis of Moment Structures (AMOS 18) software. Fifteen variables are identified through the literature, and four factors affecting the flexibility of FMS are extracted: production flexibility, machine flexibility, product flexibility, and volume flexibility. SEM using AMOS 18.0 was then applied to fit the first-order four-factor structure of FMS flexibility.
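A toy illustration of the EFA extraction step described above, using synthetic data in place of the paper's survey responses; factors are extracted from the eigenvalues of the correlation matrix and retained by the common Kaiser (eigenvalue > 1) rule:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))        # stand-in for 200 responses on 15 variables
X[:, 1] += 0.8 * X[:, 0]              # inject correlation so a factor emerges

R = np.corrcoef(X, rowvar=False)      # correlation matrix of the 15 variables
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]     # sort components by explained variance
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = int((eigvals > 1.0).sum())        # Kaiser criterion: keep eigenvalues > 1
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])
print(f"{k} factors retained; loading matrix shape: {loadings.shape}")
```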

26 citations


Proceedings ArticleDOI
09 Mar 2013
TL;DR: This paper proposes a ranking scheme for semantic web documents based on the semantic similarity between the documents and the query specified by the user, and finds that this semantic similarity based ranking scheme gives much better results than the prevailing methods.
Abstract: In recent years, semantic search for relevant documents on the web has been an important topic of research. Many semantic web search engines, such as Ontolook and Swoogle, have been developed to help search meaningful documents on the semantic web. The concept of semantic similarity has been widely used in many fields, such as artificial intelligence, cognitive science, natural language processing, and psychology. To relate entities/texts/documents having the same meaning, a semantic similarity approach is used, based on matching keywords which are extracted from the documents using syntactic parsing. The simple lexical matching usually employed by semantic search engines does not retrieve web documents that meet user expectations. In this paper we propose a ranking scheme for semantic web documents based on the semantic similarity between the documents and the query specified by the user. The novel approach proposed in this paper relies not only on the syntactic structure of the document but also on the semantic structure of the document and the query. The approach includes lexical as well as conceptual matching. The combined use of conceptual, linguistic and ontology-based matching significantly improves the performance of the proposed ranking scheme. We explore all relevant relations between the keywords, capturing the user's intention, and then calculate the fraction of these relations on each web page to determine its relevance with respect to the query provided by the user. We have found that this semantic similarity based ranking scheme gives much better results than the prevailing methods.
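A stripped-down sketch of the query-document ranking idea: the purely lexical cosine baseline below is the starting point the abstract criticizes, on top of which the paper's conceptual and ontology-based matching would be layered. The documents and query are invented.

```python
import math
from collections import Counter

def cosine_sim(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words vectors of two texts."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (na * nb) if na and nb else 0.0

docs = ["semantic web search ranking",
        "cooking pasta recipes",
        "ontology based web search"]
query = "semantic search on the web"
ranked = sorted(docs, key=lambda d: cosine_sim(query, d), reverse=True)
print(ranked)   # documents ordered by similarity to the query
```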

21 citations


Proceedings ArticleDOI
13 May 2013
TL;DR: To prevent malicious node attacks, this paper presents the PPN (Prime Product Number) scheme for the detection and removal of malicious nodes.
Abstract: A mobile ad hoc network is an autonomous network consisting of nodes that communicate with each other over wireless channels. Due to their dynamic nature and the mobility of nodes, mobile ad hoc networks are more vulnerable to security attacks than conventional wired and wireless networks. AODV is one of the principal routing protocols used in MANETs. The security of the AODV protocol is threatened by the malicious node attack: a malicious node injects a faked route reply claiming to have the shortest and freshest route to the destination, but when the data packets arrive, the malicious node discards them. To prevent this attack, this paper presents the PPN (Prime Product Number) scheme for the detection and removal of malicious nodes.
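The abstract does not spell out the PPN mechanics; a plausible reading (our assumption, not the paper's specification) is that each node is assigned a unique prime and a route is summarized by the product of its nodes' primes, so a claimed route can be checked by divisibility:

```python
# Assumed illustration of a prime-product check. The paper's actual PPN
# protocol (message formats, prime assignment, removal step) is not given here.
PRIMES = {"A": 2, "B": 3, "C": 5, "D": 7, "E": 11}   # hypothetical node -> prime

def route_product(route):
    """Summarize a route as the product of its nodes' primes."""
    prod = 1
    for node in route:
        prod *= PRIMES[node]
    return prod

def consistent_with(product: int, node: str) -> bool:
    """A route-reply product can contain `node` only if its prime divides it."""
    return product % PRIMES[node] == 0

advertised = route_product(["A", "C", "E"])   # honest route reply
print(consistent_with(advertised, "C"))       # True
print(consistent_with(advertised, "B"))       # False: B is not on the route
```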

19 citations


Journal ArticleDOI
TL;DR: In this article, the authors presented a methodology based on the graph theoretic approach (GTA) for designing a new cogeneration cycle power plant (CGCPP), improving an existing plant, and comparing two real-life operating CGCPPs.

17 citations


Proceedings ArticleDOI
09 Mar 2013
TL;DR: An algorithmic estimation method is proposed that considers various factors, thereby estimating a more accurate release date, cost, effort and duration for the project, specifically for Scrum; its effectiveness and feasibility are demonstrated.
Abstract: In the last few years, Agile methodologies have appeared as a reaction to traditional software development methodologies. Scrum is the most widely used Agile method. Most software development organizations use Scrum these days, but they face problems related to the estimation of cost and effort. It has been observed that current estimation methods in Scrum mostly rely on historical data from past projects and expert opinion, but in the absence of historical data and experts these methods are not effective. So there is a need for an algorithmic method that can calculate the cost and effort of a software project. In this direction, S. Bhalerao and M. Ingle [2] considered some project-related factors for calculating the cost as well as the effort of a project in a general Agile environment. However, several other project- as well as people-related factors may affect the Scrum environment. In this work, an algorithmic estimation method is proposed that considers these factors, thereby estimating a more accurate release date, cost, effort and duration, specifically for Scrum. The effectiveness and feasibility of the proposed algorithm are shown by considering three cases in which different levels of the factors are taken and compared.
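A toy sketch in the spirit of such algorithmic estimation; the factor names, weights and formula below are invented for illustration and differ from the paper's actual model:

```python
# Hypothetical algorithmic Scrum estimate: sprints from story points and a
# velocity adjusted by project/people factors. All numbers are illustrative.
story_points = 120
base_velocity = 20.0      # points per sprint for a nominal team
sprint_days = 10

# Example adjustment factors (>1 slows the team down, <1 speeds it up).
factors = {"team_experience": 0.9,
           "tool_support": 0.95,
           "requirement_volatility": 1.2}

adjusted_velocity = base_velocity
for f in factors.values():
    adjusted_velocity /= f          # slowdown factors shrink effective velocity

sprints = story_points / adjusted_velocity
print(f"estimated sprints: {sprints:.1f}, "
      f"duration: {sprints * sprint_days:.0f} working days")
```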

Proceedings ArticleDOI
06 Apr 2013
TL;DR: It is found that by properly placing the UPFC, the loadability margin of the system is increased considerably, leading to improved voltage stability, and the stability index value decreases at each reactive load when the device is inserted at the right place.
Abstract: Proper placement of FACTS devices is very important for rapid and successful operation because of their high cost and circuit complexity. In this paper, the best location of a UPFC (Unified Power Flow Controller) is obtained for both static and transient voltage stability enhancement of an IEEE 14-bus power system. The simulation is done in PSAT (Power System Analysis Toolbox) in MATLAB, and the optimal location is found using Continuation Power Flow (CPF) and a line stability index. The bus having the lowest voltage is the critical bus, and the line having the largest value of the index at maximum permissible load with respect to a bus is the most critical line referred to that bus. It is found that by properly placing the UPFC, the loadability margin of the system is increased considerably, leading to improved voltage stability, and the stability index value decreases at each reactive load when the device is inserted at the right place. Transient stability analysis is also carried out for the IEEE 14-bus system with a fault created at a bus. Time domain simulation shows that proper placement of the UPFC improves the transient performance of the system by damping out power oscillations under large disturbance conditions.
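The paper's exact index is not reproduced in the abstract, but a commonly used line stability index of this type from the voltage stability literature (quoted for context, not from this paper) is

$$L_{mn} = \frac{4\,X\,Q_r}{\left[V_s \sin(\theta - \delta)\right]^2} \le 1,$$

where $X$ is the line reactance, $Q_r$ the receiving-end reactive power, $V_s$ the sending-end voltage, $\theta$ the line impedance angle, and $\delta$ the angle between the bus voltages; lines whose index approaches 1 at maximum permissible load are the critical ones.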

Journal ArticleDOI
TL;DR: This is the first study to use a combined approach of ANP and GTMA leading to a single numerical index of effectiveness for a manufacturing system, which will help managers to benchmark the effectiveness of their manufacturing system against their peers.
Abstract: Evaluating the effectiveness of a manufacturing system is increasingly recognized as a tool for gaining competitive success. Today, many new manufacturing technologies are coming onto the market. To build the confidence of managers in adopting these new technologies, measurement of their effectiveness is a must. So, developing a model for measuring the effectiveness of a manufacturing system is significant from a strategic management point of view. Manufacturing effectiveness factors from the literature and an expert questionnaire were utilized prior to building the effectiveness measurement model. To prioritize these, we used the well-known multi-attribute decision making (MADM) technique Analytic Network Process (ANP). ANP allows interdependencies and feedback within and between clusters of factors, and it is the generalized form of AHP. A group of experts was consulted to establish interrelations and to provide weightage for pairwise comparison. The outcome of the ANP is a weighted comparison of the factors. A Manufacturing System Effectiveness Index (MSEI) is also calculated using the robust MADM technique Graph Theoretic and Matrix Approach (GTMA). This index is a single numerical value and will help managers to benchmark the effectiveness of their manufacturing system against their peers. A case study in three organisations is performed to demonstrate and validate the use of GTMA for calculating the MSEI. To the authors’ knowledge, this is the first study to use a combined approach of ANP and GTMA leading to a single numerical index of effectiveness for a manufacturing system.

Journal ArticleDOI
TL;DR: This work presents algorithms that handle almost all the constructs of procedural programming languages; the technique has been applied to C programs and is currently being tested on a financial enterprise resource planning system.
Abstract: Test data generation is the soul of automated testing. The dream of having efficient and robust automated testing software can be fulfilled only if a robust automated test data generator can be designed. In this work we explore the gaps in the existing techniques and intend to fill these gaps by proposing new algorithms. The work presents algorithms that handle almost all the constructs of procedural programming languages. The proposed technique uses cellular automata as its base, which brings a blend of artificial life to the work. The work is a continuation of our earlier attempt to amalgamate cellular automata based algorithms to generate test data. The technique has been applied to C programs and is currently being tested on a financial enterprise resource planning system. Since the solution to most problems can be found by observing nature, we explore artificial nature to accomplish the above task.
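For readers unfamiliar with the underlying machinery, the sketch below runs a minimal elementary cellular automaton (Rule 30) and derives integer test inputs from its rows; it illustrates the CA idea only, not the paper's specific algorithms:

```python
def rule30_step(cells):
    """One step of the Rule 30 elementary cellular automaton (periodic bounds)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n]) for i in range(n)]

def ca_test_values(width=16, steps=10, lo=0, hi=100):
    """Derive integer test inputs in [lo, hi] from successive CA rows."""
    cells = [0] * width
    cells[width // 2] = 1                     # single seed cell
    values = []
    for _ in range(steps):
        cells = rule30_step(cells)
        x = int("".join(map(str, cells)), 2)  # interpret the row bits as an integer
        values.append(lo + x % (hi - lo + 1))
    return values

print(ca_test_values())   # pseudo-random yet reproducible test inputs
```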

Journal ArticleDOI
TL;DR: The work presented intends to automate the process of Test Data Generation with a goal of attaining maximum coverage and is a part of a larger system being developed, which takes into account both black box and white box testing.
Abstract: Manual test data generation is an expensive, error-prone and tedious task. Therefore, there is an immediate need to make the automation of this process as efficient and effective as possible. The work presented intends to automate the process of test data generation with the goal of attaining maximum coverage. A cellular automata system is discrete in space and time. Cellular automata have been applied to problems such as designing water distribution systems and studying patterns of migration. This fascinating technique has been amalgamated with standard test data generation techniques to give rise to a technique which generates better test cases than the existing ones. The approach has been verified on programs selected in accordance with their lines of code and utility, and the results obtained have been validated. The proposed work is part of a larger system being developed, which takes into account both black box and white box testing.

Book ChapterDOI
TL;DR: This paper proposes a rule-based approach for the extraction of link-context from the anchor-text (AT) structure using a bottom-up simple LR (SLR) parser, and shows that the proposed LCEA extracted 100% of the actual link-context of each considered AT.
Abstract: Researchers have widely explored the use of link-context to determine the theme of a target web-page. Link-context has been applied in areas such as search engines, focused crawlers, and automatic classification. Therefore, extraction of precise link-context may be considered an important step towards extracting more relevant information from a web-page. In this paper, we propose a rule-based approach for the extraction of link-context from the anchor-text (AT) structure using a bottom-up simple LR (SLR) parser. Here, we consider only named entity (NE) anchor-texts. To validate the proposed approach, we considered a sample of 4 ATs. The results show that the proposed LCEA extracted 100% of the actual link-context of each considered AT.

Proceedings ArticleDOI
06 Apr 2013
TL;DR: Performance evaluation of the proposed work showed that this new approach for accessing Deep-Web using Ontologies has promising results.
Abstract: The Deep Web is content hidden behind HTML forms. Since it represents a large portion of the structured, unstructured and dynamic data on the Web, accessing Deep-Web content has been a long-standing challenge for the database community. This paper describes a crawler for accessing the Deep Web using ontologies. Performance evaluation of the proposed work showed that this new approach has promising results.

Journal ArticleDOI
TL;DR: In this article, a model for the evaluation of AMT investments is developed using the fuzzy graph theoretic approach (FGTA), which quantifies the intangible factors and, based upon these factors, gives a single numerical index that is useful for managers in evaluating the effectiveness of AMT.
Abstract: Globalisation has increased opportunities for manufacturers, as it has enlarged the customer base, but at the same time it has brought competition. Customers are enjoying a variety of products at minimum cost. Because of this, manufacturers are striving to improve their flexibility, product quality, delivery time, etc., and are adopting advanced manufacturing technologies (AMT) to meet these requirements. Adoption of AMT is a colossal investment, and the decision should be taken with proper evaluation. The literature shows that traditional financial methods for evaluating the effectiveness of AMT are not enough, as AMT also enhances many intangible factors such as flexibility, quality and employee satisfaction. Therefore, in this paper an endeavour has been made to develop a model for the evaluation of AMT investments using the fuzzy graph theoretic approach (FGTA). FGTA quantifies the intangible factors and, based upon these factors, gives a single numerical index which is useful for managers in evaluating the effectiveness of AMT.

Proceedings ArticleDOI
09 Mar 2013
TL;DR: This work proposes a multi-agent hybrid protocol exploiting the benefits of both value and decision fusion by performing aggregation at the source level in a clustered WSN.
Abstract: Data fusion deals with collaborative in-network processing and gathers relatively accurate information about events in the environment. Conventional data fusion algorithms, when assisted by mobile agent technology, shift computationally intensive tasks to these intelligent units, thereby increasing the lifetime of the network. There exist mobile agent based event-driven protocols for accumulating and forwarding information to the sink (base station), e.g., Tree-based Itinerary Design [4]. Usually, such protocols deploy either value-based fusion or decision-based fusion, but very few use both at the same time. Moreover, the use of multi-agent systems (MAS) in such protocols is still in its infancy. The focus of this work is thus to propose a multi-agent hybrid protocol exploiting the benefits of both value and decision fusion by performing aggregation at the source level in a clustered WSN.
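To make the value-versus-decision distinction concrete, here is a minimal illustration (the readings and the 30-unit threshold are invented): value fusion combines raw readings before deciding, while decision fusion lets each node decide locally and then takes a majority vote.

```python
readings = [28.1, 31.5, 30.2, 29.8, 33.0]   # raw values at cluster members
THRESHOLD = 30.0                            # hypothetical event threshold

# Value fusion: aggregate the raw values first, decide once at the fusion node.
value_fused = sum(readings) / len(readings) > THRESHOLD

# Decision fusion: each node decides locally, then a majority vote is taken.
local_decisions = [r > THRESHOLD for r in readings]
decision_fused = sum(local_decisions) > len(local_decisions) / 2

print(f"value fusion: {value_fused}, decision fusion: {decision_fused}")
```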

Journal ArticleDOI
TL;DR: A mathematical model using the graph theory and matrix method is developed to evaluate the performance of a gas-based CCPP, to help improve performance, design, maintenance planning, and the selection of new power generation systems.
Abstract: The performance of a combined cycle power plant (CCPP) and its cost of electricity generation per unit are functions of its basic structure (i.e., layout and design), availability (maintenance aspects), efficiency (trained manpower and technically advanced equipment), cost of equipment and maintenance, pollutant emissions and other regulatory aspects. Understanding its structure helps in improving performance, design, maintenance planning, and the selection of new power generation systems. A mathematical model using the graph theory and matrix method is developed to evaluate the performance of a gas-based CCPP. In the graph theoretic model, a directed graph or digraph is used to represent abstract information about the system using directed edges, which is useful for visual analysis. The matrix model developed from the digraph is useful for computer processing. A detailed methodology for developing the system structure graph, the various system structure matrices, and their permanent functions is described for the combined cycle power plant. A top-down approach for the complete analysis of a CCPP is given.

Journal ArticleDOI
TL;DR: In this paper, the analysis of different control techniques for Z-source inverters is presented, and the voltage gain versus modulation index obtained from simulation is compared with the mathematically calculated voltage gain.
Abstract: This paper presents an analysis of different control techniques for Z-source inverters. The voltage gain versus modulation index obtained from simulation is compared with the mathematically calculated voltage gain. Further, a detailed analysis of the %THD and %harmonics of the output voltage at different modulation indices for the different boosting techniques of a Z-source inverter is also performed with respect to the traditional VSI through MATLAB-based simulation.
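For context, the textbook relations for a Z-source inverter under simple boost control (standard results from the Z-source literature, not taken from this paper) are

$$B = \frac{1}{1 - 2D_0}, \qquad G = M\,B = \frac{M}{2M - 1}, \qquad D_0 = 1 - M,$$

where $D_0$ is the shoot-through duty ratio, $M$ the modulation index, $B$ the boost factor, and $G$ the overall voltage gain; simulated gain-versus-$M$ curves are checked against expressions of this kind.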

Journal ArticleDOI
TL;DR: In this paper, the authors identify the enablers through a literature survey and discussions with experts, and conclude that the two enablers, namely stakeholders' viewpoint and financial position, have high driving power and therefore deserve great attention.
Abstract: With more than 5,000 technical institutions set up in India to impart technical education, this education sector is booming like never before. At the same time, however, reports published by AICTE in 2012 indicate that more than 130 technical institutions have applied for closure. With more than one million students appearing for the engineering entrance exam, the failure of these institutions can be attributed to their inability to provide the quality education needed to produce quality graduates. Most technical institutions still do not understand the main enablers of quality technical education and how these are interrelated. If management knows this in advance, it can plan its policies in the right direction. Through this paper, the authors endeavour to identify these enablers via a literature survey and discussions with experts. The ISM methodology and a questionnaire survey have been used to understand the intervening effect of these enablers on each other. It is found that the two enablers ‘Stakeholders’ view point’ and ‘Financial Position’ have high driving power and therefore deserve great attention. The study concludes with practical implications.

Journal ArticleDOI
TL;DR: In this paper, an NH3-H2O ejector-absorption refrigeration cycle and an R-152a ejector refrigeration cycle were employed with a renewable energy power generator to make a proposed compact power generation and triple effect ejector-absorption refrigeration cycle.
Abstract: An NH3–H2O ejector-absorption refrigeration cycle and an R-152a ejector refrigeration cycle are employed with a renewable energy power generator to form the proposed compact power generation and triple effect ejector-absorption refrigeration cycle. Exergy analysis of the cycle points to possible performance improvements. Approximately 71.69% of the input exergy is destroyed due to irreversibilities in the different components, and around 7.976% is available as useful exergy output. The exhaust exergy lost to the environment is 20.33%, which is lower than the exhaust energy loss of 47.95%, while the useful energy output is 27.88%. The refrigerants used have zero ODP and negligible GWP, and the CO2 emission of the exhaust gases is very small compared to that of a fossil fuel run engine; hence, this cycle is favorable to the global environment. The results also show that the proposed cycle has significantly higher energy and exergy efficiency than the earlier investigated 'triple effect refrigeration cycle' and 'combined power and ejector-refrigeration cycle'.
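The quoted percentages close the exergy balance, since destruction, useful output and exhaust loss must together account for the input exergy:

$$\dot{E}x_{in} = \dot{E}x_{dest} + \dot{E}x_{useful} + \dot{E}x_{exhaust}, \qquad 71.69\% + 7.976\% + 20.33\% \approx 100\%.$$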

Proceedings ArticleDOI
09 Mar 2013
TL;DR: A solution is proposed for the development of a crawling technique that attempts to reduce server load by taking advantage of migrants to download relevant pages pertaining to a specific topic only.
Abstract: The WWW is a distributed, heterogeneous information resource. With the exponential growth of the WWW, it has become difficult to access desired information that matches user needs and interests. In spite of strong crawling, indexing and page ranking techniques, the result-sets returned by search engines lack accuracy and precision. Large numbers of irrelevant links, topic drift, and load on servers are some of the other issues that need to be addressed in developing an efficient search engine. In this paper, a solution is proposed for the development of a crawling technique that attempts to reduce server load by taking advantage of migrants to download relevant pages pertaining to a specific topic only. The downloaded documents are then ranked considering user preferences and past usage patterns of the web pages, thereby improving the quality of the returned result-sets.

Proceedings ArticleDOI
13 Jun 2013
TL;DR: This paper deals with the analysis and comparison of different artificial intelligence techniques for harmonic reduction in the source current, using fuzzy logic and the intelligent features of the neuron cell, in the MATLAB environment.
Abstract: Artificial intelligence techniques are a recent trend used for the enhancement of power quality. This paper deals with the analysis and comparison of different artificial intelligence techniques for harmonic reduction in the source current. Artificial intelligence here includes fuzzy logic and artificial neural networks. The purpose of the simulation is to enhance power quality using concepts that cannot be expressed as "true" or "false" but rather as "partially true" (fuzzy logic), together with the intelligent features of the neuron cell. The shunt active filter is one controller that can be used to suppress source current harmonics and compensate reactive power. A hysteresis current controller is used to control the switching of the voltage source inverter, and D-Q frame theory is used to generate the reference compensating current. Simulations are carried out in the MATLAB environment using the power system toolbox.
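The D-Q (synchronous reference frame) step mentioned above rests on the standard Park transformation, a textbook relation not specific to this paper:

$$\begin{bmatrix} i_d \\ i_q \end{bmatrix} = \frac{2}{3} \begin{bmatrix} \cos\theta & \cos\left(\theta - \tfrac{2\pi}{3}\right) & \cos\left(\theta + \tfrac{2\pi}{3}\right) \\ -\sin\theta & -\sin\left(\theta - \tfrac{2\pi}{3}\right) & -\sin\left(\theta + \tfrac{2\pi}{3}\right) \end{bmatrix} \begin{bmatrix} i_a \\ i_b \\ i_c \end{bmatrix}.$$

The DC component of $i_d$ then carries the fundamental active current, and subtracting it leaves the harmonic content from which the filter's reference compensating current is built.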

Book ChapterDOI
01 Jan 2013
TL;DR: The properties, approaches and strategies of trust required in a trusted environment are discussed, along with two types of SW networks, peer-to-peer and social networks, and the algorithms that compute trust in these networks.
Abstract: Trust is an essential element of the semantic web (SW) and the networks associated with it. In this paper, we discuss the properties, approaches and strategies of trust required in a trusted environment. We then discuss two types of SW networks, peer-to-peer and social networks, and the algorithms that compute trust in these networks. Finally, after analysing all the trust algorithms, we compare them based on their strengths and weaknesses.

Journal ArticleDOI
TL;DR: An in-depth survey of the existing literature from various known international journals is conducted to arrive at a framework which will help researchers focus on specific and emerging areas in the field of data warehouse development, as well as the application of data warehouses in various business domains.
Abstract: Data warehouses play an important role in the strategic decision making process for complex business solutions. To gain competitive advantage, business executives are increasingly making use of data warehouse concepts, which play a vital role in analysing and predicting future trends based on past and current scenarios. We have surveyed the various techniques used in building a data warehouse and the methods used for implementing them. We have conducted an in-depth survey of the existing literature from various known international journals to arrive at a framework which will help researchers focus on specific and emerging areas in the field of data warehouse development, as well as the application of data warehouses in various business domains.

Proceedings ArticleDOI
24 Aug 2013
TL;DR: The paper mainly focuses on the aspect of URL distribution among the various parallel crawling processes, and provides a solution by designing a framework that partitions the URL frontier into several URL queues, ordering the URLs within each of the distributed sets.
Abstract: With the ever-proliferating size and scale of the WWW [1], efficient ways of exploring content are of increasing importance. How can we efficiently retrieve information from the web through crawling? In this "era of tera" and multi-core processors, we ought to consider multi-threaded processes as a serving solution. Even better, how can we improve crawling performance by using parallel crawlers that work independently? The paper is devoted to the fundamental advantages and challenges arising from the design of parallel crawlers [4], and it mainly focuses on the aspect of URL distribution among the various parallel crawling processes. How to distribute URLs from the URL frontier to the various concurrently executing crawling process threads is an orthogonal problem. The paper provides a solution by designing a framework that partitions the URL frontier into several URL queues, ordering the URLs within each of the distributed sets.
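One simple realization of the URL distribution problem discussed above (a sketch of the general idea, not the paper's exact framework) hashes each URL's host so that all pages of one site land in the same crawler queue, which also preserves per-server politeness:

```python
import hashlib
from urllib.parse import urlparse

NUM_QUEUES = 4
queues = [[] for _ in range(NUM_QUEUES)]

def assign(url: str) -> int:
    """Partition by host hash so one site always maps to one crawler thread."""
    host = urlparse(url).netloc
    digest = hashlib.md5(host.encode()).hexdigest()
    return int(digest, 16) % NUM_QUEUES

frontier = ["http://example.com/a", "http://example.com/b", "http://other.org/"]
for url in frontier:
    queues[assign(url)].append(url)

print(queues)   # example.com pages share a queue; other.org goes elsewhere
```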

Journal ArticleDOI
TL;DR: In this paper, a program is executed in the software EES, on the basis of the mathematical modelling described in the paper, to study cogeneration cycle performance for different parameters; the results obtained are compared with results available in the literature and are found to be in good agreement with them.
Abstract: The cogeneration cycle is an efficient means of recovering waste heat from the flue gases leaving a gas turbine. With the help of computer simulation, design parameters may be selected for the best performance of the cogeneration cycle. In the present work, a program is executed in the software EES, on the basis of the mathematical modelling described in the paper, to study cogeneration cycle performance for different parameters. The results obtained are compared with results available in the literature and are found to be in good agreement with them. Real gas and water properties are built into the software. The results show that the enthalpy of the air entering the combustion chamber is higher than that of the flue gases at the combustion chamber outlet. For the different operating conditions, energy and exergy efficiencies follow similar trends, although exergy efficiency values are always lower than the corresponding energy efficiency values. From the results it is found that a turbine outlet temperature (TOT) of 524°C is uniquely suited to an efficient cogeneration cycle, because it enables the transfer of heat from the exhaust gas to the steam cycle to take place over a minimal temperature difference. This temperature results in the maximum thermodynamic availability while operating the cogeneration cycle at the highest temperature and efficiency. The effects of cycle pressure ratio (CR), inlet air temperature (IAT) and water pressure at the heat recovery steam generator (HRSG) inlet on the 30 MW cogeneration cycle are also studied.

13 Jun 2013
TL;DR: In this article, a proposal has been made that includes an indexing algorithm to effectively maintain ontologies and a retrieval algorithm to process user queries and produce results with better precision and recall.
Abstract: Various researchers have recognized that ontology-based retrieval is one of the best approaches in terms of precision and recall for a semantic search engine. However, they did not explicitly specify any scheme/algorithm for ontology-based indexing and retrieval. This motivates us to conceive a proposal that includes (a) an indexing algorithm to effectively maintain ontologies, and (b) a retrieval algorithm to process user queries and produce results with better precision and recall. The proposed indexing algorithm has been tested manually on a sample of ten OWL ontologies of the mammal domain, but results are presented for only two ontologies due to space constraints. Finally, the proposed retrieval algorithms have also been validated against three user queries.