
Showing papers by "YMCA University of Science and Technology published in 2011"


Journal ArticleDOI
TL;DR: The results indicate that adding PPVW up to 40% with CD can produce good-quality vermicompost, and that growth and fecundity of E. fetida were best when reared in a 20% PPVW + 80% CD feed mixture.

118 citations


Proceedings ArticleDOI
10 Nov 2011
TL;DR: A page ranking mechanism called Page Ranking based on Visits of Links (VOL) is devised for search engines; it builds on Google's basic ranking algorithm, PageRank, and takes the number of visits of inbound links of Web pages into account.
Abstract: Search engines generally return a large number of pages in response to user queries. To assist users in navigating the result list, ranking methods are applied to the search results. Most ranking algorithms proposed in the literature are either link or content oriented and do not consider user usage trends. In this paper, a page ranking mechanism called Page Ranking based on Visits of Links (VOL) is devised for search engines. It builds on Google's basic ranking algorithm, PageRank, and takes the number of visits of inbound links of Web pages into account. This concept is very useful for displaying the most valuable pages at the top of the result list on the basis of user browsing behavior, which considerably reduces the search space. The paper also presents a method to find link-visit counts of Web pages and a comparison of VOL with the PageRank algorithm.
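A minimal sketch of the idea described above, under the assumption that a page's rank is distributed over its outlinks in proportion to recorded visit counts rather than equally as in classic PageRank (the paper defines the exact weighting; the function and graph below are illustrative):

```python
# Sketch of Page Ranking based on Visits of Links (VOL).
# Assumption: each link (u, v) carries a visit count, and u's rank is
# shared among its outlinks in proportion to those counts instead of
# being split equally as in classic PageRank.

def vol_rank(links, visits, d=0.85, iters=50):
    """links: dict page -> list of outlinked pages (each page has outlinks).
    visits: dict (page, outlink) -> visit count (missing counts default to 1)."""
    pages = set(links) | {v for outs in links.values() for v in outs}
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / len(pages) for p in pages}
        for u, outs in links.items():
            total = sum(visits.get((u, v), 1) for v in outs)
            for v in outs:
                # share of u's rank given to v, weighted by link visits
                new[v] += d * rank[u] * visits.get((u, v), 1) / total
        rank = new
    return rank

graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
plain = vol_rank(graph, {})                              # equal split
vol = vol_rank(graph, {("A", "B"): 9, ("A", "C"): 1})    # B's link is visited more
# B's rank rises under VOL because its inbound link from A is visited more often.
```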

68 citations


Journal ArticleDOI
TL;DR: In this paper, a graph theoretic approach is proposed to evaluate the machinability of tungsten carbide composite; the effect of several factors and their subfactors is analyzed by developing a mathematical model using the digraph and matrix method.
Abstract: Machinability is of considerable importance for efficient process planning in manufacturing. Machinability of an engineering material may be evaluated in terms of process output variables such as material removal rate, processed surface finish, cutting forces, tool life, specific power consumption, etc. In this paper, a graph theoretic approach (GTA) is proposed to evaluate the machinability of tungsten carbide composite. Material removal rate is considered as a machinability attribute of tungsten carbide to evaluate the effect of several factors and their subfactors. Factors affecting machinability and their interactions are analyzed by developing a mathematical model using the digraph and matrix method. The permanent function, or machinability index, is obtained from the matrix model developed from the digraphs. This index value helps in quantifying the influence of the considered factors on machinability. In the present illustration, factors affecting the machinability of tungsten carbide are grouped into five broad factors, namely work material, machine tool, tool electrode, cutting conditions, and geometry to be machined. The GTA methodology reveals that the machine tool has the highest index value. Therefore, it is the most influential factor affecting machinability.
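The index described above is the matrix permanent, a determinant-like expansion in which all terms are added so no interaction information cancels out. A brute-force sketch follows; the 5x5 values are illustrative placeholders, not the paper's data:

```python
from itertools import permutations

# GTA sketch: diagonal entries hold the inheritances of the five factors
# (work material, machine tool, tool electrode, cutting conditions,
# geometry to be machined); off-diagonal entries hold their pairwise
# interactions. The machinability index is the permanent of this matrix.

def permanent(M):
    """Matrix permanent by brute force: like a determinant, but every
    permutation term is added (no sign), so nothing cancels."""
    n = len(M)
    total = 0
    for perm in permutations(range(n)):
        prod = 1
        for i, j in enumerate(perm):
            prod *= M[i][j]
        total += prod
    return total

M = [  # illustrative inheritance/interaction values on a 1-10 scale
    [5, 3, 2, 3, 2],
    [3, 7, 3, 4, 2],
    [2, 3, 6, 3, 2],
    [3, 4, 3, 5, 3],
    [2, 2, 2, 3, 4],
]
index = permanent(M)  # single numerical machinability index
```

Brute force is O(n!), which is fine for the five broad factors used here; larger systems would need Ryser's formula.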

56 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a methodology to evaluate the performance of carbide compacting dies using a graph theoretic approach (GTA); factors affecting die performance and their interactions are analysed by developing a mathematical model using the digraph and matrix method.
Abstract: This paper presents a methodology to evaluate the performance of carbide compacting dies using a graph theoretic approach (GTA). Factors affecting die performance and their interactions are analysed by developing a mathematical model using the digraph and matrix method. The permanent function, or die performance index, is obtained from the matrix model developed from the digraphs. This index value compares and ranks the factors affecting die performance. It helps in the selection of optimum process parameters during die manufacturing. Hence, process output errors such as dimensional inaccuracy, large surface craters, and deep recast layers will be minimised during die manufacturing, which helps to achieve better die performance. In the present illustration, factors affecting the performance of carbide compacting dies are grouped into five main factors, namely work material, machine tool, tool electrode, geometry of die, and machining operation. The GTA methodology reveals that the machine tool has the highest index value. Therefore, it is the most influential factor affecting die performance. In the case of die material, low cobalt concentration and small grain size yield good surface finish, while in the machine tool, low discharge energy (i.e. low values of peak current, pulse-on time, and servo voltage, and a high value of pulse-off time) and a high dielectric flow rate yield good surface finish and hence favour good die performance. In the case of die geometry, large workpiece thickness and small taper angles result in smaller geometrical deviations and hence help to achieve better die performance.

30 citations


Journal ArticleDOI
TL;DR: To minimize the limitations of existing deep Web crawlers, a novel architecture is proposed based on the QIIIEP specifications (Sharma & Sharma, 2009); it is cost effective and offers both privatized and general search for deep Web data hidden behind HTML forms.
Abstract: A traditional crawler picks up a URL, retrieves the corresponding page and extracts various links, adding them to the queue. A deep Web crawler, after adding links to the queue, checks for forms. If forms are present, it processes them and retrieves the required information. Various techniques have been proposed for crawling deep Web information, but much remains undiscovered. In this paper, the authors analyze and compare important deep Web information crawling techniques to find their relative limitations and advantages. To minimize the limitations of existing deep Web crawlers, a novel architecture is proposed based on the QIIIEP specifications (Sharma & Sharma, 2009). The proposed architecture is cost effective and has features of privatized search and general search for deep Web data hidden behind HTML forms.

28 citations


Journal ArticleDOI
TL;DR: The graph theoretic approach yields a single numerical index from which the best manufacturing process can be chosen; to apply it, the authors selected four factors, namely Quality, Cost, Technical Capability, and Production.
Abstract: Nowadays there are many methods available in the market to manufacture a product. To earn more profit and achieve the best production, which is the prime focus of any manufacturing industry, it is necessary to select the manufacturing process that leads to more profit; less scrap and rework; a faster production rate; good production quality; employee satisfaction; customer satisfaction; etc. So the aim of this paper is to judge the best manufacturing process among various manufacturing processes for manufacturing any product using the graph theoretic approach. The graph theoretic approach yields a single numerical index, and accordingly it is possible to choose the best manufacturing process. To apply the graph theoretic approach, the authors selected four factors, namely Quality, Cost, Technical Capability, and Production. Based on these factors and their co-factors, a fishbone diagram is presented. While applying the graph theoretic approach, a digraph of the characteristics is drawn which represents the factors and co-factors affecting the selection of the manufacturing process; the interdependency of the factors as well as their inheritances is identified, and its representation in matrix form is used to calculate the numerical index of the manufacturing process through its variable permanent quality function. The technique is applicable when more than one option is available for manufacturing a product. An example is given at the end of the paper to illustrate the application of the graph theoretic approach for selecting the best manufacturing process among three processes.

18 citations


Proceedings ArticleDOI
07 Oct 2011
TL;DR: It is shown that distributed crawling methods based on migrating crawlers are an essential tool for allowing such access, minimizing network utilization while keeping up with document changes.
Abstract: Studies report that about 40% of current internet traffic and bandwidth consumption is due to web crawlers that retrieve pages for indexing by the different search engines. As the size of the web continues to grow, searching it for useful information has become increasingly difficult. Centralized crawling techniques are unable to cope with the constantly growing web. This paper shows that distributed crawling methods based on migrating crawlers are an essential tool for allowing such access, minimizing network utilization while keeping up with document changes.

13 citations


Journal ArticleDOI
TL;DR: The role of e-manufacturing is pivotal to the success of a company; identifying the enablers of e-manufacturing has become important for manufacturers in customer-oriented manufacturing to attract new customers while retaining existing ones.
Abstract: The term e-manufacturing refers to the ability of a manufacturing system to integrate various inputs using the internet and intranet. With the market increasingly competitive and customer oriented, and customers desiring the best quality at the cheapest available price in the shortest possible time, the role of e-manufacturing is pivotal to the success of a company. Another important fact is that each customer may desire a different set of values added to the product being purchased. Customers want to voice their concerns directly to manufacturers, necessitating an interface to hear them in real time and to take suitable action, if needed, in real time. As such, identifying the enablers of e-manufacturing has become important for manufacturers in customer-oriented manufacturing to attract new customers while retaining existing ones. It will also help close the gap between demand and supply of a product. This paper tries to assimilate the key enablers of e-manufacturing.

13 citations


Journal ArticleDOI
TL;DR: This paper concentrates on the information available on the surface web through general web pages and the hidden information behind the query interface, called deep web, and examines the three main components of search engines.
Abstract: ICT plays a vital role in human development through information extraction and includes computer networks and telecommunication networks. One of the important modules of ICT is computer networks, which are the backbone of the World Wide Web (WWW). Search engines are computer programs that browse and extract information from the WWW in a systematic and automatic manner. This paper examines the three main components of search engines: the Extractor, a web crawler which starts with a URL; the Analyzer, an indexer that processes words on the web page and stores the resulting index in a database; and the Interface Generator, a query handler that understands the needs and preferences of the user. This paper concentrates on the information available on the surface web through general web pages and the hidden information behind the query interface, called the deep web. It emphasizes the extraction of relevant information to generate the preferred content for the user as the first result of his or her search query, and discusses aspects of the deep web with an analysis of a few existing deep web search engines.

10 citations


Journal ArticleDOI
05 Dec 2011
TL;DR: The paper attempts to represent the overall effect of key website performance attributes quantitatively by developing a mathematical model using graph theoretic approach and provides an insight into the website performance factors at system and subsystem level.
Abstract: To analyse the overall performance of a website, identification of website performance factors is required. The key website performance attributes affecting overall website quality are identified, and the sub-factors affecting them are discussed. The interaction of these factors among themselves and the resulting overall effect help attain a better managed website. The paper attempts to represent the overall effect of the key website performance attributes quantitatively by developing a mathematical model using a graph theoretic approach. In this approach, the interaction among the identified key website performance attributes is represented through a digraph, a matrix model and a multinomial. The overall effect of the key website performance attributes is expressed in terms of the 'website performance index'. It provides an insight into website performance factors at the system and subsystem level. The paper attempts to quantify the performance factors through a systematic approach and is of value to website managers seeking to improve their website environment.

8 citations


Journal ArticleDOI
TL;DR: This paper examines and proposes a list of attributes of Total Quality Management (TQM) in an educational institute, and develops a model for the benefit of researchers and academicians.
Abstract: This paper examines and proposes a list of attributes of Total Quality Management (TQM) in an educational institute, and develops a model for the benefit of researchers and academicians. Even though a large number of papers have been published on TQM, none has focused on documenting the attributes of TQM in educational institutes using statistical methods. The paper investigates and lists 42 attributes of TQM in educational institutions. A quantitative study involving the administration of a survey was conducted. The survey instrument consisted of 42 items and was prepared on the basis of the attributes of TQM found during the literature review. The application of the factor analysis technique is illustrated for grouping the various attributes into factors. The results of this study will help in a smoother penetration of TQM programs in educational institutes. The period of study is from 1995 to 2006. Considering the gamut of publications, TQM implementation has seen steady growth and appears to be heading towards maturity.

Proceedings ArticleDOI
25 Feb 2011
TL;DR: The proposed technique is based on domain-specific crawling of the World Wide Web, in which a link is followed step by step, yielding a large source of hidden web databases.
Abstract: For context-based surfing of the World Wide Web in a systematic and automatic manner, a web crawler is required. The World Wide Web consists of interlinked documents and resources that are easily crawled by a general web crawler, known as a surface web crawler. But crawling hidden web data, in which the data is hidden behind HTML forms, requires a special type of crawler, known as a hidden web crawler. For efficient crawling of hidden web data, the discovery of relevant and proper HTML forms is a very important step. For this purpose, a technique for a domain-specific hidden web crawler is proposed in this paper. The proposed technique is based on domain-specific crawling of the World Wide Web. In this approach, a link is followed step by step, which yields a large source of hidden web databases. Experimental results verify that the proposed approach is quite effective in crawling hidden web data contents.
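The form-discovery step described above can be sketched as follows. This is an illustrative stand-in, not the paper's algorithm: the keyword filter approximates domain-specific relevance, and the class and function names are invented for the example.

```python
from html.parser import HTMLParser

# Sketch of the form-discovery step of a hidden web crawler: while
# following links, flag pages that contain HTML forms and match the
# target domain, so they can later be processed for hidden web content.
# The keyword filter is a simplified stand-in for domain-specific crawling.

class FormAndLinkFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.forms = []   # form action URLs found on the page
        self.links = []   # outgoing links to keep crawling

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "form":
            self.forms.append(attrs.get("action", ""))
        elif tag == "a" and "href" in attrs:
            self.links.append(attrs["href"])

def analyze(page_html, domain_keywords=("book", "author")):
    """Return the page's forms and links, and whether it is a candidate
    entry point into a domain-relevant hidden web database."""
    finder = FormAndLinkFinder()
    finder.feed(page_html)
    relevant = any(k in page_html.lower() for k in domain_keywords)
    return {"forms": finder.forms, "links": finder.links,
            "candidate_entry": bool(finder.forms) and relevant}

result = analyze('<a href="/next">more</a>'
                 '<form action="/search">Find a book</form>')
# result["candidate_entry"] is True: the page has a form and matches the domain
```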

Book ChapterDOI
19 Jul 2011
TL;DR: This paper presents a brief overview of challenges in designing a security mechanism for WSN, classify different types of attacks and lists available protocols, while laying outline for proposed work.
Abstract: If sensor networks are to attain their potential, security is one of the most important aspects to be taken care of. The need for security in military applications is obvious, but even more benign uses, such as home health monitoring, habitat monitoring and sub-surface exploration, require confidentiality. WSNs are well suited for detecting environmental, biological, or chemical threats over large-scale areas, but maliciously induced false alarms could completely negate the value of the system. The widespread deployment of sensor networks is directly related to their security strength. These facts form the basis of this survey paper. The paper presents a brief overview of the challenges in designing a security mechanism for WSNs, classifies different types of attacks and lists available protocols, while laying an outline for the proposed work.

Book ChapterDOI
08 Aug 2011
TL;DR: A clustering method is proposed that considers the presence of physical obstacles, uses obstacle modeling as a preprocessing step, and further incorporates a hierarchical structure into the existing clustering structure.
Abstract: Spatial data clustering groups similar objects based on their distance, connectivity, or relative density in space, whereas in the real world there exist many physical constraints, e.g. highways, rivers, and hills, that may affect the result of clustering. Taking these obstacles into consideration can render cluster analysis a hopelessly slow exercise. In this paper, a clustering method is proposed that considers the presence of physical obstacles and uses obstacle modeling as a preprocessing step. With a view to pruning the search space and reducing the complexity at search levels, the work further incorporates a hierarchical structure into the existing clustering structure. The clustering algorithm can detect clusters of arbitrary shapes and sizes and is insensitive to noise and input order.
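A minimal sketch of obstacle-aware clustering, assuming obstacles are modeled as line segments and two points may join the same cluster only when they are close and mutually visible (the paper's hierarchical pruning is omitted; all names and values are illustrative):

```python
import math

# Sketch of obstacle-aware clustering: two points belong to the same
# cluster only if they are within eps of each other AND the straight
# line between them does not cross any obstacle segment. This is a
# simplified stand-in for obstacle modeling as a preprocessing step.

def _ccw(a, b, c):
    """Signed area test: >0 if a->b->c turns counter-clockwise."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    """True if open segments p1-p2 and q1-q2 properly intersect."""
    return (_ccw(p1, p2, q1) * _ccw(p1, p2, q2) < 0 and
            _ccw(q1, q2, p1) * _ccw(q1, q2, p2) < 0)

def cluster(points, obstacles, eps):
    def reachable(a, b):
        if math.dist(a, b) > eps:
            return False
        return not any(segments_cross(a, b, o1, o2) for o1, o2 in obstacles)

    labels, next_label = {}, 0
    for p in points:
        if p in labels:
            continue
        labels[p] = next_label
        stack = [p]
        while stack:  # flood-fill the eps-visibility connected component
            cur = stack.pop()
            for q in points:
                if q not in labels and reachable(cur, q):
                    labels[q] = labels[p]
                    stack.append(q)
        next_label += 1
    return labels

# A vertical wall at x=1 splits points that would otherwise be one cluster:
pts = [(0.0, 0.0), (0.0, 1.0), (2.0, 0.0), (2.0, 1.0)]
wall = [((1.0, -5.0), (1.0, 5.0))]
labels = cluster(pts, wall, eps=3.0)
# points left of the wall share one label, points right of it another
```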

Proceedings ArticleDOI
07 Oct 2011
TL;DR: General security objectives for migrants are identified and corresponding mechanisms for facing the identified threats have been designed.
Abstract: Owing to their inherent security problems, migrants have limited applications, especially as they are prone to misuse by other applications, which increases the scale of threats. Nevertheless, migrants are potential contributors to the implementation of migrating crawlers because of their capability to move to the information resource itself. In this paper, general security objectives for migrants are identified, and corresponding mechanisms for facing the identified threats have been designed.

Journal ArticleDOI
TL;DR: In the case of sol-gel, polymer and thin-film matrices of C60 and C70, it is found that the broader, less intense absorption bands arise from intermolecular interaction.
Abstract: The electronic absorption spectra of C60 and C70 molecules have been reported in various solvents, viz. polar, aliphatic and aromatic. In addition, the electronic absorption spectra of C60 and C70 were also recorded in different matrices, viz. sol-gel, polymer and thin film. The solvent effect has been observed in polar, aliphatic and aromatic solvents; the aromatic solvents show a larger red shift compared to the other solvents. The solution-phase electronic absorption spectra of C60 and C70 show the conjugation effect and are compared with the absorption spectra reported in different matrices such as sol-gel, polymer and thin film. In the case of the sol-gel, polymer and thin-film matrices of C60 and C70, it is found that the broader, less intense absorption bands arise from intermolecular interaction.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: The impact of a new parameter, the shape of the area where the ad hoc network is to be deployed, on the performance of ad hoc networks is studied through simulation in MATLAB.
Abstract: Mobile Ad hoc Networks (MANETs) are normally deployed in disaster-affected areas where there is heavy damage to infrastructural resources, or in military operations in remote areas where infrastructural resources are virtually nonexistent. The performance of these networks is known to depend on various parameters such as transmission range, mobility, bandwidth, residual power of the nodes, and network size. The literature provides studies of the impact of these parameters on MANET performance. This paper studies the impact of a new parameter, the shape of the area where the ad hoc network is to be deployed, on the performance of ad hoc networks through simulation in MATLAB.

Journal ArticleDOI
TL;DR: In this paper, a technique to utilise the users' browsing behaviour at the crawling and indexing process is being proposed so as to direct the crawler to download the important pages, which were not previously crawled.
Abstract: Making search engines responsive to human needs requires understanding of user navigation through the search results returned for submitted queries. User behaviour characterisation provides an interesting perspective on the workload imposed on the search engine and can be used to address crucial points such as load balancing, content caching, data distribution and result optimisation. User browsing behaviour is recorded in the query logs of search engines and is usually referred to as web usage data. In this paper, a technique is proposed to utilise users' browsing behaviour in the crawling and indexing process so as to direct the crawler to download important pages that were not previously crawled. As the work attempts to index most of the important pages based on user feedback, it would help the search engine enhance its efficiency. In addition, the existing data structures maintained by search engines have been refined to support the proposed user feedback mechanism and to open more research directions.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: In this paper, a high performance sliding mode controller using a new switching function and positive sequence voltage detector (PSVD) is proposed to control the source current harmonics, compensating reactive power, and improving power factor.
Abstract: The purpose of this paper is to design a high-performance sliding mode controller through the use of a new switching function and a Positive Sequence Voltage Detector (PSVD). The controller aims at controlling the source current harmonics, compensating reactive power, and improving the power factor. In the SMC used, the signum function is replaced by a saturation function to smooth the signal and minimize the chattering effect. By incorporating the PSVD, the controller can be used even in the case of high voltage distortion or unbalanced input voltage signals. The proposed method is tested with balanced and unbalanced linear/non-linear loads.

Proceedings ArticleDOI
29 Dec 2011
TL;DR: The aim of this paper is to assess manufacturing industries by determining a single numerical index with the help of the Graph Theoretic Approach, revealing a numerical index that indicates the best industry.
Abstract: The aim of this paper is to assess manufacturing industries by determining a single numerical index with the help of the Graph Theoretic Approach (GTA). To apply the approach, the authors identified the factors and co-factors through an extensive literature survey and determined the interdependence between them. The factors are grouped into four main factors, namely Human Resource; Material, Machine and Methodology; Planning and Organization; and Work Culture. The GTA methodology reveals a numerical index indicating the best industry.

Proceedings ArticleDOI
03 Jun 2011
TL;DR: A method for improvement of temporal relevance of the collection has been developed which takes care of the users' current topic of interest by tracking the search queries supplied by them to a search engine.
Abstract: A major component of a search engine from an information retrieval point of view is the crawler. It works at the back end, continually downloading documents and depositing them in the repository/collection maintained by the search engine. The relevance of the collection to users' current topics of interest is a major concern. In this paper, a method for improving the temporal relevance of the collection is developed which takes care of users' current topics of interest by tracking the search queries they supply to a search engine.

Journal ArticleDOI
TL;DR: A model of changing damping properties with compressive residual stress and depth of the deformed layer of austenitic stainless steel is discussed; the damping factor in the thin surface film was found to increase with the depth of the deformed layer.
Abstract: The mechanical properties of austenitic stainless steel are rarely improved by heat treatment. Shot peening is a well-known cold working process that affects a thin surface layer of materials. By controlling the shot peening intensity and shot size, surface films with modified mechanical properties and thicknesses from 0.05 mm to 0.5 mm were obtained. The damping factor and compressive residual stress were determined experimentally, and a relation between them was established. It was found that the damping factor in the thin surface film increases with the depth of the deformed layer. An investigation was carried out, and it was found that the increase in damping factor was due to the introduction of compressive residual stress and the increased hardness caused by shot peening. The paper discusses a model of changing damping properties with compressive residual stress and depth of the deformed layer of austenitic stainless steel.

Journal Article
TL;DR: In this paper, a framework for web-based gas turbine system design is presented, which will not only nurture the information-sharing potential of the peer user group but also support design updates for existing gas turbine system designs, along with sensitivity analysis, through a web browser.
Abstract: Despite the use of computer simulation packages in a schematic gas turbine system design environment, limited efforts have been made towards the use of web-based environments for the design of such systems. This paper presents a novel framework for web-based gas turbine system design. The development of a web-based environment for gas turbine system design will not only nurture the information-sharing potential of the peer user group but also support design updates for existing gas turbine system designs, along with sensitivity analysis, through a web browser. The user interface modules as the front end, and the knowledge modules with servers along with a schedule of information exchange, are proposed in this paper for gas turbine system designs.

Journal ArticleDOI
TL;DR: In the present work, an endeavour has been made to quantify intangible attributes using a fuzzy MADM approach, and an overall numerical index has been found using the Graph Theoretic Approach (GTA) so that the customer can visually compare different models.
Abstract: The optimum selection of grippers is a complex task, as it involves many tangible and intangible attributes as well as the availability of many options in the market. In the present work, an endeavour has been made to quantify intangible attributes using a fuzzy MADM approach; an overall numerical index has then been found using the Graph Theoretic Approach (GTA) so that the customer can visually compare different models.