
Showing papers by "KCG College of Technology published in 2011"


Proceedings ArticleDOI
08 Apr 2011
TL;DR: This paper proposes a reversible T flip-flop that improves on the existing designs in the literature, along with a novel design of reversible asynchronous and synchronous counters.
Abstract: Nowadays, reversible logic design attracts growing interest due to its low power consumption. A lot of research has been done on both combinational and sequential design of reversible circuits. In this paper we propose a reversible T flip-flop that is better than the existing designs in the literature. A novel design of reversible asynchronous and synchronous counters is also proposed. To the best of our knowledge, this is the first attempt to apply reversible logic to counter design. We also propose a new reversible gate that can be used as a copying gate. We hope this paper will initiate a new area of research in the field of reversible sequential circuits.
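The abstract does not give the gate-level construction, but the two core ideas — a copying gate and a reversible T flip-flop — can both be sketched with the standard Feynman (CNOT) gate. The functions below are an illustrative sketch, not the paper's actual design:

```python
def feynman(a, b):
    """Feynman (CNOT) gate: (A, B) -> (A, A XOR B). A reversible 2x2 gate
    that is its own inverse: applying it twice restores the inputs."""
    return a, a ^ b

def copy_bit(a):
    """With B = 0 the Feynman gate maps (A, 0) -> (A, A), which is why it
    is commonly used as a fan-out/copying gate in reversible circuits."""
    return feynman(a, 0)

def t_flipflop_next(t, q):
    """Next state of a T flip-flop: Q' = T XOR Q. Realised here with a
    single Feynman gate, so the map (T, Q) -> (T, Q') is reversible."""
    _, q_next = feynman(t, q)
    return q_next
```

The toggle rule Q' = T XOR Q is exactly the CNOT truth table, which is why T flip-flops are a natural starting point for reversible sequential design.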

22 citations


Proceedings ArticleDOI
23 Mar 2011
TL;DR: In this paper census data in sparse form are taken, and the measured performance in time, space and transactions is compared with that measured for e-commerce datasets.
Abstract: Sparse data are becoming increasingly common and available in many real-life applications. However, relatively little attention has been paid to modelling sparse data effectively, and existing approaches such as the conventional "horizontal" and "vertical" representations fail to provide satisfactory performance for both storage and query processing, as they are too rigid and generally do not consider dimension correlations. A new technique called HoVer was therefore proposed by Bin Cui, which performs better than both the horizontal and vertical representations. In this paper census data in sparse form are taken, and the variations in performance in time, space and transactions are measured. These parameters are then compared with the time, space and transaction performance measured for e-commerce datasets. The changes in the parameters with changes in schema are analyzed and the variations observed.
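As a rough illustration of why the two conventional layouts behave differently on sparse data (the HoVer hybrid itself is not detailed in the abstract), the sketch below builds both from a tiny census-like sample; the attribute names and records are invented:

```python
# Sparse records: most attributes are absent for most entities.
records = [
    {"id": 1, "age": 34, "income": 52000},
    {"id": 2, "occupation": "farmer"},
    {"id": 3, "age": 61},
]
ALL_ATTRS = ["age", "income", "occupation"]

def horizontal(recs):
    """One wide row per entity; absent values are stored as NULLs,
    so storage grows with (entities x attributes) regardless of sparsity."""
    return [[r["id"]] + [r.get(a) for a in ALL_ATTRS] for r in recs]

def vertical(recs):
    """(id, attribute, value) triples; only non-null cells are stored,
    but reassembling one entity needs many rows (or self-joins in SQL)."""
    return [(r["id"], a, r[a]) for r in recs for a in ALL_ATTRS if a in r]
```

HoVer's idea, per the abstract, is to exploit dimension correlations so that neither extreme's weakness (NULL bloat vs. reassembly cost) dominates.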

13 citations


Journal ArticleDOI
TL;DR: The results indicate that the proposed controller improves the process in terms of time-domain specifications and set-point tracking, and also rejects disturbances with optimum stability.
Abstract: Problem statement: Temperature control in a plastic extrusion machine is an important factor in producing high-quality plastic products. The first-order temperature control system in plastic extrusion involves coupling effects, long delay times and large time constants. Controlling the temperature is very difficult because extrusion is a multistage process and the stages are coupled with each other. To overcome this problem, the system is designed with a neuro-fuzzy controller using LabVIEW. Approach: The existing techniques are the conventional PID controller, a neural controller and a Mamdani-type Fuzzy Logic Controller (FLC); the proposed method is a neuro-fuzzy controller. Results: A salient feature of the proposed method is the smoothing of the undesired control signal of the conventional PID, neural and Mamdani-type FLC controllers. The LabVIEW graphical programming language and a MATLAB toolbox were used to design the temperature control in the plastic extrusion system. The neuro-fuzzy controller is hence a powerful approach for retaining adaptiveness in the case of a nonlinear system. Conclusion: The tuning of the controller was synchronized with the controlled variable, keeping the process at its desired operating condition. The results indicated that the proposed controller improves the process in terms of time-domain specifications and set-point tracking, and also rejects disturbances with optimum stability.
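The paper's controller is built in LabVIEW with a MATLAB toolbox, so as a language-neutral sketch, here is a minimal Mamdani-style fuzzy inference step for a temperature loop. The membership shapes, rule base and output levels are illustrative assumptions, not the paper's tuned design:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_heater_output(error):
    """error = setpoint - measured temperature (degC).
    Three rules: negative error -> low power, near-zero -> medium,
    positive -> high. Defuzzified as a weighted average of singleton
    output levels (0 %, 50 %, 100 % heater power)."""
    mu_neg  = tri(error, -20, -10, 0)   # too hot
    mu_zero = tri(error, -5, 0, 5)      # on target
    mu_pos  = tri(error, 0, 10, 20)     # too cold
    num = mu_neg * 0.0 + mu_zero * 50.0 + mu_pos * 100.0
    den = mu_neg + mu_zero + mu_pos
    return num / den if den else 50.0
```

In a neuro-fuzzy controller such as the one proposed, the membership parameters (the `a`, `b`, `c` break-points here) would be learned rather than hand-set, which is where the claimed adaptiveness comes from.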

10 citations


Proceedings Article
08 Mar 2011
TL;DR: In this proposed work, fuzzy rules are defined and applied to all incoming emails; based on the result of the various rules against the user's attitude, each input email is classified as spam or ham.
Abstract: In this information era, most communication happens through e-mail. Many emails contain web spam, as transactions over the internet are affected by passive and active attacks. Recently, several algorithms have been developed that only partially detect spam emails, so there is a need to improve the performance of spam-detection algorithms. In this proposed work, fuzzy rules are defined and applied to all incoming emails. Based on the result of the various rules against the user's attitude, each input email is classified as spam or ham. This method outperforms existing spam-detection algorithms in terms of accuracy and user friendliness.
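The abstract does not list the actual rule base, but the shape of a fuzzy-rule classifier — rules returning degrees in [0, 1] rather than hard yes/no votes, combined by weight — can be sketched as follows. The rules, weights and threshold here are invented for illustration:

```python
# Each rule maps a mail to a spam degree in [0, 1]; weights reflect how
# strongly each rule should count. All three rules are illustrative.
RULES = [
    (lambda m: 1.0 if "winner" in m["body"].lower() else 0.0, 0.8),
    (lambda m: 1.0 if m["sender"] not in m["contacts"] else 0.0, 0.5),
    (lambda m: min(m["body"].count("!") / 5.0, 1.0), 0.4),
]

def classify(mail, threshold=0.5):
    """Weighted average of rule degrees; above the threshold -> spam."""
    score = sum(degree(mail) * w for degree, w in RULES)
    score /= sum(w for _, w in RULES)
    return "spam" if score >= threshold else "ham"
```

The "user attitude" element of the paper would correspond to adjusting these weights (or the threshold) per user, so that the same mail can be ham for one recipient and spam for another.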

5 citations


Proceedings Article
08 Mar 2011
TL;DR: In this research work, a statistical approach using hypothesis testing at the 95% confidence level is used for retrieving relevant web documents, and the algorithm works well for both structured and unstructured web documents with high precision.
Abstract: Today, the most powerful tool in the internet world is the search engine, as most people rely on search engines to retrieve interesting documents. Due to the huge amount of information available on the web, many of the documents retrieved from a search engine are irrelevant and cause a waste of the user's time. Therefore there is a need for information retrieval and web mining researchers to develop an automated tool for improving the quality of the search results returned by search engines. In this research work, a statistical approach using hypothesis testing at the 95% confidence level is used for retrieving the relevant web documents. This algorithm works well for both structured and unstructured web documents with high precision.
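The abstract does not specify which test statistic is used, so as one plausible minimal sketch: a one-sided z-test at the 95% level asking whether a query term occurs in a document significantly more often than its background rate in the collection. The function and its parameters are assumptions, not the paper's method:

```python
import math

def is_relevant(term_count, doc_len, bg_rate, z_crit=1.645):
    """One-sided z-test at the 95% confidence level.

    H0: the term's rate in this document equals its background rate
        across the collection (bg_rate).
    H1: the document over-represents the term, i.e. it is 'about' it.

    Rejects H0 (returns True) when the standardized excess exceeds the
    one-sided 95% critical value z = 1.645.
    """
    p_hat = term_count / doc_len                      # observed term rate
    se = math.sqrt(bg_rate * (1 - bg_rate) / doc_len)  # std. error under H0
    z = (p_hat - bg_rate) / se
    return z > z_crit
```

Ranking by the z-score itself (rather than the binary reject/accept decision) would give a graded relevance ordering of the retrieved documents.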

4 citations


Proceedings Article
08 Mar 2011
TL;DR: A mathematical approach based on signed and trust reputation management is developed to restrict spam e-mails through the user's attitude toward a particular e-mail and the content relevancy of the e-mail.
Abstract: In the e-mail revolution, most e-mail messages contain SPAM, which clogs up the inbox and is quite obnoxious. Therefore, managing a mailbox has become a big task in the faster e-world. Especially when the user is linked with social networks, the user's inbox is occupied with several kinds of SPAM emails, which lead to many problems. Detection of spam mails has thus become an important issue in the e-world. In this paper a mathematical approach based on signed and trust reputation management is developed to restrict spam e-mails through the user's attitude toward a particular e-mail and the content relevancy of the e-mail. The results obtained by this approach are similar to those obtained with an ID3 classifier.
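The paper's mathematical model is not given in the abstract; the sketch below only illustrates the general idea of combining signed feedback (the "signed" reputation) with content relevancy into one score. The weight values, the [0, 1] scaling and the threshold are all invented:

```python
def spam_score(sender_votes, relevance, w_trust=0.6, w_rel=0.4):
    """sender_votes: list of +1 (ham) / -1 (spam) feedback signals that
    users have given on this sender's past mail -- the 'signed' part.
    relevance: a [0, 1] score of how on-topic the body is for this user.
    Returns a spam score in [0, 1]; weights are illustrative assumptions."""
    if sender_votes:
        trust = (sum(sender_votes) / len(sender_votes) + 1) / 2  # map [-1,1] -> [0,1]
    else:
        trust = 0.5  # unknown sender: neutral reputation
    return w_trust * (1 - trust) + w_rel * (1 - relevance)

def is_spam(sender_votes, relevance, threshold=0.5):
    return spam_score(sender_votes, relevance) > threshold
```

A decision-tree learner such as ID3 (the paper's baseline) would instead induce hard splits on these same features from labelled mail, which is why comparable results between the two are plausible.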

3 citations



Proceedings ArticleDOI
20 Jul 2011
TL;DR: This paper describes an approach to develop a payroll system using service oriented architecture through component based development technique and shows how this technique can improve the quality of software built using components.
Abstract: In recent years, the software industry has been striving constantly to develop software in a more efficient and time-saving manner. With this in focus, the industry has concentrated on research into compartmentalizing functionality in the form of components that can be reused. Software built from components is of high quality, as the components are pre-built, pre-tested and interoperate with each other. Components communicate with each other through interfaces. These components can be used in a service-oriented architecture as services. Services provide access to a well-defined collection of functionality, and the system as a whole is designed and implemented as a set of interactions among these services. Exposing functionality as services is the key to flexibility. Some form of inter-service infrastructure is required to facilitate service interactions and communication, and this is implemented using Web Services. This paper describes an approach to developing a payroll system using service-oriented architecture through a component-based development technique.
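The central claim — components communicating only through interfaces so they can be swapped or exposed as services — can be sketched in a few lines. The service names and the flat-tax rule are hypothetical, not taken from the paper's payroll system:

```python
from abc import ABC, abstractmethod

class TaxService(ABC):
    """Interface: payroll depends on this contract, never on an implementation."""
    @abstractmethod
    def tax(self, gross: float) -> float: ...

class FlatTax(TaxService):
    """One concrete component; a web-service proxy implementing the same
    interface could replace it without touching PayrollService."""
    def __init__(self, rate: float):
        self.rate = rate
    def tax(self, gross: float) -> float:
        return gross * self.rate

class PayrollService:
    """Composes other components via their interfaces (dependency injection)."""
    def __init__(self, tax_service: TaxService):
        self.tax_service = tax_service
    def net_pay(self, gross: float) -> float:
        return gross - self.tax_service.tax(gross)
```

In the SOA setting described by the paper, `FlatTax` would sit behind a Web Service endpoint; the key point is that `PayrollService` is unchanged either way.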

1 citation


Proceedings ArticleDOI
20 Jul 2011
TL;DR: A comparative study is made of the Information Extraction tools used for biological databases, based on the technique used: scan passes, extraction rule type, features used, learning algorithm and tokenization scheme.
Abstract: The Internet presents a huge amount of biological data, and it is difficult to extract relevant data from the various sources. Robust, flexible Information Extraction (IE) systems and tools are therefore needed that transform Web pages into program-friendly structures such as a relational database. This paper presents a study of information extraction tools that can be used for biological databases. The tools are classified into four categories: manually constructed information extraction tools, supervised wrapper induction systems, semi-supervised information extraction systems, and unsupervised information extraction systems. Finally, we make a comparative study of the information extraction tools used for biological databases based on the technique used, such as scan passes, extraction rule type, features used, learning algorithm and tokenization scheme.

1 citation


Journal ArticleDOI
TL;DR: In this article, RFID tag applications are described, including enterprise supply-chain management to improve the efficiency of inventory tracking and management, where data storage and retrieval on special devices are carried out by RFID tags or transponders.
Abstract: Radio Frequency Identification (RFID) is an automatic identification system in which data storage and retrieval on special devices are carried out by RFID tags or transponders. RFID tag applications include enterprise supply-chain management, where they improve the efficiency of inventory tracking and management. RFID tags replace the bar codes and other low-cost remote sensors in use earlier.