
Showing papers by "Velagapudi Ramakrishna Siddhartha Engineering College published in 2011"


Journal ArticleDOI
TL;DR: In this paper, tensile and flexural tests were carried out on composites made by reinforcing jowar, a new natural fibre, into a polyester resin matrix.

364 citations


Journal ArticleDOI
TL;DR: In this paper, the authors measured the density, ultrasonic velocity, and viscosity of binary mixtures of (anisaldehyde + o-cresol, + m-cresol, or + p-cresol) over the entire range of composition at T = (303.15, 308.15 and 318.15) K.

52 citations


Journal ArticleDOI
TL;DR: In this article, green fibers were extracted from sansevieria leaves and treated with 5% aqueous NaOH solution, and the primary and derivative thermograms of the untreated and alkali-treated fibers were measured using a TGA/DTA instrument.

49 citations


Journal ArticleDOI
TL;DR: Partially biodegradable Typha angustifolia natural fiber-reinforced polyester composites were prepared in this article, and the fiber content in the composites was varied from ∼18.3% to 35.5% by volume, and mechanical properties in each case were determined.

44 citations


Proceedings ArticleDOI
08 Apr 2011
TL;DR: A denoising method that uses the Undecimated Wavelet Transform to decompose the raw ECG signal and a shrinkage operation to eliminate the noise; the results proved that the denoised signal has a better balance between smoothness and accuracy than with the DWT.
Abstract: The Electrocardiogram (ECG) is a technique of recording bioelectric currents generated by the heart, which helps clinicians evaluate the condition of a patient's heart. It is therefore very important to obtain the parameters of the ECG signal clearly, without noise. Many wavelet-based denoising algorithms use the DWT (Discrete Wavelet Transform) in the decomposition stage, which suffers from shift variance. To overcome this, in this paper we propose a denoising method that uses the Undecimated Wavelet Transform to decompose the raw ECG signal, with a shrinkage operation to eliminate the noise from the noisy signal. In the shrinkage step we used semi-soft and Stein thresholding operators along with the traditional hard and soft thresholding operators, and verified the suitability of different wavelet families for the denoising of ECG signals. The results proved that the signal denoised using the UDWT (Undecimated Discrete Wavelet Transform) has a better balance between smoothness and accuracy than with the DWT.

30 citations
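The shrinkage step described in the abstract above can be sketched with standard thresholding operators; a minimal NumPy version (the threshold values and the firm/semi-soft parametrization here are illustrative, not necessarily the paper's exact operators):

```python
import numpy as np

def hard_threshold(x, t):
    """Keep coefficients whose magnitude exceeds t; zero the rest."""
    return np.where(np.abs(x) > t, x, 0.0)

def soft_threshold(x, t):
    """Shrink coefficients toward zero by t (soft shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def semisoft_threshold(x, t1, t2):
    """Semi-soft (firm) shrinkage: zero below t1, identity above t2,
    linear interpolation in between (requires t1 < t2)."""
    a = np.abs(x)
    out = np.where(a <= t1, 0.0, np.sign(x) * t2 * (a - t1) / (t2 - t1))
    return np.where(a > t2, x, out)

coeffs = np.array([-3.0, -1.5, -0.5, 0.2, 0.8, 2.5])
print(soft_threshold(coeffs, 1.0))
```

In a full pipeline these operators would be applied to the detail coefficients of each UDWT level before reconstruction.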


Journal ArticleDOI
TL;DR: In this paper, wildcane grass stam fibers were extracted from its stem using retting and chemical (NaOH) extraction processes, and the resulting fibers were intentionally reinforced in a polyester matrix unidirectionally.
Abstract: The main objective of this study is to introduce a new natural fiber as reinforcement in polymers for making composites. Wildcane grass stalk fibers were extracted from the stem using retting and chemical (NaOH) extraction processes. These fibers were treated with KMnO4 solution to improve adhesion with the matrix. The resulting fibers were then reinforced unidirectionally in a polyester matrix, and the flexural properties of the composite were determined. The fibers extracted by the retting process have a tensile strength of 159 MPa, a modulus of 11.84 GPa, and an effective density of 0.844 g/cm3. The composites were formulated up to a maximum fiber volume fraction of 0.39, resulting in a flexural strength of 99.17 MPa and a flexural modulus of 3.96 GPa for wildcane grass fibers extracted by retting. The flexural strength and modulus of chemically extracted wildcane grass fiber composites increased by approximately 7% and 17%, respectively, compared to those of composites made from fibers extracted by the retting process. The flexural strength and modulus of KMnO4-treated fiber composites increased by 12% and 76% over those of composites made from fibers extracted by retting, and decreased by 3% and 48% relative to those of composites made from fibers extracted by the chemical process, respectively. The results of this study indicate that wildcane grass fibers have potential as reinforcing fillers in plastics for producing inexpensive materials with high toughness.

29 citations


Proceedings ArticleDOI
03 Nov 2011
TL;DR: A denoising method that uses the Undecimated Wavelet Transform to decompose the image and a shrinkage operation to eliminate the noise; the results proved that the denoised image has a better balance between smoothness and accuracy than with the DWT.
Abstract: In medical diagnosis, operations such as feature extraction and object recognition play the key role. These tasks become difficult if the images are corrupted with noise, so the development of effective algorithms for noise removal has become an important research area. Developing image denoising algorithms is a difficult task, since fine details in a medical image embedding diagnostic information should not be destroyed during noise removal. Many wavelet-based denoising algorithms use the DWT (Discrete Wavelet Transform) in the decomposition stage, which suffers from shift variance. To overcome this, in this paper we propose a denoising method that uses the Undecimated Wavelet Transform to decompose the image, with a shrinkage operation to eliminate the noise from the noisy image. In the shrinkage step we used semi-soft and Stein thresholding operators along with the traditional hard and soft thresholding operators, and verified the suitability of different wavelet families for the denoising of medical images. The results proved that the image denoised using the UDWT (Undecimated Discrete Wavelet Transform) has a better balance between smoothness and accuracy than with the DWT. We used the SSIM (Structural Similarity Index Measure) along with PSNR to assess the quality of denoised images.

21 citations
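Of the two quality measures mentioned above, PSNR is straightforward to compute from the mean squared error; a minimal sketch (SSIM is considerably more involved and is omitted here):

```python
import numpy as np

def psnr(reference, test, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized images."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

clean = np.full((8, 8), 128.0)
noisy = clean + 10.0                 # constant error of 10 grey levels
print(round(psnr(clean, noisy), 2))
```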


Journal ArticleDOI
TL;DR: In this article, the performance of ball and roller burnishing tools on a cylindrical work piece was evaluated in terms of the surface roughness and surface hardness of brass specimens; the results revealed that improvements in surface finish and increases in surface hardness are obtained by increasing the number of burnishing tool passes for both ball and roller burnishing.
Abstract: The process of burnishing is performed by applying a highly polished and hardened ball or roller with external force onto the surface of a cylindrical work piece. The burnishing process increases the surface hardness of the work piece, which in turn improves wear resistance, increases corrosion resistance, improves tensile strength, maintains dimensional stability and improves the fatigue strength by inducing residual compressive stresses in the surface of the work piece. In the present experimental work, both ball and roller burnishing tools are used. Experiments are conducted to study the performance of the ball and roller burnishing tools on lathe, along with the influence of number of burnishing tool passes on the surface roughness and surface hardness of brass specimens. The results revealed that improvements in the surface finish and increase in the surface hardness are obtained by the increase of the number of burnishing tool passes in both ball burnishing and roller burnishing on the brass specimens.

18 citations


Journal ArticleDOI
TL;DR: A simple and computationally inexpensive algorithm based on triangle subdivision method is proposed to extract additional features from the contact map and results show great promise in developing a new and simple tool for the challenging problem of fold prediction.
Abstract: The three-dimensional structure of proteins is what enables them to carry out their biophysical and biochemical functions in a cell. Approaches to protein structure/fold prediction typically extract amino acid sequence features, and machine learning approaches are then applied to the classification problem. Protein contact maps are two-dimensional representations of the contacts among the amino acid residues in the folded protein structure. This paper highlights the need for a systematic study of these contact networks. Mining of contact maps to derive features pertaining to fold information offers a new mechanism for fold discovery from the protein sequence via the contact maps. These ideas are explored in the structural class of all-alpha proteins to identify structural elements. A simple and computationally inexpensive algorithm based on a triangle subdivision method is proposed to extract additional features from the contact map. The method successfully characterizes the off-diagonal interactions in the contact map for predicting specific ‘folds’. The decision tree classification results show great promise in developing a new and simple tool for the challenging problem of fold prediction. © 2011 John Wiley & Sons, Inc. WIREs Data Mining Knowl Discov 2011 1 362–368 DOI: 10.1002/widm.35

15 citations
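A contact map of the kind mined above can be built directly from residue coordinates; a minimal sketch, assuming C-alpha positions and an 8 Å cutoff (a common convention, not necessarily the paper's exact definition):

```python
import numpy as np

def contact_map(ca_coords, cutoff=8.0):
    """Binary residue-residue contact map from C-alpha coordinates:
    residues i and j are 'in contact' if their distance is below cutoff (in Å)."""
    diff = ca_coords[:, None, :] - ca_coords[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))   # pairwise Euclidean distances
    return (dist < cutoff).astype(int)

# Toy 4-residue chain spaced 3.8 Å apart along the x axis
coords = np.array([[0.0, 0, 0], [3.8, 0, 0], [7.6, 0, 0], [11.4, 0, 0]])
cmap = contact_map(coords)
print(cmap)
```

Features such as the triangle-subdivision counts would then be computed on this symmetric 0/1 matrix.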


Journal ArticleDOI
TL;DR: A novel compact Swastika shaped patch antenna is designed in the present work, which can be used for Multiple Input Multiple Output (MIMO) systems and shows an improvement in the capacity compared to a 2×2 MIMO system developed with dipole antennas.
Abstract: A novel compact Swastika shaped patch antenna is designed in the present work, which can be used for Multiple Input Multiple Output (MIMO) systems. The proposed two element MIMO system resonates at a triband of 3.3 GHz, 5.8 GHz, and 7.1 GHz with an improved impedance bandwidth of 37% and a reduced mutual coupling of −33 dB. These results are better compared to a normal E shaped patch antenna designed with same size and thickness, achieved without using any additional decoupling methods. A 2×2 MIMO system employing the Swastika shaped patch antennas is analyzed using computational electromagnetic ray tracing software for an indoor environment. The results show an improvement in the capacity compared to a 2×2 MIMO system developed with dipole antennas. The proposed antenna is a good choice for MIMO systems operating for several Ultra WideBand (UWB) applications.

13 citations


Book ChapterDOI
02 Jan 2011
TL;DR: This work uses an entropy method to reduce high dimensionality to lower dimensionality, so that processing time can be saved without compromising efficiency; the results show high accuracy.
Abstract: Outlier detection is a popular technique that can be utilized for finding intruders. Security is becoming a critical part of organizational information systems. A Network Intrusion Detection System (NIDS) is an important detection system used as a countermeasure to preserve data integrity and system availability from attacks [2]. However, current research finds that it is extremely difficult to find outliers directly in high-dimensional datasets. In our work we used an entropy method for reducing high dimensionality to lower dimensionality, so that processing time can be saved without compromising efficiency. We propose a framework for finding outliers in high-dimensional datasets and also present the results. We implemented our proposed method on the standard KDD Cup '99 dataset, and the results show high accuracy.
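One way to realize an entropy-based dimensionality reduction of the kind described above is to score each feature by its histogram entropy and keep the top-scoring columns; a hypothetical sketch (the paper's exact entropy criterion and selection direction may differ):

```python
import numpy as np

def feature_entropy(column, bins=10):
    """Shannon entropy of one feature, estimated from a histogram."""
    counts, _ = np.histogram(column, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def select_by_entropy(data, k):
    """Keep the k highest-entropy columns (most informative under this heuristic)."""
    scores = np.array([feature_entropy(data[:, j]) for j in range(data.shape[1])])
    keep = np.sort(np.argsort(scores)[::-1][:k])
    return data[:, keep], keep

rng = np.random.default_rng(0)
X = np.column_stack([
    rng.uniform(size=200),   # high entropy: values spread out
    np.zeros(200),           # zero entropy: constant feature
    rng.normal(size=200),    # moderate-to-high entropy
])
reduced, kept = select_by_entropy(X, 2)
print(kept)
```

The constant column carries no information and is the one dropped, leaving a smaller dataset for the subsequent outlier search.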

Proceedings ArticleDOI
02 Jun 2011
TL;DR: In this paper, three different methods for controller tuning namely RST, IMC, and pole placement technique are proposed in cascaded multilevel inverter type dynamic voltage restorer for power quality improvement.
Abstract: A PID controller is used in almost all closed-loop control systems, but owing to difficulties in tuning the parameters with existing conventional methodologies like Ziegler-Nichols, Cohen-Coon, etc., there is always a need to develop a unique algorithm for the PID controller. The existing methods tune controller parameters on a trial-and-error basis. The study includes three different methods for controller tuning, namely RST, IMC, and the pole placement technique. The proposed control strategies are implemented in a cascaded multilevel inverter (CMLI) type dynamic voltage restorer for power quality improvement. Disturbance rejection capability, set-point tracking and stability aspects are discussed for the proposed strategies. Case studies are presented to test the performance of the proposed control strategies. The test circuit is developed in MATLAB/Simulink and the results are presented to validate the proposed strategy.
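For context, the conventional closed-loop Ziegler-Nichols tuning that the abstract contrasts against maps the ultimate gain Ku and oscillation period Tu to PID parameters by fixed rules; a minimal sketch of the classic rule table:

```python
def ziegler_nichols_pid(Ku, Tu):
    """Classic closed-loop Ziegler-Nichols PID rules from the ultimate
    gain Ku and ultimate oscillation period Tu."""
    Kp = 0.6 * Ku
    Ti = Tu / 2.0        # integral time
    Td = Tu / 8.0        # derivative time
    return {"Kp": Kp, "Ki": Kp / Ti, "Kd": Kp * Td}

print(ziegler_nichols_pid(Ku=10.0, Tu=2.0))
```

The methods proposed in the paper (RST, IMC, pole placement) replace this trial-and-error-derived rule with model-based design.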

Proceedings ArticleDOI
23 Mar 2011
TL;DR: The proposed approach seems promising in tests performed on the frontal-pose images of the GTAV database of AT&T using MATLAB, and can be further evaluated using different databases with poses other than the frontal pose.
Abstract: Automatic gender detection through facial features has become a critical component in the new domain of computer human observation and human-computer interaction (HCI). Automatic gender detection has numerous applications in the areas of recommender systems, focused advertising, security and surveillance. Detection of gender using facial features is done by many methods, such as Gabor wavelets, artificial neural networks and support vector machines. In this work, we have used the facial global feature distance measure as a precursor to the support vector machine based classification technique to improve the performance results. The proposed approach seems promising in tests performed on the frontal-pose images of the GTAV database of AT&T using MATLAB. The proposed method can be further evaluated in the future using different databases with poses other than the frontal pose.

Proceedings ArticleDOI
10 Oct 2011
TL;DR: This correspondence presents generalized complex orthogonal designs of rate 5/13 and rate 6/14 for five and six transmit antennas, respectively.
Abstract: Space-time block codes from orthogonal designs have two advantages, namely fast maximum-likelihood (ML) decoding and full diversity. Rate-1 real space-time codes (real orthogonal designs) for multiple transmit antennas have been constructed from the real Hurwitz-Radon families, which also provide rate-1/2 complex space-time codes (complex orthogonal designs) for any number of transmit antennas. Rate-3/4 complex orthogonal designs (space-time codes) for three and four transmit antennas exist in the literature. In this correspondence, we present generalized complex orthogonal designs of rate 5/13 and rate 6/14 for five and six transmit antennas, respectively.
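The defining property of a complex orthogonal design, which is what enables the fast ML decoding mentioned above, is that its Gram matrix is a scaled identity. A quick check with Alamouti's well-known rate-1 design for two transmit antennas:

```python
import numpy as np

def alamouti_matrix(s1, s2):
    """Alamouti's rate-1 complex orthogonal design for two transmit antennas:
    rows are symbol periods, columns are antennas."""
    return np.array([[s1, s2],
                     [-np.conj(s2), np.conj(s1)]])

s1, s2 = 1 + 2j, -0.5 + 1j
G = alamouti_matrix(s1, s2)
gram = G.conj().T @ G          # should equal (|s1|^2 + |s2|^2) * I
print(gram)
```

The higher-antenna designs in the paper satisfy the same orthogonality condition at lower rates.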

Journal ArticleDOI
TL;DR: The paper describes a model for cloud computing to implement software as a service (SaaS) and an expansion of the client/server model.
Abstract: Cloud computing is a paradigm where tasks are assigned to a combination of connections, software and services accessed over a network. Clouds provide processing power, which is made possible through distributed computing. Cloud computing can be seen both as an evolution of the traditional desktop computing model, in which the resources of a single desktop or computer are used to complete tasks, and as an expansion of the client/server model. The paper describes a model for cloud computing to implement software as a service (SaaS).

Book ChapterDOI
23 Sep 2011
TL;DR: This research intends to develop a speed-violation vehicle detection system using image processing techniques; a novel adaptive thresholding method is proposed to binarize the outputs of the interframe difference and background subtraction techniques.
Abstract: This research intends to develop a speed-violation vehicle detection system using image processing techniques. The overall work is the software development of a system that requires a video scene consisting of the following components: a moving vehicle, a starting reference point and an ending reference point. A dedicated digital signal processing chip is used to exploit computationally inexpensive image processing techniques over the video sequence captured from a fixed-position video camera for estimating the speed of the moving vehicles. The moving vehicles are detected by analyzing the binary image sequences constructed from the captured frames by employing the interframe difference or background subtraction algorithms. A novel adaptive thresholding method is proposed to binarize the outputs of the interframe difference and background subtraction techniques. The system is designed to detect the position of the moving vehicle in the scene and the positions of the reference points, calculate the speed from the positions detected in each static image frame, and report the speed-violation vehicle information to the authorised remote station.
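The interframe-difference step with an adaptive threshold can be sketched as below; the mean-plus-k-sigma threshold here is an illustrative stand-in for the paper's novel adaptive thresholding method:

```python
import numpy as np

def detect_motion(prev_frame, cur_frame, k=1.0):
    """Binarise the interframe difference with an adaptive threshold of
    mean + k * std of the difference image."""
    diff = np.abs(cur_frame.astype(np.float64) - prev_frame.astype(np.float64))
    threshold = diff.mean() + k * diff.std()
    return (diff > threshold).astype(np.uint8)

prev = np.zeros((6, 6))
cur = prev.copy()
cur[2:4, 2:4] = 200.0          # a bright 'vehicle' enters the scene
mask = detect_motion(prev, cur)
print(mask)
```

Speed would then be estimated from how the detected blob's position changes between the two reference points across frames of known timestamp.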

Book ChapterDOI
15 Jul 2011
TL;DR: A simple two-branch transmit diversity scheme using two transmit antennas and one receive antenna is presented, and the performance of the OSTBC (Alamouti) scheme is compared with a no-STBC scheme at lower as well as higher SNRs.
Abstract: The increasing demand for higher data rates and higher quality in wireless communications has motivated the use of multiple antenna elements at the transmitter and a single antenna at the receiver in a wireless link. Space-time block coding over Rayleigh fading channels using multiple transmit antennas was introduced earlier. In this work, data is encoded using a space-time block code and the encoded data is split into n streams which are simultaneously transmitted using n transmit antennas. The received signal at each receive antenna is a linear superposition of the n transmitted signals perturbed by noise. Maximum likelihood decoding is carried out by decoupling the signals transmitted from different antennas. This uses the orthogonal structure of the space-time block code and gives a maximum-likelihood decoding algorithm based only on linear processing at the receiver. Space-time block codes are designed to achieve the maximum diversity order for a given number of transmit and receive antennas, subject to the constraint of having a simple decoding algorithm. This paper presents a simple two-branch transmit diversity scheme. Using two transmit antennas and one receive antenna with the QAM modulation technique, the performance of the OSTBC (Alamouti) scheme is compared with a no-STBC scheme at lower as well as higher SNRs. The paper also evaluates the performance of the system for increasing data lengths in terms of blocks.
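The linear-processing ML decoder for the two-branch scheme described above reduces to a simple combining rule; a noise-free sanity check of the Alamouti combiner (channel gains assumed known at the receiver):

```python
import numpy as np

def alamouti_decode(r1, r2, h1, h2):
    """Linear ML combining for Alamouti's two-branch transmit diversity:
    r1, r2 are the signals received in two consecutive symbol periods,
    h1, h2 the flat-fading channel gains of the two transmit antennas."""
    gain = abs(h1) ** 2 + abs(h2) ** 2
    s1_hat = (np.conj(h1) * r1 + h2 * np.conj(r2)) / gain
    s2_hat = (np.conj(h2) * r1 - h1 * np.conj(r2)) / gain
    return s1_hat, s2_hat

# Transmit (s1, s2) in period 1 and (-s2*, s1*) in period 2
s1, s2 = 1 + 1j, -1 + 1j
h1, h2 = 0.8 - 0.3j, 0.4 + 0.9j
r1 = h1 * s1 + h2 * s2
r2 = -h1 * np.conj(s2) + h2 * np.conj(s1)
print(alamouti_decode(r1, r2, h1, h2))
```

With noise added, the combiner output would be passed to a QAM slicer; the combining step itself stays identical.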

Journal ArticleDOI
TL;DR: The proposed technique comprises two stages of enhancement, namely local statistics-based image enhancement and Genetic Algorithm based local contrast enhancement, which aids in the search for an optimal contrast factor that plays a vital role in the contrast enhancement.
Abstract: Nowadays, image enhancement finds enormous image processing applications related to practical situations. Contrast enhancement is one of the image enhancement techniques that intends to improve image visibility. Though several works on local contrast enhancement are available in the literature, their effectiveness remains an issue and the enhancement performance needs to be improved. In this paper, a local contrast enhancement technique is proposed for both grayscale images and RGB color images. The proposed technique comprises two stages of enhancement, namely local statistics-based image enhancement and Genetic Algorithm based local contrast enhancement. The former stage is a pre-enhancement stage and the latter is the major stage of enhancement. In the former stage, the image is processed on a window basis and the local statistics of the image are obtained; based on these local statistics, the image is enhanced. In the latter stage, the window-based operation is performed over the pre-enhanced image and the local contrast is enhanced. The Genetic Algorithm aids in the search for an optimal contrast factor, which plays a vital role in the contrast enhancement. The technique is evaluated with both grayscale and RGB color images, and its performance is compared with existing contrast enhancement techniques.
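The window-based pre-enhancement stage can be sketched as rescaling each pixel by its local statistics; in this illustrative version the contrast factor is fixed rather than searched for by a Genetic Algorithm:

```python
import numpy as np

def local_stats_enhance(img, window=3, target_std=40.0, eps=1e-6):
    """Window-based enhancement: stretch each pixel's deviation from the
    local mean so the local standard deviation approaches target_std."""
    h, w = img.shape
    pad = window // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            patch = padded[i:i + window, j:j + window]
            mu, sigma = patch.mean(), patch.std()
            out[i, j] = mu + (img[i, j] - mu) * target_std / (sigma + eps)
    return np.clip(out, 0, 255)

img = np.array([[100, 102], [98, 101]], dtype=float)   # low-contrast patch
print(local_stats_enhance(img))
```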

Proceedings ArticleDOI
01 Oct 2011
TL;DR: In this article, the optimum positioning of an active decoy, which is fired in the form of a cartridge from the platform of the target, is reported. Various radar and jammer parameters for effectively luring away the missile are studied. Computer simulations show that miss distances of the order of half a kilometer or more can be obtained for typical monopulse radars.
Abstract: In a battle engagement scenario, while missile interception and hard-kill options can be exercised, soft-kill options are less expensive and elegant. In this paper, the optimum positioning of an active decoy, which is fired in the form of a cartridge from the platform of the target, is reported. Various radar and jammer parameters for effectively luring away the missile are studied. Computer simulations are carried out, and it is shown that miss distances of the order of half a kilometer or more can be obtained for typical monopulse radars.

Book ChapterDOI
01 Jan 2011
TL;DR: It is noted that Cellular Neural Network based edge detection improves performance in non-standard license plate localization when compared with traditional edge detection approaches.
Abstract: Automatic license plate localization is one of the crucial steps for any intelligent transportation system. The location of the license plate is not the same for all types of vehicles, and in some developing countries the size of the license plate also varies. The localization of such a license plate requires a series of complex image processing steps in which edge detection plays a major role, as the edges give crucial information regarding the location. In order to localize the license plate in real time, the amount of data to be processed must be minimized at the stage where the edges are identified. In this work we propose Cellular Neural Network based edge detection for real-time computation of edges. The performance of the proposed method has been evaluated over a collected database of 100 images with license plates located at different locations. Based on the experimental results, we note that Cellular Neural Network based edge detection improves performance in non-standard license plate localization when compared with traditional edge detection approaches.
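A single feed-forward pass of the textbook CNN edge-detection template gives the flavor of the approach; this is a simplification of the full cellular-network dynamics, and the template and bias below are the standard edge template, not necessarily the paper's:

```python
import numpy as np

def cnn_edge_step(img, bias=-1.0):
    """One feed-forward step of the classic CNN edge template, approximated
    as a convolution with the B template followed by the piecewise-linear
    CNN output function (saturation to [-1, 1])."""
    B = np.array([[-1.0, -1, -1],
                  [-1.0,  8, -1],
                  [-1.0, -1, -1]])
    h, w = img.shape
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    state = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            state[i, j] = np.sum(B * padded[i:i + 3, j:j + 3]) + bias
    return np.clip(state, -1.0, 1.0)

# Black (-1) background with a white (+1) block: only the block's edge lights up
img = -np.ones((5, 5))
img[1:4, 1:4] = 1.0
edges = cnn_edge_step(img)
print((edges > 0).astype(int))
```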

Proceedings ArticleDOI
01 Dec 2011
TL;DR: This work summarizes the state of the art and makes a comparative study among contrast enhancement techniques; the method using Cellular Neural Networks (CNN) proved to perform better than the conventional techniques.
Abstract: Contrast enhancement is one of the primary aspects of computer vision. In order to understand an image, its contrast should be clear. In many scenarios, especially in biomedical imaging, security and surveillance, the visual quality of the source images or video is not up to the expected quality. There exist many algorithms, such as histogram equalization, genetic algorithms and neural networks, to improve the contrast of images. In this work, we summarize the state of the art and make a comparative study among contrast enhancement techniques. Comparisons are done in two cases: one among the histogram-based techniques, and another between histogram-based techniques and a method using Cellular Neural Networks (CNN). The method using CNN proved to perform better than the conventional techniques.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: A new approach of generating summary for a given input document is discussed based on identification and extraction of important sentences in the document, which obtains the selective terms from the extracted terms and builds qualitative summary with appreciable compression ratio.
Abstract: With ever-growing content on the World Wide Web, it has become increasingly difficult for users to search for relevant information. A rough estimate by the world-famous search engine Google in 2010 put the total size of the internet at 2 petabytes. Search engines, which are supposed to satisfy the user's information need, offer far more information than what is required. This problem is referred to as information overload. The field of Information Extraction (IE) offers huge scope to condense and compact the information, enabling the user to decide by a mere glance at the snippets of each link. Automatic text summarization, a subset of IE, is an important activity in the analysis of high-volume text documents. In this context, it has become increasingly important to develop information access solutions that can provide easy and efficient access to users. Automatic summarization systems address the information overload problem by producing a summary of related documents that provides an overall understanding of the topic without having to go through every document. In this paper, we propose a feature-term-based text summarization technique based on the analysis of part-of-speech tagging. A new approach to generating a summary for a given input document is discussed, based on the identification and extraction of important sentences in the document. The system obtains the selective terms from the extracted terms and builds a qualitative summary with an appreciable compression ratio.
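A minimal extractive summarizer in the spirit described above, using plain word frequency in place of the paper's POS-tag-based feature terms (illustrative only):

```python
import re
from collections import Counter

def summarize(text, n_sentences=1):
    """Score each sentence by the average document-wide frequency of its
    words and keep the top-scoring sentences in their original order."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]
    freq = Counter(re.findall(r"[a-z]+", text.lower()))

    def score(sent):
        tokens = re.findall(r"[a-z]+", sent.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    ranked = sorted(range(len(sentences)), key=lambda i: score(sentences[i]), reverse=True)
    chosen = sorted(ranked[:n_sentences])
    return " ".join(sentences[i] for i in chosen)

doc = ("Search engines return long lists of results. "
       "Summarization condenses documents. "
       "Summarization of search results helps users search faster.")
print(summarize(doc))
```

The compression ratio here is simply the length of the summary relative to the source; a POS tagger would restrict the scored tokens to selected parts of speech.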

Proceedings ArticleDOI
23 Mar 2011
TL;DR: Suffix tree clustering produces comparatively more accurate and informative grouped results, facilitating quick browsing options and providing an excellent interface to results.
Abstract: It is a common experience for web users of existing search engines like Google, Yahoo, MSN, Ask, etc., that the information related to an entered query returns a long ranked list of results (snippets). It becomes cumbersome for the user to go through each title, snippet and sometimes even the link of the search results until results relevant to the query are found. Clustering of search results is a special technique in data mining by which the retrieved results are organized into meaningful groups, easing the user's work. This paper deals with a generalized suffix tree based clustering approach. The most repeated phrase in the document tags is taken as the cluster name. In short, web search results fetched from the prevailing web search engines are grouped under phrases that contain one or more search keywords. This paper aims at organizing web search results into clusters, facilitating quick browsing options and providing an excellent interface to results. Suffix tree clustering produces comparatively more accurate and informative grouped results. A basic problem during image searching in any search engine is image repetition. This can be avoided by using the L-Point Comparison algorithm, a specially worked-out technique in the field of Information Retrieval systems, which is also discussed with a practical example.

Book ChapterDOI
23 Sep 2011
TL;DR: This paper develops an automatic controller for the inhalation of oxygen through cylinders; engineering provides effective support in the construction and simulation of such models.
Abstract: The human respiratory system is a well-developed and complex system involving many different organs such as the nasal cavity, pharynx, trachea and the lungs. Though the actual physiological function of breathing begins only at birth, the development of the respiratory tract, the diaphragm and the lungs occurs much earlier, in the embryonic stage. Structural and functional changes continue from infancy to adulthood and into old age, as the respiratory system matures with age. Modeling of the respiratory system is helpful in identifying diseases related to the lungs. Engineering provides effective support here, namely in the construction and simulation of such models. In this paper we develop an automatic controller for the inhalation of oxygen through cylinders.

Proceedings ArticleDOI
08 Apr 2011
TL;DR: This work proposes Cellular Neural Network based edge detection for real-time computation of edges for non-standard license plate localization; the method shows superior performance in non-standard license plate localization.
Abstract: Automatic license plate localization is one of the crucial steps for any intelligent transportation system. It requires a series of complex image processing steps in which edge detection plays a major role in the case of non-standard license plates, as the edges give crucial information regarding the location. In order to localize the license plate in real time, the amount of data to be processed must be minimized at the stage where the edges are identified. In this work we propose Cellular Neural Network based edge detection for real-time computation of edges. The performance of the proposed method has been tested in real time on various types of vehicles where the number plate is located in different places. Based on the experimental results, we note that our method shows superior performance in non-standard license plate localization.

Proceedings ArticleDOI
01 Dec 2011
TL;DR: This Hidden Markov Model approach to building an Anomaly-based Intrusion Detection System (ABIDS) as a network security tool works even for high-dimensional data streams, with a high detection rate, and is robust to noise.
Abstract: Today, Internet security has become a serious issue for anyone connected to the Internet. The Internet is a great tool for many things, but unfortunately it can also pose a security risk for personal information and privacy. Hidden Markov Model (HMM) based applications are common in various areas, but the incorporation of HMMs for anomaly detection is still in its infancy. There are many approaches to building an IDS; here we have chosen a Hidden Markov Model approach for building an Anomaly-based Intrusion Detection System (ABIDS) as a network security tool. This model has two phases: in the first phase the model is trained, and in the second phase the model is tested. In both phases we have used a standard masquerade dataset. This dataset contains 50 users, and each user has 15000 records. The first 5000 records have been used to train the model and the remaining 10000 records have been used for evaluation (testing) of the model. The model works even for high-dimensional data streams, with a high detection rate, and is robust to noise.
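As a simplified stand-in for the HMM (a plain first-order Markov chain with no hidden states), the train-then-test idea above can be sketched as scoring command sequences by their per-transition negative log-likelihood under a model trained on normal behaviour:

```python
import math
from collections import defaultdict

def train_transitions(sequences):
    """Estimate smoothed first-order transition probabilities from
    normal-behaviour command sequences (add-one smoothing avoids zeros)."""
    counts = defaultdict(lambda: defaultdict(int))
    symbols = set()
    for seq in sequences:
        symbols.update(seq)
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    v = len(symbols)
    return {a: {b: (counts[a][b] + 1) / (sum(counts[a].values()) + v)
                for b in symbols}
            for a in symbols}

def anomaly_score(seq, probs):
    """Average negative log-likelihood per transition; higher = more anomalous."""
    nll = sum(-math.log(probs[a][b]) for a, b in zip(seq, seq[1:]))
    return nll / max(len(seq) - 1, 1)

normal = [["ls", "cd", "ls", "cat", "ls", "cd", "ls", "cat"]] * 5
model = train_transitions(normal)
print(anomaly_score(["ls", "cd", "ls", "cat"], model))   # familiar pattern
print(anomaly_score(["cat", "cat", "cat", "cat"], model))  # unfamiliar pattern
```

Thresholding this score separates masquerade sessions from normal ones; an HMM adds hidden states and the forward algorithm on top of the same idea.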

Proceedings ArticleDOI
08 Apr 2011
TL;DR: The results obtained show that the location of the license plate can be found by following the proposed morphology-based method; future work focuses on reading the characters of the located license plate.
Abstract: Automatic license plate recognition is one of the crucial steps for any intelligent transportation application. License plate recognition is done in many ways. However, license plate recognition in the Indian scenario is very complex because there is no unique standard for the plate. Even though the government has stipulated standards on how to write the plate and where to display it, many people disobey those rules, making it complex for systems to recognize. The proposed morphology-based method aptly suits the Indian context. The results obtained show that the location of the license plate can be found by following the proposed method. Future work focuses on reading the characters of the located license plate.

Proceedings ArticleDOI
22 Dec 2011
TL;DR: This paper presents a brief idea of how the authors can statistically estimate the emotional characteristics of a speech notes by analyzing the content in the speech notes through Information Extraction.
Abstract: The human being is the only creature in the world who can express emotions in various forms. Capturing the extent of emotion in particular speech notes through quantification of verbal expressions is undoubtedly a challenging area of study. Whether the emotions in the notes are positive or negative, if we succeed in assessing the degree of positiveness or negativeness, the impact of the notes on the intended audience can easily be pre-estimated. This paper presents a brief idea of how we can statistically estimate the emotional characteristics of speech notes by analyzing their content through Information Extraction. Since it is mandatory for any sentence to have at least a verb, we primarily concentrate on the verbs as feature terms in each sentence, and thereby statistically estimate the rigorousness of the verbs over the entire speech note. The strength of a verb, in its severity, can be measured by comparing it against specially developed strong, medium and light corpora of emotions. This enables us to judge the effectiveness of speech notes across various emotions.
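The verb-severity comparison can be sketched with hypothetical mini-corpora; the verb lists and weights below are invented for illustration and are not the paper's corpora:

```python
# Hypothetical mini-corpora of emotion-bearing verbs, standing in for the
# specially developed strong/medium/light corpora the paper mentions.
STRONG = {"destroy", "hate", "adore"}
MEDIUM = {"dislike", "praise"}
LIGHT = {"note", "mention"}

def verb_emotion_score(verbs):
    """Average severity of the verbs found in a speech note:
    strong = 3, medium = 2, light = 1, unknown = 0."""
    weights = {}
    for v in STRONG:
        weights[v] = 3
    for v in MEDIUM:
        weights[v] = 2
    for v in LIGHT:
        weights[v] = 1
    if not verbs:
        return 0.0
    return sum(weights.get(v, 0) for v in verbs) / len(verbs)

# The verbs would normally come from a POS tagger; they are given by hand here.
print(verb_emotion_score(["hate", "mention"]))
```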

Proceedings ArticleDOI
01 Oct 2011
TL;DR: In this paper, particle movement is determined in an epoxy-coated gas insulated busduct with SF6 and N2 gas mixtures as the insulating medium; the conducting particle materials considered for study are aluminium, copper and silver.
Abstract: Due to an exceptional combination of physical and chemical properties, SF6 has become the only insulating gas used in gas insulated electric power transmission and distribution equipment. However, the fact that SF6 is one of the strongest man-made greenhouse gases has prompted a search for substitute gases with lower or no environmental impact. Thus, there is a need to develop alternative dielectric gases or gas mixtures with better insulating characteristics and no greenhouse effect. The main issue concerning the practical use of such mixtures is their behavior in the presence of metallic particle contamination. In this paper, particle movement is determined in an epoxy-coated gas insulated busduct with SF6 and N2 gas mixtures as the insulating medium. The conducting particle materials considered for study in the gas mixture are aluminium, copper and silver. The results have been analyzed and presented.

Book ChapterDOI
11 Aug 2011
TL;DR: The investigation is to control the traffic at the road junction by applying a few inference rules, which reduces the waiting time and increases the efficiency of the traffic light controller intelligently.
Abstract: In this paper we have adopted an agent approach to traffic light control. Accordingly, the proposed system contains agents and their world, which in turn contains roads, cars, traffic lights, etc. Each of these agents observes the traffic density and controls the traffic light at the junction using the observe-think-act rule: the agents continuously observe the traffic and, depending on the density and waiting time, decide which rule can be inferred, finally applying the decision to the traffic light controller, which can efficiently manage the traffic flow near the junction. The system has been implemented using a NetLogo based traffic simulator. The investigation is to control the traffic at the road junction by applying a few inference rules. This reduces the waiting time and increases the efficiency of the traffic light controller intelligently.
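An observe-think-act decision of the kind described can be sketched as a rule over observed densities and waiting times; the specific rules and threshold below are illustrative, not the paper's NetLogo rule set:

```python
def decide_green(densities, waits, min_wait=60):
    """Observe-think-act rule for a junction: give green to the approach
    with the highest queue density, unless some approach has waited longer
    than min_wait seconds, in which case the longest-waiting one is served."""
    overdue = [i for i, w in enumerate(waits) if w >= min_wait]
    if overdue:
        return max(overdue, key=lambda i: waits[i])   # fairness rule wins
    return max(range(len(densities)), key=lambda i: densities[i])

# Four approaches: north, east, south, west
print(decide_green(densities=[5, 12, 3, 8], waits=[10, 0, 20, 30]))
print(decide_green(densities=[5, 12, 3, 8], waits=[10, 0, 70, 30]))
```

In an agent simulation, each cycle re-observes the queues and re-applies the rule, which is what drives the reported reduction in waiting time.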