
Showing papers in "International Journal of Innovative Research in Computer and Communication Engineering in 2014"


Journal Article
TL;DR: A brief survey of different object detection, object classification and object tracking algorithms available in the literature including analysis and comparative study of different techniques used for various stages of tracking is presented.
Abstract: The goal of object tracking is to segment a region of interest from a video scene and keep track of its motion, position and occlusion. Object detection and object classification are the preceding steps for tracking an object in a sequence of images. Object detection is performed to check the existence of objects in the video and to precisely locate those objects. A detected object can then be classified into various categories such as humans, vehicles, birds, floating clouds, swaying trees and other moving objects. Object tracking is performed by monitoring an object's spatial and temporal changes during a video sequence, including its presence, position, size, shape, etc. Object tracking is used in several applications such as video surveillance, robot vision, traffic monitoring, video inpainting and animation. This paper presents a brief survey of different object detection, object classification and object tracking algorithms available in the literature, including an analysis and comparative study of the techniques used for the various stages of tracking.
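
As a minimal illustration of the detection stage surveyed above, the sketch below does background subtraction by frame differencing with OpenCV, one common choice among the techniques the paper reviews; the video path, difference threshold and blob-size cutoff are illustrative assumptions.

```python
# Frame-differencing detection sketch (OpenCV assumed available).
import cv2

cap = cv2.VideoCapture("traffic.mp4")        # hypothetical input video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)                        # temporal change
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)  # moving-pixel mask
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 500:                           # ignore small blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    prev_gray = gray
cap.release()
```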

156 citations


Journal Article
TL;DR: A simple algorithm for detecting the range and shape of tumors in brain MR images is implemented, with the tumor diagnosed through segmentation using k-means clustering and fuzzy c-means algorithms.
Abstract: A tumor is an uncontrolled growth of tissue in any part of the body. Tumors are of different types, with different characteristics and different treatments. This paper implements a simple algorithm for detecting the range and shape of a tumor in brain MR images. Normally the anatomy of the brain can be viewed by an MRI or CT scan; an MRI-scanned image is used for the entire process. The MRI scan is more comfortable than other scans for diagnosis: it does not affect the human body because it uses no radiation, relying instead on magnetic fields and radio waves. Different types of algorithms have been developed for brain tumor detection, but they may have drawbacks in detection and extraction. After segmentation, which is done through k-means clustering and fuzzy c-means algorithms, the brain tumor is detected and its exact location is identified. Compared with the other algorithms, fuzzy c-means performs best. This process determines the patient's stage and whether the tumor can be cured with medicine or not.
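
The segmentation step described above clusters MR pixel intensities; the sketch below is a minimal NumPy implementation of fuzzy c-means on a grayscale slice. The cluster count, the random placeholder image and the rule that the brightest cluster is the tumor candidate are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np

def fuzzy_c_means(pixels, n_clusters=3, m=2.0, max_iter=100, tol=1e-5):
    """Cluster 1-D intensity values; returns cluster centers and memberships."""
    x = pixels.reshape(-1, 1).astype(float)           # (N, 1) intensities
    rng = np.random.default_rng(0)
    u = rng.random((len(x), n_clusters))
    u /= u.sum(axis=1, keepdims=True)                 # fuzzy memberships sum to 1
    for _ in range(max_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        dist = np.abs(x - centers.T) + 1e-10          # (N, c) distances to centers
        inv = dist ** (-2.0 / (m - 1))
        u_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(u_new - u).max() < tol:
            u = u_new
            break
        u = u_new
    return centers.ravel(), u

# Usage: the cluster with the highest center is taken here as the tumor candidate
# region (assumption); in practice it would be refined morphologically.
image = np.random.randint(0, 256, (128, 128))         # placeholder for an MR slice
centers, u = fuzzy_c_means(image)
labels = u.argmax(axis=1).reshape(image.shape)
tumor_mask = labels == centers.argmax()
```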

82 citations


Journal Article
TL;DR: Digital image processing is a technique to enhance raw images received from cameras/sensors placed on satellites, space probes and aircraft, or pictures taken in normal day-to-day life, for various applications.
Abstract: Digital image processing is always an interesting field as it gives improved pictorial information for human interpretation and enables processing of image data for storage, transmission, and representation for machine perception. Image processing is a technique to enhance raw images received from cameras/sensors placed on satellites, space probes and aircraft, or pictures taken in normal day-to-day life, for various applications. The field of image processing has improved significantly in recent times and has extended to various fields of science and technology. Image processing mainly deals with image acquisition, image enhancement, image segmentation, feature extraction, image classification, etc.

50 citations


Journal Article
TL;DR: Three routing protocols, AODV (Ad-Hoc On-Demand Distance Vector), OLSR (Optimized Link State Routing Protocol) and DSR (Dynamic Source Routing Protocol), along with many other algorithms, are described briefly.
Abstract: A mobile ad hoc network is a collection of wireless nodes that can dynamically be set up anywhere and anytime to exchange information without using any pre-existing network infrastructure. It is a self-organized and self-configurable network in which the mobile nodes move randomly. In MANETs, mobile nodes can receive and forward packets as a router; each node operates not only as an end system but also as a router that forwards packets. There is no fixed infrastructure, so any number of nodes can join or leave the network. For relatively small networks, simple routing protocols may be sufficient; however, in larger networks either hierarchical or geographic routing protocols are needed. In this survey paper, three routing protocols, AODV (Ad-Hoc On-Demand Distance Vector), OLSR (Optimized Link State Routing Protocol) and DSR (Dynamic Source Routing Protocol), along with many other algorithms, are described briefly.

44 citations


Journal Article
TL;DR: This paper proposes a data replication strategy which adaptively selects the data files for replication in order to improve the overall reliability of the system and to meet the required quality of services.
Abstract: Unlike traditional high-performance computing environments, such as clusters and supercomputers, cloud computing is a collection of interconnected and virtualized computing resources that are managed as one unified high-performance computing resource. However, the Cloud constitutes a heterogeneous and highly dynamic environment. Failures on data center nodes are the norm rather than the exception because of the large scale of physical resources and data. As a result, the Cloud environment requires efficient, adaptive data replication management in order to cope with its inherent characteristics. In this paper, we propose a data replication strategy which adaptively selects the data files for replication in order to improve the overall reliability of the system and to meet the required quality of service. Further, the proposed strategy dynamically decides the number of replicas as well as the effective data nodes for replication. Popular data files are selected for replication by employing a lightweight time-series technique, which analyzes the recent pattern of data file requests and provides predictions for future data requests. Experimental results show that the proposed strategy behaves effectively to improve the reliability of the Cloud system under study.
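
As a rough illustration of the adaptive idea (not the paper's exact method), the sketch below forecasts each file's request rate with simple exponential smoothing, a lightweight time-series model, and scales the replica count with the predicted demand; the smoothing factor, popularity threshold and per-replica capacity are assumed values.

```python
from collections import defaultdict

ALPHA = 0.5                 # smoothing factor (assumed)
POPULARITY_THRESHOLD = 10   # predicted requests per interval that trigger replication (assumed)

predicted = defaultdict(float)

def update_prediction(file_id, requests_this_interval):
    """Exponentially smoothed forecast of the next interval's request count."""
    predicted[file_id] = (ALPHA * requests_this_interval
                          + (1 - ALPHA) * predicted[file_id])
    return predicted[file_id]

def replicas_needed(file_id, current_replicas, per_replica_capacity=5):
    """Decide dynamically how many replicas a file should have."""
    demand = predicted[file_id]
    if demand < POPULARITY_THRESHOLD:
        return current_replicas
    # enough replicas so that expected load per replica stays below capacity
    return max(current_replicas, int(-(-demand // per_replica_capacity)))

update_prediction("blockA", 30)                     # observed 30 requests this interval
print(replicas_needed("blockA", current_replicas=1))  # -> 3 replicas under these assumptions
```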

39 citations


Journal Article
TL;DR: A survey of recent trends and a new strategy for securing trajectories of moving data objects is presented.
Abstract: Recently there has been a steep rise in the usage of location-aware devices such as GSM mobile phones, GPS-enabled PDAs, location sensors, and active RFID tags. These devices generate large collections of moving data objects in the form of trajectory data, and all these data are used for various identification and analysis processes. Consider traffic control, for instance: one could hack the control unit of a traffic management system. It is therefore clear that an attacker may collect temporal data to uncover sensitive messages of an organization, and in particular may discover much personal information about third parties or the checkpoints of many premises; personal data (data privacy) may also be compromised. Even when users' identities are replaced, quasi-identifiers (QIDs) of moving data can be linked to external information to re-identify individuals, so an attacker is able to track and trace anonymous moving objects back to individuals. This paper gives a survey of recent trends and a new strategy for securing trajectories of moving data objects.

33 citations


Journal Article
TL;DR: Methods developed to protect data in distributed environments over the past 30 years are enumerated, and MyPHRMachines, a patient-centric system that takes a radically new architectural approach to health record interoperability, is introduced.
Abstract: This paper reviews methods to protect medical data in distributed clouds over the past 15 years. Methods developed to protect data in distributed environments over the past 30 years are enumerated. Developing strategies to securely store data across the cloud is a much-focused topic of research in recent days. Cloud computing focuses on maximizing the effectiveness of shared resources, and cloud storage provides a convenient means of storing and retrieving huge amounts of data. Personal Health Records (PHRs) should remain the lifelong property of patients and should be displayable conveniently and securely to selected caregivers. MyPHRMachines is a patient-centric system that takes a radically new architectural approach to health record interoperability: patients can upload their medical data and then access and share it through a remote virtual machine. We have made a literature survey on techniques to protect PHRs and find that the open prototype of MyPHRMachines supports the use case of a real-world patient scenario.

30 citations


Journal Article
TL;DR: This paper reviews methods to protect moving data objects over the past 30 years and portrays decentralization methods to preserve privacy, dummy node and cloaking region security methods, and location-based services for securing moving data objects.
Abstract: This paper reviews methods to protect moving data objects over the past 30 years. Data disclosure prevention techniques such as disclosure limiting and ad-hoc approval of published data are depicted. Privacy homomorphism and encryption methods such as the Data Protection Directive, the Commercial Masking Facility algorithm, the Data Encryption Algorithm and the post-randomization method are also discussed in detail. Knowledge discovery and data mining techniques to preserve privacy, such as k-anonymity, Advanced Traveler Information Systems (ATIS) and Geographical Information Systems (GIS), are elaborately studied. The partition-and-group framework for clustering trajectories (the TRACLUS algorithm), the secure location verification proof gathering protocol (SLVPGP) and a large-scale quantitative analysis of Brightkite, a commercial location-based social network (LSN), are also studied in detail. Decentralization methods to preserve privacy, dummy node and cloaking region security methods, and location-based services for securing moving data objects are portrayed.

30 citations


Journal Article
TL;DR: A Term Frequency Based Sequence Generation Algorithm (TFSGA) is proposed which creates a node sequence based on the term frequency of tuples with minimal distortion, and the efficiency of the proposed algorithm is shown experimentally under varying cluster sizes.
Abstract: A great deal of research has been performed in the area of graphical data anonymization, because of the wide range of applications of graphical data, from social network data to large warehouse data and knowledge engineering domains. The notion of k-anonymity has been proposed in the literature as a framework for protecting privacy, emphasizing the lemma that for a database to be k-anonymous, every tuple should be indistinguishable from at least k-1 other tuples with respect to their quasi-identifiers (QIDs). In spite of the existence of the k-anonymity framework, malicious users and misfeasors may gain access to sensitive information if a set of nodes exhibits alike attributes. In this paper we make a systematic analysis of the structure anonymization mechanisms and models proposed in the literature. We also discuss the simulation analysis of KDLD model creation and construction. We propose a Term Frequency Based Sequence Generation Algorithm (TFSGA) which creates a node sequence based on the term frequency of tuples with minimal distortion. We experimentally show the efficiency of the proposed algorithm under varying cluster sizes.
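
Since TFSGA's pseudocode is not reproduced here, the sketch below shows one plausible reading of "sequence generation based on term frequency": nodes whose quasi-identifier tuples are built from frequent terms are ordered first, so that similar tuples end up adjacent and grouping them causes little distortion. The node data are hypothetical.

```python
from collections import Counter

nodes = {                                   # hypothetical node -> QID tuple
    "n1": ("30-40", "engineer", "M"),
    "n2": ("30-40", "teacher", "F"),
    "n3": ("20-30", "engineer", "M"),
    "n4": ("30-40", "engineer", "F"),
}

# term frequency over all QID values in the dataset
tf = Counter(term for qid in nodes.values() for term in qid)

def tuple_score(qid):
    """Higher score = tuple built from more common terms (cheaper to group)."""
    return sum(tf[term] for term in qid)

sequence = sorted(nodes, key=lambda n: tuple_score(nodes[n]), reverse=True)
print(sequence)   # nodes with frequent terms come first, easing k-grouping
```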

29 citations


Journal Article
TL;DR: A novel synergized k-degree l-diversity t-closeness model to effectively anonymize graphical data at marginal information loss, thereby controlling the distribution of sensitive information in graphical structure based on the threshold value is proposed.
Abstract: Privacy is becoming a major concern when publishing sensitive information about individuals in social network data. A lot of anonymization techniques have evolved to protect the sensitive information of individuals. k-anonymity is one data anonymization framework for protecting privacy; it emphasizes the lemma that every tuple should be indistinguishable from at least k-1 other tuples with respect to their quasi-identifiers (QIDs). Researchers have developed privacy models similar to k-anonymity, but the label-node relationship is still not well protected. In this paper, we propose a novel synergized k-degree l-diversity t-closeness model to effectively anonymize graphical data at marginal information loss, thereby controlling the distribution of sensitive information in the graphical structure based on a threshold value. Experimental evidence indicates a substantial reduction in the information loss ratio during synergy.
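
One ingredient of the proposed model is easy to illustrate: after nodes are grouped into anonymized equivalence classes, l-diversity requires every class to contain at least l distinct sensitive labels. The sketch below checks that condition on hypothetical groups; the k-degree and t-closeness parts of the model are not shown.

```python
from collections import defaultdict

records = [                     # (equivalence class id, sensitive label) - hypothetical
    ("g1", "flu"), ("g1", "cancer"), ("g1", "flu"),
    ("g2", "flu"), ("g2", "flu"),
]

def is_l_diverse(records, l):
    """True if every equivalence class holds at least l distinct sensitive values."""
    sensitive_per_group = defaultdict(set)
    for group, label in records:
        sensitive_per_group[group].add(label)
    return all(len(labels) >= l for labels in sensitive_per_group.values())

print(is_l_diverse(records, l=2))   # False: group g2 holds only one sensitive value
```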

29 citations


Journal Article
TL;DR: This proposed system discusses an effective way of detecting grape diseases through leaf feature inspection, in which leaf images are captured and processed to determine the health status of each plant.
Abstract: Producing grapes is a daunting task as the plant is exposed to attacks from various microorganisms, bacterial diseases and pests. The symptoms of the attacks are usually distinguished through inspection of the leaves, stems or fruit. This proposed system discusses an effective way of detecting grape diseases through leaf feature inspection. A leaf image is captured and processed to determine the health status of each plant. Plant disease diagnosis is an art as well as a science. The diagnosis process (i.e. recognition of symptoms and signs) is inherently visual and requires intuitive judgement as well as the use of scientific methods. Photographic images of the symptoms and signs of plant diseases, used extensively to enhance descriptions of plant diseases, are invaluable in research, diagnostics, etc.

Journal Article
TL;DR: This paper investigates the applications of Cuckoo algorithm in various domains and describes the improved version of CS algorithm namely: Binary CS, Modified CS and Improved CS.
Abstract: Cuckoo Search (CS) is a heuristic search algorithm inspired by the reproduction strategy of cuckoos. This paper investigates the applications of the Cuckoo Search algorithm in various domains. The applications of CS include optimizing the weights of neural networks, the parameters of support vector machines and radial basis functions, job scheduling, finding optimal cluster heads in wireless sensor networks, finding shortest paths, and clustering. The paper also describes improved versions of the CS algorithm, namely Binary CS, Modified CS and Improved CS.
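
For reference, the sketch below is a compact version of the standard Cuckoo Search loop (Yang and Deb style) minimizing a test function with Lévy-flight moves and nest abandonment; the parameter values are typical defaults, not values taken from this paper.

```python
import numpy as np

def levy_step(dim, beta=1.5, rng=np.random.default_rng(1)):
    """Mantegna's algorithm for Levy-distributed step sizes."""
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(f, dim=2, n_nests=15, pa=0.25, iters=200):
    rng = np.random.default_rng(0)
    nests = rng.uniform(-5, 5, (n_nests, dim))
    fitness = np.array([f(x) for x in nests])
    for _ in range(iters):
        best = nests[fitness.argmin()]
        for i in range(n_nests):                      # Levy-flight move relative to the best nest
            new = nests[i] + 0.01 * levy_step(dim) * (nests[i] - best)
            if (fn := f(new)) < fitness[i]:
                nests[i], fitness[i] = new, fn
        abandon = rng.random(n_nests) < pa            # abandon a fraction pa of the worst nests
        nests[abandon] = rng.uniform(-5, 5, (abandon.sum(), dim))
        fitness[abandon] = [f(x) for x in nests[abandon]]
    return nests[fitness.argmin()], fitness.min()

sphere = lambda x: float(np.sum(x ** 2))
print(cuckoo_search(sphere))
```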

Journal Article
TL;DR: This paper reviews two asymmetric algorithms, RSA and El-Gamal, and their implications for secure file transmission in banking transactions, e-shopping, etc.
Abstract: Cryptography is used to make data transmission over networks secure. The algorithm selected for cryptography should meet the conditions of authentication, confidentiality, integrity and non-repudiation. The prevention of unauthorized access to information is the main concern in the area of cryptography. There are many cases where we need secure file transmission, for example in banking transactions, e-shopping, etc. The RSA and El-Gamal algorithms are asymmetric key cryptography, also called public key cryptography. In this paper we review these two asymmetric algorithms, RSA and El-Gamal.
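
To make the asymmetric-key idea concrete, the sketch below runs textbook RSA with toy primes; real deployments use large primes and padding such as OAEP, so this is illustration only, not the paper's implementation.

```python
# Textbook RSA with tiny primes, for illustration of the public/private key idea only.
from math import gcd

p, q = 61, 53                      # toy primes (assumed for illustration)
n = p * q                          # public modulus
phi = (p - 1) * (q - 1)
e = 17                             # public exponent, coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)                # private exponent (modular inverse, Python 3.8+)

message = 42
cipher = pow(message, e, n)        # encryption: c = m^e mod n
plain = pow(cipher, d, n)          # decryption: m = c^d mod n
assert plain == message
print(cipher, plain)
```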

Journal Article
TL;DR: This paper presents the review of feature detection techniques for image mosaicing which is an important research subject in the field of computer vision and discusses some techniques which are commonly used.
Abstract: This paper presents a review of feature detection techniques for image mosaicing, which is an important research subject in the field of computer vision. Image mosaicing is the process of combining several overlapping images to create a single continuous image. Feature extraction methods extract distinct features from the images, such as edges and corners, which can be used to match similarities for estimating the relative transformation between the images. Feature-based methods have shown much advantage over direct mosaicing methods in both time and space complexity. Thus a large amount of research has been done on feature extraction and feature matching algorithms to improve algorithm execution in terms of speed and space. Here we discuss some techniques which are commonly used for image mosaicing.
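
One common feature-based pipeline the review covers can be sketched with OpenCV: detect ORB keypoints in an overlapping pair, match descriptors, and estimate the relative transformation as a RANSAC homography. The image filenames and parameters below are illustrative assumptions.

```python
import cv2
import numpy as np

img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)     # hypothetical overlapping pair
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)             # corners/edges as keypoints
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Homography relating the two views; warping img1 by H places it in img2's frame.
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(H)
```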

Journal Article
TL;DR: This paper reviews existing denoising algorithms and performs a comparative study using several thresholding techniques such as BayesShrink, SureShrink, and VisuShrink; a quantitative measure of comparison is provided by the SNR (signal-to-noise ratio) and the mean square error (MSE).
Abstract: The main challenge in digital image processing is to remove noise from the original image. This paper reviews the existing denoising algorithms and performs a comparative study of them. Different noise models, including additive and multiplicative types, are discussed in the paper. Selection of the denoising algorithm is application dependent; hence, it is necessary to have knowledge about the noise present in the image so as to select the appropriate denoising algorithm. Here we present results of different wavelet-based image denoising approaches using several thresholding techniques such as BayesShrink, SureShrink, and VisuShrink. A quantitative measure of comparison is provided by the SNR (signal-to-noise ratio) and the mean square error (MSE).
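
The VisuShrink approach mentioned above can be sketched with PyWavelets (assumed to be available): estimate the noise level from the finest diagonal sub-band and soft-threshold all detail coefficients with the universal threshold sigma * sqrt(2 log N). The placeholder image and MSE helper below are for illustration only.

```python
import numpy as np
import pywt

def visushrink_denoise(image, wavelet="db4", level=2):
    coeffs = pywt.wavedec2(image.astype(float), wavelet, level=level)
    # robust noise estimate from the finest diagonal detail band (MAD / 0.6745)
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    thresh = sigma * np.sqrt(2 * np.log(image.size))
    denoised = [coeffs[0]]                      # keep the approximation band
    for detail_bands in coeffs[1:]:
        denoised.append(tuple(pywt.threshold(band, thresh, mode="soft")
                              for band in detail_bands))
    return pywt.waverec2(denoised, wavelet)

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

noisy = np.random.normal(128, 20, (256, 256))   # placeholder noisy image
estimate = visushrink_denoise(noisy)
print(mse(noisy, estimate))
```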

Journal Article
TL;DR: A new version of the Advanced Encryption Standard algorithm with efficient utilization of resources such as processor and memory, providing high security and throughput while consuming less memory and processing power.
Abstract: This paper presents a new version of the Advanced Encryption Standard algorithm with efficient utilization of resources such as processor and memory. The new algorithm, AES-512, uses a 512-bit input block and a 512-bit key. Due to this provision it becomes more resistant to linear and differential cryptanalysis, providing high security and throughput while consuming less memory and processing power. The results show a tremendous increase in throughput, up to 230% over the 128-bit AES algorithm.

Journal Article
TL;DR: A critical review of the brain-computer interface system and robotics for manufacturing applications is presented, and the Human Threading™ technique developed to maximize human and machine interaction is described.
Abstract: Robots are employed in a variety of applications and are available in a wide range of configurations. The need to respond to the environment without using the nervous system's efferent pathways has initiated a new interaction system that can boost and speed up the human sensor-effector system. To maximize human and machine interaction, the Human Threading™ technique has been developed to merge observations made on the human cognitive system, neuro-anatomical structures, finite state machines and their associated relationships. The Brain-Computer Interface (BCI) is used to create a robust communication system that can interpret human intentions and cognitive emotions, reflected in the corresponding brain signals, into control signals for robotic manipulation. Efficient brain-computer interfaces use neural signal recording devices that are able to record neural signals continuously over long periods of time through Positron Emission Tomography (PET), functional Magnetic Resonance Imaging (fMRI), functional Near-Infrared Imaging (fNIR), Electroencephalography (EEG) and Electrocorticography (ECoG). The paper presents a critical review of the brain-computer interface system and robotics for manufacturing applications.

Journal Article
TL;DR: An attempt has been made to understand the root cause of the authentication attacks and propose possible mitigation measures in a cloud environment.
Abstract: Cloud computing is an evolving computing paradigm that offers great potential to improve productivity and operational efficiency. This recently developed technology supports resource sharing and multi-tenancy, which in turn contributes towards reduced capital and operational expenditure. While cost and ease of use are the main benefits of cloud computing, trust and security are the two top concerns of users of cloud services. The providers of this fast-growing technology need to address many issues related to virtualization, distributed computing, application security, identity management, access control and authentication. However, strong user authentication that restricts illegal access to the service-providing servers is the paramount requirement for securing the cloud environment. In this regard, the paper focuses on identifying the various authentication attacks in the cloud environment. An attempt has been made to understand the root cause of the authentication attacks and propose possible mitigation measures in a cloud environment.

Journal Article
TL;DR: This paper proposes a system which uses a neural network and a decision tree (ID3) to predict heart attacks; the results of the prediction are more accurate than those of other techniques.
Abstract: The healthcare environment is more and more data-enriched, but the amount of knowledge obtained from those data is very small because of the lack of data analysis tools. We need to extract the hidden relationships from the data. In the healthcare system there are techniques already in use to predict heart attacks, but the available techniques, such as Naive Bayes, lack some accuracy. This paper proposes a system which uses a neural network and a decision tree (ID3) to predict heart attacks. A dataset with 6 attributes is used to diagnose heart attacks; the dataset used is the acath heart attack dataset provided by the UCI machine learning repository. The results of the prediction give more accurate output than the other techniques.
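
The decision-tree half of the proposed system can be approximated with scikit-learn's entropy criterion (ID3-style splits); the sketch below uses synthetic placeholder data, not the actual acath dataset, and the attribute layout is an assumption.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
# 6 attributes (e.g. age, sex, cholesterol, symptom duration, ...) - assumed layout
X = rng.random((300, 6))
y = (X[:, 0] + X[:, 2] > 1.0).astype(int)        # synthetic "heart attack" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(criterion="entropy", max_depth=4, random_state=0)
clf.fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```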

Journal Article
TL;DR: A novel framework for recognizing and identifying plants is proposed; it has the ability to identify tree species from photographs of their leaves and provides accurate results in less time.
Abstract: With the evolution of technology, people have adapted their day-to-day lives to utilize the benefits of highly advanced technologies. Plants are among the earth's most useful and beautiful products of nature, and crucially, many plants are at risk of extinction. Most ayurvedic medicines are prepared using plant leaves, and many of these plant species belong to the endangered group. Hence it is vitally important to set up a database for plant protection. Even today, identification and classification of unknown plant species are performed manually by expert personnel who are very few in number. The important aspect is therefore to develop a system which classifies plants. In this paper a novel framework for recognizing and identifying plants is proposed. Shape, vein, color and texture features are used to identify the leaf, and a neural network approach is used to classify them. This is an intelligent system which has the ability to identify tree species from photographs of their leaves, and it provides accurate results in less time.

Journal Article
TL;DR: This paper presents a fast autonomous network reconfiguration system which enables multi-radio Wireless Mesh Networks to recover from link failures automatically and maintain network performance.
Abstract: Mesh networks have the advantages of fast deployment, easy maintenance and low direct investment compared with existing networks. Wireless mesh networks are implemented as wireless anchors, but they are not stable: WMNs experience frequent link failures caused by channel interference, dynamic obstacles and/or applications' bandwidth demands. These failures cause severe performance degradation in WMNs. This paper presents a fast autonomous network reconfiguration system (FARS) which enables multi-radio Wireless Mesh Networks to recover from link failures automatically and maintain network performance. FARS makes the necessary changes in local radio and channel allocations to recover from failures, and cooperating routers reconfigure the network protocol accordingly. FARS has been tested widely on Wireless Mesh Networks, and the implementation results show failure recovery of more than 97%. A recovery mechanism has been introduced in order to maintain network performance when failures occur. The infrastructure will be implemented to create disjoint paths in those frameworks.

Journal Article
TL;DR: Two algorithms are proposed based on Improved Max-min in which, instead of selecting the largest task, a task just greater than the average execution time is selected and assigned to the resource which gives the minimum completion time.
Abstract: In this paper a unique modification to the Improved Max-min algorithm is proposed. In the Improved Max-min algorithm, the largest job is selected and assigned to the resource which gives the minimum completion time. Here, two algorithms based on Improved Max-min are proposed in which, instead of selecting the largest task, a task just greater than the average execution time is selected and assigned to the resource which gives the minimum completion time. The experimental results show that the new algorithms schedule jobs with a lower makespan.
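
A small sketch of the proposed modification follows: pick the task whose execution time is just above the current average (rather than the largest task), then assign it to the resource with the minimum completion time. The task lengths, resource speeds and heterogeneity model below are illustrative assumptions.

```python
def schedule(task_lengths, speeds):
    """Improved Max-min variant: schedule the task just above the average length."""
    ready = dict(enumerate(task_lengths))        # task id -> length (e.g. in MI)
    load = [0.0] * len(speeds)                   # current finish time per resource
    assignment = {}
    while ready:
        avg = sum(ready.values()) / len(ready)
        candidates = {t: l for t, l in ready.items() if l >= avg}
        task = min(candidates, key=candidates.get)   # task just greater than the average
        # resource giving the minimum completion time for this task
        res = min(range(len(speeds)),
                  key=lambda r: load[r] + ready[task] / speeds[r])
        load[res] += ready[task] / speeds[res]
        assignment[task] = res
        del ready[task]
    return assignment, max(load)                 # task->resource map and makespan

print(schedule([20, 5, 40, 12, 33, 8], speeds=[1.0, 2.0]))
```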

Journal Article
TL;DR: The focus of this research is to optimize the four S-boxes into two S-boxes in the original Blowfish algorithm to increase speed, and to examine the effectiveness and limitations of some block cipher algorithms.
Abstract: The internet plays an important role in day-to-day life. People can transfer important data through the internet, such as email, banking transactions and online purchases. In order to get secure transactions, network security is essential, and network security is mostly achieved through the use of cryptography. Cryptography refers to the art and science of transforming messages to make them secure and immune to attacks. Different algorithms and protocols are used to protect the data. The efficiency of an algorithm is measured by execution time and throughput; using a larger key size may affect the efficiency of the algorithm. Blowfish is a symmetric block cipher with a 64-bit block size and a variable key length from 32 bits to 448 bits. The Blowfish algorithm keeps two subkey arrays: four S-boxes and a single P-array. The focus of this research is to optimize the four S-boxes into two S-boxes in the original Blowfish algorithm to increase speed, and to examine the effectiveness and limitations of some block cipher algorithms. The program simulation results provide better performance as well as security.

Journal Article
TL;DR: In a VANET the network topology changes rapidly due to the high mobility of the nodes; a Dynamic Network Topology Graph (DNTG) is constructed and a sampling technique is applied to handle the data delivery process.
Abstract: Vehicular ad hoc network (VANET) technology uses moving vehicles as nodes in a network to create a mobile network. A VANET turns every participating vehicle into a wireless router or node. A distance of 100 to 300 meters is allowed between vehicles to cover a wide network range. In a VANET the network topology changes rapidly due to the high mobility of nodes. VANETs use infrastructure support to handle time-sensitive data exchange. Single-hop and multi-hop methods are used for VANET communication, and Vehicle-to-Vehicle (V2V) and Vehicle-to-Infrastructure (V2I) communication methods are used for VANET data transmission. Mobile internet is provided with consideration of the signal range and the mobility of the vehicle, and vehicular communication is used to download different contents from the internet. A downloading optimization scheme is used to improve the content downloading throughput. Roadside infrastructure, vehicle-to-vehicle relaying, and penetration rate are used as communication factors in the system. A Dynamic Network Topology Graph (DNTG) is constructed and a sampling technique is applied to handle the data delivery process. The content delivery system is improved with a historical-pattern-based vehicle prediction scheme. Bandwidth scheduling based on data request levels is used in the system, and infrastructure estimation is performed with historical data patterns. A data replication scheme is used to reduce the data delivery delay.

Journal Article
TL;DR: This paper focuses on the impact of cloud computing on the education system and how quality education can be provided by using this technology.
Abstract: Education plays an important role in maintaining the economic growth of a country. Nowadays classroom teaching is changing and students are becoming more technology oriented; therefore, in this changing environment, it is important that we think about the latest technologies to incorporate into the teaching and learning process. One of the latest technologies prevailing today is cloud computing. By sharing IT services in the cloud, educational institutions can outsource non-core services and better concentrate on offering students, teachers, faculty, and staff the essential tools to help them succeed. This paper focuses on the impact of cloud computing on the education system and how quality education can be provided by using this technology.

Journal Article
TL;DR: This paper studies classifiers for sentiment analysis of user opinion towards political candidates through comments and tweets using a Support Vector Machine (SVM), in the manner of the Pang, Lee and Vaithyanathan paper.
Abstract: Sentiment analysis is a subfield of NLP concerned with the determination of opinion and subjectivity in a text, and it has many applications. In this paper we study classifiers for sentiment analysis of user opinion towards political candidates through comments and tweets using a Support Vector Machine (SVM), in the manner of Pang, Lee and Vaithyanathan, whose work was the first research paper on this topic. The goal is to develop a classifier that performs sentiment analysis by labeling users' comments as positive or negative, from which we can classify text into classes of interest.
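
The classifier described above can be sketched with scikit-learn: TF-IDF features feeding a linear SVM trained on labelled comments. The tiny corpus and labels below are placeholders, not the political-comment data used in the paper.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

comments = [
    "great candidate, strong record",           # assumed positive examples
    "I fully support this campaign",
    "terrible policies and broken promises",    # assumed negative examples
    "worst speech I have ever heard",
]
labels = ["positive", "positive", "negative", "negative"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(comments, labels)
print(model.predict(["this candidate made a great promise"]))
```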

Journal Article
TL;DR: An overview of wireless body area networks is presented, the differences between Wireless Body Area Networks and Wireless Sensor Networks are provided, and an idea is presented to improve healthcare systems in India with the help of telecommunication and information technology by using wearable and implantable body sensor nodes which do not affect the mobility of the patients.
Abstract: In wireless body area networks, various sensors are attached to clothing or to the body, or even implanted under the skin. The wireless nature of the network and the wide variety of sensors offer numerous new, practical and innovative applications to improve health care and quality of life. Using a WBAN, the patient experiences greater physical mobility and is no longer compelled to stay in the hospital. In this paper, we present an overview of wireless body area networks and highlight the differences between a Wireless Body Area Network and a Wireless Sensor Network (WSN), whose techniques are inadequate for WBANs. We also present an idea to improve healthcare systems in India with the help of telecommunication and information technology, using wearable and implantable body sensor nodes which do not affect the mobility of the patients. We discuss how wireless body area networks are used for healthcare monitoring using multiple sensor nodes, and we present various innovations and discuss promising new trends in wireless body area networks for ubiquitous health monitoring applications.

Journal Article
TL;DR: This paper proposes a new framework, called Histogram-based Global Load Balancing (HiGLOB), to facilitate global load balancing in structured P2P systems and shows that the scheme can control and bound the amount of load imbalance across the system.
Abstract: Over the past few years, peer-to-peer (P2P) systems have rapidly grown in popularity and have become a dominant means for sharing resources. In these systems, load balancing is a key challenge because nodes are often heterogeneous. While several load-balancing schemes have been proposed in the literature, these solutions are typically ad hoc, heuristic based, and localized. In this paper, we propose a new framework, called Histogram-based Global Load Balancing (HiGLOB), to facilitate global load balancing in structured P2P systems. Each node P in HiGLOB has two key components. The first component is a histogram manager that maintains a histogram reflecting a global view of the distribution of the load in the system. The histogram stores statistical information that characterizes the average load of non-overlapping groups of nodes in the P2P network; these nodes are connected to P through its neighbor nodes. The histogram information is used for two purposes: on one hand, it determines whether a node is normally loaded, overloaded, or underloaded; on the other hand, it facilitates the discovery of a lightly loaded or heavily loaded node for the load-balancing process when it is needed. The second component is a load-balancing manager that takes actions to redistribute the load whenever a node becomes overloaded or underloaded. We exploit the routing metadata to partition the P2P network into non-overlapping regions corresponding to the histogram buckets, and we propose mechanisms to keep the cost of constructing and maintaining the histograms low. We further show that our scheme can control and bound the amount of load imbalance across the system.
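
The histogram manager's decision rule can be sketched as follows: a node compares its own load against the global average implied by its histogram of neighbouring regions, flags itself overloaded or underloaded outside a tolerance band, and picks the lightest (or heaviest) region as a load-balancing partner. The tolerance value and histogram contents are illustrative assumptions, not HiGLOB's exact formulas.

```python
TOLERANCE = 1.5   # a node is "overloaded" above TOLERANCE x the global average (assumed)

def classify(own_load, histogram):
    """histogram: list of (average_load, node_count) per non-overlapping region."""
    total_load = own_load + sum(avg * cnt for avg, cnt in histogram)
    total_nodes = 1 + sum(cnt for _, cnt in histogram)
    global_avg = total_load / total_nodes
    if own_load > TOLERANCE * global_avg:
        return "overloaded"
    if own_load < global_avg / TOLERANCE:
        return "underloaded"
    return "normal"

def pick_partner(histogram, want="light"):
    """Find the region most likely to hold a lightly or heavily loaded node."""
    key = (lambda h: h[0]) if want == "light" else (lambda h: -h[0])
    return min(histogram, key=key)

hist = [(10.0, 4), (35.0, 3), (5.0, 6)]           # (avg load, nodes) per bucket
print(classify(own_load=50.0, histogram=hist))     # -> overloaded
print(pick_partner(hist))                          # lightest region to offload to
```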

Journal Article
TL;DR: This paper presents a study of the different types of ambiguities that occur in Natural Language Processing.
Abstract: Ambiguity can be defined as the property of having more than one meaning or being understood in more than one way. Natural languages are ambiguous, so computers are not able to understand language the way people do. Natural Language Processing (NLP) is concerned with the development of computational models of aspects of human language processing. Ambiguity can occur at various levels of NLP and can be lexical, syntactic, semantic, pragmatic, etc. This paper presents a study of the different types of ambiguities that occur in Natural Language Processing.

Journal Article
TL;DR: The results illustrate that employing feature subset selection using the proposed wrapper approach enhances classification accuracy as well as increasing the overall efficiency of the classification model.
Abstract: The purpose of this study is to evaluate the most important features for graduate employability in a higher education database, in an attempt to measure the employability situation from the graduate records of Maejo University in Thailand. The experiments also apply feature selection methods to increase the overall efficiency of the classification model. There are two general attribute selection approaches: the Filter approach and the Wrapper approach. The Filter approach includes three methods: Information Gain, Gain Ratio and Chi-square. For the Wrapper approach, we used search methods consisting of Genetic Search, Best First Search and Greedy Stepwise as the random search approaches for subset generation, wrapped with different Bayesian classifiers, namely Naive Bayes, a Bayes network with the K2 algorithm, a Bayes network with the TAN algorithm and a Bayes network with the Hill-Climber algorithm. The results illustrate that employing feature subset selection using the proposed wrapper approach enhances classification accuracy.
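
A simplified wrapper-selection sketch is given below: a greedy forward search (a stand-in for the Genetic, Best First and Greedy Stepwise searches used in the study) wrapped around a Naive Bayes classifier, scoring each candidate subset by cross-validated accuracy. The synthetic data are a placeholder for the graduate-employability records.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((200, 8))                               # 8 candidate attributes (assumed)
y = (X[:, 1] + 0.5 * X[:, 4] > 1.0).astype(int)        # synthetic "employed" label

def wrapper_forward_select(X, y, max_features=4):
    selected, best_score = [], 0.0
    while len(selected) < max_features:
        scores = {}
        for f in range(X.shape[1]):
            if f in selected:
                continue
            subset = selected + [f]
            # wrapper step: evaluate the candidate subset with the classifier itself
            scores[f] = cross_val_score(GaussianNB(), X[:, subset], y, cv=5).mean()
        f_best = max(scores, key=scores.get)
        if scores[f_best] <= best_score:               # stop when accuracy stops improving
            break
        selected.append(f_best)
        best_score = scores[f_best]
    return selected, best_score

print(wrapper_forward_select(X, y))
```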