
Showing papers in "International Journal of Information Technology and Computer Science in 2013"


Journal ArticleDOI
TL;DR: A Self Organizing Map and Neuro Fuzzy scheme automatically extracts the WM, GM, CSF and tumor regions of brain MRI images, tested on three normal and three abnormal brain MRI images.
Abstract: This paper explores the possibility of applying techniques for segmenting the regions of medical images. For this we need to investigate the use of different techniques which help in the detection and classification of image regions. We also discuss some segmentation methods classified by researchers. Region classification is an essential process in the visualization of brain tissues in MRI. A brain image is basically classified into three regions: WM, GM and CSF. A fourth region, the tumor region, is present if the image is not normal. In the paper "Segmentation and characterization of Brain MR image regions using SOM and neuro fuzzy techniques", we integrated a Self Organizing Map (SOM) and a Neuro Fuzzy scheme to automatically extract the WM, GM, CSF and tumor regions of brain MRI images, tested on three normal and three abnormal brain MRI images. In this paper the scheme is further tested on axial view images to classify the regions of brain MRI, and the results are compared against Keith's database. We calculate the effectiveness of the scheme using statistical tests such as accuracy, precision, sensitivity, specificity, positive predictive value, negative predictive value, false positive rate, false negative rate, likelihood ratio positive, likelihood ratio negative and prevalence of disease.
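All of the statistics listed are standard functions of a binary confusion matrix; a minimal sketch (illustrative helper and made-up counts, not the authors' implementation):

# Diagnostic statistics from a binary confusion matrix (tp, fp, tn, fn).
def diagnostic_stats(tp, fp, tn, fn):
    total = tp + fp + tn + fn
    sensitivity = tp / (tp + fn)              # true positive rate
    specificity = tn / (tn + fp)              # true negative rate
    return {
        "accuracy": (tp + tn) / total,
        "precision": tp / (tp + fp),          # equals PPV
        "sensitivity": sensitivity,
        "specificity": specificity,
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "fpr": fp / (fp + tn),
        "fnr": fn / (fn + tp),
        "lr_positive": sensitivity / (1 - specificity),
        "lr_negative": (1 - sensitivity) / specificity,
        "prevalence": (tp + fn) / total,
    }

# Hypothetical counts from comparing a segmented region against ground truth.
print(diagnostic_stats(tp=45, fp=10, tn=40, fn=5))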

59 citations


Journal ArticleDOI
TL;DR: Case study results show that the weighted criteria value method offers a very promising technique for comparing software reliability growth models; it is relatively simple and requires little calculation.
Abstract: Many software reliability growth models (SRGMs) have been analyzed for measuring the growth of software reliability. Selection of optimal SRGMs for use in a particular case has been an area of interest for researchers in the field of software reliability. All existing methodologies use the same weight for each comparison criterion. In reality, however, the parameters do not all have the same priority in reliability measurement. Keeping this point in mind, this paper presents a computational methodology based on weighted criteria for the performance analysis of various non-homogeneous Poisson process (NHPP) models. It is relatively simple and requires little calculation. A set of twelve comparison criteria has been formulated, and each criterion has been assigned a different weight, to rank the software reliability growth models proposed during the past 30 years. Case study results show that the weighted criteria value method offers a very promising technique for comparing software reliability growth models.
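The core of a weighted criteria value computation can be sketched as a normalized weighted sum per model; the model names, criterion values and weights below are hypothetical placeholders, not the paper's twelve criteria or case-study data:

# Sketch: rank SRGMs by a weighted sum of min-max normalized criteria.
criteria_weights = {"MSE": 0.3, "R2": 0.25, "AIC": 0.25, "bias": 0.2}
models = {
    "Goel-Okumoto": {"MSE": 12.4, "R2": 0.91, "AIC": 210.0, "bias": 0.8},
    "Delayed S-shaped": {"MSE": 9.7, "R2": 0.95, "AIC": 198.0, "bias": 0.5},
    "Yamada exponential": {"MSE": 15.1, "R2": 0.88, "AIC": 220.0, "bias": 1.1},
}
lower_is_better = {"MSE", "AIC", "bias"}   # criteria where smaller values win

def weighted_score(models, weights):
    scores = {name: 0.0 for name in models}
    for crit, w in weights.items():
        vals = [m[crit] for m in models.values()]
        lo, hi = min(vals), max(vals)
        for name, m in models.items():
            norm = (m[crit] - lo) / (hi - lo) if hi > lo else 1.0
            if crit in lower_is_better:
                norm = 1.0 - norm          # invert so 1.0 is always best
            scores[name] += w * norm
    return scores

for name, s in sorted(weighted_score(models, criteria_weights).items(),
                      key=lambda kv: -kv[1]):
    print(f"{name}: {s:.3f}")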

54 citations


Journal ArticleDOI
TL;DR: A literature survey on intrusion detection systems shows that the KDD Cup dataset is used by 42% of researchers, the DARPA dataset by 20%, and other datasets by 38% for testing the effectiveness of proposed methods for misuse detection, anomaly detection, or both.
Abstract: In the era of information and communication technology, security is an important issue, and considerable effort and money are being invested in this sector. Intrusion detection is one of the most prominent fields in this area. Data mining can automate network intrusion detection with greater efficiency. This paper presents a literature survey on intrusion detection systems, covering research papers published from 2000 to 2012. Almost 67% of the surveyed papers focus on anomaly detection, 23% on both anomaly and misuse detection, and 10% on misuse detection alone. The survey statistics also show that the KDD Cup dataset is used by 42% of the researchers, the DARPA dataset by 20%, and other datasets by 38% for testing the effectiveness of their proposed methods for misuse detection, anomaly detection, or both.

52 citations


Journal ArticleDOI
TL;DR: The Semantic Web (SW) is a well-defined portal that helps in extracting relevant information using many Information Retrieval (IR) techniques, and the use of ontology also contributes to building the new generation of the web, the Semantic Web.
Abstract: In the present age of computers, there are various resources for gathering information related to a given query, such as radio stations, television and the Internet. Among them, the Internet is considered the major source for obtaining information about a given domain. When users want to find some information, they enter a query and results are produced via hyperlinks to various documents available on the web. But the information retrieved may or may not be relevant. This irrelevance is caused by the huge collection of documents available on the web. Traditional search engines are based on keyword searching, which is unable to transform raw data into knowledgeable representations. It is a cumbersome task to extract relevant information from a large collection of web documents. These shortcomings have brought the concepts of the Semantic Web (SW) and ontology into existence. The Semantic Web is a well-defined portal that helps in extracting relevant information using many Information Retrieval (IR) techniques. Current IR techniques are not advanced enough to exploit the semantic knowledge within documents and give precise results. The terms Information Retrieval, Semantic Web and ontology are used differently but are interconnected with each other. IR technology and web-based indexing contribute to the existence of the Semantic Web. The use of ontology also contributes to building the new generation of the web, the Semantic Web. With the help of ontologies, web content can be marked up using Semantic Web Documents (SWDs). Ontology is considered the backbone of such software systems; it improves the understanding between concepts used in the Semantic Web. There is therefore a need to build ontologies using a well-defined methodology; the process of developing an ontology is called ontology development.

51 citations


Journal ArticleDOI
TL;DR: A joint-level controller for continuum robots is described which utilizes a fuzzy methodology component to compensate for dynamic uncertainties; the controller is developed based on the unit quaternion representation.
Abstract: This paper describes the design and implementation of robust nonlinear sliding mode control strategies for robot manipulators whose dynamic or kinematic models are uncertain. A fuzzy sliding mode tracking controller for robot manipulators with uncertainty in the kinematic and dynamic models is therefore designed and analyzed. The controller is developed based on the unit quaternion representation, so that the singularities associated with the otherwise commonly used three-parameter representations are avoided. Simulation results for a planar application of the continuum or hyper-redundant robot manipulator (CRM) are provided to illustrate the performance of the developed adaptive controller. These manipulators do not have rigid joints and are therefore difficult to model, which leads to significant challenges in developing high-performance control algorithms. In this research, a joint-level controller for continuum robots is described which utilizes a fuzzy methodology component to compensate for dynamic uncertainties.

50 citations


Journal ArticleDOI
TL;DR: The study reviews the prominent processes, tools and technologies used in the requirement gathering phase and highlights the importance of security requirements: although they are part of the non-functional requirements, they are fundamental to secure software development.
Abstract: Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs in requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and shows how a blend of requirement engineering techniques can be used as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements: although they are part of the non-functional requirements, they are fundamental to secure software development.

50 citations


Journal ArticleDOI
TL;DR: This paper focuses on the COBIT framework and the importance of its adoption in academic institutions, universities and organizations.
Abstract: Given the increasing pace and complexity of Information Technology (IT) innovation, coupled with heavy investments in IT as a strategy for businesses to stay competitive in a volatile market, executive decision makers need effective IT governance. Effective IT governance is viewed as a powerful mechanism for using information and processes, which in turn leads to greater profits and future benefits. Good IT governance structures therefore guide organizations in better leveraging their IT spending on crucial business areas. The significance of IT governance affects the ability of the organization to fulfill its goals: IT governance assists in minimizing risks and maximizing value by focusing on performance and leveraging IT to satisfy the organization's and customers' long-term demands. Currently, organizations are showing interest in adopting best practices and standards for IT governance. A framework offers the boundaries, the principles to follow and the guidelines through which a vision is provided, serving as both a philosophical base and a construction structure; it offers a basic structure that is flexible enough to apply in a given environment. The framework used in this study is COBIT. COBIT offers effective practices laid out as activities in an organized and flexible structure. These practices assist in optimizing IT-enabled investments, guaranteeing service delivery and establishing accountability when things go wrong. This paper focuses on the COBIT framework and the importance of its adoption in academic institutions, universities and organizations. Selected case studies of COBIT adoption in higher education institutions are analyzed: Australian higher education institutions, Curtin University of Technology, and the Viana do Castelo Polytechnic Institute.

44 citations


Journal ArticleDOI
TL;DR: Experimental results show that, in comparison with the pure CSO, the proposed CSO takes less time to converge and can find the best solution in fewer iterations.
Abstract: Cat Swarm Optimization (CSO) is one of the new swarm intelligence algorithms for finding the best global solution. Because of problem complexity, the pure CSO sometimes takes a long time to converge and cannot achieve an accurate solution. To solve this problem and improve convergence accuracy, we propose a new improved CSO, namely 'Adaptive Dynamic Cat Swarm Optimization'. First, we add a new adaptive inertia weight to the velocity equation and then use an adaptive acceleration coefficient. Second, by using the information of the two previous/next dimensions and applying a new factor, we arrive at a new position update equation that combines averaged position and velocity information. Experimental results on six test functions show that, in comparison with the pure CSO, the proposed CSO takes less time to converge and finds the best solution in fewer iterations.
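One plausible form of the tracing-mode update with the adaptive terms the abstract describes (the linear schedules for the inertia weight and acceleration coefficient are assumptions for illustration; the paper's averaged position update and seeking mode are not reproduced):

# Sketch of a CSO tracing-mode update with time-varying inertia weight
# and acceleration coefficient.
import numpy as np

rng = np.random.default_rng(0)

def tracing_update(x, v, x_best, t, t_max,
                   w_max=0.9, w_min=0.4, c_max=2.0, c_min=0.5):
    w = w_max - (w_max - w_min) * t / t_max   # assumed linear inertia decay
    c = c_max - (c_max - c_min) * t / t_max   # assumed adaptive coefficient
    r = rng.random(x.shape)                   # per-dimension random factor
    v_new = w * v + c * r * (x_best - x)      # velocity pulled toward best cat
    x_new = x + v_new                         # standard position update
    return x_new, v_new

x, v = np.zeros(3), np.zeros(3)
x_best = np.array([1.0, -2.0, 0.5])
for t in range(100):
    x, v = tracing_update(x, v, x_best, t, t_max=100)
print(x.round(3))   # should approach x_best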

43 citations


Journal ArticleDOI
TL;DR: This paper compares different mobility models and provides an overview of their current research status; the main focus is on Random Mobility Models and Group Mobility Models.
Abstract: A mobile ad-hoc network (MANET) is a network without any central administration or fixed infrastructure. It consists of a number of mobile nodes that send data packets through a wireless medium. A good routing protocol is always needed to establish connections between mobile nodes, since the topology of the network changes dynamically. Further, in all existing routing protocols, node mobility has always been one of the important characteristics in determining the overall performance of the ad hoc network. Thus, it is essential to know about the various mobility models and their effect on routing protocols. In this paper, we compare different mobility models and provide an overview of their current research status. The main focus is on Random Mobility Models and Group Mobility Models. First, we present a survey of the characteristics, drawbacks and research challenges of mobility modeling. Finally, we present simulation results that illustrate the importance of choosing a mobility model in the simulation of an ad hoc network protocol, and show how the performance results of an ad hoc network protocol can change drastically as a result of changing the simulated mobility model.
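As an illustration of the most common random model, a minimal random waypoint generator (area, speeds and pause time are illustrative defaults): each node repeatedly picks a uniform destination, moves toward it at a uniform random speed, then pauses.

# Minimal random waypoint mobility model for one node.
import math
import random

def random_waypoint(area=(1000.0, 1000.0), v_min=1.0, v_max=20.0,
                    pause=2.0, dt=0.1, steps=10000, seed=1):
    rng = random.Random(seed)
    x, y = rng.uniform(0, area[0]), rng.uniform(0, area[1])
    t = 0.0
    while steps > 0:
        dest = (rng.uniform(0, area[0]), rng.uniform(0, area[1]))
        speed = rng.uniform(v_min, v_max)
        while math.hypot(dest[0] - x, dest[1] - y) > speed * dt and steps > 0:
            ang = math.atan2(dest[1] - y, dest[0] - x)
            x += speed * dt * math.cos(ang)
            y += speed * dt * math.sin(ang)
            t += dt
            steps -= 1
            yield t, x, y
        t += pause                    # pause before choosing the next waypoint
        steps -= int(pause / dt)

for t, x, y in random_waypoint(steps=5):
    print(f"t={t:.1f}s position=({x:.1f}, {y:.1f})")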

42 citations


Journal ArticleDOI
TL;DR: Simulation results show that the proposed improved AODV protocol, in which the route reply (RREP) packet is modified to carry limited TTL (Time to Live) information, outperforms regular AODV in terms of packet delivery rate, goodput, throughput, and jitter.
Abstract: The AODV protocol uses the minimum delay path as its route selection criterion, regardless of the path's load. This leads to unbalanced load distribution in the network, and the energy of the nodes on the shortest path depletes earlier than that of other nodes. We propose an improved AODV protocol in which the route reply (RREP) packet is modified to carry limited TTL (Time to Live) information about nodes. Experiments have been carried out using the NS2 network simulator. Simulation results show that our proposed routing protocol outperforms regular AODV in terms of packet delivery rate, goodput, throughput, and jitter.
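One plausible reading of the RREP modification, sketched in Python (the field names, hop-budget constant and drop rule are assumptions for illustration; the paper's actual NS2 changes are not reproduced here):

# Hypothetical sketch: a reply whose accumulated hop count exceeds a TTL
# bound is dropped, so overly long, load-concentrating paths are not installed.
MAX_RREP_TTL = 5  # assumed hop budget for a usable reply

def handle_rrep(rrep, routing_table):
    rrep["hop_count"] += 1                 # one more hop traversed
    if rrep["hop_count"] > MAX_RREP_TTL:
        return None                        # discard: path too long
    routing_table[rrep["dest"]] = {
        "next_hop": rrep["sender"],
        "hops": rrep["hop_count"],
    }
    return rrep                            # forward toward the route source

table = {}
rrep = {"dest": "D", "sender": "B", "hop_count": 2}
print(handle_rrep(rrep, table), table)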

40 citations


Journal ArticleDOI
TL;DR: The OLSR routing protocol is improved by eliminating unnecessary loops, and simulation results demonstrate a significant improvement in packet delivery rate and throughput.
Abstract: Mobile ad hoc networks are wireless networks in which no infrastructure is used, i.e. there are no routers, switches or anything else on the network that can be used to support the network structure, and the nodes are mobile. The purpose of this paper is to improve the packet delivery rate and the throughput, which requires a powerful routing protocol standard that can guarantee delivery of packets to their destinations and good throughput on the network. To achieve this purpose, we use the OLSR routing protocol, a proactive protocol currently covered by the IETF standard RFC 3626. In this paper, we improve the OLSR routing protocol by eliminating unnecessary loops, and simulation results demonstrate a significant improvement in packet delivery rate and throughput.

Journal ArticleDOI
TL;DR: This review shows that although many publications focus on the real-time aspect of the challenge, only a few studies have investigated the deployment of extracted and retrieved information for forensic video surveillance.
Abstract: Recently, various conferences and journals have published articles related to video surveillance systems, indicating researchers' attention to the field. The goal of this review is to examine the latest works published in journals, to propose a new classification framework for video surveillance systems, and to investigate each aspect of this framework. The paper provides a comprehensive and systematic literature review of video surveillance systems from 2010 to 2011, extracted from six online digital libraries by searching article titles and keywords. The proposed classification framework is built on the architecture of video surveillance systems, which is composed of six layers: Concept and Foundation Layer, Network Infrastructure Layer, Processing Layer, Communication Layer, Application Layer, and User Interaction Layer. The review shows that although many publications focus on the real-time aspect of the challenge, only a few studies have investigated the deployment of extracted and retrieved information for forensic video surveillance.

Journal ArticleDOI
TL;DR: An Under Water Density Based Clustered Sensor Network (UWDBCSN) scheme using heterogeneous sensors is proposed; it is found to be more energy efficient and helps extend the lifetime of underwater sensor networks.
Abstract: An underwater sensor network comprises sensors and vehicles that perform numerous tasks. In an underwater ad-hoc sensor network, acoustic signals are transmitted over multi-hop sequences so as to save the sensors' energy and achieve a longer lifetime, since recharging the batteries of sensors deployed in deep water is practically infeasible. Clustering is the best strategy for efficient multi-hopping, where a cluster head is made responsible for collecting local data and forwarding it to the sink. Cluster-head selection is the challenging job in a cluster, as the head loses energy faster than other sensors by transmitting both its own data and the aggregated data. In this paper we propose an Under Water Density Based Clustered Sensor Network (UWDBCSN) scheme using heterogeneous sensors. The scheme utilizes two types of sensors: a small number of high-energy sensors that work as cluster heads, and a large number of ordinary sensors. Cluster-head selection is based on node degree, i.e. the density of the sensors in a region. The proposed scheme is found to be more energy efficient and helps extend the lifetime of underwater sensor networks.
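The degree-based election can be sketched as follows, assuming node positions, a communication range and a high-energy flag (values are illustrative, not the paper's scheme in full):

# Sketch: among high-energy candidates, the node with the most neighbors
# within acoustic range becomes cluster head.
import math

RANGE = 100.0  # assumed acoustic communication range (metres)

def degree(node, nodes):
    return sum(1 for other in nodes
               if other is not node
               and math.dist(node["pos"], other["pos"]) <= RANGE)

def elect_cluster_head(nodes):
    candidates = [n for n in nodes if n["high_energy"]]
    return max(candidates, key=lambda n: degree(n, nodes))

nodes = [
    {"id": 1, "pos": (0, 0),     "high_energy": True},
    {"id": 2, "pos": (50, 10),   "high_energy": False},
    {"id": 3, "pos": (60, 40),   "high_energy": True},
    {"id": 4, "pos": (70, 30),   "high_energy": False},
    {"id": 5, "pos": (300, 300), "high_energy": False},
]
print("cluster head:", elect_cluster_head(nodes)["id"])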

Journal ArticleDOI
TL;DR: Multimodal systems based on fingerprint, iris image, and DNA features are analyzed, and their performance is assessed in terms of security, reliability, accuracy, and long-term stability.
Abstract: Biometric systems are alternatives to traditional identification systems. This paper provides an overview of single-feature and multiple-feature biometric systems, including the performance of physiological characteristics (such as fingerprint, hand geometry, head recognition, iris, retina, face recognition, DNA recognition, palm prints, heartbeat, finger veins, palates, etc.) and behavioral characteristics (such as body language, facial expression, signature verification, speech recognition, gait signature, etc.). Multimodal systems based on fingerprint, iris image, and DNA features are analyzed in terms of security, reliability, accuracy, and long-term stability. The strengths and weaknesses of the various multiple-feature biometric approaches published so far are analyzed, and directions for future research on robust personal identification are outlined.

Journal ArticleDOI
TL;DR: Simulation-based analysis of an FTP server's performance in a typical enterprise network under distributed denial of service attack, together with some recent information on attacks that dominated in 2012.
Abstract: Different types and techniques of DDoS attacks and defenses are studied in this paper, with some recent information on attacks that dominated the first quarter of 2012. We further provide a simulation-based analysis of an FTP server's performance in a typical enterprise network under distributed denial of service attack. Simulations in OPNET show noticeable variations in the connection capacity, task processing and delay parameters of the attacked server compared to its performance without attack. The DDoS detection and mitigation mechanisms discussed in this paper mainly focus on some recently investigated techniques. Finally, conclusions are drawn on the basis of the survey-based study as well as the simulation results.

Journal ArticleDOI
TL;DR: In this research, a multi-input multi-output baseline computed fuel control scheme is used to simultaneously control the mass flow rate of both port fuel injection (PFI) and direct injection (DI) systems to regulate the fuel ratio of PFI to DI to desired levels.
Abstract: Internal combustion (IC) engines are optimized to meet exhaust emission requirements with the best fuel economy. Closed loop combustion control is a key technology used to optimize the engine combustion process toward this goal. In order to conduct research in the area of closed loop combustion control, a control-oriented cycle-to-cycle engine model, containing engine combustion information for each individual engine cycle as a function of engine crank angle, is a necessity. In this research, the IC engine is modeled according to fuel ratio, which is represented by the mass of air. A multi-input multi-output baseline computed fuel control scheme is used to simultaneously control the mass flow rate of both port fuel injection (PFI) and direct injection (DI) systems, regulating the fuel ratio of PFI to DI to desired levels. The control target is to maintain the overall fuel ratio at stoichiometry and the PFI-to-DI ratio at a desired value between zero and one. The performance of the baseline computed fuel controller is compared with that of a baseline proportional, integral, and derivative (PID) controller.

Journal ArticleDOI
TL;DR: In this paper, the performance analysis is carried out on Ad-hoc On-demand Distance Vector (AODV), Limited Hop Count AODV (LHC-AODV), Optimized Link State Routing (OLSR), Unnecessary Loop OLSR (UL-OLSR) and Destination Sequenced Distance Vector (DSDV) protocols using the NS2 simulator.
Abstract: Mobile ad hoc networks are wireless networks in which no infrastructure is used, i.e. there are no routers, switches or anything else on the network that can be used to support the network structure, and the nodes are mobile. Routing, the task of selecting paths along which to send network traffic, is particularly challenging in MANETs. In this paper, a performance analysis is carried out on the Ad-hoc On-demand Distance Vector (AODV), Limited Hop Count AODV (LHC-AODV), Optimized Link State Routing (OLSR), Unnecessary Loop OLSR (UL-OLSR) and Destination Sequenced Distance Vector (DSDV) protocols using the NS2 simulator. Delay, throughput, and packet delivery ratio are the three common measures used to compare the performance of the above protocols.
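The three measures can be sketched directly from per-packet trace records (the record format is an assumption; NS2 traces would be parsed into this form first):

# Sketch: delay, throughput and packet delivery ratio from trace records.
# Each record: (send_time_s, recv_time_s or None if dropped, size_bytes).
def trace_metrics(records):
    delivered = [r for r in records if r[1] is not None]
    pdr = len(delivered) / len(records)                    # delivery ratio
    avg_delay = sum(r[1] - r[0] for r in delivered) / len(delivered)
    span = max(r[1] for r in delivered) - min(r[0] for r in records)
    throughput_bps = sum(r[2] * 8 for r in delivered) / span
    return pdr, avg_delay, throughput_bps

records = [(0.0, 0.012, 512), (0.1, 0.125, 512), (0.2, None, 512)]
pdr, delay, thr = trace_metrics(records)
print(f"PDR={pdr:.2f} avg delay={delay*1000:.1f} ms throughput={thr:.0f} bit/s")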

Journal ArticleDOI
TL;DR: The asymptotic stability of fuzzy PD control with first-order sliding mode compensation in the parallel structure is proven, and the finite time convergence with a super-twisting second-order sliding mode is guaranteed.
Abstract: In this paper, a linear proportional derivative (PD) controller is designed for a highly nonlinear and uncertain system using a robust factorization approach. To evaluate the linear PD methodology, two further methodologies are introduced: sliding mode control and fuzzy logic. This research aims to design a new methodology to control the position of a continuum robot manipulator. PD control is a linear methodology which can be applied to highly nonlinear systems (e.g., continuum robot manipulators). To strengthen it, a new parallel fuzzy sliding mode controller (PD.FSMC) is used; this estimator can capture most of the nonlinear terms of the dynamic parameters to achieve the best performance. The asymptotic stability of fuzzy PD control with first-order sliding mode compensation in the parallel structure is proven. For the parallel structure, finite time convergence with a super-twisting second-order sliding mode is guaranteed.

Journal ArticleDOI
TL;DR: This paper presents a dynamic cluster based job scheduling algorithm for efficient execution of user jobs; the proposed scheduling algorithm (CHS) shows the best average waiting time, average turnaround time, average response time and average total completion time compared to other job scheduling approaches.
Abstract: Grid computing enables sharing, selection and aggregation of computing resources for solving complex and large-scale scientific problems. The resources making up a grid need to be managed to provide a good quality of service. Grid scheduling is a vital component of a computational grid infrastructure. This paper presents a dynamic cluster based job scheduling algorithm for efficient execution of user jobs. It also includes a comparative performance analysis of our proposed job scheduling algorithm against other well-known job scheduling algorithms, considering parameters such as average waiting time, average turnaround time, average response time and average total completion time. The results show that our proposed scheduling algorithm (CHS) achieves the best average waiting time, average turnaround time, average response time and average total completion time compared to other job scheduling approaches.
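The four comparison metrics can be sketched on a simple non-preemptive first-come-first-served baseline (the paper's CHS algorithm itself is not reproduced; for a non-preemptive scheduler, response time equals waiting time):

# Sketch: the four scheduling metrics on an FCFS baseline.
# Each job: (arrival_time, burst_time).
def fcfs_metrics(jobs):
    jobs = sorted(jobs)                    # order by arrival time
    clock, waiting, turnaround, completion = 0.0, [], [], []
    for arrival, burst in jobs:
        start = max(clock, arrival)
        finish = start + burst
        waiting.append(start - arrival)    # response == waiting (non-preemptive)
        turnaround.append(finish - arrival)
        completion.append(finish)
        clock = finish
    n = len(jobs)
    return (sum(waiting) / n, sum(turnaround) / n,
            sum(waiting) / n, sum(completion) / n)

avg_wait, avg_turn, avg_resp, avg_comp = fcfs_metrics([(0, 5), (1, 3), (2, 8)])
print(avg_wait, avg_turn, avg_resp, avg_comp)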

Journal ArticleDOI
TL;DR: The asymptotic stability of fuzzy PD control with first-order sliding mode compensation in the parallel structure is proven, and the finite time convergence with a super-twisting second-order sliding mode is guaranteed.
Abstract: Both fuzzy logic and sliding mode can compensate for the steady-state error of proportional-derivative (PD) control. This paper presents parallel sliding mode compensation for fuzzy PD controllers. The asymptotic stability of fuzzy PD control with first-order sliding mode compensation in the parallel structure is proven. For the parallel structure, finite time convergence with a super-twisting second-order sliding mode is guaranteed.
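A minimal sketch of the parallel structure on a one-degree-of-freedom toy plant, assuming illustrative gains and a boundary-layer saturation in place of the discontinuous sign function (the fuzzy part of the controller is not reproduced here):

# Sketch: PD control plus a parallel first-order sliding mode term.
import numpy as np

def pd_smc(q, qd, q_ref, kp=25.0, kd=8.0, lam=4.0, k_s=6.0, phi=0.05):
    e, ed = q_ref - q, -qd                # tracking error and its derivative
    s = ed + lam * e                      # sliding surface s = e_dot + lambda*e
    sat = np.clip(s / phi, -1.0, 1.0)     # boundary layer instead of sign(s)
    return kp * e + kd * ed + k_s * sat   # PD term + parallel SMC term

# Simulate a double-integrator plant (q_ddot = u) under this control law.
dt, q, qd = 0.001, 0.0, 0.0
for _ in range(3000):
    u = pd_smc(q, qd, q_ref=1.0)
    qd += u * dt
    q += qd * dt
print(f"q after 3 s: {q:.3f}")            # should be close to the reference 1.0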

Journal ArticleDOI
TL;DR: Assessment of the quality of E-Commerce websites in Saudi Arabia using a proposed evaluation instrument revealed that international websites were of higher quality than regional and domestic websites with regard to all aspects of web evaluation.
Abstract: Electronic Commerce (E-Commerce) has a growing potential in Saudi Arabia, due to widespread use of the internet and the maturity of the Information Technology (IT) infrastructure. The purpose of this paper is to assess the quality of E-Commerce websites in Saudi Arabia, using a proposed evaluation instrument. To achieve this aim, six E-Commerce websites were selected for evaluation and then categorized into three categories: domestic, regional and international. Each category consisted of two E-Commerce websites, with the international website category considered as a benchmark. The six websites were evaluated by sixty participants (n=60) with good web design experience, following a set of guidelines offered by the evaluation instrument. The evaluation instrument described six key evaluation factors: appearance, content, organization, interaction, customer-focus and assurance. Each factor was evaluated by several indicators using a five-point scale ranging from not applicable (0) to very strong (4). To unify the views of all evaluators, a technique was introduced showing how each indicator was to be assessed based on several checklists (questions). The results revealed that international websites were of higher quality than regional and domestic websites with regard to all aspects of web evaluation. In addition, the quality of regional websites outweighed that of domestic websites with regard to appearance, content, organization, customer-focus and assurance, but not interaction.

Journal ArticleDOI
TL;DR: A real-coded genetic algorithm (RCGA) with arithmetic-average-bound-blend crossover and wavelet mutation is applied to design digital IIR filters that satisfy different performance requirements, such as minimizing the Lp-norm approximation error and minimizing the ripple magnitude.
Abstract: This paper develops a technique for the robust and stable design of digital infinite impulse response (IIR) filters. As the error surface of IIR filters is generally multi-modal, global optimization techniques are required to design efficient digital IIR filters and avoid local minima. A real-coded genetic algorithm (RCGA) with arithmetic-average-bound-blend crossover and wavelet mutation is applied to design the digital IIR filter. Multicriterion optimization is employed as the design criterion to obtain an optimal stable IIR filter that satisfies the different performance requirements, such as minimizing the Lp-norm approximation error and minimizing the ripple magnitude. The proposed real-coded genetic algorithm is effectively applied to solve the multicriterion, multiparameter optimization problems of low-pass, high-pass, band-pass, and band-stop digital filter design. Computational experiments show that the proposed method is superior, or at least comparable, to other algorithms and can be efficiently used for higher order filter design.
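The Lp-norm approximation error that such a design minimizes can be sketched as the deviation of a candidate filter's magnitude response from an ideal response over a frequency grid (the cutoff, grid size and candidate coefficients are illustrative; this is the fitness idea, not the authors' RCGA):

# Sketch: Lp-norm error and stability check for a candidate IIR filter.
import numpy as np
from scipy.signal import freqz

def lp_error(b, a, cutoff=0.3, p=2, n_grid=256):
    w, h = freqz(b, a, worN=n_grid)                 # response on [0, pi)
    desired = (w <= cutoff * np.pi).astype(float)   # ideal low-pass magnitude
    err = np.abs(np.abs(h) - desired)
    return (np.sum(err ** p)) ** (1.0 / p)          # Lp norm of the error

def is_stable(a):
    return np.all(np.abs(np.roots(a)) < 1.0)        # poles inside unit circle

b, a = [0.2, 0.2], [1.0, -0.6]                      # a candidate first-order filter
print("stable:", is_stable(a), "L2 error:", round(lp_error(b, a), 3))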

Journal ArticleDOI
TL;DR: It is concluded that the grey Verhulst neural network is a feasible and effective modeling method for time series that grow along an S-shaped curve.
Abstract: The advantages and disadvantages of the BP neural network and the grey Verhulst model for time series prediction are analyzed, and a new forecasting model is proposed for time series whose growth is S-shaped or saturating. From the viewpoint of data fitting, the new model, named the grey Verhulst neural network, is established by combining the grey Verhulst model and the BP neural network. Firstly, the Verhulst model is mapped to a BP neural network, and the correspondence between the grey Verhulst model parameters and the BP network weights is established. Then, the BP neural network is trained by means of the BP algorithm; when the network converges, the optimized weights can be extracted and the optimized grey Verhulst neural network model obtained. Experimental results show that the new model is effective, with the advantages of high precision, few required samples and simple calculation, making full use of the similarities and complementarities between the grey system model and the BP neural network to overcome the disadvantages of applying either model separately. It is concluded that the grey Verhulst neural network is a feasible and effective modeling method for time series that grow along an S-shaped curve.
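For reference, the grey Verhulst half of the hybrid can be sketched as follows, using one common formulation in which the raw S-shaped series is taken as x1 and the parameters are estimated by least squares on the grey differential equation x0(k) + a*z1(k) = b*z1(k)^2 (the toy series is made up; the BP mapping is not reproduced):

# Sketch: classical grey Verhulst parameter estimation and time response.
import numpy as np

def grey_verhulst_fit(x1):
    x1 = np.asarray(x1, dtype=float)      # raw S-shaped series x1(k)
    x0 = np.diff(x1)                      # inverse AGO: x0(k) = x1(k) - x1(k-1)
    z1 = 0.5 * (x1[1:] + x1[:-1])         # background (mean) values
    B = np.column_stack([-z1, z1 ** 2])
    a, b = np.linalg.lstsq(B, x0, rcond=None)[0]
    def predict(k):                       # time-response function, 0-based step
        return a * x1[0] / (b * x1[0] + (a - b * x1[0]) * np.exp(a * k))
    return a, b, predict

x = [10, 18, 30, 44, 56, 63, 67, 69]      # toy saturating series
a, b, predict = grey_verhulst_fit(x)
print([round(predict(k), 1) for k in range(len(x))])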

Journal ArticleDOI
TL;DR: This paper examines the performance of three on-demand routing protocols at the application layer using the QualNet 5.01 simulator, to find out which one is most suitable for ad hoc networks.
Abstract: A routing protocol is used to facilitate communication in an ad hoc network. The primary goal of such a routing protocol is to provide an efficient and reliable path between a pair of nodes. Routing protocols for ad hoc networks can be categorized into three categories: table driven, on demand and hybrid. The table driven and hybrid routing strategies require a periodic exchange of hello messages between the nodes of the ad hoc network and thus have high processing and bandwidth requirements. On-demand routing, on the other hand, creates routes only when required and hence is well suited to ad hoc networks. This paper therefore examines the performance of three on-demand routing protocols at the application layer using the QualNet 5.01 simulator.

Journal ArticleDOI
TL;DR: This paper investigates how ants search for food and how they find the shortest path, and argues that the Ant Colony approach gives better performance in cloud computing.
Abstract: Ants are very small insects that are capable of finding food even though they are completely blind. Ants live in their nest, and their job is to search for food when they are hungry. We are not interested in their way of living, such as how they live or sleep, but in how they search for food and how they find the shortest path to it. This shortest-path technique is now being applied in cloud computing, where the Ant Colony approach gives better performance.
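A minimal ant colony sketch for the shortest path on a small weighted graph, illustrating the pheromone mechanism the paper alludes to (graph, parameters and update rule are textbook-style choices, not the paper's algorithm):

# Sketch: ants choose edges with probability proportional to
# pheromone^alpha * (1/distance)^beta; shorter tours deposit more pheromone.
import random

graph = {"A": {"B": 2, "C": 5}, "B": {"C": 1, "D": 4},
         "C": {"D": 1}, "D": {}}
tau = {(u, v): 1.0 for u in graph for v in graph[u]}   # pheromone per edge

def walk(src, dst, alpha=1.0, beta=2.0):
    path, node = [src], src
    while node != dst:
        edges = list(graph[node].items())
        weights = [tau[(node, v)] ** alpha * (1.0 / d) ** beta for v, d in edges]
        node = random.choices([v for v, _ in edges], weights=weights)[0]
        path.append(node)
    return path, sum(graph[u][v] for u, v in zip(path, path[1:]))

best = None
for _ in range(200):
    path, length = walk("A", "D")
    for e in tau:
        tau[e] *= 0.9                                  # evaporation
    for u, v in zip(path, path[1:]):
        tau[(u, v)] += 1.0 / length                    # deposit on used edges
    if best is None or length < best[1]:
        best = (path, length)
print(best)   # expect A -> B -> C -> D with length 4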

Journal ArticleDOI
TL;DR: A novel and efficient user profile characterization scheme for distributed environments using the Hadoop MapReduce technique is proposed, and experimental results clearly show that the proposed technique performs better.
Abstract: The massive increase in data has paved the way for distributed computing, which in turn can reduce data processing time. Though there are various approaches to distributed computing, Hadoop is one of the most efficient among the existing ones. Hadoop consists of different elements, of which MapReduce is a scalable tool that enables processing of huge amounts of data in parallel. We propose a novel and efficient user profile characterization scheme for distributed environments. In this framework, network anomalies are detected using the Hadoop MapReduce technique. The experimental results clearly show that the proposed technique performs better.
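The MapReduce pattern the framework relies on can be sketched in pure Python as a map, shuffle and reduce over log records (field names and the anomaly threshold are hypothetical; a real deployment would run the same logic as a Hadoop job):

# Sketch: per-user request counts via map/shuffle/reduce, then a simple
# threshold rule to flag anomalous profiles.
from collections import defaultdict
from statistics import mean

log = [{"user": "u1"}, {"user": "u2"}, {"user": "u1"},
       {"user": "u1"}, {"user": "u3"}, {"user": "u1"}]

# Map phase: emit one (key, value) pair per record.
mapped = [(rec["user"], 1) for rec in log]

# Shuffle phase: group values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce phase: aggregate per user.
counts = {user: sum(vals) for user, vals in groups.items()}

# Assumed anomaly rule: flag users above 1.5x the mean request count.
threshold = 1.5 * mean(counts.values())
anomalies = [u for u, c in counts.items() if c > threshold]
print(counts, "anomalous:", anomalies)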

Journal ArticleDOI
TL;DR: Experimental results confirm that the back propagation network, the most widely used architecture, is more efficient than the Elman neural network for software effort estimation.
Abstract: Software effort estimation involves estimating the effort required to develop software. Cost and schedule overruns occur in software development due to wrong estimates made during the initial stage of development, so proper estimation is essential for successful completion. Many estimation techniques are available, among which neural network based techniques play a prominent role. The back propagation network is the most widely used architecture, and the Elman neural network, a recurrent network, can be used on par with it. For a good predictor system, the difference between estimated effort and actual effort should be as low as possible. Data from historical NASA projects are used for training and testing. The experimental results confirm that the back propagation algorithm is more efficient than the Elman neural network.
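A minimal sketch of neural-network effort estimation in the spirit of the paper, using a feed-forward network trained by back propagation on synthetic stand-in data (the two features and the data generator are assumptions; the NASA dataset is not reproduced):

# Sketch: effort estimation with an MLP, evaluated by mean magnitude of
# relative error (MMRE) on held-out projects.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
kloc = rng.uniform(1, 100, 60)            # project size in KLOC (synthetic)
methodology = rng.uniform(20, 40, 60)     # methodology rating (synthetic)
effort = 5.2 * kloc ** 0.9 + 2 * methodology + rng.normal(0, 5, 60)

X = np.column_stack([kloc, methodology])
X_tr, X_te, y_tr, y_te = train_test_split(X, effort, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0),
)
model.fit(X_tr, y_tr)
mre = np.abs(model.predict(X_te) - y_te) / y_te
print(f"MMRE on held-out projects: {mre.mean():.2f}")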

Journal ArticleDOI
TL;DR: This paper proposes an automated innovative wheelchair controlled by the neck position of the user, using simple LEDs, a photo sensor, motors and a microcontroller to control the movement of the wheelchair.
Abstract: New developments in sensor, radar and ultrasonic technologies have proved to be a boon for electronic travelling aids (ETAs). These devices are widely used by blind and physically challenged people. The C5 laser cane, Mowat sensor, belt-type binaural sonic aid and NAV guide cane are among the popular electronic travelling aids used by blind people. For physically challenged persons, electric wheelchairs controlled by joystick, eye movement or voice recognition are also available, but they have their own limitations in terms of operating complexity, noisy environments and cost. This paper proposes an automated innovative wheelchair controlled by the neck position of the user. It uses simple LEDs, a photo sensor, motors and a microcontroller to control the movement of the wheelchair.

Journal ArticleDOI
TL;DR: The concepts of multigranular rough equivalences are introduced, and the replacement properties, obtained by interchanging the bottom equivalences with the top equivalences, are established.
Abstract: The notion of rough sets introduced by Pawlak has been a successful model for capturing impreciseness in data and has numerous applications. Since then it has been extended in several ways. The basic rough set introduced by Pawlak is a single granulation model from the granular computing point of view. Recently, this has been extended to two types of multigranular rough set models. Pawlak and Novotny introduced the notions of rough set equalities, called approximate equalities. These notions of equality use the user's knowledge to decide the equality of sets and hence generate approximate reasoning. However, as shown by Tripathy et al., even these notions have limited applicability for incorporating user knowledge, so the notion of rough equivalence was introduced by them. The notion of rough equalities in the multigranulation context has also been introduced and studied. In this article, we introduce the concepts of multigranular rough equivalences and establish their properties. We also establish the replacement properties, which are obtained by interchanging the bottom equivalences with the top equivalences. We provide a real-life example for both types of multigranulation, compare the rough multigranular equalities with the rough multigranular equivalences, and illustrate the interpretation of the rough equivalences through the example.
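For reference, the two standard multigranular approximations that underlie these equalities and equivalences can be written as follows (standard two-relation definitions from the multigranulation rough set literature, with R+S the optimistic and R*S the pessimistic multigranulation of X over equivalence relations R and S on a universe U):

% Optimistic (R+S) and pessimistic (R*S) multigranular approximations.
\begin{align*}
  \underline{X}_{R+S} &= \{\, x \in U : [x]_R \subseteq X \ \text{or}\ [x]_S \subseteq X \,\} \\
  \overline{X}_{R+S}  &= \{\, x \in U : [x]_R \cap X \neq \emptyset \ \text{and}\ [x]_S \cap X \neq \emptyset \,\} \\
  \underline{X}_{R*S} &= \{\, x \in U : [x]_R \subseteq X \ \text{and}\ [x]_S \subseteq X \,\} \\
  \overline{X}_{R*S}  &= \{\, x \in U : [x]_R \cap X \neq \emptyset \ \text{or}\ [x]_S \cap X \neq \emptyset \,\}
\end{align*}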

Journal ArticleDOI
TL;DR: This paper presents a design approach based on the service oriented paradigm for building E-governance systems and formalizes concepts like service environment, service composition, and service collaboration, which are some of the important ingredients of the design approach.
Abstract: Today electronic governance (E-governance) is no longer a buzzword but a reality, as countries all over the world have shown interest in harnessing governance with state-of-the-art information and communication technology (ICT) in order to foster better governance. However, the inherent complexities of E-governance systems remain a challenge for architects developing large-scale, distributed, and interoperable E-governance applications. Besides this, the dynamic nature of such applications further complicates system design. In this paper, we present a design approach based on the service oriented paradigm for building E-governance systems. We also formalize concepts like service environment, service composition, and service collaboration, which are some of the important ingredients of our design approach. In the sequel, we highlight the suitability of our approach through some E-governance service provisioning scenarios.