
Showing papers in "IOSR Journal of Computer Engineering in 2013"


Journal ArticleDOI
TL;DR: A new model used to enhance voltage stability is identified and several key issues that had remained as research challenges in this area are exposed.
Abstract: The intent of this paper is to present an analysis of reactive power control and voltage stability in power systems. It identifies a new model used to enhance voltage stability and exposes several key issues that have remained research challenges in this area. Steady-state voltage and reactive power in distribution systems can be properly controlled by coordinating the available voltage and reactive power control equipment, such as on-load tap-changers, substation shunt capacitors and feeder shunt capacitors. The paper begins with an overview of reactive power and voltage stability in transmission, distribution and load, and the importance of providing reactive power locally. It then describes selected control features of shunt compensation systems: SVCs (Static Var Compensators), STATCOM-type systems (Static Compensators), static reactive power generators, and systems that combine both solutions, referred to as STATCOM-based SVCs. It explains the need to improve the voltage stability of the power system, as well as the increasing requirements for energy quality and security, and discusses the techniques adopted for controlling and monitoring the rate of power flow across the entire power system topology. The goal is to operate the system at an optimal level in order to reduce losses, ensure sufficient reactive power control during normal and emergency conditions, and prevent voltage collapse.
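One concrete piece of the coordination the abstract describes is sizing a shunt capacitor to supply reactive power locally. The sketch below uses only the standard power-factor-correction formula Q_c = P(tan φ1 − tan φ2); the load figures are hypothetical, not taken from the paper.

```python
import math

def shunt_capacitor_kvar(p_kw, pf_initial, pf_target):
    """Reactive power (kVAr) a shunt capacitor must supply to raise
    the load power factor from pf_initial to pf_target:
    Q_c = P * (tan(phi1) - tan(phi2))."""
    phi1 = math.acos(pf_initial)
    phi2 = math.acos(pf_target)
    return p_kw * (math.tan(phi1) - math.tan(phi2))

# Hypothetical example: a 100 kW load at 0.70 lagging corrected to 0.95.
qc = shunt_capacitor_kvar(100.0, 0.70, 0.95)
```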

60 citations


Journal ArticleDOI
TL;DR: In this work, a detailed formulation and explanation of the Firefly algorithm implementation is given and later Firefly algorithm is verified using six unimodal engineering optimization problems reported in the specialized literature.
Abstract: Meta-heuristic algorithms prove to be competent in outperforming deterministic algorithms for real-world optimization problems. The Firefly algorithm is one such recently developed algorithm, inspired by the flashing behavior of fireflies. In this work, a detailed formulation and explanation of the Firefly algorithm's implementation is given. The algorithm is then verified on six unimodal engineering optimization problems reported in the specialized literature.
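The flashing-attraction rule the abstract refers to can be sketched in a few lines. This is a minimal illustration of the standard Firefly update (attractiveness decaying with squared distance, plus a damped random walk), not the authors' implementation; all parameter values below are assumptions.

```python
import math, random

def firefly_minimize(f, dim, n=20, iters=100, alpha=0.2, beta0=1.0,
                     gamma=0.01, seed=0):
    """Minimize f over [-5, 5]^dim with a basic Firefly algorithm."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    best = list(min(X, key=f))
    for _ in range(iters):
        I = [f(x) for x in X]                    # lower objective = brighter
        for i in range(n):
            for j in range(n):
                if I[j] < I[i]:                  # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    X[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                            for a, b in zip(X[i], X[j])]
        alpha *= 0.97                            # damp the random walk over time
        gen_best = min(X, key=f)
        if f(gen_best) < f(best):
            best = list(gen_best)
    return best

sphere = lambda x: sum(v * v for v in x)         # a unimodal test function
best = firefly_minimize(sphere, dim=2)
```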

49 citations


Journal ArticleDOI
TL;DR: A comprehensive simulation-based performance study and analysis is performed on various types of routing protocols over MANET (a collection of wireless mobile nodes which forms a dynamic temporary network without using any existing infrastructure), based on OPNET simulation.
Abstract: Wireless technology is at its peak when we talk about research and innovation. This field has become a hub of invention of new theories and structures, and the Mobile Ad-hoc Network is a special point of focus for researchers. A MANET is a collection of wireless mobile nodes which forms a dynamic temporary network without using any existing infrastructure. In recent times the market for mobile devices, laptops, handheld and portable devices is at its zenith, so communication between such devices is an important issue. Routing is an essential part of successful communication among these mobile structures, and routing protocols play an important role in finding an efficient and reliable route from source to destination. In the literature, there are numerous MANET routing protocols aiming to find the most suitable path from source to destination. In this paper, a comprehensive simulation-based performance study and analysis is performed on various types of routing protocols over MANET. Ad Hoc On-Demand Distance Vector (AODV), Dynamic Source Routing (DSR), Temporally-Ordered Routing Algorithm (TORA), Optimized Link State Routing (OLSR) and the Geographic Routing Protocol (GRP) have been considered for investigation, based on OPNET simulation. Moreover, the performance of these routing protocols is measured on the basis of throughput, delay, load and data-dropped metrics.

42 citations


Journal ArticleDOI
TL;DR: ATCS is an Automated Toll Collection System that collects tax automatically with the help of radio frequency, assuring time saving and fuel conservation and also contributing to saving money.
Abstract: ATCS is an Automated Toll Collection System used for collecting tax automatically. Identification is done with the help of radio frequency: each vehicle holds an RFID tag, which is simply a unique identification number assigned by the RTO or traffic governing authority. Against this number we store all basic information about the vehicle as well as the amount paid in advance for toll collection. Readers are strategically placed at the toll collection center. Whenever a vehicle passes the toll naka, the tax amount is deducted from its prepaid balance and the new balance is updated. In case of insufficient balance, the updated balance becomes negative. To tackle this problem, an alarm is sounded to alert the authority that the vehicle does not have sufficient balance, so that the particular vehicle can be trapped. As vehicles do not have to stop in a queue, the system assures time saving and fuel conservation, and also contributes to saving money.
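The balance-deduction and alarm logic described above reduces to a few lines. The tag ID, toll amount and starting balance below are invented for illustration.

```python
def process_vehicle(accounts, tag_id, toll=50):
    """Deduct the toll from a prepaid RFID account. A negative balance
    is allowed but flagged (returns False), which is the cue to sound
    the alarm so the authority can trap the vehicle."""
    accounts[tag_id] = accounts.get(tag_id, 0) - toll
    return accounts[tag_id] >= 0   # False -> raise the alarm

# Hypothetical account with a prepaid balance of 120.
accounts = {"MH12AB1234": 120}
ok1 = process_vehicle(accounts, "MH12AB1234")   # balance 120 -> 70
ok2 = process_vehicle(accounts, "MH12AB1234")   # balance 70 -> 20
ok3 = process_vehicle(accounts, "MH12AB1234")   # balance 20 -> -30, alarm
```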

35 citations


Journal ArticleDOI
TL;DR: Different data mining applications and various classification algorithms are studied; the algorithms are applied to different datasets to evaluate their efficiency, to improve performance through data preprocessing and feature selection, and to predict new class labels.
Abstract: Data mining is the knowledge discovery process of analyzing large volumes of data from various perspectives and summarizing it into useful information; it has become an essential component in various fields of human life, and is used to identify hidden patterns in large data sets. Classification techniques are supervised learning techniques that classify data items into predefined class labels. Classification is one of the most useful techniques in data mining for building models from an input data set; such models are commonly used to predict future data trends. In this paper we work with different data mining applications and various classification algorithms; these algorithms are applied to different datasets to find out the efficiency of each algorithm, to improve performance by applying data preprocessing techniques and feature selection, and to predict new class labels.
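As a minimal example of the kind of supervised classifier the abstract discusses (not one of the paper's specific algorithms), a one-nearest-neighbour model assigns each new item the class label of its closest training item. The toy points and labels are invented.

```python
def nearest_neighbour_predict(points, labels, x):
    """1-NN classification: x gets the class label of its closest
    training point under squared Euclidean distance."""
    def dist2(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b))
    best = min(points, key=lambda p: dist2(p, x))
    return labels[points.index(best)]

# Invented two-class training data.
points = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.2), (4.8, 5.1)]
labels = ["low", "low", "high", "high"]
pred = nearest_neighbour_predict(points, labels, (4.5, 4.9))
```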

34 citations


Journal ArticleDOI
TL;DR: The result shows that the proposed clustering algorithm can predict the likelihood of patients getting a heart attack in a more efficient and cost effective way than the other well known algorithms.
Abstract: Cardiovascular disease remains the biggest cause of death worldwide. The percentage of premature deaths from this disease ranges from 4% in high-income countries to 42% in low-income countries, which shows the importance of predicting heart disease at an early stage. In this paper, a new unsupervised classification system is adopted for early-stage heart attack prediction using the patient's medical record. The information in the patient record is preprocessed initially using data mining techniques, and the attributes are then classified using a Fuzzy C-Means classifier. In the classification stage, 13 attributes are given as input to the Fuzzy C-Means (FCM) classifier to determine the risk of heart attack. FCM is an unsupervised clustering algorithm which allows one piece of data to belong to two or more clusters. The proposed system will provide an aid for physicians to diagnose the disease in a more efficient way. The efficiency of the classifier is tested using records collected from 270 patients, giving a classification accuracy of 92%. The result shows that the proposed clustering algorithm can predict the likelihood of patients getting a heart attack in a more efficient and cost-effective way than other well-known algorithms.
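The defining property of FCM mentioned above, one data point belonging to several clusters at once, is visible in a tiny pure-Python sketch on 1-D data. The paper uses 13 patient attributes; the data, parameters and center initialization below are assumptions made to keep the sketch short.

```python
def fuzzy_c_means(xs, c=2, m=2.0, iters=50):
    """Fuzzy C-Means on 1-D data: each point belongs to every cluster
    with a membership degree, and each row of U sums to 1."""
    lo, hi = min(xs), max(xs)
    centers = [lo + (i + 0.5) * (hi - lo) / c for i in range(c)]  # spread evenly
    for _ in range(iters):
        # membership update: u_i(x) = 1 / sum_j (d_i / d_j)^(2/(m-1))
        U = [[1.0 / sum((max(abs(x - centers[i]), 1e-9) /
                         max(abs(x - centers[j]), 1e-9)) ** (2 / (m - 1))
                        for j in range(c))
              for i in range(c)] for x in xs]
        # center update: mean of the data weighted by u^m
        centers = [sum(U[k][i] ** m * xs[k] for k in range(len(xs))) /
                   sum(U[k][i] ** m for k in range(len(xs)))
                   for i in range(c)]
    return centers, U

data = [1.0, 1.1, 0.9, 8.0, 8.2, 7.9]   # two obvious groups
centers, U = fuzzy_c_means(data)
```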

29 citations


Journal ArticleDOI
TL;DR: This paper delineates a robust hybrid system for recognition of handwritten Bangla numerals for the automated postal system, performing feature extraction using k-means clustering, Bayes' theorem and Maximum a Posteriori, with recognition then performed using a Support Vector Machine.
Abstract: Recognition of handwritten Bangla numerals finds numerous applications in postal system automation, passport and document analysis, and even number plate identification. However, practical applications require a high and reliable recognition rate. This paper delineates a robust hybrid system for recognition of handwritten Bangla numerals for the automated postal system, which performs feature extraction using k-means clustering, Bayes' theorem and Maximum a Posteriori; recognition is then performed using a Support Vector Machine. Recognition of handwritten numerals, such as postal codes, reveals all kinds of local and global deformations: distortions, different writing styles, thickness variations, a wide variety of scales, limited amounts of rotation, added noise, occlusion and missing parts. This paper shows that the proposed method is better than other systems. Keywords - K-means clustering, Bayes' theorem, MAP, PCA, SVM and OCR.

28 citations


Journal ArticleDOI
TL;DR: This proposed model gives two layers of security for secret data, which fully satisfy the basic key factors of information security system that includes: Confidentiality, Authenticity, Integrity and Non - Repudiation.
Abstract: In the present scenario, any communication over internet and network applications requires security. Lots of data security and data hiding algorithms have been developed in the last decade. Cryptography and steganography are the two major techniques for secret communication. In this paper, the secret image is first encrypted using the BLOWFISH algorithm, which has very good performance and is a most powerful technique compared to other algorithms. This encrypted image is then embedded in a video using the LSB approach of steganography. Our proposed model gives two layers of security for secret data, which fully satisfies the basic key factors of an information security system: Confidentiality, Authenticity, Integrity and Non-Repudiation.
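The two-layer pipeline (encrypt first, then hide in the carrier's least significant bits) can be sketched as follows. Blowfish is not in the Python standard library, so a SHA-256-derived XOR keystream stands in for it purely to keep the sketch self-contained, and the cover bytes are dummy stand-ins for video samples.

```python
import hashlib

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    """Stand-in cipher (the paper uses Blowfish): XOR with a
    SHA-256-derived keystream. Applying it twice decrypts."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def lsb_embed(cover, payload: bytes) -> bytearray:
    """Write each payload bit into the least significant bit of one
    cover byte (e.g. one video sample per bit)."""
    out = bytearray(cover)
    for i in range(len(payload) * 8):
        bit = (payload[i // 8] >> (7 - i % 8)) & 1
        out[i] = (out[i] & 0xFE) | bit
    return out

def lsb_extract(stego, nbytes: int) -> bytes:
    bits = [stego[i] & 1 for i in range(nbytes * 8)]
    return bytes(sum(bits[k * 8 + j] << (7 - j) for j in range(8))
                 for k in range(nbytes))

secret = xor_encrypt(b"top secret", b"key")          # layer 1: encrypt
cover = bytearray(range(256)) * 2                    # dummy "video" samples
stego = lsb_embed(cover, secret)                     # layer 2: hide
recovered = xor_encrypt(lsb_extract(stego, len(secret)), b"key")
```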

28 citations


Journal ArticleDOI
TL;DR: The accuracy percentage, sensitivity percentage and specificity percentage are considered to provide a result on the performance of the various classification techniques used in healthcare industry.
Abstract: The healthcare industry is an industry where the data is very large and sensitive, and must be handled very carefully without any mismanagement. Various data mining techniques have been used in the healthcare industry, but the research that remains to be done concerns the performance of the various classification techniques, so that the best one among them can be chosen. In this paper, we consider the accuracy percentage, sensitivity percentage and specificity percentage to provide a result.
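The three performance figures the paper compares are all derived from a classifier's confusion matrix. A minimal sketch, with invented counts:

```python
def classifier_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity (recall on positives) and specificity
    (recall on negatives), each as a percentage."""
    accuracy    = 100.0 * (tp + tn) / (tp + fp + tn + fn)
    sensitivity = 100.0 * tp / (tp + fn)
    specificity = 100.0 * tn / (tn + fp)
    return accuracy, sensitivity, specificity

# Hypothetical confusion-matrix counts for one classifier.
acc, sens, spec = classifier_metrics(tp=40, fp=5, tn=45, fn=10)
```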

28 citations


Journal ArticleDOI
TL;DR: This paper explains a method of implementing two-factor authentication to secure user accounts, using an OTP (One Time Password) delivered by SMS or generated by a smartphone.
Abstract: This paper explains a method of implementing two-factor authentication, using an SMS OTP or an OTP generated by a smartphone (One Time Password), to secure user accounts. The proposed method guarantees that online banking authentication is secured, and the method can also be useful for e-shopping and ATM machines. The proposed system involves generating and delivering a One Time Password to a mobile phone: the smartphone can be used as a token for creating the OTP, or the OTP can be sent to the mobile phone in the form of an SMS. The generated OTP is valid only for a short period of time, and it is generated and verified using a secure cryptographic algorithm. The proposed system has been implemented and tested successfully. Keywords - OTP, Authentication, SHA-1, Cloud, token, Android
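The abstract names SHA-1 but not the exact construction; one standard way to build such an OTP is the RFC 4226 HMAC-SHA-1 scheme (HOTP), with the time-based RFC 6238 variant giving the short validity window. This sketch should be read as that standard construction, not as the paper's own algorithm.

```python
import hashlib, hmac, struct, time

def hotp(secret: bytes, counter: int, digits=6) -> str:
    """HMAC-SHA-1 one-time password (RFC 4226): MAC the counter,
    dynamically truncate to 31 bits, keep the last `digits` digits."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, period=30, digits=6) -> str:
    """Time-based variant (RFC 6238): the counter is the current
    30-second window, which makes the password short-lived."""
    return hotp(secret, int(time.time()) // period, digits)

code = hotp(b"12345678901234567890", 0)   # RFC 4226 test secret, counter 0
```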

27 citations


Journal ArticleDOI
TL;DR: A semi-blind algorithm is developed using DWT-DCT and SVD techniques which is robust against several attacks such as cropping, noise, rotation, filtering and translation, and extracts the singular values of the watermark using inverse SVD to validate content authentication.
Abstract: In this paper a semi-blind algorithm is developed using DWT-DCT and SVD techniques which is robust against several attacks such as cropping, noise, rotation, filtering and translation. A trigonometric function is used to closely relate the singular values of the original image and the watermarked image. In this algorithm, DWT is first applied to the host image, which results in four frequency bands: LL, LH, HL and HH. Since the middle frequency band is found to be less prone to attacks, the singular values of the DCT-transformed coefficients of the LH band of the image are modified using the singular values of the DCT-transformed coefficients of the watermark and the scaling factor, with the help of an inverse trigonometric function. These modified singular values are then used to reconstruct the watermarked host image. To validate content authentication, the extraction technique is applied to the watermarked image. It consists of applying DWT to the watermarked image to get the four frequency bands, and then, using the singular values of the DCT coefficients of the middle frequency band and the scaling factor with the trigonometric function, the singular values of the watermark are extracted and the watermark is reconstructed using inverse SVD. Keywords - DCT, DWT, Robust, Semi Blind Watermarking

Journal ArticleDOI
TL;DR: This paper examines the performance of a set of lossless data compression algorithms, implemented and evaluated on different forms of text data.
Abstract: Data compression is the technique through which we can reduce the quantity of data used to represent content without excessively reducing the quality of that content. This paper examines the performance of a set of lossless data compression algorithms on different forms of text data. A set of selected algorithms is implemented to evaluate their performance in compressing text data, using a set of defined text files as a test bed. The performance of the different algorithms is measured on the basis of different parameters and tabulated in this article. The article concludes with a comparison of these algorithms from different aspects. Keywords - Encryption, Entropy Encoding, Dictionary Encoding, Compression Ratio, Compression time, Test Bed.
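The measurement loop the abstract describes (run the same text through several lossless codecs and compare sizes) looks like this with the three codecs that ship in Python's standard library. The sample text is an invented stand-in for the paper's test bed.

```python
import bz2, lzma, zlib

def compare_compressors(text: bytes):
    """Compressed size and compression ratio (original/compressed)
    for three stdlib lossless codecs on the same input."""
    results = {}
    for name, compress in [("zlib", zlib.compress),
                           ("bz2", bz2.compress),
                           ("lzma", lzma.compress)]:
        packed = compress(text)
        results[name] = (len(packed), len(text) / len(packed))
    return results

# Repetitive text compresses well under every codec.
sample = b"the quick brown fox jumps over the lazy dog " * 200
report = compare_compressors(sample)
```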

Journal ArticleDOI
TL;DR: This paper proves that Wavelet Transform method is very effective for all types of noise.
Abstract: Digital images are prone to a variety of noise, including speckle noise, Gaussian noise, and salt-and-pepper noise. It is a difficult task to separate noise from an image while maintaining the desired information and quality of the image, and various algorithms have been proposed to obtain significant results. This paper deals with a comparison of two approaches, the filtering approach and the wavelet-based approach, taking Peak Signal-to-Noise Ratio and Root Mean Square Error as performance parameters. The paper shows that the Wavelet Transform method is very effective for all types of noise. The results of this paper have been simulated in MATLAB.
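The two performance parameters used in the comparison are quick to compute. The six-pixel "images" below are invented to keep the sketch readable; real use would flatten full image arrays.

```python
import math

def rmse(a, b):
    """Root Mean Square Error between two equal-length pixel sequences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def psnr(a, b, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means the denoised
    image is closer to the clean original."""
    e = rmse(a, b)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)

clean = [100, 120, 130, 90, 80, 110]   # invented reference pixels
noisy = [104, 117, 135, 86, 83, 108]   # invented noisy pixels
```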

Journal ArticleDOI
TL;DR: A survey of different image encryption techniques is presented, from which researchers can get an idea of efficient techniques to be used.
Abstract: Due to the rapid growth of digital communication and multimedia applications, security has become an important issue in the communication and storage of images. Encryption is one of the ways to ensure high security, and images are used in many fields such as medical science and the military. Modern cryptography provides essential techniques for securing information and protecting multimedia data. In recent years, encryption technology has developed quickly and many image encryption methods have been used to protect confidential image data from unauthorized access. In this paper, a survey of different image encryption techniques is presented, from which researchers can get an idea of efficient techniques to be used.

Journal ArticleDOI
TL;DR: The results prove that contourlet coefficient co-occurrence matrix texture features can be successfully applied for the classification of mammogram images.
Abstract: This work presents and investigates the discriminatory capability of contourlet coefficient co-occurrence matrix features in the analysis and classification of mammogram images. It has been revealed that the contourlet transform has a remarkable potential for the analysis of images representing smooth contours and fine geometrical structures, and is thus suitable for textural details. Initially the ROI (Region of Interest) is cropped from the original image and its contrast is enhanced using histogram equalization. The ROI is decomposed using the contourlet transform, and co-occurrence matrices are generated for four different directions (θ = 0°, 45°, 90° and 135°) at distance d = 1 pixel. For each co-occurrence matrix a variety of second-order statistical texture features is extracted, and the dimensionality of the features is reduced using the Sequential Floating Forward Selection (SFFS) algorithm. A PNN is used for classification. For experimental evaluation, 200 images are taken from the mini-MIAS (Mammographic Image Analysis Society) database. Experimental results show that the proposed methodology is efficient, achieving a maximum classification accuracy of 92.5%. The results prove that contourlet coefficient co-occurrence matrix texture features can be successfully applied to the classification of mammogram images. Keywords - Contourlet Transform, Mammogram, SFFS, PNN, ROI, MIAS
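A co-occurrence matrix is defined independently of the transform whose coefficients it summarizes, so it can be illustrated directly on a small grey-level grid. The 4x4 image and the contrast feature below are textbook-style examples, not data from the paper.

```python
def glcm(image, dx, dy, levels):
    """Grey-level co-occurrence matrix: M[i][j] counts how often level j
    occurs at offset (dx, dy) from a pixel of level i."""
    h, w = len(image), len(image[0])
    M = [[0] * levels for _ in range(levels)]
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                M[image[y][x]][image[ny][nx]] += 1
    return M

def contrast(M):
    """One second-order texture feature: sum of (i-j)^2 * p(i, j)."""
    total = sum(map(sum, M))
    return sum((i - j) ** 2 * M[i][j] / total
               for i in range(len(M)) for j in range(len(M)))

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
M0 = glcm(img, dx=1, dy=0, levels=4)   # direction θ = 0°, distance d = 1
```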

Journal ArticleDOI
TL;DR: This paper proposes a comparison among all kinds of biometric systems available, based upon various aspects, to make it easy to select a biometric device for deployment in a specific environment.
Abstract: A biometric system provides automatic recognition of an individual based on a unique feature or characteristic possessed by that individual. These biometric characteristics may be physiological or behavioral. Unlike other identification methods such as ID proofs, tokens and passwords, the distinct aspect of biometric recognition comes from randomly distributed features in human beings. In this paper, I describe a novel comparison based upon various aspects, to make it easy to select a biometric device for deployment in a specific environment; the comparison covers all kinds of biometric systems available. The existing computer security systems used at various places, such as banking, passports, credit cards, smart cards, PINs, access control and network security, use usernames and passwords for person identification. Biometric systems also introduce an aspect of user convenience: one can be authorized simply by presenting oneself. In this paper, the main focus is on the working principle of biometric techniques, the various biometric systems and their comparisons.

Journal ArticleDOI
TL;DR: This paper examines CAPTCHAs, their working and the literature, and provides a classification of CAPTCHAs, their application areas and guidelines for generating a CAPTCHA.
Abstract: Today several daily activities such as communication, education, e-commerce, entertainment and other tasks are carried out using the internet. To perform such web activities users have to register on the relevant websites. During registration, some intruders write malicious programs that waste website resources by making automatic false enrolments; these programs are called bots. Such false enrolments may adversely affect the working of websites, so it becomes necessary to differentiate between human users and web bots (or computer programs); the mechanism for doing so is known as CAPTCHA. A CAPTCHA is based on identifying distorted text, the color of an image, an object or the background. This paper examines CAPTCHAs, their working and the literature. It also provides a classification of CAPTCHAs, their application areas and guidelines for generating a CAPTCHA.

Journal ArticleDOI
TL;DR: This paper describes the challenges due to lack of coordination between Security agencies and the Critical IT Infrastructure, and focuses on cyber security emerging trends while adopting new technologies such as mobile computing, cloud computing, e-commerce, and social networking.
Abstract: Cyber security is the activity of protecting information and information systems (networks, computers, databases, data centres and applications) with appropriate procedural and technological security measures. Firewalls, antivirus software, and other technological solutions for safeguarding personal data and computer networks are essential but not sufficient to ensure security. As our nation rapidly builds its cyber-infrastructure, it is equally important that we educate our population to work properly with this infrastructure. Cyber-ethics, cyber-safety, and cyber-security issues need to be integrated into the educational process beginning at an early age. Security countermeasures help ensure the confidentiality, availability, and integrity of information systems by preventing or mitigating asset losses from cyber security attacks. Recently cyber security has emerged as an established discipline for computer systems and infrastructures, with a focus on protection of valuable information stored on those systems from adversaries who want to obtain, corrupt, damage, destroy or prohibit access to it. An Intrusion Detection System (IDS) is a program that analyses what happens or has happened during an execution and tries to find indications that the computer has been misused. A wide range of metaphors was considered, including those relating to military and other types of conflict, biology, health care, markets, three-dimensional space, and physical asset protection. These in turn led to consideration of a variety of possible approaches for improving cyber security in the future; these approaches were labelled "Heterogeneity", "Motivating Secure Behaviour" and "Cyber Wellness". Cyber security plays an important role in the development of information technology as well as Internet services. Our attention is usually drawn to "cyber security" when we hear about "cyber crimes", so our first thought on "national cyber security" starts with how good our
infrastructure is for handling "cyber crimes". This paper focuses on emerging trends in cyber security arising from the adoption of new technologies such as mobile computing, cloud computing, e-commerce, and social networking. The paper also describes the challenges due to lack of coordination between security agencies and the critical IT infrastructure. Keywords - cyber safety, e-commerce, intrusion detection system (IDS), internet engineering task force (IETF), metaphors

Journal ArticleDOI
TL;DR: A new method called fully homomorphic encryption (FHE), which performs computation on encrypted data before sending the result to the client, offers a realistic hope that such calculations can be performed securely in the cloud.
Abstract: As the data storage challenge continues to grow for insurers and everyone else, one of the obvious solutions is cloud technology. Storing data on remote servers rather than in-house is definitely a money-saver, but in insurance circles the worry has been that having critical data reside outside the physical and virtual walls of the insurance enterprise is a risky situation. As the IT field moves rapidly towards cloud computing, the software industry's focus is shifting from developing applications for PCs to data centers and clouds that enable millions of users to make use of software simultaneously. "Attempting computation on sensitive data stored on shared servers leaves that data exposed in ways that traditional encryption techniques can't protect against," the article notes. "The main problem is that to manipulate the data, it has to be decoded first." A new method called fully homomorphic encryption (FHE), which performs computation on the encrypted data and sends the result to the client, offers a realistic hope that such calculations can be performed securely in the cloud. Keywords: Cloud computing, fully homomorphic encryption, security threats. I. Introduction: "Homomorphic" is an adjective which describes a property of an encryption scheme; that property, in simple terms, is the ability to perform computations on the ciphertext without decrypting it first. Our ultimate goal is to use a fully homomorphic encryption scheme E. Let us discuss what it means to be fully homomorphic. At a high level, the essence of fully homomorphic encryption is simple: given ciphertexts that encrypt m1, m2, …, mt, fully homomorphic encryption should allow anyone (not just the key-holder) to output a ciphertext that encrypts f(m1, m2, …, mt) for any desired function f, as long as that function can be efficiently computed.
No information about m1, m2, …, mt, or about f(m1, m2, …, mt), or about any intermediate plaintext values, should leak; the inputs, output and intermediate values are always encrypted.
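FHE itself is far too heavy to sketch in a few lines, but the homomorphic property the introduction defines is easy to see in textbook RSA, which is only *partially* homomorphic: multiplying ciphertexts multiplies the underlying plaintexts, so a single operation is supported rather than arbitrary f. The toy primes below are for illustration only, never for real use.

```python
# Textbook RSA with toy parameters (no padding, tiny primes).
p, q = 61, 53
n = p * q                     # modulus 3233
e, d = 17, 2753               # e*d ≡ 1 (mod (p-1)*(q-1))

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 11
c_prod = (enc(m1) * enc(m2)) % n   # "the cloud" multiplies ciphertexts only
result = dec(c_prod)               # client decrypts: equals m1 * m2 (mod n)
```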

Journal ArticleDOI
TL;DR: An optimized version of the FCFS scheduling algorithm is proposed to address the major challenges of task scheduling in the cloud, together with a module depicting the normal FCFS algorithm in comparison to the optimized version for resource provisioning in the cloud.
Abstract: In our project, we propose an optimized version of the FCFS scheduling algorithm to address the major challenges of task scheduling in the cloud. The incoming tasks are grouped on the basis of task requirements, such as minimum execution time or minimum cost, and prioritized in FCFS manner. Resource selection is done on the basis of task constraints using a greedy approach. The proposed model will be implemented and tested on a simulation toolkit. We intend to create a module depicting the normal FCFS algorithm in comparison to our optimized version for resource provisioning in the cloud.
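The abstract describes the model only at a high level, so the following is one plausible reading rather than the authors' implementation: tasks are served in arrival (FCFS) order, and a greedy rule picks the fastest resource for time-constrained tasks and the cheapest for cost-constrained ones. All names and numbers are invented.

```python
from collections import namedtuple

Task = namedtuple("Task", "name arrival need")      # need: "time" or "cost"
Resource = namedtuple("Resource", "name speed cost")

def schedule(tasks, resources):
    """Serve tasks in FCFS (arrival) order; greedily match each task
    to the resource that best fits its requirement group."""
    plan = []
    for task in sorted(tasks, key=lambda t: t.arrival):     # FCFS priority
        if task.need == "time":
            r = max(resources, key=lambda r: r.speed)       # greedy: fastest
        else:
            r = min(resources, key=lambda r: r.cost)        # greedy: cheapest
        plan.append((task.name, r.name))
    return plan

tasks = [Task("t1", 0, "time"), Task("t2", 1, "cost"), Task("t3", 2, "time")]
resources = [Resource("vm-fast", 10, 5), Resource("vm-cheap", 2, 1)]
plan = schedule(tasks, resources)
```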

Journal ArticleDOI
TL;DR: XpertMalTyph; a novel medical diagnostic expert system for the various kinds of malaria and typhoid complications is designed and implemented based on JESS (Java Expert System Shell) programming because of its robust inference engine and rules for implementing expert systems.
Abstract: The dearth of medical experts in the developing world has subjected a large percentage of its populace to preventable ailments and deaths. Also, because of the predominance of rural communities, the few medical experts that are available always opt to practise in the few urban cities. This consequently puts the rural communities at a disadvantage with respect to access to quality health care services. In this work, we designed and implemented XpertMalTyph, a novel medical diagnostic expert system for the various kinds of malaria and typhoid complications. A medical diagnostic expert system uses computers to simulate a medical doctor's skills in diagnosing ailments and prescribing treatments, and hence can be used to provide the same service in the absence of the experts. XpertMalTyph is based on JESS (Java Expert System Shell) programming because of its robust inference engine and rules for implementing expert systems.
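The core of a shell like JESS is forward chaining: rules fire when all their conditions hold, and fired conclusions may enable further rules. A toy Python version conveys the idea; the rules below are invented placeholders, not medical knowledge from XpertMalTyph.

```python
def diagnose(symptoms, rules):
    """Tiny forward-chaining engine: repeatedly fire every rule whose
    conditions are all present, until no new conclusions appear."""
    findings = set(symptoms)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in findings and all(c in findings for c in conditions):
                findings.add(conclusion)
                changed = True
    return findings - set(symptoms)   # only the derived conclusions

# Hypothetical illustrative rules, NOT medical knowledge from the paper.
rules = [
    ({"fever", "chills", "sweating"}, "suspect malaria"),
    ({"fever", "abdominal pain", "rose spots"}, "suspect typhoid"),
    ({"suspect malaria", "suspect typhoid"}, "suspect co-infection"),
]
result = diagnose({"fever", "chills", "sweating",
                   "abdominal pain", "rose spots"}, rules)
```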

Journal ArticleDOI
TL;DR: IEEE standard and protocols for vehicular communication, IEEE 802.11p and IEEE 1609.x, also known as WAVE protocol stack are presented and the necessary requirements for a generic discrete event simulator which can be used to simulate Vehicular Ad-hoc Networks are discussed.
Abstract: Vehicular communication is considered to be a backbone for many critical safety applications. In order to achieve a better implementation of any vehicular communication scenario, an efficient, accurate and reliable simulator is essential. Various open source and commercial simulation tools are available for this purpose. One of the key issues in this regard is the selection of a reliable simulator which implements all standard algorithms and paradigms and gives accurate results. In this paper, we first present the IEEE standards and protocols for vehicular communication, IEEE 802.11p and IEEE 1609.x, also known as the WAVE protocol stack. The paper then discusses the necessary requirements for a generic discrete event simulator which can be used to simulate Vehicular Ad-hoc Networks. Since not all network simulators can be used in the scenario of vehicular communication, we highlight the key features of some network simulators in the context of vehicular ad-hoc networks, as well as some of their implementation limitations. Furthermore, the paper presents a discussion on traffic simulators, emphasizing the underlying mobility models used to generate realistic traffic patterns. A comparative study of both network and traffic simulators shows the pros and cons of these simulation tools. The paper suggests the appropriate choice of a network simulator to be used as a VANET simulator.

Journal ArticleDOI
TL;DR: The evolution of full adder circuits in terms of lower power consumption and higher speed is discussed in this paper.
Abstract: The full adder circuit is an important component in applications such as Digital Signal Processing (DSP) architectures, microprocessors, microcontrollers and data processing units. This paper discusses the evolution of full adder circuits in terms of lower power consumption and higher speed, starting with the most conventional 28-transistor full adder and gradually moving to full adders consisting of as few as 8 transistors. We have also included some of the most popular full adder cells such as dynamic CMOS [9], dual-rail domino logic [14], the Static Energy Recovery Full adder (SERF) [7][8], Adder9A, Adder9B and the GDI-based full adder.
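Every transistor-level variant the paper surveys realises the same Boolean behaviour, which is worth stating explicitly: sum = a XOR b XOR cin, carry-out = majority(a, b, cin). Here it is in executable form, chained into a small ripple-carry adder.

```python
def full_adder(a, b, cin):
    """One-bit full adder: the function every 28T-to-8T cell implements."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def ripple_add(x, y, bits=4):
    """Chain full adders into a ripple-carry adder for `bits`-bit words."""
    carry, out = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out, carry

total, carry = ripple_add(9, 5)   # 9 + 5 = 14, fits in 4 bits
```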

Journal ArticleDOI
TL;DR: The aim of this paper is to provide the past and current techniques in Web Mining and the overview of development in research of web mining and some important research issues related to it.
Abstract: This paper is a survey of the existing techniques of web mining and the issues related to it. The World Wide Web acts as an interactive and popular way to transfer information. Due to the enormous and diverse information on the web, users cannot make use of that information very effectively and easily. Data mining concentrates on the non-trivial extraction of implicit, previously unknown and potentially useful information from very large amounts of data. Web mining is an application of data mining which has become an important area of research due to the vast number of World Wide Web services in recent years. The aim of this paper is to present the past and current techniques in web mining. The paper also summarizes the various techniques of web mining from the following angles: feature extraction, transformation and representation, and data mining techniques in various application domains. The survey of data mining techniques covers clustering, classification, sequence pattern mining, association rule mining and visualization. The research work done by different authors, depicting the pros and cons, is discussed. The paper also gives an overview of developments in web mining research and some important research issues related to it.

Journal ArticleDOI
TL;DR: This paper proposes a method of audio steganographic system that provides a unique platform to hide the secret information in audio file though the information is in text, image or in an audio format and provides security by using PKE algorithm.
Abstract: In this digital world, a huge amount of information exchange takes place thanks to enhanced networking facilities. It is therefore necessary to secure the information we transmit. The need for secured communication introduces the concept of steganography. Steganography, as the word itself indicates, means hiding information within information; it is an effective technique for concealing secret information inside cover objects. The secret information may be a text, an image or an audio file, and different steganographic techniques exist for each format. This paper proposes an audio steganographic system that provides a single platform to hide secret information in an audio file whether that information is in text, image or audio format, so there is no need to switch between different steganographic techniques per format. Many steganographic methods use LSB insertion to hide the secret information, but several statistical techniques can determine whether a stego object has been subjected to LSB embedding. The proposed system therefore hides secret information in the audio file through a random-based approach and provides security by using a PKE algorithm. This paper focuses on combining the strengths of cryptography and steganography for secured communication.
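The random-based LSB idea can be sketched as follows. This is an illustrative toy, in which a shared secret seed stands in for the paper's PKE-protected key material; it is not the authors' exact scheme:

```python
import random

def embed_lsb(samples, payload, seed):
    """Hide payload bits in the LSBs of pseudo-randomly chosen audio samples."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    rng = random.Random(seed)                       # shared secret seed
    positions = rng.sample(range(len(samples)), len(bits))
    out = list(samples)
    for pos, bit in zip(positions, bits):
        out[pos] = (out[pos] & ~1) | bit            # overwrite only the LSB
    return out

def extract_lsb(samples, n_bytes, seed):
    """Recover the payload by regenerating the same pseudo-random positions."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(samples)), n_bytes * 8)
    bits = [samples[p] & 1 for p in positions]
    return bytes(sum(bits[i * 8 + j] << j for j in range(8))
                 for i in range(n_bytes))
```

Scattering the modified samples pseudo-randomly (rather than sequentially) is what frustrates the sequential statistical tests mentioned above, since the embedded bits are no longer concentrated at the start of the file.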

Journal ArticleDOI
TL;DR: The security and privacy concerns of cloud computing and some possible solutions to enhance the security are discussed and a secured framework for cloud computing is proposed.
Abstract: The National Institute of Standards and Technology (NIST) defines cloud computing as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or cloud provider interaction [Def:1]. Cloud computing has the potential to change how organizations manage information technology and to transform the economics of hardware and software at the same time. It promises to enable a new set of entrepreneurs who can start their ventures with zero investment in IT infrastructure. However, this captivating technology has formidable security concerns. The promise of cloud computing, especially the public cloud, can be overshadowed by security breaches, which are inevitable. As an emerging information technology area, cloud computing should be approached carefully. In this article we discuss the security and privacy concerns of cloud computing and some possible solutions to enhance security. Based on the suggested security solutions, I propose a secured framework for cloud computing.

Journal ArticleDOI
TL;DR: A new approach to face detection systems using the skin color of a subject, based on color tone values specially defined for skin area detection within the image frame, which can detect a face regardless of the background of the picture.
Abstract: Human face recognition systems have gained considerable attention during the last decade due to their vast applications in computing and their advantages over earlier biometric methods. There are many applications with respect to security, sensitivity and secrecy. Face detection is the first and most important step of a recognition system. This paper introduces a new approach to face detection based on the skin color of a subject. The system can detect a face regardless of the background of the picture, which is an important phase for face identification. The images used in this system are color images, which carry more information than gray-scale images. In face detection, the two respective classes are the "face area" and the "non-face area". This new approach is based on color tone values specially defined for skin-area detection within the image frame. The system first resizes the image and then separates it into its component R, G, and B bands. These bands are transformed into the YCbCr color space and then into the YC'bC'r space (the skin color tone). Morphological processing is applied to the resulting image to make it more accurate. Finally, the system locates the face area by projection. Experimental results show that the proposed algorithm localizes a human face in an image with an accuracy of 92.69%.
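The RGB-to-YCbCr transformation and the skin-tone test can be sketched per pixel as below. The Cb/Cr bounds used are commonly cited illustrative values for skin detection, not the paper's exact YC'bC'r thresholds:

```python
def rgb_to_ycbcr(r, g, b):
    """ITU-R BT.601 RGB -> YCbCr conversion (full-range approximation)."""
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def is_skin(r, g, b):
    """Classify one pixel as skin using chrominance-only bounds.

    Luminance (Y) is ignored so the test is robust to lighting; the
    thresholds below are widely used illustrative values.
    """
    _, cb, cr = rgb_to_ycbcr(r, g, b)
    return 77 <= cb <= 127 and 133 <= cr <= 173
```

Running this over every pixel yields a binary skin mask, which the paper then cleans up with morphological operations before projecting to locate the face region.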

Journal ArticleDOI
TL;DR: In this paper, insight is provided into the potential applications of ad hoc networks and the impact of comparison of different routing protocol in term of different parameters is presented.
Abstract: In the past few years, we have seen a rapid expansion in the field of mobile computing due to the proliferation of inexpensive, widely available wireless devices. A mobile ad hoc network (MANET) consists of mobile wireless nodes in which communication between nodes is carried out without any centralized control. A MANET is a self-organized and self-configurable network in which the mobile nodes move arbitrarily, and each node can receive and forward packets as a router. This paper provides insight into the potential applications of ad hoc networks and discusses the technological challenges that protocol designers and network developers face. It also presents a comparison of different routing protocols in terms of different parameters.

Journal ArticleDOI
TL;DR: An encryption technique in cloud computing environment using randomization method to increase security and optimize the encrypted data in migration process is proposed.
Abstract: With the growth of cloud computing environments, security has become the major concern raised when moving data and applications to the cloud, since individuals do not trust third-party cloud computing providers with their private and most sensitive data and information. In this paper, I propose an encryption technique for the cloud computing environment that uses a randomization method to increase security and optimize the encrypted data during the migration process.
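The abstract does not detail its randomization method; the sketch below only illustrates the general idea, that mixing a fresh random nonce into each encryption makes identical plaintexts produce different ciphertexts. It is a toy construction for illustration, not production cryptography and not the author's scheme:

```python
import hashlib
import os

def keystream(key, nonce, length):
    """Derive a pseudorandom keystream from key + nonce (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key, plaintext):
    nonce = os.urandom(16)                        # fresh randomness per message
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key, blob):
    nonce, ct = blob[:16], blob[16:]              # nonce travels with the data
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))
```

Because the nonce is drawn fresh for every call, an observer watching repeated migrations of the same data sees unrelated ciphertexts, which is the security benefit randomization buys.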

Journal ArticleDOI
TL;DR: This paper focuses on the study of spatio-temporal databases and their models, and on the different types of applications where dynamic modeling of spatio-temporal databases can be used; this is an emerging field that contributes much to DBMS by capturing aspects of the real world.
Abstract: This paper focuses on the study of spatio-temporal databases and their models, and on the different types of applications where dynamic modeling of spatio-temporal databases can be used. This is an emerging field that makes a large contribution to DBMS by capturing aspects of the real world. Spatio-temporal data models are the heart of a Spatio-Temporal Information System (STIS); they describe object data types, relationships, operations and rules to maintain database integrity. A rigorous data model must anticipate the spatio-temporal queries and analytical methods to be performed in the STIS. Spatio-temporal database models are proposed to deal with real-world applications in which spatial changes occur over the time line. A serious weakness of existing models is that each of them covers only the few common characteristics found across a number of specific applications. Thus the applicability of a model to different cases fails on spatio-temporal behaviors not anticipated by the application used for the initial model development. Keywords: Land Information System (LIS), Geographical Information System (GIS), spatio-temporal databases, STIS.