
Showing papers in "IOSR Journal of Computer Engineering in 2012"


Journal ArticleDOI
TL;DR: This research paper presents what cloud computing is, the various cloud models, and an overview of the cloud computing architecture; it also analyzes the key research challenges in cloud computing and offers best practices to service providers and enterprises hoping to leverage cloud services to improve their bottom line in this severe economic climate.
Abstract: Cloud computing is a set of IT services that are provided to a customer over a network on a leased basis and with the ability to scale service requirements up or down. Usually, cloud computing services are delivered by a third-party provider who owns the infrastructure. Cloud computing holds the potential to eliminate the requirement of setting up high-cost computing infrastructure for the IT-based solutions and services that the industry uses. It promises to provide a flexible IT architecture, accessible through the internet from lightweight portable devices. This would allow a multi-fold increase in the capacity and capabilities of existing and new software. This new economic model for computing has found fertile ground and is attracting massive global investment. Many industries, such as banking, healthcare, and education, are moving towards the cloud due to the efficiency of services provided by the pay-per-use pattern, based on resources such as processing power used, transactions carried out, bandwidth consumed, data transferred, or storage space occupied. In a cloud computing environment, the entire data resides on a set of networked resources, enabling the data to be accessed through virtual machines. Despite the potential gains achieved from cloud computing, organizations are slow in accepting it because of the security issues and challenges associated with it. Security is one of the major issues that hamper the growth of the cloud. There are also various research challenges in adopting cloud computing, such as well-managed service level agreements (SLAs), privacy, interoperability, and reliability. This research paper presents what cloud computing is, the various cloud models, and an overview of the cloud computing architecture. It also analyzes the key research challenges present in cloud computing and offers best practices to service providers as well as enterprises hoping to leverage cloud services to improve their bottom line in this severe economic climate.

68 citations


Journal ArticleDOI
TL;DR: This paper focuses on the merits and demerits of routing protocols, which will help in developing new routing protocols or improving existing ones in the near future.
Abstract: A VANET (Vehicular Ad-hoc Network) is an emerging technology with some unique characteristics that make it different from other ad hoc networks. Due to rapid topology changes and frequent disconnections, it is difficult to design an efficient routing protocol for routing data among vehicles (V2V, or vehicle-to-vehicle communication) and between vehicles and roadside infrastructure (V2I). Because road accidents occur daily, VANET is one of the influential areas for the improvement of Intelligent Transportation Systems (ITS), which can increase road safety, provide traffic information, and so on. The existing routing protocols for VANETs are not efficient enough to meet every traffic scenario. Suitable routing protocols are required to establish communication between vehicles for future road safety. In this paper, we focus on the merits and demerits of routing protocols, which will help in developing new routing protocols or improving existing ones in the near future.

62 citations


Journal ArticleDOI
TL;DR: This paper presents a general overview of image watermarking and different security issues, and a Least Significant Bit algorithm is used for embedding the message/logo into the image.
Abstract: In recent years, the internet revolution has resulted in an explosive growth in multimedia applications. The rapid advancement of the internet has made it easier to send data/images accurately and faster to the destination. At the same time, it has also become easier to modify and misuse valuable information through hacking. Digital watermarking is one of the proposed solutions for copyright protection of multimedia data. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity. In this paper an invisible watermarking technique (least significant bit) and a visible watermarking technique are implemented. This paper presents a general overview of image watermarking and different security issues. Various attacks are also performed on watermarked images and their impact on image quality is studied. In this paper, image watermarking using the Least Significant Bit (LSB) algorithm has been used for embedding the message/logo into the image. The work has been implemented in MATLAB. Keywords - Watermarking, Least Significant Bit (LSB), JPEG (Joint Photographic Experts Group), Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR)
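A rough illustration of the LSB embedding step described above (a minimal Python sketch, not the authors' MATLAB code): the bits of a binary logo are written into the least significant bits of a grayscale cover image. The array shapes, bit ordering, and the toy data are assumptions for the example.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, logo_bits: np.ndarray) -> np.ndarray:
    """Hide a flat array of 0/1 bits in the LSBs of a uint8 cover image."""
    flat = cover.flatten().astype(np.uint8)
    if logo_bits.size > flat.size:
        raise ValueError("cover image too small for the logo")
    # clear the LSB of the first len(logo_bits) pixels, then write the logo bits
    flat[:logo_bits.size] = (flat[:logo_bits.size] & 0xFE) | logo_bits
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bits: int) -> np.ndarray:
    """Recover the first n_bits hidden bits from the stego image."""
    return stego.flatten()[:n_bits] & 1

# toy example with random data standing in for the cover image and logo
cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
logo = np.random.randint(0, 2, 100, dtype=np.uint8)
stego = embed_lsb(cover, logo)
assert np.array_equal(extract_lsb(stego, logo.size), logo)
```

Because only the least significant bit of each pixel changes, the MSE between cover and stego image stays very small and the PSNR correspondingly high, which is why LSB embedding is considered imperceptible.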

59 citations


Journal ArticleDOI
TL;DR: The authors attempt to identify the major issues and challenges associated with different VANET protocols, security, and simulation tools.
Abstract: Over the last few years, VANETs have received increased attention as a potential technology to enhance active and preventive safety on the road, as well as travel comfort. Several unexpected disastrous situations are encountered on road networks daily, many of which may lead to congestion and safety hazards. If vehicles can be provided with information about such incidents or traffic conditions in advance, the quality of driving can be improved significantly in terms of time, distance, and safety. One of the main challenges in vehicular ad hoc networks is searching for and maintaining an effective route for transporting data. Security and privacy are indispensable in vehicular communications for successful acceptance and deployment of such a technology. A vehicular safety application should be thoroughly tested before it is deployed for real-world use. Simulation tools are preferred over outdoor experiments because they are simple, easy, and cheap. VANETs require that a traffic simulator and a network simulator be used together to perform such tests. In this paper, the authors attempt to identify the major issues and challenges associated with different VANET protocols, security, and simulation tools.

42 citations


Journal ArticleDOI
TL;DR: This paper compares and analyzes several kinds of classical algorithms of image edge detection, including Roberts, Sobel, Prewitt, LOG and Canny with MATLAB tool.
Abstract: Edges are a basic characteristic of an image, and edge detection plays an important role in computer vision and image analysis. The useful and distinctive information contained in the edges of a sub-image makes edge detection a principal approach to image analysis and recognition. This paper compares and analyzes several classical algorithms for image edge detection, including Roberts, Sobel, Prewitt, LOG and Canny, using the MATLAB tool.
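For instance, the Sobel operator mentioned above can be sketched in a few lines of Python (a minimal illustration, not the paper's MATLAB implementation); the threshold value and the synthetic test image are arbitrary assumptions.

```python
import numpy as np
from scipy.ndimage import convolve

def sobel_edges(image: np.ndarray, threshold: float = 100.0) -> np.ndarray:
    """Return a binary edge map using the classical 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)  # horizontal gradient
    ky = kx.T                                                          # vertical gradient
    gx = convolve(image.astype(float), kx)
    gy = convolve(image.astype(float), ky)
    magnitude = np.hypot(gx, gy)          # gradient magnitude at each pixel
    return magnitude > threshold          # simple global threshold

# toy usage on a synthetic image with a bright square in the middle
img = np.zeros((100, 100))
img[30:70, 30:70] = 255
edges = sobel_edges(img)
print(edges.sum(), "edge pixels found")
```

The other operators compared in the paper differ mainly in their kernels (Roberts and Prewitt) or in the pre-smoothing and post-processing steps (LOG and Canny), while the convolve-then-threshold skeleton stays the same.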

36 citations


Journal ArticleDOI
TL;DR: The DAUB-4 wavelet method is used for feature extraction from MR images to differentiate between normal and abnormal brains, and experimental results show that the proposed system has a high classification accuracy of 98.7% with the radial basis kernel.
Abstract: The task of classification in recovery systems is to differentiate between normal and abnormal brains. In this paper, feature extraction from MR images is carried out with the DAUB-4 wavelet method. DAUB-4 is an efficient tool for feature extraction because it gives better contrast to an image. Owing to this better contrast, it enhances rapidly changing signal content in an image and reduces the overhead. PCA is used to select the best features for classification. These PCA-selected features are given as input to an SVM for classification. In this work we use two SVM kernel functions, the linear kernel and the radial basis kernel. Experimental results show that the proposed system has a high classification accuracy of 98.7% with the radial basis kernel.
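A minimal sketch of the wavelet-feature + PCA + SVM pipeline described above, assuming the PyWavelets and scikit-learn libraries as stand-ins for the authors' tooling; the synthetic images, number of PCA components, and single-level decomposition are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def daub4_features(image: np.ndarray) -> np.ndarray:
    """Single-level 2-D Daubechies-4 decomposition; use the approximation band as features."""
    cA, (cH, cV, cD) = pywt.dwt2(image, "db4")
    return cA.ravel()

# synthetic stand-in for normal/abnormal MR slices (64x64 each)
rng = np.random.default_rng(0)
images = rng.normal(size=(100, 64, 64))
labels = rng.integers(0, 2, size=100)

X = np.array([daub4_features(img) for img in images])
X = PCA(n_components=20).fit_transform(X)           # keep the strongest components
clf = SVC(kernel="rbf")                              # radial basis kernel SVM
print(cross_val_score(clf, X, labels, cv=5).mean())  # accuracy on the synthetic data
```

Swapping `kernel="rbf"` for `kernel="linear"` reproduces the linear-kernel variant the paper compares against.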

35 citations


Journal ArticleDOI
TL;DR: This paper presents a new approach to text steganography that hides the secret message by adding extra null spaces in the plaintext of the cover file.
Abstract: Steganography is the art and science of covered or hidden writing. The purpose of steganography is covert communication, hiding the existence of a message from an intermediary. Digital steganography algorithms have been developed using texts, images, audio, and other media as the cover. In this work, a security model is proposed that imposes the concept of secrecy over privacy for text messages. In recent years, plenty of security tools have been developed to protect the transmission of multimedia objects, but approaches for the security of text messages are comparatively fewer. We present a new approach to text steganography that uses the null space in the cover message to hide the secret message. In this method, the bits of the secret message are hidden by adding extra null spaces to the plaintext of the cover file: a null space is added when the corresponding bit of the secret message is 1, and the text is left unchanged when the bit is 0.
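A small Python sketch of the hiding rule just described (add an extra space for a 1 bit, leave the gap unchanged for a 0 bit). Mapping each bit to the gap between consecutive words is our illustrative reading of the scheme, not necessarily the authors' exact encoding.

```python
def hide(cover_text: str, secret_bits: str) -> str:
    """Insert an extra space after word i when secret_bits[i] == '1'."""
    words = cover_text.split(" ")
    if len(secret_bits) > len(words) - 1:
        raise ValueError("cover text too short for the secret message")
    out = []
    for i, word in enumerate(words[:-1]):
        gap = "  " if i < len(secret_bits) and secret_bits[i] == "1" else " "
        out.append(word + gap)
    out.append(words[-1])
    return "".join(out)

def reveal(stego_text: str, n_bits: int) -> str:
    """Read a '1' wherever two spaces separate words, a '0' for a single space."""
    parts = stego_text.split(" ")  # an empty string appears wherever a double space was
    bits = []
    i = 1
    while len(bits) < n_bits and i < len(parts):
        if parts[i] == "":
            bits.append("1")
            i += 2
        else:
            bits.append("0")
            i += 1
    return "".join(bits)

stego = hide("the quick brown fox jumps over the lazy dog", "1011")
assert reveal(stego, 4) == "1011"
```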

35 citations


Journal ArticleDOI
TL;DR: In this article, the authors introduce cloud storage, covering the key technologies in cloud computing and cloud storage, management insights about cloud computing, different types of cloud services, the driving forces of cloud computing and cloud storage, and the advantages and challenges of cloud storage, and conclude by pinpointing a few challenges to be addressed by cloud storage providers.
Abstract: Enterprises are driving towards lower cost, more availability, agility, and managed risk, all of which is accelerating the move towards cloud computing. Cloud is not a particular product, but a way of delivering IT services that are consumable on demand, elastic to scale up and down as needed, and follow a pay-for-usage model. Of the three common cloud computing service models, Infrastructure as a Service (IaaS) is the model that provides servers, computing power, network bandwidth, and storage capacity as a service to subscribers. Cloud can relate to many things, but without the fundamental storage piece, provided as a service as Cloud Storage, none of the other applications is possible. This paper introduces cloud storage, covering the key technologies in cloud computing and cloud storage, management insights about cloud computing, different types of cloud services, the driving forces of cloud computing and cloud storage, and the advantages and challenges of cloud storage, and it concludes by pinpointing a few challenges to be addressed by cloud storage providers.

33 citations


Journal ArticleDOI
TL;DR: This paper mainly focuses on clustering techniques such as K-means clustering, hierarchical clustering which in turn involves agglomerative and divisive clustering Techniques.
Abstract: Machine learning is a branch of artificial intelligence which recognizes complex patterns for making intelligent decisions based on input data values. In machine learning, pattern recognition assigns input values to a given set of data labels. Based on the learning method used to generate the output, learning is classified into supervised and unsupervised learning. Unsupervised learning involves clustering and blind signal separation. Supervised learning is also known as classification. This paper mainly focuses on clustering techniques such as K-means clustering and hierarchical clustering, which in turn involves agglomerative and divisive clustering techniques. The paper introduces machine learning, pattern recognition, and clustering techniques. We present the steps involved in each of these clustering techniques along with an example and the necessary formulas used. The paper discusses the advantages and disadvantages of each of these techniques and compares K-means and hierarchical clustering. Based on these comparisons we suggest the best-suited clustering technique for the specified application.
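To make the K-means steps concrete (assign each point to its nearest centroid, then recompute centroids as cluster means), here is a minimal Python sketch; the toy data and number of clusters are illustrative choices, not from the paper.

```python
import numpy as np

def kmeans(X: np.ndarray, k: int, n_iter: int = 100, seed: int = 0):
    """Plain K-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]   # random initial centroids
    for _ in range(n_iter):
        # assignment step: nearest centroid by Euclidean distance
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # update step: move each centroid to the mean of its cluster
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

# two well-separated blobs as a toy data set
X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5])
centroids, labels = kmeans(X, k=2)
print(centroids)
```

Agglomerative hierarchical clustering, by contrast, starts from one cluster per point and repeatedly merges the two closest clusters, while divisive clustering works top-down from a single cluster.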

31 citations


Journal ArticleDOI
TL;DR: In this paper a basic digital image is analyzed and, based on this analysis, an LSB-based steganographic algorithm is designed and its performance is compared with various other steganographic tools available on the internet.
Abstract: This paper informs the reader how an innocent-looking digital image can hide a deadly terrorist plan. It analyses the strengths of image steganography and the reasons why terrorists are relying on it. It also aims to generate interest among the scientific community in image steganography so that experts from multiple disciplines may join hands to devise better steganalysis techniques against global terrorism. In this paper a basic digital image is analyzed and, based on this analysis, an LSB-based steganographic algorithm is designed. Using this algorithm, a software tool is developed and its performance is compared with various other steganographic tools available on the internet. The result of this comparison explains why steganography is so much preferred by terrorists. Various images and image-related tables used in this paper were generated using the Matlab Simulation Environment.

30 citations


Journal ArticleDOI
TL;DR: The research findings show that cloud computing is a better ICT utilization mechanism for educational institutions' teaching-learning and service delivery requirements, for it enables a wise and strategic use of technology that significantly reduces cost.
Abstract: Educational institutions throughout the world have become highly dependent on information technology for their teaching-learning, service delivery, and business requirements. Procuring and maintaining a wide range of hardware and software requires substantial, ongoing investment and the skills to support them. In the current financial crisis, and challenged by growing needs, universities face problems in providing the necessary information technology (IT) support for educational, research, and development activities. The objective of this paper is to find alternatives to the current use of IT while leading universities try to improve agility and obtain savings. The paper discusses the advantages of cloud computing for educational institutions and the limitations of current IT utilization in Ethiopian Higher Education institutions. It also discusses alternative solutions to the current IT utilization limitations in Ethiopian Higher Education Institutions. The research findings show that cloud computing is a better ICT utilization mechanism for educational institutions' teaching-learning and service delivery requirements, for it enables a wise and strategic use of technology that significantly reduces cost. Accordingly, when the proposed hybrid cloud computing model is implemented, it will make a significant contribution to the country in different respects.

Journal ArticleDOI
TL;DR: This paper proposes an algorithm that mines negative association rules by using conviction measure which does not require extra database scans, and is very convenient for associative classifiers, classifiers that build their classification model based on association rules.
Abstract: Association rule mining is a data mining task that discovers associations among items in a transactional database. Typical association rules consider only items enumerated in transactions. Such rules are referred to as positive association rules. Negative association rules also consider the same items, but in addition consider negated items (i.e. absent from transactions). Negative association rules are useful in market-basket analysis to identify products that conflict with each other or products that complement each other. They are also very convenient for associative classifiers, classifiers that build their classification model based on association rules. Many other applications would benefit from negative association rules if it was not for the expensive process to discover them. Indeed, mining for such rules necessitates the examination of an exponentially large search space. In this paper, we propose an algorithm that mines negative association rules by using conviction measure which does not require extra database scans.
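To make the conviction measure concrete, the Python sketch below computes support, confidence, and conviction conv(X→Y) = (1 − supp(Y)) / (1 − conf(X→Y)) for a positive rule, and evaluates the corresponding negative rule X→¬Y with the same measure. The transactions are made up for the example; this illustrates the measure only, not the authors' mining algorithm.

```python
from typing import FrozenSet, List

def support(itemset: FrozenSet[str], transactions: List[FrozenSet[str]]) -> float:
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(x: FrozenSet[str], y: FrozenSet[str], transactions) -> float:
    return support(x | y, transactions) / support(x, transactions)

def conviction(x: FrozenSet[str], y: FrozenSet[str], transactions) -> float:
    """conv(X -> Y) = (1 - supp(Y)) / (1 - conf(X -> Y)); infinity for exact rules."""
    c = confidence(x, y, transactions)
    return float("inf") if c == 1.0 else (1 - support(y, transactions)) / (1 - c)

transactions = [frozenset(t) for t in
                [{"coffee", "tea"}, {"coffee", "biscuit"}, {"coffee"},
                 {"tea"}, {"biscuit"}]]
coffee, tea = frozenset({"coffee"}), frozenset({"tea"})
print("conv(coffee -> tea)     =", conviction(coffee, tea, transactions))

# a negative rule considers absent items: coffee -> NOT tea
n = len(transactions)
supp_not_tea = sum("tea" not in t for t in transactions) / n
conf_neg = (sum("coffee" in t and "tea" not in t for t in transactions)
            / sum("coffee" in t for t in transactions))
conv_neg = float("inf") if conf_neg == 1.0 else (1 - supp_not_tea) / (1 - conf_neg)
print("conv(coffee -> NOT tea) =", conv_neg)
```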

Journal ArticleDOI
TL;DR: This paper proposes a Particle Swarm Optimization based Routing protocol (PSOR) in which energy efficiency is the major criterion for performing routing and deriving an optimized path for forwarding data to the base node.
Abstract: Wireless sensor networks are becoming a progressively important and challenging research area. Advancements in WSNs enable a wide range of environmental monitoring and object tracking systems. Wireless sensor networks consist of small, low-cost sensor nodes with a limited transmission range and limited processing, storage, and energy resources. We consider an energy-constrained wireless sensor network deployed over a region. The main task of such a network is to gather information from the nodes and transmit it to the base station for further processing. Generally, a fixed amount of energy is needed to receive one bit of information and an additional amount to transmit it; this additional amount depends on the transmission range. So, if all nodes transmit directly to the BS, they will quickly deplete their energy. Performing routing in a wireless sensor network under these limitations of low power, energy, and storage capability is a major problem. Many solutions have been proposed in which energy awareness is an essential consideration for routing; LEACH, PEGASIS, GROUP, ant colony optimization, etc. have provided elegant solutions and shown very effective results. In this paper, we propose a Particle Swarm Optimization based Routing protocol (PSOR) in which energy efficiency is the major criterion for performing routing and deriving an optimized path for forwarding data to the base node. PSOR generates a whole new routing path by taking energy as the fitness value to judge different paths and chooses the best optimized path, whose energy consumption is lower than that of other routing paths. We conclude with the results obtained by experimenting with the proposed PSOR algorithm and comparing them with a Genetic Algorithm; PSOR shows better results, and the experiments were performed using the NS2 simulator.
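A simplified sketch, under our own assumptions, of how particle swarm optimization can use energy as a fitness value: each particle proposes the coordinates of a relay node, and the fitness is the total transmission energy (a toy d-squared radio model) of the source-relay-sink path. This illustrates the PSO idea only, not the PSOR protocol itself.

```python
import numpy as np

rng = np.random.default_rng(1)
source, sink = np.array([0.0, 0.0]), np.array([100.0, 100.0])

def energy(relay: np.ndarray) -> float:
    """Toy first-order radio model: energy grows with the squared hop distances."""
    return np.sum((source - relay) ** 2) + np.sum((relay - sink) ** 2)

# standard PSO over 2-D relay coordinates
n_particles, n_iters = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5                       # inertia and acceleration coefficients
pos = rng.uniform(0, 100, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([energy(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_iters):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([energy(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best relay position:", gbest, "energy:", energy(gbest))  # optimum is the midpoint (50, 50)
```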

Journal ArticleDOI
TL;DR: The paper elaborates on the use of data mining techniques to help retailers identify customer profiles and behaviors for a retail store and improve customer satisfaction and retention, and it judges the accuracy of different data mining algorithms on various data sets.
Abstract: The retail industry collects vast amounts of data on sales, customer buying history, goods, and services with the ease of use of modern computing technology. This paper elaborates on the use of data mining techniques to help retailers identify customer profiles and behaviors for a retail store and improve customer satisfaction and retention. The aim is to judge the accuracy of different data mining algorithms on various data sets. The performance analysis depends on many factors, encompassing the test mode, the differing nature of the data sets, and the size of the data set. Keywords - data mining, performance, analysis, retail

Journal ArticleDOI
TL;DR: A fuzzy logic method is proposed to improve the extraction of summary sentences in automatic text summarization using the extractive method.
Abstract: Automatic text summarization is undergoing wide research and gaining importance as the availability of online information increases. Automatic text summarization compresses the larger original text into a shorter text called a summary. Abstraction and extraction are the two main methods for carrying out text summarization. Our approach uses the extractive method. Summarization by extraction involves identifying important features and extracting sentences based on their scores. Thirty documents from news-based URLs are used as input. After preprocessing the input document, eight features are used to calculate a score for each sentence. In this paper a fuzzy logic method is proposed to improve the extraction of summary sentences.
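A sketch of the fuzzy scoring idea, in our own simplified reading: each sentence's feature values are fuzzified with a triangular membership function and combined into one importance score, and the top-scoring sentences form the summary. The feature names and membership parameters below are made up for illustration and do not correspond to the paper's eight features.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def sentence_score(features: dict) -> float:
    """Fuzzify each (0..1) feature as 'high' and average the memberships."""
    high = {name: tri(value, 0.3, 1.0, 1.7) for name, value in features.items()}
    return sum(high.values()) / len(high)

# hypothetical feature values (title overlap, position, length, keyword frequency)
sentences = {
    "s1": {"title_words": 0.8, "position": 0.9, "length": 0.6, "keywords": 0.7},
    "s2": {"title_words": 0.1, "position": 0.4, "length": 0.5, "keywords": 0.2},
    "s3": {"title_words": 0.6, "position": 0.2, "length": 0.9, "keywords": 0.8},
}
ranked = sorted(sentences, key=lambda s: sentence_score(sentences[s]), reverse=True)
print("summary order:", ranked)  # pick the top-k sentences as the extractive summary
```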

Journal ArticleDOI
TL;DR: This paper presents the comparison of different classification techniques to detect and classify intrusions into normal and abnormal behaviours using WEKA tool, which consists of a collection of machine learning algorithms for Data mining tasks.
Abstract: Intrusion detection is one of the major research problems in network security. It is the process of monitoring and analyzing the events occurring in a computer system in order to detect different security violations. The aim of this paper is to classify the activities of a system into two major categories: normal and abnormal activities. In this paper we present a comparison of different classification techniques to detect and classify intrusions into normal and abnormal behaviours using the WEKA tool. WEKA is open source software which consists of a collection of machine learning algorithms for data mining tasks. The algorithms or methods tested are Naive Bayes, J48, OneR, PART, and the RBF Network algorithm. The experiments and assessments of the proposed method were performed with the NSL-KDD intrusion detection dataset. A total of 2747 rows and 42 columns of data are used to test and compare performance and accuracy among the classification methods used.
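The same style of comparison can be sketched with scikit-learn in place of WEKA (a substitution on our part, since the paper uses WEKA's Java tool); the random data below merely stands in for the NSL-KDD records, and the classifiers loosely approximate some of the WEKA methods named above (Naive Bayes, a C4.5-style tree for J48, an RBF-kernel model for the RBF network).

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# synthetic stand-in for the 2747 NSL-KDD rows (41 features plus a label column)
rng = np.random.default_rng(0)
X = rng.normal(size=(2747, 41))
y = rng.integers(0, 2, size=2747)        # 0 = normal, 1 = abnormal

classifiers = {
    "NaiveBayes": GaussianNB(),
    "Decision tree (J48 analogue)": DecisionTreeClassifier(),
    "RBF-kernel SVM (RBF-network analogue)": SVC(kernel="rbf"),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()   # 10-fold cross-validation accuracy
    print(f"{name:40s} {acc:.3f}")
```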

Journal ArticleDOI
TL;DR: This work analyzes the predictive performance of imputing missing values by comparing K-means clustering with a kNN classifier, and finds that k-NN performs better than K-means clustering in terms of accuracy.
Abstract: Missing data is a widespread problem that can affect the ability to use data to construct effective prediction systems. We analyze the predictive performance of imputing missing values by comparing K-means clustering with a kNN classifier. For the investigation, we simulate five missing-data percentages; we find that k-NN performs better than K-means clustering in terms of accuracy.
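A small sketch of such a comparison, assuming scikit-learn's KNNImputer for the k-NN side and a simple cluster-centroid fill for the K-means side; the synthetic data, 5% missing rate, and RMSE metric are our own illustrative choices rather than the paper's setup.

```python
import numpy as np
from sklearn.impute import KNNImputer, SimpleImputer
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X_true = rng.normal(size=(200, 5))
X_miss = X_true.copy()
mask = rng.random(X_true.shape) < 0.05          # 5% of entries go missing
X_miss[mask] = np.nan

# k-NN imputation: fill each gap from the k most similar rows
X_knn = KNNImputer(n_neighbors=5).fit_transform(X_miss)

# simplified K-means imputation: mean-fill, cluster, then copy values from the assigned centroid
X_mean = SimpleImputer(strategy="mean").fit_transform(X_miss)
km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X_mean)
X_km = X_mean.copy()
centroids = km.cluster_centers_[km.labels_]
X_km[mask] = centroids[mask]

for name, X_hat in [("kNN", X_knn), ("K-means", X_km)]:
    rmse = np.sqrt(np.mean((X_hat[mask] - X_true[mask]) ** 2))
    print(f"{name:8s} imputation RMSE: {rmse:.3f}")
```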

Journal ArticleDOI
TL;DR: This work presents a brief survey of different image inpainting techniques and a comparative study of these techniques.
Abstract: Inpainting is the process of reconstructing lost or deteriorated parts of images based on background information; i.e., image inpainting fills the missing or damaged region in an image using the spatial information of its neighbouring region. Inpainting algorithms have numerous applications. Inpainting is commonly used for the restoration of old films and object removal in digital photographs. It is also applied to red-eye correction, super resolution, compression, etc. The main goal of an inpainting algorithm is to modify the damaged region in an image in such a way that the inpainted region is undetectable to ordinary observers who are not familiar with the original image. Several approaches have been proposed for image inpainting. This work presents a brief survey of different image inpainting techniques and a comparative study of these techniques. We discuss different inpainting techniques such as PDE-based image inpainting, exemplar-based image inpainting, hybrid inpainting, texture-synthesis-based image inpainting, and semi-automatic and fast digital inpainting.
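For readers who want to try classical inpainting of the kind surveyed above, OpenCV ships two methods behind a single call; the sketch below (our addition, not from the paper) removes a synthetic scratch from a gradient image with the fast-marching (Telea) method, while the Navier-Stokes flag selects a PDE-based variant.

```python
import numpy as np
import cv2

# synthetic grayscale image with a gradient, damaged by a horizontal "scratch"
img = np.tile(np.arange(256, dtype=np.uint8), (256, 1))
mask = np.zeros(img.shape, dtype=np.uint8)
mask[120:125, 40:200] = 255          # non-zero pixels mark the region to repair
damaged = img.copy()
damaged[mask > 0] = 0

# args: damaged image, mask, inpaint radius, method (cv2.INPAINT_NS for the PDE variant)
restored = cv2.inpaint(damaged, mask, 3, cv2.INPAINT_TELEA)
print("mean error in repaired region:",
      np.abs(restored[mask > 0].astype(int) - img[mask > 0].astype(int)).mean())
```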

Journal ArticleDOI
TL;DR: This paper presents techniques to support horizontal aggregations through SQL queries that include CASE, SPJ and PIVOT and shows that these constructs are capable of generating data sets that can be used for further data mining operations.
Abstract: Data mining is a widely used domain for extracting trends or patterns from historical data. However, the databases used by enterprises cannot be used directly for data mining; data sets have to be prepared from real-world databases to make them suitable for particular data mining operations. Preparing data sets for analysis is a tedious task, as it involves many aggregating columns, complex joins, and SQL queries with sub-queries. Moreover, the existing aggregations performed through SQL functions such as MIN, MAX, COUNT, SUM, and AVG return a single-value output, which is not suitable for building data sets meant for data mining; these aggregate functions generate vertical aggregations. This paper presents techniques to support horizontal aggregations through SQL queries, and the result of these queries is data suitable for data mining operations. The paper achieves horizontal aggregations through constructs built on SQL queries; the methods prepared are CASE, SPJ, and PIVOT. We have developed a prototype application, and the empirical results reveal that these constructs are capable of generating data sets that can be used for further data mining operations. Index Terms - Aggregations, SQL, data mining, OLAP, and data set generation.
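A minimal illustration of the CASE-based horizontal aggregation described above, run here through Python's built-in sqlite3 module (our choice of engine); the table, columns, and product categories are made up for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (customer TEXT, product TEXT, amount REAL);
INSERT INTO sales VALUES
  ('alice', 'tv', 300), ('alice', 'radio', 50),
  ('bob',   'tv', 250), ('bob',   'tv', 400), ('carol', 'radio', 80);
""")

# vertical aggregation: one row per (customer, product) group
vertical = "SELECT customer, product, SUM(amount) FROM sales GROUP BY customer, product"

# horizontal aggregation via CASE: one row per customer and one column per product,
# which is the flat layout data mining algorithms expect
horizontal = """
SELECT customer,
       SUM(CASE WHEN product = 'tv'    THEN amount ELSE 0 END) AS tv_total,
       SUM(CASE WHEN product = 'radio' THEN amount ELSE 0 END) AS radio_total
FROM sales
GROUP BY customer
"""
for row in conn.execute(horizontal):
    print(row)   # e.g. ('alice', 300.0, 50.0)
```

The SPJ and PIVOT methods in the paper produce the same wide layout through joins of per-category projections and through the database's pivot operator, respectively.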

Journal ArticleDOI
TL;DR: The work explores the different techniques of cryptography in order to prove that the natural selection based techniques are as good as the rigorous mathematical techniques.
Abstract: Cryptography is essential for protecting information, as the importance of security increases day by day with the advent of online transaction processing and e-commerce. Public key cryptography is one of the most important types of cryptography. In public key cryptography the key has to be unique. There are two ways of producing keys: the first is mathematical, as in AES and DES, and the other is based on the theory of natural selection. The work explores the different techniques of cryptography in order to show that natural-selection-based techniques are as good as the rigorous mathematical techniques. Twelve papers and theses have been studied in order to reach the conclusion.

Journal ArticleDOI
TL;DR: Three algorithms, A Density and Distance based Cluster Head selection, An Energy Efficient Algorithm for Cluster-Head Selection in WSNs, and Consumed Energy as a Factor for Cluster Head, are analyzed and studied, and a new algorithm called EDRLEACH is proposed in this paper.
Abstract: The Cluster-head Gateway Switch Routing protocol (CGSR) uses a hierarchical network topology. CGSR organizes nodes into clusters, with coordination among the members of each cluster entrusted to a special node named the cluster head. The cluster head is selected with the help of one of the cluster head selection algorithms. Energy is the primary constraint when designing any practical wireless network, and it limits the lifetime of the network. Low-Energy Adaptive Clustering Hierarchy (LEACH) and LEACH with deterministic cluster head selection are cluster head algorithms that help optimize the power consumption of a WSN. There are various selection factors, such as density and distance, threshold-based criteria, and power efficiency. Load balancing and scalability are other factors that play an important role in the selection of the cluster head; algorithms based on load balancing reduce communication cost to a great extent. The algorithms this study focuses on are A Density and Distance based Cluster Head selection, An Energy Efficient Algorithm for Cluster-Head Selection in WSNs, and Consumed Energy as a Factor for Cluster Head. These three algorithms are analyzed and studied in this paper. The analysis of these algorithms gave birth to a new algorithm called EDRLEACH, which is proposed in this paper. Keywords - Cluster head, energy efficient algorithms, cluster head selection algorithms, wireless sensor networks.
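As background for the selection algorithms being compared, classical LEACH elects cluster heads with the threshold rule T(n) = p / (1 - p * (r mod 1/p)) for nodes that have not yet served as head in the current epoch. The sketch below is our own illustration of that baseline rule, not the proposed EDRLEACH algorithm; the node count and head probability p are arbitrary.

```python
import random

def leach_threshold(p: float, r: int) -> float:
    """T(n) = p / (1 - p * (r mod 1/p)) for nodes that have not yet been cluster head."""
    return p / (1 - p * (r % int(1 / p)))

def elect_cluster_heads(nodes, p: float, r: int, served: set) -> list:
    """Each eligible node draws a random number and becomes a head if it falls below T(n)."""
    heads = []
    t = leach_threshold(p, r)
    for n in nodes:
        if n in served:                 # already served in this epoch of 1/p rounds
            continue
        if random.random() < t:
            heads.append(n)
            served.add(n)
    return heads

nodes = list(range(100))                # 100 sensor nodes, desired 5% cluster heads per round
served = set()
for r in range(20):                     # one full epoch when p = 0.05
    heads = elect_cluster_heads(nodes, p=0.05, r=r, served=served)
    print(f"round {r:2d}: {len(heads)} cluster heads")
```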

Journal ArticleDOI
TL;DR: This research paper aims to perform a Systematic Literature Review to identify the challenges in tracking an object in an augmented reality environment and how these challenges can be overcome.
Abstract: Context– Augmented Reality (AR) is a technology through which the view of the real-world environment is augmented by computer-generated elements/objects. Tracking and registration are the key challenges in an AR system. Objective– This research paper aims to perform a Systematic Literature Review (SLR) to identify the challenges in tracking an object in an augmented reality environment and how these challenges can be overcome. Method– We use a Systematic Literature Review (SLR) for Augmented Reality Tracking Techniques (ARTT). An SLR is based on a structured protocol and is therefore different from an ordinary literature review. We have developed the SLR protocol and are in the process of implementing it. Expected Outcome– The expected outcome of the review is the identification of a list of challenges and proposed solutions for tracking objects in an augmented reality environment.

Journal ArticleDOI
TL;DR: DECORATE (Diverse Ensemble Creation by Oppositional Relabeling of Artificial Training Examples) directly constructs diverse hypotheses using additional artificially constructed training examples; it is a simple, general meta-learner that can use any strong learner as a base classifier to build diverse committees.
Abstract: Ensemble methods such as Bagging and Boosting, which combine the decisions of multiple hypotheses, are some of the strongest existing machine learning methods. The diversity of the members of an ensemble is known to be an important factor in determining its generalization error. DECORATE (Diverse Ensemble Creation by Oppositional Relabeling of Artificial Training Examples) directly constructs diverse hypotheses using additional artificially constructed training examples. The technique is a simple, general meta-learner that can use any strong learner as a base classifier to build diverse committees. The diverse ensembles produced by DECORATE are very effective in reducing the amount of supervision required for building accurate models. DECORATE ensembles can also be used to reduce supervision through active learning, in which the learner selects the most informative examples from a pool of unlabeled examples such that acquiring their labels will increase the accuracy of the classifier.

Journal ArticleDOI
TL;DR: The study of the literature shows that privacy protection in the cloud is still immature and that security remains a critical challenge in the cloud computing paradigm.
Abstract: Cloud computing is a technique for delivering software, storage, and processing. It increases a system's capability without changing the existing infrastructure, training new people, or licensing new software. It improves existing software capabilities and extends Information Technology resources. In recent years, cloud computing has grown rapidly and boosted business in the IT industry. Despite all the achievements in cloud computing, security is still a critical challenge in the cloud computing paradigm. These challenges include the loss, leakage, and disclosure of users' secret data (such as health and financial data). We have studied the literature and discussed various models in cloud computing; this shows that privacy protection in the cloud is still immature.

Journal ArticleDOI
TL;DR: This paper reviews clustering methods with an example for each class, and provides a comparative statement based on constraints such as data type, cluster shape, complexity, data set, measure, advantages, and disadvantages.
Abstract: Clustering is the assignment of data objects (records) into groups (called clusters) so that data objects from the same cluster are more similar to each other than objects from different clusters. Clustering techniques have been discussed extensively in similarity search, segmentation, statistics, machine learning, trend analysis, pattern recognition, and classification. Clustering methods can be classified into (i) partition methods, (ii) hierarchical methods, (iii) density-based methods, (iv) grid-based methods, and (v) model-based methods. In this paper, I review clustering methods with an example for each class. I also provide a comparative statement based on constraints such as data type, cluster shape, complexity, data set, measure, advantages, and disadvantages. Keywords: clustering, partition, hierarchical, density, grid, model

Journal ArticleDOI
TL;DR: A new variant of the digital signature algorithm is presented, based on a linear block cipher (Hill cipher) initiated with an asymmetric algorithm using mod 37.
Abstract: The digital signature technique is essential for secure transactions over open networks. It is used in a variety of applications to ensure the integrity of data exchanged or stored and to prove the originator's identity to the recipient. Digital signature schemes are mostly used in cryptographic protocols to provide services like entity authentication, authenticated key transport, and authenticated key agreement. This architecture relies on a secure hash function and a cryptographic algorithm. There are many other algorithms based on hybrid combinations of prime factorization and discrete logarithms, but various weaknesses and attacks have been developed against those algorithms. This research paper presents a new variant of the digital signature algorithm based on a linear block cipher (Hill cipher) initiated with an asymmetric algorithm using mod 37.
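To make the linear-block-cipher building block concrete, here is a small Hill-cipher-style sketch over the integers mod 37 (26 letters, 10 digits, and a space mapped to 0..36). The 2x2 key matrix and alphabet mapping are our own illustrative choices; this is not the signature scheme proposed in the paper, only the underlying Hill cipher arithmetic.

```python
import numpy as np

MOD = 37
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789 "   # 37 symbols -> residues 0..36

def to_nums(text):  return [ALPHABET.index(c) for c in text.upper()]
def to_text(nums):  return "".join(ALPHABET[n % MOD] for n in nums)

def hill(block_nums, key):
    """Multiply each 2-symbol block by the key matrix modulo 37."""
    out = []
    for i in range(0, len(block_nums), 2):
        v = np.array(block_nums[i:i + 2])
        out.extend((key @ v) % MOD)
    return out

key = np.array([[3, 3], [2, 5]])                      # det = 9, invertible mod 37
det = int(round(np.linalg.det(key)))
key_inv = (pow(det, -1, MOD) * np.array([[key[1, 1], -key[0, 1]],
                                         [-key[1, 0], key[0, 0]]])) % MOD

msg = "SIGN ME 2012"                                  # even length; pad if necessary
cipher = hill(to_nums(msg), key)
plain = hill(cipher, key_inv)
print(to_text(cipher))
print(to_text(plain))                                 # recovers "SIGN ME 2012"
```

Because 37 is prime, any key matrix with a non-zero determinant mod 37 is invertible, which is one practical attraction of working modulo 37 rather than 26.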

Journal ArticleDOI
TL;DR: Strong AI claims that in the near future we will be surrounded by machines that can work completely like human beings and that machines could have human-level intelligence.
Abstract: Artificial intelligence (AI) is the study of how to make computers do things which, at the moment, people do better. Strong AI claims that in the near future we will be surrounded by machines that can work completely like human beings and that machines could have human-level intelligence. One intention of this article is to excite a broader AI audience about abstract algorithmic information theory concepts, and conversely to inform theorists about exciting applications to AI. The science of Artificial Intelligence (AI) might be defined as the construction of intelligent systems and their analysis.

Journal ArticleDOI
TL;DR: A BSL finger spelling and alphabet gesture recognition system was designed with an Artificial Neural Network and constructed to translate the BSL alphabet into the corresponding printed Bangla letters.
Abstract: This paper presents a system for recognizing static hand gestures of the alphabet in Bangla Sign Language (BSL). A BSL finger spelling and alphabet gesture recognition system was designed with an Artificial Neural Network (ANN) and constructed in order to translate the BSL alphabet into the corresponding printed Bangla letters. The proposed ANN is trained on features of the sign alphabet using the feed-forward back-propagation learning algorithm. The logarithmic sigmoid (logsig) function is chosen as the transfer function. This ANN model demonstrated good recognition performance in terms of the mean square error values for this training function. The recognition system does not use any gloves or visual marking systems; it only requires images of the bare hand. The simulation results show that this system is able to recognize 36 selected letters of the BSL alphabet with an average accuracy of 80.902%.

Journal ArticleDOI
TL;DR: An iterative decoding algorithm called the Message Passing Algorithm, which operates on a factor graph, is discussed, and the marginal functions associated with the global function of the variables are computed.
Abstract: This tutorial paper reviews the basics of error correcting codes such as linear block codes and LDPC codes. Error correcting codes, also known as channel codes, make it possible to recover the original message from a message that has been corrupted by a noisy channel. These block codes can be represented graphically by factor graphs. We mention the link between factor graphs, graphical models such as Bayesian networks, channel coding, and compressive sensing. In this paper, we discuss an iterative decoding algorithm called the Message Passing Algorithm that operates on a factor graph and computes the marginal functions associated with the global function of the variables. This global function is factorized into many simple local functions which are defined by the parity check matrix of the code. We also discuss the role of the Message Passing Algorithm in compressive sensing reconstruction of sparse signals.
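A compact illustration of sum-product message passing on the factor graph of a small code, with the (7,4) Hamming code standing in for a larger LDPC code; the channel LLR magnitude, the single-bit error, and the iteration count are illustrative assumptions, and the dense loops are written for readability rather than speed.

```python
import numpy as np

# parity-check matrix of the (7,4) Hamming code: each row is one check (factor) node
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def sum_product_decode(llr_channel, H, n_iter=20):
    """Sum-product message passing on the factor graph defined by H (LLR domain)."""
    m, n = H.shape
    msg_v2c = np.tile(llr_channel, (m, 1)) * H        # variable -> check messages on the edges
    for _ in range(n_iter):
        # check -> variable: tanh rule over the *other* incoming messages of each check
        msg_c2v = np.zeros_like(msg_v2c, dtype=float)
        for i in range(m):
            for j in np.nonzero(H[i])[0]:
                others = [k for k in np.nonzero(H[i])[0] if k != j]
                prod = np.prod(np.tanh(msg_v2c[i, others] / 2))
                msg_c2v[i, j] = 2 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        # variable -> check: channel LLR plus the other checks' messages
        total = llr_channel + msg_c2v.sum(axis=0)
        for i in range(m):
            for j in np.nonzero(H[i])[0]:
                msg_v2c[i, j] = total[j] - msg_c2v[i, j]
        decoded = (total < 0).astype(int)             # posterior hard decision per bit
        if not np.any((H @ decoded) % 2):             # stop once all parity checks are satisfied
            break
    return decoded

codeword = np.zeros(7, dtype=int)                     # the all-zero codeword
received = codeword.copy()
received[2] ^= 1                                      # flip one bit (channel error)
llr = (1 - 2 * received) * 4.0                        # BSC-style LLRs: +4 for a 0, -4 for a 1
print(sum_product_decode(llr, H))                     # recovers the all-zero codeword
```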

Journal ArticleDOI
TL;DR: This paper identifies the possible security attacks on clouds including: Denial of Service attack, Authentication attack, Man-in-the Middle attack, Wrapping attacks, Malware-Injection attacks, Flooding attacks, Browser attacks, and also Accountability checking problems.
Abstract: Cloud computing security (sometimes referred to simply as "cloud security") is an evolving sub-domain of computer security, network security and, more broadly, information security. It refers to a broad set of policies, technologies, and controls deployed to protect data, applications, and the associated infrastructure of cloud computing. Cloud security is not to be confused with security software offerings that are "cloud-based" (a.k.a. security-as-a-service). There are a number of security issues/concerns associated with cloud computing, but these issues fall into two broad categories: security issues faced by cloud providers (organizations providing software, platform, or infrastructure as a service via the cloud) and security issues faced by their customers. In most cases, the provider must ensure that their infrastructure is secure and that their clients' data and applications are protected, while the customer must ensure that the provider has taken the proper security measures to protect their information. The extensive use of virtualization in implementing cloud infrastructure brings unique security concerns for customers or tenants of a public cloud service. Virtualization alters the relationship between the OS and the underlying hardware, be it computing, storage or even networking. This introduces an additional layer, virtualization, that itself must be properly configured, managed, and secured. Specific concerns include the potential to compromise the virtualization software, or "hypervisor". While these concerns are largely theoretical, they do exist. Cloud computing offers great potential to improve productivity and reduce costs, but at the same time it carries many security risks. In this paper we identify the possible security attacks on clouds, including: Denial of Service attacks, Authentication attacks, Man-in-the-Middle attacks, Wrapping attacks, Malware-Injection attacks, Flooding attacks, Browser attacks, and also Accountability checking problems. We identify the root causes of these attacks and propose specific solutions. The authors discuss security issues for cloud computing and present a layered framework for secure clouds, then focus on two of the layers, i.e., the storage layer and the data layer. In particular, the authors discuss a scheme for secure third-party publication of documents in a cloud. Next, the paper discusses secure federated query processing with MapReduce and Hadoop, and the use of secure co-processors for cloud computing. Finally, the authors discuss an XACML implementation for Hadoop and their belief that building trusted applications from untrusted components will be a major aspect of secure cloud computing. Keywords: Accountability, Authentication attacks, Computer security, Cloud Computing, Flooding attacks, Hypervisor, Virtualization, Wrapping attacks, Browser attacks, secure clouds.