Showing papers in "International Journal of Computer Applications" in 2015
TL;DR: This paper analyzes the security issues and challenges of IoT and proposes a well-defined security architecture that preserves the confidentiality and privacy of users, which could result in its wider adoption by the masses.
Abstract: The Internet of Things (IoT) has been a major research topic for almost a decade now, in which physical objects would be interconnected as a result of the convergence of various existing technologies. IoT is developing rapidly; however, there are uncertainties about its security and privacy that could affect its sustainable development. This paper analyzes the security issues and challenges and provides a well-defined security architecture that preserves the confidentiality and privacy of users, which could result in its wider adoption by the masses.
316 citations
TL;DR: A survey is presented that covers the problem of sentiment analysis and the techniques and methods used for it; the major challenge lies in analyzing sentiments and identifying the emotions expressed in text.
Abstract: A huge amount of online information and rich web resources is highly unstructured, and such natural language cannot be processed by machines directly. The increased demand to capture the opinions of the general public about social events, campaigns and product sales has led to the study of opinion mining and sentiment analysis. Opinion mining refers to the extraction of lines in raw data that express an opinion. Sentiment analysis identifies the polarity of the extracted opinions. The major challenge lies in analyzing sentiments and identifying the emotions expressed in text. This paper presents a survey that covers the problem of sentiment analysis and the techniques and methods used for it.
167 citations
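As a minimal illustration of the polarity-classification step such surveys cover, a tiny lexicon-based scorer might look as follows; the word lists and labels are invented for the example, and the surveyed methods range from lexicon lookups like this to supervised classifiers:

```python
# Minimal lexicon-based polarity scorer (illustrative only).
POSITIVE = {"good", "great", "excellent", "love", "happy"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "sad"}

def polarity(text: str) -> str:
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(polarity("The campaign was great and people love it"))  # positive
```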
TL;DR: A comprehensive overview of the IoT scenario is provided, its enabling technologies and sensor networks are reviewed, and a six-layered architecture of IoT is described along with the related key challenges.
Abstract: The Internet, a revolutionary invention, is always transforming into new kinds of hardware and software, making it indispensable for everyone. The forms of communication that we see now are either human-human or human-device, but the Internet of Things (IoT) promises a great future for the internet in which the communication is machine-to-machine (M2M). This paper aims to provide a comprehensive overview of the IoT scenario and reviews its enabling technologies and sensor networks. It also describes a six-layered architecture of IoT and points out the related key challenges.
161 citations
TL;DR: The different classification methods and classifiers that can be used to classify initially uncategorized observations are compared to demonstrate their respective accuracies and usefulness.
Abstract: This paper focuses on the various techniques that can be implemented for the classification of observations that are initially uncategorized. Our objective is to compare the different classification methods and classifiers that can be used for this purpose. In this paper, we study and demonstrate the different accuracies and usefulness of classifiers and the circumstances in which they should be implemented. General Terms: Classification, Sentiment, Review, Accuracy, Positive, Negative, Neutral.
152 citations
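A sketch of the kind of comparison the paper describes, using cross-validated accuracy to rank several off-the-shelf classifiers; scikit-learn, the particular classifiers and the dataset are assumptions here, not the paper's experimental setup:

```python
# Compare classifiers by 5-fold cross-validated accuracy on a toy dataset.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
models = {
    "NaiveBayes": GaussianNB(),
    "SVM": SVC(),
    "DecisionTree": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean accuracy = {acc:.3f}")
```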
TL;DR: In this paper, a survey of various techniques available in text mining for keyword and keyphrase extraction is presented.
Abstract: In this paper, we present a survey of various techniques available in text mining for keyword and keyphrase extraction.
151 citations
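One technique commonly covered by such surveys is ranking candidate terms by TF-IDF weight; a minimal sketch is shown below, where the documents are invented and the use of scikit-learn is an assumption, not something stated in the paper:

```python
# Rank each document's terms by TF-IDF weight and keep the top three.
from sklearn.feature_extraction.text import TfidfVectorizer

docs = [
    "Text mining extracts useful patterns from unstructured text.",
    "Keyword extraction selects the most informative terms of a document.",
]
vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(docs)
terms = vec.get_feature_names_out()
for i, doc in enumerate(docs):
    row = tfidf[i].toarray().ravel()
    top = row.argsort()[::-1][:3]          # indices of the highest-weighted terms
    print(doc, "->", [terms[j] for j in top])
```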
TL;DR: The paper gives an overview of the different sentiment classification approaches and tools used for sentiment analysis and provides a classification of approaches with respect to features/techniques and advantages/limitations.
Abstract: The paper gives an overview of the different sentiment classification approaches and tools used for sentiment analysis. Starting from this overview, the paper provides a classification of (i) approaches with respect to features/techniques and advantages/limitations and (ii) tools with respect to the different techniques used for sentiment analysis. Different fields of application of sentiment analysis, such as business, politics, public actions and finance, are also discussed in the paper.
145 citations
TL;DR: The architecture and basic learning process underlying ANFIS (adaptive-network-based fuzzy inference system) is presented; ANFIS is a fuzzy inference system implemented in the framework of adaptive networks.
Abstract: In this paper, we present the architecture and basic learning process underlying ANFIS (adaptive-network-based fuzzy inference system), a fuzzy inference system implemented in the framework of adaptive networks. Soft-computing approaches, including artificial neural networks and fuzzy inference, have been widely used to model expert behavior. Using given input/output data values, the proposed ANFIS can construct a mapping based on both human knowledge (in the form of fuzzy if-then rules) and a hybrid learning algorithm. In modeling and simulation, the ANFIS strategy is employed to model nonlinear functions, to control one of the most important parameters of the induction machine, and to predict a chaotic time series, all yielding more effective and faster responses or settling times.
142 citations
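For reference, the standard first-order Sugeno formulation on which ANFIS rests (generic background, not a result specific to this paper) expresses each rule and the network output as:

```latex
\text{Rule } i:\quad \text{if } x \text{ is } A_i \text{ and } y \text{ is } B_i
\text{ then } f_i = p_i x + q_i y + r_i,
\qquad
w_i = \mu_{A_i}(x)\,\mu_{B_i}(y),
\qquad
\bar{w}_i = \frac{w_i}{\sum_j w_j},
\qquad
f = \sum_i \bar{w}_i f_i .
```

The membership functions (premise parameters) and the consequent parameters p_i, q_i, r_i are the quantities tuned by the hybrid learning algorithm.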
TL;DR: The research shows that the complexity of SVM (LibSVM) is O(n^3) and that the C++ implementation is faster than the Java one, in both training and testing; in addition, data growth increases the computation time.
Abstract: Support Vector Machines (SVM) is one of the machine learning methods that can be used to perform classification tasks. Many researchers use an SVM library to accelerate their work; using such a library saves time and avoids writing code from scratch. LibSVM is one SVM library that has been widely used by researchers to solve their problems, and it is also integrated into WEKA, one of the popular data mining tools. This article contains the results of our work on the complexity analysis of Support Vector Machines, focusing on the SVM algorithm and its implementation in LibSVM. We also use two popular programming languages, C++ and Java, with three different datasets to test our analysis and experiments. The results prove that the complexity of SVM (LibSVM) is O(n^3), and the timing results show that C++ is faster than Java, in both training and testing; moreover, data growth increases the computation time.
130 citations
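A rough timing sketch that illustrates the reported super-linear growth of LibSVM training time with sample count n; scikit-learn's SVC wraps LibSVM and serves as a stand-in here, so this is not the paper's C++/Java benchmark, and the dataset sizes are arbitrary:

```python
# Measure SVC (LibSVM-backed) training time as the sample count doubles.
import time
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
for n in (1000, 2000, 4000):
    X = rng.normal(size=(n, 20))
    y = (X[:, 0] + 0.5 * rng.normal(size=n) > 0).astype(int)
    t0 = time.perf_counter()
    SVC(kernel="rbf").fit(X, y)
    print(f"n={n}: train time {time.perf_counter() - t0:.2f}s")
```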
TL;DR: This work uses C5.0 as the base classifier, so the proposed system classifies the result set with high accuracy and low memory usage; the overfitting problem of the decision tree is solved using the reduced-error pruning technique.
Abstract: Data mining is a knowledge discovery process that analyzes data and generates useful patterns from it. Classification is the technique that uses pre-classified examples to classify the required results. Decision trees are used to model the classification process: using the feature values of instances, decision trees classify those instances, and each node in a decision tree represents a feature of an instance to be classified. In this research work, ID3, C4.5 and C5.0 are compared with each other; among these classifiers, C5.0 gives the most accurate and efficient results. This work uses C5.0 as the base classifier, so the proposed system classifies the result set with high accuracy and low memory usage. The classification process generates fewer rules compared to other techniques, so the proposed system has low memory usage; the error rate is low, so accuracy on the result set is high, and a pruned tree is generated, so the system produces results faster than other techniques. The proposed system uses the C5.0 classifier, which performs feature selection and reduced-error pruning, both described in this paper. The feature-selection technique assumes that the data contain many redundant features, removes the features that provide no useful information in any context, and selects the relevant features that are useful in model construction. The cross-validation method gives a more reliable estimate of predictive performance. The overfitting problem of the decision tree is solved using the reduced-error pruning technique. The proposed system achieves a 1 to 3% gain in accuracy, a reduced error rate, and a decision tree constructed in less time.
126 citations
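As an illustration of the pruning idea only, the sketch below uses scikit-learn's CART tree with cost-complexity pruning as an open-source stand-in; C5.0 and reduced-error pruning themselves are not available in scikit-learn, so this shows the general effect of pruning on tree size and accuracy, not the paper's system:

```python
# Compare an unpruned tree with a cost-complexity-pruned tree.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

unpruned = DecisionTreeClassifier(random_state=0).fit(Xtr, ytr)
pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=0.01).fit(Xtr, ytr)
print("unpruned:", unpruned.tree_.node_count, "nodes, acc", unpruned.score(Xte, yte))
print("pruned:  ", pruned.tree_.node_count, "nodes, acc", pruned.score(Xte, yte))
```

A smaller pruned tree typically generalizes as well or better while using less memory, which is the trade-off the abstract describes.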
TL;DR: The proposed system is a novel intrusion detection system for the IoT that is capable of detecting a wormhole attack and the attacker; it will help in securing the IoT network and may prevent such attacks.
Abstract: There are currently more objects connected to the Internet than people in the world, and this gap will continue to grow as more objects gain the ability to directly interface with the Internet. Providing security in IoT is challenging because the devices are resource constrained, the communication links are lossy, and the devices use a set of novel IoT technologies such as RPL and 6LoWPAN. This makes IoT networks easy to attack. The proposed system is a novel intrusion detection system for the IoT that is capable of detecting a wormhole attack and the attacker. The proposed method uses node location information and neighbor information to identify the wormhole attack, and received signal strength to identify attacker nodes. Such a system will help in securing the IoT network and may prevent such attacks. The method is very energy efficient and takes only a fixed number of UDP packets for attack detection, which makes it beneficial for resource-constrained environments.
117 citations
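A toy sketch of the location-based consistency check described above: if two nodes claim to be neighbors but their advertised positions are farther apart than the radio range, the link is flagged as a possible wormhole tunnel. Node names, coordinates and the range are invented for illustration and are not taken from the paper:

```python
# Flag neighbor links whose advertised distance exceeds the radio range.
import math

RADIO_RANGE = 30.0  # metres, assumed for the example

positions = {"A": (0, 0), "B": (20, 5), "C": (200, 150)}
neighbor_claims = [("A", "B"), ("A", "C")]   # C claims to be A's neighbor

for u, v in neighbor_claims:
    (x1, y1), (x2, y2) = positions[u], positions[v]
    d = math.hypot(x2 - x1, y2 - y1)
    if d > RADIO_RANGE:
        print(f"link {u}-{v}: distance {d:.1f} m exceeds range -> possible wormhole")
    else:
        print(f"link {u}-{v}: consistent")
```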
TL;DR: Hadoop MapReduce and the recently introduced Apache Spark, both of which provide a processing model for analyzing big data, are compared; their performance varies significantly based on the use case under implementation.
Abstract: Data has long been a topic of fascination for computer science enthusiasts around the world, and it has gained even more prominence in recent times with the continuous explosion of data resulting from the likes of social media and the quest of tech giants to gain deeper analysis of their data. This paper compares Hadoop MapReduce and the recently introduced Apache Spark, both of which provide a processing model for analyzing big data. Although both options are based on the concept of Big Data, their performance varies significantly based on the use case under implementation. This is what makes these two options worthy of analysis with respect to their variability and variety in the dynamic field of Big Data. In this paper, we compare these two frameworks and provide a performance analysis using a standard machine learning algorithm for clustering (K-Means).
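A minimal PySpark K-Means sketch of the kind of workload benchmarked; the data, column names and parameters are invented, and the paper's actual datasets and cluster configuration are not reproduced here:

```python
# Cluster a tiny two-column DataFrame with Spark MLlib's K-Means.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("kmeans-demo").getOrCreate()
df = spark.createDataFrame(
    [(1.0, 1.1), (0.9, 1.0), (8.0, 8.2), (7.9, 8.1)], ["x", "y"]
)
features = VectorAssembler(inputCols=["x", "y"], outputCol="features").transform(df)
model = KMeans(k=2, seed=1, featuresCol="features").fit(features)
print(model.clusterCenters())
spark.stop()
```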
TL;DR: This paper addresses energy-efficient design for uplink multiuser SIMO frameworks with imperfect channel state data (CSD) at the base station (BS) and proposes a non-cooperative energy-efficiency uplink power control game in which every user selfishly updates its own uplink power.
Abstract: This paper addresses energy-efficient design for uplink multiuser SIMO frameworks with imperfect channel state data (CSD) at the base station (BS). Since the CSD at the BS is always unreliable because of channel estimation error and delay, the imperfectness of the CSD needs to be considered in practical framework design. It causes inter-user interference at the zero-forcing (ZF) receiver and makes it hard to obtain the globally optimal power allocation that maximizes the energy efficiency (EE). Consequently, we propose a non-cooperative energy-efficiency uplink power control game in which every user selfishly updates its own uplink power. The proposed framework is used to examine the performance of a large-scale MU-MIMO system by varying the number of BS receivers and users, and to identify the effect on capacity, spectral efficiency, sum rate, energy efficiency, and so on. The proposed work designs and analyzes an efficient procedure/system for improving energy efficiency, throughput, and so on.
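For context, energy efficiency is conventionally defined as the achievable sum rate per unit of consumed power (bits/Joule); whether the paper uses exactly this formulation is an assumption:

```latex
\mathrm{EE} \;=\; \frac{\sum_{k} R_k}{P_{\mathrm{c}} + \sum_{k} p_k}
\quad \text{[bits/Joule]},
```

where R_k is the achievable rate of user k, p_k is its transmit power, and P_c is the fixed circuit power.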
TL;DR: This document provides insights on the challenges of managing such huge data, popularly known as Big Data, the solutions offered by Big Data management tools and techniques, and the opportunities it has created.
Abstract: In today's world, every tiny gadget is a potential data source, adding to the huge data bank. Every day, we create 2.5 quintillion bytes of data, structured and unstructured, so much that 90% of the data in the world today has been created in the last two years alone. This data, generated through large customer transactions and social networking sites, is varied, voluminous and rapidly growing. All this data poses a storage and processing crisis for enterprises. While more data enables realistic analysis and thus helps in making accurate business decisions and goals, it is equally difficult to manage and analyze such a huge amount of data. This document provides insights on the challenges of managing such huge data, popularly known as Big Data, the solutions offered by Big Data management tools and techniques, and the opportunities it has created. General Terms: Big Data, Big Data Opportunities, Big Data Challenges
TL;DR: An overview of recommender systems is provided, covering collaborative filtering, content-based filtering and the hybrid approach to recommender systems.
Abstract: Recommender systems or recommendation systems are a subset of information filtering systems used to anticipate the 'evaluation' or 'preference' that a user would give to an item. In recent years, e-commerce applications have widely used recommender systems; the most popular e-commerce domains are probably music, news, books, research articles, and products. Recommender systems are also available for business experts, jokes, restaurants, financial services, life insurance and Twitter followers. Recommender systems have evolved in parallel with the web. Initially, recommender systems were based on demographic, content-based filtering and collaborative filtering. Currently, these systems incorporate social information to enhance the quality of the recommendation process. To improve the recommendation process in the future, recommender systems will use personal, implicit and local information from the Internet. This paper provides an overview of recommender systems covering collaborative filtering, content-based filtering and the hybrid approach to recommender systems.
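A short sketch of user-based collaborative filtering, one of the approaches surveyed: a missing rating is predicted as a similarity-weighted average of other users' ratings. The ratings matrix below is invented for illustration:

```python
# Predict a missing rating with cosine-similarity-weighted averaging.
import numpy as np

# rows = users, cols = items, 0 = unrated
R = np.array([
    [5, 3, 0, 1],
    [4, 0, 0, 1],
    [1, 1, 0, 5],
    [1, 0, 4, 4],
], dtype=float)

def cosine(a, b):
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

target_user, target_item = 1, 2
sims = np.array([cosine(R[target_user], R[u]) for u in range(len(R))])
rated = R[:, target_item] > 0                       # users who rated the item
pred = np.dot(sims[rated], R[rated, target_item]) / (sims[rated].sum() + 1e-9)
print(f"predicted rating for user {target_user}, item {target_item}: {pred:.2f}")
```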
TL;DR: The current research work is a study of satellite image classification methods and techniques and compares various researchers' comparative results on satellite image classification methods.
Abstract: The satellite image classification process involves grouping image pixel values into meaningful categories. Several satellite image classification methods and techniques are available, and they can be broadly classified into three categories: 1) automatic, 2) manual and 3) hybrid. All three methods have their own advantages and disadvantages, and the majority of satellite image classification methods fall into the first category. Satellite image classification requires the selection of an appropriate classification method based on the requirements. The current research work is a study of satellite image classification methods and techniques; it also compares various researchers' comparative results on satellite image classification methods.
TL;DR: A brief review and future prospect of the vast applications of machine learning has been made.
Abstract: Machine learning is one of the most exciting recent technologies in Artificial Intelligence. Learning algorithms are used in many applications that we make use of daily. Every time a web search engine like Google or Bing is used to search the internet, one of the reasons it works so well is that a learning algorithm, implemented by Google or Microsoft, has learned how to rank web pages. Every time Facebook is used and it recognizes friends' photos, that is also machine learning. Email spam filters save the user from having to wade through tons of spam; that is also a learning algorithm. In this paper, a brief review and future prospects of the vast applications of machine learning are presented.
TL;DR: An algorithm using Artificial Neural Network for fault detection which will overcome the gaps of previously implemented algorithms and provide a fault tolerant model is proposed.
Abstract: With the immense growth of the internet and its users, cloud computing, with its incredible possibilities in ease of use, quality of service and on-demand services, has become a promising computing platform for both business and non-business computation customers. It is an adoptable technology as it provides integration of software and resources that are dynamically scalable. The dynamic environment of the cloud results in various unexpected faults and failures. The ability of a system to react gracefully to an unexpected hardware or software malfunction is known as fault tolerance. In order to achieve robustness and dependability in cloud computing, failures should be assessed and handled effectively. Various fault detection methods and architectural models have been proposed to increase the fault tolerance ability of the cloud. The objective of this paper is to propose an algorithm using an Artificial Neural Network for fault detection that overcomes the gaps of previously implemented algorithms and provides a fault-tolerant model.
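Purely as an illustration of the idea, and not the paper's actual model, a small feed-forward network can be trained to flag a node as faulty from monitoring metrics; the features, the fault rule and the data below are synthetic:

```python
# Train a small MLP to label nodes as faulty from synthetic monitoring metrics.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.uniform(0, 100, n),   # CPU utilisation (%)
    rng.uniform(0, 100, n),   # memory utilisation (%)
    rng.exponential(50, n),   # response time (ms)
])
y = ((X[:, 0] > 90) | (X[:, 2] > 200)).astype(int)   # synthetic "fault" label

Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
clf.fit(Xtr, ytr)
print("fault-detection accuracy:", clf.score(Xte, yte))
```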
TL;DR: This paper discusses the various localization algorithms in WSNs with their applicable areas, requirements and limitations, and in conclusion compares these localization algorithms and analyzes future research directions.
Abstract: Wireless sensor networks (WSNs) have recently emerged as a promising technology in the wireless communication field and have gained special attention from research groups. A WSN uses small and cheap devices with low energy requirements and limited on-board computing resources, which communicate with each other or with base stations without any pre-defined infrastructure. The property of being infrastructure-less makes it suitable for distinctive application scenarios including remote monitoring, disaster management, military applications and biomedical health-monitoring devices. In many of these applications, node localization is unavoidably one of the important system parameters; for example, in target tracking, if the nodes are not able to obtain accurate location information, the related task cannot be performed. It is also helpful in routing, network coverage and query management of sensors. In general, localization techniques are divided into two broad classes: range-based and range-free. In this paper, we discuss the various localization algorithms with their applicable areas, requirements and limitations. Moreover, in conclusion we compare these localization algorithms and analyze future research directions for localization algorithms in WSNs.
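A sketch of the range-based side of this taxonomy: linearized trilateration, where an unknown node position is recovered by least squares from anchor positions and estimated distances (e.g. from RSSI or time of arrival). The coordinates and distances below are invented for the example:

```python
# Linearized trilateration: subtract the last range equation from the others
# and solve the resulting linear system for the unknown (x, y).
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
dists = np.array([5.0, np.sqrt(65.0), np.sqrt(45.0)])   # true node at (3, 4)

ref, d_ref = anchors[-1], dists[-1]
A = 2.0 * (ref - anchors[:-1])
b = (dists[:-1] ** 2 - d_ref ** 2
     - (anchors[:-1] ** 2).sum(axis=1) + (ref ** 2).sum())
pos, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", pos)   # approximately [3. 4.]
```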
TL;DR: This paper seeks to highlight the concept of Internet of Things (IoT) in general, as well as reviewing the main challenges of the IoT environment by focusing on the recent research directions in this topic.
Abstract: In this paper, we seek to highlight the concept of the Internet of Things (IoT) in general, as well as reviewing the main challenges of the IoT environment by focusing on the recent research directions on this topic. Recently, IoT has emerged as a new technology used to express a modern wireless telecommunication network; it can be defined as intelligent and interoperable nodes interconnected in a dynamic global infrastructure network, and it seeks to implement the connectivity concept of anything from anywhere at any time. Indeed, the IoT environment poses a large spectrum of challenges that have a broad impact on its performance, which can be divided into two categories: i) general challenges, such as communication, heterogeneity, virtualization and security; and ii) unique challenges, such as wireless sensor networks (WSN), Radio Frequency Identification (RFID), and finally Quality of Service (QoS), which is considered a common factor between both general and special challenges. In addition, this paper highlights the main applications of the IoT.
TL;DR: This paper is an honest attempt to collectively discuss all possible algorithms along with quality metrics, following two assessment procedures, i.e. at full and reduced scale resolutions, to evaluate the performance of these algorithms.
Abstract: Major technical constraints, such as minimal data storage on the satellite platform in space and limited bandwidth for communication with the earth station, prevent satellite sensors from capturing images with both high spatial and high spectral resolution simultaneously. To overcome this limitation, image fusion has proved to be a potential tool in remote sensing applications; it integrates information from combinations of panchromatic, multispectral or hyperspectral images and is intended to produce a composite image having both higher spatial and higher spectral resolution. Research in this area dates back a few decades, but the diverse approaches proposed so far by different researchers have rarely been discussed in one place. This paper is an honest attempt to collectively discuss all possible algorithms along with quality metrics, following two assessment procedures, i.e. at full and reduced scale resolutions, to evaluate the performance of these algorithms.
TL;DR: The design of an omnidirectional universal mobile platform for Omni-directional Robots and its implementation at the National University of Singapore, 2005.
Abstract: O. Diegel, A. Badve, G. Bright, J. Potgieter, and S. Tlale, "Improved Mecanum Wheel Design for Omni-directional Robots," no. November, pp. 27–29, 2002. I. Doroftei, V. Grosu, and V. Spinu, Omnidirectional Mobile Robot Design and Implementation, Bioinspiration and Robotics Walking and Climbing Robots, no. September. I-Tech, 2007. R. P. A. van Haendel, "Design of an omnidirectional universal mobile platform," National University of Singapore, 2005.
TL;DR: An open-source approach is presented in which Twitter microblog data has been collected, pre-processed, analyzed and visualized using open-source tools to perform text mining and sentiment analysis of user-contributed online reviews about two giant retail stores in the UK, namely Tesco and Asda, over the Christmas period of 2014.
Abstract: Social media has arisen not only as a personal communication medium, but also as a medium for users to communicate opinions about products and services or even political and general events. Due to its widespread use and popularity, a massive amount of user reviews and opinions is produced and shared daily. Twitter is one of the most widely used social media microblogging sites. Mining user opinions from social media data is not a straightforward task; it can be accomplished in different ways. In this work, an open-source approach is presented in which Twitter microblog data has been collected, pre-processed, analyzed and visualized using open-source tools to perform text mining and sentiment analysis of user-contributed online reviews about two giant retail stores in the UK, namely Tesco and Asda, over the Christmas period of 2014. Collecting customer opinions can be an expensive and time-consuming task using conventional methods such as surveys. Sentiment analysis of customer opinions makes it easier for businesses to understand their competitive value in a changing market and to understand their customers' views about their products and services, which also provides insight into future marketing strategies and decision-making policies.
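One possible open-source step in such a pipeline is scoring tweet polarity with NLTK's VADER analyzer; the paper's exact toolchain is not restated here, and the example tweets are invented:

```python
# Score invented tweets with VADER and map compound scores to labels.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

tweets = [
    "Great offers at the store this Christmas, really happy with the service!",
    "Long queues and empty shelves, very disappointing visit.",
]
for t in tweets:
    score = sia.polarity_scores(t)["compound"]
    label = "positive" if score > 0.05 else "negative" if score < -0.05 else "neutral"
    print(f"{label:8s} ({score:+.2f})  {t}")
```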
TL;DR: This survey paper discusses the main characteristics of the Fog, explores the advantages and motivation of Fog computing, and analyzes its applications for IoT.
Abstract: Fog computing is the new buzzword in the computing world after cloud computing. This new computing paradigm can be seen as an extension of cloud computing. The main aim of fog computing is to reduce the burden on the cloud by moving workloads, services, applications and huge amounts of data toward the network edge. In this survey paper, we discuss the main characteristics of the Fog, namely 1. mobility, 2. location awareness, 3. low latency, 4. a huge number of nodes, 5. extensive geographical distribution, and 6. various real-time applications; we also explore the advantages and motivation of Fog computing and analyze its applications for IoT.
TL;DR: Various Data Mining techniques such as classification, clustering, association, regression in health domain are reviewed and applications, challenges and future work of Data Mining in healthcare are highlighted.
Abstract: Data mining is gaining popularity in disparate research fields due to its boundless applications and approaches for mining data in an appropriate manner. Owing to the changes the current world is undergoing, it is one of the optimal approaches for approximating near-future consequences. Along with advanced research in healthcare, enormous amounts of data are available, but the main difficulty is how to turn the existing information into useful practice. To overcome this hurdle, the concept of data mining is best suited. Data mining has great potential to enable healthcare systems to use data more efficiently and effectively; hence, it improves care and reduces costs. This paper reviews various data mining techniques such as classification, clustering, association and regression in the health domain. It also highlights the applications, challenges and future work of data mining in healthcare.
TL;DR: The proposed algorithm, "Optimal Keyless Algorithm for Security", represents a new way of using the data itself to create a protective shield and provides security at both the character level and the bit level.
Abstract: In the modern era, every business is dependent on the Internet. The network is growing so quickly that at this stage no one can imagine anything without the use of the internet. At the same time, security over the network is very important because of the vulnerability of data to eavesdropping. To protect data from eavesdropping, it must be appropriately encrypted before being sent over the network. Two types of algorithms, keyed and keyless, exist to protect data. Keyed algorithms are efficient, but to avoid the overhead of key generation and key management, keyless algorithms are gaining popularity nowadays. The proposed algorithm, "Optimal Keyless Algorithm for Security", represents a new way of using the data itself to create a protective shield. The algorithm provides security at both the character level and the bit level. A major advantage of the algorithm is that the number of rounds and the number of shifts applied at the bit level are made data-dependent to increase the security level. The system is proposed with the motive of providing the highest security level with minimum execution time for encryption and decryption. This paper presents simulation results for the proposed algorithm and its comparison with the commonly used JS keyless algorithm. General Terms: Security, Cryptography, Key, Keyless, Encryption, Decryption
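The paper's algorithm itself is not reproduced here; the toy sketch below only illustrates the keyless idea, letting the data itself drive the transformation by chaining each byte with the previous ciphertext byte, so no separate key is generated or managed. It has no real cryptographic strength:

```python
# Toy keyless transform: each byte is mixed with the previous ciphertext byte.
def encrypt(data: bytes) -> bytes:
    out, prev = bytearray(), 0x5A            # fixed public seed, not a key
    for b in data:
        c = b ^ prev
        out.append(c)
        prev = c
    return bytes(out)

def decrypt(data: bytes) -> bytes:
    out, prev = bytearray(), 0x5A
    for c in data:
        out.append(c ^ prev)
        prev = c
    return bytes(out)

msg = b"keyless demo"
assert decrypt(encrypt(msg)) == msg
print(encrypt(msg).hex())
```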
TL;DR: It is shown that the introduced model is more competitive than other models, and its statistical properties, such as quantiles, moments and mean deviation, as well as maximum likelihood estimators of its parameters, are discussed.
Abstract: In this paper, a new distribution called the Exponential Lomax distribution is introduced. Depending on its parameter values, the new distribution exhibits decreasing and upside-down bathtub failure rate functions. The statistical properties of this model are studied, such as quantiles, moments and mean deviation; moreover, maximum likelihood estimators of its parameters are discussed. Finally, the procedure is illustrated on a real data set, and it is shown that the introduced model is more competitive than other models.
TL;DR: The project aims to build a monocular vision autonomous car prototype using Raspberry Pi as a processing chip that is capable of reaching the given destination safely and intelligently thus avoiding the risk of human errors.
Abstract: The project aims to build a monocular-vision autonomous car prototype using a Raspberry Pi as the processing chip. An HD camera along with an ultrasonic sensor is used to provide the necessary data from the real world to the car. The car is capable of reaching a given destination safely and intelligently, thus avoiding the risk of human error. Many existing algorithms, such as lane detection and obstacle detection, are combined to provide the necessary control to the car.
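A rough sketch of the lane-detection step mentioned above, using OpenCV's Canny edge detector and probabilistic Hough transform; the parameter values and the input file name are illustrative, not taken from the project:

```python
# Detect straight lane segments in a single frame and draw them.
import cv2
import numpy as np

frame = cv2.imread("road_frame.jpg")                 # assumed input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blur = cv2.GaussianBlur(gray, (5, 5), 0)
edges = cv2.Canny(blur, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.imwrite("lanes.jpg", frame)
```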
TL;DR: The quantum implementation of primitive reversible gates is presented, and the proposed gates have been designed and simulated using QCADesigner.
Abstract: Quantum-dot Cellular Automata (QCA) is an emerging technology that appears to be a good candidate for the next generation of digital systems and is widely used in advanced frameworks. It is an appealing alternative to conventional CMOS technology because of its small size, faster speed, extremely scalable nature, ultra-low power consumption and better switching frequency. The realization of quantum computation is not possible without reversible logic, which has expanded the operations possible in quantum computation. Generally, reversible computing is achieved when a system is composed of reversible gates. It has numerous fields of application such as applied science, quantum-dot cellular automata, low-power VLSI circuits, low-power CMOS, digital signal processing and computer graphics. In this paper, the quantum implementation of primitive reversible gates is presented. The proposed gates have been designed and simulated using QCADesigner. General Terms: Quantum Cellular Automata and Reversible Logic Gates
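As background only (the paper's proposed gates, designed in QCADesigner, are not reproduced here), the truth table of the Feynman (CNOT) gate illustrates what makes a gate reversible: the mapping from input pairs to output pairs is a bijection, so no information is lost.

```python
# Enumerate the Feynman gate truth table and check that it is a bijection.
def feynman(a: int, b: int) -> tuple:
    return a, a ^ b        # P = A, Q = A XOR B

outputs = {feynman(a, b) for a in (0, 1) for b in (0, 1)}
for a in (0, 1):
    for b in (0, 1):
        print(f"A={a} B={b} -> P,Q={feynman(a, b)}")
print("bijective (reversible):", len(outputs) == 4)
```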
TL;DR: A survey of the controlling devices and configurations found in existing remote appliance control systems, which are already used in many places for a wide variety of applications.
Abstract: With the increase in energy consumption and population, there is a grave need to conserve energy in every way possible. The inability to access and control appliances from remote locations is one of the major reasons for energy loss. A web or Android application is used by users to give instructions to these systems. Such systems can make use of a host of communication methods such as Wi-Fi, GSM, Bluetooth and ZigBee. Different controlling devices and configurations can be found in existing systems, which have already been deployed in many places for a wide variety of applications. This paper presents a survey of such systems.
TL;DR: This paper investigates and evaluates some popular feature normalization techniques and studies their impact on classifier performance with application to breast tumor classification using ultrasound images; results show that normalization of features has a significant effect on classification accuracy.
Abstract: Feature extraction and feature normalization are important preprocessing techniques, usually employed before classification. Feature normalization is a useful step to restrict the values of all features within predetermined ranges. However, the appropriate choice of normalization technique and normalization range is an important issue, since applying normalization to the input can change the structure of the data and thereby affect the outcome of the multivariate analysis and calibration used in data mining and pattern recognition problems. This paper investigates and evaluates some popular feature normalization techniques and studies their impact on classifier performance with application to breast tumor classification using ultrasound images. For evaluating the feature normalization techniques, back-propagation artificial neural network (BPANN) and support vector machine (SVM) classifier models are used. Results show that normalization of features has a significant effect on classification accuracy. General Terms: Pattern Recognition, Medical Image Processing.
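A sketch of the kind of comparison described: the same SVM classifier trained on raw, min-max normalized and z-score standardized features. Scikit-learn's breast-cancer dataset stands in for the paper's ultrasound features, so the numbers only illustrate the general effect:

```python
# Compare one classifier's accuracy under different feature normalizations.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler, StandardScaler
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

for name, scaler in [("raw", None), ("min-max", MinMaxScaler()), ("z-score", StandardScaler())]:
    if scaler is None:
        a, b = Xtr, Xte
    else:
        a, b = scaler.fit_transform(Xtr), scaler.transform(Xte)
    acc = SVC().fit(a, ytr).score(b, yte)
    print(f"{name:8s}: accuracy = {acc:.3f}")
```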