
Showing papers in "International Review on Computers and Software in 2014"


Journal ArticleDOI
TL;DR: A cognition-based ontology is used for information retrieval, which can be implemented in a semantic-based information retrieval system, and probability rule-based reasoning techniques are used for ontology learning and reasoning.
Abstract: A combination of different concepts, together with their associative relationships expressed quantitatively, constitutes a valid information knowledge base. The main objective of this paper is relevant information retrieval through a cognitive process. The concept of information retrieval is explored and visualized with respect to handling the uncertain knowledge of the World Wide Web, where semantics plays an important role in narrowing web searches. A concept of fuzzy ontology is introduced for both textual and image search; an ontology is the specification of concepts and their relations in an organized way. We use a cognition-based ontology for information retrieval, which can be implemented in a semantic-based information retrieval system. For textual search, a combination of formal context and ontology is used, whereas for image search, low-level feature determination and fuzzy membership function mapping are used. The domain of the basketball game is considered: its actions are listed and the respective image features are linked through Type-2 fuzzy metrics. To extract information from the created ontology, probability rule-based reasoning techniques are used for ontology learning and reasoning; this rule-based method is enriched by a fuzzy learning system.

10 citations


Journal ArticleDOI
TL;DR: An appropriate multicriteria network selection algorithm based on a multiplicative weighted utility function is proposed to provide a complete solution for seamless connectivity in a heterogeneous environment, based on network conditions, application QoS, Mobile Terminal (MT) battery level and user preferences.
Abstract: Providing global connectivity with high speed and high quality at any place and any time is now becoming a reality due to the integration and coordination of different Radio Access Technologies (RAT) such as Worldwide Interoperability for Microwave Access (WiMAX), Universal Mobile Telecommunication Systems (UMTS), Wireless Local Area Network (WLAN) and Long Term Evolution (LTE). Such a diversity of networks offers different choices in terms of bandwidth, security, cost, latency and coverage area for the mobile user. For such a heterogeneous wireless environment, it is important to design an intelligent handover algorithm that selects the optimal target network. Existing works do not consider the interdependence between the criteria or the use of application QoS in weight calculation during the network evaluation process. To address this issue, an appropriate multicriteria network selection algorithm based on a multiplicative weighted utility function is proposed to provide a complete solution for seamless connectivity in a heterogeneous environment, based on network conditions, application QoS, Mobile Terminal (MT) battery level and user preferences. MATLAB-based simulations are conducted to highlight the effectiveness of the proposed scheme, and simulation results confirm that it selects the best suitable network for both Real-Time (RT) and Non-Real-Time (NRT) applications while balancing QoS, cost and energy consumption. The Cobb-Douglas based user satisfaction degree is also estimated to verify whether the selected network maximizes end user satisfaction for the offered service.
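
As a rough illustration of a multiplicative weighted (Cobb-Douglas style) utility for network selection, the sketch below scores candidate networks on normalized criteria; the criterion names, scores and weights are invented for demonstration and are not the paper's values.

```python
# Illustrative sketch of a multiplicative weighted utility function for
# network selection (Cobb-Douglas form). All values are hypothetical;
# cost-like criteria are assumed already inverted during normalization,
# so higher is better for every criterion.

def utility(network, weights):
    """Multiplicative weighted utility: U = prod(x_i ** w_i)."""
    u = 1.0
    for criterion, weight in weights.items():
        u *= network[criterion] ** weight
    return u

# Normalized scores in (0, 1] per candidate network.
candidates = {
    "WLAN":  {"bandwidth": 0.9, "cost": 0.8, "coverage": 0.3, "battery": 0.9},
    "UMTS":  {"bandwidth": 0.4, "cost": 0.5, "coverage": 0.9, "battery": 0.6},
    "WiMAX": {"bandwidth": 0.7, "cost": 0.6, "coverage": 0.7, "battery": 0.7},
}

# Weights would reflect application QoS and user preferences; they sum to 1.
weights = {"bandwidth": 0.4, "cost": 0.2, "coverage": 0.2, "battery": 0.2}

best = max(candidates, key=lambda n: utility(candidates[n], weights))
print(best, {n: round(utility(v, weights), 3) for n, v in candidates.items()})
```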

9 citations


Journal ArticleDOI
TL;DR: This paper proposes a location aided cluster based geographical routing protocol for intermittently connected MANETs that minimizes control overhead and delay.
Abstract: In a mobile ad hoc network (MANET), increased node mobility causes increased control overhead. Existing route discovery techniques are not efficient for intermittently connected or Delay Tolerant Networks (DTN). To overcome these issues, in this paper we propose a location aided cluster based geographical routing protocol for intermittently connected MANETs. In this technique, the cluster head is chosen based on a node value estimated from degree difference, node mobility and residual energy. A cluster consists of GPS-enabled nodes (G-Nodes) and antenna-equipped nodes; a cluster containing at least one G-Node considers the remaining energy and speed of each node, along with its mobility, to select the cluster head. Cluster maintenance is also implemented to dynamically re-organize and re-configure clusters as nodes move in the ad hoc network. A store-carry-forward and geographical routing based protocol is then employed. Finally, to prevent location errors during routing, a location update technique is executed. Simulation results show that the proposed technique minimizes control overhead and delay.
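
A hypothetical sketch of the node-value based cluster head election described above follows: a weighted combination of degree difference, mobility and residual energy. The weights, the ideal degree and the field names are assumptions for illustration, not the paper's formula.

```python
# Hypothetical node-value computation: favors nodes close to an ideal
# degree, with low speed and high residual energy. Weights are invented.

def node_value(node, ideal_degree=4, w1=0.4, w2=0.3, w3=0.3):
    degree_diff = abs(node["degree"] - ideal_degree)
    # Lower degree difference and lower speed score higher; so does energy.
    return w1 / (1 + degree_diff) + w2 / (1 + node["speed"]) + w3 * node["energy"]

cluster = [
    {"id": 1, "degree": 3, "speed": 2.0, "energy": 0.8, "gps": True},
    {"id": 2, "degree": 5, "speed": 0.5, "energy": 0.6, "gps": False},
    {"id": 3, "degree": 4, "speed": 1.0, "energy": 0.9, "gps": True},
]

# Only clusters containing at least one GPS-enabled node (G-Node) elect a head.
if any(n["gps"] for n in cluster):
    head = max(cluster, key=node_value)
    print("cluster head:", head["id"])
```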

9 citations


Journal ArticleDOI
TL;DR: A model using Fuzzy Petri Nets (FPN) is developed for assessing the risk associated with the post-implementation phase or the usage phase of Enterprise Resource Planning (ERP) in Small and Medium Enterprises.
Abstract: A model using Fuzzy Petri Nets (FPN) is developed for assessing the risk associated with the post-implementation, or usage, phase of Enterprise Resource Planning (ERP) in Small and Medium Enterprises (SMEs). The paper identifies a well-defined set of risk factors from the existing research literature. FPN is used because of its simplicity and its efficiency in quantifying the risks inherent in the post-implementation or usage stage of ERP adoption by SMEs. From a project risk management perspective, the presented model serves to validate and quantify the risk inherent in the usage phase of ERP adoption.

9 citations


Journal ArticleDOI
TL;DR: A novel approach for modeling HIS requirements based on the System of Systems (SoS) paradigm is proposed and the language used is SysML due to its vocation to be a modeling language dedicated to complex systems.
Abstract: The health care domain faces an era of unprecedented pressure to improve the quality and effectiveness of care. Consequently, Hospital Information Systems (HIS) have become mandatory to improve the quality of both health care and managerial processes while simultaneously reducing their cost. However, due to the complex nature of HIS and the variety of end users and professional disciplines involved in the production process of care, stakeholders require an in-depth analysis of requirements. This paper proposes a novel approach for modeling HIS requirements based on the System of Systems (SoS) paradigm. The language used is SysML, owing to its vocation as a modeling language dedicated to complex systems. The billing system serves as an experiment for modeling and requirements association.

9 citations


Journal ArticleDOI
TL;DR: An automated method based on a backpropagation neural network (BPNN) is presented for classification of MRI scans of the human brain; results showed an acceptable classification accuracy.
Abstract: Brain tumor classification in magnetic resonance imaging (MRI) is very important in medical diagnosis. Most current conventional diagnosis techniques rely on human experience in interpreting MRI scans for classification. This paper presents an automated method based on a backpropagation neural network (BPNN) for classification of MRI scans of the human brain. The proposed method utilizes the wavelet transform (WT) as a feature extraction tool and follows two steps: feature extraction and classification. WT is first employed to decompose the image into different levels of approximation and detail coefficients, and these coefficients are then fed into a BPNN for classification and tumor detection. The proposed method has been applied to several MRI scans, and the results showed an acceptable classification accuracy.
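
A minimal sketch of the two-step pipeline described above is given below: wavelet decomposition for feature extraction, then a backpropagation-trained neural network for classification. It uses PyWavelets and scikit-learn's MLPClassifier; the synthetic images, labels and the choice of Haar wavelet at level 3 are placeholder assumptions, not the paper's setup.

```python
# Wavelet features + BPNN classification sketch (synthetic stand-in data).
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wavelet_features(image, wavelet="haar", level=3):
    # Keep the level-3 approximation coefficients as a compact feature vector.
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    return coeffs[0].ravel()

rng = np.random.default_rng(0)
images = rng.random((40, 64, 64))   # stand-ins for 64x64 MRI slices
labels = rng.integers(0, 2, 40)     # 0 = normal, 1 = tumor (dummy labels)

X = np.array([wavelet_features(img) for img in images])
bpnn = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
bpnn.fit(X, labels)                 # trained with backpropagation
print("training accuracy:", bpnn.score(X, labels))
```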

8 citations


Journal ArticleDOI
TL;DR: This paper gives an overview of ontology and its types, including ontology building and design for an enterprise system, and presents the different definitions of ontology.
Abstract: Ontology is an important concept in Computer Science for formally representing knowledge. The software engineering ontology assists in defining information for the exchange of semantic project information. Research into ontological issues has been active in various areas. This paper presents the origin of ontology research and gives the different definitions of ontology. It provides an overview of ontology and its types, including ontology building and design for an enterprise system. The review studies articles in which adaptation models and their properties were discussed, in order to obtain a clear picture of ontology. A Systematic Literature Review was used to identify the important characteristics of ontology. More than 70 papers were identified, of which only 46 were precisely relevant to the field of the ontology development process. We analyse ontology based on types, design, building and models as a systematic review of the subject, and choose the best approach accordingly.

8 citations


Journal ArticleDOI
TL;DR: The main objectives of this research are to propose a model to achieve better knowledge representation, provide the capability to expand queries through additional analytical attributes and reduce redundancy, and thereby obtain better integrity and consistency in spatio-temporal databases.
Abstract: We are surrounded by information, and much of it needs to be stored and analysed. Data analysis would be easier if the data storage structure were closer to a natural data structure. Many storage structures and related methods have been proposed in recent years due to the importance of understanding spatio-temporal information associated with a particular place and time. In this paper, some of the most important analytic methods for spatio-temporal data are considered and categorized in terms of their algorithms. We also describe the difficulties of knowledge representation when dealing with spatio-temporal data. In addition, three of the analytic functions of the Hair-oriented Data Model, a nature-inspired solution, are defined. These analytic functions are implemented in Oracle and tested on climate change data as a case study. The main objectives of this research are to propose a model that achieves better knowledge representation, provides the capability to expand queries through additional analytical attributes and reduces redundancy, thereby obtaining better integrity and consistency in spatio-temporal databases.

8 citations


Journal ArticleDOI
TL;DR: Experimental results using a 72-image dataset demonstrate that PCA is able to reduce computational time while improving classification accuracy and the use of the proposed Gabor filter seems to be more robust compared to other existing techniques.
Abstract: In this paper, a technique to classify Engineering Machined Textures (EMT) into the six classes of Turning, Grinding, Horizontal-Milling, Vertical-Milling, Lapping and Shaping, is presented. Multidirectional Gabor features are firstly extracted from each image followed by a dimensionality reduction step using Principal Components Analysis (PCA). The images are finally classified using a supervised Artificial Neural Network (ANN) classifier. Experimental results using a 72-image dataset demonstrate that PCA is able to reduce computational time while improving classification accuracy. In addition, the use of the proposed Gabor filter seems to be more robust compared to other existing techniques.
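
An illustrative sketch of the EMT pipeline follows: multidirectional Gabor features, PCA for dimensionality reduction, then an ANN classifier. The kernel parameters, response statistics and synthetic 72-image dataset are assumptions for demonstration, not the paper's configuration.

```python
# Gabor features -> PCA -> ANN classification sketch (synthetic data).
import numpy as np
import cv2
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def gabor_features(image, n_orientations=6):
    feats = []
    for i in range(n_orientations):
        theta = i * np.pi / n_orientations      # multidirectional filter bank
        kernel = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                    lambd=10.0, gamma=0.5)
        response = cv2.filter2D(image, cv2.CV_32F, kernel)
        feats += [response.mean(), response.std()]  # simple response statistics
    return feats

rng = np.random.default_rng(1)
images = rng.random((72, 128, 128)).astype(np.float32)  # 72-image stand-in
labels = np.repeat(np.arange(6), 12)                    # six machining classes

X = np.array([gabor_features(img) for img in images])
X = PCA(n_components=5).fit_transform(X)                # reduce before the ANN
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
ann.fit(X, labels)
print("training accuracy:", ann.score(X, labels))
```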

8 citations


Journal ArticleDOI
TL;DR: A framework for constructing IPv6 datasets is proposed, which can encourage researchers to produce a solid dataset for the IPv6 network environment and facilitate further research in the IPv6 security domain.
Abstract: IPv6 has been implemented for quite a while, and the number of IPv6 users has gradually increased. This is due to the high demand for new IP address allocations, which IPv4 can no longer offer. Theoretically, the IPv6 protocol is much better than IPv4 in terms of security, mobility and routing speed. Although the design of IPv6 has taken security concerns into account, its implementation is not a panacea for all security issues; new threats have been discovered due to flaws in the new IPv6 design. In IPv4, the KDD'99 dataset is widely used to propose new detection techniques, and many intrusion detection techniques have been proposed using it. Unfortunately, to date there is no available dataset similar to KDD'99 for the IPv6 network environment. Hence, this paper proposes a framework for constructing a KDD'99-like dataset based on the IPv6 network environment. An example of IPv6 dataset construction is explained according to the proposed framework, using a testbed based on the original KDD'99 framework as a baseline platform. The proposed framework can encourage researchers to produce a solid dataset for the IPv6 network environment, and in the future a new dataset can be produced to facilitate further research in the IPv6 security domain.

7 citations


Journal ArticleDOI
TL;DR: It was found that the enhancements made to the extended HS mainly involve modification of the parameters, such as the harmony memory consideration rate (HMCR), pitch adjusting rate (PAR) and distance bandwidth (BW).
Abstract: The Harmony Search (HS) algorithm is a population-based meta-heuristic approach that excels at solving diversified and large-scale optimization problems. Several studies have pointed out that Harmony Search is an efficient and flexible tool for resolving optimization problems in areas as diverse as construction, engineering, robotics, telecommunication, health and energy. Considering its increasing usage in diverse areas, this paper presents the historical development of HS, highlighting its different features, modifications for improvement and limitations. Based on a description of the fundamental concept of HS, recent variations of the extended HS were analyzed, focusing on the algorithm's theory as well as its fundamental and primary concepts. It was found that the enhancements made to the extended HS mainly involve modification of the parameters, such as the harmony memory consideration rate (HMCR), pitch adjusting rate (PAR) and distance bandwidth (BW). This analysis provides useful motivation for researchers interested in improving the performance of the standard HS algorithm and enhancing its solution convergence rate and flexibility.
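
The compact sketch below shows the standard Harmony Search loop and the role of the three parameters discussed above (HMCR, PAR and BW). The objective function, bounds and parameter values are illustrative only.

```python
# Standard Harmony Search minimizing a toy objective.
import random

def harmony_search(objective, dim=2, bounds=(-5.0, 5.0), hms=10,
                   hmcr=0.9, par=0.3, bw=0.2, iterations=2000):
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iterations):
        new = []
        for d in range(dim):
            if random.random() < hmcr:            # harmony memory consideration
                x = random.choice(memory)[d]
                if random.random() < par:         # pitch adjustment within BW
                    x += random.uniform(-bw, bw)
            else:                                 # random selection
                x = random.uniform(lo, hi)
            new.append(min(max(x, lo), hi))
        worst = max(memory, key=objective)        # replace worst harmony if better
        if objective(new) < objective(worst):
            memory[memory.index(worst)] = new
    return min(memory, key=objective)

sphere = lambda v: sum(x * x for x in v)
print(harmony_search(sphere))                     # approaches [0, 0]
```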

Journal ArticleDOI
TL;DR: A lightweight self-organized efficient authentication and key management scheme (SEAKS) is proposed to counter MAC-layer attacks such as denial of service (DoS), replay, man-in-the-middle and interleaving attacks in mobile multihop relay (MMR) networks.
Abstract: This paper proposes a lightweight self-organized efficient authentication and key management scheme (SEAKS) to counter MAC-layer attacks such as denial of service (DoS), replay, man-in-the-middle and interleaving attacks in mobile multihop relay (MMR) networks. SEAKS has been developed based on the privacy key management (PKM) protocol, for both unilateral authentication (SEAKS-PKMv1) and mutual authentication (SEAKS-PKMv2). In SEAKS, the non-transparent relays (N-RS) perform authentication and establish the authorization key (AK) using our proposed public key cryptosystem based on a hash authentication code scheme. Subsequent N-RSs can be authenticated with less overhead, enhancing the scalability of the system. The performance of the SEAKS-PKMv1 and SEAKS-PKMv2 protocols has been evaluated using BAN logic to verify the integrity of the participating N-RSs and SSs. A simulation study shows that SEAKS exhibits a 22% higher packet delivery ratio, 12% lower packet overhead and 14% less processing time compared to the official draft scheme (OD-2009) for MMR WiMAX networks. SEAKS can be applied to any multihop network with minimum authentication overhead.
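
For illustration, the sketch below shows a generic hash-authentication-code step of the kind such schemes build on: a relay answers a nonce challenge with an HMAC, after which an authorization key (AK) is derived. This is not the SEAKS protocol itself; the key sizes, labels and exchange shape are assumptions.

```python
# Generic HMAC challenge-response with AK derivation (illustrative only).
import hmac, hashlib, os

shared_key = os.urandom(32)                 # assumed pre-shared key

def respond(key, nonce):
    return hmac.new(key, nonce, hashlib.sha256).digest()

def verify_and_derive_ak(key, nonce, response):
    expected = hmac.new(key, nonce, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, response):
        raise ValueError("authentication failed")
    # Derive an authorization key bound to this exchange (label is assumed).
    return hmac.new(key, b"AK" + nonce, hashlib.sha256).digest()

nonce = os.urandom(16)                      # fresh challenge thwarts replay
ak = verify_and_derive_ak(shared_key, nonce, respond(shared_key, nonce))
print("AK established:", ak.hex()[:16], "...")
```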

Journal ArticleDOI
TL;DR: A monitoring, detection and prevention system is presented to help individuals whose health is affected by diseases such as electromagnetic hypersensitivity; it reports the levels of electromagnetic radiation in certain areas based on information provided by the Smart City itself.
Abstract: Today, with the constant rise of Smart Cities around the world, there is an exponential increase in the use and deployment of information technologies in cities. The intensive use of Information Technology (IT) in these ecosystems facilitates and improves the quality of life of citizens, but these digital communities also include individuals whose health is affected by the development or aggravation of diseases such as electromagnetic hypersensitivity. In this paper we present a monitoring, detection and prevention system to help this group, reporting the levels of electromagnetic radiation in certain areas based on information provided by the Smart City itself. This work provides an ideal platform for generating predictive models to detect future states of risk for humans.

Journal ArticleDOI
TL;DR: A new algorithm is proposed that has functionality similar to the Ant Colony Optimization algorithm and improves association rule mining results, with the aim of improving the quality of the rules produced by CACO.
Abstract: In recent years, diabetes mellitus has come to top the list of chronic diseases worldwide among the major public health challenges. Diagnosing diabetes at a preliminary stage is undoubtedly challenging, as it involves varying complexities and interrelations among, and dependence on, several factors that affect it directly or indirectly. Given the large number of diabetic patients in recent years, measures must be devised to support medical diagnostic decision systems that help doctors, researchers and medical practitioners during and after the process of diagnosing diabetes. In this paper, Association Rule Mining and the Enhanced FP-Growth algorithm are used as a reference to propose a new algorithm that has functionality similar to the Ant Colony Optimization algorithm and improves association rule mining results. The Refined Continuous Ant Colony Optimization (CACO) deploys a meta-heuristic approach inspired by the behavior of real ant colonies, extended to continuous domains. The association rules initially produced by the Enhanced FP-Growth algorithm are processed so that the weakest rules are identified on the basis of a threshold value; the Ant Colony algorithm then reduces the association rules and discovers rules of better quality. The research presented here aims at reducing database scanning by optimizing and improving the quality of the rules produced by CACO.

Journal ArticleDOI
TL;DR: This study proposes a technique to separate the SMS phishing class from SMS spam with better accuracy using the Bayesian technique, and generates an improved SMS phishing corpus labelled in three different classes, i.e., Ham, Spam and Phishing.
Abstract: Short Message Service (SMS) is one of the most popular communication services, but it can also contribute to increasing attacks on mobile devices. Presently, the SMS phishing (SMiShing) attack is alarming for mobile phone users because such attacks usually succeed in stealing information and money. Moreover, SMS phishing and spam are two different types of attack with different levels of risk, so it is important to have an SMS phishing corpus. Established SMS corpora are limited to spam, and none can be found that is suitable for SMS phishing. This study proposes a technique to separate the SMS phishing class from SMS spam and produce better accuracy using the Bayesian technique. The results show that the enhanced SMS corpus achieves 99.8064% classification accuracy. The study identified the classes and generated an improved SMS phishing corpus labelled in three different classes, i.e., Ham, Spam and Phishing, with better accuracy.
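
A minimal sketch of three-class Bayesian SMS classification (ham / spam / phishing) in the spirit of the corpus work above, using scikit-learn's multinomial Naive Bayes. The tiny inline corpus is entirely made up.

```python
# Three-class Naive Bayes SMS classifier on an invented toy corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

messages = [
    ("see you at lunch tomorrow", "ham"),
    ("free ringtones, reply WIN to claim", "spam"),
    ("your bank account is locked, verify at http://fake.example", "phishing"),
    ("call mom when you land", "ham"),
    ("congratulations you won a prize, text back now", "spam"),
    ("update your password here to avoid suspension", "phishing"),
]
texts, labels = zip(*messages)

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)
print(model.predict(["verify your account at http://fake.example"]))
```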

Journal ArticleDOI
TL;DR: This technique makes the EDCA protocol tolerate more VoIP users while achieving the QoS requirements, decreasing collisions inside the network and increasing throughput.
Abstract: Distributed Coordination Function (DCF) and Enhanced Distributed Channel Access (EDCA) are two protocols used in the Medium Access Control (MAC) layer. DCF sends its data without any differentiation between data types: real-time application data such as voice or video receives no preferential service. With the EDCA protocol, data are divided into four categories with different priorities: voice takes the highest priority, followed by video, best effort and background, and specific EDCA parameters are used in each category. When using EDCA, there is a probability that the backoff timers of two or more categories reach zero at the same time, causing internal collisions. An increasing number of collisions affects the Quality of Service (QoS) requirements and decreases network throughput. In this paper, OPNET simulation is used to expose some limitations of the EDCA protocol as the number of Voice over Internet Protocol (VoIP) users increases under default EDCA parameter values. In addition, this paper proposes a new technique to enhance the EDCA protocol by adjusting the contention window (CW) depending on the workload. Our technique makes the EDCA protocol tolerate more VoIP users while achieving the QoS requirements; it also decreases collisions inside the network and increases throughput.
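
A hypothetical sketch of the load-adaptive contention window idea follows: grow CWmin for the voice category as the number of active VoIP users rises, to spread backoff values and cut collisions. The doubling rule and user thresholds are assumptions, not the paper's exact scheme; the defaults shown are the usual 802.11e AC_VO values.

```python
# Load-adaptive CWmin for the voice access category (illustrative rule).

CWMIN_VOICE_DEFAULT = 3    # 802.11e default CWmin for AC_VO
CWMAX_VOICE = 7            # 802.11e default CWmax for AC_VO

def adjusted_cw_min(n_voip_users, base=CWMIN_VOICE_DEFAULT):
    # Assumed rule: double the window for every 10 extra users, capped at CWmax.
    cw = base * (2 ** (n_voip_users // 10))
    return min(cw, CWMAX_VOICE)

for users in (5, 15, 25, 40):
    print(users, "VoIP users -> CWmin =", adjusted_cw_min(users))
```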

Journal ArticleDOI
TL;DR: Simulation results show that the SEER-MWSN protocol achieves lower energy dissipation and delay and greater network lifetime, Packet Delivery Ratio and Throughput than the existing LEACH-Mobile and LEACH-Mobile-Enhanced protocols.
Abstract: Existing routing algorithms in wireless sensor networks assume that the sensor nodes are stationary, but some WSN applications mix mobile sensor nodes with fixed sensor nodes in the same network, so mobility is a key factor for the WSN. When nodes are mobile, performance degrades; moreover, since sensor devices are resource-restricted, the network is exposed to different types of attacks. Traditional security mechanisms are not suitable for mobile wireless sensor networks because of the resource limitations of sensor nodes. Routing security is especially important for sensor networks: because intermediate nodes need to access the contents of data messages, there are many attacks on wireless sensor network routing protocols against which traditional end-to-end security mechanisms are powerless. Therefore, mobility and security are challenging tasks in MWSNs. In this paper, we propose Secure, Energy Efficient and Reliable Routing Protocols for Mobile Wireless Sensor Networks (SEER-MWSN), treating security and mobility as major constraints. The protocol provides energy efficiency as well as sufficient security for mobile sensor networks. Simulation results show that the SEER-MWSN protocol achieves lower energy dissipation and delay and greater network lifetime, Packet Delivery Ratio and Throughput than the existing LEACH-Mobile and LEACH-Mobile-Enhanced protocols.

Journal ArticleDOI
TL;DR: This paper makes a comparative simulation analysis of MoLEACH and LEACH, testing different parameters such as first node dead, half nodes dead, and the effect of the number of nodes on network lifetime; MoLEACH shows improved energy efficiency over LEACH.
Abstract: In this paper, we propose a Modified Low-Energy Adaptive Clustering Hierarchy (MoLEACH) protocol to improve energy consumption in Wireless Sensor Networks. The novelty of MoLEACH is that, unlike the original LEACH which uses the residual energy of the network, it considers the residual energy of each node when calculating the threshold value for cluster head selection in the next round. We make a comparative simulation analysis of MoLEACH and LEACH, testing different parameters such as first node dead, half nodes dead, and the effect of the number of nodes on network lifetime. The simulation results show that the number of nodes affects network lifetime: increasing the number of nodes decreases the lifetime. In a small area, a minimal number of nodes is better for network lifetime under both MoLEACH and LEACH. Overall, MoLEACH shows improved energy efficiency over LEACH.
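
The sketch below illustrates the cluster-head threshold involved. Standard LEACH uses T(n) = p / (1 - p * (r mod 1/p)) for nodes that have not recently been cluster heads; MoLEACH is described as factoring in each node's residual energy, so the energy-ratio weighting shown here is an assumption standing in for the paper's exact formula.

```python
# LEACH threshold and an assumed energy-weighted MoLEACH variant.
import random

def leach_threshold(p, r):
    # p: desired cluster-head fraction, r: current round.
    return p / (1 - p * (r % int(1 / p)))

def moleach_threshold(p, r, e_residual, e_initial):
    # Assumed weighting: scale the threshold by the node's remaining energy.
    return leach_threshold(p, r) * (e_residual / e_initial)

p, r = 0.05, 3
node = {"e_residual": 0.42, "e_initial": 1.0}
t = moleach_threshold(p, r, node["e_residual"], node["e_initial"])
print("becomes cluster head:", random.random() < t)
```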

Journal ArticleDOI
TL;DR: The proposed method, which uses clustering to classify lung images as normal or abnormal, proved to be highly efficient for the detection of lung nodules with a high rate of accuracy.
Abstract: Lung cancer has become one of the leading causes of cancer-related death in both men and women, accounting for about 30% of cancer deaths occurring in the world. Medical image analysis is a complex task in which a human expert makes extensive use of knowledge of anatomy and imaging techniques. In particular, detecting the presence of lung nodules is a challenging problem from a computer vision point of view. Early detection of lung cancer can definitely improve the long-term health of those diagnosed with it, and evaluating the variation of cardiac size from month to month via serial chest images remains crucial for treatment. In this paper, we propose an efficient method for detecting the presence of lung nodules with the help of CT images. The proposed method is carried out in three stages: segmentation, classification and detection. Clustering is used to classify the lung images as normal or abnormal. These methods helped improve the early detection of lung nodules, and the proposed method proved to be highly efficient for lung nodule detection with a high rate of accuracy.

Journal ArticleDOI
TL;DR: The fuzzy k-means clustering algorithm improves the relevance results of a conventional image retrieval system by first clustering related images in the image database, improving the effectiveness of image retrieval.
Abstract: The construction of large databases with thousands of items has been facilitated by developments in data storage and image acquisition technologies, and a suitable information system requires these datasets to be handled efficiently. Content-Based Image Retrieval (CBIR) is a commonly used approach for handling such datasets: based on image content, CBIR extracts the images relevant to a user's query image from large image databases. Many CBIR systems retrieve results according to feature similarity with the given query, ignoring the similarities among images in the database. Existing CBIR systems often measure feature similarity using the k-means algorithm, but the traditional k-means algorithm depends heavily on the selection of initial center values, normally chosen by a random procedure, which degrades the performance of CBIR retrieval results. To overcome the random selection of initial centroids in k-means clustering, fuzzy-logic-based feature similarity information is combined with the k-means clustering algorithm for image retrieval. Combining both low-level and high-level visual features, the fuzzy k-means algorithm fully measures the feature similarity information between images in large datasets. The fuzzy k-means clustering algorithm improves the relevance results of a conventional image retrieval system by first clustering related images in the image database, improving the effectiveness of image retrieval.
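
A minimal fuzzy k-means (fuzzy c-means) sketch of the clustering step described above is shown below, applied to toy feature vectors. Real CBIR would cluster low- and high-level image feature vectors; the data, fuzzifier m and cluster count here are synthetic assumptions.

```python
# Fuzzy c-means clustering of toy 2-D feature vectors.
import numpy as np

def fuzzy_kmeans(X, c=2, m=2.0, iterations=50, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(iterations):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)      # standard FCM membership update
    return centers, U

X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
               np.random.default_rng(2).normal(3, 0.3, (20, 2))])
centers, U = fuzzy_kmeans(X)
print("cluster centers:\n", centers.round(2))
```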

Journal ArticleDOI
TL;DR: This paper introduces the novel concept extraction method PROCEOL (Probabilistic Relational of Concept Extraction in Ontology Learning); experimental results show better concept extraction than the state-of-the-art method.
Abstract: Ontologies play an important role in knowledge management, for example in annotating web resources, web mining and other internet-related applications. The manual construction of high quality ontologies is expensive and time-consuming, so many automatic and semi-automatic ontology construction systems have been created. Existing ontology learning provides good results but sometimes fails due to noise in the text: noisy text is one of the major problems in ontology learning, because it cannot be extracted and thus prevents the extraction from completing. To avoid noise in the data and provide a quick process, this paper introduces a novel concept extraction method that supports ontology building through automatic and semi-automatic processes. Most ontology learning techniques are developed using classifiers, NLP, and probabilistic and statistical learning; for concept extraction, this work uses statistical learning combined with text. To increase richness and avoid the issue of noise, the paper proposes the PROCEOL (Probabilistic Relational of Concept Extraction in Ontology Learning) method. Experimental results show better concept extraction compared to the state-of-the-art method.

Journal ArticleDOI
TL;DR: From the analysis, the chosen framework has been enhanced into a Malay SMS spam and phishing detection framework that shows high accuracy in detecting Malay SMS ham, spam and phishing.
Abstract: Short Message Service (SMS) spam and SMS phishing have been increasing lately, especially in Malay, the national language of Malaysia. Many SMS spam corpora in other languages have been proposed, but not yet for Malay, and we are the first to propose one. In addition, this paper analyses several SMS spam filtering frameworks for our SMS spam and phishing detection framework. From this analysis, the chosen framework has been enhanced for Malay SMS spam and phishing. The enhancement concerns the classification phase, where our framework proposes dual classification: classification 1 separates SMS into ham and scam, and classification 2 further classifies scam SMS into SMS spam and SMS phishing. After the dual classification phase is completed, the Malay SMS are examined using the Naive Bayes and J48 machine learning techniques. The results show high accuracy in detecting Malay SMS ham, spam and phishing.

Journal ArticleDOI
TL;DR: Experimental results analyzing the sensitivity of RSSI measurements in an indoor environment at various power levels are presented; along with the reduction in distance error through the improved RSSI technique, path loss is also comparatively reduced.
Abstract: Localization is one of the most challenging and important issues in wireless sensor networks (WSN), especially when cost-effective approaches are demanded. Distance measurement based on RSSI (Received Signal Strength Indication) is a low-cost and low-complexity technique widely applied in range-based localization of WSNs. The RSS (Received Signal Strength) is used to estimate the distance between an unknown node and a number of reference nodes with known coordinates; the location of the target node is then determined by trilateration. The log-normal shadowing model can better describe the relationship between the RSSI value and distance. Due to non-line-of-sight and multipath transmission effects in indoor environments, the distance (ranging) error can be large. It is also challenging to optimize the transmission power of WSNs in the presence of path loss: although increasing the transmission power reduces the residual energy, it also reduces the number of retransmissions required. In this paper, experimental results analyzing the sensitivity of RSSI measurements in an indoor environment at various power levels are presented. Along with the reduction in distance error through the improved RSSI technique, path loss is also comparatively reduced.
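
The sketch below shows range estimation with the log-normal shadowing model mentioned above, RSSI(d) = RSSI(d0) - 10*n*log10(d/d0), followed by least-squares trilateration against reference anchors. The path-loss exponent, reference power, anchor positions and RSSI readings are illustrative values.

```python
# RSSI ranging (log-normal shadowing) + least-squares trilateration.
import numpy as np

def rssi_to_distance(rssi, rssi_d0=-40.0, d0=1.0, n=2.7):
    # Invert RSSI(d) = RSSI(d0) - 10*n*log10(d/d0) for distance d.
    return d0 * 10 ** ((rssi_d0 - rssi) / (10 * n))

def trilaterate(anchors, distances):
    # Linearize the circle equations against the last anchor and solve
    # the resulting over-determined system with least squares.
    (x_n, y_n), d_n = anchors[-1], distances[-1]
    A, b = [], []
    for (x, y), d in zip(anchors[:-1], distances[:-1]):
        A.append([2 * (x_n - x), 2 * (y_n - y)])
        b.append(d**2 - d_n**2 + x_n**2 - x**2 + y_n**2 - y**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

anchors = [(0, 0), (10, 0), (0, 10), (10, 10)]
rssi = [-62.0, -55.0, -58.0, -50.0]               # measured at the target
dists = [rssi_to_distance(r) for r in rssi]
print("estimated position:", trilaterate(anchors, dists).round(2))
```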

Journal ArticleDOI
TL;DR: Experimental results show that the proposed data prediction schemes using SVM and AHL decrease the application execution time by more than 50%.
Abstract: In recent years, scientific applications have needed to handle huge volumes of data effectively. Processing and handling such large quantities of data requires large-scale computing infrastructure such as grids. The main objective of the proposed system is to improve the performance of the grid system by predicting the behavior of the application and its future events. The time series classification technique models the behavior of the application based on three properties, namely stochasticity, linearity and stationarity. The major drawback of time series classification is that it is a complex process and not suitable for long-term forecasting. To overcome these difficulties, in this work two techniques, namely Support Vector Machines (SVM) and Adaptive Hypergraph Learning (AHL), are used for predicting the behavior of user applications. SVM is a supervised learning algorithm used in many domains, such as classification and regression analysis; AHL is a graph-based learning algorithm which uses the K-Nearest Neighbors (KNN) algorithm to construct a hypergraph. A confusion matrix is constructed to determine the prediction accuracy of the proposed SVM- and AHL-based algorithms. The jobs are executed in the grid simulator OptorSim with the predicted events. Experimental results show that the proposed data prediction schemes using SVM and AHL decrease the application execution time by more than 50%.
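
The following is an illustrative sketch of one half of the proposed scheme: an SVM trained to predict application behavior classes, evaluated with a confusion matrix. The feature columns (stochasticity, linearity, stationarity scores) and labels are synthetic stand-ins for real job traces.

```python
# SVM behavior prediction with confusion-matrix evaluation (synthetic data).
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((200, 3))          # e.g. stochasticity, linearity, stationarity
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # dummy behavior class

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
print(confusion_matrix(y_te, svm.predict(X_te)))
```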

Journal ArticleDOI
TL;DR: This paper presents the application of MDA (Model Driven Architecture) to generate, from a UML model, code following the MVP (Model-View-Presenter) pattern for a RIA, using the standard MOF 2.0 QVT (Meta-Object Facility 2.0 Query-View-Transformation) as the transformation language.
Abstract: The continuing evolution of business needs and technology makes Web applications more demanding in terms of development, usability and the interactivity of their user interfaces. To cope with this complexity, several frameworks have emerged, and a new type of Web application called the RIA (Rich Internet Application) has recently appeared, providing richer and more efficient graphical components similar to desktop applications. Given this diversity of solutions, the generation of code based on UML models has become important. This paper presents the application of MDA (Model Driven Architecture) to generate, from a UML model, code following the MVP (Model-View-Presenter) pattern for a RIA, using the standard MOF 2.0 QVT (Meta-Object Facility 2.0 Query-View-Transformation) as the transformation language. We adopt GWT (Google Web Toolkit) to create a target meta-model for generating an entire GWT-based web application. The transformation rules defined in this paper can generate, from the class diagram, an XML file containing the Views, the Models and the Presenter; this file can then be used to generate the necessary code for a RIA.

Journal ArticleDOI
TL;DR: Time series prediction techniques are used to estimate RUL from an established degradation index, with two variants of multi-step time series prediction, namely hybrid ANN-DES and Enhanced Double Exponential Smoothing (EDES).
Abstract: Prognostics has progressed in the last few years as a specific function: it provides remaining useful lifetime (RUL) estimation of the targeted equipment or component, which production or maintenance personnel can use to plan preventive maintenance actions in advance. To obtain an accurate RUL for predicting future failure, RUL estimation must depend on the current condition of the equipment. However, existing prognostic works use historical run-to-failure data and simulation-based models, which makes it difficult to predict future failure occurrence from the equipment's current level of degradation. Therefore, this paper reports the use of time series prediction techniques for estimating RUL from an established degradation index. Artificial Neural Network (ANN) time series and Double Exponential Smoothing (DES) approaches, with some modifications, are used to carry out the prediction steps. The modifications yield two multi-step time series prediction variants, namely hybrid ANN-DES and Enhanced Double Exponential Smoothing (EDES). All the techniques are compared and evaluated to investigate prediction accuracy based on RMSE. The results show that EDES provides a better solution for RUL estimation than the other techniques.
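
Below is a sketch of standard (Holt's) double exponential smoothing used for multi-step degradation forecasting, with a toy RUL read-off against a failure threshold. The smoothing constants, degradation series and threshold are illustrative; the paper's EDES modification is not reproduced here.

```python
# Double exponential smoothing forecast + RUL read-off (toy values).

def des_forecast(series, alpha=0.5, beta=0.3, horizon=20):
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)   # smoothed level
        trend = beta * (level - prev_level) + (1 - beta) * trend  # smoothed trend
    return [level + (h + 1) * trend for h in range(horizon)]

degradation = [0.10, 0.12, 0.15, 0.19, 0.24, 0.30, 0.37, 0.45]  # index per cycle
forecast = des_forecast(degradation)
threshold = 0.8                                     # assumed failure level
rul = next((h + 1 for h, v in enumerate(forecast) if v >= threshold), None)
print("estimated RUL (cycles):", rul)
```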

Journal ArticleDOI
TL;DR: A new approach is proposed for modeling the composition of JADE agent services, and thus of web services, using the Multi-Agent Reactive Decisional System (MARDS) model.
Abstract: Web services have gained popularity for enabling universal interoperability among applications. To answer the complex service requirements of users, composite web services have to be constructed correctly and effectively, and various approaches have been used for web service composition. Among these approaches, we focus on the one that allows software agents to access and control Web services, where integration between agent and web service platforms is important. For this purpose, the WS2JADE toolkit, developed at the Centre for Intelligent Agent and Multi-Agent Systems, allows the deployment of Web services as JADE (Java Agent DEvelopment Framework) agent services at run time. Composition of web services is thereby reduced to composition of JADE agent services. In this paper we propose a new approach for modeling the composition of JADE agent services, and thus of web services, using the Multi-Agent Reactive Decisional System (MARDS) model.

Journal ArticleDOI
TL;DR: The technique of automatic medical image annotation/classification to enhance the accuracy of CBIR systems is proposed and suggestions for the future of medical domains to solve the discussed disadvantages are introduced.
Abstract: Content-based image retrieval (CBIR) has been one of the most interesting research areas of computer vision in recent years. CBIR uses feature comparison to search, browse, and retrieve images from large databases. It is a major research topic in the medical field because of the daily increase in the number of recorded medical images. CBIR can help physicians diagnose various types of diseases and supports medical reference, teaching, training and research. This paper introduces the main concepts, stages, and methods of CBIR systems; explores the main types of features that can be extracted from images; and identifies techniques to evaluate the systems. It also reviews and evaluates the advantages and disadvantages of several existing commercial and academic content-based medical image retrieval systems. Finally, the paper offers suggestions for the future of medical domains to address the discussed disadvantages, proposing automatic medical image annotation/classification to enhance the accuracy of CBIR systems.

Journal ArticleDOI
Youssef Zaitar
TL;DR: A new model of the ERP project life cycle is proposed on the one hand, and the FMEA method is applied to classify risks by their criticality on the other, through a case study inside a large enterprise experienced in implementing such systems.
Abstract: Like any project, an ERP (Enterprise Resource Planning) project presents a multitude of risks that have to be acknowledged so that we can identify and avoid them before suffering their consequences; as the proverb says, "prevention is better than cure". However, risk management within ERP projects has been widely recognized as a very complicated task by both academics and practitioners, and these failure factors have generally been underestimated by decision makers and project managers. This article proposes a new model of the ERP project life cycle on the one hand, and applies the FMEA method to classify risks by their criticality on the other, through a case study inside a large enterprise experienced in implementing such systems.
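
A minimal sketch of FMEA risk classification as applied above follows: each risk is scored for severity, occurrence and detection (conventionally 1-10) and ranked by the Risk Priority Number RPN = S x O x D. The example risks and scores are invented, not taken from the case study.

```python
# FMEA criticality ranking via Risk Priority Number (invented example risks).

risks = [
    {"risk": "inadequate user training", "S": 7, "O": 6, "D": 4},
    {"risk": "poor vendor support",      "S": 8, "O": 4, "D": 5},
    {"risk": "data migration errors",    "S": 9, "O": 3, "D": 6},
]

for r in risks:
    r["RPN"] = r["S"] * r["O"] * r["D"]   # RPN = severity x occurrence x detection

for r in sorted(risks, key=lambda r: r["RPN"], reverse=True):
    print(f'{r["risk"]:28s} RPN={r["RPN"]}')
```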