
Showing papers in "International Journal of Information System Modeling and Design in 2017"


Journal ArticleDOI
TL;DR: This article describes how sequential data modeling is a relevant task in Cybersecurity, where sequences are attributed temporal characteristics either explicitly or implicitly.
Abstract: This article describes how sequential data modeling is a relevant task in Cybersecurity. Sequences are attributed temporal characteristics either explicitly or implicitly. Recurrent neural networks (RNNs) are a subset of artificial neural networks (ANNs) which have emerged as a powerful, principled approach to learning dynamic temporal behaviors in arbitrary-length, large-scale sequence data. Furthermore, stacked recurrent neural networks (S-RNNs) have the potential to learn complex temporal behaviors quickly, including sparse representations. To leverage this, the authors model network traffic as a time series, particularly transmission control protocol / internet protocol (TCP/IP) packets in a predefined time range, with a supervised learning method, using millions of known good and bad network connections. To find the best architecture, the authors complete a comprehensive review of various RNN architectures with their network parameters and network structures. As a test bed, they use the existing benchmark Defense Advanced Research Projects Agency (DARPA) / Knowledge Discovery and Data Mining (KDD) Cup '99 intrusion detection (ID) contest data set to show the efficacy of these various RNN architectures. All experiments with deep learning architectures are run for up to 1000 epochs with a learning rate in the range [0.01-0.5] on GPU-enabled TensorFlow, and experiments with traditional machine learning algorithms are done using Scikit-learn. The family of RNN architectures achieved a low false positive rate in comparison to the traditional machine learning classifiers. The primary reason is that RNN architectures are able to store information for long-term dependencies over time-lags and to adjust to successive connection sequence information. In addition, the effectiveness of RNN architectures is shown on the UNSW-NB15 data set.
Keywords: Deep Learning (DL) Approaches, Gated Recurrent Unit (GRU), Intrusion Detection (ID) Data Sets, KDDCup '99, Long Short-Term Memory (LSTM), Machine Learning (ML), Recurrent Neural Network (RNN), UNSW-NB15
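As a rough illustration of the recurrence the abstract describes, here is a minimal NumPy sketch of a vanilla RNN scoring one connection sequence. The dimensions, the randomly initialised weights, and the synthetic packet features are hypothetical placeholders, not the authors' trained TensorFlow models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: 10 time steps (packets), 5 features per packet, 8 hidden units.
T, F, H = 10, 5, 8

# Randomly initialised weights stand in for trained parameters.
W_xh = rng.normal(scale=0.1, size=(F, H))   # input-to-hidden
W_hh = rng.normal(scale=0.1, size=(H, H))   # hidden-to-hidden (recurrence)
b_h = np.zeros(H)
w_o = rng.normal(scale=0.1, size=H)         # hidden-to-output
b_o = 0.0

def rnn_classify(sequence):
    """Run a vanilla RNN over a (T, F) sequence and return P(attack)."""
    h = np.zeros(H)
    for x_t in sequence:                    # one hidden-state update per packet
        h = np.tanh(x_t @ W_xh + h @ W_hh + b_h)
    logit = h @ w_o + b_o
    return 1.0 / (1.0 + np.exp(-logit))     # sigmoid -> probability

connection = rng.normal(size=(T, F))        # synthetic TCP/IP feature sequence
p_attack = rnn_classify(connection)
```

The shared `W_hh` matrix is what lets the network carry information across time-lags; LSTM and GRU cells, as compared in the article, replace the plain `tanh` update with gated updates to ease learning long-term dependencies.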

59 citations


Journal ArticleDOI
TL;DR: A review of nature-inspired computing techniques, such as Genetic Algorithms, Ant Colony Optimization, Particle Swarm Optimization and Artificial Bee Colony, applied to disease diagnosis.
Abstract: Genetic Algorithms (GA), Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) are some vital nature-inspired computing (NIC) techniques. These approaches have be...

30 citations



Journal ArticleDOI
TL;DR: As wireless sensor networks (WSN) gain popularity, reliable delivery of data becomes more important; a trust-aware, energy-efficient opportunistic routing protocol is proposed.
Abstract: As wireless sensor networks (WSN) gain popularity, the need for reliable delivery of data packets becomes more important. Reliable delivery is only possible when the routing protocols are efficient and secure. Because of a lack of resources, it is not possible to use existing cryptosystems to provide security in WSN. However, trust-aware routing, which has become popular in the last three to four years, can provide security with fewer resources. In this paper, a new energy-efficient and trust-aware reliable opportunistic routing (TAEROR) protocol is proposed. The protocol consists of a trust metric and a relay selection algorithm. The trust metric detects malicious nodes on the basis of forwarding sincerity, energy consumption and acknowledgement sincerity. The relay selection algorithm prevents these malicious nodes from being selected in the routing process. The protocol is simulated and compared to existing trust-aware routing protocols, and the proposed TAEROR protocol presents better results than the other compared protocols.
Keywords: Energy Efficiency, Opportunistic Routing, Sensor, Trust, WSN
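The trust metric and relay selection step can be sketched as follows. The component weights, the threshold, and the neighbour statistics are illustrative assumptions; the abstract names the three trust components but not their exact combination:

```python
# Hypothetical weighting of the three trust components named in the abstract:
# forwarding sincerity, energy consumption, and acknowledgement sincerity.
def trust_score(forwarded, expected, energy_left, acks_sent, acks_due,
                w_f=0.4, w_e=0.2, w_a=0.4):
    f = forwarded / expected if expected else 1.0   # forwarding sincerity
    a = acks_sent / acks_due if acks_due else 1.0   # acknowledgement sincerity
    return w_f * f + w_e * energy_left + w_a * a

def select_relays(neighbours, threshold=0.6):
    """Keep only neighbours whose trust exceeds the threshold, best first."""
    trusted = {n: s for n, s in neighbours.items() if s >= threshold}
    return sorted(trusted, key=trusted.get, reverse=True)

neighbours = {
    "n1": trust_score(90, 100, 0.8, 85, 90),   # honest, well-charged node
    "n2": trust_score(10, 100, 0.9, 5, 90),    # drops packets (malicious)
}
relays = select_relays(neighbours)             # the packet-dropping node is excluded
```

Filtering before relay ranking is what keeps malicious nodes out of the opportunistic forwarding set.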

11 citations


Journal ArticleDOI
TL;DR: This article describes how swarm intelligence and bio-inspired techniques shape the latest algorithms, and discusses Ant Colony Optimization, Cuckoo Search, the Firefly Algorithm, Genetic Algorithms and Particle Swarm Optimization.
Abstract: This article describes how swarm intelligence (SI) and bio-inspired techniques shape in-vogue topics in the development of the latest algorithms. These algorithms can work on the basis of SI, using physical, chemical and biological frameworks. The authors classify these algorithms as SI-based, biology-inspired, physics-inspired or chemistry-inspired, according to the basic concept behind each algorithm. A few of these algorithms have proven exceptionally effective and have consequently become mainstream tools for solving real-world problems. The purpose of this survey is to present a reasonably complete list of these algorithms in order to boost research on them. This article discusses Ant Colony Optimization (ACO), the Cuckoo Search, the Firefly Algorithm, Particle Swarm Optimization and Genetic Algorithms in detail. For ACO, a real-time problem known as the Travelling Salesman Problem is considered, while for the other algorithms a min-sphere problem is considered, which is well known for the comparison of swarm techniques.
Keywords: Ant Colony Optimization, Cuckoo Search, Firefly Algorithm, Genetic Algorithm, Particle Swarm Optimization, Swarm Intelligence
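As a concrete taste of the sphere-style benchmark mentioned above, here is a minimal Particle Swarm Optimization sketch in NumPy. The swarm size, inertia and acceleration coefficients are common textbook choices, not values from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return float(np.sum(x * x))          # global minimum 0 at the origin

# Standard PSO: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
n, dim, iters = 20, 3, 100
w, c1, c2 = 0.7, 1.5, 1.5
x = rng.uniform(-5, 5, (n, dim))         # particle positions
v = np.zeros((n, dim))                   # particle velocities
pbest = x.copy()                         # personal bests
pbest_val = np.array([sphere(p) for p in x])
g = pbest[pbest_val.argmin()].copy()     # global best

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = x + v
    vals = np.array([sphere(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    g = pbest[pbest_val.argmin()].copy()

best = sphere(g)                         # converges close to 0
```

The same loop structure, with different move rules, underlies the firefly and cuckoo-search variants the article surveys.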

8 citations


Journal ArticleDOI
TL;DR: The experimental results show that the proposed Curvelet Transform watermarking scheme outperforms existing schemes and can be employed for both audio and speech watermarking.
Abstract: In this article, a watermarking scheme using the Curvelet Transform in combination with compressive sensing (CS) theory is proposed for the protection of digital audio signals. The curvelet coefficients of the host audio signal are modified according to CS measurements of the watermark data. The CS measurements of the watermark data are generated using CS theory processes and sparse coefficients (wavelet coefficients of DCT coefficients). The proposed scheme can be employed for both audio and speech watermarking. A grayscale watermark image is inserted into the host digital audio signal when the scheme is used for audio watermarking, and a speech signal is inserted when it is employed for speech watermarking. The experimental results show that the proposed scheme performs better than existing watermarking schemes in terms of perceptual transparency.
Keywords: Audio Signal, Compressive Sensing (CS), Discrete Cosine Transform (DCT), Discrete Wavelet Transform (DWT), Fast Discrete Curvelet Transform (FDCuT), Speech Signal, Watermarking
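The curvelet/CS pipeline itself requires specialised libraries, but the basic embed-and-detect idea behind audio watermarking can be sketched with a much-simplified additive spread-spectrum watermark in NumPy. The signal, embedding strength and detection threshold are illustrative assumptions, not the paper's scheme:

```python
import numpy as np

rng = np.random.default_rng(2)

host = rng.normal(size=4096)              # stand-in for the host audio signal
pn = rng.choice([-1.0, 1.0], size=4096)   # secret pseudo-random watermark sequence
alpha = 0.1                               # embedding strength (imperceptibility knob)

# Additive embedding: a faint copy of the sequence rides on the host.
watermarked = host + alpha * pn

def detect(signal, sequence, threshold=0.05):
    """Blind detection: correlate the signal with the known sequence.
    The correlation concentrates near alpha when the mark is present,
    near zero when it is absent."""
    score = float(np.mean(signal * sequence))
    return score > threshold

present = detect(watermarked, pn)   # True: mark detected
absent = detect(host, pn)           # False: unmarked host
```

The paper applies the same contrast, modification versus detection, in the curvelet domain, using CS measurements of the watermark instead of a raw pseudo-random sequence.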

6 citations




Journal ArticleDOI
TL;DR: This article describes how e-commerce has become so vast that almost every product and service can be bought online, and proposes a feature-based summarization system for the resulting flood of online customer reviews.
Abstract: This article describes how e-commerce has become so vast that almost every product and service can be purchased online and delivered to our doorsteps. This has led to a striking increase in the number of online customers. In an attempt to make online shopping more appealing and transparent, e-retailers allow their customers to express opinions about the purchased products and services. Recently, the analysis of such online reviews has become an active topic of research, because it is of immense concern to various stakeholders, viz. online merchants, potential customers, and the manufacturers of the particular product or service. The present article addresses the problem of summarizing such opinions and aims to create an organized, feature-based summary as a solution. The proposed system depends on the frequency of occurrence of potential features: a number of pruning methods are applied to obtain the final feature set, and sentiment analysis is performed for each such feature.
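The frequency-plus-pruning step can be sketched on toy reviews. The stop-word list and minimum-support value are hypothetical simplifications of the pruning methods the abstract mentions:

```python
from collections import Counter

reviews = [
    "the battery life is great and the screen is sharp",
    "battery drains fast but the screen looks fine",
    "great camera, average battery",
    "the camera struggles in low light",
]

# Non-feature words to prune (a real system would use POS tagging and
# richer pruning; this list is a toy stand-in).
STOP = {"the", "is", "and", "but", "in", "a", "fast", "great", "sharp",
        "looks", "fine", "average", "struggles", "low", "light", "drains"}

# Count candidate feature words across all reviews.
counts = Counter(w.strip(",.") for r in reviews
                 for w in r.split() if w not in STOP)

# Minimum-support pruning: keep features mentioned often enough.
MIN_SUPPORT = 2
features = {w for w, c in counts.items() if c >= MIN_SUPPORT}
# "life" appears only once and is pruned; battery/screen/camera survive.
```

Each surviving feature would then anchor a sentiment pass over the sentences that mention it.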

2 citations


Journal ArticleDOI
TL;DR: A framework, Autofunk, tests production systems by combining model generation and passive testing; it combines an expert system, formal models and machine learning to infer symbolic models while preventing over-generalisation.
Abstract: Many software engineering approaches rely on formal models to automate some steps of the software life cycle, particularly the testing phase. Even though automation sounds attractive, writing models is usually a tedious and error-prone task. In addition, with industrial software systems, models are often not up-to-date, so testing these systems becomes problematic. In this context, this article proposes a framework called Autofunk to test production systems by combining two approaches: model generation and passive testing. Given a large set of events collected from a production system, Autofunk combines an expert system, formal models and machine learning to infer symbolic models while preventing over-generalisation. Afterwards, these models are used to passively test whether another system conforms to them. As the generated models do not express all the possible behaviours that should happen, we define conformance with four specialised implementation relations.
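The passive-testing half of this idea reduces, in its simplest form, to checking observed traces against the inferred model. The traces and the single membership relation below are hypothetical stand-ins for the paper's symbolic models and its four specialised implementation relations:

```python
# Traces inferred from production events (toy stand-in for a symbolic model).
model_traces = {
    ("login", "query", "logout"),
    ("login", "logout"),
}

def conforms(observed_trace, model):
    """Passive check: no stimulation of the system under test; we only ask
    whether a trace observed on it is a behaviour the model exhibits."""
    return tuple(observed_trace) in model

ok = conforms(["login", "query", "logout"], model_traces)   # known behaviour
bad = conforms(["query", "login"], model_traces)            # unseen ordering
```

Because inferred models are incomplete, a practical relation must be weaker than plain membership, which is exactly why the paper defines several specialised conformance relations instead of one.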

2 citations


Journal ArticleDOI
TL;DR: A Gaussian/bilateral filtering approach combined with wavelet thresholding is proposed for image denoising and outperforms the bilateral filter and the Non-Local Means Filter under objective quality assessment.
Abstract: Denoising, the process of eliminating noise from a noisy image, is one of the important aspects of image processing applications. In most cases, noise accumulates at the edges, so preventing noise at edges is one of the most prominent problems. Numerous edge-preserving approaches are available to reduce noise at edges; among them, the Gaussian filter, the bilateral filter and non-local means filtering are popular, but the resulting denoised image suffers from blurring. To overcome these problems, this article proposes Gaussian/bilateral filtering (G/BF) with a wavelet thresholding approach for better image denoising. The performance of the proposed work is compared with edge-preserving filter algorithms such as the bilateral filter and the Non-Local Means Filter, in terms of objective quality assessment. From the simulation results, it is found that the proposed method is superior to both the bilateral filter and the Non-Local Means Filter.
Keywords: Bilateral Filter, DWT, Gaussian Filter, Image Denoising, Non-Local Means Filter, Thresholding, Quality Assessments
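The wavelet-thresholding stage can be illustrated in one dimension with a single-level Haar transform and soft thresholding. The step signal, noise level and threshold are illustrative choices, and this sketch omits the Gaussian/bilateral pre-filtering of the actual G/BF pipeline:

```python
import numpy as np

rng = np.random.default_rng(3)

# Clean piecewise-constant signal corrupted with Gaussian noise.
clean = np.repeat([0.0, 1.0, 0.0, 1.0], 64)
noisy = clean + rng.normal(scale=0.3, size=clean.size)

def haar_denoise(x, threshold):
    """One-level Haar transform, soft-threshold the detail coefficients,
    then invert -- a minimal stand-in for the wavelet-thresholding stage."""
    a = (x[0::2] + x[1::2]) / 2.0            # approximation coefficients
    d = (x[0::2] - x[1::2]) / 2.0            # detail coefficients (mostly noise)
    d = np.sign(d) * np.maximum(np.abs(d) - threshold, 0.0)  # soft threshold
    out = np.empty_like(x)
    out[0::2], out[1::2] = a + d, a - d      # inverse Haar transform
    return out

denoised = haar_denoise(noisy, threshold=0.25)
err_noisy = float(np.mean((noisy - clean) ** 2))
err_denoised = float(np.mean((denoised - clean) ** 2))  # lower than err_noisy
```

Shrinking small detail coefficients removes noise while large coefficients, which carry edges, survive; this is the edge-preservation argument behind combining thresholding with a smoothing filter.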


Journal ArticleDOI
TL;DR: An approach for generating model transformation rules at an abstract level, independent of any transformation language, using a matching technique on the elements of source and target metamodels.
Abstract: Model-driven engineering (MDE) is a paradigm based on the intensive use of models throughout the life cycle of an application, in which model transformation plays an important role. Various model transformation approaches have been proposed, but developers are still faced with the complexity of model transformation specifications. Most of these approaches are based on the specification of transformation rules in a concrete syntax at a low level, where the developer must master the transformation language. The question, then, is how to generate a model transformation specification at a very abstract level, independent of any transformation language. This article proposes an approach that generates an abstract representation of transformation rules, which is then used to produce source code in a chosen transformation language. The transformation rules are computed semi-automatically using a matching technique on elements of the source and target metamodels. The idea is illustrated with several transformation examples.
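The pipeline of metamodel matching, abstract rules, then code generation can be sketched on the classic class-to-table example. The metamodels, the hand-written correspondences, and the ATL-like output syntax are all illustrative assumptions, not the article's actual matching technique:

```python
# Hypothetical source and target metamodel elements (name -> attributes).
source_mm = {"Class": ["name", "attributes"], "Attribute": ["name", "type"]}
target_mm = {"Table": ["name", "columns"], "Column": ["name", "type"]}

# Hand-written correspondences standing in for the semi-automatic matching;
# a real matcher would score name and structural similarity.
matches = {"Class": "Table", "Attribute": "Column"}

# Abstract, transformation-language-independent rule representation.
rules = [{"from": s, "to": t} for s, t in matches.items()]

def to_atl(rule):
    """Render an abstract rule as a simplified ATL-like rule skeleton."""
    return (f"rule {rule['from']}2{rule['to']} {{\n"
            f"  from s : Source!{rule['from']}\n"
            f"  to   t : Target!{rule['to']}\n}}")

atl_code = "\n".join(to_atl(r) for r in rules)
```

Because the rules live in the abstract dictionary form, a second `to_qvt`-style renderer could target a different transformation language from the same rules, which is the point of keeping the specification language-independent.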


Journal ArticleDOI
TL;DR: Classification algorithms have emerged as strong meta-learning techniques for analyzing the masses of data generated by the widespread use of the internet and other sources; this article provides a comprehensive comparison of existing classification algorithms over big data classification frameworks and other novel frameworks.
Abstract: This article describes how classification algorithms have emerged as strong meta-learning techniques to accurately and efficiently analyze the masses of data generated by the widespread use of the internet and other sources. In particular, there is a need for a mechanism that classifies unstructured data into some organized form. Classification techniques over big transactional databases may provide the required data to users from large datasets in a more simplified way. With the intention of organizing and clearly representing the current state of classification algorithms for big data, the present article discusses various concepts and algorithms, along with an exhaustive review of existing classification algorithms over big data classification frameworks and other novel frameworks. The article provides a comprehensive comparison, from both a theoretical and an empirical perspective. The effectiveness of the candidate classification algorithms is measured through a number of performance criteria such as implementation technique, data source validation, and scalability.


Journal ArticleDOI
TL;DR: A formal framework for designing and specifying component-based systems, with component interfaces as the basic units of construction and specification supported by execution tools based on Maude.
Abstract: Constructing systems from components and building components for different systems require well-established methodologies and processes. This article proposes a formal framework for designing and specifying component-based systems (CBS). The two-dimensional evolutions of CBS are architectural reconfiguration and behavioral adaptation, when user requirements and/or runtime contexts change. This framework provides an incremental design methodology where component interfaces and their corresponding ports are the basic units of software construction, rather than components. Conceptually, interfaces serve to assemble simple components to obtain more complex ones. Behaviorally, they serve to propagate side effects of visible changes of a component on its neighbors. Interfaces also supply interactions and synchronization effects on the underlying sub-system. The calculation process is guided by changes on interfaces where the hierarchical structure of the underlying CBS is maintained. In this framework, CBS specification is supported by some execution tools based on Maude.

Journal ArticleDOI
TL;DR: A conceptual modeling method is agilely tailored to cover both run-time and design-time concerns, exposing its semantic space to model-driven systems as RDF knowledge graphs.
Abstract: Conceptual modeling is commonly employed for two classes of goals: (1) as input for run-time functionality (e.g., code generation) and (2) as support for design-time analysis (e.g., in business process management). An inherent trade-off manifests between such goals, as different levels of abstraction and semantic detail are needed. This has led to a multitude of modeling languages that are conceptually redundant (i.e., they share significant parts of their metamodels) and a dilemma of selecting the most adequate language for each goal. This article advocates replacing the selection dilemma with an approach where the modeling method is agilely tailored for the semantic variability required to cover both run-time and design-time concerns. The semantic space enabled by such a method is exposed to model-driven systems as RDF knowledge graphs, whereas the method evolution is managed with the Agile Modeling Method Engineering framework. The argument is grounded in the application area of Product-Service Systems, illustrated by a project-based modeling method.
Keywords: Agile Modeling Method Engineering, Domain-Specific Modelling Method, Knowledge Graphs, Metamodeling, Model-Aware Application, Product-Service Modeling, Resource Description Framework, Semantic Queries
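Exposing model content as an RDF knowledge graph can be illustrated with plain triples and a pattern query standing in for SPARQL. The `ex:` vocabulary and the product-service example are hypothetical, invented for illustration:

```python
# Model content exported as RDF-style (subject, predicate, object) triples.
triples = {
    ("ex:Drill", "rdf:type", "ex:Product"),
    ("ex:Maintenance", "rdf:type", "ex:Service"),
    ("ex:Drill", "ex:bundledWith", "ex:Maintenance"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a (subject, predicate, object) pattern;
    None acts as a wildcard, like a variable in a SPARQL triple pattern."""
    return {(ts, tp, to) for ts, tp, to in triples
            if s in (None, ts) and p in (None, tp) and o in (None, to)}

# "Which model elements are services?" -- a semantic query a model-aware
# application could run without knowing the modeling language itself.
services = query(p="rdf:type", o="ex:Service")
```

This decoupling is what lets run-time, model-aware applications consume design-time models through graph queries rather than through each language's metamodel.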