Journal ArticleDOI

Evaluation of Recurrent Neural Network and its Variants for Intrusion Detection System (IDS)

TLDR
This article describes how sequential data modeling is a relevant task in Cybersecurity, where sequences are attributed temporal characteristics either explicitly or implicitly.
Abstract
This article describes how sequential data modeling is a relevant task in Cybersecurity. Sequences are attributed temporal characteristics either explicitly or implicitly. Recurrent neural networks (RNNs) are a subset of artificial neural networks (ANNs) which have appeared as a powerful, principled approach to learn dynamic temporal behaviors in an arbitrary length of large-scale sequence data. Furthermore, stacked recurrent neural networks (S-RNNs) have the potential to learn complex temporal behaviors quickly, including sparse representations. To leverage this, the authors model network traffic as a time series, particularly transmission control protocol/internet protocol (TCP/IP) packets in a predefined time range, with a supervised learning method, using millions of known good and bad network connections. To find the best architecture, the authors complete a comprehensive review of various RNN architectures with their network parameters and network structures. As a testbed, they use the existing benchmark Defense Advanced Research Projects Agency/Knowledge Discovery and Data Mining (DARPA/KDD) Cup '99' intrusion detection (ID) contest data set to show the efficacy of these various RNN architectures. All the experiments with deep learning architectures are run for up to 1,000 epochs with a learning rate in the range [0.01-0.5] on GPU-enabled TensorFlow, and experiments with traditional machine learning algorithms are done using Scikit-learn. The families of RNN architectures achieved a low false positive rate in comparison to the traditional machine learning classifiers. The primary reason is that RNN architectures are able to store information for long-term dependencies over time-lags and to adjust with successive connection sequence information. In addition, the effectiveness of RNN architectures is shown for the UNSW-NB15 data set.
KEYWORDS
Deep Learning (DL) Approaches, Gated Recurrent Unit (GRU), Intrusion Detection (ID) Data Sets, KDDCup '99', Long Short-Term Memory (LSTM), Machine Learning (ML), Recurrent Neural Network (RNN), UNSW-NB15
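The training setup described in the abstract can be illustrated with a minimal sketch. The 41 KDD Cup '99 connection features, the binary good/bad labels, the learning-rate range [0.01-0.5], and the use of GPU-enabled TensorFlow come from the abstract; the window length, layer sizes, optimizer choice, and the stacked two-layer LSTM are illustrative assumptions, not the authors' exact architecture.

```python
# Minimal sketch (not the authors' exact code): a stacked LSTM classifier over
# KDD Cup '99-style connection records, treating successive connections in a
# window as a time series so the recurrent layers can model temporal structure.
import numpy as np
import tensorflow as tf

NUM_FEATURES = 41   # KDD Cup '99 connection records expose 41 features
TIME_STEPS = 10     # assumed sliding-window length over successive connections

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIME_STEPS, NUM_FEATURES)),
    tf.keras.layers.LSTM(64, return_sequences=True),   # stacked RNN (S-RNN)
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="sigmoid"),    # good vs. bad connection
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),  # abstract reports rates in [0.01-0.5]
    loss="binary_crossentropy",
    metrics=["accuracy", tf.keras.metrics.FalsePositives()],
)

# Placeholder data: X has shape (windows, TIME_STEPS, NUM_FEATURES),
# y is 0 for normal and 1 for attack connections.
X = np.random.rand(256, TIME_STEPS, NUM_FEATURES).astype("float32")
y = np.random.randint(0, 2, size=(256,)).astype("float32")
model.fit(X, y, epochs=5, batch_size=32)  # the paper trains for up to 1,000 epochs
```

Swapping the LSTM layers for `tf.keras.layers.GRU` or `tf.keras.layers.SimpleRNN` gives the other RNN variants compared in the article under the same setup.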


Citations
Proceedings ArticleDOI

Evaluating Shallow and Deep Neural Networks for Network Intrusion Detection Systems in Cyber Security

TL;DR: DNNs have been utilized to predict attacks on a Network Intrusion Detection System (N-IDS), and it is concluded that a DNN with 3 layers outperforms all the other classical machine learning algorithms.
Journal ArticleDOI

SDN-Enabled Hybrid DL-Driven Framework for the Detection of Emerging Cyber Threats in IoT

TL;DR: This work presents an SDN-enabled architecture leveraging hybrid deep learning detection algorithms for the efficient detection of cyber threats and attacks while considering the resource-constrained IoT devices so that no burden is placed on them.
Journal ArticleDOI

GAN augmentation to deal with imbalance in imaging-based intrusion detection

TL;DR: A deep learning methodology for the binary classification of the network traffic that leads to better predictive accuracy when compared to competitive intrusion detection architectures on four benchmark datasets is illustrated.
Journal ArticleDOI

Intrusion Detection System Using Voting-Based Neural Network

TL;DR: Experimental results over KDDCUP'99 and CTU-13, two well-known and widely employed datasets, revealed that the voting procedure was highly effective in increasing system performance, reducing false alarms by up to 75% in comparison with the original deep learning models.
Journal ArticleDOI

Intrusion detection systems using classical machine learning techniques vs integrated unsupervised feature learning and deep neural network

TL;DR: A performance comparison of classical machine learning approaches that require extensive feature engineering versus integrated unsupervised feature learning and deep neural networks on the NSL-KDD dataset finds that a DNN using 15 features extracted with Principal Component Analysis (PCA) was the most effective modeling method.
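A minimal sketch of the PCA-plus-DNN pipeline summarized above, using scikit-learn: the component count of 15 comes from the summary, while the scaler, hidden-layer sizes, and placeholder data are illustrative assumptions rather than the cited paper's configuration.

```python
# Illustrative PCA + feed-forward network pipeline for NSL-KDD-style data
# (assumed setup, not the cited paper's code).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: X is a numerically encoded feature matrix, y labels
# connections as 0 = normal, 1 = attack.
X = np.random.rand(500, 41)
y = np.random.randint(0, 2, size=500)

pipeline = make_pipeline(
    StandardScaler(),
    PCA(n_components=15),                                  # 15 PCA features, as in the summary
    MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300),
)
pipeline.fit(X, y)
print(pipeline.score(X, y))
```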
References
Journal ArticleDOI

Long short-term memory

TL;DR: A novel, efficient, gradient based method called long short-term memory (LSTM) is introduced, which can learn to bridge minimal time lags in excess of 1000 discrete-time steps by enforcing constant error flow through constant error carousels within special units.
Journal Article

Scikit-learn: Machine Learning in Python

TL;DR: Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems, focusing on bringing machine learning to non-specialists using a general-purpose high-level language.
Journal ArticleDOI

Deep learning

TL;DR: Deep learning is making major advances in solving problems that have resisted the best attempts of the artificial intelligence community for many years, and will have many more successes in the near future because it requires very little engineering by hand and can easily take advantage of increases in the amount of available computation and data.
Book

Deep Learning

TL;DR: Deep learning is presented as a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts; it is used in many applications such as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and video games.
Journal Article

Visualizing Data using t-SNE

TL;DR: A new technique called t-SNE, a variation of Stochastic Neighbor Embedding that is much easier to optimize, visualizes high-dimensional data by giving each datapoint a location in a two- or three-dimensional map and produces significantly better visualizations by reducing the tendency to crowd points together in the center of the map.
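A minimal sketch of applying t-SNE with scikit-learn to project high-dimensional connection features into 2-D for visual inspection; the perplexity value and placeholder feature matrix are illustrative assumptions.

```python
# Assumed usage sketch (not the cited paper's code): embed high-dimensional
# intrusion-detection features into a 2-D map with t-SNE.
import numpy as np
from sklearn.manifold import TSNE

X = np.random.rand(300, 41)                                 # placeholder feature matrix
embedding = TSNE(n_components=2, perplexity=30.0, init="pca").fit_transform(X)
print(embedding.shape)                                      # (300, 2)
```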