Author

Aman Jantan

Bio: Aman Jantan is an academic researcher from Universiti Sains Malaysia. The author has contributed to research in topics: Intrusion detection system & Encryption. The author has an h-index of 17 and has co-authored 76 publications receiving 1,512 citations.


Papers
Journal ArticleDOI
01 Nov 2018-Heliyon
TL;DR: The study found that neural-network models such as feedforward and feedback propagation artificial neural networks perform better in their application to human problems, and it proposes feedforward and feedback propagation ANN models as a research focus based on data analysis factors such as accuracy, processing speed, latency, fault tolerance, volume, scalability, convergence, and performance.

1,471 citations

Journal ArticleDOI
TL;DR: There is a need for a state-of-the-art review of neural network applications to PR that urgently addresses the highlighted problems, and for research to focus on current models and the development of new models concurrently for more success in the field.
Abstract: The era of artificial neural networks (ANN) began with simplified applications in many fields and remarkable success in pattern recognition (PR), even in manufacturing industries. Although significant progress has been achieved and surveyed in addressing ANN application to PR challenges, some problems are yet to be resolved, such as whimsical orientation (the unknown path that cannot be accurately calculated due to its directional position). Other problems include object classification, location, scaling, the analysis of neuron behavior in hidden layers, rule matching, and template matching. Also, the lack of extant literature on the issues associated with ANN application to PR seems to slow down research focus and progress in the field. Hence, there is a need for a state-of-the-art review of neural network applications to PR that urgently addresses the problems highlighted above. The study furnishes readers with a clearer understanding of current and new trends in ANN models that effectively address PR challenges, to guide research focus and topics. Similarly, the comprehensive review reveals the diverse areas of success of ANN models and their application to PR. In evaluating the performance of ANN models, statistical indicators adopted in many studies were used, such as mean absolute percentage error (MAPE), mean absolute error (MAE), root mean squared error (RMSE), and variance of absolute percentage error (VAPE). The results show that current ANN models such as GAN, SAE, DBN, RBM, RNN, RBFN, PNN, CNN, SLP, MLP, MLNN, reservoir computing, and Transformer models perform excellently in their application to PR tasks. Therefore, the study recommends focusing research on current models and developing new models concurrently for more success in the field.
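As a rough illustration of the evaluation indicators the abstract names (MAE, RMSE, MAPE, and VAPE), the sketch below computes them in Python with NumPy; the function name and toy data are illustrative and not taken from the paper.

import numpy as np

def ann_error_metrics(y_true, y_pred):
    # Illustrative computation of the indicators named in the abstract:
    # MAE, RMSE, MAPE, and VAPE (variance of absolute percentage error).
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    abs_err = np.abs(y_true - y_pred)
    ape = 100.0 * abs_err / np.abs(y_true)   # absolute percentage error per sample
    return {
        "MAE": abs_err.mean(),
        "RMSE": np.sqrt(((y_true - y_pred) ** 2).mean()),
        "MAPE": ape.mean(),
        "VAPE": ape.var(),
    }

# Example: comparing a model's predictions against ground truth
print(ann_error_metrics([10.0, 12.5, 9.0], [9.5, 13.0, 9.2]))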

217 citations

01 Jan 2008
TL;DR: A block-based transformation algorithm is introduced that combines image transformation with the well-known Blowfish encryption and decryption algorithm; the results showed that the correlation between image elements was significantly decreased by the proposed technique.
Abstract: Encryption is used to securely transmit data in open networks. Each type of data has its own features; therefore, different techniques should be used to protect confidential image data from unauthorized access. Most of the available encryption algorithms are mainly used for textual data and may not be suitable for multimedia data such as images. In this paper, we introduce a block-based transformation algorithm based on the combination of image transformation and a well-known encryption and decryption algorithm called Blowfish. The original image was divided into blocks, which were rearranged into a transformed image using a transformation algorithm presented here, and then the transformed image was encrypted using the Blowfish algorithm. The results showed that the correlation between image elements was significantly decreased by using the proposed technique. The results also show that increasing the number of blocks by using smaller block sizes resulted in a lower correlation and higher entropy.
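A minimal sketch of the transform-then-encrypt idea described above, assuming PyCryptodome for the Blowfish cipher; the seeded block shuffle is an illustrative stand-in for the paper's transformation algorithm, whose exact construction is not reproduced here.

import random
from Crypto.Cipher import Blowfish

def transform_then_encrypt(image_bytes: bytes, block_size: int, key: bytes, seed: int) -> bytes:
    # 1. Split the image data into fixed-size blocks.
    blocks = [image_bytes[i:i + block_size]
              for i in range(0, len(image_bytes), block_size)]
    # 2. Rearrange the blocks with a seeded (secret) permutation; this is what
    #    lowers the correlation between neighboring pixels.
    order = list(range(len(blocks)))
    random.Random(seed).shuffle(order)
    transformed = b"".join(blocks[i] for i in order)
    # 3. Encrypt the transformed image with Blowfish (ECB mode, 8-byte cipher blocks),
    #    padding the data to a multiple of 8 bytes.
    padded = transformed + bytes((-len(transformed)) % 8)
    return Blowfish.new(key, Blowfish.MODE_ECB).encrypt(padded)

Smaller block sizes mean more blocks to permute, which is consistent with the paper's observation that finer blocks give lower correlation and higher entropy.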

179 citations

01 Jan 2008
TL;DR: A new permutation technique based on the combination of image permutation and a well-known encryption algorithm called Rijndael is introduced, which showed that the correlation between image elements was significantly decreased by using the combination technique and higher entropy was achieved.
Abstract: Data encryption is widely used to ensure security in open networks such as the internet. Each type of data has its own features; therefore, different techniques should be used to protect confidential image data from unauthorized access. Most of the available encryption algorithms are used for text data; however, due to large data sizes and real-time constraints, algorithms that are good for textual data may not be suitable for multimedia data. In most natural images the values of neighboring pixels are strongly correlated, which means that the value of any given pixel can be reasonably predicted from the values of its neighbors. In this paper, we introduce a new permutation technique based on the combination of image permutation and the well-known Rijndael encryption algorithm. The original image was divided into 4 × 4 pixel blocks, which were rearranged into a permuted image using the permutation process presented here, and the generated image was then encrypted using the Rijndael algorithm. The results showed that the correlation between image elements was significantly decreased by using the combination technique and higher entropy was achieved.
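As a small illustration of the two measurements reported in this and the previous paper, the sketch below estimates the correlation between horizontally adjacent pixels and the Shannon entropy of the pixel histogram for a grayscale image; the function names are illustrative and not from the paper.

import numpy as np

def adjacent_pixel_correlation(image: np.ndarray) -> float:
    # Correlation coefficient between each pixel and its right-hand neighbor;
    # close to 1 for natural images, close to 0 after a good permutation/encryption.
    left = image[:, :-1].reshape(-1).astype(float)
    right = image[:, 1:].reshape(-1).astype(float)
    return float(np.corrcoef(left, right)[0, 1])

def pixel_entropy(image: np.ndarray) -> float:
    # Shannon entropy (in bits) of the 8-bit pixel value distribution;
    # approaches 8 bits for a well-encrypted image.
    counts = np.bincount(image.reshape(-1), minlength=256)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())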

74 citations

01 Jan 2008
TL;DR: The approach uses Least Significant Bit (LSB) insertion to hide data within encrypted image data; the hidden data enable the receiver to reconstruct the secret transformation table after extraction, so the original image can be recovered by inverting the transformation and encryption processes.
Abstract: A new steganography approach for data hiding is proposed. This approach uses Least Significant Bit (LSB) insertion to hide data within encrypted image data. The binary representation of the hidden data is used to overwrite the LSB of each byte within the encrypted image randomly. Experimental results show that the correlation and entropy values of the encrypted image before the insertion are similar to the values of correlation and entropy after the insertion. Since the correlation and entropy have not changed, the method offers good concealment for data in the encrypted image and reduces the chance of the encrypted image being detected. The hidden data enable the receiver to reconstruct the same secret transformation table after extracting it, and hence the original image can be reproduced by inverting the transformation and encryption processes.
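A minimal sketch of LSB embedding in the spirit of this approach: each bit of the hidden payload overwrites the least significant bit of one byte of the already-encrypted image. The byte positions below are sequential for clarity, whereas the paper overwrites bytes randomly; a keyed pseudo-random ordering would replace the sequential indices in that case. Function names are illustrative.

def embed_lsb(encrypted_image: bytes, payload: bytes) -> bytearray:
    # Expand the payload into individual bits, most significant bit first.
    bits = [(byte >> shift) & 1 for byte in payload for shift in range(7, -1, -1)]
    if len(bits) > len(encrypted_image):
        raise ValueError("payload too large for the cover image")
    stego = bytearray(encrypted_image)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # clear the LSB, then write the payload bit
    return stego

def extract_lsb(stego_image: bytes, n_bytes: int) -> bytes:
    # Collect the LSBs back and repack them into bytes, MSB first.
    bits = [stego_image[i] & 1 for i in range(n_bytes * 8)]
    return bytes(sum(b << (7 - j) for j, b in enumerate(bits[i:i + 8]))
                 for i in range(0, len(bits), 8))

Because only the least significant bits change, the pixel-value histogram of the encrypted cover is barely perturbed, which is why the correlation and entropy measurements stay essentially unchanged after insertion.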

70 citations


Cited by
Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).

13,246 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions and linear models for regression and classification are given in this article, along with a discussion of combining models in the context of machine learning and classification.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations
