
Showing papers in "Journal of Computer Science in 2018"


Journal ArticleDOI
TL;DR: A Principal Component Analysis (PCA) dimension reduction method that includes the calculation of variance proportion for eigenvector selection was used and found that the classification method using LMBP was more stable than SVM.
Abstract: Cancer is one of the most deadly diseases in the world. The International Agency for Research on Cancer (IARC) noted 14.1 million new cancer cases and 8.2 million deaths from cancer in 2012. In the last few years, DNA microarray technology has increasingly been used to analyze and diagnose cancer. Analysis of gene expression data in the form of microarray allows medical experts to ascertain whether or not a person suffers from cancer. DNA microarray data has a large dimension that can affect the process and accuracy of cancer classification. Therefore, a classification scheme that includes dimension reduction is needed. In this research, a Principal Component Analysis (PCA) dimension reduction method that includes the calculation of variance proportion for eigenvector selection was used. For the classification method, a Support Vector Machine (SVM) and Levenberg-Marquardt Backpropagation (LMBP) algorithm were selected. Based on the tests performed, the classification method using LMBP was more stable than SVM. The LMBP method achieved an average 96.07% accuracy, while the SVM achieved 94.98% accuracy.
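The dimension-reduction step described above can be sketched in a few lines of NumPy: eigenvectors of the covariance matrix are kept until their cumulative variance proportion reaches a cutoff. The 0.95 threshold and the random data below are illustrative assumptions, not the paper's settings, and the SVM/LMBP classifiers themselves are omitted.

```python
import numpy as np

def pca_by_variance(X, var_threshold=0.95):
    """Reduce X (samples x features), keeping enough principal
    components to explain `var_threshold` of the total variance."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = np.cov(Xc, rowvar=False)               # feature covariance
    eigvals, eigvecs = np.linalg.eigh(cov)       # ascending order
    order = np.argsort(eigvals)[::-1]            # sort descending
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    proportion = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(proportion, var_threshold)) + 1
    return Xc @ eigvecs[:, :k], k

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 20))                    # stand-in for microarray data
X[:, 0] *= 10                                    # one dominant direction
Z, k = pca_by_variance(X, 0.95)
print(Z.shape, k)
```

The reduced matrix `Z` would then be fed to the classifier in place of the raw gene-expression matrix.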

62 citations


Journal ArticleDOI
TL;DR: The experiments’ results show that the Logistic Regression (LR) algorithm is the best classifier with the highest accuracy as compared to the other three classifiers, not merely in text classification, but in unfair reviews detection as well.
Abstract: Reputation and trust are significantly important and play a pivotal role in enabling multiple parties to establish relationships that achieve mutual benefit, especially in an E-Commerce (EC) environment. Several factors negatively affect how customers and sellers perceive reputation. For instance, a lack of credibility in feedback reviews: users may create phantom feedback reviews to inflate their own reputation, making those reviews and ratings unfair. In this study, we have used Sentiment Analysis (SA), which is now the subject generating the most interest in the field of text analysis. One of the major challenges confronting SA today is how to detect unfair negative reviews, unfair neutral reviews and unfair positive reviews from opinion reviews. Sentiment classification techniques are applied to a dataset of consumer reviews. Precisely, we provide a comparison of four supervised machine learning algorithms: Naive Bayes (NB), Decision Tree (DT-J48), Logistic Regression (LR) and Support Vector Machine (SVM) for sentiment classification using three datasets of reviews: Clothing, Shoes and Jewelry reviews, Baby reviews and Pet Supplies reviews. To evaluate the performance of sentiment classification, this work uses accuracy, precision and recall as performance measures. Our experiments' results show that the Logistic Regression (LR) algorithm is the best classifier with the highest accuracy as compared to the other three classifiers, not merely in text classification, but in unfair review detection as well.
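The accuracy, precision and recall measures used above can be computed directly from prediction counts. The `classification_metrics` helper and the tiny fair/unfair label lists below are hypothetical, for illustration only:

```python
def classification_metrics(y_true, y_pred, positive="unfair"):
    """Accuracy, plus precision and recall for one class of interest,
    computed from raw prediction counts (no libraries assumed)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    accuracy = correct / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

y_true = ["fair", "unfair", "unfair", "fair", "unfair"]
y_pred = ["fair", "unfair", "fair",   "fair", "unfair"]
acc, prec, rec = classification_metrics(y_true, y_pred)
print(acc, prec, rec)
```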

37 citations


Journal ArticleDOI
TL;DR: The results of this experiment show that LSTM model with/or without intermediate variable has better performance than ARIMA Model.
Abstract: Weather forecasting is an interesting research problem in the flight navigation area. One of the important weather data items in aviation is visibility. Visibility is an important factor in all phases of flight, especially when the aircraft is maneuvering on or close to the ground, i.e., during taxi-out, take-off and initial climb, approach and landing and taxi-in. The aim of this study is to analyze intermediate variables and to compare visibility forecasts produced by the Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) models. This paper proposes an ARIMA model and an LSTM model for forecasting visibility at Hang Nadim Airport, Batam, Indonesia using one weather variable, visibility, as the predictor, combined with other weather variables such as temperature, dew point and humidity as moderating variables. The models were tested using weather time series data from Hang Nadim Airport, Batam, Indonesia. This research compares the Root Mean Square Error (RMSE) of the LSTM model with the RMSE of the ARIMA model. The results of this experiment show that the LSTM model, with or without intermediate variables, has better performance than the ARIMA model.
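The RMSE criterion used to compare the two models is straightforward to compute. The observed visibility values and model outputs below are made-up placeholders, not data from Hang Nadim Airport:

```python
import math

def rmse(actual, forecast):
    """Root Mean Square Error between observed and predicted series."""
    return math.sqrt(sum((a - f) ** 2 for a, f in zip(actual, forecast))
                     / len(actual))

# Hypothetical visibility observations (metres) and two model outputs.
observed = [8000, 7500, 6000, 6500, 7000]
model_a  = [7900, 7400, 6300, 6400, 7100]   # stand-in for ARIMA output
model_b  = [7950, 7480, 6100, 6480, 7020]   # stand-in for LSTM output
better = "B" if rmse(observed, model_b) < rmse(observed, model_a) else "A"
print(rmse(observed, model_a), rmse(observed, model_b), better)
```

The model with the lower RMSE is preferred, which is how the paper ranks LSTM above ARIMA.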

32 citations


Journal ArticleDOI
TL;DR: The conceptual User Satisfaction Evaluation Model (USEM) employed to measure LMS success seeks to examine the relationships between service quality, system quality, ease of use, perceived usefulness, information quality and student satisfaction, as well as to measure the outcomes of the LMS.
Abstract: In the last few years, the use of educational technology, particularly the concept of the Learning Management System (LMS), has increased rapidly. With this fast development, the question arises of how to manage the LMS to obtain success and efficiency in online courses. One of the important factors that has received many citations in literature studies (and holds a special position in information system research) is user satisfaction. It is a crucial factor that can predict the success or failure of any LMS. Accordingly, this research examined the success factors that affect the user satisfaction and outcomes of an LMS. This paper discusses the conceptual User Satisfaction Evaluation Model (USEM) employed to measure LMS success. In particular, it seeks to examine the relationships between service quality, system quality, ease of use, perceived usefulness, information quality and student satisfaction, as well as to measure the outcomes of the LMS. Results from the data analysis indicate that all proposed factors have a positive effect on student satisfaction. The results also indicate that a higher rate of user satisfaction will lead to greater benefits for the students.

26 citations


Journal ArticleDOI
TL;DR: The experiments show that GRU model performs very well on emotion classification compared to the DNN model, and with the recent advancements in deep learning now it is possible to get better accuracy, robustness and low latency for solving complex functions.
Abstract: Emotions play a vital role in efficient and natural human-computer interaction. Recognizing human emotions from speech is a truly challenging task when accuracy, robustness and latency are considered. With recent advancements in deep learning, it is now possible to achieve better accuracy, robustness and low latency when approximating complex functions. In our experiment we developed two deep learning models for emotion recognition from speech. We compare the performance of a feed-forward Deep Neural Network (DNN) with a recurrent architecture, the Gated Recurrent Unit (GRU), for speech emotion recognition. GRUs have so far been little explored for classifying emotions from speech. The DNN model gives an accuracy of 89.96% and the GRU model gives an accuracy of 95.82%. Our experiments show that the GRU model performs very well on emotion classification compared to the DNN model.
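A GRU differs from a plain feed-forward DNN mainly through its gating of a recurrent hidden state. A minimal single-cell GRU step in NumPy, with randomly initialized weights purely for illustration (a real model would learn these by backpropagation through time):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU time step: update gate z, reset gate r, candidate state."""
    z = sigmoid(Wz @ x + Uz @ h)                # update gate
    r = sigmoid(Wr @ x + Ur @ h)                # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))    # candidate hidden state
    return (1 - z) * h + z * h_tilde            # blend old and new state

rng = np.random.default_rng(1)
n_in, n_hid = 4, 3                              # toy sizes, not the paper's
W = [rng.normal(size=(n_hid, n_in)) for _ in range(3)]
U = [rng.normal(size=(n_hid, n_hid)) for _ in range(3)]
h = np.zeros(n_hid)
for t in range(5):                              # run over a short input sequence
    h = gru_step(rng.normal(size=n_in), h, W[0], U[0], W[1], U[1], W[2], U[2])
print(h)
```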

21 citations


Journal ArticleDOI
TL;DR: The Message Queuing Telemetry Transport protocol was chosen over the Constrained Application Protocol and the Extensible Messaging and Presence Protocol in the experiment conducted, owing to its lightweight transmission, low resource consumption and its effectiveness in providing different qualities of service to detect the temperature, humidity and gas leaks encountered in a greenhouse environment.
Abstract: With industrialization and continuously evolving climatic conditions, the urge to practice agriculture with the fusion of technology has become a necessity. In the era of the Internet of Things, where all eyes are witnessing the evolution of machine-to-machine interaction, there is also a lack of clarity in choosing the type of protocol to be used in building a particular system such as a greenhouse. A greenhouse is a regulated environment for agriculture where critical parameters like temperature, light, humidity and the pH level of the soil can be monitored with the help of sensor systems using Internet of Things protocols. The Message Queuing Telemetry Transport protocol was chosen over the Constrained Application Protocol and the Extensible Messaging and Presence Protocol in the experiment conducted, owing to its lightweight transmission, low resource consumption and its effectiveness in providing the different qualities of service to detect the temperature and humidity as well as the gas leaks encountered in a greenhouse environment.

21 citations


Journal ArticleDOI
TL;DR: This paper delves into the capacity of enhanced Big Bang-Big Crunch (EBB-BC) metaheuristic to handle data clustering problems and demonstrates the high quality solutions generated by elite pool-based BB-BC.
Abstract: This paper delves into the capacity of an enhanced Big Bang-Big Crunch (EBB-BC) metaheuristic to handle data clustering problems. BB-BC is a product of an evolution theory of the universe in physics and astronomy. The two main phases of BB-BC are the big bang and the big crunch. The big bang phase involves the creation of a population of random initial solutions, while in the big crunch phase these solutions are shrunk into one elite solution represented by a mass center. This study looks into enhancing BB-BC's effectiveness in clustering data through the inclusion of an elite pool alongside implicit solution recombination and a local search method. These strategies result in a balanced search over a population that is both of good quality and diverse. The proposed elite pool-based BB-BC was compared with the original BB-BC and other comparable metaheuristics. Fourteen different clustering datasets were used for testing, and the elite pool-based BB-BC showed better performance than the original BB-BC, demonstrating the impact of the incorporated strategies. The experimental outcomes demonstrate the high quality of the solutions generated by elite pool-based BB-BC. Its performance in fact surpasses that of comparable metaheuristics such as swarm intelligence and evolutionary algorithms.
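The big bang/big crunch cycle described above can be sketched as follows. This is the basic BB-BC loop with a fitness-weighted mass center, not the authors' elite-pool enhancement, and the two-blob data and parameter values are illustrative assumptions:

```python
import numpy as np

def sse(centroids, X):
    """Clustering fitness: sum of squared distances to the nearest centroid."""
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.min(axis=1).sum()

def bb_bc_cluster(X, k=2, pop=30, iters=40, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = X.min(axis=0), X.max(axis=0)
    # Big bang: random initial population of candidate centroid sets.
    P = rng.uniform(lo, hi, size=(pop, k, X.shape[1]))
    for t in range(1, iters + 1):
        fit = np.array([sse(c, X) for c in P])
        w = 1.0 / (fit + 1e-12)
        # Big crunch: shrink the population to a fitness-weighted mass center.
        center = (w[:, None, None] * P).sum(axis=0) / w.sum()
        # New big bang: scatter around the center, shrinking with iteration.
        P = center + rng.normal(size=P.shape) * (hi - lo) / t
        P[0] = center                      # keep the mass center itself
    fit = np.array([sse(c, X) for c in P])
    return P[fit.argmin()], fit.min()

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 0.3, (40, 2)), rng.normal(5, 0.3, (40, 2))])
best, best_sse = bb_bc_cluster(X, k=2)
print(best_sse)
```

Averaging centroid sets is crude because of label switching between candidates; strategies such as the paper's elite pool and local search address exactly this kind of weakness.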

21 citations


Journal ArticleDOI
TL;DR: It was noticed that the PCC filter showed a remarkable improvement in the classification accuracy when it was combined with BPSO or GA, thereby forming a PCC-BPSO/GA-multi classifiers approach.
Abstract: In this study, a three-phase hybrid approach is proposed for the selection and classification of high-dimensional microarray data. The method uses Pearson's Correlation Coefficient (PCC) in combination with Binary Particle Swarm Optimization (BPSO) or a Genetic Algorithm (GA) along with various classifiers, thereby forming a PCC-BPSO/GA-multi-classifier approach. As such, five different classifiers are employed in the final stage of the classification. It was noticed that the PCC filter showed a remarkable improvement in classification accuracy when it was combined with BPSO or GA. This positive impact varied across datasets depending on the final applied classifier. The performance of various combinations of the hybrid technique was compared in terms of accuracy and the number of selected genes. In addition to the fact that BPSO runs faster than GA, it was noticed that BPSO performs better than GA when combined with PCC feature selection.
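The PCC filter stage amounts to ranking features by absolute correlation with the class label and keeping the strongest ones. The `pcc_filter` name, the top-k cutoff and the synthetic microarray-like data below are assumptions for illustration; the subsequent BPSO/GA search stage is omitted:

```python
import numpy as np

def pcc_filter(X, y, top_k=10):
    """Rank features by |Pearson correlation| with the class label
    and keep the top_k -- the filter stage before the BPSO/GA search."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    num = Xc.T @ yc
    den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12
    pcc = num / den
    return np.argsort(np.abs(pcc))[::-1][:top_k]

rng = np.random.default_rng(3)
y = rng.integers(0, 2, size=100).astype(float)   # binary class labels
X = rng.normal(size=(100, 50))                   # 50 "genes", mostly noise
X[:, 7] += 3 * y                                 # gene 7 is class-correlated
selected = pcc_filter(X, y, top_k=5)
print(selected)
```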

20 citations


Journal ArticleDOI
TL;DR: This study discusses VANETs' special characteristics, explains why VANETs are considered a subcategory of ad hoc networks and presents a categorization of VANET architectures.
Abstract: VANETs have become an interesting area of research since vehicles can be equipped with sensors, processing and communication devices. As a result, various life-changing applications have emerged in different areas such as safety and public services. VANETs are considered a subclass of ad hoc networks. However, they have special characteristics that differentiate them, such as QoS requirements, privacy, safety and high node mobility. This paper discusses VANETs' special characteristics and explains why VANETs are considered a subcategory of ad hoc networks. A categorization of VANET architectures is also presented, along with the importance, needs and applications of VANETs. Furthermore, various routing protocols proposed for VANETs are studied and a categorization of these protocols is proposed.

20 citations


Journal ArticleDOI
TL;DR: Adaboost is found to be the best meta decision classifier for predicting the student’s result based on the marks obtained in the semester.
Abstract: Student performance prediction is an area of concern for educational institutions. In university-level learning systems, the method or rule adopted to identify the candidates who pass or fail differs depending on various factors such as the course, the department of study and so on. Predicting the result of a student in a course is an issue that has recently been addressed using machine learning techniques. The focus of this work is to find a way to predict a student's academic performance at the university using a machine learning approach. This is done by using the previous records of the student rather than applying course-dependent formulae to predict the student's final grade. In this work, meta decision tree classifier techniques based on four representative learning algorithms, namely Adaboost, Bagging, Dagging and Grading, are used to construct different decision trees. REPTree is used as the decision tree method for meta learning. These four meta learning methods have been compared separately with respect to the training and test sets. Adaboost is found to be the best meta decision classifier for predicting the student's result based on the marks obtained in the semester.
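Of the four meta learners compared, AdaBoost's reweighting loop is easy to sketch. The version below boosts simple decision stumps rather than REPTree, on made-up exam-mark data, so it illustrates the algorithm rather than the paper's exact setup:

```python
import numpy as np

def stump_train(X, y, w):
    """Best weighted decision stump: (feature, threshold, polarity, error)."""
    best = (None, None, 1, 1.0)
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (f, thr, pol, err)
    return best

def adaboost(X, y, rounds=10):
    """AdaBoost on stumps; y must be +1/-1 (e.g. pass/fail)."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        f, thr, pol, err = stump_train(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)       # weight of this stump
        pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)              # upweight hard cases
        w /= w.sum()
        ensemble.append((alpha, f, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(p * (X[:, f] - t) >= 0, 1, -1)
                for a, f, t, p in ensemble)
    return np.where(score >= 0, 1, -1)

# Toy data: "pass" when the mean of two exam marks exceeds 50.
rng = np.random.default_rng(4)
X = rng.uniform(0, 100, size=(120, 2))
y = np.where(X.mean(axis=1) > 50, 1, -1)
model = adaboost(X, y, rounds=15)
acc = (predict(model, X) == y).mean()
print(acc)
```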

20 citations


Journal ArticleDOI
TL;DR: This study presents a novel hybrid approach to detecting a DDoS attack by monitoring abnormal traffic in the network, combining two methods: traffic prediction and change detection.
Abstract: In recent years, computer networks have become more and more advanced in terms of size, applications, complexity and level of heterogeneity. Moreover, availability and performance are important issues for end users. New types of cyber-attacks that can affect and damage network performance and availability are constantly emerging and some threats, such as Distributed Denial of Service (DDoS) attacks, can be very dangerous and cannot be easily prevented. In this study, we present a novel hybrid approach to detecting a DDoS attack by means of monitoring abnormal traffic in the network. This approach reads traffic data and from that it is possible to build a model, by means of which future data may be predicted and compared with observed data, in order to detect any abnormal traffic. This approach combines two methods: traffic prediction and change detection. To the best of our knowledge, such a combination has never been used in this area before. The approach achieved a highly significant accuracy rate of 98.3% and sensitivity was 100%, which means that all potential attacks are detected and prevented from penetrating the network system.
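The combination of traffic prediction and change detection can be illustrated with an exponentially weighted moving average forecaster whose prediction errors are thresholded. The parameter values and packet-rate numbers below are invented for illustration and are not the paper's model:

```python
def detect_anomalies(traffic, alpha=0.3, k=3.0):
    """Predict each sample with an EWMA, track the typical deviation
    and flag samples whose prediction error exceeds k times it."""
    pred, dev = traffic[0], 1.0
    alerts = []
    for t, x in enumerate(traffic[1:], start=1):
        err = abs(x - pred)
        if err > k * dev:
            alerts.append(t)                    # abnormal traffic: possible DDoS
        else:
            dev = (1 - alpha) * dev + alpha * err
        pred = (1 - alpha) * pred + alpha * x   # update the forecast
    return alerts

normal = [100, 103, 98, 101, 99, 102, 100, 97, 101, 100]
attack = normal + [400, 420, 410]               # sudden surge in packets/s
print(detect_anomalies(attack))
```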

Journal ArticleDOI
TL;DR: The regression analysis shows that perceived ease of use of cloud storage reduced perceived usefulness for job performance, while increasing the desire to use the specific technology and the conscious intention to use it in the future.
Abstract: Recently, technology has witnessed rapid progress and dominates all aspects of life systematically. Smartphones are devices that users can depend on for managing and storing data. Moreover, they are used to save, update and secure the private information of users by using cloud storage such as iCloud, Dropbox and Google Drive. This study presents and adapts the Technology Acceptance Model (TAM), specifically perceived ease of use, usefulness, attitude toward usage and behavioral intention. Additionally, the study used Cronbach's Alpha, correlation and regression analysis in public universities, particularly the University of Sulaimani. The study aims to recognize the major elements that have an impact on using cloud storage, to determine to what extent users depend on electronic devices in the academic field and to identify the factors that crucially affect users' ability to use electronic devices easily, improve their capability and attract them to the specific technology. The results of this study confirmed that some TAM constructs have direct and indirect effects on university academics', employees' and students' behavioral intention to use cloud storage (Dropbox, iCloud and Google Drive). However, the regression analysis shows that perceived ease of use of cloud storage reduced perceived usefulness for job performance, while increasing the desire to use the specific technology and the conscious intention to use it in the future.

Journal ArticleDOI
TL;DR: Applications reviewed in this research are about crop sensing, mapping and monitoring the croplands pattern, managing and controlling with the help of radio frequency identification and real-time monitoring of environment, low power wireless sensor, better connectivity, operational efficiency and remote management.
Abstract: The internet of things has acquired attention all over the globe. It has transformed the agricultural field and allowed farmers to cope with the massive issues they face. The aim of this paper is to review the various challenges and opportunities associated with the applications of the internet of things in the agricultural sector. This research makes use of secondary sources gathered from existing academic literature such as journals, books, articles, magazines, the internet, newsletters, company publications and whitepapers. The applications reviewed in this research concern crop sensing, mapping and monitoring cropland patterns, managing and controlling with the help of radio frequency identification and real-time monitoring of the environment. Some of the challenges taken into consideration in reviewing the applications of the internet of things are software complexity, security, lack of supporting infrastructure and technical skill requirements. Complexity in the software has to be rectified in order to support the IoT network; therefore software must be developed to be user-friendly for improving farming and the production and quality of the crop. Security is the major threat in IoT applications and has to be enhanced through proper access control, data confidentiality and user authentication. Technical skill is required in farming to enhance organizational abilities, to perform farming functions, to solve problems and more. Proper supporting infrastructure can be developed with proper internet availability and connectivity. Some of the opportunities considered in reviewing the applications of the internet of things are low-power wireless sensors, better connectivity, operational efficiency and remote management.

Journal ArticleDOI
TL;DR: The objective of this project is to predict accurately one-dimensional coordinates of normalized n-component vectors representing two-dimensional silhouettes in order to identify individuals at a distance without any interaction and obtrusion.
Abstract: It is crucial to find methods that analyze the large amounts of data captured by cameras and/or various sensors installed all around us. Machine learning has become a prevailing tool for analyzing such data, which signifies behavioral characteristics of human beings. Gait, as an identifier for use in individual recognition systems, has distinctive and almost certainly unique key features for each person, including centroid, cycle length and step size. Gait is sometimes preeminently suited to recognition or surveillance scenarios. It might be used for the identification of females who are wearing veils in some countries, without critical social issues. The objective of this project is to accurately predict one-dimensional coordinates of normalized n-component vectors representing two-dimensional silhouettes in order to identify individuals at a distance without any interaction or obtrusion. Various algorithms are further incorporated into walk pattern analysis to adaptively improve gait recognition and classification. The results report reasonable identification performance compared to several machine learning methods.

Journal ArticleDOI
TL;DR: The results show that the combined system can achieve high classification accuracy and has promising potential application in the Arabic sentiment analysis and opinion mining.
Abstract: Sentiment analysis has recently become one of the growing areas of research related to text mining and natural language processing. Sentiment analysis techniques are increasingly exploited to categorize opinion text into one or more predefined sentiment classes for the creation and automated maintenance of review-aggregation websites. Most of the current studies related to this topic focus mainly on English texts, with very limited resources available for other languages like Arabic. The complexities of the Arabic language in morphology, orthography and dialects make sentiment analysis for Arabic more challenging. In this study, the Naive Bayes (NB) algorithm and a Multilayer Perceptron (MLP) network are combined into a hybrid system called NB-MLP for Arabic sentiment classification. Five datasets were tested: attraction, hotel, movie, product and restaurant. The datasets are then classified into positive or negative sentiment polarities using both the standard and the combined systems. Ten-fold cross validation was employed for splitting the dataset. Over the whole set of experimental data, the results show that the combined system can achieve high classification accuracy and has promising potential application in Arabic sentiment analysis and opinion mining.

Journal ArticleDOI
TL;DR: This paper proposes a new emotional speech database in Brazilian Portuguese, the Voice Emotion Recognition dataBase in Portuguese language (VERBO), contributing a new acted voice database, support for voice-based analysis of feelings and emotions, and statistical validation of the database.
Abstract: The recognition of human emotional traits based on Affective Computing is carried out by computational systems that are able to interpret and react intelligently to the context of the user. Speech Emotion Recognition systems are capable of transforming speech signal data into information related to the feelings of individuals in specific situations. However, the emotional expression of a human being depends mainly on his or her origins. For this reason, emotional voice databases are peculiar to each language. In this paper, we propose a new emotional database with speech in the Brazilian Portuguese language, called the Voice Emotion Recognition dataBase in Portuguese language (VERBO). The database was validated by a panel of expert judges; we achieved an agreement rate of 76% using the content validity index and a substantial agreement rate of 65% using Fleiss' kappa. In addition, an accuracy of 0.76 was achieved and it was possible to observe that the emotions anger and happiness were easier to recognize, with F1-scores of 0.85 and 0.83 respectively, whereas the disgust and surprise emotions were the most difficult, with 0.67 and 0.68 respectively. In view of this, the main contributions of this study are: (1) the establishment of a new acted voice database; (2) support for voice recognition systems in the analysis of feelings and emotions; and (3) statistical validation of the database using CVI and Fleiss' kappa.
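Fleiss' kappa, used above to validate inter-judge agreement, can be computed from a subjects-by-categories count table. The table below is a made-up toy example, not VERBO data:

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table of shape (subjects x categories),
    each cell holding the number of raters assigning that category."""
    n = sum(ratings[0])                    # raters per subject (constant)
    N = len(ratings)                       # number of subjects
    k = len(ratings[0])                    # number of categories
    # Per-subject agreement P_i and overall category proportions p_j.
    P = [(sum(c * c for c in row) - n) / (n * (n - 1)) for row in ratings]
    p = [sum(row[j] for row in ratings) / (N * n) for j in range(k)]
    P_bar = sum(P) / N                     # mean observed agreement
    P_e = sum(pj * pj for pj in p)         # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical table: 4 utterances, 3 judges, 3 emotion labels.
table = [[3, 0, 0], [0, 3, 0], [2, 1, 0], [1, 1, 1]]
print(round(fleiss_kappa(table), 3))
```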

Journal ArticleDOI
TL;DR: This research presents a fusion of statistical features, extracted from fragments of Arabic handwriting samples to identify the writer using fuzzy ARTMAP classifier, a supervised neural model, especially suited to classification problems.
Abstract: Arabic writer identification and associated tasks are still fresh due to the huge variety of Arabic writers' styles. The current research presents a fusion of statistical features, extracted from fragments of Arabic handwriting samples, to identify the writer using a fuzzy ARTMAP classifier. Fuzzy ARTMAP is a supervised neural model, especially suited to classification problems. It is faster to train and needs fewer training epochs to "learn" from input data for generalization. The extracted features are fed to fuzzy ARTMAP for training and testing. Fuzzy ARTMAP is employed for the first time along with a novel fusion of statistical features for Arabic writer identification. The entire IFN/ENIT database is used in experiments, such that 75% of the handwritten Arabic words from 411 writers are employed for training and 25% for testing the system at random. Several combinations of extracted features are tested using the fuzzy ARTMAP classifier and finally one combination exhibited a promising accuracy of 94.724% for Arabic writer identification on the IFN/ENIT benchmark database.

Journal ArticleDOI
TL;DR: The paper presents a detailed description of the main elements of the approach including models, transformations and a specialised software (Personal Knowledge Base Designer) that makes the design process of rule-based expert systems and knowledge bases more efficient.
Abstract: The problem of improving efficiency of intelligence systems engineering remains a relevant topic of scientific research. One of the trends in this area is the use of the principles of cognitive (visual) modelling and design as well as approaches based on generative programming and model transformations. This paper aims to describe the implementation and application of model transformations for prototyping rule-based knowledge bases and expert systems. The implementation proposed uses the main principles of the Model Driven Architecture (MDA) (e.g., model types and creation stages) and considers the features of developing intelligent systems. Therefore, the current research employs the following tools: Ontologies for the representation of the computation-independent model; the author’s original notation, namely, the Rule Visual Modelling Language (RVML) to create the platform-independent and platform-specific models; the C Language Integrated Production System (CLIPS) and the Drools Rule Language (DRL) as the programming languages (as the platforms). The approach proposed targets non-programmers (domain experts and analytics) and makes the design process of rule-based expert systems and knowledge bases more efficient. The paper also presents a detailed description of the main elements of the approach including models, transformations and a specialised software (Personal Knowledge Base Designer).

Journal ArticleDOI
TL;DR: The results showed that the Petrosian C fractal dimension on signals at scales 1-5, with an SVM using a fine Gaussian kernel, had the highest accuracy of 99% for five classes of lung sound data; the proposed method can be used as an alternative for computerized lung sound analysis to assist doctors in the early diagnosis of lung disease.
Abstract: Lung sound is a biological signal carrying information about respiratory system health. Healthy lung sounds can be differentiated from pathological sounds by auscultation. This difference can be objectively analyzed by a number of digital signal processing techniques. One method of analyzing lung sound is signal complexity analysis using the fractal dimension. To improve the accuracy of lung sound classification, the Fractal Dimension (FD) is calculated on multiscale versions of the signal using the coarse-grained procedure. The combination of FD and the multiscale process generates more comprehensive information about the lung sound. This study used seven types of FD and three types of classifier. The results showed that the Petrosian C fractal dimension on signals at scales 1-5, with an SVM using a fine Gaussian kernel, had the highest accuracy of 99% for five classes of lung sound data. The proposed method can be used as an alternative method for computerized lung sound analysis to assist doctors in the early diagnosis of lung disease.
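The core feature extraction, a Petrosian fractal dimension computed on coarse-grained (multiscale) versions of the signal, can be sketched as below; the synthetic oscillation stands in for real lung sound recordings, and the paper's specific "Petrosian C" variant may differ in detail:

```python
import math

def coarse_grain(signal, scale):
    """Multiscale coarse-graining: mean of non-overlapping windows."""
    m = len(signal) // scale
    return [sum(signal[i*scale:(i+1)*scale]) / scale for i in range(m)]

def petrosian_fd(signal):
    """Petrosian fractal dimension from sign changes of the derivative."""
    n = len(signal)
    diff = [b - a for a, b in zip(signal, signal[1:])]
    n_delta = sum(d1 * d2 < 0 for d1, d2 in zip(diff, diff[1:]))
    return math.log10(n) / (math.log10(n)
                            + math.log10(n / (n + 0.4 * n_delta)))

# Hypothetical "lung sound": a noisy oscillation, analysed at scales 1-5.
sig = [math.sin(0.3 * i) + 0.2 * math.sin(2.1 * i) for i in range(500)]
features = [petrosian_fd(coarse_grain(sig, s)) for s in range(1, 6)]
print(features)
```

The five per-scale FD values would form the feature vector passed to the SVM.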

Journal ArticleDOI
TL;DR: Most active binarization researchers exploit prior information from the source image, such as histogram shape, measurement-space clustering, entropy, object attributes, spatial correlation and the local gray-level surface, with special attention to the statistical information description features of the image used in recent thresholding techniques.
Abstract: Binarization is an important process in image enhancement and analysis. Currently, numerous binarization techniques have been reported in the literature. These binarization methods produce binary images from color or gray-level images. This article presents an extensive review of various binarization approaches, which are also referred to as thresholding methods. These methods are grouped into seven categories according to the employed features and techniques: histogram shape-based, clustering-based, entropy-based, object-attribute-based, spatial, local and hybrid methods. Most active binarization researchers exploit prior information from the source image, such as histogram shape, measurement-space clustering, entropy, object attributes, spatial correlation and the local gray-level surface, with special attention to the statistical information description features of the image used in recent thresholding techniques.

Journal ArticleDOI
TL;DR: The results showed a positive and significant relationship among vocational school students in Indonesia, particularly at the vocational high school "Dharma Nusantara"; it is recommended that further research examine other factors that may contribute to and influence the improvement of WELS.
Abstract: The background of this research comes from teachers at vocational high schools in Indonesia who find it difficult, and are unsure how, to apply Web-based learning with students of the Computer and Network Engineering program. To answer that question, the researchers developed a Web-based Electronic Learning System (WELS) model. The development process was carried out through a Research and Development approach. This WELS model was then used in teaching computer and network engineering, and the result was quite effective. The relationship between WELS and the level of effectiveness obtained was then tested; the results showed a positive and significant relationship among vocational school students in Indonesia, especially at the vocational high school "Dharma Nusantara". In order to produce better future research, we recommend that further research be undertaken examining other factors that may contribute to and influence the improvement of WELS for all vocational high schools in Indonesia.

Journal ArticleDOI
TL;DR: The study shows that the Zambian public sector faces related challenges in mitigating insider attacks, calling for concerted efforts to develop mitigation measures in order to ensure national cyber security readiness and enhance data privacy.
Abstract: Insider attacks are security breaches posed by an existing or former organizational stakeholder with unrestricted access rights to resources who, with or without intent, compromises the confidentiality, integrity and availability of organizational data. Zambian public organizations are vulnerable to insider attacks due to a number of factors that include technology complexity, understaffing, financial gain, lack of security policies and procedures, and lack of adoption and implementation of international security frameworks and standards such as ISO 27000 and COBIT. Insider threats can be categorized into three dimensions, namely Information Technology (IT) sabotage, financial fraud and Intellectual Property (IP) theft. This paper reports the results from three targeted public organizations in Zambia. These are among the few that seem to recognise cyber threats and have partially adopted some security best practices and international information security standards such as COBIT 5.0 and the ISO 27001 standard. The study aimed at assessing the security gaps using the ISO 27001:2013 Information Security Management System (ISMS) standard. The study approach was quantitative and qualitative, with survey questionnaires and interviews as assessment tools for empirical data collection. The study shows that the Zambian public sector faces related challenges in mitigating insider attacks, calling for concerted efforts to develop mitigation measures in order to ensure national cyber security readiness and enhance data privacy. The study revealed that the majority of the organizations assessed lack insider-security-deterring policies such as access control, non-disclosure agreements (NDA), pre-employment screening and acceptable use. Additionally, the findings indicated that the majority of public organizations have not made any efforts towards cyber security readiness, while only about 33% have adopted some security best practices.
Further, using Actor Network Theory (ANT) and the Theory of Planned Behavior (TPB), the study proposes an expedient insider-threat mitigation model with an emphasis on user awareness and access control, considering that human behavior is difficult to model.

Journal ArticleDOI
TL;DR: In this study, two techniques are introduced for image steganography in the spatial domain that employ chaos theory to track the addresses of shuffled bits and outperform existing systems.
Abstract: In this study, two techniques are introduced for image steganography in the spatial domain. These systems employ chaos theory to track the addresses of shuffled bits in steganography. The first system is based on the well-known LSB technique, while the second system is based on a recent approach that searches for the identical bits between the secret message and the cover image. A modified logistic map is employed as the chaotic map to generate integer chaotic series from which the shuffled address bits are extracted. Peak Signal-to-Noise Ratio (PSNR), Mean Square Error (MSE), histogram analysis and correlation analysis are used for testing and evaluating the new levels of security of the proposed techniques. The results show that the proposed methods outperform existing systems.
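The chaos-driven addressing idea can be illustrated with a minimal sketch (not the authors' exact scheme; the map parameters `r` and `x0` and the ranking step used to derive integer addresses are illustrative assumptions): a logistic map seeds a permutation of pixel indices, and message bits are written into the LSBs at those shuffled addresses, so the pair (r, x0) acts as the shared secret key.

```python
def logistic_permutation(n, r=3.99, x0=0.613):
    """Rank the first n logistic-map values to derive a permutation
    of the addresses 0..n-1 (integer chaotic addressing)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return sorted(range(n), key=lambda i: xs[i])

def embed(cover, bits, r=3.99, x0=0.613):
    """Write message bits into the LSBs of cover pixels at chaotic addresses."""
    stego = list(cover)
    for bit, idx in zip(bits, logistic_permutation(len(cover), r, x0)):
        stego[idx] = (stego[idx] & ~1) | bit
    return stego

def extract(stego, nbits, r=3.99, x0=0.613):
    """Recover the first nbits using the same secret key (r, x0)."""
    order = logistic_permutation(len(stego), r, x0)
    return [stego[idx] & 1 for idx in order[:nbits]]
```

Extraction with the same key regenerates the same permutation, so the hidden bits are recovered without side information, while each pixel changes by at most one intensity level, which keeps PSNR high.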

Journal ArticleDOI
TL;DR: Simulation results show that the proposed method can reduce the total distance traveled and time taken in order to reach a destination, as compared to the classic “shortest path method” (based only on the distance).
Abstract: Proposing an efficient strategy to reduce traffic congestion is an essential step towards improvement, given the unpredictable and dynamic infrastructure of the road network. With advances in computing technologies and communication protocols, we can retrieve any type of data and receive in real time the state of traffic congestion on each road using Electronic Toll Collection Systems (ETCS), Vehicle Traffic Routing Systems (VTRS), Intelligent Transportation Systems (ITS) and Traffic Light Signals (TLS). This study introduces a new distributed strategy that aims to optimize road traffic congestion in real time based on the Vehicular Ad-Hoc Network (VANET) communication system and the techniques of Ant Colony Optimization (ACO). VANET is used as a communication technology to create a channel of communication between vehicles and routes. The techniques of ACO are used to compute the shortest path that a driver can follow to avoid congested routes. The proposed system is based on a multi-agent architecture, in which all agents work together to monitor road traffic congestion and help drivers quickly arrive at their destinations by following the best routes with less congestion. Simulation results show that the proposed method can reduce the total distance traveled and the time taken to reach a destination, as compared to the classic “shortest path” method (based only on distance).
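A compact sketch of the ACO routing idea follows (the parameters and toy graph are hypothetical, and the edge weights stand in for distance scaled by the congestion level reported over the VANET): ants build routes probabilistically from pheromone and edge cost, and pheromone reinforcement concentrates traffic on cheaper, less congested paths.

```python
import random

def aco_route(graph, src, dst, n_ants=20, n_iter=50,
              alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Find a low-cost src->dst route; graph[u][v] is a congestion-weighted cost."""
    rng = random.Random(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone per edge
    best, best_cost = None, float("inf")
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            node, path, cost, seen = src, [src], 0.0, {src}
            while node != dst:
                nexts = [v for v in graph[node] if v not in seen]
                if not nexts:           # dead end: discard this ant's tour
                    path = None
                    break
                weights = [tau[(node, v)] ** alpha *
                           (1.0 / graph[node][v]) ** beta for v in nexts]
                v = rng.choices(nexts, weights=weights)[0]
                cost += graph[node][v]
                path.append(v); seen.add(v); node = v
            if path:
                tours.append((path, cost))
                if cost < best_cost:
                    best, best_cost = path, cost
        for e in tau:                   # evaporation
            tau[e] *= (1.0 - rho)
        for path, cost in tours:        # deposit inversely proportional to cost
            for u, v in zip(path, path[1:]):
                tau[(u, v)] += 1.0 / cost
    return best, best_cost

# Edge weight = distance x congestion factor reported over the VANET.
g = {"A": {"B": 4.0, "C": 2.0}, "B": {"D": 1.0}, "C": {"D": 2.5}, "D": {}}
path, cost = aco_route(g, "A", "D")
```

In this toy network the congested route A-B-D costs 5.0 while A-C-D costs 4.5, so the colony converges on the latter; in the paper's setting the edge costs would be refreshed in real time from VANET reports.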

Journal ArticleDOI
TL;DR: This research paper describes how permission system security can create awareness among users to assist them in deciding on permission grants; such improved and responsible user behavior on Android OS can help users utilize their devices securely.
Abstract: In today’s world there has been exponential growth in smartphone users, which has led to the unbridled growth of smartphone apps available in the Google Play Store, App Store, etc. In the case of Android, there are many free applications for which the user need not shell out a penny to use the services. Here the magic word is “free”, which entices millions of pliant people into installing those apps and giving unnecessary access to their data and device controls. Current studies have shown that over 70% of the apps in the market request access to data irrelevant to their main functions, which might cause leakage of personal data or inefficient use of mobile resources. Of late, several malicious applications have gathered sensitive information about users through third-party applications by escalating their permissions to a high level on the Android operating system. The Android permission system gives the user control over third-party apps: based on the permissions granted by the user, an app can access the related resources on the user's mobile. A user is bound to grant or deny permissions during the installation of an application. For the most part, users don't pay attention to the requested permissions, or sometimes do not understand the meaning of a permission, and install the app on their device anyway. This gives attackers a way to perform malicious tasks by demanding a larger than expected set of permissions. These extra permissions allow the attacker to exploit the device and also retrieve sensitive information from it. In this research paper we describe how permission system security can create awareness among users to assist them in deciding on permission grants. Such improved and responsible user behavior on Android OS can help users utilize their devices securely.
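The awareness idea can be sketched as a simple audit of an app's `AndroidManifest.xml`: list the requested permissions and flag those in Android's "dangerous" protection level (the subset below is illustrative, not the full official list).

```python
import xml.etree.ElementTree as ET

# Illustrative subset of permissions Android classifies as "dangerous".
DANGEROUS = {
    "android.permission.READ_CONTACTS",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.READ_SMS",
    "android.permission.CAMERA",
}
# The android: prefix resolves to this namespace in manifest files.
ANDROID_NAME = "{http://schemas.android.com/apk/res/android}name"

def audit_manifest(xml_text):
    """Return (requested, flagged) permission sets from a manifest string."""
    root = ET.fromstring(xml_text)
    requested = {p.attrib[ANDROID_NAME] for p in root.iter("uses-permission")}
    return requested, requested & DANGEROUS

manifest = """<manifest xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.INTERNET"/>
  <uses-permission android:name="android.permission.READ_CONTACTS"/>
</manifest>"""
requested, flagged = audit_manifest(manifest)
```

A tool built on this kind of check could warn the user before installation that, for example, a flashlight app requesting `READ_CONTACTS` is asking for more than its main function needs.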

Journal ArticleDOI
TL;DR: This work pays attention to basic resource provisioning problems that arise in cloud computing environments and presents some conceptual graph-theoretical suggestions to address these issues.
Abstract: Cloud computing, a kind of web service provisioning model, provides immense benefits over traditional IT service environments with the help of virtualization technology. As cloud computing is not a fully matured paradigm, it poses many open issues to be addressed. The key research problem in cloud computing is efficient resource provisioning, owing to its complex and distributed architecture. Graph-based representations of complex networks provide simpler views and graph-theoretical techniques provide simpler solutions for many issues inherent in networks. Hence, this paper begins with an exploration of graph theory applications in computer networks, with a specific focus on applications in cloud computing. This work pays attention to basic resource provisioning problems that arise in cloud computing environments and presents some conceptual graph-theoretical suggestions to address these issues.
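As one conceptual illustration (our own toy formulation, not a method from the paper): VM placement can be viewed as a bipartite graph between VMs and hosts, with an edge wherever a host's free capacity fits a VM, and a greedy best-fit traversal standing in for a matching-based assignment.

```python
def build_feasibility_edges(vms, hosts):
    """Bipartite edge (vm, host) exists iff the host's capacity fits the VM."""
    return {vm: [h for h, cap in hosts.items() if cap >= need]
            for vm, need in vms.items()}

def greedy_place(vms, hosts):
    """Place the most demanding VMs first on the feasible host with least slack
    (best fit). Returns a vm -> host map, or None if the heuristic fails."""
    free = dict(hosts)
    placement = {}
    for vm, need in sorted(vms.items(), key=lambda kv: -kv[1]):
        feasible = [h for h in free if free[h] >= need]
        if not feasible:
            return None
        host = min(feasible, key=lambda h: free[h] - need)
        placement[vm] = host
        free[host] -= need
    return placement

vms = {"vm1": 4, "vm2": 2, "vm3": 2}      # demanded capacity units
hosts = {"h1": 4, "h2": 5}                # available capacity units
placement = greedy_place(vms, hosts)
```

Framing provisioning this way exposes the graph structure: the feasibility edges form a bipartite graph, and replacing the greedy step with a maximum matching or flow computation is one of the graph-theoretical directions the paper points to.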

Journal ArticleDOI
TL;DR: The event of the collapse and eventual sinking of a concrete offshore platform in the North Sea is presented as a case study where a serious error in the finite element analysis played a crucial role leading to structural failure and collapse.
Abstract: Computer simulations and computational methods such as Finite Element Analysis (FEA) have become essential methodologies in science and engineering in recent decades, across a wide variety of academic fields. Six decades after the invention of the digital computer, advanced FE simulations are used to enhance and leapfrog theoretical and experimental progress at different levels of complexity. Particularly in civil and structural engineering, significant research work has lately been devoted to the development of FE simulation codes, methodologies and validation techniques for understanding the behavior of large and complex structures such as buildings, bridges, dams, offshore structures and others. These efforts are aimed at designing structures that are resilient to natural excitations (wind loads, earthquakes, floods) as well as human-made threats (impact, fire, explosion and others). The skill set required to master advanced FEA is inherently interdisciplinary, requiring in-depth knowledge of advanced mathematics, numerical methods and their computational implementation, as well as the engineering sciences. In this paper, we focus on the importance of sound and profound engineering education and knowledge of the theory behind the Finite Element Method for obtaining correct and reliable analysis results when designing real-world structures. We highlight common mistakes made by structural engineers while simulating complex structures and the risk of structural damage due to human mistakes or errors in the model assumptions. The collapse and eventual sinking of a concrete offshore platform in the North Sea is presented as a case study in which a serious error in the finite element analysis played a crucial role, leading to structural failure and collapse.
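To make the point concrete, here is a minimal 1D finite-element sketch (an illustrative axial bar with linear elements, not taken from the paper): the assembly and boundary-condition steps below are exactly where modeling errors, such as a wrong element stiffness or a missing support, silently enter the analysis.

```python
def assemble_stiffness(n_elems, EA, length):
    """Global stiffness matrix for a uniform axial bar split into n_elems
    two-node linear elements; each element contributes EA/Le * [[1,-1],[-1,1]]."""
    k = EA / (length / n_elems)
    n = n_elems + 1
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elems):
        K[e][e]     += k; K[e][e + 1]     -= k
        K[e + 1][e] -= k; K[e + 1][e + 1] += k
    return K

def solve_fixed_free(K, tip_load):
    """Fix node 0, apply tip_load at the free end, solve the reduced K u = f
    by Gaussian elimination (no pivoting; K is SPD after reduction)."""
    n = len(K)
    A = [row[1:] for row in K[1:]]        # drop the fixed degree of freedom
    f = [0.0] * (n - 1); f[-1] = tip_load
    m = len(A)
    for i in range(m):                    # forward elimination
        for j in range(i + 1, m):
            r = A[j][i] / A[i][i]
            for c in range(i, m):
                A[j][c] -= r * A[i][c]
            f[j] -= r * f[i]
    u = [0.0] * m
    for i in reversed(range(m)):          # back substitution
        s = f[i] - sum(A[i][c] * u[c] for c in range(i + 1, m))
        u[i] = s / A[i][i]
    return [0.0] + u                      # prepend the fixed node

K = assemble_stiffness(4, 1000.0, 2.0)    # EA = 1000, bar length = 2
u = solve_fixed_free(K, 10.0)             # 10-unit tip load
```

For a uniform bar fixed at one end under a tip load, the computed tip displacement matches the closed-form value F·L/(EA); comparing FE output against such hand calculations is precisely the kind of sanity check the paper argues engineers must be educated to perform.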

Journal ArticleDOI
TL;DR: The paper provides a thorough understanding of the nature of the problem, the methods used for data balancing, the learning objectives and assessment metrics used for obtaining measurable performance, the stated research solutions and the imbalance problem with multiple classes.
Abstract: The class imbalance problem presents an important challenge to the data mining community: the number of examples of one class far exceeds that of the others, so the dataset is characterized by an uneven distribution of cases across classes. In this paper, our goal is to study the various challenges of the class imbalance problem and provide a comparative study of current research developments in learning from imbalanced data. We provide a thorough understanding of the nature of the problem, the methods used for data balancing, the learning objectives and assessment metrics used for obtaining measurable performance, the stated research solutions and the imbalance problem with multiple classes. This paper highlights the significant opportunities and challenges in the field and provides potential future research directions for the class imbalance problem.
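Two of the surveyed ingredients can be sketched briefly (illustrative implementations, not tied to any specific method in the survey): random oversampling as a data-level balancing method and balanced accuracy as an assessment metric that is not dominated by the majority class.

```python
import random

def random_oversample(X, y, seed=0):
    """Duplicate minority-class examples until every class matches the
    largest class in size (the simplest data-level balancing method)."""
    rng = random.Random(seed)
    by_cls = {}
    for xi, yi in zip(X, y):
        by_cls.setdefault(yi, []).append(xi)
    target = max(len(rows) for rows in by_cls.values())
    Xr, yr = [], []
    for cls, rows in by_cls.items():
        extra = [rng.choice(rows) for _ in range(target - len(rows))]
        for xi in rows + extra:
            Xr.append(xi); yr.append(cls)
    return Xr, yr

def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recalls; a majority-only classifier scores 1/n_classes
    instead of the inflated plain accuracy."""
    recalls = []
    for c in set(y_true):
        idx = [i for i, t in enumerate(y_true) if t == c]
        recalls.append(sum(y_pred[i] == c for i in idx) / len(idx))
    return sum(recalls) / len(recalls)
```

On a 9:1 dataset, a classifier that always predicts the majority class scores 90% plain accuracy but only 50% balanced accuracy, which is why plain accuracy is a misleading objective under imbalance.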

Journal ArticleDOI
TL;DR: A sight word reading strategy incorporated into a mobile application is an effective approach to help dyslexic children improve their reading skills.
Abstract: Dyslexia is a term for a learning disorder marked by difficulty in identifying the speech and sound of letters and words, causing reading difficulty. Most children with dyslexia use the greater part of their senses to connect with their environment. One of the difficulties faced by dyslexic children is trouble expressing their emotions or thoughts through verbal or written communication due to limited vocabulary, which is caused by problems in perceiving letters, sounds and the meaning of the word as a whole. Sight word reading is a methodology with various reading stages and engaging games intended to create a fun reading experience. Currently, the dyslexia centre (i.e., the Learning and Resource Centre, Dyslexia Association of Sarawak) does not adopt Information and Communication Technology (ICT) in its teaching and learning processes. Hence, in this study, a mobile application that utilises a sight word reading strategy has been produced with the aim of helping children with dyslexia build up their reading skills. The sight word reading strategy is incorporated into three different modules in the mobile application, namely short stories, rhymes and song verses, according to the suggestions given by the instructors at the dyslexia centre. This paper presents a study on the effect of the sight word reading strategy in a mobile application. The main contribution of this study is the utilisation of the sight word reading strategy to enhance the application features, namely the story, rhyme and song modules. An improved score board is added to monitor the progress of the child. This mobile application, Mr. Read V2.0, has been tested at the Learning and Resource Centre, Dyslexia Association of Sarawak with the instructors and dyslexic children, together with their parents. The children's test scores, taken before and after using Mr. Read V2.0, improved by 28%.
The overall results of the testing session showed that 100% of respondents (instructors, parents and children) either agreed or strongly agreed that this mobile application can improve reading skills through the additional features added to Mr. Read V2.0. A sight word reading strategy incorporated into a mobile application is an effective approach to help dyslexic children improve their reading skills.

Journal ArticleDOI
TL;DR: This study discusses the tools proposed for modeling with IFML and provides a comparative analysis based on various criteria; the result can serve as a basis for tool selection for specific aspects.
Abstract: Modeling approaches based on standards are of paramount importance in the field of front-end design for web and mobile applications. Developers and researchers often encounter problems when selecting design tools, particularly for applications constrained by time and cost in industry and academia. The Interaction Flow Modeling Language (IFML) is a recently standardized modeling language designed for managing the content expression, user interaction and behavior control of front-end applications. IFML brings several benefits to the development process of web and mobile front-end applications. Thus, several tools have been developed to exploit the technical artifacts offered by the current specification. In this study, we discuss the tools proposed for modeling with IFML and provide a comparative analysis based on various criteria. The result can serve as a basis for tool selection for specific aspects.