
Showing papers in "Intelligent Decision Technologies in 2020"



Journal ArticleDOI
TL;DR: The proposed DNN is characterized by box-constraints applied to the weights during the training process, which restrict them from taking large values so that all inputs and neurons of the DNN are efficiently exploited and explored.
Abstract: The prediction of stock index movement is considered a rather significant objective in the financial world, since a reasonably accurate prediction can yield profit in the stock exchange, bringing high financial benefits and hedging against market risks. Undoubtedly, the area of financial analysis has changed dramatically from a rather qualitative science to a more quantitative one, based also on knowledge extraction from databases. In recent years, deep learning has become a significant prediction tool for analyzing and exploiting the knowledge acquired from financial data. In this paper, we propose a new Deep Neural Network (DNN) prediction model for forecasting stock exchange index movement. The proposed DNN is characterized by the application of conditions on the weights in the form of box-constraints during the training process. The motivation for placing these constraints is to define the weights in the trained network in a more uniform way, by restricting them from taking large values so that all inputs and neurons of the DNN are efficiently exploited and explored. The training of the new DNN model is performed by a Weight-Constrained Deep Neural Network (WCDNN) training algorithm, which exploits the numerical efficiency and very low memory requirements of L-BFGS (Limited-memory Broyden–Fletcher–Goldfarb–Shanno) matrices together with a gradient-projection strategy for handling the bounds on the weights of the network. The performance evaluation carried out on three popular stock exchange indices demonstrates the classification efficiency of the proposed algorithm.
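To make the box-constraint idea concrete, the sketch below uses a plain projected-gradient step in PyTorch: after every update, each weight is clamped back into a box. This is only an illustration of the constraint, not the paper's WCDNN algorithm (which uses L-BFGS matrices with gradient projection); the bound value, architecture, and data are placeholders.

```python
# Minimal sketch of box-constrained weight training in PyTorch (not the
# paper's exact WCDNN/L-BFGS algorithm): after every gradient step, project
# each weight back into the box [-bound, bound].
import torch
import torch.nn as nn

bound = 1.0  # illustrative bound; the paper's constraint values may differ

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 10)          # stand-in for index features
y = torch.randint(0, 2, (64,))   # stand-in for up/down movement labels

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    with torch.no_grad():        # projection step onto the box constraints
        for p in model.parameters():
            p.clamp_(-bound, bound)
```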

13 citations


Journal ArticleDOI
TL;DR: This work divides the Moodle data sets from an annual postgraduate program at the Hellenic Open University into six periods for each section, and applies data mining techniques to analyze the activity, polarity and emotions of tutors and students in order to predict students' grades.
Abstract: The lack of physical contact and the demanding need for personalized services have prompted stakeholders in distance learning to benefit from the enormous volume of students' online traces in Learning Management Systems. Data mining methodologies are widely applied to analyze data logs and predict trends for early and efficient interventions. Thus, the retention of students in the educational process can be achieved, with positive effects on the reputation and finances of the institutions. This work divides the Moodle data sets from six different sections of an annual postgraduate program at the Hellenic Open University into six periods for each section, corresponding to the number of written assignments. It then applies data mining techniques to analyze the activity, polarity and emotions of tutors and students in order to predict students' grades. The results identify the algorithm with the highest precision for each prediction. In addition, the research concludes that polarity and emotions as independent variables yield better performance in the comparative models. Moreover, tutors' variables stand out as an important factor for more accurate predictions of student grades. Finally, a comparison of actual and predicted grades indicates which students have used a third party to complete their assignments.
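As a rough illustration of this kind of grade prediction, the sketch below trains a regressor on activity and sentiment features. The feature columns, data, and the choice of a random forest are hypothetical stand-ins; the paper evaluates several algorithms and reports the most precise one per period.

```python
# Illustrative only: predicting grades from activity and sentiment features.
# The feature columns and data are hypothetical stand-ins, not the study's
# Moodle variables.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# hypothetical features: logins, forum posts, student polarity, tutor polarity
X = rng.random((200, 4))
y = rng.random(200) * 10  # stand-in for assignment grades on a 0-10 scale

model = RandomForestRegressor(n_estimators=100, random_state=0)
mae = -cross_val_score(model, X, y,
                       scoring="neg_mean_absolute_error", cv=5).mean()
print("cross-validated MAE:", mae)
```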

11 citations


Journal ArticleDOI
TL;DR: The objective of this study is to develop a method for helping accurate diagnosis of different diseases based on various classification methods, using datasets taken from the Machine Learning Repository of the University of California at Irvine.
Abstract: There are many useful data mining methods for the diagnosis of diseases and cancers. However, early diagnosis of a disease or cancer can significantly affect the chance of patient survival in some cases. The objective of this study is to develop a method for helping accurate diagnosis of different diseases based on various classification methods. Knowledge collection from domain experts is challenging, inaccessible and time-consuming, so we design a multi-classifier using a dynamic classifier and clustering selection approach to take advantage of these methods based on the data. We combine Forward-backward and Principal Component Analysis for feature reduction. The multi-classifier evaluates three clustering methods and ascertains the best classification methods in each cluster based on some training data. In this study, we use ten datasets taken from the Machine Learning Repository of the University of California at Irvine (UCI). The proposed multi-classifier improves both computation time and accuracy compared with all other classification methods. It achieves maximum accuracy with minimum standard deviation over the sampled datasets.
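A minimal sketch of the cluster-then-select idea is given below: the training data are clustered, the best-scoring of several candidate classifiers is chosen per cluster by cross-validation, and each sample is routed to its cluster's classifier. The synthetic data, the three candidate models, and k = 3 are illustrative assumptions, not the authors' exact pipeline (which also adds feature reduction and compares three clustering methods).

```python
# Sketch of cluster-wise dynamic classifier selection (illustrative, not the
# authors' exact pipeline): cluster the training data, pick the classifier
# with the best cross-validated accuracy in each cluster, then route each
# sample to the classifier of its cluster.
import numpy as np
from sklearn.base import clone
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

candidates = [KNeighborsClassifier(), SVC(), DecisionTreeClassifier()]
chosen = {}
for c in range(3):
    mask = km.labels_ == c
    if np.bincount(y[mask], minlength=2).min() < 2:
        # degenerate cluster: too few samples of one class to cross-validate
        chosen[c] = clone(candidates[-1]).fit(X[mask], y[mask])
        continue
    best = max(candidates,
               key=lambda m: cross_val_score(m, X[mask], y[mask], cv=2).mean())
    chosen[c] = clone(best).fit(X[mask], y[mask])

clusters = km.predict(X)  # route each sample to its cluster's classifier
preds = np.array([chosen[c].predict(x.reshape(1, -1))[0]
                  for c, x in zip(clusters, X)])
print("training accuracy of the multi-classifier:", (preds == y).mean())
```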

5 citations


Journal ArticleDOI
TL;DR: The objective of this paper is to optimize the rate of transmission for uploading data to cloud storage using rclone, and to create an API for the archival solution that authenticates the cloud storage using the JOSS authentication model and its authentication URL.
Abstract: Cloud computing is widely used to support industrial and academic applications worldwide. Archiving data in the cloud is one of the challenging tasks. To handle this issue, the authors propose a model that authenticates the cloud storage using JOSS (Java library for OpenStack Storage) through an authentication URL. The proposed model, developed with JOSS, authenticates heterogeneous data while it is uploaded into the cloud storage. To address this research question and demonstrate our approach, we created an API for the archival solution that authenticates the cloud storage using the JOSS authentication model and its authentication URL, which carries the user id and password for verifying the user while transmitting data into the cloud storage. APIs are becoming dynamic and complex, and developing an appropriate standalone application interface is one of the major challenges faced by developers. The extensible authentication mechanism requires a URL in order to connect to the cloud storage system. The objective of this paper is to optimize the rate of transmission for uploading data to the cloud storage using rclone.
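The paper works with the Java JOSS library; the Python sketch below only mirrors the same authentication-URL flow against a Swift-style v1.0 endpoint, where the user id and password are exchanged for a storage URL and token. The endpoint, credentials, container, and object names are all placeholders.

```python
# Sketch of the authentication-URL flow for Swift-style (OpenStack) storage:
# exchange user id/password for a storage URL and token, then upload with
# the token. The paper itself uses the Java JOSS library for this.
import requests

AUTH_URL = "https://swift.example.com/auth/v1.0"  # hypothetical endpoint

resp = requests.get(AUTH_URL, headers={
    "X-Auth-User": "tenant:user",  # user id verified by the auth service
    "X-Auth-Key": "secret",        # password verified by the auth service
})
resp.raise_for_status()
storage_url = resp.headers["X-Storage-Url"]  # root of the object store
token = resp.headers["X-Auth-Token"]         # sent with every later request

# upload one object into a container using the issued token
requests.put(f"{storage_url}/archive/report.bin",
             headers={"X-Auth-Token": token},
             data=b"archived payload")
```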


Journal ArticleDOI
TL;DR: This work proposes a classification system for latent fingerprints in which conventional supervised binary classifiers are combined into a cascade/stack of classifiers and fed with all features or optimal feature set(s), to separate the ridge patterns of fingermarks from the non-ridge background.
Abstract: Segmentation and classification of latent fingerprints is a young and challenging area of research. Latent fingerprints are unintentional fingermarks. These marks are ridge patterns left at crime scenes, and the lifted images often give only a faint or unclear view of the fingermarks, making it difficult to find the guilty party. The segmentation of lifted images of such finger impressions comes with some unique challenges in this domain, such as poor-quality images, incomplete ridge patterns, overlapping prints, etc. The classification of poorly acquired data can be improved with image pre-processing, by feeding all extracted features or an optimal subset of them to suitable classifiers, etc. Our classification system consists of two main steps. First, various effective extracted features are compartmentalised into maximal independent sets with high correlation values. Second, conventional supervised binary classifiers are combined into a cascade/stack of classifiers. These classifiers are fed with all or optimal feature set(s) for binary classification of fingermarks as ridge patterns versus non-ridge background. The experiments show an improvement in accuracy on the IIIT-D database with the supervised algorithms.
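For flavor, here is a minimal stacked-classifier sketch for the ridge/non-ridge decision. The per-patch features, labels, and the particular base learners are made-up stand-ins; the paper's cascade and its correlation-based feature grouping are not reproduced here.

```python
# Illustrative stacked ensemble for the ridge vs. non-ridge decision; the
# per-patch features, labels, and base learners are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.random((300, 8))           # stand-in for per-patch texture features
y = rng.integers(0, 2, 300)        # 1 = ridge pattern, 0 = background

stack = StackingClassifier(
    estimators=[("svm", SVC(probability=True)),
                ("rf", RandomForestClassifier(n_estimators=50))],
    final_estimator=LogisticRegression())
stack.fit(X, y)
print(stack.predict(X[:5]))        # predicted labels for five patches
```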


Journal ArticleDOI
TL;DR: A Secure Energy Efficient Ant Routing Algorithm (SEEARA), based on the Ant Colony Optimization (ACO) algorithm and cryptographic primitives, which applies power control and secure routing between a pair of network nodes and increases the performance and longevity of the network.
Abstract: In Mobile Ad hoc Networks (MANETs), setting up an ideal and efficient route linking the communicating nodes is the essential objective of routing protocols. However, any attack during the routing stage may disrupt communication and paralyze the whole network. Securing the routing process to protect communication between nodes has therefore become a prime concern. In the present study, we propose a Secure Energy Efficient Ant Routing Algorithm (SEEARA), based on the Ant Colony Optimization (ACO) algorithm and cryptographic primitives, which applies power control and secure routing between a pair of network nodes and increases the performance and longevity of the network. Simulation studies show that SEEARA offers a better solution than previously proposed routing protocols.
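As a toy illustration of the ACO side only (SEEARA's cryptographic and power-control details are not modeled here), the sketch below picks a next hop with probability proportional to pheromone^alpha times residual-energy^beta, then evaporates and reinforces the pheromone. All constants and tables are invented for the example.

```python
# Toy sketch of energy-aware ACO next-hop selection (not SEEARA itself):
# a forward ant chooses a neighbor with probability proportional to
# pheromone^alpha * (residual energy)^beta; a backward ant then evaporates
# and reinforces pheromone on the chosen link.
import numpy as np

rng = np.random.default_rng(0)
alpha, beta, evaporation = 1.0, 2.0, 0.1   # illustrative constants

pheromone = {1: 0.5, 2: 0.8, 3: 0.3}   # candidate next hops of current node
energy    = {1: 0.9, 2: 0.4, 3: 0.7}   # residual battery of each candidate

weights = {n: pheromone[n]**alpha * energy[n]**beta for n in pheromone}
total = sum(weights.values())
probs = [weights[n] / total for n in pheromone]
next_hop = rng.choice(list(pheromone), p=probs)

# evaporate everywhere, then reinforce the chosen link
for n in pheromone:
    pheromone[n] *= (1 - evaporation)
pheromone[int(next_hop)] += 0.1
print("next hop:", next_hop, "pheromone:", pheromone)
```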


Journal ArticleDOI
TL;DR: A hybrid approach is proposed that integrates MCDM and clustering for evaluating and comparing regional risk to natural disasters; it reveals how regions vary in natural disaster risk and therefore offers valuable suggestions for disaster risk reduction.
Abstract: Natural disasters, which contribute to economic crises all over the world, play a crucial role in emergency management. The assessment of regional risk to natural disasters is normally studied as a multi-criteria decision making (MCDM) problem in the literature. However, little effort has been devoted to comparing the temporal disaster risk of regions. In this paper, a hybrid approach is proposed that integrates MCDM and clustering for evaluating and comparing regional risk to natural disasters. Our two-stage method is applied to thirty-one Chinese regions over two consecutive years. In the first stage, MCDM is used to prioritize the regions yearly, yielding a set of risk vectors over the given period. In the second stage, K-means clustering is applied to divide the regions into a number of clusters characterized by different risk variation patterns. The derived patterns reveal how regions vary in natural disaster risk and therefore offer valuable suggestions for disaster risk reduction.
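A compressed sketch of the two-stage idea follows. A plain weighted sum stands in for the paper's MCDM step, and K-means groups the regions by their year-over-year risk vectors; the criteria data, equal weights, and k = 4 are all illustrative assumptions.

```python
# Two-stage sketch under simplifying assumptions: a weighted sum stands in
# for the MCDM step, and K-means clusters regions by their yearly risk
# vectors. Data, weights, and the number of clusters are illustrative.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_regions, n_criteria, n_years = 31, 5, 2
weights = np.full(n_criteria, 1.0 / n_criteria)  # equal criterion weights

# stage 1: one MCDM-style risk score per region per year (synthetic criteria)
risk = np.stack([rng.random((n_regions, n_criteria)) @ weights
                 for _ in range(n_years)], axis=1)   # shape (31, 2)

# stage 2: cluster the risk vectors into risk-variation patterns
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(risk)
print(labels)
```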



Journal ArticleDOI
TL;DR: The usability values obtained from the software metrics and the fuzzy model were very similar; this will be useful for software developers to evaluate the usability of any CBSS and will also help them compare different versions of a CBSS in terms of their usability.
Abstract: Component Based Software Engineering (CBSE) provides a way to create a new Component Based Software System (CBSS) by utilizing existing components. The primary reason for this is to minimize software development time, cost and effort. CBSS also increases component reusability. Due to these advantages, software industries are working on CBSS and continuously trying to provide quality products. Usability is one of the major quality factors for CBSS. It should be measured before delivering the software product to the customer, so that any usability flaws can be removed by the software development team. In this paper, work has been done to evaluate the usability of a CBSS based on major usability sub-factors (learnability, operability, understandability and configurability). For this purpose, software metrics are first identified for each usability sub-factor, and the value of each sub-factor is evaluated for a component-based software project. Secondly, the overall usability of the software project is evaluated by using the calculated value of each usability sub-factor. Usability for the same project was also evaluated using a fuzzy approach in MATLAB to validate the experimental work of this research paper. It was identified that the usability values obtained from the software metrics and the fuzzy model were very similar. This research work will be useful for software developers to evaluate the usability of any CBSS and will also help them compare different versions of a CBSS in terms of their usability.
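To show what such an evaluation can look like, the sketch below aggregates four sub-factor scores into a crisp usability value and then fuzzifies it into a linguistic label with triangular membership functions. The scores, weights, and membership ranges are invented for the example; the paper's MATLAB fuzzy model is more elaborate.

```python
# Simplified sketch of fuzzy usability evaluation (scores, weights, and
# membership ranges are made up; the paper's MATLAB model is more elaborate).

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c and peak at b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

scores = {"learnability": 0.72, "operability": 0.65,
          "understandability": 0.80, "configurability": 0.55}
weights = {"learnability": 0.3, "operability": 0.3,
           "understandability": 0.2, "configurability": 0.2}

# crisp aggregate of the four usability sub-factors
usability = sum(scores[k] * weights[k] for k in scores)

# fuzzify the aggregate into a linguistic usability label
memberships = {"low":    tri(usability, 0.00, 0.25, 0.50),
               "medium": tri(usability, 0.25, 0.50, 0.75),
               "high":   tri(usability, 0.50, 0.75, 1.00)}
label = max(memberships, key=memberships.get)
print(f"usability = {usability:.2f} -> {label} {memberships}")
```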