
Showing papers in "International Journal of Information Technology and Computer Science in 2012"


Journal ArticleDOI
TL;DR: A hybrid approach combining a CART classifier with feature selection and bagging is evaluated in terms of accuracy and classification time on various breast cancer datasets.
Abstract: Data mining is the process of analyzing large quantities of data and summarizing it into useful information. In medical diagnosis, the role of data mining approaches is increasing rapidly. In particular, classification algorithms are very helpful in classifying data, which is important in the decision-making process for medical practitioners. Furthermore, various pre-processing and ensemble techniques have been developed to enhance classifier accuracy. In this study, a hybrid approach combining a CART classifier with feature selection and a bagging technique is evaluated in terms of accuracy and time for the classification of various breast cancer datasets.
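The abstract does not include an implementation; a minimal sketch of such a pipeline, assuming scikit-learn and its bundled Wisconsin breast cancer data as a stand-in for the paper's datasets, might look like this:

```python
# Hypothetical sketch: CART-style decision tree with feature selection and bagging,
# evaluated on the Wisconsin breast cancer dataset bundled with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

pipeline = Pipeline([
    ("select", SelectKBest(f_classif, k=10)),                  # feature selection step
    ("bagged_cart", BaggingClassifier(DecisionTreeClassifier(),  # CART-style base learner
                                      n_estimators=25, random_state=0)),
])

scores = cross_val_score(pipeline, X, y, cv=10)
print("10-fold CV accuracy: %.3f" % scores.mean())
```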

158 citations


Journal ArticleDOI
TL;DR: The aim of this paper is to bring together two areas, Artificial Neural Networks (ANN) and Support Vector Machines (SVM), and apply them to image classification by combining many ANNs with one SVM.
Abstract: Image classification is one of the classical problems of concern in image processing. There are various approaches for solving this problem. The aim of this paper is to bring together two areas, Artificial Neural Networks (ANN) and Support Vector Machines (SVM), and apply them to image classification. Firstly, we separate the image into many sub-images based on the features of the images. Each sub-image is classified into the corresponding class by an ANN. Finally, an SVM combines all the classification results of the ANNs. Our proposed classification model therefore brings together many ANNs and one SVM; we denote it ANN_SVM. ANN_SVM has been applied to a Roman numeral recognition application and achieves a precision rate of 86%. The experimental results show the feasibility of our proposed model.
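As a rough illustration of the ANN_SVM idea (not the authors' code), the sketch below trains one small neural network per sub-image and lets an SVM combine their outputs; the digits dataset and the half-image split are assumptions made for brevity:

```python
# Illustrative sketch: several small neural networks classify sub-images and an
# SVM combines their class-probability outputs, in the spirit of ANN_SVM.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

digits = load_digits()                        # 8x8 images as a stand-in corpus
images, y = digits.images, digits.target
Xtr, Xte, ytr, yte = train_test_split(images, y, random_state=0)

# One ANN per sub-image (here: upper and lower halves of each image).
def halves(imgs):
    return [imgs[:, :4, :].reshape(len(imgs), -1),
            imgs[:, 4:, :].reshape(len(imgs), -1)]

anns = [MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                      random_state=0).fit(part, ytr)
        for part in halves(Xtr)]

# The SVM is trained on the concatenated probability outputs of the ANNs.
def ann_features(imgs):
    return np.hstack([ann.predict_proba(part)
                      for ann, part in zip(anns, halves(imgs))])

svm = SVC().fit(ann_features(Xtr), ytr)
print("combined accuracy:", svm.score(ann_features(Xte), yte))
```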

119 citations


Journal ArticleDOI
TL;DR: An improved genetic algorithm is developed by merging two existing scheduling algorithms to schedule tasks, taking into consideration their computational complexity and the computing capacity of processing elements, thereby minimizing both execution time and execution cost.
Abstract: Cloud computing is a rapidly growing area and has been emerging as a commercial reality in the information technology domain. Cloud computing represents a supplement, consumption, and delivery model for IT services that are provided over the Internet on a pay-per-use basis. The scheduling of cloud services to consumers by service providers influences the cost benefit of this computing paradigm. In such a scenario, tasks should be scheduled efficiently so that execution cost and time can be reduced. In this paper, we propose a meta-heuristic based scheduling approach that minimizes both execution time and execution cost. An improved genetic algorithm is developed by merging two existing scheduling algorithms to schedule tasks, taking into consideration their computational complexity and the computing capacity of the processing elements. Experimental results show that, under heavy loads, the proposed algorithm exhibits good performance.
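A toy genetic algorithm for task-to-processor assignment, in the spirit of the scheduling described above, could be sketched as follows; the task lengths, processor speeds, and GA settings are illustrative assumptions, and cost is ignored for simplicity:

```python
# Minimal genetic-algorithm sketch for mapping tasks to processing elements.
import random

random.seed(0)
task_len = [random.randint(5, 50) for _ in range(20)]   # task computational complexity
vm_speed = [1.0, 2.0, 4.0]                               # computing capacity of PEs

def makespan(assign):
    # finish time of the busiest VM under a given task-to-VM assignment
    load = [0.0] * len(vm_speed)
    for t, vm in zip(task_len, assign):
        load[vm] += t / vm_speed[vm]
    return max(load)

def evolve(pop_size=40, generations=100, mutation=0.1):
    pop = [[random.randrange(len(vm_speed)) for _ in task_len] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=makespan)                            # fitness = low makespan
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(task_len))      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:                # random-reset mutation
                child[random.randrange(len(child))] = random.randrange(len(vm_speed))
            children.append(child)
        pop = parents + children
    return min(pop, key=makespan)

best = evolve()
print("best makespan:", round(makespan(best), 2))
```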

109 citations


Journal ArticleDOI
TL;DR: The aim of this work is to compare the performance of three neural networks on the basis of their accuracy, the time taken to build the model, and the training data set size in order to propose the best algorithm for kidney stone diagnosis.
Abstract: Artificial neural networks are often used as powerful discriminating classifiers for tasks in medical diagnosis and early detection of diseases. They have several advantages over parametric classifiers such as discriminant analysis. The objective of this paper is to diagnose kidney stone disease using three different neural network algorithms that have different architectures and characteristics. The aim of this work is to compare the performance of all three neural networks on the basis of their accuracy, the time taken to build the model, and the training data set size. We use Learning Vector Quantization (LVQ), a two-layer feed-forward perceptron trained with the back-propagation algorithm, and Radial Basis Function (RBF) networks for the diagnosis of kidney stone disease. We used the Waikato Environment for Knowledge Analysis (WEKA) version 3.7.5, an open-source tool, as the simulation environment. The data set used for diagnosis is real-world data with 1000 instances and 8 attributes. Finally, we compare the performance of the different algorithms to propose the best algorithm for kidney stone diagnosis. This helps in the early identification of kidney stones in patients and reduces the diagnosis time.

85 citations


Journal ArticleDOI
TL;DR: The main aim of this paper is to explore recent applications of neural networks and artificial intelligence; it provides an overview of the field, describes where AI and ANNs are used, and discusses the critical role they play in different areas.
Abstract: Artificial Neural Networks are a branch of Artificial Intelligence and have been accepted as a new computing technology in computer science. This paper reviews the field of Artificial Intelligence, focusing on recent applications that use Artificial Neural Networks (ANNs) and Artificial Intelligence (AI). It also considers the integration of neural networks with other computing methods, such as fuzzy logic, to enhance the interpretation of data. Artificial Neural Networks are considered a major soft-computing technology and have been extensively studied and applied during the last two decades. The most common applications where neural networks are used for problem solving are pattern recognition, data analysis, control, and clustering. Artificial Neural Networks have abundant features, including high processing speed and the ability to learn the solution to a problem from a set of examples. The main aim of this paper is to explore recent applications of Neural Networks and Artificial Intelligence, provide an overview of the field, describe where AI and ANNs are used, and discuss the critical role they play in different areas.

79 citations


Journal ArticleDOI
TL;DR: This paper identifies parametric and non-parametric classifiers that are used in the classification process and provides a tree representation of these classifiers.
Abstract: In the field of machine learning and data mining, a great deal of work has been done to construct new classification techniques/classifiers, and much research is ongoing to construct further classifiers with the help of nature-inspired techniques such as Genetic Algorithms, Ant Colony Optimization, Bee Colony Optimization, Neural Networks, Particle Swarm Optimization, etc. Many researchers have provided comparative studies and analyses of classification techniques. This paper deals with another form of analysis of classification techniques, namely the analysis of parametric and non-parametric classifiers. It identifies parametric and non-parametric classifiers used in the classification process and provides a tree representation of these classifiers. For the analysis, four classifiers are used, of which two are parametric and the rest are non-parametric in nature.
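For readers unfamiliar with the distinction, a toy comparison of parametric and non-parametric classifiers can be run with a few lines of scikit-learn; the four classifiers and the dataset below are assumptions for illustration, not necessarily those used in the paper:

```python
# Toy comparison of two parametric and two non-parametric classifiers.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
classifiers = {
    "Naive Bayes (parametric)": GaussianNB(),
    "Logistic regression (parametric)": LogisticRegression(max_iter=1000),
    "k-NN (non-parametric)": KNeighborsClassifier(),
    "Decision tree (non-parametric)": DecisionTreeClassifier(random_state=0),
}
for name, clf in classifiers.items():
    print(name, round(cross_val_score(clf, X, y, cv=5).mean(), 3))
```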

66 citations


Journal ArticleDOI
TL;DR: A new approach based on the quantum-inspired cuckoo search algorithm to deal with the 1-BPP is presented; it consists in defining an appropriate quantum representation based on qubits to represent bin packing solutions.
Abstract: The Bin Packing Problem (BPP) is one of the best-known combinatorial optimization problems. The problem consists of packing a set of items into a minimum number of bins. There are several variants of this problem; the most basic is the one-dimensional bin packing problem (1-BPP). In this paper, we present a new approach based on the quantum-inspired cuckoo search algorithm to deal with the 1-BPP. The first contribution consists in defining an appropriate quantum representation based on qubits to represent bin packing solutions. The second contribution is the proposal of a new hybrid quantum measurement operation, which uses the first-fit heuristic to pack the objects left unpacked by the standard measurement operation. The obtained results are very encouraging and show the feasibility and effectiveness of the proposed approach.
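The first-fit heuristic used as the repair step can be sketched in a few lines; the item sizes and bin capacity below are illustrative:

```python
# First-fit heuristic for 1-BPP, the repair step combined with the quantum
# measurement operation in the paper (values below are illustrative).
def first_fit(items, capacity):
    bins = []                                   # each entry holds a bin's remaining capacity
    for item in items:
        for i, free in enumerate(bins):
            if item <= free:                    # place the item in the first bin that fits
                bins[i] = free - item
                break
        else:
            bins.append(capacity - item)        # otherwise open a new bin
    return len(bins)

print(first_fit([4, 8, 1, 4, 2, 1], capacity=10))   # -> 2 bins
```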

59 citations


Journal ArticleDOI
TL;DR: A plagiarism detection tool for comparing Arabic documents to identify potential similarities is presented, based on a new comparison algorithm that uses heuristics to compare suspect documents at different hierarchical levels and avoid unnecessary comparisons.
Abstract: Many language-sensitive tools for detecting plagiarism in natural language documents have been developed, particularly for English. Language-independent tools exist as well, but are considered restrictive as they usually do not take into account specific language features. Detecting plagiarism in Arabic documents is a particularly challenging task because of the complex linguistic structure of Arabic. In this paper, we present a plagiarism detection tool for comparing Arabic documents to identify potential similarities. The tool is based on a new comparison algorithm that uses heuristics to compare suspect documents at different hierarchical levels in order to avoid unnecessary comparisons. We evaluate its performance in terms of precision and recall on a large data set of Arabic documents, and show its capability in identifying direct and sophisticated copying, such as sentence reordering and synonym substitution. We also demonstrate its advantages over other plagiarism detection tools, including Turnitin, the well-known language-independent tool.

55 citations


Journal ArticleDOI
TL;DR: The results show a positive correlation between knowledge management capabilities and organizational performance, and the proposed framework can be used to assess organizational performance as well as a decision tool for deciding which knowledge management capability should be improved.
Abstract: In the present aggressive world of competition, knowledge management strategies are becoming the major vehicle for organizations to achieve their goals, to compete, and to perform well. Linking knowledge management to business performance could make a strong business case in convincing the senior management of any organization of the need to adopt a knowledge management strategy. Organizational performance is, therefore, a key issue, and performance measurement models provide a basis for developing a structured approach to knowledge management. In this respect, organizations need to assess their knowledge management capabilities and find ways to improve their performance. This paper takes these issues into account when studying the role of knowledge management in enhancing organizational performance and, consequently, develops an integrated knowledge management capabilities framework for assessing organizational performance. The results show a positive correlation between knowledge management capabilities and organizational performance. The results also show that the proposed framework can be used to assess organizational performance and can serve as a decision tool for deciding which knowledge management capability should be improved.

54 citations


Journal ArticleDOI
TL;DR: This paper proposes a hybrid algorithm that combines the merits of ACO and Cuckoo Search; Cuckoo Search can perform the local search more efficiently and has only a single parameter apart from the population size.
Abstract: Job scheduling is a type of combinatorial optimization problem. In this paper, we propose a hybrid algorithm that combines the merits of Ant Colony Optimization (ACO) and Cuckoo Search. A limitation of ACO is that the ants walk along the paths where the chemical substance called pheromone has been deposited, which acts as a lure for the artificial ants. Cuckoo Search can perform the local search more efficiently and has only a single parameter apart from the population size. The hybrid algorithm minimizes the makespan, and the resulting schedules can be used in scientific computing and high-performance computing.

52 citations


Journal ArticleDOI
TL;DR: A comparison between these techniques in terms of their precision in exon and intron classification is introduced, and it is found that classification performance is a function of the numerical representation method.
Abstract: Using digital signal processing (DSP) in the genomic field is key to solving most problems in this area, such as the prediction of gene locations in a genomic sequence and the identification of defective regions in a DNA sequence. Using DSP is possible only if the symbol sequences are mapped into numbers. In the literature, many techniques have been developed for the numerical representation of DNA sequences. They can be classified into two types: Fixed Mapping (FM) and Physico-Chemical Property Based Mapping (PCPBM). The open question is: which of these numerical representation techniques should be used? Answering this question requires understanding these numerical representations, considering the fact that each mapping depends on the particular application. This paper addresses this question and introduces a comparison between these techniques in terms of their precision in exon and intron classification. Simulations are carried out using short sequences of the human genome (GRCh37/hg19). The final results indicate that classification performance is a function of the numerical representation method.
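Two widely used fixed-mapping representations can be illustrated briefly; the toy sequence below is an assumption, not taken from GRCh37/hg19:

```python
# Two common fixed-mapping (FM) numerical representations of a DNA sequence:
# the Voss binary indicator sequences and a simple integer mapping.
import numpy as np

seq = "ATGGCGTACGATT"                       # toy sequence for illustration

# Voss representation: one binary indicator sequence per nucleotide.
voss = {b: np.array([1 if c == b else 0 for c in seq]) for b in "ACGT"}

# Integer mapping: A=0, C=1, G=2, T=3 (one of many possible fixed mappings).
integer = np.array(["ACGT".index(c) for c in seq])

print(voss["A"])
print(integer)
```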

Journal ArticleDOI
TL;DR: A handwritten Devanagari script recognition system using a neural network is presented, and the power of a genetic algorithm is used to recognize the characters.
Abstract: A handwritten Devanagari script recognition system using a neural network is presented in this paper. Diagonal-based feature extraction is used to extract features of the handwritten Devanagari script; these features of each character image are then converted into a chromosome bit string of length 378. More than 1000 samples are used for training and testing in this work, and the power of a genetic algorithm is employed to recognize the characters. In step I, preprocessing is performed on the character image to make it suitable for feature extraction. The diagonal-based feature extraction method then extracts 54 features for each character. In the next step, the extracted features are converted into a chromosome bit string of size 378. In the recognition step, a fitness function computes the difference between the chromosome of the unknown character and the chromosomes stored in the database.
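A hedged sketch of the recognition step, using a Hamming-distance-style fitness between bit strings (the stored chromosomes and the shortened bit length are hypothetical):

```python
# Sketch of the recognition step: the fitness function compares the chromosome
# bit string of an unknown character with stored chromosomes. The paper uses
# 378-bit strings; a shorter toy length and made-up labels are used here.
def fitness(unknown, stored):
    # smaller Hamming distance = better match
    return sum(a != b for a, b in zip(unknown, stored))

database = {
    "ka": "110010",          # hypothetical stored chromosomes
    "kha": "101101",
}
unknown = "110011"
best = min(database, key=lambda label: fitness(unknown, database[label]))
print("recognized as:", best)
```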

Journal ArticleDOI
TL;DR: This research aims to design a new methodology to regulate the fuel ratio in internal combustion (IC) engines using the new linear-part sliding mode method (NLPSM), which can adjust the optimal coefficient to obtain the best performance.
Abstract: Internal combustion (IC) engines are optimized to meet exhaust emission requirements with the best fuel economy. Closed-loop combustion control is a key technology used to optimize the engine combustion process to achieve this goal. In order to conduct research in the area of closed-loop combustion control, a control-oriented cycle-to-cycle engine model, containing engine combustion information for each individual engine cycle as a function of engine crank angle, is a necessity. This research aims to design a new methodology to regulate the fuel ratio in internal combustion (IC) engines. The baseline method is a linear methodology that can be applied to highly nonlinear systems (e.g., IC engines). To optimize this method, the new linear-part sliding mode method (NLPSM) is used. This online optimizer can adjust the optimal coefficient to obtain the best performance.

Journal ArticleDOI
Liguo Yu
TL;DR: A detailed study investigating the effectiveness of using negative binomial regression to predict fault-prone software modules under two different conditions, self-assessment and forward assessment, shows that the performance of forward assessment is better than or at least the same as that of self-assessment.
Abstract: Negative binomial regression has been proposed as an approach to predicting fault-prone software modules. However, little work has been reported that studies the strengths, weaknesses, and applicability of this method. In this paper, we present a detailed study investigating the effectiveness of using negative binomial regression to predict fault-prone software modules under two different conditions, self-assessment and forward assessment. The performance of the negative binomial regression model is also compared with another popular fault prediction model, the binary logistic regression method. The study is performed on six versions of an open-source object-oriented project, Apache Ant. The study shows that (1) the performance of forward assessment is better than or at least the same as that of self-assessment; (2) in predicting fault-prone modules, the negative binomial regression model could not outperform the binary logistic regression model; and (3) negative binomial regression is effective in predicting multiple errors in one module.
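A minimal sketch of fitting a negative binomial regression to module fault counts, assuming statsmodels and synthetic data rather than the Apache Ant metrics used in the paper:

```python
# Hedged sketch: negative binomial regression of fault counts on a size metric.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
loc = rng.integers(50, 2000, size=200)                 # e.g. lines of code per module
faults = rng.poisson(lam=loc / 500.0)                  # synthetic fault counts

X = sm.add_constant(np.log(loc))                       # simple single-predictor model
model = sm.GLM(faults, X, family=sm.families.NegativeBinomial())
result = model.fit()
print(result.params)
print("predicted faults for a 1000-LOC module:",
      result.predict([[1.0, np.log(1000)]]))
```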

Journal ArticleDOI
TL;DR: This work provides a neuro-fuzzy system for heart attack detection with eight input fields and one output field, designed so that a patient can use it personally.
Abstract: This work is aimed at providing a neuro-fuzzy system for heart attack detection. The neuro-fuzzy system was designed with eight input fields and one output field. The input variables are heart rate, exercise, blood pressure, age, cholesterol, chest pain type, blood sugar, and sex. The output indicates the risk level of the patient, classified into four categories: very low, low, high, and very high. The data set used was extracted from the database and modeled to make it appropriate for training; then the initial FIS structure was generated, the network was trained with the training data, and it was subsequently tested and validated with the testing data. The output of the system was designed in such a way that the patient can use it personally: the patient only needs to supply some values, which serve as input to the system, and based on the values supplied the system predicts the patient's risk level.

Journal ArticleDOI
TL;DR: Simulation results show that the probability of detection increases significantly as the signal-to-noise ratio increases, and that the detection probability decreases as the bandwidth factor increases.
Abstract: Spectrum sensing is a challenging task for cognitive radio. Energy detection is one of the popular spectrum sensing techniques for cognitive radio. In this paper, we analyze the performance of the energy detection technique in detecting a primary user (PU). Simulation results show that the probability of detection increases significantly as the signal-to-noise ratio increases. It is also observed that the detection probability decreases as the bandwidth factor increases.
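A small Monte-Carlo sketch of an energy detector illustrates the reported trend of detection probability rising with SNR; the sample count, trial count, and false-alarm target are illustrative assumptions:

```python
# Monte-Carlo sketch of an energy detector: Pd versus SNR at a fixed false-alarm rate.
import numpy as np

rng = np.random.default_rng(1)
N, trials, pfa = 1000, 2000, 0.01

# Threshold set from the noise-only energy distribution for the target false-alarm rate.
noise_energy = np.sum(rng.normal(size=(trials, N)) ** 2, axis=1)
threshold = np.quantile(noise_energy, 1 - pfa)

for snr_db in (-15, -10, -5, 0):
    snr = 10 ** (snr_db / 10)
    signal = np.sqrt(snr) * rng.normal(size=(trials, N))   # PU signal modeled as Gaussian
    energy = np.sum((signal + rng.normal(size=(trials, N))) ** 2, axis=1)
    print(f"SNR {snr_db:>3} dB  Pd = {np.mean(energy > threshold):.3f}")
```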

Journal ArticleDOI
TL;DR: The proposed hybrid model is effectively an express version of the Scrum model and possesses the engineering practices necessary to develop quality software as per customer requirements and company objectives.
Abstract: Scrum does not provide any direction about how to engineer a software product; the project team has to adopt a suitable agile process model for the engineering of the software. The XP process model is mainly focused on engineering practices rather than management practices. The design of the XP process makes it suitable for simple and small-sized projects and not appropriate for medium and large projects. A fine integration of management and engineering practices is desperately required to build a quality product that is valuable for customers. In this research, a novel hybrid model framework is proposed to achieve this integration. The proposed hybrid model is effectively an express version of the Scrum model. It possesses the engineering practices that are necessary to develop quality software as per customer requirements and company objectives. A case study is conducted to validate the proposed hybrid model. The results of the case study reveal that the proposed model is an improved version of the XP and Scrum models. Index Terms: XP model, Scrum model, software engineering practices, quality.

Journal ArticleDOI
TL;DR: Using historical purchase data, a predictive response model was developed with data mining techniques to predict the probability that a customer of Ebedi Microfinance bank will respond to a promotion or an offer.
Abstract: Identifying customers who are more likely to respond to new product offers is an important issue in direct marketing. In direct marketing, data mining has been used extensively to identify potential customers for a new product (target selection). Using customers' historical purchase and demographic data, a predictive response model was built with data mining techniques to predict the probability that a customer of Ebedi Microfinance bank will respond to a promotion or an offer. The data were stored in a data warehouse to serve as a management decision support system.
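The abstract does not name a specific learning algorithm; a hedged sketch of one possible response model, using logistic regression on synthetic purchase and demographic features (not the Ebedi Microfinance data), is shown below:

```python
# Illustrative response model: predict the probability that a customer responds
# to an offer from historical purchase/demographic features (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 20, n),        # past purchases
    rng.integers(18, 70, n),       # age
    rng.uniform(0, 500, n),        # average spend
])
responded = (X[:, 0] * 25 + X[:, 2] + rng.normal(0, 100, n)) > 400

Xtr, Xte, ytr, yte = train_test_split(X, responded, random_state=0)
model = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("hold-out accuracy:", model.score(Xte, yte))
print("response probability of first test customer:",
      model.predict_proba(Xte[:1])[0, 1])
```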

Journal ArticleDOI
TL;DR: An RFID-based toll deduction system is examined, along with how to make it more efficient; the system can be considered scalable for implementation in the motor vehicles used today.
Abstract: In this research paper, we examine an RFID-based toll deduction system and how to make it more efficient and reliable. The vehicle is equipped with a radio frequency (RF) tag, which is detected by an RF reader located at the toll plaza. The toll amount is then automatically deducted from the vehicle owner's bank account. The proposed system can be considered scalable for implementation in the motor vehicles used today.

Journal ArticleDOI
TL;DR: A new approach is investigated through the use of fuzzy logic, serving as a base or platform on which to build an intelligent controller that uses a set of well-defined rules to guide its operational performance.
Abstract: Nowadays, many models of computer systems are finding their way into offices, houses, organizations, and remote locations. Any slight malfunction of a computer system's components could lead to the loss of vital data and information. One source of computer system malfunction is overheating of the electronic components. A common method of cooling a computer system is the use of cooling fan(s). Therefore, it is essential to have an appropriate control mechanism for the operation of the computer system's cooling fan in order to save energy and prevent overheating; failure to adopt a well-designed and efficient controller could lead to the malfunction of the computer system. Presently, most controllers in computer systems are based on pulse width modulation, that is, they make use of pulses in the form of the digits 0 and 1. It was observed that inherent noise is still prevalent in the operation of computer systems, and eventual breakdown of components is common. A new approach is therefore investigated through the use of fuzzy logic, serving as a platform on which to build an intelligent controller that uses a set of well-defined rules to guide its operational performance. A Mamdani-type fuzzy inference system and a Sugeno-type fuzzy inference system were used, each with two input sets and a single output function. Simulation was carried out on the MATLAB R2007a platform, and the operational performances of the two approaches were compared. Simulated results of the performance of the Mamdani-type and Sugeno-type fuzzy inference system based controllers are presented accordingly.

Journal ArticleDOI
TL;DR: The proposed scheme is non-blind and strongly robust to different attacks such as compression, scaling, rotation, cropping, and noise addition; it is tested with standard database images of size 512x512 and a watermark of size 64x64.
Abstract: In this paper, a DWT-SVD based color image watermarking technique in the YUV color space using the Arnold transform is proposed. The RGB color image is converted into the YUV color space. The image is decomposed by a 3-level DWT and then SVD is applied. Security is increased by scrambling the watermark using the Arnold transform. The watermark is embedded in the HL3 region of all of the Y, U, and V channels. The decomposition is done with the Haar wavelet, which is simple, symmetric, and orthogonal, and a direct weighting factor is used in the watermark embedding and extraction process. PSNR and Normalized Correlation (NC) values are tested for 10 different values of the flexing factor. We obtained a maximum PSNR of up to 52.3337 for the Y channel and an average NC value of 0.99, indicating good recovery of the watermark. The proposed scheme is non-blind and strongly robust to different attacks such as compression, scaling, rotation, cropping, and noise addition, and is tested with standard database images of size 512x512 and a watermark of size 64x64.
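A rough sketch of the embedding step, assuming PyWavelets and NumPy and omitting the Arnold scrambling and the U/V channels; the weighting factor and the random stand-in arrays are placeholders:

```python
# Sketch of DWT-SVD embedding: 3-level Haar DWT of the Y channel, SVD of a
# level-3 detail sub-band, additive embedding of the watermark's singular values.
import numpy as np
import pywt

alpha = 0.05                                 # illustrative weighting (flexing) factor
y_channel = np.random.rand(512, 512)         # stand-in for the host image's Y channel
watermark = np.random.rand(64, 64)           # stand-in for the 64x64 watermark

coeffs = pywt.wavedec2(y_channel, "haar", level=3)
cA3, (cH3, cV3, cD3) = coeffs[0], coeffs[1]  # level-3 sub-bands (64x64 each)

U, S, Vt = np.linalg.svd(cH3)                # treat this detail band as "HL3"
Uw, Sw, Vtw = np.linalg.svd(watermark)

S_marked = S + alpha * Sw                    # embed the watermark singular values
coeffs[1] = (U @ np.diag(S_marked) @ Vt, cV3, cD3)

watermarked_y = pywt.waverec2(coeffs, "haar")
print("max abs change in the Y channel:", np.abs(watermarked_y - y_channel).max())
```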

Journal ArticleDOI
TL;DR: The results showed that the proposed framework is applicable and implementable in e-services evaluation; it may assist decision makers and e-service system designers in considering different criteria and measures before committing to a particular choice of e-service, or in evaluating any existing e-service system.
Abstract: The introduction of e-service solutions within the public sector has primarily been concerned with moving away from traditional information monopolies and hierarchies. E-services aim at increasing the convenience and accessibility of government services and information for citizens. Providing services to the public through the Web may lead to faster and more convenient access to government services with fewer errors. It also means that governmental units may realize increased efficiencies, cost reductions, and potentially better customer service. The main objectives of this work are to study and identify the success criteria of e-service delivery and to propose a comprehensive, multidimensional framework of e-service success. To examine the validity of the proposed framework, a sample of 200 e-service users was asked to assess their perspectives on e-service delivery in some Egyptian organizations. The results showed that the proposed framework is applicable and implementable in e-services evaluation; they also show that the framework may assist decision makers and e-service system designers in considering different criteria and measures before committing to a particular choice of e-service, or in evaluating any existing e-service system. Index Terms: IS success model, e-services success model, e-services success measurement framework.

Journal ArticleDOI
TL;DR: The performances of 130 W (Solara PV) and 100 W (Sunworth PV) solar modules are evaluated using a single-diode equivalent circuit that can simulate both the I-V and P-V characteristic curves and is used to study the effect of the operating temperature, diode ideality factor, series resistance, and solar irradiance level on model performance.
Abstract: The performances of 130 W (Solara PV) and 100 W (Sunworth PV) solar modules are evaluated using a single-diode equivalent circuit. The equivalent circuit is able to simulate both the I-V and P-V characteristic curves, and is used to study the effect of the operating temperature, diode ideality factor, series resistance, and solar irradiance level on the model performance. The resulting PV characteristic curves are compared with the parameters from the manufacturers for each model. Afterwards, the Solara PV model is tested under different irradiance levels. The relationship between the module power and its current under different irradiance levels is plotted, so that if a solar power meter (pyrheliometer) is not available, the irradiance-current (G-I) curve can be used to measure the solar radiation power. The measurement is achieved by moving the solar panel by a certain angle toward the solar radiation and then measuring the corresponding current.
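A minimal single-diode sketch, with illustrative parameter values rather than the Solara or Sunworth datasheet values, solves the implicit I-V equation numerically:

```python
# Single-diode equivalent-circuit sketch: solve the implicit equation
# I = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh for I at each V.
import numpy as np
from scipy.optimize import brentq

q, k = 1.602e-19, 1.381e-23
T = 298.15                                   # cell temperature [K]
Vt = k * T / q                               # thermal voltage
Iph, I0, n, Rs, Rsh, Ns = 8.0, 1e-9, 1.3, 0.3, 300.0, 36   # illustrative module parameters

def current(V):
    f = lambda I: (Iph - I0 * (np.exp((V + I * Rs) / (n * Ns * Vt)) - 1)
                   - (V + I * Rs) / Rsh - I)
    return brentq(f, -1.0, Iph + 1.0)        # root of the implicit equation

voltages = np.linspace(0, 22, 50)
currents = np.array([max(current(v), 0.0) for v in voltages])
powers = voltages * currents                 # P-V curve follows from the I-V curve
print("approx. max power point: %.1f W at %.1f V"
      % (powers.max(), voltages[powers.argmax()]))
```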

Journal ArticleDOI
TL;DR: This paper proposes a hybrid model, named the SPRUP model, that combines the strengths of Scrum, XP, and RUP while eliminating their weaknesses to produce high-quality software.
Abstract: Scrum and Extreme Programming (XP) are frequently used models among the agile models, whereas the Rational Unified Process (RUP) is one of the widely used conventional plan-driven software development models. The agile and plan-driven approaches both have their own strengths and weaknesses. The RUP model has certain drawbacks, such as a tendency to run over budget, slow adaptation to rapidly changing requirements, and a reputation for being impractical for small and fast-paced projects. The XP model has drawbacks such as weak documentation and poor performance for medium and large development projects. XP has a concrete set of engineering practices that emphasizes team work, where managers, customers, and developers are all equal partners in collaborative teams. Scrum is more concerned with project management. It has seven practices, namely the Scrum Master, Scrum teams, the Product Backlog, the Sprint, the Sprint Planning Meeting, the Daily Scrum Meeting, and the Sprint Review. Keeping the above context in view, this paper proposes a hybrid model, named the SPRUP model, that combines the strengths of Scrum, XP, and RUP while eliminating their weaknesses to produce high-quality software. The proposed SPRUP model is validated through a controlled case study.

Journal ArticleDOI
TL;DR: A new cognitive complexity metric, namely cognitive weighted coupling between objects (CWCBO), for measuring coupling in object-oriented systems is presented.
Abstract: Analyzing object-oriented systems in order to evaluate their quality gains importance as the paradigm continues to increase in popularity. Consequently, several object-oriented metrics have been proposed to evaluate different aspects of these systems, such as class coupling. This paper presents a new cognitive complexity metric, namely cognitive weighted coupling between objects (CWCBO), for measuring coupling in object-oriented systems. In this metric, five types of coupling that may exist between classes (control coupling, global data coupling, internal data coupling, data coupling, and lexical content coupling) are considered in computing the CWCBO.

Journal ArticleDOI
TL;DR: This paper aims to answer the question of how semantic interoperability between two databases can be achieved by using Formal Concept Analysis (FCA) and Information Flow (IF) theories.
Abstract: As databases become widely used, there is a growing need to translate information between multiple databases. Semantic interoperability and integration have been a long-standing challenge for the database community and have now become a prominent area of database research. In this paper, we aim to answer the question of how semantic interoperability between two databases can be achieved by using Formal Concept Analysis (FCA for short) and Information Flow (IF for short) theories. For our purposes, we first discover knowledge from different databases by using FCA, and then align what is discovered by using IF and FCA. The development of FCA has led to software systems such as TOSCANA and TUPLEWARE, which can be used as tools for discovering knowledge in databases. A prototype based on IF and FCA has been developed. Our method is tested and verified by using this prototype and TUPLEWARE.

Journal ArticleDOI
TL;DR: A non-linear programming mathematical model is developed to determine the optimum value of the time quantum, in order to minimize the average waiting time of the processes.
Abstract: Process scheduling is one of the most important tasks of the operating system. One of the most common scheduling algorithms used by most operating systems is the Round Robin method, in which the ready processes waiting in the ready queue seize the processor circularly for a short period of time known as the quantum (or time slice). In this paper, a non-linear programming mathematical model is developed to determine the optimum value of the time quantum, in order to minimize the average waiting time of the processes. The model is implemented and solved with the Lingo 8.0 software on four problems selected from the literature.
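A brute-force counterpart to this optimization can be sketched by simulating Round Robin and scanning candidate quanta; the burst times below are the classic textbook example, not the paper's test problems, and all processes are assumed to arrive at time zero:

```python
# Simulate Round Robin for a set of burst times and scan candidate time quanta
# for the lowest average waiting time (illustrative alternative to the NLP model).
def avg_waiting_time(bursts, quantum):
    remaining = list(bursts)
    waiting = [0] * len(bursts)
    while any(r > 0 for r in remaining):
        for i, r in enumerate(remaining):
            if r <= 0:
                continue
            run = min(quantum, r)
            # every other unfinished process waits while process i runs
            for j, rj in enumerate(remaining):
                if j != i and rj > 0:
                    waiting[j] += run
            remaining[i] -= run
    return sum(waiting) / len(waiting)

bursts = [24, 3, 3]
best_q = min(range(1, 25), key=lambda q: avg_waiting_time(bursts, q))
print("best quantum:", best_q,
      "average waiting time:", avg_waiting_time(bursts, best_q))
```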

Journal ArticleDOI
TL;DR: A proposed Web programming language is analyzed with five Web browsers in terms of their performance in processing the encryption of the language's scripts, in order to determine which algorithm works best and is most compatible with which Web browser.
Abstract: Hacking is the greatest problem in wireless local area networks (WLANs). Many algorithms, such as DES, 3DES, AES, UMARAM, RC6, and UR5, have been used to prevent outside attacks that eavesdrop on data or prevent it from being transferred to the end user correctly. We analyze a proposed Web programming language with five Web browsers in terms of their performance in processing the encryption of the language's scripts. This is followed by test simulations in order to determine the best encryption algorithm for each Web browser. The results of the experimental analysis are presented in the form of graphs. We conclude from the findings that different algorithms perform differently in different Web browsers, such as Internet Explorer, Mozilla Firefox, Opera, and Netscape Navigator; hence, we determine which algorithm works best and is most compatible with which Web browser. A comparison has been conducted for these encryption algorithms at different settings, such as encryption/decryption speed in the different Web browsers. Experimental results are given to demonstrate the effectiveness of each algorithm.

Journal ArticleDOI
TL;DR: It is found that this area still requires a great deal of work and needs to focus on new intelligent approaches so that human inactivity periods for computer systems can be reduced intelligently.
Abstract: Recently, the demand for "Green Computing", which represents an environmentally responsible way of reducing power consumption and involves various environmental issues such as waste management and greenhouse gases, has been increasing explosively. We lay great emphasis on the need to minimize power consumption and heat dissipation by computer systems, as well as on the requirement to change the current power scheme options in their operating systems (OS). In this paper, we provide a comprehensive technical review of the existing, though challenging, work on minimizing power consumption by computer systems through various approaches, with emphasis on the software approach using dynamic power management, as used by most OSs in their power scheme configurations, seeking a better understanding of power management schemes, current issues, and future directions in this field. We review the various approaches and techniques, including hardware, software, central processing unit (CPU) usage, and algorithmic approaches for power economy. On the basis of our analysis and observations, we find that this area still requires a great deal of work and needs to focus on new intelligent approaches so that human inactivity periods for computer systems can be reduced intelligently.

Journal ArticleDOI
TL;DR: A detailed survey of frequent subgraph mining algorithms, which are used for knowledge discovery in complex objects, is presented, and a framework is proposed for classifying these algorithms.
Abstract: Data mining algorithms face the challenge of dealing with an increasing number of complex objects. The graph is a natural data structure for modeling complex objects. Frequent subgraph mining is an active research topic in data mining. A graph is a general model to represent data and has been used in many domains such as cheminformatics and bioinformatics. Mining patterns from graph databases is challenging since graph-related operations, such as subgraph testing, generally have higher time complexity than the corresponding operations on itemsets, sequences, and trees. Many frequent subgraph mining algorithms have been proposed; SPIN, SUBDUE, gSpan, FFSM, and GREW are a few to mention. In this paper, we present a detailed survey of frequent subgraph mining algorithms, which are used for knowledge discovery in complex objects, and also propose a framework for classifying these algorithms. The purpose is to help users apply the techniques in a task-specific manner in various application domains and to pave the way for further research.
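The subgraph isomorphism test that makes this kind of mining expensive can be illustrated with networkx; the candidate pattern and the tiny graph database below are assumptions for illustration:

```python
# The costly core operation in frequent subgraph mining is subgraph isomorphism
# testing: count in how many database graphs a candidate pattern occurs.
import networkx as nx
from networkx.algorithms import isomorphism

triangle = nx.cycle_graph(3)                       # candidate pattern
database = [nx.complete_graph(4), nx.path_graph(5), nx.cycle_graph(6)]

support = sum(
    isomorphism.GraphMatcher(g, triangle).subgraph_is_isomorphic()
    for g in database
)
print("pattern support:", support, "of", len(database))   # the triangle occurs in 1 graph
```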