
Showing papers in "International Journal of Computer Applications Technology and Research in 2014"


Journal ArticleDOI
TL;DR: This paper presents an alternative method of implementing ALPR systems using Free Software including Python and the Open Computer Vision Library.
Abstract: An Automatic License Plate Recognition (ALPR) system is a real-time embedded system which automatically recognizes the license plates of vehicles. Applications range from complex security systems to common areas, and from parking admission to urban traffic control. ALPR is a complex task due to diverse effects such as lighting and vehicle speed. Most ALPR systems are built using proprietary tools like Matlab. This paper presents an alternative method of implementing ALPR systems using Free Software, including Python and the Open Computer Vision Library.

21 citations


Journal ArticleDOI
TL;DR: In the proposed methodology, features are extracted from raw images and fed to ANFIS (Adaptive Neuro-Fuzzy Inference System), which proves to be a sophisticated framework for multi-object classification.
Abstract: Manual classification of brain tumors is time-consuming and yields ambiguous results. Automatic image classification is an emerging research area in the medical field. In the proposed methodology, features are extracted from raw images and fed to ANFIS (Adaptive Neuro-Fuzzy Inference System). Being a neuro-fuzzy system, ANFIS harnesses the power of both approaches, making it a sophisticated framework for multi-object classification. A comprehensive feature set and fuzzy rules are selected to classify an abnormal image into the corresponding tumor type. The proposed technique is fast in execution, efficient in classification and easy to implement.

16 citations


Journal ArticleDOI
TL;DR: In this article, a Flood Prediction Model (FPM) is proposed to predict floods in rivers using an Artificial Neural Network (ANN) approach; the model predicts the river water level from rainfall and present river water level data.
Abstract: This paper presents a Flood Prediction Model (FPM) to predict floods in rivers using an Artificial Neural Network (ANN) approach. The model predicts the river water level from rainfall and present river water level data. Although a number of factors are responsible for changes in water level, only these two are considered. Flood prediction is a non-linear problem, and the ANN approach is used to solve it. A Multi-Layer Perceptron (MLP) based ANN with the Feed Forward (FF) and Back Propagation (BP) algorithm is used to predict floods. Statistical analysis shows that the data fit the model well. We present simulation results comparing the predicted water level to the actual water level. The results show that our model successfully predicts the flood water level 24 hours ahead of time.

12 citations
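The feed-forward/back-propagation setup described in this abstract can be sketched with a toy two-input MLP. The network shape, learning rate and the synthetic data below are illustrative assumptions only, not the authors' dataset or configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (illustrative only, NOT the authors' dataset):
# inputs are [rainfall, current water level], target is the next level.
X = rng.random((200, 2))
y = (0.6 * X[:, 0] + 0.4 * X[:, 1]).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer, trained by plain full-batch gradient descent (backprop).
W1 = rng.normal(0.0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 0.5

for _ in range(2000):
    h = sigmoid(X @ W1 + b1)          # forward pass
    pred = h @ W2 + b2
    err = pred - y                    # gradient of 0.5 * MSE w.r.t. pred
    dW2 = h.T @ err / len(X); db2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)   # backprop through the sigmoid layer
    dW1 = X.T @ dh / len(X); db1 = dh.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
```

After training, the mean squared error falls well below the variance of the target, showing that the network has learned the input-output mapping rather than just the mean water level.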


Journal ArticleDOI
TL;DR: The benefits and risks of using cloud computing in knowledge management systems, a great means for gathering and redistributing knowledge, are discussed.
Abstract: The success of organizations largely depends on continual investment in learning and acquiring new knowledge, which creates new businesses and improves existing performance. Using knowledge management must therefore result in better achieving, or even exceeding, an organization's objectives. The purpose of knowledge management must not be just to become more knowledgeable, but to be able to create, transfer and apply knowledge with the purpose of better achieving objectives. As new technologies and paradigms emerge, businesses have to make new efforts to align themselves properly with them, especially in the knowledge management area. Today the Cloud Computing paradigm is becoming more and more popular, due to the vast decrease in time, cost and effort for meeting software development needs. It also provides a great means for gathering and redistributing knowledge. In this paper, we discuss the benefits and risks of using cloud computing in knowledge management systems.

12 citations


Journal ArticleDOI
TL;DR: A new method is introduced which combines bisection with other methods, demonstrating that bisection can serve as the basis for new root-finding methods with improved speed.
Abstract: The bisection method is the basic method of finding a root. As iterations are conducted, the interval is halved, so the method is guaranteed to converge to a root of f if f is continuous on an interval [a,b] and f(a) and f(b) have opposite signs. In this paper we explain the role of the bisection method in computer science research. We also introduce a new method, a combination of bisection and other methods, to show that new methods can be developed with the help of bisection. Scientists and engineers are often faced with the task of finding the roots of equations, and the basic method is bisection, but it is comparatively slow. The new method can be used to solve these problems and to improve the speed.

12 citations
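As a concrete illustration of the baseline method this paper builds on (the combined method itself is not specified in the abstract), a minimal bisection routine might look like this:

```python
def bisect(f, a, b, tol=1e-10, max_iter=100):
    """Classic bisection: repeatedly halve [a, b] while keeping a sign
    change of f inside the interval."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2.0
        if f(m) == 0.0 or (b - a) / 2.0 < tol:
            return m
        if f(a) * f(m) < 0:
            b = m          # root lies in [a, m]
        else:
            a = m          # root lies in [m, b]
    return (a + b) / 2.0

# Root of x^2 - 2 on [1, 2], i.e. sqrt(2).
root = bisect(lambda x: x * x - 2.0, 1.0, 2.0)
```

Each iteration halves the bracketing interval, which is exactly why convergence is guaranteed but comparatively slow (one bit of accuracy per iteration).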


Journal ArticleDOI
TL;DR: This paper explains different kinds of web spam and describes some methods used to combat this problem, which has commercial, political and economic dimensions.
Abstract: The Internet is a global information system. Because of the high volume of information in the virtual world, most users rely on search engines to access the information they need. They often look only at the first pages of results; if they cannot obtain the desired result, they revise their query. Search engines try to place the best results in the first links returned for the user's query. Web spam is an illegal and unethical method of increasing the rank of internet pages by deceiving the algorithms of search engines, and it has commercial, political and economic applications. In this paper, we first present some definitions related to web spam. We then explain different kinds of web spam and describe some methods used to combat this problem.

11 citations


Journal ArticleDOI
TL;DR: In this article, the adoption of mobile parking management information systems in the parking industry in Nairobi County is assessed, with a special focus on Lulu East Africa. Primary data was collected using questionnaires from a sample population of 60 comprising workers and clients, analyzed using SPSS version 21, and presented using distribution tables and graphs.
Abstract: In the parking industry, savings can be made by providing alternative payment options for customers, such as cashless parking schemes, and by providing the parking workforce with more sophisticated equipment. This technological change is expected to contribute to the development of more flexible, convenient and efficient parking services, increasing revenue and customer satisfaction. This study assesses the adoption of mobile parking management information systems in the parking industry in Nairobi County, with a special focus on Lulu East Africa. The researcher adopted a descriptive research design, and primary data was collected using questionnaires from a sample population of 60 comprising workers and clients. The data was analyzed using SPSS version 21 and presented using distribution tables and graphs. The findings indicated that Nairobi County and the parking industry were generally ready to adopt the mobile parking management system, whose success is subject to a detailed feasibility study; as with any technological adoption, it is bound to face some barriers, which can be overcome.

10 citations


Journal ArticleDOI
TL;DR: This paper reviews what data types are supported, what methods are available to information security professionals for detecting the use of steganography, whether the embedded message can be reliably extracted once detection has occurred, whether the embedded data can be separated from the carrier to reveal the original file, and finally what methods can defeat the use of steganography even when it cannot be reliably detected.
Abstract: This paper presents a general overview of steganography. Steganography is the art of hiding the very presence of communication by embedding secret messages into innocuous-looking cover documents, such as digital images. Detection of steganography, estimation of message length, and extraction of the message belong to the field of steganalysis. Steganalysis has recently received a great deal of attention from both law enforcement and the media. In this paper we review what data types are supported, what methods are available to information security professionals for detecting the use of steganography, whether the embedded message can be reliably extracted once detection has occurred, whether the embedded data can be separated from the carrier to reveal the original file, and finally what methods can defeat the use of steganography even when it cannot be reliably detected.

9 citations


Journal ArticleDOI
TL;DR: Various types of mobile devices are discussed and examined in detail, along with the operating systems most popular on those devices.
Abstract: The breakthrough in wireless networking has prompted a new concept of computing, called mobile computing, in which users carrying portable devices have access to a shared infrastructure independent of their physical location. Mobile computing is becoming increasingly vital due to the growth in the number of portable computers and the desire for continuous network connectivity to the Internet irrespective of the physical location of the node. Mobile computing systems are computing systems that may be readily moved physically and whose computing ability may be used while they are being moved. Mobile computing has rapidly become a vital new paradigm in today's world of networked computing systems. It includes software, hardware and mobile communication. Ranging from wireless laptops to cellular phones, and from WiFi/Bluetooth-enabled PDAs to wireless sensor networks, mobile computing has become ubiquitous in its influence on our daily lives. In this paper, various types of mobile devices are discussed and examined in detail, together with the operating systems most popular on those devices. Another aim of this paper is to point out some of the characteristics, applications, limitations, and issues of mobile computing.

9 citations


Journal ArticleDOI
TL;DR: This paper considers XSS attacks, their types, and the different methods employed to resist these attacks with their corresponding limitations, and discusses the proposed approach for countering XSS attacks.
Abstract: Nowadays, most organizations make use of web services to provide improved services to their clients. With the increase in the number of web users, there is a considerable rise in web attacks; security has thus become the dominant concern in web applications. Different kinds of vulnerabilities result in different types of attacks, and attackers may take advantage of these vulnerabilities to misuse the data in the database. Studies indicate that more than 80% of web applications are vulnerable to cross-site scripting (XSS) attacks. XSS is one of the most dangerous attacks, and it has been carried out against a large number of well-known search engines and social sites. In this paper, we consider XSS attacks, their types, and the different methods employed to resist these attacks, along with their corresponding limitations. Additionally, we discuss the proposed approach for countering XSS attacks and how this approach is superior to others.

9 citations


Journal ArticleDOI
TL;DR: Parameter estimation of NHPP based reliability models, using MLE and using an evolutionary search algorithm called Particle Swarm Optimization, has been explored in the paper.
Abstract: Software reliability is a quantifiable metric, defined as the probability that software operates without failure for a specified period of time in a specific environment. Various software reliability growth models have been proposed to predict the reliability of software; these models help vendors predict the behaviour of the software before shipment. Reliability is predicted by estimating the parameters of the software reliability growth models. But the model parameters generally stand in nonlinear relationships, which creates many problems in finding the optimal parameters using traditional techniques like Maximum Likelihood Estimation and Least Squares Estimation. Various stochastic search algorithms have been introduced which have made the task of parameter estimation more reliable and computationally easier. Parameter estimation of NHPP-based reliability models, using MLE and an evolutionary search algorithm called Particle Swarm Optimization, is explored in this paper.

Journal ArticleDOI
TL;DR: In this article, the computer techniques used to study the diversity of bat species, including speech recognition, voice recognition and artificial neural networks, and to detect the presence of bats acoustically, are surveyed.
Abstract: The bat, the only flying mammal, is an important keystone member of the ecosystem. It plays a vital role in maintaining ecological balance through the propagation of vital flora, and it has a major role in pest management in forests. Bats give major indications for biodiversity conservation through propagation and pest management. Bats are also key informers of climate change and its impact on their habitat. Bat species and their activity are useful for assessing habitat quality, and they serve as biological indicators of ecosystem conditions and degradation. The diversity of bat species is studied using various techniques, including speech recognition, voice recognition and artificial neural networks, and bats can be detected acoustically. In this paper, the various computer techniques used to study bats are surveyed.

Journal ArticleDOI
TL;DR: A diabetic patients data set, developed by collecting data from a hospital repository and consisting of 1865 instances with different attributes, is classified; J48 is the better algorithm in most cases.
Abstract: Data mining refers to extracting knowledge from large amounts of data. Real-life data mining approaches are interesting because they often present a different set of problems for diabetic patients' data, and classification is one of the main problems in the field. This research gives an algorithmic discussion of J48, J48 Graft, Random Tree, REP and LAD, which are used here to compare performance in terms of computing time, correctly classified instances, kappa statistics, MAE, RMSE, RAE and RRSE, and to find the error rate measurement for different classifiers in Weka. In this paper, the diabetic patients data set to be classified is developed by collecting data from a hospital repository and consists of 1865 instances with different attributes. The instances in the dataset fall into two categories: blood tests and urine tests. The Weka tool is used to classify the data, evaluation uses 10-fold cross validation, and the results are compared. Comparing the performance of the algorithms, we found J48 to be the better algorithm in most cases. Keywords: Data Mining, Diabetics Data, Classification Algorithm, Weka Tool

Journal ArticleDOI
TL;DR: The Integer Cosine Transform and Integer Wavelet Transform are combined to convert the signal to the frequency domain, and the performance of the steganographic technique is improved in terms of PSNR value.
Abstract: Algorithms exist for hiding data within an image; the proposed scheme treats the image as a whole. Here the Integer Cosine Transform (ICT) and Integer Wavelet Transform (IWT) are combined to convert the signal to the frequency domain. The Hide Behind Corner (HBC) algorithm is used to place keys at the corners of the image, and all the corner keys are encrypted by generating pseudo-random numbers. The secret keys are used for the corner parts, and then the hidden image is transmitted. The receiver should be aware of the keys that were used at the corners while encrypting the image. Reverse Data Hiding (RDH) is used to recover the original image, and it proceeds only once all the corners are unlocked with the proper secret keys. With these methods, the performance of the steganographic technique is improved in terms of PSNR value.
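The PSNR figure used here to judge steganographic quality is computed from the mean squared error between the cover image and the stego image. A minimal sketch (the image data below is made up for illustration):

```python
import numpy as np

def psnr(original, stego, max_val=255.0):
    """Peak signal-to-noise ratio in dB between a cover image and its
    stego version; higher means the embedding is less perceptible."""
    diff = original.astype(np.float64) - stego.astype(np.float64)
    mse = float(np.mean(diff ** 2))
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy 8-bit grayscale cover; embed one bit by changing a single pixel.
cover = np.full((64, 64), 128, dtype=np.uint8)
stego = cover.copy()
stego[0, 0] += 1
value = psnr(cover, stego)
```

Changing a single pixel of a 64x64 image by one gray level yields a PSNR above 80 dB, which is why LSB-style embedding is considered visually imperceptible.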

Journal ArticleDOI
TL;DR: This paper proposes a parallel implementation of the Ant Colony Optimization Ant System algorithm on a GPU using OpenCL, and compares different parameters of the ACO which directly or indirectly affect the result.
Abstract: In this paper we propose a parallel implementation of the Ant Colony Optimization Ant System algorithm on a GPU using OpenCL. We compare different parameters of the ACO which directly or indirectly affect the result. A comparison of the parallel speedup between the CPU and GPU implementations yields a speedup of 3.11x on the CPU and 7.21x on the GPU. For the control parameters α, β and ρ, the best solution was obtained at 1, 5 and 0.5 respectively.
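For reference, the roles of α, β and ρ in Ant System: α weights pheromone and β weights the heuristic (inverse distance) in the city-selection probabilities, while ρ is the pheromone evaporation rate. A generic serial sketch using the abstract's best-found values (this is not the paper's OpenCL implementation):

```python
def transition_probs(current, unvisited, pher, dist, alpha=1.0, beta=5.0):
    """Ant System city-selection rule: weight each candidate city j by
    pheromone^alpha * (1/distance)^beta, then normalize."""
    weights = [pher[current][j] ** alpha * (1.0 / dist[current][j]) ** beta
               for j in unvisited]
    total = sum(weights)
    return {j: w / total for j, w in zip(unvisited, weights)}

def evaporate(pher, rho=0.5):
    """Global pheromone evaporation: tau <- (1 - rho) * tau."""
    return [[(1.0 - rho) * t for t in row] for row in pher]

# Tiny 3-city example with uniform pheromone.
dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
pher = [[1.0] * 3 for _ in range(3)]
probs = transition_probs(0, [1, 2], pher, dist)
```

With β = 5, the nearer city dominates the selection probability, which matches the intuition that a large β makes ants greedy with respect to distance.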

Journal ArticleDOI
TL;DR: In this paper, a detailed discussion is given of both types of attacks, along with some approaches which have proved to be of significance in detecting them.
Abstract: In recent years, the number of automobiles on the road has increased tremendously. Due to the high density and mobility of vehicles, possible threats and road accidents are increasing. Wireless communication allows sending safety and other critical information, but due to this inherent wireless characteristic and the periodic exchange of safety packets, Vehicular Ad-hoc Networks (VANETs) are vulnerable to a number of security threats such as the Sybil attack and the temporal attack. In this paper, a detailed discussion is given of both types of attacks. With the help of already published works, some approaches are also studied which have proved to be of significance in detecting these attacks.

Journal ArticleDOI
TL;DR: A brief discussion is given of the different dimensions along which privacy preservation techniques can be classified, and some of the popular data mining algorithms used in privacy preservation, such as association rule mining, clustering, decision trees and Bayesian networks, are discussed.
Abstract: It is often highly valuable for organizations to have their data analyzed by external agents. Data mining is a technique to analyze and extract useful information from large data sets. In the era of the information society, sharing and publishing data has become common practice because of its wealth of opportunities. However, the processes of data collection and data distribution may lead to disclosure of private information, and privacy preservation is necessary to conceal private information before data is shared, exchanged or published. Privacy-preserving data mining (PPDM) has thus received a significant amount of attention in the research literature in recent years, and various methods have been proposed to achieve the expected goal. In this paper we give a brief discussion of the different dimensions along which privacy preservation techniques can be classified. We also discuss different privacy preservation techniques with their advantages and disadvantages, cover some of the popular data mining algorithms used in privacy preservation, such as association rule mining, clustering, decision trees and Bayesian networks, and present a few related works in this field.

Journal ArticleDOI
TL;DR: This paper compares different periodicity mining algorithms and gives a plan for developing an efficient, noise-resilient periodicity mining algorithm which detects symbol, sequence and segment periodicity.
Abstract: Time series datasets consist of sequences of numeric values obtained over repeated measurements of time. They are popular in many applications such as stock market analysis, power consumption, economic and sales forecasting, and temperature monitoring. Periodic pattern mining, or periodicity detection, is the process of finding periodic patterns in a time series database. It has a number of applications, such as prediction, forecasting and the detection of unusual activities. Periodicity mining deserves more attention given its growing use in real-life applications. The types of periodicity are symbol periodicity, sequence periodicity and segment periodicity, and they should be identified even in the presence of noise in the time series database. A number of algorithms exist for periodic pattern mining, each with its advantages and disadvantages. In this paper, we compare different periodicity mining algorithms and give a plan for developing an efficient, noise-resilient periodicity mining algorithm which detects symbol, sequence and segment periodicity.
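The simplest of the three periodicity types, symbol periodicity, can be sketched as a confidence score: the fraction of expected positions at which a symbol actually recurs. This is an illustrative simplification, not any specific algorithm from the surveyed literature:

```python
def symbol_periodicity_confidence(series, symbol, period):
    """Fraction of expected positions (every `period` steps from the
    symbol's first occurrence) at which `symbol` actually appears.
    A score below 1.0 but above a threshold indicates a noisy period."""
    if symbol not in series:
        return 0.0
    start = series.index(symbol)
    positions = list(range(start, len(series), period))
    hits = sum(1 for i in positions if series[i] == symbol)
    return hits / len(positions)

s = "abcabdabcabc"                                   # 'abc' repeats; one noisy 'd'
perfect = symbol_periodicity_confidence(s, "a", 3)   # 'a' at every expected slot
noisy = symbol_periodicity_confidence(s, "c", 3)     # one 'c' replaced by 'd'
```

The 'c' score drops to 0.75 rather than 0 under a single corrupted symbol, which illustrates why confidence-based detection is naturally noise-resilient.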

Journal ArticleDOI
TL;DR: This research evaluates the detection of spam among legitimate electronic mail and its effect on several machine learning algorithms by presenting a feature selection method based on a genetic algorithm.
Abstract: Spam is defined as redundant and unwanted electronic mail, and nowadays it creates many problems in business life, such as occupying network bandwidth and space in users' mailboxes. Due to these problems, much research has been carried out in this area using classification techniques. Recent research shows that feature selection can have a positive effect on the efficiency of machine learning algorithms. Most algorithms try to present a data model that depends on the accurate detection of a small set of features; unrelated features in the model-building process result in weak estimation and more computation. In this research we evaluate the detection of spam among legitimate electronic mail and its effect on several machine learning algorithms by presenting a feature selection method based on a genetic algorithm. Bayesian network and KNN classifiers are used in the classification phase, and the Spambase dataset is used.

Journal ArticleDOI
TL;DR: An OCR system interprets the printed or handwritten character image and converts it into a corresponding editable text document, recognizing the exact character using feature matching between the extracted character and the templates of all characters as a measure of similarity.
Abstract: Optical Character Recognition (OCR) is a system that provides full alphanumeric recognition of printed or handwritten characters by simply scanning the text image. An OCR system interprets the printed or handwritten character image and converts it into a corresponding editable text document. The text image is divided into regions by isolating each line, then individual characters with spaces. After character extraction, texture and topological features such as corner points, features of different regions, and the ratio of character area to convex area are calculated for all characters of the text image. Features of each uppercase and lowercase letter, digit and symbol are stored beforehand as templates. Based on the texture and topological features, the system recognizes the exact character using feature matching between the extracted character and the templates of all characters as a measure of similarity.
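The feature-matching step can be sketched as nearest-template classification over a small feature vector. The glyphs and the two features below (ink density and top-half ink share) are toy stand-ins for the corner-point and region features the abstract describes:

```python
def extract_features(glyph):
    """Two toy features for a binary glyph given as rows of '0'/'1' strings:
    overall ink density, and the share of ink in the top half. These are
    stand-ins for the paper's texture/topological features."""
    rows = len(glyph)
    total = sum(row.count("1") for row in glyph)
    top = sum(row.count("1") for row in glyph[: rows // 2])
    return (total / (rows * len(glyph[0])), top / max(total, 1))

def recognize(glyph, templates):
    """Return the label of the template nearest in feature space
    (squared Euclidean distance as the similarity measure)."""
    fx, fy = extract_features(glyph)
    return min(templates,
               key=lambda lbl: (fx - templates[lbl][0]) ** 2
                             + (fy - templates[lbl][1]) ** 2)

# Hypothetical 3x3 templates for 'T' and 'L'.
templates = {
    "T": extract_features(["111", "010", "010"]),
    "L": extract_features(["100", "100", "111"]),
}
noisy_t = ["111", "010", "011"]  # a 'T' with one extra ink pixel
```

Even with one corrupted pixel, the noisy glyph stays closer to the 'T' template than to 'L', which is the essence of template-based feature matching.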

Journal ArticleDOI
TL;DR: The proposed system includes functionalities and interfaces for processing user requests and fetching web pages from the internet, allows users to select a zone in web pages to monitor, and helps to locate minor or major changes within the selected zone of the document.
Abstract: This paper describes a Web Page Change Detection System for a Selected Zone, based on a tree comparison mechanism over HTML pages. Two sub-trees are generated for the selected zone, one for the initial and another for the changed version of the web document. A Generalized Tree Comparison Algorithm is developed to compare these sub-trees for the selected zone; the algorithm uses the properties of the HTML page and heuristics as nodes of the trees. The proposed system includes functionalities and interfaces for processing user requests and fetching web pages from the internet, allowing users to select a zone in web pages to monitor. The method performs well, is able to detect structural as well as content-level changes even at the minute level, and helps to locate minor or major changes within the selected zone of the document.

Journal ArticleDOI
TL;DR: This review paper presents the functioning mechanism of the RED technique with the help of its algorithm and its variants.
Abstract: The Internet and its applications are an integral part of our daily life. These days they are widely used for various purposes such as communication, public services, entertainment and distance education, each possessing different quality of service (QoS) requirements. How to provide finer congestion control for the network emerges as a major problem. To prevent congestion and synchronization problems, various active queue management (AQM) techniques are used. AQM algorithms execute on network routers and detect incipient congestion by monitoring certain functions; when congestion occurs on the link, the AQM algorithm detects it and provides signals to the end systems. Various algorithms have been proposed in recent years, but RED is one of the most influential techniques among the existing ones. This review paper presents the functioning mechanism of the RED technique with the help of its algorithm and its variants. Keywords: AQM, RED, RED parameters, RED algorithm, RED variants
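RED's core mechanism can be sketched in a few lines: an exponentially weighted moving average of the queue size, and a drop probability that rises linearly between two thresholds. The threshold and weight values below are illustrative, not prescribed:

```python
def update_avg(avg, queue_len, w=0.002):
    """Exponentially weighted moving average of the instantaneous
    queue size (RED's 'avg'); w is the queue weight."""
    return (1.0 - w) * avg + w * queue_len

def red_drop_probability(avg, min_th, max_th, max_p):
    """RED marking/drop probability: 0 below min_th, rising linearly
    to max_p at max_th, and 1 once the average exceeds max_th."""
    if avg < min_th:
        return 0.0
    if avg >= max_th:
        return 1.0
    return max_p * (avg - min_th) / (max_th - min_th)

p_low = red_drop_probability(3, 5, 15, 0.1)    # below min_th: no drops
p_mid = red_drop_probability(10, 5, 15, 0.1)   # halfway: max_p / 2
p_high = red_drop_probability(20, 5, 15, 0.1)  # above max_th: drop all
```

Because drops are probabilistic and spread over time, different flows back off at different moments, which is how RED avoids the global synchronization that tail-drop queues cause.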

Journal ArticleDOI
TL;DR: This paper proposes a new algorithm which combines the firefly algorithm with the Max-Min algorithm for scheduling jobs on the grid; it has better efficiency than the other algorithms compared.
Abstract: Grid computing is the next generation of distributed systems, and its goal is to create a powerful, large, autonomous virtual computer from countless heterogeneous resources, with the purpose of sharing those resources. Scheduling is one of the main steps in exploiting the capabilities of emerging computing systems such as the grid. Scheduling jobs in computational grids with heterogeneous resources is known to be an NP-complete problem. Grid resources belong to different management domains, each applying different management policies. Since the nature of the grid is heterogeneous and dynamic, techniques used in traditional systems cannot be applied to grid scheduling, so new methods must be found. This paper proposes a new algorithm which combines the firefly algorithm with the Max-Min algorithm for scheduling jobs on the grid. The firefly algorithm is a new swarm-based technique inspired by the social behavior of fireflies in nature; fireflies move in the search space of the problem to find optimal or near-optimal solutions. Minimizing the makespan and flowtime of completing jobs simultaneously is the goal of this paper. Experiments and simulation results show that the proposed method has better efficiency than the other algorithms compared.

Journal ArticleDOI
TL;DR: This work proposes an approach for evaluating the performance and reliability of software systems at the software architecture level, based on formal models (hierarchical timed colored Petri nets) designed at the early stages of the software development cycle, prior to implementation.
Abstract: Validation of software systems is very useful at the early stages of their development cycle. Evaluation of functional requirements is supported by clear and appropriate approaches, but there is no similar strategy for evaluating non-functional requirements (such as performance and reliability). Since establishing the non-functional requirements has a significant effect on the success of software systems, considerable effort is needed for their evaluation. If software performance is specified based on performance models, it may be evaluated at the early stages of the software development cycle. Therefore, modeling and evaluating non-functional requirements at the software architecture level, which is designed at the early stages of the software development cycle and prior to implementation, is very effective. We propose an approach for evaluating the performance and reliability of software systems at the software architecture level, based on formal models (hierarchical timed colored Petri nets). In this approach, the software architecture is described by UML use case, activity and component diagrams; the UML model is then transformed into an executable model based on hierarchical timed colored Petri nets (HTCPN) by a proposed algorithm. Consequently, by executing the executable model and analyzing its results, non-functional requirements including performance (such as response time) and reliability may be evaluated at the software architecture level.

Journal ArticleDOI
TL;DR: The decision probability of handoff is modeled and simulated for smaller bandwidths, and the results are presented for two cases, with and without the probabilities of four different states of the mobile nodes.
Abstract: In this work, the decision probability of handoff is modeled and simulated for smaller bandwidths. The smaller bandwidth is chosen just for simulation purposes and to demonstrate the applicability of the algorithm. The probability of handover and the probability of an incorrect handover decision are modeled. Two nodes of the network are modeled, along with the probabilities of four different states of the mobile node. The results are presented for two cases, with and without the probabilities of the four different states of the mobile nodes.

Journal ArticleDOI
TL;DR: It is statistically shown that although respondents' privacy concerns are significant, their attitude towards the risks of information disclosure is still very relaxed; people are still not aware of the actual cyber risks and are therefore still happy to disregard protective advice and engage in risk-taking behavior.
Abstract: With the emergence of online social networking sites, the rules of social interaction and communication have changed. Most social networking sites motivate users to share personal information, build new relationships and increase knowledge with a perceived ease of use. But this online interaction and sharing of personal information have raised many new privacy concerns, as they provide huge amounts of data to third parties which can be misused against the user's will. This research aims to develop an understanding of individuals' online risk-taking behavior, particularly around the issue of information disclosure, and to study the factors influencing site use behavior. Human behavior has a crucial role in the development of social networking sites. Using an online questionnaire, survey data was collected from social networking site users of different age groups and genders. It is statistically shown that although respondents' privacy concerns are significant, their attitude towards the risks of information disclosure is still very relaxed. People are still not aware of the actual cyber risks and are therefore still happy to disregard protective advice and engage in risk-taking behavior. Further, some actual flaws, and changes that users wish to see in online social networking sites, are mentioned, also collected from actual online interaction with the users.

Journal ArticleDOI
TL;DR: A principal goal of this paper is to identify privacy and security issues in the distributed environment that concern cloud computing participants and users.
Abstract: The National Institute of Standards and Technology (NIST) defined cloud computing as a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or cloud provider interaction. Cloud computing draws on the concepts of Grid Computing, Utility Computing, software as a service, storage in the cloud, and virtualization; these are framed as a client using a provider's service remotely, known as the cloud. Cloud computing has the potential to change how organizations manage information technology and to transform the economics of hardware and software at the same time. Cloud computing promises to bring a new set of entrepreneurs who can start their ventures with zero investment in IT infrastructure. A principal goal of this paper is to identify privacy and security issues in the distributed environment that concern cloud computing participants and users.

Journal ArticleDOI
TL;DR: This paper investigates different types of attacks that occur at the different layers of a MANET and discusses some available detection techniques for these attacks.
Abstract: A Mobile Ad-Hoc Network (MANET) is a collection of mobile nodes (stations) communicating in a multi-hop way without any fixed infrastructure such as access points or base stations. A MANET has no well-specified defense mechanism, so a malicious attacker can easily access this kind of network. In this paper we investigate different types of attacks that occur at the different layers of a MANET, and then discuss some available detection techniques for these attacks. To the best of our knowledge, this is the first paper that studies all these attacks across the different layers of a MANET together with some available detection techniques.

Journal ArticleDOI
TL;DR: This paper investigates the Self-Organizing Map (SOM), a neural-network-based algorithm suitable for Intrusion Detection Systems (IDS).
Abstract: With the rapid expansion of computer usage and computer networks, the security of computer systems has become very important. Every day, industries face new kinds of attacks. Many methods have been proposed for the development of intrusion detection systems using artificial intelligence techniques. In this paper we look at an algorithm based on neural networks that is suitable for Intrusion Detection Systems (IDS): the Self-Organizing Map (SOM). So far, many different methods covering a wide variety of approaches have been used to build detectors; among the methods used to detect attacks in intrusion detection, this paper investigates the Self-Organizing Map method.
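The abstract does not include an implementation, but the core of a SOM is simple: find the grid node whose weight vector best matches each input, then pull that node and its grid neighbors toward the input. A minimal sketch of this idea in Python follows; the grid size, decay schedules and toy traffic features are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def train_som(data, grid_shape=(4, 4), epochs=50, lr0=0.5, sigma0=1.5, seed=0):
    """Train a toy Self-Organizing Map on the row vectors in `data`."""
    rng = np.random.default_rng(seed)
    h, w = grid_shape
    dim = data.shape[1]
    weights = rng.random((h, w, dim))
    # Grid coordinates, used by the neighborhood function.
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1 - frac)              # linearly decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.1  # shrinking neighborhood radius
        for x in data:
            # Best-matching unit (BMU): node whose weights are closest to x.
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), (h, w))
            # Pull the BMU and its grid neighbors toward the input.
            grid_dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            influence = np.exp(-grid_dist2 / (2 * sigma ** 2))
            weights += lr * influence[..., None] * (x - weights)
    return weights

def bmu_of(weights, x):
    """Map an input vector to its best-matching grid cell."""
    dists = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(dists), dists.shape)
```

In an IDS setting, the map is trained on (mostly normal) traffic feature vectors; inputs whose best-matching unit lies far from the "normal" region of the grid, or matches poorly, can be flagged as anomalous.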

Journal ArticleDOI
TL;DR: Hidden patterns in students' evaluations of teachers are verified, and it is predicted which teachers will be invited to teach faculty classes and which will be refused, so that education managers may, for evaluation reasons, end the teaching contracts of these teachers in subsequent semesters.
Abstract: Data mining is the extraction of hidden knowledge from large data repositories. It is used across a vast range of areas, with numerous commercial applications including retail sales, e-commerce, remote sensing, bioinformatics, etc. Education is an essential element for the progress of a country. Mining in an educational environment is called educational data mining, which is concerned with developing new methods to discover knowledge from educational databases. The main goal of this paper is to gather manageable experience with data mining and to apply this experience to e-learning systems and traditional education through teacher evaluation. This paper verifies hidden patterns in students' evaluations of teachers and predicts which teachers will be invited to teach faculty classes and which will be refused, so that education managers may, for evaluation reasons, end the teaching contracts of these teachers in subsequent semesters. It also examines the effect of items such as evaluation score, teacher's degree, degree type, teaching experience and acceptance into subsequent semesters on a teacher's evaluation.
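A common way to quantify "the effect of an item" such as evaluation score or teacher's degree on an outcome is ID3-style information gain: how much splitting the records on that attribute reduces the entropy of the outcome. The sketch below illustrates this on a tiny hand-made dataset; the attribute names, bands and records are hypothetical, not the paper's data.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction from splitting the records on one attribute."""
    base = entropy(labels)
    by_value = {}
    for row, y in zip(rows, labels):
        by_value.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(ys) / len(labels) * entropy(ys) for ys in by_value.values())
    return base - remainder

# Hypothetical records: (evaluation_band, degree, experience_band),
# labeled by whether the teacher is invited back next semester.
rows = [
    ("high", "PhD", "senior"), ("high", "MSc", "junior"),
    ("high", "PhD", "junior"), ("low", "PhD", "senior"),
    ("low", "MSc", "junior"), ("low", "MSc", "senior"),
]
labels = ["yes", "yes", "yes", "no", "no", "no"]

attributes = ["evaluation", "degree", "experience"]
gains = {name: information_gain(rows, labels, i)
         for i, name in enumerate(attributes)}
best = max(gains, key=gains.get)
```

On this toy data the evaluation band perfectly predicts the outcome, so it has the highest gain; ranking attributes this way is also how a decision-tree learner would choose its root split.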