
Showing papers in "Computer Engineering and Intelligent Systems in 2012"


Journal Article
TL;DR: A Genetic Algorithm-Support Vector Machine (GA-SVM) feature selection technique is developed to optimize the SVM classification parameters, prediction accuracy and computation time, and the SpamAssassin dataset was used to validate the performance of the proposed hybrid.
Abstract: Feature selection is a problem of global combinatorial optimization in machine learning in which subsets of relevant features are selected to realize robust learning models. The inclusion of irrelevant and redundant features in the dataset can result in poor predictions and high computational overhead. Thus, selecting relevant feature subsets can help reduce the computational cost of feature measurement, speed up the learning process and improve model interpretability. The SVM classifier has proven unable to produce accurate classification results on large e-mail datasets, while also consuming considerable computational resources. In this study, a Genetic Algorithm-Support Vector Machine (GA-SVM) feature selection technique is developed to optimize the SVM classification parameters, prediction accuracy and computation time. The SpamAssassin dataset was used to validate the performance of the proposed system. The hybrid GA-SVM showed remarkable improvements over SVM in terms of classification accuracy and computation time.
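The wrapper loop the abstract describes — a genetic algorithm searching over binary feature masks scored by a classifier — can be sketched as follows. This is a minimal illustration with a stand-in fitness function, not the paper's actual GA-SVM (which would score each mask by cross-validated SVM accuracy and also tune the SVM parameters):

```python
import random

def ga_feature_select(n_features, fitness, pop_size=20, generations=30, seed=0):
    """Minimal GA over binary feature masks: truncation selection,
    one-point crossover, single-bit mutation, elitism."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # keep the best half (elitism)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_features)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_features)] ^= 1  # single-bit mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# Stand-in fitness: negative Hamming distance to a "true" relevant subset.
TRUE = [1, 0, 1, 0, 1, 0]
best = ga_feature_select(6, lambda m: -sum(x != y for x, y in zip(m, TRUE)))
```

In the real GA-SVM setting, `fitness` would be replaced by a function that trains an SVM on the selected columns and returns validation accuracy, possibly penalized by subset size.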

36 citations


Journal Article
TL;DR: This research focuses on controlling home appliances remotely when the user is away from the house, using SMS-based wireless technology to revolutionize the standards of living.
Abstract: Safeguarding our home appliances has become an issue with the advancement and growth of an economy. This research focuses on controlling home appliances remotely when the user is away from the house. The system is Short Message Service (SMS) based and uses wireless technology to revolutionize the standards of living. It provides an ideal solution to certain problems faced by home owners in daily life. Due to its wireless nature, it is more adaptable and cost-effective. The research is divided into two sections: hardware and software. The hardware section consists of the Global System for Mobile Communications (GSM) modem module, the control module, the appliance module, the Liquid Crystal Display (LCD) module and the power supply module. The GSM modem receives the message sent by the person who wishes to operate any of the connected appliances and forwards it to the microcontroller; the microcontroller decodes the message, switches the appropriate appliance on or off, updates the LCD and sends feedback to the mobile phone via the GSM modem. Overall, the work employs the ATmega16 microcontroller, relays, a programmer to program the microcontroller, a mobile phone and a GSM modem. AT commands are used to handle communication between the modem and the ATmega16 microcontroller. Flowcode is used to program the microcontroller. The overall design was constructed, tested and found fully functional. Keywords: Short Message Service, Global System for Mobile Communications, Remote Control, Electronic Circuit Design
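The decoding step described above — the microcontroller receives the SMS text and maps it to a relay action — can be illustrated with a tiny parser. The PIN/command format here is invented for illustration; the paper's firmware is written in Flowcode and its actual message format is not specified in the abstract:

```python
def parse_sms_command(text):
    """Parse an SMS of the hypothetical form "<PIN> <APPLIANCE> <ON|OFF>"
    (e.g. "1234 LIGHT ON") into a control tuple, as firmware might do
    before driving a relay. Returns None for malformed commands."""
    parts = text.strip().upper().split()
    if len(parts) != 3 or parts[2] not in ("ON", "OFF"):
        return None  # malformed command: ignore
    pin, appliance, action = parts
    return (pin, appliance, action == "ON")

cmd = parse_sms_command("1234 light on")   # ("1234", "LIGHT", True)
bad = parse_sms_command("hello world")     # None
```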

21 citations


Journal Article
TL;DR: In this article, a genetic algorithm was used to predict the likely direction of the price of BSE India Cements stock price index (ICSPI) futures from several technical indicators using artificial intelligence techniques, and to mine trading rules that resolve conflicts among the outputs of the first stage using evolutionary learning.
Abstract: Generating profitable trading rules for stock market investments is a difficult but much-studied problem. The first stage classifies the likely direction of the price of BSE India Cements stock price index (ICSPI) futures from several technical indicators using artificial intelligence techniques. The second stage mines trading rules to resolve conflicts among the outputs of the first stage using evolutionary learning. We have found the trading rule that would have yielded the highest return over a certain time period using historical data. These groundwork results suggest that genetic algorithms are a promising model, yielding higher profit than other comparable models and the buy-and-sell strategy. Experimental results of the buy-and-sell trading rules were outstanding. Key words: Data mining, Trading rule, Genetic algorithm, ANN, ICSPI prediction

20 citations


Journal Article
TL;DR: A survey of congestion adaptive routing protocols for mobile ad hoc networks; protocols that adapt to the congestion status of the network can greatly improve network performance.
Abstract: Routing protocols for mobile ad hoc networks (MANETs) have been explored extensively in the last few years. Much of this work is targeted at finding a feasible route from a source to a destination without considering current network traffic or application requirements. Routing may let congestion happen, which is then detected by congestion control; but dealing with congestion reactively results in longer delay and unnecessary packet loss, and requires significant overhead if a new route is needed. Routing should not only be aware of, but also adaptive to, network congestion. Adaptation to congestion helps to increase both the effectiveness and efficiency of routing. Congestion-aware routing protocols solve these problems to a certain degree. These protocols, which adapt to the congestion status of a mobile ad-hoc network, can greatly improve network performance. In this paper, we present a survey of congestion adaptive routing protocols for mobile ad-hoc networks. Finally, the future direction of congestion-aware routing protocols is described. Keywords: Ad hoc networks, congestion aware routing, Congestion metric, congestion adaptability

15 citations


Journal Article
TL;DR: A proposed algorithm called the Modified K-meansLBG algorithm is used to obtain a good codebook and has shown good performance on limited vocabulary tasks.
Abstract: In Vector Quantization, the main task is to generate a good codebook: the distortion between the original pattern and the reconstructed pattern should be minimal. In this paper, a proposed algorithm called the Modified K-meansLBG algorithm is used to obtain a good codebook. The system has shown good performance on limited vocabulary tasks. Keywords: K-means algorithm, LBG algorithm, Vector Quantization, Speech Recognition
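The splitting procedure behind the LBG family of algorithms can be sketched as follows: start from the global centroid, double the codebook by perturbing each codeword, then refine with k-means (Lloyd) iterations. This is a generic 1-D illustration, not the paper's Modified K-meansLBG variant:

```python
def lbg_codebook(data, size, epsilon=0.01, iters=20):
    """Generate a codebook of the given size (a power of two) by LBG
    splitting, with 1-D samples for brevity."""
    codebook = [sum(data) / len(data)]            # global centroid
    while len(codebook) < size:
        # Split: perturb each codeword up and down.
        codebook = [c * (1 + epsilon) for c in codebook] + \
                   [c * (1 - epsilon) for c in codebook]
        for _ in range(iters):                    # Lloyd (k-means) refinement
            cells = {i: [] for i in range(len(codebook))}
            for x in data:
                nearest = min(range(len(codebook)), key=lambda i: abs(x - codebook[i]))
                cells[nearest].append(x)
            codebook = [sum(v) / len(v) if v else codebook[i]
                        for i, v in cells.items()]
    return sorted(codebook)

# Two well-separated clusters around 1.0 and 5.0.
cb = lbg_codebook([1.0, 1.1, 0.9, 5.0, 5.2, 4.8], 2)
```

For speech, the same loop runs over multi-dimensional feature vectors (e.g. MFCC frames) with a Euclidean distortion measure instead of `abs`.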

11 citations


Journal Article
TL;DR: The traditional procedure of medical diagnosis of hypotension employed by physicians is analyzed using a neuro-fuzzy inference procedure, and the proposed system, which is self-learning and adaptive, is able to handle the uncertainties often associated with the diagnosis and analysis of hypotension.
Abstract: Hypotension, also known as low blood pressure, affects people of all kinds; hypotension is a relative term because blood pressure normally varies greatly with activity, age, medications, and underlying medical conditions. Low blood pressure can result from conditions of the nervous system, conditions that do not begin in the nervous system, and drugs. Neurologic conditions (conditions affecting the brain's neurons) that can lead to low blood pressure include changing position from lying down to more vertical (postural hypotension), stroke, shock, lightheadedness after urinating or defecating, Parkinson's disease, neuropathy and simply fright. Clinical symptoms of hypotension include low blood pressure, dizziness, fainting, clammy skin, visual impairment and cold sweat. Neuro-fuzzy logic explores approximation techniques from neural networks to find the parameters of a fuzzy system. In this paper, the traditional procedure of medical diagnosis of hypotension employed by physicians is analyzed using a neuro-fuzzy inference procedure. The proposed system, which is self-learning and adaptive, is able to handle the uncertainties often associated with the diagnosis and analysis of hypotension. Keywords: Neural Network, Fuzzy logic, Neuro Fuzzy System, Expert System, Hypotension

10 citations


Journal Article
TL;DR: In this dissertation work an additive wavelet transform will be proposed for enhancement and filtration of homomorphic infrared images.
Abstract: In Infrared Image Enhancement using Wavelet Transform, two enhancement algorithms, spatial and spatiotemporal homomorphic filtering (SHF and STHF), have been given for the enhancement of far infrared images based upon a far infrared imaging model. Although spatiotemporal homomorphic filtering may greatly reduce the number of iterations needed for a similar degree of convergence compared to the spatial method, by making explicit use of the additional information provided temporally, the enhanced results from SHF are in general better than those from STHF. In this dissertation work an additive wavelet transform is proposed for the enhancement and filtration of homomorphic infrared images. Keywords: Infrared Images, Additive Wavelet Transform, Homomorphic Image Enhancement

8 citations


Journal Article
TL;DR: A new algorithm based on both the Renyi entropy and the Shannon entropy for edge detection using a split and merge technique, aiming to find the best edge representation and decrease the computation time.
Abstract: Most classical methods for edge detection are based on the first and second order derivatives of the gray levels of the pixels of the original image. These processes give rise to an exponential increase in computational time, especially with large images. This paper presents a new algorithm based on both the Renyi entropy and the Shannon entropy for edge detection using a split and merge technique. The objective is to find the best edge representation and decrease the computation time. A set of experiments in the domain of edge detection is presented. The system yields edge detection performance comparable to classic methods such as Canny, LoG, and Sobel. The experimental results show that this method performs better than the LoG and Sobel methods, and is better than the other three methods in CPU time. Another benefit is the easy implementation of this method. Keywords: Renyi Entropy, Information content, Edge detection, Thresholding
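The two entropy measures the algorithm combines are computed from a (sub)image's gray-level histogram. A minimal sketch of the generic definitions, not the paper's split-and-merge thresholding logic:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p) over nonzero bin probabilities."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def renyi_entropy(probs, alpha):
    """Renyi entropy H_a = log2(sum(p^a)) / (1 - a) for a != 1;
    it converges to the Shannon entropy as a -> 1."""
    assert alpha != 1
    return math.log2(sum(p ** alpha for p in probs if p > 0)) / (1 - alpha)

# Uniform 4-bin histogram: both entropies equal 2 bits.
uniform = [0.25, 0.25, 0.25, 0.25]
h_shannon = shannon_entropy(uniform)   # 2.0
h_renyi = renyi_entropy(uniform, 2.0)  # 2.0
```

In an entropy-based edge detector, blocks whose histogram entropy exceeds a threshold are treated as candidate edge regions (split), and homogeneous neighbors are merged.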

8 citations


Journal Article
TL;DR: In this article, a modified version of a simple circular monopole antenna for WPAN applications is proposed, which offers excellent performance in the range of 2-12 GHz; the antenna is designed on an FR4 substrate and fed with a 50-ohm microstrip feed line.
Abstract: The basic circular monopole antenna exhibits a 10 dB return loss bandwidth over the entire frequency band; this paper proposes a modified version of the simple circular monopole antenna for WPAN applications. The antenna offers excellent performance in the range of 2-12 GHz. The antenna is designed on an FR4 substrate, fed with a 50-ohm microstrip feed line, and is suitable for an operating frequency of 7 GHz. It is shown that the return loss of the antenna at 7 GHz is better than -10 dB and the VSWR obtained is less than 2. The proposed geometry is designed and simulated using HFSS 11. Details of the proposed antenna design and measured results are presented. Index Terms: Wireless communication, UWB, circular monopole, WPAN

8 citations


Journal Article
TL;DR: In this paper, a comparative study of the satisfaction levels of subscribers of the Pre-Paid and Post-Paid electricity billing systems is presented; no such systematic and focused study has yet taken place, although it might carry mentionable significance from a number of perspectives.
Abstract: The introduction of the Pre-Paid billing system for electricity at households is claimed to add to the convenience of subscribers, especially by removing the hassles of bill payments associated with the Post-Paid system. Among the other benefits of the Pre-Paid system, user control over electricity consumption and freedom from billing discrepancies are largely spoken about. These conveniences brought by the Pre-Paid billing system should therefore result in a higher satisfaction level of its subscribers compared to those of the Post-Paid system. But no such systematic and focused study has yet taken place, although it might carry mentionable significance from a number of perspectives. This study is a humble yet strongly rooted quest to address the very issues that contribute to the satisfaction levels of subscribers of electricity at the household level. Conducted on 50 subscribers from both the Pre-Paid and Post-Paid systems of Sylhet city, this 'small scale' study aims to construct a comparative picture of the satisfaction levels of the subscribers of the two systems on the benchmark issues. The backbone of the study is the information acquired through a questionnaire survey conducted through 'in home' personal interviewing. Along with the findings, the study offers some implications and recommendations that may be used at policy levels. Keywords: Pre-Paid and Post-Paid electricity billing system, Residential level electricity, Subscriber satisfaction, Comparative study, Questionnaire survey

7 citations


Journal Article
TL;DR: A Robust Series Checkpointing Algorithm (SCpA) implemented in the JADE environment, which extends previous work, keeping in mind the security of mobile host platforms, and evaluates the performance of the agents' execution through graphical analysis.
Abstract: The mobile agent paradigm relies heavily on the security of both the agent and its host platform. Both entities are prone to security threats and attacks such as masquerading, denial-of-service and unauthorized access. Security fissures on the platform can result in significant losses. This paper presents a Robust Series Checkpointing Algorithm (SCpA), implemented in the JADE environment, which extends our previous work, keeping in mind the security of mobile host platforms. The algorithm is series check-pointing in the sense that layers are placed in series, one after the other, in the framework to provide a two-level guard system: if any malevolent agent somehow manages to crack the security at the first level and enter the platform, it may be trapped at the next level, and hence the threat is blocked. The work also aims to evaluate the performance of the agents' execution through graphical analysis. Our previous work successfully proposed a platform security framework (PSF) to secure the host platform from various security threats, but the technical realization of the algorithm and its implementation was deliberately left out; this has now been completed. Keywords: Mobile Agent, Security, Reputation Score, Threshold Value, Check-points, Algorithm.

Journal Article
TL;DR: A Computer Aided Detection system has to be developed for the detection of masses and calcifications in digital mammograms, to act as a secondary tool for radiologists in diagnosing breast cancer.
Abstract: Mammography is one of the available techniques for the early detection of masses or abnormalities related to breast cancer. Breast cancer is the uncontrolled growth of cells in the breast region, which may affect other parts of the body. The most common abnormalities that might indicate breast cancer are masses and calcifications. Masses appear in a mammogram as fine, granular clusters and do not have sharp boundaries, so they are often difficult to identify in a raw mammogram. Digital mammography is one of the best available technologies currently being used for the early detection of breast cancer. A Computer Aided Detection system has to be developed for the detection of masses and calcifications in digital mammograms, to act as a secondary tool for radiologists in diagnosing breast cancer. In this paper, we propose a secondary tool for radiologists that helps them in the segmentation and feature extraction process. Keywords: Mammography, Breast Cancer, Masses, Calcification, Digital Mammography, Computer Aided Detection System, Segmentation, Feature Extraction

Journal Article
TL;DR: In this paper, the challenging issues of spatio-temporal data mining are presented; the ability to analyze such data remains inadequate, and the need for adapted data mining tools is a major challenge.
Abstract: The spatio-temporal database (STDB) has received considerable attention during the past few years, due to the emergence of numerous applications (e.g., flight control systems, weather forecasting, mobile computing, etc.) that demand efficient management of moving objects. These applications record objects' geographical locations (sometimes also shapes) at various timestamps and support queries that explore their historical and future (predictive) behaviors. The STDB significantly extends the traditional spatial database, which deals with only stationary data and hence is inapplicable to moving objects, whose dynamic behavior requires re-investigation of numerous topics including data modeling, indexes, and the related query algorithms. In many application areas, huge amounts of data are generated, explicitly or implicitly containing spatial or spatio-temporal information. However, the ability to analyze these data remains inadequate, and the need for adapted data mining tools becomes a major challenge. In this paper, we present the challenging issues of spatio-temporal data mining. Keywords: database, data mining, spatial, temporal, spatio-temporal

Journal Article
TL;DR: An adaptive neuro-fuzzy inference system (ANFIS) has been used to model the relationship between maximum and minimum temperature data, and its ability to predict weekly temperature data is validated.
Abstract: Temperature changes have a direct effect on crops. In the present study an adaptive neuro-fuzzy inference system (ANFIS) has been used to model the relationship between maximum and minimum temperature data. A time series of weekly maximum temperatures at a location is analyzed to predict the maximum temperature of the next week at that location, based on the weekly maximum temperatures over a span of n previous weeks, referred to as the order of the input. Mean weekly maximum and mean weekly minimum temperature data for the 10 years 1997 to 2006 (520 weeks) were taken from the regional center of the Indian Meteorological Department at Dehradun, India. The objectives of this paper are to develop a prediction model and validate its ability to predict weekly temperature data. Keywords: Minimum weekly temperature, ANFIS, forecasting

Journal Article
TL;DR: This paper is an attempt to establish a linkage between a flowshop scheduling model having a job block criterion and a parallel biserial queue network linked with a common channel in series.
Abstract: This paper is an attempt to establish a linkage between a flowshop scheduling model having a job block criterion and a parallel biserial queue network linked with a common channel in series. The arrival and service patterns both follow a Poisson law in the queue network. The generating function technique, laws of calculus and statistical tools have been used to find the various characteristics of the queue. Further, the completion time of jobs in the queue system forms the setup time for the first machine in the scheduling model. A heuristic approach to find an optimal sequence of jobs with a job block criterion, with minimum total flow time, when the jobs are processed in a combined system with a queue network, is discussed. The proposed method is easy to understand and also provides an important tool for decision makers when production is done in batches. A computer program followed by a numerical illustration is given to justify the algorithm. Keywords: Queue Network, Mean Queue length, Waiting time, Processing time, Job-block, Makespan, Biserial Channel
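For the queueing side, the standard M/M/1 formulas (Poisson arrivals, exponential service) give the flavor of the characteristics derived; the paper's parallel biserial network with a common series channel is more elaborate, so this is only a building-block sketch:

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of a basic M/M/1 queue: Poisson arrivals at
    rate lam, exponential service at rate mu, requiring lam < mu."""
    assert lam < mu, "queue is unstable unless lam < mu"
    rho = lam / mu              # utilization
    L = rho / (1 - rho)         # mean number in system
    W = 1 / (mu - lam)          # mean time in system (Little's law: L = lam * W)
    Lq = rho ** 2 / (1 - rho)   # mean queue length (waiting only)
    return {"rho": rho, "L": L, "W": W, "Lq": Lq}

m = mm1_metrics(2.0, 5.0)  # rho = 0.4
```

In the linkage the paper proposes, a quantity like the mean completion (waiting plus service) time in the queue network would feed in as the setup time of the first machine in the flowshop schedule.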

Journal Article
TL;DR: A Genetic Algorithm for optimum allocation of Virtual Machines (VMs) that permits maximum usage of physical resources, with the fitness function and the GA operators described in detail.
Abstract: In this paper, we propose an optimized scheduling algorithm for cloud services: a Genetic Algorithm for optimum allocation of Virtual Machines (VMs) that permits maximum usage of physical resources. We describe the fitness function and the GA operators in detail, and how they manipulate the problem space. Keywords: cloud computing, service optimization, genetic algorithm
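A stand-in for the kind of fitness function such a GA might use — rewarding consolidation onto few hosts while penalizing overcommitment — could look like this. The objective, names and weights are illustrative assumptions; the paper's actual fitness function and chromosome encoding may differ:

```python
def placement_fitness(assignment, vm_demand, host_capacity):
    """Fitness of a VM-to-host assignment: number of hosts left empty
    (consolidation reward) minus a heavy penalty for overcommitted hosts.
    assignment[i] gives the host index chosen for VM i."""
    load = [0.0] * len(host_capacity)
    for vm, host in enumerate(assignment):
        load[host] += vm_demand[vm]
    empty = sum(1 for l in load if l == 0)
    overload = sum(max(0.0, l - c) for l, c in zip(load, host_capacity))
    return empty - 10.0 * overload

# Packing both VMs on host 0 (capacity 1.0) frees host 1: fitness 1.0.
fit = placement_fitness([0, 0], vm_demand=[0.4, 0.5], host_capacity=[1.0, 1.0])
```

A GA would then evolve `assignment` vectors with crossover and mutation, keeping the highest-fitness placements.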

Journal Article
TL;DR: This paper evaluates security mechanisms on the application, transport and network layers of the ISO/OSI reference model and gives examples of today's most popular security protocols applied in each of the mentioned layers.
Abstract: In a multilayered security infrastructure, the layers are designed so that a vulnerability of one layer does not compromise the other layers, and thus the whole system is not vulnerable. This paper evaluates security mechanisms on the application, transport and network layers of the ISO/OSI reference model and gives examples of today's most popular security protocols applied in each of the mentioned layers. A secure computer network system is recommended that consists of combined security mechanisms on three different ISO/OSI reference model layers: application layer security based on strong user authentication, digital signatures, confidentiality protection, digital certificates and hardware tokens; transport layer security based on the establishment of a cryptographic tunnel between network nodes and a strong node authentication procedure; and network (IP) layer security providing bulk security mechanisms at the network level between network nodes. Strong user authentication procedures based on digital certificates and PKI systems are especially emphasized.

Journal Article
TL;DR: A method for the word sense disambiguation task using a tree-matching approach, which requires a context knowledge base containing a corpus of sentences; some preliminary results are given for a corpus containing the ambiguous words.
Abstract: Word Sense Disambiguation is one of the basic tasks in Natural Language Processing. It is the method of selecting the correct sense of a word in the given context, and is applied whenever a semantic understanding of text is needed. In order to disambiguate a word, two resources are necessary: a context in which the word has been used, and some kind of knowledge related to the word. This paper presents a method for the word sense disambiguation task using a tree-matching approach. The method requires a context knowledge base containing the corpus of sentences. This paper also gives some preliminary results when a corpus containing the ambiguous words is tested on this system. Keywords: Natural Language Understanding, Word Sense Disambiguation, Tree-matching, dependent-word matching

Journal Article
TL;DR: A discussion is presented in order to assess the quality of the compressed image and find the relevant information of the processed image.
Abstract: Image quality is a characteristic of an image that measures the perceived image degradation, typically compared to an ideal or perfect image. Imaging systems may introduce some amount of distortion or artifacts in the signal, so quality assessment is an important problem. Processing of images involves complicated steps; the aim of any processing is to obtain a processed image that is as close as possible to the original. It includes image restoration, enhancement, compression and many more. Whether the reconstructed image after compression has lost its originality is determined by assessing the quality of the image. Traditional perceptual image quality assessment approaches are based on measuring the errors (signal differences) between the distorted and the reference images, and attempt to quantify the errors in a way that simulates human visual error sensitivity features. A discussion is presented here in order to assess the quality of the compressed image, and the relevant information of the processed image is found. Keywords: Reference methods, Quality Assessment, Lateral chromatic aberration, Root Mean Squared Error, Peak Signal to Noise Ratio, Signal to Noise Ratio, Human Visual System.
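Two of the reference-based error measures named in the keywords, RMSE and PSNR, are straightforward to compute; a minimal sketch over flat lists of pixel values:

```python
import math

def rmse(ref, img):
    """Root Mean Squared Error between a reference and a distorted image
    (both given as flat lists of pixel values of equal length)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(ref, img)) / len(ref))

def psnr(ref, img, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher means closer to the
    reference, and an identical image gives infinity."""
    e = rmse(ref, img)
    return float("inf") if e == 0 else 20 * math.log10(peak / e)

ref = [52, 55, 61, 59]
img = [54, 55, 60, 58]
quality_db = psnr(ref, img)
```

These full-reference metrics quantify signal error only; as the abstract notes, perceptual approaches additionally weight the errors by human visual sensitivity.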

Journal Article
TL;DR: A land vehicle tracking system determines the position of a vehicle using a terminal with an embedded GPS receiver or a PCS phone and displays the position on a digital map.
Abstract: As the urban living environment becomes more and more complex, road conditions are worsening because of heavy traffic, an increase in traffic accidents and a high ratio of empty vehicles. This increases the cost of transportation and wastes vehicle movement time. To solve such problems, a land vehicle tracking system has been developed. A land vehicle tracking system determines the position of a vehicle using a terminal with an embedded GPS receiver or a PCS phone and displays the position on a digital map. Recently, vehicle tracking technologies have brought breakthroughs in these areas: commercial vehicle operations, fleet management, dispatching, emergency rescue, hazardous material monitoring, and security. Keywords: Android, Java, Eclipse, GPS, AGPS, Land Vehicle Tracking, Internet.

Journal Article
TL;DR: The CPN model used for composition design verification is reused for test design, and an on-the-fly algorithm is proposed that generates a test suite covering all possible paths without redundancy.
Abstract: Web service composition is the most mature and effective way to realize the rapidly changing requirements of business in service-oriented solutions. Testing the compositions of web services is complex, due to their distributed nature and asynchronous behaviour. Colored Petri Nets (CPNs) provide a framework for the design, specification, validation and verification of systems. In this paper the CPN model used for composition design verification is reused for test design purposes. We propose an on-the-fly algorithm that generates a test suite that covers all possible paths without redundancy. The prioritization of test sequences, test suite size and redundancy reduction are also addressed. The proposed technique was applied to an airline reservation system, and the generated test sequences were evaluated against three coverage criteria: Decision Coverage, Input Output Coverage and Transition Coverage. Keywords: CPN, MBT, web service composition testing, test case generation
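Path-coverage test generation of this kind reduces, in the simplest setting, to enumerating paths through a model graph; a generic depth-first sketch (not the paper's on-the-fly CPN algorithm, which additionally prioritizes and de-duplicates sequences):

```python
def all_paths(graph, start, end, path=None):
    """Enumerate every simple path from start to end by depth-first
    search; each path is a candidate test sequence.
    graph: dict mapping node -> list of successor nodes."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:          # skip cycles to keep paths simple
            paths.extend(all_paths(graph, nxt, end, path))
    return paths

# Toy workflow (hypothetical reservation steps): search -> (book | cancel) -> done
g = {"search": ["book", "cancel"], "book": ["done"], "cancel": ["done"]}
seqs = all_paths(g, "search", "done")
```

Each returned sequence exercises one decision outcome, which is why path coverage subsumes simpler criteria such as decision and transition coverage on this toy graph.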

Journal Article
TL;DR: Data security using crypto-steganography in a web application: a web-based application that conceals important information through hybrid cryptography and steganography and provides a means of its secure transmission through any medium or channel.
Abstract: Data security using crypto-steganography in a web application is a web-based application used to conceal important information through hybrid cryptography and steganography, and to provide a means of its secure transmission through any medium or channel. Using a web browser, the user uploads the important information and an envelope image. These are received by the Data Shielder facade web application, which sends the data and envelope image to the real Data Shielder. It generates a unique key and encrypts the crucial data. The key is associated with a "unique id" and preserved in a store. Then the encrypted information is embedded into the envelope image using a modified BPCS technique, and finally a stego image is generated. Data Shielder returns the "unique id" and stego image to the facade web application, which archives the stego image and unique key and allows the user to download it. The user can simply unzip the archive, transmit the stego image through unsecured channels like email, sockets, pen drives, CDs, DVDs, etc., and keep the unique id safe. When the user wants the data back, the user uploads the stego image and the "unique id" to the Data Shielder facade web application, which sends them to the real Data Shielder. First it finds the encryption key in the store through the unique id. Next, reversing the BPCS steganography, the stego image is processed and the encrypted data is fetched. Finally, decryption is done using the encryption key, and the crucial data is recovered and returned to the facade web application, which renders it to the user. Keywords: Cryptography, Steganography, Stego-image, Threshold Value
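BPCS steganography selects embedding regions by a bit-plane complexity measure. A common formulation, shown here as a generic sketch rather than the paper's modified BPCS, is the fraction of adjacent bit pairs that differ (border complexity); blocks above a threshold are "noisy" enough to replace with payload bits without visible change:

```python
def bitplane_complexity(block):
    """Border complexity of a binary bit-plane block: the fraction of
    horizontally and vertically adjacent bit pairs that differ.
    block: list of equal-length rows of 0/1 values."""
    h, w = len(block), len(block[0])
    changes = sum(block[r][c] != block[r][c + 1]
                  for r in range(h) for c in range(w - 1))
    changes += sum(block[r][c] != block[r + 1][c]
                   for r in range(h - 1) for c in range(w))
    max_changes = h * (w - 1) + (h - 1) * w
    return changes / max_changes

flat = [[0, 0], [0, 0]]        # complexity 0.0: unusable for embedding
checker = [[0, 1], [1, 0]]     # complexity 1.0: maximally noisy
```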

Journal Article
TL;DR: The proposed algorithm has the potential to become an appropriate routing tactic for mobile ad-hoc networks; results are presented based on simulations made with an implementation in ns-2.
Abstract: In this paper, a new approach for an on-demand ad-hoc routing algorithm based on swarm intelligence is proposed. It draws on the foraging behavior of Ant Colony Optimization and Bee Colony Optimization, which are subsets of swarm intelligence, and on the ability of simple ants to solve complex problems by cooperation. Several algorithms based on ant colony behavior have been introduced in the literature to solve different problems, e.g., optimization problems. The results show that the proposed algorithm has the potential to become an appropriate routing tactic for mobile ad-hoc networks. The results are presented based on simulations made with the implementation in ns-2. Keywords: BACOR, Bee Routing, Ant Routing, Bee-Ant Routing
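The core ant-colony mechanism such routing algorithms adapt — evaporate pheromone on all links, then reinforce the links an ant actually traversed — can be sketched as follows. Parameter names and the deposit rule are illustrative; ACO variants (and the paper's bee-ant hybrid) differ in the details:

```python
def update_pheromone(tau, chosen, deposit=1.0, rho=0.1):
    """One pheromone step: evaporate all trails by factor (1 - rho),
    then reinforce the links on the route the ant chose.
    tau: dict mapping link (a, b) -> pheromone level."""
    for link in tau:
        tau[link] *= (1 - rho)                      # evaporation
    for link in chosen:
        tau[link] = tau.get(link, 0.0) + deposit    # reinforcement
    return tau

tau = {("A", "B"): 1.0, ("B", "C"): 1.0, ("A", "C"): 1.0}
tau = update_pheromone(tau, chosen=[("A", "B"), ("B", "C")])
```

Next-hop selection then favors high-pheromone links probabilistically, so frequently successful routes are reinforced while stale ones fade, which is what makes the approach adaptive to mobility.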

Journal Article
TL;DR: Quality is assessed as a function for monitoring and measuring the strength of development processes; any successful application development enterprise requires an unambiguous understanding of customer expectations and maximized customer participation in development activities, thereby ensuring that the people involved do the right thing.
Abstract: The statement "prevention is better than cure" for illnesses in medical science also applies to the software development life cycle in terms of software defects. A defect is a deviation from the actual functionality of the application in terms of the correctness and completeness of the specification of the customer requirements. Defective software fails to meet its customer requirements, leading to applications of poor quality. Quality is a top priority in every enterprise these days. Organizations struggle in a treadmill race to deliver quality software: to stay ahead with new technology, deal with accumulated development backlogs and handle customer issues, software teams work as hard as they can to keep their organizations alive and competitive in the marketplace. Software companies face immense pressure to release a virtually bug-free product or software package. The culture of an organization is a critical success factor in process improvement efforts. The paper aims at assessing quality as a function for monitoring and measuring the strength of development processes; any successful application development enterprise requires an unambiguous understanding of customer expectations and maximized customer participation in development activities, thereby ensuring that the people involved do the right thing and do the thing right, delivering high quality software. Keywords: Software development, process improvement, software defect, bug-free product, software package

Journal Article
TL;DR: The study found a strong relationship between information systems and the process of decision making; the results also show that the bank relies heavily on a number of technologies used by its information systems to implement its key activities.
Abstract: This research aims to analyze the current state of computer information systems and their role in decision making in a Jordanian bank. It identifies the types of computer-based information systems used in the bank. The research relies on an empirical study and a structured questionnaire: 252 questionnaires were distributed at the studied bank, of which 212 were retrieved. The study found a strong relationship between information systems and the process of decision making; the results also show that the bank relies heavily on a number of technologies used by its information systems to implement its key activities. Keywords: Information systems, decision making.

Journal Article
TL;DR: In this article, the authors analyze and evaluate the accessibility of government websites from the perspective of developing countries and present recommendations for improving the accessibility of e-Government websites there.
Abstract: The Web is meant to benefit all people regardless of their economic, social, political, cultural, mental or physical condition, but the proper utilization and distribution of its benefits is crucial. It is essential that the web be accessible to everyone, with equal access and equal opportunity, including people with disabilities. An accessible web can also help the elderly population and people with disabilities contribute more actively to society. In this paper, the researchers analyze and evaluate the accessibility of government websites from the perspective of developing countries, taking Bangladesh as a case study. The paper concentrates on two things: first, it briefly examines accessibility guidelines, evaluation methods and analysis tools; second, it analyzes and evaluates the web accessibility of the e-Government websites of Bangladesh against the W3C Web Content Accessibility Guidelines. We also present recommendations for improving the accessibility of e-Government websites in developing countries. Keywords: Web accessibility, Accessibility guidelines, Assistive tools, e-Government, Accessibility testing and evaluation
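Automated accessibility evaluation of the kind the paper surveys is typically rule-based. A minimal sketch of one such rule, WCAG Success Criterion 1.1.1 (text alternatives: every `<img>` should carry a non-empty `alt` attribute), could look like this; the sample page markup is invented, and real tools cover many more guidelines:

```python
# Minimal rule-based check for WCAG 1.1.1: count <img> tags that lack
# usable alt text. Uses only the standard-library HTML parser.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = 0   # <img> tags with no non-empty alt attribute

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt or not alt.strip():
                self.missing += 1

page = '<html><body><img src="a.png" alt="Map"><img src="b.png"></body></html>'
checker = AltTextChecker()
checker.feed(page)
print(f"images missing alt text: {checker.missing}")  # 1 of the 2 images
```

A full evaluation would run many such checks per page and aggregate violations per guideline, which is essentially what the analysis tools examined in the paper do.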

Journal Article
TL;DR: This work develops a technique that performs query optimization at compile time, reducing the burden of optimization at run time and improving the performance of code execution.
Abstract: Object-Oriented Programming (OOP) is one of the most successful techniques for abstraction. Bundling objects into collections, and then operating on these collections, is a fundamental part of mainstream object-oriented programming languages. Object querying is an abstraction of operations over collections, whereas manual implementations operate at a low level, forcing developers to specify how a task must be done. Some object-oriented languages allow programmers to express queries explicitly in the code, which are then optimized using query optimization techniques from the database domain. In this regard, we have developed a technique that performs query optimization at compile time, reducing the burden of optimization at run time and improving the performance of code execution. Keywords: Querying; joins; compile time; run-time; histograms; query optimization
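To make the optimization concrete, consider an explicit object query that pairs each order with its customer. A sketch (the `Customer`/`Order` classes are invented for the example, and this is not the paper's implementation) shows the naive nested-loop form a programmer might write and the hash join an optimizer would typically rewrite it into:

```python
# An object query -- "pair every order with its customer" -- written as a
# naive O(n*m) nested-loop join and as the O(n+m) hash join a query
# optimizer would rewrite it into. Both produce the same pairs.
from dataclasses import dataclass

@dataclass
class Customer:
    id: int
    name: str

@dataclass
class Order:
    customer_id: int
    total: float

def nested_loop_join(customers, orders):
    """What a hand-written query over collections often looks like."""
    return [(c.name, o.total) for c in customers for o in orders
            if c.id == o.customer_id]

def hash_join(customers, orders):
    """Build a hash table on the join key, then probe it once per order."""
    by_id = {c.id: c for c in customers}
    return [(by_id[o.customer_id].name, o.total)
            for o in orders if o.customer_id in by_id]

customers = [Customer(1, "Ada"), Customer(2, "Alan")]
orders = [Order(1, 9.5), Order(2, 3.0), Order(1, 4.25)]
assert sorted(nested_loop_join(customers, orders)) == sorted(hash_join(customers, orders))
```

The point of doing this rewrite at compile time, as the paper proposes, is that the cost of choosing a join strategy is paid once during compilation rather than on every execution.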

Journal Article
TL;DR: An attempt has been made to present an approach for soft tissue characterization utilizing texture-primitive features and segmentation with an Artificial Neural Network (ANN) classifier tool that directly combines the second, third, and fourth steps into one algorithm.
Abstract: This paper addresses a system that achieves auto-segmentation and cell characterization for predicting the percentage of carcinoma (cancerous) cells in a given image with high accuracy. The system has been designed and developed for the analysis of medical pathological images based on a hybridization of syntactic and statistical approaches, using an Artificial Neural Network (ANN) as a classifier tool [2]. The system performs segmentation and classification as is done in the human vision system [1] [9] [10] [12], which recognizes objects, perceives depth, and identifies different textures, curved surfaces, or surface inclination from texture information and brightness. In this paper, an attempt has been made to present an approach for soft tissue characterization utilizing texture-primitive features and segmentation with an ANN classifier tool. The present approach directly combines the second, third, and fourth steps into one algorithm. It is a semi-supervised approach in which supervision is involved only at the level of defining the structure of the Artificial Neural Network; afterwards, the algorithm itself scans the whole image and performs segmentation and classification in unsupervised mode. Finally, the algorithm was applied to selected pathological images for segmentation and classification. The results were in agreement with those of manual segmentation and were clinically correlated [18] [21]. Keywords: Grey scale images, Histogram equalization, Gaussian filtering, Harris corner detector, Threshold, Seed point, Region growing segmentation, Tamura texture feature extraction, Artificial Neural Network (ANN), Artificial Neuron, Synapses, Weights, Activation function, Learning function, Classification matrix.

Journal Article
TL;DR: This paper proposes a strategy, presented as a workflow chart, that guides stakeholders through the phases of the software development life cycle, synchronizes the work of the tester role and helps ensure a quality product on time.
Abstract: Like other methodologies, Component Based Software Development (CBSD) has become an emerging software development paradigm because it selects reliable, reusable and robust software components and assembles them into a suitable software architecture. In CBSD, emphasis is placed on selecting, testing and adapting new components into the existing software architecture. If these activities are not performed properly, they will affect the functionality and quality of the software. During the software development life cycle, all stakeholders, especially those in tester roles, are involved in overcoming errors and reducing defect rates, so they need proper guidelines. In this paper, the authors propose a strategy, presented as a workflow chart, that helps stakeholders at various phases of the software development life cycle. Moreover, this strategy synchronizes the work of the tester role and helps ensure a quality product on time. Keywords: CBD, Tester, functional testing, stakeholders, quality, CBSD, Third party.

Journal Article
TL;DR: For both the best-effort and QoS IP networks, peak signal-to-noise ratio (PSNR), throughput, and frame and packet statistics have been considered as performance metrics, and the calculated values reflect that video transmission over the QoS IP network is better than over the best-effort network.
Abstract: The demand for video communication over the internet has been growing rapidly in recent years, and video quality has become a challenging issue for video transmission. Different video coding standards, such as MPEG-2 and MPEG-4, have been developed to support applications like video transmission. MPEG-2, which requires high bit-rate transmission, has been a successful video standard for DVD and satellite digital broadcasting. On the other hand, MPEG-4 supports low bit rates and is suitable for transmitting video over IP networks. In this paper, the MPEG-4 video standard has been used to evaluate the performance of video transmission over two IP networks: best-effort and Quality of Service (QoS). For both the best-effort and QoS IP networks, peak signal-to-noise ratio (PSNR), throughput, and frame and packet statistics have been considered as performance metrics. The calculated values of these performance metrics reflect that video transmission over the QoS IP network is better than over the best-effort network. Keywords: video transmission, mpeg, ip networks, best-effort, quality of service, ns-2
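The PSNR metric used in the evaluation above is computed per frame from the mean squared error between the original and the received picture. A minimal sketch for 8-bit grey-scale frames (the tiny 2x2 frames below are invented; a real evaluation compares full decoded video frames):

```python
# PSNR in dB for 8-bit samples: 10 * log10(MAX^2 / MSE), where MSE is the
# mean squared error between the original and received frames.
import math

def psnr(original, received, max_value=255):
    n = 0
    squared_error = 0
    for row_o, row_r in zip(original, received):
        for a, b in zip(row_o, row_r):
            squared_error += (a - b) ** 2
            n += 1
    mse = squared_error / n
    if mse == 0:
        return float("inf")   # identical frames: no noise at all
    return 10 * math.log10(max_value ** 2 / mse)

frame = [[100, 110], [120, 130]]
degraded = [[101, 108], [123, 128]]
print(f"PSNR: {psnr(frame, degraded):.1f} dB")  # PSNR: 41.6 dB
```

Averaging this value over all frames of a sequence gives the single PSNR figure typically reported when comparing the best-effort and QoS transmissions.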