
Showing papers presented at the "International Conference on Innovations in Information Technology" in 2007


Proceedings ArticleDOI
01 Nov 2007
TL;DR: This part focuses on PLC applications, the technical issues surrounding them, and the layers and methods needed to make the technology work.
Abstract: We give an overview of power line communications (PLC) technology, its importance, its standards, and the HomePlug standards associated with it. This is done in two parts due to publication constraints. In this part, we concentrate on PLC applications and the technical issues surrounding them. We also cover the layers and methods that are needed to make the technology work.

74 citations


Proceedings ArticleDOI
18 Nov 2007
TL;DR: This paper compares and contrasts two feature selection techniques, stemming and light stemming, when applied to an Arabic corpus.
Abstract: This paper compares and contrasts two feature selection techniques when applied to an Arabic corpus; in particular, stemming and light stemming were employed. With stemming, words are reduced to their stems. With light stemming, words are reduced to their light stems. Stemming is aggressive in the sense that it reduces words to their 3-letter roots. This affects the semantics, as several words with different meanings might share the same root. Light stemming, by comparison, removes frequently used prefixes and suffixes in Arabic words. Light stemming does not produce the root and therefore does not affect the semantics of words; it maps several words that have the same meaning to a common syntactical form. The effectiveness of the above two feature selection techniques was assessed in a text categorization exercise on an Arabic corpus. This corpus consists of 15,000 documents that fall into three categories. The K-nearest neighbors (KNN) classifier was used in this work. Several experiments were carried out using two different representations of the same corpus: the first version uses stem vectors, and the second uses light stem vectors as representatives of documents. These two representations were assessed in terms of size, time and accuracy. The light stem representation was superior in terms of classifier accuracy when compared with stemming.
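
A minimal sketch of the comparison pipeline the abstract describes, assuming whitespace tokenization and toy affix lists; the paper's corpus and its actual stemming rules are not given here, so light_stem below is only a hypothetical stand-in:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Toy affix lists for illustration only; real Arabic light stemmers
# use curated rule tables.
PREFIXES = ("ال", "و", "ب", "ل")
SUFFIXES = ("ها", "ات", "ون", "ين", "ة")

def light_stem(word):
    """Strip at most one frequent prefix and one suffix, keeping >= 3 letters."""
    for p in PREFIXES:
        if word.startswith(p) and len(word) - len(p) >= 3:
            word = word[len(p):]
            break
    for s in SUFFIXES:
        if word.endswith(s) and len(word) - len(s) >= 3:
            word = word[:-len(s)]
            break
    return word

def knn_accuracy(documents, labels, stemmer):
    """Mean 5-fold KNN accuracy over bag-of-stems document vectors."""
    vec = CountVectorizer(analyzer=lambda d: [stemmer(w) for w in d.split()])
    X = vec.fit_transform(documents)
    return cross_val_score(KNeighborsClassifier(n_neighbors=5), X, labels, cv=5).mean()
```

Running knn_accuracy once with a root-producing stemmer and once with light_stem would reproduce the paper's size/time/accuracy comparison on a real corpus.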

55 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: A heuristic opportunistic scheduler that is consistent with the resource allocation constraints of the uplink channel for 3G long term evolution (LTE) systems is proposed and an integer programming problem is formulated that provides a theoretical optimal solution.
Abstract: In this paper we propose a heuristic opportunistic scheduler that is consistent with the resource allocation constraints of the uplink channel for 3G long term evolution (LTE) systems. Retransmission processes as well as channel conditions are taken into account in the formation of the scheduling decision. The heuristic scheduler can be seen as a reasonable choice for solving the constrained resource allocation maximization problem, since it is computationally feasible and finds a practical solution. We also formulate an integer programming problem that provides a theoretical optimal solution. The heuristic scheduler was found to perform relatively well when compared to the optimal solution.
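
One well-known constraint on the LTE uplink is that each user's resource blocks (RBs) must be contiguous in frequency, a consequence of SC-FDMA. The sketch below shows the shape of a greedy scheduler honoring that constraint; it illustrates the problem, not the paper's exact heuristic, and the per-user channel metric is an assumed input:

```python
def greedy_contiguous_schedule(metric, n_rbs):
    """metric[u][rb]: channel-quality metric of user u on resource block rb.
    Returns a list mapping each RB to the user it is allocated to."""
    n_users = len(metric)
    alloc = [None] * n_rbs
    scheduled = set()  # users whose contiguous run has already started
    active = None      # user holding the run ending at the previous RB
    for rb in range(n_rbs):
        best_u, best_m = None, float("-inf")
        for u in range(n_users):
            # A user may take this RB only by extending its own run,
            # or by starting its first (and only) run.
            if u != active and u in scheduled:
                continue
            if metric[u][rb] > best_m:
                best_u, best_m = u, metric[u][rb]
        alloc[rb] = best_u
        scheduled.add(best_u)
        active = best_u
    return alloc
```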

49 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: This paper looks at the bottlenecks associated with packet capturing using commodity hardware in local area networks (LANs) without losing data and shows that increasing the buffering at either the kernel level or the application level can significantly improve capturing performance.
Abstract: This paper looks at the bottlenecks associated with packet capturing using commodity hardware in local area networks (LANs) without losing data. Experiments were carried out using the Wireshark packet sniffer to write captured packets directly to disk in a Fast Ethernet network with various test setups. These experiments involved generating large packets at almost line rate. Various sizes of the kernel-level buffer associated with the packet capturing socket were also tried. In addition, a simple multithreaded design with user-level buffers was proposed for the capturing application, and experiments were carried out with this solution. The results showed that increasing the buffering at either the kernel level or the application level can significantly improve capturing performance. The best results can be achieved by using a mix of increased kernel socket buffering and a multithreaded capturing application with its own store-and-hold buffers.
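
The store-and-hold idea lends itself to a classic producer-consumer design: one thread drains the capture socket as fast as possible while another writes to disk, so short disk stalls no longer overflow the kernel buffer. A hedged sketch, with read_packet and write_packet as hypothetical I/O stand-ins:

```python
import queue
import threading

buf = queue.Queue(maxsize=65536)  # user-level store-and-hold buffer

def capture_loop(read_packet):
    """Producer: pull packets off the capture socket as fast as possible."""
    while True:
        pkt = read_packet()      # e.g. a recv() on a raw capture socket
        if pkt is None:
            buf.put(None)        # sentinel: capture finished
            return
        buf.put(pkt)             # blocks if full; a real sniffer might drop

def writer_loop(write_packet):
    """Consumer: drain the buffer to disk at whatever rate the disk allows."""
    while True:
        pkt = buf.get()
        if pkt is None:
            return
        write_packet(pkt)        # e.g. append the packet to a pcap file

# threading.Thread(target=capture_loop, args=(read_pkt,), daemon=True).start()
# threading.Thread(target=writer_loop, args=(write_pkt,)).start()
```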

44 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: A novel feature extraction method for offline recognition of segmented handwritten characters based on the fuzzy-zoning and normalized vector distance measures is presented and this method is found to be promising.
Abstract: This paper presents a novel feature extraction method for offline recognition of segmented handwritten characters based on fuzzy-zoning and normalized vector distance measures. Experiments are conducted on forty-four basic Malayalam handwritten characters. Recognition experiments are conducted using a class modular neural network with the proposed features, and the method is found to be promising.
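
The abstract does not spell out the fuzzy membership or normalized vector distance definitions, so the sketch below only illustrates the general shape of zoning features on a binarized character image; the grid size and the centroid-distance feature are illustrative assumptions, not the paper's formulas:

```python
import numpy as np

def zone_features(img, grid=4):
    """Split a binary character image into grid x grid zones and return, per
    zone, the distance of the zone's ink centroid from the zone centre,
    normalized to [0, 1] (a crude stand-in for a vector-distance feature)."""
    h, w = img.shape
    zh, zw = h // grid, w // grid
    feats = []
    for i in range(grid):
        for j in range(grid):
            zone = img[i * zh:(i + 1) * zh, j * zw:(j + 1) * zw]
            ys, xs = np.nonzero(zone)
            if len(xs) == 0:
                feats.append(0.0)  # empty zone contributes a zero feature
                continue
            d = np.hypot(ys.mean() - zh / 2, xs.mean() - zw / 2)
            feats.append(d / np.hypot(zh / 2, zw / 2))
    return np.array(feats)         # grid*grid features fed to the classifier
```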

29 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: Two basic techniques have been applied, a region detection technique and a feature extraction technique, with the aim of achieving a high specificity rate and reducing the time consumed to analyze sputum samples.
Abstract: The analysis of sputum color images can be used to detect lung cancer in its early stages. However, the analysis of sputum is time consuming and requires highly trained personnel to avoid high error rates. Image processing techniques provide a good tool for improving the manual screening of sputum samples. In this paper two basic techniques have been applied, a region detection technique and a feature extraction technique, with the aim of achieving a high specificity rate and reducing the time consumed in analyzing sputum samples. These techniques are based on determining the shape of the nuclei inside the sputum cells. After that, we extract some features from the nuclei shape to build our diagnostic rule. The final results will be used in a computer aided diagnosis (CAD) system for early detection of lung cancer.

28 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: The result of the pilot testing shows that acceptance-and-use readiness appeared to be the most important attribute, followed by structural and then engagement readiness, while need-change readiness is the least important.
Abstract: The main objective of this paper is to report on a pilot test of a proposed formal model for e-Healthcare readiness assessment. The model provides a tool for determining critical factors of e-Healthcare readiness: need-change readiness, engagement readiness, structural readiness, and acceptance-and-use readiness. These factors constitute the main constructs of the model, which are formalized as a hierarchical e-Healthcare readiness index system. The model was operationalized and pilot tested to determine the e-Healthcare readiness status of healthcare practitioners, the public and patients from communities associated with two healthcare facilities in the Uthungulu Health District of the KwaZulu-Natal province of the Republic of South Africa (RSA). The results of the pilot testing show that (i) acceptance-and-use readiness appeared to be the most important attribute, followed by structural and then engagement readiness, while need-change readiness is the least important; (ii) healthcare practitioners agreed that they are e-Healthcare ready, while the public and patients fairly agreed; and (iii) the attitude of healthcare practitioners can be determined as a function of their preference for technology usefulness over ease of use. The theoretical framework for the model is drawn from change and change management theories, IT acceptance and use theories, and innovation adoption theories.

28 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: This research focuses on the behavior of concept lattice reduction after applying matrix decompositions; the modified matrix has lower dimensions and acts as input for known lattice construction algorithms.
Abstract: The high complexity of formal concept analysis algorithms and lattice construction algorithms is a major problem today. If we want to compute all concepts from a huge incidence matrix, complexity plays a great role. In some cases, we do not need to compute all concepts, but only some of them. Our research focuses on the behavior of concept lattice reduction after applying matrix decompositions. The modified matrix has lower dimensions and acts as input for known lattice construction algorithms. In this paper we describe the differences between methods for matrix decomposition and their influence on the concept lattice.

27 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: This paper applies the particle swarm optimization algorithm to the classic timetabling problem and shows that the number of unplaced events (error) is decreased in comparison with previous approaches.
Abstract: A timetabling problem is usually defined as assigning a set of events to a number of rooms and timeslots such that they satisfy a number of constraints. Particle swarm optimization (PSO) is a stochastic, population-based computer problem-solving algorithm; it is a kind of swarm intelligence that is based on social-psychological principles and provides insights into social behavior, as well as contributing to engineering applications. This paper applies the particle swarm optimization algorithm to the classic timetabling problem. This is inspired by similar attempts belonging to the evolutionary paradigm, in which the metaheuristic involved is tweaked to suit the grouping nature of problems such as timetabling, graph coloring or bin packing. In the case of evolutionary algorithms, this typically means substituting the "traditional operators" for newly defined ones that seek to evolve fit groups rather than fit items. We apply a similar idea to the PSO algorithm and compare the results. The results show that the number of unplaced events (error) is decreased in comparison with previous approaches.
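
A minimal sketch of the objective such a swarm would minimize, the number of events that cannot be placed without conflict; the grouping operators the paper substitutes for classic velocity updates are not reproduced here, and the conflict encoding is an assumption:

```python
def unplaced_events(assignment, conflicts):
    """assignment: event -> (room, timeslot); conflicts: pairs of events
    that must not share a timeslot. Returns the error count to minimize."""
    unplaced = 0
    taken = set()  # (room, timeslot) pairs already occupied
    for ev, (room, slot) in assignment.items():
        clash = any(assignment[other][1] == slot
                    for pair in conflicts if ev in pair
                    for other in pair if other != ev and other in assignment)
        if clash or (room, slot) in taken:
            unplaced += 1
        else:
            taken.add((room, slot))
    return unplaced
```

Each particle encodes a candidate assignment; the swarm's best-known positions steer the search toward assignments with fewer unplaced events.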

26 citations


Proceedings ArticleDOI
01 Nov 2007
TL;DR: A set of textural features was applied to 120 digital mammographic images from the Digital Database for Screening Mammography, and these features were used in conjunction with SVMs to detect breast cancer.
Abstract: Localized textural analysis of breast tissue on mammograms has recently gained considerable attention from researchers studying breast cancer detection. Despite the research progress on this problem, detecting breast cancer based on textural features has not been investigated in depth. In this paper we study breast cancer detection based on statistical texture features using a Support Vector Machine (SVM). A set of textural features was applied to a set of 120 digital mammographic images from the Digital Database for Screening Mammography. These features were then used in conjunction with SVMs to detect breast cancer. Other linear and non-linear classifiers were also employed for comparison with the SVM's performance. The SVM achieved the best classification accuracy, at 82.5%.
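
A hedged sketch in the spirit of the study: first-order statistical texture descriptors of a grayscale region fed to an SVM. The paper's exact feature set and mammogram preprocessing are not reproduced; the five features below are common textbook choices:

```python
import numpy as np
from sklearn.svm import SVC

def texture_features(patch):
    """First-order statistical texture features of an 8-bit grayscale patch."""
    hist, _ = np.histogram(patch, bins=256, range=(0, 256), density=True)
    levels = np.arange(256)
    mean = (levels * hist).sum()
    variance = ((levels - mean) ** 2 * hist).sum()
    skewness = ((levels - mean) ** 3 * hist).sum()
    energy = (hist ** 2).sum()  # a.k.a. uniformity
    nz = hist[hist > 0]
    entropy = -(nz * np.log2(nz)).sum()
    return [mean, variance, skewness, energy, entropy]

# X = [texture_features(p) for p in patches]; y = benign/malignant labels
# clf = SVC(kernel="rbf").fit(X, y)   # compare against linear classifiers
```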

25 citations


Proceedings ArticleDOI
01 Jan 2007
TL;DR: The proposed VHO scheme is based on IEEE 802.21 media independent handover functions (MIHFs) and new network entities for a network-based mobility management scheme in the course of MIP signaling.
Abstract: This paper proposes a vertical handover (VHO) solution between the two emerging future mobile communication systems, WiMAX and 3rd generation long term evolution (3G-LTE)/system architecture evolution (SAE). The proposed VHO scheme is based on IEEE 802.21 media independent handover functions (MIHFs) and new network entities for a network-based mobility management scheme in the course of MIP signaling.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A proof-of-concept prototype of an asynchronous Mob-WS application is developed and presented, with a special focus on network optimization for wireless sensor networks (WSN).
Abstract: Mobile phones in today's era are not just small devices that provide the means of communication; rather, they are equipped with ever more processing power, storage capacity and battery performance. Handheld devices are now not only service consumers but are also capable of hosting and providing services to their peers. These services deployed on mobile devices bring the idea of mobile Web services (Mob-WS) to the research community. This paper concentrates on a middleware for long-lived Mob-WS that are accessible asynchronously over the network. Since synchronous Mob-WS are not feasible for long-duration tasks, a concept and architecture of a controllable and monitorable asynchronous Mob-WS middleware is presented. The service interaction techniques are discussed, followed by the middleware subsystem, the high-level architecture and the control flow. The presented middleware is a potential basis for innovative mobile applications; therefore, a proof-of-concept prototype of an asynchronous Mob-WS application is developed and presented with a special focus on network optimization for wireless sensor networks (WSN).

Proceedings ArticleDOI
01 Nov 2007
TL;DR: This paper proposes a two-tier approach to access control for e-health portals that supplements existing role based access control capabilities with a rule-based access control module based on the classical flexible authorization framework (FAF) model.
Abstract: Many e-health portal systems are implemented using off-the-shelf software components. The security features provided by such components are usually insufficient. This paper addresses the issue from the access control perspective. More specifically, we first propose a two-tier approach to access control for e-health portals. The approach supplements existing role based access control (RBAC) capabilities with a rule-based access control module based on the classical flexible authorization framework (FAF) model. We study conflict resolution and interaction between the two modules. We also address authentication for real-time services provided by remote service providers.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: It is demonstrated that this model includes the necessary components to describe various actual distributed system technologies, and provides the mechanisms to describe concurrent network traffic, evaluate different strategies in data replication, and analyze job scheduling procedures.
Abstract: The use of discrete-event simulators in the design and development of large scale distributed systems is appealing due to their efficiency and scalability. Their core abstractions of process and event map neatly to the components and interactions of modern-day distributed systems and allow designing realistic simulation scenarios. MONARC 2, a multithreaded, process oriented simulation framework designed for modelling large scale distributed systems, allows the realistic simulation of a wide-range of distributed system technologies, with respect to their specific components and characteristics. In this paper we present the design characteristics of the simulation model proposed in MONARC 2. We demonstrate that this model includes the necessary components to describe various actual distributed system technologies, and provides the mechanisms to describe concurrent network traffic, evaluate different strategies in data replication, and analyze job scheduling procedures.
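
At the heart of a MONARC-style simulator sits a time-ordered event queue driving callbacks; processes, threads and network models are layered on top. A minimal sketch of that core loop (not MONARC 2's actual API):

```python
import heapq

class Simulator:
    """Tiny discrete-event core: schedule(delay, action), then run()."""
    def __init__(self):
        self.now = 0.0
        self._queue = []
        self._seq = 0  # tie-breaker keeps events with equal times ordered

    def schedule(self, delay, action):
        self._seq += 1
        heapq.heappush(self._queue, (self.now + delay, self._seq, action))

    def run(self, until=float("inf")):
        while self._queue and self._queue[0][0] <= until:
            self.now, _, action = heapq.heappop(self._queue)
            action()  # an action may schedule further events

sim = Simulator()
sim.schedule(2.0, lambda: print("job finished at t =", sim.now))
sim.run()
```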

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A genetic algorithm is proposed to optimize the two objectives simultaneously; in experiments it shows better results than the baselines, reducing finishing time and waiting time at the same time.
Abstract: Job scheduling is an important issue with many applications in different fields. In this paper, job scheduling in a multiprocessor architecture is studied. The main issue is how jobs are partitioned among processors such that total finishing time and waiting time are minimized. Minimizing these two criteria simultaneously is a multi-objective optimization problem. To solve this problem, a genetic algorithm is proposed that optimizes the two objectives simultaneously, using a fitness function based on aggregation. In addition, the longest processing time and shortest processing time algorithms are implemented for comparison with the genetic algorithm. The results of the three methods are compared under identical simulation conditions. The proposed genetic algorithm shows better results in the experiments and can reduce finishing time and waiting time simultaneously.
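
A sketch of the aggregated (weighted-sum) fitness such a GA would minimize over the two objectives; the weights, the job-to-processor encoding, and the shortest-job-first order inside each processor queue are illustrative assumptions:

```python
def fitness(assignment, times, n_procs, w1=0.5, w2=0.5):
    """assignment[i]: processor of job i; times[i]: processing time of job i.
    Returns w1 * makespan + w2 * total waiting time (to be minimized)."""
    queues = [[] for _ in range(n_procs)]
    for job, proc in enumerate(assignment):
        queues[proc].append(times[job])
    makespan, waiting = 0.0, 0.0
    for q in queues:
        q.sort()                # assume shortest-job-first on each processor
        elapsed = 0.0
        for t in q:
            waiting += elapsed  # a job waits for everything queued before it
            elapsed += t
        makespan = max(makespan, elapsed)
    return w1 * makespan + w2 * waiting
```

The LPT and SPT baselines amount to sorting all jobs once (descending or ascending by processing time) and assigning each job to the currently least-loaded processor.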

Proceedings ArticleDOI
01 Nov 2007
TL;DR: Experimental results show that the proposed cache locking algorithm improves predictability and performance for up to 15% cache locking; beyond that, predictability may be further enhanced only by sacrificing performance.
Abstract: Cache memory improves performance by reducing the speed gap between the CPU and the main memory. However, the execution time becomes unpredictable due to the cache's adaptive and dynamic behavior. Real-time applications are subject to operational deadlines, and predictability is considered necessary to support them. Studies show that for embedded systems, cache locking helps determine the worst case execution time (WCET) and cache-related preemption delay. In this work, we evaluate the predictability of an embedded system running real-time applications under instruction cache (I-cache) locking. We implement an algorithm that locks the blocks that may cause the most cache misses, using the Heptane simulation tool. We obtain CPU utilization measures for both cache analysis (no cache locking) and I-cache locking. Experimental results show that our proposed cache locking algorithm improves predictability and performance for up to 15% cache locking; beyond that, predictability may be further enhanced only by sacrificing performance.
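
The selection step can be summarized as a greedy pick of the instruction blocks predicted to miss most often, up to the lock budget. A sketch under the assumption that per-block miss estimates are available from static analysis or a profiling run (e.g. with Heptane):

```python
def pick_locked_blocks(miss_count, cache_blocks, lock_fraction=0.15):
    """miss_count: block address -> predicted miss count.
    Returns the blocks to lock, limited to lock_fraction of the cache."""
    budget = int(cache_blocks * lock_fraction)
    ranked = sorted(miss_count, key=miss_count.get, reverse=True)
    return ranked[:budget]  # lock the worst offenders first
```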

Proceedings ArticleDOI
Ali Al-Haj1
01 Nov 2007
TL;DR: An imperceptible and robust digital image watermarking algorithm is proposed, based on combining two powerful transform domain techniques: the discrete wavelet transform (DWT) and the singular value decomposition (SVD).
Abstract: Digital image watermarking is an emerging copyright protection technology. It aims at asserting intellectual property rights over digital images by inserting a copyright identifier in the contents of the image, without sacrificing its quality. In this paper, we propose an imperceptible and robust digital image watermarking algorithm. The algorithm is based on combining two powerful transform domain techniques: the discrete wavelet transform (DWT) and the singular value decomposition (SVD). Performance evaluation results demonstrate the effectiveness of the proposed algorithm with respect to the requirements of image watermarking: imperceptibility and robustness.
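
A hedged sketch of one common DWT+SVD embedding scheme: transform the cover image to the wavelet domain, then perturb the singular values of the LL subband with those of the watermark. The subband choice, the additive rule and alpha are illustrative; the paper's exact embedding rule may differ:

```python
import numpy as np
import pywt  # PyWavelets

def embed(image, watermark, alpha=0.05, wavelet="haar"):
    """Return a watermarked copy of a grayscale image (2-D float arrays)."""
    LL, (LH, HL, HH) = pywt.dwt2(image.astype(float), wavelet)
    U, S, Vt = np.linalg.svd(LL, full_matrices=False)
    Sw = np.linalg.svd(watermark.astype(float), compute_uv=False)
    n = min(len(S), len(Sw))
    S[:n] += alpha * Sw[:n]          # spread the mark over singular values
    LL_marked = (U * S) @ Vt         # rebuild LL from the marked spectrum
    return pywt.idwt2((LL_marked, (LH, HL, HH)), wavelet)
```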

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A framework for the comparison and evaluation of agent methodologies is presented; it aims at recognizing the shortcomings of each methodology and should help those pursuing the development and improvement of agent-oriented methodologies.
Abstract: A number of methodologies have been suggested for multi-agent systems. Yet their application is still very limited, because they lack several aspects needed for the development of agent systems and require improvement. In this context, a comparative framework for the evaluation of those methodologies is needed in order to show their advantages and disadvantages. Such a framework is an important factor in their improvement and development. This paper presents a framework for the comparison and evaluation of agent methodologies. Several known methodologies are evaluated using this framework. The framework aims at recognizing the shortcomings of each of the methodologies and should help those pursuing the development and improvement of agent-oriented methodologies.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A Cross-layer Vertical Handover (CVH) decision model is presented that maintains the pre-decision load of vertical handover processing on two layers, i.e., the application layer and a thin shim layer between the network and data link layers.
Abstract: Vertical handover in heterogeneous wireless networks is a strategic decision due to its profound impact on mobile applications and their performance. Primarily it is an issue of network availability; secondarily, it is a question of choosing the best available service. This paper presents a Cross-layer Vertical Handover (CVH) decision model that maintains the pre-decision load of vertical handover processing on two layers, i.e., the application layer and a thin shim layer between the network and data link layers. A matchmaker service helps in the handover decision by best matching application demands with the available connectivity options. The results show low-latency, highly consistent vertical handovers during mobility and improved application performance with respect to throughput and reliability.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: An automatic, dynamic index selection method for data warehouses is presented, based on incremental frequent itemset mining from a given query workload; it helps update the set of selected indexes when the workload evolves instead of recreating it from scratch.
Abstract: Analytical queries defined on data warehouses are complex and use several join operations that are very costly, especially when run on very large data volumes. To improve response times, data warehouse administrators typically use indexing techniques. This task is nevertheless complex and fastidious. In this paper, we present an automatic, dynamic index selection method for data warehouses that is based on incremental frequent itemset mining from a given query workload. The main advantage of this approach is that it helps update the set of selected indexes when the workload evolves instead of recreating it from scratch. Preliminary experimental results illustrate the efficiency of this approach, both in terms of performance enhancement and overhead.
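
The mining idea can be pictured as treating the attributes referenced by each query as a transaction and keeping frequent attribute sets as index candidates, with counts folded in incrementally as the workload grows. A toy sketch restricted to one- and two-attribute sets (a real miner handles every size):

```python
from collections import Counter
from itertools import combinations

class IndexAdvisor:
    """Toy incremental frequent-itemset miner over query attribute sets."""
    def __init__(self, min_support=0.1):
        self.min_support = min_support
        self.counts = Counter()
        self.n_queries = 0

    def observe(self, query_attributes):
        """Fold one query's referenced attributes into the counts."""
        self.n_queries += 1
        attrs = sorted(set(query_attributes))
        for size in (1, 2):
            for itemset in combinations(attrs, size):
                self.counts[itemset] += 1

    def recommended_indexes(self):
        """Attribute sets frequent enough in the workload so far."""
        threshold = self.min_support * self.n_queries
        return [s for s, c in self.counts.items() if c >= threshold]
```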

Proceedings ArticleDOI
01 Nov 2007
TL;DR: The SemID ontology is proposed which formalizes roles of the members, and controls access to project resources by means of formalized privacy policies and rules.
Abstract: The need for information security and privacy in today's connected systems is overwhelming. This paper focuses on access control and privacy issues in a project-based business environment, regarding access to project resources and maintaining the privacy of members. In this regard, the SemID ontology is proposed, which formalizes the roles of members and controls access to project resources by means of formalized privacy policies and rules. The ontology is modeled from a corporate project scenario using the Protege ontology editor platform.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: This paper proposes a novel vertical handoff decision scheme using the simple additive weighting (SAW) ranking method in a distributed manner and presents the simulation results and analysis, which show good performance of the proposed scheme.
Abstract: Next generation wireless networks (NGWN) offer users the ability to be served while moving from one network to another or between subnetworks. Handoff is a main challenge for NGWN; the challenge lies in deciding which network to connect to in order to keep the mobile terminal always best connected. While moving, a mobile terminal may encounter different types of networks with a large number of characteristics that need to be considered when choosing between these networks. In this paper we propose a novel vertical handoff decision scheme using the simple additive weighting (SAW) ranking method in a distributed manner. We then present the simulation results and analysis, which show good performance of the proposed scheme.
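
SAW itself is simple enough to state in a few lines: normalize each criterion across the candidate networks, weight it, and sum. The criteria, the weights and the min-max normalization below are illustrative assumptions:

```python
def saw_choose(networks, weights, benefit):
    """networks: name -> {criterion: value}; weights: criterion -> weight;
    benefit[c] is True when larger is better (e.g. bandwidth) and False
    when smaller is better (e.g. cost, delay). Returns the best network."""
    criteria = list(weights)
    lo = {c: min(n[c] for n in networks.values()) for c in criteria}
    hi = {c: max(n[c] for n in networks.values()) for c in criteria}

    def score(n):
        total = 0.0
        for c in criteria:
            span = (hi[c] - lo[c]) or 1.0  # avoid dividing by zero
            norm = (n[c] - lo[c]) / span if benefit[c] else (hi[c] - n[c]) / span
            total += weights[c] * norm
        return total

    return max(networks, key=lambda name: score(networks[name]))

# saw_choose({"wlan": {"bw": 11, "cost": 1}, "umts": {"bw": 2, "cost": 3}},
#            {"bw": 0.6, "cost": 0.4}, {"bw": True, "cost": False})  # -> "wlan"
```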

Proceedings ArticleDOI
01 Nov 2007
TL;DR: This paper presents a novel framework for multimedia processing in wireless sensor networks that considers the needs of surveillance video applications; automatically extracted moving objects are treated as intruder events, and their positions are exploited for efficient communication.
Abstract: In order to extract detailed information about the environment, multimedia sensor networks are becoming popular nowadays. However, due to the unique properties of multimedia data delivery, we face novel challenges for resource-constrained sensor networks. Because of the high bandwidth demands of multimedia frames, the transmission of raw data collected at sensor nodes is costly. On the other hand, processing limitations prohibit the use of sophisticated multimedia processing at individual nodes to reduce the amount of data that needs to be communicated. In this paper, we present a novel framework for multimedia processing in wireless sensor networks that considers the needs of surveillance video applications. In our framework, automatically extracted moving objects are treated as intruder events, and their positions are exploited for efficient communication. We then apply joint processing of the collected data at the sink to identify events using fuzzy memberships and to decide the actual multimedia data to be sent from the sensors to the sink.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: Three heuristic clustering methods based on the MEC and MEC/GI models are presented; as numerical results on real biological data and simulated data show, the clustering algorithms work well, and an increase in the rate of similarity between the real haplotypes and the reconstructed ones is gained.
Abstract: Most positions of the human genome are typically invariant (99%), and only some positions (1%) are commonly variant; these are associated with complex genetic diseases. Haplotype reconstruction is the task of dividing aligned SNP fragments (SNPs being the most frequent form of such variation) into two classes, thus inferring a pair of haplotypes from them. Minimum error correction (MEC) is an important model for this problem, but it is only effective when the error rate of the fragments is low. MEC/GI, an extension of MEC, employs the related genotype information besides the SNP fragments and so results in a more accurate inference. The haplotyping problem, due to its NP-hardness, may have no efficient algorithm for an exact solution. In this paper, three heuristic clustering methods based on the MEC and MEC/GI models are presented. As numerical results on real biological data and simulated data show, the clustering algorithms work well, and an increase in the rate of similarity between the real haplotypes and the reconstructed ones is gained.
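
A heuristic sketch in the spirit of MEC-based clustering: alternately assign each SNP fragment to the closer of two candidate haplotypes (counting mismatches only at covered positions) and rebuild each haplotype by majority vote. This toy uses '-' for uncovered positions and is not the paper's algorithm:

```python
def distance(frag, hap):
    """Mismatches between a fragment and a haplotype at covered sites."""
    return sum(1 for f, h in zip(frag, hap) if f != "-" and f != h)

def cluster_fragments(frags, n_sites, iters=20):
    haps = ["0" * n_sites, "1" * n_sites]  # crude initial pair
    groups = ([], [])
    for _ in range(iters):
        groups = ([], [])
        for f in frags:  # assign each fragment to the closer haplotype
            k = 0 if distance(f, haps[0]) <= distance(f, haps[1]) else 1
            groups[k].append(f)
        for k in (0, 1):  # rebuild each haplotype by per-site majority vote
            new = []
            for j in range(n_sites):
                col = [f[j] for f in groups[k] if f[j] != "-"]
                new.append(max("01", key=col.count) if col else haps[k][j])
            haps[k] = "".join(new)
    return haps, groups
```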

Proceedings ArticleDOI
01 Nov 2007
TL;DR: The results show that an RBFN using fuzzy C-means performs better than an RBFN using hard C-means; the fuzzy variant can greatly improve the accuracy of the obtained estimates.
Abstract: The fuzzy radial basis function neural network (FRBFN) for software cost estimation is designed by integrating the principles of RBFNs and the fuzzy C-means clustering algorithm. The architecture of the network is suitably modified at the hidden layer to realise a novel neural implementation of the fuzzy clustering algorithm. Fuzzy set-theoretic concepts are incorporated at the hidden layer, enabling the model to handle uncertain and imprecise data, which can greatly improve the accuracy of the obtained estimates. MMRE and Pred are used as measures of prediction accuracy for this comparative study. The results show that an RBFN using fuzzy C-means performs better than an RBFN using hard C-means. This study uses data on web applications from the Tukutuku database.
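
A compact sketch of the FRBFN idea: place the RBF centres with fuzzy C-means, whose soft memberships drive the centre update, then fit the linear output layer by least squares. The fuzzifier m, the number of centres and sigma are illustrative:

```python
import numpy as np

def fuzzy_cmeans_centers(X, c=4, m=2.0, iters=50, seed=0):
    """Return c cluster centres of X (n_samples x n_features) via fuzzy C-means."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), c, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - C[None], axis=2) + 1e-9
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)        # fuzzy memberships per sample
        w = u ** m
        C = (w.T @ X) / w.sum(axis=0)[:, None]   # membership-weighted centres
    return C

def rbfn_fit(X, y, centers, sigma=1.0):
    """Least-squares weights of the linear output layer over RBF activations."""
    G = np.exp(-np.linalg.norm(X[:, None] - centers[None], axis=2) ** 2
               / (2.0 * sigma ** 2))
    return np.linalg.lstsq(G, y, rcond=None)[0]
```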

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A new method for estimating bandwidth in multicast protocols and also a new technique for supporting QoS routing in ODMRP by making an acceptable estimation of available and required bandwidth are proposed.
Abstract: A mobile ad hoc network (MANET) is a multi-hop wireless network formed by a collection of mobile nodes without any fixed infrastructure. The primary concerns in MANETs are bandwidth limitation and unpredictable, dynamic topology. Reliable multicast plays a significant role in many applications of MANETs, and efficient bandwidth utilization is crucial in routing protocols. The on-demand multicast routing protocol (ODMRP) was designed for multicast routing in MANETs. In this paper, we propose a new method for estimating bandwidth in multicast protocols and a new technique for supporting QoS routing in ODMRP by making an acceptable estimation of the available and required bandwidth. We also propose a local recovery approach to design a reliable multicast algorithm. Simulation results show that using QoS routing in ODMRP improves network performance in the presence of mobility.
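
The abstract does not detail the new estimation method, but a standard idle-time estimate used in MANET QoS work makes the notion of "available bandwidth" concrete; the MAC-level busy-time measurement here is an assumed input, not the paper's technique:

```python
def available_bandwidth(raw_rate_bps, busy_time_s, window_s):
    """Estimate free bandwidth as the idle share of the channel over a
    monitoring window, as observed by carrier sensing at the MAC layer."""
    idle_fraction = max(0.0, 1.0 - busy_time_s / window_s)
    return raw_rate_bps * idle_fraction
```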

Proceedings ArticleDOI
01 Nov 2007
TL;DR: This paper augments traditional directional rumor routing by introducing a second-layer geographical routing, and a method is proposed to reduce the cost of localization equipment by using cheaper components such as AoA antennas.
Abstract: Rumor routing is a classic routing algorithm based on the movement of software agents through the network. In directional rumor routing, we try to propagate agents in straight lines instead of letting them wander randomly around the source point, which leads to decidedly better performance. In this paper we augment traditional directional rumor routing by introducing a second-layer geographical routing. Moreover, a method is proposed to reduce the cost of localization equipment by using cheaper components such as AoA antennas. Finally, the new protocol is evaluated by simulation.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A novel technique for scheduling mixed tasks on a single dynamic voltage scaling enabled processor preserves all timing constraints for hard periodic tasks under the worst-case execution time scenario, improves responsiveness, and saves as much energy as possible for a hybrid workload.
Abstract: Recently, a lot of work has been done on minimizing the energy consumption of real-time embedded systems by exploiting the hardware characteristics of the latest processors. However, these techniques achieve energy reduction at the expense of delayed responsiveness, a trait highly discouraged in real-time embedded systems. As opposed to previous works, when a tradeoff is involved we value response time above energy reduction, second only to reliability. In this paper, we present a novel technique for scheduling mixed tasks on a single dynamic voltage scaling enabled processor. The proposed algorithm preserves all timing constraints for hard periodic tasks under the worst-case execution time scenario, improves responsiveness to periodic tasks, and saves as much energy as possible for a hybrid workload.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: A new stemmer algorithm to find the roots and patterns of Arabic words based on excessive letter locations is described; it locates triliteral, quadriliteral and pentaliteral roots.
Abstract: The paper describes a new stemmer algorithm to find the roots and patterns of Arabic words based on excessive letter locations. The algorithm locates triliteral, quadriliteral and pentaliteral roots. The algorithm is written with the goal of supporting natural language processing programs such as parsers and information retrieval systems. The algorithm has been tested on thousands of Arabic words. Results reveal an accuracy of 95%.

Proceedings ArticleDOI
01 Nov 2007
TL;DR: The current state of the MediMed project is described, from the transport of ultrasound and CT images across a private fibre optics network to the connection of remote hospitals.
Abstract: The Institute of Computer Science of Masaryk University has been working in the field of medical multimedia data transport, archiving and processing for more than ten years. Since the first steps, such as the transport of ultrasound and CT images across a private fibre optics network, these activities have grown into a regional PACS archive. Today more and more hospitals are participating in this project, known under the name MediMed, and we stand before scaling it to a republic-wide application. Connecting remote hospitals introduced some uncommon technical solutions. This paper describes the current state of the MediMed project.