
Showing papers presented at "International Conference on Innovations in Information Technology in 2009"


Proceedings ArticleDOI
15 Dec 2009
TL;DR: In this paper, a framework for supporting the development of well-informed research policies and plans is presented, which is based on the use of bibliometrics; i.e., analysis is conducted using information regarding trends and patterns of publication.
Abstract: This paper presents a novel framework for supporting the development of well-informed research policies and plans. The proposed methodology is based on the use of bibliometrics; i.e., analysis is conducted using information regarding trends and patterns of publication. While using bibliometric techniques in this way is not a new idea, the proposed approach extends previous studies in a number of important ways. Firstly, instead of being purely exploratory, the focus of our research has been on developing techniques for detecting technologies that are in the early growth phase, characterized by a rapid increase in the number of relevant publications. Secondly, to increase the reliability of the forecasting effort, we propose the use of automatically generated keyword taxonomies, allowing the growth potentials of subordinate technologies to be aggregated into the overall potential of larger technology categories. A proof-of-concept implementation of each component of the framework is presented, and is used to study the domain of renewable energy technologies. Results from this analysis are presented and discussed.
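To make the growth-detection step concrete, the sketch below flags a keyword as being in an early growth phase when its yearly publication counts rise rapidly, here measured by the slope of a log-linear fit. This is only an illustration of the idea, not the paper's implementation; the keyword counts and the use of a log-linear slope are assumptions.

```python
import numpy as np

def growth_score(yearly_counts):
    """Slope of a log-linear fit to yearly publication counts (growth-rate proxy)."""
    years = np.arange(len(yearly_counts))
    counts = np.asarray(yearly_counts, dtype=float)
    slope, _ = np.polyfit(years, np.log(counts + 1.0), 1)  # +1 avoids log(0)
    return slope

# Hypothetical publication counts for two keywords over six years
print(growth_score([2, 3, 5, 9, 17, 30]))      # early growth phase -> large positive slope
print(growth_score([40, 42, 41, 43, 40, 42]))  # mature topic -> slope near zero
```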

28 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: This paper proposes a service composition algorithm based on quality of service and the gravitational search algorithm, which has many merits, for example rapid convergence speed, low memory use, and consideration of special parameters such as the distance between solutions.
Abstract: QoS-based web service composition is an NP-hard problem, so bio-inspired optimization algorithms are well suited to solving it. On the other hand, the QoS of a compound service is a key factor in satisfying users, and users prefer different QoS attributes according to their needs. We propose a service composition algorithm based on quality of service and the gravitational search algorithm, one of the recent optimization algorithms, which has many merits, for example rapid convergence speed, low memory use, and consideration of special parameters such as the distance between solutions. This paper presents a new approach to service selection for service composition based on QoS under the user's constraints. In this approach, the QoS measures are considered according to the user's constraints and priorities. The experimental results show that the method can achieve the composition effectively and has considerable potential for practical application.
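The following is a minimal sketch of how a gravitational search step could drive QoS-based selection, assuming a continuous encoding of the composition and a QoS cost function to minimize; the parameters, the toy "ideal QoS vector", and the cost function are illustrative and not taken from the paper.

```python
import numpy as np

def gsa_minimize(cost, dim, n_agents=20, iters=100, g0=1.0, alpha=20.0, seed=0):
    """Gravitational search: heavier (better) agents attract the rest."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, (n_agents, dim))   # candidate composition encodings
    v = np.zeros_like(x)
    for t in range(iters):
        f = np.array([cost(xi) for xi in x])
        best, worst = f.min(), f.max()
        m = (worst - f) / (worst - best + 1e-12)  # lower cost -> larger mass
        m = m / (m.sum() + 1e-12)
        g = g0 * np.exp(-alpha * t / iters)       # decaying gravitational constant
        acc = np.zeros_like(x)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                diff = x[j] - x[i]
                dist = np.linalg.norm(diff) + 1e-12
                acc[i] += rng.random() * g * m[j] * diff / dist
        v = rng.random(x.shape) * v + acc         # randomly damped velocity update
        x = x + v
    f = np.array([cost(xi) for xi in x])
    return x[f.argmin()], f.min()

ideal = np.array([0.9, 0.1, 0.2])                 # hypothetical ideal QoS vector
best_q, best_cost = gsa_minimize(lambda q: float(np.sum((q - ideal) ** 2)), dim=3)
print(best_q, best_cost)
```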

21 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: A transistor-level gate failure analysis starting from threshold voltage variations is presented, which will reveal huge differences between the highest and the lowest probabilities of failure, and will show how strongly these are affected by the supply voltage.
Abstract: The high-level approach for estimating circuit reliability tends to consider the probability of failure of a logic gate as a constant, and work towards the higher levels. With scaling, such gate-centric approaches become highly inaccurate, as both transistors and input vectors drastically affect the probability of failure of the logic gates. This paper will present a transistor-level gate failure analysis starting from threshold voltage variations. We will briefly review the state-of-the-art, and rely upon freshly reported results for threshold voltage variations. These will be used to estimate the probabilities of failure of a classical NAND-2 CMOS gate for (a few) different technologies, voltages, and input vectors. They will also reveal huge differences between the highest and the lowest probabilities of failure, and will show how strongly these are affected by the supply voltage.
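As a rough illustration of how threshold-voltage variation translates into a probability of failure, the sketch below treats Vth as Gaussian and calls a transistor "failed" once Vth drifts past a fixed fraction of the supply voltage; the model, the margin, and the numeric values are assumptions, not the paper's transistor-level analysis, but the strong dependence on supply voltage is visible.

```python
from math import erf, sqrt

def p_fail(vth_mean, vth_sigma, vdd, margin=0.5):
    """P(Vth > margin * Vdd) for a Gaussian threshold-voltage distribution."""
    z = (margin * vdd - vth_mean) / (vth_sigma * sqrt(2.0))
    return 0.5 * (1.0 - erf(z))

# Hypothetical numbers: mean Vth 0.30 V, sigma 50 mV, three supply voltages
for vdd in (1.0, 0.8, 0.6):
    print(f"Vdd = {vdd:.1f} V  ->  P(fail) = {p_fail(0.30, 0.05, vdd):.2e}")
```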

19 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: A new adjustable watermarking method based on singular value decomposition is presented so that SVD parameters are adjusted by using the GA considering image complexity and attack resistance.
Abstract: As information technology and multimedia products become more and more readily available, copyright and other related legal topics become more and more significant. Embedding copyright information as hidden data into the multimedia product, known as watermarking, is one of the methods to protect owner rights. Two main concepts in watermarking are imperceptibility and robustness of the watermark. A tradeoff between these two features exists, which can be formulated as an optimization problem. A Genetic Algorithm (GA) is applied to solve this optimization problem. In this paper, a new adjustable watermarking method based on singular value decomposition (SVD) is presented, in which the SVD parameters are adjusted using the GA, taking image complexity and attack resistance into account. The proposed watermarking method is also an adjustable solution: by changing the fitness function (cost function), the watermarking method can be converted into a robust, fragile, or semi-fragile type. Simulation results show that the proposed method gives better results than the case where the watermarking parameters are adjusted empirically by the user.
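A minimal sketch of the SVD embedding step appears below, assuming the common scheme of adding a scaled watermark to the cover block's singular values; the GA tuning of the scaling factor is omitted, and the 8x8 example and alpha value are illustrative rather than the paper's setup.

```python
import numpy as np

def embed_svd(cover_block, watermark_bits, alpha=0.05):
    """Add a scaled watermark to the singular values of a cover-image block."""
    u, s, vt = np.linalg.svd(cover_block, full_matrices=False)
    s_marked = s + alpha * watermark_bits      # a GA would tune alpha per block
    return u @ np.diag(s_marked) @ vt

rng = np.random.default_rng(0)
cover = rng.uniform(0, 255, (8, 8))            # toy 8x8 image block
bits = rng.integers(0, 2, 8).astype(float)     # 8-bit watermark
marked = embed_svd(cover, bits)
print(np.abs(marked - cover).max())            # small distortion -> imperceptibility
```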

13 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: An architecture for e-voting systems based on dependable web services is proposed and is modeled using stochastic Petri nets (SPNs), and the reliability and availability measures are evaluated.
Abstract: The explosion in the use of information technology and the widespread use of the Internet have led countries to utilize information and communication technologies in order to reap their inevitable benefits, such as accuracy, speed, and cost savings. Elections and voting are among the processes that are increasingly performed electronically. Web services, due to their advantages, may play a key role in the usage and deployment of e-voting systems. However, employing web services raises some major dependability and security issues. In this paper, an architecture for e-voting systems based on dependable web services is proposed. The proposed architecture is then modeled using stochastic Petri nets (SPNs), and the reliability and availability measures are evaluated.
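As a toy stand-in for the dependability evaluation, the snippet below computes the steady-state availability of a single service from assumed failure and repair rates, and the availability of a replicated set; the paper's actual SPN model is not reproduced, and the rates are hypothetical.

```python
def availability(fail_rate, repair_rate):
    """Steady-state availability of one service (two-state Markov model)."""
    return repair_rate / (fail_rate + repair_rate)

def replicated_availability(a_single, n):
    """System is up if at least one of n independent replicas is up."""
    return 1.0 - (1.0 - a_single) ** n

a = availability(fail_rate=0.01, repair_rate=1.0)   # hypothetical rates per hour
print(a, replicated_availability(a, n=3))
```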

13 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: Fleet management is an important topic in research and development nowadays: companies, security forces, and emergency services need to keep track of their trucks and cars and know where a vehicle is at any moment in time.
Abstract: Fleet management is an important topic in research and development nowadays. Companies, security forces, and emergency services need to keep track of their trucks and cars and know where a vehicle is at any moment in time, and when, where, and for how long a vehicle stopped.

12 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: In this article, the authors compare a family of algorithms for automatic generation of taxonomies by adapting the Heymann-algorithm in various ways and show that betweenness centrality calculated on unweighted similarity graphs often performs best but requires threshold fine-tuning and is computationally more expensive than closeness centrality.
Abstract: We compare a family of algorithms for the automatic generation of taxonomies by adapting the Heymann algorithm in various ways. The core algorithm determines the generality of terms and iteratively inserts them into a growing taxonomy. Variants of the algorithm are created by altering how, and how often, the generality of terms is calculated. We analyse the performance and the complexity of the variants, combined with a systematic threshold evaluation, on a set of seven manually created benchmark sets. As a result, betweenness centrality calculated on unweighted similarity graphs often performs best but requires threshold fine-tuning and is computationally more expensive than closeness centrality. Finally, we show how an entropy-based filter can lead to more precise taxonomies.
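The sketch below illustrates the Heymann-style insertion loop on a made-up term-similarity graph, using closeness centrality as the generality measure (via networkx); the similarity values, the tiny vocabulary, and the attachment rule are assumptions for illustration and do not reproduce the variants compared in the paper.

```python
import networkx as nx

# Hypothetical term-similarity graph (edges = similarity above some threshold)
sim = nx.Graph()
sim.add_weighted_edges_from([
    ("energy", "solar energy", 0.7), ("energy", "wind energy", 0.6),
    ("solar energy", "photovoltaics", 0.8), ("wind energy", "wind turbine", 0.7),
])

# Generality of a term = its centrality in the (unweighted) similarity graph
generality = nx.closeness_centrality(sim)
order = sorted(sim.nodes, key=generality.get, reverse=True)

taxonomy = nx.DiGraph()
taxonomy.add_node(order[0])                    # most general term becomes the root
for term in order[1:]:
    # attach under the most similar term already placed in the taxonomy
    parent = max(taxonomy.nodes,
                 key=lambda p: sim[p][term]["weight"] if sim.has_edge(p, term) else 0.0)
    taxonomy.add_edge(parent, term)

print(list(taxonomy.edges))
```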

11 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: The authors' implementation of FPMAX, an extension of FP-Growth method for mining maximal frequent itemsets only, showed good performances compared with the others, and the comparison of response times published in FIMI 2004, for the chosen implementations, could not be replicated.
Abstract: Mining maximal frequent itemsets is an important issue in many data mining applications. In our thesis work on the selection and tuning of indices in data warehouses, we have proposed a strategy based on mining maximal frequent itemsets in order to determine a set of candidate indices from a given workload. As a first step, we had to select a maximal frequent itemset mining algorithm to implement. Experimental results in the repository of the workshops on Frequent Itemset Mining Implementations (http://fimi.cs.helsinki.fi/) show that FPMAX has the best performance, so we selected it for our own implementation in Java. FPMAX is an extension of the FP-Growth method for mining maximal frequent itemsets only. We tested our implementation on two benchmark databases, MUSHROOM and RETAIL, and compared our results with the best implementations available in the repository mentioned earlier. Our implementation showed good performance compared with the others. However, the comparison of response times published at FIMI 2004, for the chosen implementations, could not be replicated.
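For readers unfamiliar with the task, the following naive sketch shows what "maximal frequent itemsets" means on a toy transaction set; it enumerates candidates by brute force and bears no resemblance to FPMAX's FP-tree machinery.

```python
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}]
min_support = 3
items = sorted(set().union(*transactions))

frequent = []
for k in range(1, len(items) + 1):
    for cand in combinations(items, k):
        support = sum(1 for t in transactions if set(cand) <= t)
        if support >= min_support:
            frequent.append(set(cand))

# maximal = frequent itemsets not contained in any larger frequent itemset
maximal = [s for s in frequent if not any(s < t for t in frequent)]
print(maximal)   # [{'a', 'b'}, {'a', 'c'}, {'b', 'c'}] for this toy data
```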

11 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: A variable step-size used in the algorithm makes NKLMS more efficient in predicting time series with inconsistent amplitude, and the proposed algorithm is also applied to channel modeling.
Abstract: In this paper the Normalized Kernel Least Mean Square (NKLMS) algorithm is presented, which has applications in system modeling and pattern recognition. In 2007 a similar algorithm was proposed, named Kernel Least Mean Square (KLMS), and a modified version of KLMS was introduced in 2008. Although KLMS gives good results in the prediction of some time series, high sensitivity to the step-size and to signal amplitude stability remain problems. In this paper NKLMS and its ability to predict and identify time series is presented and compared to the KLMS method. A variable step-size used in the algorithm makes NKLMS more efficient in predicting time series with inconsistent amplitude. Thus, convergence speed and system tracking are improved. Furthermore, the proposed algorithm is applied to channel modeling.
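A plausible reading of the normalized update is sketched below: a kernel LMS predictor whose step is divided by the kernel self-similarity of the current input, which grows with signal amplitude (a polynomial kernel is used so that this normalization actually varies). The kernel choice, parameters, and test signal are assumptions, not the authors' exact algorithm.

```python
import numpy as np

def poly_kernel(a, b, degree=2):
    return (1.0 + np.dot(a, b)) ** degree

def nklms_predict(series, order=4, eta=0.5, eps=1e-6):
    centers, coeffs, preds = [], [], []
    for n in range(order, len(series)):
        x, d = series[n - order:n], series[n]
        y = sum(c * poly_kernel(x, ci) for c, ci in zip(coeffs, centers))
        e = d - y
        # normalized step: divide by the kernel self-similarity of the input,
        # which grows with the signal amplitude
        coeffs.append(eta * e / (eps + poly_kernel(x, x)))
        centers.append(x.copy())
        preds.append(y)
    return np.array(preds)

t = np.arange(400)
series = np.sin(0.2 * t) * (1.0 + 2.0 * (t > 200))   # amplitude jumps mid-way
preds = nklms_predict(series)
print("prediction MSE:", float(np.mean((preds - series[4:]) ** 2)))
```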

10 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: A pilgrim tracking system with the help of Radio Frequency identification (RFID) that can be deployed to improve the present situation and a new secure protocol which meets the requirements for this application is proposed.
Abstract: Every year millions of Muslims from all around the world gather to perform their pilgrimage in Makkah, Saudi Arabia. Due to the massive number of pilgrims of different languages, cultures and countries, it is highly challenging for the authorities to manage and provide proper services to these pilgrims. In this paper, we propose a pilgrim tracking system based on Radio Frequency Identification (RFID) that can be deployed to improve the present situation. We also investigate possible security and privacy issues and propose a new secure protocol which meets the requirements of this application. Our proposed authentication protocol provides privacy and security, and it is efficient particularly for this application.

10 citations


Proceedings ArticleDOI
15 Dec 2009
TL;DR: This paper tries to elaborate how honeypot systems can be exploited to reduce the chance of Bluetooth enabled attacks' success by limiting the client device discoverability for attackers.
Abstract: Wireless technologies provide a new channel for the implementation of mobile payment systems. In this regard, the potential of short-range wireless technologies such as Bluetooth is enormous. These systems can be used for proximity payments to vending machines or for offering banking services in the bank area. However, unsolved security issues are the biggest barriers to the growth of mobile payment. This paper focuses on the security of banking services that can be offered through Bluetooth technology. We propose a solution using honeypots in the bank environment to mitigate the risk of Bluetooth-enabled payment transactions. In this paper, we elaborate how honeypot systems can be exploited to reduce the chance of success of Bluetooth-enabled attacks by limiting client device discoverability to attackers.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: A novel approach is described that recognizes Arabic script documents by preprocessing which involves binarization, noise reduction, and thinning and is proven to work satisfactorily for scanned printed Arabic text.
Abstract: Automatic recognition of printed and handwritten documents remains an active area of research. Arabic is one of the languages that present special problems. Arabic is cursive and therefore necessitates a segmentation process to determine the boundaries of a character. Arabic characters consist of multiple disconnected parts. Dots and diacritics are used in many Arabic characters and can appear above or below the main body of the character. In Arabic, the same letter has up to four different forms depending on where it appears in the word and on the letters that are adjacent to it. In this paper, a novel approach is described that recognizes Arabic script documents. The method starts with preprocessing, which involves binarization, noise reduction, and thinning. The text is then segmented into separate lines. Characters are then segmented by determining bifurcation points that are near the baseline. Segmented characters are then compared to prestored templates to identify the best match. The template comparisons are based on central moments, Hu moments, and invariant moments. The method is shown to work satisfactorily for scanned printed Arabic text. The paper concludes with a discussion of the drawbacks of the method and a description of possible solutions.
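As an illustration of the moment-based template features, the sketch below computes the first three Hu invariant moments of a binary glyph with plain numpy; the binarization, line and character segmentation, and the template set are omitted, and the example glyph is synthetic.

```python
import numpy as np

def hu_features(glyph):
    """First three Hu invariant moments of a binary glyph image."""
    ys, xs = np.nonzero(glyph)
    m00 = float(len(xs))
    xbar, ybar = xs.mean(), ys.mean()
    mu = lambda p, q: np.sum((xs - xbar) ** p * (ys - ybar) ** q)   # central moments
    eta = lambda p, q: mu(p, q) / m00 ** (1 + (p + q) / 2)          # normalized
    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    return np.array([
        n20 + n02,
        (n20 - n02) ** 2 + 4 * n11 ** 2,
        (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2,
    ])

# Matching would pick the nearest prestored template in this feature space
glyph = np.zeros((16, 16)); glyph[4:12, 6:10] = 1     # synthetic character body
print(hu_features(glyph))
```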

Proceedings ArticleDOI
15 Dec 2009
TL;DR: For the segmentation of vessels, the HBVF method is the first step of the algorithm; to reduce false positives, fine particles are removed from the result according to their size, and experiments demonstrate the efficiency of the proposed algorithm.
Abstract: Many vessel segmentation methods employ a Hessian-based vessel enhancement filter (HBVF) as an efficient step. In this paper, the HBVF method is the first step of the proposed vessel segmentation algorithm. Afterward, to remove non-vessel structures from the image, a high threshold is applied to the filtered image. Since thresholding removes some weak vessels, they are recovered using the Hough transform and morphological operations. The resulting image is then combined with a version of the vesselness-filtered image that has been converted to a binary image using a low threshold. As a consequence of this image combination, most of the vessels are detected. In the final step, to reduce false positives, fine particles are removed from the result according to their size. Experiments show promising results that demonstrate the efficiency of the proposed algorithm.
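A compact 2-D Hessian-based vesselness filter in the spirit of the first step is sketched below (scipy only); the sigma and beta parameters are illustrative, the bright/dark sign handling is omitted, and the Hough-transform and morphological recovery steps of the paper are not reproduced.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def vesselness(img, sigma=2.0, beta=0.5):
    """2-D Frangi-style vesselness from Hessian eigenvalues (sign handling omitted)."""
    hxx = gaussian_filter(img, sigma, order=(0, 2))
    hyy = gaussian_filter(img, sigma, order=(2, 0))
    hxy = gaussian_filter(img, sigma, order=(1, 1))
    tmp = np.sqrt((hxx - hyy) ** 2 + 4 * hxy ** 2)
    l1 = 0.5 * (hxx + hyy + tmp)
    l2 = 0.5 * (hxx + hyy - tmp)
    lam1 = np.minimum(np.abs(l1), np.abs(l2))          # smaller |eigenvalue|
    lam2 = np.maximum(np.abs(l1), np.abs(l2))          # larger |eigenvalue|
    rb = lam1 / (lam2 + 1e-12)                         # "blobness"
    s = np.sqrt(lam1 ** 2 + lam2 ** 2)                 # "structureness"
    c = 0.5 * s.max() + 1e-12                          # common practical choice
    return np.exp(-rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-s ** 2 / (2 * c ** 2)))

img = np.zeros((64, 64)); img[30:34, :] = 1.0          # synthetic bright vessel
v = vesselness(img)
print("pixels kept by a high threshold:", int((v > 0.5 * v.max()).sum()))
```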

Proceedings ArticleDOI
15 Dec 2009
TL;DR: This study investigated the availability of features in three ITSATs, namely CTAT, ASPIRE and Assistment Builder, and proposed ITSAT features which could meet teachers' requirements in using ITSATs to design ITSs.
Abstract: Intelligent tutoring systems (ITSs) have been proven effective in supporting students' learning activities, but their actual utilization has not been confirmed. Delegation of development tasks from developers to teachers through the use of ITS authoring tools (ITSATs) is not promoting rapid progress in this area of research: having teachers design ITSs using ITSATs seems difficult to realize. This may be due to ITSATs lacking features that meet teachers' requirements. This paper presents a feature analysis of ITSATs. In this study, we examined two categories of features: the authoring environment and lesson content creation. The study focuses on three ITSATs, namely CTAT, ASPIRE and Assistment Builder. We investigated the availability of such features in those ITSATs and propose ITSAT features which could meet teachers' requirements when using ITSATs to design ITSs.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: A computational framework for steganography in WSN using the concept of redundancy and distributed computing to add robustness to the steganographic embedding and extraction algorithms is presented.
Abstract: This article provides a brief review of steganography, steganalysis and wireless sensor networks (WSNs). It also presents a computational framework for steganography in WSN. The technique uses the concept of redundancy and distributed computing to add robustness to the steganographic embedding and extraction algorithms.
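Purely to illustrate how redundancy can harden embedding and extraction, the toy sketch below embeds the same message into several nodes' data via LSB substitution and recovers it by majority vote; this is not the framework's algorithm, and the data, message, and node count are made up.

```python
import numpy as np

def embed_lsb(cover, bits):
    """Hide bits in the least-significant bits of the cover samples."""
    stego = cover.copy()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits
    return stego

def extract_lsb(stego, n_bits):
    return stego[:n_bits] & 1

rng = np.random.default_rng(1)
message = rng.integers(0, 2, 16, dtype=np.uint8)
# redundancy: each of three nodes carries its own stego copy of the same message
nodes = [embed_lsb(rng.integers(0, 256, 64, dtype=np.uint8), message) for _ in range(3)]
nodes[2][:16] ^= 1                                      # simulate a corrupted node
votes = np.stack([extract_lsb(n, 16) for n in nodes])
recovered = (votes.sum(axis=0) >= 2).astype(np.uint8)   # majority vote across nodes
print(bool(np.array_equal(recovered, message)))         # True despite the corruption
```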

Proceedings ArticleDOI
15 Dec 2009
TL;DR: The objective of this work is to develop a service-oriented context-aware middleware for real-time data management using mobile devices.
Abstract: The development of middleware has emerged as an area of expanding research, focused on the integration of services available to distributed applications. In this context, many challenges have also arisen with the use of middleware, such as communication, flexibility, and performance, as well as integration with the Web and the computer itself. The development of middleware for mobile computing poses new challenges to developers because of the limitations of mobile devices. Thus, these developers must understand the new mobile computing and middleware technologies in order to integrate them. The objective of this work is to develop a service-oriented context-aware middleware for real-time data management using mobile devices.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: Simulation results indicated that the functional link network showed higher signal to noise ratio in comparison to the pi-sigma neural networks.
Abstract: In this paper, we present the use of higher order neural networks for the prediction of speech signals. Various neural network structures have been used in our experiments; these include the functional link neural network and the pi-sigma neural network. Extensive experimentation is carried out to evaluate the performance of the higher order networks on the speech prediction platform. Simulation results indicate that the functional link network showed a higher signal-to-noise ratio in comparison to the pi-sigma neural network.
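A simplified stand-in for the functional link network is sketched below: the input taps are expanded with trigonometric terms and a single linear layer is trained by LMS, with prediction SNR reported; the expansion, step size, and toy "speech-like" signal are assumptions rather than the paper's experimental setup.

```python
import numpy as np

def expand(taps):
    """Functional-link expansion of the input taps: [x, sin(pi x), cos(pi x)]."""
    return np.concatenate([taps, np.sin(np.pi * taps), np.cos(np.pi * taps)])

def flann_predict(signal, order=4, eta=0.01):
    w = np.zeros(3 * order)
    preds = []
    for n in range(order, len(signal)):
        phi = expand(signal[n - order:n])
        y = w @ phi
        w += eta * (signal[n] - y) * phi       # LMS update of the single linear layer
        preds.append(y)
    return np.array(preds)

t = np.arange(2000)
speechlike = np.sin(0.07 * t) + 0.3 * np.sin(0.23 * t)   # toy quasi-periodic signal
p = flann_predict(speechlike)
err = speechlike[4:] - p
snr_db = 10 * np.log10(np.sum(speechlike[4:] ** 2) / np.sum(err ** 2))
print(f"prediction SNR: {snr_db:.1f} dB")
```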

Proceedings ArticleDOI
15 Dec 2009
TL;DR: A design of a smart card based prepaid gas metering system has been presented that will make the consumers more conscious to utilize gas carefully and also it promises easy, fast and accurate billing scheme.
Abstract: Gas is an important source of energy in this world. At this age of energy crisis, gas must be utilized wisely and carefully. In this paper, a design of a smart card based prepaid gas metering system has been presented. Prepaid gas meter will make the consumers more conscious to utilize gas carefully and also it promises easy, fast and accurate billing scheme. The entire system is designed with the state-of-the-art digital and information technology. A prototype of the system has been developed and tested.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: A robotics project was described that took place in the lab, which aimed towards providing a strong hands-on background in the basics of robotics and the fundamentals of the research process to a group of four final-year students.
Abstract: Constructionism, a term first coined by Seymour Papert, is a learning theory based on constructivism, which however also holds that learning can happen most effectively when people are also active in making objects in the real world. In this paper, we will present a case study of an application of constructionism at the undergraduate senior project level. More specifically, we will describe a robotics project that took place in our lab, which aimed towards providing a strong hands-on background in the basics of robotics and the fundamentals of the research process to a group of four final-year students. During this project, the students experienced basics of team working, flexible project management, and intra- as well as extra-group constructionist tuition, as well as aspects of real-world research. Furthermore, they were able to gain experience in three programming languages, build and successfully demonstrate basic behaviors and collaborative mapping using the Mindstorms robots, and create a theoretical framework incorporating and providing novel extensions to their methods.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: The main contribution of this paper is the use of design diversity techniques and WS-BPEL, which would result in proposing a useful and flexible architecture for dependable web services.
Abstract: As the use of web services is growing, there is an increasing demand for dependability. In this paper we intend to introduce various offered solutions for dependable web services. The existing solutions are divided into two categories: fault tolerance techniques, such as active and passive replications, and the use of design diversity. The main contribution of this paper is the use of design diversity techniques and WS-BPEL, which would result in proposing a useful and flexible architecture for dependable web services. The proposed architecture has been used as the basis of a design pattern for dependable web services, called D3WS. We have implemented the proposed architecture in a prototype application using N-version programming (NVP). We have also modeled the architecture using stochastic reward nets (SRNs) for further evaluation of dependability measures.
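The voting idea behind N-version programming can be illustrated with the minimal sketch below, where three toy "versions" are invoked and the majority result is returned; the paper's WS-BPEL orchestration, D3WS pattern, and SRN models are not reproduced here.

```python
from collections import Counter

def version_a(x): return x * x
def version_b(x): return x ** 2
def version_c(x): return x ** 2 + 1          # a faulty version

def nvp_invoke(versions, x):
    """Invoke all versions and return the majority result."""
    results = [v(x) for v in versions]
    value, votes = Counter(results).most_common(1)[0]
    if votes <= len(versions) // 2:
        raise RuntimeError("no majority among versions")
    return value

print(nvp_invoke([version_a, version_b, version_c], 7))   # 49, outvoting the fault
```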

Proceedings ArticleDOI
15 Dec 2009
TL;DR: The method of virtualization of ubiquitous databases can describe ubiquitous database schema in a unified fashion using the XML schema and consists of a high-level concept of distributed database management of the same type and of different types, and also of a location transparency feature.
Abstract: In this paper, our research objective is to develop a database virtualization technique so that data analysts and other users who apply data mining methods in their jobs can use all ubiquitous databases on the Internet as if they were a single database, thereby helping to reduce workloads such as data collection from the databases and data cleansing. In this study, we first examine the advantages of the XML schema and propose a database virtualization method by which ubiquitous databases such as relational databases, object-oriented databases, and XML databases can be used as if they all behaved as a single database. Next, we show that the proposed virtualization method can describe ubiquitous database schemas in a unified fashion using the XML schema. Moreover, it comprises a high-level concept of distributed database management across databases of the same and of different types, as well as a location transparency feature.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: RSMA combines the authors' enhanced preprocessing phase from the Berry Ravindran algorithm with the proposed new searching phase procedure, and offers a smaller number of comparisons and improved elapsed searching time when compared to other well-known algorithms.
Abstract: Huge amounts of biological data are stored in linear files. Biological proteins are sequences of amino acids, and the quantities of data in these fields tend to increase year on year. String matching algorithms play a key role in many computer science problems and in the implementation of computer software. For this reason, efficient string-matching algorithms should be used which require minimal computer storage and which minimize the search response time. In this study, we propose a new algorithm called the Random String Matching Algorithm (RSMA). RSMA combines our enhanced preprocessing phase from the Berry-Ravindran algorithm with our proposed new searching phase procedure. This varied search order allows the proposed algorithm to reduce the number of character comparisons and improves the search response time. Experimental results show that the RSMA algorithm requires fewer comparisons and offers improved elapsed search time when compared to other well-known algorithms.
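As background for the preprocessing phase, the sketch below implements a Berry-Ravindran-style two-character bad-character shift and uses it in a simple search loop; RSMA's randomized search order is not reproduced, and the pattern and text are toy values.

```python
def br_shift_table(pattern):
    """Shift for each adjacent character pair, Berry-Ravindran style."""
    m = len(pattern)
    table = {}
    for i in range(m - 1):                      # rightmost pair wins (smallest shift)
        table[(pattern[i], pattern[i + 1])] = m - i
    return table

def br_search(text, pattern):
    m, n = len(pattern), len(text)
    table = br_shift_table(pattern)
    hits, j = [], 0
    while j + m <= n:
        if text[j:j + m] == pattern:
            hits.append(j)
        a = text[j + m] if j + m < n else ""    # two characters just after the window
        b = text[j + m + 1] if j + m + 1 < n else ""
        j += min(table.get((a, b), m + 2),      # align the pair inside the pattern
                 1 if pattern[m - 1] == a else m + 2,
                 m + 1 if pattern[0] == b else m + 2)
    return hits

print(br_search("GCATCGCAGAGAGTATACAGTACG", "GCAGAGAG"))   # [5]
```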

Proceedings ArticleDOI
15 Dec 2009
TL;DR: This paper draws the focus on a major drawback of the PSO algorithm: the poor gbest update, and purposefully used simultaneous perturbation SA (SPSA) for its low cost and since SPSA is applied only to the gbest, both approaches have thus a negligible overhead cost over the entire PSO process.
Abstract: Particle Swarm Optimization (PSO) is attracting ever-growing attention and, more than ever, has found many application areas for many challenging optimization problems. In this paper, we focus on a major drawback of the PSO algorithm: the poor gbest update. This can be a severe problem which causes premature convergence to local optima, since gbest, as the common term in the update equation of all particles, is the primary guide of the swarm. Therefore, we basically seek a solution for the social problem in PSO, i.e. “Who will guide the guide?”, which resembles the rhetorical question posed by Plato in his famous work on government: “Who will guard the guards?” (Quis custodiet ipsos custodes?). Stochastic approximation (SA) is purposefully adapted into two approaches to guide (or drive) the gbest particle (with simultaneous perturbation) towards the right direction with the gradient estimate of the underlying surface (or function), whilst avoiding local traps due to its stochastic nature. We purposefully used simultaneous perturbation SA (SPSA) for its low cost, and since SPSA is applied only to the gbest (not the entire swarm), both approaches have a negligible overhead cost over the entire PSO process. Yet we have shown over a wide range of non-linear functions that both approaches significantly improve the performance of PSO, especially if the parameters of SPSA suit the problem at hand.
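The "guide the guide" idea can be sketched as an SPSA nudge applied to the swarm's current gbest, as below; the full PSO loop and the paper's two SA variants are not reproduced, the gains a and c are toy values, and the greedy acceptance rule is an assumption added to keep gbest monotone.

```python
import numpy as np

rng = np.random.default_rng(0)

def spsa_step(f, x, a=0.002, c=0.01):
    """One simultaneous-perturbation gradient step applied to the gbest position."""
    delta = rng.choice([-1.0, 1.0], size=x.shape)      # Bernoulli +/-1 perturbation
    ghat = (f(x + c * delta) - f(x - c * delta)) / (2 * c) * (1.0 / delta)
    return x - a * ghat                                # nudge gbest downhill

rosenbrock = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
gbest = np.array([0.5, 0.5])                           # pretend this is PSO's current gbest
for _ in range(500):
    candidate = spsa_step(rosenbrock, gbest)
    if rosenbrock(candidate) < rosenbrock(gbest):      # keep gbest monotonically improving
        gbest = candidate
print(gbest, rosenbrock(gbest))
```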

Proceedings ArticleDOI
15 Dec 2009
TL;DR: A system for collecting and connecting people's stories, using mobile phones for video capture, and applying a question-and-answer game format so that participants can see an entire network of stories and add their own to it.
Abstract: We propose a system for collecting and connecting people's stories. In order to help encourage participants to engage their “expression mode,” we employed mobile phones for video capture, as opposed to traditional video cameras. As part of our work, we held a workshop in an art festival. In the workshop, facilitators capture participants' stories using mobile phones. Applying a question-and-answer game format, each video has connections to other videos. Our system shows these connections on a large screen such that participants can see an entire network of stories and add their own to it. We show the results of our experiments and conclude that our proposed workshop could successfully collect people's stories and connect them.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: This paper is an attempt to propose a new approach for modeling and automatic verification of security protocols using CSANs and PDETool, where the existing agents in the protocol are expressed formally as roles using the security protocols language (SPL) and then are modeled byCSANs.
Abstract: Coloured stochastic activity networks (CSANs) are a useful formalism for modeling and analysis of computer systems and networks. PDETool is a new powerful modeling tool that supports CSANs. This paper is an attempt to propose a new approach for modeling and automatic verification of security protocols using CSANs and PDETool. In the proposed approach, the existing agents in the protocol are expressed formally as roles using the security protocols language (SPL) and then are modeled by CSANs. The approach has three steps. Firstly, the security protocol will be modeled regardless of the existence of any intruder. Secondly, different potential intruders will be modeled. Finally, by state space analysis of the model, the possibility of any security flaw in the protocol will be checked. As a case study, the Needham-Schroeder and TMN protocols have been modeled and verified.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: In this article, the authors empirically validate that PageRank scores of publications, as they change over time, follow the logistic growth model that often arises in the context of population growth, and model one aspect of researchers' citation behavior in technology-driven fields.
Abstract: Within publication digital collections, citation analysis and publication score assignment are commonly used (i) to evaluate the impact of publications (and scientific collections, e.g., journals and conferences), and (ii) to order digital collection search outputs, e.g., Google Scholar. The popular citation-based web page (and, thus, publication) score measure PageRank is criticized for (a) computing only the current (and, thus, time-independent) publication scores, and (b) not taking into account the fact that citation graphs continuously evolve. Thus, the use of PageRank as is results in penalizing recent publications that have not yet developed enough popularity to receive citations. In order to overcome this inherent bias of PageRank and other citation-based popularity measures, Cho et al. defined Page Quality for a webpage as its popularity after large numbers of web users become aware of it. Page Quality is based on the assumption that popularity evolves over time. In this paper, we (i) experimentally validate that PageRank scores of publications, as they change over time, follow the logistic growth model that often arises in the context of population growth, (ii) model one aspect of researchers' citation behavior in technology-driven fields (such as computer science) where authors tend not to cite old publications, (iii) argue and empirically verify that publication popularity, unlike web page popularity, has two distinct phases, namely, the popularity growth phase and the popularity decay phase, and (iv) extend the popularity growth model developed by Cho et al. to capture the popularity decay phase. All of our claims are empirically verified using the ACM SIGMOD Anthology digital collection.
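To illustrate claim (i), the snippet below fits the standard three-parameter logistic growth model to a publication-score time series with scipy's curve_fit; the score values are synthetic, not the SIGMOD Anthology data used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """K: saturation score, r: growth rate, t0: midpoint of the growth phase."""
    return K / (1 + np.exp(-r * (t - t0)))

years = np.arange(10, dtype=float)
scores = logistic(years, K=0.8, r=1.2, t0=4.0) \
         + 0.01 * np.random.default_rng(0).normal(size=years.size)   # synthetic scores
params, _ = curve_fit(logistic, years, scores, p0=(1.0, 1.0, 5.0))
print(dict(zip(("K", "r", "t0"), np.round(params, 2))))
```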

Proceedings ArticleDOI
15 Dec 2009
TL;DR: The idea of a wearable tool that uses automatic facial expression recognition developed by the authors to assist autistic individuals in dealing with social situations; results show that this new method is superior to analyzing only static images or only dynamic videos, which are the two methods commonly used.
Abstract: People with Asperger's Syndrome have difficulty recognizing other people's emotions and are therefore unable to react to them. Although there have been many attempts at developing software for teaching autistic children how to deal with social situations, the idea of equipping autistic persons with tools to help them recognize emotions has not been explored. This paper presents the idea of a wearable tool that uses automatic facial expression recognition developed by the authors to assist autistic individuals in dealing with social situations. In this paper, we describe a method we have developed for facial expression recognition that operates in real time. Experimental results show that this new method is superior to analyzing only static images or only dynamic videos, which are the two methods commonly used. The assistive tool, as well as the facial expression recognition system that has been developed for this purpose, are presented here.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: This paper proposes an enhancement to Fast Mobile IPv6 Handover (FMIPv6), based on link layer information, and presents performance evaluations in terms of the packet loss and Handover latency using evaluation models.
Abstract: Handover latency is the primary cause of packet loss, resulting in performance degradation of standard Mobile IPv6. Mobile IPv6 with fast handover enables a Mobile Node (MN) to quickly detect at the IP layer that it has moved to a new subnet by receiving link-related information from the link layer; furthermore, it gathers anticipative information about the new Access Point (AP) and the associated subnet prefix while the MN is still connected to the previous Corresponding Node (CN). This paper proposes an enhancement to Fast Mobile IPv6 Handover (FMIPv6) based on link-layer information. We also present performance evaluations in terms of packet loss and handover latency using evaluation models.

Proceedings ArticleDOI
15 Dec 2009
TL;DR: A weighting of shingles is proposed, and shingling is adapted to operate on weighted shingles to improve the performance of the algorithm.
Abstract: Broder's shingling is one of the state-of-the-art approaches to detecting near-duplicate documents. Prior evaluations of this method have shown that document pairs which have different main content but share a large amount of similar unimportant detail are the main sources of its errors. Different web pages from the same site are a good example of such documents: in such pages, there is almost always similar boilerplate text which has a chance of being selected as the document's fingerprint and tricking the algorithm. This problem seems to be due to representing each document only by a sample of its shingles. This sample only contains some of the page's shingles and discards any other information. By including additional information, such as the frequencies of shingles, in this sample, we can improve the performance of the algorithm. This paper proposes a weighting of shingles and adapts shingling to operate on weighted shingles. Our results show an improvement in shingling's performance.
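A minimal sketch of the weighting idea follows: shingles are counted rather than merely sampled, and resemblance is computed as a weighted (multiset) Jaccard measure, so repeated boilerplate no longer dominates; the exact weighting and the min-wise sampling step of the paper are not reproduced, and the example documents are made up.

```python
from collections import Counter

def shingles(text, k=3):
    """Word k-shingles with their frequencies."""
    words = text.lower().split()
    return Counter(tuple(words[i:i + k]) for i in range(len(words) - k + 1))

def weighted_resemblance(a, b):
    """Multiset (weighted) Jaccard resemblance between two shingle profiles."""
    inter = sum(min(a[s], b[s]) for s in a.keys() & b.keys())
    union = sum(max(a[s], b[s]) for s in a.keys() | b.keys())
    return inter / union

boilerplate = "the footer text repeats on every page of this site " * 3
d1 = shingles(boilerplate + "article about cats")
d2 = shingles(boilerplate + "article about dogs and birds")
print(round(weighted_resemblance(d1, d2), 2))
```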

Proceedings ArticleDOI
15 Dec 2009
TL;DR: This paper introduces a custom fault injection framework that helps to locate the most vulnerable nodes and components of embedded processors and could be used for an effective non-uniform fault-tolerant redundancy technique.
Abstract: Advances in silicon technology and the shrinking of feature sizes to the nanometer scale make the unreliability of nano-scale devices the most important concern of fault-tolerant design. The design of reliable and fault-tolerant embedded processors is mostly based on developing techniques that compensate for faults by adding hardware or software redundancy. The recently proposed redundancy techniques are generally applied uniformly to a system and lead to inefficiencies in terms of performance, power, and area. Non-uniform redundancy requires a quantitative analysis of the system behavior under transient faults. In this paper, we introduce a custom fault injection framework that helps to locate the most vulnerable nodes and components of embedded processors. Our framework is based on exhaustive transient fault injection into candidate nodes which are selected from a user-defined list. Furthermore, the list of nodes containing the microarchitectural state is also defined by the user to validate the execution of instructions. Based on the reported results, the most vulnerable nodes, components, and instructions are identified and could be used for an effective non-uniform fault-tolerant redundancy technique.
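The flavor of per-node transient fault injection can be shown with the toy loop below, which flips each bit of an internal operand and checks whether a "processor" decision deviates from the golden run; the one-line branch unit is a stand-in, not an RTL or microarchitectural model, and it merely illustrates that some node flips are logically masked while others are not, which is what motivates non-uniform redundancy.

```python
def branch_unit(operand, threshold=0x40):
    """Stand-in for a control decision inside the processor."""
    return operand > threshold

golden = branch_unit(0x3C)                     # fault-free reference run
vulnerable = [bit for bit in range(8)
              if branch_unit(0x3C ^ (1 << bit)) != golden]   # flip one node bit at a time
print("bits whose transient flip corrupts the decision:", vulnerable)   # [6, 7]
```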