
Showing papers presented at the "International Conference on Innovations in Information Technology" in 2013


Proceedings ArticleDOI
17 Mar 2013
TL;DR: The paper presents a case study whose goal is to investigate the possibility of determining the semantic orientation of Arabic Egyptian tweets and comments given limited Arabic resources; one outcome of the study is an Egyptian dialect sentiment lexicon.
Abstract: With the rapid increase in the volume of Arabic opinionated posts on different microblogging media comes an increasing demand for Arabic sentiment analysis tools. Yet research in the area of Arabic sentiment analysis is progressing at a very slow pace compared to that being carried out in English and other languages. This paper highlights the major problems and open research issues facing sentiment analysis of Arabic social media. The paper also presents a case study whose goal is to investigate the possibility of determining the semantic orientation of Arabic Egyptian tweets and comments given limited Arabic resources. One of the outcomes of the presented study is an Egyptian dialect sentiment lexicon.
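As a rough illustration of the lexicon-based approach the case study describes, the sketch below scores a tweet by summing word polarities from a sentiment lexicon. The lexicon entries and scores are hypothetical placeholders, not the Egyptian dialect lexicon produced by the study.

```python
# Minimal lexicon-based polarity scoring for dialectal Arabic text.
# The entries below are hypothetical placeholders, not the paper's lexicon.

# word -> polarity score (positive > 0, negative < 0)
lexicon = {
    "جميل": 1.0,   # "beautiful"
    "ممتاز": 1.0,  # "excellent"
    "وحش": -1.0,   # "bad" (Egyptian dialect)
    "زفت": -1.0,   # "awful" (colloquial)
}

def semantic_orientation(text: str) -> str:
    """Classify a post as positive/negative/neutral by summing lexicon scores."""
    score = sum(lexicon.get(token, 0.0) for token in text.split())
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(semantic_orientation("الفيلم جميل وممتاز"))  # -> positive
```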

134 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: A preprocessing phase for sentiment analysis is proposed and shown to noticeably improve the results of sentiment extraction from Arabic social media data.
Abstract: The problem of extracting sentiment from text is a very complex task, in particular due to the significant amount of Natural Language Processing (NLP) required. This task becomes even more difficult when dealing with morphologically rich languages such as Modern Standard Arabic (MSA) and when processing brief, noisy texts such as “tweets” or “Facebook statuses”. This paper highlights key issues researchers are facing and innovative approaches that have been developed when performing subjectivity and sentiment analysis (SSA) on Arabic text in general and Arabic social media text in particular. A preprocessing phase for sentiment analysis is proposed and shown to noticeably improve the results of sentiment extraction from Arabic social media data.
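The paper's exact preprocessing phase is not reproduced here, but the sketch below shows typical Arabic normalization steps (stripping diacritics and tatweel, unifying letter variants, squeezing repeated letters) that such a phase commonly includes.

```python
import re

# Generic Arabic text normalization often used before sentiment analysis.
# This is a common-practice sketch, not the paper's exact pipeline.

DIACRITICS = re.compile(r"[\u064B-\u0652]")  # fathatan .. sukun
TATWEEL = "\u0640"

def normalize_arabic(text: str) -> str:
    text = DIACRITICS.sub("", text)             # strip short-vowel diacritics
    text = text.replace(TATWEEL, "")            # remove elongation character
    text = re.sub("[إأآا]", "ا", text)          # unify alef variants
    text = text.replace("ى", "ي")               # alef maqsura -> yaa
    text = text.replace("ة", "ه")               # taa marbuta -> haa
    text = re.sub(r"(.)\1{2,}", r"\1\1", text)  # squeeze repeats ("cooool")
    return text

print(normalize_arabic("جمييييل جدًا"))
```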

58 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper addresses the requirements and design aspects of a reference Machine-to-Machine (M2M) communication platform as an enabler for Smart Cities.
Abstract: Over the last decades, we have witnessed new technological evolutions in the Internet, wireless networking, and sensor fields. Currently, we are able to build smart systems that improve quality of life and enhance environment management. However, most available smart systems are implemented on proprietary hardware/software solutions, restricting the interoperation required for large-scale Smart City solutions. To enable the implementation of a general Smart City solution, a platform is needed that fulfills the communication requirements between heterogeneous access technologies. In this paper, we address the requirements and design aspects of a reference Machine-to-Machine (M2M) communication platform as an enabler for Smart Cities.

53 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: A novel approach is presented that combines ESN with support vector machines (SVMs) for time series classification by replacing the linear read-out function in the output layer with an SVM using the radial basis function kernel.
Abstract: The echo state network (ESN) is a relatively recent type of recurrent neural network that has been shown to achieve state-of-the-art performance in a variety of machine-learning tasks. This robust performance, combined with the simplicity of ESN implementation, has led to wide adoption in the machine-learning community. ESN's simplicity stems from the weights of the recurrent nodes (known as the reservoir) being assigned randomly; weights are learnt only in the output layer, using a linear read-out function. In this paper, we present a novel approach that combines ESN with support vector machines (SVMs) for time series classification by replacing the linear read-out function in the output layer with an SVM using the radial basis function kernel. The proposed model was evaluated on an Arabic digit speech recognition task. The well-known Spoken Arabic Digits Dataset, which contains 8800 instances of the Arabic digits 0-9 spoken by 88 different speakers (44 males and 44 females), was used to develop and validate the suggested approach. Our results can be compared to the state-of-the-art models introduced by Hammami et al. (2011) and P. R. Cavalin et al. (2012), the best reported results in the literature using the same dataset. The results show that ESN and ESNSVMs both provide superior performance, at 96.91% and 97.45% recognition accuracy respectively, compared with 95.99% and 94.04% for the other models. The results also show that, with smaller reservoir sizes, significant performance differences exist between ESN and ESNSVMs, with the latter achieving higher accuracy by more than 15% in extreme cases.
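A minimal sketch of the ESN-with-SVM-read-out idea, assuming illustrative reservoir sizes, scaling constants, and random stand-in data rather than the paper's tuned setup and the Spoken Arabic Digits features:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_in, n_res = 13, 100                       # e.g. 13 MFCCs per frame (assumed)
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius < 1

def reservoir_state(sequence):
    """Run a (T, n_in) sequence through the fixed random reservoir;
    return the final state as a fixed-length representation."""
    x = np.zeros(n_res)
    for u in sequence:
        x = np.tanh(W_in @ u + W @ x)
    return x

# Hypothetical data: 20 variable-length sequences with binary labels.
seqs = [rng.normal(size=(rng.integers(20, 40), n_in)) for _ in range(20)]
labels = rng.integers(0, 2, 20)

features = np.array([reservoir_state(s) for s in seqs])
clf = SVC(kernel="rbf").fit(features, labels)  # SVM replaces the linear read-out
print(clf.score(features, labels))
```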

39 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: The underpinning cloud computing elements required to provide forensics-friendly cloud services are discussed, and a set of questions is suggested to aid in the process of cloud forensics analysis.
Abstract: Cloud computing and digital forensics are both developing fields, and researching them requires an understanding of the main aspects of each. In cloud computing, it is necessary not only to understand its characteristics and the different service and deployment models, but also to survey underpinning elements such as virtualization and distributed computing, which are important for identifying its impact on current digital forensics guidelines and procedures. Unlike papers discussing the challenges and opportunities presented by cloud computing in relation to digital forensics, in this paper we discuss the underpinning cloud computing elements required to provide forensics-friendly cloud services. Furthermore, we suggest a set of questions that will aid in the process of cloud forensics analysis.

31 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: The identity management framework proposed in this paper is based on the User-Managed Access (UMA) protocol and enables federated identity management and management of access control policies across different infrastructure providers.
Abstract: The Cloud Networking (CloNe) infrastructure provisions elastic, secure, and on-demand virtualized network resources to the end user. It incorporates the Network-as-a-Service (NaaS) provisioning model, which enhances network-level scalability, throughput, and performance. In this paper, we extend the CloNe architecture by designing, deploying, and integrating an identity management framework customized for the CloNe infrastructure. The identity management framework proposed in this paper is based on the User-Managed Access (UMA) protocol. The framework supports authentication, authorization, and identity management of entities in the CloNe infrastructure. Furthermore, it enables federated identity management and management of access control policies across different infrastructure providers.

21 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: A multi-watermarking algorithm that embeds a robust ownership watermark and fragile authentication information into satellite images to protect their copyright ownership and integrity.
Abstract: This paper deals with a multi-watermarking algorithm that embeds a robust ownership watermark and fragile authentication information into satellite images to protect the copyright ownership and integrity of the images. This new watermarking technique is totally blind and does not require the original satellite image for the extraction of the embedded information. The robust watermark is embedded first in the discrete cosine transform (DCT) domain, and the fragile hash-based authentication code is then embedded in the spatial domain. The new technique offers high peak signal-to-noise ratio (PSNR) and structural similarity index measure (SSIM) values. The technique was successfully tested on a variety of satellite images. The fragile watermark is sensitive to any slight tampering and can locate the edited area. The robust watermark survived many intentional and unintentional attacks.
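The paper's exact embedding is not specified here; the following sketch shows a generic blind DCT-domain bit embedding by coefficient quantization (QIM-style parity), the standard mechanism behind robust DCT watermarks of this kind:

```python
import numpy as np
from scipy.fftpack import dct, idct

def dct2(block):
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def idct2(coeffs):
    return idct(idct(coeffs, axis=0, norm="ortho"), axis=1, norm="ortho")

def embed_bit(block, bit, pos=(3, 4), step=12.0):
    """Quantize one mid-frequency coefficient so its quantizer index
    parity encodes the bit (blind: extraction needs no original)."""
    c = dct2(block.astype(float))
    q = int(np.round(c[pos] / step))
    if q % 2 != bit:
        q += 1                       # force parity to encode the bit
    c[pos] = q * step
    return idct2(c)

def extract_bit(block, pos=(3, 4), step=12.0):
    c = dct2(block.astype(float))
    return int(np.round(c[pos] / step)) % 2

block = np.random.randint(0, 256, (8, 8))
marked = embed_bit(block, 1)
print(extract_bit(marked))  # -> 1
```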

17 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper proposes a hypoglycemia prediction model using the recent history of subcutaneous glucose measurements collected via Continuous Glucose Monitoring (CGM) sensors, and shows the ability to develop a generalized prediction model suitable for predicting hypoglycemia events for the group of patients participating in the study.
Abstract: The proper control of blood glucose levels in diabetic patients reduces serious complications. Yet tighter glycemic control increases the risk of developing hypoglycemia, a sudden drop in a patient's blood glucose level that can cause coma and possibly death if proper action is not taken immediately. In this paper, we propose a hypoglycemia prediction model using the recent history of subcutaneous glucose measurements collected via Continuous Glucose Monitoring (CGM) sensors. The model is able to predict hypoglycemia events accurately within a prediction horizon of thirty minutes (sensitivity = 86.47%, specificity = 96.22%, accuracy = 95.97%) using only the last two glucose measurements and the difference between them. More remarkably, this study shows the ability to develop a generalized prediction model suitable for predicting hypoglycemia events for the group of patients participating in the study.
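A minimal sketch of the described feature set, the last two CGM readings and their difference, fed to a simple classifier; the training data and labeling rule below are hypothetical stand-ins, not the paper's model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_features(glucose_history):
    """glucose_history: CGM readings in mg/dL, most recent last."""
    prev, last = glucose_history[-2], glucose_history[-1]
    return [last, prev, last - prev]   # last reading, previous, trend

# Hypothetical training set; label 1 = hypoglycemia (<70 mg/dL) soon,
# using a crude linear extrapolation 6 samples (~30 min at 5 min) ahead.
rng = np.random.default_rng(1)
raw = rng.uniform(50, 200, size=(500, 2))            # columns: last, previous
X = np.column_stack([raw[:, 0], raw[:, 1], raw[:, 0] - raw[:, 1]])
y = ((X[:, 0] + 6 * X[:, 2]) < 70).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(model.predict([make_features([110, 92])]))     # falling glucose -> alarm
```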

13 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: The boredom aspect of the rehabilitation process is tackled by developing a therapy system that uses computer games to emulate therapeutic activities; commercialization of these games is an interesting endeavor.
Abstract: The physiotherapy and rehabilitation process is an expensive and time-consuming affair. It also requires the direct supervision of specialists for effective results. This high cost, coupled with high demand, prohibits many patients from receiving appropriate care. Moreover, the emotional and mental strain is excessive for patients and is counterproductive to the therapy. In this paper, we tackle the boredom aspect of the rehabilitation process by developing a therapy system that uses computer games to emulate the therapeutic activities. We describe, in some detail, the issues involved in the development process. A commercial off-the-shelf product (the Xbox Kinect) is employed, so the cost is significantly reduced and accessibility is greatly increased. The system was tested on children at a local hospital, and the feedback was highly encouraging. This work opens avenues for more research in exercise development, movement emulation, and game design. Moreover, commercialization of these games is an interesting endeavor.

12 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper presents a practical optimized WR algorithm applied to transmission line circuit problems, based on longitudinal partitioning into segments, which greatly improves convergence for strongly coupled RLCG transmission line (TL) type circuits.
Abstract: Today, parallel processing is necessary for the solution of large systems of ordinary differential equations (ODEs) as they are obtained from large electronic circuits or from discretizing partial differential equations (PDEs). Using a fine mesh in the discretization of these problems also leads to large compute times and large storage requirements. The waveform relaxation (WR) technique, which is ideally suited to the use of multiple processors for problems with multiple time scales, has been used to solve such large systems of ODEs on parallel processors. However, applying the so-called classical WR techniques to strongly coupled systems leads to non-uniform, slow convergence over the window in time for which the equations are integrated. In this paper, we present a so-called optimized WR algorithm applied to transmission line circuit problems, based on longitudinal partitioning into segments. This greatly improves convergence for strongly coupled RLCG transmission line (TL) type circuits. The method can be applied to other similar circuits. It is based on optimal parameters that lead to the optimal convergence of the iterations. Here, we present a practical optimized WR algorithm which is easy to use and computationally inexpensive.
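For intuition, the toy sketch below runs a classical Jacobi waveform relaxation on two weakly coupled scalar ODEs: each subsystem is integrated over the whole time window using the other subsystem's waveform from the previous sweep. It illustrates the WR iteration itself, not the optimized transmission-line variant proposed in the paper:

```python
import numpy as np

# Coupled system: x1' = -x1 + k*x2,  x2' = -x2 + k*x1
k, T, n = 0.5, 5.0, 1000
t = np.linspace(0.0, T, n)
dt = t[1] - t[0]

def integrate(forcing, x0):
    """Forward-Euler solve of x' = -x + forcing(t) over the window."""
    x = np.empty(n)
    x[0] = x0
    for i in range(n - 1):
        x[i + 1] = x[i] + dt * (-x[i] + forcing[i])
    return x

x1 = np.zeros(n)          # initial waveform guesses over the whole window
x2 = np.zeros(n)
for sweep in range(50):   # WR sweeps until the waveforms stop changing
    x1_new = integrate(k * x2, x0=1.0)
    x2_new = integrate(k * x1, x0=0.0)
    err = max(np.max(np.abs(x1_new - x1)), np.max(np.abs(x2_new - x2)))
    x1, x2 = x1_new, x2_new
    if err < 1e-8:
        break
print(f"converged after {sweep + 1} sweeps, error {err:.2e}")
```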

11 citations


Proceedings ArticleDOI
17 Mar 2013
TL;DR: The algorithms are analyzed both theoretically and experimentally, assessing the suitability of these image in-painting algorithms for different kinds of applications in diversified areas.
Abstract: Digital in-painting is a relatively young research area, yet researchers have proposed a large variety of techniques to correct occlusions. Image in-painting aims to restore images with partial information loss, filling in the missing parts in such a way that the reconstructed image looks natural. Many different types of image in-painting algorithms exist in the literature. However, no recent study has undertaken a comparative evaluation of these algorithms to provide a comprehensive overview. This paper compares different types of image in-painting algorithms. The algorithms are analyzed both theoretically and experimentally, assessing their suitability for different kinds of applications in diversified areas.
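As a concrete starting point for such a comparison, OpenCV ships two classical inpainting algorithms (Telea's fast marching method and a Navier-Stokes based method) behind one call; a minimal usage sketch on a synthetic image:

```python
import numpy as np
import cv2

img = np.full((100, 100, 3), 180, np.uint8)
cv2.circle(img, (50, 50), 30, (0, 0, 255), -1)   # scene content

mask = np.zeros((100, 100), np.uint8)            # 255 marks the missing region
cv2.rectangle(mask, (40, 40), (60, 60), 255, -1)
img[mask == 255] = 0                             # simulate the occlusion

restored_telea = cv2.inpaint(img, mask, 3, cv2.INPAINT_TELEA)
restored_ns = cv2.inpaint(img, mask, 3, cv2.INPAINT_NS)
print(restored_telea.shape, restored_ns.shape)
```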

Proceedings ArticleDOI
17 Mar 2013
TL;DR: The perception of patients towards the use of robots in surgery, locally and remotely, is investigated, revealing that the majority of respondents from a random sample believe that the use of robots during surgeries is neither safe and controllable nor beneficial.
Abstract: Information and communication technology (ICT) has become a core part of every business. Industries from all areas have been taking advantage of ICT, and the healthcare sector is no exception. Robots have been playing a pivotal role in assisting surgeons in many types of surgeries in recent years. Robot-assisted surgery has made it possible for surgeons to perform procedures on remote patients. This study investigates the perception of patients towards the use of robots in surgery, locally and remotely. A survey method was adopted in this research. The data reveal that the majority of respondents from a random sample (140 participants) believe that the use of robots during surgeries is neither safe and controllable nor beneficial. The acceptance of the use of this surgical technology in all cases is still questionable within this research sample.

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper presents an overview of epidemic spread modeling and simulation, summarizes the main technical challenges in this field, and investigates the most relevant recent approaches in this area.
Abstract: Epidemics have disturbed human lives for centuries, causing massive numbers of deaths and illnesses among people and animals. As the urbanized and mobile population has increased, the possibility of a worldwide pandemic has grown too. The latest advances in high-performance computing and computational network science can help computational epidemiologists develop large-scale, high-fidelity models of epidemic spread. These models can help characterize the large-scale patterns of epidemics and guide public health officials and policy makers in taking appropriate decisions to prevent and control such epidemics. This paper presents an overview of epidemic spread modeling and simulation, and summarizes the main technical challenges in this field. It further investigates the most relevant recent approaches in this area and provides a comparison and classification of them.
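The classical starting point for the surveyed models is the SIR compartmental model; a minimal simulation with illustrative parameters:

```python
import numpy as np
from scipy.integrate import odeint

beta, gamma = 0.3, 0.1         # transmission and recovery rates (per day)
N = 1_000_000                  # population size

def sir(y, t):
    S, I, R = y
    dS = -beta * S * I / N
    dI = beta * S * I / N - gamma * I
    dR = gamma * I
    return [dS, dI, dR]

t = np.linspace(0, 200, 2001)
S, I, R = odeint(sir, [N - 10, 10, 0], t).T
print(f"peak infections: {I.max():,.0f} on day {t[I.argmax()]:.0f}")
```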

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper demonstrates the design and implementation of a portable embedded system targeting continuous temperature monitoring with wireless interface capability, for monitoring the body temperature of babies and disabled or elderly people and initiating an immediate alarm in hazardous cases.
Abstract: This paper demonstrates the design and implementation of a portable embedded system targeting continuous temperature monitoring with wireless interface capability. Our main motivation is to provide a solution for monitoring the body temperature of babies and disabled or elderly people, and for initiating an immediate alarm in hazardous cases. Such cases include overheating (fever), under-heating, and a high temperature change over a predefined time period. The advantage of the system is its effectiveness as a preventive measure against febrile seizures or any other fever condition. The system is extended to interface with other devices, such as cell phones, to enable remote monitoring; an Android application was developed to showcase the concept. The system architecture consists of temperature sensors, an LCD screen, a Bluetooth interface, memory, and a sound buzzer, all controlled by a single microcontroller core. Although the system concentrates on temperature monitoring, the architecture can be expanded to monitor other vital signs such as pulse rate, oxygen saturation, or any other parameter of interest.
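A minimal sketch of the alarm rules described (fever, under-heating, rapid change over a time window), with illustrative thresholds rather than the device's configured values:

```python
from collections import deque

FEVER_C, HYPO_C = 38.0, 35.0
MAX_DELTA_C, WINDOW = 1.0, 10      # alarm if >1 degC swing over 10 samples

history = deque(maxlen=WINDOW)

def check_temperature(temp_c):
    history.append(temp_c)
    if temp_c >= FEVER_C:
        return "ALARM: fever"
    if temp_c <= HYPO_C:
        return "ALARM: under-heating"
    if len(history) == WINDOW and max(history) - min(history) > MAX_DELTA_C:
        return "ALARM: rapid temperature change"
    return "ok"

for reading in (36.6, 36.7, 37.1, 37.9, 38.2):
    print(reading, check_temperature(reading))
```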

Proceedings ArticleDOI
17 Mar 2013
TL;DR: A new caching strategy called LUV-Path is proposed, in which all the routers along the delivery path implicitly cooperate in deciding whether to cache the content, and each cached content item is assigned a value that combines its Least Unified Value (LUV) with the router's distance from the provider to reflect its relative importance.
Abstract: One of the defining characteristics of Information-Centric Networking (ICN) is in-network caching, which enables content retrieval to shift its emphasis from specific content providers to the content that customers care about most. Undoubtedly, how to cache effectively is of primary concern in ICN. However, caching strategies such as Least Recently Used (LRU) are not tailored to the characteristics of content retrieval in ICN, thus compromising the capability of in-network storage. In this paper, we propose a new caching strategy called LUV-Path, in which all the routers along the delivery path implicitly cooperate in deciding whether to cache the content, and each cached content item is assigned a value that combines its Least Unified Value (LUV) with the router's distance from the provider to reflect its relative importance. We evaluate LUV-Path against caching algorithms including LRU and FIFO under various network topologies, such as highly structured string and tree topologies and the irregularly structured Abilene network. Our results suggest that LUV-Path consistently and significantly outperforms the other caching strategies in reducing customer delay and network traffic, as well as in alleviating provider pressure, across network topologies with different structural properties.
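A sketch of a path-aware cache value in the spirit of LUV-Path: each cached item's value combines recency of use with the caching router's hop distance from the content provider. The exact combination below is an illustrative guess, not the paper's formula:

```python
import time

class PathAwareCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = {}            # name -> (last_used, hops_from_provider)

    def value(self, name):
        last_used, hops = self.items[name]
        recency = 1.0 / (1.0 + time.time() - last_used)
        return recency * hops      # farther from provider = more valuable copy

    def access(self, name, hops_from_provider):
        if name not in self.items and len(self.items) >= self.capacity:
            evict = min(self.items, key=self.value)   # drop least valuable
            del self.items[evict]
        self.items[name] = (time.time(), hops_from_provider)

cache = PathAwareCache(capacity=2)
cache.access("video/a", hops_from_provider=5)
cache.access("video/b", hops_from_provider=1)
cache.access("video/c", hops_from_provider=4)   # evicts the low-value entry
print(sorted(cache.items))
```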

Proceedings ArticleDOI
17 Mar 2013
TL;DR: A detailed review of the most recent approaches for solving the RCPSP that have been proposed in the literature is provided, with a comparison, classification, and analysis based on a number of relevant metrics, and extensive numerical results based on well-known benchmark problem instance sets are presented.
Abstract: The classical resource-constrained project scheduling problem (RCPSP), a well-known NP-hard problem in scheduling, is one of the most extensively investigated problems in operations research. It has been attracting considerable attention from academia and industry for several decades. Recently, a number of new and promising meta-heuristic approaches for solving the RCPSP have emerged. In this paper, we provide a detailed review of the most recent approaches for solving the RCPSP that have been proposed in the literature. In particular, we present a comparison, classification, and analysis based on a number of relevant metrics. Extensive numerical results based on the well-known benchmark problem instance sets of size J30, J60, and J120 from the Project Scheduling Problem Library (PSPLIB), as well as comparisons among state-of-the-art hybrid meta-heuristic algorithms, demonstrate the effectiveness of the proposed approaches for solving the RCPSP at various scales.
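At the core of most RCPSP meta-heuristics is a schedule decoding step; the sketch below implements a serial schedule generation scheme (SGS) on a made-up five-activity, single-resource instance:

```python
durations = {0: 0, 1: 3, 2: 2, 3: 4, 4: 0}     # 0 and 4 are dummy start/end
demand = {0: 0, 1: 2, 2: 3, 3: 2, 4: 0}        # units of one renewable resource
capacity = 4
preds = {0: [], 1: [0], 2: [0], 3: [1], 4: [2, 3]}

def serial_sgs(priority_order):
    """Schedule activities one by one at their earliest precedence- and
    resource-feasible start time; return the project makespan."""
    finish, usage = {}, {}                      # usage[t] = units in use at t
    for act in priority_order:
        t = max((finish[p] for p in preds[act]), default=0)
        while any(usage.get(t + d, 0) + demand[act] > capacity
                  for d in range(durations[act])):
            t += 1                              # shift right until feasible
        for d in range(durations[act]):
            usage[t + d] = usage.get(t + d, 0) + demand[act]
        finish[act] = t + durations[act]
    return finish[4]

print("makespan:", serial_sgs([0, 1, 2, 3, 4]))  # -> 9 for this toy instance
```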

Proceedings ArticleDOI
17 Mar 2013
TL;DR: Simulation results show that as η decreases below the bandwidth of the chaotic signal, the synchronization error increases due to high distortion of the attractor's normal geometrical configuration, and system performance degrades significantly with a high possibility of communication link failure.
Abstract: One of the challenging issues involved in the development of future chaos-based secure communication (CBSC) for wireless applications is the synchronization between transmit and receive nodes. In this paper, the effects on synchronization of filtering the chaotic signals of CBSC, due to the finite bandwidth of realistic channel environments and/or detection requirements, are investigated. The double-scroll chaotic attractor using Chua's circuit is employed at the transmit and receive nodes. Over a wide range of filter cut-off frequencies (η), simulation results show that as η decreases below the bandwidth of the chaotic signal (W), the synchronization error increases due to high distortion of the attractor's normal geometrical configuration (state space). Consequently, system performance degrades significantly, with a high possibility of communication link failure. Thus, careful design is required to eliminate the impact of filtering and achieve the synchronization essential for the wide adoption of CBSC in next-generation systems.
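The double-scroll attractor from Chua's circuit can be reproduced with the standard dimensionless equations and textbook parameters; the filtering experiment itself is not reproduced here:

```python
import numpy as np
from scipy.integrate import solve_ivp

alpha, beta = 15.6, 28.0          # classic double-scroll parameters
m0, m1 = -8 / 7, -5 / 7

def chua(t, s):
    x, y, z = s
    # piecewise-linear Chua diode characteristic
    f = m1 * x + 0.5 * (m0 - m1) * (abs(x + 1) - abs(x - 1))
    return [alpha * (y - x - f), x - y + z, -beta * y]

sol = solve_ivp(chua, (0, 100), [0.1, 0.0, 0.0], max_step=0.01)
print(sol.y.shape)   # trajectory tracing the double scroll in state space
```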

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper focuses on the development of node-level very short-term load forecasting (NL-VSTLF), and specifically on a framework for the utilization of real-time data at the point of use for advancing research in the area of building very short-term load forecasting.
Abstract: By combining Information and Communication Technologies (ICT), advanced instrumentation, system intelligence, and information on the end user, the smart grid will increase building energy efficiency and conservation. Demand Side Management (DSM) programs serve as an aid in energy conservation and management strategies, as well as in the collection of real-time consumption data. Specifically, as proposed in this paper, real-time, fine-grained consumption data at the point of use in buildings can be used by building energy managers, utilities, and the end user for planning, load forecasting, and feedback that may lead to end-user behavior change. This paper focuses on the development of node-level very short-term load forecasting (NL-VSTLF), and specifically on a framework for the utilization of real-time data at the point of use for advancing research in the area of building very short-term load forecasting. The proposed framework will be the foundation for variable one-minute to hourly and daily load forecasting using an aggregate of all active nodes in real time.
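As a minimal illustration of very short-term forecasting from aggregated node data, the sketch below fits a small autoregressive model to a synthetic building-level series; the data and model order are illustrative, not the paper's framework:

```python
import numpy as np

rng = np.random.default_rng(2)
node_loads = rng.uniform(0.1, 1.5, size=(5, 120))   # 5 nodes x 120 minutes, kW
total = node_loads.sum(axis=0)                       # building-level series

p = 3                                                # AR model order
X = np.column_stack([total[i:len(total) - p + i] for i in range(p)])
y = total[p:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)         # least-squares AR fit

next_minute = total[-p:] @ coef                      # one-step-ahead forecast
print(f"forecast for next minute: {next_minute:.2f} kW")
```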

Proceedings ArticleDOI
17 Mar 2013
TL;DR: The research conducted to design the Arabic Brain Communicator (ABC), which is a brain-controlled typing system designed to facilitate communication for people with severe motor disabilities in Arabic, is described.
Abstract: This paper describes the research conducted to design the Arabic Brain Communicator (ABC), a brain-controlled typing system designed to facilitate communication in Arabic for people with severe motor disabilities. A user-centered design approach was adopted; it included empirical investigations and meetings with Subject-Matter Experts and prospective users. The activities conducted in the analysis and design of the system are discussed.

Proceedings ArticleDOI
17 Mar 2013
TL;DR: A scalable random directional broadcasting relay (RDBR) scheme, based on the presented lemma, enables a source node to effectively determine the forwarding nodes, especially when node density is high.
Abstract: In this paper, we present a lemma establishing the upper bound of coverage for broadcast relaying in military ad hoc wireless communications in critical battlefield environments. A scalable random directional broadcasting relay (RDBR) scheme based on this lemma is then presented, enabling a source node to effectively determine the forwarding nodes, especially when node density is high. Numerical results show that the proposed RDBR scheme, which uses relative forwarding distance and angle, is able to increase the successful delivery ratio by up to 10% compared to conventional distance-based broadcasting approaches.
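A hedged sketch of directional forwarder selection: neighbors are partitioned into angular sectors around the source, and the farthest neighbor per sector is chosen as a forwarder. The sector count and selection rule are illustrative, not the exact RDBR scheme:

```python
import numpy as np

rng = np.random.default_rng(3)
neighbors = rng.uniform(-100, 100, size=(40, 2))    # neighbor coordinates (m)
n_sectors = 6

angles = np.arctan2(neighbors[:, 1], neighbors[:, 0])            # [-pi, pi]
sectors = ((angles + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
dists = np.linalg.norm(neighbors, axis=1)

forwarders = []
for s in range(n_sectors):
    idx = np.flatnonzero(sectors == s)
    if idx.size:
        forwarders.append(idx[np.argmax(dists[idx])])            # farthest per sector
print("forwarding nodes:", forwarders)
```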

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper explores design trade-offs for middleware systems targeting multi-core embedded systems; the experimental platform implemented for the project is presented and future work is highlighted.
Abstract: Efficient software is required to make optimum use of the full-scale features of rapidly developing MPSoC hardware. Embedded MPSoCs are characterized by their heterogeneity and resource limitations. In addition, embedded MPSoC systems are required to deal with a complex set of tasks whose resource requirements cannot easily be determined statically at design time. Middleware is a particularly important software component of embedded MPSoCs that enables the system to overcome heterogeneity and manage resources dynamically at run time. An important part of the middleware is the run-time system manager, responsible for distributed resource management of the MPSoC-based system. System management at run time is important for many modern embedded systems because the tasks performed by the system vary over time. In this paper, we explore design trade-offs for middleware systems targeting multi-core embedded systems. The experimental platform implemented for the project is presented and future work is highlighted.

Proceedings ArticleDOI
17 Mar 2013
TL;DR: Results suggest that, with both implementations, running more threads on a system with more cores has the potential to reduce execution time with negligible or little increase in total power consumption.
Abstract: The advancement of multicore systems demands applications with more threads. To meet this demand, parallel programming models such as the Message Passing Interface (MPI) have been developed. By using such models, execution time and power consumption can be reduced significantly. However, the performance of MPI programming depends on the total number of threads and the number of processing cores in the system. In this work, we experimentally study the impact of Open MPI and POSIX Threads (Pthreads) implementations on the performance and power consumption of multicore systems. Data-dependent (like heat conduction on a 2D surface) and data-independent (like matrix multiplication) applications are used with high-performance hardware in the experiments. The results suggest that, with both implementations, running more threads on a system with more cores has the potential to reduce execution time with negligible or little increase in total power consumption. It is observed that the performance of the MPI implementation varies, due to the dynamic communication overhead among the processing cores.
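A minimal MPI example of the data-independent workload mentioned above (matrix multiplication), using mpi4py to scatter row blocks of A, broadcast B, and gather the result; matrix size and process count are illustrative, and the size is assumed divisible by the process count:

```python
from mpi4py import MPI
import numpy as np

# run with e.g.: mpiexec -n 4 python this_script.py  (filename illustrative)
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
n = 512

A = np.random.rand(n, n) if rank == 0 else None
B = np.random.rand(n, n) if rank == 0 else np.empty((n, n))
comm.Bcast(B, root=0)                   # every rank needs all of B

local_A = np.empty((n // size, n))
comm.Scatter(A, local_A, root=0)        # each rank gets a block of rows of A

local_C = local_A @ B                   # independent per-rank work

C = np.empty((n, n)) if rank == 0 else None
comm.Gather(local_C, C, root=0)
if rank == 0:
    print("C computed, shape", C.shape)
```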

Proceedings ArticleDOI
17 Mar 2013
TL;DR: A general analytical model is provided to determine the capacity of timing-based covert channels, and the model is verified with computer simulations.
Abstract: Covert channels provide a medium for secret communication by exploiting caveats in common networking protocols to hide information exchanges within benign activities, without being detected by unsuspecting hosts and network firewalls. This makes covert channels a significant security concern, so it is of utmost importance to develop effective and comprehensive countermeasures. In general, the more secret data capacity a covert channel provides, the higher its estimated threat level, as greater capacity diminishes the time available to detect and disrupt such activities and prevent the information exchange. Hence, determining the capacity of a covert channel is important. However, most work in capacity estimation is targeted at individual algorithms only, and is thus limited in applicability. A general mathematical model that can predict the capacity of most algorithms is a key research need for effective covert channel prevention. In this paper, we provide a general analytical model to determine the capacity of timing-based covert channels, and verify the model with computer simulations.
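For intuition, the capacity of an idealized noiseless binary timing channel, where a bit is encoded as a short or long inter-packet delay, can be computed numerically; this generic textbook model illustrates the kind of bound the paper develops, not its exact channel model:

```python
import numpy as np

# C = max_p  H(p) / (p*t1 + (1-p)*t0)   [bits per second]
# where H is binary entropy and t0, t1 are the two symbol durations.

def H(p):
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

t0, t1 = 0.01, 0.02                       # short/long delays in seconds
p = np.linspace(1e-6, 1 - 1e-6, 100_000)  # probability of the long symbol
rate = H(p) / (p * t1 + (1 - p) * t0)     # bits/s for each bit bias

print(f"capacity ~ {rate.max():.1f} bit/s at p = {p[rate.argmax()]:.3f}")
```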

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper proposes the concept and primary architecture of a network “memory” (NetMem) to support smarter data-driven network operations as a foundational component of next-generation networks; NetMem is designed to mimic functionalities of the human memory.
Abstract: The advent of the Internet of Things will bring an explosion in the number of interconnected heterogeneous objects, as well as in the diverse resources and services they may offer. A fundamental goal is to ensure the availability of resources and services to communicating objects ubiquitously, resiliently, on demand, and at low cost, while satisfying users' QoS requirements. We hypothesize that, to achieve this goal, there is a need to build capabilities for smarter networking that harvest the currently elusive rich semantics emerging in interactions. In this paper, we propose the concept and primary architecture of a network “memory” (or NetMem) to support smarter data-driven network operations as a foundational component of next-generation networks. Guided by the fact that networking activities exhibit spatiotemporal data patterns, we design NetMem to mimic functionalities of the human memory. NetMem provides capabilities for semantics management by integrating data virtualization, cloud-like scalable storage, associative rule learning, and predictive analytics. It provides associative access to data patterns and relevant derived semantics to enable enhancements in decision making, QoS guarantees, utilization of resources, early anomaly detection, and more accurate behavior prediction. We evaluate NetMem using simulation. Preliminary results demonstrate the positive impact of NetMem on various network management operations.

Proceedings ArticleDOI
17 Mar 2013
TL;DR: An empirical study is introduced on the effect of data distribution and other data measures, such as the mean and standard deviation, on attack detection rates, along with the cost of the proposed watermark insertion and data verification algorithms.
Abstract: Databases most often contain critical information. Unauthorized changes to databases can have serious consequences and may result in significant losses for the organization. This paper presents a viable solution for protecting the integrity of data stored in relational databases using fragile watermarking. Prior techniques introduce distortions to the watermarked values and thus cannot be applied to all attributes. Our technique protects relational tables by reordering tuples relative to each other according to a secret value (the watermark). This paper introduces an empirical study on the effect of data distribution and other data measures, such as the mean and standard deviation, on attack detection rates. A study of the cost, in terms of execution time, of the proposed watermark insertion and data verification algorithms is also presented.
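A sketch of the distortion-free idea: tuples are ordered by a keyed hash (here HMAC-SHA256) of their primary key, so the stored order itself carries the watermark and tampering disturbs it. This is an illustrative reconstruction, not the paper's exact algorithm:

```python
import hmac, hashlib

SECRET = b"watermark-key"   # hypothetical secret

def order_key(pk):
    return hmac.new(SECRET, str(pk).encode(), hashlib.sha256).hexdigest()

def watermark(rows, pk_index=0):
    """Reorder tuples according to the secret keyed hash of their key;
    no attribute value is modified, so the scheme is distortion-free."""
    return sorted(rows, key=lambda r: order_key(r[pk_index]))

def verify(rows, pk_index=0):
    """Integrity check: the stored order must match the secret order."""
    return rows == watermark(rows, pk_index)

table = watermark([(1, "alice"), (2, "bob"), (3, "carol")])
print(verify(table))                       # True: order intact
table[0], table[1] = table[1], table[0]
print(verify(table))                       # False: tampering detected
```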

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper gives a unique method to secure hosts inside the home network using predictive attack detection and honeypots, and offers techniques to roam an attacker over multiple installed honeypots.
Abstract: Our contribution in this paper is a novel approach for securing hosts inside the home network using predictive attack detection and honeypots. We present a unique method to secure hosts inside the home network; after securing these hosts, we redirect attacker traffic to honeypots for further analysis. Our system also offers techniques to roam an attacker over multiple installed honeypots. This, in turn, helps to load-share the attack traffic between honeypots and to record the maximum number of attacks. This method also helps to address the key problem of determining honeypot placement for maximum attack exposure.

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper overcomes a drawback of the ProRank algorithm and further improves its performance by allowing detected protein complexes to overlap, a possibility that was not considered in the original version of the method.
Abstract: The detection of protein complexes is evidently a cornerstone of understanding various biological processes and identifying key genes causing different diseases. Accordingly, many methods aiming at detecting protein complexes have been developed. Recently, a novel method called ProRank was introduced. This method uses a ranking algorithm to detect protein complexes by ordering proteins based on their importance in the interaction network and by accounting for the evolutionary relationships among them. Experimental results showed that ProRank outperformed several well-known methods in terms of the number of detected complexes, with high accuracy, precision, and recall levels. In this paper, we overcome a drawback of the ProRank algorithm and further improve its performance by allowing detected protein complexes to overlap, a possibility that was not considered in the original version of the method.
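For intuition about ranking-based complex detection, the sketch below ranks proteins in a toy interaction network with plain PageRank (a stand-in for ProRank's ranking, which also weighs evolutionary relationships) and grows one candidate complex per top-ranked seed, allowing proteins to belong to several complexes:

```python
import networkx as nx

G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"),     # dense triangle
              ("C", "D"), ("D", "E"), ("D", "F"), ("E", "F")])

rank = nx.pagerank(G)                                  # network importance
seeds = sorted(rank, key=rank.get, reverse=True)[:2]   # top-ranked proteins

# One candidate complex per seed, grown from its neighborhood; a protein
# may appear in several complexes, i.e. overlap is allowed.
complexes = [{s, *G.neighbors(s)} for s in seeds]
print(complexes)
```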

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This paper introduces an approach that combines a set of semantic techniques to enhance traditional information retrieval for the Arabic language, and shows that semantic retrieval outperforms traditional information retrieval based only on query keyword matching.
Abstract: This paper introduces an approach that combines a set of semantic techniques to enhance traditional information retrieval for the Arabic language. The approach uses a set of linguistic techniques to deal with Web content. It first expands the initial user query with additional related words, using lexical semantic relations (synonyms) covered in the Arabic WordNet thesaurus as well as domain-specific words and abbreviations collected from the corpus. Second, it makes use of linguistic methods to semantically match the query objective with the text content in the search results. The paper presents the approach used as well as the implementation of the system. Experimental results show that semantic retrieval outperforms traditional information retrieval, which is based only on query keyword matching.
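A minimal sketch of synonym-based query expansion, using the Open Multilingual Wordnet's Arabic data ("arb") as a stand-in for the Arabic WordNet thesaurus used in the paper:

```python
from nltk.corpus import wordnet as wn
# requires: nltk.download("wordnet"); nltk.download("omw-1.4")

def expand_query(words):
    """Add Arabic synonyms of each query word from the multilingual wordnet."""
    expanded = set(words)
    for w in words:
        for syn in wn.synsets(w, lang="arb"):
            expanded.update(syn.lemma_names(lang="arb"))
    return expanded

print(expand_query(["كتاب"]))   # "book" plus its Arabic synonyms
```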

Proceedings ArticleDOI
17 Mar 2013
TL;DR: A path generation algorithm based on Principal Component Analysis is presented to find a fixed path for a single mobile sink with the minimum average distance to the sensor nodes.
Abstract: Given the energy constraints present in wireless sensor networks, it is essential to find energy-efficient approaches that result in enhanced network lifetime. One approach is to exploit sink mobility by moving the sink, randomly or along a predefined path, into the vicinity of the sensor nodes. By doing so, the communication distance is reduced, leading to a reduction in the energy consumed for transmission. In this paper, a path generation algorithm based on Principal Component Analysis is presented. The main goal is to find a fixed path for a single mobile sink with the minimum average distance to the sensor nodes. Simulation results indicate that this approach extends network lifetime and results in an increased average energy per live node.
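A sketch of the PCA idea: fit the first principal component of the sensor coordinates and place the sink's fixed path along that axis, which keeps the path close to the nodes on average. Waypoint spacing and the evaluation metric are illustrative, and the paper's exact path construction may differ:

```python
import numpy as np

rng = np.random.default_rng(4)
nodes = rng.uniform(0, 100, size=(50, 2))          # sensor positions (m)

mean = nodes.mean(axis=0)
_, _, Vt = np.linalg.svd(nodes - mean)             # PCA via SVD
axis = Vt[0]                                       # first principal direction

proj = (nodes - mean) @ axis                       # node spread along the axis
W = np.array([mean + t * axis
              for t in np.linspace(proj.min(), proj.max(), 10)])  # waypoints

# Average distance from each node to its nearest point on the sink path.
d = np.linalg.norm(nodes[:, None, :] - W[None, :, :], axis=2).min(axis=1)
print(f"average node-to-path distance: {d.mean():.1f} m")
```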

Proceedings ArticleDOI
17 Mar 2013
TL;DR: This study presents a formal description of a linguistic system for developing an Arabic syntax analyzer based on combinatorial grammar, along with several formal concepts that redefine the structure of the Arabic language to adapt it to automatic processing at the syntactic level.
Abstract: This study presents a formal description of a linguistic system for developing an Arabic syntax analyzer based on combinatorial grammar. In this context, we present several formal concepts, such as support verbs, nominalization, and other operators, to redefine the structure of the Arabic language in order to adapt it to automatic processing at the syntactic level. These concepts provide an effective framework for the implementation of language engineering techniques aimed at integrating the Arabic language within the NLP community.