
Showing papers in "Journal of King Saud University - Computer and Information Sciences archive in 2013"


Journal ArticleDOI
TL;DR: It is concluded that drug treatment for patients in the young age group can be delayed to avoid side effects; in contrast, patients in the old age group should be prescribed drug treatment immediately, along with other treatments, because there are no other alternatives available.
Abstract: This research concentrates on predictive analysis of diabetic treatment using a regression-based data mining technique. Oracle Data Miner (ODM) was employed as the software mining tool for predicting modes of treating diabetes. The support vector machine algorithm was used for experimental analysis. Datasets of Non-Communicable Disease (NCD) risk factors in Saudi Arabia were obtained from the World Health Organization (WHO) and used for analysis. The dataset was studied and analyzed to identify the effectiveness of different treatment types for different age groups. The five age groups were consolidated into two age groups, denoted p(y) and p(o) for the young and old age groups, respectively. Preferential orders of treatment were investigated. We conclude that drug treatment for patients in the young age group can be delayed to avoid side effects. In contrast, patients in the old age group should be prescribed drug treatment immediately, along with other treatments, because there are no other alternatives available.
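As a rough illustration of this kind of analysis, the sketch below trains an SVM classifier to predict a treatment mode from an age-group flag and risk-factor features. It uses scikit-learn in place of Oracle Data Miner, and the feature layout, labels, and data are hypothetical stand-ins, not the WHO dataset schema.

```python
# Illustrative sketch of SVM-based treatment prediction (the paper used
# Oracle Data Miner; scikit-learn is substituted here). Features, labels,
# and data are hypothetical placeholders, not the WHO NCD dataset.
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical rows: [age_group (0=young, 1=old), risk_factor_1, risk_factor_2]
X = [[0, 0.2, 0.1], [0, 0.3, 0.4], [1, 0.7, 0.8], [1, 0.9, 0.6],
     [0, 0.1, 0.2], [1, 0.8, 0.9]]
y = ["diet", "exercise", "drug", "drug", "diet", "drug"]  # treatment mode

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=0)
model = SVC(kernel="rbf")          # support vector machine classifier
model.fit(X_train, y_train)
print(accuracy_score(y_test, model.predict(X_test)))
```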

140 citations


Journal ArticleDOI
TL;DR: The quantized histogram statistical texture features are extracted from the DCT blocks of the image using the significant energy of the DC and the first three AC coefficients of the blocks for the effective matching of images in the compressed domain.
Abstract: Effective content-based image retrieval (CBIR) needs efficient extraction of low-level features like color, texture and shape for indexing, and fast matching of the query image with indexed images for the retrieval of similar images. Features are extracted from images in the pixel and compressed domains. However, most existing images are now in compressed formats like JPEG, which uses the DCT (discrete cosine transform). In this paper we study the issues of efficient extraction of features and effective matching of images in the compressed domain. In our method, quantized histogram statistical texture features are extracted from the DCT blocks of the image using the significant energy of the DC and the first three AC coefficients of the blocks. For effective matching of the query image with database images, various distance metrics are used to measure similarity using the texture features. The analysis of effective CBIR is performed on the basis of various distance metrics with different numbers of quantization bins. The proposed method is tested using the Corel image database, and the experimental results show that our method achieves robust image retrieval for various distance metrics with different histogram quantizations in the compressed domain.
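The sketch below illustrates the kind of feature extraction described: each 8x8 block is DCT-transformed, the DC and first three AC coefficients (taken here in zig-zag order) are collected, and each coefficient stream is quantized into a histogram. The bin count and implementation details are illustrative assumptions, not the paper's exact procedure.

```python
# Sketch: quantized histograms of the DC and first three AC coefficients
# from 8x8 DCT blocks. Bin count and details are illustrative assumptions.
import numpy as np
from scipy.fftpack import dct

def block_dct2(block):
    """2-D type-II DCT of an 8x8 block (as used by JPEG)."""
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def texture_histograms(image, bins=32):
    h, w = image.shape
    coeffs = [[], [], [], []]              # DC, AC1, AC2, AC3
    for i in range(0, h - 7, 8):
        for j in range(0, w - 7, 8):
            d = block_dct2(image[i:i+8, j:j+8].astype(float))
            # DC plus the first three AC coefficients in zig-zag order
            for k, (r, c) in enumerate([(0, 0), (0, 1), (1, 0), (2, 0)]):
                coeffs[k].append(d[r, c])
    # quantize each coefficient stream into a normalized histogram
    return [np.histogram(c, bins=bins, density=True)[0] for c in coeffs]
```

Matching a query image against indexed images then reduces to computing a distance (e.g., Euclidean or chi-square) between the corresponding histograms.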

110 citations


Journal ArticleDOI
TL;DR: A user-centered measure of cyber-security is explored, and it is seen how this measure can be used to analyze cloud computing as a business model.
Abstract: Cloud computing is an emerging paradigm of computing that replaces computing as a personal commodity with computing as a public utility. As such, it offers all the advantages of a public utility system in terms of economy of scale, flexibility and convenience, but it raises major issues, not least of which are loss of control and loss of security. In this paper, we explore a user-centered measure of cyber-security, and we see how this measure can be used to analyze cloud computing as a business model.

96 citations


Journal ArticleDOI
TL;DR: An approach to mine user buying patterns using the PrefixSpan algorithm and place products on shelves based on the order of the mined purchasing patterns, to solve the problem of shelf space allocation and product display in supermarkets.
Abstract: With great variation in products and user buying behaviors, the shelf on which products are displayed is one of the most important resources in a retail environment. Retailers can not only increase their profit but also decrease costs by proper management of shelf space allocation and product display. To solve this problem, we propose an approach that mines user buying patterns using the PrefixSpan algorithm and places products on shelves based on the order of the mined purchasing patterns. The proposed approach mines the patterns in a two-stage process. In the first stage, sequences of product categories are mined, and the product categories are placed on the shelves based on the sequence order of the mined patterns. Subsequently, in the second stage, patterns (products) are mined for each category, and the products within each category are rearranged by incorporating a profit measure into the mined patterns. The experimentation is carried out on synthetic datasets, and the evaluation with two datasets showed that the proposed approach is well suited for product placement in supermarkets.
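For readers unfamiliar with PrefixSpan, the sketch below is a minimal, illustrative implementation that mines frequent purchase sequences by recursively projecting the database on each frequent item; the transactions and support threshold are made up.

```python
# A minimal PrefixSpan sketch for mining frequent purchase sequences.
# Transactions and the support threshold are illustrative.

def prefixspan(db, minsup, prefix=None, out=None):
    """Recursively grow frequent sequential patterns (PrefixSpan)."""
    prefix, out = prefix or [], out if out is not None else []
    # count, per sequence, the items occurring in the projected database
    counts = {}
    for seq in db:
        for item in set(seq):
            counts[item] = counts.get(item, 0) + 1
    for item, sup in counts.items():
        if sup < minsup:
            continue
        newprefix = prefix + [item]
        out.append((sup, newprefix))
        # project: keep the suffix after the first occurrence of item
        projected = [seq[seq.index(item) + 1:] for seq in db if item in seq]
        prefixspan(projected, minsup, newprefix, out)
    return out

baskets = [["milk", "bread", "butter"], ["milk", "bread"],
           ["bread", "butter"], ["milk", "butter", "bread"]]
for sup, pat in sorted(prefixspan(baskets, minsup=2), reverse=True):
    print(sup, "->".join(pat))
```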

62 citations


Journal ArticleDOI
TL;DR: A method for dynamic congestion detection and control routing (DCDR) in ad hoc networks based on the estimations of the average queue length at the node level is proposed, which showed better performance than the EDOCR, EDCSCAODV, EDAODV and AODV routing protocols.
Abstract: In mobile ad hoc networks (MANETs), congestion can occur in any intermediate node, often due to limited resources, when data packets are being transmitted from the source to the destination. Congestion leads to high packet loss, long delays and wasted resource utilization time. The primary objective of congestion control is to best utilize the available network resources and keep the load below the capacity. Congestion control techniques designed for TCP have been found inadequate for ad hoc networks, because ad hoc networks involve special challenges like high node mobility and frequent topology changes. This paper proposes a method for dynamic congestion detection and control routing (DCDR) in ad hoc networks based on estimates of the average queue length at the node level. Using the average queue length, a node detects the present congestion level and sends a warning message to its neighbors. The neighbors then attempt to locate a congestion-free alternative path to the destination. This dynamic congestion estimation mechanism supporting congestion control in ad hoc networks ensures reliable communication within the MANET. According to our simulation results, the DCDR showed better performance than the EDOCR, EDCSCAODV, EDAODV and AODV routing protocols.
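A minimal sketch of the queue-based congestion estimate described above: each node keeps an exponentially weighted average of its queue length and maps it to a congestion level. The smoothing weight and thresholds are illustrative assumptions, not the DCDR parameters.

```python
# Sketch of node-level congestion detection from the average queue
# length. ALPHA and the thresholds are illustrative assumptions.

ALPHA = 0.2                   # smoothing weight for the moving average
SAFE, CONGESTED = 0.4, 0.8    # fractions of the queue capacity

def update_avg(avg_qlen, instant_qlen):
    """Exponentially weighted moving average of the queue length."""
    return (1 - ALPHA) * avg_qlen + ALPHA * instant_qlen

def congestion_level(avg_qlen, capacity):
    ratio = avg_qlen / capacity
    if ratio < SAFE:
        return "safe"
    if ratio < CONGESTED:
        return "likely congested"   # warn neighbors, probe alternatives
    return "congested"              # trigger alternate-path routing
```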

53 citations


Journal ArticleDOI
TL;DR: The design problem of imposing deeper nulls in the interference direction of uniform linear antenna arrays under the constraints of a reduced side lobe level (SLL) and a fixed first null beam width (FNBW) is modeled as a simple optimization problem.
Abstract: The design problem of imposing deeper nulls in the interference direction of uniform linear antenna arrays under the constraints of a reduced side lobe level (SLL) and a fixed first null beam width (FNBW) is modeled as a simple optimization problem. The real-coded genetic algorithm (RGA) is used to determine an optimal set of current excitation weights of the antenna elements and the optimum inter-element spacing that satisfies the design goal. Three design examples are presented to illustrate the use of the RGA, and the optimization goal in each example is easily achieved. The numerical results demonstrate the effectiveness of the proposed method.
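The sketch below shows the kind of evaluation such an RGA fitness function performs: compute the array factor of a uniform linear array from the excitation weights and inter-element spacing, then score the depth of the pattern at the desired null directions. The normalization and cost terms are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of array-factor evaluation for an RGA fitness function.
# Excitations I_n and spacing d (in wavelengths) are the genes;
# the cost terms here are illustrative assumptions.
import numpy as np

def array_factor(weights, spacing, theta):
    """|AF| of a uniform linear array; spacing in wavelengths."""
    n = np.arange(len(weights))
    k = 2 * np.pi                       # wavenumber with lambda = 1
    phase = np.outer(np.cos(theta), n * spacing * k)
    return np.abs(np.exp(1j * phase) @ np.asarray(weights))

def fitness(weights, spacing, null_dirs_deg):
    theta = np.radians(np.linspace(0, 180, 721))
    af = array_factor(weights, spacing, theta)
    af_db = 20 * np.log10(af / af.max() + 1e-12)
    nulls = np.radians(null_dirs_deg)
    # deeper nulls give a more negative (better) cost for the RGA to
    # minimize; a fuller version would add an SLL penalty term
    return sum(af_db[np.argmin(np.abs(theta - t))] for t in nulls)
```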

44 citations


Journal ArticleDOI
TL;DR: A CL-SDVS scheme using elliptic curve bilinear pairings, which is provably secure in the random oracle model under the intractability of the BDH and CDH assumptions, and supports all desirable security requirements of a CL-SDVS scheme, such as strongness, source hiding and non-delegatability.
Abstract: Diffie and Hellman first invented the public key cryptosystem (PKC), wherein a public key infrastructure (PKI) is used for the management of public keys; however, PKI-based cryptosystems suffer from the heavy management burden of public keys and certificates. An alternative to PKI is Shamir's identity-based cryptosystem (IBC), which eliminates the need for public key certificates; however, the most important shortcoming of IBC is the key escrow problem. To cope with these problems, Al-Riyami and Paterson proposed certificateless PKC (CL-PKC), combining the advantages of PKI and IBC. Since then, several certificateless signature schemes have been designed, and most of them have been analyzed and proven insecure against different types of adversaries. Besides, researchers have paid comparatively little attention to the certificateless strong designated verifier signature (CL-SDVS) scheme. Therefore, we propose a CL-SDVS scheme using elliptic curve bilinear pairings in this paper. Our scheme, which is provably secure in the random oracle model under the intractability of the BDH and CDH assumptions, supports all desirable security requirements of a CL-SDVS scheme, such as strongness, source hiding and non-delegatability. The rigorous security analysis and comparison with related schemes demonstrate the better performance of the proposed scheme.
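For reference, the two hardness assumptions named above are commonly stated as follows (standard formulations, not the paper's exact notation):

```latex
% Standard statements of the two assumptions. G_1 is an additive group
% of prime order q with generator P, and e : G_1 x G_1 -> G_2 is a
% bilinear pairing.
\textbf{CDH:} given $(P,\; aP,\; bP)$ for unknown $a, b \in \mathbb{Z}_q^{*}$,
compute $abP \in G_1$.

\textbf{BDH:} given $(P,\; aP,\; bP,\; cP)$ for unknown
$a, b, c \in \mathbb{Z}_q^{*}$, compute $e(P, P)^{abc} \in G_2$.
```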

37 citations


Journal ArticleDOI
TL;DR: Two major improvements to HSA for the NRP are made: replacing random selection with the Global-best selection of Particle Swarm Optimization in the memory consideration operator to improve convergence speed, and establishing multi-pitch adjustment procedures to improve local exploitation.
Abstract: In this paper, the Harmony Search Algorithm (HSA) is proposed to tackle the Nurse Rostering Problem (NRP) using a dataset introduced in the First International Nurse Rostering Competition (INRC2010). The NRP is a combinatorial optimization problem that is tackled by assigning a set of nurses with different skills and contracts to different types of shifts over a predefined scheduling period. HSA is an approximation method that mimics the musical improvisation process and has been successfully applied to a wide range of optimization problems. It improvises a new harmony iteratively using three operators: memory consideration, random consideration, and pitch adjustment. Recently, HSA has been used for the NRP with promising results. This paper makes two major improvements to HSA for the NRP: (i) replacing random selection with the Global-best selection of Particle Swarm Optimization in the memory consideration operator to improve convergence speed, and (ii) establishing multi-pitch adjustment procedures to improve local exploitation. The results obtained by HSA are comparable with those produced by the methods of the five INRC2010 winners.
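The first improvement can be sketched as below: when the memory consideration operator fires, the decision value is copied from the best harmony in memory (as in PSO's global-best update) rather than from a randomly selected harmony. The parameter values and harmony encoding are illustrative, not those used for INRC2010.

```python
# Sketch of improvisation with global-best memory consideration.
# HMCR/PAR values, the encoding, and the pitch-adjustment rule are
# illustrative assumptions.
import random

HMCR, PAR = 0.9, 0.3   # memory-consideration and pitch-adjustment rates

def improvise(harmony_memory, fitness, domain):
    best = min(harmony_memory, key=fitness)       # global-best harmony
    new = []
    for i in range(len(best)):
        if random.random() < HMCR:
            value = best[i]                       # global-best selection
            if random.random() < PAR:
                value = random.choice(domain[i])  # pitch adjustment
        else:
            value = random.choice(domain[i])      # random consideration
        new.append(value)
    return new
```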

27 citations


Journal ArticleDOI
TL;DR: This paper proposes a new security protocol for proxy signature by a hierarchy of proxy signers and shows that the scheme is efficient in terms of computational complexity as compared to the existing related proxy signature schemes based on the hierarchical access control.
Abstract: In this paper, we propose a new security protocol for proxy signature by a hierarchy of proxy signers. In this protocol, the original signer delegates his/her signing capability to a predefined hierarchy of proxy signers. Given the documents of a security class to be signed by the original signer, our scheme provides a protocol for the hierarchy of proxy signers to sign the documents on behalf of the original signer. The concept of hierarchical access control limits the people who can sign a document to those who have the required security clearances. A user in a security class requires two secret keys: one that identifies his/her security clearance, and that can also be derived by a user with a higher-level security clearance, and a private key that identifies him/her as a proxy signer for signature generation. We show that our scheme is efficient in terms of computational complexity compared with the existing related proxy signature schemes based on hierarchical access control. Our scheme also supports the addition and deletion of security classes in the hierarchy. We show through security analysis that our scheme is secure against possible attacks. Furthermore, through formal security analysis using the AVISPA (Automated Validation of Internet Security Protocols and Applications) tool, we show that our scheme is also secure against passive and active attacks.
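The hierarchical key idea can be illustrated with a simple top-down derivation: a one-way hash lets a user of a higher security class recompute the class keys of all classes beneath it, while derivation in the other direction is infeasible. This is a generic sketch under that assumption, not the paper's actual construction.

```python
# Generic sketch of top-down key derivation in a class hierarchy;
# the derivation rule is an illustrative assumption.
import hashlib

def derive_child_key(parent_key: bytes, child_class_id: str) -> bytes:
    """Any holder of parent_key can recompute the child's class key."""
    return hashlib.sha256(parent_key + child_class_id.encode()).digest()

root = b"top-secret-root-key"                      # illustrative root key
k_confidential = derive_child_key(root, "confidential")
k_public = derive_child_key(k_confidential, "public")
# The reverse direction is infeasible: SHA-256 cannot be inverted to
# recover parent_key from a child key.
```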

24 citations


Journal ArticleDOI
TL;DR: A multi-criteria vertical handoff system sensitive to various mobile-terminals' mobility parameters including distance and velocity in a heterogeneous wireless network is analytically formulated and validated via simulations to estimate the essential handoff parameters.
Abstract: A multi-criteria vertical handoff system sensitive to various mobile-terminal mobility parameters, including distance and velocity, in a heterogeneous wireless network is analytically formulated and validated via simulations. It estimates the essential handoff parameters, including outage probability, residual capacity, signal to interference and noise threshold, and network access cost. In order to avoid the ping-pong effect in handoff, a signal evolution prediction system is formulated and its performance is examined. Moreover, the handoff scheme is triggered using an online handoff-initiation-time estimation scheme. When initiated, the handoff procedure begins with a network scoring system based on a multi-attribute strategy, which results in the selection of potentially promising network parameters. Simulation results are shown to track the analytical formulations well.
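A minimal sketch of a multi-attribute scoring step of this kind is given below: each candidate network's criteria are normalized and combined in a weighted sum, with cost-type criteria inverted. The attribute names and weights are illustrative assumptions, not the paper's formulation.

```python
# Sketch of multi-attribute network scoring for vertical handoff.
# Criteria names, weights, and data are illustrative assumptions.

# benefit=True means larger is better (e.g., residual capacity);
# benefit=False means smaller is better (e.g., outage probability, cost).
CRITERIA = {"residual_capacity": (0.4, True),
            "outage_probability": (0.3, False),
            "access_cost": (0.3, False)}

def score(network, bounds):
    total = 0.0
    for name, (weight, benefit) in CRITERIA.items():
        lo, hi = bounds[name]
        norm = (network[name] - lo) / (hi - lo or 1.0)  # min-max normalize
        total += weight * (norm if benefit else 1.0 - norm)
    return total

nets = [{"residual_capacity": 20, "outage_probability": 0.05, "access_cost": 3},
        {"residual_capacity": 8,  "outage_probability": 0.01, "access_cost": 1}]
bounds = {k: (min(n[k] for n in nets), max(n[k] for n in nets))
          for k in CRITERIA}
best = max(nets, key=lambda n: score(n, bounds))  # handoff target
```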

17 citations


Journal ArticleDOI
TL;DR: In the proposed system, tags are scattered throughout a mobile robot's environment in a constrained random pattern and are treated as landmarks; the experimental results demonstrate the efficiency of the proposed system.
Abstract: Radio Frequency Identification (RFID) technology is broadly deployed for improving trade and transactions. An RFID tag can identify the region (position) where it resides; thus, a popular trend among researchers is to deploy RFID technology for mobile robot localization. Because the intensities of signals at adjacent regions are similar to each other, it is a challenge to employ an RFID system as a sensor. In this proposed system, tags are scattered throughout a mobile robot's environment in a constrained random pattern and are treated as landmarks. An RFID receiver is mounted on a mobile robot that can navigate such an environment. The robot senses all landmarks in the vicinity to acquire the IDs and received signal strength indicator (RSSI) measurements of the scattered tags. The robot can locate itself depending on the classification result provided by a feed-forward back-propagation artificial neural network (BPANN) supplied with a set of all RSSI measurements read by this robot at a specific location. To be acceptable, this set should only have one high RSSI measurement. The robot senses the location information from a high-valued RSSI tag and adds it to a list of tag IDs along with the corresponding location information. The robot can use this information to travel between any two identified locations. The experimental results demonstrate the efficiency of this proposed system.
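The classification step might look like the sketch below, with scikit-learn's MLPClassifier standing in for the paper's feed-forward back-propagation network; the RSSI vectors and location labels are synthetic and illustrative.

```python
# Sketch: a feed-forward network trained by back-propagation maps a
# vector of tag RSSI readings to a location label. Data is synthetic.
from sklearn.neural_network import MLPClassifier

# Each row: RSSI readings (dBm) for the 4 tags in range (0 = not heard).
X = [[-40, -75, -80, 0], [-42, -70, -78, 0],   # near tag 1
     [-80, -41, 0, -77], [-78, -45, 0, -74],   # near tag 2
     [0, -79, -43, -72], [0, -76, -40, -70]]   # near tag 3
y = ["loc_A", "loc_A", "loc_B", "loc_B", "loc_C", "loc_C"]

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
print(clf.predict([[-41, -74, -81, 0]]))       # -> ['loc_A']
```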

Journal ArticleDOI
TL;DR: The main goals of this paper are to provide a conceptual framework that can help implement both outsourcing and reversibility projects successfully and to integrate the outsourcing phase with the pre- and post-outsourcing phases.
Abstract: Outsourcing information systems services is considered a strategic decision for many organizations because it is a risky endeavor. When issues arise during the outsourcing process, many organizations tend to switch their operations from external vendors back in-house, i.e., implement reversibility or back-sourcing. There is evidence of sufficient scale to warrant further attention to the reversibility process due to the increasing failure of outsourcing projects. One of the main goals of this paper is to provide a conceptual framework that can help implement both outsourcing and reversibility projects successfully. Moreover, beyond the risks associated with the outsourcing process, most research focuses on the outsourcing process after the relationship between the vendor and the organization is established, while the activities related to the pre-outsourcing and post-outsourcing stages are neglected or given little attention. Another objective of this work is therefore to integrate the outsourcing phase with the pre- and post-outsourcing phases. This paper also aims to identify the critical factors affecting the outsourcing and reversibility processes, thereby dealing with outsourcing risks from the beginning rather than as an afterthought.

Journal ArticleDOI
TL;DR: Results show the relevance of rhythm metrics to distinguish healthy speech from dysarthrias and to discriminate the levels of dysarthria severity.
Abstract: This paper reports the results of an acoustic investigation based on rhythmic classifications of speech, using duration measurements, carried out to distinguish dysarthric speech from healthy speech. The Nemours database of American dysarthric speakers is used throughout the experiments conducted for this study. The speakers are eleven young adult males with dysarthria caused by cerebral palsy (CP) or head trauma (HT) and one non-dysarthric adult male. Eight different sentences for each speaker were segmented manually into vocalic and intervocalic intervals (176 sentences). Seventy-four different sentences for each speaker were automatically segmented into voiced and non-voiced intervals (1628 sentences). A two-parameter classification related to rhythm metrics was used to determine the most relevant measures, investigated through bi-dimensional representations. Results show the relevance of rhythm metrics for distinguishing healthy speech from dysarthria and for discriminating the levels of dysarthria severity. The majority of parameters were more than 54% successful in classifying speech into its appropriate group (90% for the dysarthric patient classification in the feature space (%V, ΔV)). The results were not significant for voiced and unvoiced intervals relative to the vocalic and intervocalic intervals (the highest recognition rates were 62.98% and 90.30% for dysarthric patient and healthy control classification, respectively, in the feature space (ΔDNV, %DV)).
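For context, the two metrics behind the best-performing feature space (%V, ΔV) can be computed as below: %V is the proportion of total duration that is vocalic, and ΔV is the standard deviation of vocalic interval durations. The durations shown are invented for illustration.

```python
# Sketch of the standard rhythm metrics %V and ΔV from interval
# durations (in seconds); the durations below are illustrative.
import statistics

def rhythm_metrics(vocalic, intervocalic):
    total = sum(vocalic) + sum(intervocalic)
    percent_v = 100.0 * sum(vocalic) / total    # %V
    delta_v = statistics.pstdev(vocalic)        # ΔV
    return percent_v, delta_v

vocalic = [0.12, 0.09, 0.20, 0.15]        # vocalic interval durations
intervocalic = [0.08, 0.11, 0.07]         # intervocalic interval durations
print(rhythm_metrics(vocalic, intervocalic))
```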

Journal ArticleDOI
TL;DR: This investigation has shown that the decoding complexity of higher M-values can be overcome at moderate N-values, while the robustness performance is maintained at a satisfactory level.
Abstract: Nowadays, the quantization index modulation (QIM) principle is popular in digital watermarking due to its considerable performance advantages over spread-spectrum and low-bit(s) modulation. In a QIM-based data-hiding scheme, it is a challenging task to embed multiple bits of information into the host signal. This work proposes a new model of QIM, i.e., the M-ary amplitude modulation principle, for multibit watermarking. The watermark embedding process may be divided into two phases. In the first phase, a binary watermark image is spatially dispersed using a sequence of numbers generated by a secret key. In the second phase, the host image is decomposed by lifting, and the encoded watermark bits are embedded into the high-low (HL) and low-high (LH) subbands of the DWT coefficients using M-ary amplitude modulation. The simulation results show that robustness increases, at the cost of increased decoding complexity, for a high M-value. Furthermore, this investigation has shown that the decoding complexity of higher M-values can be overcome at moderate N-values, while the robustness performance is maintained at a satisfactory level.
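The binary QIM principle that the M-ary scheme generalizes can be sketched as follows: a coefficient is mapped to one of two interleaved quantization lattices depending on the bit to embed, and decoding picks the nearer lattice. The step size is illustrative.

```python
# Sketch of basic (binary) QIM embedding and decoding; the M-ary
# scheme generalizes this to M lattices. DELTA is illustrative.
import numpy as np

DELTA = 8.0   # quantization step

def qim_embed(coeff, bit):
    """Quantize coeff to the lattice associated with bit (0 or 1)."""
    offset = bit * DELTA / 2.0
    return np.round((coeff - offset) / DELTA) * DELTA + offset

def qim_decode(coeff):
    """Pick the bit whose lattice point is nearest to coeff."""
    return min((0, 1), key=lambda b: abs(qim_embed(coeff, b) - coeff))

c = 37.3
watermarked = qim_embed(c, 1)    # -> 36.0
print(qim_decode(watermarked))   # -> 1 (robust to small distortions)
```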

Journal ArticleDOI
TL;DR: A cookie-based accounting model is proposed, which records each client request in the cookie and the hash value of the cookie in the server database, to detect client misbehavior such as modifying the cookie information or resending (replaying) a prior request cookie with the current request.
Abstract: The Denial of Service (DoS) attack is a major issue in the web service environment, especially in critical infrastructures like government websites. It is the easiest attack for adversaries to mount: they continuously generate duplicate requests with little effort to reduce the availability of server resources to others. To detect and prevent this type of duplicate request attack, accounting for the client history (i.e., client request details) is very important. This paper proposes a cookie-based accounting model, which records each client request in the cookie and the hash value of the cookie in the server database, to detect client misbehavior such as modifying the cookie information or resending (replaying) a prior request cookie with the current request. This paper also analyzes all the accounting models, including the proposed one, with respect to qualitative and quantitative results to demonstrate the proposed model's efficiency. The proposed model achieves more than 56% efficiency compared with the next most efficient existing model.
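The verification idea can be sketched with a keyed hash standing in for the paper's hash value: the server stores a digest of each cookie it issues, so a modified cookie fails verification, and once a newer cookie has been issued, replaying an old one no longer matches the stored digest. The key, fields, and storage here are illustrative assumptions.

```python
# Sketch of cookie accounting with server-side digests; key, cookie
# fields, and the in-memory store are illustrative assumptions.
import hmac, hashlib, time

SERVER_KEY = b"server-secret"            # illustrative secret
issued = {}                              # client_id -> latest digest

def issue_cookie(client_id, request_count):
    cookie = f"{client_id}|{request_count}|{time.time()}"
    issued[client_id] = hmac.new(SERVER_KEY, cookie.encode(),
                                 hashlib.sha256).hexdigest()
    return cookie

def verify_cookie(client_id, cookie):
    digest = hmac.new(SERVER_KEY, cookie.encode(),
                      hashlib.sha256).hexdigest()
    # mismatch => tampered cookie; stale digest => replayed old cookie
    return hmac.compare_digest(digest, issued.get(client_id, ""))

c = issue_cookie("client-42", 1)
print(verify_cookie("client-42", c))            # True
print(verify_cookie("client-42", c + "x"))      # False (modified cookie)
```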

Journal ArticleDOI
TL;DR: A VPN network simulation model is built using the MPLS protocol and based on an existing network to evaluate the provision of end-to-end QoS requirements for various traffic types.
Abstract: In this paper, a VPN network simulation model is built using the MPLS protocol and based on an existing network. Various queueing policies are implemented to evaluate the provision of end-to-end QoS requirements for various traffic types. Input traffic based on real data is used. After a thorough analysis of the policies, the merits and shortcomings of each policy are determined, and recommendations are given along with future research directions.

Journal ArticleDOI
TL;DR: A new fuzzy preference degree between two triangular fuzzy numbers is introduced and a new approach is prescribed to solve the problem of supplier selection using this preference degree.
Abstract: As competition grows in this globalized world, companies are placing more and more importance on the process of supplier selection. Since the advent of fuzzy logic, the problem of supplier selection has been treated from the viewpoint of uncertainty. The present work reviews and classifies different approaches to this problem. A new fuzzy preference degree between two triangular fuzzy numbers is introduced, and a new approach using this preference degree is prescribed to solve the problem. The weights of the decision makers are considered, and a methodology is proposed to determine these weights. Moreover, a unique process for classifying the suppliers into different groups is proposed. The methodologies are exemplified by a suitable case study.
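Since the paper's preference degree is new, the sketch below instead shows one standard possibility degree between triangular fuzzy numbers (Chang's formulation) as a baseline for what such a comparison operator computes; the supplier scores are invented.

```python
# One standard possibility degree between triangular fuzzy numbers
# (Chang's formulation) -- a baseline, not the paper's new degree.

def possibility_geq(m2, m1):
    """Degree to which TFN m2 = (l2, mid2, u2) >= m1 = (l1, mid1, u1)."""
    l1, mid1, u1 = m1
    l2, mid2, u2 = m2
    if mid2 >= mid1:
        return 1.0
    if l1 >= u2:
        return 0.0
    return (l1 - u2) / ((mid2 - u2) - (mid1 - l1))

supplier_a = (3.0, 5.0, 7.0)   # e.g., fuzzy score "medium"
supplier_b = (4.0, 6.0, 8.0)   # e.g., fuzzy score "medium-high"
print(possibility_geq(supplier_a, supplier_b))   # -> 0.75, partial preference
```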

Journal ArticleDOI
TL;DR: This paper proposes a framework based on smart card that allows partners to realize secure transactions and use smart cards to store keys and perform cryptographic algorithms.
Abstract: In recent years, we have seen the emergence and growth of e-business via the internet. Many organizations are extending their business transactions by using the Web. This allows them to reach more customers in a cost-effective way and to make their business transactions fast and efficient. Meanwhile, sending sensitive information via the Web must satisfy integrity, privacy, authentication and non-repudiation. Organizations are implementing various infrastructures that allow them to have secure e-business transactions. Many protocols and frameworks have been proposed and implemented to provide secure and trusted exchange between the parties involved in a transaction. These frameworks store credentials such as keys on local computers, which makes them subject to piracy or misuse. In this paper, we propose a framework based on smart cards that allows partners to carry out secure transactions. The proposed solution uses smart cards to store keys and perform cryptographic algorithms.

Journal ArticleDOI
TL;DR: This work develops a method that detects feature interactions between telecommunication services modeled in the Cause-Restrict language and demonstrates the applicability of the approach by modeling several services and detecting several feature interactions between them.
Abstract: When several telecommunication services are running at the same time, undesirable behaviors may arise, which are commonly called feature interactions. Several methods have been developed for detecting and resolving feature interactions. However, most of these methods are based on detailed models of services, which makes them suffer from state space explosion. Moreover, different telecommunication operators cannot cooperate to manage feature interactions by exchanging detailed service models, because this violates the confidentiality principle. Our work is one of the few attempts to develop feature interaction detection methods that avoid or significantly reduce state space explosion. To reach this objective, we first develop a so-called Cause-Restrict language to model subscribers of telecommunication services at a very high abstraction level. A Cause-Restrict model of a subscriber provides information such as what causes what and what restricts (or forbids) what, and coarsely specifies the frequency of each "cause" or "restrict" operation as "always" or "sometimes". Then, we develop a method that detects feature interactions between telecommunication services modeled in the Cause-Restrict language. We demonstrate the applicability of our approach by modeling several services and detecting several feature interactions between them. New feature interactions have been detected by our approach.
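At this level of abstraction, detection can be sketched as a cross-check of cause and restrict rules: two services interact when one causes an event the other restricts. The rule encoding and example services below are illustrative inventions, not the paper's Cause-Restrict syntax.

```python
# Illustrative sketch of interaction detection over high-level
# cause/restrict rules; the encoding and services are invented.

call_forwarding = {"causes": {("incoming_call", "forward_to_B", "always")},
                   "restricts": set()}
do_not_disturb = {"causes": set(),
                  "restricts": {("incoming_call", "ring", "always"),
                                ("incoming_call", "forward_to_B", "sometimes")}}

def detect_interactions(s1, s2):
    """Report events that s1 causes and s2 restricts."""
    conflicts = []
    for trigger, effect, freq_cause in s1["causes"]:
        for trigger2, effect2, freq_restrict in s2["restricts"]:
            if (trigger, effect) == (trigger2, effect2):
                conflicts.append((trigger, effect, freq_cause, freq_restrict))
    return conflicts

print(detect_interactions(call_forwarding, do_not_disturb))
# -> [('incoming_call', 'forward_to_B', 'always', 'sometimes')]
```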

Journal ArticleDOI
TL;DR: This paper adds an OID compression algorithm to Net-SNMP, one of the popular open source implementations of the SNMP framework, and investigates OID compression as a viable feature for SNMP libraries.
Abstract: Simple network management protocol (SNMP) object identifier (OID) compression can improve bandwidth usage and response time. The current literature includes several OID compression algorithms that reduce redundancy in SNMP protocol data units (PDUs). However, the overhead of OID compression could outweigh the benefits it offers if its tradeoffs are not well understood. The main objective of this paper is to investigate OID compression as a viable feature for SNMP libraries. This is done by adding an OID compression algorithm to Net-SNMP, one of the popular open source implementations of the SNMP framework. The change in image size, the lines of code added, the complexity of the compression code, the effect of compression on response time, and the testing effort required are the parameters presented to assess the viability of OID compression.
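The flavor of such an algorithm can be sketched as simple prefix-delta encoding: each OID in a PDU is replaced by the length of the prefix it shares with the previous OID plus the differing tail. This is an illustrative scheme, not the specific algorithm added to Net-SNMP.

```python
# Illustrative prefix-delta OID compression; the encoding is an
# assumption, not Net-SNMP's actual wire format.

def compress(oids):
    prev, out = [], []
    for oid in oids:
        parts = [int(x) for x in oid.split(".") if x]
        shared = 0
        while (shared < min(len(prev), len(parts))
               and prev[shared] == parts[shared]):
            shared += 1
        out.append((shared, parts[shared:]))   # (shared prefix length, tail)
        prev = parts
    return out

oids = ["1.3.6.1.2.1.2.2.1.10.1",
        "1.3.6.1.2.1.2.2.1.10.2",
        "1.3.6.1.2.1.2.2.1.16.1"]
print(compress(oids))
# -> [(0, [1, 3, 6, 1, 2, 1, 2, 2, 1, 10, 1]), (10, [2]), (9, [16, 1])]
```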