
Showing papers presented at the "International Conference on Innovations in Information Technology" in 2014


Proceedings ArticleDOI
18 Dec 2014
TL;DR: This work proposes a new protocol, called LEACH-CKM, which uses the K-means classification method to group the nodes and Minimum Transmission Energy routing protocol (MTE) to route information from remote nodes.
Abstract: The requirement for operation with low energy consumption is one of the major constraints that guide the design of routing protocols in Wireless Sensor Networks (WSN), because each node is powered by a limited and generally irreplaceable source of energy. In this context, we propose a new protocol that respects the energy constraints, the nodes' lifetime and packet reception. This work is based on the LEACH-C protocol (Centralized Low Energy Adaptive Clustering Hierarchy), which uses a centralized architecture to select cluster-heads by involving the base station and the sensors' local information. It addresses the isolation problem of remote nodes that fail to communicate their information (coordinates and residual energies) to the base station. This results in inefficient group formation, which leads to a huge loss of data and therefore degrades the network's performance. To address this, we propose a solution that takes the remote nodes into account during the formation of groups. The new protocol, called LEACH-CKM, uses the K-means classification method to group the nodes and the Minimum Transmission Energy (MTE) routing protocol to route information from remote nodes.
Keywords—WSN; Routing Protocol; Energy Optimization; Clustering.
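The clustering step described above can be pictured with a short sketch. The Python snippet below is only illustrative: the field size, the number of clusters K and the "remote node" distance threshold are assumptions for the example, not values from the paper.

# Minimal sketch of a K-means-based grouping step for WSN nodes,
# loosely following the clustering idea described in the abstract.
# Node coordinates, K and the "remote node" distance threshold are
# illustrative assumptions, not values from the paper.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
nodes = rng.uniform(0, 100, size=(50, 2))     # (x, y) positions of 50 sensor nodes
base_station = np.array([50.0, 175.0])        # base station placed outside the field

K = 5
kmeans = KMeans(n_clusters=K, n_init=10, random_state=0).fit(nodes)
labels, centroids = kmeans.labels_, kmeans.cluster_centers_

# Treat nodes far from their cluster centroid as "remote": instead of reaching
# the base station directly, they would be routed hop-by-hop (MTE-style).
REMOTE_THRESHOLD = 30.0                       # hypothetical distance threshold
dist_to_centroid = np.linalg.norm(nodes - centroids[labels], axis=1)
remote_nodes = np.where(dist_to_centroid > REMOTE_THRESHOLD)[0]

print(f"cluster sizes: {np.bincount(labels)}")
print(f"remote nodes routed via MTE: {remote_nodes.tolist()}")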

33 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: A security system for QR codes that addresses the security concerns of both users and code generators is implemented; the system is backward compatible with the current standard used for encoding QR codes.
Abstract: Quick Response (QR) codes are two-dimensional barcodes that can be used to efficiently store a small amount of data. They are increasingly used in all areas of life, especially with the widespread use of smartphones, which serve as QR code scanners. While QR codes have many advantages that make them very popular, several security issues and risks are associated with them. Running malicious code, stealing users' sensitive information, violating their privacy and identity theft are some typical security risks that users might be subject to in the background while they are simply reading the QR code in the foreground. In this paper, a security system for QR codes that addresses the security concerns of both users and code generators is implemented. The system is backward compatible with the current standard used for encoding QR codes. The system is implemented and tested using an Android-based smartphone application. It was found that the system introduces only a small overhead in terms of the delay required for integrity verification and content validation.
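The abstract does not spell out the verification scheme itself; the sketch below shows one common way to add integrity checking to a QR payload while staying readable by legacy scanners, which is in the same spirit as the backward compatibility claimed above. The shared key, separator and tag length are assumptions for illustration only, not the paper's design.

# Illustrative sketch (not the paper's actual scheme): one common way to add
# integrity verification to a QR payload while staying backward compatible is
# to append an authentication tag that legacy readers simply ignore.
# The shared key, separator and tag length are assumptions for the example.
import hmac, hashlib, base64

SECRET_KEY = b"demo-shared-key"      # hypothetical key known to generator/validator
SEPARATOR = "#sig="

def sign_payload(payload: str) -> str:
    tag = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).digest()
    return payload + SEPARATOR + base64.urlsafe_b64encode(tag[:12]).decode()

def verify_payload(qr_text: str) -> tuple[str, bool]:
    if SEPARATOR not in qr_text:                 # plain QR code: no tag to check
        return qr_text, False
    payload, tag_b64 = qr_text.rsplit(SEPARATOR, 1)
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).digest()[:12]
    ok = hmac.compare_digest(base64.urlsafe_b64encode(expected).decode(), tag_b64)
    return payload, ok

signed = sign_payload("https://example.com/offer")
print(signed)
print(verify_payload(signed))        # ('https://example.com/offer', True)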

26 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: Results show that security static analysis tools are, to some extent, effective in detecting security holes in source code; source code analyzers are able to detect more weaknesses than bytecode and binary code scanners; and while tools can assist the development team in security code review activities, they are not enough to uncover all common weaknesses in software.
Abstract: Security has always been treated as an add-on feature in the software development lifecycle, and addressed by security professionals using firewalls, proxies, intrusion prevention systems, antivirus and platform security. Software is at the root of all common computer security problems, and hence hackers don't create security holes but rather exploit them. Security holes in software applications are the result of bad design and implementation of software systems and applications. To address this problem, several initiatives for integrating security into the software development lifecycle have been proposed, along with tools to support a security-centric software development lifecycle. This paper introduces a framework for evaluating security static analysis tools such as source code analyzers, and offers an evaluation of non-commercial static analysis tools such as Yasca, CAT.NET, and FindBugs. In order to evaluate the effectiveness of such tools, common software weaknesses are defined based on the CWE/SANS Top 25, the OWASP Top Ten and NIST source code weaknesses. The evaluation methodology is based on the NIST Software Assurance Metrics And Tool Evaluation (SAMATE) project. Results show that security static analysis tools are, to some extent, effective in detecting security holes in source code; source code analyzers are able to detect more weaknesses than bytecode and binary code scanners; and while tools can assist the development team in security code review activities, they are not enough to uncover all common weaknesses in software. The new test cases developed for this research have been contributed to the NIST Software Assurance Reference Dataset (samate.nist.gov/SARD).
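Scoring a tool against a labeled test suite comes down to comparing reported findings with known weaknesses. The toy sketch below illustrates that bookkeeping; the file names, line numbers and CWE IDs are invented and are not taken from the SAMATE test cases.

# Minimal sketch of how a tool's findings can be scored against a labeled test
# suite of known weaknesses (in the spirit of a SAMATE-style evaluation).
# The data structures and CWE IDs below are illustrative, not from the paper.

# Known weaknesses in the test cases: (file, line, CWE id)
ground_truth = {
    ("login.php", 42, "CWE-89"),     # SQL injection
    ("profile.php", 17, "CWE-79"),   # XSS
    ("upload.php", 88, "CWE-434"),   # unrestricted upload
}

# Findings reported by the tool under evaluation
tool_findings = {
    ("login.php", 42, "CWE-89"),
    ("profile.php", 17, "CWE-79"),
    ("index.php", 5, "CWE-79"),      # spurious report
}

true_pos = ground_truth & tool_findings
false_pos = tool_findings - ground_truth
false_neg = ground_truth - tool_findings

precision = len(true_pos) / len(tool_findings)
recall = len(true_pos) / len(ground_truth)
print(f"precision={precision:.2f} recall={recall:.2f} "
      f"missed={sorted(false_neg)} spurious={sorted(false_pos)}")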

21 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: A new adaptive ECG QRS detection ASIC based on the Pan-Tompkins algorithm, modified to detect a large number of different QRS complex morphologies using two adaptive thresholds.
Abstract: Electrocardiogram analysis is an important tool in the management of cardiac diseases, and the QRS complex is the main reference in such analysis. The paper presents a new adaptive ECG QRS detection ASIC based on the Pan-Tompkins algorithm. The algorithm has been modified to detect a large number of different QRS complex morphologies using two adaptive thresholds. The dedicated ASIC architecture is based on state-of-the-art 65-nm CMOS technology and achieved a total core area of 0.073426 mm² and a power consumption of 0.55105 μW. The QRS detector was tested on ECG records from the Physionet MIT-BIH database and obtained a sensitivity of Se = 99.83% and a positive predictivity of P+ = 98.65%.
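For readers unfamiliar with the two-threshold idea, the software sketch below shows a much-simplified Pan-Tompkins-style decision stage: a feature signal (derivative, squaring, moving-window integration) and two running estimates of the signal and noise peak levels that adapt as beats are accepted or rejected. The sampling rate, window lengths and update coefficients are textbook-style assumptions and do not describe the ASIC.

# Simplified software sketch of the two-adaptive-threshold peak decision used
# in Pan-Tompkins-style QRS detectors; the filtering chain is reduced to
# derivative -> squaring -> moving-window integration. Sampling rate, window
# length and update coefficients are illustrative, not the ASIC's.
import numpy as np

def detect_qrs(ecg: np.ndarray, fs: int = 360) -> list[int]:
    deriv = np.diff(ecg, prepend=ecg[0])
    feat = np.convolve(deriv ** 2, np.ones(int(0.15 * fs)) / (0.15 * fs), mode="same")

    spk = float(feat.max()) * 0.25   # running estimate of signal peak level
    npk = float(feat.mean())         # running estimate of noise peak level
    refractory = int(0.2 * fs)       # 200 ms refractory period
    beats, last = [], -refractory

    for i in range(1, len(feat) - 1):
        if not (feat[i] > feat[i - 1] and feat[i] >= feat[i + 1]):
            continue                               # not a local maximum
        thr1 = npk + 0.25 * (spk - npk)            # primary threshold
        thr2 = 0.5 * thr1                          # lower search-back threshold (unused here)
        if feat[i] > thr1 and i - last > refractory:
            beats.append(i)
            last = i
            spk = 0.125 * feat[i] + 0.875 * spk    # adapt signal level
        else:
            npk = 0.125 * feat[i] + 0.875 * npk    # adapt noise level
    return beats

# Synthetic test: impulses every second on a noisy baseline
fs = 360
sig = 0.05 * np.random.default_rng(1).standard_normal(10 * fs)
sig[::fs] += 1.0
print(detect_qrs(sig, fs))   # roughly one detection per second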

18 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper describes the major techniques used for positioning in Wi-Fi networks, and investigates a fingerprinting method constrained by distance information to improve positioning accuracy.
Abstract: The wide deployment of Wi-Fi networks enables numerous applications such as Wi-Fi positioning, Location Based Services (LBS), wireless intrusion detection and real-time tracking. Many techniques are used to estimate a Wi-Fi client's position. Some of them are based on the Time or Angle of Arrival (ToA or AoA), while others use signal power measurements and fingerprinting. All these techniques require the reception of multiple wireless signals to provide enough data for solving the localization problem. In this paper, we describe the major techniques used for positioning in Wi-Fi networks. Real experiments are conducted to compare the accuracy of methods that use signal power measurements and Received Signal Strength Indication (RSSI) fingerprinting to estimate the client position. Moreover, we investigate a fingerprinting method constrained by distance information to improve positioning accuracy. Localization techniques are more accurate when the estimated client positions are closer to the real geographical positions. Accuracy improvements increase user satisfaction and make localization services more robust and efficient.
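A distance-constrained fingerprint match of the kind investigated here can be sketched in a few lines. The radio map, the measured RSSI vector and the constraint radius below are invented example values, not the paper's experimental data.

# Sketch of RSSI fingerprint matching with a simple distance constraint,
# illustrating the kind of method compared in the paper. The radio map,
# RSSI vectors and the constraint radius are made-up example values.
import numpy as np

# Offline radio map: known (x, y) positions and the RSSI vector (dBm) measured
# from three access points at each position.
positions = np.array([[0, 0], [0, 5], [5, 0], [5, 5], [10, 5]], dtype=float)
fingerprints = np.array([
    [-40, -70, -80],
    [-50, -60, -75],
    [-55, -72, -60],
    [-62, -58, -55],
    [-75, -52, -50],
], dtype=float)

def locate(rssi, coarse_estimate=None, radius=6.0, k=2):
    """k-nearest-neighbour fingerprint match, optionally constrained to
    reference points within `radius` metres of a coarse distance-based estimate."""
    cand = np.arange(len(positions))
    if coarse_estimate is not None:
        cand = cand[np.linalg.norm(positions - coarse_estimate, axis=1) <= radius]
    d = np.linalg.norm(fingerprints[cand] - rssi, axis=1)   # signal-space distance
    nearest = cand[np.argsort(d)[:k]]
    return positions[nearest].mean(axis=0)                  # centroid of k matches

measured = np.array([-58, -61, -58.0])
print("unconstrained:", locate(measured))
print("constrained:  ", locate(measured, coarse_estimate=np.array([5.0, 4.0])))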

15 citations


Proceedings ArticleDOI
01 Nov 2014
TL;DR: This paper addresses the problem of mobile phishing via the implementation of a Trojan that commits phishing through the mobile's pre-installed applications, which are naturally trusted.
Abstract: In this paper, we address the problem of mobile phishing via the implementation of a Trojan that commits phishing through the mobile's pre-installed applications, which are naturally trusted. It utilizes task interception along with the lack of identity indicators, and it overrides the default behavior of some functions to carry out the attack. We also study the impact of this Trojan on the device's performance. Finally, we propose some security enhancements that do not rely on the human factor, categorized as Operating System (OS)-level solutions and Secure Sockets Layer (SSL) solutions.

11 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: A data mining model is developed for healthcare services demand planning, and four models are built to assist decision makers in predicting the demand for healthcare services in Abu Dhabi Emirate.
Abstract: The application of data mining techniques provides a powerful approach to manipulate and extract useful information from existing data. It allows information hidden in the data to be learned and used for future predictions. Accurate demand forecasting for any service is a challenging research problem, and data mining offers several techniques that can support demand prediction. Estimating the demand for healthcare services driven by rapid population growth is essential for the strategic planning of Abu Dhabi Emirate. Future plans should focus on districts that have needs, whether for short- or long-term plans. The objective of this paper is to develop a data mining model that supports healthcare services demand planning. We use different data mining techniques to build four models that assist decision makers in predicting the demand for healthcare services in Abu Dhabi Emirate.
Keywords—Data mining; healthcare services demand; service's supply; K-Nearest-Neighbor; sequential minimal optimization (SMO).
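One of the techniques named in the keywords, K-Nearest-Neighbor, can be illustrated on a toy demand-prediction task. The features (district population, elderly share) and all numbers below are invented for illustration and do not reflect the paper's dataset or models.

# Minimal sketch of one model type named in the keywords (K-Nearest-Neighbor)
# applied to a toy demand-prediction task. The feature set and numbers are
# invented; the paper's actual data and features are not shown here.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Toy training data: [population in thousands, share of residents over 60]
X = np.array([[120, 0.05], [300, 0.08], [80, 0.12], [450, 0.06], [200, 0.10]])
y = np.array([15, 40, 14, 55, 30])   # e.g. annual clinic visits (thousands)

model = KNeighborsRegressor(n_neighbors=3).fit(X, y)
new_district = np.array([[250, 0.09]])
print(f"predicted demand: {model.predict(new_district)[0]:.1f} thousand visits")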

11 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper presents a new aggregation function for textual data based on the affinity between keywords, which uses the search for cycles in a graph to find the aggregated keywords.
Abstract: In the last decade, Online Analytical Processing (OLAP) has taken an increasingly important role in Business Intelligence. Approaches, solutions and tools have been provided for both databases and data warehouses, focusing mainly on numerical data. These solutions are not suitable for textual data. Because of the fast growth of this type of data, there is a need for new approaches that take the textual content of data into account. In the context of Text OLAP (OLAP on text or documents), the measure can be textual and needs a suitable aggregation function for OLAP operations such as roll-up. We present in this paper a new aggregation function for textual data. Our approach is based on the affinity between keywords and uses the search for cycles in a graph to find the aggregated keywords. We also present performance results and a comparison with three other methods. The experimental study shows good results for our approach.

10 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: A random forest was used to develop a classification model that could be applied to predict which insurance policies would likely be chosen by customers, and its performance was compared to several data mining techniques.
Abstract: Data mining has recently been used in the field of car insurance to help insurance companies predict customers' choices in order to provide more competitive services. In this work, a random forest was used to develop a classification model that could be applied to predict which insurance policies would likely be chosen by customers. The performance of the developed model was compared to several data mining techniques, such as the ZeroR classifier, Simple Logistic function, Decision Tree and Naive Bayes, on a dataset containing 7 different policies. The results showed that the random forest was the most precise technique, with an overall accuracy of 97.9%.
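The kind of comparison reported here is easy to sketch with scikit-learn. The snippet uses a synthetic 7-class dataset as a stand-in for the insurance data, so its accuracies are purely illustrative and unrelated to the paper's 97.9% figure.

# Hedged sketch of comparing a random forest with simpler baselines on a
# multi-class policy-choice task, mirroring the type of comparison in the
# abstract. The synthetic dataset stands in for the paper's real data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.dummy import DummyClassifier           # plays the role of ZeroR
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=2000, n_features=12, n_informative=8,
                           n_classes=7, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "ZeroR (majority)": DummyClassifier(strategy="most_frequent"),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(f"{name:18s} accuracy = {acc:.3f}")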

9 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper proposes an abstract KMS development methodology which alleviates the weaknesses of existing methodologies while reusing their strengths, thus producing bespoke methodologies which are best suited to organizational needs.
Abstract: Powerful organizations are those that manage their power factors efficiently; organizational resources are considered vital power factors, and Knowledge is one of the most important resources to manage. There is no universally accepted Knowledge Management (KM) process, but it is known that establishing the appropriate knowledge flows in the organization is the main goal of organizational KM. A Knowledge Management System (KMS) is an information system which supports the KM process, mainly by providing the required knowledge and enhancing its flow. Organizations increasingly feel the need for appropriate methodologies for developing their target KMSs. However, existing KMS development methodologies are not comprehensive enough to satisfy all organizational needs. In this paper, we propose an abstract KMS development methodology which alleviates the weaknesses of existing methodologies while reusing their strengths. Method engineers can develop concrete methodologies by instantiating the proposed abstract methodology and adding the necessary detail, thus producing bespoke methodologies which are best suited to organizational needs.

9 citations


Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper discusses a technology that is expected to spread widely in the near future, the Sixth Sense Technology; the implementation approaches are compared and the pros and cons of each approach are explored.
Abstract: This paper discusses a technology that is expected to spread widely in the near future: the Sixth Sense Technology. With this new technology yet to be introduced to the market, different implementation approaches are discussed. Such approaches are demonstrated through other inventions similar to the Sixth Sense (SS); they all fall into the same category, Augmented Reality technologies. The paper focuses on the possible applications and opportunities of such a technology. Furthermore, the implementation approaches are compared and the pros and cons of each approach are explored. Most importantly, technical challenges and open issues regarding each implementation approach are brought out, and predictions are made on which approach is expected to succeed in the coming years. Finally, solutions are discussed to improve the Sixth Sense Technology with regard to its implementation approaches, to ensure new and better ways of human-computer interaction.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: By exploiting the properties of the information processing, the research is able to better design and integrate information transmission subsystems, with particular regard to the power consumption; by taking into account low-power requirements/constraints, it can develop energy-aware information processing modules.
Abstract: Human robotics is an emerging research field aiming at the conjugation of mechanical and electrical engineering with biology; neuromorphic engineering has shown that it is possible to develop effective systems mimicking key aspects of biological systems, such as spiking (event-driven) information processing and communication, and adaptive and learning behavior. However, most of these solutions are tightly focused on one biological aspect, often aiming to precisely replicate it with a one-to-one mapping between the electronic and biological domains. We believe that, to successfully “port” biological strengths to the engineering world, such a tight relationship has to be loosened, and a new multi-domain co-design approach has to be devised. By exploiting the properties of the information processing, we are able to better design and integrate information transmission subsystems, with particular regard to power consumption; by taking into account low-power requirements and constraints, we can develop energy-aware information processing modules. Designing the whole system in a modular way allows us to easily re-design a full-fledged system for new scenarios and applications. The research also covers the implementation of discrete-component systems, so that integrated circuits or proof-of-concept implementations can be easily validated.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper proposes a taint analysis and defensive programming based HTML context-sensitive approach for precise detection of XSS vulnerability from source code of PHP web applications and provides automatic suggestions to improve the vulnerable source code.
Abstract: Currently, dependence on web applications is increasing rapidly for social communication, health services, financial transactions and many other purposes. Unfortunately, the presence of cross-site scripting (XSS) vulnerabilities in these applications allows malicious users to steal sensitive information, install malware, and perform various malicious operations. Researchers have proposed various approaches and developed tools to detect XSS vulnerabilities in the source code of web applications. However, existing approaches and tools are not free from false positive and false negative results. In this paper, we propose a taint-analysis and defensive-programming based, HTML context-sensitive approach for precise detection of XSS vulnerabilities in the source code of PHP web applications. It also provides automatic suggestions to improve the vulnerable source code. Preliminary experiments and results on test subjects show that the proposed approach is more efficient than existing ones.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: A modified cognitive radio spectrum sharing algorithm based on the Hungarian algorithm is proposed to provide an optimal spectrum sharing solution between the secondary users while guaranteeing the different quality of service requested by each one.
Abstract: Cognitive radio is a promising technology that aims to provide better utilization of the radio spectrum by giving opportunistic access to unlicensed (secondary) users. Spectrum sharing among secondary users is one of the main issues in developing a robust cognitive radio system. In this paper, a modified cognitive radio spectrum sharing algorithm based on the Hungarian algorithm is proposed. The algorithm divides the allocation problem into two cases: the first, when all channels satisfy the minimum quality of service for all secondary users, and the second, when one or more of the channels fails to meet the requirements. The Hungarian algorithm is applied in the first case to provide an optimal spectrum sharing solution among the secondary users while guaranteeing the different quality of service requested by each one. For the second case, an algorithm is proposed to find the optimal spectrum allocation over the set of channels that meet the quality of service requirements of the secondary users. Simulation results are presented to show how robustly the proposed algorithm meets the required quality of service.
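For the first case, the assignment step can be sketched directly with an off-the-shelf Hungarian solver. The capacity matrix below is an invented example; the paper's channel model and QoS constraints are not reproduced here.

# Sketch of the first case described in the abstract: when every channel can
# satisfy every secondary user's minimum QoS, assigning channels to users is a
# standard assignment problem that the Hungarian algorithm solves optimally.
# The capacity matrix below is an invented example, not the paper's model.
import numpy as np
from scipy.optimize import linear_sum_assignment

# capacity[i, j]: achievable rate (Mbps) if secondary user i gets channel j
capacity = np.array([
    [8.0, 5.5, 7.2, 6.1],
    [6.3, 9.1, 4.8, 7.7],
    [7.4, 6.6, 8.9, 5.2],
    [5.9, 7.8, 6.4, 8.3],
])

# The Hungarian algorithm minimises cost, so negate capacity to maximise it.
users, channels = linear_sum_assignment(-capacity)
for u, c in zip(users, channels):
    print(f"user {u} -> channel {c} ({capacity[u, c]:.1f} Mbps)")
print(f"total rate: {capacity[users, channels].sum():.1f} Mbps")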

Proceedings ArticleDOI
18 Dec 2014
TL;DR: The placed and routed ASIC implementation of the background subtraction block achieved a maximum operating frequency of 800 MHz and provides the system with the capability of processing HD video sequences, typically of spatial resolution 1920×1080 pixels, at a potential rate of 385 fps.
Abstract: Background subtraction is an important step for object detection in many video processing systems. This paper presents a low power implementation of a mean-filter based background subtraction block in an ASIC flow using 65-nm CMOS process technology. The placed and routed ASIC implementation of the background subtraction block achieved a maximum operating frequency of 800 MHz. This provides the system with the capability of processing HD video sequences, typically of spatial resolution 1920×1080 pixels, at a potential rate of 385 fps. The background subtraction block occupies a total area of 1533.96 μm² in the 65-nm CMOS process and consumes a low power of 27.88 μW/pixel.
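In software, mean-filter background subtraction reduces to maintaining a running-mean background estimate and thresholding the difference against it, as in the sketch below. The update weight alpha and the threshold are typical illustrative values, not parameters of the ASIC.

# Software sketch of mean-filter background subtraction, the operation the
# ASIC implements in hardware. Alpha and the threshold are illustrative.
import numpy as np

def make_subtractor(first_frame: np.ndarray, alpha: float = 0.05, threshold: int = 25):
    background = first_frame.astype(np.float32)

    def process(frame: np.ndarray) -> np.ndarray:
        nonlocal background
        foreground = np.abs(frame.astype(np.float32) - background) > threshold
        # Update the running-mean estimate of the background
        background = (1 - alpha) * background + alpha * frame
        return foreground

    return process

# Tiny synthetic test: static background with a moving bright block
frames = [np.full((1080 // 8, 1920 // 8), 50, dtype=np.uint8) for _ in range(5)]
for t, f in enumerate(frames[1:], start=1):
    f[10:30, t * 20:t * 20 + 20] = 200      # "object" shifting right each frame

subtract = make_subtractor(frames[0])
for t, f in enumerate(frames[1:], start=1):
    print(f"frame {t}: {int(subtract(f).sum())} foreground pixels")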

Proceedings ArticleDOI
18 Dec 2014
TL;DR: This work adapts existing document quality assessment approaches to Arabic content, with a focus on the Arabic Wikipedia, and aims at general content assessment methods that can be applied to content lacking major Wikipedia features.
Abstract: With the huge size and large diversity of Arabic web content, machine assessment of document quality acquires added importance. Users are in dire need of quality ratings for the material returned in response to their queries. Wikipedia, with its large amount of metadata, has been a topic of extensive research on document quality assessment. Criteria used include text properties and style parameters, contributor and edit characteristics, and multimedia components. In this paper we report on our ongoing work to adapt existing document assessment approaches to Arabic content, with a focus on the Arabic Wikipedia, and present some of the results. We also try to augment these approaches with features specific to Arabic as well as parameters such as author expertise and social media presence. One of our goals is an aggregate measure integrating many of the features into a single document quality index. We plan to use Wikipedia article quality assessment results to train general content assessment methods that can be applied to content that lacks major Wikipedia features.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: The random behavior of the sampling process is handled in the robust Hybrid Normalized Convolution through a random sampling-pixel criterion followed by Forward Error Correction, giving locally optimal results in image reconstruction.
Abstract: Frequency-domain Normalized Convolution (NC) is widely performed on images to retrieve and extract valuable information in noisy and distorted environments. Genetic Normalized Convolution (GNC) is used for feature extraction in an image or feature reconstruction in a distorted image. In this paper a hybrid approach is adopted, in which a robust convolution algorithm based on Normalized Convolution and Genetic Normalized Convolution is implemented and applied to a noisy image to reconstruct the original image, unlike standard Normalized Convolution, where reconstruction is performed only at specific positions. The random behavior of the sampling process is handled in the robust Hybrid Normalized Convolution through a random sampling-pixel criterion followed by Forward Error Correction, which yields locally optimal results in image reconstruction. In the robust Hybrid Normalized Convolution approach, samples are chosen based on their importance, a criterion measured by Phase Congruency and Radial Symmetry algorithms. Finally, robust NC and Forward Error Correction analysis is performed and suggested to improve and simplify the reconstruction while avoiding data losses and storing fewer samples of the image, ultimately reaching a local optimum.
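The plain Normalized Convolution building block the paper starts from can be sketched as a ratio of two convolutions: the certainty-weighted signal and the certainty map, both filtered with an applicability kernel. The Gaussian kernel and the 20% sampling density below are illustrative choices, not the paper's settings, and the genetic/hybrid extension is not shown.

# Minimal sketch of plain Normalized Convolution: a sparsely sampled image is
# reconstructed by convolving the certainty-weighted signal and the certainty
# map with an applicability kernel and taking their ratio.
import numpy as np
from scipy.ndimage import convolve

def normalized_convolution(image: np.ndarray, certainty: np.ndarray,
                           kernel: np.ndarray) -> np.ndarray:
    num = convolve(image * certainty, kernel, mode="mirror")
    den = convolve(certainty, kernel, mode="mirror")
    return num / np.maximum(den, 1e-9)

def gaussian_kernel(size: int = 9, sigma: float = 2.0) -> np.ndarray:
    ax = np.arange(size) - size // 2
    g = np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))
    return g / g.sum()

# Synthetic test: keep only 20% of the pixels of a smooth image and reconstruct
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64))
image = np.sin(4 * x) + np.cos(3 * y)
certainty = (rng.uniform(size=image.shape) < 0.2).astype(float)

recon = normalized_convolution(image, certainty, gaussian_kernel())
print("mean abs error:", float(np.abs(recon - image).mean()))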

Proceedings ArticleDOI
01 Nov 2014
TL;DR: A closed-form of the probability of detection under log-normal shadowing is derived, and the performance of this closed- form in collaborative sensing is investigated via Complementary Receiver Operating Characteristic (CROC) curves for both hard fusion and soft fusion rules.
Abstract: Spectrum sensing performance is highly degraded by log-normal shadowing, resulting in more interference to the primary user caused by secondary users. However, collaborative sensing provides diversity that helps to alleviate the effect of shadowing. In this paper, a closed-form expression for the probability of detection under log-normal shadowing is derived, and the performance of this closed form in collaborative sensing is investigated via Complementary Receiver Operating Characteristic (CROC) curves for both hard fusion and soft fusion rules.
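The quantity being derived, a shadowing-averaged detection probability, can be checked numerically. The sketch below estimates it by Monte Carlo averaging rather than by the paper's closed form, and it assumes an energy detector (a common choice in this literature, though the abstract does not specify the detector); the time-bandwidth product, target false-alarm rate and dB-spread are example values.

# Numerical illustration: average detection probability of an energy detector
# when the received SNR is log-normally shadowed, estimated by Monte Carlo
# averaging (not the paper's closed-form expression). u, the threshold and the
# dB-spread are example values; the energy-detector model is an assumption.
import numpy as np
from scipy.stats import ncx2, chi2

rng = np.random.default_rng(0)
u = 5                      # time-bandwidth product (2u degrees of freedom)
pfa_target = 0.1
lam = chi2.isf(pfa_target, df=2 * u)        # threshold set from the false-alarm rate

mean_snr_db, sigma_db = 5.0, 6.0            # log-normal shadowing parameters (dB)
snr_db = rng.normal(mean_snr_db, sigma_db, size=200_000)
snr = 10 ** (snr_db / 10)

# For a given instantaneous SNR, the energy detector's statistic under H1 is
# noncentral chi-square with 2u degrees of freedom and noncentrality 2*SNR.
pd_samples = ncx2.sf(lam, df=2 * u, nc=2 * snr)
print(f"threshold lambda = {lam:.2f}")
print(f"average Pd under log-normal shadowing ≈ {pd_samples.mean():.3f}")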

Proceedings ArticleDOI
18 Dec 2014
TL;DR: This work dwells on possible improvements to systems for threat detection that may result from integrating several pieces of information, collected in different zones, in order to reduce the false alarm and missed detection rates with respect to a single detector.
Abstract: Mobile devices are an attractive target for network threats, due to the wide connectivity that exposes them to various kinds of attacks. Besides specific security applications such as mobile antiviruses installed on the devices, countermeasures can also be taken at the network operator side, where much larger computational power as well as management information are available. We dwell on possible improvements to systems for threat detection that may result from integrating several pieces of information, collected in different zones, in order to reduce the false alarm and missed detection rates with respect to a single detector. In our proposal, each node involved in the system implements threat detection based on the information it collects. The local decisions are then gathered by a Fusion Center in charge of the final decision. Three different fusion strategies are compared, in the cases of both uncorrelated and correlated local detectors: (i) an optimal one based on the MAP rule, (ii) a majority voting rule, which has the merit of simplicity and turns out to achieve reasonable performance in the special case of independent detectors with comparable accuracies, and (iii) an adaptive linear combiner followed by a hard limiter.
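Two of the fusion rules compared in the paper can be contrasted in a few lines for the simplified case of independent local detectors. The detectors' false-alarm and miss probabilities and the threat prior below are invented values; the MAP rule shown is the standard Chair-Varshney weighting, used here only as an illustration of the idea.

# Sketch contrasting majority voting with a MAP-style (Chair-Varshney) weighted
# rule that uses each detector's false-alarm and miss rates. All probabilities
# below are invented example values.
import numpy as np

rng = np.random.default_rng(0)
p_fa = np.array([0.10, 0.15, 0.05, 0.20])   # false-alarm prob. of each local detector
p_md = np.array([0.20, 0.10, 0.25, 0.15])   # missed-detection probabilities
prior_threat = 0.3

def simulate(n=100_000):
    threat = rng.uniform(size=n) < prior_threat
    # Each detector fires with prob (1 - p_md) under a threat, p_fa otherwise
    fire_prob = np.where(threat[:, None], 1 - p_md, p_fa)
    votes = rng.uniform(size=(n, len(p_fa))) < fire_prob
    return threat, votes

def majority(votes):
    return votes.sum(axis=1) > votes.shape[1] / 2

def map_fusion(votes):
    # Log-likelihood ratio of each vote pattern, plus the prior log-odds
    w1 = np.log((1 - p_md) / p_fa)          # weight when a detector fires
    w0 = np.log(p_md / (1 - p_fa))          # weight when it stays silent
    llr = votes @ w1 + (~votes) @ w0 + np.log(prior_threat / (1 - prior_threat))
    return llr > 0

threat, votes = simulate()
for name, rule in [("majority vote", majority), ("MAP fusion", map_fusion)]:
    decision = rule(votes)
    p_fa_global = decision[~threat].mean()
    p_md_global = (~decision[threat]).mean()
    print(f"{name:14s} P_FA={p_fa_global:.3f}  P_MD={p_md_global:.3f}")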

Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper will start by reviewing bounds for such systems and will then use these bounds to perform quite accurate simulations of large consecutive systems, showing that bounds could be very useful in the exploratory phase, for getting a quick and crude understanding of the reliability of future nano-architectures.
Abstract: A revived interest in consecutive systems is to be expected, due to novel nano-architectures (where reliability estimates are of high interest) as well as particular nanoscale communications (where the reliability of transmission needs to be assessed). The reason is that certain nano-technologies, such as molecular ones (but also magnetic and even FinFETs), can be mapped onto consecutive systems. This paper starts by reviewing bounds for such systems and then uses these bounds to perform quite accurate simulations of large consecutive systems, showing that bounds could be very useful in the exploratory phase for getting a quick and crude understanding of the reliability of future (e.g., molecular, magnetic and FinFET) nano-architectures.
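As a concrete point of reference, the reliability of a linear consecutive-k-out-of-n:F system with i.i.d. components has a simple exact recursion, which the sketch below evaluates and cross-checks by simulation. The values of n, k and the component failure probability q are illustrative; the paper's bounds themselves are not reproduced.

# Exact reliability of a linear consecutive-k-out-of-n:F system (it fails when
# k adjacent components fail), checked against a Monte Carlo estimate.
# n, k and q are illustrative values.
import numpy as np

def consecutive_k_out_of_n_reliability(n: int, k: int, q: float) -> float:
    """Exact reliability for i.i.d. components via the standard recursion
    R(m) = R(m-1) - p * q**k * R(m-k-1), with R(m) = 1 for m < k."""
    p = 1.0 - q
    R = [1.0] * k                       # R(0) ... R(k-1)
    R.append(1.0 - q ** k)              # R(k)
    for m in range(k + 1, n + 1):
        R.append(R[m - 1] - p * q ** k * R[m - k - 1])
    return R[n]

def monte_carlo(n: int, k: int, q: float, trials: int = 200_000) -> float:
    rng = np.random.default_rng(0)
    fails = rng.uniform(size=(trials, n)) < q
    # A run of k consecutive failures exists iff some width-k window is all-failed
    windows = np.lib.stride_tricks.sliding_window_view(fails, k, axis=1)
    system_fail = windows.all(axis=2).any(axis=1)
    return 1.0 - system_fail.mean()

n, k, q = 64, 3, 0.2
print(f"exact     R = {consecutive_k_out_of_n_reliability(n, k, q):.4f}")
print(f"simulated R = {monte_carlo(n, k, q):.4f}")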

Proceedings ArticleDOI
18 Dec 2014
TL;DR: This paper presents an end-to-end encrypted stream-querying technique to process very large volumes of XML data efficiently without revealing the query nor its result to a non-malicious but untrusted server.
Abstract: The Extensible Markup Language (XML) has rapidly established itself as the de facto standard for presenting, storing, and exchanging semi-structured data on the Internet. Querying large volumes of XML data represents a bottleneck for several applications. Applications that require the confidential exchange of semi-structured data can benefit from end-to-end encryption. In this paper we present an end-to-end encrypted stream-querying technique to process very large volumes of XML data efficiently without revealing either the query or its result to a non-malicious but untrusted server. The overheads for encryption are found to be predictable and linear in time and space.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: Experimental results demonstrate that the proposed algorithm can achieve substantial improvements in terms of preserving the weak scatterers and removing noise over other reported CS based ISAR imaging algorithms.
Abstract: Compressive sensing (CS) based Inverse Synthetic Aperture Radar (ISAR) imaging exploits the sparsity of the target scene to achieve high resolution and effective denoising with limited measurements. This paper extends CS based ISAR imaging to further include the continuity structure of the target scene within a Bayesian framework. A correlated prior is imposed to statistically encourage continuity structures in both the cross-range and range domains of the target region, and a Gibbs sampling strategy is used for Bayesian inference. Because the resulting method requires recovering the whole target scene at once, with heavy computational complexity, an approximate strategy is proposed to alleviate the computational burden. Experimental results demonstrate that the proposed algorithm achieves substantial improvements in preserving weak scatterers and removing noise over other reported CS based ISAR imaging algorithms.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: The main thrust of this paper is to find out the different classes and the individual factors among these identified classes which may have significant influence on Innovation in the Information Technology sector by way of literature survey and empirical means.
Abstract: Empirical findings have shown that innovation is the main driving factor of economic growth in all industries, and in Information Technology in particular in the present scenario. In this era of competition, innovation gives a company an edge over others and becomes its sole survival factor. Despite all the interest and importance, researchers still lack a fundamental understanding of the factors that create innovation. It is not just tangible factors that drive innovation; the role of intangible factors is equally important. These various tangible and intangible factors can be grouped into different classes according to their basic characteristics, and the quantum of influence of these different classes of factors on innovation may not be the same. The main thrust of this paper is to find out the different classes, and the individual factors among these identified classes, which may have a significant influence on innovation in the Information Technology sector, by way of a literature survey and empirical means.

Proceedings ArticleDOI
18 Dec 2014
TL;DR: It is shown that unconventionally sized SRAM cells achieve higher SNMs than classically sized SRAM cells (hence they are expected to work correctly at lower supply voltages), demonstrating the benefits of unconventional sizing when applied to ultra-low voltage (ULV) SRAM cells.
Abstract: Noise and variations are ubiquitous, but are still ill-understood and in most cases treated simplistically, leading to substantial overdesign costs. A novel reliability-centric design method based on unconventionally sizing transistors has been suggested lately. In this paper our aim is to design, simulate, and compare the benefits of unconventional sizing when applied to ultra-low voltage (ULV) SRAM cells. We show that unconventionally sized SRAM cells achieve higher SNMs than classically sized SRAM cells (hence it is to be expected that they will work correctly at lower supply voltages).