
Showing papers in "International Journal of Ambient Computing and Intelligence in 2017"


Journal ArticleDOI
TL;DR: The present work exhaustively explores the potential of five texture feature vectors computed from GLCM statistics for differential diagnosis between normal and MRD images using an SVM classifier, and indicates that the GLCM range feature vector computed with d = 1 yields the highest overall classification accuracy.
Abstract: Early detection of medical renal disease (MRD) is important, as it may progress to chronic kidney disease, an irreversible stage. The present work proposes an efficient decision support system for detection of medical renal disease using a small feature space consisting of only second-order GLCM statistical features computed from raw renal ultrasound images. The GLCM mean feature vector and GLCM range feature vector are computed for inter-pixel distance d varying from 1 to 10. These texture feature vectors are combined in various ways, yielding the GLCM ratio, GLCM additive and GLCM concatenated feature vectors. The present work exhaustively explores the potential of the five texture feature vectors computed from GLCM statistics for differential diagnosis between normal and MRD images using an SVM classifier. The results indicate that the GLCM range feature vector computed with d = 1 yields the highest overall classification accuracy of 85.7%, with individual classification accuracy values of 93.3% and 77.9% for the normal and MRD classes respectively.

81 citations
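As a rough illustration of the kind of second-order statistics the paper relies on, the sketch below builds a normalized gray-level co-occurrence matrix for a horizontal inter-pixel distance d and derives a few common GLCM statistics (contrast, homogeneity, energy) with NumPy. The toy image, the level count, and this particular trio of statistics are assumptions for illustration, not the authors' exact feature set.

```python
import numpy as np

def glcm(img, d=1, levels=8):
    """Normalized gray-level co-occurrence matrix for a horizontal offset d."""
    m = np.zeros((levels, levels), dtype=float)
    # Count co-occurrences of gray levels d pixels apart on the same row.
    for i, j in zip(img[:, :-d].ravel(), img[:, d:].ravel()):
        m[i, j] += 1
    return m / m.sum()

def glcm_stats(p):
    """A few standard Haralick-style statistics of a GLCM p."""
    i, j = np.indices(p.shape)
    return {
        "contrast": float(((i - j) ** 2 * p).sum()),
        "homogeneity": float((p / (1.0 + np.abs(i - j))).sum()),
        "energy": float((p ** 2).sum()),
    }

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
feats = glcm_stats(glcm(img, d=1, levels=4))
```

In the paper, statistics of this kind computed for d = 1 to 10 would be assembled into mean and range feature vectors and fed to the SVM classifier.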


Journal ArticleDOI
TL;DR: This paper proposes a big data based surveillance system that analyzes spatial climate big data and performs continuous monitoring of correlation between climate change and Dengue and has been implemented with the help of Apache Hadoop MapReduce.
Abstract: Ambient intelligence is an emerging platform that brings together advances in sensors and sensor networks, pervasive computing, and artificial intelligence to capture real-time climate data. This platform continuously generates several exabytes of unstructured sensor data, often called big climate data. Nowadays, researchers are trying to use big climate data to monitor and predict climate change and possible diseases. Traditional data processing techniques and tools are not capable of handling such huge amounts of climate data. Hence, there is a need to develop an advanced big data architecture for processing real-time climate data. The purpose of this paper is to propose a big data based surveillance system that analyzes spatial climate big data and performs continuous monitoring of the correlation between climate change and Dengue. The proposed disease surveillance system has been implemented with the help of Apache Hadoop MapReduce and its supporting tools.

67 citations
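To make the MapReduce style of processing concrete, here is a minimal pure-Python simulation of the map, shuffle, and reduce phases over hypothetical (region, rainfall, dengue cases) records. The record layout and the per-region averaging are invented for illustration and are not taken from the paper's Hadoop pipeline.

```python
from collections import defaultdict

# Hypothetical climate/disease records: (region, rainfall_mm, dengue_cases).
records = [
    ("north", 120, 30), ("north", 200, 55),
    ("south", 80, 10), ("south", 90, 12),
]

def mapper(rec):
    """Emit (key, value) pairs, as a Hadoop map task would."""
    region, rain, cases = rec
    yield region, (rain, cases, 1)

def reducer(key, values):
    """Aggregate all values for one key into per-region averages."""
    rain = sum(v[0] for v in values)
    cases = sum(v[1] for v in values)
    n = sum(v[2] for v in values)
    return key, (rain / n, cases / n)

# Shuffle phase: group intermediate pairs by key, as the framework would.
groups = defaultdict(list)
for rec in records:
    for k, v in mapper(rec):
        groups[k].append(v)

result = dict(reducer(k, vs) for k, vs in groups.items())
```

A real deployment would express the same mapper and reducer as Hadoop streaming or MapReduce jobs over distributed storage; the control flow above only mimics the three phases in one process.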


Journal ArticleDOI
TL;DR: A comparative analysis is carried out over the financial bankruptcy data set of the Greek industrial bank ETEVA; it is concluded that rough computing techniques provide better accuracy (88.2%) than statistical techniques, while hybridized computing techniques provide still better accuracy (94.1%).
Abstract: The information and technology revolution has brought a radical change in the way data are collected. The data collected are of no use unless some useful information is derived from them. Therefore, it is essential to apply predictive analysis to extract meaningful information from data. Much research has been carried out in the direction of predictive data analysis, starting from statistical techniques and moving to intelligent computing techniques and further to hybridized computing techniques. The prime objective of this paper is to make a comparative analysis between statistical, rough computing, and hybridized techniques. The comparative analysis is carried out over the financial bankruptcy data set of the Greek industrial bank ETEVA. It is concluded that rough computing techniques provide better accuracy (88.2%) than statistical techniques, whereas hybridized computing techniques provide still better accuracy (94.1%) than rough computing techniques.

66 citations
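The rough-computing side of the comparison rests on lower and upper approximations of a target set under an indiscernibility relation. The toy example below, with invented firms and a single "liquidity" attribute, computes both approximations and the resulting accuracy measure.

```python
# Toy information system: firms described by one attribute ("liquidity");
# the target set is the bankrupt firms. All names and values are invented.
objects = {"f1": "low", "f2": "low", "f3": "high", "f4": "high", "f5": "medium"}
bankrupt = {"f1", "f5"}

# Indiscernibility classes: objects with equal attribute values
# cannot be told apart using the available attributes.
classes = {}
for obj, val in objects.items():
    classes.setdefault(val, set()).add(obj)

# Lower approximation: classes fully contained in the target set
# (objects certainly bankrupt, given the attributes).
lower = set().union(*(c for c in classes.values() if c <= bankrupt))
# Upper approximation: classes that overlap the target set at all
# (objects possibly bankrupt).
upper = set().union(*(c for c in classes.values() if c & bankrupt))
accuracy = len(lower) / len(upper)
```

The 88.2% and 94.1% figures in the paper come from far richer attribute sets and classifiers; this sketch only shows the approximation machinery that rough computing builds on.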


Journal ArticleDOI
TL;DR: It is argued that current research efforts and directions are not sufficient in HRI research, and that future research needs to further address interdisciplinary research in order to achieve long-term success of socially interactive robots.
Abstract: Socially interactive robots are expected to have increasing importance in human society. For social robots to provide long-term added value to people's lives, it is of major importance to stress the need for a positive user experience (UX) with such robots. The human-centered view emphasizes various aspects that emerge in the interaction between humans and robots. However, a positive UX does not appear by itself but has to be designed for and evaluated systematically. This paper focuses on the role and relevance of UX in human-robot interaction (HRI): four trends concerning the role and relevance of UX related to socially interactive robots are identified, and three challenges related to its evaluation are presented. It is argued that current research efforts and directions are not sufficient in HRI research, and that future research needs to further address interdisciplinary research in order to achieve long-term success of socially interactive robots.

62 citations


Journal ArticleDOI
TL;DR: This work proposes a model to provide a trusted cloud service for users in a smart city, using fuzzy multi-objective decision making and the bio-inspired Bat algorithm to achieve the principal objective of cloud service providers.
Abstract: Emerging research concerns the authentication of cloud services with high security performance and assured trust for distributed clients in a smart city. Cloud services are deployed by third-party or web-based service providers; thus, security and trust must be considered at every layer of the cloud architecture. The principal objective of cloud service providers is to deliver better services with an assurance of trust regarding clients' information. Cloud users recurrently face different security challenges in the use of sharable resources, and it is difficult for a cloud service provider to adapt a variety of security policies while sustaining its enterprise's goodwill. To make an optimal decision suited to providing a trusted cloud service for users in a smart city, a statistical method known as the Multivariate Normal Distribution is used to select the attributes of different security entities for developing the proposed model. Finally, fuzzy multi-objective decision making and the bio-inspired Bat algorithm are applied to achieve the objective.

54 citations
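The Bat algorithm mentioned above is a population-based metaheuristic. The sketch below is a deliberately simplified version (fixed loudness and pulse rate, and a sphere objective standing in for the paper's trust-related objective) just to show the frequency-tuned velocity update and the local random walk around the current best.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Stand-in objective to minimize; the paper's objective differs."""
    return float(np.sum(x ** 2))

# Illustrative, untuned parameters.
n, dim, iters = 15, 2, 200
fmin_, fmax_ = 0.0, 2.0
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
fit = np.array([sphere(p) for p in pos])
best = pos[fit.argmin()].copy()

for _ in range(iters):
    # Frequency-tuned velocity update pulls bats toward the best solution.
    freq = fmin_ + (fmax_ - fmin_) * rng.random(n)
    vel += (pos - best) * freq[:, None]
    cand = pos + vel
    # Local random walk around the current best for a subset of bats.
    walk = rng.random(n) < 0.5
    cand[walk] = best + 0.01 * rng.standard_normal((walk.sum(), dim))
    cand_fit = np.array([sphere(c) for c in cand])
    improve = cand_fit < fit  # greedy acceptance
    pos[improve], fit[improve] = cand[improve], cand_fit[improve]
    best = pos[fit.argmin()].copy()

best_val = sphere(best)
```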


Journal ArticleDOI
TL;DR: Interaction with a computer has been the center of innovation ever since the advent of input devices as discussed by the authors; from simple punch cards to keyboards, there are a number of novel ways of interaction with comput...
Abstract: Interaction with a computer has been the center of innovation ever since the advent of input devices. From simple punch cards to keyboards, there are a number of novel ways of interaction with comput...

43 citations


Journal ArticleDOI
TL;DR: Evaluation results showed that the k-nearest neighbor algorithm detected road anomalies and driving behaviors with a total accuracy of 98.67%, while dynamic time warping classified driving behaviors with a total accuracy of 96.75%.
Abstract: Road traffic accidents cause 1.25 million deaths per year worldwide. To improve road safety and reduce road accidents, a recognition method for driving events is introduced in this paper. The proposed method detects and classifies both driving behavior and road anomaly patterns based on smartphone sensors (accelerometer and gyroscope). k-Nearest Neighbor and Dynamic Time Warping algorithms were utilized for method evaluation. Experiments were conducted to evaluate the accuracy of the k-nearest neighbor and dynamic time warping algorithms for road anomaly and driving behavior detection, as well as driving behavior classification. Evaluation results showed that the k-nearest neighbor algorithm detected road anomalies and driving behaviors with a total accuracy of 98.67%. The dynamic time warping algorithm classified (normal and abnormal) driving behaviors with a total accuracy of 96.75%.
Keywords: Anomaly Detection, Behavior Classification, Driving Behavior, Road Anomalies, Smartphone Sensors

39 citations
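Dynamic time warping, one of the two algorithms evaluated, aligns two sequences of possibly different lengths before measuring their distance. Below is a standard textbook DTW plus a 1-nearest-neighbor classifier over invented accelerometer-like templates; the labels and traces are illustrative only, not the paper's data.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Best of match, insertion, deletion.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(D[n, m])

def classify_1nn(query, templates):
    """Label the query with the nearest labeled template under DTW (1-NN)."""
    return min(templates, key=lambda kv: dtw_distance(query, kv[1]))[0]

templates = [
    ("smooth", [0, 0, 0, 0, 0, 0]),  # flat accelerometer trace
    ("bump",   [0, 0, 3, 5, 3, 0]),  # pothole-like spike
]
label = classify_1nn([0, 2, 5, 4, 0, 0], templates)
```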


Journal ArticleDOI
TL;DR: An unsupervised segmentation methodology is proposed for remotely sensed images using a Fractional Differential (FD) based texture analysis model and the Iterative Self-Organizing Data Analysis Technique Algorithm (ISODATA).
Abstract: In this paper, an unsupervised segmentation methodology is proposed for remotely sensed images using a Fractional Differential (FD) based texture analysis model and the Iterative Self-Organizing Data Analysis Technique Algorithm (ISODATA). Essentially, image segmentation is used to assign unique class labels to different regions of an image. In this work, it is transformed into texture segmentation by signifying each class label as a unique texture class. The FD based texture analysis model is suggested for texture feature extraction from images, and ISODATA is used for segmentation. The proposed methodology was first implemented on artificial target images and then on remote sensing images from Google Earth. The results of the proposed methodology are compared with those of other texture analysis methods, namely Local Binary Pattern (LBP) and Neighbors based Binary Pattern (NBP), by visual inspection as well as using classification measures derived from a confusion matrix. The comparison shows that the proposed methodology outperforms the LBP and NBP methods.

36 citations


Journal ArticleDOI
TL;DR: By comparing multiple sets of experimental data over different image databases, the authors conclude that PCA can effectively reduce the dimensionality of SIFT and SURF descriptors for large-scale image retrieval while maintaining high retrieval performance.
Abstract: Dimensionality reduction plays an important role in the performance of large-scale image research across different applications. In this paper, we explore Principal Component Analysis (PCA) as a dimensionality reduction method. First, Scale Invariant Feature Transform (SIFT) features and Speeded Up Robust Features (SURF) are extracted as image features. Second, PCA is applied to reduce the dimensions of the SIFT and SURF feature descriptors. By comparing multiple sets of experimental data with different image databases, we conclude that PCA with an appropriate reduction range can effectively reduce the computational cost of image features while maintaining high retrieval performance.
Keywords: Dimensionality Reduction, Large-Scale Image Retrieval, Principal Component Analysis, Scale Invariant Feature Transform, Speeded Up Robust Features

35 citations
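A minimal NumPy sketch of the PCA step: center the descriptor matrix, take its SVD, and project onto the top-k principal components. The random 128-dimensional "descriptors" stand in for real SIFT vectors, and the sizes are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(1)

def pca_reduce(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data; rows of Vt are the principal directions,
    # ordered by decreasing explained variance.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Stand-in for a matrix of 200 SIFT descriptors (128-D each); hypothetical data.
descriptors = rng.standard_normal((200, 128))
reduced = pca_reduce(descriptors, 32)
```

With real SIFT/SURF descriptors the projection basis would be fitted once on a training set and reused for all database and query images, so distances are computed in the same reduced space.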


Journal ArticleDOI
TL;DR: The technological revolution integrating multiple information sources and the extension of computer science into different sectors has led to an explosion of data quantities; an approach is proposed to improve the Hadoop metadata service by mixing centralized and distributed metadata management.
Abstract: The technological revolution integrating multiple information sources and the extension of computer science into different sectors have led to an explosion of data quantities, reflected in the scaling of volumes, numbers and types. These massive increases have resulted in the development of new techniques for locating and accessing data. The final steps in this evolution have produced new technologies: the Cloud and Big Data. The reference implementation for Cloud and Big Data storage is incontestably the Hadoop Distributed File System (HDFS). The latter is based on the separation of metadata from data, which consists in centralizing and isolating the metadata away from the storage servers. In this paper, the authors propose an approach to improve the metadata service of Hadoop, maintaining consistency without much compromise of metadata performance and scalability, by suggesting a mixed solution between centralization and distribution of metadata.
Keywords: Big Data, Clouds of Storage, Hadoop, HDFS, MapReduce, Metadata

34 citations


Journal ArticleDOI
TL;DR: A model is proposed that is highly inspired by the natural and social behavior of human beings, in that man is a social animal always associated with different types of dynamic groups; the proposed Multicast Route Reliability (MRR) is purely a probabilistic function of the Cartesian product of past relationships and present moving patterns.
Abstract: The mobile environment in Mobile Ad hoc Networks (MANETs) is itself a major constraint to deal with, and assuring reliability on top of that strengthens the problem further. This paper is an extension of the authors' previous work on trust-based group formation using fuzzy logic. They herein propose a model highly inspired by the natural and social behavior of human beings, in that man is a social animal and is always associated with different types of dynamic groups. Before a reliable route is established, the trust (T) and the speed (S) of the intermediate nodes that may take part in routing are taken into account. Trust is a matter of feeling with which one feels connected to some person or persons in the form of multicast groups. Thus, the proposed Multicast Route Reliability (MRR) is purely a probabilistic function of the Cartesian product of past relationships and present moving patterns.
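The abstract does not give the exact MRR formula, so the following is a purely hypothetical sketch of a multiplicative route-reliability score combining a per-node trust term with a speed-based stability term. The function name, the terms, and the constants are all assumptions for illustration.

```python
# Hypothetical sketch: route reliability as a product over intermediate
# nodes of a trust value and a speed-stability term, both in [0, 1].
def route_reliability(nodes, max_speed=30.0):
    mrr = 1.0
    for trust, speed in nodes:
        # Faster-moving nodes are assumed less stable as relays.
        stability = 1.0 - min(speed, max_speed) / max_speed
        mrr *= trust * stability
    return mrr

# Three intermediate nodes: (trust, speed in m/s); invented values.
path = [(0.9, 3.0), (0.8, 6.0), (0.95, 1.5)]
mrr = route_reliability(path)
```

Because the score is a product of per-node factors in [0, 1], a single low-trust or fast-moving node drags the whole route's reliability down, which matches the intuition behind selecting stable, trusted multicast routes.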

Journal ArticleDOI
TL;DR: A way to establish a distributed traffic cloud data center based on Service-Oriented Architecture (SOA) fused with cloud computing is introduced, and the Data-center Energy-efficient Network-aware Scheduling (DENS) algorithm is applied in the cloud data center to realize full utilization of all kinds of resources.
Abstract: This paper targets issues including the traditional stovepipe data center, low utilization of IT equipment and data resources as a result of rigid IT structure, high maintenance costs, and high energy consumption in system operation. Taking the data center of the Beijing Municipal Committee of Transport (BMCT) as an example, a way to establish a distributed traffic cloud data center based on Service-Oriented Architecture (SOA) fused with cloud computing is introduced in this paper; in addition, the network-aware, energy-conserving Data-center Energy-efficient Network-aware Scheduling (DENS) algorithm is applied in the cloud data center to realize full utilization of all kinds of resources. Experimental results also show the effectiveness of the proposed algorithm by comparison with traditional DENS algorithms.

Journal ArticleDOI
TL;DR: An Ambient Intelligence algorithm is implemented in which a hospital room is monitored and, if unwanted variations appear in the health parameters of the patient, an alarm is rung and the patient's assistants are notified.
Abstract: The term Ambient Intelligence (AmI) encompasses other technologies such as ubiquitous communication, pervasive computing and ubiquitous computing. Hospitals can improve their operation by monitoring the health of patients and performing automatic analysis of various health parameters inside the room. Security mechanisms can also be enhanced by only allowing authorized hospital staff and attendants into the ward. With the advent of Ambient Intelligence and a congenial political environment, the focus is now shifting to providing better healthcare at home rather than at traditional medical centers. In this paper, we implement an algorithm in which we consider a specific room of a hospital as the environment, with a patient monitored for health and security reasons. If anything not allowed for the particular patient occurs, or there are unwanted variations in the patient's health parameters, an alarm is rung and the patient's assistants are notified.
Keywords: Ambient Intelligence, Big Data, Sensors, Ubiquitous Computing
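The monitoring loop described above reduces to range checks on sensed parameters plus a notification step. A minimal sketch follows, with made-up parameter names and thresholds; a real AmI deployment would read these from sensors and push the notification over a network.

```python
# Hypothetical allowed ranges for monitored health parameters.
LIMITS = {"heart_rate": (50, 110), "spo2": (92, 100), "temp_c": (35.5, 38.0)}

def check_patient(readings, limits=LIMITS):
    """Return the list of parameters whose value is outside its allowed range."""
    alerts = []
    for name, value in readings.items():
        lo, hi = limits[name]
        if not (lo <= value <= hi):
            alerts.append(name)
    return alerts

def notify(readings):
    """Ring the alarm and notify the patient's assistants on any violation."""
    alerts = check_patient(readings)
    if alerts:
        return f"ALARM: notify assistants about {', '.join(sorted(alerts))}"
    return "OK"

msg = notify({"heart_rate": 128, "spo2": 95, "temp_c": 38.6})
```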

Journal ArticleDOI
TL;DR: The paper presents the efficiency versus security requirements tradeoffs in key management for WSN and proposes a novel key management protocol which provides strong resistance against replay attacks.
Abstract: The continuous evolution of the Next Generation Internet (NGI) amplifies the demand for efficient and secure communication capable of responding effectively to the challenges posed by emerging applications. For secure communication between two sensor nodes, a secret key is needed. Cryptographic key management is a challenging task in sensor networks, as their hostile environment makes them more prone to attacks. Apart from the resource constraints of the devices, the unknown topology of the network, the higher risk of node capture, and the lack of a fixed infrastructure make key management more challenging in Wireless Sensor Networks (WSN). The paper surveys different key management schemes for WSN and presents the efficiency versus security requirements tradeoffs in key management for WSN. The paper also proposes a novel key management protocol which provides strong resistance against replay attacks. The results obtained from a mathematical model based on the conditional probability of the scheme suggest that the proposed key management in NGI is efficient and attack resistant.
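The abstract does not spell out the proposed protocol, so as a generic illustration of replay resistance the sketch below pairs a monotonically increasing counter with an HMAC, so that a captured message fails verification when presented a second time. The key handling and message format are invented.

```python
import hmac
import hashlib
import os

KEY = os.urandom(16)  # shared secret between two sensor nodes (illustrative)

def make_message(payload, counter):
    """Attach a monotonically increasing counter and a MAC over both."""
    body = f"{counter}:{payload}".encode()
    tag = hmac.new(KEY, body, hashlib.sha256).hexdigest()
    return body, tag

def accept(body, tag, last_counter):
    """Reject forged MACs and any counter not strictly greater than the last seen."""
    expected = hmac.new(KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False  # tampered or forged message
    counter = int(body.split(b":", 1)[0])
    return counter > last_counter  # replayed counters are rejected

body, tag = make_message("temp=21.5", counter=7)
fresh = accept(body, tag, last_counter=6)     # new counter: accepted
replayed = accept(body, tag, last_counter=7)  # same counter again: rejected
```

The receiver only has to store the last accepted counter per peer, which suits the tight memory budgets of sensor nodes.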

Journal ArticleDOI
TL;DR: A concise survey of developments in MAS is presented highlighting the important contributions in the field and also questions the universal applicability of agents.
Abstract: Multiagent systems have been a fascination for the research community and are often seen as an intelligent solution to many complex real-world problems. Researchers have been active in the domain for the last three decades, and many developments pertaining to the theoretical design and practical implementation of multiagent systems are worth appreciating. The growth in MAS is multidirectional, ranging from conceptual ideas to practical implementations, and from the wide range of applications it appears that multiagent systems are proving to be universal. The paper presents a concise survey of developments in MAS, highlighting the important contributions in the field, and also questions the universal applicability of agents.

Journal ArticleDOI
TL;DR: Experimental results demonstrate that the proposed TRD-MFCC-SS feature is highly effective and performs satisfactorily compared to conventional MFCC feature, especially in complex environment.
Abstract: In this paper, a new real-time approach for audio recognition of waterbird species in noisy environments, based on a Texas Instruments DSP (the TMS320C6713), is proposed. For noise estimation in noisy waterbird sounds, a tonal region detector (TRD) using a sigmoid function is introduced. This method offers flexibility, since the slope and the mean of the sigmoid function can be adapted autonomously for a better trade-off between noise overvaluation and undervaluation. Then, Mel Frequency Cepstral Coefficients post-processed by Spectral Subtraction (MFCC-SS) were extracted for classification using a Support Vector Machine classifier. The development of Simulink analysis models of classic MFCC and MFCC-SS is described. The audio recognition system is implemented in real time by loading the created models onto the DSP board, after conversion to target C code using Code Composer Studio. Experimental results demonstrate that the proposed TRD-MFCC-SS feature is highly effective and performs satisfactorily compared to the conventional MFCC feature, especially in complex environments.
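Two of the building blocks, the sigmoid tonal region detector and spectral subtraction, can be sketched in a few lines of NumPy. The parameter values and the simple magnitude-domain subtraction are illustrative, not the authors' DSP implementation.

```python
import numpy as np

def tonal_region_detector(frame_energy, mean, slope):
    """Sigmoid soft decision: near 1 for tonal (high-energy) frames, near 0 for noise.

    The mean sets the decision point and the slope its sharpness; in the
    paper both adapt autonomously.
    """
    return 1.0 / (1.0 + np.exp(-slope * (frame_energy - mean)))

def spectral_subtraction(noisy_mag, noise_mag, floor=0.01):
    """Subtract a noise magnitude estimate, clamping to a spectral floor."""
    return np.maximum(noisy_mag - noise_mag, floor * noisy_mag)

# Invented per-frame energies: two noise-like frames, two tonal frames.
energies = np.array([0.2, 0.4, 2.0, 3.0])
scores = tonal_region_detector(energies, mean=1.0, slope=4.0)

# Invented two-bin magnitude spectra for the subtraction step.
clean = spectral_subtraction(np.array([1.0, 0.5]), np.array([0.3, 0.6]))
```

In the full pipeline, frames flagged as non-tonal would update the noise estimate, and the subtracted spectra would feed the MFCC computation ahead of the SVM classifier.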

Journal ArticleDOI
TL;DR: A Timed Automata based diagnoser uses observable events to detect and locate faults in discrete event systems; an implementation on a hydraulic system puts the effectiveness of the approach in evidence.
Abstract: A Discrete Event System (DES) is a nonlinear dynamic system with discrete state and event evolution. In this article, we are interested in the diagnosis of failures with Timed Automata. The proposed approach is based on operating time, and it is applicable to any system whose dynamic evolution depends not only on the order of discrete events but also on their periods, as in industrial processes. The most important part of this work is the construction of a diagnoser which uses observable events to detect and locate faults. In the last part of this work, we present results of a study of the performance of the diagnoser, showing the power of this diagnostic approach. An implementation on a hydraulic system is made to illustrate the proposed steps and puts the effectiveness of this approach in evidence. The model of the simulation phase is built using Matlab/Simulink/Stateflow.
Keywords: Detection, Diagnosis, Discrete Event System, Localization, Timed Automata
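A diagnoser of the kind described checks each observable event against a timing constraint and localizes the first violation. The tiny sketch below uses invented events and time windows for a hydraulic-style process; a real Timed Automata diagnoser would also track discrete states and clock resets.

```python
# Minimal timed diagnoser sketch: each observable event must occur within a
# time window after its predecessor; a violation localizes the fault.
EXPECTED = {  # event -> (min_delay, max_delay) after the previous event; invented
    "valve_open": (0.0, 1.0),
    "tank_full": (2.0, 8.0),
    "valve_close": (0.0, 1.0),
}

def diagnose(trace):
    """trace: list of (event, timestamp). Returns None if nominal,
    else the event at which a timing fault is detected and localized."""
    prev_t = 0.0
    for event, t in trace:
        lo, hi = EXPECTED[event]
        if not (lo <= t - prev_t <= hi):
            return event  # fault detected and localized at this event
        prev_t = t
    return None

ok = diagnose([("valve_open", 0.5), ("tank_full", 4.0), ("valve_close", 4.8)])
fault = diagnose([("valve_open", 0.5), ("tank_full", 12.0)])  # tank fills too late
```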

Journal ArticleDOI
TL;DR: A new service called HAaaS ("Health-Assistance as a Service") is proposed, which allows the detection of a malaise and the management of the driver and the vehicle; the aspect of co-operation is guaranteed by a mechanism the authors call the Help/rescue mechanism.
Abstract: In recent years, an increasingly significant number of traffic accidents caused by a malaise at the steering wheel has been observed. The management of this kind of phenomenon is difficult and presents a real challenge. Given this fact, the authors propose in this paper a work which, thanks to new technologies, takes care of people suffering a malaise while driving. Thus, they provide a new service called HAaaS ("Health-Assistance as a Service") which allows the detection of a malaise and the management of the driver and the vehicle. Moreover, the aspect of co-operation is guaranteed thanks to a mechanism that the authors call the Help/rescue mechanism.

Journal ArticleDOI
TL;DR: To verify and prove the expected behavior of the suggested discovery protocol in the design phase, the Event-B formalism is adopted.
Abstract: Pervasive computing environments are open and dynamic. In order to ensure the dynamic discovery of services evolving in a heterogeneous and dynamic environment, specific extensions to WSDL, known as A-WSDL, are suggested. These extensions permit a service provider to define the context of service use and the behavior associated with each change of context. To verify and prove the expected behavior of the suggested discovery protocol in the design phase, the Event-B formalism is adopted. One of the advantages of the Event-B formalism is the application of refinement techniques, which permit complex features to be expressed by means of mathematical proofs and move from an abstract specification to a concrete one using the Rodin tool, which offers support for refinement and proofs.

Journal ArticleDOI
TL;DR: An algebraic language, called Time-AgLOTOS, is presented to describe the time-dependent behavior of intelligent agents for the design of Ambient Intelligence systems; it offers new possibilities and strategies for taking agent real-time decisions in a context-aware manner.
Abstract: This paper presents an algebraic language, called Time-AgLOTOS, to describe the time-dependent behavior of intelligent agents for the design of Ambient Intelligence systems. This specification model provides a theoretical foundation for performing planning under timing constraints. Based on a true-concurrency semantics, a contextual model, called the Spatio-Temporal Planning System (STPS), is developed to capture all possible evolutions of an agent plan, including context changes. The STPS provides a formal description of the possible actions to perform, supporting timing constraints, action durations and spatial information. This structure offers new possibilities and strategies for taking agent real-time decisions in a context-aware manner.