
Showing papers in "International journal of artificial intelligence in 2012"


Journal ArticleDOI
TL;DR: This paper investigates the construction of a comprehensive feature set to compensate for the lack of parser-derived structural information in Arabic and presents pioneering research on opinion holder extraction in Arabic news that is independent of any lexical parser.
Abstract: Opinion mining aims at extracting useful subjective information from large amounts of text. Opinion holder recognition is a task that has not yet been addressed for the Arabic language. This task essentially requires a deep understanding of clause structures. Unfortunately, the lack of a robust, publicly available Arabic parser further complicates the research. This paper presents pioneering research on opinion holder extraction in Arabic news that does not rely on any lexical parser. We investigate the construction of a comprehensive feature set to compensate for the missing structural information a parser would provide. The proposed feature set is adapted from previous work on English, coupled with our proposed semantic field and named entity features. Our feature analysis is based on Conditional Random Fields (CRF) and semi-supervised pattern recognition techniques. Different research models are evaluated via cross-validation experiments, achieving an F-measure of 54.03. We publicly release our resulting corpus and lexicon to the opinion mining community to encourage further research.

97 citations


Journal Article
TL;DR: A hybrid model to improve the firefly algorithm (FA) is proposed, introducing learning automata to adjust firefly behavior and using a genetic algorithm to enhance global search and generate new solutions.
Abstract: The firefly algorithm is an evolutionary optimization algorithm inspired by the behavior of fireflies in nature. Though efficient, its parameters do not change during iterations, which is also true for particle swarm optimization. This paper proposes a hybrid model that improves the FA algorithm by introducing learning automata to adjust firefly behavior and by using a genetic algorithm to enhance global search and generate new solutions. We also propose an approach to stabilize firefly movement during iterations. Simulation results show better performance and accuracy than the standard firefly algorithm.

55 citations
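
For context, the hybrid above modifies the movement rule of the standard firefly algorithm. A minimal sketch of that baseline rule is shown below (plain firefly algorithm only, not the learning-automata/GA hybrid; parameter values and function names are illustrative):

import numpy as np

def firefly_step(x, intensity, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    """One iteration of the standard firefly movement rule.

    x         : (n, d) array of firefly positions
    intensity : (n,) array of brightness values (higher = better)
    alpha     : randomization weight
    beta0     : attractiveness at distance zero
    gamma     : light absorption coefficient
    """
    rng = rng or np.random.default_rng()
    n, d = x.shape
    x_new = x.copy()
    for i in range(n):
        for j in range(n):
            if intensity[j] > intensity[i]:          # firefly i moves towards brighter firefly j
                r2 = np.sum((x[i] - x[j]) ** 2)      # squared distance between i and j
                beta = beta0 * np.exp(-gamma * r2)   # attractiveness decays with distance
                x_new[i] += beta * (x[j] - x[i]) + alpha * (rng.random(d) - 0.5)
    return x_new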


Journal Article
TL;DR: For the first time, an algorithm based on the firefly algorithm is proposed for optimization in dynamic environments, and the obtained results show good accuracy and convergence rate for the proposed approach in comparison with other well-known approaches.
Abstract: In many real-world optimization problems, the objective function, design variables or constraints can change over time, so the optimum of these problems can also change. These kinds of problems are called dynamic. Algorithms designed for optimization in such environments follow principles that distinguish them from algorithms designed for static environments. In this paper, for the first time, an algorithm based on the firefly algorithm is proposed for optimization in dynamic environments. The firefly algorithm is a recent meta-heuristic with great potential for discovering multiple optima simultaneously. This ability is used to propose a novel approach for multi-modal optimization in dynamic environments. The proposed approach is evaluated on the Moving Peaks benchmark, the most widely used benchmark for assessment in dynamic environments. The obtained results show good accuracy and convergence rate for the proposed approach in comparison with other well-known approaches.

52 citations


Journal Article
TL;DR: A new adaptation of the so-called Self-Organizing Migration Algorithm (SOMA) for FCM design is presented and compared to other methods such as particle swarm optimization, simulated annealing, and active and nonlinear Hebbian learning in target-catching experiments intended for future use in robotic soccer.
Abstract: Fuzzy Cognitive Maps (FCM) represent not only a user-friendly knowledge representation but also a convenient means for the simulation of dynamic systems and decision-making support. Given the nature of robotic systems, FCM seem best suited for use mainly at the upper decision levels. However, FCM design remains a challenge. Besides the manual approach, which is limited by the number of nodes and their connections, various adaptation methods have been proposed. This paper gives a short summary of these methods, dividing them into Hebbian-based and evolutionary-based approaches. Further, it presents a new adaptation of the so-called Self-Organizing Migration Algorithm (SOMA) for FCM design, which is compared to other methods such as particle swarm optimization, simulated annealing, and active and nonlinear Hebbian learning in target-catching experiments intended for future use in robotic soccer. The obtained results are compared, the advantages of the proposed method are apparent, and their properties are summarized in the conclusions. In addition, a new modification of FCM with active inputs is presented, able to receive data from sensors at each time step.

34 citations


Journal Article
TL;DR: This paper presents a simple and efficient algorithm that can automatically generate all possible paths in a Control Flow Graph for path testing, using cuckoo behaviour to extract optimal paths.
Abstract: Structural testing is the most important and most demanded technique for code-based criteria in software testing. Within structural testing, path testing is the most useful technique. In path testing, generating all independent (non-redundant) paths is a complex task. The aim of this paper is to present a simple and efficient algorithm that can automatically generate all possible paths in a Control Flow Graph for path testing. Cuckoo behaviour is used in this algorithm to extract optimal paths. This cuckoo search algorithm generates a number of paths equal to the cyclomatic complexity. It can be shown that the proposed approach guarantees full path coverage.

34 citations
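
The cuckoo-search path generator itself is not detailed in the abstract, but the quantity it targets, the cyclomatic complexity that bounds the number of independent paths, can be computed directly from the control flow graph as V(G) = E - N + 2P. A small illustrative sketch (the example graph and node names are made up):

def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity V(G) = E - N + 2P for a control flow graph."""
    return len(edges) - len(nodes) + 2 * components

# Example: an if-else inside a loop -> 3 independent paths expected
nodes = ["entry", "loop", "if", "then", "else", "join", "exit"]
edges = [("entry", "loop"), ("loop", "if"), ("if", "then"), ("if", "else"),
         ("then", "join"), ("else", "join"), ("join", "loop"), ("loop", "exit")]
print(cyclomatic_complexity(edges, nodes))  # 8 - 7 + 2 = 3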


Journal Article
TL;DR: Real-time experimental results validate the fuzzy modeling approach and the new optimal T-S fuzzy models for a Magnetic Levitation System with Two Electromagnets (MLS2EM) laboratory equipment.
Abstract: This paper proposes an approach to fuzzy modeling of magnetic levitation systems. These unstable and nonlinear processes are first linearized around several operating points and then stabilized by a State Feedback Control System (SFCS) structure. Discrete-time Takagi-Sugeno (T-S) fuzzy models of the stabilized processes are derived on the basis of the modal equivalence principle, and the rule consequents contain the state-space models of the local SFCS structures. Optimization problems are defined that aim at minimizing objective functions given by the squared modeling error, i.e., the difference between the real-world process output and the fuzzy model output. The optimization variables are a subset of the parameters of the input membership functions. Simulated Annealing algorithms are implemented to solve these optimization problems and to obtain optimal T-S fuzzy models. Real-time experimental results validate the fuzzy modeling approach and the new optimal T-S fuzzy models on the Magnetic Levitation System with Two Electromagnets (MLS2EM) laboratory equipment.

19 citations
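
The abstract does not give the exact optimization setup, but the role Simulated Annealing plays in it can be illustrated with a generic SA loop that tunes membership-function parameters against a squared modeling error; the objective and neighbourhood functions below are placeholders, not the paper's implementation:

import math, random

def simulated_annealing(cost, x0, neighbor, t0=1.0, t_min=1e-4, cooling=0.95, iters_per_temp=50):
    """Generic simulated annealing: minimizes cost(x) starting from x0.

    cost     : objective, e.g. squared error between process output and fuzzy model output
    neighbor : function producing a perturbed copy of the current parameter vector
    """
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(iters_per_temp):
            y = neighbor(x)
            fy = cost(y)
            # accept improvements always, worse moves with Boltzmann probability
            if fy < fx or random.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest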


Journal Article
TL;DR: Two novel algorithms based on automata theory are proposed for the multiobjective optimization of combinatorial problems: one is a hybrid Simulated Annealing MODS-inspired algorithm and the other is an evolutionary algorithm based on the MODS theory.
Abstract: This paper presents two novel algorithms based on automata theory for the multiobjective optimization of combinatorial problems. The first proposed algorithm, named SAMODS, is a hybrid algorithm inspired by Simulated Annealing and MODS. The main idea behind this approach is to optimize a combinatorial problem by changing the improvement angle, a novel notion built on the classic weighted-sum metric. SAMODS avoids infeasible solutions by falling back on the MODS theory. It also avoids local optima thanks to the use of the Boltzmann probability distribution for occasionally accepting worse solutions. The second proposed algorithm, named SAGAMODS, is an evolutionary algorithm based on the MODS theory. In addition to the advantages of SAMODS, SAGAMODS is doubly protected against local optima thanks to its crossover step. This step is taken from the theory of natural selection and creates new solutions (the next generation) from the current solutions (the current generation); only the best solutions survive. The proposed algorithms were tested using instances from the well-known TSPLIB. The tests used problems with two, three, four and five objectives. The proposed algorithms were compared using metrics from the specialized multiobjective optimization literature. The metric results show that, in some of the instances considered, the MODS algorithm was outperformed by the proposed algorithms in up to 100% of cases.

10 citations


Journal Article
TL;DR: Two novel metaheuristics, namely discrete particle swarm optimization and the imperialist competitive algorithm, are utilized to solve the no-wait two-stage multiprocessor flow shop scheduling problem.
Abstract: This paper discusses the no-wait two-stage multiprocessor flow shop scheduling problem. The characteristics of this problem are unit setup times and a rework probability for jobs after the second stage. The problem investigated in this study belongs to the NP-hard class of scheduling problems; therefore, two novel metaheuristics, namely discrete particle swarm optimization and the imperialist competitive algorithm, are utilized to solve it. The performance measure considered is mean tardiness. In order to evaluate the performance of the proposed algorithms, test instances are first generated randomly, and the results obtained with the proposed algorithms are then compared with those of ant colony optimization and a genetic algorithm. Results are compared in terms of the relative deviation index. The results of the simulation study reveal that the proposed ICA outperforms the other algorithms.

10 citations


Journal Article
TL;DR: The numerical and signal processing performance of a method to reconstruct numerical data from the published coverage images is evaluated by comparing the reconstructed data with the original forecast data.
Abstract: It is common practice to publish environmental information via the Internet. In the case of geographical coverage information such as pollutant concentration charts and maps in chemical weather forecasts, such data are published as web-resolution images. These forecasts are commonly presented with an associated value-range pseudocolor scale, which represents a simplified version of the original data obtained through dispersion models and related post-processing methods. In this paper, the numerical and signal processing performance of a method to reconstruct numerical data from the published coverage images is evaluated by comparing the reconstructed data with the original forecast data.

8 citations
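
The reconstruction method itself is not spelled out in the abstract; one straightforward way to invert such a pseudocolor chart is to map every pixel to the value of the nearest legend colour, as in the sketch below (array names and shapes are illustrative):

import numpy as np

def reconstruct_from_pseudocolor(image_rgb, scale_colors, scale_values):
    """Map each pixel of a pseudocolor chart back to a numeric value.

    image_rgb    : (H, W, 3) uint8 image of the published coverage chart
    scale_colors : (K, 3) RGB entries of the legend's value-range scale
    scale_values : (K,) numeric value assigned to each legend colour
    """
    pixels = image_rgb.reshape(-1, 1, 3).astype(float)
    legend = np.asarray(scale_colors, dtype=float).reshape(1, -1, 3)
    values = np.asarray(scale_values, dtype=float)
    # nearest legend colour in RGB space for every pixel
    nearest = np.argmin(np.sum((pixels - legend) ** 2, axis=2), axis=1)
    return values[nearest].reshape(image_rgb.shape[:2])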


Journal Article
TL;DR: This paper presents an overview of an indoor air quality data analysis using self-organizing maps and discusses the suitability of the method for large-scale analysis of indoor air quality data.
Abstract: This paper presents an overview of an indoor air quality data analysis using self-organizing maps. The aim of the study was to research quality variations in indoor air and the suitability of the method for large-scale analysis of indoor air quality data. The research was conducted in a six-floor apartment building (built in the 1980s) located in Kuopio, Finland, from January to May 2011. Indoor air quality data were collected continuously in 6 apartments, from 10 rooms on the 2nd and 6th floors, using an energy consumption and indoor air quality monitoring system. Three of the six research apartments were located on the 2nd floor and the other three on the 6th floor. First, the indoor air quality data were modelled using the SOM algorithm. Next, the neuron reference vectors of the resulting map were clustered to reveal the dominating elements of each territory of the map. The results indicated that the method presented in this paper is an efficient way to analyse indoor air quality. The results also indicated that problems with indoor air quality occur more often during wintertime in buildings using mechanical exhaust ventilation. In particular, elevated CO2 concentrations indicate poor air quality in the bedrooms.

7 citations
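
The core of the SOM modelling step described above is the reference-vector update; a minimal numpy sketch is given below (grid size, learning rates and decay schedule are illustrative, not the study's settings):

import numpy as np

def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal Self-Organizing Map: returns reference vectors of shape (gx, gy, d)."""
    rng = np.random.default_rng(seed)
    gx, gy = grid
    d = data.shape[1]
    w = rng.random((gx, gy, d))
    coords = np.stack(np.meshgrid(np.arange(gx), np.arange(gy), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # best matching unit for the current sample
            bmu = np.unravel_index(np.argmin(np.sum((w - x) ** 2, axis=2)), (gx, gy))
            # linearly decaying learning rate and neighbourhood radius
            frac = step / n_steps
            lr = lr0 * (1 - frac)
            sigma = sigma0 * (1 - frac) + 1e-3
            dist2 = np.sum((coords - np.array(bmu)) ** 2, axis=-1)
            h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]   # neighbourhood function
            w += lr * h * (x - w)
            step += 1
    return w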


Journal Article
TL;DR: The paper presents an intelligent system, INTELLEnvQ-Air, developed for air quality analysis in urban regions and for informing the population about the impact of air pollution on human health and possible measures of protection for vulnerable persons.
Abstract: Artificial intelligence provides a variety of techniques and methods that can be implemented in environmental decision support systems for solving different problems such as forecasting, analysis, diagnosis, control and planning, for a better quality of the environment and, thus, of life. The paper presents an intelligent system, INTELLEnvQ-Air, developed for air quality analysis in urban regions and for informing the population about the impact of air pollution on human health and possible protection measures for vulnerable persons. The system integrates two artificial intelligence approaches: feed-forward artificial neural networks for forecasting air pollutant concentrations, and rule-based expert systems for the analysis of air quality and its impact on human health.

Journal Article
TL;DR: A rule system to predict first-day returns of initial public offerings based on the structure of the offerings is introduced, based on a genetic algorithm using a Michigan approach that offers significant advantages on two fronts: predictive performance and robustness to outlier patterns.
Abstract: This paper introduces a rule system to predict first-day returns of initial public offerings based on the structure of the offerings. The solution is based on a genetic algorithm using a Michigan approach. The performance of the system is assessed by comparing it to a set of widely used machine learning algorithms. The results suggest that this approach offers significant advantages on two fronts: predictive performance and robustness to outlier patterns. The importance of the latter should be emphasized, as results in this domain are very sensitive to the presence of outliers.

Journal Article
TL;DR: Different classification algorithms, including Decision Tree and Naive Bayesian, are compared using Orange, a data mining tool; the results show that the Decision Tree algorithm achieved good accuracy and efficiency in predicting the required bandwidth inside the network.
Abstract: Classification is one of the most important supervised learning techniques in data mining. Classification algorithms can be extremely beneficial for interpreting and demonstrating bandwidth usage patterns and for predicting the required bandwidth for different groups of users in distinct time intervals, with the intention of improving efficiency. The dataset used in this study was collected over a year from a Squid proxy server's access.log file at a computer institute. This study compares various classification algorithms to predict the bandwidth usage pattern in different time intervals among different groups of users in the network. Different classification algorithms, including Decision Tree and Naive Bayesian, are compared using Orange, a data mining tool. The results of the experiment showed that the Decision Tree algorithm achieved good accuracy and efficiency in predicting the required bandwidth inside the network.
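
The study itself used the Orange tool; a roughly equivalent comparison can be sketched with scikit-learn as a stand-in. The file name, feature columns and target below are assumptions about how the aggregated proxy-log data might look, not the paper's actual format:

import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# hypothetical tabular export of the proxy log: one row per aggregated interval,
# with numeric/encoded feature columns and a discretized bandwidth class as target
df = pd.read_csv("bandwidth_usage.csv")                  # assumed file name
X = df[["hour", "weekday", "group_id", "requests"]]      # assumed feature columns
y = df["bandwidth_class"]                                # assumed target: low/medium/high

for name, clf in [("Decision Tree", DecisionTreeClassifier(max_depth=8)),
                  ("Naive Bayes", GaussianNB())]:
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"{name}: mean accuracy {scores.mean():.3f}")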

Journal Article
TL;DR: The operating times of all relays in the radial distribution network, obtained from the developed laboratory prototype for different fault locations in various sections, have been found to be in close conformity with the theoretical values obtained using the IEC standard relay characteristic equation.
Abstract: With the incorporation of Distributed Generation (DG), the electric power distribution system loses its radial nature and behaves more like a multi-feed transmission system, which undermines the traditional protection scheme. A laboratory prototype of a three-phase radial distribution network containing DG is presented in this paper. By executing a number of single line-to-ground faults at different locations in various sections of the radial distribution network containing DG, the authors observed several relay maloperations caused by miscoordination. In this paper, the impact of high-resistance faults in the radial distribution network in the presence of DG is also analyzed. The operating times of all relays in the radial distribution network, obtained from the developed laboratory prototype for different fault locations in various sections, have been found to be in close conformity with the theoretical values obtained using the IEC standard relay characteristic equation.
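
The abstract refers to an IEC standard relay characteristic equation without stating which curve is used; the commonly used IEC 60255 inverse-time form, shown here for the standard-inverse curve, is

t = \mathrm{TMS} \cdot \frac{0.14}{(I/I_s)^{0.02} - 1}

where t is the relay operating time, TMS the time multiplier setting, I the fault current and I_s the pickup (plug) setting. Whether the paper uses these standard-inverse constants (0.14, 0.02) or another IEC curve is not stated in the abstract.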

Journal Article
TL;DR: This work introduces a memetic algorithm that makes use of the multilevel paradigm, which refers to the process of dividing large and difficult problems into smaller ones that are hopefully much easier to solve, and then working backward towards the solution of the original problem.
Abstract: Many researchers have focused on the satisfiability problem and on many of its variants due to its applicability in many areas of artificial intelligence. This NP-complete problem refers to the task of finding a satisfying assignment that makes a Boolean expression evaluate to True. In this work, we introduce a memetic algorithm that makes use of the multilevel paradigm. The multilevel paradigm refers to the process of dividing large and difficult problems into smaller ones that are hopefully much easier to solve, and then working backward towards the solution of the original problem, using the solution from a previous level as the starting solution at the next level. Results comparing the memetic algorithm with and without the multilevel paradigm are presented using problem instances drawn from real industrial hardware designs.

Journal Article
TL;DR: A new set of application program interfaces (APIs), called Griffon, and its compiler framework for automatic translation of C programs to CUDA-based programs are proposed; Griffon allows programmers to exploit the performance of multicore machines using OpenMP and to offload computations to GPUs using Griffon directives.
Abstract: Applications can run up to hundreds of times faster by offloading some computation from the CPU to graphical processing units (GPUs). This technique is the so-called general-purpose computation on graphics processing units (GPGPU). Recent research on accelerating various applications by GPGPU using a programming model from NVIDIA, called Compute Unified Device Architecture (CUDA), has shown significant performance improvements. However, writing an efficient CUDA program requires in-depth understanding of the GPU architecture in order to develop a suitable data-parallel strategy and to express it in a low-level style of code. Thus, CUDA programming is still considered complex and error-prone. This paper proposes a new set of application program interfaces (APIs), called Griffon, and its compiler framework for automatic translation of C programs to CUDA-based programs. Griffon APIs allow programmers to exploit the performance of multicore machines using OpenMP and to offload computations to GPUs using Griffon directives. The compiler framework uses a new graph algorithm for efficiently exploiting data locality. Experimental results on a 16-core NVIDIA Geforce 8400M GS using six workloads show that Griffon-based programs can run from 1.5 up to 89 times faster than their sequential implementations running on the CPU.

Journal Article
TL;DR: A method for automatic detection and monitoring of small waterlogged areas in farmland, using multispectral satellite images and diverse classifiers, based on Multilayer Perceptron neural networks and Genetic Programming to achieve per-pixel classification is presented.
Abstract: The paper presents a method for automatic detection and monitoring of small waterlogged areas in farmland, using multispectral satellite images and diverse classifiers. In waterlogged areas, excess water significantly damages or completely destroys the plants, thus reducing the average crop yield. Automatic detection of (waterlogged) crops damaged by the combined effect of rainfall and rising underground water is an important tool for government agencies dealing with yield assessment and disaster control. The paper describes the application of two different machine learning algorithms to the problem of identifying crops that have been affected by rising underground water levels in WorldView-2 satellite imagery. Satellite images of a central European region (northern Serbia), taken in May and July 2010, with a spatial resolution of 0.5 m and 8 spectral bands, were used to train the classifiers and test their performance in identifying water-stressed crops. The WorldView-2 satellite provides 4 new bands potentially useful in agricultural applications: coastal-blue, red-edge, yellow and near-infrared 2. We propose a methodology based on Multilayer Perceptron neural networks and Genetic Programming to achieve per-pixel classification. The classifiers constructed are able to achieve 99.4% accuracy when trained and evaluated on a single image and 97.8% accuracy when the testing is done on an image taken under different atmospheric and solar geometry conditions.
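
A per-pixel classifier of the kind described can be sketched with scikit-learn; the 8-band array shape follows the WorldView-2 description above, while layer sizes and function names are illustrative, not the paper's configuration:

import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def train_pixel_classifier(image, labels):
    """image: (H, W, 8) WorldView-2 bands; labels: (H, W) with 1 = waterlogged, 0 = normal."""
    X = image.reshape(-1, image.shape[-1]).astype(float)   # one row of 8 band values per pixel
    y = labels.reshape(-1)
    clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500))
    clf.fit(X, y)
    return clf

def classify_image(clf, image):
    """Apply the trained per-pixel classifier to a whole scene."""
    X = image.reshape(-1, image.shape[-1]).astype(float)
    return clf.predict(X).reshape(image.shape[:2])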

Journal Article
TL;DR: In this article, a novel approach to stability analysis of neural networks switched at an arbitrary time is proposed, which keeps the H_∞ norm from the external input to the state vector within a disturbance attenuation level.
Abstract: This article proposes a novel approach to stability analysis of neural networks switched at an arbitrary time. First, a new condition for H_∞ stability of switched neural networks is proposed. Second, a new H_∞ stability condition in the form of a linear matrix inequality (LMI) for these neural networks is proposed. These conditions ensure that the H_∞ norm from the external input to the state vector is kept within a disturbance attenuation level. Without the external input, the proposed conditions also guarantee asymptotic stability.
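
The paper's LMI conditions are not reproduced in the abstract. For context, the standard bounded-real-lemma LMI that bounds the L_2 gain from a disturbance w to the state x of a linear subsystem \dot{x} = Ax + Bw by \gamma is

\begin{bmatrix} A^{T}P + PA + I & PB \\ B^{T}P & -\gamma^{2} I \end{bmatrix} < 0, \qquad P = P^{T} > 0,

and a common sufficient condition under arbitrary switching is a single P satisfying such an LMI for every subsystem (A_i, B_i). The conditions derived in the paper for switched neural networks are of this general flavour but are not spelled out in the abstract.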

Journal Article
TL;DR: It is proved that these fuzzy control solutions can ensure good control system performance and compensation for plant nonlinearities in mechatronic systems, and that they enable the application and full utilization of such systems.
Abstract: This paper treats several application-oriented fuzzy control solutions with Takagi-Sugeno fuzzy controllers (TS-FCs) developed for mechatronic applications. Low-cost fuzzy control solutions are offered, with simple design approaches and easy implementation. The solutions are organized so as to provide useful recommendations for specialists who apply artificial intelligence techniques in a wide range of practical applications related to mechatronic systems. It is proved that these fuzzy control solutions can ensure good control system performance as well as compensation for plant nonlinearities in mechatronic systems, and therefore they enable the application and full utilization of such systems. Three case studies related to the speed and position control of three mechatronic applications are included: a vehicular power train system with continuously variable transmission, an electromagnetically actuated clutch, and a magnetic levitation system. Plant models expressed as first-principles nonlinear models and as linearized models are offered. Simulations and real-time experimental results validate the low-cost TS-FCs.

Journal Article
TL;DR: Analysis of GC, GC3, AT3 bias spectra shows higher correlation for synonymous codons with C or G being at the third position of codons (GC3).
Abstract: GC, GC3, AT3 bias spectra and codon bias spectra (CBS) are generated through a 3-phase network model of the genome. This model locates the regions in the genome where the distribution of nucleotide combinations and codons is non-uniform and biased towards protein translation. Correlation and regression analyses are performed to estimate the preponderance of GC, GC3 and AT3 bias at nucleotide positions of the genome. The statistical analysis shows higher correlation for synonymous codons with C or G at the third codon position (GC3). A complete LabVIEW schematic is presented for efficient and parallel computation of the GC, GC3 and AT3 bias spectra along with the CBS.
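
GC3 itself, the fraction of codons with G or C in the third position, is straightforward to compute, and a sliding-window version of it is the kind of quantity a bias spectrum plots. A small sketch (in Python, not the paper's LabVIEW implementation; window sizes are illustrative):

def gc3_fraction(cds):
    """Fraction of codons whose third base is G or C (GC3) in a coding sequence."""
    cds = cds.upper()
    third_bases = [cds[i + 2] for i in range(0, len(cds) - len(cds) % 3, 3)]
    return sum(b in "GC" for b in third_bases) / len(third_bases)

def gc3_spectrum(cds, window_codons=100, step_codons=10):
    """Sliding-window GC3 values along the sequence (a simple bias spectrum)."""
    w, s = 3 * window_codons, 3 * step_codons
    return [gc3_fraction(cds[i:i + w]) for i in range(0, len(cds) - w + 1, s)]

print(gc3_fraction("ATGGCCGATTTA"))  # codons ATG GCC GAT TTA -> third bases G,C,T,A -> 0.5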

Journal Article
TL;DR: A multi-agent based approach for the design of simulators where a neuro-fuzzy hybridization technique is applied to model the terrain and weather operations and predict its impact on the effectiveness of air tasking operations and missions is proposed.
Abstract: Environmental effects of military training, live exercises and missile tests pose a major threat to the ecosystem and have a long-term impact on biodiversity. Military simulators and virtual warfare analyses constitute an important and inexpensive tool for military analysts and a safe alternative to live training exercises. A major impediment is replicating real-world scenarios that account for the effects of weather and terrain on military operations, and assessing the damage caused by the weapons employed to achieve a military objective in training. We propose a multi-agent based approach for the design of simulators in which a neuro-fuzzy hybridization technique is applied to model terrain and weather and to predict their impact on the effectiveness of air tasking operations and missions. Spatial terrain data and spatio-temporal weather data from meteorological sources were used as input to a neural network, and the predicted weather conditions at a given place were classified with fuzzy logic.

Journal Article
TL;DR: The study results suggest that neural network technology can be introduced for flow value estimation on the basis of daily precipitation, with a preference for MLP networks, instead of simple statistical regional relationships.
Abstract: In order to ensure the smooth operation of a water reservoir, stream flows have to be estimated. Flow estimation is usually recognized as a proper tool for describing regional climatic conditions with respect to soil erosion by water. It is also a basic input to simple and widespread soil erosion prediction models. However, its calculation on the basis of original precipitation records is a very laborious operation and is impossible for many locations lacking precise precipitation data. The aim of the research was to develop a simple method of flow value estimation on the basis of general precipitation data. We examined the possibility of implementing artificial neural networks for flow value estimation on the basis of minimum and maximum temperatures and daily precipitation. The research was conducted using a database containing calculated precipitation and flow values from 3 meteorological stations in Cyprus. As a result of the study, 3 radial basis function (RBF) networks with two to five hidden-layer neurons and 2 multilayer perceptron (MLP) networks with one and two hidden layers were developed. The results suggest that neural network technology can be introduced for flow value estimation on the basis of daily precipitation, with a preference for MLP networks, instead of simple statistical regional relationships.

Journal Article
TL;DR: A remote body sensing system paired with a cognitive impairment helper in a mobile system provides constant monitoring of the user's health condition, enabling proactive actions and medical reports to the user's physician.
Abstract: Society is moving towards a massive aging phenomenon. It is estimated that in 50 years the elderly population will surpass the young population. This means that two major problems will arise, one social and one economic: the active population will not produce enough wealth to support the elderly population, and the problems that elderly people usually have will grow exponentially, leaving the health services provided today unable to respond to such demand. Technology and its ever-growing evolution can be a possible response to both problems, providing initial assistance and remote monitoring. This project enables elderly people to have an active life by monitoring the user's health state and providing tools to help them with daily tasks. This paper presents a remote body sensing system paired with a cognitive impairment helper in a mobile system, providing constant monitoring of the user's health condition and enabling proactive actions and medical reports to the user's physician.

Journal Article
TL;DR: Light is shed on the contribution of determinants to the health status of the population and on whether or not these determinants are producing similar results in the countries of the Organization for Economic Co-operation and Development (OECD).
Abstract: This paper aims to shed light on the contribution of determinants to the health status of the population and to provide evidence on whether or not these determinants are producing similar results in the countries of the Organization for Economic Co-operation and Development (OECD). This is done by employing two different approaches, namely Cox regression and artificial neural networks. In this study, one output – Life Expectancy (LE) at birth of the total population – and seven inputs are included. The inputs represent the three main dimensions of health outcome production: health resources (measured by health spending or the number of health practitioners), socioeconomic environment (pollution, education and income) and lifestyle (tobacco, alcohol and diet). A variable expressing country specificities is also used. The two distinct approaches reached the same conclusion: that health resources and country-specific effects are more closely related to LE.

Journal Article
TL;DR: The experimental results show that normalized linear and sigmoid kernels and the under-sampling balancing technique outperform the other approaches tested; a new software tool named BioClass is also presented.
Abstract: In the last decade, several text mining methods have been proposed to automate the process of searching and classifying information in online biomedical publications. However, the results are not good enough, mainly because of the unbalanced nature of the document collections, with only a very small number of relevant papers for each user query. Because most data mining and machine learning algorithms have great difficulty dealing with unbalanced data, this problem is taking center stage. Classification techniques such as support vector machines (SVMs) have excellent performance on balanced data but may fail when applied to unbalanced datasets. One of the most common techniques for dealing with this problem consists of changing the basic sampling method, including under-sampling, over-sampling and re-sampling. This article discusses the issues associated with classifying unbalanced data and analyzes the effects of these balancing strategies on four different SVM kernels (linear, sigmoid, exponential and polynomial) using the public TREC Genomics 2005 biomedical text corpus. The experimental results show that normalized linear and sigmoid kernels and the under-sampling balancing technique outperform the other approaches tested. Empirical tests are conducted using a new software tool named BioClass, which is presented here.
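
The combination the abstract singles out, under-sampling of the majority class followed by a linear-kernel SVM, can be sketched as follows; the feature matrix and labels are placeholders rather than the actual TREC Genomics pipeline:

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

def undersample(X, y, seed=0):
    """Randomly drop majority-class (y == 0) samples until both classes are the same size."""
    rng = np.random.default_rng(seed)
    pos = np.flatnonzero(y == 1)                     # relevant documents (minority class)
    neg = np.flatnonzero(y == 0)                     # irrelevant documents (majority class)
    keep_neg = rng.choice(neg, size=len(pos), replace=False)
    idx = np.concatenate([pos, keep_neg])
    return X[idx], y[idx]

# X: document feature vectors (e.g. TF-IDF), y: 1 = relevant, 0 = irrelevant (assumed encoding)
def evaluate(X, y):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
    X_bal, y_bal = undersample(X_tr, y_tr)           # balance only the training split
    clf = SVC(kernel="linear").fit(X_bal, y_bal)
    return f1_score(y_te, clf.predict(X_te))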

Journal Article
TL;DR: This paper investigates the error performance of FSO channels modeled by Gamma-Gamma and K distribution functions with Luby Transform codes, which are rateless codes, and classifies the channels using Radial Basis Function Neural Networks to decide the best fit.
Abstract: Free Space Optical (FSO) communication systems offer license-free and cost-effective access. FSO links can suffer from data packet corruption and erasure. Error control codes can help to mitigate turbulence-induced fading and can improve the error performance of such links. Various statistical models have been proposed to describe atmospheric turbulence channels. The choice of the appropriate model for a varying level of turbulence depends on the atmospheric parameters. In this paper, we classify the channels using Radial Basis Function Neural Networks to decide the best fit. We then investigate the error performance of FSO channels modeled by Gamma-Gamma and K distribution functions with Luby Transform codes, which are rateless codes. Simulation results are used to compare the performance of different modulation schemes with Luby Transform encoding and also to classify the appropriate distribution function for the channel model.
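
The Gamma-Gamma turbulence model mentioned above treats the received irradiance as the product of two independent unit-mean gamma variables representing large- and small-scale scintillation, which suggests a simple way to simulate such a channel (the α and β values below are illustrative):

import numpy as np

def gamma_gamma_samples(alpha, beta, n, seed=0):
    """Irradiance samples I = X * Y with X ~ Gamma(alpha), Y ~ Gamma(beta), both unit mean."""
    rng = np.random.default_rng(seed)
    x = rng.gamma(shape=alpha, scale=1.0 / alpha, size=n)   # large-scale scintillation
    y = rng.gamma(shape=beta, scale=1.0 / beta, size=n)     # small-scale scintillation
    return x * y

samples = gamma_gamma_samples(alpha=4.0, beta=1.9, n=100_000)
print(samples.mean())   # close to 1 (normalized irradiance)
print(samples.var())    # scintillation index, approximately 1/alpha + 1/beta + 1/(alpha*beta)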

Journal Article
TL;DR: A novel mathematical model based on Multi-Layer Perceptrons for reducing the ground reflection effect, a kind of multipath effect, in indoor Real-Time Locating Systems is presented.
Abstract: Nowadays, indoor Real-Time Locating Systems represent one of the most exciting applications based on Wireless Sensor Networks, using wireless technologies such as Wi-Fi or ZigBee. Indoor Real-Time Locating Systems based on Wireless Sensor Networks use different measurements from radio frequency signals, such as Received Signal Strength Indication (RSSI) levels, in order to estimate distances between reference nodes and the devices to be located. In ideal conditions, the relationship between RSSI levels and distances between antennas has a decaying exponential shape. Nevertheless, the radio frequency waves used by indoor Real-Time Locating Systems can be affected by different undesired propagation effects, such as attenuation, diffraction, reflection and scattering, which can lead to the multipath effect. In this sense, this paper presents a novel mathematical model based on Multi-Layer Perceptrons for reducing the ground reflection effect, a kind of multipath effect, in indoor Real-Time Locating Systems. The presented results demonstrate that using Multi-Layer Perceptrons to estimate distances from RSSI levels reduces the ground reflection effect that appears when only the current RSSI measurement is considered.
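
The decaying relationship between RSSI and distance mentioned above is commonly written as the log-distance path-loss model; the sketch below shows the ideal, noise-free inversion that an MLP-based estimator would be compared against (reference power and path-loss exponent are illustrative, not the paper's values):

import math

def rssi_from_distance(d, rssi_ref=-40.0, n=2.5, d_ref=1.0):
    """Log-distance path-loss model: RSSI(d) = RSSI(d_ref) - 10 n log10(d / d_ref)."""
    return rssi_ref - 10.0 * n * math.log10(d / d_ref)

def distance_from_rssi(rssi, rssi_ref=-40.0, n=2.5, d_ref=1.0):
    """Ideal inversion of the model above; multipath effects make the real mapping noisier."""
    return d_ref * 10.0 ** ((rssi_ref - rssi) / (10.0 * n))

print(round(distance_from_rssi(rssi_from_distance(5.0)), 2))   # 5.0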