
Showing papers in "Artificial Intelligence Review in 2009"


Journal ArticleDOI
TL;DR: This work presents a survey of the algorithms based on the intelligence of bee swarms, together with their applications.
Abstract: Swarm intelligence is an emerging area in the field of optimization, and researchers have developed various algorithms by modeling the behaviors of different swarms of animals and insects, such as ants, termites, bees, birds, and fish. In the 1990s, Ant Colony Optimization, based on ant swarms, and Particle Swarm Optimization, based on bird flocks and fish schools, were introduced, and they have been applied to solve optimization problems in various areas within two decades. However, the intelligent behaviors of bee swarms have inspired researchers, especially during the last decade, to develop new algorithms. This work presents a survey of the algorithms based on the intelligence of bee swarms, and of their applications.

624 citations


Journal ArticleDOI
TL;DR: A way of designing clustering algorithms, and improving existing ones, based on reusable components is proposed: reusable components are identified as solutions to characteristic sub-problems in partitioning clustering algorithms, together with a generic structure for the design of such algorithms.
Abstract: Clustering algorithms are well-established and widely used for solving data-mining tasks. Every clustering algorithm is composed of several solutions for specific sub-problems in the clustering process. These solutions are linked together in a clustering algorithm, and they define the process and the structure of the algorithm. Frequently, many of these solutions occur in more than one clustering algorithm. Mostly, new clustering algorithms include frequently occurring solutions to typical sub-problems from clustering, as well as from other machine-learning algorithms. The problem is that these solutions are usually integrated into their algorithms, and that the original algorithms are not designed to share solutions to sub-problems easily outside the original algorithm. We propose a way of designing clustering algorithms, and of improving existing ones, based on reusable components. Reusable components are well-documented, frequently occurring solutions to specific sub-problems in a specific area. Thus we identify reusable components, first, as solutions to characteristic sub-problems in partitioning clustering algorithms, and, further, identify a generic structure for the design of partitioning clustering algorithms. We analyze some partitioning algorithms (K-means, X-means, MPCK-means, and Kohonen SOM) and identify reusable components in them. We give examples of how new clustering algorithms can be designed based on them.
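The component-based design the abstract describes can be illustrated with a small sketch. The component boundaries and names below (init / assign / update) are our own illustrative reading of the idea, not the paper's actual component catalogue:

```python
import random

# K-means assembled from swappable "reusable components": seeding,
# assignment, and centroid update are separate, replaceable functions.
def random_init(points, k, rng):
    # seeding component: pick k distinct data points as initial centres
    return rng.sample(points, k)

def sq_dist(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def assign(points, centers):
    # assignment component: nearest centre by squared Euclidean distance
    return [min(range(len(centers)), key=lambda j: sq_dist(p, centers[j]))
            for p in points]

def update(points, labels, k, old_centers):
    # update component: move each centre to its cluster mean;
    # keep the old centre if a cluster empties
    new = []
    for j in range(k):
        members = [p for p, l in zip(points, labels) if l == j] or [old_centers[j]]
        new.append(tuple(sum(c) / len(members) for c in zip(*members)))
    return new

def kmeans(points, k, init=random_init, iters=20, seed=0):
    rng = random.Random(seed)
    centers = init(points, k, rng)
    for _ in range(iters):
        labels = assign(points, centers)
        centers = update(points, labels, k, centers)
    return centers, labels

pts = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.9)]
centers, labels = kmeans(pts, 2)
```

Swapping in a different seeding component (say, a k-means++-style `init`) changes a single argument, which is the reuse point the paper argues for.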

23 citations


Journal ArticleDOI
TL;DR: In this paper, the state of the art of security systems that use both intelligent agents and artificial immune systems, i.e., Agent Based Artificial Immune Systems (ABAIS), is reviewed, paying special attention to the features of the human immune system used in each system, the role of the agents in the ABAIS, and the security mechanisms provided against intrusions.
Abstract: Since its introduction in the 1990s, the internet has proliferated in the life of humankind in countless ways. Two by-products of the internet are intelligent agents and intrusions, which are far from each other in the intention of their creation while similar in their characteristics. With automated code roaming the network, intruding on users on one side as worms, viruses, and Trojans, and autonomous agents tending to help users on the other, the internet has posed great research challenges to computer scientists. The greatest challenge of the internet is intrusion, which has only increased over time. There are various security systems for the internet. As the human immune system protects the human body from external attacks, these security systems tend to protect the internet from intruders. The internet's security systems are thus comparable to the human immune system, in which autonomous cells move throughout the body to protect it while learning to tackle new threats and keeping them in memory for the future. These properties are comparable to those of autonomous agents on the internet. Intelligent agent technology combined with ideas from the human immune system is therefore a great area of research, still in its developing phase. In this paper, the state of the art of security systems that use both of these technologies, i.e., Agent Based Artificial Immune Systems (ABAIS), is reviewed, paying special attention to the features of the human immune system used in each system, the role of the agents in the ABAIS, and the security mechanisms provided against intrusions.

23 citations


Journal ArticleDOI
TL;DR: When single words are combined together with word n-grams in one list and put in rank order, the frequency of tokens in the combined list extends Zipf’s law with a slope close to −1 on a log-log plot in all five languages.
Abstract: Experiments show that for a large corpus, Zipf's law does not hold for all ranks of words: the frequencies fall below those predicted by Zipf's law for ranks greater than about 5,000 word types in the English language and about 30,000 word types in the inflected languages Irish and Latin. It also does not hold for syllables or words in the syllable-based languages, Chinese or Vietnamese. However, when single words are combined together with word n-grams in one list and put in rank order, the frequency of tokens in the combined list extends Zipf's law with a slope close to −1 on a log-log plot in all five languages. Further experiments have demonstrated the validity of this extension of Zipf's law to n-grams of letters, phonemes or binary bits in English. It is shown theoretically that probability theory alone can predict this behavior in randomly created n-grams of binary bits.
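The binary-bit case in the last sentence can be checked with a short simulation. The parameters below (20,000 random bits, n-grams up to length 8) are our own illustrative choices, not the paper's:

```python
import math
import random
from collections import Counter

random.seed(0)
bits = "".join(random.choice("01") for _ in range(20000))

# Pool n-grams of every length n = 1..8 into one combined frequency list,
# as the paper does for words plus word n-grams
counts = Counter(bits[i:i + n]
                 for n in range(1, 9)
                 for i in range(len(bits) - n + 1))
freqs = sorted(counts.values(), reverse=True)

# Least-squares slope of log(frequency) against log(rank)
xs = [math.log(r) for r in range(1, len(freqs) + 1)]
ys = [math.log(f) for f in freqs]
mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
# for random bits the fitted slope comes out close to -1,
# matching the paper's theoretical prediction
```

The mechanism is visible by hand: each of the 2^n distinct n-grams occurs about N/2^n times, so frequency falls by half each time rank doubles, i.e. frequency ∝ 1/rank.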

22 citations


Proceedings ArticleDOI
M. Kastek, T. Sosnowski, T. Orżanowski, K. Kopczyński, M. Kwaśny
TL;DR: Presents some commercially available devices and methods used for the detection of chemical substances (gases), together with theoretical calculations regarding the possibility of methane detection with a multispectral infrared camera.
Abstract: The article presents the problem of methane detection with a multispectral infrared camera. It also presents some commercially available devices and methods used for the detection of chemical substances (gases). The design of a multispectral infrared camera and theoretical calculations regarding the possibility of methane detection are also reported in this paper. The calculations included the properties of the optical path: camera – methane cloud – background. Verification of the theoretical results was made during laboratory measurements. Some initial results of methane detection are also presented.

11 citations


Proceedings ArticleDOI
TL;DR: The aim was to evaluate the immission status in the small settlements selected from the North-Bohemian region, to appraise the predictive value of the collected results, and to prepare a methodology concept of mobile measurements to describe the local immission situation.
Abstract: In the past, due to air pollution, the North-Bohemian Brown Coal Basin belonged to the infamous zone called the “Black Triangle”, which also covered the lower parts of Silesia and Saxony. The air pollution was for the greatest part caused by large industrial sources, and its impact on the forest ecosystems gained a cross-border character. At the beginning of the 1990s, after reduction measures in industry and the implementation of stricter environmental laws, the air condition started to improve rapidly. Due to extensive opencast brown coal mining and the presence of large combustion sources, the area is still classed as a region with poor air quality, mainly due to increased concentrations of dust particles. In recent years the increasing effect of local combustion sources on air pollution can be seen. Nevertheless, air pollution in small settlements has not been mapped sufficiently. The article is based on data collected from mobile measurements of air purity executed within the remit of the subgroup “emissions-immissions” of the research project “Research of physical and chemical features affected by coal mining, its use and impact on environment in the North-Western Bohemia region”. The measurements were performed in selected communities close to the opencast mines in the period 2004–2007. The aim was to evaluate the immission status in the small settlements selected from the North-Bohemian region, to appraise the predictive value of the collected results, and to prepare a methodology concept of mobile measurements to describe the local immission situation. This article extends the information gained from mobile measurement with the data from the stationary measuring stations, and also takes into account significant sources of pollution such as coal power plants in the North-Bohemian Brown Coal Basin.
The results collected during the series of measurements in the course of heating and non-heating seasons are simultaneously compared with the data regarding the manner of heating and type of the fuel used for heating in the communities.

10 citations


Proceedings ArticleDOI
TL;DR: In this paper, the authors performed aerosol sampling close to two of the three doors of the Florence Baptistery, measuring the non-carbonate carbon and soluble ionic components of the total suspended matter.
Abstract: It is now well known that air pollution is responsible for the accelerated damage encountered on cultural heritage located outdoors. Although several works on atmospheric pollutants have been performed, studies of atmospheric pollutant monitoring close to monuments remain rare. In addition, the few cases reported in the literature mostly regard indoor environments. As the protection and conservation of monuments and historic buildings constitutes a priority for each country, knowledge of particle composition near monuments over time is an important issue in conservation strategies. For this reason, the atmosphere in proximity of the Florence Baptistery, located in the city centre, was continuously monitored during 2003 and 2004 by means of aerosol sampling performed close to two of the three doors of the monument. In particular, the monitoring was performed close to the North Door, realized by Lorenzo Ghiberti (1403-1424), currently utilized as the entrance to the monument, and the South Door, a masterpiece of Andrea Pisano (1330), employed as the exit for visitors. The sampling sites were characterized by different exposure to road traffic emissions. The non-carbonate carbon and soluble ionic components of the total suspended matter were measured. The data obtained are presented and discussed with the goal of contributing to the formulation of guidelines for a suitable safeguard of the built cultural heritage.

10 citations


Proceedings ArticleDOI
TL;DR: Results of the principal components analysis show that both LIF and FTIR methods allow distinguishing between groups of materials, i.e. pollens and nonbiological materials, thus making application of these methods for detection and preliminary classification possible.
Abstract: Biochemical, highly sensitive methods of pollutant identification have common drawbacks – relatively long analysis time and the necessity of on-site sample collection. Optical methods are less sensitive and less selective, but they allow real time analysis. These optical methods include Fourier Transform Infrared Spectroscopy (FTIR), and Laser Induced Fluorescence (LIF). Both methods give the possibility of stand-off detection. The aim of the presented work is to study the possibilities of detection and classification of air contaminants, based on their optical properties. In this work we present measurement results of UV-VIS fluorescence characteristics and IR optical absorption of vegetable pollens, bacterial spores, and non-biological air contaminants (diesel fuel, dust, syloid). Results of the principal components analysis show that both LIF and FTIR methods allow distinguishing between groups of materials, i.e. pollens and nonbiological materials, thus making application of these methods for detection and preliminary classification possible.

9 citations


Proceedings ArticleDOI
TL;DR: The investigation highlights that despite being labelled as a “green” or “carbon neutral” source of renewable energy, the actual ability of biofuels (especially those made from crops) to reduce GHGs hangs delicately on several crucial factors, namely, direct land use change.
Abstract: Among all the various air pollution issues, greenhouse gases are the key environmental and global concern that the world is facing today. The European Union’s project on Carbon Measurement Toolkit was developed to determine the greenhouse gas (GHG) intensities of products. This kind of evaluation is important for fast growing industrial nations, especially in assessing sources of alternative energy. Life cycle assessment or LCA is used to model the following fuels delivered to Singapore: foreign conventional fuel production; biofuels from palm oil grown in neighbouring countries (with ‘worst’ and ‘best’ cases of direct land use change); and biodiesel produced from used cooking oil in Thailand. The life cycle approach used in this article is similar to the method developed by the European Union’s Carbon Measurement Toolkit. The case studies involve raw material production/plantation, processing and final delivery by long-distance transportation. The investigation highlights that despite being labelled as a “green” or “carbon neutral” source of renewable energy, the actual ability of biofuels (especially those made from crops) to reduce GHGs hangs delicately on several crucial factors, namely, direct land use change.

9 citations


Proceedings ArticleDOI
TL;DR: PM10 samples were collected at nine sampling stations using a high volume (hivol) air sampler during the period of June–November 2007 and the chemical compositions of organic and water-soluble ionic species (WSIS) PM10 aerosols from each emission source were identified.
Abstract: PM10 samples were collected at nine sampling stations using a high-volume (hivol) air sampler during the period of June–November 2007. Using the ATR-FTIR technique, the chemical compositions of organic and water-soluble ionic species (WSIS) in PM10 aerosols from each emission source were identified. WSIS such as SO₄²⁻, NO₃⁻, CO₃²⁻ and NH₄⁺ …

9 citations


Journal ArticleDOI
TL;DR: The results obtained show that DE convergence speeds were faster than the ones of multiple population genetic algorithm and genetic algorithms, therefore DE algorithm seems to be a promising approach to engineering optimization problems.
Abstract: This paper aims to adapt the Clonal Selection Algorithm (CSA), which is usually used to explain the basic features of artificial immune systems, to the learning of neural networks, in place of back-propagation. The CSA was first applied to a real-world problem (the IRIS database) and then compared with an artificial immune network. CSA performance was also contrasted with other evolutionary algorithms, such as Differential Evolution (DE) and Multiple Population Genetic Algorithms (MPGA). The applications tested in the simulation studies were IRIS (a vegetal database) and TIMIT (a phonetic database). The results obtained show that DE convergence speeds were faster than those of the multiple population genetic algorithm and standard genetic algorithms; the DE algorithm therefore seems to be a promising approach to engineering optimization problems. On the other hand, CSA demonstrated good performance at the level of pattern recognition, since the recognition rate was equal to 99.11% for the IRIS database and 76.11% for TIMIT. Finally, the MPGA succeeded in generalizing all phonetic classes in a homogeneous way: 60% for the vowels, 63% for the fricatives, and 68% for the plosives.


Proceedings ArticleDOI
TL;DR: The research concludes that the absence of statutory targets for carbon emission reductions remains a substantial barrier for local authority carbon management initiatives, but in order to utilise scarce resources in the most efficient manner, local authorities should draw upon the existing skill set of their Air Quality Officers.
Abstract: Due to the common sources of emissions of both air quality pollutants and greenhouse gases, management measures directed at one category of emissions are likely to positively impact the other. Through the local air quality management (LAQM) process, local authorities are required to monitor and measure specified air pollutants, the sources of which are also common to the primary sources of carbon emissions at a local level. This research tracks the progression of local authority management of carbon emissions and examines the barriers and opportunities for the integration of carbon emissions into the LAQM process. Results are triangulated from three core research methods deployed in South West England: a time series of local authority questionnaire surveys; secondary data analysis of active Air Quality Action Plans; and case study interviews of six local authorities in the region. The research concludes that the absence of statutory targets for carbon emission reductions remains a substantial barrier for local authority carbon management initiatives. However, in order to utilise scarce resources in the most efficient manner, local authorities should draw upon the existing skill set of their Air Quality Officers.

Proceedings ArticleDOI
TL;DR: Reviewed papers accepted for the Seventeenth International Conference on Modelling, Monitoring and Management of Air Pollution held in Tallinn, Estonia in July 2009.
Abstract: Reviewed papers accepted for the Seventeenth International Conference on Modelling, Monitoring and Management of Air Pollution held in Tallinn, Estonia in July 2009

Journal ArticleDOI
TL;DR: It is demonstrated that it is possible to decompose this task into several parallel computations, each related to a subset of S (respectively of S̄); these partial results are then put together as a final product.
Abstract: Consider a family (X_i)_{i∈I} of random variables endowed with the structure of a Bayesian network, and a subset S of I. This paper examines the problem of computing the probability distribution of the subfamily (X_a)_{a∈S} (respectively the probability distribution of (X_b)_{b∈S̄}, where S̄ = I − S, conditional on (X_a)_{a∈S}). It presents theoretical results that make it possible to compute joint and conditional probabilities over a subset of variables by computing over separate components. In other words, it is demonstrated that this task can be decomposed into several parallel computations, each related to a subset of S (respectively of S̄); these partial results are then put together as a final product. In computing the probability distribution over (X_a)_{a∈S}, this procedure produces a level-two Bayesian network structure for S.
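A minimal illustration of the decomposition idea, in the simplest case where the subset S splits across two disconnected components of the network (the toy network A→B, C→D and its CPT numbers are invented for illustration; the paper's results cover far more general structures):

```python
from itertools import product

# Toy Bayesian network with two disconnected components: A -> B and C -> D.
# CPTs are illustrative numbers over Boolean variables.
pA = {True: 0.6, False: 0.4}
pB_given_A = {True: {True: 0.9, False: 0.1}, False: {True: 0.2, False: 0.8}}
pC = {True: 0.3, False: 0.7}
pD_given_C = {True: {True: 0.5, False: 0.5}, False: {True: 0.1, False: 0.9}}

def marg_B():
    # partial result computed entirely within the first component
    return {b: sum(pA[a] * pB_given_A[a][b] for a in (True, False))
            for b in (True, False)}

def marg_D():
    # partial result computed entirely within the second component
    return {d: sum(pC[c] * pD_given_C[c][d] for c in (True, False))
            for d in (True, False)}

# Joint over S = {B, D}: the partial results are put together as a product
joint_S = {(b, d): marg_B()[b] * marg_D()[d]
           for b, d in product((True, False), repeat=2)}

# Brute-force check against marginalising the full joint over (A, B, C, D)
brute = {}
for a, b, c, d in product((True, False), repeat=4):
    p = pA[a] * pB_given_A[a][b] * pC[c] * pD_given_C[c][d]
    brute[(b, d)] = brute.get((b, d), 0.0) + p

assert all(abs(joint_S[k] - brute[k]) < 1e-12 for k in joint_S)
```

The two `marg_*` computations share no variables, so they could run in parallel, which is the point the abstract makes.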

Journal ArticleDOI
TL;DR: The effects of project size and number of resource constraints on project duration are compared to the performances of pre-selected priority rules.
Abstract: Priority rules are one of the most frequently used methods in project scheduling with resource constraints. In this paper, the effects of project size and the number of resource constraints on project duration are compared against the performances of pre-selected priority rules. Ten projects of different sizes have been scheduled under 3, 5, 7, 9, and 11 limited-resource conditions by means of the MRPL (Maximum Remaining Path Length), LFT (Latest Finish Time), MNSLCK (Minimum Slack Time), EFT (Earliest Finish Time), and LST (Latest Start Time) priority rules. When the number of resource constraints is low, the performance of MRPL is generally observed to be higher. As the number of resource constraints increases, a decrease in the performance of MRPL is observed, in contrast with an increase in the performance of LFT.
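As a rough illustration of how one of these rules is applied, here is a minimal serial schedule generation scheme using the LFT rule on an invented six-activity project with a single renewable resource (the project data and the serial scheme are our own assumptions; the paper does not specify its generation scheme or instances):

```python
# id: (duration, resource demand, predecessors); 1 and 6 are dummy start/end
acts = {
    1: (0, 0, []), 2: (3, 2, [1]), 3: (4, 3, [1]),
    4: (2, 2, [2]), 5: (2, 1, [3]), 6: (0, 0, [4, 5]),
}
CAP = 4  # units of the single renewable resource available per time unit

# Backward pass from an upper bound on the makespan gives each activity's
# Latest Finish Time (IDs are assumed topologically ordered)
horizon = sum(d for d, _, _ in acts.values())
lft = {a: horizon for a in acts}
for a in sorted(acts, reverse=True):
    for p in acts[a][2]:
        lft[p] = min(lft[p], lft[a] - acts[a][0])

start, usage = {}, [0] * (horizon + 1)
scheduled = set()
while len(scheduled) < len(acts):
    # eligible = all predecessors scheduled; smallest LFT goes first
    eligible = [a for a in acts if a not in scheduled
                and all(p in scheduled for p in acts[a][2])]
    a = min(eligible, key=lambda x: lft[x])
    dur, dem, preds = acts[a]
    # earliest precedence-feasible start, pushed later until resources fit
    t = max((start[p] + acts[p][0] for p in preds), default=0)
    while any(usage[u] + dem > CAP for u in range(t, t + dur)):
        t += 1
    for u in range(t, t + dur):
        usage[u] += dem
    start[a] = t
    scheduled.add(a)

makespan = start[6]  # finish time of the dummy sink activity
```

Swapping the key function in `min(eligible, ...)` for remaining path length, slack, or earliest finish time yields the other rules the paper compares.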

Proceedings ArticleDOI
TL;DR: Some of the critical aspects regarding the conceptual design of such an information system are discussed, and the actual information system developed for Braga is presented, named SmarBRAGA.
Abstract: Evaluating, monitoring and informing about urban environmental quality has become a main issue, particularly important when considered as a decision-making tool that contributes to more habitable and sustainable cities. Following a tendency observed in other European cities, the city of Braga (Portugal) has decided to create an infrastructure for environmental data acquisition and a web-based platform as a public information system. Some of the innovations introduced in this new platform include the use of mobile instrumented units, the extensive use of simulation software to create long-term pollution (air and noise) maps, and the presentation of the information through a geographical interface developed over Google Maps technology. This paper discusses some of the critical aspects regarding the conceptual design of such an information system, and presents the actual information system developed for Braga, named SmarBRAGA.

Proceedings ArticleDOI
TL;DR: To characterise the corrosion behaviour, samples of the heat-treated steel AISI 4140 (42CrMo4), used for casing, and the martensitic stainless injection-pipe steel AISI 420 (X46Cr13) were kept at T=60°C and p=1–60 bar in a CO2-saturated synthetic aquifer environment similar to the geological CCS site at Ketzin, Germany.
Abstract: The CCS technique involves the compression of emission gases into deep geological layers. To guarantee the safety of a site, CO2 corrosion of the injection-pipe steels has to be given special attention when engineering CCS sites. To characterise the corrosion behaviour, samples of the heat-treated steel AISI 4140 (42CrMo4), used for casing, and the martensitic stainless injection-pipe steel AISI 420 (X46Cr13) were kept at T=60°C and p=1–60 bar for 700 h–8000 h in a CO2-saturated synthetic aquifer environment similar to the geological CCS site at Ketzin, Germany. The isothermal corrosion behaviour, obtained from the mass gain of the steels in the gas phase, the liquid phase and the intermediate phase, gives surface corrosion rates of around 0.1 to 0.8 mm/year. Severe pit corrosion, with pit depths of around 4.5 mm, is located only on the AISI 420 steel. The main phase of the continuous, complicated multi-layered carbonate/oxide structure is siderite (FeCO3) in both types of steel.

Journal ArticleDOI
TL;DR: An overview of ongoing research “HandPuppet3D” being carried out in collaboration with an animation studio to employ computer vision techniques to develop a prototype desktop system and associated animation process that will allow an animator to control 3D character animation through the use of hand gestures.
Abstract: Motion capture is a technique of digitally recording the movements of real entities, usually humans. It was originally developed as an analysis tool in biomechanics research, but has grown increasingly important as a source of motion data for computer animation, where it has been widely used for both cinema and video games. Hand motion capture and tracking in particular has received a lot of attention because of its critical role in the design of new Human Computer Interaction methods and in gesture analysis. One of the main difficulties is the capture of human hand motion. This paper gives an overview of ongoing research "HandPuppet3D", being carried out in collaboration with an animation studio, which employs computer vision techniques to develop a prototype desktop system and associated animation process that will allow an animator to control 3D character animation through the use of hand gestures. The eventual goal of the project is to support existing practice by providing a softer, more intuitive user interface for the animator that improves the productivity of the animation workflow and the quality of the resulting animations. To help achieve this goal, the focus has been placed on developing a prototype camera-based desktop gesture capture system that captures hand gestures and interprets them in order to generate and control the animation of 3D character models. Methods are discussed for motion tracking and capture in 3D animation, and in particular for hand motion tracking and capture. HandPuppet3D aims to enable gesture capture, interpretation of the captured gestures, and control of the target 3D animation software. This involves the development and testing of a motion analysis system built from recently developed algorithms.
We review current software and research methods available in this area and describe our current work.

Proceedings ArticleDOI
TL;DR: An electronic system specially designed to evaluate the effect of the driver's activity on polluting emissions is presented; it integrates hardware and software components designed within the MIVECO research project, and experimental results obtained on an urban circuit in the city of Madrid are included.
Abstract: The polluting emissions (gases and particles) produced by automobile traffic are directly related to the activity of the vehicle, but they are also affected by the route conditions and, moreover, by the driver's behavior. However, commercial PAMS systems do not usually include elements to register this last component. This article presents an electronic system specially designed to evaluate the effect of the driver's activity on polluting emissions. This electronic application integrates a hardware and a software component, both designed within the MIVECO research project. The most notable part of the hardware component is the sensor subsystem, formed by potentiometers connected to the pedals that control the vehicle and an inertial device, which allows the evaluation of instantaneous accelerations along the x, y and z axes as well as rotations about these axes. Once the signals are conditioned and acquired, the software component processes them for on-line monitoring in a GUI and stores them in a database to facilitate off-line evaluation. This electronic application has two important properties: it can be incorporated in any vehicle on the market (light or heavy, diesel or gasoline, pre- or post-EOBD), and it allows the capture and registration of information about the driver's activity synchronously with PEMS (gas and/or particle) systems. The work includes experimental results obtained on an urban circuit in the city of Madrid.

Proceedings ArticleDOI
TL;DR: The highest measured concentrations of bioaerosols were associated with composting activities such as shredding and turning, and an unexpected second peak was detected 100-150m downwind from source at both sites.
Abstract: The potential risk to human health posed by exposure to bioaerosols released from composting is an important issue. Further growth in the number of composting facilities in the UK is anticipated as biodegradable waste is diverted from landfill. To date, studies of bioaerosol emission from composting have focussed on culturable bioaerosols. This paper describes both culturable bioaerosol and endotoxin release and dispersal from two large green waste composting facilities in the UK. Aspergillus fumigatus, actinomycetes, Gram-negative bacteria, and endotoxins were simultaneously and repeatedly sampled to describe the release and dispersal from these sites. Meteorological and site operational observations were recorded, allowing analysis of factors influencing bioaerosol release and dispersal. The highest measured concentrations of bioaerosols were associated with composting activities such as shredding and turning. Between release and 50-80m downwind bioaerosol concentrations reduced by 80-90%. An unexpected second peak was detected 100-150m downwind from source at both sites. Endotoxin dispersal patterns were site specific and showed some differences to dispersal patterns of culturable microorganisms.

Journal ArticleDOI
TL;DR: A framework is presented which can unify the ellipsoid and random-walk MCMC methods into a single Markov chain using a trick called Metropolis-coupled MCMC, so that the two chains can validly exchange information with each other.
Abstract: Sampling from a truncated distribution is difficult. There are currently two major methods proposed for solving this task. The first proposed solution is a random-walk MCMC algorithm. Although it eventually gives the correct distribution, it can be very slow in multi-modal distributions. The second approach, called the ellipsoid method, is practically more efficient for problems in which users have good prior information, but its correctness is not guaranteed. In this paper, we present a framework which can unify these two approaches. The key idea is to merge both methods into a single Markov chain using a trick called Metropolis-coupled MCMC. Once merged, they can validly exchange information with each other. Although the chain constructed from the ellipsoid approach cannot be proven to be correct, it usually converges rapidly to a useful stationary distribution, and its information can help the other chain, constructed by the random-walk approach, converge faster to the correct distribution.
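The Metropolis-coupled trick the abstract invokes can be sketched in its classic tempered form: two chains run side by side, one targeting the distribution of interest and one a flattened version that crosses between modes easily, with occasional proposals to swap their states. Note the paper couples a random-walk chain with an ellipsoid-based chain instead of a tempered one; the bimodal toy target and all tuning constants below are our own:

```python
import math
import random

random.seed(1)

def log_pi(x):
    # bimodal toy target: equal mixture of N(-5, 1) and N(+5, 1),
    # computed in a numerically safe log-sum-exp form
    a = -0.5 * (x + 5) ** 2
    b = -0.5 * (x - 5) ** 2
    m = max(a, b)
    return m + math.log(0.5 * math.exp(a - m) + 0.5 * math.exp(b - m))

betas = [1.0, 0.1]   # chain 0 targets pi; chain 1 targets the flatter pi^0.1
xs = [-5.0, 5.0]
steps = [1.0, 5.0]
cold = []
for _ in range(20000):
    for k in (0, 1):
        # ordinary random-walk Metropolis update within each chain
        prop = xs[k] + random.gauss(0.0, steps[k])
        if math.log(random.random()) < betas[k] * (log_pi(prop) - log_pi(xs[k])):
            xs[k] = prop
    # Metropolis-coupled move: propose exchanging the two chains' states
    log_accept = (betas[0] - betas[1]) * (log_pi(xs[1]) - log_pi(xs[0]))
    if math.log(random.random()) < log_accept:
        xs[0], xs[1] = xs[1], xs[0]
    cold.append(xs[0])

# if the coupling works, the cold chain visits both modes, so the fraction
# of samples in the right-hand mode should land near 0.5
frac_right = sum(x > 0 for x in cold) / len(cold)
```

A lone random-walk chain with step size 1 would rarely cross the low-density gap between the modes; the swap moves are what carry the cold chain across, which is the mechanism the paper exploits with its ellipsoid chain.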

Proceedings ArticleDOI
TL;DR: The LEAQ project is reviewed and the criteria used to find a suitable core calculator are discussed, with AUSTAL2000 being the model that best suits the criteria due to its simpler characteristics and faster transport calculator.
Abstract: Air pollution models have been developed over the last few decades, ranging from large detailed models, involving complex physical-chemical phenomena, to less detailed models. Air pollution models can also be grouped according to their scale. The air quality model, AYLTP, to be presented in this paper, aims at a spatial grid specificity that falls outside of the typical air quality scales approach. This model requires a spatial domain of approximately 100 km × 100 km, a spatial grid spacing of approximately 100-500 m, a time step of 10 minutes and a temporal domain of 24 hours. Moreover, AYLTP requires a fast core calculator, as it will be incorporated in the Luxembourg Energy and Air Quality meta-model (LEAQ), which is built in an optimization framework. This paper aims at selecting the most suitable code to serve as a core calculator to be incorporated in AYLTP. A set of criteria was established to carry out an analysis of different open source air quality models suitable for the LEAQ meta-model. The selection of the models was based on a space-time graph. For each model, areas of influence were determined, based on the assumption that for a fixed CPU time, the grid spacing increases with the spatial domain size. Two models, AUSTAL2000 and METRAS, fit the required criteria. The choice between these two models was made according to the model's flexibility in terms of resolution and CPU performance. In this paper we briefly review the LEAQ project and discuss the criteria used to find a suitable core calculator. AUSTAL2000 is the model that better suits the criteria due to its simpler characteristics and faster transport calculator.

Journal ArticleDOI
TL;DR: The computational experiments conducted in this paper demonstrate that the proposed re-assembly algorithm, optimised to re-assemble complete jigsaw puzzles, is not efficient when applied to puzzles with missing pieces, and indicate that no single algorithm can solve the multitude of possible scenarios involved in the re-assembly of incomplete jigsaw puzzles.
Abstract: The jigsaw puzzle re-assembly problem has been investigated only intermittently in the research literature. One potential theoretical line of research concerns jigsaw puzzles that do not have a complete set of puzzle pieces. These incomplete puzzles represent a difficult aspect of this problem that is outlined but cannot be fully resolved in the current research. The computational experiments conducted in this paper demonstrate that the proposed re-assembly algorithm, optimised to re-assemble complete jigsaw puzzles, is not efficient when applied to puzzles with missing pieces. Further work was undertaken to modify the proposed algorithm to enable efficient re-assembly of incomplete jigsaw puzzles. Consequently, a heuristic strategy, termed Empty Slot Prediction, was developed to support the proposed algorithm, and proved successful when applied to certain sub-classes of this problem. The results obtained indicate that no one algorithm can be used to solve the multitude of possible scenarios involved in the re-assembly of incomplete jigsaw puzzles. Other variations of the jigsaw puzzle problem that still remain unsolved are presented as avenues for future research.

Proceedings ArticleDOI
TL;DR: In this paper, the authors used a three dimensional numerical model based on Reynolds-averaged Navier-Stokes equations to simulate the fluid-flow development and pollutant dispersion within an isolated street canyon.
Abstract: Keeping the air quality acceptable has become an important task for decision makers as well as for non-governmental organizations. Particulate and gaseous emissions of pollutants from auto-exhausts are responsible for rising discomfort, increasing airway diseases, decreasing productivity and the deterioration of artistic and cultural patrimony in urban centers. Air quality limit values, which are aimed at protecting public health, are frequently exceeded, especially in streets and other urban hotspots. Within these streets, pedestrians, cyclists, drivers and residents are likely to be exposed to pollutant concentrations exceeding current air quality standards. In order to give the right support to decision makers for air pollution control, a suitable microscale dispersion model must be used to investigate the phenomenon. The paper presents the results obtained by utilizing a three-dimensional numerical model based on Reynolds-averaged Navier–Stokes equations to simulate the fluid-flow development and pollutant dispersion within an isolated street canyon. Finally, the authors tested the reliability of the code by examining resemblances and differences between the measured data coming from a survey measurement within the canyon and the data coming from the code.

Journal ArticleDOI
TL;DR: In this paper, three multimodal neural network models of early child language acquisition are reviewed and it is shown how computational modelling, in conjunction with the availability of empirical data, can contribute towards the understanding of childlanguage acquisition.
Abstract: Current opinion suggests that language is a cognitive process in which different modalities such as perceptual entities, communicative intentions and speech are inextricably linked. As such, the process of child language acquisition is one in which the child learns to decipher this inextricability and to acquire language capabilities starting from gesturing, followed by language dominated by single word utterances, through to full-blown native language capability. In this paper I review three multimodal neural network models of early child language acquisition. Using these models, I show how computational modelling, in conjunction with the availability of empirical data, can contribute towards our understanding of child language acquisition. I conclude this paper by proposing a control theoretic approach towards modelling child language acquisition using neural networks.

Proceedings ArticleDOI
TL;DR: The paper argues for more extended philosophical reflection on Air Pollution (AP), going beyond mere technical issues of monitoring and control into its historical, evolutionary, ecological and philosophical aspects.
Abstract: This paper discusses the import of physical ideas on the management of, and human response to, Air Pollution (AP). It looks into some historical, evolutionary, ecological and philosophical aspects of AP. The paper emphasizes the need for more extended philosophical reflection on AP, which should go beyond mere technical issues of monitoring and control.

Proceedings ArticleDOI
TL;DR: The results show that the average CO and HC concentrations from Mexicali and Tijuana are lower than in the Mexico City Metropolitan Area (MCMA), while the average NO concentration is higher.
Abstract: According to the North American Free Trade Agreement (NAFTA), signed by Canada, the USA and Mexico in 1992, beginning on January 1st, 2009, Mexico may not maintain restrictions on the importation of used cars older than ten years. The increased influx of used vehicles might cause significant changes in the composition of the country's vehicle fleet, and this might increase its contribution to air emissions. Due to this situation and the lack of reliable information for decision-making, in 2007 the National Institute of Ecology of Mexico carried out studies of emissions, activity and composition of the vehicle fleet in the Mexican cities of Mexicali and Tijuana, in Baja California State, which share a border with the USA. Measurements were carried out with an AccuScan RSD3000 remote sensing system for on-road vehicle emissions, to obtain concentrations of carbon monoxide, hydrocarbons and nitric oxide (CO, HC and NO) from exhaust fumes. To determine the activity, composition and technological characteristics of vehicles, surveys were conducted and analyses of databases were made. The results show that the average CO and HC concentrations from Mexicali and Tijuana are lower than in the Mexico City Metropolitan Area (MCMA), while the average NO concentration is higher. This study is the first effort conducted by Mexican environmental authorities to document the impact of imported used cars on emissions.

Proceedings ArticleDOI
TL;DR: In this paper, the authors studied the environmental impact of road transport on the air quality around two selected schools, for a period of two weeks each, using an air pollution monitoring station which continuously recorded various pollutant concentrations and meteorological variables at five-minute intervals.
Abstract: Kuwait, having one of the highest GDPs and the lowest fuel prices, provides an ideal environment for ownership of motorized vehicles. Weather also plays a major role: in the long summer (lasting about nine months), the temperature very often soars to nearly 50 °C, and in the short winter it drops to single-digit values in the early mornings and at night. Road transport is vital for the local inhabitants as the sole means of transport (commuting and transporting goods). In the last decade, the motorized road vehicle fleet has grown significantly, bringing unprecedented mobility to the burgeoning population. With the growth of vehicles, fuel consumption has also increased. Motor vehicles are a critical source of urban air pollution (PM10, CO, CO2, NOx, O3, SO2 and VOCs). Air pollution is a serious health problem and accounts for hundreds of millions of dollars in health care and welfare costs. This paper focuses on the environmental impact of road transport on the air quality around two selected schools, monitored for a period of two weeks each using an air pollution monitoring station which continuously recorded various pollutant concentrations and meteorological variables at five-minute intervals. The results show that for both sites during the weekdays, the measured pollutants emitted from the road traffic next to the selected schools, such as carbon monoxide (CO) and nitrogen dioxide (NO2), were always under the allowable limits of the Kuwaiti air quality standards, except for a single exceedance of the NO2 concentration during morning hours at the governmental school. On the other hand, the values of non-methane hydrocarbon pollutants were found to be several times above the Kuwaiti air quality standards throughout the investigated period. The suspended particulate (PM10) concentrations twice exceeded the limits of the Kuwaiti air quality standards.

Proceedings ArticleDOI
TL;DR: It is apparent that vehicle emission reduction contributed to air quality improvement in Tokyo, and the levelling off of the EC reductions since 2005 may be explained by the results of carbon isotope (14C) analysis, which suggested contributions from biomass combustion sources in addition to vehicle emissions in downtown Tokyo.
Abstract: In Japan, PM regulations on diesel automobile emissions started in 1994. Long-term measurements of suspended particulate matter (SPM, <7 μm), PMfine (<2.1 μm), and PMcoarse (2.1 to 7 μm) were obtained at the urban Kudan site in downtown Tokyo from 1994 to 2004 to evaluate the effects of emission reduction measures. A remarkable downward trend in PM mass was found from 1996 onwards, especially in the PMfine fraction, which decreased at a rate of 2.09 μg m⁻³ yr⁻¹. The decrease in PMfine is attributable to the decrease in elemental carbon (EC), at a rate of 0.82 μg m⁻³ yr⁻¹. PMfine EC concentrations at the roadside Noge site show a threefold faster downward trend, at a rate of 2.56 μg m⁻³ yr⁻¹. This decrease is consistent with fleet penetration of engines and fuels that complied with the stringent Japanese emission reduction limits which began to take effect in 1994. It is apparent that vehicle emission reduction contributed to air quality improvement in Tokyo. The levelling off of the EC reductions since 2005 may be explained by the results of carbon isotope (14C) analysis, which suggested contributions from biomass combustion sources in addition to vehicle emissions in downtown Tokyo.