
Showing papers in "International Journal of Geographical Information Science in 2015"


Journal ArticleDOI
TL;DR: The geographic approach proposed here provides a reliable quantitative indicator of the usefulness of messages from social media by leveraging the existing knowledge about natural hazards such as floods, thus being valuable for disaster management in both crisis response and preventive monitoring.
Abstract: In recent years, social media emerged as a potential resource to improve the management of crisis situations such as disasters triggered by natural hazards. Although there is a growing research body concerned with the analysis of the usage of social media during disasters, most previous work has concentrated on using social media as a stand-alone information source, whereas its combination with other information sources holds a still underexplored potential. This article presents an approach to enhance the identification of relevant messages from social media that relies upon the relations between georeferenced social media messages as Volunteered Geographic Information and geographic features of flood phenomena as derived from authoritative data (sensor data, hydrological data and digital elevation models). We apply this approach to examine the micro-blogging text messages of the Twitter platform (tweets) produced during the River Elbe Flood of June 2013 in Germany. This is performed by means of a statistical analysis aimed at identifying general spatial patterns in the occurrence of flood-related tweets that may be associated with proximity to and severity of flood events. The results show that messages up to 10 km from severely flooded areas have a much higher probability of being related to floods. In this manner, we conclude that the geographic approach proposed here provides a reliable quantitative indicator of the usefulness of messages from social media by leveraging the existing knowledge about natural hazards such as floods, thus being valuable for disaster management in both crisis response and preventive monitoring.
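As a minimal illustration of the proximity idea described above (not the authors' pipeline), the following Python sketch measures the distance from georeferenced messages to a severely flooded polygon and flags those within the 10 km band reported in the article; the polygon, tweets, and keyword flags are hypothetical, and shapely is assumed to be available.

```python
# Hedged sketch: relating georeferenced messages to a flooded area by distance.
# The polygon, tweets, and keyword flags below are hypothetical test data.
from shapely.geometry import Point, Polygon

# Hypothetical severely flooded area (projected coordinates, metres)
flooded = Polygon([(0, 0), (5000, 0), (5000, 3000), (0, 3000)])

# Hypothetical tweets: (id, x, y, mentions a flood keyword)
tweets = [
    ("t1", 2500, 1500, True),     # inside the flooded area
    ("t2", 8000, 1500, True),     # roughly 3 km away
    ("t3", 60000, 40000, False),  # far away
]

THRESHOLD_M = 10_000  # the 10 km band reported in the article

for tid, x, y, has_keyword in tweets:
    d = Point(x, y).distance(flooded)  # 0 if the point lies inside
    print(f"{tid}: distance = {d / 1000:.1f} km, "
          f"within 10 km = {d <= THRESHOLD_M}, keyword = {has_keyword}")
```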

301 citations


Journal ArticleDOI
TL;DR: A review of The Handbook of Spatial Point-Pattern Analysis in Ecology by Thorsten Wiegand and Kirk A. Moloney, an excellent, modern, and detailed overview of the methods and approaches for analyzing spatial point patterns in ecology.
Abstract: The Handbook of Spatial Point-Pattern Analysis in Ecology by Thorsten Wiegand and Kirk A. Moloney is an excellent, modern and detailed overview of the methods and approaches for analyzing location ...

149 citations


Journal ArticleDOI
TL;DR: A new typology for characterizing the role of crowdsourcing in the study of urban morphology is provided by synthesizing recent advancements in the analysis of open-source data, which shows how social media, trajectory, and traffic data can be analyzed to capture the evolving nature of a city’s form and function.
Abstract: Urban form and function have been studied extensively in urban planning and geographical information science. However, gaining a greater understanding of how they merge to define the urban morphology remains a substantial scientific challenge. Toward this goal, this paper addresses the opportunities presented by the emergence of crowdsourced data to gain novel insights into form and function in urban spaces. We focus in particular on information harvested from social media and other open-source and volunteered datasets (e.g. trajectory and OpenStreetMap data). These data provide a first-hand account of form and function from the people who define urban space through their activities. This novel bottom-up approach to studying these concepts complements traditional urban studies to provide a new lens for studying urban activity. By synthesizing recent advancements in the analysis of open-source data, we provide a new typology for characterizing the role of crowdsourcing in the study of urban morphology. We illustrate this new perspective by showing how social media, trajectory, and traffic data can be analyzed to capture the evolving nature of a city’s form and function. While these crowd contributions may be explicit or implicit in nature, they are giving rise to an emerging research agenda for monitoring, analyzing, and modeling form and function for urban design and analysis.

144 citations


Journal ArticleDOI
TL;DR: Methods for delineating indicator points of temporal events referenced as ‘midnight’, ‘morning start’, ‘midday’, and ‘duration of day’ from mobile telephone-based sensor data.
Abstract: This paper proposes a methodology for using mobile telephone-based sensor data to detect spatial and temporal differences in everyday activities in cities. Mobile telephone-based sensor data has great applicability in developing urban monitoring tools and smart city solutions. The paper outlines methods for delineating indicator points of temporal events referenced as ‘midnight’, ‘morning start’, ‘midday’, and ‘duration of day’, which represent the mobile telephone usage of residents (what we call social time) rather than solar or standard time. Density maps by time quartiles were also utilized to test the versatility of this methodology and to analyze the spatial differences in cities. The methodology was tested with data from the cities of Harbin (China), Paris (France), and Tallinn (Estonia). Results show that the developed methods have potential for measuring the distribution of temporal activities in cities and monitoring urban changes with georeferenced mobile phone data.
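The indicator points can be pictured with a small sketch. The definitions below (the activity minimum for 'midnight', the first and last hours above the daily mean for 'morning start' and the day's end) are plausible proxies for illustration only, applied to a synthetic 24-hour activity profile; they are not the authors' exact formulas.

```python
# Hedged sketch of deriving 'social time' indicators from hourly call activity.
# The indicator definitions here are illustrative proxies, not the published ones.
import numpy as np

hours = np.arange(24)
# Synthetic hourly share of mobile phone activity
activity = np.array([1, 1, 1, 1, 1, 2, 4, 7, 9, 10, 10, 9,
                     9, 9, 9, 9, 8, 8, 7, 6, 5, 4, 3, 2], float)
activity /= activity.sum()

midnight = hours[np.argmin(activity)]      # hour of minimum activity
above = hours[activity > activity.mean()]  # hours above the daily mean
morning_start = above.min()                # first such hour
day_end = above.max()                      # last such hour
midday = int(round(above.mean()))          # centre of the active period
duration_of_day = day_end - morning_start + 1

print(midnight, morning_start, midday, duration_of_day)
```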

118 citations


Journal ArticleDOI
TL;DR: The quality procedures used by the platforms that collect VGI to increase and control data quality are reviewed and a framework for addressing VGI quality assessment is proposed.
Abstract: Volunteered Geographic Information (VGI) represents a growing source of potentially valuable data for many applications, including land cover map validation. It is still an emerging field and many different approaches can be used to take value from VGI, but also many pros and cons are related to its use. Therefore, since it is timely to get an overview of the subject, the aim of this article is to review the use of VGI as reference data for land cover map validation. The main platforms and types of VGI that are used and that are potentially useful are analysed. Since quality is a fundamental issue in map validation, the quality procedures used by the platforms that collect VGI to increase and control data quality are reviewed and a framework for addressing VGI quality assessment is proposed. A review of cases where VGI was used as an additional data source to assist in map validation is made, as well as cases where only VGI was used, indicating the procedures used to assess VGI quality and fitness for use. A discussion and some conclusions are drawn on best practices, future potential and the challenges of the use of VGI for land cover map validation.

98 citations


Journal ArticleDOI
TL;DR: The research develops several new algorithms, including one for computing the local kernel of a region, and a compact formal description of the topology and connectivity of the indoor structure represented by a connected, embedded graph.
Abstract: This article proposes a comprehensive approach to computing a navigation graph for an indoor space. It focuses on a single floor, but the work is easily extensible to multi-level spaces. The approach proceeds by using a formal model, based on the combinatorial map but enhanced with geometric and semantic information. The process is almost fully automatic, taking as input the building plans providing the geometric structure of the floors and semantics of the building, such as functions of interior spaces, portals, etc. One of the novel aspects in this work was the use of combinatorial maps and their duals to provide a compact formal description of the topology and connectivity of the indoor structure represented by a connected, embedded graph. While making use of existing libraries for the more routine computational geometry involved, the research develops several new algorithms, including one for computing the local kernel of a region. The process is evaluated by means of a case study using part of a university building.

98 citations


Journal ArticleDOI
TL;DR: A novel approach for generating high-quality routable road maps using a simplified road network graph model that uses circular boundaries to separate all GPS traces into road intersections and road segments and builds road networks that maintain their identical geometric topologies through the entry/exit points at the original boundaries.
Abstract: Public vehicles and personal navigation assistants have become increasingly equipped with single-frequency global positioning system (GPS) receivers or loggers. These commonly used terminals offer an inexpensive way for acquiring large volumes of GPS traces, which contain information pertaining to road position and traffic rules. Using this new type of spatial data resource, we propose a novel approach for generating high-quality routable road maps. In this approach, a simplified road network graph model uses circular boundaries to separate all GPS traces into road intersections and road segments and builds road networks that maintain their identical geometric topologies through the entry/exit points at the original boundaries. One difficulty inherent to this type of approach is how to best determine the appropriate spatial coverage for road intersections. Conflict points among GPS traces that have large intersection angles usually occur within the physical areas of road intersections, particularly those ...

78 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that Zipf's law holds remarkably well for all natural cities at the global level, and it remains almost valid at the continental level except for Africa at certain time instants.
Abstract: Two fundamental issues surrounding research on Zipf’s law regarding city sizes are whether and why this law holds. This paper does not deal with the latter issue with respect to why, and instead investigates whether Zipf’s law holds in a global setting, thus involving all cities around the world. Unlike previous studies, which have mainly relied on conventional census data such as populations and census-bureau-imposed definitions of cities, we adopt naturally delineated cities (delineated in a ‘data speak for itself’ manner), or natural cities to be more precise, in order to examine Zipf’s law. We find that Zipf’s law holds remarkably well for all natural cities at the global level, and it remains almost valid at the continental level except for Africa at certain time instants. We further examine the law at the country level, and note that Zipf’s law is violated from country to country or from time to time. This violation is mainly due to the limitations of our observation scope: being limited to individual countries, or to a static view of city-size distributions. The central argument of this paper is that Zipf’s law is universal, and we therefore must use the correct scope in order to observe it. We further find that Zipf’s law applies to city numbers: the number of cities in the largest country is twice that in the second largest country, three times that in the third largest country, and so on. These findings have profound implications for big data and the science of cities.
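A rank-size regression makes the test concrete. The sketch below fits log(size) against log(rank) on synthetic Pareto-distributed city sizes; Zipf's law corresponds to a slope of about -1. The data are simulated, not the natural-city data used in the article.

```python
# Hedged sketch: testing Zipf's law by fitting the rank-size relation
# log(size) = a - b * log(rank); Zipf's law corresponds to b close to 1.
import numpy as np

rng = np.random.default_rng(42)
# Synthetic Pareto-distributed "city" sizes, largest first
sizes = np.sort(rng.pareto(1.0, 5000) + 1)[::-1]
ranks = np.arange(1, sizes.size + 1)

slope, intercept = np.polyfit(np.log(ranks), np.log(sizes), 1)
print(f"estimated exponent: {-slope:.2f} (Zipf's law predicts ~1.0)")
```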

78 citations


Journal ArticleDOI
TL;DR: Investigating how graph clustering can be applied to support the detection of geo-located communities in Twitter in disaster situations shows that communities that are relevant to identify areas where disaster-related incidents were reported can be extracted and the enhanced algorithm outperforms the generic one in this task.
Abstract: As they increase in popularity, social media are regarded as important sources of information on geographical phenomena. Studies have also shown that people rely on social media to communicate during disasters and emergency situations, and that the exchanged messages can be used to get an insight into the situation. Spatial data mining techniques are one way to extract relevant information from social media. In this article, our aim is to contribute to this field by investigating how graph clustering can be applied to support the detection of geo-located communities in Twitter in disaster situations. For this purpose, we have enhanced the fast-greedy optimization of modularity (FGM) clustering algorithm with semantic similarity so that it can deal with the complex social graphs extracted from Twitter. Then, we have coupled the enhanced FGM with the varied density-based spatial clustering of applications with noise algorithm to obtain spatial clusters at different temporal snapshots. The ...

71 citations


Journal ArticleDOI
TL;DR: A methodology and a planning and design support software tool for evaluating walkability and pedestrian accessibility of places which are relevant for people’s capabilities, and thus an important component of quality of life in cities is presented.
Abstract: We present a methodology and a planning and design support software tool for evaluating walkability and pedestrian accessibility of places which are relevant for people’s capabilities, and thus an important component of quality of life in cities. A multicriteria evaluation model, at the core of the decision support system, is used to assign walkability scores to points in urban space. Walkability scores are obtained through algorithms which process spatial data and run the evaluation model in order to derive potential pedestrian routes along the street network, taking into account the quality of urban space on several attributes relevant for walkability. One of its notable characteristics is a certain reversal of perspective in evaluating walkability: the walkability score of a place does not reflect how walkable that place is per se, but instead how and where one can walk from there, that is to say, what walkability the place is endowed with. This evaluation incorporates three intertwined elements: the number of destinations/opportunities reachable by foot, their walking distances, and the quality of the paths to these destinations. In this article, we furthermore demonstrate possible uses of the support system by reporting and discussing the results of a case-study assessment of a project for Lisbon’s Segunda Circular (Second Ring Road). The software tool is made freely available for download.
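The "reversal of perspective" can be sketched in a few lines: a point's score aggregates the destinations reachable on foot, discounted by distance and by the quality of the connecting streets. The toy graph, linear distance decay, and weights below are hypothetical stand-ins, not the article's multicriteria model.

```python
# Hedged sketch of a "reversed" walkability score: what can one reach on foot
# from a point, at what distance, and along paths of what quality?
import networkx as nx

G = nx.Graph()
# Edges: (node, node, length in metres, quality in [0, 1]) -- all hypothetical
edges = [("A", "B", 200, 0.9), ("B", "C", 300, 0.7),
         ("B", "D", 150, 0.8), ("D", "E", 400, 0.5)]
for u, v, length, q in edges:
    G.add_edge(u, v, length=length, quality=q)

destinations = {"C": 1.0, "E": 1.0}   # opportunity weights
MAX_WALK = 800                        # metres

def walkability(origin):
    dists, paths = nx.single_source_dijkstra(G, origin, weight="length")
    score = 0.0
    for dest, w in destinations.items():
        if dest in dists and dists[dest] <= MAX_WALK:
            path = paths[dest]
            qualities = [G[a][b]["quality"] for a, b in zip(path, path[1:])]
            decay = 1 - dists[dest] / MAX_WALK          # linear distance decay
            score += w * decay * (sum(qualities) / len(qualities))
        # unreachable or too-distant destinations contribute nothing
    return score

print(f"walkability(A) = {walkability('A'):.2f}")
```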

69 citations


Journal ArticleDOI
TL;DR: This paper presents a new DEM pre-processing algorithm for removing the artefact dams created by infrastructure in sites of embankment underpasses as well as enforcing flow along drainage ditches and demonstrated that the least-cost breaching method could reliably enforce drainage pathways while minimizing the impact to the original DEM.
Abstract: The detailed topographic information contained in light detection and ranging (LiDAR) digital elevation models (DEMs) can present significant challenges for modelling surface drainage patterns. These data frequently represent anthropogenic infrastructure, such as road embankments and drainage ditches. While LiDAR DEMs can improve estimates of catchment boundaries and surface flow paths, modelling efforts are often confounded by difficulties associated with incomplete representation of infrastructure. The inability of DEMs to represent embankment underpasses (e.g. bridges, culverts) and the problems with existing automated techniques for dealing with these problems can lead to unsatisfactory results. This is often dealt with by manually modifying LiDAR DEMs to incorporate the effects of embankment underpasses. This paper presents a new DEM pre-processing algorithm for removing the artefact dams created by infrastructure at sites of embankment underpasses, as well as for enforcing flow along drainage ditches. The application of the new algorithm to a large LiDAR DEM of a site in Southwestern Ontario, Canada, demonstrated that the least-cost breaching method used by the algorithm could reliably enforce drainage pathways while minimizing the impact on the original DEM.
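A toy version of least-cost breaching conveys the core idea: from a pit, grow the path that requires the least excavation until a cell lower than the pit is reached, then carve that path. The grid, cost definition, and flat carving below are simplifications for illustration, not the published algorithm (which also enforces ditches and applies a proper gradient).

```python
# Hedged sketch of least-cost breaching on a tiny DEM grid: Dijkstra from a
# pit, with edge cost equal to the excavation needed to lower a cell to the
# pit elevation; the found path is then carved flat.
import heapq
import numpy as np

dem = np.array([[5., 5., 5., 5., 5.],
                [5., 1., 9., 0., 5.],   # pit at (1,1), dam at (1,2), outlet (1,3)
                [5., 5., 5., 5., 5.]])
pit = (1, 1)

def breach(dem, pit):
    rows, cols = dem.shape
    z_pit = dem[pit]
    dist, prev = {pit: 0.0}, {}
    heap = [(0.0, pit)]
    while heap:
        cost, cell = heapq.heappop(heap)
        if dem[cell] < z_pit:                 # reached a lower outlet
            while cell in prev:               # carve the breach path
                cell = prev[cell]
                dem[cell] = min(dem[cell], z_pit)
            return True
        r, c = cell
        for dr, dc in ((0, 1), (0, -1), (1, 0), (-1, 0)):
            nb = (r + dr, c + dc)
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols:
                step = cost + max(0.0, dem[nb] - z_pit)  # excavation needed
                if step < dist.get(nb, np.inf):
                    dist[nb], prev[nb] = step, cell
                    heapq.heappush(heap, (step, nb))
    return False

breach(dem, pit)
print(dem)   # the artefact dam at (1, 2) is lowered to the pit elevation
```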

Journal ArticleDOI
TL;DR: The definition of a level of detail LOD2+ is presented, which extends the CityGML LOD2 specification with indoor building geometries of comparable complexity to their exterior geometries in LOD2, together with a method for automatically generating such indoor geometries based on existing CityGML LOD2 exterior geometries.
Abstract: In this paper we present two contributions: (i) the definition of a level of detail LOD2+, which extends the CityGML LOD2 specification with indoor building geometries of comparable complexity to their exterior geometries in LOD2; and, more importantly, (ii) a method for automatically generating such indoor geometries based on existing CityGML LOD2 exterior geometries. We validate our method by generating LOD2+ models for a subset of the Rotterdam 3D data set and visually comparing these models to their real counterparts in building blueprints and imagery from Google Street View and Bing Maps. Furthermore, we use the LOD2+ models to compute the net internal area of each dwelling and validate our results by comparing these values to the ones registered in official government data sets.

Journal ArticleDOI
TL;DR: A novel indoor positioning system (IPS) that is combined with an outdoor positioning system to support seamless indoor and outdoor navigation and wayfinding and two end-user mobile applications that allow users to interact with the campus through an augmented reality interface are developed and deployed.
Abstract: A Smart City relies on six key factors: Smart Governance, Smart People, Smart Economy, Smart Environment, Smart Living and Smart Mobility. This paper focuses on Smart Mobility by improving one of its key components: positioning. We developed and deployed a novel indoor positioning system (IPS) that is combined with an outdoor positioning system to support seamless indoor and outdoor navigation and wayfinding. The positioning system is implemented as a service in our broader cartography-based smart university platform, called SmartUJI, which centralizes access to a diverse collection of campus information and provides basic and complex services for the Universitat Jaume I (Spain), which serves as a surrogate of a small city. Using our IPS and based on the SmartUJI services, we developed, deployed and evaluated two end-user mobile applications: the SmartUJI APP, which allows users to obtain map-based information about the different facilities of the campus, and the SmartUJI AR, which allows users to interact with the campus through an augmented reality interface. Students, university staff and visitors who tested the applications reported their usefulness in locating university facilities and generally improving spatial orientation.

Journal ArticleDOI
TL;DR: The extension of the theory of error propagation in GIS from 2D to 3D models using a Monte Carlo simulation and the implementation of a CityGML-compliant software for estimating the solar irradiation of roofs are investigated.
Abstract: While error propagation in GIS is a topic that has received a lot of attention, it has not been researched with 3D GIS data. We extend error propagation to 3D city models using a Monte Carlo simulation on a use case of annual solar irradiation estimation of building rooftops for assessing the efficiency of installing solar panels. Besides investigating the extension of the theory of error propagation in GIS from 2D to 3D, this paper presents the following contributions. We (1) introduce varying XY/Z accuracy levels of the geometry to reflect actual acquisition outcomes; (2) run experiments on multiple accuracy classes (121 in total); (3) implement an uncertainty engine for simulating acquisition positional errors to procedurally modelled synthetic buildings; (4) perform the uncertainty propagation analysis on multiple levels of detail (LODs); and (5) implement Solar3Dcity, a CityGML-compliant software for estimating the solar irradiation of roofs, which we use in our experiments. The results show that in the case of the city of Delft in the Netherlands, a 0.3/0.6 m positional uncertainty yields an error of 68 kWh/m2/year (10%) in solar irradiation estimation. Furthermore, the results indicate that the planar and vertical uncertainties have a different influence on the estimations, and that the results are comparable between LODs. In the experiments we use procedural models, implying that analyses are carried out in a controlled environment where results can be validated. Our uncertainty propagation method and the framework are applicable to other 3D GIS operations and/or use cases. We released Solar3Dcity as open-source software to support related research efforts in the future.
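The Monte Carlo scheme itself is compact, as the sketch below shows: perturb the geometry with the stated positional uncertainty many times and observe the spread of the derived quantity. Here solar_irradiation() is a crude hypothetical proxy, not the Solar3Dcity API, and the roof is a synthetic flat rectangle.

```python
# Hedged sketch of Monte Carlo error propagation: perturb building geometry
# with a given positional uncertainty and report the spread of a derived value.
import numpy as np

rng = np.random.default_rng(0)

def solar_irradiation(roof_vertices):
    """Hypothetical proxy: yield drops as vertex noise roughens the roof."""
    tilt_penalty = np.ptp(roof_vertices[:, 2]) / 10.0
    return 1000.0 * max(0.0, 1.0 - tilt_penalty)   # kWh/m2/year

roof = np.array([[0, 0, 10.0], [8, 0, 10.0], [8, 6, 10.0], [0, 6, 10.0]])
sigma_xy, sigma_z = 0.3, 0.6   # the 0.3/0.6 m accuracy class from the article

results = []
for _ in range(10_000):
    noisy = roof.copy()
    noisy[:, :2] += rng.normal(0, sigma_xy, (4, 2))   # planar error
    noisy[:, 2] += rng.normal(0, sigma_z, 4)          # vertical error
    results.append(solar_irradiation(noisy))

print(f"mean = {np.mean(results):.0f}, std = {np.std(results):.0f} kWh/m2/year")
```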

Journal ArticleDOI
TL;DR: A localized contour tree method that is able to fully exploit high-resolution topographical data for detecting, delineating, and characterizing surface depressions across scales with a multitude of geometric and topological properties is developed.
Abstract: Surface depressions are abundant in topographically complex landscapes, and they exert significant influences on hydrological, ecological, and biogeochemical processes at local and regional scales. The increasing availability of high-resolution topographical data makes it possible to resolve small surface depressions. By analogy with the reasoning process of a human interpreter to visually recognize surface depressions from a topographic map, we developed a localized contour tree method that is able to fully exploit high-resolution topographical data for detecting, delineating, and characterizing surface depressions across scales with a multitude of geometric and topological properties. In this research, we introduce a new concept ‘pour contour’ and a graph theory-based contour tree representation for the first time to tackle the surface depression detection and delineation problem. Beyond the depression detection and filling addressed in the previous raster-based methods, our localized contour tree method derives the location, perimeter, surface area, depth, spill elevation, storage volume, shape index, and other geometric properties for all individual surface depressions, as well as the nested topological structures for complex surface depressions. The combination of various geometric properties and nested topological descriptions provides comprehensive and essential information about surface depressions across scales for various environmental applications, such as fine-scale ecohydrological modeling, limnological analyses, and wetland studies. Our application example demonstrated that our localized contour tree method is functionally effective and computationally efficient.
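The nesting step of the contour-tree idea can be sketched with polygon containment: each closed contour's parent is the smallest contour containing it, and leaf nodes correspond to depression bottoms. The contours below are hypothetical, the contour extraction itself (from a DEM) is omitted, and shapely and networkx are assumed; this is not the authors' implementation.

```python
# Hedged sketch of building a contour nesting tree from closed contours.
from shapely.geometry import Polygon
import networkx as nx

# Hypothetical closed contours: (elevation, polygon)
contours = [
    (98, Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])),   # outermost
    (96, Polygon([(1, 1), (4, 1), (4, 4), (1, 4)])),       # depression A
    (95, Polygon([(2, 2), (3, 2), (3, 3), (2, 3)])),       # nested in A
    (96, Polygon([(6, 6), (9, 6), (9, 9), (6, 9)])),       # depression B
]

tree = nx.DiGraph()
for i, (zi, poly_i) in enumerate(contours):
    tree.add_node(i, z=zi)
    # parent = smallest contour strictly containing this one
    parents = [j for j, (zj, poly_j) in enumerate(contours)
               if i != j and poly_j.contains(poly_i)]
    if parents:
        parent = min(parents, key=lambda j: contours[j][1].area)
        tree.add_edge(parent, i)

for leaf in [n for n in tree if tree.out_degree(n) == 0]:
    print("depression bottom at contour", leaf, "z =", tree.nodes[leaf]["z"])
```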

Journal ArticleDOI
TL;DR: The present article reports the results of a detailed sensitivity analysis of an irregular CA model of urban land use dynamics implemented to simulate urban growth in Central Texas, USA, and confirmed the model’s sensitivity to neighborhood configurations.
Abstract: The neighborhood definition, which determines the influence on a cell from its nearby cells within a localized region, plays a critical role in the performance of a cellular automaton (CA) model. Raster CA models use a cellular grid to represent geographic space, and are sensitive to the cell size and neighborhood configuration. However, the sensitivity of vector-based CAs, an alternative to the raster-based counterpart, to neighborhood type and size remains uninvestigated. The present article reports the results of a detailed sensitivity analysis of an irregular CA model of urban land use dynamics. The model uses parcel data at the cadastral scale to represent geographic space, and was implemented to simulate urban growth in Central Texas, USA. Thirty neighborhood configurations defined by types and sizes were considered in order to examine the variability in the model outcome. Results from accuracy assessments and landscape metrics confirmed the model’s sensitivity to neighborhood configurations. Furthermore, the centroid-intercepted neighborhood with a buffer of 120 m produced the most accurate simulation result. This neighborhood produced scattered development, while the centroid extent-wide neighborhood resulted in a clustered development predominantly near the city center.
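A centroid-and-buffer neighborhood of the kind compared above can be sketched directly on parcel geometries; the parcels below are hypothetical, shapely is assumed, and the 120 m buffer echoes the best-performing configuration reported in the abstract.

```python
# Hedged sketch of a vector CA neighborhood: a parcel's neighbors are the
# parcels intersecting a buffer around its centroid.
from shapely.geometry import Polygon

parcels = {
    "p1": Polygon([(0, 0), (100, 0), (100, 100), (0, 100)]),
    "p2": Polygon([(110, 0), (210, 0), (210, 100), (110, 100)]),
    "p3": Polygon([(500, 500), (600, 500), (600, 600), (500, 600)]),
}

def neighborhood(pid, buffer_m=120):
    zone = parcels[pid].centroid.buffer(buffer_m)
    return [q for q, geom in parcels.items() if q != pid and geom.intersects(zone)]

print(neighborhood("p1"))   # ['p2']; p3 lies beyond the buffer
```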

Journal ArticleDOI
TL;DR: A geostatistical method for multivariate sampling design optimization, using a universal cokriging (UCK) model, is presented, demonstrating that the UCK model-based sampling method can consider the relationship of target variables and environmental covariates, and spatial auto- and cross-correlation of regression residuals, to obtain the optimal design in geographic space and attribute space simultaneously.
Abstract: Optimal selection of observation locations is an essential task in designing an effective ecohydrological process monitoring network, which provides information on ecohydrological variables by capturing their spatial variation and distribution. This article presents a geostatistical method for multivariate sampling design optimization, using a universal cokriging (UCK) model. The approach is illustrated by the design of a wireless sensor network (WSN) for monitoring three ecohydrological variables (land surface temperature, precipitation and soil moisture) in the Babao River basin of China. After removal of spatial trends in the target variables by multiple linear regression, variograms and cross-variograms of regression residuals are fit with the linear model of coregionalization. Using weighted mean UCK variance as the objective function, the optimal sampling design is obtained using a spatially simulated annealing algorithm. The results demonstrate that the UCK model-based sampling method can consider the relationship of target variables and environmental covariates, and spatial auto- and cross-correlation of regression residuals, to obtain the optimal design in geographic space and attribute space simultaneously. Compared with a sampling design without consideration of the multivariate (cross-)correlation and spatial trend, the proposed sampling method reduces prediction error variance. The optimized WSN design is efficient in capturing spatial variation of the target variables and for monitoring ecohydrological processes in the Babao River basin.
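A generic spatially simulated annealing loop can be sketched in a few lines: perturb one sensor location at a time and accept worse designs with a temperature-dependent probability. The objective below is a crude stand-in (mean squared nearest-sensor distance over a candidate grid), not the weighted mean UCK variance minimized in the article; the grid, cooling schedule, and design size are all hypothetical.

```python
# Hedged sketch of spatially simulated annealing for sampling design.
# objective() is a stand-in for the weighted mean UCK variance.
import math, random

random.seed(1)
candidates = [(x, y) for x in range(20) for y in range(20)]  # candidate cells

def objective(design):
    # Prediction tends to be poor far from any sensor, so penalise large
    # nearest-sensor distances averaged over all candidate cells.
    return sum(min((p[0] - s[0]) ** 2 + (p[1] - s[1]) ** 2 for s in design)
               for p in candidates) / len(candidates)

design = random.sample(candidates, 10)        # initial 10-sensor design
temp, current = 1.0, objective(design)
for _ in range(2000):
    temp *= 0.999                             # geometric cooling schedule
    trial = design.copy()
    trial[random.randrange(len(trial))] = random.choice(candidates)
    cost = objective(trial)
    # accept improvements always, worse designs with probability exp(-dE/T)
    if cost < current or random.random() < math.exp((current - cost) / max(temp, 1e-9)):
        design, current = trial, cost

print(f"final objective: {current:.2f}")
```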

Journal ArticleDOI
TL;DR: It is concluded that ANN was the best method to represent landslide susceptibility throughout the study area with an acceptable processing time.
Abstract: The purpose of this study was to investigate the capabilities of different landslide susceptibility methods by comparing their results statistically and spatially to select the best method that portrays the susceptibility zones for the Ulus district of the Bartin province (northern Turkey). Susceptibility maps based on spatial regression (SR), linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), logistic regression (LR) method, and artificial neural network method (ANN) were generated, and the effect of each geomorphological parameter was determined. The landslide inventory map digitized from previous studies was used as a base map for landslide occurrence. All of the analyses were implemented with respect to landslides classified as rotational, active, and deeper than 5 m. Three different sets of data were used to produce nine explanatory variables (layers). The study area was divided into grids of 90 m × 90 m, and the ‘seed cell’ technique was applied to obtain statistically balanc...
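For a sense of how such a comparison is set up, the sketch below cross-validates the model families named in the abstract (LR, LDA, QDA, ANN) on synthetic data with nine predictor variables. The data are simulated, and the spatial regression method, which scikit-learn does not provide off the shelf, is omitted.

```python
# Hedged sketch: comparing susceptibility model families with scikit-learn.
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import (LinearDiscriminantAnalysis,
                                           QuadraticDiscriminantAnalysis)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for nine explanatory layers and a landslide inventory
X, y = make_classification(n_samples=2000, n_features=9, n_informative=6,
                           random_state=0)

models = {
    "LR":  LogisticRegression(max_iter=1000),
    "LDA": LinearDiscriminantAnalysis(),
    "QDA": QuadraticDiscriminantAnalysis(),
    "ANN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: AUC = {auc:.3f}")
```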

Journal ArticleDOI
TL;DR: Simulation results demonstrate the feasibility and practicability of applying CS algorithm to discover transition rules of CA for simulating geographical systems and show that the CS-CA model gets a higher accuracy than NULL, BCO-CA, PSO- CA, and ACO-CA models.
Abstract: This paper presents an intelligent approach to discover transition rules for cellular automata (CA) by using the cuckoo search (CS) algorithm. The CS algorithm is a novel evolutionary search algorithm for solving optimization problems by simulating the breeding behavior of parasitic cuckoos. Each cuckoo searches for the best upper and lower thresholds for each attribute as a zone. When the zones of all attributes are connected by the operator ‘And’ and linked with a cell status value, one CS-based transition rule is formed using the explicit expression of ‘if-then’. With the two distinct advantages of the efficient random walk of Levy flights and balanced mixing, the CS algorithm performs well in both local search and guaranteed global convergence. Furthermore, the CA model with transition rules derived by the CS algorithm (CS-CA) has been applied to simulate the urban expansion of Nanjing City, China. The simulation produces encouraging results, in terms of numeric accuracy and spatial distribution, in agreement with the actual patterns. Preliminary results suggest that this CS approach is well suited for discovering reliable transition rules. The model validation and comparison show that the CS-CA model achieves a higher accuracy than the NULL, BCO-CA, PSO-CA, and ACO-CA models. Simulation results demonstrate the feasibility and practicability of applying the CS algorithm to discover transition rules of CA for simulating geographical systems.
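The cuckoo search loop itself is generic, as the sketch below illustrates on a toy objective; Levy steps are drawn with Mantegna's algorithm. In the article each solution would encode attribute thresholds for a CA transition rule rather than this sphere function, and all parameters here are conventional defaults, not the authors' settings.

```python
# Hedged sketch of cuckoo search with Levy flights (Mantegna's algorithm).
import numpy as np
from math import gamma, pi, sin

rng = np.random.default_rng(3)
DIM, NESTS, PA, BETA = 5, 15, 0.25, 1.5

def levy_step(size):
    sigma = (gamma(1 + BETA) * sin(pi * BETA / 2) /
             (gamma((1 + BETA) / 2) * BETA * 2 ** ((BETA - 1) / 2))) ** (1 / BETA)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / BETA)

def fitness(x):                       # toy objective: minimise the sphere
    return np.sum(x ** 2)

nests = rng.uniform(-5, 5, (NESTS, DIM))
scores = np.array([fitness(n) for n in nests])
for _ in range(500):
    best = nests[np.argmin(scores)]
    for i in range(NESTS):
        trial = nests[i] + 0.01 * levy_step(DIM) * (nests[i] - best)
        if fitness(trial) < scores[i]:
            nests[i], scores[i] = trial, fitness(trial)
    # abandon a fraction PA of the worst nests (discovered parasitised eggs)
    worst = np.argsort(scores)[-int(PA * NESTS):]
    nests[worst] = rng.uniform(-5, 5, (len(worst), DIM))
    scores[worst] = [fitness(n) for n in nests[worst]]

print(f"best fitness: {scores.min():.4f}")
```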

Journal ArticleDOI
TL;DR: This article proposes an algorithm named Greedy Randomized Adaptive Search Procedure for Unsupervised Trajectory Segmentation (GRASP-UTS), which is a meta-heuristic that builds segments by modifying the number and positions of landmarks.
Abstract: An important problem in the knowledge discovery of trajectories is segmentation in subparts (subtrajectories). Existing algorithms for trajectory segmentation generally use explicit criteria to create segments. In this article, we propose segmenting trajectories using a novel, unsupervised approach, in which no explicit criteria are predetermined. To achieve this, we apply the Minimum Description Length (MDL) principle, which can measure homogeneity in the trajectory data by computing the similarities between landmarks (i.e. representative points of the trajectory) and the points in their neighborhood. Based on the homogeneity measurements, we propose an algorithm named Greedy Randomized Adaptive Search Procedure for Unsupervised Trajectory Segmentation (GRASP-UTS), which is a meta-heuristic that builds segments by modifying the number and positions of landmarks. We perform experiments with GRASP-UTS in two real-world datasets, using segment purity and coverage metrics to evaluate its efficiency. Experime...

Journal ArticleDOI
TL;DR: This article proposes a geo-distance-based method of detecting communities in spatially constrained networks to identify communities that are both highly topologically connected and spatially clustered, based on the fast modularity maximisation (CNM) algorithm.
Abstract: One feature discovered in the study of complex networks is community structure, in which vertices are gathered into several groups where more edges exist within groups than between groups. Many approaches have been developed for identifying communities; these approaches essentially segment networks based on topological structure or the attribute similarity of vertices, while few approaches consider the spatial character of the networks. Many complex networks are spatially constrained such that the vertices and edges are embedded in space. In geographical space, nearer objects are more related than distant objects. Thus, the relations among vertices are defined not only by the links connecting them but also by the distance between them. In this article, we propose a geo-distance-based method of detecting communities in spatially constrained networks to identify communities that are both highly topologically connected and spatially clustered. The algorithm is based on the fast modularity maximisation (CNM) algorithm. First, we modify the modularity to geo-modularity Qgeo by introducing an edge weight that is the inverse of the geographic distance to the power of n. Then, we propose the concept of a spatial clustering coefficient as a measure of clustering of the network to determine the power value n of the distance. The algorithm is tested with the China air transport network and the BrightKite social network data-sets. The segmentation of the China air transport network is similar to the seven economic regions of China. The segmentation of the BrightKite social network shows the regionality of social groups and identifies the dynamic social groups that reflect users’ location changes. The algorithm is useful in exploring the interaction and clustering properties of geographical phenomena and providing timely location-based services for a group of people.
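The geo-modularity idea can be tried in miniature: weight each edge by the inverse of geographic distance to the power n, then run a modularity-based community algorithm. In the sketch below, networkx's greedy modularity method stands in for CNM (which it follows), the toy coordinates and edges are hypothetical, and n is fixed rather than selected via the spatial clustering coefficient as in the article.

```python
# Hedged sketch of distance-weighted (geo-modularity style) community detection.
import math
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

coords = {"A": (0, 0), "B": (1, 0), "C": (0, 1), "D": (10, 10), "E": (11, 10)}
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")]
n = 2   # distance-decay exponent (chosen arbitrarily here)

G = nx.Graph()
for u, v in edges:
    d = math.dist(coords[u], coords[v])
    G.add_edge(u, v, weight=1 / d ** n)   # inverse distance to the power n

communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])   # expected: [A, B, C] and [D, E]
```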

Journal ArticleDOI
TL;DR: A ground-up approach to explaining dynamic social modelling for an interdisciplinary audience and an important way to study systems...
Abstract: Authors David O’Sullivan and George Perry have done a stellar job in building a methodological and conceptual architecture for using simulation to explore spatial pattern and process. While this bo...

Journal ArticleDOI
TL;DR: An ensemble-urban cellular automata (Ensemble-CA) model to achieve better transition rules; static validation confirmed that this ensemble framework achieves better performance in terms of receiver operating characteristic (ROC) statistics and outperforms the best single model.
Abstract: Transition rules are the core of urban cellular automata (CA) models. Although the logistic cellular automata (Logistic-CA) model is commonly used for rule extraction, it cannot always achieve satisfactory performance because of the spatial heterogeneity and the inherent complexity of urban expansion. This article presents an ensemble-urban cellular automata (Ensemble-CA) model to achieve better transition rules. First, an uncertainty map that assesses the performance of transition rules spatially was obtained. Then, two auxiliary models (classification and regression tree, CART; and artificial neural network, ANN), both of which have been stabilized with a Bagging algorithm, were prepared for integration using a proposed self-adaptive k-nearest neighbors (k-NN) combination algorithm. Thereafter, those unconfident sites were replaced with the ensemble output. This model was applied to Guangzhou, China, for an urban growth simulation from 2003 to 2008. Static validation confirmed that this ensemble framework (i.e. without substitution of uncertain sites) can achieve better performance (0.87) in terms of the area under the curve (AUC) of the receiver operating characteristic (ROC), and outperformed the best single model (ANN, 0.82) and other common strategies (e.g. weighted average, 0.83). After the substitution of unconfident sites, the AUC of Logistic-CA was elevated from 0.78 to 0.81. Subsequently, two urban growth mechanisms (pixel- and patch-based) were implemented separately based on the integrated transition rules. Experimental results revealed that the accuracy obtained from simulation of the Ensemble-CA increased considerably. The obtained kappa outperformed the single model, with improvements of 1.74% and 2.76% for pixel- and patch-based approaches, respectively. Correspondingly, landscape similarity index (LSI) improvements of these two mechanisms were 4.24% and 1.82%.

Journal ArticleDOI
TL;DR: A random trajectory generator (RTG) algorithm that combines the concepts of random walks, space-time prisms, and the Brownian bridge movement model and is capable of efficiently generating random trajectories between a given origin and a destination point, with the least directional bias possible is proposed.
Abstract: For applications in animal movement, we propose a random trajectory generator (RTG) algorithm that combines the concepts of random walks, space-time prisms, and the Brownian bridge movement model, and is capable of efficiently generating random trajectories between a given origin and a destination point, with the least directional bias possible. Since we provide both a planar and a spherical version of the algorithm, it is suitable for simulating trajectories ranging from the local scale up to the inter-continental scale, as exemplified by the movement of migrating birds. The algorithm accounts for physical limitations, including maximum speed and maximum movement time, and provides the user with either single or multiple trajectories as a result. Single trajectories generated by the RTG algorithm can be used as a null model to test hypotheses about movement stimuli, while the multiple trajectories can be used to create a probability density surface akin to Brownian bridges.
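A planar toy version conveys the space-time prism constraint: at each step, sample a point reachable from the current position within one time step that can still reach the destination in the remaining time. The rejection-sampling scheme and all parameters below are illustrative simplifications, not the published RTG algorithm.

```python
# Hedged sketch of a prism-constrained random trajectory between two points.
import math, random

random.seed(7)

def random_trajectory(origin, dest, n_steps, v_max, total_time):
    dt = total_time / n_steps
    traj, pos = [origin], origin
    for k in range(1, n_steps):
        t_left = total_time - k * dt
        for _ in range(1000):  # rejection sampling inside the prism
            ang = random.uniform(0, 2 * math.pi)
            r = v_max * dt * math.sqrt(random.random())   # uniform in a disc
            cand = (pos[0] + r * math.cos(ang), pos[1] + r * math.sin(ang))
            if math.dist(cand, dest) <= v_max * t_left:   # can still arrive
                pos = cand
                break
        traj.append(pos)
    traj.append(dest)
    return traj

path = random_trajectory((0, 0), (10, 0), n_steps=20, v_max=2.0, total_time=10.0)
print(len(path), "points; first:", path[0], "last:", path[-1])
```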

Journal ArticleDOI
TL;DR: This paper defines wholeness as a hierarchical graph, in which individual centers are represented as the nodes and their relationships as the directed links, and suggests that the hierarchical levels, or the ht-index of the PR scores induced by the head/tail breaks, can characterize the degree of wholeness for the whole.
Abstract: According to Christopher Alexander’s theory of centers, a whole comprises numerous, recursively defined centers for things or spaces surrounding us. Wholeness is a type of global structure or life-giving order emerging from the whole as a field of the centers. The wholeness is an essential part of any complex system and exists, to some degree or other, in spaces. This paper defines wholeness as a hierarchical graph, in which individual centers are represented as the nodes and their relationships as the directed links. The hierarchical graph gets its name from the inherent scaling hierarchy revealed by the head/tail breaks, which is a classification scheme and visualization tool for data with a heavy-tailed distribution. We suggest (1) that the degrees of wholeness for individual centers should be measured by PageRank (PR) scores, based on the notion that high-degree-of-life centers are those to which many high-degree-of-life centers point, and (2) that the hierarchical levels, or the ht-index of the PR scores induced by the head/tail breaks, can characterize the degree of wholeness for the whole: the higher the ht-index, the more life or wholeness in the whole. Three case studies applied to the Alhambra building complex and the street networks of Manhattan and Sweden illustrate that the defined wholeness captures fairly well human intuitions on the degree of life for the geographic spaces. We further suggest that the mathematical model of wholeness be an important model of geographic representation, because it is topologically oriented, which enables us to see the underlying scaling structure. The model can guide geodesign, which should be considered as wholeness-extending transformations that are essentially like the unfolding processes of seeds or embryos, for creating built and natural environments of beauty or with a high degree of wholeness.
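The two suggestions can be exercised on a toy graph: PageRank scores the centers, and head/tail breaks on those scores yields the ht-index. The graph below is a hypothetical "smaller centers point to the larger centers containing them" structure, and the 40% head threshold is one common choice for head/tail breaks, not necessarily the authors' setting.

```python
# Hedged sketch: PageRank as a degree-of-wholeness score, plus head/tail
# breaks on the scores to obtain the ht-index.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([("room1", "wing"), ("room2", "wing"), ("room3", "wing"),
                  ("wing", "building"), ("court", "building"),
                  ("building", "complex"), ("garden", "complex")])

pr = nx.pagerank(G)

def ht_index(values):
    """Head/tail breaks: split at the mean while the head stays a minority;
    the ht-index is the number of hierarchical levels obtained."""
    index, vals = 1, list(values)
    while len(vals) > 1:
        mean = sum(vals) / len(vals)
        head = [v for v in vals if v > mean]
        if not head or len(head) / len(vals) >= 0.4:   # head no longer a minority
            break
        vals, index = head, index + 1
    return index

print({k: round(v, 3) for k, v in pr.items()})
print("ht-index of PR scores:", ht_index(pr.values()))
```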

Journal ArticleDOI
TL;DR: A Multicriteria Spatial Decision Support System to identify shelters and emergency service locations in urban evacuation planning and demonstrates the applicability of the proposed method and the efficacy of the procedures and algorithms in an earthquake emergency service station planning case study in the city of Tehran.
Abstract: Earthquakes occurring in urban areas constitute an important concern for emergency management and rescue services. Emergency service location problems may be formulated in discrete space or by restricting the potential locations to a specified finite set of points in continuous space. We propose a Multicriteria Spatial Decision Support System to identify shelters and emergency service locations in urban evacuation planning. The proposed system has emerged as an integration of geographical information systems (GIS) and the multicriteria decision-making method Preference Ranking Organization Method for Enrichment Evaluation IV (PROMETHEE IV). This system incorporates multiple and often conflicting criteria and decision-makers’ preferences into a spatial decision model. We consider three standard structural attributes (durability density, population density, and oldness density) in the form of spatial maps to determine the zones most vulnerable to an earthquake. The information on these spatial maps is then entered into the ArcGIS software to define the relevant scores for each point with regard to the aforementioned attributes. These scores are used to compute the preference functions in PROMETHEE IV, whose net outranking flow for each alternative is inputted into ArcGIS to determine the zones that are most vulnerable to an earthquake. The final scores obtained are integrated into a mathematical programming model designed to find the most suitable locations for the construction of emergency service stations. We demonstrate the applicability of the proposed method and the efficacy of the procedures and algorithms in an earthquake emergency service station planning case study in the city of Tehran.
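The net-flow computation at the heart of PROMETHEE can be sketched compactly. Below, a linear preference function with threshold p stands in for the functions actually parameterised in the study, and the scores and weights are hypothetical.

```python
# Hedged sketch of PROMETHEE-style net outranking flows for candidate sites.
import numpy as np

# rows: candidate sites, columns: criteria (all to be maximised here)
scores = np.array([[0.8, 0.3, 0.5],
                   [0.6, 0.7, 0.4],
                   [0.2, 0.9, 0.9]])
weights = np.array([0.5, 0.3, 0.2])   # hypothetical criterion weights
p = 0.5                               # linear preference threshold

n = scores.shape[0]
phi = np.zeros(n)
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        d = scores[a] - scores[b]
        pi_ab = np.dot(weights, np.clip(d / p, 0, 1))    # preference a over b
        pi_ba = np.dot(weights, np.clip(-d / p, 0, 1))   # preference b over a
        phi[a] += (pi_ab - pi_ba) / (n - 1)              # net flow contribution

print("net flows:", np.round(phi, 3), "best site:", int(np.argmax(phi)))
```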

Journal ArticleDOI
TL;DR: The method proposed by this study improves the accuracy in analyzing and predicting human movement and lays the foundation for related urban studies.
Abstract: Human mobility patterns can provide valuable information in understanding the impact of human behavioral regularities in urban systems, usually with a specific focus on traffic prediction, public health or urban planning. While existing studies on human movement have placed huge emphasis on spatial location to predict where people go next, the time dimension component is usually treated with oversimplification or even neglected. The time dimension is crucial to understanding and detecting changes in human activity, which play a negative role in prediction and thus may affect predictive accuracy. This study aims to predict human movement from a spatio-temporal perspective by taking into account the impact of activity changes. We analyze and define changes of human activity and propose an algorithm to detect such changes, based on which a Markov chain model is used to predict human movement. The Microsoft GeoLife dataset is used to test our methodology, and the data of two selected users is used to evaluate the performance of the prediction. We compare the predictive accuracy (R2) derived from the data with and without implementing the activity change detection. The results show that the R2 is improved from 0.295 to 0.762 for the user with obvious activity changes and from 0.965 to 0.971 for the user without obvious activity changes. The method proposed by this study improves the accuracy in analyzing and predicting human movement and lays the foundation for related urban studies.
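The Markov chain component is easy to sketch: count transitions between visited locations and predict the most probable successor. The visit sequence below is hypothetical, and the article's activity-change detection (which would restrict training to the current activity regime) is omitted.

```python
# Hedged sketch of first-order Markov prediction of the next visited location.
from collections import Counter, defaultdict

visits = ["home", "work", "lunch", "work", "home", "gym",
          "home", "work", "lunch", "work", "home"]

transitions = defaultdict(Counter)
for cur, nxt in zip(visits, visits[1:]):
    transitions[cur][nxt] += 1          # count observed transitions

def predict_next(location):
    counts = transitions[location]
    if not counts:
        return None
    best, freq = counts.most_common(1)[0]
    return best, freq / sum(counts.values())   # (prediction, probability)

print(predict_next("work"))   # most probable next place after 'work'
```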

Journal ArticleDOI
TL;DR: This study presents a massively parallel spatial computing approach that uses general-purpose graphics processing units (GPUs) to accelerate Ripley’s K function for univariate spatial point pattern analysis.
Abstract: This study presents a massively parallel spatial computing approach that uses general-purpose graphics processing units (GPUs) to accelerate Ripley’s K function for univariate spatial point pattern analysis. Ripley’s K function is a representative spatial point pattern analysis approach that allows for quantitatively evaluating the spatial dispersion characteristics of point patterns. However, considerable computation is often required when analyzing large spatial data using Ripley’s K function. In this study, we developed a massively parallel approach of Ripley’s K function for accelerating spatial point pattern analysis. GPUs serve as a massively parallel platform that is built on many-core architecture for speeding up Ripley’s K function. Variable-grained domain decomposition and thread-level synchronization based on shared memory are parallel strategies designed to exploit concurrency in the spatial algorithm of Ripley’s K function for efficient parallelization. Experimental results demonstrate that substantial acceleration is obtained for Ripley’s K function parallelized within GPU environments.
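For reference, a naive CPU version of the estimator shows what dominates the cost: the all-pairs distance computation, which is precisely what a GPU implementation spreads across thousands of threads. The sketch below uses numpy on simulated points, omits edge correction, and is not the authors' GPU code (a drop-in array library such as cupy could stand in for numpy on a GPU).

```python
# Hedged sketch of a naive Ripley's K estimator (no edge correction).
import numpy as np

rng = np.random.default_rng(5)
n, area = 1000, 1.0
pts = rng.random((n, 2))                       # CSR points on the unit square

# All-pairs distances: the step a GPU implementation parallelises
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)                    # exclude self-pairs

lam = n / area                                 # point intensity
for r in (0.05, 0.1, 0.2):
    k = (d < r).sum() / (n * lam)              # naive estimator
    print(f"K({r}) = {k:.4f}  (CSR expectation pi*r^2 = {np.pi * r * r:.4f})")
```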

Journal ArticleDOI
TL;DR: An automated digital method is introduced that produces shaded relief with locally adjusted illumination directions to simulate the techniques and cartographic principles of manual relief shading and best highlights major landforms in terrain characterized by sharp, clearly defined ridges and valleys.
Abstract: Relief shading is the most common type of cartographic relief representation for print and digital maps. Manual relief shading results in informative and visually pleasing representations of terrain, but it is time consuming and expensive to produce. Current analytical relief shading can be created quickly, but the resulting maps are not as aesthetically appealing and do not show landscape features in an explicit manner. This article introduces an automated digital method that produces shaded relief with locally adjusted illumination directions to simulate the techniques and cartographic principles of manual relief shading. Ridgelines and valley lines are derived from a digital terrain model, vectorized, and used in a diffusion curve algorithm. A graph analysis generalizes the lines before using them for diffusion curve shading. The direction of illumination is adjusted based on the spatial orientation of ridgelines and valley lines. The diffusion curve shading is combined with standard analytical relief shading to create a final diffusion relief shading image. Similar to manual relief shading, major landforms and the structure of the terrain are more clearly shown in the diffusion relief shading. The presented method best highlights major landforms in terrain characterized by sharp, clearly defined ridges and valleys.
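The underlying analytical shading step can be sketched with the standard hillshade formula, extended so the illumination azimuth varies per cell. The local azimuth adjustment below (nudging the default 315 degrees by a smoothed aspect term) is a hypothetical stand-in for the article's diffusion-curve-based adjustment from generalised ridge and valley lines, and the aspect convention is one of several in use.

```python
# Hedged sketch of analytical hillshading with a locally varying azimuth.
import numpy as np

def hillshade(dem, azimuth_deg, altitude_deg=45.0, cellsize=1.0):
    dzdy, dzdx = np.gradient(dem, cellsize)    # axis 0 = rows (y), axis 1 = cols (x)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(dzdy, -dzdx)           # one common GIS convention
    az = np.radians(azimuth_deg)               # may be an array (local azimuths)
    zen = np.radians(90.0 - altitude_deg)
    return np.clip(np.cos(zen) * np.cos(slope) +
                   np.sin(zen) * np.sin(slope) * np.cos(az - aspect), 0, 1)

x, y = np.meshgrid(np.linspace(0, 4 * np.pi, 200), np.linspace(0, 4 * np.pi, 200))
dem = 100 * np.sin(x) * np.cos(y)              # synthetic terrain

# Hypothetical local adjustment: nudge the standard 315 deg azimuth by aspect
dzdy, dzdx = np.gradient(dem)
local_az = 315 + 20 * np.sin(np.arctan2(dzdy, -dzdx))
shaded = hillshade(dem, local_az)
print(shaded.shape, round(float(shaded.min()), 3), round(float(shaded.max()), 3))
```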

Journal ArticleDOI
TL;DR: A novel semantics-enhanced density-based clustering algorithm SEM-DTBJ-Cluster is proposed, to extract semantic POIs from GPS trajectories, and is believed to be the first that considers popularity, temporal and geographical information together.
Abstract: Recently, points of interest (POI) recommendation has evolved into a hot research topic with real-world applications. In this paper, we propose a novel semantics-enhanced density-based clustering algorithm (SEM-DTBJ-Cluster) to extract semantic POIs from GPS trajectories. We then take into account three different factors (popularity, temporal and geographical features) that can influence the recommendation score of a POI. We characterize the impacts caused by popularity, temporal and geographical information by using different scoring functions based on three developed recommendation models. Finally, we combine the three scoring functions together and obtain a unified framework (PTG-Recommend) for recommending candidate POIs for a mobile user. To the best of our knowledge, this work is the first that considers popularity, temporal and geographical information together. Experimental results on two real-world data sets strongly demonstrate that our framework is robust and effective, and outperforms the baseline recommendation methods in terms of precision and recall.
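The combination step can be sketched as a weighted sum of the three component scores. The weights and per-POI scores below are hypothetical placeholders, not the PTG-Recommend formulation, which defines each scoring function from its own recommendation model.

```python
# Hedged sketch: combining popularity, temporal and geographical scores into
# one recommendation score per candidate POI.
candidates = {
    # poi: (popularity, temporal match, geographic proximity), each in [0, 1]
    "cafe":   (0.9, 0.8, 0.6),
    "museum": (0.7, 0.3, 0.9),
    "park":   (0.5, 0.9, 0.8),
}
w_pop, w_time, w_geo = 0.4, 0.3, 0.3   # assumed weights

scores = {poi: w_pop * p + w_time * t + w_geo * g
          for poi, (p, t, g) in candidates.items()}
for poi, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{poi}: {s:.2f}")
```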