Author

Monika Sester

Bio: Monika Sester is an academic researcher at Leibniz University of Hanover. Her research focuses on topics including computer science and point clouds. She has an h-index of 33 and has co-authored 216 publications receiving 3,741 citations. Previous affiliations of Monika Sester include the University of Stuttgart and Lund University.


Papers
Journal ArticleDOI
TL;DR: This position paper of the International Society for Photogrammetry and Remote Sensing (ISPRS) Technical Commission II (TC II) revisits existing geospatial data handling methods and theories to determine whether they can still cope with emerging geospatial big data.
Abstract: Big data has now become a strong focus of global interest that is increasingly attracting the attention of academia, industry, government and other organizations. Big data can be situated in the disciplinary area of traditional geospatial data handling theory and methods. The increasing volume and varying format of collected geospatial big data present challenges in storing, managing, processing, analyzing, visualizing and verifying the quality of data. This has implications for the quality of decisions made with big data. Consequently, this position paper of the International Society for Photogrammetry and Remote Sensing (ISPRS) Technical Commission II (TC II) revisits the existing geospatial data handling methods and theories to determine if they are still capable of handling emerging geospatial big data. Further, the paper synthesises problems, major issues and challenges with current developments, and recommends what needs to be developed further in the near future.

336 citations

Proceedings ArticleDOI
11 Aug 2002
TL;DR: A brief survey of existing facilities for geographical information retrieval on the web is provided, followed by a description of the tools and techniques being developed in the SPIRIT project (Spatially-Aware Information Retrieval on the Internet).
Abstract: A large proportion of the resources available on the world-wide web refer to information that may be regarded as geographically located. Thus most activities and enterprises take place in one or more places on the Earth's surface and there is a wealth of survey data, images, maps and reports that relate to specific places or regions. Despite the prevalence of geographical context, existing web search facilities are poorly adapted to help people find information that relates to a particular location. When the name of a place is typed into a typical search engine, web pages that include that name in their text will be retrieved, but it is likely that many resources that are also associated with the place may not be retrieved. Thus resources relating to places that are inside the specified place may not be found, nor may resources relating to places that are nearby or that are equivalent but referred to by another name. Specification of geographical context frequently requires the use of spatial relationships concerning, for example, distance or containment, yet such terminology cannot be understood by existing search engines. Here we provide a brief survey of existing facilities for geographical information retrieval on the web, before describing a set of tools and techniques that are being developed in the project SPIRIT: Spatially-Aware Information Retrieval on the Internet (funded by European Commission Framework V Project IST-2001-35047).

184 citations
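
The spatial relations named in the abstract, distance and containment, are easy to make concrete. Below is a minimal Python sketch; the toy document index, the function names and the coordinates are illustrative assumptions and not part of SPIRIT, which worked with gazetteers, spatial indexing and an ontology of place at web scale.

import math

# Toy document index: each entry has a title and a point footprint (lat, lon).
# Illustrative data only; a real system would use a gazetteer and a spatial index.
DOCS = [
    {"title": "Hanover cycling map", "lat": 52.37, "lon": 9.73},
    {"title": "Lund survey report", "lat": 55.70, "lon": 13.19},
    {"title": "Stuttgart land-use study", "lat": 48.78, "lon": 9.18},
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def near(lat, lon, radius_km):
    """'near' relation: documents whose footprint lies within radius_km."""
    return [d for d in DOCS if haversine_km(lat, lon, d["lat"], d["lon"]) <= radius_km]

def inside(min_lat, min_lon, max_lat, max_lon):
    """'inside' relation: documents whose footprint falls in a lat/lon box."""
    return [d for d in DOCS
            if min_lat <= d["lat"] <= max_lat and min_lon <= d["lon"] <= max_lon]

print(near(52.37, 9.73, 100.0))          # documents near Hanover
print(inside(55.0, 12.5, 56.5, 14.5))    # documents inside a southern-Sweden box

Point footprints keep the sketch short; real web resources have regional extents, so containment would compare geometries rather than single coordinates.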

Journal ArticleDOI
TL;DR: An overview of current approaches to the automation of generalization and data abstraction is given, and solutions for three generalization problems based on optimization techniques, among them a Neural Network technique, are presented.
Abstract: The availability of methods for abstracting and generalizing spatial data is vital for understanding and communicating spatial information. Spatial analysis using maps at different scales is a good example of this. Such methods are needed not only for analogue spatial data sets but even more so for digital data. In order to automate the process of generating different levels of detail of a spatial data set, generalization operations are used. The paper first gives an overview of current approaches to the automation of generalization and data abstraction, and then presents solutions for three generalization problems based on optimization techniques. Least-Squares Adjustment is used for displacement and shape simplification (here, building groundplans), and Self-Organizing Maps, a Neural Network technique, is applied for typification, i.e. a density-preserving reduction of objects. The methods are validated with several examples and evaluated according to their advantages and disadvantages. Finally, a scen...

161 citations
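
As a rough illustration of the typification step mentioned in the abstract, the sketch below trains a one-dimensional Self-Organizing Map whose trained neuron weights become the reduced point set; because neurons settle where input samples are dense, the reduction is approximately density-preserving. This is a minimal sketch under assumed parameters (learning-rate and neighbourhood schedules), not the paper's implementation.

import random

def som_typify(points, k, iters=2000, lr0=0.5, seed=0):
    """Reduce a 2-D point set to k representatives with a 1-D SOM."""
    rng = random.Random(seed)
    neurons = [list(rng.choice(points)) for _ in range(k)]  # init from data
    radius0 = k / 2.0
    for t in range(iters):
        x, y = rng.choice(points)
        # Best-matching unit: the neuron closest to the sample.
        bmu = min(range(k), key=lambda i: (neurons[i][0] - x) ** 2
                                          + (neurons[i][1] - y) ** 2)
        # Learning rate and neighbourhood radius decay linearly over time.
        frac = 1.0 - t / iters
        lr, radius = lr0 * frac, max(radius0 * frac, 0.5)
        for i in range(k):
            d = abs(i - bmu)  # distance along the 1-D neuron chain
            if d <= radius:
                h = lr * (1.0 - d / (radius + 1e-9))
                neurons[i][0] += h * (x - neurons[i][0])
                neurons[i][1] += h * (y - neurons[i][1])
    return [tuple(n) for n in neurons]

# Two clusters of unequal size reduced to 10 representatives: the denser
# cluster ends up with proportionally more of them.
gen = random.Random(1)
pts = [(gen.gauss(0, 1), gen.gauss(0, 1)) for _ in range(150)] \
    + [(gen.gauss(5, 0.5), gen.gauss(5, 0.5)) for _ in range(50)]
print(som_typify(pts, 10))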

01 Jan 2000
TL;DR: The paper presents solutions for generalization problems using least squares adjustment theory, a well-known general framework for determining unknown parameters from given observations, and demonstrates the validity of this approach on the simplification of building ground plans and the displacement of arbitrary cartographic objects.
Abstract: The paper presents solutions for generalization problems using least squares adjustment theory. This concept allows for the introduction of several observations in terms of constraints and for a holistic solution of all these - possibly contrary and competing - constraints. Two examples are used to demonstrate the validity of this approach: the simplification of building ground plans and the displacement of arbitrary cartographic objects. Each approach is verified with several examples; furthermore, the integration of these different approaches is presented, in terms of the fusion of cadastral and topographic data. Least Squares Adjustment theory (LSA) is a well-known general framework for determining unknown parameters based on given observations. This optimization technique is well founded in mathematics, operations research, and geodesy. This general concept allows for the integration of different constraints in order to solve an overall, complex problem. This paper proposes to use adjustment theory for generalization. One problem is the set-up of the constraints for the various generalization tasks. The generalization of building ground plans is formulated in terms of a model-based approach, the problem being the determination of the model. In this case it is derived by the application of some rules. The second generalization operation treated with LSA is displacement: different objects have to be displayed on a map - for reasons of legibility certain constraints have to be satisfied, e.g. minimal object sizes and minimal object distances have to be enforced. LSA offers a straightforward framework to introduce different kinds of these constraints. In one step, all these constraints are solved simultaneously, resulting in one optimized solution with the feature that all residuals are distributed evenly among all the observations. Besides this result, quality parameters indicate how well the initial constraints have been satisfied. The paper is organized as follows: after a review of related work, the simplification of building ground plans using a model-based approach is presented, together with some examples showing the possibilities and the deficiencies. Then the approach for displacement based on least squares adjustment is shown, giving both the theoretical background and explanatory examples. The integration of the two approaches is demonstrated with the example of the fusion of cadastral information with topographic information. Finally, a summary concludes the paper.

154 citations
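
The displacement operation described above lends itself to a compact sketch: treat the displaced coordinates as unknowns, keep one observation group that holds points near their original positions, add linearized distance observations for pairs that fall below a minimum separation, and iterate Gauss-Newton. The weights, iteration count and pair-selection rule below are illustrative assumptions, not the paper's formulation.

import numpy as np

def lsa_displace(points, d_min, w_pos=1.0, w_dist=10.0, iters=10):
    """Least-squares displacement: solve for coordinate shifts so that
    points stay near their origins while pairwise distances reach d_min."""
    x = np.asarray(points, dtype=float)   # current estimate, shape (n, 2)
    x0 = x.copy()                         # original positions
    n = len(x)
    for _ in range(iters):
        rows, rhs = [], []
        # (a) position observations: each coordinate should stay at x0.
        for i in range(n):
            for c in range(2):
                r = np.zeros(2 * n)
                r[2 * i + c] = w_pos
                rows.append(r)
                rhs.append(w_pos * (x0[i, c] - x[i, c]))
        # (b) linearized distance observations for pairs that are too close.
        for i in range(n):
            for j in range(i + 1, n):
                diff = x[i] - x[j]
                d = float(np.linalg.norm(diff))
                if 1e-9 < d < d_min:
                    g = diff / d          # gradient of the distance w.r.t. x[i]
                    r = np.zeros(2 * n)
                    r[2 * i:2 * i + 2] = w_dist * g
                    r[2 * j:2 * j + 2] = -w_dist * g
                    rows.append(r)
                    rhs.append(w_dist * (d_min - d))
        dx, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        x += dx.reshape(n, 2)
    return x

# Three map symbols; the first two are too close for legibility and get
# pushed apart, with the residuals spread over both constraint groups.
print(lsa_displace([(0.0, 0.0), (0.3, 0.0), (3.0, 0.0)], d_min=1.0))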


Cited by
Posted Content
TL;DR: In this book, the author provides a unified and comprehensive theory of structural time series models, including a detailed treatment of the Kalman filter, for modelling economic and social time series and addressing the special problems which the treatment of such series poses.
Abstract: In this book, Andrew Harvey sets out to provide a unified and comprehensive theory of structural time series models. Unlike the traditional ARIMA models, structural time series models consist explicitly of unobserved components, such as trends and seasonals, which have a direct interpretation. As a result the model selection methodology associated with structural models is much closer to econometric methodology. The link with econometrics is made even closer by the natural way in which the models can be extended to include explanatory variables and to cope with multivariate time series. From the technical point of view, state space models and the Kalman filter play a key role in the statistical treatment of structural time series models. The book includes a detailed treatment of the Kalman filter. This technique was originally developed in control engineering, but is becoming increasingly important in fields such as economics and operations research. This book is concerned primarily with modelling economic and social time series, and with addressing the special problems which the treatment of such series poses. The properties of the models and the methodological techniques used to select them are illustrated with various applications. These range from the modelling of trends and cycles in US macroeconomic time series to an evaluation of the effects of seat belt legislation in the UK.

4,252 citations
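
To make the book's central tool tangible, here is the Kalman filter for the local level model, the simplest structural time series model: y_t = mu_t + eps_t with mu_t = mu_{t-1} + eta_t. This is a minimal sketch assuming the two noise variances are known; in the book they are hyperparameters estimated from the data, typically by maximum likelihood.

import numpy as np

def local_level_filter(y, var_eps, var_eta, a0=0.0, p0=1e6):
    """Kalman filter for the local level model; returns the filtered
    estimates of the unobserved level mu_t."""
    a, p = a0, p0                      # state estimate and its variance
    out = []
    for obs in y:
        p = p + var_eta                # predict: the level is a random walk
        k = p / (p + var_eps)          # Kalman gain
        a = a + k * (obs - a)          # update with the new observation
        p = (1.0 - k) * p
        out.append(a)
    return np.array(out)

# Noisy observations of a slowly drifting level.
rng = np.random.default_rng(0)
level = np.cumsum(rng.normal(0.0, 0.1, size=100))
y = level + rng.normal(0.0, 0.5, size=100)
print(local_level_filter(y, var_eps=0.25, var_eta=0.01)[-5:])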

01 Jan 1990
TL;DR: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented.
Abstract: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented in this article.

2,933 citations

Journal ArticleDOI
The Perception of the Visual World

2,250 citations