
Showing papers by "University of Applied Sciences, Mainz published in 2010"


Journal ArticleDOI
TL;DR: In this article, it was shown that the so-called total least squares estimate (TLS) within an errors-in-variables (EIV) model can be identified as a special case of the method of least-squares within the nonlinear Gauss-Helmert model.
Abstract: In this contribution it is shown that the so-called “total least-squares estimate” (TLS) within an errors-in-variables (EIV) model can be identified as a special case of the method of least-squares within the nonlinear Gauss–Helmert model. In contrast to the EIV-model, the nonlinear GH-model does not impose any restrictions on the form of functional relationship between the quantities involved in the model. Even more complex EIV-models, which require specific approaches like “generalized total least-squares” (GTLS) or “structured total least-squares” (STLS), can be treated as nonlinear GH-models without any serious problems. The example of a similarity transformation of planar coordinates shows that the “total least-squares solution” can be obtained easily from a rigorous evaluation of the Gauss–Helmert model. In contrast to weighted TLS, weights can then be introduced without further limitations. Using two numerical examples taken from the literature, these solutions are compared with those obtained from certain specialized TLS approaches.
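
The two model classes compared in this abstract can be written down compactly. The sketch below uses generic textbook notation (y, e, A, E_A, ξ for the EIV model; f, ℓ, v, x, P for the Gauss–Helmert model) and is intended only as orientation, not as the paper's own formulation.

```latex
% Errors-in-variables (EIV) model, the setting of total least-squares (TLS):
% both the observation vector y and the design matrix A carry random errors.
\begin{equation*}
  y - e \;=\; (A - E_A)\,\xi
\end{equation*}
% Nonlinear Gauss--Helmert (GH) model: implicit condition equations linking the
% adjusted observations \ell + v and the unknown parameters x, with the weighted
% sum of squared residuals minimised subject to these conditions.
\begin{equation*}
  f(\ell + v,\;x) \;=\; 0, \qquad v^{\mathsf{T}} P\, v \;\rightarrow\; \min
\end{equation*}
```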

162 citations


Proceedings ArticleDOI
29 Nov 2010
TL;DR: This work uses photogrammetric techniques and simulations to improve the absolute in-line positional accuracy of a robot-guided effector to better than 1 mm, and designs an LED calibration object adapted to this application.
Abstract: We aim to improve the absolute in-line positional accuracy of a robot-guided effector to better than 1 mm. We do so using photogrammetric techniques and by relying heavily on simulations to fine-tune each parameter and avoid weak configurations. We also use simulations to design an LED calibration object adapted to this application. A test procedure enables us to validate both the simulated results and the calibration procedure. The test results exceed expectations, improving the absolute positioning of a robot effector by a factor of 20.
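
A hedged, minimal sketch of the kind of simulation the abstract alludes to: Monte Carlo triangulation of a target point from noisy ray directions, comparing a weak (narrow-baseline) and a strong (wide-baseline) camera configuration. The camera positions, target location and noise level are illustrative assumptions, not the configuration used in the paper.

```python
# Minimal Monte Carlo sketch: how simulation can expose weak camera geometries
# before measuring a robot effector photogrammetrically. All numbers are
# illustrative assumptions, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)
target = np.array([0.0, 0.0, 2.0])           # effector/LED position in metres

def triangulate(centers, directions):
    """Least-squares intersection of rays: the point closest to all lines."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for c, d in zip(centers, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)        # projector orthogonal to the ray
        A += P
        b += P @ c
    return np.linalg.solve(A, b)

def spread(centers, sigma_dir=1e-4, n=2000):
    """Standard deviation of triangulated positions under directional noise."""
    estimates = []
    for _ in range(n):
        dirs = [(target - c) + rng.normal(0.0, sigma_dir, 3) for c in centers]
        estimates.append(triangulate(centers, dirs))
    return np.asarray(estimates).std(axis=0)

narrow = [np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])]   # weak geometry
wide   = [np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])]   # strong geometry
print("position scatter, narrow baseline:", spread(narrow))
print("position scatter, wide baseline:  ", spread(wide))
```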

37 citations


Journal ArticleDOI
TL;DR: In this paper, a hybrid deformation observation system is developed as part of a Volcano Fast Response System (VFRS) that will support the local authorities in their decisions about hazard mitigation provisions, especially about the evacuation of people.

30 citations


Proceedings ArticleDOI
TL;DR: In this article, the authors present a methodology for combining data from two discrete optical measuring systems by registering their individual measurements into a common geometrical frame, which enables the type and extent of surface and colorimetric change to be precisely characterized and quantified over time.
Abstract: Modern optical measuring systems are able to record objects with high spatial and spectral precision. The acquisition of spatial data is possible with resolutions of a few hundredths of a millimeter using active projection-based camera systems, while spectral data can be obtained using filter-based multispectral camera systems that can capture surface spectral reflectance with high spatial resolution. We present a methodology for combining data from these two discrete optical measuring systems by registering their individual measurements into a common geometrical frame. Furthermore, the potential for its application as a tool for the non-invasive monitoring of paintings and polychromy is evaluated. The integration of time-referenced spatial and spectral datasets is beneficial to record and monitor cultural heritage. This enables the type and extent of surface and colorimetric change to be precisely characterized and quantified over time. Together, these could facilitate the study of deterioration mechanisms or the efficacy of conservation treatments by measuring the rate, type, and amount of change over time. An interdisciplinary team of imaging scientists and art scholars was assembled to undertake a trial program of repeated data acquisitions of several valuable historic surfaces of cultural heritage objects. The preliminary results are presented and discussed.
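
Registering two sensors' measurements into a common geometrical frame, as described above, is commonly done with a least-squares rigid-body fit over corresponding reference points. The sketch below shows the standard SVD-based (Kabsch/Horn) solution on synthetic data; it illustrates the general technique only, not the authors' registration pipeline, and all point sets and noise levels are assumptions.

```python
# Illustrative rigid-body registration (Kabsch/Horn): map points measured in
# sensor A's frame into sensor B's frame from corresponding reference targets.
# Synthetic data only; this is not the authors' pipeline.
import numpy as np

def rigid_fit(src, dst):
    """Rotation R and translation t such that dst ≈ src @ R.T + t."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                                   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: recover a known transform from slightly noisy correspondences.
rng = np.random.default_rng(1)
pts_a = rng.uniform(-0.5, 0.5, (10, 3))                   # targets in frame A
angle = np.radians(30.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.2, -0.1, 0.05])
pts_b = pts_a @ R_true.T + t_true + rng.normal(0.0, 1e-4, (10, 3))
R, t = rigid_fit(pts_a, pts_b)
print("rotation error:   ", np.linalg.norm(R - R_true))
print("translation error:", np.linalg.norm(t - t_true))
```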

22 citations


Journal ArticleDOI
TL;DR: In this article, a new approach to obtaining unbiased estimates of the value of a statistical life (VSL) with labor market data is presented, which combines the advantages of recent panel studies, allowing us to control for unobserved heterogeneity of workers, and conventional cross-sectional estimations, which are less affected by measurement error.

18 citations


Journal ArticleDOI
TL;DR: In this paper, the authors analyse the operating room (OR) processes of a university hospital in order to identify potential weaknesses and derive optimization strategies.
Abstract: The introduction of the diagnosis-related groups (DRG) system increased cost pressure. Because many professional groups interact in the operating room (OR), optimizing its workflow organization is of central importance. The aim of this study was to analyse the OR processes of a university hospital in order to identify potential weaknesses and derive optimization strategies. The implemented solutions were analysed again to determine their effects. During an observation period of six weeks before and four weeks after the intervention, all workflows were documented using a structured time-recording form. Unused OR capacity was identified as the main criterion of inefficiency. A relevant loss of time was documented in room utilization at the end of the scheduled program, which could be attributed to differing duty hours. After these were harmonized and further changes were introduced, an increase in efficiency was demonstrated. Perioperative process optimization contributes substantially to the success of OR organization. Standardized procedures and consensus-based rules for OR planning are essential here. Consistent OR management can thus contribute to the economic success of a hospital.

9 citations


01 Jan 2010
TL;DR: The intention of the approach is to take human cognitive strategies as a model and to simulate these processes, based on available knowledge about the objects of interest, in order to guide the algorithms used to detect and recognize objects and thus achieve higher effectiveness.
Abstract: The reconstruction of 3D objects from point clouds and images is a major task in many application fields. The processing of such spatial data, especially 3D point clouds from terrestrial laser scanners, generally consumes time and requires extensive interaction between a human and the machine to yield a promising result. Presently, algorithms for automatic processing are usually data-driven and concentrate on geometric feature extraction. Robust and quick methods for complete object extraction or identification are still an ongoing research topic and suffer from the complex structure of the data, which cannot be sufficiently modelled by purely numerical strategies. Therefore, the intention of our approach is to take human cognitive strategies as a model and to simulate these processes based on available knowledge about the objects of interest. Such processes will, first, introduce a semantic structure for the objects and, second, guide the algorithms used to detect and recognize objects, which will yield higher effectiveness. Hence, our research proposes an approach that uses knowledge to guide the algorithms in 3D point cloud and image processing.
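
To make the idea of knowledge-guided processing concrete, the sketch below couples a purely geometric RANSAC-style plane search with one simple semantic rule ("a wall is a near-vertical plane with enough supporting points"). It is a minimal illustration of the principle under assumed thresholds and synthetic data, not the knowledge framework proposed by the authors.

```python
# Minimal illustration of knowledge-guided extraction (not the authors' system):
# a RANSAC-style plane search whose candidates must satisfy a semantic rule,
# here "a wall is a near-vertical plane with enough supporting points".
import numpy as np

rng = np.random.default_rng(2)

def fit_plane(p3):
    """Plane (unit normal n, offset d) through three points, with n.x + d = 0."""
    n = np.cross(p3[1] - p3[0], p3[2] - p3[0])
    norm = np.linalg.norm(n)
    if norm < 1e-12:                          # degenerate (collinear) sample
        return None
    n = n / norm
    return n, -n @ p3[0]

def detect_wall(points, n_iter=500, tol=0.02, min_inliers=100, max_tilt_deg=10.0):
    """Plane search that only accepts candidates passing the 'wall' rule."""
    best = None
    for _ in range(n_iter):
        plane = fit_plane(points[rng.choice(len(points), 3, replace=False)])
        if plane is None:
            continue
        n, d = plane
        inliers = np.abs(points @ n + d) < tol
        # Semantic rule: a wall's normal is nearly horizontal (plane near-vertical).
        is_vertical = abs(n[2]) < np.sin(np.radians(max_tilt_deg))
        if is_vertical and inliers.sum() >= min_inliers:
            if best is None or inliers.sum() > best[2].sum():
                best = (n, d, inliers)
    return best

# Synthetic demo: a vertical wall patch at x = 0 plus random clutter.
wall = np.column_stack([np.zeros(300), rng.uniform(0, 2, 300), rng.uniform(0, 2, 300)])
clutter = rng.uniform(-1.0, 1.0, (200, 3))
result = detect_wall(np.vstack([wall, clutter]))
if result is not None:
    n, d, inliers = result
    print("wall found, normal:", np.round(n, 3), "supporting points:", int(inliers.sum()))
else:
    print("no wall found")
```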

5 citations


Book ChapterDOI
08 Nov 2010
TL;DR: In this article, innovative techniques for the documentation and analysis of stone inscriptions located in the province of Sichuan, in south-west China, are presented.
Abstract: Modern high-resolution 3D measuring techniques are widely used in quality control and industrial production, because they allow precise and reliable inspection of objects. Their potential to monitor surfaces, however, need not be restricted to industrial objects. In cultural heritage applications, too, a detailed and reliable spatial description of surfaces is often useful and opens up new possibilities for the conservation, analysis or presentation of objects. In the present work we have considered Buddhist stone inscriptions (8th–12th centuries), which are important cultural assets of China. They need to be documented, analyzed, interpreted and visualized archaeologically, art-historically and in terms of textual scholarship. On the one hand, such Buddhist stone inscriptions have to be conserved for future generations; on the other hand, further possibilities for analyzing the data would open up if the inscriptions were accessible to a larger community, for instance for understanding the historical growth of Buddhism in China. In this article we show innovative techniques for the documentation and analysis of stone inscriptions located in the province of Sichuan, in south-west China. The stone inscriptions have been captured using high-precision 3D measuring techniques, which produce exact copies of the original inscriptions that serve as the basis for further processing tasks. Typical processing might be directed towards an improvement of the legibility of characters, or may try to automatically detect individual letters, to automatically identify certain text passages, or even to characterize the written elements with respect to a potential style of the monk or the executing stonemason. All these processing steps will support the interpretation of the inscriptions by the sinologists involved in the analysis and evaluation of the texts. The concept and features of the image processing applied to the captured inscriptions, as well as the aims and the effect of an interpretation based on algorithms for identifying and analyzing the inscriptions, are demonstrated. In order to present the outcome to a large community, the results of the stone inscription reconstruction, the resulting interpretation and additional 2D/3D maps are published on an interactive web platform.
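
One generic way to improve the legibility of shallowly incised characters in such scans (offered here only as an illustration of the kind of processing mentioned above, not as the authors' method) is to remove the large-scale shape of the stone from a depth map and keep only the local relief. The depth map, smoothing scale and synthetic surface below are assumptions.

```python
# Generic legibility-enhancement sketch (not necessarily the authors' method):
# subtract a heavily smoothed copy of a depth map from the original so that only
# the local relief of incised characters remains.
import numpy as np
from scipy.ndimage import gaussian_filter

def local_relief(depth_map, sigma=15.0):
    """High-pass filter a 2D depth map to emphasise shallow carvings."""
    background = gaussian_filter(depth_map, sigma=sigma)   # large-scale stone shape
    relief = depth_map - background                        # incisions remain
    return (relief - relief.min()) / (relief.max() - relief.min() + 1e-12)

# Synthetic example: a gently curved surface with a 0.5 mm deep incised stroke.
y, x = np.mgrid[0:200, 0:200]
surface = 1e-3 * ((x - 100.0) ** 2 + (y - 100.0) ** 2) / 200.0
surface[90:110, 40:160] -= 0.0005
enhanced = local_relief(surface)
print("relief image:", enhanced.shape, "value range:", enhanced.min(), "-", enhanced.max())
```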

4 citations


Journal ArticleDOI
TL;DR: In this paper, Monte Carlo simulations show how systematic changes in the parameters of the components, of the test equation and of the correlation matrix affect the size of first and second-generation panel unit root tests.
Abstract: Panel unit root tests of real exchange rates—as opposed to univariate tests—usually reject non-stationarity. These tests, however, could be biased if the real exchange rate contained MA roots. Indeed, two independent arguments claim that the real exchange rate, being a sum of a stationary and a non-stationary component, is possibly an ARIMA (1, 1, 1) process. Monte Carlo simulations show how systematic changes in the parameters of the components, of the test equation and of the correlation matrix affect the size of first and second-generation panel unit root tests. Two components of the real exchange rate—the real exchange rate of a single good and a weighted sum of relative prices—are constructed from the data for a panel of countries. Computation of the relevant parameters reveals that panel unit root tests of the real exchange rate are severely oversized, usually much more so than simple augmented Dickey-Fuller tests. Thus, the evidence for purchasing power parity from first and second-generation panel unit root tests may be merely due to extreme size biases.
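
A miniature version of the kind of size experiment described above, sketched under assumed parameter values (not those of the paper): the simulated series is the sum of a random walk and a stationary AR(1) component, so it has a unit root by construction, and any rejection rate above the 5% nominal level indicates an oversized test. Only the univariate augmented Dickey-Fuller case is shown, using statsmodels.

```python
# Miniature size experiment (illustrative parameters, not the paper's design):
# a series with a genuine unit root, built as random walk + stationary AR(1),
# is tested repeatedly with the augmented Dickey-Fuller test; a rejection rate
# above the nominal 5% level indicates an oversized test.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(3)
T, n_rep, phi, weight = 200, 200, 0.8, 5.0
rejections = 0
for _ in range(n_rep):
    rw = np.cumsum(rng.normal(size=T))        # non-stationary component
    ar = np.zeros(T)                          # stationary AR(1) component
    for t in range(1, T):
        ar[t] = phi * ar[t - 1] + rng.normal()
    y = rw + weight * ar                      # sum of the two components
    p_value = adfuller(y, regression="c", autolag="AIC")[1]
    rejections += p_value < 0.05
print(f"empirical rejection rate at nominal 5%: {rejections / n_rep:.3f}")
```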

3 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an account of a journey of over 15 years through all corners of the national accounting world, from all angles, including critic, compiler, editor, communicator, and finally as a user of national accounts statistics.
Abstract: Leaving aside the purpose of teaching, there are three ways to treat the national accounts: (i) the axiomatic approach, (ii) the historic approach, and (iii) the pragmatic approach. An example of the first is Reich (2001), the second approach was adopted by Vanoli (2005), and this study sets a precedent for the third category. It is an account, or almost a diary, of a journey of over 15 years through all corners of the national accounting world. Working first as a critic, then as a compiler, as an editor, as a communicator, and finally as a user of national accounts statistics, Frits Bos has gained sufficient experience to assess the national accounts from all angles. He deplores that it took a long time to write the book (“a never-ending story that became a taboo”, p. 5), but it is hard to see how the book could have become so rich in its detail and exuberant in its different views on the national accounts without having spent so much time writing it. The comprehensive title is correct. The national accounts are examined from all perspectives. The central thesis is that official national accounts statistics are insufficiently serving specific data needs and have become incomprehensible and inaccessible for most data users. The resulting widespread illiteracy in national accounting among researchers should be regarded as a threat to economics as an empirical and policy relevant science. Therefore, the book serves two purposes. The first is to reduce illiteracy in national accounts by clarifying to outsiders and insiders what official national accounting statistics measure, how this measurement proceeds, and how the results are used. The second purpose is to show how the national accounts and their use should be developed further. Meeting these purposes implies that the enormous gap between the users and the producers of the national accounts should be bridged. Formally, the treatise is divided into three parts, but these parts carry very unequal weights in content. In Part I, which consists of three chapters, the development of the national accounts from incidental estimates to a universal tool for analysis and policy is documented over 40 pages. Part II investigates the national accounts “as a tool for analysis and policy in terms of their logic, relevance and reliability”. With some 250 pages, this represents the main corpus of the book. Thirty pages of Part III about the future of the national accounts, dealing with the national accounts “as a modern tool of information”, summarize the lessons of the study with respect to improvement of the product, its marketing and education. At the level of chapters, the textual imbalance is even more pronounced with Chapter 6 “The Universal Model: Eight Interrelated Models. Balancing Relevance and Measurability” consuming half the total volume, and the nine other chapters the rest.

2 citations


Proceedings ArticleDOI
01 Jun 2010
TL;DR: The goal is to create a musical application framework for multiple casual users of state-of-the-art multitouch devices; the framework uses the metaphor of ants moving on a hexagonal grid to interact with a pitch pattern.
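
The hexagonal grid that the ants move on can be represented in several ways; a common choice (assumed here purely for illustration, not taken from the paper) is axial coordinates, in which every cell has exactly six neighbours and an "ant" advances to one neighbour per step, triggering whatever pitch is assigned to the cell it reaches.

```python
# Illustrative hexagonal-grid representation (axial coordinates); this is an
# assumption for the sketch, not the authors' implementation.
import random

# Offsets of the six neighbours of a hex cell (q, r) in axial coordinates.
HEX_DIRECTIONS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def neighbours(cell):
    q, r = cell
    return [(q + dq, r + dr) for dq, dr in HEX_DIRECTIONS]

def ant_walk(start=(0, 0), steps=8, seed=42):
    """Move an 'ant' to one random neighbour per step; each visited cell would
    trigger the note of the pitch pattern assigned to that cell."""
    rng = random.Random(seed)
    cell, path = start, [start]
    for _ in range(steps):
        cell = rng.choice(neighbours(cell))
        path.append(cell)
    return path

print(ant_walk())
```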

Journal ArticleDOI
TL;DR: In this paper, the general price level is introduced as a third explicit variable, besides prices and volumes, into the decomposition method; the paper is not an exercise in index numbers but is addressed to the practitioner of national accounts.
Abstract: The yearly growth of an industry is commonly measured by valuing its output and its intermediate consumption at previous year's prices, forming the balance called value added, and subtracting from it previous year's value added. The result may be expressed as a percentage of previous year's value added (the common usage) or as an amount of value measured in previous year's currency, which is the approach used in this paper. Although superficially equivalent, the two expressions lend themselves to different quantitative results and economic interpretations of an industry's long-term growth. Using the Danish case as an example, the paper demonstrates the lack of coherence of the conventional method when applied to actual figures, and suggests a remedy based on introducing the general price level as a third explicit variable, besides prices and volumes, into the decomposition method. The paper is not an exercise in index numbers, but is addressed to the practitioner of national accounts.
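
In generic notation (not the paper's own), with p for prices, q for volumes and superscripts o and i for output and intermediate consumption, the conventional measure described above reads:

```latex
% Growth of value added at previous year's prices: value added of year t valued
% at year t-1 prices, minus value added of year t-1.
\begin{equation*}
  \Delta \mathrm{VA}_t
    = \bigl(p^{o}_{t-1}\, q^{o}_{t} - p^{i}_{t-1}\, q^{i}_{t}\bigr)
    - \bigl(p^{o}_{t-1}\, q^{o}_{t-1} - p^{i}_{t-1}\, q^{i}_{t-1}\bigr)
\end{equation*}
% The result is reported either as a percentage of previous year's value added
% (the common usage) or, as in this paper, as an amount in previous year's currency.
```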

Book ChapterDOI
01 Jan 2010
TL;DR: Gerontopsychiatry is instead understood as a socio-cultural ordering pattern that emerged as an answer to social questions, as discussed by the authors.
Abstract: This contribution is based on an understanding of gerontopsychiatry that goes beyond a medical definition of the term. As a medical science and a medical profession, gerontopsychiatry is indeed concerned with the research, diagnosis, treatment and prevention of mental illnesses of old people. Yet in doing so, and despite vehement criticism, for example from the Deutsche Gesellschaft für Gerontopsychiatrie und -psychotherapie (DGGPP), the subject matter "mental illness" is still all too readily explained and treated physiologically on a medical, natural-science basis, whereby the person affected is reduced to a carrier of disturbed physiological bodily processes. The subject and its experience attract hardly any interest as a counterpart of the investigation; it is still granted merely the status of an "object" of medical "measures". This (inhumane) reduction is contradicted here. Furthermore, "gerontopsychiatry" is not reduced to a concrete place (for example a clinic), that is, to a clinical site in which the interplay of medical experience (research) and the treatment of patients takes place. Rather, gerontopsychiatry is understood as a socio-cultural ordering pattern that emerged as an answer to social questions (e.g.: What is to be done with old, sad, confused, multimorbid, disruptive and/or suffering people?) and consequently represents a result of the interplay of societal need and resources (cf. Dörr 2005).