Author

Peter Winker

Bio: Peter Winker is an academic researcher from the University of Giessen. The author has contributed to research in topics: Heuristic (computer science) & Heuristics. The author has an h-index of 31 and has co-authored 197 publications receiving 3995 citations. Previous affiliations of Peter Winker include the Center for Social and Economic Research & the University of Marburg.


Papers
Journal ArticleDOI
TL;DR: It is shown that UD's have many desirable properties for a wide variety of applications, and the global optimization algorithm threshold accepting is used to generate UD's with low discrepancy.
Abstract: A uniform design (UD) seeks design points that are uniformly scattered on the domain. It has been popular since 1980. A survey of UD is given in the first portion: the fundamental idea and construction method are presented and discussed, and examples are given for illustration. It is shown that UD's have many desirable properties for a wide variety of applications. Furthermore, we use the global optimization algorithm, threshold accepting, to generate UD's with low discrepancy. The relationship between uniformity and orthogonality is investigated. It turns out that most UD's obtained here are indeed orthogonal.
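
The threshold-accepting heuristic mentioned in the abstract is a simple local-search scheme. The following minimal Python sketch shows how such a design search could look: it starts from a random U-type (Latin-hypercube-style) design and swaps entries within a column, accepting any move that does not worsen the discrepancy by more than a shrinking threshold. The move set, threshold schedule, and the use of the centered L2-discrepancy as objective are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def centered_l2_discrepancy(X):
    """Squared centered L2-discrepancy of points X in [0,1]^d
    (Hickernell's closed form)."""
    n, d = X.shape
    a = np.abs(X - 0.5)
    term1 = (13.0 / 12.0) ** d
    term2 = (2.0 / n) * np.sum(np.prod(1 + 0.5 * a - 0.5 * a ** 2, axis=1))
    pair = np.ones((n, n))
    for k in range(d):  # pairwise product, one factor per dimension
        pair *= 1 + 0.5 * (a[:, k, None] + a[None, :, k]
                           - np.abs(X[:, k, None] - X[None, :, k]))
    return term1 - term2 + pair.sum() / n ** 2

def threshold_accepting(X, n_iter=20000, t0=0.01, seed=0):
    """Accept a neighbouring design unless it worsens the objective by
    more than the current threshold; thresholds shrink linearly to zero."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    f = centered_l2_discrepancy(X)
    best, f_best = X.copy(), f
    for it in range(n_iter):
        t = t0 * (1.0 - it / n_iter)
        Y = X.copy()
        k = rng.integers(d)
        i, j = rng.choice(n, size=2, replace=False)
        Y[i, k], Y[j, k] = Y[j, k], Y[i, k]  # swap keeps each column a permutation
        f_new = centered_l2_discrepancy(Y)
        if f_new - f <= t:  # small deteriorations are accepted, too
            X, f = Y, f_new
            if f < f_best:
                best, f_best = X.copy(), f
    return best, f_best

# Random U-type starting design with levels (2i - 1) / (2n), i = 1..n.
rng = np.random.default_rng(1)
n, d = 12, 3
X0 = np.column_stack([(rng.permutation(n) + 0.5) / n for _ in range(d)])
design, cd2 = threshold_accepting(X0)
```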

825 citations

Journal ArticleDOI
TL;DR: A continuous global optimization heuristic for a stochastic approximation of an objective function that is not globally convex is introduced, and results from estimating the parameters of a specific agent-based model of the DM/US-$ foreign exchange market are presented.
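
The "stochastic approximation of an objective function" can be pictured as a simulated-moments distance: the model is simulated several times at a candidate parameter vector, and the average distance between simulated and empirical moments is the (noisy) objective. The sketch below is hypothetical — simulate_model is a toy AR(1) stand-in, not the paper's agent-based exchange-rate model — but it illustrates why a heuristic such as threshold accepting, rather than a gradient method, is applied to such a surface.

```python
import numpy as np

def simulate_model(theta, n_obs, rng):
    """Hypothetical stand-in for an agent-based FX model: returns a
    simulated return series for parameters theta."""
    a, b = theta
    eps = rng.standard_normal(n_obs)
    r = np.empty(n_obs)
    r[0] = eps[0]
    for t in range(1, n_obs):
        r[t] = a * r[t - 1] + np.sqrt(max(b, 1e-8)) * eps[t]
    return r

def moments(r):
    """Moments to match: variance, kurtosis, first-order autocorrelation."""
    return np.array([r.var(),
                     ((r - r.mean()) ** 4).mean() / r.var() ** 2,
                     np.corrcoef(r[:-1], r[1:])[0, 1]])

def objective(theta, m_emp, n_rep=10, n_obs=1000, seed=0):
    """Stochastic approximation: average squared moment distance over
    n_rep simulation runs. The surface is noisy and not globally convex,
    hence the need for a robust global heuristic."""
    rng = np.random.default_rng(seed)
    dist = [np.sum((moments(simulate_model(theta, n_obs, rng)) - m_emp) ** 2)
            for _ in range(n_rep)]
    return np.mean(dist)

# Pretend these are empirical moments from observed FX returns.
rng = np.random.default_rng(2)
m_emp = moments(rng.standard_normal(1000))
print(objective(np.array([0.3, 1.0]), m_emp))
```

Minimizing this objective over theta would use a continuous-neighbourhood variant of the threshold-accepting loop sketched above (small random perturbations of theta instead of column swaps).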

224 citations

Journal ArticleDOI
TL;DR: In this paper, properties and construction of designs under a centered version of the L2-discrepancy are analyzed, and optimization is performed using the threshold-accepting heuristic, which produces designs with low discrepancy relative to the theoretical expectation and variance.
Abstract: In this paper properties and construction of designs under a centered version of the L2-discrepancy are analyzed. The theoretical expectation and variance of this discrepancy are derived for random designs and Latin hypercube designs. The expectation and variance of Latin hypercube designs are significantly lower than those of random designs. While in dimension one the unique uniform design is also a set of equidistant points, low-discrepancy designs in higher dimensions have to be generated by explicit optimization. Optimization is performed using the threshold-accepting heuristic, which produces designs with low discrepancy relative to the theoretical expectation and variance.
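
For reference, the centered L2-discrepancy has a closed form due to Hickernell (1998), which I take to be the centered version analyzed here (and the quantity implemented in the sketch after the first abstract). For n points x_i = (x_{i1}, ..., x_{id}) in [0,1]^d:

$$
\mathrm{CD}_2(P)^2 = \Bigl(\tfrac{13}{12}\Bigr)^d
- \frac{2}{n}\sum_{i=1}^{n}\prod_{k=1}^{d}\Bigl(1+\tfrac12\bigl|x_{ik}-\tfrac12\bigr|-\tfrac12\bigl|x_{ik}-\tfrac12\bigr|^{2}\Bigr)
+ \frac{1}{n^{2}}\sum_{i=1}^{n}\sum_{j=1}^{n}\prod_{k=1}^{d}\Bigl(1+\tfrac12\bigl|x_{ik}-\tfrac12\bigr|+\tfrac12\bigl|x_{jk}-\tfrac12\bigr|-\tfrac12\bigl|x_{ik}-x_{jk}\bigr|\Bigr).
$$

Because the double sum costs only O(n^2 d) operations, the discrepancy of a candidate design can be recomputed cheaply inside a threshold-accepting search — in contrast to the star discrepancy treated in the next paper.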

186 citations

Journal ArticleDOI
TL;DR: An implementation of the efficient multiple-purpose threshold-accepting heuristic, an assessment of its performance for some small examples, and results for larger sets of points with unknown discrepancy are presented.
Abstract: Efficient routines for multidimensional numerical integration are provided by quasi-Monte Carlo methods. These methods are based on evaluating the integrand at a set of representative points of the integration area. A set may be called representative if it shows a low discrepancy. However, in dimensions higher than two and for a large number of points, the evaluation of discrepancy becomes infeasible. The efficient multiple-purpose heuristic threshold accepting offers the possibility to obtain at least good approximations to the discrepancy of a given set of points. This paper presents an implementation of the threshold-accepting heuristic, an assessment of its performance for some small examples, and results for larger sets of points with unknown discrepancy.
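
To see why exact evaluation becomes infeasible: the supremum defining the star discrepancy is attained on the grid spanned by the points' own coordinates, so brute force must inspect on the order of (n+1)^d anchored boxes. A small illustrative Python sketch follows (the open/closed box handling uses the standard trick; this is not the paper's method, which instead approximates the value heuristically):

```python
import itertools
import numpy as np

def star_discrepancy(X):
    """Brute-force star discrepancy of points X in [0,1)^d.

    The supremum over anchored boxes [0, y) is attained on the grid built
    from the points' coordinates (plus 1), so we enumerate that grid —
    up to (n+1)^d candidate boxes, which is what makes exact evaluation
    infeasible beyond small n and d."""
    n, d = X.shape
    grids = [np.unique(np.concatenate([X[:, k], [1.0]])) for k in range(d)]
    disc = 0.0
    for y in itertools.product(*grids):
        y = np.array(y)
        vol = np.prod(y)
        inside_open = np.all(X < y, axis=1).sum() / n     # points in [0, y)
        inside_closed = np.all(X <= y, axis=1).sum() / n  # points in [0, y]
        disc = max(disc, vol - inside_open, inside_closed - vol)
    return disc

# 2-D example: ~n^2 boxes here, but the count grows like n^d in d dimensions.
rng = np.random.default_rng(0)
print(star_discrepancy(rng.random((16, 2))))
```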

147 citations

Journal ArticleDOI
TL;DR: In this article, the authors assess the determinants and effects of financing constraints at the firm level, using a standard model of credit rationing based on asymmetric information in which firm age and size influence the probability of being constrained.
Abstract: This paper focuses on the empirical assessment of determinants and effects of financing constraints at the firm level. Using a standard model of credit rationing based on asymmetric information, firm age and size are found to be factors which should influence the probability of financing constraints. Improving business conditions strengthen the degree of informational asymmetry. A unique panel of firm data for Germany, including direct information on financing constraints, is used for the econometric analysis. Firms' size and improving business conditions are found to have a significant effect. Furthermore, a significant impact on investment and R&D expenditures cannot be rejected.
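
As an illustration of the kind of discrete-choice estimation such an analysis involves (the data-generating process and variable names below are entirely hypothetical, not the German panel used in the paper), a probit of a direct financing-constraint indicator on firm characteristics might look like:

```python
import numpy as np
import statsmodels.api as sm

# Synthetic firm data: probability of being financing-constrained falls
# with firm size and age — a stylized version of the credit-rationing story.
rng = np.random.default_rng(0)
n = 500
log_size = rng.normal(4.0, 1.0, n)      # hypothetical log employment
log_age = rng.normal(2.5, 0.7, n)       # hypothetical log firm age
latent = 1.5 - 0.3 * log_size - 0.4 * log_age + rng.standard_normal(n)
constrained = (latent > 0).astype(int)  # direct survey-style indicator

X = sm.add_constant(np.column_stack([log_size, log_age]))
probit = sm.Probit(constrained, X).fit(disp=0)
print(probit.summary())
```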

129 citations


Cited by
Journal ArticleDOI
TL;DR: This paper models deterministic computer-experiment output as the realization of a stochastic process, providing a statistical basis for designing experiments and for predicting output from training data, together with uncertainty estimates for the predictions.
Abstract: Many scientific phenomena are now investigated by complex computer models or codes. A computer experiment is a number of runs of the code with various inputs. A feature of many computer experiments is that the output is deterministic: rerunning the code with the same inputs gives identical observations. Often, the codes are computationally expensive to run, and a common objective of an experiment is to fit a cheaper predictor of the output to the data. Our approach is to model the deterministic output as the realization of a stochastic process, thereby providing a statistical basis for designing experiments (choosing the inputs) for efficient prediction. With this model, estimates of uncertainty of predictions are also available. Recent work in this area is reviewed, a number of applications are discussed, and we demonstrate our methodology with an example.
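
A minimal sketch of the stochastic-process (kriging) predictor the abstract describes, assuming a zero-mean Gaussian process with a Gaussian correlation function; real implementations add a regression mean and estimate the correlation parameter by maximum likelihood, both omitted here:

```python
import numpy as np

def gp_fit_predict(X, y, X_new, theta=10.0, nugget=1e-10):
    """Kriging sketch: model deterministic code output y = f(X) as a draw
    from a zero-mean Gaussian process with correlation
    R(x, x') = exp(-theta * ||x - x'||^2). Returns the predictive mean and
    variance at X_new."""
    def corr(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-theta * d2)

    R = corr(X, X) + nugget * np.eye(len(X))  # nugget only for conditioning
    r = corr(X_new, X)
    mean = r @ np.linalg.solve(R, y)
    var = 1.0 - np.einsum('ij,ji->i', r, np.linalg.solve(R, r.T))
    return mean, np.maximum(var, 0.0)

# Toy "expensive code": the predictor interpolates the training runs exactly
# (deterministic output), with zero predictive variance at the design points.
X = np.linspace(0, 1, 7)[:, None]
y = np.sin(4 * X[:, 0])
Xn = np.linspace(0, 1, 5)[:, None]
mu, var = gp_fit_predict(X, y, Xn)
```

Because the model interpolates, the predictive variance vanishes at the design points and grows away from them — this is what gives the statistical basis for choosing inputs for efficient prediction.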

6,583 citations

Posted Content
TL;DR: A theme of the text is the use of artificial regressions for estimation, inference, and specification testing of nonlinear models, including diagnostic tests for parameter constancy, serial correlation, heteroscedasticity, and other types of mis-specification.
Abstract: Offering a unifying theoretical perspective not readily available in any other text, this innovative guide to econometrics uses simple geometrical arguments to develop students' intuitive understanding of basic and advanced topics, emphasizing throughout the practical applications of modern theory and nonlinear techniques of estimation. One theme of the text is the use of artificial regressions for estimation, inference, and specification testing of nonlinear models, including diagnostic tests for parameter constancy, serial correlation, heteroscedasticity, and other types of mis-specification. Explaining how estimates can be obtained and tests can be carried out, the authors go beyond a mere algebraic description to one that can be easily translated into the commands of a standard econometric software package. Covering an unprecedented range of problems with a consistent emphasis on those that arise in applied work, this accessible and coherent guide to the most vital topics in econometrics today is indispensable for advanced students of econometrics and students of statistics interested in regression and related topics. It will also suit practising econometricians who want to update their skills. Flexibly designed to accommodate a variety of course levels, it offers both complete coverage of the basic material and separate chapters on areas of specialized interest.
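
The artificial-regression idea can be made concrete with the Gauss-Newton regression (GNR) for a nonlinear model y = x(β) + u: regress the current residuals on the Jacobian of x(·) at β. At the NLS estimates the GNR coefficients are approximately zero; away from them, they give a one-step improvement, and the same regression underlies many of the book's specification tests. The sketch below uses a hypothetical model and synthetic data:

```python
import numpy as np

def gnr(beta, x_func, y, x_data, h=1e-6):
    """Gauss-Newton regression: regress residuals u = y - x(beta) on the
    (forward-difference) Jacobian of x(.) at beta."""
    u = y - x_func(beta, x_data)
    J = np.empty((len(y), len(beta)))
    for j in range(len(beta)):
        bp = beta.copy()
        bp[j] += h
        J[:, j] = (x_func(bp, x_data) - x_func(beta, x_data)) / h
    b, *_ = np.linalg.lstsq(J, u, rcond=None)
    return b, u - J @ b

# Hypothetical nonlinear model: y = b0 * exp(b1 * x) + noise.
rng = np.random.default_rng(0)
x = rng.random(200)
y = 2.0 * np.exp(0.5 * x) + 0.1 * rng.standard_normal(200)
f = lambda b, x: b[0] * np.exp(b[1] * x)

beta = np.array([1.0, 0.0])
for _ in range(25):            # iterating GNR steps is Gauss-Newton NLS
    step, _ = gnr(beta, f, y, x)
    beta = beta + step
print(beta)                    # roughly (2.0, 0.5)
```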

4,284 citations

01 Feb 2016

1,970 citations

Journal ArticleDOI
01 Jan 2006
TL;DR: This work reviews the state-of-the-art metamodel-based techniques from a practitioner's perspective according to the role of metamodeling in supporting design optimization, including model approximation, design space exploration, problem formulation, and solving various types of optimization problems.
Abstract: Computation-intensive design problems are becoming increasingly common in manufacturing industries. The computation burden is often caused by expensive analysis and simulation processes needed to reach a level of accuracy comparable to physical testing data. To address such a challenge, approximation or metamodeling techniques are often used. Metamodeling techniques have been developed from many different disciplines including statistics, mathematics, computer science, and various engineering disciplines. These metamodels are initially developed as “surrogates” of the expensive simulation process in order to improve the overall computation efficiency. They are then found to be a valuable tool to support a wide scope of activities in modern engineering design, especially design optimization. This work reviews the state-of-the-art metamodel-based techniques from a practitioner’s perspective according to the role of metamodeling in supporting design optimization, including model approximation, design space exploration, problem formulation, and solving various types of optimization problems. Challenges and future development of metamodeling in support of engineering design are also analyzed and discussed.
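
The metamodel-based optimization loop the review surveys reduces to four steps: run the expensive simulation on a small design, fit a cheap surrogate, optimize the surrogate, then evaluate the true function at the proposed point and refit. A deliberately simple sketch with a 1-D quadratic response surface as the metamodel (kriging or radial basis functions are the common practical choices):

```python
import numpy as np

def surrogate_opt(f, lo, hi, n_init=5, n_iter=10, seed=None):
    """Surrogate-based optimization sketch: fit a quadratic metamodel to
    the expensive evaluations, move to the surrogate minimum, evaluate the
    true function there, and refit."""
    rng = np.random.default_rng(seed)
    x = np.linspace(lo, hi, n_init)        # initial design
    y = np.array([f(v) for v in x])        # expensive evaluations
    for _ in range(n_iter):
        c = np.polyfit(x, y, 2)            # fit the cheap surrogate
        x_new = (np.clip(-c[1] / (2 * c[0]), lo, hi)
                 if c[0] > 0 else rng.uniform(lo, hi))  # surrogate minimum
        x = np.append(x, x_new)
        y = np.append(y, f(x_new))         # one new expensive run per cycle
    i = np.argmin(y)
    return x[i], y[i]

# Hypothetical expensive simulation, stood in for by a cheap function.
x_best, y_best = surrogate_opt(lambda v: (v - 0.7) ** 2 + 0.1 * np.sin(8 * v),
                               0.0, 1.0, seed=1)
```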

1,503 citations

Journal ArticleDOI
TL;DR: This book presents methods for modelling financial time series, covering price volatility, forecasting, tests of the random walk hypothesis, evidence on the efficiency of futures markets, and option valuation.
Abstract: Contents: Features of Financial Returns; Modelling Price Volatility; Forecasting Standard Deviations; The Accuracy of Autocorrelation Estimates; Testing the Random Walk Hypothesis; Forecasting Trends in Prices; Evidence Against the Efficiency of Futures Markets; Valuing Options; Appendix: A Computer Program for Modelling Financial Time Series.

1,115 citations