Institution
Leibniz University of Hanover
Education • Hanover, Niedersachsen, Germany
About: Leibniz University of Hanover is an educational institution based in Hanover, Niedersachsen, Germany. It is known for its research contributions in the topics Finite element method & Computer science. The organization has 14283 authors who have published 29845 publications, receiving 682152 citations.
Papers published on a yearly basis
Papers
TL;DR: In this article, Monte Carlo simulations of a lattice-gas reaction model incorporating rapid CO diffusion were used to investigate the bifurcation point of catalytic CO oxidation on a Pt field emitter tip.
Abstract: Fluctuations which arise in catalytic CO oxidation on a Pt field emitter tip have been studied with field electron microscopy as the imaging method. Fluctuation-driven transitions between the active and the inactive branch of the reaction are found to occur sufficiently close to the bifurcation point, terminating the bistable range. The experimental results are modeled with Monte Carlo simulations of a lattice-gas reaction model incorporating rapid CO diffusion. © 1999 The American Physical Society
134 citations
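The lattice-gas reaction model mentioned above can be illustrated with a minimal Ziff-Gulari-Barshad-style Monte Carlo sketch. This is a generic simplification, not the paper's actual model: the rapid CO diffusion the abstract emphasizes is omitted, and all parameters (`size`, `y_co`, `steps`) are hypothetical.

```python
import random

# Site states on the square lattice
EMPTY, CO, O = 0, 1, 2

def zgb_step(lattice, size, y_co):
    """One adsorption attempt: CO impinges with probability y_co, else O2.
    Simplified ZGB-style rules; the paper's model also includes fast CO
    diffusion, which is not modeled here."""
    i, j = random.randrange(size), random.randrange(size)
    nbrs = [((i + 1) % size, j), ((i - 1) % size, j),
            (i, (j + 1) % size), (i, (j - 1) % size)]
    if random.random() < y_co:
        if lattice[i][j] == EMPTY:
            lattice[i][j] = CO
            for ni, nj in nbrs:              # CO + O -> CO2, both sites empty
                if lattice[ni][nj] == O:
                    lattice[i][j] = EMPTY
                    lattice[ni][nj] = EMPTY
                    break
    else:
        ni, nj = random.choice(nbrs)         # O2 dissociation needs two sites
        if lattice[i][j] == EMPTY and lattice[ni][nj] == EMPTY:
            lattice[i][j] = O
            lattice[ni][nj] = O

def run(size=32, y_co=0.52, steps=50_000, seed=1):
    """Run the toy simulation and return the final CO and O coverages."""
    random.seed(seed)
    lattice = [[EMPTY] * size for _ in range(size)]
    for _ in range(steps):
        zgb_step(lattice, size, y_co)
    n = size * size
    co_cov = sum(row.count(CO) for row in lattice) / n
    o_cov = sum(row.count(O) for row in lattice) / n
    return co_cov, o_cov
```

Sweeping `y_co` in such models reveals the bistable range and its terminating bifurcation point that the experiment probes.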
TL;DR: In this article, an all-sky search with the LIGO detectors for periodic gravitational waves in the frequency range 50–1000 Hz, and with the frequency's time derivative in the range −1×10^(−8) Hz s^(−1) to zero, is reported using data from the fourth LIGO science run (S4).
Abstract: We report on an all-sky search with the LIGO detectors for periodic gravitational waves in the frequency range 50–1000 Hz and with the frequency’s time derivative in the range −1×10^(−8) Hz s^(−1) to zero. Data from the fourth LIGO science run (S4) have been used in this search. Three different semicoherent methods of transforming and summing strain power from short Fourier transforms (SFTs) of the calibrated data have been used. The first, known as StackSlide, averages normalized power from each SFT. A “weighted Hough” scheme is also developed and used, which also allows for a multi-interferometer search. The third method, known as PowerFlux, is a variant of the StackSlide method in which the power is weighted before summing. In both the weighted Hough and PowerFlux methods, the weights are chosen according to the noise and detector antenna-pattern to maximize the signal-to-noise ratio. The respective advantages and disadvantages of these methods are discussed. Observing no evidence of periodic gravitational radiation, we report upper limits; we interpret these as limits on this radiation from isolated rotating neutron stars. The best population-based upper limit with 95% confidence on the gravitational-wave strain amplitude, found for simulated sources distributed isotropically across the sky and with isotropically distributed spin axes, is 4.28×10^(−24) (near 140 Hz). Strict upper limits are also obtained for small patches on the sky for best-case and worst-case inclinations of the spin axes.
134 citations
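The StackSlide step described in the abstract, averaging normalized power from each short Fourier transform (SFT), can be sketched in a toy form. This is a simplified illustration only: the real search also slides frequency bins to track the source's Doppler modulation and spin-down, which is omitted here.

```python
import numpy as np

def stackslide_power(segments):
    """Average the normalized power spectra of short data segments.
    Each SFT's power is normalized by its own mean (a stand-in for the
    noise-floor estimate) and the results are averaged, so a weak
    periodic signal accumulates while noise averages down."""
    avg = None
    for seg in segments:
        power = np.abs(np.fft.rfft(seg)) ** 2
        power = power / power.mean()          # normalize per-SFT noise level
        avg = power if avg is None else avg + power
    return avg / len(segments)
```

With many 1-second segments containing a common 50 Hz tone buried in noise, the averaged spectrum peaks at bin 50, which is the semicoherent gain the abstract describes.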
TL;DR: In this article, the Schrödinger-Newton equation for spherically symmetric gravitational fields can be derived in a WKB-like expansion in 1/c from the Einstein-Klein-Gordon and Einstein-Dirac system.
Abstract: In this paper we show that the Schrödinger-Newton equation for spherically symmetric gravitational fields can be derived in a WKB-like expansion in 1/c from the Einstein-Klein-Gordon and Einstein-Dirac system.
134 citations
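For reference, the Schrödinger-Newton system the abstract refers to couples the Schrödinger equation to a Newtonian potential sourced by the wave function's own probability density (standard textbook form, not taken from the paper itself):

```latex
i\hbar\,\frac{\partial\psi}{\partial t}
  = -\frac{\hbar^{2}}{2m}\,\nabla^{2}\psi + m\,\Phi\,\psi,
\qquad
\nabla^{2}\Phi = 4\pi G\, m\,|\psi|^{2}.
```

The paper's contribution is showing that this nonrelativistic system emerges from the fully relativistic Einstein-Klein-Gordon and Einstein-Dirac equations in a 1/c expansion.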
TL;DR: This paper systemizes blocking methods for clean-clean ER (an inherently quadratic task) over highly heterogeneous information spaces (HHIS) through a novel framework that consists of two orthogonal layers: the effectiveness layer encompasses methods for building overlapping blocks with small likelihood of missed matches, and the efficiency layer comprises a rich variety of techniques that significantly restrict the required number of pairwise comparisons.
Abstract: In the context of entity resolution (ER) in highly heterogeneous, noisy, user-generated entity collections, practically all block building methods employ redundancy to achieve high effectiveness. This practice, however, results in a high number of pairwise comparisons, with a negative impact on efficiency. Existing block processing strategies aim at discarding unnecessary comparisons at no cost in effectiveness. In this paper, we systemize blocking methods for clean-clean ER (an inherently quadratic task) over highly heterogeneous information spaces (HHIS) through a novel framework that consists of two orthogonal layers: the effectiveness layer encompasses methods for building overlapping blocks with small likelihood of missed matches; the efficiency layer comprises a rich variety of techniques that significantly restrict the required number of pairwise comparisons, having a controllable impact on the number of detected duplicates. We map to our framework all relevant existing methods for creating and processing blocks in the context of HHIS, and additionally propose two novel techniques: attribute clustering blocking and comparison scheduling. We evaluate the performance of each layer and method on two large-scale, real-world data sets and validate the excellent balance between efficiency and effectiveness that they achieve.
134 citations
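The redundancy-based block building the abstract describes can be sketched with schema-agnostic token blocking: every entity joins one block per token found in any of its attribute values, so noisy, heterogeneous profiles still co-occur in some block. This sketch is generic, ignores the clean-clean restriction (comparing only across two collections), and uses made-up entity data.

```python
from collections import defaultdict
from itertools import combinations

def token_blocking(entities):
    """Build overlapping blocks keyed by attribute-value tokens.
    `entities` maps an entity id to a dict of attribute -> value;
    attribute names are deliberately ignored (schema-agnostic),
    which suits highly heterogeneous information spaces."""
    blocks = defaultdict(set)
    for eid, attrs in entities.items():
        for value in attrs.values():
            for token in str(value).lower().split():
                blocks[token].add(eid)
    # singleton blocks yield no comparisons, so drop them
    return {tok: ids for tok, ids in blocks.items() if len(ids) > 1}

def candidate_pairs(blocks):
    """Distinct entity pairs sharing at least one block; the size of
    this set is the pairwise-comparison cost the efficiency layer
    then tries to shrink."""
    return {pair
            for ids in blocks.values()
            for pair in combinations(sorted(ids), 2)}
```

Entities with different attribute names ("name" vs. "fullName") but overlapping tokens still end up in a shared block, which is the effectiveness-layer guarantee; deduplicating pairs across blocks is a first, trivial efficiency-layer step.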
Authors
| Name | h-index | Papers | Citations |
| --- | --- | --- | --- |
| Hyun-Chul Kim | 176 | 4076 | 183227 |
| Peter Zoller | 134 | 734 | 76093 |
| J. R. Smith | 134 | 1335 | 107641 |
| Chao Zhang | 127 | 3119 | 84711 |
| Benjamin William Allen | 124 | 807 | 87750 |
| J. F. J. van den Brand | 123 | 777 | 93070 |
| J. H. Hough | 117 | 904 | 89697 |
| Hans-Peter Seidel | 112 | 1213 | 51080 |
| Karsten Danzmann | 112 | 754 | 80032 |
| Bruce D. Hammock | 111 | 1409 | 57401 |
| Benno Willke | 109 | 508 | 74673 |
| Roman Schnabel | 108 | 589 | 71938 |
| Jan Harms | 108 | 447 | 76132 |
| Hartmut Grote | 108 | 434 | 72781 |
| Ik Siong Heng | 107 | 423 | 71830 |