
Showing papers by "University of Mannheim" published in 2001


Journal ArticleDOI
TL;DR: An overview of ad hoc routing protocols that make forwarding decisions based on the geographical position of a packet's destination; previously proposed location services are discussed in addition to position-based packet forwarding strategies.
Abstract: We present an overview of ad hoc routing protocols that make forwarding decisions based on the geographical position of a packet's destination. Other than the destination's position, each node need know only its own position and the position of its one-hop neighbors in order to forward packets. Since it is not necessary to maintain explicit routes, position-based routing does scale well even if the network is highly dynamic. This is a major advantage in a mobile ad hoc network where the topology may change frequently. The main prerequisite for position-based routing is that a sender can obtain the current position of the destination. Therefore, previously proposed location services are discussed in addition to position-based packet forwarding strategies. We provide a qualitative comparison of the approaches in both areas and investigate opportunities for future research.
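The core forwarding step the survey describes can be stated in a few lines. Below is a minimal sketch of greedy position-based forwarding, assuming each node knows only its own position, its one-hop neighbors' positions, and the destination's position; the function name and data layout are illustrative, not from the paper.

```python
import math

def greedy_next_hop(node_pos, neighbor_positions, dest_pos):
    """Forward to the one-hop neighbor geographically closest to the
    destination; return None at a local maximum (no neighbor is closer
    than the current node), where real protocols fall back to a
    recovery strategy such as perimeter routing."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    best, best_d = None, dist(node_pos, dest_pos)
    for nbr, pos in neighbor_positions.items():
        d = dist(pos, dest_pos)
        if d < best_d:
            best, best_d = nbr, d
    return best

# e.g. greedy_next_hop((0, 0), {"a": (1, 1), "b": (2, 0)}, (5, 0)) -> "b"
```

Because the decision uses only local state, no per-destination routes have to be maintained, which is exactly why the surveyed protocols scale under high mobility.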

1,722 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed a set of hypotheses related to the moderating effect of selected personal characteristics on the satisfaction-loyalty link, including variety seeking, age, and income.
Abstract: Previous research on the relationship between customer satisfaction and loyalty has largely neglected the issue of moderator variables. The authors develop a set of hypotheses related to the moderating effect of selected personal characteristics on the satisfaction-loyalty link. These hypotheses are tested in a consumer durables context using multiple group causal analysis. Empirical findings provide reasonable support for the theoretical arguments. Specifically, variety seeking, age, and income are found to be important moderators of the satisfaction-loyalty relationship.

1,170 citations


Journal ArticleDOI
TL;DR: In this paper, the authors analyze a model where a multinational firm can use a superior technology in a foreign subsidiary only after training a local worker, and show that the multinational firm might find it optimal to export instead of investing abroad to avoid dissipation of its intangible assets or the payment of a higher wage to the trained worker.

995 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed a model that explains how supplier behaviors and the management of suppliers affect a customer firm's direct product, acquisition, and operations costs, and proposed that these costs mediate the relationship between buyer-supplier relationship behaviors and the customer firm's intentions to expand future purchases from the supplier.
Abstract: Academic literature and business practice are directing increased attention to the importance of creating value in buyer-supplier relationships. One method for creating value is to reduce costs in commercial exchange. The authors develop a model that explains how supplier behaviors and the management of suppliers affect a customer firm’s direct product, acquisition, and operations costs. The model proposes that these costs mediate the relationship between buyer-supplier relationship behaviors and the customer firm’s intentions to expand future purchases from the supplier. The model is tested on data collected from almost 500 buying organizations in the United States and Germany. The results indicate that increased communication frequency, different forms of supplier accommodation, product quality, and the geographic closeness of the supplier’s facilities to the customer’s buying location lower customer firm costs. In addition, customer firms intend to increase purchases from suppliers that provid...

811 citations


Journal ArticleDOI
TL;DR: In this paper, the authors study efficient Bayes-Nash incentive compatible mechanisms in a social choice setting that allows for informational and allocative externalities, and show that such mechanisms exist only if a congruence condition relating private and social rates of information substitution is satisfied.
Abstract: We study efficient, Bayes-Nash incentive compatible mechanisms in a social choice setting that allows for informational and allocative externalities. We show that such mechanisms exist only if a congruence condition relating private and social rates of information substitution is satisfied. If signals are multi-dimensional, the congruence condition is determined by an integrability constraint, and it can hold only in nongeneric cases where values are private or a certain symmetry assumption holds. If signals are one-dimensional, the congruence condition reduces to a monotonicity constraint and it can be generically satisfied. We apply the results to the study of multi-object auctions, and we discuss why such auctions cannot be reduced to one-dimensional models without loss of generality.

493 citations


Journal ArticleDOI
TL;DR: A survey of current approaches to TCP friendliness is presented, both unicast and multicast congestion control protocols are examined, and an evaluation of the different approaches is presented.
Abstract: New trends in communication, in particular the deployment of multicast and real-time audio/video streaming applications, are likely to increase the percentage of non-TCP traffic in the Internet. These applications rarely perform congestion control in a TCP-friendly manner; they do not share the available bandwidth fairly with applications built on TCP, such as Web browsers, FTP, or e-mail clients. The Internet community strongly fears that the current evolution could lead to congestion collapse and starvation of TCP traffic. For this reason, TCP-friendly protocols are being developed that behave fairly with respect to coexistent TCP flows. We present a survey of current approaches to TCP friendliness and discuss their characteristics. Both unicast and multicast congestion control protocols are examined, and an evaluation of the different approaches is presented.
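The notion of TCP friendliness surveyed here is usually made operational through the TCP throughput model of Padhye et al.: a flow is TCP-friendly if its rate does not exceed what a conforming TCP connection would achieve under the same loss rate and round-trip time. A sketch, using the common TFRC-style simplification t_RTO = 4*RTT (an assumption, not a universal setting):

```python
import math

def tcp_friendly_rate(s, rtt, p, t_rto=None):
    """Upper bound on a TCP-friendly sending rate in bytes/second.

    s: packet size (bytes), rtt: round-trip time (seconds),
    p: loss event rate in (0, 1]."""
    if t_rto is None:
        t_rto = 4 * rtt  # simplification commonly used by TFRC-style protocols
    denom = (rtt * math.sqrt(2 * p / 3)
             + t_rto * 3 * math.sqrt(3 * p / 8) * p * (1 + 32 * p ** 2))
    return s / denom

# e.g. 1460-byte packets, 100 ms RTT, 1% loss:
# tcp_friendly_rate(1460, 0.1, 0.01) -> roughly 160 KB/s
```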

370 citations


Journal ArticleDOI
TL;DR: This taxonomy provides a unifying framework for data-driven and flow-driven, isotropic and anisotropic, as well as spatial and spatio-temporal regularizers, and proves that all these methods are well-posed.
Abstract: Many differential methods for the recovery of the optic flow field from an image sequence can be expressed in terms of a variational problem where the optic flow minimizes some energy. Typically, these energy functionals consist of two terms: a data term, which requires e.g. that a brightness constancy assumption holds, and a regularizer that encourages global or piecewise smoothness of the flow field. In this paper we present a systematic classification of rotation invariant convex regularizers by exploring their connection to diffusion filters for multichannel images. This taxonomy provides a unifying framework for data-driven and flow-driven, isotropic and anisotropic, as well as spatial and spatio-temporal regularizers. While some of these techniques are classic methods from the literature, others are derived here for the first time. We prove that all these methods are well-posed: they possess a unique solution that depends in a continuous way on the initial data. An interesting structural relation between isotropic and anisotropic flow-driven regularizers is identified, and a design criterion is proposed for constructing anisotropic flow-driven regularizers in a simple and direct way from isotropic ones. Its use is illustrated by several examples.
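To make the "data term + regularizer" structure concrete, the classic Horn and Schunck energy is the simplest member of the class the paper classifies (quadratic data term from brightness constancy, homogeneous quadratic regularizer):

```latex
E(u,v) = \int_{\Omega}
  \underbrace{\left(I_x u + I_y v + I_t\right)^2}_{\text{data term}}
  \;+\; \alpha \underbrace{\left(|\nabla u|^2 + |\nabla v|^2\right)}_{\text{regularizer}}
  \; dx\,dy
```

The paper's taxonomy is obtained by varying the regularizer, e.g. making the smoothness weight depend on image or flow gradients (data-driven vs. flow-driven) and on direction (isotropic vs. anisotropic).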

343 citations


Journal ArticleDOI
TL;DR: Qualitative and quantitative results show that the spatio-temporal approach leads to a rotationally invariant and time symmetric convex optimization problem and has a unique minimum that can be found in a stable way by standard algorithms such as gradient descent.
Abstract: Nonquadratic variational regularization is a well-known and powerful approach for the discontinuity-preserving computation of optic flow. In the present paper, we consider an extension of flow-driven spatial smoothness terms to spatio-temporal regularizers. Our method leads to a rotationally invariant and time symmetric convex optimization problem. It has a unique minimum that can be found in a stable way by standard algorithms such as gradient descent. Since the convexity guarantees global convergence, the result does not depend on the flow initialization. Two iterative algorithms are presented that are not difficult to implement. Qualitative and quantitative results for synthetic and real-world scenes show that our spatio-temporal approach (i) improves optic flow fields significantly, (ii) smoothes out background noise efficiently, and (iii) preserves true motion boundaries. The computational costs are only 50% higher than for a pure spatial approach applied to all subsequent image pairs of the sequence.

318 citations


Journal ArticleDOI
TL;DR: An efficient implementation of correlation based disparity calculation with high speed and reasonable quality that can be used in a wide range of applications or to provide an initial solution for more sophisticated methods is presented.
Abstract: This paper presents an efficient implementation for correlation based stereo. Research in this area can roughly be divided into two classes: improving accuracy regardless of computing time, and scene reconstruction in real time. Algorithms achieving video frame rates must impose strong limitations on image size and disparity search range, whereas high quality results often need several minutes per image pair. This paper tries to fill the gap: it provides instructions on how to implement correlation based disparity calculation with high speed and reasonable quality that can be used in a wide range of applications or to provide an initial solution for more sophisticated methods. Left-right consistency checking and uniqueness validation are used to eliminate false matches. Optionally, a fast median filter can be applied to the results to further remove outliers. Source code will be made publicly available as a contribution to the Open Source Computer Vision Library; further acceleration with SIMD instructions is planned for the near future.
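As a rough illustration of two components named in the abstract, here is a deliberately naive sketch of SAD block matching plus a left-right consistency check; it is orders of magnitude slower than the optimized implementation the paper describes, and the window size and threshold are illustrative.

```python
import numpy as np

def disparity_sad(left, right, max_disp, win=4):
    """Brute-force SAD block matching (left image as reference)."""
    h, w = left.shape
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(win, h - win):
        for x in range(win + max_disp, w - win):
            patch = left[y - win:y + win + 1, x - win:x + win + 1]
            costs = [np.abs(patch - right[y - win:y + win + 1,
                                          x - d - win:x - d + win + 1]).sum()
                     for d in range(max_disp + 1)]
            disp[y, x] = int(np.argmin(costs))
    return disp

def lr_consistency(disp_l, disp_r, tol=1):
    """Keep a left-image match only if the corresponding right-image
    pixel maps back to (roughly) the same place; failures become -1.
    disp_r must be computed symmetrically, with the right image as
    reference."""
    h, w = disp_l.shape
    out = np.full_like(disp_l, -1)
    for y in range(h):
        for x in range(w):
            d = disp_l[y, x]
            if 0 <= x - d < w and abs(disp_r[y, x - d] - d) <= tol:
                out[y, x] = d
    return out
```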

318 citations


Journal ArticleDOI
TL;DR: A novel sarcomeric 145-kD protein, myopalladin, which tethers together the COOH-terminal Src homology 3 domains of nebulin and nebulette with the EF hand motifs of α-actinin in vertebrate Z-lines is described, suggesting that palladin and myopalladin may have conserved roles in stress fiber and Z-line assembly.
Abstract: We describe here a novel sarcomeric 145-kD protein, myopalladin, which tethers together the COOH-terminal Src homology 3 domains of nebulin and nebulette with the EF hand motifs of α-actinin in vertebrate Z-lines. Myopalladin's nebulin/nebulette and α-actinin–binding sites are contained in two distinct regions within its COOH-terminal 90-kD domain. Both sites are highly homologous with those found in palladin, a recently described protein required for actin cytoskeletal assembly (Parast, M.M., and C.A. Otey. 2000. J. Cell Biol. 150:643–656). This suggests that palladin and myopalladin may have conserved roles in stress fiber and Z-line assembly. The NH2-terminal region of myopalladin specifically binds to the cardiac ankyrin repeat protein (CARP), a nuclear protein involved in control of muscle gene expression. Immunofluorescence and immunoelectron microscopy studies revealed that myopalladin also colocalized with CARP in the central I-band of striated muscle sarcomeres. Overexpression of myopalladin's NH2-terminal CARP-binding region in live cardiac myocytes resulted in severe disruption of all sarcomeric components studied, suggesting that the myopalladin–CARP complex in the central I-band may have an important regulatory role in maintaining sarcomeric integrity. Our data also suggest that myopalladin may link regulatory mechanisms involved in Z-line structure (via α-actinin and nebulin/nebulette) to those involved in muscle gene expression (via CARP).

281 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed a multiple-item measure of industrial customer satisfaction and assessed its psychometric properties, and analyzed the influence of the identified dimensions of customer satisfaction on overall satisfaction.

Proceedings ArticleDOI
27 Aug 2001
TL;DR: In TFMCC, this paper improves upon the well-known approach of using exponentially weighted random timers by biasing feedback in favor of low-rate receivers while still preventing a response implosion.
Abstract: In this paper we introduce TFMCC, an equation-based multicast congestion control mechanism that extends the TCP-friendly TFRC protocol from the unicast to the multicast domain. The key challenges in the design of TFMCC lie in scalable round-trip time measurements, appropriate feedback suppression, and in ensuring that feedback delays in the control loop do not adversely affect fairness towards competing flows. A major contribution is the feedback mechanism, the key component of end-to-end multicast congestion control schemes. We improve upon the well-known approach of using exponentially weighted random timers by biasing feedback in favor of low-rate receivers while still preventing a response implosion. We evaluate the design using simulation, and demonstrate that TFMCC is both TCP-friendly and scales well to multicast groups with thousands of receivers. We also investigate TFMCC's weaknesses and scaling limits to provide guidance as to application domains for which it is well suited.
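The exponentially weighted random timer the paper builds on can be sketched as follows: with u drawn uniformly from (0, 1] and an estimated receiver count N, setting t = max(0, T*(1 + ln(u)/ln(N))) makes roughly one receiver in N fire almost immediately while the rest are suppressed by the earlier feedback. The rate-dependent bias below is a simplification for illustration only; TFMCC's actual biasing rule differs in detail.

```python
import math
import random

def feedback_timer(T, n_est, my_rate, current_rate):
    """Hedged sketch of a biased, exponentially weighted feedback timer.

    T: length of the feedback round (seconds), n_est: estimated number
    of receivers, my_rate: this receiver's calculated TCP-friendly rate,
    current_rate: the sender's current rate."""
    u = random.random() or 1e-12          # avoid u == 0
    t = max(0.0, T * (1 + math.log(u) / math.log(n_est)))
    if my_rate < current_rate:            # bias: low-rate receivers answer sooner
        t *= my_rate / current_rate       # simplified stand-in for TFMCC's rule
    return t
```

With this CDF, P(t <= s) = N^(s/T - 1) for s in [0, T], so the expected number of receivers firing at t = 0 is about one regardless of group size.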

Journal ArticleDOI
TL;DR: An empirical study is presented that shows that it is indeed necessary to allow the probability weighting function to be source dependent; the analysis includes an examination of properties of the probability weighting function under uncertainty that have not yet been considered.
Abstract: Decision weights are an important component in recent theories of decision making under uncertainty. To better explain these decision weights, a two-stage approach has been proposed: First, the probability of an event is judged and then this probability is transformed by the probability weighting function known from decision making under risk. We extend the two-stage approach by allowing the probability weighting function to depend on the type of uncertainty. Using this more general approach, properties of decision weights can be attributed to properties of probability judgments and/or to properties of probability weighting. We present an empirical study that shows that it is indeed necessary to allow the probability weighting function to be source dependent. The analysis includes an examination of properties of the probability weighting function under uncertainty that have not been considered yet.
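In symbols, the extended two-stage approach makes the weighting function depend on the source of uncertainty S; the one-parameter form shown for w_S is a common parameterization from the risk literature, used here only for concreteness, not necessarily the paper's specification:

```latex
W(E) \;=\; w_S\!\bigl(P(E)\bigr),
\qquad
w_S(p) \;=\; \frac{p^{\gamma_S}}{\bigl(p^{\gamma_S} + (1-p)^{\gamma_S}\bigr)^{1/\gamma_S}}
```

Here P(E) is the judged probability of event E, and the source-dependent parameter γ_S lets properties of the decision weights W be attributed either to the probability judgment or to the weighting stage.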

Journal ArticleDOI
TL;DR: In this paper, cluster analysis was used to form segments based on combinations of customer ratings for different attitudinal dimensions and benefits of bank service; four characteristic groups of customers were identified showing special preferences for and against information services and technology.
Abstract: Segmentation by demographic factors is widely used in bank marketing despite the fact that the correlation of such factors with the needs of customers is often weak. Segmentation by expected benefits and attitudes could enhance a bank’s ability to address the conflict between individual service and cost‐saving standardisation. Using cluster analysis segments were formed based on combinations of customer ratings for different attitudinal dimensions and benefits of bank service. The clusters generated in this way were superior in their homogeneity and profile to customer segments gained by referring to demographic differences. Additionally, four characteristic groups of customers were identified showing special preferences for and against information services and technology.
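A minimal sketch of the kind of attitudinal clustering described here, using k-means on standardized rating items; the synthetic data, item count, and the choice of k-means (rather than whatever clustering algorithm the authors actually used) are assumptions for illustration, with k = 4 mirroring the four groups reported.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# ratings: one row per customer, one column per attitudinal/benefit item
# (e.g. Likert-scale answers); values here are random and illustrative.
ratings = np.random.randint(1, 8, size=(500, 12)).astype(float)

X = StandardScaler().fit_transform(ratings)   # put items on a common scale
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# km.labels_ assigns each customer to one of the four segments;
# cluster centers (in standardized units) profile each segment.
print(km.cluster_centers_.round(2))
```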

Journal ArticleDOI
TL;DR: A de novo mutation in the sodium channel gene SCN5A associated with sudden infant death is reported; the mutant differed from previously reported LQTS3 mutants and caused a delay in final repolarization.
Abstract: Background— Congenital long QT syndrome (LQTS), a cardiac ion channel disease, is an important cause of sudden cardiac death. Prolongation of the QT interval has recently been associated with sudden infant death syndrome, which is the leading cause of death among infants between 1 week and 1 year of age. Available data suggest that early onset of congenital LQTS may contribute to premature sudden cardiac death in otherwise healthy infants. Methods and Results— In an infant who died suddenly at the age of 9 weeks, we performed mutation screening in all known LQTS genes. In the surface ECG soon after birth, a prolonged QTc interval (600 ms^1/2) and polymorphic ventricular tachyarrhythmias were documented. Mutational analysis identified a missense mutation (Ala1330Pro) in the cardiac sodium channel gene SCN5A, which was absent in both parents. Subsequent genetic testing confirmed paternity, thus suggesting a de novo origin. Voltage-clamp recordings of recombinant A1330P mutant channel expressed in HEK-293 cel...

Journal ArticleDOI
TL;DR: This paper develops a comprehensive framework for evaluating the quality of standard rating systems and suggests a number of principles that ought to be met by 'good rating practice', potentially relevant for the improvement of existing rating systems.
Abstract: Bank internal ratings of corporate clients are intended to quantify the expected likelihood of future borrower defaults. This paper develops a comprehensive framework for evaluating the quality of standard rating systems. We suggest a number of principles that ought to be met by “good rating practice”. These “generally accepted rating principles” are potentially relevant for the improvement of existing rating systems. They are also relevant for the development of certification standards for internal rating systems, as currently discussed in a consultative paper issued by the Bank for International Settlements in Basle, entitled “A new capital adequacy framework”. We would very much appreciate any comments by readers that help to develop these rating standards further.

Journal ArticleDOI
TL;DR: In this article, theoretical arguments and empirical evidence are presented showing that, for the situation in the German labor market, it is above all the second mechanism (underinvestment in human capital rather than discrimination) that appears to be relevant.
Abstract: Labor migrants and their descendants remain at a clear disadvantage in the German labor market. While this could be traced to fairly obvious causes in the early phases of immigration, many of those conditions no longer apply. The persistence of ethnic inequality could therefore point either to processes of discrimination or to a systematic underinvestment in labor-market-relevant human capital. This article presents theoretical arguments and empirical evidence that, for the situation in the German labor market, it is above all the second mechanism that appears to be relevant. Using data from the 1996 Mikrozensus, we examine whether a lower position in the labor market can still be observed once generational status and educational attainment are controlled for. The result is fairly clear-cut: the lower labor-market position of the second generation can be attributed almost exclusively to differences in education.

Journal ArticleDOI
TL;DR: In this paper, a theoretical analysis of the difference in aggregated and segregated portfolio evaluation demonstrates that the higher attractiveness of the aggregated presentation mode is not a general phenomenon, but depends on specific parameters of the lotteries.
Abstract: If individuals have to evaluate a sequence of lotteries, their judgment is influenced by the presentation mode. Experimental studies have found significantly higher acceptance rates for a sequence of lotteries if the overall distribution was displayed instead of the set of lotteries itself. Mental accounting and loss aversion provide an easy and intuitive explanation for this phenomenon. In this paper we offer an explanation that incorporates further evaluation concepts of Prospect Theory. Our formal analysis of the difference in aggregated and segregated portfolio evaluation demonstrates that the higher attractiveness of the aggregated presentation mode is not a general phenomenon (as suggested in the literature) but depends on specific parameters of the lotteries. The theoretical findings are supported by an experimental study. In contrast to the existing evidence and in line with our theoretical results, we find for specific types of lotteries an even lower acceptance rate if the overall distribution is displayed.
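The effect can be reproduced numerically. The sketch below evaluates a repeated mixed gamble both segregated (one mental account per play) and aggregated (prospect-theory value of the distribution of the sum); the Tversky-Kahneman value-function parameters are illustrative and probability weighting is omitted for brevity, so this is not the paper's exact specification.

```python
import itertools

ALPHA, LAMBDA = 0.88, 2.25   # Tversky-Kahneman (1992) parameters, illustrative

def v(x):
    """Prospect-theory value function: loss-averse, diminishing sensitivity."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def pt_value(lottery):
    """Expected v over a lottery given as pairs (outcome, probability)."""
    return sum(p * v(x) for x, p in lottery)

lottery = [(200, 0.5), (-100, 0.5)]      # a mixed gamble
n = 2

segregated = n * pt_value(lottery)       # each play in its own mental account

# aggregated: distribution of the SUM of n independent plays
outcomes = {}
for combo in itertools.product(lottery, repeat=n):
    total = sum(x for x, _ in combo)
    prob = 1.0
    for _, p in combo:
        prob *= p
    outcomes[total] = outcomes.get(total, 0.0) + prob
aggregated = pt_value(outcomes.items())

print(segregated, aggregated)
```

For this particular lottery the aggregated value is positive while the segregated value is negative; changing the outcomes can flip the comparison, matching the parameter dependence the paper emphasizes.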

Journal ArticleDOI
TL;DR: In this article, the authors examine a principal-agent model with multiple projects where a risk-neutral manager is protected by limited liability and show that incentive problems are a natural source of economies of scope.
Abstract: I examine a principal-agent model with multiple projects where a risk-neutral manager is protected by limited liability. The analysis has several interesting implications: (i) incentive problems are shown to be a natural source of economies of scope, as combining multiple projects under the management of a single manager relaxes the limited-liability constraint; (ii) as a result, managers may be overloaded with work and exert inefficiently high effort; and (iii) the analysis has implications for the optimal allocation of projects to different managers.

Journal ArticleDOI
TL;DR: This paper considers two well-founded PDE methods: a nonlinear isotropic diffusion filter that permits edge enhancement, and a convex nonquadratic variational image restoration method which gives good denoising.
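A standard instance of the nonlinear isotropic diffusion filter referred to here is the Perona-Malik model, shown for orientation; the paper's concrete filter and parameters may differ:

```latex
\partial_t u \;=\; \operatorname{div}\!\bigl(g(|\nabla u|^2)\,\nabla u\bigr),
\qquad
g(s^2) \;=\; \frac{1}{1 + s^2/\lambda^2}
```

The diffusivity g decreases with the gradient magnitude, so smoothing is inhibited, and can even be reversed, at edges, which is what permits edge enhancement.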

Book ChapterDOI
TL;DR: In this paper, the authors present a new framework to describe trends in the entire wage distribution across education and age groups in a parsimonious way, and explore whether wage trends are uniform across cohorts, thus defining a macroeconomic wage trend.
Abstract: The rise of unemployment in West Germany is often attributed to an inflexibility of the wage structure in the face of a skill bias in labor demand trends. In addition, there is concern in Germany that during the 70s and 80s unions were pursuing an overly egalitarian wage policy. In a cohort analysis, we estimate quantile regressions of wages taking account of the censoring in the data. We present a new framework to describe trends in the entire wage distribution across education and age groups in a parsimonious way. We explore whether wage trends are uniform across cohorts, thus defining a macroeconomic wage trend. Our findings are that wages of workers with intermediate education levels, especially those of young workers, deteriorated slightly relative to both high and low education levels. Wage inequality within age-education groups stayed fairly constant. Nevertheless, the German wage structure was fairly stable, especially in international comparison. The results appear consistent with a skill bias in labor demand trends, recognizing that union wages are only likely to be binding floors for low-wage earners.
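For orientation, plain (uncensored) quantile regression of log wages on worker characteristics looks as follows in statsmodels; the variable names and synthetic data are purely illustrative, and unlike this sketch the paper additionally accounts for the censoring of the wage data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# illustrative data; variable names are hypothetical, not the paper's
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "log_wage": rng.normal(3.0, 0.4, 2000),
    "educ": rng.integers(9, 19, 2000),
    "age": rng.integers(25, 56, 2000),
})

# one quantile regression per part of the conditional wage distribution
for q in (0.1, 0.5, 0.9):
    res = smf.quantreg("log_wage ~ educ + age", df).fit(q=q)
    print(q, res.params["educ"].round(4))
```

Comparing the education coefficient across quantiles is what lets the authors describe trends in the entire wage distribution rather than only at the mean.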

Journal ArticleDOI
TL;DR: Empirical and theoretical analyses of spectral hemispherical reflectances and transmittances of individual leaves and the entire canopy sampled at two sites representative of equatorial rainforests and temperate coniferous forests indicate that some simple algebraic combinations of leaf and canopy spectral transmittances and reflectances eliminate their dependencies on wavelength through the specification of two canopy-specific wavelength-independent variables.
Abstract: This paper presents empirical and theoretical analyses of spectral hemispherical reflectances and transmittances of individual leaves and the entire canopy sampled at two sites representative of equatorial rainforests and temperate coniferous forests. The empirical analysis indicates that some simple algebraic combinations of leaf and canopy spectral transmittances and reflectances eliminate their dependencies on wavelength through the specification of two canopy-specific wavelength-independent variables. These variables and leaf optical properties govern the energy conservation in vegetation canopies at any given wavelength of the solar spectrum. The presented theoretical development indicates these canopy-specific wavelength-independent variables characterize the capacity of the canopy to intercept and transmit solar radiation under two extreme situations, namely, when individual leaves 1) are completely absorptive and 2) totally reflect and/or transmit the incident radiation. The interactions of photons with the canopy at red and near-infrared (IR) spectral bands approximate these extreme situations well. One can treat the vegetation canopy as a dynamical system and the canopy spectral interception and transmission as dynamical variables. The system has two independent states: canopies with totally absorbing and totally scattering leaves. Intermediate states are a superposition of these pure states. Such an interpretation provides powerful means to accurately specify changes in canopy structure both from ground-based measurements and remotely sensed data. This concept underlies the operational algorithm of global leaf area index (LAI), and the fraction of photosynthetically active radiation absorbed by vegetation developed for the moderate resolution imaging spectroradiometer (MODIS) and multiangle imaging spectroradiometer (MISR) instruments of the Earth Observing System (EOS) Terra mission.
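A widely cited closed form from this line of work (formalized in later spectral-invariants papers, stated here only for orientation) expresses the canopy scattering coefficient through the leaf single-scattering albedo ω_L(λ) and a wavelength-independent, structure-only "recollision probability" p:

```latex
\omega_c(\lambda) \;=\; \frac{(1-p)\,\omega_L(\lambda)}{1 - p\,\omega_L(\lambda)}
```

Because p depends only on canopy structure, measuring ω_c and ω_L at a few wavelengths pins down the canopy state, which is the idea behind the MODIS/MISR LAI and FPAR retrievals mentioned in the abstract.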

Posted Content
TL;DR: In this paper, a general model of competition for multiple tax bases is studied and conditions are established for a restriction on preferential regimes to increase or decrease tax revenues; restrictions are shown to be most likely desirable when tax bases are on average highly responsive to a coordinated increase in tax rates by all governments and when tax bases with large domestic elasticities are also more mobile internationally.
Abstract: Some governments have recently called for international accords restricting the use of preferential taxes targeted to attract mobile tax bases from abroad. Are such agreements likely to discourage tax competition or conversely cause it to spread? We study a general model of competition for multiple tax bases and establish conditions for a restriction on preferential regimes to increase or decrease tax revenues. Our results show that restrictions are most likely to be desirable when tax bases are on average highly responsive to a coordinated increase in tax rates by all governments, and when tax bases with large domestic elasticities are also more mobile internationally. Our analysis allows us to reconcile the apparently contradictory results, derived from analyzing special cases, of the previous literature.

Journal ArticleDOI
TL;DR: Several new techniques for dealing with the Steiner problem in (undirected) networks are presented, including heuristics that achieve sharper upper bounds than the strongest known heuristic for this problem despite running times which are smaller by orders of magnitude.
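For scale: the classic distance-network heuristic gives a 2-approximation and is available off the shelf; the sketch below uses networkx and is a baseline far weaker than the reduction techniques and heuristics developed in the paper.

```python
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

# a small weighted graph; terminals must be connected by the tree,
# non-terminals may be used as Steiner points
G = nx.Graph()
G.add_weighted_edges_from([
    ("a", "b", 1), ("b", "c", 1), ("a", "c", 3),
    ("b", "d", 2), ("c", "e", 2), ("d", "e", 1),
])
terminals = ["a", "d", "e"]

T = steiner_tree(G, terminals, weight="weight")
print(sorted(T.edges(data="weight")))
```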

Journal ArticleDOI
TL;DR: A scheme for identifying scenes is developed that clusters shots according to detected dialogs, settings and similar audio; experimental results show automatic identification of these types of scenes to be reliable.
Abstract: Determining automatically what constitutes a scene in a video is a challenging task, particularly since there is no precise definition of the term “scene”. It is left to the individual to set attributes shared by consecutive shots which group them into scenes. Certain basic attributes such as dialogs, settings and continuing sounds are consistent indicators. We have therefore developed a scheme for identifying scenes which clusters shots according to detected dialogs, settings and similar audio. Results from experiments show automatic identification of these types of scenes to be reliable.

Journal ArticleDOI
TL;DR: A model of endogenous growth in an economy with competitive markets is presented; it has a unique equilibrium, which involves steady growth at a positive rate that is inefficiently low because knowledge spillover effects are neglected.

Proceedings ArticleDOI
13 Jul 2001
TL;DR: In this article, a modification of the Mumford-Shah functional and its cartoon limit is presented, which allows the incorporation of statistical shape knowledge in a single energy functional for image segmentation.
Abstract: We present a modification of the Mumford-Shah functional and its cartoon limit which allows the incorporation of statistical shape knowledge in a single energy functional. We show segmentation results on artificial and real-world images with and without prior shape information. In the case of occlusion and strongly cluttered background the shape prior significantly improves segmentation. Finally we compare our results to those obtained by a level-set implementation of geodesic active contours.
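Schematically, the modification adds a statistical shape energy on the segmenting contour C to the usual Mumford-Shah terms; the weighting and the unspecified form of E_shape below are placeholders for orientation, not the paper's precise functional:

```latex
E(u, C) \;=\; \frac{1}{2}\int_\Omega (f - u)^2\,dx
\;+\; \lambda \int_{\Omega\setminus C} |\nabla u|^2\,dx
\;+\; \nu\,|C|
\;+\; \alpha\, E_{\mathrm{shape}}(C)
```

Minimizing a single functional couples the data-driven terms with the learned shape prior, which is why the prior can carry the segmentation through occlusion and clutter.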

Journal ArticleDOI
TL;DR: This model takes into account that cost drivers can also be replaced by combinations of cost drivers, and yields a more accurate cost allocation with the same ABC-system complexity.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a new explanation for industry shakeout, an event evident in the life-cycle of many industries in which the number of firms drops significantly in a short interval of time.

Book ChapterDOI
13 Dec 2001
TL;DR: In this paper, the authors exploit the various relations between communication complexity of distributed Boolean functions, geometric questions related to half space representations of these functions, and the computational complexity of the functions in various restricted models of computation.
Abstract: Recently, Forster [7] proved a new lower bound on probabilistic communication complexity in terms of the operator norm of the communication matrix. In this paper, we want to exploit the various relations between communication complexity of distributed Boolean functions, geometric questions related to half space representations of these functions, and the computational complexity of these functions in various restricted models of computation. In order to widen the range of applicability of Forster's bound, we start with the derivation of a generalized lower bound. We present a concrete family of distributed Boolean functions where the generalized bound leads to a linear lower bound on the probabilistic communication complexity (and thus to an exponential lower bound on the number of Euclidean dimensions needed for a successful half space representation), whereas the old bound fails. We move on to a geometric characterization of the well known communication complexity class C-PP in terms of half space representations achieving a large margin. Our characterization hints at a close connection between the bounded error model of probabilistic communication complexity and the area of large margin classification. In the final section of the paper, we describe how our techniques can be used to prove exponential lower bounds on the size of depth-2 threshold circuits (still with some technical restrictions). Similar results can be obtained for read-k-times randomized ordered binary decision diagrams and related models.
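For orientation, Forster's bound referenced as [7] states that a sign matrix with small spectral norm requires many dimensions for any half-space (sign) representation, which translates into a communication lower bound:

```latex
d \;\ge\; \frac{\sqrt{nm}}{\|M\|},
\qquad
\mathrm{cc}_{\mathrm{prob}}(M) \;\ge\; \log_2 \frac{\sqrt{nm}}{\|M\|}
```

Here M ∈ {−1,+1}^{n×m} is the communication matrix, ‖M‖ its spectral norm, and d the Euclidean dimension of any arrangement of vectors whose inner-product signs realize M; the paper's generalization widens the family of matrices to which such bounds apply.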