
Showing papers by "University of Paderborn" published in 2002


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the role of the endophyte and its biologically active metabolites in its association with its host and found that a higher proportion of the fungal endophytes, in contrast to the soil isolates, inhibited at least one of the test organisms when screened for antialgal and herbicidal activities.

984 citations


Journal ArticleDOI
08 Aug 2002-Nature
TL;DR: It is demonstrated that coherent optical excitations in the quantum-dot two-level system can be converted into deterministic photocurrents and found that this device can function as an optically triggered single-electron turnstile.
Abstract: Present-day information technology is based mainly on incoherent processes in conventional semiconductor devices. To realize concepts for future quantum information technologies, which are based on coherent phenomena, a new type of 'hardware' is required. Semiconductor quantum dots are promising candidates for the basic device units for quantum information processing. One approach is to exploit optical excitations (excitons) in quantum dots. It has already been demonstrated that coherent manipulation between two excitonic energy levels--via so-called Rabi oscillations--can be achieved in single quantum dots by applying electromagnetic fields. Here we make use of this effect by placing an InGaAs quantum dot in a photodiode, which essentially connects it to an electric circuit. We demonstrate that coherent optical excitations in the quantum-dot two-level system can be converted into deterministic photocurrents. For optical excitation with so-called pi-pulses, which completely invert the two-level system, the current is given by I = fe, where f is the repetition frequency of the experiment and e is the elementary charge. We find that this device can function as an optically triggered single-electron turnstile.
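For scale, the turnstile relation I = fe fixes the current once the pulse repetition frequency is known. A quick numerical check (the 80 MHz repetition rate below is an assumed, typical mode-locked-laser value for illustration, not a figure quoted in the abstract):

```python
# Magnitude check for the single-electron turnstile current I = f * e.
e = 1.602176634e-19   # elementary charge in coulombs
f = 80e6              # pulse repetition frequency in hertz (assumed illustrative value)
I = f * e
print(f"I = {I:.3e} A")  # ~1.3e-11 A, i.e. roughly 13 pA per pi-pulse train
```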

702 citations


Journal ArticleDOI
TL;DR: In this article, the authors present the status of development of the density-functional-based tight-binding (DFTB) method and discuss applications to ground-state and excited-state properties.
Abstract: The present status of development of the density-functional-based tight-binding (DFTB) method is reviewed. As a two-centre approach to density-functional theory (DFT), it combines computational efficiency with reliability and transferability. Utilizing a minimal-basis representation of Kohn–Sham eigenstates and a superposition of optimized neutral-atom potentials and related charge densities for constructing the effective many-atom potential, all integrals are calculated within DFT. Self-consistency is included at the level of Mulliken charges rather than by self-consistently iterating electronic spin densities and effective potentials. Excited-state properties are accessible within the linear response approach to time-dependent (TD) DFT. The coupling of electronic and ionic degrees of freedom further allows us to follow the non-adiabatic structure evolution via coupled electron–ion molecular dynamics in energetic particle collisions and in the presence of ultrashort intense laser pulses. We either briefly outline or give references describing examples of applications to ground-state and excited-state properties. Addressing the scaling problems in size and time generally and for biomolecular systems in particular, we describe the implementation of the parallel ‘divide-and-conquer’ order-N method with DFTB and the coupling of the DFTB approach as a quantum method with molecular mechanics force fields.

514 citations


Journal ArticleDOI
TL;DR: In this article, a more comprehensive perspective on retail services is adopted by examining three important research gaps related to a service-oriented business strategy: first, the authors elaborate on the dimensions of a service oriented business strategy and introduce a new measure of this strategy.
Abstract: Augmenting products with services is a major way retailers have of gaining differentiation in today’s competitive market. Despite its importance, this topic has received relatively little research attention. Unlike previous research, this study adopts a more comprehensive perspective on retail services by examining three important research gaps related to a service-oriented business strategy: First, the authors elaborate on the dimensions of a service-oriented business strategy and introduce a new measure of this strategy. Second, the authors examine the antecedents of a service-oriented business strategy. In practice, there appears to be considerable variability in terms of the extent to which retailers demonstrate a service orientation, but there is a major gap in the understanding of what factors influence this orientation. Third, the authors investigate the neglected link between a service-oriented business strategy and performance outcomes. To examine these three important areas, the authors...

491 citations


Journal ArticleDOI
TL;DR: In this paper, the authors provide a methodological framework for the calculation of ecological footprints related to leisure tourism, based on the example of the Seychelles, which reveals the statistical obstacles that have to be overcome in the calculation process and discusses the strengths and weaknesses of such an approach.

459 citations


Journal ArticleDOI
TL;DR: In this paper, the cyclic deformation of single crystal NiTi containing Ti3Ni4 precipitates of various sizes has been investigated, and it has been shown that the degradation resistance of NiTi is strongly dependent on crystallographic orientation; under compression, orientations approaching the [100] pole of the stereographic triangle possess the highest fatigue resistance.

359 citations


Proceedings ArticleDOI
16 Nov 2002
TL;DR: This work introduces a framework for solving online problems that aim to minimize the congestion in general topology networks and achieves a competitive ratio of O(log^3 n) with respect to the congestion of the network links.
Abstract: A principal task in parallel and distributed systems is to reduce the communication load in the interconnection network, as this is usually the major bottleneck for the performance of distributed applications. We introduce a framework for solving online problems that aim to minimize the congestion (i.e. the maximum load of a network link) in general topology networks. We apply this framework to the problem of online routing of virtual circuits and to a dynamic data management problem. For both scenarios we achieve a competitive ratio of O(log^3 n) with respect to the congestion of the network links. Our online algorithm for the routing problem has the remarkable property that it is oblivious, i.e., the path chosen for a virtual circuit is independent of the current network load. Oblivious routing strategies can easily be implemented in distributed environments and have therefore been intensively studied for certain network topologies such as meshes, tori and hypercubic networks. This is the first oblivious path selection algorithm that achieves a polylogarithmic competitive ratio in general networks.
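The quantity being competed on is the congestion, i.e. the maximum load over all network links. A minimal sketch of how that objective is evaluated for a given set of virtual-circuit paths (the unit-demand assumption and all names below are illustrative, not taken from the paper):

```python
from collections import defaultdict

def congestion(paths, capacity=None):
    """Maximum link load induced by a set of paths.

    paths    -- iterable of node sequences, e.g. [("a", "b", "c"), ("b", "c")]
    capacity -- optional dict mapping an undirected edge (u, v) to its capacity;
                if given, load is measured relative to capacity.
    """
    load = defaultdict(float)
    for path in paths:
        for u, v in zip(path, path[1:]):
            edge = tuple(sorted((u, v)))   # undirected link
            load[edge] += 1.0              # unit demand per circuit (assumption)
    if capacity:
        return max(load[e] / capacity[e] for e in load)
    return max(load.values(), default=0.0)

print(congestion([("a", "b", "c"), ("b", "c"), ("a", "b")]))  # both links carry load 2
```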

278 citations


Book ChapterDOI
07 Oct 2002
TL;DR: For this kind of attributed graph transformation system, a definition of critical pairs is established and a critical pair lemma is proved, stating that local confluence follows from confluence of all critical pairs.
Abstract: The issue of confluence is of major importance for the successful application of attributed graph transformation, such as automated translation of UML models into semantic domains. Whereas termination is undecidable in general and must be established by carefully designing the rules, local confluence can be shown for term rewriting and graph rewriting using the concept of critical pairs. In this paper, we discuss typed attributed graph transformation using a new simplified notion of attribution. For this kind of attributed graph transformation systems we establish a definition of critical pairs and prove a critical pair lemma, stating that local confluence follows from confluence of all critical pairs.

264 citations


Proceedings ArticleDOI
19 May 2002
TL;DR: A method and a corresponding tool are described which assist design recovery and program understanding by recognising instances of design patterns semi-automatically based on a new recognition algorithm which works incrementally rather than trying to analyse a possibly large software system in one pass without any human intervention.
Abstract: A method and a corresponding tool are described which assist design recovery and program understanding by recognising instances of design patterns semi-automatically. The approach taken is specifically designed to overcome the existing scalability problems caused by many design and implementation variants of design pattern instances. Our approach is based on a new recognition algorithm which works incrementally rather than trying to analyse a possibly large software system in one pass without any human intervention. The new algorithm exploits domain and context knowledge given by a reverse engineer and by a special underlying data structure, namely a special form of an annotated abstract syntax graph. A comparative and quantitative evaluation of applying the approach to the Java AWT and JGL libraries is also given.

206 citations


Book ChapterDOI
01 Jan 2002
TL;DR: This chapter focuses on set oriented numerical methods for dynamical systems, which allow extracting statistical information on the dynamical behavior via the computation of natural invariant measures or almost invariant sets.
Abstract: This chapter focuses on set oriented numerical methods for dynamical systems. The set oriented numerical methods can be used to approximate different types of invariant sets or invariant manifolds, but they also allow extracting statistical information on the dynamical behavior via the computation of natural invariant measures or almost invariant sets. In contrast to other numerical techniques, these methods do not rely on the computation of single long-term trajectories but rather use the information obtained from several short-term trajectories. Set oriented methods can also be used for the computation of invariant manifolds; in principle, they can be applied to manifolds of arbitrary hyperbolic invariant sets. An important statistical characterization of the behavior of a dynamical system is given by so-called SRB (Sinai–Ruelle–Bowen) measures. The important property of these invariant measures is that they assign weight to a region in phase space according to the probability with which typical trajectories visit this region.
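A minimal numerical sketch of the set-oriented idea described above, using an Ulam-type box discretization of the transfer operator built from many one-step (short) trajectories; the example map, box count and sample sizes are my own illustrative choices, not the chapter's:

```python
import numpy as np

def invariant_measure(f, a, b, n_boxes=100, samples_per_box=200, seed=0):
    """Approximate a natural invariant measure of the map f on [a, b]
    via a box discretization of the transfer operator (Ulam-type scheme)."""
    rng = np.random.default_rng(seed)
    edges = np.linspace(a, b, n_boxes + 1)
    P = np.zeros((n_boxes, n_boxes))
    for i in range(n_boxes):
        # short trajectories: one application of f to sample points in box i
        x = rng.uniform(edges[i], edges[i + 1], samples_per_box)
        j = np.clip(np.searchsorted(edges, f(x), side="right") - 1, 0, n_boxes - 1)
        np.add.at(P[i], j, 1.0 / samples_per_box)
    # stationary distribution: left eigenvector of P for eigenvalue 1
    vals, vecs = np.linalg.eig(P.T)
    mu = np.abs(np.real(vecs[:, np.argmin(np.abs(vals - 1.0))]))
    return mu / mu.sum()

mu = invariant_measure(lambda x: 4.0 * x * (1.0 - x), 0.0, 1.0)
print(mu[:5])  # box weights near x = 0, where the logistic-map density is large
```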

203 citations


Proceedings ArticleDOI
21 May 2002
TL;DR: This paper addresses the potential benefit of sharing jobs between independent sites in a grid computing environment and discusses the aspect of parallel multi-site job execution on different sites.
Abstract: This paper addresses the potential benefit of sharing jobs between independent sites in a grid computing environment. Also the aspect of parallel multi-site job execution on different sites is discussed. To this end, various scheduling algorithms have been simulated for several machine configurations with different workloads which have been derived from real traces. The results showed that a significant improvement in terms of a smaller average response time is achievable. The usage of multi-site applications can additionally improve the results as long as the increase of the execution time due to communication overhead is limited to about 25%.

Journal ArticleDOI
TL;DR: In this paper, a new criterion for 3D crack growth under multiaxial loading, that is, superposition of the fracture modes I, II and III, is described.
Abstract: In many cases the lifetime of technical structures and components depends on the behaviour of cracks. Due to the complex geometry and loading situation in real-world structures cracks are often subjected to a superposition of normal, in-plane and out-of-plane loading. In this paper a new criterion for 3D crack growth under multiaxial loading, that is, superposition of the fracture modes I, II and III, is described. The criterion allows the prediction of three-dimensional crack surfaces advancing from arbitrary 3D crack fronts with the help of the two deflection angles φ0 and ψ0. The underlying theory for the development of this new criterion is described in detail.

Book ChapterDOI
07 Oct 2002
TL;DR: The use of graph transformation to model object- and component-based systems and to specify the syntax and semantics of diagram languages in software engineering is demonstrated.
Abstract: We give an introduction to graph transformation, not only for researchers in software engineering, but based on applications of graph transformation in this domain. In particular, we demonstrate the use of graph transformation to model object- and component-based systems and to specify syntax and semantics of diagram languages. Along the way we introduce the basic concepts, discuss different approaches, and mention relevant theory and tools.

Journal ArticleDOI
TL;DR: It is shown that Coulomb's friction law provides a very good description of the observed phenomena if the kinematics of the system is taken into account, thereby demonstrating the validity of Coulomb's friction law even under ultrasonic conditions.

Posted Content
TL;DR: In this paper, the authors give a characterization of the Hilbert functions that can occur for K-algebras with the Weak or Strong Lefschetz property and give a sharp bound on the graded Betti numbers.
Abstract: Let A = ⊕_{i ≥ 0} A_i be a standard graded Artinian K-algebra, where char K = 0. Then A has the Weak Lefschetz property if there is an element ℓ of degree 1 such that the multiplication ×ℓ : A_i → A_{i+1} has maximal rank for every i, and A has the Strong Lefschetz property if ×ℓ^d : A_i → A_{i+d} has maximal rank for every i and d. The main results obtained in this paper are the following. 1) EVERY height three complete intersection has the Weak Lefschetz property. (Our method, surprisingly, uses rank two vector bundles on P^2 and the Grauert-Mulich theorem.) 2) We give a complete characterization (including a concrete construction) of the Hilbert functions that can occur for K-algebras with the Weak or Strong Lefschetz property (and the characterization is the same one). 3) We give a sharp bound on the graded Betti numbers (achieved by our construction) of Artinian K-algebras with the Weak or Strong Lefschetz property and fixed Hilbert function. This bound is again the same for both properties. Some Hilbert functions in fact FORCE the algebra to have the maximal Betti numbers. 4) EVERY Artinian ideal in K[x,y] possesses the Strong Lefschetz property. This is false in higher codimension.
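A small worked example of my own (not from the paper) illustrating result 4 in the simplest case: take A = K[x,y]/(x^2, y^2) with char K = 0, Hilbert function (1, 2, 1), and Lefschetz element ℓ = x + y.

```latex
% Worked example (illustrative, not from the paper): A = K[x,y]/(x^2, y^2), \ell = x + y.
\[
\times\ell : A_0 \to A_1,\; 1 \mapsto x+y \quad\text{(injective)}, \qquad
\times\ell : A_1 \to A_2,\; x \mapsto xy,\; y \mapsto xy \quad\text{(surjective)},
\]
\[
\times\ell^2 : A_0 \to A_2,\; 1 \mapsto (x+y)^2 = 2xy \neq 0 \quad (\operatorname{char} K = 0),
\]
% so every multiplication map has maximal rank: A has both the Weak and the Strong
% Lefschetz property, consistent with result 4 for Artinian ideals in K[x,y].
```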

Journal ArticleDOI
TL;DR: In this article, the issue of cross-linguistic influence in second language acquisition is examined from a processing perspective, and it is shown that second language learners can only produce forms they are able to process.
Abstract: In this article, the issue of cross-linguistic influence in second language acquisition is examined from a processing perspective. Applying Processability Theory as the theoretical framework we claim that second language (L2) learners can only produce forms they are able to process. We thus argue that the first language (L1) influence on the L2 is developmentally moderated. Data were collected from German L2 learners with Swedish as their L1. Twenty informants participated in the study, 10 in their first year of German (13 years of age) and 10 in their second year of German (14 years of age). Both languages involved are typologically very close but not mutually intelligible. The results show that Swedish learners of German do not transfer the verb-second structure from their L1 to the L2 even though this structure is identical in both languages. Instead they start out with canonical word order and subsequently produce an intermediate structure (adv NPsubj V X), which is ungrammatical in the L1 and the L2. These observations support the idea of a developmentally moderated transfer. The results clearly contradict the predictions from the 'full transfer/full access' hypothesis (Schwartz and Sprouse, 1994; 1996).

Journal ArticleDOI
TL;DR: In this paper, the proton and hydride transfers in horse liver alcohol dehydrogenase (LADH) were studied with a potential surface obtained by use of the self-consistent-charge density-functional tight-binding (SCC-DFTB) QM/MM method implemented in the CHARMM program.
Abstract: The proton and hydride transfers in horse liver alcohol dehydrogenase (LADH) were studied with a potential surface obtained by use of the self-consistent-charge density-functional tight-binding (SCC-DFTB) QM/MM method implemented in the CHARMM program; a correction for solvent shielding was introduced by use of a continuum model. The proton transfers were found to proceed in a virtually concerted fashion before the hydride transfer. The calculations also showed that a radical mechanism, suggested as a possibility in the literature for the H transfer between the substrate and NAD+, is very unlikely. The energetics of the reaction and pKa's of residues involved in catalysis indicate that the chemical steps of LADH, as characterized by the calculated value of kcat, are slow for a pH below 5.5, and the hydride transfer is hardly affected for pH between 5.5 and 8.1. These results are compared with the experimentally measured pH dependence of kcat for LADH, although a quantitative comparison is difficult becaus...

Journal ArticleDOI
TL;DR: An overview of the characteristic structure of metal chalcogenide nanotubes is given in this article, and, on the basis of atomistic calculations, parameters for a model describing the structure of the nanotubes are derived.
Abstract: An overview of the characteristic structure of metal chalcogenide nanotubes is given in this paper. On the basis of atomistic calculations, parameters for a model are derived, describing the sta...

Proceedings ArticleDOI
29 Oct 2002
TL;DR: The capabilities of the two most successful industrial-strength CASE-tools in reverse engineering the static structure of software systems are examined and compared to the results produced by two academic prototypes.
Abstract: Today, software-engineering research and industry alike recognize the need for practical tools to support reverse-engineering activities. Most of the well-known CASE tools support reverse engineering in some way. The Unified Modeling Language (UML) has emerged as the de facto standard for graphically representing the design of object-oriented software systems. However, there does not yet exist a standard scheme for representing the reverse-engineered models of these systems. The various CASE tools usually adopt proprietary extensions to UML and, as a result, it is difficult, or even impossible, to ensure that model semantics remains unambiguous when working with different tools at the same time. In this paper, we examine the capabilities of the two most successful industrial-strength CASE-tools in reverse engineering the static structure of software systems and compare them to the results produced by two academic prototypes. The comparisons are carried out both manually and automatically using a research prototype for manipulating and comparing UML models.

Journal ArticleDOI
TL;DR: This paper formulates the airline crew assignment problem as a constraint satisfaction problem, thus gaining high expressiveness, and introduces an additional constraint which encapsulates a shortest path algorithm for generating columns with negative reduced costs.
Abstract: Airline crew assignment problems are large-scale optimization problems which can be adequately solved by column generation. The subproblem is typically a so-called constrained shortest path problem and solved by dynamic programming. However, complex airline regulations arising frequently in European airlines cannot be expressed entirely in this framework and limit the use of pure column generation. In this paper, we formulate the subproblem as a constraint satisfaction problem, thus gaining high expressiveness. Each airline regulation is encoded by one or several constraints. An additional constraint which encapsulates a shortest path algorithm for generating columns with negative reduced costs is introduced. This constraint reduces the search space of the subproblem significantly. Resulting domain reductions are propagated to the other constraints which additionally reduces the search space. Numerical results based on data of a large European airline are presented and demonstrate the potential of our approach.
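A minimal sketch of the pricing step that such a shortest-path constraint encapsulates: given the current dual prices from the master problem, search the flight network for a pairing whose reduced cost is negative. The toy network, cost model and function names below are illustrative assumptions, not the paper's formulation:

```python
def min_reduced_cost_pairing(flights, succ, duals, leg_cost):
    """Find the pairing (path over flights in chronological order) with the
    smallest reduced cost: sum of leg costs minus sum of duals of covered flights.

    flights  -- flights in topological (chronological) order
    succ     -- dict: flight -> flights that may legally follow it
    duals    -- dict: flight -> current dual price from the master LP
    leg_cost -- dict: flight -> direct cost of flying that leg
    Returns (reduced_cost, path); a negative value yields an improving column.
    """
    best = {f: (leg_cost[f] - duals[f], [f]) for f in flights}
    for f in flights:                              # dynamic program over the DAG
        cost_f, path_f = best[f]
        for g in succ.get(f, []):
            cand = cost_f + leg_cost[g] - duals[g]
            if cand < best[g][0]:
                best[g] = (cand, path_f + [g])
    return min(best.values(), key=lambda t: t[0])

# Toy instance (purely illustrative):
flights = ["F1", "F2", "F3"]
succ = {"F1": ["F2", "F3"], "F2": ["F3"]}
duals = {"F1": 5.0, "F2": 4.0, "F3": 6.0}
cost = {"F1": 3.0, "F2": 3.0, "F3": 3.0}
print(min_reduced_cost_pairing(flights, succ, duals, cost))  # negative => new column
```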

Journal ArticleDOI
TL;DR: In this paper, the structures and core energies of dislocations in diamond are calculated using both isotropic and anisotropic elasticity theory combined with ab initio-based tight-binding total energy calculations.
Abstract: The structures and core energies of dislocations in diamond are calculated using both isotropic and anisotropic elasticity theory combined with ab initio-based tight-binding total energy calculations. Perfect and dissociated 60° and screw dislocations are considered. Their possible dissociation reactions are investigated through a consideration of the calculated elastic energy factors and core energies. Dissociation into partials is energetically favored. We find that the double-period reconstruction of the 90° glide partial dislocation is more stable than the single-period reconstruction and that the glide set of 60° perfect dislocations is more stable than the shuffle set. Shuffle partials containing interstitials are less likely than those containing vacancies.

Proceedings ArticleDOI
19 May 2002
TL;DR: A formal interpretation of use case models consisting of UML use case, activity, and collaboration diagrams is proposed, which makes precise the notions of conflict and dependency between functional requirements expressed by different use cases.
Abstract: In object-oriented software development, requirements of different stakeholders are often manifested in use case models which complement the static domain model by dynamic and functional requirements. In the course of development, these requirements are analyzed and integrated to produce a consistent overall requirements specification. Iterations of the model may be triggered by conflicts between requirements of different parties. However, due to the diversity, incompleteness, and informal nature, in particular of functional and dynamic requirements, such conflicts are difficult to find. Formal approaches to requirements engineering, often based on logic, attack these problems, but require highly specialized experts to write and reason about such specifications. In this paper, we propose a formal interpretation of use case models consisting of UML use case, activity, and collaboration diagrams. The formalization, which is based on concepts from the theory of graph transformation, makes precise the notions of conflict and dependency between functional requirements expressed by different use cases. Then, use case models can be statically analyzed, and conflicts or dependencies detected by the analysis can be communicated to the modeler by annotating the model. An implementation of the static analysis within a graph transformation tool is presented.

Book ChapterDOI
08 Jul 2002
TL;DR: A hunter strategy for general graphs with an escape length of only O(n log(diam(G))) against restricted as well as unrestricted rabbits is found; this is close to optimal, since Ω(n) is a trivial lower bound on the escape length in both models.
Abstract: We analyze a randomized pursuit-evasion game on graphs. This game is played by two players, a hunter and a rabbit. Let G be any connected, undirected graph with n nodes. The game is played in rounds and in each round both the hunter and the rabbit are located at a node of the graph. Between rounds both the hunter and the rabbit can stay at the current node or move to another node. The hunter is assumed to be restricted to the graph G: in every round, the hunter can move using at most one edge. For the rabbit we investigate two models: in one model the rabbit is restricted to the same graph as the hunter, and in the other model the rabbit is unrestricted, i.e., it can jump to an arbitrary node in every round. We say that the rabbit is caught as soon as hunter and rabbit are located at the same node in a round. The goal of the hunter is to catch the rabbit in as few rounds as possible, whereas the rabbit aims to maximize the number of rounds until it is caught. Given a randomized hunter strategy for G, the escape length for that strategy is the worst case expected number of rounds it takes the hunter to catch the rabbit, where the worst case is with regards to all (possibly randomized) rabbit strategies. Our main result is a hunter strategy for general graphs with an escape length of only O(n log(diam(G))) against restricted as well as unrestricted rabbits. This bound is close to optimal since Ω(n) is a trivial lower bound on the escape length in both models. Furthermore, we prove that our upper bound is optimal up to constant factors against unrestricted rabbits.
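A minimal simulation of the game rules only (the hunter below is a naive random walker, not the paper's O(n log(diam(G))) strategy), useful for estimating escape lengths empirically; the cycle graph, the strategies and all parameters are my own illustrative choices:

```python
import random

def play_game(n=50, rounds_limit=10_000, seed=1):
    """Hunter-vs-rabbit game on a cycle with n nodes.
    Hunter: random walk along edges (naive stand-in for the paper's strategy).
    Rabbit: unrestricted, jumps to a uniformly random node each round.
    Returns the number of rounds until capture (or rounds_limit)."""
    rng = random.Random(seed)
    hunter, rabbit = 0, rng.randrange(n)
    for t in range(1, rounds_limit + 1):
        if hunter == rabbit:                               # caught: same node in a round
            return t
        hunter = (hunter + rng.choice((-1, 0, 1))) % n     # at most one edge per round
        rabbit = rng.randrange(n)                          # unrestricted rabbit
    return rounds_limit

trials = [play_game(seed=s) for s in range(20)]
print(sum(trials) / len(trials))   # crude estimate of the expected capture time
```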

Proceedings ArticleDOI
15 Apr 2002
TL;DR: A new generic cryptoprocessor architecture that can be adapted to various area/performance constraints and finite field sizes is presented, and it is shown how to apply high level synthesis techniques to the controller design.
Abstract: For FPGA based coprocessors for elliptic curve cryptography, a significant performance gain can be achieved when hybrid coordinates are used to represent points on the elliptic curve. We provide a new area/performance tradeoff analysis of different hybrid representations over fields of characteristic two. Moreover, we present a new generic cryptoprocessor architecture that can be adapted to various area/performance constraints and finite field sizes, and show how to apply high level synthesis techniques to the controller design.

Proceedings ArticleDOI
25 May 2002
TL;DR: A formal interpretation of use case models consisting of UML use case, activity, and collaboration diagrams is proposed, which makes precise the notions of conflict and dependency between functional requirements expressed by different use cases.
Abstract: In object-oriented software development, requirements of different stakeholders are often manifested in use case models which complement the static domain model by dynamic and functional requirements. In the course of development, these requirements are analyzed and integrated to produce a consistent overall requirements specification. Iterations of the model may be triggered by conflicts between requirements of different parties. However, due to the diversity, incompleteness, and informal nature, in particular of functional and dynamic requirements, such conflicts are difficult to find. Formal approaches to requirements engineering, often based on logic, attack these problems, but require highly specialized experts to write and reason about such specifications. We propose a formal interpretation of use case models consisting of UML use case, activity, and collaboration diagrams. The formalization, which is based on concepts from the theory of graph transformation, makes precise the notions of conflict and dependency between functional requirements expressed by different use cases. Then, use case models can be statically analyzed, and conflicts or dependencies detected by the analysis can be communicated to the modeler by annotating the model. An implementation of the static analysis within a graph transformation tool is presented.

Journal Article
TL;DR: In this article, a branch-and-bound algorithm for maximum clique problems is proposed, which combines cost-based filtering and vertex coloring bounds for the so-called candidate set (i.e., a set of nodes that can possibly extend the clique in the current choice point).
Abstract: We consider a branch-and-bound algorithm for maximum clique problems. We introduce cost based filtering techniques for the so-called candidate set (i.e. a set of nodes that can possibly extend the clique in the current choice point). Additionally, we present a taxonomy of upper bounds for maximum clique. Analytical results show that our cost based filtering is in a sense as tight as most of these well-known bounds for the maximum clique problem. Experiments demonstrate that the combination of cost based filtering and vertex coloring bounds outperforms the old approach as well as approaches that only apply either of these techniques. Furthermore, the new algorithm is competitive with other recent algorithms for maximum clique.
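For orientation, a compact sketch of the classical scheme the paper improves on: branch-and-bound over the candidate set with a greedy vertex-coloring upper bound. This illustrates only the coloring bound; the paper's cost-based filtering is not reproduced here, and all names are my own:

```python
def max_clique(adj):
    """adj: dict mapping each vertex to the set of its neighbours."""
    best = []

    def coloring_bound(candidates):
        # Greedy colouring: the number of colours used bounds from above the
        # size of any clique contained in `candidates`.
        colors = []                          # list of independent sets
        for v in candidates:
            for cls in colors:
                if not (adj[v] & cls):
                    cls.add(v)
                    break
            else:
                colors.append({v})
        return len(colors)

    def expand(clique, candidates):
        nonlocal best
        if not candidates:
            if len(clique) > len(best):
                best = clique[:]
            return
        if len(clique) + coloring_bound(candidates) <= len(best):
            return                           # prune: bound cannot beat incumbent
        for v in list(candidates):
            candidates.remove(v)             # cliques containing v are enumerated now,
            expand(clique + [v], candidates & adj[v])   # so later branches exclude it

    expand([], set(adj))
    return best

adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(max_clique(adj))   # a maximum clique, e.g. [1, 2, 3]
```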

Book ChapterDOI
TL;DR: This paper discusses the problem of preserving consistency within model-based evolution focusing on UML-RT models and introduces the concept of a model transformation rule that captures an evolution step.
Abstract: With model-based development being on the verge of becoming an industrial standard, the topic of statically checking the consistency of a model made up of several submodels has received increasing attention. The evolution of models within software engineering requires support for incremental consistency analysis techniques of a new version of the model after evolution, thereby avoiding a complete reiteration of all consistency tests. In this paper, we discuss the problem of preserving consistency within model-based evolution, focusing on UML-RT models. We introduce the concept of a model transformation rule that captures an evolution step. Composition of several evolution steps leads to a complex evolution of a model. For each evolution step, we study the effects on the consistency of the overall model and provide localized consistency checks for those parts of the model that have changed. For a complex evolution of a model, consistency can then be established by incrementally performing those localized consistency checks associated with the transformation rules applied within the evolution.

Journal ArticleDOI
TL;DR: The balancing flow that is calculated by schemes for homogeneous networks is minimal with regard to the l2-norm, and it is proved that this holds true for generalized schemes, too.
Abstract: Several different diffusion schemes have previously been developed for load balancing on homogeneous processor networks. We generalize existing schemes in order to deal with heterogeneous networks.
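A minimal sketch of one first-order diffusion iteration on a homogeneous network (heterogeneous generalizations weight the exchanges by processor capacities); the ring topology and the value of alpha are my own illustrative choices:

```python
import numpy as np

def diffusion_step(load, neighbors, alpha=0.25):
    """One first-order diffusion iteration:
    each node i exchanges alpha * (w_i - w_j) with every neighbour j."""
    new = load.copy()
    for i, w_i in enumerate(load):
        for j in neighbors[i]:
            new[i] -= alpha * (w_i - load[j])
    return new

# 6 processors on a ring with an initial imbalance
load = np.array([12.0, 0.0, 0.0, 0.0, 0.0, 0.0])
neighbors = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
for _ in range(50):
    load = diffusion_step(load, neighbors)
print(load)   # total load is conserved and converges toward [2, 2, 2, 2, 2, 2]
```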

Proceedings ArticleDOI
10 Aug 2002
TL;DR: This paper presents two strategies based on hashing that achieve all of the goals above and gives a list of applications demonstrating that they can be used efficiently for distributed data management, web caches, and adaptive random graphs, which may be of interest for peer-to-peer networks.
Abstract: In this paper we study the problem of designing compact, adaptive strategies for the distribution of objects among a heterogeneous set of servers. Ideally, such a strategy should allow the computation of the position of an object with a low time and space complexity, and it should be able to adapt with a near-minimum amount of replacements of objects to changes in the capabilities of the servers so that objects are always distributed among the servers according to their capabilities. Previous techniques are able to handle these requirements only in part. For example, standard hashing techniques can be used to achieve a non-uniform distribution of objects among a set of servers and the time and space efficient computation of the position of the objects, but they usually do not adapt well to a change in the capabilities. We present two strategies based on hashing that achieve all of the goals above. Furthermore, we give a list of applications for these strategies demonstrating that they can be used efficiently for distributed data management, web caches, and adaptive random graphs, which may be of interest for peer-to-peer networks.
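One well-known hashing technique that delivers both properties discussed above, cheap placement lookups and near-minimal object movement under capability changes, is capacity-weighted rendezvous (highest-random-weight) hashing. The sketch below uses that standard technique for illustration only; it is not claimed to be the paper's construction:

```python
import hashlib
import math

def place(obj, servers):
    """servers: dict mapping server id -> positive capacity weight.
    Returns the server that should store `obj`. Removing or down-weighting a
    server only relocates objects that were assigned to it (near-minimal movement)."""
    def score(server):
        h = hashlib.sha256(f"{server}:{obj}".encode()).digest()
        u = (int.from_bytes(h[:8], "big") + 0.5) / 2**64   # pseudo-random in (0, 1)
        return -servers[server] / math.log(u)              # weighted rendezvous score
    return max(servers, key=score)

servers = {"s1": 1.0, "s2": 2.0, "s3": 1.0}                # s2 has twice the capacity
counts = {s: 0 for s in servers}
for k in range(10_000):
    counts[place(f"object-{k}", servers)] += 1
print(counts)   # roughly proportional to the capacities 1 : 2 : 1
```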

Journal ArticleDOI
TL;DR: In this paper, the authors developed a quantum simulation tool to investigate transport in molecular structures based on the joint use of a density functional tight binding (DFTB) method and of a Green's function technique, which allows the calculation of current flow through the investigated structures.
Abstract: We have developed a quantum simulation tool to investigate transport in molecular structures. The method is based on the joint use of a density functional tight binding (DFTB) method and of a Green's function technique which allows us to calculate the current flow through the investigated structures. Typical calculations are shown for carbon-nanotube-based field effect transistors and for DNA fragments. Keywords: transport; molecular structures.
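A minimal sketch of the Green's-function part of such a transport calculation for a toy tight-binding chain in the wide-band limit; the Hamiltonian, couplings and broadening below are illustrative values of mine, not DFTB parameters:

```python
import numpy as np

def transmission(H, gamma_L, gamma_R, energies, eta=1e-6):
    """Landauer transmission T(E) = Tr[Gamma_L G(E) Gamma_R G(E)^dagger]
    with wide-band-limit lead self-energies Sigma = -i*Gamma/2 on the end sites."""
    n = H.shape[0]
    Gamma_L = np.zeros((n, n)); Gamma_L[0, 0] = gamma_L
    Gamma_R = np.zeros((n, n)); Gamma_R[-1, -1] = gamma_R
    Sigma = -0.5j * (Gamma_L + Gamma_R)
    T = []
    for E in energies:
        G = np.linalg.inv((E + 1j * eta) * np.eye(n) - H - Sigma)  # retarded Green's function
        T.append(np.trace(Gamma_L @ G @ Gamma_R @ G.conj().T).real)
    return np.array(T)

# Toy "molecule": 4-site chain, on-site energy 0, hopping -1 (arbitrary units)
H = -1.0 * (np.eye(4, k=1) + np.eye(4, k=-1))
E = np.linspace(-3, 3, 7)
print(transmission(H, gamma_L=0.5, gamma_R=0.5, energies=E))  # peaks near chain eigenvalues
```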