
Showing papers from the University of Mannheim published in 1997


01 Nov 1997
TL;DR: This paper lists the sites that have the 500 most powerful computer systems installed; the best Linpack benchmark performance achieved is used as the performance measure for ranking the computers.
Abstract: To provide a better basis for statistics on high-performance computers, we list the sites that have the 500 most powerful computer systems installed. The best Linpack benchmark performance achieved is used as a performance measure in ranking the computers.
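
The ranking rule itself is simple enough to state in code. A minimal Python sketch, using invented placeholder records rather than real TOP500 entries (the field names are illustrative assumptions, not the actual data format):

# Rank installed systems by their best achieved Linpack performance (Rmax).
# The records below are illustrative placeholders, not real TOP500 entries.
systems = [
    {"site": "Site A", "system": "System X", "rmax_gflops": 368.2},
    {"site": "Site B", "system": "System Y", "rmax_gflops": 220.4},
    {"site": "Site C", "system": "System Z", "rmax_gflops": 103.8},
]

# Sort in descending order of best Linpack performance and keep the top 500.
top500 = sorted(systems, key=lambda s: s["rmax_gflops"], reverse=True)[:500]

for rank, s in enumerate(top500, start=1):
    print(f"{rank:4d}  {s['site']:10s}  {s['system']:10s}  {s['rmax_gflops']:8.1f} GFlop/s")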

785 citations


Journal ArticleDOI
TL;DR: In this article, a count of publications over a period of time indicates that management concepts come and go like fashions; theories of fashion in aesthetic and technical objects are discussed to account for this pattern.
Abstract: A count of publications over a period of time indicates that management concepts come and go like fashions. After a discussion of theories of fashion in aesthetic and technical objects, it is argued…

538 citations


Journal ArticleDOI
TL;DR: The algorithms discussed here for automatically generating video abstracts first decompose the input video into semantic units, called “shot clusters” or “scenes,” then detect and extract semantically rich pieces, especially text from the title sequence and special events, such as dialog, gunfire, and explosions.
Abstract: Although the automatically generated abstracts were quite different from those made directly by humans, the subjects could not tell which were better [8]. For browsing and searching large information archives, many users are familiar with Web interfaces. Therefore, our abstracting tool can compile its results into an HTML page, including the anchors for playing short video clips (see Figure 5). The top of the page in Figure 5 shows the film title, an animated GIF image constructed from the text bitmaps (including the title), and the title sequence as a video clip. This information is followed by a randomly selected subset of special events, which are followed by a temporally ordered list of the scenes constructed by our shot-clustering algorithms. The bottom part of the page lists the creation parameters of the abstract, such as creation time, length, and statistics.

Video abstracting is a young research field. We would like to mention two other systems suitable for creating abstracts of long videos. The first is video skimming [2], which mainly seeks to abstract documentaries and newscasts. It assumes that a transcript of the video is available; the video and the transcript are then aligned by word spotting. The audio track of the video skim is constructed by using language analysis to identify important words in the transcript; audio clips around those words are then cut out. Based on detected faces [10], text, and camera operation, video clips are selected from the surrounding frames. The second system is based on the image track only, generating not a video abstract but a static scene graph of thumbnail images on a 2D “canvas.” The scene graph represents the flow of the story in the form of keyframes, allowing users to interactively descend into the story by selecting a story unit of the graph [11].

Conclusions: Using the algorithms discussed here for automatically generating video abstracts, we first decompose the input video into semantic units, called “shot clusters” or “scenes.” Then we detect and extract semantically rich pieces, especially text from the title sequence and special events, such as dialog, gunfire, and explosions. Video clips, audio clips, images, and text are extracted and composed into an abstract. The output can then be compiled into an HTML page for easy access through browsers. We expect our tools to be used for large multimedia archives, in which video abstracts would be a much more powerful browsing technique than textual abstracts. For example, broadcast stations today sit on a gold mine of archived but difficult-to-access video material. Another application of our technique could be to create an online TV guide on the Web, with short abstracts of upcoming shows, documentaries, and feature films. Just how well the generated abstracts capture the essentials of all kinds of videos remains to be seen in a larger series of practical experiments.
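
To make the pipeline in these conclusions concrete, here is a minimal Python sketch of the flow; every function body is an illustrative placeholder, not the paper's shot-clustering, event-detection, or composition algorithm:

# Sketch of the abstracting pipeline described above. The function bodies
# are placeholders; the real system implements shot clustering, special
# event detection, text extraction, and HTML composition.

def decompose_into_scenes(frames):
    # Placeholder: group frames into semantic units (shot clusters/scenes).
    return [frames]

def detect_special_events(frames):
    # Placeholder: find dialog, gunfire, explosions, title text, etc.
    return []

def compose_abstract(scenes, events):
    # Placeholder: select clips, images, and text and order them.
    return {"scenes": scenes, "events": events}

def render_html(abstract):
    # Compile the abstract into an HTML page with anchors for video clips.
    summary = f"<p>{len(abstract['scenes'])} scenes, {len(abstract['events'])} special events</p>"
    return f"<html><body><h1>Video abstract</h1>{summary}</body></html>"

frames = list(range(1000))            # stand-in for decoded video frames
scenes = decompose_into_scenes(frames)
events = detect_special_events(frames)
print(render_html(compose_abstract(scenes, events)))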

369 citations


Journal ArticleDOI
08 Aug 1997
TL;DR: It turns out that randomized and genetic algorithms are well suited for optimizing join expressions and generate solutions of high quality within a reasonable running time.
Abstract: Recent developments in database technology, such as deductive database systems, have given rise to the demand for new, cost-effective optimization techniques for join expressions. In this paper many different algorithms that compute approximate solutions for optimizing join orders are studied since traditional dynamic programming techniques are not appropriate for complex problems. Two possible solution spaces, the space of left-deep and bushy processing trees, are evaluated from a statistical point of view. The result is that the common limitation to left-deep processing trees is only advisable for certain join graph types. Basically, optimizers from three classes are analysed: heuristic, randomized and genetic algorithms. Each one is extensively scrutinized with respect to its working principle and its fitness for the desired application. It turns out that randomized and genetic algorithms are well suited for optimizing join expressions. They generate solutions of high quality within a reasonable running time. The benefits of heuristic optimizers, namely the short running time, are often outweighed by merely moderate optimization performance.
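
Of the randomized optimizers studied, iterative improvement is the simplest to sketch. The Python toy below uses invented cardinalities and selectivities and a crude left-deep cost model (sum of intermediate result sizes); it illustrates only the working principle, not the paper's actual algorithms or cost functions:

import random

# Toy cost model: base-relation cardinalities and pairwise join
# selectivities (1.0 where no join predicate exists). Values are made up.
card = {"A": 1000, "B": 100, "C": 10000, "D": 50}
sel = {frozenset(p): s for p, s in [
    (("A", "B"), 0.01), (("B", "C"), 0.001), (("C", "D"), 0.05),
]}

def join_sel(joined, r):
    # Combined selectivity between the relations joined so far and r.
    s = 1.0
    for q in joined:
        s *= sel.get(frozenset((q, r)), 1.0)
    return s

def cost(order):
    # Sum of intermediate result sizes of the left-deep tree for 'order'.
    size, total, joined = card[order[0]], 0.0, [order[0]]
    for r in order[1:]:
        size *= card[r] * join_sel(joined, r)
        total += size
        joined.append(r)
    return total

def iterative_improvement(relations, moves=1000, seed=0):
    rng = random.Random(seed)
    best = relations[:]
    rng.shuffle(best)                             # random start solution
    for _ in range(moves):
        cand = best[:]
        i, j = rng.sample(range(len(cand)), 2)
        cand[i], cand[j] = cand[j], cand[i]       # random swap move
        if cost(cand) < cost(best):               # accept only improvements
            best = cand
    return best, cost(best)

print(iterative_improvement(list(card)))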

348 citations


Proceedings ArticleDOI
03 Jun 1997
TL;DR: Two methods for detecting and extracting commercials in digital videos are described and it is shown how both approaches can be combined into a self-learning system.
Abstract: TV commercials are interesting in many respects: advertisers and psychologists are interested in their influence on human purchasing habits, while parents might be interested in shielding their children from their influence. In this paper, two methods for detecting and extracting commercials in digital videos are described. The first method is based on statistics of measurable features and enables the detection of commercial blocks within TV broadcasts. The second method performs detection and recognition of known commercials with high accuracy. Finally, we show how both approaches can be combined into a self-learning system. Our experimental results underline the practicality of the methods.
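
The abstract does not spell out the feature set of the first method, so the sketch below uses two commonly cited cues (a sustained high shot-cut rate and black delimiter frames) purely as illustrative stand-ins for "statistics of measurable features"; all thresholds and test data are invented:

def detect_commercial_blocks(mean_luma, cut_rate, min_len=60):
    # mean_luma[t]: mean luminance in second t; cut_rate[t]: shot cuts/min around t.
    blocks, start = [], None
    for t, (luma, rate) in enumerate(zip(mean_luma, cut_rate)):
        candidate = rate > 20 or luma < 16   # high cut rate or black delimiter frame
        if candidate and start is None:
            start = t
        elif not candidate and start is not None:
            if t - start >= min_len:         # keep only sustained blocks
                blocks.append((start, t))
            start = None
    return blocks

# Synthetic per-second features: normal programme, a black frame, a fast-cut
# commercial block, another black frame, then programme again.
luma = [120] * 100 + [10] + [118] * 200 + [9] + [121] * 100
rate = [5] * 100 + [5] + [30] * 200 + [5] + [6] * 100
print(detect_commercial_blocks(luma, rate))   # -> [(100, 302)]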

328 citations


Journal ArticleDOI
TL;DR: It is shown that almost standard database optimization techniques can be used to answer queries without having to load the entire document into the database; the interaction of full-text indexes with standard database collection indexes, which provides important speed-ups, is also considered.
Abstract: Documents are structured according to schemas that consist in grammars annotated with database programs. To query documents, we introduce an extension of OQL, the ODMG standard query language for object databases. Our extension (named OQL-doc) allows us to query documents without a precise knowledge of their structure, using in particular generalized path expressions and pattern matching. This allows us to introduce, in a declarative language (in the style of SQL or OQL), navigational and information-retrieval styles of accessing data. Query processing in the context of documents and path expressions leads to challenging implementation issues. We extend an object algebra with new operators to deal with generalized path expressions. We then consider two essential complementary optimization techniques. We show that almost standard database optimization techniques can be used to answer queries without having to load the entire document into the database. We also consider the interaction of full-text indexes (e.g., inverted files) with standard database collection indexes (e.g., B-trees) that provide important speed-ups.
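
To make "generalized path expressions" concrete: a query such as //section//title must find matches at any nesting depth, without precise knowledge of the document's structure. A minimal Python sketch over a toy tree encoding (the encoding and helper are illustrative assumptions; OQL-doc itself embeds this in an OQL-style declarative language):

# Evaluate //section//title against a document tree without schema knowledge.
# A node is (label, children, text) -- an invented encoding for illustration.

def descendants(node, label):
    # All nodes with the given label, at any depth at or below 'node'.
    label_, children, _ = node
    hits = [node] if label_ == label else []
    for c in children:
        hits += descendants(c, label)
    return hits

doc = ("article", [
    ("section", [("title", [], "Intro"),
                 ("section", [("title", [], "Background")], "")], ""),
    ("section", [("title", [], "Results")], ""),
], "")

# //section//title : titles anywhere below any section
titles = [t for s in descendants(doc, "section") for t in descendants(s, "title")]
print(sorted({text for (_, _, text) in titles}))   # ['Background', 'Intro', 'Results']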

282 citations


Proceedings ArticleDOI
01 Feb 1997
TL;DR: The theoretic framework and applications of automatic audio content analysis, including analysis of amplitude, frequency and pitch, and simulations of human audio perception, are described.
Abstract: This paper describes the theoretic framework and applications of automatic audio content analysis. Research in multimedia content analysis has so far concentrated on the video domain. We demonstrate the strength of automatic audio content analysis. We explain the algorithms we use, including analysis of amplitude, frequency and pitch, and simulations of human audio perception. These algorithms serve us as tools for further audio content analysis. We use these tools in applications like the segmentation of audio data streams into logical units for further processing, the analysis of music, as well as the recognition of sounds indicative of violence like shots, explosions and cries.
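
The amplitude and frequency analyses named as basic tools can be sketched in a few lines. A minimal Python/NumPy illustration on a synthetic signal (the window size and test signal are arbitrary choices; the paper's pitch analysis and perception simulations go well beyond this):

import numpy as np

def analyze(signal, sr, win=1024):
    # Per-window amplitude (RMS) and dominant frequency (spectral peak).
    feats = []
    for i in range(0, len(signal) - win, win):
        frame = signal[i:i + win]
        rms = np.sqrt(np.mean(frame ** 2))                       # amplitude
        spec = np.abs(np.fft.rfft(frame * np.hanning(win)))
        peak_hz = np.fft.rfftfreq(win, 1 / sr)[np.argmax(spec)]  # frequency
        feats.append((rms, peak_hz))
    return feats

sr = 8000
t = np.arange(sr) / sr
quiet_tone = 0.1 * np.sin(2 * np.pi * 440 * t)   # one quiet 440 Hz second
loud_tone = 0.9 * np.sin(2 * np.pi * 880 * t)    # one loud 880 Hz second
for rms, hz in analyze(np.concatenate([quiet_tone, loud_tone]), sr)[::4]:
    print(f"rms={rms:.2f}  peak={hz:6.1f} Hz")

A loudness jump between windows like these is exactly the kind of cue a segmentation step can use to cut an audio stream into logical units.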

227 citations


Book ChapterDOI
01 Jan 1997
TL;DR: In this paper, an analysis of successive transitions between the stages of the education system shows that inequality has declined, above all through a reduction of social differentials in participation at the transition to secondary schools and in the attainment of the Mittlere Reife (intermediate school-leaving certificate).
Abstract: In contrast to the thesis of constant inequality that is widespread in the literature, this contribution shows that since the interwar period and the first postwar years the differences between population groups in educational participation and in attained qualifications have become markedly smaller. The analysis of successive transitions between the stages of the education system shows that inequality decreased above all through a reduction of social differentials in participation at the transition to secondary schools and in the attainment of the Mittlere Reife. As a consequence, inequalities in the attainment of the Abitur and of higher-education degrees have also declined. The reduction in inequality differs in strength across dimensions of inequality, and it varies across phases of the postwar development. Based on the overall pattern of findings, specific hypotheses to explain the decline in inequality are discussed.

218 citations


Journal ArticleDOI
TL;DR: In this paper, it is shown that the composition of the directly elected European Parliament does not precisely reflect the 'real' balance of political forces in the European Community: European elections behave as national second-order elections, shaped by domestic cleavages, but in a different way than if nine first-order national elections took place simultaneously.
Abstract: The composition of the directly elected European Parliament does not precisely reflect the 'real' balance of political forces in the European Community. As long as the national political systems decide most of what there is to be decided politically, and everything really important, European elections are additional national second-order elections. They are determined more by the domestic political cleavages than by alternatives originating in the EC, but in a different way than if nine first-order national elections took place simultaneously. This is the case because European elections occur at different stages of the national political systems' respective 'electoral cycles'. Such a relationship between a second-order arena and the chief arena of a political system is not at all unusual. What is new here is that one second-order political arena is related to nine different first-order arenas. A first analysis of European election results satisfactorily justifies the assumption that European Parliament direct elections should be treated as nine simultaneous national second-order elections.

189 citations


Journal ArticleDOI
TL;DR: It is concluded that the application of extrapolation is justified, yielding a very efficient differential equation solver with practically no additional numerical cost.
Abstract: We present an extrapolation type algorithm for the numerical solution of fractional order differential equations. It is based on the new result that the sequence of approximate solutions of these equations, computed by means of a recently published algorithm by Diethelm [6], possesses an asymptotic expansion with respect to the stepsize. From this we conclude that the application of extrapolation is justified, and we obtain a very efficient differential equation solver with practically no additional numerical costs. This is also illustrated by a number of numerical examples.
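
The mechanism the paper exploits can be illustrated generically: once approximations A(h) are known to admit an asymptotic expansion A + c1*h**e1 + c2*h**e2 + ..., Richardson-type extrapolation eliminates the leading error terms at essentially no extra cost. In the Python sketch below, the exponent sequence and the synthetic data are assumptions for illustration only; the paper derives the correct expansion for the fractional-order solver of Diethelm [6]:

def richardson(approx, step_ratio, exponents):
    # approx[i] was computed with stepsize h / step_ratio**i.
    T = list(approx)
    for e in exponents:
        r = step_ratio ** e
        # Eliminate the h**e error term between neighbouring stepsizes.
        T = [(r * T[i + 1] - T[i]) / (r - 1) for i in range(len(T) - 1)]
    return T[0]

A, c1, c2 = 1.0, 0.7, -0.3                        # synthetic expansion data
hs = [0.1 / 2 ** i for i in range(4)]
approx = [A + c1 * h ** 0.5 + c2 * h for h in hs]
print(approx[-1])                                 # raw accuracy at smallest h
print(richardson(approx, 2.0, [0.5, 1.0, 1.5]))   # extrapolated: ~1.0 (= A)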

182 citations


Book ChapterDOI
07 Jul 1997
TL;DR: A symbolic model checking procedure for Probabilistic Computation Tree Logic PCTL over labelled Markov chains as models is introduced, based on the algorithm used by Hansson and Jonsson [24], and is efficient because it avoids explicit state space construction.
Abstract: We introduce a symbolic model checking procedure for Probabilistic Computation Tree Logic PCTL over labelled Markov chains as models. Model checking for probabilistic logics typically involves solving linear equation systems in order to ascertain the probability of a given formula holding in a state. Our algorithm is based on the idea of representing the matrices used in the linear equation systems by Multi-Terminal Binary Decision Diagrams (MTBDDs) introduced in Clarke et al [14]. Our procedure, based on the algorithm used by Hansson and Jonsson [24], uses BDDs to represent formulas and MTBDDs to represent Markov chains, and is efficient because it avoids explicit state space construction. A PCTL model checker is being implemented in Verus [9].
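
The linear equation systems mentioned above have a compact numerical core: reachability probabilities in a Markov chain solve a linear system. The Python sketch below uses a dense matrix as a stand-in for the paper's MTBDD representation, with an invented four-state chain; as in a real model checker, states that cannot reach the target are assumed to have been identified beforehand:

import numpy as np

P = np.array([[0.5, 0.2, 0.2, 0.1],
              [0.0, 0.4, 0.5, 0.1],
              [0.0, 0.0, 1.0, 0.0],    # state 2: absorbing target
              [0.0, 0.0, 0.0, 1.0]])   # state 3: absorbing failure
target, zero = {2}, {3}   # 'zero' = precomputed probability-0 states

n = len(P)
A, b = np.eye(n), np.zeros(n)
for s in range(n):
    if s in target:
        b[s] = 1.0                      # x_s = 1 on target states
    elif s not in zero:                 # x_s - sum_t P[s,t] * x_t = 0 elsewhere
        A[s] = -P[s]
        A[s, s] += 1.0

x = np.linalg.solve(A, b)               # x_s = Prob(eventually reach target from s)
print(x)                                # [0.7333..., 0.8333..., 1.0, 0.0]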

Journal ArticleDOI
TL;DR: In this article, the authors study the impact of new technologies (NT) on wages using a unique data set that matches data on individuals and on their firms. They do not give a definitive answer to the following question: if workers who use NT are better paid, is it because they are abler or because NT increases their productivity?

Journal ArticleDOI
TL;DR: This paper analyzed both matching and choice behavior and found that violations of the stationarity axiom are restricted to matching behavior, both for certainty and risk, and also compared the discounting of certain and risky outcomes as well as the discounteding of gains and losses.
Abstract: This study compares time preference in the cases of certainty and risk. We analyze both matching and choice behavior. We find that violations of the stationarity axiom are restricted to matching behavior, both for certainty and risk. We also compare the discounting of certain and risky outcomes as well as the discounting of gains and losses. In matching tasks, certain outcomes are discounted more than risky ones. We could not confirm these results in a choice task. Gains and losses are not found to be discounted at different rates.

Journal ArticleDOI
TL;DR: This paper analyzes the regression of two independent random walks with drifts and shows that the convergence to pseudo-true values also applies to the estimation of (spurious) fixed-effects models.
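
A quick Monte Carlo sketch makes the pseudo-true-value phenomenon visible: regressing one independent drifting random walk on another drives the OLS slope toward the ratio of the drifts, even though the series are unrelated. Sample size, drifts, and replication count below are arbitrary illustration choices:

import numpy as np

rng = np.random.default_rng(0)
T, reps, slopes = 500, 200, []
for _ in range(reps):
    x = np.cumsum(0.1 + rng.standard_normal(T))   # random walk, drift 0.1
    y = np.cumsum(0.2 + rng.standard_normal(T))   # independent walk, drift 0.2
    X = np.column_stack([np.ones(T), x])
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS of y on x
    slopes.append(beta[1])

# The slope estimate concentrates near the drift ratio 0.2/0.1 = 2
# (a pseudo-true value) although x and y are independent.
print(np.mean(slopes), np.std(slopes))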

Proceedings ArticleDOI
01 Feb 1997
TL;DR: The proposed text segmentation and text recognition algorithms are suitable for indexing and retrieval of relevant video scenes from a video database; the experimental results suggest that they can be used in video retrieval applications as well as to recognize higher-level semantics in video.
Abstract: Efficient indexing and retrieval of digital video is an important aspect of video databases. One powerful index for retrieval is the text appearing in videos: it enables content-based browsing. We present our methods for automatic segmentation and recognition of text in digital videos. The algorithms we propose make use of typical characteristics of text in videos in order to enable and enhance segmentation and recognition performance. Especially the inter-frame dependencies of the characters provide new possibilities for their refinement. Then, a straightforward indexing and retrieval scheme is introduced. It is used in the experiments to demonstrate that the proposed text segmentation and text recognition algorithms are suitable for indexing and retrieval of relevant video scenes in and from a video database. Our experimental results are very encouraging and suggest that these algorithms can be used in video retrieval applications as well as to recognize higher semantics in video.
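
One of the inter-frame dependencies mentioned above is easy to sketch: overlaid text stays fixed across consecutive frames while the background moves, so intersecting per-frame candidate masks isolates stable text regions. The threshold and synthetic frames below are illustrative assumptions, not the paper's segmentation method:

import numpy as np

def text_candidates(frame, thresh=200):
    return frame >= thresh          # bright, high-contrast pixels

def stable_text_mask(frames):
    mask = text_candidates(frames[0])
    for f in frames[1:]:
        mask &= text_candidates(f)  # keep only pixels that persist
    return mask

rng = np.random.default_rng(1)
frames = []
for _ in range(5):
    f = rng.integers(0, 255, size=(8, 16))   # moving/noisy background
    f[3, 2:9] = 255                          # static "text" row in every frame
    frames.append(f)
print(stable_text_mask(frames).astype(int))  # 1s survive only on the text row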

Book ChapterDOI
22 Jun 1997
TL;DR: In this paper, weak and branching bisimulation for fully probabilistic transition systems are introduced, and an algorithm is given to decide weak (and branching) bisimulation with a time complexity cubic in the number of states of the system.
Abstract: Bisimulations that abstract from internal computation have proven to be useful for verification of compositionally defined transition systems. In the literature of probabilistic extensions of such transition systems, similar bisimulations are rare. In this paper, we introduce weak and branching bisimulation for fully probabilistic systems, transition systems where nondeterministic branching is replaced by probabilistic branching. In contrast to the nondeterministic case, both relations coincide. We give an algorithm to decide weak (and branching) bisimulation with a time complexity cubic in the number of states of the fully probabilistic system. This meets the worst case complexity for deciding branching bisimulation in the nondeterministic case. In addition, the relation is shown to be a congruence with respect to the operators of PLSCCS, a lazy synchronous probabilistic variant of CCS. We illustrate that due to these properties, weak bisimulation provides all the crucial ingredients for mechanised compositional verification of probabilistic transition systems.
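
Decision procedures of this kind are partition-refinement schemes. As a simpler stand-in, the Python sketch below decides strong probabilistic bisimulation on a fully probabilistic system: blocks are split until all states in a block assign the same probability mass to every block. Weak and branching bisimulation additionally abstract from internal steps, which this sketch deliberately omits:

def refine(states, prob, labels):
    # prob[s][t] = transition probability; labels[s] = atomic label of s.
    blocks = {}
    for s in states:                       # initial partition: group by label
        blocks.setdefault(labels[s], []).append(s)
    partition = list(blocks.values())
    changed = True
    while changed:
        changed = False
        for block in list(partition):
            def sig(s):                    # mass a state sends into each block
                return tuple(round(sum(prob[s].get(t, 0.0) for t in b), 12)
                             for b in partition)
            groups = {}
            for s in block:
                groups.setdefault(sig(s), []).append(s)
            if len(groups) > 1:            # states disagree: split the block
                partition.remove(block)
                partition.extend(groups.values())
                changed = True
                break
    return partition

states = [0, 1, 2, 3]
labels = {0: "a", 1: "a", 2: "b", 3: "c"}
prob = {0: {2: 0.5, 3: 0.5}, 1: {2: 1.0}, 2: {2: 1.0}, 3: {3: 1.0}}
# States 0 and 1 share a label but distribute mass differently -> split.
print(refine(states, prob, labels))   # [[2], [3], [0], [1]]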

Journal ArticleDOI
TL;DR: In this paper, Cramer and Ridder's likelihood ratio test for pooled alternatives in multinomial logit models is sequentially applied in order to determine the adequate aggregation level of the mutually exclusive alternatives in the choice set.

Book ChapterDOI
01 Sep 1997
TL;DR: An overview of the interdisciplinary DFG priority program on spatial cognition is given, and one specific theme, the topic of a recent workshop in Göttingen, is presented in some more detail.
Abstract: The paper gives a brief overview of the interdisciplinary DFG priority program on spatial cognition and presents in some more detail one specific theme, which was the topic of a recent workshop in Göttingen. A taxonomy of landmark, route, and survey knowledge for navigation tasks proposed at the workshop is presented. Different ways of acquiring route knowledge are discussed. The importance of employing different spatial reference systems for carrying out navigation tasks is emphasized. Basic mechanisms of spatial memory in human and animal navigation are presented. After outlining the fundamental representational issues, methodological issues in robot and human navigation are discussed. Three applications of spatial cognition research in navigation tasks are given to exemplify both the technological relevance and the human impact of basic research in cognition.

Journal ArticleDOI
TL;DR: In this article, a generalized form of vectorial equilibria is proposed, and, using an abstract monotonicity condition, an existence result is demonstrated for the existence of a generalized vectorial equilibrium.
Abstract: A generalized form of vectorial equilibria is proposed, and, using an abstract monotonicity condition, an existence result is demonstrated.
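
For orientation, the classical vector equilibrium problem that such results generalize can be stated in LaTeX as follows (the paper's generalized form and its abstract monotonicity condition differ in detail):

\[
\text{find } \bar{x} \in K \ \text{ such that } \ F(\bar{x}, y) \notin -\operatorname{int} C \quad \text{for all } y \in K,
\]

where $K$ is a nonempty convex subset of a topological vector space, $Y$ is a vector space ordered by a closed convex cone $C$ with $\operatorname{int} C \neq \emptyset$, and $F \colon K \times K \to Y$ is a bifunction.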

Journal ArticleDOI
TL;DR: A non-parametric test for the German Bundesbank confirms the hypothesis that monetary expansion accelerates when the government has a political majority in the central bank council at the beginning of the pre-election period or when the political majority changes in favor of the government during the pre-election period.

Journal ArticleDOI
TL;DR: This article reviews approaches that directly model risk judgements, after a short review of naive risk measures used in earlier economic literature.
Abstract: The concept of risk is essential to many problems in economics and business. Usually, risk is treated in the traditional expected utility framework where it is defined only indirectly through the shape of the utility function. The purpose of utility functions, however, is to model preferences. In this paper, we review those approaches which directly model risk judgements. After a short review of naive risk measures used in earlier economic literature, we present recent theoretical and empirical developments.


Journal ArticleDOI
TL;DR: In this article, the authors considered the Galois representation on the Tate module of a Drinfeld module over a finitely generated field in generic characteristic and determined the image of Galois in this representation, up to commensurability.
Abstract: Consider the Galois representation on the Tate module of a Drinfeld module over a finitely generated field in generic characteristic. The main object of this paper is to determine the image of Galois in this representation, up to commensurability. We also determine the Dirichlet density of the set of places of prescribed reduction type, such as places of ordinary reduction.

Journal ArticleDOI
TL;DR: In this article, the authors develop a rational political business cycle model that starts from the foundations laid down by Rogoff and Sibert. They depart from that model by having monetary policy delegated to a central banker who is appointed by the elected leader; the concern about reappointment is what motivates the agency problem.
Abstract: I. INTRODUCTION. Conventional wisdom holds that during election periods political leaders have incentives to misuse policy instruments to enhance their chances of reelection. Rogoff and Sibert [1988] construct a rational political economy model of monetary policy based on this theme.(1) The results of their model can be interpreted in the following way.(2) Society elects a leader every other period whose competence affects the level of output in the economy, and private agents will tend to vote for the incumbent if the level of output is expected to be higher than that which can be provided by an unknown challenger. Voters have imperfect information regarding the politician's true impact on output and exogenous shocks to output, and consequently cannot disentangle the effects of one from the other. In an election period, this asymmetry of information gives the incumbent an incentive to expand output through the use of inflationary monetary policy in order to appear more competent than he really is. However, rational voters are aware of this incentive to inflate and adjust their expectations such that, in equilibrium, no gain in output occurs and the economy suffers from excessive inflation.

An implication of the Rogoff-Sibert model is that, unless a society does away with elections as a way to choose policymakers, excessive inflation and a political inflation cycle will be an unavoidable cost of democracy. However, societies can try, and have tried, to avoid these ill effects by taking the power of money creation away from the elected officials and giving it to an appointed "independent" central banker who will presumably pursue the public good. But can this approach actually work? The problem with this proposal is that the central bankers' well-being in many respects - reappointment, post-central-bank employment opportunities, avoidance of public criticism, etc. - often hinges on pleasing their political benefactors, giving the central bankers incentives to act in the interest of elected officials.(3) As a result, the central banker could pursue objectives different from society's and may actually function as a "veil" for the elected policymaker. In the real world, this veil is not complete but partial and, hence, we can characterize the relationship between the monetary authority and society as a principal-agent problem. The advantage of delegating power to an appointed central banker and then erecting appropriate institutions is that the inflation problems may be resolved without abandoning the electoral process.

In this paper we develop a rational political business cycle model that starts from the foundations laid down by Rogoff and Sibert. We depart from their model by having monetary policy delegated to a central banker who is appointed by the elected leader. The concern about reappointment is what motivates the agency problem - the central banker would like to be reappointed, and this desire leads her to partially accommodate the wishes of the current administration which, in turn, may not coincide with the desires of the electorate. We then analyze the problem faced by a central banker who is concerned about her own self-interest as well as social well-being. It is this feature that turns the setting of monetary policy into a political principal-agent problem.

The contributions of our approach to modelling the agency problem of central banking are (1) it explicitly incorporates reappointment concerns into the decision process of the central banker, which heretofore has not been done in the literature, and (2) it explicitly models the source of the agency problem, unlike previous research that simply assumes an agency problem exists. Our key results are as follows. In equilibrium, the pursuit of her own self-interest leads the central banker to create a socially inefficient inflation bias. While Rogoff and Sibert also obtain this result, our explicit functional-form model allows us to analyze this bias in more detail and tie it directly to the strength of the relationship between the central banker and the current administration. …

Book ChapterDOI
10 Sep 1997
TL;DR: This paper presents architectural considerations and design issues, and describes a prototypical implementation of a Video Conference Recording on Demand (VCRoD) service for the MBone; the resulting system is a client-server architecture for interactive remote recording and playback of MBone sessions.
Abstract: One of the most exciting technologies in today's Internet is the MBone. MBone stands for Multicast Backbone; it provides the infrastructure for efficient multipoint packet delivery in the Internet. The most popular scenario is worldwide audio-video conferencing. However, so far there are no satisfactory solutions available on how to archive multimedia data streams of multicast videoconferences, and how to make them accessible for remote sites (i.e. how to remotely record and remotely play back MBone conferences). In this paper we present architectural considerations and design issues, and describe a prototypical implementation of a Video Conference Recording on Demand (VCRoD) service for the MBone. Our system, called the MBone VCRoD Service (MVoD), is a client-server based architecture for interactive remote recording and playback of MBone sessions. MVoD is implemented based on open standards (e.g. CORBA), making it possible for other applications to interface with it. Since the MVoD client application is implemented in Java, it is possible to access the service from almost any platform.
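
The recording half of such a service can be sketched with plain sockets. This is a hypothetical minimal stand-in, not MVoD's CORBA interface: it joins a multicast group, stores packets with their arrival times, and replays them with the original timing (the group, port, and duration are invented):

import socket, struct, time

GROUP, PORT = "239.1.2.3", 5004   # hypothetical session address

def record(seconds=10.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(1.0)
    start, tape = time.time(), []
    while time.time() - start < seconds:
        try:
            tape.append((time.time() - start, sock.recv(65535)))
        except socket.timeout:
            continue                      # no traffic in this interval
    return tape

def play(tape):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    start = time.time()
    for offset, packet in tape:           # re-send with the original timing
        time.sleep(max(0.0, offset - (time.time() - start)))
        sock.sendto(packet, (GROUP, PORT))

# Requires live traffic on the group to record anything:
# play(record(seconds=5.0))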

Journal ArticleDOI
Uwe Walz
TL;DR: In this paper, a dynamic general equilibrium model with endogenous technological change is proposed, which allows for geographical separation of the innovation and production of newly developed goods, and the implications of factor flow liberalization as well as of industrial policies are investigated.
Abstract: Direct foreign investment is incorporated in a dynamic general equilibrium model with endogenous technological change. In contrast to recent endogenous growth approaches, I allow for geographical separation of the innovation and production of newly developed goods. Firms acquire specific knowledge through R&D investment in the more developed country and use their specific asset to establish a production plant in the low-cost country. Foreign direct investment is accompanied by interregional spillovers of knowledge from the more to the less advanced country. I derive a steady-state equilibrium with active innovation and production activities in the high-technology sector in both countries. Furthermore, the implications of factor flow liberalization as well as of industrial policies are investigated.

Journal ArticleDOI
TL;DR: In this article, 120 German college students listed all aspects of a good intimate relationship; central features of the resulting concept were found to be more salient in memory than peripheral features and, in study 4, were associated with shorter response latencies.
Abstract: Laypersons' conceptions of a good heterosexual relationship were examined in four studies. In study 1, a total of 120 German college students listed all aspects of a good intimate relationship. The resulting 1,010 items represented 352 distinct features, of which the 64 most frequently mentioned were selected for further analyses. In study 2, a total of 107 German college students rated the degree of centrality of the 64 items and reliably distinguished central from peripheral features. These data are consistent with a prototype analysis of the concept "good relationship." In study 3, central features were found to be more salient in memory than were peripheral features, and in study 4, central features were associated with shorter response latencies than were peripheral ones. The specific features identified by laypersons as central to a good relationship were compared to those identified by experts, and the similarities and differences between the lay concepts of love and good relationships were also examined.

Journal ArticleDOI
TL;DR: In this paper, a micro-approximation of tax savings is presented, showing that tax savings strongly increase with income, resulting in an effective marginal tax rate 16 percentage points below the legislated one for the highest income groups.

Journal ArticleDOI
TL;DR: In this paper, the authors establish a complex integral representation for trigonometric B-splines, in certain analogy to the polynomial case, though its proof has to be done in a different and more complicated way.
Abstract: In this paper we investigate some properties of trigonometric B-splines, which form a finitely-supported basis of the space of trigonometric spline functions. We establish a complex integral representation for trigonometric B-splines, which is in certain analogy to the polynomial case, but the proof of which has to be done in a different and more complicated way. Using this integral representation, we can prove some identities concerning the evaluation of a trigonometric B-spline, its derivative and its partial derivative w.r.t. the knots. As a corollary of the last mentioned identity, we obtain a result on the tangent space of a trigonometric spline function. Finally we show that - in the case of equidistant knots - the trigonometric B-splines of odd order form a partition of a constant, and therefore the corresponding B-spline curve possesses the convex-hull property. This is also illustrated by a numerical example.
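
The partition-of-a-constant property is easy to check numerically. The Python sketch below evaluates trigonometric B-splines via the Lyche-Winther recurrence (an assumption here; the paper instead works from the complex integral representation) and prints the sum over the basis at several points, for odd order and equidistant knots:

import math

def trig_bspline(j, k, x, t):
    # Lyche-Winther recurrence for trigonometric B-splines of order k.
    if k == 1:
        return 1.0 if t[j] <= x < t[j + 1] else 0.0
    s = lambda a: math.sin(a / 2.0)
    left = s(x - t[j]) / s(t[j + k - 1] - t[j]) * trig_bspline(j, k - 1, x, t)
    right = s(t[j + k] - x) / s(t[j + k] - t[j + 1]) * trig_bspline(j + 1, k - 1, x, t)
    return left + right

k, h = 3, 0.4                       # odd order, equidistant knot spacing
t = [i * h for i in range(20)]
for x in (3.0, 4.37, 5.5):          # evaluation points well inside the knot range
    total = sum(trig_bspline(j, k, x, t) for j in range(len(t) - k))
    print(f"x={x}: sum = {total:.10f}")   # constant, as the abstract states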

Journal ArticleDOI
TL;DR: In this article, the effects of four factors (bundle type, pure or mixed; price discount; functional complementarity of bundle components; and number of bundle components) on consumers' intentions to purchase product and service bundles are examined.
Abstract: Examines the effects of four factors (the bundle: pure or mixed, the price discount, the functional complementarity of bundle components, and the number of bundle components) on consumers’ intentions to purchase product and service bundles. The findings were relatively consistent across product (automobile) and service (automotive service) contexts, and illustrate that pure bundles are preferred to mixed bundles, and a greater price discount is preferred to a lesser one. The results also indicate that five component bundles generate greater purchase intention than either three or seven component bundles, and that “very related” bundle components result in greater purchase intention than either moderately or not related components. Additionally, several interactions are present.