
Showing papers by "Deutsche Telekom published in 2007"


Journal ArticleDOI
TL;DR: In this paper, the scaling limit approach of statistical physics has been used to determine the achievable bit rate per source-destination pair in a wireless network of n randomly located nodes, where the network operation strategy corresponds to the transition region between order and disorder of an underlying percolation model.
Abstract: An achievable bit rate per source-destination pair in a wireless network of n randomly located nodes is determined by adopting the scaling limit approach of statistical physics. It is shown that randomly scattered nodes can achieve, with high probability, the same 1/√n transmission rate as arbitrarily located nodes. This contrasts with previous results suggesting that a reduced rate of 1/√(n log n) is the price to pay for the randomness due to the location of the nodes. The network operation strategy to achieve the result corresponds to the transition region between order and disorder of an underlying percolation model. If nodes are allowed to transmit over large distances, then paths of connected nodes that cross the entire network area can be easily found, but these generate excessive interference. If nodes transmit over short distances, then such crossing paths do not exist. Percolation theory ensures that crossing paths form in the transition region between these two extreme scenarios. Nodes along these paths are used as a backbone, relaying data for other nodes, and can transport the total amount of information generated by all the sources. A lower bound on the achievable bit rate is then obtained by performing pairwise coding and decoding at each hop along the paths, and using a time division multiple access scheme.
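The percolation effect the paper exploits can be illustrated with a toy Monte Carlo experiment (a sketch for intuition, not the paper's analysis): place n random nodes in the unit square, connect pairs within a transmission radius, and test whether a connected component spans the area from left to right. Crossing paths appear only above a critical radius, which is the transition region between order and disorder.

```python
import random

def crossing_exists(n, radius, seed=0):
    """Place n random nodes in the unit square, connect pairs within
    `radius`, and check whether one connected component touches both
    the left (x < radius) and right (x > 1 - radius) margins -- a
    crude proxy for a crossing path usable as a relay backbone."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n)]
    parent = list(range(n))

    def find(i):
        # union-find with path compression
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    r2 = radius * radius
    for i in range(n):
        for j in range(i + 1, n):
            dx = pts[i][0] - pts[j][0]
            dy = pts[i][1] - pts[j][1]
            if dx * dx + dy * dy <= r2:
                union(i, j)

    left = {find(i) for i, p in enumerate(pts) if p[0] < radius}
    right = {find(i) for i, p in enumerate(pts) if p[0] > 1 - radius}
    return bool(left & right)
```

Sweeping `radius` from small to large shows the sharp onset of crossing components that percolation theory predicts.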

755 citations


Proceedings ArticleDOI
13 Jun 2007
TL;DR: By combining a block-level solution with pre-copying and write throttling, it is shown that an entire running web server can be transferred, including its local persistent state, with minimal disruption.
Abstract: So far virtual machine (VM) migration has focused on transferring the run-time memory state of the VMs in local area networks (LAN). However, for wide-area network (WAN) migration it is crucial to not just transfer the VM's image but also transfer its local persistent state (its file system) and its on-going network connections. In this paper we address both: by combining a block-level solution with pre-copying and write throttling we show that we can transfer an entire running web server, including its local persistent state, with minimal disruption (three seconds in the LAN and 68 seconds in the WAN); by combining dynDNS with tunneling, existing connections can continue transparently while new ones are redirected to the new network location. Thus we show experimentally that by combining well-known techniques in a novel manner we can provide system support for migrating virtual execution environments in the wide area.
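The pre-copy idea described above can be sketched as a simple loop (function and parameter names are illustrative, not the paper's implementation): copy all blocks once, then repeatedly resend blocks dirtied in the meantime, throttling writes whenever the dirty set stays large, so that the final stop-and-copy pause is short.

```python
def precopy_migrate(disk, read_block, send_block, dirty_log, throttle,
                    max_rounds=10, throttle_threshold=64):
    """Toy sketch of iterative pre-copy with write throttling.

    `disk` is the set of all block ids; `dirty_log` is a callable that
    returns the set of blocks written since it was last called;
    `throttle` is invoked to slow the workload's writes."""
    sent = 0
    to_send = set(disk)                  # round 0: full copy
    for _ in range(max_rounds):
        for block in sorted(to_send):
            send_block(block, read_block(block))
            sent += 1
        to_send = dirty_log()            # blocks dirtied meanwhile
        if not to_send:
            break
        if len(to_send) > throttle_threshold:
            # dirty rate outpaces the link: throttle writes so the
            # dirty set shrinks across rounds
            throttle()
    # any remaining blocks are copied during the brief final pause
    return sent, to_send
```

A run over a 100-block disk whose dirty set shrinks per round (10, then 2, then none) converges after three rounds, having sent 112 blocks in total.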

469 citations


Journal ArticleDOI
20 Jul 2007
TL;DR: This work proposes and evaluates the feasibility of a solution where the ISP offers an "oracle" to the P2P users: the oracle ranks possible P2P neighbors according to certain criteria, like their proximity to the user or higher-bandwidth links, allowing the user to improve its performance.
Abstract: Peer-to-peer (P2P) systems, which are realized as overlays on top of the underlying Internet routing architecture, contribute a significant portion of today's Internet traffic. While the P2P users are a good source of revenue for the Internet Service Providers (ISPs), the immense P2P traffic also poses a significant traffic engineering challenge to the ISPs. This is because P2P systems either implement their own routing in the overlay topology or may use a P2P routing underlay [1], both of which are largely independent of the Internet routing, and thus impede the ISP's traffic engineering capabilities. On the other hand, P2P users are primarily interested in finding their desired content quickly, with good performance. But as the P2P system has no access to the underlying network, it either has to measure the path performance itself or build its overlay topology agnostic of the underlay. This situation is disadvantageous for both the ISPs and the P2P users. To overcome this, we propose and evaluate the feasibility of a solution where the ISP offers an "oracle" to the P2P users. When the P2P user supplies the oracle with a list of possible P2P neighbors, the oracle ranks them according to certain criteria, like their proximity to the user or higher bandwidth links. This can be used by the P2P user to choose appropriate neighbors, and therefore improve its performance. The ISP can use this mechanism to better manage the immense P2P traffic, e.g., to keep it inside its network, or to direct it along a desired path. The improved network utilization will also enable the ISP to provide better service to its customers.
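A minimal sketch of such an oracle's ranking step (field names like `same_isp`, `hops`, and `bandwidth_mbps` are assumptions; the paper only names criteria such as proximity and link bandwidth):

```python
def oracle_rank(candidates, metrics):
    """Toy ISP 'oracle': given a peer's candidate neighbor list, rank
    it by ISP-known criteria -- prefer peers inside the ISP's own
    network, then fewer router hops, then higher access bandwidth."""
    def key(peer):
        m = metrics[peer]
        # False sorts before True, so same-ISP peers come first;
        # negate bandwidth so larger values rank earlier
        return (not m["same_isp"], m["hops"], -m["bandwidth_mbps"])
    return sorted(candidates, key=key)
```

The P2P client then simply connects to the top-ranked entries, keeping traffic inside the ISP where possible.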

438 citations


Journal ArticleDOI
Anja Feldmann1
20 Jul 2007
TL;DR: An overview of the challenges that a future Internet has to address is given, approaches for finding possible solutions, including Clean Slate Design, are discussed, and it is shown how such solutions can be evaluated and how they can be retrofitted into the current Internet.
Abstract: Many believe that it is impossible to resolve the challenges facing today's Internet without rethinking the fundamental assumptions and design decisions underlying its current architecture. Therefore, a major research effort has been initiated on the topic of Clean Slate Design of the Internet's architecture. In this paper we first give an overview of the challenges that a future Internet has to address and then discuss approaches for finding possible solutions, including Clean Slate Design. Next, we discuss how such solutions can be evaluated and how they can be retrofitted into the current Internet. Then, we briefly outline the upcoming research activities both in Europe and the U. S. Finally, we end with a perspective on how network and service operators may benefit from such an initiative.

298 citations


Proceedings ArticleDOI
15 Apr 2007
TL;DR: A comparative study of four different approaches to automatic age and gender classification using seven classes on a telephony speech task and also compares the results with human performance on the same data.
Abstract: This paper presents a comparative study of four different approaches to automatic age and gender classification using seven classes on a telephony speech task, and also compares the results with human performance on the same data. The automatic approaches compared are based on (1) a parallel phone recognizer, derived from an automatic language identification system; (2) a system using dynamic Bayesian networks to combine several prosodic features; (3) a system based solely on linear prediction analysis; and (4) Gaussian mixture models based on MFCCs for separate recognition of age and gender. On average, the parallel phone recognizer performs as well as human listeners do, while losing performance on short utterances. The system based on prosodic features, however, shows very little dependence on the length of the utterance.

152 citations


Journal ArticleDOI
TL;DR: This work exploits the fact that the shape of an inextensible triangulated mesh can be parameterized in terms of a small subset of the angles between its facets to produce low-dimensional 3D deformation models.
Abstract: Three-dimensional detection and shape recovery of a nonrigid surface from video sequences require deformation models to effectively take advantage of potentially noisy image data. Here, we introduce an approach to creating such models for deformable 3D surfaces. We exploit the fact that the shape of an inextensible triangulated mesh can be parameterized in terms of a small subset of the angles between its facets. We use this set of angles to create a representative set of potential shapes, which we feed to a simple dimensionality reduction technique to produce low-dimensional 3D deformation models. We show that these models can be used to accurately model a wide range of deforming 3D surfaces from video sequences acquired under realistic conditions.
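The dimensionality-reduction step can be sketched with plain PCA over sampled shapes (the paper says only "a simple dimensionality reduction technique"; the angle-based sampling of the triangulated mesh is not shown here, and the array layout is an assumption):

```python
import numpy as np

def deformation_model(shape_samples, n_modes):
    """Linear deformation model via PCA: `shape_samples` is an (N, 3V)
    array of stacked vertex coordinates, one row per shape generated
    by sampling the facet-angle parameterization.  Returns the mean
    shape and the top `n_modes` deformation modes."""
    mean = shape_samples.mean(axis=0)
    # SVD of the centered samples: rows of Vt are principal directions
    _, _, Vt = np.linalg.svd(shape_samples - mean, full_matrices=False)
    return mean, Vt[:n_modes]

def reconstruct(mean, modes, coeffs):
    """Recover a shape from its low-dimensional coefficients."""
    return mean + coeffs @ modes
```

Fitting a shape to noisy image data then reduces to estimating the few mode coefficients instead of all vertex positions.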

148 citations


Proceedings ArticleDOI
22 Oct 2007
TL;DR: A game theoretical approach is suggested that allows master-slave cognitive radio pairs to update their transmission powers and frequencies simultaneously and a modification to the exact potential game discussed earlier that would allow a Stackelberg leader to charge a virtual price for communicating over a licensed channel is suggested.
Abstract: The ongoing growth in wireless communication continues to increase demand on the frequency spectrum. The current rigid frequency band allocation policy leads to a significant under-utilization of this scarce resource. However, recent policy changes by the Federal Communications Commission (FCC) and research directions suggested by the Defense Advanced Research Projects Agency (DARPA) have been focusing on wireless devices that can adaptively and intelligently adjust their transmission characteristics, which are known as cognitive radios. This paper suggests a game theoretical approach that allows master-slave cognitive radio pairs to update their transmission powers and frequencies simultaneously. This is shown to lead to an exact potential game, for which it is known that a particular update scheme converges to a Nash Equilibrium (NE). Next, a Stackelberg game model is presented for frequency bands where a licensed user has priority over opportunistic cognitive radios. We suggest a modification to the exact potential game discussed earlier that would allow a Stackelberg leader to charge a virtual price for communicating over a licensed channel. We investigate virtual price update algorithms for the leader and prove the convergence of a specific algorithm. Simulations performed in Matlab verify our convergence results and demonstrate the performance gains over alternative algorithms.
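The convergence argument can be illustrated on a deliberately simple exact potential game (the quadratic utility below is a stand-in, not the paper's power/frequency utility): sequential best responses strictly increase the shared potential, so on a finite strategy set they must stop at a Nash equilibrium.

```python
def best_response_dynamics(targets, coupling, levels, max_sweeps=50):
    """Two-player exact potential game with utilities
        u_i(p) = -(p_i - t_i)^2 - coupling * p_1 * p_2
    and shared potential
        P(p)  = -sum_i (p_i - t_i)^2 - coupling * p_1 * p_2,
    so each unilateral improvement raises P and the dynamics
    converge.  `levels` is the finite set of power levels."""
    def utility(i, p):
        return -(p[i] - targets[i]) ** 2 - coupling * p[0] * p[1]

    p = [levels[0], levels[0]]
    for sweep in range(max_sweeps):
        changed = False
        for i in (0, 1):
            best = max(levels,
                       key=lambda x: utility(i, p[:i] + [x] + p[i + 1:]))
            if best != p[i]:
                p[i] = best
                changed = True
        if not changed:
            # no player can improve unilaterally: Nash equilibrium
            return p, sweep
    return p, max_sweeps
```

The Stackelberg extension in the paper would add a leader adjusting a virtual price term inside each follower's utility between such sweeps.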

130 citations


Proceedings ArticleDOI
12 Nov 2007
TL;DR: The study demonstrates the advantage of dynamic peephole and magic lens interaction over joystick interaction in terms of search time and degree of exploration of the search space.
Abstract: A user study was conducted to compare the performance of three methods for map navigation with mobile devices. These methods are joystick navigation, the dynamic peephole method without visual context, and the magic lens paradigm using external visual context. The joystick method is the familiar scrolling and panning of a virtual map keeping the device itself static. In the dynamic peephole method the device is moved and the map is fixed with respect to an external frame of reference, but no visual information is present outside the device's display. The magic lens method augments an external content with graphical overlays, hence providing visual context outside the device display. Here too motion of the device serves to steer navigation. We compare these methods in a study measuring user performance, motion patterns, and subjective preference via questionnaires. The study demonstrates the advantage of dynamic peephole and magic lens interaction over joystick interaction in terms of search time and degree of exploration of the search space.

117 citations


Patent
Robert Moskovitch1, Dima Stopel1, Zvi Boger1, Yuval Shahar1, Yuval Elovici1 
29 Jan 2007
TL;DR: In this paper, the authors proposed a method for detecting malicious behavioral patterns which are related to malicious software such as a computer worm in computerized systems that include data exchange channels with other systems over a data network.
Abstract: A method for detecting malicious behavioral patterns which are related to malicious software, such as a computer worm, in computerized systems that include data exchange channels with other systems over a data network. Accordingly, hardware and/or software parameters that can characterize known behavioral patterns of the computerized system are determined. Known malicious code samples are learned by a machine learning process, such as decision trees and artificial neural networks, and the results of the machine learning process are analyzed with respect to the behavioral patterns of the computerized system. Then known and unknown malicious code samples are identified according to the results of the machine learning process.
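As a toy stand-in for the decision-tree learner named in the patent, a one-feature decision stump over behavioral parameters already shows the train-then-classify flow (feature meanings below are illustrative):

```python
def train_stump(samples, labels):
    """Learn a one-feature threshold classifier over behavioral
    parameters (e.g. CPU load, packets/sec).  Returns a predict
    function mapping a sample tuple to 0 (benign) or 1 (malicious)."""
    best = None
    for f in range(len(samples[0])):
        for t in sorted({s[f] for s in samples}):
            for polarity in (1, -1):
                pred = [1 if polarity * (s[f] - t) > 0 else 0
                        for s in samples]
                acc = sum(p == y for p, y in zip(pred, labels)) / len(labels)
                if best is None or acc > best[0]:
                    best = (acc, f, t, polarity)
    _, f, t, polarity = best
    return lambda s: 1 if polarity * (s[f] - t) > 0 else 0
```

A real deployment would replace the stump with a full decision tree or neural network trained on many behavioral features, as the patent describes.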

114 citations


Book ChapterDOI
22 Jul 2007
TL;DR: A general definition of the concept 'intuitive use of user interfaces' is presented on the basis of the current interdisciplinary work and the relationship between aesthetics and intuitive use is addressed.
Abstract: In this paper we present a general definition of the concept 'intuitive use of user interfaces' on the basis of our current interdisciplinary work. 'Intuitive use' is regarded as a characteristic of human-machine systems. It refers to a special kind of interaction process between users and technical systems that use the users' intuition. The main part of the paper deals with central aspects of this definition in detail and discusses pre-conditions and restrictions of the use of the concept. The main aspects that we discuss are the design of technical systems, application and non-conscious use of previous knowledge, intuition as a non-conscious process, interaction, and effectiveness. We complement this discussion by addressing the relationship between aesthetics and intuitive use.

86 citations


Journal ArticleDOI
TL;DR: This paper shows how to design and implement a novel efficient space-frequency quantization (SFQ) compression algorithm using directionlets and shows that the new compression method outperforms the standard SFQ in a rate-distortion sense, both in terms of mean-square error and visual quality, especially in the low-rate compression regime.
Abstract: The standard separable 2-D wavelet transform (WT) has recently achieved a great success in image processing because it provides a sparse representation of smooth images. However, it fails to efficiently capture 1-D discontinuities, like edges or contours. These features, being elongated and characterized by geometrical regularity along different directions, intersect and generate many large magnitude wavelet coefficients. Since contours are very important elements in the visual perception of images, to provide a good visual quality of compressed images, it is fundamental to preserve good reconstruction of these directional features. In our previous work, we proposed a construction of critically sampled perfect reconstruction transforms with directional vanishing moments imposed in the corresponding basis functions along different directions, called directionlets. In this paper, we show how to design and implement a novel efficient space-frequency quantization (SFQ) compression algorithm using directionlets. Our new compression method outperforms the standard SFQ in a rate-distortion sense, both in terms of mean-square error and visual quality, especially in the low-rate compression regime. We also show that our compression method does not increase the order of computational complexity as compared to the standard SFQ algorithm.

Proceedings ArticleDOI
01 May 2007
TL;DR: This work investigates the reliability of SMS by analyzing traces collected from a nationwide cellular network over a period of three weeks, and shows that its reliability is not as good as expected.
Abstract: SMS has been arguably the most popular wireless data service for cellular networks. Due to its ubiquitous availability and universal support by mobile handsets and cellular carriers, it is also being considered for emergency notification and other mission-critical applications. Despite its increased popularity, the reliability of the SMS service in real-world operational networks has received little study so far. In this work, we investigate the reliability of SMS by analyzing traces collected from a nationwide cellular network over a period of three weeks. Although the SMS service incorporates a number of reliability mechanisms such as delivery acknowledgement and multiple retries, our study shows that its reliability is not as good as we expected. For example, the message delivery failure ratio is as high as 5.1% during normal operating conditions. We also analyze the performance of the service under stressful conditions, in particular during a "flash-crowd" event that occurred on New Year's Eve 2005. Two important factors that adversely affect the reliability of SMS are also examined: bulk message delivery that may induce network-wide congestion, and the topological structure of the social network formed by SMS users, which may facilitate quick propagation of viruses or other malware.

Proceedings ArticleDOI
05 Nov 2007
TL;DR: CDCF is implemented and integrated with the OpenVPN software, and its performance is evaluated using extensive experiments; the results show that CDCF can protect the foreign network from encrypted tunnel traffic with minimal overhead.
Abstract: Security and privacy are two major concerns in supporting roaming users across administrative domains. In current practices, a roaming user often uses encrypted tunnels, e.g., Virtual Private Networks (VPNs), to protect the secrecy and privacy of her communications. However, due to its encrypted nature, the traffic flowing through these tunnels cannot be examined and regulated by the foreign network's firewall, which may lead the foreign network widely open to various attacks from the Internet. This threat can be alleviated if the users reveal their traffic to the foreign network or the foreign network reveals its firewall rules to the tunnel endpoints. However, neither approach is desirable in practice due to privacy concerns. In this paper, we propose a Cross-Domain Cooperative Firewall (CDCF) that allows two collaborative networks to enforce each other's firewall rules in an oblivious manner. In CDCF, when a roaming user establishes an encrypted tunnel between his home network and the foreign network, the tunnel endpoint (e.g., a VPN server) can regulate the traffic and enforce the foreign network's firewall rules, without knowing these rules. The key ingredients in CDCF are the distribution of firewall primitives across network domains, and the enabling technique of efficient oblivious membership verification. We have implemented CDCF and integrated it with the OpenVPN software, and evaluated its performance using extensive experiments. Our results show that CDCF can protect the foreign network from encrypted tunnel traffic with minimal overhead.
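A much-simplified illustration of enforcing rules without revealing them is salted-hash membership testing (CDCF itself uses a stronger oblivious-membership protocol; this sketch leaks more than the real scheme and is only meant to show the shape of the idea):

```python
import hashlib

def hashed_rule_set(rules, salt):
    """The foreign network publishes only salted hashes of its
    (discretized) firewall rules, e.g. 'tcp:25' for blocking SMTP."""
    return {hashlib.sha256((salt + r).encode()).hexdigest()
            for r in rules}

def flow_blocked(flow_key, hashed_rules, salt):
    """The tunnel endpoint hashes each flow key the same way and
    tests membership -- it learns whether a flow matches a rule,
    but never sees the rule list in the clear."""
    digest = hashlib.sha256((salt + flow_key).encode()).hexdigest()
    return digest in hashed_rules
```

The endpoint can thus drop matching tunneled flows on the foreign network's behalf while both sides keep their secrets.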

Journal ArticleDOI
TL;DR: With this method, a time delay of more than 120 ns for pulses with a temporal width of 30 ns is achieved; to the best of the authors' knowledge, this is the highest time delay in just one fiber spool.
Abstract: We compare two simple mechanisms for the enhancement of the time delay in slow light systems. Both are based on the superposition of the Brillouin gain with additional loss. As we show in theory and experiment, if two losses are placed at the wings of an SBS gain then, contrary to other methods, the loss power increases the time delay. This leads to higher delay times at lower optical powers and to an increase of the zero-gain delay of more than 50%. With this method we achieved a time delay of more than 120 ns for pulses with a temporal width of 30 ns. To the best of our knowledge, this is the highest time delay in just one fiber spool. Besides the enhancement of the time delay, the method could have the potential to decrease the pulse distortions for high bit rate signals.

Patent
24 Oct 2007
TL;DR: In this paper, a method and system for providing and reconstructing a photorealistic environment, by integrating a virtual item into it, comprising: (a) a dedicated marker, placed in a predefined location within an environment, for enabling determining the desired location of said virtual item within said environment; (b) a conventional camera for taking a picture or shooting a video clip of said environment, in which said marker was placed, and then providing a corresponding images of said environments; and (c) one or more servers for receiving said corresponding image from said camera, processing it,
Abstract: The present invention relates to a method and system for providing and reconstructing a photorealistic environment, by integrating a virtual item into it, comprising: (a) a dedicated marker, placed in a predefined location within an environment, in which a virtual item has to be integrated, for enabling determining the desired location of said virtual item within said environment; (b) a conventional camera for taking a picture or shooting a video clip of said environment, in which said marker was placed, and then providing a corresponding image of said environment; and (c) one or more servers for receiving said corresponding image of said environment from said camera, processing it, and outputting a photorealistic image that contains said virtual item integrated within it, comprising: (c.1.) a composer for composing a photorealistic image from said corresponding image of said environment; (c.2.) an image processing unit for processing said corresponding image and for determining the location of said marker within said environment; (c.3.) a configuration database for storing configurations and other data; and (c.4.) an image rendering unit for reconstructing the photorealistic image by integrating said virtual item into said predefined location of the photographed environment, wherein said marker is located.

Patent
20 Sep 2007
TL;DR: In this paper, a hybrid recommender system, in which the initial stereotype is manually defined by an expert and an affinity vector of stereotypes relating to each specific user who registers onto the system, is created to define a specific profile for each user.
Abstract: A hybrid recommender system, in which the initial stereotypes are manually defined by an expert and an affinity vector of stereotypes is created for each specific user who registers onto the system, defining a specific profile for each user. Recommendations for a specific user are generated according to the initial stereotypes and the user's affinity vector of stereotypes. Binary feedback from the user regarding specific items picked by him (e.g., while viewing the item) is received, which can be either positive or negative. The affinity vector of stereotypes is then updated.
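One plausible update rule for the affinity vector after binary feedback (purely illustrative; the patent does not specify the update):

```python
def update_affinity(affinity, stereotype_scores, feedback, lr=0.25):
    """Nudge the user's affinity toward stereotypes that would have
    predicted the feedback (feedback = +1 positive, -1 negative),
    then renormalize so the vector stays a valid profile.

    `affinity` maps stereotype -> current weight; `stereotype_scores`
    maps stereotype -> how strongly that stereotype endorses the
    rated item (positive = would recommend)."""
    new = {s: a + lr * feedback * stereotype_scores[s]
           for s, a in affinity.items()}
    total = sum(abs(v) for v in new.values()) or 1.0
    return {s: v / total for s, v in new.items()}
```

Recommendations would then score each item by the affinity-weighted sum of the stereotypes' endorsements.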

Journal ArticleDOI
TL;DR: In the presented approach the delay or acceleration of optical signals is decoupled from their amplification or attenuation, which allows the adaptation of the pulse amplitudes to the given application.
Abstract: We show a simple method of time delay enhancement in slow-light systems based on the effect of stimulated Brillouin scattering. The method is based on the reduction of the absolute Brillouin gain by a loss produced by an additional pump laser. With this method we achieved pulse delays of nearly 100 ns in a standard single-mode fiber. In the presented approach the delay or acceleration of optical signals is decoupled from their amplification or attenuation, which allows the adaptation of the pulse amplitudes to the given application.

Patent
24 Jan 2007
TL;DR: In this article, a conceptual and computational architecture that enables monitoring accumulated time-oriented data using knowledge related to the operation of elements of a computer network and deriving temporal abstractions from the accumulated data and the knowledge in order to identify electronic threat patterns and create alerts is presented.
Abstract: The invention is a comprehensive conceptual and computational architecture that enables monitoring accumulated time-oriented data using knowledge related to the operation of elements of a computer network and deriving temporal abstractions from the accumulated data and the knowledge in order to identify electronic threat patterns and create alerts. The architecture of the invention supports two main modes of operation: a. an automated, continuous mode for monitoring, recognition and detection of known eThreats; and b. an interactive, human-operated intelligent tool for dynamic exploration of the contents of a security storage service to identify new temporal patterns that characterize such threats, and to add them to the monitoring database. The architecture of the invention can analyze data collected from various sources, such as end-user devices, network elements, network links, etc., to identify potentially infected devices, files, sub-streams or network segments.

Patent
10 Dec 2007
TL;DR: In this article, a method for delivery of content data to a plurality of hosts is provided for a peer-to-peer network, where each host is configured to operate as at least one of a content uploading host and a content downloading host.
Abstract: A method is provided for delivery of content data to a plurality of hosts. Each host is configured to operate as at least one of a content uploading host and a content downloading host. The plurality of hosts form a peer-to-peer network.

Proceedings ArticleDOI
18 Jun 2007
TL;DR: It is demonstrated that the MDP based flow assignment policy leads to significant enhancement in the QoS provisioning (lower packet delays and packet loss rates) for the flows, as compared to policies which do not perform dynamic flow assignment but statically allocate flows to different networks using heuristics like average available bit rate on the networks.
Abstract: We consider a scenario where devices with multiple networking capabilities access networks with heterogeneous characteristics. In such a setting, we address the problem of efficient utilization of multiple access networks (wireless and/or wireline) by devices via optimal assignment of traffic flows with given utilities to different networks. We develop and analyze a device middleware functionality that monitors network characteristics and employs a Markov Decision Process (MDP) based control scheme that in conjunction with stochastic characterization of the available bit rate and delay of the networks generates an optimal policy for allocation of flows to different networks. The optimal policy maximizes, under available bit rate and delay constraints on the access networks, a discounted reward which is a function of the flow utilities. The flow assignment policy is periodically updated and is consulted by the flows to dynamically perform network selection during their lifetimes. We perform measurement tests to collect traces of available bit rate and delay characteristics on Ethernet and WLAN networks on a work day in a corporate work environment. We implement our flow assignment framework in ns-2 and simulate the system performance for a set of elastic video-like flows using the collected traces. We demonstrate that the MDP based flow assignment policy leads to significant enhancement in the QoS provisioning (lower packet delays and packet loss rates) for the flows, as compared to policies which do not perform dynamic flow assignment but statically allocate flows to different networks using heuristics like average available bit rate on the networks.
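The control scheme's core is standard discounted-reward value iteration; below is a generic sketch, with abstract states standing in for the paper's measured bit-rate/delay characterization of the networks (state and action names in the usage are hypothetical):

```python
def value_iteration(states, actions, transition, reward,
                    gamma=0.9, tol=1e-6):
    """Compute the optimal discounted value function.  `transition(s, a)`
    returns a dict {next_state: probability}; `reward(s, a, s2)` is the
    immediate reward."""
    V = {s: 0.0 for s in states}
    while True:
        delta = 0.0
        for s in states:
            q = [sum(p * (reward(s, a, s2) + gamma * V[s2])
                     for s2, p in transition(s, a).items())
                 for a in actions]
            best = max(q)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

def greedy_policy(states, actions, transition, reward, V, gamma=0.9):
    """Extract the flow-assignment policy that the flows consult."""
    return {s: max(actions, key=lambda a: sum(
                p * (reward(s, a, s2) + gamma * V[s2])
                for s2, p in transition(s, a).items()))
            for s in states}
```

In the paper's setting the policy is recomputed periodically as the middleware refreshes its stochastic characterization of the access networks.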

Journal ArticleDOI
TL;DR: This work outlines a cross-layer optimization framework based on the congestion control dynamics of a bulk-transfer TCP flow and demonstrates its application to networks which offer link-layer adaptive measures and shows that TCP dynamics aware link adaptation measures lead to substantial enhancement of TCP throughput in EGPRS and IEEE 802.11a networks.
Abstract: Almost a decade of research on the performance of TCP in wireless networks has resulted in many proposals and solutions to the problem of TCP throughput degradation. Several of these measures, however, have their share of drawbacks. With the continuing emergence of wireless technologies ever since the work on TCP performance over wireless began, smart link-layer mechanisms like adaptive modulation and coding, power control, and incremental redundancy have been designed and deployed. In this work, we outline a cross-layer optimization framework based on the congestion control dynamics of a bulk-transfer TCP flow and demonstrate its application to networks which offer link-layer adaptive measures. We begin by observing that TCP's congestion window dynamics are comprised of certain recurring patterns, which we term cycles. We then overlay a TCP throughput optimization methodology that selects link-layer transmission modes (e.g. modulation scheme, coding rate, transmission power, or a combination thereof) in accordance with TCP dynamics and wireless channel conditions. We provide insights into the working of the optimization procedure which protects TCP segments against losses on the wireless channel when the TCP congestion window size (in bytes) is below the bandwidth-delay product of the network. The protection against wireless channel losses is rendered by the link-layer by employing robust modulation and coding schemes, high transmission power, etc. We show that TCP dynamics aware link adaptation measures lead to substantial enhancement of TCP throughput in EGPRS and IEEE 802.11a networks.
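The window-versus-BDP rule described above can be sketched as follows (the two-mode simplification and mode names are illustrative, not the paper's mode set):

```python
def pick_mode(cwnd_bytes, bottleneck_bps, rtt_s, modes):
    """While the congestion window is below the bandwidth-delay
    product, a wireless loss is expensive (TCP is still probing for
    bandwidth), so choose the most robust transmission mode; once
    cwnd exceeds the BDP, losses cost less throughput and a faster,
    less robust mode pays off.

    `modes` is a list of (name, wireless_loss_prob) ordered from
    most robust to fastest."""
    bdp_bytes = bottleneck_bps / 8 * rtt_s   # bytes needed to fill the pipe
    return modes[0][0] if cwnd_bytes < bdp_bytes else modes[-1][0]
```

For a 2 Mbit/s bottleneck with a 200 ms RTT, the BDP is 50,000 bytes, so a 20 kB window selects the robust mode and an 80 kB window selects the fast one.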

Proceedings ArticleDOI
01 Oct 2007
TL;DR: An inter-BSS direct link setup (iDLS) protocol is proposed to facilitate direct link set up across BSS with the aim of increasing the overall throughput of a set of BSS.
Abstract: Existing 802.11 direct link setup (DLS) protocol specifies procedures for direct communication between a pair of stations within the same basic service set (BSS). This eliminates the unnecessary triangular traffic route through an access point and increases the overall effective throughput within the BSS. In this paper, an inter-BSS direct link setup (iDLS) protocol is proposed to facilitate direct link set up across BSS with the aim of increasing the overall throughput of a set of BSS. A preliminary version of the proposed iDLS protocol is implemented on commercial Atheros chipset based WLAN cards using the MADWiFi open source Linux driver. Our indoor measurement showed that, with pre-calibration on the neighbor sensing range parameter, the average throughput performance of using iDLS can be 24 times that of the conventional infrastructure mode without iDLS.

Proceedings ArticleDOI
04 Jun 2007
TL;DR: This paper presents a method to achieve classification through an anytime, interactive questionnaire, created automatically upon the generation of new stereotypes, within the MediaScout system.
Abstract: The MediaScout system is envisioned to function as a personalized media (audio, video, print) service within mobile phones, online media portals, sling boxes, etc. MediaScout uses a novel stereotype-based recommendation engine. Upon the registration of new users the system must decide how to classify the new users into existing stereotypes. In this paper we present a method to achieve this classification through an anytime, interactive questionnaire, created automatically upon the generation of new stereotypes. A comparative study performed on the IMDB database illustrates the advantages of the new system.

Patent
16 Mar 2007
TL;DR: In this paper, the authors proposed a method for traffic steering between UTRAN and GERAN, where the mobile radio network can signal to certain mobiles an offset which is applied on the normal cell reselection parameters in order to selectively change the reselection behavior of specific mobile terminals.
Abstract: The User-Equipment-controlled cell reselection algorithms in a UMTS or GERAN system currently operate independently of any subscriber-specific or service considerations. All user equipments (UEs) are handled in the same way. In the case that any service or subscriber differentiation is required, e.g. for traffic steering between UTRAN and GERAN, this is typically performed once the UE has entered the UMTS CELL_DCH RRC state, in which the network controls the mobility of each terminal individually. The proposed invention allows the mobile radio network to signal to certain mobiles an offset which is applied to the normal cell reselection parameters in order to selectively change the cell reselection behaviour of specific mobile terminals. This allows, for example, steering some mobiles, e.g. depending on the used service, subscription, terminal capabilities, etc., from one radio access technology to another based on changed cell reselection parameters.

Proceedings ArticleDOI
17 Sep 2007
TL;DR: The objective is to provide an online tool, which enables individuals within a potentially large organization to search for experts in a certain area, which may not be represented in company organization or reporting lines.
Abstract: This paper proposes a system to facilitate the exchange of information by automatically finding experts competent in answering a given question. Our objective is to provide an online tool which enables individuals within a potentially large organization to search for experts in a certain area, which may not be represented in the company organization or reporting lines. The advantage of the proposed system over standard forums or groupware systems is that full-formatted questions can be compared to stored qualification profiles, which were automatically derived from documents without human search effort, and possibly refined manually. This allows us to find competent colleagues (or helpful literature such as How-Tos) for a given problem in a single step, and without intermediate iterations. The system is symmetric in that it does not distinguish between "questioners" (asking questions) and "experts" (answering them), therefore forming a "community" of users who are distributed over an ontology covering the total knowledge. This ontology can either be given (e.g. in the form of an organizational chart), or it can be derived from the experts' knowledge.
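A minimal sketch of the profile-matching step — comparing a full-formatted question to stored qualification profiles — could use plain bag-of-words cosine similarity. The profiles, names, and scoring choice below are invented; the paper does not specify this particular measure:

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two term-frequency Counters."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def find_experts(question, profiles, top_k=1):
    """Rank qualification profiles (bags of words derived from each
    colleague's documents) against the full-formatted question."""
    q = Counter(question.lower().split())
    scored = [(cosine(q, Counter(doc.lower().split())), name)
              for name, doc in profiles.items()]
    return [name for score, name in sorted(scored, reverse=True)[:top_k]]

profiles = {  # invented one-line stand-ins for mined qualification profiles
    "alice": "kernel scheduler latency realtime linux tuning",
    "bob":   "sql query optimizer index postgres tuning",
}
assert find_experts("how do I tune postgres query latency", profiles) == ["bob"]
```

A production system would of course use stemming, term weighting, and the ontology mentioned in the abstract, but the single-step nature of the lookup is the same.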

Journal ArticleDOI
01 Nov 2007
TL;DR: The design and implementation of a framework for building context-aware applications on demand, as dynamically composed sequences of calls to services, are presented; the framework employs goal-oriented inferencing for assembling composite services, dynamically monitors their execution, and adapts applications to deal with contextual changes.
Abstract: Legacy application design models, which are still widely used for developing context-aware applications, incur important limitations. Firstly, embedding contextual dependencies in the form of if–then rules specifying how applications should react to context changes cannot practically accommodate the large variety of possibly even unanticipated context types and their values. Additionally, application development is complicated and challenging, as programmers have to manually determine and encode the associations of all possible combinations of context parameters with application behaviour. In this paper we propose a framework for building context-aware applications on demand, as dynamically composed sequences of calls to services. We present the design and implementation of our system, which employs goal-oriented inferencing for assembling composite services, dynamically monitors their execution, and adapts applications to deal with contextual changes. We describe the failure recovery mechanisms we have implemented, allowing the deployment of the system in a non-perfect environment and avoiding the delays inherent in re-discovering a suitable service instance. By means of experimental evaluation in a realistic infotainment application, we demonstrate the potential of the proposed solution as an effective, efficient, and scalable approach.
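The goal-oriented inferencing step can be illustrated with a tiny backward-chaining planner over declared service inputs and outputs. The service names and signatures below are hypothetical, and the real system additionally monitors execution and recovers from failures:

```python
def compose(goal, available, services):
    """Goal-oriented backward chaining over service descriptions: each
    service maps a set of required inputs to one output. Returns a call
    sequence producing `goal` from the `available` context, or None."""
    if goal in available:
        return []  # already satisfied by the current context
    for name, (inputs, output) in services.items():
        if output != goal:
            continue
        plan, ok = [], True
        for inp in inputs:  # recursively satisfy every required input
            sub = compose(inp, available, services)
            if sub is None:
                ok = False
                break
            plan += sub
        if ok:
            return plan + [name]
    return None  # no service chain can produce the goal

SERVICES = {  # hypothetical infotainment services: inputs -> output
    "gps":         (set(),                             "position"),
    "poi_lookup":  ({"position"},                      "nearby_pois"),
    "recommender": ({"nearby_pois", "user_profile"},   "suggestion"),
}
assert compose("suggestion", {"user_profile"}, SERVICES) == \
       ["gps", "poi_lookup", "recommender"]
```

Because the application is recomputed from the goal rather than from hand-written if–then rules, a new context type only requires describing the services that consume or produce it.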

Proceedings ArticleDOI
08 Jul 2007
TL;DR: In this article, the authors give an overview of the fundamentals and limits of the slow- and fast-light effect in general and as based on stimulated Brillouin scattering (SBS) in optical fibers.
Abstract: Slow and fast light is the control of the velocity of light in a medium by light. As a fascinating new field in physics it attracts fundamental interest on the one hand, while on the other hand there are many practical applications in telecommunication and information systems. Among these are optical signal processing, radio-frequency photonics, nonlinear optics and time-domain spectroscopy. Furthermore, the slow- and fast-light effect can be seen as a key technology for optical delay lines, buffers, equalizers and synchronizers in packet-switched networks. Several methods and material systems can realize the effect. Among them, the nonlinear effect of stimulated Brillouin scattering (SBS) is of special interest because it has several advantages. This article gives an overview of the fundamentals and limits of the slow- and fast-light effect in general and as based on SBS in optical fibers. Some experimental results achieved so far are shown.
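The magnitude of the SBS-induced delay is often estimated with a simple small-signal relation (a textbook sketch under the usual assumptions, not a result taken from this paper): at the centre of a Lorentzian Brillouin gain line,

```latex
\Delta T \;\approx\; \frac{G}{\Gamma_B},
\qquad G = g_0 \, I_p \, L_{\mathrm{eff}},
```

where $G$ is the exponential gain parameter, $g_0$ the line-centre Brillouin gain coefficient, $I_p$ the pump intensity, $L_{\mathrm{eff}}$ the effective fiber length, and $\Gamma_B$ the Brillouin linewidth in angular frequency. The narrow Brillouin linewidth of standard fiber (tens of MHz) is what makes appreciable delays achievable at moderate pump powers, while gain saturation and pulse broadening set the practical limits the article discusses.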

Journal ArticleDOI
TL;DR: The history of Windows system loggers is described, covering what has changed over time and why, and a procedure for recovering information from log fragments is proposed.

Proceedings ArticleDOI
26 Dec 2007
TL;DR: This paper investigates a physics-based plane beam model, frequently used in mechanical and civil engineering, to track large non-linear deformations in images, and applies this method to track deformations of a pole-vault pole, rat whiskers and a car antenna.
Abstract: In this paper we investigate a physics-based plane beam model, frequently used in mechanical and civil engineering, to track large non-linear deformations in images. Such models not only contribute to robust and precise tracking in the presence of clutter and partial occlusions, but also allow computing the forces that produce the observed deformations. We verify the correctness of the recovered forces by using them in a simulation and comparing the results to the original image displacements. We apply this method to track deformations of a pole-vault pole, rat whiskers and a car antenna.
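A one-degree-of-freedom flavour of the force-recovery idea, using the textbook Euler-Bernoulli cantilever relation rather than the paper's full plane-beam model (all numeric values are invented):

```python
def tip_force_from_deflection(deflection_m, length_m, E_pa, I_m4):
    """Invert the textbook Euler-Bernoulli cantilever relation
    delta = F * L**3 / (3 * E * I) to recover the tip load F from an
    observed tip deflection. This is a 1-DOF stand-in for the paper's
    plane-beam model; the parameter values used below are invented."""
    return 3.0 * E_pa * I_m4 * deflection_m / length_m ** 3

# Round-trip check: a known force must be recovered from the deflection
# it produces, mirroring the paper's simulate-and-compare verification.
E, I, L, F = 70e9, 8.0e-9, 1.5, 12.0   # aluminium-ish rod, invented numbers
delta = F * L ** 3 / (3.0 * E * I)     # forward model: force -> deflection
assert abs(tip_force_from_deflection(delta, L, E, I) - F) < 1e-9
```

The round trip (apply the recovered force in the forward model and compare displacements) is the same consistency check the abstract describes, reduced to a single scalar deflection.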

Book ChapterDOI
01 Jan 2007
TL;DR: A service description approach based on OWL-S (Web Ontology Language for Services) and focused on non-functional criteria is described; it starts with the necessary service management tasks and explains the non-functional data elements and statements needed for their automated support.
Abstract: In order for service-oriented architectures (SOAs) to deliver their true value for the business, e.g. flexibility and transparency, a holistic service management needs to be set up in the enterprise. To perform all the service management tasks efficiently, extensive support from automated processes and tools is necessary. This article describes a service description approach that is based on OWL-S (Web Ontology Language for Services) and focuses on non-functional criteria. It starts with the necessary service management tasks and explains the non-functional data elements and statements for their automated support. After covering related work, it explains the proposed flexible extension to OWL-S. This extension is twofold. Firstly, simple service lifecycle elements are added using the normal extension mechanism. Secondly, for adding QoS (Quality of Service) capabilities, the approach combines this extension mechanism with the UML (Unified Modeling Language) Profile for QoS. A prototype delivers the proof of concept.