
Showing papers by "Bauhaus University, Weimar" published in 2012


Journal ArticleDOI
TL;DR: In this paper, an Isogeometric Boundary Element Method (BEM) is applied to two-dimensional elastostatic problems using Non-Uniform Rational B-Splines (NURBS).

351 citations


Journal ArticleDOI
TL;DR: In this paper, a phantom-node method is developed for three-node shell elements to describe cracks; it can treat arbitrary cracks independently of the mesh and may cut elements completely or partially.

256 citations


Journal ArticleDOI
TL;DR: In this article, a strain smoothing procedure for the extended finite element method (XFEM) is presented, which is tailored to linear elastic fracture mechanics and, in this context, shown to outperform the standard XFEM.

210 citations


Journal ArticleDOI
TL;DR: In this paper, the size-dependent linear free flexural vibration behavior of functionally graded (FG) nanoplates was investigated using an isogeometric finite element method based on non-uniform rational B-splines.

197 citations


Journal ArticleDOI
TL;DR: In this article, the authors present findings from thermal comfort surveys and measurements of indoor environmental variables in naturally ventilated classrooms in Hampshire, England, and compare the results with the two common approaches used in existing comfort standards, the heat balance and the adaptive comfort model.

196 citations


Journal ArticleDOI
TL;DR: In this paper, the calibration of the numerical model of a bowstring-arch railway bridge based on modal parameters was performed using a genetic algorithm that allowed the optimal values of fifteen parameters to be obtained.

166 citations


Journal ArticleDOI
TL;DR: In this article, the role of various components of the crack band approach is elucidated by simple examples dealing with a one-dimensional tensile test and a two-dimensional model of a notched beam subjected to bending.

134 citations


Book ChapterDOI
19 Mar 2012
TL;DR: One of the family members, McOEx, a design based solely on a standard block cipher, is presented; it provably guarantees reasonable security against general adversaries as well as standard security against nonce-respecting adversaries.
Abstract: On-Line Authenticated Encryption (OAE) combines privacy with data integrity and is on-line computable. Most block cipher-based schemes for Authenticated Encryption can be run on-line and are provably secure against nonce-respecting adversaries. But they fail badly for more general adversaries. This is not a theoretical observation only --- in practice, the reuse of nonces is a frequent issue. In recent years, cryptographers developed misuse-resistant schemes for Authenticated Encryption. These guarantee excellent security even against general adversaries which are allowed to reuse nonces. Their disadvantage is that encryption can be performed in an off-line way, only. This paper considers OAE schemes dealing both with nonce-respecting and with general adversaries. It introduces McOE, an efficient design for OAE schemes. For this we present in detail one of the family members, McOEx, which is a design solely based on a standard block cipher. As all the other members of the McOE family, it provably guarantees reasonable security against general adversaries as well as standard security against nonce-respecting adversaries.

115 citations
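
As a hedged illustration of why nonce reuse is fatal for conventional on-line AE schemes, and hence why misuse-resistant designs such as McOE matter, the following sketch reuses a nonce with AES-GCM via the Python cryptography package. The key, nonce, and messages are made up for demonstration; McOE itself is not implemented here.

```python
# Illustration (not McOE): reusing a nonce with a conventional on-line AE
# scheme such as AES-GCM leaks the XOR of the two plaintexts, because the
# underlying CTR keystream repeats. Misuse-resistant schemes like McOE are
# designed to degrade gracefully in exactly this situation.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)
aead = AESGCM(key)
nonce = b"\x00" * 12              # deliberately reused below (never do this)

p1 = b"attack at dawn!!"
p2 = b"retreat at noon!"

c1 = aead.encrypt(nonce, p1, None)   # ciphertext || 16-byte tag
c2 = aead.encrypt(nonce, p2, None)

# Strip the tags and XOR the ciphertexts: the keystream cancels out.
xor_ct = bytes(a ^ b for a, b in zip(c1[:-16], c2[:-16]))
xor_pt = bytes(a ^ b for a, b in zip(p1, p2))
assert xor_ct == xor_pt              # the adversary learns p1 XOR p2
print("nonce reuse leaks p1 XOR p2:", xor_ct.hex())
```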


Journal ArticleDOI
TL;DR: In this paper, a node-based smoothed FEM in combination with a primal-dual algorithm is used to compute the plastic collapse limit and shakedown loads of structures.
Abstract: This paper presents a novel numerical procedure for computing limit and shakedown loads of structures using a node-based smoothed FEM in combination with a primal–dual algorithm. An associated primal–dual form based on the von Mises yield criterion is adopted. The primal–dual algorithm together with a Newton-like iteration are then used to solve this associated primal–dual form to determine simultaneously both approximate upper and quasi-lower bounds of the plastic collapse limit and the shakedown limit. The present formulation uses only linear approximations and its implementation into finite element programs is quite simple. Several numerical examples are given to show the reliability, accuracy, and generality of the present formulation compared with other available methods. Copyright © 2011 John Wiley & Sons, Ltd.

93 citations
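
The formulation above is built on the von Mises yield criterion. As a small reference sketch (not the paper's primal-dual solver), the function below evaluates the von Mises equivalent stress for a general 3D stress state and checks it against a yield stress; the numbers are illustrative.

```python
import numpy as np

def von_mises_stress(s11, s22, s33, s12, s23, s31):
    """Equivalent (von Mises) stress for a general 3D stress state."""
    return np.sqrt(0.5 * ((s11 - s22) ** 2 + (s22 - s33) ** 2 + (s33 - s11) ** 2)
                   + 3.0 * (s12 ** 2 + s23 ** 2 + s31 ** 2))

def is_elastic(stress, yield_stress):
    """Von Mises yield condition f = sigma_vm - sigma_y <= 0."""
    return von_mises_stress(*stress) <= yield_stress

# Hypothetical stress state in MPa (illustrative numbers only).
print(is_elastic((120.0, 40.0, 0.0, 30.0, 0.0, 0.0), yield_stress=235.0))
```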


Proceedings ArticleDOI
12 Aug 2012
TL;DR: In this article, a machine learning approach is used to predict Wikipedia's most important quality flaws; the prediction is based on user-defined cleanup tags, which are commonly used in many Web applications to tag content that has some shortcomings.
Abstract: The detection and improvement of low-quality information is a key concern in Web applications that are based on user-generated content; a popular example is the online encyclopedia Wikipedia. Existing research on quality assessment of user-generated content deals with the classification as to whether the content is high-quality or low-quality. This paper goes one step further: it targets the prediction of quality flaws, this way providing specific indications in which respects low-quality content needs improvement. The prediction is based on user-defined cleanup tags, which are commonly used in many Web applications to tag content that has some shortcomings. We apply this approach to the English Wikipedia, which is the largest and most popular user-generated knowledge source on the Web. We present an automatic mining approach to identify the existing cleanup tags, which provides us with a training corpus of labeled Wikipedia articles. We argue that common binary or multiclass classification approaches are ineffective for the prediction of quality flaws and hence cast quality flaw prediction as a one-class classification problem. We develop a quality flaw model and employ a dedicated machine learning approach to predict Wikipedia's most important quality flaws. Since in the Wikipedia setting the acquisition of significant test data is intricate, we analyze the effects of a biased sample selection. In this regard we illustrate the classifier effectiveness as a function of the flaw distribution in order to cope with the unknown (real-world) flaw-specific class imbalances. The flaw prediction performance is evaluated with 10,000 Wikipedia articles that have been tagged with the ten most frequent quality flaws: provided test data with little noise, four flaws can be detected with a precision close to 1.

89 citations
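
To make the one-class framing concrete, here is a minimal sketch in the spirit of the paper but not its actual model: train only on articles carrying the cleanup tag for one specific flaw, then flag unseen articles as flawed or not. The article texts, features, and SVM parameters are placeholders.

```python
# Minimal sketch of quality-flaw prediction as one-class classification:
# fit a one-class model on flaw-tagged articles only, then score new articles.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import OneClassSVM

tagged_articles = [
    "This article does not cite any sources ...",
    "The references in this article are unclear ...",
]  # articles carrying the cleanup tag for one specific flaw (placeholders)
unseen_articles = ["A well sourced article ...", "Another unreferenced stub ..."]

vectorizer = TfidfVectorizer(min_df=1)
X_train = vectorizer.fit_transform(tagged_articles)
X_test = vectorizer.transform(unseen_articles)

clf = OneClassSVM(kernel="linear", nu=0.5).fit(X_train)
print(clf.predict(X_test))   # +1 = looks like the flaw class, -1 = outlier
```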


Proceedings Article
22 Jul 2012
TL;DR: This paper considers real-world production plants where the learned model must capture timing behavior, dependencies between system variables, as well as mode switches--in short: hybrid systems' characteristics, and presents a taxonomy of learning problems related to model formation tasks.
Abstract: A tailored model of a system is the prerequisite for various analysis tasks, such as anomaly detection, fault identification, or quality assurance. This paper deals with the algorithmic learning of a system's behavior model given a sample of observations. In particular, we consider real-world production plants where the learned model must capture timing behavior, dependencies between system variables, as well as mode switches--in short: hybrid systems' characteristics. Usually, such model formation tasks are solved by human engineers, entailing the well-known bunch of problems including knowledge acquisition, development cost, or lack of experience. Our contributions to the outlined field are as follows. (1) We present a taxonomy of learning problems related to model formation tasks. As a result, an important open learning problem for the domain of production systems is identified: the learning of hybrid timed automata. (2) For this class of models, the learning algorithm HyBUTLA is presented. This algorithm is the first of its kind to solve the underlying model formation problem at scalable precision. (3) We present two case studies that illustrate the usability of this approach in realistic settings. (4) We give a proof for the learning and runtime properties of HyBUTLA.
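
HyBUTLA itself is not reproduced here; as a hedged sketch of the usual first step of such state-merging learners, the snippet below builds a prefix tree acceptor from timed event sequences and records transition delays, from which timing guards could later be derived during merging. The plant observations are hypothetical.

```python
# Minimal sketch (not HyBUTLA): build a prefix tree acceptor from timed event
# sequences, the common starting point of state-merging automaton learners.
from collections import defaultdict

class PTANode:
    def __init__(self):
        self.children = {}                 # event -> PTANode
        self.delays = defaultdict(list)    # event -> observed time deltas

def build_pta(sequences):
    """sequences: iterable of [(event, timestamp), ...] observations."""
    root = PTANode()
    for seq in sequences:
        node, last_t = root, 0.0
        for event, t in seq:
            node.delays[event].append(t - last_t)
            node = node.children.setdefault(event, PTANode())
            last_t = t
    return root

# Hypothetical plant observations: valve opens, tank fills, valve closes.
obs = [[("open", 0.0), ("full", 4.8), ("close", 5.0)],
       [("open", 0.0), ("full", 5.1), ("close", 5.4)]]
pta = build_pta(obs)
print(sorted(pta.delays["open"]))
```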

Proceedings ArticleDOI
12 Aug 2012
TL;DR: To foster experiments as a service in IR, this work presents a Web framework for experiments that addresses the outlined challenges and possesses a unique set of compelling features in comparison to existing solutions.
Abstract: With its close ties to the Web, the IR community is destined to leverage the dissemination and collaboration capabilities that the Web provides today. Especially with the advent of the software as a service principle, an IR community is conceivable that publishes experiments executable by anyone over the Web. A review of recent SIGIR papers shows that we are far away from this vision of collaboration. The benefits of publishing IR experiments as a service are striking for the community as a whole, and include potential to boost research profiles and reputation. However, the additional work must be kept to a minimum and sensitive data must be kept private for this paradigm to become an accepted practice. To foster experiments as a service in IR, we present a Web framework for experiments that addresses the outlined challenges and possesses a unique set of compelling features in comparison to existing solutions. We also describe how our reference implementation is already used officially as an evaluation platform for an established international plagiarism detection competition.

Proceedings ArticleDOI
12 Aug 2012
TL;DR: The ChatNoir search engine is scalable and returns the first results within three seconds, significantly faster than Indri; a convenient API allows for implementing reproducible experiments based on retrieving documents from the ClueWeb09 corpus.
Abstract: We present the ChatNoir search engine which indexes the entire English part of the ClueWeb09 corpus. Besides Carnegie Mellon's Indri system, ChatNoir is the second publicly available search engine for this corpus. It implements the classic BM25F information retrieval model including PageRank and spam likelihood. The search engine is scalable and returns the first results within three seconds, which is significantly faster than Indri. A convenient API allows for implementing reproducible experiments based on retrieving documents from the ClueWeb09 corpus. The search engine has successfully accomplished a load test involving 100,000 queries.
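
ChatNoir's actual implementation is not shown here; as a hedged sketch of the BM25F model it builds on, the function below scores a query against a fielded document using per-field boosts and length normalization. The field weights, parameters, and collection statistics are illustrative, not ChatNoir's.

```python
# Sketch of BM25F-style scoring (field weights + per-field length normalization).
import math

K1 = 1.2
FIELD_WEIGHTS = {"title": 2.5, "body": 1.0}   # illustrative boosts
FIELD_B = {"title": 0.6, "body": 0.75}        # per-field length normalization

def bm25f_score(query_terms, doc, avg_field_len, df, num_docs):
    score = 0.0
    for term in query_terms:
        pseudo_tf = 0.0
        for field, text in doc.items():
            tokens = text.lower().split()
            tf = tokens.count(term)
            if tf == 0:
                continue
            norm = 1.0 + FIELD_B[field] * (len(tokens) / avg_field_len[field] - 1.0)
            pseudo_tf += FIELD_WEIGHTS[field] * tf / norm
        if pseudo_tf > 0.0:
            idf = math.log((num_docs - df[term] + 0.5) / (df[term] + 0.5) + 1.0)
            score += idf * pseudo_tf / (K1 + pseudo_tf)
    return score

doc = {"title": "ClueWeb09 search engine",
       "body": "a scalable search engine for the clueweb09 corpus"}
print(bm25f_score(["search", "engine"], doc,
                  avg_field_len={"title": 5, "body": 300},
                  df={"search": 120_000, "engine": 80_000}, num_docs=500_000_000))
```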

Journal ArticleDOI
TL;DR: The authors develop a conceptual framework of the effects of group recommenders and empirically examine these effects through two choice experiments, finding that automated group recommenders offer more valuable information than single recommenders when the choice agent must consume the recommended alternative.
Abstract: Because hedonic products consist predominantly of experience attributes, often with many available alternatives, choosing the “right” one is a demanding task for consumers. Decision making becomes even more difficult when a group, instead of an individual consumer, will consume the product, as is regularly the case for hedonic offerings such as movies, opera performances, and wine. Noting the prevalence of automated recommender systems as decision aids, the authors investigate the power of group recommender systems that consider the preferences of all group members. The authors develop a conceptual framework of the effects of group recommenders and empirically examine these effects through two choice experiments. They find that automated group recommenders offer more valuable information than single recommenders when the choice agent must consume the recommended alternative. However, when agents choose freely among alternatives, the group's social relationship quality determines whether group rec...
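
The paper studies the behavioral effects of group recommendations rather than a specific algorithm; purely for context, the sketch below shows two common ways a group recommender can aggregate individual predicted ratings, with made-up numbers.

```python
# Two common aggregation strategies for group recommendation (illustrative only).
import numpy as np

# rows = group members, columns = candidate movies (hypothetical ratings 1-5)
predicted = np.array([[4.5, 2.0, 3.5],
                      [3.0, 4.5, 3.5],
                      [4.0, 1.5, 4.0]])

average_strategy = predicted.mean(axis=0)   # maximize mean satisfaction
least_misery = predicted.min(axis=0)        # avoid leaving anyone very unhappy

print("average      ->", int(average_strategy.argmax()))
print("least misery ->", int(least_misery.argmax()))
```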

Journal ArticleDOI
TL;DR: The authors discuss simple and operational methodologies of quantifying payments of guarantees given to PPP toll road projects to protect project sponsors from skyrocketing costs of acquiring land, delays in scheduled toll adjustment, and compensation payments in case of nationalization.
Abstract: By the end of 2010, the Government of Indonesia (GoI) issued a new regulation on government guarantee provision to protect project sponsors from government-related project risks in public–private partnered (PPP) infrastructure development. Whereas the provision of guarantees can help improve the creditworthiness of PPP projects, it also may expose the GoI to considerable fiscal risk as a result of contingent liabilities the GoI incurs when providing guarantees. This requires a systematic contingent liability analysis to understand the full extent of their exposures. The present paper discusses simple and operational methodologies of quantifying payments of guarantees given to PPP toll road projects to protect project sponsors from skyrocketing costs of acquiring land, delays in scheduled toll adjustment, and compensation payments in case of nationalization. The paper also includes extensive modeling of key project risks, i.e., land cost escalation, initial traffic volume, inflation rates, toll adj...

Journal ArticleDOI
TL;DR: A group of problems that have attracted a great deal of attention from the EFG method community includes the treatment of large deformations and dealing with strong discontinuities such as cracks. One efficient solution to model cracks is adding special enrichment functions to the standard shape functions.
Abstract: Meshfree methods (MMs) such as the element free Galerkin (EFG) method have gained popularity because of some advantages over other numerical methods such as the finite element method (FEM). A group of problems that have attracted a great deal of attention from the EFG method community includes the treatment of large deformations and dealing with strong discontinuities such as cracks. One efficient solution to model cracks is adding special enrichment functions to the standard shape functions such as extended FEM, within the FEM context, and the cracking particles method, based on the EFG method. It is well known that explicit time integration in dynamic applications is conditionally stable. Furthermore, in enriched methods, the critical time step may tend to very small values leading to computationally expensive simulations. In this work, we study the stability of enriched MMs and propose two mass-lumping strategies. Then we show that the critical time step for enriched MMs based on lumped mass matrices is of the same order as the critical time step of MMs without enrichment. Moreover, we show that, in contrast to extended FEM, even with a consistent mass matrix, the critical time step does not vanish even when the crack directly crosses a node. Copyright © 2011 John Wiley & Sons, Ltd.
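
To illustrate the mass lumping and critical time step ideas the abstract refers to, here is a minimal sketch for a toy two-node 1D bar element (not the paper's enriched meshfree formulation): row-sum lumping of the consistent mass matrix and the explicit-dynamics estimate dt_crit = 2/omega_max from the generalized eigenproblem. Material data are generic steel values.

```python
# Row-sum mass lumping and the critical time step dt_crit = 2 / omega_max,
# with omega_max from K phi = omega^2 M phi. Toy 1D bar element only.
import numpy as np
from scipy.linalg import eigh

E, A, rho, L = 210e9, 1e-4, 7850.0, 0.1           # steel bar element (SI units)
K = (E * A / L) * np.array([[1.0, -1.0], [-1.0, 1.0]])
M_consistent = (rho * A * L / 6.0) * np.array([[2.0, 1.0], [1.0, 2.0]])
M_lumped = np.diag(M_consistent.sum(axis=1))       # row-sum lumping

for name, M in [("consistent", M_consistent), ("lumped", M_lumped)]:
    omega2 = eigh(K, M, eigvals_only=True)
    dt_crit = 2.0 / np.sqrt(omega2.max())
    print(f"{name:10s} dt_crit = {dt_crit:.3e} s")
```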

Journal ArticleDOI
TL;DR: In this paper, an extended finite element formulation for dynamic fracture of piezoelectric materials is presented in the context of linear elastic fracture mechanics and applied to mode-I and mixed-mode fracture for quasi-steady cracks.

Proceedings ArticleDOI
03 Sep 2012
TL;DR: The TIRA (Testbed for Information Retrieval Algorithms) web framework is presented, which is currently used as an official evaluation platform for the well-established PAN international plagiarism detection competition and possesses a unique set of compelling features in comparison to existing web-based solutions.
Abstract: With its close ties to the Web, the information retrieval community is destined to leverage the dissemination and collaboration capabilities that the Web provides today. Especially with the advent of the software as a service principle, an information retrieval community is conceivable that publishes experiments that are executable by anyone over the Web. A review of recent SIGIR papers shows that we are far away from this vision of collaboration. The benefits of publishing information retrieval experiments as a service are striking for the community as a whole, including potential to boost research profiles and reputation. However, the additional work must be kept to a minimum and sensitive data must be kept private for this paradigm to become an accepted practice. In order to foster experiments as a service in information retrieval, we present the TIRA (Testbed for Information Retrieval Algorithms) web framework that addresses the outlined challenges and possesses a unique set of compelling features in comparison to existing web-based solutions. To describe TIRA in a practical setting, we explain how it is currently used as an official evaluation platform for the well-established PAN international plagiarism detection competition. We also describe how it can be used in future scenarios for search result clustering of non-static collections of web query results, as well as within a simulation data mining setting to support interactive structural design in civil engineering.

Journal ArticleDOI
TL;DR: In this paper, a systematic experiment was conducted to understand the liberation process of concrete recycling aggregates and the product quality, the parameters of the process and the interaction were experimentally investigated, and the results showed that with a low treatment temperature of 250-300°C and a sufficient high duration of mechanical treatment recycled aggregates with properties similar to natural aggregates can be generated.
Abstract: Compared to virgin aggregates, concrete recyclates have some specific properties because of the attached hardened cement paste. By the method of thermo-mechanical treatment, the hardened cement paste can be removed. In this paper, systematic experiments are described. The aim was to understand the liberation process better. The product quality, the parameters of the process and the interaction were experimentally investigated. The results of the study show that with a low treatment temperature of 250–300 °C and a sufficiently long duration of mechanical treatment, recycled aggregates with properties similar to natural aggregates can be generated.

Journal ArticleDOI
TL;DR: There is no need for experimentalists to apply tensile strain to the resonators before actuation in order to enhance the mass sensitivity, and enhanced mass sensitivity can be obtained by the far simpler technique of actuating nonlinear oscillations of an existing graphene nanoresonator.
Abstract: We perform classical molecular dynamics simulations to investigate the enhancement of the mass sensitivity and resonant frequency of graphene nanomechanical resonators that is achieved by driving them into the nonlinear oscillation regime. The mass sensitivity as measured by the resonant frequency shift is found to triple if the actuation energy is about 2.5 times the initial kinetic energy of the nanoresonator. The mechanism underlying the enhanced mass sensitivity is found to be the effective strain that is induced in the nanoresonator due to the nonlinear oscillations, where we obtain an analytic relationship between the induced effective strain and the actuation energy that is applied to the graphene nanoresonator. An important implication of this work is that there is no need for experimentalists to apply tensile strain to the resonators before actuation in order to enhance the mass sensitivity. Instead, enhanced mass sensitivity can be obtained by the far simpler technique of actuating nonlinear oscillations of an existing graphene nanoresonator.
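
For context on the "frequency shift as mass readout" principle used above, the sketch below evaluates the standard first-order relation df ≈ -(dm / 2m) f0 for a small mass added to a resonator. The resonator frequency and masses are hypothetical; the tripled sensitivity from nonlinear actuation reported in the paper is a separate, simulation-derived effect on top of this basic relation.

```python
# First-order mass-sensing relation for a resonator: an added mass dm lowers
# the resonant frequency by df ~ -(dm / 2m) * f0 (since f is proportional to
# sqrt(k/m)). Illustrative numbers only.
def frequency_shift(f0_hz, resonator_mass_kg, added_mass_kg):
    return -0.5 * (added_mass_kg / resonator_mass_kg) * f0_hz

f0 = 100e9            # hypothetical 100 GHz graphene nanoresonator
m_res = 3.0e-21       # hypothetical resonator mass, kg
dm = 1.0e-24          # hypothetical adsorbed mass, kg
print(f"df = {frequency_shift(f0, m_res, dm):.3e} Hz")
```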

Journal ArticleDOI
TL;DR: The finite element method (FEM) has become a standard tool for solving complex problems in geotechnical engineering as mentioned in this paper, and many different advanced constitutive models for fine-grained soils have been developed.
Abstract: The finite element method (FEM) has become a standard tool for solving complex problems in geotechnical engineering. Many different advanced constitutive models for fine-grained soils have been dev...

Proceedings ArticleDOI
16 Apr 2012
TL;DR: This work presents a simple statistical quality measure that is based on facts extracted from Web content using Open Information Extraction, and uses this measure to identify featured/good articles in Wikipedia.
Abstract: Nowadays, many decisions are based on information found in the Web. For the most part, the disseminating sources are not certified, and hence an assessment of the quality and credibility of Web content became more important than ever. With factual density we present a simple statistical quality measure that is based on facts extracted from Web content using Open Information Extraction. In a first case study, we use this measure to identify featured/good articles in Wikipedia. We compare the factual density measure with word count, a measure that has successfully been applied to this task in the past. Our evaluation corroborates the good performance of word count in Wikipedia since featured/good articles are often longer than non-featured. However, for articles of similar lengths the word count measure fails while factual density can separate between them with an F-measure of 90.4%. We also investigate the use of relational features for categorizing Wikipedia articles into featured/good versus non-featured ones. If articles have similar lengths, we achieve an F-measure of 86.7% and 84% otherwise.
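
The factual density idea described above reduces to facts per unit of content length; a minimal sketch follows, where the fact list stands in for the output of an Open Information Extraction system and the decision threshold is purely illustrative.

```python
# Sketch of factual density: number of extracted facts divided by document
# length. The fact tuples below are placeholders for Open IE output.
def factual_density(extracted_facts, text):
    n_words = len(text.split())
    return len(extracted_facts) / n_words if n_words else 0.0

article = "Weimar is a city in Thuringia. It was a centre of the Bauhaus movement."
facts = [("Weimar", "is a city in", "Thuringia"),
         ("Weimar", "was a centre of", "the Bauhaus movement")]  # placeholder OIE output

fd = factual_density(facts, article)
print(f"factual density = {fd:.3f} facts/word")
print("high factual density" if fd > 0.05 else "low factual density")  # toy threshold
```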

Journal ArticleDOI
TL;DR: In this paper, a new coaxial line cell for the determination of dielectric spectra of undisturbed soil samples was developed based on a 1.625-inch, 50 Ω coaxial system.
Abstract: A new coaxial line cell for the determination of dielectric spectra of undisturbed soil samples was developed based on a 1.625-inch - 50 Ω coaxial system. Undisturbed soil samples were collected from a soil profile of the Taunus region (Germany) and capillary saturated followed by a step-by-step de-watering in a pressure plate apparatus as well as oven-drying at 40°C. The resultant water contents of the soil samples varied from saturation to air-dry. Permittivity measurements were performed within a frequency range from 1 MHz to 10 GHz with a vector network analyser technique. Complex effective relative permittivity or electrical conductivity was obtained by combining quasi-analytical and numerical inversion algorithms as well as the parameterizing of measured full set S-parameters simultaneously under consideration of a generalized fractional dielectric relaxation model (GDR). The measurement of standard materials shows that the technique provides reliable dielectric spectra up to a restricted upper frequency of 5 GHz. For the soil samples investigated, the real part of complex effective relative permittivity ɛ′r,eff and the real part of complex effective electrical conductivity σ′eff decreased with increasing matric potential or decreasing water contents. Soil texture and porosity affect the dielectric behaviour at frequencies below 1 GHz. For frequencies above 1 GHz minor texture effects were found. The presence of organic matter decreases ɛ′r,eff and σ′eff. At 1 GHz, the empirical model of Topp et al. (1980) is in close agreement with the experimentally determined real part of the effective permittivity with RMSEs ranging from 1.21 for the basal periglacial slope deposit and 1.29 for bedrock to 3.93 for the upper periglacial slope deposit (Ah). The comparison of the experimental results with a semi-empirical dielectric mixing model shows that data, especially for the organic-free soils, tend to be under-estimated below 1 GHz. The main advantage of the new method compared with conventional impedance and coaxial methods is the preservation of the natural in-situ structure and properties such as bulk density of the investigated soil samples.
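
The Topp et al. (1980) model mentioned above is the standard empirical polynomial relating the real part of the relative permittivity to volumetric water content; it is recalled below in its commonly cited form as a reference point, not as the paper's own fit.

```python
# Topp et al. (1980) empirical relation between apparent relative permittivity
# (real part) and volumetric water content, in its commonly cited form.
def topp_water_content(eps_r):
    return -5.3e-2 + 2.92e-2 * eps_r - 5.5e-4 * eps_r**2 + 4.3e-6 * eps_r**3

for eps in (5.0, 15.0, 25.0):
    print(f"eps_r = {eps:5.1f} -> theta_v = {topp_water_content(eps):.3f} m^3/m^3")
```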

Proceedings ArticleDOI
11 Nov 2012
TL;DR: A new approach for touch detection on optical multi-touch devices that exploits the fact that the camera images reveal not only the actual touch points, but also objects above the screen such as the hand or arm of a user.
Abstract: We propose a new approach for touch detection on optical multi-touch devices that exploits the fact that the camera images reveal not only the actual touch points, but also objects above the screen such as the hand or arm of a user. Our touch processing relies on the Maximally Stable Extremal Regions algorithm for finding the users' fingertips in the camera image. The hierarchical structure of the generated extremal regions serves as a starting point for agglomerative clustering of the fingertips into hands. Furthermore, we suggest a heuristic supporting the identification of individual fingers as well as the distinction between left hands and right hands if all five fingers of a hand are in contact with the touch surface. Our evaluation confirmed that the system is robust against detection errors resulting from non-uniform illumination and reliably assigns touch points to individual hands based on the implicitly tracked context information. The efficient multithreaded implementation handles two-handed input from multiple users in real-time.
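
A minimal OpenCV/SciPy sketch of the pipeline described above (not the authors' system): MSER blobs in the camera image serve as fingertip candidates, and their centroids are grouped into hands by agglomerative clustering on pairwise distances. The synthetic frame and the 150-pixel grouping threshold are assumptions for illustration; the hierarchy-based clustering and finger identification heuristics of the paper are not reproduced.

```python
# Detect MSER blobs as fingertip candidates, then group centroids into hands
# by single-linkage clustering. Synthetic input frame; illustrative threshold.
import cv2
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, (100, 120), 8, 255, -1)   # synthetic fingertip blobs (one hand)
cv2.circle(frame, (130, 118), 8, 255, -1)
cv2.circle(frame, (420, 300), 8, 255, -1)   # a blob far away (other hand)

mser = cv2.MSER_create()
regions, _ = mser.detectRegions(frame)
centroids = np.array([region.mean(axis=0) for region in regions])  # (x, y) per blob

if len(centroids) > 1:
    # Blobs closer than ~150 px are assumed to belong to the same hand.
    labels = fcluster(linkage(centroids, method="single"), t=150.0, criterion="distance")
else:
    labels = np.ones(len(centroids), dtype=int)

for hand_id in np.unique(labels):
    print(f"hand {hand_id}: {np.sum(labels == hand_id)} touch point(s)")
```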

Journal ArticleDOI
TL;DR: The main goal of this article, as discussed by the authors, is to generalize Hadamard's real part theorem and invariant forms of Borel-Caratheodory's theorem from complex analysis to solutions of the Riesz system in the three-dimensional Euclidean space in the framework of quaternionic analysis.
Abstract: The main goal of this article is to generalize Hadamard's real part theorem and invariant forms of Borel–Caratheodory's theorem from complex analysis to solutions of the Riesz system in the three-dimensional Euclidean space in the framework of quaternionic analysis.
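
For context, the classical complex-analysis prototype that the article generalizes is recalled below in its commonly stated form (a reference statement, not the article's quaternionic result).

```latex
% Classical Borel--Caratheodory inequality: if f is analytic on |z| <= R
% and 0 < r < R, then
\[
  \max_{|z| \le r} |f(z)|
  \;\le\; \frac{2r}{R-r}\,\sup_{|z| \le R} \operatorname{Re} f(z)
  \;+\; \frac{R+r}{R-r}\,|f(0)| .
\]
% Hadamard's real part theorem provides closely related bounds on |f| and its
% derivatives in terms of the supremum of Re f.
```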

Journal ArticleDOI
TL;DR: In this paper, a survey of sulphate and sulphide containing environments in a particular region in mid-Europe is presented to obtain reliable information on the long term behaviour of concrete, and the results show that concrete is liable to be destroyed when in contact with sulphide bearing environments or if intimately mixed with gypsum.
Abstract: Laboratory investigations have been used to derive a high number of important details on sulphate attack on hardened concrete, but are not able to forecast the performance of this material under field conditions. In order to obtain reliable information on the long term behaviour of concrete, a survey is presented that looks at sulphate and sulphide containing environments in a particular region in mid-Europe. Twenty concrete structures have been sampled and analyzed. The classical idea of sulphate attack considering the migration of sulphate ions from ground or river water into concrete with subsequent phase transformation and damage has not been confirmed. This kind of exposure was found to be rare and no serious deterioration has been observed in connection with it. However, concrete is liable to be destroyed when in contact with sulphide bearing environments or if intimately mixed with gypsum. Disintegration and serious expansion requiring immediate repair have been observed. Information on all investigated structures is presented in this article and in more detail in a separate report.

Journal ArticleDOI
TL;DR: In this article, the optimization of TMD systems to suppress multi-resonant dynamic structural response of high-speed railway bridges is studied, where the objective function is chosen based on the H2 norm along with the constraints on all the peaks which are at the same heights over the frequency ranges of interest.
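
As a closed-form reference point only, the sketch below evaluates the classical Den Hartog tuning of a single TMD on an undamped single-degree-of-freedom system under harmonic forcing; the multi-TMD, H2-norm-based numerical optimization of the paper is a different approach and is not reproduced here. Mass ratios are illustrative.

```python
# Classical Den Hartog TMD tuning (single TMD, undamped SDOF main system,
# harmonic forcing); shown only as a reference point for TMD design.
import math

def den_hartog_tuning(mass_ratio):
    """Return (frequency ratio, TMD damping ratio) for mu = m_tmd / m_main."""
    mu = mass_ratio
    freq_ratio = 1.0 / (1.0 + mu)
    damping_ratio = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
    return freq_ratio, damping_ratio

for mu in (0.01, 0.02, 0.05):
    f_opt, zeta_opt = den_hartog_tuning(mu)
    print(f"mu = {mu:.2f}: f_opt = {f_opt:.4f}, zeta_opt = {zeta_opt:.4f}")
```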

Proceedings ArticleDOI
29 Oct 2012
TL;DR: In this article, the authors propose an in-doubt-without query segmentation approach with the basic idea that, in cases of doubt, it is often better to leave queries without any segmentation.
Abstract: Query segmentation is the problem of identifying those keywords in a query, which together form compound concepts or phrases like "new york times". Such segments can help a search engine to better interpret a user's intents and to tailor the search results more appropriately. Our contributions to this problem are threefold. (1) We conduct the first large-scale study of human segmentation behavior based on more than 500,000 segmentations. (2) We show that the traditionally applied segmentation accuracy measures are not appropriate for such large-scale corpora and introduce new, more robust measures. (3) We develop a new query segmentation approach with the basic idea that, in cases of doubt, it is often better to (partially) leave queries without any segmentation. This new in-doubt-without approach chooses different segmentation strategies depending on query types. A large-scale evaluation shows substantial improvement upon the state of the art in terms of segmentation accuracy. To draw a complete picture, we also evaluate the impact of segmentation strategies on retrieval performance in a TREC setting. It turns out that more accurate segmentation does not necessarily yield better retrieval performance. Based on this insight, we propose an in-doubt-without variant which achieves the best retrieval performance despite leaving many queries unsegmented. But there is still room for improvement: the optimum segmentation strategy, which always chooses the segmentation that maximizes retrieval performance, significantly outperforms all other tested approaches.
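
A toy sketch of the "in doubt, leave it without segmentation" idea (not the paper's scoring model): enumerate all segmentations of a short query, score multi-word segments by hypothetical phrase counts, and keep the query unsegmented when no segment has support. The counts below are made up.

```python
# Enumerate segmentations, score multi-word segments by (made-up) phrase counts,
# and fall back to "no segmentation" when nothing has support.
from itertools import combinations

PHRASE_COUNTS = {"new york times": 900_000, "new york": 500_000}  # made-up counts

def segmentations(terms):
    n = len(terms)
    for cuts in range(n):
        for positions in combinations(range(1, n), cuts):
            bounds = (0, *positions, n)
            yield [" ".join(terms[a:b]) for a, b in zip(bounds, bounds[1:])]

def score(segmentation):
    return sum(PHRASE_COUNTS.get(seg, 0) for seg in segmentation if " " in seg)

query = "new york times subscription".split()
best = max(segmentations(query), key=score)
if score(best) == 0:
    best = list(query)   # in doubt: leave the query without any segmentation
print(best)
```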

Journal ArticleDOI
TL;DR: This paper presents a theoretic foundation for optimum document clustering, basing cluster analysis and evaluation on a set of queries, by defining documents as being similar if they are relevant to the same queries.
Abstract: Document clustering offers the potential of supporting users in interactive retrieval, especially when users have problems in specifying their information need precisely. In this paper, we present a theoretic foundation for optimum document clustering. Key idea is to base cluster analysis and evaluation on a set of queries, by defining documents as being similar if they are relevant to the same queries. Three components are essential within our optimum clustering framework, OCF: (1) a set of queries, (2) a probabilistic retrieval method, and (3) a document similarity metric. After introducing an appropriate validity measure, we define optimum clustering with respect to the estimates of the relevance probability for the query-document pairs under consideration. Moreover, we show that well-known clustering methods are implicitly based on the three components, but that they use heuristic design decisions for some of them. We argue that with our framework more targeted research for developing better document clustering methods becomes possible. Experimental results demonstrate the potential of our considerations.
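
The framework's key idea, that documents are similar if they are relevant to the same queries, can be sketched in a few lines: represent each document by its estimated relevance probabilities for a fixed query set and cluster on the similarity of these profiles. The probability matrix and the clustering choices below are made up for illustration and do not reproduce the paper's validity measure.

```python
# Represent documents by relevance-probability profiles over a query set,
# then cluster documents whose profiles agree. Probabilities are made up.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

# rows = documents, columns = queries: estimated P(relevant | query, document)
relevance = np.array([[0.9, 0.1, 0.0],
                      [0.8, 0.2, 0.1],
                      [0.1, 0.9, 0.7],
                      [0.0, 0.8, 0.9]])

distances = pdist(relevance, metric="cosine")   # dissimilarity of profiles
clusters = fcluster(linkage(distances, method="average"), t=2, criterion="maxclust")
print(clusters)   # documents 1-2 and 3-4 end up in the same clusters
```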

Proceedings ArticleDOI
16 Apr 2012
TL;DR: In this paper, the authors conduct an extensive exploratory analysis of Wikipedia's quality flaws and find that more than one in four English Wikipedia articles contains at least one quality flaw, 70% of which concern article verifiability.
Abstract: The online encyclopedia Wikipedia is a successful example of the increasing popularity of user generated content on the Web. Despite its success, Wikipedia is often criticized for containing low-quality information, which is mainly attributed to its core policy of being open for editing by everyone. The identification of low-quality information is an important task since Wikipedia has become the primary source of knowledge for a huge number of people around the world. Previous research on quality assessment in Wikipedia either investigates only small samples of articles, or else focuses on single quality aspects, like accuracy or formality. This paper targets the investigation of quality flaws, and presents the first complete breakdown of Wikipedia's quality flaw structure. We conduct an extensive exploratory analysis, which reveals (1) the quality flaws that actually exist, (2) the distribution of flaws in Wikipedia, and (3) the extent of flawed content. An important finding is that more than one in four English Wikipedia articles contains at least one quality flaw, 70% of which concern article verifiability.