
Showing papers by "Bauhaus University, Weimar" published in 2011


Journal Article
TL;DR: Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems, focusing on bringing machine learning to non-specialists using a general-purpose high-level language.
Abstract: Scikit-learn is a Python module integrating a wide range of state-of-the-art machine learning algorithms for medium-scale supervised and unsupervised problems. This package focuses on bringing machine learning to non-specialists using a general-purpose high-level language. Emphasis is put on ease of use, performance, documentation, and API consistency. It has minimal dependencies and is distributed under the simplified BSD license, encouraging its use in both academic and commercial settings. Source code, binaries, and documentation can be downloaded from http://scikit-learn.sourceforge.net.
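As a brief illustration of the uniform estimator interface described above, here is a minimal usage sketch (example data and parameters are hypothetical; the imports follow the current module layout, which differs slightly from the 2011 release):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Load a small bundled dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Every supervised estimator exposes the same fit/predict/score interface.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))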

47,974 citations


Journal ArticleDOI
TL;DR: A novel approach for isogeometric analysis of thin shells using polynomial splines over hierarchical T-meshes (PHT-splines) that achieves C1 continuity, so the Kirchhoff–Love theory can be used in pristine form.

349 citations


Journal ArticleDOI
TL;DR: The idea is based on polynomial splines and exploits the flexibility of T-meshes for local refinement and satisfies important properties such as non-negativity, local support and partition of unity.

246 citations


Journal ArticleDOI
01 Mar 2011
TL;DR: The results of the evaluation indicate that CL-CNG, despite its simple approach, is the best choice to rank and compare texts across languages if they are syntactically related.
Abstract: Cross-language plagiarism detection deals with the automatic identification and extraction of plagiarism in a multilingual setting. In this setting, a suspicious document is given, and the task is to retrieve all sections from the document that originate from a large, multilingual document collection. Our contributions in this field are as follows: (1) a comprehensive retrieval process for cross-language plagiarism detection is introduced, highlighting the differences to monolingual plagiarism detection, (2) state-of-the-art solutions for two important subtasks are reviewed, (3) retrieval models for the assessment of cross-language similarity are surveyed, and, (4) the three models CL-CNG, CL-ESA and CL-ASA are compared. Our evaluation is of realistic scale: it relies on 120,000 test documents which are selected from the corpora JRC-Acquis and Wikipedia, so that for each test document highly similar documents are available in all of the six languages English, German, Spanish, French, Dutch, and Polish. The models are employed in a series of ranking tasks, and more than 100 million similarities are computed with each model. The results of our evaluation indicate that CL-CNG, despite its simple approach, is the best choice to rank and compare texts across languages if they are syntactically related. CL-ESA almost matches the performance of CL-CNG, but on arbitrary pairs of languages. CL-ASA works best on "exact" translations but does not generalize well.
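To make the CL-CNG model concrete, here is a small sketch of the underlying idea: texts are mapped to character n-gram profiles and compared via cosine similarity, which works across syntactically related languages. The n-gram length of 3 and the toy strings are illustrative assumptions, not the paper's exact configuration.

from collections import Counter
from math import sqrt

def char_ngrams(text, n=3):
    # Lowercase, keep letters/digits/spaces, and count overlapping n-grams.
    text = "".join(ch.lower() for ch in text if ch.isalnum() or ch == " ")
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

de = char_ngrams("Plagiatserkennung über Sprachgrenzen hinweg")
en = char_ngrams("Plagiarism detection across language boundaries")
print(round(cosine(de, en), 3))  # shared n-grams yield a non-zero similarity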

232 citations


Journal ArticleDOI
TL;DR: A multiscale method to couple the homogeneous macroscale with the heterogeneous mesoscale model in a concurrent embedded approach is proposed, which allows an adaptive transition from a full macroscale model to a multiscale model, where only the relevant parts are resolved on a finer scale.
Abstract: In this paper, a mesoscale model of concrete is presented, which considers particles, matrix material and the interfacial transition zone (ITZ) as separate constituents. Particles are represented as ellipsoids, generated according to a prescribed grading curve and placed randomly into the specimen. Algorithms are proposed to generate realistic particle configurations efficiently. The nonlinear behavior is simulated with a cohesive interface model for the ITZ. For the matrix material, different damage/plasticity models are investigated. The simulation of localization requires regularization of the solution, which is performed by using integral-type nonlocal models with strain or displacement averaging. Due to the complexity of a mesoscale model for a realistic structure, a multiscale method to couple the homogeneous macroscale with the heterogeneous mesoscale model in a concurrent embedded approach is proposed. This allows an adaptive transition from a full macroscale model to a multiscale model, where only the relevant parts are resolved on a finer scale. Special emphasis is placed on the investigation of different coupling schemes between the scales, such as the mortar method and the Arlequin method, and a discussion of their advantages and disadvantages within the current context. The applicability of the proposed methodology is illustrated for a variety of examples in tension and compression.

211 citations


Journal ArticleDOI
TL;DR: In this article, the strain smoothing technique of Chen et al. is extended to higher order elements, and it is investigated numerically under which conditions strain smoothing is beneficial to the accuracy and convergence of enriched finite element approximations.
Abstract: By using the strain smoothing technique proposed by Chen et al. (Comput. Mech. 2000; 25: 137-156) for meshless methods in the context of the finite element method (FEM), Liu et al. (Comput. Mech. 2007; 39(6): 859-877) developed the Smoothed FEM (SFEM). Although the SFEM is not yet well understood mathematically, numerical experiments point to potentially useful features of this particularly simple modification of the FEM. To date, the SFEM has only been investigated for bilinear and Wachspress approximations and is limited to linear reproducing conditions. The goal of this paper is to extend the strain smoothing to higher order elements and to investigate numerically under which conditions strain smoothing is beneficial to the accuracy and convergence of enriched finite element approximations. We focus on three widely used enrichment schemes, namely: (a) weak discontinuities; (b) strong discontinuities; (c) near-tip linear elastic fracture mechanics functions. The main conclusion is that strain smoothing in enriched approximations is only beneficial when the enrichment functions are polynomial (cases (a) and (b)), but that non-polynomial enrichment of type (c) leads to inferior methods compared to the standard enriched FEM (e.g. XFEM). Copyright (C) 2011 John Wiley & Sons, Ltd.
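For orientation, the constant strain smoothing at the heart of the SFEM can be written in its usual generic form (a standard identity obtained via the divergence theorem, not specific to this paper): for a smoothing cell \Omega_C with area A_C and boundary \Gamma_C,

\tilde{\boldsymbol{\varepsilon}}_h(\mathbf{x}_C) = \frac{1}{A_C} \int_{\Omega_C} \boldsymbol{\varepsilon}_h(\mathbf{x}) \, \mathrm{d}\Omega = \frac{1}{A_C} \int_{\Gamma_C} \tfrac{1}{2} \left( \mathbf{u}_h \otimes \mathbf{n} + \mathbf{n} \otimes \mathbf{u}_h \right) \mathrm{d}\Gamma,

so only boundary integrals of the (possibly enriched) displacement approximation are required, which is what makes the combination with enrichment functions attractive in the first place.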

168 citations


Journal ArticleDOI
TL;DR: This paper reviews experimental and theoretical studies related to cement hydration and microstructure development published during the four-year interim period between the 12th and 13th International Congress on the Chemistry of Cement.

162 citations


Journal ArticleDOI
01 Mar 2011
TL;DR: This article investigates whether plagiarism can be detected by a computer program if no reference can be provided, e.g., if the foreign sections stem from a book that is not available in digital form.
Abstract: Research in automatic text plagiarism detection focuses on algorithms that compare suspicious documents against a collection of reference documents. Recent approaches perform well in identifying copied or modified foreign sections, but they assume a closed world where a reference collection is given. This article investigates the question whether plagiarism can be detected by a computer program if no reference can be provided, e.g., if the foreign sections stem from a book that is not available in digital form. We call this problem class intrinsic plagiarism analysis; it is closely related to the problem of authorship verification. Our contributions are threefold. (1) We organize the algorithmic building blocks for intrinsic plagiarism analysis and authorship verification and survey the state of the art. (2) We show how the meta learning approach of Koppel and Schler, termed "unmasking", can be employed to post-process unreliable stylometric analysis results. (3) We operationalize and evaluate an analysis chain that combines document chunking, style model computation, one-class classification, and meta learning.
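A heavily simplified sketch of such an analysis chain (document chunking, style model computation, one-class classification) is given below; the two style features, the chunk size, and the one-class SVM are illustrative placeholders rather than the article's actual building blocks, and the unmasking post-processing step is omitted.

import numpy as np
from sklearn.svm import OneClassSVM

def style_features(chunk):
    # Two toy stylometric features: average word length and type-token ratio.
    words = chunk.split()
    if not words:
        return [0.0, 0.0]
    return [float(np.mean([len(w) for w in words])), len(set(words)) / len(words)]

def suspicious_chunks(text, size=200):
    # Chunk the document, build a style model, and flag outlier chunks.
    words = text.split()
    chunks = [" ".join(words[i:i + size]) for i in range(0, len(words), size)]
    X = np.array([style_features(c) for c in chunks])
    labels = OneClassSVM(nu=0.1, gamma="scale").fit_predict(X)
    return [c for c, label in zip(chunks, labels) if label == -1]  # -1 = outlier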

159 citations


Journal ArticleDOI
TL;DR: In this paper, the linear free flexural vibration of cracked material plates is studied using the extended finite element method, with a 4-noded quadrilateral plate bending element based on the field and edge consistency requirement and 20 degrees of freedom per element.

141 citations


Book ChapterDOI
04 Dec 2011
TL;DR: In this paper, the authors derived preimage security bounds for block cipher based double-block-length, double-call hash functions, such as Abreast-DM, Tandem-DM and Hirose's scheme.
Abstract: We present new techniques for deriving preimage resistance bounds for block cipher based double-block-length, double-call hash functions. We give improved bounds on the preimage security of the three "classical" double-block-length, double-call, block cipher-based compression functions, these being Abreast-DM, Tandem-DM and Hirose's scheme. For Hirose's scheme, we show that an adversary must make at least 2^(2n−5) block cipher queries to achieve chance 0.5 of inverting a randomly chosen point in the range. For Abreast-DM and Tandem-DM we show that at least 2^(2n−10) queries are necessary. These bounds improve upon the previous best bounds of Ω(2^n) queries, and are optimal up to a constant factor since the compression functions in question have range of size 2^(2n).

141 citations


Journal ArticleDOI
TL;DR: Comparison of the ALRM, applied to the investigated frequency and temperature range, with sophisticated broadband relaxation models indicates the potential and the limitations of predicting the high-frequency electromagnetic material properties.
Abstract: Frequency- and temperature-dependent complex permittivity or conductivity of a silty clay loam were examined in a broad saturation and porosity range with network analyzer technique (1 MHz-10 GHz, 5 °C-40 °C, coaxial transmission line and open ended coaxial cells). An advanced mixture model based on the well-known Lichtenecker-Rother model (ALRM) was developed and used to parameterize complex permittivity or conductivity at a measurement frequency of 1 GHz under consideration of a dependence of the so-called structure parameter as well as the apparent pore water conductivity on saturation and porosity. The ALRM is compared with frequently applied mixture models: complex refractive index model, Looyenga-Landau-Lifschitz model, Bruggeman-Hanai-Sen model, and Maxwell-Garnett model as well as empirical calibration functions. Comparison of the ALRM, applied to the investigated frequency and temperature range, with sophisticated broadband relaxation models indicates the potential and the limitations of predicting the high-frequency electromagnetic material properties.
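For reference, the basic Lichtenecker-Rother mixing rule on which the ALRM builds can be written as

\varepsilon_{\mathrm{eff}}^{\alpha} = \sum_{i} \theta_i \, \varepsilon_i^{\alpha},

where \theta_i and \varepsilon_i are the volume fraction and complex permittivity of phase i and \alpha is the structure parameter (\alpha = 0.5 recovers the complex refractive index model); the advanced model additionally lets \alpha and the apparent pore water conductivity depend on saturation and porosity, as described above.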

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the influence of the hydroxide concentration as well as the Si/Al ratio in a model system of aluminosilicate gels and showed that a gel with a preferred Si/Al ratio tends to condense.
Abstract: The reaction of geopolymer binders can be subdivided into two more or less parallel reactions: (1) the dissolution of reactable silicate and aluminate monomers from the reactive solid material and (2) the condensation to an aluminosilicate gel. Due to the wide range of possible raw materials, the question arises whether the Si/Al ratio of the hardened aluminosilicate network is determined predominantly by the Si/Al ratio of the raw materials, or whether a gel with a preferred Si/Al ratio tends to condense. Therefore, aluminosilicate gels were synthesized with pure alkali silicate and alkali aluminate solutions. Two measurement series were carried out to investigate the influence of the hydroxide concentration as well as the influence of the Si/Al ratio in the model system. The gels were characterized by chemical analysis, FT-IR spectroscopy, X-ray diffraction as well as ²⁹Si and ²⁷Al MAS NMR spectroscopy.

Journal ArticleDOI
TL;DR: In this article, the node-based smoothed finite element method (NS-FEM) is incorporated into the extended finite element method (XFEM) to form a novel numerical method (NS-XFEM) for analyzing fracture problems of 2D elasticity.
Abstract: This paper aims to incorporate the node-based smoothed finite element method (NS-FEM) into the extended finite element method (XFEM) to form a novel numerical method (NS-XFEM) for analyzing fracture problems of 2D elasticity. NS-FEM uses the strain smoothing technique over the smoothing domains associated with nodes to compute the system stiffness matrix, which leads to line integrations using directly the shape function values along the boundaries of the smoothing domains. As a result, we avoid integration of the stress singularity at the crack tip. It is not necessary to divide elements cut by cracks when we replace interior integration by boundary integration, simplifying integration of the discontinuous approximation. The key advantage of the NS-XFEM is that it provides more accurate solutions compared to the XFEM-T3 element. We will show for two numerical examples that the NS-XFEM significantly improves the results in the energy norm and the stress intensity factors. For the examples studied, we obtain super-convergent results.

Proceedings ArticleDOI
12 Dec 2011
TL;DR: A new projection-based stereoscopic display for six users is developed, which employs six customized DLP projectors for fast time-sequential image display in combination with polarization.
Abstract: Stereoscopic multi-user systems provide multiple users with individual views of a virtual environment. We developed a new projection-based stereoscopic display for six users, which employs six customized DLP projectors for fast time-sequential image display in combination with polarization. Our intelligent high-speed shutter glasses can be programmed from the application to adapt to the situation: for instance, they stay open if users do not look at the projection screen, or switch to a VIP high-brightness mode if fewer than six users use the system. Each user is tracked and can move freely in front of the display while perceiving perspectively correct views of the virtual environment. Navigating a group of six users through a virtual world leads to situations in which the group will not fit through spatial constrictions. Our augmented group navigation techniques ameliorate this situation by fading out obstacles or by slightly redirecting individual users along a collision-free path. While redirection goes mostly unnoticed, both techniques temporarily give up the notion of a consistent shared space. Our user study confirms that users generally prefer this trade-off over naive approaches.

Journal ArticleDOI
TL;DR: This article proposes an extension of Structural Correspondence Learning (SCL), a recently proposed algorithm for domain adaptation, to cross-lingual adaptation in the context of text classification; the method uses unlabeled documents from both languages, along with a word translation oracle, to induce a cross-lingual representation that enables the transfer of classification knowledge from the source to the target language.
Abstract: Cross-lingual adaptation is a special case of domain adaptation and refers to the transfer of classification knowledge between two languages. In this article we describe an extension of Structural Correspondence Learning (SCL), a recently proposed algorithm for domain adaptation, for cross-lingual adaptation in the context of text classification. The proposed method uses unlabeled documents from both languages, along with a word translation oracle, to induce a cross-lingual representation that enables the transfer of classification knowledge from the source to the target language. The main advantages of this method over existing methods are resource efficiency and task specificity. We conduct experiments in the area of cross-language topic and sentiment classification involving English as source language and German, French, and Japanese as target languages. The results show a significant improvement of the proposed method over a machine translation baseline, reducing the relative error due to cross-lingual adaptation by an average of 30% (topic classification) and 59% (sentiment classification). We further report on empirical analyses that reveal insights into the use of unlabeled data, the sensitivity with respect to important hyperparameters, and the nature of the induced cross-lingual word correspondences.
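A rough sketch of the SCL core step conveys the idea (this is an illustrative reconstruction, not the authors' implementation; all names and parameters are hypothetical): pivot predictors are trained on unlabeled documents from both languages, and an SVD of their stacked weight vectors yields the shared low-dimensional representation.

import numpy as np
from sklearn.linear_model import SGDClassifier

def induce_projection(X_unlabeled, pivot_cols, k=50):
    """X_unlabeled: dense (n_docs x n_features) term matrix pooled from both
    languages; pivot_cols: column indices of pivot features (translation pairs)."""
    W = []
    for p in pivot_cols:
        y = (X_unlabeled[:, p] > 0).astype(int)   # does the pivot occur in the doc?
        X_masked = X_unlabeled.copy()
        X_masked[:, p] = 0                        # hide the pivot itself
        clf = SGDClassifier(loss="modified_huber").fit(X_masked, y)
        W.append(clf.coef_.ravel())
    # Left singular vectors of the weight matrix span the cross-lingual subspace.
    U, _, _ = np.linalg.svd(np.array(W).T, full_matrices=False)
    return U[:, :k]                               # theta: n_features x k

# A source-language classifier is then trained on features augmented with
# X @ theta and applied to target-language documents mapped the same way.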

Proceedings ArticleDOI
28 Mar 2011
TL;DR: A new method for query segmentation that is easy to implement, fast, and that comes with a segmentation accuracy comparable to current state-of-the-art techniques is introduced.
Abstract: We address the problem of query segmentation: given a keyword query, the task is to group the keywords into phrases, if possible. Previous approaches to the problem achieve reasonable segmentation performance but are tested only against a small corpus of manually segmented queries. In addition, many of the previous approaches are fairly intricate as they use expensive features and are difficult to reimplement. The main contribution of this paper is a new method for query segmentation that is easy to implement, fast, and that comes with a segmentation accuracy comparable to current state-of-the-art techniques. Our method uses only raw web n-gram frequencies and Wikipedia titles that are stored in a hash table. At the same time, we introduce a new evaluation corpus for query segmentation. With about 50,000 human-annotated queries, it is two orders of magnitude larger than the corpus being used up to now.
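The toy sketch below illustrates a naive n-gram-frequency scoring in the spirit of the method; the exact weighting scheme and the handling of Wikipedia titles in the paper are more refined, and ngram_freq with its counts is made up for illustration.

def segmentations(words):
    # Enumerate all 2^(n-1) ways to cut a query into contiguous segments.
    n = len(words)
    for cuts in range(2 ** (n - 1)):
        seg, start = [], 0
        for i in range(1, n):
            if cuts & (1 << (i - 1)):
                seg.append(" ".join(words[start:i]))
                start = i
        seg.append(" ".join(words[start:]))
        yield seg

def best_segmentation(query, ngram_freq):
    words = query.split()
    def score(seg):
        # Reward long segments that are frequent phrases on the web.
        return sum(len(s.split()) ** len(s.split()) * ngram_freq.get(s, 0)
                   for s in seg if len(s.split()) >= 2)
    return max(segmentations(words), key=score)

freqs = {"new york": 900000, "new york times": 400000, "times square": 120000}
print(best_segmentation("new york times square", freqs))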

Journal ArticleDOI
TL;DR: In this paper, the authors used a quadrilateral element with smoothed curvatures and the extended finite element method to solve the linear buckling problem for isotropic plates, where the curvature at each point is obtained by a nonlocal approximation via a smoothing function.
Abstract: In this paper, the linear buckling problem for isotropic plates is studied using a quadrilateral element with smoothed curvatures and the extended finite element method. First, the curvature at each point is obtained by a nonlocal approximation via a smoothing function. This element is later coupled with partition of unity enrichment to simplify the simulation of cracks. The proposed formulation suppresses locking and yields elements which behave very well, even in the thin plate limit. The buckling coefficient and mode shapes of square and rectangular plates are computed as functions of crack length, crack location, and plate thickness. The effects of different boundary conditions are also studied.

Journal ArticleDOI
TL;DR: The discrete shear gap (DSG) method is incorporated into the alpha finite element method (AαFEM) to eliminate transverse shear locking, and an improved triangular element termed AαDSG3 is proposed.

Proceedings ArticleDOI
24 Oct 2011
TL;DR: A general probabilistic model for term weights is employed which reveals how ESA actually works and provides a theoretical grounding on how the size and the composition of the index collection affect the ESA-based computation of similarity values for texts.
Abstract: Since its debut the Explicit Semantic Analysis (ESA) has received much attention in the IR community. ESA has been proven to perform surprisingly well in several tasks and in different contexts. However, given the conceptual motivation for ESA, recent work has observed unexpected behavior. In this paper we look at the foundations of ESA from a theoretical point of view and employ a general probabilistic model for term weights which reveals how ESA actually works. Based on this model we explain some of the phenomena that have been observed in previous work and support our findings with new experiments. Moreover, we provide a theoretical grounding on how the size and the composition of the index collection affect the ESA-based computation of similarity values for texts.
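For readers unfamiliar with ESA, the following toy sketch shows the basic mechanism: texts are represented by their similarities to the articles of an index collection, and text similarity is then computed in this concept space. The three miniature "articles" stand in for a real collection such as Wikipedia, and the TF-IDF weighting is just one common choice, not necessarily the paper's setup.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A tiny stand-in for the index collection (e.g., Wikipedia articles).
index_articles = ["machine learning and statistics",
                  "football world cup results",
                  "information retrieval and search engines"]
vectorizer = TfidfVectorizer().fit(index_articles)
concepts = vectorizer.transform(index_articles)

def esa_vector(text):
    # Map a text to its similarities with every index article ("concept").
    return cosine_similarity(vectorizer.transform([text]), concepts)

a = esa_vector("search engine ranking")
b = esa_vector("retrieval models")
print(cosine_similarity(a, b))  # similarity of the two texts in concept space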

Journal ArticleDOI
TL;DR: Show-through techniques are proposed to make sure that the objects one is pointing to can always be seen by others, and it is found that these techniques can improve collaborative interaction tasks even in distributed setups.
Abstract: Multi-user virtual reality systems enable natural collaboration in shared virtual worlds. Users can talk to each other, gesture and point into the virtual scenery as if it were real. As in reality, referring to objects by pointing often results in situations where objects are occluded from the other users' viewpoints. While in reality this problem can only be solved by adapting the viewing position, specialized individual views of the shared virtual scene enable various other solutions. As one such solution we propose show-through techniques to make sure that the objects one is pointing to can always be seen by others. We first study the impact of such augmented viewing techniques on the spatial understanding of the scene, the rapidity of mutual information exchange as well as the proxemic behavior of users. To this end we conducted a user study in a co-located stereoscopic multi-user setup. Our study revealed advantages for show-through techniques in terms of comfort, user acceptance and compliance to social protocols, while spatial understanding and mutual information exchange are retained. Motivated by these results we further analyze whether show-through techniques may also be beneficial in distributed virtual environments. We investigated a distributed setup for two users, each participant having their own display screen and a minimalist avatar representation for each participant. In such a configuration there is a lack of mutual awareness, which hinders the understanding of each other's pointing gestures and decreases the relevance of social protocols in terms of proxemic behavior. Nevertheless, we found that show-through techniques can improve collaborative interaction tasks even in such situations.

Journal ArticleDOI
TL;DR: Bayesian methods for model selection are extended to model assessment without measurements, using model averaging as a reference; a procedure is presented which can be used to estimate the model framework uncertainty and which enables the selection of the optimal model with the best compromise between model input and framework uncertainty.

Journal ArticleDOI
TL;DR: As discussed in this paper, the laser beam is a small, flexible and fast polishing tool that can be used to finish many outlines or geometries on quartz glass surfaces in the shortest possible time.

Journal ArticleDOI
TL;DR: In this article, the authors proposed an innovative algorithm that provides quantitative measures for evaluating coupled partial models in structural engineering and applied it in bridge engineering, analysing bridge behaviour under dynamic loading with creep and shrinkage material models and further considering geometric nonlinear effects.

Proceedings ArticleDOI
28 Mar 2011
TL;DR: A comprehensive list of information quality flaws in Wikipedia is compiled, the flaws are modeled according to the latest state of the art, and one-class classification technology is devised for their identification.
Abstract: Featured articles in Wikipedia stand for high information quality, and researchers have found it interesting to analyze whether and how they can be distinguished from "ordinary" articles. Here we point out that article discrimination falls far short of writer support or automatic quality assurance: Featured articles are not identified, but are made. Following this motto we compile a comprehensive list of information quality flaws in Wikipedia, model them according to the latest state of the art, and devise one-class classification technology for their identification.

Book ChapterDOI
26 Sep 2011 · Contexts
TL;DR: Approaches for reducing the energy consumption of utilizing smartphone sensors are presented and it is shown that energy awareness benefits from a more abstract view on context elements.
Abstract: Modern smartphones provide sensors that can be used to describe the current context of the device and its user. Contextual knowledge allows software systems to adapt to personal preferences of users and to make data processing context-aware. Different sensors or measurement approaches used for recognizing the values of particular context elements vary greatly in their energy consumption. This paper presents approaches for reducing the energy consumption of utilizing smartphone sensors. We discuss sensor substitution strategies as well as logical dependencies among sensor measurements. The paper describes the first milestone towards a generalization of such strategies. Furthermore, we show that energy awareness benefits from a more abstract view on context elements.

Journal ArticleDOI
TL;DR: The approach is extended by a data structure that facilitates the hierarchical organization of layout elements making it possible to structure and organize larger layout problems into subsets that can be solved in parallel.
Abstract: This paper focuses on computer-based generative methods for layout problems in architecture and urban planning with special regard for the hierarchical structuring of layout elements. The generation of layouts takes place using evolutionary algorithms, which are used to optimize the arrangement of elements in terms of overlapping within a given boundary and the topological relations between them. In this paper, the approach is extended by a data structure that facilitates the hierarchical organization of layout elements making it possible to structure and organize larger layout problems into subsets that can be solved in parallel. An important aspect for the applicability of such a system in the design process is an appropriate means of user interaction. This depends largely on the calculation speed of the system and the variety of viable solutions. These properties are evaluated for hierarchical as well as for nonhierarchical structured layout problems.
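The basic optimization loop can be conveyed by a toy (1+1) evolution strategy that mutates rectangle positions and keeps the better layout; the fitness terms (pairwise overlap plus boundary violation) and the mutation operator are deliberately much simpler than the hierarchical, topology-aware system described above.

import random

def overlap(a, b):
    # Overlapping area of two axis-aligned rectangles given as [x, y, w, h].
    dx = min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0])
    dy = min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1])
    return max(dx, 0) * max(dy, 0)

def fitness(rects, bound=(0, 0, 100, 100)):
    pairwise = sum(overlap(a, b) for i, a in enumerate(rects) for b in rects[i + 1:])
    outside = sum(max(0, r[0] + r[2] - bound[2]) + max(0, r[1] + r[3] - bound[3]) +
                  max(0, -r[0]) + max(0, -r[1]) for r in rects)
    return pairwise + outside            # 0 means no overlaps, all inside the boundary

rects = [[random.uniform(0, 80), random.uniform(0, 80), 20, 20] for _ in range(5)]
for _ in range(2000):                    # (1+1) evolution strategy
    candidate = [[x + random.gauss(0, 2), y + random.gauss(0, 2), w, h]
                 for x, y, w, h in rects]
    if fitness(candidate) <= fitness(rects):
        rects = candidate
print("final fitness:", fitness(rects))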

Journal ArticleDOI
TL;DR: In this article, an efficient hybrid wavenumber integration-boundary integral equation method (WNI-BIEM) is proposed, validated and applied for synthesis of seismic signals in the finite soil stratum.

Proceedings ArticleDOI
24 Oct 2011
TL;DR: It is argued that common binary or multiclass classification approaches are ineffective here, and the approach is underpinned by a real-world application: a dedicated one-class learning approach to determine whether a given Wikipedia article suffers from certain quality flaws.
Abstract: For Web applications that are based on user generated content the detection of text quality flaws is a key concern. Our research contributes to automatic quality flaw detection. In particular, we propose to cast the detection of text quality flaws as a one-class classification problem: we are given only positive examples (= texts containing a particular quality flaw) and decide whether or not an unseen text suffers from this flaw. We argue that common binary or multiclass classification approaches are ineffective here, and we underpin our approach with a real-world application: we employ a dedicated one-class learning approach to determine whether a given Wikipedia article suffers from certain quality flaws. Since in the Wikipedia setting the acquisition of sensible test data is quite intricate, we analyze the effects of a biased sample selection. In addition, we illustrate the classifier effectiveness as a function of the flaw distribution in order to cope with the unknown (real-world) flaw-specific class imbalances. Altogether, provided test data with little noise, four of ten important quality flaws in Wikipedia can be detected with a precision close to 1.

Journal ArticleDOI
TL;DR: In this article, a new method to model fracture of concrete based on energy minimisation is presented, in which the concrete is considered on the mesoscale as a composite consisting of cement paste, aggregates and micro pores.
Abstract: We present a new method to model fracture of concrete based on energy minimisation. The concrete is considered on the mesoscale as a composite consisting of cement paste, aggregates and micro pores. In this first step, the alkali-silica reaction is taken into account through damage mechanics, although the process is more complex, involving thermo-hygro-chemo-mechanical reactions. We use a non-local damage model that ensures the well-posedness of the boundary value problem (BVP). In contrast to existing methods, the interactions between degrees of freedom evolve with the damage evolution. Numerical results are compared to analytical and experimental results and show good agreement.
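For context, integral-type non-local damage models of the kind referred to above replace the local equivalent strain by a weighted spatial average (generic textbook form, not this paper's specific interaction-evolving variant):

\bar{\varepsilon}(\mathbf{x}) = \int_{\Omega} \alpha(\mathbf{x}, \boldsymbol{\xi}) \, \varepsilon_{\mathrm{eq}}(\boldsymbol{\xi}) \, \mathrm{d}\boldsymbol{\xi}, \qquad \int_{\Omega} \alpha(\mathbf{x}, \boldsymbol{\xi}) \, \mathrm{d}\boldsymbol{\xi} = 1,

where \alpha is a normalized weight function; driving damage by \bar{\varepsilon} instead of the local equivalent strain regularizes the boundary value problem, and in the approach above the interaction weights themselves evolve with damage.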

Journal ArticleDOI
TL;DR: This work proposes a pseudophysical metaphor that is both plausible enough to provide realistic interaction and robust enough to meet the needs of industrial applications and introduces the concept of Normal Proxies, which extend objects with appropriate normals for improved grasp detection and grasp stability.
Abstract: Natural Interaction in virtual environments is a key requirement for the virtual validation of functional aspects in automotive product development processes. Natural Interaction is the metaphor people encounter in reality: the direct manipulation of objects by their hands. To enable this kind of Natural Interaction, we propose a pseudophysical metaphor that is both plausible enough to provide realistic interaction and robust enough to meet the needs of industrial applications. Our analysis of the most common types of objects in typical automotive scenarios guided the development of a set of refined grasping heuristics to support robust finger-based interaction of multiple hands and users. The objects' behavior in reaction to the users' finger motions is based on pseudophysical simulations, which also take various types of constrained objects into account. In dealing with real-world scenarios, we had to introduce the concept of Normal Proxies, which extend objects with appropriate normals for improved grasp detection and grasp stability. An expert review revealed that our interaction metaphors allow for an intuitive and reliable assessment of several functionalities of objects found in a car interior. Follow-up user studies showed that overall task performance and usability are similar for CAVE and HMD environments. For larger objects and more gross manipulation, using the CAVE without employing a virtual hand representation is preferred, but for more fine-grained manipulation and smaller objects, the HMD turns out to be beneficial.