
Showing papers by Technical University of Berlin, published in 2007


Proceedings ArticleDOI
04 Jul 2007
TL;DR: This work argues that defining a modeling operation by asking for rigidity of the local transformations is useful in various settings, and devises a simple iterative mesh editing scheme based on this principle that leads to detail-preserving and intuitive deformations.
Abstract: Modeling tasks, such as surface deformation and editing, can be analyzed by observing the local behavior of the surface. We argue that defining a modeling operation by asking for rigidity of the local transformations is useful in various settings. Such a formulation leads to a non-linear, yet conceptually simple energy formulation, which is to be minimized by the deformed surface under particular modeling constraints. We devise a simple iterative mesh editing scheme based on this principle that leads to detail-preserving and intuitive deformations. Our algorithm is effective and notably easy to implement, making it attractive for practical modeling applications.

1,028 citations
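The local-rigidity principle described in the abstract above can be sketched in a few lines of code. Below is a toy 2D version of the local/global iteration (per-vertex rotation fit by SVD, then a linear Laplacian solve) applied to a polyline rather than a surface mesh; the function name, the uniform edge weights, and the chain setting are illustrative simplifications, not the paper's implementation.

```python
import numpy as np

def arap_deform_chain(rest, handles, iters=300):
    """Toy as-rigid-as-possible deformation of a 2D polyline.

    rest    : (n, 2) array of rest positions
    handles : dict {vertex index: target position}
    Alternates a local step (best-fit rotation per vertex, via SVD)
    with a global step (linear Laplacian solve), minimizing the sum of
    squared deviations of deformed edges from rigidly rotated rest edges.
    """
    n = len(rest)
    pos = rest.astype(float).copy()
    for i, t in handles.items():
        pos[i] = t
    nbrs = [[j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)]

    # Uniform-weight graph Laplacian of the chain.
    L = np.zeros((n, n))
    for i in range(n):
        for j in nbrs[i]:
            L[i, i] += 1.0
            L[i, j] -= 1.0

    free = [i for i in range(n) if i not in handles]
    fixed = list(handles)
    for _ in range(iters):
        # Local step: rotation best aligning rest edges to current edges.
        R = []
        for i in range(n):
            S = sum(np.outer(pos[i] - pos[j], rest[i] - rest[j])
                    for j in nbrs[i])
            U, _, Vt = np.linalg.svd(S)
            if np.linalg.det(U @ Vt) < 0:   # exclude reflections
                U[:, -1] *= -1
            R.append(U @ Vt)
        # Global step: solve L p = b for the free vertices,
        # with handle positions moved to the right-hand side.
        b = np.zeros((n, 2))
        for i in range(n):
            for j in nbrs[i]:
                b[i] += 0.5 * (R[i] + R[j]) @ (rest[i] - rest[j])
        A = L[np.ix_(free, free)]
        rhs = b[free] - L[np.ix_(free, fixed)] @ pos[fixed]
        pos[free] = np.linalg.solve(A, rhs)
    return pos
```

If the handle targets are compatible with a rigid motion, the iteration converges toward that motion, which is the detail-preserving behavior the abstract refers to.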


Journal ArticleDOI
TL;DR: It is proposed that the key to quick efficiency in the BBCI system is its flexibility due to complex but physiologically meaningful features and its adaptivity which respects the enormous inter-subject variability.

865 citations


Journal Article
TL;DR: This paper proposes a new method called importance weighted cross validation (IWCV), whose unbiasedness even under the covariate shift is proved; the IWCV procedure is the only one that can be applied for unbiased classification under covariate shift.
Abstract: A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points that will be given in the future test phase. However, this assumption is not satisfied, for example, when the outside of the training region is extrapolated. The situation where the training input points and test input points follow different distributions, while the conditional distribution of output values given input points is unchanged, is called the covariate shift. Under the covariate shift, standard model selection techniques such as cross validation do not work as desired since its unbiasedness is no longer maintained. In this paper, we propose a new method called importance weighted cross validation (IWCV), for which we prove its unbiasedness even under the covariate shift. The IWCV procedure is the only one that can be applied for unbiased classification under covariate shift, whereas alternatives to IWCV exist for regression. The usefulness of our proposed method is illustrated by simulations, and furthermore demonstrated in the brain-computer interface, where strong non-stationarity effects can be seen between training and test sessions.

807 citations
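The IWCV idea is simple to express in code: in each cross-validation fold, the held-out losses are multiplied by the importance w(x) = p_test(x) / p_train(x) before averaging. The sketch below assumes the two input densities are known (in practice they must be estimated); the function names and the Gaussian setup are illustrative, not the paper's experimental design.

```python
import numpy as np

def iwcv(x, y, fit, loss, weights, k=5):
    """Importance-weighted k-fold cross validation (minimal sketch).

    weights[i] approximates p_test(x_i) / p_train(x_i); with all weights
    equal to 1 this reduces to ordinary k-fold cross validation.
    fit(x, y) returns a predictor f; loss(f(x), y) is a per-sample loss.
    """
    n = len(x)
    folds = np.array_split(np.arange(n), k)
    fold_risks = []
    for held_out in folds:
        train = np.setdiff1d(np.arange(n), held_out)
        f = fit(x[train], y[train])
        per_sample = loss(f(x[held_out]), y[held_out])
        # Unnormalized importance-weighted mean keeps the estimate
        # unbiased for the test risk under covariate shift.
        fold_risks.append(np.mean(weights[held_out] * per_sample))
    return float(np.mean(fold_risks))

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
```

Model selection under covariate shift then proceeds as usual, except that the model with the smallest IWCV score (rather than the smallest ordinary CV score) is chosen.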


Proceedings Article
03 Dec 2007
TL;DR: This paper proposes a direct importance estimation method that does not involve density estimation and is equipped with a natural cross validation procedure and hence tuning parameters such as the kernel width can be objectively optimized.
Abstract: A situation where training and test samples follow different input distributions is called covariate shift. Under covariate shift, standard learning methods such as maximum likelihood estimation are no longer consistent—weighted variants according to the ratio of test and training input densities are consistent. Therefore, accurately estimating the density ratio, called the importance, is one of the key issues in covariate shift adaptation. A naive approach to this task is to first estimate training and test input densities separately and then estimate the importance by taking the ratio of the estimated densities. However, this naive approach tends to perform poorly since density estimation is a hard task particularly in high dimensional cases. In this paper, we propose a direct importance estimation method that does not involve density estimation. Our method is equipped with a natural cross validation procedure and hence tuning parameters such as the kernel width can be objectively optimized. Simulations illustrate the usefulness of our approach.

785 citations
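The "direct" approach can be sketched as fitting a kernel model of the density ratio by maximizing the test-set log-likelihood under the constraint that the importance averages to one over the training inputs. The following is a simplified projected-gradient sketch of that idea; the kernel width, learning rate, center selection, and all names are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

def fit_importance(x_tr, x_te, sigma=0.5, iters=2000, lr=0.01):
    """Direct density-ratio estimation sketch: model
    w(x) = sum_l a_l * exp(-(x - c_l)^2 / (2 sigma^2)) with centers c_l
    taken from the test points; maximize the mean log-importance over
    the test sample, keeping a >= 0 and mean_train w(x) = 1."""
    centers = x_te[::10]                       # a thinned set of kernel centers
    K = lambda x: np.exp(-(x[:, None] - centers[None, :]) ** 2
                         / (2 * sigma ** 2))
    K_te, K_tr = K(x_te), K(x_tr)
    a = np.ones(len(centers))
    for _ in range(iters):
        w_te = K_te @ a
        a += lr * (K_te / w_te[:, None]).mean(axis=0)  # grad of mean log w
        a = np.maximum(a, 0.0)                         # nonnegativity
        a /= (K_tr @ a).mean()                         # mean_train w = 1
    return lambda x: K(x) @ a
```

Note that no density is ever estimated: the ratio is modeled and fitted directly, which is the point the abstract makes about avoiding density estimation in high dimensions.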


Journal ArticleDOI
TL;DR: The B. amyloliquefaciens FZB42 genome reveals an unexpected potential to produce secondary metabolites, including the polyketides bacillaene and difficidin, and identifies four giant gene clusters absent in B. subtilis 168.
Abstract: Bacillus amyloliquefaciens FZB42 is a Gram-positive, plant-associated bacterium, which stimulates plant growth and produces secondary metabolites that suppress soil-borne plant pathogens. Its 3,918-kb genome, containing an estimated 3,693 protein-coding sequences, lacks extended phage insertions, which occur ubiquitously in the closely related Bacillus subtilis 168 genome. The B. amyloliquefaciens FZB42 genome reveals an unexpected potential to produce secondary metabolites, including the polyketides bacillaene and difficidin. More than 8.5% of the genome is devoted to synthesizing antibiotics and siderophores by pathways not involving ribosomes. Besides five gene clusters, known from B. subtilis to mediate nonribosomal synthesis of secondary metabolites, we identified four giant gene clusters absent in B. subtilis 168. The pks2 gene cluster encodes the components to synthesize the macrolactin core skeleton.

732 citations


Journal ArticleDOI
TL;DR: The key challenges identified include: heat transfer problems and resulting non-uniformity in processing, obtaining reliable and reproducible data for process validation, lack of detailed knowledge about the interaction between high pressure and a number of food constituents, and packaging and statutory issues.
Abstract: Consumers increasingly demand convenience foods of the highest quality in terms of natural flavor and taste, and which are free from additives and preservatives. This demand has triggered the need for the development of a number of nonthermal approaches to food processing, of which high-pressure technology has proven to be very valuable. A number of recent publications have demonstrated novel and diverse uses of this technology. Its novel features, which include destruction of microorganisms at room temperature or lower, have made the technology commercially attractive. Enzymes and even spore forming bacteria can be inactivated by the application of pressure-thermal combinations. This review aims to identify the opportunities and challenges associated with this technology. In addition to discussing the effects of high pressure on food components, this review covers the combined effects of high pressure processing with: gamma irradiation, alternating current, ultrasound, and carbon dioxide or anti-microbial treatment. Further, the applications of this technology in various sectors (fruits and vegetables, dairy, and meat processing) have been dealt with extensively. The integration of high-pressure with other matured processing operations such as blanching, dehydration, osmotic dehydration, rehydration, frying, freezing/thawing and solid-liquid extraction has been shown to open up new processing options. The key challenges identified include: heat transfer problems and resulting non-uniformity in processing, obtaining reliable and reproducible data for process validation, lack of detailed knowledge about the interaction between high pressure and a number of food constituents, and packaging and statutory issues.

711 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the role of emotions in human-technology interaction, using Scherer's (1984) component theory of emotions as a theoretical foundation. The results demonstrate that the manipulation of selected system properties may lead to differences in usability that affect emotional user reactions.
Abstract: In the past, research on human–technology interaction has almost exclusively concentrated on aspects of usefulness and usability. Despite the success of this line of research, its narrow perspective has recently become a target for criticism. To explain why people prefer some systems over others, factors such as aesthetic qualities and emotional experiences play an important role in addition to instrumental aspects. In the following, we report three experiments that illustrate the importance of such factors. In the first experiment, we study the role of emotions in human–technology interaction by using Scherer's (1984) component theory of emotions as a theoretical foundation. A combination of methods is derived from that theory and employed to measure subjective feelings, motor expressions, physiological reactions, cognitive appraisals, and behaviour. The results demonstrate that the manipulation of selected system properties may lead to differences in usability that affect emotional user reactions. The s...

480 citations


Journal ArticleDOI
TL;DR: In this paper, the authors sketch out current theoretical and empirical developments in the social sciences and point toward the acute and increasing need for multidisciplinary longitudinal data covering a wide range of living conditions for both theoretical investigation and the evaluation of policy measures.
Abstract: After the introduction, in Section 2 we very briefly sketch out current theoretical and empirical developments in the social sciences. In our view, they all point in the same direction: toward the acute and increasing need for multidisciplinary longitudinal data covering a wide range of living conditions and based on a multitude of variables from the social sciences for both theoretical investigation and the evaluation of policy measures. Cohort and panel studies are therefore called upon to become truly interdisciplinary tools. In Section 3, we describe the German Socio-Economic Panel Study (SOEP), discuss recent improvements of that study which approach this ideal, and point out existing shortcomings. Section 4 concludes with a discussion of potential future issues and developments for SOEP and other household panel studies.

474 citations


Journal ArticleDOI
01 Apr 2007-Energy
TL;DR: In this paper, the authors propose definitions of some terms used in exergy analysis and exergy costing, options for the symbols to be used for exergy and some exergoeconomic variables, and the nomenclature for the remaining terms.

450 citations


Journal ArticleDOI
TL;DR: The results showed that long-distance dispersal by vehicles was a routine rather than an occasional mechanism that will accelerate plant invasions and induce rapid changes in biodiversity patterns.
Abstract: Roadsides are preferential migration corridors for invasive plant species and can act as starting points for plant invasions into adjacent habitats. Rapid spread and interrupted distribution patterns of introduced plant species indicate long-distance dispersal along roads. The extent to which this process is due to species' migration along linear habitats or, alternatively, to seed transport by vehicles has not yet been tested systematically. We tested this by sampling seeds inside long motorway tunnels to exclude nontraffic dispersal. Vehicles transported large amounts of seeds. The annual seed rain caused by vehicles on the roadsides of five different tunnel lanes within three tunnels along a single urban motorway in Berlin, Germany, ranged from 635 to 1579 seeds/m²/year. Seeds of non-native species accounted for 50.0% of the 204 species and 54.4% of the total 11,818 seeds trapped inside the tunnels. Among the samples were 39 (19.1%) highly invasive species that exhibit detrimental effects on native biodiversity in some parts of the world. By comparing the flora in the tunnel with that adjacent to the tunnel entrances we confirmed long-distance dispersal events (>250 m) for 32.3% of the sampled species. Seed sources in a radius of 100 m around the entrances of the tunnels had no significant effect on species richness and species composition of seed samples from inside the tunnels, indicating a strong effect of long-distance dispersal by vehicles. Consistently, the species composition of the tunnel seeds was more similar to the regional roadside flora of Berlin than to the local flora around the tunnel entrances. Long-distance dispersal occurred significantly more frequently in seeds of non-native (mean share 38.5%) than native species (mean share 4.1%). Our results showed that long-distance dispersal by vehicles was a routine rather than an occasional mechanism.
Dispersal of plants by vehicles will thus accelerate plant invasions and induce rapid changes in biodiversity patterns.

412 citations


Journal ArticleDOI
TL;DR: In this paper, the authors presented the full in-plane phonon dispersion of graphite obtained from inelastic x-ray scattering, including the optical and acoustic branches, as well as the midfrequency range between the K and M points in the Brillouin zone, where the experimental data have been unavailable so far.
Abstract: We present the full in-plane phonon dispersion of graphite obtained from inelastic x-ray scattering, including the optical and acoustic branches, as well as the midfrequency range between the K and M points in the Brillouin zone, where the experimental data have been unavailable so far. The existence of a Kohn anomaly at the K point is further supported. We fit a fifth-nearest neighbor force-constant model to the experimental data, making improved force-constant calculations of the phonon dispersion in both graphite and carbon nanotubes available.

Proceedings ArticleDOI
29 Jul 2007
TL;DR: This system provides real-time algorithms for both control curve deformation and the subsequent surface optimization and it is shown that one can create sophisticated models using this system, which have not yet been seen in previous sketching or functional optimization systems.
Abstract: This paper presents a system for designing freeform surfaces with a collection of 3D curves. The user first creates a rough 3D model by using a sketching interface. Unlike previous sketching systems, the user-drawn strokes stay on the model surface and serve as handles for controlling the geometry. The user can add, remove, and deform these control curves easily, as if working with a 2D line drawing. The curves can have arbitrary topology; they need not be connected to each other. For a given set of curves, the system automatically constructs a smooth surface embedding by applying functional optimization. Our system provides real-time algorithms for both control curve deformation and the subsequent surface optimization. We show that one can create sophisticated models using this system, which have not yet been seen in previous sketching or functional optimization systems.

Journal ArticleDOI
TL;DR: In this article, the Yangtze Platform is used for the identification of small shelly fossils (SSFs), and five biozones are established for the Meishucunian Stage.

Journal ArticleDOI
TL;DR: Straightforward proofs of estimates used in the adiabatic approximation are presented; the gap dependence is analyzed explicitly and the result is applied to interpolating Hamiltonians of interest in quantum computing.
Abstract: We present straightforward proofs of estimates used in the adiabatic approximation. The gap dependence is analyzed explicitly. We apply the result to interpolating Hamiltonians of interest in quantum computing.
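The shape of such estimates can be written schematically as follows (this is a generic form of gap-dependent adiabatic bounds, not a verbatim quotation of the paper's theorem): for a Hamiltonian path H(s), s = t/T, with spectral gap g(s), the error of the adiabatic approximation after physical time T obeys

```latex
\[
\bigl\| \psi_T(1) - \phi(1) \bigr\|
\;\le\; \frac{1}{T}\left(
    \frac{\|\dot H(0)\|}{g(0)^2}
  + \frac{\|\dot H(1)\|}{g(1)^2}
  + \int_0^1 \left(
      \frac{\|\ddot H(s)\|}{g(s)^2}
    + \frac{C\,\|\dot H(s)\|^2}{g(s)^3}
    \right) ds
\right),
\]
```

with C a numerical constant. For the linear interpolation H(s) = (1-s)H_0 + sH_1 used in adiabatic quantum computing, the derivative H_1 - H_0 is constant and the second derivative vanishes, so a runtime of order ||H_1 - H_0||^2 / (epsilon * g_min^3) suffices for error epsilon, which makes the gap dependence explicit.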

Journal ArticleDOI
TL;DR: Ailanthus altissima (tree of heaven), Simaroubaceae, is an early successional tree, native to China and North Vietnam, which has become invasive in Europe and on all other continents except Antarctica.
Abstract: Ailanthus altissima (tree of heaven), Simaroubaceae, is an early successional tree, native to China and North Vietnam, which has become invasive in Europe and on all other continents except Antarctica. It is most abundant in urban habitats and along transportation corridors, but can also invade natural habitats. This paper reviews the literature on the morphology, distribution, ecology, habitat requirements, population biology, genetics, physiology, impacts, management and uses of this species.

Journal ArticleDOI
TL;DR: In this article, the crucial process parameters electrical field strength, total pulse energy input and treatment temperature were investigated experimentally, and it was found that temperatures higher than 40°C can strongly increase the lethality of the PEF process.
Abstract: Preservation of liquid foods by high intensity pulsed electric fields (PEF) is an interesting alternative to traditional techniques like thermal pasteurization. Based on the underlying mechanism of action, in this paper the crucial process parameters electrical field strength, total pulse energy input and treatment temperature were investigated experimentally. Inactivation studies were performed with three bacteria (E. coli, Bacillus megaterium, Listeria innocua) and one yeast (Saccharomyces cerevisiae). Stainless steel and carbon electrodes have been tested to investigate their applicability as electrode material. Simulating the influence of cell size and orientation as well as the presence of agglomerations or insulating particles indicated that the applied field strength has to be increased above the critical one to achieve product safety. It was found that temperatures higher than 40 °C can strongly increase the lethality of the PEF process. In this way also small cells like Listeria are easily affected by pulsed fields even at a field strength as low as 16 kV cm⁻¹. In addition, heating of the product prior to PEF has the advantage that most of the required process energy can be recovered using heat exchangers. As an example, such a process is analyzed by an enthalpy balance.
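The enthalpy balance mentioned at the end of the abstract is easy to sketch: the electrical pulse energy is dissipated as heat in the product, so the adiabatic temperature rise follows from dT = w / cp. The numbers and function names below are illustrative, not taken from the paper.

```python
def pulse_energy_joule(capacitance_f, charge_voltage_v):
    """Energy stored in the pulse capacitor, E = C * U^2 / 2."""
    return 0.5 * capacitance_f * charge_voltage_v ** 2

def specific_energy_kj_per_kg(capacitance_f, charge_voltage_v, n_pulses, mass_kg):
    """Total specific pulse energy delivered to the treated product."""
    return (pulse_energy_joule(capacitance_f, charge_voltage_v)
            * n_pulses / mass_kg / 1000.0)

def temperature_rise_k(specific_energy, cp_kj_per_kg_k=4.18):
    """Adiabatic temperature rise from the enthalpy balance dT = w / cp
    (cp of water-like liquid foods is roughly 4.18 kJ/(kg K))."""
    return specific_energy / cp_kj_per_kg_k

# Illustrative example: a 1 uF capacitor charged to 20 kV stores 200 J per
# pulse; 100 pulses into 1 kg of product deliver 20 kJ/kg of specific energy.
w = specific_energy_kj_per_kg(1e-6, 20e3, 100, 1.0)
dT = temperature_rise_k(w)
```

A balance of this kind is why preheating plus heat-exchanger recovery is attractive: most of the sensible heat added before and during PEF can be returned to the incoming product.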


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the effects of planning and control on the performance of new product development (NPD) projects and found that proficiency in project planning and process management is an important predictor of NPD performance.

Journal ArticleDOI
TL;DR: This paper compares the expense of power semiconductors and passive components of a two-level, three-level neutral-point-clamped, three-level flying-capacitor, four-level flying-capacitor, and five-level series-connected H-bridge voltage source converter on the basis of state-of-the-art 6.5-, 3.3-, 2.5-, and 1.7-kV insulated gate bipolar transistors for industrial medium-voltage drives.
Abstract: This paper compares the expense of power semiconductors and passive components of a (2.3 kV, 2.4 MVA) two-level, three-level neutral-point-clamped, three-level flying-capacitor, four-level flying-capacitor, and five-level series-connected H-bridge voltage source converter on the basis of the state-of-the-art 6.5-, 3.3-, 2.5-, and 1.7-kV insulated gate bipolar transistors for industrial medium-voltage drives. The power semiconductor losses, the loss distribution, the installed switch power, the design of flying capacitors, and the components of a sine filter for retrofit applications are considered.

Journal ArticleDOI
TL;DR: DFH community, as described in this article, is a mechanism that coordinates multiple WRAN cells operating in the DFH mode, such that efficient frequency usage and reliable channel sensing are achieved.
Abstract: One of the key challenges of the emerging cognitive radio-based IEEE 802.22 wireless regional area networks (WRANs) is to address two apparently conflicting requirements: ensuring QoS satisfaction for WRAN services while providing reliable spectrum sensing for guaranteeing licensed user protection. To perform reliable sensing, in the basic operation mode on a single frequency band (non-hopping mode), one must allocate quiet times, that is, periodically interrupt data transmission that could impair the QoS of WRAN. This critical issue can be addressed by an alternative operation mode proposed in 802.22 called dynamic frequency hopping (DFH), where WRAN data transmission is performed in parallel with spectrum sensing without interruptions. DFH community, as described in this article, is a mechanism that coordinates multiple WRAN cells operating in the DFH mode, such that efficient frequency usage and reliable channel sensing are achieved. The key idea of DFH community is that neighboring WRAN cells form cooperating communities that coordinate their DFH operations.

Journal ArticleDOI
TL;DR: In this article, the five independent elastic moduli of single-crystalline graphite are determined using inelastic x-ray scattering, yielding an upper bound of 1.1 TPa for the on-axis Young's modulus of carbon nanotubes.
Abstract: The five independent elastic moduli of single-crystalline graphite are determined using inelastic x-ray scattering. At room temperature the elastic moduli are, in units of GPa, C11 = 1109, C12 = 139, C13 = 0, C33 = 38.7, and C44 = 4.95. Our experimental results are compared with predictions of ab initio calculations and previously reported incomplete and contradictory data sets. We obtain an upper limit of 1.1 TPa for the on-axis Young's modulus of a homogeneous carbon nanotube, thus providing important constraints for further theoretical advances and quantitative input to model elasticity in graphite nanotubes.

Journal ArticleDOI
02 Jul 2007-Small
TL;DR: This study investigates new composite materials made of gold nanorods adsorbed on thermoresponsive poly(N-isopropylacrylamide) (PNIPAM) microgels and shows that the thermally induced collapse of the polymer network inside the particles leads to a red shift of the longitudinal plasmon band of the gold rods, which is found to be fully reversible.
Abstract: Nanoparticles and in particular gold nanorods have interesting optical properties arising from two well-differentiated plasmon modes. The frequency of such modes can be altered by their chemical environment and coupling with neighboring rods. This study investigates new composite materials made of gold nanorods adsorbed on thermoresponsive poly(N-isopropylacrylamide) (PNIPAM) microgels. It is shown that the thermally induced collapse of the polymer network inside the particles leads to a red shift of the longitudinal plasmon band of the gold rods, which is found to be fully reversible.

Journal ArticleDOI
TL;DR: Using Rippa's Theorem, it is shown that, as claimed, Musin’s harmonic index provides an optimality criterion for Delaunay triangulations, and this can be used to prove that the edge flipping algorithm terminates also in the setting of piecewise flat surfaces.
Abstract: We define a discrete Laplace–Beltrami operator for simplicial surfaces (Definition 16). It depends only on the intrinsic geometry of the surface and its edge weights are positive. Our Laplace operator is similar to the well known finite-elements Laplacian (the so called "cotan formula") except that it is based on the intrinsic Delaunay triangulation of the simplicial surface. This leads to new definitions of discrete harmonic functions, discrete mean curvature, and discrete minimal surfaces. The definition of the discrete Laplace–Beltrami operator depends on the existence and uniqueness of Delaunay tessellations in piecewise flat surfaces. While the existence is known, we prove the uniqueness. Using Rippa's Theorem we show that, as claimed, Musin's harmonic index provides an optimality criterion for Delaunay triangulations, and this can be used to prove that the edge flipping algorithm terminates also in the setting of piecewise flat surfaces.
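The "cotan formula" the abstract refers to is compact enough to state in code. The sketch below assembles the standard cotan-weight Laplacian on a given triangulation; the paper's operator applies the same formula after first flipping to the intrinsic Delaunay triangulation, which is what guarantees positive edge weights (that flipping step is omitted here).

```python
import numpy as np

def cotan_laplacian(verts, faces):
    """Dense cotan-weight Laplacian L of a triangle mesh.

    verts : (n, 3) array of vertex positions
    faces : iterable of index triples
    Each interior edge (i, j) accumulates weight (cot a + cot b) / 2,
    where a and b are the angles opposite the edge in its two triangles.
    """
    n = len(verts)
    L = np.zeros((n, n))
    for tri in faces:
        for k in range(3):
            i, j, o = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
            u = verts[i] - verts[o]
            v = verts[j] - verts[o]
            # cot of the angle at vertex o, which lies opposite edge (i, j)
            w = 0.5 * u.dot(v) / np.linalg.norm(np.cross(u, v))
            L[i, j] -= w
            L[j, i] -= w
            L[i, i] += w
            L[j, j] += w
    return L
```

Discrete harmonic functions are then solutions of L u = 0 with prescribed boundary values, and applying L to the vertex positions yields a discrete mean curvature vector, matching the definitions the abstract lists.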

Journal ArticleDOI
09 Aug 2007-Nature
TL;DR: The notion that X-ray flash imaging can be used to achieve high resolution, beyond radiation damage limits, for biological samples is supported, and the technique is applied to monitor the dynamics of polystyrene spheres in intense free-electron-laser pulses.
Abstract: Extremely intense and ultrafast X-ray pulses from free-electron lasers offer unique opportunities to study fundamental aspects of complex transient phenomena in materials. Ultrafast time-resolved methods usually require highly synchronized pulses to initiate a transition and then probe it after a precisely defined time delay. In the X-ray regime, these methods are challenging because they require complex optical systems and diagnostics. Here we propose and apply a simple holographic measurement scheme, inspired by Newton's 'dusty mirror' experiment, to monitor the X-ray-induced explosion of microscopic objects. The sample is placed near an X-ray mirror; after the pulse traverses the sample, triggering the reaction, it is reflected back onto the sample by the mirror to probe this reaction. The delay is encoded in the resulting diffraction pattern to an accuracy of one femtosecond, and the structural change is holographically recorded with high resolution. We apply the technique to monitor the dynamics of polystyrene spheres in intense free-electron-laser pulses, and observe an explosion occurring well after the initial pulse. Our results support the notion that X-ray flash imaging can be used to achieve high resolution, beyond radiation damage limits for biological samples. With upcoming ultrafast X-ray sources we will be able to explore the three-dimensional dynamics of materials at the timescale of atomic motion.

Journal ArticleDOI
TL;DR: This work addresses the problem of minimum mean square error (MMSE) transceiver design for point-to-multipoint transmission in multiuser multiple-input-multiple-output (MIMO) systems and proposes two globally optimum algorithms based on convex optimization.
Abstract: We address the problem of minimum mean square error (MMSE) transceiver design for point-to-multipoint transmission in multiuser multiple-input-multiple-output (MIMO) systems. We focus on the problem of minimizing the downlink sum-MSE under a sum power constraint. It is shown that this problem can be solved efficiently by exploiting a duality between the downlink and uplink MSE feasible regions. We propose two different optimization frameworks for downlink MMSE transceiver design. The first one solves an equivalent uplink problem, then the result is transferred to the original downlink problem. Duality ensures that any uplink MMSE scheme, e.g., linear MMSE reception or MMSE-decision feedback equalization (DFE), has a downlink counterpart. We propose two globally optimum algorithms based on convex optimization. The basic idea of the second framework is to perform optimization in an alternating manner by switching between the virtual uplink and downlink channels. This strategy exploits that the same MSE can be achieved in both links for a given choice of transmit and receive filters. This iteration is proven to be convergent.

Proceedings Article
03 Dec 2007
TL;DR: This work defines features, based on a variant of the common spatial patterns (CSP) algorithm, that are constructed to be invariant with respect to nonstationarities by adding terms such as disturbance covariance matrices from fluctuations in visual processing.
Abstract: Brain-Computer Interfaces can suffer from a large variance of the subject conditions within and across sessions. For example vigilance fluctuations in the individual, variable task involvement, workload etc. alter the characteristics of EEG signals and thus challenge a stable BCI operation. In the present work we aim to define features based on a variant of the common spatial patterns (CSP) algorithm that are constructed invariant with respect to such nonstationarities. We enforce invariance properties by adding terms to the denominator of a Rayleigh coefficient representation of CSP such as disturbance covariance matrices from fluctuations in visual processing. In this manner physiological prior knowledge can be used to shape the classification engine for BCI. As a proof of concept we present a BCI classifier that is robust to changes in the level of parietal α-activity. In other words, the EEG decoding still works when there are lapses in vigilance.
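The construction described in the abstract, adding disturbance covariance terms to the denominator of the CSP Rayleigh quotient, can be sketched as a generalized eigenproblem. This is a schematic numpy version; the regularization weight xi and all names are illustrative, not the paper's exact formulation.

```python
import numpy as np

def invariant_csp(c1, c2, c_disturb, xi=1.0):
    """Maximize w' C1 w / w' (C1 + C2 + xi * Cd) w over spatial filters w.

    c1, c2    : class-wise covariance matrices (e.g. two imagery classes)
    c_disturb : covariance of a known disturbance (e.g. parietal alpha
                fluctuations); adding it to the denominator penalizes
                filters that pick up that activity.
    Returns eigenvalues (ascending) and the matching spatial filters.
    """
    denom = c1 + c2 + xi * c_disturb
    # Whiten the denominator, then diagonalize C1 in the whitened space.
    d, V = np.linalg.eigh(denom)
    W = V @ np.diag(d ** -0.5) @ V.T          # denom^{-1/2}
    evals, U = np.linalg.eigh(W @ c1 @ W)
    return evals, W @ U                       # columns are spatial filters
```

In practice c1 and c2 are estimated from band-pass-filtered training epochs, and c_disturb from recordings that isolate the disturbance; this is how the physiological prior knowledge mentioned in the abstract enters the classifier.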

Journal ArticleDOI
TL;DR: In this article, a large variety of realistic QD geometries and composition profiles were studied, and the linear and quadratic parts of the piezoelectric field were calculated.
Abstract: The strain fields in and around self-organized In(Ga)As/GaAs quantum dots (QDs) sensitively depend on QD geometry, average InGaAs composition, and the In/Ga distribution profile. Piezoelectric fields of varying sizes are one result of these strain fields. We study systematically a large variety of realistic QD geometries and composition profiles, and calculate the linear and quadratic parts of the piezoelectric field. The balance of the two orders depends strongly on the QD shape and composition. For pyramidal InAs QDs with sharp interfaces, a strong dominance of the second-order fields is found. Upon annealing, the first-order terms become dominant, resulting in a reordering of the electron p and d states and a reorientation of the hole wave functions.

Journal ArticleDOI
TL;DR: In this paper, the influence of innovator roles in highly innovative ventures is studied and the authors take into account the degree of innovativeness as a moderating variable to obtain a differentiated picture.
Abstract: In this paper, we study the influence of innovator roles in highly innovative ventures. In order to obtain a differentiated picture we take into account the degree of innovativeness as a moderating variable. To test our hypotheses we use a sample of 146 highly innovative new product development projects. We choose a rigorous sampling design and apply state-of-the-art measures for the degree of innovativeness. Furthermore, we apply multi-trait-multimethod methodology (MTMM) to enhance the validity of our study. The results show that innovator roles have a strong influence on innovation success but these influences are positively and negatively moderated by innovativeness. The moderating influences depend on the type of innovativeness. Remarkably, with increasing technological innovativeness innovator roles which create inter-organizational links with the outside world appear to be more important than intra-organizational linker roles, and support from high-ranked organizational members turns out to have a significant negative effect on project success with higher degrees of technological innovativeness. Possible explanations for these findings are discussed and consequences for innovation research and innovation management are shown.

Proceedings ArticleDOI
29 Jul 2007
TL;DR: It is shown how to design geometrically optimal shapes and how to find a meaningful meshing and beam layout for existing shapes, based on mesh parallelism, which is also the main ingredient in a novel discrete theory of curvatures.
Abstract: The geometric challenges in the architectural design of freeform shapes come mainly from the physical realization of beams and nodes. We approach them via the concept of parallel meshes, and present methods of computation and optimization. We discuss planar faces, beams of controlled height, node geometry, and multilayer constructions. Beams of constant height are achieved with the new type of edge offset meshes. Mesh parallelism is also the main ingredient in a novel discrete theory of curvatures. These methods are applied to the construction of quadrilateral, pentagonal and hexagonal meshes, discrete minimal surfaces, discrete constant mean curvature surfaces, and their geometric transforms. We show how to design geometrically optimal shapes, and how to find a meaningful meshing and beam layout for existing shapes.

Journal ArticleDOI
TL;DR: Evidence that community-based dengue control programmes alone and in combination with other control activities can enhance the effectiveness of dengue control programmes is weak.
Abstract: Summary Owing to increased epidemic activity and difficulties in controlling the insect vector, dengue has become a major public health problem in many parts of the tropics. The objective of this review is to analyse evidence regarding the achievements of community-based dengue control programmes. Medline, EMBASE, WHOLIS and the Cochrane Database of Systematic Reviews were searched (all to March 2005) to identify potentially relevant articles using keywords such as ‘Aedes’, ‘dengue’, ‘breeding habits’, ‘housing’ and ‘community intervention’. According to the evaluation criteria recommended by the Cochrane Effective Practice and Organisation of Care Review Group, only studies that met the inclusion criteria of randomised controlled trials (RCT), controlled clinical trials (CCT), controlled before and after trials (CBA) or interrupted time series (ITS) were included. Eleven of 1091 studies met the inclusion criteria. Of these, two were RCTs, six were CBAs and three were ITS. The selected studies varied widely with respect to target groups, intervention procedures and outcome measurements. Six studies combined community participation programmes with dengue control tools. Methodological weaknesses were found in all studies: only two papers reported confidence intervals (95% CI); five studies reported P-values; two studies recognised the importance of water container productivity as a measure for vector density; in no study was cluster randomisation attempted; and in no study were costs and sustainability assessed. Evidence that community-based dengue control programmes alone and in combination with other control activities can enhance the effectiveness of dengue control programmes is weak.