
Showing papers from ETH Zurich published in 2007


Journal ArticleDOI
TL;DR: In this paper, the authors present a new molecular dynamics algorithm for sampling the canonical distribution, where the velocities of all the particles are rescaled by a properly chosen random factor.
Abstract: The authors present a new molecular dynamics algorithm for sampling the canonical distribution. In this approach the velocities of all the particles are rescaled by a properly chosen random factor. The algorithm is formally justified and it is shown that, in spite of its stochastic nature, a quantity can still be defined that remains constant during the evolution. In numerical applications this quantity can be used to measure the accuracy of the sampling. The authors illustrate the properties of this new method on Lennard-Jones and TIP4P water models in the solid and liquid phases. Its performance is excellent and largely independent of the thermostat parameter also with regard to the dynamic properties.
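
The essential move, rescaling every velocity by one stochastic factor so that the kinetic energy samples its canonical distribution, can be sketched as follows. This is a deliberately simplified illustration, not the authors' propagation scheme: it resamples the target kinetic energy from the canonical (gamma) distribution at every call instead of evolving it smoothly, and the function and argument names are hypothetical.

```python
import numpy as np

def rescale_velocities(v, masses, temperature, rng, kB=1.380649e-23):
    """Toy canonical velocity rescaling (hypothetical helper, not the paper's
    integrator): draw a target kinetic energy from its canonical distribution
    and rescale all velocities by a single stochastic factor."""
    ndof = v.size                                    # 3N degrees of freedom
    kin = 0.5 * np.sum(masses[:, None] * v**2)       # current kinetic energy
    # canonical kinetic-energy distribution: Gamma(ndof/2, kB*T)
    kin_target = rng.gamma(shape=ndof / 2.0, scale=kB * temperature)
    alpha = np.sqrt(kin_target / kin)                # single rescaling factor
    return alpha * v
```

In the paper's formulation the rescaling is applied gradually, controlled by a thermostat relaxation parameter, which is what keeps the dynamics weakly perturbed and makes the conserved quantity available as an accuracy check.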

11,327 citations


Journal ArticleDOI
28 Sep 2007-Science
TL;DR: Experimental progress in exploration of the specific influence of carbon-fluorine single bonds on docking interactions is reviewed and complementary analysis based on comprehensive searches in the Cambridge Structural Database and the Protein Data Bank is added.
Abstract: Fluorine substituents have become a widespread and important drug component, their introduction facilitated by the development of safe and selective fluorinating agents. Organofluorine affects nearly all physical and adsorption, distribution, metabolism, and excretion properties of a lead compound. Its inductive effects are relatively well understood, enhancing bioavailability, for example, by reducing the basicity of neighboring amines. In contrast, exploration of the specific influence of carbon-fluorine single bonds on docking interactions, whether through direct contact with the protein or through stereoelectronic effects on molecular conformation of the drug, has only recently begun. Here, we review experimental progress in this vein and add complementary analysis based on comprehensive searches in the Cambridge Structural Database and the Protein Data Bank.

4,906 citations


Journal ArticleDOI
TL;DR: A new kind of neural-network representation of DFT potential-energy surfaces is introduced, which provides the energy and forces as a function of all atomic positions in systems of arbitrary size and is several orders of magnitude faster than DFT.
Abstract: The accurate description of chemical processes often requires the use of computationally demanding methods like density-functional theory (DFT), making long simulations of large systems unfeasible. In this Letter we introduce a new kind of neural-network representation of DFT potential-energy surfaces, which provides the energy and forces as a function of all atomic positions in systems of arbitrary size and is several orders of magnitude faster than DFT. The high accuracy of the method is demonstrated for bulk silicon and compared with empirical potentials and DFT. The method is general and can be applied to all types of periodic and nonperiodic systems.
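
The construction can be pictured as a sum of per-atom network outputs over local-environment descriptors. The sketch below is schematic: `descriptor_fn` and `atomic_nn` are hypothetical placeholders for the environment descriptors and the trained network, and forces are obtained here by a plain finite-difference gradient rather than the analytic derivatives used in practice.

```python
import numpy as np

def total_energy(positions, descriptor_fn, atomic_nn):
    """Schematic high-dimensional NN potential: E = sum_i E_i(G_i), where G_i
    encodes the local environment of atom i. Both arguments are placeholders."""
    return sum(atomic_nn(descriptor_fn(i, positions)) for i in range(len(positions)))

def forces(positions, descriptor_fn, atomic_nn, h=1e-4):
    """Forces as the negative numerical gradient of the total energy."""
    F = np.zeros_like(positions)
    for i in range(positions.shape[0]):
        for k in range(positions.shape[1]):
            plus, minus = positions.copy(), positions.copy()
            plus[i, k] += h
            minus[i, k] -= h
            F[i, k] = -(total_energy(plus, descriptor_fn, atomic_nn)
                        - total_energy(minus, descriptor_fn, atomic_nn)) / (2 * h)
    return F
```

Because the energy is a sum of atomic contributions, the same trained network applies to systems of arbitrary size, which is the property the abstract emphasizes.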

2,778 citations


Journal ArticleDOI
TL;DR: The Cosmic Evolution Survey (COSMOS) as mentioned in this paper is designed to probe the correlated evolution of galaxies, star formation, active galactic nuclei (AGNs), and dark matter (DM) with large-scale structure (LSS) over the redshift range z > 0.5-6.
Abstract: The Cosmic Evolution Survey (COSMOS) is designed to probe the correlated evolution of galaxies, star formation, active galactic nuclei (AGNs), and dark matter (DM) with large-scale structure (LSS) over the redshift range z > 0.5-6. The survey includes multiwavelength imaging and spectroscopy from X-ray-to-radio wavelengths covering a 2 deg^2 area, including HST imaging. Given the very high sensitivity and resolution of these data sets, COSMOS also provides unprecedented samples of objects at high redshift with greatly reduced cosmic variance, compared to earlier surveys. Here we provide a brief overview of the survey strategy, the characteristics of the major COSMOS data sets, and a summary the science goals.

1,848 citations


Journal ArticleDOI
B. Rankov, Armin Wittneben
TL;DR: Two new half-duplex relaying protocols are proposed that avoid the pre-log factor one-half in corresponding capacity expressions and it is shown that both protocols recover a significant portion of the half- duplex loss.
Abstract: We study two-hop communication protocols where one or several relay terminals assist in the communication between two or more terminals. All terminals operate in half-duplex mode, hence the transmission of one information symbol from the source terminal to the destination terminal occupies two channel uses. This leads to a loss in spectral efficiency due to the pre-log factor one-half in corresponding capacity expressions. We propose two new half-duplex relaying protocols that avoid the pre-log factor one-half. Firstly, we consider a relaying protocol where a bidirectional connection between two terminals is established via one amplify-and-forward (AF) or decode-and-forward (DF) relay (two-way relaying). We also extend this protocol to a multi-user scenario, where multiple terminals communicate with multiple partner terminals via several orthogonalize-and-forward (OF) relay terminals, i.e., the relays orthogonalize the different two-way transmissions by a distributed zero-forcing algorithm. Secondly, we propose a relaying protocol where two relays, either AF or DF, alternately forward messages from a source terminal to a destination terminal (two-path relaying). It is shown that both protocols recover a significant portion of the half-duplex loss.
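
The pre-log bookkeeping behind the half-duplex loss can be illustrated with a toy rate comparison. The sketch below assumes equal SNR on all links, ideal decode-and-forward and no interference, so it only reproduces the counting argument of the abstract, not the paper's capacity analysis.

```python
import numpy as np

def one_way_two_hop_rate(snr):
    """Conventional half-duplex two-hop relaying: one message delivered per
    two channel uses, hence the pre-log factor 1/2."""
    return 0.5 * np.log2(1 + snr)

def two_way_sum_rate(snr):
    """Idealized two-way relaying: one message per direction is exchanged in
    the same two channel uses, so the factor 1/2 cancels in the sum rate."""
    return 2 * 0.5 * np.log2(1 + snr)

snr = 10.0 ** (10.0 / 10.0)                # 10 dB
print(one_way_two_hop_rate(snr))           # ~1.73 bit/s/Hz
print(two_way_sum_rate(snr))               # ~3.46 bit/s/Hz
```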

1,728 citations


Journal ArticleDOI
22 Feb 2007-Nature
TL;DR: Observations unequivocally show that quantum information tasks are achievable in solid-state cavity QED by observing quantum correlations in photoluminescence from a photonic crystal nanocavity interacting with one, and only one, quantum dot located precisely at the cavity electric field maximum.
Abstract: Cavity quantum electrodynamics (QED) studies the interaction between a quantum emitter and a single radiation-field mode. When an atom is strongly coupled to a cavity mode, it is possible to realize important quantum information processing tasks, such as controlled coherent coupling and entanglement of distinguishable quantum systems. Realizing these tasks in the solid state is clearly desirable, and coupling semiconductor self-assembled quantum dots to monolithic optical cavities is a promising route to this end. However, validating the efficacy of quantum dots in quantum information applications requires confirmation of the quantum nature of the quantum-dot-cavity system in the strong-coupling regime. Here we find such confirmation by observing quantum correlations in photoluminescence from a photonic crystal nanocavity interacting with one, and only one, quantum dot located precisely at the cavity electric field maximum. When off-resonance, photon emission from the cavity mode and quantum-dot excitons is anticorrelated at the level of single quanta, proving that the mode is driven solely by the quantum dot despite an energy mismatch between cavity and excitons. When tuned to resonance, the exciton and cavity enter the strong-coupling regime of cavity QED and the quantum-dot exciton lifetime reduces by a factor of 145. The generated photon stream becomes antibunched, proving that the strongly coupled exciton/photon system is in the quantum regime. Our observations unequivocally show that quantum information tasks are achievable in solid-state cavity QED.
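
For context, the photon-correlation statements above are conventionally expressed through the second-order correlation function; the relations below are the textbook criteria, not definitions or values quoted from the paper:

g^{(2)}(\tau) = \frac{\langle \hat a^{\dagger}(t)\,\hat a^{\dagger}(t+\tau)\,\hat a(t+\tau)\,\hat a(t)\rangle}{\langle \hat a^{\dagger}\hat a\rangle^{2}}, \qquad g^{(2)}(0) < 1 \ \text{(antibunching)}, \qquad g^{(2)}(0) < \tfrac{1}{2} \ \text{(single quantum emitter)}.

Cross-correlations and autocorrelations below these classical bounds are what underlie the statements that the emission is "anticorrelated at the level of single quanta" and antibunched.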

1,679 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a comprehensive framework for the environmental assessment of solvents that covers major aspects of the environmental performance of solvents in chemical production, as well as important health and safety issues.

1,363 citations


Journal ArticleDOI
27 Sep 2007-Nature
TL;DR: These experiments show that two nearby qubits can be readily coupled with local interactions, and show the implementation of a quantum bus, using microwave photons confined in a transmission line cavity, to couple two superconducting qubits on opposite sides of a chip.
Abstract: Microfabricated superconducting circuit elements can harness the power of quantum behaviour for information processing. Unlike classical information bits, quantum information bits (qubits) can form superpositions or mixture states of ON and OFF, offering a faster, natural form of parallel processing. Previously, direct qubit–qubit coupling has been achieved for up to four qubits, but now two independent groups demonstrate the next crucial step: communication and exchange of quantum information between two superconducting qubits via a quantum bus, in the form of a resonant cavity formed by a superconducting transmission line a few millimetres long. Using this microwave cavity it is possible to store, transfer and exchange quantum information between two quantum bits. It can also perform multiplexed qubit readout. This basic architecture lends itself to expansion, offering the possibility for the coherent interaction of many superconducting qubits. The cover illustrates a zig-zag-shaped resonant cavity or quantum bus linking two superconducting phase qubits. One of two papers that demonstrate the communication of individual quantum states between superconducting qubits via a quantum bus. This quantum bus is a resonant cavity formed by a superconducting transmission line of several millimetres. Quantum information, initially defined in one qubit on one end, can be stored in this quantum bus and at a later time retrieved by a second qubit at the other end. Superconducting circuits are promising candidates for constructing quantum bits (qubits) in a quantum computer; single-qubit operations are now routine [1, 2], and several examples [3-9] of two-qubit interactions and gates have been demonstrated. These experiments show that two nearby qubits can be readily coupled with local interactions. Performing gate operations between an arbitrary pair of distant qubits is highly desirable for any quantum computer architecture, but has not yet been demonstrated. An efficient way to achieve this goal is to couple the qubits to a ‘quantum bus’, which distributes quantum information among the qubits. Here we show the implementation of such a quantum bus, using microwave photons confined in a transmission line cavity, to couple two superconducting qubits on opposite sides of a chip. The interaction is mediated by the exchange of virtual rather than real photons, avoiding cavity-induced loss. Using fast control of the qubits to switch the coupling effectively on and off, we demonstrate coherent transfer of quantum states between the qubits. The cavity is also used to perform multiplexed control and measurement of the qubit states. This approach can be expanded to more than two qubits, and is an attractive architecture for quantum information processing on a chip.
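
The phrase "exchange of virtual rather than real photons" refers to the dispersive regime, in which the cavity-mediated qubit-qubit coupling is usually written as below; this is the standard textbook expression for two qubits detuned from a common cavity mode, added for orientation rather than quoted from the paper:

J = \frac{g_1 g_2}{2}\left(\frac{1}{\Delta_1} + \frac{1}{\Delta_2}\right), \qquad \Delta_i = \omega_{q,i} - \omega_r,

where g_i are the qubit-cavity coupling strengths, \omega_{q,i} the qubit transition frequencies and \omega_r the cavity frequency. Because the detunings \Delta_i are large, the cavity is only virtually populated, which is why cavity-induced photon loss is suppressed.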

1,248 citations


Journal ArticleDOI
TL;DR: Although technical challenges limit the amount of bioavailable iron compounds that can be used in food fortification, studies show that iron fortification can be an effective strategy against nutritional iron deficiency.

1,192 citations


Journal ArticleDOI
TL;DR: The zCOSMOS-bright survey as discussed by the authors is a large-redshift survey that is being undertaken in the COSMOS field using 600 hr of observation with the VIMOS spectrograph on the 8 m VLT.
Abstract: zCOSMOS is a large-redshift survey that is being undertaken in the COSMOS field using 600 hr of observation with the VIMOS spectrograph on the 8 m VLT. The survey is designed to characterize the environments of COSMOS galaxies from the 100 kpc scales of galaxy groups up to the 100 Mpc scale of the cosmic web and to produce diagnostic information on galaxies and active galactic nuclei. The zCOSMOS survey consists of two parts: (1) zCOSMOS-bright, a magnitude-limited I-band I_(AB) < 22.5 sample of about 20,000 galaxies with 0.1 < z < 1.2 covering the whole 1.7 deg^2 COSMOS ACS field, for which the survey parameters at z ~ 0.7 are designed to be directly comparable to those of the 2dFGRS at z ~ 0.1; and (2) zCOSMOS-deep, a survey of approximately 10,000 galaxies selected through color-selection criteria to have 1.4 < z < 3.0, within the central 1 deg^2. This paper describes the survey design and the construction of the target catalogs and briefly outlines the observational program and the data pipeline. In the first observing season, spectra of 1303 zCOSMOS-bright targets and 977 zCOSMOS-deep targets have been obtained. These are briefly analyzed to demonstrate the characteristics that may be expected from zCOSMOS, and particularly zCOSMOS-bright, when it is finally completed between 2008 and 2009. The power of combining spectroscopic and photometric redshifts is demonstrated, especially in correctly identifying the emission line in single-line spectra and in determining which of the less reliable spectroscopic redshifts are correct and which are incorrect. These techniques bring the overall success rate in zCOSMOS-bright so far to almost 90% and to above 97% in the 0.5 < z < 0.8 redshift range. Our zCOSMOS-deep spectra demonstrate the power of our selection techniques to isolate high-redshift galaxies at 1.4 < z < 3.0 and of VIMOS to measure their redshifts using ultraviolet absorption lines.

1,026 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a project named "Vision of Future Energy Networks", which aims at a greenfield approach for future energy systems; the definition of energy hubs and the conception of combined interconnector devices represent key approaches towards a multicarrier greenfield layout.
Abstract: This paper presents a project named "Vision of Future Energy Networks", which aims at a greenfield approach for future energy systems. The definition of energy hubs and the conception of combined interconnector devices represent key approaches towards a multicarrier greenfield layout. Models and tools for technical, economical and environmental investigations in multicarrier energy systems have been developed and used in various case studies.
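
Energy hubs of this kind are commonly described by a coupling matrix that maps input carrier powers to output loads through converter efficiencies and dispatch factors; a generic two-carrier form of this description (an illustration of the modeling idea, not an equation reproduced from the paper) is

\begin{pmatrix} L_e \\ L_h \end{pmatrix} = \begin{pmatrix} c_{ee} & c_{ge} \\ c_{eh} & c_{gh} \end{pmatrix} \begin{pmatrix} P_e \\ P_g \end{pmatrix},

where P_e and P_g are the electricity and gas inputs, L_e and L_h the electric and heat loads, and the entries c combine converter efficiencies with the dispatch of each input among the converters inside the hub.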

Journal ArticleDOI
G. L. Bayatian, S. Chatrchyan, G. Hmayakyan, Albert M. Sirunyan, and 2060 more authors (143 institutions)
TL;DR: In this article, the authors present the strategy of the CMS experiment to explore the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC), demonstrated through detailed analyses of benchmark processes.
Abstract: CMS is a general purpose experiment, designed to study the physics of pp collisions at 14 TeV at the Large Hadron Collider (LHC). It currently involves more than 2000 physicists from more than 150 institutes and 37 countries. The LHC will provide extraordinary opportunities for particle physics based on its unprecedented collision energy and luminosity when it begins operation in 2007. The principal aim of this report is to present the strategy of CMS to explore the rich physics programme offered by the LHC. This volume demonstrates the physics capability of the CMS experiment. The prime goals of CMS are to explore physics at the TeV scale and to study the mechanism of electroweak symmetry breaking, through the discovery of the Higgs particle or otherwise. To carry out this task, CMS must be prepared to search for new particles, such as the Higgs boson or supersymmetric partners of the Standard Model particles, from the start-up of the LHC, since new physics at the TeV scale may manifest itself with modest data samples of the order of a few fb^-1 or less. The analysis tools that have been developed are applied to study in great detail and with all the methodology of performing an analysis on CMS data specific benchmark processes upon which to gauge the performance of CMS. These processes cover several Higgs boson decay channels, the production and decay of new particles such as Z' and supersymmetric particles, B_s production and processes in heavy ion collisions. The simulation of these benchmark processes includes subtle effects such as possible detector miscalibration and misalignment. Besides these benchmark processes, the physics reach of CMS is studied for a large number of signatures arising in the Standard Model and also in theories beyond the Standard Model for integrated luminosities ranging from 1 fb^-1 to 30 fb^-1. The Standard Model processes include QCD, B-physics, diffraction, detailed studies of the top quark properties, and electroweak physics topics such as the W and Z^0 boson properties. The production and decay of the Higgs particle is studied for many observable decays, and the precision with which the Higgs boson properties can be derived is determined. About ten different supersymmetry benchmark points are analysed using full simulation. The CMS discovery reach is evaluated in the SUSY parameter space covering a large variety of decay signatures. Furthermore, the discovery reach for a plethora of alternative models for new physics is explored, notably extra dimensions, new vector boson high mass states, little Higgs models, technicolour and others. Methods to discriminate between models have been investigated. This report is organized as follows. Chapter 1, the Introduction, describes the context of this document. Chapters 2-6 describe examples of full analyses, with photons, electrons, muons, jets, missing E_T, B-mesons and τ's, and for quarkonia in heavy ion collisions. Chapters 7-15 describe the physics reach for Standard Model processes, Higgs discovery and searches for new physics beyond the Standard Model.

Journal ArticleDOI
TL;DR: The Gaussian mixture representation of a multivariate t distribution is used as a starting point to construct two new copulas, the skewed t copula and the grouped t copula, which allow more heterogeneity in the modelling of dependent observations.
Abstract: The t copula and its properties are described with a focus on issues related to the dependence of extreme values. The Gaussian mixture representation of a multivariate t distribution is used as a starting point to construct two new copulas, the skewed t copula and the grouped t copula, which allow more heterogeneity in the modelling of dependent observations. Extreme value considerations are used to derive two further new copulas: the t extreme value copula is the limiting copula of componentwise maxima of t distributed random vectors; the t lower tail copula is the limiting copula of bivariate observations from a t distribution that are conditioned to lie below some joint threshold that is progressively lowered. Both these copulas may be approximated for practical purposes by simpler, better-known copulas, these being the Gumbel and Clayton copulas respectively.
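
The Gaussian (normal variance) mixture representation mentioned in the abstract also yields a direct sampler for the ordinary t copula, which is a convenient way to see the construction; the sketch below uses illustrative inputs (the correlation matrix and degrees of freedom are not values from the paper):

```python
import numpy as np
from scipy import stats

def sample_t_copula(n, corr, df, seed=None):
    """Sample from a t copula via the variance-mixture representation
    X = Z / sqrt(W/df), with Z ~ N(0, corr) and W ~ chi^2_df."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(corr)
    Z = rng.standard_normal((n, corr.shape[0])) @ L.T   # correlated Gaussians
    W = rng.chisquare(df, size=(n, 1))                  # common mixing variable
    X = Z / np.sqrt(W / df)                             # multivariate t sample
    return stats.t.cdf(X, df)                           # map margins to [0, 1]

U = sample_t_copula(1000, np.array([[1.0, 0.7], [0.7, 1.0]]), df=4, seed=0)
```

The shared mixing variable W is what gives the t copula its tail dependence; the skewed and grouped variants of the paper start from exactly this mixture construction.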

Journal ArticleDOI
TL;DR: It is shown that conjugation to bile acids and long-chain fatty acids, in addition to cholesterol, mediates siRNA uptake into cells and gene silencing in vivo and can be exploited to optimize therapeutic siRNA delivery.
Abstract: Cholesterol-conjugated siRNAs can silence gene expression in vivo. Here we synthesize a variety of lipophilic siRNAs and use them to elucidate the requirements for siRNA delivery in vivo. We show that conjugation to bile acids and long-chain fatty acids, in addition to cholesterol, mediates siRNA uptake into cells and gene silencing in vivo. Efficient and selective uptake of these siRNA conjugates depends on interactions with lipoprotein particles, lipoprotein receptors and transmembrane proteins. High-density lipoprotein (HDL) directs siRNA delivery into liver, gut, kidney and steroidogenic organs, whereas low-density lipoprotein (LDL) targets siRNA primarily to the liver. LDL-receptor expression is essential for siRNA delivery by LDL particles, and SR-BI receptor expression is required for uptake of HDL-bound siRNAs. Cellular uptake also requires the mammalian homolog of the Caenorhabditis elegans transmembrane protein Sid1. Our results demonstrate that conjugation to lipophilic molecules enables effective siRNA uptake through a common mechanism that can be exploited to optimize therapeutic siRNA delivery.

Journal ArticleDOI
TL;DR: A statistical perspective on boosting is presented, with special emphasis on estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis.
Abstract: We present a statistical perspective on boosting. Special emphasis is given to estimating potentially complex parametric or nonparametric models, including generalized linear and additive models as well as regression models for survival analysis. Concepts of degrees of freedom and corresponding Akaike or Bayesian information criteria, particularly useful for regularization and variable selection in high-dimensional covariate spaces, are discussed as well. The practical aspects of boosting procedures for fitting statistical models are illustrated by means of the dedicated open-source software package mboost. This package implements functions which can be used for model fitting, prediction and variable selection. It is flexible, allowing for the implementation of new boosting algorithms optimizing user-specified loss functions.
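
The style of boosting discussed here can be conveyed by componentwise L2 boosting for a linear model: each iteration fits the current residuals with the single best covariate and takes a small shrunken step. The Python sketch below is a generic illustration only, assuming centered covariates; it does not use or mirror the R interface of the mboost package mentioned in the abstract.

```python
import numpy as np

def l2_boost(X, y, n_steps=200, nu=0.1):
    """Componentwise least-squares (L2) boosting sketch for a linear model.
    Assumes the columns of X are centered; nu is the shrinkage step length."""
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()
    f = np.full(n, intercept)
    for _ in range(n_steps):
        r = y - f                                  # residuals = negative gradient of squared loss
        b = X.T @ r / (X ** 2).sum(axis=0)         # least-squares fit of each single covariate
        sse = ((r[:, None] - X * b) ** 2).sum(axis=0)
        j = int(np.argmin(sse))                    # best-fitting covariate this round
        coef[j] += nu * b[j]                       # shrunken coefficient update
        f += nu * b[j] * X[:, j]
    return intercept, coef
```

Early stopping of the number of steps acts as the regularization, which is where the degrees-of-freedom and information-criterion machinery described in the abstract comes in.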

Journal ArticleDOI
TL;DR: The present results indicate that the particles could efficiently enter the cells by a Trojan-horse type mechanism which provoked an up to eight times higher oxidative stress in the case of cobalt or manganese if compared to reference cultures exposed to aqueous solutions of the same metals.
Abstract: The chemical and catalytic activity of nanoparticles has strongly contributed to the current tremendous interest in engineered nanomaterials and often serves as a guiding principle for the design of functional materials. Since it has most recently become evident that such active materials can enter into cells or organisms, the present study investigates the level of intracellular oxidations after exposure to iron-, cobalt-, manganese-, and titania-containing silica nanoparticles and the corresponding pure oxides in vitro. The resulting oxidative stress was quantitatively measured as the release of reactive oxygen species (ROS). The use of thoroughly characterized nanoparticles of the same morphology, comparable size, shape, and degree of agglomeration allowed separation of physical (rate of particle uptake, agglomeration, sedimentation) and chemical effects (oxidations). Three sets of control experiments elucidated the role of nanoparticles as carriers for heavy metal uptake and excluded a potential interference of the biological assay with the nanomaterial. The present results indicate that the particles could efficiently enter the cells by a Trojan-horse type mechanism which provoked an up to eight times higher oxidative stress in the case of cobalt or manganese if compared to reference cultures exposed to aqueous solutions of the same metals. A systematic investigation on iron-containing nanoparticles as used in industrial fine chemical synthesis demonstrated that the presence of catalytic activity could strongly alter the damaging action of a nanomaterial. This indicates that a proactive development of nanomaterials and their risk assessment should consider chemical and catalytic properties of nanomaterials beyond a mere focus on physical properties such as size, shape, and degree of agglomeration.

Journal ArticleDOI
TL;DR: It is shown here that the application of propidium iodide in combination with a green fluorescent total nucleic acid stain on UVA-irradiated cells of Escherichia coli, Salmonella enterica serovar Typhimurium, Shigella flexneri, and a community of freshwater bacteria resulted in a clear and distinctive flow cytometric staining pattern.
Abstract: The commercially available LIVE/DEAD BacLight kit is enjoying increased popularity among researchers in various fields of microbiology. Its use in combination with flow cytometry brought up new questions about how to interpret LIVE/DEAD staining results. Intermediate states, normally difficult to detect with epifluorescence microscopy, are a common phenomenon when the assay is used in flow cytometry and still lack rationale. It is shown here that the application of propidium iodide in combination with a green fluorescent total nucleic acid stain on UVA-irradiated cells of Escherichia coli, Salmonella enterica serovar Typhimurium, Shigella flexneri, and a community of freshwater bacteria resulted in a clear and distinctive flow cytometric staining pattern. In the gram-negative bacterium E. coli as well as in the two enteric pathogens, the pattern can be related to the presence of intermediate cellular states characterized by the degree of damage afflicted specifically on the bacterial outer membrane. This hypothesis is supported by the fact that EDTA-treated nonirradiated cells exhibit the same staining properties. On the contrary, this pattern was not observed in gram-positive Enterococcus faecalis, which lacks an outer membrane. Our observations add a new aspect to the LIVE/DEAD stain, which so far was believed to be dependent only on cytoplasmic membrane permeability.

Journal ArticleDOI
TL;DR: In this paper, the authors presented imaging data and photometry for the COSMOS survey in 15 photometric bands between 0.3 and 2.4 μm, including data taken on the Subaru 8.3 m telescope, the KPNO and CTIO 4 m telescopes, and the CFHT 3.6 m telescope.
Abstract: We present imaging data and photometry for the COSMOS survey in 15 photometric bands between 0.3 and 2.4 μm. These include data taken on the Subaru 8.3 m telescope, the KPNO and CTIO 4 m telescopes, and the CFHT 3.6 m telescope. Special techniques are used to ensure that the relative photometric calibration is better than 1% across the field of view. The absolute photometric accuracy from standard-star measurements is found to be 6%. The absolute calibration is corrected using galaxy spectra, providing colors accurate to 2% or better. Stellar and galaxy colors and counts agree well with the expected values. Finally, as the first step in the scientific analysis of these data we construct panchromatic number counts which confirm that both the geometry of the universe and the galaxy population are evolving.

Journal ArticleDOI
TL;DR: The role of land surface-related processes and feedbacks during the record-breaking 2003 European summer heat wave is explored with a regional climate model in this article, where sensitivity experiments are performed by perturbing spring soil moisture in order to determine its influence on the formation of the heat wave.
Abstract: The role of land surface–related processes and feedbacks during the record-breaking 2003 European summer heat wave is explored with a regional climate model. All simulations are driven by lateral boundary conditions and sea surface temperatures from the ECMWF operational analysis and 40-yr ECMWF ReAnalysis (ERA-40), thereby prescribing the large-scale circulation. In particular, the contribution of soil moisture anomalies and their interactions with the atmosphere through latent and sensible heat fluxes is investigated. Sensitivity experiments are performed by perturbing spring soil moisture in order to determine its influence on the formation of the heat wave. A multiyear regional climate simulation for 1970–2000 using a fixed model setup is used as the reference period. A large precipitation deficit together with early vegetation green-up and strong positive radiative anomalies in the months preceding the extreme summer event contributed to an early and rapid loss of soil moisture, which exceeded the multiyear average by far. The exceptionally high temperature anomalies, most pronounced in June and August 2003, were initiated by persistent anticyclonic circulation anomalies that enabled a dominance of the local heat balance. In this experiment the hottest phase in early August is realistically simulated despite the absence of an anomaly in total surface net radiation. This indicates an important role of the partitioning of net radiation in latent and sensible heat fluxes, which is to a large extent controlled by soil moisture. The lack of soil moisture strongly reduced latent cooling and thereby amplified the surface temperature anomalies. The evaluation of the experiments with perturbed spring soil moisture shows that this quantity is an important parameter for the evolution of European heat waves. Simulations indicate that without soil moisture anomalies the summer heat anomalies could have been reduced by around 40% in some regions. Moreover, drought conditions are revealed to influence the tropospheric circulation by producing a surface heat low and enhanced ridging in the midtroposphere. This suggests a positive feedback mechanism between soil moisture, continental-scale circulation, and temperature.
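
The partitioning argument can be made explicit with the standard surface energy balance (a textbook relation used here for orientation, not a formula from the paper):

R_n = \lambda E + H + G, \qquad EF = \frac{\lambda E}{\lambda E + H},

where R_n is net radiation, \lambda E the latent heat flux, H the sensible heat flux, G the ground heat flux and EF the evaporative fraction. When soil moisture is depleted, \lambda E is limited, EF falls, and a larger share of R_n is released as sensible heat that warms the near-surface air; this is the latent-cooling deficit to which the simulations attribute up to around 40% of the temperature anomaly in some regions.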

Journal ArticleDOI
01 Feb 2007-Nature
TL;DR: A circuit QED experiment is reported in the strong dispersive limit, a new regime where a single photon has a large effect on the qubit without ever being absorbed, the basis of a logic bus for a quantum computer.
Abstract: Electromagnetic signals are always composed of photons, although in the circuit domain those signals are carried as voltages and currents on wires, and the discreteness of the photon's energy is usually not evident. However, by coupling a superconducting quantum bit (qubit) to signals on a microwave transmission line, it is possible to construct an integrated circuit in which the presence or absence of even a single photon can have a dramatic effect. Such a system can be described by circuit quantum electrodynamics (QED)-the circuit equivalent of cavity QED, where photons interact with atoms or quantum dots. Previously, circuit QED devices were shown to reach the resonant strong coupling regime, where a single qubit could absorb and re-emit a single photon many times. Here we report a circuit QED experiment in the strong dispersive limit, a new regime where a single photon has a large effect on the qubit without ever being absorbed. The hallmark of this strong dispersive regime is that the qubit transition energy can be resolved into a separate spectral line for each photon number state of the microwave field. The strength of each line is a measure of the probability of finding the corresponding photon number in the cavity. This effect is used to distinguish between coherent and thermal fields, and could be used to create a photon statistics analyser. As no photons are absorbed by this process, it should be possible to generate non-classical states of light by measurement and perform qubit-photon conditional logic, the basis of a logic bus for a quantum computer.
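
In the strong dispersive limit the usual approximate Hamiltonian makes the photon-number splitting explicit; the expression below is the standard form for a two-level system coupled dispersively to a cavity, included for context rather than quoted from the paper:

H \approx \hbar\omega_r\, a^{\dagger}a + \frac{\hbar}{2}\left(\omega_q + 2\chi\, a^{\dagger}a\right)\sigma_z ,

so the qubit transition frequency is pulled by 2\chi per cavity photon and each photon number n contributes its own spectral line near \omega_q + 2\chi n, with a weight given by the probability of finding n photons in the cavity. Resolving these lines requires 2\chi to exceed the qubit and cavity linewidths, which is what distinguishes the strong dispersive regime from the resonant strong-coupling regime mentioned above.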

Proceedings ArticleDOI
10 Dec 2007
TL;DR: This paper summarizes the final results of the modeling and control parts of the OS4 project, which focused on the design and control of a quadrotor, and introduces a simulation model that takes into account the variation of the aerodynamic coefficients due to vehicle motion.
Abstract: The research on autonomous miniature flying robots has intensified considerably thanks to the recent growth of civil and military interest in unmanned aerial vehicles (UAV). This paper summarizes the final results of the modeling and control parts of the OS4 project, which focused on design and control of a quadrotor. It introduces a simulation model which takes into account the variation of the aerodynamical coefficients due to vehicle motion. The control parameters found with this model are successfully used on the helicopter without re-tuning. The last part of this paper describes the control approach (integral backstepping) and the scheme we propose for full control of quadrotors (attitude, altitude and position). Finally, the results of autonomous take-off, hover, landing and collision avoidance are presented.
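
To give a flavour of the integral backstepping idea on a single attitude axis, the sketch below stabilizes a unit-inertia double integrator (angle and angular rate) around a constant reference with an added integral state. It is a generic textbook-style construction with made-up gains, not the OS4 controller, its model or its tuning.

```python
def simulate_axis(phi_ref=0.5, c1=4.0, c2=8.0, lam=1.0, dt=0.001, T=5.0):
    """Integral backstepping for phi_ddot = u (unit inertia, constant reference).
    e1: tracking error, chi: its integral, e2: error of the virtual rate command.
    The control law makes V = (e1^2 + e2^2 + lam*chi^2)/2 decrease along trajectories."""
    phi, rate, chi = 0.0, 0.0, 0.0
    for _ in range(int(T / dt)):
        e1 = phi_ref - phi
        chi += e1 * dt                        # integral of the tracking error
        rate_des = c1 * e1 + lam * chi        # virtual control for the angular rate
        e2 = rate_des - rate
        u = (1.0 + lam - c1 ** 2) * e1 + (c1 + c2) * e2 - c1 * lam * chi
        rate += u * dt                        # explicit-Euler integration of the dynamics
        phi += rate * dt
    return phi

print(simulate_axis())                        # approaches the reference angle 0.5
```

In the paper the approach is applied to full quadrotor control (attitude, altitude and position); the one-axis sketch above only illustrates the error-plus-integral cascade.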

Journal ArticleDOI
TL;DR: A single-objective, elitist CMA-ES using plus-selection and step-size control based on a success rule is introduced, and a population of individuals that adapt their search strategy as in the elitist CMA-ES is maintained, subject to multi-objective selection.
Abstract: The covariance matrix adaptation evolution strategy (CMA-ES) is one of the most powerful evolutionary algorithms for real-valued single-objective optimization. In this paper, we develop a variant of the CMA-ES for multi-objective optimization (MOO). We first introduce a single-objective, elitist CMA-ES using plus-selection and step size control based on a success rule. This algorithm is compared to the standard CMA-ES. The elitist CMA-ES turns out to be slightly faster on unimodal functions, but is more prone to getting stuck in sub-optimal local minima. In the new multi-objective CMA-ES (MO-CMA-ES), a population of individuals that adapt their search strategy as in the elitist CMA-ES is maintained. These are subject to multi-objective selection. The selection is based on non-dominated sorting using either the crowding-distance or the contributing hypervolume as second sorting criterion. Both the elitist single-objective CMA-ES and the MO-CMA-ES inherit important invariance properties, in particular invariance against rotation of the search space, from the original CMA-ES. The benefits of the new MO-CMA-ES in comparison to the well-known NSGA-II and to NSDE, a multi-objective differential evolution algorithm, are experimentally shown.
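
The plus-selection and success-rule step-size control referred to above can be illustrated, stripped of the covariance matrix adaptation and of the multi-objective layer, by a (1+1)-ES with the classical 1/5th success rule; the sketch below is a didactic simplification, not the elitist CMA-ES or the MO-CMA-ES of the paper.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma=1.0, iters=2000, seed=None):
    """(1+1)-ES: plus-selection (keep the parent unless the offspring is better)
    with 1/5th-success-rule step-size control. No covariance adaptation here."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(iters):
        y = x + sigma * rng.standard_normal(x.shape)    # Gaussian mutation
        fy = f(y)
        success = fy <= fx
        if success:
            x, fx = y, fy                               # elitist replacement
        # step-size drift is neutral at a success rate of 1/5
        sigma *= np.exp(0.2) if success else np.exp(-0.05)
    return x, fx

x_best, f_best = one_plus_one_es(lambda z: float(np.sum(z ** 2)), x0=np.ones(5), seed=1)
```

In the MO-CMA-ES each individual carries such a self-adapted strategy, and plus-selection is replaced by multi-objective selection based on non-dominated sorting with the crowding distance or contributing hypervolume, as described in the abstract.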

Journal ArticleDOI
TL;DR: While suspended CNT-bundles were less cytotoxic than asbestos, rope-like agglomerates induced more pronounced cytotoxic effects than asbestos fibres at the same concentrations, which underlines the need for thorough materials characterization prior to toxicological studies and corroborates the role of agglomeration in the cytotoxicity of nanomaterials.

Journal ArticleDOI
TL;DR: This work systematically evaluates the capacity of 11 objective functions combined with eight adjustable constraints to predict 13C‐determined in vivo fluxes in Escherichia coli under six environmental conditions and identified two sets of objectives for biologically meaningful predictions without the need for further, potentially artificial constraints.
Abstract: To which extent can optimality principles describe the operation of metabolic networks? By explicitly considering experimental errors and in silico alternate optima in flux balance analysis, we systematically evaluate the capacity of 11 objective functions combined with eight adjustable constraints to predict 13C-determined in vivo fluxes in Escherichia coli under six environmental conditions. While no single objective describes the flux states under all conditions, we identified two sets of objectives for biologically meaningful predictions without the need for further, potentially artificial constraints. Unlimited growth on glucose in oxygen or nitrate respiring batch cultures is best described by nonlinear maximization of the ATP yield per flux unit. Under nutrient scarcity in continuous cultures, in contrast, linear maximization of the overall ATP or biomass yields achieved the highest predictive accuracy. Since these particular objectives predict the system behavior without preconditioning of the network structure, the identified optimality principles reflect, to some extent, the evolutionary selection of metabolic network regulation that realizes the various flux states.
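
Flux balance analysis itself is a linear program: choose fluxes v to maximize an objective c·v subject to the steady-state mass balance S v = 0 and flux bounds. The toy example below (a made-up three-reaction network, not the E. coli model or any of the 11 objectives evaluated in the paper) shows the basic setup with scipy.

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: v1 imports metabolite A, v2 converts A to biomass, v3 to a byproduct.
S = np.array([[1.0, -1.0, -1.0]])             # one internal metabolite, three fluxes
bounds = [(0, 10), (0, None), (0, None)]      # uptake capped at 10; all fluxes irreversible
c = np.array([0.0, 1.0, 0.0])                 # objective: maximize the biomass flux v2

res = linprog(-c, A_eq=S, b_eq=[0.0], bounds=bounds, method="highs")  # linprog minimizes
print(res.x)                                  # optimal fluxes, here [10, 10, 0]
```

The alternate-optima issue mentioned in the abstract arises because such linear programs can admit many flux distributions with the same optimal objective value.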

Journal ArticleDOI
TL;DR: In this article, the authors demonstrate that laser sintering of inkjet-printed metal nanoparticles enables low-temperature metal deposition as well as high-resolution patterning to overcome the resolution limitation of the current inkjet direct writing processes.
Abstract: All-printed electronics is the key technology to ultra-low-cost, large-area electronics. As a critical step in this direction, we demonstrate that laser sintering of inkjet-printed metal nanoparticles enables low-temperature metal deposition as well as high-resolution patterning to overcome the resolution limitation of the current inkjet direct writing processes. To demonstrate this process combined with the implementation of air-stable carboxylate-functionalized polythiophenes, high-resolution organic transistors were fabricated in ambient pressure and room temperature without utilizing any photolithographic steps or requiring a vacuum deposition process. Local thermal control of the laser sintering process could minimize the heat-affected zone and the thermal damage to the substrate and further enhance the resolution of the process. This local nanoparticle deposition and energy coupling enable an environmentally friendly and cost-effective process as well as a low-temperature manufacturing sequence to realize large-area, flexible electronics on polymer substrates.

Journal ArticleDOI
TL;DR: This work has shown that the combination of AP–MS with other techniques, such as biochemical fractionation, intact mass measurement and chemical crosslinking, can help to decipher the supramolecular organization of protein complexes.
Abstract: The combination of affinity purification and mass spectrometry (AP–MS) has recently been applied to the detailed characterization of many protein complexes and large protein-interaction networks. Emerging AP–MS approaches promise a better understanding of protein-complex stoichiometry, structural organization and the dynamics of protein-complex composition. The combination of AP–MS with other techniques, such as biochemical fractionation, intact mass measurement and chemical crosslinking, can help to decipher the supramolecular organization of protein complexes. AP–MS can also be combined with quantitative proteomics approaches to better understand the dynamics of protein-complex assembly.

Journal Article
TL;DR: This work proves uniform consistency of the PC-algorithm for very high-dimensional, sparse DAGs where the number of nodes is allowed to quickly grow with sample size n, as fast as O(n^a) for any 0 < a < ∞.
Abstract: We consider the PC-algorithm (Spirtes et al., 2000) for estimating the skeleton and equivalence class of a very high-dimensional directed acyclic graph (DAG) with corresponding Gaussian distribution. The PC-algorithm is computationally feasible and often very fast for sparse problems with many nodes (variables), and it has the attractive property to automatically achieve high computational efficiency as a function of sparseness of the true underlying DAG. We prove uniform consistency of the algorithm for very high-dimensional, sparse DAGs where the number of nodes is allowed to quickly grow with sample size n, as fast as O(n^a) for any 0 < a < ∞. The sparseness assumption is rather minimal requiring only that the neighborhoods in the DAG are of lower order than sample size n. We also demonstrate the PC-algorithm for simulated data.
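
For orientation, the skeleton phase of the PC-algorithm removes the edge between i and j as soon as a conditional independence test passes for some conditioning set drawn from the current neighbourhoods, with the conditioning-set size increased stepwise. The sketch below uses partial correlations with a Fisher z-test, appropriate for the Gaussian setting of the paper, but it is a simplification for illustration, not the reference implementation.

```python
import numpy as np
from itertools import combinations
from scipy.stats import norm

def partial_corr(C, i, j, S):
    """Partial correlation of i and j given S, from the inverse of the
    corresponding submatrix of the correlation matrix C."""
    idx = [i, j] + list(S)
    P = np.linalg.pinv(C[np.ix_(idx, idx)])
    return -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])

def pc_skeleton(X, alpha=0.01):
    """Simplified PC skeleton estimation for Gaussian data (n samples, p variables)."""
    n, p = X.shape
    C = np.corrcoef(X, rowvar=False)
    adj = {i: set(range(p)) - {i} for i in range(p)}
    crit = norm.ppf(1 - alpha / 2)
    ell = 0                                     # size of the conditioning sets
    while any(len(adj[i] - {j}) >= ell for i in adj for j in adj[i]):
        for i in range(p):
            for j in list(adj[i]):
                if j not in adj[i]:
                    continue
                for S in combinations(sorted(adj[i] - {j}), ell):
                    r = np.clip(partial_corr(C, i, j, S), -0.9999999, 0.9999999)
                    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - len(S) - 3)
                    if abs(z) < crit:           # cannot reject conditional independence
                        adj[i].discard(j)
                        adj[j].discard(i)
                        break                   # edge removed; move to the next pair
        ell += 1
    return adj
```

The sparseness assumption in the abstract matters precisely here: tests only condition on subsets of current neighbourhoods, so the procedure remains fast and well behaved when the true DAG is sparse even though the number of nodes grows with the sample size.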

Journal ArticleDOI
11 Jan 2007-Nature
TL;DR: It is concluded that oceanic nitrogen fixation is closely tied to the generation of nitrogen-deficient waters in denitrification zones, supporting the view that nitrogen fixation stabilizes the oceanic inventory of fixed nitrogen over time.
Abstract: Nitrogen fixation is crucial for maintaining biological productivity in the oceans, because it replaces the biologically available nitrogen that is lost through denitrification. But, owing to its temporal and spatial variability, the global distribution of marine nitrogen fixation is difficult to determine from direct shipboard measurements. This uncertainty limits our understanding of the factors that influence nitrogen fixation, which may include iron, nitrogen-to-phosphorus ratios, and physical conditions such as temperature. Here we determine nitrogen fixation rates in the world's oceans through their impact on nitrate and phosphate concentrations in surface waters, using an ocean circulation model. Our results indicate that nitrogen fixation rates are highest in the Pacific Ocean, where water column denitrification rates are high but the rate of atmospheric iron deposition is low. We conclude that oceanic nitrogen fixation is closely tied to the generation of nitrogen-deficient waters in denitrification zones, supporting the view that nitrogen fixation stabilizes the oceanic inventory of fixed nitrogen over time.

Journal ArticleDOI
Markus Niederberger
TL;DR: The organic components strongly influence the composition, size, shape, and surface properties of the inorganic product, underlining the demand to understand the role of the organic species at all stages of these processes for the development of a rational synthesis strategy for inorganic nanomaterials.
Abstract: Sol–gel routes to metal oxide nanoparticles in organic solvents under exclusion of water have become a versatile alternative to aqueous methods. In comparison to the complex aqueous chemistry, nonaqueous processes offer the possibility of better understanding and controlling the reaction pathways on a molecular level, enabling the synthesis of nanomaterials with high crystallinity and well-defined and uniform particle morphologies. The organic components strongly influence the composition, size, shape, and surface properties of the inorganic product, underlining the demand to understand the role of the organic species at all stages of these processes for the development of a rational synthesis strategy for inorganic nanomaterials.

Journal ArticleDOI
TL;DR: The results suggest that the fate of normal human cells should be considered in evaluating nutrient deprivation as a strategy for cancer therapy, and that understanding how glutamine metabolism is linked to cell viability might provide new approaches for treatment of cancer.
Abstract: The idea that conversion of glucose to ATP is an attractive target for cancer therapy has been supported in part by the observation that glucose deprivation induces apoptosis in rodent cells transduced with the proto-oncogene MYC, but not in the parental line. Here, we found that depletion of glucose killed normal human cells irrespective of induced MYC activity and by a mechanism different from apoptosis. However, depletion of glutamine, another major nutrient consumed by cancer cells, induced apoptosis depending on MYC activity. This apoptosis was preceded by depletion of the Krebs cycle intermediates, was prevented by two Krebs cycle substrates, but was unrelated to ATP synthesis or several other reported consequences of glutamine starvation. Our results suggest that the fate of normal human cells should be considered in evaluating nutrient deprivation as a strategy for cancer therapy, and that understanding how glutamine metabolism is linked to cell viability might provide new approaches for treatment of cancer.