
Showing papers by "Chalmers University of Technology published in 2013"


Journal ArticleDOI
06 Jun 2013-Nature
TL;DR: This work uses shotgun sequencing to characterize the faecal metagenome of 145 European women with normal, impaired or diabetic glucose control, and develops a mathematical model based on metagenomic profiles that identifies T2D with high accuracy.
Abstract: Recent evidence has suggested that altered gut microbiota are associated with various metabolic diseases including obesity, diabetes and cardiovascular disease. Fredrik Bäckhed and colleagues characterized the faecal metagenome of a cohort of European women with normal, impaired or diabetic glucose control and compared these findings to a recently described Chinese cohort. Their analysis reveals differences in the discriminant metagenomic markers for type 2 diabetes between the two cohorts, suggesting that metagenomic predictive tools may have to be specific for the age and geography of the populations under investigation.

2,248 citations
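The "mathematical model" is summarized only at a high level above. As a generic illustration of the approach (a classifier trained on species-abundance profiles), a minimal sketch with wholly synthetic data follows; nothing here reproduces the paper's actual model, features, or accuracy.

```python
# Minimal sketch: classify disease status from metagenomic abundance
# profiles. All data are synthetic; the paper's actual model, features
# and accuracy are not reproduced here.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_species = 145, 300                 # 145 women per the abstract; species count assumed
X = rng.lognormal(size=(n_samples, n_species))  # stand-in for abundance estimates
X /= X.sum(axis=1, keepdims=True)               # normalize to relative abundances
y = rng.integers(0, 2, size=n_samples)          # 1 = T2D, 0 = normal glucose control (random labels)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
print("cross-validated AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean().round(2))
```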


Journal ArticleDOI
M. P. van Haarlem, Michael W. Wise, A. W. Gunst, and 219 more authors (27 institutions)
TL;DR: The authors describe LOFAR, from the astronomical capabilities of the new telescope to a more detailed technical description of the instrument.
Abstract: LOFAR, the LOw-Frequency ARray, is a new-generation radio interferometer constructed in the north of the Netherlands and across Europe. Utilizing a novel phased-array design, LOFAR covers the largely unexplored low-frequency range from 10-240 MHz and provides a number of unique observing capabilities. Spreading out from a core located near the village of Exloo in the northeast of the Netherlands, a total of 40 LOFAR stations are nearing completion. A further five stations have been deployed throughout Germany, and one station has been built in each of France, Sweden, and the UK. Digital beam-forming techniques make the LOFAR system agile and allow for rapid repointing of the telescope as well as the potential for multiple simultaneous observations. With its dense core array and long interferometric baselines, LOFAR achieves unparalleled sensitivity and angular resolution in the low-frequency radio regime. The LOFAR facilities are jointly operated by the International LOFAR Telescope (ILT) foundation, as an observatory open to the global astronomical community. LOFAR is one of the first radio observatories to feature automated processing pipelines to deliver fully calibrated science products to its user community. LOFAR's new capabilities, techniques and modus operandi make it an important pathfinder for the Square Kilometre Array (SKA). We give an overview of the LOFAR instrument, its major hardware and software components, and the core science objectives that have driven its design. In addition, we present a selection of new results from the commissioning phase of this new radio observatory.

2,067 citations


Journal ArticleDOI
TL;DR: Recon 2, a community-driven, consensus 'metabolic reconstruction', is described, which is the most comprehensive representation of human metabolism that is applicable to computational modeling and has improved topological and functional features.
Abstract: Multiple models of human metabolism have been reconstructed, but each represents only a subset of our knowledge. Here we describe Recon 2, a community-driven, consensus 'metabolic reconstruction', which is the most comprehensive representation of human metabolism that is applicable to computational modeling. Compared with its predecessors, the reconstruction has improved topological and functional features, including ~2× more reactions and ~1.7× more unique metabolites. Using Recon 2 we predicted changes in metabolite biomarkers for 49 inborn errors of metabolism with 77% accuracy when compared to experimental data. Mapping metabolomic data and drug information onto Recon 2 demonstrates its potential for integrating and analyzing diverse data types. Using protein expression data, we automatically generated a compendium of 65 cell type–specific models, providing a basis for manual curation or investigation of cell-specific metabolic properties. Recon 2 will facilitate many future biomedical studies and is freely available at http://humanmetabolism.org/.

1,002 citations
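Recon 2 is distributed as an SBML file, so constraint-based analyses can be run with standard COBRA tooling. Below is a minimal flux balance analysis sketch using the open-source cobrapy package; the file name is a placeholder, and the actual model should be obtained from humanmetabolism.org.

```python
# Minimal flux balance analysis sketch with cobrapy
# (https://opencobra.github.io/cobrapy/). The SBML file name is a
# placeholder, not a path from the Recon 2 paper.
import cobra

model = cobra.io.read_sbml_model("Recon2.xml")  # hypothetical local copy of the model
print(len(model.reactions), "reactions,", len(model.metabolites), "metabolites")

solution = model.optimize()                     # optimize the model's default objective
print("objective value:", solution.objective_value)
```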


Journal ArticleDOI
TL;DR: A set of terms, definitions, and recommendations for use in the classification of coordination polymers, networks, and metal-organic frameworks (MOFs) is provided in this paper.
Abstract: A set of terms, definitions, and recommendations is provided for use in the classification of coordination polymers, networks, and metal-organic frameworks (MOFs). A hierarchical terminology is recommended in which the most general term is coordination polymer. Coordination networks are a subset of coordination polymers and MOFs a further subset of coordination networks. One of the criteria an MOF needs to fulfill is that it contains potential voids, but no physical measurements of porosity or other properties are demanded per se. The use of topology and topology descriptors to enhance the description of crystal structures of MOFs and 3D-coordination polymers is furthermore strongly recommended.

906 citations


Journal ArticleDOI
TL;DR: ITSx is introduced, a Perl‐based software tool to extract ITS1, 5.8S and ITS2 – as well as full‐length ITS sequences – from both Sanger and high‐throughput sequencing data sets, and is rich in features and written to be easily incorporated into automated sequence analysis pipelines.
Abstract: Summary 1. The nuclear ribosomal internal transcribed spacer (ITS) region is the primary choice for molecular identification of fungi. Its two highly variable spacers (ITS1 and ITS2) are usually species specific, whereas the intercalary 5.8S gene is highly conserved. For sequence clustering and BLAST searches, it is often advantageous to rely on either one of the variable spacers but not the conserved 5.8S gene. To identify and extract ITS1 and ITS2 from large taxonomic and environmental data sets is, however, often difficult, and many ITS sequences are incorrectly delimited in the public sequence databases. 2. We introduce ITSx, a Perl-based software tool to extract ITS1, 5.8S and ITS2 – as well as full-length ITS sequences – from both Sanger and high-throughput sequencing data sets. ITSx uses hidden Markov models computed from large alignments of a total of 20 groups of eukaryotes, including fungi, metazoans and plants, and the sequence extraction is based on the predicted positions of the ribosomal genes in the sequences. 3. ITSx has a very high proportion of true-positive extractions and a low proportion of false-positive extractions. Additionally, process parallelization permits expedient analyses of very large data sets, such as a one million sequence amplicon pyrosequencing data set. ITSx is rich in features and written to be easily incorporated into automated sequence analysis pipelines. 4. ITSx paves the way for more sensitive BLAST searches and sequence clustering operations for the ITS region in eukaryotes. The software also permits elimination of non-ITS sequences from any data set. This is particularly useful for amplicon-based next-generation sequencing data sets, where insidious non-target sequences are often found among the target sequences. Such non-target sequences are difficult to find by other means and would contribute noise to diversity estimates if left in the data set.

901 citations
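ITSx is a command-line Perl tool, so "easily incorporated into automated sequence analysis pipelines" in practice means shelling out to it. A hedged wrapper sketch follows: the -i, -o and --cpu flags are quoted from memory of the ITSx manual and should be verified against `ITSx --help` for the installed version; file names are hypothetical.

```python
# Sketch of invoking ITSx from a Python pipeline. Flag names (-i, -o,
# --cpu) are assumptions based on the ITSx manual; verify against
# `ITSx --help` for your installed version. File names are hypothetical.
import subprocess

def run_itsx(fasta_in: str, out_prefix: str, cpus: int = 4) -> None:
    """Extract ITS1/5.8S/ITS2 regions from a FASTA file with ITSx."""
    cmd = ["ITSx", "-i", fasta_in, "-o", out_prefix, "--cpu", str(cpus)]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    run_itsx("amplicons.fasta", "itsx_out")
```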


Journal ArticleDOI
TL;DR: An upgraded version of the Particle and Heavy Ion Transport code System (PHITS2.52) was developed and released to the public; it is a more powerful tool for particle transport simulation applicable to various research and development fields.
Abstract: An upgraded version of the Particle and Heavy Ion Transport code System, PHITS2.52, was developed and released to the public. The new version has been greatly improved from the previously released version, PHITS2.24, in terms of not only the code itself but also the contents of its package, such as the attached data libraries. In the new version, a higher accuracy of simulation was achieved by implementing several latest nuclear reaction models. The reliability of the simulation was improved by modifying both the algorithms for the electron-, positron-, and photon-transport simulations and the procedure for calculating the statistical uncertainties of the tally results. Estimation of the time evolution of radioactivity became feasible by incorporating the activation calculation program DCHAIN-SP into the new package. The efficiency of the simulation was also improved as a result of the implementation of shared-memory parallelization and the optimization of several time-consuming algorithms. Furthermore, a number of new user-support tools and functions that help users to intuitively and effectively perform PHITS simulations were developed and incorporated. Due to these improvements, PHITS is now a more powerful tool for particle transport simulation applicable to various research and development fields, such as nuclear technology, accelerator design, medical physics, and cosmic-ray research.

742 citations


Journal ArticleDOI
TL;DR: A theoretical model that centres on two drivers behind boundary resources design and use – resourcing and securing – and how these drivers interact in third‐party development is proposed and applied to a detailed case study of Apple's iPhone platform.
Abstract: Prior research documents the significance of using platform boundary resources e.g. application programming interfaces for cultivating platform ecosystems through third-party development. However, there are few, if any, theoretical accounts of this relationship. To this end, this paper proposes a theoretical model that centres on two drivers behind boundary resources design and use - resourcing and securing - and how these drivers interact in third-party development. We apply the model to a detailed case study of Apple's iPhone platform. Our application of the model not only serves as an illustration of its plausibility but also generates insights about the conflicting goals of third-party development: the maintenance of platform control and the transfer of design capability to third-party developers. We generate four specialised constructs for understanding the actions taken by stakeholders in third-party development: self-resourcing, regulation-based securing, diversity resourcing and sovereignty securing. Our research extends and complements existing platform literature and contributes new knowledge about an alternative form of system development.

646 citations


Journal ArticleDOI
TL;DR: The R package Piano is developed, collecting a range of GSA methods into the same system for the benefit of the end-user, and a consensus scoring approach based on multiple GSA runs, in combination with the directionality classes, is suggested.
Abstract: Gene set analysis (GSA) is used to elucidate genome-wide data, in particular transcriptome data. A multitude of methods have been proposed for this step of the analysis, and many of them have been compared and evaluated. Unfortunately, there is no consolidated opinion regarding which methods should be preferred, and the variety of available GSA software and implementations poses a difficulty for the end-user who wants to try out different methods. To address this, we have developed the R package Piano, which collects a range of GSA methods into the same system, for the benefit of the end-user. Furthermore, we refine the GSA workflow by using modifications of the gene-level statistics. This enables us to divide the resulting gene set P-values into three classes, describing different aspects of gene expression directionality at the gene set level. We use our fully implemented workflow to investigate the impact of the individual components of GSA using microarray and RNA-seq data. The results show that the evaluated methods are globally similar and that the major separation correlates well with our defined directionality classes. As a consequence, we suggest using a consensus scoring approach, based on multiple GSA runs. In combination with the directionality classes, this constitutes a more thorough basis for an enriched biological interpretation.

597 citations
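Piano itself is an R package; as a language-neutral illustration of the consensus-scoring idea (aggregate each gene set's rank across several GSA methods), and explicitly not Piano's API:

```python
# Generic consensus-scoring illustration (NOT the Piano API): rank each
# gene set under every GSA method's p-values, then aggregate by mean rank.
# All p-values are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_sets, n_methods = 50, 6
pvals = rng.uniform(size=(n_sets, n_methods))      # hypothetical p-values, one column per method

ranks = pvals.argsort(axis=0).argsort(axis=0) + 1  # rank 1 = most significant within a method
consensus = ranks.mean(axis=1)                     # mean rank across methods
print("top consensus gene sets:", consensus.argsort()[:5])
```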


Journal ArticleDOI
TL;DR: An assessment of the mitigation potential possible in the AFOLU sector under possible future scenarios in which demand-side measures codeliver to aid food security concludes that while supply-side mitigation measures, such as changes in land management, might either enhance or negatively impact food security, demand-side mitigation measures should benefit both food security and greenhouse gas mitigation.
Abstract: Feeding 9-10 billion people by 2050 and preventing dangerous climate change are two of the greatest challenges facing humanity. Both challenges must be met while reducing the impact of land management on ecosystem services that deliver vital goods and services, and support human health and well-being. Few studies to date have considered the interactions between these challenges. In this study we briefly outline the challenges, review the supply- and demand-side climate mitigation potential available in the Agriculture, Forestry and Other Land Use (AFOLU) sector and options for delivering food security. We briefly outline some of the synergies and trade-offs afforded by mitigation practices, before presenting an assessment of the mitigation potential possible in the AFOLU sector under possible future scenarios in which demand-side measures codeliver to aid food security. We conclude that while supply-side mitigation measures, such as changes in land management, might either enhance or negatively impact food security, demand-side mitigation measures, such as reduced waste or demand for livestock products, should benefit both food security and greenhouse gas (GHG) mitigation. Demand-side measures offer a greater potential (1.5-15.6 Gt CO2-eq. yr^-1) in meeting both challenges than do supply-side measures (1.5-4.3 Gt CO2-eq. yr^-1 at carbon prices between 20 and 100 US$ (tCO2-eq.)^-1), but given the enormity of the challenges, all options need to be considered. Supply-side measures should be implemented immediately, focussing on those that allow the production of more agricultural product per unit of input. For demand-side measures, given the difficulties in their implementation and the lag in their effectiveness, policy should be introduced quickly, and should aim to codeliver to other policy agendas, such as improving environmental quality or improving dietary health. These problems facing humanity in the 21st Century are extremely challenging, and policy that addresses multiple objectives is required now more than ever.

507 citations


Book
01 Jan 2013
TL;DR: District heating and cooling systems substitute for ordinary primary energy supply for heating and cooling, increasing both energy efficiency and decarbonisation in the global energy system; however, they are highly underestimated in contemporary energy policy, both nationally and internationally.
Abstract: District heating and cooling systems move heat in urban areas. Heat and cold are generated in central supply units by heat or cold recycling, renewables, or by direct heat or cold generation. The heat and cold demands should be concentrated in order to keep distribution costs low. District heating and cooling systems substitute for ordinary primary energy supply for heating and cooling. Therefore, district heating and cooling increase both energy efficiency and decarbonisation in the global energy system. However, district heating and cooling is a highly underestimated energy efficiency and decarbonisation method in contemporary energy policy, both nationally and internationally.

446 citations


Journal ArticleDOI
TL;DR: Object-oriented and process metrics have been reported to be more successful in finding faults compared to traditional size and complexity metrics and seem to be better at predicting post-release faults than any static code metrics.
Abstract: Context: Software metrics may be used in fault prediction models to improve software quality by predicting fault location. Objective: This paper aims to identify software metrics and to assess their applicability in software fault prediction. We investigated the influence of context on metrics' selection and performance. Method: This systematic literature review includes 106 papers published between 1991 and 2011. The selected papers are classified according to metrics and context properties. Results: Object-oriented metrics (49%) were used nearly twice as often as traditional source code metrics (27%) or process metrics (24%). Chidamber and Kemerer's (CK) object-oriented metrics were most frequently used. According to the selected studies, there are significant differences in fault prediction performance between the metrics used. Object-oriented and process metrics have been reported to be more successful in finding faults than traditional size and complexity metrics. Process metrics seem to be better at predicting post-release faults than any static code metrics. Conclusion: More studies should be performed on large industrial software systems to find metrics more relevant for industry and to answer the question of which metrics should be used in a given context.
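As a toy illustration of the kind of model the reviewed studies build — software metrics as features, fault-proneness as the label — a synthetic sketch follows; it reproduces no specific study's data or metric set.

```python
# Toy fault prediction sketch: software metrics as features, fault-
# proneness as the label. Data are synthetic and the three "metrics"
# (e.g. WMC, CBO, code churn) are stand-ins, not any study's dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_modules = 400
X = rng.lognormal(size=(n_modules, 3))                   # per-module metric values
y = (X @ np.array([0.5, 0.8, 1.2])                       # synthetic ground truth:
     + rng.normal(size=n_modules) > 4.0).astype(int)     # high metric values -> fault-prone

clf = LogisticRegression(max_iter=1000)
print("AUC:", cross_val_score(clf, X, y, cv=5, scoring="roc_auc").mean().round(2))
```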

Journal ArticleDOI
TL;DR: In this paper, the Atacama Large Millimeter/submillimeter Array (ALMA) observations of the [C II] 158 μm fine structure line and dust continuum emission from the host galaxies of five redshift 6 quasars were carried out in the extended array at 0.7'' resolution.
Abstract: We present Atacama Large Millimeter/submillimeter Array (ALMA) observations of the [C II] 158 μm fine structure line and dust continuum emission from the host galaxies of five redshift 6 quasars. We also report complementary observations of 250 GHz dust continuum and CO (6-5) line emission from the z = 6.00 quasar SDSS J231038.88+185519.7 using the IRAM facilities. The ALMA observations were carried out in the extended array at 0.7'' resolution. We have detected the line and dust continuum in all five objects. The derived [C II] line luminosities are 1.6 × 10^9 to 8.7 × 10^9 L_⊙ and the [C II]-to-FIR luminosity ratios are 2.9-5.1 × 10^-4, which is comparable to the values found in other high-redshift quasar-starburst systems and local ultra-luminous infrared galaxies. The sources are marginally resolved and the intrinsic source sizes (major axis FWHM) are constrained to be 0.3''-0.6'' (i.e., 1.7-3.5 kpc) for the [C II] line emission and 0.2''-0.4'' (i.e., 1.2-2.3 kpc) for the continuum. These measurements indicate that there is vigorous star formation over the central few kpc in the quasar host galaxies. The ALMA observations also constrain the dynamical properties of the star-forming gas in the nuclear region. The intensity-weighted velocity maps of three sources show clear velocity gradients. Such velocity gradients are consistent with a rotating, gravitationally bound gas component, although they are not uniquely interpreted as such. Under the simplifying assumption of rotation, the implied dynamical masses within the [C II]-emitting regions are of order 10^10-10^11 M_⊙. Given these estimates, the mass ratios between the supermassive black holes and the spheroidal bulge are an order of magnitude higher than the mean value found in local spheroidal galaxies, which is in agreement with results from previous CO observations of high-redshift quasars.
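Under the rotation assumption invoked above, the dynamical mass estimate is essentially M_dyn ≈ v²R/G. A worked example with round, assumed numbers of the order discussed (not values from the paper):

```python
# Worked dynamical-mass estimate under the rotation assumption:
# M_dyn ~ v^2 R / G. Velocity and radius are assumed round numbers of
# the order discussed in the abstract, not values from the paper.
G = 6.674e-11        # gravitational constant [m^3 kg^-1 s^-2]
M_SUN = 1.989e30     # solar mass [kg]
KPC = 3.086e19       # kiloparsec [m]

v = 250e3            # assumed rotation velocity [m/s]
R = 1.5 * KPC        # radius of the [C II]-emitting region [m]

M_dyn = v**2 * R / G
print(f"M_dyn ~ {M_dyn / M_SUN:.1e} M_sun")   # ~2e10 M_sun, within the quoted range
```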

Journal ArticleDOI
TL;DR: The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255, which was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production.
Abstract: We present the RAVEN (Reconstruction, Analysis and Visualization of Metabolic Networks) Toolbox: a software suite that allows for semi-automated reconstruction of genome-scale models. It makes use of published models and/or the KEGG database, coupled with extensive gap-filling and quality control features. The software suite also contains methods for visualizing simulation results and omics data, as well as a range of methods for performing simulations and analyzing the results. The software is a useful tool for system-wide data analysis in a metabolic context and for streamlined reconstruction of metabolic networks based on protein homology. The RAVEN Toolbox workflow was applied in order to reconstruct a genome-scale metabolic model for the important microbial cell factory Penicillium chrysogenum Wisconsin54-1255. The model was validated in a bibliomic study of a total of 440 references, and it comprises 1471 unique biochemical reactions and 1006 ORFs. It was then used to study the roles of ATP and NADPH in the biosynthesis of penicillin, and to identify potential metabolic engineering targets for maximization of penicillin production.

Book ChapterDOI
13 Jul 2013
TL;DR: The superposition calculus is discussed and the key concepts of saturation and redundancy elimination are explained, present saturation algorithms and preprocessing, and demonstrate how these concepts are implemented in Vampire.
Abstract: In this paper we give a short introduction to first-order theorem proving and the use of the theorem prover Vampire. We discuss the superposition calculus and explain the key concepts of saturation and redundancy elimination, present saturation algorithms and preprocessing, and demonstrate how these concepts are implemented in Vampire. Further, we also cover more recent topics and features of Vampire designed for advanced applications, including satisfiability checking, theory reasoning, interpolation, consequence elimination, and program analysis.
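To make the saturation and redundancy-elimination concepts concrete, here is a drastically simplified given-clause loop — propositional binary resolution with forward subsumption — in Python. Vampire itself implements the far richer first-order superposition calculus; this sketch only mirrors the loop structure.

```python
# Drastically simplified given-clause saturation loop: propositional
# binary resolution with forward subsumption as the redundancy check.
# Vampire itself uses the first-order superposition calculus; this only
# mirrors the loop structure described above.

def resolve(c1, c2):
    """All binary resolvents of two clauses. A clause is a frozenset of
    ints; a negative int encodes a negated atom."""
    return [(c1 - {lit}) | (c2 - {-lit}) for lit in c1 if -lit in c2]

def saturate(clauses):
    processed, unprocessed = [], [frozenset(c) for c in clauses]
    while unprocessed:
        given = unprocessed.pop()
        if any(p <= given for p in processed):  # forward subsumption: redundant
            continue
        if not given:                           # empty clause derived: unsatisfiable
            return False
        for other in processed:
            unprocessed.extend(frozenset(r) for r in resolve(given, other))
        processed.append(given)
    return True                                 # saturated without the empty clause

# {p, p -> q, not q} is unsatisfiable, so saturation derives the empty clause:
print(saturate([{1}, {-1, 2}, {-2}]))           # prints False
```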

Journal ArticleDOI
TL;DR: In this article, a comprehensive study is reported entailing optimization of sodium-ion electrolyte formulation and compatibility studies with positive and negative electrode materials; EC0.45:PC0.45:DMC0.1 was found to be the optimum composition, resulting in good rate capability and high capacity upon sustained cycling for hard carbon electrodes.
Abstract: A comprehensive study is reported entailing optimization of sodium-ion electrolyte formulation and compatibility studies with positive and negative electrode materials. EC:PC:DMC and EC:PC:DME were found to exhibit optimum ionic conductivities and lower viscosities. Yet, hard carbon negative electrode materials tested in such electrolytes exhibit significant differences in performance, rooted in the different resistivity of the SEI, which results in overly large polarisation and concomitant loss of capacity at low potentials when DME is used as a co-solvent. EC0.45:PC0.45:DMC0.1 was found to be the optimum composition, resulting in good rate capability and high capacity upon sustained cycling for hard carbon electrodes. Its compatibility with positive Na3V2(PO4)2F3 (NVPF) electrodes was also confirmed, which led to the assembly of full Na-ion cells displaying an operation voltage of 3.65 V, very low polarisation and excellent capacity retention upon cycling, with ca. 97 mA h g−1 of NVPF after more than 120 cycles together with satisfactory coulombic efficiency (>98.5%) and very good power performance. Such values lead to energy densities comparable to those of the current state-of-the-art lithium-ion technology.

Journal ArticleDOI
TL;DR: This paper derives new closed-form expressions for the exact and asymptotic OPs, accounting for hardware impairments at the source, relay, and destination, and proves that for high signal-to-noise ratio (SNR), the end-to-end SNDR converges to a deterministic constant, coined the SNDR ceiling, which is inversely proportional to the level of impairments.
Abstract: Physical transceivers have hardware impairments that create distortions which degrade the performance of communication systems. The vast majority of technical contributions in the area of relaying neglect hardware impairments and, thus, assume ideal hardware. Such approximations make sense in low-rate systems, but can lead to very misleading results when analyzing future high-rate systems. This paper quantifies the impact of hardware impairments on dual-hop relaying, for both amplify-and-forward and decode-and-forward protocols. The outage probability (OP) in these practical scenarios is a function of the effective end-to-end signal-to-noise-and-distortion ratio (SNDR). This paper derives new closed-form expressions for the exact and asymptotic OPs, accounting for hardware impairments at the source, relay, and destination. A similar analysis for the ergodic capacity is also pursued, resulting in new upper bounds. We assume that both hops are subject to independent but non-identically distributed Nakagami-m fading. This paper validates that the performance loss is small at low rates, but otherwise can be very substantial. In particular, it is proved that for high signal-to-noise ratio (SNR), the end-to-end SNDR converges to a deterministic constant, coined the SNDR ceiling, which is inversely proportional to the level of impairments. This stands in contrast to the ideal hardware case in which the end-to-end SNDR grows without bound in the high-SNR regime. Finally, we provide fundamental design guidelines for selecting hardware that satisfies the requirements of a practical relaying system.
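A numeric sketch of the ceiling effect follows. The closed form used here is one commonly quoted dual-hop AF SNDR expression with aggregate impairment level d = κ1² + κ2² + κ1²κ2²; treat it as an illustrative stand-in consistent with the asymptotic result above, not a verbatim copy of the paper's expressions.

```python
# Numeric sketch of the SNDR ceiling: with aggregate impairment level
# d = k1^2 + k2^2 + k1^2*k2^2, the end-to-end SNDR of a dual-hop AF
# link saturates at 1/d as the per-hop SNRs grow. The closed form is an
# illustrative stand-in, not a verbatim copy of the paper's expressions.
k1 = k2 = 0.1                        # per-transceiver impairment levels (assumed EVMs)
d = k1**2 + k2**2 + k1**2 * k2**2

for snr_db in (10, 20, 30, 40, 50):
    g = 10 ** (snr_db / 10)          # per-hop SNR (identical hops assumed)
    sndr = g * g / (g * g * d + 2 * g + 1)
    print(f"SNR {snr_db:2d} dB -> end-to-end SNDR {sndr:7.1f} (ceiling {1 / d:.1f})")
```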

Journal ArticleDOI
01 Oct 2013-Diabetes
TL;DR: The use of genetically engineered gnotobiotic mouse models may increase the understanding of mechanisms by which the gut microbiome modulates host metabolism and physiology, and a combination of classical microbiology, sequencing, and animal experiments may provide further insights.
Abstract: Recent findings have demonstrated that the gut microbiome complements our human genome with at least 100-fold more genes. In contrast to our Homo sapiens-derived genes, the microbiome is much more plastic, and its composition changes with age and diet, among other factors. An altered gut microbiota has been associated with several diseases, including obesity and diabetes, but the mechanisms involved remain elusive. Here we discuss factors that affect the gut microbiome, how the gut microbiome may contribute to metabolic diseases, and how to study the gut microbiome. Next-generation sequencing and development of software packages have led to the development of large-scale sequencing efforts to catalog the human microbiome. Furthermore, the use of genetically engineered gnotobiotic mouse models may increase our understanding of mechanisms by which the gut microbiome modulates host metabolism. A combination of classical microbiology, sequencing, and animal experiments may provide further insights into how the gut microbiota affect host metabolism and physiology.

Journal ArticleDOI
TL;DR: It is shown how measurements reveal a strong, but complex pattern of climatic dependence, which is increasingly being characterized using ground-based NH3 monitoring and satellite observations, while advances in process-based modelling are illustrated for agricultural and natural sources.
Abstract: Existing descriptions of bi-directional ammonia (NH3) land-atmosphere exchange incorporate temperature and moisture controls, and are beginning to be used in regional chemical transport models. However, such models have typically applied simpler emission factors to upscale the main NH3 emission terms. While this approach has successfully simulated the main spatial patterns on local to global scales, it fails to address the environment- and climate-dependence of emissions. To handle these issues, we outline the basis for a new modelling paradigm where both NH3 emissions and deposition are calculated online according to diurnal, seasonal and spatial differences in meteorology. We show how measurements reveal a strong, but complex pattern of climatic dependence, which is increasingly being characterized using ground-based NH3 monitoring and satellite observations, while advances in process-based modelling are illustrated for agricultural and natural sources, including a global application for seabird colonies. A future architecture for NH3 emission-deposition modelling is proposed that integrates the spatio-temporal interactions, and provides the necessary foundation to assess the consequences of climate change. Based on available measurements, a first empirical estimate suggests that 5°C warming would increase emissions by 42 per cent (28-67%). Together with increased anthropogenic activity, global NH3 emissions may increase from 65 (45-85) Tg N in 2008 to reach 132 (89-179) Tg by 2100.
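The quoted +42% for 5 °C of warming implies an exponential temperature response E(T) = E0·exp(b·ΔT); the implied rate constant can be backed out directly (central estimate only; the 28-67% range propagates the same way):

```python
# Back out the exponential temperature-response rate implied by the
# abstract's "+42% per 5 degC" central estimate: E(T) = E0 * exp(b*dT).
import math

increase, dT = 1.42, 5.0
b = math.log(increase) / dT
print(f"b ~ {b:.3f} per degC")                         # ~0.070 per degC
print(f"1 degC warming -> +{(math.exp(b) - 1) * 100:.1f}% emissions")
```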

Journal ArticleDOI
TL;DR: In this article, the Atacama Large Millimeter/submillimeter Array (ALMA) Cycle 0 survey of 126 submillimeter sources from the LABOCA ECDFS Submillimeter Survey (LESS) was presented.
Abstract: We present an Atacama Large Millimeter/submillimeter Array (ALMA) Cycle 0 survey of 126 submillimeter sources from the LABOCA ECDFS Submillimeter Survey (LESS). Our 870 μm survey with ALMA (ALESS) has produced maps ~3× deeper and with a beam area ~200× smaller than the original LESS observations, doubling the current number of interferometrically observed submillimeter sources. The high resolution of these maps allows us to resolve sources that were previously blended and accurately identify the origin of the submillimeter emission. We discuss the creation of the ALESS submillimeter galaxy (SMG) catalog, including the main sample of 99 SMGs and a supplementary sample of 32 SMGs. We find that at least 35% (possibly up to 50%) of the detected LABOCA sources have been resolved into multiple SMGs, and that the average number of SMGs per LESS source increases with LESS flux density. Using the (now precisely known) SMG positions, we empirically test the theoretical expectation for the uncertainty in the single-dish source positions. We also compare our catalog to the previously predicted radio/mid-infrared counterparts, finding that 45% of the ALESS SMGs were missed by this method. Our ~1.6'' resolution allows us to measure a size of ~9 kpc × 5 kpc for the rest-frame ~300 μm emission region in one resolved SMG, implying a star formation rate surface density of 80 M_⊙ yr^-1 kpc^-2, and we constrain the emission regions in the remaining SMGs to be <10 kpc. As the first statistically reliable survey of SMGs, this will provide the basis for an unbiased multiwavelength study of SMG properties.

Journal ArticleDOI
TL;DR: In this article, it was shown that the Jacobiator of generalised diffeomorphisms gives such a reducibility transformation, i.e., the tower of ghosts for ghosts is infinite.
Abstract: We investigate the generalised diffeomorphisms in M-theory, which are gauge transformations unifying diffeomorphisms and tensor gauge transformations. After giving an E_n(n)-covariant description of the gauge transformations and their commutators, we show that the gauge algebra is infinitely reducible, i.e., the tower of ghosts for ghosts is infinite. The Jacobiator of generalised diffeomorphisms gives such a reducibility transformation. We give a concrete description of the ghost structure, and demonstrate that the infinite sums give the correct (regularised) number of degrees of freedom. The ghost towers belong to the sequences of representations previously observed appearing in tensor hierarchies and Borcherds algebras. All calculations rely on the section condition, which we reformulate as a linear condition on the cotangent directions. The analysis holds for n < 8. At n = 8, where the dual gravity field becomes relevant, the natural guess for the gauge parameter and its reducibility still yields the correct counting of gauge parameters.

Journal ArticleDOI
TL;DR: In this paper, the authors reviewed a multitude of methods and indicators for freshwater use potentially applicable in life cycle assessment and identified the key elements to build a scientific consensus for operational characterization methods for LCA.
Abstract: In recent years, several methods have been developed which propose different freshwater use inventory schemes and impact assessment characterization models considering various cause–effect chain relationships. This work reviewed a multitude of methods and indicators for freshwater use potentially applicable in life cycle assessment (LCA). This review is used as a basis to identify the key elements to build a scientific consensus for operational characterization methods for LCA. This evaluation builds on the criteria and procedure developed within the International Reference Life Cycle Data System Handbook and has been adapted for the purpose of this project. It therefore includes (1) description of relevant cause–effect chains, (2) definition of criteria to evaluate the existing methods, (3) development of sub-criteria specific to freshwater use, and (4) description and review of existing methods addressing freshwater in LCA. No single method is available which comprehensively describes all potential impacts derived from freshwater use. However, this review highlights several key findings for designing a characterization method encompassing all the impact pathways of the assessment of freshwater use and consumption in the life cycle assessment framework, as follows: (1) In most databases and methods, consistent freshwater balances are not reported, either because output is not considered or because polluted freshwater is recalculated based on a critical dilution approach. (2) At the midpoint level, most methods are related to a water scarcity index and correspond to the methodological choice of an indicator simplified in terms of the number of parameters (scarcity) and freshwater uses (freshwater consumption or freshwater withdrawal) considered. More comprehensive scarcity indices distinguish different freshwater types and functionalities. (3) At the endpoint level, several methods already exist which report results in units compatible with traditional human health and ecosystem quality damage and cover various cause–effect chains, e.g., the decrease of terrestrial biodiversity due to freshwater consumption. (4) Midpoint and endpoint indicators have various levels of spatial differentiation, i.e., generic factors with no differentiation at all, or country, watershed, and grid cell differentiation. Existing databases should be (1) completed with input and output freshwater flows differentiated according to water types based on origin (surface water, groundwater, and precipitation water stored as soil moisture), (2) regionalized, and (3) if possible, characterized with a set of quality parameters. The assessment of impacts related to freshwater use is possible by assembling methods in a comprehensive methodology to characterize each use adequately.
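At the midpoint level, applying a water scarcity index reduces to multiplying each regionalized inventory flow by its characterization factor and summing; a schematic sketch (all volumes and factors hypothetical):

```python
# Schematic midpoint characterization for freshwater use:
# impact = sum over basins of (consumptive use x scarcity factor).
# All volumes and characterization factors are hypothetical.
inventory_m3 = {"basin_A": 120.0, "basin_B": 40.0}   # consumptive water use per basin
scarcity_cf = {"basin_A": 0.9, "basin_B": 0.1}       # dimensionless scarcity indices

impact = sum(vol * scarcity_cf[basin] for basin, vol in inventory_m3.items())
print(f"scarcity-weighted water consumption: {impact:.1f} m3-eq")   # 112.0 m3-eq
```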

Journal ArticleDOI
TL;DR: In this paper, a tank-to-wheel (TTW) analysis of a series plug-in hybrid electric bus operated in Gothenburg, Sweden, is presented, where the bus line and the powertrain model are described.

Journal ArticleDOI
TL;DR: A novel isolated high-power three-phase battery charger based on a split-phase permanent-magnet motor and its winding configuration is presented; it is a bidirectional high-power charger with unity power factor operation capability and high efficiency.
Abstract: For vehicles using grid power to charge the battery, traction circuit components are not engaged during the charging time, so there is a possibility to use them in the charger circuit to obtain an onboard integrated charger. The battery charger can be galvanically isolated or non-isolated from the grid. Different examples of isolated and non-isolated integrated chargers are reviewed and explained. Moreover, a novel isolated high-power three-phase battery charger based on a split-phase permanent-magnet motor and its winding configuration is presented in this paper. The proposed charger is a bidirectional high-power charger with a unity power factor operation capability that has high efficiency.

Journal ArticleDOI
TL;DR: In this paper, the first counts of faint submillimetre galaxies (SMGs) in the 870-μm band derived from arcsecond-resolution observations with the Atacama Large Millimeter Array (ALMA) were reported.
Abstract: We report the first counts of faint submillimetre galaxies (SMGs) in the 870-μm band derived from arcsecond-resolution observations with the Atacama Large Millimeter Array (ALMA). We have used ALMA to map a sample of 122 870-μm-selected submillimetre sources drawn from the 0.5° × 0.5° Large APEX BOlometer CAmera (LABOCA) Extended Chandra Deep Field South submillimetre survey (LESS). These ALMA maps have an average depth of σ_870μm ~ 0.4 mJy, some approximately three times deeper than the original LABOCA survey, and critically the angular resolution is more than an order of magnitude higher, FWHM of ~1.5 arcsec compared to ~19 arcsec for the LABOCA discovery map. This combination of sensitivity and resolution allows us to precisely pinpoint the SMGs contributing to the submillimetre sources from the LABOCA map, free from the effects of confusion. We show that our ALMA-derived SMG counts broadly agree with the submillimetre source counts from previous, lower resolution single-dish surveys, demonstrating that the bulk of the submillimetre sources are not caused by blending of unresolved SMGs. The difficulty which well-constrained theoretical models have in reproducing the high surface densities of SMGs thus remains. However, our observations do show that all of the very brightest sources in the LESS sample, S_870μm ≳ 12 mJy, comprise emission from multiple, fainter SMGs, each with 870-μm fluxes ≲ 9 mJy. This implies a natural limit to the star formation rate in SMGs of ≲ 10^3 M_⊙ yr^-1, which in turn suggests that the space density of z > 1 galaxies with gas masses in excess of ~5 × 10^10 M_⊙ is <10^-5 Mpc^-3. We also discuss the influence of this blending on the identification and characterization of the SMG counterparts to these bright submillimetre sources and suggest that it may be responsible for previous claims that they lie at higher redshifts than fainter SMGs.

Journal ArticleDOI
TL;DR: Recent scientific progress in metabolic engineering of S. cerevisiae for the production of bioethanol, advanced biofuels, and chemicals is reviewed.

Journal ArticleDOI
TL;DR: It is demonstrated that many aspects of nuclear structure can be understood in terms of this nucleon-nucleon interaction, without explicitly invoking three-nucleon forces.
Abstract: We optimize the nucleon-nucleon interaction from chiral effective field theory at next-to-next-to-leading order (NNLO). The resulting new chiral force NNLOopt yields χ² ≈ 1 per degree of freedom for laboratory energies below approximately 125 MeV. In the A = 3, 4 nucleon systems, the contributions of three-nucleon forces are smaller than for previous parametrizations of chiral interactions. We use NNLOopt to study properties of key nuclei and neutron matter, and we demonstrate that many aspects of nuclear structure can be understood in terms of this nucleon-nucleon interaction, without explicitly invoking three-nucleon forces.

Journal ArticleDOI
01 Apr 2013
TL;DR: The first ESA S-class mission CHEOPS (CHaracterizing ExoPlanet Satellite) will fill this gap by performing ultra-high precision photometric monitoring of selected bright target stars almost anywhere on the sky, with sufficient precision to detect Earth-sized transits.
Abstract: Ground based radial velocity (RV) searches continue to discover exoplanets below Neptune mass down to Earth mass. Furthermore, ground based transit searches now reach milli-mag photometric precision and can discover Neptune size planets around bright stars. These searches will find exoplanets around bright stars anywhere on the sky, their discoveries representing prime science targets for further study due to the proximity and brightness of their host stars. A mission for transit follow-up measurements of these prime targets is currently lacking. The first ESA S-class mission CHEOPS (CHaracterizing ExoPlanet Satellite) will fill this gap. It will perform ultra-high precision photometric monitoring of selected bright target stars almost anywhere on the sky with sufficient precision to detect Earth-sized transits. It will be able to detect transits of RV-planets by photometric monitoring if the geometric configuration results in a transit. For Hot Neptunes discovered from the ground, CHEOPS will be able to improve the transit light curve so that the radius can be determined precisely. Because of the host stars' brightness, high precision RV measurements will be possible for all targets. All planets observed in transit by CHEOPS will be validated and their masses will be known. This will provide valuable data for constraining the mass-radius relation of exoplanets, especially in the Neptune-mass regime. During the planned 3.5-year mission, about 500 targets will be observed. There will be 20% of open time available for the community to develop new science programmes.
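"Sufficient precision to detect Earth-sized transits" can be quantified from the geometric transit depth (Rp/Rs)²; for an Earth-sized planet crossing a Sun-like star this is roughly 84 ppm, which sets the photometric requirement:

```python
# Transit depth for an Earth-sized planet crossing a Sun-like star:
# depth = (Rp / Rs)^2 ~ 84 ppm. This back-of-envelope number, using
# standard radii, indicates the photometric precision such a mission
# must reach; it is not a figure quoted from the paper.
R_EARTH = 6.371e6    # Earth radius [m]
R_SUN = 6.957e8      # solar radius [m]

depth = (R_EARTH / R_SUN) ** 2
print(f"transit depth ~ {depth * 1e6:.0f} ppm")   # ~84 ppm
```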

Journal ArticleDOI
TL;DR: It is demonstrated that the platform cell factory can be used to improve the production of α-santalene, a plant sesquiterpene that can be used as a perfume, by four-fold.


Journal ArticleDOI
TL;DR: In this article, the DUNES survey aims at detecting extra-solar analogues to the Edgeworth-Kuiper belt around solar-type stars, thereby putting the solar system into context.
Abstract: Context. Debris discs are a consequence of the planet formation process and constitute the fingerprints of planetesimal systems. Their solar system counterparts are the asteroid and Edgeworth-Kuiper belts. Aims. The DUNES survey aims at detecting extra-solar analogues to the Edgeworth-Kuiper belt around solar-type stars, thereby putting the solar system into context. The survey allows us to address some questions related to the prevalence and properties of planetesimal systems. Methods. We used Herschel/PACS to observe a sample of nearby FGK stars. Data at 100 and 160 μm were obtained, complemented in some cases with observations at 70 μm, and at 250, 350 and 500 μm using SPIRE. The observing strategy was to integrate as deep as possible at 100 μm to detect the stellar photosphere. Results. Debris discs have been detected at a fractional luminosity level down to several times that of the Edgeworth-Kuiper belt. The incidence rate of discs around the DUNES stars is increased from a rate of ~12.1% ± 5% before Herschel to ~20.2% ± 2%. A significant fraction (~52%) of the discs are resolved, which represents an enormous step ahead from the previously known resolved discs. Some stars are associated with faint far-IR excesses attributed to a new class of cold discs. Although it cannot be excluded that these excesses are produced by coincidental alignment of background galaxies, statistical arguments suggest that at least some of them are true debris discs. Some discs display peculiar SEDs with spectral indices in the 70-160 μm range steeper than the Rayleigh-Jeans one. An analysis of the debris disc parameters suggests that the mean blackbody radius might decrease from the F-type to the K-type stars. In addition, a weak trend is suggested for a correlation of disc sizes and an anticorrelation of disc temperatures with the stellar age.