
Showing papers from the University of Bremen published in 2016


Journal ArticleDOI
TL;DR: Gaia as discussed by the authors is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach.
Abstract: Gaia is a cornerstone mission in the science programme of the European Space Agency (ESA). The spacecraft construction was approved in 2006, following a study in which the original interferometric concept was changed to a direct-imaging approach. Both the spacecraft and the payload were built by European industry. The involvement of the scientific community focusses on data processing for which the international Gaia Data Processing and Analysis Consortium (DPAC) was selected in 2007. Gaia was launched on 19 December 2013 and arrived at its operating point, the second Lagrange point of the Sun-Earth-Moon system, a few weeks later. The commissioning of the spacecraft and payload was completed on 19 July 2014. The nominal five-year mission started with four weeks of special, ecliptic-pole scanning and subsequently transferred into full-sky scanning mode. We recall the scientific goals of Gaia and give a description of the as-built spacecraft that is currently (mid-2016) being operated to achieve these goals. We pay special attention to the payload module, the performance of which is closely related to the scientific performance of the mission. We provide a summary of the commissioning activities and findings, followed by a description of the routine operational mode. We summarise scientific performance estimates on the basis of in-orbit operations. Several intermediate Gaia data releases are planned and the data can be retrieved from the Gaia Archive, which is available through the Gaia home page.

5,164 citations


Journal ArticleDOI
TL;DR: The first Gaia data release, Gaia DR1 as discussed by the authors, consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues.
Abstract: Context. At about 1000 days after the launch of Gaia we present the first Gaia data release, Gaia DR1, consisting of astrometry and photometry for over 1 billion sources brighter than magnitude 20.7. Aims: A summary of Gaia DR1 is presented along with illustrations of the scientific quality of the data, followed by a discussion of the limitations due to the preliminary nature of this release. Methods: The raw data collected by Gaia during the first 14 months of the mission have been processed by the Gaia Data Processing and Analysis Consortium (DPAC) and turned into an astrometric and photometric catalogue. Results: Gaia DR1 consists of three components: a primary astrometric data set which contains the positions, parallaxes, and mean proper motions for about 2 million of the brightest stars in common with the Hipparcos and Tycho-2 catalogues - a realisation of the Tycho-Gaia Astrometric Solution (TGAS) - and a secondary astrometric data set containing the positions for an additional 1.1 billion sources. The second component is the photometric data set, consisting of mean G-band magnitudes for all sources. The G-band light curves and the characteristics of 3000 Cepheid and RR Lyrae stars, observed at high cadence around the south ecliptic pole, form the third component. For the primary astrometric data set the typical uncertainty is about 0.3 mas for the positions and parallaxes, and about 1 mas yr-1 for the proper motions. A systematic component of 0.3 mas should be added to the parallax uncertainties. For the subset of 94 000 Hipparcos stars in the primary data set, the proper motions are much more precise at about 0.06 mas yr-1. For the secondary astrometric data set, the typical uncertainty of the positions is 10 mas. The median uncertainties on the mean G-band magnitudes range from the mmag level to 0.03 mag over the magnitude range 5 to 20.7. Conclusions: Gaia DR1 is an important milestone ahead of the next Gaia data release, which will feature five-parameter astrometry for all sources. Extensive validation shows that Gaia DR1 represents a major advance in the mapping of the heavens and the availability of basic stellar data that underpin observational astrophysics. Nevertheless, the very preliminary nature of this first Gaia data release does lead to a number of important limitations to the data quality which should be carefully considered before drawing conclusions from the data.
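
The TGAS subset and the Gaia Archive mentioned above can be queried programmatically. A minimal sketch in Python, assuming the astroquery package and the archive's gaiadr1.tgas_source table (neither is named in the abstract):

# Minimal sketch: query a few bright TGAS sources from the Gaia Archive (assumes astroquery is installed).
from astroquery.gaia import Gaia

query = """
SELECT TOP 10 source_id, ra, dec, parallax, parallax_error, pmra, pmdec, phot_g_mean_mag
FROM gaiadr1.tgas_source
WHERE parallax_error < 0.3
ORDER BY phot_g_mean_mag
"""
job = Gaia.launch_job(query)   # synchronous TAP job against the Gaia Archive
table = job.get_results()      # returns an astropy Table
print(table)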

2,174 citations


Journal ArticleDOI
TL;DR: In this article, the authors present an overview of available machine learning techniques, structure this rather complicated area, and lay a special focus on the potential benefits and examples of successful applications in a manufacturing environment.
Abstract: Manufacturing systems face ever more complex, dynamic and at times even chaotic behaviors. In order to satisfy the demand for high-quality products in an efficient manner, it is essential to utilize all means available. One area that has seen fast-paced developments in terms of not only promising results but also usability is machine learning. Promising an answer to many of the old and new challenges of manufacturing, machine learning is widely discussed by researchers and practitioners alike. However, the field is very broad and even confusing, which presents a challenge and a barrier hindering wide application. This paper therefore contributes by presenting an overview of available machine learning techniques and structuring this rather complicated area. A special focus is laid on the potential benefits of, and examples of successful applications in, a manufacturing environment.

745 citations


Journal ArticleDOI
TL;DR: This article presents the PHY and MAC layer solutions developed within METIS to address the main challenge in mMTC: scalable and efficient connectivity for a massive number of devices sending very short packets.
Abstract: Machine-type communications (MTC) are expected to play an essential role within future 5G systems. In the FP7 project METIS, MTC has been further classified into mMTC and uMTC. While mMTC is about wireless connectivity to tens of billions of machine-type terminals, uMTC is about availability, low latency, and high reliability. The main challenge in mMTC is scalable and efficient connectivity for a massive number of devices sending very short packets, which is not addressed adequately in cellular systems designed for human-type communications. Furthermore, mMTC solutions need to enable wide area coverage and deep indoor penetration while having low cost and being energy-efficient. In this article, we introduce the PHY and MAC layer solutions developed within METIS to address this challenge.

702 citations


Proceedings Article
04 Nov 2016
TL;DR: In this paper, the authors propose to augment deep neural networks with a small "detector" subnetwork which is trained on the binary classification task of distinguishing genuine data from data containing adversarial perturbations.
Abstract: Machine learning and deep learning in particular has advanced tremendously on perceptual tasks in recent years. However, it remains vulnerable against adversarial perturbations of the input that have been crafted specifically to fool the system while being quasi-imperceptible to a human. In this work, we propose to augment deep neural networks with a small "detector" subnetwork which is trained on the binary classification task of distinguishing genuine data from data containing adversarial perturbations. Our method is orthogonal to prior work on addressing adversarial perturbations, which has mostly focused on making the classification network itself more robust. We show empirically that adversarial perturbations can be detected surprisingly well even though they are quasi-imperceptible to humans. Moreover, while the detectors have been trained to detect only a specific adversary, they generalize to similar and weaker adversaries. In addition, we propose an adversarial attack that fools both the classifier and the detector and a novel training procedure for the detector that counteracts this attack.
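
As a rough illustration of the detector idea described above, the following PyTorch sketch attaches a small binary head to (placeholder) intermediate features of a frozen classifier; the layer sizes, attachment point, and training data here are illustrative assumptions, not the authors' architecture or setup.

import torch
import torch.nn as nn

class Detector(nn.Module):
    """Small subnetwork that classifies intermediate features as genuine (0) or adversarial (1)."""
    def __init__(self, feat_dim: int):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feat_dim, 128), nn.ReLU(),
            nn.Linear(128, 1),
        )

    def forward(self, features):
        return self.head(features)   # raw logit; paired with BCEWithLogitsLoss below

# Training sketch: `features` would come from a frozen, pretrained classifier;
# label 1 marks inputs carrying adversarial perturbations, 0 marks clean inputs.
detector = Detector(feat_dim=512)
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(detector.parameters(), lr=1e-3)

features = torch.randn(32, 512)                  # placeholder batch of intermediate activations
labels = torch.randint(0, 2, (32, 1)).float()    # placeholder clean/adversarial labels
loss = criterion(detector(features), labels)
optimizer.zero_grad(); loss.backward(); optimizer.step()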

570 citations


Journal ArticleDOI
TL;DR: It is found that Internet use is strongly skewed in this age group, leading to a partial exclusion of the old seniors (70+). Logistic regression shows that gender differences in usage disappear if controlled for education, income, technical interest, pre-retirement computer use and marital status.
Abstract: The diffusion of the Internet is reaching a level between 80% and 90% in Western societies. Yet, while the digital divide is closing for young cohorts, it is still an issue when comparing various generations. This study focuses specifically on the so-called ‘grey divide’, a divide among seniors of age 65+ years. Based on a representative survey in Switzerland (N = 1105), it is found that Internet use is strongly skewed in this age group leading to a partial exclusion of the old seniors (70+). Logistic regression shows that gender differences in usage disappear if controlled for education, income, technical interest, pre-retirement computer use and marital status. Furthermore, the social context appears to have a manifold influence on Internet use. Encouragement by family and friends is a strong predictor for Internet use, and private learning settings are preferred over professional courses. Implications for digital inequality initiatives and further research are discussed.
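
For readers unfamiliar with the statistical step reported above, the following Python sketch runs a logistic regression of Internet use on gender plus the listed controls using statsmodels; the variable names and the synthetic data generation are purely illustrative and do not reproduce the survey's coding or results.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 1105   # sample size matching the survey; the data themselves are synthetic
df = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "education": rng.integers(1, 4, n),          # e.g. 1 = low ... 3 = high (hypothetical coding)
    "income": rng.normal(0, 1, n),
    "tech_interest": rng.normal(0, 1, n),
    "preretirement_pc_use": rng.integers(0, 2, n),
    "married": rng.integers(0, 2, n),
})
# Synthetic outcome: Internet use driven by the covariates, not by gender itself
logit = (-1.0 + 0.8 * df.education + 0.5 * df.income + 0.7 * df.tech_interest
         + 1.2 * df.preretirement_pc_use + 0.3 * df.married)
df["internet_use"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit(
    "internet_use ~ female + education + income + tech_interest + preretirement_pc_use + married",
    data=df,
).fit(disp=False)
print(model.summary())   # with the controls included, the 'female' coefficient should be near zero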

558 citations


Journal ArticleDOI
29 Sep 2016-Nature
TL;DR: A global map of abundant, double-stranded DNA viruses, complete with genomic and ecological contexts, is presented, providing a necessary foundation for the meaningful integration of viruses into ecosystem models where they act as key players in nutrient cycling and trophic networks.
Abstract: Ocean microbes drive biogeochemical cycling on a global scale. However, this cycling is constrained by viruses that affect community composition, metabolic activity, and evolutionary trajectories. Owing to challenges with the sampling and cultivation of viruses, genome-level viral diversity remains poorly described and grossly understudied, with less than 1% of observed surface-ocean viruses known. Here we assemble complete genomes and large genomic fragments from both surface- and deep-ocean viruses sampled during the Tara Oceans and Malaspina research expeditions, and analyse the resulting 'global ocean virome' dataset to present a global map of abundant, double-stranded DNA viruses complete with genomic and ecological contexts. A total of 15,222 epipelagic and mesopelagic viral populations were identified, comprising 867 viral clusters (defined as approximately genus-level groups). This roughly triples the number of known ocean viral populations and doubles the number of candidate bacterial and archaeal virus genera, providing a near-complete sampling of epipelagic communities at both the population and viral-cluster level. We found that 38 of the 867 viral clusters were locally or globally abundant, together accounting for nearly half of the viral populations in any global ocean virome sample. While two-thirds of these clusters represent newly described viruses lacking any cultivated representative, most could be computationally linked to dominant, ecologically relevant microbial hosts. Moreover, we identified 243 viral-encoded auxiliary metabolic genes, of which only 95 were previously known. Deeper analyses of four of these auxiliary metabolic genes (dsrC, soxYZ, P-II (also known as glnB) and amoC) revealed that abundant viruses may directly manipulate sulfur and nitrogen cycling throughout the epipelagic ocean. This viral catalog and functional analyses provide a necessary foundation for the meaningful integration of viruses into ecosystem models where they act as key players in nutrient cycling and trophic networks.

557 citations


Journal ArticleDOI
TL;DR: The GLODAPv2 data product as discussed by the authors is composed of data from 724 scientific cruises covering the global ocean, combining data assembled during GLODAPv1.1 (2004), CARINA (CARbon IN the Atlantic, 2009/2010), and PACIFICA (PACIFic ocean Interior CArbon, 2013) with data from an additional 168 cruises.
Abstract: . Version 2 of the Global Ocean Data Analysis Project (GLODAPv2) data product is composed of data from 724 scientific cruises covering the global ocean. It includes data assembled during the previous efforts GLODAPv1.1 (Global Ocean Data Analysis Project version 1.1) in 2004, CARINA (CARbon IN the Atlantic) in 2009/2010, and PACIFICA (PACIFic ocean Interior CArbon) in 2013, as well as data from an additional 168 cruises. Data for 12 core variables (salinity, oxygen, nitrate, silicate, phosphate, dissolved inorganic carbon, total alkalinity, pH, CFC-11, CFC-12, CFC-113, and CCl4) have been subjected to extensive quality control, including systematic evaluation of bias. The data are available in two formats: (i) as submitted but updated to WOCE exchange format and (ii) as a merged and internally consistent data product. In the latter, adjustments have been applied to remove significant biases, respecting occurrences of any known or likely time trends or variations. Adjustments applied by previous efforts were re-evaluated. Hence, GLODAPv2 is not a simple merging of previous products with some new data added but a unique, internally consistent data product. This compiled and adjusted data product is believed to be consistent to better than 0.005 in salinity, 1 % in oxygen, 2 % in nitrate, 2 % in silicate, 2 % in phosphate, 4 µmol kg−1 in dissolved inorganic carbon, 6 µmol kg−1 in total alkalinity, 0.005 in pH, and 5 % for the halogenated transient tracers. The original data and their documentation and doi codes are available at the Carbon Dioxide Information Analysis Center ( http://cdiac.ornl.gov/oceans/GLODAPv2/ ). This site also provides access to the calibrated data product, which is provided as a single global file or four regional ones – the Arctic, Atlantic, Indian, and Pacific oceans – under the doi:10.3334/CDIAC/OTG.NDP093_GLODAPv2 . The product files also include significant ancillary and approximated data. These were obtained by interpolation of, or calculation from, measured data. This paper documents the GLODAPv2 methods and products and includes a broad overview of the secondary quality control results. The magnitude of and reasoning behind each adjustment is available on a per-cruise and per-variable basis in the online Adjustment Table.

415 citations


Journal ArticleDOI
TL;DR: This contribution investigates a new paradigm from machine learning, namely deep machine learning, by examining design configurations of deep Convolutional Neural Networks and the impact of different hyper-parameter settings on the accuracy of defect detection results.
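
Only the TL;DR is shown for this entry, so the PyTorch sketch below merely illustrates the kind of hyper-parameters such a study would vary (filter counts, kernel size, dropout); the specific values and architecture are assumptions, not the paper's configurations.

import torch
import torch.nn as nn

class DefectCNN(nn.Module):
    """Toy defect classifier; the constructor arguments are the knobs such studies sweep."""
    def __init__(self, n_classes=2, filters=(32, 64), kernel_size=3, dropout=0.5):
        super().__init__()
        c1, c2 = filters
        self.features = nn.Sequential(
            nn.Conv2d(1, c1, kernel_size, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(c1, c2, kernel_size, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Dropout(dropout),
            nn.LazyLinear(n_classes),   # infers the flattened feature size at the first call
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = DefectCNN()
logits = model(torch.randn(8, 1, 64, 64))   # batch of 8 grayscale 64x64 surface patches
print(logits.shape)                         # torch.Size([8, 2])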

409 citations


Journal ArticleDOI
TL;DR: It is shown by atomistic simulations that a substantial part of the green gap in c-plane InGaN/GaN-based light emitting diodes may be attributed to a decrease in the radiative recombination coefficient with increasing indium content due to random fluctuations of the indium concentration naturally present in any InGaN alloy.
Abstract: White light emitting diodes (LEDs) based on III-nitride InGaN/GaN quantum wells currently offer the highest overall efficiency for solid state lighting applications. Although current phosphor-converted white LEDs have high electricity-to-light conversion efficiencies, it has been recently pointed out that the full potential of solid state lighting could be exploited only by color mixing approaches without employing phosphor-based wavelength conversion. Such an approach requires direct emitting LEDs of different colors, including, in particular, the green-yellow range of the visible spectrum. This range, however, suffers from a systematic drop in efficiency, known as the "green gap," whose physical origin has not been understood completely so far. In this work, we show by atomistic simulations that a substantial part of the green gap in c-plane InGaN/GaN-based light emitting diodes may be attributed to a decrease in the radiative recombination coefficient with increasing indium content due to random fluctuations of the indium concentration naturally present in any InGaN alloy.
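
As background for the role of the radiative recombination coefficient discussed above, the standard ABC recombination model (not spelled out in the abstract) links it to the internal quantum efficiency; a lower B at a given carrier density n reduces the efficiency:

\[
R_{\mathrm{rad}} = B\,n\,p \approx B\,n^{2}, \qquad
\eta_{\mathrm{IQE}} = \frac{B n^{2}}{A n + B n^{2} + C n^{3}},
\]
where A, B, and C are the Shockley-Read-Hall, radiative, and Auger recombination coefficients, respectively.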

364 citations


Journal ArticleDOI
TL;DR: In this article, the authors identify eleven interglacials in the last 800,000 years based on a sea level definition, a result that is robust to alternative definitions. The onset of an interglacial (glacial termination) seems to require a reducing precession parameter (increasing Northern Hemisphere summer insolation), but this condition alone is insufficient.
Abstract: Interglacials, including the present (Holocene) period, are warm, low land ice extent (high sea level), end-members of glacial cycles. Based on a sea level definition, we identify eleven interglacials in the last 800,000 years, a result that is robust to alternative definitions. Data compilations suggest that despite spatial heterogeneity, Marine Isotope Stages (MIS) 5e (last interglacial) and 11c (~400 ka ago) were globally strong (warm), while MIS 13a (~500 ka ago) was cool at many locations. A step change in strength of interglacials at 450 ka is apparent only in atmospheric CO2 and in Antarctic and deep ocean temperature. The onset of an interglacial (glacial termination) seems to require a reducing precession parameter (increasing Northern Hemisphere summer insolation), but this condition alone is insufficient. Terminations involve rapid, nonlinear, reactions of ice volume, CO2, and temperature to external astronomical forcing. The precise timing of events may be modulated by millennial-scale climate change that can lead to a contrasting timing of maximum interglacial intensity in each hemisphere. A variety of temporal trends is observed, such that maxima in the main records are observed either early or late in different interglacials. The end of an interglacial (glacial inception) is a slower process involving a global sequence of changes. Interglacials have been typically 10–30 ka long. The combination of minimal reduction in northern summer insolation over the next few orbital cycles, owing to low eccentricity, and high atmospheric greenhouse gas concentrations implies that the next glacial inception is many tens of millennia in the future.
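
For reference, the precession parameter invoked above is conventionally built from the orbital eccentricity e and the longitude of perihelion ϖ (a standard definition not restated in the abstract; sign conventions vary between authors):

\[
p_{\mathrm{prec}} = e \sin\varpi ,
\]
and in the convention used by the abstract, Northern Hemisphere summer insolation increases as this parameter decreases.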

Journal ArticleDOI
TL;DR: A review of the advances in stratospheric aerosol research can be found in this article, with a focus on the agreement between in situ and space-based inferences of aerosol properties during volcanically quiescent periods.
Abstract: Interest in stratospheric aerosol and its role in climate have increased over the last decade due to the observed increase in stratospheric aerosol since 2000 and the potential for changes in the sulfur cycle induced by climate change. This review provides an overview about the advances in stratospheric aerosol research since the last comprehensive assessment of stratospheric aerosol was published in 2006. A crucial development since 2006 is the substantial improvement in the agreement between in situ and space-based inferences of stratospheric aerosol properties during volcanically quiescent periods. Furthermore, new measurement systems and techniques, both in situ and space based, have been developed for measuring physical aerosol properties with greater accuracy and for characterizing aerosol composition. However, these changes induce challenges to constructing a long-term stratospheric aerosol climatology. Currently, changes in stratospheric aerosol levels less than 20% cannot be confidently quantified. The volcanic signals tend to mask any nonvolcanically driven change, making them difficult to understand. While the role of carbonyl sulfide as a substantial and relatively constant source of stratospheric sulfur has been confirmed by new observations and model simulations, large uncertainties remain with respect to the contribution from anthropogenic sulfur dioxide emissions. New evidence has been provided that stratospheric aerosol can also contain small amounts of nonsulfate matter such as black carbon and organics. Chemistry-climate models have substantially increased in quantity and sophistication. In many models the implementation of stratospheric aerosol processes is coupled to radiation and/or stratospheric chemistry modules to account for relevant feedback processes.

Journal ArticleDOI
TL;DR: The GLODAPv2.2016b data set, as discussed by the authors, was used to create global 1° × 1° mapped climatologies of salinity, temperature, oxygen, nitrate, phosphate, silicate, total dissolved inorganic carbon (TCO2), total alkalinity (TAlk), pH, and CaCO3 saturation states.
Abstract: . We present a mapped climatology (GLODAPv2.2016b) of ocean biogeochemical variables based on the new GLODAP version 2 data product (Olsen et al., 2016; Key et al., 2015), which covers all ocean basins over the years 1972 to 2013. The quality-controlled and internally consistent GLODAPv2 was used to create global 1° × 1° mapped climatologies of salinity, temperature, oxygen, nitrate, phosphate, silicate, total dissolved inorganic carbon (TCO2), total alkalinity (TAlk), pH, and CaCO3 saturation states using the Data-Interpolating Variational Analysis (DIVA) mapping method. Improving on maps based on an earlier but similar dataset, GLODAPv1.1, this climatology also covers the Arctic Ocean. Climatologies were created for 33 standard depth surfaces. The conceivably confounding temporal trends in TCO2 and pH due to anthropogenic influence were removed prior to mapping by normalizing these data to the year 2002 using first-order calculations of anthropogenic carbon accumulation rates. We additionally provide maps of accumulated anthropogenic carbon in the year 2002 and of preindustrial TCO2. For all parameters, all data from the full 1972–2013 period were used, including data that did not receive full secondary quality control. The GLODAPv2.2016b global 1° × 1° mapped climatologies, including error fields and ancillary information, are available at the GLODAPv2 web page at the Carbon Dioxide Information Analysis Center (CDIAC; doi:10.3334/CDIAC/OTG.NDP093_GLODAPv2 ).
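
The normalization to the reference year 2002 described above can be sketched, to first order and with symbols of our choosing (the abstract does not give an explicit formula), as:

\[
C_{\mathrm{T}}^{2002} = C_{\mathrm{T}}(t) - r_{\mathrm{ant}}\,(t - 2002),
\]
where \(C_{\mathrm{T}}(t)\) is the measured total dissolved inorganic carbon in sampling year t and \(r_{\mathrm{ant}}\) is the locally estimated anthropogenic carbon accumulation rate (in µmol kg⁻¹ yr⁻¹); pH is treated analogously.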

Journal ArticleDOI
TL;DR: A review of 917 relative sea-level (RSL) data-points has resulted in the first quality-controlled database constraining the Holocene sea level histories of the western Mediterranean Sea (Spain, France, Italy, Slovenia, Croatia, Malta and Tunisia) as discussed by the authors.

Journal ArticleDOI
TL;DR: An overview of the current progress in salinity gradient power generation is presented, the prospects and challenges of the foremost technologies - pressure retarded osmosis (PRO), reverse electrodialysis (RED), and capacitive mixing (CapMix) - are discussed, and perspectives on the outlook of salinity gradient power generation are provided.
Abstract: Combining two solutions of different composition releases the Gibbs free energy of mixing. By using engineered processes to control the mixing, chemical energy stored in salinity gradients can be harnessed for useful work. In this critical review, we present an overview of the current progress in salinity gradient power generation, discuss the prospects and challenges of the foremost technologies - pressure retarded osmosis (PRO), reverse electrodialysis (RED), and capacitive mixing (CapMix) - and provide perspectives on the outlook of salinity gradient power generation. Momentous strides have been made in technical development of salinity gradient technologies and field demonstrations with natural and anthropogenic salinity gradients (for example, seawater-river water and desalination brine-wastewater, respectively), but fouling remains a pivotal operational challenge that can significantly ebb away cost-competitiveness. Natural hypersaline sources (e.g., hypersaline lakes and salt domes) can achieve greater concentration difference and, thus, offer opportunities to overcome some of the limitations inherent to seawater-river water. Technological advances needed to fully exploit the larger salinity gradients are identified. While seawater desalination brine is a seemingly attractive high salinity anthropogenic stream that is otherwise wasted, actual feasibility hinges on the appropriate pairing with a suitable low salinity stream. Engineered solutions are foulant-free and can be thermally regenerative for application in low-temperature heat utilization. Alternatively, PRO, RED, and CapMix can be coupled with their analog separation process (reverse osmosis, electrodialysis, and capacitive deionization, respectively) in salinity gradient flow batteries for energy storage in chemical potential of the engineered solutions. Rigorous techno-economic assessments can more clearly identify the prospects of low-grade heat conversion and large-scale energy storage. While research attention is squarely focused on efficiency and power improvements, efforts to mitigate fouling and lower membrane and electrode cost will be equally important to reduce levelized cost of salinity gradient energy production and, thus, boost PRO, RED, and CapMix power generation to be competitive with other renewable technologies. Cognizance of the recent key developments and technical progress on the different technological fronts can help steer the strategic advancement of salinity gradient as a sustainable energy source.
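
For reference, the Gibbs free energy of mixing invoked above takes, for an ideal solution, the textbook form (not specific to this review):

\[
\Delta G_{\mathrm{mix}} = RT \sum_i n_i \ln x_i ,
\]
and the maximum work recoverable from controlled mixing of a concentrated and a dilute stream is \(W_{\max} = (G_{\mathrm{conc}} + G_{\mathrm{dilute}}) - G_{\mathrm{mixed}}\), a fraction of which PRO, RED, and CapMix convert into useful work.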

Journal ArticleDOI
TL;DR: In this article, the first major release of the OCO2 retrieval algorithm (B7r) and X_(CO2) from OCO-2's primary ground-based validation network: the Total Carbon Column Observing Network (TCCON) were compared.
Abstract: NASA's Orbiting Carbon Observatory-2 (OCO-2) has been measuring carbon dioxide column-averaged dry-air mole fraction, X_(CO_2), in the Earth's atmosphere for over 2 years. In this paper, we describe the comparisons between the first major release of the OCO-2 retrieval algorithm (B7r) and X_(CO_2) from OCO-2's primary ground-based validation network: the Total Carbon Column Observing Network (TCCON). The OCO-2 X_(CO_2) retrievals, after filtering and bias correction, agree well when aggregated around and coincident with TCCON data in nadir, glint, and target observation modes, with absolute median differences less than 0.4 ppm and RMS differences less than 1.5 ppm. After bias correction, residual biases remain. These biases appear to depend on latitude, surface properties, and scattering by aerosols. It is thus crucial to continue measurement comparisons with TCCON to monitor and evaluate the OCO-2 X_(CO_2) data quality throughout its mission.
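
A minimal Python sketch of the coincident-comparison statistics quoted above (median and RMS of paired X_(CO_2) differences); the numbers are placeholders, not actual OCO-2 or TCCON data:

import numpy as np

# Placeholder arrays of coincident, bias-corrected XCO2 (ppm); real work would aggregate
# OCO-2 soundings around each TCCON station and pair them by coincidence criteria.
xco2_oco2 = np.array([400.1, 401.3, 399.8, 402.0, 400.6])
xco2_tccon = np.array([400.4, 401.0, 400.2, 401.6, 400.9])

diff = xco2_oco2 - xco2_tccon
print("median difference [ppm]:", np.median(diff))
print("RMS difference   [ppm]:", np.sqrt(np.mean(diff ** 2)))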

Journal ArticleDOI
TL;DR: There is no evidence for a post-1980s return to well-oxygenated lacustrine conditions in industrialized countries despite the implementation of restoration programs, and it is shown that the increase of human activities and nutrient release is leading to hypoxia onset.
Abstract: The spread of hypoxia is a threat to aquatic ecosystem functions and services as well as to biodiversity. However, sparse long-term monitoring of lake ecosystems has prevented reconstruction of global hypoxia dynamics while inhibiting investigations into its causes and assessing the resilience capacity of these systems. This study compiles the onset and duration of hypoxia recorded in sediments of 365 lakes worldwide since AD 1700, showing that lacustrine hypoxia started spreading before AD 1900, 70 years prior to hypoxia in coastal zones. This study also shows that the increase of human activities and nutrient release is leading to hypoxia onset. No correlations were found with changes in precipitation or temperature. There is no evidence for a post-1980s return to well-oxygenated lacustrine conditions in industrialized countries despite the implementation of restoration programs. The apparent establishment of stable hypoxic conditions prior to AD 1900 highlights the challenges of a growing nutrient demand, accompanied by increasing global nutrient emissions of our industrialized societies, and climate change.

Journal ArticleDOI
TL;DR: Foundations and technologies required for continuous maintenance within the Industry 4.0 context are presented, and the role of IoT, standards and cyber security is identified.
Abstract: High value and long life products require continuous maintenance throughout their life cycle to achieve required performance with optimum through-life cost. This paper presents foundations and technologies required to offer the maintenance service. Component and system level degradation science, assessment and modelling, along with life cycle ‘big data’ analytics, are the two most important knowledge and skill bases required for continuous maintenance. Advanced computing and visualisation technologies will improve the efficiency of maintenance and reduce the through-life cost of the product. The future of continuous maintenance within the Industry 4.0 context is also discussed, identifying the role of IoT, standards and cyber security.

Journal ArticleDOI
TL;DR: The polar regions have been attracting more and more attention in recent years, fueled by the perceptible impacts of anthropogenic climate change as mentioned in this paper, and it is argued that environmental prediction systems for the polar regions are less developed than elsewhere.
Abstract: The polar regions have been attracting more and more attention in recent years, fueled by the perceptible impacts of anthropogenic climate change. Polar climate change provides new opportunities, such as shorter shipping routes between Europe and East Asia, but also new risks such as the potential for industrial accidents or emergencies in ice-covered seas. Here, it is argued that environmental prediction systems for the polar regions are less developed than elsewhere. There are many reasons for this situation, including the polar regions being (historically) lower priority, with fewer in situ observations, and with numerous local physical processes that are less well represented by models. By contrasting the relative importance of different physical processes in polar and lower latitudes, the need for a dedicated polar prediction effort is illustrated. Research priorities are identified that will help to advance environmental polar prediction capabilities. Examples include an improvement of the p...

Journal ArticleDOI
17 Nov 2016-Nature
TL;DR: It is shown that an anaerobic thermophilic enrichment culture composed of dense consortia of archaea and bacteria apparently uses partly similar pathways to oxidize the C4 hydrocarbon butane.
Abstract: The anaerobic formation and oxidation of methane involve unique enzymatic mechanisms and cofactors, all of which are believed to be specific for C1-compounds. Here we show that an anaerobic thermophilic enrichment culture composed of dense consortia of archaea and bacteria apparently uses partly similar pathways to oxidize the C4 hydrocarbon butane. The archaea, proposed genus 'Candidatus Syntrophoarchaeum', show the characteristic autofluorescence of methanogens, and contain highly expressed genes encoding enzymes similar to methyl-coenzyme M reductase. We detect butyl-coenzyme M, indicating archaeal butane activation analogous to the first step in anaerobic methane oxidation. In addition, Ca. Syntrophoarchaeum expresses the genes encoding β-oxidation enzymes, carbon monoxide dehydrogenase and reversible C1 methanogenesis enzymes. This allows for the complete oxidation of butane. Reducing equivalents are seemingly channelled to HotSeep-1, a thermophilic sulfate-reducing partner bacterium known from the anaerobic oxidation of methane. Genes encoding 16S rRNA and methyl-coenzyme M reductase similar to those identifying Ca. Syntrophoarchaeum were repeatedly retrieved from marine subsurface sediments, suggesting that the presented activation mechanism is naturally widespread in the anaerobic oxidation of short-chain hydrocarbons.

Journal ArticleDOI
TL;DR: The results demonstrate the sustained production of North Atlantic Deep Water under glacial conditions, indicating that southern-sourced waters were not as spatially extensive during the Last Glacial Maximum as previously believed.
Abstract: Changes in deep ocean ventilation are commonly invoked as the primary cause of lower glacial atmospheric CO2. The water mass structure of the glacial deep Atlantic Ocean and the mechanism by which it may have sequestered carbon remain elusive. Here we present neodymium isotope measurements from cores throughout the Atlantic that reveal glacial–interglacial changes in water mass distributions. These results demonstrate the sustained production of North Atlantic Deep Water under glacial conditions, indicating that southern-sourced waters were not as spatially extensive during the Last Glacial Maximum as previously believed. We demonstrate that the depleted glacial δ13C values in the deep Atlantic Ocean cannot be explained solely by water mass source changes. A greater amount of respired carbon, therefore, must have been stored in the abyssal Atlantic during the Last Glacial Maximum. We infer that this was achieved by a sluggish deep overturning cell, comprised of well-mixed northern- and southern-sourced waters.
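
The neodymium isotope measurements referred to above are conventionally reported in epsilon notation (standard definition, not restated in the abstract):

\[
\varepsilon_{\mathrm{Nd}} = \left[ \frac{\left(^{143}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{sample}}}{\left(^{143}\mathrm{Nd}/^{144}\mathrm{Nd}\right)_{\mathrm{CHUR}}} - 1 \right] \times 10^{4},
\]
where CHUR is the chondritic uniform reservoir; distinct water masses carry distinct \(\varepsilon_{\mathrm{Nd}}\) signatures, which is what allows the glacial-interglacial water mass distributions to be traced.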

Journal ArticleDOI
11 Jan 2016
TL;DR: The current role of machine learning techniques in the context of surgery with a focus on surgical robotics (SR) is reviewed and a perspective on the future possibilities for enhancing the effectiveness of procedures by integrating ML in the operating room is provided.
Abstract: Advances in technology and computing play an increasingly important role in the evolution of modern surgical techniques and paradigms. This article reviews the current role of machine learning (ML) techniques in the context of surgery with a focus on surgical robotics (SR). Also, we provide a perspective on the future possibilities for enhancing the effectiveness of procedures by integrating ML in the operating room. The review is focused on ML techniques directly applied to surgery, surgical robotics, surgical training and assessment. The widespread use of ML methods in diagnosis and medical image computing is beyond the scope of the review. Searches were performed on PubMed and IEEE Xplore using combinations of keywords: ML, surgery, robotics, surgical and medical robotics, skill learning, skill analysis and learning to perceive. Studies making use of ML methods in the context of surgery are increasingly being reported. In particular, there is an increasing interest in using ML for developing tools to understand and model surgical skill and competence or to extract surgical workflow. Many researchers are beginning to integrate this understanding into the control of recent surgical robots and devices. ML is an expanding field. It is popular as it allows efficient processing of vast amounts of data for interpretation and real-time decision making. Already widely used in imaging and diagnosis, it is believed that ML will also play an important role in surgery and interventional treatments. In particular, ML could become a game changer in the conception of cognitive surgical robots. Such robots endowed with cognitive skills would assist the surgical team also on a cognitive level, for example by lowering the mental load of the team. For example, ML could help extract surgical skill, learned through demonstration by human experts, and could transfer this to robotic skills. Such intelligent surgical assistance would significantly surpass the state of the art in surgical robotics. Current devices possess no intelligence whatsoever and are merely advanced and expensive instruments.

Journal ArticleDOI
TL;DR: This review focuses on a summary of recent studies on functionalizing graphene-based materials with different biomolecules, such as DNA, peptides, proteins, enzymes, carbohydrates, and viruses.
Abstract: Graphene-based materials have attracted increasing attention due to their atomically-thick two-dimensional structures, high conductivity, excellent mechanical properties, and large specific surface areas. The combination of biomolecules with graphene-based materials offers a promising method to fabricate novel graphene–biomolecule hybrid nanomaterials with unique functions in biology, medicine, nanotechnology, and materials science. In this review, we focus on summarizing recent studies on functionalizing graphene-based materials using different biomolecules, such as DNA, peptides, proteins, enzymes, carbohydrates, and viruses. The different interactions between graphene and biomolecules at the molecular level are demonstrated and discussed in detail. In addition, the potential applications of the created graphene–biomolecule nanohybrids in drug delivery, cancer treatment, tissue engineering, biosensors, bioimaging, energy materials, and other nanotechnological applications are presented. This review will be helpful for understanding the modification of graphene with biomolecules and the interactions between graphene and biomolecules at the molecular level, and for designing functional graphene-based nanomaterials with unique properties for various applications.

Journal ArticleDOI
TL;DR: An approach is presented that combines information about the equilibrium sea level response to global warming with last century's observed contributions from the individual components to constrain projections for this century, which may lead to a better understanding of the gap between process-based and global semiempirical approaches.
Abstract: Sea level has been steadily rising over the past century, predominantly due to anthropogenic climate change. The rate of sea level rise will keep increasing with continued global warming, and, even if temperatures are stabilized through the phasing out of greenhouse gas emissions, sea level is still expected to rise for centuries. This will affect coastal areas worldwide, and robust projections are needed to assess mitigation options and guide adaptation measures. Here we combine the equilibrium response of the main sea level rise contributions with their last century's observed contribution to constrain projections of future sea level rise. Our model is calibrated to a set of observations for each contribution, and the observational and climate uncertainties are combined to produce uncertainty ranges for 21st century sea level rise. We project anthropogenic sea level rise of 28-56 cm, 37-77 cm, and 57-131 cm in 2100 for the greenhouse gas concentration scenarios RCP26, RCP45, and RCP85, respectively. Our uncertainty ranges for total sea level rise overlap with the process-based estimates of the Intergovernmental Panel on Climate Change. The "constrained extrapolation" approach generalizes earlier global semiempirical models and may therefore lead to a better understanding of the discrepancies with process-based projections.
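
As background, the earlier global semiempirical models that the "constrained extrapolation" approach generalizes typically take a form like the following; this is shown only for orientation and is not the authors' calibrated model:

\[
\frac{\mathrm{d}S}{\mathrm{d}t} = \alpha \,\bigl( T(t) - T_{0} \bigr),
\]
where S is global mean sea level, T global mean temperature, \(T_{0}\) a reference temperature at which sea level is in equilibrium, and \(\alpha\) a sensitivity calibrated to observations.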

Proceedings ArticleDOI
P. Soffitta, R. Bellazzini, Enrico Bozzo, Vadim Burwitz, +418 more (132 institutions)
TL;DR: The X-ray Imaging Polarimetry Explorer (XIPE) as discussed by the authors is a mission dedicated to X-ray Astronomy, which is in a competitive phase A as the fourth medium-size mission of ESA (M4).
Abstract: XIPE, the X-ray Imaging Polarimetry Explorer, is a mission dedicated to X-ray Astronomy. At the time of writing XIPE is in a competitive phase A as the fourth medium-size mission of ESA (M4). It promises to reopen the polarimetry window in high energy astrophysics after more than 4 decades thanks to a detector that efficiently exploits the photoelectric effect and to X-ray optics with large effective area. XIPE's uniqueness is time-, spectrally- and spatially-resolved X-ray polarimetry as a breakthrough in high energy astrophysics and fundamental physics. Indeed the payload consists of three Gas Pixel Detectors at the focus of three X-ray optics with a total effective area larger than one XMM mirror but with a low weight. The payload is compatible with the fairing of the Vega launcher. XIPE is designed as an observatory for X-ray astronomers with 75% of the time dedicated to a competitive Guest Observer program, and it is organized as a consortium across Europe with main contributions from Italy, Germany, Spain, United Kingdom, Poland, and Sweden.
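
The sensitivity of a photoelectric polarimeter such as XIPE's Gas Pixel Detectors is customarily expressed through the minimum detectable polarization at the 99% confidence level, a standard figure of merit not given in the abstract:

\[
\mathrm{MDP}_{99} = \frac{4.29}{\mu\, R_{S}} \sqrt{\frac{R_{S} + R_{B}}{T}},
\]
where \(\mu\) is the modulation factor of the detector, \(R_{S}\) and \(R_{B}\) the source and background count rates, and T the observation time; the large effective area of the optics enters through \(R_{S}\).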

Journal ArticleDOI
TL;DR: Genes encoding enzymes involved in acetate production as well as in the reductive acetyl-CoA pathway were detected in all four genomes, indicating that these Archaea are organo-heterotrophic and autotrophic acetogens.
Abstract: Investigations of the biogeochemical roles of benthic Archaea in marine sediments are hampered by the scarcity of cultured representatives. In order to determine their metabolic capacity, we reconstructed the genomic content of four widespread uncultured benthic Archaea recovered from estuary sediments at 48% to 95% completeness. Four genomic bins were found to belong to different subgroups of the former Miscellaneous Crenarchaeota Group (MCG), now called Bathyarchaeota: MCG-6, MCG-1, MCG-7/17 and MCG-15. Metabolic predictions based on gene content of the different genome bins indicate that subgroup 6 has the ability to hydrolyse extracellular plant-derived carbohydrates, and that all four subgroups can degrade detrital proteins. Genes encoding enzymes involved in acetate production as well as in the reductive acetyl–CoA pathway were detected in all four genomes, indicating that these Archaea are organo-heterotrophic and autotrophic acetogens. Genes involved in nitrite reduction were detected in all Bathyarchaeota subgroups and indicate a potential for dissimilatory nitrite reduction to ammonium. Comparing the genome content of the different Bathyarchaeota subgroups indicated preferences for distinct types of carbohydrate substrates and, implicitly, for different niches within the sedimentary environment.

Journal ArticleDOI
TL;DR: It is shown that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A–H) that are in agreement with results from neutron diffraction mostly within a single standard deviation.
Abstract: Precise and accurate structural information on hydrogen atoms is crucial to the study of energies of interactions important for crystal engineering, materials science, medicine, and pharmacy, and to the estimation of physical and chemical properties in solids. However, hydrogen atoms only scatter x-radiation weakly, so x-rays have not been used routinely to locate them accurately. Textbooks and teaching classes still emphasize that hydrogen atoms cannot be located with x-rays close to heavy elements; instead, neutron diffraction is needed. We show that, contrary to widespread expectation, hydrogen atoms can be located very accurately using x-ray diffraction, yielding bond lengths involving hydrogen atoms (A–H) that are in agreement with results from neutron diffraction mostly within a single standard deviation. The precision of the determination is also comparable between x-ray and neutron diffraction results. This has been achieved at resolutions as low as 0.8 Å using Hirshfeld atom refinement (HAR). We have applied HAR to 81 crystal structures of organic molecules and compared the A–H bond lengths with those from neutron measurements for A–H bonds sorted into bonds of the same class. We further show in a selection of inorganic compounds that hydrogen atoms can be located in bridging positions and close to heavy transition metals accurately and precisely. We anticipate that, in the future, conventional x-radiation sources at in-house diffractometers can be used routinely for locating hydrogen atoms in small molecules accurately instead of large-scale facilities such as spallation sources or nuclear reactors.

Journal ArticleDOI
12 May 2016-Nature
TL;DR: Palaeoclimatic evidence of monsoon rainfall dynamics across different regions and timescales could help to understand and predict the sensitivity and response of monsoons to various forcing mechanisms.
Abstract: Monsoons are the dominant seasonal mode of climate variability in the tropics and are critically important conveyors of atmospheric moisture and energy at a global scale. Predicting monsoons, which have profound impacts on regions that are collectively home to more than 70 per cent of Earth's population, is a challenge that is difficult to overcome by relying on instrumental data from only the past few decades. Palaeoclimatic evidence of monsoon rainfall dynamics across different regions and timescales could help us to understand and predict the sensitivity and response of monsoons to various forcing mechanisms. This evidence suggests that monsoon systems exhibit substantial regional character.

Journal ArticleDOI
TL;DR: In this paper, the authors conducted an airborne campaign in Four Corners during April 2015 with the next-generation Airborne Visible/Infrared Imaging Spectrometer (near-infrared) and Hyperspectral Thermal Emission Spectrometer (thermal infrared) imaging spectrometers to better understand the source of methane by measuring methane plumes at 1- to 3-m spatial resolution.
Abstract: Methane (CH4) impacts climate as the second strongest anthropogenic greenhouse gas and air quality by influencing tropospheric ozone levels. Space-based observations have identified the Four Corners region in the Southwest United States as an area of large CH4 enhancements. We conducted an airborne campaign in Four Corners during April 2015 with the next-generation Airborne Visible/Infrared Imaging Spectrometer (near-infrared) and Hyperspectral Thermal Emission Spectrometer (thermal infrared) imaging spectrometers to better understand the source of methane by measuring methane plumes at 1- to 3-m spatial resolution. Our analysis detected more than 250 individual methane plumes from fossil fuel harvesting, processing, and distributing infrastructures, spanning an emission range from the detection limit of ∼2 kg/h to 5 kg/h through ∼5,000 kg/h. Observed sources include gas processing facilities, storage tanks, pipeline leaks, and well pads, as well as a coal mine venting shaft. Overall, plume enhancements and inferred fluxes follow a lognormal distribution, with the top 10% emitters contributing 49 to 66% to the inferred total point source flux of 0.23 Tg/y to 0.39 Tg/y. With the observed confirmation of a lognormal emission distribution, this airborne observing strategy and its ability to locate previously unknown point sources in real time provides an efficient and effective method to identify and mitigate major emissions contributors over a wide geographic area. With improved instrumentation, this capability scales to spaceborne applications [Thompson DR, et al. (2016) Geophys Res Lett 43(12):6571–6578]. Further illustration of this potential is demonstrated with two detected, confirmed, and repaired pipeline leaks during the campaign.
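
A small Python sketch of the arithmetic behind the lognormal claim above: for a heavy-tailed (lognormal) emission distribution, the top 10% of emitters account for roughly half or more of the total flux. The shape parameter below is an illustrative assumption, not a value fitted to the campaign data.

import numpy as np

rng = np.random.default_rng(42)
emissions = rng.lognormal(mean=3.0, sigma=1.5, size=250)   # 250 plumes, kg/h; sigma chosen for illustration

sorted_em = np.sort(emissions)[::-1]        # sort plumes from largest to smallest
top10 = sorted_em[: len(sorted_em) // 10]   # the top 10% of emitters
share = top10.sum() / emissions.sum()
print(f"top 10% of emitters contribute {share:.0%} of the total flux")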

Journal ArticleDOI
TL;DR: In this paper, the effect of the electrolyte (nature and concentration), as well as the current rate, on the aging of the positive electrode during the operation of the battery is investigated by using differential charge curves.