
Showing papers by "University of Wisconsin-Madison published in 2012"


Journal ArticleDOI
TL;DR: The origins, challenges and solutions of NIH Image and ImageJ software are discussed, and how their history can serve to advise and inform other software projects.
Abstract: For the past 25 years NIH Image and ImageJ software have been pioneers as open tools for the analysis of scientific images. We discuss the origins, challenges and solutions of these two programs, and how their history can serve to advise and inform other software projects.

44,587 citations


Journal ArticleDOI
TL;DR: Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis that facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system.
Abstract: Fiji is a distribution of the popular open-source software ImageJ focused on biological-image analysis. Fiji uses modern software engineering practices to combine powerful software libraries with a broad range of scripting languages to enable rapid prototyping of image-processing algorithms. Fiji facilitates the transformation of new algorithms into ImageJ plugins that can be shared with end users through an integrated update system. We propose Fiji as a platform for productive collaboration between computer science and biology research communities.

43,540 citations


Journal ArticleDOI
TL;DR: The new version provides convergence diagnostics and allows multiple analyses to be run in parallel with convergence progress monitored on the fly, and provides more output options than previously, including samples of ancestral states, site rates, site dN/dS ratios, branch rates, and node dates.
Abstract: Since its introduction in 2001, MrBayes has grown in popularity as a software package for Bayesian phylogenetic inference using Markov chain Monte Carlo (MCMC) methods. With this note, we announce the release of version 3.2, a major upgrade to the latest official release presented in 2003. The new version provides convergence diagnostics and allows multiple analyses to be run in parallel with convergence progress monitored on the fly. The introduction of new proposals and automatic optimization of tuning parameters has improved convergence for many problems. The new version also sports significantly faster likelihood calculations through streaming single-instruction-multiple-data extensions (SSE) and support of the BEAGLE library, allowing likelihood calculations to be delegated to graphics processing units (GPUs) on compatible hardware. Speedup factors range from around 2 with SSE code to more than 50 with BEAGLE for codon problems. Checkpointing across all models allows long runs to be completed even when an analysis is prematurely terminated. New models include relaxed clocks, dating, model averaging across time-reversible substitution models, and support for hard, negative, and partial (backbone) tree constraints. Inference of species trees from gene trees is supported by full incorporation of the Bayesian estimation of species trees (BEST) algorithms. Marginal model likelihoods for Bayes factor tests can be estimated accurately across the entire model space using the stepping stone method. The new version provides more output options than previously, including samples of ancestral states, site rates, site d(N)/d(S) ratios, branch rates, and node dates. A wide range of statistics on tree parameters can also be output for visualization in FigTree and compatible software.

18,718 citations


Journal ArticleDOI
Georges Aad, T. Abajyan, Brad Abbott, Jalal Abdallah, and 2,964 more authors (200 institutions)
TL;DR: In this article, a search for the Standard Model Higgs boson in proton-proton collisions with the ATLAS detector at the LHC is presented; the observed excess of events has a significance of 5.9 standard deviations, corresponding to a background fluctuation probability of 1.7×10−9.
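For context, the quoted background fluctuation probability is the one-sided Gaussian tail of the significance Z (a standard relation added here for reference, not taken from the paper itself):

$p = 1 - \Phi(Z) = \int_{Z}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^{2}/2}\, dt,$

so a significance near Z = 5.9 corresponds to a tail probability of order 10^{-9}, consistent with the quoted value.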

9,282 citations


Journal ArticleDOI
TL;DR: In this paper, results are presented from searches for the standard model Higgs boson in proton-proton collisions at 7 and 8 TeV in the CMS experiment at the LHC, using data samples corresponding to integrated luminosities of up to 5.1 fb−1 at 7 TeV and 5.3 fb−1 at 8 TeV; an excess of events with a local significance of 5.0 standard deviations (5.8 expected) is observed at a mass near 125 GeV.

8,857 citations


Journal ArticleDOI
TL;DR: These guidelines are presented for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

4,316 citations


Journal ArticleDOI
TL;DR: Longer diabetes duration and poorer glycemic and blood pressure control are strongly associated with DR, and these data highlight the substantial worldwide public health burden of DR and the importance of modifiable risk factors in its occurrence.
Abstract: OBJECTIVE To examine the global prevalence and major risk factors for diabetic retinopathy (DR) and vision-threatening diabetic retinopathy (VTDR) among people with diabetes. RESEARCH DESIGN AND METHODS A pooled analysis using individual participant data from population-based studies around the world was performed. A systematic literature review was conducted to identify all population-based studies in general populations or individuals with diabetes who had ascertained DR from retinal photographs. Studies provided data for DR end points, including any DR, proliferative DR, diabetic macular edema, and VTDR, and also major systemic risk factors. Pooled prevalence estimates were directly age-standardized to the 2010 World Diabetes Population aged 20–79 years. RESULTS A total of 35 studies (1980–2008) provided data from 22,896 individuals with diabetes. The overall prevalence was 34.6% (95% CI 34.5–34.8) for any DR, 6.96% (6.87–7.04) for proliferative DR, 6.81% (6.74–6.89) for diabetic macular edema, and 10.2% (10.1–10.3) for VTDR. All DR prevalence end points increased with diabetes duration, hemoglobin A1c, and blood pressure levels and were higher in people with type 1 compared with type 2 diabetes. CONCLUSIONS There are approximately 93 million people with DR, 17 million with proliferative DR, 21 million with diabetic macular edema, and 28 million with VTDR worldwide. Longer diabetes duration and poorer glycemic and blood pressure control are strongly associated with DR. These data highlight the substantial worldwide public health burden of DR and the importance of modifiable risk factors in its occurrence. This study is limited by data pooled from studies at different time points, with different methodologies and population characteristics.
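A hedged sketch of the direct age-standardization step described above (symbols are mine, not the paper's: p_a is the prevalence in age group a and w_a the share of that age group in the 2010 World Diabetes Population aged 20–79 years):

$p_{\mathrm{std}} = \sum_{a} w_{a}\, p_{a}, \qquad \sum_{a} w_{a} = 1.$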

3,282 citations


Journal ArticleDOI
TL;DR: In this paper, a convex program that finds the matrix of minimum nuclear norm consistent with the observed entries is shown to recover all of the missing entries of a low-rank matrix from most sufficiently large subsets of observed entries.
Abstract: Suppose that one observes an incomplete subset of entries selected from a low-rank matrix. When is it possible to complete the matrix and recover the entries that have not been seen? We demonstrate that in very general settings, one can perfectly recover all of the missing entries from most sufficiently large subsets by solving a convex programming problem that finds the matrix with the minimum nuclear norm agreeing with the observed entries. The techniques used in this analysis draw upon parallels in the field of compressed sensing, demonstrating that objects other than signals and images can be perfectly reconstructed from very limited information.
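A minimal sketch of the convex program described here, with M the partially observed low-rank matrix and Omega the set of observed entries (notation assumed for illustration, not taken from the abstract):

$\min_{X}\ \|X\|_{*} \quad \text{subject to} \quad X_{ij} = M_{ij}, \ (i,j) \in \Omega,$

where the nuclear norm $\|X\|_{*}$ is the sum of the singular values of X.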

2,327 citations


Journal ArticleDOI
F. P. An, J. Z. Bai, A. B. Balantekin, H. R. Band, and 271 more authors (34 institutions)
TL;DR: The Daya Bay Reactor Neutrino Experiment has measured a nonzero value for the neutrino mixing angle θ13 with a significance of 5.2 standard deviations.
Abstract: The Daya Bay Reactor Neutrino Experiment has measured a nonzero value for the neutrino mixing angle θ13 with a significance of 5.2 standard deviations. Antineutrinos from six 2.9 GW_(th) reactors were detected in six antineutrino detectors deployed in two near (flux-weighted baseline 470 m and 576 m) and one far (1648 m) underground experimental halls. With a 43 000 ton–GW_(th)–day live-time exposure in 55 days, 10 416 (80 376) electron-antineutrino candidates were detected at the far hall (near halls). The ratio of the observed to expected number of antineutrinos at the far hall is R=0.940± 0.011(stat.)±0.004(syst.). A rate-only analysis finds sin^22θ_(13)=0.092±0.016(stat.)±0.005(syst.) in a three-neutrino framework.
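For reference, the rate-only analysis compares observed and expected rates through the short-baseline electron-antineutrino survival probability of the three-neutrino framework (a standard expression written here with assumed units; the solar-driven term is small at these baselines and omitted):

$P_{\bar{\nu}_e \to \bar{\nu}_e} \approx 1 - \sin^{2} 2\theta_{13}\, \sin^{2}\!\left( \frac{1.267\, \Delta m^{2}_{31}\,[\mathrm{eV}^{2}]\; L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]} \right),$

with L the baseline and E the antineutrino energy.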

2,163 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a measurement of the cosmic distance scale from detections of the baryon acoustic oscillations in the clustering of galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS), which is part of the Sloan Digital Sky Survey III (SDSS-III).
Abstract: We present a one per cent measurement of the cosmic distance scale from the detections of the baryon acoustic oscillations in the clustering of galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS), which is part of the Sloan Digital Sky Survey III (SDSS-III). Our results come from the Data Release 11 (DR11) sample, containing nearly one million galaxies and covering approximately $8\,500$ square degrees and the redshift range $0.2 < z < 0.7$.
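The cosmic distance scale measured from the BAO feature is commonly reported through the volume-averaged distance D_V(z), which combines the angular diameter distance D_A(z) and the Hubble parameter H(z) (a standard definition quoted here for reference, not reproduced in the snippet above):

$D_V(z) = \left[ (1+z)^{2}\, D_A^{2}(z)\, \frac{c\, z}{H(z)} \right]^{1/3}.$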

2,040 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present the first spectroscopic data from the Baryon Oscillation Spectroscopic Survey (BOSS), released as part of the ninth data release (DR9) of the Sloan Digital Sky Survey III (SDSS-III).
Abstract: The Sloan Digital Sky Survey III (SDSS-III) presents the first spectroscopic data from the Baryon Oscillation Spectroscopic Survey (BOSS). This ninth data release (DR9) of the SDSS project includes 535,995 new galaxy spectra (median z ~ 0.52), 102,100 new quasar spectra (median z ~ 2.32), and 90,897 new stellar spectra, along with the data presented in previous data releases. These spectra were obtained with the new BOSS spectrograph and were taken between 2009 December and 2011 July. In addition, the stellar parameters pipeline, which determines radial velocities, surface temperatures, surface gravities, and metallicities of stars, has been updated and refined with improvements in temperature estimates for stars with T_eff < 5000 K and in metallicity estimates for stars with [Fe/H] > -0.5. DR9 includes new stellar parameters for all stars presented in DR8, including stars from SDSS-I and II, as well as those observed as part of SEGUE-2. The astrometry error introduced in the DR8 imaging catalogs has been corrected in the DR9 data products. The next data release for SDSS-III will be in Summer 2013, which will present the first data from APOGEE along with another year of data from BOSS, followed by the final SDSS-III data release in 2014 December.

Journal ArticleDOI
19 Oct 2012-Science
TL;DR: This review connects previously isolated lines of work, concludes that many critical transitions (such as escape from the poverty trap) can have positive outcomes, and shows how new approaches to sensing fragility can help detect both risks and opportunities for desired change.
Abstract: Tipping points in complex systems may imply risks of unwanted collapse, but also opportunities for positive change. Our capacity to navigate such risks and opportunities can be boosted by combining emerging insights from two unconnected fields of research. One line of work is revealing fundamental architectural features that may cause ecological networks, financial markets, and other complex systems to have tipping points. Another field of research is uncovering generic empirical indicators of the proximity to such critical thresholds. Although sudden shifts in complex systems will inevitably continue to surprise us, work at the crossroads of these emerging fields offers new approaches for anticipating critical transitions.

Journal ArticleDOI
07 Jun 2012-Nature
TL;DR: Evidence that the global ecosystem as a whole is approaching a planetary-scale critical transition as a result of human influence is reviewed, highlighting the need to improve biological forecasting by detecting early warning signs of critical transitions.
Abstract: There is evidence that human influence may be forcing the global ecosystem towards a rapid, irreversible, planetary-scale shift into a state unknown in human experience. Most forecasts of how the biosphere will change in response to human activity are rooted in projecting trajectories. Such models tend not to anticipate critical transitions or tipping points, although recent work indicates a high probability of those taking place. And, at a local scale, ecosystems are known to shift abruptly between states when critical thresholds are passed. These authors review the evidence from across ecology and palaeontology that such a transition is being approached on the scale of the entire biosphere. They go on to suggest how biological forecasting might be improved to allow us to detect early warning signs of critical transitions on a global, as well as local, scale. Localized ecological systems are known to shift abruptly and irreversibly from one state to another when they are forced across critical thresholds. Here we review evidence that the global ecosystem as a whole can react in the same way and is approaching a planetary-scale critical transition as a result of human influence. The plausibility of a planetary-scale ‘tipping point’ highlights the need to improve biological forecasting by detecting early warning signs of critical transitions on global as well as local scales, and by detecting feedbacks that promote such transitions. It is also necessary to address root causes of how humans are forcing biological changes.

Journal ArticleDOI
TL;DR: A comprehensive study that projects the speedup potential of future multicores and examines the underutilization of integration capacity (dark silicon) is timely and crucial.
Abstract: A key question for the microprocessor research and design community is whether scaling multicores will provide the performance and value needed to scale down many more technology generations. To provide a quantitative answer to this question, a comprehensive study that projects the speedup potential of future multicores and examines the underutilization of integration capacity (dark silicon) is timely and crucial.

Journal ArticleDOI
TL;DR: This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems.
Abstract: In applications throughout science and engineering one is often faced with the challenge of solving an ill-posed inverse problem, where the number of available measurements is smaller than the dimension of the model to be estimated. However in many practical situations of interest, models are constrained structurally so that they only have a few degrees of freedom relative to their ambient dimension. This paper provides a general framework to convert notions of simplicity into convex penalty functions, resulting in convex optimization solutions to linear, underdetermined inverse problems. The class of simple models considered includes those formed as the sum of a few atoms from some (possibly infinite) elementary atomic set; examples include well-studied cases from many technical fields such as sparse vectors (signal processing, statistics) and low-rank matrices (control, statistics), as well as several others including sums of a few permutation matrices (ranked elections, multiobject tracking), low-rank tensors (computer vision, neuroscience), orthogonal matrices (machine learning), and atomic measures (system identification). The convex programming formulation is based on minimizing the norm induced by the convex hull of the atomic set; this norm is referred to as the atomic norm. The facial structure of the atomic norm ball carries a number of favorable properties that are useful for recovering simple models, and an analysis of the underlying convex geometry provides sharp estimates of the number of generic measurements required for exact and robust recovery of models from partial information. These estimates are based on computing the Gaussian widths of tangent cones to the atomic norm ball. When the atomic set has algebraic structure the resulting optimization problems can be solved or approximated via semidefinite programming. The quality of these approximations affects the number of measurements required for recovery, and this tradeoff is characterized via some examples. Thus this work extends the catalog of simple models (beyond sparse vectors and low-rank matrices) that can be recovered from limited linear information via tractable convex programming.
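A hedged sketch of the central construction (generic symbols: atomic set A, unknown model x, linear measurements y = Φx): the atomic norm is the gauge of the convex hull of the atomic set, and recovery is posed as atomic-norm minimization,

$\|x\|_{\mathcal{A}} = \inf\{ t > 0 : x \in t\, \mathrm{conv}(\mathcal{A}) \}, \qquad \hat{x} = \arg\min_{x}\ \|x\|_{\mathcal{A}} \quad \text{subject to} \quad y = \Phi x.$

Choosing the atoms to be the signed unit coordinate vectors recovers the l1 norm (sparse vectors), and choosing unit-norm rank-one matrices recovers the nuclear norm (low-rank matrices).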

Journal ArticleDOI
TL;DR: It is shown that temporal modulation of Wnt signaling is both essential and sufficient for efficient cardiac induction in hPSCs under defined, growth factor-free conditions.
Abstract: Human pluripotent stem cells (hPSCs) offer the potential to generate large numbers of functional cardiomyocytes from clonal and patient-specific cell sources. Here we show that temporal modulation of Wnt signaling is both essential and sufficient for efficient cardiac induction in hPSCs under defined, growth factor-free conditions. shRNA knockdown of β-catenin during the initial stage of hPSC differentiation fully blocked cardiomyocyte specification, whereas glycogen synthase kinase 3 inhibition at this point enhanced cardiomyocyte generation. Furthermore, sequential treatment of hPSCs with glycogen synthase kinase 3 inhibitors followed by inducible expression of β-catenin shRNA or chemical inhibitors of Wnt signaling produced a high yield of virtually (up to 98%) pure functional human cardiomyocytes from multiple hPSC lines. The robust ability to generate functional cardiomyocytes under defined, growth factor-free conditions solely by genetic or chemically mediated manipulation of a single developmental pathway should facilitate scalable production of cardiac cells suitable for research and regenerative applications.

Journal ArticleDOI
29 Jun 2012-Science
TL;DR: Comparative analyses of 31 fungal genomes suggest that lignin-degrading peroxidases expanded in the lineage leading to the ancestor of the Agaricomycetes, which is reconstructed as a white rot species, and then contracted in parallel lineages leading to brown rot and mycorrhizal species.
Abstract: Wood is a major pool of organic carbon that is highly resistant to decay, owing largely to the presence of lignin. The only organisms capable of substantial lignin decay are white rot fungi in the Agaricomycetes, which also contains non-lignin-degrading brown rot and ectomycorrhizal species. Comparative analyses of 31 fungal genomes (12 generated for this study) suggest that lignin-degrading peroxidases expanded in the lineage leading to the ancestor of the Agaricomycetes, which is reconstructed as a white rot species, and then contracted in parallel lineages leading to brown rot and mycorrhizal species. Molecular clock analyses suggest that the origin of lignin degradation might have coincided with the sharp decrease in the rate of organic carbon burial around the end of the Carboniferous period.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed daily fields of 500-hPa heights from the National Centers for Environmental Prediction Reanalysis over N. America and the N. Atlantic to assess changes in north-south (Rossby) wave characteristics associated with Arctic amplification and the relaxation of poleward thickness gradients.
Abstract: Arctic amplification (AA) – the observed enhanced warming in high northern latitudes relative to the northern hemisphere – is evident in lower-tropospheric temperatures and in 1000-to-500 hPa thicknesses. Daily fields of 500 hPa heights from the National Centers for Environmental Prediction Reanalysis are analyzed over N. America and the N. Atlantic to assess changes in north-south (Rossby) wave characteristics associated with AA and the relaxation of poleward thickness gradients. Two effects are identified that each contribute to a slower eastward progression of Rossby waves in the upper-level flow: 1) weakened zonal winds, and 2) increased wave amplitude. These effects are particularly evident in autumn and winter consistent with sea-ice loss, but are also apparent in summer, possibly related to earlier snow melt on high-latitude land. Slower progression of upper-level waves would cause associated weather patterns in mid-latitudes to be more persistent, which may lead to an increased probability of extreme weather events that result from prolonged conditions, such as drought, flooding, cold spells, and heat waves.
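As a hedged illustration of the first effect, the barotropic Rossby wave zonal phase speed (a textbook relation, not quoted in the abstract) is

$c_{x} = U - \frac{\beta}{k^{2} + l^{2}},$

where U is the background zonal wind, β the meridional gradient of the Coriolis parameter, and k, l the zonal and meridional wavenumbers; a weaker westerly U (and lower wavenumbers, i.e., larger waves) thus slows the eastward progression.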

Journal ArticleDOI
21 Jun 2012-Nature
TL;DR: Results indicate that H5 HA can convert to an HA that supports efficient viral transmission in mammals, and will help individuals conducting surveillance in regions with circulating H5N1 viruses to recognize key residues that predict the pandemic potential of isolates, which will inform the development, production and distribution of effective countermeasures.
Abstract: Highly pathogenic avian H5N1 influenza A viruses occasionally infect humans, but currently do not transmit efficiently among humans. The viral haemagglutinin (HA) protein is a known host-range determinant as it mediates virus binding to host-specific cellular receptors (refs 1–3). Here we assess the molecular changes in HA that would allow a virus possessing subtype H5 HA to be transmissible among mammals. We identified a reassortant H5 HA/H1N1 virus—comprising H5 HA (from an H5N1 virus) with four mutations and the remaining seven gene segments from a 2009 pandemic H1N1 virus—that was capable of droplet transmission in a ferret model. The transmissible H5 reassortant virus preferentially recognized human-type receptors, replicated efficiently in ferrets, caused lung lesions and weight loss, but was not highly pathogenic and did not cause mortality. These results indicate that H5 HA can convert to an HA that supports efficient viral transmission in mammals; however, we do not know whether the four mutations in the H5 HA identified here would render a wholly avian H5N1 virus transmissible. The genetic origin of the remaining seven viral gene segments may also critically contribute to transmissibility in mammals. Nevertheless, as H5N1 viruses continue to evolve and infect humans, receptor-binding variants of H5N1 viruses with pandemic potential, including avian–human reassortant viruses as tested here, may emerge. Our findings emphasize the need to prepare for potential pandemics caused by influenza viruses possessing H5 HA, and will help individuals conducting surveillance in regions with circulating H5N1 viruses to recognize key residues that predict the pandemic potential of isolates.

Journal ArticleDOI
TL;DR: These indices and related worksheet provide an accurate and facile diagnosis-specific tool to estimate survival, potentially select appropriate treatment, and stratify clinical trials for patients with brain metastases.
Abstract: Purpose Our group has previously published the Graded Prognostic Assessment (GPA), a prognostic index for patients with brain metastases. Updates have been published with refinements to create diagnosis-specific Graded Prognostic Assessment indices. The purpose of this report is to present the updated diagnosis-specific GPA indices in a single, unified, user-friendly report to allow ease of access and use by treating physicians. Methods A multi-institutional retrospective (1985 to 2007) database of 3,940 patients with newly diagnosed brain metastases underwent univariate and multivariate analyses of prognostic factors associated with outcomes by primary site and treatment. Significant prognostic factors were used to define the diagnosis-specific GPA prognostic indices. A GPA of 4.0 correlates with the best prognosis, whereas a GPA of 0.0 corresponds with the worst prognosis. Results Significant prognostic factors varied by diagnosis. For lung cancer, prognostic factors were Karnofsky performance score, ag...

Journal ArticleDOI
TL;DR: Natural bond orbital (NBO) methods encompass a suite of algorithms that enable fundamental bonding concepts to be extracted from Hartree-Fock (HF), Density Functional Theory (DFT), and post-HF computations as discussed by the authors.
Abstract: Natural bond orbital (NBO) methods encompass a suite of algorithms that enable fundamental bonding concepts to be extracted from Hartree-Fock (HF), Density Functional Theory (DFT), and post-HF computations. NBO terminology and general mathematical formulations for atoms and polyatomic species are presented. NBO analyses of selected molecules that span the periodic table illustrate the deciphering of the molecular wavefunction in terms commonly understood by chemists: Lewis structures, charge, bond order, bond type, hybridization, resonance, donor–acceptor interactions, etc. Upcoming features in the NBO program address ongoing advances in ab initio computing technology and burgeoning demands of its user community by introducing major new methods, keywords, and electronic structure system/NBO communication enhancements. © 2011 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The generalized form of the context-dependent PPI approach offers greater flexibility in statistical modeling and potentially improves model fit, specificity to true negative findings, and sensitivity to true positive findings.

Journal ArticleDOI
05 Apr 2012-Nature
TL;DR: A record of global surface temperature from 80 proxy records is constructed and it is shown that temperature is correlated with and generally lags CO2 during the last deglaciation, supporting the conclusion that an antiphased hemispheric temperature response to ocean circulation changes superimposed on globally in-phase warming driven by increasing CO2 concentrations is an explanation for much of the temperature change at the end of the most recent ice age.
Abstract: The covariation of carbon dioxide (CO2) concentration and temperature in Antarctic ice-core records suggests a close link between CO2 and climate during the Pleistocene ice ages. The role and relative importance of CO2 in producing these climate changes remains unclear, however, in part because the ice-core deuterium record reflects local rather than global temperature. Here we construct a record of global surface temperature from 80 proxy records and show that temperature is correlated with and generally lags CO2 during the last (that is, the most recent) deglaciation. Differences between the respective temperature changes of the Northern Hemisphere and Southern Hemisphere parallel variations in the strength of the Atlantic meridional overturning circulation recorded in marine sediments. These observations, together with transient global climate model simulations, support the conclusion that an antiphased hemispheric temperature response to ocean circulation changes superimposed on globally in-phase warming driven by increasing CO2 concentrations is an explanation for much of the temperature change at the end of the most recent ice age. Understanding the causes of the Pleistocene ice ages has been a significant question in climate dynamics since they were discovered in the mid-nineteenth century.

Journal ArticleDOI
TL;DR: This review summarizes recent results published in the literature for biomass upgrading reactions using bimetallic catalysts, which offer the possibility of enabling lignocellulosic processing to become a larger part of the biofuels and renewable chemical industry.
Abstract: Research interest in biomass conversion to fuels and chemicals has increased significantly in the last decade as the necessity for a renewable source of carbon has become more evident. Accordingly, many different reactions and processes to convert biomass into high-value products and fuels have been proposed in the literature. Special attention has been given to the conversion of lignocellulosic biomass, which does not compete with food sources and is widely available as a low cost feedstock. In this review, we start with a brief introduction on lignocellulose and the different chemical structures of its components: cellulose, hemicellulose, and lignin. These three components allow for the production of different chemicals after fractionation. After a brief overview of the main reactions involved in biomass conversion, we focus on those where bimetallic catalysts are playing an important role. Although the reactions are similar for cellulose and hemicellulose, which contain C6 and C5 sugars, respectively, different products are obtained, and therefore, they have been reviewed separately. The third major fraction of lignocellulose that we address is lignin, which has significant challenges to overcome, as its structure makes catalytic processing more challenging. Bimetallic catalysts offer the possibility of enabling lignocellulosic processing to become a larger part of the biofuels and renewable chemical industry. This review summarizes recent results published in the literature for biomass upgrading reactions using bimetallic catalysts.

Journal ArticleDOI
04 Sep 2012-PLOS ONE
TL;DR: It is clear that policies encouraging the sustainable management of coastal ecosystems could significantly reduce carbon emissions from the land-use sector, in addition to sustaining the well-recognized ecosystem services of coastal habitats.
Abstract: Recent attention has focused on the high rates of annual carbon sequestration in vegetated coastal ecosystems—marshes, mangroves, and seagrasses—that may be lost with habitat destruction (‘conversion’). Relatively unappreciated, however, is that conversion of these coastal ecosystems also impacts very large pools of previously-sequestered carbon. Residing mostly in sediments, this ‘blue carbon’ can be released to the atmosphere when these ecosystems are converted or degraded. Here we provide the first global estimates of this impact and evaluate its economic implications. Combining the best available data on global area, land-use conversion rates, and near-surface carbon stocks in each of the three ecosystems, using an uncertainty-propagation approach, we estimate that 0.15–1.02 Pg (billion tons) of carbon dioxide are being released annually, several times higher than previous estimates that account only for lost sequestration. These emissions are equivalent to 3–19% of those from deforestation globally, and result in economic damages of $US 6–42 billion annually. The largest sources of uncertainty in these estimates stem from limited certitude in global area and rates of land-use conversion, but research is also needed on the fates of ecosystem carbon upon conversion. Currently, carbon emissions from the conversion of vegetated coastal ecosystems are not included in emissions accounting or carbon market protocols, but this analysis suggests they may be disproportionally important to both. Although the relevant science supporting these initial estimates will need to be refined in coming years, it is clear that policies encouraging the sustainable management of coastal ecosystems could significantly reduce carbon emissions from the land-use sector, in addition to sustaining the well-recognized ecosystem services of coastal habitats.

Journal ArticleDOI
TL;DR: Lenalidomide maintenance therapy, initiated at day 100 after hematopoietic stem-cell transplantation, was associated with more toxicity and second cancers but a significantly longer time to disease progression and significantly improved overall survival among patients with myeloma.
Abstract: Background Data are lacking on whether lenalidomide maintenance therapy prolongs the time to disease progression after autologous hematopoietic stem-cell transplantation in patients with multiple myeloma. Methods Between April 2005 and July 2009, we randomly assigned 460 patients who were younger than 71 years of age and had stable disease or a marginal, partial, or complete response 100 days after undergoing stem-cell transplantation to lenalidomide or placebo, which was administered until disease progression. The starting dose of lenalidomide was 10 mg per day (range, 5 to 15). Results The study-drug assignments were unblinded in 2009, when a planned interim analysis showed a significantly longer time to disease progression in the lenalidomide group. At unblinding, 20% of patients who received lenalidomide and 44% of patients who received placebo had progressive disease or had died (P<0.001); of the remaining 128 patients who received placebo and who did not have progressive disease, 86 crossed over to ...

Journal ArticleDOI
TL;DR: This work proposes a new targeted proteomics paradigm, parallel reaction monitoring (PRM), centered on the use of next-generation, quadrupole-equipped, high-resolution and accurate-mass instruments, and suggests that PRM will be a promising new addition to the quantitative proteomics toolbox.

Posted Content
TL;DR: In this paper, the authors argue that the factors leading to a resource-based advantage also predict who will appropriate rent and that knowledge-based assets are promising as a source of sustainable advantage because firm-specificity, social complexity and causal ambiguity make them hard for rivals to imitate.
Abstract: Most theories of competitive advantage seek to explain rent capture at the firm level but ignore which internal stakeholders will appropriate this rent. For example, IO economics focuses on market structure and the resource-based view focuses on unique firm-level capabilities that rivals cannot imitate or acquire. As researchers apply these frameworks, they either: 1) assume rent is captured by shareholders, 2) treat within-firm rent appropriation exogenously, or 3) ignore internal rent appropriation altogether. However, internal rent appropriation determines how much of the rent will be observable in measures of firm performance and is therefore central to empirical research focused on firm performance. What if rent from a competitive advantage is appropriated internally so it cannot be observed in performance measures? The resource-based view was not formulated to examine who will get the rent. Yet, this essay argues that the factors leading to a resource-based advantage also predict who will appropriate rent. Knowledge-based assets are promising as a source of sustainable advantage because firm-specificity, social complexity and causal ambiguity make them hard for rivals to imitate. Accordingly, these strong roles for internal stakeholders may grant them a great deal of bargaining power especially relative to investors who contribute the most fungible of all resources. This article integrates the resource-based view with the bargaining power literature by defining the firm as a nexus of contracts. This lens can help to explain when rent will be generated and, simultaneously, who will appropriate it. In doing so, it provides a more robust theory of firm performance than the resource-based view alone. This lens might also be useful for examining other theories of firm performance.

Journal ArticleDOI
20 May 2012-Spine
TL;DR: Data from this study show that there is excellent inter- and intra-rater reliability and inter-rater agreement for curve type and each modifier; the high degree of reliability demonstrates that applying the classification system is easy and consistent.
Abstract: Study design Inter- and intra-rater variability study. Objective On the basis of a Scoliosis Research Society effort, this study seeks to determine whether the new adult spinal deformity (ASD) classification system is clear and reliable. Summary of background data A classification of adult ASD can serve several purposes, including consistent characterization of a clinical entity, a basis for comparing different treatments, and recommended treatments. Although pediatric scoliosis classifications are well established, an ASD classification is still being developed. A previous classification developed by Schwab et al has met with clinical relevance but did not include pelvic parameters, which have shown substantial correlation with health-related quality of life measures in recent studies. Methods Initiated by the Scoliosis Research Society Adult Deformity Committee, this study revised a previously published classification to include pelvic parameters. Modifier cutoffs were determined using health-related quality of life analysis from a multicenter database of adult deformity patients. Nine readers graded 21 premarked cases twice each, approximately 1 week apart. Inter- and intra-rater variability and agreement were determined for curve type and each modifier separately. Fleiss' kappa was used for reliability measures, with values of 0.00 to 0.20 considered slight, 0.21 to 0.40 fair, 0.41 to 0.60 moderate, 0.61 to 0.80 substantial, and 0.81 to 1.00 almost perfect agreement. Results Inter-rater kappa for curve type was 0.80 and 0.87 for the 2 readings, respectively, with modifier kappas of 0.75 and 0.86, 0.97 and 0.98, and 0.96 and 0.96 for pelvic incidence minus lumbar lordosis (PI-LL), pelvic tilt (PT), and sagittal vertical axis (SVA), respectively. By the second reading, curve type was identified by all readers consistently in 66.7%, PI-LL in 71.4%, PT in 95.2%, and SVA in 90.5% of cases. Intra-rater kappa averaged 0.94 for curve type, 0.88 for PI-LL, 0.97 for PT, and 0.97 for SVA across all readers. Conclusion Data from this study show that there is excellent inter- and intra-rater reliability and inter-rater agreement for curve type and each modifier. The high degree of reliability demonstrates that applying the classification system is easy and consistent.
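For reference, Fleiss' kappa used here takes the standard form (generic symbols, not the paper's): with P-bar the mean observed pairwise agreement across cases and P-bar_e the agreement expected by chance,

$\kappa = \frac{\bar{P} - \bar{P}_{e}}{1 - \bar{P}_{e}}.$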

Journal ArticleDOI
TL;DR: In this article, the authors consider the optimal packet scheduling problem in a single-user energy-harvesting (EH) wireless communication system, where both the data packets and the harvested energy are modeled to arrive at the source node randomly, and the goal is to adaptively change the transmission rate according to the traffic load and available energy, such that the time by which all packets are delivered is minimized.
Abstract: We consider the optimal packet scheduling problem in a single-user energy harvesting wireless communication system. In this system, both the data packets and the harvested energy are modeled to arrive at the source node randomly. Our goal is to adaptively change the transmission rate according to the traffic load and available energy, such that the time by which all packets are delivered is minimized. Under a deterministic system setting, we assume that the energy harvesting times and harvested energy amounts are known before the transmission starts. For the data traffic arrivals, we consider two different scenarios. In the first scenario, we assume that all bits have arrived and are ready at the transmitter before the transmission starts. In the second scenario, we consider the case where packets arrive during the transmissions, with known arrival times and sizes. We develop optimal off-line scheduling policies which minimize the time by which all packets are delivered to the destination, under causality constraints on both data and energy arrivals.
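A minimal sketch of the off-line formulation described above, under assumed notation (transmit power p(t), concave rate function g(·), energy arrivals E_i at times s_i, data arrivals of B_j bits at times t_j; this is the general shape of the problem, not the paper's exact statement):

$\min_{p(\cdot)}\ T \quad \text{s.t.} \quad \int_{0}^{\tau} p(t)\,dt \le \sum_{i:\, s_i \le \tau} E_{i}, \qquad \int_{0}^{\tau} g(p(t))\,dt \le \sum_{j:\, t_j \le \tau} B_{j} \quad \forall \tau \ge 0, \qquad \int_{0}^{T} g(p(t))\,dt = \sum_{j} B_{j},$

where the first constraint is energy causality, the second is data causality, and the last requires all bits to be delivered by the completion time T.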