
Showing papers published by the University of Wisconsin-Madison in 2007


Journal ArticleDOI
21 Dec 2007-Science
TL;DR: This article showed that four factors (OCT4, SOX2, NANOG, and LIN28) are sufficient to reprogram human somatic cells to pluripotent stem cells that exhibit the essential characteristics of embryonic stem (ES) cells.
Abstract: Somatic cell nuclear transfer allows trans-acting factors present in the mammalian oocyte to reprogram somatic cell nuclei to an undifferentiated state. We show that four factors (OCT4, SOX2, NANOG, and LIN28) are sufficient to reprogram human somatic cells to pluripotent stem cells that exhibit the essential characteristics of embryonic stem (ES) cells. These induced pluripotent human stem cells have normal karyotypes, express telomerase activity, express cell surface markers and genes that characterize human ES cells, and maintain the developmental potential to differentiate into advanced derivatives of all three primary germ layers. Such induced pluripotent human cell lines should be useful in the production of new disease models and in drug development, as well as for applications in transplantation medicine, once technical limitations (for example, mutation through viral integration) are eliminated.

9,836 citations


Journal ArticleDOI
14 Jun 2007-Nature
TL;DR: Functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project are reported, providing convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts.
Abstract: We report the generation and analysis of functional data from multiple, diverse experiments performed on a targeted 1% of the human genome as part of the pilot phase of the ENCODE Project. These data have been further integrated and augmented by a number of evolutionary and computational analyses. Together, our results advance the collective knowledge about human genome function in several major areas. First, our studies provide convincing evidence that the genome is pervasively transcribed, such that the majority of its bases can be found in primary transcripts, including non-protein-coding transcripts, and those that extensively overlap one another. Second, systematic examination of transcriptional regulation has yielded new understanding about transcription start sites, including their relationship to specific regulatory sequences and features of chromatin accessibility and histone modification. Third, a more sophisticated view of chromatin structure has emerged, including its inter-relationship with DNA replication and transcriptional regulation. Finally, integration of these new sources of information, in particular with respect to mammalian evolution based on inter- and intra-species sequence comparisons, has yielded new mechanistic and evolutionary insights concerning the functional landscape of the human genome. Together, these studies are defining a path for pursuit of a more comprehensive characterization of human genome function.

5,091 citations


Journal ArticleDOI
TL;DR: A meta-analysis of the results shows that early math skills have the greatest predictive power, followed by reading and then attention skills, while measures of socioemotional behaviors were generally insignificant predictors of later academic performance.
Abstract: Using 6 longitudinal data sets, the authors estimate links between three key elements of school readiness--school-entry academic, attention, and socioemotional skills--and later school reading and math achievement. In an effort to isolate the effects of these school-entry skills, the authors ensured that most of their regression models control for cognitive, attention, and socioemotional skills measured prior to school entry, as well as a host of family background measures. Across all 6 studies, the strongest predictors of later achievement are school-entry math, reading, and attention skills. A meta-analysis of the results shows that early math skills have the greatest predictive power, followed by reading and then attention skills. By contrast, measures of socioemotional behaviors, including internalizing and externalizing problems and social skills, were generally insignificant predictors of later academic performance, even among children with relatively high levels of problem behavior. Patterns of association were similar for boys and girls and for children from high and low socioeconomic backgrounds.

4,384 citations


Journal ArticleDOI
TL;DR: OpenSim, a freely available, open-source software system, lets users develop models of musculoskeletal structures and create dynamic simulations of a wide variety of movements; the authors use it to simulate the dynamics of individuals with pathological gait and to explore the biomechanical effects of treatments.
Abstract: Dynamic simulations of movement allow one to study neuromuscular coordination, analyze athletic performance, and estimate internal loading of the musculoskeletal system. Simulations can also be used to identify the sources of pathological movement and establish a scientific basis for treatment planning. We have developed a freely available, open-source software system (OpenSim) that lets users develop models of musculoskeletal structures and create dynamic simulations of a wide variety of movements. We are using this system to simulate the dynamics of individuals with pathological gait and to explore the biomechanical effects of treatments. OpenSim provides a platform on which the biomechanics community can build a library of simulations that can be exchanged, tested, analyzed, and improved through a multi-institutional collaboration. Developing software that enables a concerted effort from many investigators poses technical and sociological challenges. Meeting those challenges will accelerate the discovery of principles that govern movement control and improve treatments for individuals with movement pathologies.

3,621 citations


Journal ArticleDOI
14 Sep 2007-Science
TL;DR: Synthesis of six case studies from around the world shows that couplings between human and natural systems vary across space, time, and organizational units and have legacy effects on present conditions and future possibilities.
Abstract: Integrated studies of coupled human and natural systems reveal new and complex patterns and processes not evident when studied by social or natural scientists separately. Synthesis of six case studies from around the world shows that couplings between human and natural systems vary across space, time, and organizational units. They also exhibit nonlinear dynamics with thresholds, reciprocal feedback loops, time lags, resilience, heterogeneity, and surprises. Furthermore, past couplings have legacy effects on present conditions and future possibilities.

2,890 citations


Journal ArticleDOI
TL;DR: Although there was evidence of an off-target effect of torcetrapib, the findings cannot rule out adverse effects related to CETP inhibition; the trial was terminated prematurely because of an increased risk of death and cardiac events.
Abstract: Background Inhibition of cholesteryl ester transfer protein (CETP) has been shown to have a substantial effect on plasma lipoprotein levels. We investigated whether torcetrapib, a potent CETP inhibitor, might reduce major cardiovascular events. The trial was terminated prematurely because of an increased risk of death and cardiac events in patients receiving torcetrapib. Methods We conducted a randomized, double-blind study involving 15,067 patients at high cardiovascular risk. The patients received either torcetrapib plus atorvastatin or atorvastatin alone. The primary outcome was the time to the first major cardiovascular event, which was defined as death from coronary heart disease, nonfatal myocardial infarction, stroke, or hospitalization for unstable angina. Results At 12 months in patients who received torcetrapib, there was an increase of 72.1% in high-density lipoprotein cholesterol and a decrease of 24.9% in low-density lipoprotein cholesterol, as compared with baseline (P<0.001 for both compari...

2,832 citations


Journal ArticleDOI
TL;DR: This special issue of the Journal of Communication is devoted to theoretical explanations of news framing, agenda setting, and priming effects; it examines if and how the three models are related and what potential relationships between them tell theorists and researchers about the effects of mass media.
Abstract: This special issue of Journal of Communication is devoted to theoretical explanations of news framing, agenda setting, and priming effects. It examines if and how the three models are related and what potential relationships between them tell theorists and researchers about the effects of mass media. As an introduction to this effort, this essay provides a very brief review of the three effects and their roots in media-effects research. Based on this overview, we highlight a few key dimensions along which one can compare framing, agenda setting, and priming. We conclude with a description of the contexts within which the three models operate, and the broader implications that these conceptual distinctions have for the growth of our discipline. doi:10.1111/j.1460-2466.2006.00326.x In 1997, Republican pollster Frank Luntz sent out a 222-page memo called ‘‘Language of the 21st century’’ to select members of the U.S. Congress. Parts of the memo soon spread among staffers, members of Congress, and also journalists. Luntz’s message was simple: ‘‘It’s not what you say, it’s how you say it’’ (Luntz, in press). Drawing on various techniques for real-time message testing and focus grouping, Frank Luntz had researched Republican campaign messages and distilled terms and phrases that resonated with specific interpretive schemas among audiences and therefore helped shift people’s attitudes. In other words, the effect of the messages was not a function of content differences but of differences in the modes of presentation. The ideas outlined in the memo were hardly new, of course, and drew on decades of existing research in sociology (Goffman, 1974), economics (Kahneman & Tversky, 1979), psychology (Kahneman & Tversky, 1984), cognitive linguistics (Lakoff, 2004), and communication (Entman, 1991; Iyengar, 1991). But Frank Luntz was the first professional pollster to systematically use the concept of framing as a campaign tool. The Democratic Party soon followed and George Lakoff published Don’t Think of an

2,365 citations


Journal ArticleDOI
TL;DR: Diffusion tensor imaging (DTI) is a promising method for characterizing microstructural changes or differences with neuropathology and treatment; the biological mechanisms, acquisition, and analysis of DTI measurements are addressed.

2,315 citations


Journal ArticleDOI
TL;DR: A patient with semantic dementia — a neurodegenerative disease that is characterized by the gradual deterioration of semantic memory — was being driven through the countryside to visit a friend and was able to remind his wife where to turn along the not-recently-travelled route.
Abstract: Mr M, a patient with semantic dementia - a neurodegenerative disease that is characterized by the gradual deterioration of semantic memory - was being driven through the countryside to visit a friend and was able to remind his wife where to turn along the not-recently-travelled route. Then, pointing at the sheep in the field, he asked her "What are those things?" Prior to the onset of symptoms in his late 40s, this man had normal semantic memory. What has gone wrong in his brain to produce this dramatic and selective erosion of conceptual knowledge?

2,237 citations


Journal ArticleDOI
TL;DR: An overview of chemical catalytic transformations of biomass-derived oxygenated feedstocks in the liquid phase to value-added chemicals and fuels is presented, with specific examples emphasizing the development of catalytic processes based on an understanding of the fundamental reaction chemistry.
Abstract: Biomass has the potential to serve as a sustainable source of energy and organic carbon for our industrialized society. The focus of this Review is to present an overview of chemical catalytic transformations of biomass-derived oxygenated feedstocks (primarily sugars and sugar-alcohols) in the liquid phase to value-added chemicals and fuels, with specific examples emphasizing the development of catalytic processes based on an understanding of the fundamental reaction chemistry. The key reactions involved in the processing of biomass are hydrolysis, dehydration, isomerization, aldol condensation, reforming, hydrogenation, and oxidation. Further, it is discussed how ideas based on fundamental chemical and catalytic concepts lead to strategies for the control of reaction pathways and process conditions to produce H(2)/CO(2) or H(2)/CO gas mixtures by aqueous-phase reforming, to produce furan compounds by selective dehydration of carbohydrates, and to produce liquid alkanes by the combination of aldol condensation and dehydration/hydrogenation processes.

2,063 citations


Journal ArticleDOI
21 Jun 2007-Nature
TL;DR: This catalytic strategy for the production of 2,5-dimethylfuran from fructose (a carbohydrate obtained directly from biomass or by the isomerization of glucose) for use as a liquid transportation fuel may diminish reliance on petroleum.
Abstract: With petrol prices on the rise, biofuels are big news these days. For applications in the transportation sector, perhaps the best known liquid biofuel is biomass-derived ethanol. But ethanol has its limitations: it is highly volatile, absorbs water and has a low energy density. A team from the University of Wisconsin-Madison has developed a two-step catalytic process that can convert fructose into a potentially better liquid biofuel, 2,5-dimethylfuran (DMF). This has 40%-higher energy density and a higher boiling point than ethanol, and is not water soluble. Fructose can be made directly from biomass or from glucose and although there's some work needed before DMF production can be made commercially viable, this new catalytic process looks promising. Diminishing fossil fuel reserves and growing concerns about global warming indicate that sustainable sources of energy are needed in the near future. For fuels to be useful in the transportation sector, they must have specific physical properties that allow for efficient distribution, storage and combustion; these properties are currently fulfilled by non-renewable petroleum-derived liquid fuels. Ethanol, the only renewable liquid fuel currently produced in large quantities, suffers from several limitations, including low energy density, high volatility, and contamination by the absorption of water from the atmosphere. Here we present a catalytic strategy for the production of 2,5-dimethylfuran from fructose (a carbohydrate obtained directly from biomass or by the isomerization of glucose) for use as a liquid transportation fuel. Compared to ethanol, 2,5-dimethylfuran has a higher energy density (by 40 per cent), a higher boiling point (by 20 K), and is not soluble in water. This catalytic strategy creates a route for transforming abundant renewable biomass resources1,2 into a liquid fuel suitable for the transportation sector, and may diminish our reliance on petroleum.

Journal ArticleDOI
05 Dec 2007-Pain
TL;DR: Patients with neuropathic pain are challenging to manage, and evidence-based clinical recommendations for pharmacologic management are needed; medication selection should be individualized, considering side effects, potential beneficial or deleterious effects on comorbidities, and whether prompt onset of pain relief is necessary.
Abstract: Patients with neuropathic pain (NP) are challenging to manage and evidence-based clinical recommendations for pharmacologic management are needed. Systematic literature reviews, randomized clinical trials, and existing guidelines were evaluated at a consensus meeting. Medications were considered for recommendation if their efficacy was supported by at least one methodologically-sound, randomized clinical trial (RCT) demonstrating superiority to placebo or a relevant comparison treatment. Recommendations were based on the amount and consistency of evidence, degree of efficacy, safety, and clinical experience of the authors. Available RCTs typically evaluated chronic NP of moderate to severe intensity. Recommended first-line treatments include certain antidepressants (i.e., tricyclic antidepressants and dual reuptake inhibitors of both serotonin and norepinephrine), calcium channel alpha2-delta ligands (i.e., gabapentin and pregabalin), and topical lidocaine. Opioid analgesics and tramadol are recommended as generally second-line treatments that can be considered for first-line use in select clinical circumstances. Other medications that would generally be used as third-line treatments but that could also be used as second-line treatments in some circumstances include certain antiepileptic and antidepressant medications, mexiletine, N-methyl-D-aspartate receptor antagonists, and topical capsaicin. Medication selection should be individualized, considering side effects, potential beneficial or deleterious effects on comorbidities, and whether prompt onset of pain relief is necessary. To date, no medications have demonstrated efficacy in lumbosacral radiculopathy, which is probably the most common type of NP. Long-term studies, head-to-head comparisons between medications, studies involving combinations of medications, and RCTs examining treatment of central NP are lacking and should be a priority for future research.

Journal ArticleDOI
TL;DR: An international group of experts in pharmacokinetic modeling recommends a consensus nomenclature to describe in vivo molecular imaging of reversibly binding radioligands.
Abstract: An international group of experts in pharmacokinetic modeling recommends a consensus nomenclature to describe in vivo molecular imaging of reversibly binding radioligands.

Journal ArticleDOI
01 Jun 2007-Ecology
TL;DR: It is suggested that more effectively integrating microbial ecology into ecosystem ecology will require a more complete integration of microbial physiological ecology, population biology, and process ecology.
Abstract: Microorganisms have a variety of evolutionary adaptations and physiological acclimation mechanisms that allow them to survive and remain active in the face of environmental stress. Physiological responses to stress have costs at the organismal level that can result in altered ecosystem-level C, energy, and nutrient flows. These large-scale impacts result from direct effects on active microbes' physiology and by controlling the composition of the active microbial community. We first consider some general aspects of how microbes experience environmental stresses and how they respond to them. We then discuss the impacts of two important ecosystem-level stressors, drought and freezing, on microbial physiology and community composition. Even when microbial community response to stress is limited, the physiological costs imposed on soil microbes are large enough that they may cause large shifts in the allocation and fate of C and N. For example, for microbes to synthesize the osmolytes they need to survive a single drought episode they may consume up to 5% of total annual net primary production in grassland ecosystems, while acclimating to freezing conditions switches Arctic tundra soils from immobilizing N during the growing season to mineralizing it during the winter. We suggest that more effectively integrating microbial ecology into ecosystem ecology will require a more complete integration of microbial physiological ecology, population biology, and process ecology.

Journal ArticleDOI
TL;DR: The implementation of the penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model is nontrivial, but it is shown that the computation can be done effectively by taking advantage of the efficient maxdet algorithm developed in convex optimization.
Abstract: We propose penalized likelihood methods for estimating the concentration matrix in the Gaussian graphical model. The methods lead to a sparse and shrinkage estimator of the concentration matrix that is positive definite, and thus conduct model selection and estimation simultaneously. The implementation of the methods is nontrivial because of the positive definite constraint on the concentration matrix, but we show that the computation can be done effectively by taking advantage of the efficient maxdet algorithm developed in convex optimization. We propose a BIC-type criterion for the selection of the tuning parameter in the penalized likelihood methods. The connection between our methods and existing methods is illustrated. Simulations and real examples demonstrate the competitive performance of the new methods.
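As an illustrative sketch of the kind of penalized likelihood objective involved (a standard L1-penalized form; the paper treats penalized likelihood more generally and handles the positive definite constraint with the maxdet algorithm):

\hat{\Omega} = \arg\max_{\Omega \succ 0} \Big[ \log\det(\Omega) - \operatorname{tr}(S\Omega) - \lambda \sum_{i \neq j} |\omega_{ij}| \Big]

Here S is the sample covariance matrix, \Omega the concentration (inverse covariance) matrix, and \lambda the tuning parameter chosen by the BIC-type criterion; off-diagonal zeros in \hat{\Omega} correspond to absent edges (conditional independencies) in the Gaussian graphical model.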

Journal ArticleDOI
TL;DR: In this article, a range of methods available to estimate national-level forest carbon stocks in developing countries is reviewed, including ground-based and remote-sensing measurements of forest attributes that are converted into carbon estimates using allometric relationships.
Abstract: Reducing carbon emissions from deforestation and degradation in developing countries is of central importance in efforts to combat climate change. Key scientific challenges must be addressed to prevent any policy roadblocks. Foremost among the challenges is quantifying nations' carbon emissions from deforestation and forest degradation, which requires information on forest clearing and carbon storage. Here we review a range of methods available to estimate national-level forest carbon stocks in developing countries. While there are no practical methods to directly measure all forest carbon stocks across a country, both ground-based and remote-sensing measurements of forest attributes can be converted into estimates of national carbon stocks using allometric relationships. Here we synthesize, map and update prominent forest biomass carbon databases to create the first complete set of national-level forest carbon stock estimates. These forest carbon estimates expand on the default values recommended by the Intergovernmental Panel on Climate Change's National Greenhouse Gas Inventory Guidelines and provide a range of globally consistent estimates.
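As a hedged illustration of the allometric step mentioned above (a generic functional form, not the specific equations behind the reviewed databases), tree-level measurements are typically converted to biomass and then to carbon roughly as:

AGB = \exp\big(a + b \ln D\big), \qquad C \approx 0.47 \times AGB

where D is stem diameter at breast height, AGB is above-ground dry biomass, a and b are species- or region-specific coefficients, and the carbon fraction of dry biomass is commonly taken near 0.47-0.5; summing plot-level estimates and scaling by mapped forest area gives a national-level stock.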

Journal ArticleDOI
TL;DR: In this paper, the authors show that no-analog communities (communities that are compositionally unlike any found today) occurred frequently in the past and will develop in the greenhouse world of the future.
Abstract: No-analog communities (communities that are compositionally unlike any found today) occurred frequently in the past and will develop in the greenhouse world of the future. The well documented no-analog plant communities of late-glacial North America are closely linked to “novel” climates also lacking modern analogs, characterized by high seasonality of temperature. In climate simulations for the Intergovernmental Panel on Climate Change A2 and B1 emission scenarios, novel climates arise by 2100 AD, primarily in tropical and subtropical regions. These future novel climates are warmer than any present climates globally, with spatially variable shifts in precipitation, and increase the risk of species reshuffling into future no-analog communities and other ecological surprises. Most ecological models are at least partially parameterized from modern observations and so may fail to accurately predict ecological responses to these novel climates. There is an urgent need to test the robustness of ecological mode...

Journal ArticleDOI
18 Oct 2007-Nature
TL;DR: The shared evolutionary fate of humans and their symbiotic bacteria has selected for mutualistic interactions that are essential for human health, and ecological or genetic changes that uncouple this shared fate can result in disease.
Abstract: The microbial communities of humans are characteristic and complex mixtures of microorganisms that have co-evolved with their human hosts. The species that make up these communities vary between hosts as a result of restricted migration of microorganisms between hosts and strong ecological interactions within hosts, as well as host variability in terms of diet, genotype and colonization history. The shared evolutionary fate of humans and their symbiotic bacteria has selected for mutualistic interactions that are essential for human health, and ecological or genetic changes that uncouple this shared fate can result in disease. In this way, looking to ecological and evolutionary principles might provide new strategies for restoring and maintaining human health.

Journal ArticleDOI
TL;DR: In this article, the authors compare a parsimonious null model to a larger model that nests the null model and observe that the mean squared prediction error (MSPE) from the parsimonious model is therefore expected to be smaller than that of the larger model.
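A stylized sketch of the reasoning in the TL;DR (offered as background rather than the authors' exact derivation): when the parsimonious null model is true, the larger nesting model estimates extra parameters whose true values are zero, and that estimation noise inflates its out-of-sample error,

E[\mathrm{MSPE}_{\mathrm{large}}] \approx E[\mathrm{MSPE}_{\mathrm{null}}] + E\big[(\hat{y}_{\mathrm{large}} - \hat{y}_{\mathrm{null}})^{2}\big] \;>\; E[\mathrm{MSPE}_{\mathrm{null}}]

so a raw comparison of MSPEs is tilted toward the parsimonious model, and tests of equal predictive accuracy need to account for the extra term.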

Journal ArticleDOI
28 Mar 2007-JAMA
TL;DR: Tolvaptan initiated for acute treatment of patients hospitalized with heart failure had no effect on long-term mortality or heart failure-related morbidity.
Abstract: Context: Vasopressin mediates fluid retention in heart failure. Tolvaptan, a vasopressin V2 receptor blocker, shows promise for management of heart failure. Objective: To investigate the effects of tolvaptan initiated in patients hospitalized with heart failure. Design, Setting, and Participants: The Efficacy of Vasopressin Antagonism in Heart Failure Outcome Study With Tolvaptan (EVEREST), an event-driven, randomized, double-blind, placebo-controlled study. The outcome trial comprised 4133 patients within 2 short-term clinical status studies, who were hospitalized with heart failure, randomized at 359 North American, South American, and European sites between October 7, 2003, and February 3, 2006, and followed up during long-term treatment. Intervention: Within 48 hours of admission, patients were randomly assigned to receive oral tolvaptan, 30 mg once per day (n = 2072), or placebo (n = 2061) for a minimum of 60 days, in addition to standard therapy. Main Outcome Measures: Dual primary end points were all-cause mortality (superiority and noninferiority) and cardiovascular death or hospitalization for heart failure (superiority only). Secondary end points included changes in dyspnea, body weight, and edema. Results: During a median follow-up of 9.9 months, 537 patients (25.9%) in the tolvaptan group and 543 (26.3%) in the placebo group died (hazard ratio, 0.98; 95% confidence interval [CI], 0.87-1.11; P = .68). The upper confidence limit for the mortality difference was within the prespecified noninferiority margin of 1.25 (P<.001). The composite of cardiovascular death or hospitalization for heart failure occurred in 871 tolvaptan group patients (42.0%) and 829 placebo group patients (40.2%; hazard ratio, 1.04; 95% CI, 0.95-1.14; P = .55). Secondary end points of cardiovascular mortality, cardiovascular death or hospitalization, and worsening heart failure were also not different. Tolvaptan significantly improved secondary end points of day 1 patient-assessed dyspnea, day 1 body weight, and day 7 edema. In patients with hyponatremia, serum sodium levels significantly increased. The Kansas City Cardiomyopathy Questionnaire overall summary score was not improved at outpatient week 1, but body weight and serum sodium effects persisted long after discharge. Tolvaptan caused increased thirst and dry mouth, but frequencies of major adverse events were similar in the 2 groups. Conclusion: Tolvaptan initiated for acute treatment of patients hospitalized with heart failure had no effect on long-term mortality or heart failure-related morbidity. Trial Registration: clinicaltrials.gov Identifier: NCT00071331. Published online March 25, 2007 (doi:10.1001/jama.297.12.1319).

Journal ArticleDOI
06 Jul 2007-Science
TL;DR: This work states that because anthropogenic changes often affect stability and diversity simultaneously, diversity-stability relationships cannot be understood outside the context of the environmental drivers affecting both.
Abstract: Understanding the relationship between diversity and stability requires a knowledge of how species interact with each other and how each is affected by the environment. The relationship is also complex, because the concept of stability is multifaceted; different types of stability describing different properties of ecosystems lead to multiple diversity-stability relationships. A growing number of empirical studies demonstrate positive diversity-stability relationships. These studies, however, have emphasized only a few types of stability, and they rarely uncover the mechanisms responsible for stability. Because anthropogenic changes often affect stability and diversity simultaneously, diversity-stability relationships cannot be understood outside the context of the environmental drivers affecting both. This shifts attention away from diversity-stability relationships toward the multiple factors, including diversity, that dictate the stability of ecosystems.

Journal ArticleDOI
TL;DR: In this paper, the authors analyzed multimodel ensembles for the A2 and B1 emission scenarios produced for the fourth assessment report of the Intergovernmental Panel on Climate Change, with the goal of identifying regions projected to experience high magnitudes of local climate change, development of novel 21st-century climates, and/or the disappearance of extant climates.
Abstract: Key risks associated with projected climate trends for the 21st century include the prospects of future climate states with no current analog and the disappearance of some extant climates. Because climate is a primary control on species distributions and ecosystem processes, novel 21st-century climates may promote formation of novel species associations and other ecological surprises, whereas the disappearance of some extant climates increases risk of extinction for species with narrow geographic or climatic distributions and disruption of existing communities. Here we analyze multimodel ensembles for the A2 and B1 emission scenarios produced for the fourth assessment report of the Intergovernmental Panel on Climate Change, with the goal of identifying regions projected to experience (i) high magnitudes of local climate change, (ii) development of novel 21st-century climates, and/or (iii) the disappearance of extant climates. Novel climates are projected to develop primarily in the tropics and subtropics, whereas disappearing climates are concentrated in tropical montane regions and the poleward portions of continents. Under the high-end A2 scenario, 12-39% and 10-48% of the Earth's terrestrial surface may respectively experience novel and disappearing climates by 2100 AD. Corresponding projections for the low-end B1 scenario are 4-20% and 4-20%. Dispersal limitations increase the risk that species will experience the loss of extant climates or the occurrence of novel climates. There is a close correspondence between regions with globally disappearing climates and previously identified biodiversity hotspots; for these regions, standard conservation solutions (e.g., assisted migration and networked reserves) may be insufficient to preserve biodiversity.
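One plausible way to quantify "novel" and "disappearing" climates (shown here as an illustrative dissimilarity measure; the paper's methods define the exact metric and variables used) is a standardized distance between a grid cell's projected climate and its closest analog in the baseline set:

D_{jk} = \sqrt{\sum_{i} \frac{(b_{ij} - a_{ik})^{2}}{s_{i}^{2}}}

where b_{ij} and a_{ik} are projected (future) and baseline values of climate variable i (for example, seasonal temperature and precipitation) at grid cells j and k, and s_i is a standardization factor such as interannual variability. A future cell whose minimum D over all baseline cells is large has a novel climate; a baseline cell whose minimum D over all future cells is large has a disappearing climate.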

Journal ArticleDOI
15 Sep 2007-Blood
TL;DR: These revisions incorporate advances in tumor cell biology and diagnostic techniques as they pertain to mycosis fungoides (MF) and Sézary syndrome (SS), and clarify variables that currently impede effective interinstitution and interinvestigator communication and the development of standardized clinical trials in MF and SS.

Journal ArticleDOI
TL;DR: A large-scale study to experimentally determine the ultrawideband microwave dielectric properties of a variety of normal, malignant and benign breast tissues, measured from 0.5 to 20 GHz using a precision open-ended coaxial probe, shows that the contrast in the microwave-frequency dielectric properties between malignant and normal adipose-dominated tissues in the breast is considerable, as large as 10:1.
Abstract: The development of microwave breast cancer detection and treatment techniques has been driven by reports of substantial contrast in the dielectric properties of malignant and normal breast tissues. However, definitive knowledge of the dielectric properties of normal and diseased breast tissues at microwave frequencies has been limited by gaps and discrepancies across previously published studies. To address these issues, we conducted a large-scale study to experimentally determine the ultrawideband microwave dielectric properties of a variety of normal, malignant and benign breast tissues, measured from 0.5 to 20 GHz using a precision open-ended coaxial probe. Previously, we reported the dielectric properties of normal breast tissue samples obtained from reduction surgeries. Here, we report the dielectric properties of normal (adipose, glandular and fibroconnective), malignant (invasive and non-invasive ductal and lobular carcinomas) and benign (fibroadenomas and cysts) breast tissue samples obtained from cancer surgeries. We fit a one-pole Cole-Cole model to the complex permittivity data set of each characterized sample. Our analyses show that the contrast in the microwave-frequency dielectric properties between malignant and normal adipose-dominated tissues in the breast is considerable, as large as 10:1, while the contrast in the microwave-frequency dielectric properties between malignant and normal glandular/fibroconnective tissues in the breast is no more than about 10%.
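For reference, the one-pole Cole-Cole model mentioned above has the standard form (parameterization given as background; the fitted parameter values are reported in the paper, not here):

\varepsilon^{*}(\omega) = \varepsilon_{\infty} + \frac{\Delta\varepsilon}{1 + (j\omega\tau)^{1-\alpha}} + \frac{\sigma_{s}}{j\omega\varepsilon_{0}}

where \varepsilon_\infty is the high-frequency permittivity, \Delta\varepsilon the dispersion magnitude, \tau the relaxation time, \alpha the broadening parameter, \sigma_s the static conductivity, and \varepsilon_0 the permittivity of free space; the real and imaginary parts of \varepsilon^{*} give the dielectric constant and loss over the measured 0.5-20 GHz band.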

Journal ArticleDOI
TL;DR: In this article, a general study of primordial scalar non-Gaussianity in single field inflationary models is performed, where the inflaton Lagrangian is an arbitrary function of the scalar field and its first derivative.
Abstract: We perform a general study of primordial scalar non-Gaussianities in single field inflationary models. We consider models where the inflaton Lagrangian is an arbitrary function of the scalar field and its first derivative, and the sound speed is arbitrary. We find that under reasonable assumptions, the non-Gaussianity is completely determined by 5 parameters. In special limits of the parameter space, one finds distinctive ''shapes'' of the non-Gaussianity. In models with a small sound speed, several of these shapes would become potentially observable in the near future. Different limits of our formulae recover various previously known results.
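For background on this class of models (standard definitions, not taken from the abstract): writing the inflaton Lagrangian as P(X, \phi) with X = \tfrac{1}{2}\dot{\phi}^{2} on the homogeneous background, the sound speed that controls the amplitude of the non-Gaussianity is

c_{s}^{2} = \frac{P_{,X}}{P_{,X} + 2X P_{,XX}}

and in the small-sound-speed limit the non-Gaussianity parameter grows roughly as f_{NL} \sim 1/c_{s}^{2}, which is why models with a small sound speed may become observable in the near future.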

Journal ArticleDOI
TL;DR: Fitting this model is a way of exploring leaf-level photosynthesis, although interpreting the results in terms of the underlying biochemistry and biophysics is subject to assumptions that hold to a greater or lesser degree.
Abstract: Photosynthetic responses to carbon dioxide concentration can provide data on a number of important parameters related to leaf physiology. Methods for fitting a model to such data are briefly described. The method will fit the following parameters: Vcmax, J, TPU, Rd and gm [maximum carboxylation rate allowed by ribulose 1,5-bisphosphate carboxylase/oxygenase (Rubisco), rate of photosynthetic electron transport (based on NADPH requirement), triose phosphate use, day respiration and mesophyll conductance, respectively]. The method requires at least five data pairs of net CO2 assimilation (A) and [CO2] in the intercellular airspaces of the leaf (Ci) and requires users to indicate the presumed limiting factor. The output is (1) calculated CO2 partial pressure at the sites of carboxylation, Cc, (2) values for the five parameters at the measurement temperature and (3) values adjusted to 25 °C to facilitate comparisons. Fitting this model is a way of exploring leaf level photosynthesis. However, interpreting leaf level photosynthesis in terms of underlying biochemistry and biophysics is subject to assumptions that hold to a greater or lesser degree, a major assumption being that all parts of the leaf are behaving in the same way at each instant.
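For reference, the Farquhar-type rate equations that such fitting procedures implement are typically written as (standard forms; temperature-dependent constants are not reproduced here):

A_c = V_{cmax}\,\frac{C_c - \Gamma^{*}}{C_c + K_c\,(1 + O/K_o)} - R_d, \qquad A_j = \frac{J\,(C_c - \Gamma^{*})}{4\,C_c + 8\,\Gamma^{*}} - R_d

where \Gamma^{*} is the CO2 compensation point in the absence of day respiration, K_c and K_o are the Michaelis constants for CO2 and O2, O is the oxygen partial pressure, and C_c = C_i - A/g_m links intercellular and chloroplastic CO2 through mesophyll conductance; the fitted curve follows the minimum of the limiting rates over the measured A-Ci pairs.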

Journal ArticleDOI
TL;DR: In this paper, a biphasic system was used to produce 5-hydroxymethylfurfural (HMF) and furfural derivatives from renewable biomass-derived carbohydrates.

Journal ArticleDOI
TL;DR: It is reported that selective expression of an endogenous K-Ras(G12V) oncogene in embryonic cells of acinar/centroacinar lineage results in pancreatic intraepithelial neoplasias (PanINs) and invasive PDA, suggesting that PDA originates by differentiation of acinar/centroacinar cells or their precursors into ductal-like cells.

Book ChapterDOI
TL;DR: Focusing particularly on steam pretreatment, this review examines the influence that pretreatment conditions have on substrate characteristics such as lignin and hemicellulose content, crystallinity, degree of polymerization and specific surface, and the resulting implications for effective hydrolysis by cellulases.
Abstract: Although the structure and function of cellulase systems continue to be the subject of intense research, it is widely acknowledged that the rate and extent of the cellulolytic hydrolysis of lignocellulosic substrates is influenced not only by the effectiveness of the enzymes but also by the chemical, physical and morphological characteristics of the heterogeneous lignocellulosic substrates. Although strategies such as site-directed mutagenesis or directed evolution have been successfully employed to improve cellulase properties such as binding affinity, catalytic activity and thermostability, complementary goals that we and other groups have studied have been the determination of which substrate characteristics are responsible for limiting hydrolysis and the development of pretreatment methods that maximize substrate accessibility to the cellulase complex. Over the last few years we have looked at the various lignocellulosic substrate characteristics at the fiber, fibril and microfibril level that have been modified during pretreatment and subsequent hydrolysis. The initial characteristics of the woody biomass and the effect of subsequent pretreatment play a significant role on the development of substrate properties, which in turn govern the efficacy of enzymatic hydrolysis. Focusing particularly on steam pretreatment, this review examines the influence that pretreatment conditions have on substrate characteristics such as lignin and hemicellulose content, crystallinity, degree of polymerization and specific surface, and the resulting implications for effective hydrolysis by cellulases.

Journal ArticleDOI
TL;DR: Recent advances in the area of MHD stability and disruptions, since the publication of the 1999 ITER Physics Basis document (1999 Nucl. Fusion 39 2137-2664), are reviewed in this paper.
Abstract: Progress in the area of MHD stability and disruptions, since the publication of the 1999 ITER Physics Basis document (1999 Nucl. Fusion 39 2137-2664), is reviewed. Recent theoretical and experimental research has made important advances in both understanding and control of MHD stability in tokamak plasmas. Sawteeth are anticipated in the ITER baseline ELMy H-mode scenario, but the tools exist to avoid or control them through localized current drive or fast ion generation. Active control of other MHD instabilities will most likely be also required in ITER. Extrapolation from existing experiments indicates that stabilization of neoclassical tearing modes by highly localized feedback-controlled current drive should be possible in ITER. Resistive wall modes are a key issue for advanced scenarios, but again, existing experiments indicate that these modes can be stabilized by a combination of plasma rotation and direct feedback control with non-axisymmetric coils. Reduction of error fields is a requirement for avoiding non-rotating magnetic island formation and for maintaining plasma rotation to help stabilize resistive wall modes. Recent experiments have shown the feasibility of reducing error fields to an acceptable level by means of non-axisymmetric coils, possibly controlled by feedback. The MHD stability limits associated with advanced scenarios are becoming well understood theoretically, and can be extended by tailoring of the pressure and current density profiles as well as by other techniques mentioned here. There have been significant advances also in the control of disruptions, most notably by injection of massive quantities of gas, leading to reduced halo current fractions and a larger fraction of the total thermal and magnetic energy dissipated by radiation. These advances in disruption control are supported by the development of means to predict impending disruption, most notably using neural networks. In addition to these advances in means to control or ameliorate the consequences of MHD instabilities, there has been significant progress in improving physics understanding and modelling. This progress has been in areas including the mechanisms governing NTM growth and seeding, in understanding the damping controlling RWM stability and in modelling RWM feedback schemes. For disruptions there has been continued progress on the instability mechanisms that underlie various classes of disruption, on the detailed modelling of halo currents and forces and in refining predictions of quench rates and disruption power loads. Overall the studies reviewed in this chapter demonstrate that MHD instabilities can be controlled, avoided or ameliorated to the extent that they should not compromise ITER operation, though they will necessarily impose a range of constraints.