
Showing papers published by the University of California, San Diego in 2009


Journal ArticleDOI
TL;DR: AutoDock4 incorporates limited flexibility in the receptor; its utility in the analysis of covalently bound ligands is reported, using both a grid-based docking method and a modification of the flexible sidechain technique.
Abstract: We describe the testing and release of AutoDock4 and the accompanying graphical user interface AutoDockTools. AutoDock4 incorporates limited flexibility in the receptor. Several tests are reported here, including a redocking experiment with 188 diverse ligand-protein complexes and a cross-docking experiment using flexible sidechains in 87 HIV protease complexes. We also report its utility in analysis of covalently bound ligands, using both a grid-based docking method and a modification of the flexible sidechain technique.
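As context for the grid-based method: docking programs of this family precompute receptor interaction energies on a regular 3-D lattice and then score ligand atoms by interpolating into that lattice. A minimal sketch of the interpolation step in Python/NumPy; the grid values, spacing, and query point below are invented for illustration, and this is not AutoDock4's actual implementation:

```python
import numpy as np

def trilinear(grid, origin, spacing, point):
    """Interpolate a precomputed energy grid at an arbitrary 3-D point.

    grid    : (nx, ny, nz) array of energies sampled on a regular lattice
    origin  : coordinates of grid[0, 0, 0]
    spacing : lattice spacing (same length units as point)
    No bounds checking, for brevity.
    """
    f = (np.asarray(point) - origin) / spacing   # fractional lattice index
    i0 = np.floor(f).astype(int)                 # lower corner of enclosing cell
    t = f - i0                                   # position within cell, in [0, 1)
    e = 0.0
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = ((t[0] if dx else 1 - t[0]) *
                     (t[1] if dy else 1 - t[1]) *
                     (t[2] if dz else 1 - t[2]))
                e += w * grid[i0[0] + dx, i0[1] + dy, i0[2] + dz]
    return e

# Toy 4x4x4 grid with 0.375 A spacing, queried at one hypothetical atom position
rng = np.random.default_rng(0)
g = rng.normal(size=(4, 4, 4))
print(trilinear(g, origin=np.zeros(3), spacing=0.375, point=(0.5, 0.4, 0.7)))
```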

15,616 citations


Journal ArticleDOI
TL;DR: The popular MEME motif discovery algorithm is now complemented by the GLAM2 algorithm which allows discovery of motifs containing gaps, and all of the motif-based tools are now implemented as web services via Opal.
Abstract: The MEME Suite web server provides a unified portal for online discovery and analysis of sequence motifs representing features such as DNA binding sites and protein interaction domains. The popular MEME motif discovery algorithm is now complemented by the GLAM2 algorithm which allows discovery of motifs containing gaps. Three sequence scanning algorithms—MAST, FIMO and GLAM2SCAN—allow scanning numerous DNA and protein sequence databases for motifs discovered by MEME and GLAM2. Transcription factor motifs (including those discovered using MEME) can be compared with motifs in many popular motif databases using the motif database scanning algorithm Tomtom. Transcription factor motifs can be further analyzed for putative function by association with Gene Ontology (GO) terms using the motif-GO term association tool GOMO. MEME output now contains sequence LOGOS for each discovered motif, as well as buttons to allow motifs to be conveniently submitted to the sequence and motif database scanning algorithms (MAST, FIMO and Tomtom), or to GOMO, for further analysis. GLAM2 output similarly contains buttons for further analysis using GLAM2SCAN and for rerunning GLAM2 with different parameters. All of the motif-based tools are now implemented as web services via Opal. Source code, binaries and a web server are freely available for noncommercial use at http://meme.nbcr.net.
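For orientation, scanners like MAST and FIMO conceptually score each window of a sequence against a motif's log-odds position weight matrix. A small illustrative sketch under that assumption; the motif counts, pseudocount, and sequence are all made up, and this is not the suite's actual code:

```python
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

def pwm_scan(seq, counts, background=0.25, pseudocount=0.5):
    """Slide a log-odds position weight matrix over seq; return per-window scores."""
    counts = np.asarray(counts, dtype=float) + pseudocount   # (width, 4)
    probs = counts / counts.sum(axis=1, keepdims=True)
    logodds = np.log2(probs / background)
    w = len(logodds)
    return [sum(logodds[j, BASES[b]] for j, b in enumerate(seq[i:i + w]))
            for i in range(len(seq) - w + 1)]

# Toy 3-bp motif strongly preferring T-A-T, scanned over a short sequence
counts = [[1, 1, 1, 7], [7, 1, 1, 1], [1, 1, 1, 7]]
print(pwm_scan("GCTATAGC", counts))   # highest score at the TAT window
```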

7,733 citations


Journal ArticleDOI
TL;DR: This paper reviewed the literature on gender differences in economic experiments and identified robust differences in risk preferences, social (other-regarding) preferences, and competitive preferences, speculating on the source of these differences and their implications.
Abstract: This paper reviews the literature on gender differences in economic experiments. In the three main sections, we identify robust differences in risk preferences, social (other-regarding) preferences, and competitive preferences. We also speculate on the source of these differences, as well as on their implications. Our hope is that this article will serve as a resource for those seeking to understand gender differences, and as a starting point to illuminate the debate on gender-specific outcomes in the labor and goods markets.

4,864 citations


Journal ArticleDOI
19 Nov 2009-Nature
TL;DR: This article presents the first genome-wide, single-base-resolution maps of methylated cytosines in a mammalian genome, from both human embryonic stem cells and fetal fibroblasts, along with comparative analysis of messenger RNA and small RNA components of the transcriptome, several histone modifications, and sites of DNA-protein interaction for several key regulatory factors.
Abstract: DNA cytosine methylation is a central epigenetic modification that has essential roles in cellular processes including genome regulation, development and disease. Here we present the first genome-wide, single-base-resolution maps of methylated cytosines in a mammalian genome, from both human embryonic stem cells and fetal fibroblasts, along with comparative analysis of messenger RNA and small RNA components of the transcriptome, several histone modifications, and sites of DNA-protein interaction for several key regulatory factors. Widespread differences were identified in the composition and patterning of cytosine methylation between the two genomes. Nearly one-quarter of all methylation identified in embryonic stem cells was in a non-CG context, suggesting that embryonic stem cells may use different methylation mechanisms to affect gene regulation. Methylation in non-CG contexts showed enrichment in gene bodies and depletion in protein binding sites and enhancers. Non-CG methylation disappeared upon induced differentiation of the embryonic stem cells, and was restored in induced pluripotent stem cells. We identified hundreds of differentially methylated regions proximal to genes involved in pluripotency and differentiation, and widespread reduced methylation levels in fibroblasts associated with lower transcriptional activity. These reference epigenomes provide a foundation for future studies exploring this key epigenetic modification in human disease and development.
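The CG versus non-CG distinction comes down to classifying each cytosine by its downstream bases (CG, CHG, or CHH, following the usual convention where H is A, C, or T). A toy sketch of that bookkeeping on a forward strand; the example sequence is invented, and real methylome pipelines also handle the reverse strand:

```python
from collections import Counter

def cytosine_contexts(seq):
    """Classify every cytosine on the forward strand as CG, CHG, or CHH."""
    seq = seq.upper()
    contexts = Counter()
    for i, base in enumerate(seq):
        if base != "C":
            continue
        nxt, nxt2 = seq[i + 1:i + 2], seq[i + 2:i + 3]
        if nxt == "G":
            contexts["CG"] += 1
        elif nxt2 == "G":
            contexts["CHG"] += 1
        elif nxt and nxt2:            # need two downstream bases to call CHH
            contexts["CHH"] += 1
    return contexts

print(cytosine_contexts("ACGTCCATTCAG"))   # Counter({'CHH': 2, 'CG': 1, 'CHG': 1})
```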

4,266 citations


Journal ArticleDOI
TL;DR: An update on potentially effective antibacterial drugs in the late-stage development pipeline is provided, in the hope of encouraging industry, academia, the National Institutes of Health, the Food and Drug Administration, and the Centers for Disease Control and Prevention to work productively together.
Abstract: The Infectious Diseases Society of America (IDSA) continues to view with concern the lean pipeline for novel therapeutics to treat drug-resistant infections, especially those caused by gram-negative pathogens. Infections now occur that are resistant to all current antibacterial options. Although the IDSA is encouraged by the prospect of success for some agents currently in preclinical development, there is an urgent, immediate need for new agents with activity against these panresistant organisms. There is no evidence that this need will be met in the foreseeable future. Furthermore, we remain concerned that the infrastructure for discovering and developing new antibacterials continues to stagnate, thereby risking the future pipeline of antibacterial drugs. The IDSA proposed solutions in its 2004 policy report, “Bad Bugs, No Drugs: As Antibiotic R&D Stagnates, a Public Health Crisis Brews,” and recently issued a “Call to Action” to provide an update on the scope of the problem and the proposed solutions. A primary objective of these periodic reports is to encourage a community and legislative response to establish greater financial parity between antimicrobial development and the development of other drugs. Although recent actions of the Food and Drug Administration and the 110th US Congress present a glimmer of hope, significant uncertainty remains. Now, more than ever, it is essential to create a robust and sustainable antibacterial research and development infrastructure—one that can respond to current antibacterial resistance now and anticipate evolving resistance. This challenge requires that industry, academia, the National Institutes of Health, the Food and Drug Administration, the Centers for Disease Control and Prevention, the US Department of Defense, and the new Biomedical Advanced Research and Development Authority at the Department of Health and Human Services work productively together. This report provides an update on potentially effective antibacterial drugs in the late-stage development pipeline, in the hope of encouraging such collaborative action.

4,256 citations


Journal ArticleDOI
TL;DR: Biopython includes modules for reading and writing different sequence file formats and multiple sequence alignments, dealing with 3D macro molecular structures, interacting with common tools such as BLAST, ClustalW and EMBOSS, accessing key online databases, as well as providing numerical methods for statistical learning.
Abstract: The Biopython project is a mature open source international collaboration of volunteer developers, providing Python libraries for a wide range of bioinformatics problems. Biopython includes modules for reading and writing different sequence file formats and multiple sequence alignments, dealing with 3D macromolecular structures, interacting with common tools such as BLAST, ClustalW and EMBOSS, accessing key online databases, as well as providing numerical methods for statistical learning.
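As a brief usage illustration, the sequence I/O the abstract mentions is exposed through Bio.SeqIO; the FASTA filename here is hypothetical:

```python
from Bio import SeqIO

# Iterate over records in a (hypothetical) FASTA file and report basic stats.
for record in SeqIO.parse("example.fasta", "fasta"):
    gc = 100 * sum(record.seq.count(b) for b in "GC") / len(record.seq)
    print(f"{record.id}: {len(record.seq)} bp, {gc:.1f}% GC")
```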

3,855 citations


Journal ArticleDOI
03 Sep 2009-Nature
TL;DR: Work in different scientific fields is now suggesting the existence of generic early-warning signals that may indicate for a wide class of systems if a critical threshold is approaching.
Abstract: Complex dynamical systems, ranging from ecosystems to financial markets and the climate, can have tipping points at which a sudden shift to a contrasting dynamical regime may occur. Although predicting such critical points before they are reached is extremely difficult, work in different scientific fields is now suggesting the existence of generic early-warning signals that may indicate for a wide class of systems if a critical threshold is approaching.
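One of the proposed generic early-warning signals is "critical slowing down", often measured as rising lag-1 autocorrelation in a sliding window over a time series. A minimal illustrative sketch on a synthetic series; the AR(1) toy model and window sizes are invented for demonstration and are not from the paper:

```python
import numpy as np

def lag1_autocorr(x):
    """Lag-1 autocorrelation of a 1-D array."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

def rolling_ews(series, window=200):
    """Sliding-window lag-1 autocorrelation; an upward trend can warn of a tipping point."""
    return [lag1_autocorr(series[i:i + window])
            for i in range(len(series) - window + 1)]

# Synthetic AR(1) series whose memory slowly increases, mimicking an approach to a threshold
rng = np.random.default_rng(1)
x, xs = 0.0, []
for t in range(1000):
    phi = 0.2 + 0.7 * t / 1000          # autocorrelation creeps upward over time
    x = phi * x + rng.normal()
    xs.append(x)
print(rolling_ews(xs)[::100])            # indicator drifts upward as phi grows
```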

3,450 citations


Journal ArticleDOI
06 Feb 2009-Science
TL;DR: A field is emerging that leverages the capacity to collect and analyze data at a scale that may reveal patterns of individual and group behaviors.
Abstract: A field is emerging that leverages the capacity to collect and analyze data at a scale that may reveal patterns of individual and group behaviors.

2,619 citations


Proceedings ArticleDOI
12 Dec 2009
TL;DR: Combining power, area, and timing results of McPAT with performance simulation of PARSEC benchmarks at the 22nm technology node for both common in-order and out-of-order manycore designs shows that when die cost is not taken into account, clustering 8 cores together gives the best energy-delay product, whereas when cost is taken into account, configuring clusters with 4 cores gives the best EDA²P and EDAP.
Abstract: This paper introduces McPAT, an integrated power, area, and timing modeling framework that supports comprehensive design space exploration for multicore and manycore processor configurations ranging from 90nm to 22nm and beyond. At the microarchitectural level, McPAT includes models for the fundamental components of a chip multiprocessor, including in-order and out-of-order processor cores, networks-on-chip, shared caches, integrated memory controllers, and multiple-domain clocking. At the circuit and technology levels, McPAT supports critical-path timing modeling, area modeling, and dynamic, short-circuit, and leakage power modeling for each of the device types forecast in the ITRS roadmap including bulk CMOS, SOI, and double-gate transistors. McPAT has a flexible XML interface to facilitate its use with many performance simulators. Combined with a performance simulator, McPAT enables architects to consistently quantify the cost of new ideas and assess tradeoffs of different architectures using new metrics like energy-delay-area² product (EDA²P) and energy-delay-area product (EDAP). This paper explores the interconnect options of future manycore processors by varying the degree of clustering over generations of process technologies. Clustering will bring interesting tradeoffs between area and performance because the interconnects needed to group cores into clusters incur area overhead, but many applications can make good use of them due to synergies of cache sharing. Combining power, area, and timing results of McPAT with performance simulation of PARSEC benchmarks at the 22nm technology node for both common in-order and out-of-order manycore designs shows that when die cost is not taken into account, clustering 8 cores together gives the best energy-delay product, whereas when cost is taken into account, configuring clusters with 4 cores gives the best EDA²P and EDAP.
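The composite metrics in the paper are simple products of energy, delay, and area; a quick sketch of how EDP, EDAP, and EDA²P would compare two cluster configurations (all numbers below are invented for illustration):

```python
def metrics(energy_j, delay_s, area_mm2):
    """Energy-delay and energy-delay-area figures of merit (lower is better)."""
    return {
        "EDP":   energy_j * delay_s,                 # energy-delay product
        "EDAP":  energy_j * delay_s * area_mm2,      # energy-delay-area product
        "EDA2P": energy_j * delay_s * area_mm2 ** 2, # energy-delay-area^2 product
    }

# Hypothetical numbers for two cluster configurations of a manycore design
designs = {"4-core clusters": (1.0, 1.1, 90.0),
           "8-core clusters": (1.2, 1.0, 110.0)}
for name, (e, d, a) in designs.items():
    print(name, metrics(e, d, a))   # area-weighted metrics favor the smaller die
```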

2,487 citations


Journal ArticleDOI
TL;DR: Much progress has been made in the past two years revealing new insights into the regulation and functions of NF-κB, and this recent progress is covered in this review.
Abstract: The mammalian Rel/NF-κB family of transcription factors, including RelA, c-Rel, RelB, NF-κB1 (p50 and its precursor p105), and NF-κB2 (p52 and its precursor p100), plays a central role in the immune system by regulating several processes ranging from the development and survival of lymphocytes and lymphoid organs to the control of immune responses and malignant transformation. The five members of the NF-κB family are normally kept inactive in the cytoplasm by interaction with inhibitors called IκBs or the unprocessed forms of NF-κB1 and NF-κB2. A wide variety of signals emanating from antigen receptors, pattern-recognition receptors, receptors for the members of TNF and IL-1 cytokine families, and others induce differential activation of NF-κB heterodimers. Although work over the past two decades has shed significant light on the regulation of NF-κB transcription factors and their functions, much progress has been made in the past two years revealing new insights into the regulation and functions of NF-κB...

2,380 citations


Journal ArticleDOI
22 May 2009-Science
TL;DR: PYR/PYLs are ABA receptors functioning at the apex of a negative regulatory pathway that controls ABA signaling by inhibiting PP2Cs, illustrating the power of the chemical genetic approach for sidestepping genetic redundancy.
Abstract: Type 2C protein phosphatases (PP2Cs) are vitally involved in abscisic acid (ABA) signaling. Here, we show that a synthetic growth inhibitor called pyrabactin functions as a selective ABA agonist. Pyrabactin acts through PYRABACTIN RESISTANCE 1 (PYR1), the founding member of a family of START proteins called PYR/PYLs, which are necessary for both pyrabactin and ABA signaling in vivo. We show that ABA binds to PYR1, which in turn binds to and inhibits PP2Cs. We conclude that PYR/PYLs are ABA receptors functioning at the apex of a negative regulatory pathway that controls ABA signaling by inhibiting PP2Cs. Our results illustrate the power of the chemical genetic approach for sidestepping genetic redundancy.

Journal ArticleDOI
07 May 2009-Nature
TL;DR: The results define over 55,000 potential transcriptional enhancers in the human genome, significantly expanding the current catalogue of human enhancers and highlighting the role of these elements in cell-type-specific gene expression.
Abstract: The human body is composed of diverse cell types with distinct functions. Although it is known that lineage specification depends on cell-specific gene expression, which in turn is driven by promoters, enhancers, insulators and other cis-regulatory DNA sequences for each gene, the relative roles of these regulatory elements in this process are not clear. We have previously developed a chromatin-immunoprecipitation-based microarray method (ChIP-chip) to locate promoters, enhancers and insulators in the human genome. Here we use the same approach to identify these elements in multiple cell types and investigate their roles in cell-type-specific gene expression. We observed that the chromatin state at promoters and CTCF-binding at insulators is largely invariant across diverse cell types. In contrast, enhancers are marked with highly cell-type-specific histone modification patterns, strongly correlate to cell-type-specific gene expression programs on a global scale, and are functionally active in a cell-type-specific manner. Our results define over 55,000 potential transcriptional enhancers in the human genome, significantly expanding the current catalogue of human enhancers and highlighting the role of these elements in cell-type-specific gene expression.

Book
15 Sep 2009
TL;DR: Providing a systematic approach to the topic, the authors begin with a tutorial introducing the foundations of Isogeometric Analysis, before advancing to a comprehensive coverage of the most recent developments in the technique.
Abstract: “The authors are the originators of isogeometric analysis, are excellent scientists and good educators. It is very original. There is no other book on this topic.” (René de Borst, Eindhoven University of Technology). Written by leading experts in the field and featuring fully integrated colour throughout, Isogeometric Analysis provides a groundbreaking solution for the integration of CAD and FEA technologies. Tom Hughes and his researchers, Austin Cottrell and Yuri Bazilevs, present their pioneering isogeometric approach, which aims to integrate the two techniques of CAD and FEA using precise NURBS geometry in the FEA application. This technology offers the potential to revolutionise automobile, ship and airplane design and analysis by allowing models to be designed, tested and adjusted in one integrative stage. Providing a systematic approach to the topic, the authors begin with a tutorial introducing the foundations of Isogeometric Analysis, before advancing to a comprehensive coverage of the most recent developments in the technique. The authors offer a clear explanation as to how to add isogeometric capabilities to existing finite element computer programs, demonstrating how to implement and use the technology. Detailed programming examples and datasets are included to impart a thorough knowledge and understanding of the material. Provides examples of different applications, showing the reader how to implement isogeometric models. Addresses readers on both sides of the CAD/FEA divide. Describes Non-Uniform Rational B-Splines (NURBS) basis functions.

Journal ArticleDOI
TL;DR: The biophysical and mechanical principles of locomotion at the small scales relevant to cell swimming, tens of micrometers and below, are reviewed, with emphasis on the simple physical picture and fundamental flow physics phenomena in this regime.
Abstract: Cell motility in viscous fluids is ubiquitous and affects many biological processes, including reproduction, infection and the marine life ecosystem. Here we review the biophysical and mechanical principles of locomotion at the small scales relevant to cell swimming, tens of micrometers and below. At this scale, inertia is unimportant and the Reynolds number is small. Our emphasis is on the simple physical picture and fundamental flow physics phenomena in this regime. We first give a brief overview of the mechanisms for swimming motility, and of the basic properties of flows at low Reynolds number, paying special attention to aspects most relevant for swimming such as resistance matrices for solid bodies, flow singularities and kinematic requirements for net translation. Then we review classical theoretical work on cell motility, in particular early calculations of swimming kinematics with prescribed stroke and the application of resistive force theory and slender-body theory to flagellar locomotion. After examining the physical means by which flagella are actuated, we outline areas of active research, including hydrodynamic interactions, biological locomotion in complex fluids, the design of small-scale artificial swimmers and the optimization of locomotion strategies.
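The low-Reynolds-number claim can be checked directly with the standard formula Re = ρUL/μ; a quick worked example using order-of-magnitude values typical of a swimming bacterium in water (the specific numbers are illustrative):

```python
# Reynolds number for a swimming bacterium in water
rho = 1e3    # fluid density, kg/m^3
mu = 1e-3    # dynamic viscosity, Pa*s
U = 30e-6    # swimming speed, m/s (~30 um/s)
L = 2e-6     # body length, m (~2 um)

Re = rho * U * L / mu
print(f"Re ~ {Re:.0e}")   # ~6e-5: inertia is utterly negligible at this scale
```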

Proceedings ArticleDOI
09 Nov 2009
TL;DR: It is shown that it is possible to map the internal cloud infrastructure, identify where a particular target VM is likely to reside, and instantiate new VMs until one is placed co-resident with the target; such placement can then be used to mount cross-VM side-channel attacks to extract information from a target VM on the same machine.
Abstract: Third-party cloud computing represents the promise of outsourcing as applied to computation. Services, such as Microsoft's Azure and Amazon's EC2, allow users to instantiate virtual machines (VMs) on demand and thus purchase precisely the capacity they require when they require it. In turn, the use of virtualization allows third-party cloud providers to maximize the utilization of their sunk capital costs by multiplexing many customer VMs across a shared physical infrastructure. However, in this paper, we show that this approach can also introduce new vulnerabilities. Using the Amazon EC2 service as a case study, we show that it is possible to map the internal cloud infrastructure, identify where a particular target VM is likely to reside, and then instantiate new VMs until one is placed co-resident with the target. We explore how such placement can then be used to mount cross-VM side-channel attacks to extract information from a target VM on the same machine.

Journal ArticleDOI
Lorenzo Galluzzi, Stuart A. Aaronson, John M. Abrams, Emad S. Alnemri, David W. Andrews, Eric H. Baehrecke, Nicolas G. Bazan, Mikhail V. Blagosklonny, Klas Blomgren, Christoph Borner, Dale E. Bredesen, Catherine Brenner, Maria Castedo, John A. Cidlowski, Aaron Ciechanover, Gerald M. Cohen, V. De Laurenzi, R. De Maria, Mohanish Deshmukh, Brian David Dynlacht, Wafik S. El-Deiry, Richard A. Flavell, Simone Fulda, Carmen Garrido, Pierre Golstein, Marie-Lise Gougeon, Douglas R. Green, Hinrich Gronemeyer, György Hajnóczky, J. M. Hardwick, Michael O. Hengartner, Hidenori Ichijo, Marja Jäättelä, Oliver Kepp, Adi Kimchi, Daniel J. Klionsky, Richard A. Knight, Sally Kornbluth, Sharad Kumar, Beth Levine, Stuart A. Lipton, Enrico Lugli, Frank Madeo, Walter Malorni, Jean-Christophe Marine, Seamus J. Martin, Jan Paul Medema, Patrick Mehlen, Gerry Melino, Ute M. Moll, Eugenia Morselli, Shigekazu Nagata, Donald W. Nicholson, Pierluigi Nicotera, Gabriel Núñez, Moshe Oren, Josef M. Penninger, Shazib Pervaiz, Marcus E. Peter, Mauro Piacentini, Jochen H. M. Prehn, Hamsa Puthalakath, Gabriel A. Rabinovich, Rosario Rizzuto, Cecília M. P. Rodrigues, David C. Rubinsztein, Thomas Rudel, Luca Scorrano, Hans-Uwe Simon, Hermann Steller, J. Tschopp, Yoshihide Tsujimoto, Peter Vandenabeele, Ilio Vitale, Karen H. Vousden, Richard J. Youle, Junying Yuan, Boris Zhivotovsky, Guido Kroemer
Affiliations: Institut Gustave Roussy, French Institute of Health and Medical Research, University of Paris-Sud, Icahn School of Medicine at Mount Sinai, University of Texas Southwestern Medical Center, Thomas Jefferson University, McMaster University, University of Massachusetts Medical School, LSU Health Sciences Center New Orleans, Roswell Park Cancer Institute, University of Gothenburg, Boston Children's Hospital, University of Freiburg, Buck Institute for Research on Aging, University of California, San Francisco, Centre national de la recherche scientifique, National Institutes of Health, Technion – Israel Institute of Technology, University of Leicester, University of Chieti-Pescara, Istituto Superiore di Sanità, University of North Carolina at Chapel Hill, New York University, University of Pennsylvania, Yale University, Howard Hughes Medical Institute, University of Ulm, University of Burgundy, Aix-Marseille University, Pasteur Institute, University of Strasbourg, Johns Hopkins University, University of Zurich, University of Tokyo, Weizmann Institute of Science, University of Michigan, University College London, Duke University, University of Graz, Ghent University, Trinity College, Dublin, University of Amsterdam, University of Lyon, University of Rome Tor Vergata, University of Göttingen, Stony Brook University, Kyoto University, Merck & Co., Austrian Academy of Sciences, National University of Singapore, University of Chicago, Royal College of Surgeons in Ireland, La Trobe University, University of Buenos Aires, University of Padua, University of Lisbon, University of Cambridge, University of Würzburg, University of Geneva, University of Bern, Rockefeller University, University of Lausanne, Osaka University, University of California, San Diego, University of Glasgow, Harvard University, Karolinska Institutet
TL;DR: A nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls is provided and the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells is emphasized.
Abstract: Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.

Journal ArticleDOI
TL;DR: Research on the following topics is reviewed with respect to reading: (a) the perceptual span (or span of effective vision), (b) preview benefit, (c) eye movement control, and (d) models of eye movements.
Abstract: Eye movements are now widely used to investigate cognitive processes during reading, scene perception, and visual search. In this article, research on the following topics is reviewed with respect to reading: (a) the perceptual span (or span of effective vision), (b) preview benefit, (c) eye movement control, and (d) models of eye movements. Related issues with respect to eye movements during scene perception and visual search are also reviewed. It is argued that research on eye movements during reading has been somewhat advanced over research on eye movements in scene perception and visual search and that some of the paradigms developed to study reading should be more widely adopted in the study of scene perception and visual search. Research dealing with "real-world" tasks and research utilizing the visual-world paradigm are also briefly discussed.

Journal ArticleDOI
31 Jul 2009-Science
TL;DR: Current trends in world fisheries are analyzed from a fisheries and conservation perspective, finding that 63% of assessed fish stocks worldwide still require rebuilding, and even lower exploitation rates are needed to reverse the collapse of vulnerable species.
Abstract: After a long history of overexploitation, increasing efforts to restore marine ecosystems and rebuild fisheries are under way. Here, we analyze current trends from a fisheries and conservation perspective. In 5 of 10 well-studied ecosystems, the average exploitation rate has recently declined and is now at or below the rate predicted to achieve maximum sustainable yield for seven systems. Yet 63% of assessed fish stocks worldwide still require rebuilding, and even lower exploitation rates are needed to reverse the collapse of vulnerable species. Combined fisheries and conservation objectives can be achieved by merging diverse management actions, including catch restrictions, gear modification, and closed areas, depending on local context. Impacts of international fleets and the lack of alternatives to fishing complicate prospects for rebuilding fisheries in many poorer regions, highlighting the need for a global perspective on rebuilding marine resources.

Journal ArticleDOI
07 Jan 2009-JAMA
TL;DR: Selenium or vitamin E, alone or in combination at the doses and formulations used, did not prevent prostate cancer in this population of relatively healthy men.
Abstract: Context Secondary analyses of 2 randomized controlled trials and supportive epidemiologic and preclinical data indicated the potential of selenium and vitamin E for preventing prostate cancer. Objective To determine whether selenium, vitamin E, or both could prevent prostate cancer and other diseases with little or no toxicity in relatively healthy men. Design, Setting, and Participants A randomized, placebo-controlled trial (Selenium and Vitamin E Cancer Prevention Trial [SELECT]) of 35 533 men from 427 participating sites in the United States, Canada, and Puerto Rico randomly assigned to 4 groups (selenium, vitamin E, selenium + vitamin E, and placebo) in a double-blind fashion between August 22, 2001, and June 24, 2004. Baseline eligibility included age 50 years or older (African American men) or 55 years or older (all other men), a serum prostate-specific antigen level of 4 ng/mL or less, and a digital rectal examination not suspicious for prostate cancer. Interventions Oral selenium (200 μg/d from L-selenomethionine) and matched vitamin E placebo, vitamin E (400 IU/d of all rac-α-tocopheryl acetate) and matched selenium placebo, selenium + vitamin E, or placebo + placebo for a planned follow-up of a minimum of 7 years and a maximum of 12 years. Main Outcome Measures Prostate cancer and prespecified secondary outcomes, including lung, colorectal, and overall primary cancer. Results As of October 23, 2008, median overall follow-up was 5.46 years (range, 4.17-7.33 years). Hazard ratios (99% confidence intervals [CIs]) for prostate cancer were 1.13 (99% CI, 0.95-1.35; n = 473) for vitamin E, 1.04 (99% CI, 0.87-1.24; n = 432) for selenium, and 1.05 (99% CI, 0.88-1.25; n = 437) for selenium + vitamin E vs 1.00 (n = 416) for placebo. There were no significant differences (all P>.15) in any other prespecified cancer end points. There were statistically nonsignificant increased risks of prostate cancer in the vitamin E group (P = .06) and type 2 diabetes mellitus in the selenium group (relative risk, 1.07; 99% CI, 0.94-1.22; P = .16) but not in the selenium + vitamin E group. Conclusion Selenium or vitamin E, alone or in combination at the doses and formulations used, did not prevent prostate cancer in this population of relatively healthy men. Trial Registration clinicaltrials.gov identifier: NCT00006392. Published online December 9, 2008 (doi:10.1001/jama.2008.864).

Journal ArticleDOI
TL;DR: It is demonstrated that IL-6 is a critical tumor promoter during early CAC tumorigenesis and the NF-κB-IL-6-Stat3 cascade is an important regulator of the proliferation and survival of tumor-initiating IECs.

Journal ArticleDOI
TL;DR: A cerebrospinal fluid biomarker signature for mild Alzheimer's disease (AD) is developed in Alzheimer's Disease Neuroimaging Initiative (ADNI) subjects.
Abstract: If the clinical diagnosis of probable AD is imprecise with accuracy rates of approximately 90% or lower using established consensus criteria for probable AD, but definite AD requires autopsy confirmation, it is not surprising that diagnostic accuracy is lower at early and presymptomatic stages of AD.1–4 It is believed that the development of full-blown AD takes place over an approximately 20-year prodromal period, but this is difficult to determine in the absence of biomarkers that reliably signal the onset of nascent disease before the emergence of measurable cognitive impairments. Because intervention with disease-modifying therapies for AD is likely to be most efficacious before significant neurodegeneration has occurred, there is an urgent need for biomarker-based tests that enable a more accurate and early diagnosis of AD.5–7 Moreover, such tests could also improve monitoring AD progression, evaluation of new AD therapies, and enrichment of AD cohorts with specific subsets of AD subjects in clinical trials. The defining lesions of AD are neurofibrillary tangles and senile plaques formed, respectively, by neuronal accumulations of abnormal hyperphosphorylated tau filaments and extracellular deposits of amyloid β (Aβ) fibrils, mostly the 1 to 42 peptide (Aβ1-42), the least soluble of the known Aβ peptides produced from Aβ precursor protein by the action of various peptidases.1–3 Hence, for these and other reasons summarized in consensus reports on AD biomarkers, cerebrospinal fluid (CSF), total tau (t-tau), and Aβ were identified as being among the most promising and informative AD biomarkers.5,6 Increased levels of tau in CSF are thought to occur after its release from damaged and dying neurons that harbor dystrophic tau neurites and tangles, whereas reduced CSF levels of Aβ1-42 are believed to result from large-scale accumulation of this least soluble of Aβ peptides into insoluble plaques in the AD brain. The combination of increased CSF concentrations of t-tau and phosphotau (p-tau) species and decreased concentrations of Aβ1-42 are considered to be a pathological CSF biomarker signature that is diagnostic for AD.5,6,8,9 Notably, recent studies have provided compelling preliminary data to suggest that this combination of CSF tau and Aβ biomarker changes may predict the conversion to AD in mild cognitive impairment (MCI) subjects.10 Thus, an increase in levels of CSF tau associated with a decline in levels of CSF Aβ1-42 may herald the onset of AD before it becomes clinically manifest. However, before the utility of CSF Aβ1-42 and tau concentrations for diagnosis of AD can be established, it is critical to standardize the methodology for their measurement.5–8,10 For example, among the published studies of CSF tau and Aβ, there is considerable variability in the observed levels of these analytes, as well as their diagnostic sensitivity and specificity. This is attributable to variability in analytical methodology standardization and other factors that differ between studies of the same CSF analytes in similar but not identical cohorts.5–7 The Alzheimer’s Disease Neuroimaging Initiative (ADNI) was launched in 2004 to address these and other limitations in AD biomarkers (see reviews in Shaw and colleagues7 and Mueller and coauthors,11 and the ADNI Web site [http://www.adni-info.org/index] where the ADNI grant and all ADNI data are posted for public access). 
To this end, the Biomarker Core of ADNI conducts studies on ADNI-derived CSF samples to measure CSF Aβ1-42, t-tau, and p-tau (tau phosphorylated at threonine181 [p-tau181p]) in standardized assays. Evaluation of CSF obtained at baseline evaluation of 416 of the 819 ADNI subjects is now complete, and we report here our findings on the performance of these tests using a standardized multiplex immunoassay system that measures the biomarkers simultaneously in the same sample aliquot in ADNI subjects and in an independent cohort of autopsy-confirmed AD cases.

Journal ArticleDOI
TL;DR: This tutorial article surveys some of these techniques based on stochastic geometry and the theory of random geometric graphs, discusses their application to model wireless networks, and presents some of the main results that have appeared in the literature.
Abstract: Wireless networks are fundamentally limited by the intensity of the received signals and by their interference. Since both of these quantities depend on the spatial location of the nodes, mathematical techniques have been developed in the last decade to provide communication-theoretic results accounting for the network's geometrical configuration. Often, the location of the nodes in the network can be modeled as random, following for example a Poisson point process. In this case, different techniques based on stochastic geometry and the theory of random geometric graphs (including point process theory, percolation theory, and probabilistic combinatorics) have led to results on the connectivity, the capacity, the outage probability, and other fundamental limits of wireless networks. This tutorial article surveys some of these techniques, discusses their application to model wireless networks, and presents some of the main results that have appeared in the literature. It also serves as an introduction to the field for the other papers in this special issue.
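A minimal sketch of the modeling style the tutorial describes: scatter unit-power interferers as a Poisson point process in a disc and sum their power-law path-loss contributions at the origin. The intensity, disc radius, path-loss exponent, and near-field cutoff below are all invented for illustration:

```python
import numpy as np

def interference_at_origin(intensity, radius, alpha=4.0, rng=None):
    """Aggregate interference at the origin from a Poisson field of unit-power nodes."""
    rng = rng or np.random.default_rng()
    n = rng.poisson(intensity * np.pi * radius ** 2)   # Poisson number of nodes
    r = radius * np.sqrt(rng.random(n))                # radii uniform over the disc
    r = r[r > 1e-3]                                    # crude near-field cutoff
    return np.sum(r ** -alpha)                         # power-law path loss r^-alpha

rng = np.random.default_rng(42)
samples = [interference_at_origin(intensity=0.1, radius=50.0, rng=rng)
           for _ in range(1000)]
print(np.median(samples))   # heavy-tailed: the median sits far below the mean
```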

Proceedings ArticleDOI
20 Jun 2009
TL;DR: It is shown that using Multiple Instance Learning (MIL) instead of traditional supervised learning avoids these problems, and can therefore lead to a more robust tracker with fewer parameter tweaks.
Abstract: In this paper, we address the problem of learning an adaptive appearance model for object tracking. In particular, a class of tracking techniques called “tracking by detection” have been shown to give promising results at real-time speeds. These methods train a discriminative classifier in an online manner to separate the object from the background. This classifier bootstraps itself by using the current tracker state to extract positive and negative examples from the current frame. Slight inaccuracies in the tracker can therefore lead to incorrectly labeled training examples, which degrades the classifier and can cause further drift. In this paper we show that using Multiple Instance Learning (MIL) instead of traditional supervised learning avoids these problems, and can therefore lead to a more robust tracker with fewer parameter tweaks. We present a novel online MIL algorithm for object tracking that achieves superior results with real-time performance.
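The core MIL ingredient is scoring labeled bags of instances rather than single patches; under the commonly used noisy-OR model, a bag is positive if at least one instance is. A tiny sketch of that scoring step (the instance probabilities are invented, and this is not the paper's full online boosting algorithm):

```python
import numpy as np

def bag_probability(instance_probs):
    """Noisy-OR: P(bag positive) = 1 - prod(1 - p_i) over instances in the bag."""
    p = np.asarray(instance_probs, dtype=float)
    return 1.0 - np.prod(1.0 - p)

# A bag of image patches cropped near the tracker: only one needs to hit the object
print(bag_probability([0.05, 0.10, 0.92, 0.08]))   # ~0.94, bag looks positive
print(bag_probability([0.05, 0.10, 0.07, 0.08]))   # ~0.27, bag looks negative
```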

Journal ArticleDOI
TL;DR: LPSiNPs, a new type of multifunctional nanostructure with a low-toxicity degradation pathway for in vivo applications, are presented; they can carry a drug payload, and their intrinsic near-infrared photoluminescence enables monitoring of both accumulation and degradation in vivo.
Abstract: Nanomaterials that can circulate in the body hold great potential to diagnose and treat disease1–4. For such applications, it is important that the nanomaterials be harmlessly eliminated from the body in a reasonable period of time after they carry out their diagnostic or therapeutic function. Despite efforts to improve their targeting efficiency, significant quantities of systemically administered nanomaterials are cleared by the mononuclear phagocytic system before finding their targets, increasing the likelihood of unintended acute or chronic toxicity. However, there has been little effort to engineer the self-destruction of errant nanoparticles into non-toxic, systemically eliminated products. Here, we present luminescent porous silicon nanoparticles (LPSiNPs) that can carry a drug payload and of which the intrinsic near-infrared photoluminescence enables monitoring of both accumulation and degradation in vivo. Furthermore, in contrast to most optically active nanomaterials (carbon nanotubes, gold nanoparticles and quantum dots), LPSiNPs self-destruct in a mouse model into renally cleared components in a relatively short period of time with no evidence of toxicity. As a preliminary in vivo application, we demonstrate tumour imaging using dextran-coated LPSiNPs (D-LPSiNPs). These results demonstrate a new type of multifunctional nanostructure with a low-toxicity degradation pathway for in vivo applications. The in vivo use of nanomaterials as therapeutic and diagnostic agents is of intense interest owing to their unique properties such as large specific capacity for drug loading2, strong superparamagnetism3, efficient photoluminescence1,5 or distinctive Raman signatures4, among others. Materials with sizes in the range of 20–200 nm can avoid renal filtration, leading to prolonged residence in the body.

Journal ArticleDOI
TL;DR: This tutorial review will present much of the significant work that has been done in the field of electrocatalytic and homogeneous reduction of carbon dioxide over the past three decades and extend the discussion to the important conclusions from previous work and recommendations for future directions to develop a catalytic system that will convert carbon dioxide to liquid fuels with high efficiencies.
Abstract: Research in the field of catalytic reduction of carbon dioxide to liquid fuels has grown rapidly in the past few decades. This is due to the increasing amount of carbon dioxide in the atmosphere and a steady climb in global fuel demand. This tutorial review will present much of the significant work that has been done in the field of electrocatalytic and homogeneous reduction of carbon dioxide over the past three decades. It will then extend the discussion to the important conclusions from previous work and recommendations for future directions to develop a catalytic system that will convert carbon dioxide to liquid fuels with high efficiencies.

Journal ArticleDOI
TL;DR: The rapid increase in reports on PSM, in both the scope of chemical reactions and the range of suitable MOFs, demonstrates that this methodology will play an increasingly important role in the development of MOFs for the foreseeable future.
Abstract: The modification of metal–organic frameworks (MOFs) in a postsynthetic scheme is discussed in this critical review. In this approach, the MOF is assembled and then modified with chemical reagents with preservation of the lattice structure. Recent findings show amide couplings, isocyanate condensations, ‘click’ chemistry, and other reactions are suitable for postsynthetic modification (PSM). In addition, a number of MOFs, from IRMOF-3 to ZIF-90, are amenable to PSM. The generality of PSM, in both scope of chemical reactions and range of suitable MOFs, clearly indicates that the approach is broadly applicable. Indeed, the rapid increase in reports on PSM demonstrates this methodology will play an increasingly important role in the development of MOFs for the foreseeable future (117 references).

DatasetDOI
TL;DR: This report updates and combines earlier versions of the guidelines for the prevention and treatment of opportunistic infections (OIs) in HIV-infected adults and adolescents, last published in 2002 and 2004, respectively.
Abstract: This report updates and combines earlier versions of guidelines for the prevention and treatment of opportunistic infections (OIs) in HIV-infected adults (i.e., persons aged >/=18 years) and adolescents (i.e., persons aged 13--17 years), last published in 2002 and 2004, respectively. It has been prepared by the Centers for Disease Control and Prevention (CDC), the National Institutes of Health (NIH), and the HIV Medicine Association (HIVMA) of the Infectious Diseases Society of America (IDSA). The guidelines are intended for use by clinicians and other health-care providers, HIV-infected patients, and policy makers in the United States. These guidelines address several OIs that occur in the United States and five OIs that might be acquired during international travel. Topic areas covered for each OI include epidemiology, clinical manifestations, diagnosis, prevention of exposure; prevention of disease by chemoprophylaxis and vaccination; discontinuation of primary prophylaxis after immune reconstitution; treatment of disease; monitoring for adverse effects during treatment; management of treatment failure; prevention of disease recurrence; discontinuation of secondary prophylaxis after immune reconstitution; and special considerations during pregnancy. These guidelines were developed by a panel of specialists from the United States government and academic institutions. For each OI, a small group of specialists with content-matter expertise reviewed the literature for new information since the guidelines were last published; they then proposed revised recommendations at a meeting held at NIH in June 2007. After these presentations and discussion, the revised guidelines were further reviewed by the co-editors; by the Office of AIDS Research, NIH; by specialists at CDC; and by HIVMA of IDSA before final approval and publication. The recommendations are rated by a letter that indicates the strength of the recommendation and a Roman numeral that indicates the quality of evidence supporting the recommendation, so that readers can ascertain how best to apply the recommendations in their practice environments. Major changes in the guidelines include 1) greater emphasis on the importance of antiretroviral therapy for the prevention and treatment of OIs, especially those OIs for which no specific therapy exists; 2) information regarding the diagnosis and management of immune reconstitution inflammatory syndromes; 3) information regarding the use of interferon-gamma release assays for the diagnosis of latent Mycobacterium tuberculosis (TB) infection; 4) updated information concerning drug interactions that affect the use of rifamycin drugs for prevention and treatment of TB; 5) the addition of a section on hepatitis B virus infection; and 6) the addition of malaria to the list of OIs that might be acquired during international travel. This report includes eleven tables pertinent to the prevention and treatment of OIs, a figure that pertains to the diagnosis of tuberculosis, a figure that describes immunization recommendations, and an appendix that summarizes recommendations for prevention of exposure to opportunistic pathogens.

Journal ArticleDOI
TL;DR: Recommendations from the ACE and the ADA generally endorsed tight glycemic control in critical care units; for patients in general medical and surgical units, where RCT evidence regarding treatment targets was lacking, glycemic goals similar to those advised for outpatients were advocated.
Abstract: People with diabetes are more likely to be hospitalized and to have longer durations of hospital stay than those without diabetes. A recent survey estimated that 22% of all hospital inpatient days were incurred by people with diabetes and that hospital inpatient care accounted for half of the 174 billion USD total U.S. medical expenditures for this disease (1). These findings are due, in part, to the continued expansion of the worldwide epidemic of type 2 diabetes. In the U.S. alone, there are ∼1.6 million new cases of diabetes each year, with an overall prevalence of 23.6 million people (7.8% of the population, with one-fourth of the cases remaining undiagnosed). An additional 57 million American adults are at high risk for type 2 diabetes (2). Although the costs of illness-related stress hyperglycemia are not known, they are likely to be considerable in light of the poor prognosis of such patients (3–6). There is substantial observational evidence linking hyperglycemia in hospitalized patients (with or without diabetes) to poor outcomes. Cohort studies as well as a few early randomized controlled trials (RCTs) have suggested that intensive treatment of hyperglycemia improved hospital outcomes (5–8). In 2004, this evidence led the American College of Endocrinology (ACE) and the American Association of Clinical Endocrinologists (AACE), in collaboration with the American Diabetes Association (ADA) and other medical organizations, to develop recommendations for treatment of inpatient hyperglycemia (9). In 2005, the ADA added recommendations for treatment of hyperglycemia in the hospital to its annual Standards of Medical Care (10). Recommendations from the ACE and the ADA generally endorsed tight glycemic control in critical care units. For patients in general medical and surgical units, where RCT evidence regarding treatment targets was lacking, glycemic goals similar to those advised for outpatients were advocated (9, …

Journal ArticleDOI
21 Aug 2009-Science
TL;DR: It is found that expression of the transcription factor Bcl6 in CD4+ T cells is both necessary and sufficient for in vivo TFH differentiation and T cell help to B cells in mice, and that Bcl6 and Blimp-1 play central but opposing roles in TFH differentiation.
Abstract: Effective B cell–mediated immunity and antibody responses often require help from CD4+ T cells. It is thought that a distinct CD4+ effector T cell subset, called T follicular helper cells (TFH), provides this help; however, the molecular requirements for TFH differentiation are unknown. We found that expression of the transcription factor Bcl6 in CD4+ T cells is both necessary and sufficient for in vivo TFH differentiation and T cell help to B cells in mice. In contrast, the transcription factor Blimp-1, an antagonist of Bcl6, inhibits TFH differentiation and help, thereby preventing B cell germinal center and antibody responses. These findings demonstrate that TFH cells are required for proper B cell responses in vivo and that Bcl6 and Blimp-1 play central but opposing roles in TFH differentiation.

Proceedings ArticleDOI
01 Jan 2009
TL;DR: It is demonstrated that when designed properly, integral channel features not only outperform other features including histogram of oriented gradient (HOG), they also result in fast detectors when coupled with cascade classifiers.
Abstract: We study the performance of ‘integral channel features’ for image classification tasks, focusing in particular on pedestrian detection. The general idea behind integral channel features is that multiple registered image channels are computed using linear and non-linear transformations of the input image, and then features such as local sums, histograms, and Haar features and their various generalizations are efficiently computed using integral images. Such features have been used in recent literature for a variety of tasks – indeed, variations appear to have been invented independently multiple times. Although integral channel features have proven effective, little effort has been devoted to analyzing or optimizing the features themselves. In this work we present a unified view of the relevant work in this area and perform a detailed experimental evaluation. We demonstrate that when designed properly, integral channel features not only outperform other features including histogram of oriented gradient (HOG), they also (1) naturally integrate heterogeneous sources of information, (2) have few parameters and are insensitive to exact parameter settings, (3) allow for more accurate spatial localization during detection, and (4) result in fast detectors when coupled with cascade classifiers.
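The efficiency claim rests on integral images: after one cumulative-sum pass over a channel, any rectangular local sum costs just four array lookups. A minimal sketch on a toy channel (the 4x4 array is invented for illustration):

```python
import numpy as np

def integral_image(img):
    """Zero-padded cumulative sum so that any box sum needs four lookups."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def box_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) via the integral image."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

img = np.arange(16, dtype=float).reshape(4, 4)   # toy 'channel'
ii = integral_image(img)
assert box_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()   # 5+6+9+10 = 30
print(box_sum(ii, 1, 1, 3, 3))
```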