Author

Takashi Saito

Bio: Takashi Saito is an academic researcher from Nagoya City University. The author has contributed to research in the topics Medicine and MAGIC (telescope). The author has an h-index of 112 and has co-authored 1,041 publications receiving 52,937 citations. Previous affiliations of Takashi Saito include Mitsubishi Electric & Nippon Medical School.


Papers
Journal ArticleDOI
21 Jan 2011-Science
TL;DR: Oral inoculation of Clostridium during the early life of conventionally reared mice resulted in resistance to colitis and systemic immunoglobulin E responses in adult mice, suggesting a new therapeutic approach to autoimmunity and allergy.
Abstract: CD4+ T regulatory cells (Tregs), which express the Foxp3 transcription factor, play a critical role in the maintenance of immune homeostasis. Here, we show that in mice, Tregs were most abundant in the colonic mucosa. The spore-forming component of indigenous intestinal microbiota, particularly clusters IV and XIVa of the genus Clostridium, promoted Treg cell accumulation. Colonization of mice by a defined mix of Clostridium strains provided an environment rich in transforming growth factor–β and affected Foxp3+ Treg number and function in the colon. Oral inoculation of Clostridium during the early life of conventionally reared mice resulted in resistance to colitis and systemic immunoglobulin E responses in adult mice, suggesting a new therapeutic approach to autoimmunity and allergy.

3,096 citations

Journal ArticleDOI
Marcos Daniel Actis, G. Agnetta, Felix Aharonian, A. G. Akhperjanian, +682 more (109 institutions)
TL;DR: Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes, as mentioned in this paper. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and an extension to energies well below 100 GeV and above 100 TeV.
Abstract: Ground-based gamma-ray astronomy has had a major breakthrough with the impressive results obtained using systems of imaging atmospheric Cherenkov telescopes. Ground-based gamma-ray astronomy has a huge potential in astrophysics, particle physics and cosmology. CTA is an international initiative to build the next-generation instrument, with a factor of 5-10 improvement in sensitivity in the 100 GeV-10 TeV range and the extension to energies well below 100 GeV and above 100 TeV. CTA will consist of two arrays (one in the north, one in the south) for full sky coverage and will be operated as an open observatory. The design of CTA is based on currently available technology. This document reports on the status and presents the major design concepts of CTA.

1,006 citations

Journal ArticleDOI
29 Sep 1995-Science
TL;DR: The medium chains of two clathrin-associated protein complexes specifically interacted with tyrosine-based signals of several integral membrane proteins, and it is likely that the medium chains serve as signal-binding components of the clathrin-dependent sorting machinery.
Abstract: Tyrosine-based signals within the cytoplasmic domain of integral membrane proteins mediate clathrin-dependent protein sorting in the endocytic and secretory pathways. A yeast two-hybrid system was used to identify proteins that bind to tyrosine-based signals. The medium chains (mu 1 and mu 2) of two clathrin-associated protein complexes (AP-1 and AP-2, respectively) specifically interacted with tyrosine-based signals of several integral membrane proteins. The interaction was confirmed by in vitro binding assays. Thus, it is likely that the medium chains serve as signal-binding components of the clathrin-dependent sorting machinery.

952 citations

Journal ArticleDOI
TL;DR: It is shown that FcγRs have two additional specific attributes in murine DCs: the induction of DC maturation and the promotion of efficient MHC class I–restricted presentation of peptides from exogenous, IgG-complexed antigens.
Abstract: Dendritic cells (DCs) express several receptors for the Fc portion of immunoglobulin (Ig)G (FcγR), which mediate internalization of antigen–IgG complexes (immune complexes, ICs) and promote efficient major histocompatibility complex (MHC) class II–restricted antigen presentation. We now show that FcγRs have two additional specific attributes in murine DCs: the induction of DC maturation and the promotion of efficient MHC class I–restricted presentation of peptides from exogenous, IgG-complexed antigens. Both FcγR functions require the FcγR-associated γ chain. FcγR-mediated MHC class I–restricted antigen presentation is extremely sensitive and specific to immature DCs. It requires proteasomal degradation and is dependent on functional peptide transporter associated with antigen processing, TAP1-TAP2. By promoting DC maturation and presentation on both MHC class I and II molecules, ICs should efficiently sensitize DCs for priming of both CD4+ helper and CD8+ cytotoxic T lymphocytes in vivo.

936 citations

Journal ArticleDOI
01 Jul 2006-Immunity
TL;DR: It is proposed that TCR signaling is sustained by stabilized microclusters and is terminated in the cSMAC, a structure from which TCRs are sorted for degradation, and a role for F-actin in TCR signaling beyond microcluster formation is revealed.

828 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently—those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90% and an 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
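
A minimal, illustrative Python sketch of the third (spatial-decomposition) idea from the abstract follows: the simulation box is divided into cells no smaller than the force cutoff, each cell owns the atoms inside it, and short-range forces are evaluated only against atoms in the same or adjacent cells. This is a serial toy with an assumed box size, cutoff, and reduced Lennard-Jones units, not the paper's message-passing implementation, which distributes these cells across processors and exchanges boundary atoms between them.

import numpy as np

# Toy serial sketch of spatial decomposition for short-range MD.
# Box size, cutoff, atom count, and reduced LJ units are assumptions.
BOX = 10.0          # cubic box edge length (assumed)
CUTOFF = 2.5        # force cutoff in reduced units (assumed)
N_ATOMS = 500

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, BOX, size=(N_ATOMS, 3))

n_cells = int(BOX // CUTOFF)      # cells per dimension, each >= cutoff wide
cell_size = BOX / n_cells

def cell_index(r):
    """Map a position to the (i, j, k) index of its owning cell."""
    return tuple((r // cell_size).astype(int) % n_cells)

# Assign every atom to a cell: the 'fixed spatial region' assignment.
cells = {}
for a, r in enumerate(pos):
    cells.setdefault(cell_index(r), []).append(a)

def lj_force(rij):
    """Lennard-Jones force on atom i from atom j (reduced units)."""
    r2 = np.dot(rij, rij)
    if r2 > CUTOFF ** 2 or r2 == 0.0:
        return np.zeros(3)
    inv_r2 = 1.0 / r2
    inv_r6 = inv_r2 ** 3
    return 24.0 * inv_r2 * inv_r6 * (2.0 * inv_r6 - 1.0) * rij

# Forces are accumulated only over atoms in the same or neighboring cells.
forces = np.zeros_like(pos)
offsets = [(di, dj, dk) for di in (-1, 0, 1)
                        for dj in (-1, 0, 1)
                        for dk in (-1, 0, 1)]

for (i, j, k), atoms in cells.items():
    for di, dj, dk in offsets:
        neigh = ((i + di) % n_cells, (j + dj) % n_cells, (k + dk) % n_cells)
        for a in atoms:
            for b in cells.get(neigh, []):
                if a == b:
                    continue
                rij = pos[a] - pos[b]
                rij -= BOX * np.round(rij / BOX)   # minimum-image convention
                forces[a] += lj_force(rij)

print("net force on atom 0:", forces[0])

The same cell bookkeeping is what a distributed version exploits when neighbor lists change rapidly: after a timestep, only atoms near a cell boundary need to be communicated to the processor owning the adjacent region.
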

29,323 citations

Journal ArticleDOI
Eric S. Lander, Lauren Linton, Bruce W. Birren, Chad Nusbaum, +245 more (29 institutions)
15 Feb 2001-Nature
TL;DR: The results of an international collaboration to produce and make freely available a draft sequence of the human genome are reported and an initial analysis is presented, describing some of the insights that can be gleaned from the sequence.
Abstract: The human genome holds an extraordinary trove of information about human development, physiology, medicine and evolution. Here we report the results of an international collaboration to produce and make freely available a draft sequence of the human genome. We also present an initial analysis of the data, describing some of the insights that can be gleaned from the sequence.

22,269 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
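
As a concrete illustration of the mail-filtering example in the abstract (learning which messages a user rejects from labeled examples), here is a minimal Python sketch using scikit-learn's naive Bayes classifier. The toy messages, labels, and the choice of classifier are assumptions made for illustration, not anything taken from the cited article.

# Minimal sketch of the abstract's mail-filtering example: learn a
# keep/reject rule from messages the user has already labeled.
# Toy data and the naive Bayes model are illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Messages the user has already labeled (1 = rejected, 0 = kept).
messages = [
    "win a free prize now",
    "limited offer click here",
    "meeting moved to tuesday",
    "please review the attached draft",
]
labels = [1, 1, 0, 0]

# Turn raw text into word-count features, then learn the mapping
# from inputs (messages) to outputs (keep/reject).
vectorizer = CountVectorizer()
X = vectorizer.fit_transform(messages)
model = MultinomialNB().fit(X, labels)

# Apply the learned rule to an unseen message.
new_mail = vectorizer.transform(["free prize offer, click now"])
print(model.predict(new_mail))   # expected: [1], i.e. reject

Retraining on newly labeled mail updates the filter without anyone rewriting rules by hand, which is the per-user customization point the abstract makes.
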

13,246 citations