Author

Stephen R. Quake

Bio: Stephen R. Quake is an academic researcher at Stanford University. He has contributed to research on topics including the transcriptome and cell type. He has an h-index of 132 and has co-authored 589 publications receiving 77,778 citations. Previous affiliations of Stephen R. Quake include the Agency for Science, Technology and Research and the Allegheny Health Network.


Papers
Journal ArticleDOI
07 Apr 2000-Science
TL;DR: An extension of the soft lithography paradigm, multilayer soft lithography, is described, with which devices consisting of multiple layers may be fabricated from soft materials; it is used to build active microfluidic systems containing on-off valves, switching valves, and pumps entirely out of elastomer.
Abstract: Soft lithography is an alternative to silicon-based micromachining that uses replica molding of nontraditional elastomeric materials to fabricate stamps and microfluidic channels. We describe here an extension to the soft lithography paradigm, multilayer soft lithography, with which devices consisting of multiple layers may be fabricated from soft materials. We used this technique to build active microfluidic systems containing on-off valves, switching valves, and pumps entirely out of elastomer. The softness of these materials allows the device areas to be reduced by more than two orders of magnitude compared with silicon-based devices. The other advantages of soft lithography, such as rapid prototyping, ease of fabrication, and biocompatibility, are retained.

4,218 citations

Journal ArticleDOI
TL;DR: A review of the physics of small volumes (nanoliters) of fluids, parametrized by a series of dimensionless numbers expressing the relative importance of various physical phenomena.
Abstract: Microfabricated integrated circuits revolutionized computation by vastly reducing the space, labor, and time required for calculations. Microfluidic systems hold similar promise for the large-scale automation of chemistry and biology, suggesting the possibility of numerous experiments performed rapidly and in parallel, while consuming little reagent. While it is too early to tell whether such a vision will be realized, significant progress has been achieved, and various applications of significant scientific and practical interest have been developed. Here a review of the physics of small volumes (nanoliters) of fluids is presented, as parametrized by a series of dimensionless numbers expressing the relative importance of various physical phenomena. Specifically, this review explores the Reynolds number Re, addressing inertial effects; the Peclet number Pe, which concerns convective and diffusive transport; the capillary number Ca expressing the importance of interfacial tension; the Deborah, Weissenberg, and elasticity numbers De, Wi, and El, describing elastic effects due to deformable microstructural elements like polymers; the Grashof and Rayleigh numbers Gr and Ra, describing density-driven flows; and the Knudsen number, describing the importance of noncontinuum molecular effects. Furthermore, the long-range nature of viscous flows and the small device dimensions inherent in microfluidics mean that the influence of boundaries is typically significant. A variety of strategies have been developed to manipulate fluids by exploiting boundary effects; among these are electrokinetic effects, acoustic streaming, and fluid-structure interactions. The goal is to describe the physics behind the rich variety of fluid phenomena occurring on the nanoliter scale using simple scaling arguments, with the hopes of developing an intuitive sense for this occasionally counterintuitive world.
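The dimensionless numbers the review describes can be estimated directly from channel dimensions and fluid properties. A minimal sketch of the Reynolds and Peclet numbers, assuming water in a 100-micrometer channel at 1 mm/s (all parameter values are illustrative assumptions, not taken from the paper):

```python
# Illustrative estimate of two dimensionless numbers from the review
# for water flowing in a typical microchannel. All values are assumed.

rho = 1000.0   # density of water, kg/m^3
mu = 1.0e-3    # dynamic viscosity of water, Pa*s
D = 2.0e-9     # diffusivity of a small molecule in water, m^2/s
L = 100e-6     # channel width, m (assumed: 100 micrometers)
U = 1e-3       # flow speed, m/s (assumed: 1 mm/s)

Re = rho * U * L / mu  # Reynolds number: inertial vs. viscous effects
Pe = U * L / D         # Peclet number: convective vs. diffusive transport

print(f"Re = {Re:.3f}")  # -> Re = 0.100: inertia negligible, flow laminar
print(f"Pe = {Pe:.0f}")  # -> Pe = 50: convection outpaces diffusion
```

At these scales Re is far below the turbulent transition, which is why microfluidic flow is laminar, while Pe above 1 means mixing by diffusion alone is slow.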

4,044 citations

PatentDOI
24 Sep 2003-Science
TL;DR: The fluidic multiplexor is a combinatorial array of binary valve patterns that exponentially increases the processing power of a network by allowing complex fluid manipulations with a minimal number of inputs.
Abstract: High-density microfluidic chips contain plumbing networks with thousands of micromechanical valves and hundreds of individually addressable chambers. These fluidic devices are analogous to electronic integrated circuits fabricated using large scale integration (LSI). A component of these networks is the fluidic multiplexor, which is a combinatorial array of binary valve patterns that exponentially increases the processing power of a network by allowing complex fluid manipulations with a minimal number of inputs. These integrated microfluidic networks can be used to construct a variety of highly complex microfluidic devices, for example the microfluidic analog of a comparator array, and a microfluidic memory storage device resembling electronic random access memories.
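The exponential addressing that the multiplexor exploits can be sketched in code. This is a hypothetical illustration of the combinatorial logic only, not the patent's actual valve layout: with n complementary pairs of control lines (2n inputs), pressurizing one line per pair leaves exactly one of 2**n flow channels open.

```python
# Hypothetical sketch of binary multiplexor addressing: 2n control
# lines (n complementary pairs) individually address 2**n channels.
# Channel numbering and bit pairing are illustrative assumptions.

def open_channels(address: int, n_bits: int) -> list[int]:
    """Channels still open after pressurizing, for each address bit,
    the control line that closes channels disagreeing with that bit."""
    channels = range(2 ** n_bits)
    # A channel stays open only if every one of its bits matches the
    # address, since each control-line pair closes half the channels.
    return [c for c in channels if all(
        ((c >> b) & 1) == ((address >> b) & 1) for b in range(n_bits))]

# 2n = 10 control lines suffice to address 2**5 = 32 channels:
print(open_channels(address=21, n_bits=5))  # -> [21]
```

The point of the exercise is the scaling: the number of addressable chambers grows exponentially in the number of control inputs, which is what makes large-scale integration feasible.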

2,292 citations

Journal ArticleDOI
09 Apr 2009-Nature
TL;DR: It is shown that normal mammary epithelial stem cells contain lower concentrations of ROS than their more mature progeny cells, and subsets of CSCs in some tumours contain lower ROS levels and enhanced ROS defences compared to their non-tumorigenic progeny, which may contribute to tumour radioresistance.
Abstract: The metabolism of oxygen, although central to life, produces reactive oxygen species (ROS) that have been implicated in processes as diverse as cancer, cardiovascular disease and ageing. It has recently been shown that central nervous system stem cells and haematopoietic stem cells and early progenitors contain lower levels of ROS than their more mature progeny, and that these differences are critical for maintaining stem cell function. We proposed that epithelial tissue stem cells and their cancer stem cell (CSC) counterparts may also share this property. Here we show that normal mammary epithelial stem cells contain lower concentrations of ROS than their more mature progeny cells. Notably, subsets of CSCs in some human and murine breast tumours contain lower ROS levels than corresponding non-tumorigenic cells (NTCs). Consistent with ROS being critical mediators of ionizing-radiation-induced cell killing, CSCs in these tumours develop less DNA damage and are preferentially spared after irradiation compared to NTCs. Lower ROS levels in CSCs are associated with increased expression of free radical scavenging systems. Pharmacological depletion of ROS scavengers in CSCs markedly decreases their clonogenicity and results in radiosensitization. These results indicate that, similar to normal tissue stem cells, subsets of CSCs in some tumours contain lower ROS levels and enhanced ROS defences compared to their non-tumorigenic progeny, which may contribute to tumour radioresistance.

2,261 citations

Journal ArticleDOI
TL;DR: It is shown that a microfluidic device designed to produce reverse micelles can generate complex, ordered patterns as it is continuously operated far from thermodynamic equilibrium.
Abstract: Spatiotemporal pattern formation occurs in a variety of nonequilibrium physical and chemical systems. Here we show that a microfluidic device designed to produce reverse micelles can generate complex, ordered patterns as it is continuously operated far from thermodynamic equilibrium. Flow in a microfluidic system is usually simple—viscous effects dominate and the low Reynolds number leads to laminar flow. Self-assembly of the vesicles into patterns depends on channel geometry and relative fluid pressures, enabling the production of motifs ranging from monodisperse droplets to helices and ribbons.

2,061 citations


Cited by
28 Jul 2005
TL;DR: PfEMP1 interacts with single or multiple receptors on infected erythrocytes, dendritic cells, and the placenta, playing a key role in adhesion and immune evasion.
Abstract: Antigenic variation allows many pathogenic microorganisms to evade the host immune response. Plasmodium falciparum erythrocyte membrane protein 1 (PfEMP1), expressed on the surface of infected erythrocytes, interacts with single or multiple receptors on infected erythrocytes, endothelial cells, dendritic cells, and the placenta, playing a key role in adhesion and immune evasion. Each haploid genome encodes about 60 members of the var gene family, and switching transcription among different var gene variants provides the molecular basis for antigenic variation.

18,940 citations

Journal ArticleDOI
TL;DR: SPAdes generates single-cell assemblies, providing information about genomes of uncultivatable bacteria that vastly exceeds what may be obtained via traditional metagenomics studies.
Abstract: The lion's share of bacteria in various environments cannot be cloned in the laboratory and thus cannot be sequenced using existing technologies. A major goal of single-cell genomics is to complement gene-centric metagenomic data with whole-genome assemblies of uncultivated organisms. Assembly of single-cell data is challenging because of highly non-uniform read coverage as well as elevated levels of sequencing errors and chimeric reads. We describe SPAdes, a new assembler for both single-cell and standard (multicell) assembly, and demonstrate that it improves on the recently released E+V−SC assembler (specialized for single-cell data) and on popular assemblers Velvet and SoapDeNovo (for multicell data). SPAdes generates single-cell assemblies, providing information about genomes of uncultivatable bacteria that vastly exceeds what may be obtained via traditional metagenomics studies. SPAdes is available online (http://bioinf.spbau.ru/spades). It is distributed as open source software.

16,859 citations

Journal ArticleDOI
TL;DR: Front matter and chapter listing of the Princeton Landmarks in Biology edition.
Abstract: Preface to the Princeton Landmarks in Biology Edition; Preface; Symbols Used; 1. The Importance of Islands; 2. Area and Number of Species; 3. Further Explanations of the Area-Diversity Pattern; 4. The Strategy of Colonization; 5. Invasibility and the Variable Niche; 6. Stepping Stones and Biotic Exchange; 7. Evolutionary Changes Following Colonization; 8. Prospect; Glossary; References; Index

14,171 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, handwriting recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules.
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
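The mail-filtering scenario above can be sketched as a toy example. Everything here — the word counting, the scoring rule, and the sample messages — is an illustrative assumption, not the article's method:

```python
# Toy sketch of learning a mail filter from user-labeled examples:
# count how often each word appears in rejected vs. kept messages,
# then score new messages by their share of "rejected-leaning" words.

from collections import Counter

def train(messages):
    """messages: list of (text, was_rejected) pairs labeled by the user."""
    rejected, kept = Counter(), Counter()
    for text, was_rejected in messages:
        (rejected if was_rejected else kept).update(text.lower().split())
    return rejected, kept

def score(text, rejected, kept):
    """Fraction of words seen more often in rejected than in kept mail."""
    words = text.lower().split()
    hits = sum(1 for w in words if rejected[w] > kept[w])
    return hits / len(words) if words else 0.0

examples = [("win money now", True), ("meeting at noon", False),
            ("free money offer", True), ("project status meeting", False)]
rej, kpt = train(examples)
print(score("free money", rej, kpt))  # -> 1.0
```

As the user keeps labeling mail, retraining updates the counts, which is the "constantly modifying and tuning a set of learned prediction rules" behavior the article describes.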

13,246 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: A textbook covering probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, mixture models and EM, approximate inference, sampling methods, sequential data, and combining models.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations