
Showing papers by "Rutgers University" published in 2008


Journal ArticleDOI
16 May 2008-Science
TL;DR: Optimizing the need for a key human resource while minimizing its negative consequences requires an integrated interdisciplinary approach and the development of strategies to decrease nitrogen-containing waste.
Abstract: Humans continue to transform the global nitrogen cycle at a record pace, reflecting an increased combustion of fossil fuels, growing demand for nitrogen in agriculture and industry, and pervasive inefficiencies in its use. Much anthropogenic nitrogen is lost to air, water, and land to cause a cascade of environmental and human health problems. Simultaneously, food production in some parts of the world is nitrogen-deficient, highlighting inequities in the distribution of nitrogen-containing fertilizers. Optimizing the need for a key human resource while minimizing its negative consequences requires an integrated interdisciplinary approach and the development of strategies to decrease nitrogen-containing waste.

5,249 citations


Journal ArticleDOI
TL;DR: The Compact Muon Solenoid (CMS) detector at the Large Hadron Collider (LHC) at CERN was designed to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1.
Abstract: The Compact Muon Solenoid (CMS) detector is described. The detector operates at the Large Hadron Collider (LHC) at CERN. It was conceived to study proton-proton (and lead-lead) collisions at a centre-of-mass energy of 14 TeV (5.5 TeV nucleon-nucleon) and at luminosities up to 10^34 cm^-2 s^-1 (10^27 cm^-2 s^-1). At the core of the CMS detector sits a high-magnetic-field and large-bore superconducting solenoid surrounding an all-silicon pixel and strip tracker, a lead-tungstate scintillating-crystals electromagnetic calorimeter, and a brass-scintillator sampling hadron calorimeter. The iron yoke of the flux-return is instrumented with four stations of muon detectors covering most of the 4π solid angle. Forward sampling calorimeters extend the pseudo-rapidity coverage to high values (|η| <= 5), assuring very good hermeticity. The overall dimensions of the CMS detector are a length of 21.6 m, a diameter of 14.6 m and a total weight of 12,500 t.

5,193 citations


Journal ArticleDOI
TL;DR: A solution-based method is reported that allows uniform and controllable deposition of reduced graphene oxide thin films with thicknesses ranging from a single monolayer to several layers over large areas, which could represent a route for translating the interesting fundamental properties of graphene into technologically viable devices.
Abstract: The integration of novel materials such as single-walled carbon nanotubes and nanowires into devices has been challenging, but developments in transfer printing and solution-based methods now allow these materials to be incorporated into large-area electronics1,2,3,4,5,6. Similar efforts are now being devoted to making the integration of graphene into devices technologically feasible7,8,9,10. Here, we report a solution-based method that allows uniform and controllable deposition of reduced graphene oxide thin films with thicknesses ranging from a single monolayer to several layers over large areas. The opto-electronic properties can thus be tuned over several orders of magnitude, making them potentially useful for flexible and transparent semiconductors or semi-metals. The thinnest films exhibit graphene-like ambipolar transistor characteristics, whereas thicker films behave as graphite-like semi-metals. Collectively, our deposition method could represent a route for translating the interesting fundamental properties of graphene into technologically viable devices.

4,174 citations


Journal ArticleDOI
TL;DR: In this paper, the authors constructed three-dimensional Chern-Simons-matter theories with gauge groups U(N) × U(N) and SU(N) × SU(N) which have explicit N = 6 superconformal symmetry.
Abstract: We construct three dimensional Chern-Simons-matter theories with gauge groups U(N) × U(N) and SU(N) × SU(N) which have explicit N = 6 superconformal symmetry. Using brane constructions we argue that the U(N) × U(N) theory at level k describes the low energy limit of N M2-branes probing a C4/Zk singularity. At large N the theory is then dual to M-theory on AdS4 × S7/Zk. The theory also has a 't Hooft limit (of large N with a fixed ratio N/k) which is dual to type IIA string theory on AdS4 × CP3. For k = 1 the theory is conjectured to describe N M2-branes in flat space, although our construction realizes explicitly only six of the eight supersymmetries. We give some evidence for this conjecture, which is similar to the evidence for mirror symmetry in d = 3 gauge theories. When the gauge group is SU(2) × SU(2) our theory has extra symmetries and becomes identical to the Bagger-Lambert theory.
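
As a pointer for the 't Hooft limit mentioned in the abstract, the coupling and the commonly quoted validity windows of the two dual descriptions can be summarised as follows; the parametric windows are the standard ones for this construction, stated for orientation rather than quoted from the paper.

\[
\lambda = \frac{N}{k}, \qquad
\text{M-theory on } AdS_4 \times S^7/\mathbb{Z}_k \;\; (N \gg k^5), \qquad
\text{type IIA on } AdS_4 \times \mathbb{CP}^3 \;\; (k \ll N \ll k^5).
\]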

3,091 citations


Journal ArticleDOI
TL;DR: This work shows that the fluctuations are significantly reduced in suspended graphene samples and reports low-temperature mobility approaching 200,000 cm^2 V^-1 s^-1 for carrier densities below 5 × 10^9 cm^-2, which cannot be attained in semiconductors or non-suspended graphene.
Abstract: The discovery of graphene1,2 raises the prospect of a new class of nanoelectronic devices based on the extraordinary physical properties3,4,5,6 of this one-atom-thick layer of carbon. Unlike two-dimensional electron layers in semiconductors, where the charge carriers become immobile at low densities, the carrier mobility in graphene can remain high, even when their density vanishes at the Dirac point. However, when the graphene sample is supported on an insulating substrate, potential fluctuations induce charge puddles that obscure the Dirac point physics. Here we show that the fluctuations are significantly reduced in suspended graphene samples and we report low-temperature mobility approaching 200,000 cm^2 V^-1 s^-1 for carrier densities below 5 × 10^9 cm^-2. Such values cannot be attained in semiconductors or non-suspended graphene. Moreover, unlike graphene samples supported by a substrate, the conductivity of suspended graphene at the Dirac point is strongly dependent on temperature and approaches ballistic values at liquid helium temperatures. At higher temperatures, above 100 K, we observe the onset of thermally induced long-range scattering. The novel electronic properties of graphene can be compromised when it is supported on an insulating substrate. However, suspended graphene samples can display low-temperature mobility values that cannot be attained in semiconductors or non-suspended graphene, and the conductivity approaches ballistic values at liquid-helium temperatures.

2,977 citations


Journal ArticleDOI
TL;DR: Wannier90 is a program for calculating maximally-localised Wannier functions (MLWF) from a set of Bloch energy bands that may or may not be attached to or mixed with other bands, and is able to output MLWF for visualisation and other post-processing purposes.
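
As background for the "maximally-localised" qualifier above: the quantity Wannier90 minimises is the Marzari-Vanderbilt total spread of the Wannier functions. The standard textbook expression is recalled below for orientation; it is not text from the paper itself.

\[
\Omega = \sum_n \Big[ \langle w_n | r^2 | w_n \rangle - \big| \langle w_n | \mathbf{r} | w_n \rangle \big|^2 \Big],
\]

minimised over the unitary (gauge) mixing of the Bloch bands that defines the Wannier functions |w_n⟩.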

2,599 citations


Journal ArticleDOI
23 May 2008-Science
TL;DR: Virtually all nonequilibrium electron transfers on Earth are driven by a set of nanobiological machines composed largely of multimeric protein complexes associated with a small number of prosthetic groups.
Abstract: Virtually all nonequilibrium electron transfers on Earth are driven by a set of nanobiological machines composed largely of multimeric protein complexes associated with a small number of prosthetic groups. These machines evolved exclusively in microbes early in our planet's history yet, despite their antiquity, are highly conserved. Hence, although there is enormous genetic diversity in nature, there remains a relatively stable set of core genes coding for the major redox reactions essential for life and biogeochemical cycles. These genes created and coevolved with biogeochemical cycles and were passed from microbe to microbe primarily by horizontal gene transfer. A major challenge in the coming decades is to understand how these machines evolved, how they work, and the processes that control their activity on both molecular and planetary scales.

2,345 citations


Journal ArticleDOI
TL;DR: A set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes are presented.
Abstract: Research in autophagy continues to accelerate,(1) and as a result many new scientists are entering the field. Accordingly, it is important to establish a standard set of criteria for monitoring macroautophagy in different organisms. Recent reviews have described the range of assays that have been used for this purpose.(2,3) There are many useful and convenient methods that can be used to monitor macroautophagy in yeast, but relatively few in other model systems, and there is much confusion regarding acceptable methods to measure macroautophagy in higher eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers of autophagosomes versus those that measure flux through the autophagy pathway; thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from fully functional autophagy that includes delivery to, and degradation within, lysosomes (in most higher eukaryotes) or the vacuole (in plants and fungi). Here, we present a set of guidelines for the selection and interpretation of the methods that can be used by investigators who are attempting to examine macroautophagy and related processes, as well as by reviewers who need to provide realistic and reasonable critiques of papers that investigate these processes. This set of guidelines is not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to verify an autophagic response.

2,310 citations


Journal ArticleDOI
TL;DR: A conceptual framework depicting the interplay among four basic mechanistic components of organismal movement is introduced, providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes.
Abstract: Movement of individual organisms is fundamental to life, quilting our planet in a rich tapestry of phenomena with diverse implications for ecosystems and humans. Movement research is both plentiful and insightful, and recent methodological advances facilitate obtaining a detailed view of individual movement. Yet, we lack a general unifying paradigm, derived from first principles, which can place movement studies within a common context and advance the development of a mature scientific discipline. This introductory article to the Movement Ecology Special Feature proposes a paradigm that integrates conceptual, theoretical, methodological, and empirical frameworks for studying movement of all organisms, from microbes to trees to elephants. We introduce a conceptual framework depicting the interplay among four basic mechanistic components of organismal movement: the internal state (why move?), motion (how to move?), and navigation (when and where to move?) capacities of the individual and the external factors affecting movement. We demonstrate how the proposed framework aids the study of various taxa and movement types; promotes the formulation of hypotheses about movement; and complements existing biomechanical, cognitive, random, and optimality paradigms of movement. The proposed framework integrates eclectic research on movement into a structured paradigm and aims at providing a basis for hypothesis generation and a vehicle facilitating the understanding of the causes, mechanisms, and spatiotemporal patterns of movement and their role in various ecological and evolutionary processes. "Now we must consider in general the common reason for moving with any movement whatever." (Aristotle, De Motu Animalium, 4th century B.C.).
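
A minimal, purely illustrative sketch of how the four mechanistic components named above (internal state, motion capacity, navigation capacity, external factors) might be organised in code; all class and field names are hypothetical and are not terminology fixed by the paper.

from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

# Hypothetical containers for the four mechanistic components of the framework.
@dataclass
class Mover:
    internal_state: Dict[str, float]                       # "why move?"
    motion_capacity: Callable[[str], Tuple[float, float]]  # "how to move?" -> displacement
    navigation_capacity: Callable[..., str]                 # "when and where to move?" -> decision
    external_factors: Dict[str, float]                      # environment acting on the individual

def movement_path(m: Mover, start: Tuple[float, float], steps: int) -> List[Tuple[float, float]]:
    """Compose the four components into a movement path, the framework's focal output."""
    path = [start]
    for _ in range(steps):
        decision = m.navigation_capacity(m.internal_state, m.external_factors, path[-1])
        dx, dy = m.motion_capacity(decision)
        x, y = path[-1]
        path.append((x + dx, y + dy))
    return path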

2,133 citations


Journal ArticleDOI
TL;DR: The most comprehensive list so far of human p53-regulated genes and their experimentally validated, functional binding sites that confer p53 regulation is presented.
Abstract: The p53 protein regulates the transcription of many different genes in response to a wide variety of stress signals. Following DNA damage, p53 regulates key processes, including DNA repair, cell-cycle arrest, senescence and apoptosis, in order to suppress cancer. This Analysis article provides an overview of the current knowledge of p53-regulated genes in these pathways and others, and the mechanisms of their regulation. In addition, we present the most comprehensive list so far of human p53-regulated genes and their experimentally validated, functional binding sites that confer p53 regulation.

1,799 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show how the Netherlands, Denmark, and Germany have made bicycling a safe, convenient, and practical way to get around their cities, relying on national aggregate data as well as case studies of large and small cities in each country.

Journal ArticleDOI
TL;DR: This work quantified the negative correlation between these two networks in 26 subjects, during active (Eriksen flanker task) and resting state scans, and found that the strength of the correlation between the two networks varies across individuals.

Reference EntryDOI
15 Mar 2008
TL;DR: The sections in this article cover the physisorption of gases, the determination of surface area, the BET method, the standard isotherm concept, and the assessment of porosity.
Abstract: The sections in this article are: Introduction; Physisorption of Gases; Determination of Surface Area; The BET Method; The Standard Isotherm Concept; Assessment of Porosity; Capillary Condensation and the Kelvin Equation; Adsorption Hysteresis; Microporosity; Micropore Analysis: Dubinin's Theory of Micropore Filling; Micropore Analysis: Empirical Methods; Other Methods for Micropore Pore Size Analysis; Application of Density Functional Theory; Adsorption at the Liquid–Solid Interface; Adsorption from Solution; Heat of Immersion; Mercury Porosimetry; General Conclusions. Keywords: physisorption; pore size; mercury porosimetry; heat of immersion
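
Two of the sections listed above, the BET method and capillary condensation via the Kelvin equation, are built around standard closed-form relations; the familiar textbook forms are recalled below for orientation and are not reproduced from the chapter itself.

\[
\frac{p/p_0}{v\,(1-p/p_0)} = \frac{1}{v_m c} + \frac{c-1}{v_m c}\,\frac{p}{p_0}
\quad \text{(BET)},
\qquad
\ln\frac{p}{p_0} = -\,\frac{2\gamma V_m \cos\theta}{r R T}
\quad \text{(Kelvin)},
\]

where v is the amount adsorbed at relative pressure p/p_0, v_m the monolayer capacity, c the BET constant, γ the surface tension and V_m the molar volume of the condensed adsorptive, r the mean radius of curvature of the meniscus, and θ the contact angle.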

Book ChapterDOI
TL;DR: This expository presentation addresses the precise formulation of questions of robustness with respect to disturbances, formulated in the paradigm of input to state stability, with an intuitive and informal presentation of the main concepts.
Abstract: The analysis and design of nonlinear feedback systems has recently undergone an exceptionally rich period of progress and maturation, fueled, to a great extent, by (1) the discovery of certain basic conceptual notions, and (2) the identification of classes of systems for which systematic decomposition approaches can result in effective and easily computable control laws. These two aspects are complementary, since the latter approaches are, typically, based upon the inductive verification of the validity of the former system properties under compositions (in the terminology used in [62], the “activation” of theoretical concepts leads to “constructive” control). This expository presentation addresses the first of these aspects, and in particular the precise formulation of questions of robustness with respect to disturbances, formulated in the paradigm of input to state stability. We provide an intuitive and informal presentation of the main concepts. More precise statements, especially about older results, are given in the cited papers, as well as in several previous surveys such as [103] and [105] (of which the present paper represents an update), but we provide a little more detail about relatively recent work. Regarding applications and extensions of the basic framework, we give some pointers to the literature, but we do not focus on feedback design and specific engineering problems; for the latter we refer the reader to textbooks such as [43], [60], [58], [96], [66], [27], [44].
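
For readers meeting the input-to-state stability (ISS) property for the first time, the defining estimate, in its standard form rather than as quoted from this chapter, is

\[
|x(t)| \le \beta\big(|x(0)|, t\big) + \gamma\big(\|u\|_\infty\big) \qquad \text{for all } t \ge 0,
\]

with β of class KL and γ of class K; the estimate combines asymptotic stability of the unforced system (u ≡ 0) with a nonlinear, input-dependent asymptotic gain on the disturbance u.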

Journal ArticleDOI
05 Sep 2008-Science
TL;DR: It is hypothesize that neuronal representations, evolved for encoding distance in spatial navigation, also support episodic recall and the planning of action sequences.
Abstract: A long-standing conjecture in neuroscience is that aspects of cognition depend on the brain's ability to self-generate sequential neuronal activity. We found that reliably and continually changing cell assemblies in the rat hippocampus appeared not only during spatial navigation but also in the absence of changing environmental or body-derived inputs. During the delay period of a memory task, each moment in time was characterized by the activity of a particular assembly of neurons. Identical initial conditions triggered a similar assembly sequence, whereas different conditions gave rise to different sequences, thereby predicting behavioral choices, including errors. Such sequences were not formed in control (nonmemory) tasks. We hypothesize that neuronal representations, evolved for encoding distance in spatial navigation, also support episodic recall and the planning of action sequences.


Journal ArticleDOI
TL;DR: A novel approach for classifying points lying on a connected Riemannian manifold using the geometry of the space of d-dimensional nonsingular covariance matrices as object descriptors.
Abstract: We present a new algorithm to detect pedestrians in still images utilizing covariance matrices as object descriptors. Since the descriptors do not form a vector space, well-known machine learning techniques are not well suited to learn the classifiers. The space of d-dimensional nonsingular covariance matrices can be represented as a connected Riemannian manifold. The main contribution of the paper is a novel approach for classifying points lying on a connected Riemannian manifold using the geometry of the space. The algorithm is tested on the INRIA and DaimlerChrysler pedestrian datasets, where superior detection rates are observed over the previous approaches.
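
A minimal sketch, for illustration only, of the geometric ingredient the abstract refers to: the affine-invariant Riemannian distance between two nonsingular covariance descriptors. The helper below uses standard SciPy matrix functions; it is not the authors' implementation, and the paper's boosting-on-the-manifold classifier is not reproduced here.

import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def riemannian_distance(X: np.ndarray, Y: np.ndarray) -> float:
    """Affine-invariant distance between SPD covariance matrices:
    d(X, Y) = || log(X^{-1/2} Y X^{-1/2}) ||_F."""
    X_inv_sqrt = fractional_matrix_power(X, -0.5)
    M = X_inv_sqrt @ Y @ X_inv_sqrt
    return float(np.linalg.norm(logm(M), "fro"))

# Toy usage: covariance descriptors of two image regions (8 hypothetical features per pixel).
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 8))
B = rng.normal(size=(200, 8)) + 0.5
print(riemannian_distance(np.cov(A, rowvar=False), np.cov(B, rowvar=False)))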

Journal ArticleDOI
24 Apr 2008-Nature
TL;DR: Papaya offers numerous advantages as a system for fruit-tree functional genomics, and this draft genome sequence provides the foundation for revealing the basis of Carica’s distinguishing morpho-physiological, medicinal and nutritional properties.
Abstract: Papaya, a fruit crop cultivated in tropical and subtropical regions, is known for its nutritional benefits and medicinal applications. Here we report a 3x draft genome sequence of 'SunUp' papaya, the first commercial virus-resistant transgenic fruit tree to be sequenced. The papaya genome is three times the size of the Arabidopsis genome, but contains fewer genes, including significantly fewer disease-resistance gene analogues. Comparison of the five sequenced genomes suggests a minimal angiosperm gene set of 13,311. A lack of recent genome duplication, atypical of other angiosperm genomes sequenced so far, may account for the smaller papaya gene number in most functional groups. Nonetheless, striking amplifications in gene number within particular functional groups suggest roles in the evolution of tree-like habit, deposition and remobilization of starch reserves, attraction of seed dispersal agents, and adaptation to tropical daylengths. Transgenesis at three locations is closely associated with chloroplast insertions into the nuclear genome, and with topoisomerase I recognition sites. Papaya offers numerous advantages as a system for fruit-tree functional genomics, and this draft genome sequence provides the foundation for revealing the basis of Carica's distinguishing morpho-physiological, medicinal and nutritional properties.

Posted Content
TL;DR: In this article, it is argued that no simple correlation can be established between corporate social performance and corporate financial performance, and it is suggested that corporate citizenship programs can be designed to help companies address reputational threats.
Abstract: It is argued that no simple correlation can be established between corporate social performance and corporate financial performance. The activities that generate CSP do not directly impact the company's financial performance, but instead affect the bottom line via its stock of reputational capital - the financial value of its intangible assets. It is suggested that corporate citizenship programs can be designed to help companies address reputational threats and opportunities to achieve reputational gains while mitigating reputational losses.

Journal ArticleDOI
TL;DR: An extensive, if not nearly complete, listing of DEA research covering theoretical developments as well as "real-world" applications from inception to the year 2007 is presented.
Abstract: Since the original Data Envelopment Analysis (DEA) study by Charnes et al. [Measuring the efficiency of decision-making units. European Journal of Operational Research 1978;2(6):429–44], there has been rapid and continuous growth in the field. As a result, a considerable amount of published research has appeared, with a significant portion focused on DEA applications of efficiency and productivity in both public and private sector activities. While several bibliographic collections have been reported, a comprehensive listing and analysis of DEA research covering its first 30 years of history is not available. This paper thus presents an extensive, if not nearly complete, listing of DEA research covering theoretical developments as well as “real-world” applications from inception to the year 2007. A listing of the most utilized/relevant journals, a keyword analysis, and selected statistics are presented.
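
For orientation, the original CCR ratio model of Charnes et al. cited above can be written, in its standard textbook form, as

\[
\max_{u, v} \; \frac{\sum_r u_r\, y_{r0}}{\sum_i v_i\, x_{i0}}
\quad \text{subject to} \quad
\frac{\sum_r u_r\, y_{rj}}{\sum_i v_i\, x_{ij}} \le 1 \;\; \text{for all } j,
\qquad u_r,\, v_i \ge 0,
\]

where x_{ij} and y_{rj} are the inputs and outputs of decision-making unit j and unit 0 is the unit under evaluation; the Charnes-Cooper transformation converts this fractional program into the linear program that is actually solved.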

Journal ArticleDOI
TL;DR: The North Pacific Gyre Oscillation (NPGO) is a newly defined pattern of climate variability in the Northeast Pacific, distinct from the widely used Pacific Decadal Oscillation, whose fluctuations are significantly correlated with previously unexplained variations in salinity, nutrients, and chlorophyll.
Abstract: Decadal fluctuations in salinity, nutrients, chlorophyll, a variety of zooplankton taxa, and fish stocks in the Northeast Pacific are often poorly correlated with the most widely-used index of large-scale climate variability in the region - the Pacific Decadal Oscillation (PDO). We define a new pattern of climate change, the North Pacific Gyre Oscillation (NPGO) and show that its variability is significantly correlated with previously unexplained fluctuations of salinity, nutrients and chlorophyll. Fluctuations in the NPGO are driven by regional and basin-scale variations in wind-driven upwelling and horizontal advection - the fundamental processes controlling salinity and nutrient concentrations. Nutrient fluctuations drive concomitant changes in phytoplankton concentrations, and may force similar variability in higher trophic levels. The NPGO thus provides a strong indicator of fluctuations in the mechanisms driving planktonic ecosystem dynamics. The NPGO pattern extends beyond the North Pacific and is part of a global-scale mode of climate variability that is evident in global sea level trends and sea surface temperature. Therefore the amplification of the NPGO variance found in observations and in global warming simulations implies that the NPGO may play an increasingly important role in forcing global-scale decadal changes in marine ecosystems.
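
A minimal, purely illustrative sketch of how a basin-scale climate index of this kind is typically derived: project gridded anomaly fields onto their leading empirical orthogonal functions (EOFs) and take a standardised principal-component time series as the index. The variable names, the synthetic data, the choice of field, and the use of the second mode are assumptions for illustration, not a description of the authors' exact procedure.

import numpy as np

def eof_index(anomaly: np.ndarray, mode: int = 1) -> np.ndarray:
    """anomaly: (time, grid) array of detrended, deseasonalised anomalies.
    Returns the standardised principal-component time series of the requested
    EOF mode (0-based, so mode=1 is the second mode)."""
    U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)
    pc = U[:, mode] * s[mode]
    return (pc - pc.mean()) / pc.std()

# Hypothetical usage: 50 years of monthly sea-surface-height anomalies on a synthetic grid.
rng = np.random.default_rng(1)
ssh_anomalies = rng.normal(size=(600, 2500))
index = eof_index(ssh_anomalies, mode=1)  # 600 monthly index values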

Journal ArticleDOI
26 Nov 2008-Neuron
TL;DR: It is hypothesized that temporal coordination of neocortical gamma oscillators by hippocampal theta is a mechanism by which information contained in spatially widespread neocortical assemblies can be synchronously transferred to the associative networks of the hippocampus.

Journal ArticleDOI
16 May 2008-Science
TL;DR: Although ∼10% of the ocean's drawdown of atmospheric anthropogenic carbon dioxide may result from this atmospheric nitrogen fertilization, leading to a decrease in radiative forcing, up to about two-thirds of this amount may be offset by the increase in N2O emissions.
Abstract: Increasing quantities of atmospheric anthropogenic fixed nitrogen entering the open ocean could account for up to about a third of the ocean's external (nonrecycled) nitrogen supply and up to 3% of the annual new marine biological production, 0.3 petagram of carbon per year. This input could account for the production of up to 1.6 teragrams of nitrous oxide (N2O) per year. Although 10% of the ocean's drawdown of atmospheric anthropogenic carbon dioxide may result from this atmospheric nitrogen fertilization, leading to a decrease in radiative forcing, up to about two-thirds of this amount may be offset by the increase in N2O emissions. The effects of increasing atmospheric nitrogen deposition are expected to continue to grow in the future.
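
As a quick back-of-the-envelope consistency check on the numbers quoted above (a calculation of ours, not taken from the paper): if atmospheric anthropogenic nitrogen supports about 0.3 Pg C yr^-1 of new production and that is "up to 3%" of the total, the implied total annual new marine production is of order

\[
\frac{0.3\ \text{Pg C yr}^{-1}}{0.03} \approx 10\ \text{Pg C yr}^{-1}.
\]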

Journal ArticleDOI
TL;DR: Seven criteria for good qualitative research emerged and important divergent perspectives were observed in how these criteria should be applied to qualitative research, with differences based on the paradigm embraced by the authors.
Abstract: PURPOSE We wanted to review and synthesize published criteria for good qualitative research and develop a cogent set of evaluative criteria. METHODS We identified published journal articles discussing criteria for rigorous research using standard search strategies then examined reference sections of relevant journal articles to identify books and book chapters on this topic. A cross-publication content analysis allowed us to identify criteria and understand the beliefs that shape them. RESULTS Seven criteria for good qualitative research emerged: (1) carrying out ethical research; (2) importance of the research; (3) clarity and coherence of the research report; (4) use of appropriate and rigorous methods; (5) importance of reflexivity or attending to researcher bias; (6) importance of establishing validity or credibility; and (7) importance of verification or reliability. General agreement was observed across publications on the first 4 quality dimensions. On the last 3, important divergent perspectives were observed in how these criteria should be applied to qualitative research, with differences based on the paradigm embraced by the authors. CONCLUSION Qualitative research is not a unified field. Most manuscript and grant reviewers are not qualitative experts and are likely to embrace a generic set of criteria rather than those relevant to the particular qualitative approach proposed or reported. Reviewers and researchers need to be aware of this tendency and educate health care researchers about the criteria appropriate for evaluating qualitative research from within the theoretical and methodological framework from which it emerges.

Journal ArticleDOI
TL;DR: The chemical shift based structure determination protocol uses an empirically optimized procedure to select protein fragments from the Protein Data Bank, in conjunction with the standard ROSETTA Monte Carlo assembly and relaxation methods, and potentially provides a new direction for high-throughput NMR structure determination.
Abstract: Protein NMR chemical shifts are highly sensitive to local structure. A robust protocol is described that exploits this relation for de novo protein structure generation, using as input experimental parameters the ^13Cα, ^13Cβ, ^13C', ^15N, ^1Hα and ^1HN NMR chemical shifts. These shifts are generally available at the early stage of the traditional NMR structure determination process, before the collection and analysis of structural restraints. The chemical shift based structure determination protocol uses an empirically optimized procedure to select protein fragments from the Protein Data Bank, in conjunction with the standard ROSETTA Monte Carlo assembly and relaxation methods. Evaluation of 16 proteins, varying in size from 56 to 129 residues, yielded full-atom models that have 0.7-1.8 Å root mean square deviations for the backbone atoms relative to the experimentally determined x-ray or NMR structures. The strategy also has been successfully applied in a blind manner to nine protein targets with molecular masses up to 15.4 kDa, whose conventional NMR structure determination was conducted in parallel by the Northeast Structural Genomics Consortium. This protocol potentially provides a new direction for high-throughput NMR structure determination.
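
A highly simplified sketch of the fragment-selection idea described above: score candidate backbone fragments by how well their predicted chemical shifts match the experimental ones, then hand the best-scoring fragments to the assembly stage. Every name here (the fragment objects, the shift predictor, the scoring metric) is a hypothetical illustration, not the actual CS-Rosetta code.

import numpy as np

def shift_rmsd(predicted: np.ndarray, observed: np.ndarray) -> float:
    """Root-mean-square deviation between predicted and experimental chemical shifts (ppm)."""
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

def select_fragments(candidates, observed_shifts, predict_shifts, n_best=25):
    """Rank database fragments by agreement between their predicted shifts and the
    experimental shifts for the target segment, keeping the n_best of them.
    'candidates' and 'predict_shifts' stand in for a fragment database and a shift
    predictor; both are placeholders in this sketch."""
    scored = [(shift_rmsd(predict_shifts(frag), observed_shifts), i, frag)
              for i, frag in enumerate(candidates)]
    scored.sort(key=lambda t: t[0])
    return [frag for _, _, frag in scored[:n_best]]

# The selected fragments would then feed the Monte Carlo assembly and relaxation stage.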

Journal ArticleDOI
10 Jun 2008
TL;DR: In this article, the authors reported subarcsecond resolution IRAM PdBI millimeter CO interferometry of four z ~ 2 submillimeter galaxies (SMGs), and sensitive CO(3-2) flux limits toward three z ~ 2 UV/optically selected star-forming galaxies.
Abstract: We report subarcsecond resolution IRAM PdBI millimeter CO interferometry of four z ~ 2 submillimeter galaxies (SMGs), and sensitive CO(3-2) flux limits toward three z ~ 2 UV/optically selected star-forming galaxies. The new data reveal for the first time spatially resolved CO gas kinematics in the observed SMGs. Two of the SMGs show double or multiple morphologies, with complex, disturbed gas motions. The other two SMGs exhibit CO velocity gradients of ~500 km s^−1 across ≤0.2" (1.6 kpc) diameter regions, suggesting that the star-forming gas is in compact, rotating disks. Our data provide compelling evidence that these SMGs represent extreme, short-lived "maximum" star-forming events in highly dissipative mergers of gas-rich galaxies. The resulting high-mass surface and volume densities of SMGs are similar to those of compact quiescent galaxies in the same redshift range and much higher than those in local spheroids. From the ratio of the comoving volume densities of SMGs and quiescent galaxies in the same mass and redshift ranges, and from the comparison of gas exhaustion timescales and stellar ages, we estimate that the SMG phase duration is about 100 Myr. Our analysis of SMGs and optically/UV selected high-redshift star-forming galaxies supports a "universal" Chabrier IMF as being valid over the star-forming history of these galaxies. We find that the ^12CO luminosity to total gas mass conversion factors at z ~ 2-3 are probably similar to those assumed at z ~ 0. The implied gas fractions in our sample galaxies range from 20% to 50%.
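
The conversion factor referred to in the closing sentences is the standard scaling between CO line luminosity and total molecular gas mass; the conventional relations behind the quoted gas fractions are written out below in generic notation rather than the paper's own:

\[
M_{\mathrm{gas}} = \alpha_{\mathrm{CO}}\, L'_{\mathrm{CO}},
\qquad
f_{\mathrm{gas}} = \frac{M_{\mathrm{gas}}}{M_{\mathrm{gas}} + M_{*}},
\]

with L'_{CO} in K km s^-1 pc^2 and α_CO the conversion factor whose z ~ 2-3 values the paper argues are similar to those assumed locally.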

Journal ArticleDOI
TL;DR: This review presents methods that could be applied at the outset of any project, a prioritized list of alternate strategies and a list of pitfalls that trip many new investigators.
Abstract: In selecting a method to produce a recombinant protein, a researcher is faced with a bewildering array of choices as to where to start. To facilitate decision-making, we describe a consensus 'what to try first' strategy based on our collective analysis of the expression and purification of over 10,000 different proteins. This review presents methods that could be applied at the outset of any project, a prioritized list of alternate strategies and a list of pitfalls that trip many new investigators.

Journal ArticleDOI
TL;DR: Two major reform efforts in K-12 science education have taken place during the past 50 years, the first being the 1950-1970 curriculum reform motivated by the launching of Sputnik and sponsored by the newly formed National Science Foundation (NSF) in the United States and by the Nuffield Foundation in the United Kingdom.
Abstract: Two major reform efforts in K-12 science education have taken place during the past 50 years. The first was the 1950-1970 curriculum reform efforts motivated by the launching of Sputnik and sponsored by the newly formed National Science Foundation (NSF) in the United States and by the Nuffield Foundation in the United Kingdom. The signature goal for these reformed programs was to produce courses of study that would get students to "think like scientists," thus placing them in a "pipeline" for science careers (Rudolph, 2002). The second U.S. and U.K. reform effort in science education began in the 1980s and continues to this day as part of the national standards movement. Referred to as the "Science for All" movement in the United States and the "Public Understanding of Science" in the United Kingdom, here the education goal was and is to develop a scientifically literate populace that can participate in both the economic and democratic agendas of our increasingly global market-focused science, technology, engineering, and mathematics (STEM) societies. In addition to the economic and democratic imperatives as a purpose for science education, more recent voices of science education reform (Driver, Leach, Millar, & Scott, 1996; Millar, 1996; Millar & Hunt, 2002; Osborne, Duschl, & Fairbrother, 2002) have advocated that the proper perspective for science education in schools ought to be the cultural imperative. The cultural imperative perspective sees STEM disciplines, knowledge, and practices as woven into the very fabric of our nations and societies. What the cultural imperative provides that the democratic and economic imperatives do not is recognition of important social and epistemic dimensions that are embedded in the growth, evaluation, representation, and communication of STEM knowledge and practices. New perspectives and understandings in the learning sciences about learning and learning environments, and in science studies about knowing and inquiring, highlight the importance of science

Journal ArticleDOI
TL;DR: In this article, two generalizations of the = 6 superconformal Chern-Simons-matter theories with gauge group U(n) × U(N) have been considered, and they are conjectured to describe M−N| fractional M2-branes localized at the orbifold singularity.
Abstract: We consider two generalizations of the = 6 superconformal Chern-Simons-matter theories with gauge group U(N) × U(N). The first generalization is to = 6 superconformal U(M) × U(N) theories, and the second to = 5 superconformal O(2M) × USp(2N) and O(2M+1) × USp(2N) theories. These theories are conjectured to describe M2-branes probing C4/Zk in the unitary case, and C4/k in the orthogonal/symplectic case, together with a discrete flux, which can be interpreted as |M−N| fractional M2-branes localized at the orbifold singularity. The classical theories with these gauge groups have been constructed before; in this paper we focus on some quantum aspects of these theories, and on a detailed description of their M theory and type IIA string theory duals.

Proceedings ArticleDOI
14 Sep 2008
TL;DR: The design, implement, and evaluate a technique to identify the source network interface card (NIC) of an IEEE 802.11 frame through passive radio-frequency analysis, called PARADIS, which leverages minute imperfections of transmitter hardware that are acquired at manufacture and are present even in otherwise identical NICs.
Abstract: We design, implement, and evaluate a technique to identify the source network interface card (NIC) of an IEEE 802.11 frame through passive radio-frequency analysis. This technique, called PARADIS, leverages minute imperfections of transmitter hardware that are acquired at manufacture and are present even in otherwise identical NICs. These imperfections are transmitter-specific and manifest themselves as artifacts of the emitted signals. In PARADIS, we measure differentiating artifacts of individual wireless frames in the modulation domain and apply suitable machine-learning classification tools to achieve significantly higher degrees of NIC identification accuracy than prior best known schemes. We experimentally demonstrate the effectiveness of PARADIS in differentiating between more than 130 identical 802.11 NICs with accuracy in excess of 99%. Our results also show that the accuracy of PARADIS is resilient against ambient noise and fluctuations of the wireless channel. Although our implementation deals exclusively with IEEE 802.11, the approach itself is general and will work with any digital modulation scheme.
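
A minimal sketch of the classification step the abstract describes: per-frame modulation-domain features feed a supervised classifier that labels the transmitting NIC. The synthetic features and the choice of an SVM below are illustrative assumptions, not the paper's actual feature set or classifier.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic per-frame features (e.g. frequency error, I/Q offset, magnitude/phase error)
# for frames captured from a number of physically identical NICs; each NIC gets its own
# small, device-specific bias in feature space.
rng = np.random.default_rng(42)
n_nics, frames_per_nic, n_features = 10, 200, 5
X = np.vstack([rng.normal(loc=rng.normal(size=n_features), scale=0.05,
                          size=(frames_per_nic, n_features)) for _ in range(n_nics)])
y = np.repeat(np.arange(n_nics), frames_per_nic)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0, stratify=y)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_train, y_train)
print("per-frame NIC identification accuracy:", clf.score(X_test, y_test))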