
Showing papers by "University of Waterloo" published in 2015


Journal ArticleDOI
28 Aug 2015-Science
TL;DR: A large-scale assessment suggests that experimental reproducibility in psychology leaves a lot to be desired, and correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.
Abstract: Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams.

5,532 citations


Book
01 Jan 2015
TL;DR: The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way in an advanced undergraduate or beginning graduate course.
Abstract: Machine learning is one of the fastest growing areas of computer science, with far-reaching applications. The aim of this textbook is to introduce machine learning, and the algorithmic paradigms it offers, in a principled way. The book provides an extensive theoretical account of the fundamental ideas underlying machine learning and the mathematical derivations that transform these principles into practical algorithms. Following a presentation of the basics of the field, the book covers a wide array of central topics that have not been addressed by previous textbooks. These include a discussion of the computational complexity of learning and the concepts of convexity and stability; important algorithmic paradigms including stochastic gradient descent, neural networks, and structured output learning; and emerging theoretical concepts such as the PAC-Bayes approach and compression-based bounds. Designed for an advanced undergraduate or beginning graduate course, the text makes the fundamentals and algorithms of machine learning accessible to students and non-expert readers in statistics, computer science, mathematics, and engineering.

3,857 citations


Journal ArticleDOI
TL;DR: The Review considers some of the current scientific issues underpinning sodium ion batteries, including the discovery of new materials, their electrochemistry, and an increased understanding of ion mobility based on computational methods.
Abstract: Energy storage technology has received significant attention for portable electronic devices, electric vehicle propulsion, bulk electricity storage at power stations, and load leveling of renewable sources, such as solar energy and wind power. Lithium ion batteries have dominated most of the first two applications. For the last two cases, however, moving beyond lithium batteries to the element that lies below (sodium) is a sensible step that offers sustainability and cost-effectiveness. This requires an evaluation of the science underpinning these devices, including the discovery of new materials, their electrochemistry, and an increased understanding of ion mobility based on computational methods. The Review considers some of the current scientific issues underpinning sodium ion batteries.

1,694 citations


Journal ArticleDOI
TL;DR: A strategy to entrap polysulfides in the cathode is reported that relies on a chemical process, whereby a host (manganese dioxide nanosheets serve as the prototype) reacts with initially formed lithium polysulfides to form surface-bound intermediates; the resulting cycling performance is among the best reported to date.
Abstract: The lithium-sulfur battery is receiving intense interest because its theoretical energy density exceeds that of lithium-ion batteries at much lower cost, but practical applications are still hindered by capacity decay caused by the polysulfide shuttle. Here we report a strategy to entrap polysulfides in the cathode that relies on a chemical process, whereby a host (manganese dioxide nanosheets serve as the prototype) reacts with initially formed lithium polysulfides to form surface-bound intermediates. These function as a redox shuttle to catenate and bind 'higher' polysulfides, and convert them on reduction to insoluble lithium sulfide via disproportionation. The sulfur/manganese dioxide nanosheet composite with 75 wt% sulfur exhibits a reversible capacity of 1,300 mA h g(-1) at moderate rates and a fade rate over 2,000 cycles of 0.036%/cycle, among the best reported to date. We furthermore show that this mechanism extends to graphene oxide and suggest it can be employed more widely.

1,625 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explore the emergence of Airbnb, a company whose website permits ordinary people to rent out their residences as tourist accommodation, and examine its rise through the lens of disruptive innovation theory, which describes how products that lack in traditionally favored attributes but offer alternative benefits can, over time, transform a market and capture mainstream consumers.
Abstract: This article explores the emergence of Airbnb, a company whose website permits ordinary people to rent out their residences as tourist accommodation. The company was just recently established, but it has grown extremely rapidly and is now selling many millions of room nights annually. This rise is examined through the lens of disruptive innovation theory, which describes how products that lack in traditionally favoured attributes but offer alternative benefits can, over time, transform a market and capture mainstream consumers. The concepts of disruptive innovation are used to consider Airbnb's novel business model, which is built around modern internet technologies, and Airbnb's distinct appeal, which centres on cost-savings, household amenities, and the potential for more authentic local experiences. Despite Airbnb's growing popularity, many Airbnb rentals are actually illegal due to short-term rental regulations. These legality issues and their corresponding tax concerns are discussed, with an overview...

1,317 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a loophole-free violation of local realism using entangled photon pairs, ensuring that all relevant events in their Bell test are spacelike separated by placing the parties far enough apart and by using fast random number generators and high-speed polarization measurements.
Abstract: We present a loophole-free violation of local realism using entangled photon pairs. We ensure that all relevant events in our Bell test are spacelike separated by placing the parties far enough apart and by using fast random number generators and high-speed polarization measurements. A high-quality polarization-entangled source of photons, combined with high-efficiency, low-noise, single-photon detectors, allows us to make measurements without requiring any fair-sampling assumptions. Using a hypothesis test, we compute p values as small as 5.9×10^{-9} for our Bell violation while maintaining the spacelike separation of our events. We estimate the degree to which a local realistic system could predict our measurement choices. Accounting for this predictability, our smallest adjusted p value is 2.3×10^{-7}. We therefore reject the hypothesis that local realism governs our experiment.

1,201 citations


Journal ArticleDOI
TL;DR: An updated and revised ISCEV Standard for full-field clinical electroretinography (ffERG or simply ERG) is presented and the parameters for Standard flash stimuli have been revised to accommodate a variety of light sources including gas discharge lamps and light emitting diodes.
Abstract: This document, from the International Society for Clinical Electrophysiology of Vision (ISCEV), presents an updated and revised ISCEV Standard for full-field clinical electroretinography (ffERG or simply ERG). The parameters for Standard flash stimuli have been revised to accommodate a variety of light sources including gas discharge lamps and light emitting diodes. This ISCEV Standard for clinical ERGs specifies six responses based on the adaptation state of the eye and the flash strength: (1) Dark-adapted 0.01 ERG (rod ERG); (2) Dark-adapted 3 ERG (combined rod-cone standard flash ERG); (3) Dark-adapted 3 oscillatory potentials; (4) Dark-adapted 10 ERG (strong flash ERG); (5) Light-adapted 3 ERG (standard flash “cone” ERG); and (6) Light-adapted 30 Hz flicker ERG. ISCEV encourages the use of additional ERG protocols for testing beyond this minimum standard for clinical ERGs.

1,112 citations


Journal ArticleDOI
TL;DR: It is reported that 2D early-transition-metal carbide conductive MXene phases, previously noted as impressive supercapacitor materials, also perform as excellent sulfur battery hosts owing to their inherently high underlying metallic conductivity and self-functionalized surfaces.
Abstract: Lithium–sulfur batteries are amongst the most promising candidates to satisfy emerging energy-storage demands. Suppression of the polysulfide shuttle while maintaining high sulfur content is the main challenge that faces their practical development. Here, we report that 2D early-transition-metal carbide conductive MXene phases—reported to be impressive supercapacitor materials—also perform as excellent sulfur battery hosts owing to their inherently high underlying metallic conductivity and self-functionalized surfaces. We show that 70 wt % S/Ti2C composites exhibit stable long-term cycling performance because of strong interaction of the polysulfide species with the surface Ti atoms, demonstrated by X-ray photoelectron spectroscopy studies. The cathodes show excellent cycling performance with specific capacity close to 1200 mA h g−1 at a five-hour charge/discharge (C/5) current rate. Capacity retention of 80 % is achieved over 400 cycles at a two-hour charge/discharge (C/2) current rate.

1,064 citations


Journal ArticleDOI
TL;DR: The ecology of rare microbial populations is discussed, and molecular and computational methods for targeting taxonomic 'blind spots' within the rare biosphere of complex microbial communities are highlighted, underscoring the value of studying the biogeography of microorganisms.
Abstract: The profound influence of microorganisms on human life and global biogeochemical cycles underlines the value of studying the biogeography of microorganisms, exploring microbial genomes and expanding our understanding of most microbial species on Earth: that is, those present at low relative abundance. The detection and subsequent analysis of low-abundance microbial populations—the 'rare biosphere'—have demonstrated the persistence, population dynamics, dispersion and predation of these microbial species. We discuss the ecology of rare microbial populations, and highlight molecular and computational methods for targeting taxonomic 'blind spots' within the rare biosphere of complex microbial communities.

794 citations


Journal ArticleDOI
TL;DR: This commentary considers the amassed evidence showing that self-report dietary intake data can successfully be used to inform dietary guidance and public health policy, and provides 7 specific recommendations for collecting, analyzing, and interpreting self-report dietary data.
Abstract: Recent reports have asserted that, because of energy underreporting, dietary self-report data suffer from measurement error so great that findings that rely on them are of no value. This commentary considers the amassed evidence that shows that self-report dietary intake data can successfully be used to inform dietary guidance and public health policy. Topics discussed include what is known and what can be done about the measurement error inherent in data collected by using self-report dietary assessment instruments and the extent and magnitude of underreporting energy compared with other nutrients and food groups. Also discussed is the overall impact of energy underreporting on dietary surveillance and nutritional epidemiology. In conclusion, 7 specific recommendations for collecting, analyzing, and interpreting self-report dietary data are provided: (1) continue to collect self-report dietary intake data because they contain valuable, rich, and critical information about foods and beverages consumed by populations that can be used to inform nutrition policy and assess diet-disease associations; (2) do not use self-reported energy intake as a measure of true energy intake; (3) do use self-reported energy intake for energy adjustment of other self-reported dietary constituents to improve risk estimation in studies of diet-health associations; (4) acknowledge the limitations of self-report dietary data and analyze and interpret them appropriately; (5) design studies and conduct analyses that allow adjustment for measurement error; (6) design new epidemiologic studies to collect dietary data from both short-term (recalls or food records) and long-term (food-frequency questionnaires) instruments on the entire study population to allow for maximizing the strengths of each instrument; and (7) continue to develop, evaluate, and further expand methods of dietary assessment, including dietary biomarkers and methods using new technologies.

725 citations


Journal ArticleDOI
TL;DR: A sulfur electrode exhibiting strong polysulfide chemisorption using a porous N, S dual-doped carbon is reported, and the synergistic functionalization from the N and S heteroatoms dramatically modifies the electron density distribution and leads to much stronger polysulfide binding.
Abstract: A sulfur electrode exhibiting strong polysulfide chemisorption using a porous N, S dual-doped carbon is reported. The synergistic functionalization from the N and S heteroatoms dramatically modifies the electron density distribution and leads to much stronger polysulfide binding. X-ray photoelectron spectroscopy studies combined with ab initio calculations reveal strong Li(+)-N and Sn(2-)-S interactions. The sulfur electrodes exhibit an ultralow capacity fading of 0.052% per cycle over 1100 cycles.

Journal ArticleDOI
TL;DR: An introductory tutorial on the usage of the Hyperopt library, including the description of search spaces, minimization (in serial and parallel), and the analysis of the results collected in the course of minimization.
Abstract: Sequential model-based optimization (also known as Bayesian optimization) is one of the most efficient methods (per function evaluation) of function minimization. This efficiency makes it appropriate for optimizing the hyperparameters of machine learning algorithms that are slow to train. The Hyperopt library provides algorithms and parallelization infrastructure for performing hyperparameter optimization (model selection) in Python. This paper presents an introductory tutorial on the usage of the Hyperopt library, including the description of search spaces, minimization (in serial and parallel), and the analysis of the results collected in the course of minimization. This paper also gives an overview of Hyperopt-Sklearn, a software project that provides automatic algorithm configuration of the Scikit-learn machine learning library. Following Auto-Weka, we take the view that the choice of classifier and even the choice of preprocessing module can be taken together to represent a single large hyperparameter optimization problem. We use Hyperopt to define a search space that encompasses many standard components (e.g. SVM, RF, KNN, PCA, TFIDF) and common patterns of composing them together. We demonstrate, using search algorithms in Hyperopt and standard benchmarking data sets (MNIST, 20-newsgroups, convex shapes), that searching this space is practical and effective. In particular, we improve on best-known scores for the model space for both MNIST and convex shapes. The paper closes with some discussion of ongoing and future work.
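As a hedged illustration of the kind of Hyperopt usage the tutorial covers (defining a search space and minimizing an objective with TPE), here is a minimal sketch; the quadratic objective and the space bounds are illustrative placeholders, not taken from the paper:

```python
# Minimal Hyperopt sketch: define a search space and minimize an objective with TPE.
# The objective and bounds below are stand-ins for an expensive model-training run.
from hyperopt import fmin, tpe, hp, Trials

def objective(params):
    # Placeholder for training a model and returning a validation loss.
    x = params["x"]
    return (x - 3.0) ** 2

space = {"x": hp.uniform("x", -10.0, 10.0)}   # search space description

trials = Trials()                              # records per-evaluation results
best = fmin(
    fn=objective,       # function to minimize
    space=space,        # search space
    algo=tpe.suggest,   # sequential model-based (Bayesian) optimization
    max_evals=100,      # budget of function evaluations
    trials=trials,
)
print(best)             # best hyperparameters found, e.g. {'x': 2.98...}
```

The Trials object retains the results collected in the course of minimization, which is what the tutorial's analysis step refers to; parallel search is driven through a distributed trials backend rather than the in-memory Trials shown here.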

Journal ArticleDOI
TL;DR: This review paper presents the current state of practice of assessing the visual condition of vertical and horizontal civil infrastructure, in particular reinforced concrete bridges, precast concrete tunnels, underground concrete pipes, and asphalt pavements.

Journal ArticleDOI
TL;DR: This paper proposes a novel objective image quality assessment (IQA) algorithm for MEF images, based on the principle of the structural similarity approach and a new measure of patch structural consistency, and shows that the proposed model correlates well with subjective judgments and significantly outperforms existing IQA models for general image fusion.
Abstract: Multi-exposure image fusion (MEF) is considered an effective quality enhancement technique widely adopted in consumer electronics, but little work has been dedicated to the perceptual quality assessment of multi-exposure fused images. In this paper, we first build an MEF database and carry out a subjective user study to evaluate the quality of images generated by different MEF algorithms. There are several useful findings. First, considerable agreement has been observed among human subjects on the quality of MEF images. Second, no single state-of-the-art MEF algorithm produces the best quality for all test images. Third, the existing objective quality models for general image fusion are very limited in predicting perceived quality of MEF images. Motivated by the lack of appropriate objective models, we propose a novel objective image quality assessment (IQA) algorithm for MEF images based on the principle of the structural similarity approach and a novel measure of patch structural consistency. Our experimental results on the subjective database show that the proposed model well correlates with subjective judgments and significantly outperforms the existing IQA models for general image fusion. Finally, we demonstrate the potential application of the proposed model by automatically tuning the parameters of MEF algorithms. The subjective database and the MATLAB code of the proposed model will be made available online. Preliminary results of Section III were presented at the 6th International Workshop on Quality of Multimedia Experience, Singapore, 2014.
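For context, the classic structural-similarity (SSIM) index that the proposed model builds on compares two image patches x and y as shown below; the paper's patch structural consistency measure is an MEF-specific extension of this idea and is not reproduced here:

```latex
\mathrm{SSIM}(x,y) \;=\;
\frac{\big(2\mu_x\mu_y + C_1\big)\big(2\sigma_{xy} + C_2\big)}
     {\big(\mu_x^{2}+\mu_y^{2}+C_1\big)\big(\sigma_x^{2}+\sigma_y^{2}+C_2\big)},
```

where the mu terms are local patch means, the sigma-squared terms local variances, sigma_xy the covariance, and C1, C2 small stabilizing constants.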

Journal ArticleDOI
TL;DR: A comparison of traditional statistical and novel machine learning models applied for regional scale landslide susceptibility modeling is presented, and it is suggested that the framework of this model evaluation approach can be applied to assist in the selection of a suitable landslide susceptibility modeling technique.

Journal ArticleDOI
TL;DR: It is shown, through a combination of experiments and computations, that introducing hierarchy into the architecture of 3D structural metamaterials enables the attainment of a unique combination of properties: ultralightweight, recoverability, and a near-linear scaling of stiffness and strength with density.
Abstract: Hierarchically designed structures with architectural features that span across multiple length scales are found in numerous hard biomaterials, like bone, wood, and glass sponge skeletons, as well as manmade structures, like the Eiffel Tower. It has been hypothesized that their mechanical robustness and damage tolerance stem from sophisticated ordering within the constituents, but the specific role of hierarchy remains to be fully described and understood. We apply the principles of hierarchical design to create structural metamaterials from three material systems: (i) polymer, (ii) hollow ceramic, and (iii) ceramic–polymer composites that are patterned into self-similar unit cells in a fractal-like geometry. In situ nanomechanical experiments revealed (i) a nearly theoretical scaling of structural strength and stiffness with relative density, which outperforms existing nonhierarchical nanolattices; (ii) recoverability, with hollow alumina samples recovering up to 98% of their original height after compression to ≥50% strain; (iii) suppression of brittle failure and structural instabilities in hollow ceramic hierarchical nanolattices; and (iv) a range of deformation mechanisms that can be tuned by changing the slenderness ratios of the beams. Additional levels of hierarchy beyond a second order did not increase the strength or stiffness, which suggests the existence of an optimal degree of hierarchy to amplify resilience. We developed a computational model that captures local stress distributions within the nanolattices under compression and explains some of the underlying deformation mechanisms as well as validates the measured effective stiffness to be interpreted as a metamaterial property.
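For context on the "nearly theoretical scaling" claim, the classical Gibson-Ashby relations for cellular solids describe how effective stiffness E and strength sigma scale with relative density; these are standard background relations, not taken from the paper:

```latex
\text{stretching-dominated:}\quad E \propto \bar{\rho}\,E_s,\qquad \sigma \propto \bar{\rho}\,\sigma_s;
\qquad
\text{bending-dominated:}\quad E \propto \bar{\rho}^{\,2}E_s,\qquad \sigma \propto \bar{\rho}^{\,3/2}\sigma_s,
```

so a near-linear scaling with relative density indicates behavior close to the stretching-dominated (theoretical) limit, which is the sense in which the hierarchical nanolattices outperform bending-dominated architectures.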

Journal ArticleDOI
TL;DR: In this paper, a simple, efficient method for simulating Hamiltonian dynamics on a quantum computer by approximating the truncated Taylor series of the evolution operator is described; it can simulate the time evolution of a wide variety of physical systems.
Abstract: We describe a simple, efficient method for simulating Hamiltonian dynamics on a quantum computer by approximating the truncated Taylor series of the evolution operator. Our method can simulate the time evolution of a wide variety of physical systems. As in another recent algorithm, the cost of our method depends only logarithmically on the inverse of the desired precision, which is optimal. However, we simplify the algorithm and its analysis by using a method for implementing linear combinations of unitary operations together with a robust form of oblivious amplitude amplification.
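For reference, the core approximation described above can be written compactly. The evolution over a short segment t/r is expanded as a truncated Taylor series, and once the Hamiltonian is given as a linear combination of unitaries, the truncated series is itself a linear combination of unitaries (a sketch of the standard presentation, not a full derivation):

```latex
U\!\left(\tfrac{t}{r}\right) = e^{-iHt/r} \;\approx\; \sum_{k=0}^{K} \frac{(-iHt/r)^{k}}{k!},
\qquad H = \sum_{l} \alpha_l H_l \quad (\alpha_l > 0,\ H_l \text{ unitary}),
```

```latex
\tilde{U} \;=\; \sum_{k=0}^{K} \sum_{l_1,\dots,l_k}
\frac{(-i\,t/r)^{k}}{k!}\,\alpha_{l_1}\cdots\alpha_{l_k}\,
H_{l_1}\cdots H_{l_k},
```

which is applied using the linear-combination-of-unitaries technique together with oblivious amplitude amplification; a truncation order K = O(log(1/epsilon)/log log(1/epsilon)) yields the logarithmic dependence on the inverse precision mentioned in the abstract.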

Journal ArticleDOI
TL;DR: This paper provides a concise and comprehensive review of Pickering emulsion systems that possess the ability to respond to an array of external triggers, including pH, temperature, CO2 concentration, light intensity, ionic strength, and magnetic field.
Abstract: Pickering emulsions possess many advantages over traditional surfactant stabilized emulsions. For example, Pickering emulsions impart better stability against coalescence and, in many cases, are biologically compatible and environmentally friendly. These characteristics open the door for their use in a variety of industries spanning petroleum, food, biomedicine, pharmaceuticals, and cosmetics. Depending on the application, rapid but controlled stabilization and destabilization of an emulsion may be necessary. As a result, Pickering emulsions with stimuli-responsive properties have, in recent years, received a considerable amount of attention. This paper provides a concise and comprehensive review of Pickering emulsion systems that possess the ability to respond to an array of external triggers, including pH, temperature, CO2 concentration, light intensity, ionic strength, and magnetic field. Potential applications for which stimuli-responsive Pickering emulsion systems would be of particular value, such as emulsion polymerization, enhanced oil recovery, catalyst recovery, and cosmetics, are discussed.

Journal ArticleDOI
TL;DR: In this article, the authors consider disordered many-body systems with periodic time-dependent Hamiltonians in one spatial dimension and identify two distinct phases: (i) a many-body localized (MBL) phase, in which almost all eigenstates have area-law entanglement entropy, and the eigenstate thermalization hypothesis (ETH) is violated, and (ii) a delocalized phase, where eigenstates have volume-law entropy and obey the ETH.
Abstract: We consider disordered many-body systems with periodic time-dependent Hamiltonians in one spatial dimension. By studying the properties of the Floquet eigenstates, we identify two distinct phases: (i) a many-body localized (MBL) phase, in which almost all eigenstates have area-law entanglement entropy, and the eigenstate thermalization hypothesis (ETH) is violated, and (ii) a delocalized phase, in which eigenstates have volume-law entanglement and obey the ETH. The MBL phase exhibits logarithmic in time growth of entanglement entropy when the system is initially prepared in a product state, which distinguishes it from the delocalized phase. We propose an effective model of the MBL phase in terms of an extensive number of emergent local integrals of motion, which naturally explains the spectral and dynamical properties of this phase. Numerical data, obtained by exact diagonalization and time-evolving block decimation methods, suggest a direct transition between the two phases.
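The "extensive number of emergent local integrals of motion" mentioned above is commonly written as an l-bit Hamiltonian; as a hedged illustration, the generic static form reads as follows (the paper's Floquet-specific construction may differ in detail):

```latex
H_{\mathrm{eff}} \;=\; \sum_i h_i\,\tau_i^{z}
\;+\; \sum_{i<j} J_{ij}\,\tau_i^{z}\tau_j^{z}
\;+\; \sum_{i<j<k} J_{ijk}\,\tau_i^{z}\tau_j^{z}\tau_k^{z} \;+\;\cdots,
\qquad J_{ij} \sim J_0\, e^{-|i-j|/\xi},
```

where the tau operators are the conserved l-bits; the exponentially decaying interactions are what produce the logarithmic-in-time entanglement growth described in the abstract.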

Journal ArticleDOI
TL;DR: A comprehensive study of different mechanisms of collaboration and defense in collaborative security, covering six types of security systems, with the goal of helping to make collaborative security systems more resilient and efficient.
Abstract: Security is oftentimes centrally managed. An alternative trend of using collaboration in order to improve security has gained momentum over the past few years. Collaborative security is an abstract concept that applies to a wide variety of systems and has been used to solve security issues inherent in distributed environments. Thus far, collaboration has been used in many domains such as intrusion detection, spam filtering, botnet resistance, and vulnerability detection. In this survey, we focus on different mechanisms of collaboration and defense in collaborative security. We systematically investigate numerous use cases of collaborative security by covering six types of security systems. Aspects of these systems are thoroughly studied, including their technologies, standards, frameworks, strengths and weaknesses. We then present a comprehensive study with respect to their analysis target, timeliness of analysis, architecture, network infrastructure, initiative, shared information and interoperability. We highlight five important topics in collaborative security, and identify challenges and possible directions for future research. Our work contributes the following to the existing research on collaborative security with the goal of helping to make collaborative security systems more resilient and efficient. This study (1) clarifies the scope of collaborative security, (2) identifies the essential components of collaborative security, (3) analyzes the multiple mechanisms of collaborative security, and (4) identifies challenges in the design of collaborative security.

Journal ArticleDOI
TL;DR: In this article, a nonlinear finite element analysis of reinforced concrete slab-column connections under static and pseudo-dynamic loadings was conducted to investigate their failure modes in terms of ultimate load and cracking patterns.

Journal ArticleDOI
TL;DR: It is discussed how these models can capture the dynamics that characterize many real-world scenarios, suggesting ways in which policy makers can better design effective prevention strategies, and highlighting pitfalls that researchers in the field might face.

Journal ArticleDOI
TL;DR: The tree-level S-matrix of Einstein's theory is known to have a representation as an integral over the moduli space of punctured spheres localized to the solutions of the scattering equations; in this paper, the authors introduce three operations on the integrand that produce other theories.
Abstract: The tree-level S-matrix of Einstein's theory is known to have a representation as an integral over the moduli space of punctured spheres localized to the solutions of the scattering equations. In this paper we introduce three operations that can be applied on the integrand in order to produce other theories. Starting in d + M dimensions we use dimensional reduction to construct Einstein-Maxwell with gauge group U(1)^M. The second operation turns gravitons into gluons and we call it "squeezing". This gives rise to a formula for all multi-trace mixed amplitudes in Einstein-Yang-Mills. Dimensionally reducing Yang-Mills we find the S-matrix of a special Yang-Mills-Scalar (YMS) theory, and by the squeezing operation we find that of a YMS theory with an additional cubic scalar vertex. A corollary of the YMS formula gives one for a single massless scalar with a ϕ^4 interaction. Starting again from Einstein's theory but in d + d dimensions we introduce a "generalized dimensional reduction" that produces the Born-Infeld theory or a special Galileon theory in d dimensions depending on how it is applied. An extension of the Born-Infeld formula leads to one for the Dirac-Born-Infeld (DBI) theory. By applying the same operation to Yang-Mills we obtain the U(N) non-linear sigma model (NLSM). Finally, we show how the Kawai-Lewellen-Tye relations naturally follow from our formulation and provide additional connections among these theories. One such relation constructs DBI from YMS and NLSM.
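For reference, the scattering-equation (CHY) representation that these operations act on has the schematic form below, in standard notation rather than quoted from the paper:

```latex
A_n \;=\; \int \frac{\prod_{a=1}^{n} d\sigma_a}{\mathrm{vol}\,SL(2,\mathbb{C})}\;
{\prod_a}'\,\delta\!\Big(\sum_{b\neq a}\frac{s_{ab}}{\sigma_a-\sigma_b}\Big)\;
\mathcal{I}_n(k,\epsilon,\sigma),
\qquad
\sum_{b\neq a}\frac{s_{ab}}{\sigma_a-\sigma_b}=0,
```

where s_ab = (k_a + k_b)^2 and the choice of integrand I_n selects the theory; the dimensional-reduction, squeezing, and generalized-dimensional-reduction operations described in the abstract all act on this integrand.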

Journal ArticleDOI
TL;DR: The high prevalence of multimorbidity and the numerous combinations of conditions suggest that single, disease-oriented management programs may be less effective or efficient tools for high-quality care compared to person-centered approaches.
Abstract: Background: Multimorbidity, the co-occurrence of two or more chronic conditions, is common among older adults and is known to be associated with high costs and gaps in quality of care. Population-based estimates of multimorbidity are not readily available, which makes future planning a challenge. We aimed to estimate the population-based prevalence and trends of multimorbidity in Ontario, Canada and to examine patterns in the co-occurrence of chronic conditions. Methods: This retrospective cohort study includes all Ontarians (aged 0 to 105 years) with at least one of 16 common chronic conditions. Descriptive statistics were used to examine and compare the prevalence of multimorbidity by age and number of conditions in 2003 and 2009. The co-occurrence of chronic conditions among individuals with multimorbidity was also explored. Results: The prevalence of multimorbidity among Ontarians rose from 17.4% in 2003 to 24.3% in 2009, a 40% increase. This increase over time was evident across all age groups. Within individual chronic conditions, multimorbidity rates ranged from 44% to 99%. Remarkably, there were no dominant patterns of co-occurring conditions. Conclusion: The high prevalence of multimorbidity and the numerous combinations of conditions suggest that single, disease-oriented management programs may be less effective or efficient tools for high-quality care compared to person-centered approaches.
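A quick check of the reported relative increase, using the restored decimal figures:

```latex
\frac{24.3\% - 17.4\%}{17.4\%} \;\approx\; 0.397 \;\approx\; 40\%.
```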

Journal ArticleDOI
TL;DR: An aperiodic array of coupled dielectric nanoresonators is utilized to demonstrate a multiwavelength achromatic lens, an essential step toward the realization of broadband flat optical elements.
Abstract: Nanoscale optical resonators enable a new class of flat optical components called metasurfaces. This approach has been used to demonstrate functionalities such as focusing free of monochromatic aberrations (i.e., spherical and coma), anomalous reflection, and large circular dichroism. Recently, dielectric metasurfaces that compensate the phase dispersion responsible for chromatic aberrations have been demonstrated. Here, we utilize an aperiodic array of coupled dielectric nanoresonators to demonstrate a multiwavelength achromatic lens. The focal length remains unchanged for three wavelengths in the near-infrared region (1300, 1550, and 1800 nm). Experimental results are in agreement with full-wave simulations. Our findings are an essential step toward a realization of broadband flat optical elements.
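As background for the achromatic-focusing claim, a flat lens of focal length f must impart the hyperbolic phase profile below at each operating wavelength; the coupled-nanoresonator array is designed so that this condition holds simultaneously at 1300, 1550, and 1800 nm. This is the standard metalens relation (up to sign convention and an additive constant), not quoted from the paper:

```latex
\varphi(r,\lambda) \;=\; -\,\frac{2\pi}{\lambda}\left(\sqrt{r^{2}+f^{2}}-f\right),
```

where r is the radial coordinate on the lens plane.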

Journal ArticleDOI
TL;DR: A feasible and seemingly robust method for applying systematic search strategies to identify web-based resources in the grey literature is demonstrated; the approach is amenable to adaptation to identify other types of grey literature from other disciplines and to answer a wide range of research questions.
Abstract: Grey literature is an important source of information for large-scale review syntheses. However, there are many characteristics of grey literature that make it difficult to search systematically. Further, there is no ‘gold standard’ for rigorous systematic grey literature search methods and few resources on how to conduct this type of search. This paper describes systematic review search methods that were developed and applied to complete a case study systematic review of grey literature that examined guidelines for school-based breakfast programs in Canada. A grey literature search plan was developed to incorporate four different searching strategies: (1) grey literature databases, (2) customized Google search engines, (3) targeted websites, and (4) consultation with contact experts. These complementary strategies were used to minimize the risk of omitting relevant sources. Since abstracts are often unavailable in grey literature documents, items' abstracts, executive summaries, or tables of contents (whichever was available) were screened. Screening of publications' full text followed. Data were extracted on the organization, year published, who they were developed by, intended audience, goal/objectives of document, sources of evidence/resources cited, meals mentioned in the guidelines, and recommendations for program delivery. The search strategies for identifying and screening publications for inclusion in the case study review were found to be manageable, comprehensive, and intuitive when applied in practice. The four search strategies of the grey literature search plan yielded 302 potentially relevant items for screening. Following the screening process, 15 publications that met all eligibility criteria remained and were included in the case study systematic review. The high-level findings of the case study systematic review are briefly described. This article demonstrated a feasible and seemingly robust method for applying systematic search strategies to identify web-based resources in the grey literature. The search strategy we developed and tested is amenable to adaptation to identify other types of grey literature from other disciplines and to answer a wide range of research questions. This method should be further adapted and tested in future research syntheses.

Journal ArticleDOI
TL;DR: In this paper, the authors compared various works in terms of capacity, areal mass loading, and fraction of conductive additive, which are the critical parameters dictating the potential for a device to achieve a specific energy higher than current Li-ion batteries (i.e., >200 Wh kg−1).
Abstract: Battery technologies involving Li-S chemistries have been touted as one of the most promising next generation systems. The theoretical capacity of sulfur is nearly an order of magnitude higher than current Li-ion battery insertion cathodes and when coupled with a Li metal anode, Li-S batteries promise specific energies nearly five-fold higher. However, this assertion only holds if sulfur cathodes could be designed in the same manner as cathodes for Li-ion batteries. Here, the recent efforts to engineer high capacity, thick, sulfur-based cathodes are explored. Various works are compared in terms of capacity, areal mass loading, and fraction of conductive additive, which are the critical parameters dictating the potential for a device to achieve a specific energy higher than current Li-ion batteries (i.e., >200 Wh kg−1). While an inferior specific energy is projected in the majority of cases, several promising strategies have the potential to achieve >500 Wh kg−1. The challenges associated with the limited cycle-life of these systems due to both the polysulfide shuttle phenomenon and the rapid degradation of the Li metal anode that is experienced at the current densities required to charge high specific energy batteries in a reasonable timeframe are also discussed.
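As a back-of-the-envelope sketch of how the review's critical parameters (areal loading, capacity, and sulfur fraction) translate into specific energy, the snippet below computes an electrode-level figure; all numbers and the helper function are illustrative assumptions, not values from the paper:

```python
# Electrode-level specific-energy estimate from areal sulfur loading, specific
# capacity, and sulfur mass fraction. Illustrative only; a full cell-level value
# must also include the Li anode, electrolyte, current collectors, and packaging.

def cathode_specific_energy(
    sulfur_loading_mg_cm2: float,   # areal sulfur loading
    capacity_mAh_g: float,          # capacity per gram of sulfur
    sulfur_fraction: float,         # sulfur mass fraction of the electrode
    avg_voltage_V: float = 2.1,     # typical average Li-S discharge voltage (assumed)
) -> float:
    """Return Wh per kg of electrode (sulfur + carbon + binder only)."""
    areal_capacity_mAh_cm2 = sulfur_loading_mg_cm2 / 1000.0 * capacity_mAh_g
    areal_energy_mWh_cm2 = areal_capacity_mAh_cm2 * avg_voltage_V
    electrode_mass_mg_cm2 = sulfur_loading_mg_cm2 / sulfur_fraction
    return areal_energy_mWh_cm2 / electrode_mass_mg_cm2 * 1000.0  # mWh/mg -> Wh/kg

# Illustrative case: 4 mg/cm2 of sulfur at 1000 mAh/g in a 70 wt% sulfur electrode.
print(round(cathode_specific_energy(4.0, 1000.0, 0.70)))  # ~1470 Wh/kg (electrode only)
```

Cell-level values are several-fold lower than this electrode-only figure, which is why the review treats >200 Wh kg−1 (and especially >500 Wh kg−1) as demanding targets.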

Journal ArticleDOI
TL;DR: The objectives of this paper were to critically review research on P transport in subsurface drainage, to determine factors that control P losses, and to identify gaps in the current scientific understanding of the role of subsurfaced drainage in P transport.
Abstract: Phosphorus (P) loss from agricultural fields and watersheds has been an important water quality issue for decades because of the critical role P plays in eutrophication. Historically, most research has focused on P losses by surface runoff and erosion because subsurface P losses were often deemed to be negligible. Perceptions of subsurface P transport, however, have evolved, and considerable work has been conducted to better understand the magnitude and importance of subsurface P transport and to identify practices and treatments that decrease subsurface P loads to surface waters. The objectives of this paper were (i) to critically review research on P transport in subsurface drainage, (ii) to determine factors that control P losses, and (iii) to identify gaps in the current scientific understanding of the role of subsurface drainage in P transport. Factors that affect subsurface P transport are discussed within the framework of intensively drained agricultural settings. These factors include soil characteristics (e.g., preferential flow, P sorption capacity, and redox conditions), drainage design (e.g., tile spacing, tile depth, and the installation of surface inlets), prevailing conditions and management (e.g., soil-test P levels, tillage, cropping system, and the source, rate, placement, and timing of P application), and hydrologic and climatic variables (e.g., baseflow, event flow, and seasonal differences). Structural, treatment, and management approaches to mitigate subsurface P transport-such as practices that disconnect flow pathways between surface soils and tile drains, drainage water management, in-stream or end-of-tile treatments, and ditch design and management-are also discussed. The review concludes by identifying gaps in the current understanding of P transport in subsurface drains and suggesting areas where future research is needed.

Proceedings ArticleDOI
01 Sep 2015
TL;DR: This work proposes a model for comparing sentences that uses a multiplicity of perspectives: each sentence is first modeled using a convolutional neural network that extracts features at multiple levels of granularity and uses multiple types of pooling.
Abstract: Modeling sentence similarity is complicated by the ambiguity and variability of linguistic expression. To cope with these challenges, we propose a model for comparing sentences that uses a multiplicity of perspectives. We first model each sentence using a convolutional neural network that extracts features at multiple levels of granularity and uses multiple types of pooling. We then compare our sentence representations at several granularities using multiple similarity metrics. We apply our model to three tasks, including the Microsoft Research paraphrase identification task and two SemEval semantic textual similarity tasks. We obtain strong performance on all tasks, rivaling or exceeding the state of the art without using external resources such as WordNet or parsers.
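As a compressed, hypothetical sketch of the architecture described above (convolutions at several filter widths over word embeddings, several pooling types per width, and multiple similarity measures between the two sentence representations), the PyTorch snippet below illustrates the idea; dimensions, layer counts, and the omitted final classifier are assumptions, not the paper's exact model:

```python
# Sketch of a multi-perspective sentence encoder: multiple filter widths
# (granularities), multiple pooling types, and multiple similarity metrics.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiPerspectiveEncoder(nn.Module):
    def __init__(self, embed_dim=300, n_filters=64, widths=(1, 2, 3)):
        super().__init__()
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, n_filters, w, padding=w - 1) for w in widths]
        )

    def forward(self, x):            # x: (batch, seq_len, embed_dim)
        x = x.transpose(1, 2)        # -> (batch, embed_dim, seq_len)
        feats = []
        for conv in self.convs:      # one granularity per filter width
            h = torch.tanh(conv(x))
            # multiple pooling types per granularity
            feats += [h.max(dim=2).values, h.min(dim=2).values, h.mean(dim=2)]
        return torch.cat(feats, dim=1)

def compare(a, b):
    # multiple similarity metrics between the two sentence representations
    cos = F.cosine_similarity(a, b, dim=1).unsqueeze(1)
    l2 = (a - b).norm(dim=1, keepdim=True)
    absdiff = (a - b).abs()
    return torch.cat([cos, l2, absdiff], dim=1)  # fed to a small classifier (not shown)

# Usage sketch:
# enc = MultiPerspectiveEncoder()
# sim_features = compare(enc(sent1_embeddings), enc(sent2_embeddings))
```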