
Showing papers by "National Physical Laboratory" published in 2019


Journal ArticleDOI
TL;DR: This Review focuses on the recent advances in the synthesis and lithium storage properties of silicon oxide-based anode materials and presents the progress in a systematic manner.
Abstract: Silicon oxides have been recognized as a promising family of anode materials for high-energy lithium-ion batteries (LIBs) owing to their abundant reserves, low cost, environmental friendliness, easy synthesis, and high theoretical capacity. However, the extended application of silicon oxides is severely hampered by their intrinsically low conductivity, large volume change, and low initial coulombic efficiency. Significant efforts have been dedicated to tackling these challenges towards practical applications. This Review focuses on the recent advances in the synthesis and lithium storage properties of silicon oxide-based anode materials. To present the progress in a systematic manner, this review is categorized as follows: (i) SiO-based anode materials, (ii) SiO2-based anode materials, (iii) non-stoichiometric SiOx-based anode materials, and (iv) Si-O-C-based anode materials. Finally, a future outlook and our personal perspectives on silicon oxide-based anode materials are presented.
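The "high theoretical capacity" claim can be made concrete with a Faraday-law estimate. The sketch below is an illustrative calculation of my own (the lithiation stoichiometries are assumed values, not figures taken from this Review):

```python
# Sketch (not from the Review): theoretical gravimetric capacity from Faraday's law,
# Q [mAh/g] = n * F / (3.6 * M), where n is the number of Li transferred per formula
# unit (an assumed lithiation stoichiometry), F the Faraday constant, M the molar mass.
F = 96485.0  # C/mol

def theoretical_capacity_mAh_per_g(n_li: float, molar_mass: float) -> float:
    """Capacity in mAh/g for n_li electrons per formula unit of molar mass in g/mol."""
    return n_li * F / (3.6 * molar_mass)

# Assumed stoichiometries: Li4.4Si for silicon, and ~4.2 Li per SiO formula unit
# (commonly quoted values, used here purely for illustration).
print(f"Si  (Li4.4Si): {theoretical_capacity_mAh_per_g(4.4, 28.09):.0f} mAh/g")   # ~4200
print(f"SiO (assumed): {theoretical_capacity_mAh_per_g(4.2, 44.09):.0f} mAh/g")   # ~2550
```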

596 citations


Journal ArticleDOI
TL;DR: In this article, the decoherence of transmon qubits is studied and the temporal stability of energy relaxation, dephasing, and qubit transition frequency is examined; the observed fluctuations raise the need for qubit metrology to examine the reproducibility of qubit parameters, since these could affect qubit gate fidelity.
Abstract: We benchmark the decoherence of superconducting transmon qubits to examine the temporal stability of energy relaxation, dephasing, and qubit transition frequency. By collecting statistics during measurements spanning multiple days, we find the mean parameters $\overline{T_1}$ = 49 μs and $\overline{T_2^*}$ = 95 μs; however, both of these quantities fluctuate, explaining the need for frequent re-calibration in qubit setups. Our main finding is that fluctuations in qubit relaxation are local to the qubit and are caused by instabilities of near-resonant two-level systems (TLS). Through statistical analysis, we determine sub-millihertz switching rates of these TLS and observe the coherent coupling between an individual TLS and a transmon qubit. Finally, we find evidence that the qubit's frequency stability produces a 0.8 ms limit on the pure dephasing, which we also observe. These findings raise the need for performing qubit metrology to examine the reproducibility of qubit parameters, where these fluctuations could affect qubit gate fidelity.
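For orientation, the two mean coherence times quoted above can be related through the standard decomposition 1/T2 = 1/(2T1) + 1/Tφ. The sketch below is an illustration of that textbook relation applied to the reported means, not analysis code from the paper:

```python
# Sketch (not from the paper): infer the pure-dephasing time T_phi from mean T1 and T2*
# using the standard relation 1/T2 = 1/(2*T1) + 1/T_phi.
T1 = 49e-6   # s, mean energy-relaxation time reported in the abstract
T2 = 95e-6   # s, mean dephasing time reported in the abstract

rate_phi = 1.0 / T2 - 1.0 / (2.0 * T1)   # pure-dephasing rate in 1/s
T_phi = 1.0 / rate_phi
print(f"Implied pure-dephasing time: {T_phi*1e3:.1f} ms")  # ~3 ms for these mean values
```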

263 citations


Journal ArticleDOI
TL;DR: Magnetic force microscopy (MFM) has become a truly widespread and commonly used characterization technique that has been applied to a variety of research and industrial applications, as discussed by the authors; the main advantages of the method include its high spatial resolution (typically ∼50 nm), the ability to work at variable temperatures and in applied magnetic fields, versatility, and simplicity of operation, with almost no need for sample preparation.
Abstract: Since it was first demonstrated in 1987, magnetic force microscopy (MFM) has become a truly widespread and commonly used characterization technique that has been applied to a variety of research and industrial applications. The main advantages of the method include its high spatial resolution (typically ∼50 nm), the ability to work at variable temperatures and in applied magnetic fields, versatility, and simplicity of operation, with almost no need for sample preparation. However, for most commercial systems, the technique has historically provided only qualitative information, and the number of available modes was typically limited, thus not reflecting the experimental demands. Additionally, the range of samples under study was largely restricted to “classic” ferromagnetic samples (typically, thin films or patterned nanostructures). Throughout this Perspective article, the recent progress and development of MFM is described, followed by a summary of the current state-of-the-art techniques and objects for study. Finally, the future of this fascinating field is discussed in the context of emerging instrumental and material developments. Aspects including quantitative MFM, the accurate interpretation of MFM images, new instrumentation, probe-engineering alternatives, and applications of MFM to new (often interdisciplinary) areas of materials science, physics, and biology will be discussed. We first describe the physical principles of MFM, paying particular attention to common artifacts frequently occurring in MFM measurements; then, we present a comprehensive review of the recent developments in MFM modes, instrumentation, and the main application areas; finally, the importance of the technique is considered for emerging and anticipated fields including skyrmions, 2D materials, and topological insulators.

166 citations


Journal ArticleDOI
27 Sep 2019-Sensors
TL;DR: This paper is a survey of existing and upcoming industrial applications of terahertz technologies, comprising sections on polymers, paint and coatings, pharmaceuticals, electronics, petrochemicals, gas sensing, and paper and wood industries.
Abstract: This paper is a survey of existing and upcoming industrial applications of terahertz technologies, comprising sections on polymers, paint and coatings, pharmaceuticals, electronics, petrochemicals, gas sensing, and paper and wood industries. Finally, an estimate of the market size and growth rates is given, as obtained from a comparison of market reports.

147 citations


Journal ArticleDOI
TL;DR: A framework for MR sensor technology (non-recording applications) to be used for public and private R&D planning, in order to provide guidance into likely MR sensor applications, products, and services expected in the next 15 years and beyond.
Abstract: Magnetoresistive (MR) sensors have been identified as promising candidates for the development of high-performance magnetometers due to their high sensitivity, low cost, low power consumption, and small size. The rapid advance of MR sensor technology has opened up a variety of MR sensor applications. These applications are in different areas that require MR sensors with different properties. Future MR sensor development in each of these areas requires an overview and a strategic guide. An MR sensor roadmap (non-recording applications) was therefore developed and made public by the Technical Committee of the IEEE Magnetics Society with the aim of providing a research and development (R&D) guide for MR sensors intended to be used by industry, government, and academia. The roadmap was developed over a three-year period and coordinated by an international effort of 22 taskforce members from ten countries and 17 organizations, including universities, research institutes, and sensor companies. In this paper, the current status of MR sensors for non-recording applications was identified by analyzing the patent and publication statistics. As a result, timescales for MR sensor development were established and critical milestones for sensor parameters were extracted in order to gain insight into potential MR sensor applications (non-recording). Five application areas were identified, and five MR sensor roadmaps were established. These include biomedical applications, flexible electronics, position sensing and human–computer interactions, non-destructive evaluation and monitoring, and navigation and transportation. Each roadmap was analyzed using a logistic growth model, and new opportunities were predicted based on the extrapolated curve, forecast milestones, and professional judgment of the taskforce members. This paper provides a framework for MR sensor technology (non-recording applications) to be used for public and private R&D planning, in order to provide guidance on likely MR sensor applications, products, and services expected in the next 15 years and beyond.
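The roadmaps are extrapolated with a logistic growth model; the sketch below shows how such a fit is typically done, using invented publication counts rather than the taskforce's data:

```python
# Sketch with invented data: fit a logistic growth curve, the model type the roadmap
# used to extrapolate MR-sensor publication/patent trends.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth: carrying capacity K, growth rate r, midpoint t0."""
    return K / (1.0 + np.exp(-r * (t - t0)))

years = np.arange(2000, 2019)
# Hypothetical cumulative publication counts (illustration only).
counts = logistic(years, K=5000, r=0.35, t0=2012) + np.random.default_rng(0).normal(0, 80, years.size)

popt, _ = curve_fit(logistic, years, counts, p0=(counts.max() * 2, 0.3, 2010))
K_fit, r_fit, t0_fit = popt
print(f"Fitted capacity K = {K_fit:.0f}, rate r = {r_fit:.2f}/yr, midpoint t0 = {t0_fit:.0f}")
# Extrapolating the fitted curve beyond the data gives the kind of forecast milestones
# described in the roadmap.
```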

136 citations


Journal ArticleDOI
TL;DR: The barriers that have held back progress, such as lack of sensitivity, and the breakthroughs that have been made, including laser post-ionization, are highlighted, as are the future challenges and opportunities for metabolic imaging at the single-cell scale.
Abstract: There is an increasing appreciation that every cell, even of the same type, is different. This complexity, when additionally combined with the variety of different cell types in tissue, is driving the need for spatially resolved omics at the single-cell scale. Rapid advances are being made in genomics and transcriptomics, but progress in metabolomics lags. This is partly because amplification and tagging strategies are not suited to dynamically created metabolite molecules. Mass spectrometry imaging has excellent potential for metabolic imaging. This review summarizes the recent advances in two of these techniques, matrix-assisted laser desorption ionization (MALDI) and secondary ion mass spectrometry (SIMS), and their convergence in subcellular spatial resolution and molecular information. The barriers that have held back progress, such as lack of sensitivity, and the breakthroughs that have been made, including laser post-ionization, are highlighted, as are the future challenges and opportunities for metabolic imaging at the single-cell scale.

113 citations


Journal ArticleDOI
TL;DR: The nature of the mitochondrial network in human muscle is defined, morphological differences between human and mouse and among patients with mitochondrial DNA diseases compared to healthy controls are established, and potential morphological markers of mitochondrial dysfunction in human tissues are suggested.

110 citations


Journal ArticleDOI
20 Feb 2019
TL;DR: In this article, a robust and simple way to increase the soliton access window was proposed, using an auxiliary laser that passively stabilizes the intracavity power; this scheme also eliminates the sudden change in circulating power (the soliton step) during the transition into the soliton regime.
Abstract: The recent demonstration of dissipative Kerr solitons in microresonators has opened a new pathway for the generation of ultrashort pulses and low-noise frequency combs with gigahertz to terahertz repetition rates, enabling applications in frequency metrology, astronomy, optical coherent communications, and laser-based ranging. A main challenge for soliton generation, in particular in ultra-high-Q resonators, is the sudden change in circulating intracavity power during the onset of soliton generation. This sudden power change requires precise control of the seed laser frequency and power or fast control of the resonator temperature. Here, we report a robust and simple way to increase the soliton access window by using an auxiliary laser that passively stabilizes intracavity power. In our experiments with fused silica resonators, we are able to extend the access range of microresonator solitons by two orders of magnitude, which enables soliton generation by slow and manual tuning of the pump laser into resonance and at unprecedented low power levels. Importantly, this scheme eliminates the sudden change in circulating power (“soliton step”) during transition into the soliton regime. Both single- and multi-soliton mode-locked states are generated in a 1.3-mm-diameter fused silica microrod resonator with a free spectral range of ∼50.6 GHz, at a 1554 nm pump wavelength at threshold powers <3 mW. Moreover, with a smaller 230-μm-diameter microrod, we demonstrate soliton generation at 780 μW threshold power. The passive enhancement of the soliton access range paves the way for robust and low-threshold microcomb systems and has the potential to be a practical tool for soliton microcomb generation.

109 citations


Journal ArticleDOI
TL;DR: This first guide outlines steps appropriate for determining whether XPS is capable of obtaining the desired information, identifies issues relevant to planning, conducting and reporting an XPS measurement, and identifies sources of practical information for conducting XPS measurements.
Abstract: Over the past three decades, the widespread utility and applicability of X-ray photoelectron spectroscopy (XPS) in research and applications has made it the most popular and widely used method of surface analysis. Associated with this increased use has been an increase in the number of new or inexperienced users which has led to erroneous uses and misapplications of the method. This article is the first in a series of guides assembled by a committee of experienced XPS practitioners that are intended to assist inexperienced users by providing information about good practices in the use of XPS. This first guide outlines steps appropriate for determining whether XPS is capable of obtaining the desired information, identifies issues relevant to planning, conducting and reporting an XPS measurement, and identifies sources of practical information for conducting XPS measurements. Many of the topics and questions addressed in this article also apply to other surface-analysis techniques.

105 citations


Journal ArticleDOI
TL;DR: The Radiometric Calibration Network (RadCalNet) is an effort to provide automated surface and atmosphere in situ data as part of a network including multiple sites for the purpose of optical imager radiometric calibration in the visible to shortwave infrared spectral range.
Abstract: Vicarious calibration approaches using in situ measurements saw first use in the early 1980s and have since improved to keep pace with the evolution of the radiometric requirements of the sensors that are being calibrated. The advantage of in situ measurements for vicarious calibration is that they can be carried out with traceable and quantifiable accuracy, making them ideal for interconsistency studies of on-orbit sensors. The recent development of automated sites to collect the in situ data has led to an increase in the available number of datasets for sensor calibration. The current work describes the Radiometric Calibration Network (RadCalNet) that is an effort to provide automated surface and atmosphere in situ data as part of a network including multiple sites for the purpose of optical imager radiometric calibration in the visible to shortwave infrared spectral range. The key goals of RadCalNet are to standardize protocols for collecting data, process to top-of-atmosphere reflectance, and provide uncertainty budgets for automated sites traceable to the international system of units. RadCalNet is the result of efforts by the RadCalNet Working Group under the umbrella of the Committee on Earth Observation Satellites (CEOS) Working Group on Calibration and Validation (WGCV) and the Infrared Visible Optical Sensors (IVOS). Four radiometric calibration instrumented sites located in the USA, France, China, and Namibia are presented here that were used as initial sites for prototyping and demonstrating RadCalNet. All four sites rely on collection of data for assessing the surface reflectance as well as atmospheric data over that site. The data are converted to top-of-atmosphere reflectance within RadCalNet and provided through a web portal to allow users to either radiometrically calibrate or verify the calibration of their sensors of interest. Top-of-atmosphere reflectance data with associated uncertainties are available at 10 nm intervals over the 400 nm to 1000 nm spectral range at 30 min intervals for a nadir-viewing geometry. An example is shown demonstrating how top-of-atmosphere data from RadCalNet can be used to determine the interconsistency between two sensors.
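A common way to use RadCalNet's 10 nm top-of-atmosphere reflectance for sensor comparison is to band-integrate it over a sensor's relative spectral response. The sketch below illustrates that step with a hypothetical reflectance spectrum and band response; it is not RadCalNet processing code:

```python
# Sketch (illustrative, not RadCalNet code): convolve 10 nm TOA reflectance with a
# sensor's relative spectral response (RSR) to get a band-equivalent reflectance,
#   rho_band = sum(rho(lambda) * RSR(lambda)) / sum(RSR(lambda)).
import numpy as np

wavelengths = np.arange(400, 1001, 10)                 # nm, RadCalNet spectral grid
rho_toa = 0.2 + 0.1 * np.sin(wavelengths / 150.0)      # hypothetical TOA reflectance spectrum

# Hypothetical Gaussian band response centred at 665 nm with ~30 nm FWHM.
centre, fwhm = 665.0, 30.0
sigma = fwhm / 2.3548
rsr = np.exp(-0.5 * ((wavelengths - centre) / sigma) ** 2)

rho_band = np.sum(rho_toa * rsr) / np.sum(rsr)
print(f"Band-equivalent TOA reflectance: {rho_band:.4f}")
```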

97 citations



Journal ArticleDOI
TL;DR: The Space Atomic Gravity Explorer (SAGE) has the scientific objective to investigate gravitational waves, dark matter, and other fundamental aspects of gravity as well as the connection between gravitational physics and quantum physics using new quantum sensors, namely, optical atomic clocks and atom interferometers based on ultracold strontium atoms as mentioned in this paper.
Abstract: The proposed mission “Space Atomic Gravity Explorer” (SAGE) has the scientific objective to investigate gravitational waves, dark matter, and other fundamental aspects of gravity as well as the connection between gravitational physics and quantum physics using new quantum sensors, namely, optical atomic clocks and atom interferometers based on ultracold strontium atoms.

Journal ArticleDOI
TL;DR: The robust Gaussian regression filter and morphological filters are used for the separation of the waviness component due to their robustness, and watershed segmentation is enhanced to extract globules from the residual surface, as discussed by the authors.
Abstract: Powder bed fusion (PBF) is a popular additive manufacturing (AM) process with wide applications in key industrial sectors, including aerospace, automotive, healthcare, and defence. However, a deficiency of PBF is its low quality of surface finish. A number of PBF process variables and other factors (e.g. powders, recoater) can influence the surface quality. It is of significant importance to measure and characterise PBF surfaces for the benefits of process optimisation, product performance evaluation and also product design. A state-of-the-art review is given to summarise the current research work on the characterisation of AM surfaces, particularly PBF surfaces. It is recognised that AM processes are different from conventional manufacturing processes and their produced surface topographies are different as well. In this paper, the surface characterisation framework is updated to reflect the unique characteristics of PBF processes. The surface spatial wavelength components and other process signature features are described and their production mechanisms are elaborated. A bespoke surface characterisation procedure is developed based on the updated framework. The robust Gaussian regression filter and the morphological filters are proposed for the separation of the waviness component due to their robustness. The watershed segmentation is enhanced to extract globules from the residual surface. Two AM components, produced by electron beam melting (EBM) and selective laser melting (SLM), are measured and characterised by the proposed methodology. Both filters are qualified for the extraction of melted tracks. The watershed segmentation enables the extraction of globules. The standard surface texture parameters of different surface wavelength components are compared. A set of bespoke parameters is intentionally developed to offer a quantitative evaluation of the globules.
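The waviness separation described above can be illustrated with a simple profile filter. The sketch below uses a plain Gaussian filter as a simplified stand-in for the robust Gaussian regression filter proposed in the paper, applied to a synthetic profile:

```python
# Simplified sketch: separate a measured profile into waviness and roughness with a
# Gaussian filter. The paper uses a *robust* Gaussian regression filter and
# morphological filters; a plain Gaussian filter is used here only to show the idea.
import numpy as np
from scipy.ndimage import gaussian_filter1d

dx = 1.0                      # sampling spacing, um (hypothetical)
x = np.arange(0, 8000, dx)    # 8 mm evaluation length
rng = np.random.default_rng(1)
profile = 5 * np.sin(2 * np.pi * x / 2500) + rng.normal(0, 0.8, x.size)  # synthetic surface

cutoff = 800.0                # um, long-wavelength cutoff (hypothetical choice)
# For a Gaussian profile filter, sigma relates to the cutoff via sigma = cutoff * sqrt(ln2 / (2*pi^2)).
sigma = cutoff * np.sqrt(np.log(2) / (2 * np.pi ** 2)) / dx
waviness = gaussian_filter1d(profile, sigma)
roughness = profile - waviness

print(f"Rq of roughness component: {np.sqrt(np.mean(roughness**2)):.2f} um")
```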

Journal ArticleDOI
TL;DR: The theories behind crystallisation and acoustic cavitation are summarised, followed by a description of the currently proposed sonocrystallisation mechanisms, and the review concludes with an overview of future prospects for sonocrystallisation applications.

Journal ArticleDOI
TL;DR: In this article, the authors assess various policy needs for biomass data and recommend a long-term collaborative effort among forest biomass data producers and users to meet these needs; they also highlight the potential strength of the multitude of upcoming space-based missions, in combination, to provide for these varying needs and to ensure continuity of long-term data provision, which one-off research missions cannot provide.
Abstract: The achievement of international goals and national commitments related to forest conservation and management, climate change, and sustainable development requires credible, accurate, and reliable monitoring of stocks and changes in forest biomass and carbon. Most prominently, the Paris Agreement on Climate Change and the United Nations’ Sustainable Development Goals in particular require data on biomass to monitor progress. Unprecedented opportunities to provide forest biomass data are created by a series of upcoming space-based missions, many of which provide open data targeted at large areas and better spatial resolution biomass monitoring than has previously been achieved. We assess various policy needs for biomass data and recommend a long-term collaborative effort among forest biomass data producers and users to meet these needs. A gap remains, however, between what can be achieved in the research domain and what is required to support policy making and meet reporting requirements. There is no single biomass dataset that serves all users in terms of definition and type of biomass measurement, geographic area, and uncertainty requirements, and whether there is need for the most recent up-to-date biomass estimate or a long-term biomass trend. The research and user communities should embrace the potential strength of the multitude of upcoming missions in combination to provide for these varying needs and to ensure continuity for long-term data provision which one-off research missions cannot provide. International coordination bodies such as Global Forest Observations Initiative (GFOI), Committee on Earth Observation Satellites (CEOS), and Global Observation of Forest Cover and Land Dynamics (GOFC‐GOLD) will be integral in addressing these issues in a way that fulfils these needs in a timely fashion. Further coordination work should particularly look into how space-based data can be better linked with field reference data sources such as forest plot networks, and there is also a need to ensure that reference data cover a range of forest types, management regimes, and disturbance regimes worldwide.

Journal ArticleDOI
TL;DR: A simple, high-throughput screening method is established to identify optimal detergent conditions for membrane protein stabilization and conditions that are suitable for downstream handling of membrane proteins during purification.
Abstract: Protein stability in detergent or membrane-like environments is the bottleneck for structural studies on integral membrane proteins (IMPs). Irrespective of the method used to study the structure of an IMP, detergent solubilization from the membrane is usually the first step in the workflow. Here, we establish a simple, high-throughput screening method to identify optimal detergent conditions for membrane protein stabilization. We apply differential scanning fluorimetry in combination with scattering upon thermal denaturation to study the unfolding of integral membrane proteins. Nine different prokaryotic and eukaryotic membrane proteins were used as test cases to benchmark our detergent screening method. Our results show that it is possible to measure the stability and solubility of IMPs by diluting them from their initial solubilization condition into different detergents. We were able to identify groups of detergents with characteristic stabilization and destabilization effects for selected targets. We further show that fos-choline and PEG family detergents may lead to membrane protein destabilization and unfolding. Finally, we determined thermodynamic parameters that are important indicators of IMP stability. The described protocol allows the identification of conditions that are suitable for downstream handling of membrane proteins during purification.
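Screens of this kind usually report an apparent melting temperature per detergent condition. The sketch below fits a Boltzmann sigmoid to a synthetic unfolding curve to extract Tm; it illustrates the general analysis, not necessarily the exact pipeline used in the paper:

```python
# Sketch with synthetic data: extract an apparent melting temperature (Tm) from a
# differential scanning fluorimetry unfolding curve by fitting a Boltzmann sigmoid.
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(T, F_low, F_high, Tm, slope):
    """Two-state sigmoid: baseline fluorescence F_low, plateau F_high, midpoint Tm."""
    return F_low + (F_high - F_low) / (1.0 + np.exp((Tm - T) / slope))

temps = np.linspace(25, 95, 71)                               # degrees C
rng = np.random.default_rng(2)
signal = boltzmann(temps, 100, 900, 62.0, 2.5) + rng.normal(0, 15, temps.size)

popt, _ = curve_fit(boltzmann, temps, signal, p0=(signal.min(), signal.max(), 60.0, 2.0))
print(f"Apparent Tm in this detergent condition: {popt[2]:.1f} C")
# Ranking detergents by apparent Tm (and checking for scattering-based aggregation)
# identifies conditions that stabilise the target, as described in the abstract.
```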

Journal ArticleDOI
TL;DR: In this article, the authors review the main approaches and limitations in our current capability to diagnose the drivers of changes in atmospheric CH4 and, crucially, propose ways to improve this capability in the coming decade.
Abstract: The 2015 Paris Agreement of the United Nations Framework Convention on Climate Change aims to keep global average temperature increases well below 2 °C of preindustrial levels in the year 2100. Vital to its success is achieving a decrease in the abundance of atmospheric methane (CH4), the second most important anthropogenic greenhouse gas. If this reduction is to be achieved, individual nations must make and meet reduction goals in their nationally determined contributions, with regular and independently verifiable global stocktaking. Targets for the Paris Agreement have been set, and now the capability must follow to determine whether CH4 reductions are actually occurring. At present, however, there are significant limitations in the ability of scientists to quantify CH4 emissions accurately at global and national scales and to diagnose what mechanisms have altered trends in atmospheric mole fractions in the past decades. For example, in 2007, mole fractions suddenly started rising globally after a decade of almost no growth. More than a decade later, scientists are still debating the mechanisms behind this increase. This study reviews the main approaches and limitations in our current capability to diagnose the drivers of changes in atmospheric CH4 and, crucially, proposes ways to improve this capability in the coming decade. Recommendations include the following: (i) improvements to process-based models of the main sectors of CH4 emissions: proposed developments call for the expansion of tropical wetland flux measurements, bridging remote sensing products for improved measurement of wetland area and dynamics, expanding measurements of fossil fuel emissions at the facility and regional levels, expanding country-specific data on the composition of waste sent to landfill and the types of wastewater treatment systems implemented, characterizing and representing temporal profiles of crop growing seasons, implementing parameters related to ruminant emissions such as animal feed, and improving the detection of small fires associated with agriculture and deforestation; (ii) improvements to measurements of CH4 mole fraction and its isotopic variations: developments include greater vertical profiling at background sites, expanding networks of dense urban measurements with a greater focus on relatively poor countries, improving the precision of isotopic ratio measurements of 13CH4, CH3D, 14CH4, and clumped isotopes, creating isotopic reference materials for international-scale development, and expanding spatial and temporal characterization of isotopic source signatures; and (iii) improvements to inverse modeling systems to derive emissions from atmospheric measurements: advances are proposed in the areas of hydroxyl radical quantification, in systematic uncertainty quantification through validation of chemical transport models, in the use of source tracers for estimating sector-level emissions, and in the development of time- and space-resolved national inventories. These and other recommendations are proposed for the major areas of CH4 science with the aim of improving capability in the coming decade to quantify atmospheric CH4 budgets on the scales necessary for the success of climate policies. Plain Language Summary: Methane is the second largest contributor to climate warming from human activities since preindustrial times. Reducing human-made emissions by half is a major component of the 2015 Paris Agreement target to keep global temperature increases well below 2 °C.
In parallel to the methane emission reductions pledged by individual nations, new capabilities are needed to determine independently whether these reductions are actually occurring and whether methane concentrations in the atmosphere are changing for reasons that are clearly understood. At present significant challenges limit the ability of scientists to identify the mechanisms causing changes in atmospheric methane. This study reviews current and emerging tools in methane science and proposes major advances needed in the coming decade to achieve this crucial capability. We recommend further developing the models that simulate the processes behind methane emissions, improving atmospheric measurements of methane and its major carbon and hydrogen isotopes, and advancing abilities to infer the rates of methane being emitted and removed from the atmosphere from these measurements. The improvements described here will play a major role in assessing emissions commitments as more cities, states, and countries report methane emission inventories and commit to specific emission reduction targets.
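The attribution problem described above can be framed with a minimal one-box budget, dB/dt = E - B/tau. The sketch below uses assumed, illustrative numbers and is not a model from the study:

```python
# Minimal one-box sketch (not from the study): the global CH4 burden B responds to total
# emissions E and an effective atmospheric lifetime tau as dB/dt = E - B/tau.
# Illustrative numbers only; roughly 2.75 Tg CH4 corresponds to 1 ppb of global mole fraction.
import numpy as np

tau = 9.0            # years, assumed effective lifetime
E = 560.0            # Tg CH4 / yr, assumed total emissions
B = 1800.0 * 2.75    # Tg, burden corresponding to an assumed 1800 ppb

dt = 0.1
years = np.arange(2007, 2020, dt)
burden = []
for _ in years:
    B += (E - B / tau) * dt   # explicit Euler step of the box model
    burden.append(B)

print(f"Mole fraction after {years[-1]-years[0]:.0f} yr: {burden[-1]/2.75:.0f} ppb")
# With these assumed numbers the burden drifts toward the steady state E*tau; deciding
# whether observed growth reflects rising E or a changing tau (e.g. OH) is exactly the
# attribution problem the review discusses.
```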

Journal ArticleDOI
TL;DR: This protocol describes procedures that will enable researchers to reliably perform TERS imaging using a transmission-mode AFM-TERS configuration on both biological and non-biological samples, and it provides example data for a range of different sample types.
Abstract: Confocal and surface-enhanced Raman spectroscopy (SERS) are powerful techniques for molecular characterization; however, they suffer from the drawback of diffraction-limited spatial resolution. Tip-enhanced Raman spectroscopy (TERS) overcomes this limitation and provides chemical information at length scales in the tens of nanometers. In contrast to alternative approaches to nanoscale chemical analysis, TERS is label free, is non-destructive, and can be performed in both air and liquid environments, allowing its use in a diverse range of applications. Atomic force microscopy (AFM)-based TERS is especially versatile, as it can be applied to a broad range of samples on various substrates. Despite its advantages, widespread uptake of this technique for nanoscale chemical imaging has been inhibited by various experimental challenges, such as limited lifetime, and the low stability and yield of TERS probes. This protocol details procedures that will enable researchers to reliably perform TERS imaging using a transmission-mode AFM-TERS configuration on both biological and non-biological samples. The procedure consists of four stages: (i) preparation of plasmonically active TERS probes; (ii) alignment of the TERS system; (iii) experimental procedures for nanoscale imaging using TERS; and (iv) TERS data processing. We provide procedures and example data for a range of different sample types, including polymer thin films, self-assembled monolayers (SAMs) of organic molecules, photocatalyst surfaces, small molecules within biological cells, single-layer graphene and single-walled carbon nanotubes in both air and water. With this protocol, TERS probes can be prepared within ~23 h, and each subsequent TERS experimental procedure requires 3–5 h. This protocol describes how to perform nanoscale chemical imaging using tip-enhanced Raman spectroscopy (TERS). The procedure details the preparation of plasmonically active TERS probes, alignment of a TERS system, and various example procedures.

Journal ArticleDOI
TL;DR: Graphene grown on silicon carbide is found to be the most promising substrate for obtaining 1–5 nm thick Bi2Se3 films.
Abstract: Knowledge of the nucleation and further growth of Bi2Se3 nanoplates on different substrates is crucial for obtaining ultrathin nanostructures and films of this material by the physical vapour deposition technique. In this work, Bi2Se3 nanoplates were deposited under the same experimental conditions on different types of graphene substrates (as-transferred and post-annealed chemical vapour deposition grown monolayer graphene, and monolayer graphene grown on a silicon carbide substrate). The dimensions of the nanoplates deposited on the graphene substrates were compared with the dimensions of the nanoplates deposited on mechanically exfoliated mica and highly ordered pyrolytic graphite flakes used as reference substrates. The influence of the different graphene substrates on the nucleation and further lateral and vertical growth of the Bi2Se3 nanoplates is analysed, and the possibility of obtaining ultrathin Bi2Se3 films on these substrates is evaluated. Among the substrates considered in this work, graphene grown on silicon carbide is found to be the most promising for obtaining 1–5 nm thick Bi2Se3 films.

Journal ArticleDOI
TL;DR: In this article, the essential optimization mechanisms of preintercalation in improving electronic conductivity and ionic diffusion, inhibiting "lattice breathing" and screening the carrier charge are discussed.
Abstract: Rational design of the morphology and complementary compounding of electrode materials have contributed substantially to improving battery performance, yet the capabilities of conventional electrode materials have remained limited in some key parameters including energy and power density, cycling stability, etc. because of their intrinsic properties, especially the restricted thermodynamics of reactions and the inherent slow diffusion dynamics induced by the crystal structures. In contrast, preintercalation of ions or molecules into the crystal structure with/without further lattice reconstruction could provide fundamental optimizations to overcome these intrinsic limitations. In this Perspective, we discuss the essential optimization mechanisms of preintercalation in improving electronic conductivity and ionic diffusion, inhibiting “lattice breathing” and screening the carrier charge. We also summarize the current challenges in preintercalation and offer insights on future opportunities for the rational design of preintercalation electrodes in next-generation rechargeable batteries.

Journal ArticleDOI
TL;DR: In this paper, the authors demonstrate a method to control the movement of individual skyrmions by making use of the magnetic interaction between a sample and a magnetic force microscopy probe.
Abstract: Magnetic skyrmions are topologically protected spin textures, stabilised in systems with strong Dzyaloshinskii-Moriya interaction (DMI). Several studies have shown that electrical currents can move skyrmions efficiently through spin-orbit torques. While promising for technological applications, current-driven skyrmion motion is intrinsically collective and accompanied by undesired heating effects. Here we demonstrate a new approach to control individual skyrmion positions precisely, which relies on the magnetic interaction between a sample and a magnetic force microscopy (MFM) probe. We investigate perpendicularly magnetised X/CoFeB/MgO multilayers, where for X = W or Pt the DMI is sufficiently strong to allow for skyrmion nucleation in an applied field. We show that these skyrmions can be manipulated individually through the local field gradient generated by the scanning MFM probe with an unprecedented level of accuracy. Furthermore, we show that the probe stray field can assist skyrmion nucleation. Our proof-of-concept results pave the way towards achieving current-free skyrmion control. Skyrmions are topologically non-trivial spin textures which could be used as energy-efficient carriers of information in future memory devices, but reliable and efficient control of their movement is required first. Here, the authors demonstrate a method to control the movement of individual skyrmions by making use of the magnetic interaction between a sample and a magnetic force microscopy probe.

Journal ArticleDOI
TL;DR: In this paper, the results of a third benchmark exercise using in-plane permeability measurement, based on systems applying the radial unsaturated injection method, were presented, where 19 participants using 20 systems characterized a non-crimp and a woven fabric at three different fiber volume contents, using a commercially available silicone oil as impregnating fluid.
Abstract: Although good progress was made by two international benchmark exercises on in-plane permeability, existing methods have not yet been standardized. This paper presents the results of a third benchmark exercise on in-plane permeability measurement, based on systems applying the radial unsaturated injection method. 19 participants using 20 systems characterized a non-crimp and a woven fabric at three different fiber volume contents, using a commercially available silicone oil as the impregnating fluid. They followed a detailed characterization procedure and also completed a questionnaire on their set-up and analysis methods. Excluding outliers (2 of 20), the average coefficient of variation (cv) between the participants' results was 32% and 44% (non-crimp and woven fabric), while the average cv for individual participants was 8% and 12%, respectively. This indicates statistically significant variations between the measurement systems. Cavity deformation was identified as a major influence, besides fluid pressure/viscosity measurement, textile variations, and data analysis.
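The benchmark's headline statistics are coefficients of variation between and within participants. The sketch below shows how such cv figures are computed, using invented permeability values rather than the benchmark data:

```python
# Sketch with invented data: coefficient of variation (cv = std / mean) between
# participants' mean permeabilities vs. the average cv of each participant's repeats,
# the two statistics the benchmark reports.
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical: 18 participants, 6 repeat measurements each, permeability in 1e-11 m^2.
participant_means = rng.normal(5.0, 1.6, 18)
repeats = np.array([rng.normal(m, 0.10 * m, 6) for m in participant_means])

cv_between = np.std(repeats.mean(axis=1), ddof=1) / np.mean(repeats)
cv_within = np.mean(np.std(repeats, axis=1, ddof=1) / repeats.mean(axis=1))

print(f"Between-participant cv: {100*cv_between:.0f}%   within-participant cv: {100*cv_within:.0f}%")
```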

Journal ArticleDOI
TL;DR: In this paper, the decoherence of superconducting qubits is studied and the temporal stability of energy relaxation and dephasing is examined; the main finding is that fluctuations in qubit relaxation are local to the qubit and are caused by instabilities of near-resonant two-level systems (TLS).
Abstract: We benchmark the decoherence of superconducting qubits to examine the temporal stability of energy relaxation and dephasing. By collecting statistics during measurements spanning multiple days, we find the mean parameters $\overline{T_1}$ = 49 μs and $\overline{T_2^*}$ = 95 μs; however, both of these quantities fluctuate, explaining the need for frequent re-calibration in qubit setups. Our main finding is that fluctuations in qubit relaxation are local to the qubit and are caused by instabilities of near-resonant two-level systems (TLS). Through statistical analysis, we determine switching rates of these TLS and observe the coherent coupling between an individual TLS and a transmon qubit. Finally, we find evidence that the qubit's frequency stability is limited by capacitance noise. Importantly, this produces a 0.8 ms limit on the pure dephasing, which we also observe. Collectively, these findings raise the need for performing qubit metrology to examine the reproducibility of qubit parameters, where these fluctuations could affect qubit gate fidelity.

Journal ArticleDOI
TL;DR: It is reported that few-layer graphene yields of up to 18% in three hours can be achieved by optimising the inertial cavitation dose during ultrasonication; inertial cavitation is shown to preferentially exfoliate larger graphene flakes, which causes the exfoliation rate to decrease as a function of sonication time.
Abstract: Ultrasonication is widely used to exfoliate two dimensional (2D) van der Waals layered materials such as graphene. Its fundamental mechanism, inertial cavitation, is poorly understood and often ignored in ultrasonication strategies resulting in low exfoliation rates, low material yields and wide flake size distributions, making the graphene dispersions produced by ultrasonication less economically viable. Here we report that few-layer graphene yields of up to 18% in three hours can be achieved by optimising inertial cavitation dose during ultrasonication. We demonstrate that inertial cavitation preferentially exfoliates larger flakes and that the graphene exfoliation rate and flake dimensions are strongly correlated with, and therefore can be controlled by, inertial cavitation dose. Furthermore, inertial cavitation is shown to preferentially exfoliate larger graphene flakes which causes the exfoliation rate to decrease as a function of sonication time. This study demonstrates that measurement and control of inertial cavitation is critical in optimising the high yield sonication-assisted aqueous liquid phase exfoliation of size-selected nanomaterials. Future development of this method should lead to the development of high volume flow cell production of 2D van der Waals layered nanomaterials.

Journal ArticleDOI
TL;DR: In this article, the authors presented a new approach in the study of wind damage: combining terrestrial laser scanning (TLS) data and finite element analysis, which is applicable at the plot level and could also be applied to open-grown trees, such as in cities or parks.

Journal ArticleDOI
TL;DR: In this article, a microresonator-based frequency comb (microcomb) was used to generate a stable terahertz wave at the soliton repetition rate (331 GHz).
Abstract: The terahertz or millimeter-wave frequency band (300 GHz to 3 THz) is spectrally located between microwaves and infrared light and has attracted significant interest for applications in broadband wireless communications, space-borne radiometers for Earth remote sensing, astrophysics, and imaging. In particular, optically generated THz waves are of high interest for low-noise signal generation. Here, we propose and demonstrate stabilized terahertz wave generation using a microresonator-based frequency comb (microcomb). A uni-travelling-carrier photodiode (UTC-PD) converts low-noise optical soliton pulses from the microcomb to a terahertz wave at the soliton's repetition rate (331 GHz). With a free-running microcomb, the Allan deviation of the terahertz signal is 4.5×10⁻⁹ at 1 s measurement time, with a phase noise of -72 dBc/Hz (-118 dBc/Hz) at 10 kHz (10 MHz) offset frequency. By locking the repetition rate to an in-house hydrogen maser, in-loop fractional frequency stabilities of 9.6×10⁻¹⁵ and 1.9×10⁻¹⁷ are obtained at averaging times of 1 s and 2000 s, respectively, indicating that the stability of the generated THz wave is limited by the maser reference signal. Moreover, the terahertz signal is successfully used to perform a proof-of-principle demonstration of terahertz imaging of peanuts. Combining the monolithically integrated UTC-PD with an on-chip microcomb, the demonstrated technique could provide a route towards highly stable continuous terahertz wave generation in chip-scale packages for out-of-the-lab applications. In particular, such systems would be useful as compact tools for high-capacity wireless communication, spectroscopy, imaging, remote sensing, and astrophysical applications.
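The stability figures above are Allan deviations. The sketch below computes a non-overlapping Allan deviation from a synthetic fractional-frequency record, as an illustration of how such numbers are obtained (it is not the measurement code):

```python
# Sketch with synthetic data: non-overlapping Allan deviation of fractional-frequency
# samples y_k, sigma_y(tau) = sqrt( mean( (y_{k+1} - y_k)^2 ) / 2 ), after averaging
# the record into bins of length tau.
import numpy as np

def allan_deviation(y, dt, tau):
    """Allan deviation at averaging time tau for samples y taken every dt seconds."""
    m = int(round(tau / dt))                      # samples per averaging bin
    n_bins = len(y) // m
    y_avg = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
    diffs = np.diff(y_avg)
    return np.sqrt(0.5 * np.mean(diffs ** 2))

rng = np.random.default_rng(4)
y = rng.normal(0, 5e-9, 100_000)                  # hypothetical white fractional-frequency noise, dt = 1 s
for tau in (1, 10, 100):
    print(f"sigma_y({tau} s) = {allan_deviation(y, 1.0, tau):.1e}")
```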

Journal ArticleDOI
TL;DR: In this article, an approach to control individual skyrmion positions precisely, which relies on the magnetic interaction between sample and a magnetic force microscopy (MFM) probe, is presented.
Abstract: Magnetic skyrmions are topologically protected spin textures, stabilised in systems with strong Dzyaloshinskii-Moriya interaction (DMI). Several studies have shown that electrical currents can move skyrmions efficiently through spin-orbit torques. While promising for technological applications, current-driven skyrmion motion is intrinsically collective and accompanied by undesired heating effects. Here we demonstrate a new approach to control individual skyrmion positions precisely, which relies on the magnetic interaction between a sample and a magnetic force microscopy (MFM) probe. We investigate perpendicularly magnetised X/CoFeB/MgO multilayers, where for X = W or Pt the DMI is sufficiently strong to allow for skyrmion nucleation in an applied field. We show that these skyrmions can be manipulated individually through the local field gradient generated by the scanning MFM probe with an unprecedented level of accuracy. Furthermore, we show that the probe stray field can assist skyrmion nucleation. Our proof-of-concept results offer current-free routes to efficient individual skyrmion control.

Journal ArticleDOI
TL;DR: In this paper, a new method for in situ monitoring of internal degradation in Li-ion batteries using an electrochemical model (EM) is introduced, based on a single particle model, the parameters of which are reorganized from the original physical property parameters.

Journal ArticleDOI
TL;DR: In this paper, the authors present the results of research to develop a novel analytical framework for assessing the potential of Earth Observation (EO) approaches to populate the SDG indicators.
Abstract: In 2015, member countries of the United Nations adopted the 17 Sustainable Development Goals (SDGs) at the Sustainable Development Summit in New York. These global goals have 169 targets and 232 indicators which are based on the three pillars of sustainable development: economic, social and environmental. Substantial challenges remain in obtaining data of the required quality, especially in developing countries, given the often limited resources available. One promising and innovative way of addressing this issue of data availability is to use Earth Observation (EO). This paper presents the results of research to develop a novel analytical framework for assessing the potential of EO approaches to populate the SDG indicators. We present a Maturity Matrix Framework (MMF) and apply it to all of the 232 SDG indicators. The results demonstrate that while the applicability of EO-derived data does vary between the SDG indicators, overall, EO has an important contribution to make towards populating a wide diversity of the SDG indicators.

Journal ArticleDOI
TL;DR: An integrated multilayer microfluidic total analysis system is reported that quantifies the membranolytic activity of polypeptide antibiotics on biomimetic lipid vesicles.
Abstract: The spread of bacterial resistance against conventional antibiotics generates a great need for the discovery of novel antimicrobials. Polypeptide antibiotics constitute a promising class of antimicrobial agents that favour attack on bacterial membranes. However, efficient measurement platforms for evaluating their mechanisms of action in a systematic manner are lacking. Here we report an integrated lab-on-a-chip multilayer microfluidic platform to quantify the membranolytic efficacy of such antibiotics. The platform is a biomimetic vesicle-based screening assay, which generates giant unilamellar vesicles (GUVs) in physiologically relevant buffers on demand. Hundreds of these GUVs are individually immobilised downstream in physical traps connected to separate perfusion inlets that facilitate controlled antibiotic delivery. Antibiotic efficacy is expressed as a function of the time needed for an encapsulated dye to leak out of the GUVs as a result of antibiotic treatment. This proof-of-principle study probes the dose response of an archetypal polypeptide antibiotic cecropin B on GUVs mimicking bacterial membranes. The results of the study provide a foundation for engineering quantitative, high-throughput microfluidics devices for screening antibiotics.
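The leakage-time readout described above lends itself to a dose-response analysis. The sketch below fits a Hill curve to invented leakage fractions as a function of cecropin B concentration; it is an illustration, not the study's analysis:

```python
# Sketch with invented data: fit a Hill dose-response curve to the fraction of trapped
# GUVs that have leaked their dye at each antibiotic concentration, giving an apparent
# EC50 for membranolytic activity.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, ec50, n):
    """Fraction responding at concentration c for midpoint ec50 and Hill coefficient n."""
    return c ** n / (ec50 ** n + c ** n)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])              # uM, hypothetical cecropin B doses
frac_leaked = np.array([0.02, 0.08, 0.30, 0.65, 0.90, 0.98])   # invented fractions of GUVs lysed

popt, _ = curve_fit(hill, conc, frac_leaked, p0=(2.0, 1.5))
print(f"Apparent EC50 = {popt[0]:.1f} uM, Hill coefficient = {popt[1]:.1f}")
```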