
Showing papers by "University of Twente" published in 2019


Journal ArticleDOI
TL;DR: An expert elicitation survey estimates yield losses for the five major food crops worldwide, suggesting that the highest losses are associated with food-deficit regions with fast-growing populations and frequently with emerging or re-emerging pests and diseases.
Abstract: Crop pathogens and pests reduce the yield and quality of agricultural production. They cause substantial economic losses and reduce food security at household, national and global levels. Quantitative, standardized information on crop losses is difficult to compile and compare across crops, agroecosystems and regions. Here, we report on an expert-based assessment of crop health, and provide numerical estimates of yield losses on an individual pathogen and pest basis for five major crops globally and in food security hotspots. Our results document losses associated with 137 pathogens and pests associated with wheat, rice, maize, potato and soybean worldwide. Our yield loss (range) estimates at a global level and per hotspot for wheat (21.5% (10.1–28.1%)), rice (30.0% (24.6–40.9%)), maize (22.5% (19.5–41.1%)), potato (17.2% (8.1–21.0%)) and soybean (21.4% (11.0–32.4%)) suggest that the highest losses are associated with food-deficit regions with fast-growing populations, and frequently with emerging or re-emerging pests and diseases. Our assessment highlights differences in impacts among crop pathogens and pests and among food security hotspots. This analysis contributes critical information to prioritize crop health management to improve the sustainability of agroecosystems in delivering services to societies.

1,376 citations


Journal ArticleDOI
TL;DR: The use of nanomedicine in cancer requires the adoption of specific strategies to optimize its potential; this perspective proposes four: identifying patients for clinical trials, investing in modular nanocarrier design, integrating nanomedicines into multimodal combination therapy regimens, and including them in immunotherapy studies.
Abstract: Nanomedicines are extensively employed in cancer therapy. We here propose four strategic directions to improve nanomedicine translation and exploitation. (1) Patient stratification has become common practice in oncology drug development. Accordingly, probes and protocols for patient stratification are urgently needed in cancer nanomedicine, to identify individuals suitable for inclusion in clinical trials. (2) Rational drug selection is crucial for clinical and commercial success. Opportunistic choices based on drug availability should be replaced by investments in modular (pro)drug and nanocarrier design. (3) Combination therapies are the mainstay of clinical cancer care. Nanomedicines synergize with pharmacological and physical co-treatments, and should be increasingly integrated in multimodal combination therapy regimens. (4) Immunotherapy is revolutionizing the treatment of cancer. Nanomedicines can modulate the behaviour of myeloid and lymphoid cells, thereby empowering anticancer immunity and immunotherapy efficacy. Alone and especially together, these four directions will fuel and foster the development of successful cancer nanomedicine therapies.

659 citations


Journal ArticleDOI
TL;DR: The synthesis, surface functionalization and characterization of iron oxide nanoparticles, as well as their (pre‐) clinical use in diagnostic, therapeutic and theranostic settings, are summarized.

618 citations


Journal ArticleDOI
TL;DR: The Third Pole (TP) is experiencing rapid warming and is currently in its warmest period in the past 2,000 years; this paper reviews the latest developments in multidisciplinary TP research.
Abstract: The Third Pole (TP) is experiencing rapid warming and is currently in its warmest period in the past 2,000 years. This paper reviews the latest development in multidisciplinary TP research ...

530 citations


Journal ArticleDOI
TL;DR: This article describes a community initiative to identify major unsolved scientific problems in hydrology, motivated by a need for stronger harmonisation of research efforts. Despite the diversity of the participants (230 scientists in total), the process revealed much about community priorities and the state of the science: a preference for continuity in research questions rather than radical departures or redirections from past and current work.
Abstract: This paper is the outcome of a community initiative to identify major unsolved scientific problems in hydrology motivated by a need for stronger harmonisation of research efforts. The procedure involved a public consultation through online media, followed by two workshops through which a large number of potential science questions were collated, prioritised, and synthesised. In spite of the diversity of the participants (230 scientists in total), the process revealed much about community priorities and the state of our science: a preference for continuity in research questions rather than radical departures or redirections from past and current work. Questions remain focused on the process-based understanding of hydrological variability and causality at all space and time scales. Increased attention to environmental change drives a new emphasis on understanding how change propagates across interfaces within the hydrological system and across disciplinary boundaries. In particular, the expansion of the human footprint raises a new set of questions related to human interactions with nature and water cycle feedbacks in the context of complex water management problems. We hope that this reflection and synthesis of the 23 unsolved problems in hydrology will help guide research efforts for some years to come.

469 citations


Journal ArticleDOI
TL;DR: In this paper, the authors analyze how earthquakes trigger landslides and longer-lasting surface processes, highlight research gaps, and suggest pathways toward a more complete understanding of the seismic effects on the Earth's surface.
Abstract: Large earthquakes initiate chains of surface processes that last much longer than the brief moments of strong shaking. Most moderate‐ and large‐magnitude earthquakes trigger landslides, ranging from small failures in the soil cover to massive, devastating rock avalanches. Some landslides dam rivers and impound lakes, which can collapse days to centuries later, and flood mountain valleys for hundreds of kilometers downstream. Landslide deposits on slopes can remobilize during heavy rainfall and evolve into debris flows. Cracks and fractures can form and widen on mountain crests and flanks, promoting increased frequency of landslides that lasts for decades. More gradual impacts involve the flushing of excess debris downstream by rivers, which can generate bank erosion and floodplain accretion as well as channel avulsions that affect flooding frequency, settlements, ecosystems, and infrastructure. Ultimately, earthquake sequences and their geomorphic consequences alter mountain landscapes over both human and geologic time scales. Two recent events have attracted intense research into earthquake‐induced landslides and their consequences: the magnitude M 7.6 Chi‐Chi, Taiwan earthquake of 1999, and the M 7.9 Wenchuan, China earthquake of 2008. Using data and insights from these and several other earthquakes, we analyze how such events initiate processes that change mountain landscapes, highlight research gaps, and suggest pathways toward a more complete understanding of the seismic effects on the Earth's surface.

424 citations


Journal ArticleDOI
TL;DR: The study finds that a diversity in access to devices and peripherals, device-related opportunities, and the ongoing expenses required to maintain the hardware, software, and subscriptions affect existing inequalities related to Internet skills, uses, and outcomes.
Abstract: For a long time, a common opinion among policy-makers was that the digital divide problem would be solved when a country’s Internet connection rate reaches saturation. However, scholars of the second-level digital divide have concluded that the divides in Internet skills and type of use continue to expand even after physical access is universal. This study—based on an online survey among a representative sample of the Dutch population—indicates that the first-level digital divide remains a problem in one of the richest and most technologically advanced countries in the world. By extending basic physical access with material access, the study finds that diversity in access to devices and peripherals, device-related opportunities, and the ongoing expenses required to maintain hardware, software, and subscriptions affect existing inequalities related to Internet skills, uses, and outcomes.

371 citations


Journal ArticleDOI
TL;DR: This review distills the historical and current developments spanning the last several decades of SIF heritage and complementarity within the broader field of fluorescence science, the maturation of physiological and radiative transfer modelling, SIF signal retrieval strategies, techniques for field and airborne sensing, advances in satellite-based systems, and applications of these capabilities in evaluation of photosynthesis and stress effects.

313 citations


Journal ArticleDOI
TL;DR: PTM scaffolds are promising composite biomaterials for repairing challenging bone defects and have great potential for clinical translation.

306 citations


Journal ArticleDOI
TL;DR: To ensure efficient translation of nano-immunotherapy constructs and concepts, biomarkers have to be considered in their clinical development, to make sure that the right nanomedicine formulation is combined with the right immunotherapy in the right patient.
Abstract: Nanomedicine holds significant potential to improve the efficacy of cancer immunotherapy. Thus far, nanomedicines, i.e., 1-100(0) nm sized drug delivery systems, have been primarily used to improve the balance between the efficacy and toxicity of conjugated or entrapped chemotherapeutic drugs. The clinical performance of cancer nanomedicines has been somewhat disappointing, which is arguably mostly due to the lack of tools and technologies for patient stratification. Conversely, the clinical progress made with immunotherapy has been spectacular, achieving complete cures and inducing long-term survival in advanced-stage patients. Unfortunately, however, immunotherapy only works well in relatively small subsets of patients. Increasing amounts of preclinical and clinical data demonstrate that combining nanomedicine with immunotherapy can boost therapeutic outcomes, by turning "cold" nonimmunoresponsive tumors and metastases into "hot" immunoresponsive lesions. Nano-immunotherapy can be realized via three different approaches, in which nanomedicines are used (1) to target cancer cells, (2) to target the tumor immune microenvironment, and (3) to target the peripheral immune system. When targeting cancer cells, nanomedicines typically aim to induce immunogenic cell death, thereby triggering the release of tumor antigens and danger-associated molecular patterns, such as calreticulin translocation, high mobility group box 1 protein and adenosine triphosphate. The latter serve as adjuvants to alert antigen-presenting cells to take up, process and present the former, thereby promoting the generation of CD8+ cytotoxic T cells. Nanomedicines targeting the tumor immune microenvironment potentiate cancer immunotherapy by inhibiting immunosuppressive cells, such as M2-like tumor-associated macrophages, as well as by reducing the expression of immunosuppressive molecules, such as transforming growth factor beta. 
In addition, nanomedicines can be employed to promote the activity of antigen-presenting cells and cytotoxic T cells in the tumor immune microenvironment. Nanomedicines targeting the peripheral immune system aim to enhance antigen presentation and cytotoxic T cell production in secondary lymphoid organs, such as lymph nodes and spleen, as well as to engineer and strengthen peripheral effector immune cell populations, thereby promoting anticancer immunity. While the majority of immunomodulatory nanomedicines are in preclinical development, exciting results have already been reported in initial clinical trials. To ensure efficient translation of nano-immunotherapy constructs and concepts, we have to consider biomarkers in their clinical development, to make sure that the right nanomedicine formulation is combined with the right immunotherapy in the right patient. In this context, we have to learn from currently ongoing efforts in nano-biomarker identification as well as from partially already established immuno-biomarker initiatives, such as the Immunoscore and the cancer immunogram. Together, these protocols will help to capture the nano-immuno status in individual patients, enabling the identification and use of individualized and improved nanomedicine-based treatments to boost the performance of cancer immunotherapy.

273 citations


Journal ArticleDOI
TL;DR: This study provides a first spatial overview of key processes affecting pesticide fate that can be used to identify areas potentially vulnerable to pesticide accumulation, and compiles a database of studies that measured pesticide residues in Africa.
Abstract: The application of agricultural pesticides in Africa can have negative effects on human health and the environment. The aim of this study was to identify African environments that are vulnerable to the accumulation of pesticides by mapping geospatial processes affecting pesticide fate. The study modelled processes associated with the environmental fate of agricultural pesticides using publicly available geospatial datasets. Key geospatial processes affecting the environmental fate of agricultural pesticides were selected after a review of pesticide fate models and maps for leaching, surface runoff, sedimentation, soil storage and filtering capacity, and volatilization were created. The potential and limitations of these maps are discussed. We then compiled a database of studies that measured pesticide residues in Africa. The database contains 10,076 observations, but only a limited number of observations remained when a standard dataset for one compound was extracted for validation. Despite the need for more in-situ data on pesticide residues and application, this study provides a first spatial overview of key processes affecting pesticide fate that can be used to identify areas potentially vulnerable to pesticide accumulation.

Journal ArticleDOI
TL;DR: In this article, the authors performed thermogravimetric and thermodynamic analyses of high-ash sewage sludge, estimating the activation energy and pre-exponential factor from mass loss data using five major reaction mechanisms, all of which showed positive ΔH except F1.5.

Journal ArticleDOI
29 Apr 2019-Small
TL;DR: This review covers the history of bioprinting and the most recent advances in instrumentation and methods, and focuses on the requirements for bioinks and cells to achieve optimal fabrication of biomimetic constructs.
Abstract: Over the last decades, the fabrication of 3D tissues has become commonplace in tissue engineering and regenerative medicine. However, conventional 3D biofabrication techniques such as scaffolding, microengineering, and fiber and cell sheet engineering are limited in their capacity to fabricate complex tissue constructs with the required precision and controllability that is needed to replicate biologically relevant tissues. To this end, 3D bioprinting offers great versatility to fabricate biomimetic, volumetric tissues that are structurally and functionally relevant. It enables precise control of the composition, spatial distribution, and architecture of resulting constructs facilitating the recapitulation of the delicate shapes and structures of targeted organs and tissues. This Review systematically covers the history of bioprinting and the most recent advances in instrumentation and methods. It then focuses on the requirements for bioinks and cells to achieve optimal fabrication of biomimetic constructs. Next, emerging evolutions and future directions of bioprinting are discussed, such as freeform, high-resolution, multimaterial, and 4D bioprinting. Finally, the translational potential of bioprinting and bioprinted tissues of various categories are presented and the Review is concluded by exemplifying commercially available bioprinting platforms.

Journal ArticleDOI
TL;DR: This meta-analysis of how quickly drivers take over control of the vehicle in response to a critical event or a take-over request points to directions for new research, in particular concerning the distinction between drivers' ability and motivation to take over, and the roles of urgency and prior experience.
Abstract: An important question in automated driving research is how quickly drivers take over control of the vehicle in response to a critical event or a take-over request. Although a large number of studies have been performed, results vary strongly. In this study, we investigated mean take-over times from 129 studies with SAE level 2 automation or higher. We used three complementary approaches: (1) a within-study analysis, in which differences in mean take-over time were assessed for pairs of experimental conditions, (2) a between-study analysis, in which correlations between experimental conditions and mean take-over times were assessed, and (3) a linear mixed-effects model combining between-study and within-study effects. The three methods showed that a shorter mean take-over time is associated with a higher urgency of the situation, not using a handheld device, not performing a visual non-driving task, having experienced another take-over scenario before in the experiment, and receiving an auditory or vibrotactile take-over request as compared to a visual-only or no take-over request. A consistent effect of age was not observed. We also found the mean and standard deviation of the take-over time were highly correlated, indicating that the mean is predictive of variability. Our findings point to directions for new research, in particular concerning the distinction between drivers’ ability and motivation to take over, and the roles of urgency and prior experience.
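
The study combined between- and within-study effects with a linear mixed-effects model; as a much simpler illustration of how study-level take-over times can be aggregated, the sketch below pools hypothetical study means by inverse-variance weighting. The function name and the example values are illustrative, not taken from the paper.

```python
def pooled_mean(means, variances):
    """Inverse-variance weighted pooled mean of study-level take-over times (s).

    A fixed-effect simplification for illustration; the paper itself used a
    linear mixed-effects model combining between- and within-study effects.
    """
    weights = [1.0 / v for v in variances]
    return sum(w * m for w, m in zip(weights, means)) / sum(weights)

# Two hypothetical studies with equal variance contribute equally:
# pooled_mean([2.0, 4.0], [1.0, 1.0]) -> 3.0
```

With unequal variances, the more precise study dominates: `pooled_mean([2.0, 4.0], [1.0, 4.0])` gives 2.4, pulled toward the low-variance study.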

Journal ArticleDOI
TL;DR: This review describes the current state of the art of low-field MRI systems (defined as 0.25–1T) with respect to their low cost, small footprint, and subject accessibility, and discusses how low-field MRI could potentially benefit from many of the developments that have occurred in higher-field MRI.
Abstract: Historically, clinical MRI started with main magnetic field strengths in the ∼0.05-0.35T range. In the past 40 years there have been considerable developments in MRI hardware, with one of the primary ones being the trend to higher magnetic fields. While resulting in large improvements in data quality and diagnostic value, such developments have meant that conventional systems at 1.5 and 3T remain relatively expensive pieces of medical imaging equipment, and are out of the financial reach for much of the world. In this review we describe the current state-of-the-art of low-field systems (defined as 0.25-1T), both with respect to its low cost, low foot-print, and subject accessibility. Furthermore, we discuss how low field could potentially benefit from many of the developments that have occurred in higher-field MRI. In the first section, the signal-to-noise ratio (SNR) dependence on the static magnetic field and its impact on the achievable contrast, resolution, and acquisition times are discussed from a theoretical perspective. In the second section, developments in hardware (eg, magnet, gradient, and RF coils) used both in experimental low-field scanners and also those that are currently in the market are reviewed. In the final section the potential roles of new acquisition readouts, motion tracking, and image reconstruction strategies, currently being developed primarily at higher fields, are presented. Level of Evidence: 5 Technical Efficacy Stage: 1 J. Magn. Reson. Imaging 2019.
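
The SNR dependence on static field discussed in the review's first section can be sketched as a power law. The 7/4 exponent used below is a commonly quoted approximation for the coil-noise-dominated (low-field) regime and is an assumption here, not a figure from the paper; the actual scaling depends on hardware and the dominant noise source.

```python
def relative_snr(b0_tesla, b0_ref=1.5, exponent=7 / 4):
    """Relative SNR versus a reference field, assuming SNR ∝ B0**exponent.

    exponent=7/4 is a textbook approximation for the coil-noise-dominated
    regime; body-noise dominance at high field gives closer-to-linear scaling.
    """
    return (b0_tesla / b0_ref) ** exponent

# Under this assumption, a 0.5T scanner retains roughly 15% of the SNR
# of a 1.5T reference system, before accounting for sequence or hardware gains.
```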

Journal ArticleDOI
TL;DR: In this paper, the authors present a system of performance metrics and reporting conditions that addresses the problem that the separation conditions under which such metrics are measured are often not specified, which allows optimal performance to be reported at minimal removal.

Journal ArticleDOI
TL;DR: Novel 3D‐bioprinted mini‐brains consisting of glioblastoma cells and macrophages are presented as a tool to study the interactions between these two cell types and to test therapeutics that target this interaction.
Abstract: Glioblastoma-associated macrophages (GAMs) play a crucial role in the progression and invasiveness of glioblastoma multiforme (GBM); however, the exact crosstalk between GAMs and glioblastoma cells is not fully understood. Furthermore, there is a lack of relevant in vitro models to mimic their interactions. Here, novel 3D-bioprinted mini-brains consisting of glioblastoma cells and macrophages are presented as tool to study the interactions between these two cell types and to test therapeutics that target this interaction. It is demonstrated that in the mini-brains, glioblastoma cells actively recruit macrophages and polarize them into a GAM-specific phenotype, showing clinical relevance to transcriptomic and patient survival data. Furthermore, it is shown that macrophages induce glioblastoma cell progression and invasiveness in the mini-brains. Finally, it is demonstrated how therapeutics can inhibit the interaction between GAMs and tumor cells resulting in reduced tumor growth and more sensitivity to chemotherapy. It is envisioned that this 3D-bioprinted tumor model is used to improve the understanding of tumor biology and for evaluating novel cancer therapeutics.

Journal ArticleDOI
19 Mar 2019-Sensors
TL;DR: This work proposes a generic approach to enabling spatiotemporal capabilities in information services for smart cities, adopting a multidisciplinary approach to data integration and real-time processing and developing a reference architecture for event-driven applications.
Abstract: Smart cities are urban environments where Internet of Things (IoT) devices provide a continuous source of data about urban phenomena such as traffic and air pollution. The exploitation of the spatial properties of data enables situation and context awareness. However, the integration and analysis of data from IoT sensing devices remain a crucial challenge for the development of IoT applications in smart cities. Existing approaches provide no or limited ability to perform spatial data analysis, even when spatial information plays a significant role in decision making across many disciplines. This work proposes a generic approach to enabling spatiotemporal capabilities in information services for smart cities. We adopted a multidisciplinary approach to achieving data integration and real-time processing, and developed a reference architecture for the development of event-driven applications. This type of applications seamlessly integrates IoT sensing devices, complex event processing, and spatiotemporal analytics through a processing workflow for the detection of geographic events. Through the implementation and testing of a system prototype, built upon an existing sensor network, we demonstrated the feasibility, performance, and scalability of event-driven applications to achieve real-time processing capabilities and detect geographic events.
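
The core of the detection workflow described above (combining a spatial filter with a value condition over streaming sensor readings) can be sketched minimally as follows. This is an illustrative stand-in, not the paper's reference architecture: the class and function names are hypothetical, and a real deployment would use a complex event processing engine rather than a plain loop.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    lon: float
    lat: float
    value: float  # e.g., a PM2.5 concentration

def detect_events(readings, threshold, bbox):
    """Yield an event for each reading inside bbox whose value exceeds threshold.

    bbox = (min_lon, min_lat, max_lon, max_lat): a simple spatial predicate
    standing in for full spatiotemporal analytics.
    """
    min_lon, min_lat, max_lon, max_lat = bbox
    for r in readings:
        inside = min_lon <= r.lon <= max_lon and min_lat <= r.lat <= max_lat
        if inside and r.value > threshold:
            yield {"sensor": r.sensor_id, "value": r.value,
                   "location": (r.lon, r.lat)}
```

In an event-driven design, the emitted dictionaries would be published to downstream consumers (dashboards, alerting services) rather than collected in a list.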

Journal ArticleDOI
TL;DR: This work provides an overview of LSCI as a tool to image tissue perfusion, presents a brief introduction to the theory, reviews clinical studies from various medical fields, and discusses current limitations impeding clinical acceptance.
Abstract: When a biological tissue is illuminated with coherent light, an interference pattern will be formed at the detector, the so-called speckle pattern. Laser speckle contrast imaging (LSCI) is a technique based on the dynamic change in this backscattered light as a result of interaction with red blood cells. It can be used to visualize perfusion in various tissues and, even though this technique has been extensively described in the literature, the actual clinical implementation lags behind. We provide an overview of LSCI as a tool to image tissue perfusion. We present a brief introduction to the theory, review clinical studies from various medical fields, and discuss current limitations impeding clinical acceptance.
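
The speckle contrast underlying LSCI is conventionally defined as K = σ/⟨I⟩ computed over a local window of the intensity image. A minimal NumPy sketch is given below; the window size is an arbitrary choice for illustration, and real LSCI processing additionally relates K to exposure time and decorrelation models.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(intensity, window=7):
    """Local speckle contrast K = sigma / mean over a sliding window.

    Moving scatterers (e.g., red blood cells) blur the speckle pattern over
    the exposure time, lowering K; static regions keep K high.
    """
    patches = sliding_window_view(np.asarray(intensity, dtype=float),
                                  (window, window))
    return patches.std(axis=(-2, -1)) / patches.mean(axis=(-2, -1))
```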

Journal ArticleDOI
TL;DR: Organs-on-chips can be ‘personalised’ so they can be used as functional tests to inform clinical decision-making for specific patients.
Abstract: Organs-on-chips are microfluidic systems with controlled, dynamic microenvironments in which cultured cells exhibit functions that emulate organ-level physiology. They can in principle be ‘personalised’ to reflect individual physiology, for example by including blood samples, primary human tissue, and cells derived from induced pluripotent stem cell-derived cells, as well as by tuning key physico-chemical parameters of the cell culture microenvironment based on personal health data. The personalised nature of such systems, combined with physiologically relevant read-outs, provides new opportunities for person-specific assessment of drug efficacy and safety, as well as personalised strategies for disease prevention and treatment; together, this is known as ‘precision medicine’. There are multiple reports of how to personalise organs-on-chips, with examples including airway-on-a-chip systems containing primary patient alveolar epithelial cells, vessels-on-chips with shapes based on personal biomedical imaging data and lung-on-a-chip systems that can be exposed to various regimes of cigarette smoking. In addition, multi-organ chip systems even allow the systematic and dynamic integration of more complex combinations of personalised cell culture parameters. Current personalised organs-on-chips have not yet been used for precision medicine as such. The major challenges that affect the implementation of personalised organs-on-chips in precision medicine are related to obtaining access to personal samples and corresponding health data, as well as to obtaining data on patient outcomes that can confirm the predictive value of personalised organs-on-chips. We argue here that involving all biomedical stakeholders from clinicians and patients to pharmaceutical companies will be integral to transition personalised organs-on-chips to precision medicine.

Journal ArticleDOI
01 Mar 2019-Stroke
TL;DR: This review summarizes the state of the art regarding kinematic upper limb assessments poststroke with respect to the assessment task, measurement system, and performance metrics with their clinimetric properties, and provides evidence-based recommendations for future applications of upper limb kinematics in stroke recovery research.
Abstract: Background and Purpose- Assessing upper limb movements poststroke is crucial to monitor and understand sensorimotor recovery. Kinematic assessments are expected to enable a sensitive quantification of movement quality and distinguish between restitution and compensation. The nature and practice of these assessments are highly variable and used without knowledge of their clinimetric properties. This presents a challenge when interpreting and comparing results. The purpose of this review was to summarize the state of the art regarding kinematic upper limb assessments poststroke with respect to the assessment task, measurement system, and performance metrics with their clinimetric properties. Subsequently, we aimed to provide evidence-based recommendations for future applications of upper limb kinematics in stroke recovery research. Methods- A systematic search was conducted in PubMed, Embase, CINAHL, and IEEE Xplore. Studies investigating clinimetric properties of applied metrics were assessed for risk of bias using the Consensus-Based Standards for the Selection of Health Measurement Instruments checklist. The quality of evidence for metrics was determined according to the Grading of Recommendations Assessment, Development, and Evaluation approach. Results- A total of 225 studies (N=6197) using 151 different kinematic metrics were identified and allocated to 5 task and 3 measurement system groups. Thirty studies investigated clinimetrics of 62 metrics: reliability (n=8), measurement error (n=5), convergent validity (n=22), and responsiveness (n=2). The metrics task/movement time, number of movement onsets, number of movement ends, path length ratio, peak velocity, number of velocity peaks, trunk displacement, and shoulder flexion/extension received a sufficient evaluation for one clinimetric property. 
Conclusions- Studies on kinematic assessments of upper limb sensorimotor function are poorly standardized and rarely investigate clinimetrics in an unbiased manner. Based on the available evidence, recommendations on the assessment task, measurement system, and performance metrics were made with the goal to increase standardization. Further high-quality studies evaluating clinimetric properties are needed to validate kinematic assessments, with the long-term goal to elucidate upper limb sensorimotor recovery poststroke. Clinical Trial Registration- URL: https://www.crd.york.ac.uk/prospero/ . Unique identifier: CRD42017064279.

Journal ArticleDOI
Roy Burstein, Nathaniel J. Henry, Michael Collison, Laurie B. Marczak, +663 more authors (290 institutions)
16 Oct 2019-Nature
TL;DR: A high-resolution, global atlas of mortality of children under five years of age between 2000 and 2017 highlights subnational geographical inequalities in the distribution, rates and absolute counts of child deaths by age.
Abstract: Since 2000, many countries have achieved considerable success in improving child survival, but localized progress remains unclear. To inform efforts towards United Nations Sustainable Development Goal 3.2—to end preventable child deaths by 2030—we need consistently estimated data at the subnational level regarding child mortality rates and trends. Here we quantified, for the period 2000–2017, the subnational variation in mortality rates and number of deaths of neonates, infants and children under 5 years of age within 99 low- and middle-income countries using a geostatistical survival model. We estimated that 32% of children under 5 in these countries lived in districts that had attained rates of 25 or fewer child deaths per 1,000 live births by 2017, and that 58% of child deaths between 2000 and 2017 in these countries could have been averted in the absence of geographical inequality. This study enables the identification of high-mortality clusters, patterns of progress and geographical inequalities to inform appropriate investments and implementations that will help to improve the health of all populations.

Journal ArticleDOI
TL;DR: In this article, the authors summarize the advantages of airborne and spaceborne remote sensing (ASRS) and the principles that make passive (photography, multispectral and hyperspectral) and active (synthetic aperture radar (SAR) and light detection and ranging (LiDAR)) imaging techniques suitable for ACH applications; a review of ASRS and the methodologies used over the past century is then presented, together with relevant highlights from well-known research projects.

Journal ArticleDOI
15 Mar 2019-JOM
TL;DR: In this paper, the impact of reusing powders on the additive manufacturing (AM) process under an argon high-purity atmosphere is investigated, by means of a simple but well-structured method that links the particle feature characterization process to the flowability of metal AM powders.
Abstract: In a selective laser melting process, it is common to reuse the powder in consecutive build cycles because it is more sustainable and cost effective. However, it is unknown whether reusing the material has an influence on the process. In this paper, Inconel 718, Ti6Al4V, AlSi10Mg and Scalmalloy are characterized to determine the impact of reusing powders on the additive manufacturing (AM) process under a high-purity argon atmosphere. Virgin powders were taken from the suppliers and compared to powders that had been used in the process for a long period of time with periodic ‘rejuvenation’. A well-structured characterization procedure, combining many existing techniques, is proposed, determining changes in the morphology, composition (chemical and microstructural) and flowability. Clear differences between the virgin and used states are revealed by the characterizations: AlSi10Mg appears to be the most sensitive to reuse, with changes in particle size distribution and morphology, and with an increase in oxygen content. The main contribution of this paper is to provide insight into the effects of reuse for four commonly used AM powders, by means of a simple but well-structured method that links the particle feature characterization process to the flowability of metal AM powders. The provided insights enable enhanced decision-making on recycling and reuse of powder for specific AM processes.
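The paper's full characterization procedure is not reproduced here, but a common way to turn density measurements into a flowability index (an assumption on my part, not necessarily the metric the authors used) is the Hausner ratio together with the Carr compressibility index, both computed from bulk and tapped powder densities:

```python
# Hausner ratio and Carr (compressibility) index: standard flowability
# indicators derived from bulk and tapped powder densities.
# Rule of thumb: a Hausner ratio below ~1.25 indicates good flowability.
def hausner_ratio(bulk_density: float, tapped_density: float) -> float:
    return tapped_density / bulk_density

def carr_index(bulk_density: float, tapped_density: float) -> float:
    return 100.0 * (tapped_density - bulk_density) / tapped_density

# Hypothetical densities (g/cm^3) for a virgin and a reused powder batch.
for label, bulk, tapped in [("virgin", 1.42, 1.58), ("reused", 1.35, 1.62)]:
    print(label,
          "Hausner:", round(hausner_ratio(bulk, tapped), 3),
          "Carr:", round(carr_index(bulk, tapped), 1))
```

A reuse-driven shift in particle morphology of the kind the abstract describes would typically show up as a rise in both indices between the virgin and used batches.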

Journal ArticleDOI
TL;DR: Points-of-interest (POIs), a typical type of geospatial big data, were combined with multi-source remote sensing data in a random forests model to disaggregate the 2010 county-level census population data to 100 m × 100 m grids, with higher accuracy.
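The disaggregation step in such work typically allocates each county's census total across its grid cells in proportion to a model-predicted weight (in the paper, the random forests model supplies those weights). A minimal sketch of that proportional allocation, with made-up weights standing in for model output:

```python
# Dasymetric population disaggregation: split a county's census total across
# its grid cells in proportion to per-cell weights (e.g. random-forest-
# predicted population density from POIs and remote sensing covariates).
def disaggregate(county_total: float, cell_weights: list) -> list:
    total_weight = sum(cell_weights)
    if total_weight == 0:  # no covariate signal: fall back to uniform spread
        return [county_total / len(cell_weights)] * len(cell_weights)
    return [county_total * w / total_weight for w in cell_weights]

# Hypothetical county of 10,000 people over four 100 m x 100 m cells.
weights = [0.1, 0.5, 0.3, 0.1]  # stand-ins for model predictions
cells = disaggregate(10_000, weights)
print(cells)  # the allocation preserves the county total by construction
```

Because each cell receives `total * w_i / sum(w)`, the gridded values always sum back to the census count, which is the defining constraint of this kind of disaggregation.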

Journal ArticleDOI
TL;DR: In this article, a roadmap for two-dimensional materials for energy storage and conversion is presented, which includes graphite, black phosphorus, MXenes, covalent organic frameworks (COFs), 2D oxides, 2D chalcogenides, and others.

Journal ArticleDOI
TL;DR: The Roadmap is organized so as to put side by side contributions on different aspects of optical processing, aiming to enhance the cross-contamination of ideas between scientists working in three different fields of photonics: optical gates and logical units, high bit-rate signal processing and optical quantum computing.
Abstract: The ability to process optical signals without passing into the electrical domain has always attracted the attention of the research community. Processing photons by photons unfolds new scenarios, in principle allowing for unseen signal processing and computing capabilities. Optical computation can be seen as a large scientific field in which researchers operate, trying to find solutions to their specific needs by different approaches; although the challenges can be substantially different, they are typically addressed using knowledge and technological platforms that are shared across the whole field. This significant know-how can also benefit other scientific communities, providing lateral solutions to their problems, as well as leading to novel applications. The aim of this Roadmap is to provide a broad view of the state-of-the-art in this lively scientific research field and to discuss the advances required to tackle emerging challenges, thanks to contributions authored by experts affiliated with both academic institutions and high-tech industries. The Roadmap is organized so as to put side by side contributions on different aspects of optical processing, aiming to enhance the cross-contamination of ideas between scientists working in three different fields of photonics: optical gates and logical units, high bit-rate signal processing and optical quantum computing. The ultimate intent of this paper is to provide guidance for young scientists, as well as to give research-funding institutions and stakeholders a comprehensive overview of the perspectives and opportunities offered by this research field.

Journal ArticleDOI
TL;DR: This paper proposes a set of design principles for in-vehicle HMI and reviews some current HMI designs in the light of those principles and makes recommendations on how, building on each principle, HMI design solutions can be adopted to address these challenges.
Abstract: As long as vehicles do not provide full automation, the design and function of the Human Machine Interface (HMI) is crucial for ensuring that the human “driver” and the vehicle-based automated systems collaborate in a safe manner. When the driver is decoupled from active control, the design of the HMI becomes even more critical. Without mutual understanding, the two agents (human and vehicle) will fail to accurately comprehend each other’s intentions and actions. This paper proposes a set of design principles for in-vehicle HMI and reviews some current HMI designs in the light of those principles. We argue that in many respects, the current designs fall short of best practice and have the potential to confuse the driver. This can lead to a mismatch between the operation of the automation in the light of the current external situation and the driver’s awareness of how well the automation is currently handling that situation. A model to illustrate how the various principles are interrelated is proposed. Finally, recommendations are made on how, building on each principle, HMI design solutions can be adopted to address these challenges.

Journal ArticleDOI
TL;DR: Zhang et al. incorporated illumination information into two-stream deep convolutional neural networks to learn multispectral human-related features under different illumination conditions (daytime and nighttime).

Journal ArticleDOI
TL;DR: In this article, the effect of process parameters comprising laser power, scan speed, hatch space, laser pattern angle coupling, along with heat treatment as a post-process, in relation to hardness was analyzed.
Abstract: In this paper, we printed Ti-6Al-4V SLM parts based on a Taguchi design of experiment and related standards, to measure hardness and compare it with mechanical properties obtained in our previous research, such as density, strength, elongation, and average surface roughness. The effect of process parameters comprising laser power, scan speed, hatch space and laser pattern angle coupling, along with heat treatment as a post-process, was then analysed in relation to hardness. The relations among the measured factors were also studied, and the underlying mechanisms were discussed in depth. The original contribution of this paper is the production of a large and precise dataset and its comparison with mechanical properties. A further contribution is the analysis of process parameters in relation to hardness, explained through rheological phenomena. The results showed an interesting similarity between hardness and density, both of which are highly related to the formation of the melt pool and of porosities within the process.
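In a Taguchi design, each process parameter's influence on the response is read off from its main effects: average the response (here, hardness) over all runs at each level of that parameter, then rank parameters by the spread between levels. A hedged sketch with a hypothetical two-level array; the paper's actual orthogonal array, parameter levels and hardness values are not reproduced here:

```python
# Taguchi main-effects analysis: mean response per factor level, and the
# level-to-level range as a rough indicator of each factor's influence.
# The orthogonal array and hardness values (HV) below are hypothetical.
from collections import defaultdict

factors = ["laser_power", "scan_speed", "hatch_space"]
runs = [
    # (level of each factor, in `factors` order), measured hardness
    ((1, 1, 1), 372.0),
    ((1, 2, 2), 365.0),
    ((2, 1, 2), 388.0),
    ((2, 2, 1), 380.0),
]

def main_effects(runs, factors):
    by_level = {f: defaultdict(list) for f in factors}
    for levels, response in runs:
        for factor, level in zip(factors, levels):
            by_level[factor][level].append(response)
    return {f: {lvl: sum(v) / len(v) for lvl, v in lv.items()}
            for f, lv in by_level.items()}

me = main_effects(runs, factors)
for factor, means in me.items():
    spread = max(means.values()) - min(means.values())
    print(factor, means, "range:", round(spread, 1))
```

With a balanced orthogonal array, each level mean averages over the same number of runs, so the factor with the largest range between level means is flagged as the most influential on hardness.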