
Showing papers by "ETH Zurich" published in 2023


Journal ArticleDOI
23 Jan 2023
TL;DR: This article used AlphaFold2 to predict structures for 65,484 human protein interactions and identified 3,137 high-confidence models, of which 1,371 have no homology to a known structure.
Abstract: Cellular functions are governed by molecular machines that assemble through protein-protein interactions. Their atomic details are critical to studying their molecular mechanisms. However, fewer than 5% of hundreds of thousands of human protein interactions have been structurally characterized. Here we test the potential and limitations of recent progress in deep-learning methods using AlphaFold2 to predict structures for 65,484 human protein interactions. We show that experiments can orthogonally confirm higher-confidence models. We identify 3,137 high-confidence models, of which 1,371 have no homology to a known structure. We identify interface residues harboring disease mutations, suggesting potential mechanisms for pathogenic variants. Groups of interface phosphorylation sites show patterns of co-regulation across conditions, suggestive of coordinated tuning of multiple protein interactions as signaling responses. Finally, we provide examples of how the predicted binary complexes can be used to build larger assemblies helping to expand our understanding of human cell biology.

22 citations


Book ChapterDOI
Takashi Suzuki
01 Jan 2023
TL;DR: In this paper, the authors developed an efficient end-to-end AI-based bokeh effect rendering approach that can run on modern smartphone GPUs using TensorFlow Lite and evaluated it on the Kirin 9000's Mali GPU, which provides excellent acceleration for the majority of common deep learning ops.
Abstract: As mobile cameras with compact optics are unable to produce a strong bokeh effect, lots of interest is now devoted to deep learning-based solutions for this task. In this Mobile AI challenge, the target was to develop an efficient end-to-end AI-based bokeh effect rendering approach that can run on modern smartphone GPUs using TensorFlow Lite. The participants were provided with a large-scale EBB! bokeh dataset consisting of 5K shallow/wide depth-of-field image pairs captured using the Canon 7D DSLR camera. The runtime of the resulting models was evaluated on the Kirin 9000’s Mali GPU that provides excellent acceleration results for the majority of common deep learning ops. A detailed description of all models developed in this challenge is provided in this paper.
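To make the runtime evaluation concrete, here is a minimal sketch of timing a TensorFlow Lite model with the standard Python interpreter. The model file name and input are hypothetical, and on the target handsets the Mali GPU delegate (rather than the CPU interpreter used here) would provide the acceleration.

```python
# Minimal, hedged sketch: time one inference of a (hypothetical) bokeh TFLite model.
import time
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="bokeh_model.tflite")  # hypothetical file
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Dummy input matching the model's expected shape (assumes a float32 input).
image = np.random.rand(*inp["shape"]).astype(np.float32)

start = time.perf_counter()
interpreter.set_tensor(inp["index"], image)
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(f"inference took {time.perf_counter() - start:.3f} s, output shape {result.shape}")
```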

7 citations


Journal ArticleDOI
Pedro Beltrao
TL;DR: In this article, the authors used interaction networks to expand the list of candidate trait-associated genes from genome-wide association studies, showing that the similarity of network expansion scores identifies groups of traits likely to share an underlying genetic and biological process.
Abstract: Interacting proteins tend to have similar functions, influencing the same organismal traits. Interaction networks can be used to expand the list of candidate trait-associated genes from genome-wide association studies. Here, we performed network-based expansion of trait-associated genes for 1,002 human traits showing that this recovers known disease genes or drug targets. The similarity of network expansion scores identifies groups of traits likely to share an underlying genetic and biological process. We identified 73 pleiotropic gene modules linked to multiple traits, enriched in genes involved in processes such as protein ubiquitination and RNA processing. In contrast to gene deletion studies, pleiotropy as defined here captures specifically multicellular-related processes. We show examples of modules linked to human diseases enriched in genes with known pathogenic variants that can be used to map targets of approved drugs for repurposing. Finally, we illustrate the use of network expansion scores to study genes at inflammatory bowel disease genome-wide association study loci, and implicate inflammatory bowel disease-relevant genes with strong functional and genetic support.
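As a rough illustration of the idea of network-based expansion (not the authors' actual scoring scheme), the sketch below scores each gene by how strongly it connects to GWAS "seed" genes in a toy protein-protein interaction network; the gene names and edges are invented for the example.

```python
# Toy network expansion: score genes by the fraction of their neighbours
# that are GWAS seed genes (illustrative only).
import networkx as nx

ppi = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("A", "E"), ("E", "F")])
seed_genes = {"A", "C"}  # hypothetical genes at GWAS loci for one trait

def expansion_scores(graph, seeds):
    scores = {}
    for gene in graph.nodes:
        neighbours = set(graph.neighbors(gene))
        scores[gene] = len(neighbours & seeds) / len(neighbours) if neighbours else 0.0
    return scores

scores = expansion_scores(ppi, seed_genes)
# Candidate trait-associated genes: high-scoring genes outside the seed set.
candidates = sorted((g for g in scores if g not in seed_genes), key=scores.get, reverse=True)
print(scores, candidates)
```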

6 citations


Journal ArticleDOI
Aurora Reid
TL;DR: In this article, a shape-corrected sediment transport law is proposed that accounts for grain shape effects on fluid drag and granular friction and predicts that the onset and efficiency of transport depend on the coefficients of drag and bulk friction of the transported grains.
Abstract: Bed load sediment transport, in which wind or water flowing over a bed of sediment causes grains to roll or hop along the bed, is a critically important mechanism in contexts ranging from river restoration1 to planetary exploration2. Despite its widespread occurrence, predictions of bed load sediment flux are notoriously imprecise3,4. Many studies have focused on grain size variability5 as a source of uncertainty, but few have investigated the role of grain shape, even though shape has long been suspected to influence transport rates6. Here we show that grain shape can modify bed load transport rates by an amount comparable to the scatter in many sediment transport datasets4,7,8. We develop a theory that accounts for grain shape effects on fluid drag and granular friction and predicts that the onset and efficiency of transport depend on the coefficients of drag and bulk friction of the transported grains. Laboratory experiments confirm these predictions and reveal that the effect of grain shape on sediment transport can be difficult to intuit from the appearance of grains. We propose a shape-corrected sediment transport law that collapses our experimental measurements. Our results enable greater accuracy in predictions of sediment transport and help reconcile theories developed for spherical particles with the behaviour of natural sediment grains.

5 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated the current utilization of biomass in agricultural anaerobic digestion plants in terms of mass, nutrients, and energy flows to assess its contribution to the circular economy and climate change mitigation through the substitution of mineral fertilizers and fossil fuels.
Abstract: Today's agro-food system is typically based on linear fluxes (e.g., the importation of mineral fertilizers) whereas a circular approach should be privileged. The production of biogas as a renewable energy source and digestate as an organic fertilizer is essential for the circular economy in agriculture. This study investigates the current utilization of biomass in agricultural anaerobic digestion plants in Switzerland in terms of mass, nutrients, and energy flows to assess its contribution to the circular economy and climate change mitigation through the substitution of mineral fertilizers and fossil fuels. We quantify the system and its benefits in detail and examine potential future developments using different scenarios. Today, agricultural anaerobic digestion provides 1300 TJ/a of biogas. Our results demonstrate that the system could be largely expanded and provide ten times more biogas by 2050 while saving significant mineral fertilizer amounts (over 10 kt/a of dry-mass nutrients, yielding 38 kt/a of CO2 equivalent).

5 citations


Journal ArticleDOI
Gisbert Schneider
TL;DR: Geometric deep learning, an emerging concept of neural-network-based machine learning, has been applied to macromolecular structures as mentioned in this paper, highlighting its potential for structure-based drug discovery and design.

5 citations


Journal ArticleDOI
Jonas Anderegg
TL;DR: In this article, the authors evaluated the feasibility of weed detection in on-farm wheat fields characterized by a narrow row spacing, throughout the early and late developmental stages, using UAV imagery and ground-based high-resolution imagery.

4 citations


Journal ArticleDOI
Detlef Stolten
TL;DR: In this paper, a bottom-up model covering both the production and end-of-life treatment of 90% of global plastics is linked to the planetary boundaries framework, showing that even a circular, climate-optimal plastics industry combining current recycling technologies with biomass utilization transgresses sustainability thresholds by up to four times.
Abstract: The rapid growth of plastics production exacerbated the triple planetary crisis of habitat loss, plastic pollution and greenhouse gas (GHG) emissions. Circular strategies have been proposed for plastics to achieve net-zero GHG emissions. However, the implications of such circular strategies on absolute sustainability have not been examined on a planetary scale. This study links a bottom-up model covering both the production and end-of-life treatment of 90% of global plastics to the planetary boundaries framework. Here we show that even a circular, climate-optimal plastics industry combining current recycling technologies with biomass utilization transgresses sustainability thresholds by up to four times. However, improving recycling technologies and recycling rates up to at least 75% in combination with biomass and CO2 utilization in plastics production can lead to a scenario in which plastics comply with their assigned safe operating space in 2030. Although being the key to sustainability and in improving the unquantified effect of novel entities on the biosphere, even enhanced recycling cannot cope with the growth in plastics demand predicted until 2050. Therefore, achieving absolute sustainability of plastics requires a fundamental change in our methods of both producing and using plastics.

4 citations


Book ChapterDOI
Takashi Suzuki
01 Jan 2023
TL;DR: In this article, the authors presented a novel MicroISP model designed specifically for edge devices, taking into account their computational and memory limitations, which is capable of processing up to 32MP photos on recent smartphones using the standard mobile ML libraries and requiring less than 1 s to perform the inference.
Abstract: While neural networks-based photo processing solutions can provide a better image quality compared to the traditional ISP systems, their application to mobile devices is still very limited due to their very high computational complexity. In this paper, we present a novel MicroISP model designed specifically for edge devices, taking into account their computational and memory limitations. The proposed solution is capable of processing up to 32MP photos on recent smartphones using the standard mobile ML libraries and requiring less than 1 s to perform the inference, while for FullHD images it achieves real-time performance. The architecture of the model is flexible, allowing its complexity to be adjusted to devices of different computational power. To evaluate the performance of the model, we collected a novel Fujifilm UltraISP dataset consisting of thousands of paired photos captured with a normal mobile camera sensor and a professional 102MP medium-format FujiFilm GFX100 camera. The experiments demonstrated that, despite its compact size, the MicroISP model is able to provide comparable or better visual results than the traditional mobile ISP systems, while outperforming the previously proposed efficient deep learning based solutions. Finally, this model is also compatible with the latest mobile AI accelerators, achieving good runtime and low power consumption on smartphone NPUs and APUs. The code, dataset and pre-trained models are available on the project website: https://people.ee.ethz.ch/~ihnatova/microisp.html.

4 citations


Journal ArticleDOI
TL;DR: BLEND, as presented in this paper, can identify both exact-matching and highly similar seeds (fuzzy seed matches) with a single lookup of their hash values, reducing the need for costly sequence alignment without sacrificing sensitivity.
Abstract: Generating the hash values of short subsequences, called seeds, enables quickly identifying similarities between genomic sequences by matching seeds with a single lookup of their hash values. However, these hash values can be used only for finding exact-matching seeds, as conventional hashing methods assign distinct hash values to different seeds, including highly similar ones. Finding only exact-matching seeds causes either 1) increased use of costly sequence alignment or 2) limited sensitivity. We introduce BLEND, the first efficient and accurate mechanism that can identify both exact-matching and highly similar seeds with a single lookup of their hash values, called fuzzy seed matches. BLEND 1) utilizes a technique called SimHash, which can generate the same hash value for similar sets, and 2) provides the proper mechanisms for using seeds as sets with the SimHash technique to find fuzzy seed matches efficiently. We show the benefits of BLEND when used in read overlapping and read mapping. For read overlapping, BLEND is faster by 2.4x - 83.9x (on average 19.3x), has a lower memory footprint by 0.9x - 14.1x (on average 3.8x), and finds higher-quality overlaps leading to more accurate de novo assemblies than the state-of-the-art tool, minimap2. For read mapping, BLEND is faster by 0.8x - 4.1x (on average 1.7x) than minimap2. Source code is available at https://github.com/CMU-SAFARI/BLEND.
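To illustrate the SimHash idea behind fuzzy seed matching (a toy sketch, not BLEND's implementation), the example below hashes two seeds that differ by a single base: because each hash bit is a majority vote over the seed's k-mers, the two hash values agree in most bits, whereas unrelated seeds would disagree in roughly half of them.

```python
# Toy SimHash over a seed's k-mers (illustrative only; not the BLEND code).
import hashlib

def simhash(items, bits=32):
    counts = [0] * bits
    for item in items:
        h = int(hashlib.md5(item.encode()).hexdigest(), 16)
        for b in range(bits):
            counts[b] += 1 if (h >> b) & 1 else -1
    # Each output bit is the majority vote over all items in the set.
    return sum(1 << b for b in range(bits) if counts[b] > 0)

def kmers(seq, k=4):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

seed_a = "ACGTACGTACGTACGT"
seed_b = "ACGTACGTACGAACGT"  # one substitution relative to seed_a
ha, hb = simhash(kmers(seed_a)), simhash(kmers(seed_b))
print(bin(ha ^ hb).count("1"), "differing bits out of 32")  # far fewer than the ~16 expected for unrelated seeds
```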

3 citations


Posted ContentDOI
Greg Murray
08 Jan 2023
TL;DR: In this paper, a method to estimate (re)modeling velocity curves from time-lapsed in vivo mouse caudal vertebrae data under static and cyclic mechanical loading was proposed.
Abstract: Mechanical loading is a key factor governing bone adaptation. Both preclinical and clinical studies have demonstrated its effects on bone tissue, which were also notably predicted in the mechanostat theory. Indeed, existing methods to quantify bone mechanoregulation have successfully associated the frequency of (re)modeling events with local mechanical signals, combining time-lapsed in vivo micro-computed tomography (micro-CT) imaging and micro-finite element (micro-FE) analysis. However, a correlation between the local surface velocity of (re)modeling events and mechanical signals has not been shown. As many degenerative bone diseases have also been linked to impaired bone (re)modeling, this relationship could provide an advantage in detecting the effects of such conditions and advance our understanding of the underlying mechanisms. Therefore, in this study, we introduce a novel method to estimate (re)modeling velocity curves from time-lapsed in vivo mouse caudal vertebrae data under static and cyclic mechanical loading. These curves can be fitted with piecewise linear functions as proposed in the mechanostat theory. Accordingly, new (re)modeling parameters can be derived from such data, including formation saturation levels, resorption velocity modulus, and (re)modeling thresholds. Our results revealed that the norm of the gradient of strain energy density yielded the highest accuracy in quantifying mechanoregulation data using micro-FE analysis with homogeneous material properties, while effective strain was the best predictor for micro-FE analysis with heterogeneous material properties. Furthermore, (re)modeling velocity curves could be accurately described with piecewise linear and hyperbola functions (root mean square error below 0.2 µm/day for weekly analysis), and several (re)modeling parameters determined from these curves followed a logarithmic relationship with loading frequency. Crucially, (re)modeling velocity curves and derived parameters could detect differences in mechanically driven bone adaptation, which complemented previous results showing a logarithmic relationship between loading frequency and net change in bone volume fraction over four weeks. Together, we expect this data to support the calibration of in silico models of bone adaptation and the characterization of the effects of mechanical loading and pharmaceutical treatment interventions in vivo.
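The curve-fitting step can be sketched as follows: a mechanostat-style piecewise linear curve (resorption plateau, linear transition, formation plateau) fitted to noisy velocity-versus-mechanical-signal data. The functional form, parameter names and synthetic data are illustrative assumptions, not the study's exact model.

```python
# Fit a piecewise linear "mechanostat" curve to synthetic (re)modeling velocity data.
import numpy as np
from scipy.optimize import curve_fit

def mechanostat(signal, v_resorption, v_formation, t_low, t_high):
    # Resorption plateau below t_low, formation plateau above t_high, linear in between.
    return np.interp(signal, [t_low, t_high], [v_resorption, v_formation])

rng = np.random.default_rng(0)
signal = np.linspace(0.0, 2.0, 200)                    # e.g. effective strain (arbitrary units)
velocity = mechanostat(signal, -1.0, 2.0, 0.5, 1.2)    # "true" curve, in µm/day
velocity += rng.normal(scale=0.15, size=signal.size)   # measurement noise

params, _ = curve_fit(mechanostat, signal, velocity, p0=[-0.5, 1.0, 0.4, 1.0])
print("resorption velocity, formation saturation, lower/upper thresholds:", params)
```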


Journal ArticleDOI
James S Tomlinson
TL;DR: In this article, the authors determined the solubility of water in 14 peridotite liquids, representative of Earth's mantle, synthesised in a laser-heated aerodynamic levitation furnace.

Journal ArticleDOI
Junying Hui
TL;DR: In this paper, an adaptive robust optimal power flow is proposed to secure the hourly schedule against uncertain intra-hour power injections, where second-order cone programming relaxation is employed to address the nonconvexity of power flow constraints.

Journal ArticleDOI
TL;DR: In this article, a thermodynamically well-posed multiphase numerical model accounting for phase compression and expansion was proposed, which relies on a finite pressure-relaxation rate formulation.
Abstract: Investigations of shock-induced cavitation within a droplet are highly challenged by the multiphase nature of the mechanisms involved. Within the context of heterogeneous nucleation, we introduce a thermodynamically well-posed multiphase numerical model accounting for phase compression and expansion, which relies on a finite pressure-relaxation rate formulation. We simulate (i) the spherical collapse of a bubble in a free field, (ii) the interaction of a cylindrical water droplet with a planar shock wave, and (iii) the high-speed impact of a gelatin droplet onto a solid surface. The determination of the finite pressure-relaxation rate is done by comparing the numerical results with the Keller–Miksis model, and the corresponding experiments of Sembian et al. and Field et al., respectively. For the latter two, the pressure-relaxation rate is found to be [Formula: see text] and [Formula: see text], respectively. Upon the validation of the determined pressure-relaxation rate, we run parametric simulations to elucidate the critical Mach number from which cavitation is likely to occur. Complementing simulations with a geometrical acoustic model, we provide a phenomenological description of the shock-induced cavitation within a droplet, as well as a discussion on the bubble-cloud growth effect on the droplet flow field. The usual prediction of the bubble cloud center, given in the literature, is eventually modified to account for the expansion wave magnitude.
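For readers unfamiliar with the spherical-collapse reference used in such calibrations, the sketch below integrates the simpler Rayleigh-Plesset equation (neglecting viscosity, surface tension and liquid compressibility) rather than the full Keller–Miksis model used in the paper; all parameter values are illustrative.

```python
# Simplified spherical bubble collapse (Rayleigh-Plesset, illustrative parameters).
import numpy as np
from scipy.integrate import solve_ivp

rho, p_inf, p_g0, gamma, R0 = 1000.0, 1.0e6, 1.0e5, 1.4, 1.0e-4  # SI units

def rayleigh_plesset(t, y):
    R, Rdot = y
    p_gas = p_g0 * (R0 / R) ** (3 * gamma)            # adiabatic gas pressure in the bubble
    Rddot = ((p_gas - p_inf) / rho - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rayleigh_plesset, (0.0, 1.0e-5), [R0, 0.0], max_step=1.0e-8, rtol=1e-8)
print("minimum radius during collapse: %.2e m" % sol.y[0].min())
```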

Posted ContentDOI
Carolin C. Wendling
11 Mar 2023
TL;DR: In this paper, the author summarizes the latest advances exploring how the impact of prophages is transmitted through multiple levels, with potential consequences for ecosystem stability and functioning, ranging from contributions to global biogeochemical processes and mutualistic interactions to increased disease severity with negative impacts on ecosystem engineers and potential cascading effects for multiple species.
Abstract: Prophages, latent viral elements residing in bacterial genomes, impact bacterial ecology and evolution in diverse ways. Do these prophage-mediated effects extend beyond the prophage-bacterium relationship? Here, I summarize the latest advances exploring how the impact of prophages is transmitted through multiple levels with potential impacts on ecosystem stability and functioning. The diverse effects of prophages on higher-order interactions are context-specific, ranging from contributions to global biogeochemical processes and mutualistic interactions to increased disease severity with negative impacts on ecosystem engineers and potential cascading effects for multiple species. While we have a solid understanding about the mechanisms by which prophages modulate their bacterial host at the cellular and population level, future research should take an integrative approach to quantify their effects in complex ecosystems.

Journal ArticleDOI
TL;DR: In early 2020, an international team set out to investigate trade-wind cumulus clouds and their coupling to the large-scale circulation through the field campaign EUREC4A (ElUcidating the RolE of Clouds-Circulation Coupling in ClimAte), as discussed by the authors.
Abstract: In early 2020, an international team set out to investigate trade-wind cumulus clouds and their coupling to the large-scale circulation through the field campaign EUREC4A: ElUcidating the RolE of Clouds-Circulation Coupling in ClimAte. Focused on the western tropical Atlantic near Barbados, EUREC4A deployed a number of innovative observational strategies, including a large network of water isotopic measurements collectively known as EUREC4A-iso, to study the tropical shallow convective environment. The goal of the isotopic measurements was to elucidate processes that regulate the hydroclimate state – for example, by identifying moisture sources, quantifying mixing between atmospheric layers, characterizing the microphysics that influence the formation and persistence of clouds and precipitation, and providing an extra constraint in the evaluation of numerical simulations. During the field experiment, researchers deployed seven water vapor isotopic analyzers on two aircraft, on three ships, and at the Barbados Cloud Observatory (BCO). Precipitation was collected for isotopic analysis at the BCO and from aboard four ships. In addition, three ships collected seawater for isotopic analysis. All told, the in situ data span the period 5 January–22 February 2020 and cover the approximate area 6 to 16° N and 50 to 60° W, with water vapor isotope ratios measured from a few meters above sea level to the mid-free troposphere and seawater samples spanning the ocean surface to several kilometers depth. This paper describes the full EUREC4A isotopic in situ data collection – providing extensive information about sampling strategies and data uncertainties – and also guides readers to complementary remotely sensed water vapor isotope ratios. All field data have been made publicly available even if they are affected by known biases, as is the case for high-altitude aircraft measurements, one of the two BCO ground-based water vapor time series, and select rain and seawater samples from the ships. Publication of these data reflects a desire to promote dialogue around improving water isotope measurement strategies for the future. The remaining, high-quality data create unprecedented opportunities to close water isotopic budgets and evaluate water fluxes and their influence on cloudiness in the trade-wind environment. The full list of dataset DOIs and notes on data quality flags are provided in Table 3 of Sect. 5 (“Data availability”).

Posted ContentDOI
Anthony Dayan
15 Jan 2023
TL;DR: In this paper, the authors compared two variant callers, GATK and DeepVariant, in 50 Brown Swiss cattle with sequencing coverages ranging from 4- to 63-fold.
Abstract: Background Low-pass sequencing followed by sequence variant genotype imputation is an alternative to the routine microarray-based genotyping in cattle. However, the impact of haplotype reference panel composition and its interplay with the coverage of low-pass whole-genome sequencing data has not been sufficiently explored in typical livestock settings where only a small number of reference samples are available. Methods Sequence variant genotyping accuracy was compared between two variant callers, GATK and DeepVariant, in 50 Brown Swiss cattle with sequencing coverages ranging from 4 to 63-fold. Haplotype reference panels of varying sizes and composition were built with DeepVariant considering 501 cattle from nine breeds. High coverage sequencing data of 24 Brown Swiss cattle was downsampled to between 0.01- and 4-fold coverage to mimic low-pass sequencing. GLIMPSE was used to infer sequence variant genotypes from the low-pass sequencing data using different haplotype reference panels. The accuracy of the sequence variant genotypes inferred from low-pass sequencing data was compared with sequence variant genotypes called from high-coverage data. Results DeepVariant was used to establish bovine haplotype reference panels because it outperformed GATK in all evaluations. Same-breed haplotype reference panels were better suited to impute sequence variant genotypes from low-pass sequencing than equally-sized multibreed haplotype reference panels for all target sample coverages and allele frequencies. F1 scores greater than 0.9, implying high harmonic means of recall and precision of called genotypes, were achieved with 0.25-fold sequencing coverage when large breed-specific haplotype reference panels (n = 150) were used. In the absence of such large same-breed haplotype panels, variant genotyping accuracy from low-pass sequencing could be increased either by adding non-related samples to the haplotype reference panel or by increasing the coverage of the low-pass sequencing data. Sequence variant genotyping from low-pass sequencing was substantially less accurate when the reference panel lacks individuals from the target breed. Conclusions Variant genotyping is more accurate with DeepVariant than with GATK. DeepVariant is therefore suitable to establish bovine haplotype reference panels. Medium-sized breed-specific haplotype reference panels and large multibreed haplotype reference panels enable accurate imputation of low-pass sequencing data in a typical cattle breed.
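A minimal sketch of the F1-style concordance evaluation mentioned above, comparing imputed genotypes against genotypes called from high-coverage data; the toy genotype vectors and the macro-averaging choice are assumptions for illustration, not the study's exact protocol.

```python
# F1 of imputed vs. truth genotypes (0/1/2 = count of alternate alleles), toy data.
from sklearn.metrics import f1_score

truth   = [0, 1, 2, 1, 0, 2, 1, 0, 2, 1]   # genotypes from high-coverage calls
imputed = [0, 1, 2, 0, 0, 2, 1, 1, 2, 1]   # genotypes imputed from low-pass data

# Macro-average over the three genotype classes (hom-ref, het, hom-alt).
print("F1:", f1_score(truth, imputed, average="macro"))
```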

Journal ArticleDOI
Christian Mercure
TL;DR: In this paper, the computation of a selection of meson and baryon masses on two QCD and five QCD+QED gauge ensembles is presented, in a setup that preserves locality, gauge and translational invariance throughout the calculation.
Abstract: Accounting for isospin-breaking corrections is critical for achieving subpercent precision in lattice computations of hadronic observables. A way to include QED and strong-isospin-breaking corrections in lattice QCD calculations is to impose C⋆ boundary conditions in space. Here, we demonstrate the computation of a selection of meson and baryon masses on two QCD and five QCD+QED gauge ensembles in this setup, which preserves locality, gauge and translational invariance all through the calculation. The generation of the gauge ensembles is performed for two volumes, and three different values of the renormalized fine-structure constant at the U-symmetric point, corresponding to the SU(3)-symmetric QCD in the two ensembles where the electromagnetic coupling is turned off. We also present our tuning strategy and, to the extent possible, a cost analysis of the simulations with C⋆ boundary conditions.

Journal ArticleDOI
TL;DR: In this paper , the authors used a quasi-2D experimental model of a porous medium to characterize biofilm growth dynamics for different pore sizes and flow rates, which can be used to stochastically generate permeability fields within biofilms.
Abstract: The functioning of natural and engineered porous media, like soils and filters, depends in many cases on the interplay between biochemical processes and hydrodynamics. In such complex environments, microorganisms often form surface-attached communities known as biofilms. Biofilms can take the shape of clusters, which alter the distribution of fluid flow velocities within the porous medium, subsequently influencing biofilm growth. Despite numerous experimental and numerical efforts, the control of the biofilm clustering process and the resulting heterogeneity in biofilm permeability is not well understood, limiting our predictive abilities for biofilm-porous medium systems. Here, we use a quasi-2D experimental model of a porous medium to characterize biofilm growth dynamics for different pore sizes and flow rates. We present a method to obtain the time-resolved biofilm permeability field from experimental images and use the obtained permeability field to compute the flow field through a numerical model. We observe a biofilm cluster size distribution characterized by a spectrum slope evolving in time between -2 and -1, a fundamental measure that can be used to create spatio-temporal distributions of biofilm clusters for upscaled models. We find a previously undescribed biofilm permeability distribution, which can be used to stochastically generate permeability fields within biofilms. An increase in velocity variance for a decrease in physical heterogeneity shows that the bioclogged porous medium behaves differently than expected from studies on heterogeneity in abiotic porous media.
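The spectrum-slope measurement can be illustrated with a short sketch: draw synthetic cluster sizes from a power law and recover the exponent from a log-log fit to a logarithmically binned histogram. The synthetic data, binning and fitting choices are assumptions for illustration, not the study's exact procedure.

```python
# Estimate the power-law slope of a cluster-size distribution (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
sizes = rng.random(5000) ** -2.0   # Pareto-type sample with density exponent ~ -1.5

bins = np.logspace(0, np.log10(sizes.max()), 25)   # logarithmically spaced bins
counts, edges = np.histogram(sizes, bins=bins)
density = counts / np.diff(edges)                  # normalise counts by bin width
centres = np.sqrt(edges[:-1] * edges[1:])

mask = density > 0
slope, _ = np.polyfit(np.log10(centres[mask]), np.log10(density[mask]), 1)
print(f"estimated spectrum slope: {slope:.2f}")    # roughly -1.5 for this sample
```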

Journal ArticleDOI
Paola Bacigaluppi
TL;DR: Abgrall et al., as discussed in this paper, proposed a novel approximation strategy for time-dependent hyperbolic systems of conservation laws, applied to the Euler system of gas dynamics, that aims to represent the dynamics of strongly interacting discontinuities.

Posted ContentDOI
15 May 2023
TL;DR: In this paper, satellite remote sensing estimates of current forest biomass were integrated with a machine learning framework to show that existing global forests could increase their above-ground biomass by 44.1 PgC at most (an increase of 16% over current levels) if allowed to reach their natural equilibrium state.
Abstract: Global forests play a key role in the global carbon cycle and are a cornerstone in international policy-making to prevent global warming from exceeding 1.5°C and reach carbon neutrality. In line with recent climate science, the actions taken in the current decade are crucial for obtaining the goals laid out in international agreements. One forest-based strategy with high short-term climate benefits is the return of global forests to their carbon storage potential, by ceasing forest management, but the ecological boundaries of increasing biomass in existing forests remain poorly quantified. Recent studies preferentially focus on the mitigation potential of reforestation, without explicitly accounting for the carbon dynamics in existing forests, thus providing an incomplete evaluation of the possible expansion of the forest carbon stock. Here we integrate satellite remote sensing estimates of current forest biomass with a machine learning framework to show that existing global forests could increase their above-ground biomass by 44.1 PgC at most (an increase of 16% over current levels) if allowed to reach their natural equilibrium state. In total, the maximum carbon storage potential in this hypothetical scenario equates to just about 4 years of global anthropogenic CO2 emissions (at the 2019 rate). This maximum potential would require the complete stop of forest management and harvesting for decades. Therefore, without first strongly reducing CO2 emissions, this strategy holds low climate change mitigation potential. This urges us to view storing additional carbon in existing forests as an effective strategy to offset carbon emissions from sectors that will be hard to decarbonise, rather than as a tool to compensate for all business-as-usual emissions.
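As a back-of-envelope check on the "about 4 years of emissions" comparison, assuming total anthropogenic CO2 emissions of roughly 11 PgC in 2019 (the exact emission figure used by the authors is not given here):

```python
# 44.1 PgC of additional biomass relative to ~11 PgC/yr of emissions (assumed value).
potential_pgc = 44.1            # maximum additional above-ground biomass, PgC
emissions_pgc_per_year = 11.0   # approximate 2019 anthropogenic CO2 emissions, PgC/yr
print(f"{potential_pgc / emissions_pgc_per_year:.1f} years of emissions")  # ~4 years
```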

Posted ContentDOI
15 May 2023
TL;DR: In this paper, an autoencoder is proposed to denoise cryospheric seismic data efficiently and to unmask the signals of interest, potentially separating the incoherent noise (such as wind or water flow) from the temporally and spatially coherent signals.
Abstract: One major challenge in Environmental Seismology is that signals of interest are often buried within the high noise level emitted by a multitude of environmental processes. Those signals potentially stay unnoticed and thus might not be analyzed further. Distributed acoustic sensing (DAS) is an emerging technology for measuring strain rate data by using common fiber-optic cables in combination with an interrogation unit. This technology enables researchers to acquire seismic monitoring data on poorly accessible terrain with great spatial and temporal resolution. We utilized a DAS unit in a cryospheric environment on a temperate glacier. The data collection took place in July 2020 on Rhonegletscher, Switzerland, where a 9 km long fiber-optic cable was installed, covering the entire glacier from its accumulation to its ablation zone. During one month, 17 TB of data were acquired. Due to the highly active and dynamic cryospheric environment, our collected DAS data are characterized by a low signal to noise ratio compared to classical point sensors. Therefore, new techniques are required to denoise the data efficiently and to unmask the signals of interest. Here we propose an autoencoder, which is a deep neural network, as a denoising tool for the analysis of our cryospheric seismic data. An autoencoder can potentially separate the incoherent noise (such as wind or water flow) from the temporally and spatially coherent signals of interest (e.g., stick-slip event or crevasse formation). We test this approach on the continuous microseismic Rhonegletscher DAS records. To investigate the autoencoder’s general suitability and performance, three different types of training data are tested: purely synthetic data, original data from on-site seismometers, and original data from the DAS recordings themselves. Finally, suitability, performance as well as advantages and disadvantages of the different types of training data are discussed.
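A minimal sketch of the denoising-autoencoder idea on synthetic one-dimensional strain-rate traces; the architecture, training data and hyperparameters are illustrative assumptions, not the network or DAS data used in the study.

```python
# Tiny 1-D convolutional denoising autoencoder on synthetic noisy/clean trace pairs.
import numpy as np
import tensorflow as tf

n_traces, trace_len = 512, 1024
clean = np.sin(np.linspace(0, 20 * np.pi, trace_len))[None, :, None] * np.ones((n_traces, 1, 1))
noisy = clean + 0.5 * np.random.randn(n_traces, trace_len, 1)   # incoherent noise

autoencoder = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(trace_len, 1)),
    tf.keras.layers.Conv1D(16, 9, padding="same", activation="relu"),
    tf.keras.layers.MaxPooling1D(4),                  # encoder: compress the trace
    tf.keras.layers.Conv1D(8, 9, padding="same", activation="relu"),
    tf.keras.layers.UpSampling1D(4),                  # decoder: reconstruct it
    tf.keras.layers.Conv1D(1, 9, padding="same"),
])
autoencoder.compile(optimizer="adam", loss="mse")
autoencoder.fit(noisy, clean, epochs=3, batch_size=32, verbose=0)

denoised = autoencoder.predict(noisy[:1], verbose=0)  # apply to one noisy trace
print(denoised.shape)
```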

Journal ArticleDOI
14 Feb 2023 - Physics
TL;DR: In this article, the real-time variation of the ratio of recovery to infection rate is analyzed as a key parameter of the SIR (susceptible-infected-recovered/removed) epidemic model.
Abstract: Monitored differential infection rates of past corona waves are used to infer, a posteriori, the real-time variation of the ratio of recovery to infection rate as a key parameter of the SIR (susceptible-infected-recovered/removed) epidemic model. From monitored corona waves in five different countries, it is found that this ratio exhibits a linear increase at early times below the first maximum of the differential infection rate, before the ratios approach a nearly constant value close to unity at the time of the first maximum with small amplitude oscillations at later times. The observed time dependencies at early times and at times near the first maximum agree well with the behavior of the calculated ratio for the Gaussian temporal evolution of the rate of new infections, although the predicted linear increase of the Gaussian ratio at late times is not observed.
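A minimal sketch of the SIR dynamics referred to above: integrate the standard equations for a constant ratio k = μ/a of recovery to infection rate and locate the first maximum of the differential (new) infection rate. Parameter values are illustrative only, not those inferred in the paper.

```python
# Standard SIR model with constant ratio k = mu/a of recovery to infection rate.
import numpy as np
from scipy.integrate import solve_ivp

a, k = 0.3, 0.5          # infection rate a (1/day) and ratio k = mu/a
mu = k * a               # recovery/removal rate (1/day)

def sir(t, y):
    s, i, r = y
    new_infections = a * s * i          # differential (new) infection rate
    return [-new_infections, new_infections - mu * i, mu * i]

sol = solve_ivp(sir, (0, 120), [0.999, 0.001, 0.0], t_eval=np.linspace(0, 120, 1201))
rate = a * sol.y[0] * sol.y[1]
print(f"peak of new infections around day {sol.t[rate.argmax()]:.0f}")
```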

Book ChapterDOI
Ren Yang
01 Jan 2023
TL;DR: The Challenge on Super-Resolution of Compressed Image and Video at AIM 2022, as reviewed in this paper, includes two tracks: Track 1 aims at the super-resolution of compressed images, and Track 2 targets the super-resolution of compressed videos.
Abstract: This paper reviews the Challenge on Super-Resolution of Compressed Image and Video at AIM 2022. This challenge includes two tracks. Track 1 aims at the super-resolution of compressed images, and Track 2 targets the super-resolution of compressed videos. In Track 1, we use the popular DIV2K dataset as the training, validation and test sets. In Track 2, we propose the LDV 3.0 dataset, which contains 365 videos, including the LDV 2.0 dataset (335 videos) and 30 additional videos. In this challenge, 12 teams and 2 teams submitted the final results to Track 1 and Track 2, respectively. The proposed methods and solutions gauge the state of the art of super-resolution on compressed images and videos. The proposed LDV 3.0 dataset is available at https://github.com/RenYang-home/LDV_dataset. The homepage of this challenge is at https://github.com/RenYang-home/AIM22_CompressSR.

Journal ArticleDOI
Jordan Aaron
TL;DR: In this paper, high-resolution and high-frequency 3D LiDAR data were used to explore the dynamics of a debris flow at Illgraben, Switzerland; the relative velocity of different particles was used to infer that the vertical velocity profile varies between plug flow and one that features internal shear.
Abstract: Surging debris flows are among the most destructive natural hazards, and elucidating the interaction between coarse-grained fronts and the trailing liquefied slurry is key to understanding these flows. Here, we describe the application of high-resolution and high-frequency 3D LiDAR data to explore the dynamics of a debris flow at Illgraben, Switzerland. The LiDAR measurements facilitate automated detection of features on the flow surface, and construction of the 3D flow depth and velocity fields through time. Measured surface velocities (2–3 m s⁻¹) are faster than front velocities (0.8–2 m s⁻¹), illustrating the mechanism whereby the flow front is maintained along the channel. Further, we interpret the relative velocity of different particles to infer that the vertical velocity profile varies between plug flow and one that features internal shear. Our measurements provide unique insights into debris-flow motion, and provide the foundation for a more detailed understanding of these hazardous events.


Journal ArticleDOI
Kholikov Umidjon
TL;DR: In this article, the standard technical robustness concept is enlarged by societal and institutional, including regulatory, aspects, and then amplified by the more dynamic notion of resilience and adaptiveness, followed by an analysis of processes and procedures in all three policy areas under scrutiny.
Abstract: The standard (technical) robustness concept (of Chap. 4) is enlarged by societal and institutional, including regulatory, aspects, and then amplified by the more dynamic notion of “resilience” and “adaptiveness”, followed by an analysis of processes and procedures in all three policy areas under scrutiny: radioactive waste, conventional hazardous waste and carbon storage (CCS). The approach culminates in turning over the concept of “governance” to the applications, thus establishing a framework for the intended “Strategic Monitoring”.

Journal ArticleDOI
Annette Oxenius
TL;DR: In this article, the role of asymmetric cell division (ACD) in CD8 T cell fate regulation was investigated under different activation conditions, showing that strong TCR stimulation induces elevated ACD rates and that subsequent single-cell-derived colonies comprised both effector and memory precursor cells.

Journal ArticleDOI
Lars-Erik Cederman
TL;DR: In this paper, the authors test the bellicist claim that warfare drove the territorial expansion of states by aligning historical data on European state borders with conflict data, focusing on the period from 1490 through 1790.
Abstract: Charles Tilly's classical claim that “war made states” in early modern Europe remains controversial. The “bellicist” paradigm has attracted theoretical criticism both within and beyond its original domain of applicability. While several recent studies have analyzed the internal aspects of Tilly's theory, there have been very few systematic attempts to assess its logic with regard to the territorial expansion of states. In this paper, we test this key aspect of bellicist theory directly by aligning historical data on European state borders with conflict data, focusing on the period from 1490 through 1790. Proceeding at the systemic, state, and dyadic levels, our analysis confirms that warfare did in fact play a crucial role in the territorial expansion of European states before (and beyond) the French Revolution.