
Showing papers by "IBM" published in 2023



Posted ContentDOI
15 May 2023
TL;DR: In this paper, the authors argue that the future of climate & sustainability research relies on networked/federated systems and present recent progress towards multi-cloud technologies that can scale federated geospatial discovery and modeling services across a network of nodes.
Abstract: The ballooning volume and complexity of geospatial data is one of the main inhibitors for advancements in climate & sustainability research. Oftentimes, researchers need to create bespoke and time-consuming workflows to harmonize datasets, build/deploy AI and simulation models, and perform statistical analysis. It is increasingly evident that these workflows and the underlying infrastructure are failing to scale and exploit the massive amounts of data (Peta and Exa-scale) which reside across multiple data centers and continents. While there have been attempts to consolidate relevant geospatial data and tooling into single cloud infrastructures, we argue that the future of climate & sustainability research relies on networked/federated systems. Here we present recent progress towards multi-cloud technologies that can scale federated geospatial discovery and modeling services across a network of nodes. We demonstrate how the system architecture and associated tooling can simplify the discovery and modeling process in multi-cloud environments via examples of federated analytics for AI-based flood detection and efficient data dissemination inspired by AI foundation models.
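As a rough, hedged illustration of the federated discovery pattern described above (not code from the paper; the node URLs, query parameters, and JSON response schema below are assumptions), a client can fan the same discovery query out to several catalogue nodes and merge the results:

# Minimal sketch of a federated discovery fan-out (illustrative only; the
# node URLs and JSON schema below are hypothetical, not from the paper).
from concurrent.futures import ThreadPoolExecutor

import requests

# Hypothetical catalogue endpoints of federated geospatial nodes.
NODES = [
    "https://node-eu.example.org/discover",
    "https://node-us.example.org/discover",
    "https://node-ap.example.org/discover",
]


def query_node(url: str, bbox: list, variable: str) -> list:
    """Send one discovery query to a single node and return its dataset list."""
    try:
        resp = requests.get(url, params={"bbox": ",".join(map(str, bbox)),
                                         "variable": variable}, timeout=30)
        resp.raise_for_status()
        return resp.json().get("datasets", [])
    except requests.RequestException:
        # A failing node should not break the federated query.
        return []


def federated_discover(bbox: list, variable: str) -> list:
    """Fan the same query out to all nodes and merge the results."""
    with ThreadPoolExecutor(max_workers=len(NODES)) as pool:
        results = pool.map(lambda u: query_node(u, bbox, variable), NODES)
    return [item for node_result in results for item in node_result]


if __name__ == "__main__":
    print(federated_discover(bbox=[-3.5, 50.0, 1.8, 55.0], variable="precipitation"))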

Book ChapterDOI
marieltlgx1
01 Jan 2023

Journal ArticleDOI
Matěj Bílý
28 Feb 2023

Book ChapterDOI
Henrik Krehenwinkel
01 Jan 2023

Book ChapterDOI
DR Vincent P M
01 Jan 2023

Posted ContentDOI
15 May 2023
TL;DR: In this article, the authors develop methods that guarantee physical constraints are satisfied by a deep learning downscaling model while also improving its performance according to traditional metrics, and compare different constraining approaches and demonstrate their applicability across different neural architectures.
Abstract: The availability of reliable, high-resolution climate and weather data is important to inform long-term decisions on climate adaptation and mitigation and to guide rapid responses to extreme events. Forecasting models are limited by computational costs and, therefore, often generate coarse-resolution predictions. Statistical downscaling can provide an efficient method of upsampling low-resolution data. In this field, deep learning has been applied successfully, often using image super-resolution methods from computer vision. However, despite achieving visually compelling results in some cases, such models frequently violate conservation laws when predicting physical variables. In order to conserve physical quantities, we develop methods that guarantee physical constraints are satisfied by a deep learning downscaling model while also improving their performance according to traditional metrics. We compare different constraining approaches and demonstrate their applicability across different neural architectures as well as a variety of climate and weather data sets, including ERA5 and WRF data sets.
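As a minimal sketch of the kind of hard constraint the abstract refers to (a generic post-hoc renormalisation, not the authors' constraint layers), each coarse cell's value can be conserved exactly by rescaling the corresponding block of the super-resolved field:

# Illustrative sketch of a hard conservation constraint for downscaling
# (a post-hoc renormalisation, not the authors' specific constraint layers).
# Multiplicative rescaling assumes a non-negative field such as precipitation.
import numpy as np


def enforce_mean_conservation(hi_res: np.ndarray, lo_res: np.ndarray, factor: int) -> np.ndarray:
    """Rescale each factor x factor block of hi_res so its mean equals lo_res."""
    h, w = lo_res.shape
    blocks = hi_res.reshape(h, factor, w, factor)
    block_means = blocks.mean(axis=(1, 3))                # current coarse-scale means
    ratio = lo_res / np.maximum(block_means, 1e-12)       # multiplicative correction
    constrained = blocks * ratio[:, None, :, None]        # apply per block
    return constrained.reshape(h * factor, w * factor)


# Toy check: block means of the constrained output match the coarse field.
lo = np.random.rand(4, 4)      # coarse field
hi = np.random.rand(16, 16)    # raw model prediction (4x upsampling)
out = enforce_mean_conservation(hi, lo, factor=4)
assert np.allclose(out.reshape(4, 4, 4, 4).mean(axis=(1, 3)), lo)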

Journal ArticleDOI
Tianyong Zhang
01 Jun 2023
TL;DR: The point-contact transistor was invented in 1947 by Bardeen and Brattain, and the sandwich-like BJT was invented by Shockley in 1948; it is Shockley’s sandwich-like device structure that is commonly referred to as a bipolar transistor.
Abstract: The point-contact transistor was invented in 1947 by Bardeen and Brattain [1], and the sandwich-like BJT was invented by Shockley in 1948 [2]. While point-contact transistors were made and used briefly for a few years, it is Shockley’s sandwich-like device structure that is commonly referred to as a bipolar transistor.

Book ChapterDOI
Jana Berberich
01 Jan 2023

Book ChapterDOI
Rosario Valenzuela Blásquez
01 Jan 2023

Journal ArticleDOI
Pablo Meyer
TL;DR: In this paper, the authors show that semantic descriptors of odors can be implemented in a model to successfully predict odor mixture discriminability, an olfactory attribute, by taking advantage of their structure-to-percept model.
Abstract: Language is often thought of as being poorly adapted to precisely describe or quantify smell and olfactory attributes. In this work, we show that semantic descriptors of odors can be implemented in a model to successfully predict odor mixture discriminability, an olfactory attribute. We achieved this by taking advantage of the structure-to-percept model we previously developed for monomolecular odorants, using chemical descriptors to predict pleasantness, intensity and 19 semantic descriptors such as “fish,” “cold,” “burnt,” “garlic,” “grass,” and “sweet” for odor mixtures, followed by a metric learning to obtain odor mixture discriminability. Through this expansion of the representation of olfactory mixtures, our Semantic model outperforms state-of-the-art methods by taking advantage of the intermediary semantic representations learned from human perception data to enhance and generalize the odor discriminability/similarity predictions. As 10 of the semantic descriptors were selected to predict discriminability/similarity, our approach meets the need for rapidly obtaining interpretable attributes of odor mixtures, as illustrated by the difficulty of finding olfactory metamers. More fundamentally, it also shows that language can be used to establish a metric of discriminability in the everyday olfactory space.
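A toy sketch of the general idea of learning a discriminability metric over semantic descriptors is given below; the data are synthetic and the simple weighted-difference metric is an assumption for illustration, not the authors' model:

# Toy sketch of learning a discriminability metric from semantic descriptors
# (synthetic data; illustrates the idea only, not the authors' model).
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_pairs, n_descriptors = 200, 19              # e.g. "fish", "cold", "burnt", ...

# Predicted semantic descriptor vectors for the two mixtures in each pair.
mix_a = rng.random((n_pairs, n_descriptors))
mix_b = rng.random((n_pairs, n_descriptors))

# A learned metric can be as simple as a weighted sum of absolute
# per-descriptor differences; here the "true" weights are synthetic.
true_w = rng.random(n_descriptors)
diff = np.abs(mix_a - mix_b)
discriminability = diff @ true_w + 0.05 * rng.standard_normal(n_pairs)

model = Ridge(alpha=1.0).fit(diff, discriminability)
print("learned weights (one per semantic descriptor):", model.coef_.round(2))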

Book ChapterDOI
Joel Sauza Bedolla
01 Jan 2023

Book ChapterDOI
Samuel Jonson Sutanto
01 Jan 2023

Book ChapterDOI
Momiao Xiong
01 Jan 2023

Book ChapterDOI
Fabo Feng
01 Jan 2023

Journal ArticleDOI
Ahmad Vasel-Be-Hagh
TL;DR: In this article, the authors take advantage of quantum computers to simulate the Hamiltonian evolution and thermal relaxation of two radical pair systems undergoing the quantum beats phenomenon, namely 9,10-octalin+/p-terphenyl-d14 (PTP)- and 2,3-dimethylbutane (DMB)+/p-terphenyl-d14 (PTP)-, with one and two groups of magnetically equivalent nuclei, respectively.
Abstract: Quantum dynamics of the radical pair mechanism is a major driving force in quantum biology, materials science, and spin chemistry. The rich quantum physical underpinnings of the mechanism are determined by a coherent oscillation (quantum beats) between the singlet and triplet spin states and their interactions with the environment, which is challenging to experimentally explore and computationally simulate. In this work, we take advantage of quantum computers to simulate the Hamiltonian evolution and thermal relaxation of two radical pair systems undergoing the quantum beats phenomenon. We study radical pair systems with nontrivial hyperfine coupling interactions, namely, 9,10-octalin+/p-terphenyl-d14 (PTP)- and 2,3-dimethylbutane (DMB)+/p-terphenyl-d14 (PTP)- with one and two groups of magnetically equivalent nuclei, respectively. Thermal relaxation dynamics in these systems are simulated using three methods: Kraus channel representations, noise models on Qiskit Aer and the inherent qubit noise present on the near-term quantum hardware. By leveraging the inherent qubit noise, we are able to simulate the noisy quantum beats in the two radical pair systems better than with any classical approximation or quantum simulator. While classical simulations of paramagnetic relaxation grow errors and uncertainties as a function of time, near-term quantum computers can match the experimental data throughout its time evolution, showcasing their unique suitability and future promise in simulating open quantum systems in chemistry.
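For readers unfamiliar with the tooling named above, the following is a minimal, generic Qiskit Aer sketch of running a two-qubit circuit under a depolarizing noise model; it is illustrative only and does not implement the radical-pair Hamiltonian, hyperfine couplings, or Kraus channels used in the paper:

# Generic sketch of a noisy two-qubit simulation with a Qiskit Aer noise model.
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator
from qiskit_aer.noise import NoiseModel, depolarizing_error

# Entangled two-qubit pair as a stand-in for the correlated electron spins.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Simple depolarizing noise on one- and two-qubit gates.
noise = NoiseModel()
noise.add_all_qubit_quantum_error(depolarizing_error(0.01, 1), ["h"])
noise.add_all_qubit_quantum_error(depolarizing_error(0.05, 2), ["cx"])

sim = AerSimulator(noise_model=noise)
counts = sim.run(transpile(qc, sim), shots=4096).result().get_counts()
print(counts)  # ideal '00'/'11' correlations are degraded by the noise channel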

Journal ArticleDOI
Michel Bonetti
30 Jan 2023

BookDOI
Heike Konow
01 Jan 2023

Posted ContentDOI
15 May 2023
TL;DR: The authors developed a pluvial/fluvial flood susceptibility model for England, using high quality open datasets (elevation, land use, soil type, location of water bodies, rainfall) to derive hydrologically-meaningful features, and an open flood inventory dataset to sample flooded/non-flooded points.
Abstract: Flooding is one of the most costly disasters in the UK, and its impact is projected to increase under climate change. Detailed, accurate and high resolution modelling and mapping of flood hazards are therefore essential to enable climate change adaptation. However, high resolution physics-based flood inundation models are extremely computationally intensive to run, presenting a challenge when mapping flood risk at the country scale, especially when working with ensembles of driving scenarios to account for uncertainty. Furthermore, efficient physical modelling for a target location and/or event requires a priori categorisation of dominant flood type (for example fluvial or pluvial), which determines the selection and configuration of appropriate models. In reality, floods at scales beyond a local level are often a combination of multiple flood types. In recent years, machine learning approaches to mapping flood susceptibility have grown in popularity, enabled by large volumes of geospatial and weather/climate data from which explanatory flood factors can be derived. In this study, we develop a pluvial/fluvial flood susceptibility model for England, using high quality open datasets (elevation, land use, soil type, location of water bodies, rainfall) to derive hydrologically-meaningful features, and an open flood inventory dataset to sample flooded/non-flooded points. We train and test the model with grouped cross-validation hyper-parameter tuning for repeated samples of the data on a regular grid, where testing is carried out on unseen grid squares. We discuss the relative performance of different machine learning algorithms, including Random Forest and XGBoost, and assess the computational intensity and scalability of the model across training and inference phases. We also consider the potential of machine learning approaches to provide uncertainty estimates and, via explainable AI techniques, the sensitivity of the predicted flood probability to explanatory flood factors at any given location. Finally, we reflect on the role the modelling approach can play within a range of tools to meet the needs of consumers of flood risk information across multiple economic sectors.
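A compact sketch of grouped cross-validation on grid-square groups, of the kind described above, is shown below; the features, labels, and model settings are synthetic placeholders, not the study's data or configuration:

# Sketch of grouped cross-validation for a flood susceptibility classifier
# (synthetic features; grid-square groups keep test cells unseen in training).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

rng = np.random.default_rng(42)
n_points = 2000
# Hypothetical hydrological features: elevation, slope, distance to water, rainfall...
X = rng.random((n_points, 5))
y = (X[:, 3] + 0.3 * rng.standard_normal(n_points) > 0.7).astype(int)  # flooded / not
grid_square = rng.integers(0, 25, size=n_points)   # regular-grid group labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5),
                         groups=grid_square, scoring="roc_auc")
print("AUC per fold (unseen grid squares):", scores.round(3))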

Journal ArticleDOI
09 Jan 2023-Leonardo
TL;DR: In this paper, an artificial intelligence-based design methodology based on permutations and neural networks is presented, in which elements are combined in all possible ways to form all possible design solutions and a neural network extracts the best solutions after being trained on either objective or subjective criteria.
Abstract: An artificial intelligence-based design methodology is presented based on permutations and neural networks. Elements are combined in all possible ways to form all possible design solutions and a neural network extracts the best solutions after being trained on either objective or subjective criteria. This methodology is projected to have many applications in fashion, architecture, music, storytelling, cooking, or any other design or art field that can be represented as a set of permutations.
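A toy sketch of the permutation-plus-scoring idea follows; the design elements, criteria, and network size are invented for illustration and are not taken from the paper:

# Enumerate all orderings of a few design elements and rank them with a model
# trained on synthetic preference scores (illustrative only).
from itertools import permutations

import numpy as np
from sklearn.neural_network import MLPRegressor

elements = ["A", "B", "C", "D"]                      # hypothetical design elements
all_designs = list(permutations(range(len(elements))))

rng = np.random.default_rng(1)
X = np.array(all_designs, dtype=float)
scores = X[:, 0] * 2 - X[:, -1] + rng.normal(0, 0.1, len(X))   # synthetic "criteria"

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, scores)

best = max(all_designs, key=lambda d: model.predict(np.array([d], dtype=float))[0])
print("best ordering:", [elements[i] for i in best])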

Posted ContentDOI
Anne E Jones, Liqun Yang
15 May 2023
TL;DR: In this paper, the authors describe the application of such a technology for the case of pluvial flooding, undertaken as part of the partnership between IBM Research and the Science and Technology Facilities Council (STFC), the Hartree National Centre for Digital Innovation (HNCDI), a 5-year programme established to develop and apply new technology to key economic challenges in the UK.
Abstract: Climate change is driving increased urgency for better quantification of climate hazards and their impacts for stakeholders across multiple economic sectors. Flooding has been highlighted as one of the most significant climate risks to UK economic infrastructure, with costs expected to increase with climate-driven changes to rainfall, such as increased intensity of summer storms. To accelerate climate change adaptation and enable economic resilience to climate change impacts, close collaboration is needed between climate scientists, impact modellers, and stakeholders, and technology advances can support this by enabling and streamlining the process of developing and deploying climate impact modelling workflows to translate complex datasets and scientific models into actionable information. In this presentation, we describe the application of such a technology for the case of pluvial flooding, undertaken as part of the partnership between IBM Research and the Science and Technology Facilities Council (STFC), the Hartree National Centre for Digital Innovation (HNCDI), a 5-year programme established to develop and apply new technology to key economic challenges in the UK. Here, we model pluvial flood hazard for a case study region in northeastern England, using a 2-d physical simulation model of flood inundation, driven by open-access geospatial and climate datasets. Flood hazard maps are translated to impact using open asset location data and damage functions. We consider the sensitivity and scalability (in terms of computational cost) of the hazard and impact predictions to multiple factors, including (1) the DEM/DSM representation of the land surface, (2) soil and land use parameterisation, and (3) model spatial resolution. We also contrast the use of drivers in the form of extreme rainfall scenarios created using a traditional design storm approach, and ensembles of synthetic storms from a stochastic weather generator, both derived from hourly 1km gridded rainfall observations. Finally, we reflect on key gaps to be addressed in the models, data and technology to meaningfully inform climate adaptation across industry sectors.
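As a small illustration of the final hazard-to-impact step mentioned above (translating flood depth to damage via a depth-damage function), the following sketch uses a made-up damage curve and asset value rather than the open datasets used in the study:

# Illustrative depth-damage translation (curve values and asset data are made up).
import numpy as np

# Hypothetical depth-damage curve: flood depth (m) -> fraction of asset value lost.
curve_depth = np.array([0.0, 0.1, 0.5, 1.0, 2.0, 4.0])
curve_damage = np.array([0.0, 0.05, 0.25, 0.45, 0.70, 0.95])


def asset_damage(depth_m: float, asset_value: float) -> float:
    """Interpolate the damage fraction at a given depth and scale by asset value."""
    fraction = np.interp(depth_m, curve_depth, curve_damage)
    return float(fraction * asset_value)


# Example: a 0.8 m inundation depth at an asset worth 250,000.
print(asset_damage(0.8, 250_000))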

Posted ContentDOI
Christoph Kittel
20 Mar 2023
TL;DR: In this paper, quantum machine learning was used to detect and classify ocular diseases across age-related macular degeneration, cataract, diabetic, glaucoma, hypertension, and pathological myopia categories versus a control group.
Abstract: In this paper we use quantum machine learning to detect and classify ocular diseases across age-related macular degeneration, cataract, diabetic, glaucoma, hypertension, and pathological myopia categories versus a control group. We analyze fundus imagery from 1000 patients. Early findings indicate there may be a benefit in terms of accuracy and loss function minimization of 2.07% and 1.979x, respectively, compared to a similar method implemented using traditional computers.
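As a generic illustration of quantum machine learning for classification (not the authors' circuit, and using toy two-dimensional features rather than fundus images), a quantum feature map can supply a kernel to a classical support vector classifier:

# Quantum-kernel classification sketch on toy data (illustrative only).
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit.quantum_info import Statevector, state_fidelity
from sklearn.svm import SVC

rng = np.random.default_rng(7)
X = rng.random((40, 2))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)          # toy "disease vs control" labels

feature_map = ZZFeatureMap(feature_dimension=2, reps=1)
states = [Statevector.from_instruction(feature_map.assign_parameters(x)) for x in X]

# Kernel matrix from pairwise state fidelities.
K = np.array([[state_fidelity(a, b) for b in states] for a in states])

clf = SVC(kernel="precomputed").fit(K, y)
print("training accuracy:", clf.score(K, y))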

Posted ContentDOI
15 May 2023
TL;DR: In this paper, a cloud-native modelling framework for running geospatial models in a flexible, scalable, configurable, user-friendly manner is presented, which enables models (physical or ML/AI) to be rapidly onboarded and composed into workflows.
Abstract: Understanding and quantifying the risk of the physical impacts of climate change and their subsequent consequences are of crucial importance in a changing climate for both businesses and society more widely. Historically, modelling workflows to assess such impacts have been bespoke and constrained by the data they can consume, the compute infrastructure, the expertise required to run them and the specific ways they are configured. Here we present a cloud-native modelling framework for running geospatial models in a flexible, scalable, configurable, user-friendly manner. This enables models (physical or ML/AI) to be rapidly onboarded and composed into workflows. These workflows can be flexible, dynamic and extendable, running for historical events or as forecast ensembles, with varying data inputs, or extended to model impact in the real world (for example, to infrastructure and populations). The framework supports the streamlined training and deployment of AI models, which can be seamlessly integrated with physical models to create hybrid workflows. We demonstrate the application and features of the framework for the examples of flooding and wildfire.
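A highly simplified sketch of the model-onboarding and workflow-composition idea is given below; the registry pattern, step names, and toy models are assumptions for illustration and do not reflect the framework's actual API:

# Generic step-registry pattern: models register as named steps and a workflow
# is an ordered list of step names applied to a shared context dictionary.
from typing import Callable, Dict

STEPS: Dict[str, Callable[[dict], dict]] = {}


def register(name: str):
    """Decorator that onboards a model/step under a name."""
    def wrap(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        STEPS[name] = fn
        return fn
    return wrap


@register("rainfall")
def rainfall(ctx: dict) -> dict:
    ctx["rain_mm"] = 42.0                     # stand-in for a data-ingest step
    return ctx


@register("flood_model")
def flood_model(ctx: dict) -> dict:
    ctx["depth_m"] = ctx["rain_mm"] / 100.0   # stand-in for a physical/AI model
    return ctx


def run_workflow(step_names, ctx=None):
    ctx = ctx or {}
    for name in step_names:
        ctx = STEPS[name](ctx)
    return ctx


print(run_workflow(["rainfall", "flood_model"]))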

Proceedings ArticleDOI
Sue Stover
01 Jan 2023
TL;DR: In this article, the authors cover recent industry developments in VCSEL-based transceivers that are designed for co-packaging on a first level package with ASICs, such as CPUs, GPUs, and data center switches.
Abstract: This paper covers recent industry developments in VCSEL-based transceivers that are designed for co-packaging on a first level package with ASICs, such as CPUs, GPUs, and data center switches.

Posted ContentDOI
Tsz Ho Chan
03 Jan 2023
TL;DR: In this article, it was shown that the cardinality of the continuous set of all reals is 𝔠 and not א1, i.e., that at least one infinite cardinal lies strictly between א0, the cardinality of the naturals, and 𝔠, the cardinality of the continuum.
Abstract: In this short paper, we provide a mathematical proof that in set theory, developed in a mathematical universe following the ZFC axioms, Cantor’s continuum hypothesis does not hold: the cardinality of the continuous set of all reals is 𝔠, and not א1, i.e., there is an infinity, א1 (and maybe more than one), between 𝔠, the cardinality of the continuum, and the cardinality of the infinite set of naturals, א0. The proof is derived from combinatorics, relying on ZFC solely for the model of Cantor and Gödel defining א0. It provides input to the still unresolved first of Hilbert’s famous 23 mathematical problems. This paper resolves the first of the 23 Hilbert problems by invalidating the continuum hypothesis.
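For reference, the standard statement of the continuum hypothesis that the abstract argues against can be written as follows (textbook notation, not taken from the preprint):

% Continuum hypothesis (CH): no set S satisfies |\mathbb{N}| < |S| < |\mathbb{R}|,
% equivalently
\[
  \mathfrak{c} \;=\; 2^{\aleph_0} \;=\; \aleph_1 .
\]
% The preprint asserts the negation, \aleph_1 < \mathfrak{c}, so that \aleph_1
% lies strictly between \aleph_0 and \mathfrak{c}.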

Book ChapterDOI
Fredrik Roggan
01 Jan 2023

Book ChapterDOI
Jonathas de Brito Balbino
01 Jan 2023

Book ChapterDOI
Toko
01 Jan 2023

Book ChapterDOI
Muhammad Mehar
01 Jan 2023
TL;DR: In this chapter, the authors investigate how far serverless computing can be extended to become Hybrid Serverless Computing, outline its definition, describe steps towards achieving it, and identify opportunities and challenges.
Abstract: In recent years, the adoption of serverless computing has surged due to the ease of deployment, attractive pay-per-use pricing, and transparent horizontal auto-scaling. At the same time, infrastructure advancements such as the emergence of 5G networks, the explosion of devices connected to the Internet known as the Internet of Things (IoT), as well as new application requirements that constrain where computation and data can happen, will expand the reach of Cloud computing beyond traditional data centers into the emergent Hybrid Cloud market that is predicted to expand to over a trillion dollars in the next few years. In Hybrid Cloud environments, driven by serverless tenants, there is an increased need to focus on enabling productive work for application builders using a distributed platform consisting of public clouds, private clouds, and edge systems. In this chapter we investigate how far serverless computing can be extended to become Hybrid Serverless Computing, outline its definition, describe steps towards achieving it, and identify opportunities and challenges.
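As a minimal concrete anchor for the serverless model discussed above (an Apache OpenWhisk-style Python action, chosen here only as an example; the chapter does not prescribe a particular runtime), a portable serverless unit can be as small as:

# Minimal Apache OpenWhisk-style Python action (an illustrative example of a
# portable serverless unit, not code from the chapter). The platform invokes
# main() with a JSON-derived dict and expects a JSON-serialisable dict back.
def main(params: dict) -> dict:
    name = params.get("name", "world")
    # The same handler can run on a public cloud, a private cloud, or an edge
    # node, which is the portability that Hybrid Serverless Computing targets.
    return {"greeting": f"Hello, {name}!"}


if __name__ == "__main__":
    # Local smoke test outside any serverless platform.
    print(main({"name": "hybrid cloud"}))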

Book ChapterDOI
01 Jan 2023