Showing papers by "Ivo F. Sbalzarini" published in 2018


Journal ArticleDOI
TL;DR: The Parallel Particle-Mesh Environment (PPME) is a DSL and development environment for numerical simulations based on particle methods and hybrid particle-mesh methods; it uses the Meta Programming System, a projectional language workbench.
Abstract: Domain-specific languages (DSLs) are of increasing importance in scientific high-performance computing to reduce development costs, raise the level of abstraction, and, thus, ease scientific programming. However, designing DSLs is not easy, as it requires knowledge of the application domain and experience in language engineering and compilers. Consequently, many DSLs follow a weak approach using macros or text generators, which lack many of the features that make a DSL comfortable for programmers. Some of these features—e.g., syntax highlighting, type inference, error reporting—are easily provided by language workbenches, which combine language engineering techniques and tools in a common ecosystem. In this article, we present the Parallel Particle-Mesh Environment (PPME), a DSL and development environment for numerical simulations based on particle methods and hybrid particle-mesh methods. PPME uses the Meta Programming System, a projectional language workbench. PPME is the successor of the Parallel Particle-Mesh Language, a Fortran-based DSL that uses conventional implementation strategies. We analyze and compare both languages and demonstrate how the programmer’s experience is improved using static analyses and projectional editing, i.e., code-structure editing, constrained by syntax, as opposed to free-text editing. We present an explicit domain model for particle abstractions and the first formal type system for particle methods.

14 citations


Journal ArticleDOI
TL;DR: The authors present a content-adaptive image representation as an alternative to standard pixels that goes beyond data compression to overcome storage, memory, and processing bottlenecks.
Abstract: Modern microscopes create a data deluge with gigabytes of data generated each second, and terabytes per day. Storing and processing this data is a severe bottleneck, not fully alleviated by data compression. We argue that this is because images are processed as grids of pixels. To address this, we propose a content-adaptive representation of fluorescence microscopy images, the Adaptive Particle Representation (APR). The APR replaces pixels with particles positioned according to image content. The APR overcomes storage bottlenecks, as data compression does, but additionally overcomes memory and processing bottlenecks. Using noisy 3D images, we show that the APR adaptively represents the content of an image while maintaining image quality and that it enables orders of magnitude benefits across a range of image processing tasks. The APR provides a simple and efficient content-aware representation of fluorescence microscopy images.

14 citations


Posted Content
TL;DR: An algorithm, termed PIP-SOLVER, is formulated based on a multivariate divided difference scheme; it generalizes the classic Newton interpolation from one-dimensional to arbitrary-dimensional spaces and computes the solution in $\mathcal{O}\big(N(m,n)^2\big)$ time using $\mathcal{O}\big(mN(m,n)\big)$ memory.
Abstract: For $m,n \in \mathbb{N}$, $m\geq 1$ and a given function $f : \mathbb{R}^m\longrightarrow \mathbb{R}$, the polynomial interpolation problem (PIP) is to determine a unisolvent node set $P_{m,n} \subseteq \mathbb{R}^m$ of $N(m,n):=|P_{m,n}|=\binom{m+n}{n}$ points and the uniquely defined polynomial $Q_{m,n,f}\in \Pi_{m,n}$ in $m$ variables of degree $\mathrm{deg}(Q_{m,n,f})\leq n \in \mathbb{N}$ that fits $f$ on $P_{m,n}$, i.e., $Q_{m,n,f}(p) = f(p)$, $\forall\, p \in P_{m,n}$. For $m=1$ the solution to the PIP is well known. In higher dimensions, however, no closed framework was available. We here present a generalization of the classic Newton interpolation from one-dimensional to arbitrary-dimensional spaces. Further we formulate an algorithm, termed PIP-SOLVER, based on a multivariate divided difference scheme that computes the solution $Q_{m,n,f}$ in $\mathcal{O}\big(N(m,n)^2\big)$ time using $\mathcal{O}\big(mN(m,n)\big)$ memory. Further, we introduce unisolvent Newton-Chebyshev nodes and show that these nodes avoid Runge's phenomenon in the sense that arbitrary periodic Sobolev functions $f \in H^k(\Omega,\mathbb{R}) \subsetneq C^0(\Omega,\mathbb{R})$, $\Omega =[-1,1]^m$ of regularity $k >m/2$ can be uniformly approximated, i.e., $ \lim_{n\rightarrow \infty}||\,f -Q_{m,n,f} \,||_{C^0(\Omega)}= 0$. Numerical experiments demonstrate the computational performance and approximation accuracy of the PIP-SOLVER in practice. We expect the presented results to be relevant for many applications, including numerical solvers, quadrature, non-linear optimization, polynomial regression, adaptive sampling, Bayesian inference, and spectral analysis.
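
For orientation, the sketch below shows the classic one-dimensional Newton divided-difference scheme that the paper generalizes, together with the coefficient count N(m, n) = C(m+n, n). It is an illustrative baseline only, not the PIP-SOLVER itself; the test function and nodes are arbitrary choices.

```python
from math import comb

def newton_divided_differences(xs, ys):
    """Classic 1D Newton interpolation coefficients (the m = 1 case
    that the PIP-SOLVER generalizes to m dimensions)."""
    n = len(xs)
    coef = list(ys)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef  # coefficients of the Newton form

def newton_eval(coef, xs, x):
    """Evaluate the Newton-form polynomial at x (Horner-like scheme)."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

def num_coefficients(m, n):
    """N(m, n) = C(m + n, n): number of monomials of degree <= n in m variables."""
    return comb(m + n, n)

# Interpolate f(x) = x^3 on 4 nodes and check exactness at x = 0.5.
xs = [-1.0, 0.0, 1.0, 2.0]
ys = [x**3 for x in xs]
coef = newton_divided_differences(xs, ys)
print(newton_eval(coef, xs, 0.5))   # 0.125
print(num_coefficients(3, 5))       # 56 coefficients for m = 3, n = 5
```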

11 citations


Book ChapterDOI
10 Jul 2018
TL;DR: This article proposes a quadratic-time solution of the Multivariate Polynomial Interpolation Problem (PIP) and formulates an algorithm for determining the N(m, n) Fourier coefficients with positive frequency of the Fourier series of f up to order n in the same amount of computational time and storage.
Abstract: In scientific computing, the problem of finding an analytical representation of a given function \(f: \Omega \subseteq \mathbb {R}^m \longrightarrow \mathbb {R},\mathbb {C}\) is ubiquitous. The most practically relevant representations are polynomial interpolation and Fourier series. In this article, we address both problems in high-dimensional spaces. First, we propose a quadratic-time solution of the Multivariate Polynomial Interpolation Problem (PIP), i.e., the N(m, n) coefficients of a polynomial Q, with \(\deg (Q)\le n\), uniquely fitting f on a determined set of generic nodes \(P\subseteq \mathbb {R}^m\) are computed in \(\mathcal {O}(N(m,n)^2)\) time requiring storage in \(\mathcal {O}(mN(m,n))\). Then, we formulate an algorithm for determining the N(m, n) Fourier coefficients with positive frequency of the Fourier series of f up to order n in the same amount of computational time and storage. Especially in high dimensions, this provides a fast Fourier interpolation, outperforming modern Fast Fourier Transform methods. We expect that these fast and scalable solutions of the polynomial and Fourier interpolation problems in high-dimensional spaces are going to influence modern computing techniques occurring in Big Data and Data Mining, Deep Learning, Image and Signal Analysis, Cryptography, and Non-linear Optimization.
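
To illustrate why quadratic scaling in N(m, n) matters in high dimensions, the short sketch below tabulates N(m, n) for fixed degree n = 5 and compares the O(N^2) operation count against a naive O(N^3) dense linear solve. The numbers are illustrative only; the paper's node construction and algorithm are not reproduced here.

```python
from math import comb

def N(m, n):
    """Number of coefficients of a degree-n polynomial in m variables."""
    return comb(m + n, n)

# Operation-count comparison for fixed degree n = 5 and growing dimension m.
for m in (2, 5, 10, 20):
    nodes = N(m, 5)
    print(f"m={m:2d}  N={nodes:6d}  N^2={nodes**2:.1e}  N^3={nodes**3:.1e}")
```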

10 citations


Posted ContentDOI
27 Feb 2018-bioRxiv
TL;DR: Open Microscopy Environment inteGrated Analysis (OMEGA) is presented: a cross-platform data management, analysis, and visualization system for particle tracking data, with particular emphasis on results from viral and vesicular trafficking experiments.

Abstract: MOTIVATION: Particle tracking coupled with time-lapse microscopy is critical for understanding the dynamics of intracellular processes of clinical importance. Spurred on by advances in the spatiotemporal resolution of microscopy and automated computational methods, this field is increasingly amenable to multi-dimensional high-throughput data collection schemes (Snijder et al, 2012). Typically, complex particle tracking datasets generated by individual laboratories are produced with incompatible methodologies that preclude comparison to each other. There is therefore an unmet need for data management systems that facilitate data standardization, meta-analysis, and structured data dissemination. The integration of analysis, visualization, and quality control capabilities into such systems would eliminate the need for manual transfer of data to diverse downstream analysis tools. At the same time, it would lay the foundation for shared trajectory data, particle tracking, and motion analysis standards. RESULTS: Here, we present Open Microscopy Environment inteGrated Analysis (OMEGA), a cross-platform data management, analysis, and visualization system for particle tracking data, with particular emphasis on results from viral and vesicular trafficking experiments. OMEGA provides easy-to-use graphical interfaces to implement integrated particle tracking and motion analysis workflows while keeping track of error propagation and data provenance. Specifically, OMEGA: 1) imports image data and metadata from data management tools such as Open Microscopy Environment Remote Objects (OMERO; Allan et al., 2012); 2) tracks intracellular particles moving across time series of image planes; 3) facilitates parameter optimization and trajectory results inspection and validation; 4) performs downstream trajectory analysis and motion type classification; 5) estimates the uncertainty associated with motion analysis; and 6) facilitates storage and dissemination of analysis results, and analysis definition metadata, on the basis of our newly proposed Minimum Information About Particle Tracking Experiments (MIAPTE; Rigano & Strambio-De-Castillia, 2016; 2017) guidelines in combination with the OME-XML data model (Goldberg et al, 2005).

8 citations


Book ChapterDOI
TL;DR: It is argued that modeling and computer simulation, combined with mechanistic insights, yields an unprecedentedly deep understanding of phenomena in biology, and especially in virus infections, by providing a way of showing the sufficiency of a hypothetical mechanism.
Abstract: An implicit aim in cellular infection biology is to understand the mechanisms by which viruses, microbes, eukaryotic parasites, and fungi usurp the functions of host cells and cause disease. Mechanistic insight is a deep understanding of the biophysical and biochemical processes that give rise to an observable phenomenon. It is typically subject to falsification, that is, it is accessible to experimentation and empirical data acquisition. This is different from logic and mathematics, which are not empirical, but built on systems of inherently consistent axioms. Here, we argue that modeling and computer simulation, combined with mechanistic insights, yields an unprecedentedly deep understanding of phenomena in biology, and especially in virus infections, by providing a way of showing the sufficiency of a hypothetical mechanism. This ideally complements, with additional positive evidence, the necessity statements accessible to empirical falsification. We discuss how computational implementations of mathematical models can assist and enhance the quantitative measurement of infection dynamics of enveloped and non-enveloped viruses and thereby help generate causal insights into virus infection biology.

7 citations


Proceedings Article
21 Jul 2018
TL;DR: In this paper, the authors propose to use the minimum-energy configurations of 38-atom Lennard-Jones (LJ38) clusters as a benchmark for real-valued, single-objective optimization.
Abstract: A common shortcoming in the Evolutionary Computation (EC) community is that the publication of many search heuristics is not accompanied by rigorous benchmarks on a balanced set of test problems. The IEEE CEC competitions on real-valued black-box optimization are a welcome effort to promote such test suites. These competitions prescribe carefully designed synthetic test functions and benchmarking protocols. They do not, however, contain tunable real-world examples of the important class of multi-funnel functions. We argue that finding minimum-energy configurations of 38-atom Lennard-Jones (LJ38) clusters could serve as such a benchmark for real-valued, single-objective evolutionary optimization. We thus suggest that this problem be included in EC studies whenever general-purpose optimizers are proposed. The problem is tunable from a single-funnel to a double-funnel topology. We show that the winner of the CEC 2005 competition, the Evolution Strategy with Covariance Matrix Adaptation (CMA-ES), works on the single-funnel version of this test case, but fails on the double-funnel version. We further argue that this performance loss of CMA-ES can be mitigated by using parallel island models. We support this hypothesis with simulation results of a parallel island CMA-ES, the Particle Swarm CMA-ES, on a subset of the multi-funnel functions in the CEC 2005 test suite.
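
As a rough illustration of the benchmark's objective function, the sketch below evaluates the standard Lennard-Jones pair-potential energy of a cluster from a flat coordinate vector, the way a black-box optimizer such as CMA-ES would query it. The benchmarking protocol, search bounds, and funnel tuning described in the paper are not reproduced here.

```python
import numpy as np

def lj_cluster_energy(x, epsilon=1.0, sigma=1.0):
    """Lennard-Jones potential energy of an atom cluster.

    x is a flat vector of 3*K Cartesian coordinates (K = 38 for LJ38).
    The energy is the sum over all atom pairs of
    4*epsilon*((sigma/r)**12 - (sigma/r)**6).
    """
    pos = np.asarray(x, dtype=float).reshape(-1, 3)
    energy = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            r = np.linalg.norm(pos[i] - pos[j])
            sr6 = (sigma / r) ** 6
            energy += 4.0 * epsilon * (sr6 * sr6 - sr6)
    return energy

# Evaluate the objective on a random 38-atom configuration, as a
# black-box optimizer would during the search.
rng = np.random.default_rng(0)
x0 = rng.uniform(-2.0, 2.0, size=38 * 3)
print(lj_cluster_energy(x0))
```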

3 citations


Posted ContentDOI
08 Aug 2018-bioRxiv
TL;DR: A novel, algorithm-centric Monte Carlo method is described for assessing the effect of experimental parameters, such as signal-to-noise ratio (SNR), particle detection error, trajectory length, and the diffusivity characteristics of the moving particle, on the uncertainty associated with motion-type classification.

Abstract: Quantitative analysis of microscopy images is ideally suited for understanding the functional biological correlates of individual molecular species identified by one of the several available 'omics' techniques. Due to advances in fluorescent labeling, microscopy engineering, and image processing, it is now possible to routinely observe and quantitatively analyze, at high temporal and spatial resolution, the real-time behavior of thousands of individual cellular structures as they perform their functional task inside living systems. Despite the central role of microscopic imaging in modern biology, unbiased inference, valid interpretation, scientific reproducibility, and results dissemination are hampered by the still prevalent need for subjective interpretation of image data and by the limited attention given to the quantitative assessment and reporting of the error associated with each measurement or calculation, and to its effect on downstream analysis steps (i.e., error propagation). One of the mainstays of bioimage analysis is single-particle tracking (SPT), which, coupled with the mathematical analysis of trajectories and with the interpretative modeling of motion modalities, is of key importance for the quantitative understanding of the heterogeneous intracellular dynamic behavior of fluorescently labeled individual cellular structures, vesicles, viral particles, and single molecules. Despite substantial advances, the evaluation of analytical error propagation through SPT and motion analysis pipelines is absent from most available tools (Sbalzarini, 2016).
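
As a hedged illustration of the kind of Monte Carlo uncertainty assessment summarized above (not the tool's actual implementation), the sketch below simulates Brownian trajectories with localization noise, estimates the diffusion coefficient from the mean squared displacement (MSD), and reports the spread of the estimate over repeated simulations; all parameter values are arbitrary.

```python
import numpy as np

def simulate_trajectory(n_steps, D, dt, loc_sigma, rng):
    """2D Brownian trajectory with Gaussian localization noise."""
    steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_steps, 2))
    true_pos = np.cumsum(steps, axis=0)
    return true_pos + rng.normal(0.0, loc_sigma, size=true_pos.shape)

def estimate_D(traj, dt, max_lag=4):
    """Estimate D from the slope of the MSD curve (MSD ~ 4*D*tau in 2D)."""
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
                    for lag in lags])
    slope = np.polyfit(lags * dt, msd, 1)[0]
    return slope / 4.0

# Monte Carlo repeats: the spread of the estimates quantifies the
# uncertainty introduced by trajectory length and localization error.
rng = np.random.default_rng(1)
estimates = [estimate_D(simulate_trajectory(50, D=0.1, dt=0.05,
                                            loc_sigma=0.03, rng=rng), dt=0.05)
             for _ in range(500)]
print(f"D_hat = {np.mean(estimates):.3f} +/- {np.std(estimates):.3f} (true D = 0.1)")
```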

3 citations


Posted ContentDOI
11 Feb 2018-bioRxiv
TL;DR: The APR provides a simple, extendable, and efficient content-aware representation of images that relaxes current data and processing bottlenecks and provides orders of magnitude benefits across a range of image processing tasks.
Abstract: Modern microscopy modalities create a data deluge with gigabytes of data generated each second, or terabytes per day. Storing and processing these data is a severe bottleneck, not fully alleviated by data compression. We argue that this is because images are processed as regular grids of pixels. To address the root of the problem, we here propose a content-adaptive representation of fluorescence microscopy images called the Adaptive Particle Representation (APR). The APR replaces the regular grid of pixels with particles positioned according to image content. This overcomes storage bottlenecks, as data compression does, but additionally overcomes memory and processing bottlenecks, since the APR can directly be used in processing without going back to pixels. We present the ideas, concepts, and algorithms of the APR and validate them using noisy 3D image data. We show that the APR represents the content of an image while maintaining image quality. We then show that the adaptivity of the APR provides orders of magnitude benefits across a range of image processing tasks. Therefore, the APR provides a simple, extendable, and efficient content-aware representation of images that relaxes current data and processing bottlenecks.
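
As a toy illustration of content-adaptive sampling (not the APR construction itself, which uses the error-controlled scheme described in the paper), the sketch below recursively subdivides a 2D image wherever the local intensity range exceeds a tolerance and stores one particle per homogeneous block, so that sampling density follows image content.

```python
import numpy as np

def adaptive_particles(img, tol):
    """Toy content-adaptive sampling: split a 2D image block recursively
    until its intensity range falls below tol, then emit one particle
    (block centre and mean intensity) per leaf block."""
    particles = []

    def split(y0, y1, x0, x1):
        block = img[y0:y1, x0:x1]
        if (block.max() - block.min() <= tol) or (y1 - y0 <= 1) or (x1 - x0 <= 1):
            particles.append(((y0 + y1) / 2.0, (x0 + x1) / 2.0, float(block.mean())))
            return
        ym, xm = (y0 + y1) // 2, (x0 + x1) // 2
        split(y0, ym, x0, xm)
        split(y0, ym, xm, x1)
        split(ym, y1, x0, xm)
        split(ym, y1, xm, x1)

    split(0, img.shape[0], 0, img.shape[1])
    return particles

# A flat image with one bright spot: particles concentrate near the spot.
img = np.zeros((64, 64))
img[20:28, 40:48] = 1.0
pts = adaptive_particles(img, tol=0.1)
print(f"{len(pts)} particles instead of {img.size} pixels")
```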

2 citations


Journal ArticleDOI
TL;DR: A reaction-diffusion spatiotemporal model involving processes in the membrane and in the cytosol is developed; parameter identification can reveal the mechanisms involved in the regulation of the Crb protein at different stages of embryonic development, and how morphogenesis affects these mechanisms.