
Showing papers on "Point (geometry) published in 2010"


Journal ArticleDOI
A. Sampath1, Jie Shan1
TL;DR: An extended boundary regularization approach is developed based on multiple parallel and perpendicular line pairs to achieve topologically consistent and geometrically correct building models.
Abstract: This paper presents a solution framework for the segmentation and reconstruction of polyhedral building roofs from aerial LIght Detection And Ranging (lidar) point clouds. The eigenanalysis is first carried out for each roof point of a building within its Voronoi neighborhood. Such analysis not only yields the surface normal for each lidar point but also separates the lidar points into planar and nonplanar ones. In the second step, the surface normals of all planar points are clustered with the fuzzy k-means method. To optimize this clustering process, a potential-based approach is used to estimate the number of clusters, while considering both geometry and topology for the cluster similarity. The final step of segmentation separates the parallel and coplanar segments based on their distances and connectivity, respectively. Building reconstruction starts with forming an adjacency matrix that represents the connectivity of the segmented planar segments. A roof interior vertex is determined by intersecting all planar segments that meet at one point, whereas constraints in the form of vertical walls or boundary are applied to determine the vertices on the building outline. Finally, an extended boundary regularization approach is developed based on multiple parallel and perpendicular line pairs to achieve topologically consistent and geometrically correct building models. This paper describes the detailed principles and implementation steps for the aforementioned solution framework. Results for a number of buildings with diverse roof complexities are presented and evaluated.
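The first step, eigenanalysis of each point's neighborhood, can be sketched as follows. This is a generic PCA normal estimator, not the paper's Voronoi-neighborhood implementation; the function name and the planarity threshold are illustrative assumptions.

```python
import numpy as np

def normal_and_planarity(neighborhood):
    """Estimate a surface normal and a planarity score for a point's
    neighborhood via eigenanalysis of its 3x3 covariance matrix.
    The eigenvector of the smallest eigenvalue approximates the normal;
    a near-zero smallest-eigenvalue fraction indicates a planar patch."""
    pts = np.asarray(neighborhood, dtype=float)
    centered = pts - pts.mean(axis=0)
    cov = centered.T @ centered / len(pts)
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    normal = eigvecs[:, 0]
    planarity = eigvals[0] / eigvals.sum()  # ~0 for a perfectly planar patch
    return normal, planarity

# Points sampled from the plane z = 0 should give normal ~ (0, 0, +/-1).
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(50, 2))
pts = np.column_stack([xy, np.zeros(50)])
n, p = normal_and_planarity(pts)
```

Points whose planarity score exceeds a threshold would be treated as nonplanar and excluded from the subsequent normal clustering.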

364 citations


Journal ArticleDOI
TL;DR: An ℓ1-sparse method for the reconstruction of a piecewise smooth point set surface that consists mainly of smooth modes, with the residual of the objective function strongly concentrated near sharp features.
Abstract: We introduce an ℓ1-sparse method for the reconstruction of a piecewise smooth point set surface. The technique is motivated by recent advancements in sparse signal reconstruction. The assumption underlying our work is that common objects, even geometrically complex ones, can typically be characterized by a rather small number of features. This, in turn, naturally lends itself to incorporating the powerful notion of sparsity into the model. The sparse reconstruction principle gives rise to a reconstructed point set surface that consists mainly of smooth modes, with the residual of the objective function strongly concentrated near sharp features. Our technique is capable of recovering orientation and positions of highly noisy point sets. The global nature of the optimization yields a sparse solution and avoids local minima. Using an interior-point log-barrier solver with a customized preconditioning scheme, the solver for the corresponding convex optimization problem is competitive and the results are of high quality.
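The core ℓ1 machinery can be illustrated with a generic solver sketch. The paper uses an interior-point log-barrier method with custom preconditioning; the iterative soft-thresholding (ISTA) routine below is a much simpler stand-in that solves the same kind of ℓ1-regularized least-squares problem, shown here on a trivial identity system where it reduces to soft-thresholding.

```python
import numpy as np

def ista(A, b, lam, steps=500):
    """Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    A generic l1 solver sketch, not the paper's interior-point solver."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        g = A.T @ (A @ x - b)           # gradient of the smooth data term
        z = x - g / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # shrink
    return x

# With A = I the minimizer is soft-thresholding of b at lam: small entries
# are zeroed out, which is the sparsity-promoting effect the paper exploits.
A = np.eye(5)
b = np.array([3.0, -2.0, 0.05, 0.0, 1.0])
x_hat = ista(A, b, lam=0.1)
```

In the surface setting, `A` would encode smooth surface modes and `b` the noisy point data, so the residual concentrates at the few sharp features.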

200 citations


Patent
Kuwamura Shin Ya1
18 Feb 2010
TL;DR: In this article, a second virtual machine monitors at least one first virtual machine that is created on the computer and executes one or more application programs, periodically storing the state of the first VM as a snapshot, and restoring the VM to its state at the point in time when the snapshot was stored by using the snapshot of the suspended VM.

Abstract: In one computer system, a second virtual machine, which executes antivirus software for detecting and removing viruses, monitors at least one first virtual machine that is created on the computer and executes one or more application programs. The state of the first virtual machine is periodically stored as a snapshot. If the antivirus software executed on the second virtual machine detects a virus, the first virtual machine in which the virus was detected is suspended and then restored, using its snapshot, to its state at the point in time when the snapshot was stored.
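The snapshot-and-rollback scheme can be modeled in a few lines. This is a toy in-memory sketch of the patent's idea; the `VirtualMachine` class, its state dictionary, and the `monitor` function are all illustrative inventions, not part of the patent.

```python
import copy

class VirtualMachine:
    """Toy guest VM: a mutable state plus a stored snapshot of it."""
    def __init__(self):
        self.state = {"files": [], "suspended": False}
        self._snapshot = None

    def take_snapshot(self):
        """Periodic clean-state snapshot taken by the monitor."""
        self._snapshot = copy.deepcopy(self.state)

    def restore_snapshot(self):
        """Roll the VM back to the snapshotted state and resume it."""
        if self._snapshot is not None:
            self.state = copy.deepcopy(self._snapshot)
            self.state["suspended"] = False

def monitor(vm, is_infected):
    """Second-VM monitor step: suspend and roll back on virus detection."""
    if is_infected(vm.state):
        vm.state["suspended"] = True   # suspend the infected guest
        vm.restore_snapshot()          # restore it from the clean snapshot

vm = VirtualMachine()
vm.state["files"] = ["app.exe"]
vm.take_snapshot()                      # clean state captured
vm.state["files"].append("virus.exe")   # infection occurs after the snapshot
monitor(vm, lambda s: "virus.exe" in s["files"])
```

After the monitor runs, the guest is back at its last clean state and no longer suspended.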

180 citations


Journal ArticleDOI
TL;DR: In this article, the authors review theoretical and experimental works that focus on shape selection in non-Euclidean plates, provide an overview of this new field, and point out open questions in the field and its applicative potential.

Abstract: Non-Euclidean plates are plates (“stacks” of identical surfaces) whose two-dimensional intrinsic geometry is not Euclidean, i.e. cannot be realized in a flat configuration. They can be generated via different mechanisms, such as plastic deformation, natural growth or differential swelling. In recent years there has been concurrent theoretical and experimental progress in describing and fabricating non-Euclidean plates (NEP). In particular, an effective plate theory was derived and experimental methods for a controlled fabrication of responsive NEP were developed. In this paper we review theoretical and experimental works that focus on shape selection in NEP and provide an overview of this new field. We have made an effort to focus on the governing principles rather than on details, and to relate the main observations to the known mechanical behavior of ordinary plates. We also point out open questions in the field and its applicative potential.

155 citations


01 Jan 2010
TL;DR: This work compares and contrast from a geometric perspective a number of low-dimensional signal models that support stable information-preserving dimensionality reduction, and points out a common misconception related to probabilistic compressible signal models.
Abstract: We compare and contrast from a geometric perspective a number of low-dimensional signal models that support stable information-preserving dimensionality reduction. We consider sparse and compressible signal models for deterministic and random signals, structured sparse and compressible signal models, point clouds, and manifold signal models. Each model has a particular geometrical structure that enables signal information to be stably preserved via a simple linear and nonadaptive projection to a much lower dimensional space; in each case the projection dimension is independent of the signal's ambient dimension at best or grows logarithmically with it at worst. As a bonus, we point out a common misconception related to probabilistic compressible signal models by showing that the oft-used generalized Gaussian and Laplacian models do not support stable linear dimensionality reduction.
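The "simple linear and nonadaptive projection" can be demonstrated with a random Gaussian matrix applied to sparse signals, in the spirit of Johnson-Lindenstrauss/compressed-sensing embeddings. A minimal sketch, with dimensions chosen arbitrarily for illustration:

```python
import numpy as np

def random_projection(X, k, rng):
    """Project rows of X from N dimensions down to k with a random
    Gaussian matrix scaled by 1/sqrt(k); for sparse, compressible, or
    manifold models such nonadaptive maps preserve pairwise distances
    up to a small distortion with high probability."""
    N = X.shape[1]
    Phi = rng.standard_normal((k, N)) / np.sqrt(k)
    return X @ Phi.T

rng = np.random.default_rng(0)
X = np.zeros((20, 1000))
idx = rng.integers(0, 1000, size=(20, 5))
for i in range(20):                       # twenty 5-sparse signals in R^1000
    X[i, idx[i]] = rng.standard_normal(5)
Y = random_projection(X, k=200, rng=rng)

# Pairwise distance before and after projection stays within a modest factor.
d_orig = np.linalg.norm(X[0] - X[1])
d_proj = np.linalg.norm(Y[0] - Y[1])
```

The projection dimension here depends on the sparsity level and the distortion target, not on the ambient dimension 1000, which is the point of the geometric argument.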

153 citations


01 Jan 2010
TL;DR: An implementation of the Radiosity lighting model is described along with the issues involved in combining it with the algorithms described in this dissertation, and two different approaches to computing the PVS for a cell are explored.
Abstract: Pre-processing some building models can radically reduce the number of polygons processed during interactive building walkthroughs. New model-space subdivision and potentially visible set (PVS) calculation techniques, used in combination, reduce the number of polygons processed in a real building model by an average factor of 30, and a worst case factor of at least 3.25. A method of recursive model-space subdivision using binary space partitioning is presented. Heuristics are developed to guide the choice of splitting planes. The spatial subdivisions resulting from binary space partitioning are called cells. Cells correspond roughly to rooms. An observer placed in a cell may see features exterior to the cell through transparent portions of the cell boundary called portals. Computing the polygonal definitions of the portals is cast as a problem of computing a set difference operation on co-planar polygons. A plane-sweep algorithm to compute the set operations, union, intersection and difference, on co-planar sets of polygons is presented with an emphasis on handling real-world data. Two different approaches to computing the PVS for a cell are explored. The first uses point sampling and has the advantage that it is easy to trade time for results, but has the disadvantage of under-estimating the PVS. The second approach is to analytically compute a conservative over-estimation of the PVS using techniques similar to analytical shadow computation. An implementation of the Radiosity lighting model is described along with the issues involved in combining it with the algorithms described in this dissertation.
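The portal computation is cast as a set difference on co-planar polygons solved by a plane sweep. In one dimension the same sweep idea reduces to subtracting opaque spans from a wall span, which is easy to sketch; the `interval_difference` helper below is an illustrative 1D analog, not the dissertation's polygon algorithm.

```python
def interval_difference(wall, opaque):
    """1D analog of the portal computation: the portals on a cell boundary
    are the set difference between the boundary span and the opaque
    (wall-polygon) spans, found by sweeping over sorted endpoints."""
    portals, cursor = [], wall[0]
    for a, b in sorted(opaque):
        a, b = max(a, wall[0]), min(b, wall[1])
        if a > cursor:                 # gap before this opaque span
            portals.append((cursor, a))
        cursor = max(cursor, b)
    if cursor < wall[1]:               # gap after the last opaque span
        portals.append((cursor, wall[1]))
    return portals

# A 10 m cell boundary with solid wall from 0-3 m and 5-10 m leaves one
# doorway (portal) between x = 3 and x = 5.
portals = interval_difference((0.0, 10.0), [(0.0, 3.0), (5.0, 10.0)])
```

In the 2D co-planar polygon case the sweep must additionally handle degenerate real-world data (shared edges, slivers), which the dissertation emphasizes.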

114 citations


Journal ArticleDOI
TL;DR: In this article, a parametric level set method for reconstruction of obstacles in general inverse problems is considered, where the level set function is parameterized in terms of adaptive compactly supported radial basis functions, which provide flexibility in representing a larger class of shapes with fewer terms.

Abstract: In this paper, a parametric level set method for reconstruction of obstacles in general inverse problems is considered. General evolution equations for the reconstruction of unknown obstacles are derived in terms of the underlying level set parameters. We show that using the appropriate form of parameterizing the level set function results in a significantly lower-dimensional problem, which bypasses many difficulties with traditional level set methods, such as regularization, re-initialization and use of signed distance functions. Moreover, we show that from a computational point of view, the low-order representation of the problem paves the path for easier use of Newton and quasi-Newton methods. Specifically for the purposes of this paper, we parameterize the level set function in terms of adaptive compactly supported radial basis functions, which, used in the proposed manner, provide flexibility in representing a larger class of shapes with fewer terms. They also provide a "narrow-banding" advantage which can further reduce the number of active unknowns at each step of the evolution. The performance of the proposed approach is examined in three examples of inverse problems, i.e., electrical resistance tomography, X-ray computed tomography and diffuse optical tomography.
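A parametric level set built from compactly supported RBFs can be sketched directly. The Wendland C2 kernel and the 0.5 threshold below are illustrative choices (the paper's specific basis and threshold convention may differ); the key point is that the shape is described by a handful of centers and weights rather than a dense grid.

```python
import numpy as np

def wendland(r):
    """Compactly supported Wendland C2 RBF: (1-r)^4 (4r+1) for r < 1,
    and exactly 0 outside, which gives the 'narrow-banding' advantage."""
    r = np.clip(r, 0.0, 1.0)
    return (1 - r) ** 4 * (4 * r + 1)

def level_set(points, centers, weights, support):
    """Parametric level set phi(x) = sum_i w_i * psi(|x - c_i| / s);
    the reconstructed obstacle is the region where phi exceeds a
    threshold (here 0.5), so only centers/weights evolve."""
    d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
    return wendland(d / support) @ weights

centers = np.array([[0.0, 0.0], [2.0, 0.0]])   # two RBF centers
weights = np.array([1.0, 1.0])
pts = np.array([[0.0, 0.0], [2.0, 0.0], [5.0, 5.0]])
phi = level_set(pts, centers, weights, support=1.0)
inside = phi > 0.5                              # obstacle membership
```

Evolving only `centers` and `weights` is what makes Newton-type updates tractable compared with grid-based level sets.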

105 citations


Journal ArticleDOI
TL;DR: A roadmap algorithm for generating collision-free paths in terms of cubic B-spline curves for unmanned vehicles used in mining operations and allows us to find a switch back point where the vehicle reverses its direction to enter the loading area.
Abstract: In this paper we introduce a roadmap algorithm for generating collision-free paths in terms of cubic B-spline curves for unmanned vehicles used in mining operations. The algorithm automatically generates collision-free paths that are curvature continuous with an upper bounded curvature and a small slope discontinuity of curvature at knots, when we are given the locations of the obstacles, the boundary geometry of the working area, positions and directions of the vehicle at the start, loading, and the goal points. Our algorithm also allows us to find a switch back point where the vehicle reverses its direction to enter the loading area. Examples are provided to demonstrate the effectiveness of the proposed algorithms.
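The curvature-continuity claim rests on cubic B-splines being C2 inside each segment. A minimal evaluation sketch using the standard uniform cubic B-spline matrix form (an assumption — the paper's splines may use non-uniform knots):

```python
import numpy as np

# Uniform cubic B-spline basis matrix (row order: t^3, t^2, t, 1).
M = np.array([[-1, 3, -3, 1],
              [3, -6, 3, 0],
              [-3, 0, 3, 0],
              [1, 4, 1, 0]]) / 6.0

def bspline_point(ctrl, t):
    """Evaluate one uniform cubic B-spline segment at t in [0, 1] from four
    control points. Such curves are C2 within a segment, which yields the
    continuous, upper-bounded curvature the path planner relies on, with
    only a small slope discontinuity of curvature possible at knots."""
    T = np.array([t ** 3, t ** 2, t, 1.0])
    return T @ M @ ctrl

ctrl = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 2.0], [3.0, 0.0]])
p0 = bspline_point(ctrl, 0.0)   # segment start: (P0 + 4*P1 + P2) / 6
```

A full roadmap path would chain many such segments between the start, loading, and goal poses while checking obstacle clearance.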

104 citations


Patent
19 Aug 2010
TL;DR: In this article, a non-contacting measurement apparatus for measuring distances, angles, and related geometric quantities, and for computing other quantities based on the measurements, is provided, where a user can point the device at one or more points to which the distance is measured, and the angular rotation between the various points of interest can be measured and recorded.

Abstract: A non-contacting measurement apparatus for measuring distances, angles and related geometric quantities, and for computing other quantities based on the measurements, is provided. A visible light beam or ultrasonic beam allows a user to point the device at one or more points to which the distance is measured, and the angular rotation between the various points of interest can be measured and recorded. Then, geometric and trigonometric relationships are used to compute and display lengths, areas, volumes or other facts derived from the measurements. Various input and output features are provided in the present embodiments.
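The basic trigonometric relationship is the law of cosines: two measured ranges plus the swept angle give the distance between the sighted points. A minimal sketch (the function name is illustrative, not from the patent):

```python
import math

def distance_between_targets(d1, d2, angle_deg):
    """Distance between two sighted points, given the measured ranges d1
    and d2 from the device and the angular rotation between the two
    sightings (law of cosines)."""
    a = math.radians(angle_deg)
    return math.sqrt(d1 ** 2 + d2 ** 2 - 2 * d1 * d2 * math.cos(a))

# Two wall corners sighted at 3 m and 4 m with a 90-degree sweep between
# them are 5 m apart.
width = distance_between_targets(3.0, 4.0, 90.0)
```

Areas and volumes follow by composing such derived lengths, e.g. multiplying a computed width by a measured height.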

103 citations


Journal ArticleDOI
TL;DR: This letter proposes a method based on local feature point extraction using Gabor filters to detect the urban area using an optimal decision-making approach on the vote distribution and test the method on a diverse panchromatic aerial and Ikonos satellite image set.
Abstract: Automatically detecting and monitoring urban regions is an important problem in remote sensing. Very high resolution aerial and satellite images provide valuable information to solve this problem. However, they are not sufficient alone for two main reasons. First, a human expert should analyze these very large images. There may be some errors in the operation. Second, the urban area is dynamic. Therefore, detection should be done periodically, and this is time consuming. To handle these shortcomings, an automated system is needed to detect the urban area from aerial and satellite images. In this letter, we propose such a method based on local feature point extraction using Gabor filters. We use these local feature points to vote for the candidate urban areas. Then, we detect the urban area using an optimal decision-making approach on the vote distribution. We test our method on a diverse panchromatic aerial and Ikonos satellite image set. Our test results indicate the possible use of our method in practical applications.
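The local features come from Gabor filters, whose responses peak on oriented structures such as building edges. A minimal kernel-construction sketch (parameter values are illustrative, not the letter's tuned settings):

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real part of a Gabor filter: a sinusoid along orientation theta
    windowed by an isotropic Gaussian. Convolving an image with a bank
    of such kernels at several orientations highlights oriented edges."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)     # rotated coordinate
    envelope = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

k = gabor_kernel(size=21, wavelength=8.0, theta=0.0, sigma=4.0)
# In the method, local maxima of the multi-orientation filter responses
# become feature points that vote for candidate urban areas.
```

The decision step then thresholds the spatial vote distribution to delineate the urban region.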

95 citations


Journal ArticleDOI
TL;DR: The mapping of structure sets to fingerprint space could become a new paradigm for studying crystal-structure ensembles and global chemical features of the energy landscape.
Abstract: The initial aim of the crystal fingerprint project was to solve a very specific problem: to classify and remove duplicate crystal structures from the results generated by the evolutionary crystal-structure predictor USPEX. These duplications decrease the genetic diversity of the population used by the evolutionary algorithm, potentially leading to stagnation and, after a certain time, reducing the likelihood of predicting essentially new structures. After solving the initial problem, the approach led to unexpected discoveries: unforeseen correlations, useful derived quantities and insight into the structure of the overall set of results. All of these were facilitated by the project's underlying idea: to transform the structure sets from the physical configuration space to an abstract, high-dimensional space called the fingerprint space. Here every structure is represented as a point whose coordinates (fingerprint) are computed from the crystal structure. Then the space's distance measure, interpreted as structure `closeness', enables grouping of structures into similarity classes. This model provides much flexibility and facilitates access to knowledge and algorithms from fields outside crystallography, e.g. pattern recognition and data mining. The current usage of the fingerprint-space model is revealing interesting properties that relate to chemical and crystallographic attributes of a structure set. For this reason, the mapping of structure sets to fingerprint space could become a new paradigm for studying crystal-structure ensembles and global chemical features of the energy landscape.
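The duplicate-removal use case reduces to a distance measure in fingerprint space plus grouping into similarity classes. The sketch below uses a cosine-type distance and a greedy grouping; both are illustrative stand-ins, not USPEX's exact fingerprint definition or threshold.

```python
import numpy as np

def fingerprint_distance(f1, f2):
    """Cosine-type distance between two structure fingerprints; a
    near-zero distance flags two structures as duplicates."""
    f1, f2 = np.asarray(f1, float), np.asarray(f2, float)
    cos = f1 @ f2 / (np.linalg.norm(f1) * np.linalg.norm(f2))
    return 0.5 * (1.0 - cos)    # 0 = same direction, 1 = opposite

def group_duplicates(fingerprints, tol=1e-3):
    """Greedy grouping of fingerprint-space points into similarity
    classes: each fingerprint joins the first class whose representative
    is within tol, else starts a new class."""
    classes = []
    for f in fingerprints:
        for cls in classes:
            if fingerprint_distance(f, cls[0]) < tol:
                cls.append(f)
                break
        else:
            classes.append([f])
    return classes

fps = [np.array([1.0, 0.0, 2.0]),
       np.array([1.001, 0.0, 2.0]),   # near-duplicate of the first
       np.array([0.0, 3.0, 0.0])]     # a genuinely different structure
classes = group_duplicates(fps)
```

Keeping only one representative per class preserves the genetic diversity of the evolutionary population.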

Journal ArticleDOI
TL;DR: A novel approach named the scaling iterative closest point (SICP) algorithm which integrates a scale matrix with boundaries into the original ICP algorithm for scaling registration of m-D point sets is introduced.

01 Jan 2010
TL;DR: In this paper, the authors consider the calculations of three-point functions at finite temperature and show that the Imaginary-Time and Real-Time finite temperature formalisms calculate expectation values of retarded and time-ordered three-point functions respectively.

Abstract: We consider the calculations of three-point functions at finite temperature as they are usually performed in the literature. We show that as normally used, the Imaginary-Time and the Real-Time finite temperature formalisms calculate expectation values of retarded and time-ordered three-point functions respectively. We also present a relation between these quantities that shows that they are not generally equal. In the past in finite temperature field theory, attention has focused largely on two-point functions [1, 2]. Recently there has been increasing interest in

Journal ArticleDOI
TL;DR: In this paper, the authors simplify and extend previous work on three-point functions in Vasiliev's higher spin gauge theory in AdS4, and find complete agreement of the tree level three point functions of higher spin currents with the conjectured dual free O(N) vector theory.
Abstract: In this paper we simplify and extend previous work on three-point functions in Vasiliev's higher spin gauge theory in AdS4. We work in a gauge in which the space-time dependence of Vasiliev's master fields is gauged away completely, leaving only the internal twistor-like variables. The correlation functions of boundary operators can be easily computed in this gauge. We find complete agreement of the tree level three point functions of higher spin currents in Vasiliev's theory with the conjectured dual free O(N) vector theory.

Journal ArticleDOI
TL;DR: In this article, a stable MPC formulation for constrained linear systems with several practical properties is developed for this scenario, and the concept of distance from a point to a set is exploited to propose an additional cost term, which ensures both recursive feasibility and local optimality.

Book ChapterDOI
08 Nov 2010
TL;DR: The idea is to say that the 2D-3D map must be everywhere a local isometry, which induces conditions on the Jacobian matrix of the map which are included in a least-squares minimization problem.
Abstract: We present different approaches to reconstructing an inextensible surface from point correspondences between an input image and a template image representing a flat reference shape from a frontoparallel point of view. We first propose a 'point-wise' method, i.e. a method that only retrieves the 3D positions of the point correspondences. This method is formulated as a second-order cone program and it handles inaccuracies in the point measurements. It relies on the fact that the Euclidean distance between two 3D points must be shorter than their geodesic distance (which can easily be computed from the template image). We then present an approach that reconstructs a smooth 3D surface based on Free-Form Deformations. The surface is represented as a smooth map from the template image space to the 3D space. Our idea is to say that the 2D-3D map must be everywhere a local isometry. This induces conditions on the Jacobian matrix of the map, which are included in a least-squares minimization problem.
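The constraint behind the point-wise method is easy to check numerically: a reconstruction is feasible only if no Euclidean distance between 3D points exceeds the corresponding geodesic (template) distance. A minimal feasibility-check sketch (the actual method solves a second-order cone program over these constraints; the helper name is illustrative):

```python
import numpy as np

def satisfies_inextensibility(P3d, geodesic):
    """The key constraint of the 'point-wise' method: the Euclidean
    distance between any two reconstructed 3D points may not exceed
    their geodesic distance from the template, since the surface
    cannot stretch."""
    n = len(P3d)
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(P3d[i] - P3d[j]) > geodesic[i, j] + 1e-9:
                return False
    return True

# Three collinear template points one unit apart: geodesic distances 1, 1, 2.
geo = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]], float)
flat = np.array([[0.0, 0, 0], [1.0, 0, 0], [2.0, 0, 0]])
bent = np.array([[0.0, 0, 0], [1.0, 0, 0],
                 [1.0 + np.cos(0.5), np.sin(0.5), 0.0]])  # folded at point 1
feasible_bent = satisfies_inextensibility(bent, geo)          # bending is fine
feasible_stretched = satisfies_inextensibility(flat * 1.1, geo)  # stretching is not
```

Bending shortens Euclidean distances relative to geodesic ones, so folds are feasible while any stretched configuration is rejected.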

Patent
04 Jan 2010
TL;DR: In this article, an electronic stylus emits an excitation signal to apply to a trace of a capacitive touchpad module near a touch point when the stylus touches the touchpad, so as to change a waveform of a charging/discharging signal in the trace.
Abstract: An electronic stylus emits an excitation signal to apply to a trace of a capacitive touchpad module near a touch point when the electronic stylus touches the capacitive touchpad module, so as to change a waveform of a charging/discharging signal in the trace, and depending on the waveform variation, the capacitive touchpad module can identify the touch point.

Journal ArticleDOI
TL;DR: A robust algorithm for estimating visibility from a given viewpoint for a point set containing concavities, non-uniformly spaced samples, and possibly corrupted with noise is presented.

Journal ArticleDOI
TL;DR: The results show that the ES-PIM can provide stiffness much closer to exact than both the “overly-stiff” finite element method (FEM) and the “overly-soft” node-based smoothed point interpolation method (NS-PIM).

Patent
10 Feb 2010
TL;DR: In this article, the registration of a two-dimensional image data set and a 3D image comprising point cloud data is performed by cropping a volume of point clouds to remove a portion of the ground surface within a scene, and dividing the volume into a plurality of m sub-volumes.
Abstract: Method and system for registration of a two-dimensional image data set and a three-dimensional image comprising point cloud data. The method begins by cropping a three-dimensional volume of point cloud data comprising three-dimensional image data to remove a portion of the point cloud data comprising a ground surface within a scene, and dividing the three-dimensional volume into a plurality of m sub-volumes. Thereafter, the method continues by edge-enhancing the two-dimensional image data. Then, for each qualifying sub-volume: creating a filtered density image, calculating a two-dimensional correlation surface based on the filtered density image and the two-dimensional image data that has been edge enhanced, finding a peak of the two-dimensional correlation surface, determining a corresponding location of the peak within the two-dimensional image, defining a correspondence point set, and storing the correspondence point set in a point set list. Finally, a transformation is determined that minimizes the error between a plurality of the correspondence point sets contained in the point set list.
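The per-sub-volume correlation step can be sketched as a brute-force 2D cross-correlation whose peak location yields one correspondence point. This toy version uses a plain dot-product score on tiny arrays; the patent's filtered density images and edge enhancement are not reproduced.

```python
import numpy as np

def correlation_peak(density_img, edge_img):
    """Slide a sub-volume's (filtered) density image over the
    edge-enhanced 2D image, score each offset by the dot product, and
    return the offset of the correlation peak. That peak location
    defines one correspondence point for the registration transform."""
    dh, dw = density_img.shape
    eh, ew = edge_img.shape
    best, best_rc = -np.inf, (0, 0)
    for r in range(eh - dh + 1):
        for c in range(ew - dw + 1):
            patch = edge_img[r:r + dh, c:c + dw]
            score = np.sum(patch * density_img)
            if score > best:
                best, best_rc = score, (r, c)
    return best_rc

edge = np.zeros((8, 8))
edge[3:5, 4:6] = 1.0          # a bright structure in the 2D image
template = np.ones((2, 2))    # density image of one sub-volume
peak = correlation_peak(template, edge)
```

Repeating this over all qualifying sub-volumes fills the point set list from which the final error-minimizing transformation is estimated.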

01 Jan 2010
TL;DR: Analysis of the design thoughts of an architect from the following point of view suggests that design sketches serve not only as external memory or as a provider of visual cues for association of non-visual information, but also as a physical setting in which design thoughts are constructed on the fly.
Abstract: Design sketches are believed to play essential roles in early conceptual design processes. Exploration of how sketches are essential for the formation of new design ideas is expected to bring important implications for design education and design support systems. Little research has been done, however, to empirically examine the ways in which designers cognitively interact with their own sketches. Using a protocol analysis technique, we examined the design thoughts of an architect from the following point of view; how he drew depictions, inspected depicted elements, perceived visuo-spatial features, and thought of non-visual functional or conceptual information. The findings suggest that design sketches serve not only as external memory or as a provider of visual cues for association of non-visual information, but also as a physical setting in which design thoughts are constructed on the fly.

Posted Content
TL;DR: Kaleu is an independent, true phase space generator, written in Fortran, such that it can independently deal with several scattering processes in parallel.
Abstract: Kaleu is an independent, true phase space generator. After providing it with some information about the field theory and the particular multi-particle scattering process under consideration, it returns importance sampled random phase space points. Providing it also with the total weight of each generated phase space point, it further adapts to the integration problem on the fly. It is written in Fortran, such that it can independently deal with several scattering processes in parallel.

Journal ArticleDOI
TL;DR: In this article, a split-and-merge segmentation based on an octree structure is proposed to segment coplanar point clusters and derive parameters of their best-fit plane.
Abstract: Lidar (light detection and ranging) point cloud data contain abundant three-dimensional (3D) information. Dense distribution of scanned points on object surfaces prominently implies surface features. Particularly, plane features commonly appear in a typical lidar dataset of artificial structures. To explore implicitly contained spatial information, this study developed an automatic scheme to segment a lidar point cloud dataset into coplanar point clusters. The central mechanism of the proposed method is a split-and-merge segmentation based on an octree structure. Plane fitting serves as an engine in the mechanism that evaluates how well a group of points fits to a plane. Segmented coplanar points and derived parameters of their best-fit plane are obtained through the process. This paper also provides algorithms to derive various geometric properties of segmented coplanar points, including inherent properties of a plane, intersections of planes, and properties of point distribution on a plane. Several successful cases of handling airborne and terrestrial lidar data as well as a combination of the two are demonstrated. This method should improve the efficiency of object modelling using lidar data.
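The plane-fitting "engine" of the split-and-merge scheme can be sketched with an SVD least-squares fit: the smallest singular value measures how well an octree cell's points fit a plane. A minimal sketch (the acceptance threshold and function name are illustrative):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point set via SVD: returns the unit
    normal, the centroid (a point on the plane), and the RMS
    point-to-plane residual used to decide whether an octree cell's
    points are coplanar."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    _, s, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                  # right singular vector of smallest s
    rms = s[-1] / np.sqrt(len(pts))  # RMS distance of points to the plane
    return normal, centroid, rms

# A 4x4 grid sampled exactly from the plane z = 2x - y + 1.
pts = np.array([[x, y, 2 * x - y + 1.0]
                for x in range(4) for y in range(4)], float)
normal, centroid, rms = fit_plane(pts)
```

In the full scheme, cells whose residual exceeds a tolerance are split, and adjacent cells with compatible plane parameters are merged into one coplanar cluster.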

Journal ArticleDOI
TL;DR: It is shown that pair information is indeed insufficient to uniquely determine the configuration in general, including the reconstruction of atomic structures from experimentally obtained g(2) and a recently proposed "decorrelation" principle.
Abstract: Point configurations have been widely used as model systems in condensed-matter physics, materials science, and biology. Statistical descriptors, such as the n -body distribution function g(n), are usually employed to characterize point configurations, among which the most extensively used is the pair distribution function g(2). An intriguing inverse problem of practical importance that has been receiving considerable attention is the degree to which a point configuration can be reconstructed from the pair distribution function of a target configuration. Although it is known that the pair-distance information contained in g(2) is, in general, insufficient to uniquely determine a point configuration, this concept does not seem to be widely appreciated and general claims of uniqueness of the reconstructions using pair information have been made based on numerical studies. In this paper, we present the idea of the distance space called the D space. The pair distances of a specific point configuration are then represented by a single point in the D space. We derive the conditions on the pair distances that can be associated with a point configuration, which are equivalent to the realizability conditions of the pair distribution function g(2). Moreover, we derive the conditions on the pair distances that can be assembled into distinct configurations, i.e., with structural degeneracy. These conditions define a bounded region in the D space. By explicitly constructing a variety of degenerate point configurations using the D space, we show that pair information is indeed insufficient to uniquely determine the configuration in general. We also discuss several important problems in statistical physics based on the D space, including the reconstruction of atomic structures from experimentally obtained g(2) and a recently proposed "decorrelation" principle. The degenerate configurations have relevance to open questions involving the famous traveling salesman problem.
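A point configuration's representative in the D space is simply its vector of pair distances. The sketch below builds that vector and shows the trivial degeneracy under reflection (a congruent copy); the paper's interesting cases are non-congruent configurations sharing the same pair distances, which this toy does not construct.

```python
import numpy as np
from itertools import combinations

def d_space_point(config):
    """Map a point configuration to its point in the D space: the sorted
    vector of all pairwise distances. Distinct configurations mapping to
    the same D-space point exhibit the structural degeneracy the paper
    studies."""
    pts = np.asarray(config, float)
    d = [np.linalg.norm(pts[i] - pts[j])
         for i, j in combinations(range(len(pts)), 2)]
    return np.sort(np.array(d))

config = np.array([[0.0, 0.0], [1.0, 0.0], [0.3, 0.8]])
mirror = config * np.array([1.0, -1.0])   # reflected (congruent) copy
```

Since g(2) is built from exactly these pair distances, any two configurations with the same D-space point are indistinguishable to a pair-distribution reconstruction.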

Patent
15 Jul 2010
TL;DR: In this article, a monitoring device can be configured to facilitate intra-tissue inspection of a probe at a target region, the monitoring device comprising a housing, a transducer coupled to the housing, and a display coupled to both the transducers and the display.
Abstract: A monitoring device can be configured to facilitate intra-tissue inspection of a probe at a target region, the monitoring device comprising a housing, a transducer coupled to the housing, and a display coupled to the transducer and to the housing. The transducer can comprise a first transducer array aligned to scan a longitudinal plane of the target region, and a second transducer array aligned to scan a transverse plane of the target region. The display can be configured to receive information derived from the first and second transducers to present an image of at least one of the longitudinal plane or the transverse plane, and to present at the image a probe point highlight of a probe point of the probe. Other examples, embodiments, and related methods are described herein.

24 Jun 2010
TL;DR: This work develops an automatic technique for the reconstruction of Manhattan-world interior scenes from point clouds that aims at full automation of the modelling phase, with a consequent optimization of the efficiency of the laser-scanning project pipeline.

Abstract: We developed an automatic technique for the reconstruction of Manhattan-world interior scenes from point clouds. Our method mainly focuses on reducing human intervention in the modelling process; it thus aims at full automation of the modelling phase, with a consequent optimization of the efficiency of the laser-scanning project pipeline. We address a reconstruction complexity typical of indoor scenes. The workflow starts with a volume-sweep reconstruction of the interior from the three-dimensional point cloud. As a result of a discrete translational plane sweep, the input data are segmented into separate point sets comprising the floor, ceiling and wall points. Consequently, each point is assigned to a surface of the volume. The ground plan contours are extracted with a cell decomposition approach after partitioning the floor surface into rectangular cells of variable size. Only cells considered suitable are added to the ground shape and unified to define the ground plan. Along the ground plan contour, the walls are raised from the floor up to the ceiling level. Finally, the interior model is enhanced by the addition of built-in features like doors.
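The initial floor/ceiling/wall segmentation can be sketched with a simplified vertical sweep: the lowest and highest dominant heights become floor and ceiling, everything else becomes wall candidates. This is a deliberate simplification of the paper's discrete translational plane sweep, with an illustrative tolerance.

```python
import numpy as np

def sweep_segment(points, tol=0.05):
    """Simplified vertical sweep over a Manhattan-world interior scan:
    points within tol of the minimum height are labelled 'floor', points
    within tol of the maximum height 'ceiling', the rest 'wall'."""
    z = points[:, 2]
    floor_z, ceil_z = z.min(), z.max()
    labels = np.full(len(points), "wall", dtype=object)
    labels[np.abs(z - floor_z) < tol] = "floor"
    labels[np.abs(z - ceil_z) < tol] = "ceiling"
    return labels

rng = np.random.default_rng(2)
floor = np.column_stack([rng.uniform(0, 4, (100, 2)), np.zeros(100)])
ceil = np.column_stack([rng.uniform(0, 4, (100, 2)), np.full(100, 2.7)])
wall = np.column_stack([np.zeros(50), rng.uniform(0, 4, 50),
                        rng.uniform(0.3, 2.4, 50)])
pts = np.vstack([floor, ceil, wall])
labels = sweep_segment(pts)
```

The labelled floor points would then feed the cell-decomposition step that extracts the ground plan contour.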

Journal ArticleDOI
TL;DR: In this article, the authors find all constant slope surfaces in Euclidean 3-space, namely those surfaces for which the position vector of a point of the surface makes a constant angle with the normal to the surface at that point.

Abstract: In this paper, we find all constant slope surfaces in Euclidean 3-space, namely those surfaces for which the position vector of a point of the surface makes a constant angle with the normal to the surface at that point. These surfaces can be thought of as the two-dimensional analogue of the generalized helices. Some pictures are drawn using the parametric equations we found.

Proceedings ArticleDOI
03 May 2010
TL;DR: This paper proposes a generalization of the Metric-based Iterative Closest Point (MbICP) to the 3D case, including the derivation of point-to-plane distances based on the new metric.

Abstract: Scan matching techniques have been widely used to compute the displacement of robots. This estimate is part of many algorithms addressing navigation and mapping. This paper addresses the scan matching problem in three-dimensional workspaces. We propose a generalization of the Metric-based Iterative Closest Point (MbICP) to the 3D case. The main contribution is the development of all the mathematical tools required to formulate the ICP with this new metric, including the derivation of point-to-plane distances based on the new metric. We also provide experimental results to evaluate the algorithms and different combinations of ICP and MbICP to illustrate the advantages of the metric-based approach.
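The point-to-plane quantity at the heart of such ICP variants is the projection of each correspondence residual onto the destination surface normal. A minimal sketch under the standard Euclidean metric (the paper derives the analogous distances under its own displacement metric, which is not reproduced here):

```python
import numpy as np

def point_to_plane_error(src, dst, dst_normals):
    """Sum of squared point-to-plane distances between corresponding
    points: each residual (src - dst) is projected onto the destination
    point's surface normal. This is the quantity each point-to-plane ICP
    iteration minimizes over the rigid transform."""
    d = np.einsum("ij,ij->i", src - dst, dst_normals)
    return float(np.sum(d ** 2))

src = np.array([[0.0, 0.0, 0.3], [1.0, 0.0, -0.4]])
dst = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
normals = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
err = point_to_plane_error(src, dst, normals)   # 0.3^2 + 0.4^2 = 0.25
```

Sliding `src` within the plane leaves this error unchanged, which is exactly why point-to-plane terms converge faster than point-to-point ones on locally planar scans.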

Proceedings ArticleDOI
15 Dec 2010
TL;DR: In this paper, the authors propose cone carving, a space carving technique supporting topologically correct surface reconstruction from an incomplete scanned point cloud, which utilizes the point samples not only for local surface position estimation but also to obtain global visibility information under the assumption that each acquired point is visible from a point lying outside the shape.
Abstract: We present cone carving, a novel space carving technique supporting topologically correct surface reconstruction from an incomplete scanned point cloud. The technique utilizes the point samples not only for local surface position estimation but also to obtain global visibility information under the assumption that each acquired point is visible from a point lying outside the shape. This enables associating each point with a generalized cone, called the visibility cone, that carves a portion of the outside ambient space of the shape from the inside out. These cones collectively provide a means to better approximate the signed distances to the shape specifically near regions containing large holes in the scan, allowing one to infer the correct surface topology. Combining the new distance measure with conventional RBF, we define an implicit function whose zero level set defines the surface of the shape. We demonstrate the utility of cone carving in coping with significant missing data and raw scans from a commercial 3D scanner as well as synthetic input.

Journal ArticleDOI
TL;DR: In this paper, the relation between the Wasserstein distance of order 1 between probability distributions on a metric space, arising in the study of Monge-Kantorovich transport problem, and the spectral distance of noncommutative geometry is discussed.
Abstract: We discuss the relation between the Wasserstein distance of order 1 between probability distributions on a metric space, arising in the study of the Monge-Kantorovich transport problem, and the spectral distance of noncommutative geometry. Starting from a remark of Rieffel on compact manifolds, we first show that on any, i.e. not necessarily compact, complete Riemannian spin manifold, the two distances coincide. Then, on convex manifolds in the sense of Nash embedding, we provide some natural upper and lower bounds on the distance between any two probability distributions. Specializing to the Euclidean space R^n, we explicitly compute the distance for a particular class of distributions generalizing Gaussian wave packets. Finally we explore the analogy between the spectral and the Wasserstein distances in the noncommutative case, focusing on the standard model and the Moyal plane. In particular we point out that in the two-sheet space of the standard model, an optimal-transport interpretation of the metric requires a cost function that does not vanish on the diagonal. The latter is similar to the cost function occurring in the relativistic heat equation.
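For empirical distributions on the real line, the Wasserstein distance of order 1 has a closed form: optimal transport pairs sorted samples, so W1 between two equal-size samples is the mean absolute difference of their order statistics. A minimal numerical sketch of that special case (the paper works with general metric spaces, not this 1D reduction):

```python
import numpy as np

def wasserstein1(x, y):
    """Wasserstein-1 distance between two equal-size empirical
    distributions on R: the Monge-Kantorovich optimal transport pairs
    sorted samples, giving the mean absolute difference of the order
    statistics."""
    x, y = np.sort(x), np.sort(y)
    return float(np.mean(np.abs(x - y)))

# Translating a distribution by c moves every unit of mass a distance c,
# so W1 between a sample and its shift by 2 is exactly 2.
rng = np.random.default_rng(3)
x = rng.standard_normal(1000)
d = wasserstein1(x, x + 2.0)
```

The translation example mirrors the general fact that W1 between a measure and its pushforward under a translation equals the translation length, one of the properties connecting it to the spectral distance.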