
Distribution (differential geometry)

About: Distribution (differential geometry) is a research topic. Over its lifetime, 911 publications have been published within this topic, receiving 10,149 citations.


Papers
Journal ArticleDOI


TL;DR: In this paper, it was shown that the orbits of D are $C^\infty$ submanifolds of M and, moreover, that they are the maximal integral submanifolds of a certain $C^\infty$ distribution $P_D$.
Abstract: Let D be an arbitrary set of $C^\infty$ vector fields on the $C^\infty$ manifold M. It is shown that the orbits of D are $C^\infty$ submanifolds of M, and that, moreover, they are the maximal integral submanifolds of a certain $C^\infty$ distribution $P_D$. (In general, the dimension of $P_D(m)$ will not be the same for all $m \in M$.) The second main result gives necessary and sufficient conditions for a distribution to be integrable. These two results imply as easy corollaries the theorem of Chow about the points attainable by broken integral curves of a family of vector fields, and all the known results about integrability of distributions (i.e. the classical theorem of Frobenius for the case of constant dimension and the more recent work of Hermann, Nagano, Lobry and Matsuda). Hermann and Lobry studied orbits in connection with their work on the accessibility problem in control theory. Their method was to apply Chow's theorem to the maximal integral submanifolds of the smallest distribution A such that every vector field X in the Lie algebra generated by D belongs to A (i.e. $X(m) \in A(m)$ for every $m \in M$). Their work therefore requires the additional assumption that A be integrable. Here the opposite approach is taken. The orbits are studied directly, and the integrability of A is not assumed in proving the first main result. It turns out that A is integrable if and only if $A = P_D$, and this fact makes it possible to derive a characterization of integrability and Chow's theorem. Therefore, the approach presented here generalizes and unifies the work of the authors quoted above.
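As a rough guide to the objects in play (a sketch in standard notation; the symbols here are illustrative rather than quoted from the paper), the orbit of D through a point m consists of the points reachable by composing finitely many flows of vector fields in D, and the abstract's integrability criterion can be displayed as follows:

```latex
% Sketch of the standard definitions behind the orbit theorem.
% Notation ($\phi^X_t$ for the flow of $X$, $\mathrm{Orb}_D$, $P_D$) is
% illustrative and not taken verbatim from the paper.
\[
  \mathrm{Orb}_D(m) \;=\;
  \bigl\{\, \phi^{X_k}_{t_k}\circ\cdots\circ\phi^{X_1}_{t_1}(m)
      \;:\; k \ge 0,\ X_i \in D,\ t_i \in \mathbb{R} \,\bigr\},
\]
% $P_D(m)$ is spanned by the values at $m$ of vector fields obtained by
% pushing elements of D forward along such flows, and the criterion
% stated in the abstract reads:
\[
  A \text{ is integrable} \iff A = P_D .
\]
```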

818 citations

Journal ArticleDOI


TL;DR: In this article, it was shown that the Brodsky-Lepage evolution equation for the leading-twist spin-3/2 baryon distribution amplitude is completely integrable and reduces to the three-particle $XXX_{s=-1}$ Heisenberg spin chain.
Abstract: We show that the Brodsky-Lepage evolution equation for the leading-twist spin-3/2 baryon distribution amplitude is completely integrable and reduces to the three-particle $XXX_{s=-1}$ Heisenberg spin chain. Trajectories of the anomalous dimensions are identified and calculated using the 1/N expansion. Extending this result, we prove integrability of the evolution equations for twist-3 quark-gluon operators in the large $N_c$ limit and derive explicit expressions for the corresponding integrals of motion.
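As a hedged illustration of what "completely integrable" means in this context (schematic notation, not the paper's exact conventions): the evolution kernel acting on the three-particle distribution amplitude commutes with a conserved charge, so the two can be diagonalized simultaneously and the trajectories of anomalous dimensions are labelled by the eigenvalues of the charge.

```latex
% Schematic statement of integrability for the three-particle
% evolution kernel $\mathcal{H}_3$ (illustrative notation only):
\[
  [\,\mathcal{H}_3,\ Q\,] = 0, \qquad
  \mathcal{H}_3\,\Phi = \gamma\,\Phi, \qquad
  Q\,\Phi = q\,\Phi ,
\]
% so each anomalous dimension $\gamma$ can be traced along a
% trajectory labelled by the eigenvalue $q$ of the conserved charge.
```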

256 citations

Proceedings ArticleDOI


28 Jun 2009
TL;DR: This paper proposes a Dual Regularized Co-Clustering (DRCC) method based on semi-nonnegative matrix tri-factorization with two graph regularizers, shows that it can be solved via alternating minimization, and proves that its convergence is theoretically guaranteed.
Abstract: Co-clustering is based on the duality between data points (e.g. documents) and features (e.g. words), i.e. data points can be grouped based on their distribution over features, while features can be grouped based on their distribution over the data points. In the past decade, several co-clustering algorithms have been proposed and shown to be superior to traditional one-sided clustering. However, existing co-clustering algorithms fail to consider the geometric structure in the data, which is essential for clustering data on a manifold. To address this problem, in this paper we propose a Dual Regularized Co-Clustering (DRCC) method based on semi-nonnegative matrix tri-factorization. We assume that not only the data points but also the features are sampled from manifolds, namely the data manifold and the feature manifold, respectively. As a result, we construct two graphs, a data graph and a feature graph, to explore the geometric structure of the data manifold and the feature manifold. Our co-clustering method is then formulated as semi-nonnegative matrix tri-factorization with two graph regularizers, requiring that the cluster labels of data points are smooth with respect to the data manifold, while the cluster labels of features are smooth with respect to the feature manifold. We show that DRCC can be solved via alternating minimization, and that its convergence is theoretically guaranteed. Clustering experiments on many benchmark data sets demonstrate that the proposed method outperforms many state-of-the-art clustering methods.
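One plausible way to write the objective the abstract describes (the symbols, weights, and exact placement of the nonnegativity constraints are assumptions for illustration, not quoted from the paper): with data matrix X, factors F (feature clusters), S, and G (data-point clusters), and graph Laplacians L_F, L_G of the feature and data graphs,

```latex
% Illustrative form of a dual-regularized semi-nonnegative matrix
% tri-factorization objective (details assumed, not quoted):
\[
  \min_{F \ge 0,\; S,\; G \ge 0}\;
  \bigl\| X - F S G^{\top} \bigr\|_F^2
  \;+\; \lambda\,\mathrm{tr}\!\bigl(G^{\top} L_G\, G\bigr)
  \;+\; \mu\,\mathrm{tr}\!\bigl(F^{\top} L_F\, F\bigr),
\]
% where the trace terms penalize cluster labels that vary sharply over
% the data graph and the feature graph, respectively; alternating
% minimization updates one factor at a time with the others held fixed.
```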

202 citations

Journal ArticleDOI


TL;DR: In this paper, the authors present an algorithm for fitting a manifold to an unknown probability distribution supported in a separable Hilbert space, using only i.i.d. samples from that distribution.
Abstract: The hypothesis that high dimensional data tend to lie in the vicinity of a low dimensional manifold is the basis of manifold learning. The goal of this paper is to develop an algorithm (with accompanying complexity guarantees) for fitting a manifold to an unknown probability distribution supported in a separable Hilbert space, using only i.i.d. samples from that distribution. More precisely, our setting is the following. Suppose that data are drawn independently at random from a probability distribution $P$ supported on the unit ball of a separable Hilbert space $H$. Let $G(d, V, \tau)$ be the set of submanifolds of the unit ball of $H$ whose volume is at most $V$ and whose reach (the supremum of all $r$ such that any point at a distance less than $r$ has a unique nearest point on the manifold) is at least $\tau$. Let $L(M, P)$ denote the mean-squared distance of a random point from the probability distribution $P$ to $M$. We obtain an algorithm that tests the manifold hypothesis in the following sense. The algorithm takes i.i.d. random samples from $P$ as input, and determines which of the following two is true (at least one must be): (a) There exists $M \in G(d, CV, \frac{\tau}{C})$ such that $L(M, P) \leq C \epsilon.$ (b) There exists no $M \in G(d, V/C, C\tau)$ such that $L(M, P) \leq \frac{\epsilon}{C}.$ The answer is correct with probability at least $1-\delta$.
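Restating the quantities from the abstract in display form (reading "mean-squared distance of a random point from $P$ to $M$" as an expectation is an interpretation, not a quote):

```latex
% The fitting loss and the reach, as described in the abstract:
\[
  L(M, P) \;=\; \mathbb{E}_{x \sim P}\bigl[\, \operatorname{dist}(x, M)^2 \,\bigr],
\]
\[
  \mathrm{reach}(M) \;=\; \sup\bigl\{\, r \;:\;
    \text{every point within distance } r \text{ of } M
    \text{ has a unique nearest point on } M \,\bigr\}.
\]
% The algorithm then certifies either (a) some
% $M \in G(d, CV, \tau/C)$ with $L(M,P) \le C\epsilon$, or (b) that no
% $M \in G(d, V/C, C\tau)$ has $L(M,P) \le \epsilon/C$, correctly with
% probability at least $1-\delta$.
```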

181 citations

Proceedings ArticleDOI


02 Jul 2007
TL;DR: Through manifold analysis of face images, a novel age estimation framework is developed to find a sufficient embedding space and model the low-dimensional manifold data with a multiple linear regression function.
Abstract: Extensive recent studies on human faces reveal significant potential applications of automatic age estimation via face image analysis. Due to the temporal nature of age progression, aging face images display a sequential pattern of low-dimensional distribution. Through manifold analysis of face images, we developed a novel age estimation framework. Manifold learning methods are applied to find a sufficient embedding space, and the low-dimensional manifold data are modeled with a multiple linear regression function. Experimental results on a large age database demonstrate the effectiveness of the framework. To the best of our knowledge, this is the first work to approach age estimation via manifold learning.
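For a concrete picture of the pipeline the abstract outlines (manifold embedding followed by multiple linear regression), here is a minimal, hedged sketch using scikit-learn components as stand-ins; the embedding method, features, and data below are placeholders, not the paper's own.

```python
# Hedged sketch of the general pipeline the abstract describes:
# embed face features into a low-dimensional manifold space, then fit
# a multiple linear regression from the embedding to age. The specific
# embedding method and the synthetic data are stand-ins, not the
# paper's own choices.
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 100))          # placeholder face feature vectors
ages = rng.uniform(10, 70, size=500)     # placeholder age labels

X_tr, X_te, y_tr, y_te = train_test_split(X, ages, test_size=0.2, random_state=0)

# Step 1: learn a low-dimensional embedding of the face features.
embedder = LocallyLinearEmbedding(n_components=10, n_neighbors=12)
Z_tr = embedder.fit_transform(X_tr)
Z_te = embedder.transform(X_te)

# Step 2: multiple linear regression from the embedding to age.
reg = LinearRegression().fit(Z_tr, y_tr)
pred = reg.predict(Z_te)
print("MAE (years):", mean_absolute_error(y_te, pred))
```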

178 citations

Performance Metrics

No. of papers in the topic in previous years:

Year    Papers
2022    1
2021    59
2020    67
2019    53
2018    43
2017    33