
Showing papers by "École Normale Supérieure published in 2009"


Proceedings ArticleDOI
20 Jun 2009
TL;DR: This paper introduces a method for salient region detection that outputs full-resolution saliency maps with well-defined boundaries of salient objects; it outperforms five state-of-the-art algorithms on both a ground-truth evaluation and a segmentation task, achieving higher precision and better recall.
Abstract: Detection of visually salient image regions is useful for applications like object segmentation, adaptive compression, and object recognition. In this paper, we introduce a method for salient region detection that outputs full resolution saliency maps with well-defined boundaries of salient objects. These boundaries are preserved by retaining substantially more frequency content from the original image than other existing techniques. Our method exploits features of color and luminance, is simple to implement, and is computationally efficient. We compare our algorithm to five state-of-the-art salient region detection methods with a frequency domain analysis, ground truth, and a salient object segmentation application. Our method outperforms the five algorithms both on the ground-truth evaluation and on the segmentation task by achieving both higher precision and better recall.
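
As a rough illustration of this frequency-tuned idea, the sketch below scores each pixel by the distance between the image's mean Lab colour and a lightly blurred Lab image, so most of the original frequency content is retained. This is a minimal sketch, not the authors' reference implementation; it assumes scipy and scikit-image are available, and the blur width is an arbitrary choice.

```python
# Minimal frequency-tuned-style saliency sketch: saliency = distance between
# the mean Lab colour and a lightly blurred Lab image.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import color, img_as_float

def saliency_map(rgb):
    lab = color.rgb2lab(img_as_float(rgb))            # H x W x 3 in Lab space
    mean_vec = lab.reshape(-1, 3).mean(axis=0)        # mean image feature
    blurred = gaussian_filter(lab, sigma=(1, 1, 0))   # small Gaussian blur
    sal = np.linalg.norm(blurred - mean_vec, axis=2)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-12)  # normalise to [0, 1]
```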

3,723 citations


Journal ArticleDOI
TL;DR: This survey relates the model selection performance of cross-validation procedures to recent advances in model selection theory, with a particular emphasis on distinguishing empirical statements from rigorous theoretical results, and provides guidelines for choosing the best cross-validation procedure according to the particular features of the problem at hand.
Abstract: Used to estimate the risk of an estimator or to perform model selection, cross-validation is a widespread strategy because of its simplicity and its apparent universality. Many results exist on the model selection performances of cross-validation procedures. This survey intends to relate these results to the most recent advances of model selection theory, with a particular emphasis on distinguishing empirical statements from rigorous theoretical results. As a conclusion, guidelines are provided for choosing the best cross-validation procedure according to the particular features of the problem in hand.
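
For readers unfamiliar with the procedures being surveyed, a minimal K-fold cross-validation loop for model selection looks like the sketch below; ridge regression and the candidate grid are placeholder choices, not anything prescribed by the survey.

```python
# Hedged sketch: K-fold cross-validation used to select a ridge penalty.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def select_by_cv(X, y, alphas, n_splits=5, seed=0):
    kf = KFold(n_splits=n_splits, shuffle=True, random_state=seed)
    risks = []
    for a in alphas:
        fold_errors = []
        for train, val in kf.split(X):
            model = Ridge(alpha=a).fit(X[train], y[train])
            fold_errors.append(np.mean((y[val] - model.predict(X[val])) ** 2))
        risks.append(np.mean(fold_errors))    # CV estimate of the risk
    return alphas[int(np.argmin(risks))]      # candidate minimising CV risk
```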

2,720 citations


Journal ArticleDOI
TL;DR: In this article, the surface forces that lead to wetting are considered, and the equilibrium surface coverage of a substrate in contact with a drop of liquid is examined, while the hydrodynamics of both wetting and dewetting is influenced by the presence of the three-phase contact line separating "wet" regions from those that are either dry or covered by a microscopic film.
Abstract: Wetting phenomena are ubiquitous in nature and technology. A solid substrate exposed to the environment is almost invariably covered by a layer of fluid material. In this review, the surface forces that lead to wetting are considered, together with the equilibrium surface coverage of a substrate in contact with a drop of liquid. Depending on the nature of the surface forces involved, different scenarios for wetting phase transitions are possible; recent progress allows us to relate the critical exponents directly to the nature of the surface forces which lead to the different wetting scenarios. Thermal fluctuation effects, which can be greatly enhanced for wetting of geometrically or chemically structured substrates, and are much stronger in colloidal suspensions, modify the adsorption singularities. Macroscopic descriptions and microscopic theories have been developed to understand and predict wetting behavior relevant to microfluidics and nanofluidics applications. Then the dynamics of wetting is examined. A drop, placed on a substrate which it wets, spreads out to form a film. Conversely, a nonwetted substrate previously covered by a film dewets upon an appropriate change of system parameters. The hydrodynamics of both wetting and dewetting is influenced by the presence of the three-phase contact line separating "wet" regions from those that are either dry or covered by a microscopic film only. Recent theoretical, experimental, and numerical progress in the description of moving contact line dynamics is reviewed, and its relation to the thermodynamics of wetting is explored. In addition, recent progress on rough surfaces is surveyed. The anchoring of contact lines and contact angle hysteresis resulting from surface inhomogeneities are explored. Further, new ways to mold wetting characteristics according to technological constraints are discussed, for example, the use of patterned surfaces, surfactants, or complex fluids.
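
The equilibrium contact angle discussed in the review is governed by the classical Young relation between the three interfacial tensions; schematically:

```latex
% Young's equation at the three-phase contact line:
\[
  \gamma_{SV} \;=\; \gamma_{SL} + \gamma_{LV}\cos\theta_{\mathrm{eq}},
\]
% with complete wetting (spreading into a film) when the spreading
% coefficient S = \gamma_{SV} - \gamma_{SL} - \gamma_{LV} becomes positive.
```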

2,501 citations


Proceedings ArticleDOI
14 Jun 2009
TL;DR: A new online optimization algorithm for dictionary learning is proposed, based on stochastic approximations, which scales up gracefully to large datasets with millions of training samples, and leads to faster performance and better dictionaries than classical batch algorithms for both small and large datasets.
Abstract: Sparse coding---that is, modelling data vectors as sparse linear combinations of basis elements---is widely used in machine learning, neuroscience, signal processing, and statistics. This paper focuses on learning the basis set, also called dictionary, to adapt it to specific data, an approach that has recently proven to be very effective for signal reconstruction and classification in the audio and image processing domains. This paper proposes a new online optimization algorithm for dictionary learning, based on stochastic approximations, which scales up gracefully to large datasets with millions of training samples. A proof of convergence is presented, along with experiments with natural images demonstrating that it leads to faster performance and better dictionaries than classical batch algorithms for both small and large datasets.
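
The core of such an online scheme can be sketched as follows: each sample is sparse-coded against the current dictionary, sufficient statistics are accumulated, and the atoms are refreshed by block-coordinate descent. This is a hedged reconstruction, not the authors' reference code (scikit-learn's lasso scaling differs from the paper's objective); scikit-learn's MiniBatchDictionaryLearning provides a production implementation of this family of algorithms.

```python
# Sketch of online dictionary learning with accumulated statistics A and B.
import numpy as np
from sklearn.linear_model import Lasso

def online_dictionary_learning(X, n_atoms=50, lam=0.1, n_iter=1000, seed=0):
    rng = np.random.default_rng(seed)
    m = X.shape[1]                            # signal dimension
    D = rng.standard_normal((m, n_atoms))
    D /= np.linalg.norm(D, axis=0)            # unit-norm atoms
    A = np.zeros((n_atoms, n_atoms))          # sum of alpha alpha^T
    B = np.zeros((m, n_atoms))                # sum of x alpha^T
    lasso = Lasso(alpha=lam, fit_intercept=False, max_iter=2000)
    for _ in range(n_iter):
        x = X[rng.integers(len(X))]           # draw one training sample
        alpha = lasso.fit(D, x).coef_         # sparse coding step
        A += np.outer(alpha, alpha)
        B += np.outer(x, alpha)
        for j in range(n_atoms):              # block-coordinate atom update
            if A[j, j] < 1e-12:
                continue
            u = (B[:, j] - D @ A[:, j]) / A[j, j] + D[:, j]
            D[:, j] = u / max(1.0, np.linalg.norm(u))
    return D
```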

2,313 citations


Journal ArticleDOI
TL;DR: In this paper, the authors assessed the economic consequences of pollinator decline by measuring the contribution of insect pollination to the economic value of world agricultural output, and the vulnerability of world agriculture in the face of pollinator decline.

2,270 citations


Journal ArticleDOI
TL;DR: This tutorial article surveys some of these techniques based on stochastic geometry and the theory of random geometric graphs, discusses their application to model wireless networks, and presents some of the main results that have appeared in the literature.
Abstract: Wireless networks are fundamentally limited by the intensity of the received signals and by their interference. Since both of these quantities depend on the spatial location of the nodes, mathematical techniques have been developed in the last decade to provide communication-theoretic results accounting for the network's geometrical configuration. Often, the location of the nodes in the network can be modeled as random, following for example a Poisson point process. In this case, different techniques based on stochastic geometry and the theory of random geometric graphs - including point process theory, percolation theory, and probabilistic combinatorics - have led to results on the connectivity, the capacity, the outage probability, and other fundamental limits of wireless networks. This tutorial article surveys some of these techniques, discusses their application to model wireless networks, and presents some of the main results that have appeared in the literature. It also serves as an introduction to the field for the other papers in this special issue.
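
A small Monte-Carlo experiment conveys the flavour of these stochastic-geometry models: drop interferers as a Poisson point process, apply power-law path loss with Rayleigh fading, and estimate the outage probability of a typical link. All parameter values below are illustrative assumptions, and the near-field cutoff is a simplification.

```python
# Hedged sketch: outage probability of a typical link under a Poisson field
# of interferers with path-loss exponent alpha and Rayleigh fading.
import numpy as np

def outage_probability(lam=1e-3, radius=500.0, r_link=10.0, alpha=4.0,
                       theta=1.0, n_trials=2000, seed=0):
    rng = np.random.default_rng(seed)
    area = np.pi * radius ** 2
    outages = 0
    for _ in range(n_trials):
        n = rng.poisson(lam * area)               # number of interferers
        r = radius * np.sqrt(rng.random(n))       # uniform in the disc
        r = np.maximum(r, 1.0)                    # crude near-field cutoff
        fading = rng.exponential(size=n)          # Rayleigh power fading
        interference = np.sum(fading * r ** (-alpha))
        signal = rng.exponential() * r_link ** (-alpha)
        outages += signal < theta * interference  # SIR below threshold?
    return outages / n_trials
```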

1,893 citations


Proceedings ArticleDOI
01 Sep 2009
TL;DR: Experimental results in image denoising and demosaicking tasks with synthetic and real noise show that the proposed method outperforms the state of the art, making it possible to effectively restore raw images from digital cameras at a reasonable speed and memory cost.
Abstract: We propose in this paper to unify two different approaches to image restoration: On the one hand, learning a basis set (dictionary) adapted to sparse signal descriptions has proven to be very effective in image reconstruction and classification tasks. On the other hand, explicitly exploiting the self-similarities of natural images has led to the successful non-local means approach to image restoration. We propose simultaneous sparse coding as a framework for combining these two approaches in a natural manner. This is achieved by jointly decomposing groups of similar signals on subsets of the learned dictionary. Experimental results in image denoising and demosaicking tasks with synthetic and real noise show that the proposed method outperforms the state of the art, making it possible to effectively restore raw images from digital cameras at a reasonable speed and memory cost.
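
One standard way to realize the joint decomposition described here is simultaneous orthogonal matching pursuit, sketched below: a group of similar signals is forced to share one sparsity pattern over the dictionary. Grouping similar patches (e.g. by block matching) and learning the dictionary are assumed to happen upstream, and the paper's actual formulation uses convex grouped-sparsity penalties rather than this greedy stand-in.

```python
# Hedged sketch: simultaneous OMP, a greedy joint sparse coding of a group
# of similar signals on a shared support.
import numpy as np

def simultaneous_omp(D, X, n_nonzero):
    # D: (m, p) dictionary with unit-norm atoms; X: (m, k) similar signals.
    support, R = [], X.copy()
    for _ in range(n_nonzero):
        corr = np.abs(D.T @ R).sum(axis=1)   # aggregate correlation per atom
        corr[support] = -np.inf              # do not reselect atoms
        support.append(int(np.argmax(corr)))
        Ds = D[:, support]
        coef, *_ = np.linalg.lstsq(Ds, X, rcond=None)
        R = X - Ds @ coef                    # residual on the shared support
    A = np.zeros((D.shape[1], X.shape[1]))
    A[support] = coef
    return A                                 # joint codes, one column per signal
```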

1,812 citations


Book
06 Jun 2009
TL;DR: This book begins with several metrics in classical geometry, then proceeds to applications of distance in fields like algebra and probability, eventually working through applied mathematics, computer science, physics and chemistry, social science, and even art and religion.
Abstract: The text is divided into seven parts, with the organizational strategy moving roughly from the abstract to the concrete. The first part of the book covers important concepts in and near the study of metric spaces in general, including metric-like structures, topological separation axioms, and several metric invariants. The remaining parts are inventories of metrics in various areas of mathematics, science, and other fields. This begins with several metrics in classical geometry, then proceeds to applications of distance in fields like algebra and probability, eventually working through applied mathematics, computer science, physics and chemistry, social science, and even art and religion.
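
On the applied side of such an inventory, many of the catalogued metrics are a one-liner away in scientific computing libraries; for instance, with SciPy:

```python
# A handful of catalogued metrics, computed with scipy.spatial.distance.
import numpy as np
from scipy.spatial import distance

u = np.array([1.0, 0.0, 1.0, 1.0])
v = np.array([1.0, 1.0, 0.0, 1.0])
print(distance.euclidean(u, v))  # classical geometry
print(distance.cityblock(u, v))  # L1 / Manhattan metric
print(distance.chebyshev(u, v))  # L-infinity metric
print(distance.hamming(u, v))    # fraction of disagreeing coordinates
print(distance.jaccard(u, v))    # set dissimilarity on binary vectors
```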

1,594 citations


Journal ArticleDOI
TL;DR: An increasing number of publications have appeared concerning Ullmann-type intermolecular reactions for the coupling of aryl and vinyl halides with N, O, and C nucleophiles, and this Minireview highlights recent and major developments in this topic since 2004.
Abstract: Copper-catalyzed Ullmann condensations are key reactions for the formation of carbon-heteroatom and carbon-carbon bonds in organic synthesis. These reactions can lead to structural moieties that are prevalent in building blocks of active molecules in the life sciences and in many material precursors. An increasing number of publications have appeared concerning Ullmann-type intermolecular reactions for the coupling of aryl and vinyl halides with N, O, and C nucleophiles, and this Minireview highlights recent and major developments in this topic since 2004.

1,458 citations


Book
21 Dec 2009
TL;DR: The theory of random matrices plays an important role in many areas of pure mathematics and employs a variety of sophisticated mathematical tools (analytical, probabilistic, and combinatorial); this book provides a rigorous, self-contained introduction to the basic theory.
Abstract: The theory of random matrices plays an important role in many areas of pure mathematics and employs a variety of sophisticated mathematical tools (analytical, probabilistic and combinatorial). This diverse array of tools, while attesting to the vitality of the field, presents several formidable obstacles to the newcomer, and even the expert probabilist. This rigorous introduction to the basic theory is sufficiently self-contained to be accessible to graduate students in mathematics or related sciences, who have mastered probability theory at the graduate level, but have not necessarily been exposed to advanced notions of functional analysis, algebra or geometry. Useful background material is collected in the appendices and exercises are also included throughout to test the reader's understanding. Enumerative techniques, stochastic analysis, large deviations, concentration inequalities, disintegration and Lie algebras all are introduced in the text, which will enable readers to approach the research literature with confidence.
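
One cornerstone result treated in such an introduction, Wigner's semicircle law, is easy to check numerically: the even moments of the eigenvalue distribution of a large symmetric Gaussian matrix converge to the Catalan numbers (1, 2, ...). A minimal check, with the normalisation chosen so the spectrum concentrates on [-2, 2]:

```python
# Empirical check of the semicircle law for a Wigner matrix with
# entry variance ~1/n: E[x^2] -> 1 and E[x^4] -> 2 as n grows.
import numpy as np

n, rng = 2000, np.random.default_rng(0)
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2.0 * n)     # symmetric, off-diagonal variance 1/n
eig = np.linalg.eigvalsh(W)          # spectrum concentrates on [-2, 2]
print(np.mean(eig ** 2), np.mean(eig ** 4))  # ~1.0 and ~2.0
```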

1,289 citations


Journal ArticleDOI
TL;DR: Because uncertainty in future climate makes it impossible to directly use the output of a single climate model as an input for infrastructure design, the authors argue that, instead of optimizing based on the climate conditions projected by models, future infrastructure should be made more robust to possible changes in climate conditions.
Abstract: Many decisions concerning long-lived investments already need to take into account climate change. But doing so is not easy for at least two reasons. First, due to the rate of climate change, new infrastructure will have to be able to cope with a large range of changing climate conditions, which will make design more difficult and construction more expensive. Second, uncertainty in future climate makes it impossible to directly use the output of a single climate model as an input for infrastructure design, and there are good reasons to think that the needed climate information will not be available soon. Instead of optimizing based on the climate conditions projected by models, therefore, future infrastructure should be made more robust to possible changes in climate conditions. This aim implies that users of climate information must also change their practices and decision-making frameworks, for instance by adapting the uncertainty-management methods they currently apply to exchange rates or R&D outcomes. Possible methods include: (i) selecting "no-regret" strategies that yield benefits even in the absence of climate change; (ii) favouring reversible and flexible options; (iii) buying "safety margins" in new investments; (iv) promoting soft adaptation strategies, including long-term prospective; and (v) reducing decision time horizons. Moreover, it is essential to consider both negative and positive side-effects and externalities of adaptation measures. Adaptation–mitigation interactions also call for integrated design and assessment of adaptation and mitigation policies, which are often developed by distinct communities.

Proceedings ArticleDOI
14 Jun 2009
TL;DR: A new penalty function is proposed which, when used as regularization for empirical risk minimization procedures, leads to sparse estimators; theoretical properties of the estimator are studied, and its behavior is illustrated on simulated and breast cancer gene expression data.
Abstract: We propose a new penalty function which, when used as regularization for empirical risk minimization procedures, leads to sparse estimators. The support of the sparse vector is typically a union of potentially overlapping groups of co-variates defined a priori, or a set of covariates which tend to be connected to each other when a graph of covariates is given. We study theoretical properties of the estimator, and illustrate its behavior on simulated and breast cancer gene expression data.
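
A convenient way to handle overlapping groups, consistent with the penalty studied in this line of work, is to duplicate each covariate once per group it belongs to and run a plain group lasso on the expanded design, so the selected support is a union of groups. The proximal-gradient sketch below is a hedged reconstruction with arbitrary step-size and stopping choices, not the authors' code:

```python
# Hedged sketch: overlapping group lasso via variable duplication plus
# proximal gradient with group soft-thresholding.
import numpy as np

def latent_group_lasso(X, y, groups, lam=0.1, n_iter=500):
    # groups: list of column-index lists over X, possibly overlapping.
    cols = np.concatenate(groups)
    Xd = X[:, cols]                               # duplicated design matrix
    slices, start = [], 0
    for g in groups:
        slices.append(slice(start, start + len(g)))
        start += len(g)
    n = len(y)
    lr = 1.0 / np.linalg.norm(Xd, 2) ** 2         # conservative step size
    v = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        v -= lr * (Xd.T @ (Xd @ v - y) / n)       # gradient step
        for s in slices:                          # group soft-thresholding
            nrm = np.linalg.norm(v[s])
            v[s] = 0.0 if nrm == 0 else max(0.0, 1 - lr * lam / nrm) * v[s]
    w = np.zeros(X.shape[1])
    np.add.at(w, cols, v)                         # fold duplicates back
    return w                                      # support = union of groups
```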

Proceedings ArticleDOI
07 Sep 2009
TL;DR: A framework is presented that uses a higher order prior computed from an English dictionary to recognize a word, which may or may not be a part of the dictionary, and achieves significant improvement in word recognition accuracies without using a restricted word list.
Abstract: The problem of recognizing text in images taken in the wild has gained significant attention from the computer vision community in recent years. Contrary to recognition of printed documents, recognizing scene text is a challenging problem. We focus on the problem of recognizing text extracted from natural scene images and the web. Significant attempts have been made to address this problem in the recent past. However, many of these works benefit from the availability of strong context, which naturally limits their applicability. In this work we present a framework that uses a higher order prior computed from an English dictionary to recognize a word, which may or may not be a part of the dictionary. We show experimental results on publicly available datasets. Furthermore, we introduce a large challenging word dataset with five thousand words to evaluate various steps of our method exhaustively. The main contributions of this work are: (1) We present a framework, which incorporates higher order statistical language models to recognize words in an unconstrained manner (i.e. we overcome the need for restricted word lists, and instead use an English dictionary to compute the priors). (2) We achieve significant improvement (more than 20%) in word recognition accuracies without using a restricted word list. (3) We introduce a large word recognition dataset (at least 5 times larger than other public datasets) with character level annotation and benchmark it.
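
The way a language prior can steer open-vocabulary recognition is easiest to see with a first-order (bigram) stand-in for the paper's higher-order model: combine per-position character scores with dictionary-derived transition log-probabilities and decode with Viterbi, so the best word need not itself be in the dictionary. All inputs below are assumed to come from an upstream character detector and a language-model estimation step.

```python
# Hedged sketch: Viterbi decoding of a word from character detections plus
# a bigram prior estimated from an English dictionary.
import numpy as np

def decode_word(char_scores, log_bigram, alphabet):
    # char_scores: (n_positions, n_chars) detector log-likelihoods.
    # log_bigram:  (n_chars, n_chars) log transition prior.
    n, k = char_scores.shape
    dp = char_scores[0].copy()
    back = np.zeros((n, k), dtype=int)
    for t in range(1, n):
        cand = dp[:, None] + log_bigram + char_scores[t][None, :]
        back[t] = np.argmax(cand, axis=0)     # best predecessor per char
        dp = np.max(cand, axis=0)
    best = [int(np.argmax(dp))]
    for t in range(n - 1, 0, -1):
        best.append(int(back[t, best[-1]]))   # trace the path backwards
    return "".join(alphabet[i] for i in reversed(best))
```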

Journal ArticleDOI
TL;DR: The demonstration that numerous epialleles across the genome can be stable over many generations in the absence of selection or extensive DNA sequence variation highlights the need to integrate epigenetic information into population genetics studies.
Abstract: Loss or gain of DNA methylation can affect gene expression and is sometimes transmitted across generations. Such epigenetic alterations are thus a possible source of heritable phenotypic variation in the absence of DNA sequence change. However, attempts to assess the prevalence of stable epigenetic variation in natural and experimental populations and to quantify its impact on complex traits have been hampered by the confounding effects of DNA sequence polymorphisms. To overcome this problem as much as possible, two parents with little DNA sequence divergence but contrasting DNA methylation profiles were used to derive a panel of epigenetic Recombinant Inbred Lines (epiRILs) in the reference plant Arabidopsis thaliana. The epiRILs showed variation and high heritability for flowering time and plant height (~30%), as well as stable inheritance of multiple parental DNA methylation variants (epialleles) over at least eight generations. These findings provide a first rationale to identify epiallelic variants that contribute to heritable variation in complex traits using linkage or association studies. More generally, the demonstration that numerous epialleles across the genome can be stable over many generations in the absence of selection or extensive DNA sequence variation highlights the need to integrate epigenetic information into population genetics studies.

Journal ArticleDOI
11 Sep 2009-Science
TL;DR: Using an 800-kilometer-long, densely spaced seismic array, an image of the crust and upper mantle beneath the Himalayas and the southern Tibetan Plateau is constructed, revealing in a continuous fashion the Main Himalayan thrust fault as it extends from a shallow depth under Nepal to the mid-crust under southern Tibet.
Abstract: We studied the formation of the Himalayan mountain range and the Tibetan Plateau by investigating their lithospheric structure. Using an 800-kilometer-long, densely spaced seismic array, we have constructed an image of the crust and upper mantle beneath the Himalayas and the southern Tibetan Plateau. The image reveals in a continuous fashion the Main Himalayan thrust fault as it extends from a shallow depth under Nepal to the mid-crust under southern Tibet. Indian crust can be traced to 31°N. The crust/mantle interface beneath Tibet is anisotropic, indicating shearing during its formation. The dipping mantle fabric suggests that the Indian mantle is subducting in a diffuse fashion along several evolving subparallel structures.

Journal ArticleDOI
TL;DR: The proposed filter is an extension of the nonlocal means (NL means) algorithm introduced by Buades, which performs a weighted average of the values of similar pixels; here, pixel similarity is given by a statistically grounded criterion that depends on the noise distribution model.
Abstract: Image denoising is an important problem in image processing since noise may interfere with visual or automatic interpretation. This paper presents a new approach for image denoising in the case of a known uncorrelated noise model. The proposed filter is an extension of the nonlocal means (NL means) algorithm introduced by Buades, which performs a weighted average of the values of similar pixels. Pixel similarity is defined in NL means as the Euclidean distance between patches (rectangular windows centered on each two pixels). In this paper, a more general and statistically grounded similarity criterion is proposed which depends on the noise distribution model. The denoising process is expressed as a weighted maximum likelihood estimation problem where the weights are derived in a data-driven way. These weights can be iteratively refined based on both the similarity between noisy patches and the similarity of patches extracted from the previous estimate. We show that this iterative process noticeably improves the denoising performance, especially in the case of low signal-to-noise ratio images such as synthetic aperture radar (SAR) images. Numerical experiments illustrate that the technique can be successfully applied to the classical case of additive Gaussian noise but also to cases such as multiplicative speckle noise. The proposed denoising technique seems to improve on state-of-the-art performance in the latter case.
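
As a baseline for what is being generalized, classical NL means with the Gaussian-noise (Euclidean) patch similarity can be sketched as below; the paper's contribution is to replace the `np.exp(-d2 / h**2)` kernel by a likelihood-based similarity matched to the noise model (e.g. speckle) and to iterate the weights, neither of which is shown here.

```python
# Hedged sketch: plain NL means on a 2-D float image (slow reference loops).
import numpy as np

def nl_means(img, patch=3, search=7, h=0.1):
    p, s = patch // 2, search // 2
    pad = np.pad(img, p + s, mode="reflect")
    out = np.zeros_like(img)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            ci, cj = i + p + s, j + p + s
            ref = pad[ci - p:ci + p + 1, cj - p:cj + p + 1]
            wsum = vsum = 0.0
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    qi, qj = ci + di, cj + dj
                    q = pad[qi - p:qi + p + 1, qj - p:qj + p + 1]
                    d2 = np.mean((ref - q) ** 2)   # patch distance
                    w = np.exp(-d2 / h ** 2)       # Gaussian-noise kernel
                    wsum += w
                    vsum += w * pad[qi, qj]
            out[i, j] = vsum / wsum                # weighted average
    return out
```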

Journal ArticleDOI
TL;DR: The recent development of powerful tools, including zinc-sensitive fluorescent probes, selective chelators and genetically modified animal models, has brought a deeper understanding of the roles of this cation as a crucial intra- and intercellular signalling ion of the CNS.
Abstract: The past few years have witnessed dramatic progress on all frontiers of zinc neurobiology. The recent development of powerful tools, including zinc-sensitive fluorescent probes, selective chelators and genetically modified animal models, has brought a deeper understanding of the roles of this cation as a crucial intra- and intercellular signalling ion of the CNS, and hence of the neurophysiological importance of zinc-dependent pathways and the injurious effects of zinc dyshomeostasis. The development of some innovative therapeutic strategies is aimed at controlling and preventing the damaging effects of this cation in neurological conditions such as stroke and Alzheimer's disease.

Journal ArticleDOI
TL;DR: In this paper, the authors explore the impact of annual report environmental disclosures and environmental press releases as legitimation tools, and find that environmental legitimacy is significantly and positively affected by the quality of the economic-based segments of environmental disclosures.
Abstract: Using a direct measure of environmental legitimacy, we explore the impact of annual report environmental disclosures and environmental press releases as legitimation tools. The sample comprises North American firms (Canada and the United States). The results obtained show that environmental legitimacy is significantly and positively affected by the quality of the economic-based segments of annual report environmental disclosures and by reactive environmental press releases, but not by proactive press releases. Moreover, our results suggest that negative media legitimacy is a driver of environmental press releases but not of annual report environmental disclosures.

Journal ArticleDOI
TL;DR: A consensus of opinions on the diagnosis, treatment, prognosis and prevention of CanL is presented, and a system of four clinical stages, based on clinical signs, clinicopathological abnormalities and serological status is proposed.

Journal ArticleDOI
12 Mar 2009-Nature
TL;DR: This work uses a quantum-well waveguide structure to optically tune light–matter interaction from weak to ultrastrong and turn on maximum coupling within less than one cycle of light, and directly monitors how a coherent photon population converts to cavity polaritons during abrupt switching.
Abstract: Controlling the way light interacts with material excitations is at the heart of cavity quantum electrodynamics (QED). In the strong-coupling regime, quantum emitters in a microresonator absorb and spontaneously re-emit a photon many times before dissipation becomes effective, giving rise to mixed light-matter eigenmodes. Recent experiments in semiconductor microcavities reached a new limit of ultrastrong coupling, where photon exchange occurs on timescales comparable to the oscillation period of light. In this limit, ultrafast modulation of the coupling strength has been suggested to lead to unconventional QED phenomena. Although sophisticated light-matter coupling has been achieved in all three spatial dimensions, control in the fourth dimension, time, is little developed. Here we use a quantum-well waveguide structure to optically tune light-matter interaction from weak to ultrastrong and turn on maximum coupling within less than one cycle of light. In this regime, a class of extremely non-adiabatic phenomena becomes observable. In particular, we directly monitor how a coherent photon population converts to cavity polaritons during abrupt switching. This system forms a promising laboratory in which to study novel sub-cycle QED effects and represents an efficient room-temperature switching device operating at unprecedented speed.

Journal ArticleDOI
TL;DR: An approach to object retrieval which searches for and localizes all the occurrences of an object in a video, given a query image of the object, and investigates retrieval performance with respect to different quantizations of region descriptors and compares the performance of several ranking measures.
Abstract: We describe an approach to object retrieval which searches for and localizes all the occurrences of an object in a video, given a query image of the object. The object is represented by a set of viewpoint invariant region descriptors so that recognition can proceed successfully despite changes in viewpoint, illumination and partial occlusion. The temporal continuity of the video within a shot is used to track the regions in order to reject those that are unstable. Efficient retrieval is achieved by employing methods from statistical text retrieval, including inverted file systems, and text and document frequency weightings. This requires a visual analogy of a word which is provided here by vector quantizing the region descriptors. The final ranking also depends on the spatial layout of the regions. The result is that retrieval is immediate, returning a ranked list of shots in the manner of Google. We report results for object retrieval on the full length feature films 'Groundhog Day', 'Casablanca' and 'Run Lola Run', including searches from within the movie and specified by external images downloaded from the Internet. We investigate retrieval performance with respect to different quantizations of region descriptors and compare the performance of several ranking measures.
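
The text-retrieval analogy can be condensed into a few lines: quantize local descriptors into "visual words", represent each shot by a tf-idf weighted word histogram, and rank by cosine similarity. Descriptor extraction (viewpoint-invariant regions, SIFT) and the final spatial re-ranking are assumed to happen elsewhere; the vocabulary size and other parameters below are arbitrary.

```python
# Hedged sketch of the visual-words retrieval pipeline.
import numpy as np
from sklearn.cluster import KMeans

def build_vocab(all_descriptors, k=1000, seed=0):
    return KMeans(n_clusters=k, random_state=seed, n_init=4).fit(all_descriptors)

def bow_histogram(descriptors, vocab):
    words = vocab.predict(descriptors)                    # quantise to words
    return np.bincount(words, minlength=vocab.n_clusters).astype(float)

def rank_shots(query_hist, shot_hists):
    db = np.asarray(shot_hists)
    idf = np.log(len(db) / (1.0 + (db > 0).sum(axis=0)))  # rare words count more
    q, d = query_hist * idf, db * idf
    sims = d @ q / (np.linalg.norm(d, axis=1) * np.linalg.norm(q) + 1e-12)
    return np.argsort(-sims)                              # ranked list of shots
```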

Journal ArticleDOI
TL;DR: In this paper, a set of functional equations defining the anomalous dimensions of arbitrary local single trace operators in planar supersymmetric Yang-Mills theory is presented in the form of a $Y$ system based on the integrability of the dual superstring model.
Abstract: We present a set of functional equations defining the anomalous dimensions of arbitrary local single trace operators in planar $\mathcal{N}=4$ supersymmetric Yang-Mills theory. It takes the form of a $Y$ system based on the integrability of the dual superstring $\sigma$ model on the five-dimensional anti-de Sitter space ($\mathrm{AdS}_5 \times \mathrm{S}^5$) background. This $Y$ system passes some very important tests: it incorporates the full asymptotic Bethe ansatz at large length of operator $L$, including the dressing factor, and it confirms all recently found wrapping corrections. The recently proposed $\mathrm{AdS}_4$/three-dimensional conformal field theory duality is also treated in a similar fashion.
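
For orientation, a Y-system is a set of coupled functional equations in a spectral parameter u, indexed by the nodes (a, s) of a two-dimensional lattice; its generic shape is shown below (the paper's precise conventions and its index domain for $\mathrm{AdS}_5 \times \mathrm{S}^5$ differ in detail):

```latex
% Generic form of a Y-system (conventions vary between papers):
\[
  Y_{a,s}\!\left(u + \tfrac{i}{2}\right)\,
  Y_{a,s}\!\left(u - \tfrac{i}{2}\right)
  = \frac{\bigl(1 + Y_{a,s+1}(u)\bigr)\bigl(1 + Y_{a,s-1}(u)\bigr)}
         {\bigl(1 + 1/Y_{a+1,s}(u)\bigr)\bigl(1 + 1/Y_{a-1,s}(u)\bigr)}.
\]
```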

Journal ArticleDOI
TL;DR: This work focuses on a challenging city logistics planning issue, the integrated short-term scheduling of operations and management of resources, for the general case involving a two-tiered distribution structure.
Abstract: City logistics aims to reduce the nuisances associated with freight transportation in urban areas while supporting their economic and social development. The fundamental idea is to view individual stakeholders and decisions as components of an integrated logistics system. This implies the coordination of shippers, carriers, and movements as well as the consolidation of loads of several customers and carriers into the same environment-friendly vehicles. City logistics explicitly aims to optimize such advanced urban transportation systems. We focus on a challenging city logistics planning issue, the integrated short-term scheduling of operations and management of resources, for the general case involving a two-tiered distribution structure. We investigate the main issues related to the problem, introduce a new problem class, propose both a general model and formulations for the main system components, and identify promising solution avenues.

Journal ArticleDOI
TL;DR: “Brian” is a simulator for spiking neural networks that uses vector-based computation to allow for efficient simulations, and is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.
Abstract: "Brian" is a simulator for spiking neural networks (http://www.briansimulator.org). The focus is on making the writing of simulation code as quick and easy as possible for the user, and on flexibility: new and non-standard models are no more difficult to define than standard ones. This allows scientists to spend more time on the details of their models, and less on their implementation. Neuron models are defined by writing differential equations in standard mathematical notation, facilitating scientific communication. Brian is written in the Python programming language, and uses vector-based computation to allow for efficient simulations. It is particularly useful for neuroscientific modelling at the systems level, and for teaching computational neuroscience.

Journal ArticleDOI
16 Oct 2009-Cell
TL;DR: Results reveal SCPs as a cellular origin of melanocytes, and have broad implications for the molecular mechanisms regulating skin pigmentation during development, in health, and in pigmentation disorders.

Book ChapterDOI
23 Jun 2009
TL;DR: Apron is a freely available library dedicated to the static analysis of the numerical variables of programs by abstract interpretation, and its goal is to provide analysis implementers with ready-to-use numerical abstractions under a unified API.
Abstract: This article describes Apron, a freely available library dedicated to the static analysis of the numerical variables of programs by abstract interpretation. Its goal is threefold: provide analysis implementers with ready-to-use numerical abstractions under a unified API, encourage the research in numerical abstract domains by providing a platform for integration and comparison, and provide teaching and demonstration tools to disseminate knowledge on abstract interpretation.
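
To illustrate what a numerical abstract domain provides, here is a toy interval domain in Python with join, widening, and one transfer function. This is deliberately not Apron's API (Apron is a C library with language bindings), only the underlying idea.

```python
# Toy interval abstract domain: enough to analyse a counting loop soundly.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def join(self, other):        # least upper bound of two abstract states
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))
    def widen(self, other):       # jump to infinity to force termination
        lo = self.lo if other.lo >= self.lo else float("-inf")
        hi = self.hi if other.hi <= self.hi else float("inf")
        return Interval(lo, hi)
    def add_const(self, c):       # abstract transfer of "x += c"
        return Interval(self.lo + c, self.hi + c)
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Abstractly execute "x = 0; while ...: x += 1" to a fixpoint.
x = Interval(0, 0)
while True:
    nxt = x.join(x.add_const(1))  # effect of one more loop iteration
    if (nxt.lo, nxt.hi) == (x.lo, x.hi):
        break
    x = x.widen(nxt)
print(x)  # [0, inf]: sound but coarse; real domains narrow with the loop guard
```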

Book ChapterDOI
03 Oct 2009
TL;DR: The main result is that the required exploration-exploitation trade-offs are qualitatively different, in view of a general lower bound on the simple regret in terms of the cumulative regret.
Abstract: We consider the framework of stochastic multi-armed bandit problems and study the possibilities and limitations of strategies that perform an online exploration of the arms. The strategies are assessed in terms of their simple regret, a regret notion that captures the fact that exploration is only constrained by the number of available rounds (not necessarily known in advance), in contrast to the case when the cumulative regret is considered and when exploitation needs to be performed at the same time. We believe that this performance criterion is suited to situations when the cost of pulling an arm is expressed in terms of resources rather than rewards. We discuss the links between the simple and the cumulative regret. The main result is that the required exploration-exploitation trade-offs are qualitatively different, in view of a general lower bound on the simple regret in terms of the cumulative regret.
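
A minimal simulation makes the criterion concrete: spend the whole budget on exploration (here, naive round-robin allocation), then recommend the empirically best arm; simple regret is the gap between the best mean and the recommended arm's mean. The arm distributions and budget below are illustrative assumptions.

```python
# Hedged sketch: simple regret of uniform (round-robin) pure exploration.
import numpy as np

def simple_regret_uniform(means, budget, n_runs=2000, seed=0):
    rng = np.random.default_rng(seed)
    means = np.asarray(means, dtype=float)
    k, best = len(means), means.max()
    regrets = []
    for _ in range(n_runs):
        pulls, sums = np.zeros(k), np.zeros(k)
        for t in range(budget):
            a = t % k                                 # round-robin exploration
            sums[a] += rng.normal(means[a], 1.0)      # Gaussian rewards
            pulls[a] += 1
        rec = int(np.argmax(sums / np.maximum(pulls, 1)))
        regrets.append(best - means[rec])             # simple regret
    return float(np.mean(regrets))

print(simple_regret_uniform([0.5, 0.4, 0.1], budget=300))
```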

01 Jan 2009
TL;DR: In this paper, the authors used soil microcosms to show that functional dissimilarity among detritivorous species, not species number, drives community compositional effects on leaf litter mass loss and soil respiration, two key soil ecosystem processes.
Abstract: The loss of biodiversity can have significant impacts on ecosystem functioning, but the mechanisms involved lack empirical confirmation. Using soil microcosms, we show experimentally that functional dissimilarity among detritivorous species, not species number, drives community compositional effects on leaf litter mass loss and soil respiration, two key soil ecosystem processes. These experiments confirm theoretical predictions that biodiversity effects on ecosystem functioning can be predicted by the degree of functional differences among species.

Journal ArticleDOI
TL;DR: In this article, a convex-concave programming approach is proposed for the labeled weighted graph matching problem, which is obtained by rewriting the problem as a least-square problem on the set of permutation matrices and relaxing it to two different optimization problems.
Abstract: We propose a convex-concave programming approach for the labeled weighted graph matching problem. The convex-concave programming formulation is obtained by rewriting the weighted graph matching problem as a least-square problem on the set of permutation matrices and relaxing it to two different optimization problems: a quadratic convex and a quadratic concave optimization problem on the set of doubly stochastic matrices. The concave relaxation has the same global minimum as the initial graph matching problem, but the search for its global minimum is also a hard combinatorial problem. We, therefore, construct an approximation of the concave problem solution by following a solution path of a convex-concave problem obtained by linear interpolation of the convex and concave formulations, starting from the convex relaxation. This method makes it easy to integrate information on graph label similarities into the optimization problem, and therefore to perform labeled weighted graph matching. The algorithm is compared with some of the best performing graph matching methods on four data sets: simulated graphs, QAPLib, retina vessel images, and handwritten Chinese characters. In all cases, the results are competitive with the state of the art.
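
The ingredients of the path-following idea can be sketched with a generic concave term standing in for the paper's tailored relaxation: interpolate between a convex least-squares objective and a concave one over doubly stochastic matrices, using Frank-Wolfe, whose linear step over the Birkhoff polytope is itself an assignment problem. Everything below is a hedged reconstruction; in particular, -||P||_F^2 is a generic vertex-seeking concave term, not the relaxation used in the paper.

```python
# Hedged sketch: convex-concave path following for graph matching with
# Frank-Wolfe inner loops over the doubly stochastic polytope.
import numpy as np
from scipy.optimize import linear_sum_assignment

def path_following_match(A, B, n_lams=11, n_fw=50):
    n = A.shape[0]
    P = np.full((n, n), 1.0 / n)              # barycentre of the polytope
    for lam in np.linspace(0.0, 1.0, n_lams):
        for k in range(n_fw):                 # Frank-Wolfe iterations
            R = A @ P - P @ B
            grad = 2 * (1 - lam) * (A.T @ R - R @ B.T) - 2 * lam * P
            rows, cols = linear_sum_assignment(grad)   # linear minimisation
            S = np.zeros_like(P)
            S[rows, cols] = 1.0               # best permutation direction
            P += (2.0 / (k + 2.0)) * (S - P)  # standard FW step size
    rows, cols = linear_sum_assignment(-P)    # round to a permutation
    perm = np.zeros_like(P)
    perm[rows, cols] = 1.0
    return perm
```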

Journal ArticleDOI
TL;DR: Moderate but regular physical activity is associated with a reduction in total mortality among older people, a positive effect on primary prevention of coronary heart disease and a significant benefit on the lipid profile.
Abstract: As the number of elderly persons in our country increases, more attention is being given to geriatric healthcare needs and successful ageing is becoming an important topic in medical literature. The concept of successful ageing is central to a preventive approach to care for older people. Promotion of regular physical activity is one of the main non-pharmaceutical measures proposed to older subjects, as a low rate of physical activity is frequently observed in this age group. Moderate but regular physical activity is associated with a reduction in total mortality among older people, a positive effect on primary prevention of coronary heart disease and a significant benefit on the lipid profile. Improving body composition with a reduction in fat mass, reducing blood pressure and prevention of stroke, as well as type 2 diabetes, are also well established. Prevention of some cancers (especially that of breast and colon), increasing bone density and prevention of falls are also reported. Moreover, some longitudinal studies suggest that physical activity is linked to a reduced risk of developing dementia and Alzheimer's disease in particular.