
Showing papers by "École Normale Supérieure published in 2014"


Proceedings ArticleDOI
23 Jun 2014
TL;DR: This work designs a method to reuse layers trained on the ImageNet dataset to compute mid-level image representation for images in the PASCAL VOC dataset, and shows that despite differences in image statistics and tasks in the two datasets, the transferred representation leads to significantly improved results for object and action classification.
Abstract: Convolutional neural networks (CNN) have recently shown outstanding image classification performance in the large-scale visual recognition challenge (ILSVRC2012). The success of CNNs is attributed to their ability to learn rich mid-level image representations as opposed to hand-designed low-level features used in other image classification methods. Learning CNNs, however, amounts to estimating millions of parameters and requires a very large number of annotated image samples. This property currently prevents application of CNNs to problems with limited training data. In this work we show how image representations learned with CNNs on large-scale annotated datasets can be efficiently transferred to other visual recognition tasks with limited amount of training data. We design a method to reuse layers trained on the ImageNet dataset to compute mid-level image representation for images in the PASCAL VOC dataset. We show that despite differences in image statistics and tasks in the two datasets, the transferred representation leads to significantly improved results for object and action classification, outperforming the current state of the art on Pascal VOC 2007 and 2012 datasets. We also show promising results for object and action localization.

3,316 citations
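Below is a minimal sketch, in the spirit of the layer-reuse idea described in the abstract above, of transferring ImageNet-trained convolutional layers to a small multi-label target task. The choice of torchvision's AlexNet, the added adaptation layers, and all hyperparameters are illustrative assumptions, not the authors' exact setup.

```python
# Hedged sketch: reuse ImageNet-trained layers as a mid-level feature extractor
# and train only new adaptation layers on a small PASCAL-VOC-style dataset.
import torch
import torch.nn as nn
import torchvision.models as models

NUM_TARGET_CLASSES = 20                      # PASCAL VOC has 20 object classes

# 1) Source model whose convolutional and early FC layers were trained on ImageNet.
backbone = models.alexnet(pretrained=True)   # newer torchvision: weights=...

# 2) Freeze the transferred layers so only the new ones are estimated.
for p in backbone.parameters():
    p.requires_grad = False

# 3) Replace the final classification layer with freshly initialised
#    adaptation layers for the target task (sizes are illustrative).
in_features = backbone.classifier[-1].in_features
backbone.classifier[-1] = nn.Sequential(
    nn.Linear(in_features, 2048),
    nn.ReLU(inplace=True),
    nn.Linear(2048, NUM_TARGET_CLASSES),
)

optimizer = torch.optim.SGD(
    [p for p in backbone.parameters() if p.requires_grad], lr=1e-3, momentum=0.9
)
criterion = nn.BCEWithLogitsLoss()           # multi-label VOC-style targets

def train_step(images, targets):
    """One optimisation step on a mini-batch from the small target dataset."""
    optimizer.zero_grad()
    loss = criterion(backbone(images), targets)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Only the newly added layers receive gradients, which is what keeps the number of estimated parameters small enough for a target dataset of PASCAL VOC's size.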


Proceedings Article
08 Dec 2014
TL;DR: SAGA as discussed by the authors improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser.
Abstract: In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.

1,455 citations
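A compact NumPy sketch of the SAGA update on an l2-regularised least-squares problem, to make the stored-gradient mechanism concrete. The step-size rule, the inclusion of the regulariser inside each f_i (rather than through a proximal operator), and the synthetic data are illustrative assumptions.

```python
import numpy as np

def saga_least_squares(X, y, lam=0.1, step=None, n_epochs=50, seed=0):
    """SAGA for min_w (1/n) * sum_i 0.5*(x_i.w - y_i)^2 + 0.5*lam*||w||^2.

    Keeps a table with the most recent gradient of each f_i and moves along
    g_i(w) - g_i(old) + mean(table): an unbiased direction whose variance
    shrinks as the iterates approach the optimum.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if step is None:
        # Conservative step ~ 1/(3L), with L an estimate of the smoothness constant.
        L = np.max(np.sum(X ** 2, axis=1)) + lam
        step = 1.0 / (3.0 * L)

    w = np.zeros(d)
    table = X * (X @ w - y)[:, None] + lam * w      # gradient of each f_i at w0
    table_mean = table.mean(axis=0)

    for _ in range(n_epochs * n):
        i = rng.integers(n)
        g_new = X[i] * (X[i] @ w - y[i]) + lam * w  # fresh gradient of f_i
        w -= step * (g_new - table[i] + table_mean) # SAGA direction
        table_mean += (g_new - table[i]) / n        # O(d) bookkeeping
        table[i] = g_new
    return w

# Tiny usage example on synthetic data.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 10))
    w_true = rng.normal(size=10)
    y = X @ w_true + 0.01 * rng.normal(size=200)
    w_hat = saga_least_squares(X, y, lam=1e-3)
    print(np.linalg.norm(w_hat - w_true))
```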


Journal ArticleDOI
27 Mar 2014-Cell
TL;DR: Although the inheritance of epigenetic characters can certainly occur, particularly in plants, how much is due to the environment and the extent to which it happens in humans remain unclear.

1,291 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a set of recommendations for obtaining kinetic data that adequately represent the actual kinetics of various processes, including the thermal decomposition of inorganic solids; thermal and thermo-oxidative degradation of polymers and organics; reactions of solids with gases; polymerization and crosslinking; crystallization of polymers and inorganics; and hazardous processes.

890 citations


Journal ArticleDOI
TL;DR: This paper aims to provide a history of the use of glass in the construction of buildings in Montpellier and its applications in the 21st Century.
Abstract: Rémi Auvergne, Sylvain Caillol, Ghislain David, Bernard Boutevin, and Jean-Pierre Pascault. Institut Charles Gerhardt UMR CNRS 5253, Laboratoire Ingénierie et Architecture Macromoléculaire, Ecole Nationale Supérieure de Chimie de Montpellier, 8 rue de l'Ecole Normale, 34296 Montpellier Cedex 05, France; INSA-Lyon, IMP, UMR 5223, F-69621 Villeurbanne, France; Université de Lyon, F-69622 Lyon, France.

790 citations


Journal ArticleDOI
TL;DR: In this paper, an updated review of the state of technology and installations of several energy storage technologies was presented, and their various characteristics were analyzed, including their storage properties, current state in the industry, and feasibility for future installation.

761 citations


Journal ArticleDOI
TL;DR: It is shown that after 100 million years of evolution the two ancestral subgenomes have remained extremely collinear, despite the loss of half of the duplicated protein-coding genes, mostly through pseudogenization.
Abstract: Vertebrate evolution has been shaped by several rounds of whole-genome duplications (WGDs) that are often suggested to be associated with adaptive radiations and evolutionary innovations. Due to an additional round of WGD, the rainbow trout genome offers a unique opportunity to investigate the early evolutionary fate of a duplicated vertebrate genome. Here we show that after 100 million years of evolution the two ancestral subgenomes have remained extremely collinear, despite the loss of half of the duplicated protein-coding genes, mostly through pseudogenization. In striking contrast is the fate of miRNA genes that have almost all been retained as duplicated copies. The slow and stepwise rediploidization process characterized here challenges the current hypothesis that WGD is followed by massive and rapid genomic reorganizations and gene deletions.

742 citations


Journal ArticleDOI
TL;DR: In this article, the basic principles and implementation of ultrafast imaging in biomedical ultrasound are illustrated and discussed; in particular, present and future applications of ultrafast imaging for screening, diagnosis, and therapeutic monitoring are presented.
Abstract: Although the use of ultrasonic plane-wave transmissions rather than line-per-line focused beam transmissions has been long studied in research, clinical application of this technology was only recently made possible through developments in graphical processing unit (GPU)-based platforms. Far beyond a technological breakthrough, the use of plane or diverging wave transmissions enables attainment of ultrafast frame rates (typically faster than 1000 frames per second) over a large field of view. This concept has also inspired the emergence of completely novel imaging modes which are valuable for ultrasound-based screening, diagnosis, and therapeutic monitoring. In this review article, we present the basic principles and implementation of ultrafast imaging. In particular, present and future applications of ultrafast imaging in biomedical ultrasound are illustrated and discussed.

718 citations
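To make the plane-wave idea concrete, here is a simplified delay-and-sum sketch: a single unfocused (possibly tilted) transmission insonifies the whole field of view, and every pixel is reconstructed in receive from the same acquisition, which is what permits kilohertz frame rates. The array geometry, sampling parameters, and absence of apodisation are illustrative simplifications, not the implementations discussed in the review.

```python
import numpy as np

def plane_wave_das(rf, fs, c, elem_x, xs, zs, theta=0.0):
    """Delay-and-sum beamforming of one plane-wave transmission.

    rf      : (n_samples, n_elements) received RF data
    fs      : sampling frequency [Hz]
    c       : speed of sound [m/s]
    elem_x  : (n_elements,) lateral positions of the array elements [m]
    xs, zs  : 1-D grids of image pixel positions [m]
    theta   : steering angle of the transmitted plane wave [rad]
    """
    n_samples, n_elem = rf.shape
    image = np.zeros((zs.size, xs.size))
    for iz, z in enumerate(zs):
        for ix, x in enumerate(xs):
            # Transmit delay: time for the tilted plane wave to reach (x, z).
            t_tx = (z * np.cos(theta) + x * np.sin(theta)) / c
            # Receive delays: time from (x, z) back to each array element.
            t_rx = np.sqrt((x - elem_x) ** 2 + z ** 2) / c
            idx = np.round((t_tx + t_rx) * fs).astype(int)
            valid = (idx >= 0) & (idx < n_samples)
            # Coherent summation across the aperture (no apodisation here).
            image[iz, ix] = rf[idx[valid], np.where(valid)[0]].sum()
    return image
```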


Journal ArticleDOI
16 Jan 2014-Nature
TL;DR: It is shown that strong, rapid adhesion between two hydrogels can be achieved at room temperature by spreading a droplet of a nanoparticle solution on one gel’s surface and then bringing the other gel into contact with it.
Abstract: Adhesives are made of polymers because, unlike other materials, polymers ensure good contact between surfaces by covering asperities, and retard the fracture of adhesive joints by dissipating energy under stress. But using polymers to 'glue' together polymer gels is difficult, requiring chemical reactions, heating, pH changes, ultraviolet irradiation or an electric field. Here we show that strong, rapid adhesion between two hydrogels can be achieved at room temperature by spreading a droplet of a nanoparticle solution on one gel's surface and then bringing the other gel into contact with it. The method relies on the nanoparticles' ability to adsorb onto polymer gels and to act as connectors between polymer chains, and on the ability of polymer chains to reorganize and dissipate energy under stress when adsorbed onto nanoparticles. We demonstrate this approach by pressing together pieces of hydrogels, for approximately 30 seconds, that have the same or different chemical properties or rigidities, using various solutions of silica nanoparticles, to achieve a strong bond. Furthermore, we show that carbon nanotubes and cellulose nanocrystals that do not bond hydrogels together become adhesive when their surface chemistry is modified. To illustrate the promise of the method for biological tissues, we also glued together two cut pieces of calf's liver using a solution of silica nanoparticles. As a rapid, simple and efficient way to assemble gels or tissues, this method is desirable for many emerging technological and medical applications such as microfluidics, actuation, tissue engineering and surgery.

614 citations


Journal ArticleDOI
TL;DR: This review contends that most of the contradictory findings are related to methodological inconsistencies and/or misinterpretation of the data rather than to limitations of heart rate measures to accurately inform on training status, and provides evidence that measures derived from 5-min recordings of resting and submaximal exercise heart rate are likely the most useful monitoring tools.
Abstract: Monitoring an athlete's physiological status in response to various types and volumes of (aerobic-oriented) training can provide useful information for optimizing training programs. Measures of resting, exercise and recovery heart rate (HR) are receiving increasing interest for monitoring fatigue, fitness and endurance performance responses, which has direct implications for adjusting training load 1) daily during specific training blocks and 2) throughout the competitive season. These measures are still not widely implemented to monitor athletes' responses to training load, probably because of apparent contradictory findings in the literature. In this review I contend that most of the contradictory findings are related to methodological inconsistencies and/or misinterpretation of the data rather than to limitations of heart rate measures to accurately inform on training status. I also provide evidence that measures derived from 5-min (almost daily) recordings of resting (indices capturing beat-to-beat changes in HR, reflecting parasympathetic activity) and submaximal exercise (30- to 60-s average) HR are likely the most useful monitoring tools. For appropriate interpretation at the individual level, changes in a given measure should be interpreted by taking into account the error of measurement and the smallest important change of the measure, as well as the training context (training phase, load and intensity distribution). The decision to use a given measure should be based upon the level of information that is required by the athlete, the marker's sensitivity to changes in training status and the practical constraints of the measurements. However, measures of HR cannot inform on all aspects of wellness, fatigue and performance, so their use in combination with daily training logs, psychometric questionnaires and non-invasive, cost-effective performance tests such as a countermovement jump may offer a complete solution to monitor training status.

590 citations
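A small sketch of the kind of index the review points to: the log-transformed rMSSD (a beat-to-beat, vagally mediated measure) from a 5-min resting recording, with day-to-day changes judged against the baseline variation. The 0.5*SD smallest-worthwhile-change convention and the synthetic RR intervals are illustrative assumptions, not values taken from the article.

```python
import numpy as np

def ln_rmssd(rr_ms):
    """Ln of the root mean square of successive RR-interval differences (ms),
    a common beat-to-beat (parasympathetic) index from a 5-min recording."""
    rr = np.asarray(rr_ms, dtype=float)
    return float(np.log(np.sqrt(np.mean(np.diff(rr) ** 2))))

def flag_change(today, baseline_values, swc_coeff=0.5):
    """Compare today's value with a rolling baseline, using the baseline SD as
    typical variation and swc_coeff * SD as the smallest worthwhile change
    (the coefficient is an illustrative convention)."""
    baseline = np.asarray(baseline_values, dtype=float)
    mean, sd = baseline.mean(), baseline.std(ddof=1)
    delta = today - mean
    if abs(delta) <= swc_coeff * sd:
        return "trivial change"
    return "increase (likely more parasympathetic activity)" if delta > 0 else "decrease"

# Usage with synthetic RR intervals (ms); real use would take daily morning recordings.
rng = np.random.default_rng(0)
week = [ln_rmssd(900 + rng.normal(0, 35, size=300)) for _ in range(7)]
today = ln_rmssd(900 + rng.normal(0, 20, size=300))   # lower variability today
print(flag_change(today, week))
```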


Journal ArticleDOI
TL;DR: In this article, the physical structure and morphology of conjugated polymers are discussed, the key properties that make organic materials ideal for bioelectronics applications are highlighted, and a few recent devices that show either unique features or exceptionally high performance are presented.
Abstract: In this Perspective, we make the case that the biological applications of organic semiconductor devices are significant. Indeed, we argue that this is an arena where organic materials have an advantage compared to traditional electronic materials, such as silicon. By discussing the physical structure and morphology of conjugated polymers, we are able to emphasize the key properties that make organic materials ideal for bioelectronics applications. We highlight a few recent devices that show either unique features or exceptionally high performance. On the basis of these examples, we discuss the future trajectory of this emerging field, note areas where further research is needed, and suggest possible applications in the short term.

Journal ArticleDOI
TL;DR: A scattering transform defines a locally translation invariant representation which is stable to time-warping deformation and extends MFCC representations by computing modulation spectrum coefficients of multiple orders, through cascades of wavelet convolutions and modulus operators.
Abstract: A scattering transform defines a locally translation invariant representation which is stable to time-warping deformation. It extends MFCC representations by computing modulation spectrum coefficients of multiple orders, through cascades of wavelet convolutions and modulus operators. Second-order scattering coefficients characterize transient phenomena such as attacks and amplitude modulation. A frequency transposition invariant representation is obtained by applying a scattering transform along log-frequency. State-of-the-art classification results are obtained for musical genre and phone classification on GTZAN and TIMIT databases, respectively.
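A toy NumPy sketch of the cascade the abstract describes: band-pass filtering, modulus, a second filtering stage applied to the modulus envelopes, and time averaging. The Gabor-type filter bank and its parameters are rough stand-ins for a proper Morlet wavelet filter bank, so this illustrates the structure rather than a faithful scattering implementation.

```python
import numpy as np

def gabor_bank(n, num_filters=8, q=1.0):
    """Toy dyadic bank of analytic band-pass filters defined in the Fourier
    domain (illustrative stand-in for a Morlet wavelet filter bank)."""
    freqs = np.fft.fftfreq(n)                  # normalised frequencies
    bank = []
    for j in range(num_filters):
        xi = 0.4 / (2 ** j)                    # centre frequency
        sigma = xi / (2.0 * q)                 # bandwidth
        bank.append(np.exp(-((freqs - xi) ** 2) / (2 * sigma ** 2)))
    return np.array(bank)

def scattering(x, num_filters=8, avg_len=256):
    """First- and second-order scattering-like coefficients of a 1-D signal."""
    n = x.size
    X = np.fft.fft(x)
    bank = gabor_bank(n, num_filters)
    window = np.ones(avg_len) / avg_len        # low-pass (moving average)

    s1, s2 = [], []
    for j1 in range(num_filters):
        u1 = np.abs(np.fft.ifft(X * bank[j1]))             # |x * psi_{j1}|
        s1.append(np.convolve(u1, window, mode="valid"))   # S1: low-pass of U1
        U1 = np.fft.fft(u1)
        for j2 in range(j1 + 1, num_filters):              # only slower scales
            u2 = np.abs(np.fft.ifft(U1 * bank[j2]))        # ||x*psi_{j1}| * psi_{j2}|
            s2.append(np.convolve(u2, window, mode="valid"))
    return np.array(s1), np.array(s2)

# Usage: an amplitude-modulated tone; second-order coefficients pick up the modulation.
t = np.arange(2 ** 14)
x = (1 + 0.5 * np.sin(2 * np.pi * 0.002 * t)) * np.sin(2 * np.pi * 0.05 * t)
S1, S2 = scattering(x)
```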

Journal ArticleDOI
TL;DR: It is shown that embryonic microglia, which display a transiently uneven distribution, regulate the wiring of forebrain circuits, revealing roles for immune cells during normal assembly of brain circuits.

Journal ArticleDOI
TL;DR: The theory of Gaussian multiplicative chaos was introduced in Kahane's seminal work in 1985, as discussed by the authors, and has since found applications ranging from finance to quantum gravity.
Abstract: In this article, we review the theory of Gaussian multiplicative chaos initially introduced by Kahane’s seminal work in 1985. Though this beautiful paper faded from memory until recently, it already contains ideas and results that are nowadays under active investigation, like the construction of the Liouville measure in $2d$-Liouville quantum gravity or thick points of the Gaussian Free Field. Also, we mention important extensions and generalizations of this theory that have emerged ever since and discuss a whole family of applications, ranging from finance, through the Kolmogorov-Obukhov model of turbulence to $2d$-Liouville quantum gravity. This review also includes new results like the convergence of discretized Liouville measures on isoradial graphs (thus including the triangle and square lattices) towards the continuous Liouville measures (in the subcritical and critical case) or multifractal analysis of the measures in all dimensions.
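For orientation, the object under review is usually written as follows; this is the standard textbook definition of the subcritical case, stated here for the reader rather than quoted from the article. If $X_\varepsilon$ is a regularisation of a log-correlated centred Gaussian field on a domain $D \subset \mathbb{R}^d$ and $\sigma$ is a reference measure, then

```latex
M_\gamma(A) \;=\; \lim_{\varepsilon \to 0} \int_A
    e^{\gamma X_\varepsilon(x) \,-\, \frac{\gamma^2}{2}\,\mathbb{E}\!\left[X_\varepsilon(x)^2\right]}
    \,\sigma(\mathrm{d}x),
\qquad \gamma^2 < 2d,
```

with the critical case $\gamma^2 = 2d$ requiring an additional renormalisation.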

Journal ArticleDOI
TL;DR: Mutations of the SHANK genes were detected in the whole spectrum of autism with a gradient of severity in cognitive impairment and the clinical relevance of these genes remains to be ascertained.
Abstract: SHANK genes code for scaffold proteins located at the post-synaptic density of glutamatergic synapses. In neurons, SHANK2 and SHANK3 have a positive effect on the induction and maturation of dendritic spines, whereas SHANK1 induces the enlargement of spine heads. Mutations in SHANK genes have been associated with autism spectrum disorders (ASD), but their prevalence and clinical relevance remain to be determined. Here, we performed a new screen and a meta-analysis of SHANK copy-number and coding-sequence variants in ASD. Copy-number variants were analyzed in 5,657 patients and 19,163 controls, coding-sequence variants were ascertained in 760 to 2,147 patients and 492 to 1,090 controls (depending on the gene), and individuals carrying de novo or truncating SHANK mutations underwent an extensive clinical investigation. Copy-number variants and truncating mutations in SHANK genes were present in ∼1% of patients with ASD: mutations in SHANK1 were rare (0.04%) and present in males with normal IQ and autism; mutations in SHANK2 were present in 0.17% of patients with ASD and mild intellectual disability; mutations in SHANK3 were present in 0.69% of patients with ASD and up to 2.12% of the cases with moderate to profound intellectual disability. In summary, mutations of the SHANK genes were detected in the whole spectrum of autism with a gradient of severity in cognitive impairment. Given the rare frequency of SHANK1 and SHANK2 deleterious mutations, the clinical relevance of these genes remains to be ascertained. In contrast, the frequency and the penetrance of SHANK3 mutations in individuals with ASD and intellectual disability (more than 1 in 50) warrant its consideration for mutation screening in clinical practice.

Journal ArticleDOI
TL;DR: It is shown, using theory and numerical simulation, that the landscape is much rougher than is classically assumed and undergoes a 'roughness transition' to fractal basins, which brings about isostaticity and marginal stability on approaching jamming.
Abstract: Glasses are amorphous solids whose constituent particles are caged by their neighbors and thus cannot flow. This sluggishness is often ascribed to the free energy landscape containing multiple minima (basins) separated by high barriers. Here we show, using theory and numerical simulation, that the landscape is much rougher than is classically assumed. Deep in the glass, it undergoes a "roughness transition" to fractal basins. This brings about isostaticity at jamming and marginality of glassy states near jamming. Critical exponents for the basin width, the weak force distribution, and the spatial spread of quasi-contacts at jamming can be analytically determined. Their value is found to be compatible with numerical observations. This advance therefore incorporates the jamming transition of granular materials into the framework of glass theory. Because temperature and pressure control which features of the landscape are experienced, glass mechanics and transport are expected to reflect the features of the topology we discuss here. Hitherto mysterious properties of low-temperature glasses could be explained by this approach.

Posted Content
TL;DR: In this article, a self-contained view of sparse modeling for visual recognition and image processing is presented, where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.
Abstract: In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection---that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. Subsequently, the corresponding tools have been widely adopted by several scientific communities such as neuroscience, bioinformatics, or computer vision. The goal of this monograph is to offer a self-contained view of sparse modeling for visual recognition and image processing. More specifically, we focus on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.
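A minimal sketch of the two ingredients the monograph revolves around: sparse coding of signals on a dictionary (here by ISTA on the l1-penalised least-squares problem) and learning the dictionary by alternating with a simple least-squares update. The alternation scheme, parameters, and random data are illustrative, not the specific algorithms analysed in the text.

```python
import numpy as np

def ista(D, x, lam=0.1, n_iter=200):
    """Sparse coding: min_a 0.5*||x - D a||^2 + lam*||a||_1, solved by ISTA."""
    L = np.linalg.norm(D, ord=2) ** 2           # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return a

def dictionary_learning(X, n_atoms=32, lam=0.1, n_epochs=10, seed=0):
    """Alternate sparse coding with a simple least-squares dictionary update."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    D = rng.normal(size=(d, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)      # unit-norm atoms
    for _ in range(n_epochs):
        A = np.column_stack([ista(D, X[:, i], lam) for i in range(n)])
        # Dictionary update: least-squares fit of X ~ D A, then renormalise atoms.
        D = X @ np.linalg.pinv(A)
        D /= np.linalg.norm(D, axis=0, keepdims=True) + 1e-12
    return D, A

# Usage on random vectors (stand-in for 8x8 image patches).
X = np.random.default_rng(1).normal(size=(64, 200))
D, A = dictionary_learning(X, n_atoms=32, lam=0.2)
```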

Journal ArticleDOI
TL;DR: This paper presents a comparative analysis of different energy management schemes for a fuel-cell-based emergency power system of a more-electric aircraft; the main criteria for performance comparison are the hydrogen consumption, the states of charge of the batteries/supercapacitors, and the overall system efficiency.
Abstract: This paper presents a comparative analysis of different energy management schemes for a fuel-cell-based emergency power system of a more-electric aircraft. The fuel-cell hybrid system considered in this paper consists of fuel cells, lithium-ion batteries, and supercapacitors, along with associated dc/dc and dc/ac converters. The energy management schemes addressed are state of the art and are among the most commonly used energy management techniques in fuel-cell vehicle applications, and they include the following: the state machine control strategy, the rule-based fuzzy logic strategy, the classical proportional-integral control strategy, the frequency decoupling/fuzzy logic control strategy, and the equivalent consumption minimization strategy. The main criteria for performance comparison are the hydrogen consumption, the states of charge of the batteries/supercapacitors, and the overall system efficiency. Moreover, the stresses on each energy source, which impact their life cycle, are measured using a new approach based on the wavelet transform of their instantaneous power. A simulation model and an experimental test bench are developed to validate the analysis and performance results.
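As an illustration of the first scheme in the list above, here is a toy state-machine energy-management strategy: the fuel-cell reference power switches among a few discrete levels according to battery state of charge and load demand, and the battery (with the supercapacitor handling fast transients, omitted here) absorbs the remainder. All states, thresholds, and power levels are invented for illustration and are not the paper's parameters.

```python
# Toy state-machine energy management for a fuel cell / battery hybrid.
# Thresholds and power levels are illustrative assumptions only.
FC_LEVELS = {"IDLE": 0.0, "LOW": 2.0, "MID": 6.0, "HIGH": 10.0}   # kW

def fc_reference_power(load_kw, soc):
    """Pick the fuel-cell operating state from battery SOC and load demand."""
    if soc > 0.8:                      # battery nearly full: let it supply the load
        state = "IDLE" if load_kw < 4.0 else "LOW"
    elif soc > 0.5:                    # normal band: follow the load coarsely
        state = "LOW" if load_kw < 4.0 else "MID"
    else:                              # low SOC: recharge while covering the load
        state = "MID" if load_kw < 6.0 else "HIGH"
    return state, FC_LEVELS[state]

def split_power(load_kw, soc):
    """Battery (plus supercapacitor) covers whatever the fuel cell does not."""
    state, p_fc = fc_reference_power(load_kw, soc)
    p_batt = load_kw - p_fc            # negative value means battery charging
    return {"state": state, "fuel_cell_kw": p_fc, "battery_kw": p_batt}

# Usage over a short load profile, with a crude SOC update.
soc, capacity_kwh, dt_h = 0.6, 5.0, 1.0 / 60.0
for load in [1.0, 3.5, 8.0, 12.0, 5.0]:
    step = split_power(load, soc)
    soc -= step["battery_kw"] * dt_h / capacity_kwh
    print(step, round(soc, 3))
```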

Journal ArticleDOI
TL;DR: Understanding the ecology of ticks and their associations with hosts in a European urbanized environment is crucial to quantify parameters necessary for risk pre-assessment and identification of public health strategies for control and prevention of tick-borne diseases.
Abstract: Tick-borne diseases represent major public and animal health issues worldwide. Ixodes ricinus, primarily associated with deciduous and mixed forests, is the principal vector of causative agents of viral, bacterial, and protozoan zoonotic diseases in Europe. Recently, abundant tick populations have been observed in European urban green areas, which are of public health relevance due to the exposure of humans and domesticated animals to potentially infected ticks. In urban habitats, small and medium-sized mammals, birds, companion animals (dogs and cats), and larger mammals (roe deer and wild boar) play a role in maintenance of tick populations and as reservoirs of tick-borne pathogens. Presence of ticks infected with tick-borne encephalitis virus and high prevalence of ticks infected with Borrelia burgdorferi s.l., causing Lyme borreliosis, have been reported from urbanized areas in Europe. Emerging pathogens, including bacteria of the order Rickettsiales (Anaplasma phagocytophilum, "Candidatus Neoehrlichia mikurensis," Rickettsia helvetica, and R. monacensis), Borrelia miyamotoi, and protozoans (Babesia divergens, B. venatorum, and B. microti) have also been detected in urban tick populations. Understanding the ecology of ticks and their associations with hosts in a European urbanized environment is crucial to quantify parameters necessary for risk pre-assessment and identification of public health strategies for control and prevention of tick-borne diseases.

Journal ArticleDOI
TL;DR: In this paper, a new formalism, alternative to the old thermodynamic-Bethe-ansatz-like approach, for solution of the spectral problem of planar N=4 super Yang-Mills theory is presented.
Abstract: We present a new formalism, alternative to the old thermodynamic-Bethe-ansatz-like approach, for solution of the spectral problem of planar N=4 super Yang-Mills theory. It takes a concise form of ...

Journal ArticleDOI
TL;DR: As discussed in this paper, vanillin was used as a renewable building block to develop a platform of 22 biobased compounds for polymer chemistry, which can be used, among many other applications, in epoxy, polyester, polyurethane, and Non-Isocyanate PolyUrethane (NIPU) polymer synthesis.

Journal ArticleDOI
27 Jun 2014-Science
TL;DR: Using computational modeling and neuroimaging, it is shown that the human PFC has two concurrent inferential tracks that make probabilistic inferences about the reliability of the ongoing behavioral strategy and arbitrate between adjusting this strategy versus exploring new ones from long-term memory.
Abstract: The prefrontal cortex (PFC) subserves reasoning in the service of adaptive behavior. Little is known, however, about the architecture of reasoning processes in the PFC. Using computational modeling and neuroimaging, we show here that the human PFC has two concurrent inferential tracks: (i) one from ventromedial to dorsomedial PFC regions that makes probabilistic inferences about the reliability of the ongoing behavioral strategy and arbitrates between adjusting this strategy versus exploring new ones from long-term memory, and (ii) another from polar to lateral PFC regions that makes probabilistic inferences about the reliability of two or three alternative strategies and arbitrates between exploring new strategies versus exploiting these alternative ones. The two tracks interact and, along with the striatum, realize hypothesis testing for accepting versus rejecting newly created strategies.

Journal ArticleDOI
TL;DR: In this paper, the optimal allocation of Dispersed Storage Systems (DSSs) in active distribution networks (ADNs) is studied by defining a multi-objective optimization problem aiming at finding the optimal trade-off between technical and economical goals.
Abstract: Dispersed storage systems (DSSs) can represent an important near-term solution for supporting the operation and control of active distribution networks (ADNs). Indeed, they have the capability to support ADNs by providing ancillary services in addition to energy balance capabilities. Within this context, this paper focuses on the optimal allocation of DSSs in ADNs by defining a multi-objective optimization problem aiming at finding the optimal trade-off between technical and economical goals. In particular, the proposed procedure accounts for: 1) network voltage deviations; 2) feeders/lines congestions; 3) network losses; 4) cost of supplying loads (from external grid or local producers) together with the cost of DSS investment/maintenance; 5) load curtailment; and 6) stochasticity of loads and renewables productions. The DSSs are suitably modeled to consider their ability to support the network by both active and reactive powers. A convex formulation of ac optimal power flow problem is used to define a mixed integer second-order cone programming problem to optimally site and size the DSSs in the network. A test case referring to IEEE 34 bus distribution test feeder is used to demonstrate and discuss the effectiveness of the proposed methodology.
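A stripped-down sketch of the sizing side of the problem, assuming the cvxpy modelling library is available: a single-bus storage unit is sized to trade off investment cost against energy-import cost over a representative day. The network model (voltages, congestion, losses), the stochastic scenarios, and the integer siting decisions of the full mixed-integer second-order cone formulation are deliberately left out, and all cost figures are invented.

```python
import numpy as np
import cvxpy as cp

# Toy single-bus storage sizing: pick energy capacity and power rating to
# balance annualised investment cost against grid energy-import cost.
T = 24
load = 3.0 + 2.0 * np.sin(np.linspace(0, 2 * np.pi, T))                      # kW
pv = np.maximum(0.0, 4.0 * np.sin(np.linspace(-np.pi / 2, 3 * np.pi / 2, T)))  # kW
price = np.where((np.arange(T) >= 17) & (np.arange(T) <= 21), 0.30, 0.10)    # $/kWh

E = cp.Variable(nonneg=True)           # storage energy capacity [kWh]
P = cp.Variable(nonneg=True)           # storage power rating [kW]
p_st = cp.Variable(T)                  # storage power (+discharge, -charge) [kW]
soc = cp.Variable(T + 1, nonneg=True)  # stored energy [kWh]
imp = cp.Variable(T, nonneg=True)      # power imported from the grid [kW]

constraints = [
    soc[0] == soc[T],                  # daily cycle
    soc[1:] == soc[:-1] - p_st,        # 1-hour steps, ideal efficiency
    soc <= E,
    cp.abs(p_st) <= P,
    imp >= load - pv - p_st,           # import covers the net load
]
invest = 50.0 * E + 80.0 * P           # annualised $/kWh and $/kW (toy figures)
energy_cost = 365.0 * (price @ imp)
problem = cp.Problem(cp.Minimize(invest + energy_cost), constraints)
problem.solve()
print(round(float(E.value), 2), "kWh,", round(float(P.value), 2), "kW")
```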

Journal ArticleDOI
05 Sep 2014-Science
TL;DR: To understand the seismicity preceding this event, the locations and mechanisms of the foreshocks were studied and Global Positioning System time series at stations located on shore were computed; the westward motion detected on shore is modeled as a slow slip event situated in the same area where the mainshock occurred.
Abstract: The subduction zone in northern Chile is a well-identified seismic gap that last ruptured in 1877. The moment magnitude (Mw) 8.1 Iquique earthquake of 1 April 2014 broke a highly coupled portion of this gap. To understand the seismicity preceding this event, we studied the location and mechanisms of the foreshocks and computed Global Positioning System (GPS) time series at stations located on shore. Seismicity off the coast of Iquique started to increase in January 2014. After 16 March, several Mw > 6 events occurred near the low-coupled zone. These events migrated northward for ~50 kilometers until the 1 April earthquake occurred. On 16 March, on-shore continuous GPS stations detected a westward motion that we model as a slow slip event situated in the same area where the mainshock occurred.

Journal ArticleDOI
TL;DR: The proposed Unified Hybrid Genetic Search metaheuristic relies on problem-independent unified local search, genetic operators, and advanced diversity management methods, and shows remarkable performance, matching or outperforming the current state-of-the-art problem-tailored algorithms.
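A bare-bones skeleton of the kind of hybrid genetic search loop named above, with the problem-specific parts (solution construction, crossover, local-search "education", and a diversity measure) left as plug-in callables. This shows only the structure of selection, education, and diversity-biased survivor selection; it is not the published algorithm.

```python
import random

def hybrid_genetic_search(init, crossover, educate, cost, diversity,
                          pop_size=25, generations=200, w_div=0.3, seed=0):
    """Skeleton of a hybrid (memetic) genetic search: parent selection,
    crossover, local-search education of offspring, and survivor selection
    biased by both solution cost and contribution to population diversity.
    All problem-specific operators are supplied by the caller."""
    rng = random.Random(seed)
    population = [educate(init(rng)) for _ in range(pop_size)]

    def survivors(pop):
        # Rank by cost and by diversity contribution, then combine the ranks:
        # individuals that are both good and different are kept.
        cost_rank = {id(s): r for r, s in enumerate(sorted(pop, key=cost))}
        div_rank = {id(s): r for r, s in enumerate(
            sorted(pop, key=lambda s: -diversity(s, pop)))}

        def biased(s):
            return cost_rank[id(s)] + w_div * div_rank[id(s)]

        return sorted(pop, key=biased)[:pop_size]

    for _ in range(generations):
        p1, p2 = rng.sample(population, 2)
        child = educate(crossover(p1, p2, rng))    # offspring improved by local search
        population = survivors(population + [child])
    return min(population, key=cost)
```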

Journal ArticleDOI
TL;DR: In this paper, the authors present current knowledge and understanding on gravity waves near jets and fronts from observations, theory, and modeling, and discuss challenges for progress in coming years, including the need for improving parameterizations of nonorographic gravity waves in climate models that include a stratosphere.
Abstract: For several decades, jets and fronts have been known from observations to be significant sources of internal gravity waves in the atmosphere. Motivations to investigate these waves have included their impact on tropospheric convection, their contribution to local mixing and turbulence in the upper troposphere, their vertical propagation into the middle atmosphere, and the forcing of its global circulation. While many different studies have consistently highlighted jet exit regions as a favored locus for intense gravity waves, the mechanisms responsible for their emission had long remained elusive: one reason is the complexity of the environment in which the waves appear; another reason is that the waves constitute small deviations from the balanced dynamics of the flow generating them; i.e., they arise beyond our fundamental understanding of jets and fronts based on approximations that filter out gravity waves. Over the past two decades, the pressing need for improving parameterizations of nonorographic gravity waves in climate models that include a stratosphere has stimulated renewed investigations. The purpose of this review is to present current knowledge and understanding on gravity waves near jets and fronts from observations, theory, and modeling, and to discuss challenges for progress in coming years.

Journal ArticleDOI
TL;DR: It is shown that an unstructured, sparsely connected network of model spiking neurons can display two fundamentally different types of asynchronous activity that imply vastly different computational properties.
Abstract: Here the author shows that an unstructured, sparsely connected network of model spiking neurons can display two different types of asynchronous activity: one in which an external input leads to a highly redundant response of different neurons that favors information transmission and another in which the firing rates of individual neurons fluctuate strongly in time and across neurons to provide a substrate for complex information processing.

Journal ArticleDOI
TL;DR: The ROA can be computed by solving a convex linear programming (LP) problem over the space of measures and this problem can be solved approximately via a classical converging hierarchy of convex finite-dimensional linear matrix inequalities (LMIs).
Abstract: We address the long-standing problem of computing the region of attraction (ROA) of a target set (e.g., a neighborhood of an equilibrium point) of a controlled nonlinear system with polynomial dynamics and semialgebraic state and input constraints. We show that the ROA can be computed by solving an infinite-dimensional convex linear programming (LP) problem over the space of measures. In turn, this problem can be solved approximately via a classical converging hierarchy of convex finite-dimensional linear matrix inequalities (LMIs). Our approach is genuinely primal in the sense that convexity of the problem of computing the ROA is an outcome of optimizing directly over system trajectories. The dual infinite-dimensional LP on nonnegative continuous functions (approximated by polynomial sum-of-squares) allows us to generate a hierarchy of semialgebraic outer approximations of the ROA at the price of solving a sequence of LMI problems with asymptotically vanishing conservatism. This sharply contrasts with the existing literature which follows an exclusively dual Lyapunov approach yielding either nonconvex bilinear matrix inequalities or conservative LMI conditions. The approach is simple and readily applicable as the outer approximations are the outcome of a single semidefinite program with no additional data required besides the problem description. The approach is demonstrated on several numerical examples.

Book
19 Dec 2014
TL;DR: The goal of this monograph is to offer a self-contained view of sparse modeling for visual recognition and image processing, focusing on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.
Abstract: In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection - that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. Subsequently, the corresponding tools have been widely adopted by several scientific communities such as neuroscience, bioinformatics, or computer vision. The goal of this monograph is to offer a self-contained view of sparse modeling for visual recognition and image processing. More specifically, we focus on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts.

Journal ArticleDOI
TL;DR: This article analyzed concurrent predictions of phonological processing (awareness and memory) and rapid automatized naming (RAN) for literacy development in a rural area of the United States, examining whether the cognitive underpinnings of reading and spelling are universal or language/orthography-specific.