Institution

L'Abri

About: L'Abri is a research institution. It is known for research contributions in the topics of Decidability & Vertex (geometry). The organization has 657 authors who have published 1491 publications receiving 25598 citations.


Papers
Journal ArticleDOI
01 Jul 2004-Nature
TL;DR: Analysis of chromosome maps and genome redundancies reveals that the different yeast lineages have evolved through a marked interplay between several distinct molecular mechanisms, including tandem gene repeat formation, segmental duplication, a massive genome duplication and extensive gene loss.
Abstract: Identifying the mechanisms of eukaryotic genome evolution by comparative genomics is often complicated by the multiplicity of events that have taken place throughout the history of individual lineages, leaving only distorted and superimposed traces in the genome of each living organism. The hemiascomycete yeasts, with their compact genomes, similar lifestyle and distinct sexual and physiological properties, provide a unique opportunity to explore such mechanisms. We present here the complete, assembled genome sequences of four yeast species, selected to represent a broad evolutionary range within a single eukaryotic phylum, that after analysis proved to be molecularly as diverse as the entire phylum of chordates. A total of approximately 24,200 novel genes were identified, the translation products of which were classified together with Saccharomyces cerevisiae proteins into about 4,700 families, forming the basis for interspecific comparisons. Analysis of chromosome maps and genome redundancies reveals that the different yeast lineages have evolved through a marked interplay between several distinct molecular mechanisms, including tandem gene repeat formation, segmental duplication, a massive genome duplication and extensive gene loss.

1,604 citations

Journal ArticleDOI
TL;DR: A comprehensive overview of the modern classification algorithms used in EEG-based BCIs is provided, the principles of these methods and guidelines on when and how to use them are presented, and a number of challenges to further advance EEG classification in BCI are identified.
Abstract: Objective: Most current Electroencephalography (EEG)-based Brain-Computer Interfaces (BCIs) are based on machine learning algorithms. There is a large diversity of classifier types that are used in this field, as described in our 2007 review paper. Now, approximately 10 years after this review publication, many new algorithms have been developed and tested to classify EEG signals in BCIs. The time is therefore ripe for an updated review of EEG classification algorithms for BCIs. Approach: We surveyed the BCI and machine learning literature from 2007 to 2017 to identify the new classification approaches that have been investigated to design BCIs. We synthesize these studies in order to present such algorithms, to report how they were used for BCIs, what were the outcomes, and to identify their pros and cons. Main results: We found that the recently designed classification algorithms for EEG-based BCIs can be divided into four main categories: adaptive classifiers, matrix and tensor classifiers, transfer learning and deep learning, plus a few other miscellaneous classifiers. Among these, adaptive classifiers were demonstrated to be generally superior to static ones, even with unsupervised adaptation. Transfer learning can also prove useful although the benefits of transfer learning remain unpredictable. Riemannian geometry-based methods have reached state-of-the-art performances on multiple BCI problems and deserve to be explored more thoroughly, along with tensor-based methods. Shrinkage linear discriminant analysis and random forests also appear particularly useful for small training samples settings. On the other hand, deep learning methods have not yet shown convincing improvement over state-of-the-art BCI methods. 
Significance: This paper provides a comprehensive overview of the modern classification algorithms used in EEG-based BCIs, presents the principles of these methods and guidelines on when and how to use them, and identifies a number of challenges to further advance EEG classification in BCI.
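The review's observation that shrinkage linear discriminant analysis suits small training sets can be illustrated with a short sketch. The data below is a synthetic stand-in for EEG band-power features (the shapes and class-mean shift are illustrative assumptions, not the paper's setup), and the fixed shrinkage coefficient is a simplification: practical BCI pipelines usually estimate it automatically, e.g. via Ledoit-Wolf, as in scikit-learn's LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto").

```python
import numpy as np

def shrinkage_lda_fit(X, y, lam=0.5):
    """Two-class LDA with the pooled covariance shrunk toward a scaled
    identity. With few trials and many features, the raw covariance is
    singular; shrinkage regularizes it so the solve stays well-posed."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    cov = 0.5 * (np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False))
    d = cov.shape[0]
    shrunk = (1 - lam) * cov + lam * (np.trace(cov) / d) * np.eye(d)
    w = np.linalg.solve(shrunk, m1 - m0)   # discriminant direction
    b = -w @ (m0 + m1) / 2                 # threshold at class midpoint
    return w, b

def shrinkage_lda_predict(X, w, b):
    return (X @ w + b > 0).astype(int)

# Synthetic "EEG features": 40 trials per class, 64 features,
# class 1 shifted by half a standard deviation per feature.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (40, 64)),
               rng.normal(0.5, 1.0, (40, 64))])
y = np.array([0] * 40 + [1] * 40)

w, b = shrinkage_lda_fit(X, y)
acc = (shrinkage_lda_predict(X, w, b) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

Without the shrinkage term (lam=0), the 64x64 covariance estimated from 80 trials would be ill-conditioned and the solve unstable, which is exactly the small-sample failure mode the review highlights.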

1,280 citations

Book ChapterDOI
TL;DR: The possibilities to collect and store data increase at a faster rate than the ability to use it for making decisions, and in most applications, raw data has no value in itself; instead the authors want to extract the information contained in it.
Abstract: We are living in a world which faces a rapidly increasing amount of data to be dealt with on a daily basis. In the last decade, the steady improvement of data storage devices and means to create and collect data along the way influenced our way of dealing with information: Most of the time, data is stored without filtering and refinement for later use. Virtually every branch of industry or business, and any political or personal activity nowadays generate vast amounts of data. Making matters worse, the possibilities to collect and store data increase at a faster rate than our ability to use it for making decisions. However, in most applications, raw data has no value in itself; instead we want to extract the information contained in it.

1,047 citations

Journal ArticleDOI
Christophe Schlick1
TL;DR: A new BRDF model is presented which can be viewed as a kind of intermediary model between empiricism and theory, especially intended for computer graphics applications, and therefore includes two main features: simplicity and efficiency.
Abstract: A new BRDF model is presented which can be viewed as a kind of intermediary model between empiricism and theory. Main results of physics are observed (energy conservation, the reciprocity rule, microfacet theory) and numerous phenomena involved in light reflection are accounted for in a physically plausible way (incoherent and coherent reflection, spectrum modifications, anisotropy, self-shadowing, multiple surface and subsurface reflection, differences between homogeneous and heterogeneous materials). The model is especially intended for computer graphics applications and therefore includes two main features: simplicity (a small number of intuitively understandable parameters controls the model) and efficiency (the formulation is well suited to Monte Carlo rendering techniques and/or hardware implementations).

483 citations

Journal ArticleDOI
01 Jul 2008
TL;DR: This paper presents a set of algorithms, implemented in the PT-Scotch software package, which allows one to order large graphs in parallel, yielding orderings the quality of which is only slightly worse than the one of state-of-the-art sequential algorithms.
Abstract: The parallel ordering of large graphs is a difficult problem, because on the one hand minimum degree algorithms do not parallelize well, and on the other hand the obtainment of high quality orderings with the nested dissection algorithm requires efficient graph bipartitioning heuristics, the best sequential implementations of which are also hard to parallelize. This paper presents a set of algorithms, implemented in the PT-Scotch software package, which allows one to order large graphs in parallel, yielding orderings the quality of which is only slightly worse than the one of state-of-the-art sequential algorithms. Our implementation uses the classical nested dissection approach but relies on several novel features to solve the parallel graph bipartitioning problem. Thanks to these improvements, PT-Scotch produces consistently better orderings than ParMeTiS on large numbers of processors.
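This is not PT-Scotch's actual implementation, but the nested dissection idea the abstract builds on can be sketched on the simplest possible graph, a path, where the middle vertex is an optimal separator: order each half recursively first and the separator last, so that fill-in during sparse factorization stays confined to the halves.

```python
def nested_dissection_path(vertices):
    """Toy nested dissection ordering for a path graph, given as a
    contiguous list of vertex ids. The middle vertex separates the
    path into two independent halves; it is eliminated last."""
    if len(vertices) <= 2:
        return list(vertices)
    mid = len(vertices) // 2
    separator = [vertices[mid]]
    left = nested_dissection_path(vertices[:mid])
    right = nested_dissection_path(vertices[mid + 1:])
    return left + right + separator  # halves first, separator last

order = nested_dissection_path(list(range(7)))
print(order)  # [0, 2, 1, 4, 6, 5, 3]
```

The hard part on general graphs, and the focus of the paper, is finding small, balanced separators in parallel; on a path the separator is trivial, which is what keeps this sketch short.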

376 citations


Authors

Showing all 657 results

Name                        H-index   Papers   Citations
Vincent Lepetit             70        268      26207
Pierrick Coupé              45        183      9147
Bruno Courcelle             43        180      11012
Fabien Lotte                42        179      9441
Mireille Bousquet-Mélou     41        138      5098
Charles Consel              40        196      5362
Cyril Gavoille              38        174      4562
David James Sherman         37        94       9263
Igor Walukiewicz            33        121      4635
Toufik Mansour              33        556      4961
Arnaud Legrand              33        143      4980
Anca Muscholl               32        133      3276
Chokri Ben Amar             31        310      3357
André Raspaud               31        157      3091
Raymond Namyst              30        105      4559
Network Information
Related Institutions (5)
Microsoft
86.9K papers, 4.1M citations

85% related

AT&T Labs
5.5K papers, 483.1K citations

85% related

Paris Dauphine University
6.9K papers, 162.7K citations

84% related

École normale supérieure de Cachan
5.5K papers, 175.9K citations

83% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2022   3
2021   79
2020   107
2019   107
2018   81
2017   97