Jean-Louis Durrieu
Researcher at École Polytechnique Fédérale de Lausanne
Publications - 17
Citations - 2120
Jean-Louis Durrieu is an academic researcher from École Polytechnique Fédérale de Lausanne. The author has contributed to research on the topics of source separation and audio signal processing. The author has an h-index of 13 and has co-authored 17 publications receiving 1974 citations. Previous affiliations of Jean-Louis Durrieu include Télécom ParisTech and École Normale Supérieure.
Papers
Journal ArticleDOI
Nonnegative matrix factorization with the Itakura-Saito divergence: With application to music analysis
TL;DR: Results indicate that IS-NMF correctly captures the semantics of audio and is better suited to the representation of music signals than NMF with the usual Euclidean and KL costs.
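To make the IS-NMF idea concrete, here is a minimal sketch of nonnegative matrix factorization under the Itakura-Saito divergence using standard multiplicative updates. This is a generic illustration, not the exact algorithm of the paper; the function names and the random initialization are my own.

```python
import numpy as np

def is_divergence(V, Vh, eps=1e-12):
    """Itakura-Saito divergence D_IS(V | Vh), summed over all entries."""
    R = V / (Vh + eps)
    return np.sum(R - np.log(R + eps) - 1.0)

def is_nmf(V, rank, n_iter=200, seed=0, eps=1e-12):
    """Factor a nonnegative matrix V (e.g. a power spectrogram) as W @ H,
    minimizing the Itakura-Saito divergence via multiplicative updates."""
    rng = np.random.default_rng(seed)
    F, T = V.shape
    W = rng.random((F, rank)) + eps
    H = rng.random((rank, T)) + eps
    for _ in range(n_iter):
        Vh = W @ H + eps
        # Update H: ratio of gradient parts specific to the IS cost
        H *= (W.T @ (V / Vh**2)) / (W.T @ (1.0 / Vh))
        Vh = W @ H + eps
        # Update W symmetrically
        W *= ((V / Vh**2) @ H.T) / ((1.0 / Vh) @ H.T)
    return W, H
```

Because the IS divergence is scale-invariant, this cost weights low-energy time-frequency bins on a par with high-energy ones, which is one reason it suits audio power spectra better than the Euclidean cost.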
Journal ArticleDOI
Source/Filter Model for Unsupervised Main Melody Extraction From Polyphonic Audio Signals
TL;DR: A new signal model is proposed in which the leading vocal part is explicitly represented by a specific source/filter model; it reaches state-of-the-art performance on all test sets.
Journal ArticleDOI
A Musically Motivated Mid-Level Representation for Pitch Estimation and Musical Audio Source Separation
TL;DR: A source/filter signal model is proposed that provides a mid-level representation exposing both the pitch content of the signal and some timbre information, thereby keeping as much information from the raw data as possible.
Proceedings ArticleDOI
Multichannel nonnegative tensor factorization with structured constraints for user-guided audio source separation
TL;DR: This work addresses the problem of separating multiple tracks from professionally produced music recordings with a user-guided approach: the separation system is provided segmental information indicating the time activations of the particular instruments to separate, and achieves sufficient quality for real-world music editing applications.
Proceedings ArticleDOI
Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
TL;DR: Lower and upper bounds for the KL divergence between Gaussian mixture models are proposed; they lead to a new approximation, offer insights into previously proposed approximations, and can be used to validate assumptions on the models.
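The KL divergence between two Gaussian mixture models has no closed form, which is what motivates bounds and approximations like those in the paper. As a point of comparison, here is a plain Monte Carlo estimator for 1-D mixtures (a generic baseline, not the bounds proposed in the paper; all names are illustrative).

```python
import numpy as np

def gmm_logpdf(x, weights, means, stds):
    """Log density of a 1-D Gaussian mixture at the points in x."""
    x = np.asarray(x)[:, None]  # shape (n, 1) against (k,) components
    comp = (-0.5 * ((x - means) / stds) ** 2
            - np.log(stds) - 0.5 * np.log(2.0 * np.pi))
    return np.log(np.exp(comp) @ weights)

def mc_kl(wf, mf, sf, wg, mg, sg, n=100_000, seed=0):
    """Monte Carlo estimate of KL(f || g) for two 1-D GMMs:
    draw samples from f, average log f(x) - log g(x)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(wf), size=n, p=wf)   # pick mixture components
    x = rng.normal(mf[idx], sf[idx])          # sample within each component
    return np.mean(gmm_logpdf(x, wf, mf, sf) - gmm_logpdf(x, wg, mg, sg))
```

Such an estimator is unbiased but needs many samples for a stable value; deterministic lower and upper bounds avoid the sampling noise entirely, which is the practical appeal of the bound-based approach.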