New preprint

We have uploaded a substantial piece of work by P.-J. Bénard, carried out during his PhD:

Estimation of off-the-grid sparse spikes with over-parametrized projected gradient descent: theory and application. P.-J. Bénard, Y. Traonmilin, J.-F. Aujol and E. Soubies, 2023.

Abstract: “In this article, we study the problem of recovering sparse spikes with over-parametrized projected gradient descent. We first provide a theoretical study of approximate recovery with our chosen initialization method: Continuous Orthogonal Matching Pursuit without Sliding. Then we study the effect of over-parametrization on gradient descent, which highlights the benefits of the projection step. Finally, we show the improved calculation times of our algorithm compared to state-of-the-art model-based methods on realistic simulated microscopy data.”
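
For intuition, here is a minimal numpy sketch of the over-parametrized descent idea. It is my illustration under simplifying assumptions, not the paper's algorithm: the measurements are random Fourier samples, the initialization is random (the paper initializes with Continuous OMP), and the projection is reduced to a box constraint on the positions.

```python
import numpy as np

# Toy setup: recover spike positions t_k and amplitudes a_k from random
# Fourier measurements y_m = sum_k a_k * exp(-1j * w_m * t_k).
rng = np.random.default_rng(0)
M = 64
w = rng.normal(0.0, 10.0, size=M)          # random measurement frequencies
t_true = np.array([0.2, 0.5, 0.8])
a_true = np.array([1.0, -0.7, 0.5])
y = (a_true[None, :] * np.exp(-1j * w[:, None] * t_true[None, :])).sum(axis=1)

# Over-parametrization: descend on K = 6 spikes although the signal has 3.
K = 6
a = 0.1 * rng.standard_normal(K)           # random init; the paper instead
t = rng.uniform(0.0, 1.0, K)               # initializes with Continuous OMP

step = 1e-4
for _ in range(5000):
    E = np.exp(-1j * w[:, None] * t[None, :])   # (M, K) Fourier atoms
    r = E @ a - y                               # residual of the fit
    a -= step * 2 * np.real(E.conj().T @ r)
    t -= step * 2 * np.real((-1j * w[:, None] * E * a[None, :]).conj().T @ r)
    t = np.clip(t, 0.0, 1.0)  # toy stand-in for the paper's projection step
```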

New preprint

We have uploaded the last piece of work from Hui Shi's PhD thesis:

Batch-less stochastic gradient descent for compressive learning of deep regularization for image denoising, H. Shi, Y. Traonmilin and J.-F. Aujol, 2023.

Abstract: “We consider the problem of denoising with the help of prior information taken from a database of clean signals or images. Denoising with variational methods is very efficient if a regularizer well adapted to the nature of the data is available. Thanks to the maximum a posteriori Bayesian framework, such a regularizer can be systematically linked with the distribution of the data. With deep neural networks (DNN), complex distributions can be recovered from a large training database. To reduce the computational burden of this task, we adapt the compressive learning framework to the learning of regularizers parametrized by DNN. We propose two variants of stochastic gradient descent (SGD) for the recovery of deep regularization parameters from a heavily compressed database. These algorithms outperform the initially proposed method, which was limited to low-dimensional signals and used information from the whole database at each iteration. They also benefit from classical SGD convergence guarantees. Thanks to these improvements we show that this method can be applied for patch-based image denoising.”
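
To make the sketching idea concrete, here is a toy numpy illustration (my simplification, not the paper's method): the database is compressed once into a vector of random generalized moments, and the stochastic gradient iterations then draw their “batches” from entries of that sketch rather than from data points. The single-centroid model below stands in for the DNN-parametrized regularizers of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, M = 10000, 2, 256
X = rng.normal(loc=[2.0, -1.0], scale=0.3, size=(n, d))  # database of clean signals
W = rng.normal(0.0, 0.5, size=(M, d))                    # sketching frequencies

# One pass compresses the whole database into M generalized moments;
# the data can then be discarded.
z = np.exp(1j * X @ W.T).mean(axis=0)

# "Batch-less" SGD: mini-batches are subsets of sketch entries, not data points.
# The "model" here is a single centroid theta (the papers learn DNN parameters).
theta = np.zeros(d)
step = 0.05
for _ in range(3000):
    S = rng.choice(M, size=32, replace=False)
    e = np.exp(1j * W[S] @ theta)       # model moments on the sampled entries
    r = e - z[S]
    theta -= step * 2 * np.real((np.conj(r) * 1j * e) @ W[S]) / S.size

print(theta)  # should land near the data mean [2, -1]
```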

Compressive learning of deep regularization for denoising

Hui Shi will be presenting “Compressive learning of deep regularization for denoising” at SSVM 2023.

Abstract: “Solving ill-posed inverse problems can be done accurately if a regularizer well adapted to the nature of the data is available. Such a regularizer can be systematically linked with the distribution of the data itself through the maximum a posteriori Bayesian framework. Recently, regularizers designed with the help of deep neural networks have achieved impressive success. Such regularizers are typically learned from voluminous training data. To reduce the computational burden of this task, we propose to adapt the compressive learning framework to the learning of regularizers parametrized by deep neural networks (DNN). Our work shows the feasibility of batchless learning of regularizers from a compressed dataset. In order to achieve this, we propose an approximation of the compression operator that can be calculated explicitly for the task of learning a regularizer by DNN. We show that the proposed regularizer is capable of modeling complex regularity priors and can be used to solve the denoising inverse problem.”
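
As a concrete picture of the variational setup, here is a minimal numpy sketch of MAP denoising by gradient descent on F(x) = ||x − y||²/(2σ²) + R(x). The hand-written smoothness regularizer is only a stand-in for the deep regularizer learned in the paper.

```python
import numpy as np

# Gradient descent on F(x) = ||x - y||^2 / (2 * sigma^2) + R(x), with a toy
# smoothness regularizer R standing in for the learned DNN regularizer.
rng = np.random.default_rng(2)
n, sigma = 256, 0.1
clean = np.sin(np.linspace(0, 4 * np.pi, n))    # toy 1-D "image"
y = clean + sigma * rng.standard_normal(n)      # noisy observation

lam = 20.0
def grad_R(x):
    # Gradient of R(x) = lam * sum_i (x[i+1] - x[i])^2
    diff = np.diff(x)
    g = np.zeros_like(x)
    g[:-1] -= 2 * lam * diff
    g[1:] += 2 * lam * diff
    return g

x = y.copy()
step = 1e-3
for _ in range(500):
    x -= step * ((x - y) / sigma**2 + grad_R(x))
# x is now the MAP estimate under the toy prior
```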

Talk at iTWIST 2020

I will give a talk about our work on “Projected gradient descent for non-convex sparse spike estimation” at iTWIST 2020 in Nantes.

Past news

• I will be at the Optimisation on Measure Spaces Workshop in November to present my latest work on sparse spike super-resolution.

• I will be presenting my work with J.-F. Aujol at the SPARS 2019 workshop. Title of my talk: “On the non-convex sparse spike estimation problem: explicit basins of attraction of global minimizers”.

• I will be presenting a follow-up of our work on optimal regularization at iTWIST 2018 (November 21st-23rd) (abstract): Is the 1-norm the best convex sparse regularization?, Yann Traonmilin, Samuel Vaiter and Rémi Gribonval. Stay tuned for the full manuscript!

• I will be presenting work on the concept of optimal regularization at NCMIP 2018 (May 25th) and at the Institut de Mathématiques de Toulouse's “séminaire MIP” (May 22nd).

• With the support of the GDR MIA, we are organizing a thematic workshop on “Sparsity and applications” (journée parcimonie et application) in Bordeaux on May 3rd, 2018. Check the website for our call for contributions and organizational details.

• I will be presenting my latest work on spike super-resolution at the IOP seminar at IMB Bordeaux on November 9th, 2017, and at the SPOC seminar of the Institut de Mathématiques de Bourgogne in Dijon on December 13th, 2017.

• My latest work on statistical learning, super-resolution and phase unmixing will be presented at the SPARS 2017 workshop. We have three communications:

Spikes super-resolution with random Fourier sampling, Y. Traonmilin, N. Keriven, R. Gribonval and G. Blanchard.

Random Moments for Sketched Mixture Learning, N. Keriven, R. Gribonval, G. Blanchard and Y. Traonmilin.

Signal Separation with Magnitude Constraints: a Phase Unmixing Problem, A. Deleforge and Y. Traonmilin.

• I will be presenting my latest work on compressive statistical learning and super-resolution (follow-up of Compressive K-means) at the SMAI 2017 congress in June.

Super-résolution d’impulsions de Diracs par échantillonnage de Fourier aléatoire (Super-resolution of Dirac pulses by random Fourier sampling). Y. Traonmilin, N. Keriven, R. Gribonval and G. Blanchard. Abstract in French.

• A preprint digest of our work on compressed sensing in Hilbert spaces (with Gilles Puy, Rémi Gribonval and Mike E. Davies) is available:

Compressed sensing in Hilbert spaces. Y. Traonmilin, G. Puy, R. Gribonval and M. E. Davies

• Our work with Nicolas Keriven, Nicolas Tremblay and Rémi Gribonval has been accepted at ICASSP 2017. It showcases how a database can be compressed in order to perform the K-means clustering task on huge volumes of data (a toy sketch of the idea follows the reference below):

Compressive K-means. N. Keriven, N. Tremblay, Y. Traonmilin and R. Gribonval
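
For the flavor of the approach, here is a deliberately simplified numpy sketch of sketched clustering; the hand-picked initialization and the plain gradient descent on the sketch-matching objective are my shortcuts where Compressive K-means uses a CL-OMP recovery algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
n, d, M, K = 100000, 2, 256, 2
X = np.concatenate([rng.normal([-2.0, 0.0], 0.3, (n // 2, d)),
                    rng.normal([2.0, 1.0], 0.3, (n // 2, d))])
W = rng.normal(0.0, 0.7, size=(M, d))

# Stream over the database once to build the sketch; afterwards memory
# is O(M), independent of n.
z = np.zeros(M, dtype=complex)
for chunk in np.array_split(X, 100):
    z += np.exp(1j * chunk @ W.T).sum(axis=0)
z /= n

# Fit K centroids to the sketch by gradient descent (the paper instead uses
# a CL-OMP recovery algorithm); the initialization below is hand-picked.
C = np.array([[-1.0, 0.0], [1.0, 0.0]])
step = 0.5
for _ in range(2000):
    E = np.exp(1j * W @ C.T)            # (M, K) moments of each centroid
    r = E.mean(axis=1) - z              # sketch-matching residual
    for k in range(K):
        C[k] -= step * 2 * np.real((np.conj(r) * 1j * E[:, k]) @ W) / (K * M)

print(C)  # rows should approach the true centers [-2, 0] and [2, 1]
```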

• Our work with Antoine Deleforge has been accepted at ICASSP 2017. This work opens an interesting new line of research in source separation:

Phase Unmixing: Multichannel Source Separation with Magnitude Constraints. A. Deleforge and Y. Traonmilin.

• Our article Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all has been accepted for publication in Applied and Computational Harmonic Analysis (follow the link to access the “in press” ACHA version).

• I gave a talk on September 13th at the IEEE Information Theory Workshop 2016 in Cambridge, UK.

• Our latest preprint is available: Stable recovery of low-dimensional cones in Hilbert spaces: One RIP to rule them all, Yann Traonmilin and Rémi Gribonval. https://hal.archives-ouvertes.fr/hal-01207987

• I presented my work on March 10th at the mathematics for image processing seminar of Descartes University and Télécom ParisTech in Paris.

• I presented my work on January 29th, 2016 at the Rennes Statistics Seminar.

• I presented my work with R. Gribonval at the GDR ISIS day in Marseille on October 8th: http://www.gdr-isis.fr/index.php?page=reunion&idreunion=280

• Journée Science et Musique 2015, organised by the PANAMA team in Rennes, takes place on September 26th! http://jsm.irisa.fr/

• I presented my work with R. Gribonval at the missDATA2015 conference in Rennes: http://missdata2015.agrocampus-ouest.fr/infoglueDeliverLive/