when: May 23, 2014 at 2PM | where: Laboratoire AstroParticule et Cosmologie (check for the directions here), room Valentin |

**who:**

Yannick Mellier, Institut d’Astrophysique de Paris. More info. | François Lanusse, Laboratoire AIM. More info. | Erwan Le Pennec, Centre de Mathématiques Appliquées, École Polytechnique. More info. |

**what:**

**14:00-14:35:** Yannick Mellier (Institut d’Astrophysique de Paris and CEA/IRFU/SAp). More info.

Title: *The Euclid Project.*

Abstract: Euclid is an ESA M-class mission that was selected in October 2011. Euclid aims at understanding the origin of the accelerating expansion of the Universe by observing signatures of dark energy, modified gravity and dark matter on the expansion history and the growth rate of cosmic structure. Euclid will use 5 complementary and/or independent cosmological probes: weak lensing, baryon acoustic oscillations, redshift-space distortion, clusters of galaxies and the integrated Sachs-Wolfe effect. The payload module will be composed of a 1.2 meter telescope that will feed a wide-field, high-image-quality optical imager and a wide-field near-infrared photometer and spectrometer. The instruments will measure the shapes of about 1.5 billion galaxies and the redshifts of several tens of millions of galaxies observed over the whole darkest extragalactic sky (15,000 square degrees). I will present the mission and its scientific objectives and will show its performance and capability to pin down the properties and the history of the dark universe.

**and**

**14:40-15:15:** François Lanusse (AIM). More info.

Title: *Sparsity-based tools to map the invisible universe from weak lensing data*

Abstract:

Most of the matter content of the universe is composed of dark matter, which would remain invisible to us were it not for its gravitational influence. In particular, the slight deformation of background galaxy images near dark matter halos (weak gravitational lensing) is a very powerful probe of the dark matter distribution and is at the heart of the Euclid mission. Weak lensing data can be used in a number of different ways to probe the universe and constrain cosmology, but one particular application is the mapping of the dark matter distribution. I will present the different challenges inherent to this mapping problem and how they can be addressed in the sparsity framework (with wavelet filtering, sparse inpainting, sparse regularization of inverse problems, …).

More specifically, I will present a recently proposed method to map the 3D dark matter distribution with unprecedented accuracy using a sparsity-based algorithm.
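To give a flavor of the sparse regularization mentioned above, here is a minimal sketch of solving a linear inverse problem with an l1 penalty via iterative soft-thresholding (ISTA). The random measurement operator, toy signal, and parameter values are illustrative assumptions, not the actual lensing operator or the speaker's algorithm; in the lensing application the sparsity would also live in a wavelet dictionary rather than the identity basis used here.

```python
import numpy as np

def soft_threshold(x, lam):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, y, lam, n_iter=200):
    """ISTA: minimize 0.5 * ||A x - y||^2 + lam * ||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)             # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy example: recover a sparse signal from noisy, underdetermined measurements.
rng = np.random.default_rng(0)
n, m = 100, 40                               # signal length, number of measurements
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(0, 3, 5)
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = ista(A, y, lam=0.05)
```

The l1 penalty is what encodes the prior that the solution is sparse; swapping the identity basis for a wavelet transform (and the toy operator for the lensing operator) gives the structure of the reconstruction problems discussed in the talk.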

**15:15-15:45:** Coffee and discussion.

**15:45-16:20:** Erwan Le Pennec (Centre de Mathématiques Appliquées, École Polytechnique). More info.

Title: *Non-supervised hyperspectral image segmentation, a conditional density approach*

Abstract: Located at the SOLEIL Synchrotron (Saint-Aubin, France), IPANEMA is a platform unique in the world, dedicated to the study of ancient materials. It supports research projects on ancient materials using the synchrotron beamlines and develops novel methodological tools to be used in these studies. The high-quality light beam of SOLEIL allows, for instance, the acquisition of high-resolution, high signal-to-noise-ratio hyperspectral images. These tools have proved very useful in the ancient-materials context, as shown by the conclusive study of Stradivarius varnish, for instance. Segmenting a hyperspectral image into homogeneous regions in an unsupervised manner would be a powerful preprocessing tool to accelerate the data-analysis flow.

We propose a novel unsupervised hyperspectral image segmentation algorithm. This algorithm extends the classical Gaussian Mixture Model based unsupervised classification technique by incorporating a spatial flavor into the model: the spectra are modeled by a mixture of K classes, each having a Gaussian distribution, whose mixing proportions depend on the position. Using a piecewise-constant structure for those mixing proportions, we are able to construct a penalized maximum likelihood procedure that estimates the number of classes as well as all the other parameters. This algorithm is supported by a theoretical analysis of the corresponding conditional density estimation problem. Using deviation bounds on suprema of empirical processes, we obtain, under mild assumptions on the model collection, a choice of penalty leading to an oracle-type inequality for the corresponding penalized maximum likelihood estimate.
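The core modeling idea, a Gaussian mixture whose mixing proportions vary spatially but are piecewise constant, can be sketched with a simple EM fit. This is an illustrative simplification, not the speaker's method: here the spatial partition (blocks) and the number of classes K are fixed in advance, and covariances are isotropic, whereas the talk's procedure selects the partition and K via a penalized maximum likelihood criterion.

```python
import numpy as np

def spatial_gmm_em(X, blocks, K, n_iter=50):
    """EM for a Gaussian mixture whose mixing proportions are constant
    within each given spatial block but vary between blocks.
    X: (n, d) spectra; blocks: (n,) block index per pixel (0..B-1)."""
    n, d = X.shape
    B = blocks.max() + 1
    # Deterministic init: spread the K means over quantiles of the data.
    mu = np.quantile(X, (np.arange(K) + 0.5) / K, axis=0)   # (K, d)
    var = np.full(K, X.var())                               # isotropic variances
    pi = np.full((B, K), 1.0 / K)                           # per-block proportions
    for _ in range(n_iter):
        # E-step: responsibilities with block-dependent mixing proportions.
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)      # (n, K)
        log_p = -0.5 * sq / var - 0.5 * d * np.log(2 * np.pi * var)
        log_p += np.log(pi[blocks])
        log_p -= log_p.max(axis=1, keepdims=True)           # numerical stability
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update means, variances, and per-block proportions.
        Nk = r.sum(0)
        mu = (r.T @ X) / Nk[:, None]
        var = (r * ((X[:, None, :] - mu[None]) ** 2).sum(-1)).sum(0) / (d * Nk)
        for b in range(B):
            pi[b] = r[blocks == b].mean(0)
    return mu, var, pi, r.argmax(1)

# Toy example: 1-D "spectra" in two blocks, each dominated by one class.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (300, 1)), rng.normal(5, 0.5, (300, 1))])
blocks = np.array([0] * 300 + [1] * 300)
mu, var, pi, labels = spatial_gmm_em(X, blocks, K=2)
```

Making the proportions `pi` depend on the block is what injects spatial information into an otherwise standard mixture model; the talk's penalized criterion then trades off model fit against the complexity of the partition and the number of classes.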

In this talk, I will describe the algorithm and briefly explain how our penalized maximum likelihood approach for conditional densities provides a theoretical justification of this methodology. I will explain how to implement such a method and show some numerical experiments.