
Deterministic and Bayesian Sparsity enforcing models in signal and image processing

Ali Mohammad-Djafari
Laboratoire des Signaux et Systèmes (L2S), UMR8506 CNRS-CentraleSupélec-Univ Paris Sud
SUPELEC, 91192 Gif-sur-Yvette, France
http://lss.centralesupelec.fr
Email: [email protected]
http://djafari.free.fr
http://publicationslist.org/djafari

Keynote talk at SITIS 2016, Napoli, Italy


Contents

1. Sparsity?
2. Sparsity in data, signal and image processing
3. Modelling for sparse representation
4. Sparsity as a deterministic regularizer
5. Bayesian approach: Maximum A Posteriori (MAP) and link with regularization
6. Prior models for enforcing (promoting) sparsity:
   - Heavy tailed: Double Exponential, Generalized Gaussian, ...
   - Mixture models: Mixture of Gaussians, Student-t, ...
   - Hierarchical models with hidden variables
   - General Gauss-Markov-Potts models
7. Bayesian computational tools: Joint Maximum A Posteriori (JMAP), MCMC and Variational Bayesian Approximation (VBA)
8. Applications in inverse problems: X-ray Computed Tomography, Microwave and Ultrasound imaging, Satellite and Hyperspectral image processing, ...


Sparsity?

- Signal and image representation and modelling: many real-world signals, sounds and images can be represented by a sparse model.
- Sparse modelling in inverse problems and machine learning: sparsity can be used as a regularizer to avoid overfitting in many machine learning problems: feature selection, SVMs, ...
- Sparsity as a tool for fast algorithms: sparsity can be exploited for fast computations, e.g. matrix factorisation for recommender systems and sparse solutions in kernel machines.


Sparse signals and images

- Sparse signals: direct sparsity
- Sparse images: direct sparsity

(Figure: examples of directly sparse images: B&W text, cell colony, embryos)

Sparse signals and images

- Sparse signals in a transform domain
- Sparse images in a transform domain

Sparse signals and images

- Sparse signals in the time and Fourier domains (figure panels: time domain, Fourier domain)
- Sparse images in the space and Fourier domains (figure panels: space domain, Fourier domain)

Sparse signals and images

- Sparse signals: sparsity in a transform domain

Sparse signals and images


Sparse signals and images (Fourier and wavelet domains)

(Figure panels: image, Fourier, wavelets; image histogram, Fourier coefficient histogram, wavelet coefficient histogram; bands 1-3, bands 4-6, bands 7-9)

Finite and sparse representation: some references

- 1948: Shannon: sampling theorem and reconstruction of a band-limited signal
- 1993-2007:
  - Mallat, Zhang, Candès, Romberg, Tao and Baraniuk: non-linear sampling, compression and reconstruction
  - Fuchs: sparse representation
  - Donoho, Elad, Tibshirani, Tropp, Duarte, Laska: Compressive Sampling, Compressive Sensing
- 2007-2016: deterministic algorithms for sparse representation and Compressive Sampling: Matching Pursuit (MP), Projection Pursuit Regression, Pure Greedy Algorithm, OMP, Basis Pursuit (BP), Dantzig Selector (DS), Least Absolute Shrinkage and Selection Operator (LASSO), Iterative Hard Thresholding, ...
- 2003-2016: Bayesian approach to sparse modelling: Tipping, Bishop: Sparse Bayesian Learning, Relevance Vector Machine (RVM), sparsity enforcing priors, ...


Modelling and representation

- Modelling via decomposition (basis, codebook, dictionary, design matrix):

  g(t) = \sum_{j=1}^{N} f_j \phi_j(t), \quad t = 1, \dots, T \quad \longrightarrow \quad g = \Phi f

  (Figure: example with g(t) of length T = 100, \Phi of size [100 x 35], and f_j with N = 35 coefficients, 7 of them nonzero.)
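As an illustration, the synthesis model g = Φf with T = 100, N = 35 and 7 nonzero coefficients can be sketched in a few lines of NumPy. The Gaussian random dictionary is an assumption for illustration only; the slides do not specify the atoms.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, K = 100, 35, 7            # sizes from the slide: 7-sparse out of N = 35

# Dictionary Phi: columns are atoms phi_j(t) (random Gaussian, for illustration).
Phi = rng.standard_normal((T, N))

# Sparse coefficient vector f: only K of the N entries are nonzero.
f = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
f[support] = rng.standard_normal(K)

# Synthesis: g = Phi f, i.e. g(t) = sum_j f_j phi_j(t).
g = Phi @ f
print(g.shape)  # (100,)
```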



Modelling and representation

(Figure: the decomposition in matrix form, g(t) = \sum_j \phi_j(t) f_j, i.e. g = \Phi f, with T = 100, \Phi of size [100 x 35] and N = 35.)


Modelling and representation

- Modelling via a basis (codebook, dictionary, design matrix):

  g(t) = \sum_{j=1}^{N} f_j \phi_j(t), \quad t = 1, \dots, T \quad \longrightarrow \quad g = \Phi f

- When T \geq N, the least-squares solution:

  \hat{f} = \arg\min_f \sum_{t=1}^{T} \Big( g(t) - \sum_{j=1}^{N} f_j \phi_j(t) \Big)^2

  \hat{f} = \arg\min_f \|g - \Phi f\|_2^2 = [\Phi' \Phi]^{-1} \Phi' g

- When the basis is orthogonal, \Phi' \Phi = I \longrightarrow \hat{f} = \Phi' g:

  \hat{f}_j = \sum_{t=1}^{T} g(t) \phi_j(t) = \langle g(t), \phi_j(t) \rangle

- Application in compression, transmission and decompression
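A minimal numerical check of the two cases above: the general T >= N least-squares estimate, and the orthonormal shortcut where the coefficients are just the correlations Φ'g. The random dictionary and random orthonormal basis are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 35
Phi = rng.standard_normal((T, N))
f_true = rng.standard_normal(N)
g = Phi @ f_true

# T >= N: least-squares estimate [Phi' Phi]^{-1} Phi' g
# (computed with lstsq rather than an explicit inverse, for stability).
f_hat, *_ = np.linalg.lstsq(Phi, g, rcond=None)
assert np.allclose(f_hat, f_true)

# Orthonormal basis (Phi' Phi = I): f_hat_j = <g, phi_j>, no inversion needed.
Q, _ = np.linalg.qr(rng.standard_normal((T, T)))   # random orthonormal basis
g2 = Q @ rng.standard_normal(T)
f_hat2 = Q.T @ g2
assert np.allclose(Q @ f_hat2, g2)
```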

Modelling and representation

- When the basis is overcomplete (N > T): infinite number of solutions for \Phi f = g; we have to select one.
- Minimum norm solution:

  \hat{f} = \arg\min_{f:\, \Phi f = g} \|f\|_2^2

  or, written differently, minimize \|f\|_2^2 subject to \Phi f = g, resulting in:

  \hat{f} = \Phi' [\Phi \Phi']^{-1} g

- Again, if \Phi \Phi' = I \longrightarrow \hat{f} = \Phi' g. No real interest if we have to keep all the N coefficients.
- Sparsity: minimize \|f\|_0 subject to \Phi f = g, or minimize \|f\|_1 subject to \Phi f = g
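The minimum-norm solution can be checked numerically: it satisfies Φf = g exactly, but is dense (all N coefficients nonzero), which is what motivates the l0/l1 formulations. The random overcomplete dictionary is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 35, 100     # overcomplete: N > T, so Phi f = g has infinitely many solutions
Phi = rng.standard_normal((T, N))
g = rng.standard_normal(T)

# Minimum-norm solution f_hat = Phi' [Phi Phi']^{-1} g.
f_hat = Phi.T @ np.linalg.solve(Phi @ Phi.T, g)

assert np.allclose(Phi @ f_hat, g)                   # it solves Phi f = g ...
assert np.allclose(f_hat, np.linalg.pinv(Phi) @ g)   # ... and equals the pseudo-inverse solution
# Note: f_hat is generally dense, so nothing is gained for compression.
```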

Sparse decomposition (MP and OMP)

- Strict sparsity and exact reconstruction:

  minimize \|f\|_0 subject to \Phi f = g

  where \|f\|_0 is the number of non-zero elements of f.

- Matching Pursuit (MP) [Mallat & Zhang, 1993]: MP is a greedy algorithm that finds one atom at a time: find the one atom that best matches the signal; given the previously found atoms, find the next one that best fits; continue to the end.
- Orthogonal Matching Pursuit (OMP) [Lin, Huang et al., 1993]: OMP is an improved version of MP that re-evaluates the coefficients after each round.
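The OMP loop described above (match an atom, then re-fit all selected coefficients, then iterate on the residual) can be sketched as follows. This is a minimal NumPy version assuming unit-norm atoms and a known sparsity level K; the toy dictionary is illustrative.

```python
import numpy as np

def omp(Phi, g, K):
    """Orthogonal Matching Pursuit: greedily select K atoms, re-fitting
    all selected coefficients by least squares at every round."""
    N = Phi.shape[1]
    residual = g.copy()
    support = []
    f_hat = np.zeros(N)
    for _ in range(K):
        # MP step: atom most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # OMP refinement: re-estimate ALL selected coefficients jointly.
        coef, *_ = np.linalg.lstsq(Phi[:, support], g, rcond=None)
        f_hat[:] = 0.0
        f_hat[support] = coef
        residual = g - Phi @ f_hat
    return f_hat

# Toy check on a synthetic 3-sparse signal (sizes are illustrative).
rng = np.random.default_rng(3)
Phi = rng.standard_normal((100, 35))
Phi /= np.linalg.norm(Phi, axis=0)     # unit-norm atoms
f_true = np.zeros(35); f_true[[3, 11, 29]] = [1.5, -2.0, 0.7]
g = Phi @ f_true
f_hat = omp(Phi, g, K=3)
print(np.nonzero(f_hat)[0])
```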

Sparse decomposition (BP, PPR, BCR, IHT, ...)

- Sparsity enforcing and exact reconstruction:

  minimize \|f\|_1 subject to \Phi f = g

- This problem is convex (linear programming).
- Very efficient solvers have been developed:
  - Basis Pursuit (BP)
  - Projection Pursuit Regression
  - Block Coordinate Relaxation (BCR)
  - Greedy algorithms
  - Iterative Hard Thresholding (IHT)
  - Interior point methods [Chen, Donoho & Saunders (95)]
  - Iterated shrinkage [Figueiredo & Nowak (03); Daubechies, Defrise & De Mol (04); Elad (05); Elad, Matalon & Zibulevsky (06); Marvasti et al.]

Sparse decomposition algorithms

- Strict sparsity and exact reconstruction:

  minimize \|f\|_0 subject to g = \Phi f

- Strict sparsity and approximate reconstruction:

  minimize \|f\|_0 subject to \|g - \Phi f\|_2^2 < c

  Both are NP-hard, so we look for other solutions.

- Sparsity promoting and exact reconstruction: Basis Pursuit (BP):

  minimize \|f\|_1 subject to \Phi f = g

- Sparsity promoting and approximate reconstruction:

  minimize \|f\|_1 subject to \|g - \Phi f\|_2^2 < c

  or equivalently (LASSO):

  \hat{f} = \arg\min_f \{J(f)\} \quad with \quad J(f) = \frac{1}{2}\|g - \Phi f\|_2^2 + \lambda \|f\|_1
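The LASSO criterion J(f) can be minimized by iterated shrinkage (ISTA), one of the solver families cited on the previous slide. A minimal sketch, with the step size 1/L taken from the spectral norm of Φ; the sizes, λ and dictionary are illustrative assumptions.

```python
import numpy as np

def soft(x, thresh):
    """Soft-thresholding: the proximal operator of thresh * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - thresh, 0.0)

def ista(Phi, g, lam, n_iter=500):
    """ISTA for J(f) = 0.5 ||g - Phi f||_2^2 + lam ||f||_1:
    gradient step on the quadratic term, then shrink."""
    L = np.linalg.norm(Phi, 2) ** 2      # Lipschitz constant of the gradient
    f = np.zeros(Phi.shape[1])
    for _ in range(n_iter):
        f = soft(f + Phi.T @ (g - Phi @ f) / L, lam / L)
    return f

rng = np.random.default_rng(4)
Phi = rng.standard_normal((100, 35)) / 10.0
f_true = np.zeros(35); f_true[[5, 20]] = [3.0, -2.0]
g = Phi @ f_true + 0.01 * rng.standard_normal(100)
f_hat = ista(Phi, g, lam=0.05)
```

Starting from f = 0 with step 1/L, each ISTA iteration decreases J(f), so the result always improves on the all-zero solution.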

Sparse decomposition applications

- Denoising: g = f + \epsilon with f = \Phi z:

  J(z) = \frac{1}{2}\|g - \Phi z\|_2^2 + \lambda \|z\|_1

  Once \hat{z} is computed, we can compute \hat{f} = \Phi \hat{z}.

- Compressed sensing and linear inverse problems: g = H f + \epsilon with f = \Phi z:

  J(z) = \frac{1}{2}\|g - H \Phi z\|_2^2 + \lambda \|z\|_1

  Once \hat{z} is computed, we can compute \hat{f} = \Phi \hat{z}.

- Linear inverse problems with a piecewise-constant prior: g = H f + \epsilon with D f = z sparse and z_j \sim DE(\lambda):

  J(f) = \frac{1}{2}\|g - H f\|_2^2 + \lambda \|D f\|_1
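For the compressed-sensing / linear inverse problem case, the criterion in z is an ordinary l1 problem once the composite dictionary A = HΦ is formed, so any LASSO solver applies and f is recovered as Φẑ. A sketch; H here is a hypothetical random measurement operator, not one from the slides.

```python
import numpy as np

rng = np.random.default_rng(5)
M, T, N = 40, 100, 128

H = rng.standard_normal((M, T)) / np.sqrt(M)   # measurement operator (hypothetical)
Phi = rng.standard_normal((T, N))              # synthesis dictionary: f = Phi z
z_true = np.zeros(N); z_true[[7, 50, 90]] = [1.0, -1.0, 2.0]
f_true = Phi @ z_true
g = H @ f_true + 0.01 * rng.standard_normal(M)

# J(z) = 0.5 ||g - H Phi z||^2 + lam ||z||_1 is a LASSO in z
# with the composite dictionary A = H Phi:
A = H @ Phi
# ... hand (A, g) to any l1 solver; once z_hat is computed, f_hat = Phi @ z_hat.
```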

Sparse decomposition algorithms (unitary decomposition)

J(z) = \frac{1}{2}\|g - \Phi z\|_2^2 + \lambda \|z\|_1

- When \Phi \Phi' = \Phi' \Phi = I:

  J(z) = \frac{1}{2}\|\Phi' g - \Phi' \Phi z\|_2^2 + \lambda \|z\|_1 = \frac{1}{2}\|z_0 - z\|_2^2 + \lambda \|z\|_1 \quad with \quad z_0 = \Phi' g

  which is a separable criterion:

  J(z) = \frac{1}{2}\|z - z_0\|_2^2 + \lambda \|z\|_1 = \sum_j \Big[ \frac{1}{2}|z_j - z_{0j}|^2 + \lambda |z_j| \Big]

- Closed-form solution (shrinkage):

  \hat{z}_j = 0 if |z_{0j}| < \lambda, \quad \hat{z}_j = z_{0j} - \mathrm{sign}(z_{0j})\,\lambda otherwise
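The shrinkage rule above is elementwise soft-thresholding of the transform coefficients z0 = Φ'g. A minimal sketch for a unitary Φ; the random orthonormal basis and coefficient values are illustrative.

```python
import numpy as np

def shrink(z0, lam):
    """Closed-form minimizer of 0.5|z - z0|^2 + lam|z|, elementwise:
    z_j = 0 if |z0_j| < lam, else z0_j - sign(z0_j)*lam."""
    return np.where(np.abs(z0) < lam, 0.0, z0 - np.sign(z0) * lam)

# For a unitary Phi (Phi' Phi = I) the l1 problem decouples:
# compute z0 = Phi' g, then shrink each coefficient independently.
rng = np.random.default_rng(6)
Phi, _ = np.linalg.qr(rng.standard_normal((8, 8)))   # random orthonormal basis
g = Phi @ np.array([3.0, -2.0, 0.1, 0.0, 0.05, 1.0, -0.2, 0.0])
z0 = Phi.T @ g
z_hat = shrink(z0, lam=0.5)
print(z_hat)   # [2.5, -1.5, 0, 0, 0, 0.5, 0, 0]: small coefficients are zeroed
```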

Sparse decomposition algorithms (LASSO and extensions)

- LASSO:

  J(f) = \|g - \Phi f\|_2^2 + \lambda \sum_j |f_j|

- Other criteria:
  - Lp:

    J(f) = \|g - \Phi f\|_2^2 + \lambda_1 \sum_j |f_j|^p, \quad 1