Sparsity in signal and image processing: from modeling and representation to reconstruction and processing

Ali Mohammad-Djafari
Laboratoire des Signaux et Systèmes, UMR8506 CNRS-SUPELEC-UNIV PARIS SUD
SUPELEC, 91192 Gif-sur-Yvette, France
http://lss.supelec.free.fr
Email: [email protected]
http://djafari.free.fr
http://publicationslist.org/djafari

Seminar 2, given at Tsinghua University, Beijing, China, December 10, 2013

Contents

1. Sparse signals and images
2. First ideas for using sparsity in signal processing
3. Modeling for sparse representation
4. Bayesian Maximum A Posteriori (MAP) approach and link with Deterministic Regularization
5. Priors which enforce sparsity
   ◮ Heavy tailed: Double Exponential, Generalized Gaussian, ...
   ◮ Mixture models: Mixture of Gaussians, Student-t, ...
   ◮ Hierarchical models with hidden variables
   ◮ General Gauss-Markov-Potts models
6. Computational tools: Joint Maximum A Posteriori (JMAP), MCMC and Variational Bayesian Approximation (VBA)
7. Applications: X-ray Computed Tomography, Microwave and Ultrasound imaging, Satellite and Hyperspectral image processing, Spectrometry, CMB, ...


1. Sparse signals and images

◮ Sparse signals: Direct sparsity
  [Figure: a signal over 0-180 samples with only a few non-zero values.]
◮ Sparse images: Direct sparsity
  [Figure: a 250x250 image with only a few non-zero pixels.]

Sparse signals and images

◮ Sparse signals in a Transform domain
  [Figure: two signals over 0-200 samples, with values in [-1, 1], whose transform coefficients are sparse.]
◮ Sparse images in a Transform domain

Sparse signals and images

◮ Sparse signals in Fourier domain
  [Figure: a signal in the time domain and its sparse spectrum in the Fourier domain.]
◮ Sparse images in wavelet domain
  [Figure: an image in the space domain and its representation in the Fourier domain.]
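The point these figures make can be checked numerically: a signal that is dense in the time domain can be supported on only a handful of Fourier coefficients. A minimal numpy sketch (not from the slides; the two-tone test signal is my own choice):

```python
import numpy as np

# A two-tone signal: dense in the time domain, sparse in the Fourier domain.
t = np.arange(256)
g = 2.0 * np.sin(2 * np.pi * 10 * t / 256) + 1.0 * np.cos(2 * np.pi * 40 * t / 256)

G = np.fft.fft(g)

# Almost every time sample is non-zero ...
n_time = np.count_nonzero(np.abs(g) > 1e-8)
# ... but only 4 Fourier bins are active (frequencies +/-10 and +/-40).
n_freq = np.count_nonzero(np.abs(G) > 1e-8)
```

Because both tones sit exactly on DFT bins, the spectrum has exactly four active coefficients up to floating-point round-off.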

Sparse signals and images

◮ Sparse signals: Sparsity in a Transform domain
  [Figure: example signals over 0-200 samples together with their transform-domain coefficients, which are sparse.]

Sparse signals and images

[Figure: four 250x250 example images illustrating sparsity.]

Sparse signals and images

[Figure: an image, its Fourier coefficients, and its wavelet coefficients (bands 1-3, bands 4-6, bands 7-9), together with the image histogram, the Fourier coefficient histogram, and the wavelet coefficient histogram.]

2. First ideas: some history

◮ 1948: Shannon: Sampling theorem and reconstruction of a band-limited signal
◮ 1993-2007:
  ◮ Mallat, Zhang, Candès, Romberg, Tao and Baraniuk: Non-linear sampling, Compression and reconstruction
  ◮ Fuchs: Sparse representation
  ◮ Donoho, Elad, Tibshirani, Tropp, Duarte, Laska: Compressive Sampling, Compressive Sensing
◮ 2007-2012: Algorithms for sparse representation and compressive sampling: Matching Pursuit (MP), Projection Pursuit Regression, Pure Greedy Algorithm, OMP, Basis Pursuit (BP), Dantzig Selector (DS), Least Absolute Shrinkage and Selection Operator (LASSO), ...
◮ 2003-2012: Bayesian approach to sparse modeling: Tipping, Bishop: Sparse Bayesian Learning, Relevance Vector Machine (RVM), ...


3. Modeling and representation

◮ Modeling via a basis (codebook, overcomplete dictionary, Design Matrix):

  g(t) = \sum_{j=1}^{N} f_j \phi_j(t), \quad t = 1, \cdots, T \quad \longrightarrow \quad g = \Phi' f

◮ When T ≥ N:

  \hat{f}_j = \arg\min_{f_j} \sum_{t=1}^{T} \Big( g(t) - \sum_{j=1}^{N} f_j \phi_j(t) \Big)^2
  \quad\longrightarrow\quad
  \hat{f} = \arg\min_{f} \|g - \Phi' f\|^2 = [\Phi \Phi']^{-1} \Phi g

◮ When the basis is orthogonal, \Phi \Phi' = I, so \hat{f} = \Phi g:

  \hat{f}_j = \sum_{t=1}^{T} g(t)\, \phi_j(t) = \langle g(t), \phi_j(t) \rangle

◮ Application in Compression, Transmission and Decompression
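These formulas are easy to sanity-check numerically. A numpy sketch (not from the slides; a random orthonormal basis of my own choosing, stored as the columns of `Phi`):

```python
import numpy as np

rng = np.random.default_rng(0)
T = N = 100

# A random orthonormal basis: columns phi_j, so Phi.T @ Phi == I.
Phi, _ = np.linalg.qr(rng.standard_normal((T, N)))

f_true = np.zeros(N)
f_true[[3, 17, 42]] = [2.0, -1.0, 0.5]    # a few active coefficients
g = Phi @ f_true                           # g(t) = sum_j f_j phi_j(t)

# General case: least-squares estimate of the coefficients.
f_ls, *_ = np.linalg.lstsq(Phi, g, rcond=None)

# Orthonormal case: plain inner products <g, phi_j> recover the same thing.
f_ip = Phi.T @ g
```

Both estimates recover `f_true` exactly here, which is the point of the orthogonal-basis simplification.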


Modeling and representation

◮ When the basis is overcomplete (N > T): infinite number of solutions for \Phi' f = g. We have to select one:

  \hat{f} = \arg\min_{f:\ \Phi' f = g} \|f\|_2^2

  or, written differently: minimize \|f\|_2^2 subject to \Phi' f = g, resulting in:

  \hat{f} = \Phi [\Phi' \Phi]^{-1} g

◮ Again, if \Phi' \Phi = I, then \hat{f} = \Phi g.
◮ No real interest if we have to keep all the N > T coefficients.
◮ Sparsity:

  minimize \|f\|_0 subject to \Phi' f = g
  or
  minimize \|f\|_1 subject to \Phi' f = g
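The minimum-norm formula is exactly what the Moore-Penrose pseudoinverse computes, and it shows why this selection is of no interest for sparsity: it reproduces the data but spreads energy over nearly all N coefficients. A numpy sketch (dimensions and the random dictionary are my own choices):

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 20, 50                              # overcomplete: N > T
Phit = rng.standard_normal((T, N))         # the T x N matrix Phi'

f_sparse = np.zeros(N)
f_sparse[[5, 30]] = [1.0, -2.0]            # a 2-sparse generator of the data
g = Phit @ f_sparse

# Minimum-norm solution f = Phi [Phi' Phi]^{-1} g, i.e. pinv(Phi') @ g.
f_min = np.linalg.pinv(Phit) @ g

n_active = np.count_nonzero(np.abs(f_min) > 1e-8)
```

`f_min` satisfies the constraint exactly, yet `n_active` is far larger than the 2 coefficients that generated `g`.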


Sparse decomposition

◮ Strict sparsity and exact reconstruction:

  minimize \|f\|_0 subject to \Phi' f = g

  where \|f\|_0 is the number of non-zero elements of f. Algorithms:
  ◮ Matching Pursuit (MP) [Mallat & Zhang, 1993]
  ◮ Orthogonal Matching Pursuit (OMP) [Lin, Huang et al., 1993]
  ◮ Projection Pursuit Regression
  ◮ Greedy Algorithms

◮ Sparsity enforcing and exact reconstruction:

  minimize \|f\|_1 subject to \Phi' f = g

  ◮ Basis Pursuit (BP)
  ◮ Block Coordinate Relaxation
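The greedy MP idea is short enough to sketch. This is a minimal illustration in numpy, not code from the slides; the dictionary and the 2-sparse test vector are my own choices:

```python
import numpy as np

def matching_pursuit(g, Phi, n_iter=20):
    """Greedy MP: repeatedly pick the atom (unit-norm column of Phi)
    most correlated with the current residual and deflate it."""
    f = np.zeros(Phi.shape[1])
    r = g.astype(float).copy()
    for _ in range(n_iter):
        c = Phi.T @ r                   # correlations with the residual
        j = int(np.argmax(np.abs(c)))   # best-matching atom
        f[j] += c[j]
        r -= c[j] * Phi[:, j]           # deflate the residual
    return f

rng = np.random.default_rng(2)
Phi = rng.standard_normal((30, 60))
Phi /= np.linalg.norm(Phi, axis=0)      # unit-norm atoms
f0 = np.zeros(60)
f0[[4, 50]] = [3.0, -2.0]
g = Phi @ f0
f_mp = matching_pursuit(g, Phi)
```

On this easy instance MP finds the two generating atoms and drives the residual down quickly; OMP differs in re-fitting all selected coefficients by least squares at each step.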


Sparse decomposition

◮ Strict sparsity and approximate reconstruction:

  minimize \|f\|_0 subject to \|g - \Phi' f\|^2 < c

  \hat{f} = \arg\min_f \big\{ \|f\|_0 + \mu \|g - \Phi' f\|^2 \big\} = \arg\min_f \big\{ \|g - \Phi' f\|^2 + \lambda \|f\|_0 \big\}

◮ Sparsity enforcing and approximate reconstruction:

  \hat{f} = \arg\min_f \big\{ \|g - \Phi' f\|^2 + \lambda \|f\|_1 \big\}

  J(f) = \|g - \Phi' f\|^2 + \lambda \|f\|_1 = \|g - \Phi' f\|^2 + \lambda \sum_j |f_j|

◮ Main Algorithm: LASSO [Tibshirani 2003]:

  minimize \|g - \Phi' f\|^2 subject to \|f\|_1 < \tau
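One standard way to minimize the penalized criterion J(f) is iterative soft-thresholding (ISTA): a gradient step on the quadratic term followed by the l1 proximal step. This is a generic sketch, not an algorithm taken from the slides; problem sizes and λ are my own choices:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(g, A, lam, n_iter=2000):
    """Minimize ||g - A f||^2 + lam * ||f||_1 by iterative soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2       # 2L bounds the Lipschitz constant
    f = np.zeros(A.shape[1])
    for _ in range(n_iter):
        f = soft(f + A.T @ (g - A @ f) / L, lam / (2.0 * L))
    return f

rng = np.random.default_rng(3)
A = rng.standard_normal((40, 80))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
f0 = np.zeros(80)
f0[[10, 60]] = [4.0, -3.0]
f_hat = ista(A @ f0, A, lam=0.1)
```

For small λ the recovered `f_hat` is sparse and close to `f0`, up to the soft-thresholding shrinkage bias.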


Sparse Decomposition Algorithms

◮ LASSO:

  J(f) = \|g - \Phi' f\|^2 + \lambda \sum_j |f_j|

◮ Other Criteria:

  Lp: J(f) = \|g - \Phi' f\|^2 + \lambda_1 \sum_j |f_j|^p

Priors which enforce sparsity: the Double Exponential prior can be written as a scale mixture of Gaussians:

  DE(f|a) = \frac{a}{2} \exp[-a|f|] = \int_0^\infty N(f|0, z)\, E\big(z|\tfrac{a^2}{2}\big)\, dz

  p(f|z) = \prod_j p(f_j|z_j) = \prod_j N(f_j|0, z_j) \propto \exp\Big[-\tfrac{1}{2} \sum_j f_j^2 / z_j\Big]

  p\big(z|\tfrac{a^2}{2}\big) = \prod_j E\big(z_j|\tfrac{a^2}{2}\big) \propto \exp\Big[-\sum_j \tfrac{a^2}{2} z_j\Big]

  p\big(f, z|\tfrac{a^2}{2}\big) \propto \exp\Big[-\tfrac{1}{2} \sum_j \big( f_j^2 / z_j + a^2 z_j \big)\Big]

◮ With these models we have:
  ◮ Simple priors: p(f, θ|g) ∝ p(g|f, θ_1) p(f|θ_2) p(θ)
  ◮ Hierarchical priors: p(f, z, θ|g) ∝ p(g|f, θ_1) p(f|z, θ_2) p(z|θ_3) p(θ)
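The scale-mixture identity can be verified by simulation: sampling the hierarchy z ~ Exponential(a²/2), f|z ~ N(0, z) must reproduce the heavy-tailed Double Exponential marginal. A numpy check of my own (a = 2 chosen arbitrarily; note numpy parameterizes the exponential by scale = 1/rate):

```python
import numpy as np

rng = np.random.default_rng(4)
a = 2.0
n = 200_000

# Hierarchical sampling: z_j ~ Exponential(rate a^2/2), then f_j | z_j ~ N(0, z_j).
z = rng.exponential(scale=2.0 / a**2, size=n)
f = rng.normal(0.0, np.sqrt(z))

# The marginal of f should be Double Exponential (a/2) exp(-a|f|):
# its standard deviation is sqrt(2)/a and its kurtosis is 6 (vs 3 for a Gaussian).
std_f = f.std()
kurt_f = (f**4).mean() / f.var() ** 2
```

The excess kurtosis (kurt_f − 3 ≈ 3) is exactly the heavy-tail behaviour that makes this prior sparsity enforcing.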


Bayesian Computation and Algorithms

◮ Once the expression of p(f, θ|g) or of p(f, z, θ|g) is obtained, we have the following options:

◮ Joint MAP (needs optimization algorithms):

  (\hat{f}, \hat{θ}) = \arg\max_{(f, θ)} \{ p(f, θ|g) \}

◮ MCMC: needs the expressions of the conditionals p(f|z, θ, g), p(z|f, θ, g), and p(θ|f, z, g)

◮ Variational Bayesian Approximation (VBA): approximate p(f, z, θ|g) by a separable distribution

  q(f, z, θ|g) = q_1(f)\, q_2(z)\, q_3(θ)

  and do all computations with these separable ones.


Joint MAP

p(f, θ|g) ∝ p(g|f, θ_1) p(f|θ_2) p(θ)

◮ Objective:

  (\hat{f}, \hat{θ}) = \arg\max_{(f, θ)} \{ p(f, θ|g) \}

◮ Alternate optimization: starting from an initial θ^{(0)}, iterate

  \hat{f} = \arg\max_f \{ p(f, \hat{θ}|g) \} = \arg\max_f \{ p(f|\hat{θ}, g) \}

  \hat{θ} = \arg\max_θ \{ p(\hat{f}, θ|g) \} = \arg\max_θ \{ p(θ|\hat{f}, g) \}

◮ Uncertainties are not propagated.

MCMC based algorithm

p(f, z, θ|g) ∝ p(g|f, z, θ) p(f|z, θ) p(z) p(θ)

General scheme (Gibbs Sampling):

◮ Generate samples from the conditionals:

  \hat{f} \sim p(f|\hat{z}, \hat{θ}, g) \longrightarrow \hat{z} \sim p(z|\hat{f}, \hat{θ}, g) \longrightarrow \hat{θ} \sim p(θ|\hat{f}, \hat{z}, g)

◮ Wait for convergence
◮ Compute empirical statistics (means, modes, variances) from the samples:

  E\{f\} \approx \frac{1}{N} \sum_n f^{(n)}
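The Gibbs scheme can be demonstrated on a toy target whose conditionals are known in closed form (this is not the slides' imaging model; a zero-mean bivariate Gaussian with correlation ρ is my own stand-in):

```python
import numpy as np

rng = np.random.default_rng(5)
rho, n = 0.8, 20_000

# Gibbs sampling for a zero-mean bivariate Gaussian with correlation rho:
# each full conditional is Gaussian, x|y ~ N(rho*y, 1 - rho^2) and vice versa.
x = y = 0.0
xs = np.empty(n)
ys = np.empty(n)
for i in range(n):
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    xs[i], ys[i] = x, y

# Empirical statistics from the chain approximate posterior expectations.
mean_x = xs.mean()
corr_xy = np.corrcoef(xs, ys)[0, 1]
```

The sample mean and sample correlation converge to the target's values (0 and ρ), which is exactly the "compute empirical statistics from the samples" step above.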


Variational Bayesian Approximation

◮ Approximate p(f, θ|g) by q(f, θ|g) = q_1(f|g) q_2(θ|g) and then continue computations.
◮ Criterion: KL(q(f, θ|g) : p(f, θ|g)):

  KL(q : p) = \iint q \ln \frac{q}{p} = \iint q_1 q_2 \ln \frac{q_1 q_2}{p}

◮ Iterative algorithm q_1 → q_2 → q_1 → q_2, ...:

  q_1(f) \propto \exp\big[ \langle \ln p(g, f, θ; M) \rangle_{q_2(θ)} \big]

  q_2(θ) \propto \exp\big[ \langle \ln p(g, f, θ; M) \rangle_{q_1(f)} \big]

◮ Uncertainties are propagated (Message Passing methods)
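For a Gaussian target these coordinate updates have a well-known closed form, which makes a compact illustration. A toy sketch of my own (not the slides' model): mean-field approximation of a correlated bivariate Gaussian, which also exhibits the classical caveat that the factorized q underestimates the marginal variances.

```python
import numpy as np

Sigma = np.array([[1.0, 0.8],
                  [0.8, 1.0]])            # target p = N(0, Sigma)
Lam = np.linalg.inv(Sigma)                # precision matrix

# Mean-field q(x1, x2) = q1(x1) q2(x2): each factor's mean is updated from the
# other's, and each factor's variance is 1 / Lam_ii (standard Gaussian result).
m = np.array([1.0, -1.0])                 # arbitrary initialization
for _ in range(50):
    m[0] = -Lam[0, 1] / Lam[0, 0] * m[1]
    m[1] = -Lam[1, 0] / Lam[1, 1] * m[0]
v = 1.0 / np.diag(Lam)
```

The iteration converges to the true means (here zero), while the factor variances 1/Λ_ii fall below the true marginal variances on the diagonal of Σ.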


Summary of Bayesian approach

◮ Simple priors: a hyper-prior model p(θ|α, β) with p(θ_1|α_1, β_1) and p(θ_2|α_2, β_2), the prior p(f|θ_2) and the likelihood p(g|f, θ_1) combine into the joint posterior p(f, θ|g, α, β), which is then handled by JMAP, MCMC or VBA to obtain \hat{f} and \hat{θ}.

◮ Hierarchical priors: a hyper-prior model p(θ|α, β, γ) with p(θ_1|α_1, β_1), p(θ_2|α_2, β_2) and p(θ_3|α_3, β_3), the hidden-variable prior p(z|θ_3), the prior p(f|z, θ_2) and the likelihood p(g|f, θ_1) combine into the joint posterior p(f, z, θ|g), handled by JMAP, MCMC or VBA to obtain \hat{f}, \hat{z} and \hat{θ}.


Advantages of the Bayesian Approach

◮ More possibilities to model sparsity
◮ More tools to handle hyperparameters
◮ More tools to account for uncertainties
◮ More possibilities to understand and to control many ad hoc deterministic algorithms
◮ Hierarchical models give still more modeling possibilities:
  ◮ Bernoulli-Gaussian: strict sparsity
  ◮ Bernoulli-Gamma: strict sparsity + positivity
  ◮ Bernoulli-Multinomial: strict sparsity + discrete values (finite states)
  ◮ Independent Mixture models: sparsity enforcing
  ◮ Mixture of multivariate models: group sparsity enforcing
  ◮ Gauss-Markov-Potts models: sparsity in transform domains
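The first of these, the Bernoulli-Gaussian prior, is trivial to simulate, which makes the "strict sparsity" claim concrete: every draw has exact zeros by construction. A numpy sketch (the activity probability 0.1 is my own choice):

```python
import numpy as np

rng = np.random.default_rng(9)
n, p_active = 200, 0.1

# Bernoulli-Gaussian draw: each sample is zero with probability 1 - p_active,
# and N(0, 1) otherwise -- strictly sparse by construction.
mask = rng.random(n) < p_active
f = np.where(mask, rng.normal(0.0, 1.0, n), 0.0)
```

Replacing the Gaussian with a Gamma draw gives the Bernoulli-Gamma variant (sparsity plus positivity) in the same way.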


Examples of Hierarchical models

[Figure: sample signals drawn from eight hierarchical models: Bernoulli-Gaussian, Bernoulli-Gamma, Bernoulli-Binomial, Bernoulli-Multinomial, Piecewise constant, Piecewise Gaussian, Gauss-Markov-Potts 1 and Gauss-Markov-Potts 2.]

Which class of images am I looking for?

[Figure: an example image.]

Which class of signals am I looking for?

◮ Gauss-Markov
◮ Generalized GM
◮ Piecewise Gaussian
◮ Mixture of GM: Gauss-Markov-Potts

Gauss-Markov-Potts prior models for images

f(r), z(r), with q(r) = 1 - δ(z(r) - z(r'))

  p(f(r)|z(r) = k, m_k, v_k) = N(m_k, v_k)

  p(f(r)) = \sum_k P(z(r) = k)\, N(m_k, v_k) \quad \text{(Mixture of Gaussians)}

◮ Separable iid hidden variables: p(z) = \prod_r p(z(r))
◮ Markovian hidden variables: p(z) Potts-Markov:

  p(z(r)|z(r'), r' \in V(r)) \propto \exp\Big[ \gamma \sum_{r' \in V(r)} \delta(z(r) - z(r')) \Big]

  p(z) \propto \exp\Big[ \gamma \sum_{r \in R} \sum_{r' \in V(r)} \delta(z(r) - z(r')) \Big]
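The conditional on the left is exactly what a Gibbs sweep of the Potts field samples from. A minimal numpy sketch of my own (4-neighbour grid, K = 3 labels, γ = 1.5 chosen arbitrarily), showing how the interaction term γ produces homogeneous label regions:

```python
import numpy as np

def gibbs_sweep_potts(z, gamma, K, rng):
    """One Gibbs sweep of a K-label Potts field on a 4-neighbour grid:
    p(z(r) = k | neighbours) ∝ exp(gamma * #{neighbours of r with label k})."""
    H, W = z.shape
    for i in range(H):
        for j in range(W):
            counts = np.zeros(K)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < H and 0 <= nj < W:
                    counts[z[ni, nj]] += 1
            p = np.exp(gamma * counts)
            z[i, j] = rng.choice(K, p=p / p.sum())
    return z

rng = np.random.default_rng(6)
z = rng.integers(0, 3, size=(32, 32))      # random initial labels
for _ in range(30):
    z = gibbs_sweep_potts(z, gamma=1.5, K=3, rng=rng)

# With gamma well above 0 the field develops homogeneous regions:
agree = float(np.mean(z[:, :-1] == z[:, 1:]))
```

Starting from iid labels (where only about 1/3 of neighbouring pairs agree), the sweeps drive the agreement fraction up sharply.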


Four different cases

Two variables, f(r) and z(r), are associated with each pixel of the image:

◮ f|z Gaussian iid, z iid: Mixture of Gaussians
◮ f|z Gauss-Markov, z iid: Mixture of Gauss-Markov
◮ f|z Gaussian iid, z Potts-Markov: Mixture of Independent Gaussians (MIG with Hidden Potts)
◮ f|z Markov, z Potts-Markov: Mixture of Gauss-Markov (MGM with hidden Potts)

Application of CT in NDT

Reconstruction from only 2 projections:

  g_1(x) = \int f(x, y)\, dy, \qquad g_2(y) = \int f(x, y)\, dx

◮ Given the marginals g_1(x) and g_2(y), find the joint distribution f(x, y).
◮ Infinite number of solutions: f(x, y) = g_1(x)\, g_2(y)\, \Omega(x, y), where \Omega(x, y) is a Copula:

  \int \Omega(x, y)\, dx = 1 \quad \text{and} \quad \int \Omega(x, y)\, dy = 1
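The non-uniqueness is easy to exhibit on a discrete grid: the separable candidate built from the two projections always matches them, whatever the true image was. A numpy sketch (a random normalized 16x16 "image" of my own):

```python
import numpy as np

rng = np.random.default_rng(7)
f = rng.random((16, 16))
f /= f.sum()                               # a normalized test "image"

g1 = f.sum(axis=1)                         # g1(x): sum of f over y
g2 = f.sum(axis=0)                         # g2(y): sum of f over x

# The separable candidate f0(x, y) = g1(x) g2(y) (uniform copula Omega = 1)
# matches both projections exactly, even though f0 != f in general.
f0 = np.outer(g1, g2)
```

Any valid copula Ω would give another image with the same two projections, which is why prior information (e.g. the Gauss-Markov-Potts models above) is needed to select a solution.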


Application in CT

[Figure: a 120x120 phantom and its projections.]

  g = Hf + ε

◮ g|f: iid Gaussian noise, g|f ∼ N(Hf, σ_ε² I)
◮ f|z: Gaussian iid or Gauss-Markov
◮ z: iid or Potts
◮ q(r) = 1 − δ(z(r) − z(r')) ∈ {0, 1}: binary

  p(f, z, θ|g) ∝ p(g|f, θ_1) p(f|z, θ_2) p(z|θ_3) p(θ)

Proposed algorithms

p(f, z, θ|g) ∝ p(g|f, θ_1) p(f|z, θ_2) p(z|θ_3) p(θ)

• MCMC based general scheme:

  \hat{f} \sim p(f|\hat{z}, \hat{θ}, g) \longrightarrow \hat{z} \sim p(z|\hat{f}, \hat{θ}, g) \longrightarrow \hat{θ} \sim p(θ|\hat{f}, \hat{z}, g)

Iterative algorithm:

◮ Estimate f using p(f|\hat{z}, \hat{θ}, g) ∝ p(g|f, \hat{θ}) p(f|\hat{z}, \hat{θ}): needs optimization of a quadratic criterion.
◮ Estimate z using p(z|\hat{f}, \hat{θ}, g) ∝ p(g|\hat{f}, \hat{z}, \hat{θ}) p(z): needs sampling of a Potts Markov field.
◮ Estimate θ using p(θ|\hat{f}, \hat{z}, g) ∝ p(g|\hat{f}, σ_ε² I) p(\hat{f}|\hat{z}, (m_k, v_k)) p(θ): conjugate priors lead to analytical expressions.

• Variational Bayesian Approximation:

◮ Approximate p(f, z, θ|g) by q_1(f) q_2(z) q_3(θ)

Results

[Figure: reconstructions of a 120x120 phantom: Original, Backprojection, Filtered BP, LS, Gauss-Markov+pos, GM+Line process, GM+Label process, together with the estimated labels z and contours c.]

Application in Microwave imaging

  g(ω) = \int f(r) \exp[-j(ω \cdot r)]\, dr + ε(ω)

  g(u, v) = \iint f(x, y) \exp[-j(ux + vy)]\, dx\, dy + ε(u, v)

  g = Hf + ε

[Figure: the object f(x, y), the data g(u, v), the inverse-FT reconstruction \hat{f} (IFT), and the reconstruction \hat{f} with the proposed method.]
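The "IFT" baseline of this slide can be mimicked with a plain 2-D FFT model. This sketch of my own simplifies heavily: H is taken as a complete 2-D Fourier transform, whereas real microwave data would be incomplete and band-limited, which is precisely when the Bayesian reconstruction pays off:

```python
import numpy as np

rng = np.random.default_rng(8)
f = np.zeros((64, 64))
f[24:40, 24:40] = 1.0                      # a simple square object

# Simulated Fourier-domain data g = Hf + noise (H = plain 2-D FT here).
g = np.fft.fft2(f) + 0.1 * rng.standard_normal((64, 64))

# Naive inverse-FT reconstruction.
f_ift = np.fft.ifft2(g).real
rel_err = np.linalg.norm(f_ift - f) / np.linalg.norm(f)
```

With complete data the inverse FT already does well; removing frequency samples from `g` (the realistic case) degrades it quickly.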

Images fusion and joint segmentation (with O. Féron)

  g_i(r) = f_i(r) + ε_i(r)
  p(f_i(r)|z(r) = k) = N(m_{ik}, σ_{ik}²)
  p(f|z) = \prod_i p(f_i|z)

[Figure: two observed images g_1 and g_2, the estimated images \hat{f}_1 and \hat{f}_2, and the common segmentation \hat{z}.]
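The key modeling point, that all channels share the same hidden classification z(r), can be sketched with a per-pixel MAP rule. This toy of my own omits the Potts smoothing of the full model and assumes the class means are known:

```python
import numpy as np

rng = np.random.default_rng(10)
means = np.array([0.0, 3.0])               # class means m_ik (same for both channels here)
sigma = 0.5
z_true = rng.integers(0, 2, size=(32, 32)) # common hidden classification z(r)
g1 = means[z_true] + rng.normal(0.0, sigma, z_true.shape)
g2 = means[z_true] + rng.normal(0.0, sigma, z_true.shape)

# Per-pixel MAP labels under a uniform iid prior on z: since both channels
# share the same z(r), their Gaussian log-likelihoods simply add.
ll = -((g1[..., None] - means) ** 2 + (g2[..., None] - means) ** 2) / (2 * sigma**2)
z_hat = ll.argmax(axis=-1)
accuracy = float((z_hat == z_true).mean())
```

Fusing the two channels makes each pixel's decision much more reliable than either channel alone, which is the point of the joint model.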

Data fusion in medical imaging (with O. Féron)

  g_i(r) = f_i(r) + ε_i(r)
  p(f_i(r)|z(r) = k) = N(m_{ik}, σ_{ik}²)
  p(f|z) = \prod_i p(f_i|z)

[Figure: two medical images g_1 and g_2, their estimates \hat{f}_1 and \hat{f}_2, and the joint segmentation \hat{z}.]

Joint segmentation of hyper-spectral images (with N. Bali & A. Mohammadpour)

  g_i(r) = f_i(r) + ε_i(r)
  p(f_i(r)|z(r) = k) = N(m_{ik}, σ_{ik}²), k = 1, ..., K
  p(f|z) = \prod_i p(f_i|z)
  the m_{ik} follow a Markovian model along the index i

Segmentation of a video sequence of images (with P. Brault)

  g_i(r) = f_i(r) + ε_i(r)
  p(f_i(r)|z_i(r) = k) = N(m_{ik}, σ_{ik}²), k = 1, ..., K
  p(f|z) = \prod_i p(f_i|z_i)
  the z_i(r) follow a Markovian model along the index i

Image separation in Satellite imaging (with H. Snoussi & M. Ichir)

  g_i(r) = \sum_{j=1}^{N} A_{ij} f_j(r) + ε_i(r)
  p(f_j(r)|z_j(r) = k) = N(m_{jk}, σ_{jk}²)
  p(A_{ij}) = N(A_{0ij}, σ_{0ij}²)

[Figure: sources f, observed mixtures g, estimated sources \hat{f} and labels \hat{z}.]

Conclusions

◮ Sparsity: a great property to use in signal and image processing
◮ Origin: sampling theory and reconstruction, modeling and representation, Compressed Sensing, approximation theory
◮ Deterministic algorithms: optimization of a two-term criterion (data term plus penalty/regularization term)
◮ Probabilistic: Bayesian approach
◮ Sparsity enforcing priors: simple heavy tailed, and hierarchical with hidden variables
◮ Gauss-Markov-Potts models for images, incorporating hidden regions and contours
◮ Main Bayesian computation tools: JMAP, MCMC and VBA
◮ Applications in different imaging systems (X-ray CT, microwave, PET and ultrasound imaging)

Current Projects:
◮ Efficient implementation in 2D and 3D cases
◮ Comparison between MCMC and VBA methods

Thanks to: Graduated PhD students:

◮ Sh. Zhu (2012: SAR Imaging)
◮ D. Fall (2012: Positron Emission Tomography, Non Parametric Bayesian)
◮ D. Pougaza (2011: Copula and Tomography)
◮ H. Ayasso (2010: Optical Tomography, Variational Bayes)
◮ S. Fékih-Salem (2009: 3D X-ray Tomography)
◮ N. Bali (2007: Hyperspectral imaging)
◮ O. Féron (2006: Microwave imaging)
◮ F. Humblot (2005: Super-resolution)
◮ M. Ichir (2005: Image separation in Wavelet domain)
◮ P. Brault (2005: Video segmentation using Wavelet domain)
◮ H. Snoussi (2003: Sources separation)
◮ Ch. Soussen (2000: Geometrical Tomography)
◮ G. Montémont (2000: Detectors, Filtering)
◮ H. Carfantan (1998: Microwave imaging)
◮ S. Gautier (1996: Gamma ray imaging for NDT)
◮ M. Nikolova (1994: Piecewise Gaussian models and GNC)
◮ D. Prémel (1992: Eddy current imaging)

Thanks to: Freshly graduated (2013) PhD students:

◮ C. Cai (Multispectral X-ray Tomography)
◮ N. Chu (Acoustic sources localization)
◮ Th. Boulay (Non Cooperative Radar Target Recognition)
◮ R. Prenon (Proteomics and Mass Spectrometry)

Present PhD students:
◮ L. Gharsali (Microwave imaging for Cancer detection)
◮ M. Dumitru (Multivariate time series analysis for biological signals)
◮ S. AlAli (Diffraction imaging for geophysical applications)

Present Master students:
◮ A. Lupu (Non-circular X-ray Tomography)
◮ F. Fuc (Multi component signal analysis for biology applications)

Post-Docs:
◮ J. Lapuyade (2011: Dimensionality Reduction and multivariate analysis)
◮ S. Su (2006: Color image separation)
◮ A. Mohammadpour (2004: HyperSpectral image segmentation)

Thanks to my colleagues and collaborators:

◮ B. Duchêne & A. Joisel (L2S) (Inverse scattering and Microwave Imaging)
◮ N. Gac (L2S) (GPU Implementation)
◮ Th. Rodet (L2S) (Computed Tomography)
◮ A. Vabre & S. Legoupil (CEA-LIST) (3D X-ray Tomography)
◮ E. Barat (CEA-LIST) (Positron Emission Tomography, Non Parametric Bayesian)
◮ C. Comtat (SHFJ, CEA) (PET, Spatio-Temporal Brain activity)
◮ J. Picheral (SSE, Supélec) (Acoustic sources localization)
◮ D. Blacodon (ONERA) (Acoustic sources separation)
◮ J. Lagoutte (Thales Air Systems) (Non Cooperative Radar Target Recognition)
◮ P. Grangeat (LETI, CEA, Grenoble) (Proteomics and Mass Spectrometry)
◮ F. Lévi (CNRS-INSERM, Hopital Paul Brousse) (Biological rhythms and Chronotherapy of Cancer)

References 1

◮ A. Mohammad-Djafari, "Bayesian approach with prior models which enforce sparsity in signal and image processing," EURASIP Journal on Advances in Signal Processing, Special issue on Sparse Signal Processing, 2012.
◮ A. Mohammad-Djafari (Ed.), Problèmes inverses en imagerie et en vision (Vol. 1 et 2), Hermès-Lavoisier, Traité Signal et Image, IC2, 2009.
◮ A. Mohammad-Djafari (Ed.), Inverse Problems in Vision and 3D Tomography, ISTE, Wiley and Sons, ISBN: 9781848211728, December 2009, Hardback, 480 pp.
◮ A. Mohammad-Djafari, "Gauss-Markov-Potts Priors for Images in Computer Tomography Resulting to Joint Optimal Reconstruction and Segmentation," International Journal of Tomography & Statistics 11: W09, 76-92, 2008.
◮ A. Mohammad-Djafari, "Super-Resolution: A short review, a new method based on hidden Markov modeling of HR image and future challenges," The Computer Journal, doi:10.1093/comjnl/bxn005, 2008.
◮ H. Ayasso and A. Mohammad-Djafari, "Joint NDT Image Restoration and Segmentation using Gauss-Markov-Potts Prior Models and Variational Bayesian Computation," IEEE Trans. on Image Processing, TIP-04815-2009.R2, 2010.
◮ H. Ayasso, B. Duchêne and A. Mohammad-Djafari, "Bayesian Inversion for Optical Diffraction Tomography," Journal of Modern Optics, 2008.
◮ N. Bali and A. Mohammad-Djafari, "Bayesian Approach With Hidden Markov Modeling and Mean Field Approximation for Hyperspectral Data Analysis," IEEE Trans. on Image Processing 17(2): 217-225, Feb. 2008.
◮ H. Snoussi and J. Idier, "Bayesian blind separation of generalized hyperbolic processes in noisy and underdeterminate mixtures," IEEE Trans. on Signal Processing, 2006.

References 2

◮ O. Féron, B. Duchêne and A. Mohammad-Djafari, "Microwave imaging of inhomogeneous objects made of a finite number of dielectric and conductive materials from experimental data," Inverse Problems, 21(6): 95-115, Dec 2005.
◮ M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for blind source separation," IEEE Trans. on Signal Processing, 15(7): 1887-1899, Jul 2006.
◮ F. Humblot and A. Mohammad-Djafari, "Super-Resolution using Hidden Markov Model and Bayesian Detection Estimation Framework," EURASIP Journal on Applied Signal Processing, Special number on Super-Resolution Imaging: Analysis, Algorithms, and Applications, ID 36971, 16 pages, 2006.
◮ O. Féron and A. Mohammad-Djafari, "Image fusion and joint segmentation using an MCMC algorithm," Journal of Electronic Imaging, 14(2): paper no. 023014, Apr 2005.
◮ H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging, 13(2): 349-361, April 2004.
◮ A. Mohammad-Djafari, J.F. Giovannelli, G. Demoment and J. Idier, "Regularization, maximum entropy and probabilistic methods in mass spectrometry data processing problems," Int. Journal of Mass Spectrometry, 215(1-3): 175-193, April 2002.
◮ H. Snoussi and A. Mohammad-Djafari, "Estimation of Structured Gaussian Mixtures: The Inverse EM Algorithm," IEEE Trans. on Signal Processing 55(7): 3185-3191, July 2007.
◮ N. Bali and A. Mohammad-Djafari, "A variational Bayesian Algorithm for BSS Problem with Hidden Gauss-Markov Models for the Sources," in: Independent Component Analysis and Signal Separation (ICA 2007), edited by M.E. Davies, Ch.J. James, S.A. Abdallah, M.D. Plumbley, 137-144, Springer (LNCS 4666), 2007.

References 3

◮ N. Bali and A. Mohammad-Djafari, "Hierarchical Markovian Models for Joint Classification, Segmentation and Data Reduction of Hyperspectral Images," ESANN 2006, September 4-8, Belgium, 2006.
◮ M. Ichir and A. Mohammad-Djafari, "Hidden Markov models for wavelet-based blind source separation," IEEE Trans. on Image Processing 15(7): 1887-1899, July 2005.
◮ S. Moussaoui, C. Carteret, D. Brie and A. Mohammad-Djafari, "Bayesian analysis of spectral mixture data using Markov Chain Monte Carlo methods sampling," Chemometrics and Intelligent Laboratory Systems 81(2): 137-148, 2005.
◮ H. Snoussi and A. Mohammad-Djafari, "Fast joint separation and segmentation of mixed images," Journal of Electronic Imaging 13(2): 349-361, April 2004.
◮ H. Snoussi and A. Mohammad-Djafari, "Bayesian unsupervised learning for source separation with mixture of Gaussians prior," Journal of VLSI Signal Processing Systems 37(2/3): 263-279, June/July 2004.
◮ F. Su and A. Mohammad-Djafari, "An Hierarchical Markov Random Field Model for Bayesian Blind Image Separation," International Congress on Image and Signal Processing (CISP 2008), 27-30 May 2008, Sanya, Hainan, China.
◮ N. Chu, J. Picheral and A. Mohammad-Djafari, "A robust super-resolution approach with sparsity constraint for near-field wideband acoustic imaging," IEEE International Symposium on Signal Processing and Information Technology, pp. 286-289, Bilbao, Spain, Dec 14-17, 2011.

Questions, Discussions, Open mathematical problems

◮ Sparsity representation, low rank matrix decomposition
  ◮ Sparsity and positivity or other constraints
  ◮ Group sparsity
  ◮ Algorithmic and implementation issues for great dimensional applications
◮ Optimization of the KL divergence for VBA separable approximation
  ◮ Convergence of alternate optimization
  ◮ Other possible algorithms
◮ Properties of the obtained approximation
  ◮ Do the moments of the q's correspond to the moments of p? How about other statistics: entropy, ...?
  ◮ Other divergence measures?
