
Sparsity in signal and image processing: from modeling and representation to reconstruction and processing

Ali Mohammad-Djafari
Laboratoire des Signaux et Systèmes, UMR8506 CNRS-SUPELEC-UNIV PARIS SUD 11
SUPELEC, 91192 Gif-sur-Yvette, France
http://lss.supelec.free.fr
Email: [email protected]
http://djafari.free.fr

Seminar at NUDT, Changsha, China, Nov. 28, 2013

A. Mohammad-Djafari, Tutorial: Sparsity in signal and image processing, November 28, 2013, NUDT, Changsha, China, 1/53

Contents

1. Sparse signals and images
2. First ideas for using sparsity in signal processing
3. Modeling for sparse representation
4. Bayesian Maximum A Posteriori (MAP) approach and link with Deterministic Regularization
5. Priors which enforce sparsity
   ◮ Heavy tailed: Double Exponential, Generalized Gaussian, ...
   ◮ Mixture models: Mixture of Gaussians, Student-t, ...
   ◮ Hierarchical models with hidden variables
   ◮ General Gauss-Markov-Potts models
6. Computational tools: Joint Maximum A Posteriori (JMAP), MCMC and Variational Bayesian Approximation (VBA)
7. Applications: X-ray Computed Tomography, Microwave and Ultrasound imaging, Satellite and Hyperspectral image processing, Spectrometry, CMB, ...

1. Sparse signals and images

◮ Sparse signals: Direct sparsity

[Figure: examples of directly sparse 1-D signals]

◮ Sparse images: Direct sparsity

[Figure: example of a directly sparse image]

Sparse signals and images

◮ Sparse signals in a Transform domain

[Figure: signals that are sparse in a transform domain]

◮ Sparse images in a Transform domain

Sparse signals and images

◮ Sparse signals in Fourier domain

[Figure: a time-domain signal and its sparse Fourier-domain spectrum]

◮ Sparse images in wavelet domain

[Figure: a space-domain image and its transform-domain coefficients]

Sparse signals and images

◮ Sparse signals: Sparsity in a Transform domain

[Figure: signals and their sparse transform-domain coefficients]

Sparse signals and images

[Figure: sparse image examples]

Sparse signals and images

[Figure: an image, its Fourier representation and its wavelet representation, with histograms of the image values, the Fourier coefficients and the wavelet coefficients (bands 1-3, 4-6, 7-9)]

2. First ideas: some history

◮ 1948: Shannon: Sampling theorem and reconstruction of a band-limited signal
◮ 1993-2007:
  ◮ Mallat, Zhang, Candès, Romberg, Tao and Baraniuk: Nonlinear sampling, compression and reconstruction
  ◮ Fuchs: Sparse representation
  ◮ Donoho, Elad, Tibshirani, Tropp, Duarte, Laska: Compressive Sampling, Compressive Sensing
◮ 2007-2012: Algorithms for sparse representation and compressive sampling: Matching Pursuit (MP), Projection Pursuit Regression, Pure Greedy Algorithm, OMP, Basis Pursuit (BP), Dantzig Selector (DS), Least Absolute Shrinkage and Selection Operator (LASSO), ...
◮ 2003-2012: Bayesian approach to sparse modeling: Tipping, Bishop: Sparse Bayesian Learning, Relevance Vector Machine (RVM), ...

3. Modeling and representation

◮ Modeling via a basis (codebook, overcomplete dictionary, design matrix):

  g(t) = \sum_{j=1}^{N} f_j \phi_j(t), \quad t = 1, \cdots, T \quad \longrightarrow \quad g = \Phi' f

◮ When T ≥ N:

  \hat{f}_j = \arg\min_{f_j} \sum_{t=1}^{T} \Big( g(t) - \sum_{j=1}^{N} f_j \phi_j(t) \Big)^2

  \hat{f} = \arg\min_f \|g - \Phi' f\|^2 = [\Phi \Phi']^{-1} \Phi g

◮ When the basis is orthogonal (\Phi \Phi' = I): \hat{f} = \Phi g, i.e.

  \hat{f}_j = \sum_{t=1}^{T} g(t)\, \phi_j(t) = \langle g(t), \phi_j(t) \rangle

◮ Application in Compression, Transmission and Decompression
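The orthogonal-basis case (\Phi \Phi' = I, so \hat{f} = \Phi g) can be checked numerically. A minimal sketch, assuming an orthonormal DCT-II basis and an arbitrary test signal (both are illustrative choices, not from the slides):

```python
import numpy as np

T = 8
t = np.arange(T)
# Orthonormal DCT-II basis: row j is the basis function phi_j sampled at t
Phi = np.array([np.cos(np.pi * (t + 0.5) * j / T) for j in range(T)])
Phi[0] *= np.sqrt(1.0 / T)
Phi[1:] *= np.sqrt(2.0 / T)
assert np.allclose(Phi @ Phi.T, np.eye(T))   # Phi Phi' = I

g = np.sin(2 * np.pi * t / T)                # any signal of length T
f_hat = Phi @ g                              # coefficients f_j = <g, phi_j>
g_rec = Phi.T @ f_hat                        # reconstruction g = Phi' f
assert np.allclose(g_rec, g)                 # exact for an orthonormal basis
```

Analysis is a single matrix product and reconstruction is exact, which is what makes the orthogonal case attractive for compression.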

Modeling and representation

◮ When the basis is overcomplete (N > T): infinite number of solutions for \Phi' f = g. We have to select one:

  \hat{f} = \arg\min_{f:\, \Phi' f = g} \|f\|_2^2

  or, written differently: minimize \|f\|_2^2 subject to \Phi' f = g,

  resulting in: \hat{f} = \Phi [\Phi' \Phi]^{-1} g

◮ Again, if \Phi' \Phi = I, then \hat{f} = \Phi g.
◮ No real interest if we have to keep all the N > T coefficients.
◮ Sparsity:

  minimize \|f\|_0 subject to \Phi' f = g
  or
  minimize \|f\|_1 subject to \Phi' f = g
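The two selection rules can be compared numerically. A sketch assuming SciPy is available (sizes, matrix and sparse vector are made up): the minimum ℓ2-norm solution is computed with the pseudoinverse and is dense in general, while the ℓ1 problem (basis pursuit) is posed as a linear program by splitting f = u − v with u, v ≥ 0:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
T, N = 10, 30
A = rng.standard_normal((T, N))       # A plays the role of Phi'
f_true = np.zeros(N)
f_true[[3, 17]] = [2.0, -1.5]         # a 2-sparse solution of A f = g
g = A @ f_true

# Minimum l2-norm solution, f = Phi [Phi' Phi]^{-1} g, via the pseudoinverse
f_l2 = np.linalg.pinv(A) @ g          # dense in general

# Basis pursuit: minimize ||f||_1 subject to A f = g, as a linear program
c = np.ones(2 * N)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=g, bounds=[(0, None)] * (2 * N))
f_l1 = res.x[:N] - res.x[N:]          # typically recovers the sparse f_true
```

The ℓ2 solution spreads energy over all N coefficients, while the ℓ1 solution typically concentrates it on the few true atoms, which is the point of the sparsity constraint.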

Sparse decomposition

◮ Strict sparsity and exact reconstruction:

  minimize \|f\|_0 subject to \Phi' f = g

  where \|f\|_0 is the number of non-zero elements of f.
  ◮ Matching Pursuit (MP) [Mallat & Zhang, 1993]
  ◮ Orthogonal Matching Pursuit (OMP) [Lin, Huang et al., 1993]
  ◮ Projection Pursuit Regression
  ◮ Greedy Algorithms

◮ Sparsity enforcing and exact reconstruction:

  minimize \|f\|_1 subject to \Phi' f = g

  ◮ Basis Pursuit (BP)
  ◮ Block Coordinate Relaxation
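The greedy ℓ0 approach can be sketched with a small OMP implementation (a minimal, illustrative version; the matrix and the sparse vector are made up):

```python
import numpy as np

def omp(A, g, k):
    """Orthogonal Matching Pursuit: greedily select k columns of A to explain g."""
    residual, support = g.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # atom most correlated with residual
        support.append(j)
        # re-fit g by least squares on the whole selected support
        coef, *_ = np.linalg.lstsq(A[:, support], g, rcond=None)
        residual = g - A[:, support] @ coef
    f = np.zeros(A.shape[1])
    f[support] = coef
    return f

rng = np.random.default_rng(1)
A = rng.standard_normal((20, 50))
A /= np.linalg.norm(A, axis=0)                       # unit-norm atoms
f_true = np.zeros(50)
f_true[[5, 30, 42]] = [1.0, -2.0, 0.5]
g = A @ f_true
f_hat = omp(A, g, k=3)     # in this regime OMP usually recovers f_true
```

The orthogonal projection step (the least-squares re-fit) is what distinguishes OMP from plain Matching Pursuit.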

Sparse decomposition

◮ Strict sparsity and approximate reconstruction:

  minimize \|f\|_0 subject to \|g - \Phi' f\|^2 < c

  \hat{f} = \arg\min_f \big\{ \|f\|_0 + \mu \|g - \Phi' f\|^2 \big\} = \arg\min_f \big\{ \|g - \Phi' f\|^2 + \lambda \|f\|_0 \big\}

◮ Sparsity enforcing and approximate reconstruction:

  \hat{f} = \arg\min_f \big\{ \|g - \Phi' f\|^2 + \lambda \|f\|_1 \big\}

  J(f) = \|g - \Phi' f\|^2 + \lambda \|f\|_1 = \|g - \Phi' f\|^2 + \lambda \sum_j |f_j|

◮ Main algorithm: LASSO [Tibshirani, 1996]:

  minimize \|g - \Phi' f\|^2 subject to \|f\|_1 < \tau
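The penalized criterion J(f) = \|g - \Phi' f\|^2 + \lambda \|f\|_1 is commonly minimized by iterative soft-thresholding (ISTA). This is a sketch of that standard proximal-gradient scheme, not an algorithm from the slides; problem sizes and \lambda are made up:

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, g, lam, n_iter=500):
    """Minimize ||g - A f||^2 + lam ||f||_1 by proximal gradient (ISTA)."""
    L = 2.0 * np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    f = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = 2.0 * A.T @ (A @ f - g)         # gradient of the quadratic term
        f = soft_threshold(f - grad / L, lam / L)
    return f

rng = np.random.default_rng(0)
A = rng.standard_normal((15, 40))
f_true = np.zeros(40)
f_true[[2, 11]] = [3.0, -2.0]
g = A @ f_true
f_hat = ista(A, g, lam=0.5)                    # a sparse estimate close to f_true
```

The soft-threshold is the proximal operator of the ℓ1 penalty; it is what sets small coefficients exactly to zero at each iteration.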

Sparse Decomposition Algorithms

◮ LASSO:

  J(f) = \|g - \Phi' f\|^2 + \lambda \sum_j |f_j|

◮ Other criteria:
  ◮ Lp:  J(f) = \|g - \Phi' f\|^2 + \lambda_1 \sum_j |f_j|^p

◮ Double Exponential prior as a Gaussian scale mixture:

  DE(f|a) = \frac{a}{2} \exp\{-a|f|\} = \int_0^\infty N(f|0, z)\, E(z|a^2/2)\, dz

  p(f|z) = \prod_j p(f_j|z_j) = \prod_j N(f_j|0, z_j) \propto \exp\big\{ -\tfrac{1}{2} \sum_j f_j^2/z_j \big\}

  p(z|a^2/2) = \prod_j E(z_j|a^2/2) \propto \exp\big\{ -\tfrac{a^2}{2} \sum_j z_j \big\}

  p(f, z|a^2/2) \propto \exp\big\{ -\tfrac{1}{2} \sum_j \big( f_j^2/z_j + a^2 z_j \big) \big\}

◮ With these models we have:
  ◮ Simple priors: p(f, \theta|g) \propto p(g|f, \theta_1)\, p(f|\theta_2)\, p(\theta)
  ◮ Hierarchical priors: p(f, z, \theta|g) \propto p(g|f, \theta_1)\, p(f|z, \theta_2)\, p(z|\theta_3)\, p(\theta)
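The Double Exponential scale-mixture representation can be verified by Monte Carlo: sampling z from the exponential hyperprior and f|z from the Gaussian should reproduce the Laplace marginal. A small sketch (the rate a and the sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
a, n = 2.0, 200_000

# Hierarchical sampling: z_j ~ Exponential(rate a^2/2), f_j | z_j ~ N(0, z_j)
z = rng.exponential(scale=2.0 / a**2, size=n)
f = rng.normal(0.0, np.sqrt(z))

# For the marginal DE(f|a) = (a/2) exp(-a|f|): E|f| = 1/a and Var(f) = 2/a^2
print(np.mean(np.abs(f)))   # ~ 1/a = 0.5 for a = 2
print(np.var(f))            # ~ 2/a^2 = 0.5 for a = 2
```

This is exactly the hidden-variable construction used later: conditionally on z the model is Gaussian (easy to compute with), yet the marginal prior on f is heavy-tailed and sparsity-enforcing.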

Bayesian Computation and Algorithms

◮ When the expression of p(f, \theta|g) or of p(f, z, \theta|g) is obtained, we have the following options:
◮ Joint MAP (needs optimization algorithms):

  (\hat{f}, \hat{\theta}) = \arg\max_{(f, \theta)} \{ p(f, \theta|g) \}

◮ MCMC: needs the expressions of the conditionals p(f|z, \theta, g), p(z|f, \theta, g) and p(\theta|f, z, g)
◮ Variational Bayesian Approximation (VBA): approximate p(f, z, \theta|g) by a separable distribution q(f, z, \theta|g) = q_1(f)\, q_2(z)\, q_3(\theta) and do all computations with these separable ones.

Joint MAP

p(f, \theta|g) \propto p(g|f, \theta_1)\, p(f|\theta_2)\, p(\theta)

◮ Objective:

  (\hat{f}, \hat{\theta}) = \arg\max_{(f, \theta)} \{ p(f, \theta|g) \}

◮ Alternate optimization: starting from \hat{\theta}^{(0)}, iterate

  \hat{f} = \arg\max_f \{ p(f|\hat{\theta}, g) \}
  \hat{\theta} = \arg\max_{\theta} \{ p(\theta|\hat{f}, g) \}

◮ Uncertainties are not propagated.
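A toy version of this alternate optimization, for a linear Gaussian model g = Hf + \epsilon with unknown noise and prior variances. The inverse-gamma hyperpriors, the sizes and all numerical values are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 50, 20
H = rng.standard_normal((T, N))
f_true = rng.standard_normal(N)
g = H @ f_true + 0.1 * rng.standard_normal(T)

# Model: g|f ~ N(Hf, v_e I), f ~ N(0, v_f I),
# with assumed Inverse-Gamma(alpha, beta) hyperpriors on both variances.
alpha, beta = 2.0, 1.0
v_e, v_f = 1.0, 1.0
f = np.zeros(N)
for _ in range(50):
    # f-step: argmax over f for fixed variances (a ridge/Tikhonov solution)
    f = np.linalg.solve(H.T @ H + (v_e / v_f) * np.eye(N), H.T @ g)
    # theta-step: posterior mode of each variance for fixed f
    v_e = (beta + 0.5 * np.sum((g - H @ f) ** 2)) / (alpha + 1.0 + T / 2)
    v_f = (beta + 0.5 * np.sum(f ** 2)) / (alpha + 1.0 + N / 2)
```

Each step increases the joint posterior, but only point estimates circulate between the two steps, which is the "uncertainties are not propagated" limitation noted above.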

MCMC based algorithm

p(f, z, \theta|g) \propto p(g|f, z, \theta)\, p(f|z, \theta)\, p(z)\, p(\theta)

General scheme (Gibbs sampling):
◮ Generate samples from the conditionals:

  \hat{f} \sim p(f|\hat{z}, \hat{\theta}, g) \longrightarrow \hat{z} \sim p(z|\hat{f}, \hat{\theta}, g) \longrightarrow \hat{\theta} \sim p(\theta|\hat{f}, \hat{z}, g)

◮ Wait for convergence
◮ Compute empirical statistics (means, modes, variances) from the samples:

  E\{f\} \approx \frac{1}{N} \sum_n f^{(n)}
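As a concrete illustration of this Gibbs scheme (not the imaging model of the slides: just a scalar Gaussian with unknown mean and variance, with assumed conjugate-style priors):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(3.0, 2.0, size=100)       # data from N(3, 2^2)
n = len(x)

mu, v = 0.0, 1.0
samples = []
for it in range(2000):
    # mu | v, x ~ N(mean(x), v/n)   (flat prior on mu)
    mu = rng.normal(np.mean(x), np.sqrt(v / n))
    # v | mu, x ~ Inverse-Gamma(1 + n/2, 1 + 0.5*sum((x - mu)^2))  (IG(1, 1) prior)
    v = 1.0 / rng.gamma(1.0 + n / 2, 1.0 / (1.0 + 0.5 * np.sum((x - mu) ** 2)))
    if it >= 500:                        # discard burn-in
        samples.append((mu, v))

mu_hat = np.mean([m for m, _ in samples])   # empirical posterior mean of mu
v_hat = np.mean([s for _, s in samples])    # empirical posterior mean of v
```

Unlike JMAP, the retained samples describe the whole posterior, so variances and credible intervals come for free from the same loop.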

Variational Bayesian Approximation

◮ Approximate p(f, \theta|g) by q(f, \theta|g) = q_1(f|g)\, q_2(\theta|g) and then continue computations.
◮ Criterion: KL(q(f, \theta|g) : p(f, \theta|g)) with

  KL(q : p) = \iint q \ln \frac{q}{p} = \iint q_1 q_2 \ln \frac{q_1 q_2}{p}

◮ Iterative algorithm q_1 \to q_2 \to q_1 \to q_2, \cdots:

  q_1(f) \propto \exp\big\{ \langle \ln p(g, f, \theta; M) \rangle_{q_2(\theta)} \big\}
  q_2(\theta) \propto \exp\big\{ \langle \ln p(g, f, \theta; M) \rangle_{q_1(f)} \big\}

◮ Uncertainties are propagated (message passing methods)
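A classical small instance of this scheme is mean-field VB for a Gaussian with unknown mean and precision (standard textbook updates; the data, the conjugate priors and the sizes here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(3.0, 2.0, size=200)
n, xbar = len(x), np.mean(x)

# Model: x_i ~ N(mu, 1/tau), mu ~ N(mu0, (lam0*tau)^-1), tau ~ Gamma(a0, b0)
# Mean-field factorization q(mu, tau) = q1(mu) q2(tau), iterated to a fixed point
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3
E_tau = 1.0
for _ in range(50):
    # q1(mu) = N(mu_n, 1/lam_n): uses only the current expectation of tau
    mu_n = (lam0 * mu0 + n * xbar) / (lam0 + n)
    lam_n = (lam0 + n) * E_tau
    E_mu, E_mu2 = mu_n, mu_n**2 + 1.0 / lam_n
    # q2(tau) = Gamma(a_n, b_n): uses only the first two moments of mu under q1
    a_n = a0 + (n + 1) / 2
    b_n = b0 + 0.5 * (np.sum(x**2) - 2 * E_mu * np.sum(x) + n * E_mu2
                      + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2))
    E_tau = a_n / b_n
```

Each factor is updated from expectations under the other, so posterior uncertainty (here 1/lam_n and the Gamma spread) flows between the two updates instead of point estimates.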

Summary of Bayesian approach

◮ Simple priors:

  Hyperprior model: p(\theta|\alpha, \beta) = p(\theta_1|\alpha_1, \beta_1)\, p(\theta_2|\alpha_2, \beta_2)
  Prior p(f|\theta_2) and likelihood p(g|f, \theta_1)
  \longrightarrow joint posterior p(f, \theta|g, \alpha, \beta) \longrightarrow JMAP / MCMC / VBA \longrightarrow \hat{f}, \hat{\theta}

◮ Hierarchical priors:

  Hyperprior model: p(\theta|\alpha, \beta, \gamma) = p(\theta_1|\alpha_1, \beta_1)\, p(\theta_2|\alpha_2, \beta_2)\, p(\theta_3|\alpha_3, \beta_3)
  Hidden variable p(z|\theta_3), prior p(f|z, \theta_2) and likelihood p(g|f, \theta_1)
  \longrightarrow joint posterior p(f, z, \theta|g) \longrightarrow JMAP / MCMC / VBA \longrightarrow \hat{f}, \hat{z}, \hat{\theta}

Advantages of the Bayesian Approach

◮ More possibilities to model sparsity
◮ More tools to handle hyperparameters
◮ More tools to account for uncertainties
◮ More possibilities to understand and to control many ad hoc deterministic algorithms
◮ Hierarchical models give still more modeling possibilities:
  ◮ Bernoulli-Gaussian: strict sparsity
  ◮ Bernoulli-Gamma: strict sparsity + positivity
  ◮ Bernoulli-Multinomial: strict sparsity + discrete values (finite states)
  ◮ Independent mixture models: sparsity enforcing
  ◮ Mixture of multivariate models: group sparsity enforcing
  ◮ Gauss-Markov-Potts models: sparsity in transform domains
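Two of these priors are easy to sample directly, which also shows the kind of signals they enforce (sizes and hyperparameters are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p_on = 200, 0.1

# Bernoulli-Gaussian: exact zeros with prob 1 - p_on, Gaussian amplitude otherwise
f_bg = np.where(rng.random(n) < p_on, rng.normal(0.0, 1.0, n), 0.0)

# Bernoulli-Gamma: strict sparsity plus positivity of the non-zero samples
f_bgam = np.where(rng.random(n) < p_on, rng.gamma(2.0, 1.0, n), 0.0)
```

Both draw an explicit on/off hidden variable per sample, which is exactly the hierarchical hidden-variable structure exploited by JMAP, MCMC and VBA above.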

Examples of Hierarchical models

[Figure: sample realizations of Bernoulli-Gaussian, Bernoulli-Gamma, Bernoulli-Binomial, Bernoulli-Multinomial, piecewise constant, piecewise Gaussian, and Gauss-Markov-Potts (1 and 2) signals]

Which class of images am I looking for?

[Figure: example images]

Which class of signals am I looking for?

[Figure: sample signals — Gauss-Markov, Generalized GM, piecewise Gaussian, and mixture of GM (Gauss-Markov-Potts)]

Gauss-Markov-Potts prior models for images

[Figure: an image f(r), its label field z(r), and the contour field q(r) = 1 - \delta(z(r) - z(r'))]

  p(f(r)|z(r) = k, m_k, v_k) = N(m_k, v_k)
  p(f(r)) = \sum_k P(z(r) = k)\, N(m_k, v_k)   (Mixture of Gaussians)

◮ Separable iid hidden variables: p(z) = \prod_r p(z(r))
◮ Markovian hidden variables: p(z) Potts-Markov:

  p(z(r)|z(r'), r' \in V(r)) \propto \exp\Big\{ \gamma \sum_{r' \in V(r)} \delta(z(r) - z(r')) \Big\}

  p(z) \propto \exp\Big\{ \gamma \sum_{r \in R} \sum_{r' \in V(r)} \delta(z(r) - z(r')) \Big\}
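Samples from the Potts prior can be drawn with a simple single-site Gibbs sweep. An illustrative sketch: K, γ, the grid size and the 4-neighbour system V(r) are all assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
K, gamma, size = 3, 1.2, 32
z = rng.integers(0, K, (size, size))      # random initial label field

# Gibbs sweeps for p(z) ∝ exp(gamma * number of agreeing neighbour pairs)
for sweep in range(30):
    for i in range(size):
        for j in range(size):
            # for each label k, count how many of the 4 neighbours carry k
            counts = np.zeros(K)
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    counts[z[ni, nj]] += 1
            p = np.exp(gamma * counts)    # local conditional of the Potts model
            z[i, j] = rng.choice(K, p=p / p.sum())
```

For γ > 0 the sweeps progressively produce compact homogeneous regions, which is why this prior models images made of a small number of labeled regions.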

Four different cases

To each pixel of the image are associated two variables, f(r) and z(r):
◮ f|z Gaussian iid, z iid: Mixture of Gaussians
◮ f|z Gauss-Markov, z iid: Mixture of Gauss-Markov
◮ f|z Gaussian iid, z Potts-Markov: Mixture of Independent Gaussians (MIG with hidden Potts)
◮ f|z Markov, z Potts-Markov: Mixture of Gauss-Markov (MGM with hidden Potts)

Application of CT in NDT

Reconstruction from only 2 projections:

  g_1(x) = \int f(x, y)\, dy, \qquad g_2(y) = \int f(x, y)\, dx

◮ Given the marginals g_1(x) and g_2(y), find the joint distribution f(x, y).
◮ Infinite number of solutions: f(x, y) = g_1(x)\, g_2(y)\, \Omega(x, y), where \Omega(x, y) is a copula:

  \int \Omega(x, y)\, dx = 1 \quad \text{and} \quad \int \Omega(x, y)\, dy = 1

Application in CT

[Figure: CT data acquisition geometry and the unknown image]

◮ g|f: g = Hf + \epsilon with \epsilon iid Gaussian, i.e. g|f \sim N(Hf, \sigma_\epsilon^2 I)
◮ f|z: Gaussian iid or Gauss-Markov
◮ z: iid or Potts
◮ q(r) \in \{0, 1\}: q(r) = 1 - \delta(z(r) - z(r')), a binary contour variable

Proposed algorithm

p(f, z, \theta|g) \propto p(g|f, z, \theta)\, p(f|z, \theta)\, p(\theta)

General scheme:

  \hat{f} \sim p(f|\hat{z}, \hat{\theta}, g) \longrightarrow \hat{z} \sim p(z|\hat{f}, \hat{\theta}, g) \longrightarrow \hat{\theta} \sim p(\theta|\hat{f}, \hat{z}, g)

Iterative algorithm:
◮ Estimate f using p(f|\hat{z}, \hat{\theta}, g) \propto p(g|f, \hat{\theta})\, p(f|\hat{z}, \hat{\theta}): needs optimization of a quadratic criterion.
◮ Estimate z using p(z|\hat{f}, \hat{\theta}, g) \propto p(\hat{f}|z, \hat{\theta})\, p(z): needs sampling of a Potts Markov field.
◮ Estimate \theta using p(\theta|\hat{f}, \hat{z}, g) \propto p(g|\hat{f}, \sigma_\epsilon^2 I)\, p(\hat{f}|\hat{z}, (m_k, v_k))\, p(\theta): conjugate priors \longrightarrow analytical expressions.

Results

[Figure: reconstructions from 2 projections — Original, Backprojection, Filtered BP, LS, Gauss-Markov+pos, GM+Line process, GM+Label process, with the estimated contours c and labels z]

Application in Microwave imaging

  g(\omega) = \int f(r) \exp\{-j(\omega \cdot r)\}\, dr + \epsilon(\omega)

  g(u, v) = \iint f(x, y) \exp\{-j(ux + vy)\}\, dx\, dy + \epsilon(u, v)

  g = Hf + \epsilon

[Figure: the object f(x, y), the data g(u, v), the inverse Fourier transform reconstruction \hat{f} (IFT), and the reconstruction \hat{f} by the proposed method]
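The Fourier-domain forward model and the naive IFT inversion can be simulated in a few lines (the object, grid size and noise level are made up; the discrete FFT stands in for the operator H):

```python
import numpy as np

rng = np.random.default_rng(0)
f = np.zeros((64, 64))
f[20:30, 25:40] = 1.0                    # a simple rectangular object f(x, y)

# g(u, v): Fourier-domain data plus complex Gaussian noise (g = Hf + eps)
noise = 0.5 * (rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64)))
g = np.fft.fft2(f) + noise

# Naive inversion by inverse Fourier transform (the "IFT" reconstruction)
f_ift = np.real(np.fft.ifft2(g))
```

With full, lightly noisy Fourier data the IFT is adequate; the Bayesian machinery above matters when the data are incomplete (few frequencies) or noisy, where the prior must fill in what H does not constrain.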

Image fusion and joint segmentation (with O. Féron)

  g_i(r) = f_i(r) + \epsilon_i(r)
  p(f_i(r)|z(r) = k) = N(m_{ik}, \sigma_{ik}^2)
  p(f|z) = \prod_i p(f_i|z)

[Figure: inputs g_1, g_2 and the estimated \hat{f}_1, \hat{f}_2 with the common segmentation \hat{z}]

Data fusion in medical imaging (with O. Féron)

  g_i(r) = f_i(r) + \epsilon_i(r)
  p(f_i(r)|z(r) = k) = N(m_{ik}, \sigma_{ik}^2)
  p(f|z) = \prod_i p(f_i|z)

[Figure: inputs g_1, g_2 and the estimated \hat{f}_1, \hat{f}_2 with the common segmentation \hat{z}]

Joint segmentation of hyper-spectral images (with N. Bali & A. Mohammadpour)

  g_i(r) = f_i(r) + \epsilon_i(r)
  p(f_i(r)|z(r) = k) = N(m_{ik}, \sigma_{ik}^2), \quad k = 1, \cdots, K
  p(f|z) = \prod_i p(f_i|z)
  m_{ik} follow a Markovian model along the index i

Segmentation of a video sequence of images (with P. Brault)

  g_i(r) = f_i(r) + \epsilon_i(r)
  p(f_i(r)|z_i(r) = k) = N(m_{ik}, \sigma_{ik}^2), \quad k = 1, \cdots, K
  p(f|z) = \prod_i p(f_i|z_i)
  z_i(r) follow a Markovian model along the index i

Image separation in satellite imaging (with H. Snoussi & M. Ichir)

  g_i(r) = \sum_{j=1}^{N} A_{ij} f_j(r) + \epsilon_i(r)
  p(f_j(r)|z_j(r) = k) = N(m_{jk}, \sigma_{jk}^2)
  p(A_{ij}) = N(A_{0ij}, \sigma_{0ij}^2)

[Figure: sources f, observed mixtures g, estimated sources \hat{f} and labels \hat{z}]

Conclusions

◮ Sparsity: a great property to use in signal and image processing
◮ Origin: sampling theory and reconstruction, modeling and representation, Compressed Sensing, approximation theory
◮ Deterministic algorithms: optimization of a two-term criterion (data term plus penalty/regularization term)
◮ Probabilistic: Bayesian approach
◮ Sparsity enforcing priors: simple heavy-tailed, and hierarchical with hidden variables
◮ Gauss-Markov-Potts models for images, incorporating hidden regions and contours
◮ Main Bayesian computation tools: JMAP, MCMC and VBA
◮ Applications in different imaging systems (X-ray CT, microwave, PET and ultrasound imaging)

Current projects and perspectives:
◮ Efficient implementation in 2D and 3D cases
◮ Evaluation of performances and comparison between MCMC ...

Some references

◮ A. Mohammad-Djafari (Ed.), Problèmes inverses en imagerie et en vision (Vol. 1 et 2), Hermès-Lavoisier, Traité Signal et Image, IC2, 2009.
◮ A. Mohammad-Djafari (Ed.), Inverse Problems in Vision and 3D Tomography, ISTE, Wiley and Sons, ISBN: 9781848211728, December 2009, hardback, 480 pp.
◮ H. Ayasso and A. Mohammad-Djafari, Joint NDT Image Restoration and Segmentation using Gauss-Markov-Potts Prior Models and Variational Bayesian Computation, to appear in IEEE Trans. on Image Processing, TIP-04815-2009.R2, 2010.
◮ H. Ayasso, B. Duchêne and A. Mohammad-Djafari, Bayesian Inversion for Optical Diffraction Tomography, Journal of Modern Optics, 2008.
◮ A. Mohammad-Djafari, Gauss-Markov-Potts Priors for Images in Computer Tomography Resulting to Joint Optimal Reconstruction and Segmentation, International Journal of Tomography & Statistics, 11:W09, 76-92, 2008.
◮ A. Mohammad-Djafari, Super-Resolution: A short review, a new method based on hidden Markov modeling of HR image and future challenges, The Computer Journal, doi:10.1093/comjnl/bxn005, 2008.
◮ O. Féron, B. Duchêne and A. Mohammad-Djafari, Microwave imaging of inhomogeneous objects made of a finite number of dielectric and conductive materials from experimental data, Inverse Problems, 21(6):95-115, Dec 2005.
◮ M. Ichir and A. Mohammad-Djafari, Hidden Markov models for blind source separation, IEEE Trans. on Signal Processing, 15(7):1887-1899, Jul 2006.
◮ F. Humblot and A. Mohammad-Djafari, Super-Resolution using Hidden Markov Model and Bayesian Detection Estimation Framework, EURASIP Journal on Applied Signal Processing, Special issue on Super-Resolution Imaging: Analysis, Algorithms, and Applications, ID 36971, 16 pages, 2006.
◮ O. Féron and A. Mohammad-Djafari, Image fusion and joint segmentation using an MCMC algorithm, Journal of Electronic Imaging, 14(2): paper no. 023014, Apr 2005.
◮ H. Snoussi and A. Mohammad-Djafari, Fast joint separation and segmentation of mixed images, Journal of Electronic Imaging, 13(2):349-361, April 2004.
◮ A. Mohammad-Djafari, J.F. Giovannelli, G. Demoment and J. Idier, Regularization, maximum entropy and probabilistic methods in mass spectrometry data processing problems, Int. Journal of Mass Spectrometry, 215(1-3):175-193, April 2002.

Thanks, Questions and Discussions

Thanks to:

My graduated PhD students:
◮ H. Snoussi, M. Ichir (sources separation)
◮ F. Humblot (super-resolution)
◮ H. Carfantan, O. Féron (microwave tomography)
◮ S. Fékih-Salem (3D X-ray tomography)

My present PhD students:
◮ H. Ayasso (optical tomography, variational Bayes)
◮ D. Pougaza (tomography and copula)
◮ Sh. Zhu (SAR imaging)
◮ D. Fall (positron emission tomography, non-parametric Bayesian)

My colleagues in GPI (L2S) & collaborators in other institutes:
◮ B. Duchêne & A. Joisel (inverse scattering and microwave imaging)
◮ N. Gac & A. Rabanal (GPU implementation)
◮ Th. Rodet (tomography)
◮ A. Vabre & S. Legoupil (CEA-LIST) (3D X-ray tomography)
◮ E. Barat (CEA-LIST) (positron emission tomography, non-parametric Bayesian)
◮ C. Comtat (SHFJ, CEA) (PET, spatio-temporal brain activity)

Questions and Discussions