Scene Flow by Tracking in Intensity and Depth Data

Julian Quiroga, Frederic Devernay, James Crowley

PRIMA team, INRIA Grenoble
[email protected]

June 12, 2012


Motivation

[Figure: Surface Flow, Morpheo-INRIA 2011]

[Figure: Messing et al., ICCV 2009]

Scene flow computation

Stereo or multiview:
- From several optical flows
- By using structure constraints and 2D/3D regularization (Basha et al., CVPR 2010)
- Simultaneously with the 3D surface

Color and depth:
- Tracking surfels (surface elements)
- Photometric constraints and 3D regularization
- Particle filtering in 3D (Hadfield and Bowden, ICCV 2011)


Our work

Approach. Sparse scene flow: we track small surface patches in the scene by using a pair of aligned intensity and depth images.

Model. To constrain the scene flow in the image domain, we assume a scene composed of rigidly moving 3D parts undergoing translation.

Framework. Using the scene flow as the parameter vector, we extend the Lucas-Kanade approach to exploit both intensity and depth data.

Result. We simultaneously solve for the scene flow and the image flow.


Presentation outline

- Lucas-Kanade framework
- Motion model
- Locally rigid tracking approach
- Tracking in intensity and depth
- Experimentation
- Conclusion


Lucas-Kanade Framework

The goal of the Lucas-Kanade algorithm is to align a template image $T(\mathbf{x})$ to an input image $I(\mathbf{x})$. This problem can be stated as
$$\mathbf{P} = \arg\min_{\mathbf{P}} \sum_{\mathbf{x}} \left[ I(\mathbf{W}(\mathbf{x}; \mathbf{P})) - T(\mathbf{x}) \right]^2$$

Assuming that an initial estimate of $\mathbf{P}$ is known, each optimization step finds the $\Delta\mathbf{P}$ which minimizes
$$\sum_{\mathbf{x}} \left[ I(\mathbf{W}(\mathbf{x}; \mathbf{P} + \Delta\mathbf{P})) - T(\mathbf{x}) \right]^2 \approx \sum_{\mathbf{x}} \left[ I(\mathbf{W}(\mathbf{x}; \mathbf{P})) + \nabla I \frac{\partial \mathbf{W}}{\partial \mathbf{P}} \Delta\mathbf{P} - T(\mathbf{x}) \right]^2$$

Taking the partial derivative with respect to $\Delta\mathbf{P}$ and setting it to zero gives
$$\Delta\mathbf{P} = H^{-1} \sum_{\mathbf{x}} \left[ \nabla I \frac{\partial \mathbf{W}}{\partial \mathbf{P}} \right]^T \left[ T(\mathbf{x}) - I(\mathbf{W}(\mathbf{x}; \mathbf{P})) \right]$$
where
$$H = \sum_{\mathbf{x}} \left[ \nabla I \frac{\partial \mathbf{W}}{\partial \mathbf{P}} \right]^T \left[ \nabla I \frac{\partial \mathbf{W}}{\partial \mathbf{P}} \right]$$
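This Gauss-Newton step can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it assumes a pure 2D translation warp $\mathbf{W}(\mathbf{x}; \mathbf{p}) = \mathbf{x} + \mathbf{p}$ (so $\partial\mathbf{W}/\partial\mathbf{p}$ is the identity), and `bilinear` and `lk_translation_step` are hypothetical helper names.

```python
import numpy as np

def bilinear(img, x, y):
    """Sample img at float coordinates (x, y) with bilinear interpolation."""
    x0 = np.clip(np.floor(x).astype(int), 0, img.shape[1] - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, img.shape[0] - 2)
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0]
            + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0]
            + dx * dy * img[y0 + 1, x0 + 1])

def lk_translation_step(I, T, p):
    """One Gauss-Newton step of Lucas-Kanade for a translation warp
    W(x; p) = x + p: the steepest-descent images are the warped image
    gradients, H = J^T J, and dp = H^-1 J^T [T - I(W)]."""
    h, w = T.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    gy, gx = np.gradient(I)                       # image gradients of I
    Iw = bilinear(I, xs + p[0], ys + p[1])        # I(W(x; p))
    Gx = bilinear(gx, xs + p[0], ys + p[1])       # ∇I sampled at warped coords
    Gy = bilinear(gy, xs + p[0], ys + p[1])
    err = (T - Iw).ravel()                        # residual T(x) - I(W(x; p))
    J = np.stack([Gx.ravel(), Gy.ravel()], axis=1)
    H = J.T @ J                                   # Gauss-Newton Hessian
    return p + np.linalg.solve(H, J.T @ err)      # additive update of p
```

Iterating this step on a smooth image recovers a subpixel shift, which is the behavior the derivation above predicts.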



Motion model

The instantaneous motion of a rigid surface point can be expressed as
$$\mathbf{X}' = R\mathbf{X} + \mathbf{V}$$

The 3D motion of the surface generates the image flow given by
$$u = x' - x = f_x \left( \frac{X + Z\Omega_Y - Y\Omega_Z + V_X}{Z + Y\Omega_X - X\Omega_Y + V_Z} - \frac{X}{Z} \right)$$
and
$$v = y' - y = f_y \left( \frac{Y + X\Omega_Z - Z\Omega_X + V_Y}{Z + Y\Omega_X - X\Omega_Y + V_Z} - \frac{Y}{Z} \right)$$

Assuming that the inter-frame rotation is negligible, the image flow induced on a pixel $\mathbf{x} = (x, y)$ by the 3D translation of the surface can be modeled as follows:
$$\begin{pmatrix} u \\ v \end{pmatrix} = \Delta(\mathbf{x}; \mathbf{V}) = \frac{1}{Z} \begin{pmatrix} 1 & 0 & -x \\ 0 & 1 & -y \end{pmatrix} \begin{pmatrix} V_X \\ V_Y \\ V_Z \end{pmatrix}$$
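The translational flow model above is just a 2x3 matrix applied to V, scaled by inverse depth. A small sketch (the function name `image_flow` is illustrative; coordinates are assumed normalized so the focal length is folded into x and y):

```python
import numpy as np

def image_flow(x, y, Z, V):
    """Image flow (u, v) induced at pixel (x, y) with depth Z by a pure
    3D translation V = (VX, VY, VZ):
    (u, v) = (1/Z) [[1, 0, -x], [0, 1, -y]] V."""
    VX, VY, VZ = V
    u = (VX - x * VZ) / Z
    v = (VY - y * VZ) / Z
    return u, v
```

For example, a translation along X of a point at depth 2 produces a purely horizontal flow, while a forward translation (VZ > 0) produces a flow pointing away from the principal point.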



Locally rigid tracking approach

Under the brightness constancy assumption, the point $\mathbf{X}$ at time $t-1$ and $\mathbf{X}' = \mathbf{X} + \mathbf{V}$ at time $t$ are projected with the same intensity in the image:
$$I_t\!\left(\hat{M}(\mathbf{X} + \mathbf{V})\right) = I_{t-1}\!\left(\hat{M}(\mathbf{X})\right)$$

Considering a set of surface points $S$, the scene flow computation is stated as finding the vector $\mathbf{V} = (V_X, V_Y, V_Z)$ which minimizes
$$\sum_{\mathbf{X} \in S} \left[ I_t\!\left(\hat{M}(\mathbf{X} + \mathbf{V})\right) - I_{t-1}\!\left(\hat{M}(\mathbf{X})\right) \right]^2$$

The image flow of each surface point is given by the warp function
$$\mathbf{W}(\mathbf{x}; \mathbf{V}) = \hat{M}(\mathbf{X} + \mathbf{V}) = \mathbf{x} + \Delta(\mathbf{x}; \mathbf{V})$$
where $\Delta(\mathbf{x}; \mathbf{V})$ is the proposed motion model.


Locally rigid tracking approach

The problem can be formulated in the image domain as follows:
$$\mathbf{V} = \arg\min_{\mathbf{V}} \sum_{\mathbf{x} \in S} \left[ I(\mathbf{W}(\mathbf{x}; \mathbf{V})) - T(\mathbf{x}) \right]^2$$

Solution. The Jacobian of the warp is given by
$$\frac{\partial \mathbf{W}}{\partial \mathbf{V}} = \frac{1}{Z(\mathbf{x})} \begin{pmatrix} 1 & 0 & -x \\ 0 & 1 & -y \end{pmatrix}$$

At each iteration the Hessian matrix can be expressed as
$$H = \sum_{\mathbf{x}} \frac{1}{Z^2(\mathbf{x})} \begin{pmatrix} I_x^2 & I_x I_y & I_x I_\Sigma \\ I_x I_y & I_y^2 & I_y I_\Sigma \\ I_x I_\Sigma & I_y I_\Sigma & I_\Sigma^2 \end{pmatrix}$$
with $I_\Sigma = -(x I_x + y I_y)$.
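Assembling this 3x3 Hessian from per-pixel gradients is a short vectorized operation. A sketch under the slide's notation (`rigid_patch_hessian` is a hypothetical name; each pixel contributes the outer product of its steepest-descent row (1/Z)[Ix, Iy, IΣ]):

```python
import numpy as np

def rigid_patch_hessian(Ix, Iy, Z, xs, ys):
    """3x3 Gauss-Newton Hessian of the locally rigid tracker (intensity only).
    Per-pixel steepest-descent row: (1/Z) [Ix, Iy, IΣ] with IΣ = -(x·Ix + y·Iy);
    H is the sum over the patch of outer products of these rows."""
    Isig = -(xs * Ix + ys * Iy)                      # IΣ term for the VZ column
    J = np.stack([Ix, Iy, Isig], axis=-1) / Z[..., None]
    Jf = J.reshape(-1, 3)                            # one row per pixel
    return Jf.T @ Jf                                 # sum of outer products
```

By construction the result is symmetric and its (0, 0) entry is the sum of Ix²/Z² over the patch, matching the matrix on the slide entry by entry.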



Tracking in intensity and depth

Under a translation $\mathbf{V}$ the depth image must satisfy
$$Z_t\!\left(\hat{M}(\mathbf{X} + \mathbf{V})\right) = Z_{t-1}\!\left(\hat{M}(\mathbf{X})\right) + V_Z$$

Therefore, we propose to formulate the scene flow computation by constraining $\mathbf{V}$ in both intensity and depth:
$$\sum_{\mathbf{x} \in S} \left[ I(\mathbf{W}(\mathbf{x}; \mathbf{V})) - T(\mathbf{x}) \right]^2 + \lambda \left[ Z(\mathbf{W}(\mathbf{x}; \mathbf{V})) - \left( T_Z(\mathbf{x}) + D^T \mathbf{V} \right) \right]^2$$
where $\lambda = \sigma_I^2 / \sigma_Z^2$ and $D = (0, 0, 1)^T$ selects the $Z$ component.

Solution. At each iteration the Hessian matrix becomes
$$H = \sum_{\mathbf{x}} \frac{1}{Z^2(\mathbf{x})} \begin{pmatrix} I_x^2 + \lambda Z_x^2 & I_x I_y + \lambda Z_x Z_y & I_x I_\Sigma + \lambda Z_x (Z_\Sigma - 1) \\ I_x I_y + \lambda Z_x Z_y & I_y^2 + \lambda Z_y^2 & I_y I_\Sigma + \lambda Z_y (Z_\Sigma - 1) \\ I_x I_\Sigma + \lambda Z_x (Z_\Sigma - 1) & I_y I_\Sigma + \lambda Z_y (Z_\Sigma - 1) & I_\Sigma^2 + \lambda (Z_\Sigma - 1)^2 \end{pmatrix}$$
with $I_\Sigma = -(x I_x + y I_y)$ and $Z_\Sigma = -(x Z_x + y Z_y)$.
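The combined update stacks an intensity row and a λ-weighted depth row per pixel; the depth row carries (ZΣ - 1) in the VZ slot because VZ also enters the depth constraint directly through the D·V term. A sketch under the slide's notation (`intensity_depth_step` and the residual arguments `rI`, `rZ` are hypothetical names):

```python
import numpy as np

def intensity_depth_step(Ix, Iy, Zx, Zy, Z, xs, ys, rI, rZ, lam):
    """One Gauss-Newton step for V combining intensity residuals rI and
    depth residuals rZ with weight lam = sigma_I^2 / sigma_Z^2.
    Intensity row: (1/Z) [Ix, Iy, IΣ];  depth row: (1/Z) [Zx, Zy, ZΣ - 1]."""
    Isig = -(xs * Ix + ys * Iy)                      # IΣ
    Zsig = -(xs * Zx + ys * Zy)                      # ZΣ
    JI = np.stack([Ix, Iy, Isig], axis=-1) / Z[..., None]
    JZ = np.stack([Zx, Zy, Zsig - 1.0], axis=-1) / Z[..., None]
    JIf, JZf = JI.reshape(-1, 3), JZ.reshape(-1, 3)
    H = JIf.T @ JIf + lam * (JZf.T @ JZf)            # combined 3x3 Hessian
    b = JIf.T @ rI.ravel() + lam * (JZf.T @ rZ.ravel())
    return np.linalg.solve(H, b)                     # ∆V
```

Setting lam = 0 reduces this to the intensity-only tracker of the previous slide, which is exactly the RT0 variant evaluated in the experiments.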



Experimentation - Middlebury datasets

[Figure: input intensity images I1, I2 and depth images Z1, Z2]

Details
- Images: Teddy, Cones (views 2 and 6)
- 5 levels of pyramid decomposition
- Window size: 11×11
- Image coverage: 85%

Compared methods
- RT: proposed method (rigid translation)
- RT0: proposed method with λ = 0
- KLT: KLT by Bouguet (OpenCV)
- OFR: KLT with a robust norm
- Hug07: Huguet and Devernay, ICCV 2007
- Bas10: Basha et al., CVPR 2010
- Had11: Hadfield and Bowden, ICCV 2011

Error measures
- Optical flow: RMS_OF, AEE_OF, R_X
- Scene flow: NRMS_V, R_X%


Experimentation - Middlebury datasets

Table 1: Errors in the optical flow.

            RT     RT0    OFR    KLT
  RMS_OF    2.51   2.61   4.69   5.95
  R1.0      12.9   14.8   28.4   40.1
  R5.0      2.32   3.99   15.2   19.7
  AEE_OF    1.15   1.33   1.56   1.45

Table 2: Errors in the scene flow.

            RT     RT0    OFR    KLT
  NRMS_V    11.1   55.9   68.4   82.0
  R5%       17.1   28.8   37.9   38.1
  R20%      4.97   9.68   17.5   19.1

Table 3: Errors by regions.

            RT                    OFR
            Tex    Utex   DD      Tex    Utex   DD
  RMS_OF    4.99   3.11   6.91    5.75   7.20   7.05
  R1.0      16.5   39.8   32.5    38.9   58.8   68.7
  NRMS_V    23.2   10.9   26.5    96.7   202    188
  R5%       12.1   28.5   25.1    32.0   51.5   69.6

Table 4: Scene flow comparison.

              RT     Hug07   Bas10   Had11
  RMS_OF(%)   5.70   6.00    2.96    0.10
  AEE_OF      2.47   0.60    0.70    5.03


Experimentation - Kinect images

[Figure: intensity images I1, I2 and depth image Z1, with the estimated optical flow, the scene flow projection V, and its depth component Vz]


Conclusion

We have proposed a method to compute a sparse scene flow by using an aligned pair of intensity and depth images. Modeling the image flow as a function of the 3D motion field, with help from the depth sensor, allows us to constrain the scene flow of a small surface patch in the image domain. Combining intensity and depth data in a Lucas-Kanade framework, we simultaneously solve for the scene flow and the image flow. This method is versatile and can be used to generate more accurate trajectories or to define scene-flow-based descriptors.

Future work
- A criterion for selecting good regions to track
- Experimentation: action and gesture recognition


The End
