Computer Vision - Frédéric Devernay

Epipolar geometry
Frédéric Devernay, with slides from Marc Pollefeys

Underlying structure in a set of matches for rigid scenes.

[Figure: two views of a 3D point M, with camera centers C1 and C2, image points m1 and m2, epipoles e1 and e2, epipolar lines l1 and l2, and the epipolar plane π]

Fundamental matrix (3x3 rank 2 matrix)

1. Computable from corresponding points
2. Simplifies matching
3. Allows detecting wrong matches
4. Related to calibration
→ allows reconstruction from a pair of uncalibrated images!

Canonical representation:
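A standard way to write such a canonical camera pair derived from F (following Hartley & Zisserman; e2 denotes the second epipole, so F^T e2 = 0 -- this formula is a reconstruction, not copied verbatim from the slides):

$$ P_1 = [\, I \mid \mathbf{0} \,], \qquad P_2 = [\, [\mathbf{e}_2]_\times F \mid \mathbf{e}_2 \,] $$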


The projective reconstruction theorem

If a set of point correspondences in two views determines the fundamental matrix uniquely, then the scene and cameras may be reconstructed from these correspondences alone, and any two such reconstructions from these correspondences are projectively equivalent.


Properties of the fundamental matrix
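The key properties, in the notation of the figure above (standard results, listed here as a reminder; m1 ↔ m2 are corresponding points, e1 and e2 the epipoles, l1 and l2 the epipolar lines):

$$ \mathbf{m}_2^\top F\,\mathbf{m}_1 = 0, \qquad \mathbf{l}_2 = F\,\mathbf{m}_1, \quad \mathbf{l}_1 = F^\top\mathbf{m}_2, \qquad F\,\mathbf{e}_1 = \mathbf{0}, \quad F^\top\mathbf{e}_2 = \mathbf{0} $$

F is 3x3 and rank 2, defined only up to scale, hence 7 degrees of freedom.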

Computation of F
• Linear (8-point)
• Minimal (7-point)
• Robust (RANSAC)
• Non-linear refinement (MLE, …)
• Practical approach

Epipolar geometry: basic equation

Each correspondence m1 = (x, y, 1)ᵀ ↔ m2 = (x', y', 1)ᵀ gives one equation m2ᵀ F m1 = 0. Separate known from unknown:

x'x f11 + x'y f12 + x' f13 + y'x f21 + y'y f22 + y' f23 + x f31 + y f32 + f33 = 0

i.e. (x'x, x'y, x', y'x, y'y, y', x, y, 1) · f = 0, with the data on the left and the unknowns f = (f11, …, f33)ᵀ, so the equation is linear in f. Stacking all correspondences gives a homogeneous system A f = 0.

the NOT normalized 8-point algorithm

With raw pixel coordinates (image corners at (0,0), (700,0), (0,500), (700,500)), the columns of the data matrix A have very different magnitudes: ~10000 for the quadratic terms, ~100 for the linear terms, and 1 for the last column. Orders of magnitude difference between the columns of the data matrix → least squares yields poor results!

the normalized 8-point algorithm

Transform each image to ~[-1,1]x[-1,1] (corners mapped to (-1,-1), (1,-1), (-1,1), (1,1)): normalized least squares yields good results (Hartley, PAMI'97).

the singularity constraint

The F matrix computed linearly (by SVD of A) has rank 3 in general. Compute the closest rank-2 approximation F' and use it instead of F (F vs. F': only the epipolar lines of F' all pass through a single epipole).
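A minimal numpy sketch combining the normalized 8-point algorithm and the singularity constraint (function and variable names are illustrative, not from the slides):

```python
import numpy as np

def normalize_points(pts):
    """Hartley normalization: translate to the centroid and scale so the mean
    distance to the origin is sqrt(2). pts: (N, 2) pixel coordinates.
    Returns homogeneous normalized points (N, 3) and the 3x3 transform T."""
    centroid = pts.mean(axis=0)
    mean_dist = np.sqrt(((pts - centroid) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2) / mean_dist
    T = np.array([[s, 0, -s * centroid[0]],
                  [0, s, -s * centroid[1]],
                  [0, 0, 1]])
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    return (T @ pts_h.T).T, T

def eight_point(pts1, pts2):
    """Normalized 8-point algorithm: pts1[i] <-> pts2[i], (N, 2) arrays, N >= 8."""
    m1, T1 = normalize_points(pts1)
    m2, T2 = normalize_points(pts2)
    # One row of A per correspondence: m2^T F m1 = 0 written as a . f = 0.
    A = np.column_stack([
        m2[:, 0] * m1[:, 0], m2[:, 0] * m1[:, 1], m2[:, 0],
        m2[:, 1] * m1[:, 0], m2[:, 1] * m1[:, 1], m2[:, 1],
        m1[:, 0], m1[:, 1], np.ones(len(m1))])
    # f = right singular vector of A with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Singularity constraint: closest rank-2 approximation (zero the smallest singular value).
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    F = U @ np.diag(S) @ Vt
    # Undo the normalization: m2^T (T2^T F_norm T1) m1 = 0.
    F = T2.T @ F @ T1
    return F / np.linalg.norm(F)
```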

the minimum case – 7 point correspondences

With only 7 correspondences, A f = 0 has a 2-dimensional null space spanned by F1 and F2: a one-parameter family of solutions F1 + λF2, but F1 + λF2 is not automatically rank 2.

the minimum case – impose rank 2

Impose det(F1 + λF2) = 0, a cubic equation in λ: compute the possible λ (only real solutions are potential solutions) and obtain 1 or 3 solutions for F.
Minimal solution for calibrated cameras: 5-point.
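A numpy sketch of this 7-point computation (illustrative names; the possible λ are found here as roots of the interpolated cubic det(F1 + λF2), rather than via the eigenvalue formulation used in the slides):

```python
import numpy as np

def seven_point(pts1, pts2):
    """7-point method: pts1, pts2 are (7, 2) arrays of matched pixel
    coordinates. Returns 1 or 3 candidate fundamental matrices."""
    m1 = np.hstack([pts1, np.ones((7, 1))])
    m2 = np.hstack([pts2, np.ones((7, 1))])
    A = np.column_stack([
        m2[:, 0] * m1[:, 0], m2[:, 0] * m1[:, 1], m2[:, 0],
        m2[:, 1] * m1[:, 0], m2[:, 1] * m1[:, 1], m2[:, 1],
        m1[:, 0], m1[:, 1], np.ones(7)])
    # The 2-dimensional null space of A spans the family F1 + lam * F2.
    _, _, Vt = np.linalg.svd(A)
    F1 = Vt[-1].reshape(3, 3)
    F2 = Vt[-2].reshape(3, 3)
    # det(F1 + lam*F2) is a cubic in lam: recover its coefficients by
    # evaluating it at 4 values and interpolating.
    xs = np.array([0.0, 1.0, 2.0, 3.0])
    ys = [np.linalg.det(F1 + x * F2) for x in xs]
    coeffs = np.polyfit(xs, ys, 3)
    candidates = []
    for lam in np.roots(coeffs):
        if abs(lam.imag) < 1e-8:   # only real solutions are potential solutions
            F = F1 + lam.real * F2
            candidates.append(F / np.linalg.norm(F))
    return candidates              # 1 or 3 candidate fundamental matrices
```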

Robust estimation

• What if the set of matches contains gross outliers? (to keep things simple, let's consider line fitting first)

RANSAC (RANdom SAmple Consensus)

Objective: robust fit of a model to a data set S which contains outliers.

Algorithm:
(i) Randomly select a sample of s data points from S and instantiate the model from this subset.
(ii) Determine the set of data points Si which are within a distance threshold t of the model. The set Si is the consensus set of the sample and defines the inliers of S.
(iii) If the size of Si is greater than some threshold T, re-estimate the model using all the points in Si and terminate.
(iv) If the size of Si is less than T, select a new subset and repeat the above.
(v) After N trials the largest consensus set Si is selected, and the model is re-estimated using all the points in Si.
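To make steps (i)-(v) concrete on the line-fitting example mentioned above, a small numpy sketch (names and thresholds are illustrative; for brevity it skips the early-termination test (iii)-(iv) and simply runs a fixed number of trials):

```python
import numpy as np

def fit_line(p, q):
    """Line a*x + b*y + c = 0 through two points, with (a, b) normalized."""
    a, b = p[1] - q[1], q[0] - p[0]
    c = -(a * p[0] + b * p[1])
    n = np.hypot(a, b)
    return np.array([a, b, c]) / n

def ransac_line(points, t=1.0, n_trials=100, rng=np.random.default_rng(0)):
    """RANSAC for 2D line fitting: points is an (N, 2) array."""
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_trials):
        i, j = rng.choice(len(points), size=2, replace=False)   # (i) minimal sample, s = 2
        line = fit_line(points[i], points[j])
        d = np.abs(points @ line[:2] + line[2])                 # (ii) distance to the model
        inliers = d < t
        if inliers.sum() > best_inliers.sum():                  # (v) keep the largest consensus set
            best_inliers = inliers
    # Re-estimate the line from all inliers (total least squares on centered points).
    P = points[best_inliers]
    P0 = P - P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P0)
    a, b = Vt[-1]                                               # normal of the best-fit line
    c = -(a * P.mean(axis=0)[0] + b * P.mean(axis=0)[1])
    return np.array([a, b, c]), best_inliers
```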

How many samples?

Choose N so that, with probability p, at least one random sample is free from outliers, e.g. p = 0.99:

N = log(1-p) / log(1 - (1-e)^s)

                    proportion of outliers e
  s     5%    10%    20%    25%    30%    40%    50%
  2      2     3      5      6      7     11     17
  3      3     4      7      9     11     19     35
  4      3     5      9     13     17     34     72
  5      4     6     12     17     26     57    146
  6      4     7     16     24     37     97    293
  7      4     8     20     33     54    163    588
  8      5     9     26     44     78    272   1177

Note: assumes that inliers allow to identify other inliers.

Distance threshold

Choose t so that the probability for an inlier is α (e.g. 0.95):
• often chosen empirically
• for zero-mean Gaussian noise with standard deviation σ, the squared distance to the model follows a σ²χ²_m distribution, with m = codimension of the model (dimension + codimension = dimension of the space)

  Codimension   Model     t²
  1             line, F   3.84 σ²
  2             H, P      5.99 σ²
  3             T         7.81 σ²

Acceptable consensus set?

• Typically, terminate when the inlier ratio reaches the expected ratio of inliers
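The sample counts in the table above follow from the formula N = log(1-p) / log(1-(1-e)^s); a minimal sketch (the function name is illustrative):

```python
import numpy as np

def num_samples(p, e, s):
    """Number of RANSAC samples N so that, with probability p, at least one
    sample of size s is outlier-free, given an outlier fraction e."""
    return int(np.ceil(np.log(1 - p) / np.log(1 - (1 - e) ** s)))

# e.g. num_samples(0.99, 0.20, 7) == 20 and num_samples(0.99, 0.50, 8) == 1177,
# matching the table above.
```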

Adaptively determining the number of samples

e is often unknown a priori, so start from a pessimistic (worst-case) guess and adapt it as more inliers are found; e.g. 80% inliers would yield e = 0.2.

• N = ∞, sample_count = 0
• While N > sample_count, repeat:
  – Choose a sample and count the number of inliers
  – Set e = 1 - (number of inliers)/(total number of points)
  – Recompute N from e
  – Increment sample_count by 1
• Terminate
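A sketch of this adaptive loop (illustrative names; fit_model and count_inliers stand for the model-specific steps, and the N update uses the same formula as num_samples above):

```python
import numpy as np

def adaptive_ransac(data, fit_model, count_inliers, s, p=0.99):
    """Adaptive RANSAC: recompute N from the current inlier ratio after every trial."""
    rng = np.random.default_rng(0)
    N, sample_count = np.inf, 0
    best_model, best_inliers = None, 0
    while N > sample_count:
        # choose a sample and count the number of inliers
        sample = data[rng.choice(len(data), size=s, replace=False)]
        model = fit_model(sample)
        inliers = count_inliers(model, data)
        if inliers > best_inliers:
            best_model, best_inliers = model, inliers
        # set e = 1 - (number of inliers) / (total number of points)
        e = 1.0 - best_inliers / len(data)
        # recompute N from e (keep N = inf while no inlier has been found)
        if 0.0 < e < 1.0:
            N = np.log(1 - p) / np.log(1 - (1 - e) ** s)
        elif e == 0.0:
            N = 0
        sample_count += 1
    return best_model, best_inliers
```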

Other robust algorithms

• RANSAC maximizes the number of inliers
• LMedS minimizes the median error (the 50% percentile of the residuals, in pixels)
• Not recommended: case deletion, iterative least-squares, etc.

Non-linear refinement

Geometric distance
• Gold standard
• Symmetric epipolar distance

Gold standard

Maximum Likelihood Estimation (= least-squares for Gaussian noise)
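The cost being minimized can be written as follows (a standard formulation of the reprojection error, not copied verbatim from the slides; x̂_i, x̂'_i are the estimated perfectly matching points):

$$ \min_{F,\,\hat{\mathbf{x}}_i,\,\hat{\mathbf{x}}'_i}\; \sum_i d(\mathbf{x}_i,\hat{\mathbf{x}}_i)^2 + d(\mathbf{x}'_i,\hat{\mathbf{x}}'_i)^2 \qquad \text{subject to } \hat{\mathbf{x}}'^\top_i F\,\hat{\mathbf{x}}_i = 0 $$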

Gold standard

• Initialize: normalized 8-point, (P,P') from F, reconstruct Xi
• Parameterize: (overparametrized)
• Minimize the cost using Levenberg-Marquardt (preferably sparse LM, e.g. see H&Z)

Alternative, minimal parametrization (with a=1)

(note: (x,y,1) and (x',y',1) are the epipoles)
Problems:
• a=0 → pick the largest of a,b,c,d to fix to 1
• epipole at infinity → pick the largest of x,y,w and of x',y',w'
→ 4x3x3 = 36 parametrizations! Reparametrize at every iteration, to be sure.

Symmetric epipolar error
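A standard way to write this error, with d(x, l) the point-to-line distance (a reconstruction, not verbatim from the slide):

$$ \sum_i d(\mathbf{x}'_i,\, F\mathbf{x}_i)^2 + d(\mathbf{x}_i,\, F^\top\mathbf{x}'_i)^2 $$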

Some experiments: [result images comparing the estimation methods, not reproduced here]

Recommendations:
1. Do not use unnormalized algorithms
2. Quick and easy to implement: normalized 8-point
3. Better: enforce the rank-2 constraint during minimization
4. Best: Maximum Likelihood Estimation (minimal parameterization, sparse implementation)

Residual error: the average symmetric epipolar error, computed for all points (not only the inliers!)

Two-view geometry

Automatic computation of F

Step 1. Extract features
Step 2. Compute a set of potential matches
Step 3. do (generate hypothesis):
    Step 3.1 Select a minimal sample (i.e. 7 matches)
    Step 3.2 Compute solution(s) for F
    Step 3.3 Determine inliers (verify hypothesis)
until Γ(#inliers, #samples)
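A practical sketch of this pipeline using OpenCV (assuming a recent opencv-python build where cv2.SIFT_create is available; function names and thresholds are illustrative, and cv2.findFundamentalMat with FM_RANSAC replaces the hand-rolled 7-point + RANSAC loop described above):

```python
import cv2
import numpy as np

def automatic_F(img1, img2):
    """Two-view pipeline: features -> putative matches -> robust F + inliers."""
    # Step 1. Extract features
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img1, None)
    k2, d2 = sift.detectAndCompute(img2, None)
    # Step 2. Compute a set of potential matches (ratio test)
    matcher = cv2.BFMatcher()
    matches = [m for m, n in matcher.knnMatch(d1, d2, k=2)
               if m.distance < 0.8 * n.distance]
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])
    # Step 3. Hypothesize-and-verify (RANSAC on minimal samples, done internally)
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.99)
    keep = inlier_mask.ravel() == 1
    return F, pts1[keep], pts2[keep]
```

Here img1 and img2 are grayscale images, e.g. loaded with cv2.imread(path, cv2.IMREAD_GRAYSCALE).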