Lecture 1: Bayes Theory
T. Chateau
Lasmea/Gravir/ComSee, Blaise Pascal University
MSIR MASTER

Content
1 Before beginning
2 Bayesian Decision Theory
3 Exercises
4 Example: Bayesian Tracking
Mathematical notations

Principal notations for variables, symbols and operations:
≃ : approximately equal to
≡ : equivalent to (or defined to be)
arg max_x f(x) : the value of x that leads to the maximum value of f(x)
arg min_x f(x) : the value of x that leads to the minimum value of f(x)
∝ : proportional to
Mathematical notations

Principal notations for vectors, matrices and sets:
R^d : d-dimensional Euclidean space
x, A : boldface is used for (column) vectors and matrices
||x|| : Euclidean norm of vector x
x^t : transpose of vector x
A : special font can also be used for matrices
A^{-1} : inverse of matrix A
A^† : pseudo-inverse of matrix A
|A| : determinant of A
tr[A] : trace of matrix A; the sum of its diagonal elements
I : identity matrix
diag(a_1, a_2, ..., a_d) : matrix whose diagonal elements are a_1, a_2, ..., a_d and whose off-diagonal elements are 0
f(x) : vector-valued function of a scalar argument
f(x) : vector-valued function of a vector argument
|D| : the cardinality of set D; the number of possibly non-distinct discrete elements in it
Probability, Distributions and Complexity

Principal notations for probabilities and distributions:
ω : state of nature
P(·) : probability mass
p(·) : probability density
P(a, b) : the joint probability; the probability of having both a and b
p(a, b) : the joint probability density; the probability density of having both a and b
Pr[·] : the probability of a condition being true; for example Pr[x < x0]
p(x|θ) : the conditional probability density of x given θ
θ̂ : maximum likelihood estimate of θ
A, B, C, ... : "calligraphic" font generally denotes sets or lists; e.g. D = {x_1, ..., x_n}
∼ : "has the distribution"; for example, p(x) ∼ N(µ, σ²) means that the density of x is normal, with mean µ and variance σ²
N(µ, σ²) : normal or Gaussian distribution with mean µ and variance σ²
N(µ, Σ) : normal or Gaussian distribution with mean vector µ and covariance matrix Σ

More notations can be found in: Duda, Hart and Stork, "Pattern Classification", second edition, Wiley-Interscience, 2001.
Probability Theory

Probability mass function
P(x) ≡ p_x with Σ_{x∈X} P(x) = 1

Pairs of discrete random variables x and y
x ∈ X = {v_1, v_2, ..., v_m} and y ∈ Y = {w_1, w_2, ..., w_n}

Joint probability
p_ij = Pr[x = v_i, y = w_j]

Joint probability mass function
P(x, y) with P(x, y) ≥ 0 and Σ_{x∈X} Σ_{y∈Y} P(x, y) = 1

Marginal distribution
P_x(x) = P(x) = Σ_{y∈Y} P(x, y)
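The joint and marginal mass functions above can be illustrated numerically. The 2x3 joint table below is made up purely for illustration; the only constraints are the ones from the slide (non-negative entries summing to 1).

```python
import numpy as np

# Hypothetical joint probability mass function P(x, y) for
# x in {v1, v2} (rows) and y in {w1, w2, w3} (columns).
P = np.array([[0.10, 0.20, 0.10],
              [0.25, 0.15, 0.20]])

assert np.isclose(P.sum(), 1.0)   # sum_x sum_y P(x, y) = 1

# Marginal distributions: P_x(x) = sum_{y in Y} P(x, y), and symmetrically for y.
Px = P.sum(axis=1)   # marginal over x
Py = P.sum(axis=0)   # marginal over y
print(Px, Py)
```

Each marginal is itself a valid mass function: it is non-negative and sums to 1.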
Probability Theory

Conditional probability
Conditional probability of x given y:
Pr[x = v_i | y = w_j] = Pr[x = v_i, y = w_j] / Pr[y = w_j]
In terms of mass functions:
P(x|y) = P(x, y) / P(y)

Statistical independence
x and y are said to be statistically independent if and only if
P(x, y) = P_x(x) · P_y(y)
Probability Theory

Bayes rule
P(y) = Σ_{x∈X} P(x, y)
P(x|y) = P(x, y) / P(y)
P(x, y) = P(y|x) P(x)
And finally:
P(x|y) = P(y|x) P(x) / Σ_{x∈X} P(y|x) P(x)

Continuous random variables
Properties similar to those of discrete random variables:
Pr[x ∈ (a, b)] = ∫_a^b p(x) dx
with p(x) a probability density function:
p(x) ≥ 0 and ∫_{−∞}^{+∞} p(x) dx = 1
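The discrete Bayes rule above can be sketched in a few lines. The two states and the likelihood values below are made up for illustration.

```python
# A minimal sketch of the discrete Bayes rule
#   P(x|y) = P(y|x) P(x) / sum_x P(y|x) P(x)
# for two hypothetical states x1, x2 and one observed event y.
prior = {"x1": 0.5, "x2": 0.5}        # P(x)
likelihood = {"x1": 0.9, "x2": 0.3}   # P(y|x), made-up numbers

evidence = sum(likelihood[x] * prior[x] for x in prior)             # P(y)
posterior = {x: likelihood[x] * prior[x] / evidence for x in prior}  # P(x|y)
print(posterior)
```

Note that the denominator is exactly the marginal P(y) from the first line of the slide, so the posterior always sums to 1.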
Bayesian Decision Theory

Content
1 Before beginning
2 Bayesian Decision Theory
    Minimum Error Rate Classification
    Decision Function
3 Exercises
4 Example: Bayesian Tracking

Principle
Bayesian decision theory is a fundamental statistical approach to pattern classification problems.
It is a statistical (stochastic) method.
Assumption: the problem must be expressed using probabilities.
Under this assumption, Bayesian decision theory is optimal.
Minimum Error Rate Classification

Toy example
Prior probability associated with a class. Consider a car manufacturer for which we know the prior proportions of black (1/2), white (1/4) and other (1/4) cars produced by the factory.
Question: with no additional observation, how do we decide the color of the next car built?
Answer: we decide black (minimization of the error risk). We use an important piece of information, the priors associated with each class:
P(ω1) = 1/2, P(ω2) = 1/4, P(ω3) = 1/4

Bayes Rule
Let {ω1, ω2, ..., ωc} be a set of c classes and x a feature vector. For each class ωi we assume that a prior can be computed:
P(ωi) : prior probability for class i
p(x|ωi) : probability density function of x given the class ωi (likelihood)
Bayes Rule (two-class toy example)

The Bayes rule gives the posterior according to the prior, the likelihood and the evidence:
P(ωi|x) = p(x|ωi) P(ωi) / p(x)
with:
p(x) = Σ_i p(x|ωi) P(ωi)

In words:
posterior = likelihood · prior / evidence

[Figure: the two class-conditional densities pdf1 and pdf2, and the resulting posterior for P(ω1) = 0.5 and P(ω2) = 0.5.]
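The curves in the figure can be reproduced numerically. The sketch below assumes Gaussian class-conditional densities with made-up means and standard deviations; only the posterior formula itself comes from the slide.

```python
import numpy as np

# Two-class posterior P(wi|x) = p(x|wi) P(wi) / p(x), with (assumed)
# Gaussian class-conditional densities and equal priors.
def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(0.0, 100.0, 1001)
P1, P2 = 0.5, 0.5                  # priors P(w1), P(w2)
lik1 = gauss(x, 30.0, 10.0)        # p(x|w1), made-up parameters
lik2 = gauss(x, 60.0, 15.0)        # p(x|w2), made-up parameters

evidence = lik1 * P1 + lik2 * P2   # p(x) = sum_i p(x|wi) P(wi)
post1 = lik1 * P1 / evidence       # P(w1|x)
post2 = lik2 * P2 / evidence       # P(w2|x)
# The two posteriors sum to 1 at every x, as expected.
```

Plotting post1 and post2 against x gives the sigmoid-like posterior curves of the figure.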
Minimum Error Rate Classification

Toy example for two classes
[Figure: the class-conditional densities pdf1 and pdf2, and the resulting posteriors for P(ω1) = 0.2, P(ω2) = 0.8 and for P(ω1) = 0.8, P(ω2) = 0.2.]

Error probability
Let x be a feature vector and δ(x) = ωi a decision rule. The probability of error related to δ is:
P(error|x) = Σ_{j≠i} P(ωj|x) = 1 − P(ωi|x)
The global error probability of the system is:
P(error) = ∫_{−∞}^{+∞} P(error|x) p(x) dx
Minimum Error Rate Classification

Optimal decision
The optimal decision (with respect to the error probability) is given by δ(x) = ωi such that P(ωi|x) is maximum:
P(ωi|x) ≥ P(ωj|x)  ∀j
or equivalently:
p(x|ωi) P(ωi) ≥ p(x|ωj) P(ωj)  ∀j

Decision surfaces
[Figure: the two class-conditional densities and the posterior for P(ω1) = P(ω2) = 0.5; the decision boundary splits the x axis into the regions R1 and R2.]
Decision Function

Bayes Loss and Risk
Consider a finite set of c states {ω1, ..., ωc} and a finite set of a possible actions {α1, ..., αa}.

Loss function
λ(αi|ωj) describes the loss incurred for taking action αi while the state of nature is ωj.

Risk
Let δi be the decision to choose the state ωi. The risk related to the decision δi is:
R(δi|x) = Σ_j λ(δi|ωj) P(ωj|x)

Overall risk
R = ∫_{R^n} R(δ(x)|x) p(x) dx

Bayes Loss and Risk (for two classes)
Let λij = λ(δi|ωj) be the loss related to decision δi while the true state is ωj:
R(δ1|x) = λ11 P(ω1|x) + λ12 P(ω2|x)
R(δ2|x) = λ21 P(ω1|x) + λ22 P(ω2|x)
with λ11 < λ21 and λ22 < λ12 (an error costs more than a correct decision).
We decide ω1 if R(δ1|x) < R(δ2|x), therefore if:
(λ21 − λ11) P(ω1|x) > (λ12 − λ22) P(ω2|x)
then:
(λ21 − λ11) p(x|ω1) P(ω1) > (λ12 − λ22) p(x|ω2) P(ω2)
and finally, we decide ω1 if:
p(x|ω1) / p(x|ω2) > [(λ12 − λ22) / (λ21 − λ11)] · [P(ω2) / P(ω1)]
This is the likelihood ratio.
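The equivalence between comparing the two risks and thresholding the likelihood ratio can be checked numerically. The loss values below are made up (zero loss for correct decisions, asymmetric losses for errors).

```python
# Two-class decision: risk comparison vs. likelihood-ratio test,
# with made-up losses satisfying l11 < l21 and l22 < l12.
l11, l12, l21, l22 = 0.0, 2.0, 1.0, 0.0

def decide_by_risk(post1, post2):
    # R(d1|x) = l11 P(w1|x) + l12 P(w2|x), R(d2|x) = l21 P(w1|x) + l22 P(w2|x)
    R1 = l11 * post1 + l12 * post2
    R2 = l21 * post1 + l22 * post2
    return "w1" if R1 < R2 else "w2"

def decide_by_ratio(lik1, lik2, P1, P2):
    # decide w1 if p(x|w1)/p(x|w2) > (l12 - l22)/(l21 - l11) * P(w2)/P(w1)
    threshold = (l12 - l22) / (l21 - l11) * P2 / P1
    return "w1" if lik1 / lik2 > threshold else "w2"
```

Because the evidence p(x) cancels out, both rules produce the same decision for any likelihood pair and priors.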
Decision Function

Bayes decision rule: minimization of the overall risk
For all possible decisions, compute the conditional risk:
R(δi|x) = Σ_j λ(δi|ωj) P(ωj|x)
Select the action for which R(δi|x) is minimum.

Zero-one loss
When all errors are equally costly, the following loss function can be defined:
λij = 0 if i = j
λij = 1 if i ≠ j

Risk for zero-one loss
R(δi|x) = Σ_j λ(δi|ωj) P(ωj|x) = Σ_{j≠i} P(ωj|x) = 1 − P(ωi|x)

Conclusion: in this case, risk minimization is equivalent to posterior probability maximization.
Decision Function

Discriminant function
A decision rule is modelled by discriminant functions defined by:
g_i(x), i = 1, ..., s   (s = number of states)
x is decided to be in the state ωi if g_i(x) > g_j(x) ∀j ≠ i

Several decision functions can be defined:
g_i(x) = −R(δi|x)
g_i(x) = P(ωi|x)
g_i(x) = p(x|ωi) P(ωi)
f(g_i(x)), if f is a monotonically increasing function and g_i is a decision function

Commonly used discriminant functions
g_i(x) = P(ωi|x) = p(x|ωi) P(ωi) / Σ_i (p(x|ωi) P(ωi))
g_i(x) = p(x|ωi) P(ωi)
g_i(x) = log(p(x|ωi)) + log(P(ωi))

Content
1 Before beginning
2 Bayesian Decision Theory
3 Exercises
    Exercise 1
    Exercise 2
    Exercise 3
4 Example: Bayesian Tracking
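The log discriminant g_i(x) = log p(x|ωi) + log P(ωi) can be sketched for assumed 1-D Gaussian class-conditional densities; the class parameters below are made up.

```python
import math

# Log discriminant g_i(x) = log p(x|w_i) + log P(w_i) for (assumed)
# 1-D Gaussian class-conditional densities.
def log_gauss(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def classify(x, params, priors):
    # params: list of (mu, sigma) per class; priors: list of P(w_i)
    g = [log_gauss(x, mu, s) + math.log(P) for (mu, s), P in zip(params, priors)]
    return max(range(len(g)), key=g.__getitem__)   # argmax_i g_i(x)

params = [(0.0, 1.0), (3.0, 1.0)]   # made-up class parameters
priors = [0.5, 0.5]
print(classify(1.0, params, priors))
```

Since log is monotonically increasing, this argmax gives the same decision as maximizing p(x|ωi) P(ωi) directly, while being numerically better behaved.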
Exercise 1: Bayes rule

Statement
The English and the Americans spell the word rigueur as "rigour" and "rigor", respectively. A man staying in a Paris hotel wrote this word on a piece of paper. A letter is drawn at random from the word: it is a vowel. Moreover, 40% of the English speakers in the hotel are English and the remaining 60% are American. What is the probability that the author of the word is English?

Correction
Let V be the event "the letter drawn at random from the word is a vowel", B the event "the word was written by an Englishman", and A the event "the word was written by an American". Using the Bayes formula, the required probability is:
P(B|V) = P(V|B) P(B) / (P(V|B) P(B) + P(V|A) P(A))
P(B|V) = (1/2 · 4/10) / (1/2 · 4/10 + 2/5 · 6/10) = 5/11
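The computation can be verified with exact rational arithmetic: "rigour" has 3 vowels out of 6 letters, "rigor" has 2 out of 5.

```python
from fractions import Fraction

# Exercise 1 checked exactly: vowel probabilities come from counting
# vowels in "rigour" (3/6 = 1/2) and "rigor" (2/5).
P_B = Fraction(4, 10)   # P(English)
P_A = Fraction(6, 10)   # P(American)
P_V_B = Fraction(sum(c in "aeiou" for c in "rigour"), len("rigour"))  # 1/2
P_V_A = Fraction(sum(c in "aeiou" for c in "rigor"), len("rigor"))    # 2/5

P_B_V = P_V_B * P_B / (P_V_B * P_B + P_V_A * P_A)
print(P_B_V)   # 5/11
```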
Exercise 2: Bayes rule

Statement
Consider three playing cards of the same shape. Both faces of the first card have been colored black, both faces of the second card red, while the third card has one black face and one red face. The three cards are mixed in a hat, then one card is drawn at random and placed on the floor. If the visible face is red, what is the probability that the other face is black?

Correction
Let RR, NN and RN be the events "the chosen card is entirely red" (respectively "entirely black", "two-colored"). Let R be the event "the visible face of the drawn card is red". By the Bayes formula:
P(RN|R) = P(R|RN) P(RN) / (P(R|RR) P(RR) + P(R|RN) P(RN) + P(R|NN) P(NN))
P(RN|R) = (1/2 · 1/3) / (1 · 1/3 + 1/2 · 1/3 + 0 · 1/3) = 1/3

Exercise 3: likelihood ratio

Statement
Consider two probability densities defined, up to normalization, by the expression:
p(x|ωi) ∝ e^{−|x−ai|/bi} for i = 1, 2 and bi > 0
1 Give the analytic expression of each density as a function of ai and bi (normalization).
2 Express the ratio of the densities as a function of the 4 variables.
3 Plot the ratio p(x|ω1)/p(x|ω2) as a function of x for the case a1 = 0, b1 = 1, a2 = 1 and b2 = 2.
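The answer to Exercise 2 can be double-checked by simulation: draw a card and a face uniformly at random, and count how often the hidden face is black among the draws whose visible face is red.

```python
import random

# Monte-Carlo check of Exercise 2: three cards (black/black, red/red,
# black/red); given that the visible face is red, estimate the
# probability that the hidden face is black.
random.seed(0)
cards = [("B", "B"), ("R", "R"), ("B", "R")]

red_seen = other_black = 0
for _ in range(200_000):
    card = random.choice(cards)
    side = random.randrange(2)
    up, down = card[side], card[1 - side]
    if up == "R":
        red_seen += 1
        other_black += (down == "B")

estimate = other_black / red_seen
print(estimate)   # close to 1/3
```

The common wrong intuition is 1/2; conditioning on the visible face being red makes the entirely red card twice as likely as the two-colored one, hence 1/3.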
Example: Bayesian Tracking

Content
1 Before beginning
2 Bayesian Decision Theory
3 Exercises
4 Example: Bayesian Tracking
    Probabilistic Approaches to Visual Tracking
    On-line and Off-line Tracking
    Tracking a vehicle from a static camera
        Context
        Solution

Visual Tracking
Definition
Visual tracking is the process of locating, identifying, and determining the dynamic configuration of one or many moving (possibly deformable) objects (or parts of objects) in each frame of one or several cameras.
Human equivalent: following something with your eyes.

Visual Tracking (before beginning)
State vector
The dynamic configuration of the tracked object at time k is modelled by a state vector denoted x_k.
State sequence
The state sequence is given by the set (sequence) of state vectors, denoted X = {x_k}_{k=1,...,K}.
Observation
Similarly, the observation sequence is denoted Z = {z_k}_{k=1,...,K}.
On-line and Off-line Tracking

Off-line tracking (deferred tracking)
Estimation of the state x_k uses the entire observation sequence Z = {z_k}_{k=1,...,K}.
[Figure: timeline showing that all observations z_{k−3}, ..., z_{k+2} are available when estimating x_k.]

On-line tracking
For robotic applications: estimation of the state x_k uses only the current and past observations z_{0:k}.
[Figure: timeline showing that only z_{k−3}, ..., z_k are available when estimating x_k.]

Delayed tracking
Estimation of the state uses the current and past observations, plus a part (a delay) of the future observations.
[Figure: timeline showing that z_{k−3}, ..., z_{k+2} are available when estimating x_k, at the price of a delay.]
Probabilistic Approaches to Visual Tracking

Random vectors
Both the state X and the observation Z are random vectors: X ∈ 𝒳 and Z ∈ 𝒵.

Joint probability
The probability of a state sequence is given by:
p(X|Z) = p(x_1, x_2, ..., x_K | z_1, z_2, ..., z_K)
The final output of a visual tracking process is an estimate X̂.

The Recursive Bayesian Estimation Approach
Dynamic Bayesian network representation
First-order Markov assumption: the object configuration at time k, x_k, depends only on the previous state x_{k−1}.
[Figure: dynamic Bayesian network with the state chain x_{k−3}, ..., x_{k+2} (states X = {x_k}_{k=1,...,K}), each state emitting its observation z_k (observations Z = {z_k}_{k=1,...,K}).]
Recursive state-space Bayesian estimation

Posterior distribution
The belief about the current state x_k is expressed by a probability distribution:
p(x_k|z_{1:k}) : the POSTERIOR DISTRIBUTION
How can p(x_k|z_{1:k}) be computed recursively?

Computing p(x_k|z_{1:k}) from p(x_{k−1}|z_{1:k−1}): a two-step algorithm
p(x_{k−1}|z_{1:k−1}) → Prediction (Chapman-Kolmogorov), using the dynamics p(x_k|x_{k−1}) → p(x_k|z_{1:k−1}) → Update (Bayes), using the likelihood p(z_k|x_k) → p(x_k|z_{1:k})
Prediction step (dynamical model)
Chapman-Kolmogorov equation:
p(x_k|z_{1:k−1}) = ∫ p(x_k|x_{k−1}) p(x_{k−1}|z_{1:k−1}) dx_{k−1}

Update step
Bayes theorem:
p(x_k|z_{1:k}) = p(z_k|x_k) p(x_k|z_{1:k−1}) / p(z_k|z_{1:k−1})
with:
p(z_k|z_{1:k−1}) = ∫ p(z_k|x_k) p(x_k|z_{1:k−1}) dx_k
Recursive Bayesian filtering distribution
p(x_k|z_{1:k}) = C^{−1} p(z_k|x_k) ∫_{x_{k−1}} p(x_k|x_{k−1}) p(x_{k−1}|z_{1:k−1}) dx_{k−1}

Partial conclusion
The recursive Bayesian filtering distribution provides an efficient solution to compute the posterior at time k, p(x_k|z_{1:k}), from the posterior at time k−1, p(x_{k−1}|z_{1:k−1}), the dynamic model p(x_k|x_{k−1}), and the likelihood p(z_k|x_k). Operations (integrals, products) on pdfs have to be performed.

Question
How can the probabilities be defined so that operations like product and integration become tractable?
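One simple (if crude) way to make the recursion tractable is to discretize the state on a grid, so that the integral becomes a matrix-vector product. The sketch below assumes a random-walk dynamic model and a Gaussian likelihood; the grid size, measurements and noise levels are all made up.

```python
import numpy as np

# Grid-based sketch of the predict/update recursion: the posterior is a
# vector over discrete states, the dynamics a column-stochastic transition
# matrix (assumed random walk), the likelihood a Gaussian around z_k.
n = 50
states = np.arange(n, dtype=float)

# p(x_k | x_{k-1}): stay in place or move one cell left/right.
A = np.zeros((n, n))
for i in range(n):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            A[j, i] = 1.0
A /= A.sum(axis=0)                  # make columns sum to 1

posterior = np.full(n, 1.0 / n)     # flat initial belief
for z in [10.0, 11.0, 12.0]:        # made-up measurement sequence
    predicted = A @ posterior                           # Chapman-Kolmogorov
    lik = np.exp(-0.5 * ((states - z) / 2.0) ** 2)      # p(z_k | x_k)
    posterior = lik * predicted
    posterior /= posterior.sum()                        # Bayes normalization

print(states[np.argmax(posterior)])  # belief concentrates near the measurements
```

This works for low-dimensional states only; the two families of solutions discussed next (parametric and stochastic models) scale better.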
Modelling pdfs

Two families of solutions: parametric models and stochastic models.

Parametric models (Kalman, ...)
Kalman filter assumption: all pdfs in the recursion (posterior, prediction, update) are modelled by Gaussians, so each step maps one parametric function (a Gaussian) to another.

Stochastic models
Stochastic approximation of the pdf by a set of samples.
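Under the Gaussian assumption the whole recursion reduces to updating a mean and a variance. Below is a 1-D sketch with an assumed random-walk dynamic model; the noise variances and measurements are made up.

```python
# 1-D Kalman filter sketch: every pdf in the recursion is Gaussian, so the
# predict/update steps act on a (mean, variance) pair only.
def kalman_step(mean, var, z, q=1.0, r=2.0):
    # Prediction with random-walk dynamics x_k = x_{k-1} + noise(0, q)
    mean_pred, var_pred = mean, var + q
    # Update with measurement z_k = x_k + noise(0, r) (Bayes rule on Gaussians)
    K = var_pred / (var_pred + r)            # Kalman gain
    mean_new = mean_pred + K * (z - mean_pred)
    var_new = (1.0 - K) * var_pred
    return mean_new, var_new

mean, var = 0.0, 100.0                       # vague initial belief
for z in [4.8, 5.2, 5.0]:                    # made-up measurements
    mean, var = kalman_step(mean, var, z)
print(mean, var)   # mean moves near 5, variance shrinks
```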
Stochastic models (particle filters, ...)

Particle filters
All pdfs in the recursion (posterior, prediction, update) are approximated by a set of samples (particles).

Probabilistic filters: remark
Kalman filters and their derivatives: we assume that the unknown pdf can be modelled by a parametric function.
Stochastic solutions: approximation of the pdf by a set of particles.
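A minimal SIR (sampling importance resampling) loop in 1-D illustrates the particle-filter idea: predict by sampling the dynamics, weight by the likelihood, then resample. The random-walk dynamics, Gaussian likelihood and measurement values below are all made up.

```python
import math
import random

# Minimal SIR particle filter sketch: the posterior is a set of particles.
random.seed(1)
N = 500
particles = [random.uniform(0.0, 20.0) for _ in range(N)]  # flat initial belief

for z in [10.0, 10.5, 11.0]:                       # made-up measurements
    # Prediction: propagate each particle through random-walk dynamics
    particles = [x + random.gauss(0.0, 0.5) for x in particles]
    # Update: weight each particle by the likelihood p(z_k | x_k)
    w = [math.exp(-0.5 * (z - x) ** 2) for x in particles]
    total = sum(w)
    w = [wi / total for wi in w]
    # Resampling: draw N particles with probability proportional to weight
    particles = random.choices(particles, weights=w, k=N)

estimate = sum(particles) / N   # posterior mean, near the measurements
print(estimate)
```

Unlike the Kalman filter, nothing here requires the posterior to be Gaussian; the price is approximation error that decreases with the number of particles.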
Tracking a vehicle from a static camera

What do we want to do?
Visual tracking of a vehicle from a static camera. The dynamic model of the object is known. We want to estimate the velocity and the steering angle of the vehicle.
[Figure: a car and its bicycle kinematic model, with wheelbase L and steering angle δ.]
Kinematic model

Bicycle model (wheelbase L, steering angle δ):
ẋ = v · cos β
ẏ = v · sin β
β̇ = (v / L) · tan δ
with x, y : position, β : orientation, v : velocity.

Tracking scheme
State model: P_t position (size 2), β_t orientation, δ_t steering angle, v_t velocity.
Tracking engine: SIR (sampling importance resampling).
Likelihood function (observation): background/foreground segmentation.
Kinematic model: bicycle model.
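The bicycle model above can be integrated numerically; a simple Euler step is enough for a sketch. The wheelbase, velocity, steering angle and time step below are made-up values.

```python
import math

# Euler integration of the bicycle kinematic model:
#   x' = v cos(beta), y' = v sin(beta), beta' = (v / L) tan(delta)
def step(x, y, beta, v, delta, L=2.5, dt=0.05):
    x += v * math.cos(beta) * dt
    y += v * math.sin(beta) * dt
    beta += (v / L) * math.tan(delta) * dt
    return x, y, beta

x = y = beta = 0.0
v, delta = 10.0, 0.1            # made-up constant velocity and steering angle
for _ in range(100):            # simulate 5 s
    x, y, beta = step(x, y, beta, v, delta)
# With a constant steering angle the vehicle follows a circular arc
# of radius L / tan(delta).
```

In the tracker, this dynamic model plays the role of p(x_k|x_{k−1}): each particle is propagated through one such step (plus process noise).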
Observation function
[Figures: illustration of the observation (likelihood) function built from background/foreground segmentation.]
Results

Precision (cm):

Speed (km/h) | Vision (ave/std) | Rangefinder (ave/std) | Sensor merge (ave/std)
40           | 0.25/0.18        | 0.65/0.54             | 0.17/0.10
60           | 0.19/0.16        | 0.72/0.67             | 0.09/0.06
80           | 0.18/0.15        | 0.33/0.22             | 0.14/0.10