Boris Saulnier - GOLON seminar June 30th, 2006. ENS, Paris

Correlations, nonadditivity and nonextensivity

Composite systems: zoom in/out, (de)compose. Is the global description the sum of local descriptions? Criticality and long-range correlations. Additivity/extensivity - generalized entropies.

Ω: space of events / set of possibilities
ω ∈ Ω: elementary events / microstates
A_b: alphabet, |A_b| = b
Specification of ω ∈ Ω requires log_b |Ω| letters.

A “code” must have been specified... Specifying one element in A ⊂ Ω requires log |A| bits.

Info “ω ∈ A”: log(|Ω|/|A|) = −log(|A|/|Ω|) bits. It suggests I_Ω(ω) = −log p(ω).

On average, coding any ω ∈ Ω requires H(p) = −Σ_ω p(ω) log p(ω). H(p) is an average code length.

Important (grouping property):

H_n(p1, …, pn) = H_{n−1}(p1 + p2, p3, …, pn) + (p1 + p2) · H_2(p1/(p1 + p2), p2/(p1 + p2))

i.e. you can calculate information locally, then add.
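A minimal numerical check of the grouping property (my sketch, in Python; the distribution is an arbitrary example, entropies in bits):

    from math import log2

    def H(p):
        """Shannon entropy in bits; skips zero-probability terms."""
        return -sum(pi * log2(pi) for pi in p if pi > 0)

    p = [0.5, 0.25, 0.125, 0.125]
    s = p[0] + p[1]
    lhs = H(p)
    rhs = H([s, p[2], p[3]]) + s * H([p[0] / s, p[1] / s])
    print(lhs, rhs)   # both 1.75 bits: information computed locally, then added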

Random variable X : Ω → 𝒳, p_X : 𝒳 → R

p_X(x) = p[X = x] = Σ_{ω∈Ω : X(ω)=x} p(ω)

Joint distribution of 2 random variables: XY : Ω → 𝒳 × 𝒴, p : Ω → R, p_XY : 𝒳 × 𝒴 → R. In general p_XY ≠ p_X · p_Y.

From the joint you can calculate the marginal distributions, e.g. p(X = x) = Σ_{y∈𝒴} p_XY(x, y) = Σ_{ω∈Ω : X(ω)=x} p(ω)

Conditional probability of events A, B ⊂ Ω: p(A|B) = p(A∩B)/p(B)

Independence: p(A)/p(Ω) = p(A∩B)/p(B), i.e. p(A ∩ B) = p(A)·p(B)

You can “zoom” from Ω to B: the part of A is invariant. “Independence”, “conditional” = about what happens when zooming and working “locally”.

Conditional probability (needs a joint distribution...):

p(X = x | Y = y) = p_XY(x, y)/p_Y(y)
= [Σ_{ω∈Ω : X(ω)=x, Y(ω)=y} p(ω)] / [Σ_{ω∈Ω : Y(ω)=y} p(ω)]
= p_XY[{X = x} ∩ {Y = y}] / p_Y[Y = y]

Interpretation: “zoom” in {ω ∈ Ω : Y(ω) = y} and normalize by p_Y(y)...

Conditional entropy = entropy of the “induced partition”:

H(X|Y) = −E_{p_XY} log p(X|Y) = −Σ_{x∈𝒳} Σ_{y∈𝒴} p_XY(x, y) log p(x|y)

Chain rule

H(X, Y ) = H(X) + H(Y |X)

because log p(x, y) = log p(x) + log p(y|x)

Generalized: p(x1, …, xn) = p(x1) p(x2|x1) … p(xn|x1, …, x_{n−1})

H(X1, …, Xn) = Σ_{i=1}^n H(Xi | X_{i−1}, …, X1)
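The chain rule is easy to verify on a small joint table; a minimal sketch (the joint distribution pXY below is an arbitrary example of mine):

    from math import log2

    # Joint distribution p_XY(x, y) -- an arbitrary example
    pXY = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
    pX = {x: sum(p for (a, b), p in pXY.items() if a == x) for x in (0, 1)}

    HXY  = -sum(p * log2(p) for p in pXY.values())                  # H(X,Y)
    HX   = -sum(p * log2(p) for p in pX.values())                   # H(X)
    HY_X = -sum(p * log2(p / pX[x]) for (x, y), p in pXY.items())   # H(Y|X)

    print(HXY, HX + HY_X)   # equal: H(X,Y) = H(X) + H(Y|X)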

Relative entropy ∼ a kind of distance between 2 distributions. X : Ω → 𝒳, p : 𝒳 → R, q : 𝒳 → R

D(p‖q) = Σ_{x∈𝒳} p(x) log (p(x)/q(x))

(note: f : 𝒳 → R, f(x) = Σ_{ω∈Ω : X(ω)=x} p(ω) is NOT a pdf)

If the distribution of {x ∈ 𝒳} is p: H(p) bits on average. If you code as if the distribution were q: H(p) + D(p‖q) bits on average. Asymmetric: D(p‖q) ≠ D(q‖p).
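A sketch of the coding interpretation (the example distributions are my choice): coding with a code optimized for q, when the true distribution is p, costs H(p) + D(p‖q) bits on average:

    from math import log2

    def H(p):
        return -sum(pi * log2(pi) for pi in p if pi > 0)

    def D(p, q):
        """Relative entropy D(p||q) in bits (assumes q > 0 wherever p > 0)."""
        return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

    p = [0.7, 0.2, 0.1]
    q = [1/3, 1/3, 1/3]
    cross = -sum(pi * log2(qi) for pi, qi in zip(p, q))  # avg length with q's code
    print(cross, H(p) + D(p, q))   # equal
    print(D(p, q), D(q, p))        # different: D is asymmetric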

Mutual information

I(X; Y) = D(p_XY ‖ p_X·p_Y): “distance” between joint and product distributions

I(X; Y) = E_{p_XY} log [p(x, y)/(p(x)p(y))]

I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)

PROPERTIES: log is concave!!!
⇒ D(p‖q) ≥ 0 (equality iff p = q)
⇒ I(X; Y) ≥ 0 (equality iff X independent of Y)
⇒ H(X|Y) ≤ H(X) (equality iff...)
⇒ H(X1, …, Xn) ≤ Σ_{i=1}^n H(Xi) (subadditivity, equality iff...)
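Mutual information can be checked the same way; a sketch reusing the joint table from the chain-rule example above:

    from math import log2

    pXY = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}
    pX = {x: sum(p for (a, b), p in pXY.items() if a == x) for x in (0, 1)}
    pY = {y: sum(p for (a, b), p in pXY.items() if b == y) for y in (0, 1)}

    # I(X;Y) = D(p_XY || p_X . p_Y): zero iff the joint factorizes
    I = sum(p * log2(p / (pX[x] * pY[y])) for (x, y), p in pXY.items())
    print(I)   # > 0 here, since p_XY != p_X . p_Y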

ADDITIVE versus EXTENSIVE

A system made of n units (e.g. a gas); x^n: state vector; Q: an observable.
Additivity: if Q(x^n) = Q(x^m) + Q(x^{n−m}); complete additivity if Q(x^n) = Σ_i Q(x_i)

E.g. an ideal gas made of non-interacting particles (with Q = Hamiltonian = total energy).

p(x^n): the probability of states. U(x^n) = −log p(x^n): “pseudo-energy”, a basis of the “thermodynamic formalism” (Ruelle).

p(x^n) factorizable:

−log p(x^n) = −Σ_i log p(x_i)

implies U(x^n) additive. Otherwise

p(x^n) = p(x1) p(x2|x1) p(x3|x1, x2) … p(xn|x1, …, x_{n−1})

and the conditional factors p(x_m|x1, …, x_{m−1}) represent NON-additive interaction pseudo-energies, i.e. correlations ⇒ NO additivity (of pseudo-energy).

Extensivity: Q(x^n)/n → C as n → ∞

E.g. an ergodic sequence of correlated random variables: the BGS entropy is non-additive, but H(X1, …, Xn)/n converges to the entropy rate = constant. Non-additive, but extensive (see the sketch after this list).

In physics, most often additivity ⇒ extensivity. No extensivity when:
- ∃ interaction energy
- ∃ long-range correlations (slow decay with distance)
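A minimal sketch of this point, assuming a two-state stationary Markov chain (my choice of correlated system): the block entropy H_n is not additive across blocks, yet H_n/n converges to the entropy rate h:

    from math import log2

    # Stationary two-state Markov chain -- a simple correlated system
    P  = [[0.9, 0.1], [0.2, 0.8]]   # transition probabilities
    mu = [2/3, 1/3]                  # stationary distribution: mu P = mu

    H1 = -sum(p * log2(p) for p in mu)                       # H(X1)
    h  = -sum(mu[i] * P[i][j] * log2(P[i][j])
              for i in (0, 1) for j in (0, 1))               # entropy rate

    def Hn(n):
        # Chain rule for a stationary Markov chain: H_n = H1 + (n-1) h
        return H1 + (n - 1) * h

    print(Hn(20), 2 * Hn(10))        # H_2n != 2 H_n: NOT additive
    for n in (10, 100, 10_000):
        print(n, Hn(n) / n)          # but H_n / n -> h: extensive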

E.g. a gas of electrons in a semiconductor: total energy = Σ free-electron energies with a RENORMALIZED mass factor. Intuition: no additivity if long-range correlation, memory, criticality, extended criticality...

Tsallis entropy idea: generalized q-logarithm

log_q x = (x^{1−q} − 1)/(1−q)

q = 1: usual log. Pseudo-additivity:

log_q(xy) = log_q x + log_q y + (1−q)(log_q x)(log_q y)

Generalized q-exponential:

e_q^x = [1 + (1−q)x]^{1/(1−q)}
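Both definitions and the pseudo-additivity rule can be checked directly; a minimal sketch (the cutoff at 0 inside exp_q is the usual convention, my addition):

    from math import log, exp

    def log_q(x, q):
        """q-logarithm: (x^(1-q) - 1)/(1-q); reduces to log at q = 1."""
        return log(x) if q == 1 else (x**(1 - q) - 1) / (1 - q)

    def exp_q(x, q):
        """q-exponential, inverse of log_q (with the usual cutoff at 0)."""
        return exp(x) if q == 1 else max(1 + (1 - q) * x, 0.0) ** (1 / (1 - q))

    q, x, y = 0.5, 2.0, 3.0
    lhs = log_q(x * y, q)
    rhs = log_q(x, q) + log_q(y, q) + (1 - q) * log_q(x, q) * log_q(y, q)
    print(lhs, rhs)                   # equal: pseudo-additivity
    print(exp_q(log_q(x, q), q))      # recovers x = 2.0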

S_T(P, q) = Σ_{i=1}^n p_i log_q(1/p_i)

note: log_q(x^α) ≠ α log_q x...

Rényi entropy: generalized average (Kolmogorov-Nagumo). Quasi-linear mean:

M = f^{−1}(Σ_{i=1}^N p_i f(I_i))

where f is strictly monotone, continuous and invertible (a KN function). Rényi showed that only 2 KN functions preserve additivity:
- the common arithmetic mean
- the exponential mean, f(x) = c1 b^{(1−q)x} + c2

Rényi entropy:

S_R(P, q) = (1/(1−q)) log Σ_{i=1}^N p_i^q

For q → 1 it becomes the BGS entropy. S_R(P, q) represents an exponential mean over the same elementary information gains log(1/p_i).
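Both generalized entropies are one-liners; a sketch checking the q → 1 limit (the distribution is an arbitrary example; natural log):

    from math import log

    def S_BGS(p):
        return -sum(pi * log(pi) for pi in p if pi > 0)

    def S_tsallis(p, q):
        """S_T = sum_i p_i log_q(1/p_i) = (1 - sum_i p_i^q)/(q - 1)."""
        return S_BGS(p) if q == 1 else (1 - sum(pi**q for pi in p)) / (q - 1)

    def S_renyi(p, q):
        """S_R = log(sum_i p_i^q)/(1 - q)."""
        return S_BGS(p) if q == 1 else log(sum(pi**q for pi in p)) / (1 - q)

    p = [0.5, 0.3, 0.2]
    for q in (0.5, 0.99, 1.01, 2.0):
        print(q, S_tsallis(p, q), S_renyi(p, q))   # both -> S_BGS as q -> 1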

CHALLENGE: defining microscopic entropy in open thermodynamic systems, far from equilibrium...
- SRB distributions (Sinai, Ruelle, Bowen): system = hyperbolic (chaotic hypothesis); lies on a set of Lebesgue measure 0
- Gallavotti-Cohen: define ONLY entropy creation

Tsallis claim: need for generalizing

H(p) = −Σ p_i log p_i (Shannon-Gibbs) and H = k log W (Boltzmann)

for systems made of strongly non-independent subsystems.

About Boltzmann-Gibbs statistical mechanics: the simplest derivation = optimization of

S = −Σ p_i log p_i

under the constraints

Σ p_i = 1 and Σ p_i E_i = U_BG

(E_i: eigenvalues associated to the Hamiltonian). Lagrange multipliers...

p_i = e^{−βE_i} / Σ_{j=1}^W e^{−βE_j}

where β ≡ 1/(k_B T) is related to U_BG.

Tsallis approach: variational approach (MaxEnt)

S_q = k (1 − Σ_{i=1}^W p_i^q)/(q − 1)

Optimization gives

p_i = e_q^{−βE_i} / Σ_{j=1}^W e_q^{−βE_j}

S_q is concave if q > 0 (convex if q < 0). Lesche stability (experimental robustness).
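Both variational solutions can be evaluated side by side, following the slide's formula p_i ∝ e_q^{−βE_i}; a minimal sketch (the energy levels and β are arbitrary choices of mine; q = 1 recovers the Boltzmann-Gibbs weights):

    from math import exp

    def exp_q(x, q):
        """q-exponential with the usual cutoff; q = 1 is the ordinary exp."""
        return exp(x) if q == 1 else max(1 + (1 - q) * x, 0.0) ** (1 / (1 - q))

    E    = [0.0, 1.0, 2.0, 4.0]   # energy eigenvalues E_i (arbitrary example)
    beta = 1.0                     # beta = 1/(k_B T)

    for q in (1.0, 1.5):
        w = [exp_q(-beta * Ei, q) for Ei in E]
        Z = sum(w)                 # generalized partition function
        print(q, [round(wi / Z, 4) for wi in w])   # q = 1.5: fatter tail than BG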

Non-additive for q ≠ 1, therefore non-extensive.

Definition of long-range interaction: in a d-dimensional many-body Hamiltonian system whose two-body potential energy decays as 1/r^α (α > 0), the interaction is long-range iff 0 ≤ α/d ≤ 1 (divergence of ∫_1^∞ dr r^{d−1} r^{−α}).
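A quick numerical illustration of the criterion (my sketch): the tail integral up to a cutoff R keeps growing when α/d ≤ 1 and saturates when α/d > 1:

    from math import log

    def tail_integral(d, alpha, R):
        """Integral of r^(d-1) * r^(-alpha) dr from 1 to R, in closed form."""
        e = d - alpha
        return log(R) if e == 0 else (R**e - 1) / e

    d = 3
    for alpha in (1.0, 3.0, 4.5):   # alpha/d <= 1: divergent; alpha/d > 1: convergent
        print(alpha, [tail_integral(d, alpha, R) for R in (1e2, 1e4, 1e6)])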

For correlated systems, S_q IS extensive (for a given q), whereas S_BG is NOT.

Weak chaos:

The maximal Lyapunov exponent vanishes, e.g. in the map x_{t+1} = 1 − a|x_t|^z (see the sketch below). Strong chaos, mixing, ergodicity ⇒ BG concepts are OK.

The ubiquity in nature of BG exponentials (Gaussians) relies on the central limit theorem; the ubiquity of q-exponentials is expected to rely on a generalized central limit theorem, generically applicable when random variables are scale-invariantly correlated.
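A sketch estimating the maximal Lyapunov exponent of this map (a = 2 gives strong chaos; a ≈ 1.40115519 is approximately the edge of chaos for z = 2, a value quoted from the standard logistic-map literature):

    from math import log

    def lyapunov(a, z, x0=0.1, T=200_000, burn=1_000):
        """lambda = orbit average of log|f'(x_t)| for f(x) = 1 - a|x|^z."""
        x, s = x0, 0.0
        for t in range(T + burn):
            if t >= burn:
                # |f'(x)| = a z |x|^(z-1); tiny floor guards against x = 0
                s += log(max(a * z * abs(x) ** (z - 1), 1e-300))
            x = 1 - a * abs(x) ** z
        return s / T

    print(lyapunov(2.0, 2.0))         # strong chaos: lambda ~ log 2 > 0
    print(lyapunov(1.40115519, 2.0))  # edge of chaos: lambda ~ 0 (weak chaos)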