Shannon's Formula & Hartley's Rule:

A Mathematical Coincidence?

Olivier Rioul (Télécom ParisTech)
José Carlos Magossi (Unicamp)

23 Sept 2014

Claude Shannon: a sound channel

Shannon’s formula:

    C = (1/2) log2(1 + P/N)   bits/symbol

or...

    C = W log2(1 + P/N)   bits/second

“A Mathematical Theory of Communication,” The Bell System Technical Journal, Vol. 27, pp. 623–656, October 1948.
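As a quick numerical aside (not part of the original slides), here is a minimal Python sketch of the two equivalent forms of Shannon’s formula; P, N and W are the signal power, noise power and bandwidth of the slides, and the rate of 2W samples per second is what links bits/symbol to bits/second.

    import math

    def capacity_per_symbol(P: float, N: float) -> float:
        """Shannon's formula in bits per symbol (per channel use)."""
        return 0.5 * math.log2(1 + P / N)

    def capacity_per_second(P: float, N: float, W: float) -> float:
        """The same capacity in bits per second: 2W symbols per second over a band of width W."""
        return W * math.log2(1 + P / N)

    # Example: SNR P/N = 15 over a 3 kHz band.
    print(capacity_per_symbol(15, 1))        # 2.0 bits/symbol
    print(capacity_per_second(15, 1, 3000))  # 12000.0 bits/second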

Ralph Hartley, 20 years before... in the same journal...

Hartley’s rule:

    C′ = log2(1 + A/∆)   bits/symbol

or...

    C′ = 2W log2(1 + A/∆)   bits/second

“Transmission of Information,” The Bell System Technical Journal, Vol. 7, pp. 535–563, July 1928.

Ralph Hartley

Hartley’s rule (as presented in the Wozencraft-Jacobs textbook, 1965):

    C′ = log2(1 + A/∆)

- amplitude “SNR” A/∆ (the factor 1/2 is missing)
- no coding involved (except quantization)
- zero error
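As another aside (a sketch, not from the slides), the same rule read as a level-counting argument: M = 1 + A/∆ amplitude levels distinguishable to within ±∆, and 2W independent samples per second.

    import math

    def hartley_bits_per_symbol(A: float, delta: float) -> float:
        """log2 of the number M = 1 + A/delta of distinguishable amplitude levels."""
        return math.log2(1 + A / delta)

    def hartley_bits_per_second(A: float, delta: float, W: float) -> float:
        """2W independent samples per second over a band of width W."""
        return 2 * W * hartley_bits_per_symbol(A, delta)

    # Example: amplitude limit A = 7 with accuracy ±1 gives 8 levels, i.e. 3 bits per sample.
    print(hartley_bits_per_symbol(7, 1))        # 3.0
    print(hartley_bits_per_second(7, 1, 3000))  # 18000.0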

Outline

- Hartley’s C′ = log2(1 + A/∆) came 20 years before Shannon
- Shannon’s C = (1/2) log2(1 + P/N) came unexpectedly in 1948
- Hartley’s rule is inexact: C′ ≠ C
- Besides, C′ is not the capacity of a noisy channel (no question)

Wrong!

Outline (corrected)

- This “Hartley’s rule” C′ = log2(1 + A/∆) is not Hartley’s
- Many authors independently derived C = (1/2) log2(1 + P/N) in 1948
- In fact, C′ = C (a coincidence?)
- Besides, C′ is the capacity of the “uniform” channel (and we can explain why)

Hartley or not Hartley

- Why was C′ = log2(1 + A/∆) mistakenly attributed to Hartley?
- In Hartley’s paper, there is no mention of noise or of A vs. ∆.

Quote from Shannon, 1984: “Working on cryptography led back to the good aspects of information theory. I started with information theory, inspired by Hartley’s paper, which was a good paper, but it did not take account of things like noise and best encoding and probabilistic aspects.”

The first tutorial of information theory!

[Scan of Hartley’s 1928 paper.]


And then there were eight

Quote from Shannon, 1948:

1. Norbert Wiener, Cybernetics, early 1948
2. William G. Tuller, PhD thesis, June 1948
3. H. Sullivan, ?
4. Jacques Laplume, April 1948
5. Charles W. Earp, June 1948
6. André G. Clavier, December 1948
7. Stanford Goldman, May 1948
8. Claude E. Shannon, ... July 1940????

Norbert Wiener

Later, in 1956, in an IRE Transactions on Information Theory editorial (“What is Information Theory?”), Wiener wrote: “Information theory has been identified in the public mind to denote the theory of information by bits, as developed by Claude E. Shannon and myself.”

Jacques Laplume

Meanwhile (1948), far away...

Charles W. Earp

André G. Clavier

Stanford Goldman

[Scan of S. Goldman, “Some Fundamental Considerations Concerning Noise Reduction and Range in Radar and Communication,” Proceedings of the I.R.E., May 1948. Key passage: with n significant time intervals and L = (S/N) + 1 significant amplitude levels (the “1” accounting for the zero signal level), the number of different possible messages is L^n.]

William G. Tuller

[Scan of W. G. Tuller, “Theoretical Limitations on the Rate of Transmission of Information,” Proceedings of the I.R.E., May 1949 (from his 1948 PhD thesis). Key passage: with S the rms maximum signal amplitude and N the rms noise amplitude, there are 1 + S/N significant amplitude levels, and the quantity of information available is H = 2BT log(1 + C/N), where B is the bandwidth of the transmission link, T the time of transmission, and C/N the carrier-to-noise ratio.]

Claude E. Shannon

[Scan of C. E. Shannon, “Communication in the Presence of Noise,” Proceedings of the I.R.E., 1949. Theorem 2: C = W log2((P + N)/N); it is not possible by any encoding method to send at a higher rate and have an arbitrarily low frequency of errors. The footnote reads: “Original manuscript received by the Institute, July 23, 1940. Presented, 1948 IRE National Convention, New York, N.Y., March 24, 1948; and IRE New York Section, New York, N.Y., November 12, 1947.” Hence the puzzling “July 1940” date.]

Whose formula?

The “Shannon-Hartley” formula

    C = (1/2) log2(1 + P/N)

would actually be the Shannon-Tuller-Wiener-Sullivan-Laplume-Earp-Clavier-Goldman formula, or simply the Shannon-Tuller formula.


“Hartley”’s argument

The channel input X takes M = 1 + A/∆ equiprobable values in the set {−A, −A + 2∆, ..., A − 2∆, A}:

    P = E(X²) = (∆²/M) Σ_{k=0}^{M−1} (M − 1 − 2k)² = ∆² (M² − 1)/3.

The input is mixed with additive noise Z of accuracy ±∆, i.e., uniformly distributed in [−∆, ∆]:

    N = E(Z²) = (1/(2∆)) ∫_{−∆}^{+∆} z² dz = ∆²/3.

Hence

    log2(1 + A/∆) = (1/2) log2(1 + (M² − 1)) = (1/2) log2(1 + 3P/∆²) = (1/2) log2(1 + P/N),

i.e., C′ = C. A mathematical coincidence?
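A minimal numerical check of this coincidence (a sketch, not part of the original slides): build the equiprobable alphabet for a given accuracy ∆ and number of levels M, then compare Hartley’s log2(1 + A/∆) with Shannon’s (1/2) log2(1 + P/N) under the uniform-noise model above.

    import math

    def check_coincidence(M: int, delta: float = 1.0) -> None:
        A = (M - 1) * delta                             # amplitude limit
        levels = [-A + 2 * k * delta for k in range(M)]
        P = sum(x * x for x in levels) / M              # E(X^2) = delta^2 (M^2 - 1) / 3
        N = delta ** 2 / 3                              # E(Z^2) for Z uniform on [-delta, delta]
        C_hartley = math.log2(1 + A / delta)
        C_shannon = 0.5 * math.log2(1 + P / N)
        print(f"M={M:2d}:  C'={C_hartley:.6f}  C={C_shannon:.6f}")

    for M in (2, 4, 8, 16):
        check_coincidence(M)
    # The two columns agree exactly for every M: C' = C.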


The uniform channel

The capacity of Y = X + Z with additive uniform noise Z is

    max_{X : |X| ≤ A} I(X; Y) = max_X h(Y) − h(Y|X)
                              = max_X h(Y) − h(Z)
                              = max_{X : |Y| ≤ A+∆} h(Y) − log2(2∆).

Choose X* to be discrete uniform in {−A, −A + 2∆, ..., A}; then Y = X* + Z has uniform density over [−A − ∆, A + ∆], which maximizes the differential entropy:

    = log2(2(A + ∆)) − log2(2∆) = log2(1 + A/∆).
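A quick Monte Carlo sanity check of the key step (a sketch under the slide’s assumptions): with X* discrete uniform on {−A, −A + 2∆, ..., A} and Z uniform on [−∆, ∆], the output Y = X* + Z should have a flat histogram over [−A − ∆, A + ∆].

    import math
    import random

    def output_histogram(M: int = 8, delta: float = 1.0, n: int = 200_000, bins: int = 16) -> None:
        A = (M - 1) * delta
        levels = [-A + 2 * k * delta for k in range(M)]
        width = 2 * (A + delta)
        counts = [0] * bins
        for _ in range(n):
            y = random.choice(levels) + random.uniform(-delta, delta)
            idx = min(int((y + A + delta) / width * bins), bins - 1)
            counts[idx] += 1
        # If Y is uniform, every normalized bin count is close to 1.
        print([round(c * bins / n, 3) for c in counts])
        print("capacity log2(1 + A/delta) =", math.log2(1 + A / delta))

    output_histogram()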

What is the worst noise?

Thus C′ = log2(1 + A/∆) is correct as the capacity of a communication channel! except that
- the noise is not Gaussian, but uniform;
- the signal limitation is not on the power, but on the amplitude.

Further analogy:
- Shannon used the entropy power inequality to show that under limited power, Gaussian noise is the worst possible noise one can inflict on the channel:

      (1/2) log2(1 + αP/N) ≤ C ≤ (1/2) log2(1 + P/N) + (1/2) log2 α,   where α = N/Ñ ≥ 1 (Ñ being the entropy power of the noise).

- We can show that under limited amplitude, uniform noise is the worst possible noise one can inflict on the channel:

      log2(1 + A/∆) ≤ C′ ≤ log2(1 + A/∆) + log2 α,   where α = ∆/∆̃ ≥ 1.
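For concreteness, a hedged numerical sketch of both sandwich bounds (not from the slides). It assumes Ñ is the usual entropy power exp(2h(Z))/(2πe) of the noise and, by analogy, that ∆̃ is defined through h(Z) = log2(2∆̃); both are marked as assumptions in the comments.

    import math

    def power_limited_bounds(P: float, N: float, h_nats: float):
        """Bounds on C for a power-limited channel: noise power N, noise entropy h (in nats)."""
        N_tilde = math.exp(2 * h_nats) / (2 * math.pi * math.e)  # assumed: entropy power of the noise
        alpha = N / N_tilde
        low = 0.5 * math.log2(1 + alpha * P / N)
        high = 0.5 * math.log2(1 + P / N) + 0.5 * math.log2(alpha)
        return low, high

    def amplitude_limited_bounds(A: float, D: float, h_nats: float):
        """Bounds on C' for an amplitude-limited channel: noise support [-D, D], noise entropy h (in nats)."""
        D_tilde = math.exp(h_nats) / 2                           # assumed: uniform analogue of entropy power
        alpha = D / D_tilde
        low = math.log2(1 + A / D)
        return low, low + math.log2(alpha)

    # Uniform noise on [-1, 1] under a power limit: N = 1/3, h = ln 2 nats (so alpha = pi*e/6).
    print(power_limited_bounds(P=5.0, N=1 / 3, h_nats=math.log(2)))
    # Triangular noise on [-1, 1] under an amplitude limit: h = 1/2 nat.
    print(amplitude_limited_bounds(A=7.0, D=1.0, h_nats=0.5))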

Conclusion

Why is Shannon’s formula ubiquitous?

- We can explain the coincidence by deriving necessary and sufficient conditions such that C = (1/2) log2(1 + P/N).
- The uniform (Tuller) and Gaussian (Shannon) channels are not the only examples.
- Using B-splines, we can construct a sequence of such additive-noise channels going from the uniform channel to the Gaussian channel.

“On Shannon’s formula and Hartley’s rule: Beyond the mathematical coincidence,” Entropy, Vol. 16, No. 9, pp. 4892–4910, Sept. 2014. http://www.mdpi.com/1099-4300/16/9/4892/

Thank you!


A characterization of C = (1/2) log2(1 + P/N)

There exists α > 1 such that the ratio of characteristic functions

    Φ_Z(αω) / Φ_Z(ω)

is itself the characteristic function of a random variable X*, which attains capacity under an average cost per channel use E{b(X)} ≤ C, where

    b(x) = E{ log2 [ α p_Z(Z) / p_Z((x + Z)/α) ] }.
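A small numerical illustration for the uniform case (a sketch; the choice α = M = 1 + A/∆ is an assumption suggested by the uniform-channel slide, not something stated here): for Z uniform on [−∆, ∆], the ratio Φ_Z(αω)/Φ_Z(ω) matches the characteristic function of the capacity-achieving discrete uniform input X*.

    import cmath
    import math

    delta = 1.0
    M = 8                                    # assumed alpha = M = 1 + A/delta
    A = (M - 1) * delta
    levels = [-A + 2 * k * delta for k in range(M)]

    def phi_Z(w: float) -> float:
        """Characteristic function of Z uniform on [-delta, delta]."""
        return 1.0 if w == 0 else math.sin(w * delta) / (w * delta)

    def phi_X_star(w: float) -> complex:
        """Characteristic function of the discrete uniform input X*."""
        return sum(cmath.exp(1j * w * x) for x in levels) / M

    for w in (0.1, 0.37, 0.9, 2.3):
        ratio = phi_Z(M * w) / phi_Z(w)
        print(f"w={w}: ratio={ratio:+.6f}   Phi_X*({w})={phi_X_star(w).real:+.6f}")
    # The two columns coincide, so the ratio is indeed a characteristic function here.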

B-splines channels

Figure 1. Discrete plots of the input probability distributions (of X_d) that attain capacity for M = 15 and different values of d: (a) d = 0 (rectangular), (b) d = 1 (triangular), (c) d = 2, (d) d = 3. As d → +∞, with the pdf of Z_d obtained by successive convolutions of rectangles of length 2∆ and a suitable normalization of the distributions, the sequence of channels converges to the Gaussian channel.