Paul Penfield, Jr., D. C. Jackson Professor of Electrical Engineering

Arcane
Also one of the most profound and mysterious
What is this thing called "entropy?"
Does the Second Law really tell us which way clocks run?
Deep related concepts
Complexity
Randomness
Fluctuations
Dissipation
Order
The public's imagination is captured
At least that is what we believed as graduate students
It is too complex to be taught
Thermodynamics really is hard to understand.
Many concepts: heat, work, temperature, ...
You have to care about things like ideal gases.
Most engineers don't need to know thermodynamics.
Most don't care.
Pity
All scientists, all engineers, indeed all educated people need to understand that some operations are reversible and some are not.
They need to be able to tell them apart.
They need to know there is a way to quantify reversibility.
They need a "monotonic model"
To complement models of conserved quantities
Both are helpful for understanding the world
Both are prototypes for other quantities
But that is where they have traditionally been taught.
Perspective
Entropy need not be arcane (at least at one level)
Entropy is less complex
There are plenty of reversible and irreversible operations
The Second Law, or something much like it, exists
Monotonic models can be taught more easily
This is the secret of making complexity simple
* Georges Clemenceau (1841-1929)
French Premier, 1906-1909 and 1917-1920
Information
Shannon believed this was not a fundamental identity
As far as I know Jaynes agreed
The question is, can we convert from one to the other?
Vastly different scales
They are the same concept
Information is physical
Entropy is subjective
Both are relative
One can be traded for the other (with experimental difficulty)
We should think about entropy conversion systems
The ideas are simple, the applications are arcane
Unifying the concepts simplifies them and lets freshmen in
Entropy is one kind of information: information we don't have
In computation and communications
They also give meaning to causality and the direction of time
This is the Second Law in this context
Use principle of maximum entropy
Voila, thermodynamics!
Temperature is energy per bit of entropy (sort of)
Second Law in traditional setting
Carnot efficiency
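The "energy per bit" reading of temperature can be made concrete through the Landauer limit: erasing one bit of entropy at temperature T dissipates at least kT ln 2 of energy. A minimal sketch (the Boltzmann constant is the CODATA value; the choice of T = 300 K is purely illustrative):

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value)
K_B = 1.380649e-23

def landauer_limit(temperature_kelvin: float) -> float:
    """Minimum energy in joules to erase one bit of entropy at temperature T."""
    return K_B * temperature_kelvin * math.log(2)

# At room temperature (~300 K) erasing a bit costs roughly 3e-21 J.
room = landauer_limit(300.0)
```

The tiny size of this number is the "vastly different scales" point: everyday energies correspond to astronomical numbers of bits of entropy.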
Course
12 weeks (Spring 2001 outline):
1. Bits and Codes (perhaps next year qubits)
2. Compression
3. Errors
4. Probability
5. Communications
6. Processes
7. Inference
8. Entropy
9. Physical Systems
10. Energy
11. Temperature
12. Quantum Information
Introduction to Computing
Introductory Communications
Thermodynamics 101
Restoring logic
Digital abstraction
Signals and streams
Boolean algebra
Codes
Bytes
Symbols
Fixed-length codes
ASCII
Genetic code
Binary code, Gray code
Variable-length codes
Morse code
Telegraph codebooks
Helps with low channel capacity
Codebooks
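Among the fixed-length codes listed above, the Gray code has the property that successive integers differ in exactly one bit. A minimal sketch of the standard binary-reflected conversion in both directions:

```python
def binary_to_gray(n: int) -> int:
    """Convert a non-negative integer to its binary-reflected Gray code."""
    return n ^ (n >> 1)

def gray_to_binary(g: int) -> int:
    """Invert the Gray code by repeatedly folding the high bits back down."""
    n = g
    g >>= 1
    while g:
        n ^= g
        g >>= 1
    return n
```

The single-bit-change property is what makes Gray codes useful in rotary encoders and Karnaugh maps, where a transient misread of one bit stays off by at most one.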
Irreversible: fidelity requirement
JPEG
MP3
Reversible
Run length encoding
LZW, GIF
The LZW patent issue
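Run-length encoding, listed above, is the simplest reversible compression scheme: collapse runs of repeated symbols into (symbol, count) pairs. A minimal sketch over strings (the pair representation is an illustrative choice, not a standard format):

```python
def rle_encode(s: str) -> list[tuple[str, int]]:
    """Collapse runs of repeated characters into (character, count) pairs."""
    out: list[tuple[str, int]] = []
    for ch in s:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

def rle_decode(pairs: list[tuple[str, int]]) -> str:
    """Expand (character, count) pairs back to the original string."""
    return "".join(ch * count for ch, count in pairs)
```

Decoding recovers the input exactly, which is what "reversible" means here; JPEG and MP3 above give up exact recovery in exchange for better compression.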
Physical sources of noise
Detection
Parity
Correction
Triple redundancy
Hamming codes
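Triple redundancy, listed above, is the simplest correcting code: send each bit three times and take a majority vote at the receiver, which corrects any single flipped bit per triple. A minimal sketch:

```python
def triple_encode(bits: list[int]) -> list[int]:
    """Repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def triple_decode(coded: list[int]) -> list[int]:
    """Majority-vote each group of three, correcting one error per group."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]
```

The cost is a rate of 1/3; Hamming codes correct single errors far more efficiently (e.g. 4 data bits in 7 transmitted), which is why they come next in the outline.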
Racing odds
Random sources
coins, dice, cards
Probabilities are relative and subjective
Information can be quantified
Model with source, coder, channel, decoder, receiver
Source coding theorem
Huffman codes
Channel capacity
Lossless
Noisy
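The source coding theorem says no lossless code can do better, on average, than the source entropy H = -Σ p_i log2 p_i bits per symbol. A minimal sketch of the entropy computation (the example distributions are illustrative):

```python
import math

def entropy_bits(probabilities: list[float]) -> float:
    """Shannon entropy in bits: -sum p*log2(p), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit per toss; a biased coin carries less,
# which is exactly the slack a Huffman code tries to exploit.
fair = entropy_bits([0.5, 0.5])     # 1.0
biased = entropy_bits([0.9, 0.1])   # about 0.469
```

Huffman codes approach this bound with integer-length codewords; blocks of symbols close the remaining gap.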
Client-server model
TCP and IP
Strategies for recovery from lost packets
Discrete memoryless channel
Noise, loss
M = I_{IN} - L
= I_{OUT} - N
Cascade inequalities in L, N, and M
L_{1} <= L
L_{1} + L_{2} - N_{1} <= L <= L_{1} + L_{2}
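For a binary symmetric channel with crossover probability eps and equally likely inputs, the noise is N = H(eps), the output is also equiprobable so I_OUT = 1 bit, and the identity above gives M = I_OUT - N = 1 - H(eps). A minimal numerical check (eps = 0.1 is an illustrative choice):

```python
import math

def h2(p: float) -> float:
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_information(eps: float) -> float:
    """M = I_OUT - N for a binary symmetric channel with uniform input."""
    i_out = 1.0      # equiprobable output: 1 bit per use
    noise = h2(eps)  # N = H(eps)
    return i_out - noise

m = bsc_mutual_information(0.1)  # about 0.531 bits per use
```

With uniform input the loss L equals H(eps) as well, so M = I_IN - L yields the same number, consistent with the two forms of the identity.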
Given received signal, what is input
How much information have we learned?
Entropy is information we do not have
It is relative and subjective
Input probabilities consistent with constraints
Minimum bias means maximum entropy
Lagrange multipliers
Simple examples with analytic solution
Examples with one constraint, many states
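With a single expected-energy constraint, the maximum-entropy distribution is p_i proportional to exp(-beta * E_i), with the Lagrange multiplier beta chosen so the expected energy matches the constraint. A minimal sketch that solves for beta by bisection (the three-state energies and the target value are illustrative):

```python
import math

def boltzmann(energies: list[float], beta: float) -> list[float]:
    """Maximum-entropy distribution: p_i proportional to exp(-beta * E_i)."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function
    return [w / z for w in weights]

def expected_energy(energies: list[float], beta: float) -> float:
    p = boltzmann(energies, beta)
    return sum(pi * e for pi, e in zip(p, energies))

def solve_beta(energies: list[float], target: float,
               lo: float = -50.0, hi: float = 50.0) -> float:
    """Bisect on beta: expected energy decreases monotonically in beta."""
    for _ in range(200):
        mid = (lo + hi) / 2
        if expected_energy(energies, mid) > target:
            lo = mid  # larger beta shifts weight to low-energy states
        else:
            hi = mid
    return (lo + hi) / 2

# Three states with energies 0, 1, 2; constrain the expected energy to 0.5.
beta = solve_beta([0.0, 1.0, 2.0], 0.5)
```

This is the Boltzmann distribution of the next section arrived at purely from inference; identifying beta with inverse temperature is what turns the inference problem into thermodynamics.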
Quantum mechanics
Multistate model
Microscopic vs. macroscopic
Energy per state
Expected value of energy
Boltzmann distribution
Equilibrium
One of the Lagrange multipliers is temperature
Heat, work
Carnot efficiency
Qubit
Quantum computation
Maxwell's demon
You can trade one for the other
Difference in scales is being erased by Moore's law
What does that say about physics?
But Lagrange multipliers are not easy
No magic here
All we can do is provide simple ideas to be built on later
We want to be consistent with later learning in specific areas
It's disturbing to have unanswered questions close at hand
Status
Fall 1999, course development
Faculty: Paul Penfield, Seth Lloyd, Sherra Kerns
Spring 2000, pilot offering
Limited to 50 freshmen
Of course we have a Web site; everybody does
http://www-mtl.mit.edu/Courses/6.095
Spring 2001, second pilot offering
The course will be permanent starting in Spring 2002
We will help other universities start similar courses
We will advocate it as a science exposure in the liberal arts