Paul Penfield, Jr., Making Complexity Simple, Complexity in Engineering Conference, Cambridge, MA; November 20, 1999.

Making Complexity Simple


Paul Penfield, Jr.

Professor of Electrical Engineering

Department of Electrical Engineering
   and Computer Science
Massachusetts Institute of Technology

Cambridge, MA 02139-4307

(617) 253-2506
penfield@mit.edu
https://mtlsites.mit.edu/users/penfield/


Making Complexity Simple

Outline
Arcane
Pity
Perspective
Information
Course
Status

The Second Law of Thermodynamics

Surely one of science's most glorious accomplishments
 . . Also one of the most profound and mysterious
 . . What is this thing called "entropy"?
 . . Does the Second Law really tell us which way clocks run?
 . . Deep related concepts
 . .  . . Complexity
 . .  . . Randomness
 . .  . . Fluctuations
 . .  . . Dissipation
 . .  . . Order
 . . The public's imagination is captured
Nobody really understands it
 . . At least that is what we believed as graduate students
 . . It is too complex to be taught


Why is entropy considered arcane?

Because people think it is part of thermodynamics
 . . Thermodynamics really is hard to understand.
 . .  . . Many concepts -- heat, work, temperature, ...
 . .  . . You have to care about things like ideal gases.
 . . Most engineers don't need to know thermodynamics.
 . . Most don't care.


This is a shame

Entropy and the Second Law are taught as part of
   thermodynamics, so most people miss them.

... and thereby miss something pretty important
 . . All scientists, all engineers, indeed all educated people need
 . .    to understand that some operations are reversible and
 . .    some are not.
 . . They need to be able to tell them apart.
 . . They need to know there is a way to quantify reversibility.
 . . They need a "monotonic model"
 . .  . . To complement models of conserved quantities
 . .  . . Both are helpful for understanding the world
 . .  . . Both are prototypes for other quantities
Can we satisfy these very real needs somehow?
They have nothing to do with thermodynamics.
 . . It's just that thermodynamics is where they have traditionally
 . .    been taught.


Entropy is useful outside of thermodynamics

Thermodynamics always involves energy
 . . Entropy need not
Outside of thermodynamics, without links to energy
 . . Entropy is less complex
 . . There are plenty of reversible and irreversible operations
 . . The Second Law, or something much like it, exists
 . . Monotonic models can be taught more easily
The more general the context, the simpler the concept
 . . This is the secret of making complexity simple


War is too important to be left to the generals*

And entropy is too important to be left to the physicists

 . . * Georges Clemenceau (1841 - 1929)
 . .  . . French Premier, 1906 - 1909, 1917 - 1920


Information is simpler and more general

Start with information
 . . Entropy is one kind of information.
 . . Entropy is information we do not have
See reversible and irreversible data transformations
 . . In computation and communications
Note that irreversible operations destroy information
 . . This is the Second Law in this context
Apply to a physical system with energy
Use maximum-entropy principle
 . .  . . Voilà, thermodynamics!
 . .  . . Temperature is energy per bit of entropy (sort of; made precise below)
 . . Intensive vs. extensive variables
 . . Second Law in traditional setting
 . . Carnot efficiency
The basic idea of reversibility is not difficult to understand
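
The "energy per bit" claim above can be made precise. A worked version (added here; not part of the original slide): thermodynamics defines temperature by 1/T = dS/dE, and if entropy is counted in bits rather than joules per kelvin, the conversion factor is k_B ln 2, so

    \[
      S_{\text{bits}} = \frac{S}{k_B \ln 2}
      \qquad\Longrightarrow\qquad
      \frac{\mathrm{d}E}{\mathrm{d}S_{\text{bits}}} = k_B T \ln 2
    \]

At T = 300 K this comes to about 2.9 x 10^-21 joules per bit, the minimum energy dissipated when one bit of information is irreversibly erased (Landauer's limit).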


We want to teach this stuff to freshmen

Why is this possible?
 . . Today's students are different
 . . Best to start from the known
 . .  . . Data, disks, Internet, packets, bits, ...
 . .  . . Consistent with the coming information age
 . . Go toward the unknown
 . .  . . Thermodynamics, equilibrium, heat engines, refrigerators
 . .  . . Relevant to the current industrial age
 . . Physical view of information -- like energy, information
 . .  . . can be of many types
 . .  . . can be converted from one form to another
 . .  . . can exist in one place or another
 . .  . . can be sent from here to there
 . .  . . can be stored for later use
 . . There are interesting applications
 . .  . . Biology (genetic code)
 . .  . . Communications
 . .  . . Quantum computing


Information and Entropy

A Freshman Course
 . . 12 weeks:
 . .  . . 1. Bits
 . .  . . 2. Codes
 . .  . . 3. Compression
 . .  . . 4. Errors
 . .  . . 5. Probability
 . .  . . 6. Communications
 . .  . . 7. Processes
 . .  . . 8. Inference
 . .  . . 9. Entropy
 . .  . . 10. Physical systems
 . .  . . 11. Temperature
 . .  . . 12. Myths
This course is NOT
 . . Introduction to Computing
 . . Introductory Communications
 . . Thermodynamics 101


Course material

1. Bits
 . . Restoring logic
 . . Digital abstraction
 . . Signals and streams
 . . Boolean algebra
2. Codes
 . . Bytes
 . . Fixed-length codes
 . .  . . ASCII
 . .  . . Genetic code
 . .  . . Binary code, Gray code (sketch after this slide)
 . . Variable-length codes
 . .  . . Morse code
 . .  . . Telegraph codebooks
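
A concrete taste of the fixed-length codes above: a minimal Python sketch (mine, not the course's) of the Gray code, in which successive values differ in exactly one bit.

    def binary_to_gray(n: int) -> int:
        # Each Gray-code bit is the XOR of two adjacent binary bits.
        return n ^ (n >> 1)

    def gray_to_binary(g: int) -> int:
        # Invert by XOR-folding the higher bits back down.
        n = 0
        while g:
            n ^= g
            g >>= 1
        return n

    for i in range(8):
        print(f"{i:3d}: binary {i:03b} -> Gray {binary_to_gray(i):03b}")
    assert all(gray_to_binary(binary_to_gray(i)) == i for i in range(256))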


Course material (cont)

3. Compression
 . . Helps with low channel capacity
 . . Codebooks
 . . Irreversible -- fidelity requirement
 . .  . . JPEG
 . .  . . MP3
 . . Reversible
 . .  . . Run-length encoding (sketch after this slide)
 . .  . . LZW
 . .  . . The LZW patent issue
4. Errors
 . . Physical sources of noise
 . . Detection -- parity
 . . Correction
 . .  . . Triple redundancy (sketch after this slide)
 . .  . . Hamming code
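
Two of the bullets above lend themselves to tiny demonstrations. A sketch (my own code, not the course's): run-length encoding is reversible because the decoder recovers the input exactly, while triple redundancy corrects any single flipped bit per block by majority vote.

    def rle_encode(s: str) -> list:
        # Collapse each run of identical characters into (char, count).
        runs = []
        for ch in s:
            if runs and runs[-1][0] == ch:
                runs[-1] = (ch, runs[-1][1] + 1)
            else:
                runs.append((ch, 1))
        return runs

    def rle_decode(runs) -> str:
        # Exact inverse of rle_encode: no information is destroyed.
        return "".join(ch * count for ch, count in runs)

    assert rle_decode(rle_encode("aaaabbbcca")) == "aaaabbbcca"

    def decode_triple(coded: str) -> str:
        # Majority vote over each block of three repeated bits.
        return "".join("1" if coded[i:i+3].count("1") >= 2 else "0"
                       for i in range(0, len(coded), 3))

    # "101" was sent as "111000111"; one bit of the last triple flipped:
    assert decode_triple("111000101") == "101"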


Course material (cont)

5. Probability
 . . Racing odds
 . . Random sources
 . .  . . coins, dice, cards
 . . Probabilities are subjective
 . . Information can be quantified
6. Communications
 . . Model with source, coder, channel, decoder, receiver
 . . Huffman codes (sketch after this slide)
 . . Binary symmetric channel
 . .  . . Lossless
 . .  . . Noisy
 . . Client-server model
 . .  . . TCP and IP
 . .  . . Strategies for recovery from lost packets
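
The Huffman-code bullet can be made concrete with a short sketch (my own, with made-up source probabilities): repeatedly merge the two least probable subtrees, prefixing their codewords with 0 and 1.

    import heapq

    def huffman_code(freqs: dict) -> dict:
        # Heap entries are (probability, tiebreaker, {symbol: codeword}).
        heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(sorted(freqs.items()))]
        heapq.heapify(heap)
        tiebreak = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)   # two least probable subtrees
            p2, _, c2 = heapq.heappop(heap)
            merged = {s: "0" + w for s, w in c1.items()}
            merged.update({s: "1" + w for s, w in c2.items()})
            heapq.heappush(heap, (p1 + p2, tiebreak, merged))
            tiebreak += 1
        return heap[0][2]

    codes = huffman_code({"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125})
    print(codes)  # e.g. {'A': '0', 'B': '10', 'C': '110', 'D': '111'}

For these dyadic probabilities the average codeword length, 1.75 bits, exactly equals the source entropy.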


Course material (cont)

7. Processes
 . . Discrete memoryless channel
 . . Noise, loss
 . . M = I_IN - L = I_OUT - N
 . .  . . (M: mutual information; L: loss; N: noise; checked numerically after this slide)
 . . Cascade inequalities in L, N, and M
 . .  . . L1 <= L
 . .  . . L1 + L2 - N1 <= L <= L1 + L2
8. Inference
 . . Given the received signal, what was the input?
 . . How much information have we learned?
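
The identity M = I_IN - L = I_OUT - N can be checked numerically. A sketch (the variable names and the numbers are mine) for a binary symmetric channel with flip probability eps:

    from math import log2

    def H(probs):
        # Shannon entropy in bits; skip zero-probability outcomes.
        return -sum(p * log2(p) for p in probs if p > 0)

    eps, p0 = 0.1, 0.7                     # flip probability, P(X = 0)
    joint = [[(1 - eps) * p0, eps * p0],   # P(X = x, Y = y)
             [eps * (1 - p0), (1 - eps) * (1 - p0)]]
    px = [sum(row) for row in joint]                  # input marginal
    py = [joint[0][i] + joint[1][i] for i in (0, 1)]  # output marginal
    Hxy = H([q for row in joint for q in row])        # joint entropy H(X,Y)

    I_in, I_out = H(px), H(py)
    N = Hxy - H(px)   # noise N = H(Y|X)
    L = Hxy - H(py)   # loss  L = H(X|Y)
    assert abs((I_in - L) - (I_out - N)) < 1e-12
    print(f"M = {I_in - L:.4f} bits")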


Course material (cont)

Discrete Memoryless Channel
 . . [Figure: diagram of a discrete memoryless channel]


Course material (cont)

9. Entropy
 . . Entropy is information we do not have
 . . Input probabilities consistent with constraints
 . . Minimum assumptions, maximum entropy
 . . Lagrange multipliers
 . .  . . Deer hunters
 . .  . . Fishermen
10. Physical systems
 . . Energy per state
 . . Expected value of energy
 . . Boltzmann distribution (numerical sketch after this slide)
 . . Lagrange multipliers are intensive variables
 . . Equilibrium
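
A numerical sketch (my own, with made-up energy levels) of how the Lagrange multiplier of items 9 and 10 works: maximizing entropy subject to a fixed expected energy yields the Boltzmann distribution, and the multiplier beta is found by solving the constraint.

    import math

    def boltzmann(energies, beta):
        # Maximum-entropy distribution at fixed expected energy:
        # p_i proportional to exp(-beta * E_i).
        w = [math.exp(-beta * e) for e in energies]
        Z = sum(w)  # partition function
        return [wi / Z for wi in w]

    def solve_beta(energies, target, lo=-50.0, hi=50.0):
        # <E> decreases monotonically in beta, so bisection works.
        for _ in range(100):
            mid = (lo + hi) / 2
            p = boltzmann(energies, mid)
            avg = sum(pi * e for pi, e in zip(p, energies))
            lo, hi = (lo, mid) if avg < target else (mid, hi)
        return (lo + hi) / 2

    E = [0.0, 1.0, 2.0]        # hypothetical energy levels, arbitrary units
    beta = solve_beta(E, 0.8)  # target expected energy
    print(beta, boltzmann(E, beta))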


Course material (cont)

11. Temperature
 . . One of the Lagrange multipliers is temperature
 . . Heat, work
 . . Carnot efficiency (equation after this slide)
12. Myths
 . . Order out of chaos
 . . Miracle needed
 . . Heat death
 . . Evaporation of black holes
 . . Difficulty of extensions to social science
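
For the Carnot-efficiency bullet, the standard result (a worked equation added here for reference, not an original slide):

    \[
      \eta_{\text{Carnot}} \;=\; \frac{W}{Q_h} \;=\; 1 - \frac{T_c}{T_h}
    \]

Between reservoirs at T_h = 600 K and T_c = 300 K, for example, no engine can convert more than half of the heat it draws into work.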


Can freshmen actually learn this stuff?

We are finding out
 . . Fall 1999, course development
 . .  . . Faculty: Paul Penfield, Seth Lloyd, Sherra Kerns
 . .  . . Students: small set of freshmen serving as guinea pigs
 . . Spring 2000, pilot offering
 . .  . . Limited to 50 freshmen
 . .  . . Of course we have a Web site -- everybody does
 . .  . . https://mtlsites.mit.edu/users/penfield/6.095-s00/
 . . Fall 2000, revisions, note writing
 . . Spring 2001, first full offering
The devil is in the details
 . . E.g., which is the best statistical mechanics model to use?


Some of the subtle points (risks)

Entropy is the information we don't have
 . . Therefore entropy is subjective (some people don't like that)
Math generally simple -- discrete, not continuous processes
 . . But Lagrange multipliers are not easy
Skill in modeling is still important
 . . No magic here
At the freshman level you cannot go very deeply
 . . All we can do is provide simple ideas to be built on later
 . .  . . We want to be consistent with later learning in specific areas
Stay in touch -- we will let you know how it turns out
If we are successful
 . . The course will be permanent
 . . We will help other universities start similar courses
 . . We will advocate it as a science exposure in the liberal arts

