Department of Electrical Engineering and Computer Science
Department of Mechanical Engineering
6.050J/2.110J – Information, Entropy and Computation
Frequently Asked Questions
1. Are these questions really frequently asked?
Of course not. Most have never been asked in exactly this form.
However, questions and answers are a good way to explain things.
2. What is this course all about?
Information and entropy.
Information is a measure of the amount of knowledge; it is not the
knowledge itself. In this sense information is like energy: knowing the
amount of energy does not tell you anything about its nature, location, or
form. The same is true of information. Entropy is one kind of information:
it is the information you do not have about a situation.
3. The information I don’t have? Is this different from the
information you don’t have?
Certainly. Information is subjective, or observer-dependent. You know
things that I don’t. This, after all, is why communication is useful.
4. So is entropy also a subjective quantity?
Strictly speaking, yes. However, in physical situations the difference in
entropy as perceived by two observers is usually negligibly small.
5. If information is a quantitative measure, what are its units?
Information is measured in bits. One bit is the information needed to
distinguish between two equally likely possibilities; more bits means more
information.
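To see where the unit comes from, here is a minimal sketch in Python (the
function name is purely illustrative, not from the course notes): picking
out one of N equally likely possibilities takes log2(N) bits.

```python
import math

def bits_needed(num_outcomes: int) -> float:
    """Bits of information needed to single out one of num_outcomes
    equally likely possibilities: log2(num_outcomes)."""
    return math.log2(num_outcomes)

print(bits_needed(2))    # one fair coin flip: 1.0 bit
print(bits_needed(52))   # one card drawn from a full deck: about 5.7 bits
```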
6. Is entropy also measured in bits?
Yes. However, in physical situations the amount of missing information is
enormous. (Think of how many bits you would need to specify the positions
of all the atoms in an object.) It is impractical to work with such large
numbers of bits, so another unit is used: Joules per Kelvin.
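The two units are related by Boltzmann's constant: one bit of missing
information corresponds to k_B ln 2, about 9.6 x 10^-24 Joules per Kelvin.
A minimal Python sketch of the conversion (the function name is just
illustrative):

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, in Joules per Kelvin

def bits_to_joules_per_kelvin(bits: float) -> float:
    """Entropy of `bits` bits expressed in Joules per Kelvin:
    S = k_B * ln(2) * bits."""
    return K_B * math.log(2) * bits

print(bits_to_joules_per_kelvin(1))      # about 9.57e-24 J/K per bit
print(bits_to_joules_per_kelvin(1e23))   # about 0.96 J/K, a macroscopic value
```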
7. Wait a minute. Entropy is physical and information is mathematical.
How can they be the same?
Historically, information storage and transmission required a physical
artifact of some kind (think of newspapers, books, and medieval
manuscripts). So traditionally information has also been physical—most
of the cost of information processing was due to the physical carrier of the
information. Only recently has the cost of storing, moving, and processing
information become so low that we tend to think of information apart from
its physical form, as though it were not subject to physical laws. But this is only
temporary. Eventually, information technology will have to face the limits
imposed by quantum mechanics and thermodynamics, and it will again be
necessary to understand the fundamental physics of information. This time
it will be entropy that is the most important physical concept.
8. You mentioned cost. What about the economics of information and entropy?
When the physical media carrying information were costly, people could buy
and sell information by buying and selling the physical artifact that
carried it. Today costs are different. New economic models and new laws
are in place for commerce in intellectual property, for example music and
entertainment video. These basic models and laws will have to be changed
again when quantum limitations on information processing become important.
If the new laws are designed to be consistent with the underlying physics,
they will be easier to understand and enforce.
9. Why is information so important?
The industrial age was opened up by our ability to manage energy. Today we
are at the beginning of the information age because we are learning how to
manage information just as effectively.
10. Why is entropy so important?
Entropy is one of the most mysterious of all the concepts dealt with by
scientists. The Second Law of Thermodynamics states that the entropy of a
given situation cannot decrease unless there is at least as great an
increase elsewhere. Thus entropy has the unusual property that it is not
conserved, as energy is; the total can never decrease over time. The Second Law of
Thermodynamics is often regarded as one of science’s most glorious
accomplishments. And also one of the most difficult to understand.
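Stated symbolically (a standard textbook form, given here as a sketch
rather than a quotation from the course notes): if a system's entropy
changes by ΔS_system while its surroundings change by ΔS_surroundings, the
Second Law requires

```latex
\Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \geq 0
```

so a decrease in one place must be paid for by an increase at least as
large somewhere else.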
11. If entropy is so difficult, can a freshman really understand it?
Certainly. It’s all in how the topics are approached. It is true
that the concepts involved here are not normally taught to freshmen. This
is a shame, because freshmen, and even high-school students, have the
background to appreciate them if approached from the point of view of
information. Traditionally the Second Law of Thermodynamics is taught as
part of a course on thermodynamics, and a background in physics or chemistry
is needed. Also, the examples used come from thermodynamics. In this
course, the Second Law is treated as an example of information processing in
natural and man-made systems; the examples come from many domains.
12. I am thinking about taking this course. What do I need to know to
start?
You need to understand energy: how it can exist in one place or another,
how it can be transported from here to there, and how it can be stored for
later use. You need to know how to deal with a conserved quantity like energy
mathematically, and to appreciate that if energy increases in one region, it
must decrease elsewhere. More specifically, the prerequisite for this
course is the MIT first semester freshman physics subject 8.01 (or 8.011,
8.012, or 8.01L).
13. Will I learn information theory in this course?
Not really. Information theory deals with information processing systems,
including communication systems, and how they can be made more efficient.
There is great emphasis on the design of effective, efficient codes. In
this course only very simple models of such systems are used, with just
barely enough complexity to illustrate the basic ideas.
14. Will I learn thermodynamics in this course?
Not really. Thermodynamics includes the application of basic principles
such as the Second Law to physical systems. The properties of particular
materials are important, particularly those related to energy, heat and
temperature in practical settings. In this course extremely simple models
of materials are used, with just barely enough complexity to reveal the
limitations imposed by the Second Law.
15. Will I learn computer science in this course?
Not really. Computer science deals with information processing, including
computers, their hardware, software, and data organization. There is a lot
of detail about architecture, programming, and advanced applications. In
this course extremely simple computation models are used with just barely
enough complexity to illustrate the limitations imposed by quantum mechanics.
16. Will I learn quantum mechanics in this course?
Not really. Quantum mechanics, typically learned by upperclass physics
majors, deals with the laws of nature for extremely small particles, such as
atoms, molecules, and subatomic particles. Quantum principles are also
applied to information processing systems. In this course only extremely
simple models of quantum systems are used, with just barely enough
complexity to demonstrate some of the fascinating properties of quantum
information systems.
17. So will I learn anything at all in this course?
Yes. You will learn about this quantity, entropy, that describes
fundamental limits on what any of these systems can do. Then when you study
any of them in more detail, you will understand those constraints, and can
appreciate similar ones for other types of systems.
18. What other types of systems?
All physical systems, both those that process energy and those that process
data, obey the Second Law and as a result are fundamentally constrained one
way or another.
19. Why aren’t information and entropy normally thought of together?
Most scientists now recognize that information can be exchanged for entropy
and vice versa, but many don’t consider that fact important. The
reason is that in typical physical situations the number of bits of entropy
is far larger than the number of bits of information that even the largest
information processing systems can deal with. In other words, the scales
involved are vastly different. This is because there are a large number of
atoms in physical systems of everyday size.
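To make the difference in scale concrete, here is a rough
back-of-the-envelope sketch in Python; the specific figures (a mole of
atoms, a one-terabyte drive) are illustrative assumptions rather than
numbers from the course.

```python
AVOGADRO = 6.022e23   # atoms in one mole of material

# Illustrative, order-of-magnitude assumptions only:
entropy_bits = AVOGADRO   # roughly a bit or more of missing information
                          # per atom in a mole-sized object
storage_bits = 8e12       # a one-terabyte drive holds about 8e12 bits

print(f"entropy / storage ratio: {entropy_bits / storage_bits:.1e}")  # ~7.5e10
```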
20. So why treat them together now?
For two reasons. First, the underlying principles are the same so you only
have to learn them once, and then you can appreciate them in many contexts.
And second, modern technology is continuously increasing the amount of
information that computers and communication systems can deal with. In
other words, each year microelectronic systems can control more bits with
fewer atoms than the year before. As the number of atoms per bit comes down,
the difference in scale between information in computers and communications
systems and entropy in the corresponding physical systems shrinks.
Pretty soon it will be possible to make devices that cannot be understood
without considering the interplay between the information stored in the
device and the entropy of its physical configuration. A growing technical
field, known as Quantum Information Processing or Quantum Computing, deals
with this.
21. Can I learn this stuff without taking the course?
Most students find that taking the course is helpful because it motivates
them to stick with it. However, if that is not possible for you, you can
still read the notes, do the problem sets, and even take the quizzes and
check your answers. Material from several prior offerings since Spring 2003
is available on the course Web site,
http://www-mtl.mit.edu/Courses/6.050.
22. Can I do this without being at MIT?
Of course. The Web site is accessible anywhere, any time, by anyone,
without charge.
23. Is there any other material available?
Yes. Course material, including some video lectures, is on the
MIT OCW (OpenCourseWare) site.
24. Is the course itself delivered online?
Not yet. However, there is some interest in offering the course as a MOOC
(Massive Open Online Course) through
MITx and/or
edX. Maybe some day.
25. After I take this course, can I get a summer job?
Certainly, but probably not because of anything you learn here.