Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
Department of Mechanical Engineering
6.095 / 2.995
Information and Entropy
Frequently Asked Questions
- 1. Are these questions really frequently asked?
- Of course not. Most have never been asked in exactly this form.
However, questions and answers are a good way to explain things.
- 2. What is this course all about?
- Information and entropy.
Information is a measure of the amount of knowledge; it is not that knowledge
itself. Thus information is like energy, in the sense that knowing the amount
of energy does not tell you anything about its nature, location, form, etc.
The same is true with information. Entropy is a particular kind of information.
It is the information you do not have about a situation.
- 3. The information I don't have? Is this different from the information
you don't have?
- Certainly. Information is subjective. You know things that I don't. This is,
after all, why communication is useful.
- 4. But is entropy also a subjective quantity?
- Strictly speaking, yes. However, in physical situations the difference of
entropy as perceived by two observers may be negligibly small.
- 5. If information is the measure of a quantity, what are its units?
- Information is measured in bits. More bits means more information.
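The "more bits means more information" idea can be made concrete with a small sketch. The usual convention (not stated explicitly above, but standard) is that learning the outcome of an experiment with N equally likely outcomes conveys log2(N) bits; the function name `bits` here is ours, chosen for illustration.

```python
import math

def bits(n_outcomes):
    """Information, in bits, gained by learning the outcome of an
    experiment with n_outcomes equally likely possibilities."""
    return math.log2(n_outcomes)

print(bits(2))   # fair coin flip: 1.0 bit
print(bits(6))   # fair die roll: about 2.585 bits
print(bits(52))  # one card drawn from a shuffled deck: about 5.7 bits
```

Note that the measure is additive: two independent coin flips have 4 equally likely outcomes, and log2(4) = 2 bits, exactly twice one flip.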
- 6. Is entropy also measured in bits?
- Yes. However, in physical situations there is lots of information that is
not known. (Think about how many bits of information are needed to specify the
position of all atoms in an object.) It is impractical to work with such large
numbers, so a more practical set of units is needed: joules per kelvin.
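The two sets of units are related by a fixed conversion factor: one bit of missing information corresponds to k_B ln(2) joules per kelvin, where k_B is the Boltzmann constant. A minimal sketch of the conversion (the constant and function names are ours):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact, by the 2019 SI definition)

# One bit of missing information corresponds to k_B * ln(2) in J/K.
JK_PER_BIT = K_B * math.log(2)

def bits_to_entropy(n_bits):
    """Convert an amount of missing information in bits to entropy in J/K."""
    return n_bits * JK_PER_BIT

print(JK_PER_BIT)              # about 9.57e-24 J/K per bit
print(bits_to_entropy(1e23))   # a mole-scale number of bits: roughly 1 J/K
```

The tiny size of the conversion factor is the point of the answer above: an entropy of a single joule per kelvin already corresponds to about 10^23 bits.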
- 7. Wait a minute. Entropy is physical and information is mathematical. How
can they be conceptually the same?
- Historically, information storage and transmission required a physical artifact
of some kind. Think of newspapers, books, and medieval manuscripts. Information
was therefore physical -- most of the cost of information processing was due to
the physical carrier of the information. It is only recently that the cost of
storing, moving, or processing information has become so low that we think of
information apart from its physical form, as if it were not even subject to physical
laws. But eventually information technology will have to face the limits imposed by
quantum mechanics and thermodynamics, and it will again be necessary to understand
the fundamental physics of information. This time it will be entropy that
is the relevant physical concept.
- 8. Why is information so important?
- This is the beginning of the information age. Just as the industrial age
was opened up by our ability to manage energy, so the information age is upon
us because we are learning how to manage information effectively.
- 9. Why is entropy so important?
- Entropy is one of the most mysterious of the concepts dealt with by
scientists. The Second Law of Thermodynamics states that the entropy of a
system cannot decrease unless there is an equal or greater increase elsewhere.
Thus entropy has the unusual property that it is not conserved, as energy is, but
monotonically increases over time. The Second Law of Thermodynamics is often
regarded as one of science's most glorious laws.
- 10. About this course, why is it aimed at freshmen?
- The concepts involved here are not normally taught to freshmen. This is a
shame, because freshmen have the background needed to appreciate these concepts when approached
from the point of view of information. Traditionally the Second Law of
Thermodynamics is taught as part of a course on thermodynamics, and a background
in physics or chemistry is needed. Also, the examples used come from thermodynamics.
In this course, the Second Law is treated as an example of information processing
in natural and man-made systems; the examples come from many domains.
- 11. I am thinking about taking this course. What do I need to know to start?
- You need to understand energy: how it can exist in one place or another,
how it can be transported from here to there, and how it can be stored for later use. You
need to know how to deal with a conserved quantity like energy mathematically,
and to appreciate that if energy decreases in one region, it must increase elsewhere.
More specifically, the prerequisite for this course is the first semester freshman
physics subject 8.01 (or 8.012, 8.01L, or 8.01X).
- 12. Is entropy useful outside of thermodynamics?
- All physical systems obey the laws of thermodynamics. The challenge is to
express these laws in simple but general forms so that their significance in, for
example, a biological system becomes apparent. Moreover, laws similar to the Second
Law of Thermodynamics are found in abstract systems governed more by mathematics than
physics -- two examples discussed in the course are computers and communications
systems. In these contexts it is the "information" part of "Information and Entropy"
that is most relevant.
- 13. Why aren't information and entropy normally thought of together?
- Most scientists recognize that information can be exchanged for entropy and
vice versa, but they don't consider that fact important. The reason is that
in typical physical situations the number of bits of entropy is far larger than
the number of bits of information that even the largest information processing
systems can deal with. In other words, the scales involved are vastly
different. This is because there are a large number of atoms in physical systems.
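A rough back-of-the-envelope calculation makes the difference in scale vivid. The sketch below is ours, not the course's: it takes the standard molar entropy of liquid water (about 70 J per mole per kelvin, a textbook value) and compares the entropy of one gram of water, expressed in bits, with the capacity of a one-terabyte storage device.

```python
import math

K_B = 1.380649e-23           # Boltzmann constant, J/K
JK_PER_BIT = K_B * math.log(2)  # entropy per bit of missing information

# Standard molar entropy of liquid water is about 70 J/(mol K);
# one gram is 1/18 of a mole, so roughly 3.9 J/K per gram.
entropy_1g_water = 70.0 / 18.0            # J/K, approximate
bits_1g_water = entropy_1g_water / JK_PER_BIT

bits_1tb_drive = 8e12                     # one terabyte of storage, in bits

print(bits_1g_water)                      # about 4e23 bits
print(bits_1g_water / bits_1tb_drive)     # ratio of about 5e10
```

So the entropy of a single gram of water exceeds the information capacity of a terabyte drive by a factor of tens of billions, which is why most scientists have felt safe ignoring the exchange between the two.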
- 14. But if this is the case, why treat them together now?
- For two reasons. First, the underlying principles are the same so learning
about information in one system can help the understanding of another system. And
second, modern technology is continuously increasing the amount of information that
computers and communication systems can deal with. Another way of saying this is
to observe that modern microelectronic systems control larger numbers of bits with
fewer atoms. As the number of atoms per bit comes down, the difference in
scale between information in computers and communications systems and entropy in
the corresponding physical systems shrinks. Eventually it will be possible to
make devices that cannot be understood without considering the interplay between
the information stored in the device and the entropy of its physical configuration.
- 15. If I take this course, can I get a summer job?
- Certainly, but probably not because of what you learn here.
Author: Paul Penfield, Jr.
Created: Nov 10, 1999
Modified: Nov 21, 2000