An introduction to
information theory and
entropy
Tom Carter
/~tom/SFI-CSSS
Complex Systems Summer School
June, 2003
Our general topics: ←
• Complexity
• Some probability background
• Basics of information theory
• Some entropy theory
• The Gibbs inequality
• A simple physical example (gases)
• Shannon’s communication theory
• Application to Biology (analyzing genomes)
• Some other measures
• Some additional material
• Examples using Bayes’ Theorem
• Analog channels
• A Maximum Entropy Principle
• Application: Economics I (a Boltzmann Economy)
• Application: Economics II (a power law)
• Application to Physics (lasers)
• References
The quotes
• Science, wisdom, and counting
• Being different – or random
• Surprise, information, and miracles
• Information (and hope)
• H (or S) for Entropy
• Thermodynamics
• Language, and putting things together
• Tools
To topics ←
Science, wisdom, and
counting
“Science is organized knowledge. Wisdom is
organized life.”
- Immanuel Kant
“My own suspicion is that the universe is not
only stranger than we suppose, but stranger
than we can suppose.”
- John Haldane
“Not everything that can be counted counts,
and not everything that counts can be
counted.”
- Albert Einstein (1879-1955)
“The laws of probability, so true in general,
so fallacious in particular.”
- Edward Gibbon
Complexity ←
• Workers in the field of complexity face a
classic problem: how can we tell that the
system we are looking at is actually a
complex system? (i.e., should we even be
studying this system? :-)
Of course, in practice, we will study the
systems that interest us, for whatever
reasons, so the problem identified above
tends not to be a real problem. On the
other hand, having chosen a system to
study, we might well ask “How complex is
this system?”
In this more general context, we probably
want at least to be able to compare two
systems, and be able to say that system