A Concise Introduction to Programming in Python
by Mark J. Johnson
December 2011
Beginner
217 pages
8h
English
Chapman and Hall/CRC
Content preview from A Concise Introduction to Programming in Python
Project: Shannon Entropy
In 1948, Claude Shannon founded the field of information theory with his
paper, “A Mathematical Theory of Communication” [5]. In it, he defined the
entropy H of a message as
H = -\sum_i p_i \log_2(p_i)
where p_i is the probability of the ith character occurring in the message. This probability is easy to calculate if we count the number of times each character appears in the message:

p_i = \frac{\text{number of times the ith character appears}}{\text{length of the message}}
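The probability calculation above can be sketched directly in Python (the function name `probabilities` is illustrative, not from the book):

```python
from collections import Counter

def probabilities(message):
    """Return a dict mapping each character in message to its probability."""
    counts = Counter(message)   # number of times each character appears
    total = len(message)        # length of the message
    return {ch: n / total for ch, n in counts.items()}
```

For example, `probabilities("aab")` maps `"a"` to 2/3 and `"b"` to 1/3, and the values always sum to 1.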
When entropy H is calculated as above, using log base two, it measures the average number of bits per character required to communicate the given message.
Example: Low Entropy
Intuitively, in a string ...
ISBN: 9781439896952