Information is a frequently used word in everyday language, and almost everybody has some notion of what it stands for. Today, as a technical term, information has also entered many branches of science and engineering. It is reasonable to date the beginning of modern information theory to the invention of communication devices and computers. Before we introduce the mathematical theory of information, let us sort out some of the complex and interrelated concepts that this seemingly simple word implies.

For a scientific treatment of information, it is essential that we define a meaningful measure of its amount. Let us start with an idealized but illustrative example. Consider a large number of aeroplanes and cars of various kinds and models, all disassembled into small parts, each placed in a separate box. Also assume that the parts are prepared in such a way that by looking at any one of them, you cannot tell whether it came from an aeroplane or a car. The boxes are stored randomly in a warehouse, with codes printed on them indicating their locations in the warehouse. The parts are also designed so that they can be put together with only a few relatively simple instructions and tools such as screwdrivers, wrenches, etc. Using the parts in the warehouse, in principle, it is now possible for anybody ...
