4 Information

In the previous chapter, the foundations of probability theory were presented as a way to characterize uncertainty in systems or processes. However, we have not yet provided any direct quantification of the uncertainty associated with the random variables or processes defined by the different probability distributions. This chapter follows Adami's interpretation of Shannon's seminal work [1], summarized in an accessible manner in [2], where the entropy function measures the uncertainty relative to a given observation protocol and experiment. From this, the informative value of events – or simply information – can be mathematically defined. After [1], this particular branch of probability theory, applied to establish the fundamental limits of engineered communication systems, emerged as an autonomous research field called Information Theory.
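To make the notion of entropy as a measure of uncertainty concrete, the following is a minimal sketch (not from the book) of Shannon's entropy of a discrete distribution, H(X) = -Σ pᵢ log₂ pᵢ, measured in bits; the function name and example distributions are illustrative choices, not part of the source text.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Terms with zero probability contribute nothing, by the
    convention 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain among binary sources: 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased coin is more predictable, hence lower entropy.
print(shannon_entropy([0.9, 0.1]))

# A certain outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))  # 0.0 (by convention, -0.0 == 0.0)
```

The fair coin attains the binary maximum of one bit, while the biased coin's entropy falls below it, illustrating that entropy quantifies how unpredictable an observation is before it is made.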

Our objective here is not to review this theory but rather to present its most basic concepts, which have opened up new research paths from philosophy to biology, from technology to physics [3]. To this end, we will first establish what can be called information in our theoretical construction for cyber-physical systems, eliminating possible misuses or misinterpretations of the term, and then present a useful typology for information. This chapter is an attempt to move beyond Shannon by explicitly incorporating semantics [3, 4], but still in a generalized way as indicated by [2, 57].

4.1 Introduction

Information is a term ...
