
8. Entropy Assessment

David Johnston (Hillsboro, OR, USA) and Richard Fant (Austin, TX, USA)

8.1 What Is Entropy?

Entropy is commonly described as a measure of disorder, or as the information capacity of a communication channel. The entropy defined by Alfréd Rényi, called Rényi entropy, is parameterized by a value alpha, which may be any nonnegative real number; each choice of alpha gives a different entropy measure, so there are infinitely many of them. The equation is
$$ H_{\alpha}(X) = H_{\alpha}\left(p_1, \dots, p_n\right) = \frac{1}{1-\alpha}\,\log_2\left(\sum_{i=1}^{n} p_i^{\alpha}\right), \quad \alpha \ge 0,\ \alpha \ne 1 $$

H is the symbol for entropy. X is the symbol for the random variable being measured, and p_1 through p_n are the probabilities of its n possible outcomes.
