Clearly from (A3), each of the probabilities is equal to the same constant, and we must have that the sum of the probabilities is one, i.e.,

\sum_{i=1}^{K} p_i = 1 = K e^{\lambda/C - 1} \quad (A4)
(A4) can be solved for \lambda to yield

\lambda = C\left[1 + \ln\left(\frac{1}{K}\right)\right] \quad (A5)
The substitution of (A5) into (A3) yields the choice of probabilities that maximizes (A1):

p_i = e^{C\left[1 + \ln(1/K)\right]/C - 1} = e^{\ln(1/K)} = \frac{1}{K}, \quad i = 1, 2, \ldots, K \quad (A6)
The maximum value of the information entropy that results is

-C \sum_{i=1}^{K} p_i \ln(p_i) = -C \sum_{i=1}^{K} \frac{1}{K} \ln\left(\frac{1}{K}\right) = C \ln(K) \quad (A7)
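As a numerical sanity check of (A6) and (A7), the short sketch below (Python, assuming C = 1 and K = 8; both values are illustrative, not from the text) confirms that the uniform distribution p_i = 1/K attains entropy ln(K), and that randomly sampled distributions do not exceed it.

```python
import math
import random

def entropy(p, C=1.0):
    """Information entropy S = -C * sum(p_i * ln(p_i)), as in (A1)/(A7)."""
    return -C * sum(pi * math.log(pi) for pi in p if pi > 0)

K = 8
uniform = [1.0 / K] * K  # the maximizing choice from (A6)
print(entropy(uniform))  # equals ln(8), about 2.0794, matching (A7) with C = 1

# Spot check: no randomly drawn distribution over K outcomes beats the uniform one.
random.seed(0)
for _ in range(1000):
    weights = [random.random() for _ in range(K)]
    total = sum(weights)
    p = [w / total for w in weights]
    assert entropy(p) <= entropy(uniform) + 1e-12
```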
LIST OF SYMBOLS

Symbol   Meaning
A        Area
b        Computational base, logarithm base
C        Constant
E        Energy
f        Frequency
F        Bit rate
I        Information
k_B      Boltzmann constant, k_B = 1.38 × 10^-23 J/K
K, L, N  Integer numbers
l        Length
m        Mass
M        Molar mass
N_A      Avogadro's number, N_A = 6.022 × 10^23 mol^-1
p        Probability
P        Power
Q        Heat flux
r, R     Radius
S        Entropy
182 CHAPTER 6 Micron-sized systems: In carbo vs. in silico