18 1 Information Theory
For example, if x = [0,0,1,1], the codeword is
y = [0,0,1,1,0,1,0] + [0,0,0,1,1,0,1] = [0,0,1,0,1,1,1].
All codewords for this code are

[0,0,0,0,0,0,0], [1,1,0,1,0,0,0], [0,1,1,0,1,0,0], [1,0,1,1,1,0,0],
[0,0,1,1,0,1,0], [1,1,1,0,0,1,0], [0,1,0,1,1,1,0], [1,0,0,0,1,1,0],
[0,0,0,1,1,0,1], [1,1,0,0,1,0,1], [0,1,1,1,0,0,1], [1,0,1,0,0,0,1],
[0,0,1,0,1,1,1], [1,1,1,1,1,1,1], [0,1,0,0,0,1,1], [1,0,0,1,0,1,1]. (1.60)
It can be verified that any two distinct codewords differ in at least three bit positions; the smallest such number of differing bits is the minimum Hamming distance of the code. Because its minimum Hamming distance is 3, the Hamming (7,4) code can detect up to two bit errors and correct up to one bit error.
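These properties can be checked directly by enumerating all 16 codewords and computing pairwise distances. The sketch below assumes a generator matrix G whose rows are the codewords of the four unit message vectors, inferred from the codeword list in (1.60); it is illustrative, not the book's notation.

```python
from itertools import product, combinations

# Generator matrix inferred from the codeword list (1.60): an assumption
# that row i of G is the codeword of the i-th unit message vector.
G = [
    [1, 1, 0, 1, 0, 0, 0],
    [0, 1, 1, 0, 1, 0, 0],
    [0, 0, 1, 1, 0, 1, 0],
    [0, 0, 0, 1, 1, 0, 1],
]

def encode(x):
    """Encode a 4-bit message x as y = x G, with arithmetic over GF(2)."""
    return tuple(sum(xi * gij for xi, gij in zip(x, col)) % 2
                 for col in zip(*G))

# Enumerate all 2^4 = 16 codewords of the (7,4) code.
codewords = [encode(x) for x in product([0, 1], repeat=4)]

# Minimum Hamming distance over all distinct codeword pairs.
d_min = min(sum(a != b for a, b in zip(u, v))
            for u, v in combinations(codewords, 2))

print(encode([0, 0, 1, 1]))        # the worked example: (0, 0, 1, 0, 1, 1, 1)
print(len(set(codewords)), d_min)  # 16 distinct codewords, minimum distance 3
```

Running this confirms the example codeword for x = [0,0,1,1] and that the minimum pairwise distance is 3, matching the one-error-correcting claim.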
Every (n, k) linear block code has an associated (n − k) × n matrix