The DRNN consists of an input layer, a hidden layer, a self-feedback layer, and an output layer, so each layer has its own weight iteration formula.

4.3.2.1 Weight iteration formula of the output layer

The connection weight between the output layer and the hidden layer is $w_{jo}(n)$, so

$$w_{jo}(n+1)=w_{jo}(n)-2\mu_1\tilde{x}(n)\left[\,|\tilde{x}(n)|^2-R^2\,\right]\frac{\partial \tilde{x}(n)}{\partial w_{jo}(n)}\tag{4.30}$$

$$\frac{\partial \tilde{x}(n)}{\partial w_{jo}(n)}=f_2'\!\left(\sum_{j=1}^{J}w_{jo}(n)v_j^J(n)\right)v_j^J(n)\tag{4.31}$$

$$w_{jo}(n+1)=w_{jo}(n)-2\mu_1\tilde{x}(n)\left[\,|\tilde{x}(n)|^2-R^2\,\right]f_2'\!\left(\sum_{j=1}^{J}w_{jo}(n)v_j^J(n)\right)v_j^J(n)\tag{4.32}$$

where $\mu_1$ is the weight iteration step size of the output layer.
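As a minimal sketch of how one such iteration could look in code, the update of eq. (4.32) is shown below for a real-valued signal, with a tanh chosen as a stand-in for the output activation $f_2$; the function names, the choice of activation, and the example values are all illustrative assumptions, not taken from the text.

```python
import numpy as np

def f2(s):
    # Assumed output-layer activation (the text only calls it f2).
    return np.tanh(s)

def f2_prime(s):
    # Derivative of tanh: 1 - tanh(s)^2.
    return 1.0 - np.tanh(s) ** 2

def update_output_weights(w, v, R2, mu1):
    """One iteration of eq. (4.32) for the output-layer weights w_jo(n):
    w(n+1) = w(n) - 2*mu1*x~(n)*[|x~(n)|^2 - R^2]*f2'(sum_j w_j v_j)*v.
    w: output-layer weight vector w_jo(n); v: hidden-layer outputs v_j^J(n).
    """
    s = np.dot(w, v)                # net input to the output neuron
    x = f2(s)                       # equalizer output x~(n)
    grad = 2.0 * mu1 * x * (abs(x) ** 2 - R2) * f2_prime(s) * v
    return w - grad, x

# Hypothetical example values
rng = np.random.default_rng(0)
w = 0.1 * rng.normal(size=4)
v = rng.normal(size=4)
w_new, x = update_output_weights(w, v, R2=1.0, mu1=0.01)
```

Note that the whole bracketed factor vanishes when $|\tilde{x}(n)|^2 = R^2$, so the weights stop moving once the output modulus matches the dispersion constant, which is the usual constant-modulus stopping behavior.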

4.3.2.2 Self-feedback weight iteration formula of the hidden layer

The self-feedback ...
