292 Current Trends in Bayesian Methodology with Applications
14.2.2 Non-Adaptive SPS algorithm
We start from the SMC algorithm as detailed in [6]. The algorithm generates and modifies the particles θ_{jn}, with superscripts used for further specificity at various points in the algorithm. To make the notation compact, let J = {1, . . . , J} and N = {1, . . . , N}. The algorithm is an implementation of Bayesian learning, providing simulations from θ | y_{1:t} for t = 1, 2, . . . , T. It processes observations, in order and in successive batches, each batch constituting a cycle of the algorithm.
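The cycle structure described above can be sketched in code. The following is an illustrative sketch only, not the algorithm of [6]: it uses a deliberately simple conjugate model (a N(θ, 1) likelihood with a N(0, 1) prior, both assumptions introduced here) so that each cycle — reweighting by the batch likelihood, resampling within each of the J groups, and a Metropolis–Hastings mutation targeting θ | y_{1:t} — fits in a few lines. The particles are stored as a J × N array mirroring the θ_{jn} indexing in the text.

```python
import numpy as np

def smc_batches(y, J=4, N=64, batch_size=5, seed=0):
    """Minimal SMC sketch for Bayesian learning of the mean theta of a
    N(theta, 1) model under a N(0, 1) prior (hypothetical toy model).

    Particles theta[j, n] are held in a J x N array.  Observations are
    processed in order, in successive batches; each batch is one cycle:
    reweight -> resample (within each group j) -> mutate.
    """
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 1.0, size=(J, N))  # initial draws from the prior
    seen = np.empty(0)                          # observations processed so far

    def logpost(th):
        # log prior + log likelihood of all observations seen so far,
        # up to an additive constant; th has shape (J, N)
        return -0.5 * th**2 + np.sum(-0.5 * (seen[:, None, None] - th) ** 2, axis=0)

    for start in range(0, len(y), batch_size):
        batch = y[start:start + batch_size]
        # Correction phase: weight each particle by the batch likelihood.
        logw = np.sum(-0.5 * (batch[:, None, None] - theta) ** 2, axis=0)
        w = np.exp(logw - logw.max(axis=1, keepdims=True))
        w /= w.sum(axis=1, keepdims=True)
        # Selection phase: multinomial resampling, independently in each group.
        for j in range(J):
            idx = rng.choice(N, size=N, p=w[j])
            theta[j] = theta[j][idx]
        seen = np.concatenate([seen, batch])
        # Mutation phase: a few Metropolis-Hastings steps targeting
        # the posterior theta | y_1:t (symmetric random-walk proposal).
        for _ in range(3):
            prop = theta + rng.normal(0.0, 0.5, size=theta.shape)
            accept = (np.log(rng.uniform(size=theta.shape))
                      < logpost(prop) - logpost(theta))
            theta = np.where(accept, prop, theta)
    return theta
```

After the final cycle, the J × N particles approximate draws from θ | y_{1:T}. Keeping the particles in J independent groups is what allows the across-group variation to be used later for assessing numerical accuracy.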
The global structure of the algorithm is therefore iterative, proceeding
through the sample. But it operates on many particles in exactly the same
way at