Trellis-based search of the maximum a posteriori sequence using particle filtering
Citations
On Approximate Maximum-Likelihood Methods for Blind Identification: How to Cope With the Curse of Dimensionality
Approximate Maximum Likelihood Methods for Large Scale Blind Classification and Identification in Digital Communications
Approximate maximum-likelihood methods for blind classification and identification in digital communications
A filtering approach for model selection
References
Novel approach to nonlinear/non-Gaussian Bayesian state estimation
Error bounds for convolutional codes and an asymptotically optimum decoding algorithm
Sequential Monte Carlo methods in practice
The Viterbi algorithm
On sequential Monte Carlo sampling methods for Bayesian filtering
Frequently Asked Questions (14)
Q2. What is the purpose of this paper?
In this paper, the authors propose to apply the M algorithm [5] and the T algorithm [6] in order to reduce the computational complexity of the VA built on the particle states.
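The two prunings can be sketched as follows. This is a minimal illustration, not the paper's implementation: `prune_paths`, the path labels, and the metric values are all hypothetical, and metrics are taken as "higher is better" log-domain scores.

```python
def prune_paths(metrics, M=None, T=None):
    """Prune trellis paths by accumulated log-metric (higher is better).

    M algorithm: keep only the M best paths at each trellis stage.
    T algorithm: discard paths whose metric falls more than T below the best.
    """
    survivors = dict(metrics)
    if T is not None:
        best = max(survivors.values())
        survivors = {p: m for p, m in survivors.items() if m >= best - T}
    if M is not None and len(survivors) > M:
        ranked = sorted(survivors.items(), key=lambda kv: kv[1], reverse=True)
        survivors = dict(ranked[:M])
    return survivors

paths = {"a": -1.2, "b": -3.5, "c": -9.0, "d": -2.0}
print(prune_paths(paths, M=3))    # keeps a, d, b
print(prune_paths(paths, T=2.0))  # keeps a and d (within 2.0 of the best)
```

Both rules shrink the set of survivor paths the VA must extend at the next time step, which is where the complexity saving comes from.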
Q3. What is the metric associated with the path in the particle trellis?
The metric associated with a possible path in the particle trellis from a departure particle p_d at time k−1 to an arrival particle p_a is given by: λ_k^{p_a} = λ_{k−1}^{p_d} + ln p(y_k | x_k^{p_a}) + ln p(x_k^{p_a} | x_{k−1}^{p_d}).
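The recursion above can be written out directly. In this sketch both densities are taken to be scalar Gaussians purely for illustration (an assumption, not something fixed by the paper): y_k ~ N(x_k, obs_var) and x_k ~ N(x_{k−1}, proc_var).

```python
import math

def log_gauss(x, mean, var):
    # log of a scalar Gaussian density N(x; mean, var)
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def path_metric(prev_metric, y_k, x_pa, x_pd, obs_var=1.0, proc_var=1.0):
    """lambda_k^{pa} = lambda_{k-1}^{pd} + ln p(y_k|x_k^{pa}) + ln p(x_k^{pa}|x_{k-1}^{pd}).
    Gaussian observation and transition densities are illustrative assumptions."""
    return (prev_metric
            + log_gauss(y_k, x_pa, obs_var)
            + log_gauss(x_pa, x_pd, proc_var))
```

Working in the log domain turns the products of densities along a path into sums, which is what makes a Viterbi-style recursion over the particle trellis possible.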
Q4. What is the importance function of the SIS algorithm?
After a few iterations of the algorithm, only one particle has a normalized weight almost equal to 1, while all the other weights are very close to zero.
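This degeneracy is commonly quantified with the effective sample size, ESS = 1 / Σ_i (w̃_i)². The function below is an illustrative sketch, not taken from the paper:

```python
def effective_sample_size(weights):
    # weights are normalized; ESS drops toward 1 as a single
    # particle's weight dominates (the degeneracy described above)
    return 1.0 / sum(w * w for w in weights)

balanced = [0.25, 0.25, 0.25, 0.25]
degenerate = [0.97, 0.01, 0.01, 0.01]
print(effective_sample_size(balanced))    # 4.0 -- every particle contributes
print(effective_sample_size(degenerate))  # close to 1 -- degeneracy
```

The resampling step of the SISR algorithm is precisely what counteracts this collapse of the weights.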
Q5. How many particles are there in the VA?
While the computational complexity of the SISR algorithm is proportional to the number N of particles, the computational complexity of the VA is proportional to N^2.
Q6. How can the authors estimate the hidden state?
The estimation of the hidden state can be obtained by the Minimum Mean Square Error (MMSE) method or by the Maximum A Posteriori (MAP) method.
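For a weighted particle set, the two estimators reduce to a weighted mean and a maximum-weight selection. This is a generic particle-based sketch (the function names are mine, and the MAP rule here is the common maximum-weight approximation, not necessarily the paper's trellis-based one):

```python
def mmse_estimate(particles, weights):
    # MMSE: posterior mean, i.e. the weighted average of the particle states
    return sum(w * x for x, w in zip(particles, weights))

def map_estimate(particles, weights):
    # MAP (particle approximation): the particle with the largest
    # normalized weight
    return max(zip(particles, weights), key=lambda pw: pw[1])[0]

xs = [0.0, 1.0, 2.0]
ws = [0.2, 0.5, 0.3]
print(mmse_estimate(xs, ws))  # approximately 1.1
print(map_estimate(xs, ws))   # 1.0
```

The paper's contribution concerns the MAP side: searching for the best *sequence* of particles via a trellis rather than picking the best particle marginally at each time.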
Q7. How many particles can be reduced with the VA?
Applying the T algorithm, the authors can reduce the number of particles by up to nearly 20%, practically without loss of performance with regard to the VA.
Q8. What is the purpose of the paper?
In this paper, the authors consider the filtering problem, i.e., the estimation of the hidden state x_t at a time t from the observations y_{1:t} = {y_1, ..., y_t}.
Q9. What is the standard Markovian state space model?
The standard Markovian state-space model is represented by the following expressions: x_k = f(x_{k−1}, w_k), y_k = h(x_k, v_k), (1) where k ≥ 1 is a discrete time index and w_k and v_k are independent white noises.
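Model (1) can be simulated directly once f, h, and the noise laws are chosen. In this sketch the noises are Gaussian and the linear instance at the bottom is purely illustrative, not taken from the paper:

```python
import random

def simulate(T, f, h, proc_std=1.0, obs_std=1.0, x0=0.0, seed=0):
    """Draw one trajectory of x_k = f(x_{k-1}, w_k), y_k = h(x_k, v_k).
    Gaussian white noises w_k, v_k are an illustrative assumption."""
    rng = random.Random(seed)
    x, xs, ys = x0, [], []
    for _ in range(T):
        x = f(x, rng.gauss(0.0, proc_std))        # state transition
        xs.append(x)
        ys.append(h(x, rng.gauss(0.0, obs_std)))  # noisy observation
    return xs, ys

# a hypothetical linear-Gaussian instance: x_k = 0.9 x_{k-1} + w_k, y_k = x_k + v_k
xs, ys = simulate(50, f=lambda x, w: 0.9 * x + w, h=lambda x, v: x + v)
```

The filtering problem is the inverse task: recovering the hidden trajectory xs from the observed sequence ys alone.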
Q10. What is the general idea of the paper?
In general, the physical phenomenon can be represented by a mathematical model, which describes the time evolution of the unknown quantities, called the hidden state, and their interactions with the observations.
Q11. how many particles are used in the figure?
This figure is obtained by applying a SISR with N = 1000 particles, the importance function π(x_k | x_{0:k−1}, y_{1:k}) = p(x_k | x_{k−1}), and a resampling step performed at each time step (bootstrap filter, [11]).
Q12. What is the purpose of the standard particle filtering?
The aim of standard particle filtering is to approximate recursively in time the posterior distribution p(x_{1:t} | y_{1:t}) with weighted particles: p(x_{1:t} | y_{1:t}) ≈ Σ_{i=1}^{N} w̃_t^i δ(x_t − x_t^i) · · · δ(x_1 − x_1^i), (4) where N is the number of particles, w̃_t^i is the normalized weight associated with particle i, and δ(x_k − x_k^i) denotes the Dirac delta centered at x_k = x_k^i for k = 1, ..., t.
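A bootstrap filter of the kind described above (prior importance function, resampling at every step) can be sketched in a few lines. The linear-Gaussian model x_k = a·x_{k−1} + w_k, y_k = x_k + v_k used here is an illustrative assumption, not the paper's application:

```python
import math
import random

def bootstrap_filter(ys, N=1000, a=0.9, obs_std=1.0, proc_std=1.0, seed=0):
    """Bootstrap filter sketch: importance function p(x_k|x_{k-1}),
    multinomial resampling at each time step, MMSE estimate per step."""
    rng = random.Random(seed)
    particles = [0.0] * N
    estimates = []
    for y in ys:
        # propagate each particle with the prior transition density
        particles = [a * x + rng.gauss(0.0, proc_std) for x in particles]
        # weight by the likelihood p(y_k | x_k) (Gaussian observation noise)
        ws = [math.exp(-0.5 * ((y - x) / obs_std) ** 2) for x in particles]
        total = sum(ws)
        ws = [w / total for w in ws]
        # MMSE estimate from the normalized weights
        estimates.append(sum(w * x for w, x in zip(ws, particles)))
        # resample at every step, as in the bootstrap filter [11]
        particles = rng.choices(particles, weights=ws, k=N)
    return estimates

est = bootstrap_filter([0.5, 0.7, 0.2], N=200)
```

The paper's trellis-based MAP search operates on the same particle set; the difference lies in how the final state sequence is extracted, not in how the particles are generated.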
Q13. How does the VA algorithm reduce the computational complexity?
In this paper, the authors propose to reduce the computational complexity of the VA using the M and T algorithms, while keeping the same performance.
Q14. How can the authors reduce the number of particles using the VA?
The authors conclude that these algorithms enable a reduction of the number of particles by up to 20%, practically without loss of performance with regard to the VA.