Wednesday, March 25, 2009

Confusions with model parameter estimation and sampling techniques

Lately, I have been digging deep into Bayesian parameter estimation and how it works with MCMC sampling techniques. Most tutorials leave a gap in explaining what role MCMC techniques play in parameter estimation, filtering, or prediction, and whether the two can be separated at all, even from a theoretical point of view. With the advent of recursive Bayesian estimation, the thin line that separates the two is getting smudged, if not erased.

Here I am trying to bring it all under one roof. The punch line is this:

"Bayesian Parameter estimation employs sampling and its only one of the steps in estimation.
When we are doing predictive distribution and filtering sampling becomes an important step in calculating the predicted and/or values and their corresponding distribution."
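To make the second half of that concrete, here is a minimal sketch of how posterior samples feed into a predictive distribution. The Beta-Bernoulli coin-flip model and all the numbers are my own toy assumptions, chosen only because the posterior is available in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: coin flips with unknown bias theta (hypothetical example).
data = rng.binomial(1, 0.7, size=50)

# Beta(1, 1) prior + Bernoulli likelihood gives a Beta posterior in closed form.
alpha_post = 1 + data.sum()
beta_post = 1 + len(data) - data.sum()

# Step 1: sample parameter values from the posterior p(theta | data).
theta_samples = rng.beta(alpha_post, beta_post, size=10_000)

# Step 2: push each sample through the likelihood to approximate the
# posterior predictive p(x_new | data) -- here, P(next flip is heads).
x_new_samples = rng.binomial(1, theta_samples)
print("P(next flip is heads | data) ~", x_new_samples.mean())
```

The point is that sampling here is in the service of the predictive distribution: each posterior draw of the parameter is pushed through the likelihood, and the resulting draws approximate the quantity we actually care about.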

Some of the main techniques in parameter estimation are EM, MAP, variational EM, stochastic EM, and particle filters.
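As an illustration of the estimation side, here is a sketch of plain EM for a two-component 1-D Gaussian mixture. The synthetic data, the unit-variance assumption, and the initial values are all my own simplifications, not a general-purpose implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data from a two-component 1-D Gaussian mixture (hypothetical).
data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])

# Assume unit variances; estimate only the means and mixing weights.
mu = np.array([-1.0, 1.0])      # initial means
weights = np.array([0.5, 0.5])  # initial mixing weights

def normal_pdf(x, mean):
    return np.exp(-0.5 * (x - mean) ** 2) / np.sqrt(2 * np.pi)

for _ in range(100):
    # E-step: responsibilities resp[n, k] = p(component k | x_n, current params).
    dens = np.stack([weights[k] * normal_pdf(data, mu[k]) for k in range(2)], axis=1)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate means and weights from the soft assignments.
    nk = resp.sum(axis=0)
    mu = (resp * data[:, None]).sum(axis=0) / nk
    weights = nk / len(data)

print("estimated means:", mu, "estimated weights:", weights)
```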

Sampling techniques include MCMC methods such as Gibbs sampling and Metropolis-Hastings (MH), along with importance sampling and sequential importance sampling.
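Of these, Metropolis-Hastings is perhaps the easiest to write down. Here is a minimal random-walk MH sketch; the target (an unnormalized standard normal) and the step size are arbitrary choices of mine, just to show the accept/reject mechanics:

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    # Unnormalized log density of the target; a standard normal,
    # purely for illustration.
    return -0.5 * x ** 2

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + step * rng.normal()          # symmetric random-walk proposal
        log_accept = log_target(proposal) - log_target(x)
        if np.log(rng.uniform()) < log_accept:      # accept with prob min(1, ratio)
            x = proposal
        samples.append(x)
    return np.array(samples)

draws = metropolis_hastings(20_000)
print("sample mean:", draws.mean(), "sample std:", draws.std())
```

In a real estimation problem the log target would be the log posterior of the parameters given the data, and the resulting draws would play the role of the posterior samples used above.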

Sample -> estimate parameters -> sample -> estimate parameters -> ...

This cycle goes on until the parameter estimates converge to (locally) optimal values.
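Here is a sketch of that sample/estimate cycle in the spirit of stochastic EM, reusing the same toy mixture as above but with hard, sampled component labels instead of soft responsibilities. The details are my own assumptions, not a standard reference implementation:

```python
import numpy as np

rng = np.random.default_rng(3)

data = np.concatenate([rng.normal(-2.0, 1.0, 200), rng.normal(3.0, 1.0, 300)])
mu = np.array([-1.0, 1.0])
weights = np.array([0.5, 0.5])

def normal_pdf(x, mean):
    return np.exp(-0.5 * (x - mean) ** 2) / np.sqrt(2 * np.pi)

for _ in range(200):
    # Sample step: draw a hard component label for each point from its
    # conditional distribution given the current parameters.
    dens = np.stack([weights[k] * normal_pdf(data, mu[k]) for k in range(2)], axis=1)
    probs = dens / dens.sum(axis=1, keepdims=True)
    labels = (rng.uniform(size=len(data)) < probs[:, 1]).astype(int)

    # Estimate step: refit the parameters from the sampled labels.
    for k in range(2):
        members = data[labels == k]
        if len(members) > 0:
            mu[k] = members.mean()
        weights[k] = len(members) / len(data)

print("estimated means:", mu, "estimated weights:", weights)
```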
