Autocorrelation while sampling a mixture norm

Youyi Fong
2012-04-25
2012-09-01
  • Youyi Fong

    Youyi Fong - 2012-04-25

    Dear all, I am comparing the convergence behavior between JAGS and a hand-
    written code which samples the full conditionals for a simple mixture normal
    on the real line. Here is the jags model:

    theta ~ dnorm(mu[d], 3)

    d ~ dcat(p)

    What surprises me is that the autocorrelation of the JAGS samples is much
    smaller than that of the Gibbs sampler I implemented by sampling the full
    conditionals. Can anyone shed light on the algorithm JAGS uses?

    Thanks,

    Youyi
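    For reference, here is a minimal Python sketch of the comparison being
    described. The post does not give the values of mu or p, so the
    three-component mixture below (means -1.5, 0, 1.5 and weights
    0.45/0.10/0.45) is purely illustrative; only the precision of 3 comes from
    the snippet. The first loop runs a Gibbs sampler on the two full
    conditionals, while the second draws independent samples from the joint
    prior, which is what JAGS does when no node is observed:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Illustrative values only -- the post does not give mu or p.
    mu = np.array([-1.5, 0.0, 1.5])   # component means (hypothetical)
    p = np.array([0.45, 0.10, 0.45])  # mixing weights (hypothetical)
    sd = 1.0 / np.sqrt(3.0)           # dnorm(., 3): precision 3 -> sd = 1/sqrt(3)
    n = 5000

    def lag1_autocorr(x):
        x = x - x.mean()
        return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

    # Gibbs sampler alternating the full conditionals of (theta, d).
    theta_gibbs = np.empty(n)
    d = 0
    for t in range(n):
        theta = rng.normal(mu[d], sd)                    # theta | d
        w = p * np.exp(-0.5 * ((theta - mu) / sd) ** 2)  # d | theta
        d = rng.choice(3, p=w / w.sum())
        theta_gibbs[t] = theta

    # Forward sampling from the prior: independent draws from the joint.
    d_iid = rng.choice(3, size=n, p=p)
    theta_iid = rng.normal(mu[d_iid], sd)

    print(lag1_autocorr(theta_gibbs))  # high: the chain lingers in one component
    print(lag1_autocorr(theta_iid))    # near zero: independent draws
    ```

    The Gibbs chain is sticky because, given a theta near one of the outer
    means, the conditional for d almost never picks a different component, so
    the autocorrelation of theta is large; the forward draws are exactly
    independent.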

     
  • Youyi Fong

    Youyi Fong - 2012-05-14

    I tried list.samplers() and it returned an empty list. My best guess is that
    the discrete slice sampler from the base module is used.

     
  • Martyn Plummer

    Martyn Plummer - 2012-05-14

    No. If list.samplers() is empty then all of the nodes in your model are being
    updated from their prior distributions. This would also explain the low
    autocorrelation - in fact you have independent samples from the prior.

    You need to ensure that you define the data correctly.
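    For what it's worth, "defining the data" here means supplying observed
    nodes in the data passed to jags.model. In a sketch like the following
    (the node y, the length N, and the likelihood line are made up for
    illustration), supplying y in the data list gives theta and d real
    samplers, and list.samplers() is no longer empty:

    ```
    model {
      for (i in 1:N) {
        y[i] ~ dnorm(theta, 10)   # observed node; precision 10 is arbitrary
      }
      theta ~ dnorm(mu[d], 3)
      d ~ dcat(p)
    }
    ```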

     
  • Jack Tanner

    Jack Tanner - 2012-05-14

    No. If list.samplers() is empty then all of the nodes in your model are
    being updated from their prior distributions.

    Martyn, that would be helpful to state in the manual.

     
  • Martyn Plummer

    Martyn Plummer - 2012-05-15

    Good point. I put in a note to say that nodes that are updated by forward
    sampling from the prior are not included in the list.

     
  • Youyi Fong

    Youyi Fong - 2012-05-15

    This is a pedagogical example taken from Peter Hoff's A First Course in
    Bayesian Statistical Methods (pages 100-101). There is no data; the example
    is used to illustrate the inefficiency of MCMC compared to ordinary Monte
    Carlo. I hadn't realized that JAGS performs forward sampling when there is
    no data. This is great. I will try to use JAGS for complicated simulation
    studies from now on. Thanks.

     
