Estimates are unexpectedly wide

2013-05-22
  • Consider this model, which is intended to express an indirect random-effects meta-analysis, given results from one study comparing treatment A with B in terms of inLHR[1] and another study comparing treatment B with C in terms of inLHR[2] (both assumed normal).

    model{
      # likelihood: observed log hazard ratios, each with precision 25
      inLHR[1] ~ dnorm(e12, 25)
      inLHR[2] ~ dnorm(e23, 25)
      # study-level contrasts drawn around the treatment effects
      e12 ~ dnorm(ef2, tauLHR)
      e13 ~ dnorm(-ef3, tauLHR)
      e23 ~ dnorm(ef2 - ef3, tauLHR)
      # treatment effects relative to the reference
      ef2 ~ dnorm(0, tau)
      ef3 ~ dnorm(0, tau)
      # vague gamma priors on both precisions
      tauLHR ~ dgamma(0.001, 0.001)
      tau ~ dgamma(0.001, 0.001)
    }
    

    From a theoretical point of view, I'd expect the following:
    mean(e13) = mean(e12) - mean(e23)
    mean(e12) = inLHR[1]
    mean(e23) = inLHR[2]

    When I test it with input data:

    inLHR[1]<-0
    inLHR[2]<-0.2
    

    I've got estimates of e13 (which should correspond to the estimator of the pooled logHR between treatments A and C), and the distribution is far too wide; sometimes I see sampled values of as much as 45. One such value is enough to skyrocket the HR, defined as mean(exp(e13)), into a ridiculously big estimate (on the order of 10^30).

    Is there any error I made in the model?
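    A quick sketch of why a single stray draw dominates this summary (illustrative numbers only, not the actual MCMC output): with 10,000 retained samples, one value of 45 alone contributes exp(45)/10000, which is already around 10^15.

    ```python
    # Sketch: one extreme draw dominates mean(exp(x)) when the hazard
    # ratio is summarised as the posterior mean of exp(e13).
    # Illustrative values; not the thread's actual chain.
    import math

    n = 10_000                            # number of retained samples
    samples = [0.05] * (n - 1) + [45.0]   # well-behaved draws plus one stray value

    hr = sum(math.exp(x) for x in samples) / n
    print(hr)  # dominated by exp(45)/n, i.e. roughly 3.5e15
    ```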

     
  • Martyn Plummer
    2013-05-22

    You have given vague gamma priors to tau and tauLHR. There is certainly not enough information in your two observations to estimate them.
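    One way to see this point: a Gamma(0.001, 0.001) prior on a precision puts most of its mass near zero, which corresponds to enormous standard deviations, and two observations cannot rule those values out. A minimal sketch (prior draws only, no data):

    ```python
    # Sketch: draws from a Gamma(0.001, 0.001) precision prior mostly
    # imply huge standard deviations 1/sqrt(tau).
    # Note: random.gammavariate takes (shape, scale), so rate 0.001
    # becomes scale 1/0.001 = 1000.
    import math
    import random

    random.seed(1)
    draws = [random.gammavariate(0.001, 1000.0) for _ in range(100_000)]
    sds = [1 / math.sqrt(t) for t in draws if t > 0]  # skip underflowed zeros
    big = sum(sd > 100 for sd in sds) / len(sds)
    print(big)  # typically well above one half
    ```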

     
  • The estimates for both gamma-distributed precisions were very reasonable (tauLHR = 235.9 ± 434.7, tau = 234.0 ± 422.5).

    The problem persists even if I set both gammas to be equal (i.e. remove one of them).

    In both cases e13 has mean 0.046 and sd 0.99, and it superficially looks normal. The problem is the skewness (-42.54) and kurtosis (5920.59). Upon inspection, the kurtosis is caused by a few stray values.

    The adaptation is 120,000 steps, and the burn-in is also 120,000. I collected 10,000 samples with a thinning interval of 400 (i.e. 10,000 × 400 iterations). The Gelman statistic equals 1.00011, so convergence should be OK.

    I've tried the same model with both the Windows and Linux versions of JAGS 3.3.0.
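    A sketch of how a handful of stray values can produce exactly this pattern: near-normal mean and sd, but extreme sample skewness and kurtosis (illustrative data, not the actual chain):

    ```python
    # Sketch: one stray value in an otherwise standard-normal sample
    # barely moves the mean and sd but makes the sample skewness and
    # excess kurtosis extreme.
    import random

    random.seed(0)
    xs = [random.gauss(0.0, 1.0) for _ in range(10_000)] + [-45.0]

    n = len(xs)
    m = sum(xs) / n
    sd = (sum((v - m) ** 2 for v in xs) / n) ** 0.5
    skew = sum(((v - m) / sd) ** 3 for v in xs) / n
    kurt = sum(((v - m) / sd) ** 4 for v in xs) / n - 3.0
    # mean and sd stay near 0 and 1; skewness is strongly negative
    # and the excess kurtosis lands in the hundreds
    print(round(m, 3), round(sd, 2), round(skew, 1), round(kurt, 1))
    ```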

     
  • Once I manage to run the development version 4.0.0 (see my technical problems at [https://sourceforge.net/p/mcmc-jags/discussion/610037/thread/99493435]), I'll post the results from that version of JAGS.

     
  • If this is relevant, the simulated parameters are quite correlated:

    cor(mcmcSamples)
                e12         e13         e23         tau
    e12  1.00000000  0.40649866 -0.35064957  0.07386121
    e13  0.40649866  1.00000000  0.71310209 -0.17331292
    e23 -0.35064957  0.71310209  1.00000000 -0.23432337
    tau  0.07386121 -0.17331292 -0.23432337  1.00000000

    I've read that the Gibbs sampler can be inefficient when parameters are correlated; does this correlation matrix look suspicious?
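    A minimal demonstration of that inefficiency, assuming a toy two-parameter Gibbs sampler on a standard bivariate normal (not the thread's model): with correlation rho, the lag-1 autocorrelation of each marginal chain comes out near rho squared, so strongly correlated parameters mix very slowly.

    ```python
    # Sketch: Gibbs sampling a bivariate normal with correlation rho.
    # The full conditionals are x | y ~ N(rho*y, 1 - rho^2) and
    # y | x ~ N(rho*x, 1 - rho^2); high rho gives a slowly mixing chain.
    import math
    import random

    random.seed(42)
    rho = 0.99
    x = y = 0.0
    xs = []
    for _ in range(20_000):
        x = random.gauss(rho * y, math.sqrt(1 - rho**2))
        y = random.gauss(rho * x, math.sqrt(1 - rho**2))
        xs.append(x)

    mean = sum(xs) / len(xs)
    var = sum((v - mean) ** 2 for v in xs) / len(xs)
    lag1 = sum((xs[i] - mean) * (xs[i + 1] - mean)
               for i in range(len(xs) - 1)) / (len(xs) * var)
    print(lag1)  # typically close to rho**2 = 0.9801
    ```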