I've been trying to figure out how JAGS calculates the deviance at each iteration while sampling from the posterior. In my model I used the precision parameterization of the normal distribution described in the JAGS manual and calculated the deviance manually for a simple linear regression. However, what I calculate and what JAGS reports are slightly different. Any help on what I'm doing wrong would be greatly appreciated. Thanks in advance.

[code]
model{
  pi <- 3.141593                        # JAGS has no built-in pi constant
  for(i in 1:n){
    y[i] ~ dnorm(mu[i], tauy)           # dnorm takes the precision tauy, not the variance
    mu[i] <- beta[1] + beta[2]*x[i]
    # log-density of y[i] under Normal(mu[i], 1/tauy):
    # 0.5*log(tauy/(2*pi)) - 0.5*tauy*(y[i] - mu[i])^2
    D[i] <- 0.5*log(tauy/(2*pi)) - 0.5*tauy*pow(y[i] - mu[i], 2)
  }
  for(i in 1:2){
    beta[i] ~ dnorm(0.0, 1.0E-4)
  }
  tauy ~ dgamma(0.1, 0.1)
  Deviance <- -2*sum(D[])
}
[/code]
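As a cross-check outside of JAGS, the same quantity can be computed directly from a posterior draw. The sketch below (plain Python, stdlib only; the function names `normal_logpdf` and `deviance` are my own, not part of any JAGS API) evaluates the normal log-density in the precision parameterization and sums it into the deviance, -2 times the log-likelihood at the current draw of `beta` and `tauy`:

```python
import math

def normal_logpdf(y, mu, tau):
    # Normal log-density in the JAGS precision parameterization:
    # log f(y) = 0.5*log(tau/(2*pi)) - 0.5*tau*(y - mu)^2
    return 0.5 * math.log(tau / (2 * math.pi)) - 0.5 * tau * (y - mu) ** 2

def deviance(y, x, beta, tau):
    # Deviance = -2 * sum of log-densities of the observed y[i],
    # evaluated at one posterior draw (beta[0], beta[1], tau)
    return -2.0 * sum(
        normal_logpdf(yi, beta[0] + beta[1] * xi, tau)
        for yi, xi in zip(y, x)
    )
```

Evaluating this at each stored MCMC draw should reproduce the per-iteration deviance trace, assuming the draws and data match what JAGS saw.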