Hello!

I am running a hierarchical model in JAGS. The purpose is to do group-wise comparisons on overdispersed count data, i.e. row-wise comparisons for a table of counts. The model runs flawlessly on smaller data sets, say 2,000 rows of data. On larger data sets (10,000 rows in this case) it runs for a while and then crashes with the error:

Error: Error in node phi[1973,1]

Failure to calculate log density

The failure occurs at exactly the same node for five runs on different data (of the same size). The model becomes quite large ("Graph Size: 520025") and I suspect I might be running into a memory issue. Is there a limit on model size/memory usage? I can run the larger model successfully if I reduce the number of samples taken.
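For context, the model is along these lines. This is a simplified sketch, not my exact code: the names (y, phi, mu, group) and the priors are illustrative, but phi plays the same role as the node in the error, a per-cell gamma overdispersion term on a Poisson count:

```r
library(rjags)

# Sketch of the model structure (illustrative, not my exact code).
# Counts y[i,j] are Poisson with a multiplicative gamma-distributed
# overdispersion term phi[i,j] with mean 1, giving a negative-binomial
# marginal; mu[g,j] are the group-level rates to be compared.
model_string <- "
model {
  for (i in 1:N) {
    for (j in 1:J) {
      y[i,j] ~ dpois(lambda[i,j])
      lambda[i,j] <- mu[group[i], j] * phi[i,j]
      phi[i,j] ~ dgamma(r, r)          # overdispersion, E[phi] = 1
    }
  }
  for (g in 1:G) {
    for (j in 1:J) {
      mu[g,j] ~ dgamma(0.01, 0.01)     # vague prior on group rates
    }
  }
  r ~ dgamma(0.01, 0.01)               # prior on overdispersion precision
}
"

jm <- jags.model(textConnection(model_string),
                 data = list(y = y, group = group, N = N, J = J, G = G),
                 n.chains = 2)
```

With 10,000 rows this kind of structure has a latent phi node per data cell, which is how the graph grows to the size quoted above.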

Reading up on the cause of the error message, it seems to be related to NaNs or Infs arising in the model. I do not see why these would occur at the same point in each simulation, or why it would depend on model size. Could it be that the likelihood gets too small in a large model and thus becomes numerically unstable?

I am currently running JAGS version 3.3.0, through rjags.

Best regards,

Viktor