Is running time expected to grow linearly with sample length?

  • Fabio
    2013-12-19

    Since there is always a tradeoff between running time and the quality of MCMC output, it seems very useful, when running large analyses, to understand the relationship between the number of samples and the running time, so that one can predict and choose the right tradeoff ahead of time. I had expected running time to grow linearly with the number of samples, but I seem to be seeing something different.

    For example, with one model/dataset (graph size of 683k), JAGS successfully completed 2 chains x 200 samples in under 2 hours. Then, when I tried the "full run" with 2 chains x 1800 samples, I finally had to stop the program after more than 30 hours, because I had no idea when it would ever finish: it was already well past the 2 h x 9 = 18 hours I had expected it to take. CPU and memory usage look the same in both runs, and are well within my system's capacity.
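
    To make the extrapolation explicit, here is roughly the back-of-the-envelope calculation I am relying on (a minimal Python sketch; run_chains is a hypothetical placeholder for my actual JAGS invocation, not a real JAGS or rjags function):

        import time

        def run_chains(n_samples):
            # Hypothetical stand-in for the real sampler call that drives JAGS;
            # the sleep just simulates work proportional to the sample count.
            time.sleep(0.001 * n_samples)

        pilot_samples = 200
        full_samples = 1800

        # Time a short pilot run.
        start = time.perf_counter()
        run_chains(pilot_samples)
        pilot_time = time.perf_counter() - start

        # Assuming a constant cost per sample, the full run should take about
        # (full_samples / pilot_samples) times as long as the pilot run.
        predicted = pilot_time * (full_samples / pilot_samples)
        print(f"pilot:     {pilot_time:.1f} s for {pilot_samples} samples")
        print(f"predicted: {predicted:.1f} s for {full_samples} samples")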

    Am I misunderstanding how the sampling is supposed to work, or could something unexpected be happening?

    Last edit: Fabio 2013-12-19