I have recently started running more complicated Bayesian models in R using the rjags package. As model complexity has increased, I have had to run longer chains to reach convergence for some parameters. Generally I will try a run of 20,000 iterations and, if this does not converge, use the update() function to extend the chain.
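For reference, a minimal sketch of my workflow (the model file name, data object, and monitored node are placeholders):

```r
library(rjags)

# Placeholder model file, data, and monitored parameter
mod  <- jags.model("model.txt", data = dat, n.chains = 3, n.adapt = 1000)
samp <- coda.samples(mod, variable.names = c("beta"), n.iter = 20000)

# If diagnostics show non-convergence, advance the sampler and draw again
update(mod, n.iter = 20000)
samp <- coda.samples(mod, variable.names = c("beta"), n.iter = 20000)
```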
For simple models I can save the model object, and it is small enough to share easily (< 50 MB). However, I have a few models that require chains around 800,000 iterations long to reach convergence, which results in objects > 100 MB.
Is there a function or technique to reduce the size of these objects before I save them? I am familiar with the concept of thinning when starting the model, but I am interested in whether there is a technique to "thin" the chains post hoc.
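To clarify what I mean by thinning when starting the model, this is the kind of call I already use (again with placeholder names), where the thin argument keeps only every 40th draw at sampling time:

```r
# Thinning specified up front: only every 40th iteration is stored
samp <- coda.samples(mod, variable.names = c("beta"),
                     n.iter = 800000, thin = 40)
```

What I am after is an equivalent reduction applied to chains I have already sampled, before saving the object.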