Hi Martyn-

I've been trying to implement a model similar in some ways to those presented by Nicky Best in the ICEBugs talk quite a while ago. I tried to find more information about the issue on the web, and this jags discussion seems the most relevant, but it doesn't really provide a workable solution to the problem.

Given the reservations about the appropriateness of the cut function in previous discussions of the topic, would you be willing to go the route of a patch for JAGS that allows the cut function, and to distribute the patch with an appropriate warning? That might discourage careless use of the function, while providing the option for situations in which it makes practical sense.

A short summary of the question: in a regression-type model where the inputs (covariates) are uncertain and need to be imputed, there does not seem to be a single sensible way of doing this. Especially with small sample sizes for the covariates, in a full model with a full joint posterior, the assumption that the regression model is the true model simply 'pulls' the covariates toward the regression line. I've been encountering the extreme end of this, where the inputs are very uncertain and I end up with y = 1*x_1 + 0*x_2 + ... + 0*x_n, meaning that x_1 gets 'pulled' all the way to y = x_1.
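For concreteness, here is a minimal sketch of the kind of model where this happens (the names, priors, and structure are purely illustrative, not my actual model):

```
model {
  for (i in 1:N) {
    # uncertain covariate, informed by a few noisy measurements w[i, ]
    x[i] ~ dnorm(mu.x, tau.x)
    for (j in 1:M) {
      w[i, j] ~ dnorm(x[i], tau.w)
    }
    # the regression likelihood also acts on x[i]: when the
    # measurements are weak, it pulls x[i] toward the fitted line
    y[i] ~ dnorm(beta0 + beta1 * x[i], tau.y)
  }
  beta0 ~ dnorm(0, 0.001)
  beta1 ~ dnorm(0, 0.001)
  tau.y ~ dgamma(0.001, 0.001)
}
```

The feedback I'm describing is the second arrow into x[i]: the posterior for x[i] combines the measurement model and the regression likelihood, and with very uncertain inputs the latter dominates.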

This obviously says a lot about the amount of information in the inputs, and one might question whether the model makes sense at all. My case differs from the regression example, however, in that y actually is a linear combination of X = x_1..x_n (a linear mixing model), rather than just being assumed to be linearly related to it. Here it seems to me that it's the link from y back to X that doesn't make sense: while I want the uncertainty in X to be reflected in the mixing coefficients, I don't want y to give information about X.
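The mixing model, with the cut I would like, looks roughly like this (again a simplified sketch; I believe this is how cut() is used in WinBUGS/OpenBUGS, as a deterministic function severing feedback):

```
model {
  # source signatures: uncertain, estimated from their own small samples
  for (k in 1:K) {
    x[k] ~ dnorm(mu0, tau0)
    for (j in 1:M) {
      xobs[k, j] ~ dnorm(x[k], tau.x)
    }
    # cut() passes the current value of x[k] downstream but blocks
    # any likelihood information flowing back from y to x[k]
    x.cut[k] <- cut(x[k])      # OpenBUGS only; no JAGS equivalent
  }
  p[1:K] ~ ddirch(alpha[])     # mixing coefficients of interest
  mu.y <- inprod(p[], x.cut[])
  y ~ dnorm(mu.y, tau.y)
}
```

So the uncertainty in X still propagates into the posterior for p[], which is what I want, but y cannot 'correct' X.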

I started implementing my models in OpenBUGS using the cut() function, but the issues with calling OpenBUGS models from 64-bit R, making the final models available to users on Macs, and a host of other problems make JAGS a far more appealing option. As of now, the two-step approach of using ML plug-in estimates for X (which I suppose amounts to a very strong point-mass prior) seems to be the only workable solution in JAGS, but discarding the uncertainty about X makes this semi-Bayesian approach not much better than using the cut function and naively trimming the posterior.
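For comparison, the two-step version that does run in JAGS would look something like this (x.hat supplied as data from a separate first-stage fit; names are again illustrative):

```
model {
  # x.hat[] are fixed plug-in estimates of the sources, so all
  # first-stage uncertainty about X is discarded at this point
  p[1:K] ~ ddirch(alpha[])
  mu.y <- inprod(p[], x.hat[])
  y ~ dnorm(mu.y, tau.y)
  tau.y ~ dgamma(0.001, 0.001)
}
```

This samples fine, but the posterior for p[] is then conditional on X being known exactly, which is precisely the uncertainty I was hoping to keep.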

Any feedback would be greatly appreciated!
Kind regards