I am trying to do a Bayesian linear regression. Since my data cannot be negative, I gave them a log-Normal distribution, but I am not sure whether the priors should also be restricted to positive values. If I write my model with log-Normal priors for the intercept and the slope, I get quite a good fit. If I use Normal priors instead, I don't, and the posteriors barely separate from the priors. But I am not sure whether this is statistically correct. Here is the model:
#linear regression
o <- c(22.77619, 19.07782, 22.08817, 16.32168, 32.57081, NA,
       10.48027, 15.93440, 27.54557, 33.39933)
evi <- c(0.07289889, 0.06288981, 0.065947587, 0.05886781, 0.07037986,
         0.07256388, 0.06540081, 0.07219641, 0.0798039, 0.08368564)
n <- length(o)   #10 observations; JAGS will impute the NA in o
#problem -> Poisson distribution
cat(file = "reg.bug", "
#Likelihood:
model {
for(i in 1:9){
o[i] ~ dlnorm(mu[i],tau)
mu[i] <- b0 + b1 *log(evi[i])
}
#priors:
b0 ~ dlnorm(1,0.001)
b1 ~ dlnorm(1,0.001)
tau ~ dgamma(1,5)
}")
#data, initial values and MCMC settings
reg.data <- list(o = o, evi = evi, n = n)   #jagsUI expects a named list of data
inits <- function() list(b0 = rlnorm(1, 1, 1), b1 = runif(1, 0, 1),
                         tau = runif(1, 0.1, 1))   #runif() needs the number of draws as its first argument
params <- c("b0", "b1", "tau")
ni <- 100000
nt <- 1
nb <- 50000
nc <- 3
library(jagsUI)
reg.model <- jags(model.file = "reg.bug", data = reg.data, parameters.to.save = params,
                  inits = inits, n.burnin = nb, n.chains = nc, n.iter = ni, n.thin = nt)
reg.model
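To see how much the posterior actually moves away from the prior, I overlay prior draws on the posterior samples. This is just my check (the object names are mine; jagsUI stores the saved draws in $sims.list, and JAGS's dlnorm(1, 0.001) means meanlog = 1 with precision 0.001, i.e. sdlog = 1/sqrt(0.001), about 31.6):

#draws from the dlnorm(1, 0.001) prior on b1 (very diffuse on the positive reals)
prior_b1 <- rlnorm(10000, meanlog = 1, sdlog = sqrt(1 / 0.001))
post_b1 <- reg.model$sims.list$b1   #posterior draws saved by jagsUI

#compare densities on the log scale, since the prior spans many orders of magnitude
plot(density(log(post_b1)), main = "log(b1): prior vs posterior", xlab = "log(b1)")
lines(density(log(prior_b1)), lty = 2)
legend("topright", legend = c("posterior", "prior"), lty = c(1, 2))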