srrs2 <- read.table("../ARM_Data/radon/srrs2.dat", header=TRUE, sep=",")
mn <- srrs2$state == "MN"
radon <- srrs2$activity[mn]
y <- log(ifelse(radon == 0, 0.1, radon))
n <- length(radon)
x <- srrs2$floor[mn] # 0 for basement, 1 for first floor
srrs2.fips <- srrs2$stfips * 1000 + srrs2$cntyfips
county.name <- as.vector(srrs2$county[mn])
uniq.name <- unique(county.name)
J <- length(uniq.name)
county <- rep(NA, n)
for (i in 1:J) {
county[county.name == uniq.name[i]] <- i
}
cty <- read.table ("../ARM_Data/radon/cty.dat", header=T, sep=",")
usa.fips <- 1000*cty[,"stfips"] + cty[,"ctfips"]
usa.rows <- match (unique(srrs2.fips[mn]), usa.fips)
uranium <- cty[usa.rows,"Uppm"]
u <- log (uranium)
u.full <- u[county]
data_jags <- list(y = y, x = x, county = county, n = n, J = J)
data_jags$u <- u.full
7 Fitting GLMMs in JAGS
This chapter covers the estimation of multilevel linear and generalized linear models using JAGS and R. It presents examples of models with varying intercepts and slopes, non-nested structures, and multivariate extensions. It also discusses approaches for including correlations between parameters and for handling multiple coefficients that vary across groups.
7.1 Models with varying intercepts and slopes
We begin with a model in which both the intercept and the slope vary by group and which includes the predictor \(x\), without (for now) incorporating the county-level uranium predictor. The structure is:
\[ y_i \sim \mathcal{N}\big(\alpha_{j[i]} + \beta_{j[i]} x_i,\ \sigma_y^2\big), \quad i=1,\dots,n. \]
where \[ \begin{pmatrix} \alpha_j \\ \beta_j \end{pmatrix} \sim \mathcal{N}\left( \begin{pmatrix} \mu_\alpha \\ \mu_\beta \end{pmatrix}, \begin{pmatrix} \sigma_\alpha^2 & \rho\sigma_\alpha\sigma_\beta \\ \rho\sigma_\alpha\sigma_\beta & \sigma_\beta^2 \end{pmatrix}\right), \quad j=1,\dots,J. \]
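As a quick numerical sketch of this covariance structure (with illustrative values, not estimates from any fit below), we can build the 2x2 covariance matrix of \((\alpha_j, \beta_j)\) in R and confirm that any \(|\rho| < 1\) yields a valid (positive-definite) covariance matrix:

```r
# Illustrative values only
sigma.a <- 0.35; sigma.b <- 0.30; rho <- -0.3

# Covariance matrix of (alpha_j, beta_j), as in the equation above
Sigma.B <- matrix(c(sigma.a^2,           rho*sigma.a*sigma.b,
                    rho*sigma.a*sigma.b, sigma.b^2),
                  nrow = 2, byrow = TRUE)

# Valid whenever |rho| < 1: both eigenvalues are positive
eigen(Sigma.B)$values
```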
This model allows between-group variation in both \(\alpha_j\) and \(\beta_j\) and, in addition, introduces a correlation parameter \(\rho\) between intercepts and slopes.
7.1.1 Simple model without correlation between intercepts and slopes
In this first model we assume independence between the group (county) intercepts \((a_j)\) and slopes \((b_j)\). Although this assumption can be restrictive, it simplifies the initial JAGS implementation.
model {
for (i in 1:n){
y[i] ~ dnorm(y.hat[i], tau.y)
y.hat[i] <- a[county[i]] + b[county[i]] * x[i]
}
tau.y <- pow(sigma.y, -2)
sigma.y ~ dunif(0, 100)
for (j in 1:J){
a[j] ~ dnorm(a.hat[j], tau.a)
b[j] ~ dnorm(b.hat[j], tau.b)
a.hat[j] <- mu.a
b.hat[j] <- mu.b
}
mu.a ~ dnorm(0, .0001)
mu.b ~ dnorm(0, .0001)
tau.a <- pow(sigma.a, -2)
tau.b <- pow(sigma.b, -2)
sigma.a ~ dunif(0, 100)
sigma.b ~ dunif(0, 100)
}
In this specification, \(\mu_a\) and \(\mu_b\) are the group-level means that control the average intercepts and slopes, respectively. The auxiliary quantities a.hat[j] and b.hat[j] are kept explicit because they make later extensions easier (for example, introducing correlations or group-level predictors).
The radon data were loaded into R at the beginning of this chapter.
Fitting in R with R2jags (model without correlation):
# Load the required library
library(R2jags)
Loading required package: rjags
Loading required package: coda
Linked to JAGS 4.3.0
Loaded modules: basemod,bugs
Attaching package: 'R2jags'
The following object is masked from 'package:coda':
traceplot
# Prepare the data list for JAGS (u is passed but not used by this model,
# which triggers the "Unused variable" warning below)
data_jags <- list(y = y, x = x, county = county, n = n, J = J, u = u.full)
# Initial values
inits <- function() {
list(a = rnorm(J), b = rnorm(J), mu.a = rnorm(1), mu.b = rnorm(1),
sigma.y = runif(1), sigma.a = runif(1), sigma.b = runif(1))
}
# Parameters to monitor
params <- c("a", "b", "mu.a", "mu.b", "sigma.y", "sigma.a", "sigma.b")
# Run JAGS
jags_fit <- jags(data = data_jags,
inits = inits,
parameters.to.save = params,
model.file = "codigoJAGS/radon_model_slope1.jags",
n.chains = 3,
n.iter = 5000,
n.burnin = 1000,
n.thin = 10)
module glm loaded
Warning in jags.model(model.file, data = data, inits = init.values, n.chains =
n.chains, : Unused variable "u" in data
Compiling model graph
Resolving undeclared variables
Allocating nodes
Graph information:
Observed stochastic nodes: 919
Unobserved stochastic nodes: 175
Total graph size: 3232
Initializing model
# Summary
print(jags_fit)
Inference for Bugs model at "codigoJAGS/radon_model_slope1.jags", fit using jags,
3 chains, each with 5000 iterations (first 1000 discarded), n.thin = 10
n.sims = 1200 iterations saved. Running time = 6.098 secs
mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat
a[1] 1.169 0.257 0.659 0.986 1.180 1.358 1.656 1.003
a[2] 0.936 0.101 0.732 0.871 0.936 1.001 1.124 1.003
a[3] 1.475 0.279 0.917 1.295 1.477 1.666 2.041 1.003
a[4] 1.523 0.228 1.105 1.371 1.506 1.669 1.986 1.001
a[5] 1.436 0.251 0.962 1.261 1.444 1.608 1.942 1.003
a[6] 1.490 0.275 0.974 1.297 1.484 1.660 2.048 1.002
a[7] 1.840 0.183 1.501 1.718 1.833 1.960 2.218 1.000
a[8] 1.679 0.253 1.190 1.504 1.682 1.844 2.207 1.001
a[9] 1.141 0.201 0.747 1.007 1.144 1.274 1.535 1.000
a[10] 1.519 0.242 1.047 1.360 1.525 1.675 2.001 1.001
a[11] 1.417 0.241 0.958 1.259 1.408 1.573 1.897 1.000
a[12] 1.587 0.247 1.112 1.431 1.578 1.756 2.051 1.005
a[13] 1.234 0.226 0.780 1.089 1.239 1.377 1.693 1.001
a[14] 1.853 0.187 1.475 1.722 1.852 1.974 2.212 1.003
a[15] 1.389 0.253 0.872 1.223 1.399 1.549 1.888 1.009
a[16] 1.234 0.278 0.691 1.058 1.236 1.403 1.794 1.002
a[17] 1.399 0.262 0.898 1.239 1.389 1.567 1.916 1.002
a[18] 1.205 0.188 0.825 1.077 1.209 1.339 1.552 1.001
a[19] 1.349 0.094 1.160 1.284 1.348 1.413 1.534 1.002
a[20] 1.600 0.261 1.080 1.422 1.602 1.773 2.124 1.002
a[21] 1.631 0.202 1.233 1.489 1.633 1.780 2.020 1.000
a[22] 1.027 0.227 0.583 0.874 1.019 1.184 1.470 1.001
a[23] 1.425 0.283 0.861 1.235 1.426 1.616 1.980 1.003
a[24] 1.853 0.208 1.463 1.704 1.851 1.995 2.265 1.000
a[25] 1.791 0.171 1.451 1.677 1.794 1.905 2.115 1.000
a[26] 1.374 0.072 1.225 1.330 1.375 1.423 1.517 1.001
a[27] 1.616 0.245 1.136 1.453 1.603 1.777 2.135 1.001
a[28] 1.340 0.248 0.848 1.166 1.348 1.507 1.815 1.004
a[29] 1.313 0.275 0.789 1.116 1.324 1.502 1.838 1.002
a[30] 1.104 0.193 0.736 0.972 1.111 1.240 1.454 1.003
a[31] 1.749 0.239 1.302 1.583 1.748 1.900 2.227 1.000
a[32] 1.367 0.248 0.905 1.202 1.370 1.537 1.837 1.005
a[33] 1.731 0.244 1.258 1.561 1.729 1.885 2.221 1.001
a[34] 1.509 0.277 0.977 1.320 1.515 1.689 2.055 1.000
a[35] 1.071 0.230 0.626 0.917 1.069 1.226 1.510 1.002
a[36] 1.852 0.294 1.308 1.646 1.842 2.052 2.415 1.002
a[37] 0.784 0.212 0.360 0.643 0.783 0.927 1.177 1.000
a[38] 1.595 0.253 1.071 1.416 1.597 1.773 2.067 1.005
a[39] 1.592 0.242 1.140 1.422 1.581 1.758 2.088 1.001
a[40] 1.839 0.271 1.308 1.651 1.833 2.020 2.376 1.001
a[41] 1.755 0.213 1.355 1.611 1.744 1.900 2.203 1.000
a[42] 1.438 0.323 0.830 1.220 1.431 1.660 2.040 1.007
a[43] 1.645 0.237 1.193 1.485 1.647 1.801 2.126 1.006
a[44] 1.243 0.220 0.785 1.093 1.251 1.395 1.675 1.001
a[45] 1.368 0.185 1.008 1.243 1.366 1.497 1.741 1.001
a[46] 1.334 0.239 0.857 1.175 1.336 1.498 1.796 1.003
a[47] 1.321 0.287 0.740 1.126 1.317 1.512 1.874 1.000
a[48] 1.258 0.209 0.836 1.122 1.266 1.395 1.661 1.002
a[49] 1.650 0.178 1.298 1.534 1.643 1.763 2.007 1.001
a[50] 1.634 0.316 1.051 1.418 1.620 1.851 2.307 1.001
a[51] 1.788 0.264 1.278 1.610 1.776 1.959 2.305 1.001
a[52] 1.630 0.266 1.076 1.460 1.629 1.806 2.169 1.001
a[53] 1.393 0.274 0.862 1.204 1.400 1.575 1.938 1.000
a[54] 1.350 0.145 1.074 1.255 1.354 1.448 1.638 1.003
a[55] 1.525 0.224 1.091 1.376 1.519 1.681 1.972 1.000
a[56] 1.351 0.273 0.812 1.165 1.356 1.534 1.877 1.001
a[57] 1.084 0.227 0.643 0.931 1.083 1.243 1.505 1.001
a[58] 1.625 0.255 1.142 1.448 1.629 1.794 2.114 1.004
a[59] 1.566 0.253 1.075 1.387 1.572 1.739 2.078 1.001
a[60] 1.405 0.288 0.841 1.225 1.415 1.587 1.970 1.000
a[61] 1.179 0.125 0.937 1.096 1.182 1.264 1.418 1.001
a[62] 1.681 0.243 1.203 1.520 1.685 1.840 2.155 1.003
a[63] 1.516 0.275 0.966 1.330 1.503 1.696 2.072 1.001
a[64] 1.720 0.180 1.380 1.593 1.719 1.844 2.076 1.001
a[65] 1.417 0.297 0.849 1.228 1.404 1.599 2.042 1.002
a[66] 1.576 0.205 1.175 1.442 1.574 1.713 1.979 1.001
a[67] 1.643 0.183 1.267 1.525 1.644 1.762 1.999 1.002
a[68] 1.245 0.212 0.828 1.104 1.247 1.389 1.644 1.001
a[69] 1.364 0.256 0.874 1.185 1.370 1.532 1.854 1.000
a[70] 0.880 0.075 0.742 0.827 0.878 0.930 1.032 1.003
a[71] 1.494 0.143 1.229 1.397 1.490 1.591 1.779 1.002
a[72] 1.542 0.196 1.191 1.404 1.535 1.674 1.944 1.005
a[73] 1.562 0.283 1.009 1.376 1.561 1.751 2.100 1.000
a[74] 1.250 0.244 0.752 1.095 1.252 1.418 1.713 1.004
a[75] 1.553 0.268 1.064 1.365 1.551 1.734 2.088 1.000
a[76] 1.695 0.261 1.192 1.519 1.692 1.868 2.218 1.001
a[77] 1.669 0.230 1.216 1.512 1.665 1.825 2.121 1.000
a[78] 1.382 0.247 0.889 1.222 1.388 1.543 1.863 1.000
a[79] 1.095 0.253 0.593 0.921 1.104 1.267 1.577 1.001
a[80] 1.350 0.105 1.144 1.275 1.350 1.423 1.550 1.002
a[81] 1.865 0.293 1.334 1.665 1.864 2.062 2.476 1.003
a[82] 1.571 0.299 0.994 1.367 1.560 1.766 2.174 1.000
a[83] 1.618 0.191 1.257 1.487 1.618 1.747 2.001 1.001
a[84] 1.603 0.178 1.260 1.487 1.602 1.724 1.949 1.001
a[85] 1.388 0.284 0.825 1.205 1.385 1.578 1.947 1.000
b[1] -0.647 0.261 -1.141 -0.803 -0.660 -0.498 -0.077 1.001
b[2] -0.830 0.264 -1.440 -0.978 -0.798 -0.658 -0.357 1.000
b[3] -0.668 0.261 -1.200 -0.820 -0.685 -0.517 -0.141 1.003
b[4] -0.725 0.243 -1.204 -0.868 -0.721 -0.584 -0.245 1.003
b[5] -0.648 0.266 -1.167 -0.797 -0.663 -0.516 -0.087 1.006
b[6] -0.671 0.310 -1.299 -0.825 -0.683 -0.524 0.016 1.004
b[7] -0.453 0.309 -0.939 -0.663 -0.517 -0.280 0.233 1.011
b[8] -0.652 0.253 -1.159 -0.791 -0.668 -0.511 -0.113 1.010
b[9] -0.527 0.298 -1.015 -0.722 -0.568 -0.377 0.199 1.016
b[10] -0.738 0.253 -1.275 -0.886 -0.726 -0.589 -0.232 1.000
b[11] -0.670 0.309 -1.309 -0.834 -0.693 -0.506 -0.022 1.005
b[12] -0.684 0.309 -1.308 -0.852 -0.695 -0.531 0.010 1.005
b[13] -0.688 0.301 -1.286 -0.852 -0.698 -0.530 -0.032 1.012
b[14] -0.756 0.247 -1.289 -0.893 -0.743 -0.598 -0.301 1.002
b[15] -0.669 0.249 -1.166 -0.823 -0.670 -0.519 -0.160 1.008
b[16] -0.681 0.284 -1.242 -0.836 -0.685 -0.531 -0.076 1.006
b[17] -0.779 0.264 -1.357 -0.925 -0.755 -0.610 -0.283 1.008
b[18] -0.572 0.255 -1.022 -0.738 -0.605 -0.421 -0.003 1.003
b[19] -0.763 0.233 -1.293 -0.890 -0.750 -0.618 -0.327 1.005
b[20] -0.673 0.298 -1.302 -0.832 -0.675 -0.521 -0.032 1.002
b[21] -0.642 0.284 -1.191 -0.802 -0.664 -0.498 0.015 1.009
b[22] -0.686 0.271 -1.259 -0.835 -0.689 -0.529 -0.131 1.002
b[23] -0.657 0.278 -1.197 -0.815 -0.678 -0.505 -0.068 1.009
b[24] -0.658 0.261 -1.171 -0.810 -0.677 -0.522 -0.068 1.006
b[25] -0.463 0.299 -0.952 -0.682 -0.510 -0.280 0.207 1.007
b[26] -0.766 0.175 -1.139 -0.876 -0.757 -0.651 -0.417 1.000
b[27] -0.622 0.275 -1.148 -0.785 -0.642 -0.458 -0.018 1.007
b[28] -0.665 0.257 -1.195 -0.807 -0.676 -0.520 -0.125 1.012
b[29] -0.683 0.308 -1.349 -0.827 -0.694 -0.527 -0.034 1.012
b[30] -0.666 0.309 -1.281 -0.824 -0.677 -0.514 0.007 1.009
b[31] -0.673 0.318 -1.331 -0.835 -0.683 -0.514 0.049 1.006
b[32] -0.670 0.319 -1.355 -0.829 -0.673 -0.515 0.013 1.000
b[33] -0.689 0.317 -1.409 -0.847 -0.686 -0.519 -0.072 1.011
b[34] -0.715 0.270 -1.265 -0.857 -0.713 -0.566 -0.166 1.003
b[35] -0.643 0.242 -1.105 -0.792 -0.664 -0.507 -0.131 1.000
b[36] -0.534 0.308 -1.084 -0.729 -0.584 -0.375 0.146 1.018
b[37] -0.649 0.286 -1.206 -0.817 -0.665 -0.490 -0.040 1.014
b[38] -0.615 0.265 -1.112 -0.773 -0.645 -0.467 -0.030 1.004
b[39] -0.608 0.271 -1.114 -0.762 -0.625 -0.466 -0.027 1.001
b[40] -0.679 0.278 -1.235 -0.841 -0.694 -0.516 -0.069 1.005
b[41] -0.570 0.295 -1.057 -0.755 -0.607 -0.409 0.109 1.010
b[42] -0.699 0.309 -1.346 -0.864 -0.702 -0.540 -0.064 1.009
b[43] -1.003 0.308 -1.703 -1.218 -0.946 -0.758 -0.549 1.010
b[44] -0.834 0.312 -1.552 -0.988 -0.801 -0.647 -0.276 1.007
b[45] -0.801 0.245 -1.361 -0.936 -0.767 -0.652 -0.349 1.008
b[46] -0.667 0.306 -1.289 -0.842 -0.685 -0.504 -0.046 1.004
b[47] -0.855 0.297 -1.560 -1.005 -0.797 -0.676 -0.362 1.008
b[48] -0.614 0.297 -1.166 -0.794 -0.642 -0.452 0.049 1.010
b[49] -0.845 0.271 -1.465 -0.996 -0.806 -0.672 -0.369 1.007
b[50] -0.667 0.313 -1.299 -0.828 -0.685 -0.526 0.052 1.003
b[51] -0.681 0.314 -1.308 -0.859 -0.692 -0.515 0.021 1.007
b[52] -0.683 0.310 -1.340 -0.849 -0.684 -0.521 -0.058 1.011
b[53] -0.703 0.286 -1.327 -0.848 -0.697 -0.551 -0.151 1.005
b[54] -0.867 0.267 -1.472 -1.025 -0.816 -0.687 -0.420 1.016
b[55] -0.612 0.250 -1.082 -0.767 -0.634 -0.468 -0.060 1.015
b[56] -0.822 0.272 -1.426 -0.969 -0.778 -0.651 -0.353 1.005
b[57] -0.711 0.288 -1.301 -0.855 -0.708 -0.560 -0.109 1.009
b[58] -0.610 0.284 -1.133 -0.780 -0.636 -0.468 0.047 1.005
b[59] -0.734 0.259 -1.273 -0.873 -0.716 -0.583 -0.225 1.009
b[60] -0.683 0.302 -1.331 -0.845 -0.692 -0.513 -0.011 1.011
b[61] -0.490 0.271 -0.910 -0.694 -0.532 -0.326 0.106 1.023
b[62] -0.433 0.345 -0.940 -0.672 -0.493 -0.258 0.383 1.014
b[63] -0.553 0.295 -1.071 -0.736 -0.584 -0.401 0.121 1.011
b[64] -0.709 0.273 -1.292 -0.849 -0.706 -0.556 -0.159 1.007
b[65] -0.676 0.307 -1.309 -0.846 -0.695 -0.508 -0.023 1.004
b[66] -0.628 0.215 -1.029 -0.765 -0.647 -0.497 -0.179 1.006
b[67] -0.445 0.287 -0.898 -0.667 -0.484 -0.263 0.226 1.009
b[68] -0.688 0.312 -1.373 -0.855 -0.695 -0.525 -0.048 1.009
b[69] -0.678 0.311 -1.338 -0.840 -0.686 -0.511 -0.035 1.005
b[70] -0.633 0.158 -0.937 -0.733 -0.640 -0.533 -0.312 1.001
b[71] -0.776 0.231 -1.255 -0.912 -0.761 -0.624 -0.326 1.003
b[72] -0.684 0.307 -1.357 -0.846 -0.690 -0.515 -0.072 1.003
b[73] -0.682 0.323 -1.324 -0.854 -0.685 -0.526 0.021 1.009
b[74] -0.673 0.302 -1.286 -0.835 -0.682 -0.511 -0.017 1.005
b[75] -0.564 0.293 -1.072 -0.746 -0.607 -0.411 0.128 1.008
b[76] -0.666 0.280 -1.239 -0.814 -0.679 -0.517 -0.046 1.002
b[77] -0.793 0.289 -1.453 -0.944 -0.753 -0.624 -0.239 1.006
b[78] -0.745 0.271 -1.330 -0.885 -0.735 -0.586 -0.230 1.003
b[79] -0.748 0.292 -1.392 -0.906 -0.732 -0.584 -0.149 1.007
b[80] -0.810 0.228 -1.306 -0.942 -0.793 -0.665 -0.382 1.004
b[81] -0.491 0.302 -0.974 -0.697 -0.541 -0.315 0.230 1.007
b[82] -0.657 0.297 -1.235 -0.818 -0.673 -0.498 -0.003 1.007
b[83] -0.989 0.299 -1.661 -1.175 -0.941 -0.755 -0.554 1.015
b[84] -0.679 0.278 -1.253 -0.828 -0.687 -0.523 -0.103 1.006
b[85] -0.675 0.315 -1.331 -0.836 -0.683 -0.516 0.026 1.003
mu.a 1.461 0.053 1.356 1.425 1.461 1.497 1.565 1.003
mu.b -0.681 0.087 -0.840 -0.740 -0.682 -0.621 -0.505 1.003
sigma.a 0.335 0.049 0.246 0.304 0.333 0.364 0.440 1.003
sigma.b 0.266 0.132 0.054 0.161 0.266 0.356 0.528 1.035
sigma.y 0.750 0.018 0.717 0.739 0.750 0.762 0.785 1.002
deviance 2079.311 16.551 2045.138 2068.016 2079.194 2091.156 2111.952 1.006
n.eff
a[1] 1200
a[2] 1200
a[3] 1200
a[4] 1200
a[5] 1200
a[6] 1100
a[7] 1200
a[8] 1200
a[9] 1200
a[10] 1200
a[11] 1200
a[12] 360
a[13] 1200
a[14] 700
a[15] 330
a[16] 1200
a[17] 1200
a[18] 1200
a[19] 780
a[20] 970
a[21] 1200
a[22] 1200
a[23] 1200
a[24] 1200
a[25] 1200
a[26] 1200
a[27] 1200
a[28] 430
a[29] 1200
a[30] 560
a[31] 1200
a[32] 1100
a[33] 1200
a[34] 1200
a[35] 860
a[36] 1100
a[37] 1200
a[38] 410
a[39] 1200
a[40] 1200
a[41] 1200
a[42] 260
a[43] 370
a[44] 1200
a[45] 1200
a[46] 600
a[47] 1200
a[48] 790
a[49] 1200
a[50] 1200
a[51] 1200
a[52] 1200
a[53] 1200
a[54] 550
a[55] 1200
a[56] 1200
a[57] 1200
a[58] 440
a[59] 1200
a[60] 1200
a[61] 1200
a[62] 710
a[63] 1200
a[64] 1200
a[65] 1200
a[66] 1200
a[67] 980
a[68] 1200
a[69] 1200
a[70] 1200
a[71] 930
a[72] 450
a[73] 1200
a[74] 1200
a[75] 1200
a[76] 1200
a[77] 1200
a[78] 1200
a[79] 1200
a[80] 880
a[81] 550
a[82] 1200
a[83] 1200
a[84] 1200
a[85] 1200
b[1] 1200
b[2] 1200
b[3] 1200
b[4] 1200
b[5] 540
b[6] 1200
b[7] 180
b[8] 220
b[9] 160
b[10] 1200
b[11] 440
b[12] 1200
b[13] 1100
b[14] 1200
b[15] 1200
b[16] 410
b[17] 490
b[18] 970
b[19] 1200
b[20] 1200
b[21] 1200
b[22] 930
b[23] 330
b[24] 740
b[25] 320
b[26] 1200
b[27] 1200
b[28] 650
b[29] 500
b[30] 1100
b[31] 1200
b[32] 1200
b[33] 750
b[34] 1200
b[35] 1200
b[36] 170
b[37] 240
b[38] 460
b[39] 1200
b[40] 1200
b[41] 440
b[42] 450
b[43] 220
b[44] 840
b[45] 800
b[46] 1200
b[47] 500
b[48] 490
b[49] 1200
b[50] 1200
b[51] 360
b[52] 1200
b[53] 1200
b[54] 150
b[55] 180
b[56] 970
b[57] 1200
b[58] 370
b[59] 1200
b[60] 430
b[61] 95
b[62] 170
b[63] 280
b[64] 780
b[65] 870
b[66] 400
b[67] 250
b[68] 1100
b[69] 1200
b[70] 1200
b[71] 1200
b[72] 1200
b[73] 370
b[74] 1200
b[75] 590
b[76] 1200
b[77] 710
b[78] 830
b[79] 1200
b[80] 800
b[81] 430
b[82] 1200
b[83] 150
b[84] 1200
b[85] 1100
mu.a 540
mu.b 670
sigma.a 630
sigma.b 73
sigma.y 890
deviance 300
For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
DIC info (using the rule: pV = var(deviance)/2)
pV = 136.3 and DIC = 2215.6
DIC is an estimate of expected predictive error (lower deviance is better).
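The DIC rule quoted in the output above can be reproduced directly. This is a standalone sketch using simulated draws in place of the actual MCMC deviance samples (the mean and sd below merely mimic the deviance row of the table):

```r
# Simulated stand-in for the saved deviance draws
set.seed(1)
deviance.draws <- rnorm(1200, mean = 2079.3, sd = 16.6)

# R2jags's rule: pV = var(deviance)/2, DIC = mean(deviance) + pV
pV  <- var(deviance.draws) / 2
DIC <- mean(deviance.draws) + pV
round(c(pV = pV, DIC = DIC), 1)
```

With the real fit, the same quantities are available without recomputation in jags_fit$BUGSoutput.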
7.1.2 Model with correlation between intercepts and slopes
To introduce a correlation \(\rho\) between intercepts and slopes, we rewrite the model in matrix form:
model {
for (i in 1:n){
y[i] ~ dnorm(y.hat[i], tau.y)
y.hat[i] <- a[county[i]] + b[county[i]] * x[i]
}
tau.y <- pow(sigma.y, -2)
sigma.y ~ dunif(0, 100)
for (j in 1:J){
a[j] <- B[j,1]
b[j] <- B[j,2]
B[j,1:2] ~ dmnorm(B.hat[j,], Tau.B[,])
B.hat[j,1] <- mu.a
B.hat[j,2] <- mu.b
}
mu.a ~ dnorm(0, .0001)
mu.b ~ dnorm(0, .0001)
Tau.B[1:2,1:2] <- inverse(Sigma.B[,])
Sigma.B[1,1] <- pow(sigma.a, 2)
Sigma.B[2,2] <- pow(sigma.b, 2)
Sigma.B[1,2] <- rho * sigma.a * sigma.b
Sigma.B[2,1] <- Sigma.B[1,2]
sigma.a ~ dunif(0, 100)
sigma.b ~ dunif(0, 100)
rho ~ dunif(-1, 1)
}
Here, B is the \(J \times 2\) matrix whose rows are \((a_j, b_j)\), and Tau.B is the precision matrix (the inverse of the covariance matrix). Estimating \(\rho\) directly gives the hierarchical structure a more realistic form.
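The inverse() call in the JAGS model corresponds to solve() in R. A minimal sketch (again with made-up values) of the precision/covariance relationship:

```r
# Illustrative values only
sigma.a <- 0.35; sigma.b <- 0.30; rho <- -0.27

# Covariance matrix Sigma.B and its inverse, the precision matrix Tau.B
Sigma.B <- matrix(c(sigma.a^2,           rho*sigma.a*sigma.b,
                    rho*sigma.a*sigma.b, sigma.b^2), 2, 2)
Tau.B <- solve(Sigma.B)

# Their product recovers the 2x2 identity (up to rounding)
round(Tau.B %*% Sigma.B, 10)
```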
Fitting in R with R2jags (model with correlation):
# Prepare the data for JAGS
data_jags <- list(y = y, x = x, county = county, n = n, J = J)
# Initial values
inits <- function() {
list(B = array(rnorm(2 * J), dim = c(J, 2)),
mu.a = rnorm(1), mu.b = rnorm(1),
sigma.y = runif(1), sigma.a = runif(1), sigma.b = runif(1),
rho = runif(1, -1, 1))
}
# Parameters to monitor
params <- c("a", "b", "mu.a", "mu.b", "sigma.y", "sigma.a", "sigma.b", "rho")
# Run JAGS
jags_fit <- jags(data = data_jags,
inits = inits,
parameters.to.save = params,
model.file = "codigoJAGS/radon_model_slope2.jags",
n.chains = 3,
n.iter = 5000,
n.burnin = 1000,
n.thin = 10)
Compiling model graph
Resolving undeclared variables
Allocating nodes
Graph information:
Observed stochastic nodes: 919
Unobserved stochastic nodes: 91
Total graph size: 3408
Initializing model
# Summary of results
print(jags_fit)
Inference for Bugs model at "codigoJAGS/radon_model_slope2.jags", fit using jags,
3 chains, each with 5000 iterations (first 1000 discarded), n.thin = 10
n.sims = 1200 iterations saved. Running time = 38.016 secs
mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat
a[1] 1.148 0.272 0.622 0.968 1.142 1.327 1.664 1.002
a[2] 0.932 0.102 0.734 0.864 0.931 1.000 1.130 1.001
a[3] 1.481 0.294 0.881 1.289 1.468 1.667 2.073 1.003
a[4] 1.515 0.253 1.016 1.341 1.518 1.681 2.007 1.004
a[5] 1.444 0.277 0.892 1.258 1.454 1.628 1.975 1.007
a[6] 1.476 0.267 0.962 1.295 1.469 1.668 1.989 1.001
a[7] 1.826 0.181 1.478 1.700 1.828 1.950 2.175 1.004
a[8] 1.674 0.277 1.149 1.493 1.673 1.846 2.230 1.000
a[9] 1.107 0.207 0.683 0.976 1.107 1.240 1.503 1.002
a[10] 1.541 0.253 1.062 1.372 1.531 1.710 2.074 1.006
a[11] 1.425 0.246 0.955 1.266 1.429 1.588 1.909 1.002
a[12] 1.596 0.249 1.118 1.430 1.594 1.769 2.077 1.000
a[13] 1.230 0.228 0.782 1.072 1.231 1.374 1.694 1.004
a[14] 1.861 0.194 1.488 1.733 1.855 1.984 2.249 1.000
a[15] 1.400 0.269 0.893 1.213 1.411 1.584 1.902 1.000
a[16] 1.214 0.297 0.601 1.025 1.204 1.421 1.793 1.007
a[17] 1.439 0.285 0.894 1.248 1.429 1.632 2.000 1.000
a[18] 1.165 0.199 0.764 1.025 1.173 1.307 1.540 1.006
a[19] 1.351 0.090 1.178 1.290 1.348 1.414 1.530 1.001
a[20] 1.591 0.270 1.093 1.401 1.588 1.778 2.114 1.002
a[21] 1.619 0.205 1.230 1.476 1.610 1.759 2.030 1.001
a[22] 0.975 0.246 0.485 0.813 0.974 1.149 1.439 1.006
a[23] 1.429 0.306 0.853 1.227 1.425 1.629 2.041 1.002
a[24] 1.873 0.220 1.447 1.728 1.866 2.020 2.307 1.003
a[25] 1.790 0.189 1.423 1.664 1.784 1.920 2.172 1.002
a[26] 1.368 0.074 1.226 1.317 1.368 1.418 1.516 1.001
a[27] 1.594 0.239 1.117 1.436 1.596 1.751 2.063 1.000
a[28] 1.328 0.268 0.809 1.153 1.334 1.504 1.855 1.000
a[29] 1.295 0.264 0.777 1.122 1.294 1.469 1.799 1.005
a[30] 1.095 0.182 0.716 0.974 1.100 1.211 1.463 1.002
a[31] 1.736 0.249 1.253 1.560 1.729 1.906 2.225 1.001
a[32] 1.366 0.263 0.847 1.185 1.364 1.546 1.874 1.001
a[33] 1.747 0.261 1.266 1.566 1.741 1.914 2.290 1.004
a[34] 1.511 0.286 0.960 1.324 1.502 1.702 2.074 1.001
a[35] 1.017 0.250 0.500 0.858 1.022 1.188 1.487 1.001
a[36] 1.859 0.317 1.217 1.648 1.855 2.076 2.480 1.005
a[37] 0.752 0.219 0.323 0.601 0.754 0.903 1.179 1.010
a[38] 1.599 0.272 1.054 1.424 1.606 1.780 2.127 1.003
a[39] 1.582 0.244 1.110 1.415 1.589 1.750 2.062 1.003
a[40] 1.837 0.276 1.295 1.653 1.833 2.014 2.410 1.000
a[41] 1.751 0.211 1.351 1.599 1.742 1.890 2.162 1.000
a[42] 1.435 0.324 0.803 1.219 1.438 1.646 2.079 1.001
a[43] 1.729 0.265 1.249 1.540 1.717 1.905 2.288 1.001
a[44] 1.235 0.229 0.763 1.088 1.239 1.386 1.675 1.000
a[45] 1.356 0.191 0.993 1.224 1.349 1.480 1.741 1.000
a[46] 1.333 0.243 0.863 1.164 1.348 1.492 1.795 1.001
a[47] 1.338 0.306 0.744 1.134 1.334 1.535 1.985 1.003
a[48] 1.253 0.212 0.840 1.105 1.252 1.400 1.645 1.003
a[49] 1.670 0.188 1.321 1.540 1.667 1.803 2.024 1.008
a[50] 1.633 0.316 1.039 1.419 1.622 1.831 2.271 1.000
a[51] 1.781 0.267 1.270 1.586 1.776 1.968 2.293 1.000
a[52] 1.646 0.273 1.090 1.467 1.646 1.824 2.194 1.005
a[53] 1.399 0.283 0.862 1.212 1.398 1.586 1.946 1.005
a[54] 1.356 0.151 1.065 1.250 1.361 1.458 1.644 1.006
a[55] 1.510 0.218 1.089 1.361 1.510 1.661 1.935 1.000
a[56] 1.383 0.297 0.819 1.187 1.376 1.581 1.996 1.002
a[57] 1.065 0.238 0.592 0.900 1.065 1.223 1.521 1.002
a[58] 1.627 0.256 1.123 1.450 1.627 1.795 2.130 1.002
a[59] 1.606 0.284 1.073 1.410 1.601 1.790 2.175 1.001
a[60] 1.401 0.295 0.837 1.207 1.394 1.604 1.964 1.001
a[61] 1.165 0.129 0.913 1.080 1.163 1.247 1.421 1.002
a[62] 1.659 0.259 1.191 1.473 1.659 1.834 2.193 1.002
a[63] 1.497 0.297 0.941 1.291 1.492 1.699 2.074 1.001
a[64] 1.728 0.186 1.360 1.606 1.729 1.851 2.100 1.002
a[65] 1.406 0.291 0.844 1.219 1.399 1.593 1.994 1.002
a[66] 1.573 0.214 1.146 1.430 1.570 1.720 1.995 1.002
a[67] 1.621 0.198 1.238 1.485 1.625 1.753 2.009 1.000
a[68] 1.220 0.217 0.772 1.081 1.224 1.362 1.641 1.003
a[69] 1.348 0.257 0.815 1.181 1.345 1.530 1.813 1.001
a[70] 0.878 0.074 0.732 0.827 0.878 0.929 1.015 1.004
a[71] 1.492 0.148 1.213 1.390 1.489 1.584 1.779 1.001
a[72] 1.539 0.201 1.139 1.399 1.545 1.672 1.939 1.003
a[73] 1.561 0.285 1.008 1.367 1.567 1.756 2.121 1.002
a[74] 1.238 0.254 0.755 1.066 1.239 1.406 1.719 1.004
a[75] 1.508 0.294 0.920 1.315 1.510 1.702 2.050 1.002
a[76] 1.699 0.266 1.178 1.512 1.706 1.881 2.228 1.004
a[77] 1.698 0.223 1.257 1.542 1.705 1.842 2.160 1.002
a[78] 1.377 0.259 0.869 1.200 1.379 1.549 1.893 1.001
a[79] 1.083 0.272 0.573 0.901 1.088 1.273 1.590 1.001
a[80] 1.355 0.112 1.140 1.282 1.356 1.429 1.577 1.000
a[81] 1.845 0.316 1.259 1.619 1.841 2.060 2.467 1.000
a[82] 1.601 0.327 0.948 1.381 1.595 1.820 2.226 1.005
a[83] 1.653 0.193 1.284 1.524 1.644 1.787 2.042 1.002
a[84] 1.593 0.181 1.250 1.474 1.593 1.707 1.968 1.002
a[85] 1.380 0.308 0.772 1.180 1.373 1.586 1.995 1.004
b[1] -0.575 0.326 -1.179 -0.771 -0.598 -0.398 0.135 1.007
b[2] -0.749 0.279 -1.359 -0.910 -0.726 -0.577 -0.225 1.002
b[3] -0.678 0.286 -1.241 -0.845 -0.680 -0.527 -0.093 1.003
b[4] -0.717 0.265 -1.289 -0.867 -0.714 -0.564 -0.163 1.006
b[5] -0.631 0.308 -1.201 -0.810 -0.646 -0.486 0.083 1.009
b[6] -0.681 0.322 -1.354 -0.853 -0.673 -0.497 -0.038 1.005
b[7] -0.519 0.307 -1.024 -0.718 -0.566 -0.342 0.158 1.003
b[8] -0.664 0.288 -1.269 -0.835 -0.661 -0.501 -0.052 1.001
b[9] -0.418 0.328 -0.976 -0.644 -0.465 -0.232 0.359 1.016
b[10] -0.757 0.274 -1.364 -0.904 -0.727 -0.591 -0.239 1.001
b[11] -0.663 0.330 -1.377 -0.835 -0.667 -0.488 0.014 1.004
b[12] -0.706 0.325 -1.440 -0.885 -0.694 -0.531 -0.045 1.008
b[13] -0.618 0.331 -1.254 -0.812 -0.627 -0.448 0.081 1.008
b[14] -0.795 0.271 -1.377 -0.958 -0.785 -0.626 -0.287 1.006
b[15] -0.653 0.284 -1.239 -0.823 -0.660 -0.499 -0.061 1.000
b[16] -0.622 0.349 -1.318 -0.810 -0.628 -0.452 0.143 1.001
b[17] -0.814 0.301 -1.525 -0.984 -0.775 -0.618 -0.272 1.004
b[18] -0.500 0.291 -0.991 -0.694 -0.546 -0.330 0.141 1.003
b[19] -0.741 0.236 -1.239 -0.890 -0.720 -0.586 -0.292 1.004
b[20] -0.726 0.326 -1.466 -0.891 -0.699 -0.542 -0.078 1.005
b[21] -0.651 0.298 -1.293 -0.810 -0.656 -0.483 -0.011 1.002
b[22] -0.571 0.316 -1.198 -0.764 -0.587 -0.394 0.106 1.003
b[23] -0.643 0.315 -1.340 -0.822 -0.652 -0.468 -0.018 1.002
b[24] -0.731 0.292 -1.355 -0.911 -0.725 -0.567 -0.144 1.002
b[25] -0.524 0.305 -1.062 -0.724 -0.568 -0.348 0.128 1.007
b[26] -0.745 0.179 -1.109 -0.857 -0.734 -0.624 -0.416 1.001
b[27] -0.650 0.290 -1.205 -0.822 -0.655 -0.498 -0.026 1.003
b[28] -0.642 0.279 -1.195 -0.801 -0.654 -0.489 -0.039 1.000
b[29] -0.625 0.328 -1.247 -0.815 -0.647 -0.455 0.087 1.007
b[30] -0.601 0.345 -1.268 -0.803 -0.610 -0.412 0.119 1.007
b[31] -0.752 0.327 -1.483 -0.919 -0.730 -0.565 -0.124 1.013
b[32] -0.662 0.312 -1.324 -0.839 -0.657 -0.504 -0.016 1.004
b[33] -0.755 0.350 -1.568 -0.912 -0.719 -0.568 -0.082 1.002
b[34] -0.739 0.289 -1.402 -0.906 -0.712 -0.568 -0.166 1.000
b[35] -0.550 0.278 -1.068 -0.719 -0.566 -0.392 0.067 1.006
b[36] -0.582 0.331 -1.185 -0.788 -0.608 -0.380 0.163 1.004
b[37] -0.497 0.341 -1.111 -0.708 -0.522 -0.296 0.227 1.013
b[38] -0.625 0.304 -1.229 -0.802 -0.643 -0.450 0.046 1.006
b[39] -0.631 0.301 -1.253 -0.805 -0.648 -0.481 0.010 1.003
b[40] -0.724 0.310 -1.371 -0.898 -0.708 -0.544 -0.083 1.003
b[41] -0.625 0.311 -1.234 -0.805 -0.637 -0.456 0.028 1.000
b[42] -0.668 0.334 -1.329 -0.863 -0.680 -0.493 0.047 1.006
b[43] -1.089 0.364 -1.897 -1.320 -1.034 -0.803 -0.551 1.003
b[44] -0.791 0.321 -1.558 -0.964 -0.745 -0.590 -0.206 1.014
b[45] -0.763 0.256 -1.321 -0.922 -0.747 -0.603 -0.275 1.005
b[46] -0.640 0.317 -1.287 -0.820 -0.646 -0.475 0.076 1.000
b[47] -0.839 0.336 -1.600 -1.014 -0.790 -0.630 -0.256 1.005
b[48] -0.562 0.305 -1.146 -0.740 -0.577 -0.403 0.168 1.004
b[49] -0.914 0.298 -1.616 -1.079 -0.872 -0.705 -0.440 1.019
b[50] -0.719 0.322 -1.412 -0.894 -0.706 -0.536 -0.087 1.008
b[51] -0.755 0.342 -1.477 -0.955 -0.732 -0.556 -0.088 1.003
b[52] -0.733 0.335 -1.437 -0.929 -0.728 -0.546 -0.027 1.002
b[53] -0.698 0.308 -1.364 -0.872 -0.696 -0.528 -0.047 1.002
b[54] -0.843 0.273 -1.457 -0.995 -0.814 -0.651 -0.386 1.008
b[55] -0.598 0.261 -1.069 -0.774 -0.618 -0.455 -0.013 1.000
b[56] -0.815 0.320 -1.557 -1.002 -0.775 -0.604 -0.297 1.003
b[57] -0.620 0.314 -1.248 -0.804 -0.626 -0.444 0.037 1.001
b[58] -0.640 0.308 -1.262 -0.828 -0.660 -0.466 0.023 1.008
b[59] -0.777 0.309 -1.440 -0.945 -0.749 -0.598 -0.236 1.002
b[60] -0.656 0.331 -1.351 -0.823 -0.660 -0.493 0.093 1.001
b[61] -0.407 0.286 -0.908 -0.616 -0.443 -0.222 0.206 1.003
b[62] -0.427 0.365 -0.956 -0.675 -0.503 -0.234 0.441 1.004
b[63] -0.544 0.323 -1.140 -0.737 -0.587 -0.365 0.176 1.002
b[64] -0.774 0.313 -1.509 -0.941 -0.739 -0.594 -0.209 1.004
b[65] -0.672 0.340 -1.372 -0.842 -0.682 -0.506 0.067 1.006
b[66] -0.631 0.243 -1.106 -0.783 -0.643 -0.488 -0.091 1.001
b[67] -0.453 0.294 -0.959 -0.647 -0.495 -0.290 0.193 1.005
b[68] -0.631 0.319 -1.320 -0.797 -0.642 -0.443 0.035 1.002
b[69] -0.653 0.349 -1.383 -0.828 -0.657 -0.481 0.145 1.006
b[70] -0.580 0.172 -0.911 -0.690 -0.584 -0.473 -0.229 1.004
b[71] -0.777 0.238 -1.286 -0.909 -0.752 -0.624 -0.325 1.003
b[72] -0.702 0.319 -1.401 -0.866 -0.690 -0.537 -0.035 1.000
b[73] -0.705 0.327 -1.390 -0.898 -0.689 -0.516 -0.066 1.008
b[74] -0.619 0.335 -1.311 -0.809 -0.635 -0.430 0.132 1.004
b[75] -0.543 0.312 -1.092 -0.746 -0.573 -0.369 0.141 1.001
b[76] -0.728 0.292 -1.348 -0.890 -0.711 -0.560 -0.145 1.005
b[77] -0.849 0.313 -1.545 -1.029 -0.809 -0.638 -0.315 1.007
b[78] -0.729 0.280 -1.334 -0.891 -0.712 -0.566 -0.187 1.001
b[79] -0.675 0.310 -1.279 -0.837 -0.669 -0.493 -0.070 1.012
b[80] -0.799 0.238 -1.329 -0.939 -0.771 -0.642 -0.407 1.004
b[81] -0.549 0.334 -1.128 -0.762 -0.582 -0.370 0.175 1.007
b[82] -0.716 0.328 -1.433 -0.898 -0.698 -0.544 -0.067 1.001
b[83] -1.046 0.352 -1.801 -1.270 -0.991 -0.776 -0.529 1.018
b[84] -0.709 0.293 -1.325 -0.872 -0.701 -0.541 -0.132 1.002
b[85] -0.665 0.323 -1.309 -0.842 -0.657 -0.492 -0.025 1.005
mu.a 1.458 0.055 1.351 1.424 1.457 1.494 1.563 1.000
mu.b -0.674 0.086 -0.843 -0.737 -0.674 -0.615 -0.502 1.000
rho -0.270 0.341 -0.841 -0.503 -0.304 -0.064 0.539 1.016
sigma.a 0.350 0.050 0.262 0.314 0.346 0.381 0.456 1.002
sigma.b 0.297 0.138 0.057 0.190 0.298 0.399 0.571 1.028
sigma.y 0.750 0.019 0.713 0.737 0.750 0.764 0.787 1.010
deviance 2077.730 17.685 2041.947 2065.836 2078.274 2089.506 2112.228 1.009
n.eff
a[1] 780
a[2] 1200
a[3] 1200
a[4] 430
a[5] 1200
a[6] 1200
a[7] 460
a[8] 1200
a[9] 950
a[10] 330
a[11] 820
a[12] 1200
a[13] 520
a[14] 1200
a[15] 1200
a[16] 660
a[17] 1200
a[18] 340
a[19] 1200
a[20] 760
a[21] 1200
a[22] 490
a[23] 1000
a[24] 1200
a[25] 830
a[26] 1200
a[27] 1200
a[28] 1200
a[29] 1200
a[30] 960
a[31] 1200
a[32] 1200
a[33] 540
a[34] 1200
a[35] 1200
a[36] 400
a[37] 1200
a[38] 610
a[39] 670
a[40] 1200
a[41] 1200
a[42] 1200
a[43] 1200
a[44] 1200
a[45] 1200
a[46] 1200
a[47] 640
a[48] 570
a[49] 260
a[50] 1200
a[51] 1200
a[52] 400
a[53] 780
a[54] 300
a[55] 1200
a[56] 1000
a[57] 1100
a[58] 860
a[59] 1200
a[60] 1200
a[61] 900
a[62] 1100
a[63] 1200
a[64] 770
a[65] 910
a[66] 780
a[67] 1200
a[68] 590
a[69] 1200
a[70] 1200
a[71] 1200
a[72] 1200
a[73] 950
a[74] 680
a[75] 1100
a[76] 470
a[77] 1200
a[78] 1200
a[79] 1200
a[80] 1200
a[81] 1200
a[82] 410
a[83] 760
a[84] 960
a[85] 510
b[1] 440
b[2] 1100
b[3] 1200
b[4] 400
b[5] 550
b[6] 1200
b[7] 1200
b[8] 1200
b[9] 170
b[10] 1200
b[11] 1200
b[12] 260
b[13] 1200
b[14] 370
b[15] 1200
b[16] 1200
b[17] 540
b[18] 1100
b[19] 430
b[20] 900
b[21] 1200
b[22] 610
b[23] 1100
b[24] 1200
b[25] 500
b[26] 1200
b[27] 1200
b[28] 1200
b[29] 490
b[30] 470
b[31] 180
b[32] 1200
b[33] 1100
b[34] 1200
b[35] 310
b[36] 490
b[37] 230
b[38] 470
b[39] 1200
b[40] 840
b[41] 1200
b[42] 510
b[43] 690
b[44] 190
b[45] 560
b[46] 1200
b[47] 500
b[48] 680
b[49] 110
b[50] 730
b[51] 1200
b[52] 1200
b[53] 950
b[54] 410
b[55] 1200
b[56] 690
b[57] 1200
b[58] 240
b[59] 1200
b[60] 1200
b[61] 580
b[62] 690
b[63] 1200
b[64] 1200
b[65] 490
b[66] 1200
b[67] 1100
b[68] 1200
b[69] 650
b[70] 540
b[71] 720
b[72] 1200
b[73] 1200
b[74] 880
b[75] 1200
b[76] 460
b[77] 280
b[78] 1200
b[79] 310
b[80] 500
b[81] 420
b[82] 1200
b[83] 120
b[84] 1200
b[85] 430
mu.a 1200
mu.b 1200
rho 160
sigma.a 1200
sigma.b 100
sigma.y 200
deviance 230
For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
DIC info (using the rule: pV = var(deviance)/2)
pV = 155.3 and DIC = 2233.0
DIC is an estimate of expected predictive error (lower deviance is better).
jags_result <- jags_fit$BUGSoutput$summary
7.1.3 Including unmodeled individual-level coefficients
For predictors whose coefficients do not vary across groups, we can include a vector of non-hierarchical coefficients \(\beta^0\) associated with a matrix \(X^0\):
\[ y_i = X_i^0 \beta^0 + X_i B_{j[i]} + \epsilon_i \]
In JAGS, this part of the model is written as:
y.hat[i] <- inprod(b.0[], X.0[i,]) + inprod(B[county[i],], X[i,])
and the non-hierarchical parameters are given a noninformative prior:
for (k in 1:K.0){
b.0[k] ~ dnorm(0, .0001)
}
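JAGS's inprod() is the ordinary inner product; in plain R, inprod(b.0[], X.0[i,]) corresponds to sum(b.0 * X.0[i, ]). A minimal sketch with toy numbers (not output from any model here):

```r
# Toy coefficients and a toy 2x2 design matrix X.0 (intercept + one predictor)
b.0 <- c(0.5, -1.2)
X.0 <- matrix(c(1, 0.3,
                1, 0.8), nrow = 2, byrow = TRUE)

# All fitted contributions at once, via matrix multiplication
drop(X.0 %*% b.0)

# The same quantity for observation i = 1, written as inprod does it
sum(b.0 * X.0[1, ])
```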
7.2 Group-level predictors in models with varying intercepts and slopes
We can add group-level predictors by replacing the group means with group-level regressions:
a.hat[j] <- g.a.0 + g.a.1*u[j]
b.hat[j] <- g.b.0 + g.b.1*u[j]
The priors on \(\mu_a\) and \(\mu_b\) are removed and replaced by noninformative priors on g.a.0, g.a.1, g.b.0, and g.b.1, e.g. g.a.0 ~ dnorm(0, .0001) (and likewise for the other three), where the second argument of dnorm is a precision.
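As a quick check of what such a prior means on the familiar standard-deviation scale: JAGS parameterizes dnorm by precision (1/variance), so a precision of .0001 is a very wide normal.

```r
# dnorm(0, .0001) in JAGS: precision 1e-4, i.e. sd = 1/sqrt(1e-4) = 100
prec <- 1e-4
sd.prior <- 1 / sqrt(prec)
sd.prior  # effectively flat over any plausible coefficient range here
```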
Matrix version with \(B\)
In the matrix version, which combines intercepts and slopes:
\[ B[j,1] = a_j,\quad B[j,2] = b_j \]
and the fitted values are expressed as:
\[ y_i \sim \text{Normal}\big(X_i B_{j[i]},\ \sigma_y^2\big) \]
R code (non-matrix version with group-level predictors):
# Prepare the data for JAGS
data_jags <- list(y = y, x = x, county = county, n = n, J = J, u = u.full)
# Initial values
inits <- function() {
list(B = array(rnorm(2 * J), dim = c(J, 2)),
g.a.0 = rnorm(1), g.a.1 = rnorm(1),
g.b.0 = rnorm(1), g.b.1 = rnorm(1),
sigma.y = runif(1), sigma.a = runif(1), sigma.b = runif(1), rho = runif(1, -1, 1))
}
# Parameters to monitor
params <- c("a", "b", "g.a.0", "g.a.1", "g.b.0", "g.b.1", "sigma.y", "sigma.a", "sigma.b", "rho")
# Run JAGS model
jags_fit <- jags(data = data_jags,
inits = inits,
parameters.to.save = params,
model.file = "codigoJAGS/radon_model_slope3.jags",
n.chains = 3,
n.iter = 5000,
n.burnin = 1000,
n.thin = 10)
Compiling model graph
Resolving undeclared variables
Allocating nodes
Graph information:
Observed stochastic nodes: 919
Unobserved stochastic nodes: 93
Total graph size: 4357
Initializing model
# Check the summary
print(jags_fit)
Inference for Bugs model at "codigoJAGS/radon_model_slope3.jags", fit using jags,
3 chains, each with 5000 iterations (first 1000 discarded), n.thin = 10
n.sims = 1200 iterations saved. Running time = 35.705 secs
mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat
a[1] 1.141 0.284 0.585 0.953 1.137 1.326 1.706 1.005
a[2] 0.938 0.102 0.735 0.868 0.938 1.010 1.128 1.001
a[3] 1.471 0.298 0.891 1.283 1.467 1.671 2.067 1.002
a[4] 1.534 0.246 1.048 1.370 1.529 1.692 2.028 1.006
a[5] 1.415 0.266 0.902 1.242 1.416 1.585 1.936 1.000
a[6] 1.469 0.280 0.902 1.282 1.483 1.663 1.981 1.003
a[7] 1.833 0.185 1.463 1.704 1.837 1.956 2.193 1.002
a[8] 1.694 0.290 1.134 1.500 1.702 1.888 2.268 1.003
a[9] 1.111 0.206 0.701 0.971 1.120 1.261 1.495 1.000
a[10] 1.543 0.260 1.034 1.377 1.547 1.712 2.066 1.001
a[11] 1.424 0.244 0.921 1.268 1.425 1.583 1.895 1.000
a[12] 1.575 0.261 1.083 1.386 1.573 1.744 2.084 1.001
a[13] 1.214 0.236 0.773 1.056 1.208 1.368 1.701 1.001
a[14] 1.860 0.192 1.495 1.727 1.862 1.982 2.247 1.001
a[15] 1.389 0.269 0.848 1.203 1.394 1.564 1.927 1.001
a[16] 1.198 0.298 0.584 0.990 1.215 1.413 1.756 1.020
a[17] 1.396 0.302 0.848 1.189 1.396 1.587 1.991 1.013
a[18] 1.158 0.197 0.766 1.029 1.163 1.290 1.547 1.002
a[19] 1.346 0.095 1.166 1.279 1.349 1.412 1.525 1.007
a[20] 1.591 0.288 1.029 1.401 1.587 1.783 2.161 1.003
a[21] 1.630 0.205 1.234 1.501 1.631 1.760 2.037 1.000
a[22] 0.979 0.240 0.519 0.805 0.982 1.149 1.433 1.000
a[23] 1.433 0.315 0.817 1.239 1.438 1.637 2.054 1.011
a[24] 1.876 0.217 1.469 1.729 1.870 2.017 2.306 1.004
a[25] 1.775 0.183 1.386 1.654 1.782 1.899 2.119 1.007
a[26] 1.369 0.075 1.222 1.319 1.371 1.420 1.515 1.005
a[27] 1.619 0.244 1.162 1.455 1.605 1.791 2.085 1.001
a[28] 1.314 0.279 0.769 1.115 1.306 1.493 1.855 1.000
a[29] 1.291 0.282 0.706 1.108 1.303 1.477 1.817 1.000
a[30] 1.090 0.191 0.697 0.963 1.090 1.214 1.459 1.000
a[31] 1.742 0.246 1.270 1.576 1.733 1.900 2.233 1.000
a[32] 1.347 0.245 0.867 1.184 1.351 1.506 1.829 1.004
a[33] 1.736 0.268 1.224 1.561 1.726 1.910 2.272 1.010
a[34] 1.503 0.307 0.912 1.282 1.501 1.703 2.155 1.001
a[35] 1.007 0.268 0.482 0.836 1.010 1.195 1.504 1.001
a[36] 1.847 0.332 1.228 1.621 1.844 2.055 2.526 1.001
a[37] 0.750 0.226 0.303 0.602 0.753 0.905 1.182 1.003
a[38] 1.607 0.279 1.061 1.416 1.598 1.792 2.178 1.000
a[39] 1.590 0.253 1.115 1.417 1.576 1.754 2.121 1.000
a[40] 1.856 0.267 1.356 1.677 1.845 2.040 2.393 1.006
a[41] 1.766 0.219 1.363 1.611 1.761 1.920 2.210 1.000
a[42] 1.429 0.317 0.806 1.216 1.425 1.640 2.037 1.004
a[43] 1.734 0.274 1.228 1.541 1.720 1.917 2.304 1.013
a[44] 1.244 0.233 0.800 1.079 1.252 1.403 1.691 1.005
a[45] 1.352 0.192 0.962 1.230 1.356 1.484 1.716 1.001
a[46] 1.339 0.242 0.851 1.180 1.345 1.495 1.805 1.002
a[47] 1.351 0.317 0.746 1.135 1.354 1.573 1.961 1.000
a[48] 1.225 0.208 0.811 1.092 1.227 1.366 1.627 1.001
a[49] 1.668 0.188 1.290 1.538 1.672 1.794 2.017 1.001
a[50] 1.638 0.326 1.007 1.422 1.640 1.854 2.316 1.002
a[51] 1.786 0.261 1.290 1.609 1.778 1.960 2.335 1.004
a[52] 1.634 0.282 1.106 1.458 1.628 1.812 2.210 1.000
a[53] 1.367 0.293 0.770 1.172 1.374 1.560 1.929 1.004
a[54] 1.358 0.148 1.075 1.257 1.356 1.457 1.647 1.006
a[55] 1.517 0.221 1.086 1.373 1.512 1.663 1.944 1.000
a[56] 1.369 0.307 0.769 1.159 1.365 1.572 1.989 1.002
a[57] 1.071 0.243 0.572 0.917 1.074 1.241 1.527 1.000
a[58] 1.650 0.265 1.138 1.473 1.653 1.827 2.178 1.001
a[59] 1.643 0.278 1.111 1.448 1.644 1.832 2.184 1.007
a[60] 1.400 0.296 0.776 1.214 1.412 1.600 1.970 1.005
a[61] 1.168 0.128 0.913 1.087 1.166 1.251 1.415 1.005
a[62] 1.664 0.258 1.153 1.496 1.661 1.837 2.175 1.001
a[63] 1.497 0.282 0.945 1.311 1.493 1.681 2.064 1.003
a[64] 1.736 0.198 1.346 1.601 1.746 1.865 2.123 1.002
a[65] 1.420 0.290 0.860 1.230 1.421 1.618 1.997 1.000
a[66] 1.574 0.206 1.174 1.430 1.577 1.712 1.984 1.002
a[67] 1.646 0.197 1.244 1.519 1.651 1.772 2.026 1.005
a[68] 1.226 0.212 0.807 1.082 1.232 1.375 1.622 1.003
a[69] 1.366 0.266 0.838 1.188 1.369 1.544 1.875 1.005
a[70] 0.875 0.073 0.726 0.828 0.877 0.921 1.019 1.003
a[71] 1.510 0.146 1.236 1.410 1.511 1.609 1.791 1.007
a[72] 1.563 0.197 1.185 1.432 1.557 1.700 1.943 1.001
a[73] 1.581 0.329 0.904 1.362 1.588 1.803 2.224 1.002
a[74] 1.256 0.253 0.771 1.090 1.252 1.412 1.770 1.001
a[75] 1.550 0.293 0.951 1.337 1.568 1.751 2.104 1.000
a[76] 1.743 0.287 1.239 1.543 1.742 1.921 2.360 1.003
a[77] 1.716 0.227 1.293 1.566 1.719 1.870 2.158 1.004
a[78] 1.430 0.265 0.910 1.248 1.430 1.612 1.933 1.003
a[79] 1.111 0.289 0.539 0.922 1.105 1.294 1.663 1.004
a[80] 1.365 0.108 1.144 1.294 1.367 1.435 1.580 1.002
a[81] 1.921 0.332 1.283 1.694 1.910 2.143 2.562 1.003
a[82] 1.627 0.335 0.956 1.407 1.623 1.855 2.279 1.000
a[83] 1.680 0.201 1.268 1.546 1.685 1.813 2.070 1.012
a[84] 1.610 0.182 1.252 1.492 1.609 1.731 1.964 1.004
a[85] 1.406 0.310 0.791 1.201 1.400 1.623 2.021 1.002
b[1] -0.572 0.305 -1.131 -0.758 -0.601 -0.403 0.104 1.012
b[2] -0.748 0.270 -1.312 -0.902 -0.734 -0.578 -0.241 1.018
b[3] -0.671 0.290 -1.238 -0.841 -0.687 -0.515 -0.043 1.011
b[4] -0.723 0.265 -1.267 -0.866 -0.721 -0.563 -0.180 1.014
b[5] -0.604 0.311 -1.238 -0.782 -0.624 -0.425 0.027 1.016
b[6] -0.646 0.330 -1.319 -0.826 -0.662 -0.467 0.060 1.020
b[7] -0.502 0.311 -1.036 -0.716 -0.543 -0.327 0.267 1.015
b[8] -0.683 0.284 -1.233 -0.852 -0.681 -0.510 -0.132 1.012
b[9] -0.429 0.330 -0.949 -0.658 -0.477 -0.238 0.320 1.030
b[10] -0.750 0.276 -1.363 -0.912 -0.734 -0.577 -0.233 1.021
b[11] -0.649 0.325 -1.263 -0.838 -0.660 -0.463 0.078 1.013
b[12] -0.663 0.317 -1.284 -0.838 -0.660 -0.489 0.003 1.017
b[13] -0.600 0.322 -1.296 -0.777 -0.616 -0.413 0.072 1.023
b[14] -0.783 0.261 -1.345 -0.945 -0.776 -0.615 -0.289 1.021
b[15] -0.637 0.291 -1.234 -0.808 -0.648 -0.466 -0.013 1.015
b[16] -0.580 0.338 -1.246 -0.767 -0.612 -0.395 0.161 1.016
b[17] -0.766 0.299 -1.447 -0.934 -0.733 -0.580 -0.212 1.024
b[18] -0.488 0.280 -0.971 -0.684 -0.522 -0.315 0.150 1.031
b[19] -0.731 0.243 -1.291 -0.861 -0.719 -0.577 -0.268 1.017
b[20] -0.671 0.324 -1.373 -0.840 -0.679 -0.479 -0.019 1.012
b[21] -0.640 0.286 -1.183 -0.809 -0.651 -0.474 -0.043 1.015
b[22] -0.560 0.315 -1.159 -0.755 -0.577 -0.393 0.106 1.021
b[23] -0.627 0.307 -1.211 -0.798 -0.642 -0.469 0.026 1.032
b[24] -0.716 0.290 -1.336 -0.888 -0.715 -0.540 -0.153 1.015
b[25] -0.499 0.302 -1.027 -0.713 -0.529 -0.329 0.178 1.018
b[26] -0.750 0.184 -1.134 -0.866 -0.743 -0.629 -0.391 1.024
b[27] -0.626 0.281 -1.205 -0.790 -0.629 -0.459 -0.077 1.016
b[28] -0.615 0.279 -1.171 -0.781 -0.622 -0.463 -0.022 1.014
b[29] -0.617 0.335 -1.287 -0.810 -0.639 -0.447 0.118 1.019
b[30] -0.568 0.348 -1.291 -0.769 -0.594 -0.378 0.172 1.029
b[31] -0.717 0.321 -1.408 -0.888 -0.710 -0.541 -0.075 1.017
b[32] -0.629 0.322 -1.284 -0.802 -0.637 -0.453 0.064 1.021
b[33] -0.719 0.315 -1.412 -0.898 -0.707 -0.531 -0.059 1.016
b[34] -0.685 0.297 -1.324 -0.853 -0.673 -0.505 -0.082 1.016
b[35] -0.535 0.283 -1.062 -0.711 -0.556 -0.367 0.064 1.021
b[36] -0.572 0.353 -1.218 -0.798 -0.605 -0.362 0.200 1.016
b[37] -0.488 0.334 -1.126 -0.706 -0.519 -0.290 0.206 1.026
b[38] -0.612 0.297 -1.176 -0.806 -0.618 -0.440 0.043 1.011
b[39] -0.604 0.297 -1.185 -0.785 -0.622 -0.426 0.016 1.022
b[40] -0.724 0.321 -1.386 -0.908 -0.717 -0.536 -0.044 1.020
b[41] -0.600 0.307 -1.159 -0.801 -0.627 -0.426 0.087 1.015
b[42] -0.647 0.326 -1.335 -0.838 -0.653 -0.452 0.048 1.020
b[43] -1.077 0.353 -1.873 -1.294 -1.022 -0.807 -0.553 1.067
b[44] -0.781 0.311 -1.510 -0.926 -0.751 -0.589 -0.253 1.028
b[45] -0.759 0.258 -1.322 -0.899 -0.742 -0.606 -0.278 1.027
b[46] -0.618 0.328 -1.278 -0.801 -0.632 -0.442 0.044 1.024
b[47] -0.827 0.324 -1.592 -0.995 -0.794 -0.617 -0.260 1.029
b[48] -0.527 0.301 -1.067 -0.714 -0.559 -0.356 0.153 1.040
b[49] -0.870 0.291 -1.523 -1.035 -0.835 -0.676 -0.372 1.058
b[50] -0.699 0.339 -1.478 -0.886 -0.700 -0.496 -0.029 1.033
b[51] -0.739 0.323 -1.429 -0.915 -0.733 -0.550 -0.087 1.022
b[52] -0.707 0.331 -1.358 -0.890 -0.703 -0.522 -0.027 1.021
b[53] -0.647 0.308 -1.282 -0.825 -0.653 -0.474 -0.023 1.010
b[54] -0.833 0.278 -1.425 -1.007 -0.801 -0.647 -0.343 1.038
b[55] -0.580 0.272 -1.089 -0.755 -0.597 -0.417 0.004 1.015
b[56] -0.790 0.320 -1.503 -0.959 -0.756 -0.590 -0.182 1.040
b[57] -0.681 0.320 -1.297 -0.857 -0.686 -0.505 -0.028 1.015
b[58] -0.712 0.303 -1.330 -0.895 -0.715 -0.539 -0.066 1.009
b[59] -0.843 0.302 -1.512 -1.014 -0.827 -0.654 -0.296 1.027
b[60] -0.679 0.322 -1.329 -0.848 -0.685 -0.522 0.001 1.022
b[61] -0.424 0.294 -0.901 -0.627 -0.463 -0.240 0.223 1.046
b[62] -0.470 0.342 -1.026 -0.703 -0.525 -0.272 0.346 1.048
b[63] -0.566 0.310 -1.123 -0.764 -0.594 -0.395 0.134 1.016
b[64] -0.756 0.304 -1.424 -0.931 -0.736 -0.569 -0.168 1.020
b[65] -0.677 0.337 -1.392 -0.862 -0.687 -0.502 0.045 1.028
b[66] -0.641 0.240 -1.102 -0.798 -0.659 -0.491 -0.137 1.007
b[67] -0.510 0.291 -0.980 -0.716 -0.550 -0.340 0.151 1.058
b[68] -0.672 0.333 -1.320 -0.864 -0.688 -0.492 0.070 1.019
b[69] -0.721 0.347 -1.425 -0.919 -0.719 -0.530 0.042 1.023
b[70] -0.604 0.173 -0.924 -0.723 -0.616 -0.491 -0.264 1.014
b[71] -0.871 0.263 -1.412 -1.036 -0.862 -0.688 -0.393 1.017
b[72] -0.829 0.357 -1.552 -1.027 -0.825 -0.616 -0.094 1.009
b[73] -0.829 0.362 -1.604 -1.045 -0.812 -0.606 -0.104 1.019
b[74] -0.745 0.369 -1.484 -0.941 -0.751 -0.551 0.017 1.022
b[75] -0.666 0.339 -1.261 -0.887 -0.699 -0.459 0.053 1.026
b[76] -0.837 0.337 -1.530 -1.035 -0.822 -0.639 -0.177 1.012
b[77] -0.944 0.340 -1.750 -1.123 -0.920 -0.735 -0.312 1.019
b[78] -0.834 0.307 -1.480 -1.035 -0.819 -0.626 -0.263 1.017
b[79] -0.766 0.334 -1.472 -0.951 -0.765 -0.569 -0.084 1.002
b[80] -0.876 0.239 -1.401 -1.022 -0.858 -0.712 -0.443 1.013
b[81] -0.654 0.361 -1.290 -0.899 -0.683 -0.438 0.169 1.020
b[82] -0.816 0.365 -1.592 -1.024 -0.803 -0.595 -0.113 1.009
b[83] -1.126 0.358 -1.908 -1.363 -1.086 -0.868 -0.541 1.083
b[84] -0.818 0.302 -1.465 -1.000 -0.815 -0.628 -0.235 1.006
b[85] -0.767 0.363 -1.511 -0.974 -0.760 -0.556 -0.008 1.006
g.a.0 1.485 0.086 1.322 1.429 1.485 1.547 1.640 1.002
g.a.1 0.039 0.118 -0.196 -0.041 0.038 0.121 0.265 1.001
g.b.0 -0.761 0.135 -1.036 -0.847 -0.758 -0.679 -0.496 1.004
g.b.1 -0.131 0.193 -0.518 -0.258 -0.128 -0.002 0.227 1.007
rho -0.272 0.340 -0.857 -0.517 -0.300 -0.073 0.486 1.017
sigma.a 0.356 0.053 0.260 0.320 0.353 0.392 0.469 1.006
sigma.b 0.293 0.133 0.061 0.194 0.292 0.387 0.561 1.266
sigma.y 0.749 0.018 0.713 0.736 0.749 0.761 0.786 1.008
deviance 2077.664 18.229 2042.457 2065.744 2078.112 2089.918 2112.759 1.060
n.eff
a[1] 560
a[2] 1200
a[3] 1000
a[4] 650
a[5] 1200
a[6] 1200
a[7] 960
a[8] 710
a[9] 1200
a[10] 1200
a[11] 1200
a[12] 1200
a[13] 1200
a[14] 1200
a[15] 1200
a[16] 510
a[17] 160
a[18] 1200
a[19] 290
a[20] 1200
a[21] 1200
a[22] 1200
a[23] 650
a[24] 530
a[25] 540
a[26] 370
a[27] 1200
a[28] 1200
a[29] 1200
a[30] 1200
a[31] 1200
a[32] 510
a[33] 200
a[34] 1200
a[35] 1200
a[36] 1200
a[37] 660
a[38] 1200
a[39] 1200
a[40] 460
a[41] 1200
a[42] 1200
a[43] 150
a[44] 990
a[45] 1200
a[46] 970
a[47] 1200
a[48] 1200
a[49] 1200
a[50] 1000
a[51] 510
a[52] 1200
a[53] 500
a[54] 310
a[55] 1200
a[56] 890
a[57] 1200
a[58] 1200
a[59] 270
a[60] 540
a[61] 390
a[62] 1200
a[63] 750
a[64] 960
a[65] 1200
a[66] 980
a[67] 350
a[68] 840
a[69] 480
a[70] 720
a[71] 280
a[72] 1200
a[73] 820
a[74] 1200
a[75] 1200
a[76] 640
a[77] 460
a[78] 1200
a[79] 630
a[80] 1100
a[81] 660
a[82] 1200
a[83] 190
a[84] 500
a[85] 1200
b[1] 290
b[2] 160
b[3] 830
b[4] 370
b[5] 450
b[6] 270
b[7] 150
b[8] 1200
b[9] 100
b[10] 220
b[11] 1200
b[12] 410
b[13] 290
b[14] 450
b[15] 480
b[16] 470
b[17] 140
b[18] 88
b[19] 190
b[20] 1200
b[21] 760
b[22] 360
b[23] 320
b[24] 1200
b[25] 140
b[26] 98
b[27] 1200
b[28] 1200
b[29] 1200
b[30] 400
b[31] 900
b[32] 780
b[33] 1200
b[34] 430
b[35] 150
b[36] 300
b[37] 250
b[38] 330
b[39] 330
b[40] 810
b[41] 370
b[42] 1000
b[43] 35
b[44] 100
b[45] 150
b[46] 350
b[47] 100
b[48] 79
b[49] 57
b[50] 180
b[51] 720
b[52] 910
b[53] 1200
b[54] 70
b[55] 320
b[56] 82
b[57] 1200
b[58] 1200
b[59] 200
b[60] 570
b[61] 55
b[62] 59
b[63] 420
b[64] 350
b[65] 1200
b[66] 710
b[67] 41
b[68] 690
b[69] 230
b[70] 250
b[71] 250
b[72] 1000
b[73] 570
b[74] 840
b[75] 290
b[76] 480
b[77] 130
b[78] 260
b[79] 1200
b[80] 200
b[81] 170
b[82] 440
b[83] 30
b[84] 1200
b[85] 1200
g.a.0 790
g.a.1 1200
g.b.0 540
g.b.1 340
rho 200
sigma.a 330
sigma.b 13
sigma.y 250
deviance 38
For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
DIC info (using the rule: pV = var(deviance)/2)
pV = 157.4 and DIC = 2235.1
DIC is an estimate of expected predictive error (lower deviance is better).
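The model file codigoJAGS/radon_model_slope3.jags is referenced but not reproduced in the text. A plausible sketch, consistent with the data list, the initial values (which supply B, the g coefficients, the scale parameters, and rho), and the monitored parameters above — an assumption, not necessarily the verbatim file:

```jags
model {
  for (i in 1:n){
    y[i] ~ dnorm(y.hat[i], tau.y)
    y.hat[i] <- a[county[i]] + b[county[i]] * x[i]
  }
  tau.y <- pow(sigma.y, -2)
  sigma.y ~ dunif(0, 100)

  for (j in 1:J){
    a[j] <- B[j,1]
    b[j] <- B[j,2]
    # bivariate normal for (a_j, b_j) with group-level regressions on u[j]
    B[j,1:2] ~ dmnorm(B.hat[j,], Tau.B[,])
    B.hat[j,1] <- g.a.0 + g.a.1 * u[j]
    B.hat[j,2] <- g.b.0 + g.b.1 * u[j]
  }
  g.a.0 ~ dnorm(0, .0001)
  g.a.1 ~ dnorm(0, .0001)
  g.b.0 ~ dnorm(0, .0001)
  g.b.1 ~ dnorm(0, .0001)

  # covariance matrix built from sigma.a, sigma.b, rho
  Tau.B[1:2,1:2] <- inverse(Sigma.B[,])
  Sigma.B[1,1] <- pow(sigma.a, 2)
  Sigma.B[2,2] <- pow(sigma.b, 2)
  Sigma.B[1,2] <- rho * sigma.a * sigma.b
  Sigma.B[2,1] <- Sigma.B[1,2]
  sigma.a ~ dunif(0, 100)
  sigma.b ~ dunif(0, 100)
  rho ~ dunif(-1, 1)
}
```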
7.3 Non-nested models
In this experiment there are n = 40 observations, J = 5 treatment conditions, and K = 8 airports. The model assumes a non-nested structure for the response:
\[ y_i \sim N(\mu + \gamma_{j[i]} + \delta_{k[i]}, \sigma_y^2), \quad \text{para } i = 1, \dots, n \]
where:
\[ \gamma_j \sim N(0, \sigma_\gamma^2), \quad j = 1, \dots, J \]
\[ \delta_k \sim N(0, \sigma_\delta^2), \quad k = 1, \dots, K \]
The model parameters are:
- \(\mu\): overall intercept.
- \(\gamma_j\): treatment effect for each condition, centered at zero to avoid redundancy with \(\mu\).
- \(\delta_k\): airport effect, also centered at zero.
This model includes independent treatment and airport effects to capture their specific contributions to the total variability.
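The corresponding JAGS specification — a sketch of what codigoJAGS/pilots.jags plausibly contains, matching the data names and parameters monitored below; the actual file is not shown in the text:

```jags
model {
  for (i in 1:n){
    y[i] ~ dnorm(y.hat[i], tau.y)
    # non-nested structure: treatment and airport effects enter additively
    y.hat[i] <- mu + gamma[treatment[i]] + delta[airport[i]]
  }
  tau.y <- pow(sigma.y, -2)
  sigma.y ~ dunif(0, 100)
  mu ~ dnorm(0, .0001)

  for (j in 1:n.treatment){
    gamma[j] ~ dnorm(0, tau.gamma)   # centered at zero
  }
  tau.gamma <- pow(sigma.gamma, -2)
  sigma.gamma ~ dunif(0, 100)

  for (k in 1:n.airport){
    delta[k] ~ dnorm(0, tau.delta)   # centered at zero
  }
  tau.delta <- pow(sigma.delta, -2)
  sigma.delta ~ dunif(0, 100)
}
```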
7.3.1 Loading and preparing the data
The data are grouped at two distinct levels: by treatment and by airport. The following R code processes the data from the flight-simulator experiment:
library(arm)
Loading required package: MASS
Loading required package: Matrix
Loading required package: lme4
arm (Version 1.14-4, built: 2024-4-1)
Working directory is /home/luis/Dropbox/Cursos/Actuales/SP1653_2025/NotasClase/ModelosMixtos
Attaching package: 'arm'
The following object is masked from 'package:R2jags':
traceplot
The following object is masked from 'package:coda':
traceplot
pilots <- read.table("../ARM_Data/pilots/pilots.dat", header = TRUE)
attach(pilots)
group.names <- as.vector(unique(group))
scenario.names <- as.vector(unique(scenario))
n.group <- length(group.names)
n.scenario <- length(scenario.names)
successes <- NULL
failures <- NULL
group.id <- NULL
scenario.id <- NULL
for (j in 1:n.group) {
for (k in 1:n.scenario) {
ok <- group == group.names[j] & scenario == scenario.names[k]
successes <- c(successes, sum(recovered[ok] == 1, na.rm = TRUE))
failures <- c(failures, sum(recovered[ok] == 0, na.rm = TRUE))
group.id <- c(group.id, j)
scenario.id <- c(scenario.id, k)
}
}
y <- successes / (successes + failures)
7.3.2 Model specification in JAGS
Next we define the model structure, the parameters to monitor, and the execution of the MCMC sampler with JAGS:
# Define the number of treatments and airports
n.treatment <- length(unique(group.id))
n.airport <- length(unique(scenario.id))
# Prepare the data list for JAGS
data_jags <- list(
y = successes,
treatment = group.id,
airport = scenario.id,
n = length(successes),
n.treatment = n.treatment,
n.airport = n.airport
)
# Initial values
inits <- function() {
list(
mu = rnorm(1),
sigma.y = runif(1),
sigma.gamma = runif(1),
sigma.delta = runif(1)
)
}
# Parameters to monitor
params <- c("mu", "gamma", "delta", "sigma.y", "sigma.gamma", "sigma.delta")
# Fit the model with JAGS
jags_fit <- jags(
data = data_jags,
inits = inits,
parameters.to.save = params,
model.file = "codigoJAGS/pilots.jags",
n.chains = 3,
n.iter = 5000,
n.burnin = 1000,
n.thin = 10
)
Compiling model graph
Resolving undeclared variables
Allocating nodes
Graph information:
Observed stochastic nodes: 40
Unobserved stochastic nodes: 17
Total graph size: 188
Initializing model
# Summary of the fitted model
print(jags_fit)
Inference for Bugs model at "codigoJAGS/pilots.jags", fit using jags,
3 chains, each with 5000 iterations (first 1000 discarded), n.thin = 10
n.sims = 1200 iterations saved. Running time = 0.45 secs
mu.vect sd.vect 2.5% 25% 50% 75% 97.5% Rhat n.eff
delta[1] -0.647 1.229 -3.067 -1.373 -0.686 0.046 1.911 1.001 1200
delta[2] -2.138 1.251 -4.741 -2.864 -2.139 -1.358 0.170 1.001 1200
delta[3] 0.064 1.260 -2.398 -0.718 0.063 0.800 2.521 1.003 1200
delta[4] -1.605 1.270 -4.046 -2.382 -1.593 -0.823 0.856 1.001 1200
delta[5] -0.092 1.231 -2.607 -0.829 -0.080 0.648 2.354 1.000 1200
delta[6] 3.566 1.271 1.105 2.698 3.563 4.369 6.106 1.001 1200
delta[7] -2.123 1.251 -4.722 -2.896 -2.114 -1.332 0.309 1.000 1200
delta[8] 2.626 1.243 0.152 1.823 2.622 3.349 5.120 1.000 1200
gamma[1] -0.150 0.487 -1.103 -0.339 -0.075 0.047 0.621 1.044 1200
gamma[2] 0.063 0.485 -0.742 -0.104 0.018 0.219 0.980 1.029 1200
gamma[3] -0.074 0.490 -1.002 -0.237 -0.029 0.101 0.779 1.032 270
gamma[4] -0.057 0.478 -1.051 -0.208 -0.019 0.113 0.732 1.005 1200
gamma[5] 0.227 0.509 -0.458 -0.025 0.104 0.418 1.290 1.033 1200
mu 3.127 1.152 0.767 2.490 3.158 3.816 5.339 1.001 1200
sigma.delta 2.699 1.017 1.424 2.032 2.498 3.119 5.277 1.001 1200
sigma.gamma 0.509 0.674 0.014 0.160 0.342 0.625 1.969 1.028 160
sigma.y 1.551 0.211 1.216 1.402 1.519 1.669 2.016 1.008 410
deviance 147.353 5.666 138.664 143.068 146.738 151.033 159.950 1.005 500
For each parameter, n.eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor (at convergence, Rhat=1).
DIC info (using the rule: pV = var(deviance)/2)
pV = 16.0 and DIC = 163.4
DIC is an estimate of expected predictive error (lower deviance is better).
7.3.3 Conceptual interpretation
In non-nested models, one can choose where to place the intercept or constant term. Here it is included as \(\mu\) in the data model, with zero means for the group effects. An alternative would be to include an additional mean parameter for the \(\gamma_j\) or the \(\delta_k\); however, adding a constant term at more than one level makes the model non-identifiable.
As explained in Section 19.4 of the reference text, it can be useful to include redundant parameters and then reparameterize the model to restore identifiability. This strategy can aid conceptual interpretation and improve the computational stability of the model.
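For example, with the non-nested fit above one can recover identified ("adjusted") quantities from a redundant parameterization by recentering the simulation draws in R. This is a sketch assuming the jags_fit object from Section 7.3.2; the names mu.adj, gamma.adj, delta.adj are illustrative:

```r
# Recenter each group effect to have mean zero within every simulation draw;
# the absorbed means move into the adjusted overall intercept.
sims <- jags_fit$BUGSoutput$sims.list
mu.adj    <- sims$mu + rowMeans(sims$gamma) + rowMeans(sims$delta)
gamma.adj <- sims$gamma - rowMeans(sims$gamma)  # recycled by row across columns
delta.adj <- sims$delta - rowMeans(sims$delta)
```

The fitted values \(\mu + \gamma_{j} + \delta_{k}\) are unchanged by this recentering, but the adjusted parameters have stable, interpretable posterior distributions.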