smooth.construct.re.smooth.spec {mgcv}
gam can deal with simple independent random effects, by exploiting the link between smooths and random effects to treat random effects as smooths. s(x,bs="re") implements this. Such terms can have any number of predictors, which can be any mixture of numeric or factor variables. The terms produce a parametric interaction of the predictors, and penalize the corresponding coefficients with a multiple of the identity matrix, corresponding to an assumption of i.i.d. normality. See details.
## S3 method for class 're.smooth.spec'
smooth.construct(object, data, knots)

## S3 method for class 'random.effect'
Predict.matrix(object, data)
object: For the smooth.construct method, a smooth specification object, usually generated by a term s(x,...,bs="re"). For the Predict.matrix method, an object of class "random.effect" produced by the smooth.construct method.

data: a list containing just the data (including any by variable) required by this term, with names corresponding to object$term.

knots: generically a list containing any knots supplied for basis setup; unused at present.
Exactly how the random effects are implemented is best seen by example. Consider the model term s(x,z,bs="re"). This will result in the model matrix component corresponding to ~x:z-1 being added to the model matrix for the whole model. The coefficients associated with this model matrix component are assumed i.i.d. normal, with unknown variance (to be estimated). This assumption is equivalent to an identity penalty matrix (i.e. a ridge penalty) on the coefficients. Because such a penalty is full rank, random effect terms do not require centering constraints.
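This correspondence can be checked directly. The following sketch (not part of the package examples; the data frame dat and the simulated values are purely illustrative) builds the "re" smooth outside a model fit with smoothCon and compares its model matrix with model.matrix(~x:z-1):

require(mgcv)
set.seed(1)
dat <- data.frame(x = runif(20),
                  z = factor(sample(letters[1:4], 20, replace = TRUE)))
sm <- smoothCon(s(x, z, bs = "re"), data = dat)[[1]]
range(sm$X - model.matrix(~ x:z - 1, dat)) ## should be zero (up to floating point)
sm$S[[1]]                                  ## identity penalty matrix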
If the nature of the random effect specification is not clear, consider a couple more examples: s(x,bs="re") results in model.matrix(~x-1) being appended to the overall model matrix, while s(x,v,w,bs="re") would result in model.matrix(~x:v:w-1) being appended to the model matrix. In both cases the corresponding model coefficients are assumed i.i.d. normal, and are hence subject to ridge penalties.
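For a single factor the same check shows that the "re" basis is just the dummy-variable matrix with an identity (ridge) penalty on the level coefficients; a minimal sketch (the factor f and data frame dat2 are illustrative, not from this help page):

require(mgcv)
dat2 <- data.frame(f = factor(sample(c("a", "b", "c"), 30, replace = TRUE)))
sm <- smoothCon(s(f, bs = "re"), data = dat2)[[1]]
all.equal(sm$X, model.matrix(~ f - 1, dat2),
          check.attributes = FALSE) ## TRUE: dummy matrix for f
sm$S[[1]]                           ## identity matrix, i.e. a ridge penalty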
If the random effect precision matrix is of the form sum_j p_j S_j for known matrices S_j and unknown parameters p_j, then a list containing the S_j can be supplied in the xt argument of s. In this case an array, rank, should also be supplied in xt, giving the ranks of the S_j matrices. See the simple example below.
Note that smooth ids are not supported for random effect terms. Unlike most smooth terms, side conditions are never applied to random effect terms in the event of nesting (since they are identifiable without side conditions).
Random effects implemented in this way do not exploit the sparse structure of many random effects, and may therefore be relatively inefficient for models with large numbers of random effects, in which case gamm4 or gamm may be better alternatives. Note also that gam will not support models with more coefficients than data.
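For comparison, here is a sketch of the same random intercept written both ways, using simulated data (the names dat, x, g and y are illustrative). For large numbers of random effect levels the gamm form, which uses lme's machinery, will usually be more efficient:

require(mgcv) ## also loads nlme, used by gamm
set.seed(3)
dat <- data.frame(x = runif(300), g = factor(rep(1:30, each = 10)))
dat$y <- sin(2 * pi * dat$x) + rnorm(30)[dat$g] + rnorm(300) * 0.5
b1 <- gam(y ~ s(x) + s(g, bs = "re"), data = dat, method = "REML") ## r.e. as smooth
b2 <- gamm(y ~ s(x), random = list(g = ~1), data = dat)            ## r.e. via lme
gam.vcomp(b1)          ## standard deviations of the smooth and random effect terms
nlme::VarCorr(b2$lme)  ## comparable variance components reported by lme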
The situation in which factor variable random effects intentionally have unobserved levels requires special handling. You should set drop.unused.levels=FALSE in the model fitting function, gam, bam or gamm, having first ensured that any fixed effect factors do not contain unobserved levels.
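A minimal sketch of that situation (simulated data; the names fac and y are illustrative): the factor fac is defined with levels that never appear in the data, and drop.unused.levels=FALSE keeps them in the fit so that predictions can be made for them.

require(mgcv)
set.seed(5)
fac <- factor(sample(1:8, 200, replace = TRUE), levels = 1:10) ## levels 9,10 unobserved
y <- rnorm(8)[fac] + rnorm(200)
b <- gam(y ~ s(fac, bs = "re"), method = "REML", drop.unused.levels = FALSE)
length(coef(b)) ## intercept plus one coefficient per level, including unobserved ones
## prediction at an unobserved level: its effect is shrunk to zero, so this is
## essentially just the intercept...
predict(b, newdata = data.frame(fac = factor(10, levels = levels(fac))))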
An object of class "random.effect" or a matrix mapping the coefficients of the random effect to the random effects themselves.
Simon N. Wood simon.wood@r-project.org
Wood, S.N. (2008) Fast stable direct fitting and smoothness selection for generalized additive models. Journal of the Royal Statistical Society (B) 70(3):495-518
## see ?gam.vcomp
require(mgcv)

## simulate simple random effect example
set.seed(4)
nb <- 50; n <- 400
b <- rnorm(nb)*2 ## random effect
r <- sample(1:nb,n,replace=TRUE) ## r.e. levels
y <- 2 + b[r] + rnorm(n)
r <- factor(r)

## fit model....
b <- gam(y ~ s(r,bs="re"),method="REML")
gam.vcomp(b)

## example with supplied precision matrices...
b <- c(rnorm(nb/2)*2,rnorm(nb/2)*.5) ## random effect now with 2 variances
r <- sample(1:nb,n,replace=TRUE) ## r.e. levels
y <- 2 + b[r] + rnorm(n)
r <- factor(r)
## known precision matrix components...
S <- list(diag(rep(c(1,0),each=nb/2)),diag(rep(c(0,1),each=nb/2)))
b <- gam(y ~ s(r,bs="re",xt=list(S=S,rank=c(nb/2,nb/2))),method="REML")
gam.vcomp(b)
summary(b)