susieR allows for a customized initialization. In this vignette we demonstrate how to use an L0Learn fit to initialize susieR.
library(susieR)
library(L0Learn)
First, we simulate data from the minimal example.
set.seed(1)
n = 1000
p = 1000
beta = rep(0,p)
beta[c(1,2,300,400)] = 1
X = matrix(rnorm(n*p),nrow=n,ncol=p)
y = X %*% beta + rnorm(n)
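For reference, the true support of beta in this simulation is positions 1, 2, 300 and 400:
which(beta != 0)
# [1]   1   2 300 400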
We start by fitting an L0-regularized model to the simulated data.
set.seed(1)
L0fit = L0Learn.cvfit(X, y, penalty = "L0")
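Before picking a penalty value, it can help to look at the cross-validation error along the regularization path. A minimal base-R sketch, assuming L0fit$cvMeans[[1]] holds one CV error per value of lambda for the single (L0) penalty setting:
# Cross-validation error along the L0 path (first, and here only, gamma value).
plot(log(L0fit$fit$lambda[[1]]), L0fit$cvMeans[[1]],
     type = "b", xlab = "log(lambda)", ylab = "CV error")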
Let’s choose the penalty strength parameter that minimizes the cross-validation error.
lambdaIndex = which.min(L0fit$cvMeans[[1]])
L0coef = as.numeric(coef(L0fit$fit, lambda = L0fit$fit$lambda[[1]][lambdaIndex]))
effect.beta = L0coef[which(L0coef!=0)][-1]
effect.index = (which(L0coef!=0)-1)[-1]
length(effect.beta)
# [1] 5
effect.beta[1:10]
# [1]  1.0630345  1.0199835  0.9825537  1.0071008 -0.1124102         NA
# [7]         NA         NA         NA         NA
effect.index[1:10]
# [1]   1   2 300 400 555  NA  NA  NA  NA  NA
The L0Learn model finds 5 effects, which will be used to initialize susie.
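Comparing the selected variables with the true support shows that the four true effects are recovered, along with one false positive:
intersect(effect.index, which(beta != 0))  # true effects recovered
# [1]   1   2 300 400
setdiff(effect.index, which(beta != 0))    # false positive
# [1] 555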
We create an initialization from the L0Learn coefficients using the susie_init_coef function:
set.seed(1)
s.init = susie_init_coef(effect.index, effect.beta, p)
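The contents of the initialization object can be inspected with str(); the exact components may vary across susieR versions:
# Inspect the initialization object that will be passed to susie via s_init.
str(s.init)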
We now run susie with this initialization, passing the s.init object created from effect.index and effect.beta via the s_init argument.
susieL0.fit = susie(X,y,s_init=s.init)
susieL0.fit$sets$cs
# $L1
# [1] 1
#
# $L2
# [1] 2
#
# $L3
# [1] 300
#
# $L4
# [1] 400
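For comparison, one can also fit susie with its default initialization and check whether the credible sets agree, or visualize the posterior inclusion probabilities of the initialized fit. A brief sketch (the b argument to susie_plot highlights the variables with true effects):
# susie fit without the custom initialization, for comparison.
susie.default = susie(X, y)
susie.default$sets$cs

# PIPs from the L0Learn-initialized fit, with true effects highlighted.
susie_plot(susieL0.fit, y = "PIP", b = beta)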