SuSiE with L0Learn initialization example

Kaiqian Zhang

2022-06-26

susieR allows for a customized initialization. In this vignette we demonstrate how to use an L0Learn fit to initialize susieR.

library(susieR)
library(L0Learn)

Simulate data

First, we simulate data as in the minimal example: n = 1000 samples, p = 1000 variables, and four non-zero effects.

set.seed(1)
n = 1000
p = 1000
beta = rep(0,p)
beta[c(1,2,300,400)] = 1
X = matrix(rnorm(n*p),nrow=n,ncol=p)
y = X %*% beta + rnorm(n)
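
As an optional check on the simulation (an addition to the minimal example, not a required step), we can look at roughly how much of the variance in y comes from the true effects:

# Rough signal-to-noise check: fraction of the variance in y explained
# by the true effects; it should be large in this simulation.
var(as.vector(X %*% beta)) / var(as.vector(y))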

Fit L0Learn

We start by fitting an L0-regularized model to the simulated data.

set.seed(1)
L0fit = L0Learn.cvfit(X, y, penalty = "L0")
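
Before choosing a penalty strength, it can be useful to look at how the cross-validation error varies along the regularization path. A minimal sketch, assuming L0Learn's plot method for cross-validation fits (with gamma = 0 for the pure L0 penalty):

# Cross-validation error along the L0 regularization path.
plot(L0fit, gamma = 0)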

We choose the penalty strength (lambda) that minimizes the cross-validation error, and extract the corresponding non-zero coefficients, dropping the intercept.

# Index of the lambda value that minimizes the cross-validation error.
lambdaIndex = which.min(L0fit$cvMeans[[1]])
# Coefficients at that lambda (the first entry is the intercept).
L0coef = as.numeric(coef(L0fit$fit, lambda = L0fit$fit$lambda[[1]][lambdaIndex]))
effect.beta = L0coef[which(L0coef != 0)][-1]     # non-zero effects, intercept dropped
effect.index = (which(L0coef != 0) - 1)[-1]      # their variable indices in X
length(effect.beta)
# [1] 5
effect.beta
# [1]  1.0630345  1.0199835  0.9825537  1.0071008 -0.1124102
effect.index
# [1]   1   2 300 400 555
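
To see how well the L0 fit recovers the simulated signal, we can compare the selected variables with the true non-zero coefficients (this comparison is an added check, using only base R):

# True non-zero positions in the simulated beta: 1, 2, 300 and 400.
true.index = which(beta != 0)
intersect(effect.index, true.index)  # correctly recovered effects
setdiff(effect.index, true.index)    # spurious selections (555 here)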

The L0Learn fit selects five non-zero effects: the four true effects plus a spurious variable (index 555) with a small estimated coefficient. We will use these effects to initialize susie.

Build an initialization object

We create an initialization object from the L0Learn coefficients using the susie_init_coef function:

set.seed(1)
s.init = susie_init_coef(effect.index, effect.beta, p)
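
If you are curious what the initialization contains, you can inspect its top-level structure; the exact components are determined by susieR, so we do not rely on particular element names here:

# Peek at the top level of the initialization object.
str(s.init, max.level = 1)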

Run susieR with initialization

We now run susie with the s.init object constructed above from the L0Learn fit.

susieL0.fit = susie(X,y,s_init=s.init)
susieL0.fit$sets$cs
# $L1
# [1] 1
# 
# $L2
# [1] 2
# 
# $L3
# [1] 300
# 
# $L4
# [1] 400
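
Note that the spurious fifth effect from the L0Learn fit (index 555) does not end up in any credible set. For comparison, we could also run susie with its default initialization; with a signal this strong we would expect the same four credible sets. This comparison is a sketch added here, not part of the original results:

# Fit susie without the L0Learn initialization, for comparison.
susie.default = susie(X, y)
susie.default$sets$cs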

References

  1. Hussein Hazimeh and Rahul Mazumder. (2018). Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms.