This example uses the immortal Holzinger and Swineford (1939) data set.
The OpenMx model looks like this:
library(OpenMx)

# Derive age in months and a numeric indicator for sex
HS.ability.data$ageym <- HS.ability.data$agey*12 + HS.ability.data$agem
HS.ability.data$male <- as.numeric(HS.ability.data$Gender == 'Male')

# Specify variables
indicators <- c('visual','cubes','paper','flags','paperrev','flagssub',
                'general','paragrap','sentence','wordc','wordm')
covariates <- c("male","ageym","grade")
latents <- c("g", covariates)
# Build the model
mimicModel <- mxModel(
  "MIMIC", type="RAM",
  manifestVars = indicators, latentVars = latents,
  # Set up exogenous predictors
  mxPath("one", covariates, labels=paste0('data.',covariates), free=FALSE),
  # Fix factor variance
  mxPath('g', arrows=2, free=FALSE, values=1),
  # Error variances:
  mxPath(from=indicators, arrows=2, free=TRUE, values=10),
  # Means (saturated means model):
  mxPath(from="one", to=indicators, values=rep(5, length(indicators))),
  # Loadings:
  mxPath(from="g", to=indicators, values=.5),
  # Covariate paths
  mxPath(covariates, "g", labels=covariates),
  # Data
  mxData(observed = HS.ability.data, type = "raw"))
# Get some good starting values for regularization. This
# saves 2-3 minutes on my laptop.
mimicModel <- mxRun(mimicModel)
## Running MIMIC with 36 parameters
Add the penalty:
mimicModel <- mxModel(
  mimicModel,
  # A 1x1 free matrix to hold the penalty hyperparameter
  mxMatrix('Full',1,1,free=TRUE,values=0,labels="lambda",name="hparam"),
  # Set scale to ML estimates for adaptive lasso
  mxPenaltyLASSO(what=covariates, name="LASSO",
                 scale = coef(mimicModel)[covariates],
                 lambda = 0, lambda.max = 2, lambda.step = .04)
)
Run the regularization search. With only three covariates, the plot of the results is not very exciting: we learn that sex is not a good predictor of this factor.
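The search itself is a single call; a minimal sketch, assuming OpenMx's mxPenaltySearch() is used to fit the model at each lambda on the grid defined above:

# Fit the model at each value of lambda and record the per-fit details
regMIMIC <- mxPenaltySearch(mimicModel)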
## Running MIMIC with 37 parameters
## Warning: In model 'MIMIC' Optimizer returned a non-zero status code 6. The
## model does not satisfy the first-order optimality conditions to the required
## accuracy, and no improved point for the merit function could be found during
## the final linesearch (Mx status RED)
# Per-lambda parameter estimates recorded by the search
detail <- regMIMIC$compute$steps$PS$output$detail

library(reshape2)
library(ggplot2)

# Plot each covariate's regularization path against lambda
est <- detail[,c(covariates, 'lambda')]
ggplot(melt(est, id.vars = 'lambda')) +
  geom_line(aes(x=lambda, y=value, color=variable)) +
  geom_vline(aes(xintercept=coef(regMIMIC)['lambda']),
             linetype="dashed", alpha=.5)
The dashed vertical line marks the lambda selected by the search. The regularized estimates of the covariate paths at that lambda are:
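One way to extract them from the search detail (a sketch, assuming the selected solution is the row whose lambda matches the estimate stored in regMIMIC):

# Assumption: the chosen lambda exactly matches one grid value in 'detail'
sel <- detail$lambda == coef(regMIMIC)['lambda']
detail[sel, covariates]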
## male ageym grade
## 27 3.372766e-07 -0.02815012 1.063449
The regularization causes a lot of bias. One way to deal with this is to fix the zeroed parameters to zero, discard the regularization penalty, and re-fit the model.
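A sketch of this step, assuming OpenMx's mxPenaltyZap() is what produced the messages below:

# Fix parameters shrunk to zero, fix the hyperparameter, drop the penalty
mimicModel <- mxPenaltyZap(regMIMIC)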
## Zapping 'male'
## Fixing 'lambda'
## Tip: Use
## model = mxRun(model)
## to re-estimate the model without any penalty terms.
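Following the printed tip, we re-estimate the model without the penalty and summarize it:

# Re-fit the zapped model and print the summary shown below
mimicModel <- mxRun(mimicModel)
summary(mimicModel)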
## Running MIMIC with 35 parameters
## Summary of MIMIC
##
## free parameters:
## name matrix row col Estimate Std.Error A
## 1 MIMIC.A[1,12] A visual g 2.61817594 0.359298787
## 2 MIMIC.A[2,12] A cubes g 0.93493904 0.249165023
## 3 MIMIC.A[3,12] A paper g 0.70084632 0.148850149
## 4 MIMIC.A[4,12] A flags g 1.57351184 0.488495921
## 5 MIMIC.A[5,12] A paperrev g 0.99010502 0.241890553
## 6 MIMIC.A[6,12] A flagssub g 3.33349725 0.636982509
## 7 MIMIC.A[7,12] A general g 9.23700795 0.531325575
## 8 MIMIC.A[8,12] A paragrap g 2.53491899 0.153265387
## 9 MIMIC.A[9,12] A sentence g 3.96091919 0.218337618
## 10 MIMIC.A[10,12] A wordc g 3.80400282 0.254812615
## 11 MIMIC.A[11,12] A wordm g 5.73048123 0.332521021
## 12 ageym A g ageym -0.02870565 0.005116147
## 13 grade A g grade 1.08144484 0.147319560
## 14 MIMIC.S[1,1] S visual visual 40.27118644 3.350709894
## 15 MIMIC.S[2,2] S cubes cubes 21.00803435 1.721514961
## 16 MIMIC.S[3,3] S paper paper 7.36561018 0.605048154
## 17 MIMIC.S[4,4] S flags flags 78.47382780 6.425595339 !
## 18 MIMIC.S[5,5] S paperrev paperrev 8.35235110 0.994617503
## 19 MIMIC.S[6,6] S flagssub flagssub 56.56097630 6.796716392 !
## 20 MIMIC.S[7,7] S general general 45.64539430 4.716316397
## 21 MIMIC.S[8,8] S paragrap paragrap 4.06597950 0.402134862
## 22 MIMIC.S[9,9] S sentence sentence 6.80451253 0.749512037
## 23 MIMIC.S[10,10] S wordc wordc 13.88563458 1.285502142
## 24 MIMIC.S[11,11] S wordm wordm 17.27857521 1.794717772 !
## 25 MIMIC.M[1,1] M 1 visual 21.29329866 2.947678352
## 26 MIMIC.M[1,2] M 1 cubes 21.38065555 1.263730076
## 27 MIMIC.M[1,3] M 1 paper 12.00174359 0.891365009
## 28 MIMIC.M[1,4] M 1 flags 13.00224785 2.420589129
## 29 MIMIC.M[1,5] M 1 paperrev 12.12977939 1.369287374
## 30 MIMIC.M[1,6] M 1 flagssub 24.45733301 4.211407211
## 31 MIMIC.M[1,7] M 1 general 11.26671178 9.773539364
## 32 MIMIC.M[1,8] M 1 paragrap 1.12601996 2.684513304
## 33 MIMIC.M[1,9] M 1 sentence 4.77318252 4.172379784
## 34 MIMIC.M[1,10] M 1 wordc 14.03602611 4.045727959
## 35 MIMIC.M[1,11] M 1 wordm -2.91412180 6.049740939
##
## Model Statistics:
## | Parameters | Degrees of Freedom | Fit (-2lnL units)
## Model: 35 2964 17843.68
## Saturated: 77 2922 NA
## Independence: 22 2977 NA
## Number of observations/statistics: 301/2999
##
## Information Criteria:
## | df Penalty | Parameters Penalty | Sample-Size Adjusted
## AIC: 11915.6773 17913.68 17923.19
## BIC: 927.8025 18043.43 17932.43
## To get additional fit indices, see help(mxRefModels)
## timestamp: 2024-08-16 20:56:56
## Wall clock time: 9.974365 secs
## optimizer: SLSQP
## OpenMx version number: 2.21.12
## Need help? See help(mxSummary)