We have established a new strategy for calibrating the graphical lasso; the paper is available here. Great work, Mike! 🍷
We have uploaded a new paper on the lasso’s effective noise and its consequences for calibration and inference here. Thanks to Michael for the great collaboration!
We have uploaded a paper on tuning-free ridge regression here. Well done, Shih-Ting and Fang!
We have uploaded a new paper on prediction in personalized medicine here. Congratulations to Shih-Ting and Yannick, who are the student authors of this paper!
We have uploaded a new paper that provides theoretical insights into our TREX here on arXiv. Thanks to my collaborators Jacob, Irina, and Christian!
We have uploaded a paper titled “Maximum Regularized Likelihood Estimators: A General Prediction Theory and Applications” here on arXiv. We discuss “slow” rates for MRLEs in a wide range of settings. Cool stuff, Rui! ✌
We have established inference for two-stage regression models in which both stages are high-dimensional. The paper is available here. Awesome work, David! ⛰
We have developed an approach for integrating additional knowledge into parameter estimation in graphical models. The main idea is to funnel the knowledge into the tuning parameters. Find the paper here. Well done, Yunqi! 😁
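For readers curious what “funneling knowledge into the tuning parameters” can look like in its simplest form, here is a minimal, hypothetical sketch (not the estimator from the paper): an assumed prior belief about the number of edges in the graph is used to pick the graphical-lasso tuning parameter. The simulated data, the edge-count prior, and the grid of candidate parameters are all illustrative assumptions.

```python
# Minimal, hypothetical sketch (not the paper's estimator): prior knowledge,
# here an assumed belief about the number of edges in the graph, is used to
# choose the graphical-lasso tuning parameter.
import numpy as np
from sklearn.covariance import GraphicalLasso
from sklearn.datasets import make_sparse_spd_matrix

rng = np.random.default_rng(0)

# Simulated data from a sparse Gaussian graphical model (20 nodes, 200 samples).
precision = make_sparse_spd_matrix(20, alpha=0.9, random_state=0)
X = rng.multivariate_normal(np.zeros(20), np.linalg.inv(precision), size=200)

target_edges = 15  # assumed prior knowledge: roughly 15 edges expected

best_alpha, best_gap = None, np.inf
for alpha in np.logspace(-1.5, 0, 20):  # candidate tuning parameters
    prec = GraphicalLasso(alpha=alpha, max_iter=200).fit(X).precision_
    edges = int(np.sum(np.abs(np.triu(prec, k=1)) > 1e-8))  # estimated edge count
    if abs(edges - target_edges) < best_gap:
        best_alpha, best_gap = alpha, abs(edges - target_edges)

print(f"tuning parameter suggested by the edge-count prior: alpha = {best_alpha:.3f}")
```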
We have extended the AV-scheme for tuning parameter calibration to logistic regression. Our approach provides efficient feature selection and is supported by theoretical guarantees. The paper can be downloaded here. Great job, Wei! 😎
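To make the setting concrete, here is a small, hedged sketch of l1-penalized logistic regression with a data-driven tuning parameter and the resulting feature selection. Plain cross-validation stands in for the AV-scheme (it is not the paper’s method), and the simulated data set is an illustrative assumption.

```python
# Sketch of the setting only: l1-penalized logistic regression with a
# data-driven tuning parameter and the induced feature selection. Plain
# cross-validation stands in for the AV-scheme here.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegressionCV

# Simulated data (an assumption for illustration): 100 features, 10 informative.
X, y = make_classification(n_samples=500, n_features=100, n_informative=10,
                           n_redundant=0, random_state=0)

# The l1 penalty gives sparse coefficients; the tuning parameter (C, the
# inverse regularization strength) is chosen by 5-fold cross-validation.
model = LogisticRegressionCV(Cs=20, cv=5, penalty="l1", solver="liblinear",
                             max_iter=1000, random_state=0).fit(X, y)

selected = np.flatnonzero(model.coef_.ravel())
print(f"chosen C = {model.C_[0]:.3g}; selected {selected.size} features: {selected}")
```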