We have put a new paper about layer sparsity in neural networks on arXiv.
We have now put our paper “A pipeline for variable selection and false discovery rate control with an application in labor economics” on arXiv. The paper will be part of the Annual Congress of the Swiss Society of Economics and Statistics in 2021. Congratulations Sophie-Charlotte!
We have derived statistical guarantees for deep learning here. Well done, Mahsa and Fang! ⚡⚡⚡
We have established a new strategy for calibrating the graphical lasso here. Great work, Mike! 🍷
We have uploaded a new paper on the lasso’s effective noise and its consequences for calibration and inference here. Thanks to Michael for the great collaboration!
We have uploaded a paper on tuning-free ridge regression here. Well done, Shih-Ting and Fang!
We have uploaded a new paper on prediction in personalized medicine here. Congratulations to Shih-Ting and Yannick, who are the student authors of this paper!
We have uploaded a new paper that provides theoretical insights into our TREX here on arXiv. Thanks to my collaborators Jacob, Irina, and Christian!
We have uploaded a paper titled “Maximum Regularized Likelihood Estimators: A General Prediction Theory and Applications” here on arXiv. We discuss “slow” rates for MRLEs in a wide range of settings. Cool stuff, Rui! ✌