New Paper in AStA Adv. Stat. Anal.
Our new paper “Statistical guarantees for sparse deep learning” has appeared in AStA Advances in Statistical Analysis.
Lasso paper now online
The final version of our paper “Balancing Statistical and Computational Precision: A General Theory and Applications to Sparse Regression” is now accessible online here.
Paper Accepted at IEEE TIT
Our paper “Balancing Statistical and Computational Precision and Applications to Penalized Linear Regression with Group Sparsity” has been accepted for publication in the IEEE Transactions on Information Theory. Well deserved, Néhémy and Mahsa! 🥇🥇
DeepMoM paper now online
The published version of our paper “DeepMoM: Robust Deep Learning With Median-of-Means” is now accessible online.
Two new researchers on the team
We have two new researchers on our team: Somnath Chakraborty joins us as a post-doctoral researcher, and Ayşe Çobankaya will join us as a PhD student. Welcome! 🤩
Canadian Journal of Statistics
Johannes has joined the editorial board of the Canadian Journal of Statistics. Find the journal’s website here.
Paper Accepted at JCGS
Our paper “DeepMoM: Robust Deep Learning With Median-of-Means” has been accepted for publication in the Journal of Computational and Graphical Statistics. Awesome job, Shih-Ting! 🕺🕺
Shih-Ting passed final exams
Shih-Ting Huang passed his doctoral exam, earning his doctorate from Ruhr-University Bochum. Congratulations! 🎓
Paper Accepted at ICML
Our paper “Copula-Based Normalizing Flows” has been accepted at ICML. Cheers, Mike and Asja! 🥂
Approximate Stationary Points of Simple Neural Networks
Our new paper “Statistical Guarantees for Approximate Stationary Points of Simple Neural Networks” is now online. Well done, Mahsa and Fang! 🎉🎉