Papers published
The published versions of our papers “Tuning-free ridge estimators for high-dimensional generalized linear models” and “Integrating additional knowledge into the estimation of graphical models” are now accessible online here and here, respectively.
Paper on FDR control published
Our paper “Aggregating Knockoffs for False Discovery Rate Control with an Application to Gut Microbiome Data” has now been published. Congratulations again to Fang!
Paper on FDR control accepted
Our paper “Aggregated false discovery rate control” has been accepted by Entropy. Great job, Fang! 🎬
Paper on brain connectivities accepted
Our paper “Integrating additional knowledge into the estimation of graphical models” has been accepted by the International Journal of Biostatistics. Congratulations, Yunqi! ⛄
Activation Functions in Artificial Neural Networks
We have composed an overview of activation functions in artificial neural networks here.
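As a small illustration (not taken from the overview itself), three of the most common activation functions can be sketched in a few lines of NumPy; see the linked overview for their definitions, properties, and many more examples:

```python
import numpy as np

# Illustrative sketches of three standard activation functions.

def relu(x):
    # Rectified linear unit: max(0, x), applied elementwise
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x)), maps inputs to (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Hyperbolic tangent, maps inputs to (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))      # [0. 0. 2.]
print(sigmoid(0.0)) # 0.5
```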
Papers accepted at AISTATS 2021
Two of our papers were accepted at this year’s AISTATS conference: “False Discovery Rates in Biological Networks” with Lu Yu and Tobias Kaufmann and “Thresholded Adaptive Validation: Tuning the Graphical Lasso for Graph Recovery” with Mike Laszkiewicz and Asja Fischer. Congratulations especially to the PhD students Lu (Toronto) and Mike (Bochum)! 🥳
Grant Awarded
Our grant proposal “A general framework for graphical models” was selected for funding by the German Research Foundation. We are looking forward to working on an exciting topic! 🎬
Optimization landscapes in deep learning
We analyze the optimization landscapes of feedforward neural networks here. In particular, we show that the landscapes of wide networks do not have spurious local minima.
Statistics and artificial intelligence
We have discussed the role of statistics in artificial intelligence here.
Robust deep learning
We have established risk bounds for robust deep learning here.