We have analyzed the vanishing-gradient problem in deep learning and potential remedies here. Congratulations, Leni! 🏄
The published version of “Statistical guarantees for regularized neural networks” can now be found here.
Our paper “Statistical guarantees for regularized neural networks” has been accepted at Neural Networks. Congrats, Mahsa and Fang! 🧉
Our paper “Aggregating Knockoffs for False Discovery Rate Control with an Application to Gut Microbiome Data” has now been published. Congratulations again to Fang!
Our paper “Aggregated false discovery rate control” has been accepted at Entropy. Great job, Fang! 🎬
Our paper “Integrating additional knowledge into the estimation of graphical models” is now accepted at the International Journal of Biostatistics. Congratulations, Yunqi! ⛄
We have composed an overview of activation functions in artificial neural networks here.
We got two papers accepted at this year’s AISTATS conference: “False Discovery Rates in Biological Networks” with Lu Yu and Tobias Kaufmann and “Thresholded Adaptive Validation: Tuning the Graphical Lasso for Graph Recovery” with Mike Laszkiewicz and Asja Fischer. Congratulations especially to the PhD students Lu (Toronto) and Mike (Bochum)! 🥳