Our paper “Integrating additional knowledge into the estimation of graphical models” has been accepted at the International Journal of Biostatistics. Congratulations, Yunqi! ⛄
We have compiled an overview of activation functions in artificial neural networks here.
We had two papers accepted at this year’s AISTATS conference: “False Discovery Rates in Biological Networks” with Lu Yu and Tobias Kaufmann, and “Thresholded Adaptive Validation: Tuning the Graphical Lasso for Graph Recovery” with Mike Laszkiewicz and Asja Fischer. Congratulations especially to the PhD students Lu (Toronto) and Mike (Bochum)! 🥳
Our grant proposal “A general framework for graphical models” was selected for funding by the German Research Foundation. We are looking forward to working on an exciting topic! 🎬
We analyze the optimization landscapes of feedforward neural networks here. In particular, we show that the landscapes of wide networks do not contain spurious local minima.
We have discussed the role of statistics in artificial intelligence here.
We have established risk bounds for robust deep learning here.
We have put a new paper about layer sparsity in neural networks on arXiv.
We have now put our paper “A pipeline for variable selection and false discovery rate control with an application in labor economics” on arXiv. The paper will be part of the Annual Congress of the Swiss Society of Economics and Statistics in 2021. Congratulations, Sophie-Charlotte!
Our paper “Inference for high-dimensional instrumental variables regression” has now been published. Congratulations again to David and Jing!