We have submitted a new paper “copula-based normalizing flows.” Great to work with you, Mike and Asja!
Category Archives: New Paper
Vanishing gradients
We have analyzed the vanishing-gradient problem in deep learning and potential remedies here. Congratulations, Leni!
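To illustrate the basic mechanism (a generic sketch, not code from the paper): backpropagation multiplies gradients by each layer's activation derivative, and for the sigmoid that derivative is at most 0.25, so the gradient can shrink geometrically with depth.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s * (1 - s), maximal (0.25) at x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

# Each sigmoid layer multiplies the backpropagated gradient by at most 0.25,
# so even in the best case the gradient factor decays geometrically.
grad = 1.0
for depth in range(1, 21):
    grad *= sigmoid_grad(0.0)  # 0.25, the steepest point
    if depth in (1, 5, 10, 20):
        print(f"depth {depth:2d}: gradient factor {grad:.2e}")
```

After 20 layers the factor is below 1e-12, which is why remedies such as ReLU activations or skip connections matter in deep networks.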
Two new papers on deep learning
We have posted two new papers on deep learning, one on targeted deep learning and one on robust deep learning. Well done, Shih-Ting!
Activation Functions in Artificial Neural Networks
We have composed an overview of activation functions in artificial neural networks here.
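A few of the standard activation functions covered in such overviews can be written in a couple of lines each (a minimal generic sketch, not code from the overview itself):

```python
import math

# Common activation functions: each maps a pre-activation value to an output.
def relu(x):
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but with a small slope alpha for negative inputs.
    return x if x > 0 else alpha * x

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    return math.tanh(x)

for f in (relu, leaky_relu, sigmoid, tanh):
    print(f"{f.__name__}(-1) = {f(-1.0):+.3f}, {f.__name__}(+1) = {f(1.0):+.3f}")
```

The key differences are range (sigmoid in (0, 1), tanh in (-1, 1), ReLU unbounded above) and whether the function saturates for large inputs.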
Optimization landscapes in deep learning
We analyze the optimization landscapes of feedforward neural networks here. In particular, we show that the landscapes of wide networks do not have spurious local minima.
Statistics and artificial intelligence
We have discussed the role of statistics in artificial intelligence here.
Robust deep learning
We have established risk bounds for robust deep learning here.
Layer sparsity in neural networks
We have put a new paper about layer sparsity in neural networks on arXiv.
Inference in Labor Economics
We have now put our paper “A pipeline for variable selection and false discovery rate control with an application in labor economics” on arXiv. The paper will be part of the Annual Congress of the Swiss Society of Economics and Statistics in 2021. Congratulations, Sophie-Charlotte!
Statistical Guarantees for Deep Learning
We have derived statistical guarantees for deep learning here. Well done, Mahsa and Fang!