Together with our amazing collaborator Marco Oesting, we have put a new paper with the title “Extremes in high dimensions: methods and scalable algorithms” on arXiv. Thanks for the great work, Marco!
Lag selection and stability in AR processes
With Somnath and our collaborator Rainer von Sachs, we have put a new paper with the title “Lag selection and estimation of stable parameters for multiple autoregressive processes through convex programming” on arXiv. Well done, Somnath!
New Paper in AStA Adv. Stat. Anal.
We have a new paper with the title “Statistical guarantees for sparse deep learning” in AStA Adv. Stat. Anal.
Approximate Stationary Points of Simple Neural Networks
Our new paper “Statistical Guarantees for Approximate Stationary Points of Simple Neural Networks” is now online. Well done, Mahsa and Fang!
New Paper on Variable Clustering
We have a new paper “VC-PCR: a prediction method based on supervised variable selection and clustering” about variable clustering in transcriptomics and in general. Great work especially from Rebecca, who just defended her PhD, and from Rainer and Bernadette, the two other members of our Belgian🇧🇪-American🇺🇸-German🇩🇪 tag team!
Depth Normalization of Small RNA Sequencing
Our paper “Depth Normalization of Small RNA Sequencing: Using Data and Biology to Select a Suitable Method” is now available on arXiv. Great to work with you, Yannick and Li-Xuan!
Normalizing Flows
We have submitted a new paper “copula-based normalizing flows.” Great to work with you, Mike and Asja!
Vanishing gradients
We have analyzed the vanishing-gradient problem in deep learning and potential remedies here. Congratulations, Leni!
Two new papers on deep learning
We have two new papers on deep learning, one on targeted deep learning and one on robust deep learning. Well done, Shih-Ting!