Software
See the links below for software directly related to the corresponding papers. See also our group repository LedererLab on GitHub.
Preprints
Optimization landscapes of wide deep neural networks are benign
preprint
[arxiv]
Keywords: neural networks; deep learning; nonconvex optimization
Is there a role for statistics in artificial intelligence?
Joint work with S. Friedrich and others
preprint
[arxiv]
Keywords: statistics; artificial intelligence
Risk bounds for robust deep learning
preprint
[arxiv]
Keywords: neural networks; robust deep learning; risk bounds
Layer sparsity in neural networks
Joint work with M. Hebiri
preprint
[arxiv]
Keywords: neural networks; deep learning; sparsity
Statistical guarantees for regularized neural networks
Joint work with M. Taheri and F. Xie
preprint
[arxiv]
Keywords: neural networks; deep learning; oracle inequalities
Thresholded adaptive validation: tuning the graphical lasso for graph recovery
Joint work with M. Laszkiewicz and A. Fischer
preprint
[arxiv]
Keywords: graphical models; tuning parameter calibration; biological applications
Estimating the lasso’s effective noise
Joint work with M. Vogt
preprint
[arxiv]
Keywords: lasso; tuning parameter calibration; inference
Tuning-free ridge estimators for high-dimensional generalized linear models
Joint work with S.-T. Huang and F. Xie
preprint
[arxiv]
Keywords: generalized linear models; ridge regression; tuning parameter calibration
Tuning parameter calibration for prediction in personalized medicine
Joint work with S.-T. Huang, Y. Düren, and K. Hellton
preprint
[arxiv] | [software]
Keywords: precision medicine; ridge regression; tuning parameter calibration
False discovery rates in biological networks
Joint work with L. Yu and T. Kaufmann
preprint
[arxiv] | [software]
Keywords: biological systems; FDR control; Gaussian graphical models
Aggregated false discovery rate control
Joint work with F. Xie
preprint
[arxiv]
Keywords: FDR control; multiple testing; variable selection
Integrating additional knowledge into estimation of graphical models
Joint work with Y. Bu
preprint
[arxiv] | [data and software]
Keywords: brain connectivity networks; reproducible graph estimation
Efficient feature selection with large and high-dimensional data
Joint work with N. Lim
preprint
[arxiv] | [software]
Keywords: convex optimization; Big Data
Graphical models for discrete and continuous data
Joint work with R. Zhuang and N. Simon
preprint
[arxiv] | [software]
Keywords: graphical models
Published / In Press
A pipeline for variable selection and false discovery rate control with an application in labor economics
Joint work with S.-C. Klose
accepted at the Annual Congress of the Swiss Society of Economics and Statistics, 2021
[arxiv]
Keywords: economics; high-dimensional inference; FDR control
Inference for high-dimensional nested regression
Joint work with D. Gold and J. Tao
J. Econometrics 217(1), 2020, pages 79–111
[arxiv] | [software]
Keywords: instrumental variables; lasso; one-step correction; high-dimensional inference
Tuning parameter calibration for l1-regularized logistic regressions
Joint work with W. Li
J. Statist. Plann. Inference 202, 2019, pages 80–98
[arxiv]
Keywords: tuning parameters; high-dimensional logistic regression
Prediction error bounds for linear regression with the TREX
Joint work with J. Bien, I. Gaynanova, and C. Müller
TEST 28(2), 2019, pages 451–474
[arxiv]
Keywords: tuning parameters; oracle inequalities
Oracle inequalities for high-dimensional prediction
Joint work with L. Yu and I. Gaynanova
Bernoulli 25(2), 2019, pages 1225–1255
[arxiv]
Keywords: high-dimensional prediction; correlations
Maximum regularized likelihood estimators: a general prediction theory and applications
Joint work with R. Zhuang
STAT 7(1), 2018, e186
[arxiv]
Keywords: oracle inequalities; maximum regularized likelihood
Non-convex global minimization and false discovery rate control for the TREX
Joint work with J. Bien, I. Gaynanova, and C. Müller
short version presented at an ICML workshop; long version in J. Comput. Graph. Statist. 27(1), 2018, pages 23–33
[arxiv] | [software and data examples]
Keywords: non-convex optimization; feature selection; tuning parameters
Optimal two-step prediction in regression
Joint work with D. Chételat and J. Salmon
Electron. J. Stat. 11(1), 2017, pages 2519–2546
[arxiv] | [software]
Keywords: tuning parameters; high-dimensional prediction; refitting
On the prediction performance of the lasso
Joint work with A. Dalalyan and M. Hebiri
Bernoulli 23(1), 2017, pages 552–581
[arxiv]
Keywords: high-dimensional prediction; correlations
A practical scheme and fast algorithm to tune the lasso with optimality guarantees
Joint work with M. Chichignoud and M. Wainwright
J. Mach. Learn. Res. 17, 2016, pages 1–20
[arxiv]
Keywords: tuning parameters; high-dimensional statistics; oracle inequalities
Topology adaptive graph estimation in high dimensions
Joint work with C. Müller
Technical Report, 2016
[arxiv]
Keywords: high-dimensional networks; Gaussian graphical models
Trust, but verify: benefits and pitfalls of least-squares refitting in high dimensions
Technical Report, 2016
[arxiv]
Keywords: high-dimensional regression; refitting
Compute less to get more: using ORC to improve sparse filtering
Joint work with S. Guadarrama
AAAI-15
[arxiv]
Keywords: computer vision; feature learning
Don’t fall for tuning parameters: tuning-free variable selection in high dimensions with the TREX
Joint work with C. Müller
AAAI-15
[arxiv]
Keywords: tuning parameters; feature selection
New concentration inequalities for suprema of empirical processes
Joint work with S. van de Geer
Bernoulli 20(4), 2014, pages 2020–2038
[arxiv]
Keywords: empirical processes; concentration inequalities
A robust, adaptive M-estimator for pointwise estimation in heteroscedastic regression
Joint work with M. Chichignoud
Bernoulli 20(3), 2014, pages 1560–1599
[arxiv]
Keywords: pointwise estimation; robust regression; adaptive regression
The group square-root lasso: theoretical properties and fast algorithms
Joint work with F. Bunea and Y. She
IEEE Trans. Inform. Theory 60(2), 2014, pages 1313–1325
[arxiv] | [software]
Keywords: tuning parameters; oracle inequalities; convex optimization
How correlations influence lasso prediction
Joint work with M. Hebiri
IEEE Trans. Inform. Theory 59(3), 2013, pages 1846–1854
[arxiv]
Keywords: high-dimensional prediction; correlations
The Lasso, correlated design, and improved oracle inequalities
Joint work with S. van de Geer
IMS Collections 9, 2013, pages 303–316
[arxiv]
Keywords: high-dimensional prediction; correlations
The Bernstein-Orlicz norm and deviation inequalities
Joint work with S. van de Geer
Probab. Theory Related Fields 157(1–2), 2013, pages 225–250
[arxiv]
Keywords: empirical processes; concentration inequalities
Nonasymptotic bounds for empirical processes and regression
Under the supervision of S. van de Geer and P. Bühlmann
PhD thesis, 2012
Keywords: empirical processes; high-dimensional regression; robust statistics
Bounds for Rademacher processes via chaining
Technical report, 2010
[arxiv]
Keywords: empirical processes; chaining
Production of charged vector boson pairs in hadronic collisions
Under the supervision of C. Anastasiou
Master’s thesis, 2009
Keywords: particle physics; Higgs boson