Efficient training of interpretable, non-linear regression models

dc.contributor.author: Allerbo, Oskar
dc.date.accessioned: 2023-06-30T08:41:22Z
dc.date.available: 2023-06-30T08:41:22Z
dc.date.issued: 2023-06-30
dc.description.abstract: Regression, the process of estimating functions from data, comes in many flavors. One of the most commonly used regression models is linear regression, which is computationally efficient and easy to interpret, but lacks flexibility. Non-linear regression methods, such as kernel regression and artificial neural networks, tend to be much more flexible, but are also harder to interpret and both more difficult and more computationally expensive to train. In the five papers of this thesis, different techniques for constructing regression models that combine flexibility with interpretability and computational efficiency are investigated. In Papers I and II, sparsely regularized neural networks are used to obtain flexible, yet interpretable, models for additive modeling (Paper I) and dimensionality reduction (Paper II). Sparse regression, in the form of the elastic net, is also covered in Paper III, where the focus is on increased computational efficiency by replacing explicit regularization with iterative optimization and early stopping. In Paper IV, inspired by Jacobian regularization, we propose a computationally efficient method for bandwidth selection for kernel regression with the Gaussian kernel. Kernel regression is also the topic of Paper V, where we revisit efficient regularization through early stopping, by solving kernel regression iteratively. Using an iterative algorithm for kernel regression also enables changing the kernel during training, which we use to obtain a more flexible method, resembling the behavior of neural networks. In all five papers, the results are obtained by carefully selecting either the regularization strength or the bandwidth. Thus, in summary, this work contributes new statistical methods for combining flexibility with interpretability and computational efficiency based on intelligent hyperparameter selection.
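To make the abstract's central contrast concrete, the following is a minimal, illustrative sketch (not code from the thesis): Gaussian kernel ridge regression solved both in closed form with explicit regularization, and iteratively, where stopping the iteration early takes the role of the ridge penalty. All function names, the bandwidth, and the step size are hypothetical choices for illustration only.

```python
import numpy as np

def gaussian_kernel(X, Z, bandwidth):
    """Gram matrix K[i, j] = exp(-||x_i - z_j||^2 / (2 * bandwidth^2))."""
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2 * bandwidth ** 2))

def krr_closed_form(K, y, ridge):
    """Explicitly regularized coefficients: alpha = (K + ridge * I)^{-1} y."""
    return np.linalg.solve(K + ridge * np.eye(len(y)), y)

def krr_early_stopping(K, y, step_size, n_steps):
    """Landweber-type iteration: alpha <- alpha + step_size * (y - K @ alpha).

    Run to convergence, this solves K @ alpha = y (no regularization);
    stopped early, it yields a regularized fit, with fewer steps
    corresponding to stronger regularization.
    """
    alpha = np.zeros_like(y)
    for _ in range(n_steps):
        alpha = alpha + step_size * (y - K @ alpha)
    return alpha

# Toy data: a smooth function observed at 20 points.
X = np.linspace(0, 1, 20).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
K = gaussian_kernel(X, X, bandwidth=0.3)  # bandwidth chosen by hand here

alpha_ridge = krr_closed_form(K, y, ridge=0.1)
alpha_early = krr_early_stopping(K, y, step_size=0.05, n_steps=200)
```

The iteration converges only if the step size is below 2 divided by the largest eigenvalue of K; here that eigenvalue is bounded by trace(K) = 20, so 0.05 is safe. Choosing the bandwidth and the stopping time well is exactly the kind of hyperparameter selection the thesis is about.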
dc.gup.defencedate: 2023-09-22
dc.gup.defenceplace: Friday 22 September, 13:15, Pascal, Matematiska vetenskaper, Chalmers tvärgata 3
dc.gup.department: Department of Mathematical Sciences (Institutionen för matematiska vetenskaper)
dc.gup.dissdb-fakultet: MNF
dc.gup.mail: allerbo@chalmers.se
dc.gup.origin: University of Gothenburg
dc.identifier.isbn: 978-91-8069-337-0 (print)
dc.identifier.isbn: 978-91-8069-338-7 (PDF)
dc.identifier.uri: https://hdl.handle.net/2077/77367
dc.language.iso: eng
dc.relation.haspart: I. Allerbo, O., Jörnsten, R. (2022). Flexible, Non-parametric Modeling Using Regularized Neural Networks. Computational Statistics, 37(4), 2029-2047. https://doi.org/10.1007/s00180-021-01190-4
dc.relation.haspart: II. Allerbo, O., Jörnsten, R. (2021). Non-linear, Sparse Dimensionality Reduction via Path Lasso Penalized Autoencoders. The Journal of Machine Learning Research, 22(283), 1-28. https://jmlr.org/papers/v22/21-0203.html
dc.relation.haspart: III. Allerbo, O., Jonasson, J., Jörnsten, R. (2023). Elastic Gradient Descent, an Iterative Optimization Method Approximating the Solution Paths of the Elastic Net. The Journal of Machine Learning Research, 24(277), 1-53. https://jmlr.org/papers/v24/22-0119.html
dc.relation.haspart: IV. Allerbo, O., Jörnsten, R. (2023). Bandwidth Selection for Gaussian Kernel Ridge Regression via Jacobian Control. https://doi.org/10.48550/arXiv.2205.11956
dc.relation.haspart: V. Allerbo, O., Jörnsten, R. (2023). Solving Kernel Ridge Regression with Gradient-Based Optimization Methods. https://doi.org/10.48550/arXiv.2306.16838
dc.subject: sparse regression
dc.subject: kernel regression
dc.subject: neural network regression
dc.subject: early stopping
dc.subject: bandwidth selection
dc.title: Efficient training of interpretable, non-linear regression models
dc.type: Text
dc.type.degree: Doctor of Philosophy
dc.type.svep: Doctoral thesis

Files

Original bundle (showing 3 of 3)

- Kappa Oskar Allerbo.pdf (1.37 MB, Adobe Portable Document Format): Thesis
- Spikblad A5.pdf (47.45 KB, Adobe Portable Document Format): Abstract
- Omslag Oskar Allerbo.pdf (443.2 KB, Adobe Portable Document Format): Cover

License bundle (showing 1 of 1)

- license.txt (4.68 KB): Item-specific license agreed upon to submission