Understanding machine learning through dynamical-systems methods

dc.contributor.author: Storm, Ludvig
dc.date.accessioned: 2025-09-29T14:46:22Z
dc.date.available: 2025-09-29T14:46:22Z
dc.date.issued: 2025-09-29
dc.description.abstract: Machine learning has in the past decade been successfully applied to a vast range of tasks, from classification and time-series prediction to optimal navigation. However, the internal mechanisms of many models are still difficult to interpret, and we lack a systematic understanding of when and why they perform successfully. Dynamical-systems theory has long been used to study complex, high-dimensional systems by focusing on their geometric and stability properties. In this thesis, methods from dynamical-systems theory are applied to machine-learning models to gain new insights into their behaviour, with particular emphasis on finite-time Lyapunov exponents (FTLEs) and Lagrangian coherent structures (LCS). In the first part, FTLEs are used to study how feed-forward neural networks organise sensitivity in input space, distinguishing regimes where networks align sensitivity with decision boundaries from regimes where embeddings appear random. In the second part, reservoir computing is analysed from a dynamical-systems perspective, and the maximal Lyapunov exponent of the driven reservoir is identified as the key parameter controlling prediction performance. In the third part, LCS theory is applied to the dynamics of active particles in flows, and it is shown how coherent structures determine when navigation strategies succeed or fail, in particular by explaining the trapping of swimmers in vortical regions. Overall, the thesis demonstrates that concepts originally developed to analyse complex physical systems can be fruitfully applied to machine learning. FTLEs and LCS provide systematic tools for quantifying sensitivity and stability, offering a perspective complementary to existing approaches for analysing when and how machine-learning algorithms learn.
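For context, a minimal sketch of the standard FTLE definition from the dynamical-systems literature (notation assumed, not quoted from the thesis): for a flow map $F^t$ taking an initial condition $\mathbf{x}_0$ to its state after time $t$, the maximal FTLE measures the exponential growth rate of an infinitesimal perturbation,

\[
\lambda_1(t, \mathbf{x}_0) \;=\; \frac{1}{t}\,\ln \sigma_{\max}\!\left(\nabla F^t(\mathbf{x}_0)\right),
\qquad
\sigma_{\max}(A) \;=\; \sqrt{\lambda_{\max}\!\left(A^{\top} A\right)},
\]

where $\nabla F^t$ is the Jacobian of the flow map and $\sigma_{\max}$ its largest singular value. In the feed-forward-network setting of Paper [I], the natural reading (suggested by the paper's title rather than spelled out here) is that layer depth plays the role of $t$ and the input-to-layer map plays the role of $F^t$, so that large FTLEs mark directions in input space to which the network is most sensitive.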
dc.gup.defencedate: 2025-10-24
dc.gup.defenceplace: Friday 24 October 2025 at 10:00, PJ-salen, Department of Physics (Institutionen för fysik), Fysik Origo, Fysikgården 2A, Göteborg.
dc.gup.department: Department of Physics ; Institutionen för fysik
dc.gup.dissdb-fakultet: MNF
dc.gup.mail: storm.ludvig@gmail.com
dc.gup.origin: University of Gothenburg. Faculty of Science and Technology
dc.identifier.isbn: 978-91-8115-463-4 (print)
dc.identifier.isbn: 978-91-8115-464-1 (PDF)
dc.identifier.uri: https://hdl.handle.net/2077/89665
dc.language.iso: eng
dc.relation.haspart: Paper [I]: STORM, L., LINANDER, H., BEC, J., GUSTAVSSON, K., & MEHLIG, B. 2024 Finite-time Lyapunov exponents of deep neural networks. Phys. Rev. Lett. 132, 057301. https://doi.org/10.1103/PhysRevLett.132.057301
dc.relation.haspart: Paper [II]: STORM, L., GUSTAVSSON, K., & MEHLIG, B. 2022 Constraints on parameter choices for successful time-series prediction with echo-state networks. Mach. Learn.: Sci. Technol. 3, 045021. https://doi.org/10.1088/2632-2153/aca1f6
dc.relation.haspart: Paper [III]: STORM, L., QIU, J., GUSTAVSSON, K., & MEHLIG, B. 2025 Transport barriers for microswimmers in unsteady flow. Preprint. https://arxiv.org/abs/2509.16430
dc.title: Understanding machine learning through dynamical-systems methods
dc.type: Text
dc.type.degree: Doctor of Philosophy
dc.type.svep: Doctoral thesis

Files

Original bundle (3 files)
PhD_thesis_without_papers_Ludvig_Storm.pdf (5.37 MB, Adobe Portable Document Format): PhD thesis
cover_PhD_thesis.pdf (1.34 MB, Adobe Portable Document Format): Cover
Spikblad_Ludvig_Storm.pdf (1.63 MB, Adobe Portable Document Format): Spikblad (thesis announcement sheet)

License bundle (1 file)
license.txt (4.68 KB): Item-specific license agreed to upon submission