Andrew Jacobsen

Ph.D. Student

About Me

I am a Ph.D. student at the University of Alberta, advised by Ashok Cutkosky (Boston University) and Martha White (University of Alberta). My primary focus is online learning theory, particularly algorithms that require no hyperparameter tuning and can adapt to time-varying objectives on the fly. I am also broadly interested in machine learning theory, reinforcement learning, and bandits.

Feel free to get in touch!

Publications

2023

Andrew Jacobsen, Ashok Cutkosky. “Unconstrained Online Learning with Unbounded Losses.” International Conference on Machine Learning (ICML), 2023. (in press)

2022

Andrew Jacobsen, Ashok Cutkosky. “Parameter-free Mirror Descent.” Conference on Learning Theory (COLT), 2022.

2021

Matthew McLeod, Chunlok Lo, Matthew Schlegel, Andrew Jacobsen, Raksha Kumaraswamy, Martha White, Adam White. “Continual Auxiliary Task Learning.” Neural Information Processing Systems (NeurIPS), 2021.

Matthew Schlegel, Andrew Jacobsen, Muhammad Zaheer, Andrew Patterson, Adam White, Martha White. “General Value Function Networks.” Journal of Artificial Intelligence Research (JAIR), 2021.

2019

Andrew Jacobsen. “Vector Step-size Adaptation for Continual, Online Prediction.” M.Sc. thesis, University of Alberta, 2019.

Andrew Jacobsen, Matthew Schlegel, Cameron Linke, Thomas Degris, Adam White, Martha White. “Meta-descent for Online, Continual Prediction.” Proceedings of the AAAI Conference on Artificial Intelligence, vol. 33, pp. 3943–3950, 2019.

Education

University of Alberta

Ph.D. Computer Science

2019 - present

Advisors: Ashok Cutkosky, Martha White

University of Alberta

M.Sc. Computer Science

2018 - 2019

Thesis: Vector Step-size Adaptation for Continual, Online Prediction.

Advisors: Martha White, Adam White

University of Alberta

B.Sc. Computer Science

2013 - 2018