Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks, Chelsea Finn, Pieter Abbeel, Sergey Levine, 2017. Proceedings of the 34th International Conference on Machine Learning, Vol. 70 (PMLR). DOI: 10.55989/t2q5 - This foundational paper introduces MAML, a prominent gradient-based meta-learning algorithm that exemplifies the inner-loop-unrolling approach to bilevel optimization.
Bilevel Optimization for Machine Learning: A Survey, Yifan Lu, Zhichao Huang, Luyao Niu, Weishan Zhang, Xiang Li, Shouyang Wang, Xin Li, Xiaodong Yang, Song Guo, 2020. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 43 (IEEE). DOI: 10.1109/TPAMI.2020.3006214 - This survey gives a comprehensive overview of bilevel optimization techniques applied to machine learning problems, including meta-learning, and compares the main algorithmic approaches.
OptNet: Differentiable Optimization as a Layer in Neural Networks, Brandon Amos, J. Zico Kolter, 2017. Proceedings of the 34th International Conference on Machine Learning, Vol. 70 (PMLR). DOI: 10.55989/v70-amos17a - This paper provides foundational insights into differentiating through the solution of an optimization problem embedded as a network layer, a core technique underlying implicit-differentiation methods for meta-learning and other machine learning tasks.
On First-Order Meta-Learning Algorithms, Alex Nichol, Joshua Achiam, John Schulman, 2018. arXiv preprint, abs/1803.02999. DOI: 10.48550/arXiv.1803.02999 - Introduces Reptile, a simple and efficient first-order meta-learning algorithm that offers an alternative to MAML-style gradient-based meta-learning: it approximates the meta-update by repeatedly running SGD on a sampled task and moving the initialization toward the adapted weights (see the sketch after this list).
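The two gradient-based strategies contrasted by these references can be seen side by side in a small toy sketch. The snippet below is a minimal illustration, assuming PyTorch; the 1-D regression "task", learning rates, single MAML inner step, and five Reptile SGD steps are made-up placeholders rather than settings from the papers. It shows MAML differentiating through an unrolled inner update versus Reptile running plain SGD on a task and nudging the initialization toward the adapted weights.

```python
# Toy contrast of MAML's unrolled inner loop (Finn et al.) and Reptile's
# first-order update (Nichol et al.). Assumes PyTorch; the 1-D regression
# "task", learning rates, and step counts are illustrative only.
import torch

torch.manual_seed(0)
theta = torch.zeros(1, requires_grad=True)   # meta-parameters (shared initialization)
inner_lr, meta_lr = 0.1, 0.01

def task_loss(w):
    # Stand-in for a sampled task: fit slope 2.0 on fixed inputs.
    x = torch.linspace(-1.0, 1.0, 10)
    return ((w * x - 2.0 * x) ** 2).mean()

# MAML: one inner SGD step kept in the graph (create_graph=True), so the
# meta-gradient differentiates through the adaptation itself (second-order).
inner_grad = torch.autograd.grad(task_loss(theta), theta, create_graph=True)[0]
theta_adapted = theta - inner_lr * inner_grad          # unrolled inner step
meta_grad = torch.autograd.grad(task_loss(theta_adapted), theta)[0]
with torch.no_grad():
    theta -= meta_lr * meta_grad

# Reptile: plain SGD on the task, then move the initialization toward the
# adapted weights; no gradients flow through the inner loop.
w = theta.detach().clone()
for _ in range(5):
    w.requires_grad_(True)
    g = torch.autograd.grad(task_loss(w), w)[0]
    w = (w - inner_lr * g).detach()
with torch.no_grad():
    theta += meta_lr * (w - theta)

print("theta after one MAML and one Reptile meta-update:", theta.item())
```

The first-order MAML variant discussed by Nichol et al. corresponds to dropping create_graph=True above, i.e. ignoring the second-order terms in the unrolled step.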