Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. Nils Reimers and Iryna Gurevych, 2019. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), Association for Computational Linguistics. DOI: 10.18653/v1/D19-1410 - This paper introduced Sentence-BERT, a widely used framework for generating sentence embeddings and the foundation for the Sentence Transformers library; it is directly relevant to the base models and fine-tuning strategies discussed.
LoRA: Low-Rank Adaptation of Large Language Models. Edward J. Hu, Yelong Shen, Phillip Wallis, Zeyuan Allen-Zhu, Yuanzhi Li, Shean Wang, Lu Wang, and Weizhu Chen, 2022. International Conference on Learning Representations (ICLR 2022). DOI: 10.48550/arXiv.2106.09685 - This paper introduced Low-Rank Adaptation (LoRA), a parameter-efficient fine-tuning method that addresses computational cost and catastrophic forgetting, as described for adapter-based fine-tuning.