Deep Learning, Ian Goodfellow, Yoshua Bengio, and Aaron Courville, 2016 (MIT Press) - This book provides comprehensive coverage of deep learning fundamentals, including detailed discussions of overfitting, the bias-variance tradeoff, and regularization techniques such as L1/L2 regularization, Dropout, and Early Stopping.
Dropout: A Simple Way to Prevent Neural Networks from Overfitting, Nitish Srivastava, Geoffrey Hinton, Alex Krizhevsky, Ilya Sutskever, and Ruslan Salakhutdinov, 2014 (Journal of Machine Learning Research, Vol. 15), DOI: 10.5555/2627435.2670313 - The original publication introducing Dropout, a technique essential for mitigating overfitting in neural networks.
Regularization, Stanford CS231n Course Staff, 2023 (Stanford University) - Official course notes from a widely respected university course, offering clear explanations of regularization methods for neural networks.