Regularisation-Dropout Resources
Regularization and dropout are two commonly used techniques in deep learning for preventing a model from overfitting to its training data.

"Regularization" section of the Deep Learning book by Ian Goodfellow, Yoshua Bengio, and Aaron Courville: this section of the book provides a detailed explanation of regularization techniques, including L1 and L2 regularization, dropout, and data augmentation.
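
As a minimal sketch (not taken from the book), the snippet below shows how these two techniques are commonly applied in PyTorch: dropout as a layer inside the network and L2 regularization via the optimizer's weight decay. The model architecture and hyperparameter values are illustrative assumptions, not recommendations from the resource above.

```python
import torch
import torch.nn as nn

# A small illustrative classifier that applies dropout between layers.
class MLP(nn.Module):
    def __init__(self, in_features=784, hidden=256, num_classes=10, p_drop=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Dropout(p=p_drop),          # randomly zeroes activations during training
            nn.Linear(hidden, num_classes),
        )

    def forward(self, x):
        return self.net(x)

model = MLP()

# L2 regularization (weight decay) is applied through the optimizer.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)

model.train()   # dropout is active during training
# ... training loop would go here ...
model.eval()    # dropout is disabled at evaluation time
```

Note that dropout behaves differently in training and evaluation modes, which is why calling `model.train()` and `model.eval()` at the right points matters.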