Dropout vs. batch normalization: effect on accuracy, training and inference times - code for the paper
Updated Dec 19, 2022 - TeX
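The entry above compares dropout with batch normalization. As a rough illustration of what batch normalization computes (a minimal NumPy sketch written for this page, not the paper's code; `gamma` and `beta` stand in for the learned scale and shift parameters):

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then apply a learned scale/shift."""
    mean = x.mean(axis=0)                      # per-feature batch mean
    var = x.var(axis=0)                        # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)    # standardized activations
    return gamma * x_hat + beta

# Demo: after normalization each feature has ~zero mean and ~unit variance.
x = np.random.default_rng(0).standard_normal((256, 4))
out = batch_norm(x, gamma=np.ones(4), beta=np.zeros(4))
```

At inference time this per-batch computation is typically replaced by running statistics, whereas dropout simply becomes the identity, which is one reason the two techniques can differ in inference cost.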
Assignments in an unconstrained optimization course covering the first half of the Nocedal and Wright textbook.
Machine Learning Practical - Coursework 1 Report: a study of overfitting in deep neural networks on the EMNIST dataset, covering how it can be detected and how it can be prevented. Experiments vary network depth and width and apply dropout, L1 and L2 regularization, and Maxout networks.
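Two of the techniques the report lists can be sketched briefly. This is illustrative NumPy code written for this page, not the coursework's implementation; the function names and the penalty coefficient `lam` are assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

def dropout(x, p, training=True):
    """Inverted dropout: zero units with probability p and rescale survivors
    by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x                       # dropout is a no-op at inference
    mask = rng.random(x.shape) >= p    # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

def l2_penalty(weights, lam):
    """L2 regularization term added to the training loss."""
    return lam * sum(float(np.sum(w * w)) for w in weights)
```

Both combat overfitting by discouraging the network from relying too heavily on any single unit or weight: dropout through random masking during training, L2 through a penalty that shrinks large weights.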