Welcome to the Neural Network Initialization and Training project! This project demonstrates how the choice of weight initialization affects a deep learning model's training behavior and final performance. We'll cover three key initialization methods, sketched in code after the list:
- Zero Initialization ➡️ All weights are set to zero.
- Random Initialization ➡️ Weights are randomly initialized.
- He Initialization ➡️ Weights are scaled by √(2/n_prev) following He et al. (2015), which is well suited to ReLU activations.
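
As a rough illustration (not the project's exact code), the three strategies can be sketched in NumPy as below. The `layer_dims` convention, the `scale` factor for random initialization, and the seeds are illustrative assumptions:

```python
import numpy as np

def initialize_parameters_zeros(layer_dims):
    """Zero initialization: every weight and bias starts at 0."""
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = np.zeros((layer_dims[l], layer_dims[l - 1]))
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def initialize_parameters_random(layer_dims, scale=10.0, seed=3):
    """Random initialization: weights drawn from a scaled standard normal.

    The scale factor is an illustrative choice; large scales tend to
    exaggerate vanishing/exploding-gradient problems.
    """
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = rng.standard_normal(
            (layer_dims[l], layer_dims[l - 1])) * scale
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

def initialize_parameters_he(layer_dims, seed=3):
    """He initialization: weights scaled by sqrt(2 / n_prev), for ReLU layers."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        params["W" + str(l)] = rng.standard_normal(
            (layer_dims[l], layer_dims[l - 1])) * np.sqrt(2.0 / layer_dims[l - 1])
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params

# Example: a 3-layer network with 2 inputs, hidden sizes 10 and 5, and 1 output.
params = initialize_parameters_he([2, 10, 5, 1])
print(params["W1"].shape)  # (10, 2)
```

Biases are initialized to zero in all three cases; it is the weight scaling that distinguishes the methods.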
This project is based on concepts and methodologies taught in the Deep Learning Specialization by Andrew Ng and DeepLearning.AI.