MohammedSaqibMS/Initialization

This repository explores the impact of various weight initialization methods on a neural network's performance, comparing zero, random, and He initialization. It includes visualizations of the cost function and decision boundaries.

Neural Network Initialization and Training 🧠🚀

Welcome to the Neural Network Initialization and Training project! This project demonstrates how the choice of weight initialization affects the learning and performance of deep neural networks. We'll cover three key initialization methods:

  • Zero Initialization ➡️ All weights set to zero, so every neuron computes the same function and the network fails to learn.
  • Random Initialization ➡️ Weights drawn at random; this breaks symmetry, but poorly scaled values can slow or destabilize learning.
  • He Initialization ➡️ Weights scaled by √(2/n) for a layer with n inputs, following He et al.; well suited to ReLU activations.
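The three strategies above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the function names, layer sizes, and the random-scale factor are assumptions chosen for clarity.

```python
import numpy as np

def initialize_zeros(layer_dims):
    """All weights and biases zero: every neuron computes the same
    function, so gradient updates never break symmetry."""
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = np.zeros((layer_dims[l], layer_dims[l - 1]))
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def initialize_random(layer_dims, scale=10.0):
    """Random Gaussian weights break symmetry; a large scale
    (illustrative default) can saturate activations and slow learning."""
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * scale
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params

def initialize_he(layer_dims):
    """He initialization: scale each layer's weights by sqrt(2 / fan_in),
    keeping the variance of ReLU activations roughly constant across layers."""
    params = {}
    for l in range(1, len(layer_dims)):
        params[f"W{l}"] = (np.random.randn(layer_dims[l], layer_dims[l - 1])
                           * np.sqrt(2.0 / layer_dims[l - 1]))
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))
    return params
```

For example, `initialize_he([2, 10, 1])` returns `W1` of shape `(10, 2)` scaled by `sqrt(2/2) = 1` and `W2` of shape `(1, 10)` scaled by `sqrt(2/10)`; biases are conventionally left at zero under all three schemes.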

🙏 Credits

This project is based on concepts and methodologies taught in the Deep Learning Specialization by Andrew Ng and DeepLearning.AI.
