100 Days Challenge Day 30 - Mini-Batch Gradient Descent, Exponentially Weighted Averages
Watched the Week 2 videos of the Hyperparameters, Regularization and Optimization course on YouTube:
Mini Batch Gradient Descent (C2W2L01)
Understanding Mini-Batch Gradient Descent (C2W2L02)
Exponentially Weighted Averages (C2W2L03)
Understanding Exponentially Weighted Averages (C2W2L04)
Bias Correction of Exponentially Weighted Averages (C2W2L05)
Gradient Descent With Momentum (C2W2L06)
RMSProp (C2W2L07)
Adam Optimization Algorithm (C2W2L08)
Learning Rate Decay (C2W2L09)
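To cement the first two videos, here is a minimal NumPy sketch of splitting a training set into shuffled mini-batches. The function name `mini_batches` and the column-per-example layout (`X` is features x examples, as in the course notation) are my choices, not something specified in the videos:

```python
import numpy as np

def mini_batches(X, Y, batch_size=64, seed=0):
    """Shuffle the training set, then carve it into mini-batches.

    X: (n_features, m), Y: (1, m) -- one column per example.
    The last mini-batch may be smaller than batch_size.
    """
    rng = np.random.default_rng(seed)
    m = X.shape[1]
    perm = rng.permutation(m)           # shuffle examples before partitioning
    X, Y = X[:, perm], Y[:, perm]
    return [(X[:, k:k + batch_size], Y[:, k:k + batch_size])
            for k in range(0, m, batch_size)]

X = np.arange(20).reshape(2, 10).astype(float)
Y = np.ones((1, 10))
batches = mini_batches(X, Y, batch_size=4)
print(len(batches))  # 3 mini-batches: sizes 4, 4, 2
```

With batch_size between 1 (stochastic gradient descent) and m (batch gradient descent), each epoch then loops over these batches and takes one gradient step per batch.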
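The three exponentially weighted averages videos (C2W2L03–L05) boil down to one recurrence plus a startup correction. A small sketch, with the helper name `ewa` being my own:

```python
def ewa(xs, beta=0.9, bias_correct=True):
    """Exponentially weighted average of a sequence.

    v_t = beta * v_{t-1} + (1 - beta) * x_t, optionally divided by
    (1 - beta**t) to correct the initial bias toward zero.
    """
    v = 0.0
    out = []
    for t, x in enumerate(xs, start=1):
        v = beta * v + (1 - beta) * x
        out.append(v / (1 - beta**t) if bias_correct else v)
    return out

# On a constant series, the bias-corrected average recovers the constant
# immediately, while the uncorrected one starts near zero and climbs slowly.
xs = [10.0] * 5
corrected = ewa(xs)                        # all values are 10.0
raw = ewa(xs, bias_correct=False)          # starts at 1.0
```

This is why bias correction matters early on: without it, the first few averages underestimate the data because v_0 = 0.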
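Adam (C2W2L08) combines the momentum video's first-moment average with RMSProp's second-moment average, both bias-corrected. A sketch of a single parameter update, with `adam_step` and the toy objective being my own illustration rather than anything from the lecture:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum term m, RMSProp term v, both bias-corrected."""
    m = beta1 * m + (1 - beta1) * grad          # EWA of gradients (momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2     # EWA of squared gradients (RMSProp)
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Toy run: minimize f(w) = w^2 starting from w = 5.0.
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    grad = 2 * w
    w, m, v = adam_step(w, grad, m, v, t, lr=0.01)
# w ends close to the minimum at 0
```

Note that the effective step size is roughly bounded by lr, because m_hat / sqrt(v_hat) is near ±1 when gradients are consistent.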
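Finally, one common schedule from the learning rate decay video can be written in a single line; the function name `decayed_lr` is mine:

```python
def decayed_lr(alpha0, decay_rate, epoch):
    """alpha = alpha0 / (1 + decay_rate * epoch): the rate shrinks each epoch,
    so later mini-batch steps wander less around the minimum."""
    return alpha0 / (1 + decay_rate * epoch)

# With alpha0 = 0.2 and decay_rate = 1.0, epoch 1 gives alpha = 0.1.
alpha = decayed_lr(0.2, 1.0, 1)
```

The lecture also mentions alternatives such as exponential decay and staircase schedules; they all share the idea of large steps early and small steps late.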