What is the significance of learning rate in gradient descent?
January 9, 2025
The learning rate controls how large a step the algorithm takes in the direction of the negative gradient at each iteration. A learning rate that is too large can cause the algorithm to overshoot the minimum or even diverge, while one that is too small leads to slow convergence and can stall in shallow regions. Choosing a good learning rate is therefore crucial for efficient training.
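The trade-off can be sketched with gradient descent on a simple quadratic, f(x) = x². The function, starting point, and learning-rate values below are illustrative choices, not from the original text:

```python
# Minimal sketch: gradient descent on f(x) = x**2 (minimum at x = 0),
# showing how the learning rate affects convergence.

def gradient_descent(lr, steps=20, x0=5.0):
    """Run fixed-step gradient descent on f(x) = x**2."""
    x = x0
    for _ in range(steps):
        grad = 2 * x       # f'(x) = 2x
        x = x - lr * grad  # step size scaled by the learning rate
    return x

for lr in (0.01, 0.1, 1.1):
    print(f"lr={lr}: final x = {gradient_descent(lr):.4f}")
```

After 20 steps, lr=0.1 lands very close to the minimum at 0, lr=0.01 is still far from it, and lr=1.1 overshoots on every step and diverges (each update multiplies x by 1 − 2·lr, which has magnitude greater than 1 when lr > 1).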