Architecture
Gradient Descent
When training a neural network, an algorithm is used to minimize the loss. This algorithm is called Gradient Descent. The loss measures how far the outputs of the hypothesis function are from the correct values, and the gradient is the vector of partial derivatives of the loss with respect to the network's parameters. Gradient descent repeatedly updates the parameters by taking small steps in the direction opposite to the gradient, which lowers the loss.
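
As a minimal sketch (not from the original post), here is how plain gradient descent could look for a one-parameter linear hypothesis h(x) = w·x with a mean-squared-error loss; the data, variable names, and learning rate are illustrative assumptions.

```python
# Minimal sketch of gradient descent on a one-parameter linear hypothesis
# h(x) = w * x, using a mean-squared-error loss. Values are illustrative.

xs = [1.0, 2.0, 3.0, 4.0]   # inputs
ys = [2.0, 4.0, 6.0, 8.0]   # targets (true relationship: y = 2x)

w = 0.0                     # initial parameter guess
learning_rate = 0.01

for step in range(1000):
    # Gradient of the loss with respect to w:
    # d/dw mean((w*x - y)^2) = mean(2 * x * (w*x - y))
    grad = sum(2 * x * (w * x - y) for x, y in zip(xs, ys)) / len(xs)
    # Step opposite to the gradient to reduce the loss
    w -= learning_rate * grad

print(f"learned w ≈ {w:.3f}")   # approaches 2.0
```

The same loop generalizes to a full neural network: the parameter w becomes a vector of weights, and the hand-written derivative is replaced by gradients computed via backpropagation.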