Nadam Optimizer
Gradient descent is an optimization algorithm that follows the negative gradient of an objective function in order to locate the function's minimum. Nesterov-accelerated Adaptive Moment Estimation, or Nadam, extends the Adaptive Moment Estimation (Adam) optimization algorithm with Nesterov's Accelerated Gradient (NAG), also called Nesterov momentum, an improved form of classical momentum.
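To make the idea concrete, below is a minimal sketch of a single Nadam update in Python, following the formulation in Dozat (2016): the first- and second-moment estimates are maintained exactly as in Adam, but the bias-corrected first moment mixes in the current gradient as a Nesterov-style look-ahead. The function name `nadam_step`, the toy objective, and the hyperparameter defaults are illustrative assumptions, not a definitive implementation.

```python
import numpy as np

def nadam_step(x, g, m, v, t, alpha=0.002, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Nadam update for parameters x given gradient g.

    m, v are the running first- and second-moment estimates and t is the
    1-based step counter. Hyperparameter defaults are common choices,
    not prescribed by the source text.
    """
    # Update the biased moment estimates, exactly as in Adam.
    m = beta1 * m + (1.0 - beta1) * g
    v = beta2 * v + (1.0 - beta2) * g**2
    # Nesterov-style correction: blend the current gradient into the
    # bias-corrected first moment, giving a look-ahead step (Dozat, 2016).
    m_hat = beta1 * m / (1.0 - beta1**(t + 1)) + (1.0 - beta1) * g / (1.0 - beta1**t)
    v_hat = v / (1.0 - beta2**t)
    # Parameter update scaled per-coordinate by the second moment.
    x = x - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return x, m, v

# Toy usage: minimize f(x) = x0^2 + 10 * x1^2 from a fixed starting point.
x = np.array([3.0, -2.0])
m = np.zeros_like(x)
v = np.zeros_like(x)
for t in range(1, 501):
    g = np.array([2.0 * x[0], 20.0 * x[1]])  # analytic gradient of f
    x, m, v = nadam_step(x, g, m, v, t)
print(x)  # converges toward the minimum at (0, 0)
```

In this sketch the only change relative to plain Adam is the `m_hat` line: instead of simply de-biasing the stored momentum, it also folds in the freshly computed gradient, which is what gives Nadam its Nesterov-style anticipatory behavior.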