Nesterov Momentum, also known as Nesterov Accelerated Gradient (NAG), is an optimization algorithm that limits the overshooting of Momentum Gradient Descent by using "look-ahead" gradients: the gradient is evaluated at the point the accumulated momentum is about to carry the parameters to, rather than at the current parameters.
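As a rough sketch of the idea (the function and parameter names here are illustrative, not from any particular library), a single NAG step looks ahead along the velocity before computing the gradient:

```python
def nag_update(w, v, grad_fn, lr=0.01, momentum=0.9):
    """One Nesterov accelerated gradient step (illustrative sketch).

    Plain momentum evaluates grad_fn at the current weights w; NAG
    instead evaluates it at the 'look-ahead' point w + momentum * v,
    the place the momentum is about to carry us, so the step is
    corrected before an overshoot happens.
    """
    lookahead = w + momentum * v              # peek ahead along the velocity
    v = momentum * v - lr * grad_fn(lookahead)  # update velocity with look-ahead gradient
    w = w + v                                 # move by the new velocity
    return w, v

# Usage: minimize f(w) = w^2, whose gradient is 2w.
w, v = 5.0, 0.0
for _ in range(100):
    w, v = nag_update(w, v, lambda x: 2.0 * x, lr=0.1, momentum=0.9)
# w is now very close to the minimum at 0
```

Because the gradient is taken at the look-ahead point, the velocity starts shrinking slightly earlier than in plain momentum, which is what damps the characteristic overshoot-and-oscillate behavior.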
If you have any questions about what we covered in this video, feel free to ask in the comment section below and I'll do my best to answer them.
If you enjoy these tutorials and would like to support them, the easiest way is to simply like the video and give it a thumbs up. It's also a huge help to share these videos with anyone who you think would find them useful.
Please consider clicking the SUBSCRIBE button to be notified of future videos, and thank you all for watching.
You can find me on:
Blog - [ Link ]
Twitter - [ Link ]
GitHub - [ Link ]
Medium - [ Link ]
#GradientDescent #Optimization