To add Stochastic Gradient Descent to Documentation (#63805)
authorIlqar Ramazanli <iramazanli@fb.com>
Wed, 8 Sep 2021 22:20:52 +0000 (15:20 -0700)
committerFacebook GitHub Bot <facebook-github-bot@users.noreply.github.com>
Wed, 8 Sep 2021 22:22:30 +0000 (15:22 -0700)
commit149f1114fe94dc90fbb7000669a98c2337e25a54
tree17fdd84571f7eee2d1d6ae1f3ffb3124671964d7
parentff18195df91646ff2a35a7dbe4ff8a6ab87b7934

Summary:
It has been discussed before that adding descriptions of optimization algorithms to the PyTorch Core documentation could serve as a useful optimization research tutorial. The tracking issue https://github.com/pytorch/pytorch/issues/63236 lists all the necessary algorithms together with links to the originally published papers.

In this PR we add a description of Stochastic Gradient Descent to the documentation.

<img width="466" alt="SGDalgo" src="https://user-images.githubusercontent.com/73658284/132585881-b351a6d4-ece0-4825-b9c0-126d7303ed53.png">
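The update rule documented here can be sketched as a small standalone function. This is an illustrative sketch of the algorithm shown in the screenshot (weight decay, momentum, dampening, and Nesterov variants), not the actual implementation in `torch/optim/sgd.py`; the function name and signature are hypothetical.

```python
def sgd_step(param, grad, buf=None, lr=0.1, momentum=0.0,
             dampening=0.0, weight_decay=0.0, nesterov=False):
    """One SGD step on a scalar parameter.

    Returns (new_param, new_momentum_buffer). `buf` is the momentum
    buffer from the previous step, or None on the first step.
    """
    # g_t <- g_t + lambda * theta_{t-1}  (L2 weight decay)
    g = grad + weight_decay * param
    if momentum != 0.0:
        if buf is None:
            buf = g  # b_1 <- g_1 on the first step
        else:
            # b_t <- mu * b_{t-1} + (1 - tau) * g_t
            buf = momentum * buf + (1.0 - dampening) * g
        # Nesterov looks ahead along the momentum direction
        g = g + momentum * buf if nesterov else buf
    # theta_t <- theta_{t-1} - gamma * g_t
    return param - lr * g, buf
```

For example, with `lr=0.1` and no momentum, a parameter of `1.0` with gradient `0.5` moves to `0.95`; with `momentum=0.9`, the second step's update grows because the buffer accumulates past gradients.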

Pull Request resolved: https://github.com/pytorch/pytorch/pull/63805

Reviewed By: albanD

Differential Revision: D30818947

Pulled By: iramazanli

fbshipit-source-id: 3812028e322c8a64f4343552b0c8c4582ea382f3
torch/optim/sgd.py