Download the Anaconda Individual Edition installer and then run bash <.sh file path> in a terminal
Refer https://docs.anaconda.com/anaconda/install/uninstall/ for uninstall instructions
Refer https://jupyter.org/install
conda install -c conda-forge jupyterlab
Refer https://pytorch.org/get-started/locally/
conda install pytorch -c pytorch
Refer https://www.analyticsvidhya.com/blog/2019/09/introduction-to-pytorch-from-scratch/
Refer https://anaconda.org/conda-forge/transformers
conda install -c conda-forge transformers
The issue with learning rate schedules is that they all depend on hyperparameters that must be chosen manually for each training session and that can vary greatly with the problem at hand or the model used. To combat this, there are many adaptive gradient descent algorithms such as Adagrad, Adadelta, RMSprop, and Adam, which are generally built into deep learning libraries such as Keras and PyTorch.
Refer: https://en.wikipedia.org/wiki/Learning_rate#Adaptive_learning_rate
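To make the idea concrete, here is a minimal pure-Python sketch of the Adagrad update on a toy 1-D objective. The function and parameter names (`adagrad_minimize`, `grad`, `lr`, `eps`) are illustrative, not from the references above: the point is only that each step is divided by the square root of the accumulated squared gradients, so the effective learning rate adapts automatically instead of following a hand-tuned schedule.

```python
import math

def adagrad_minimize(grad, x0, lr=1.0, eps=1e-8, steps=100):
    """Minimize a 1-D function with the Adagrad update rule.

    The step size shrinks over time for a parameter that keeps
    receiving large gradients, because the accumulated sum of
    squared gradients grows in the denominator.
    """
    x, g_accum = x0, 0.0
    for _ in range(steps):
        g = grad(x)
        g_accum += g * g                          # running sum of squared gradients
        x -= lr / (math.sqrt(g_accum) + eps) * g  # per-step adaptive learning rate
    return x

# Toy objective f(x) = x**2 with gradient 2*x; the minimum is at x = 0.
x_min = adagrad_minimize(lambda x: 2 * x, x0=5.0)
```

In a real training loop you would not implement this by hand; the built-in optimizers (e.g. `torch.optim.Adagrad` or `torch.optim.Adam` in PyTorch) apply the same idea per parameter.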