Learning Rate Warmup in PyTorch
Updated May 11, 2024 · Python
Gradient-based hyperparameter tuning library in PyTorch
Optimizer, LR scheduler, and loss function collections in PyTorch
Polynomial Learning Rate Decay Scheduler for PyTorch
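As a sketch of what a polynomial decay schedule computes (function name, parameters, and defaults below are illustrative assumptions, not this repo's API):

```python
def polynomial_lr(step, total_steps, base_lr, end_lr=0.0, power=2.0):
    """Decay the LR from base_lr toward end_lr following (1 - t)^power."""
    step = min(step, total_steps)          # clamp past the end of training
    remaining = 1.0 - step / total_steps   # fraction of training left
    return (base_lr - end_lr) * remaining ** power + end_lr
```

With power=1.0 this reduces to plain linear decay.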
A guide that integrates PyTorch DistributedDataParallel, Apex, warmup, and a learning rate scheduler, and also covers setting up early stopping and random seeds.
PyTorch cyclic cosine decay learning rate scheduler
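A cyclic cosine schedule decays the LR along a cosine curve and restarts it at the top of each cycle. A minimal framework-free sketch (names here are illustrative, not this repo's interface):

```python
import math

def cyclic_cosine_lr(step, cycle_len, base_lr, min_lr=0.0):
    """Cosine decay from base_lr to min_lr, restarting every cycle_len steps."""
    pos = (step % cycle_len) / cycle_len   # position within the current cycle
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * pos))
```

PyTorch ships a related built-in, `torch.optim.lr_scheduler.CosineAnnealingWarmRestarts`, which additionally supports lengthening cycles.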
Automatic learning-rate scheduler
Warmup learning rate wrapper for PyTorch schedulers
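Several repositories listed here implement the same core warmup idea: scale the base learning rate up over the first few steps before the main schedule takes over. A minimal framework-free sketch (function and parameter names are illustrative, not taken from any listed repo):

```python
def warmup_factor(step, warmup_steps):
    """Linear warmup: return a multiplier that ramps from 1/warmup_steps
    up to 1.0 over the first warmup_steps steps, then stays at 1.0."""
    if step >= warmup_steps:
        return 1.0
    return (step + 1) / warmup_steps
```

In PyTorch, a factor like this can be plugged into `torch.optim.lr_scheduler.LambdaLR` as the `lr_lambda` argument.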
A learning-rate recommendation and benchmarking tool.
sharpDARTS: Faster and More Accurate Differentiable Architecture Search
Keras callback to automatically adjust the learning rate when it stops improving
[PENDING] A lightweight but efficient Transformer model for accurate univariate stock price forecasting, designed for real-time trading applications. This project adapts the vanilla Transformer architecture for higher-precision financial time-series analysis with minimal computational demands.
PyTorch implementation of arbitrary learning rate and momentum schedules, including the One Cycle Policy
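The one-cycle policy ramps the LR up to a peak and then anneals it back down (momentum is typically varied inversely at the same time). A rough LR-only sketch, with parameter names borrowed loosely from PyTorch's `OneCycleLR` but not matching its exact behavior:

```python
import math

def one_cycle_lr(step, total_steps, max_lr, pct_start=0.3, div_factor=25.0):
    """Linear ramp from max_lr/div_factor to max_lr, then cosine anneal to ~0."""
    up_steps = int(total_steps * pct_start)
    start_lr = max_lr / div_factor
    if step < up_steps:                    # warmup phase
        return start_lr + (max_lr - start_lr) * step / up_steps
    pos = (step - up_steps) / (total_steps - up_steps)
    return max_lr * 0.5 * (1.0 + math.cos(math.pi * min(pos, 1.0)))
```

The real `torch.optim.lr_scheduler.OneCycleLR` also anneals momentum and supports a linear annealing strategy.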
Implementation of fluctuation dissipation relations for automatic learning rate annealing.
Code to reproduce the experiments of ICLR2023-paper: How I Learned to Stop Worrying and Love Retraining
Comprehensive image classification for training multilayer perceptron (MLP), LeNet, LeNet5, conv2, conv4, conv6, VGG11, VGG13, VGG16, VGG19 with batch normalization, ResNet18, ResNet34, ResNet50, and MobileNetV2 on MNIST, CIFAR10, CIFAR100, and ImageNet1K.
A method for assigning separate learning rate schedulers to different parameter groups in a model.
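The idea of a separate schedule per parameter group can be sketched as a mapping from group name to schedule function (the group names and rates below are made-up examples, not this repo's interface):

```python
def grouped_lrs(step, schedules):
    """Evaluate one learning rate per parameter group at the given step."""
    return {name: schedule(step) for name, schedule in schedules.items()}

# e.g. decay a pretrained backbone while keeping a fresh head at a fixed LR
schedules = {
    "backbone": lambda step: 0.01 * 0.9 ** step,  # exponential decay
    "head": lambda step: 0.1,                     # constant
}
```

In PyTorch proper, the same effect comes from passing multiple parameter groups to the optimizer and giving `LambdaLR` one `lr_lambda` per group.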
(GECCO2023 Best Paper Nomination) CMA-ES with Learning Rate Adaptation