
Optimal Qlora settings #316

Open
KnutJaegersberg opened this issue Sep 2, 2023 · 1 comment
Labels
feat/training Feature: Training/Fine-tuning type/feature Type: Feature
Comments

@KnutJaegersberg

In HF transformers, the default QLoRA settings do not replicate the QLoRA of the original paper, leaving valuable performance on the table for ML practitioners who rely on the library defaults.
LoRA has to be applied to specific parts of the network; see this tweet by Tim Dettmers:

https://twitter.com/Tim_Dettmers/status/1695377756232589459

I guess this has to be customized for each model architecture, which sounds like a feature for curated-transformers to me.
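For context, the point in the tweet is that the QLoRA paper attaches adapters to every linear projection in the transformer, not just the query/value projections that older defaults targeted. A minimal sketch of the idea, assuming a Llama-style architecture (the module names and the `all_linear_targets` helper below are illustrative, not part of any library; in practice the resulting list would be passed to `peft`'s `LoraConfig(target_modules=...)`):

```python
# Sketch: select every linear projection as a LoRA target, instead of
# the historical q_proj/v_proj-only default. Module names below assume
# a Llama-style architecture; other architectures name layers differently,
# so one would inspect model.named_modules() to confirm.

def all_linear_targets(named_modules):
    """Given (qualified_name, class_name) pairs, return the unique leaf
    names of the linear layers, i.e. candidate LoRA target modules."""
    return sorted({name.rsplit(".", 1)[-1]
                   for name, cls in named_modules
                   if cls == "Linear"})

# Toy listing mimicking the output of model.named_modules():
modules = [
    ("model.layers.0.self_attn.q_proj", "Linear"),
    ("model.layers.0.self_attn.v_proj", "Linear"),
    ("model.layers.0.mlp.gate_proj", "Linear"),
    ("model.layers.0.input_layernorm", "RMSNorm"),
]
print(all_linear_targets(modules))  # ['gate_proj', 'q_proj', 'v_proj']
```

Because the set of linear layers differs per architecture, a training library would indeed need a per-model mapping, which matches the feature request above.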

@danieldk
Contributor

danieldk commented Sep 5, 2023

Thanks for the suggestion! We hope to look more into training in the coming period and will definitely take this into account.

@danieldk danieldk added type/feature Type: Feature feat/training Feature: Training/Fine-tuning labels Sep 5, 2023
@shadeMe shadeMe added this to the Undecided milestone Oct 19, 2023

3 participants