Support for more Dadaptation by sdbds · Pull Request #455 · kohya-ss/sd-scripts · GitHub

Conversation

@sdbds
Contributor

@sdbds sdbds commented Apr 26, 2023

Add more optimizer_type options for DAdaptation:
DAdaptation (DAdaptAdam)
DAdaptAdaGrad
DAdaptAdan
DAdaptSGD

I will test tomorrow.
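The new values listed above could plausibly map onto classes exposed by the dadaptation PyPI package. A minimal dispatch sketch, assuming that mapping (the table and helper name are illustrative, not the code merged in this PR):

```python
# Hypothetical sketch: map the new optimizer_type values onto class names
# in the `dadaptation` package.  This table is an assumption for
# illustration, not the actual sd-scripts implementation.
DADAPT_OPTIMIZERS = {
    "dadaptation": "DAdaptAdam",  # original name kept as an alias
    "dadaptadam": "DAdaptAdam",
    "dadaptadagrad": "DAdaptAdaGrad",
    "dadaptadan": "DAdaptAdan",
    "dadaptsgd": "DAdaptSGD",
}

def resolve_dadapt_class_name(optimizer_type: str) -> str:
    """Normalize a CLI value like 'DAdaptSGD' to a dadaptation class name."""
    key = optimizer_type.lower()
    if key not in DADAPT_OPTIMIZERS:
        raise ValueError(f"unknown DAdaptation optimizer: {optimizer_type}")
    return DADAPT_OPTIMIZERS[key]

# The class itself would then be looked up lazily, e.g.:
#   import dadaptation
#   cls = getattr(dadaptation, resolve_dadapt_class_name(args.optimizer_type))
```

Lazy lookup keeps `dadaptation` an optional dependency: the import only happens when one of these optimizer types is actually requested.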

@sdbds sdbds changed the base branch from main to dev April 27, 2023 02:24
@sdbds sdbds marked this pull request as ready for review April 27, 2023 02:24
@sdbds
Contributor Author

sdbds commented Apr 27, 2023

It works well. I even tried using DAdaptAdan for block-weight learning rates.
[screenshot attached]

@kohya-ss
Owner

The get_optimizer method looks like it would be better refactored so that the common parts are shared. If that is difficult to deal with, I will update it after the merge.
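One reading of this suggestion: the per-optimizer branches in get_optimizer differ mainly in which class they instantiate, so the repeated tail of each branch could be factored out. A minimal sketch under that assumption (the helper name and logging are illustrative, not the merged refactor):

```python
# Hypothetical refactor sketch: the part every optimizer branch repeats,
# pulled into one shared helper.  Not the actual sd-scripts code.
def build_optimizer(optimizer_class, trainable_params, lr, optimizer_kwargs):
    # Announce the choice (sd-scripts logs the selected optimizer), then
    # construct it with the user-supplied keyword arguments.
    print(f"use {optimizer_class.__name__} | lr={lr} | {optimizer_kwargs}")
    return optimizer_class(trainable_params, lr=lr, **optimizer_kwargs)
```

With a helper like this, each branch of get_optimizer would only need to pick the class and its defaults, rather than duplicating the construction and logging code.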

@kohya-ss
Owner

Thank you for this! I will take a look when I have time!

@sdbds
Contributor Author

sdbds commented May 5, 2023

I found something interesting: pytorch_optimizer. It includes 47 optimizers and 6 LR schedulers, and many of its methods can be invoked directly.
https://github.com/kozistr/pytorch_optimizer

@kohya-ss kohya-ss merged commit 164a197 into kohya-ss:dev May 6, 2023
@kohya-ss
Owner

kohya-ss commented May 6, 2023

Thank you for this! I've merged. Sorry for the delay.

pytorch_optimizer looks good! I think we can use the optimizer/lr scheduler with the arbitrary optimizer/scheduler option, like --optimizer_type=pytorch_optimizer.AdamP.
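The arbitrary --optimizer_type=module.ClassName idea can be sketched with a dynamic import. This is a guess at the mechanism, not the code sd-scripts ships; the bare-name fallback to torch.optim is likewise an assumption:

```python
import importlib

def get_optimizer_class(optimizer_type: str):
    """Resolve a 'pkg.ClassName' string to a class by dynamic import.

    Bare names (no dot) are assumed here to come from torch.optim --
    an illustrative default, not sd-scripts' actual rule.
    """
    if "." in optimizer_type:
        module_name, class_name = optimizer_type.rsplit(".", 1)
    else:
        module_name, class_name = "torch.optim", optimizer_type
    module = importlib.import_module(module_name)
    return getattr(module, class_name)

# e.g. get_optimizer_class("pytorch_optimizer.AdamP") would import the
# pytorch_optimizer package and return its AdamP class, with no
# per-optimizer branch needed in the training script.
```

The mechanism is package-agnostic: any importable `module.ClassName` works, which is exactly what makes a single flag cover all 47 optimizers in pytorch_optimizer.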

wkpark pushed a commit to wkpark/sd-scripts that referenced this pull request Feb 27, 2024
