[LoRA] fix indexing in LoRA state dict expansion utils by sayakpaul · Pull Request #10396 · huggingface/diffusers · GitHub

Conversation

sayakpaul (Member) commented Dec 27, 2024

What does this PR do?

We introduced a bug in the LoRA state dict expansion utils, in this line:

base_weight_param = transformer_state_dict[base_param_name]

With the latest diffusers, this makes the following code throw an error:

Reproducer
from diffusers import DiffusionPipeline 
import torch 

lora_one = "Purz/choose-your-own-adventure"
lora_two = "ByteDance/Hyper-SD"

pipeline = DiffusionPipeline.from_pretrained("black-forest-labs/FLUX.1-dev", torch_dtype=torch.bfloat16).to("cuda")
pipeline.load_lora_weights(lora_one)
print(pipeline.get_active_adapters())  # ['default_0']

pipeline.load_lora_weights(lora_two, weight_name="Hyper-FLUX.1-dev-8steps-lora.safetensors")
print(pipeline.get_active_adapters())  # ['default_1']

pipeline.set_adapters(["default_0", "default_1"])
print(pipeline.get_active_adapters()) # ['default_0', 'default_1']
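
For illustration, here is a minimal sketch of how an indexing assumption like this can fail. It is not the actual diffusers implementation; the state dict contents and the helper below are made up. If the lookup key is built on the assumption that every module already has a PEFT-wrapped ".base_layer.weight" entry, modules that the first adapter did not touch are missing that key, and the plain dict lookup raises a KeyError.

import torch

# Hypothetical state dict, not the real FLUX transformer: one module is wrapped by a
# previously loaded LoRA (hence the ".base_layer.weight" key), the other is not.
transformer_state_dict = {
    "proj_out.base_layer.weight": torch.zeros(64, 128),
    "x_embedder.weight": torch.zeros(128, 64),
}

def lookup_base_weight(module_name, is_peft_loaded):
    # Fragile assumption: once any LoRA is loaded, every module has a base_layer entry.
    key = f"{module_name}.base_layer.weight" if is_peft_loaded else f"{module_name}.weight"
    return transformer_state_dict[key]

print(lookup_base_weight("proj_out", is_peft_loaded=True).shape)  # works
try:
    lookup_base_weight("x_embedder", is_peft_loaded=True)
except KeyError as exc:
    print(f"KeyError: {exc}")  # the kind of failure the reproducer above runs into

# A defensive variant falls back to the plain ".weight" key when the wrapped key is absent.
def lookup_base_weight_safe(module_name, is_peft_loaded):
    key = f"{module_name}.base_layer.weight"
    if not (is_peft_loaded and key in transformer_state_dict):
        key = f"{module_name}.weight"
    return transformer_state_dict[key]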

This went uncaught because we don't have a test to mimic this scenario.

So, this PR fixes this behavior and also adds a test for it.
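
For reference, a hypothetical sketch of the kind of regression test this calls for (the test name and pipeline fixture are illustrative, not the actual test added to the diffusers suite): load two LoRAs that target different sets of modules, activate both adapters, and assert that both are reported as active.

def test_load_two_loras_and_activate_both(pipeline):
    # `pipeline` is assumed to be a FLUX DiffusionPipeline, set up as in the reproducer above.
    pipeline.load_lora_weights("Purz/choose-your-own-adventure")
    pipeline.load_lora_weights("ByteDance/Hyper-SD", weight_name="Hyper-FLUX.1-dev-8steps-lora.safetensors")
    pipeline.set_adapters(["default_0", "default_1"])
    assert pipeline.get_active_adapters() == ["default_0", "default_1"]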

I think we have to do a patch release for this (which I can take care of). Related to #10392.

HuggingFaceDocBuilderDev commented

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

sayakpaul requested a review from a-r-r-o-w on December 27, 2024 at 13:37
sayakpaul marked this pull request as ready for review on December 27, 2024 at 13:38
sayakpaul requested a review from yiyixuxu on December 27, 2024 at 14:21
sayakpaul (Member, Author) commented

Closing in favor of #10388. Cc: @hlky

sayakpaul closed this on Dec 29, 2024
sayakpaul deleted the fix-lora-expansion-flux branch on December 29, 2024 at 15:30