Fix PyTorch 2.3.1 compatibility: add version guard for torch.library.custom_op #12206
Conversation
- Add hasattr() check for torch.library.custom_op and register_fake
- These functions were added in PyTorch 2.4, causing import failures in 2.3.1
- Both decorators and functions are now properly guarded with version checks
- Maintains backward compatibility while preserving functionality

Fixes huggingface#12195
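For reference, a minimal sketch of the `hasattr()` guard this first commit describes (the exact shape is assumed, not the actual diff; `_custom_op`/`_register_fake` are the private aliases used in this file):

```python
import torch

if hasattr(torch.library, "custom_op") and hasattr(torch.library, "register_fake"):
    _custom_op = torch.library.custom_op
    _register_fake = torch.library.register_fake
else:
    # Fallback decorators that return the function unchanged, so importing
    # the module no longer raises AttributeError on PyTorch 2.3.x.
    def _custom_op(name, fn=None, /, *, mutates_args, device_types=None, schema=None):
        return (lambda func: func) if fn is None else fn

    def _register_fake(op, fn=None, /, *, lib=None, _stacklevel=1):
        return (lambda func: func) if fn is None else fn
```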
Hi, thanks for the fix! I think we can improve a bit here by using the approach followed in #11941. Specifically, the following lines:
diffusers/src/diffusers/models/attention_dispatch.py, lines 121 to 139 at 0018b62:

```python
if torch.__version__ >= "2.4.0":
    _custom_op = torch.library.custom_op
    _register_fake = torch.library.register_fake
else:

    def _custom_op_no_op(name, fn=None, /, *, mutates_args, device_types=None, schema=None):
        def wrap(func):
            return func

        return wrap if fn is None else fn

    def _register_fake_no_op(op, fn=None, /, *, lib=None, _stacklevel=1):
        def wrap(func):
            return func

        return wrap if fn is None else fn

    _custom_op = _custom_op_no_op
    _register_fake = _register_fake_no_op
```
Use dummy decorators approach for PyTorch version compatibility
- Replace hasattr check with version string comparison
- Add no-op decorator functions for PyTorch < 2.4.0
- Follows pattern from huggingface#11941 as suggested by reviewer
- Maintains cleaner code structure without indentation changes
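For context, a minimal sketch of how the guarded aliases from the snippet above get applied (the op name `mylib::scaled_add` and the function are hypothetical, for illustration only; this assumes `_custom_op`/`_register_fake` are defined as quoted above):

```python
import torch

# On PyTorch >= 2.4 this registers a real custom op; on older versions the
# no-op fallbacks simply return the function unchanged, so imports still work.
@_custom_op("mylib::scaled_add", mutates_args=())
def scaled_add(x: torch.Tensor, y: torch.Tensor, scale: float) -> torch.Tensor:
    return x + scale * y


# The fake (meta) implementation only needs to produce correctly shaped output.
@_register_fake("mylib::scaled_add")
def _(x, y, scale):
    return torch.empty_like(x)
```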
Hi @a-r-r-o-w, tested with PyTorch 2.3.1 and the import works correctly.
Thanks for the updates! Just one more comment.
Update all the decorator usages
Co-authored-by: Aryan <contact.aryanvs@gmail.com>
Hi @a-r-r-o-w, I've addressed all your feedback: moved the version check to the top of the file and used private naming with underscores.
Thanks for looking into it @Aishwarya0811!
@bot /style
Style bot fixed some files and pushed the changes.
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Fix PyTorch 2.3.1 compatibility: add version guard for torch.library.custom_op (#12206)

* Fix PyTorch 2.3.1 compatibility: add version guard for torch.library.custom_op
  - Add hasattr() check for torch.library.custom_op and register_fake
  - These functions were added in PyTorch 2.4, causing import failures in 2.3.1
  - Both decorators and functions are now properly guarded with version checks
  - Maintains backward compatibility while preserving functionality
  Fixes #12195

* Use dummy decorators approach for PyTorch version compatibility
  - Replace hasattr check with version string comparison
  - Add no-op decorator functions for PyTorch < 2.4.0
  - Follows pattern from #11941 as suggested by reviewer
  - Maintains cleaner code structure without indentation changes

* Update src/diffusers/models/attention_dispatch.py
  Update all the decorator usages
  Co-authored-by: Aryan <contact.aryanvs@gmail.com>

* Update src/diffusers/models/attention_dispatch.py
  Co-authored-by: Aryan <contact.aryanvs@gmail.com>

* Update src/diffusers/models/attention_dispatch.py
  Co-authored-by: Aryan <contact.aryanvs@gmail.com>

* Update src/diffusers/models/attention_dispatch.py
  Co-authored-by: Aryan <contact.aryanvs@gmail.com>

* Move version check to top of file and use private naming as requested

* Apply style fixes

---------

Co-authored-by: Aryan <contact.aryanvs@gmail.com>
Co-authored-by: Aryan <aryan@huggingface.co>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
What does this PR do?
Fixes #12195 by adding version guards for `torch.library.custom_op` and `torch.library.register_fake`, which are not available in PyTorch 2.3.1.

Problem
- `torch.library.custom_op` and `torch.library.register_fake` were introduced in PyTorch 2.4
- On 2.3.1, `AttributeError: module 'torch.library' has no attribute 'custom_op'` is raised by `from diffusers import AutoencoderKL` and other imports
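A quick way to check for the problem above on a given install (a sketch; the version output is illustrative):

```python
import torch

print(torch.__version__)                    # e.g. "2.3.1"
print(hasattr(torch.library, "custom_op"))  # False on 2.3.x, True on >= 2.4

# On 2.3.x, any module that touches torch.library.custom_op at import time
# fails, e.g.:
#   from diffusers import AutoencoderKL
#   AttributeError: module 'torch.library' has no attribute 'custom_op'
```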
Solution
- Add `hasattr()` checks before using these `torch.library` functions (later switched to a version string comparison with no-op fallbacks, per review)

Testing
- Tested with PyTorch 2.3.1; the import works correctly

Fixes #12195
Who can review?
@sayakpaul @yiyixuxu - This is a PyTorch compatibility fix for core library functionality.