Make create_extended_attention_mask_for_decoder static method by pbelevich · Pull Request #16893 · huggingface/transformers · GitHub

Conversation

@pbelevich
Contributor

What does this PR do?

create_extended_attention_mask_for_decoder doesn't access self and can be a @staticmethod. This resolves some issues with fx tracing for the PyTorch pipeline parallelism project.
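
A minimal sketch of the pattern (illustrative, not the verbatim transformers implementation; the `ModuleUtilsMixinSketch` class name is hypothetical): because the helper only operates on its tensor arguments, it can be declared a @staticmethod, so torch.fx symbolic tracing never has to route the module instance (`self`) through the traced call.

```python
import torch


class ModuleUtilsMixinSketch:
    @staticmethod
    def create_extended_attention_mask_for_decoder(input_shape, attention_mask, device):
        # Build a causal (lower-triangular) mask and combine it with the padding
        # mask -- note that nothing here reads `self`.
        batch_size, seq_length = input_shape
        seq_ids = torch.arange(seq_length, device=device)
        causal_mask = seq_ids[None, None, :].repeat(batch_size, seq_length, 1) <= seq_ids[None, :, None]
        causal_mask = causal_mask.to(attention_mask.dtype)
        # Broadcast to [batch, 1, seq, seq] so it can be applied to attention scores.
        return causal_mask[:, None, :, :] * attention_mask[:, None, None, :]


if __name__ == "__main__":
    attention_mask = torch.ones(2, 5)
    mask = ModuleUtilsMixinSketch.create_extended_attention_mask_for_decoder(
        (2, 5), attention_mask, attention_mask.device
    )
    print(mask.shape)  # torch.Size([2, 1, 5, 5])
```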

cc @michaelbenayoun

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Apr 22, 2022

The documentation is not available anymore as the PR was closed or merged.

@pbelevich pbelevich marked this pull request as ready for review April 22, 2022 15:28
@ydshieh
Collaborator

ydshieh commented Apr 22, 2022

(just a comment)

Would it be possible to provide the code sample for the issue that occurs without this PR, or a link to the issue page?

@michaelbenayoun
Member

Looks good to me once all the tests pass.
Pinging @sgugger for review!

@michaelbenayoun michaelbenayoun requested a review from sgugger April 25, 2022 08:44
Collaborator

@sgugger sgugger left a comment

LGTM, thanks for fixing!

@pbelevich pbelevich force-pushed the make_create_extended_attention_mask_for_decoder_static branch from cfee494 to c2ac8cb on April 29, 2022 13:34
@pbelevich
Contributor Author

> Would it be possible to provide the code sample for the issue that occurs without this PR, or a link to the issue page?

The project will be released and the repository will be made public soon.

@pbelevich
Contributor Author

> Looks good to me once all the tests pass.

@michaelbenayoun @sgugger all tests have passed.

@sgugger sgugger merged commit 63fbed5 into huggingface:main Apr 29, 2022
@sgugger
Collaborator

sgugger commented Apr 29, 2022

Thanks again for your contribution!

stevhliu pushed a commit to stevhliu/transformers that referenced this pull request May 3, 2022
elusenji pushed a commit to elusenji/transformers that referenced this pull request Jun 12, 2022
