Use ACT2FN to fetch ReLU activation in the T5 model by eldarkurtic · Pull Request #16874 · huggingface/transformers

Conversation

@eldarkurtic
Contributor


- all activations should be fetched through `ACT2FN`
- `ACT2FN` returns ReLU as an `nn.Module`, which allows attaching hooks to the activation function and makes it appear in the output of `print(model)` (see the sketch below)
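
Why this matters, in a minimal sketch: fetching the activation through `ACT2FN` yields an `nn.ReLU` submodule instead of a bare `torch.nn.functional.relu` call inside `forward()`. The toy `DenseReluDense` block below is a hypothetical stand-in for T5's feed-forward layer, not the PR's actual diff; the class and attribute names are illustrative.

```python
import torch
from torch import nn
from transformers.activations import ACT2FN


class DenseReluDense(nn.Module):
    """Hypothetical stand-in for T5's feed-forward block."""

    def __init__(self, d_model=8, d_ff=16):
        super().__init__()
        self.wi = nn.Linear(d_model, d_ff, bias=False)
        self.wo = nn.Linear(d_ff, d_model, bias=False)
        # Fetched through ACT2FN: an nn.ReLU module instance,
        # not the stateless torch.nn.functional.relu function.
        self.act = ACT2FN["relu"]

    def forward(self, hidden_states):
        return self.wo(self.act(self.wi(hidden_states)))


block = DenseReluDense()
print(block)  # the (act): ReLU() submodule now shows up in the printout

# A forward hook can be attached directly to the activation module,
# which is impossible with a functional relu call buried in forward().
block.act.register_forward_hook(
    lambda module, inputs, output: print("ReLU output mean:", output.mean().item())
)
block(torch.randn(2, 8))  # triggers the hook
```

Before this change, `print(model)` listed only the linear layers and there was no submodule to hang a hook on; the ReLU was invisible from the module tree.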
Member

@LysandreJik left a comment


LGTM!

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Apr 21, 2022

The documentation is not available anymore as the PR was closed or merged.

@LysandreJik merged commit bae9b64 into huggingface:main Apr 21, 2022
elusenji pushed a commit to elusenji/transformers that referenced this pull request Jun 12, 2022