TF: Add sigmoid activation function by gante · Pull Request #16819 · huggingface/transformers · GitHub

Conversation

@gante
Member

@gante gante commented Apr 18, 2022

What does this PR do?

Fixes #16810 -- adds the sigmoid activation function to the TF activation functions.

Also sorts the activation functions in the enum-like dict and in the tests, so missing functions can be spotted quickly.
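The enum-like dict mentioned above is a plain name-to-function registry. A minimal sketch of the pattern in pure Python (the names `ACT2FN` and `get_activation` are illustrative assumptions, not the library's exact internals):

```python
import math

# Illustrative enum-like activation registry; keys kept sorted so a
# missing entry is easy to spot (names here are assumptions, not the
# transformers internals).
ACT2FN = {
    "relu": lambda x: max(0.0, x),
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
}

def get_activation(name: str):
    """Look up an activation by name, failing loudly on unknown names."""
    if name not in ACT2FN:
        raise KeyError(f"function {name} not found in mapping {sorted(ACT2FN)}")
    return ACT2FN[name]

print(get_activation("sigmoid")(0.0))  # 0.5
```

Sorting the keys is purely cosmetic, but it makes a diff like this one (adding a single missing entry) trivial to review.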

@gante gante requested review from Rocketknight1 and sgugger April 18, 2022 15:46
"gelu_10": gelu_10,
"glu": glu,
"relu": tf.keras.activations.relu,
"sigmoid": tf.keras.activations.sigmoid,
@gante (Member Author)

the only new line (in practice)
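For reference, the new entry maps `"sigmoid"` to `tf.keras.activations.sigmoid`, which computes the logistic function 1 / (1 + exp(-x)). A numerically stable plain-Python sketch of that math (illustrative, not the TensorFlow implementation):

```python
import math

def sigmoid(x: float) -> float:
    # Stable logistic: avoid overflow in exp(-x) for large negative x
    # by evaluating the equivalent form exp(x) / (1 + exp(x)) instead.
    if x >= 0:
        return 1.0 / (1.0 + math.exp(-x))
    z = math.exp(x)
    return z / (1.0 + z)

print(sigmoid(0.0))  # 0.5
```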

get_tf_activation("quick_gelu")
get_tf_activation("glu")
get_tf_activation("relu")
get_tf_activation("sigmoid")
@gante (Member Author)

and its corresponding test :)

@HuggingFaceDocBuilderDev

HuggingFaceDocBuilderDev commented Apr 18, 2022

The documentation is not available anymore as the PR was closed or merged.

@sgugger (Collaborator) left a comment

Thanks for adding this and cleaning up!

@Rocketknight1
Copy link
Member

As an aside, GELU is a core Keras activation now, although we might have to wait until we can move our minimum version before we can switch to using it instead of our own implementations. Other than that, this looks great!
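For context, the exact GELU referenced here is x · Φ(x), where Φ is the standard normal CDF. A plain-Python sketch of that formula via the error function (illustrative, not the Keras implementation):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF
    # expressed through erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    return 0.5 * x * (1.0 + math.erf(x / math.sqrt(2.0)))

print(gelu(0.0))  # 0.0
```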

@gante gante merged commit f09c45e into huggingface:main Apr 19, 2022
@gante gante deleted the add_sigmoid branch April 19, 2022 15:13
elusenji pushed a commit to elusenji/transformers that referenced this pull request Jun 12, 2022


Development

Successfully merging this pull request may close these issues.

Missing activation Function

4 participants