TF: Add sigmoid activation function #16819
Conversation
| "gelu_10": gelu_10, | ||
| "glu": glu, | ||
| "relu": tf.keras.activations.relu, | ||
| "sigmoid": tf.keras.activations.sigmoid, |
the only new line (in practice)
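For context, the registry this diff touches is a plain string-to-function dict consulted by `get_tf_activation`. The sketch below is illustrative rather than a copy of the repo's file: the surrounding entry names and the error message wording are assumptions, only the `"sigmoid"` entry mirrors the diff above.

```python
import tensorflow as tf

# Illustrative string-to-activation registry (not the library's exact contents).
ACT2FN = {
    "relu": tf.keras.activations.relu,
    "sigmoid": tf.keras.activations.sigmoid,  # the entry this PR adds
    "tanh": tf.keras.activations.tanh,
}

def get_tf_activation(activation_string):
    # Resolve the string to a callable, failing loudly on unknown names.
    if activation_string in ACT2FN:
        return ACT2FN[activation_string]
    raise KeyError(
        f"function {activation_string} not found in ACT2FN mapping {list(ACT2FN.keys())}"
    )
```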
| get_tf_activation("quick_gelu") | ||
| get_tf_activation("glu") | ||
| get_tf_activation("relu") | ||
| get_tf_activation("sigmoid") |
and its corresponding test :)
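A quick way to exercise the new entry (assuming the `transformers.activations_tf` import path matches this PR's layout):

```python
import tensorflow as tf
from transformers.activations_tf import get_tf_activation  # path assumed from this PR's layout

# Resolve the new "sigmoid" key and apply it to a small tensor.
act = get_tf_activation("sigmoid")
x = tf.constant([-2.0, 0.0, 2.0])
print(act(x))  # approximately [0.119, 0.5, 0.881], same as tf.keras.activations.sigmoid(x)
```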
The documentation is not available anymore as the PR was closed or merged.
Thanks for adding this and cleaning up!
As an aside, GELU is a core Keras activation now, although we might have to wait until we can bump our minimum TF version before switching to it instead of our own implementations. Other than that, this looks great!
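For reference, the core Keras GELU mentioned above can be called as below, assuming a TensorFlow version recent enough to ship `tf.keras.activations.gelu` (roughly 2.4 and later):

```python
import tensorflow as tf

x = tf.constant([-1.0, 0.0, 1.0])

# Exact erf-based GELU and the tanh approximation many in-library implementations use.
print(tf.keras.activations.gelu(x))                    # exact form
print(tf.keras.activations.gelu(x, approximate=True))  # tanh approximation
```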
What does this PR do?
Fixes #16810 by adding the sigmoid activation function to the TF activation functions.
Also sorts the activation functions in the enum-like dict and in the tests, so missing functions are easy to spot.
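As an illustration of how the new entry would typically be consumed, here is a hypothetical layer that resolves its activation from a config-style string; the class name, units, and default are made up for the example, and only `get_tf_activation` corresponds to the code path this PR touches.

```python
import tensorflow as tf
from transformers.activations_tf import get_tf_activation  # path assumed from this PR's layout

class TFToyIntermediate(tf.keras.layers.Layer):
    """Hypothetical dense block that resolves its activation from a string."""

    def __init__(self, hidden_act="sigmoid", units=32, **kwargs):
        super().__init__(**kwargs)
        self.dense = tf.keras.layers.Dense(units)
        self.act = get_tf_activation(hidden_act)  # now accepts "sigmoid" as well

    def call(self, hidden_states):
        return self.act(self.dense(hidden_states))
```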