The Glorot normal initializer, also called Xavier normal initializer.
Inherits From: VarianceScaling, Initializer
tf.keras.initializers.GlorotNormal(
seed=None
)
Draws samples from a truncated normal distribution centered on 0 with
stddev = sqrt(2 / (fan_in + fan_out)) where fan_in is the number of
input units in the weight tensor and fan_out is the number of output units
in the weight tensor.
Examples:
# Standalone usage:
initializer = GlorotNormal()
values = initializer(shape=(2, 2))

# Usage in a Keras layer:
initializer = GlorotNormal()
layer = Dense(3, kernel_initializer=initializer)
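A self-contained variant of the examples above, using fully qualified tf.keras names so it can be run directly; the shape (100, 200) is only illustrative, and the printed sample standard deviation should land close to sqrt(2 / (fan_in + fan_out)):

```python
import math

import tensorflow as tf

# Standalone usage: draw a weight matrix with fan_in=100, fan_out=200.
initializer = tf.keras.initializers.GlorotNormal(seed=42)
values = initializer(shape=(100, 200))

# Empirical spread vs. the target stddev = sqrt(2 / (fan_in + fan_out)).
expected_stddev = math.sqrt(2.0 / (100 + 200))
print(float(tf.math.reduce_std(values)), expected_stddev)

# Usage in a Keras layer.
layer = tf.keras.layers.Dense(3, kernel_initializer=initializer)
```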
Reference:
Glorot et al., 2010 (http://proceedings.mlr.press/v9/glorot10a.html)
Methods
clone
clone()
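The method is listed here without a description; a minimal sketch, assuming clone() returns a new initializer carrying the same configuration as the original:

```python
import tensorflow as tf

initializer = tf.keras.initializers.GlorotNormal(seed=7)
copy = initializer.clone()  # assumed: a fresh initializer with the same config
assert copy.get_config() == initializer.get_config()
```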
from_config
@classmethod
from_config(
    config
)
Instantiates an initializer from a configuration dictionary.
Example:
initializer = RandomUniform(-1, 1)
config = initializer.get_config()
initializer = RandomUniform.from_config(config)
| Args | |
|---|---|
| config | A Python dictionary, the output of get_config(). |

| Returns | |
|---|---|
| An Initializer instance. |
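Applied to this initializer, the same round trip might look like the following sketch:

```python
import tensorflow as tf

initializer = tf.keras.initializers.GlorotNormal(seed=3)
config = initializer.get_config()
# Rebuild an initializer with the same configuration (including the seed).
restored = tf.keras.initializers.GlorotNormal.from_config(config)
```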
get_config
get_config()
Returns the initializer's configuration as a JSON-serializable dict.
| Returns | |
|---|---|
| A JSON-serializable Python dict. |
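Because the dict is JSON-serializable, it can be dumped directly; for GlorotNormal the configuration is expected to carry little more than the seed (a sketch, the exact keys are not guaranteed here):

```python
import json

import tensorflow as tf

initializer = tf.keras.initializers.GlorotNormal(seed=5)
print(json.dumps(initializer.get_config()))  # e.g. {"seed": 5}
```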
__call__
__call__(
shape, dtype=None
)
Returns a tensor object initialized as specified by the initializer.
| Args | |
|---|---|
| shape | Shape of the tensor. |
| dtype | Optional dtype of the tensor. |
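A short sketch of calling the initializer directly with an explicit dtype:

```python
import tensorflow as tf

initializer = tf.keras.initializers.GlorotNormal()
weights = initializer(shape=(4, 8), dtype=tf.float64)
print(weights.shape, weights.dtype)  # (4, 8), float64
```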