Computes the crossentropy loss between the labels and predictions.
Inherits From: Loss
tf.keras.losses.CategoricalCrossentropy(
from_logits=False,
label_smoothing=0.0,
axis=-1,
reduction='sum_over_batch_size',
name='categorical_crossentropy'
)
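The label_smoothing argument blends the one-hot targets toward a uniform distribution: for a smoothing factor s, each target becomes y_true * (1 - s) + s / num_classes. A minimal NumPy sketch of that transformation (the factor s = 0.1 is an arbitrary choice for illustration):

```python
import numpy as np

s = 0.1                               # example smoothing factor
y_true = np.array([[0.0, 1.0, 0.0]])  # one-hot target, 3 classes
num_classes = y_true.shape[-1]

# Move probability mass off the hot class onto all classes equally.
smoothed = y_true * (1.0 - s) + s / num_classes
# -> [[0.0333, 0.9333, 0.0333]]; rows still sum to 1
```

Larger s values relax the targets further, which can reduce overconfidence in the trained model.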
Use this crossentropy loss function when there are two or more label
classes. Labels are expected to be provided in a one-hot representation. If
you want to provide labels as integers, use the
SparseCategoricalCrossentropy loss instead. There should be num_classes
floating-point values per feature, i.e., the shapes of both y_pred and
y_true are [batch_size, num_classes].
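As a sketch of the expected label format, integer class indices can be converted to the one-hot representation this loss expects. The helper below uses plain NumPy; `keras.utils.to_categorical` provides the same conversion:

```python
import numpy as np

def to_one_hot(labels, num_classes):
    """Convert integer class indices to one-hot rows."""
    return np.eye(num_classes, dtype=np.float32)[labels]

# Integer labels 1 and 2 become rows with a single 1.0 each,
# matching the [batch_size, num_classes] shape this loss expects.
y_true = to_one_hot(np.array([1, 2]), num_classes=3)
# [[0., 1., 0.],
#  [0., 0., 1.]]
```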
Examples:
Standalone usage:
y_true = [[0, 1, 0], [0, 0, 1]]
y_pred = [[0.05, 0.95, 0], [0.1, 0.8, 0.1]]
# Using 'auto'/'sum_over_batch_size' reduction type.
cce = keras.losses.CategoricalCrossentropy()
cce(y_true, y_pred)
1.177

# Calling with 'sample_weight'.
cce(y_true, y_pred, sample_weight=np.array([0.3, 0.7]))
0.814

# Using 'sum' reduction type.
cce = keras.losses.CategoricalCrossentropy(reduction="sum")
cce(y_true, y_pred)
2.354

# Using 'none' reduction type.
cce = keras.losses.CategoricalCrossentropy(reduction=None)
cce(y_true, y_pred)
array([0.0513, 2.303], dtype=float32)
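The values above can be reproduced by hand. For one-hot targets, categorical crossentropy is -sum(y_true * log(y_pred)) per sample; the sketch below clips predictions to avoid log(0) (the epsilon value is an assumption mirroring Keras's internal clipping, not its exact constant):

```python
import numpy as np

y_true = np.array([[0, 1, 0], [0, 0, 1]], dtype=np.float32)
y_pred = np.array([[0.05, 0.95, 0], [0.1, 0.8, 0.1]], dtype=np.float32)

eps = 1e-7  # assumed clipping constant to keep log() finite
per_sample = -np.sum(y_true * np.log(np.clip(y_pred, eps, 1.0)), axis=-1)
# per_sample ~= [0.0513, 2.3026], matching the 'none' reduction output

print(per_sample.mean())  # 'sum_over_batch_size' -> ~1.177
print(per_sample.sum())   # 'sum' -> ~2.354
```

With sample_weight=[0.3, 0.7], the weighted per-sample losses are summed and divided by the batch size, giving (0.3 * 0.0513 + 0.7 * 2.3026) / 2 ≈ 0.814.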
Usage with the compile() API:
model.compile(optimizer='sgd',
loss=keras.losses.CategoricalCrossentropy())
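When the model's last layer has no softmax activation, pass from_logits=True so the loss applies the softmax internally, which is more numerically stable. A NumPy sketch of the equivalence (the logit values here are arbitrary illustration data):

```python
import numpy as np

logits = np.array([[2.0, 1.0, 0.1]])   # raw, unnormalized scores
y_true = np.array([[1.0, 0.0, 0.0]])

# Softmax turns logits into probabilities...
probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)
loss_from_probs = -np.sum(y_true * np.log(probs), axis=-1)

# ...while the log-softmax form computes the same loss directly
# from the logits, avoiding an explicit normalize-then-log step.
log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
loss_from_logits = -np.sum(y_true * log_probs, axis=-1)

assert np.allclose(loss_from_probs, loss_from_logits)
```

In the compile() call this corresponds to loss=keras.losses.CategoricalCrossentropy(from_logits=True), with the model's final Dense layer left without an activation.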
Methods
call
call(
y_true, y_pred
)
from_config
@classmethod
from_config(
    config
)
get_config
get_config()
__call__
__call__(
y_true, y_pred, sample_weight=None
)
Call self as a function.