losses per-batch-element · Issue #264 · pytorch/pytorch · GitHub

losses per-batch-element #264


Description

@bshillingford

It would be useful to have losses that return a batch of losses, rather than summing or averaging them. This is useful, e.g., for applying weights to the loss that vary per batch element, and for masking losses on the outputs of variable-length RNNs.

Most common losses can be emulated using element-wise arithmetic (or, in the case of the categorical NLL, a gather operation), and that was my standard solution in Lua Torch. However, having them built in is arguably cleaner and leads to more consistent code for the end user.
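The gather-based workaround described above can be sketched as follows. Shapes, weights, and the seed are illustrative; the final assertion checks against `reduction='none'`, which modern PyTorch added as the built-in way to get unreduced, per-element losses:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Illustrative shapes: batch of 4 elements, 5 classes.
logits = torch.randn(4, 5)
targets = torch.randint(0, 5, (4,))
log_probs = F.log_softmax(logits, dim=1)

# Per-element categorical NLL via gather, as described above:
per_elem = -log_probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # shape: (4,)

# Per-element weights, e.g. a 0/1 mask for padded RNN timesteps:
weights = torch.tensor([1.0, 0.5, 0.0, 1.0])
loss = (per_elem * weights).sum() / weights.sum()

# Modern PyTorch exposes the same per-element losses directly:
assert torch.allclose(per_elem, F.nll_loss(log_probs, targets, reduction="none"))
```

The division by `weights.sum()` (rather than the batch size) keeps the mean consistent when some elements are fully masked out.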

Metadata


Labels

    hackamonth, todo (Not as important as medium or high priority tasks, but we will work on these.)
