Labels
module: xpu (Intel XPU related issues), triaged (this issue has been looked at by a team member and triaged and prioritized into an appropriate module)
Description
The following torch.xpu ops are missing (they were available with IPEX):
- torch.xpu.memory_allocated()
- torch.xpu.max_memory_allocated()
- torch.xpu.reset_peak_memory_stats()
These operations are used, for example, on the Hugging Face side (a usage sketch follows the links):
- https://github.com/huggingface/transformers/blob/fd3238b4b0e1af4fae4a293cbfd1251ead40cd29/src/transformers/trainer_utils.py#L55
- https://github.com/huggingface/transformers/blob/fd3238b4b0e1af4fae4a293cbfd1251ead40cd29/src/transformers/trainer_utils.py#L60
- https://github.com/huggingface/transformers/blob/fd3238b4b0e1af4fae4a293cbfd1251ead40cd29/src/transformers/trainer_utils.py#L539
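For reference, a minimal sketch of the usage pattern these ops enable, assuming torch.xpu follows the same semantics as the torch.cuda memory-stats API and a PyTorch build where these ops exist. The helper names measure_peak_xpu_memory and dummy_step are illustrative only, not taken from the transformers code:

```python
import torch

def measure_peak_xpu_memory(step_fn):
    """Run step_fn and report how much extra XPU memory it used at peak."""
    if not torch.xpu.is_available():
        raise RuntimeError("No XPU device available")

    torch.xpu.reset_peak_memory_stats()           # clear the peak counter
    allocated_before = torch.xpu.memory_allocated()

    step_fn()                                     # e.g. one training step

    peak = torch.xpu.max_memory_allocated()       # high-water mark since reset
    return peak - allocated_before

if __name__ == "__main__":
    # Dummy workload with illustrative sizes.
    def dummy_step():
        x = torch.randn(1024, 1024, device="xpu")
        y = x @ x
        del x, y

    print(f"Peak extra XPU memory: {measure_peak_xpu_memory(dummy_step)} bytes")
```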
This issue affects XPU enablement for Hugging Face - huggingface/transformers#31237 (comment).