Add tests for no_trainer and fix existing examples #16656
Conversation
The documentation is not available anymore as the PR was closed or merged.
CI failures were fixed by removing the `if torch_device != "cuda": testargs.append("--no_cuda")` guard. From what I could see they were unused, so I didn't duplicate them from the Transformers tests. Let me know if they should be added back in, with special behavior on those tests 😄
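For reference, a minimal sketch of how that guard reads in the Trainer-based example tests, assuming the usual `transformers.testing_utils` import; the surrounding argument list here is illustrative, not the exact test code:

```python
from transformers.testing_utils import torch_device

# Illustrative argument list for a Trainer-based example test.
testargs = [
    "run_glue.py",
    "--model_name_or_path", "distilbert-base-uncased",
    "--output_dir", "/tmp/glue",
]

# The guard removed from the no_trainer tests: the Trainer-based scripts
# accept --no_cuda, but the flag appeared to be unused by the no_trainer scripts.
if torch_device != "cuda":
    testargs.append("--no_cuda")
```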
Great work! 😍
For information, here are the durations: Could the
Changed checkpointing tests to be by epoch, and also to not save with swag. Here were those times locally for me: Before / After
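For context, a minimal sketch of what a by-epoch checkpointing check can look like; the script module, fixture paths, and the `epoch_0` directory name used here are assumptions for illustration, not the exact test code:

```python
import os
import sys
from unittest.mock import patch

import run_glue_no_trainer  # the example script imported as a module (assumed layout)


def check_epoch_checkpointing(tmp_dir: str):
    # Drive the script through its command line, but in-process.
    testargs = f"""
        run_glue_no_trainer.py
        --model_name_or_path distilbert-base-uncased
        --train_file ./tests/fixtures/tests_samples/MRPC/train.csv
        --validation_file ./tests/fixtures/tests_samples/MRPC/dev.csv
        --output_dir {tmp_dir}
        --num_train_epochs 1
        --checkpointing_steps epoch
    """.split()

    with patch.object(sys, "argv", testargs):
        run_glue_no_trainer.main()

    # Saving by epoch should leave one folder per epoch (directory name assumed).
    assert os.path.exists(os.path.join(tmp_dir, "epoch_0"))
```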
What does this add?
- New tests for the no_trainer scripts, mocking how the Transformers counterparts work (a sketch of that pattern follows below)
- Fixes for the no_trainer scripts, discovered while writing these tests
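As a compact sketch of that mocking approach, assuming the script is importable as a module and the argument names shown here (module, model, and file paths are illustrative):

```python
import sys
from unittest.mock import patch

import run_clm_no_trainer  # illustrative: one of the no_trainer example scripts


# Build the argv the script would normally receive on the command line, patch
# sys.argv, and call main() directly, mirroring the Trainer-based example tests.
testargs = """
    run_clm_no_trainer.py
    --model_name_or_path distilgpt2
    --train_file ./tests/fixtures/sample_text.txt
    --output_dir /tmp/clm_no_trainer
    --num_train_epochs 1
""".split()

with patch.object(sys, "argv", testargs):
    run_clm_no_trainer.main()
```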