Add ability for TQDMProgressBar to retain prior epoch training bars (… #19578
base: master
Conversation
Force-pushed from 0148c20 to f35de36
BAR_FORMAT = "{l_bar}{bar}| {n_fmt}/{total_fmt} [{elapsed}<{remaining}, {rate_noinv_fmt}{postfix}]"

-    def __init__(self, refresh_rate: int = 1, process_position: int = 0):
+    def __init__(self, refresh_rate: int = 1, process_position: int = 0, leave: bool = False):
Sounds like a good idea!
if self._leave:
    self.train_progress_bar = self.init_train_tqdm()
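The snippet above re-creates the training bar at each epoch start when the flag is set, so the finished bar from the previous epoch stays on screen. A standalone sketch of that retain-per-epoch behavior using plain tqdm (the `run_epochs` helper and its names are illustrative, not Lightning's actual API):

```python
import io
import sys

from tqdm import tqdm

# Illustrative sketch: with leave=True, each epoch gets a fresh bar and the
# previous epoch's completed bar is retained; with leave=False, each bar is
# cleared when it finishes.
def run_epochs(num_epochs: int, leave: bool, out=sys.stdout):
    bars = []
    for epoch in range(num_epochs):
        bar = tqdm(range(5), desc=f"Epoch {epoch}", leave=leave, file=out)
        for _ in bar:
            pass  # one "batch" per step
        bars.append(bar)
    return bars
```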
Instead of doing this, I believe it would be better to pass leave=self.leave to the tqdm bar directly (see init_train_tqdm(), init_validation_tqdm(), etc. above).
You mean creating yet another constructor wrapper, like the one in the rich progress bar and those already present in this file?
E.g.
def reinit_train_tqdm(self) -> Tqdm:
    """Override this to customize the tqdm bar for training."""
    return Tqdm(
        desc=self.train_description,
        position=(2 * self.process_position),
        disable=self.is_disabled,
        leave=self._leave,
        dynamic_ncols=True,
        file=sys.stdout,
        smoothing=0,
        bar_format=self.BAR_FORMAT,
    )
It'd be identical to init_validation_tqdm, so I don't really see the point of that, unless the intended introduction of init_validation_tqdm had some undocumented purpose that you're planning to start taking advantage of (and perhaps need to change the implementation of the constructor function).
Or did you perhaps mean changing the hard-coded leave=True to leave=self._leave in the existing init_train_tqdm function? Or perhaps a third variant, where the init_train_tqdm function is parameterized to take the leave value?
PS. Thanks for the review.
No, I'm proposing to simply pass leave=self.leave in init_train_tqdm(), where we have hard-coded it to True so far (line 205).
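Concretely, that proposal is a one-line change inside the existing constructor function. A minimal sketch of the idea against plain tqdm, assuming the constructor flag is stored as self._leave (here passed explicitly as `leave_flag`; the real Lightning method builds the bar with more options):

```python
import io
import sys

from tqdm import tqdm

# Sketch of the reviewer's proposal: forward the user-supplied flag instead
# of the previously hard-coded leave=True. `leave_flag` stands in for the
# self._leave attribute that __init__ would store.
def init_train_tqdm(leave_flag: bool, out=sys.stdout) -> tqdm:
    return tqdm(
        desc="Epoch 0",
        total=10,
        leave=leave_flag,  # was: leave=True
        dynamic_ncols=True,
        file=out,
        smoothing=0,
    )
```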
What does this PR do?
Adds the ability to retain tqdm progress bars from prior training epochs.
Fixes #19322
Before submitting
A: No, an issue was opened but there is no PR yet to discuss (catch-22).
A: Yes.
A: No. The goal was to not break any existing behavior so the existing TQDMProgressBar test suite was used to ensure that objective.
A: Partly; I only ran the TQDMProgressBar test suite, since the others mostly failed on my machine even before the change was made. As such, I am submitting a draft PR to ensure the entire test suite passes on the project's CI system before the PR is worth discussing.
A: N/A; no breaking changes intended.
A: Yes
PR review
Anyone in the community is welcome to review the PR.
Before you start reviewing, make sure you have read the review guidelines.
Reviewer checklist
📚 Documentation preview 📚: https://pytorch-lightning--19578.org.readthedocs.build/en/19578/