I am using `auto_lr_find` and find it quite awesome. However, each run of the trainer produces a file like `lr_find_temp_model_*.ckpt` in the current path, which is annoying. Could it be saved automatically in a specified directory such as `temp/`? Or could it find an existing `lr_find_temp_model_*.ckpt` and reuse it, rather than produce a new one? Especially when I am just debugging the code.
Part of my code is:

```python
checkpoint_callback = ModelCheckpoint(
    save_weights_only=False,
    mode="min",
    monitor="val_loss",
    dirpath="outputs",
    save_last=False,
    save_top_k=1,
)
trainer = pl.Trainer(
    gpus=1,
    strategy="dp",
    max_epochs=10,
    auto_lr_find=True,
    callbacks=[
        checkpoint_callback,
        LearningRateMonitor("epoch"),
        RichProgressBar(),
    ],
    log_every_n_steps=10,
)
trainer.tune(model, train_loader, val_loader)
trainer.fit(model, train_loader, val_loader, ckpt_path=None)
```
This should be resolved with the latest release: it now appends a unique id to the checkpoint file name, so each run creates a different file instead of clashing with previous ones. Note that these temporary checkpoints are deleted once tuning is done.
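To illustrate the pattern described above, here is a minimal stdlib-only sketch (not Lightning's actual implementation): a temp checkpoint gets a unique id in its name so concurrent runs don't collide, and it is removed once tuning finishes. The `run_lr_find` function and the placeholder file contents are hypothetical.

```python
import os
import tempfile
import uuid


def run_lr_find(workdir: str) -> str:
    # Unique id in the file name avoids collisions when several
    # runs share a directory (sketch of the pattern only).
    ckpt = os.path.join(workdir, f"lr_find_temp_model_{uuid.uuid4().hex}.ckpt")
    with open(ckpt, "wb") as f:
        f.write(b"model state placeholder")
    try:
        pass  # ... the actual LR search would happen here ...
    finally:
        # The temporary checkpoint is deleted once tuning is done.
        os.remove(ckpt)
    return ckpt


path = run_lr_find(tempfile.gettempdir())
print(os.path.exists(path))  # the temp file is gone after tuning
```

Because cleanup happens in a `finally` block, the temp file is removed even if tuning raises, which is why you should no longer see stray `.ckpt` files accumulating.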
Also, we have moved discussions to GitHub Discussions. You might want to post there instead to get a quicker response, as the forums will be marked read-only after some time.