Getting the total number of validation batches

Hi peeps,

I’ve been struggling to figure this out for several days now.

I’m trying to get the total number of validation batches so that I can compute an average accuracy across all of them in validation_epoch_end.
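For context, here’s roughly the shape of what I’m doing (a minimal sketch; LitClassifier and the linear model are just placeholders for a standard classification setup):

```python
import torch
import pytorch_lightning as pl

class LitClassifier(pl.LightningModule):
    """Minimal placeholder model; the real setup is an ordinary classifier."""

    def __init__(self, num_features=32, num_classes=10):
        super().__init__()
        self.layer = torch.nn.Linear(num_features, num_classes)

    def forward(self, x):
        return self.layer(x)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        # per-batch accuracy
        acc = (self(x).argmax(dim=-1) == y).float().mean()
        return {"val_acc": acc}

    def validation_epoch_end(self, outputs):
        # len(outputs) only counts the batches this process saw; I'd like the
        # total across the whole validation run to compute the average
        total_batches = len(outputs)
        avg_acc = sum(o["val_acc"] for o in outputs) / total_batches
        self.log("avg_val_acc", avg_acc)
```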

When I access self.trainer.num_batch_length, I get a value that seems to depend on the number of GPUs (X) and the number of CPU workers per GPU (P).

Is there a simple way for me to get the total?

Many thanks

EDIT: Apologies, I made a mistake in my code. self.trainer.num_val_batches works exactly as intended!
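In case anyone else hits this, here’s roughly how it slots into the sketch above (on the version I’m running, num_val_batches appears to come back as a list with one entry per validation dataloader, so I sum it to get the overall total):

```python
def validation_epoch_end(self, outputs):
    # trainer.num_val_batches is (on my version) a list with one entry per
    # validation dataloader; summing it gives the total batch count
    total_batches = sum(self.trainer.num_val_batches)
    avg_acc = sum(o["val_acc"] for o in outputs) / total_batches
    self.log("avg_val_acc", avg_acc)
```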