Trainer uses only one epoch of the test_dataloader

I want to try test mode with trainer.test(model, data_module, ckpt_path="./checkpoints/best-checkpoint.ckpt"). If I did not make any mistake, with that line of code the model will work with the dataset_test that I created in the LightningDataModule, run each epoch, calculate the loss, and at the end return the avg_loss (code below).

class Data(pl.LightningDataModule): #data_module
  def __init__(self, test_df):
    super().__init__()
    self.test_df = test_df

  def setup(self, stage=None):
    self.dataset_test = TensorDataset(input_ids_test, attention_masks_test, labels_test)

  def test_dataloader(self):
    return DataLoader(self.dataset_test, batch_size=16)

class ClassModel(pl.LightningModule): #model
  def __init__(self):
    super().__init__()
    self.model = bert_model

  def configure_optimizers(self):
    ... # optimizer setup omitted here

  def forward(self, input_ids, attention_mask, labels):
    loss, output = self.model(input_ids=input_ids, attention_mask=attention_mask, labels=labels)
    return loss, output


  def test_step(self, batch, batch_idx):
    inputs = {'input_ids':      batch[0],
              'attention_mask': batch[1],
              'labels':         batch[2]}
    outputs = self.model(**inputs)
    loss = outputs[0]
    self.log("Test loss", loss, on_epoch=True)
    return {'loss': loss}

  def test_epoch_end(self, outputs):
    avg_loss = torch.stack([x['loss'] for x in outputs]).mean()
    return {'avg_loss': avg_loss}

The result that I got after running trainer.test(model, data_module) is:

We can see here that it only ran for one epoch (63 steps), while the size of my test dataset is 1000 and the batch size is 16.
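For context, 63 is exactly the number of batches in a single pass over my test set, assuming the DataLoader keeps the last partial batch (the default drop_last=False); the 1000 and 16 come from my setup above:

```python
import math

# batches per epoch = ceil(dataset_size / batch_size)
dataset_size = 1000
batch_size = 16
print(math.ceil(dataset_size / batch_size))  # 63
```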
Can you please tell me if I missed something?
Thank you for your time!