IndexError when exporting a model to ONNX format

Hi everyone,
we're trying to export a trained model to the ONNX format using the following code:

self.model.to_onnx(
    Path(export_dir) / "model.onnx",
    convert_batch_to_list(input_sample),
    export_params=True,
    opset_version=opset_version,
)

We use an opset_version of 10, and the output of convert_batch_to_list(input_sample) is a list of 6 torch.Tensors, each of size 128x64.
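
For reference, this is a minimal sketch of what the input sample looks like (the real convert_batch_to_list is more involved, but only the shapes matter here):

import torch

# Stand-in for convert_batch_to_list(input_sample): a plain Python list
# of 6 tensors, each of shape (128, 64), as described above.
input_sample_as_list = [torch.randn(128, 64) for _ in range(6)]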

The error trace is the following:

  File "/.../_train.py", line 239, in _export_to_onnx
    self.model.to_onnx(
  File "/home/user/.local/lib/python3.9/site-packages/torch/autograd/grad_mode.py", line 28, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/envs/fair/lib/python3.9/site-packages/pytorch_lightning/core/lightning.py", line 1899, in to_onnx
    torch.onnx.export(self, input_sample, file_path, **kwargs)
  File "/home/user/.local/lib/python3.9/site-packages/torch/onnx/__init__.py", line 316, in export
    return utils.export(model, args, f, export_params, verbose, training,
  File "/home/user/.local/lib/python3.9/site-packages/torch/onnx/utils.py", line 107, in export
    _export(model, args, f, export_params, verbose, training, input_names, output_names,
  File "/home/user/.local/lib/python3.9/site-packages/torch/onnx/utils.py", line 724, in _export
    _model_to_graph(model, args, verbose, input_names,
  File "/home/user/.local/lib/python3.9/site-packages/torch/onnx/utils.py", line 544, in _model_to_graph
    params_dict = torch._C._jit_pass_onnx_constant_fold(graph, params_dict,
IndexError: index_select(): Index is supposed to be a vector

It seems that when _jit_pass_onnx_constant_fold is called, the _export_onnx_opset_version variable is set to an integer (the value of _opset_version), while the index_select() call in the error expects a vector instead.
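
To check whether the constant-folding pass from the traceback is really the culprit, we were thinking of bypassing Lightning's to_onnx wrapper and calling torch.onnx.export directly with constant folding disabled. This is only a sketch of what we would try (not verified yet), reusing the same objects as in the snippet above:

import torch
from pathlib import Path

# Same arguments that Lightning's to_onnx forwards, but calling
# torch.onnx.export directly. Inputs are passed as a tuple, which is the
# documented form for multiple positional inputs.
dummy_inputs = tuple(convert_batch_to_list(input_sample))  # our 6 tensors of shape (128, 64)

torch.onnx.export(
    self.model,                        # the trained LightningModule
    dummy_inputs,
    str(Path(export_dir) / "model.onnx"),
    export_params=True,
    opset_version=opset_version,       # 10 in our case
    do_constant_folding=False,         # skip the constant-folding pass seen in the traceback
)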

We are using PyTorch 1.10 and PyTorch Lightning 1.5.5.

Has anyone seen similar behavior before?
Is there any way to properly set the value of the _export_onnx_opset_version variable?

Thank you very much!