Temporal Fusion Transformer to C++ (libtorch)

C++ Inference
Hey,
I am trying to load a pretrained Temporal Fusion Transformer network and run inference on it in C++: Pytorch Forecasting (Temporal Fusion Transformer)

The method proposed by PyTorch Lightning (Lightning inference in production) does not seem to work for this type of network. Scripting the model fails with:

NotSupportedError: Comprehension ifs are not supported yet:
  File "C:\Users\1998t\anaconda3\envs\dai\lib\site-packages\pytorch_forecasting\models\temporal_fusion_transformer\__init__.py", line 404
        input_vectors = self.input_embeddings(x_cat)
        input_vectors.update(
            {
                name: x_cont[..., idx].unsqueeze(-1)
                for idx, name in enumerate(self.hparams.x_reals)

A second method was to use torch.jit.trace(model, example_inputs). I was able to write the result to a .pt file by feeding a dictionary of tensors (and only with trace, since the script functionality is not supported for this model, as shown above). Loading this .pt file worked on the C++ side. However, calling module.forward(inputs) generates an internal runtime error, even though the inputs are of the right type. The error is caught internally and the call ends up returning None.
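For reference, below is a minimal sketch of what my C++ side looks like. The file name, the dict keys and the tensor dtypes/shapes are placeholders and would have to match the example inputs used during tracing; the try/catch around forward is mainly there to print the full TorchScript error instead of ending up with a silent None.

#include <torch/script.h>

#include <iostream>
#include <vector>

int main() {
  try {
    // Load the traced Temporal Fusion Transformer module.
    torch::jit::script::Module module = torch::jit::load("tft_traced.pt");
    module.eval();

    // The traced forward expects a single dict of tensors. The key names,
    // dtypes and shapes here are placeholders and must match the example
    // inputs fed to torch.jit.trace in Python.
    c10::Dict<std::string, torch::Tensor> x;
    x.insert("encoder_cont", torch::zeros({1, 24, 5}));              // float features
    x.insert("encoder_cat", torch::zeros({1, 24, 2}, torch::kLong)); // categorical codes
    // ... remaining keys exactly as used during tracing ...

    std::vector<torch::jit::IValue> inputs;
    inputs.emplace_back(x);

    // Call forward directly; if TorchScript throws, the catch block below
    // prints the full error message instead of swallowing it.
    torch::jit::IValue output = module.forward(inputs);
    std::cout << output << std::endl;
  } catch (const c10::Error& e) {
    // e.what() contains the complete TorchScript stack trace, which is far
    // more informative than the None return value I currently get.
    std::cerr << "libtorch error:\n" << e.what() << std::endl;
    return 1;
  }
  return 0;
}

Printing the forward graph (module.get_method("forward").graph()) also seems useful to check which input types the traced module actually expects, but it has not told me much so far.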

What could have gone wrong, and are there ways to dig deeper into the debugging?