When to use xmp.spawn with PyTorch Lightning

Hi,
I am trying to train a translation model written in PyTorch Lightning (specifically, I am training finetune.py from the Hugging Face repo: https://github.com/huggingface/transformers/blob/master/examples/seq2seq/finetune.py); for now, consider it a black box written with PyTorch Lightning. I now want to train it on TPUs, and the instructions say to launch the code with this script (https://github.com/huggingface/transformers/blob/master/examples/seq2seq/xla_spawn.py), which calls

xmp.spawn(mod._mp_fn, args=(), nprocs=args.num_cores)
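
For context, as far as I can tell the launcher roughly follows this pattern (my rough sketch; main here stands in for whatever training entry point the loaded module actually defines):

import torch_xla.distributed.xla_multiprocessing as xmp

def _mp_fn(index):
    # index is the ordinal of the spawned TPU process
    main()  # hypothetical: the module's real training entry point

xmp.spawn(_mp_fn, args=(), nprocs=8)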

Is this needed? Does PyTorch Lightning take care of it automatically if I pass n_tpu_cores to the trainer?
Could you please have a brief look at finetune.py and tell me whether this is needed?

Thanks

Best
Rabeeh

Hi @Rabeeh_Karimi,
PyTorch Lightning takes care of this automatically. Just pass the tpu_cores param to the Trainer and Lightning will handle the rest; you do not need the xla_spawn.py launcher.
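
A minimal sketch of what that looks like (TranslationModule here is a hypothetical stand-in for the LightningModule that finetune.py builds):

import pytorch_lightning as pl

# Hypothetical stand-in for the LightningModule defined in finetune.py
model = TranslationModule(hparams)

# Setting tpu_cores makes Lightning spawn the TPU processes internally
# (it calls xmp.spawn for you), so no external launcher script is needed.
trainer = pl.Trainer(tpu_cores=8)
trainer.fit(model)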