I am trying to train a translation model written in PyTorch Lightning (specifically, I am fine-tuning with finetune.py from the Hugging Face repo: https://github.com/huggingface/transformers/blob/master/examples/seq2seq/finetune.py); for now, please treat it as a black box built on PyTorch Lightning. I want to train it on TPUs, and the docs say you need to launch the code with this script (https://github.com/huggingface/transformers/blob/master/examples/seq2seq/xla_spawn.py), which calls
xmp.spawn(mod._mp_fn, args=(), nprocs=args.num_cores)
Is this needed, or does PyTorch Lightning take care of it automatically if I pass n_tpu_cores to the Trainer?
Could you please take a brief look at finetune.py and tell me whether this is needed?