Multiple GPU support for training?

I'm training on a machine with four GPUs and was wondering if there's a way to leverage all of them to improve training speed. The command I'm currently running uses only one GPU:
python -m prodigy train --ner my_ner_data --gpu-id 0

This works fine, but is there a way to pass something like --gpu-id 0,1,2,3 to use all four?


Hi! spaCy (which Prodigy's train command calls into under the hood) doesn't currently support multi-GPU training – but it's something we're working on 🙂
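In the meantime, a common way to still keep all four GPUs busy (not in the original reply, just a sketch) is to launch independent training runs in parallel, one per GPU – e.g. to compare random seeds or hyperparameters. This won't speed up a single run, since each process trains its own model. The dataset name `my_ner_data` is taken from the command above; everything else here is a hypothetical illustration:

```python
# Hypothetical workaround: run one independent `prodigy train` job per
# GPU, since a single run can't be split across GPUs. Each process
# trains a separate model on its assigned GPU.
import subprocess


def commands_for_gpus(gpu_ids):
    # Build one `prodigy train` invocation per GPU id, mirroring the
    # single-GPU command from the question above.
    return [
        ["python", "-m", "prodigy", "train",
         "--ner", "my_ner_data", "--gpu-id", str(gpu_id)]
        for gpu_id in gpu_ids
    ]


if __name__ == "__main__":
    # Launch all four runs concurrently and wait for them to finish.
    procs = [subprocess.Popen(cmd) for cmd in commands_for_gpus([0, 1, 2, 3])]
    for proc in procs:
        proc.wait()
```

Whether this is worth doing depends on your workflow: it helps for experiment sweeps, but if you only want one model trained faster, it won't help.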
