Default Prodigy NER Format to BERT Format

Hi, is there a way to export the default NER annotation format to a format that works with BERT? I saw some other posts similar to mine, but I wasn't sure if there's a newer feature that converts between formats. I'm trying to reuse the annotations I already have to train a BERT model for NER.

Hi Yaamin,

Have you seen this section of our docs?

In particular, it explains that if you want to use BERT in spaCy, you can use your annotations as-is. To quote the page:

New in Prodigy v1.11 and spaCy v3
spaCy v3 lets you train a transformer-based pipeline and will take care of all tokenization alignment under the hood, to ensure that the subword tokens match the linguistic tokenization. You can use data-to-spacy to export your annotations and train with spaCy v3 and a transformer-based config directly, or run `train` and provide the config via the `--config` argument.
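If you'd rather fine-tune BERT outside of spaCy (e.g. with Hugging Face `transformers`), the conversion is also straightforward, because Prodigy's NER annotations already store token-aligned spans. Here's a minimal sketch that maps a Prodigy-style record (the kind you'd get from `prodigy db-out your_dataset`) to token/BIO-tag pairs, the format most BERT NER fine-tuning scripts expect. The example record below is made up for illustration:

```python
def prodigy_to_bio(example):
    """Map each token in a Prodigy NER record to a BIO tag.

    Relies on the "token_start"/"token_end" offsets Prodigy stores
    on each span (both indices are inclusive).
    """
    tokens = [t["text"] for t in example["tokens"]]
    tags = ["O"] * len(tokens)
    for span in example.get("spans", []):
        start, end = span["token_start"], span["token_end"]
        tags[start] = f"B-{span['label']}"
        for i in range(start + 1, end + 1):
            tags[i] = f"I-{span['label']}"
    return tokens, tags


# Hypothetical record in the shape Prodigy exports for NER annotations.
example = {
    "text": "Apple is opening an office in Paris",
    "tokens": [
        {"text": "Apple", "start": 0, "end": 5, "id": 0},
        {"text": "is", "start": 6, "end": 8, "id": 1},
        {"text": "opening", "start": 9, "end": 16, "id": 2},
        {"text": "an", "start": 17, "end": 19, "id": 3},
        {"text": "office", "start": 20, "end": 26, "id": 4},
        {"text": "in", "start": 27, "end": 29, "id": 5},
        {"text": "Paris", "start": 30, "end": 35, "id": 6},
    ],
    "spans": [
        {"start": 0, "end": 5, "token_start": 0, "token_end": 0, "label": "ORG"},
        {"start": 30, "end": 35, "token_start": 6, "token_end": 6, "label": "LOC"},
    ],
}

tokens, tags = prodigy_to_bio(example)
print(tags)
# ['B-ORG', 'O', 'O', 'O', 'O', 'O', 'B-LOC']
```

From there you'd re-tokenize with the BERT tokenizer and align the BIO labels to the subword tokens, which is the step spaCy v3 handles for you automatically if you go the data-to-spacy route instead.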