Hello,
I'm currently training a NER model with BERT as the base model. I've already done the manual annotation part using the bert-vocab and transformers_tokenizers files, and I then trained the model on these annotations with spaCy, using a config file.
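For context, this is roughly how the text was pre-tokenized for the manual annotation step (a simplified sketch; the model name and helper function here are just for illustration, the real logic lives in transformers_tokenizers.py and reads the bert-vocab file):

```python
# Minimal sketch of the annotation-time tokenization (illustrative only).
# Here I load a wordpiece tokenizer from the hub for simplicity; in my setup the
# vocab comes from the bert-vocab file via transformers_tokenizers.py.
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")  # placeholder model name

def make_tokens(text):
    # No special tokens, so the character offsets map straight back to the raw text
    enc = tokenizer(text, add_special_tokens=False, return_offsets_mapping=True)
    tokens = []
    for i, (tok, (start, end)) in enumerate(zip(enc.tokens(), enc["offset_mapping"])):
        # Prodigy-style token dicts; my annotated spans were created against these offsets
        tokens.append({"text": tok, "start": start, "end": end, "id": i})
    return tokens

print(make_tokens("Download the installer from the URL below."))
```

So the spans in my annotations are aligned to these wordpiece tokens.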
Now I would like to use ner.correct and ner.teach with my trained model.
I can start ner.correct like this:
prodigy ner.correct ner_logs ./model/output/model-best ./data.jsonl --label INTRACTABLE_STR,OS,URL -F transformers_tokenizers.py
The problem is that I can't provide the recipe with the correct tokenization, i.e. the tokens it uses should match the wordpiece tokenization my annotations were created with. How could I do that?
Thanks!