I am new to spaCy, so this is my first time using this forum.
I have a problem saving a custom tokenizer. I trained a model as described here: https://spacy.io/usage/training#example-train-ner and got decent results without doing anything special, but to get higher accuracy I had to use a custom tokenizer as described here: https://spacy.io/usage/linguistic-features#native-tokenizers

When I try to save the trained model, it fails because it can't find a `to_disk()` function on my custom tokenizer. I found that changing `nlp.to_disk(output_dir)` to `nlp.to_disk(output_dir, disable=['tokenizer'])` let me save the model without the tokenizer. But when I try to load the model, it says it cannot find the tokenizer in the model folder. I tried to find a way to disable the tokenizer while loading as well, but was unsuccessful.

I found a helpful thread here: '/saving-custom-tokenizer/395' (I can't add the full URL due to the link restriction in this forum), but I can't figure out how to apply the fixes discussed in that thread to the way I have modified my tokenizer. Can anyone please guide me on how to fix this issue properly?
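For context, my tokenizer is along the lines of the whitespace-tokenizer example in the docs. Below is a minimal sketch of what I understand the suggested fix to be: adding no-op serialization hooks (`to_disk`, `from_disk`, `to_bytes`, `from_bytes`) to the custom tokenizer class so that `nlp.to_disk()` no longer fails. The class name and the use of `spacy.blank` here are just for illustration; I'm not sure this is the intended approach:

```python
import spacy
from spacy.tokens import Doc

class WhitespaceTokenizer(object):
    """Custom tokenizer along the lines of the docs example."""

    def __init__(self, vocab):
        self.vocab = vocab

    def __call__(self, text):
        # Split on single spaces and build a Doc directly.
        words = text.split(" ")
        spaces = [True] * len(words)
        return Doc(self.vocab, words=words, spaces=spaces)

    # Serialization hooks so nlp.to_disk() / nlp.from_disk() don't fail.
    # This tokenizer has no learned state, so they are effectively no-ops.
    def to_disk(self, path, **kwargs):
        pass

    def from_disk(self, path, **kwargs):
        return self

    def to_bytes(self, **kwargs):
        return b""

    def from_bytes(self, bytes_data, **kwargs):
        return self

# Attach the custom tokenizer to a pipeline (blank pipeline for illustration).
nlp = spacy.blank("en")
nlp.tokenizer = WhitespaceTokenizer(nlp.vocab)
```

My understanding is that even with these hooks, after loading the saved model with `spacy.load()` I would still need to re-attach the custom tokenizer manually (`nlp.tokenizer = WhitespaceTokenizer(nlp.vocab)`), since spaCy can't reconstruct my custom class on its own. Is that right?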