Hello!
I'm trying to load a custom model, but I can't find out how exactly to do it.
I'm using a BERT model that has been fine-tuned with PyTorch.
When I try to load it with spacy.load(path), it says "[E053] Could not read meta.json", but the folder doesn't contain a meta.json, only a config.json.
What should I do? Is there somewhere I can read about how to load a custom model and use it in Prodigy? And if I do need to create a meta.json, what exactly should go in it?
Hi!
How exactly is the BERT model defined? Is it a spaCy model? Which version of spaCy was it created with, and which version of Prodigy do you have installed?
If your model is not a spaCy model, you won't be able to load it with spacy.load, but you can always use a custom recipe to load any model that can be loaded in Python. You just need to extract and set the annotations it produces, which of course depends on the model and what it does (NER, text classification etc.).
Here's an example for how to plug in a custom NER model: https://prodi.gy/docs/named-entity-recognition#custom-model
Here's an example of the JSON data you want to output: https://prodi.gy/docs/api-interfaces#ner_manual If you're using a transformer model with wordpiece tokens, make sure to also include the "tokens" property (because the wordpiece tokens will be somewhat arbitrary and won't match the default linguistically-motivated tokenization).
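To make the expected shape concrete, here's a minimal sketch of a task dict in that ner_manual format. The text, label and offsets are made up for illustration, and for brevity the tokens are produced by whitespace splitting; in practice the "tokens" would come from your wordpiece tokenizer (so you'd see subtokens rather than whole words) and the "spans" from your model's predictions.

```python
# Hypothetical example text and entity; replace with your model's output.
text = "Angela Merkel besucht Berlin"

# Each token needs its text, character offsets and a sequential id.
# (Whitespace tokens here for brevity; wordpiece tokens work the same way.)
tokens = []
offset = 0
for i, tok in enumerate(text.split(" ")):
    start = text.index(tok, offset)
    end = start + len(tok)
    tokens.append({"text": tok, "start": start, "end": end, "id": i})
    offset = end

# Spans reference both character offsets and token indices.
spans = [
    {"start": 0, "end": 13, "label": "PER", "token_start": 0, "token_end": 1},
]

task = {"text": text, "tokens": tokens, "spans": spans}
```

Feeding tasks shaped like this into the ner_manual interface lets Prodigy render the highlighted spans and snap selections to your tokenization instead of its default one.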
Hi!
The model is bert-base-german-cased, taken from the Hugging Face library.
Fine-tuning was done with PyTorch, not with spaCy.
Prodigy 1.10.8
Thank you for the reply!