While trying out the interface to see which annotations we can "accept" or "deny", we found that ner.teach shows an error every time. ner.manual is able to open the web application and works. However, for ner.manual, we were wondering what all the annotations at the top mean?
In addition, the text is labeled with all the existing annotations instead of one label per instance. Is there a way to answer a single "yes" or "no" each time?
Could you paste the commands you're running for ner.manual, and the errors they're giving?
Those are the labels loaded into the model. If you don't provide the --label argument, it offers you all the ones from the model you're using as options. Usually you'll want to limit them.
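For example, a minimal invocation restricted to two labels might look like this (the dataset and file names here are placeholders, not from the thread above):

```shell
# Sketch only: my_dataset and news.jsonl are placeholder names.
# --label limits the options shown in the UI to just these labels.
python3 -m prodigy ner.manual my_dataset en_core_web_sm news.jsonl --label PERSON,ORG
```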
Thanks a lot for the response. The ner.teach command I used is:
python3 -m prodigy ner.teach dataset en_core_web_sm xxx.jsonl
The error appears either at "start", or as: "ValueError: Error while validating stream: no first example. This likely means that your stream is empty."
Then we tried ner.manual:
python3 -m prodigy ner.manual dataset en_core_web_sm xxx.jsonl
Based on your response, do we need to specify which labels we want to show on each page for ner.manual? What we'd actually like to accomplish is going through examples one by one, similar to what ner.teach does, so that we can evaluate and improve the existing annotations.
There might be several things going on here:
- Try removing all pre-defined "spans" from the input data.
- Make sure there's enough data in xxx.jsonl. If you get a message saying your "stream is empty", it usually means that Prodigy couldn't find a batch of relevant suggestions in your data.
- Try focusing on a few labels at a time, not all of them. For example, --label PERSON,ORG to improve only the PERSON and ORG labels.
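The first suggestion can be done with a small standalone script. This is only a sketch using the standard library; the helper name and sample data are illustrative, not part of Prodigy:

```python
import json

def strip_spans(lines):
    """Remove any pre-annotated "spans" from JSONL task lines.

    `lines` is an iterable of JSON strings, one task per line, as in a
    Prodigy .jsonl input file. Returns the cleaned lines as strings.
    """
    cleaned = []
    for line in lines:
        task = json.loads(line)
        task.pop("spans", None)  # drop pre-defined annotations, if present
        cleaned.append(json.dumps(task))
    return cleaned

# Example input: one task with a pre-annotated span, one without.
raw = [
    '{"text": "Apple is based in Cupertino", "spans": [{"start": 0, "end": 5, "label": "ORG"}]}',
    '{"text": "Hello world"}',
]
print(strip_spans(raw))
```

Reading from and writing back to a file is then just a matter of wrapping this in `open()` calls.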
ner.manual is intended for fully manual annotation, so you need to tell it which labels you want to assign manually. If you want to improve a model in the loop, you probably want to be using ner.teach instead.