I am trying to spot-check whether `ner.teach` has caught all entities by re-annotating with `ner.manual` (around 1600 annotations made in `ner.teach`). But I could not load it and view it in the web application. Is it too many annotations? Or is there another `ner` command I could use to achieve my goal?
What exactly did you try, and how did you run the recipe? If you want to re-annotate examples, you typically want to export the dataset and then use that as the input data.
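For example, something like this should work, assuming your `ner.teach` annotations live in a dataset called `my_dataset` and you're annotating the labels `PERSON` and `ORG` (both names are placeholders — substitute your own):

```shell
# Export the annotated examples from the Prodigy database as JSONL
prodigy db-out my_dataset > annotations.jsonl

# Re-annotate the exported examples manually into a fresh dataset
prodigy ner.manual my_dataset_manual en_core_web_sm annotations.jsonl --label PERSON,ORG
```

Using a new dataset for the second pass keeps the binary `ner.teach` decisions and the manual gold-standard annotations separate, which makes training and comparison easier later on.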
However, checking if `ner.teach` "has caught all entities" is a slightly confusing objective – unless I'm misunderstanding what you're trying to do. `ner.teach` shows you suggestions based on all possible analyses of the given text, in order to find the most relevant examples to update the model with. So it's not necessarily going to suggest the model's most confident predictions, or all entities predicted in the data.
If that's what you want, you probably want to pre-train a model with the annotations collected with `ner.teach`, then work with that model and see if its predictions have improved. For example, the `ner.make-gold` workflow streams in the model's predictions and lets you edit them, which seems similar to what you're trying to do?
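As a rough sketch of that workflow — again assuming the placeholder names `my_dataset` for your `ner.teach` annotations, `annotations.jsonl` for your exported data, and the labels `PERSON,ORG`:

```shell
# Update a base model with the binary annotations collected via ner.teach
prodigy ner.batch-train my_dataset en_core_web_sm --output /tmp/ner-model --label PERSON,ORG

# Stream in the updated model's predictions and correct them manually
prodigy ner.make-gold my_dataset_gold /tmp/ner-model annotations.jsonl --label PERSON,ORG
```

The `ner.make-gold` session then shows you each example with the model's entities pre-highlighted, so you can see directly where the model is still missing entities and fix them as you go.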