I used Prodigy to retrain the QUANTITY label of the NER component in en_core_web_lg. With the updated model, it doesn't appear to be detecting CARDINAL anymore. Do I also have to train CARDINAL on the same dataset to retain that capability?
Is there any reason why updating the QUANTITY NER would impact the CARDINAL NER?
Hi! Since numeric entity types like QUANTITY and CARDINAL are so similar, it can definitely happen that the model overcorrects when you update it with examples of only one entity type. In your case, it looks like the model has learned that most of what it previously predicted as CARDINAL should now be QUANTITY.
This is also known as the "catastrophic forgetting" problem, and you typically want to prevent it by including, alongside the annotations of the new type, examples of what the model previously predicted correctly, essentially "reminding" it of what CARDINAL entities are. Doing this is pretty easy in Prodigy: you can run a workflow like ner.correct with --label CARDINAL, accept all correctly predicted entities, and then train on those annotations together with your QUANTITY dataset (see the sketch below).
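For example, the workflow could look roughly like this. The dataset names (`cardinal_annotations`, `quantity_dataset`) and the input file `news.jsonl` are just placeholders, and the train command assumes a recent Prodigy version (v1.11+):

```bash
# Correct the base model's CARDINAL predictions on your raw text and
# accept the ones that are right – these are your "reminder" examples.
prodigy ner.correct cardinal_annotations en_core_web_lg ./news.jsonl --label CARDINAL

# Train with both datasets so the model keeps seeing CARDINAL examples
# alongside the new QUANTITY annotations.
prodigy train --ner quantity_dataset,cardinal_annotations --base-model en_core_web_lg
```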
Here are some related discussions on the topic of catastrophic forgetting: