Yes, `ner.correct` will show you all examples exactly as they come in: it doesn't do any example selection with an updated model in the loop, and it doesn't skip less relevant examples in favour of better ones. It just lets you correct the model's suggestions and use the model to help you annotate faster.
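For reference, a `ner.correct` session looks roughly like this (the dataset, model and file names here are just placeholders):

```bash
# Stream in texts, pre-annotate them with the model's predictions and
# let you accept or fix the suggested spans (all names are placeholders)
prodigy ner.correct my_dataset en_core_web_sm ./texts.jsonl --label PERSON,ORG
```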
I'm not sure example selection with active learning really makes sense in your use case, since you want to add a new label from scratch. That can be pretty difficult if you start out with a model that knows nothing about the new entity type, because you need to make sure it gets to see enough examples to learn from. So I would at least collect some gold-standard annotations for all of your entity types first, so you can pretrain a model. You can always improve it later by having the model select the examples to annotate.
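As a rough sketch of that bootstrapping workflow (the dataset name, label and paths are made up, and the exact `prodigy train` syntax depends on your Prodigy version):

```bash
# 1) Collect gold-standard annotations for the new label from scratch,
#    starting from a blank English pipeline (names are placeholders)
prodigy ner.manual new_label_gold blank:en ./texts.jsonl --label MY_NEW_LABEL

# 2) Pretrain a model on those annotations so it knows about the label
#    (v1.11+ syntax; older versions use `prodigy train ner ...` instead)
prodigy train ./output_model --ner new_label_gold
```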
Btw, on the original question: it's definitely possible to update the model in the loop in a manual workflow – it just comes down to experimentation. See: Combining ner.teach with patterns file and manual correction of spans
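A minimal sketch of what that combination could look like (the label, patterns and file names are purely illustrative):

```bash
# A patterns file gives ner.teach seed examples of the new entity type,
# so it has something to suggest before the model has learned the label
cat > patterns.jsonl <<'EOF'
{"label": "MY_NEW_LABEL", "pattern": [{"lower": "graphene"}]}
{"label": "MY_NEW_LABEL", "pattern": "carbon nanotube"}
EOF

# Binary annotation with the model in the loop, bootstrapped by the patterns
prodigy ner.teach my_dataset ./output_model ./texts.jsonl \
  --label MY_NEW_LABEL --patterns patterns.jsonl
```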