Prodigy looks very promising! I have two questions:
Can Prodigy support a workflow that includes
A. First-pass annotation of raw text based on spaCy processing.
B. Dynamic annotation options for the user, generated by a custom Python function? See pic for example:
Hi! Detailed answers below – but the short summary is: yes, both of these things should work.
While you typically define the full label set once via the "config" returned by the recipe, you can also override them on a per-task basis. The same question actually came up the other day and I shared an example here:
So depending on the example, you'd just add "config": {"labels": [...]} to the outgoing task dictionary. If no override is provided, the default labels are used as a fallback.
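As a rough sketch, the per-task override can be added in a generator that wraps your stream. The function name and the condition used here are made up for illustration; only the `"config": {"labels": [...]}` key on the task is the mechanism described above.

```python
# Sketch: override the label set for individual tasks. Tasks without
# an override fall back to the labels defined in the recipe's config.
def add_label_overrides(stream):
    for eg in stream:
        if "price" in eg["text"]:
            # Hypothetical condition: only this task shows these labels
            eg["config"] = {"labels": ["PRODUCT", "MONEY"]}
        yield eg

examples = [{"text": "The price went up"}, {"text": "Hello world"}]
tasks = list(add_label_overrides(examples))
```

Here `tasks[0]` carries its own `"config"` key, while `tasks[1]` has none and will use the recipe defaults.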
In the dictionary you send out, you would then include the "text", "tokens" and "spans", as well as the "options" that can be rendered by the choice block.
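A single outgoing task combining these keys could look like the sketch below. The example text, labels, and option IDs are invented; the character offsets are hand-computed for this sentence.

```python
# Sketch of one task dict for a blocks UI: manual NER annotation
# (text/tokens/spans) plus a choice block (options).
task = {
    "text": "Apple hired Ana",
    "tokens": [
        {"text": "Apple", "start": 0, "end": 5, "id": 0},
        {"text": "hired", "start": 6, "end": 11, "id": 1},
        {"text": "Ana", "start": 12, "end": 15, "id": 2},
    ],
    "spans": [
        # Pre-highlighted span, aligned to token boundaries
        {"start": 0, "end": 5, "token_start": 0, "token_end": 0, "label": "ORG"}
    ],
    "options": [
        {"id": "COMPANY_NEWS", "text": "Company news"},
        {"id": "HIRING", "text": "Hiring"},
    ],
}
```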
Thank you for your quick response! It definitely answered my question. @ines I have a follow-up:
I read somewhere that Prodigy can apply annotations to the raw text before it is shown to the user. Is this true? What I'm trying to do is use spaCy to find some annotations, have a human reader annotate what the system missed, and feed that back into the NER model to improve it over time – a feedback loop.
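The pre-annotation half of that loop can be sketched as a generator that fills in spaCy's predicted entities as spans before the task reaches the annotator (Prodigy's built-in `ner.correct` recipe follows this pattern). To keep the sketch runnable without downloading a trained model, this uses a blank pipeline with an `entity_ruler`; in practice you would load your own NER model instead.

```python
import spacy

# Blank pipeline + entity_ruler stands in for a trained NER model here
nlp = spacy.blank("en")
ruler = nlp.add_pipe("entity_ruler")
ruler.add_patterns([{"label": "ORG", "pattern": "Apple"}])

def preannotate(texts):
    """Yield Prodigy-style tasks with spaCy's entities pre-filled as spans."""
    for text in texts:
        doc = nlp(text)
        yield {
            "text": text,
            "spans": [
                {"start": ent.start_char, "end": ent.end_char, "label": ent.label_}
                for ent in doc.ents
            ],
        }

tasks = list(preannotate(["Apple hired Ana"]))
```

The annotator then only corrects or adds what the model missed, and the accepted answers can be used to update the model.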