This is the confidence of the prediction assigned by the model – for example, the category label predicted by the text classifier, or the entity label predicted by the entity recognizer. In your example, a score of 0.5 is essentially the "perfectly" uncertain prediction – by default, Prodigy prioritises examples with a prediction closest to 0.5, i.e. the ones it's most uncertain about and which will give you the most relevant gradient for training. (Whether you click accept or reject, the model will always have a gradient of 0.5 to learn from.)
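For illustration, the "closest to 0.5 first" priority could be sketched like this. This is a simplified stand-in, not Prodigy's built-in sorters (which are more sophisticated and work on infinite streams):

```python
# Simplified sketch of uncertainty sampling: examples whose score is
# closest to 0.5 come first. Illustrative only – not Prodigy's internals.
def prefer_uncertain(scored_examples):
    # Sort by distance from 0.5, so the most uncertain predictions lead
    return sorted(scored_examples, key=lambda eg: abs(eg["score"] - 0.5))

examples = [
    {"text": "a", "score": 0.9},
    {"text": "b", "score": 0.48},
    {"text": "c", "score": 0.1},
]
for eg in prefer_uncertain(examples):
    print(eg["text"], eg["score"])  # "b" (score 0.48) is printed first
```

Here, `"b"` wins because its score is only 0.02 away from 0.5, while the confident predictions (0.9 and 0.1) are pushed to the back of the queue.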
"show_stats": true in your
prodigy.json – see here for more details!
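For reference, that setting goes into your prodigy.json like so (showing only this one key – your file may contain other options):

```json
{
  "show_stats": true
}
```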
Btw, some more background on the progress bar (also in case others come across this issue later): Prodigy's active learning recipes use the loss returned by the model's update method to calculate an estimated annotation progress, based on how the model is improving. It fits a simple regression to predict how many more examples it will take until the loss reaches zero.
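The idea can be sketched roughly like this – an illustrative approximation, not Prodigy's actual implementation: fit a line to the recent losses and extrapolate to where it would hit zero.

```python
# Illustrative sketch: estimate progress by fitting a least-squares line
# to the losses seen so far and extrapolating the step at which the
# fitted line reaches zero. Not Prodigy's actual code, just the idea.
def estimate_progress(losses):
    n = len(losses)
    if n < 2:
        return 0.0
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(losses) / n
    # Least-squares slope of loss over update steps
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, losses)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    if slope >= 0:
        return 0.0  # loss isn't decreasing, so we can't extrapolate
    intercept = mean_y - slope * mean_x
    zero_at = -intercept / slope  # step at which the fitted line hits zero
    return min(1.0, n / zero_at)

print(estimate_progress([1.0, 0.8, 0.6, 0.4]))  # → 0.8
```

With losses falling from 1.0 in steps of 0.2, the line reaches zero after five steps, and four of them are done – hence the estimate of 0.8.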
Recipes that don’t use a model in the loop will check whether the
stream has a length, and calculate the progress based on the total number of examples and the examples already annotated in this session. If the stream is a generator, it has no length, so the progress bar will show the infinity symbol (like in the screenshot above). In your custom recipes, you can also define your own progress function as the
'progress' component returned by the recipe, for example:
```python
def get_progress(session=0, total=0, loss=0):
    progress = compute_something_here()
    return progress  # float between 0 and 1
```
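Putting it together, here's a hedged sketch of what such a callback could compute – tracking the annotation count against a known total. The dataset name, stream, and total are made up for illustration; in a real recipe you'd return this dict from a function decorated with `@prodigy.recipe`:

```python
# Hypothetical example: a custom progress function for a recipe where the
# total number of tasks is known up front. Names here are illustrative.
TOTAL_TASKS = 200

def get_progress(session=0, total=0, loss=0):
    # Assuming "total" is the number of annotations made so far
    return min(1.0, total / TOTAL_TASKS)

components = {
    "dataset": "my_dataset",  # hypothetical dataset name
    "stream": iter([]),       # your stream of tasks goes here
    "progress": get_progress,
}
print(components["progress"](total=50))  # → 0.25
```

With a fixed total like this, the bar fills linearly and never shows the infinity symbol, even though the stream itself is a generator.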