Interactive Guidelines

Hi, I've been using Prodigy for a couple of months now and I'm really happy with the smooth flow of the whole annotation process.

The most difficult thing for me, however, is reaching a good agreement ratio between annotators. The guidelines usually end up fairly complex, and just communicating them successfully to the annotators takes a lot of time (writing and reading the annotation guidelines, and then usually guiding them through some first annotations).

So I have been wondering if there couldn't be a more interactive way of showing what the annotations should look like. Specifically, I could imagine the following process being quite effective:

  • Instead of writing down guidelines as a text document, I collect examples which represent difficult cases.
  • For each of these examples I write down a short explanation of what should be the correct annotation.
  • Then in Prodigy, each new annotator first sees these hand-picked examples and annotates them as usual, but before moving on to the next instance, she would first see a pop-up or text box with the short explanation I wrote, maybe even with specific feedback on why her annotation doesn't match the intended guidelines.
  • After going through these examples (maybe about 30 instances), the annotator would just continue as usual with regular (unseen) instances from the dataset without getting any more feedback.

Basically, that would make the process of explaining the guidelines a lot more interactive, and I think it would bring some advantages, especially because annotators could just start annotating and learn in small bits what their responses should ideally look like.

Could that be implemented in any way with Prodigy? As far as I can see, it would be quite hacky if I just tried it with a custom recipe, e.g. by adding these examples to the start of the stream twice, first without and then with explanations.

Thanks a lot for the great tool, and I hope the idea is understandable enough.
Christopher


Hi! This is a good idea, and it should definitely be possible to implement with a custom recipe. You would have to prepend your "guideline examples" to the stream. It shouldn't be that hacky, though: you could load them from a separate file and add something like "is_guide": True to each of them, so you'll always be able to tell whether an example was one of your onboarding examples.
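For example, the stream composition could look something like this (a minimal sketch using Prodigy's built-in JSONL loader; the file arguments are placeholders for your own data):

```python
from prodigy.components.loaders import JSONL

def get_stream(guide_file, data_file):
    # Hand-picked onboarding examples come first, flagged with "is_guide"
    for eg in JSONL(guide_file):
        eg["is_guide"] = True
        yield eg
    # Then the regular, unseen examples without any flag
    for eg in JSONL(data_file):
        yield eg
```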

The validate_answer callback of a custom recipe lets you validate answers as the user submits them in the UI, and raise custom errors if something is not right, which will then be shown to the annotator in the UI (see https://prodi.gy/docs/custom-recipes#validate_answer). It's just a Python function, so depending on the examples, you could implement pretty thorough custom checks with helpful messages that tell the user exactly what was wrong.
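Putting both pieces together, a full recipe could look roughly like this. Again, this is just a sketch: the recipe name and the "expected" and "explanation" fields are assumptions for a binary text classification task, not part of Prodigy's API. The idea is that each guideline example stores the intended answer plus your short explanation, and the callback compares the annotator's answer against it:

```python
import prodigy
from prodigy.components.loaders import JSONL

@prodigy.recipe("textcat-onboarding")
def textcat_onboarding(dataset: str, guide_file: str, data_file: str):
    def get_stream():
        # Guideline examples first, then the regular data (see above)
        for eg in JSONL(guide_file):
            eg["is_guide"] = True
            yield eg
        for eg in JSONL(data_file):
            yield eg

    def validate_answer(eg):
        # Regular examples pass through without any checks
        if not eg.get("is_guide"):
            return
        expected = eg.get("expected")  # e.g. "accept" or "reject"
        if eg["answer"] != expected:
            # This message is shown to the annotator in the UI
            raise ValueError(
                f"The intended answer here is '{expected}'. "
                + eg.get("explanation", "")
            )

    return {
        "dataset": dataset,
        "stream": get_stream(),
        "view_id": "classification",
        "validate_answer": validate_answer,
    }
```

A line in your guide file could then look something like {"text": "...", "expected": "accept", "explanation": "..."}, and you'd start the server with prodigy textcat-onboarding my_dataset ./guidelines.jsonl ./data.jsonl -F recipe.py. When you train on the collected annotations later, you can filter out the examples with "is_guide": True.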