automate active learning for testing different methods

I would like to test several active learning methods against each other on labelled data without going through the trouble of clicking reject/accept. Is there a good way to forgo the clicking while still having the model/active learning in the loop?

I realize this ignores the time aspect of active learning, and the fact that annotations are presented in an order that helps with annotation speed. I hope to take that into account in a separate experiment.


This should be no problem. In fact, there are a couple of layers at which you could do this!

The recipes all return dictionaries of components, so you could write a wrapper recipe that calls the recipe you're interested in, takes batches from the queue, and passes them into the update callback. If you need some logic that lives in the Controller, you can also create an instance of it and call its get_questions and receive_answers methods directly. Moving up another layer, the source for the server is provided, so you could also add a quick hack there.
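As a rough sketch of the wrapper approach, here's the shape of the loop. The recipe below is a stand-in, not a real Prodigy recipe: in practice the "stream" and "update" components would come from whichever recipe you call, and the gold lookup would come from your labelled data.

```python
# Sketch: simulate annotation by answering from gold labels, assuming the
# recipe returns a dict with "stream" and "update" keys. mock_recipe and
# the gold lookup are stand-ins for a real recipe and real labelled data.

def mock_recipe():
    """Stand-in for a real recipe: yields examples, records updates."""
    updates = []

    def stream():
        for i, text in enumerate(["doc one", "doc two", "doc three"]):
            yield {"id": i, "text": text}

    def update(answers):
        # A real recipe would update the model here.
        updates.append(answers)

    return {"stream": stream(), "update": update, "updates": updates}


def simulate(components, gold, batch_size=2):
    """Answer each example from the gold lookup and feed batches to update."""
    batch = []
    for eg in components["stream"]:
        eg["answer"] = "accept" if gold.get(eg["id"]) else "reject"
        batch.append(eg)
        if len(batch) >= batch_size:
            components["update"](batch)
            batch = []
    if batch:  # flush the final partial batch
        components["update"](batch)


components = mock_recipe()
simulate(components, gold={0: True, 1: False, 2: True})
print(len(components["updates"]))  # batches that reached the update callback
```

Because the model's update callback still sees the answers batch by batch, the active learning stays in the loop; only the clicking is replaced.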

Finally, there's nothing stopping you from running the whole service and then running another Python process that makes requests against the REST API, just as the web client does. The documentation for all these layers (including the REST API) should be provided in your PRODIGY_README.html.
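That last option might look roughly like this. The endpoint paths and payload shapes below are assumptions for illustration only; check the REST API section of your PRODIGY_README.html for the real routes.

```python
# Sketch: a separate process driving a running service over HTTP, the same
# way the web client does. Routes and payload shapes are hypothetical.
import json
import urllib.request


def answer_questions(questions, gold):
    """Attach accept/reject answers from a gold lookup to a batch of questions."""
    answered = []
    for q in questions:
        q = dict(q)
        q["answer"] = "accept" if gold.get(q["text"]) else "reject"
        answered.append(q)
    return answered


def run_once(base_url, gold):
    """One fetch/answer round trip against a running service (not called here)."""
    with urllib.request.urlopen(base_url + "/get_questions") as resp:
        questions = json.loads(resp.read())["tasks"]  # assumed payload shape
    payload = json.dumps({"answers": answer_questions(questions, gold)})
    req = urllib.request.Request(
        base_url + "/give_answers",  # hypothetical route
        data=payload.encode("utf8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```

This keeps the full service (queueing, model updates, database) in the loop, so the timings and ordering you measure later should be directly comparable.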