No tasks available for ner.manual when batch_size: 1

Peculiar error: I get a “No tasks available” message from the ner.manual recipe when batch sizes are small.

prodigy ner.manual test en_core_web_sm data/sample_sents.txt

But it only happens when I highlight a span and then hit Accept.

I can “solve” this by reloading the page, but that’s obviously not optimal.

Seems to work when I put the batch_size back to 10.
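For reference, the only change from the defaults is the batch size in prodigy.json, roughly like this (trimmed to just the relevant setting):

{
  "batch_size": 1
}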

Thanks

Prodigy ver. 1.5.1

Thanks for the report, that’s very interesting :thinking: I’ll need to experiment and reproduce this, but my guess is that for some reason, the app doesn’t seem to correctly request the next batch when the queue is empty (i.e. always with batch size 1) if the task was updated before submission…

I get the same behavior, both with prodigy ner.manual and with prodigy mark [...] --view-id ner_manual.

I actually get the error with larger batch sizes too (2 or 3), as soon as I’m through the batch and need to refresh. As with @plusepsilon, it only happens if I actually annotate text. It works fine if I just accept/reject.

Here’s a reproducible example (with batch_size in prodigy.json set to 1–3):

prodigy mark assault_highlight_test highlight_quiz.jsonl --label ACTOR,EVENT,TARGET --view-id ner_manual

highlight_quiz.jsonl (65.2 KB)
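In case it’s useful, each line of the file is just a JSON object with a text field, in the format ner_manual expects. Made-up example lines (not from the actual attachment):

{"text": "The suspect assaulted a bystander outside the station."}
{"text": "Officers reported the incident to the local council."}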

@ines Any progress on figuring out the bug? Our annotators are getting slowed down by having to refresh the page. I’m happy to help with reproducing the bug in any way I can!

@andy Oh, sorry, I didn’t realise it was a blocking issue :disappointed: (I had it on my list as “weird quirk to fix before next release”.) I hope I can get to this by early next week! (I’ll be working on the annotation manager anyway, so it makes sense to investigate the potential issue here at the same time.)


No worries! No reason for you to think it was.

Sorry to bug you again on this, but any suggestions for how to work around or address the issue? I wouldn’t be asking if there weren’t annotators I feel bad for.

@andy is there a reason you can’t have batch_size: 10?

It’s mostly because of the multi-annotator setup we’re using. I want to update the DB of tasks as soon as a coder has completed each one to prevent double-coding, and we were losing annotations when coders forgot to save their work before stopping. The latter is fixable with some reminders, but the former would require some reworking of our annotation setup. But you’re right, it could be the better of the two approaches.

So sorry, everything else ended up taking longer than expected :disappointed: But I finally had a look and tried out a possible patch for this issue – I’ve sent you an email with an updated bundle to test. If this works as expected and doesn’t have any bad side-effects, we can push the fix to v1.5.2!


That fixed it. Thank you! (from me and our annotators).


Hi @ines, I'm also trying to use batch_size: 1. Is it possible to get a copy of the updated bundle?

@mikeross Sure, do you just want to send me an email to ines@explosion.ai? Sorry we haven’t shipped the fix yet – there’s a bunch of other stuff we wanted to push into that release but everything was taking longer than expected.