Missing first N annotations when using ner.manual recipe

Yes, exactly. And, of course, this assumes there's nothing in the setup that automatically reloads the app, because in that case the annotator would always get to see the second batch, no matter what.

If you aren't doing this already, you can also set the environment variable PRODIGY_LOGGING=basic. This will log everything that's going on under the hood, including the API requests and responses. If our hypothesis is correct, you should see Prodigy respond to /get_questions twice after startup, before it has received anything back via /give_answers.
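For example, you can set the variable inline when starting the server. The dataset, model and source file names below are placeholders, so substitute your own:

```shell
# Start ner.manual with basic logging enabled for this run only.
# "my_dataset", "blank:en" and "./news.jsonl" are example values.
PRODIGY_LOGGING=basic prodigy ner.manual my_dataset blank:en ./news.jsonl --label PERSON,ORG
```

The log output should then show each /get_questions request and the batch it returns, which makes it easy to spot a batch that was fetched but never answered.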

If the underlying problem turns out to be difficult to debug, you could also work around it by making your stream infinite. On each iteration, you can then load the data and send out only those examples whose hashes aren't yet in the dataset. So if a batch is dropped in the first iteration (for whatever reason), it'll be queued up again in the second. Once an iteration sends out no un-annotated examples anymore, you can break, and you know that no examples are missing. You can find more details and an example implementation here: "No tasks available" on page refresh - #2 by ines
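The loop described above can be sketched in plain Python. This is a minimal simulation, not Prodigy's actual API: the callables `load_examples`, `get_annotated_hashes` and `hash_example` are stand-ins for loading your source, querying the dataset's saved hashes (e.g. via the Prodigy database) and hashing a task:

```python
def infinite_stream(load_examples, get_annotated_hashes, hash_example):
    """Keep looping over the source until every example is annotated.

    load_examples: callable returning the full source (re-loaded each pass)
    get_annotated_hashes: callable returning the set of hashes already
        saved in the dataset (stand-in for Prodigy's DB lookup)
    hash_example: callable mapping an example to its hash
    """
    while True:
        annotated = get_annotated_hashes()
        sent_any = False
        for eg in load_examples():
            if hash_example(eg) not in annotated:
                # Not in the dataset yet: send it out (again, if it was
                # dropped on a previous pass).
                sent_any = True
                yield eg
        if not sent_any:
            # A full pass produced no un-annotated examples, so nothing
            # is missing and the stream can end.
            break


if __name__ == "__main__":
    data = [{"text": f"doc {i}"} for i in range(5)]
    saved = set()  # pretend this is the dataset
    stream = infinite_stream(lambda: data, lambda: set(saved),
                             lambda eg: eg["text"])
    first = [next(stream) for _ in range(5)]   # first pass sends all 5
    saved.update(eg["text"] for eg in first[2:])  # first batch "dropped"
    second = [next(stream) for _ in range(2)]  # second pass re-sends the 2 missing
    saved.update(eg["text"] for eg in second)
    print([eg["text"] for eg in second])
```

With real Prodigy, the hashes would come from the database for the current dataset, and the generator above would be returned as the recipe's `stream`.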