Two Questions on Teach Recipes

If you're using `ner.batch-train` or `train` with `--binary`, then no: the training process here was specifically designed for incomplete, binary yes/no answers. Before training, Prodigy will merge all annotations on the same texts, and all unannotated tokens will be treated as missing values. (Btw, if you're interested in how the updating process works for binary annotations, my slides here show an example.)
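To make the merging idea concrete, here's a minimal sketch (not Prodigy's actual implementation, and the data layout is simplified for illustration): accepted spans become known entities, rejected spans become known negatives, and every other token is treated as a missing value rather than a confirmed non-entity.

```python
def merge_binary(examples):
    """Merge binary annotations per text into token-level labels.

    "?" marks a missing value: we know nothing about that token, so the
    training process won't treat it as a confirmed non-entity.
    """
    merged = {}
    for eg in examples:
        text = eg["text"]
        # Whitespace tokenization is a simplification for this sketch.
        tokens = merged.setdefault(text, ["?"] * len(text.split()))
        start, end, label = eg["span"]  # (start_token, end_token, label)
        # Accepted span -> known entity; rejected span -> known negative.
        value = label if eg["answer"] == "accept" else f"!{label}"
        for i in range(start, end):
            tokens[i] = value
    return merged


examples = [
    {"text": "Apple opened a store in Paris", "answer": "accept", "span": (0, 1, "ORG")},
    {"text": "Apple opened a store in Paris", "answer": "reject", "span": (5, 6, "ORG")},
]
print(merge_binary(examples))
# → {'Apple opened a store in Paris': ['ORG', '?', '?', '?', '?', '!ORG']}
```

The key point is the `"?"` placeholder: two binary answers on the same text merge into one partially-labelled example, and everything unannotated stays unknown.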

Depending on the annotations and the specific texts and entities, it can happen that the binary annotations don't move the needle very much. If training on the binary annotations makes the model significantly worse, you might also want to double-check that your data is consistent (e.g. by reviewing a random sample using the `review` recipe). The dataset you're training from should only contain binary annotations, and it shouldn't label any partial suggestions as accepted (see here for background on this).
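As a quick first pass before reaching for the `review` recipe, you could scan an exported copy of your dataset for anything that isn't a clean binary accept/reject answer. This is a hedged sketch over a plain list of dicts (the field names follow Prodigy's JSONL task format, but the helper itself is hypothetical):

```python
def find_non_binary(examples):
    """Return indices of examples whose answer isn't a binary accept/reject."""
    return [
        i for i, eg in enumerate(examples)
        if eg.get("answer") not in ("accept", "reject")
    ]


dataset = [
    {"text": "Berlin is big", "answer": "accept"},
    {"text": "See you soon", "answer": "ignore"},  # should be filtered out before training
    {"text": "Paris in May", "answer": "reject"},
]
print(find_non_binary(dataset))  # → [1]
```

Anything this flags (ignored tasks, missing answers) is worth removing or re-annotating before training with `--binary`.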