Prodigy with spacy-llm ner.llm.correct - not showing the text to be annotated on the UI

Hi @Fantahun,

The truth is that the UI was not really designed to handle this many labels. But there's a reason for that: it's likely not the best idea to try to annotate this many labels at the same time.
This would be really taxing for the annotators, who would need to keep a very large label scheme in mind with every annotation task, and it's not the easiest setup for the model either.
This thread by @ines explains very well why you might consider splitting your annotation into steps.
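
In practice, that could mean running several smaller passes over the same data, each focused on a subset of labels. Here's a minimal sketch using the plain `ner.manual` recipe for illustration (the dataset names, source file and labels below are just placeholders):

```bash
# First pass: only people- and organisation-related labels
prodigy ner.manual ner_people blank:en ./news.jsonl --label PERSON,ORG

# Second pass: location- and product-related labels, over the same source
prodigy ner.manual ner_places blank:en ./news.jsonl --label GPE,LOC,PRODUCT
```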

Additionally, such a high number of labels is even less advisable in the context of LLM annotation.
It makes the prompt much bigger and harder for the model to follow, and it also slows down inference. The official prompt engineering guide by OpenAI explicitly recommends splitting complex tasks into simpler subtasks and keeping one task per prompt.
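
For `ner.llm.correct`, this translates into keeping the label list in your spacy-llm config small. A rough sketch of what the task section of such a config could look like (the labels are placeholders and the rest of the config is omitted):

```ini
[components.llm.task]
@llm_tasks = "spacy.NER.v3"
# Keep the prompt focused on a handful of labels per pass
labels = ["PERSON", "ORG", "GPE"]
```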

Finally, if you do need to show a high number of labels in the UI, you can make the label area scrollable with custom CSS via the `global_css` setting in `.prodigy.json`:

```json
{
  "global_css": ".prodigy-labels { max-height: 150px; overflow-y: auto; } .prodigy-container { max-width: 950px; }"
}
```

I'd like to reiterate that this is not our recommended way of dealing with a high number of labels.

If you're interested in more good-practice tips for NER annotation, this thread has plenty of relevant references on dealing with a high number of labels.