Not yet – but possibly soon! In v1.4.0, we quietly shipped very experimental support for the features described here:
The custom scripts are currently only available for the `html` interface, but if it all works well, we might consider adding an option for custom HTML and scripts to all interfaces. (In the case of `ner_manual`, we could, for example, allow custom HTML in the area below the text – so you could insert your own markup and links there, and write a JavaScript function that updates the data based on the current selection.)
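For anyone who wants to experiment with it, here's a minimal sketch of what a recipe using the `html` interface with a custom template and script could look like. I'm assuming the settings are exposed via the recipe's `config` as `html_template` and `javascript` – since the feature is experimental, the exact names and behaviour may still change – and the stream is just a hard-coded list for illustration:

```python
import prodigy

@prodigy.recipe("custom-html-demo")
def custom_html_demo(dataset):
    # Hard-coded stream for illustration – normally you'd load this from a file or API
    stream = [
        {"title": "A study of X", "url": "https://example.com/article-123"},
        {"title": "Notes on Y", "url": "https://example.com/article-456"},
    ]
    return {
        "dataset": dataset,   # dataset to save the annotations to
        "stream": stream,     # the examples to annotate
        "view_id": "html",    # use the HTML interface
        "config": {
            # Mustache-style template rendered with each task's properties
            "html_template": "<h3>{{title}}</h3>"
                             "<a href='{{url}}' target='_blank'>View source</a>",
            # Custom JavaScript injected into the app (experimental)
            "javascript": "console.log('Custom script loaded!');",
        },
    }
```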
Another idea could be to break your task down into a classification task first – e.g. annotate whether the title is about topic X, includes type Y, or has context Z. This would let you use a custom HTML interface, or even the choice mode with custom HTML and a list of several topic options to choose from. Once you've pre-classified your examples, you can then move on to annotating them manually.
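As a rough sketch (the topics and field names are just placeholders), a stream for the `choice` interface could be built up like this:

```python
def add_topic_options(stream):
    """Turn each incoming example into a choice task with topic options.
    The topics below are placeholders for your own categories."""
    options = [
        {"id": "topic_x", "text": "About topic X"},
        {"id": "type_y", "text": "Includes type Y"},
        {"id": "context_z", "text": "Has context Z"},
    ]
    for eg in stream:
        yield {"text": eg["title"], "options": options}
```

You could then filter the annotations in your dataset by the accepted option IDs and feed the pre-classified titles into a manual recipe for the second pass.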
Prodigy should convert all URLs in the task's `"meta"` data to links, so in the meantime, this could be a simple solution for this particular case (where you actually know the link upfront).
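For example, a task like this (the field values are made up) should show the URL as a clickable link in the card's meta area:

```python
task = {
    "text": "Some title to annotate",
    # URLs in the "meta" block should be rendered as clickable links
    "meta": {"source": "https://example.com/article-123"},
}
```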
One quick note on this: Keep in mind that the harder it is for you to make the annotation decisions based on the context, the harder it will be for the model to pick up on context clues and learn the labels you assign. This is especially true if you're relying on mostly manual annotations. So if you can come up with creative ways to pack more text context into the individual task that's displayed, it might help the model learn your labels (and help you annotate them).
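For example, if your titles come with an abstract or a first paragraph, you could merge a snippet of it into the task text – the `abstract` field here is hypothetical:

```python
def add_context(stream, n_chars=300):
    """Append a snippet of the surrounding text to each task so the
    annotator and the model get more context. Field names are hypothetical."""
    for eg in stream:
        snippet = eg.get("abstract", "")[:n_chars]
        eg["text"] = "{}\n\n{}".format(eg["title"], snippet)
        yield eg
```

Whether that makes sense depends on your task – if the extra text shouldn't itself be annotatable, rendering it separately (for example via custom HTML, as discussed above) might be the better option.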