Hey everyone!
We have just released Prodigy v1.13.0, which comes with a set of new recipes for LLM-assisted workflows via spacy-llm, offering multiple LLM backends as well as improved prompts for NER, textcat and spancat annotation!
The new spacy-llm recipes are an improved version of our existing OpenAI recipes, which we'll be slowly phasing out.
There are several advantages to the new spacy-llm workflows that we'd like to highlight:
- you can now choose from various LLM providers or even use a locally running model, eliminating the need to send your data to a third party. A full list of supported models can be found in the spacy-llm docs.
- we have provided workflows for NER, textcat and spancat annotation, but in custom recipes you can benefit from all spacy-llm tasks. The tasks and prompts are under active development by the spaCy team, so upgrades to spacy-llm can be expected to transfer directly to your Prodigy workflows.
- you are now able to improve your prompts by adding examples and label definitions for your task
- you can also configure the cache to help reduce costs
- all these settings (and more!) are configurable using the spaCy config system, which makes it easy to keep your environment reproducible and well organized (see the config sketch below).
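
To give a concrete idea of what that looks like, here is a minimal sketch of a config for an NER workflow. The labels, label definitions, example file and cache path are just placeholders, and the exact registry names (spacy.NER.v2, spacy.GPT-3-5.v1, spacy.FewShotReader.v1, spacy.BatchCache.v1) depend on your spacy-llm version, so please check the spacy-llm docs for your setup:

[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

# The task defines what the LLM is asked to do and which labels to use
[components.llm.task]
@llm_tasks = "spacy.NER.v2"
labels = PERSON,ORG,PRODUCT

# Optional label definitions to make the prompt more precise
[components.llm.task.label_definitions]
PERSON = "Names of real or fictional people."
ORG = "Companies, institutions and other organizations."
PRODUCT = "Names of commercial products."

# Optional few-shot examples read from a local file
[components.llm.task.examples]
@misc = "spacy.FewShotReader.v1"
path = "ner_examples.yml"

# The model defines which LLM provider or local model to query
[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"

# Cache responses locally to avoid paying for the same prompt twice
[components.llm.cache]
@llm_misc = "spacy.BatchCache.v1"
path = "local-ner-cache"
batch_size = 3
max_batches_in_mem = 4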
We have also updated our Large Language Models docs, where you'll find all the details and a ton of examples.
As always, we are looking forward to your feedback!
To install:
pip install --upgrade prodigy -f https://XXXX-XXXX-XXXX-XXXX@download.prodi.gy
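
Once installed, and with your provider's API key available as an environment variable (e.g. OPENAI_API_KEY for OpenAI-hosted models), a session with one of the new recipes looks roughly like this. The dataset name, config path and source file are placeholders; the exact arguments for each recipe are listed in the recipe docs:

prodigy ner.llm.correct ner_news_headlines config.cfg ./news_headlines.jsonl

This streams in examples from news_headlines.jsonl, asks the configured LLM to suggest entities for the labels in your config, and lets you correct the suggestions in the UI before they're saved to the ner_news_headlines dataset.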