Inquiries about Prodigy 1.12 and Future GPT-4 Integration

Hello Prodigy Support Team,

I hope this message finds you well. I have recently upgraded to Prodigy 1.12 and have been enjoying the new features and enhancements. I had a couple of questions that I was hoping you could assist me with.

  1. Can we use GPT-4 in Prodigy recipes? If not, do you have any information on when GPT-4 will be available for integration in Prodigy recipes? I'm eager to explore the capabilities of the newer GPT version and would appreciate any insights or updates on this matter.
  2. I have been using the "rel.manual" recipe in Prodigy to annotate relations, and it has been a great experience. However, is there currently a way to train a relation model on the annotated data and get predictions on unseen data? If not, I'd kindly ask you to consider adding a feature that lets us train relation models and generate predictions on new data. This would greatly enhance the practicality of Prodigy for relation-based tasks.

Thank you for your time and support. I look forward to hearing from you and appreciate any information or guidance you can provide regarding my inquiries.

Best regards,

Hi there!

  1. The current Prodigy recipes integrate with the non-chat (completion) models from OpenAI. My understanding is that OpenAI recently declared those endpoints "legacy" in favor of the chat-completion models served via the https://api.openai.com/v1/chat/completions endpoint. We definitely plan to support these chat-based models, but the integration will happen via spacy-llm. That way you can treat the LLM-powered predictions just like a "normal" spaCy pipeline, and Prodigy can integrate with spacy-llm directly to support a wide range of LLM models. We're working on spacy-llm support now, and I expect it won't be long before we release v1.13.
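For context, here's roughly what a spacy-llm component config looks like, so you can see what "treating it like a normal spaCy pipeline" means in practice. This is only a sketch: the exact registered task and model names (like `spacy.NER.v2` and `spacy.GPT-3-5.v1` below) depend on the spacy-llm version you have installed, so please check the spacy-llm docs for the names available to you.

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

# The task defines what the LLM should do, e.g. NER with a label set.
[components.llm.task]
@llm_tasks = "spacy.NER.v2"
labels = ["PERSON", "ORG", "LOCATION"]

# The model defines which LLM backend to call; registry names vary by version.
[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"
```

Once assembled, the resulting `nlp` object behaves like any other spaCy pipeline, which is exactly what makes the Prodigy integration straightforward.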

  2. Funny that you mention this, because relation support is on the roadmap! Support for relationship models via spacy-llm is planned for v1.13 for sure. You might also be able to use prodigy train to train a coref model (as of v1.12) using spacy-experimental. That trained model could then be re-used in a custom recipe too.
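Whatever component you end up training, applying it to unseen data in a custom recipe is just standard spaCy usage. A minimal sketch (the `./output/model-best` path is hypothetical; here a blank English pipeline stands in for a trained one so the snippet runs as-is):

```python
import spacy

# In a real recipe you'd load your trained pipeline from disk, e.g.:
# nlp = spacy.load("./output/model-best")  # hypothetical path from prodigy train
# For illustration, a blank English pipeline stands in for the trained one.
nlp = spacy.blank("en")

texts = [
    "Alice founded a company in Berlin.",
    "Bob joined it last year.",
]

# nlp.pipe processes texts as a stream, which is how you'd feed
# model predictions into a custom Prodigy recipe's example generator.
for doc in nlp.pipe(texts):
    # A trained component would populate doc attributes here,
    # e.g. doc.spans or custom extensions for relations/coref.
    print(doc.text, len(doc))
```

In a custom recipe, you would turn each `doc` into a task dict with the model's predictions pre-filled, so annotators only correct rather than annotate from scratch.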
