Hi! I have been experimenting with the `textcat.openai.correct` recipe to help me with a text classification task. When I try to run the recipe I get the following error: "The specified model 'text-davinci-003' is not available". The OpenAI website says 'text-davinci-003' has recently been deprecated and should be replaced with 'gpt-3.5-turbo-instruct'. However, when I update the recipe to use this model, I get the error "The specified model 'gpt-3.5-turbo-instruct' is not supported by the /v1/completions endpoint". I am using the latest stable release of Prodigy (1.12.7). Please can you advise on any workarounds for this?
There seem to be two problems here, and you have already solved the one related to Prodigy by replacing `text-davinci-003` with `gpt-3.5-turbo-instruct` in the recipe.

The OpenAI documentation states for `gpt-3.5-turbo-instruct`:

> Similar capabilities as GPT-3 era models. Compatible with legacy Completions endpoint and not Chat Completions.

Per their API documentation, the `/v1/completions` endpoint is the "legacy Completions endpoint" described above, while "Chat Completions" is `/v1/chat/completions`. I tested it using the `curl` command given on the linked API reference, and it worked with `gpt-3.5-turbo-instruct`. I get different errors when I misspell the model name, and can only replicate your error when I provide a valid model (like `gpt-3.5-turbo`) that does not support the Completions endpoint:

> This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?

Was this a transient OpenAI problem that is now resolved?
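To make the endpoint split concrete, here is a small illustrative routing helper. This is a sketch only, not Prodigy or OpenAI library code, and the model sets are examples taken from this thread rather than an exhaustive list:

```python
# Illustrative sketch: map a model name to the OpenAI endpoint it supports.
# The sets below are examples from this thread, NOT an exhaustive list.
LEGACY_COMPLETIONS_MODELS = {"gpt-3.5-turbo-instruct", "text-davinci-003"}
CHAT_COMPLETIONS_MODELS = {"gpt-3.5-turbo", "gpt-4"}

def endpoint_for(model: str) -> str:
    """Return the API path a given model should be called through."""
    if model in LEGACY_COMPLETIONS_MODELS:
        return "/v1/completions"
    if model in CHAT_COMPLETIONS_MODELS:
        return "/v1/chat/completions"
    raise ValueError(f"Unknown model: {model!r}")
```

So `gpt-3.5-turbo-instruct` belongs on the legacy Completions side, which is why swapping in `gpt-3.5-turbo` (a chat model) reproduces the "not supported" error.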
Thanks for your help and the quick response!

I just tried the legacy Completions endpoint (`completions.create`) directly through the `openai` Python library with `gpt-3.5-turbo-instruct`, in the same environment as Prodigy, and this works fine for me too.

However, I still get the same error when I use the `textcat.openai.correct` recipe with `gpt-3.5-turbo-instruct`. This is the full error message:

> The specified model 'gpt-3.5-turbo-instruct' is not supported by the /v1/completions endpoint. Choices are: ['ada', 'babbage', 'curie', 'davinci', 'text-ada-001', 'text-babbage-001', 'text-curie-001', 'text-davinci-002', 'text-davinci-003'].

I see these choices are now outdated compared to the OpenAI documentation.
Is it possible this error message could be coming from something in the `prodigy/components/openai.py` file?
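For what it's worth, the wording of the error suggests a hard-coded allow-list check somewhere in the recipe code. A hypothetical sketch of that kind of validation (this is NOT the actual Prodigy source, just an illustration of the pattern that would produce exactly this message):

```python
# Hypothetical sketch of an allow-list check that could produce the error
# quoted above; NOT the actual contents of prodigy/components/openai.py.
CHOICES = [
    "ada", "babbage", "curie", "davinci",
    "text-ada-001", "text-babbage-001", "text-curie-001",
    "text-davinci-002", "text-davinci-003",
]

def validate_model(model: str) -> None:
    """Reject any model not in the hard-coded list, regardless of what
    the live OpenAI API actually supports."""
    if model not in CHOICES:
        raise ValueError(
            f"The specified model '{model}' is not supported by the "
            f"/v1/completions endpoint. Choices are: {CHOICES}"
        )
```

A check like this would explain why the direct API call succeeds while the recipe fails: the list was frozen at release time and never consults the API.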
Hi @fgh95 ,
As of Prodigy 1.13, the `spacy-llm` recipes are the recommended way to support LLM workflows: they offer more flexibility in terms of configuration, so the recipes can be easily adapted to changes in LLM provider APIs.

As soon as we integrated `spacy-llm` into Prodigy, we stopped maintaining the OpenAI recipes - please see the deprecation warning in the docs.
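In case it helps, a minimal `spacy-llm` config for text classification looks roughly like this. This is a sketch based on the spacy-llm docs; the labels are placeholders, and the registered function versions (`spacy.TextCat.v3`, `spacy.GPT-3-5.v1`) may need adjusting for your installed spacy-llm version:

```ini
[nlp]
lang = "en"
pipeline = ["llm"]

[components]

[components.llm]
factory = "llm"

[components.llm.task]
@llm_tasks = "spacy.TextCat.v3"
labels = ["POSITIVE", "NEGATIVE"]

[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"
name = "gpt-3.5-turbo"
```

You would then point the LLM-based textcat recipe at this config, along the lines of `prodigy textcat.llm.correct my_dataset config.cfg data.jsonl` (check the Prodigy docs for the exact recipe name and arguments for your version).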
Hi, OK, that makes sense, thanks. I will use the `spacy-llm` recipes instead.