Could not find the API key to access the openai API

This is weird. I'm trying to run the prompt engineering recipe, so I added the values to my .env file, but I still get the error. Please check the image:


Version: 1.12.4

hi @orkenstein!

Thanks for your question and welcome to the Prodigy community :wave:

Can you set these values as environment variables and retry?

export PRODIGY_OPENAI_ORG="org-..."
export PRODIGY_OPENAI_KEY="sk-..."

This is an alternative to keeping a .env file. The downside is that you need to re-run the exports each time you open a new terminal session.
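If re-exporting gets tedious, one common workaround (not Prodigy-specific, just standard shell practice) is to append the exports to your shell profile so they're set in every new session. The values below are placeholders:

```shell
# Persist the variables across terminal sessions by appending them to your
# shell profile (~/.bashrc here; use ~/.zshrc if you're on zsh).
echo 'export PRODIGY_OPENAI_ORG="org-..."' >> ~/.bashrc
echo 'export PRODIGY_OPENAI_KEY="sk-..."' >> ~/.bashrc
# open a new terminal (or run `source ~/.bashrc`) for them to take effect
grep PRODIGY_OPENAI ~/.bashrc
```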

Just curious - do you have python-dotenv installed? If not, does installing it fix the issue?

pip install python-dotenv

Hi @ryanwesslen
Raw export works just fine.
Yep, I have python-dotenv:

Maybe it's somehow messed up by my pyenv setup?

hi @orkenstein,

Just curious - when you ran that initial Prodigy command, did you make sure to run it while you were in the prompt_eng folder? I suspect so but it's an easy mistake and I can't confirm from the terminal screenshot.

Maybe, but I can't see a reason why it would be the case.

With the same venv and in the same folder, can you run this script?

import os
from dotenv import load_dotenv

# Load environment variables from the .env file
load_dotenv()

# Access the environment variables
openai_org = os.getenv("PRODIGY_OPENAI_ORG")
openai_key = os.getenv("PRODIGY_OPENAI_KEY")

# Print the environment variables
print("PRODIGY_OPENAI_ORG:", openai_org)
print("PRODIGY_OPENAI_KEY:", openai_key)

This is what Prodigy is doing to get the .env variables.
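For context, `load_dotenv()` essentially just reads `KEY=VALUE` lines from the file into `os.environ`. Here's a rough, stdlib-only sketch of that behavior (simplified; the real library also handles comments, quoting, `export` prefixes and interpolation), which can be handy for sanity-checking what your .env file actually parses to:

```python
import os

def load_env_file(path=".env"):
    """Rough sketch of what python-dotenv's load_dotenv() does: read
    KEY=VALUE lines from a file into os.environ. The real library also
    handles comments, quoting, 'export' prefixes and interpolation."""
    if not os.path.exists(path):
        return False  # load_dotenv() also returns False when no file is found
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # like load_dotenv's default, don't override existing variables
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
    return True

# quick self-check with a throwaway file
os.environ.pop("PRODIGY_OPENAI_KEY", None)
with open("throwaway.env", "w") as f:
    f.write('PRODIGY_OPENAI_KEY="sk-test"\n')
load_env_file("throwaway.env")
print(os.getenv("PRODIGY_OPENAI_KEY"))  # → sk-test
```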

There's no venv, just pyenv shims. The result:

What if you run Prodigy like:

python3 -m prodigy ...

Sadly, the same result

Back to this point - are you not creating a separate virtual environment for Prodigy?

Per the install docs, we recommend creating one because it avoids exactly this kind of environment issue. Can you try creating a new venv for Prodigy and retrying?
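Something like this (the venv name is arbitrary; install Prodigy into it per the install docs afterwards):

```shell
# Create and activate a fresh virtual environment just for Prodigy
python3 -m venv prodigy-venv
. prodigy-venv/bin/activate
# the venv starts out (almost) empty; install Prodigy into it following
# the install docs, then retry the recipe from inside the venv
python -m pip --version
```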

No luck:

Since it looks like the export works fine, you should use that in the interim.

We've decided to remove the internal .env loading from the recipes; this change will land when we move to v1.13, which is coming very soon (maybe even next week).

So my suggestion would be to use export for now, but be on the lookout for v1.13, where you'll need to manage these variables yourself moving forward.


Mkay, can't wait. Thanks.

Almost forgot - one other tip (useful in general, but it could also help you debug now): you can view, modify, and debug any built-in recipe by finding its location via the Location: path in prodigy stats, then looking in the recipes folder.

For example, the ab.openai.prompts recipe is in recipes/openai/. You can even set up custom logging and view it in your console by enabling general logging (either basic or verbose). Definitely avoid printing your OpenAI key or data in any production/open environment, but I'd be really curious whether you could use this to debug your recipe and see what happens right before you get this error message. We'd love to fix it if there's something we're missing.
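For instance, logging can be switched on via the PRODIGY_LOGGING environment variable before running the recipe (the recipe invocation below is illustrative, not a command to copy verbatim):

```shell
# Enable Prodigy's built-in logging before running the recipe
# (PRODIGY_LOGGING accepts "basic" or "verbose")
export PRODIGY_LOGGING=verbose
echo "PRODIGY_LOGGING=$PRODIGY_LOGGING"
# python -m prodigy ab.openai.prompts ...   # then run your recipe and watch the console
```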

Hi @cheyanneb,

I've just realized this should be pointed out better in our docs: the environment variable names for OpenAI credentials are different for the spacy-llm recipes.
They are:

OPENAI_API_KEY
OPENAI_API_ORG
You can consult the right names for other providers on spacy docs here.

Another difference between the OpenAI and spacy-llm recipes is that the spacy-llm recipes no longer load env variables internally, as we've realized this is better managed by users themselves.
If you export the spacy-llm OpenAI keys in your virtual env, you should be able to run Prodigy as normal. If you'd rather keep them in the .env file, you can run Prodigy via dotenv like so:

 dotenv run -- python -m prodigy recipe arguments

You can find some more information on the environment variables for spacy-llm recipes here.
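For the export route, that would look like this (values are placeholders; per the spacy-llm docs, its OpenAI provider reads the standard OPENAI_API_KEY / OPENAI_API_ORG names rather than the PRODIGY_OPENAI_* ones):

```shell
# spacy-llm's OpenAI provider reads the standard OpenAI variable names
# (values below are placeholders)
export OPENAI_API_KEY="sk-..."
export OPENAI_API_ORG="org-..."
env | grep '^OPENAI_API'
```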


Thank you! I deleted my comment/question after I changed the env variable names and it worked. Thank you, though!

Alright! I wasn't sure why it was showing up as "hidden" for me :slight_smile: Still, it pointed me to improving our docs for others so I'm glad it came up! Thanks!
