I’m trying to implement a simple RouteLLM module using OpenAI and Groq. I keep getting an error saying “openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable”.
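Before constructing the controller, I can sanity-check the key with a bare OpenAI client (a minimal check, assuming openai>=1.0, which reads OPENAI_API_KEY when the client is constructed):

import os
from openai import OpenAI

os.environ["OPENAI_API_KEY"] = "My secret openai key"

# If the env var is visible to this process, constructing a bare client
# should succeed without raising the "api_key client option must be set" error.
bare_client = OpenAI()
print(bare_client.api_key is not None)

Here is the full script that fails: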
import os

from rich import print
from routellm.controller import Controller

os.environ["OPENAI_API_KEY"] = "My secret openai key"
os.environ["GROQ_API_KEY"] = "My secret key"

client = Controller(
    # List of routers to initialize
    routers=["mf"],
    # The pair of strong and weak models to route between
    strong_model="gpt-3.5-turbo",
    weak_model="anyscale/mistralai/Mixtral-8x7B-Instruct-v0.1",
    # This config for the router is provided by default and is the best-performing one
    config={
        "mf": {
            "checkpoint_path": "routellm/mf_gpt3.5_augmented"
        }
    },
    # Display a progress bar for operations
    progress_bar=False,
)

def llm_query_router(query):
    response = client.chat.completions.create(
        # This tells RouteLLM to use the MF router with a cost threshold of 0.11593
        model="router-mf-0.11593",
        messages=[
            {"role": "user", "content": query}
        ],
    )
    return f"Model - {response.id}\nResponse - {response.choices[0].message.content}"

response_message = llm_query_router("What are the benefits of eating healthy food")
print(response_message)
I’ve used this same API key in a LiteLLM implementation and it works fine, so I’m not sure why it isn’t working here.
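For comparison, this is roughly what my working LiteLLM call looks like (simplified sketch; the model name and prompt are just examples):

import os
from litellm import completion

os.environ["OPENAI_API_KEY"] = "My secret openai key"

# Same key, same environment variable, routed through LiteLLM instead - this works.
response = completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "What are the benefits of eating healthy food"}],
)
print(response.choices[0].message.content)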