Please help me set up an EXTERNAL LiteLLM model #678
-
Hey @QwertyJacob! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together! To set up an external LiteLLM model using your Cerebras API, you need to modify the `AgentSettings`, whose agent LLM points to an OpenAI model by default.
Here's an example code snippet to guide you:

```python
from paperqa.settings import (
    AgentSettings,
    Settings,
    make_default_litellm_model_list_settings,
)

# Set up your Cerebras model in AgentSettings
agent_settings = AgentSettings(
    agent_llm="your_cerebras_model_name",  # Replace with your Cerebras model name
    agent_llm_config=make_default_litellm_model_list_settings(
        "your_cerebras_model_name",  # Ensure this matches the model name above
        0.0,  # Set the desired temperature
    ),
)

# Ensure compatibility with the Settings object
settings = Settings(agent=agent_settings)
```

Now `settings` can be used with the LiteLLM model. This setup changes the default OpenAI agent model to your specified Cerebras model.
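If pointing only the agent LLM at Cerebras isn't enough, the non-agent `llm` and `summary_llm` settings may need the same treatment. Below is a minimal sketch of the router-style config dict that LiteLLM expects (the same shape `make_default_litellm_model_list_settings` builds). The model name `cerebras/llama3.1-8b` and the `CEREBRAS_API_KEY` environment variable are assumptions based on LiteLLM's provider naming conventions, not something confirmed in this thread:

```python
import os

# Hypothetical model name; replace with the Cerebras model you actually use.
model_name = "cerebras/llama3.1-8b"

# LiteLLM router-style config: a "model_list" whose entries map a
# model_name to the litellm_params used to call that model.
llm_config = {
    "model_list": [
        {
            "model_name": model_name,
            "litellm_params": {
                "model": model_name,
                "temperature": 0.0,
                # LiteLLM can also read CEREBRAS_API_KEY from the
                # environment; passing it explicitly is optional.
                "api_key": os.environ.get("CEREBRAS_API_KEY"),
            },
        }
    ]
}
```

Under those assumptions, the same dict could be passed as `llm_config` and `summary_llm_config` on `Settings`, alongside `llm=model_name` and `summary_llm=model_name`, so that every LLM call routes to Cerebras rather than only the agent's.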
-
Hello!
I am reaching out because I have been unable to set up an external LiteLLM model (I am not interested in local models, but I need to use LiteLLM because I have no OpenAI or Anthropic credits). Specifically, I am using my Cerebras API, which is fully compatible with LiteLLM.
I have tried many different initializations of the Settings object, taking care to change the llm and summary_llm params and also to provide a config dict with the appropriate model_list object. None of these work.
I have also tried the simple setting from the README.md (the one that follows the text "You can use Anthropic or any other model supported by litellm:").
Please help me. I did some debugging and noticed there are some AgentSettings which by default point to OpenAI. How can I change those settings?
Note: I have also tried to use the LiteLLM proxy, but it does not work.
Thank you!!