Supported LLMs – Compatible Models in RagaAI Catalyst

This page lists the Large Language Models (LLMs) that work with RagaAI Catalyst, how to connect them, and when to choose one provider over another. You’ll also find a quick compatibility matrix, limits to be aware of, and FAQs.

How to connect a provider (2-minute setup)

  • Go to Settings → API Keys in Catalyst.

  • Click Add Key, choose the provider, and paste your API key or credentials.
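Once a key is added in the UI, the provider can also be made available to code that runs alongside Catalyst by exporting the key as an environment variable. The sketch below is a minimal, hypothetical helper (the `PROVIDER_ENV_VARS` mapping and `key_is_configured` function are illustrative, not part of the Catalyst SDK); the variable names follow each provider's own convention.

```python
import os

# Hypothetical mapping of provider -> the environment variable each
# provider's SDK conventionally reads its API key from.
PROVIDER_ENV_VARS = {
    "OpenAI": "OPENAI_API_KEY",
    "Gemini": "GOOGLE_API_KEY",
    "Anthropic": "ANTHROPIC_API_KEY",
}

def key_is_configured(provider: str) -> bool:
    """Return True if the expected environment variable is set and non-empty."""
    var = PROVIDER_ENV_VARS.get(provider)
    return bool(var and os.environ.get(var))

# Example: set a placeholder key (never hard-code real keys in source).
os.environ["OPENAI_API_KEY"] = "sk-placeholder"
print(key_is_configured("OpenAI"))   # True
print(key_is_configured("Gemini"))   # False (no key exported)
```

Keeping keys in environment variables (or a secrets manager) rather than in code is the usual practice, and it matches how most provider SDKs discover credentials by default.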

Choosing the right model

  • Rapid prototyping / lowest latency: start with lightweight models (e.g., “mini/flash” tiers).

  • Complex reasoning / tool orchestration: choose higher-end models (e.g., GPT-4.x, Claude 3.x).

  • Enterprise hosting requirements: prefer Azure OpenAI or AWS Bedrock.

  • Long context / multimodal inputs: consider Gemini or Claude 3.x with extended context windows.
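The guidance above can be sketched as a simple lookup from use case to model tier. This is a hypothetical illustration: the model identifiers are examples of each tier, not an exhaustive or authoritative Catalyst list, and `pick_model` is not a Catalyst API.

```python
# Hypothetical mapping from use case to an example model in that tier.
MODEL_BY_USE_CASE = {
    "prototyping": "gpt-4o-mini",     # lightweight "mini/flash" tier, low latency
    "reasoning": "claude-3-opus",     # higher-end reasoning / tool orchestration
    "enterprise": "azure/gpt-4o",     # enterprise hosting (Azure OpenAI / Bedrock)
    "long_context": "gemini-1.5-pro", # extended context window, multimodal
}

def pick_model(use_case: str) -> str:
    """Return an example model for the use case, defaulting to a lightweight tier."""
    return MODEL_BY_USE_CASE.get(use_case, "gpt-4o-mini")
```

In practice, you would start with the lightweight default and move up a tier only when evaluation results show the cheaper model falling short.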

Supported providers

  • OpenAI

  • Gemini

  • Azure OpenAI

  • AWS Bedrock

  • Anthropic
