# Supported LLMs

This page lists the Large Language Models (LLMs) that work with **RagaAI Catalyst**, explains how to connect each provider, and offers guidance on when to choose one provider over another. Follow the provider links below for model-specific setup details.

### How to connect a provider (2-minute setup)

* Go to **Settings → API Keys** in Catalyst.
* Click **Add Key**, choose the provider, and paste your API key or credentials.
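If you also run models locally (for example, while tracing with the Catalyst SDK), provider credentials are usually picked up from environment variables. The sketch below is a minimal, hedged helper for checking that those variables are set before you start an evaluation; the variable names follow each provider SDK's documented conventions, so confirm them against your provider's docs.

```python
import os

# Conventional environment variables read by each provider's SDK.
# (These names are assumptions based on common provider conventions;
# verify them against your provider's documentation.)
PROVIDER_ENV_VARS = {
    "openai": "OPENAI_API_KEY",
    "anthropic": "ANTHROPIC_API_KEY",
    "gemini": "GOOGLE_API_KEY",
    "azure": "AZURE_OPENAI_API_KEY",
    "aws-bedrock": "AWS_ACCESS_KEY_ID",  # Bedrock uses standard AWS credentials
}

def missing_credentials(providers):
    """Return the providers whose expected env var is unset or empty."""
    return [p for p in providers
            if not os.environ.get(PROVIDER_ENV_VARS[p])]
```

Calling `missing_credentials(["openai", "gemini"])` before kicking off an evaluation gives you a fast, local failure instead of a mid-run authentication error.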

### Choosing the right model

* **Rapid prototyping / lowest latency:** start with lightweight models (e.g., “mini/flash” tiers).
* **Complex reasoning / tool orchestration:** choose higher-end models (e.g., GPT-4.x, Claude 3.x).
* **Enterprise hosting requirements:** prefer **Azure OpenAI** or **AWS Bedrock**.
* **Long context / multimodal inputs:** consider **Gemini** or **Claude 3.x** with extended context windows.
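As a rough starting point, the guidance above can be captured as a simple lookup table. The model names below are illustrative examples only (not an authoritative Catalyst list) and will drift as providers release new versions:

```python
# Illustrative default model per use case. Model identifiers are
# examples only and will change as providers ship new versions.
DEFAULT_MODEL_FOR = {
    "prototyping": "gpt-4o-mini",       # lightweight, low latency
    "reasoning": "claude-3-5-sonnet",   # complex reasoning / tool use
    "enterprise": "azure/gpt-4o",       # hosted via Azure OpenAI
    "long-context": "gemini-1.5-pro",   # extended context, multimodal
}

def pick_model(use_case: str) -> str:
    """Return an example model for a use case, defaulting to prototyping."""
    return DEFAULT_MODEL_FOR.get(use_case, DEFAULT_MODEL_FOR["prototyping"])
```

Keeping this mapping in one place makes it easy to swap models per environment (e.g., a cheap model in CI, a stronger one in production) without touching evaluation code.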

{% content-ref url="supported-llms/openai" %}
[openai](https://docs.raga.ai/ragaai-catalyst/concepts/supported-llms/openai)
{% endcontent-ref %}

{% content-ref url="supported-llms/gemini" %}
[gemini](https://docs.raga.ai/ragaai-catalyst/concepts/supported-llms/gemini)
{% endcontent-ref %}

{% content-ref url="supported-llms/azure" %}
[azure](https://docs.raga.ai/ragaai-catalyst/concepts/supported-llms/azure)
{% endcontent-ref %}

{% content-ref url="supported-llms/aws-bedrock" %}
[aws-bedrock](https://docs.raga.ai/ragaai-catalyst/concepts/supported-llms/aws-bedrock)
{% endcontent-ref %}

{% content-ref url="supported-llms/anthropic" %}
[anthropic](https://docs.raga.ai/ragaai-catalyst/concepts/supported-llms/anthropic)
{% endcontent-ref %}
