LLM Provider
Early Alpha Version
AI Agents are in early alpha. Node interfaces, behaviors, and APIs are subject to change.
The LLM Provider node configures which AI model and provider to use. It outputs an LLM Config object that you connect to an Agent node.
Supported Providers
| Provider | Notes |
|---|---|
| OpenRouter | Access 200+ models through a single API key. Recommended for getting started. |
| OpenAI | GPT-4o, GPT-4, GPT-3.5 and other OpenAI models |
| Anthropic | Claude Sonnet, Claude Opus, Claude Haiku |
| Groq | Fast inference for Llama, Mixtral and other open models |
| Together | Open-source model hosting |
| Fireworks | Fast inference platform |
| Custom | Any OpenAI-compatible endpoint |
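Most of the providers above (OpenRouter, OpenAI, Groq, and any Custom endpoint) speak the same OpenAI-compatible chat API, which is why one node can target all of them. As a rough sketch of what such a request looks like (the base URLs are the providers' documented public endpoints; how the node builds its requests internally is an assumption, not documented here):

```python
# Sketch: an OpenAI-compatible chat request. Base URLs are the providers'
# documented public endpoints; the node's internal wiring may differ.
BASE_URLS = {
    "OpenRouter": "https://openrouter.ai/api/v1",
    "OpenAI": "https://api.openai.com/v1",
    "Groq": "https://api.groq.com/openai/v1",
}

def build_chat_request(provider, model, api_key, temperature=0.7, max_tokens=4096):
    """Return (url, headers, payload) for an OpenAI-compatible chat call."""
    return (
        f"{BASE_URLS[provider]}/chat/completions",
        {"Authorization": f"Bearer {api_key}"},
        {
            "model": model,
            "temperature": temperature,
            "max_tokens": max_tokens,
            "messages": [{"role": "user", "content": "Hello"}],
        },
    )

url, headers, payload = build_chat_request(
    "OpenRouter", "anthropic/claude-sonnet-4", "sk-..."
)
```

Because the request shape is shared, switching the Provider input mostly means swapping the base URL and API key while the rest of the configuration stays the same.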
Inputs
| Input | Type | Default | Description |
|---|---|---|---|
| Provider | Enum | OpenRouter | Which LLM provider to use |
| Model | String | anthropic/claude-sonnet-4 | Model identifier (provider-specific) |
| API Key | String | — | Your provider API key |
| Temperature | Number | 0.7 | Controls response randomness (0 = deterministic, 1 = creative) |
| Max Tokens | Number | 4096 | Maximum number of tokens in the response |
| Site URL | String | — | (OpenRouter only) Your site URL for ranking |
| Site Name | String | — | (OpenRouter only) Your site name for ranking |
| Fetch Models | Signal | — | Trigger to fetch available models from the provider API |
Outputs
| Output | Type | Description |
|---|---|---|
| LLM Config | Object | Configuration object — connect this to an Agent's LLM Config input |
| Provider Info | String | Human-readable summary of the current configuration |
| Available Models | Array | List of models returned after triggering Fetch Models |
| Models Fetched | Signal | Fires when the model list has been successfully fetched |
| Error | String | Error message if configuration is invalid |
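The exact shape of the LLM Config object is internal, but conceptually it bundles the inputs above into a single value and surfaces validation problems on the Error output. A hypothetical sketch (field names and the validation rule are illustrative, not the node's actual schema):

```python
def make_llm_config(provider, model, api_key, temperature=0.7, max_tokens=4096):
    """Bundle the node's inputs into one config value.

    Hypothetical sketch: field names are illustrative, not the real schema.
    Returns (config, error), mirroring the LLM Config and Error outputs.
    """
    if not api_key:
        # Mirrors the Error output firing on an invalid configuration.
        return None, "API Key is required"
    config = {
        "provider": provider,
        "model": model,
        "api_key": api_key,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }
    return config, None
```

In graph terms, this is the value that flows along the wire from LLM Config to the Agent's LLM Config input.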
Fetching Available Models
Instead of typing model names manually, you can fetch the full list from your provider:
- Set the Provider and API Key
- Send a signal to the Fetch Models input
- The Available Models output will populate with id/name entries
- The Models Fetched signal fires when the list is ready
Note: Anthropic does not have a public models endpoint. A curated list of current models is provided automatically.
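For OpenAI-compatible providers, fetching models amounts to a GET on the provider's models endpoint and mapping the response into id/name entries. A sketch of the mapping step, using a canned response body instead of a live request (the response shape follows the standard OpenAI models-list format; falling back to the model id as the display name is an assumption):

```python
import json

# Canned example of an OpenAI-style GET /v1/models response body.
SAMPLE_RESPONSE = json.dumps({
    "data": [
        {"id": "gpt-4o"},
        {"id": "gpt-4o-mini"},
    ]
})

def parse_models(response_body):
    """Map a models-list response into id/name entries.

    Sketch: the id/name output shape comes from the docs above; using the
    model id as the name when no name field is present is an assumption.
    """
    data = json.loads(response_body)["data"]
    return [{"id": m["id"], "name": m.get("name", m["id"])} for m in data]

print(parse_models(SAMPLE_RESPONSE))
# → [{'id': 'gpt-4o', 'name': 'gpt-4o'}, {'id': 'gpt-4o-mini', 'name': 'gpt-4o-mini'}]
```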
Usage Tips
- OpenRouter is the easiest way to get started — one API key gives you access to hundreds of models from all major providers.
- Temperature at 0 produces consistent, deterministic outputs. Values closer to 1 produce more creative and varied responses.
- The LLM Config output updates automatically whenever you change any input — no rebuild signal needed.