LLM Provider

Early Alpha Version

AI Agents are in early alpha. Node interfaces, behaviors, and APIs are subject to change.

The LLM Provider node configures which AI model and provider to use. It outputs an LLM Config object that you connect to an Agent node.

Supported Providers

| Provider | Notes |
| --- | --- |
| OpenRouter | Access 200+ models through a single API key. Recommended for getting started. |
| OpenAI | GPT-4o, GPT-4, GPT-3.5, and other OpenAI models |
| Anthropic | Claude Sonnet, Claude Opus, Claude Haiku |
| Groq | Fast inference for Llama, Mixtral, and other open models |
| Together | Open-source model hosting |
| Fireworks | Fast inference platform |
| Custom | Any OpenAI-compatible endpoint |
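A "Custom" provider only needs to speak the OpenAI chat-completions wire format. The node handles the request internally; as a rough illustration of what "OpenAI-compatible" means, the sketch below assembles a request in that convention (the endpoint path, headers, and payload follow OpenAI's public API, not this node's verified internals):

```python
# Sketch: the request shape an OpenAI-compatible endpoint expects.
# Illustrative only -- the node builds and sends this for you.

def build_chat_request(base_url, api_key, model, prompt,
                       temperature=0.7, max_tokens=4096):
    """Assemble the URL, headers, and JSON body for a chat-completion call."""
    return {
        "url": f"{base_url.rstrip('/')}/v1/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "temperature": temperature,
            "max_tokens": max_tokens,
        },
    }

request = build_chat_request(
    "http://localhost:8080", "sk-example", "my-local-model", "Hello!"
)
```

Any server exposing this shape (a local llama.cpp or vLLM instance, for example) should work with the Custom provider setting.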

Inputs

| Input | Type | Default | Description |
| --- | --- | --- | --- |
| Provider | Enum | OpenRouter | Which LLM provider to use |
| Model | String | anthropic/claude-sonnet-4 | Model identifier (provider-specific) |
| API Key | String | — | Your provider API key |
| Temperature | Number | 0.7 | Controls response randomness (0 = deterministic, 1 = creative) |
| Max Tokens | Number | 4096 | Maximum number of tokens in the response |
| Site URL | String | — | (OpenRouter only) Your site URL for ranking |
| Site Name | String | — | (OpenRouter only) Your site name for ranking |
| Fetch Models | Signal | — | Trigger to fetch available models from the provider API |
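The exact shape of the emitted LLM Config object is not documented here, but it presumably bundles the inputs above into a single object. A hypothetical sketch, with field names guessed from the input names:

```python
# Hypothetical sketch of the LLM Config object; field names are guesses
# based on the node's documented inputs, not a verified schema.

def make_llm_config(provider="OpenRouter",
                    model="anthropic/claude-sonnet-4",
                    api_key="",
                    temperature=0.7,
                    max_tokens=4096):
    """Bundle the node's inputs into one config object for an Agent node."""
    return {
        "provider": provider,
        "model": model,
        "api_key": api_key,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

config = make_llm_config(api_key="sk-example")
```

In practice you never build this object yourself; the node emits it on its LLM Config output whenever an input changes.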

Outputs

| Output | Type | Description |
| --- | --- | --- |
| LLM Config | Object | Configuration object — connect this to an Agent's LLM Config input |
| Provider Info | String | Human-readable summary of the current configuration |
| Available Models | Array | List of models returned after triggering Fetch Models |
| Models Fetched | Signal | Fires when the model list has been successfully fetched |
| Error | String | Error message if configuration is invalid |
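The Error output implies the node validates its inputs before emitting a config. The actual validation rules are not documented; the checks below are illustrative guesses at what "invalid configuration" might mean, based on the input ranges described above:

```python
def validate_config(api_key, temperature, max_tokens):
    """Return an error string for an invalid configuration, or None if it looks OK.
    These rules are illustrative guesses, not the node's documented behavior."""
    if not api_key:
        return "API Key is required"
    if not 0.0 <= temperature <= 1.0:
        return "Temperature must be between 0 and 1"
    if max_tokens <= 0:
        return "Max Tokens must be positive"
    return None
```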

Fetching Available Models

Instead of typing model names manually, you can fetch the full list from your provider:

  1. Set the Provider and API Key
  2. Send a signal to the Fetch Models input
  3. The Available Models output will populate with id/name entries
  4. The Models Fetched signal fires when the list is ready
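For OpenAI-compatible providers, model lists conventionally come from a `GET /v1/models` endpoint that returns a `data` array. A hedged sketch of turning such a response into the id/name entries described above (the response shape follows the OpenAI convention; individual providers may differ):

```python
def parse_models_response(response_json):
    """Map an OpenAI-style /v1/models response into id/name entries."""
    entries = []
    for item in response_json.get("data", []):
        model_id = item.get("id", "")
        # Many providers omit a display name; fall back to the id.
        entries.append({"id": model_id, "name": item.get("name") or model_id})
    return entries

# Example response shape (abbreviated, for illustration):
sample = {"data": [{"id": "gpt-4o"}, {"id": "llama-3-70b", "name": "Llama 3 70B"}]}
models = parse_models_response(sample)
```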

Note: Anthropic does not have a public models endpoint. A curated list of current models is provided automatically.

Usage Tips

  • OpenRouter is the easiest way to get started — one API key gives you access to hundreds of models from all major providers.
  • Temperature at 0 produces consistent, deterministic outputs. Values closer to 1 produce more creative and varied responses.
  • The LLM Config output updates automatically whenever you change any input — no rebuild signal needed.