AI Agents

Build AI-powered features visually — no backend code needed. Wire together a few nodes and you have a working chatbot, content generator, or intelligent assistant running inside your app.

What you will learn

  • How to set up an LLM Provider and connect it to an Agent
  • How to create tools the agent can call
  • How to build multi-agent networks and workflows
  • How to persist conversations with Memory

Core Concepts

The basic agent setup uses three nodes:

[LLM Provider] → [Agent] → [Agent Chat]
| Node | Role |
| --- | --- |
| LLM Provider | Picks the AI model and provider (OpenRouter, OpenAI, Anthropic, etc.) |
| Agent | The brain: holds the instructions, tools, and LLM configuration |
| Agent Chat | The interface: sends messages and receives responses |
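To make the three roles concrete, here is a minimal sketch in code of what the node chain does conceptually. All names here (`LlmProvider`, `Agent`, `AgentChat`) are illustrative assumptions, not the product's actual API, and the provider is stubbed so the sketch runs without network access.

```typescript
type Message = { role: "system" | "user" | "assistant"; content: string };

// LLM Provider node: knows which model to use and how to call it.
interface LlmProvider {
  model: string;
  complete(messages: Message[]): Promise<string>;
}

// Agent node: holds the instructions and delegates completion to the provider.
class Agent {
  constructor(private provider: LlmProvider, private instructions: string) {}
  async respond(history: Message[]): Promise<string> {
    return this.provider.complete([
      { role: "system", content: this.instructions },
      ...history,
    ]);
  }
}

// Agent Chat node: keeps the transcript and relays messages to the agent.
class AgentChat {
  private history: Message[] = [];
  constructor(private agent: Agent) {}
  async send(text: string): Promise<string> {
    this.history.push({ role: "user", content: text });
    const reply = await this.agent.respond(this.history);
    this.history.push({ role: "assistant", content: reply });
    return reply;
  }
}

// Stub provider so the example is self-contained: echoes the last message.
const echoProvider: LlmProvider = {
  model: "stub",
  async complete(messages) {
    return `echo: ${messages[messages.length - 1].content}`;
  },
};

const chat = new AgentChat(new Agent(echoProvider, "Be helpful."));
chat.send("Hello").then((r) => console.log(r)); // prints "echo: Hello"
```

In the visual editor you wire these roles together as nodes instead of writing classes; the sketch only shows how responsibility is split between them.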

Available Nodes

| Node | Description | Reference |
| --- | --- | --- |
| LLM Provider | Configure which AI model to use | Docs |
| AI Agent | Create an agent with instructions and tools | Docs |
| Agent Chat | Send messages and receive responses | Docs |
| Agent Tool | Define tools the agent can call | Docs |
| Agent Memory | Store conversation history | Docs |
| Agent Workflow | Build step-based pipelines | Docs |
| Multi-Agent Network | Coordinate multiple agents | Docs |
| MCP Server | Connect to MCP tool servers | Docs |
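The Agent Tool node in the table above gives the agent actions it can take beyond replying with text. As a rough sketch of the underlying idea, a tool pairs a name and description (which the model sees) with a function the agent runs when the model asks for it. The `Tool` shape, `get_weather` tool, and `dispatch` helper below are hypothetical, invented for illustration only:

```typescript
// A tool: metadata the model reads, plus a function the agent executes.
type Tool = {
  name: string;
  description: string;
  run: (args: Record<string, string>) => string;
};

const weatherTool: Tool = {
  name: "get_weather",
  description: "Return the current weather for a city",
  run: (args) => `Sunny in ${args.city}`, // stubbed result for the sketch
};

// When the model requests a tool call, the agent looks the tool up by name
// and runs it with the model-supplied arguments.
function dispatch(
  tools: Tool[],
  name: string,
  args: Record<string, string>
): string {
  const tool = tools.find((t) => t.name === name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);
  return tool.run(args);
}

console.log(dispatch([weatherTool], "get_weather", { city: "Oslo" }));
// prints "Sunny in Oslo"
```

In the visual editor, each Agent Tool node plays the role of one entry in this tool list; the agent handles the lookup and dispatch for you.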

Getting Started

New to AI Agents? Start with the Getting Started guide to build your first chatbot in under 5 minutes.