# AI Agents
Build AI-powered features visually — no backend code needed. Wire together a few nodes and you have a working chatbot, content generator, or intelligent assistant running inside your app.
## What you will learn
- How to set up an LLM Provider and connect it to an Agent
- How to create tools the agent can call
- How to build multi-agent networks and workflows
- How to persist conversations with Memory
## Core Concepts
The basic agent setup uses three nodes:
[LLM Provider] → [Agent] → [Agent Chat]
| Node | Role |
|---|---|
| LLM Provider | Picks the AI model and provider (OpenRouter, OpenAI, Anthropic, etc.) |
| Agent | The brain — holds instructions, tools, and the LLM config |
| Agent Chat | The interface — sends messages and receives responses |
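Although everything above is wired visually rather than written in code, the three roles can be understood as plain functions and objects. The sketch below is purely conceptual: none of these names (`llm_provider`, `Agent`, `respond`) exist in the product, and the stub just echoes instead of calling a real model.

```python
# Hypothetical sketch of the three-node wiring, for intuition only.
# The real nodes are configured visually; no code like this is needed.

def llm_provider(model: str):
    """LLM Provider: picks the AI model. Here, a stub that echoes."""
    def complete(system: str, user: str) -> str:
        # A real provider would call OpenRouter/OpenAI/Anthropic here.
        return f"[{model}] {system}: reply to '{user}'"
    return complete

class Agent:
    """Agent: the brain. Holds instructions plus the LLM config."""
    def __init__(self, instructions: str, llm):
        self.instructions = instructions
        self.llm = llm

    def respond(self, message: str) -> str:
        return self.llm(self.instructions, message)

# Agent Chat: the interface. Sends a message, receives a response.
agent = Agent("You are a helpful assistant", llm_provider("gpt-4o"))
reply = agent.respond("Hello!")
print(reply)
```

The key point the sketch illustrates is the separation of concerns: the provider knows *which* model to call, the agent knows *what* to say (instructions, tools), and the chat node only moves messages in and out.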
## Available Nodes
| Node | Description | Reference |
|---|---|---|
| LLM Provider | Configure which AI model to use | Docs |
| AI Agent | Create an agent with instructions and tools | Docs |
| Agent Chat | Send messages and receive responses | Docs |
| Agent Tool | Define tools the agent can call | Docs |
| Agent Memory | Store conversation history | Docs |
| Agent Workflow | Build step-based pipelines | Docs |
| Multi-Agent Network | Coordinate multiple agents | Docs |
| MCP Server | Connect to MCP tool servers | Docs |
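To make the Agent Tool and Agent Memory rows concrete, here is a hedged conceptual sketch of what those nodes do behind the scenes: a tool is a named function the agent may decide to call, and memory is the stored conversation history. The names and the keyword-based "decision" below are illustrative only; in the real product a tool is defined as a node and the LLM itself decides when to call it.

```python
# Illustrative sketch only: not product code.

def get_time_tool() -> str:
    # Agent Tool: a function the agent can call. Returns a fixed
    # value here; a real tool would do actual work.
    return "12:00"

tools = {"get_time": get_time_tool}
memory: list[tuple[str, str]] = []  # Agent Memory: conversation history

def agent_respond(message: str) -> str:
    memory.append(("user", message))
    # In the real system the LLM decides whether a tool is needed;
    # this keyword check merely stands in for that decision.
    if "time" in message:
        reply = f"The time is {tools['get_time']()}"
    else:
        reply = "Hello!"
    memory.append(("assistant", reply))
    return reply

print(agent_respond("What time is it?"))  # The time is 12:00
```

Because memory persists across calls, a follow-up message arrives with the full history available, which is what lets a chatbot stay coherent over a conversation.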
## Getting Started
New to AI Agents? Start with the Getting Started guide to build your first chatbot in under 5 minutes.