# AI (local)
Run LLM prompts directly inside automations using your Collabase AI configuration or any OpenAI-compatible endpoint — including local Ollama instances with no API key required.
## Authentication
**Auth type:** API Key (optional for local Ollama)
Configure AI settings in Admin Settings → AI Configuration. The connector reads your configured base URL and API key automatically.
| Credential | Description |
|---|---|
| Base URL | http://localhost:11434/v1 for Ollama, https://api.openai.com/v1 for OpenAI |
| API Key | Leave empty for local Ollama. Required for OpenAI and other hosted providers. |
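To make the no-key case concrete, the sketch below assembles an OpenAI-compatible `/chat/completions` request from a base URL and an optional API key. This is a minimal illustration of the convention the connector follows, not its actual implementation; the model names and helper function are assumptions.

```python
import json

def build_chat_request(base_url, api_key, model, messages):
    """Assemble an OpenAI-compatible /chat/completions request.

    With a local Ollama endpoint the API key is empty, so no
    Authorization header is sent; hosted providers require one.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:  # empty string -> local Ollama, skip the auth header
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

# Local Ollama: no key needed (model name is an example)
url, headers, body = build_chat_request(
    "http://localhost:11434/v1", "", "llama3",
    [{"role": "user", "content": "Hello"}],
)
```

The same payload shape works against any provider in the table above; only the base URL and the presence of the `Authorization` header change.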
## Actions
| Action | Key inputs | Key outputs |
|---|---|---|
| Generate Text | model, systemPrompt, userPrompt, temperature, maxTokens | content, model, tokensUsed |
| Summarize Text | text, model, maxSentences | summary |
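Summarize Text presumably folds its inputs into a single summarization prompt before calling the model. A plausible sketch of that mapping follows; the exact prompt wording the connector uses is an assumption.

```python
def build_summarize_prompt(text, max_sentences=2):
    """Fold the Summarize Text inputs into one user prompt.

    The wording here is illustrative; the connector's real prompt
    template is not documented.
    """
    return (
        f"Summarize the following text in at most {max_sentences} "
        f"sentence(s). Reply with the summary only.\n\n{text}"
    )

prompt = build_summarize_prompt("Login test failed after deploy.", 2)
```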
## Example
```
[Trigger: Test Case Failed]
        ↓
[AI: Summarize Text]
    text: {{testCaseTitle}} — {{testCaseDescription}}
    maxSentences: 2
        ↓
[Slack: Send Message]
    channel: C012QA
    text: "Test failure: {{previous.summary}}"
```
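In the example above, outputs of the previous step are referenced with `{{previous.<output>}}` placeholders. A minimal sketch of that kind of substitution, assuming dot-path lookup into a nested context (the helper and its behavior for unknown keys are assumptions, not Collabase's documented semantics):

```python
import re

def render(template, context):
    """Replace {{path.to.value}} placeholders with values from a
    nested dict; unknown placeholders are left untouched."""
    def repl(match):
        value = context
        for part in match.group(1).split("."):
            if isinstance(value, dict) and part in value:
                value = value[part]
            else:
                return match.group(0)  # unresolved: keep placeholder
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", repl, template)

message = render(
    'Test failure: {{previous.summary}}',
    {"previous": {"summary": "Login flow times out after deploy."}},
)
```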
The AI (local) connector uses Collabase’s system-wide AI configuration. To use a per-automation API key instead, use the OpenAI connector with explicit credentials.