AI (local)

Run LLM prompts directly inside automations using your Collabase AI configuration or any OpenAI-compatible endpoint — including local Ollama instances with no API key required.

Authentication

Auth type: API Key (optional for local Ollama). Configure AI settings in Admin Settings → AI Configuration; the connector reads the configured base URL and API key automatically.
| Credential | Description |
| --- | --- |
| Base URL | `http://localhost:11434/v1` for Ollama; `https://api.openai.com/v1` for OpenAI |
| API Key | Leave empty for local Ollama. Required for OpenAI and other hosted providers. |
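As a rough sketch of why the API key can stay empty for local Ollama: an OpenAI-compatible client only needs an `Authorization` header when talking to a hosted provider. The function below is illustrative, not the actual Collabase implementation.

```python
def build_headers(api_key: str) -> dict:
    """Build HTTP headers for an OpenAI-compatible chat endpoint.

    With a local Ollama instance the key is empty, so no Authorization
    header is sent; hosted providers such as OpenAI need a Bearer token.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return headers

# Local Ollama: no Authorization header
local_headers = build_headers("")
# Hosted provider: Bearer token (key shown is a placeholder)
hosted_headers = build_headers("sk-example")
```

The same request body works against either base URL; only the headers differ.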

Actions

| Action | Key inputs | Key outputs |
| --- | --- | --- |
| Generate Text | model, systemPrompt, userPrompt, temperature, maxTokens | content, model, tokensUsed |
| Summarize Text | text, model, maxSentences | summary |
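To make the Generate Text inputs concrete, here is a hypothetical mapping onto an OpenAI-compatible `/v1/chat/completions` request body. The function name and defaults are assumptions for illustration; the actual connector's internals are not documented here.

```python
import json

def generate_text_payload(model: str, system_prompt: str, user_prompt: str,
                          temperature: float = 0.7, max_tokens: int = 256) -> dict:
    # Map the connector's Generate Text inputs onto the standard
    # chat-completions request shape used by OpenAI and Ollama.
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_prompt},
        ],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

body = generate_text_payload("llama3", "You are terse.", "Summarize X.")
print(json.dumps(body, indent=2))
```

The `content`, `model`, and `tokensUsed` outputs correspond to the `choices[0].message.content`, `model`, and `usage.total_tokens` fields of a standard chat-completions response.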

Example

[Trigger: Test Case Failed]

[AI: Summarize Text]
    text: {{testCaseTitle}} — {{testCaseDescription}}
    maxSentences: 2

[Slack: Send Message]
    channel: C012QA
    text: "Test failure: {{previous.summary}}"
The AI (local) connector uses Collabase’s system-wide AI configuration. To use a per-automation API key instead, use the OpenAI connector with explicit credentials.