# Services

Colour: Sky (`#0ea5e9`)
Services represent external systems that workflows run on or require — LLM providers, MCP servers, databases, messaging platforms, and anything else outside skrptiq itself.
## Pre-Built Templates

A comprehensive library of service templates is available, organised by category. Each template comes pre-filled with setup instructions, authentication details, endpoints, and capabilities in markdown format.
### LLM Providers (API)
| Template | Description |
|---|---|
| Anthropic Claude | Claude Opus, Sonnet, and Haiku models via the Anthropic API |
| OpenAI | GPT-4, GPT-4o, o1, and o3 models via the OpenAI API |
| Google Gemini | Gemini models via Google AI Studio or Vertex AI |
| Mistral AI | Mistral Large, Medium, Small, and Codestral models |
| Cohere | Command and Command-R models, optimised for RAG and enterprise use |
### Local LLMs
| Template | Description |
|---|---|
| Ollama | Local LLM runner for private/offline use with GPU acceleration |
| LM Studio | Desktop app for local model inference with OpenAI-compatible API |
| llama.cpp | Lightweight C++ inference engine for GGUF quantised models |
### Desktop Apps
| Template | Description |
|---|---|
| Claude Desktop | Anthropic’s desktop app with native MCP server support |
| ChatGPT Desktop | OpenAI’s desktop app for macOS and Windows |
### MCP Servers
| Template | Description |
|---|---|
| GitHub MCP | Repository access, pull requests, and issues via MCP |
| Filesystem MCP | Local file system read/write access via MCP |
| PostgreSQL MCP | PostgreSQL database queries via MCP |
| SQLite MCP | SQLite database queries via MCP |
| Brave Search MCP | Web search via the Brave Search API and MCP |
| Puppeteer MCP | Browser automation and web scraping via MCP |
| Memory MCP | Persistent knowledge graph memory via MCP |
| Custom MCP Server | Blank template for your own MCP server implementation |
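As an illustration of what an MCP server template's setup instructions might contain, here is a Claude Desktop-style configuration entry for the Filesystem MCP server. The package name follows the reference implementation; the directory path is a placeholder you would replace with your own:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/directory"
      ]
    }
  }
}
```

The same `mcpServers` shape applies to the other MCP templates; only the command, arguments, and any environment variables differ per server.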
### External Services
| Template | Description |
|---|---|
| Slack | Messaging, channel history, and notifications via the Slack API |
| Jira | Atlassian Jira issue and project tracking |
| Linear | Issue tracking and project management via GraphQL API |
| Notion | Workspace pages and database access via the Notion API |
| GitHub API | REST and GraphQL API for webhooks, automation, and CI/CD |
| Email / SMTP | Email sending via SMTP or API-based providers (SendGrid, Resend, Postmark) |
### Vector Databases
| Template | Description |
|---|---|
| Pinecone | Managed vector database for embeddings and similarity search |
| Chroma | Open-source embedding database, runs locally or hosted |
| Qdrant | High-performance Rust-based vector database |
## Template Content

Each template includes structured markdown covering:

- Provider/service name and type
- Endpoint URLs — API base URLs or local server addresses
- Authentication — API keys, tokens, or credentials with variable placeholders (e.g. `{{ANTHROPIC_API_KEY}}`)
- Available models or capabilities — what the service can do
- Setup instructions — installation commands, configuration steps
- Notes — rate limits, pricing tiers, platform support
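Putting those fields together, a template body for the Anthropic Claude service might look like the following (a hedged sketch, not the exact shipped template; the base URL and header name are Anthropic's documented values):

```markdown
# Anthropic Claude

**Type:** LLM Provider (API)

## Endpoints
- Base URL: https://api.anthropic.com/v1

## Authentication
- API key: {{ANTHROPIC_API_KEY}} (sent as the `x-api-key` header)

## Models
- Claude Opus, Sonnet, and Haiku families

## Setup
1. Create an API key in the Anthropic Console.
2. Store it as {{ANTHROPIC_API_KEY}} rather than hard-coding it.

## Notes
- Rate limits and pricing vary by usage tier; check the provider's documentation.
```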
## Connections

Services connect to workflows through two edge types:

- **Runs on** — the workflow's primary executor. A workflow typically runs on one service (e.g. an LLM provider).
- **Requires** — additional dependencies. A workflow might require a vector database for retrieval alongside its primary LLM.

Services do not connect directly to each other. They always relate to the graph through workflows.
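For example, a retrieval-augmented workflow might relate to its services like this (a purely illustrative sketch using template names from the tables above; the workflow name is hypothetical):

```
[RAG workflow] --runs on--> [Anthropic Claude]
[RAG workflow] --requires--> [Pinecone]
[RAG workflow] --requires--> [Filesystem MCP]
```

Note that Anthropic Claude, Pinecone, and Filesystem MCP have no edges between each other — all three attach only to the workflow.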
## Custom Services

If none of the templates match, create a blank service node and add your own configuration in the content editor. Use the `{{VARIABLE}}` syntax for any credentials or environment-specific values you want to keep configurable.
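A custom service's content can follow the same structure as the pre-built templates. For instance, a node for an internal REST API (all names and variables here are hypothetical) might contain:

```markdown
# Internal Billing API

**Type:** External Service (custom)

## Endpoints
- Base URL: {{BILLING_API_URL}}

## Authentication
- Bearer token: {{BILLING_API_TOKEN}}

## Notes
- Staging and production use different values for both variables.
```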