Build AI-powered applications, not infrastructure
Agents, functions, database queries, state, files, and templates — behind a single API with role-based access control. Deploy with Docker Compose.
Everything you need to ship AI applications
Six integrated subsystems, one unified API. Add what you need, ignore the rest.
AI Agents
Multi-provider LLM agents with tool calling, streaming, and agent-to-agent orchestration. OpenAI, Anthropic, Mistral, or Ollama.
Python Functions
Serverless Python in isolated Docker containers. Triggered by agents, webhooks, schedules, or the API. Automatic execution tracking.
Database Queries
SQL templates against external PostgreSQL, ClickHouse, or Snowflake. Parameterized, validated, and usable as agent tools.
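A query might be declared in YAML alongside the other resources. This is an illustrative sketch: the field names (`connection`, `sql`, `parameters`) and the `{{ ... }}` parameter syntax are assumptions modeled on the agent config shown further down this page, not the documented schema.

```yaml
queries:
  - namespace: analytics
    name: recent_orders
    connection: warehouse        # an external PostgreSQL/ClickHouse/Snowflake connection
    sql: |
      SELECT order_id, total, created_at
      FROM orders
      WHERE customer_email = {{ email }}
      ORDER BY created_at DESC
      LIMIT {{ limit }}
    parameters:                  # validated before execution
      email: { type: string }
      limit: { type: integer, default: 10 }
```

An agent granted this query under `enabled_queries` could then call it as a tool with validated parameters.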
Skills
Reusable instruction documents that agents retrieve on-demand or preload into system prompts. Share expertise without prompt bloat.
State Management
Persistent key-value storage with namespaces. Agents maintain memory and context across conversations with fine-grained access control.
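The behavior described above can be sketched as a namespaced key-value store with per-agent access checks. This is a conceptual model of the semantics, not the actual Sinas storage layer; the class and field names are hypothetical.

```python
# Conceptual sketch: namespaced KV state with read/write access control.
# The agent dict mirrors the "state_namespaces_readwrite" field from the
# agent config example on this page; everything else is illustrative.
class StateStore:
    def __init__(self):
        self._data = {}  # (namespace, key) -> value

    def set(self, agent, namespace, key, value):
        if namespace not in agent.get("state_namespaces_readwrite", []):
            raise PermissionError(f"{agent['name']} cannot write to {namespace}")
        self._data[(namespace, key)] = value

    def get(self, agent, namespace, key, default=None):
        allowed = (agent.get("state_namespaces_readwrite", [])
                   + agent.get("state_namespaces_read", []))
        if namespace not in allowed:
            raise PermissionError(f"{agent['name']} cannot read from {namespace}")
        return self._data.get((namespace, key), default)
```

Because keys live under namespaces, two agents (or two tenants) can use the same key name without collisions, and access is granted per namespace rather than per key.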
Files & Templates
File collections with versioning and metadata validation. Jinja2 templates for emails and dynamic content. Upload hooks for processing.
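A template resource might look like the following. The top-level key and field names are illustrative guesses in the style of the agent config shown below, not the documented schema; only the Jinja2 `{{ ... }}` syntax is stated on this page.

```yaml
templates:
  - namespace: email
    name: order_confirmation
    content: |
      Hi {{ customer_name }},

      Your order {{ order_id }} has shipped and should arrive
      by {{ delivery_date }}.
```

A function could then render this template with runtime values when sending a confirmation email.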
Agents that call agents, use tools, and remember
Configure AI agents with any LLM provider. Each agent gets a system prompt, tool access, and state namespaces. Agents can call other agents as sub-tools — orchestration is built in.
- Switch between OpenAI, Anthropic, Mistral, or Ollama per agent
- Tool calling with functions, queries, skills, state, and file search
- Agent-to-agent calls via async queue — no recursive blocking
- SSE streaming with reconnectable Redis Streams
- Jinja2 system prompts with input variable injection
```yaml
agents:
  - namespace: support
    name: ticket-agent
    model: gpt-4o
    system_prompt: |
      You are a support agent for {{company}}.
      Use tools to look up customer data.
    enabled_functions:
      - crm/lookup_customer
      - email/send_reply
    enabled_skills:
      - skill: default/tone_guidelines
        preload: true
    enabled_queries:
      - analytics/recent_orders
    state_namespaces_readwrite:
      - conversation_memory
```
Python functions with automatic execution tracking
Write Python, deploy instantly. Functions run in pre-warmed Docker containers with resource limits. Every call is tracked — including nested function calls that build execution trees.
- Isolated containers with memory, CPU, and disk limits
- Input/output validation via JSON Schema
- Context injection: user_id, access_token, execution_id
- Webhooks and cron schedules as triggers
- Admin-approved package management
```python
def lookup_customer(input, context):
    """Look up a customer by email."""
    import requests

    headers = {
        "Authorization": f"Bearer {context['access_token']}"
    }
    # Call back into Sinas API
    resp = requests.get(
        "http://host.docker.internal:8000/states",
        params={"namespace": "customers", "key": input["email"]},
        headers=headers,
    )
    return resp.json()
```
Everything else included
Authentication, permissions, configuration management, and operational tooling — all built in.
Role-Based Access Control
Granular permissions with wildcard matching and scope hierarchy. :all automatically grants :own. Define custom permission prefixes for external apps.
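The matching rules above can be sketched in a few lines of Python. This is a conceptual model of the described semantics (wildcards, and `:all` implying `:own`), not the actual Sinas implementation; the `resource:action:scope` permission format is an assumption.

```python
# Hypothetical sketch of wildcard permission matching where a grant
# ending in ":all" also satisfies the corresponding ":own" scope.
from fnmatch import fnmatch

def expand_scopes(granted: str) -> list[str]:
    """':all' automatically grants ':own' as well."""
    if granted.endswith(":all"):
        return [granted, granted[:-len("all")] + "own"]
    return [granted]

def has_permission(granted: list[str], required: str) -> bool:
    """Check a required permission against granted patterns,
    supporting '*' wildcards (e.g. 'functions:*:all')."""
    for grant in granted:
        for pattern in expand_scopes(grant):
            if fnmatch(required, pattern):
                return True
    return False
```

For example, a user holding `functions:*:all` would pass a check for `functions:execute:own`, while a user holding only `functions:execute:own` would fail a check for `functions:execute:all`.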
Declarative Configuration
Define all resources in YAML. Idempotent apply with change detection, environment variable interpolation, reference validation, and dry-run mode.
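An applied config with environment variable interpolation might look like this. The `${...}` syntax and field names are illustrative assumptions based on the feature list above, not the documented format.

```yaml
# Values like ${SUPPORT_MODEL} are interpolated from the environment
# at apply time; unresolved references would fail validation.
agents:
  - namespace: support
    name: ticket-agent
    model: ${SUPPORT_MODEL}

functions:
  - namespace: crm
    name: lookup_customer
    source: functions/lookup_customer.py
```

Because apply is idempotent with change detection, re-running it against an unchanged config would be a no-op, and dry-run mode would report the diff without writing anything.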
Authentication
Passwordless OTP login, external OIDC providers (Authentik, Auth0, Keycloak), and API keys with scoped permissions. JWT access + refresh tokens.
Management Console
Web UI for managing agents, functions, queries, skills, users, and permissions. Configure everything visually, test functions, chat with agents.
Webhooks & Schedules
Expose functions as HTTP endpoints. Trigger functions or agents on cron schedules with timezone support. Input merging and default values.
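A cron-triggered function might be declared like this. The top-level key and field names are illustrative, modeled on the agent config shown earlier; only cron scheduling, timezone support, and input defaults are stated on this page.

```yaml
schedules:
  - namespace: reports
    name: daily-summary
    cron: "0 8 * * *"            # every day at 08:00
    timezone: Europe/Amsterdam
    target: analytics/daily_summary   # function (or agent) to trigger
    input_defaults:
      period: "24h"              # merged with any trigger-time input
```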
Request Logging
Every API request logged to ClickHouse with method, path, status, duration, user, and permissions used. Auth endpoints automatically redacted.
What you can build
Customer support
Agents that look up customer data via queries, follow tone guidelines via skills, and escalate via function calls. Full conversation history with state persistence.
Internal AI tools
Connect agents to internal databases and documentation. Role-based access ensures each team sees only their data. Deploy in hours, iterate in minutes.
Multi-tenant SaaS
Namespaces and RBAC make multi-tenancy native. Each customer gets isolated agents, functions, and state. Scale to thousands of tenants on one deployment.
Workflow automation
Receive webhooks, execute business logic in Python, trigger AI analysis, send templated emails. Visual execution trees show exactly what happened.
Client delivery
Agencies and consultancies: deploy one Sinas instance per client. Build 5-20 applications per deployment. Clean data sovereignty and easy handoff.
Research assistants
Agents with progressive skills for research methodology, file collections for document search, and state for tracking findings across sessions.
Deploy anywhere, your way
Self-host on your infrastructure. Docker Compose handles everything — PostgreSQL, Redis, ClickHouse, workers, scheduler, and the console.
Start building with Sinas
Open source. Deploy in minutes. No vendor lock-in.