Zero-Trust AI Agent
High-performance. Single binary. Sandboxed by default.
cgroups v2, network namespaces (or net-guard), seccomp BPF, Landlock, and a DNS allowlist. Every tool call is isolated.
read_file, write_file, list_dir, execute_cmd, fetch_url, inspect_process — all sandboxed.
Load contextual abilities via @skill_name. Frontmatter-driven, multi-path search.
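As a sketch of what a frontmatter-driven skill file might look like (the field names, file layout, and search path are assumptions for illustration, not Rune's documented format):

```markdown
---
name: weather
description: Fetch weather reports with curl against wttr.in
---

Use `curl` to query wttr.in and summarize the result for the requested city.
```

A skill like this would then be pulled into context with `@weather`.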
GitHub Copilot (auto refresh), OpenRouter, Google Gemini, any OpenAI-compatible endpoint.
Stdio JSON-RPC client for Model Context Protocol servers. Extend without recompiling.
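An MCP stdio client spawns the server process and exchanges JSON-RPC messages over its stdin/stdout. A minimal sketch of the first handshake message (the `protocolVersion` date and `clientInfo` values are illustrative assumptions):

```shell
# First JSON-RPC message an MCP stdio client sends; the shape follows the
# MCP initialize request, values are placeholders.
init='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"example-client","version":"0.0.0"}}}'
# A real client would write this to the server's stdin; here we just parse
# the method name back out to show the envelope.
method=$(printf '%s' "$init" | sed -n 's/.*"method":"\([a-z]*\)".*/\1/p')
echo "$method"
```

The server replies with its own capabilities, after which the client can list and call the tools the server exposes.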
Same binary doubles as a Concourse CI resource type via symlink routing. Zero extra deps.
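Concourse invokes a resource type through three executables, `/opt/resource/{check,in,out}`. Symlink routing means all three are links to the single `rune` binary, which dispatches on the name it was invoked as. A sketch of that layout, using a scratch directory and a stand-in file rather than the real binary:

```shell
# Lay out the /opt/resource contract in a temp dir: three names, one binary.
dir=$(mktemp -d)
touch "$dir/rune"                 # stand-in for the real rune binary
mkdir -p "$dir/resource"
for script in check in out; do
  ln -s "$dir/rune" "$dir/resource/$script"
done
ls "$dir/resource"
```

In the published image these links would live at `/opt/resource/`, which is why no wrapper scripts or extra dependencies are needed.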
Use Rune as a Concourse CI resource type. Minimal weather pipeline:
```yaml
resource_types:
- name: rune-agent
  type: registry-image
  source:
    repository: ghcr.io/fourdollars/rune
    tag: latest

resources:
- name: weather
  type: rune-agent
  check_every: 1h
  source:
    api_key: ((copilot-pat))
    model: gpt-4o-mini
    prompt: "Fetch the weather for Taoyuan from wttr.in using curl."
    policy:
      allowed_commands: ["curl"]
      allowed_domains: ["wttr.in"]

jobs:
- name: weather-check
  plan:
  - get: weather
    trigger: true
```
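On each `check_every` interval, Concourse pipes a JSON request to the resource's `check` executable on stdin and expects a JSON array of versions on stdout. A minimal stand-in for that contract (the `ref` version key is an assumption; Rune's actual version shape may differ):

```shell
# Simulate a Concourse resource "check": read {"source":...,"version":...}
# from stdin, emit a JSON array of versions on stdout (newest last).
check() {
  cat > /dev/null     # a real check would inspect source and prior version
  printf '[{"ref":"%s"}]\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)"
}
echo '{"source":{"model":"gpt-4o-mini"},"version":null}' | check
```

A new version emitted here is what makes `trigger: true` fire the `weather-check` job.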
Build from source, then run a one-off prompt:

```shell
# Build and install
git clone https://github.com/fourdollars/rune.git
cd rune
cargo build --release
cp target/release/rune ~/.local/bin/

# First-time setup, then a one-off prompt
rune init
echo "Get weather for Tokyo" | rune --json --yes
```
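For local runs, the same policy keys shown in the pipeline's `source` block presumably apply. A hedged sketch of what a config created by `rune init` might contain (the file layout is an assumption; only the `model` and `policy` keys come from the pipeline example above):

```yaml
model: gpt-4o-mini
policy:
  allowed_commands: ["curl"]
  allowed_domains: ["wttr.in"]
```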