The agentic interface
for your Mac.

When AI lives inside apps, the cognitive load stays on you — what to do, how to prompt, which tool, what order. Autoclaw lives at the interface layer so you never carry it.

Connected Tools: Apple Neural Engine · Claude Code · MCP Protocol · SwiftUI · ClickUp · Granola · Google Sheets · GitHub · Slack · Linear · Notion · Chrome Extension

Clone & Build · See the pipeline
How it works

You work. It watches.
Then it runs.

In Phase 1, Autoclaw runs a parallel low-intelligence pipeline between you and Claude Code — voice, screen, clipboard. Zero prompting. One-click approval.

Observe → Understand → Plan → Approve → Execute

Autoclaw collapses the cognitive load of prompt engineering, thinking like a PM, managing tasks, and monitoring your session. In upcoming phases, it learns your workflows and ambiently runs tasks it recognises.
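
One rough way to picture that loop, as a hypothetical Swift sketch (the stage names come from the pipeline above; the type itself is illustrative, not Autoclaw's actual code):

enum PipelineStage: CaseIterable {
    case observe      // capture voice, screen, clipboard
    case understand   // turn raw signals into intent
    case plan         // draft the prompt and queue the next steps
    case approve      // one click from you
    case execute      // hand off to Claude Code
}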

Phase 1 · Claude Code

The cognitive load
you don't carry anymore.

A parallel low-intelligence pipeline at the interface between you and Claude Code. Voice in, prompts out. No prompt engineering. No project management. No context switching.

▸ Prompt Crafting

"It writes the prompt for you."

Watches your screen, hears your voice, and crafts the right prompt. You never describe context, explain what file you're in, or engineer a prompt again.

Voice · Screen · Clipboard · Vision
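
As a rough illustration only (every name below is hypothetical, not Autoclaw's API), the crafting step amounts to folding those ambient signals into one prompt:

// Hypothetical sketch: combine ambient signals into a single prompt.
struct AmbientContext {
    var voiceTranscript: String   // what you just said
    var screenSummary: String     // what the vision pipeline saw
    var clipboard: String?        // what you last copied
}

func craftPrompt(from context: AmbientContext) -> String {
    var prompt = context.voiceTranscript
    prompt += "\n\nOn screen: \(context.screenSummary)"
    if let clip = context.clipboard {
        prompt += "\n\nClipboard: \(clip)"
    }
    return prompt
}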

⊙ Prompt Planning

"It already knows what's next."

Thinks like a PM. Before you finish one task, Autoclaw reads the trajectory, sees the gaps, and queues the next prompt. You stay in flow — it handles the thinking ahead.

Context · Session · History

✓ Todo Maintenance

"Nothing falls through the cracks."

Todos surface from your conversations, your screen, your work. Autoclaw maintains them ambiently so you can stay in flow instead of managing lists.

Ambient · Tracking · Screen

⊕ Theatre Mode

"Make your sessions legendary."

Character-driven Claude Code sessions. Custom filler dialogue, voice caches, storylines. Building software should be entertaining — Theatre mode makes it so.

Voice · Characters · Dialogue
Connectors

It already knows
your tools.

If it's in ~/.claude/mcp.json, Autoclaw already has it. No setup. No configuration. The interface layer inherits everything.

ClickUp · Tasks
Granola · Meetings
Google Sheets · Data
GitHub · Code
Slack · Chat
Linear · Issues
Notion · Docs
Web Search · Search
Filesystem · Files
Your MCP server · Any

Add any MCP server to mcp.json — Autoclaw uses it immediately.

Architecture

An interface layer.
Not another app.

Pure Swift. Apple Neural Engine. No Electron. No cloud relay. Autoclaw doesn't compete with your tools — it sits between you and them, collapsing the space where cognitive load lives.

Vision Pipeline

Apple VNFeaturePrint generates neural embeddings on-device. Zero API cost. Only meaningful screen changes trigger the interface layer.

let request = VNGenerateImageFeaturePrintRequest()
// On-device. Neural Engine. Zero cost.
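
A slightly fuller sketch of the idea, assuming consecutive screen captures as CGImages; this is illustrative rather than Autoclaw's exact code, and the threshold value is an assumption to tune per feature-print revision:

import Vision
import CoreGraphics

// Compute an on-device feature print for one screen capture.
func featurePrint(for capture: CGImage) throws -> VNFeaturePrintObservation? {
    let request = VNGenerateImageFeaturePrintRequest()
    try VNImageRequestHandler(cgImage: capture, options: [:]).perform([request])
    return request.results?.first
}

// Gate the pipeline: only a large embedding distance counts as a meaningful change.
func isMeaningfulChange(previous: VNFeaturePrintObservation,
                        current: VNFeaturePrintObservation) throws -> Bool {
    var distance: Float = 0
    try current.computeDistance(&distance, to: previous)
    return distance > 10   // illustrative threshold, not a product default
}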

Claude Code CLI

Phase 1 runs through Claude Code's CLI with streaming JSON. Every MCP server, skill, and permission — inherited automatically.

claude -p "fix the null ref" \
--output-format stream-json
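
A minimal sketch of how an interface layer can drive that, assuming the claude CLI is on your PATH and macOS 12+ for async line reading; not Autoclaw's actual code, and CLI flags can vary by version:

import Foundation

// Run a headless Claude Code prompt and consume its newline-delimited JSON events.
func streamClaudeCode(prompt: String) async throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/env")
    process.arguments = ["claude", "-p", prompt, "--output-format", "stream-json"]

    let stdout = Pipe()
    process.standardOutput = stdout
    try process.run()

    // One JSON object per line: assistant turns, tool calls, the final result.
    for try await line in stdout.fileHandleForReading.bytes.lines {
        if let event = try? JSONSerialization.jsonObject(with: Data(line.utf8)) as? [String: Any] {
            print(event["type"] ?? "event")   // hand the event to the interface layer here
        }
    }
    process.waitUntilExit()
}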

MCP Protocol

Model Context Protocol gives the interface layer reach. One JSON config, infinite integrations. In upcoming phases, this extends beyond Claude Code.

"mcpServers": {
"slack": { "command": "npx" }
}
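
One way to picture that inheritance, as a hedged Swift sketch; the Codable types are hypothetical, and the file path is the one quoted above:

import Foundation

// Hypothetical model of the servers listed in ~/.claude/mcp.json.
struct MCPServer: Codable {
    var command: String
    var args: [String]?
}

struct MCPConfig: Codable {
    var mcpServers: [String: MCPServer]
}

func inheritedServers() throws -> [String] {
    let url = FileManager.default.homeDirectoryForCurrentUser
        .appendingPathComponent(".claude/mcp.json")
    let data = try Data(contentsOf: url)
    let config = try JSONDecoder().decode(MCPConfig.self, from: data)
    return Array(config.mcpServers.keys)   // every server Claude Code knows, inherited for free
}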

Local-first

Screen captures, voice, clipboard, and workflows stay on your laptop. The only traffic that leaves is a direct API call to Anthropic. Nothing in between.

// ~/.autoclaw/
transcripts.db // local SQLite
world-model.md // your knowledge base

It sees everything.
Nobody else does.

Interface-layer trust


Autoclaw sits between you and your AI tools — it sees your screen, voice, and clipboard. That position demands trust. Everything stays local, permissions are structural, and the pipeline is fully auditable.

Local-first architecture


Everything runs on your machine. No cloud relay. No telemetry. Your screen captures, voice transcripts, and file contents never leave your laptop except as direct API calls to Anthropic.

Fully open source


Every line of Swift is on GitHub. Read the vision pipeline. Inspect the prompt crafting logic. Verify the interface layer. Fork it, modify it, audit it.

Your Mac.
Zero cognitive load.

Clone & Build