Oracle (LLM File Bundler)

Bundle prompts + files into one-shot requests for another model. Browser automation with GPT-5.4 Pro, or direct API mode.

Oracle is a CLI and local daemon by Peter Steinberger (steipete) that bundles your prompt and files into a single, context-rich request and routes it to high-capability frontier models — GPT-5.4 Pro, GPT-5.1, Gemini 3 Pro, Claude Opus, and many more. It's designed for the 'second opinion' pattern: when your AI assistant is stuck, uncertain, or needs a different model's perspective, Oracle ships the problem out and brings back the answer.

The core workflow is simple but powerful: you give Oracle a prompt and optionally attach files (code, logs, screenshots, documents). Oracle bundles everything into a single payload and sends it to the target model. The response comes back with full context preserved. This is especially valuable for OpenClaw agents that want to consult a different model without switching their own context.

What makes Oracle unique is its dual-mode operation. In API mode, it sends directly to provider APIs (OpenAI, Anthropic, Google) using your API keys. In browser mode, it automates ChatGPT Pro or Google AI Studio through a headless browser, giving you access to models on subscription plans without API billing. The browser automation uses Playwright under the hood and handles login, session management, and response extraction.

The file bundling is thoughtful: Oracle understands file types, applies syntax highlighting for code, and structures multi-file context so the receiving model gets clean, readable input. It can process entire directories with glob patterns, making it easy to share a codebase with another model for review.

Oracle also includes a local daemon mode that keeps browser sessions alive for faster repeated queries — no re-auth on every request. The daemon listens on a local port and accepts requests via HTTP.

For OpenClaw users, Oracle is the skill that lets your agent 'phone a friend.' Stuck on a tricky debugging problem? Oracle ships it to GPT-5.4 Pro. Need a code review from a different perspective? Send the files to Claude Opus. The OpenClaw skill teaches the agent when and how to use this escape hatch.

Best suited for: AI agents needing second opinions from different models, developers wanting CLI access to frontier models, code review workflows across multiple AI providers, anyone who wants to use ChatGPT Pro features via automation.

Tags: llm, bundler, prompt, ai, developer-tools

Category: Developer Tools

Use Cases

  • Second opinion: agent sends problem to a different model when stuck
  • Code review across models: share codebase with GPT-5 Pro or Claude Opus
  • Large context analysis: bundle logs + code + docs for a frontier model
  • ChatGPT Pro access via automation without manual browser interaction
  • Model comparison: send the same prompt to multiple models and compare
  • Research synthesis: bundle multiple documents for a model to analyze
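
For the model-comparison case, a small shell loop is enough. The model identifiers and the sample file path below are illustrative; check which names your Oracle version actually accepts:

```shell
# Send the same prompt to several models and save each answer
# side by side (model names are examples, not a guaranteed list).
for model in gpt-5.4-pro gemini-3-pro claude-opus; do
  oracle ask --model "$model" --files src/retry.ts \
    'Is this retry loop safe under concurrent writes?' \
    > "answer-$model.txt"
done
```

You can then diff the saved answers to see where the models agree and where they diverge.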

Tips

  • Use API mode for speed and reliability; browser mode for accessing subscription-tier models
  • Bundle only relevant files — don't send your entire repo; use glob patterns to select key files
  • Use the daemon mode (`oracle daemon`) for repeated queries to avoid re-authentication overhead
  • Combine with the coding-agent skill: agent gets stuck → Oracle asks GPT-5 Pro for a second opinion
  • Use `--model` flag to specify which model to target: `oracle ask --model gpt-5.4-pro 'Review this code'`
  • For code reviews, use `oracle ask --files 'src/**/*.ts' 'Review this codebase for security issues'`
  • Set a default model in config so you don't need to specify it every time
  • The JSON output mode is great for parsing responses programmatically in automation workflows
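
The JSON output tip pairs well with jq in scripts. Note that the `--json` flag name and the response schema in this sketch are assumptions; verify them against `oracle ask --help` on your install:

```shell
# Assumed interface: `--json` emits an object with a `response` field.
# Both the flag and the schema are guesses -- check your version's help.
oracle ask --json --files error.log 'Explain this error' > reply.json

# Pull out just the answer text for use in an automation pipeline.
jq -r '.response // empty' reply.json
```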

Known Issues & Gotchas

  • Browser mode requires Playwright and a browser installation — run `npx playwright install chromium` first
  • Browser automation for ChatGPT Pro requires an active subscription — Oracle doesn't bypass paywalls
  • API mode requires API keys for each provider you want to use (OPENAI_API_KEY, ANTHROPIC_API_KEY, etc.)
  • Large file bundles can hit token limits — be selective about what you attach
  • Browser sessions can expire or hit CAPTCHAs — daemon mode mitigates this but doesn't eliminate it
  • The daemon mode uses a local port — ensure it's not conflicting with other services
  • Response times vary: API mode is fast, browser mode depends on the provider's web interface speed
  • Not all models support all input types — images work with multimodal models only

Alternatives

  • Direct API calls (curl)
  • aider
  • MCPorter
  • llm (Simon Willison)

Community Feedback

Oracle bundles your prompt and files so another AI can answer with real context. It speaks GPT-5.4 Pro (default), GPT-5.1 Pro, Gemini 3 Pro, Claude Sonnet/Opus and many more.

— GitHub

Oracle is a CLI and local daemon that bundles prompts and files and routes them to high-capability models. It's the 'second opinion' tool for AI agents.

— LobeHub Skills Marketplace

Oracle is incredibly useful when your agent needs a different model's perspective. Bundle the code, send it to GPT-5 Pro, get back a fresh analysis.

— Reddit r/openclaw

Configuration Examples

Install and basic usage

# Install globally
npm i -g @steipete/oracle

# Simple question (API mode)
oracle ask 'What is the best sorting algorithm for nearly-sorted data?'

# With file context
oracle ask --files src/main.ts 'Find bugs in this code'
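
Default model configuration

The tips above mention setting a default model in config. The file location and key names in this sketch are assumptions, not documented values:

```shell
# Hypothetical config file -- path and schema are guesses; check the
# project README for where Oracle actually reads its configuration.
mkdir -p ~/.oracle
cat > ~/.oracle/config.json <<'EOF'
{
  "model": "gpt-5.4-pro",
  "mode": "api"
}
EOF
```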

Multi-file code review

# Bundle multiple files with glob patterns
oracle ask --files 'src/**/*.ts' --files 'package.json' \
  --model gpt-5.4-pro \
  'Review this TypeScript project for security vulnerabilities and performance issues'

Daemon mode for persistent sessions

# Start daemon (keeps browser sessions alive)
oracle daemon start

# Query via daemon
oracle ask --daemon 'Explain this error' --files error.log

# Stop daemon
oracle daemon stop
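
Since the daemon accepts requests over HTTP on a local port, it can in principle be queried directly. The port number, endpoint path, and payload shape below are invented for illustration; they are not documented values:

```shell
# Query the running daemon over HTTP (port 4141, the /ask path, and the
# JSON fields are placeholder guesses -- check the daemon's startup
# output for the real port and interface).
curl -s http://127.0.0.1:4141/ask \
  -H 'Content-Type: application/json' \
  -d '{"prompt": "Explain this error", "files": ["error.log"]}'
```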

Installation

npm i -g @steipete/oracle

Homepage: https://askoracle.dev

Source: bundled