About Goose
Goose is Block's open-source AI agent for software engineering tasks, built in Rust and released under the Apache 2.0 license. It runs locally on your machine, works with any LLM provider, and connects to external tools through the Model Context Protocol (MCP), so it can autonomously edit code, run shells, and call APIs - all without per-seat licensing.
“Goose is the most credible open-source alternative to Cursor and Claude Code, particularly for teams that want to own their infrastructure and avoid per-seat SaaS fees. Block's heavy backing and broad MCP support make it a safe long-term bet for tinkerers and enterprises alike.”
What is Goose?
Overview
Goose is the open-source coding agent from Block, the company behind Square, Cash App, and Tidal. Released in early 2025 and accelerated under the codename Honk in late 2025, Goose is designed for developers who want a Cursor-style agent without the SaaS lock-in. It is written in Rust, distributed as a single binary, and licensed under Apache 2.0, which means you can self-host it, fork it, or embed it in your own tooling without paying anyone.
The project has grown quickly thanks to Block's heavy investment - the company uses Goose internally across its 13,000-person engineering org - and the broader open-source community that has rallied around the Model Context Protocol (MCP) standard, which Goose was an early adopter of.
Core Features
Goose is fundamentally an agent runtime. You give it a task, and it plans, executes, and iterates using whatever tools and models you have configured. Out of the box it can read and write files, run shell commands, browse the web, and invoke any tool exposed via MCP. Because MCP is now a widely supported standard, Goose can plug into databases, GitHub, Slack, Jira, Linear, and hundreds of other services without custom adapters.
The agent supports any model provider, including Anthropic Claude, OpenAI, Google Gemini, Groq, Ollama for local models, and OpenRouter-style gateways. This provider-agnostic design is the main differentiator versus closed tools like Cursor or Claude Code - you bring your own keys and pay your own model costs at API rates rather than paying a flat per-seat license on top of model usage.
Goose offers two surfaces: a CLI for terminal-first developers and a desktop app with a chat-style UI. Both share the same underlying agent runtime, so a session started in the CLI can be resumed in the desktop app. Sessions are persistent and can be shared across machines.
Extensions are first-class. The community has built dozens of MCP-compatible extensions for common workflows - database access, browser automation, Kubernetes management, and more - and writing your own extension is straightforward.
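To make the extension model concrete, here is what a stdio-based MCP extension entry can look like in Goose's YAML config. The file path (`~/.config/goose/config.yaml`), the exact schema, and the `mcp-server-fetch` server name are assumptions for illustration - run `goose configure` or check the docs for the authoritative format.

```shell
# Print an example extension entry for Goose's YAML config.
# Schema and server name below are illustrative assumptions.
out=$(cat <<'EOF'
extensions:
  fetch:                      # a hypothetical MCP server for HTTP fetching
    enabled: true
    type: stdio               # launched as a local subprocess
    cmd: uvx
    args: ["mcp-server-fetch"]
EOF
)
printf '%s\n' "$out"
```

In practice you rarely hand-edit this file; `goose configure` writes the same structure for you.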
Pricing Analysis
Goose itself is completely free. The cost you pay is for the underlying LLM API calls to whichever provider you choose. For developers already paying for Anthropic, OpenAI, or running local Ollama models, Goose adds zero incremental cost. This makes it dramatically cheaper than Cursor or Copilot for heavy users - a single Claude API key can fuel an entire team using Goose.
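A rough back-of-envelope illustrates the economics. Every number below is a made-up assumption (token volume, blended token price, seat price), not a real quote - plug in your own provider's rates.

```shell
# Back-of-envelope: API-metered cost vs. per-seat pricing for a team.
# All numbers are illustrative assumptions, not real prices.
tokens_per_dev=5000000        # tokens a heavy user might burn per month
price_per_mtok=3.00           # assumed blended $ per 1M tokens
devs=10
seat_price=40                 # assumed per-seat $/month for a SaaS agent

api_cost=$(awk -v t="$tokens_per_dev" -v p="$price_per_mtok" -v d="$devs" \
  'BEGIN { printf "%.2f", t / 1e6 * p * d }')
seat_cost=$(awk -v s="$seat_price" -v d="$devs" 'BEGIN { printf "%.2f", s * d }')
echo "API cost/month:  \$$api_cost"
echo "Seat cost/month: \$$seat_cost"
```

The comparison flips for light users, whose API spend can undercut a seat license even further.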
Block does not offer a managed Goose service today. Everything is self-run on your laptop or your own infrastructure.
Who Should Use Goose
Goose is a strong fit for developers who already pay for LLM API access and want a flexible, scriptable agent without per-seat licensing. It is particularly good for teams with strict data-residency requirements who need to run everything locally or in their own cloud, and for tinkerers who want to compose their own agent workflows from MCP building blocks.
It is less suited for developers who want a polished out-of-the-box editor experience with autocomplete - Goose is an agent first, not an editor. Pair it with VS Code, Zed, or Neovim for the editing surface.
Pros
- Fully open-source under Apache 2.0 with no per-seat licensing or vendor lock-in
- Provider-agnostic - works with Anthropic, OpenAI, Gemini, Groq, Ollama, and any OpenAI-compatible endpoint
- Native MCP support means hundreds of extensions for databases, browsers, dev tools, and SaaS apps
- Both CLI and desktop UI share one runtime, so sessions are portable across surfaces
- Backed by Block (Square, Cash App), which dogfoods Goose across 13,000+ engineers
Cons
- Bring-your-own-key model means you handle billing, rate limits, and provider outages yourself
- No managed cloud service - everything runs locally or on your own infrastructure
- Less polished editor surface than Cursor or Windsurf - you pair it with your own editor for autocomplete
How to Use Goose
1. Install Goose
Install via Homebrew on macOS (`brew install goose`), or grab a release binary for Linux and Windows from the GitHub releases page.
2. Configure a Model Provider
Run `goose configure` and choose a provider - Anthropic, OpenAI, Gemini, Groq, or local Ollama. Paste your API key once and it is stored locally.
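For scripted setups there is usually a non-interactive path alongside the wizard. The environment-variable names and the model id below are assumptions based on common Goose setups - verify them against the docs for your version.

```shell
# Interactive setup: pick a provider and paste a key once
goose configure

# Non-interactive alternative (variable names are assumptions --
# check the Goose docs for your version):
export GOOSE_PROVIDER=anthropic
export GOOSE_MODEL=claude-sonnet-4   # hypothetical model id
export ANTHROPIC_API_KEY=sk-ant-...
```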
3. Start a Session
Launch the CLI with `goose session` or open the desktop app. Describe your task in plain English.
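The session commands can be sketched as follows; the `run` subcommand and its flags are assumptions drawn from recent Goose CLI versions, so confirm with `goose run --help` before relying on them.

```shell
# Start an interactive session
goose session

# Or run a one-shot task non-interactively (flag is an assumption):
goose run -t "add unit tests for src/parser.rs"

# Resume the most recent session
goose session --resume
```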
4. Add Extensions
Install MCP extensions to expand what Goose can do - GitHub, Slack, Linear, PostgreSQL, browser automation, and more.
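Extensions are typically added through the configure menu, or attached ad hoc for one session. The `--with-extension` flag and the `mcp-server-git` server are assumptions for illustration - check `goose session --help` for the exact syntax.

```shell
# Add an extension persistently via the configure menu
goose configure   # choose "Add Extension"

# Or attach an MCP server for a single session (flag is an assumption):
goose session --with-extension "uvx mcp-server-git"
```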
5. Iterate
Goose shows each tool call before executing potentially destructive actions, so you stay in control. Approve, reject, or redirect at any step.
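The approval behavior is configurable. The mode variable and its values below are assumptions drawn from Goose's permission settings - confirm the names in your version before use.

```shell
# Control how much autonomy Goose gets (names are assumptions):
export GOOSE_MODE=approve   # ask before every tool call
# export GOOSE_MODE=auto    # run tools without prompting
# export GOOSE_MODE=chat    # discuss only, no tool calls
goose session
```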
Key Features of Goose
AI Capabilities
Works with Anthropic, OpenAI, Gemini, Groq, Ollama, and any OpenAI-compatible endpoint
Plans, executes, and iterates on multi-step coding tasks with optional approval gates
Read, write, and edit files; run shell commands and manage processes natively
Maintained by Block engineering with 13,000+ internal users dogfooding the product
Integration
Hundreds of community-built MCP extensions for databases, browsers, GitHub, Slack, and more
Editor/IDE
Same agent runtime exposed through a terminal CLI and a desktop chat UI - sessions are portable
Sessions are stored locally and can be resumed or shared across machines
Privacy & Security
First-class Ollama integration means you can run Llama, Mistral, or Qwen entirely locally
Fully open-source - inspect, fork, embed, or self-host without restrictions
Configure whether Goose asks before destructive operations or runs autonomously
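A fully local setup can be sketched in a few lines. The model name and environment variables are assumptions - substitute whatever model you actually run under Ollama.

```shell
# Fully local stack: pull a model with Ollama, point Goose at it.
# Model name and env vars are assumptions -- adjust to your setup.
ollama pull qwen2.5-coder
export GOOSE_PROVIDER=ollama
export GOOSE_MODEL=qwen2.5-coder
goose session   # prompts and code never leave your machine
```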
Key Specifications
| Attribute | Goose |
|---|---|
| Strengths | Fully open-source (Apache 2.0); provider-agnostic model support; strong MCP extension ecosystem; local model support via Ollama; backed by Block with heavy internal use |
| Weaknesses | No built-in editor or autocomplete; no managed cloud service; documentation lags commercial competitors; you manage model billing and rate limits |