Codename Goose is an on-machine AI agent that automates engineering tasks. It is open source, runs locally, is customizable with integrations, and operates independently to handle complex tasks.
Built with transparency and collaboration in mind, Goose empowers developers to contribute, customize, and innovate freely.
Goose runs locally to execute tasks efficiently, keeping control in your hands.
Customize Goose with your preferred LLM and enhance its capabilities by connecting it to any external MCP server or API.
Goose independently handles complex tasks, from debugging to deployment, freeing you to focus on what matters most.
The Goose CLI works on macOS and Linux systems and supports ARM and x86_64 architectures; on Windows, it can run via WSL.
Users can configure Goose to work with supported LLM providers by selecting a preferred LLM and supplying an API key during installation.
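As a sketch of that step, the provider and API key are set through Goose's interactive configuration command; the exact prompts vary between versions.

```bash
# Interactive setup: choose an LLM provider and enter its API key
# (prompt wording differs between Goose releases)
goose configure
```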
Goose supports sessions: single, continuous conversations between the user and the AI that can be started and managed via the Goose CLI or the Desktop application.
Users can write prompts to interact with Goose, instructing it to perform tasks such as creating an interactive tic-tac-toe game.
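A minimal CLI session might look like the following; the prompt text is only an example of the kind of instruction you can give.

```bash
# Start a new interactive session from your project directory
goose session

# Then type a prompt at the session prompt, for example:
#   create an interactive browser-based tic-tac-toe game where a player
#   competes against the computer
```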
Users can add built-in extensions such as Developer Tools and Computer Controller, enhancing Goose's capabilities with web scraping, file caching, and automations.
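Built-in extensions are normally enabled through the same configuration flow; the per-session flag shown below is an assumption to confirm against `goose session --help`.

```bash
# Enable a built-in extension for a single session
# (--with-builtin is assumed here; check `goose session --help`)
goose session --with-builtin developer

# Or add extensions permanently via the interactive menu
# (menu labels may vary): goose configure -> "Add Extension" -> "Built-in Extension"
goose configure
```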
Learn how to install Goose as a CLI and/or as a Desktop application.
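For the CLI, installation is typically a one-line script; the URL below is the commonly documented one but may change, so verify it against the official install docs before running.

```bash
# Install the Goose CLI (macOS / Linux / WSL)
# Confirm the script URL on the project's releases page first
curl -fsSL https://github.com/block/goose/releases/download/stable/download_cli.sh | bash
```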
Set up Goose to work with a wide range of LLM providers, letting you customize which model powers it.
Explore extensions that add functionality and expand Goose's capabilities.
Goose integrates with a set of supported LLM providers, each requiring an API key. On installation, Goose automatically enters its configuration screen to set up an LLM provider, and users can run the configuration command again to select a different provider or model and enter the corresponding API key.
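Besides the interactive flow, credentials can be supplied through environment variables before launching Goose; the provider key name is standard, while the Goose-specific variables below are assumptions to verify against the configuration docs.

```bash
# Provider API key (standard OpenAI variable)
export OPENAI_API_KEY="sk-..."

# Optionally pin the provider and model Goose should use
# (variable names assumed; confirm in the Goose configuration docs)
export GOOSE_PROVIDER="openai"
export GOOSE_MODEL="gpt-4o"

goose configure   # or start a session directly once credentials are set
```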
A session is a single, continuous interaction between you and Goose, providing a seamless experience.
Goose offers a command-line interface (CLI) with several commands to enhance productivity and flexibility.
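A few commonly used commands are sketched below; run `goose --help` for the authoritative list, since subcommands and flags differ between versions.

```bash
goose --help            # list available commands and flags
goose configure         # set up providers and extensions
goose session           # start an interactive session
goose session --resume  # pick up a previous session (flag assumed; check --help)
```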
'.goosehints' is a text file used to provide additional context about your project, enhancing Goose's performance.
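A `.goosehints` file is plain text placed at the root of your project; the contents below are only an example of the kind of context it can carry.

```bash
# Create a project-level .goosehints file with context Goose should keep in mind
cat > .goosehints <<'EOF'
This is a Python project managed with poetry.
Run tests with `poetry run pytest` before declaring a task done.
Follow the existing module layout under src/ and keep functions type-annotated.
EOF
```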
Rate limiting is the process of restricting the number of requests a user can make to an LLM provider within a given time window, ensuring fair usage.
As an autonomous agent, Goose is designed to carry out tasks following specific protocols for effective file management.
A collection of tips aimed at improving your workflow and efficiency with Goose.
Create your own custom MCP Server to use as a Goose extension.
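Once you have an MCP server that communicates over stdio, it can be wired in as a command-line extension; the `--with-extension` flag below is an assumption to confirm with `goose session --help`, and the server command is whatever starts your own server.

```bash
# Attach your own MCP server to a single session
# (flag name assumed; ./my_mcp_server.py is a hypothetical server script)
goose session --with-extension "python ./my_mcp_server.py"

# Alternatively, register it permanently via the interactive menu:
goose configure   # then choose the option for adding a command-line extension
```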
Add GitHub MCP Server as a Goose Extension.
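The GitHub MCP server is distributed as an npm package and needs a personal access token; the wiring below is a sketch using the same assumed `--with-extension` flag.

```bash
# Token read by the GitHub MCP server
export GITHUB_PERSONAL_ACCESS_TOKEN="<your-token>"

# Run the server via npx and attach it to a session
goose session --with-extension "npx -y @modelcontextprotocol/server-github"
```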
Add JetBrains MCP Server as a Goose Extension.
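The JetBrains integration follows the same pattern, proxying into a running IDE; the package name below is an assumption to verify against the tutorial.

```bash
# With the IDE's MCP support enabled, attach the JetBrains proxy to a session
# (package name assumed; check the JetBrains MCP documentation)
goose session --with-extension "npx -y @jetbrains/mcp-proxy"
```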
Integrate Goose with Langfuse to observe performance.
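Observability via Langfuse is configured through environment variables before launching Goose; the names below are the standard Langfuse credentials, though the exact variables Goose reads should be confirmed in the tutorial.

```bash
# Standard Langfuse credentials (the exact names Goose expects may differ)
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_HOST="https://cloud.langfuse.com"   # or your self-hosted URL

goose session   # traces of the session should then appear in Langfuse
```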
Goose, an open source AI agent, builds on a basic interaction framework to execute complex tasks efficiently.
Describes the design and implementation of the Extensions framework for customizing Goose's functionality.
Focuses on the importance and implementation of error handling as a key driver of Goose's performance.
Supports OpenAI's GPT-4 and GPT-3.5 models via the OpenAI API, configured through environment variables for the API key and organization ID.
Supports Anthropic's Claude models, integrated through an API key.
Integrates MosaicML-hosted models through an API key, allowing the use of language models served by the MosaicML platform.
Provides access to Together AI models, letting Goose use models hosted on the Together AI platform.
Supports the Azure OpenAI deployment of OpenAI models, which requires Azure-specific configuration.
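Across providers, setup boils down to exporting the provider's key (plus any provider-specific endpoint details) before configuring Goose; the Azure and organization variable names below are common conventions rather than confirmed Goose settings, so check them against the provider docs.

```bash
# OpenAI: API key plus optional organization ID
export OPENAI_API_KEY="sk-..."
export OPENAI_ORGANIZATION="org-..."   # name assumed; optional

# Anthropic (Claude)
export ANTHROPIC_API_KEY="sk-ant-..."

# Azure OpenAI: key plus deployment-specific endpoint details
# (variable names are common conventions; confirm against the Azure provider docs)
export AZURE_OPENAI_API_KEY="..."
export AZURE_OPENAI_ENDPOINT="https://<resource>.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT_NAME="<deployment>"

goose configure   # then select the matching provider
```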