Prompter IDE is a plugin that connects LLM chat services with your local file system. It lets you provide local file context to your LLM, review proposed changes, and accept or reject them. To get started, install the Chrome extension, configure site settings, set up Docker, and explore the UI.
Install Prompter IDE from the Chrome Web Store to bridge LLM chat services with your local file system.
Add the specific sites where you want Prompter IDE to be active, so the extension only runs where you need it.
Follow the Docker installation guide to set up the environment for Prompter IDE.
Read the UI Guide to familiarize yourself with the interface and use Prompter IDE efficiently.
Allows you to load local files into the LLM prompt for context, so responses are grounded in your existing codebase.
Use Git integration to review diffs, commit changes, or revert them to maintain version control directly within the IDE.
Run terminal commands such as builds or tests to verify suggestions before applying changes; a generic example workflow is sketched below.
Load files into the prompt.
View diffs, commit, or revert changes.
Run build, test, or other commands.
Apply LLM suggestions back to your files.
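As a rough illustration of how the diff review and terminal steps fit together, the following shell session is one way to vet an applied suggestion. It is a generic sketch, not part of Prompter IDE itself; the test command and commit message are placeholders.

```bash
# Inspect what the applied suggestion changed
git diff

# Run the project's usual build or test command (placeholder; use your own)
npm test

# Keep the change if it passes...
git add -A
git commit -m "Apply reviewed LLM suggestion"

# ...or discard it if it does not
# git checkout -- .
```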
Organize files and instructions into sections under a selected epic so related context is easy to find and manage.
Automatically load the latest versions of linked files, ensuring contextual responses stay up to date.
Integrate prompts directly with epics, reducing duplicated context and improving prompt accuracy.
Break down complex tasks into smaller, manageable sections to streamline project management.
Configure snippets through the Snippet Page and use them while typing in the editor. A suggestion list appears for quick insertion when the typed word matches a snippet shortcut.
Attach files by typing @ followed by the file name. A suggestion list for file selection appears as you type. Selected files are loaded into the editor, where they can be added or removed dynamically.
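For example, a prompt in the editor might reference a file like this (the file name below is purely illustrative):

```
Refactor the retry logic in @src/api/client.ts so timeouts back off exponentially.
```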
Attach files to epics; they appear in the epic's 'Files' section. Click a file to load it, and the latest version is picked up when you use the 'Copy' button.
Copy a section with the Copy button in the editor and paste it into an LLM's chat field. Attached files are included, and their latest versions are copied.
Allows users to create snippets, each with a unique shortcut and reusable content that is saved for future use.
Enables users to view all existing snippets, edit their content or shortcut, and delete unneeded snippets.
Automatically suggests snippets in the Prompt Editor as you type, making it easy to insert predefined text.
Allows users to select and insert multiple files into the LLM prompt at once, reducing manual steps. The latest version of each file is loaded dynamically, saving time when prompts are reused.
Provides an intuitive interface for selecting and managing multiple files, making it easier to organize the files included in a prompt.
Features a slider interface that allows generated code from the LLM chat to be integrated into the file system. Users can select files using a 'Select File' tab or create new files with the 'Create New File' tab.
Attaches a folder icon to pre-rendered code blocks, letting users view the original and generated content side by side and apply the changes to the file.
Allows text copied from an editor to be merged back into the file system by pasting it into a dedicated slider area.
Allows content from any document to be merged quickly by right-clicking and selecting 'Open in Merge'.
An environment variable used to protect access to the server. It should be set in the docker run command.
Mount this directory to your project directory so Prompter IDE can access and manage your files.
Provides persistent storage for prompts and snippets, ensuring data is retained between sessions.
Serves as internal version control storage, allowing tracking and management of file history.
An optional parameter that enables Docker integrations by connecting to the Docker socket.
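Putting these settings together, a docker run invocation could look roughly like the sketch below. The image name, port, PASSWORD variable name, and container paths are assumptions for illustration only; consult the Docker installation guide for the actual values.

```bash
# Sketch only: image name, port, variable name, and container paths are assumed.
docker run -d \
  -e PASSWORD=change-me \
  -v "$(pwd)":/workspace \
  -v prompter-data:/data \
  -v prompter-history:/history \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p 8080:8080 \
  prompter-ide/server

# PASSWORD                       protects access to the server
# $(pwd) -> /workspace           your project directory, mounted for access and management
# prompter-data -> /data         persistent storage for prompts and snippets
# prompter-history -> /history   internal version-control storage
# docker.sock                    optional, enables Docker integrations
```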