AI-powered Git commands for your command line. LLM Git enhances your Git workflow with AI assistance for commit messages, branch naming, PR descriptions, and more.
A plugin for the LLM command-line tool.
LLM Git provides a suite of commands that use AI to help with common Git tasks:
- Generate meaningful commit messages based on your changes
- Create descriptive branch names from commit history
- Write comprehensive PR descriptions
- Analyze and describe staged changes
- Generate fixes for your code
- Improve interactive rebases with AI assistance
llm install llm-git
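LLM Git is a plugin, so the LLM CLI itself needs to be installed first. One common route is pip (shown here as an assumption about your environment; pipx or Homebrew work too):

```bash
pip install llm
```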
# Generate a commit message and commit your staged changes
llm git commit
# Generate a branch name based on your recent commits and create it
llm git create-branch
# Create a PR with generated description
llm github create-pr
# Get an analysis of your staged changes
llm git describe-staged
# Generate changes to your code based on instructions
llm git apply "fix the bugs in this code"
# Perform an interactive rebase with AI assistance
llm git rebase HEAD~5
llm git [--model MODEL] commit [--no-edit] [--amend] [--add-metadata] [--extend-prompt TEXT] [--include-prompt]
- Generate commit message and commit changes

llm git [--model MODEL] rebase [--upstream BRANCH] [--no-edit] [--extend-prompt TEXT] [--onto BRANCH]
- Rebase the current branch with AI assistance

llm git [--model MODEL] create-branch [COMMIT_SPEC] [--preview] [--extend-prompt TEXT]
- Generate branch name from commits and create it

llm git [--model MODEL] describe-staged [--extend-prompt TEXT]
- Describe staged changes with suggestions

llm git [--model MODEL] apply INSTRUCTIONS [--cached] [--extend-prompt TEXT]
- [BETA] Generate changes based on instructions (not fully functional yet)

llm git [--model MODEL] add [--extend-prompt TEXT]
- [BETA] Generate and stage fixes (not fully functional yet)

llm git [--model MODEL] tag [COMMIT_SPEC] [--preview] [--format {name|version}] [-s|--sign] [--no-edit] [--extend-prompt TEXT]
- Generate tag name and message from commits and create an annotated tag

llm git dump-prompts
- Display all available prompts

llm github [--model MODEL] create-pr [--upstream BRANCH] [--no-edit] [--extend-prompt TEXT]
- Generate PR description from commits
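Every subcommand accepts the --model and --extend-prompt options listed above, so they can be combined freely. A hedged example (the model name is only a placeholder for whatever model your LLM installation has configured):

```bash
# Pick a specific model, skip the editor, and add an extra instruction to the prompt
llm git --model gpt-4o commit --no-edit --extend-prompt "Mention the affected subsystem in the subject line"
```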
LLM Git uses a flexible prompt system that allows you to customize how the AI generates content. Prompts are loaded from three different sources, in order of increasing precedence:
- Global config: Built-in default prompts in src/llm_git/config.yaml
- User config: Your personal customizations in ~/.config/llm-git/config.yaml
- Repository config: Project-specific settings in .llm-git.yaml at the root of your Git repository
The key feature of LLM Git's prompt system is the ability to extend prompts from higher-level configs. When you define a prompt with the same name in your user or repository config, you can reference the original prompt using {prompt[prompt_name]} and then add your own customizations.
Here's a simplified example showing how to extend the commit_message prompt:
1. Global config (built-in defaults):
prompts:
  assistant_intro: |
    # Git Assistant
    You are a git assistant.
    Line length for text output is 72 characters.

  commit_message: |
    {prompt[assistant_intro]}
    ## Writing Style
    - Use the imperative mood
    - Be terse and concise
    ## Output
    Only output the commit message.
2. User config (~/.config/llm-git/config.yaml):

prompts:
  commit_message: |
    {prompt[commit_message]}
    ## User Preferences
    - Always include a brief explanation of WHY the change was made
    - Use conventional commits format (feat, fix, docs, etc.)
3. Repository config (.llm-git.yaml):

prompts:
  commit_message: |
    {prompt[commit_message]}
    ## Project-Specific Requirements
    - Reference ticket numbers in the format PROJ-123
    - Always mention affected components
When LLM Git processes the commit_message prompt, it will:
- Start with the global commit_message prompt
- Extend it with the user's preferences
- Further extend it with the repository-specific requirements
This approach allows you to build on existing prompts rather than having to redefine them completely, making customization more modular and maintainable.
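With the three example configs above, the effective commit_message prompt would come out roughly like this (assembled by hand to illustrate the layering, not literal tool output):

```
# Git Assistant
You are a git assistant.
Line length for text output is 72 characters.

## Writing Style
- Use the imperative mood
- Be terse and concise

## Output
Only output the commit message.

## User Preferences
- Always include a brief explanation of WHY the change was made
- Use conventional commits format (feat, fix, docs, etc.)

## Project-Specific Requirements
- Reference ticket numbers in the format PROJ-123
- Always mention affected components
```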
You can view all available prompts and their current values by running:
llm git dump-prompts
LLM_GIT_SHOW_PROMPTS=1
- Show prompts sent to the LLM

LLM_GIT_ABORT=request
- Abort before sending request to LLM

LLM_GIT_KEEP_TEMP_FILES=1
- Keep temporary files for debugging

LLM_GIT_COMMIT_INCLUDE_PROMPT=1
- Include the LLM prompt (commented out) in the commit message file
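Since these are ordinary environment variables, they can be set for a single invocation; for example, in bash or zsh:

```bash
# Print the prompt that will be sent to the model for this run
LLM_GIT_SHOW_PROMPTS=1 llm git commit

# Keep temporary files around to debug a problematic run
LLM_GIT_KEEP_TEMP_FILES=1 llm git describe-staged
```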