# forge-llm
Generate Pull Request descriptions for Forge using LLM providers through the llm package.
## Overview

`forge-llm` is an Emacs package that integrates Large Language Models (LLMs) with Forge, a Magit interface to GitHub and GitLab forges. This package helps you generate high-quality Pull Request descriptions based on your git diff and repository PR templates.
Main features:

- Automatically finds and uses your repository's PR template
- Generates PR descriptions based on git diffs between branches
- Seamless integration with Forge's PR creation workflow
- Supports any LLM provider supported by the `llm` package
- Streams LLM responses in real time
## Installation

### Using straight.el with use-package

```elisp
(use-package forge-llm
  :straight (:host gitlab :repo "rogs/forge-llm")
  :after forge
  :config
  (forge-llm-setup))
```
### Using Doom Emacs

#### Basic Setup

1. Add the following to your `packages.el`:

   ```elisp
   (package! forge-llm
     :recipe (:host gitlab :repo "rogs/forge-llm"))
   (package! llm) ; Dependency
   ```

2. Add somewhere in your `config.el`:

   ```elisp
   (require 'forge-llm)
   (forge-llm-setup)

   (require 'llm-openai) ; Or your preferred LLM provider
   (setq forge-llm-llm-provider (make-llm-openai :key "YOUR-OPENAI-KEY"))
   ```

3. Run `doom sync` to install the package.
#### Keybindings

The package automatically sets up Doom Emacs keybindings when Doom is detected:

- `SPC m g` - Generate PR description in a separate buffer
- `SPC m p` - Generate PR description at point
- `SPC m t` - Insert PR template at point
No additional configuration is needed for these keybindings to work.
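If you prefer different keys, the defaults can be overridden with Doom's `map!` macro. A sketch, assuming the bindings belong in Forge's `forge-post-mode-map` (the keymap Forge uses for PR creation buffers); the key choices here are purely illustrative:

```elisp
;; Optional: override the default SPC m bindings (illustrative keys).
(after! forge-llm
  (map! :map forge-post-mode-map
        :localleader
        :desc "PR description (buffer)"   "G" #'forge-llm-generate-pr-description
        :desc "PR description at point"   "P" #'forge-llm-generate-pr-description-at-point
        :desc "Insert PR template"        "T" #'forge-llm-insert-template-at-point))
```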
### Using MELPA (once available)

```elisp
(use-package forge-llm
  :ensure t
  :after forge
  :config
  (forge-llm-setup))
```
### Manual installation

Clone the repository:

```shell
git clone https://gitlab.com/rogs/forge-llm.git ~/.emacs.d/site-lisp/forge-llm
```

Add to your Emacs configuration:

```elisp
(add-to-list 'load-path "~/.emacs.d/site-lisp/forge-llm")
(require 'forge-llm)
(forge-llm-setup)
```
## Setting up LLM providers

`forge-llm` depends on the `llm` package for LLM integration. You'll need to set up at least one LLM provider; refer to the `llm` documentation for detailed instructions.
### Example: OpenAI provider

First, create an OpenAI API key. Then configure the `llm` OpenAI provider:

```elisp
(require 'llm-openai)
(setq forge-llm-llm-provider (make-llm-openai :key "YOUR-OPENAI-KEY"))
```
### Example: Anthropic provider

To use Claude models from Anthropic:

```elisp
(require 'llm-claude)
(setq forge-llm-llm-provider
      (make-llm-claude :key "YOUR-ANTHROPIC-KEY"
                       :chat-model "claude-3-7-sonnet-20250219"))
```
### Using auth-source for API keys (recommended)

For better security, use Emacs `auth-source` to store your API keys:

```elisp
(use-package llm
  :ensure t
  :config
  (setq llm-warn-on-nonfree nil))

(require 'llm-openai)

(use-package forge-llm
  :ensure t
  :after (forge llm)
  :custom
  (forge-llm-llm-provider
   (make-llm-openai
    :key (auth-source-pick-first-password
          :host "api.openai.com"
          :user "apikey")))
  :config
  (forge-llm-setup))
```

Content of `.authinfo` or `.authinfo.gpg`:

```
machine api.openai.com login apikey password YOUR-API-KEY-HERE
```
## Usage

After setting up `forge-llm`, the following commands are available in Forge's pull request creation buffer:

| Key binding | Command | Description |
|---|---|---|
| `C-c C-g` | `forge-llm-generate-pr-description` | Generate a PR description (output to separate buffer) |
| `C-c C-p` | `forge-llm-generate-pr-description-at-point` | Generate a PR description at the current point |
| `C-c C-t` | `forge-llm-insert-template-at-point` | Insert the PR template at the current point |
| `SPC m g` (Doom Emacs) | `forge-llm-generate-pr-description` | Generate a PR description (output to separate buffer) |
| `SPC m p` (Doom Emacs) | `forge-llm-generate-pr-description-at-point` | Generate a PR description at the current point |
| `SPC m t` (Doom Emacs) | `forge-llm-insert-template-at-point` | Insert the PR template at the current point |
Demo: Generate PR description in a new buffer
Demo: Generate PR description at point
Workflow:

1. Create a PR using Forge as normal (`forge-create-pullreq`)
2. In the PR creation buffer, position your cursor where you want to insert the PR description
3. Press `C-c C-p` to generate and insert a PR description based on your changes
4. Edit the description as needed and submit the PR

**Canceling generation:** If you need to cancel an in-progress LLM request, run:

```
M-x forge-llm-cancel-request
```
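If you cancel often, you can give the command a key of its own in the PR creation buffer. A sketch, assuming Forge's `forge-post-mode-map` is the relevant keymap; the `C-c C-k` choice is only a suggestion:

```elisp
(with-eval-after-load 'forge
  ;; Hypothetical binding; pick any free key in the PR buffer.
  (define-key forge-post-mode-map (kbd "C-c C-k") #'forge-llm-cancel-request))
```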
## Customization

You can customize various aspects of `forge-llm` through the following variables:
### PR Template Configuration

- `forge-llm-pr-template-paths` - List of possible paths for PR/MR templates, relative to the repository root:

  ```elisp
  (setq forge-llm-pr-template-paths
        '(".github/PULL_REQUEST_TEMPLATE.md"
          ".github/pull_request_template.md"
          "docs/pull_request_template.md"
          ".gitlab/merge_request_templates/default.md"))
  ```

- `forge-llm-default-pr-template` - Default PR template to use when no template is found in the repository
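A minimal fallback template could be set like this (the section headings in the string are only an illustration; use whatever structure your team prefers):

```elisp
;; Illustrative fallback template used when no repository template exists.
(setq forge-llm-default-pr-template
      "## Summary

## Changes

## Testing")
```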
### LLM Provider Configuration

- `forge-llm-llm-provider` - LLM provider to use. Can be a provider object or a function that returns a provider object:

  ```elisp
  (setq forge-llm-llm-provider (make-llm-openai :key "YOUR-API-KEY"))
  ```

- `forge-llm-temperature` - Temperature for LLM responses (`nil` for the provider default):

  ```elisp
  (setq forge-llm-temperature 0.7)
  ```

- `forge-llm-max-tokens` - Maximum number of tokens for LLM responses (`nil` for the provider default):

  ```elisp
  (setq forge-llm-max-tokens 1024)
  ```
### Prompt Configuration

- `forge-llm-pr-description-prompt` - Prompt used to generate a PR description with the LLM. The prompt is formatted with the PR template and the git diff. You can customize it to match your project's PR description style:

  ```elisp
  (setq forge-llm-pr-description-prompt
        "Generate a PR description for the following changes. PR template: %s Git diff: ``` %s ``` Please generate a PR description that follows our team's style.")
  ```
## Troubleshooting

- If you're having issues with the LLM provider, enable debug logging for `llm` by setting `llm-log` to `t`.
- Check the `*forge-llm-debug-prompt*` buffer to see the exact prompt being sent to the LLM.
- Check the `*forge-llm-output*` buffer to see the raw output from the LLM.
**Common issues:**

- **Error: "No LLM provider configured"**
  - Make sure you've set `forge-llm-llm-provider` to a valid provider object.
  - Ensure your API key is correct.
- **Error: "Failed to generate git diff"**
  - Ensure you're in a repository with valid head and base branches.
  - Check that the current directory is within a git repository.
- **PR generation is too slow**
  - Consider using a faster model (like GPT-3.5-turbo instead of GPT-4).
  - Reduce `forge-llm-max-tokens` to limit the response size.
- **PR template not found**
  - Check that your PR template is in one of the paths listed in `forge-llm-pr-template-paths`.
  - Add your custom template path if needed.
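To add a custom template location, prepend it to the search list (the path below is hypothetical; substitute your repository's actual path):

```elisp
;; Hypothetical custom location; adjust to your repository.
(add-to-list 'forge-llm-pr-template-paths "docs/PR_TEMPLATE.md")
```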
## TO-DO

- Add more examples and use cases
## Contributing
Contributions are welcome! Please feel free to submit a Merge Request.
### Development Setup

1. Clone the repository:

   ```shell
   git clone https://gitlab.com/rogs/forge-llm.git
   cd forge-llm
   ```

2. Install dependencies for development:
   - Ensure you have the `forge` and `llm` packages installed.
## Changelog

### 0.1.0 (Initial Release)

- Initial functionality for PR description generation
- Template detection for GitHub and GitLab repositories
- LLM integration via the `llm` package
- Commands for generating PR descriptions
- Customization options for templates and LLM providers
## Acknowledgments

This project was heavily inspired by magit-gptcommit. Check it out! That package pairs very well with forge-llm.

Another huge inspiration was xenodium, with their Emacs package chatgpt-shell.
## License
This project is licensed under the GNU General Public License version 3 - see the LICENSE file for details.