# forge-llm
Generate Pull Request descriptions for Forge using LLM providers through the `llm` package.
## Overview
`forge-llm` is an Emacs package that integrates Large Language Models (LLMs) with Forge, a Magit interface to GitHub and GitLab forges. It helps you generate high-quality Pull Request descriptions based on your git diff and your repository's PR template.
Main features:

- Automatically finds and uses your repository's PR template
- Generates PR descriptions based on git diffs between branches
- Integrates seamlessly with Forge's PR creation workflow
- Supports any LLM provider supported by the `llm` package
## Installation
### Using straight.el with use-package
```elisp
(use-package forge-llm
  :straight (:host gitlab :repo "rogs/forge-llm")
  :after forge
  :config
  (forge-llm-setup))
```
### Using MELPA (once available)
```elisp
(use-package forge-llm
  :ensure t
  :after forge
  :config
  (forge-llm-setup))
```
### Manual installation
Clone the repository:

```shell
git clone https://gitlab.com/rogs/forge-llm.git ~/.emacs.d/site-lisp/forge-llm
```
Add to your Emacs configuration:

```elisp
(add-to-list 'load-path "~/.emacs.d/site-lisp/forge-llm")
(require 'forge-llm)
(forge-llm-setup)
```
## Setting up LLM providers
`forge-llm` depends on the `llm` package for LLM integration. You'll need to set up at least one LLM provider; refer to the `llm` documentation for detailed instructions.
### Example: OpenAI provider
First, create an OpenAI API key. Then configure the `llm` OpenAI provider:

```elisp
(require 'llm-openai)
(setq forge-llm-llm-provider (make-llm-openai :key "YOUR-OPENAI-KEY"))
```
### Using auth-source for API keys (recommended)
For better security, use Emacs `auth-source` to store your API keys:
```elisp
(use-package llm
  :ensure t
  :config
  (setq llm-warn-on-nonfree nil))

(use-package forge-llm
  :straight (:host gitlab :repo "rogs/forge-llm")
  :after (forge llm)
  :custom
  (forge-llm-llm-provider
   (make-llm-openai :key
                    (auth-info-password
                     (car (auth-source-search
                           :host "api.openai.com"
                           :user "apikey")))))
  :config
  (forge-llm-setup))
```
Content of `.authinfo` or `.authinfo.gpg`:

```
machine api.openai.com login apikey password YOUR-API-KEY-HERE
```
## Usage
After setting up `forge-llm`, the following commands are available in Forge's pull request creation buffer:
| Key binding | Command | Description |
|---|---|---|
| `C-c C-g` | `forge-llm-generate-pr-description` | Generate a PR description (output to a separate buffer) |
| `C-c C-p` | `forge-llm-generate-pr-description-at-point` | Generate a PR description at the current point |
| `C-c C-t` | `forge-llm-insert-template-at-point` | Insert the PR template at the current point |
Workflow:

1. Create a PR using Forge as usual (`forge-create-pullreq`)
2. In the PR creation buffer, position your cursor where you want the PR description inserted
3. Press `C-c C-p` to generate and insert a PR description based on your changes
4. Edit the description as needed and submit the PR
## Customization
You can customize various aspects of `forge-llm` through the following variables.
### PR Template Configuration
- `forge-llm-pr-template-paths`: List of possible paths for PR/MR templates, relative to the repository root:

  ```elisp
  (setq forge-llm-pr-template-paths
        '(".github/PULL_REQUEST_TEMPLATE.md"
          ".github/pull_request_template.md"
          "docs/pull_request_template.md"
          ".gitlab/merge_request_templates/default.md"))
  ```
- `forge-llm-default-pr-template`: Default PR template to use when no template is found in the repository
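Since not every repository ships a template, it can be useful to set a fallback. A minimal sketch; the template text below is illustrative, not the package's built-in default:

```elisp
;; Example fallback template (illustrative content, not the package default).
(setq forge-llm-default-pr-template
      "## Summary\n\n## Changes\n\n## How to test\n")
```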
### LLM Provider Configuration
- `forge-llm-llm-provider`: LLM provider to use. Can be a provider object or a function that returns a provider object:

  ```elisp
  (setq forge-llm-llm-provider (make-llm-openai :key "YOUR-API-KEY"))
  ```
- `forge-llm-temperature`: Temperature for LLM responses (`nil` for the provider default):

  ```elisp
  (setq forge-llm-temperature 0.7)
  ```
- `forge-llm-max-tokens`: Maximum number of tokens for LLM responses (`nil` for the provider default):

  ```elisp
  (setq forge-llm-max-tokens 1024)
  ```
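Because `forge-llm-llm-provider` also accepts a function, the provider can be constructed lazily, so the API key is only read (e.g. from auth-source) when a description is actually generated. A minimal sketch, reusing the auth-source entry shown earlier:

```elisp
;; Sketch: build the provider on demand instead of at startup, so the
;; key is fetched from auth-source only when first needed.
(require 'llm-openai)
(setq forge-llm-llm-provider
      (lambda ()
        (make-llm-openai
         :key (auth-info-password
               (car (auth-source-search :host "api.openai.com"
                                        :user "apikey"))))))
```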
### Prompt Configuration
- `forge-llm-pr-description-prompt`: Prompt used to generate a PR description with the LLM
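The prompt is an ordinary customizable variable. The string below is purely illustrative; before overriding it, inspect the default value (`C-h v forge-llm-pr-description-prompt`), since the package may interpolate the diff and template into the prompt via placeholders that a custom prompt should preserve.

```elisp
;; Illustrative only: check the variable's default value for any
;; required placeholders before replacing it.
(setq forge-llm-pr-description-prompt
      (concat "Write a concise pull request description. "
              "Summarize the intent of the change, then list notable details."))
```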
## TO-DO

- Add an example GIF or video to the description
- Add Doom Emacs keybindings
- Add a logo for the project
- Convert the script to a literate config, for readability
- Maybe a small one-page website?
- Maybe more?
## Troubleshooting
- If you're having issues with the LLM provider, enable debug logging for `llm` by setting `llm-log` to `t`.
- Check the `*forge-llm-debug-prompt*` buffer to see the exact prompt being sent to the LLM.
- Check the `*forge-llm-output*` buffer to see the raw output from the LLM.
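The debugging steps above can be collected in one place. `llm-log` belongs to the `llm` package; the two buffers are created by `forge-llm` and only exist after a generation has run:

```elisp
;; Enable the llm package's debug logging.
(setq llm-log t)

;; After generating a description, inspect the prompt and raw output.
(display-buffer "*forge-llm-debug-prompt*")
(display-buffer "*forge-llm-output*")
```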
## Contributing
Contributions are welcome! Please feel free to submit a Merge Request.
## License
This project is licensed under the GNU General Public License version 3 - see the LICENSE file for details.