https://melpa.org/packages/forge-llm-badge.svg https://git.rogs.me/rogs/forge-llm/actions/workflows/ci.yml/badge.svg

forge-llm

Generate Pull Request descriptions for Forge using LLM providers through the llm package.

https://gitlab.com/uploads/-/system/project/avatar/67959042/logo.png

Overview

forge-llm is an Emacs package that integrates Large Language Models (LLMs) with Forge, the Magit interface to Git forges such as GitHub and GitLab. It helps you generate high-quality Pull Request descriptions from your git diff and your repository's PR templates.

Main features:

  • Automatically finds and uses your repository's PR template
  • Generates PR descriptions from the git diff between branches
  • Integrates seamlessly with Forge's PR creation workflow
  • Works with any LLM provider supported by the llm package
  • Streams LLM responses in real time

Dependencies

  • Magit and Forge (Forge is required for this package to work)
  • llm
  • Emacs 25.1+

Installation

Using MELPA (Recommended)

The easiest way to install forge-llm is via MELPA. Ensure you have MELPA configured in your Emacs setup (it's included by default in many distributions like Doom Emacs and Spacemacs).
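
If MELPA is not yet part of your package-archives, a minimal setup (standard Emacs package configuration, shown here only as a sketch) looks like this:

(require 'package)
;; Add MELPA to the list of package archives.
(add-to-list 'package-archives
             '("melpa" . "https://melpa.org/packages/") t)
(package-initialize)

Then install and configure forge-llm: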

(use-package forge-llm
  :ensure t
  :after forge
  :config
  (forge-llm-setup))

Using straight.el with use-package

If you use straight.el to manage your packages, it can install forge-llm directly from MELPA. Ensure MELPA is included in your straight-recipe-repositories or straight-recipe-sources.

(use-package forge-llm
  ;; straight.el will fetch this from MELPA if :ensure t is used
  ;; and straight.el is configured as the handler for use-package.
  :ensure t
  :after forge
  :config
  (forge-llm-setup))

Using Doom Emacs

Basic Setup
  1. Add the following to your packages.el (ensure MELPA is enabled in your Doom configuration, which is usually the default):
(package! forge-llm)
(package! llm)  ; Dependency
  2. Add the following somewhere in your config.el:
;; Load and setup forge-llm after forge is loaded
(after! forge
  (require 'forge-llm)
  (forge-llm-setup))

;; Configure your LLM provider (example using OpenAI)
;; Place this somewhere appropriate in your config.el
(require 'llm-openai)  ; Or your preferred LLM provider
(setq forge-llm-llm-provider (make-llm-openai :key "YOUR-OPENAI-KEY")) ; Replace with your key/provider setup
  3. Run doom sync to install the package.
Keybindings

The package automatically sets up Doom Emacs keybindings when Doom is detected:

  • SPC m g - Generate PR description in a separate buffer
  • SPC m p - Generate PR description at point
  • SPC m t - Insert PR template at point

No additional configuration is needed for these keybindings to work.
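
If you prefer different keys, you could remap the commands yourself with Doom's map! macro. The bindings below are only an illustrative sketch, not something forge-llm defines:

;; Hypothetical remapping; adjust the keys to taste.
(map! :after forge-llm
      :map forge-post-mode-map
      :localleader
      "G" #'forge-llm-generate-pr-description
      "P" #'forge-llm-generate-pr-description-at-point
      "T" #'forge-llm-insert-template-at-point)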

Manual installation

Clone the repository:

git clone https://gitlab.com/rogs/forge-llm.git ~/.emacs.d/site-lisp/forge-llm

Add to your Emacs configuration:

(add-to-list 'load-path "~/.emacs.d/site-lisp/forge-llm")
(require 'forge-llm)
(forge-llm-setup)

Setting up LLM providers

forge-llm depends on the llm package for LLM integration. You'll need to set up at least one LLM provider. Please refer to the llm documentation for detailed instructions.

Some of the providers supported by the llm package include:

  • OpenAI
  • Anthropic (Claude)
  • Google (Gemini, Vertex AI)
  • Azure OpenAI
  • GitHub Models
  • Ollama (for local models like Llama, Mistral, etc.)
  • GPT4All (for local models)
  • llama.cpp (via OpenAI compatible endpoint)
  • Deepseek
  • Generic OpenAI-compatible endpoints

See the llm documentation for the complete list and specific setup steps.

Example: OpenAI provider

First, create an OpenAI API key. Then configure the llm OpenAI provider:

(require 'llm-openai)
(setq forge-llm-llm-provider (make-llm-openai :key "YOUR-OPENAI-KEY"))

Example: Anthropic provider

To use Claude models from Anthropic:

(require 'llm-claude)
(setq forge-llm-llm-provider (make-llm-claude :key "YOUR-ANTHROPIC-KEY" :chat-model "claude-3-7-sonnet-20250219"))

Using auth-source for API keys (recommended)

For better security, use Emacs auth-source to store your API keys:

(use-package llm
  :ensure t
  :config
  (setq llm-warn-on-nonfree nil))

(require 'llm-openai)

(use-package forge-llm
  :ensure t
  :after (forge llm)
  :custom
  (forge-llm-llm-provider
   (make-llm-openai
    :key (auth-source-pick-first-password
           :host "api.openai.com"
           :user "apikey")))
  :config
  (forge-llm-setup))

Content of .authinfo or .authinfo.gpg:

machine api.openai.com login apikey password YOUR-API-KEY-HERE

Usage

After setting up forge-llm, the following commands will be available specifically within Forge's pull request creation buffer (which runs in forge-post-mode):

| Key binding          | Command                                     | Description                                            |
|----------------------|---------------------------------------------|--------------------------------------------------------|
| C-c C-l g            | forge-llm-generate-pr-description           | Generate a PR description (output to separate buffer)  |
| C-c C-l p            | forge-llm-generate-pr-description-at-point  | Generate a PR description at the current point         |
| C-c C-l t            | forge-llm-insert-template-at-point          | Insert the PR template at the current point            |
| SPC m g (Doom Emacs) | forge-llm-generate-pr-description           | Generate a PR description (output to separate buffer)  |
| SPC m p (Doom Emacs) | forge-llm-generate-pr-description-at-point  | Generate a PR description at the current point         |
| SPC m t (Doom Emacs) | forge-llm-insert-template-at-point          | Insert the PR template at the current point            |

Demo: Generate PR description in a new buffer

Pressing C-c C-l g will generate a PR description and display it in a separate buffer:

https://gitlab.com/-/project/67959042/uploads/3eed67e0b188d040906d30b6b6cc3ec6/generate-pr-desc.gif

Click the image to view in full screen

Demo: Generate PR description at point

Pressing C-c C-l p will generate a PR description and insert it directly at the cursor position:

https://gitlab.com/-/project/67959042/uploads/9e5d4f8b4eab87989eafca9f58baa467/generate-pr-at-point.gif

Click the image to view in full screen

Workflow:

  1. Create a PR using Forge as normal (forge-create-pullreq)
  2. In the PR creation buffer, position your cursor where you want to insert the PR description
  3. Press C-c C-l p to generate and insert a PR description based on your changes
  4. Edit the description as needed and submit the PR

Canceling Generation:

If you need to cancel an in-progress LLM request:

  • M-x forge-llm-cancel-request
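
If you cancel often, you could give the command a key of its own. forge-llm does not define this binding; it is only a suggested sketch:

;; Hypothetical binding on the same C-c C-l prefix used by forge-llm.
(with-eval-after-load 'forge-llm
  (define-key forge-post-mode-map (kbd "C-c C-l k") #'forge-llm-cancel-request))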

Customization

You can customize various aspects of forge-llm through the following variables:

PR Template Configuration

  • forge-llm-pr-template-paths - List of possible paths for PR/MR templates relative to repo root

    (setq forge-llm-pr-template-paths
          '(".github/PULL_REQUEST_TEMPLATE.md"
            ".github/pull_request_template.md"
            "docs/pull_request_template.md"
            ".gitlab/merge_request_templates/default.md"))
  • forge-llm-default-pr-template - Default PR template to use when no template is found in the repository
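
    For example, assuming the variable takes a plain string (the template text below is only an illustration):

    (setq forge-llm-default-pr-template
          "## Summary\n\n## Changes\n\n## How to test\n")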

LLM Provider Configuration

  • forge-llm-llm-provider - LLM provider to use. Can be a provider object or a function that returns a provider object (see the llm package documentation for how to create provider objects; a sketch of the function form appears after this list).

    (setq forge-llm-llm-provider (make-llm-openai :key "YOUR-API-KEY"))
  • forge-llm-temperature - Temperature for LLM responses (nil for provider default)

    (setq forge-llm-temperature 0.7)
  • forge-llm-max-tokens - Maximum number of tokens for LLM responses (nil for provider default)

    (setq forge-llm-max-tokens 1024)
  • forge-llm-max-diff-size - Maximum size in characters for git diffs sent to the LLM (nil for no truncation)

    ;; Default is 50000, set to nil to disable truncation
    (setq forge-llm-max-diff-size 100000)  ; Increase to 100K characters
    ;; Or disable truncation completely
    (setq forge-llm-max-diff-size nil)
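
As mentioned above, forge-llm-llm-provider may also be a function that returns a provider object. A sketch of that form, assuming a zero-argument function and the same auth-source entry used earlier, might look like this:

    ;; Build the provider lazily so the API key is only read when needed.
    ;; The auth-source host/user below are assumptions; adapt them to your setup.
    (setq forge-llm-llm-provider
          (lambda ()
            (make-llm-openai
             :key (auth-source-pick-first-password
                   :host "api.openai.com"
                   :user "apikey"))))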

Prompt Configuration

  • forge-llm-pr-description-prompt - Prompt used to generate a PR description with the LLM. This prompt is formatted with the PR template and git diff.

    You can customize this prompt to match your project's PR description style:

    (setq forge-llm-pr-description-prompt
          "Generate a PR description for the following changes.
    PR template:
    %s
    
    Git diff:
    ```
    %s
    ```
    
    Please generate a PR description that follows our team's style.")

Troubleshooting

  • If you're having issues with the LLM provider, you can enable debug logging for llm by setting llm-log to t (see the snippet after this list).
  • Check the *forge-llm-debug-prompt* buffer to see the exact prompt being sent to the LLM.
  • Check the *forge-llm-output* buffer to see the raw output from the LLM.
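
To enable that llm debug log:

(setq llm-log t)  ; log llm requests and responses for debugging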

Common Issues:

  • Error: "No LLM provider configured"

    • Make sure you've set forge-llm-llm-provider to a valid provider object.
    • Ensure your API key is correct.
  • Error: "Failed to generate git diff"

    • Ensure you're in a repository with valid head and base branches.
    • Check if the current directory is within a git repository.
  • PR Generation is too slow

    • Consider using a faster model (like GPT-3.5-turbo instead of GPT-4).
    • Reduce forge-llm-max-tokens to limit the response size.
  • PR template not found

    • Check if your PR template is in one of the paths listed in forge-llm-pr-template-paths.
    • Add your custom template path if needed.

TO-DO:

  • Add more examples and use cases

Contributing

Contributions are welcome! Please feel free to submit a Merge Request.

Development Setup

  1. Clone the repository:

    git clone https://gitlab.com/rogs/forge-llm.git
    cd forge-llm
  2. Install dependencies for development:

    • Ensure you have the forge and llm packages installed

Acknowledgments

This project was heavily inspired by magit-gptcommit. Check it out; it pairs very well with forge-llm.

Another huge inspiration was xenodium, with their Emacs package chatgpt-shell.

License

This project is licensed under the GNU General Public License version 3 - see the LICENSE file for details.
