OGPT.nvim


Features

Credit goes to jackMort/ChatGPT.nvim for these awesome features:

  • Interactive Q&A: Engage in interactive question-and-answer sessions with the powerful gpt model (OGPT) using an intuitive interface.
  • Persona-based Conversations: Explore various perspectives and have conversations with different personas by selecting prompts from Awesome ChatGPT Prompts.
  • Code Editing Assistance: Enhance your coding experience with an interactive editing window powered by the gpt model, offering instructions tailored for coding tasks.
  • Code Completion: Enjoy the convenience of code completion similar to GitHub Copilot, leveraging the capabilities of the gpt model to suggest code snippets and completions based on context and programming patterns.
  • Customizable Actions: Execute a range of actions utilizing the gpt model, such as grammar correction, translation, keyword generation, docstring creation, test addition, code optimization, summarization, bug fixing, code explanation, Roxygen editing, and code readability analysis. Additionally, you can define your own custom actions using a JSON file.

For a comprehensive understanding of the extension's functionality, you can watch the plugin showcase video.

Ollama-Specific Features:

  • Cleaned-up documentation
  • The original OGPT.nvim functionality, backed by Ollama
  • Custom settings per session
  • Additional windows for Template and System prompts
  • Query and select models from your Ollama server
  • Support for model creation on the fly
Change the model by opening the parameter panel (bound to <C-o> by default), pressing Tab until you reach the model field, then pressing Enter on it. It lists all the models available on your Ollama server.

In the same panel you can add and delete parameters with the "a" and "d" keys respectively.

OGPT Enhancements over the Original ChatGPT.nvim

  • Additional actions can be added to the config options
  • Running OGPTRun shows a Telescope picker

Installation

OGPT is a Neovim plugin that lets you use the Ollama API to generate natural language responses from a local LLM directly within the editor, in response to your inquiries.


  • Make sure you have curl installed.
  • Have a local instance of Ollama running.

You can point the plugin at a custom Ollama API host with the api_host_cmd configuration option or the $OLLAMA_API_HOST environment variable. This is useful if you run Ollama remotely.
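For example (a minimal sketch: it assumes api_host_cmd is a shell command whose output is used as the API host, and the address shown is only a placeholder):

-- Option 1: environment variable, e.g. exported from your shell
-- export OLLAMA_API_HOST="http://192.168.1.50:11434"

-- Option 2: configuration option passed to setup()
require("ogpt").setup({
  api_host_cmd = "echo -n http://192.168.1.50:11434", -- placeholder remote host
})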

-- Packer
use({
  "huynle/ogpt.nvim",
  config = function()
    require("ogpt").setup()
  end,
  requires = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim"
  }
})

-- Lazy
{
  "huynle/ogpt.nvim",
  event = "VeryLazy",
  config = function()
    require("ogpt").setup()
  end,
  dependencies = {
    "MunifTanjim/nui.nvim",
    "nvim-lua/plenary.nvim",
    "nvim-telescope/telescope.nvim"
  }
}

Configuration

OGPT.nvim comes with the following defaults; you can override them by passing a config table to setup:

https://github.com/huynle/ogpt.nvim/blob/81b5e189a044cb9c035b0e9b5e1ad0f2e03c440a/lua/ogpt/config.lua#L10-L157
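As an illustration, a minimal override might look like the sketch below; actions_paths is the option referenced in the OGPTRun section further down, and the file path is only an example:

require("ogpt").setup({
  -- assumed: a list of JSON files containing custom actions (see OGPTRun below)
  actions_paths = { vim.fn.expand("~/.config/nvim/ogpt-actions.json") },
})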

Usage

The plugin exposes the following commands:

OGPT

The OGPT command opens an interactive chat window using the neural-chat:7b model (also referred to as OGPT).
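If you use it frequently, you can bind the command to a key; the mapping below is only an illustrative choice:

-- open the interactive OGPT chat window with <leader>og
vim.keymap.set("n", "<leader>og", "<cmd>OGPT<CR>", { desc = "Open OGPT chat" })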

OGPTActAs

The OGPTActAs command opens a prompt selection from Awesome ChatGPT Prompts to be used with the neural-chat:7b model.


OGPTRun edit_with_instructions

The OGPTRun edit_with_instructions command opens an interactive window to edit the selected text or the whole buffer using the neural-chat:7b model; you can change the model in your config options.

OGPTRun edit_code_with_instructions

This command opens an interactive window to edit the selected text or the entire buffer using the neural-chat:7b model. You can change the model in your config options. The code content is extracted from the Ollama response; if the response contains no code block, it falls back to the full response.

You can map it using the Lua API, e.g. using which-key.nvim:

local ogpt = require("ogpt")
local wk = require("which-key")
wk.register({
    p = {
        name = "OGPT",
        e = {
            function()
                ogpt.edit_with_instructions()
            end,
            "Edit with instructions",
        },
    },
}, {
    prefix = "<leader>",
    mode = "v",
})


OGPTRun

The OGPTRun [action] command runs a specific action -- see the actions.json file for a detailed list. Available actions are:

  1. grammar_correction
  2. translate
  3. keywords
  4. docstring
  5. add_tests
  6. optimize_code
  7. summarize
  8. fix_bugs
  9. explain_code
  10. roxygen_edit
  11. code_readability_analysis -- see demo

All of the above actions use the neural-chat:7b model.

It is possible to define custom actions with a JSON file. See actions.json for an example. The path of custom actions can be set in the config (see actions_paths field in the config example above).

An example custom action may look like this (# marks comments; they are not valid JSON and must be removed in a real file):

{
  "action_name": {
    "type": "chat", # or "completion" or "edit"
    "opts": {
      "template": "A template using possible variable: {{filetype}} (neovim filetype), {{input}} (the selected text) an {{argument}} (provided on the command line)",
      "strategy": "replace", # or "display" or "append" or "edit"
      "params": { # parameters according to the official Ollama API
        "model": "neural-chat:7b", # or any other model supported by `"type"` in the Ollama API, use the playground for reference
        "stop": [
          "```" # a string used to stop the model
        ]
      }
    },
    "args": {
      "argument": {
          "type": "strig",
          "optional": "true",
          "default": "some value"
      }
    }
  }
}
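Once such a file is loaded via actions_paths, the custom action can be run like any built-in one, for example through a mapping (the key below is an arbitrary choice; visual mode is used because the template reads {{input}} from the selection):

-- run the custom action defined above on the current visual selection
vim.keymap.set("v", "<leader>oa", "<cmd>OGPTRun action_name<CR>", { desc = "Run custom OGPT action" })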

The edit strategy shows the output side by side with the input and keeps it available for further editing requests. For now, the edit strategy is implemented for the chat type only.

The display strategy shows the output in a float window.

append and replace modify the text directly in the buffer.

Interactive popup

When using OGPT, the following keybindings are available:

  • <C-Enter> [Both] to submit.
  • <C-y> [Both] to copy/yank last answer.
  • <C-o> [Both] Toggle settings window.
  • <Tab> [Both] Cycle over windows.
  • <C-f> [Chat] Cycle over modes (center, stick to right).
  • <C-c> [Both] to close chat window.
  • <C-u> [Chat] scroll up chat window.
  • <C-d> [Chat] scroll down chat window.
  • <C-k> [Chat] to copy/yank code from last answer.
  • <C-n> [Chat] Start new session.
  • <C-d> [Chat] draft message (create message without submitting it to server)
  • <C-r> [Chat] switch role (switch between user and assistant role to define a workflow)
  • <C-s> [Both] Toggle system message window.
  • <C-i> [Edit Window] use response as input.
  • <C-d> [Edit Window] view the diff between left and right panes and use diff-mode commands

When the settings window is open (toggled with <C-o>), settings can be modified by pressing Enter on the related entry. Settings are saved across sessions.

Whichkey plugin mappings

Add these to your which-key plugin mappings for convenient bindings:

c = {
  name = "OGPT",
  e = { "<cmd>OGPTRun edit_with_instructions<CR>", "Edit with instruction", mode = { "n", "v" } },
  c = { "<cmd>OGPTRun edit_code_with_instructions<CR>", "Edit code with instruction", mode = { "n", "v" } },
  g = { "<cmd>OGPTRun grammar_correction<CR>", "Grammar Correction", mode = { "n", "v" } },
  t = { "<cmd>OGPTRun translate<CR>", "Translate", mode = { "n", "v" } },
  k = { "<cmd>OGPTRun keywords<CR>", "Keywords", mode = { "n", "v" } },
  d = { "<cmd>OGPTRun docstring<CR>", "Docstring", mode = { "n", "v" } },
  a = { "<cmd>OGPTRun add_tests<CR>", "Add Tests", mode = { "n", "v" } },
  o = { "<cmd>OGPTRun optimize_code<CR>", "Optimize Code", mode = { "n", "v" } },
  s = { "<cmd>OGPTRun summarize<CR>", "Summarize", mode = { "n", "v" } },
  f = { "<cmd>OGPTRun fix_bugs<CR>", "Fix Bugs", mode = { "n", "v" } },
  x = { "<cmd>OGPTRun explain_code<CR>", "Explain Code", mode = { "n", "v" } },
  r = { "<cmd>OGPTRun roxygen_edit<CR>", "Roxygen Edit", mode = { "n", "v" } },
  l = { "<cmd>OGPTRun code_readability_analysis<CR>", "Code Readability Analysis", mode = { "n", "v" } },
},
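These entries belong inside a wk.register call under your chosen prefix, as in the earlier Lua example; the sketch below assumes which-key's register API and an arbitrary <leader> prefix:

local wk = require("which-key")
wk.register({
  c = {
    name = "OGPT",
    -- ...the entries shown above...
  },
}, { prefix = "<leader>" })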

Credits

First of all, thank you to the author of jackMort/ChatGPT.nvim for creating a seamless framework to interact with OGPT in neovim!

THIS IS A FORK of the original ChatGPT.nvim that supports Ollama (https://ollama.ai/), which allows you to run LLMs entirely locally.

Ollama is still in its infancy, so there are numerous pull requests open to expand its capabilities. One of these is to conform to the OpenAI API (ollama/ollama#991). Because of this, this repo is a hacked-together solution for me to test out the local LLMs that I have running.

THIS PLUGIN MAY NOT LAST VERY LONG, depending on the state of Ollama.

Buy Jack (the original creator) a coffee: "Buy Jack A Coffee"

Buy me a coffee: "Buy Me A Coffee"
