Is anyone interested in writing a GitHub Copilot plugin?

I’m using ChatGPT from Kakoune like this. I wrote this little program:

package main

import (
	"bufio"
	"context"
	"fmt"
	"io"
	"log"
	"os"
	"strconv"

	openai "github.com/sashabaranov/go-openai"
)

func main() {
	client := openai.NewClient(os.Getenv("OPENAI_API_KEY"))

	rawMessageCap := os.Getenv("MESSAGE_CAP")

	if rawMessageCap == "" {
		rawMessageCap = "50"
	}

	messageCap, err := strconv.Atoi(rawMessageCap)
	if err != nil {
		log.Fatal("Failed to parse message cap:", err)
	}
	req := openai.ChatCompletionRequest{
		Stream: true,
		Model:  openai.GPT3Dot5Turbo,
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleSystem,
				Content: "you are a helpful chatbot",
			},
		},
	}
	if len(os.Args) < 3 {
		log.Fatal("usage: program <input-fifo> <output-fifo>")
	}
	infifo := os.Args[1]
	outfifo := os.Args[2]

	inputFile, err := os.OpenFile(infifo, os.O_RDWR, os.ModeNamedPipe)
	if err != nil {
		log.Fatal("Failed to open input FIFO:", err)
	}
	defer inputFile.Close()

	outputFile, err := os.OpenFile(outfifo, os.O_RDWR, os.ModeNamedPipe)
	if err != nil {
		log.Fatal("Failed to open output FIFO:", err)
	}
	defer outputFile.Close()

	s := bufio.NewScanner(inputFile)
	for s.Scan() {
		req.Messages = append(req.Messages, openai.ChatCompletionMessage{
			Role:    openai.ChatMessageRoleUser,
			Content: s.Text(),
		})
		stream, err := client.CreateChatCompletionStream(context.Background(), req)
		if err != nil {
			fmt.Printf("ChatCompletion error: %v\n", err)
			continue
		}

		msg := openai.ChatCompletionMessage{Role: openai.ChatMessageRoleAssistant}

	BUFFERING:
		for {
			resp, err := stream.Recv()
			if err != nil {
				if err == io.EOF {
					outputFile.WriteString("\n")
				} else {
					fmt.Printf("ChatCompletion error: %v\n", err)
				}
				// Stop reading on any error; resp is not valid here.
				break BUFFERING
			}
			outputFile.WriteString(resp.Choices[0].Delta.Content)
			msg.Content += resp.Choices[0].Delta.Content
		}
		stream.Close()

		req.Messages = append(req.Messages, msg)
		ran := len(req.Messages) - messageCap
		if ran < 0 {
			ran = 0
		}

		req.Messages = req.Messages[ran:]
	}
}

and added this to my kakrc:

define-command opengpt %{
  eval -try-client tools %{
    edit -fifo ~/chatout chat
  }
}

define-command gpt -override -params 0.. %{
  evaluate-commands %sh{
    if [ $(($(printf %s "${kak_selection}" | wc -m))) -gt 1 ]; then
      echo "$@ " "$(printf '%s' "${kak_selection}" | tr '\n' ' ')" > ~/chatin
    else
      echo "$@" > ~/chatin
    fi
  }
  eval -try-client tools %{
    buffer chat
  }
}

map global user i ": gpt "

I run the program like this:

OPENAI_API_KEY=XXX <path_to_program> ~/chatin ~/chatout

chatin and chatout are FIFOs that live in my home dir, and that’s it. It’s a little hardcoded, but this allows me to have actual conversations and to easily pass the content of my selection as additional input.
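The two FIFOs have to exist before the program is started; they can be created once with mkfifo, using the same paths the kakrc above hardcodes:

```shell
# Create the named pipes the chat program reads from and writes to,
# skipping creation if they already exist.
[ -p ~/chatin ] || mkfifo ~/chatin
[ -p ~/chatout ] || mkfifo ~/chatout
```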


I will make it a plugin if someone is interested; I’ll post it here.
EDIT: here it is: GitHub - eko234/geppeto: Makes your editor a "real boy", use ChatGPT through FIFOs with Kakoune (or without it if you need it...). I will open a thread for it too.

Heya! Curious if anyone has managed to get this working the same way as Neovim with auto-suggest? I am new to Kakoune, and while it pretty much has everything I need out of the box, this is one thing that’s hard for me to live without.

Heya! Sorry for bumping this twice. I had a crack at this today. I used this:

and then wrote the following commands (thanks @ewintr).


# --- CODING ASSISTANT
declare-user-mode assistant
map global assistant -docstring "Replace selection with code assistant's answer" r '<a-|>tee /tmp/kak-tmp-code.txt; echo "You are a code generator.\nWriting comments is forbidden.\nWriting test code is forbidden.\nWriting English explanations is forbidden.\nDont include general translations.\nContinue this $kak_bufname code:\n" > /tmp/kak-gpt-prompt.txt; cat /tmp/kak-tmp-code.txt >> /tmp/kak-gpt-prompt.txt<ret>| cat /tmp/kak-gpt-prompt.txt | chatgpt ""<ret>'

Now I can highlight some code and it will generate the continuation for it. For example, writing this in test.ml:

let binary_sort = 

and then selecting the code and running the replace command yielded me:

let binary_sort = (lst: int list) -> 
  let rec merge left right acc = 
    match (left, right) with
    | [], [] -> List.rev acc
    | [], r :: rt -> merge [] rt (r :: acc)
    | l :: lt, [] -> merge lt [] (l :: acc)
    | l :: lt, r :: rt -> 
        if l <= r then merge lt right (l :: acc)
        else merge left rt (r :: acc)
  in 
  let rec split lst = 
    match lst with
    | [] -> [], []
    | [x] -> [x], []
    | x :: y :: xs -> 
        let left, right = split xs
        x :: left, y :: right
  in 
  let rec sort lst = 
    match lst with
    | [] -> []
    | [x] -> [x]
    | _ -> 
        let left, right = split lst
        let sorted_left = sort left
        let sorted_right = sort right
        merge sorted_left sorted_right []
  in 
  sort lst

Pretty cool!

It isn’t perfect, though: sometimes it will include Markdown code fences, sometimes it will replace the original code, and sometimes it will include the original code.

I’d love to know if there is a way to “append” a suggestion, because then I could try to prompt ChatGPT to never include the original code.
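For what it’s worth, Kakoune does have an append primitive here: `!` inserts the output of a shell command before the selection, and `<a-!>` appends it after, without piping the selection through the command. An untested sketch of an append variant of the mapping above, reusing the same temp files and `chatgpt` CLI:

```
map global assistant -docstring "Append code assistant's answer after selection" a '<a-|>tee /tmp/kak-tmp-code.txt; echo "Continue this $kak_bufname code without repeating it:\n" > /tmp/kak-gpt-prompt.txt; cat /tmp/kak-tmp-code.txt >> /tmp/kak-gpt-prompt.txt<ret><a-!>cat /tmp/kak-gpt-prompt.txt | chatgpt ""<ret>'
```

Since `<a-!>` leaves the selection untouched, the prompt no longer has to rely on the model never repeating the original code.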

I wrote a CLI for Ollama, but I haven’t created Kakoune bindings for it yet.


If anyone is curious about the process one would have to go through to create a GitHub Copilot plugin right now, the Zed team published a blog post about their experience. This part stands out as helpful first steps:

First, we had to actually get access to Copilot. As of this writing, there is no official GitHub API to interact with Copilot. However, thanks to the open-source Neovim plugin, we have access to an undocumented, minified LSP server that handles interacting with GitHub for us. Zed already has built-in support for LSP servers, so getting access to Copilot was as simple as downloading the Copilot LSP from the Neovim plugin repository. Special thanks to TerminalFi, a Zed community member, for their LSP-copilot Sublime Text plugin, which provided a specification for the custom messages available.