collama

Collama is a VS Code extension that provides AI-powered coding support using local Ollama models.

This project is still very experimental. Currently Collama is capable of:

  • auto-complete (inline, multiline, multiblock)
  • context menu with predefined code edits

Installation

Install the extension from the marketplace or build the vsix yourself. You also need an Ollama instance reachable in your local network.
See the Ollama documentation for installation instructions or use the official Docker image.
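For example, a local Ollama instance can be started with the official Docker image (command per the Ollama Docker documentation; adjust the port mapping and volume to your setup):

```shell
# Run Ollama in Docker, exposing the default API port 11434
# and persisting downloaded models in a named volume.
docker run -d --name ollama \
  -p 11434:11434 \
  -v ollama:/root/.ollama \
  ollama/ollama
```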

Model

The extension is primarily tested with the qwen-coder model series. Currently one model is used for autocomplete and one for edits.

  • Default model autocomplete: qwen2.5-coder:3b
  • Default model instructions: qwen2.5-coder:3b-instruct
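The default models can be pulled onto your Ollama instance like this (model names as listed above):

```shell
# Download the default autocomplete and instruction models.
ollama pull qwen2.5-coder:3b
ollama pull qwen2.5-coder:3b-instruct
```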

autocomplete

Only the qwen, starcoder, and codellama model families are supported. However, codellama has no file-separator token and therefore currently cannot receive the context of all opened files.


edits

The extension supports experimental code edits via the context menu. For this you should configure a base or instruct model with good chat capabilities.

model table

Tests are primarily done with q4 quantizations. Feel free to do further testing and contribute your findings!

Currently ChatML is not implemented!

FIM support

| Model         | Tested       | Comment     | Contribution needed |
| ------------- | ------------ | ----------- | ------------------- |
| codeqwen      | none         | should work | X                   |
| qwen2.5-coder | 1.5b, 3b, 7b | works well  |                     |
| qwen3-coder   | 30b          | works well  |                     |
| starcoder     | 1b, 3b       | should work | X                   |
| starcoder2    | 3b           | works well  |                     |
| codellama     | 7b, 13b      | works ok    | X                   |

Keybinding

Completion is triggered using the keybinding for editor.action.inlineSuggest.trigger.

Set it to Alt+S or Ctrl+NumPad 1, for example.
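In your keybindings.json (File > Preferences > Keyboard Shortcuts, then the "Open Keyboard Shortcuts (JSON)" button) this could look like the following; the key combination itself is your choice:

```json
[
  {
    "key": "alt+s",
    "command": "editor.action.inlineSuggest.trigger",
    "when": "editorTextFocus"
  }
]
```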

Auto-Trigger

Auto-trigger fires after a minimum delay of 1.5 seconds (set it higher if needed).

StatusBar

Click the status bar icon to switch modes quickly.

Notes

  • You can point the endpoint to a server in your local network (bearer tokens are currently not supported).
  • The OLLAMA_ORIGINS=* environment variable may be needed for Ollama to accept requests.
  • Check the extension settings for further configuration options.
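Setting the origins variable depends on how Ollama is started; two common variants (per the Ollama documentation on environment variables):

```shell
# When starting Ollama manually:
OLLAMA_ORIGINS=* ollama serve

# When running the Docker image:
docker run -d -p 11434:11434 -e OLLAMA_ORIGINS=* ollama/ollama
```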

Ask

Need help? Open an issue.

Contribution

Please test and contribute as much as you like!
