
Use a local LLM for coding assistance in VSCode

  1. Setup
  2. Further readings
    1. Sources

Setup

  1. Install Ollama and download the models you intend to use.
    Refer to Ollama.

    brew install 'ollama'
    ollama pull 'llama3.1:8b'
    ollama pull 'qwen2.5-coder:7b'
    ollama pull 'nomic-embed-text:latest'
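
    To check that Ollama is serving the models locally, list them and send a quick test prompt:

    # Show the models available to the local Ollama instance.
    ollama list
    # Send a one-off prompt and print the model's reply.
    ollama run 'llama3.1:8b' 'Reply with the word OK.'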
    
  2. Install the Continue VSCode extension.
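
    The extension can be installed from the UI or, assuming its marketplace identifier is Continue.continue, from the command line:

    # Install the extension via the VSCode CLI.
    code --install-extension 'Continue.continue'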

  3. Configure the extension to use local LLMs only.

    1. Open the extension's sidebar.
    2. In the top-right, select Local Config from the dropdown menu.
    3. If needed, tweak the configuration file (see the sketch after this list).
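
A minimal local-only configuration might look like the sketch below. This assumes the YAML configuration format used by recent versions of Continue and the models pulled above; field names and available roles vary between extension versions, so treat it as a starting point rather than a drop-in file.

    name: Local Config
    version: 1.0.0
    schema: v1
    models:
      # Chat and edit requests go to the general-purpose model.
      - name: Llama 3.1 8B
        provider: ollama
        model: llama3.1:8b
        roles:
          - chat
          - edit
      # Inline completions use the code-specialized model.
      - name: Qwen 2.5 Coder 7B
        provider: ollama
        model: qwen2.5-coder:7b
        roles:
          - autocomplete
      # Codebase indexing uses the embedding model.
      - name: Nomic Embed Text
        provider: ollama
        model: nomic-embed-text:latest
        roles:
          - embed

Since every entry uses the ollama provider, no request leaves the machine.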

Further readings

Sources