Show HN: VimLM – A Local, Offline Coding Assistant for Vim

by JosefAlbers on 2/14/2025, 11:34 PM | 18 comments

VimLM is a local, offline coding assistant for Vim. It’s like Copilot but runs entirely on your machine—no APIs, no tracking, no cloud.

- Deep Context: Understands your codebase (current file, selections, references).
- Conversational: Iterate with follow-ups like "Add error handling".
- Vim-Native: Keybindings like `Ctrl-l` for prompts, `Ctrl-p` to replace code.
- Inline Commands: `!include` files, `!deploy` code, `!continue` long responses.

Perfect for privacy-conscious devs or air-gapped environments.

Try it:

```
pip install vimlm
vimlm
```

[GitHub](https://github.com/JosefAlbers/VimLM)

by toprerules on 2/15/2025, 1:56 AM

Awesome. AI isn't making Vim less relevant; it's more relevant than ever. When every editor can have maximum magic with the same model and LSP, why not use the tool that also lets you review AI-generated diffs and navigate at lightning speed? Vim is a tool that can actually keep up with how fast AI can accelerate the dev cycle.

Also love to see these local solutions. Coding shouldn't just be for the rich who can afford to pay for cloud solutions. We need open, local models and plugins.

by elliotec on 2/15/2025, 4:18 AM

Why does it need an Apple M-series chip? Any hope of it running on an Intel chip under Linux?

by ZYbCRq22HbJ2y7 on 2/15/2025, 2:37 AM

What is a good method for sandboxing models? I would like to trust these projects, but downloading hard-to-analyze arbitrary code and running it seems problematic.

by thor_molecules on 2/15/2025, 2:44 AM

Consider exposing commands that users can assign to their own preferred keybindings, instead of choosing for them.
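For example (a hypothetical sketch — VimLM may not expose commands under these names), if the plugin provided Ex commands instead of hard-coded `Ctrl-l`/`Ctrl-p` mappings, users could bind them however they like in their vimrc:

```vim
" Hypothetical command names for illustration only;
" check the plugin's docs for what it actually exposes.
nnoremap <leader>lp :VimLMPrompt<CR>
xnoremap <leader>lr :VimLMReplace<CR>
```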

by dbacar on 2/15/2025, 8:53 PM

A good update for an editor that can't handle indenting out of the box!