author     Paul Buetow <paul@buetow.org>  2025-08-04 17:02:16 +0300
committer  Paul Buetow <paul@buetow.org>  2025-08-04 17:02:16 +0300
commit     02a12574ec096e086ca9f0dabd3234aaa49dd040 (patch)
tree       f45848f0aeb9eb609da7af911c13b9c0000ae3e4 /gemfeed
parent     1e0bddcee3a44deecec07c1610bd72ea8b491ea7 (diff)
Update content for md
Diffstat (limited to 'gemfeed')
-rw-r--r--  gemfeed/2025-08-05-local-coding-llm-with-ollama.md  1
1 file changed, 1 insertion, 0 deletions
diff --git a/gemfeed/2025-08-05-local-coding-llm-with-ollama.md b/gemfeed/2025-08-05-local-coding-llm-with-ollama.md
index eaec7d7f..a7d7bbfc 100644
--- a/gemfeed/2025-08-05-local-coding-llm-with-ollama.md
+++ b/gemfeed/2025-08-05-local-coding-llm-with-ollama.md
@@ -262,6 +262,7 @@ The code is quite straightforward, especially for generating boilerplate code th
To leverage Ollama for real-time code completion in my editor, I have integrated it with Helix, my preferred text editor. Helix supports the Language Server Protocol (LSP), which enables advanced code completion features. `lsp-ai` is an LSP server that can interface with Ollama models for code completion tasks.
+[https://helix-editor.com](https://helix-editor.com)
[https://github.com/SilasMarvin/lsp-ai](https://github.com/SilasMarvin/lsp-ai)
### Installation of `lsp-ai`
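For context, wiring an external LSP server into Helix is done in its `languages.toml` configuration file. The fragment below is only a sketch of how `lsp-ai` might be registered against a local Ollama model: the keys under `[language-server.lsp-ai.config]`, the model name `qwen2.5-coder`, and the choice of the `rust` language are assumptions based on `lsp-ai`'s documentation and may differ between versions, so the `lsp-ai` README should be treated as authoritative.

```toml
# Sketch (assumed schema): register lsp-ai as a language server in
# Helix's languages.toml. Key names follow lsp-ai's documented Ollama
# backend and may change between releases.
[language-server.lsp-ai]
command = "lsp-ai"

# Keep file contents in memory so lsp-ai can build completion context.
[language-server.lsp-ai.config.memory.file_store]

# Point lsp-ai at a locally running Ollama model
# (the model name here is an example, not from the original post).
[language-server.lsp-ai.config.models.model1]
type = "ollama"
model = "qwen2.5-coder"

# Attach lsp-ai to a language; "rust" is an illustrative choice.
[[language]]
name = "rust"
language-servers = ["lsp-ai"]
```

After editing `languages.toml`, restarting Helix (or reloading its config) picks up the new language server, assuming the `lsp-ai` binary is on `$PATH` and Ollama is serving the referenced model.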