Diffstat (limited to 'gemfeed/2025-08-05-local-coding-llm-with-ollama.html')
-rw-r--r-- gemfeed/2025-08-05-local-coding-llm-with-ollama.html | 1
1 file changed, 1 insertion, 0 deletions
diff --git a/gemfeed/2025-08-05-local-coding-llm-with-ollama.html b/gemfeed/2025-08-05-local-coding-llm-with-ollama.html
index bdd28982..8e04d7d6 100644
--- a/gemfeed/2025-08-05-local-coding-llm-with-ollama.html
+++ b/gemfeed/2025-08-05-local-coding-llm-with-ollama.html
@@ -303,6 +303,7 @@ http://www.gnu.org/software/src-highlite -->
<br />
<span>To leverage Ollama for real-time code completion in my editor, I have integrated it with Helix, my preferred text editor. Helix supports the LSP (Language Server Protocol), which enables advanced code completion features. The <span class='inlinecode'>lsp-ai</span> is an LSP server that can interface with Ollama models for code completion tasks.</span><br />
<br />
+<a class='textlink' href='https://helix-editor.com'>https://helix-editor.com</a><br />
<a class='textlink' href='https://github.com/SilasMarvin/lsp-ai'>https://github.com/SilasMarvin/lsp-ai</a><br />
<br />
<h3 style='display: inline' id='installation-of-lsp-ai'>Installation of <span class='inlinecode'>lsp-ai</span></h3><br />