author     Paul Buetow <paul@buetow.org>    2025-08-17 21:39:01 +0300
committer  Paul Buetow <paul@buetow.org>    2025-08-17 21:39:01 +0300
commit     ad99f0a6a65bb1d327feb933d5349fc067c881f2 (patch)
tree       f8e66b81639e134e4898e000ff34e8782db57fa0
parent     454451105ad3522d2ac3d22136eedee4a4d034af (diff)
feat: Support XDG config home
This change implements support for the XDG Base Directory Specification for the configuration file. The configuration file is now read from `$XDG_CONFIG_HOME/hexai/config.json` if the `XDG_CONFIG_HOME` environment variable is set; if it is not set, it falls back to the previous location, `$HOME/.config/hexai/config.json`.

This change also includes:

- A fix for a test-suite bug where a test failed because an environment variable was set.
- Documentation updates to reflect the new configuration file location.
- A version bump to 0.1.0.
-rw-r--r--  IDEAS.md                         15
-rw-r--r--  README.md                       187
-rw-r--r--  cmd/hexai/main.go                 1
-rw-r--r--  internal/appconfig/config.go    160
-rw-r--r--  internal/hexailsp/run_test.go     4
-rw-r--r--  internal/version.go               1
6 files changed, 176 insertions, 192 deletions
diff --git a/IDEAS.md b/IDEAS.md
index 70f6537..11cae34 100644
--- a/IDEAS.md
+++ b/IDEAS.md
@@ -50,18 +50,3 @@ Be able to switch LLMs.
## Usage notes
-Helix' `languages.toml`
-
-```toml
-[[language]]
-name = "go"
-auto-format= true
-diagnostic-severity = "hint"
-formatter = { command = "goimports" }
-language-servers = [ "gopls", "golangci-lint-lsp", "hexai" ]
-# language-servers = [ "gopls", "golangci-lint-lsp", "lsp-ai", "gpt", "hexai" ]
-
-[language-server.hexai]
-command = "hexai"
-```
-
diff --git a/README.md b/README.md
index dc79489..bfa5cb9 100644
--- a/README.md
+++ b/README.md
@@ -4,11 +4,41 @@
Hexai, the AI LSP for the Helix editor, and also a simple command-line tool for interacting with LLMs in general.
-At the moment this project is in the alpha state.
+Hexai exposes a simple LLM provider interface. It supports OpenAI, GitHub Copilot, and a local Ollama server. Provider selection and models are configured via a JSON configuration file.
-## LLM provider
+## Configuration
-Hexai exposes a simple LLM provider interface. It supports OpenAI, GitHub Copilot, and a local Ollama server. Provider selection and models are configured via a JSON configuration file.
+### Example configuration file
+
+- Location: `$XDG_CONFIG_HOME/hexai/config.json` (usually `~/.config/hexai/config.json`)
+- Example:
+
+```json
+{
+ "max_tokens": 4000,
+ "context_mode": "always-full",
+ "context_window_lines": 120,
+ "max_context_tokens": 4000,
+ "log_preview_limit": 100,
+ "no_disk_io": true,
+ "trigger_characters": [".", ":", "/", "_", ";", "?"],
+ "provider": "ollama",
+ "copilot_model": "gpt-4.1",
+ "copilot_base_url": "https://api.githubcopilot.com",
+ "openai_model": "gpt-4.1",
+ "openai_base_url": "https://api.openai.com/v1",
+ "ollama_model": "qwen2.5-coder:latest",
+ "ollama_base_url": "http://localhost:11434"
+}
+```
+
+* context_mode: minimal | window | file-on-new-func | always-full
+* provider: openai | copilot | ollama
+* openai_model, openai_base_url: OpenAI-only options
+* copilot_model, copilot_base_url: Copilot-only options
+* ollama_model, ollama_base_url: Ollama-only options
+
+Ensure `OPENAI_API_KEY` or `COPILOT_API_KEY` is set in your environment according to your chosen provider.
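For example (key names as documented above; the values shown are placeholders, not real credentials):

```shell
# Set the API key that matches the "provider" field in config.json.
export OPENAI_API_KEY="sk-placeholder"   # when "provider": "openai"
# export COPILOT_API_KEY="placeholder"   # when "provider": "copilot"
# The "ollama" provider talks to a local server and needs no key.
echo "key configured: ${OPENAI_API_KEY:+yes}"  # prints "key configured: yes"
```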
### Selecting a provider
@@ -40,7 +70,9 @@ Notes:
- If you run Ollama in OpenAI‑compatible mode, you may alternatively use the
OpenAI provider with `openai_base_url` in the config pointing to your local endpoint.
-## CLI usage and configuration
+## Usage
+
+### Hexai LSP Server
- Run LSP server over stdio:
- `hexai-lsp`
@@ -49,116 +81,34 @@ Notes:
- `-version`: print the Hexai version and exit.
- `-log`: path to log file (optional; default `/tmp/hexai-lsp.log`).
-- Run command-line tool (processes text via configured LLM):
- - `cat SOMEFILE.txt | hexai`
- - `hexai 'some prompt text here'`
- - `cat SOMEFILE.txt | hexai 'some prompt text here'` (stdin and arg are concatenated)
-
-Notes for `hexai` (CLI):
-- Prints LLM output to stdout.
-- Prints provider/model immediately to stderr, and a summary to stderr at the end (time, input bytes, output bytes, provider/model).
-- Default response style: short answers. If the prompt asks for commands, outputs only the commands with no explanation. Include the word `explain` anywhere in the prompt to request a verbose explanation.
-- Streams output: when supported by the provider (OpenAI, Ollama), `hexai` streams tokens and prints them to stdout as they arrive. Copilot falls back to non-streaming.
-
-### Hexai CLI behavior
-
-- Inputs: reads from stdin, from a single argument, or both.
- - If both are provided, Hexai concatenates them with a blank line in between.
-- Output routing:
- - Stdout: the LLM response only (no decorations).
- - Stderr: metadata and progress in grey on black (styled via ANSI):
- - Provider/model printed immediately when the request starts.
- - A final stats line on a new line: `done provider=… model=… time=… in_bytes=… out_bytes=…`.
-- Default style: concise answers.
- - If the prompt asks for commands, outputs only the commands with no commentary.
- - Add the word `explain` in your prompt to request a verbose explanation.
-- Exit codes: `0` success, `1` provider/config error, `2` no input.
-
-### Internal CLI package
-
-- Package `internal/hexaicli` contains the CLI logic extracted from `cmd/hexai`.
-- Entry points:
-- `Run(ctx, args, stdin, stdout, stderr)`: Full CLI flow; parses input and builds the LLM client from config/env.
-- `RunWithClient(ctx, args, stdin, stdout, stderr, client)`: Same flow using a provided `llm.Client` (useful for tests and embedding).
-- Behavior is identical to the `hexai` binary: provider/model banner on stderr, streamed output when available, and a final summary line.
-
-### Internal LSP package
-
-- Package `internal/hexailsp` contains the LSP binary logic extracted from `cmd/hexai-lsp`.
-- Entry points:
-- `Run(logPath, stdin, stdout, stderr)`: Configures logging, loads config, builds the LLM client, and runs the LSP server over stdio.
-- `RunWithFactory(logPath, stdin, stdout, logger, cfg, client, factory)`: Testable entry that accepts a prebuilt `llm.Client` and a factory for `lsp.Server` creation.
-- Mirrors the behavior of the `hexai-lsp` binary while enabling unit tests without invoking the full server loop.
-
-Examples:
-
-```
-# From stdin only
-cat SOMEFILE.txt | hexai
-
-# From arg only
-hexai 'summarize: list 3 bullets'
-
-# From both (stdin first, then arg)
-cat SOMEFILE.txt | hexai 'explain the tradeoffs'
-
-# Commands-only output (no explanation)
-hexai 'install ripgrep on macOS'
-
-# Verbose explanation
-hexai 'install ripgrep on macOS and explain'
-```
-
-Notes:
-- Token estimation for truncation uses a simple 4 chars/token heuristic.
-- Full-file context is only included by default when defining a new function to balance quality, latency, and cost.
+### Configure in Helix
+
+In Helix's `~/.config/helix/languages.toml`, add a configuration such as the following:
-- Location: `~/.config/hexai/config.json`
-- Example:
+```toml
+[[language]]
+name = "go"
+auto-format = true
+diagnostic-severity = "hint"
+formatter = { command = "goimports" }
+language-servers = [ "gopls", "golangci-lint-lsp", "hexai" ]
-```
-{
- "max_tokens": 4000,
- "context_mode": "always-full",
- "context_window_lines": 120,
- "max_context_tokens": 4000,
- "log_preview_limit": 100,
- "no_disk_io": true,
- "trigger_characters": [".", ":", "/", "_", ";", "?"],
- "provider": "ollama",
- "copilot_model": "gpt-4.1",
- "copilot_base_url": "https://api.githubcopilot.com",
- "openai_model": "gpt-4.1",
- "openai_base_url": "https://api.openai.com/v1",
- "ollama_model": "qwen2.5-coder:latest",
- "ollama_base_url": "http://localhost:11434"
-}
-```
-
-* context_mode: minimal | window | file-on-new-func | always-full
-* provider: openai | copilot | ollama
-* openai_model, openai_base_url: OpenAI-only options
-* copilot_model, copilot_base_url: Copilot-only options
-* ollama_model, ollama_base_url: Ollama-only options
-Minimal config (defaults to OpenAI):
-
-```
-{}
+[language-server.hexai]
+command = "hexai"
```
-Ensure `OPENAI_API_KEY` or `COPILOT_API_KEY` is set in your environment according to your chosen provider.
+Note that this example also configures other language servers for Go (`gopls` and `golangci-lint-lsp`) alongside `hexai` for AI completions; they are not required for `hexai` to work.
## Inline triggers
-Hexai supports inline trigger tags you can type in your code to request an
+Hexai LSP supports inline trigger tags you can type in your code to request an
action from the LLM and then clean up the tag automatically.
-- ``: Do what is written in `text`, then remove just the `` marker.
+- `;some prompt here;`: Do what is written in `some prompt here`, then remove the prompt marker itself.
- Strict form: no space after the first `;`.
- An optional single space immediately after the closing `;` is also removed.
- - Multiple markers per line are supported.
- - Example: `// TODO ` removes only the marker.
- Spaced variants such as `; text ; spaced ;` are ignored.
+- `some text here ;;some prompt;`
## Code actions
@@ -176,7 +126,34 @@ Instruction sources (first one found wins):
- Line comments: `// text`, `# text`, `-- text`.
- Single-line block comments: `/* text */`, `<!-- text -->`.
-Notes:
+## Hexai CLI tool
+
+- Run command-line tool (processes text via configured LLM):
+ - `cat SOMEFILE.txt | hexai`
+ - `hexai 'some prompt text here'`
+ - `cat SOMEFILE.txt | hexai 'some prompt text here'` (stdin and arg are concatenated)
+
+- Default style: concise answers.
+ - If the prompt asks for commands, outputs only the commands with no commentary.
+ - Add the word `explain` in your prompt to request a verbose explanation.
+- Exit codes: `0` success, `1` provider/config error, `2` no input.
+
+Examples:
+
+```
+# From stdin only
+cat SOMEFILE.txt | hexai
+
+# From arg only
+hexai 'summarize: list 3 bullets'
+
+# From both (stdin first, then arg)
+cat SOMEFILE.txt | hexai 'explain the tradeoffs'
+
+# Commands-only output (no explanation)
+hexai 'install ripgrep on macOS'
+
+# Verbose explanation
+hexai 'install ripgrep on macOS and explain'
+```
-- Only the earliest instruction in the selection is used; Hexai removes that marker/comment from the selection before sending it to the LLM.
-- The action returns only the transformed code and replaces exactly the selected range.
diff --git a/cmd/hexai/main.go b/cmd/hexai/main.go
index 2a0e81b..48cb7db 100644
--- a/cmd/hexai/main.go
+++ b/cmd/hexai/main.go
@@ -1,5 +1,4 @@
// Summary: Hexai CLI entrypoint; parses flags and delegates to internal/hexaicli.
-// Not yet reviewed by a human
package main
import (
diff --git a/internal/appconfig/config.go b/internal/appconfig/config.go
index c0f28d2..6b8df4a 100644
--- a/internal/appconfig/config.go
+++ b/internal/appconfig/config.go
@@ -29,75 +29,95 @@ type App struct {
CopilotModel string `json:"copilot_model"`
}
-// Load reads configuration from ~/.config/hexai/config.json and merges with defaults.
+// Load reads configuration from a file and merges with defaults.
+// It respects the XDG Base Directory Specification.
func Load(logger *log.Logger) App {
- cfg := App{
- MaxTokens: 4000,
- ContextMode: "always-full",
- ContextWindowLines: 120,
- MaxContextTokens: 4000,
- LogPreviewLimit: 100,
- NoDiskIO: true,
- }
- home, err := os.UserHomeDir()
- if err != nil {
- return cfg
- }
- path := filepath.Join(home, ".config", "hexai", "config.json")
- f, err := os.Open(path)
- if err != nil {
- return cfg
- }
- defer f.Close()
- dec := json.NewDecoder(f)
- var fileCfg App
- if err := dec.Decode(&fileCfg); err != nil {
- if logger != nil {
- logger.Printf("invalid config file %s: %v", path, err)
- }
- return cfg
- }
- // Merge: file overrides defaults when provided
- if fileCfg.MaxTokens > 0 {
- cfg.MaxTokens = fileCfg.MaxTokens
- }
- if strings.TrimSpace(fileCfg.ContextMode) != "" {
- cfg.ContextMode = fileCfg.ContextMode
- }
- if fileCfg.ContextWindowLines > 0 {
- cfg.ContextWindowLines = fileCfg.ContextWindowLines
- }
- if fileCfg.MaxContextTokens > 0 {
- cfg.MaxContextTokens = fileCfg.MaxContextTokens
- }
- if fileCfg.LogPreviewLimit >= 0 {
- cfg.LogPreviewLimit = fileCfg.LogPreviewLimit
- }
- cfg.NoDiskIO = fileCfg.NoDiskIO
- if len(fileCfg.TriggerCharacters) > 0 {
- cfg.TriggerCharacters = append([]string{}, fileCfg.TriggerCharacters...)
- }
- if strings.TrimSpace(fileCfg.Provider) != "" {
- cfg.Provider = fileCfg.Provider
- }
- // Provider-specific options
- if strings.TrimSpace(fileCfg.OpenAIBaseURL) != "" {
- cfg.OpenAIBaseURL = fileCfg.OpenAIBaseURL
- }
- if strings.TrimSpace(fileCfg.OpenAIModel) != "" {
- cfg.OpenAIModel = fileCfg.OpenAIModel
- }
- if strings.TrimSpace(fileCfg.OllamaBaseURL) != "" {
- cfg.OllamaBaseURL = fileCfg.OllamaBaseURL
- }
- if strings.TrimSpace(fileCfg.OllamaModel) != "" {
- cfg.OllamaModel = fileCfg.OllamaModel
- }
- if strings.TrimSpace(fileCfg.CopilotBaseURL) != "" {
- cfg.CopilotBaseURL = fileCfg.CopilotBaseURL
- }
- if strings.TrimSpace(fileCfg.CopilotModel) != "" {
- cfg.CopilotModel = fileCfg.CopilotModel
- }
- return cfg
+ cfg := App{
+ MaxTokens: 4000,
+ ContextMode: "always-full",
+ ContextWindowLines: 120,
+ MaxContextTokens: 4000,
+ LogPreviewLimit: 100,
+ NoDiskIO: true,
+ }
+
+ if logger == nil {
+ return cfg // Return defaults if no logger is provided (e.g. in tests)
+ }
+
+ var configPath string
+ if xdgConfigHome := os.Getenv("XDG_CONFIG_HOME"); xdgConfigHome != "" {
+ configPath = filepath.Join(xdgConfigHome, "hexai", "config.json")
+ } else {
+ home, err := os.UserHomeDir()
+ if err != nil {
+ if logger != nil {
+ logger.Printf("cannot find user home directory: %v", err)
+ }
+ return cfg // Return defaults if home dir is not found
+ }
+ configPath = filepath.Join(home, ".config", "hexai", "config.json")
+ }
+
+ f, err := os.Open(configPath)
+ if err != nil {
+ if !os.IsNotExist(err) && logger != nil {
+ logger.Printf("cannot open config file %s: %v", configPath, err)
+ }
+ return cfg // Return defaults if file doesn't exist or can't be opened
+ }
+ defer f.Close()
+
+ dec := json.NewDecoder(f)
+ var fileCfg App
+ if err := dec.Decode(&fileCfg); err != nil {
+ if logger != nil {
+ logger.Printf("invalid config file %s: %v", configPath, err)
+ }
+ return cfg // Return defaults on decoding error
+ }
+
+ // Merge: file overrides defaults when provided
+ if fileCfg.MaxTokens > 0 {
+ cfg.MaxTokens = fileCfg.MaxTokens
+ }
+ if strings.TrimSpace(fileCfg.ContextMode) != "" {
+ cfg.ContextMode = fileCfg.ContextMode
+ }
+ if fileCfg.ContextWindowLines > 0 {
+ cfg.ContextWindowLines = fileCfg.ContextWindowLines
+ }
+ if fileCfg.MaxContextTokens > 0 {
+ cfg.MaxContextTokens = fileCfg.MaxContextTokens
+ }
+ if fileCfg.LogPreviewLimit >= 0 {
+ cfg.LogPreviewLimit = fileCfg.LogPreviewLimit
+ }
+ cfg.NoDiskIO = fileCfg.NoDiskIO
+ if len(fileCfg.TriggerCharacters) > 0 {
+ cfg.TriggerCharacters = append([]string{}, fileCfg.TriggerCharacters...)
+ }
+ if strings.TrimSpace(fileCfg.Provider) != "" {
+ cfg.Provider = fileCfg.Provider
+ }
+ // Provider-specific options
+ if strings.TrimSpace(fileCfg.OpenAIBaseURL) != "" {
+ cfg.OpenAIBaseURL = fileCfg.OpenAIBaseURL
+ }
+ if strings.TrimSpace(fileCfg.OpenAIModel) != "" {
+ cfg.OpenAIModel = fileCfg.OpenAIModel
+ }
+ if strings.TrimSpace(fileCfg.OllamaBaseURL) != "" {
+ cfg.OllamaBaseURL = fileCfg.OllamaBaseURL
+ }
+ if strings.TrimSpace(fileCfg.OllamaModel) != "" {
+ cfg.OllamaModel = fileCfg.OllamaModel
+ }
+ if strings.TrimSpace(fileCfg.CopilotBaseURL) != "" {
+ cfg.CopilotBaseURL = fileCfg.CopilotBaseURL
+ }
+ if strings.TrimSpace(fileCfg.CopilotModel) != "" {
+ cfg.CopilotModel = fileCfg.CopilotModel
+ }
+ return cfg
}
diff --git a/internal/hexailsp/run_test.go b/internal/hexailsp/run_test.go
index 2c0fcaf..923f408 100644
--- a/internal/hexailsp/run_test.go
+++ b/internal/hexailsp/run_test.go
@@ -24,6 +24,10 @@ type fakeServer struct{
func (f *fakeServer) Run() error { f.ran = true; return nil }
func TestRunWithFactory_UsesDefaultsAndCallsServer(t *testing.T) {
+ old := os.Getenv("OPENAI_API_KEY")
+ t.Cleanup(func(){ _ = os.Setenv("OPENAI_API_KEY", old) })
+ _ = os.Setenv("OPENAI_API_KEY", "")
+
var stderr bytes.Buffer
logger := log.New(&stderr, "hexai-lsp ", 0)
cfg := appconfig.Load(nil) // defaults
diff --git a/internal/version.go b/internal/version.go
index a6b1527..d86e60b 100644
--- a/internal/version.go
+++ b/internal/version.go
@@ -1,5 +1,4 @@
// Summary: Hexai semantic version identifier used by CLI and LSP binaries.
-// Not yet reviewed by a human
package internal
const Version = "0.1.0"