- Implement new anthropicClient with full Client interface
- Add Streamer interface for token-by-token streaming via SSE
- Add Anthropic Messages API v1 integration with proper headers
- Support claude-3-5-sonnet-20241022 as default model
- Add configuration via [anthropic] TOML section
- Add environment variable overrides (HEXAI_ANTHROPIC_*)
- Support both HEXAI_ANTHROPIC_API_KEY and ANTHROPIC_API_KEY fallback
- Integrate Anthropic key handling in LSP, CLI, and llmutils
- Update provider factory to support 'anthropic' provider name
- Add 11 comprehensive unit tests for Anthropic client
- Update config.toml.example with [anthropic] section
- Update NewFromConfig() signature to accept anthropicAPIKey parameter
- All 51 internal LLM tests pass (11 new Anthropic tests + 40 existing)
Anthropic models can be accessed via:
[anthropic]
model = "claude-3-5-sonnet-20241022"
base_url = "https://api.anthropic.com/v1"
temperature = 0.2
or environment:
export HEXAI_PROVIDER="anthropic"
export HEXAI_ANTHROPIC_API_KEY="your-key"
Amp-Thread-ID: https://ampcode.com/threads/T-019c0af1-f215-72cf-9940-b014b1a9576b
Co-authored-by: Amp <amp@ampcode.com>
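A minimal sketch of the API-key fallback described above, assuming a small helper (resolveAnthropicKey is illustrative and not a function from the Hexai codebase):

package main

import (
    "fmt"
    "os"
)

// resolveAnthropicKey illustrates the lookup order from the commit:
// prefer HEXAI_ANTHROPIC_API_KEY, fall back to ANTHROPIC_API_KEY,
// and report an error when neither is set.
func resolveAnthropicKey() (string, error) {
    if key := os.Getenv("HEXAI_ANTHROPIC_API_KEY"); key != "" {
        return key, nil
    }
    if key := os.Getenv("ANTHROPIC_API_KEY"); key != "" {
        return key, nil
    }
    return "", fmt.Errorf("anthropic: no API key configured")
}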
cache; width mitigation (narrow/maxlen); configurable [stats] window_minutes; robust coverage parsing; docs update

- Add internal/stats with windowed event cache + flock + atomic writes
- Wire stats into LSP/CLI/Tmux Action; tmux shows Σ@window with per-model tail
- HEXAI_TMUX_STATUS_NARROW and HEXAI_TMUX_STATUS_MAXLEN for width control
- Add [stats] window_minutes to config and apply on startup
- Improve Magefile coverage handling; add tests to lift coverage >85%
- Update docs/tmux.md and config example
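A rough sketch of the windowed-cache idea behind internal/stats (the type and method names are assumptions for illustration, not the package's actual API; the real package also does flock-protected, atomic writes):

package stats

import (
    "sync"
    "time"
)

// eventCache is an illustrative rolling-window cache: events older than
// the configured window are dropped whenever a new event is recorded.
type eventCache struct {
    mu     sync.Mutex
    window time.Duration // e.g. derived from [stats] window_minutes
    events []time.Time
}

func (c *eventCache) add(t time.Time) {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.events = append(c.events, t)
    c.prune(t)
}

// prune drops events that fall outside the rolling window.
func (c *eventCache) prune(now time.Time) {
    cutoff := now.Add(-c.window)
    kept := c.events[:0]
    for _, e := range c.events {
        if e.After(cutoff) {
            kept = append(kept, e)
        }
    }
    c.events = kept
}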
wire into LSP and TUI
>text>/>>text>; update docs and example config; tests updated to new triggers and raise LSP coverage to >=85%; chore: remove semicolon legacy; chore(mage): auto-refresh coverage daily if docs/coverage.out is older than 24h
completion_debounce_ms (default 200)
- Server: wait until no input for the debounce interval before LLM calls
- Applies to chat and provider-native completion paths
- Tests: add debounce tests and adjust existing ones to verify behavior

All unit tests pass.
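A minimal sketch of the debounce behaviour, assuming a timer that resets on every keystroke (the debouncer type is illustrative, not the server's implementation):

package main

import (
    "fmt"
    "time"
)

// debouncer delays work until no new input has arrived for the configured
// interval, mirroring the completion_debounce_ms behaviour described above.
type debouncer struct {
    delay time.Duration
    timer *time.Timer
}

// trigger (re)starts the countdown; fn only runs once input goes quiet.
func (d *debouncer) trigger(fn func()) {
    if d.timer != nil {
        d.timer.Stop()
    }
    d.timer = time.AfterFunc(d.delay, fn)
}

func main() {
    d := &debouncer{delay: 200 * time.Millisecond}
    for i := 0; i < 3; i++ {
        d.trigger(func() { fmt.Println("would call LLM now") })
        time.Sleep(50 * time.Millisecond) // rapid keystrokes keep resetting the timer
    }
    time.Sleep(300 * time.Millisecond) // input goes quiet; callback fires once
}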
add go install instructions
key; prefer HEXAI_COPILOT_API_KEY in builders
HEXAI_OPENAI_API_KEY over OPENAI_API_KEY; update docs
heuristic

- Prevent overlapping LLM requests via llmBusy guard
- Remove time-based throttle and option plumbing
- Short-prefix heuristic now skips over trailing whitespace and clamps index
- Add tests for busy guard and trailing-space allowance
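One common way to build a guard like llmBusy is an atomic flag; a sketch under that assumption (not the server's actual code):

package main

import (
    "fmt"
    "sync/atomic"
)

// llmBusy-style guard: only one in-flight LLM request at a time; later
// attempts are skipped instead of queued.
var busy atomic.Bool

func requestCompletion(doWork func()) bool {
    // CompareAndSwap succeeds only for the first caller; others bail out.
    if !busy.CompareAndSwap(false, true) {
        return false
    }
    defer busy.Store(false)
    doWork()
    return true
}

func main() {
    ok := requestCompletion(func() { fmt.Println("LLM call runs") })
    fmt.Println("first attempt accepted:", ok)
}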
and '?' from trigger characters
- Add min-typed-prefix heuristic for LLM completions (>=2 chars)
- Add simple time-based throttle between LLM completions (default 900ms)
- Tests: verify default triggers and skip logic (throttle + min prefix)
- Config example: update trigger_characters list
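A sketch of what a minimum-typed-prefix check can look like (hasMinPrefix is illustrative; the real heuristic also skips trailing whitespace and clamps the cursor index, per the entry above):

package main

import (
    "fmt"
    "strings"
    "unicode"
)

// hasMinPrefix reports whether the text before the cursor ends in at least
// minLen identifier-like characters, so a bare one-letter prefix does not
// trigger an LLM completion.
func hasMinPrefix(line string, col, minLen int) bool {
    if col > len(line) {
        col = len(line) // clamp a cursor that points past the line end
    }
    runes := []rune(strings.TrimRight(line[:col], " \t"))
    n := 0
    for i := len(runes) - 1; i >= 0; i-- {
        r := runes[i]
        if !unicode.IsLetter(r) && !unicode.IsDigit(r) && r != '_' {
            break
        }
        n++
    }
    return n >= minLen
}

func main() {
    fmt.Println(hasMinPrefix("return fo", 9, 2)) // true: "fo" has been typed
    fmt.Println(hasMinPrefix("x = a", 5, 2))     // false: only "a" so far
}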
- Extract helpers to keep funcs <=50 lines; no behavior changes
- Add tests for prompt removal, code actions, and LLM request builders
- Table-driven TestInParamList; run gofmt
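For reference, a table-driven test in Go follows this shape (the inParamList stand-in, the package name, and the cases are invented for illustration; the repo's TestInParamList will differ):

package lsp

import "testing"

// inParamList is a stand-in for the real helper under test: it reports
// whether the cursor column sits inside an unclosed parameter list.
func inParamList(line string, col int) bool {
    depth := 0
    for i, r := range line {
        if i >= col {
            break
        }
        switch r {
        case '(':
            depth++
        case ')':
            depth--
        }
    }
    return depth > 0
}

func TestInParamList(t *testing.T) {
    cases := []struct {
        name string
        line string
        col  int
        want bool
    }{
        {"inside call", "foo(bar, ", 9, true},
        {"after close", "foo(bar)", 8, false},
        {"top level", "x := 1", 3, false},
    }
    for _, tc := range cases {
        if got := inParamList(tc.line, tc.col); got != tc.want {
            t.Errorf("%s: inParamList(%q, %d) = %v, want %v", tc.name, tc.line, tc.col, got, tc.want)
        }
    }
}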
to app config and server options.
- Use in LSP code actions and completions.
- Default to provider temperature when not set.
- Update README and config.json.example.
config with coding-friendly default 0.2.
- Wire defaults through providers (OpenAI, Copilot, Ollama).
- Update CLI and LSP runners to pass configured temperatures.
- Document temperature behavior and examples in README.
- Update config.json.example to show new keys.
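A small sketch of the temperature fallback these two entries describe; the 0.2 default comes from the commit, everything else is illustrative:

package main

import "fmt"

const defaultTemperature = 0.2 // coding-friendly default from the commit

// effectiveTemperature picks the action-specific override when configured,
// otherwise the provider-level value, otherwise the global default.
func effectiveTemperature(actionTemp, providerTemp *float64) float64 {
    if actionTemp != nil {
        return *actionTemp
    }
    if providerTemp != nil {
        return *providerTemp
    }
    return defaultTemperature
}

func main() {
    provider := 0.7
    fmt.Println(effectiveTemperature(nil, &provider)) // 0.7: provider value used
    fmt.Println(effectiveTemperature(nil, nil))       // 0.2: global default
}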
This flag was not used anywhere in the codebase, so it has been removed.
- Move CLI logic to internal/hexaicli with Run/RunWithClient
- Move LSP logic to internal/hexailsp with Run/RunWithFactory
- Extract helpers; keep behavior identical for both binaries
- Add unit tests for hexaicli (input parsing, messages, streaming) and
hexailsp (factory wiring, client creation, logging settings)
- Add top-of-file summaries and 'Not yet reviewed by a human' comments to all Go files
- Update README with internal package docs
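The Run/RunWithClient split is a standard Go seam for testing; a hedged sketch of the shape it might take (the Client interface and these signatures are assumptions, not the actual internal/hexaicli API):

package hexaicli

import (
    "fmt"
    "io"
)

// Client is a stand-in for the LLM client interface used by the CLI.
type Client interface {
    Complete(prompt string) (string, error)
}

// Run would build the real client from configuration, then delegate.
func Run(out io.Writer, prompt string, newClient func() (Client, error)) error {
    c, err := newClient()
    if err != nil {
        return err
    }
    return RunWithClient(out, prompt, c)
}

// RunWithClient holds the actual CLI logic and accepts any Client,
// so tests can inject a fake instead of hitting a provider.
func RunWithClient(out io.Writer, prompt string, c Client) error {
    reply, err := c.Complete(prompt)
    if err != nil {
        return err
    }
    _, err = fmt.Fprintln(out, reply)
    return err
}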