- Fix Hal chat: add a Chat/Agent mode toggle so users can switch between
direct LLM streaming (Chat) and tool-using Agent mode
- Fix Agent system: degrade gracefully when the model can't follow the
structured THOUGHT/ACTION/PARAMS format (fall back to a direct answer
after 2 parse failures instead of looping 20 times)
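A rough sketch of the fallback behavior described above (the regex, function
names, and return shape here are illustrative assumptions, not the project's
actual agent code):

```python
import re

# Hypothetical sketch of the parse-failure fallback; the real agent loop
# in this project may differ in names and structure.
STEP_RE = re.compile(
    r"THOUGHT:\s*(?P<thought>.+?)\s*"
    r"ACTION:\s*(?P<action>\S+)\s*"
    r"PARAMS:\s*(?P<params>.+)",
    re.DOTALL,
)

MAX_PARSE_FAILURES = 2  # fall back after 2 bad replies, not 20 loop iterations


def run_agent_turn(replies):
    """Consume model replies in order; return ("action", name, params)
    on the first parsable step, or ("direct", text) once parsing has
    failed MAX_PARSE_FAILURES times."""
    failures = 0
    for text in replies:
        m = STEP_RE.search(text)
        if m:
            return ("action", m.group("action"), m.group("params").strip())
        failures += 1
        if failures >= MAX_PARSE_FAILURES:
            # graceful degradation: surface the reply as a direct answer
            return ("direct", text.strip())
    return ("direct", "")
```

The point of the cap is that a model which ignores the format produces the
same unparsable reply repeatedly, so retrying 20 times only adds latency.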
- Fix frozen build: remove llama_cpp from the PyInstaller excludes list
so the LLM works in the compiled exe
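In a PyInstaller .spec file this amounts to deleting one entry from the
Analysis excludes list; the fragment below is a hypothetical illustration
(entry script name and other excluded modules are assumptions):

```python
# Hypothetical .spec fragment: llama_cpp must NOT appear in excludes,
# or PyInstaller strips it and the frozen exe cannot load the LLM.
a = Analysis(
    ["main.py"],  # assumed entry script
    excludes=[
        "tkinter",
        "matplotlib",
        # "llama_cpp",  # removed: excluding it broke the compiled exe
    ],
)
```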
- Add system tray icon: autarch.ico (from icon.svg) used for exe icons,
installer shortcuts, and the runtime tray icon
- Update tray.py to load the .ico file, with a fallback to programmatic generation
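The load-with-fallback logic presumably also has to find the .ico inside the
frozen build; a minimal sketch using PyInstaller's documented sys._MEIPASS
convention (`load_tray_icon` and `draw_fallback_icon` are hypothetical names
standing in for tray.py's actual code):

```python
import os
import sys


def resource_path(name):
    """Resolve a bundled resource in dev and in a PyInstaller exe,
    where data files are unpacked under sys._MEIPASS."""
    base = getattr(sys, "_MEIPASS", os.path.abspath("."))
    return os.path.join(base, name)


def draw_fallback_icon():
    # Stand-in for tray.py's programmatic icon generation; returns a
    # sentinel here so the sketch stays self-contained.
    return None


def load_tray_icon():
    """Prefer the shipped autarch.ico; fall back to drawing one in code."""
    path = resource_path("autarch.ico")
    if os.path.exists(path):
        return path  # hand the .ico file to the tray library
    return draw_fallback_icon()
```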
- Add inline critical CSS to prevent FOUC (flash of unstyled content)
- Bump version to 1.5.1
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>