{% extends "base.html" %} {% block title %}Settings - AUTARCH{% endblock %} {% block content %}

Change Password

OSINT Settings

UPnP Settings

LLM Configuration

Active backend: {{ llm_backend }}
Open LLM Settings →

Configure local models (GGUF / SafeTensors), Claude, OpenAI, and the HuggingFace Inference API.

MCP Server

Model Context Protocol: exposes AUTARCH tools to AI clients such as Claude Desktop and Claude Code.

Debug Console

Captures all Python logging output in a live debug window available on every page. Useful for troubleshooting LLM, MSF, and tool issues.

{% endblock %}