My AI Learning Journey – Part 6 – A Reverse Proxy for the LLM GUI
The author explains how to secure Open WebUI (OWUI) for Ollama by adding a reverse proxy, since OWUI itself serves only plain HTTP. Because the OWUI host lacks a public IP, a double‑proxy setup is used: an external Caddy instance forwards requests to an internal Nginx container, which then routes traffic to OWUI on port 3000. The solution is defined in Docker Compose files and an Nginx configuration, enabling HTTPS access from any device while keeping internal traffic unencrypted. The post also mentions SSH tunneling as an option for full end‑to‑end encryption.
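The internal leg of the double‑proxy chain could be sketched as an Nginx server block like the one below. Only port 3000 and the HTTP‑only internal hop come from the post; the server name and the upstream hostname `owui-host` are placeholder assumptions, and the WebSocket headers are a common addition for chat UIs rather than something the summary confirms.

```nginx
# Internal Nginx container: plain-HTTP leg of the double proxy.
# Receives traffic forwarded by the external Caddy instance and
# routes it to Open WebUI on port 3000. Names are hypothetical.
server {
    listen 80;
    server_name owui.internal;              # assumed internal name

    location / {
        proxy_pass http://owui-host:3000;   # OWUI port from the post
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;

        # WebSocket upgrade headers, typically needed for a live chat UI
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

The external Caddy instance would then terminate TLS and `reverse_proxy` to this server, so only the hop inside the private network stays unencrypted.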
My AI Learning Journey – Part 5 – A GUI for the LLM at Home
The author details how to layer Open WebUI (OWUI) on top of an existing Ollama installation using Docker Compose, exposing the UI on port 3000. By configuring Ollama to listen on 0.0.0.0:11434 and adjusting the host firewall, the container can...
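A minimal Docker Compose sketch of the setup described above might look as follows. The port‑3000 exposure and the Ollama listener on 0.0.0.0:11434 come from the post; the image tag, volume name, and the `host.docker.internal` address for reaching the host's Ollama are assumptions filled in from Open WebUI's documented defaults (the container serves on 8080 internally, hence the 3000:8080 mapping).

```yaml
# Sketch: Open WebUI layered on an existing host-level Ollama install.
# Assumes Ollama already listens on 0.0.0.0:11434 and the firewall allows it.
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                 # UI reachable on host port 3000
    environment:
      # Point the UI at the host's Ollama API (address is a placeholder)
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data   # persist chats and settings
    restart: unless-stopped

volumes:
  open-webui:
```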
What Does TOTP Protect From?
Time‑based One‑Time Passwords (TOTP) rely on a shared secret stored on both client and server, making the secret a single point of failure if the server is breached. The author argues that TOTP’s strongest defense is against client‑side ransomware or...
TOTP Authentication – Open Source and Between Devices
Delivering two‑factor codes via SMS or email introduces latency and vendor lock‑in risks, prompting a shift toward standardized, open‑source TOTP solutions. The author discovered that KeePassDX on Android can act as a local TOTP generator by scanning QR codes and storing...