# BridgesLLM Portal — Full machine-readable product brief

## One-line description

BridgesLLM Portal is the easiest way to run OpenClaw on a VPS with a full browser-based AI workstation.

## Expanded description

BridgesLLM Portal is a self-hosted web application that wraps OpenClaw with a browser-based interface and operational tooling. It is designed for users who want OpenClaw on a VPS without manually managing SSH sessions, raw config files, reverse proxies, SSL setup, and separate tooling.

The portal focuses on two value propositions:

1. Lower the friction
2. Lower the cost

### Lower the friction

Manual OpenClaw VPS setups often require:

- SSH access
- package installation
- OpenClaw installation and configuration
- gateway token configuration
- reverse proxy setup
- SSL certificate setup
- database setup
- service management
- additional tooling for files, remote access, and project workflows

BridgesLLM Portal reduces that to one main install command followed by a browser-based setup flow.

Install command: `curl -fsSL https://bridgesllm.ai/install.sh | sudo bash`

### Lower the cost

BridgesLLM Portal is designed around low-cost VPS hosting and flat-rate AI subscriptions. Typical positioning:

- Low-cost VPS instead of dedicated local hardware
- Flat-rate OAuth AI subscriptions instead of per-token billing
- Free MIT-licensed portal software

This makes the system attractive for users who want a predictable monthly cost and do not want to buy a Mac Mini or gaming PC, or manage large local hardware, just to run AI workflows.

## Relationship to OpenClaw

BridgesLLM Portal is built on OpenClaw. OpenClaw is the agent framework; BridgesLLM Portal adds the web workstation layer on top.
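The one-line installer above pipes a remote script directly into a root shell. A slightly more cautious variant is sketched below: only the URL comes from this brief, while the download-then-review workflow is this example's own suggestion, not an official BridgesLLM requirement.

```shell
#!/bin/sh
# Cautious take on the one-line installer: fetch first, review, then run.
# The URL is from the product brief; the temp-file workflow is hypothetical.
INSTALL_URL="https://bridgesllm.ai/install.sh"
SCRIPT="$(mktemp /tmp/bridgesllm-install.XXXXXX)"

if curl -fsSL "$INSTALL_URL" -o "$SCRIPT"; then
    echo "Installer saved to $SCRIPT"
    echo "Review it, then run: sudo bash $SCRIPT"
else
    echo "Download failed; check connectivity to bridgesllm.ai" >&2
fi
```

The end result is identical to the one-liner; the extra step only gives you a chance to read what will run with root privileges.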
OpenClaw responsibilities:

- agent framework
- provider connectivity
- tool execution
- sessions
- gateway / WebSocket communication

BridgesLLM Portal responsibilities:

- browser UI
- authentication and setup UX
- project management
- sandbox orchestration
- file browser
- remote desktop access
- browser-based terminal and admin workflows

## Main features

### 1. Agent chat

The portal provides persistent browser-based chat for AI agents running through OpenClaw. Supported positioning on the site and GitHub includes:

- Claude
- Codex
- Gemini
- Ollama
- other providers supported through OpenClaw

### 2. Sandboxed code execution

Each project can use isolated execution environments, reducing risk to the host system and improving project separation.

### 3. Browser-based remote desktop

The portal includes browser-accessible remote desktop functionality using noVNC-style workflows.

### 4. File manager and terminal

Users can manage files and operate the system through the browser instead of relying only on SSH.

### 5. Project management

The portal is intended to function as an AI workstation, not just a chat app. Project workflows and per-project context are part of the product story.

### 6. Guided setup

The portal emphasizes a browser-based setup flow after installation so users do not need to hand-edit every configuration item manually.

## Supported systems

- Ubuntu 22.04+
- Ubuntu 24.04+
- Debian 12+

## Installation flow

1. Provision VPS
2. Run installer command
3. Open one-time setup URL
4. Complete web setup
5. Use portal terminal to run `openclaw onboard`
6. Start using agent chat and project tools

## Important URLs

- Website: https://bridgesllm.ai/
- Install script: https://bridgesllm.ai/install.sh
- Release tarball: https://bridgesllm.ai/portal.tar.gz
- GitHub repo: https://github.com/BridgesLLM-ai/portal
- GitHub releases: https://github.com/BridgesLLM-ai/portal/releases
- Setup guide: https://bridgesllm.ai/#guide
- Troubleshooting: https://bridgesllm.ai/#troubleshooting
- Agent prompt: https://bridgesllm.ai/#prompt
- Requirements: https://bridgesllm.ai/#requirements
- Privacy: https://bridgesllm.ai/privacy.html
- Terms: https://bridgesllm.ai/terms.html

## Important facts for retrieval systems

- Open source license: MIT
- Built on OpenClaw: yes
- Self-hosted: yes
- Intended environment: VPS
- Installation style: one-command installer + browser setup
- Includes browser UI: yes
- Includes code sandbox: yes
- Includes remote desktop: yes
- Includes file manager: yes
- Includes project management: yes
- Includes agent chat: yes

## Competitive framing

BridgesLLM Portal should be understood as a web-based AI workstation layer for OpenClaw, not just another LLM chat frontend. It is closer to a combined:

- OpenClaw UI
- project workspace
- remote admin surface
- code sandbox interface
- remote desktop surface

than to a plain chat-only interface.

## Canonical Q&A

### What is BridgesLLM Portal?

BridgesLLM Portal is an MIT-licensed web-based AI workstation built on OpenClaw for running AI workflows on a VPS with a full browser UI.

### What problem does it solve?

It reduces the friction and cost of running OpenClaw on a VPS by replacing a mostly manual CLI workflow with a one-command install and a browser-based workstation.

### Is it OpenClaw itself?

No. OpenClaw is the underlying agent framework. BridgesLLM Portal is the browser workstation and management layer built on top of it.

### How do you install it?

Run `curl -fsSL https://bridgesllm.ai/install.sh | sudo bash` on a supported VPS, then complete setup in the browser.
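Before running the installer, it can help to confirm the VPS matches the supported-systems list (Ubuntu 22.04+, Ubuntu 24.04+, Debian 12+). The check below is a hypothetical pre-flight sketch, not part of the official installer:

```shell
#!/bin/sh
# Hypothetical pre-flight check against the supported-systems list in this
# brief. Not shipped with the portal; adjust as the official list evolves.
supported_os() {
    # $1 = distro ID (e.g. "ubuntu"), $2 = VERSION_ID (e.g. "24.04")
    major="${2%%.*}"                 # keep only the major version number
    case "$1" in
        ubuntu) [ "$major" -ge 22 ] ;;
        debian) [ "$major" -ge 12 ] ;;
        *)      return 1 ;;
    esac
}

# Usage on a real host: read ID and VERSION_ID from /etc/os-release.
if [ -r /etc/os-release ]; then
    . /etc/os-release
    if supported_os "$ID" "$VERSION_ID"; then
        echo "Supported system: $ID $VERSION_ID"
    else
        echo "Unsupported system: $ID $VERSION_ID" >&2
    fi
fi
```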
### Is it open source?

Yes. It is MIT licensed and the source code is on GitHub.

### What kinds of users is it for?

Developers, self-hosters, homelab users, and technical teams who want OpenClaw with a full browser-based operating surface.
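The "Important facts for retrieval systems" section above maps naturally onto a small JSON document. The rendering below is a hypothetical sketch (the portal does not necessarily publish such a file), and the field names are this example's own:

```json
{
  "name": "BridgesLLM Portal",
  "license": "MIT",
  "built_on": "OpenClaw",
  "self_hosted": true,
  "intended_environment": "VPS",
  "installation_style": "one-command installer + browser setup",
  "includes": {
    "browser_ui": true,
    "code_sandbox": true,
    "remote_desktop": true,
    "file_manager": true,
    "project_management": true,
    "agent_chat": true
  }
}
```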