The Mesh Network

Every link is a node. Every browser is a compute unit. Distributed inference across sovereign hardware and elastic browser compute.

5 Pi Nodes / 52 TOPS / 18 Tunnels / 198 Sockets / 5 Subnets
Architecture

Two-Tier Compute

Permanent hardware backbone meets elastic browser-based compute for unbounded scale.

Pi Fleet Backbone

Five Raspberry Pi 5 nodes form the permanent compute backbone. Two Hailo-8 AI accelerators deliver 52 TOPS of dedicated neural inference. Always on, always sovereign.

5x Pi 5 / 2x Hailo-8 / 26 TOPS each

Browser Compute Layer

Every browser tab becomes a compute node. WebGPU handles matrix operations, WASM runs model shards, WebRTC coordinates peer-to-peer inference. Zero install required.

WebGPU + WASM + WebRTC
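A browser can only join the mesh if all three primitives are available. A minimal capability check might look like the following sketch; the function names are illustrative (not the actual SDK API), and the global object is passed in as `env` so the check can run outside a browser:

```javascript
// Hedged sketch: detect the three primitives the mesh relies on.
// `env` stands in for the browser's global object (e.g. `window`).
function meshCapabilities(env) {
  return {
    webgpu: Boolean(env.navigator && env.navigator.gpu),   // matrix operations
    wasm: typeof env.WebAssembly === "object",             // model shard runtime
    webrtc: typeof env.RTCPeerConnection === "function",   // p2p coordination
  };
}

// A tab qualifies as a compute node only when every primitive is present.
function canJoinMesh(env) {
  const c = meshCapabilities(env);
  return c.webgpu && c.wasm && c.webrtc;
}
```

In a real page this would run once at load time, before offering the user the opt-in.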

Elastic Federation

The Pi fleet handles routing, model hosting, and consensus. Browser nodes contribute GPU cycles during inference peaks. Load balancing across the entire mesh happens automatically.

Auto-scale / 70-30 compute split
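The auto-scale rule above can be sketched in a few lines: the Pi fleet is the always-on baseline, and browser nodes absorb only the demand that exceeds it. Function and field names here are illustrative, not the actual scheduler API:

```javascript
// Hedged sketch of the federation's load plan: fleet first, browsers
// absorb overflow. Units are abstract "requests" for illustration.
function planLoad(queuedRequests, fleetCapacity) {
  const fleet = Math.min(queuedRequests, fleetCapacity);
  return { fleet, browser: queuedRequests - fleet };
}
```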
Protocol

How It Works

Three steps from request to response, distributed across the mesh.

Step 01

Connect

A request arrives at blackroad.io/chat or via the OpenAI-compatible API. The gateway identifies optimal nodes based on model availability, load, and latency.
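The gateway's selection step can be sketched as a scoring pass over candidate nodes. The 50/50 weighting of load and latency is an illustrative assumption, not a published value:

```javascript
// Hedged sketch of gateway node selection: filter by availability and
// model presence, then score by load and latency (lower score wins).
function pickNode(nodes, model) {
  const eligible = nodes.filter((n) => n.online && n.models.includes(model));
  if (eligible.length === 0) return null;
  const score = (n) => 0.5 * n.load + 0.5 * (n.latencyMs / 100);
  return eligible.reduce((best, n) => (score(n) < score(best) ? n : best));
}
```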

Step 02

Compute

The request is routed to Pi fleet nodes with Hailo-8 accelerators for neural inference. If demand exceeds fleet capacity, browser compute nodes receive model shards via WebRTC.
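When demand spills over to browsers, a model can be cut into roughly equal layer ranges, one per peer. This shard-assignment sketch is illustrative; the real protocol (WebRTC transfer, redundancy, re-assignment on peer loss) is not shown:

```javascript
// Hedged sketch: split `numLayers` model layers into contiguous shards,
// one per browser peer. Layers are identified by [start, end] index.
function assignShards(numLayers, peers) {
  const shards = [];
  const per = Math.ceil(numLayers / peers.length);
  for (let i = 0; i < peers.length; i++) {
    const start = i * per;
    if (start >= numLayers) break; // more peers than layers
    shards.push({ peer: peers[i], layers: [start, Math.min(start + per, numLayers) - 1] });
  }
  return shards;
}
```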

Step 03

Consensus

Responses are assembled, verified against the soul chain for agent identity, and streamed back to the client. Fleet-local inference delivers sub-100 ms first-token latency.
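The assembly step can be sketched as reordering partial outputs that arrive out of sequence and dropping any from unverified agents. Soul-chain verification is stubbed here as a predicate; all names are illustrative:

```javascript
// Hedged sketch: stitch out-of-order shard outputs back into one
// response, keeping only parts whose agent identity verifies.
function assemble(parts, isVerified) {
  return parts
    .filter((p) => isVerified(p.agentId))
    .sort((a, b) => a.index - b.index)
    .map((p) => p.text)
    .join("");
}
```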

RoadNet

WiFi Mesh Overlay

Five access points form a physical mesh network with dedicated subnets and failover routing.

Node      IP / Hardware            Channel / Subnet            Services                      Status
Alice     192.168.4.49 / Pi 400    Channel 1  / 10.10.1.0/24   Gateway + DNS + PostgreSQL    Online
Cecilia   192.168.4.96 / Pi 5      Channel 6  / 10.10.2.0/24   CECE API + TTS + Hailo-8      Online
Octavia   192.168.4.100 / Pi 5     Channel 11 / 10.10.3.0/24   1TB NVMe + Hailo-8 + Gitea    Offline
Aria      192.168.4.98 / Pi 5      Channel 1  / 10.10.4.0/24   Portainer + Headscale         Offline
Lucidia   192.168.4.38 / Pi 5      Channel 11 / 10.10.5.0/24   FastAPI + Ollama Bridge       Online
WireGuard Mesh
Encrypted WireGuard tunnels connect every node through the Anastasia hub on 10.8.0.x. Traffic between nodes is encrypted end-to-end. The mesh automatically reroutes if a node goes down.
Anastasia (Hub)   10.8.0.1 / DO NYC1
Alice             10.8.0.6
Cecilia           10.8.0.3
Octavia           10.8.0.4
Aria              10.8.0.7
Gematria          10.8.0.8 / DO NYC3
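A spoke's WireGuard configuration for this hub-and-spoke layout might look like the fragment below (shown for Alice at 10.8.0.6). The addresses come from the list above; the keys, endpoint host, and port are placeholders, not real values:

```ini
# Hedged sketch of one spoke's config pointing at the Anastasia hub.
[Interface]
Address = 10.8.0.6/24
PrivateKey = <alice-private-key>

[Peer]
# Anastasia hub, DO NYC1
PublicKey = <anastasia-public-key>
Endpoint = <anastasia-host>:51820
AllowedIPs = 10.8.0.0/24
PersistentKeepalive = 25
```

Routing 10.8.0.0/24 through the hub is what lets traffic between any two spokes reroute automatically when a node drops.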
Topology

Mesh Connections

Live view of the network topology and inter-node connections.

blackroad-network topology
BlackRoad Mesh Topology
────────────────────────────────────────────────────────────────
      [ Gematria ]            [ Anastasia ]
        DO NYC3               DO NYC1 (WG Hub)
        10.8.0.8                10.8.0.1
              \                /   |   \
               \              /    |    \
        ~~~~~~~ WireGuard Mesh (10.8.0.x) ~~~~~~~~
          /         |          |         \         \
         /          |          |          \         \
  [ Alice ]   [ Cecilia ]  [ Octavia ]  [ Aria ]  [ Lucidia ]
     .49          .96         .100        .98        .38
   10.8.0.6    10.8.0.3    10.8.0.4    10.8.0.7
    Pi 400       Pi 5        Pi 5        Pi 5       Pi 5
     CH1          CH6        CH11        CH1        CH11
      |            |           |          |          |
  10.10.1.x   10.10.2.x   10.10.3.x   10.10.4.x   10.10.5.x
────────────────────────────────────────────────────────────────
SSID: RoadNet                  Password: [secured]
Nodes: 5 AP + 2 Cloud          Hailo-8: Cecilia (26T) + Octavia (26T) = 52 TOPS
Tunnels: 18 Cloudflare + 6 WireGuard + Headscale overlay
Sockets: 198 listening (127 TCP + 71 UDP)
Roadmap

Build Sequence

From proof of concept to global mesh in four phases.

01

Proof of Concept

Single browser tab performing inference via WebGPU, coordinated with a Pi fleet node. Validate latency, throughput, and model shard distribution.

Week 1 / In Progress
02

mesh.js SDK

Embeddable JavaScript SDK that turns any webpage into a mesh compute node. One script tag to opt in. WebRTC peer discovery, WASM model runtime, automatic load balancing.

Week 2 / Planned
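The intended one-tag opt-in could look like the fragment below. The script URL and the data attributes are assumptions for illustration; the SDK is not yet published:

```html
<!-- Hedged sketch: embed mesh.js to turn this page into a compute node.
     src URL and attribute names are placeholders. -->
<script src="https://blackroad.io/mesh.js" async
        data-mesh-opt-in="true"
        data-mesh-max-gpu="0.25"></script>
```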
03

Chat Product + API

blackroad.io/chat as the first mesh-powered product. Free AI chat where your browser contributes compute. OpenAI-compatible API at 50% of centralized-provider cost, with a 70/30 compute-revenue split.

Week 3+ / Planned
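Because the API is OpenAI-compatible, a client call would follow the standard /v1/chat/completions shape. The base URL below is an assumption (the real endpoint is not published yet); the request body follows the standard wire format:

```javascript
// Hedged sketch: build an OpenAI-compatible chat request. Only the
// request object is constructed here; sending it is left to fetch().
function buildChatRequest(apiKey, model, messages) {
  return {
    url: "https://api.blackroad.io/v1/chat/completions", // assumed endpoint
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({ model, messages, stream: true }),
  };
}
```

Existing OpenAI client libraries would only need their base URL pointed at the mesh gateway.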
Economics

Mesh Economics

Compute contributors earn. API consumers save. The mesh grows itself.

API Revenue

OpenAI-compatible inference API priced at 50% of centralized providers. Requests are distributed across the mesh, reducing per-query cost through shared compute.

50% price reduction
Compute Split

70% of API revenue flows to compute contributors. Browser nodes earn proportional to GPU cycles contributed. Pi fleet operators earn a baseline allocation.

70/30 contributor split
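The split can be sketched as simple proportional accounting: 70% of revenue forms the contributor pool, browser nodes divide it by GPU cycles contributed, and the Pi-operator baseline is modeled here as a flat carve-out from the pool. The 10% baseline fraction and the exact accounting are assumptions for illustration:

```javascript
// Hedged sketch of the 70/30 economics. `contributions` maps a
// contributor id to GPU cycles contributed in the period.
function payouts(revenue, contributions, fleetBaselineShare = 0.1) {
  const pool = revenue * 0.7;                       // contributor pool
  const base = pool * fleetBaselineShare;           // Pi fleet baseline (assumed 10%)
  const browserPool = pool - base;
  const total = Object.values(contributions).reduce((a, b) => a + b, 0);
  const out = { fleetBaseline: base, platform: revenue * 0.3 }; // 30% retained (assumption)
  for (const [id, cycles] of Object.entries(contributions)) {
    out[id] = total ? (browserPool * cycles) / total : 0;
  }
  return out;
}
```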

Hardware Kits

Pre-configured Raspberry Pi 5 + Hailo-8 node kits. Plug in, connect to mesh, start earning. Full BlackRoad OS pre-installed with auto-bootstrap.

$199 starter kit

Join the Mesh

Contribute compute from your browser. Run a node on sovereign hardware. Build on the most distributed AI inference network.