1. WebAssembly Modules (The Logic)
In legacy protocols (like MCP), servers define static “Tools” (e.g., calculate_sum, read_log), and the AI instructs the server to execute them.
In NMP, the Server does not need to pre-program endless tools. Instead, it exposes an Execution Interface.
The Agent sends a completely dynamic WebAssembly Module (.wasm) containing its own novel logic.
How it works:
- The Agent compiles its reasoning (e.g., “Find all error logs containing IP 192.168.1.1 and group them by timestamp”) into a tiny, cross-platform WASM binary blob.
- The NMP Server receives this blob via gRPC.
- Instead of executing it natively, the Server spins up an ephemeral WASI Sandbox (using Bytecode Alliance’s Wasmtime) and injects the WASM module into it.
- Extreme Scalability: Both the Rust backend and the TS SDK leverage advanced threading models (Native OS Threads and Node.js piscina Worker Pools) to process thousands of these sandboxes concurrently without blocking the Main Event Loop.
2. Capabilities (The Resources)
If the server executes arbitrary code via WASM, how is it secure? Through Capability-Based Security. Capabilities are the NMP equivalent of MCP “Resources”, but radically more secure. By default, a running WASM module in NMP has absolutely zero access to anything:
- No file system read/write.
- No network socket access.
- No environment variables.
- No system clock.
Granting Access
When an Agent sends a Module, it must specify the Capabilities it requires to function. The Server verifies these requests against its configured manifest. If approved, the Server dynamically maps specific file descriptors directly into the Sandbox memory space. For example, a Capability might be: READ_ONLY_ACCESS: /var/logs/nginx/.
The WASM module can now read the nginx logs at blinding speed, but if the malicious logic attempts to read /etc/passwd, the Sandboxing Engine will trigger an immediate, uncatchable WASI Trap, instantly terminating the agent’s logic.
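The manifest check described above can be sketched as a pure function. The capability names and shapes here are assumptions for illustration, not the actual NMP manifest format; the point is that every requested grant must match the server's configured allow-list before the sandbox boots.

```typescript
// Sketch: deny-by-default capability verification. Type and field names are
// illustrative assumptions, not the real NMP manifest schema.
type Capability = { kind: "READ_ONLY_ACCESS" | "READ_WRITE_ACCESS"; path: string };

// The server's configured manifest: the only grants it will ever approve.
const serverManifest: Capability[] = [
  { kind: "READ_ONLY_ACCESS", path: "/var/logs/nginx/" },
];

export function approve(requested: Capability[], manifest: Capability[]): boolean {
  // Every requested capability must fall inside a manifest entry;
  // anything else (e.g. /etc/passwd) is rejected before execution starts.
  return requested.every((req) =>
    manifest.some((m) => m.kind === req.kind && req.path.startsWith(m.path))
  );
}
```

A request for `/var/logs/nginx/access.log` passes; a request for `/etc/passwd` is refused outright.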
3. Watchdogs (Persistent Asynchronous Events)
One of NMP’s most powerful features is Watchdogs, superseding the concept of simple prompts or manual polling. AI applications often need to monitor a server (e.g., “Tell me when memory usage exceeds 90%”). In HTTP/REST or JSON-RPC, the AI must constantly send manual requests every 5 seconds, wasting massive network bandwidth and LLM tokens. NMP’s push architecture solves this:
- The Agent sends a Watchdog WASM Module to the Data Node.
- The Server parses it and lets it “sleep” perpetually in a passive, low-resource background thread.
- The WASM module connects locally to the system streams.
- The moment the condition is met (Memory > 90%), the WASM module wakes up and pushes an asynchronous event straight down the permanently open multiplexed QUIC connection back to the Agent.
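The wake-and-push cycle above can be sketched as a resident check plus an event channel. The 90% threshold comes from the example in the text; the event name and the use of an `EventEmitter` as a stand-in for the open QUIC stream are illustrative assumptions.

```typescript
import { EventEmitter } from "node:events";

// Sketch of the watchdog wake-up: a passive check that stays silent until its
// condition holds, then pushes an event to the already-open channel.
export function memoryConditionMet(
  usedBytes: number,
  totalBytes: number,
  threshold = 0.9, // 90%, per the example in the text
): boolean {
  return usedBytes / totalBytes > threshold;
}

export function pollOnce(bus: EventEmitter, used: number, total: number): void {
  if (memoryConditionMet(used, total)) {
    // In NMP this event travels down the multiplexed QUIC connection;
    // here an EventEmitter stands in for that channel.
    bus.emit("watchdog:memory", { used, total });
  }
}
```

The Agent never polls: nothing crosses the wire until the condition is true.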
4. Trust & Evidence (Enterprise TEE Execution Sandbox)
In Tier-0 Enterprise Environments, the Neural Mesh Server fundamentally evolves past standard OS-level sandboxing. The server provides Mathematical and Physical evidence utilizing a TEE (Trusted Execution Environment) Sandbox.

1. Host Bounds & Checks (Guardian AST & Egress Filter)
Before any logic touches the execution engine, the outermost Host Server interceptor unlocks the .wasm payload from its Kyber768 / AES-256-GCM transport encryption. Instantly, the Guardian AST Sentinel parses the WebAssembly Abstract Syntax Tree. If it detects attempted imports of arbitrary Host functions outside the authorized wasi_snapshot_preview1 bounds, the payload is purged before entry.
Upon successful execution, a Layer 3 Egress Filter mathematically analyzes the outgoing buffer to prevent PII or sensitive schema data exfiltration prior to transmission.
2. Hardware Isolated Enclave (TEE)
The core computation does not run in the Host OS. It is pushed into a physical Hardware Enclave (such as AWS Nitro Enclaves or Intel SGX). This ensures Blind Computation: the Host’s RAM is hardware-encrypted, meaning not even the Server Administrator or Cloud Provider can dump the memory to steal the Agent’s reasoning.

3. Wasmtime Engine & Fuel Monitor
Inside the TEE, the Wasmtime engine boots the WASI context. Since WebAssembly is Turing Complete, it could theoretically execute an infinite loop to starve the server. The Fuel Monitor defends against this by enforcing .consume_fuel(). Every operation deducts “fuel” from a fixed budget; if it is exhausted, the engine unconditionally kills the execution thread, repelling DoS bombs.
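Fuel metering is a Wasmtime-native feature (enabled via its `consume_fuel` configuration); the TypeScript sketch below only models the idea, so the class and cost values are illustrative, not part of any real API. Each operation deducts from a fixed budget, and exhaustion terminates execution.

```typescript
// Conceptual model of fuel metering. Wasmtime implements this natively;
// this class exists only to make the mechanism concrete.
export class FuelMeter {
  constructor(private fuel: number) {}

  consume(cost = 1): void {
    this.fuel -= cost;
    if (this.fuel < 0) {
      // The engine kills the thread rather than let a loop starve the server.
      throw new Error("fuel exhausted: execution terminated");
    }
  }
}

// Example: a metered loop that would otherwise be free to spin forever.
export function runMetered(iterations: number, budget: number): number {
  const meter = new FuelMeter(budget);
  let i = 0;
  while (i < iterations) {
    meter.consume(); // one unit of fuel per loop iteration (assumed cost)
    i++;
  }
  return i;
}
```

A loop that fits its budget completes; one that exceeds it is cut off mid-flight, which is exactly the DoS defense described above.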
4. ZK Prover (Mathematical Certainty)
When the .wasm finishes outputting data, the result passes to the Zero-Knowledge Prover (e.g., RISC Zero integration). This module generates a mathematical Receipt (consisting of a Journal and a cryptographic Seal). The returned packet proves unconditionally to the Agent that the specific logic payload executed flawlessly, and the output was not tampered with by the execution Node.
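The Receipt's shape and the verify-before-trust flow can be sketched as follows. To be clear: a real ZK receipt (e.g., RISC Zero's) cryptographically binds the output to a proof that a specific guest program ran, and an HMAC does not do that. The HMAC here is only a stand-in so the Journal-plus-Seal structure and the Agent-side verification step are concrete.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Stand-in for a ZK receipt: Journal = public output, Seal = binding value.
// A real seal is a zero-knowledge proof, NOT an HMAC; this only models shape.
export interface Receipt {
  journal: string; // the public output of the execution
  seal: string;    // value binding the journal to the execution
}

export function makeReceipt(journal: string, key: string): Receipt {
  return {
    journal,
    seal: createHmac("sha256", key).update(journal).digest("hex"),
  };
}

export function verifyReceipt(r: Receipt, key: string): boolean {
  // The Agent recomputes the binding and rejects any tampered journal.
  const expected = createHmac("sha256", key).update(r.journal).digest("hex");
  return timingSafeEqual(Buffer.from(expected), Buffer.from(r.seal));
}
```

The Agent trusts the output only after verification succeeds; a tampered Journal fails the check.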
5. TypeScript Server & MCP Bridge (The SDK Node)
While the Rust Dataplane dominates heavy WASI Sandboxing and Mathematical Evidence, the Neural Mesh Ecosystem also provides a lightweight, pure-TypeScript Server implementation (@nekzus/neural-mesh).
This Server acts primarily as a backward-compatibility layer and local endpoint for Agent logic.
How the Bridge handles Legacy MCP Clients:
- JSON-RPC Interception: Legacy tools (like older MCP Clients) dispatch standard text-heavy tools/call JSON-RPC 2.0 requests over traditional network ports.
- NmpMcpBridge Adapter: The SDK intercepts these primitive payloads and translates them into an internalized CallToolRequest. It also flawlessly exposes registered resources/list and resources/read endpoints, enabling Zero-Shot Self-Discovery of Data Schemas for generic LLMs.
- NmpServer Zod Validation: Before any logic touches the Node.js Main Thread, the NmpServer imposes strict Zod schema runtime validations ensuring structural integrity.
- Native Node.js Execution: The registered Tool Handler executes locally (using standard Node.js/TypeScript code) rather than isolating itself in a sandboxed WebAssembly boundary.
- JSON-RPC Response: The finding returns seamlessly down the pipeline formatted as a legacy result block.
This allows developers to adopt @nekzus/neural-mesh and serve their existing Tools into the NMP ecosystem with Zero-Code modifications, gradually transitioning towards full WASM Logic-on-Origin payloads when required.
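The interception-and-translation step in the bridge can be sketched as a small mapping function. Only the JSON-RPC 2.0 envelope shape is standard; the internal `CallToolRequest` fields shown here are assumptions for illustration, not the SDK's actual types.

```typescript
// Sketch: translating a legacy JSON-RPC 2.0 `tools/call` payload into an
// internal request. Internal field names are illustrative assumptions.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number | string;
  method: string;
  params?: { name: string; arguments?: Record<string, unknown> };
}

interface CallToolRequest {
  tool: string;
  args: Record<string, unknown>;
}

export function translate(req: JsonRpcRequest): CallToolRequest {
  if (req.jsonrpc !== "2.0" || req.method !== "tools/call" || !req.params) {
    throw new Error("not a tools/call request");
  }
  return { tool: req.params.name, args: req.params.arguments ?? {} };
}
```

After local execution, the bridge would wrap the handler's output back into a JSON-RPC `result` block keyed to the original `id`.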
6. Advanced Security & Data Dictionary
To ensure the highest level of Zero-Shot Autonomy (where the AI makes its own plans), NMP Servers must provide metadata about their internal structures.

Data Dictionary (Anti-Hallucination)
When an Agent executes foreign logic, it might “hallucinate” fields that don’t exist (e.g., trying to access gender in a database that only has age). To prevent this, developers should use the dataDictionary method:
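A minimal sketch of the idea follows. The exact dataDictionary signature in @nekzus/neural-mesh is not shown in this document, so a plain object and a prompt-rendering helper stand in for it; the field names are hypothetical examples.

```typescript
// Sketch: a developer-declared data dictionary rendered into the system
// prompt. The shape is an assumption; the SDK's real API may differ.
export type DataDictionary = Record<string, string>;

const patientFields: DataDictionary = {
  age: "integer, patient age in years",
  diagnosis: "string, primary diagnosis code",
  // note: no `gender` field exists, so the Agent must never reference one
};

export function renderDictionary(dict: DataDictionary): string {
  const lines = Object.entries(dict).map(([key, desc]) => `- ${key}: ${desc}`);
  return ["Available fields (use ONLY these):", ...lines].join("\n");
}
```

The rendered text enumerates every legal field, giving the LLM no room to invent one.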
This dictionary is injected into the nmp_blind_analyst System Prompt. By explicitly detailing the available fields, NMP forces the LLM to adopt a Strict Schema Adherence policy, drastically reducing hallucination risks during logic generation.
Sandbox Data Injection
While dataDictionary teaches the Agent what the data looks like, you must actually inject the real data into the Execution Sandbox for the WASM/JS logic to process.
Use the setSandboxData method to load your context (e.g., from a database or JSON file) into the server’s memory:
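A sketch of the injection step, assuming a minimal stand-in class: the setSandboxData name comes from the text above, but the surrounding class and the example payload are illustrative, not the real server implementation.

```typescript
// Stand-in for the server's sandbox memory. Only the setSandboxData name is
// taken from the text; everything else is an illustrative assumption.
export class SandboxStore {
  private data: unknown = null;

  setSandboxData(data: unknown): void {
    // In the real server this context is mapped into the execution sandbox;
    // here it is simply held in memory for the logic to read.
    this.data = data;
  }

  getSandboxData(): unknown {
    return this.data;
  }
}
```

In practice the payload would come from a database query or a JSON file read at startup, e.g. `store.setSandboxData(JSON.parse(fileContents))`.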
PII Forbidden Keys
You can dynamically restrict specific fields from ever leaving the server, regardless of the logic’s intent. Pass the forbiddenKeys array during server instantiation:
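The scan itself can be sketched as a recursive walk over every return object. The forbiddenKeys option name comes from the text; the class around it, the example key list, and the case-insensitive substring matching rule are assumptions for illustration.

```typescript
// Sketch of the forbiddenKeys guard. Matching rule (case-insensitive
// substring) and the class itself are illustrative assumptions.
export class EgressGuard {
  constructor(private forbiddenKeys: string[]) {}

  // Recursively strip any key matching a forbidden term, however deeply nested.
  scrub(value: unknown): unknown {
    if (Array.isArray(value)) return value.map((v) => this.scrub(v));
    if (value !== null && typeof value === "object") {
      const out: Record<string, unknown> = {};
      for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
        if (this.forbiddenKeys.some((f) => k.toLowerCase().includes(f.toLowerCase()))) {
          continue; // forbidden key: never leaves the server
        }
        out[k] = this.scrub(v);
      }
      return out;
    }
    return value;
  }
}

// Hypothetical instantiation mirroring the text:
const guard = new EgressGuard(["ssn", "password", "creditCard"]);
```

Even a key buried several objects deep is removed before the response is transmitted.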
The server will then scan every return object and block any key matching these terms, even if it is nested deep within a JSON structure.
Dynamic Return Structure (Native Auto-i18n)
NMP introduces an architectural pattern where cross-border language translation friction is non-existent. When a Data Node registers its capabilities, it intrinsically embeds a structural directive into the payload definition (DYNAMIC RETURN STRUCTURE). This forces the executing agent to output JSON schemas using keys mathematically mapped to the exact language spoken by the user on the initial prompt.
- If the user queries “¿Cuántos pacientes tienen hipertensión?” (“How many patients have hypertension?”), the Node naturally receives a response matching native Spanish taxonomy (e.g., {"cantidad": 25, "edad_promedio": 45}).
- This entirely eliminates the need for heavyweight frontend internationalization (i18n) libraries, dropping the cognitive load of data parsing right at the Origin without hallucination risks.