Quickstart
nilAI lets you run AI models inside a trusted execution environment (TEE) via the SecretLLM SDK, so you can build new private AI applications or migrate existing ones to a secure nilAI node while keeping your data private.
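To give a sense of what a SecretLLM request looks like, here is a minimal TypeScript sketch that calls a nilAI node, assuming it exposes an OpenAI-compatible chat completions endpoint. The base URL, model id, and API key below are placeholders, not values from this quickstart; the example app in the next section handles this for you.

```typescript
// Minimal sketch of querying a nilAI node (assumes an OpenAI-compatible API).
// NILAI_BASE_URL, NILAI_API_KEY, and the model id are placeholders.
const NILAI_BASE_URL = "https://your-nilai-node.example.com/v1"; // placeholder
const NILAI_API_KEY = process.env.NILAI_API_KEY ?? "";            // placeholder

async function chat(prompt: string): Promise<string> {
  const response = await fetch(`${NILAI_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${NILAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "meta-llama/Llama-3.1-8B-Instruct", // placeholder model id
      messages: [{ role: "user", content: prompt }],
    }),
  });

  if (!response.ok) {
    throw new Error(`nilAI request failed: ${response.status}`);
  }

  // OpenAI-style response shape: return the first choice's message content.
  const data = await response.json();
  return data.choices[0].message.content;
}

// Example usage
chat("Why does running inference in a TEE keep my data private?")
  .then(console.log)
  .catch(console.error);
```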
Getting Started
In this quickstart, we will interact with a private AI chat application built with Next.js. Start by cloning our examples repo:
gh repo clone NillionNetwork/blind-module-examples
cd blind-module-examples/nilai/secretllm_nextjs_nucs
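From here, you would typically install dependencies and start the development server as you would for any Next.js project (for example, with npm install followed by npm run dev); check the example's README for the exact steps and any required environment variables, such as your nilAI node URL and API credentials.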