Private Prompts

Private prompts allow you to store sensitive system prompts in nilDB (Nillion's decentralized database) and use them with LLM inference without exposing the prompt content. This is useful for protecting proprietary instructions, custom behaviors, or sensitive context.

Overview

The private prompts flow involves:

  1. Storing prompts in nilDB: Upload your prompt to nilDB and receive document IDs
  2. Setting up delegation: Create a delegation token chain between the subscription owner and prompt data owner
  3. Using stored prompts: Make LLM requests that reference the stored prompt without exposing its content

Storing Prompts in nilDB

First, store your private prompt in nilDB. This returns document IDs and owner information that later requests will reference; a hedged sketch of this step appears below.

Note: The TypeScript example combines storing and using prompts in a single file. See the store_to_nildb() section:

examples/2-nildb-prompt-store-retrieve.ts
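
The snippet below is a minimal sketch of this step. It assumes a nilDB node exposes a document-creation endpoint at `/documents` and that the data owner authenticates with a bearer token; the URL, payload shape, and `StoreResult` type are illustrative placeholders rather than the real nilDB API, which the example file above demonstrates.

```typescript
// Hedged sketch: store a private prompt and collect the identifiers that
// later inference requests will reference. Endpoint path, auth scheme, and
// response shape are assumptions for illustration only.
const NILDB_NODE_URL = process.env.NILDB_NODE_URL!;         // base URL of a nilDB node (assumed)
const PROMPT_OWNER_TOKEN = process.env.PROMPT_OWNER_TOKEN!; // data owner credential (assumed)

interface StoreResult {
  documentIds: string[]; // IDs used to reference the stored prompt
  ownerDid: string;      // identity of the prompt data owner
}

async function storePrompt(promptText: string): Promise<StoreResult> {
  const response = await fetch(`${NILDB_NODE_URL}/documents`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${PROMPT_OWNER_TOKEN}`,
    },
    // Only nilDB ever sees the prompt text itself.
    body: JSON.stringify({ content: promptText }),
  });
  if (!response.ok) throw new Error(`Prompt storage failed: ${response.status}`);
  return (await response.json()) as StoreResult;
}
```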

Using Stored Prompts with Delegation

To use stored prompts, you need to set up a delegation token flow. This involves:

  1. A subscription owner server that manages API access
  2. A prompt data owner server that manages access to stored prompt documents
  3. A client that makes requests using delegation tokens

Note: The TypeScript example combines storing and using prompts in a single file. See the section after store_to_nildb(). A hedged sketch of the client-side flow follows the example reference below.

examples/2-nildb-prompt-store-retrieve.ts
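
As a complement to the example file, here is a sketch of the client side of this three-party flow, assuming the subscription owner and prompt data owner each run a small server. The endpoint paths (`/delegate`, `/grant`) and the request fields `prompt_document_ids` and `prompt_grant` are hypothetical names for illustration; consult the example for the actual token and request formats.

```typescript
// Hedged sketch of the client in the three-party delegation flow. All
// endpoint paths and payload field names below are assumptions.
const SUBSCRIPTION_SERVER = process.env.SUBSCRIPTION_SERVER_URL!; // subscription owner (assumed)
const PROMPT_OWNER_SERVER = process.env.PROMPT_OWNER_SERVER_URL!; // prompt data owner (assumed)
const NILAI_API_URL = process.env.NILAI_API_URL!;                 // LLM inference endpoint (assumed)

async function chatWithStoredPrompt(userMessage: string, documentIds: string[]) {
  // 1. Obtain a short-lived delegation token from the subscription owner.
  const { token } = await (
    await fetch(`${SUBSCRIPTION_SERVER}/delegate`, { method: "POST" })
  ).json();

  // 2. Ask the prompt data owner to authorize use of the stored documents.
  const { grant } = await (
    await fetch(`${PROMPT_OWNER_SERVER}/grant`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ documentIds }),
    })
  ).json();

  // 3. Call the LLM, referencing the prompt by document ID only; the prompt
  //    text is resolved server-side and never appears in this request.
  const response = await fetch(`${NILAI_API_URL}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      messages: [{ role: "user", content: userMessage }],
      prompt_document_ids: documentIds, // hypothetical field name
      prompt_grant: grant,              // hypothetical field name
    }),
  });
  return response.json();
}
```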

Important Notes

  • Store private keys securely: Keep private keys and stored prompt data in secure configuration files
  • Token expiration: Set appropriate expiration times and usage limits for delegation tokens (see the sketch after this list)
  • NUC authentication only: Prompt storage and delegation are available only with NUC authentication.
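
To make the expiration advice concrete, the fragment below shows one way a subscription owner server might scope a delegation token. The option names are assumptions for illustration, not the actual NUC API.

```typescript
// Hedged sketch: conservative delegation-token options. Field names are
// illustrative; check the NUC documentation for the real parameters.
const delegationOptions = {
  expiresInSeconds: 60, // short-lived: just long enough for one request round-trip
  maxUses: 1,           // single-use tokens limit the blast radius of a leak
};
```

Short lifetimes and single-use limits mean a leaked token becomes useless almost immediately.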

Use Cases

Private prompts are ideal for:

  • Proprietary AI assistants: Protect your custom system prompts and business logic
  • Sensitive instructions: Keep confidential context or data handling rules private
  • Multi-tenant applications: Different users can have different private prompts without exposing them
  • Compliance requirements: Ensure sensitive prompts never leave the secure environment