Tutorial

How to store AI context in Node.js without turning your app into prompt spaghetti.

In most SaaS apps, the hardest part is not generating the answer. It is deciding what context matters and keeping that logic maintainable. The pattern below keeps context in your own Node.js app and uses BrainAPI only for the final AI request.

1. Save structured context

Keep summaries, preferences, and recent events in a shape your app can query quickly. Even a simple table is enough to start.

Example context record (TypeScript)
type UserContext = {
  userId: string;
  summary: string;
  preferences: string[];
  lastActions: string[];
};
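The "simple table" can start as an in-memory map keyed by `userId`. The sketch below is illustrative only: the `saveContext`/`loadContext` names and the `Map`-backed store are assumptions, and in production you would back this with a real database table.

```typescript
// Minimal in-memory store for the UserContext shape above — a sketch,
// not a production design. Swap the Map for a database table keyed by userId.
type UserContext = {
  userId: string;
  summary: string;
  preferences: string[];
  lastActions: string[];
};

const store = new Map<string, UserContext>();

function saveContext(ctx: UserContext): void {
  store.set(ctx.userId, ctx);
}

function loadContext(userId: string): UserContext {
  // Fall back to an empty record for first-time users.
  return (
    store.get(userId) ?? {
      userId,
      summary: "",
      preferences: [],
      lastActions: []
    }
  );
}
```

Because the record is plain data, the same shape works whether it lives in memory, Postgres, or a key-value store.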

2. Compose only the relevant slice

Context composer (TypeScript)
function buildPrompt(context: UserContext, userMessage: string) {
  return [
    "You are a helpful SaaS support assistant.",
    "User summary: " + context.summary,
    "Preferences: " + context.preferences.join(", "),
    "Recent actions: " + context.lastActions.join("; "),
    "Latest message: " + userMessage
  ].join("\n\n");
}
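To see what the composer produces, here is a standalone run with the definitions repeated and illustrative sample values (the context contents below are made up for the example):

```typescript
// Same composer as above, repeated so this snippet runs on its own.
type UserContext = {
  userId: string;
  summary: string;
  preferences: string[];
  lastActions: string[];
};

function buildPrompt(context: UserContext, userMessage: string): string {
  return [
    "You are a helpful SaaS support assistant.",
    "User summary: " + context.summary,
    "Preferences: " + context.preferences.join(", "),
    "Recent actions: " + context.lastActions.join("; "),
    "Latest message: " + userMessage
  ].join("\n\n");
}

// Illustrative context values.
const prompt = buildPrompt(
  {
    userId: "u1",
    summary: "Long-time customer on the Pro plan",
    preferences: ["email updates"],
    lastActions: ["upgraded plan"]
  },
  "How do I export my data?"
);
// prompt is five labeled blocks separated by blank lines.
```

Keeping the composer a pure function makes it trivial to unit-test the exact prompt your users' context produces.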

3. Send the final request through BrainAPI

Unified API call (TypeScript)
const prompt = buildPrompt(context, userMessage);

const result = await fetch("https://api.brainapi.site/api/v1/ai", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-API-Key": process.env.BRAINAPI_KEY!
  },
  body: JSON.stringify({
    type: "text",
    input: prompt,
    mode: "auto",
    max_output_tokens: 220
  })
}).then((res) => res.json());
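In a real handler you will also want to catch HTTP failures before parsing JSON. A sketch of one way to structure that, splitting out a pure payload builder so it can be tested without a network call (the helper names are assumptions; the endpoint, headers, and body fields mirror the snippet above):

```typescript
// Pure builder for the request options, mirroring the call above.
function buildRequestInit(prompt: string, apiKey: string): RequestInit {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-API-Key": apiKey
    },
    body: JSON.stringify({
      type: "text",
      input: prompt,
      mode: "auto",
      max_output_tokens: 220
    })
  };
}

// Wrapper that surfaces HTTP errors instead of parsing an error page as JSON.
async function callBrainAPI(prompt: string, apiKey: string): Promise<unknown> {
  const res = await fetch(
    "https://api.brainapi.site/api/v1/ai",
    buildRequestInit(prompt, apiKey)
  );
  if (!res.ok) {
    throw new Error(`BrainAPI request failed: ${res.status}`);
  }
  return res.json();
}
```

Separating the payload from the transport also gives you one place to adjust fields like `max_output_tokens` as your prompts grow.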

4. Save the new summary

After the response, update your context record with the latest facts or summary. This keeps memory storage under your control and avoids locking it inside provider-specific abstractions.
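One way to sketch that update step: replace the summary with the latest one and append the newest action, capping the history so the record stays small. The function name and the cap of 10 are illustrative assumptions, not part of any API.

```typescript
type UserContext = {
  userId: string;
  summary: string;
  preferences: string[];
  lastActions: string[];
};

// Illustrative post-response update: new summary in, newest action
// appended, history capped at the 10 most recent entries.
function updateContext(
  ctx: UserContext,
  newSummary: string,
  latestAction: string
): UserContext {
  return {
    ...ctx,
    summary: newSummary,
    lastActions: [...ctx.lastActions, latestAction].slice(-10)
  };
}
```

Returning a new object instead of mutating in place keeps the update easy to persist atomically with whatever store you chose in step 1.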

Related

Need product positioning as well?

See Chatbot Memory API for the commercial landing page and OpenAI Memory API Alternative for the comparison angle.