
Prompts are programs.

String concatenation is how prompt engineering starts. It is not how it ends. PromptTemplate ships a complete templating engine: three interchangeable syntaxes, typed variable substitution, a filter library, conditional blocks, loops, scoped contexts, and custom helpers. Compiled once, cached, and reused across every inference call. The same engine drives conversations, agents, skills, and document workflows.

Three template syntaxes · Compiled once, reused · Custom helpers

Variables & filters

Substitute typed values, apply chainable filters (trim, truncate, capitalise, escape), reference scoped data.

Conditionals & loops

#if, #unless, #each, #with. Branching prompts, list rendering, scoped sections.

Three syntaxes

Mustache {{ }}, dollar ${ }, percent % %. Pick the one that does not collide with your model's chat template.

Why a templating engine

Concatenation does not scale.

A demo prompt fits in a verbatim string. A production prompt has a user profile to inject, a tool list to render, retrieved passages to format, a system message to compose, a conditional persona, a few-shot block to select, locale-specific instructions, and a deterministic ordering. Every team eventually writes a templating engine. We wrote one for you.

Compiled and cached

Templates parse once, execute many times. Hot paths skip the parser entirely. Per-call render cost is the substitution itself, not the syntax.

Type-safe context

PromptTemplateContext binds typed values, collections, and nested objects. Missing keys surface at render time, not at inference time.
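A sketch of context binding under stated assumptions: `PromptTemplateContext` is named on this page, while the `Set` method shape is illustrative, not the documented API.

```csharp
using System;
using LMKit.TextGeneration.Prompts;

// PromptTemplateContext is named on this page; the Set(...) calls below
// are an ASSUMED shape for binding typed values and nested objects.
var ctx = new PromptTemplateContext();
ctx.Set("user", new { name = "Loic", locale = "fr-FR" });   // nested object
ctx.Set("tools", new[] { "weather", "calendar" });          // collection

var tpl = PromptTemplate.Compile("{{ user.name }} prefers {{ user.locale }}.");
string prompt = tpl.Render(ctx);

// A key the template references but the context never bound would fail
// here, at render time, before the prompt ever reaches the model.
```

The point of the binding step is the failure mode: an unbound key surfaces as a render error in your own code path, not as a silently malformed prompt at inference time.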

Three syntaxes, one engine

Pick the syntax that does not clash with your model's chat template. Same features, same compiled IR; swap by setting Syntax.
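A hedged sketch of switching syntaxes: the `Syntax` property is named on this page, but the `PromptTemplateSyntax` enum, its members, and the construct-then-set shape are illustrative assumptions.

```csharp
using LMKit.TextGeneration.Prompts;

// Same template logic in dollar syntax, so it cannot collide with a chat
// template that already uses {{ }} markers.
// PromptTemplateSyntax and its member names are ASSUMED, not documented here.
var tpl = new PromptTemplate("Hello ${ user.name } from ${ user.region | upper }.")
{
    Syntax = PromptTemplateSyntax.Dollar
};

string prompt = tpl.Render(new
{
    user = new { name = "Loic", region = "emea" }
});
```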

Filter library

Built-in filters (trim, truncate, upper, lower, capitalise, escape and more) chain naturally with the pipe operator.
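Filter chaining in practice, using only calls and filter names already shown on this page (`Compile`, `Render`, `trim`, `upper`, and the `truncate: n` argument form from the RAG example below):

```csharp
using LMKit.TextGeneration.Prompts;

// Filters apply left to right: trim whitespace first, then uppercase.
var tpl = PromptTemplate.Compile(
    "User: {{ name | trim | upper }}. Topic: {{ topic | truncate: 24 }}.");

string prompt = tpl.Render(new
{
    name  = "  loic  ",
    topic = "prompt templating engines and filters"
});
// name renders as "LOIC"; topic is cut to roughly 24 characters
// (the exact truncation marker, if any, is engine-defined).
```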

Custom helpers

Register named .NET delegates for project-specific transformations (currency formatting, locale-aware dates, redaction). Helpers receive the context and emit text.
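The page states that helpers are named .NET delegates that receive context and emit text; the `RegisterHelper` method name and the delegate signature below are illustrative assumptions, not the documented API.

```csharp
using System.Globalization;
using LMKit.TextGeneration.Prompts;

// HYPOTHETICAL registration call — the real LMKit API surface may differ.
// The helper turns a numeric value into a locale-aware currency string.
PromptTemplate.RegisterHelper("eur", value =>
    string.Format(CultureInfo.GetCultureInfo("fr-FR"), "{0:C}", value));

var tpl = PromptTemplate.Compile("Subscription: {{ price | eur }} per month.");
string prompt = tpl.Render(new { price = 9.99m });
```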

Across the SDK

Conversations consume templates for system prompts. Agent skills define them in markdown. RAG pipelines compose retrieved passages through them. Same primitive everywhere.

Three lines

From string to template.

Compile a Handlebars-style template once with branches and filters, then render it against many model inputs.

QuickStart.cs
using LMKit.TextGeneration.Prompts;

// Compile once.
var tpl = PromptTemplate.Compile("""
    You are talking to {{ user.name }} from {{ user.region | upper }}.
    Their preferred language is {{ user.locale }}.

    {{#if hasTools}}
    Tools available: {{ tools | join: ", " }}
    {{/if}}

    Answer the question below.
    """);

// Render many.
string prompt = tpl.Render(new
{
    user     = new { name = "Loic", region = "emea", locale = "fr-FR" },
    hasTools = true,
    tools    = new[] { "weather", "calendar" }
});

Real prompts

RAG context shaping in one template.

A common production pattern: build a system prompt that includes a user profile, conditionally injects tools, formats retrieved passages with source citations, and ends with the user's question. Reads top-down, edits in one place, ships to every inference call.

RagSystemPrompt.cs
var tpl = PromptTemplate.Compile("""
    You are an assistant for {{ user.name }} ({{ user.role }}).
    Locale: {{ user.locale }}. Be concise.

    {{#if passages}}
    The following passages may be relevant. Cite each by its index.

    {{#each passages}}
    [{{ @index }}] {{ source | truncate: 60 }}
    {{ text }}

    {{/each}}
    {{/if}}

    {{#unless allowSpeculation}}
    If the answer is not supported by the passages, say so explicitly.
    {{/unless}}

    Question: {{ query }}
    """);

string systemPrompt = tpl.Render(new
{
    user             = currentUser,
    passages         = retrievedPassages,
    allowSpeculation = false,
    query            = userQuestion
});

// Pass straight into a conversation.
chat.SystemMessage = systemPrompt;

Where templates ship

Every prompt worth maintaining.

Chatbot personalisation

User profile, locale, tier, and history are all injected through one template. Editable by product, not engineering.

RAG context shaping

Retrieved passages, citations, source attribution rendered consistently across every retrieval call.

Few-shot selection

Conditionally include examples per task type, locale, or difficulty. Loops over example collections.

Tool catalog rendering

Render the active tool set for the current agent role. Conditional on permission policy.

Skill instruction templates

SKILL.md files use the same engine. Skills receive context just like any other prompt surface.

A/B prompt experiments

Swap templates by name in a config. Track which template wins without redeploying code.
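Template selection by name is plain application code around the `Compile`/`Render` calls shown on this page; the hard-coded variant name below stands in for whatever config or feature-flag source you actually use.

```csharp
using System.Collections.Generic;
using LMKit.TextGeneration.Prompts;

// Two competing prompt variants, compiled once at startup.
var variants = new Dictionary<string, PromptTemplate>
{
    ["concise"]  = PromptTemplate.Compile("Answer briefly: {{ query }}"),
    ["detailed"] = PromptTemplate.Compile("Answer step by step, citing sources: {{ query }}"),
};

// The active variant name comes from config; flipping it needs no redeploy.
string active = "concise"; // e.g. read from appsettings or a feature flag
string prompt = variants[active].Render(new { query = "Why is the sky blue?" });
```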

Versus the alternatives

Strings, helpers, or a real engine.

String concatenation

Works for the demo. Becomes unmaintainable as soon as a prompt has three branches and a list. Conditional logic ends up scattered across the call site.

Generic templating libraries

Can substitute strings, but are not aware of conversations, retrieval passages, or tool catalogues. Custom helpers and integration are bring-your-own.

PromptTemplate

Built for prompts. Three syntaxes, filter library, conditionals, loops, scoping, custom helpers, compiled-once render path. Used internally by conversations, skills, and RAG pipelines.

Related capabilities

Templates plus the rest.

Structured content creation

Prompts produce free-form text; grammar-constrained generation produces typed objects. Compose them.

Structured generation

Agent skills

SKILL.md files use the same template engine. Markdown bodies receive context like any other prompt surface.

Agent skills

Document RAG

Retrieved passages render through templates so the prompt format stays consistent across queries.

Document RAG

Sampling controls

A clean prompt deserves clean sampling. Pair templates with logit biasing and dynamic sampling for full output control.

Sampling controls

Demos & docs

Build it. Read it. Try it.

Working console demos on GitHub, step-by-step how-to guides on the docs site, and the API reference for the classes used on this page.

Stop concatenating. Start composing.

Get Community Edition · Download