PromptSpec 1.0.5
PromptSpec
A C# library for managing prompt templates with YAML-based configuration, parameter validation, and dynamic content replacement for LLM (Large Language Model) applications.
Installation
Install the PromptSpec NuGet package:
Package Manager Console
Install-Package PromptSpec
.NET CLI
dotnet add package PromptSpec
Package Reference
<PackageReference Include="PromptSpec" Version="1.0.5" />
Features
- 📝 YAML-based prompt definitions - Define and organize your prompts in human-readable YAML files
- 🔧 Dynamic placeholder replacement - Replace placeholders with runtime values using simple {key} syntax
- ✅ Type validation - Validate placeholder values against expected types (string, number, boolean)
- 🎯 Required field validation - Ensure critical placeholders are always provided
- ⚙️ LLM parameter management - Built-in support for common parameters like temperature, topP, maxTokens
- 🔌 Model-specific configuration - Flexible key-value store for model-specific parameters
- 🚀 Async loading - Asynchronous template loading for better performance
- 📋 Multiple output formats - Support for text, JSON, XML, and custom output formats
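To make the `{key}` syntax concrete, here is a minimal, self-contained sketch of the kind of substitution the feature list describes. This is an illustration only, not PromptSpec's actual implementation; the `Fill` helper is hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

class PlaceholderDemo
{
    // Hypothetical helper: replaces {key} tokens with values from a dictionary,
    // leaving any unknown placeholders untouched.
    static string Fill(string template, IReadOnlyDictionary<string, object> values) =>
        Regex.Replace(template, @"\{(\w+)\}", m =>
            values.TryGetValue(m.Groups[1].Value, out var v)
                ? v?.ToString() ?? string.Empty
                : m.Value);

    static void Main()
    {
        var result = Fill("Hello {name}, welcome to {platform}!",
            new Dictionary<string, object> { ["name"] = "Alice", ["platform"] = "PromptSpec" });
        Console.WriteLine(result); // Hello Alice, welcome to PromptSpec!
    }
}
```

PromptSpec layers required-field and type validation on top of this basic substitution step, as shown in the examples below.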
Quick Start
1. Create a YAML Template File
Create a templates.yaml file in your project:
```yaml
prompts:
  - name: "greeting"
    version: "1.0"
    description: "A simple greeting prompt"
    template: "Hello {name}, welcome to {platform}!"
    parameters:
      temperature: 0.7
      maxTokens: 50
    placeholders:
      name:
        type: "string"
        required: true
      platform:
        type: "string"
        required: true
```
2. Load and Use Templates
```csharp
using PromptSpec.Core;

// Create a prompt manager
var manager = new PromptManager();

// Load templates from YAML
await manager.LoadTemplatesFromFileAsync("templates.yaml");

// Get a specific template
var template = manager.GetPrompt("greeting");

// Generate the prompt with data
var replacements = new Dictionary<string, object>
{
    { "name", "Alice" },
    { "platform", "PromptSpec" }
};

var prompt = template.GeneratePrompt(replacements);
// Output: "Hello Alice, welcome to PromptSpec!"

// Get LLM parameters
var parameters = template.GetParameters();
Console.WriteLine($"Temperature: {parameters.Temperature}"); // 0.7
Console.WriteLine($"Max Tokens: {parameters.MaxTokens}");    // 50
```
Advanced Example
```yaml
prompts:
  - name: "product_review_summary"
    version: "1.0"
    description: "Summarizes a product review for analysis"
    systemMessage: "You are a marketing analyst. Summarize the following product review."
    template: |
      Review ID: {reviewId}
      Customer: {customerName}
      Rating: {rating}/5
      Review Text:
      {reviewText}
      Provide a concise summary focusing on key points.
    parameters:
      temperature: 0.2
      maxTokens: 150
      topP: 0.9
    placeholders:
      reviewId:
        type: "string"
        required: true
      customerName:
        type: "string"
        required: false
      rating:
        type: "number"
        required: true
      reviewText:
        type: "string"
        required: true
```
```csharp
using PromptSpec.Core;
using PromptSpec.Exceptions;

var manager = new PromptManager();
await manager.LoadTemplatesFromFileAsync("templates.yaml");

var template = manager.GetPrompt("product_review_summary");

var reviewData = new Dictionary<string, object>
{
    { "reviewId", "REV-12345" },
    { "customerName", "John Doe" },
    { "rating", 5 },
    { "reviewText", "This product exceeded my expectations! Outstanding quality." }
};

try
{
    var prompt = template.GeneratePrompt(reviewData);
    Console.WriteLine("Generated Prompt:");
    Console.WriteLine(prompt);

    // Get system message for your LLM client
    Console.WriteLine($"System Message: {template.SystemMessage}");

    // Get parameters for your LLM API call
    var parameters = template.GetParameters();
    Console.WriteLine($"Temperature: {parameters.Temperature}");
    Console.WriteLine($"Max Tokens: {parameters.MaxTokens}");
}
catch (MissingPlaceholderException ex)
{
    Console.WriteLine($"Missing required field: {ex.PlaceholderName}");
}
catch (PlaceholderTypeException ex)
{
    Console.WriteLine($"Type error: {ex.PlaceholderName} should be {ex.ExpectedType}");
}
```
API Overview
PromptManager
- `LoadTemplatesAsync(string yamlContent)` - Load from YAML string
- `LoadTemplatesFromFileAsync(string filePath)` - Load from file
- `GetPrompt(string name)` - Get template (null if not found)
- `GetRequiredPrompt(string name)` - Get template (throws if not found)
- `HasPrompt(string name)` - Check if template exists
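The two lookup methods differ in how they report a missing template, so a typical call site might look like this (a sketch based on the method list above; it assumes a `manager` that has already loaded its templates):

```csharp
using PromptSpec.Core;

// GetPrompt returns null for an unknown name; check before use.
var optional = manager.GetPrompt("maybe_missing");
if (optional is null)
{
    Console.WriteLine("Template not found, falling back to a default prompt.");
}

// GetRequiredPrompt throws instead; prefer it when a missing template
// indicates a programming or deployment error rather than a runtime condition.
var required = manager.GetRequiredPrompt("greeting");
```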
PromptTemplate
- `GeneratePrompt(Dictionary<string, object> replacements)` - Generate prompt
- `GetParameters()` - Get LLM parameters
- `GetModelConfig()` - Get model-specific config
- `Name`, `Version`, `Description` - Template metadata
- `SystemMessage` - System message for LLM
- `OutputFormat` - Expected output format
Exception Handling
- `PromptValidationException` - YAML parsing/validation errors
- `MissingPlaceholderException` - Required placeholder missing
- `PlaceholderTypeException` - Type validation failed
- `PromptNotFoundException` - Template not found
Integration Examples
With OpenAI Client
```csharp
var template = manager.GetPrompt("my_prompt");
var prompt = template.GeneratePrompt(data);
var parameters = template.GetParameters();

var response = await openAiClient.GetChatCompletionsAsync(
    new ChatCompletionsOptions
    {
        Messages = { new ChatMessage(ChatRole.System, template.SystemMessage),
                     new ChatMessage(ChatRole.User, prompt) },
        Temperature = (float?)parameters.Temperature,
        MaxTokens = parameters.MaxTokens
    });
```
With Azure OpenAI
```csharp
var template = manager.GetPrompt("analysis_prompt");
var prompt = template.GeneratePrompt(analysisData);
var parameters = template.GetParameters();

var chatOptions = new ChatCompletionsOptions()
{
    Temperature = (float?)parameters.Temperature,
    MaxTokens = parameters.MaxTokens,
    NucleusSamplingFactor = (float?)parameters.TopP
};
chatOptions.Messages.Add(new ChatMessage(ChatRole.System, template.SystemMessage));
chatOptions.Messages.Add(new ChatMessage(ChatRole.User, prompt));
```
Best Practices
- Load templates once at application startup
- Use descriptive names for prompts and placeholders
- Always handle exceptions when generating prompts
- Set appropriate types for placeholder validation
- Use conservative temperature values for consistent outputs
- Cache PromptManager instances for better performance
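One way to follow the "load once at startup" and "cache PromptManager instances" advice in a dependency-injection app is to register a single preloaded manager. This is a sketch assuming Microsoft.Extensions.DependencyInjection; PromptSpec itself does not ship DI helpers, so the registration pattern here is the author's own:

```csharp
using Microsoft.Extensions.DependencyInjection;
using PromptSpec.Core;

var services = new ServiceCollection();

// Register one PromptManager for the whole app, with templates loaded once.
services.AddSingleton<PromptManager>(_ =>
{
    var manager = new PromptManager();
    // Blocking on the async load is acceptable here, since this factory
    // runs once during composition-root setup.
    manager.LoadTemplatesFromFileAsync("templates.yaml").GetAwaiter().GetResult();
    return manager;
});

using var provider = services.BuildServiceProvider();
var promptManager = provider.GetRequiredService<PromptManager>();
```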
Requirements
- .NET 9.0 or later
- YamlDotNet (automatically included)
Documentation
For complete documentation, examples, and advanced usage patterns, visit the GitHub repository.
License
MIT License - see the LICENSE file for details.
Support
- GitHub Issues - Bug reports and feature requests
- GitHub Discussions - Questions and community support