CoreTemplate.AI
1.2.0
.NET CLI:
dotnet add package CoreTemplate.AI --version 1.2.0

Package Manager:
NuGet\Install-Package CoreTemplate.AI -Version 1.2.0

PackageReference:
<PackageReference Include="CoreTemplate.AI" Version="1.2.0" />

Central Package Management (PackageVersion):
<PackageVersion Include="CoreTemplate.AI" Version="1.2.0" />
<PackageReference Include="CoreTemplate.AI" />

Paket CLI:
paket add CoreTemplate.AI --version 1.2.0

Script & Interactive:
#r "nuget: CoreTemplate.AI, 1.2.0"

File-based apps:
#:package CoreTemplate.AI@1.2.0

Cake Addin:
#addin nuget:?package=CoreTemplate.AI&version=1.2.0

Cake Tool:
#tool nuget:?package=CoreTemplate.AI&version=1.2.0
Streaming Support in CoreTemplate.AI (v1.2.0 and later)
Starting with v1.2.0, CoreTemplate.AI supports real-time streaming responses via IAsyncEnumerable<string> for both the OpenRouter and Ollama providers.
This lets you stream LLM-generated text token by token or line by line as it is generated, which is ideal for chat apps, assistants, and long-form content.
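Because the stream is surfaced as IAsyncEnumerable<string>, callers consume it with a plain await foreach loop. A minimal sketch, assuming an IAIService instance named aiService and an already-built options object are in scope (the exact options type is not shown here):

// Minimal consumption sketch: aiService (IAIService) and options are assumed to be in scope.
await foreach (var chunk in aiService.StreamPromptAsync("Explain streaming in one sentence", options))
{
    Console.Write(chunk); // each chunk is a token or line emitted by the provider
}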
No breaking changes
Program.cs Setup (❗No extra code required if already integrated)
If you've already registered IAIService, no additional setup is needed.
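If you are wiring it up for the first time, registration is ordinary ASP.NET Core dependency injection. The sketch below is only an assumption of what that might look like; OpenRouterAIService is a hypothetical implementation name, not a confirmed type in the package.

// Program.cs sketch. IAIService comes from CoreTemplate.AI;
// OpenRouterAIService is a hypothetical implementation name used for illustration.
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();
builder.Services.AddScoped<IAIService, OpenRouterAIService>();

var app = builder.Build();
app.MapControllers();
app.Run();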
Streaming API Example (Minimal ASP.NET Controller)
[HttpPost("stream")]
public async Task StreamPrompt([FromBody] PromptTextCommand command)
{
    Response.ContentType = "text/plain";

    await foreach (var chunk in _aiService.StreamPromptAsync(command.Prompt, command.Options))
    {
        // Write and flush each chunk as soon as it arrives so the client sees output immediately
        var buffer = Encoding.UTF8.GetBytes(chunk);
        await Response.Body.WriteAsync(buffer);
        await Response.Body.FlushAsync();
    }
}
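The per-chunk Response.Body.FlushAsync() call is what makes this behave like a stream: without it, ASP.NET Core may buffer the output and the client would only see the full text once the response completes.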
Test It with curl
curl -k https://localhost:<port>/api/ai/stream \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Tell me a fun fact about the moon",
    "options": {
      "provider": "Ollama",
      "model": "mistral"
    }
  }'
Why Use Streaming?
Better UX (instant feedback)
Supports long prompts/results
Ideal for AI chat / assistants
Works seamlessly with Blazor, JS, React, etc.
Integration Tip
You can use StreamPromptAsync(...) anywhere IAIService is injected — Blazor components, API controllers, or background workers.
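For example, a hosted background worker could collect a streamed response as in the sketch below; the SummaryWorker class, the prompt text, and the null options argument are illustrative assumptions, and only IAIService.StreamPromptAsync comes from the package.

using System.Text;

// Background worker sketch. Only IAIService.StreamPromptAsync is from CoreTemplate.AI;
// the class, prompt, and null options (assumed to mean provider defaults) are illustrative.
public sealed class SummaryWorker : BackgroundService
{
    private readonly IAIService _aiService;
    private readonly ILogger<SummaryWorker> _logger;

    public SummaryWorker(IAIService aiService, ILogger<SummaryWorker> logger)
    {
        _aiService = aiService;
        _logger = logger;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        var result = new StringBuilder();
        await foreach (var chunk in _aiService.StreamPromptAsync("Summarize today's activity", null))
        {
            result.Append(chunk);
        }
        _logger.LogInformation("AI summary: {Summary}", result.ToString());
    }
}

Register the worker with builder.Services.AddHostedService<SummaryWorker>() alongside the IAIService registration.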
Version Requirement
Streaming is available starting from:
<PackageReference Include="CoreTemplate.AI" Version="1.2.0" />
⚠️ Earlier versions do not support streaming.
Happy Streaming! Let your AI speak as it thinks
Product | Compatible and additional computed target framework versions
---|---
.NET | net9.0 is compatible. net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed.
Dependencies (net9.0)
- MediatR (>= 9.0.0)
- Microsoft.Extensions.Configuration.Abstractions (>= 9.0.6)
- Microsoft.Extensions.Options (>= 9.0.6)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.