CoreTemplate.AI 1.1.0

    dotnet add package CoreTemplate.AI --version 1.1.0
# CoreTemplate.AI

Modular & Configurable AI Service Layer for .NET (OpenRouter + Ollama)

CoreTemplate.AI is a pluggable, strategy-based AI integration library for .NET Core projects. It lets you switch between cloud-based and local LLM providers, such as OpenRouter and Ollama, at runtime via configuration. It is designed around Clean Architecture, the Strategy Pattern, MediatR, and strong extensibility.
## Features

- Plug-and-play AI integration
- Provider switching via `appsettings.json` (OpenRouter, Ollama)
- Dynamic model override (per request)
- Model support validation (checks whether the model exists via the provider's real API)
- Fully testable, decoupled structure
- Built-in CQRS command: `PromptTextCommand` (MediatR)
## Installation
dotnet add package CoreTemplate.AI
Or add it manually to your `.csproj`:
<PackageReference Include="CoreTemplate.AI" Version="1.1.0" />
## Program.cs Setup
```csharp
// --- AI Services ---
builder.Services.AddOptions<AISettings>()
    .Bind(builder.Configuration.GetSection("AiSettings"))
    .Validate(settings => Enum.IsDefined(typeof(AIProvider), settings.Provider),
        "Invalid AI provider configured in AiSettings.Provider");

builder.Services.AddSingleton(sp =>
    sp.GetRequiredService<IOptions<AISettings>>().Value);

// Providers
builder.Services.AddScoped<OpenRouterAiService>();
builder.Services.AddScoped<OllamaAiService>();
builder.Services.AddScoped<IAIService, AIServiceResolver>();

// Model Providers
builder.Services.AddScoped<OpenRouterModelProvider>();
builder.Services.AddScoped<OllamaModelProvider>();
builder.Services.AddScoped<AIModelProviderResolver>();

// MediatR
builder.Services.AddMediatR(cfg =>
{
    cfg.RegisterServicesFromAssembly(typeof(PromptTextCommandHandler).Assembly);
});
```
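The registration above maps `IAIService` to `AIServiceResolver`, which picks the concrete provider from configuration. A minimal sketch of what that strategy selection might look like (the real implementation ships in the package; the constructor shape here is an assumption, only the type names come from the registrations above):

```csharp
// Hypothetical sketch of the strategy resolver: delegates every call to the
// provider selected by AISettings.Provider. Not the package's actual source.
public class AIServiceResolver : IAIService
{
    private readonly IAIService _inner;

    public AIServiceResolver(AISettings settings, IServiceProvider sp)
    {
        _inner = settings.Provider switch
        {
            AIProvider.OpenRouter => sp.GetRequiredService<OpenRouterAiService>(),
            AIProvider.Ollama => sp.GetRequiredService<OllamaAiService>(),
            _ => throw new InvalidOperationException(
                $"Unknown AI provider: {settings.Provider}")
        };
    }

    public Task<string> PromptAsync(string prompt, AIRequestOptions options) =>
        _inner.PromptAsync(prompt, options);

    public Task<bool> IsModelSupportedAsync(string model) =>
        _inner.IsModelSupportedAsync(model);
}
```

Because consumers depend only on `IAIService`, switching providers requires no code changes, only a configuration edit.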
## Configuration

`appsettings.json`:

```json
"AiSettings": {
  "Provider": "OpenRouter",             // or "Ollama"
  "Model": "anthropic/claude-3-haiku"   // any supported model
},
"OpenAI": {
  "ApiKey": "sk-your-openrouter-api-key" // required for OpenRouter
}
```
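For a fully local setup, the same section can point at Ollama instead; no API key is needed (the model name below is just an example of a model already pulled locally):

```json
"AiSettings": {
  "Provider": "Ollama",
  "Model": "llama2"
}
```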
## Example Usage

### 1. Using MediatR (preferred)

Send a prompt through the built-in `PromptTextCommand`:
```csharp
var result = await _mediator.Send(new PromptTextCommand
{
    Prompt = "Write a poem about code.",
    Options = new AIRequestOptions
    {
        Context = "You are a helpful assistant",
        Temperature = 0.7,
        Model = "mistralai/mistral-7b-instruct" // overrides the config default
    }
});
```
### 2. Using IAIService directly or from an API controller
```csharp
public class MyService
{
    private readonly IAIService _ai;

    public MyService(IAIService ai) => _ai = ai;

    public async Task<string> AskAsync(string prompt)
    {
        return await _ai.PromptAsync(prompt, new AIRequestOptions
        {
            Model = "llama2",
            Temperature = 0.8
        });
    }
}
```
Or from a controller:
```csharp
[Route("api/ai")]
[ApiController]
public class AiController : ControllerBase
{
    private readonly IMediator _mediator;
    private readonly IAIService _aiService;

    public AiController(IMediator mediator, IAIService aiService)
    {
        _mediator = mediator;
        _aiService = aiService;
    }

    // POST: /api/ai/prompt
    [HttpPost("prompt")]
    public async Task<IActionResult> Prompt([FromBody] PromptTextCommand command)
    {
        var result = await _mediator.Send(command);
        return Ok(new { result });
    }

    // GET: /api/ai/model-supported?model=anthropic/claude-3-haiku
    [HttpGet("model-supported")]
    public async Task<IActionResult> IsModelSupported([FromQuery] string model)
    {
        var isSupported = await _aiService.IsModelSupportedAsync(model);
        return Ok(new { model, isSupported });
    }
}
```
## Supported Providers

| Provider | Description | Notes |
|---|---|---|
| OpenRouter | Cloud-based OpenAI-compatible proxy | Requires API key |
| Ollama | Local models via the Ollama CLI | Requires `ollama run` |
## Model Validation

Before sending the prompt, the system checks whether the model exists using:

- `https://openrouter.ai/api/v1/models` for OpenRouter
- `http://localhost:11434/api/tags` for Ollama
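The same check can be reproduced by hand, e.g. with a small standalone snippet like the following (a sketch, not the library's internal code; the response is treated as raw JSON text rather than deserialized):

```csharp
// Sketch: ask OpenRouter's public model list whether a model id is present.
using var http = new HttpClient();
var json = await http.GetStringAsync("https://openrouter.ai/api/v1/models");
bool supported = json.Contains("\"anthropic/claude-3-haiku\"");

// For Ollama, the locally pulled models come from the tags endpoint.
var local = await http.GetStringAsync("http://localhost:11434/api/tags");
```

A real client would parse the JSON and compare ids exactly; the substring check is only illustrative.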
## Example cURL Test

```shell
curl -X POST https://localhost:5001/api/ai/prompt \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "Explain quantum computing",
    "options": {
      "model": "anthropic/claude-3-haiku"
    }
  }'
```
## Technologies
- .NET 9
- Clean Architecture
- Strategy Pattern
- MediatR (CQRS support)
- FluentValidation
- JSON-based configuration
- OpenAI-compatible APIs
## Author

Developed by [@CerenSusuz](https://github.com/CerenSusuz). Source: https://github.com/CerenSusuz/CoreTemplateApp/tree/master/Core.AI
## License
This project is licensed under the MIT License.
## Compatibility

| Product | Compatible and additional computed target framework versions |
|---|---|
| .NET | `net9.0` is compatible. `net9.0-android`, `net9.0-browser`, `net9.0-ios`, `net9.0-maccatalyst`, `net9.0-macos`, `net9.0-tvos`, `net9.0-windows`, `net10.0`, `net10.0-android`, `net10.0-browser`, `net10.0-ios`, `net10.0-maccatalyst`, `net10.0-macos`, `net10.0-tvos`, and `net10.0-windows` were computed. |
## Dependencies (net9.0)

- MediatR (>= 9.0.0)
- Microsoft.Extensions.Configuration.Abstractions (>= 9.0.6)
- Microsoft.Extensions.Options (>= 9.0.6)