Soenneker.SemanticKernel.Pool
A high-performance, thread-safe pool implementation for Microsoft Semantic Kernel instances with built-in rate limiting capabilities.
Features
- Kernel Pooling: Efficiently manages and reuses Semantic Kernel instances
- Rate Limiting: Built-in support for request rate limiting at multiple time windows (see the sketch after this list):
  - Per-second rate limiting
  - Per-minute rate limiting
  - Per-day rate limiting
  - Token-based rate limiting
- Thread Safety: Fully thread-safe implementation using concurrent collections
- Async Support: Modern async/await patterns throughout the codebase
- Flexible Configuration: Configurable rate limits and pool settings
- Resource Management: Automatic cleanup of expired rate limit windows
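For intuition, each configured limit can be pictured as an independent window counter: a request is admitted only when every window still has quota, and a counter resets once its window elapses. The following is a simplified, illustrative sketch of that idea; it is not the library's internal implementation, and the `MultiWindowLimiter` name is made up for the example:

```csharp
using System;
using System.Collections.Generic;

// Simplified illustration of multi-window rate limiting; not the library's implementation.
public sealed class MultiWindowLimiter
{
    private sealed class Window
    {
        public TimeSpan Length;
        public int Limit;
        public int Count;
        public DateTime WindowStart;
    }

    private readonly List<Window> _windows;
    private readonly object _lock = new();

    public MultiWindowLimiter(int perSecond, int perMinute, int perDay)
    {
        _windows = new List<Window>
        {
            new() { Length = TimeSpan.FromSeconds(1), Limit = perSecond },
            new() { Length = TimeSpan.FromMinutes(1), Limit = perMinute },
            new() { Length = TimeSpan.FromDays(1),    Limit = perDay }
        };
    }

    // Admits a request only if every window still has quota; otherwise leaves counters untouched.
    public bool TryAcquire()
    {
        lock (_lock)
        {
            DateTime now = DateTime.UtcNow;

            foreach (Window w in _windows)
            {
                // Expired windows are reset ("cleaned up") before the quota check.
                if (now - w.WindowStart >= w.Length)
                {
                    w.WindowStart = now;
                    w.Count = 0;
                }

                if (w.Count >= w.Limit)
                    return false;
            }

            foreach (Window w in _windows)
                w.Count++;

            return true;
        }
    }
}
```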
Installation
dotnet add package Soenneker.SemanticKernel.Pool
Then register the pool with your DI container:
services.AddSemanticKernelPoolAsSingleton();
Extension Packages
This library has several extension packages for different AI providers:
- Soenneker.SemanticKernel.Pool.Gemini - Google Gemini integration
- Soenneker.SemanticKernel.Pool.OpenAi - OpenAI, OpenRouter.ai, and other OpenAI-compatible API integration
- Soenneker.SemanticKernel.Pool.Ollama - Ollama integration
- Soenneker.SemanticKernel.Pool.OpenAi.Azure - Azure OpenAI integration
- Soenneker.SemanticKernel.Pool.Mistral - Mistral integration
Usage
Startup Configuration
// In Program.cs or Startup.cs
public class Program
{
    public static async Task Main(string[] args)
    {
        var builder = WebApplication.CreateBuilder(args);

        // Add the kernel pool as a singleton
        builder.Services.AddSemanticKernelPoolAsSingleton();

        var app = builder.Build();

        // Register kernels during startup
        var kernelPool = app.Services.GetRequiredService<ISemanticKernelPool>();

        // Manually create options, or use one of the extension packages mentioned above
        var options = new SemanticKernelOptions
        {
            ApiKey = "your-api-key",
            Endpoint = "https://api.openai.com/v1",
            ModelId = "gpt-4",
            KernelFactory = async (opts, _) =>
            {
                return Kernel.CreateBuilder()
                    .AddOpenAIChatCompletion(modelId: opts.ModelId!,
                        new OpenAIClient(new ApiKeyCredential(opts.ApiKey),
                            new OpenAIClientOptions { Endpoint = new Uri(opts.Endpoint) }));
            },

            // Rate limiting
            RequestsPerSecond = 10,
            RequestsPerMinute = 100,
            RequestsPerDay = 1000,
            TokensPerDay = 10000
        };

        await kernelPool.Register("my-kernel", options);

        // Add more registrations... registration order determines which kernel is preferred.

        await app.RunAsync();
    }
}
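Because the pool prefers earlier registrations, any kernel registered later acts as a fallback once the earlier ones hit their rate limits. A minimal sketch of registering a second, lower-priority kernel with the same API shown above; the endpoint, model id, key name, and limits are placeholder values for illustration:

```csharp
// Illustrative only: a second, lower-priority kernel registered after "my-kernel".
var backupOptions = new SemanticKernelOptions
{
    ApiKey = "your-backup-api-key",
    Endpoint = "https://openrouter.ai/api/v1",
    ModelId = "gpt-4o-mini",
    KernelFactory = async (opts, _) =>
        Kernel.CreateBuilder()
            .AddOpenAIChatCompletion(modelId: opts.ModelId!,
                new OpenAIClient(new ApiKeyCredential(opts.ApiKey),
                    new OpenAIClientOptions { Endpoint = new Uri(opts.Endpoint) })),
    RequestsPerSecond = 5,
    RequestsPerMinute = 50
};

// Registered second, so it is only handed out when earlier kernels are rate limited.
await kernelPool.Register("backup-kernel", backupOptions);
```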
Using the Pool
public class MyService
{
    private readonly ISemanticKernelPool _kernelPool;

    public MyService(ISemanticKernelPool kernelPool)
    {
        _kernelPool = kernelPool;
    }

    public async Task ProcessAsync()
    {
        // Get an available kernel that's within its rate limits, preferring the first registered
        var (kernel, entry) = await _kernelPool.GetAvailableKernel();

        // Get the chat completion service
        var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

        // Create a chat history
        var chatHistory = new ChatHistory();
        chatHistory.AddMessage(AuthorRole.User, "What is the capital of France?");

        // Execute chat completion
        var response = await chatCompletionService.GetChatMessageContentAsync(chatHistory);
        Console.WriteLine($"Response: {response.Content}");

        // Access rate limit information through the entry
        var remainingQuota = await entry.RemainingQuota();
        Console.WriteLine($"Remaining requests - Second: {remainingQuota.Second}, Minute: {remainingQuota.Minute}, Day: {remainingQuota.Day}");
    }
}
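The kernel handed out by the pool is an ordinary Semantic Kernel Kernel, so other chat-completion calls work unchanged. A minimal sketch of streaming a response from a pooled kernel, assuming the same _kernelPool field and GetAvailableKernel call shown above:

```csharp
public async Task StreamAsync(string prompt)
{
    // Acquire a kernel that is within its rate limits, as in ProcessAsync above.
    var (kernel, _) = await _kernelPool.GetAvailableKernel();
    var chat = kernel.GetRequiredService<IChatCompletionService>();

    var history = new ChatHistory();
    history.AddUserMessage(prompt);

    // Stream tokens as they arrive instead of waiting for the full completion.
    await foreach (var chunk in chat.GetStreamingChatMessageContentsAsync(history))
    {
        Console.Write(chunk.Content);
    }
}
```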
Product | Compatible and additional computed target frameworks |
---|---|
.NET | net9.0 is compatible. net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Dependencies (net9.0)
- Soenneker.SemanticKernel.Cache (>= 3.0.419)
NuGet packages (5)
Showing the top 5 NuGet packages that depend on Soenneker.SemanticKernel.Pool:
- Soenneker.SemanticKernel.Pool.OpenAi: Provides OpenAI-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel.
- Soenneker.SemanticKernel.Pool.OpenAi.Azure: Provides Azure OpenAI-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel.
- Soenneker.SemanticKernel.Pool.Ollama: Provides Ollama-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel.
- Soenneker.SemanticKernel.Pool.Gemini: Provides Gemini-specific registration extensions for KernelPoolManager, enabling integration with local LLMs via Semantic Kernel.
- Soenneker.SemanticKernel.Pool.Mistral: Provides Mistral-specific registration extensions for KernelPoolManager, enabling integration via Semantic Kernel.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last Updated |
---|---|---|
3.0.63 | 0 | 7/4/2025 |
3.0.62 | 42 | 7/2/2025 |
3.0.61 | 43 | 6/28/2025 |
3.0.60 | 39 | 6/28/2025 |
3.0.59 | 11 | 6/28/2025 |
3.0.58 | 17 | 6/28/2025 |
3.0.57 | 22 | 6/27/2025 |
3.0.56 | 28 | 6/27/2025 |
3.0.55 | 86 | 6/26/2025 |
3.0.54 | 102 | 6/25/2025 |
3.0.53 | 138 | 6/25/2025 |
3.0.52 | 129 | 6/25/2025 |
3.0.51 | 296 | 6/17/2025 |
3.0.50 | 354 | 6/11/2025 |
3.0.49 | 312 | 6/11/2025 |
3.0.48 | 328 | 6/11/2025 |
3.0.47 | 346 | 6/11/2025 |
3.0.46 | 304 | 6/11/2025 |
3.0.45 | 337 | 6/10/2025 |
3.0.44 | 294 | 6/3/2025 |
3.0.43 | 168 | 6/3/2025 |
3.0.42 | 175 | 6/3/2025 |
3.0.41 | 164 | 6/3/2025 |
3.0.40 | 144 | 6/3/2025 |
3.0.39 | 170 | 6/3/2025 |
3.0.38 | 172 | 6/3/2025 |
3.0.37 | 166 | 6/2/2025 |
3.0.36 | 234 | 5/28/2025 |
3.0.35 | 168 | 5/28/2025 |
3.0.34 | 178 | 5/28/2025 |
3.0.33 | 148 | 5/28/2025 |
3.0.32 | 182 | 5/27/2025 |
3.0.31 | 163 | 5/27/2025 |
3.0.30 | 138 | 5/27/2025 |
3.0.29 | 220 | 5/27/2025 |
3.0.28 | 203 | 5/26/2025 |
3.0.27 | 123 | 5/25/2025 |
3.0.26 | 120 | 5/25/2025 |
3.0.25 | 107 | 5/23/2025 |
3.0.24 | 130 | 5/23/2025 |
3.0.23 | 120 | 5/23/2025 |
3.0.22 | 115 | 5/23/2025 |
3.0.21 | 122 | 5/23/2025 |
3.0.20 | 152 | 5/23/2025 |
3.0.19 | 146 | 5/22/2025 |
3.0.18 | 143 | 5/22/2025 |
3.0.17 | 321 | 5/22/2025 |
3.0.16 | 153 | 5/21/2025 |
3.0.15 | 186 | 5/20/2025 |
3.0.14 | 179 | 5/19/2025 |
3.0.13 | 166 | 5/19/2025 |
3.0.12 | 134 | 5/19/2025 |
3.0.11 | 136 | 5/19/2025 |
3.0.10 | 156 | 5/19/2025 |
3.0.9 | 140 | 5/19/2025 |
3.0.8 | 197 | 5/19/2025 |
3.0.7 | 134 | 5/18/2025 |
3.0.6 | 147 | 5/18/2025 |
3.0.5 | 148 | 5/18/2025 |
3.0.4 | 131 | 5/18/2025 |
3.0.3 | 136 | 5/18/2025 |
3.0.2 | 135 | 5/18/2025 |
3.0.1 | 132 | 5/18/2025 |