ApertusSharp 0.2.1
ApertusSharp
ApertusSharp is a modern .NET client for Swiss AI's Apertus LLM, built on Microsoft.Extensions.AI with full support for Semantic Kernel and custom chat pipelines. It's designed for clean integration, simple usage, and flexible composition in .NET applications.
✨ Features
- ✅ Built on Microsoft.Extensions.AI abstractions
- 🔄 Supports both streaming and non-streaming chat
- 🧠 Semantic Kernel compatible
- 🧩 Extensible for custom chat pipelines and agents
- 🧪 Minimal, testable, simple .NET code
- 🧰 Ready for DI registration and service composition
📦 Installation
Install via NuGet:
dotnet add package ApertusSharp
Or via Package Manager Console in Visual Studio:
Install-Package ApertusSharp
🚀 Quick Start
List available models:
var apertus = new ApertusClient(apiKey);
var models = await apertus.ListModelsAsync();
Console.WriteLine($"Available models: {string.Join(", ", models.Select(m => m.Id))}");
Create the Apertus chat client:
var apertus = new ApertusClient(model:"swiss-ai/apertus-8b-instruct", apiKey:apiKey);
await foreach (var stream in apertus.GenerateAsync("How are you today?"))
Console.Write(stream.Text);
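If you do not need token-by-token output, the same client can return a single non-streaming reply via GetResponseAsync (the string overload also appears in the Jupyter section below):
var reply = await apertus.GetResponseAsync("How are you today?");
Console.WriteLine(reply);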
Use it via the service collection extension:
var services = new ServiceCollection();
services.AddApertusChatClient(apiKey: apiKey, model: "swiss-ai/apertus-8b-instruct");
await using var provider = services.BuildServiceProvider(new ServiceProviderOptions
{
ValidateScopes = true,
ValidateOnBuild = true
});
var apertus = provider.GetRequiredService<ApertusClient>();
var messages = new List<ChatMessage>
{
new ChatMessage(ChatRole.User, "Hello from ServiceCollection!")
};
Console.Write("AI: ");
await foreach (var chunk in apertus.GetStreamingResponseAsync(messages))
{
Console.Write(chunk);
}
Console.WriteLine("\n Streaming complete.");
🔌 Semantic Kernel Integration
ApertusSharp can be used as a custom IChatClient for Semantic Kernel:
// Create a Semantic Kernel builder
var builder = Kernel.CreateBuilder();
// Register Apertus as a chat client service
builder.Services.AddApertusChatClient(apiKey: apiKey, model: "swiss-ai/apertus-8b-instruct");
var kernel = builder.Build();
// Use the kernel to get a chat completion
var chat = kernel.GetRequiredService<IChatClient>();
var msg = "How does Semantic Kernel work with Apertus?";
Console.WriteLine("User: " + msg);
var history = new List<ChatMessage>
{
new ChatMessage(ChatRole.User, msg)
};
var result = await chat.GetResponseAsync(history);
Console.WriteLine("AI: " + result.Text);
Usage in Jupyter Notebooks
Cell 1 - Install ApertusSharp package:
#r "nuget: ApertusSharp"
Cell 2 - Import namespaces:
using ApertusSharp;
using Microsoft.Extensions.AI;
Cell 3 - Initialize the client:
var apiKey = "your-api-key-here"; // Replace with your actual API key
var apertus = new ApertusClient(
apiKey: apiKey,
model: "swiss-ai/apertus-8b-instruct"
);
Cell 4 - Simple chat interaction:
var response = await apertus.GetResponseAsync("Explain quantum computing in simple terms");
Console.WriteLine(response);
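An optional fifth cell could stream the reply instead, reusing the client from Cell 3 (a sketch using the streaming API shown in the Quick Start):
var messages = new List<ChatMessage> { new(ChatRole.User, "Explain quantum computing in simple terms") };
await foreach (var chunk in apertus.GetStreamingResponseAsync(messages))
{
    Console.Write(chunk);
}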
Tips for Jupyter Notebooks
- Set your API key as an environment variable for security: Environment.GetEnvironmentVariable("APERTUS_TOKEN") (see the sketch after this list)
- Use the display() function to render rich outputs
- Break complex workflows into multiple cells for better interactivity
- Leverage async/await for a responsive notebook experience
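For example, the Cell 3 initialization can read the key from that environment variable instead of hard-coding it:
var apiKey = Environment.GetEnvironmentVariable("APERTUS_TOKEN"); // set this before launching the notebook
var apertus = new ApertusClient(
    apiKey: apiKey,
    model: "swiss-ai/apertus-8b-instruct"
);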
🧱 Architecture
- IChatClient and IChatCompletionService abstractions
- Streaming via IAsyncEnumerable<ChatResponseUpdate>
- Extensible options via ApertusChatOptions (see the sketch after this list)
- Designed for clean DI and modular composition
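As a sketch of how options flow through the abstraction: the standard Microsoft.Extensions.AI ChatOptions can be passed alongside the messages on any IChatClient call. The values below are illustrative only, and ApertusSharp's own ApertusChatOptions type may expose additional settings not shown here:
var options = new ChatOptions
{
    Temperature = 0.7f,      // illustrative values, tune for your use case
    MaxOutputTokens = 256
};
var messages = new List<ChatMessage> { new(ChatRole.User, "Give me a one-paragraph summary of the Apertus model.") };
var response = await apertus.GetResponseAsync(messages, options);
Console.WriteLine(response.Text);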
📚 Documentation
🤝 Contributing
Pull requests welcome! Please open an issue first for major changes.
📄 License
MIT — see LICENSE for details.
Product | Compatible and additional computed target frameworks |
---|---|
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Dependencies (net8.0):
- Microsoft.Extensions.AI.Abstractions (>= 9.9.0)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 8.0.0)
- Microsoft.Extensions.Options (>= 8.0.0)
Version | Downloads | Last Updated |
---|---|---|
0.2.1 | 151 | 9/22/2025 |
0.1.0-preview01 | 151 | 9/21/2025 |
Initial preview release.
- Core chat and streaming APIs for Apertus LLM
- .NET 8 support
- Integration with Microsoft.Extensions.AI and Semantic Kernel