LMSupply.Generator
0.13.1
Installation

.NET CLI:

```
dotnet add package LMSupply.Generator --version 0.13.1
```

Package Manager (run inside the Visual Studio Package Manager Console, which provides the NuGet module's `Install-Package`):

```
NuGet\Install-Package LMSupply.Generator -Version 0.13.1
```

PackageReference (copy this XML node into the project file):

```xml
<PackageReference Include="LMSupply.Generator" Version="0.13.1" />
```

Central Package Management (CPM): add the version to the solution's Directory.Packages.props file, then reference the package without a version in the project file:

```xml
<PackageVersion Include="LMSupply.Generator" Version="0.13.1" />
<PackageReference Include="LMSupply.Generator" />
```

Paket:

```
paket add LMSupply.Generator --version 0.13.1
```

F# Interactive / Polyglot Notebooks (copy the `#r` directive into the interactive tool or script source):

```
#r "nuget: LMSupply.Generator, 0.13.1"
```

C# file-based apps (available starting in .NET 10 preview 4; place the directive in the .cs file before any lines of code):

```
#:package LMSupply.Generator@0.13.1
```

Cake:

```
#addin nuget:?package=LMSupply.Generator&version=0.13.1
#tool nuget:?package=LMSupply.Generator&version=0.13.1
```

The NuGet Team does not provide support for the Paket or Cake clients; please contact their maintainers for support.
LMSupply.Generator
Local text generation and chat with ONNX Runtime GenAI and GGUF (llama-server).
Features
- Zero-config: Models download automatically from HuggingFace
- GPU Acceleration: CUDA, Vulkan, DirectML, CoreML, Metal
- GGUF Support: Load any GGUF model via llama-server (auto-downloaded)
- Server Pooling: Reuses llama-server instances for fast model switching
- MIT Models: Phi-4 and Phi-3.5 models with no usage restrictions
- Chat Support: Built-in chat formatters for various models
Quick Start
```csharp
using LMSupply.Generator;

// Build a generator with the default model (downloaded automatically)
var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

// Generate text
string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);

await generator.DisposeAsync();
```
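Because cleanup goes through `DisposeAsync`, the generator can also be scoped with C#'s `await using` statement so disposal happens automatically at the end of the enclosing scope. This is a sketch that assumes the generator returned by `BuildAsync` implements `IAsyncDisposable`, which the `DisposeAsync` call above suggests but this page does not state outright:

```csharp
using LMSupply.Generator;

// "await using" disposes the generator automatically when it leaves scope,
// assuming the generator implements IAsyncDisposable
await using var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);
```

This avoids leaking the underlying model or llama-server handle if an exception is thrown between creation and the explicit `DisposeAsync` call.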
Chat Completion
```csharp
var messages = new[]
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant."),
    new ChatMessage(ChatRole.User, "Explain quantum computing.")
};

string response = await generator.GenerateChatCompleteAsync(messages);
```
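A multi-turn conversation can be built by appending each reply to the message list before the next call. This sketch assumes a `ChatRole.Assistant` value exists alongside the `System` and `User` roles shown above, and that `GenerateChatCompleteAsync` accepts any sequence of `ChatMessage` (neither is confirmed by this page):

```csharp
var conversation = new List<ChatMessage>
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant."),
    new ChatMessage(ChatRole.User, "Explain quantum computing.")
};

string reply = await generator.GenerateChatCompleteAsync(conversation);

// Feed the reply back in so the model keeps context for the follow-up
// (ChatRole.Assistant is assumed, not documented on this page)
conversation.Add(new ChatMessage(ChatRole.Assistant, reply));
conversation.Add(new ChatMessage(ChatRole.User, "Summarize that in one sentence."));

string followUp = await generator.GenerateChatCompleteAsync(conversation);
```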
Available Models
| Model | Parameters | License | Description |
|---|---|---|---|
| Phi-4 Mini | 3.8B | MIT | Default, best balance |
| Phi-3.5 Mini | 3.8B | MIT | Fast, reliable |
| Phi-4 | 14B | MIT | Highest quality |
| Llama 3.2 1B | 1B | Conditional | Ultra-lightweight |
| Llama 3.2 3B | 3B | Conditional | Balanced |
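The examples on this page only show `WithDefaultModel()`; the builder method for selecting one of the other models in the table is not documented here. As a purely illustrative sketch, assuming a hypothetical `WithModel(string)` builder method keyed by a model identifier:

```csharp
// Hypothetical: WithModel and the "phi-3.5-mini" identifier are assumptions,
// not confirmed by this page — check the project documentation for the real API
var generator = await TextGeneratorBuilder.Create()
    .WithModel("phi-3.5-mini")
    .BuildAsync();
```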
GPU Acceleration
```shell
# NVIDIA GPU (CUDA)
dotnet add package Microsoft.ML.OnnxRuntime.Gpu

# Windows (AMD/Intel/NVIDIA via DirectML)
dotnet add package Microsoft.ML.OnnxRuntime.DirectML
```
Compatible Frameworks

| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net10.0 is compatible; net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |

Included target framework (in package): net10.0
Dependencies (net10.0)
- LMSupply.Core (>= 0.13.1)
- LMSupply.Llama (>= 0.13.1)
- Microsoft.ML.OnnxRuntimeGenAI (>= 0.11.4)
Used By

This package is not used by any other NuGet packages or by any popular GitHub repositories.
Version History

| Version | Downloads | Last Updated |
|---|---|---|
| 0.13.1 | 99 | 2/2/2026 |
| 0.13.0 | 133 | 2/2/2026 |
| 0.12.1 | 91 | 2/2/2026 |
| 0.12.0 | 100 | 2/1/2026 |
| 0.11.0 | 89 | 1/27/2026 |
| 0.10.0 | 204 | 1/22/2026 |
| 0.9.3 | 106 | 1/19/2026 |
| 0.9.2 | 92 | 1/19/2026 |
| 0.9.1 | 82 | 1/18/2026 |
| 0.9.0 | 86 | 1/18/2026 |
| 0.8.18 | 91 | 1/18/2026 |
| 0.8.17 | 89 | 1/17/2026 |
| 0.8.16 | 86 | 1/15/2026 |
| 0.8.15 | 86 | 1/13/2026 |
| 0.8.14 | 90 | 1/12/2026 |
| 0.8.13 | 91 | 1/10/2026 |
| 0.8.12 | 87 | 1/9/2026 |
| 0.8.11 | 88 | 1/9/2026 |
| 0.8.10 | 170 | 1/8/2026 |
| 0.8.9 | 90 | 1/8/2026 |
| 0.8.8 | 100 | 1/8/2026 |
| 0.8.7 | 90 | 1/8/2026 |
| 0.8.6 | 91 | 1/8/2026 |
| 0.8.5 | 96 | 1/7/2026 |
| 0.8.4 | 91 | 1/7/2026 |
| 0.8.3 | 960 | 12/22/2025 |
| 0.8.2 | 148 | 12/20/2025 |
| 0.8.1 | 249 | 12/19/2025 |
| 0.8.0 | 275 | 12/17/2025 |
| 0.7.3 | 266 | 12/17/2025 |
| 0.7.2 | 573 | 12/17/2025 |