LMSupply.Generator 0.8.15

Install with the .NET CLI:

```shell
dotnet add package LMSupply.Generator --version 0.8.15
```
Or reference it from your project file:

```xml
<PackageReference Include="LMSupply.Generator" Version="0.8.15" />
```
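The `PackageReference` above belongs inside an `ItemGroup`. A minimal project file might look like the following sketch; the `net10.0` target matches the package's only compatible framework, and the other properties are typical defaults rather than package requirements:

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- net10.0 is the only framework this package is compatible with -->
    <TargetFramework>net10.0</TargetFramework>
    <ImplicitUsings>enable</ImplicitUsings>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="LMSupply.Generator" Version="0.8.15" />
  </ItemGroup>

</Project>
```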
LMSupply.Generator
Local text generation and chat with ONNX Runtime GenAI.
Features
- Zero-config: Models download automatically from HuggingFace
- GPU Acceleration: CUDA, DirectML (Windows), CoreML (macOS)
- MIT Models: Phi-4 and Phi-3.5 models with no usage restrictions
- Chat Support: Built-in chat formatters for various models
Quick Start

```csharp
using LMSupply.Generator;

// Using the builder pattern
var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

// Generate text
string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);

await generator.DisposeAsync();
```
Chat Completion

```csharp
var messages = new[]
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant."),
    new ChatMessage(ChatRole.User, "Explain quantum computing.")
};

string response = await generator.GenerateChatCompleteAsync(messages);
```
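A multi-turn conversation can be sketched by appending each reply to the message list before the next call, using the same builder and `GenerateChatCompleteAsync` API shown above. This sketch assumes a `ChatRole.Assistant` value exists alongside `System` and `User` — verify against the package's `ChatRole` type:

```csharp
using LMSupply.Generator;

// Build a generator as in the Quick Start
var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

// Conversation history, seeded with a system prompt
var history = new List<ChatMessage>
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant.")
};

foreach (var question in new[] { "Explain quantum computing.", "Summarize that in one sentence." })
{
    history.Add(new ChatMessage(ChatRole.User, question));
    string reply = await generator.GenerateChatCompleteAsync(history);
    // Assumption: ChatRole.Assistant is defined by the package
    history.Add(new ChatMessage(ChatRole.Assistant, reply));
    Console.WriteLine(reply);
}

await generator.DisposeAsync();
```

Keeping the full history in the list is what gives the model context for follow-up questions; without it, each call is independent.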
Available Models
| Model | Parameters | License | Description |
|---|---|---|---|
| Phi-4 Mini | 3.8B | MIT | Default, best balance |
| Phi-3.5 Mini | 3.8B | MIT | Fast, reliable |
| Phi-4 | 14B | MIT | Highest quality |
| Llama 3.2 1B | 1B | Conditional | Ultra-lightweight |
| Llama 3.2 3B | 3B | Conditional | Balanced |
GPU Acceleration

```shell
# NVIDIA GPU (CUDA)
dotnet add package Microsoft.ML.OnnxRuntime.Gpu

# Windows (AMD/Intel/NVIDIA via DirectML)
dotnet add package Microsoft.ML.OnnxRuntime.DirectML
```
Compatible frameworks

| Product | Compatible and computed target frameworks |
|---|---|
| .NET | net10.0 is compatible. net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Dependencies (net10.0)

- LMSupply.Core (>= 0.8.15)
- Microsoft.ML.OnnxRuntimeGenAI (>= 0.11.4)
NuGet packages (4)

Showing the top 4 NuGet packages that depend on LMSupply.Generator:

| Package | Description |
|---|---|
| FileFlux | Complete document processing SDK optimized for RAG systems. Transforms PDF, DOCX, Excel, PowerPoint, Markdown, and other formats into high-quality chunks with intelligent semantic boundary detection. Includes advanced chunking strategies, metadata extraction, and performance optimization. |
| FluxIndex.SDK | Complete RAG infrastructure with LMSupply embeddings, FileFlux integration, FluxCurator preprocessing, and FluxImprover quality enhancement. |
| FluxImprover | The quality layer for RAG data pipelines: LLM-powered enrichment and quality assessment. |
| MemoryIndexer.Sdk | Full-featured long-term memory management for LLM applications via MCP. Includes SQLite/Qdrant storage, BGE-M3/Ollama/OpenAI embeddings, and OpenTelemetry observability. |
GitHub repositories
This package is not used by any popular GitHub repositories.
Version history
| Version | Downloads | Last Updated |
|---|---|---|
| 0.8.15 | 69 | 1/13/2026 |
| 0.8.14 | 84 | 1/12/2026 |
| 0.8.13 | 83 | 1/10/2026 |
| 0.8.12 | 79 | 1/9/2026 |
| 0.8.11 | 78 | 1/9/2026 |
| 0.8.10 | 150 | 1/8/2026 |
| 0.8.9 | 80 | 1/8/2026 |
| 0.8.8 | 91 | 1/8/2026 |
| 0.8.7 | 82 | 1/8/2026 |
| 0.8.6 | 84 | 1/8/2026 |
| 0.8.5 | 85 | 1/7/2026 |
| 0.8.4 | 84 | 1/7/2026 |
| 0.8.3 | 298 | 12/22/2025 |
| 0.8.2 | 142 | 12/20/2025 |
| 0.8.1 | 241 | 12/19/2025 |
| 0.8.0 | 268 | 12/17/2025 |
| 0.7.3 | 262 | 12/17/2025 |
| 0.7.2 | 557 | 12/17/2025 |