LMSupply.Generator 0.22.2

There is a newer version of this package available; see the version list below for details.
.NET CLI
dotnet add package LMSupply.Generator --version 0.22.2

Package Manager
NuGet\Install-Package LMSupply.Generator -Version 0.22.2
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
For projects that support PackageReference, copy this XML node into the project file to reference the package:
<PackageReference Include="LMSupply.Generator" Version="0.22.2" />

Central Package Management (CPM)
For projects that support Central Package Management, copy this XML node into the solution's Directory.Packages.props file to version the package:
<PackageVersion Include="LMSupply.Generator" Version="0.22.2" />
Then reference it in the project file:
<PackageReference Include="LMSupply.Generator" />

Paket CLI
paket add LMSupply.Generator --version 0.22.2

Script & Interactive
#r "nuget: LMSupply.Generator, 0.22.2"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

File-based apps
#:package LMSupply.Generator@0.22.2
The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake Addin
#addin nuget:?package=LMSupply.Generator&version=0.22.2

Cake Tool
#tool nuget:?package=LMSupply.Generator&version=0.22.2

LMSupply.Generator

Local text generation and chat with ONNX Runtime GenAI and GGUF (llama-server).

Features

  • Zero-config: Models download automatically from HuggingFace
  • GPU Acceleration: CUDA, Vulkan, DirectML, CoreML, Metal
  • GGUF Support: Load any GGUF model via llama-server (auto-downloaded)
  • Server Pooling: Reuses llama-server instances for fast model switching
  • MIT Models: Phi-4 and Phi-3.5 models with no usage restrictions
  • Chat Support: Built-in chat formatters for various models

Quick Start

using LMSupply.Generator;

// Using the builder pattern
var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

// Generate text
string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);

await generator.DisposeAsync();
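
Because the generator exposes DisposeAsync, it presumably implements IAsyncDisposable, so the manual disposal above can be replaced with an `await using` declaration. This is a sketch under that assumption:

```csharp
using LMSupply.Generator;

// `await using` disposes the generator automatically when the scope ends.
// Assumes TextGenerator implements IAsyncDisposable (implied by DisposeAsync above).
await using var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);
// No explicit DisposeAsync call needed here.
```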

Chat Completion

var messages = new[]
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant."),
    new ChatMessage(ChatRole.User, "Explain quantum computing.")
};

string response = await generator.GenerateChatCompleteAsync(messages);
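
The same call can carry a multi-turn conversation by appending each reply to the message list before the next request. The sketch below uses only the types shown above, except that a ChatRole.Assistant value is assumed to exist alongside ChatRole.System and ChatRole.User, and the list is converted back to an array before the call since only an array argument is shown in the docs:

```csharp
using LMSupply.Generator;

var messages = new List<ChatMessage>
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant.")
};

// First turn.
messages.Add(new ChatMessage(ChatRole.User, "Explain quantum computing."));
string reply = await generator.GenerateChatCompleteAsync(messages.ToArray());

// Append the model's reply so the next turn sees the full history.
// ChatRole.Assistant is an assumed role name, not confirmed by the docs above.
messages.Add(new ChatMessage(ChatRole.Assistant, reply));

messages.Add(new ChatMessage(ChatRole.User, "Summarize that in one sentence."));
string followUp = await generator.GenerateChatCompleteAsync(messages.ToArray());
```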

Available Models

| Model | Parameters | License | Description |
|---|---|---|---|
| Phi-4 Mini | 3.8B | MIT | Default, best balance |
| Phi-3.5 Mini | 3.8B | MIT | Fast, reliable |
| Phi-4 | 14B | MIT | Highest quality |
| Llama 3.2 1B | 1B | Conditional | Ultra-lightweight |
| Llama 3.2 3B | 3B | Conditional | Balanced |
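
The builder presumably offers a way to select a model other than the default. The method name (`WithModel`) and the model identifier below are assumptions for illustration only, not a confirmed API; check the package documentation for the actual selector:

```csharp
using LMSupply.Generator;

// Hypothetical: pick a lighter model for constrained hardware.
// `WithModel` and "llama-3.2-1b" are assumed names, not confirmed API.
var generator = await TextGeneratorBuilder.Create()
    .WithModel("llama-3.2-1b")
    .BuildAsync();

string response = await generator.GenerateCompleteAsync("Hello!");
await generator.DisposeAsync();
```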

GPU Acceleration

# NVIDIA GPU
dotnet add package Microsoft.ML.OnnxRuntime.Gpu

# Windows (AMD/Intel/NVIDIA)
dotnet add package Microsoft.ML.OnnxRuntime.DirectML
Compatible target frameworks
.NET: net10.0 is compatible. Computed compatible frameworks: net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows.

NuGet packages (2)

Showing the top 2 NuGet packages that depend on LMSupply.Generator:

  • FluxIndex.Providers.LMSupply: LMSupply local AI embedding, reranking, and text completion provider for FluxIndex
  • IronHive.Cli.Core: IronHive CLI Core - agent loop, tools, session management, and provider integrations for building AI-powered CLI tools

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
0.24.0 161 3/9/2026
0.22.2 86 3/9/2026
0.22.1 119 3/8/2026
0.22.0 85 3/8/2026
0.21.0 85 3/8/2026
0.20.2 111 3/7/2026
0.20.1 84 3/7/2026
0.20.0 93 3/7/2026
0.19.1 84 3/6/2026
0.19.0 83 3/6/2026
0.18.0 82 3/6/2026
0.17.2 80 3/5/2026
0.17.1 77 3/5/2026
0.17.0 78 3/5/2026
0.16.1 80 3/5/2026
0.16.0 74 3/5/2026
0.15.1 81 3/4/2026
0.15.0 83 3/3/2026
0.14.0 173 3/3/2026
0.13.9 140 2/27/2026