LMSupply.Generator 0.8.15

dotnet add package LMSupply.Generator --version 0.8.15
                    
NuGet\Install-Package LMSupply.Generator -Version 0.8.15
                    
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="LMSupply.Generator" Version="0.8.15" />
                    
For projects that support PackageReference, copy this XML node into the project file to reference the package.
Directory.Packages.props
<PackageVersion Include="LMSupply.Generator" Version="0.8.15" />

Project file
<PackageReference Include="LMSupply.Generator" />

For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution's Directory.Packages.props file to version the package, and the PackageReference node into the project file.
paket add LMSupply.Generator --version 0.8.15
                    
#r "nuget: LMSupply.Generator, 0.8.15"
                    
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the script's source code to reference the package.
#:package LMSupply.Generator@0.8.15
                    
The #:package directive can be used in C# file-based apps starting with .NET 10 preview 4. Copy it into a .cs file before any lines of code to reference the package.
#addin nuget:?package=LMSupply.Generator&version=0.8.15
                    
Install as a Cake Addin
#tool nuget:?package=LMSupply.Generator&version=0.8.15
                    
Install as a Cake Tool

LMSupply.Generator

Local text generation and chat with ONNX Runtime GenAI.

Features

  • Zero-config: Models download automatically from HuggingFace
  • GPU Acceleration: CUDA, DirectML (Windows), CoreML (macOS)
  • MIT Models: Phi-4 and Phi-3.5 models with no usage restrictions
  • Chat Support: Built-in chat formatters for various models

Quick Start

using LMSupply.Generator;

// Using the builder pattern
var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

// Generate text
string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);

await generator.DisposeAsync();

Chat Completion

var messages = new[]
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant."),
    new ChatMessage(ChatRole.User, "Explain quantum computing.")
};

string response = await generator.GenerateChatCompleteAsync(messages);
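To keep a conversation going, append the model's reply and the next user turn to the message list before calling the API again. The sketch below reuses only the calls shown above, except for ChatRole.Assistant, which is an assumption (this page only shows the System and User roles).

using System.Linq;

// Carry the history forward: append the assistant reply and a follow-up turn.
// ChatRole.Assistant is assumed to exist alongside System and User.
var history = messages
    .Append(new ChatMessage(ChatRole.Assistant, response))
    .Append(new ChatMessage(ChatRole.User, "Summarize that in one sentence."))
    .ToArray();

string followUp = await generator.GenerateChatCompleteAsync(history);
Console.WriteLine(followUp);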

Available Models

Model          Parameters   License       Description
Phi-4 Mini     3.8B         MIT           Default, best balance
Phi-3.5 Mini   3.8B         MIT           Fast, reliable
Phi-4          14B          MIT           Highest quality
Llama 3.2 1B   1B           Conditional   Ultra-lightweight
Llama 3.2 3B   3B           Conditional   Balanced
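To run one of the non-default models, the builder presumably accepts a model selection; the sketch below is hypothetical, since this page only documents WithDefaultModel(). The method name WithModel and the identifier "phi-3.5-mini" are assumptions; check the project README for the actual model keys.

using LMSupply.Generator;

// Hypothetical: select a specific model instead of the default Phi-4 Mini.
// WithModel and "phi-3.5-mini" are assumed names, not confirmed API.
var generator = await TextGeneratorBuilder.Create()
    .WithModel("phi-3.5-mini")
    .BuildAsync();

string answer = await generator.GenerateCompleteAsync("What is ONNX Runtime GenAI?");
Console.WriteLine(answer);

await generator.DisposeAsync();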

GPU Acceleration

# NVIDIA GPU
dotnet add package Microsoft.ML.OnnxRuntime.Gpu

# Windows (AMD/Intel/NVIDIA)
dotnet add package Microsoft.ML.OnnxRuntime.DirectML
Compatible and additional computed target framework versions

.NET: net10.0 is compatible. net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed.

Learn more about Target Frameworks and .NET Standard.

NuGet packages (4)

Showing the top 4 NuGet packages that depend on LMSupply.Generator:

FileFlux

Complete document processing SDK optimized for RAG systems. Transform PDF, DOCX, Excel, PowerPoint, Markdown and other formats into high-quality chunks with intelligent semantic boundary detection. Includes advanced chunking strategies, metadata extraction, and performance optimization.

FluxIndex.SDK

FluxIndex SDK - Complete RAG infrastructure with LMSupply embeddings, FileFlux integration, FluxCurator preprocessing, and FluxImprover quality enhancement

FluxImprover

The Quality Layer for RAG Data Pipelines - LLM-powered enrichment and quality assessment

MemoryIndexer.Sdk

Memory Indexer SDK - Full-featured long-term memory management for LLM applications via MCP. Includes SQLite/Qdrant storage, BGE-M3/Ollama/OpenAI embeddings, and OpenTelemetry observability.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version   Downloads   Last Updated
0.8.15    69          1/13/2026
0.8.14    84          1/12/2026
0.8.13    83          1/10/2026
0.8.12    79          1/9/2026
0.8.11    78          1/9/2026
0.8.10    150         1/8/2026
0.8.9     80          1/8/2026
0.8.8     91          1/8/2026
0.8.7     82          1/8/2026
0.8.6     84          1/8/2026
0.8.5     85          1/7/2026
0.8.4     84          1/7/2026
0.8.3     298         12/22/2025
0.8.2     142         12/20/2025
0.8.1     241         12/19/2025
0.8.0     268         12/17/2025
0.7.3     262         12/17/2025
0.7.2     557         12/17/2025