LocalAI.Generator 0.7.2
dotnet add package LocalAI.Generator --version 0.7.2
NuGet\Install-Package LocalAI.Generator -Version 0.7.2
Run this command in Visual Studio's Package Manager Console; it uses the NuGet module's version of Install-Package.
<PackageReference Include="LocalAI.Generator" Version="0.7.2" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
<PackageVersion Include="LocalAI.Generator" Version="0.7.2" />
<PackageReference Include="LocalAI.Generator" />
For projects that support Central Package Management (CPM), copy this XML node into the solution Directory.Packages.props file to version the package.
paket add LocalAI.Generator --version 0.7.2
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: LocalAI.Generator, 0.7.2"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
#:package LocalAI.Generator@0.7.2
#:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.
#addin nuget:?package=LocalAI.Generator&version=0.7.2
#tool nuget:?package=LocalAI.Generator&version=0.7.2
LocalAI.Generator
Local text generation and chat with ONNX Runtime GenAI.
Features
- Zero-config: Models download automatically from HuggingFace
- GPU Acceleration: CUDA, DirectML (Windows), CoreML (macOS)
- MIT Models: Phi-4 and Phi-3.5 models with no usage restrictions
- Chat Support: Built-in chat formatters for various models
Quick Start
using LocalAI.Generator;

// Using the builder pattern (the default model downloads on first use)
var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

// Generate text
string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);

await generator.DisposeAsync();
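Since the generator is disposed via DisposeAsync, it presumably implements IAsyncDisposable (an assumption, not stated above), in which case its lifetime can be scoped with await using instead of calling DisposeAsync by hand:

```csharp
using LocalAI.Generator;

// Assumes the generator implements IAsyncDisposable (inferred from DisposeAsync above)
await using var generator = await TextGeneratorBuilder.Create()
    .WithDefaultModel()
    .BuildAsync();

string response = await generator.GenerateCompleteAsync("What is AI?");
Console.WriteLine(response);
// DisposeAsync runs automatically when the scope ends
```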
Chat Completion
var messages = new[]
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant."),
    new ChatMessage(ChatRole.User, "Explain quantum computing.")
};

string response = await generator.GenerateChatCompleteAsync(messages);
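A multi-turn conversation can presumably be kept going by appending each reply to the message list before the next call. This sketch assumes a ChatRole.Assistant member exists (only System and User appear above):

```csharp
var history = new List<ChatMessage>
{
    new ChatMessage(ChatRole.System, "You are a helpful assistant.")
};

history.Add(new ChatMessage(ChatRole.User, "Explain quantum computing."));
string reply = await generator.GenerateChatCompleteAsync(history.ToArray());

// ChatRole.Assistant is an assumption; adjust to the actual enum member if it differs
history.Add(new ChatMessage(ChatRole.Assistant, reply));
history.Add(new ChatMessage(ChatRole.User, "Summarize that in one sentence."));
string followUp = await generator.GenerateChatCompleteAsync(history.ToArray());
```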
Available Models
| Model | Parameters | License | Description |
|---|---|---|---|
| Phi-4 Mini | 3.8B | MIT | Default, best balance |
| Phi-3.5 Mini | 3.8B | MIT | Fast, reliable |
| Phi-4 | 14B | MIT | Highest quality |
| Llama 3.2 1B | 1B | Conditional | Ultra-lightweight |
| Llama 3.2 3B | 3B | Conditional | Balanced |
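The builder presumably exposes a way to pick one of the models above instead of the default. The WithModel method and TextModel enum below are hypothetical names, shown only to illustrate the idea; check the package's API for the actual members:

```csharp
// Hypothetical API: method and enum names are illustrative, not confirmed by this README
var generator = await TextGeneratorBuilder.Create()
    .WithModel(TextModel.Phi35Mini)   // e.g. the fast 3.8B MIT-licensed model
    .BuildAsync();
```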
GPU Acceleration
# NVIDIA GPU
dotnet add package Microsoft.ML.OnnxRuntime.Gpu
# Windows (AMD/Intel/NVIDIA)
dotnet add package Microsoft.ML.OnnxRuntime.DirectML
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net10.0 is compatible. net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Included target framework (in package): net10.0
Dependencies (net10.0)
- LocalAI.Core (>= 0.7.2)
- Microsoft.ML.OnnxRuntimeGenAI (>= 0.11.4)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.