Microsoft.Extensions.AI.Ollama
9.1.0-preview.1.25064.3
Prefix Reserved
This package is deprecated; the OllamaSharp package is recommended instead. OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.
No further updates, features, or fixes are planned for the Microsoft.Extensions.AI.Ollama package.
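For most consumers, migration amounts to swapping the client construction. A minimal sketch, assuming a recent OllamaSharp version in which OllamaApiClient implements the Microsoft.Extensions.AI IChatClient interface (verify the constructor shape and method names against the OllamaSharp release you install):

```csharp
using Microsoft.Extensions.AI;
using OllamaSharp;

// Before (this deprecated package):
// IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");

// After (OllamaSharp; assumes OllamaApiClient implements IChatClient
// and accepts an endpoint URI plus a default model name):
IChatClient client = new OllamaApiClient(new Uri("http://localhost:11434/"), "llama3.1");
```

Because both clients implement IChatClient, pipeline pieces composed with ChatClientBuilder (caching, telemetry, function invocation) should continue to work on top of the replacement client.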
See the version list below for details.
Microsoft.Extensions.AI.Ollama
Provides an implementation of the IChatClient interface for Ollama.
Install the package
From the command-line:
dotnet add package Microsoft.Extensions.AI.Ollama
Or directly in the C# project file:
<ItemGroup>
  <PackageReference Include="Microsoft.Extensions.AI.Ollama" Version="[CURRENTVERSION]" />
</ItemGroup>
Usage Examples
Chat
using Microsoft.Extensions.AI;
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
Console.WriteLine(await client.CompleteAsync("What is AI?"));
Chat + Conversation History
using Microsoft.Extensions.AI;
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
Console.WriteLine(await client.CompleteAsync(
    [
        new ChatMessage(ChatRole.System, "You are a helpful AI assistant"),
        new ChatMessage(ChatRole.User, "What is AI?"),
    ]));
Chat Streaming
using Microsoft.Extensions.AI;
IChatClient client = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
await foreach (var update in client.CompleteStreamingAsync("What is AI?"))
{
    Console.Write(update);
}
Tool Calling
Known limitations:
- Only a subset of models provided by Ollama support tool calling.
- Tool calling is currently not supported with streaming requests.
using System.ComponentModel;
using Microsoft.Extensions.AI;
IChatClient ollamaClient = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
IChatClient client = new ChatClientBuilder(ollamaClient)
    .UseFunctionInvocation()
    .Build();

ChatOptions chatOptions = new()
{
    Tools = [AIFunctionFactory.Create(GetWeather)]
};

Console.WriteLine(await client.CompleteAsync("Do I need an umbrella?", chatOptions));

[Description("Gets the weather")]
static string GetWeather() => Random.Shared.NextDouble() > 0.5 ? "It's sunny" : "It's raining";
Caching
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
IDistributedCache cache = new MemoryDistributedCache(Options.Create(new MemoryDistributedCacheOptions()));
IChatClient ollamaClient = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
IChatClient client = new ChatClientBuilder(ollamaClient)
    .UseDistributedCache(cache)
    .Build();

for (int i = 0; i < 3; i++)
{
    await foreach (var message in client.CompleteStreamingAsync("In less than 100 words, what is AI?"))
    {
        Console.Write(message);
    }

    Console.WriteLine();
    Console.WriteLine();
}
Telemetry
using Microsoft.Extensions.AI;
using OpenTelemetry.Trace;
// Configure OpenTelemetry exporter
var sourceName = Guid.NewGuid().ToString();
var tracerProvider = OpenTelemetry.Sdk.CreateTracerProviderBuilder()
.AddSource(sourceName)
.AddConsoleExporter()
.Build();
IChatClient ollamaClient = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
IChatClient client = new ChatClientBuilder(ollamaClient)
    .UseOpenTelemetry(sourceName: sourceName, configure: c => c.EnableSensitiveData = true)
    .Build();
Console.WriteLine(await client.CompleteAsync("What is AI?"));
Telemetry, Caching, and Tool Calling
using System.ComponentModel;
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
using OpenTelemetry.Trace;
// Configure telemetry
var sourceName = Guid.NewGuid().ToString();
var tracerProvider = OpenTelemetry.Sdk.CreateTracerProviderBuilder()
.AddSource(sourceName)
.AddConsoleExporter()
.Build();
// Configure caching
IDistributedCache cache = new MemoryDistributedCache(Options.Create(new MemoryDistributedCacheOptions()));
// Configure tool calling
var chatOptions = new ChatOptions
{
    Tools = [AIFunctionFactory.Create(GetPersonAge)]
};
IChatClient ollamaClient = new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1");
IChatClient client = new ChatClientBuilder(ollamaClient)
    .UseDistributedCache(cache)
    .UseFunctionInvocation()
    .UseOpenTelemetry(sourceName: sourceName, configure: c => c.EnableSensitiveData = true)
    .Build();
for (int i = 0; i < 3; i++)
{
    Console.WriteLine(await client.CompleteAsync("How much older is Alice than Bob?", chatOptions));
}

[Description("Gets the age of a person specified by name.")]
static int GetPersonAge(string personName) =>
    personName switch
    {
        "Alice" => 42,
        "Bob" => 35,
        _ => 26,
    };
Text embedding generation
using Microsoft.Extensions.AI;
IEmbeddingGenerator<string, Embedding<float>> generator =
new OllamaEmbeddingGenerator(new Uri("http://localhost:11434/"), "all-minilm");
var embeddings = await generator.GenerateAsync("What is AI?");
Console.WriteLine(string.Join(", ", embeddings[0].Vector.ToArray()));
Text embedding generation with caching
using Microsoft.Extensions.AI;
using Microsoft.Extensions.Caching.Distributed;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Options;
IDistributedCache cache = new MemoryDistributedCache(Options.Create(new MemoryDistributedCacheOptions()));
IEmbeddingGenerator<string, Embedding<float>> ollamaGenerator =
new OllamaEmbeddingGenerator(new Uri("http://localhost:11434/"), "all-minilm");
IEmbeddingGenerator<string, Embedding<float>> generator = new EmbeddingGeneratorBuilder<string, Embedding<float>>(ollamaGenerator)
.UseDistributedCache(cache)
.Build();
foreach (var prompt in new[] { "What is AI?", "What is .NET?", "What is AI?" })
{
    var embeddings = await generator.GenerateAsync(prompt);
    Console.WriteLine(string.Join(", ", embeddings[0].Vector.ToArray()));
}
Dependency Injection
using Microsoft.Extensions.AI;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
// App Setup
var builder = Host.CreateApplicationBuilder();
builder.Services.AddDistributedMemoryCache();
builder.Services.AddLogging(b => b.AddConsole().SetMinimumLevel(LogLevel.Trace));
builder.Services.AddChatClient(new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1"))
.UseDistributedCache()
.UseLogging();
var app = builder.Build();
// Elsewhere in the app
var chatClient = app.Services.GetRequiredService<IChatClient>();
Console.WriteLine(await chatClient.CompleteAsync("What is AI?"));
Minimal Web API
using Microsoft.Extensions.AI;
var builder = WebApplication.CreateBuilder(args);
builder.Services.AddChatClient(
new OllamaChatClient(new Uri("http://localhost:11434/"), "llama3.1"));
builder.Services.AddEmbeddingGenerator(new OllamaEmbeddingGenerator(new Uri("http://localhost:11434/"), "all-minilm"));
var app = builder.Build();
app.MapPost("/chat", async (IChatClient client, string message) =>
{
    var response = await client.CompleteAsync(message, cancellationToken: default);
    return response.Message;
});
app.MapPost("/embedding", async (IEmbeddingGenerator<string, Embedding<float>> client, string message) =>
{
    var response = await client.GenerateAsync(message);
    return response[0].Vector;
});
app.Run();
Feedback & Contributing
We welcome feedback and contributions in our GitHub repo.
Product | Versions (compatible and additional computed target frameworks) |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 is compatible. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 is compatible. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies

.NETFramework 4.6.2
- Microsoft.Extensions.AI.Abstractions (>= 9.1.0-preview.1.25064.3)
- System.Net.Http.Json (>= 8.0.1)
- System.Text.Json (>= 8.0.5)

.NETStandard 2.0
- Microsoft.Extensions.AI.Abstractions (>= 9.1.0-preview.1.25064.3)
- System.Net.Http.Json (>= 8.0.1)
- System.Text.Json (>= 8.0.5)

net8.0
- Microsoft.Extensions.AI.Abstractions (>= 9.1.0-preview.1.25064.3)
- System.Net.Http.Json (>= 8.0.1)
- System.Text.Json (>= 8.0.5)

net9.0
- Microsoft.Extensions.AI.Abstractions (>= 9.1.0-preview.1.25064.3)
- System.Net.Http.Json (>= 9.0.1)
- System.Text.Json (>= 9.0.1)
NuGet packages (5)
Showing the top 5 NuGet packages that depend on Microsoft.Extensions.AI.Ollama:
Package | Description |
---|---|
Richasy.AgentKernel.Connectors.Ollama | Agent Kernel connectors for Ollama. |
DaLi.Utils.AI | Common library based on Microsoft.Extensions.AI. |
DaLi.Utils.App.Plugin.AI | Commonly used .NET base utility library. |
VaultForce.MLService | Shared resources. |
DeepMeta.DeepOpenAI | Package Description. |
GitHub repositories (8)
Showing the top 8 popular GitHub repositories that depend on Microsoft.Extensions.AI.Ollama:
Repository | Description |
---|---|
dotnet/eShop | A reference .NET application implementing an eCommerce site. |
microsoft/Generative-AI-for-beginners-dotnet | Five lessons, learn how to really apply AI to your .NET applications. |
getcellm/cellm | Use LLMs in Excel formulas. |
dotnet/ai-samples | |
thangchung/practical-dotnet-aspire | The practical .NET Aspire builds on the coffeeshop app business domain. |
axzxs2001/Asp.NetCoreExperiment | All earlier projects have been moved into the OleVersion directory for retention; the new samples target .NET 5.0, partly upgrading previous samples and partly distilling past work experience for reference. |
afrise/MCPSharp | MCPSharp is a .NET library that helps you build Model Context Protocol (MCP) servers and clients - the standardized API protocol used by AI assistants and models. |
SteveSandersonMS/dotnet-ai-workshop | |
Version | Downloads | Last Updated |
---|---|---|
9.7.0-preview.1.25356.2 | 99 | 7/8/2025 |
9.6.0-preview.1.25310.2 | 1,919 | 6/10/2025 |
9.5.0-preview.1.25265.7 | 8,083 | 5/16/2025 |
9.5.0-preview.1.25262.9 | 750 | 5/13/2025 |
9.4.4-preview.1.25259.16 | 984 | 5/10/2025 |
9.4.3-preview.1.25230.7 | 5,590 | 5/1/2025 |
9.4.0-preview.1.25207.5 | 15,794 | 4/8/2025 |
9.3.0-preview.1.25161.3 | 21,553 | 3/11/2025 |
9.3.0-preview.1.25114.11 | 13,746 | 2/16/2025 |
9.1.0-preview.1.25064.3 | 20,106 | 1/14/2025 |
9.0.1-preview.1.24570.5 | 35,633 | 11/21/2024 |
9.0.0-preview.9.24556.5 | 29,666 | 11/12/2024 |
9.0.0-preview.9.24525.1 | 39,585 | 10/26/2024 |
9.0.0-preview.9.24507.7 | 11,183 | 10/8/2024 |