OllamaSharp 3.0.7
See the version list below for details.
- .NET CLI: dotnet add package OllamaSharp --version 3.0.7
- Package Manager: NuGet\Install-Package OllamaSharp -Version 3.0.7
- PackageReference: <PackageReference Include="OllamaSharp" Version="3.0.7" />
- Central package management: <PackageVersion Include="OllamaSharp" Version="3.0.7" /> together with <PackageReference Include="OllamaSharp" />
- Paket CLI: paket add OllamaSharp --version 3.0.7
- Script & Interactive: #r "nuget: OllamaSharp, 3.0.7"
- File-based apps: #:package OllamaSharp@3.0.7
- Cake Addin: #addin nuget:?package=OllamaSharp&version=3.0.7
- Cake Tool: #tool nuget:?package=OllamaSharp&version=3.0.7
<p align="center"> <img alt="ollama" height="200px" src="https://github.com/awaescher/OllamaSharp/blob/main/Ollama.png"> </p>
OllamaSharp 🦙
OllamaSharp provides .NET bindings for the Ollama API, simplifying interaction with Ollama both locally and remotely.
Features
- Ease of use: Interact with Ollama in just a few lines of code.
- API endpoint coverage: Support for all Ollama API endpoints including chats, embeddings, listing models, pulling and creating new models, and more.
- Real-time streaming: Stream responses directly to your application.
- Progress reporting: Get real-time progress feedback on tasks like model pulling.
- Support for vision models and tools (function calling).
Usage
OllamaSharp wraps every Ollama API endpoint in awaitable methods that fully support response streaming.
The following list shows a few simple code examples.
ℹ Try our full-featured Ollama API client app OllamaSharpConsole to interact with your Ollama instance.
Initializing
// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
// select the model to use for all further operations
ollama.SelectedModel = "llama3.1:8b";
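The same client also works against a remote Ollama host; only the URI changes. The sketch below assumes the constructor overload that accepts a preconfigured HttpClient (handy for custom timeouts or headers); verify the exact signature against the installed version.
// a sketch: pointing the client at a remote host with a longer timeout
// (assumes the HttpClient-based constructor overload is available)
var httpClient = new HttpClient
{
    BaseAddress = new Uri("http://my-ollama-server:11434"),
    Timeout = TimeSpan.FromMinutes(5)
};
var remoteOllama = new OllamaApiClient(httpClient);
remoteOllama.SelectedModel = "llama3.1:8b";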
Listing all models that are available locally
var models = await ollama.ListLocalModels();
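The returned list can be inspected directly. A short sketch, assuming each entry exposes Name and Size properties mirroring the fields of the Ollama /api/tags endpoint:
// print each local model with its approximate size in gigabytes
// (property names assumed to mirror the Ollama /api/tags response)
foreach (var model in models)
    Console.WriteLine($"{model.Name} ({model.Size / 1024.0 / 1024 / 1024:F1} GB)");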
Pulling a model and reporting progress
await foreach (var status in ollama.PullModel("llama3.1:405b"))
Console.WriteLine($"{status.Percent}% {status.Status}");
Generating a completion directly into the console
await foreach (var stream in ollama.Generate("How are you today?"))
Console.Write(stream.Response);
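Vision models work the same way, with image data attached to the request. A minimal sketch, assuming a GenerateRequest type with Model, Prompt and Images members analogous to the fields of the Ollama /api/generate endpoint; check the installed version for the exact names.
// a sketch: asking a vision model about a local image
// (Images is assumed to take base64-encoded image data)
var imageBytes = await File.ReadAllBytesAsync("photo.jpg");
var request = new GenerateRequest
{
    Model = "llava",
    Prompt = "What is shown in this picture?",
    Images = new[] { Convert.ToBase64String(imageBytes) }
};
await foreach (var stream in ollama.Generate(request))
    Console.Write(stream.Response);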
Building interactive chats
var chat = new Chat(ollama);
while (true)
{
    var message = Console.ReadLine() ?? string.Empty;
    await foreach (var answerToken in chat.Send(message))
        Console.Write(answerToken);
}
// messages, including their roles and tool calls, are automatically tracked
// within the chat object and are accessible via the Messages property
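The tracked history can be read back at any time, for example to persist or display the conversation. A short sketch, assuming each message exposes Role and Content properties matching the Ollama chat API:
// a sketch: dumping the conversation tracked by the Chat instance
// (Role and Content property names are assumed)
foreach (var msg in chat.Messages)
    Console.WriteLine($"{msg.Role}: {msg.Content}");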
Credits
Icon and name were reused from the amazing Ollama project.
I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost mili-tan, who always keeps OllamaSharp in sync with the Ollama API. ❤
| Product | Compatible and computed target framework versions |
|---|---|
| .NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
| .NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
| .NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
| .NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
| MonoAndroid | monoandroid was computed. |
| MonoMac | monomac was computed. |
| MonoTouch | monotouch was computed. |
| Tizen | tizen40 was computed. tizen60 was computed. |
| Xamarin.iOS | xamarinios was computed. |
| Xamarin.Mac | xamarinmac was computed. |
| Xamarin.TVOS | xamarintvos was computed. |
| Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies (.NET Standard 2.0)
- Microsoft.Bcl.AsyncInterfaces (>= 8.0.0)
- System.Text.Json (>= 8.0.4)
NuGet packages (34)
Showing the top 5 NuGet packages that depend on OllamaSharp:
| Package | Description |
|---|---|
| Microsoft.SemanticKernel.Connectors.Ollama | Semantic Kernel connector for Ollama. Contains services for text generation, chat completion and text embeddings. |
| Microsoft.KernelMemory.AI.Ollama | Provide access to Ollama LLM models in Kernel Memory to generate embeddings and text |
| CommunityToolkit.Aspire.OllamaSharp | An Aspire client integration for the OllamaSharp library. |
| CommunityToolkit.Aspire.Hosting.Ollama | An Aspire integration leveraging the Ollama container with support for downloading a model on startup. |
| OllamaSharp.ModelContextProtocol | Use tools from model context protocol (MCP) servers with Ollama |
GitHub repositories (18)
Showing the top 18 popular GitHub repositories that depend on OllamaSharp:
| Repository | Description |
|---|---|
| microsoft/semantic-kernel | Integrate cutting-edge LLM technology quickly and easily into your apps |
| testcontainers/testcontainers-dotnet | A library to support tests with throwaway instances of Docker containers for all compatible .NET Standard versions. |
| dotnet/extensions | This repository contains a suite of libraries that provide facilities commonly needed when creating production-ready applications. |
| microsoft/Generative-AI-for-beginners-dotnet | Five lessons, learn how to really apply AI to your .NET Applications |
| microsoft/kernel-memory | Research project. A Memory solution for users, teams, and applications. |
| microsoft/ai-dev-gallery | An open-source project for Windows developers to learn how to add AI with local models and APIs to Windows apps. |
| getcellm/cellm | Use LLMs in Excel formulas |
| mixcore/mix.core | 🚀 A future-proof enterprise web CMS supporting both headless and decoupled approaches. Build any type of app with customizable APIs on ASP.NET Core/.NET Core. Completely open-source and designed for flexibility. |
| dotnet/ai-samples | |
| CommunityToolkit/Aspire | A community project with additional components and extensions for Aspire |
| PowerShell/AIShell | An interactive shell to work with AI-powered assistance providers |
| anobaka/Bakabase | A local media manager for all types of files, supporting audio/video, comics, image collections, novels, Bilibili videos, games, and even mods. |
| axzxs2001/Asp.NetCoreExperiment | All earlier projects have been moved to the OleVersion directory for preservation. New samples are based mainly on .NET 5.0, partly upgrading previous samples and partly summarizing past work experience for reference. |
| afrise/MCPSharp | MCPSharp is a .NET library that helps you build Model Context Protocol (MCP) servers and clients - the standardized API protocol used by AI assistants and models. |
| lindexi/lindexi_gd | Code used in blog posts |
| rwjdk/MicrosoftAgentFrameworkSamples | Samples demonstrating the Microsoft Agent Framework in C# |
| petabridge/memorizer | Vector-search powered agent memory MCP server |
| PacktPublishing/.NET-MAUI-Cookbook | .NET MAUI Cookbook, published by Packt |
| Version | Downloads | Last Updated |
|---|---|---|
| 5.4.16 | 19,709 | 1/28/2026 |
| 5.4.15 | 515 | 1/28/2026 |
| 5.4.13 | 2,423 | 1/26/2026 |
| 5.4.12 | 89,351 | 12/5/2025 |
| 5.4.11 | 61,491 | 11/14/2025 |
| 5.4.10 | 10,665 | 11/11/2025 |
| 5.4.9 | 3,972 | 11/10/2025 |
| 5.4.8 | 77,942 | 10/17/2025 |
| 5.4.7 | 36,650 | 9/25/2025 |
| 5.4.6 | 46,618 | 9/22/2025 |
| 5.4.5 | 10,852 | 9/19/2025 |
| 5.4.4 | 6,269 | 9/15/2025 |
| 5.4.3 | 13,687 | 9/15/2025 |
| 5.4.2 | 1,631 | 9/15/2025 |
| 5.4.1 | 12,526 | 9/11/2025 |
| 5.3.12 | 4,046 | 9/10/2025 |
| 5.3.11 | 1,381 | 9/10/2025 |
| 5.3.10 | 1,386 | 9/10/2025 |
| 5.3.9 | 1,386 | 9/10/2025 |
| 3.0.7 | 35,771 | 9/12/2024 |