Ollama 1.15.1-dev.47
See the version list below for details.
.NET CLI:
dotnet add package Ollama --version 1.15.1-dev.47

Package Manager:
NuGet\Install-Package Ollama -Version 1.15.1-dev.47

PackageReference (project file):
<PackageReference Include="Ollama" Version="1.15.1-dev.47" />

Central Package Management (Directory.Packages.props plus project file):
<PackageVersion Include="Ollama" Version="1.15.1-dev.47" />
<PackageReference Include="Ollama" />

Paket CLI:
paket add Ollama --version 1.15.1-dev.47

Script & Interactive:
#r "nuget: Ollama, 1.15.1-dev.47"

File-based apps:
#:package Ollama@1.15.1-dev.47

Cake:
#addin nuget:?package=Ollama&version=1.15.1-dev.47&prerelease
#tool nuget:?package=Ollama&version=1.15.1-dev.47&prerelease
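For context, the PackageReference lines above belong inside an ItemGroup of your project file. A minimal sketch of a console-app .csproj (the target framework here is illustrative, not a requirement of the package):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>net10.0</TargetFramework>
    <Nullable>enable</Nullable>
  </PropertyGroup>
  <ItemGroup>
    <!-- Prerelease version, as listed above -->
    <PackageReference Include="Ollama" Version="1.15.1-dev.47" />
  </ItemGroup>
</Project>
```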
Ollama SDK for .NET 🦙
Features 🔥
- Fully generated C# SDK based on the official Ollama OpenAPI specification using AutoSDK
- Automatic releases of new preview versions when the official OpenAPI specification changes
- Source generator to define tools natively through C# interfaces
- All modern .NET features - nullability, trimming, NativeAOT, etc.
- Supports .NET Framework / .NET Standard 2.0
- Support for all Ollama API endpoints, including chat, embeddings, listing models, pulling and creating new models, and more
Usage
Initializing
```csharp
using var ollama = new OllamaApiClient();
// or, if you have a custom server:
// using var ollama = new OllamaApiClient(baseUri: new Uri("http://10.10.5.85:11434"));

// List the locally available models
var models = await ollama.ListAsync();

// Pull a model and report progress
await foreach (var response in ollama.PullAsStreamAsync("all-minilm"))
{
    Console.WriteLine($"{response.Status}. Progress: {response.Completed}/{response.Total}");
}
// or just pull the model and wait for it to finish
await ollama.PullAsStreamAsync("all-minilm").EnsureSuccessAsync();

// Generate an embedding
var embedding = await ollama.EmbedAsync(
    model: "all-minilm",
    input: "hello");

// Stream a completion directly to the console
var enumerable = ollama.GenerateAsStreamAsync("llama3.2", "answer 5 random words");
await foreach (var response in enumerable)
{
    Console.WriteLine($"> {response.Response}");
}

// or wait for the full response
var lastResponse = await ollama.GenerateAsync("llama3.2", "answer 123");
Console.WriteLine(lastResponse.Response);

// Simple interactive chat loop
var chat = ollama.Chat("mistral");
while (true)
{
    var message = await chat.SendAsync("answer 123");
    Console.WriteLine(message.Content);

    var newMessage = Console.ReadLine();
    await chat.SendAsync(newMessage);
}
```
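Embeddings such as the one returned by `EmbedAsync` above are typically compared with cosine similarity. A minimal helper, independent of the SDK, under the assumption that the embedding is available as a `float[]` (the `VectorMath` class is a hypothetical name, not part of this package):

```csharp
using System;

// Demo: two identical toy "embeddings" should have similarity ~1.
var a = new float[] { 0.1f, 0.2f, 0.3f };
var b = new float[] { 0.1f, 0.2f, 0.3f };
Console.WriteLine(Math.Round(VectorMath.CosineSimilarity(a, b), 6));

static class VectorMath
{
    // Cosine similarity between two equal-length vectors:
    // dot(a, b) / (|a| * |b|); 1.0 means identical direction, 0.0 orthogonal.
    public static double CosineSimilarity(float[] a, float[] b)
    {
        if (a.Length != b.Length)
            throw new ArgumentException("Vectors must have the same length.");

        double dot = 0, normA = 0, normB = 0;
        for (var i = 0; i < a.Length; i++)
        {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.Sqrt(normA) * Math.Sqrt(normB));
    }
}
```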
Tools
```csharp
using var ollama = new OllamaApiClient();
var chat = ollama.Chat(
    model: "llama3.2",
    systemMessage: "You are a helpful weather assistant.",
    autoCallTools: true);

var service = new WeatherService();
chat.AddToolService(service.AsTools().AsOllamaTools(), service.AsCalls());

try
{
    _ = await chat.SendAsync("What is the current temperature in Dubai, UAE in Celsius?");
}
finally
{
    Console.WriteLine(chat.PrintMessages());
}
```
```
> System:
You are a helpful weather assistant.
> User:
What is the current temperature in Dubai, UAE in Celsius?
> Assistant:
Tool calls:
GetCurrentWeather({"location":"Dubai, UAE","unit":"celsius"})
> Tool:
{"location":"Dubai, UAE","temperature":22,"unit":"celsius","description":"Sunny"}
> Assistant:
The current temperature in Dubai, UAE is 22°C.
```
The tool definitions used above:

```csharp
using System.ComponentModel; // for [Description]
using System.Threading;
using System.Threading.Tasks;
using CSharpToJsonSchema;

public enum Unit
{
    Celsius,
    Fahrenheit,
}

public class Weather
{
    public string Location { get; set; } = string.Empty;
    public double Temperature { get; set; }
    public Unit Unit { get; set; }
    public string Description { get; set; } = string.Empty;
}

[GenerateJsonSchema]
public interface IWeatherFunctions
{
    [Description("Get the current weather in a given location")]
    public Task<Weather> GetCurrentWeatherAsync(
        [Description("The city and state, e.g. San Francisco, CA")] string location,
        Unit unit = Unit.Celsius,
        CancellationToken cancellationToken = default);
}

public class WeatherService : IWeatherFunctions
{
    public Task<Weather> GetCurrentWeatherAsync(string location, Unit unit = Unit.Celsius, CancellationToken cancellationToken = default)
    {
        return Task.FromResult(new Weather
        {
            Location = location,
            Temperature = 22.0,
            Unit = unit,
            Description = "Sunny",
        });
    }
}
```
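For reference, the `Tool` line in the transcript above is simply the `Weather` result serialized to camel-cased JSON. A minimal, self-contained sketch of that serialization using System.Text.Json (illustrative only; this is not necessarily the SDK's internal serializer):

```csharp
using System;
using System.Text.Json;
using System.Text.Json.Serialization;

var weather = new Weather
{
    Location = "Dubai, UAE",
    Temperature = 22,
    Unit = Unit.Celsius,
    Description = "Sunny",
};

// camelCase property names and lower-cased enum values, matching the transcript
var options = new JsonSerializerOptions
{
    PropertyNamingPolicy = JsonNamingPolicy.CamelCase,
    Converters = { new JsonStringEnumConverter(JsonNamingPolicy.CamelCase) },
};
Console.WriteLine(JsonSerializer.Serialize(weather, options));
// {"location":"Dubai, UAE","temperature":22,"unit":"celsius","description":"Sunny"}

public enum Unit { Celsius, Fahrenheit }

public class Weather
{
    public string Location { get; set; } = string.Empty;
    public double Temperature { get; set; }
    public Unit Unit { get; set; }
    public string Description { get; set; } = string.Empty;
}
```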
Community Projects
Projects built on top of this SDK:
LangMate
A modular and extensible AI chat application platform built on this SDK:
- LangMate.Core SDK - Developer-friendly wrapper for easy Ollama integration in .NET apps
- Blazor Server Chat UI - Real-time, interactive chat interface with streaming responses
- RESTful Web API - Backend service with OpenAPI documentation (Scalar integration)
- MongoDB Integration - Persistent chat history and caching layer
- Polly-Based Resilience - Circuit breakers, retry logic, and timeout policies
- File Upload Support - Multimodal capabilities with base64 image preview for vision models
- .NET Aspire Compatible - Full orchestration support for Docker/Kubernetes deployment
- Production-ready .NET 9 implementation with clean, testable architecture
Credits
Icon and name were reused from the amazing Ollama project.
The project was forked from this repository,
after which automatic code generation was applied based on the official Ollama OpenAPI specification.
Support
Priority place for bugs: https://github.com/tryAGI/Ollama/issues
Priority place for ideas and general questions: https://github.com/tryAGI/Ollama/discussions
Discord: https://discord.gg/Ca2xhfBf3v
Acknowledgments
This project is supported by JetBrains through the Open Source Support Program.
This project is supported by CodeRabbit through the Open Source Support Program.
| Product | Compatible and additional computed target frameworks |
|---|---|
| .NET | net10.0 is compatible. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
Dependencies (net10.0)
- CSharpToJsonSchema (>= 3.10.1)
NuGet packages (5)
Showing the top 5 NuGet packages that depend on Ollama:
- LangChain.Providers.Ollama: Ollama Chat model provider.
- Richasy.AgentKernel.Connectors.Ollama: Agent Kernel connectors for Ollama.
- LangMate.Core: a lightweight, extensible .NET SDK designed to make working with Ollama-powered local AI models seamless and developer-friendly. It abstracts away the complexity of managing conversations, interacting with Ollama endpoints, and persisting chat history, while offering resiliency, caching, and extensibility.
- LangMate.Abstractions: shared interfaces, contracts, and data transfer objects used across the LangMate SDK.
- SemanticDocIngestor.Core: a .NET 9 SDK for document ingestion, semantic search, and retrieval-augmented generation (RAG) with hybrid search capabilities. Build intelligent document processing pipelines with vector and keyword search powered by Qdrant and Elasticsearch. Supports multi-source ingestion from local files, OneDrive, and Google Drive with real-time progress tracking and AI-powered answers using Ollama LLM models.
GitHub repositories
This package is not used by any popular GitHub repositories.
| Version | Downloads | Last Updated |
|---|---|---|
| 1.15.1-dev.50 | 29 | 3/14/2026 |
| 1.15.1-dev.47 | 28 | 3/14/2026 |
| 1.15.1-dev.38 | 607 | 10/10/2025 |
| 1.15.1-dev.28 | 364 | 6/23/2025 |
| 1.15.1-dev.27 | 258 | 6/18/2025 |
| 1.15.1-dev.13 | 845 | 5/8/2025 |
| 1.15.1-dev.12 | 339 | 4/14/2025 |
| 1.15.1-dev.11 | 196 | 4/10/2025 |
| 1.15.1-dev.10 | 223 | 3/30/2025 |
| 1.15.1-dev.9 | 335 | 3/23/2025 |
| 1.15.1-dev.8 | 308 | 3/23/2025 |
| 1.15.1-dev.7 | 204 | 3/17/2025 |
| 1.15.1-dev.6 | 282 | 3/7/2025 |
| 1.15.1-dev.2 | 789 | 2/14/2025 |
| 1.15.1-dev.1 | 373 | 2/7/2025 |
| 1.15.0 | 5,287 | 2/7/2025 |
| 1.14.1-dev.1 | 164 | 1/27/2025 |
| 1.14.0 | 600 | 1/27/2025 |
| 1.13.1-dev.12 | 178 | 1/20/2025 |
| 1.13.1-dev.11 | 141 | 1/7/2025 |