Llamas 0.1.1
.NET CLI: dotnet add package Llamas --version 0.1.1
Package Manager: NuGet\Install-Package Llamas -Version 0.1.1
PackageReference: <PackageReference Include="Llamas" Version="0.1.1" />
Paket CLI: paket add Llamas --version 0.1.1
Script & Interactive: #r "nuget: Llamas, 0.1.1"
Cake Addin: #addin nuget:?package=Llamas&version=0.1.1
Cake Tool: #tool nuget:?package=Llamas&version=0.1.1
Llamas
Llamas is a .NET client library for Ollama, enabling .NET developers to interact with and leverage large language models.
If using the Llamas.Container package, developers can also host pre-configured instances of Ollama in Docker from their own .NET code, either directly or through the simple dependency injection patterns they are accustomed to, with no configuration knowledge needed.
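The container workflow is not shown on this page. The following is a minimal sketch of what registering a hosted Ollama container might look like; the extension method name `AddOllamaContainer` is an assumption for illustration, not confirmed Llamas.Container API — consult that package's own documentation for the real surface.

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var builder = Host.CreateApplicationBuilder(args);

// Hypothetical extension method from Llamas.Container: spins up a
// pre-configured Ollama docker container and wires up a matching client.
builder.Services.AddOllamaContainer();

await builder.Build().RunAsync();
```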
Llamas is a handwritten client library focused on ergonomics and performance, taking full advantage of IAsyncEnumerable and ndjson to handle and propagate live-streaming data.
This client handles the functionality exposed by the Ollama API, and therefore requires an instance of Ollama to be accessible over the local network or hosted using the Llamas.Container package.
Usage
The IOllamaClient interface describes the functionality of the Ollama client, such as listing models installed locally, pulling new models, generating chat completions, generating embeddings, pushing models, and retrieving details about models.
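The exact method names are not shown on this page; the sketch below illustrates the kind of surface described above, with all member names (`ListModels`, `Chat`) and payload shapes assumed for illustration rather than taken from the actual interface.

```csharp
using System;
using System.Net.Http;
using Llamas;

// Hypothetical usage sketch: method and property names are assumptions.
var client = new OllamaClient(new HttpClient(), new OllamaClientConfiguration());

// List the models installed on the Ollama host.
foreach (var model in await client.ListModels())
    Console.WriteLine(model.Name);

// Stream a chat completion as it is generated; live-streaming results
// surface as IAsyncEnumerable, per the description above.
await foreach (var chunk in client.Chat("Why is the sky blue?"))
    Console.Write(chunk);
```

The streaming loop is the notable part: because the library propagates ndjson responses through IAsyncEnumerable, tokens can be rendered as they arrive rather than after the full completion finishes.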
IOllamaBlobClient contains definitions for blob functionality, including checking for the existence of a data blob and creating one.
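A sketch of the check-then-create pattern that description implies; the method names (`BlobExists`, `CreateBlob`) and the digest format are assumptions for illustration, not confirmed API.

```csharp
using System.IO;
using Llamas;

// Hypothetical blob workflow: upload a local file only if the server
// does not already have a blob with the matching content digest.
IOllamaBlobClient blobs = /* resolved from DI or obtained from the client */;

byte[] data = await File.ReadAllBytesAsync("model.gguf");
string digest = "sha256:..."; // content digest of the blob

if (!await blobs.BlobExists(digest))
    await blobs.CreateBlob(digest, data);
```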
Examples of client use can be found in the examples folder as well as in the integration test suite.
Dependency Injection
Llamas comes with several ways to set up a client using the .NET hosting abstractions. You can inject a client configuration and the client explicitly, or use one of the helper extension methods on IServiceCollection.
services.AddHttpClient(); // IHttpClientFactory and HttpClient can both be injected. Otherwise, a new HttpClient will be created

// The three regions below are alternatives; pick one.

#region Manual Addition
// Add the services manually
var clientConfig = new OllamaClientConfiguration();
services.AddSingleton(clientConfig);
services.AddSingleton<IOllamaClient, OllamaClient>();
#endregion

#region From Configuration
// Automatically inject the configuration and a client
var clientConfig = new OllamaClientConfiguration();
services.AddOllamaClient(clientConfig);
#endregion

#region With Configuration Builder
// Use the lambda parameter to change the default configuration values
services.AddOllamaClient(clientConfig => clientConfig with { Port = 8082 });
#endregion
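Once registered, the client resolves like any other service. A short consumption sketch, assuming the registrations above; the `ListModels` call is a hypothetical method name used for illustration, not confirmed API.

```csharp
using System;
using System.Threading.Tasks;
using Llamas;

// Consumes the IOllamaClient registered via AddOllamaClient above,
// using ordinary constructor injection.
public sealed class ModelLister(IOllamaClient ollama)
{
    public async Task PrintModelsAsync()
    {
        // Hypothetical method name; see IOllamaClient for the actual surface.
        foreach (var model in await ollama.ListModels())
            Console.WriteLine(model.Name);
    }
}
```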
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed. |
Dependencies (net8.0)
- Llamas.Abstractions (>= 0.1.1)
- Microsoft.Extensions.Hosting.Abstractions (>= 8.0.0)
- Microsoft.Extensions.Http (>= 8.0.0)
- System.Text.Json (>= 8.0.3)
NuGet packages (1)
Showing the top 1 NuGet packages that depend on Llamas:
Package | Description |
---|---|
Llamas.Container | Host Ollama docker containers from the comfort of .NET with client support from Llamas |
GitHub repositories
This package is not used by any popular GitHub repositories.
Release Notes
Minor changes to enable testing