TalkBack-LLM 1.0.5

.NET CLI:
dotnet add package TalkBack-LLM --version 1.0.5

Package Manager (run within the Visual Studio Package Manager Console, as it uses the NuGet module's version of Install-Package):
NuGet\Install-Package TalkBack-LLM -Version 1.0.5

PackageReference (for projects that support PackageReference, copy this XML node into the project file):
<PackageReference Include="TalkBack-LLM" Version="1.0.5" />

Central Package Management (for projects that support CPM, put the versioned node in the solution's Directory.Packages.props file and the unversioned node in the project file):
Directory.Packages.props:
<PackageVersion Include="TalkBack-LLM" Version="1.0.5" />
Project file:
<PackageReference Include="TalkBack-LLM" />

Paket CLI:
paket add TalkBack-LLM --version 1.0.5

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy it into the interactive tool or script source):
#r "nuget: TalkBack-LLM, 1.0.5"

File-based apps (the #:package directive can be used in C# file-based apps starting in .NET 10 preview 4; place it in a .cs file before any lines of code):
#:package TalkBack-LLM@1.0.5

Cake Addin:
#addin nuget:?package=TalkBack-LLM&version=1.0.5

Cake Tool:
#tool nuget:?package=TalkBack-LLM&version=1.0.5

TalkBack

TalkBack is a .NET library that abstracts chat with LLMs behind a single interface.

Overview

Each LLM has its own API, its own model options, and so forth. An application that uses multiple models therefore has to implement code for interacting with each API.

TalkBack is designed to be lightweight and easy to use. It exposes the custom options of each LLM while otherwise providing a single interface for interacting with the APIs, whether executing blocking or streaming completions.

TalkBack uses an IConversationContext that stores the history of a conversation, maintaining context between calls.
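For instance, passing the same context into successive calls lets a follow-up prompt refer back to an earlier exchange. This is a minimal sketch based on the APIs used in the examples below (CreateNewContext, SystemMessage, CompleteAsync); it assumes _llm is an already-initialized ILLM instance:

var context = _llm.CreateNewContext();
context.SystemMessage = "You are a terse assistant.";

var first = await _llm.CompleteAsync("Name one planet.", context);
// Because the context carries the history, the follow-up can
// refer to "that planet" from the previous turn:
var second = await _llm.CompleteAsync("Now name a moon of that planet.", context);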

Supported LLMs

Ollama, OpenAI, Claude, Groq

Installation

dotnet add package TalkBack-LLM

Startup

Call the RegisterTalkBack() extension method on IServiceCollection to add all the services to the DI container.
using TalkBack;

...
	services.RegisterTalkBack();

Example

The non-streaming version of the code is very simple:

_conversationContext.SystemMessage = "You are an expert C# programmer!";
var result = await _llm.CompleteAsync("Please write a command-line C# program to retrieve the current weather for Paris, France, from OpenWeather.", _conversationContext);
string responseText = result.Response;

The streaming version isn't much more complicated:

public class MyClass : ICompletionReceiver
{

...
    public async Task SomeMethod()
    {
        await _llm.StreamCompletionAsync(this, prompt, _conversationContext);
    }
...

    public Task ReceiveCompletionPartAsync(IModelResponse response, bool final)
    {
        if (!final) // When final is true, response contains a copy of the entire streamed response.
        {
            Console.Write(response.Response);
        }
        return Task.CompletedTask;
    }
}

This is a command-line chat app. When you run it, it gives you a "> " prompt; type your message and it responds. The conversation continues until you enter "q" by itself on a line and press Enter. Simply change the provider setup to switch LLMs.

using Microsoft.Extensions.DependencyInjection;
using System.Diagnostics;
using TalkBack;
using TalkBack.Interfaces;
using TalkBack.LLMProviders.Claude;
using TalkBack.LLMProviders.Ollama;
using TalkBack.LLMProviders.OpenAI;

bool streaming = true; // <-- Change to false for blocking version.

// Add services for DI and add TalkBack
var services = new ServiceCollection();
services.RegisterTalkBack();

var serviceProvider = services.BuildServiceProvider();

var providerActivator = serviceProvider.GetService<IProviderActivator>();
var llm = serviceProvider.GetService<ILLM>();

/*
    Examples of the 3 current providers:

var provider = providerActivator!.CreateProvider<OpenAIProvider>();
provider!.InitProvider(new OpenAIOptions()
{
    ApiKey = "<your key here>",
    Model = "gpt-3.5-turbo"
});

var provider = providerActivator!.CreateProvider<OllamaProvider>();
provider!.InitProvider(new OllamaOptions()
{
    ServerUrl = "http://localhost:11434/api",
    Model = "llama2"
});

var provider = providerActivator!.CreateProvider<ClaudeProvider>();
provider!.InitProvider(new ClaudeOptions()
{
    ApiKey = "<your key here>",
    Model = "claude-2.1"
});
*/

var provider = providerActivator!.CreateProvider<OllamaProvider>();
provider!.InitProvider(new OllamaOptions()
{
    ServerUrl = "http://localhost:11434/api",
    Model = "llama2"
});

llm!.SetProvider(provider);

var streamReceiver = new StreamReceiver();

var conversationContext = llm.CreateNewContext();

string prompt = string.Empty;
Console.WriteLine("'q' + enter to quit");
while (prompt.ToLower() != "q")
{
    Console.Write("> ");
    prompt = Console.ReadLine() ?? string.Empty;
    if (prompt.ToLower() == "q")
    {
        break; // Don't send the quit command to the LLM.
    }
    if (streaming)
    {
        streamReceiver.Done = false;
        Console.Write(Environment.NewLine + "Response: ");
        await llm.StreamCompletionAsync(streamReceiver, prompt, conversationContext);

        // Wait for it to finish streaming.
        while (!streamReceiver.Done)
        {
            await Task.Delay(1000);
        }
        Console.WriteLine(Environment.NewLine);
    }
    else
    {
        var result = await llm.CompleteAsync(prompt, conversationContext);
        Console.WriteLine(Environment.NewLine + "Response: " + result.Response + Environment.NewLine);
    }

}


public class StreamReceiver : ICompletionReceiver
{
    public bool Done { get; set; } = true;
    public Task ReceiveCompletionPartAsync(IModelResponse response, bool final)
    {
        Done = final;
        if (!final)
        {
            Console.Write(response.Response);
        }
        return Task.CompletedTask;
    }
}
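Groq is listed among the supported LLMs but does not appear in the provider examples above. Assuming it follows the same naming pattern as the other providers, the setup would presumably look like the following; note that GroqProvider, GroqOptions, and the model name are inferred from that pattern, not confirmed by this page, so check the Wiki for the actual types:

// Hypothetical: names follow the OpenAI/Ollama/Claude pattern above.
var provider = providerActivator!.CreateProvider<GroqProvider>();
provider!.InitProvider(new GroqOptions()
{
    ApiKey = "<your key here>",
    Model = "<your Groq model here>"
});
llm!.SetProvider(provider);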

More

See the Wiki for some more realistic examples with normal dependency injection.

Product compatible and additional computed target framework versions:
.NET: net6.0, net7.0, and net8.0 are compatible. net9.0 and net10.0 were computed, as were the platform-specific variants (android, ios, maccatalyst, macos, tvos, windows, and, for net8.0 and later, browser) of each framework.
Learn more about Target Frameworks and .NET Standard.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
1.0.5 161 10/2/2024
1.0.4 120 10/2/2024
1.0.3 118 10/1/2024
1.0.2 123 9/19/2024
1.0.1 127 9/8/2024
1.0.0 143 6/16/2024