ChatGptNet 1.3.8

There is a newer version of this package available.
See the version list below for details.
.NET CLI:
dotnet add package ChatGptNet --version 1.3.8

Package Manager:
NuGet\Install-Package ChatGptNet -Version 1.3.8
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="ChatGptNet" Version="1.3.8" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
paket add ChatGptNet --version 1.3.8

Script & Interactive:
#r "nuget: ChatGptNet, 1.3.8"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

Cake:
// Install ChatGptNet as a Cake Addin
#addin nuget:?package=ChatGptNet&version=1.3.8

// Install ChatGptNet as a Cake Tool
#tool nuget:?package=ChatGptNet&version=1.3.8

ChatGPT for .NET


A ChatGPT integration library for .NET

Installation

The library is available on NuGet. Just search for ChatGptNet in the Package Manager GUI or run the following command in the .NET CLI:

dotnet add package ChatGptNet

Configuration

Register ChatGPT service at application startup:

builder.Services.AddChatGpt(options =>
{
    options.ApiKey = "";
    options.Organization = null;    // Optional
    options.DefaultModel = ChatGptModels.Gpt35Turbo;  // Default: ChatGptModels.Gpt35Turbo
    options.MessageLimit = 16;  // Default: 10
    options.MessageExpiration = TimeSpan.FromMinutes(5);    // Default: 1 hour
});

The API Key can be obtained from the User settings page of your OpenAI account. If you belong to multiple organizations, you can also specify which organization to use; usage from these API requests will count against that organization's subscription quota.

With the DefaultModel property, you can specify the default model that will be used for chat completion, unless you pass an explicit value in the AskAsync method.
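For example, the default can be overridden per request, under the assumption that AskAsync exposes an optional model parameter (the parameter name here is a guess based on the description above; verify it against the library):

var response = await chatGptClient.AskAsync(conversationId, "Hello!",
    model: ChatGptModels.Gpt4);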

Note The ChatGptModels.Gpt4 model is currently in a limited beta and only accessible to those who have been granted access. You can find more information in the models documentation page of the OpenAI site.

ChatGPT is designed for conversational scenarios: users can talk to ChatGPT without specifying the full context for every interaction. However, the conversation history isn't managed by OpenAI, so it's up to us to retain the current state. ChatGptNet handles this requirement using a MemoryCache that stores the messages of each conversation. This behavior can be tuned with the following properties:

  • MessageLimit: specifies how many messages to keep for each conversation. When this limit is reached, the oldest messages are automatically removed.
  • MessageExpiration: specifies how long messages are kept in the cache, regardless of their count.

We can also set ChatGPT parameters for chat completion at startup. Check the official documentation for the list of available parameters and their meaning.
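As a minimal sketch, assuming the options object exposes a DefaultParameters property whose type mirrors the DefaultParameters section of the JSON configuration shown below, these parameters could be set at startup like this:

builder.Services.AddChatGpt(options =>
{
    options.ApiKey = "";

    // Hypothetical sketch: the property names mirror the
    // "DefaultParameters" section of the JSON configuration.
    options.DefaultParameters = new ChatGptParameters
    {
        Temperature = 0.8,
        TopP = 1,
        MaxTokens = 500
    };
});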

The configuration can also be read automatically from IConfiguration, using, for example, a ChatGPT section in the appsettings.json file:

"ChatGPT": {
    "ApiKey": "",
    "MessageLimit": 20,
    "MessageExpiration": "00:30:00",
    "DefaultModel": "gpt-3.5-turbo",
    "ThrowExceptionOnError": true,
    "User": "UserName",
    "Organization": "My-Organization",
    "DefaultParameters": {
        "Temperature": 0.8,
        "TopP": 1,
        "Choices": 1,
        "MaxTokens": 500,
        "PresencePenalty": 0,
        "FrequencyPenalty": 0
    }
}

And then use the corresponding overload of the AddChatGpt method:

// Adds ChatGPT service using settings from IConfiguration.
services.AddChatGpt(context.Configuration);

The AddChatGpt method also has an overload that accepts an IServiceProvider as an argument. It can be used, for example, in a Web API that must support scenarios in which every user has a different API key, retrieved from a database via Dependency Injection:

builder.Services.AddChatGpt((services, options) =>
{
    var accountService = services.GetRequiredService<IAccountService>();

    // Dynamically gets the API key from the service (value elided here).
    var apiKey = "...";

    options.ApiKey = apiKey;
});

Usage

The library can be used in any .NET application built with .NET 6.0 or later. For example, we can create a Minimal API in this way:

app.MapPost("/api/chat/ask", async (Request request, IChatGptClient chatGptClient) =>
{
    var response = await chatGptClient.AskAsync(request.ConversationId, request.Message);
    return TypedResults.Ok(response);
})
.WithOpenApi();

// ...

public record class Request(Guid ConversationId, string Message);

If we just want to retrieve the response message, we can call the GetMessage method:

var message = response.GetMessage();

Handling a conversation

The AskAsync method has an overload (the one shown in the example above) that requires a conversationId parameter. If we pass an empty value, a random one is generated and returned. We can pass this value in subsequent invocations of AskAsync so that the library automatically retrieves the previous messages of the current conversation (according to the MessageLimit and MessageExpiration settings) and sends them to the chat completion API.
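A typical round trip might look like the following sketch (it assumes the generated identifier is exposed on the response via a ConversationId property; verify the actual property name against the library):

// Pass an empty Guid: the library generates a new conversation id.
var response = await chatGptClient.AskAsync(Guid.Empty, "What is the capital of France?");

// Reuse the returned id so previous messages are sent as context.
var conversationId = response.ConversationId;
var followUp = await chatGptClient.AskAsync(conversationId, "What is its population?");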

Response streaming

The chat completion API supports response streaming. When using this feature, partial message deltas are sent, as in ChatGPT: tokens are sent as data-only server-sent events as they become available. ChatGptNet provides response streaming via the AskStreamAsync method:

// Requests a streaming response.
var responseStream = chatGptClient.AskStreamAsync(conversationId, message);

await foreach (var response in responseStream)
{
    Console.Write(response.GetMessage());
    await Task.Delay(80);
}


Response streaming works by returning an IAsyncEnumerable, so it can be used even in a Web API project:

app.MapGet("/api/chat/stream", (Guid? conversationId, string message, IChatGptClient chatGptClient) =>
{
    async IAsyncEnumerable<string> Stream()
    {
        // Requests a streaming response.
        var responseStream = chatGptClient.AskStreamAsync(conversationId.GetValueOrDefault(), message);

        // Uses the "AsDeltas" extension method to retrieve partial message deltas only.
        await foreach (var delta in responseStream.AsDeltas())
        {
            yield return delta;
            await Task.Delay(50);
        }
    }

    return Stream();
})
.WithOpenApi();


The library is also fully compatible with Blazor WebAssembly applications.


Check the Samples folder for more information about the different implementations.

Changing the assistant's behavior

ChatGPT supports messages with the system role to influence how the assistant should behave. For example, we can tell ChatGPT something like:

  • You are a helpful assistant
  • Answer like Shakespeare
  • Give me only wrong answers
  • Answer in rhyme

ChatGptNet provides this feature using the SetupAsync method:

var conversationId = await chatGptClient.SetupAsync("Answer in rhyme");

If we use the same conversationId when calling AskAsync, then the system message will be automatically sent along with every request, so that the assistant will know how to behave.
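Putting the two calls together, a conversation with a custom behavior might look like this sketch:

// The system message is attached to this conversation id...
var conversationId = await chatGptClient.SetupAsync("Answer in rhyme");

// ...so every subsequent AskAsync call with the same id inherits it.
var response = await chatGptClient.AskAsync(conversationId, "Tell me about the sea");
Console.WriteLine(response.GetMessage());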

Note The system message does not count toward the message limit.

Deleting a conversation

Conversation history is automatically deleted when the expiration time (specified by the MessageExpiration property) is reached. However, if necessary, it is possible to clear the history immediately:

await chatGptClient.DeleteConversationAsync(conversationId, preserveSetup: false);

The preserveSetup argument lets you decide whether to also keep the system message that has been set with the SetupAsync method (default: false).
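For example, to clear the history while keeping the assistant's configured behavior:

// Deletes the conversation messages but preserves the system message
// previously set with SetupAsync.
await chatGptClient.DeleteConversationAsync(conversationId, preserveSetup: true);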

Contribute

The project is constantly evolving. Contributions are welcome. Feel free to file issues and pull requests on the repo and we'll address them as we can.

Warning Remember to work on the develop branch, don't use the master branch directly. Create Pull Requests targeting develop.

Compatible and additional computed target framework versions:

.NET: net6.0 and net7.0 are compatible. Computed: net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows.
Learn more about Target Frameworks and .NET Standard.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
3.3.9 62 11/6/2024
3.3.7 73 11/5/2024
3.3.5 157 10/15/2024
3.3.4 394 9/11/2024
3.3.2 130 9/5/2024
3.2.18 267 7/23/2024
3.2.17 454 6/26/2024
3.2.15 1,261 5/17/2024
3.2.14 215 5/3/2024
3.2.10 1,141 3/13/2024
3.2.9 153 3/7/2024
3.2.2 379 1/29/2024
3.1.2 267 1/17/2024
3.0.7 241 1/8/2024
3.0.6 817 12/11/2023
2.6.3 1,517 10/16/2023
2.5.4 462 9/19/2023
2.4.7 268 9/18/2023
2.2.15 331 9/7/2023
2.2.14 511 8/11/2023
2.2.12 366 8/3/2023
2.2.11 269 7/27/2023
2.2.8 187 7/21/2023
2.2.4 171 7/20/2023
2.1.9 239 7/12/2023
2.1.6 255 6/26/2023
2.1.3 256 6/19/2023
2.0.16 217 6/14/2023
2.0.14 259 6/7/2023
2.0.11 193 6/5/2023
2.0.9 198 5/30/2023
2.0.3 296 5/16/2023
1.4.2 435 4/17/2023
1.3.8 249 4/10/2023
1.3.6 231 4/5/2023
1.3.3 253 3/29/2023
1.2.3 566 3/24/2023
1.1.6 257 3/22/2023
1.1.4 264 3/21/2023
1.0.5 263 3/19/2023
0.1.12 264 3/12/2023
0.1.8 241 3/11/2023