FoundationaLLM.Client.Core 0.9.7-rc230

This is a prerelease version of FoundationaLLM.Client.Core.
There is a newer prerelease version of this package available.
See the version list below for details.
.NET CLI
dotnet add package FoundationaLLM.Client.Core --version 0.9.7-rc230

Package Manager
NuGet\Install-Package FoundationaLLM.Client.Core -Version 0.9.7-rc230

This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="FoundationaLLM.Client.Core" Version="0.9.7-rc230" />

For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)
For projects that support Central Package Management, copy these XML nodes into the solution's Directory.Packages.props file and the project file, respectively, to version and reference the package.

Directory.Packages.props
<PackageVersion Include="FoundationaLLM.Client.Core" Version="0.9.7-rc230" />

Project file
<PackageReference Include="FoundationaLLM.Client.Core" />

Paket CLI
paket add FoundationaLLM.Client.Core --version 0.9.7-rc230

Script & Interactive
#r "nuget: FoundationaLLM.Client.Core, 0.9.7-rc230"

The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

#:package FoundationaLLM.Client.Core@0.9.7-rc230

The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake
#addin nuget:?package=FoundationaLLM.Client.Core&version=0.9.7-rc230&prerelease

Install as a Cake Addin.

#tool nuget:?package=FoundationaLLM.Client.Core&version=0.9.7-rc230&prerelease

Install as a Cake Tool.

FoundationaLLM Core Client

The FoundationaLLM Core Client is a .NET client library that simplifies interacting with the FoundationaLLM Core API. It provides a set of classes and methods that let you work with the Core API in a more intuitive way.

This library contains two primary classes:

  • CoreRESTClient: Provides methods for interacting with the FoundationaLLM Core API over REST. This is the low-level client and gives direct access to all Core API endpoints.
  • CoreClient: Provides a higher-level abstraction over the Core API. It is designed to simplify common operations through a more intuitive interface; it does not expose every method available in CoreRESTClient, but it offers a more user-friendly way to work with the Core API.

You can use either class on its own; choose the one that fits your requirements. If you need direct access to all Core API endpoints, use the CoreRESTClient class. If you prefer a more user-friendly interface, use the CoreClient class.

Getting started

If you do not have FoundationaLLM deployed, follow the Quick Start Deployment instructions to get FoundationaLLM deployed in your Azure subscription.

Install the NuGet package:

dotnet add package FoundationaLLM.Client.Core

Manual service instantiation

Complete the following steps if you do not want to use dependency injection:

  1. Create new instances of the CoreRESTClient and CoreClient classes:

    var coreUri = "<YOUR_CORE_API_URL>"; // e.g., "https://myfoundationallmcoreapi.com"
    var instanceId = "<YOUR_INSTANCE_ID>"; // Each FoundationaLLM deployment has a unique (GUID) ID. Locate this value in the FoundationaLLM Management Portal or in Azure App Config (FoundationaLLM:Instance:Id key)
    
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    var options = new APIClientSettings // Optional settings parameter. Default timeout is 900 seconds.
    {
        Timeout = TimeSpan.FromSeconds(600)
    };
    
    var coreRestClient = new CoreRESTClient(
        coreUri,
        credential,
        instanceId,
        options);
    var coreClient = new CoreClient(
        coreUri,
        credential,
        instanceId,
        options);
    
  2. Make a request to the Core API with the CoreRESTClient class:

    var status = await coreRestClient.Status.GetServiceStatusAsync();
    
  3. Make a request to the Core API with the CoreClient class:

    var results = await coreClient.GetAgentsAsync();
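
Putting the steps above together, a minimal top-level console program might look like the following sketch. The using directives and the APIClientSettings namespace are indicative only and may need to be adjusted to your installed package version:

using Azure.Identity;                 // AzureCliCredential
using FoundationaLLM.Client.Core;     // CoreRESTClient, CoreClient (namespace assumed from the package name)

var coreUri = "<YOUR_CORE_API_URL>";
var instanceId = "<YOUR_INSTANCE_ID>";

var credential = new AzureCliCredential();
var options = new APIClientSettings   // APIClientSettings may live in a shared FoundationaLLM namespace; adjust usings as needed.
{
    Timeout = TimeSpan.FromSeconds(600)
};

// Low-level client: direct access to all Core API endpoints.
var coreRestClient = new CoreRESTClient(coreUri, credential, instanceId, options);
var status = await coreRestClient.Status.GetServiceStatusAsync();
Console.WriteLine($"Core API status: {status}");

// High-level client: simplified, task-oriented methods.
var coreClient = new CoreClient(coreUri, credential, instanceId, options);
var agents = await coreClient.GetAgentsAsync();
Console.WriteLine("Agents retrieved."); // Inspect 'agents' according to the type returned by your package version.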
    

You can use the FoundationaLLM.Common.Authentication.DefaultAuthentication class to generate the TokenCredential. Its Initialize method sets the AzureCredential property to a ManagedIdentityCredential when running in a production environment (controlled by the production parameter) and to an AzureCliCredential when running in a development environment.

Example:

DefaultAuthentication.Initialize(false, "Test");
var credentials = DefaultAuthentication.AzureCredential;
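
The resulting credential can then be passed to the client constructors in place of the AzureCliCredential shown earlier, for example (reusing the coreUri, instanceId and options values from the previous steps):

var coreClient = new CoreClient(coreUri, credentials, instanceId, options);
var coreRestClient = new CoreRESTClient(coreUri, credentials, instanceId, options);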

Use dependency injection with a configuration file

Rather than manually instantiating the CoreRESTClient and CoreClient classes, you can use dependency injection to manage the instances. This approach is more flexible and allows you to easily switch between different implementations of the ICoreClient and ICoreRESTClient interfaces.

  1. Create a configuration file (e.g., appsettings.json) with the following content:

    {
        "FoundationaLLM": {
            "APIEndpoints": {
                "CoreAPI": {
                    "Essentials": {
                        "APIUrl": "https://localhost:63279/"
                    }
                }
            },
            "Instance": {
                "Id": "00000000-0000-0000-0000-000000000000"
            }
        }
    }
    
  2. Read the configuration file:

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .Build();
    
  3. Use the AddCoreClient extension method to add the CoreClient and CoreRESTClient to the service collection:

    var services = new ServiceCollection();
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    services.AddCoreClient(
        configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
        credential,
        configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);
    
    var serviceProvider = services.BuildServiceProvider();
    
  4. Retrieve the CoreClient and CoreRESTClient instances from the service provider:

    var coreClient = serviceProvider.GetRequiredService<ICoreClient>();
    var coreRestClient = serviceProvider.GetRequiredService<ICoreRESTClient>();
    

Alternatively, you can have the CoreClient and CoreRESTClient instances injected directly into your classes through constructor injection.

public class MyService
{
    private readonly ICoreClient _coreClient;
    private readonly ICoreRESTClient _coreRestClient;

    public MyService(ICoreClient coreClient, ICoreRESTClient coreRestClient)
    {
        _coreClient = coreClient;
        _coreRestClient = coreRestClient;
    }
}
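
As a hypothetical end-to-end sketch, a class such as MyService (the illustrative class above, not part of the library) can be registered alongside the clients added by AddCoreClient and resolved from the service provider:

var services = new ServiceCollection();
var credential = new AzureCliCredential();

services.AddCoreClient(
    configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
    credential,
    configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);

// MyService receives ICoreClient and ICoreRESTClient through constructor injection.
services.AddSingleton<MyService>();

var serviceProvider = services.BuildServiceProvider();
var myService = serviceProvider.GetRequiredService<MyService>();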

Use dependency injection with Azure App Configuration

If you prefer to retrieve the configuration settings from Azure App Configuration, you can do so with the Microsoft.Azure.AppConfiguration.AspNetCore or Microsoft.Extensions.Configuration.AzureAppConfiguration package.

  1. Connect to Azure App Configuration:

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .AddAzureAppConfiguration(options =>
        {
            options.Connect("<connection-string>");
            options.ConfigureKeyVault(kv =>
            {
                kv.SetCredential(new AzureCliCredential()); // Or any other TokenCredential implementation.
            });
            options.Select(AppConfigurationKeyFilters.FoundationaLLM_Instance);
            options.Select(AppConfigurationKeyFilters.FoundationaLLM_APIEndpoints_CoreAPI_Essentials);
        })
        .Build();
    

    If you have configured your local development environment, you can obtain the App Configuration connection string from an environment variable (Environment.GetEnvironmentVariable(EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString)) instead of hard-coding it when developing locally, as shown in the sketch after these steps.

  2. Use the AddCoreClient extension method to add the CoreClient and CoreRESTClient to the service collection:

    var services = new ServiceCollection();
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    
    services.AddCoreClient(
        configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
        credential,
        configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);
    
  3. Retrieve the CoreClient and CoreRESTClient instances from the service provider:

    var serviceProvider = services.BuildServiceProvider();
    var coreClient = serviceProvider.GetRequiredService<ICoreClient>();
    var coreRestClient = serviceProvider.GetRequiredService<ICoreRESTClient>();
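
As noted in step 1, when developing locally you can read the App Configuration connection string from an environment variable rather than hard-coding it. A minimal sketch, assuming the environment variable referenced by EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString is set:

var appConfigConnectionString =
    Environment.GetEnvironmentVariable(EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString)
    ?? throw new InvalidOperationException("The App Configuration connection string is not set.");

var configuration = new ConfigurationBuilder()
    .AddAzureAppConfiguration(options =>
    {
        options.Connect(appConfigConnectionString);
        options.Select(AppConfigurationKeyFilters.FoundationaLLM_Instance);
        options.Select(AppConfigurationKeyFilters.FoundationaLLM_APIEndpoints_CoreAPI_Essentials);
    })
    .Build();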
    

Example projects

The Core.Examples test project contains several examples that demonstrate how to use the CoreClient and CoreRESTClient classes to interact with the Core API through a series of end-to-end tests.

FoundationaLLM: The platform for deploying, scaling, securing and governing generative AI in the enterprise 🚀


FoundationaLLM provides the platform for deploying, scaling, securing and governing generative AI in the enterprise. With FoundationaLLM you can:

  • Create AI agents that are grounded in your enterprise data, be that text, semi-structured or structured data.
  • Make AI agents available to your users through a branded chat interface, integrate the agent's REST API into your application for a copilot experience, or call the Agent API from machine-to-machine automated processes.
  • Experiment with building agents that can use a variety of large language models, including OpenAI GPT-4, Mistral and Llama 2, or any model pulled from the Hugging Face model catalog that provides a REST completions endpoint.
  • Centrally manage, configure and secure your AI agents AND their underlying assets including prompts, data sources, vectorization data pipelines, vector databases and large language models using the management portal.
  • Enable everyone in your enterprise to create their own AI agents. Your non-developer users can create and deploy their own agents in a self-service fashion from the management portal, but we don't get in the way of your advanced AI developers who can deploy their own orchestrations built in LangChain, Semantic Kernel, Prompt Flow or any orchestration that exposes a completions endpoint.
  • Deploy and manage scalable vectorization data pipelines that can ingest millions of documents to provide knowledge to your model.
  • Empower your users with as many task-focused AI agents as desired.
  • Control access to the AI agents and the resources they access using role-based access controls (RBAC).
  • Harness the rapidly evolving capabilities from Azure AI and Azure OpenAI from one integrated stack.

FoundationaLLM is not a large language model. It enables you to use the large language models of your choice (e.g., OpenAI GPT-4, Mistral, Llama 2, etc.).

FoundationaLLM deploys a secure, comprehensive and highly configurable copilot platform to your Azure cloud environment:

  • Simplifies integration with the enterprise data sources used by agents for in-context learning (e.g., enabling RAG, CoT, ReAct and inner monologue patterns).
  • Provides defense in depth with fine-grained security controls over the data used by agents and pre/post completion filters that guard against attacks.
  • A hardened solution attacked by an LLM red team from inception.
  • A scalable solution that load balances across multiple LLM endpoints.
  • Extensible to new data sources, new LLM orchestrators and new LLMs.

Why is FoundationaLLM Needed?

Simply put, we saw a lot of folks reinventing the wheel just to get a customized copilot or AI agent that is grounded in, and bases its responses on, their own data rather than the model's trained parametric knowledge. Many of the solutions we saw made for great demos, but were effectively toys wrapping calls to OpenAI endpoints; they were not intended or ready to take into production at enterprise scale. We built FoundationaLLM to provide a continuous journey: one that is quick to get started with, so folks can experiment with LLMs quickly, without then falling off a cliff into a solution that is insecure, unlicensed, inflexible and not fully featured enough to grow from prototype into production without starting all over.

The core problems to deliver enterprise copilots or AI agents are:

  • Enterprise grade copilots or AI agents are complex and have lots of moving parts (not to mention infrastructure).
  • The industry has a skills gap when it comes to filling the roles needed to deliver these complex copilot solutions.
  • The top AI risks (inaccuracy, cybersecurity, compliance, explainability, privacy) are not being mitigated by individual tools.
  • Delivery of a copilot or AI agent solution is time consuming, expensive and frustrating when starting from scratch.

Documentation

Get up to speed with FoundationaLLM by reading the documentation. This includes deployment instructions, quickstarts, architecture, and API references.

Getting Started

FoundationaLLM provides a simple command line driven approach to getting your first deployment up and running. Basically, it's two commands. After that, you can customize the solution, run it locally on your machine and update the deployment with your customizations.

Follow the Quick Start Deployment instructions to get FoundationaLLM deployed in your Azure subscription.

Reporting Issues and Support

If you encounter any issues with FoundationaLLM, please open an issue on GitHub. We will respond to your issue as soon as possible. Please use the Labels (bug, documentation, general question, release x.x.x) to categorize your issue and provide as much detail as possible to help us understand and resolve the issue.

Product: Compatible and additional computed target framework versions.
.NET: net8.0 is compatible. The following were computed: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos and net10.0-windows.
Learn more about Target Frameworks and .NET Standard.

NuGet packages (1)

Showing the top 1 NuGet package that depends on FoundationaLLM.Client.Core:

FoundationaLLM.Core.Examples

FoundationaLLM.Core.Examples contains custom development examples packaged as tests.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
0.9.7-rc269 0 8/9/2025
0.9.7-rc268 0 8/9/2025
0.9.7-rc267 0 8/9/2025
0.9.7-rc266 40 8/8/2025
0.9.7-rc265 44 8/8/2025
0.9.7-rc264 55 8/8/2025
0.9.7-rc263 79 8/8/2025
0.9.7-rc262 82 8/8/2025
0.9.7-rc261 83 8/8/2025
0.9.7-rc260 90 8/8/2025
0.9.7-rc259 114 8/7/2025
0.9.7-rc258 117 8/4/2025
0.9.7-rc257 117 8/4/2025
0.9.7-rc256 92 7/27/2025
0.9.7-rc255 435 7/24/2025
0.9.7-rc254 477 7/22/2025
0.9.7-rc253 469 7/22/2025
0.9.7-rc252 429 7/21/2025
0.9.7-rc251 361 7/21/2025
0.9.7-rc250 268 7/20/2025
0.9.7-rc249.1 268 7/20/2025
0.9.7-rc249 184 7/20/2025
0.9.7-rc248 17 7/18/2025
0.9.7-rc247 19 7/18/2025
0.9.7-rc246 25 7/18/2025
0.9.7-rc245 31 7/18/2025
0.9.7-rc244 67 7/18/2025
0.9.7-rc243 71 7/18/2025
0.9.7-rc242 73 7/18/2025
0.9.7-rc241 106 7/17/2025
0.9.7-rc240 105 7/17/2025
0.9.7-rc239 103 7/17/2025
0.9.7-rc238 104 7/17/2025
0.9.7-rc237 103 7/17/2025
0.9.7-rc236 108 7/17/2025
0.9.7-rc235 106 7/17/2025
0.9.7-rc234 125 7/16/2025
0.9.7-rc233 123 7/16/2025
0.9.7-rc232 120 7/16/2025
0.9.7-rc231 134 7/16/2025
0.9.7-rc230 122 7/16/2025
0.9.7-rc229 125 7/16/2025
0.9.7-rc228 129 7/16/2025
0.9.7-rc227 123 7/16/2025
0.9.7-rc226 125 7/16/2025
0.9.7-rc225 127 7/15/2025
0.9.7-rc224 125 7/15/2025
0.9.7-rc223 125 7/15/2025
0.9.7-rc222 126 7/15/2025
0.9.7-rc220 131 7/10/2025
0.9.7-rc219 126 7/10/2025
0.9.7-rc218 126 7/10/2025
0.9.7-rc217 132 7/10/2025
0.9.7-rc216 129 7/10/2025
0.9.7-rc215 122 7/10/2025
0.9.7-rc214 121 7/9/2025
0.9.7-rc213 128 7/8/2025
0.9.7-rc212 126 7/8/2025
0.9.7-rc211 127 7/8/2025
0.9.7-rc208 128 7/8/2025
0.9.7-rc207 129 7/8/2025
0.9.7-rc206 125 7/8/2025
0.9.7-rc205 125 7/7/2025
0.9.7-rc204 126 7/7/2025
0.9.7-rc203 132 7/7/2025
0.9.7-rc202 125 7/7/2025
0.9.7-rc201 126 7/7/2025
0.9.7-rc200 126 7/3/2025
0.9.7-rc199 129 7/3/2025
0.9.7-rc198 129 7/3/2025
0.9.7-rc197 127 7/3/2025
0.9.7-rc196 125 7/2/2025
0.9.7-rc195 128 7/2/2025
0.9.7-rc194 127 7/1/2025
0.9.7-rc193 123 7/1/2025
0.9.7-rc192 128 7/1/2025
0.9.7-rc191 129 6/30/2025
0.9.7-rc190 126 6/30/2025
0.9.7-rc188 126 6/26/2025
0.9.7-rc187 131 6/26/2025
0.9.7-rc186 126 6/26/2025
0.9.7-rc185 126 6/26/2025
0.9.7-rc184 130 6/24/2025
0.9.7-rc181 135 6/23/2025
0.9.7-rc180 130 6/23/2025
0.9.7-rc179 131 6/23/2025
0.9.7-rc178 130 6/23/2025
0.9.7-rc177 67 6/20/2025
0.9.7-rc176 67 6/20/2025
0.9.7-rc175 67 6/20/2025
0.9.7-rc174 75 6/20/2025
0.9.7-rc173 71 6/20/2025
0.9.7-rc172 129 6/19/2025
0.9.7-rc171 135 6/19/2025
0.9.7-rc170 138 6/19/2025
0.9.7-rc169 132 6/19/2025
0.9.7-rc168 134 6/19/2025
0.9.7-rc167 138 6/19/2025
0.9.7-rc166 130 6/17/2025
0.9.7-rc165 134 6/17/2025
0.9.7-rc164 135 6/16/2025
0.9.7-rc163 130 6/16/2025
0.9.7-rc162 133 6/16/2025
0.9.7-rc161 137 6/15/2025
0.9.7-rc160 204 6/13/2025
0.9.7-rc159 224 6/13/2025
0.9.7-rc158 278 6/12/2025
0.9.7-rc157 286 6/11/2025
0.9.7-rc156 274 6/11/2025
0.9.7-rc155 276 6/10/2025
0.9.7-rc154 279 6/10/2025
0.9.7-rc153 282 6/10/2025
0.9.7-rc152 281 6/10/2025
0.9.7-rc151 279 6/10/2025
0.9.7-rc150.4 436 7/23/2025
0.9.7-rc150.3 111 6/23/2025
0.9.7-rc150.2 114 6/23/2025
0.9.7-rc150 277 6/10/2025
0.9.7-rc149 257 6/9/2025
0.9.7-rc148 258 6/9/2025
0.9.7-rc147 256 6/9/2025
0.9.7-rc146 254 6/9/2025
0.9.7-rc145 259 6/9/2025
0.9.7-rc144 233 6/9/2025
0.9.7-rc143 191 6/8/2025
0.9.7-rc142 193 6/8/2025
0.9.7-rc141 111 6/8/2025
0.9.7-rc140 107 6/7/2025
0.9.7-rc139 98 6/6/2025
0.9.7-rc138 103 6/6/2025
0.9.7-rc137 101 6/6/2025
0.9.7-rc136 136 6/5/2025
0.9.7-rc135 135 6/5/2025
0.9.7-rc134 137 6/5/2025
0.9.7-rc133 131 6/5/2025
0.9.7-rc132 136 6/5/2025
0.9.7-rc131 132 6/5/2025
0.9.7-rc130 136 6/5/2025
0.9.7-rc129 141 6/5/2025
0.9.7-rc128 133 6/4/2025
0.9.7-rc127 143 6/4/2025
0.9.7-rc126 125 6/4/2025
0.9.7-rc125 141 6/4/2025
0.9.7-rc124 140 6/3/2025
0.9.7-rc123 132 6/3/2025
0.9.7-rc122 133 6/3/2025
0.9.7-rc121 137 6/3/2025
0.9.7-rc120 139 6/3/2025
0.9.7-rc119 137 6/2/2025
0.9.7-rc118 135 6/2/2025
0.9.7-rc117 134 6/2/2025
0.9.7-rc116 106 5/30/2025
0.9.7-rc115 138 5/30/2025
0.9.7-rc114 138 5/29/2025
0.9.7-rc113 141 5/29/2025
0.9.7-rc112 140 5/29/2025
0.9.7-rc111 142 5/29/2025
0.9.7-rc110 141 5/29/2025
0.9.7-rc109 138 5/28/2025
0.9.7-rc108 137 5/28/2025
0.9.7-rc107 138 5/27/2025
0.9.7-rc106 135 5/27/2025
0.9.7-rc105 139 5/27/2025
0.9.7-rc104 143 5/26/2025
0.9.7-rc103 142 5/25/2025
0.9.7-rc102 145 5/25/2025
0.9.7-rc101 62 5/24/2025
0.9.7-rc100 92 5/23/2025
0.9.7-beta159 137 5/20/2025
0.9.7-beta158 169 5/16/2025
0.9.7-beta157 229 5/13/2025
0.9.7-beta156 215 5/12/2025
0.9.7-beta155 142 5/6/2025
0.9.7-beta154 145 5/6/2025
0.9.7-beta153 142 5/5/2025
0.9.7-beta152 145 4/30/2025
0.9.7-beta151 171 4/21/2025
0.9.7-beta150 167 4/21/2025
0.9.7-beta149 166 4/20/2025
0.9.7-beta148 144 4/18/2025
0.9.7-beta147 183 4/17/2025
0.9.7-beta146 189 4/17/2025
0.9.7-beta145 112 4/11/2025
0.9.7-beta144 125 4/11/2025
0.9.7-beta143 138 4/11/2025
0.9.7-beta142 128 4/11/2025
0.9.7-beta141 125 4/11/2025
0.9.7-beta140 163 4/10/2025
0.9.7-beta139 157 4/10/2025
0.9.7-beta138 160 4/9/2025
0.9.7-beta137 151 4/3/2025
0.9.7-beta136 147 4/2/2025
0.9.7-beta135 165 4/2/2025
0.9.7-beta134 149 4/2/2025
0.9.7-beta133 151 4/2/2025
0.9.7-beta132 160 4/2/2025
0.9.7-beta131 151 4/1/2025
0.9.7-beta130 164 4/1/2025
0.9.7-beta129 163 3/31/2025
0.9.7-beta128 156 3/31/2025
0.9.7-beta127 152 3/30/2025
0.9.7-beta126 151 3/30/2025
0.9.7-beta125 464 3/26/2025
0.9.7-beta124 468 3/26/2025
0.9.7-beta123 471 3/26/2025
0.9.7-beta122 469 3/25/2025
0.9.7-beta121 470 3/25/2025
0.9.7-beta120 462 3/25/2025
0.9.7-beta119 475 3/25/2025
0.9.7-beta118 472 3/25/2025
0.9.7-beta117 474 3/25/2025
0.9.7-beta116 483 3/24/2025
0.9.7-beta115 403 3/24/2025
0.9.7-beta114 267 3/23/2025
0.9.7-beta113 93 3/21/2025
0.9.7-beta112 113 3/21/2025
0.9.7-beta111 152 3/19/2025
0.9.7-beta110 155 3/19/2025
0.9.7-beta109 152 3/18/2025
0.9.7-beta108 148 3/17/2025
0.9.7-beta107 145 3/17/2025
0.9.7-beta106 161 3/17/2025
0.9.7-beta105 152 3/13/2025
0.9.7-beta104 156 3/12/2025
0.9.7-beta103 169 3/11/2025
0.9.7-beta102 162 3/9/2025
0.9.7-beta101 208 3/7/2025
0.9.7-beta100 201 3/5/2025
0.9.6 220 3/3/2025
0.9.6-rc100 98 2/28/2025
0.9.5 120 2/26/2025
0.9.5-rc102 96 2/25/2025
0.9.5-rc101 105 2/24/2025
0.9.5-rc100 109 2/23/2025
0.9.4 118 2/21/2025
0.9.3 122 2/17/2025
0.9.3-rc018 107 2/17/2025
0.9.3-rc017 106 2/12/2025
0.9.3-rc016 107 2/12/2025
0.9.3-rc015 110 2/7/2025
0.9.3-rc014 96 2/6/2025
0.9.3-rc013 100 2/5/2025
0.9.3-rc012 111 2/5/2025
0.9.3-rc011 108 2/5/2025
0.9.3-rc010 107 2/5/2025
0.9.3-rc009 112 2/4/2025
0.9.3-rc008 106 2/4/2025
0.9.3-rc007 102 2/4/2025
0.9.3-rc006 111 2/3/2025
0.9.3-rc005 109 2/3/2025
0.9.3-rc004 104 1/31/2025
0.9.3-rc003 109 1/30/2025
0.9.3-rc002 98 1/29/2025
0.9.3-rc001 90 1/29/2025
0.9.2 96 1/24/2025
0.9.2-rc007 82 1/24/2025
0.9.2-rc006 93 1/23/2025
0.9.2-rc005 86 1/23/2025
0.9.2-rc004 85 1/23/2025
0.9.2-rc003 85 1/23/2025
0.9.2-rc002 87 1/23/2025
0.9.2-rc001 88 1/22/2025
0.9.2-a001 113 1/21/2025
0.9.1 113 1/21/2025
0.9.1-rc131 95 1/19/2025
0.9.1-rc130 98 1/19/2025
0.9.1-rc129 99 1/19/2025
0.9.1-rc128 95 1/18/2025
0.9.1-rc127 93 1/18/2025
0.9.1-rc126 103 1/17/2025
0.9.1-rc125 100 1/17/2025
0.9.1-rc124 98 1/16/2025
0.9.1-rc123 97 1/15/2025
0.9.1-rc122 87 1/14/2025
0.9.1-rc121 91 1/14/2025
0.9.1-rc120 98 1/14/2025
0.9.1-rc118 98 1/13/2025
0.9.1-rc117 103 1/13/2025
0.9.1-rc116 86 1/8/2025
0.9.1-rc115 101 1/2/2025
0.9.1-rc114 100 12/24/2024
0.9.1-rc113 102 12/23/2024
0.9.1-rc112 109 12/22/2024
0.9.1-rc111 109 12/22/2024
0.9.1-rc110 101 12/21/2024
0.9.1-rc109 103 12/21/2024
0.9.1-rc108 103 12/21/2024
0.9.1-rc107 110 12/20/2024
0.9.1-rc106 103 12/20/2024
0.9.1-rc105 102 12/19/2024
0.9.1-rc104 99 12/19/2024
0.9.1-rc100 110 12/16/2024
0.9.1-alpha4 111 12/15/2024
0.9.1-alpha3 110 12/15/2024
0.9.0-rc3 105 12/9/2024
0.9.0-rc2 107 12/9/2024
0.9.0-alpha5 107 11/28/2024
0.9.0-alpha1 99 11/27/2024
0.8.4 124 11/20/2024
0.8.3 148 9/18/2024
0.8.2 127 9/3/2024
0.8.2-alpha2 112 9/23/2024
0.8.1 160 8/23/2024
0.8.1-alpha2 113 9/18/2024