FoundationaLLM.Client.Core 0.9.7-beta137

This is a prerelease version of FoundationaLLM.Client.Core.
dotnet CLI

    dotnet add package FoundationaLLM.Client.Core --version 0.9.7-beta137

Package Manager

    NuGet\Install-Package FoundationaLLM.Client.Core -Version 0.9.7-beta137

This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference

    <PackageReference Include="FoundationaLLM.Client.Core" Version="0.9.7-beta137" />

For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)

    <!-- Directory.Packages.props -->
    <PackageVersion Include="FoundationaLLM.Client.Core" Version="0.9.7-beta137" />

    <!-- Project file -->
    <PackageReference Include="FoundationaLLM.Client.Core" />

For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution's Directory.Packages.props file to version the package, and the PackageReference node into the project file to reference it.

Paket CLI

    paket add FoundationaLLM.Client.Core --version 0.9.7-beta137

Script & Interactive

    #r "nuget: FoundationaLLM.Client.Core, 0.9.7-beta137"

The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

File-based apps

    #:package FoundationaLLM.Client.Core@0.9.7-beta137

The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake

    #addin nuget:?package=FoundationaLLM.Client.Core&version=0.9.7-beta137&prerelease

Install as a Cake Addin.

    #tool nuget:?package=FoundationaLLM.Client.Core&version=0.9.7-beta137&prerelease

Install as a Cake Tool.

FoundationaLLM Core Client

The FoundationaLLM Core Client is a .NET client library that simplifies interaction with the FoundationaLLM Core API. It provides a set of classes and methods for calling the Core API in a more intuitive way.

This library contains two primary classes:

  • CoreRESTClient: The low-level client. It provides methods for calling the FoundationaLLM Core API directly over REST and gives you access to all Core API endpoints.
  • CoreClient: A higher-level abstraction over the Core API. It does not expose every method available in the CoreRESTClient class, but it provides a more intuitive, user-friendly way to interact with the Core API.

These two classes are mutually exclusive; choose one based on your requirements. If you need direct access to all Core API endpoints, use the CoreRESTClient class. If you prefer a more user-friendly interface, use the CoreClient class.
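
For example, checking the service status through the low-level client versus listing agents through the high-level client looks like this (both calls appear again in context under Getting started below, where the client instances are created):

    var status = await coreRestClient.Status.GetServiceStatusAsync(); // low-level REST client
    var agents = await coreClient.GetAgentsAsync();                   // high-level client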

Getting started

If you do not have FoundationaLLM deployed, follow the Quick Start Deployment instructions to get FoundationaLLM deployed in your Azure subscription.

Install the NuGet package:

dotnet add package FoundationaLLM.Client.Core

Manual service instantiation

Complete the following steps if you do not want to use dependency injection:

  1. Create a new instance of the CoreRESTClient and CoreClient classes:

    var coreUri = "<YOUR_CORE_API_URL>"; // e.g., "https://myfoundationallmcoreapi.com"
    var instanceId = "<YOUR_INSTANCE_ID>"; // Each FoundationaLLM deployment has a unique (GUID) ID. Locate this value in the FoundationaLLM Management Portal or in Azure App Config (FoundationaLLM:Instance:Id key)
    
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    var options = new APIClientSettings // Optional settings parameter. Default timeout is 900 seconds.
    {
        Timeout = TimeSpan.FromSeconds(600)
    };
    
    var coreRestClient = new CoreRESTClient(
        coreUri,
        credential,
        instanceId,
        options);
    var coreClient = new CoreClient(
        coreUri,
        credential,
        instanceId,
        options);
    
  2. Make a request to the Core API with the CoreRESTClient class:

    var status = await coreRestClient.Status.GetServiceStatusAsync();
    
  3. Make a request to the Core API with the CoreClient class:

    var results = await coreClient.GetAgentsAsync();
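
Putting the three steps together, a minimal .NET 8 console program (top-level statements, implicit usings enabled) might look like the following sketch. It uses only the calls shown above; the FoundationaLLM namespace in the using directive is an assumption and may need to be adjusted:

    using Azure.Identity;             // AzureCliCredential
    using FoundationaLLM.Client.Core; // CoreClient, CoreRESTClient (assumed namespace)

    var coreUri = "<YOUR_CORE_API_URL>";
    var instanceId = "<YOUR_INSTANCE_ID>";
    var credential = new AzureCliCredential();

    // An APIClientSettings instance can be passed as an optional fourth argument to override the default timeout.
    var coreRestClient = new CoreRESTClient(coreUri, credential, instanceId);
    var coreClient = new CoreClient(coreUri, credential, instanceId);

    // Low-level client: check the Core API service status.
    var status = await coreRestClient.Status.GetServiceStatusAsync();
    Console.WriteLine($"Core API status: {status}");

    // High-level client: list the agents available in this deployment.
    var agents = await coreClient.GetAgentsAsync();
    Console.WriteLine($"Agents response: {agents}");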
    

You can use the FoundationaLLM.Common.Authentication.DefaultAuthentication class to generate the TokenCredential. This class sets its AzureCredential property to a ManagedIdentityCredential when running in a production environment (controlled by the production parameter of the Initialize method) and to an AzureCliCredential when running in a development environment.

Example:

    DefaultAuthentication.Initialize(false, "Test");
    var credentials = DefaultAuthentication.AzureCredential;
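
The resulting credential can then be passed to the client constructors in place of the AzureCliCredential used earlier. A brief sketch, reusing the coreUri and instanceId values from the manual instantiation steps:

    DefaultAuthentication.Initialize(false, "Test"); // false = development environment, so AzureCliCredential is used.
    var credential = DefaultAuthentication.AzureCredential;

    var coreClient = new CoreClient(coreUri, credential, instanceId);
    var coreRestClient = new CoreRESTClient(coreUri, credential, instanceId);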

Use dependency injection with a configuration file

Rather than manually instantiating the CoreRESTClient and CoreClient classes, you can use dependency injection to manage the instances. This approach is more flexible and allows you to easily switch between different implementations of the ICoreClient and ICoreRESTClient interfaces.

  1. Create a configuration file (e.g., appsettings.json) with the following content:

    {
        "FoundationaLLM": {
            "APIEndpoints": {
                "CoreAPI": {
                    "Essentials": {
                        "APIUrl": "https://localhost:63279/"
                    }
                }
            },
            "Instance": {
                "Id": "00000000-0000-0000-0000-000000000000"
            }
        }
    }
    
  2. Read the configuration file:

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .Build();
    
  3. Use the AddCoreClient extension method to add the CoreClient and CoreRESTClient to the service collection:

    var services = new ServiceCollection();
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    services.AddCoreClient(
        configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
        credential,
        configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);
    
    var serviceProvider = services.BuildServiceProvider();
    
  4. Retrieve the CoreClient and CoreRESTClient instances from the service provider:

    var coreClient = serviceProvider.GetRequiredService<ICoreClient>();
    var coreRestClient = serviceProvider.GetRequiredService<ICoreRESTClient>();
    

Alternatively, you can inject the CoreClient and CoreRESTClient instances directly into your classes using dependency injection:

public class MyService
{
    private readonly ICoreClient _coreClient;
    private readonly ICoreRESTClient _coreRestClient;

    public MyService(ICoreClient coreClient, ICoreRESTClient coreRestClient)
    {
        _coreClient = coreClient;
        _coreRestClient = coreRestClient;
    }
}
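
A consuming class such as MyService is then registered with the same service collection and resolved from the provider. A minimal sketch using the standard Microsoft.Extensions.DependencyInjection APIs:

    // Register the consuming class alongside the clients added by AddCoreClient.
    services.AddSingleton<MyService>();

    var serviceProvider = services.BuildServiceProvider();

    // ICoreClient and ICoreRESTClient are injected into MyService's constructor.
    var myService = serviceProvider.GetRequiredService<MyService>();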

Use dependency injection with Azure App Configuration

If you prefer to retrieve the configuration settings from Azure App Configuration, you can use the Microsoft.Azure.AppConfiguration.AspNetCore or Microsoft.Extensions.Configuration.AzureAppConfiguration package.

  1. Connect to Azure App Configuration:

    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .AddAzureAppConfiguration(options =>
        {
            options.Connect("<connection-string>");
            options.ConfigureKeyVault(kv =>
            {
                // Use the same credential to resolve Key Vault references.
                kv.SetCredential(credential);
            });
            options.Select(AppConfigurationKeyFilters.FoundationaLLM_Instance);
            options.Select(AppConfigurationKeyFilters.FoundationaLLM_APIEndpoints_CoreAPI_Essentials);
        })
        .Build();
    

    If you have configured your local development environment, you can obtain the App Config connection string from an environment variable when developing locally: Environment.GetEnvironmentVariable(EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString) (see the sketch after these steps).

  2. Use the AddCoreClient extension method to add the CoreClient and CoreRESTClient to the service collection, then build the service provider:

    var services = new ServiceCollection();

    services.AddCoreClient(
        configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
        credential, // The TokenCredential created in step 1.
        configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);

    var serviceProvider = services.BuildServiceProvider();
    
  3. Retrieve the CoreClient and CoreRESTClient instances from the service provider:

    var coreClient = serviceProvider.GetRequiredService<ICoreClient>();
    var coreRestClient = serviceProvider.GetRequiredService<ICoreRESTClient>();
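
For local development, the connection string passed to options.Connect in step 1 can be read from the environment variable mentioned above rather than hard-coded. A brief sketch, assuming the EnvironmentVariables constants class referenced earlier is available:

    // Read the App Config connection string configured as part of the local development environment.
    var appConfigConnectionString = Environment.GetEnvironmentVariable(
        EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString);

    // Then, inside AddAzureAppConfiguration:
    // options.Connect(appConfigConnectionString);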
    

Example projects

The Core.Examples test project contains several examples that demonstrate how to use the CoreClient and CoreRESTClient classes to interact with the Core API through a series of end-to-end tests.

FoundationaLLM: The platform for deploying, scaling, securing and governing generative AI in the enterprise 🚀

FoundationaLLM provides the platform for deploying, scaling, securing and governing generative AI in the enterprise. With FoundationaLLM you can:

  • Create AI agents that are grounded in your enterprise data, be that text, semi-structured or structured data.
  • Make AI agents available to your users through a branded chat interface, integrate the agent's REST API into your application for a copilot experience, or call the Agent API from a machine-to-machine automated process.
  • Experiment with building agents that can use a variety of large language models, including OpenAI GPT-4, Mistral and Llama 2, or any model pulled from the Hugging Face model catalog that provides a REST completions endpoint.
  • Centrally manage, configure and secure your AI agents AND their underlying assets including prompts, data sources, vectorization data pipelines, vector databases and large language models using the management portal.
  • Enable everyone in your enterprise to create their own AI agents. Your non-developer users can create and deploy their own agents in a self-service fashion from the management portal, but we don't get in the way of your advanced AI developers who can deploy their own orchestrations built in LangChain, Semantic Kernel, Prompt Flow or any orchestration that exposes a completions endpoint.
  • Deploy and manage scalable vectorization data pipelines that can ingest millions of documents to provide knowledge to your model.
  • Empower your users with as many task-focused AI agents as desired.
  • Control access to the AI agents and the resources they access using role-based access controls (RBAC).
  • Harness the rapidly evolving capabilities from Azure AI and Azure OpenAI from one integrated stack.

FoundationaLLM is not a large language model. It enables you to use the large language models of your choice (e.g., OpenAI GPT-4, Mistral, Llama 2, etc.).

FoundationaLLM deploys a secure, comprehensive and highly configurable copilot platform to your Azure cloud environment:

  • Simplifies integration with the enterprise data sources used by agents for in-context learning (e.g., enabling RAG, CoT, ReAct and inner monologue patterns).
  • Provides defense in depth with fine-grained security controls over data used by agents and pre/post-completion filters that guard against attacks.
  • A hardened solution, attacked by an LLM red team from inception.
  • A scalable solution that load balances across multiple LLM endpoints.
  • Extensible to new data sources, new LLM orchestrators and LLMs.

Why is FoundationaLLM Needed?

Simply put, we saw a lot of folks reinventing the wheel just to get a customized copilot or AI agent that was grounded in, and based its responses on, their own data as opposed to the trained parametric knowledge of the model. Many of the solutions we saw made for great demos, but were effectively toys wrapping calls to OpenAI endpoints; they were not something intended or ready to take into production at enterprise scale. We built FoundationaLLM to provide a continuous journey: one that is quick to get started with, so folks can experiment rapidly with LLMs, but that doesn't fall off a cliff afterward with a solution that is insecure, unlicensed, inflexible and not featured enough to grow from prototype into a production solution without having to start all over.

The core problems to deliver enterprise copilots or AI agents are:

  • Enterprise grade copilots or AI agents are complex and have lots of moving parts (not to mention infrastructure).
  • The industry has a skills gap when it comes to filling the roles needed to deliver these complex copilot solutions.
  • The top AI risks (inaccuracy, cybersecurity, compliance, explainability, privacy) are not being mitigated by individual tools.
  • Delivery of a copilot or AI agent solution is time consuming, expensive and frustrating when starting from scratch.

Documentation

Get up to speed with FoundationaLLM by reading the documentation. This includes deployment instructions, quickstarts, architecture, and API references.

Getting Started

FoundationaLLM provides a simple command line driven approach to getting your first deployment up and running. Basically, it's two commands. After that, you can customize the solution, run it locally on your machine and update the deployment with your customizations.

Follow the Quick Start Deployment instructions to get FoundationaLLM deployed in your Azure subscription.

Reporting Issues and Support

If you encounter any issues with FoundationaLLM, please open an issue on GitHub. We will respond to your issue as soon as possible. Please use the Labels (bug, documentation, general question, release x.x.x) to categorize your issue and provide as much detail as possible to help us understand and resolve the issue.

Product compatible and additional computed target framework versions

.NET: net8.0 is compatible and is the target framework included in the package. The platform-specific net8.0 frameworks (android, browser, ios, maccatalyst, macos, tvos, windows), as well as net9.0, net10.0 and their platform-specific frameworks, were computed as compatible. Learn more about Target Frameworks and .NET Standard.

NuGet packages (1)

One NuGet package depends on FoundationaLLM.Client.Core:

  • FoundationaLLM.Core.Examples: custom development examples packaged as tests.
