FoundationaLLM.Client.Core 0.9.7-rc104

This is a prerelease version of FoundationaLLM.Client.Core.
.NET CLI:
dotnet add package FoundationaLLM.Client.Core --version 0.9.7-rc104

Package Manager:
NuGet\Install-Package FoundationaLLM.Client.Core -Version 0.9.7-rc104
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="FoundationaLLM.Client.Core" Version="0.9.7-rc104" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM):
Directory.Packages.props:
<PackageVersion Include="FoundationaLLM.Client.Core" Version="0.9.7-rc104" />
Project file:
<PackageReference Include="FoundationaLLM.Client.Core" />
For projects that support Central Package Management, copy the PackageVersion node into the solution's Directory.Packages.props file to version the package, and the PackageReference node into the project file.

Paket CLI:
paket add FoundationaLLM.Client.Core --version 0.9.7-rc104

Script & Interactive:
#r "nuget: FoundationaLLM.Client.Core, 0.9.7-rc104"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

Cake:
#addin nuget:?package=FoundationaLLM.Client.Core&version=0.9.7-rc104&prerelease
Install as a Cake Addin.
#tool nuget:?package=FoundationaLLM.Client.Core&version=0.9.7-rc104&prerelease
Install as a Cake Tool.

FoundationaLLM Core Client

The FoundationaLLM Core Client is a .NET client library that simplifies interaction with the FoundationaLLM Core API. It provides a set of classes and methods for working with the Core API in a more intuitive way.

This library contains two primary classes:

  • CoreRESTClient: The low-level client. Provides methods for calling the FoundationaLLM Core API over REST, with direct access to all Core API endpoints.
  • CoreClient: A higher-level abstraction over the Core API, designed to simplify common interactions. It does not cover every method available on CoreRESTClient, but offers a more user-friendly interface.

These two classes are used independently of each other; choose one based on your requirements. If you need direct access to all Core API endpoints, use the CoreRESTClient class. If you prefer a more user-friendly interface, use the CoreClient class.

Getting started

If you do not have FoundationaLLM deployed, follow the Quick Start Deployment instructions to get FoundationaLLM deployed in your Azure subscription.

Install the NuGet package:

dotnet add package FoundationaLLM.Client.Core

Manual service instantiation

Complete the following steps if you do not want to use dependency injection:

  1. Create a new instance of the CoreRESTClient and CoreClient classes:

    var coreUri = "<YOUR_CORE_API_URL>"; // e.g., "https://myfoundationallmcoreapi.com"
    var instanceId = "<YOUR_INSTANCE_ID>"; // Each FoundationaLLM deployment has a unique (GUID) ID. Locate this value in the FoundationaLLM Management Portal or in Azure App Config (FoundationaLLM:Instance:Id key)
    
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    var options = new APIClientSettings // Optional settings parameter. Default timeout is 900 seconds.
    {
        Timeout = TimeSpan.FromSeconds(600)
    };
    
    var coreRestClient = new CoreRESTClient(
        coreUri,
        credential,
        instanceId,
        options);
    var coreClient = new CoreClient(
        coreUri,
        credential,
        instanceId,
        options);
    
  2. Make a request to the Core API with the CoreRESTClient class:

    var status = await coreRestClient.Status.GetServiceStatusAsync();
    
  3. Make a request to the Core API with the CoreClient class:

    var results = await coreClient.GetAgentsAsync();
    

You can use the FoundationaLLM.Common.Authentication.DefaultAuthentication class to generate the TokenCredential. This class sets its AzureCredential property to a ManagedIdentityCredential when running in a production environment (the production parameter of the Initialize method) and to an AzureCliCredential when running in a development environment.

Example:

DefaultAuthentication.Initialize(false, "Test"); // false = development environment, so AzureCliCredential is used.
var credentials = DefaultAuthentication.AzureCredential;
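The credential produced this way can be passed straight to the constructors from step 1 of the manual instantiation section. A minimal sketch, reusing the placeholders from that step:

```csharp
// Sketch: pass the DefaultAuthentication credential to a Core client.
// Placeholders mirror step 1 of "Manual service instantiation".
var coreClient = new CoreClient(
    "<YOUR_CORE_API_URL>",   // e.g., https://myfoundationallmcoreapi.com
    credentials,             // from DefaultAuthentication.AzureCredential above
    "<YOUR_INSTANCE_ID>",
    new APIClientSettings { Timeout = TimeSpan.FromSeconds(600) });
```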

Use dependency injection with a configuration file

Rather than manually instantiating the CoreRESTClient and CoreClient classes, you can use dependency injection to manage the instances. This approach is more flexible and allows you to easily switch between different implementations of the ICoreClient and ICoreRESTClient interfaces.

  1. Create a configuration file (e.g., appsettings.json) with the following content:

    {
        "FoundationaLLM": {
            "APIEndpoints": {
                "CoreAPI": {
                    "Essentials": {
                        "APIUrl": "https://localhost:63279/"
                    }
                }
            },
            "Instance": {
                "Id": "00000000-0000-0000-0000-000000000000"
            }
        }
    }
    
  2. Read the configuration file:

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
        .Build();
    
  3. Use the AddCoreClient extension method to add the CoreClient and CoreRESTClient to the service collection:

    var services = new ServiceCollection();
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    services.AddCoreClient(
        configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
        credential,
        configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);
    
    var serviceProvider = services.BuildServiceProvider();
    
  4. Retrieve the CoreClient and CoreRESTClient instances from the service provider:

    var coreClient = serviceProvider.GetRequiredService<ICoreClient>();
    var coreRestClient = serviceProvider.GetRequiredService<ICoreRESTClient>();
    

Alternatively, you can inject the CoreClient and CoreRESTClient instances directly into your classes using dependency injection.

public class MyService
{
    private readonly ICoreClient _coreClient;
    private readonly ICoreRESTClient _coreRestClient;

    public MyService(ICoreClient coreClient, ICoreRESTClient coreRestClient)
    {
        _coreClient = coreClient;
        _coreRestClient = coreRestClient;
    }
}
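A consumer like MyService above can then be registered alongside the clients and resolved from the container. A sketch, assuming AddCoreClient has already been called as in step 3 (the scoped lifetime is an illustrative choice, not a library requirement):

```csharp
// Sketch: register MyService so its constructor receives ICoreClient
// and ICoreRESTClient from the container. Assumes `configuration` and
// `credential` are set up as in the earlier steps.
var services = new ServiceCollection();
services.AddCoreClient(
    configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
    credential,
    configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);
services.AddScoped<MyService>();

var serviceProvider = services.BuildServiceProvider();
var myService = serviceProvider.GetRequiredService<MyService>();
```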

Use dependency injection with Azure App Configuration

If you prefer to retrieve configuration settings from Azure App Configuration, use the Microsoft.Azure.AppConfiguration.AspNetCore or Microsoft.Extensions.Configuration.AzureAppConfiguration package.

  1. Connect to Azure App Configuration:

    var configuration = new ConfigurationBuilder()
        .AddJsonFile("appsettings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .AddAzureAppConfiguration(options =>
        {
            options.Connect("<connection-string>");
            options.ConfigureKeyVault(kv =>
            {
            kv.SetCredential(new AzureCliCredential()); // Or any other TokenCredential implementation; used to resolve Key Vault references.
            });
            options.Select(AppConfigurationKeyFilters.FoundationaLLM_Instance);
            options.Select(AppConfigurationKeyFilters.FoundationaLLM_APIEndpoints_CoreAPI_Essentials);
        })
        .Build();
    

    When developing locally with a configured development environment, you can obtain the App Config connection string from an environment variable: Environment.GetEnvironmentVariable(EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString).

  2. Use the AddCoreClient extension method to add the CoreClient and CoreRESTClient to the service collection:

    var services = new ServiceCollection();
    var credential = new AzureCliCredential(); // Can use any TokenCredential implementation, such as ManagedIdentityCredential or AzureCliCredential.
    
    services.AddCoreClient(
        configuration[AppConfigurationKeys.FoundationaLLM_APIEndpoints_CoreAPI_Essentials_APIUrl]!,
        credential,
        configuration[AppConfigurationKeys.FoundationaLLM_Instance_Id]!);
    
  3. Retrieve the CoreClient and CoreRESTClient instances from the service provider:

    var serviceProvider = services.BuildServiceProvider();
    var coreClient = serviceProvider.GetRequiredService<ICoreClient>();
    var coreRestClient = serviceProvider.GetRequiredService<ICoreRESTClient>();
    

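The environment-variable tip from step 1 can be combined with options.Connect so the connection string is never hard-coded. A sketch, assuming the EnvironmentVariables constant referenced earlier resolves to the App Config connection string:

```csharp
// Sketch: read the App Config connection string from the environment
// instead of hard-coding it; fail early if the variable is missing.
var connectionString =
    Environment.GetEnvironmentVariable(EnvironmentVariables.FoundationaLLM_AppConfig_ConnectionString)
    ?? throw new InvalidOperationException("App Configuration connection string is not set.");

var configuration = new ConfigurationBuilder()
    .AddAzureAppConfiguration(options =>
    {
        options.Connect(connectionString);
        options.Select(AppConfigurationKeyFilters.FoundationaLLM_Instance);
        options.Select(AppConfigurationKeyFilters.FoundationaLLM_APIEndpoints_CoreAPI_Essentials);
    })
    .Build();
```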
Example projects

The Core.Examples test project contains several examples that demonstrate how to use the CoreClient and CoreRESTClient classes to interact with the Core API through a series of end-to-end tests.

FoundationaLLM: The platform for deploying, scaling, securing and governing generative AI in the enterprise 🚀


FoundationaLLM provides the platform for deploying, scaling, securing and governing generative AI in the enterprise. With FoundationaLLM you can:

  • Create AI agents that are grounded in your enterprise data, be that text, semi-structured or structured data.
  • Make AI agents available to your users through a branded chat interface, integrate the AI agent's REST API into your application for a copilot experience, or call the Agent API from a machine-to-machine automated process.
  • Experiment building agents that can use a variety of large language models including OpenAI GPT-4, Mistral and Llama 2 or any models pulled from the Hugging Face model catalog that provide a REST completions endpoint.
  • Centrally manage, configure and secure your AI agents AND their underlying assets including prompts, data sources, vectorization data pipelines, vector databases and large language models using the management portal.
  • Enable everyone in your enterprise to create their own AI agents. Your non-developer users can create and deploy their own agents in a self-service fashion from the management portal, but we don't get in the way of your advanced AI developers who can deploy their own orchestrations built in LangChain, Semantic Kernel, Prompt Flow or any orchestration that exposes a completions endpoint.
  • Deploy and manage scalable vectorization data pipelines that can ingest millions of documents to provide knowledge to your model.
  • Empower your users with as many task-focused AI agents as desired.
  • Control access to the AI agents and the resources they access using role-based access controls (RBAC).
  • Harness the rapidly evolving capabilities from Azure AI and Azure OpenAI from one integrated stack.

FoundationaLLM is not a large language model. It enables you to use the large language models of your choice (e.g., OpenAI GPT-4, Mistral, Llama 2).

FoundationaLLM deploys a secure, comprehensive and highly configurable copilot platform to your Azure cloud environment:

  • Simplifies integration with the enterprise data sources used by agents for in-context learning (e.g., enabling RAG, CoT, ReAct and inner monologue patterns).
  • Provides defense in depth with fine-grain security controls over data used by agents, plus pre/post completion filters that guard against attacks.
  • Hardened solution, attacked by an LLM red team from inception.
  • Scalable solution that load balances across multiple LLM endpoints.
  • Extensible to new data sources, new LLM orchestrators and LLMs.

Why is FoundationaLLM Needed?

Simply put, we saw many people reinventing the wheel just to get a customized copilot or AI agent that grounds its responses in their own data rather than in the model's trained parametric knowledge. Many of the solutions we saw made for great demos, but they were effectively toys wrapping calls to OpenAI endpoints; they were not intended or ready for production at enterprise scale. We built FoundationaLLM to provide a continuous journey: quick to get started with so people can experiment with LLMs, yet without the cliff that follows when a prototype turns out to be insecure, unlicensed, inflexible and too limited to grow into a production solution without starting over.

The core problems in delivering enterprise copilots or AI agents are:

  • Enterprise grade copilots or AI agents are complex and have lots of moving parts (not to mention infrastructure).
  • The industry has a skills gap when it comes to filling the roles needed to deliver these complex copilot solutions.
  • The top AI risks (inaccuracy, cybersecurity, compliance, explainability, privacy) are not being mitigated by individual tools.
  • Delivery of a copilot or AI agent solution is time consuming, expensive and frustrating when starting from scratch.

Documentation

Get up to speed with FoundationaLLM by reading the documentation. This includes deployment instructions, quickstarts, architecture, and API references.

Getting Started

FoundationaLLM provides a simple command line driven approach to getting your first deployment up and running. Basically, it's two commands. After that, you can customize the solution, run it locally on your machine and update the deployment with your customizations.

Follow the Quick Start Deployment instructions to get FoundationaLLM deployed in your Azure subscription.

Reporting Issues and Support

If you encounter any issues with FoundationaLLM, please open an issue on GitHub. We will respond to your issue as soon as possible. Please use the Labels (bug, documentation, general question, release x.x.x) to categorize your issue and provide as much detail as possible to help us understand and resolve the issue.

Product compatible and additional computed target framework versions:
.NET: net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed.
Learn more about Target Frameworks and .NET Standard.

NuGet packages (1)

Showing the top 1 NuGet packages that depend on FoundationaLLM.Client.Core:

Package Downloads
FoundationaLLM.Core.Examples

FoundationaLLM.Core.Examples contains custom development examples packaged as tests.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
0.9.7-rc212 0 7/8/2025
0.9.7-rc211 0 7/8/2025
0.9.7-rc208 0 7/8/2025
0.9.7-rc207 0 7/8/2025
0.9.7-rc206 0 7/8/2025
0.9.7-rc205 21 7/7/2025
0.9.7-rc204 25 7/7/2025
0.9.7-rc203 29 7/7/2025
0.9.7-rc202 26 7/7/2025
0.9.7-rc201 29 7/7/2025
0.9.7-rc200 103 7/3/2025
0.9.7-rc199 105 7/3/2025
0.9.7-rc198 106 7/3/2025
0.9.7-rc197 103 7/3/2025
0.9.7-rc196 120 7/2/2025
0.9.7-rc195 118 7/2/2025
0.9.7-rc194 121 7/1/2025
0.9.7-rc193 117 7/1/2025
0.9.7-rc192 122 7/1/2025
0.9.7-rc191 124 6/30/2025
0.9.7-rc190 121 6/30/2025
0.9.7-rc188 121 6/26/2025
0.9.7-rc187 122 6/26/2025
0.9.7-rc186 121 6/26/2025
0.9.7-rc185 121 6/26/2025
0.9.7-rc184 126 6/24/2025
0.9.7-rc181 131 6/23/2025
0.9.7-rc180 126 6/23/2025
0.9.7-rc179 127 6/23/2025
0.9.7-rc178 126 6/23/2025
0.9.7-rc177 64 6/20/2025
0.9.7-rc176 64 6/20/2025
0.9.7-rc175 64 6/20/2025
0.9.7-rc174 72 6/20/2025
0.9.7-rc173 68 6/20/2025
0.9.7-rc172 126 6/19/2025
0.9.7-rc171 132 6/19/2025
0.9.7-rc170 135 6/19/2025
0.9.7-rc169 129 6/19/2025
0.9.7-rc168 131 6/19/2025
0.9.7-rc167 134 6/19/2025
0.9.7-rc166 127 6/17/2025
0.9.7-rc165 131 6/17/2025
0.9.7-rc164 132 6/16/2025
0.9.7-rc163 127 6/16/2025
0.9.7-rc162 130 6/16/2025
0.9.7-rc161 134 6/15/2025
0.9.7-rc160 200 6/13/2025
0.9.7-rc159 219 6/13/2025
0.9.7-rc158 273 6/12/2025
0.9.7-rc157 282 6/11/2025
0.9.7-rc156 270 6/11/2025
0.9.7-rc155 271 6/10/2025
0.9.7-rc154 274 6/10/2025
0.9.7-rc153 278 6/10/2025
0.9.7-rc152 277 6/10/2025
0.9.7-rc151 275 6/10/2025
0.9.7-rc150.3 107 6/23/2025
0.9.7-rc150.2 110 6/23/2025
0.9.7-rc150 273 6/10/2025
0.9.7-rc149 253 6/9/2025
0.9.7-rc148 254 6/9/2025
0.9.7-rc147 252 6/9/2025
0.9.7-rc146 250 6/9/2025
0.9.7-rc145 255 6/9/2025
0.9.7-rc144 229 6/9/2025
0.9.7-rc143 187 6/8/2025
0.9.7-rc142 189 6/8/2025
0.9.7-rc141 107 6/8/2025
0.9.7-rc140 103 6/7/2025
0.9.7-rc139 94 6/6/2025
0.9.7-rc138 99 6/6/2025
0.9.7-rc137 97 6/6/2025
0.9.7-rc136 132 6/5/2025
0.9.7-rc135 131 6/5/2025
0.9.7-rc134 133 6/5/2025
0.9.7-rc133 127 6/5/2025
0.9.7-rc132 132 6/5/2025
0.9.7-rc131 128 6/5/2025
0.9.7-rc130 131 6/5/2025
0.9.7-rc129 137 6/5/2025
0.9.7-rc128 129 6/4/2025
0.9.7-rc127 139 6/4/2025
0.9.7-rc126 121 6/4/2025
0.9.7-rc125 137 6/4/2025
0.9.7-rc124 136 6/3/2025
0.9.7-rc123 128 6/3/2025
0.9.7-rc122 129 6/3/2025
0.9.7-rc121 133 6/3/2025
0.9.7-rc120 129 6/3/2025
0.9.7-rc119 133 6/2/2025
0.9.7-rc118 131 6/2/2025
0.9.7-rc117 130 6/2/2025
0.9.7-rc116 102 5/30/2025
0.9.7-rc115 134 5/30/2025
0.9.7-rc114 134 5/29/2025
0.9.7-rc113 137 5/29/2025
0.9.7-rc112 136 5/29/2025
0.9.7-rc111 137 5/29/2025
0.9.7-rc110 137 5/29/2025
0.9.7-rc109 134 5/28/2025
0.9.7-rc108 133 5/28/2025
0.9.7-rc107 134 5/27/2025
0.9.7-rc106 131 5/27/2025
0.9.7-rc105 135 5/27/2025
0.9.7-rc104 138 5/26/2025
0.9.7-rc103 137 5/25/2025
0.9.7-rc102 140 5/25/2025
0.9.7-rc101 57 5/24/2025
0.9.7-rc100 87 5/23/2025
0.9.7-beta159 132 5/20/2025
0.9.7-beta158 163 5/16/2025
0.9.7-beta157 224 5/13/2025
0.9.7-beta156 210 5/12/2025
0.9.7-beta155 136 5/6/2025
0.9.7-beta154 140 5/6/2025
0.9.7-beta153 135 5/5/2025
0.9.7-beta152 140 4/30/2025
0.9.7-beta151 165 4/21/2025
0.9.7-beta150 161 4/21/2025
0.9.7-beta149 162 4/20/2025
0.9.7-beta148 140 4/18/2025
0.9.7-beta147 179 4/17/2025
0.9.7-beta146 185 4/17/2025
0.9.7-beta145 107 4/11/2025
0.9.7-beta144 121 4/11/2025
0.9.7-beta143 135 4/11/2025
0.9.7-beta142 123 4/11/2025
0.9.7-beta141 121 4/11/2025
0.9.7-beta140 159 4/10/2025
0.9.7-beta139 153 4/10/2025
0.9.7-beta138 156 4/9/2025
0.9.7-beta137 146 4/3/2025
0.9.7-beta136 143 4/2/2025
0.9.7-beta135 161 4/2/2025
0.9.7-beta134 145 4/2/2025
0.9.7-beta133 147 4/2/2025
0.9.7-beta132 155 4/2/2025
0.9.7-beta131 147 4/1/2025
0.9.7-beta130 161 4/1/2025
0.9.7-beta129 159 3/31/2025
0.9.7-beta128 152 3/31/2025
0.9.7-beta127 148 3/30/2025
0.9.7-beta126 147 3/30/2025
0.9.7-beta125 460 3/26/2025
0.9.7-beta124 463 3/26/2025
0.9.7-beta123 467 3/26/2025
0.9.7-beta122 464 3/25/2025
0.9.7-beta121 465 3/25/2025
0.9.7-beta120 458 3/25/2025
0.9.7-beta119 471 3/25/2025
0.9.7-beta118 468 3/25/2025
0.9.7-beta117 470 3/25/2025
0.9.7-beta116 479 3/24/2025
0.9.7-beta115 398 3/24/2025
0.9.7-beta114 263 3/23/2025
0.9.7-beta113 88 3/21/2025
0.9.7-beta112 109 3/21/2025
0.9.7-beta111 147 3/19/2025
0.9.7-beta110 151 3/19/2025
0.9.7-beta109 148 3/18/2025
0.9.7-beta108 144 3/17/2025
0.9.7-beta107 140 3/17/2025
0.9.7-beta106 157 3/17/2025
0.9.7-beta105 148 3/13/2025
0.9.7-beta104 152 3/12/2025
0.9.7-beta103 165 3/11/2025
0.9.7-beta102 158 3/9/2025
0.9.7-beta101 204 3/7/2025
0.9.7-beta100 197 3/5/2025
0.9.6 213 3/3/2025
0.9.6-rc100 93 2/28/2025
0.9.5 116 2/26/2025
0.9.5-rc102 92 2/25/2025
0.9.5-rc101 101 2/24/2025
0.9.5-rc100 105 2/23/2025
0.9.4 113 2/21/2025
0.9.3 118 2/17/2025
0.9.3-rc018 103 2/17/2025
0.9.3-rc017 101 2/12/2025
0.9.3-rc016 102 2/12/2025
0.9.3-rc015 106 2/7/2025
0.9.3-rc014 91 2/6/2025
0.9.3-rc013 97 2/5/2025
0.9.3-rc012 106 2/5/2025
0.9.3-rc011 104 2/5/2025
0.9.3-rc010 102 2/5/2025
0.9.3-rc009 108 2/4/2025
0.9.3-rc008 102 2/4/2025
0.9.3-rc007 98 2/4/2025
0.9.3-rc006 106 2/3/2025
0.9.3-rc005 105 2/3/2025
0.9.3-rc004 100 1/31/2025
0.9.3-rc003 105 1/30/2025
0.9.3-rc002 94 1/29/2025
0.9.3-rc001 86 1/29/2025
0.9.2 91 1/24/2025
0.9.2-rc007 78 1/24/2025
0.9.2-rc006 88 1/23/2025
0.9.2-rc005 82 1/23/2025
0.9.2-rc004 80 1/23/2025
0.9.2-rc003 79 1/23/2025
0.9.2-rc002 82 1/23/2025
0.9.2-rc001 84 1/22/2025
0.9.2-a001 109 1/21/2025
0.9.1 109 1/21/2025
0.9.1-rc131 91 1/19/2025
0.9.1-rc130 94 1/19/2025
0.9.1-rc129 95 1/19/2025
0.9.1-rc128 90 1/18/2025
0.9.1-rc127 88 1/18/2025
0.9.1-rc126 99 1/17/2025
0.9.1-rc125 95 1/17/2025
0.9.1-rc124 94 1/16/2025
0.9.1-rc123 91 1/15/2025
0.9.1-rc122 82 1/14/2025
0.9.1-rc121 87 1/14/2025
0.9.1-rc120 93 1/14/2025
0.9.1-rc118 93 1/13/2025
0.9.1-rc117 97 1/13/2025
0.9.1-rc116 80 1/8/2025
0.9.1-rc115 97 1/2/2025
0.9.1-rc114 96 12/24/2024
0.9.1-rc113 98 12/23/2024
0.9.1-rc112 105 12/22/2024
0.9.1-rc111 105 12/22/2024
0.9.1-rc110 94 12/21/2024
0.9.1-rc109 97 12/21/2024
0.9.1-rc108 98 12/21/2024
0.9.1-rc107 106 12/20/2024
0.9.1-rc106 99 12/20/2024
0.9.1-rc105 98 12/19/2024
0.9.1-rc104 95 12/19/2024
0.9.1-rc100 106 12/16/2024
0.9.1-alpha4 107 12/15/2024
0.9.1-alpha3 105 12/15/2024
0.9.0-rc3 101 12/9/2024
0.9.0-rc2 103 12/9/2024
0.9.0-alpha5 103 11/28/2024
0.9.0-alpha1 94 11/27/2024
0.8.4 120 11/20/2024
0.8.3 145 9/18/2024
0.8.2 124 9/3/2024
0.8.2-alpha2 109 9/23/2024
0.8.1 157 8/23/2024
0.8.1-alpha2 110 9/18/2024