Encamina.Enmarcha.SemanticKernel.Connectors.Memory
8.2.0
Semantic Kernel - Memory Connectors
Memory Connectors is a project that provides specific IMemoryStore implementations. These IMemoryStore
instances are used for storing and retrieving embeddings.
Setup
NuGet package
First, install NuGet. Then, install Encamina.Enmarcha.SemanticKernel.Connectors.Memory from the package manager console:
PM> Install-Package Encamina.Enmarcha.SemanticKernel.Connectors.Memory
.NET CLI:
Install .NET CLI. Next, install Encamina.Enmarcha.SemanticKernel.Connectors.Memory from the .NET CLI:
dotnet add package Encamina.Enmarcha.SemanticKernel.Connectors.Memory
How to use
First, if you want to use the Qdrant memory provider, you need to add the QdrantOptions to your project configuration. You can achieve this by using any configuration provider. The following code is an example of how the settings would appear in the appsettings.json
file:
```json
// ...
"QdrantOptions": {
    "Host": "https://sample-qdrant.azurewebsites.net/", // Endpoint protocol and host
    "Port": 6333, // Endpoint port
    "VectorSize": 1536, // Vector size
    "ApiKey": "xxxxxxxxxx" // API key used by Qdrant as a form of client authentication
},
// ...
```
If you want to use Azure AI Search as a memory provider, you need to add the AzureSearchOptions to your project configuration. You can achieve this by using any configuration provider. The following code is an example of how the settings would appear in the appsettings.json
file:
```json
// ...
"AzureSearchOptions": {
    "Endpoint": "https://sample-searchendpoint/", // Endpoint
    "Key": "xxxxxxxxxx" // API key used by Azure AI Search as a form of client authentication
},
// ...
```
Next, in `Program.cs` or a similar entry point file in your project, add the following code:
```csharp
// Entry point
var builder = WebApplication.CreateBuilder(new WebApplicationOptions
{
    // ...
});

// ...

// Or other configuration providers...
builder.Configuration.AddJsonFile("appsettings.json", optional: true, reloadOnChange: true);

builder.Services.AddOptions<QdrantOptions>().Bind(builder.Configuration.GetSection(nameof(QdrantOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();

// Adds Qdrant as IMemoryStore
builder.Services.AddQdrantMemoryStore();

// Or adds Azure AI Search as IMemoryStore
builder.Services.AddAzureAISearchMemoryStore();
```
If you need to add both memory providers, you can use the following code:
```csharp
// "AzureAISearchMemoryStore" is the name of the memory store that you can later use to inject the IMemoryStore.
builder.Services.AddAzureAISearchNamedMemoryStore("AzureAISearchMemoryStore");

// "QdrantMemoryStore" is the name of the memory store that you can later use to inject the IMemoryStore.
builder.Services.AddQdrantNamedMemoryStore("QdrantMemoryStore");
```
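The Qdrant options binding shown above has a natural counterpart for Azure AI Search. As a sketch (assuming AzureSearchOptions is the options type bound from the "AzureSearchOptions" configuration section shown earlier, provided by the Encamina.Enmarcha.Data.AzureAISearch package), the binding would look like:
```csharp
// Sketch: bind and validate AzureSearchOptions the same way as QdrantOptions.
// AzureSearchOptions and its configuration section name are assumptions based
// on the configuration example above; adjust to your actual types.
builder.Services.AddOptions<AzureSearchOptions>().Bind(builder.Configuration.GetSection(nameof(AzureSearchOptions)))
    .ValidateDataAnnotations()
    .ValidateOnStart();
```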
In the previous code, the first part adds certain Qdrant configurations that are available in the Encamina.Enmarcha.Data.Qdrant.Abstractions NuGet package. The last lines correspond to extension methods that add the specified implementation of the IMemoryStore interface as a singleton. With this, you have Qdrant (or Azure AI Search) configured as the storage to save and retrieve embeddings.
After the initial configuration, we typically configure Semantic Kernel as Scoped:
```csharp
builder.Services.AddScoped(sp =>
{
    var kernel = new KernelBuilder()
        .WithAzureTextEmbeddingGenerationService("<YOUR DEPLOYMENT NAME>", "<YOUR AZURE ENDPOINT>", "<YOUR API KEY>")
        //.WithOpenAITextEmbeddingGenerationService("<YOUR MODEL ID>", "<YOUR API KEY>")
        // ...
        .Build();

    // ...

    return kernel;
});
```
Once configured, you can now use Semantic Kernel, and it will utilize the Qdrant storage we have previously set up (in addition to generating the embeddings).
```csharp
public class MyClass
{
    private readonly Kernel kernel;

    public MyClass(Kernel kernel)
    {
        this.kernel = kernel;
    }

    public async Task MyTestMethodAsync()
    {
        await kernel.Memory.SaveInformationAsync("my-collection", "my dummy text", Guid.NewGuid().ToString());

        // ToListAsync is provided by the System.Linq.Async NuGet package
        // (https://www.nuget.org/packages/System.Linq.Async).
        var memoryQueryResult = await kernel.Memory.SearchAsync("my-collection", "my similar dummy text")
            .ToListAsync();
    }
}
```
If you prefer, you can inject the ISemanticTextMemory interface directly.
```csharp
public class MyClass
{
    private readonly ISemanticTextMemory semanticTextMemory;

    public MyClass(ISemanticTextMemory semanticTextMemory)
    {
        this.semanticTextMemory = semanticTextMemory;
    }

    public async Task MyTestMethodAsync()
    {
        await semanticTextMemory.SaveInformationAsync("my-collection", "my dummy text", Guid.NewGuid().ToString());

        // ToListAsync is provided by the System.Linq.Async NuGet package
        // (https://www.nuget.org/packages/System.Linq.Async).
        var memoryQueryResult = await semanticTextMemory.SearchAsync("my-collection", "my similar dummy text")
            .ToListAsync();
    }
}
```
If you use named memory stores, you can inject the corresponding keyed Kernel as follows:
```csharp
internal class YourClassWithAzureSearch
{
    private readonly Kernel kernel;

    // "AzureAISearchMemoryStore" is the name used when registering the named memory store.
    public YourClassWithAzureSearch([FromKeyedServices("AzureAISearchMemoryStore")] Kernel kernel)
    {
        this.kernel = kernel;
    }
}

internal class YourClassWithQdrant
{
    private readonly Kernel kernel;

    // "QdrantMemoryStore" is the name used when registering the named memory store.
    public YourClassWithQdrant([FromKeyedServices("QdrantMemoryStore")] Kernel kernel)
    {
        this.kernel = kernel;
    }
}
```
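Since the [FromKeyedServices] attribute above relies on .NET 8 keyed services, the same named registration can also be resolved imperatively. As a sketch (assuming the registration exposes the Kernel as a keyed service under the name you passed in, as the attribute-based example suggests), you could resolve it from an IServiceProvider:
```csharp
// Sketch: resolve the Kernel associated with a named memory store via keyed
// services. The key name matches the one used at registration time; whether a
// keyed Kernel is registered under that name is an assumption drawn from the
// [FromKeyedServices] example above.
var qdrantKernel = serviceProvider.GetRequiredKeyedService<Kernel>("QdrantMemoryStore");
```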
Product | Versions compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed. |

Dependencies (net8.0)
- CommunityToolkit.Diagnostics (>= 8.2.2)
- Encamina.Enmarcha.AI.OpenAI.Abstractions (>= 8.2.0)
- Encamina.Enmarcha.AI.OpenAI.Azure (>= 8.2.0)
- Encamina.Enmarcha.Data.AzureAISearch (>= 8.2.0)
- Encamina.Enmarcha.Data.Qdrant.Abstractions (>= 8.2.0)
- Encamina.Enmarcha.DependencyInjection (>= 8.2.0)
- Encamina.Enmarcha.SemanticKernel.Abstractions (>= 8.2.0)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 8.0.1)
- Microsoft.Extensions.Http (>= 8.0.0)
- Microsoft.SemanticKernel.Connectors.AzureAISearch (>= 1.24.1-preview)
- Microsoft.SemanticKernel.Connectors.Qdrant (>= 1.24.1-preview)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
8.2.0 | 83 | 10/22/2024 |
8.2.0-preview-01-m01 | 136 | 9/17/2024 |
8.1.9-preview-02 | 61 | 10/22/2024 |
8.1.9-preview-01 | 176 | 10/4/2024 |
8.1.8 | 143 | 9/23/2024 |
8.1.8-preview-07 | 134 | 9/12/2024 |
8.1.8-preview-06 | 139 | 9/11/2024 |
8.1.8-preview-05 | 88 | 9/10/2024 |
8.1.8-preview-04 | 216 | 8/16/2024 |
8.1.8-preview-03 | 128 | 8/13/2024 |
8.1.8-preview-02 | 100 | 8/13/2024 |
8.1.8-preview-01 | 99 | 8/12/2024 |
8.1.7 | 106 | 8/7/2024 |
8.1.7-preview-09 | 124 | 7/3/2024 |
8.1.7-preview-08 | 82 | 7/2/2024 |
8.1.7-preview-07 | 88 | 6/10/2024 |
8.1.7-preview-06 | 81 | 6/10/2024 |
8.1.7-preview-05 | 100 | 6/6/2024 |
8.1.7-preview-04 | 82 | 6/6/2024 |
8.1.7-preview-03 | 92 | 5/24/2024 |
8.1.7-preview-02 | 85 | 5/10/2024 |
8.1.7-preview-01 | 114 | 5/8/2024 |
8.1.6 | 1,116 | 5/7/2024 |
8.1.6-preview-08 | 60 | 5/2/2024 |
8.1.6-preview-07 | 93 | 4/29/2024 |
8.1.6-preview-06 | 260 | 4/26/2024 |
8.1.6-preview-05 | 93 | 4/24/2024 |
8.1.6-preview-04 | 106 | 4/22/2024 |
8.1.6-preview-03 | 90 | 4/22/2024 |
8.1.6-preview-02 | 122 | 4/17/2024 |
8.1.6-preview-01 | 184 | 4/15/2024 |
8.1.5 | 111 | 4/15/2024 |
8.1.5-preview-15 | 94 | 4/10/2024 |
8.1.5-preview-14 | 124 | 3/20/2024 |
8.1.5-preview-13 | 89 | 3/18/2024 |
8.1.5-preview-12 | 98 | 3/13/2024 |
8.1.5-preview-11 | 90 | 3/13/2024 |
8.1.5-preview-10 | 127 | 3/13/2024 |
8.1.5-preview-09 | 81 | 3/12/2024 |
8.1.5-preview-08 | 84 | 3/12/2024 |
8.1.5-preview-07 | 99 | 3/8/2024 |
8.1.5-preview-06 | 199 | 3/8/2024 |
8.1.5-preview-05 | 90 | 3/7/2024 |
8.1.5-preview-04 | 93 | 3/7/2024 |
8.1.5-preview-03 | 92 | 3/7/2024 |
8.1.5-preview-02 | 139 | 2/28/2024 |
8.1.5-preview-01 | 131 | 2/19/2024 |
8.1.4 | 213 | 2/15/2024 |
8.1.3 | 122 | 2/13/2024 |
8.1.3-preview-07 | 87 | 2/13/2024 |
8.1.3-preview-06 | 102 | 2/12/2024 |
8.1.3-preview-05 | 98 | 2/9/2024 |
8.1.3-preview-04 | 102 | 2/8/2024 |
8.1.3-preview-03 | 94 | 2/7/2024 |
8.1.3-preview-02 | 95 | 2/2/2024 |
8.1.3-preview-01 | 92 | 2/2/2024 |
8.1.2 | 154 | 2/1/2024 |
8.1.2-preview-9 | 100 | 1/22/2024 |
8.1.2-preview-8 | 91 | 1/19/2024 |
8.1.2-preview-7 | 92 | 1/19/2024 |
8.1.2-preview-6 | 86 | 1/19/2024 |
8.1.2-preview-5 | 91 | 1/19/2024 |
8.1.2-preview-4 | 86 | 1/19/2024 |
8.1.2-preview-3 | 81 | 1/18/2024 |
8.1.2-preview-2 | 89 | 1/18/2024 |
8.1.2-preview-16 | 95 | 1/31/2024 |
8.1.2-preview-15 | 95 | 1/31/2024 |
8.1.2-preview-14 | 202 | 1/25/2024 |
8.1.2-preview-13 | 90 | 1/25/2024 |
8.1.2-preview-12 | 100 | 1/23/2024 |
8.1.2-preview-11 | 96 | 1/23/2024 |
8.1.2-preview-10 | 78 | 1/22/2024 |
8.1.2-preview-1 | 80 | 1/18/2024 |
8.1.1 | 130 | 1/18/2024 |
8.1.0 | 105 | 1/18/2024 |
8.0.3 | 153 | 12/29/2023 |
8.0.1 | 148 | 12/14/2023 |
8.0.0 | 153 | 12/7/2023 |
6.0.4.3 | 164 | 12/29/2023 |
6.0.4.2 | 146 | 12/20/2023 |
6.0.4.1 | 188 | 12/19/2023 |
6.0.4 | 163 | 12/4/2023 |
6.0.3.20 | 135 | 11/27/2023 |
6.0.3.19 | 138 | 11/22/2023 |