Azure.AI.Language.Conversations
2.0.0-beta.1
Azure Cognitive Language Services Conversations client library for .NET
The Azure.AI.Language.Conversations client library provides a suite of APIs for conversational language analysis capabilities such as conversation language understanding and orchestration, conversational summarization, and conversational personally identifiable information (PII) detection.
Conversation Language Understanding (CLU) is a cloud-based conversational AI service that provides language understanding capabilities such as:
- Conversation app: extracts intents and entities from conversations.
- Orchestration (workflow) app: acts as an orchestrator that selects the best candidate app (for example, Question Answering, LUIS, or a conversation app) to analyze a conversation and return the best response.
Conversation Summarization is a feature of Azure AI Language that combines generative large language models with task-optimized encoder models to offer summarization with higher quality, cost efficiency, and lower latency.
Conversational PII detection is another feature of Azure AI Language that uses a collection of machine learning and AI algorithms to identify, categorize, and redact sensitive information in text. The conversational PII model is specialized for speech transcriptions and the more informal, conversational tone of meeting and call transcripts.
Source code | Package (NuGet) | API reference documentation | Samples | Product documentation | Analysis REST API documentation
[!NOTE] Conversational Authoring is not supported in version 2.0.0-beta.1. If you use Conversational Authoring, please continue to use version 1.1.0. You can find the samples here.
Getting started
Install the package
Install the Azure Cognitive Language Services Conversations client library for .NET with NuGet:
dotnet add package Azure.AI.Language.Conversations
Prerequisites
- An Azure subscription
- An existing Azure Language Service Resource
Though you can use this SDK to create and import conversation projects, Language Studio is the preferred method for creating projects.
Authenticate the client
In order to interact with the Conversations service, you'll need to create an instance of the ConversationAnalysisClient class. You will need an endpoint and an API key to instantiate a client object. For more information regarding authenticating with Cognitive Services, see Authenticate requests to Azure Cognitive Services.
Get an API key
You can get the endpoint and an API key from the Cognitive Services resource in the Azure Portal.
Alternatively, use the Azure CLI command shown below to get the API key from the Cognitive Services resource.
az cognitiveservices account keys list --resource-group <resource-group-name> --name <resource-name>
Namespaces
Start by importing the namespaces for the ConversationAnalysisClient and related classes:
using Azure.Core;
using Azure.Core.Serialization;
using Azure.AI.Language.Conversations;
Create a ConversationAnalysisClient
Once you've determined your endpoint and API key, you can instantiate a ConversationAnalysisClient:
Uri endpoint = new Uri("https://myaccount.cognitiveservices.azure.com");
AzureKeyCredential credential = new AzureKeyCredential("{api-key}");
ConversationAnalysisClient client = new ConversationAnalysisClient(endpoint, credential);
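If you prefer not to hard-code these values, here is a minimal sketch that reads them from environment variables instead (the variable names LANGUAGE_ENDPOINT and LANGUAGE_KEY are illustrative, not required by the SDK):
// Variable names are illustrative; use whatever configuration source fits your app.
string endpointValue = Environment.GetEnvironmentVariable("LANGUAGE_ENDPOINT")
    ?? throw new InvalidOperationException("Set the LANGUAGE_ENDPOINT environment variable.");
string keyValue = Environment.GetEnvironmentVariable("LANGUAGE_KEY")
    ?? throw new InvalidOperationException("Set the LANGUAGE_KEY environment variable.");
ConversationAnalysisClient client = new ConversationAnalysisClient(new Uri(endpointValue), new AzureKeyCredential(keyValue));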
Create a client using Azure Active Directory authentication
You can also create a ConversationAnalysisClient
using Azure Active Directory (AAD) authentication. Your user or service principal must be assigned the "Cognitive Services Language Reader" role.
Using the DefaultAzureCredential you can authenticate a service using Managed Identity or a service principal, authenticate as a developer working on an application, and more, all without changing code.
Before you can use the DefaultAzureCredential, or any credential type from Azure.Identity, you'll first need to install the Azure.Identity package.
To use DefaultAzureCredential with a client ID and secret, you'll need to set the AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET environment variables; alternatively, you can pass those values to the ClientSecretCredential, also in Azure.Identity.
Make sure you use the right namespace for DefaultAzureCredential
at the top of your source file:
using Azure.Identity;
Then you can create an instance of DefaultAzureCredential
and pass it to a new instance of your client:
Uri endpoint = new Uri("https://myaccount.cognitiveservices.azure.com");
DefaultAzureCredential credential = new DefaultAzureCredential();
ConversationAnalysisClient client = new ConversationAnalysisClient(endpoint, credential);
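If you would rather pass the tenant ID, client ID, and client secret explicitly instead of relying on environment variables, here is a minimal sketch using ClientSecretCredential (the placeholder values are illustrative):
Uri endpoint = new Uri("https://myaccount.cognitiveservices.azure.com");
// Placeholder values; supply your own tenant ID, app registration (client) ID, and client secret.
ClientSecretCredential credential = new ClientSecretCredential("{tenant-id}", "{client-id}", "{client-secret}");
ConversationAnalysisClient client = new ConversationAnalysisClient(endpoint, credential);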
Note that regional endpoints do not support AAD authentication. Instead, create a custom domain name for your resource to use AAD authentication.
Key concepts
ConversationAnalysisClient
The ConversationAnalysisClient
is the primary interface for making predictions using your deployed Conversations models. It provides both synchronous and asynchronous APIs to submit queries.
Thread safety
We guarantee that all client instance methods are thread-safe and independent of each other (guideline). This ensures that the recommendation of reusing client instances is always safe, even across threads.
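For example, a single client can safely serve concurrent requests. A minimal sketch, assuming the endpoint and credential from above and a request payload like the data built in the examples below:
// One shared client instance; both calls below run concurrently and safely.
ConversationAnalysisClient sharedClient = new ConversationAnalysisClient(endpoint, credential);
Task<Response<AnalyzeConversationActionResult>>[] pending = new[]
{
    sharedClient.AnalyzeConversationAsync(data),
    sharedClient.AnalyzeConversationAsync(data),
};
await Task.WhenAll(pending);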
Additional concepts
Client options | Accessing the response | Long-running operations | Handling failures | Diagnostics | Mocking | Client lifetime
Examples
The Azure.AI.Language.Conversations client library provides both synchronous and asynchronous APIs.
The following examples show common scenarios using the client
created above.
Extract intents and entities from a conversation (Conversation Language Understanding)
To analyze a conversation, you can call the AnalyzeConversation()
method:
string projectName = "Menu";
string deploymentName = "production";
AnalyzeConversationInput data = new ConversationLanguageUnderstandingInput(
new ConversationAnalysisInput(
new TextConversationItem(
id: "1",
participantId: "participant1",
text: "Send an email to Carol about tomorrow's demo")),
new ConversationLanguageUnderstandingActionContent(projectName, deploymentName)
{
// Use Utf16CodeUnit for strings in .NET.
StringIndexType = StringIndexType.Utf16CodeUnit,
});
Response<AnalyzeConversationActionResult> response = client.AnalyzeConversation(data);
ConversationActionResult conversationActionResult = response.Value as ConversationActionResult;
ConversationPrediction conversationPrediction = conversationActionResult.Result.Prediction as ConversationPrediction;
Console.WriteLine($"Top intent: {conversationPrediction.TopIntent}");
Console.WriteLine("Intents:");
foreach (ConversationIntent intent in conversationPrediction.Intents)
{
Console.WriteLine($"Category: {intent.Category}");
Console.WriteLine($"Confidence: {intent.Confidence}");
Console.WriteLine();
}
Console.WriteLine("Entities:");
foreach (ConversationEntity entity in conversationPrediction.Entities)
{
Console.WriteLine($"Category: {entity.Category}");
Console.WriteLine($"Text: {entity.Text}");
Console.WriteLine($"Offset: {entity.Offset}");
Console.WriteLine($"Length: {entity.Length}");
Console.WriteLine($"Confidence: {entity.Confidence}");
Console.WriteLine();
if (entity.Resolutions != null && entity.Resolutions.Any())
{
foreach (ResolutionBase resolution in entity.Resolutions)
{
if (resolution is DateTimeResolution dateTimeResolution)
{
Console.WriteLine($"Datetime Sub Kind: {dateTimeResolution.DateTimeSubKind}");
Console.WriteLine($"Timex: {dateTimeResolution.Timex}");
Console.WriteLine($"Value: {dateTimeResolution.Value}");
Console.WriteLine();
}
}
}
}
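The request can also be submitted asynchronously; a minimal sketch assuming the same data payload built above:
Response<AnalyzeConversationActionResult> response = await client.AnalyzeConversationAsync(data);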
Additional options can be passed to AnalyzeConversation
like enabling more verbose output:
string projectName = "Menu";
string deploymentName = "production";
AnalyzeConversationInput data = new ConversationLanguageUnderstandingInput(
new ConversationAnalysisInput(
new TextConversationItem(
id: "1",
participantId: "participant1",
text: "Send an email to Carol about tomorrow's demo")),
new ConversationLanguageUnderstandingActionContent(projectName, deploymentName)
{
// Use Utf16CodeUnit for strings in .NET.
StringIndexType = StringIndexType.Utf16CodeUnit,
Verbose = true,
});
Response<AnalyzeConversationActionResult> response = client.AnalyzeConversation(data);
Extract intents and entities from a conversation in a different language (Conversation Language Understanding)
The language
property can be set to specify the language of the conversation:
string projectName = "Menu";
string deploymentName = "production";
AnalyzeConversationInput data =
new ConversationLanguageUnderstandingInput(
new ConversationAnalysisInput(
new TextConversationItem(
id: "1",
participantId: "participant1",
text: "Enviar un email a Carol acerca de la presentación de mañana")
{
Language = "es"
}),
new ConversationLanguageUnderstandingActionContent(projectName, deploymentName)
{
// Use Utf16CodeUnit for strings in .NET.
StringIndexType = StringIndexType.Utf16CodeUnit,
Verbose = true
});
Response<AnalyzeConversationActionResult> response = client.AnalyzeConversation(data);
Orchestrate a conversation between various conversation apps, like a Question Answering app and a CLU app
To analyze a conversation using an orchestration project, you can call the AnalyzeConversation() method just as you would for a conversation project.
string projectName = "DomainOrchestrator";
string deploymentName = "production";
AnalyzeConversationInput data = new ConversationLanguageUnderstandingInput(
new ConversationAnalysisInput(
new TextConversationItem(
id: "1",
participantId: "participant1",
text: "How are you?")),
new ConversationLanguageUnderstandingActionContent(projectName, deploymentName)
{
StringIndexType = StringIndexType.Utf16CodeUnit,
});
Response<AnalyzeConversationActionResult> response = client.AnalyzeConversation(data);
ConversationActionResult conversationResult = response.Value as ConversationActionResult;
OrchestrationPrediction orchestrationPrediction = conversationResult.Result.Prediction as OrchestrationPrediction;
Question Answering prediction
If your conversation was analyzed by Question Answering, it will include an intent - perhaps even the top intent - from which you can retrieve answers:
string respondingProjectName = orchestrationPrediction.TopIntent;
Console.WriteLine($"Top intent: {respondingProjectName}");
TargetIntentResult targetIntentResult = orchestrationPrediction.Intents[respondingProjectName];
if (targetIntentResult is QuestionAnsweringTargetIntentResult questionAnsweringTargetIntentResult)
{
AnswersResult questionAnsweringResponse = questionAnsweringTargetIntentResult.Result;
Console.WriteLine($"Question Answering Response:");
foreach (KnowledgeBaseAnswer answer in questionAnsweringResponse.Answers)
{
Console.WriteLine(answer.Answer?.ToString());
}
}
CLU prediction
If your conversation was analyzed by a CLU application, it will include an intent and entities:
string respondingProjectName = orchestrationPrediction.TopIntent;
TargetIntentResult targetIntentResult = orchestrationPrediction.Intents[respondingProjectName];
if (targetIntentResult is ConversationTargetIntentResult conversationTargetIntent)
{
ConversationResult conversationResult = conversationTargetIntent.Result;
ConversationPrediction conversationPrediction = conversationResult.Prediction;
Console.WriteLine($"Top Intent: {conversationPrediction.TopIntent}");
Console.WriteLine($"Intents:");
foreach (ConversationIntent intent in conversationPrediction.Intents)
{
Console.WriteLine($"Intent Category: {intent.Category}");
Console.WriteLine($"Confidence: {intent.Confidence}");
Console.WriteLine();
}
}
Summarize a conversation
To summarize a conversation, you can use the AnalyzeConversationsAsync method overload that returns a Response<AnalyzeConversationOperationState>:
MultiLanguageConversationInput input = new MultiLanguageConversationInput(
new List<ConversationInput>
{
new TextConversation("1", "en", new List<TextConversationItem>()
{
new TextConversationItem("1", "Agent", "Hello, how can I help you?"),
new TextConversationItem("2", "Customer", "How to upgrade Office? I am getting error messages the whole day."),
new TextConversationItem("3", "Agent", "Press the upgrade button please. Then sign in and follow the instructions.")
})
});
List<AnalyzeConversationOperationAction> actions = new List<AnalyzeConversationOperationAction>
{
new SummarizationOperationAction()
{
ActionContent = new ConversationSummarizationActionContent(new List<SummaryAspect>
{
SummaryAspect.Issue,
}),
Name = "Issue task",
},
new SummarizationOperationAction()
{
ActionContent = new ConversationSummarizationActionContent(new List<SummaryAspect>
{
SummaryAspect.Resolution,
}),
Name = "Resolution task",
}
};
AnalyzeConversationOperationInput data = new AnalyzeConversationOperationInput(input, actions);
Response<AnalyzeConversationOperationState> analyzeConversationOperation = await client.AnalyzeConversationsAsync(data);
AnalyzeConversationOperationState operationState = analyzeConversationOperation.Value;
foreach (var operationResult in operationState.Actions.Items)
{
Console.WriteLine($"Operation action name: {operationResult.Name}");
if (operationResult is SummarizationOperationResult summarizationOperationResult)
{
SummaryResult results = summarizationOperationResult.Results;
foreach (ConversationsSummaryResult conversation in results.Conversations)
{
Console.WriteLine($"Conversation: #{conversation.Id}");
Console.WriteLine("Summaries:");
foreach (SummaryResultItem summary in conversation.Summaries)
{
Console.WriteLine($"Text: {summary.Text}");
Console.WriteLine($"Aspect: {summary.Aspect}");
}
if (conversation.Warnings != null && conversation.Warnings.Any())
{
Console.WriteLine("Warnings:");
foreach (InputWarning warning in conversation.Warnings)
{
Console.WriteLine($"Code: {warning.Code}");
Console.WriteLine($"Message: {warning.Message}");
}
}
Console.WriteLine();
}
}
if (operationState.Errors != null && operationState.Errors.Any())
{
Console.WriteLine("Errors:");
foreach (ConversationError error in operationState.Errors)
{
Console.WriteLine($"Error: {error.Code} - {error}");
}
}
}
Extract PII from a conversation
To detect and redact PII in a conversation, you can use the AnalyzeConversationsAsync method overload with an action of type PiiOperationAction, which returns a Response<AnalyzeConversationOperationState>:
MultiLanguageConversationInput input = new MultiLanguageConversationInput(
new List<ConversationInput>
{
new TextConversation("1", "en", new List<TextConversationItem>()
{
new TextConversationItem(id: "1", participantId: "Agent_1", text: "Can you provide your name?"),
new TextConversationItem(id: "2", participantId: "Customer_1", text: "Hi, my name is John Doe."),
new TextConversationItem(id: "3", participantId: "Agent_1", text: "Thank you John, that has been updated in our system.")
})
});
List<AnalyzeConversationOperationAction> actions = new List<AnalyzeConversationOperationAction>
{
new PiiOperationAction()
{
ActionContent = new ConversationPiiActionContent(),
Name = "Conversation PII",
}
};
AnalyzeConversationOperationInput data = new AnalyzeConversationOperationInput(input, actions);
Response<AnalyzeConversationOperationState> analyzeConversationOperation = await client.AnalyzeConversationsAsync(data);
AnalyzeConversationOperationState operationState = analyzeConversationOperation.Value;
foreach (AnalyzeConversationOperationResult operationResult in operationState.Actions.Items)
{
Console.WriteLine($"Operation action name: {operationResult.Name}");
if (operationResult is ConversationPiiOperationResult piiOperationResult)
{
foreach (ConversationalPiiResult conversation in piiOperationResult.Results.Conversations)
{
Console.WriteLine($"Conversation: #{conversation.Id}");
Console.WriteLine("Detected Entities:");
foreach (ConversationPiiItemResult item in conversation.ConversationItems)
{
foreach (NamedEntity entity in item.Entities)
{
Console.WriteLine($" Category: {entity.Category}");
Console.WriteLine($" Subcategory: {entity.Subcategory}");
Console.WriteLine($" Text: {entity.Text}");
Console.WriteLine($" Offset: {entity.Offset}");
Console.WriteLine($" Length: {entity.Length}");
Console.WriteLine($" Confidence score: {entity.ConfidenceScore}");
Console.WriteLine();
}
}
if (conversation.Warnings != null && conversation.Warnings.Any())
{
Console.WriteLine("Warnings:");
foreach (InputWarning warning in conversation.Warnings)
{
Console.WriteLine($"Code: {warning.Code}");
Console.WriteLine($"Message: {warning.Message}");
}
}
Console.WriteLine();
}
}
if (operationState.Errors != null && operationState.Errors.Any())
{
Console.WriteLine("Errors:");
foreach (ConversationError error in operationState.Errors)
{
Console.WriteLine($"Error: {error.Code} - {error}");
}
}
}
Additional samples
Browse our samples for more examples of how to analyze conversations.
Troubleshooting
General
When you interact with the Cognitive Language Services Conversations client library using the .NET SDK, errors returned by the service correspond to the same HTTP status codes returned for REST API requests.
For example, if you submit an utterance to a non-existent project, a 400 error is returned indicating "Bad Request".
try
{
var data = new
{
analysisInput = new
{
conversationItem = new
{
text = "Send an email to Carol about tomorrow's demo",
id = "1",
participantId = "1",
}
},
parameters = new
{
projectName = "invalid-project",
deploymentName = "production",
// Use Utf16CodeUnit for strings in .NET.
stringIndexType = "Utf16CodeUnit",
},
kind = "Conversation",
};
Response response = client.AnalyzeConversation(RequestContent.Create(data));
}
catch (RequestFailedException ex)
{
Console.WriteLine(ex.ToString());
}
You will notice that additional information is logged, like the client request ID of the operation.
Azure.RequestFailedException: The input parameter is invalid.
Status: 400 (Bad Request)
ErrorCode: InvalidArgument
Content:
{
"error": {
"code": "InvalidArgument",
"message": "The input parameter is invalid.",
"innerError": {
"code": "InvalidArgument",
"message": "The input parameter \"payload\" cannot be null or empty."
}
}
}
Headers:
Transfer-Encoding: chunked
pragma: no-cache
request-id: 0303b4d0-0954-459f-8a3d-1be6819745b5
apim-request-id: 0303b4d0-0954-459f-8a3d-1be6819745b5
x-envoy-upstream-service-time: 15
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
x-content-type-options: nosniff
Cache-Control: no-store, proxy-revalidate, no-cache, max-age=0, private
Content-Type: application/json
Setting up console logging
The simplest way to see the logs is to enable console logging. To create an Azure SDK log listener that outputs messages to the console, use the AzureEventSourceListener.CreateConsoleLogger method.
// Setup a listener to monitor logged events.
using AzureEventSourceListener listener = AzureEventSourceListener.CreateConsoleLogger();
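CreateConsoleLogger also accepts an EventLevel when you want more (or less) detail. For example, a sketch that captures verbose output (this additionally requires using System.Diagnostics.Tracing and using Azure.Core.Diagnostics):
// Capture verbose events as well as informational ones.
using AzureEventSourceListener verboseListener = AzureEventSourceListener.CreateConsoleLogger(EventLevel.Verbose);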
To learn more about other logging mechanisms see here.
Next steps
- View our samples.
- Read about the different features of the Conversations service.
- Try our service demos.
Contributing
See the CONTRIBUTING.md for details on building, testing, and contributing to this library.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit cla.microsoft.com.
When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 was computed. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
Dependencies
.NETStandard 2.0
- Azure.Core (>= 1.41.0)
- System.Text.Json (>= 4.7.2)
NuGet packages (6)
Showing the top 5 NuGet packages that depend on Azure.AI.Language.Conversations:
Package | Description |
---|---|
Encamina.Enmarcha.AI.IntentsPrediction.Azure | Package Description |
AccessibleAI.Bots.LanguageUnderstanding | Helpers for working with Conversational Language Understanding (CLU), Orchestration, and Chit Chat for Microsoft Bot Framework bot development |
AccessibleAI.Bots.Language.Azure | Bots Framework intent resolvers using Conversational Language Understanding (CLU) and Orchestration for bot development |
NegativeEddy.Bots.LanguageUnderstandingRecognizer | Package Description |
MattEland.Bots.CluHelpers | Package Description |
GitHub repositories (4)
Showing the top 4 popular GitHub repositories that depend on Azure.AI.Language.Conversations:
Repository | Description |
---|---|
Azure-Samples/cognitive-services-speech-sdk | Sample code for the Microsoft Cognitive Services Speech SDK |
MicrosoftLearning/AI-102-AIEngineer | Lab files for AI-102 - AI Engineer |
MicrosoftLearning/mslearn-ai-language | Lab files for Azure AI Language modules |
Azure-Samples/communication-services-AI-customer-service-sample | A sample app for the customer support center running in Azure, using Azure Communication Services and Azure OpenAI for text and voice bots. |
Version | Downloads | Last updated |
---|---|---|
2.0.0-beta.1 | 3,253 | 8/1/2024 |
1.1.0 | 171,945 | 6/14/2023 |
1.1.0-beta.2 | 22,233 | 11/11/2022 |
1.1.0-beta.1 | 8,097 | 7/2/2022 |
1.0.0 | 145,852 | 6/28/2022 |
1.0.0-beta.3 | 1,471 | 4/20/2022 |
1.0.0-beta.2 | 4,724 | 2/8/2022 |
1.0.0-beta.1 | 2,989 | 11/4/2021 |