# Universal.Anthropic.Client

**Version 2.3.0**

## Installation

.NET CLI:

```shell
dotnet add package Universal.Anthropic.Client --version 2.3.0
```

Package Manager console:

```powershell
NuGet\Install-Package Universal.Anthropic.Client -Version 2.3.0
```

PackageReference in your project file:

```xml
<PackageReference Include="Universal.Anthropic.Client" Version="2.3.0" />
```

With central package management, add the version to `Directory.Packages.props` and a versionless reference to the project file:

```xml
<PackageVersion Include="Universal.Anthropic.Client" Version="2.3.0" />
<PackageReference Include="Universal.Anthropic.Client" />
```

Paket:

```shell
paket add Universal.Anthropic.Client --version 2.3.0
```

F# Interactive / Polyglot Notebooks:

```fsharp
#r "nuget: Universal.Anthropic.Client, 2.3.0"
```

Cake:

```
#addin nuget:?package=Universal.Anthropic.Client&version=2.3.0
#tool nuget:?package=Universal.Anthropic.Client&version=2.3.0
```
`Universal.Anthropic.Client` is a C# library for interacting with the Anthropic API. It provides a simple and efficient way to create messages and handle streaming responses from Anthropic's AI models.
## Features

- Easy-to-use client for the Anthropic API
- Support for creating messages
- Streaming support for real-time responses
- Token counting for messages before creation
- Extended thinking support
- Listing of available models
- Customizable API version
- Built-in error handling and deserialization
## Usage

### Initializing the Client

```csharp
var client = new AnthropicClient("YOUR_API_KEY");
```
### Creating a Message

```csharp
var request = new MessageRequest
{
    Model = "claude-sonnet-4",
    MaxTokens = 8192,
    Messages = new List<Message>
    {
        new Message(Roles.User, "Hello, how are you?")
    }
};

var response = await client.CreateMessageAsync(request);
Console.WriteLine(response.Content[0].Text);
```
### Streaming a Message

```csharp
var streamingRequest = new MessageRequest
{
    Model = "claude-sonnet-4",
    MaxTokens = 8192,
    Messages = new List<Message>
    {
        new Message(Roles.User, "Tell me a story about a brave knight.")
    },
    Stream = true
};

var streamingResponse = client.CreateMessageStreamingAsync(streamingRequest);
streamingResponse.Updated += (sender, args) =>
{
    Console.WriteLine("Response updated: " + streamingResponse.Value.Content);
};

await streamingResponse.Task; // Wait for completion
```
### Counting Message Tokens

The Token Count API allows you to count the number of tokens in a message, including tools, images, and documents, without actually creating the message. This is useful for estimating costs, validating message size limits, or optimizing your requests:
```csharp
// Create a token counting request
var countRequest = new CountMessageTokensRequest
{
    Model = Models.ClaudeOpus4,
    Messages = new List<Message>
    {
        new Message(Roles.User, "What is the square root of 841, and how did you determine it?")
    },
    System = "You are an assistant for Red Marble AI. Show your detailed reasoning process when solving problems.",
    Thinking = new Thinking
    {
        Type = ThinkingTypes.Enabled,
        BudgetTokens = 2048
    }
};

// Count the tokens
var tokenCountResponse = await client.CountMessageTokensAsync(countRequest);
Console.WriteLine($"Input tokens: {tokenCountResponse.InputTokens}");

// You can now use this information to make decisions about your request
const int maxAllowedTokens = 100_000; // example limit for your application

if (tokenCountResponse.InputTokens > maxAllowedTokens)
{
    Console.WriteLine("Message exceeds token limit, please reduce content.");
}
else
{
    // Proceed with creating the actual message
    var messageRequest = new MessageRequest
    {
        Model = countRequest.Model,
        Messages = countRequest.Messages,
        System = countRequest.System,
        Thinking = countRequest.Thinking,
        MaxTokens = 8192
    };

    var response = await client.CreateMessageAsync(messageRequest);
}
```
The `CountMessageTokensResponse` includes:

- `InputTokens`: the number of input tokens in your request
- Additional token information, depending on the request structure

This method supports all the same parameters as `CreateMessageAsync`, including:

- Messages with text and image content
- System messages
- Tools and tool schemas
- Thinking configuration
- All supported models
### Listing Available Models

The library provides a way to retrieve all available models from the Anthropic API:

```csharp
// Get all available models
var modelsResponse = await client.ListModelsAsync();

// Display available models
foreach (var model in modelsResponse.Data)
{
    Console.WriteLine($"ID: {model.Id}");
    Console.WriteLine($"Name: {model.DisplayName}");
    Console.WriteLine($"Created: {model.CreatedAt}");
    Console.WriteLine();
}
```
This functionality allows you to programmatically determine which models are available for use with the API. Models are returned with the most recently released ones listed first.
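Because the list is ordered newest-first, a quick way to pick the most recently released model is to take the first entry (reusing `modelsResponse` from the snippet above):

```csharp
// Models are returned newest-first, so the first entry is the latest release.
var latestModel = modelsResponse.Data.First();
Console.WriteLine($"Latest model: {latestModel.Id} ({latestModel.DisplayName})");
```

This requires `using System.Linq;` for the `First()` extension method.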
### Using Tools

The Anthropic API supports the use of tools, allowing the AI to perform specific actions or retrieve information. Here's an example of how to use a tool for weather checking:
```csharp
// Define a weather tool
var weatherTool = new Tool
{
    Name = "get_weather",
    Description = "Get the current weather in a given location",
    InputSchema = new JsonSchema
    {
        Type = "object",
        Properties = new Dictionary<string, JsonSchema>
        {
            ["location"] = new JsonSchema
            {
                Type = "string",
                Description = "The city and state, e.g. San Francisco, CA"
            }
        },
        Required = new[] { "location" }
    }
};

// Create a message request with the tool
var request = new MessageRequest
{
    Model = Models.Claude35Sonnet,
    Messages = new List<Message>
    {
        new Message(Roles.User, "What's the weather like in San Francisco?")
    },
    System = "You are an assistant for Red Marble AI. When asked about weather, always use the get_weather tool to provide accurate information.",
    MaxTokens = 8192,
    Tools = new List<Tool> { weatherTool },
    ToolChoice = new AutoToolChoice()
};

// Send the request
var response = await client.CreateMessageAsync(request);

// Check if the tool was used
var toolUseBlock = response.Content.OfType<ToolUseContentBlock>().FirstOrDefault();
if (toolUseBlock != null)
{
    Console.WriteLine($"Tool used: {toolUseBlock.Name}");
    Console.WriteLine($"Tool input: {toolUseBlock.Input}");
}
```
In this example, we define a `get_weather` tool with an input schema for location. We then include this tool in our message request, along with a system message instructing the AI to use the weather tool for weather-related questions. The `ToolChoice` is set to `AutoToolChoice`, allowing the model to decide when to use the tool.

After receiving the response, you can check whether the tool was used by looking for a `ToolUseContentBlock` in the response content.
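To complete the tool-use loop, you would typically execute the tool yourself and send its output back to the model in a follow-up user message. This README does not document the types for that step, so the names below (`ToolResultContentBlock`, its `ToolUseId`/`Content` members, and a `Message` overload accepting content blocks) are assumptions inferred from the `ToolUseContentBlock` naming; adjust them to the library's actual API surface:

```csharp
// Sketch only: the tool-result type and member names here are assumptions.
if (toolUseBlock != null)
{
    // Run your own weather lookup, then wrap the result for the model.
    var toolResult = new ToolResultContentBlock
    {
        ToolUseId = toolUseBlock.Id,                  // ties the result to the tool call
        Content = "15 degrees Celsius, partly cloudy" // output of your tool
    };

    // Echo the assistant's tool-use turn, then answer it with the result.
    request.Messages.Add(new Message(Roles.Assistant, response.Content));
    request.Messages.Add(new Message(Roles.User, new List<ContentBlock> { toolResult }));

    var followUp = await client.CreateMessageAsync(request);
    Console.WriteLine(followUp.Content[0].Text);
}
```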
### Using Extended Thinking

Claude 3.7 Sonnet and newer models support an extended thinking feature that reveals Claude's detailed reasoning process. This feature helps you understand how Claude arrives at its answers, especially for complex problems:
```csharp
// Create a request with extended thinking enabled
var request = new MessageRequest
{
    Model = Models.Claude37Sonnet,
    Messages = new List<Message>
    {
        new Message(Roles.User, "What's the square root of 841, and how did you determine it?")
    },
    MaxTokens = 8192,
    Thinking = new Thinking
    {
        Type = ThinkingTypes.Enabled,
        BudgetTokens = 2048
    }
};

var response = await client.CreateMessageAsync(request);

// Process the response
foreach (var block in response.Content)
{
    if (block is ThinkingContentBlock thinkingBlock)
    {
        Console.WriteLine("Thinking Process:");
        Console.WriteLine(thinkingBlock.Thinking);
        Console.WriteLine($"Signature: {thinkingBlock.Signature}");
    }
    else if (block is TextContentBlock textBlock)
    {
        Console.WriteLine("Final Answer:");
        Console.WriteLine(textBlock.Text);
    }
}
```
## Key Classes

- `AnthropicClient`: The main client for interacting with the Anthropic API.
- `MessageRequest`: Represents a request to create a message.
- `MessageResponse`: Represents the response from creating a message.
- `CountMessageTokensRequest`: Represents a request to count tokens in a message.
- `CountMessageTokensResponse`: Represents the response containing token count information.
- `StreamingMessageResponse`: Represents a streaming response for real-time updates.
- `ListResponse<T>`: Generic response for list operations, used with `Model` for listing available models.
## Error Handling

The client throws `HttpException` for non-successful status codes. Make sure to handle these exceptions in your code.
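A minimal defensive pattern around a message call might look like the following. It uses only members inherited from `Exception`, since `HttpException`'s additional members come from Universal.Common.Net.Http and are not documented here:

```csharp
try
{
    var response = await client.CreateMessageAsync(request);
    Console.WriteLine(response.Content[0].Text);
}
catch (HttpException ex)
{
    // Non-2xx status codes surface here; log and decide whether to retry.
    Console.Error.WriteLine($"Anthropic API call failed: {ex.Message}");
    throw; // or apply your own retry/fallback policy
}
```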
## Customization

You can customize the API version used by setting the `AnthropicVersion` property on the client:

```csharp
client.AnthropicVersion = "2023-06-01";
```

You can also opt into betas by setting the `AnthropicBetas` property on the client:

```csharp
client.AnthropicBetas = ["beta-version-1"];
```
For more detailed information about the Anthropic API, please refer to the official Anthropic documentation.
## Compatibility

Compatible and computed target frameworks:

| Product | Target frameworks |
|---|---|
| .NET | net5.0 through net10.0, including platform-specific targets (android, ios, maccatalyst, macos, tvos, windows, browser) |
| .NET Core | netcoreapp2.0 through netcoreapp3.1 |
| .NET Standard | netstandard2.0 (direct target), netstandard2.1 |
| .NET Framework | net461 through net481 |
| Mono / Xamarin | monoandroid, monomac, monotouch, xamarinios, xamarinmac, xamarintvos, xamarinwatchos |
| Tizen | tizen40, tizen60 |
## Dependencies

.NETStandard 2.0:

- Microsoft.Bcl.AsyncInterfaces (>= 9.0.3)
- Newtonsoft.Json (>= 13.0.3)
- Universal.Common.Json (>= 1.5.0)
- Universal.Common.Net.Http (>= 5.0.0)
- Universal.Common.Serialization (>= 2.4.0)
## Used By

NuGet packages (1) that depend on Universal.Anthropic.Client:

- Universal.GenerativeAI.Anthropic: Implementation of generative AI abstractions using Anthropic as the model provider.

This package is not used by any popular GitHub repositories.
## Release Notes

- Added support for Model Context Protocol properties.
- Added a new enumeration for new models.
- Added support for the Count Message Tokens API.
- Added support for beta headers.