Cnblogs.DashScope.AspNetCore 0.7.1

.NET CLI
dotnet add package Cnblogs.DashScope.AspNetCore --version 0.7.1

Package Manager
NuGet\Install-Package Cnblogs.DashScope.AspNetCore -Version 0.7.1
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="Cnblogs.DashScope.AspNetCore" Version="0.7.1" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)
Directory.Packages.props
<PackageVersion Include="Cnblogs.DashScope.AspNetCore" Version="0.7.1" />
Project file
<PackageReference Include="Cnblogs.DashScope.AspNetCore" />
For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution's Directory.Packages.props file and the version-less PackageReference node into the project file.

Paket CLI
paket add Cnblogs.DashScope.AspNetCore --version 0.7.1

Script & Interactive
#r "nuget: Cnblogs.DashScope.AspNetCore, 0.7.1"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of the script to reference the package.

Cake
#addin nuget:?package=Cnblogs.DashScope.AspNetCore&version=0.7.1
Install Cnblogs.DashScope.AspNetCore as a Cake Addin.

#tool nuget:?package=Cnblogs.DashScope.AspNetCore&version=0.7.1
Install Cnblogs.DashScope.AspNetCore as a Cake Tool.

English | 简体中文


DashScope SDK for .NET

An unofficial DashScope SDK maintained by Cnblogs.

Warning: this project is under active development; breaking changes may be introduced without notice or a major version bump. Make sure you read the Release Notes before upgrading.

Quick Start

Using Microsoft.Extensions.AI

Install the Cnblogs.DashScope.AI package.

var client = new DashScopeClient("your-api-key").AsChatClient("qwen-max");
var completion = await client.CompleteAsync("hello");
Console.WriteLine(completion)
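
The same IChatClient surface can also stream tokens as they arrive. A minimal sketch, assuming the Microsoft.Extensions.AI preview that pairs CompleteAsync with CompleteStreamingAsync:

await foreach (var update in client.CompleteStreamingAsync("hello"))
{
    // Each update carries an incremental chunk of the reply.
    Console.Write(update.Text);
}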

Console App

Install the Cnblogs.DashScope.Sdk package.

var prompt = "hello";
var client = new DashScopeClient("your-api-key");
var completion = await client.GetQWenCompletionAsync(QWenLlm.QWenMax, prompt);
// or pass the model name string directly.
// var completion = await client.GetQWenCompletionAsync("qwen-max", prompt);
Console.WriteLine(completion.Output.Text);

ASP.NET Core

Install the Cnblogs.DashScope.AspNetCore package.

Program.cs

builder.AddDashScopeClient(builder.Configuration);

appsettings.json

{
    "DashScope": {
        "ApiKey": "your-api-key"
    }
}

Usage

public class YourService(IDashScopeClient client)
{
    public async Task<string> CompletePromptAsync(string prompt)
    {
        var completion = await client.GetQWenCompletionAsync(QWenLlm.QWenMax, prompt);
        return completion.Output.Text;
    }
}
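
For reference, the pieces above fit together in a minimal Program.cs like the sketch below; the /complete endpoint is illustrative, not part of the SDK:

var builder = WebApplication.CreateBuilder(args);

// Registers IDashScopeClient using the "DashScope" configuration section.
builder.AddDashScopeClient(builder.Configuration);

var app = builder.Build();

// Hypothetical endpoint that forwards a prompt to the injected client.
app.MapGet("/complete", async (IDashScopeClient client, string prompt) =>
{
    var completion = await client.GetQWenCompletionAsync(QWenLlm.QWenMax, prompt);
    return completion.Output.Text;
});

app.Run();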

Supported APIs

  • Text Embedding API - GetTextEmbeddingsAsync()
  • Text Generation API (qwen-turbo, qwen-max, etc.) - GetQWenCompletionAsync() and GetQWenCompletionStreamAsync()
  • DeepSeek Models - GetDeepSeekCompletionAsync() and GetDeepSeekCompletionStreamAsync()
  • BaiChuan Models - GetBaiChuanTextCompletionAsync()
  • LLaMa2 Models - GetLlama2TextCompletionAsync()
  • Multimodal Generation API (qwen-vl-max, etc.) - GetQWenMultimodalCompletionAsync() and GetQWenMultimodalCompletionStreamAsync()
  • Wanx Models (image generation, background generation, etc.)
    • Image Synthesis - CreateWanxImageSynthesisTaskAsync() and GetWanxImageSynthesisTaskAsync()
    • Image Generation - CreateWanxImageGenerationTaskAsync() and GetWanxImageGenerationTaskAsync()
    • Background Image Generation - CreateWanxBackgroundGenerationTaskAsync() and GetWanxBackgroundGenerationTaskAsync()
  • File API used by Qwen-Long - UploadFileAsync() and DeleteFileAsync()
  • Application call - GetApplicationResponseAsync() and GetApplicationResponseStreamAsync()

Examples

Visit snapshots for calling samples.

Visit tests for more usage of each API.

General Text Completion API

Use client.GetTextCompletionAsync and client.GetTextCompletionStreamAsync to access the text generation API directly.

var completion = await dashScopeClient.GetTextCompletionAsync(
            new ModelRequest<TextGenerationInput, ITextGenerationParameters>
            {
                Model = "your-model-name",
                Input = new TextGenerationInput { Prompt = prompt },
                Parameters = new TextGenerationParameters()
                {
                    // control parameters as you wish.
                    EnableSearch = true
                }
            });
var completions = dashScopeClient.GetTextCompletionStreamAsync(
            new ModelRequest<TextGenerationInput, ITextGenerationParameters>
            {
                Model = "your-model-name",
                Input = new TextGenerationInput { Messages = [TextChatMessage.System("you are a helpful assistant"), TextChatMessage.User("How are you?")] },
                Parameters = new TextGenerationParameters()
                {
                    // control parameters as you wish.
                    EnableSearch = true,
                    IncrementalOutput = true
                }
            });
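
The streaming overload returns an async sequence of partial responses. A minimal consumption sketch, assuming each streamed element exposes the same Output.Text property as the non-streaming response:

await foreach (var chunk in completions)
{
    // With incremental output enabled, each chunk contains only the newly generated text.
    Console.Write(chunk.Output.Text);
}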

Single Text Completion

var prompt = "hello"
var completion = await client.GetQWenCompletionAsync(QWenLlm.QWenMax, prompt);
Console.WriteLine(completion.Output.Text);

Reasoning

Use completion.Output.Choices![0].Message.ReasoningContent to access the reasoning content from the model.

var history = new List<ChatMessage>
{
    ChatMessage.User("Calculate 1+1")
};
var completion = await client.GetDeepSeekChatCompletionAsync(DeepSeekLlm.DeepSeekR1, history);
Console.WriteLine(completion.Output.Choices![0].Message.ReasoningContent);

Multi-round chat

var history = new List<ChatMessage>
{
    ChatMessage.User("Please remember this number, 42"),
    ChatMessage.Assistant("I have remembered this number."),
    ChatMessage.User("What was the number I metioned before?")
}
var parameters = new TextGenerationParameters()
{
    ResultFormat = ResultFormats.Message
};
var completion = await client.GetQWenChatCompletionAsync(QWenLlm.QWenMax, history, parameters);
Console.WriteLine(completion.Output.Choices[0].Message.Content); // The number is 42
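
To continue the conversation, append the assistant's reply to the history before sending the next user message, mirroring the pattern used in the function call example below:

// Append the assistant reply, then ask a follow-up question.
history.Add(completion.Output.Choices[0].Message);
history.Add(ChatMessage.User("Now add 8 to that number."));
completion = await client.GetQWenChatCompletionAsync(QWenLlm.QWenMax, history, parameters);
Console.WriteLine(completion.Output.Choices[0].Message.Content);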

Function Call

Create a function with parameters.

string GetCurrentWeather(GetCurrentWeatherParameters parameters)
{
    // actual implementation should be different.
    return "Sunny, 14" + parameters.Unit switch
    {
        TemperatureUnit.Celsius => "℃",
        TemperatureUnit.Fahrenheit => "℉"
    };
}

public record GetCurrentWeatherParameters(
    [property: Required]
    [property: Description("The city and state, e.g. San Francisco, CA")]
    string Location,
    [property: JsonConverter(typeof(EnumStringConverter<TemperatureUnit>))]
    TemperatureUnit Unit = TemperatureUnit.Celsius);

public enum TemperatureUnit
{
    Celsius,
    Fahrenheit
}

Append tool information to chat messages.

var tools = new List<ToolDefinition>()
{
    new(
        ToolTypes.Function,
        new FunctionDefinition(
            nameof(GetCurrentWeather),
            "Get the weather abount given location",
            new JsonSchemaBuilder().FromType<GetCurrentWeatherParameters>().Build()))
};

var history = new List<ChatMessage>
{
    ChatMessage.User("What is the weather today in C.A?")
};

var parameters = new TextGenerationParameters()
{
    ResultFormat = ResultFormats.Message,
    Tools = tools
};

// Send the question with the available tools.
var completion = await client.GetQWenChatCompletionAsync(QWenLlm.QWenMax, history, parameters);
history.Add(completion.Output.Choices[0].Message);

// The model responds with tool calls.
Console.WriteLine(completion.Output.Choices[0].Message.ToolCalls[0].Function.Name); // GetCurrentWeather

// Call the tool the model requested and append the result to the history.
var result = GetCurrentWeather(JsonSerializer.Deserialize<GetCurrentWeatherParameters>(completion.Output.Choices[0].Message.ToolCalls[0].Function.Arguments));
history.Add(ChatMessage.Tool(result, nameof(GetCurrentWeather)));

// Get back the final answer.
completion = await client.GetQWenChatCompletionAsync(QWenLlm.QWenMax, history, parameters);
Console.WriteLine(completion.Output.Choices[0].Message.Content);

Append the tool call result with the tool role; the model will then generate an answer based on that result.

QWen-Long with files

Upload the file first.

var file = new FileInfo("test.txt");
var uploadedFile = await dashScopeClient.UploadFileAsync(file.OpenRead(), file.Name);

Use the uploaded file id in messages.

var history = new List<ChatMessage>
{
    ChatMessage.File(uploadedFile.Id),   // use array for multiple files, e.g. [file1.Id, file2.Id]
    ChatMessage.User("Summarize the content of file.")
};
var parameters = new TextGenerationParameters()
{
    ResultFormat = ResultFormats.Message
};
var completion = await client.GetQWenChatCompletionAsync(QWenLlm.QWenLong, history, parameters);
Console.WriteLine(completion.Output.Choices[0].Message.Content);

Delete the file if needed.

var deletionResult = await dashScopeClient.DeleteFileAsync(uploadedFile.Id);

Application call

Use GetApplicationResponseAsync to call an application.

Use GetApplicationResponseStreamAsync for streaming output.

var request =
    new ApplicationRequest()
    {
        Input = new ApplicationInput() { Prompt = "Summarize this file." },
        Parameters = new ApplicationParameters()
        {
            TopK = 100,
            TopP = 0.8f,
            Seed = 1234,
            Temperature = 0.85f,
            RagOptions = new ApplicationRagOptions()
            {
                PipelineIds = ["thie5bysoj"],
                FileIds = ["file_d129d632800c45aa9e7421b30561f447_10207234"]
            }
        }
    };
var response = await client.GetApplicationResponseAsync("your-application-id", request);
Console.WriteLine(response.Output.Text);
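
For streaming output, iterate the result of GetApplicationResponseStreamAsync. A minimal sketch, assuming each streamed element exposes the same Output.Text property as the non-streaming response:

await foreach (var chunk in client.GetApplicationResponseStreamAsync("your-application-id", request))
{
    Console.Write(chunk.Output.Text);
}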

ApplicationRequest uses Dictionary<string, object?> as the BizParams type by default.

var request =
    new ApplicationRequest()
    {
        Input = new ApplicationInput()
        {
            Prompt = "Summarize this file.",
            BizParams = new Dictionary<string, object?>()
            {
                { "customKey1", "custom-value" }
            }
        }
    };
var response = await client.GetApplicationResponseAsync("your-application-id", request);
Console.WriteLine(response.Output.Text);

You can use the generic version ApplicationRequest<TBizParams> for strongly-typed BizParams. Keep in mind that the client uses snake_case for JSON serialization by default, so you may need to apply [JsonPropertyName("camelCase")] to properties that require a different naming policy.

public record TestApplicationBizParam(
    [property: JsonPropertyName("sourceCode")]
    string SourceCode);

var request =
    new ApplicationRequest<TestApplicationBizParam>()
    {
        Input = new ApplicationInput<TestApplicationBizParam>()
        {
            Prompt = "Summarize this file.",
            BizParams = new TestApplicationBizParam("test")
        }
    };
var response = await client.GetApplicationResponseAsync("your-application-id", request);
Console.WriteLine(response.Output.Text);
Compatible and additional computed target framework versions
.NET: net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos and net9.0-windows are computed.
Learn more about Target Frameworks and .NET Standard.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
0.7.1 139 13 days ago
0.7.0 131 15 days ago
0.6.1 147 24 days ago
0.6.0 200 25 days ago
0.5.2 130 3 months ago
0.5.1 119 4 months ago
0.5.0 111 4 months ago
0.4.1 99 4 months ago
0.4.0 116 6 months ago
0.3.0 202 9 months ago
0.2.2 113 9 months ago
0.2.1 126 10 months ago
0.2.0 154 3/14/2024
0.1.0 138 3/13/2024
0.0.3 144 3/5/2024
0.0.2 137 3/4/2024
0.0.1 138 3/4/2024