OnnxStack.StableDiffusion 0.8.0

.NET CLI:
dotnet add package OnnxStack.StableDiffusion --version 0.8.0

Package Manager (Visual Studio Package Manager Console):
NuGet\Install-Package OnnxStack.StableDiffusion -Version 0.8.0

PackageReference (for projects that support it, add to the project file):
<PackageReference Include="OnnxStack.StableDiffusion" Version="0.8.0" />

Paket CLI:
paket add OnnxStack.StableDiffusion --version 0.8.0

F# Interactive / Polyglot Notebooks:
#r "nuget: OnnxStack.StableDiffusion, 0.8.0"

Cake:
// Install OnnxStack.StableDiffusion as a Cake Addin
#addin nuget:?package=OnnxStack.StableDiffusion&version=0.8.0

// Install OnnxStack.StableDiffusion as a Cake Tool
#tool nuget:?package=OnnxStack.StableDiffusion&version=0.8.0

OnnxStack.StableDiffusion - Onnx Stable Diffusion Services for .NET Applications

OnnxStack.StableDiffusion is a library that provides higher-level Stable Diffusion services for use in .NET applications. It offers extensive support for dependency injection, .NET configuration, ASP.NET Core integration, and IHostedService.

Getting Started

OnnxStack.StableDiffusion is available via the NuGet package manager; download and install it with:

PM> Install-Package OnnxStack.StableDiffusion

Microsoft.ML.OnnxRuntime

Depending on the devices you have and the platform you are running on, you will want to install the Microsoft.ML.OnnxRuntime package that best suits your needs.

CPU and GPU via Microsoft DirectML

PM> Install-Package Microsoft.ML.OnnxRuntime.DirectML

GPU via NVIDIA CUDA (for AMD GPUs, use the DirectML package above)

PM> Install-Package Microsoft.ML.OnnxRuntime.Gpu

.NET Core Registration

You can easily integrate OnnxStack.StableDiffusion into your application services layer. This registration process sets up the necessary services and loads the appsettings.json configuration.

Example: Registering OnnxStack.StableDiffusion

builder.Services.AddOnnxStackStableDiffusion();
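
Because registration goes through the standard service collection, the same call can be used from an ASP.NET Core application. The sketch below is illustrative only: AddOnnxStackStableDiffusion() and IStableDiffusionService are the names shown elsewhere on this page, while the minimal API endpoint is an assumed usage, not part of the package.

// Illustrative ASP.NET Core registration sketch (assumed usage, not part of the package)
using OnnxStack.StableDiffusion.Common;

var builder = WebApplication.CreateBuilder(args);

// Registers the Stable Diffusion services and binds the OnnxStackConfig section of appsettings.json
builder.Services.AddOnnxStackStableDiffusion();

var app = builder.Build();

// IStableDiffusionService can now be resolved anywhere the container is available,
// e.g. in a minimal API endpoint, a controller or an IHostedService (see the console example below).
app.MapGet("/ping", (IStableDiffusionService stableDiffusionService) =>
    Results.Ok("Stable Diffusion service resolved"));

app.Run();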

.NET Console Application Example

Required NuGet packages for this example:

Microsoft.Extensions.Hosting
Microsoft.Extensions.Logging
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;
using Microsoft.Extensions.Logging;
using OnnxStack.StableDiffusion.Common;
using OnnxStack.StableDiffusion.Config;

internal class Program
{
   static async Task Main(string[] _)
   {
      var builder = Host.CreateApplicationBuilder();
      builder.Logging.ClearProviders();
      builder.Services.AddLogging((loggingBuilder) => loggingBuilder.SetMinimumLevel(LogLevel.Error));

      // Add OnnxStack Stable Diffusion
      builder.Services.AddOnnxStackStableDiffusion();

      // Add AppService
      builder.Services.AddHostedService<AppService>();

      // Start
      await builder.Build().RunAsync();
   }
}

internal class AppService : IHostedService
{
   private readonly string _outputDirectory;
   private readonly IStableDiffusionService _stableDiffusionService;

   public AppService(IStableDiffusionService stableDiffusionService)
   {
      _stableDiffusionService = stableDiffusionService;
      _outputDirectory = Path.Combine(Directory.GetCurrentDirectory(), "Images");
   }

   public async Task StartAsync(CancellationToken cancellationToken)
   {
      Directory.CreateDirectory(_outputDirectory);

      while (true)
      {
         System.Console.WriteLine("Please type a prompt and press ENTER");
         var prompt = System.Console.ReadLine();

         System.Console.WriteLine("Please type a negative prompt and press ENTER (optional)");
         var negativePrompt = System.Console.ReadLine();


         // Example only, full config depends on model
         // appsettings.json is recommended for ease of use
         var modelOptions = new ModelOptions
         {
            Name = "Stable  Diffusion 1.5",
            ExecutionProvider = ExecutionProvider.DirectML,
            ModelConfigurations = new List<OnnxModelSessionConfig>
            {
                  new OnnxModelSessionConfig
                  {
                     Type = OnnxModelType.Unet,
                     OnnxModelPath = "model path"
                  }
            }
         };

         var promptOptions = new PromptOptions
         {
            Prompt = prompt,
            NegativePrompt = negativePrompt,
            DiffuserType = DiffuserType.TextToImage,

            // Input for ImageToImage (see the ImageToImage sketch after this example)
            // InputImage = new InputImage(await File.ReadAllBytesAsync("image to image filename"))
         };

         var schedulerOptions = new SchedulerOptions
         {
            Seed = Random.Shared.Next(),
            GuidanceScale = 7.5f,
            InferenceSteps = 30,
            Height = 512,
            Width = 512,
            SchedulerType = SchedulerType.LMS,
         };


         // Generate Image Example
         var outputFilename = Path.Combine(_outputDirectory, $"{schedulerOptions.Seed}_{schedulerOptions.SchedulerType}.png");
         var result = await _stableDiffusionService.GenerateAsImageAsync(modelOptions, promptOptions, schedulerOptions);
         if (result is not null)
         {
            // Save image to disk
            await result.SaveAsPngAsync(outputFilename);
         }




         // Generate Batch Example
         var batchOptions = new BatchOptions
         {
            BatchType = BatchOptionType.Seed,
            ValueTo = 20
         };

         var batchIndex = 0;
         await foreach (var batchResult in _stableDiffusionService.GenerateBatchAsImageAsync(modelOptions, promptOptions, schedulerOptions, batchOptions))
         {
            // Save each image in the batch to its own file
            var batchFilename = Path.Combine(_outputDirectory, $"{schedulerOptions.Seed}_{schedulerOptions.SchedulerType}_{batchIndex++}.png");
            await batchResult.SaveAsPngAsync(batchFilename);
         }


      }
   }

   public Task StopAsync(CancellationToken cancellationToken)
   {
      return Task.CompletedTask;
   }
}
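
The commented InputImage line in the PromptOptions above hints at the ImageToImage path. The fragment below is a minimal sketch of what changes inside the same prompt loop; the "input.png" path is a placeholder, and the InputImage(byte[]) usage is assumed from that comment rather than confirmed against this version's API.

// ImageToImage sketch (illustrative): only the prompt options change,
// the model options, scheduler options and service call stay the same as above.
var imageToImagePrompt = new PromptOptions
{
   Prompt = prompt,
   NegativePrompt = negativePrompt,
   DiffuserType = DiffuserType.ImageToImage,

   // "input.png" is a placeholder for the source image to transform
   InputImage = new InputImage(await File.ReadAllBytesAsync("input.png"))
};

var imageToImageResult = await _stableDiffusionService.GenerateAsImageAsync(modelOptions, imageToImagePrompt, schedulerOptions);
if (imageToImageResult is not null)
{
   await imageToImageResult.SaveAsPngAsync(Path.Combine(_outputDirectory, $"{schedulerOptions.Seed}_img2img.png"));
}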

Configuration

The appsettings.json file is the easiest option for configuring model sets. Below is an example configuration for Stable Diffusion 1.5. It adds the path to each model file required for Stable Diffusion, as well as any model-specific settings. Each model can be assigned to its own device, which is handy if you only have a small GPU; this way, you can offload only what you need. There are limitations depending on the version of the Microsoft.ML.OnnxRuntime package you are using, but in most cases you can split the load between CPU and GPU (see the per-model fragment after the example below).

{
   "Logging": {
      "LogLevel": {
         "Default": "Information",
         "Microsoft.AspNetCore": "Warning"
      }
   },

   "OnnxStackConfig": {
      "Name": "StableDiffusion 1.5",
      "IsEnabled": true,
      "PadTokenId": 49407,
      "BlankTokenId": 49407,
      "TokenizerLimit": 77,
      "EmbeddingsLength": 768,
      "ScaleFactor": 0.18215,
      "PipelineType": "StableDiffusion",
      "Diffusers": [
         "TextToImage",
         "ImageToImage",
         "ImageInpaintLegacy"
      ],
      "DeviceId": 0,
      "InterOpNumThreads": 0,
      "IntraOpNumThreads": 0,
      "ExecutionMode": "ORT_SEQUENTIAL",
      "ExecutionProvider": "DirectML",
      "ModelConfigurations": [
         {
            "Type": "Tokenizer",
            "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\cliptokenizer.onnx"
         },
         {
            "Type": "Unet",
            "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\unet\\model.onnx"
         },
         {
            "Type": "TextEncoder",
            "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\text_encoder\\model.onnx"
         },
         {
            "Type": "VaeEncoder",
            "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\vae_encoder\\model.onnx"
         },
         {
            "Type": "VaeDecoder",
            "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\vae_decoder\\model.onnx"
         }
      ]
   }
}
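
As mentioned above, each model can be assigned to its own device by overriding the execution settings per entry in ModelConfigurations. The fragment below is illustrative only; the per-entry DeviceId and ExecutionProvider overrides (and the "Cpu" provider name) are assumptions about this version's configuration schema and should be checked against the release in use.

"ModelConfigurations": [
   {
      "Type": "Unet",
      "DeviceId": 0,
      "ExecutionProvider": "DirectML",
      "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\unet\\model.onnx"
   },
   {
      "Type": "TextEncoder",
      "ExecutionProvider": "Cpu",
      "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\text_encoder\\model.onnx"
   }
]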
Product compatible and additional computed target framework versions:
.NET net7.0 is compatible. net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos and net8.0-windows were computed as compatible.

NuGet packages (1)

Showing the top 1 NuGet packages that depend on OnnxStack.StableDiffusion:

Frank.SemanticKernel.Connectors.OnnxStack.StableDiffusion


GitHub repositories (1)

Showing the top 1 popular GitHub repositories that depend on OnnxStack.StableDiffusion:

TensorStack-AI/OnnxStack
C# Stable Diffusion using ONNX Runtime
Version Downloads Last updated
0.39.0 378 6/12/2024
0.31.0 235 4/25/2024
0.27.0 176 3/31/2024
0.25.0 155 3/14/2024
0.23.1 139 3/1/2024
0.23.0 123 2/29/2024
0.22.0 139 2/23/2024
0.21.0 132 2/15/2024
0.19.0 139 2/1/2024
0.17.0 177 1/19/2024
0.16.0 142 1/11/2024
0.15.0 208 1/5/2024
0.14.0 177 12/27/2023
0.13.0 118 12/22/2023
0.12.0 148 12/15/2023
0.10.0 181 11/30/2023
0.9.0 158 11/23/2023
0.8.0 220 11/16/2023
0.7.0 160 11/9/2023
0.6.0 136 11/2/2023
0.5.0 166 10/27/2023
0.4.0 165 10/19/2023
0.3.1 160 10/9/2023
0.3.0 151 10/9/2023
0.2.0 162 10/3/2023
0.1.0 164 9/25/2023 (deprecated; no longer maintained)