Microsoft.ML.OnnxRuntime.QNN
1.23.0-rc.1
Prefix Reserved
- .NET CLI: `dotnet add package Microsoft.ML.OnnxRuntime.QNN --version 1.23.0-rc.1`
- Package Manager: `NuGet\Install-Package Microsoft.ML.OnnxRuntime.QNN -Version 1.23.0-rc.1`
- PackageReference: `<PackageReference Include="Microsoft.ML.OnnxRuntime.QNN" Version="1.23.0-rc.1" />`
- Central Package Management (Directory.Packages.props): `<PackageVersion Include="Microsoft.ML.OnnxRuntime.QNN" Version="1.23.0-rc.1" />` with `<PackageReference Include="Microsoft.ML.OnnxRuntime.QNN" />` in the project file
- Paket CLI: `paket add Microsoft.ML.OnnxRuntime.QNN --version 1.23.0-rc.1`
- Script / F# Interactive: `#r "nuget: Microsoft.ML.OnnxRuntime.QNN, 1.23.0-rc.1"`
- File-based apps: `#:package Microsoft.ML.OnnxRuntime.QNN@1.23.0-rc.1`
- Cake addin: `#addin nuget:?package=Microsoft.ML.OnnxRuntime.QNN&version=1.23.0-rc.1&prerelease`
- Cake tool: `#tool nuget:?package=Microsoft.ML.OnnxRuntime.QNN&version=1.23.0-rc.1&prerelease`
About
ONNX Runtime is a cross-platform machine-learning inferencing accelerator.
ONNX Runtime can enable faster customer experiences and lower costs. It supports models from deep learning frameworks such as PyTorch and TensorFlow/Keras, as well as classical machine learning libraries such as scikit-learn, LightGBM, and XGBoost. ONNX Runtime is compatible with a wide range of hardware, drivers, and operating systems, and delivers optimal performance by leveraging hardware accelerators where applicable, alongside graph optimizations and transforms.
Learn more at the ONNX Runtime website.
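As a minimal sketch of running inference with the C# API (the model path, input name, and input shape below are placeholders; substitute those of your own model):

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;
using Microsoft.ML.OnnxRuntime.Tensors;

// Hypothetical model path -- replace with a real ONNX model file.
using var session = new InferenceSession("model.onnx");

// Hypothetical input name and shape for a typical image model.
var input = new DenseTensor<float>(new[] { 1, 3, 224, 224 });
var inputs = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("input", input)
};

// Run inference; output values are disposed with the results collection.
using var results = session.Run(inputs);
foreach (var result in results)
    System.Console.WriteLine($"{result.Name}: {result.AsTensor<float>().Length} values");
```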
NuGet Packages
ONNX Runtime Native packages
Microsoft.ML.OnnxRuntime
- Native libraries for all supported platforms
- CPU Execution Provider
- CoreML Execution Provider on macOS/iOS
- XNNPACK Execution Provider on Android/iOS
Microsoft.ML.OnnxRuntime.Gpu
- Windows and Linux
- TensorRT Execution Provider
- CUDA Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.DirectML
- Windows
- DirectML Execution Provider
- CPU Execution Provider
Microsoft.ML.OnnxRuntime.QNN
- 64-bit Windows
- QNN Execution Provider
- CPU Execution Provider
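To use this package, the QNN execution provider is enabled through `SessionOptions` before creating a session. A hedged sketch follows: the `"backend_path"` option key and `QnnHtp.dll` backend name are taken from the ONNX Runtime QNN EP documentation, but verify them against the docs for your ONNX Runtime version, and the model path is a placeholder.

```csharp
using System.Collections.Generic;
using Microsoft.ML.OnnxRuntime;

var options = new SessionOptions();

// "backend_path" selects the QNN backend library; QnnHtp.dll targets the
// Hexagon NPU on Qualcomm hardware. Option names are assumptions based on
// the QNN EP docs -- check them for your ONNX Runtime version.
options.AppendExecutionProvider("QNN", new Dictionary<string, string>
{
    ["backend_path"] = "QnnHtp.dll"
});

// Operators the QNN EP cannot handle fall back to the bundled CPU EP.
using var session = new InferenceSession("model.onnx", options);
```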
Intel.ML.OnnxRuntime.OpenVino
- 64-bit Windows
- OpenVINO Execution Provider
- CPU Execution Provider
Other packages
Microsoft.ML.OnnxRuntime.Managed
- C# language bindings
Microsoft.ML.OnnxRuntime.Extensions
- Custom operators for pre/post processing on all supported platforms.
Product | Compatible and additional computed target frameworks |
---|---|
native | native is compatible. |
Dependencies
.NETCoreApp 0.0
- Microsoft.ML.OnnxRuntime.Managed (>= 1.23.0-rc.1)
.NETFramework 0.0
- Microsoft.ML.OnnxRuntime.Managed (>= 1.23.0-rc.1)
.NETStandard 0.0
- Microsoft.ML.OnnxRuntime.Managed (>= 1.23.0-rc.1)
NuGet packages (1)
Showing the top 1 NuGet packages that depend on Microsoft.ML.OnnxRuntime.QNN:
Package | Description |
---|---|
Microsoft.ML.OnnxRuntimeGenAI.QNN | ONNX Runtime Generative AI Native Package |
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last Updated |
---|---|---|
1.23.0-rc.1 | 124 | 9/3/2025 |
1.22.2 | 227 | 8/13/2025 |
1.22.0 | 2,105 | 5/9/2025 |
1.21.0 | 3,185 | 3/7/2025 |
1.20.2 | 1,256 | 2/11/2025 |
1.20.1 | 3,016 | 11/21/2024 |
1.20.0 | 994 | 11/1/2024 |
1.19.0 | 1,321 | 8/15/2024 |
1.18.1 | 1,021 | 6/28/2024 |
1.18.0 | 767 | 5/15/2024 |
1.17.1 | 576 | 2/28/2024 |
1.16.0 | 658 | 9/21/2023 |
1.15.0 | 513 | 5/24/2023 |
Release Def:
Branch: refs/heads/users/snnn/rel123
Commit: a3a6628c7f6b8744c4797e505a09f4f4503a9f49
Build: https://aiinfra.visualstudio.com/Lotus/_build/results?buildId=927672