MKL.NET.osx-x64 2022.0.0.105

.NET CLI:
dotnet add package MKL.NET.osx-x64 --version 2022.0.0.105

Package Manager:
NuGet\Install-Package MKL.NET.osx-x64 -Version 2022.0.0.105
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference:
<PackageReference Include="MKL.NET.osx-x64" Version="2022.0.0.105" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI:
paket add MKL.NET.osx-x64 --version 2022.0.0.105

Script & Interactive:
#r "nuget: MKL.NET.osx-x64, 2022.0.0.105"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of a script to reference the package.

Cake:
// Install MKL.NET.osx-x64 as a Cake Addin
#addin nuget:?package=MKL.NET.osx-x64&version=2022.0.0.105

// Install MKL.NET.osx-x64 as a Cake Tool
#tool nuget:?package=MKL.NET.osx-x64&version=2022.0.0.105

MKL.NET


A simple cross platform .NET API for Intel MKL.

Exposes functions from MKL, keeping the syntax as close to the C developer reference as possible.

Reference the MKL.NET package and the required runtime packages, and use the static MKL functions. The correct native libraries will be included and loaded at runtime.
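For example, a macOS application's project file might combine the core API package with this runtime package. The following csproj fragment is illustrative only; the version numbers shown are examples, not recommendations.

```xml
<!-- Illustrative csproj fragment: the core MKL.NET API package plus the
     runtime package(s) for each platform the app should run on.
     Version numbers here are examples, not recommendations. -->
<ItemGroup>
  <PackageReference Include="MKL.NET" Version="1.3.0" />
  <PackageReference Include="MKL.NET.osx-x64" Version="2022.0.0.105" />
</ItemGroup>
```

A library would typically reference only MKL.NET, leaving the choice of runtime packages to the consuming application.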

MKL.NET
runtimes:
MKL.NET.win-x64
MKL.NET.win-x86
MKL.NET.linux-x64
MKL.NET.linux-x86
MKL.NET.osx-x64
libraries:
MKL.NET.Matrix
MKL.NET.Optimization
MKL.NET.Statistics

Rationale

  • Uses freely available Intel MKL packages, repackaged to work for each runtime.
  • The MKL.NET API is just a thin .NET wrapper around the native API, keeping the syntax as close to it as possible.
  • The project is well defined, with an open design and no business logic, and could benefit from external input.
  • Cross-platform testing is easy and free using GitHub Actions.
  • MKL.NET native packages can be referenced for just the runtimes needed, at the library or application level.

MKL.NET.Matrix

  • Performance- and memory-optimised matrix algebra library.
  • Matrix expressions are optimised to perform intermediate calculations in place and reuse memory.
  • Operations such as scale, transpose, +, and * are combined into single MKL calls.
  • Intermediate matrices are disposed (or reused) automatically.
  • Underlying memory model based on ArrayPool, using IDisposable and finalizers.
  • Uses the Pinned Object Heap on net5.0.
  • Combined, these make it much faster than other matrix libraries.

The following example results in only one new matrix, r (allocated from the ArrayPool), without mutating the inputs.

public static matrix Example(matrix ma, matrix mb, vector va, vector vb)
{
    using matrix r = 0.5 * Matrix.Abs(1.0 - ma) * mb.T + Math.PI * va.T * Vector.Sin(vb);
    ...
}

Example statistics matrix function:

public static (vector, matrix) MeanAndCovariance(matrix samples, vector weights)
{
    if (samples.Rows != weights.Length) ThrowHelper.ThrowIncorrectDimensionsForOperation();
    var mean = new vector(samples.Cols);
    var cov = new matrix(samples.Cols, samples.Cols);
    var task = Vsl.SSNewTask(samples.Cols, samples.Rows, VslStorage.ROWS, samples.Array, weights.Array);
    ThrowHelper.Check(Vsl.SSEditCovCor(task, mean.Array, cov.Array, VslFormat.FULL, null, VslFormat.FULL));
    ThrowHelper.Check(Vsl.SSCompute(task, VslEstimate.COV, VslMethod.FAST));
    ThrowHelper.Check(Vsl.SSDeleteTask(task));
    return (mean, cov);
}

Note: when multiple MKL calls share a task, as above, the arrays must stay pinned across all of them, because MKL stores native pointers and the garbage collector could move the arrays between calls. MKL.NET handles pinning automatically, unpinning when the task is deleted. Missing this is a commonly seen bug when using MKL directly from .NET, and it causes occasional crashes.
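A minimal sketch of the underlying mechanism (not MKL.NET's actual internal code): in .NET, GCHandle with GCHandleType.Pinned keeps a managed array fixed in memory, so a native pointer taken from it stays valid across multiple native calls.

```csharp
// Sketch only: shows how an array can be pinned across several native
// calls so the GC cannot move it between them, which is what MKL.NET
// automates per task. Not MKL.NET's actual implementation.
using System;
using System.Runtime.InteropServices;

class PinningSketch
{
    static void Main()
    {
        double[] data = new double[100];
        GCHandle handle = GCHandle.Alloc(data, GCHandleType.Pinned);
        try
        {
            IntPtr ptr = handle.AddrOfPinnedObject();
            // ptr would be passed to the native task-creation call and
            // remains valid for every subsequent compute call, because
            // the array cannot be moved while the handle is allocated.
            Console.WriteLine(ptr != IntPtr.Zero);
        }
        finally
        {
            handle.Free(); // unpin only after the native task is deleted
        }
    }
}
```

Without the pin, a garbage collection between the task-creation call and a later compute call could relocate the array, leaving MKL's stored pointer stale.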

MKL.NET.Optimization

A simple, high-performance optimization and root-finding library, loosely based on the scipy.optimize API.

The aim is to include the latest algorithms, such as Toms748, robustly tested with CsCheck. Full use of MKL.NET will be made to improve performance. Algorithms will be performance tested and will default to the best for the given inputs.

  • Root - root-finding algorithms. The default algorithm makes 20% fewer function calls than Brent, Toms748, Newton, and Halley. Further details here.
  • Calculus - numeric derivative and integral calculations, checked to any precision using Richardson extrapolation.
  • Minimum - minimum-finding algorithms in one dimension. The default algorithm makes 50% fewer function calls than Brent.
  • Minimum - in N dimensions. Intuitive tolerance parameters. Optimised to avoid array allocation in the main loop, using in-place symmetric MKL rank-k and rank-2k functions.
    Should scale well with the number of dimensions. ~50-70% fewer function calls than other BFGS algorithms.
  • CurveFit and non-linear LeastSquares - helper functions based on Minimum.
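As a hypothetical usage sketch of bracketed root finding: the Optimize.Root name and its signature below are assumptions for illustration and may not match MKL.NET.Optimization's actual API.

```csharp
// Hypothetical sketch only: Optimize.Root and its signature are assumed
// for illustration and may differ from MKL.NET.Optimization's real API.
using System;
using MKL.NET.Optimization;

class RootDemo
{
    static void Main()
    {
        // Find x where x^2 - 2 = 0 on the bracket [1, 2], i.e. sqrt(2).
        double root = Optimize.Root(x => x * x - 2.0, 1.0, 2.0);
        Console.WriteLine(root);
    }
}
```

The scipy.optimize-style pattern is the point here: pass a function and a bracket, get back the root to the solver's default tolerance.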

MKL.NET.Statistics

Simple and high performance statistics functions.

  • Summary - Sum, Mean, Median, MAD, Raw/Central/Standard Moments, Quartiles, Quantiles, Covariance, Correlation. All can be weighted.
  • Estimator - running high-performance, low-memory estimators for Quantiles, Quartiles, Histogram, and Central/Standard Moments. Further details here.
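A hypothetical usage sketch of the summary functions: the Summary type and method names below are assumptions for illustration and may not match MKL.NET.Statistics' actual API.

```csharp
// Hypothetical sketch only: the Summary type and method names are assumed
// for illustration and may differ from MKL.NET.Statistics' real API.
using System;
using MKL.NET.Statistics;

class StatsDemo
{
    static void Main()
    {
        double[] samples = { 1.0, 2.0, 3.0, 4.0, 5.0 };
        // Unweighted summaries over a sample set; weighted overloads
        // would take a second array of weights.
        Console.WriteLine(Summary.Mean(samples));
        Console.WriteLine(Summary.Median(samples));
    }
}
```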
There are no supported framework assets in this package.

Learn more about Target Frameworks and .NET Standard.

  • .NETStandard 2.0

    • No dependencies.

NuGet packages (2)

Showing the top 2 NuGet packages that depend on MKL.NET.osx-x64:

Package Downloads
ParallelReverseAutoDiff

A library for parallelized reverse mode automatic differentiation in C# for custom neural network development.

ParallelReverseAutoDiffLite

A lightweight library for parallelized reverse mode automatic differentiation in C# for custom neural network development using single-precision.

GitHub repositories (1)

Showing the top 1 popular GitHub repositories that depend on MKL.NET.osx-x64:

Repository Stars
MKL-NET/MKL.NET
A simple cross platform .NET API for Intel MKL
Version Downloads Last updated
2022.0.0.105 17,525 1/19/2022
2021.4.0.637 1,425 11/5/2021
2021.3.0.517 2,403 7/4/2021
2021.2.0.269 1,800 3/29/2021
2021.1.1.50 1,065 2/15/2021
2020.4.301 2,651 10/30/2020
2020.3.279 2,863 9/4/2020
2020.2.258 2,017 9/4/2020
2020.1.216 2,044 9/4/2020