Microsoft.Windows.AI.MachineLearning
0.3.131-beta
Prefix Reserved
.NET CLI: dotnet add package Microsoft.Windows.AI.MachineLearning --version 0.3.131-beta
Package Manager (requires NuGet 3.3 or higher): NuGet\Install-Package Microsoft.Windows.AI.MachineLearning -Version 0.3.131-beta
PackageReference: <PackageReference Include="Microsoft.Windows.AI.MachineLearning" Version="0.3.131-beta" />
Central package management: <PackageVersion Include="Microsoft.Windows.AI.MachineLearning" Version="0.3.131-beta" /> with <PackageReference Include="Microsoft.Windows.AI.MachineLearning" /> in the project file
Paket CLI: paket add Microsoft.Windows.AI.MachineLearning --version 0.3.131-beta
Script & Interactive: #r "nuget: Microsoft.Windows.AI.MachineLearning, 0.3.131-beta"
Cake Addin: #addin nuget:?package=Microsoft.Windows.AI.MachineLearning&version=0.3.131-beta&prerelease
Cake Tool: #tool nuget:?package=Microsoft.Windows.AI.MachineLearning&version=0.3.131-beta&prerelease
Microsoft.Windows.AI.MachineLearning
The Microsoft Windows ML Runtime provides APIs for machine learning and AI operations in Windows applications.
Basic Usage
This package provides the Windows ML Runtime WinMD files for use in both C# and C++ projects.
Package Deployment
The package includes MSIX packages that contain the Windows ML Runtime. These packages should be deployed as part of your application's installation process.
MSIX Package Structure
When WinMLDeployMSIXToOutput is set to true, the MSIX packages will be copied to your output directory in this structure:
msix/
├── win-x64/
│   └── Microsoft.Windows.AI.MachineLearning.msix
└── win-arm64/
    └── Microsoft.Windows.AI.MachineLearning.msix
You can instruct the NuGet package to copy these files to your output directory by setting:
<PropertyGroup>
<WinMLDeployMSIXToOutput>true</WinMLDeployMSIXToOutput>
</PropertyGroup>
Hardware Architecture Detection
IMPORTANT: You must install the MSIX package that matches your hardware platform, not your application's architecture:
- For ARM64 hardware, use the ARM64 package, even if your application is x64 (running under emulation)
- For x64 hardware, use the x64 package
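If you need to determine the hardware architecture yourself (for example, in a custom installer), a minimal C# sketch using RuntimeInformation; this is general .NET guidance, not an API of this package:
// C# hardware architecture detection sketch. On the .NET versions this
// package targets (net8.0-windows), OSArchitecture reports the machine
// architecture even when the process runs under x64 emulation on ARM64.
using System;
using System.Runtime.InteropServices;

Architecture machine = RuntimeInformation.OSArchitecture;      // hardware/OS architecture
Architecture process = RuntimeInformation.ProcessArchitecture; // architecture of this process

Console.WriteLine($"Machine: {machine}, Process: {process}");
// Choose msix/win-arm64 when machine is Arm64, and msix/win-x64 when it is X64,
// regardless of the process architecture.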
// C++ deployment code example
#include <WinMLBootstrap.h>
// Deploy the package - auto-detects hardware architecture and uses the appropriate MSIX
HRESULT hr = WinMLDeployMainPackage();
if (FAILED(hr))
{
// Handle deployment failure
}
// C# deployment code example
using Microsoft.Windows.AI.MachineLearning.Bootstrap;
// Deploy the package - auto-detects hardware architecture and uses the appropriate MSIX
int hr = NativeMethods.WinMLDeployMainPackage();
if (hr < 0)
{
// Handle deployment failure
}
IMPORTANT: The MSIX deployment function is specifically designed for installation scenarios. It automatically looks for the MSIX file in the msix/win-{arch} subdirectory relative to your application executable.
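Before calling the deployment function, you may want to confirm the MSIX is actually present where it will be looked for. A minimal C# sketch of that check, based on the layout above (the path construction is illustrative, not a package API):
// C# sketch: verify the MSIX exists at the location the deployment
// helper expects (msix/win-{arch} next to the executable).
using System;
using System.IO;
using System.Runtime.InteropServices;

string arch = RuntimeInformation.OSArchitecture == Architecture.Arm64 ? "arm64" : "x64";
string msixPath = Path.Combine(
    AppContext.BaseDirectory,
    "msix",
    $"win-{arch}",
    "Microsoft.Windows.AI.MachineLearning.msix");

if (!File.Exists(msixPath))
{
    Console.Error.WriteLine($"Expected MSIX not found: {msixPath}");
}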
Custom Deployment
If you need more control over the deployment process, you can use the Windows.Management.Deployment.PackageManager API directly. For complete documentation on the PackageManager API, see the Microsoft Docs: Windows.Management.Deployment.PackageManager.
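A minimal C# sketch of custom deployment with PackageManager, assuming a Windows target framework (such as net8.0-windows10.0.26100) so the WinRT projection is available, and an msixPath variable pointing at one of the packaged MSIX files described above:
// C# custom deployment sketch using Windows.Management.Deployment.PackageManager.
using System;
using System.Threading.Tasks;
using Windows.Management.Deployment;

static async Task DeployAsync(string msixPath)
{
    var packageManager = new PackageManager();
    try
    {
        // Install the package; no dependency packages are passed here.
        DeploymentResult result = await packageManager.AddPackageAsync(
            new Uri(msixPath),
            null,
            DeploymentOptions.None);

        // On failure the awaited call usually throws; ErrorText carries
        // extra detail when a result is returned.
        Console.WriteLine($"Deployment finished. {result.ErrorText}");
    }
    catch (Exception ex)
    {
        Console.Error.WriteLine($"Deployment failed: {ex.Message}");
    }
}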
Bootstrap Functionality
The package includes Bootstrap functionality for both C++ and C# projects, which is automatically configured when you install the NuGet package. This provides runtime initialization and dependency management without any additional setup.
What's Included
For C++ projects, the package automatically:
- Adds necessary include paths
- Adds required library references (WinMLBootstrap.lib)
- Copies the WinMLBootstrap.dll to your output directory
- Includes auto-initialization code
For C# projects, the package automatically:
- Copies the WinMLBootstrap.dll to your output directory
- Includes auto-initialization code
Configuration Options
Bootstrap functionality is included by default, but you can configure its behavior with these project properties:
<PropertyGroup>
<WinMLBootstrapAutoInitializeDisabled>true</WinMLBootstrapAutoInitializeDisabled>
<WinMLContinueOnInitFailure>true</WinMLContinueOnInitFailure>
</PropertyGroup>
1. Disable Auto-Initialization
When WinMLBootstrapAutoInitializeDisabled is set to true, the auto-initialization code is not included in your project. In this case, you'll need to manually initialize and uninitialize the runtime as shown below.
2. Continue on Initialization Failure
When WinMLContinueOnInitFailure is set to true, your application will continue running even if initialization fails. You can check the initialization status using the methods shown in the manual initialization examples.
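For example, with continue-on-failure enabled you might gate ML-dependent features on the reported status. A short C# sketch using the status call shown in the manual initialization example below:
// C# sketch: fall back gracefully when the runtime failed to initialize.
using Microsoft.Windows.AI.MachineLearning.Bootstrap;

int status = NativeMethods.WinMLGetInitializationStatus();
if (status < 0)
{
    // Runtime is unavailable; disable or bypass ML-dependent features.
}
else
{
    // Safe to use Windows ML APIs.
}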
Manual Initialization in C++
When auto-initialization is disabled, you need to manually initialize and uninitialize:
#include <WinMLBootstrap.h>
// Initialize
HRESULT hr = WinMLInitialize();
if (FAILED(hr))
{
// Handle initialization failure...
}
// Get the initialization status at any point
HRESULT status = WinMLGetInitializationStatus();
// Later, uninitialize before application exit
WinMLUninitialize();
Manual Initialization in C#
When auto-initialization is disabled, you need to manually initialize and uninitialize:
using Microsoft.Windows.AI.MachineLearning.Bootstrap;
// Initialize
int hr = NativeMethods.WinMLInitialize();
if (hr < 0)
{
// Handle initialization failure...
}
// Get the initialization status at any point
int status = NativeMethods.WinMLGetInitializationStatus();
// Later, uninitialize before application exit
NativeMethods.WinMLUninitialize();
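If you prefer a scoped lifetime, the initialize/uninitialize pair can be wrapped in a small disposable helper. This is a sketch built only on the NativeMethods calls shown above; the wrapper type itself is hypothetical and not shipped with the package:
// C# sketch: tie the runtime lifetime to a using scope (hypothetical helper).
using System;
using Microsoft.Windows.AI.MachineLearning.Bootstrap;

sealed class WinMLRuntimeScope : IDisposable
{
    public WinMLRuntimeScope()
    {
        int hr = NativeMethods.WinMLInitialize();
        if (hr < 0)
        {
            throw new InvalidOperationException($"WinML initialization failed: 0x{hr:X8}");
        }
    }

    public void Dispose() => NativeMethods.WinMLUninitialize();
}

// Usage:
// using (new WinMLRuntimeScope())
// {
//     // Call Windows ML APIs here.
// }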
Product | Compatible and computed target frameworks |
---|---|
.NET | net8.0-windows10.0.26100 is compatible. net9.0-windows was computed. net10.0-windows was computed. |
native | native is compatible. |
Universal Windows Platform | uap10.0.22000 is compatible. |
Dependencies
- native 0.0: No dependencies.
- net8.0-windows10.0.26100: No dependencies.
- UAP 10.0.22000: No dependencies.
NuGet packages (1)
Showing the top NuGet package that depends on Microsoft.Windows.AI.MachineLearning:
- Microsoft.ML.OnnxRuntimeGenAI.WinML: ONNX Runtime Generative AI Native Package
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
0.3.131-beta | 880 | 5/20/2025 |