OnnxStack.Core
0.4.0
See the version list below for details.
dotnet add package OnnxStack.Core --version 0.4.0
NuGet\Install-Package OnnxStack.Core -Version 0.4.0
<PackageReference Include="OnnxStack.Core" Version="0.4.0" />
paket add OnnxStack.Core --version 0.4.0
#r "nuget: OnnxStack.Core, 0.4.0"
// Install OnnxStack.Core as a Cake Addin
#addin nuget:?package=OnnxStack.Core&version=0.4.0
// Install OnnxStack.Core as a Cake Tool
#tool nuget:?package=OnnxStack.Core&version=0.4.0
OnnxStack.Core - Onnx Services for .NET Applications
OnnxStack.Core is a library that provides higher-level ONNX services for use in .NET applications. It offers extensive support for features such as dependency injection, .NET configuration implementations, ASP.NET Core integration, and IHostedService support.
You can configure a model set for runtime, offloading individual models to different devices to make better use of resources or run on lower-end hardware. The first use-case is StableDiffusion; however, it will be expanded, and other model sets, such as object detection and classification, will be added.
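As a sketch of such offloading, a Stable Diffusion model set might keep the lightweight tokenizer on CPU while sending heavier models to GPUs via DirectML. The model types beyond "Tokenizer" and the file paths below are illustrative assumptions, not a verbatim configuration:

```json
{
  "OnnxStackConfig": {
    "Name": "StableDiffusion",
    "TokenizerLimit": 77,
    "ModelConfigurations": [
      { "Type": "Tokenizer",   "DeviceId": 0, "ExecutionProvider": "Cpu",      "OnnxModelPath": "models\\cliptokenizer.onnx" },
      { "Type": "TextEncoder", "DeviceId": 0, "ExecutionProvider": "DirectML", "OnnxModelPath": "models\\text_encoder.onnx" },
      { "Type": "Unet",        "DeviceId": 1, "ExecutionProvider": "DirectML", "OnnxModelPath": "models\\unet.onnx" }
    ]
  }
}
```

Each entry pairs a model with its own DeviceId and ExecutionProvider, which is what lets a single model set span CPU and multiple GPUs.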
Getting Started
OnnxStack.Core is available via the NuGet package manager; download and install it with:
PM> Install-Package OnnxStack.Core
.NET Core Registration
You can easily integrate OnnxStack.Core into your application services layer. This registration process sets up the necessary services and loads the appsettings.json configuration.
Example: Registering OnnxStack
builder.Services.AddOnnxStack();
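In context, a minimal host setup might look like the following sketch. It assumes the ASP.NET Core minimal hosting model; only the AddOnnxStack() extension comes from OnnxStack.Core:

```csharp
using OnnxStack.Core; // provides the AddOnnxStack() extension

var builder = WebApplication.CreateBuilder(args);

// Registers the OnnxStack services and binds the OnnxStackConfig
// section from appsettings.json
builder.Services.AddOnnxStack();

var app = builder.Build();
app.Run();
```

Services such as IOnnxModelService can then be resolved through constructor injection anywhere in the application.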
Configuration example
The appsettings.json file is the easiest way to configure model sets. Below is an example configuration for a clip tokenizer.
{
  "Logging": {
    "LogLevel": {
      "Default": "Information",
      "Microsoft.AspNetCore": "Warning"
    }
  },
  "AllowedHosts": "*",
  "OnnxStackConfig": {
    "Name": "Clip Tokenizer",
    "TokenizerLimit": 77,
    "ModelConfigurations": [{
      "Type": "Tokenizer",
      "DeviceId": 0,
      "ExecutionProvider": "Cpu",
      "OnnxModelPath": "D:\\Repositories\\stable-diffusion-v1-5\\cliptokenizer.onnx"
    }]
  }
}
Basic C# Example
// Resolved from DI
IOnnxModelService _onnxModelService;

// Tokenizer model example
var text = "Text To Tokenize";
var inputTensor = new DenseTensor<string>(new string[] { text }, new int[] { 1 });
var inputString = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("string_input", inputTensor)
};

// Run the Onnx clip tokenizer session and read the inference output.
using (var tokens = _onnxModelService.RunInference(OnnxModelType.Tokenizer, inputString))
{
    var resultTensor = tokens.ToArray();
}
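The collection returned by RunInference holds the session's named outputs. Assuming the tokenizer emits the token ids as an int64 tensor (the element type is an assumption about this particular model, not documented here), they could be read back roughly like this inside the using block:

```csharp
// resultTensor[0] is the first named output of the session;
// AsTensor<T> comes from Microsoft.ML.OnnxRuntime.
var tokenIds = resultTensor[0].AsTensor<long>().ToArray();
Console.WriteLine(string.Join(", ", tokenIds));
```

Note the output values are only valid inside the using block, since disposing the result releases the underlying native buffers.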
Basic C# Example (No DI)
// Create configuration
var onnxStackConfig = new OnnxStackConfig
{
    Name = "OnnxStack",
    TokenizerLimit = 77,
    ModelConfigurations = new List<OnnxModelSessionConfig>
    {
        new OnnxModelSessionConfig
        {
            DeviceId = 0,
            ExecutionProvider = ExecutionProvider.DirectML,
            Type = OnnxModelType.Tokenizer,
            OnnxModelPath = "clip_tokenizer.onnx",
        }
    }
};

// Create the service directly (no DI container required)
var onnxModelService = new OnnxModelService(onnxStackConfig);

// Tokenizer model example
var text = "Text To Tokenize";
var inputTensor = new DenseTensor<string>(new string[] { text }, new int[] { 1 });
var inputString = new List<NamedOnnxValue>
{
    NamedOnnxValue.CreateFromTensor("string_input", inputTensor)
};

// Run the Onnx clip tokenizer session and read the inference output.
using (var tokens = onnxModelService.RunInference(OnnxModelType.Tokenizer, inputString))
{
    var resultTensor = tokens.ToArray();
}
Product | Compatible and additional computed target framework versions
---|---
.NET | net7.0 is compatible. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed.
Dependencies (net7.0)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 7.0.0)
- Microsoft.Extensions.Hosting.Abstractions (>= 7.0.0)
- Microsoft.ML (>= 2.0.1)
- Microsoft.ML.OnnxRuntime.Extensions (>= 0.9.0)
- Microsoft.ML.OnnxRuntime.Managed (>= 1.16.1)
NuGet packages (3)
Showing the top 3 NuGet packages that depend on OnnxStack.Core:
- OnnxStack.StableDiffusion: Stable Diffusion Library for .NET
- OnnxStack.ImageUpscaler: OnnxRuntime Image Upscale Library for .NET
- OnnxStack.FeatureExtractor: OnnxRuntime Image Feature Extractor Library for .NET
GitHub repositories (1)
Showing the top popular GitHub repository that depends on OnnxStack.Core:
- TensorStack-AI/OnnxStack: C# Stable Diffusion using ONNX Runtime
Version | Downloads | Last updated
---|---|---
0.39.0 | 380 | 6/12/2024
0.31.0 | 267 | 4/25/2024
0.27.0 | 192 | 3/31/2024
0.25.0 | 179 | 3/14/2024
0.23.0 | 175 | 2/29/2024
0.22.0 | 135 | 2/23/2024
0.21.0 | 157 | 2/15/2024
0.19.0 | 163 | 2/1/2024
0.17.0 | 184 | 1/18/2024
0.16.0 | 134 | 1/11/2024
0.15.0 | 205 | 1/5/2024
0.14.0 | 161 | 12/27/2023
0.13.0 | 133 | 12/22/2023
0.12.0 | 142 | 12/15/2023
0.10.0 | 168 | 11/30/2023
0.9.0 | 147 | 11/23/2023
0.8.0 | 202 | 11/16/2023
0.7.0 | 147 | 11/9/2023
0.6.0 | 133 | 11/2/2023
0.5.0 | 140 | 10/27/2023
0.4.0 | 121 | 10/19/2023
0.3.1 | 148 | 10/9/2023
0.3.0 | 117 | 10/9/2023
0.2.0 | 124 | 10/3/2023
0.1.0 | 171 | 9/25/2023