TensorStack.Upscaler
0.1.17
dotnet add package TensorStack.Upscaler --version 0.1.17
NuGet\Install-Package TensorStack.Upscaler -Version 0.1.17
<PackageReference Include="TensorStack.Upscaler" Version="0.1.17" />
<PackageVersion Include="TensorStack.Upscaler" Version="0.1.17" />
<PackageReference Include="TensorStack.Upscaler" />
paket add TensorStack.Upscaler --version 0.1.17
#r "nuget: TensorStack.Upscaler, 0.1.17"
#:package TensorStack.Upscaler@0.1.17
#addin nuget:?package=TensorStack.Upscaler&version=0.1.17
#tool nuget:?package=TensorStack.Upscaler&version=0.1.17
TensorStack.Upscaler
High-performance ONNX image and video upscaling for AI-generated content or other media. Supports multiple models and provides both buffered and streaming video modes for flexible memory usage.
Quick Start
This minimal example demonstrates how to upscale an image and a video using TensorStack.Upscaler.
[nuget: TensorStack.Upscaler]
[nuget: TensorStack.Providers.DML]
[nuget: TensorStack.Image.Bitmap]
[nuget: TensorStack.Video.Windows]
[nuget: TensorStack.Audio.Windows]
[model: https://huggingface.co/TensorStack/Upscale-amuse/resolve/main/RealESR-General-4x/model.onnx]
async Task QuickStartAsync()
{
// 1. Create the Upscale Pipeline
var pipeline = UpscalePipeline.Create(new UpscalerConfig
{
ScaleFactor = 4,
Normalization = Normalization.ZeroToOne,
ExecutionProvider = Provider.GetProvider(),
Path = @"M:\Upscaler\RealESR-General-4x.onnx"
});
// 2. Upscale an Image
var inputImage = new ImageInput("Input.png");
var upscaledImage = await pipeline.RunAsync(new UpscaleImageOptions
{
Image = inputImage,
TileMode = TileMode.None
});
await upscaledImage.SaveAsync("Output.png");
// 3. Upscale a Video (Streaming mode)
var inputStream = await VideoInputStream.CreateAsync("Input.mp4");
var upscaledVideo = pipeline.RunAsync(new UpscaleStreamOptions
{
Stream = inputStream.GetAsync(),
TileMode = TileMode.None
});
await upscaledVideo.SaveAsync("Output.mp4");
// 4. Add audio from the source video (optional)
await AudioManager.AddAudioAsync("Output.mp4", "Input.mp4");
}
Creating an Upscale Pipeline
[nuget: TensorStack.Upscaler]
[nuget: TensorStack.Providers.DML]
// Create the pipeline
var pipeline = UpscalePipeline.Create(new UpscalerConfig
{
ScaleFactor = 4,
Normalization = Normalization.ZeroToOne,
ExecutionProvider = Provider.GetProvider(),
Path = @"M:\Models\RealESR-General-4x\model.onnx"
});
Configuration Options:
- ScaleFactor — Upscale factor (e.g., 2x, 4x)
- Normalization — Input value normalization (ZeroToOne or OneToOne)
- ExecutionProvider — Hardware provider (CPU, GPU, DirectML, etc.)
- Path — Path to the ONNX model
Upscale Image
[nuget: TensorStack.Image.Bitmap]
// Read Image
var inputImage = new ImageInput("Input.png");
// Upscale Image
var output = await pipeline.RunAsync(new UpscaleImageOptions
{
Image = inputImage,
TileMode = TileMode.None
});
// Write Image
await output.SaveAsync("Output.png");
Upscale Video (Buffered)
Buffers all frames in memory. Suitable for short-duration videos, AI-generated content, low-resolution videos, or GIFs.
[nuget: TensorStack.Video.Windows]
// Read Video
var inputVideo = await VideoInput.CreateAsync("Input.gif");
// Upscale Video
var outputVideo = await pipeline.RunAsync(new UpscaleVideoOptions
{
Video = inputVideo,
TileMode = TileMode.None
});
// Write Video
await outputVideo.SaveAsync("Output.mp4");
Upscale Video (Stream)
Processes frames one-by-one for minimal memory usage. Ideal for high-resolution or long-duration videos.
[nuget: TensorStack.Video.Windows]
// Read Stream
var inputStream = await VideoInputStream.CreateAsync("Input.mp4");
// Upscale Stream
var outputStream = pipeline.RunAsync(new UpscaleStreamOptions
{
Stream = inputStream.GetAsync(),
TileMode = TileMode.None
});
// Write Stream
await outputStream.SaveAsync("Output.mp4");
Audio Support
TensorStack.Video only processes video frames, so audio will be missing from the final result.
You can use the TensorStack.Audio package to restore audio from the source video:
[nuget: TensorStack.Audio.Windows]
await AudioManager.AddAudioAsync("TargetVideo.mp4", "SourceVideo.mp4");
Tiling Support
Tiling allows images and video frames to be processed in smaller sections (tiles) instead of all at once. This helps reduce memory usage and can improve performance when working with very large images or high-resolution videos.
The TileMode setting determines how these tiles are handled:
- None: Processes the entire image/frame in a single pass.
- Overlap: Tiles have overlapping edges to avoid visible seams.
- Blend: Overlapping tiles are blended together for smooth transitions.
- Clip: Tiles are cut without blending.
- Clip + Blend: Combines clipping and blending for high-quality results.
Additional options include:
- MaxTileSize: The maximum size of each tile in pixels. Smaller tiles reduce memory usage but may take longer to process.
- TileOverlap: The number of overlapping pixels between tiles. More overlap can prevent visible seams and improve output quality.
Adjusting these settings allows you to balance memory usage, processing speed, and visual quality for your upscaling tasks.
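As a sketch of how these settings fit together, the example below upscales a large image in blended tiles. It assumes MaxTileSize and TileOverlap are properties on UpscaleImageOptions alongside TileMode (the exact property placement may differ in your version of the library):

```csharp
// Read a large source image
var inputImage = new ImageInput("LargeInput.png");

// Upscale in tiles: blended overlaps hide seams between tiles
var output = await pipeline.RunAsync(new UpscaleImageOptions
{
    Image = inputImage,
    TileMode = TileMode.Blend, // blend overlapping tiles for smooth transitions
    MaxTileSize = 512,         // smaller tiles lower peak memory usage
    TileOverlap = 16           // overlap in pixels; more overlap reduces visible seams
});

// Write the result
await output.SaveAsync("LargeOutput.png");
```

Lowering MaxTileSize trades processing time for memory; raising TileOverlap trades a little extra computation for fewer seam artifacts.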
Upscale Models
Here is a list of some known and tested models compatible with TensorStack.Upscaler:
- wuminghao/swinir
- rocca/swin-ir-onnx
- Xenova/swin2SR-classical-sr-x2-64
- Xenova/swin2SR-classical-sr-x4-64
- Neus/GFPGANv1.4
- TensorStack/Upscale-amuse
Combining Pipelines
TensorStack supports chaining multiple pipelines together using IAsyncEnumerable streams.
This allows complex video processing workflows—such as upscaling and frame interpolation—to be executed efficiently in a single pass without storing intermediate results in memory.
In the following example, the video is upscaled by 4× and its frame rate is increased by 3×:
[nuget: TensorStack.Upscaler]
[nuget: TensorStack.Providers.DML]
[nuget: TensorStack.Video.Windows]
// Create Provider
var provider = Provider.GetProvider();
// Upscaler Config
var upscaleConfig = new UpscalerConfig
{
ScaleFactor = 4,
ExecutionProvider = provider,
Normalization = Normalization.ZeroToOne,
Path = @"M:\Models\RealESR-General-4x\model.onnx"
};
// Create Pipelines
using (var upscalePipeline = UpscalePipeline.Create(upscaleConfig))
using (var interpolationPipeline = InterpolationPipeline.Create(provider))
{
// Read Stream [512 x 512 @ 8fps]
var videoInput = await VideoInputStream.CreateAsync("Input.mp4");
var videoStream = videoInput.GetAsync();
// Upscale Stream
videoStream = upscalePipeline.RunAsync(new UpscaleStreamOptions
{
Stream = videoStream
});
// Interpolate Stream
videoStream = interpolationPipeline.RunAsync(new InterpolationStreamOptions
{
Multiplier = 3,
Stream = videoStream,
FrameRate = videoInput.FrameRate,
FrameCount = videoInput.FrameCount
});
// Save Stream [2048 x 2048 @ 24fps]
await videoStream.SaveAsync("Output.mp4");
}
Product | Compatible and computed target frameworks
---|---
.NET | net9.0 is compatible. net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed.

Dependencies (net9.0):
- TensorStack.Common (>= 0.1.17)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.