DotTorch.Optimizers 9.0.0

dotnet add package DotTorch.Optimizers --version 9.0.0
                    
NuGet\Install-Package DotTorch.Optimizers -Version 9.0.0
                    
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="DotTorch.Optimizers" Version="9.0.0" />
                    
For projects that support PackageReference, copy this XML node into the project file to reference the package.
Directory.Packages.props
<PackageVersion Include="DotTorch.Optimizers" Version="9.0.0" />

Project file
<PackageReference Include="DotTorch.Optimizers" />

For projects that support Central Package Management (CPM), copy the <PackageVersion> node into the solution Directory.Packages.props file to version the package, and the unversioned <PackageReference> node into the project file to reference it.
paket add DotTorch.Optimizers --version 9.0.0
                    
#r "nuget: DotTorch.Optimizers, 9.0.0"
                    
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
#:package DotTorch.Optimizers@9.0.0
                    
#:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.
Install as a Cake Addin
#addin nuget:?package=DotTorch.Optimizers&version=9.0.0

Install as a Cake Tool
#tool nuget:?package=DotTorch.Optimizers&version=9.0.0

DotTorch.Optimizers: Optimizers for Training Neural Networks in .NET

Contents


Russian

DotTorch.Optimizers is a library of optimizers for training neural networks on the .NET platform. The package extends DotTorch.Core by providing implementations of classic and modern gradient-based optimization methods with full support for the computation graph and autograd.

Key features:

  • Implemented optimizers: SGD, RMSprop, Adam, AdamW.
  • Compatibility with DotTorch.Core: works correctly with .Grad, .Backward(), and Parameters().
  • Support for learning rate, weight decay, and parameter grouping.
  • Complete API: Step(), ZeroGrad(), AddParameterGroup(), etc.
  • Easy integration into training loops.
  • High performance and reliability.
  • Full unit-test coverage.

DotTorch.Optimizers was created for flexible and efficient control of neural network training in the .NET ecosystem.


English

DotTorch.Optimizers is a high-performance optimizer library for training neural networks on the .NET platform. It extends DotTorch.Core by offering standard and advanced gradient-based optimizers with full autograd compatibility.

Key features:

  • Implements SGD, RMSprop, Adam, and AdamW.
  • Seamless integration with DotTorch.Core: compatible with .Grad, .Backward(), Parameters().
  • Supports learning rate, weight decay, parameter grouping.
  • Clear and flexible API: Step(), ZeroGrad(), AddParameterGroup(), etc.
  • Designed for efficient training loop integration.
  • Robust and extensible architecture.
  • Fully tested and production-ready.

DotTorch.Optimizers is built for precise and fast optimization in modern .NET-based deep learning systems.
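
A minimal training-loop sketch in C# of how the members named above might be used. This is illustrative only: the namespaces, constructor signatures, and the model, data, and loss helpers are assumptions, not the package's documented API; only the member names Step(), ZeroGrad(), Parameters(), Backward(), and Grad come from the description above.

using System.Collections.Generic;
using DotTorch.Core;        // assumed namespace of the tensor/module types
using DotTorch.Optimizers;  // assumed namespace of the optimizer types

// Hypothetical model and data source; any DotTorch.Core module exposing Parameters() would do.
var model = new Linear(inputSize: 784, outputSize: 10);              // hypothetical module
IEnumerable<(Tensor x, Tensor y)> batches = LoadTrainingBatches();   // hypothetical data loader

// Constructor shape and option names are assumed.
var optimizer = new Adam(model.Parameters(), learningRate: 1e-3, weightDecay: 1e-4);

for (int epoch = 0; epoch < 10; epoch++)
{
    foreach (var (x, y) in batches)
    {
        optimizer.ZeroGrad();                                 // clear accumulated gradients
        var loss = Loss.CrossEntropy(model.Forward(x), y);    // hypothetical forward pass and loss helper
        loss.Backward();                                      // autograd fills .Grad on each parameter
        optimizer.Step();                                     // apply the Adam update
    }
}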


German

DotTorch.Optimizers is a high-performance optimization library for training neural networks on .NET. It extends DotTorch.Core with classic and modern gradient-based optimizers and full autograd support.

Main features:

  • Implementation of SGD, RMSprop, Adam, AdamW.
  • Seamless integration with DotTorch.Core: .Grad, .Backward(), Parameters().
  • Support for learning rate, weight decay, and parameter grouping.
  • Clear API: Step(), ZeroGrad(), AddParameterGroup(), etc.
  • Designed for efficient training loops.
  • Reliable and extensible.
  • Fully tested and production-ready.

DotTorch.Optimizers was developed to enable optimized training of neural networks with .NET.


Chinese

DotTorch.Optimizers is a high-performance optimizer library for training neural networks on the .NET platform. It extends DotTorch.Core, providing standard and advanced gradient-based optimizers with full compatibility with the automatic differentiation mechanism.

Main features:

  • Implemented optimizers: SGD, RMSprop, Adam, AdamW.
  • Full integration with DotTorch.Core: supports .Grad, .Backward(), Parameters().
  • Support for learning rate, weight decay, and parameter grouping.
  • Clear API: Step(), ZeroGrad(), AddParameterGroup(), etc.
  • Easy to integrate into training loops.
  • High performance and an extensible architecture.
  • Thorough test coverage, suitable for production use.

DotTorch.Optimizers aims to deliver efficient and stable deep learning optimization on the .NET platform.

Compatible and additional computed target framework versions:

  • Compatible: net8.0, net9.0
  • Computed: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows

Learn more about Target Frameworks and .NET Standard.
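
As a minimal illustration of the compatibility list above, a project targeting one of the compatible frameworks can reference the package with the same PackageReference node shown earlier. The SDK and the choice of net8.0 here are only an example.

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <!-- net8.0 and net9.0 are the compatible targets listed above -->
    <TargetFramework>net8.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="DotTorch.Optimizers" Version="9.0.0" />
  </ItemGroup>
</Project>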

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

EN:
• DotTorch.Optimizers release.
• Implemented core optimizers: SGD, SGD with Momentum, Adam.
• Support for weight decay, learning rate scheduling, and parameter grouping.
• Integrated with DotTorch.Core autograd for full gradient compatibility.
• Designed for .NET 8/9 and compatible with future extensions.
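
A sketch of per-group hyperparameters using the parameter-grouping feature named above. The constructor and option names shown are assumptions; only AddParameterGroup(), Parameters(), and Step(), and the SGD-with-Momentum and weight decay features, come from the notes and description.

using DotTorch.Core;        // assumed namespace
using DotTorch.Optimizers;  // assumed namespace

// Hypothetical two-part model: apply weight decay to the body but not to the head.
var body = new Linear(inputSize: 784, outputSize: 128);   // hypothetical module
var head = new Linear(inputSize: 128, outputSize: 10);    // hypothetical module

// Base group (option names are assumed): SGD with Momentum and weight decay.
var optimizer = new SGD(body.Parameters(), learningRate: 0.01, momentum: 0.9, weightDecay: 1e-4);

// Additional group with its own hyperparameters: lower learning rate, no weight decay.
optimizer.AddParameterGroup(head.Parameters(), learningRate: 0.001, weightDecay: 0.0);

// Each call to optimizer.Step() then updates every parameter with the
// hyperparameters of the group it belongs to.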

RU:
• DotTorch.Optimizers release.
• Implemented base optimizers: SGD, SGD with Momentum, Adam.
• Support for weight decay, lr scheduling, and parameter grouping.
• Full integration with DotTorch.Core for automatic differentiation.
• Support for .NET 8/9 and extensibility for future versions.

DE:
• DotTorch.Optimizers release.
• Core optimizers implemented: SGD, SGD with Momentum, Adam.
• Support for weight decay, learning rate scheduling, and parameter groups.
• Fully integrated with the DotTorch.Core autograd system.
• Built for .NET 8/9, with future-proofing in mind.

CN:
• DotTorch.Optimizers release.
• Implemented core optimizers: SGD, SGD (with Momentum), Adam.
• Support for weight decay, learning rate scheduling, and parameter grouping.
• Fully integrated with the DotTorch.Core automatic differentiation system.
• Designed for .NET 8/9 with support for future extensions.