The AForge.Neuro library contains classes for artificial neural network computation: feed-forward networks with error back-propagation learning and Kohonen self-organizing maps. The full list of features is available on the project's website.
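As a rough illustration of the back-propagation workflow, here is a minimal C# sketch using AForge.Neuro's `ActivationNetwork` and `BackPropagationLearning` types on the XOR problem; the layer sizes, learning rate, and epoch count are arbitrary choices for the example.

```csharp
using System;
using AForge.Neuro;
using AForge.Neuro.Learning;

class XorExample
{
    static void Main()
    {
        // XOR training data: two inputs, one output per sample.
        double[][] inputs =
        {
            new double[] { 0, 0 },
            new double[] { 0, 1 },
            new double[] { 1, 0 },
            new double[] { 1, 1 }
        };
        double[][] outputs =
        {
            new double[] { 0 },
            new double[] { 1 },
            new double[] { 1 },
            new double[] { 0 }
        };

        // Two inputs, a hidden layer of two sigmoid neurons, one output neuron.
        var network = new ActivationNetwork(new SigmoidFunction(2), 2, 2, 1);
        var teacher = new BackPropagationLearning(network) { LearningRate = 0.1 };

        // Run epochs until the summed squared error is small enough.
        for (int epoch = 0; epoch < 10000; epoch++)
        {
            double error = teacher.RunEpoch(inputs, outputs);
            if (error < 0.01) break;
        }

        // Ideally close to 1 after training.
        Console.WriteLine(network.Compute(new double[] { 1, 0 })[0]);
    }
}
```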
Contains neural learning algorithms such as Levenberg-Marquardt and Parallel Resilient Backpropagation, initialization procedures such as Nguyen-Widrow, and other neural-network-related methods. This package is part of the Accord.NET Framework.
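To show how these pieces fit together, here is a hedged C# sketch combining Accord.NET's `NguyenWidrow` initializer with `LevenbergMarquardtLearning` on a tiny XOR problem; the network shape, activation steepness, and stopping criterion are placeholder choices, not recommendations.

```csharp
using System;
using Accord.Neuro;
using Accord.Neuro.Learning;

class AccordXor
{
    static void Main()
    {
        // XOR encoded with bipolar values (-1 / +1).
        double[][] inputs =
        {
            new double[] { -1, -1 },
            new double[] { -1,  1 },
            new double[] {  1, -1 },
            new double[] {  1,  1 }
        };
        double[][] outputs =
        {
            new double[] { -1 },
            new double[] {  1 },
            new double[] {  1 },
            new double[] { -1 }
        };

        // Two inputs, two hidden bipolar-sigmoid neurons, one output neuron.
        var network = new ActivationNetwork(new BipolarSigmoidFunction(2), 2, 2, 1);

        // Nguyen-Widrow weight initialization.
        new NguyenWidrow(network).Randomize();

        // Levenberg-Marquardt second-order learning.
        var teacher = new LevenbergMarquardtLearning(network);

        double error = double.PositiveInfinity;
        for (int i = 0; i < 100 && error > 1e-5; i++)
            error = teacher.RunEpoch(inputs, outputs);

        // Should approach 1 for the input (1, -1).
        Console.WriteLine(network.Compute(new double[] { 1, -1 })[0]);
    }
}
```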
Catalyst is a Natural Language Processing library built from scratch for speed. Inspired by spaCy's design, it brings pre-trained models, out-of-the-box support for training word and document embeddings, and flexible entity recognition models. You can install language-specific models with the model...
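The sketch below follows the quick-start pattern from the project's README: register a pre-trained language model, point Catalyst at a local model cache, and run a document through the pipeline. Exact namespaces and the storage path are assumptions and may need adjusting for your Catalyst version.

```csharp
using System;
using System.Threading.Tasks;
using Catalyst;
using Mosaik.Core;

class CatalystDemo
{
    static async Task Main()
    {
        // Register the pre-trained English model (shipped in the Catalyst.Models.English package)
        // and tell Catalyst where to cache model data on disk.
        Catalyst.Models.English.Register();
        Storage.Current = new DiskStorage("catalyst-models");

        // Build an English pipeline (tokenizer, tagger, etc.) and process one document.
        var nlp = await Pipeline.ForAsync(Language.English);
        var doc = new Document("The quick brown fox jumps over the lazy dog", Language.English);
        nlp.ProcessSingle(doc);

        // Inspect the annotated document.
        Console.WriteLine(doc.ToJson());
    }
}
```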
Parallel Neural Network Classes for .NET. Backpropagation class for easy training and testing.
Code Project link:
Hype is a proof-of-concept deep learning library where you can perform optimization on compositional machine learning systems of many components, even when such components themselves internally perform optimization. This is enabled by nested automatic differentiation (AD), giving you access to the...
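Hype itself is written in F# on top of DiffSharp, so the following is not its API; it is a hypothetical, language-agnostic C# sketch of the underlying idea of differentiating through a component that internally optimizes. A hand-rolled dual-number type propagates the derivative of a learning rate through an inner gradient-descent loop (whose gradient is written in closed form here, a simplification compared to Hype's general nested AD) and out to a validation loss.

```csharp
using System;

// Minimal forward-mode dual number: value plus derivative w.r.t. one chosen input.
readonly struct Dual
{
    public readonly double V, D;
    public Dual(double v, double d) { V = v; D = d; }
    public static Dual operator +(Dual a, Dual b) => new Dual(a.V + b.V, a.D + b.D);
    public static Dual operator -(Dual a, Dual b) => new Dual(a.V - b.V, a.D - b.D);
    public static Dual operator *(Dual a, Dual b) => new Dual(a.V * b.V, a.V * b.D + a.D * b.V);
    public static Dual operator *(double a, Dual b) => new Dual(a * b.V, a * b.D);
}

class NestedOptimizationDemo
{
    static void Main()
    {
        // Outer question: how does the validation loss after inner training
        // change with the learning rate eta? Seed eta with derivative 1.
        var eta = new Dual(0.1, 1.0);

        // Inner optimization: 20 steps of gradient descent on the training loss
        // L(w) = (w - 3)^2, whose gradient 2*(w - 3) is written out explicitly
        // so the dual parts flow through every weight update.
        var w = new Dual(0.0, 0.0);
        for (int step = 0; step < 20; step++)
        {
            Dual grad = 2.0 * (w - new Dual(3.0, 0.0));
            w = w - eta * grad;
        }

        // Validation loss V(w) = (w - 2)^2, evaluated on the trained weight.
        Dual diff = w - new Dual(2.0, 0.0);
        Dual valLoss = diff * diff;

        Console.WriteLine($"validation loss = {valLoss.V:F4}, d(loss)/d(eta) = {valLoss.D:F4}");
    }
}
```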
This package contains the default English models for Catalyst. Catalyst is a Natural Language Processing library built from scratch for speed. Inspired by spaCy's design, it brings pre-trained models, out-of-the-box support for training word and document embeddings, and flexible entity recognition models.
This package lets you share any Umbraco document type on LinkedIn, Twitter, and Facebook with one click. From now on, you will never forget to share your blog post, news item, or any other content on social networks. Please read the quick-start tutorial carefully to begin.
DyNet is a neural network library developed by Carnegie Mellon University and many others. It is written in C++ (with bindings in Python and C#) and is designed to be efficient when run on either CPU or GPU, and to work well with networks that have dynamic structures that change for every training instance.