Zachnad0.UtilLib
1.3.2
See the version list below for details.
dotnet add package Zachnad0.UtilLib --version 1.3.2
NuGet\Install-Package Zachnad0.UtilLib -Version 1.3.2
<PackageReference Include="Zachnad0.UtilLib" Version="1.3.2" />
paket add Zachnad0.UtilLib --version 1.3.2
#r "nuget: Zachnad0.UtilLib, 1.3.2"
// Install Zachnad0.UtilLib as a Cake Addin
#addin nuget:?package=Zachnad0.UtilLib&version=1.3.2

// Install Zachnad0.UtilLib as a Cake Tool
#tool nuget:?package=Zachnad0.UtilLib&version=1.3.2
<h1>ZUtilLib Quick Guide</h1>

<h2>Note</h2>
<p>This library is not specifically intended to be useful to anyone besides myself, so this guide exists mostly for fun. And for testing <u style="color:green;"><s><i><b>formatting</b></i></s></u>. Do note that this library is, and will <i>always</i> be, a work in progress, though most of it will eventually be finished to a usable state (of course). Documentation is available for <i>most</i> features.</p>

<h2>Basic Utilities</h2>
<p>Directly under the <code>ZUtilLib</code> namespace are the most generic and uncategorizable methods (some of them extension methods). Basically stuff that's useful for, well, whatever they say they do.</p>

<h2>Maths Stuff</h2>
<p>Under the <code>ZUtilLib.ZMath</code> namespace is a whole bunch of classes, an interface, and methods on those classes. These all form my "<b>Object Oriented Algebraic Calculator</b>" system, which can do trivial algebraic stuff like substituting variables, or random calculus things like differentiation and integration of equations. As of writing this, integration and <i>division</i> (coming later) have not been implemented. Do note, future self and other readers, that this part of the library is currently entirely undocumented.</p>

<h2>AI (Neural Networks)</h2>
<p>So instead of using someone else's obscure AI library, I decided to make my <i>own</i> obscure AI library. Stuff specifically for training networks is not provided, but generating, initializing, deriving, mutating, and calculating with neural networks, plus their data structures, is all there. This lives under the <code>ZUtilLib.ZAI</code> namespace and will be relatively well documented once it reaches a usable state. Activation functions and other internals are included and used internally, so it should be quite easy and effective to customize and use. Below is a quick tutorial so that I can remember what to do, then wonder <i>why</i> I made it that way.</p>

<h2>Basic Neural Network Usage Tutorial</h2>
<p>All of these steps are required to use a neural network. A consolidated code sketch follows the list.</p>
<ol>
<li>Instantiate a new NN: <code>NeuralNetwork testNet = new NeuralNetwork(3, 3, 5, 2, NDNodeActivFunc.ReLU);</code>. IntelliSense will show you the parameter names and the documentation assigned to them. Calling the constructor essentially creates a new NN instance of the specified sizes.</li>
<li>Initialize the NN: <code>testNet.InitializeThis();</code>. This generates all of the nodes and links with randomized weights and biases. An optional float parameter amplifies the randomness.</li>
<li>Set up the output names: <code>testNet.SetupOutputs("alpha", "beta", "gamma");</code>. This method takes a params array of strings, but you must pass in exactly as many as there are output nodes. I made this required for some reason; it may become an optional step in the future.</li>
<li>Calculate the result by passing in the inputs: <code>(string NodeName, float Value)[] result = testNet.PerformCalculations(("in1", 0.2f), ("in2", 0.3f), ("in3", 0.4f));</code>. This method takes a params array of tuples holding both the name (no idea why I made it that way) and the value of each input node.</li>
<li>So now you've got your results and scored the neural network, or whatever you wish to do. To clone the network you'll have to instantiate a new network with identical specifications: <code>NeuralNetwork secondTestNet = new NeuralNetwork(3, 3, 5, 2, NDNodeActivFunc.ReLU);</code>.</li>
<li>Then clone it via an overload of the <code>InitializeThis</code> method: <code>secondTestNet.InitializeThis(testNet, 1, 1);</code>. The two floats following the network to be cloned determine the mutation chance and learning rate used for cloning. There's also an optional bool for whether you want the changes to be relative (off by default).</li>
</ol>
<br>
<p>Well, that seemed <i>quite inconvenient</i>, I must say. Certainly this will be updated when I get around to enhancing it, after confirming that it works well enough, and modifying it further if needed. I'd be surprised if anyone actually reads this entire thing, but anyway, g'bye and enjoy these random features.</p>
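<p>Here is a minimal consolidated sketch of the six steps above, using only the calls and the <code>ZUtilLib.ZAI</code> namespace named in this guide. The meaning of the constructor's numeric arguments (input count, output count, internal layer height, internal layer count) is my reading of the example values, not documented fact.</p>
<pre><code>
// Sketch only: assumes the ZUtilLib.ZAI namespace and the calls shown in the tutorial above.
using ZUtilLib.ZAI;

// Step 1: create a network of the desired dimensions (parameter meanings assumed from the example).
NeuralNetwork testNet = new NeuralNetwork(3, 3, 5, 2, NDNodeActivFunc.ReLU);

// Step 2: randomize all weights and biases (an optional float amplifies the randomness).
testNet.InitializeThis();

// Step 3: name the outputs, one name per output node.
testNet.SetupOutputs("alpha", "beta", "gamma");

// Step 4: feed named inputs through the network and read back the named output values.
(string NodeName, float Value)[] result =
    testNet.PerformCalculations(("in1", 0.2f), ("in2", 0.3f), ("in3", 0.4f));

// Steps 5 and 6: clone into a second network of identical dimensions, mutated with the
// given mutation chance and learning rate (optional bool makes the changes relative).
NeuralNetwork secondTestNet = new NeuralNetwork(3, 3, 5, 2, NDNodeActivFunc.ReLU);
secondTestNet.InitializeThis(testNet, 1, 1);
</code></pre>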
Product | Versions (compatible and additional computed target framework versions) |
---|---|
.NET | net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows (all computed) |
.NET Core | netcoreapp3.0, netcoreapp3.1 (computed) |
.NET Standard | netstandard2.1 (compatible) |
MonoAndroid | monoandroid (computed) |
MonoMac | monomac (computed) |
MonoTouch | monotouch (computed) |
Tizen | tizen60 (computed) |
Xamarin.iOS | xamarinios (computed) |
Xamarin.Mac | xamarinmac (computed) |
Xamarin.TVOS | xamarintvos (computed) |
Xamarin.WatchOS | xamarinwatchos (computed) |
Dependencies
.NETStandard 2.1
- System.Text.Json (>= 7.0.3)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
1.3.2:
- ADDED Greek alphabet, because why not.
1.3.1:
- ADDED Neural network training async task, for training one or more initial networks (or a new network) against a provided multi-input, multi-output target function.
- FIXED Random network generation previously produced only positive initial weights and biases; the amplitude parameter is now treated as an actual amplitude.
- FIXED Naming outputs (and inputs) is now __entirely__ optional.
- FIXED Calculation inputs are no longer limited between 1 and 0.