Tensor 0.4.11

.NET Standard 2.0
.NET CLI:
dotnet add package Tensor --version 0.4.11

Package Manager (run within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package):
NuGet\Install-Package Tensor -Version 0.4.11

PackageReference (for projects that support PackageReference, copy this XML node into the project file to reference the package):
<PackageReference Include="Tensor" Version="0.4.11" />

Paket CLI:
paket add Tensor --version 0.4.11

Script & Interactive (the #r directive can be used in F# Interactive and Polyglot Notebooks; copy this into the interactive tool or into the script source to reference the package):
#r "nuget: Tensor, 0.4.11"

Cake:
// Install Tensor as a Cake Addin
#addin nuget:?package=Tensor&version=0.4.11

// Install Tensor as a Cake Tool
#tool nuget:?package=Tensor&version=0.4.11
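
As a quick sanity check that the package reference resolves, the #r form above can be used directly from an F# script. The snippet below is a minimal sketch; HostTensor.zeros comes from the host-tensor API summarized in the description further down, and its exact signature should be verified against the installed version.

    // check.fsx -- reference the package straight from NuGet, then create a tensor
    #r "nuget: Tensor, 0.4.11"

    open Tensor

    // 2x3 tensor of zeros in host memory (shapes are given as int64 lists)
    let z = HostTensor.zeros<float> [2L; 3L]
    printfn "%A" z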

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - Intel MKL is used by default (shipped with the NuGet package)
         - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are executed directly on the GPU; cuBLAS is used for matrix operations

     Requirements:
       - Linux, macOS or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.
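
The feature list above maps onto a compact F# API. The following is a minimal usage sketch: HostTensor.init, HostTensor.ones, Tensor.sum, Tensor.sumAxis and the slicing syntax follow the library's documentation, but the exact signatures should be checked against this version before relying on them.

    open Tensor

    // create tensors in host memory (shapes are int64 lists)
    let a = HostTensor.init [3L; 4L] (fun [| i; j |] -> 4.0 * float i + float j)
    let b = HostTensor.ones<float> [3L; 4L]

    // element-wise operations
    let c = a + b
    let d = abs (a - b)

    // views and slicing (similar to NumPy)
    let row0 = a.[0L, *]            // first row as a view
    let sub  = a.[1L..2L, 1L..2L]   // sub-block

    // reductions
    let total   = Tensor.sum c      // sum over all elements
    let colSums = Tensor.sumAxis 0 c
    printfn "total = %A" total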
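For data exchange and GPU execution, the sketch below assumes the HDF5 wrapper type (HDF5.OpenWrite / HDF5.OpenRead), HostTensor.write / HostTensor.read, and CudaTensor.transfer / HostTensor.transfer roughly as described in the library documentation; treat these names and signatures as assumptions to verify against the installed package. The CUDA part requires a CUDA-capable GPU.

    open Tensor

    let k = HostTensor.init [5L; 3L] (fun [| i; j |] -> 3.0 * float i + float j)

    // write the tensor to an HDF5 file, then read it back
    do
        use hdf = HDF5.OpenWrite "tensors.h5"
        HostTensor.write hdf "k" k
    let k2 =
        use hdf = HDF5.OpenRead "tensors.h5"
        HostTensor.read<float> hdf "k"

    // transfer to a CUDA GPU, compute there, transfer the result back to host memory
    let kGpu  = CudaTensor.transfer k      // host -> GPU
    let rGpu  = kGpu + kGpu                // element-wise op runs on the GPU
    let rHost = HostTensor.transfer rGpu   // GPU -> host
    printfn "%A" rHost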

Target framework compatibility ("compatible" = declared by the package, "computed" = additional computed target framework):
.NET: net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows (all computed)
.NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (all computed)
.NET Standard: netstandard2.0 (compatible), netstandard2.1 (computed)
.NET Framework: net461, net462, net463, net47, net471, net472, net48, net481 (all computed)
MonoAndroid: monoandroid (computed)
MonoMac: monomac (computed)
MonoTouch: monotouch (computed)
Tizen: tizen40, tizen60 (computed)
Xamarin.iOS: xamarinios (computed)
Xamarin.Mac: xamarinmac (computed)
Xamarin.TVOS: xamarintvos (computed)
Xamarin.WatchOS: xamarinwatchos (computed)

NuGet packages (3)

Showing the top 3 NuGet packages that depend on Tensor:

DeepNet

Deep learning library for F#. Provides symbolic model differentiation, automatic differentiation and compilation to CUDA GPUs. Includes optimizers and model blocks used in deep learning. Make sure to set the platform of your project to x64.

RPlotTools

Tools for plotting using R from F#.

Tensor.Algorithm

Data types:
  - arbitrary precision rational numbers
Matrix algebra (integer, rational):
  - row echelon form
  - Smith normal form
  - kernel, cokernel and (pseudo-)inverse
Matrix decomposition (floating point):
  - principal component analysis (PCA)
  - ZCA whitening
Misc:
  - Bezout's identity
  - loading of NumPy's .npy and .npz files

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last updated
0.4.11 6,225 5/8/2018
0.4.11-v0.4.11-215 607 5/8/2018
0.4.11-symtensor-core-242 969 11/15/2018
0.4.11-symtensor-core-241 964 11/15/2018
0.4.11-symtensor-core-240 965 11/15/2018
0.4.11-symtensor-core-239 928 11/15/2018
0.4.11-symtensor-core-238 950 11/15/2018
0.4.11-symtensor-core-237 1,030 11/15/2018
0.4.11-symtensor-core-236 900 11/14/2018
0.4.11-symtensor-core-235 942 11/14/2018
0.4.11-symtensor-core-234 921 11/14/2018
0.4.11-symtensor-core-231 1,022 11/9/2018
0.4.11-symtensor-core-230 973 11/9/2018
0.4.11-symtensor-core-229 955 11/8/2018
0.4.11-symtensor-core-228 985 11/8/2018
0.4.11-symtensor-core-227 999 10/30/2018
0.4.11-symtensor-core-226 1,059 10/30/2018
0.4.11-symtensor-core-225 968 10/30/2018
0.4.11-develop-216 1,163 5/8/2018
0.4.10-develop-213 1,200 5/8/2018
0.4.10-develop-212 1,128 5/7/2018
0.4.10-develop-211 1,209 5/7/2018
0.3.0.712-master 915 9/1/2017
0.3.0.711-master 920 9/1/2017
0.3.0.710-master 881 9/1/2017
0.3.0.709-master 897 8/31/2017
0.3.0.708-master 911 8/30/2017
0.3.0.707-master 884 8/30/2017
0.3.0.706-master 891 8/30/2017
0.3.0.701-master 950 6/26/2017
0.3.0.700-master 941 6/22/2017
0.3.0.699-master 910 6/22/2017
0.3.0.698-master 907 6/21/2017
0.3.0.697-master 914 6/21/2017
0.3.0.696-master 983 6/21/2017
0.3.0.695-master 940 6/21/2017
0.3.0.694-master 901 6/21/2017
0.3.0.693-master 928 6/20/2017
0.3.0.692-master 912 6/19/2017
0.3.0.691-master 946 6/19/2017
0.3.0.690-master 942 6/19/2017
0.3.0.689-master 922 5/14/2017
0.3.0.688 7,063 5/14/2017
0.3.0.686-master 918 5/14/2017
0.2.0.591-master 896 4/19/2017
0.2.0.565-master 892 4/11/2017
0.2.0.556-master 885 3/21/2017
0.2.0.551-master 950 3/17/2017
0.2.0.540-master 869 3/15/2017
0.2.0.536-master 874 3/14/2017
0.2.0.519-master 910 3/2/2017
0.2.0.516-master 890 3/2/2017
0.2.0.499-master 922 2/13/2017
0.2.0.494-master 894 2/7/2017
0.2.0.479-master 910 2/1/2017
0.2.0.463-master 910 1/17/2017
0.2.0.431-master 980 12/2/2016
0.2.0.422-master 1,269 11/9/2016
0.2.0.421-master 1,210 11/9/2016
0.2.0.411-master 954 10/26/2016
0.2.0.400-master 899 10/26/2016
0.2.0.394-master 905 10/25/2016
0.2.0.382-master 904 10/21/2016
0.2.0.377-master 904 10/20/2016
0.2.0.323-master 904 10/11/2016
0.2.0.262-master 937 9/29/2016
0.2.0.248-master 922 9/27/2016
0.2.0.174-master 917 9/16/2016
0.2.0.128-master 932 9/8/2016
0.2.0.122-master 929 9/8/2016
0.2.0.121-master 898 9/7/2016
0.2.0.111-master 895 9/7/2016
0.2.0.105-ci 938 9/5/2016
0.2.0.97-ci 965 8/30/2016
0.2.0.96-ci 909 8/29/2016
0.2.0.90-ci 927 8/25/2016
0.2.0.89-ci 884 8/24/2016
0.2.0.88-ci 923 8/24/2016
0.2.0.87-ci 910 8/24/2016
0.2.0.86-ci 918 8/23/2016
0.2.0.85-ci 925 8/22/2016
0.2.0.84-ci 936 8/22/2016
0.2.0.83-ci 948 8/22/2016
0.2.0.82 2,130 8/22/2016
0.2.0.81-ci 934 8/19/2016
0.2.0.80-ci 937 6/27/2016
0.2.0.79-ci 941 6/27/2016
0.2.0.77-ci 937 6/22/2016
0.2.0.76-ci 955 6/22/2016
0.2.0.75 1,604 6/15/2016
0.2.0.74-ci 1,289 6/15/2016
0.2.0.73 1,843 6/15/2016
0.2.0.72 1,835 6/15/2016
0.2.0.71 1,828 6/14/2016
0.2.0.70 1,718 6/9/2016
0.2.0.69 1,662 6/9/2016
0.2.0.68 1,510 6/9/2016
0.2.0.67 1,996 6/8/2016
0.2.0.66-ci 945 6/8/2016
0.2.0.65-ci 925 6/8/2016
0.2.0.64-ci 992 6/8/2016
0.2.0.63-ci 918 6/7/2016
0.2.0.62 1,492 6/7/2016
0.2.0.61 1,461 6/6/2016
0.2.0.60 1,481 6/6/2016
0.2.0.59 1,411 6/6/2016
0.2.0.57 1,498 6/3/2016
0.2.0.56 1,465 6/3/2016
0.2.0.55 1,548 6/3/2016
0.2.0.54 1,500 6/3/2016
0.2.0.53 1,819 6/3/2016
0.2.0.52-ci 901 6/2/2016
0.2.0.51-ci 943 6/2/2016
0.2.0.50-ci 932 6/2/2016
0.2.0.49 1,850 5/31/2016
0.2.0.48-ci 994 5/31/2016
0.2.0.46-ci 963 5/31/2016
0.2.0.45 1,696 5/31/2016
0.2.0.44 1,700 5/31/2016
0.2.0.43 1,653 5/31/2016
0.2.0.42 1,685 5/30/2016
0.2.0.41 1,673 5/30/2016
0.2.0.40 1,685 5/30/2016
0.2.0.39 1,725 5/30/2016
0.2.0.38 1,718 5/30/2016
0.2.0.37 1,645 5/30/2016
0.2.0.36 1,700 5/25/2016
0.2.0.35 1,698 5/24/2016
0.2.0.34 1,719 5/24/2016
0.2.0.33 2,515 5/24/2016
0.2.0.32-ci 918 5/24/2016
0.1.26-ci 954 5/24/2016
0.1.24-ci 945 5/24/2016
0.1.19-ci 921 5/24/2016