Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
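
     A minimal F# sketch of these core features (creation, element-wise
     operations, reductions, slicing) is shown below. The names used
     (HostTensor.init, Tensor.sum, Tensor.sumAxis, slicing with .[0L, *])
     follow the package documentation, but exact signatures should be treated
     as assumptions rather than a definitive reference.

        open Tensor

        // create a 3x3 tensor in host memory, initialized from its index
        let a = HostTensor.init [3L; 3L] (fun [| i; j |] -> float (i * 3L + j))

        // element-wise operations; scalars are broadcast over the tensor
        let b = sqrt (a + 10.0)

        // reductions: sum of all elements and sum along an axis
        let total   = Tensor.sum b
        let rowSums = Tensor.sumAxis 1 b

        // slicing returns a view (no copy) of the first row
        let row0 = a.[0L, *]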

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
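
     The snippet below sketches the data-exchange features: building a tensor
     from a standard F# list and writing/reading a tensor in an HDF5 file.
     HDF5.OpenWrite, HDF5.OpenRead, HostTensor.write and HostTensor.read are
     taken from the package documentation; treat them as assumptions here.

        open Tensor

        // build a tensor from a standard F# list
        let v = HostTensor.ofList [1.0; 2.0; 3.0; 4.0]

        // a small matrix to store in an HDF5 (.h5) file
        let k = HostTensor.init [5L; 3L] (fun [| i; j |] -> 3.0 * float i + float j)

        let save () =
            use hdf = HDF5.OpenWrite "tensors.h5"
            HostTensor.write hdf "k" k

        let load () =
            use hdf = HDF5.OpenRead "tensors.h5"
            HostTensor.read<float> hdf "k"

        save ()
        let k2 = load ()
        printfn "read back:\n%A" k2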

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - Intel MKL is used by default (shipped with the NuGet package)
         - other BLAS libraries (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations
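
     A hedged sketch of GPU usage follows. CudaTensor.transfer and
     HostTensor.transfer (for moving tensors between host and device) and the
     .* operator (for the cuBLAS-backed matrix product) follow the package
     documentation but are assumptions here; a CUDA-capable GPU is required.

        open Tensor

        // create a matrix in host memory
        let h = HostTensor.init [512L; 512L] (fun [| i; j |] -> float (i + j))

        // transfer it to the CUDA GPU; subsequent operations run on the device
        let d = CudaTensor.transfer h

        // matrix product executed on the GPU via cuBLAS
        let p = d .* d

        // copy the result back into host memory for inspection
        let pHost = HostTensor.transfer p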

     Requirements:
       - Linux, macOS, or Windows on x64
       - on Linux, libgomp.so.1 must be installed

     Additional algorithms are provided in the Tensor.Algorithm package.

Installation:

  Package Manager:   Install-Package Tensor -Version 0.4.11
  .NET CLI:          dotnet add package Tensor --version 0.4.11
  PackageReference:  <PackageReference Include="Tensor" Version="0.4.11" />
                     (for projects that support PackageReference, copy this XML node into the project file)
  Paket CLI:         paket add Tensor --version 0.4.11

NuGet packages (3)

Showing the top 3 NuGet packages that depend on Tensor:

DeepNet
  Deep learning library for F#. Provides symbolic model differentiation, automatic differentiation and compilation to CUDA GPUs. Includes optimizers and model blocks used in deep learning. Make sure to set the platform of your project to x64.

RPlotTools
  Tools for plotting using R from F#.

Tensor.Algorithm
  Data types:
    - arbitrary precision rational numbers
  Matrix algebra (integer, rational):
    - row echelon form
    - Smith normal form
    - kernel, cokernel and (pseudo-)inverse
  Matrix decomposition (floating point):
    - principal component analysis (PCA)
    - ZCA whitening
  Misc:
    - Bezout's identity
    - loading of NumPy's .npy and .npz files

GitHub repositories

This package is not used by any popular GitHub repositories.

Version History

Version Downloads Last updated
0.4.11 1,591 5/8/2018
0.4.11-v0.4.11-215 263 5/8/2018
0.4.11-symtensor-core-242 292 11/15/2018
0.4.11-symtensor-core-241 261 11/15/2018
0.4.11-symtensor-core-240 269 11/15/2018
0.4.11-symtensor-core-239 261 11/15/2018
0.4.11-symtensor-core-238 279 11/15/2018
0.4.11-symtensor-core-237 297 11/15/2018
0.4.11-symtensor-core-236 245 11/14/2018
0.4.11-symtensor-core-235 262 11/14/2018
0.4.11-symtensor-core-234 255 11/14/2018
0.4.11-symtensor-core-231 280 11/9/2018
0.4.11-symtensor-core-230 287 11/9/2018
0.4.11-symtensor-core-229 259 11/8/2018
0.4.11-symtensor-core-228 249 11/8/2018
0.4.11-symtensor-core-227 281 10/30/2018
0.4.11-symtensor-core-226 286 10/30/2018
0.4.11-symtensor-core-225 250 10/30/2018
0.4.11-develop-216 392 5/8/2018
0.4.10-develop-213 402 5/8/2018
0.4.10-develop-212 380 5/7/2018
0.4.10-develop-211 388 5/7/2018
0.3.0.712-master 373 9/1/2017
0.3.0.711-master 375 9/1/2017
0.3.0.710-master 360 9/1/2017
0.3.0.709-master 343 8/31/2017
0.3.0.708-master 364 8/30/2017
0.3.0.707-master 386 8/30/2017
0.3.0.706-master 365 8/30/2017
0.3.0.701-master 392 6/26/2017
0.3.0.700-master 418 6/22/2017
0.3.0.699-master 390 6/22/2017
0.3.0.698-master 388 6/21/2017
0.3.0.697-master 387 6/21/2017
0.3.0.696-master 416 6/21/2017
0.3.0.695-master 386 6/21/2017
0.3.0.694-master 381 6/21/2017
0.3.0.693-master 391 6/20/2017
0.3.0.692-master 379 6/19/2017
0.3.0.691-master 407 6/19/2017
0.3.0.690-master 390 6/19/2017
0.3.0.689-master 393 5/14/2017
0.3.0.688 1,967 5/14/2017
0.3.0.686-master 392 5/14/2017
0.2.0.591-master 397 4/19/2017
0.2.0.565-master 409 4/11/2017
0.2.0.556-master 401 3/21/2017
0.2.0.551-master 449 3/17/2017
0.2.0.540-master 384 3/15/2017
0.2.0.536-master 382 3/14/2017
0.2.0.519-master 407 3/2/2017
0.2.0.516-master 384 3/2/2017
0.2.0.499-master 408 2/13/2017
0.2.0.494-master 392 2/7/2017
0.2.0.479-master 412 2/1/2017
0.2.0.463-master 409 1/17/2017
0.2.0.431-master 483 12/2/2016
0.2.0.422-master 428 11/9/2016
0.2.0.421-master 409 11/9/2016
0.2.0.411-master 461 10/26/2016
0.2.0.400-master 410 10/26/2016
0.2.0.394-master 432 10/25/2016
0.2.0.382-master 415 10/21/2016
0.2.0.377-master 406 10/20/2016
0.2.0.323-master 409 10/11/2016
0.2.0.262-master 423 9/29/2016
0.2.0.248-master 428 9/27/2016
0.2.0.174-master 423 9/16/2016
0.2.0.128-master 427 9/8/2016
0.2.0.122-master 439 9/8/2016
0.2.0.121-master 421 9/7/2016
0.2.0.111-master 413 9/7/2016
0.2.0.105-ci 464 9/5/2016
0.2.0.97-ci 452 8/30/2016
0.2.0.96-ci 432 8/29/2016
0.2.0.90-ci 422 8/25/2016
0.2.0.89-ci 411 8/24/2016
0.2.0.88-ci 415 8/24/2016
0.2.0.87-ci 431 8/24/2016
0.2.0.86-ci 417 8/23/2016
0.2.0.85-ci 420 8/22/2016
0.2.0.84-ci 432 8/22/2016
0.2.0.83-ci 435 8/22/2016
0.2.0.82 722 8/22/2016
0.2.0.81-ci 448 8/19/2016
0.2.0.80-ci 443 6/27/2016
0.2.0.79-ci 440 6/27/2016
0.2.0.77-ci 445 6/22/2016
0.2.0.76-ci 446 6/22/2016
0.2.0.75 532 6/15/2016
0.2.0.74-ci 437 6/15/2016
0.2.0.73 495 6/15/2016
0.2.0.72 510 6/15/2016
0.2.0.71 543 6/14/2016
0.2.0.70 492 6/9/2016
0.2.0.69 459 6/9/2016
0.2.0.68 492 6/9/2016
0.2.0.67 620 6/8/2016
0.2.0.66-ci 439 6/8/2016
0.2.0.65-ci 431 6/8/2016
0.2.0.64-ci 467 6/8/2016
0.2.0.63-ci 421 6/7/2016
0.2.0.62 497 6/7/2016
0.2.0.61 481 6/6/2016
0.2.0.60 476 6/6/2016
0.2.0.59 468 6/6/2016
0.2.0.57 499 6/3/2016
0.2.0.56 483 6/3/2016
0.2.0.55 529 6/3/2016
0.2.0.54 499 6/3/2016
0.2.0.53 570 6/3/2016
0.2.0.52-ci 426 6/2/2016
0.2.0.51-ci 444 6/2/2016
0.2.0.50-ci 440 6/2/2016
0.2.0.49 589 5/31/2016
0.2.0.48-ci 453 5/31/2016
0.2.0.46-ci 443 5/31/2016
0.2.0.45 515 5/31/2016
0.2.0.44 526 5/31/2016
0.2.0.43 542 5/31/2016
0.2.0.42 528 5/30/2016
0.2.0.41 541 5/30/2016
0.2.0.40 549 5/30/2016
0.2.0.39 539 5/30/2016
0.2.0.38 527 5/30/2016
0.2.0.37 520 5/30/2016
0.2.0.36 516 5/25/2016
0.2.0.35 533 5/24/2016
0.2.0.34 552 5/24/2016
0.2.0.33 768 5/24/2016
0.2.0.32-ci 438 5/24/2016
0.1.26-ci 454 5/24/2016
0.1.24-ci 445 5/24/2016
0.1.19-ci 428 5/24/2016