Llamas 0.1.1

dotnet add package Llamas --version 0.1.1
NuGet\Install-Package Llamas -Version 0.1.1
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Llamas" Version="0.1.1" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add Llamas --version 0.1.1
#r "nuget: Llamas, 0.1.1"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or into the source code of a script to reference the package.
// Install Llamas as a Cake Addin
#addin nuget:?package=Llamas&version=0.1.1

// Install Llamas as a Cake Tool
#tool nuget:?package=Llamas&version=0.1.1

Llamas



Llamas is a .NET client library for Ollama, enabling .NET developers to interact with and leverage large language models. Using the Llamas.Container package, developers can also host pre-configured instances of Ollama in Docker from their own .NET code, either directly or through the simple dependency injection patterns they are accustomed to, with no configuration knowledge needed.

Llamas is a handwritten client library focused on ergonomics and performance, taking full advantage of IAsyncEnumerable and ndjson (newline-delimited JSON) to handle and propagate live-streaming data. The client covers the functionality exposed by the Ollama API, and therefore requires an instance of Ollama that is accessible over the network, or one hosted using the Llamas.Container package.

Usage

The IOllamaClient interface describes the functionality of the Ollama client, such as listing locally installed models, pulling new models, generating chat completions, generating embeddings, pushing models, and retrieving model details. IOllamaBlobClient defines the blob functionality, including checking for the existence of a data blob and creating one.

Examples of client use can be found both in the examples folder and in the integration test suite.
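As an illustrative sketch of the streaming style described above: the configuration and client type names below come from this README, but ChatAsync, ChatRequest, ChatMessage, and their members are hypothetical placeholders, not the library's confirmed API — consult the examples folder for the real signatures.

// Hypothetical sketch: OllamaClientConfiguration / IOllamaClient are from the README;
// ChatAsync, ChatRequest and ChatMessage are illustrative placeholders.
var config = new OllamaClientConfiguration();
IOllamaClient client = new OllamaClient(config, new HttpClient());

await foreach (var chunk in client.ChatAsync(new ChatRequest(
    Model: "llama3",
    Messages: [new ChatMessage("user", "Why is the sky blue?")])))
{
    // Each chunk is yielded as soon as Ollama emits an ndjson line,
    // so output can be rendered while the model is still generating
    Console.Write(chunk.Message?.Content);
}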

Dependency Injection

Llamas comes with several ways to set up a client using the .NET hosting abstractions.

One can inject a client configuration and the client explicitly, or use one of the helper extension methods on IServiceCollection.

services.AddHttpClient(); // IHttpClientFactory and HttpClient can both be injected. Otherwise, a new HttpClient will be created

#region Manual Addition

// Add the services manually
var clientConfig = new OllamaClientConfiguration();
services.AddSingleton(clientConfig);
services.AddSingleton<IOllamaClient, OllamaClient>();

#endregion

#region From Configuration

// Automatically inject the configuration and a client
services.AddOllamaClient(new OllamaClientConfiguration());

#endregion

#region With Configuration Builder

// Use the lambda parameter to change the default configuration values
services.AddOllamaClient(config => config with { Port = 8082 });

#endregion
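Once registered, the client can be resolved through ordinary constructor injection. The hosting pattern below is standard .NET; only the call on IOllamaClient would come from Llamas, and ListLocalModelsAsync is an illustrative placeholder for the client's model-listing method, not a confirmed name.

// Consuming the registered client via constructor injection (standard .NET DI).
// ListLocalModelsAsync is a hypothetical placeholder for the listing call.
public sealed class ModelLister(IOllamaClient client)
{
    public async Task PrintModelsAsync()
    {
        foreach (var model in await client.ListLocalModelsAsync())
            Console.WriteLine(model.Name);
    }
}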
Compatible and additional computed target framework versions

.NET: net8.0 is compatible (included in package). net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, and net8.0-windows were computed.

Learn more about Target Frameworks and .NET Standard.

NuGet packages (1)

Showing the top NuGet package that depends on Llamas:

Llamas.Container
Host Ollama Docker containers from the comfort of .NET, with client support from Llamas.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version  Downloads  Last updated  Notes
0.1.1    38         7/2/2024      Minor changes to enable testing
0.1.0    39         7/1/2024