OllamaSharp 2.0.13

There is a newer version of this package available.
See the version list below for details.
.NET CLI
dotnet add package OllamaSharp --version 2.0.13

Package Manager
NuGet\Install-Package OllamaSharp -Version 2.0.13
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="OllamaSharp" Version="2.0.13" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Paket CLI
paket add OllamaSharp --version 2.0.13

Script & Interactive
#r "nuget: OllamaSharp, 2.0.13"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or into the source code of the script to reference the package.

Cake
// Install OllamaSharp as a Cake Addin
#addin nuget:?package=OllamaSharp&version=2.0.13

// Install OllamaSharp as a Cake Tool
#tool nuget:?package=OllamaSharp&version=2.0.13

OllamaSharp 🦙

OllamaSharp is a .NET binding for the Ollama API, making it easy to interact with Ollama using your favorite .NET languages.

Features

  • Intuitive API client: Set up and interact with Ollama in just a few lines of code.
  • API endpoint coverage: Support for all Ollama API endpoints including chats, embeddings, listing models, pulling and creating new models, and more.
  • Real-time streaming: Stream responses directly to your application.
  • Progress reporting: Get real-time progress feedback on tasks like model pulling.
  • API Console: A ready-to-use API console to chat with and manage your Ollama host remotely.

Usage

OllamaSharp wraps every Ollama API endpoint in awaitable methods that fully support response streaming.

The following list shows a few examples that give a glimpse of how easy the library is to use; it is not exhaustive.

Initializing

// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama2";

Listing all models that are available locally

var models = await ollama.ListLocalModels();
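
The result can be enumerated like any collection. A minimal sketch, assuming the returned model type exposes `Name` and `Size` properties mirroring the Ollama API's model-listing fields (verify against the type shipped with your OllamaSharp version):

```csharp
// List local models and print their names and sizes.
// Name and Size are assumed property names; check the model
// type in your OllamaSharp version before relying on them.
var models = await ollama.ListLocalModels();
foreach (var model in models)
    Console.WriteLine($"{model.Name} ({model.Size} bytes)");
```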

Pulling a model and reporting progress

Callback Syntax
await ollama.PullModel("mistral", status => Console.WriteLine($"({status.Percent}%) {status.Status}"));
IAsyncEnumerable Syntax
await foreach (var status in ollama.PullModel("mistral"))
{
    Console.WriteLine($"({status.Percent}%) {status.Status}");
}
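
A long-running pull can be bounded with the BCL's `WithCancellation` extension on `IAsyncEnumerable<T>`. Whether cancellation actually aborts the underlying download depends on how the enumerator honors the token, so treat this as a sketch:

```csharp
// Stop waiting on the pull if it exceeds ten minutes.
using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(10));
await foreach (var status in ollama.PullModel("mistral").WithCancellation(cts.Token))
{
    Console.WriteLine($"({status.Percent}%) {status.Status}");
}
```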

Streaming a completion directly into the console

Callback Syntax
// keep reusing the context to keep the chat topic going
ConversationContext context = null;
context = await ollama.StreamCompletion("How are you today?", context, stream => Console.Write(stream.Response));
IAsyncEnumerable Syntax
// keep reusing the context to keep the chat topic going
ConversationContext context = null;
await foreach (var stream in ollama.StreamCompletion("How are you today?", context))
{
    Console.Write(stream.Response);
    context = stream.Context;
}

Building interactive chats

// uses the /chat api from Ollama 0.1.14
// messages including their roles will automatically be tracked within the chat object
var chat = ollama.Chat(stream => Console.WriteLine(stream.Message?.Content ?? ""));
while (true)
{
    var message = Console.ReadLine();
    if (string.IsNullOrEmpty(message))
        break; // Console.ReadLine() returns null at end of input
    await chat.Send(message);
}
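
Because the chat object tracks message roles, the conversation can also be seeded with a system instruction. A hedged sketch: the `SendAs` method and `ChatRole` type are assumptions here and should be verified against your OllamaSharp version before use:

```csharp
// Hypothetical: seed the chat with a system instruction before chatting.
// SendAs and ChatRole are assumed names; verify they exist in your version.
var chat = ollama.Chat(stream => Console.WriteLine(stream.Message?.Content ?? ""));
await chat.SendAs(ChatRole.System, "You are a concise assistant.");
await chat.Send("Summarize the Ollama project in one sentence.");
```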

API Console

This project ships a full-featured demo console covering every endpoint the Ollama API exposes.

This is not only a great resource for learning about OllamaSharp; it can also be used to manage and chat with the Ollama host remotely. Image chat is supported for multimodal models.

API Console Demo
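
For multimodal models, images are typically passed base64-encoded alongside the message. A hypothetical sketch: the `Send` overload taking base64 images is an assumption, so check the chat API's signatures in your OllamaSharp version:

```csharp
// Hypothetical image chat with a multimodal model such as llava.
ollama.SelectedModel = "llava";
var chat = ollama.Chat(stream => Console.Write(stream.Message?.Content ?? ""));

var imageBytes = await File.ReadAllBytesAsync("chart.png");
var imageBase64 = Convert.ToBase64String(imageBytes);

// Assumed overload: Send(message, imagesAsBase64)
await chat.Send("What does this chart show?", new[] { imageBase64 });
```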

Credits

Icon and name were reused from the amazing Ollama project.

Compatible and additional computed target framework versions, by product:
.NET: net5.0, net5.0-windows, net6.0, net6.0-android, net6.0-ios, net6.0-maccatalyst, net6.0-macos, net6.0-tvos, net6.0-windows, net7.0, net7.0-android, net7.0-ios, net7.0-maccatalyst, net7.0-macos, net7.0-tvos, net7.0-windows, net8.0, net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows (all computed)
.NET Core: netcoreapp2.0, netcoreapp2.1, netcoreapp2.2, netcoreapp3.0, netcoreapp3.1 (all computed)
.NET Standard: netstandard2.0 (compatible), netstandard2.1 (computed)
.NET Framework: net461, net462, net463, net47, net471, net472, net48, net481 (all computed)
MonoAndroid: monoandroid (computed)
MonoMac: monomac (computed)
MonoTouch: monotouch (computed)
Tizen: tizen40, tizen60 (computed)
Xamarin.iOS: xamarinios (computed)
Xamarin.Mac: xamarinmac (computed)
Xamarin.TVOS: xamarintvos (computed)
Xamarin.WatchOS: xamarinwatchos (computed)

NuGet packages (9)

Showing the top 5 NuGet packages that depend on OllamaSharp:

  • Microsoft.KernelMemory.AI.Ollama: Provides access to Ollama LLM models in Kernel Memory to generate embeddings and text.
  • Microsoft.SemanticKernel.Connectors.Ollama: Semantic Kernel connector for Ollama. Contains services for text generation, chat completion and text embeddings.
  • CommunityToolkit.Aspire.Hosting.Ollama: An Aspire integration leveraging the Ollama container, with support for downloading a model on startup.
  • CommunityToolkit.Aspire.OllamaSharp: A .NET Aspire client integration for the OllamaSharp library.
  • Atc.SemanticKernel.Connectors.Ollama: Contains a connector for integrating with local LLMs through Ollama.
GitHub repositories (6)

Showing the top 5 popular GitHub repositories that depend on OllamaSharp:

  • microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps.
  • microsoft/kernel-memory: RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.
  • dotnet/ai-samples
  • CommunityToolkit/Aspire: A community project with additional components and extensions for .NET Aspire.
  • lindexi/lindexi_gd: Code used in the blog.
Version Downloads Last updated
4.0.11 3,962 12/9/2024
4.0.10 51 12/9/2024
4.0.9 1,647 12/2/2024
4.0.8 11,789 11/22/2024
4.0.7 3,424 11/13/2024
4.0.6 10,353 11/7/2024
4.0.5 1,293 11/5/2024
4.0.4 335 11/4/2024
4.0.3 6,654 10/30/2024
4.0.2 254 10/29/2024
4.0.1 2,061 10/26/2024
4.0.0-preview.10 77 10/23/2024
4.0.0-preview.9 48 10/21/2024
4.0.0-preview.8 72 10/17/2024
3.0.15 3,047 10/21/2024
3.0.14 6,636 10/16/2024
3.0.13 84 10/16/2024
3.0.12 14,753 10/14/2024
3.0.11 1,360 10/9/2024
3.0.10 10,313 10/4/2024
3.0.9 97 10/4/2024
3.0.8 12,748 9/26/2024
3.0.7 18,615 9/12/2024
3.0.6 1,173 9/11/2024
3.0.5 789 9/11/2024
3.0.4 24,875 9/6/2024
3.0.3 117 9/6/2024
3.0.2 411 9/5/2024
3.0.1 16,669 9/2/2024
3.0.0 1,981 8/26/2024
2.1.3 2,548 8/23/2024
2.1.2 2,184 8/19/2024
2.1.1 4,131 8/5/2024
2.0.15 134 8/5/2024
2.0.14 101 8/3/2024
2.0.13 1,776 7/29/2024
2.0.12 110 7/28/2024
2.0.11 120 7/28/2024
2.0.10 3,829 7/12/2024
2.0.9 100 7/12/2024
2.0.8 131 7/12/2024
2.0.7 1,569 7/10/2024
2.0.6 3,083 6/25/2024
2.0.5 105 6/25/2024
2.0.4 574 6/24/2024
2.0.3 106 6/24/2024
2.0.2 145 6/24/2024
2.0.1 3,670 6/5/2024
1.1.13 183 6/5/2024
1.1.12 472 6/4/2024
1.1.11 300 6/2/2024
1.1.10 986 5/31/2024
1.1.9 7,079 5/15/2024
1.1.8 1,362 5/10/2024
1.1.7 119 5/10/2024
1.1.5 113 5/10/2024
1.1.4 178 5/10/2024
1.1.3 115 5/10/2024
1.1.2 116 5/10/2024
1.1.1 3,139 3/27/2024
1.1.0 1,849 1/8/2024
1.0.4 293 12/27/2023
1.0.3 341 11/30/2023
1.0.2 326 11/5/2023
1.0.1 189 10/16/2023
1.0.0 1,477 10/16/2023