OllamaSharp 5.3.4

.NET CLI
dotnet add package OllamaSharp --version 5.3.4

Package Manager
NuGet\Install-Package OllamaSharp -Version 5.3.4
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="OllamaSharp" Version="5.3.4" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)
For projects that support Central Package Management, copy this XML node into the solution's Directory.Packages.props file to version the package:
<PackageVersion Include="OllamaSharp" Version="5.3.4" />
Then reference it from the project file:
<PackageReference Include="OllamaSharp" />

Paket CLI
paket add OllamaSharp --version 5.3.4

Script & Interactive
#r "nuget: OllamaSharp, 5.3.4"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.

File-based apps
#:package OllamaSharp@5.3.4
The #:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake
#addin nuget:?package=OllamaSharp&version=5.3.4
Install as a Cake Addin.
#tool nuget:?package=OllamaSharp&version=5.3.4
Install as a Cake Tool.

<a href="https://www.nuget.org/packages/OllamaSharp"><img src="https://img.shields.io/nuget/v/OllamaSharp" alt="nuget version"></a> <a href="https://www.nuget.org/packages/OllamaSharp"><img src="https://img.shields.io/nuget/dt/OllamaSharp.svg" alt="nuget downloads"></a> <a href="https://awaescher.github.io/OllamaSharp"><img src="https://img.shields.io/badge/api_docs-8A2BE2" alt="Api docs"></a>

OllamaSharp 🦙

OllamaSharp provides .NET bindings for the Ollama API, simplifying interactions with Ollama both locally and remotely.

🏆 Recommended by Microsoft

Features

Usage

OllamaSharp wraps each Ollama API endpoint in awaitable methods that fully support response streaming.

The following sections show a few simple code examples.

Try the full-featured demo application that's included in this repository.

Initializing

// set up the client
var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);

// select a model which should be used for further operations
ollama.SelectedModel = "llama3.1:8b";

Listing all models that are available locally

var models = await ollama.ListLocalModelsAsync();
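Each returned model carries metadata you can print or filter on; a minimal sketch, assuming the listed model type exposes a Name property:

```csharp
// print the name of every model available on the local Ollama instance
// (Name is assumed here as the model's identifier property)
foreach (var model in models)
    Console.WriteLine(model.Name);
```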

Pulling a model and reporting progress

await foreach (var status in ollama.PullModelAsync("llama3.1:405b"))
    Console.WriteLine($"{status.Percent}% {status.Status}");

Generating a completion directly into the console

await foreach (var stream in ollama.GenerateAsync("How are you today?"))
    Console.Write(stream.Response);

Building interactive chats

// messages including their roles and tool calls will automatically be tracked within the chat object
// and are accessible via the Messages property

var chat = new Chat(ollama);

while (true)
{
    var message = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(message))
        break; // Console.ReadLine() may return null or empty input

    await foreach (var answerToken in chat.SendAsync(message))
        Console.Write(answerToken);
}
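Because the chat object tracks the whole conversation, you can inspect it after each turn via the Messages property mentioned above. A minimal sketch, assuming each tracked message exposes Role and Content properties:

```csharp
// after one or more SendAsync calls, dump the tracked conversation,
// which includes both the user prompts and the model's answers
foreach (var msg in chat.Messages)
    Console.WriteLine($"{msg.Role}: {msg.Content}");
```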

Usage with Microsoft.Extensions.AI

Microsoft built an abstraction library to streamline the usage of different AI providers. This is an interesting concept if you plan to build apps that might use different providers, such as ChatGPT, Claude, or local models with Ollama.

I encourage you to read their announcement, Introducing Microsoft.Extensions.AI Preview – Unified AI Building Blocks for .NET.

OllamaSharp is the first full implementation of their IChatClient and IEmbeddingGenerator interfaces, making it possible to use Ollama just like every other chat provider.

To do this, simply use the OllamaApiClient as IChatClient instead of IOllamaApiClient.

// install package Microsoft.Extensions.AI.Abstractions

private static IChatClient CreateChatClient(Arguments arguments)
{
  if (arguments.Provider.Equals("ollama", StringComparison.OrdinalIgnoreCase))
    return new OllamaApiClient(arguments.Uri, arguments.Model);
  else
    return new OpenAIChatClient(new OpenAI.OpenAIClient(arguments.ApiKey), arguments.Model); // ChatGPT or compatible
}

The OllamaApiClient implements both interfaces from Microsoft.Extensions.AI; you just need to cast it accordingly:

  • IChatClient for model inference
  • IEmbeddingGenerator<string, Embedding<float>> for embedding generation
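As a sketch of the embedding side: the cast works because OllamaApiClient implements IEmbeddingGenerator<string, Embedding<float>>, and GenerateAsync is the core method of that interface. The model name "all-minilm" is only an example; any embedding-capable Ollama model works.

```csharp
using Microsoft.Extensions.AI;

IEmbeddingGenerator<string, Embedding<float>> generator =
    new OllamaApiClient(new Uri("http://localhost:11434"), "all-minilm");

// GenerateAsync returns one embedding per input string
var embeddings = await generator.GenerateAsync(new[] { "Hello, Ollama!" });
Console.WriteLine($"Vector length: {embeddings[0].Vector.Length}");
```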

Thanks

I would like to thank all the contributors who take the time to improve OllamaSharp. First and foremost mili-tan, who always keeps OllamaSharp in sync with the Ollama API.

The icon and name were reused from the amazing Ollama project.

Special thanks to JetBrains for supporting this project.


Compatible and additional computed target framework versions
.NET net5.0 was computed.  net5.0-windows was computed.  net6.0 was computed.  net6.0-android was computed.  net6.0-ios was computed.  net6.0-maccatalyst was computed.  net6.0-macos was computed.  net6.0-tvos was computed.  net6.0-windows was computed.  net7.0 was computed.  net7.0-android was computed.  net7.0-ios was computed.  net7.0-maccatalyst was computed.  net7.0-macos was computed.  net7.0-tvos was computed.  net7.0-windows was computed.  net8.0 is compatible.  net8.0-android was computed.  net8.0-browser was computed.  net8.0-ios was computed.  net8.0-maccatalyst was computed.  net8.0-macos was computed.  net8.0-tvos was computed.  net8.0-windows was computed.  net9.0 is compatible.  net9.0-android was computed.  net9.0-browser was computed.  net9.0-ios was computed.  net9.0-maccatalyst was computed.  net9.0-macos was computed.  net9.0-tvos was computed.  net9.0-windows was computed.  net10.0 was computed.  net10.0-android was computed.  net10.0-browser was computed.  net10.0-ios was computed.  net10.0-maccatalyst was computed.  net10.0-macos was computed.  net10.0-tvos was computed.  net10.0-windows was computed. 
.NET Core netcoreapp2.0 was computed.  netcoreapp2.1 was computed.  netcoreapp2.2 was computed.  netcoreapp3.0 was computed.  netcoreapp3.1 was computed. 
.NET Standard netstandard2.0 is compatible.  netstandard2.1 is compatible. 
.NET Framework net461 was computed.  net462 was computed.  net463 was computed.  net47 was computed.  net471 was computed.  net472 was computed.  net48 was computed.  net481 was computed. 
MonoAndroid monoandroid was computed. 
MonoMac monomac was computed. 
MonoTouch monotouch was computed. 
Tizen tizen40 was computed.  tizen60 was computed. 
Xamarin.iOS xamarinios was computed. 
Xamarin.Mac xamarinmac was computed. 
Xamarin.TVOS xamarintvos was computed. 
Xamarin.WatchOS xamarinwatchos was computed. 

NuGet packages (21)

Showing the top 5 NuGet packages that depend on OllamaSharp:

Microsoft.SemanticKernel.Connectors.Ollama

Semantic Kernel connector for Ollama. Contains services for text generation, chat completion and text embeddings.

Microsoft.KernelMemory.AI.Ollama

Provides access to Ollama LLM models in Kernel Memory to generate embeddings and text.

CommunityToolkit.Aspire.Hosting.Ollama

An Aspire integration leveraging the Ollama container with support for downloading a model on startup.

CommunityToolkit.Aspire.OllamaSharp

A .NET Aspire client integration for the OllamaSharp library.

EnergyAssembly

Package Description

GitHub repositories (15)

Showing the top 15 popular GitHub repositories that depend on OllamaSharp:

microsoft/semantic-kernel
Integrate cutting-edge LLM technology quickly and easily into your apps
testcontainers/testcontainers-dotnet
A library to support tests with throwaway instances of Docker containers for all compatible .NET Standard versions.
dotnet/extensions
This repository contains a suite of libraries that provide facilities commonly needed when creating production-ready applications.
microsoft/kernel-memory
RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.
microsoft/Generative-AI-for-beginners-dotnet
Five lessons, learn how to really apply AI to your .NET Applications
microsoft/ai-dev-gallery
An open-source project for Windows developers to learn how to add AI with local models and APIs to Windows apps.
getcellm/cellm
Use LLMs in Excel formulas
mixcore/mix.core
🚀 A future-proof enterprise web CMS supporting both headless and decoupled approaches. Build any type of app with customizable APIs on ASP.NET Core/.NET Core. Completely open-source and designed for flexibility.
dotnet/ai-samples
CommunityToolkit/Aspire
A community project with additional components and extensions for .NET Aspire
anobaka/Bakabase
A local media manager for all types of files. 二次元老司机专用的本地媒体文件管理器,支持管理和处理音视频、本子、图集、小说、哔哩哔哩视频、游戏甚至mod等各类资源
PowerShell/AIShell
An interactive shell to work with AI-powered assistance providers
axzxs2001/Asp.NetCoreExperiment
All earlier projects have been moved into the OleVersion directory for preservation. New samples mainly target .NET 5.0, partly upgrading earlier examples and partly summarizing past work experience for reference.
afrise/MCPSharp
MCPSharp is a .NET library that helps you build Model Context Protocol (MCP) servers and clients - the standardized API protocol used by AI assistants and models.
lindexi/lindexi_gd
Code used in blog posts.
Version Downloads Last Updated
5.3.6 4,973 8/20/2025
5.3.5 8,216 8/18/2025
5.3.4 5,864 8/2/2025
5.3.3 3,698 7/22/2025
5.3.2 709 7/22/2025
5.3.1 984 7/19/2025
5.2.10 2,598 7/14/2025
5.2.9 2,175 7/13/2025
5.2.8 1,842 7/10/2025
5.2.7 2,685 7/7/2025
5.2.6 1,112 7/4/2025
5.2.5 425 7/4/2025
5.2.4 420 7/4/2025
5.2.3 27,356 6/23/2025
5.2.2 18,742 5/30/2025
5.2.1 533 5/30/2025
5.1.20 546 5/30/2025
5.1.19 3,400 5/23/2025
5.1.18 13,871 5/19/2025
5.1.17 990 5/16/2025
5.1.16 1,256 5/13/2025
5.1.15 442 5/13/2025
5.1.14 23,901 5/6/2025
5.1.13 10,475 4/10/2025
5.1.12 138,090 4/10/2025
5.1.11 3,563 4/9/2025
5.1.10 1,920 4/7/2025
5.1.9 4,238 3/27/2025
5.1.8 274 3/27/2025
5.1.7 75,270 3/14/2025
5.1.6 261 3/14/2025
5.1.5 1,087 3/13/2025
5.1.4 2,864 3/5/2025
5.1.3 545 3/4/2025
5.1.2 23,615 2/24/2025
5.1.1 1,234 2/21/2025
5.1.0 255 2/21/2025
5.0.7 35,419 2/17/2025
5.0.6 30,582 2/3/2025
5.0.5 1,362 1/31/2025
5.0.4 4,065 1/27/2025
5.0.3 3,367 1/22/2025
5.0.2 8,688 1/15/2025
5.0.1 473 1/15/2025
4.0.22 10,035 1/10/2025
4.0.21 208 1/9/2025
4.0.20 454 1/8/2025
4.0.19 136 1/8/2025
4.0.18 511 1/8/2025
4.0.17 37,075 1/3/2025
4.0.16 180 1/3/2025
4.0.15 183 1/3/2025
4.0.14 198 1/3/2025
4.0.13 173 1/3/2025
4.0.12 174 1/3/2025
4.0.11 33,759 12/9/2024
4.0.10 170 12/9/2024
4.0.9 1,926 12/2/2024
4.0.8 35,226 11/22/2024
4.0.7 4,196 11/13/2024
4.0.6 14,445 11/7/2024
4.0.5 1,391 11/5/2024
4.0.4 649 11/4/2024
4.0.3 30,699 10/30/2024
4.0.2 351 10/29/2024
4.0.1 2,282 10/26/2024
4.0.0-preview.10 136 10/23/2024
4.0.0-preview.9 92 10/21/2024
4.0.0-preview.8 117 10/17/2024
3.0.15 5,338 10/21/2024
3.0.14 10,832 10/16/2024
3.0.13 154 10/16/2024
3.0.12 21,653 10/14/2024
3.0.11 1,522 10/9/2024
3.0.10 25,149 10/4/2024
3.0.9 173 10/4/2024
3.0.8 17,650 9/26/2024
3.0.7 34,764 9/12/2024
3.0.6 1,272 9/11/2024
3.0.5 846 9/11/2024
3.0.4 26,723 9/6/2024
3.0.3 170 9/6/2024
3.0.2 587 9/5/2024
3.0.1 23,403 9/2/2024
3.0.0 2,206 8/26/2024
2.1.3 2,758 8/23/2024
2.1.2 2,678 8/19/2024
2.1.1 4,460 8/5/2024
2.0.15 290 8/5/2024
2.0.14 154 8/3/2024
2.0.13 1,999 7/29/2024
2.0.12 175 7/28/2024
2.0.11 239 7/28/2024
2.0.10 5,086 7/12/2024
2.0.9 151 7/12/2024
2.0.8 184 7/12/2024
2.0.7 1,822 7/10/2024
2.0.6 3,185 6/25/2024
2.0.5 164 6/25/2024
2.0.4 628 6/24/2024
2.0.3 161 6/24/2024
2.0.2 220 6/24/2024
2.0.1 4,074 6/5/2024
1.1.13 611 6/5/2024
1.1.12 602 6/4/2024
1.1.11 352 6/2/2024
1.1.10 1,362 5/31/2024
1.1.9 7,321 5/15/2024
1.1.8 1,423 5/10/2024
1.1.7 178 5/10/2024
1.1.5 169 5/10/2024
1.1.4 307 5/10/2024
1.1.3 176 5/10/2024
1.1.2 168 5/10/2024
1.1.1 3,621 3/27/2024
1.1.0 2,291 1/8/2024
1.0.4 355 12/27/2023
1.0.3 432 11/30/2023
1.0.2 443 11/5/2023
1.0.1 271 10/16/2023
1.0.0 2,135 10/16/2023