Rapid.AI.Ollama.Framework 2.0.5

.NET CLI:
dotnet add package Rapid.AI.Ollama.Framework --version 2.0.5

Package Manager Console (intended for use within Visual Studio, as it uses the NuGet module's Install-Package):
NuGet\Install-Package Rapid.AI.Ollama.Framework -Version 2.0.5

PackageReference (for projects that support it, copy this XML node into the project file):
<PackageReference Include="Rapid.AI.Ollama.Framework" Version="2.0.5" />

Central Package Management (CPM): copy this XML node into the solution's Directory.Packages.props file to version the package:
<PackageVersion Include="Rapid.AI.Ollama.Framework" Version="2.0.5" />
then reference it from the project file without a version:
<PackageReference Include="Rapid.AI.Ollama.Framework" />

Paket CLI:
paket add Rapid.AI.Ollama.Framework --version 2.0.5

F# Interactive / Polyglot Notebooks (copy this #r directive into the interactive tool or script source):
#r "nuget: Rapid.AI.Ollama.Framework, 2.0.5"

C# file-based apps (starting in .NET 10 preview 4; place this before any lines of code in the .cs file):
#:package Rapid.AI.Ollama.Framework@2.0.5

Install as a Cake Addin:
#addin nuget:?package=Rapid.AI.Ollama.Framework&version=2.0.5

Install as a Cake Tool:
#tool nuget:?package=Rapid.AI.Ollama.Framework&version=2.0.5
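
The #:package directive above can be used to try the library from a single-file C# app (.NET 10 preview 4 or later). A minimal sketch, assuming the package exposes `OllamaGatewayFactory` under a `Rapid.AI.Ollama.Framework` namespace (inferred from the package id, so it may differ):

```csharp
#:package Rapid.AI.Ollama.Framework@2.0.5

// Namespace assumed from the package id; adjust if the package uses a different one.
using Rapid.AI.Ollama.Framework;

// Create a gateway against a local Ollama server and an already-pulled model.
var ollama = new OllamaGatewayFactory().Create("http://localhost:11434", "llama3.2:1b");
Console.WriteLine(ollama.GenerateAsync("Say hello in one sentence."));
```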

Rapid.AI.Ollama.Framework

Rapid.AI.Ollama.Framework is a lightweight .NET library that lets developers interact with locally or remotely running Ollama models. It supports both stateless prompt generation and contextual multi-turn chat conversations via the Ollama REST API. Say goodbye to API limits and monthly bills: develop and run a powerful local AI that browses, codes, and thinks.

✨ Features

  • 📦 Simple, easy-to-integrate C# API
  • 🤖 Ollama integration: send prompts and receive completions from local or remote Ollama servers
  • 🧼 Stateful & stateless chat: clearable chat histories to manage conversational memory
  • 🔧 Streaming output for prompt generation

🤖 Ollama Gateway (IOllamaGateway)

  • Lightweight prompt-based chat and content generation
  • Stateful chat and stateless generation, with an optional history reset

📦 Installation

Install from NuGet.org:

dotnet add package Rapid.AI.Ollama.Framework

Or via Package Manager Console:

Install-Package Rapid.AI.Ollama.Framework

🚀 Getting Started

📦 Prerequisites

  • .NET 6 or later
  • Ollama running locally (default port http://localhost:11434)
  • A downloaded model (e.g., llama3, llama3.2:1b, etc.)

🧱 Example Model Names

  • llama3
  • llama3.2:1b
  • mistral
  • Any other model available through Ollama

Ensure the model has already been pulled, for example:

ollama pull llama3.2:1b
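
Before wiring up the library, you can confirm the server is up and the model is present. A minimal sketch against the Ollama REST API's GET /api/tags endpoint, which lists locally available models:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class OllamaCheck
{
    static async Task Main()
    {
        using var http = new HttpClient();
        // /api/tags returns a JSON list of the models pulled on this Ollama server.
        var json = await http.GetStringAsync("http://localhost:11434/api/tags");
        Console.WriteLine(json); // check that your model (e.g. llama3.2:1b) appears here
    }
}
```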

🧪 Usage

🏭 1. Factory Class

public interface IOllamaGatewayFactory
{
    IOllamaGateway Create(string ollamaUrl, string model);
}

🤖 2. IOllamaGateway

public interface IOllamaGateway
{
    string ChatAsync(string prompt, string model = "");
    void ClearChatHistory();
    string GenerateAsync(string prompt, string model = "");
}
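
Both methods accept an optional model parameter. A short sketch, assuming (not confirmed by the documentation) that an empty string falls back to the model passed to Create while a non-empty name overrides it per call:

```csharp
var ollama = new OllamaGatewayFactory().Create("http://localhost:11434", "llama3.2:1b");

// Uses the default model supplied to Create:
Console.WriteLine(ollama.GenerateAsync("Summarize .NET in one sentence."));

// Hypothetical per-call override with another locally pulled model:
Console.WriteLine(ollama.GenerateAsync("Summarize .NET in one sentence.", "mistral"));
```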

✨ 1. Chat Usage Example with Context (Multi-Turn)

var factory = new OllamaGatewayFactory();
var ollama = factory.Create(ollamaUrl, myModel);

string response = ollama.ChatAsync("Tell me about India?");
Console.WriteLine(response);

string responseUpdated = ollama.ChatAsync("What are the best tourist places?");
Console.WriteLine(responseUpdated);

ollama.ClearChatHistory();

✨ 2. Content Generation Usage Example

var factory = new OllamaGatewayFactory();
var ollama = factory.Create(ollamaUrl, myModel);

string response = ollama.GenerateAsync("Tell me about India?");
Console.WriteLine(response);
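
Under the hood the library talks to the Ollama REST API, so the generation call above roughly corresponds to a POST to /api/generate. A minimal sketch of the equivalent raw request (useful for debugging a connection; this is not part of the library's API):

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class RawGenerate
{
    static async Task Main()
    {
        using var http = new HttpClient();
        // "stream": false asks Ollama for a single JSON response instead of streamed chunks.
        var body = new StringContent(
            """{"model":"llama3.2:1b","prompt":"Tell me about India","stream":false}""",
            Encoding.UTF8, "application/json");
        var resp = await http.PostAsync("http://localhost:11434/api/generate", body);
        Console.WriteLine(await resp.Content.ReadAsStringAsync());
    }
}
```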

🔄 To Clear Chat History:
ollama.ClearChatHistory();

📃 License

MIT License – free to use, modify, and distribute. 📧 aruna.devadiga@gmail.com

🤝 Contributions

Feature requests and improvements are welcome. Please fork the repository and submit a pull request! 📧 aruna.devadiga@gmail.com


Compatible target frameworks

  • net8.0 is included in the package (no dependencies).
  • Computed as compatible: net9.0, net10.0, and the platform-specific variants of net8.0 through net10.0 (android, browser, ios, maccatalyst, macos, tvos, windows).

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version Downloads Last Updated
2.0.5 155 6/2/2025
2.0.4 159 5/29/2025
2.0.3 145 5/29/2025
2.0.2 77 5/24/2025
2.0.1 85 5/24/2025
1.2.5 166 5/21/2025
1.2.4 150 5/21/2025
1.2.3 161 5/21/2025

- License file updated
- Readme updated
- Factory introduced to create Ollama gateway