Rapid.AI.Ollama.Framework 1.2.5

There is a newer version of this package available; see the version list below for details.

.NET CLI:
dotnet add package Rapid.AI.Ollama.Framework --version 1.2.5

Package Manager (run within the Visual Studio Package Manager Console, which provides the NuGet module's Install-Package):
NuGet\Install-Package Rapid.AI.Ollama.Framework -Version 1.2.5

PackageReference (for projects that support PackageReference, copy this XML node into the project file):
<PackageReference Include="Rapid.AI.Ollama.Framework" Version="1.2.5" />

Central Package Management (for projects that support CPM, version the package in the solution's Directory.Packages.props and reference it in the project file):
<PackageVersion Include="Rapid.AI.Ollama.Framework" Version="1.2.5" />
<PackageReference Include="Rapid.AI.Ollama.Framework" />

Paket CLI:
paket add Rapid.AI.Ollama.Framework --version 1.2.5

F# Interactive / Polyglot Notebooks (copy the #r directive into the interactive tool or script source):
#r "nuget: Rapid.AI.Ollama.Framework, 1.2.5"

C# file-based apps (the #:package directive is supported starting in .NET 10 preview 4; place it before any lines of code in the .cs file):
#:package Rapid.AI.Ollama.Framework@1.2.5

Cake Addin:
#addin nuget:?package=Rapid.AI.Ollama.Framework&version=1.2.5

Cake Tool:
#tool nuget:?package=Rapid.AI.Ollama.Framework&version=1.2.5

Rapid.AI.Ollama.Framework

Rapid.AI.Ollama.Framework is a lightweight C# client library that allows developers to interact with locally running Ollama models. It supports both stateless prompt generation and contextual multi-turn chat conversations using the Ollama REST API.

✨ Features

  • ๐Ÿ” Stateless prompt generation using /api/generate
  • ๐Ÿ’ฌ Context-aware chat with conversation history using /api/chat
  • ๐Ÿ“ฆ Simple and easy-to-integrate C# API
  • ๐Ÿ”ง Supports streaming output for prompt generation

🚀 Getting Started

📦 Prerequisites

  • .NET 6 or later
  • Ollama running locally (default port http://localhost:11434)
  • A downloaded model (e.g., llama3, llama3.2:1b, etc.)
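
Before calling the library, you can confirm the server is reachable; /api/version is a standard Ollama REST route that returns the server version as a small JSON object (this requires a running Ollama instance):

```shell
# Prints a small JSON object with the server version when Ollama is running,
# e.g. {"version":"..."}; a connection error means the server is not up.
curl http://localhost:11434/api/version
```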

🧪 Usage

✨ 1. Generate Prompt Response

using Rapid.AI.Ollama.Framework;

string result = OllamaClient.Generate("http://localhost:11434/api/generate", "What is quantum physics?", "llama3.2:1b");

Console.WriteLine(result);

This uses the /api/generate endpoint with streaming enabled, and returns a stateless response for the given prompt.
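
For context, here is a minimal sketch of what such a call looks like at the REST level, written directly against Ollama's public /api/generate endpoint with HttpClient rather than through this library (a non-streaming variant for brevity; the endpoint emits incremental JSON lines when stream is true):

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;

// Illustrative sketch only, assuming a local Ollama server;
// field names follow the public Ollama REST API, not this library's internals.
using var http = new HttpClient { Timeout = TimeSpan.FromMinutes(5) };

var payload = JsonSerializer.Serialize(new
{
    model = "llama3.2:1b",
    prompt = "What is quantum physics?",
    stream = false // set true to receive incremental JSON lines instead
});

var response = await http.PostAsync(
    "http://localhost:11434/api/generate",
    new StringContent(payload, Encoding.UTF8, "application/json"));

// The non-streaming reply is a single JSON object whose
// "response" property holds the generated text.
var body = await response.Content.ReadAsStringAsync();
Console.WriteLine(
    JsonDocument.Parse(body).RootElement.GetProperty("response").GetString());
```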

💬 2. Chat with Context (Multi-Turn)

using Rapid.AI.Ollama.Framework;

// First user message
string reply1 = OllamaClient.Chat("http://localhost:11434/api/chat", "Who was Marie Curie?", "llama3.2:1b");
Console.WriteLine("AI: " + reply1);

// Follow-up message
string reply2 = OllamaClient.Chat("http://localhost:11434/api/chat", "What was her contribution to science?", "llama3.2:1b");
Console.WriteLine("AI: " + reply2);

Chat() maintains conversation context automatically: the message history is kept internally and sent along with each request.
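
For reference, the /api/chat endpoint receives the accumulated history as a messages array of role/content pairs (field names per the public Ollama REST API), so the follow-up call above conceptually sends a request body like this (assistant content abbreviated):

```json
{
  "model": "llama3.2:1b",
  "messages": [
    { "role": "user", "content": "Who was Marie Curie?" },
    { "role": "assistant", "content": "..." },
    { "role": "user", "content": "What was her contribution to science?" }
  ]
}
```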

🔄 To Clear Chat History:

OllamaClient.ClearChatHistory();

📌 Notes

The Generate method uses the /api/generate endpoint and streams the output.

The Chat method uses the /api/chat endpoint and maintains internal chat history.

The HTTP timeout is set to 5 minutes to accommodate long-running responses.

๐Ÿ“ Project Structure

OllamaClient
├── Generate(...)        // Stateless streaming prompt generation
├── Chat(...)            // Stateful chat with message history
└── ClearChatHistory()   // Clears the internal chat history

🧱 Example Model Names

  • llama3
  • llama3.2:1b
  • mistral
  • Any other model available through Ollama

Ensure the model has already been pulled, for example:

ollama pull llama3.2:1b

(Running ollama run llama3.2:1b also works; it pulls the model on first use.)

📃 License

MIT License โ€“ free to use, modify, and distribute.

๐Ÿค Contributions

Feature requests and improvements are welcome. Please fork the repository and submit a pull request.

Target framework compatibility

  • Included in package: net8.0 (no dependencies)
  • Compatible: net8.0
  • Additional computed: net9.0, net10.0, and the android, browser, ios, maccatalyst, macos, tvos, and windows variants of net8.0, net9.0, and net10.0

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

Version | Downloads | Last Updated
--------|-----------|-------------
2.0.5   | 156       | 6/2/2025
2.0.4   | 159       | 5/29/2025
2.0.3   | 145       | 5/29/2025
2.0.2   | 77        | 5/24/2025
2.0.1   | 86        | 5/24/2025
1.2.5   | 167       | 5/21/2025
1.2.4   | 150       | 5/21/2025
1.2.3   | 161       | 5/21/2025

- Readme.md added
- Chart support added