Rapid.AI.Ollama.Framework
2.0.5
dotnet add package Rapid.AI.Ollama.Framework --version 2.0.5
NuGet\Install-Package Rapid.AI.Ollama.Framework -Version 2.0.5
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Rapid.AI.Ollama.Framework" Version="2.0.5" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
<PackageVersion Include="Rapid.AI.Ollama.Framework" Version="2.0.5" />
<PackageReference Include="Rapid.AI.Ollama.Framework" />
For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution's Directory.Packages.props file to version the package, and the PackageReference node (without a version) into the project file to reference it.
paket add Rapid.AI.Ollama.Framework --version 2.0.5
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: Rapid.AI.Ollama.Framework, 2.0.5"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
#:package Rapid.AI.Ollama.Framework@2.0.5
#:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.
#addin nuget:?package=Rapid.AI.Ollama.Framework&version=2.0.5
#tool nuget:?package=Rapid.AI.Ollama.Framework&version=2.0.5
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
Rapid.AI.Ollama.Framework
Rapid.AI.Ollama.Framework is a lightweight .NET library that allows developers to interact with locally or remotely running Ollama models. It supports both stateless prompt generation and contextual multi-turn chat conversations using the Ollama REST API. Say goodbye to API limits and monthly bills: develop and run a powerful local AI that browses, codes, and thinks.
✨ Features
- 📦 Simple and easy-to-integrate C# API
- 🤖 Ollama Integration: Send prompts and receive completions from local or remote Ollama servers.
- 🧼 Stateful & Stateless Chat: Clearable chat histories to manage conversational memory.
- 🧠 Streaming: Supports streaming output for prompt generation.
🤖 Ollama Gateway (IOllamaGateway)
- Lightweight prompt-based chat and content generation
- Stateful chat and stateless generation, with optional history reset
📦 Installation
Install from NuGet.org:
dotnet add package Rapid.AI.Ollama.Framework
Or via Package Manager Console:
Install-Package Rapid.AI.Ollama.Framework
🚀 Getting Started
📦 Prerequisites
- .NET 6 or later
- Ollama running locally (default endpoint: http://localhost:11434)
- A downloaded model (e.g., llama3, llama3.2:1b, etc.)
🧱 Example Model Names
- llama3
- llama3.2:1b
- mistral
- Any other model available through Ollama
Ensure the model is already pulled by running:
ollama run llama3.2:1b
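Optionally, you can verify that the server is reachable before creating a gateway. The snippet below is a minimal sketch and is not part of this package; it assumes Ollama's standard REST endpoint /api/tags (which lists the locally pulled models) and uses plain HttpClient:

// Illustrative reachability check against a local Ollama server (not part of the package).
using System;
using System.Net.Http;
using System.Threading.Tasks;

class OllamaHealthCheck
{
    static async Task Main()
    {
        var baseUrl = "http://localhost:11434"; // default Ollama endpoint
        using var http = new HttpClient();
        try
        {
            // /api/tags returns a JSON list of the models that have been pulled locally.
            var json = await http.GetStringAsync($"{baseUrl}/api/tags");
            Console.WriteLine("Ollama is reachable. Installed models:");
            Console.WriteLine(json);
        }
        catch (HttpRequestException ex)
        {
            Console.WriteLine($"Could not reach Ollama at {baseUrl}: {ex.Message}");
        }
    }
}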
🧪 Usage
🏭 1. Factory Class
public interface IOllamaGatewayFactory
{
IOllamaGateway Create(string ollamaUrl, string model);
}
🤖 2. IOllamaGateway
public interface IOllamaGateway
{
    // Sends a prompt in an ongoing, multi-turn conversation (context is kept until cleared).
    string ChatAsync(string prompt, string model = "");
    // Clears the stored conversation history.
    void ClearChatHistory();
    // Generates a stateless, one-off completion for the prompt.
    string GenerateAsync(string prompt, string model = "");
}
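If your application uses dependency injection, the factory can be registered once and gateways created where needed. This is a hedged sketch rather than guidance from the package itself: it assumes the Microsoft.Extensions.DependencyInjection package, that the gateway types live in the Rapid.AI.Ollama.Framework namespace, and that OllamaGatewayFactory (as used in the examples below) is the concrete implementation of IOllamaGatewayFactory.

using System;
using Microsoft.Extensions.DependencyInjection;
using Rapid.AI.Ollama.Framework; // assumed namespace for the gateway types

var services = new ServiceCollection();
// Register the factory as a singleton; gateways are created on demand.
services.AddSingleton<IOllamaGatewayFactory, OllamaGatewayFactory>();
using var provider = services.BuildServiceProvider();

var gateway = provider.GetRequiredService<IOllamaGatewayFactory>()
                      .Create("http://localhost:11434", "llama3.2:1b");
Console.WriteLine(gateway.GenerateAsync("Say hello in one sentence."));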
✨ 1. Chat Usage Example with Context (Multi-Turn)
var ollamaUrl = "http://localhost:11434"; // default local Ollama endpoint
var myModel = "llama3.2:1b";              // any model you have pulled
var factory = new OllamaGatewayFactory();
var ollama = factory.Create(ollamaUrl, myModel);
string response = ollama.ChatAsync("Tell me about India?");
Console.WriteLine(response);
// Follow-up question: the previous exchange is kept as context.
string responseUpdated = ollama.ChatAsync("What are the best tourist places?");
Console.WriteLine(responseUpdated);
ollama.ClearChatHistory(); // reset the conversation when you are done
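Because ChatAsync keeps conversational context until ClearChatHistory() is called, it is easy to drive an interactive session. The console loop below is only an illustrative sketch built on the documented API (the 'reset' and 'exit' commands are invented for this example):

var factory = new OllamaGatewayFactory();
var ollama = factory.Create("http://localhost:11434", "llama3.2:1b");

Console.WriteLine("Type a message, 'reset' to clear the history, or 'exit' to quit.");
while (true)
{
    Console.Write("> ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input) || input == "exit") break;
    if (input == "reset") { ollama.ClearChatHistory(); continue; } // start a fresh conversation
    Console.WriteLine(ollama.ChatAsync(input));                    // context carries across turns
}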
✨ 2. Content Generation Usage Example
var ollamaUrl = "http://localhost:11434"; // default local Ollama endpoint
var myModel = "llama3.2:1b";              // any model you have pulled
var factory = new OllamaGatewayFactory();
var ollama = factory.Create(ollamaUrl, myModel);
string response = ollama.GenerateAsync("Tell me about India?");
Console.WriteLine(response);
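GenerateAsync is the stateless counterpart to ChatAsync: each call is treated independently, so there is no conversation context to manage or clear after a generation call.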
🔄 To Clear Chat History:
ollama.ClearChatHistory();
📜 License
MIT License – free to use, modify, and distribute. 📧 aruna.devadiga@gmail.com
🤝 Contributions
Feature requests and improvements are welcome. Please fork and open a PR with your changes! 📧 aruna.devadiga@gmail.com
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, and net10.0-windows were computed. |
Dependencies: net8.0 – No dependencies.
This package is not used by any NuGet packages or popular GitHub repositories.
- License file updated
- Readme updated
- Factory introduced to create Ollama gateway