Rapid.AI.Ollama.Framework
2.0.3
dotnet add package Rapid.AI.Ollama.Framework --version 2.0.3
NuGet\Install-Package Rapid.AI.Ollama.Framework -Version 2.0.3
<PackageReference Include="Rapid.AI.Ollama.Framework" Version="2.0.3" />
<PackageVersion Include="Rapid.AI.Ollama.Framework" Version="2.0.3" />
<PackageReference Include="Rapid.AI.Ollama.Framework" />
paket add Rapid.AI.Ollama.Framework --version 2.0.3
#r "nuget: Rapid.AI.Ollama.Framework, 2.0.3"
#:package Rapid.AI.Ollama.Framework@2.0.3
#addin nuget:?package=Rapid.AI.Ollama.Framework&version=2.0.3
#tool nuget:?package=Rapid.AI.Ollama.Framework&version=2.0.3
Rapid.AI.Ollama.Framework
Rapid.AI.Ollama.Framework is a lightweight .NET library that allows developers to interact with locally or remotely running Ollama models. It supports both stateless prompt generation and contextual multi-turn chat conversations using the Ollama REST API. Say goodbye to API limits and monthly bills: develop and run a powerful local AI that browses, codes, and thinks.
Features
- Simple and easy-to-integrate C# API
- Ollama Integration: send prompts and receive completions from local or remote Ollama servers.
- Stateful & Stateless Chat: clearable chat histories to manage conversational memory.
- Supports streaming output for prompt generation

Ollama Gateway (IOllamaGateway)
- Lightweight prompt-based chat and content generation
- Stateful and stateless operation, with optional history reset
Installation
Install from NuGet.org:
dotnet add package Rapid.AI.Ollama.Framework
Or via Package Manager Console:
Install-Package Rapid.AI.Ollama.Framework
Getting Started

Prerequisites
- .NET 6 or later
- Ollama running locally (default endpoint: http://localhost:11434)
- A downloaded model (e.g., llama3, llama3.2:1b, etc.)
Example Model Names
- llama3
- llama3.2:1b
- mistral
- Any other model available through Ollama
Ensure the model is already pulled by running:
ollama run llama3.2:1b
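To double-check that the server is reachable and see which models are already pulled, you can also query Ollama's REST API tags endpoint (adjust the URL if your server is not on the default local address):
curl http://localhost:11434/api/tags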
Usage

1. Factory Class
public interface IOllamaGatewayFactory
{
    // Creates a gateway bound to a specific Ollama server URL and default model.
    IOllamaGateway Create(string ollamaUrl, string model);
}
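The factory is the entry point: you give it the server URL and a default model name and it returns a ready-to-use gateway. If your application already uses Microsoft.Extensions.DependencyInjection, the wiring could look like the sketch below (an illustration, not part of the package; OllamaGatewayFactory is the concrete implementation used in the examples later in this README, and the namespace in the using directive is assumed):

using Microsoft.Extensions.DependencyInjection;
using Rapid.AI.Ollama.Framework;   // assumed namespace for the gateway types

// Register the factory once; create gateways wherever they are needed.
var services = new ServiceCollection();
services.AddSingleton<IOllamaGatewayFactory, OllamaGatewayFactory>();

using var provider = services.BuildServiceProvider();
var factory = provider.GetRequiredService<IOllamaGatewayFactory>();
var ollama = factory.Create("http://localhost:11434", "llama3.2:1b");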
2. IOllamaGateway
public interface IOllamaGateway
{
    // Multi-turn chat: the conversation history is kept between calls.
    string ChatAsync(string prompt, string model = "");

    // Clears the stored conversation history.
    void ClearChatHistory();

    // Stateless, single-shot content generation.
    string GenerateAsync(string prompt, string model = "");
}
3. Chat usage example with context (multi-turn)
var ollamaUrl = "http://localhost:11434";   // default local Ollama endpoint
var myModel = "llama3.2:1b";                // any model you have pulled

var factory = new OllamaGatewayFactory();
var ollama = factory.Create(ollamaUrl, myModel);

string response = ollama.ChatAsync("Tell me about India?");
Console.WriteLine(response);

// The follow-up question reuses the conversation history from the first call.
string responseUpdated = ollama.ChatAsync("What are the best tourist places?");
Console.WriteLine(responseUpdated);
ollama.ClearChatHistory();
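For an interactive session, the same calls can be wrapped in a small console loop. This is just a sketch built on the documented API; the URL and model name are placeholders:

var factory = new OllamaGatewayFactory();
var ollama = factory.Create("http://localhost:11434", "llama3.2:1b");

while (true)
{
    Console.Write("You: ");
    var input = Console.ReadLine();
    if (string.IsNullOrWhiteSpace(input)) break;   // empty line ends the session

    if (input.Trim() == "/reset")
    {
        ollama.ClearChatHistory();                 // start a fresh conversation
        continue;
    }

    // ChatAsync keeps the conversation history, so follow-ups have context.
    Console.WriteLine("Model: " + ollama.ChatAsync(input));
}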
4. Content generation usage example
var factory = new OllamaGatewayFactory();
var ollama = factory.Create("http://localhost:11434", "llama3.2:1b");   // example URL and model

// GenerateAsync is stateless: no chat history is kept between calls.
string response = ollama.GenerateAsync("Tell me about India?");
Console.WriteLine(response);
To clear chat history:
ollama.ClearChatHistory();
License
MIT License. Free to use, modify, and distribute. Contact: support@vedicaai.com or aruna.devadiga@gmail.com
Contributions
Feature requests and improvements are welcome. Please fork the repository and submit a pull request with your changes. Contact: support@vedicaai.com or aruna.devadiga@gmail.com
| Product | Versions (compatible and additional computed target framework versions) |
|---|---|
| .NET | net8.0 is compatible. net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos and net10.0-windows were computed. |
Dependencies (net8.0)
- No dependencies.
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
- Factory introduced to create Ollama gateway
- Readme updated