Rapid.Agentic.AI.Framework 2.1.2

There is a newer version of this package available; see the version list below for details.

.NET CLI:
dotnet add package Rapid.Agentic.AI.Framework --version 2.1.2

Package Manager Console (Visual Studio; uses the NuGet module's version of Install-Package):
NuGet\Install-Package Rapid.Agentic.AI.Framework -Version 2.1.2

PackageReference (copy this XML node into the project file to reference the package):
<PackageReference Include="Rapid.Agentic.AI.Framework" Version="2.1.2" />

Central Package Management (CPM), Directory.Packages.props:
<PackageVersion Include="Rapid.Agentic.AI.Framework" Version="2.1.2" />

Central Package Management (CPM), project file:
<PackageReference Include="Rapid.Agentic.AI.Framework" />

Paket CLI:
paket add Rapid.Agentic.AI.Framework --version 2.1.2

F# Interactive / Polyglot Notebooks (#r directive):
#r "nuget: Rapid.Agentic.AI.Framework, 2.1.2"

C# file-based apps (.NET 10 preview 4 and later, #:package directive; place before any lines of code):
#:package Rapid.Agentic.AI.Framework@2.1.2

Cake Addin:
#addin nuget:?package=Rapid.Agentic.AI.Framework&version=2.1.2

Cake Tool:
#tool nuget:?package=Rapid.Agentic.AI.Framework&version=2.1.2

Rapid.Agentic.AI.Framework

Rapid.Agentic.AI.Framework is a lightweight .NET library that simplifies integration with Retrieval-Augmented Generation (RAG) systems and Ollama LLMs. It provides intuitive gateway interfaces for building intelligent applications that can chat, generate content, and ingest domain-specific documents such as PDFs.

Say goodbye to API limits and monthly bills: run a powerful local AI that chats, generates content, and answers questions over your own documents.

✨ Features

  • 🤖 Ollama Integration: Send prompts and receive completions from local or remote Ollama servers.
  • 📄 Document Ingestion: Upload and convert PDFs into vectorized knowledge for retrieval.
  • 🧠 RAG-Powered Answers: Ask questions based on your custom data (uploaded documents).
  • 🧼 Stateful & Stateless Chat: Clearable chat histories to manage conversational memory.

๐Ÿ” RAG Gateway (IRagGateway)

  • Upload and process PDF documents
  • Query domain-specific knowledge with context
  • Generate answers and summaries
  • Statefull and Stateless by default, with optional history reset

🤖 Ollama Gateway (IOllamaGateway)

  • Lightweight prompt-based chat and content generation
  • Stateful and stateless chat, with optional history reset

🧪 Use Cases

  • RAG-enabled customer support systems
  • AI-driven PDF analysis tools
  • Developer tools for querying technical documentation
  • AI assistants embedded in enterprise software

📦 Installation

Install from NuGet.org:

dotnet add package Rapid.Agentic.AI.Framework

Or via Package Manager Console:

Install-Package Rapid.Agentic.AI.Framework

🚀 Getting Started

📦 Prerequisites

  • .NET 6 or later
  • Ollama running locally (default URL http://localhost:11434); a quick reachability check is sketched below
  • A downloaded model (e.g., llama3, llama3.2:1b, etc.)
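A minimal reachability check, independent of the framework, using plain HttpClient against Ollama's root endpoint (the URL assumes the default local server):

using System;
using System.Net.Http;

// Ollama's root endpoint replies with the plain-text message "Ollama is running".
using var http = new HttpClient();
string status = await http.GetStringAsync("http://localhost:11434/");
Console.WriteLine(status);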

🧱 Example Model Names

  • llama3
  • llama3.2:1b
  • codegemma:2b
  • mistral

๐Ÿ“ Quick Recommendation Table for LLM Models in Ollama

Model Speed Quality Context Length Notes
llama3:8b ๐ŸŸข Fast ๐ŸŸข Great 8K tokens Balanced for quality and speed
llama3:70b ๐Ÿ”ด Slow ๐ŸŸข Excellent 8K tokens High quality but very resource intensive
mistral ๐ŸŸข Very Fast ๐ŸŸก Good 4K tokens Lightweight, good for quick answers and small tasks
mixtral ๐ŸŸก Medium ๐ŸŸข Great 32K tokens Good quality, Mixture of Experts model
gemma:7b ๐ŸŸข Fast ๐ŸŸก Decent 8K tokens Smaller, open model from Google
dolphin-mistral ๐ŸŸก Medium ๐ŸŸข Tuned 4K tokens Fine-tuned for chat and reasoning
codellama:7b ๐ŸŸก Medium ๐ŸŸก OK 4K tokens Best for code understanding and generation
phi:2 ๐ŸŸข Very Fast ๐ŸŸก Average 2K tokens Extremely small, best for tiny apps/devices

🧱 Example Embedding Model Names

  • nomic-embed-text → for general-purpose RAG
  • gte-small → if you want faster inference with slightly lower quality
  • bge-base-en → for rich English text documents

๐Ÿ“ Quick Recommendation Table for Embedding Models in Ollama

Model Vector Dim Speed Quality Notes
nomic-embed-text 768 ๐ŸŸข Fast ๐ŸŸข Great Best for most use cases
gte-small 384 ๐ŸŸข Very Fast ๐ŸŸก Good Great for constrained setups
gte-large 1024 ๐ŸŸก Medium ๐ŸŸข Excellent Needs more compute
bge-base-en 768 ๐ŸŸข Fast ๐ŸŸข Great Excellent for English text
all-MiniLM-L6-v2 384 ๐ŸŸข Fast ๐ŸŸก Good Low-resource, reliable choice

Any other model available through Ollama can also be used.

Ensure the model has already been pulled by running:

ollama run llama3.2:1b
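To see which models have already been pulled, you can also query Ollama's /api/tags endpoint; a minimal sketch with plain HttpClient (not part of the framework), assuming the default local server:

using System;
using System.Net.Http;

// GET /api/tags returns JSON describing every locally pulled model.
using var http = new HttpClient();
string json = await http.GetStringAsync("http://localhost:11434/api/tags");
Console.WriteLine(json); // look for your chosen model, e.g. "llama3.2:1b", in the list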

✨ Document Format Support. The following document formats are supported:
  • Microsoft Word (.docx) files
  • HTML files
  • Markdown files
  • PDF files
  • C# solution files (.sln). Along with the .sln file, all .csproj (C# project) files referenced in the solution and all .cs files referenced in those projects are also processed.
  • C# project files (.csproj). Along with the .csproj file, all .cs files referenced in the project are also processed.
  • C# source code files (.cs)
  • .bat files
  • .txt files

All of the above formats can be processed into a RAG (Retrieval-Augmented Generation) vector database using the UploadFileAsync API, as sketched below.
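A minimal sketch of ingesting a few of these formats into the same vector store (the file paths are illustrative; the gateway setup follows the Usage examples below):

var factory = new AgenticClientFactory();
var rag = factory.CreateRagGateway("http://localhost:11434", "llama3", "nomic-embed-text");

// Every supported format goes through the same UploadFileAsync call.
await rag.UploadFileAsync("documents/manual.pdf");       // PDF
await rag.UploadFileAsync("documents/notes.md");         // Markdown
await rag.UploadFileAsync("MyApplication/MyApp.csproj"); // .csproj, pulls in its referenced .cs files

string answer = await rag.ChatAsync("What does MyApp do, according to the manual and the code?");
Console.WriteLine(answer);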


🧪 Usage

🏭 Factory Class

public interface IAgenticClientFactory
{
    IRagGateway CreateRagGateway(string ollamaUrl, string model, string embeddingModel, string dbPath = "rag.db");
    IOllamaGateway CreateOllamaGateway(string ollamaUrl, string model);
}
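For reference, a short sketch of creating both gateways from the factory (the URL, model names, and database path are illustrative; dbPath is optional and defaults to "rag.db"):

var factory = new AgenticClientFactory();

// RAG gateway: needs an LLM model, an embedding model, and optionally a vector DB path.
var rag = factory.CreateRagGateway("http://localhost:11434", "llama3", "nomic-embed-text", dbPath: "knowledge/rag.db");

// Ollama gateway: plain chat and generation, no embedding model or vector store involved.
var ollama = factory.CreateOllamaGateway("http://localhost:11434", "llama3");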

๐Ÿ” 1. RAG Gateway

public interface IRagGateway
{
    Task<string> ChatAsync(string question);
    void ClearChatHistory();
    Task<string> GenerateAsync(string question, bool enableStreaming);
    Task UploadFileAsync(string pdfPath);
}
✨ 1. Content generation example - Summarizing a PDF document

var factory = new AgenticClientFactory();
var rag = factory.CreateRagGateway("http://localhost:11434", "llama3", "nomic-embed-text");

await rag.UploadFileAsync("documents/manual.pdf");

string summary = await rag.GenerateAsync("Summarize the manual.", enableStreaming: false);
Console.WriteLine(summary);
✨ 2. Content generation example - Reviewing C# source code
var factory = new AgenticClientFactory();
var rag = factory.CreateRagGateway("http://localhost:11434", "llama3", "nomic-embed-text");

await rag.UploadFileAsync("documents/MyAlgorithm.cs");

string summary = await rag.GenerateAsync("Review the cs file", enableStreaming: false);
Console.WriteLine(summary);
✨ 3. Content generation example - without uploading a local file
var factory = new AgenticClientFactory();
var rag = factory.CreateRagGateway("http://localhost:11434", "llama3", "nomic-embed-text");

string summary = await rag.GenerateAsync("What are the best tourist places in Bangalore", enableStreaming: false);
Console.WriteLine(summary);
✨ 4. Chat example (Multi-Turn) - Reviewing C# source code and asking for a fix
var factory = new AgenticClientFactory();
var rag = factory.CreateRagGateway("http://localhost:11434", "llama3", "nomic-embed-text");

await rag.UploadFileAsync("MyApplication/MyAlgorithm.cs");

string summary = await rag.ChatAsync("Review the cs file");
Console.WriteLine(summary);

string modifiedSummary = await rag.ChatAsync("Provide me the top review comments with potential fixes");
Console.WriteLine(modifiedSummary);
🔄 5. To Clear Chat History:
rag.ClearChatHistory();
✨ 6. Automatic upload of C# .cs files from a given .csproj or .sln file

Automatically identifies and uploads the files, and generates the vector database for RAG:

  • All .csproj and .cs files from a given .sln (solution) file, and the .sln file itself
  • All .cs files referenced in each .csproj (C# project) file.
// Uploads all .csproj and .cs files belonging to this .sln file, and the .sln file itself
await rag.UploadFileAsync("MyApplication/MyAlgorithm.sln");
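Once the solution has been ingested, the same multi-turn chat API can be used for solution-wide questions (the prompts are illustrative):

// Ask questions that span the whole ingested solution.
string overview = await rag.ChatAsync("Summarize the overall architecture of this solution.");
Console.WriteLine(overview);

string hotspots = await rag.ChatAsync("Which classes look most complex, and why?");
Console.WriteLine(hotspots);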

🤖 2. IOllamaGateway

public interface IOllamaGateway
{
    string ChatAsync(string prompt, string model = "");
    void ClearChatHistory();
    string GenerateAsync(string prompt, string model = "");
}
✨ 1. Chat Usage example with Context (Multi-Turn)

var ollamaUrl = "http://localhost:11434"; // default local Ollama server
var myModel = "llama3";

var factory = new AgenticClientFactory();
var ollama = factory.CreateOllamaGateway(ollamaUrl, myModel);

string response = ollama.ChatAsync("Tell me about India?");
Console.WriteLine(response);

string responseUpdated = ollama.ChatAsync("What are the best tourist places?");
Console.WriteLine(responseUpdated);

ollama.ClearChatHistory();
✨ 2. Content generation Usage example

var factory = new AgenticClientFactory();
var ollama = factory.CreateOllamaGateway(ollamaUrl, myModel); // same ollamaUrl and myModel as in the chat example above

string response = ollama.GenerateAsync("Tell me about India?");
Console.WriteLine(response);
🔄 3. To Clear Chat History:
ollama.ClearChatHistory();

📃 License

The Core Framework is not open-source, but is freely distributed via NuGet.

For commercial use, licensing, or integration support:

📧 aruna.devadiga@gmail.com


🙋 Support & Contributions

This framework is not open source, but it is freely distributed via NuGet. If you encounter issues, bugs, or have suggestions, feel free to reach out to the maintainers or submit feedback via the appropriate support channels.

📧 aruna.devadiga@gmail.com


Product compatible and additional computed target framework versions
.NET: net8.0 is compatible. The following target frameworks were computed: net8.0-android, net8.0-browser, net8.0-ios, net8.0-maccatalyst, net8.0-macos, net8.0-tvos, net8.0-windows, net9.0, net9.0-android, net9.0-browser, net9.0-ios, net9.0-maccatalyst, net9.0-macos, net9.0-tvos, net9.0-windows, net10.0, net10.0-android, net10.0-browser, net10.0-ios, net10.0-maccatalyst, net10.0-macos, net10.0-tvos, net10.0-windows.

NuGet packages

This package is not used by any NuGet packages.

GitHub repositories

This package is not used by any popular GitHub repositories.

| Version | Downloads | Last Updated |
|---------|-----------|--------------|
| 2.1.7   | 294       | 6/12/2025    |
| 2.1.6   | 209       | 6/9/2025     |
| 2.1.4   | 138       | 6/2/2025     |
| 2.1.3   | 148       | 5/29/2025    |
| 2.1.2   | 141       | 5/29/2025    |
| 2.1.1   | 144       | 5/25/2025    |
| 2.1.0   | 147       | 5/25/2025    |
| 2.0.5   | 97        | 5/25/2025    |
| 2.0.2   | 74        | 5/24/2025    |
| 2.0.0   | 118       | 5/23/2025    |
| 1.0.1   | 154       | 5/21/2025    |

New Feature: Extended Document Format Support
- Added support for additional document formats including:
- Microsoft Word (.docx) files
- HTML files
- Markdown files
- PDF files
- C# solution files (.sln). Along with the .sln file, all .csproj (C# project) files referenced in the solution and all .cs files referenced in those projects are also processed.
- C# project files (.csproj). Along with the .csproj file, all .cs files referenced in the project are also processed.
- C# source code files (.cs)
- .bat files
- .txt files
- All these formats can be processed for RAG (Retrieval-Augmented Generation) vector database generation.

Improvements (also check readme file)
- Improved RAG vector database generation
- Improved Sentence Tokenizer
- Improved Context Search
Bug Fixes
- Fixed use of the wrong model when generating string embeddings for storage.

Automatic upload of C# .cs files from given .csproj or .sln files
- Automatically identifies and uploads the files, then generates the vector database for RAG:
- All .csproj and .cs files from a given .sln (solution) file, and the .sln file itself
- All .cs files referenced in each .csproj (C# project) file.