Sora.Ai.Provider.Ollama 0.4.0

.NET CLI
dotnet add package Sora.Ai.Provider.Ollama --version 0.4.0

Package Manager
NuGet\Install-Package Sora.Ai.Provider.Ollama -Version 0.4.0
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.

PackageReference
<PackageReference Include="Sora.Ai.Provider.Ollama" Version="0.4.0" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.

Central Package Management (CPM)
Directory.Packages.props:
<PackageVersion Include="Sora.Ai.Provider.Ollama" Version="0.4.0" />
Project file:
<PackageReference Include="Sora.Ai.Provider.Ollama" />
For projects that support Central Package Management (CPM), copy the PackageVersion node into the solution's Directory.Packages.props file to version the package, and the versionless PackageReference into the project file.

Paket CLI
paket add Sora.Ai.Provider.Ollama --version 0.4.0

Script & Interactive
#r "nuget: Sora.Ai.Provider.Ollama, 0.4.0"
The #r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or the source code of a script to reference the package.

File-based apps
#:package Sora.Ai.Provider.Ollama@0.4.0
The #:package directive can be used in C# file-based apps starting with .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.

Cake
#addin nuget:?package=Sora.Ai.Provider.Ollama&version=0.4.0
Install as a Cake Addin.

#tool nuget:?package=Sora.Ai.Provider.Ollama&version=0.4.0
Install as a Cake Tool.

Sylin.Sora.Ai.Provider.Ollama

Ollama AI provider for Sora: local LLM chat, streaming, and embeddings via an Ollama endpoint.

  • Target framework: net9.0
  • License: Apache-2.0

Install

dotnet add package Sylin.Sora.Ai.Provider.Ollama
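
The examples below assume a local Ollama daemon with the model already pulled. A quick sanity check (a sketch; assumes Ollama's default port 11434 and the ollama CLI on PATH):

```shell
# Check that the Ollama daemon answers on its default port, then fetch
# the model used in the examples below.
if curl -sf http://localhost:11434/api/tags >/dev/null; then
  echo "Ollama daemon reachable"
  ollama pull qwen3:4b
else
  echo "Ollama daemon not reachable on :11434 (start it with 'ollama serve')"
fi
```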

Minimal setup

Register Sora + the Ollama provider in a typical ASP.NET Core Program.cs:

// using Sora.AI; using Sora.Ai.Provider.Ollama; using Sora.AI.Web;
var builder = WebApplication.CreateBuilder(args);
builder.Services
    .AddSora()
    .AddAi()
    .AddOllama(o =>
    {
        o.BaseAddress = new Uri("http://localhost:11434"); // Ollama's default endpoint
        o.DefaultModel = "qwen3:4b"; // or any locally pulled model/tag
    })
    .AddAiWeb(); // optional HTTP endpoints under /ai
var app = builder.Build();
app.MapControllers();
app.Run();

Then query the default engine:

using Sora.AI;
var res = await Engine.Prompt("Explain quantum entanglement briefly.");
Console.WriteLine(res.Text);

Or via HTTP (if AddAiWeb is enabled):

POST /ai/chat { "model": "qwen3:4b", "messages": [{ "role": "user", "content": "Explain quantum entanglement briefly." }] }
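
The same call with curl (a sketch; http://localhost:5000 is an assumed Kestrel default, so substitute your app's actual host and port):

```shell
# Request body for POST /ai/chat, matching the shape above
BODY='{"model":"qwen3:4b","messages":[{"role":"user","content":"Explain quantum entanglement briefly."}]}'

# Sanity-check that the payload is valid JSON, then post it to the running app
echo "$BODY" | python3 -m json.tool
# curl -s -X POST http://localhost:5000/ai/chat -H 'Content-Type: application/json' -d "$BODY"
```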

Using reasoning (think)

Set the optional think flag in prompt options. The Ollama adapter emits a top-level think field in its JSON payload for models that support it (e.g., Qwen3/R1/DeepSeek-V3.1 quants).

using Sora.AI;
using Sora.AI.Contracts.Models;

var res = await Engine.Prompt(
    "Briefly explain quantum entanglement.",
    model: "qwen3:4b",
    opts: new AiPromptOptions { Think = true }
);
Console.WriteLine(res.Text);

HTTP example:

POST /ai/chat { "model": "qwen3:4b", "messages": [{ "role": "user", "content": "Explain quantum entanglement briefly." }], "options": { "think": true } }

Vendor-specific options (passthrough)

Unknown fields posted under options are forwarded verbatim to Ollama’s options bag in the request. Use this for provider-specific parameters such as mirostat, repeat_penalty, and so on. Known Sora fields still map to their Ollama equivalents (temperature, top_p, num_predict, stop). If keys overlap, the value you post wins.

HTTP example:

POST /ai/chat { "model": "qwen3:4b", "messages": [{ "role": "user", "content": "Summarize Bell’s theorem in one sentence." }], "options": { "temperature": 0.6, "topP": 0.95, "maxOutputTokens": 128, "mirostat": 2, "mirostat_tau": 5.0, "repeat_penalty": 1.1 } }
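
Given the request above, the payload forwarded to Ollama would look roughly like this (a sketch of the mapping: topP and maxOutputTokens translate to Ollama's top_p and num_predict, and the vendor-specific keys pass through unchanged):

```json
{
  "model": "qwen3:4b",
  "messages": [ { "role": "user", "content": "Summarize Bell’s theorem in one sentence." } ],
  "options": {
    "temperature": 0.6,
    "top_p": 0.95,
    "num_predict": 128,
    "mirostat": 2,
    "mirostat_tau": 5.0,
    "repeat_penalty": 1.1
  }
}
```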

Programmatic example:

using Sora.AI;
using Sora.AI.Contracts.Models;

var opts = new AiPromptOptions
{
    Temperature = 0.6,
    TopP = 0.95,
    MaxOutputTokens = 128,
    // Vendor-specific params (mirostat, repeat_penalty, ...) are passed
    // through the JSON options bag in HTTP calls, as shown above.
};
var res = await Engine.Prompt("Summarize Bell’s theorem in one sentence.", "qwen3:4b", opts);

Notes

  • The Web API uses messages[] instead of a top-level prompt.
  • Embeddings are supported via POST /ai/embeddings with model and input (array of strings).
  • Streaming is supported via POST /ai/chat/stream with the same body as /ai/chat.
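
The embeddings and streaming endpoints can be exercised the same way (a sketch; host and port are assumptions, as above):

```shell
# Request body for POST /ai/embeddings: a model plus an array of input strings
BODY='{"model":"qwen3:4b","input":["What is entanglement?","What is superposition?"]}'
echo "$BODY" | python3 -m json.tool
# curl -s -X POST http://localhost:5000/ai/embeddings -H 'Content-Type: application/json' -d "$BODY"

# Streaming takes the same body as /ai/chat; -N disables curl's output buffering:
# curl -N -X POST http://localhost:5000/ai/chat/stream -H 'Content-Type: application/json' \
#   -d '{"model":"qwen3:4b","messages":[{"role":"user","content":"hi"}]}'
```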
