InstructSharp 1.0.1
InstructSharp
InstructSharp is a high‑performance, provider‑agnostic .NET SDK that turns large‑language‑model requests into one‑line calls and structured JSON responses. ✨
Seamlessly swap between OpenAI ChatGPT, Anthropic Claude, Google Gemini, X.AI Grok, DeepSeek, or Meta LLaMA without rewriting a single line of business logic.
TL;DR – Install the package, define a POCO, call `QueryAsync<T>()`, get strongly‑typed results. ✅
📑 Table of Contents
- Key Features
- Quick Install
- Hello, World
- Provider Matrix
- Advanced Usage
- Performance Notes
- Roadmap
- Contributing
- License
Key Features
🚀 Feature | Description |
---|---|
Multi‑Provider | One unified client for ChatGPT, Claude, Gemini, Grok, DeepSeek, LLaMA – more coming. |
Strong Typing | Pass any C# POCO → receive an `LLMResponse<T>` with fully‑deserialized data. |
Consistent API | Every client exposes `QueryAsync<T>(request)`, so swapping vendors is a one‑line change. |
JSON Schema Enforcement | Automatic schema generation via NJsonSchema keeps responses strict & safe. |
Minimal Setup | Install → add API key → ship. Works in console apps, ASP.NET, Azure Functions, Blazor & more. |
Full .NET 8 Support | Targets net8.0 but runs on .NET 6/7 via multi‑target NuGet build. |
Tiny Footprint | No runtime reflection, no heavy AI SDKs pulled in. Pure HTTP + `System.Text.Json`. |
Quick Install
```shell
# Package Manager
Install-Package InstructSharp

# .NET CLI
dotnet add package InstructSharp
```

⚡ Tip – add `--prerelease` to grab nightly builds from CI.
Hello, World
```csharp
using InstructSharp.Clients.ChatGPT;
using InstructSharp.Core;

var chat = new ChatGPTClient("YOUR_OPENAI_API_KEY");

var req = new ChatGPTRequest
{
    Model = ChatGPTModels.GPT4oMini,
    Instruction = "Talk like a pirate.",
    Input = "What is 2 + 2?"
};

var res = await chat.QueryAsync<QuestionAnswer>(req);
Console.WriteLine($"A: {res.Result.Answer}");

// In a top-level-statement program, type declarations go after the statements.
class QuestionAnswer
{
    public string Question { get; set; }
    public string Answer { get; set; }
}
```
Want raw text? Simply use `string` instead of a POCO:

```csharp
var text = await chat.QueryAsync<string>(req);
```
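The one‑line vendor swap can be sketched like this. The `ClaudeClient` class name comes from the provider matrix below; the `ClaudeRequest` type, its namespace, and its property names are assumptions made by analogy with the ChatGPT example, so check the actual request type for your provider:

```csharp
// Before (OpenAI):
//   var chat = new ChatGPTClient("YOUR_OPENAI_API_KEY");
//   var req  = new ChatGPTRequest { ... };

// After (Anthropic) – only the client and request types change;
// QueryAsync<T>() and the QuestionAnswer POCO stay exactly the same.
using InstructSharp.Clients.Claude;   // namespace assumed by analogy with Clients.ChatGPT

var chat = new ClaudeClient("YOUR_ANTHROPIC_API_KEY");
var req = new ClaudeRequest          // property names assumed to mirror ChatGPTRequest
{
    Instruction = "Talk like a pirate.",
    Input = "What is 2 + 2?"
};
var res = await chat.QueryAsync<QuestionAnswer>(req);
```

The business logic that consumes `res.Result` never needs to know which vendor answered.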
Provider Matrix
Provider | Client Class | Structured JSON | Streaming | Docs |
---|---|---|---|---|
OpenAI ChatGPT | `ChatGPTClient` | ✅ JSON Schema | ⏳ (roadmap) | link |
Anthropic Claude 3 | `ClaudeClient` | ✅ Tool Calls | ⏳ | link |
Google Gemini 2.5 | `GeminiClient` | ✅ responseJsonSchema | ⏳ | link |
X.AI Grok 3 | `GrokClient` | ✅ JSON Schema | ⏳ | link |
DeepSeek Chat | `DeepSeekClient` | ✅ JSON Object | ⏳ | link |
Meta LLaMA (DeepInfra) | `LLamaClient` | ✅ JSON Object | ⏳ | link |
Streaming support is on the roadmap – follow the issues to vote.
Advanced Usage
🔒 Secure Configuration
```csharp
var http = new HttpClient {
    Timeout = TimeSpan.FromSeconds(15)
};
var chat = new ChatGPTClient(Environment.GetEnvironmentVariable("OPENAI_KEY"), http);
```

- `HttpClient` injection lets you share retry policies, logging handlers, or proxies.
- Add standard headers globally via `DefaultRequestHeaders`.
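A sketch of both bullet points together – the header name below is illustrative, not one the SDK requires:

```csharp
var http = new HttpClient(new SocketsHttpHandler
{
    // Recycle pooled connections so long-lived clients pick up DNS changes.
    PooledConnectionLifetime = TimeSpan.FromMinutes(5)
})
{
    Timeout = TimeSpan.FromSeconds(15)
};

// Headers added here ride along on every request this client sends.
http.DefaultRequestHeaders.Add("X-Request-Source", "my-app");  // illustrative header

var chat = new ChatGPTClient(Environment.GetEnvironmentVariable("OPENAI_KEY"), http);
```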
🎛️ Tuning Parameters
Every request type exposes vendor‑specific knobs such as `Temperature`, `TopP`, `MaxTokens`, etc. Set only what you need – defaults are sane.
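For example, a low‑temperature, length‑capped request might look like this. The knob names come from the paragraph above; their exact casing and availability on `ChatGPTRequest` is an assumption:

```csharp
var req = new ChatGPTRequest
{
    Model = ChatGPTModels.GPT4oMini,
    Instruction = "Summarize in one sentence.",
    Input = "InstructSharp turns LLM requests into one-line calls.",
    Temperature = 0.2,   // low randomness for more deterministic output
    MaxTokens = 256      // cap response length (and cost); anything unset keeps its default
};
```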
🗂️ Error Handling
```csharp
using System.Net; // for HttpStatusCode

try
{
    var res = await chat.QueryAsync<MyType>(req);
}
catch (HttpRequestException ex) when (ex.StatusCode == HttpStatusCode.TooManyRequests)
{
    // handle 429 rate limit
}
```
📉 Token Usage
`LLMResponse<T>.Usage` returns prompt, response & total tokens. Use it for cost tracking or throttling.
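A cost‑tracking sketch built on that property – note that the individual field names (`InputTokens`, `OutputTokens`) and the per‑token prices below are assumptions, not values taken from the SDK or any vendor price list:

```csharp
var res = await chat.QueryAsync<QuestionAnswer>(req);

// Field names are illustrative; inspect LLMResponse<T>.Usage for the real ones.
var usage = res.Usage;
decimal inputCost  = usage.InputTokens  * 0.15m / 1_000_000;  // example $/MTok rate
decimal outputCost = usage.OutputTokens * 0.60m / 1_000_000;  // example $/MTok rate
Console.WriteLine($"Call cost ≈ ${inputCost + outputCost:F6}");
```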
Performance Notes
- No dynamic – all JSON is parsed with `System.Text.Json`. Fast.
- Schema cache – generated JSON schemas are cached per‑type to avoid regeneration.
- One HTTP round‑trip – no second prompt to "format JSON"; the schema is sent in the first call.

Benchmarks live under `/benchmark` – PRs welcome! 🏎️💨
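The per‑type caching described above can be sketched with NJsonSchema directly. `JsonSchema.FromType<T>()` is standard NJsonSchema API (the library named in the feature table), but whether InstructSharp caches in exactly this way is an assumption:

```csharp
using System;
using System.Collections.Concurrent;
using NJsonSchema;

// First call for a given T generates the schema; later calls are dictionary lookups.
string schemaJson = SchemaCache.For<QuestionAnswer>();
Console.WriteLine(schemaJson);

// A minimal per-type schema cache, assuming NJsonSchema's JsonSchema.FromType<T>().
static class SchemaCache
{
    private static readonly ConcurrentDictionary<Type, string> Cache = new();

    public static string For<T>() =>
        Cache.GetOrAdd(typeof(T), _ => JsonSchema.FromType<T>().ToJson());
}

class QuestionAnswer
{
    public string Question { get; set; }
    public string Answer { get; set; }
}
```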
Roadmap
- 🔄 Streaming completions
- 🧩 Function & tool call helpers
- 🏗️ Automatic retries / exponential back‑off
- 📝 DocFX site with full API reference
- 🏆 Benchmarks vs raw vendor SDKs
Have a feature in mind? Open an issue or send a PR!
Contributing
- Fork the repo
- `git clone` & `dotnet build` – tests should pass
- Create your branch: `git checkout -b feature/my-awesome`
- Commit & push, then open a PR
Dev Environment
- .NET 8 SDK
- Optional: `direnv` / `dotenv` for API keys
- EditorConfig + Roslyn analyzers enforce style
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net9.0 and net10.0 were computed, along with the android, browser, ios, maccatalyst, macos, tvos, and windows variants of net8.0, net9.0, and net10.0. |
Dependencies (net8.0)
- Newtonsoft.Json (>= 13.0.3)
- NJsonSchema (>= 11.3.2)
- NJsonSchema.NewtonsoftJson (>= 11.3.2)