AchieveAi.LmDotnetTools.OpenAIProvider 1.0.2

.NET CLI: dotnet add package AchieveAi.LmDotnetTools.OpenAIProvider --version 1.0.2
Package Manager: NuGet\Install-Package AchieveAi.LmDotnetTools.OpenAIProvider -Version 1.0.2
PackageReference: <PackageReference Include="AchieveAi.LmDotnetTools.OpenAIProvider" Version="1.0.2" />
Central Package Management: <PackageVersion Include="AchieveAi.LmDotnetTools.OpenAIProvider" Version="1.0.2" /> plus <PackageReference Include="AchieveAi.LmDotnetTools.OpenAIProvider" />
Paket: paket add AchieveAi.LmDotnetTools.OpenAIProvider --version 1.0.2
F# Interactive: #r "nuget: AchieveAi.LmDotnetTools.OpenAIProvider, 1.0.2"
File-based apps: #:package AchieveAi.LmDotnetTools.OpenAIProvider@1.0.2
Cake: #addin nuget:?package=AchieveAi.LmDotnetTools.OpenAIProvider&version=1.0.2
Cake tool: #tool nuget:?package=AchieveAi.LmDotnetTools.OpenAIProvider&version=1.0.2
LmDotnet - Large Language Model SDK for .NET
LmDotnet is a comprehensive .NET SDK for working with large language models (LLMs) from multiple providers including OpenAI, Anthropic, and OpenRouter.
Features
- Multi-Provider Support: Unified interface for OpenAI, Anthropic, OpenRouter, and more
- Streaming & Synchronous: Support for both streaming and traditional request/response patterns
- Middleware Pipeline: Extensible middleware for logging, caching, function calls, and usage tracking
- Type Safety: Strongly-typed models and responses
- Performance Optimized: Built for high-throughput production scenarios
- Comprehensive Testing: Extensive test coverage with mocking utilities
Quick Start
Installation
dotnet add package AchieveAi.LmDotnetTools.LmCore
dotnet add package AchieveAi.LmDotnetTools.OpenAIProvider
dotnet add package AchieveAi.LmDotnetTools.AnthropicProvider
Basic Usage
using AchieveAi.LmDotnetTools.LmCore.Agents;
using AchieveAi.LmDotnetTools.LmCore.Messages;
using AchieveAi.LmDotnetTools.OpenAIProvider.Agents;
// Create an agent (openClient is an OpenAI-compatible client configured beforehand)
var agent = new OpenClientAgent("MyAgent", openClient);
// Send a message
var messages = new[] { new TextMessage { Role = Role.User, Text = "Hello!" } };
var response = await agent.GenerateReplyAsync(messages);
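The features above also advertise streaming. The exact LmDotnet streaming signature is not shown in this README, so the sketch below uses a local stand-in for `GenerateReplyStreamingAsync` purely to illustrate the `IAsyncEnumerable`/`await foreach` consumption pattern a streaming agent follows:

```csharp
using System;
using System.Collections.Generic;
using System.Threading.Tasks;

// Stand-in for agent.GenerateReplyStreamingAsync(...) -- the real LmDotnet
// method yields the library's IMessage instances; raw strings are used here
// only to keep the pattern self-contained.
static async IAsyncEnumerable<string> GenerateReplyStreamingAsync(string prompt)
{
    foreach (var chunk in new[] { "Hel", "lo", ", ", "world!" })
    {
        await Task.Yield();   // a real agent yields as tokens arrive
        yield return chunk;
    }
}

var reply = "";
await foreach (var chunk in GenerateReplyStreamingAsync("Hello!"))
{
    Console.Write(chunk);     // render incrementally as chunks arrive
    reply += chunk;           // or accumulate the full reply text
}
Console.WriteLine();
```

With the real agent, the loop body would receive the library's message objects rather than raw strings, but the consumption pattern is the same.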
OpenRouter Usage Tracking
LmDotnet includes comprehensive usage tracking for OpenRouter, providing automatic token and cost monitoring.
Key Features
- ✅ Automatic Integration: Seamlessly activated when using OpenRouter as provider
- ✅ Inline Usage Preferred: Uses usage data directly from API responses when available
- ✅ Intelligent Fallback: Falls back to generation endpoint lookup when needed
- ✅ Performance Optimized: In-memory caching with configurable TTL
- ✅ Zero Configuration: Works out-of-the-box with sensible defaults
- ✅ Comprehensive Logging: Structured logging for monitoring and debugging
Quick Setup
# Environment variables
export ENABLE_USAGE_MIDDLEWARE=true
export OPENROUTER_API_KEY=sk-or-your-api-key-here
export USAGE_CACHE_TTL_SEC=300
// Usage data automatically provided in dedicated UsageMessage
var options = new GenerateReplyOptions { ModelId = "openai/gpt-4" };
var messages = await agent.GenerateReplyAsync(userMessages, options);
// Access usage information from UsageMessage
var usageMessage = messages.OfType<UsageMessage>().LastOrDefault();
if (usageMessage != null)
{
var usage = usageMessage.Usage;
Console.WriteLine($"Tokens: {usage.TotalTokens}, Cost: ${usage.TotalCost:F4}");
}
Comprehensive Documentation
For detailed configuration, troubleshooting, and examples:
📖 Complete OpenRouter Usage Tracking Guide
Project Structure
LmDotnet/
├── src/
│ ├── LmCore/ # Core interfaces and models
│ ├── OpenAIProvider/ # OpenAI and OpenRouter provider
│ ├── AnthropicProvider/ # Anthropic Claude provider
│ ├── LmConfig/ # Configuration and agent factories
│ ├── LmEmbeddings/ # Embedding services
│ └── LmTestUtils/ # Testing utilities
├── tests/ # Comprehensive test suite
└── docs/ # Additional documentation
Supported Providers
Provider | Models | Streaming | Function Calls | Usage Tracking |
---|---|---|---|---|
OpenAI | GPT-3.5, GPT-4, GPT-4 Turbo | ✅ | ✅ | ✅ |
OpenRouter | 100+ models | ✅ | ✅ | ✅ Enhanced |
Anthropic | Claude 3 (Sonnet, Haiku, Opus) | ✅ | ✅ | ✅ |
Custom | Extensible | ✅ | ✅ | 🔧 Configurable |
Advanced Features
Middleware Pipeline
// Custom middleware for logging, caching, etc.
public class CustomMiddleware : IStreamingMiddleware
{
    public async Task<IAsyncEnumerable<IMessage>> InvokeStreamingAsync(
        MiddlewareContext context, IStreamingAgent agent, CancellationToken cancellationToken)
    {
        // Pre-processing: inspect or rewrite context.Messages here
        var stream = await agent.GenerateReplyStreamingAsync(context.Messages, context.Options, cancellationToken);
        return Wrap(stream, cancellationToken);
    }

    private static async IAsyncEnumerable<IMessage> Wrap(
        IAsyncEnumerable<IMessage> stream,
        [EnumeratorCancellation] CancellationToken cancellationToken)
    {
        await foreach (var message in stream.WithCancellation(cancellationToken))
        {
            // Post-processing: per-message logging, filtering, etc.
            yield return message;
        }
    }
}
Function Calling
var functions = new[]
{
new FunctionDefinition
{
Name = "get_weather",
Description = "Get current weather",
Parameters = new { location = new { type = "string" } }
}
};
var options = new GenerateReplyOptions { Functions = functions };
var response = await agent.GenerateReplyAsync(messages, options);
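The README does not show what a function-call reply looks like, so the following is a self-contained sketch of the dispatch step only: take the function name and JSON arguments the model returns and route them to local code. `DispatchFunctionCall` and its return shape are illustrative stand-ins, not LmDotnet APIs; only the `get_weather` name matches the definition above.

```csharp
using System;
using System.Text.Json;

// Dispatch a function call as a model would emit it:
// a function name plus a JSON argument payload.
static string DispatchFunctionCall(string name, string argumentsJson)
{
    using var args = JsonDocument.Parse(argumentsJson);
    return name switch
    {
        // Matches the "get_weather" definition registered above
        "get_weather" => $"Sunny in {args.RootElement.GetProperty("location").GetString()}",
        _ => throw new InvalidOperationException($"Unknown function: {name}")
    };
}

var result = DispatchFunctionCall("get_weather", "{\"location\": \"Paris\"}");
Console.WriteLine(result); // Sunny in Paris
```

The dispatched result would then be sent back to the model as a function-result message so it can produce the final reply.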
Performance Monitoring
Built-in performance tracking and telemetry:
// Performance metrics are collected automatically (performanceTracker comes from your telemetry setup)
var metrics = performanceTracker.GetMetrics();
Console.WriteLine($"Average latency: {metrics.AverageLatency}ms");
Console.WriteLine($"Token throughput: {metrics.TokensPerSecond}");
Configuration
Environment Variables
Variable | Default | Description |
---|---|---|
ENABLE_USAGE_MIDDLEWARE | true | Enable OpenRouter usage tracking |
OPENROUTER_API_KEY | - | OpenRouter API key (required for usage tracking) |
USAGE_CACHE_TTL_SEC | 300 | Usage cache TTL in seconds |
ENABLE_INLINE_USAGE | true | Prefer inline usage over fallback |
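The library reads these variables itself, so no application code is required; the snippet below only illustrates, with hypothetical helper functions, how the defaults in the table behave when a variable is unset.

```csharp
using System;

// Illustrative only: the library performs equivalent parsing internally.
static bool GetBool(string name, bool fallback) =>
    bool.TryParse(Environment.GetEnvironmentVariable(name), out var v) ? v : fallback;

static int GetInt(string name, int fallback) =>
    int.TryParse(Environment.GetEnvironmentVariable(name), out var v) ? v : fallback;

var usageEnabled = GetBool("ENABLE_USAGE_MIDDLEWARE", true);   // default: true
var cacheTtlSec  = GetInt("USAGE_CACHE_TTL_SEC", 300);         // default: 300
Console.WriteLine($"usage middleware: {usageEnabled}, cache TTL: {cacheTtlSec}s");
```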
Dependency Injection
// In Program.cs or Startup.cs
services.AddLmDotnet(configuration);
services.ValidateOpenRouterUsageConfiguration(configuration);
Testing
Comprehensive testing utilities included:
// Mock HTTP responses
var handler = FakeHttpMessageHandler.CreateOpenAIResponseHandler("Hello!");
var httpClient = new HttpClient(handler);
// Mock streaming (SSE) responses
var sseHandler = FakeHttpMessageHandler.CreateSseStreamHandler(events);
// Performance testing
var agent = new TestAgent();
var metrics = await PerformanceTestHelper.MeasureLatency(agent, messages);
Contributing
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
Documentation
- OpenRouter Usage Tracking - Complete usage tracking guide
- Testing Utilities - SSE testing documentation
- Architecture - System architecture and design decisions
License
This project is licensed under the MIT License - see the LICENSE file for details.
Support
- Issues: GitHub Issues
- Discussions: GitHub Discussions
- OpenRouter Support: OpenRouter Help
Built with ❤️ for the .NET community
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 is compatible. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
net8.0
- AchieveAi.LmDotnetTools.LmCore (>= 1.0.2)
- JsonSchema.Net (>= 7.3.4)
- JsonSchema.Net.Generation (>= 5.0.2)
- Microsoft.Extensions.Caching.Memory (>= 9.0.0)
- Microsoft.Extensions.Configuration.Abstractions (>= 9.0.5)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 9.0.5)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.5)
- System.Net.ServerSentEvents (>= 9.0.5)
net9.0
- AchieveAi.LmDotnetTools.LmCore (>= 1.0.2)
- JsonSchema.Net (>= 7.3.4)
- JsonSchema.Net.Generation (>= 5.0.2)
- Microsoft.Extensions.Caching.Memory (>= 9.0.0)
- Microsoft.Extensions.Configuration.Abstractions (>= 9.0.5)
- Microsoft.Extensions.DependencyInjection.Abstractions (>= 9.0.5)
- Microsoft.Extensions.Logging.Abstractions (>= 9.0.5)
- System.Net.ServerSentEvents (>= 9.0.5)
NuGet packages (2)
Showing the top 2 NuGet packages that depend on AchieveAi.LmDotnetTools.OpenAIProvider:
Package | Downloads |
---|---|
AchieveAi.LmDotnetTools.LmConfig: Configuration management library for language models, providing flexible configuration loading and validation. | |
AchieveAi.LmDotnetTools.Misc: Miscellaneous utilities and helper functions for the LmDotnetTools library ecosystem. | |
GitHub repositories
This package is not used by any popular GitHub repositories.
See CHANGELOG.md for detailed release notes.