Ivilson.AI.VllmChatClient
1.3.1
There is a newer version of this package available.
See the version list below for details.
dotnet add package Ivilson.AI.VllmChatClient --version 1.3.1
NuGet\Install-Package Ivilson.AI.VllmChatClient -Version 1.3.1
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="Ivilson.AI.VllmChatClient" Version="1.3.1" />
For projects that support PackageReference, copy this XML node into the project file to reference the package.
<PackageVersion Include="Ivilson.AI.VllmChatClient" Version="1.3.1" />
<PackageReference Include="Ivilson.AI.VllmChatClient" />
For projects that support Central Package Management (CPM), copy this XML node into the solution Directory.Packages.props file to version the package.
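For reference, a minimal sketch of how the two pieces fit together under Central Package Management (MyApp.csproj is a placeholder project name; the property and file names follow the standard CPM conventions):

<!-- Directory.Packages.props (at the solution root) -->
<Project>
  <PropertyGroup>
    <ManagePackageVersionsCentrally>true</ManagePackageVersionsCentrally>
  </PropertyGroup>
  <ItemGroup>
    <PackageVersion Include="Ivilson.AI.VllmChatClient" Version="1.3.1" />
  </ItemGroup>
</Project>

<!-- MyApp.csproj: the version is inherited from Directory.Packages.props -->
<ItemGroup>
  <PackageReference Include="Ivilson.AI.VllmChatClient" />
</ItemGroup>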
paket add Ivilson.AI.VllmChatClient --version 1.3.1
The NuGet Team does not provide support for this client. Please contact its maintainers for support.
#r "nuget: Ivilson.AI.VllmChatClient, 1.3.1"
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
#:package Ivilson.AI.VllmChatClient@1.3.1
#:package directive can be used in C# file-based apps starting in .NET 10 preview 4. Copy this into a .cs file before any lines of code to reference the package.
#addin nuget:?package=Ivilson.AI.VllmChatClient&version=1.3.1
#tool nuget:?package=Ivilson.AI.VllmChatClient&version=1.3.1
C# vLLM chat client
It works with qwen3, qwq32b, gemma3, and deepseek-r1 served by vLLM.
Project on GitHub: https://github.com/iwaitu/vllmchatclient
Recent updates
- Bumped the version and improved the message-handling logic
- Fixed an issue where JSON-formatted output could not be produced (see the sketch after the client list below)
- VllmQwen3ChatClient
- VllmQwqChatClient
- VllmGemmaChatClient
- VllmDeepseekR1ChatClient
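Related to the JSON-output fix above, here is a minimal sketch of requesting JSON-formatted output through the standard Microsoft.Extensions.AI ChatOptions.ResponseFormat property. The endpoint template and served model name are placeholders, and whether the request is honored depends on your vLLM configuration:

using Microsoft.Extensions.AI;

// Placeholder endpoint template and model name; adjust to your deployment.
string apiurl = "http://localhost:8000/{0}/{1}";
IChatClient client = new VllmQwen3ChatClient(apiurl, null, "qwen3");

ChatOptions options = new()
{
    // Ask the model to reply with a JSON object instead of free-form text.
    ResponseFormat = ChatResponseFormat.Json
};

var messages = new List<ChatMessage>
{
    new ChatMessage(ChatRole.User, "Return the city and temperature as JSON.")
};

var response = await client.GetResponseAsync(messages, options);
Console.WriteLine(response); // print the model's reply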
Supports streaming function calls.
vLLM deployment for the qwq or qwen3 models:
docker run -it --gpus all -p 8000:8000 \
-v /models/Qwen3-32B-FP8:/models/Qwen3-32B-FP8 \
--restart always \
-e VLLM_USE_V1=1 \
vllm/vllm-openai:v0.8.5 \
--model /models/Qwen3-32B-FP8 \
--enable-auto-tool-choice \
--tool-call-parser llama3_json \
--trust-remote-code \
--max-model-len 131072 \
--tensor-parallel-size 2 \
--gpu_memory_utilization 0.8 \
--served-model-name "qwen3"
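To point the client at this deployment, the base URL is passed as a format template and the model name must match --served-model-name above. A minimal sketch, assuming the (url, apiKey, modelName) constructor shape shown in the samples further down:

using Microsoft.Extensions.AI;

// {0}/{1} are filled in by the client when it builds the request path.
string apiurl = "http://localhost:8000/{0}/{1}";

// "qwen3" must match --served-model-name from the command above.
IChatClient client = new VllmQwen3ChatClient(apiurl, null, "qwen3");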
vLLM deployment for the gemma3 model:
docker run -it --gpus all -p 8000:8000 \
-v /models/gemma-3-27b-it-FP8-Dynamic:/models/gemma-3-27b-it-FP8-Dynamic \
-v /home/lc/work/gemma3.jinja:/home/lc/work/gemma3.jinja \
-e TZ=Asia/Shanghai \
-e VLLM_USE_V1=1 \
--restart always \
vllm/vllm-openai:v0.8.2 \
--model /models/gemma-3-27b-it-FP8-Dynamic \
--enable-auto-tool-choice \
--tool-call-parser pythonic \
--chat-template /home/lc/work/gemma3.jinja \
--trust-remote-code \
--max-model-len 128000 \
--tensor-parallel-size 2 \
--gpu_memory_utilization 0.8 \
--served-model-name "gemma3"
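The corresponding client for this deployment is VllmGemmaChatClient. A minimal sketch, again assuming the (url, apiKey, modelName) constructor shape from the samples below:

using Microsoft.Extensions.AI;

string apiurl = "http://localhost:8000/{0}/{1}";

// "gemma3" must match --served-model-name from the command above.
IChatClient gemmaClient = new VllmGemmaChatClient(apiurl, null, "gemma3");

var reply = await gemmaClient.GetResponseAsync(
    new List<ChatMessage> { new ChatMessage(ChatRole.User, "Hello!") });
Console.WriteLine(reply);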
Qwen3 model sample
[Description("Gets the weather")]
static string GetWeather() => Random.Shared.NextDouble() > 0.1 ? "It's sunny" : "It's raining";
public async Task StreamChatFunctionCallTest()
{
IChatClient vllmclient = new VllmQwqChatClient(apiurl,null, "qwen3");
IChatClient client = new ChatClientBuilder(vllmclient)
.UseFunctionInvocation()
.Build();
var messages = new List<ChatMessage>
{
new ChatMessage(ChatRole.System ,"你是一个智能助手,名字叫菲菲"),
new ChatMessage(ChatRole.User,"今天天气如何?")
};
Qwen3ChatOptions chatOptions = new()
{
Tools = [AIFunctionFactory.Create(GetWeather)],
NoThinking = true //qwen3 only
};
string res = string.Empty;
await foreach (var update in client.GetStreamingResponseAsync(messages, chatOptions))
{
res += update;
}
Assert.True(res != null);
}
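The same function-invocation pipeline also works without streaming. A sketch continuing from the sample above (client, messages and chatOptions as defined there), using the standard Microsoft.Extensions.AI GetResponseAsync counterpart:

// Non-streaming variant: the GetWeather tool is still invoked automatically
// by the UseFunctionInvocation() pipeline before the final answer is returned.
var response = await client.GetResponseAsync(messages, chatOptions);
Console.WriteLine(response);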
QwQ model sample
using Microsoft.Extensions.AI;
using System.ComponentModel;
string apiurl = "http://localhost:8000/{0}/{1}";
[Description("Gets the weather")]
static string GetWeather() => Random.Shared.NextDouble() > 0.5 ? "It's sunny" : "It's raining";
IChatClient vllmclient = new VllmQwqChatClient(apiurl, null, "qwq");

// Wrap the raw client so the tool below can be invoked automatically.
IChatClient _chatClient = new ChatClientBuilder(vllmclient)
    .UseFunctionInvocation()
    .Build();

ChatOptions chatOptions = new()
{
    Tools = [AIFunctionFactory.Create(GetWeather)],
    NoThink = false
};

var messages = new List<ChatMessage>
{
    new ChatMessage(ChatRole.System, "You are an intelligent assistant named Feifei"),
    new ChatMessage(ChatRole.User, "What's the weather like today?")
};
private async Task<(string answer, string reasoning)> StreamChatResponseAsync(List<ChatMessage> messages, ChatOptions chatOptions)
{
    string answer = string.Empty;
    string reasoning = string.Empty;

    await foreach (var update in _chatClient.GetStreamingResponseAsync(messages, chatOptions))
    {
        var updateText = update.ToString();
        if (update is ReasoningChatResponseUpdate reasoningUpdate)
        {
            if (!reasoningUpdate.Thinking)
            {
                answer += updateText;
            }
            else
            {
                reasoning += updateText;
            }
        }
        else
        {
            answer += updateText;
        }
    }

    return (answer, reasoning);
}
var (answer, reasoning) = await StreamChatResponseAsync(messages, chatOptions);
Deepseek R1 model sample
var messages = new List<ChatMessage>
{
    new ChatMessage(ChatRole.System, "You are an intelligent assistant named Feifei"),
    new ChatMessage(ChatRole.User, "Who are you?")
};

string res = string.Empty;
string think = string.Empty;

// _client is an IChatClient for the Deepseek R1 endpoint, e.g. a VllmDeepseekR1ChatClient.
await foreach (ReasoningChatResponseUpdate update in _client.GetStreamingResponseAsync(messages))
{
    var updateText = update.ToString();
    if (update.Thinking)
    {
        think += updateText;
    }
    else
    {
        res += updateText;
    }
}
Product | Compatible and additional computed target framework versions |
---|---|
.NET | net8.0 is compatible. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. net9.0 was computed. net9.0-android was computed. net9.0-browser was computed. net9.0-ios was computed. net9.0-maccatalyst was computed. net9.0-macos was computed. net9.0-tvos was computed. net9.0-windows was computed. net10.0 was computed. net10.0-android was computed. net10.0-browser was computed. net10.0-ios was computed. net10.0-maccatalyst was computed. net10.0-macos was computed. net10.0-tvos was computed. net10.0-windows was computed. |
Dependencies (net8.0)
- McpDotNet.Extensions.AI (>= 1.1.0.1)
- Microsoft.Extensions.AI.Abstractions (>= 9.3.0-preview.1.25161.3)
- Newtonsoft.Json (>= 13.0.3)
NuGet packages
This package is not used by any NuGet packages.
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last Updated |
---|---|---|
1.4.2 | 185 | 6/23/2025 |
1.4.0 | 134 | 6/23/2025 |
1.3.8 | 135 | 6/21/2025 |
1.3.6 | 70 | 6/21/2025 |
1.3.5 | 87 | 6/20/2025 |
1.3.2 | 62 | 6/20/2025 |
1.3.1 | 62 | 6/20/2025 |
1.3.0 | 343 | 6/3/2025 |
1.2.8 | 123 | 6/3/2025 |
1.2.7 | 80 | 5/31/2025 |
1.2.6 | 56 | 5/31/2025 |
1.2.5 | 50 | 5/31/2025 |
1.2.3 | 57 | 5/30/2025 |
1.2.2 | 116 | 5/24/2025 |
1.2.0 | 293 | 5/13/2025 |
1.1.8 | 224 | 5/13/2025 |
1.1.6 | 173 | 5/12/2025 |
1.1.5 | 102 | 5/2/2025 |
1.1.4 | 116 | 4/29/2025 |
1.1.3 | 107 | 4/29/2025 |
1.1.2 | 111 | 4/29/2025 |
1.1.1 | 112 | 4/29/2025 |
1.1.0 | 145 | 4/23/2025 |
1.0.8 | 143 | 4/23/2025 |
1.0.6 | 129 | 4/18/2025 |