InfluxDB.Client.Linq
4.18.0-dev.14769
See the version list below for details.
dotnet add package InfluxDB.Client.Linq --version 4.18.0-dev.14769
NuGet\Install-Package InfluxDB.Client.Linq -Version 4.18.0-dev.14769
<PackageReference Include="InfluxDB.Client.Linq" Version="4.18.0-dev.14769" />
paket add InfluxDB.Client.Linq --version 4.18.0-dev.14769
#r "nuget: InfluxDB.Client.Linq, 4.18.0-dev.14769"
// Install InfluxDB.Client.Linq as a Cake Addin
#addin nuget:?package=InfluxDB.Client.Linq&version=4.18.0-dev.14769&prerelease

// Install InfluxDB.Client.Linq as a Cake Tool
#tool nuget:?package=InfluxDB.Client.Linq&version=4.18.0-dev.14769&prerelease
InfluxDB.Client.Linq
The library supports using LINQ expressions to query InfluxDB.
Documentation
This section contains links to the client library documentation.
Usage
- How to start
- Time Series
- Perform Query
- Filtering
- Supported LINQ operators
- Custom LINQ operators
- Domain Converter
- How to debug output Flux Query
- How to filter by Measurement
- Asynchronous Queries
How to start
First, add the library as a dependency for your project:
# For actual version please check: https://www.nuget.org/packages/InfluxDB.Client.Linq/
dotnet add package InfluxDB.Client.Linq --version 1.17.0-dev.linq.17
Next, add an additional using statement to your program:
using InfluxDB.Client.Linq;
The LINQ query depends on QueryApiSync. You can create an instance of QueryApiSync by:
var client = new InfluxDBClient("http://localhost:8086", "my-token");
var queryApi = client.GetQueryApiSync();
In the following examples we assume that the Sensor entity is defined as:
class Sensor
{
[Column("sensor_id", IsTag = true)]
public string SensorId { get; set; }
/// <summary>
/// "production" or "testing"
/// </summary>
[Column("deployment", IsTag = true)]
public string Deployment { get; set; }
/// <summary>
/// Value measured by sensor
/// </summary>
[Column("data")]
public float Value { get; set; }
[Column(IsTimestamp = true)]
public DateTime Timestamp { get; set; }
}
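For the examples below to return data, your bucket needs to contain some sensor points. As a minimal, hedged sketch (not part of the original example; it assumes the 4.x WriteApi of InfluxDB.Client, whose WriteRecord overload takes the record, precision, bucket and organization), the sample data used in the next section could be written like this:

// WritePrecision lives in InfluxDB.Client.Api.Domain
using (var writeApi = client.GetWriteApi())
{
    // Write two of the sample line-protocol records shown in the "Time Series" section below
    writeApi.WriteRecord("sensor,deployment=production,sensor_id=id-1 data=15", WritePrecision.Ns, "my-bucket", "my-org");
    writeApi.WriteRecord("sensor,deployment=testing,sensor_id=id-1 data=28", WritePrecision.Ns, "my-bucket", "my-org");
}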
Time Series
InfluxDB uses the concept of a time series - a collection of data that shares a measurement, tag set, and bucket. When you query data with Flux, you always operate on each time series.
Imagine that you have the following data:
sensor,deployment=production,sensor_id=id-1 data=15
sensor,deployment=testing,sensor_id=id-1 data=28
sensor,deployment=testing,sensor_id=id-1 data=12
sensor,deployment=production,sensor_id=id-1 data=89
The corresponding time series are:
sensor,deployment=production,sensor_id=id-1
sensor,deployment=testing,sensor_id=id-1
If you query your data with the following Flux:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> limit(n:1)
The result will be one item for each time-series:
sensor,deployment=production,sensor_id=id-1 data=15
sensor,deployment=testing,sensor_id=id-1 data=28
and this is also the way this LINQ driver works.
The driver assumes that you are querying over a single time series.
There is a way to change this configuration:
Enable querying multiple time-series
var settings = new QueryableOptimizerSettings{QueryMultipleTimeSeries = true};
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", _queryApi, settings)
select s;
The group() function is the way to query multiple time series and get correct results.
The following query works correctly:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> group()
|> limit(n:1)
and corresponding result:
sensor,deployment=production,sensor_id=id-1 data=15
Do not use this functionality if it is not required, because it brings a performance cost caused by sorting:
Group does not guarantee sort order
The group() function does not guarantee the sort order of output records.
To ensure data is sorted correctly, use an orderby expression.
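If you do enable QueryMultipleTimeSeries, a small sketch combining it with an orderby expression (reusing the names from the examples above) could look like this:

var settings = new QueryableOptimizerSettings { QueryMultipleTimeSeries = true };

var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi, settings)
    orderby s.Timestamp
    select s;

var sensors = query.ToList();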
Client Side Evaluation
The library attempts to evaluate as much of a query as possible on the server. Client-side evaluation is required for aggregation functions when there is more than one time series.
If you want to count your data with the following Flux:
from(bucket: "my-bucket")
|> range(start: 0)
|> drop(columns: ["_start", "_stop", "_measurement"])
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> stateCount(fn: (r) => true, column: "linq_result_column")
|> last(column: "linq_result_column")
|> keep(columns: ["linq_result_column"])
The result will be one count for each time-series:
#group,false,false,false
#datatype,string,long,long
#default,_result,,
,result,table,linq_result_column
,,0,1
,,0,1
and the client has to aggregate these multiple results into one scalar value.
Operators that can cause client-side evaluation (see the sketch after this list):
Count
CountLong
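As a sketch (reusing the Sensor entity and the data from the examples above), a count over a bucket that contains more than one time series could look like this:

var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
    select s;

// The server returns one count per time series; the client then combines
// these partial results into the single scalar value returned by Count().
var count = query.Count();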
TL;DR
Perform Query
The LINQ query requires a bucket and an organization as the source of data. Both can be specified by name or by ID.
var query = (from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.SensorId == "id-1"
where s.Value > 12
where s.Timestamp > new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
where s.Timestamp < new DateTime(2021, 01, 10, 5, 10, 0, DateTimeKind.Utc)
orderby s.Timestamp
select s)
.Take(2)
.Skip(2);
var sensors = query.ToList();
Flux Query:
from(bucket: "my-bucket")
|> range(start: 2019-11-16T08:20:15Z, stop: 2021-01-10T05:10:00Z)
|> filter(fn: (r) => (r["sensor_id"] == "id-1"))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] > 12))
|> limit(n: 2, offset: 2)
Filtering
The range() and filter() are pushdown functions that allow pushing data manipulation down to the underlying data source rather than storing and manipulating data in memory.
Using pushdown functions at the beginning of a query greatly reduces the amount of server memory necessary to run the query.
The LINQ provider needs to align fields that share the same timestamp within each input table into a column-wise format:
From
_time | _value | _measurement | _field |
---|---|---|---|
1970-01-01T00:00:00.000000001Z | 1.0 | "m1" | "f1" |
1970-01-01T00:00:00.000000001Z | 2.0 | "m1" | "f2" |
1970-01-01T00:00:00.000000002Z | 3.0 | "m1" | "f1" |
1970-01-01T00:00:00.000000002Z | 4.0 | "m1" | "f2" |
To
_time | _measurement | f1 | f2 |
---|---|---|---|
1970-01-01T00:00:00.000000001Z | "m1" | 1.0 | 2.0 |
1970-01-01T00:00:00.000000002Z | "m1" | 3.0 | 4.0 |
For that reason we need to use the pivot() function.
The pivot() is heavy and should be used at the end of our Flux query.
There is also a possibility to disable appending the pivot() by:
var optimizerSettings =
new QueryableOptimizerSettings
{
AlignFieldsWithPivot = false
};
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi, optimizerSettings)
select s;
Mapping LINQ filters
For the best performance on both sides - the server and the LINQ provider - the LINQ expressions are mapped to the Flux query in the following way:
Filter by Timestamp
Mapped to range().
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp >= new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
select s;
var sensors = query.ToList();
Flux Query:
from(bucket: "my-bucket")
|> range(start: 2019-11-16T08:20:15Z)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Filter by Tag
Mapped to filter() before the pivot().
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.SensorId == "id-1"
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> filter(fn: (r) => (r["sensor_id"] == "id-1"))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Filter by Field
The filter by field has to be placed after the pivot() because we want to select all fields from the pivoted table.
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value < 28
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] < 28))
If we move the filter() for fields before the pivot(), then we will get wrong results:
Data
m1 f1=1,f2=2 1
m1 f1=3,f2=4 2
Without filter
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Results:
_time | f1 | f2 |
---|---|---|
1970-01-01T00:00:00.000000001Z | 1.0 | 2.0 |
1970-01-01T00:00:00.000000002Z | 3.0 | 4.0 |
Filter before pivot()
filter: f1 > 0
from(bucket: "my-bucket")
|> range(start: 0)
|> filter(fn: (r) => (r["_field"] == "f1" and r["_value"] > 0))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Results:
_time | f1 |
---|---|
1970-01-01T00:00:00.000000001Z | 1.0 |
1970-01-01T00:00:00.000000002Z | 3.0 |
Time Range Filtering
The time filtering expressions are mapped to the Flux range() function.
This function has start and stop parameters with the following behaviour: start <= _time < stop:
Results include records with _time values greater than or equal to the specified start time and less than the specified stop time.
This means that we have to add one nanosecond to start if we want timestamps strictly greater than it, and add one nanosecond to stop if we want timestamps less than or equal to it.
Example 1:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp > new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
where s.Timestamp < new DateTime(2021, 01, 10, 5, 10, 0, DateTimeKind.Utc)
select s;
var sensors = query.ToList();
Flux Query:
start_shifted = int(v: time(v: "2019-11-16T08:20:15Z")) + 1
from(bucket: "my-bucket")
|> range(start: time(v: start_shifted), stop: 2021-01-10T05:10:00Z)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Example 2:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp >= new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
where s.Timestamp <= new DateTime(2021, 01, 10, 5, 10, 0, DateTimeKind.Utc)
select s;
var sensors = query.ToList();
Flux Query:
stop_shifted = int(v: time(v: "2021-01-10T05:10:00Z")) + 1
from(bucket: "my-bucket")
|> range(start: 2019-11-16T08:20:15Z, stop: time(v: stop_shifted))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Example 3:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp >= new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
select s;
var sensors = query.ToList();
Flux Query:
from(bucket: "my-bucket")
|> range(start: 2019-11-16T08:20:15Z)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Example 4:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp <= new DateTime(2021, 01, 10, 5, 10, 0, DateTimeKind.Utc)
select s;
var sensors = query.ToList();
Flux Query:
stop_shifted = int(v: time(v: "2021-01-10T05:10:00Z")) + 1
from(bucket: "my-bucket")
|> range(start: 0, stop: time(v: stop_shifted))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Example 5:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp == new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
select s;
var sensors = query.ToList();
Flux Query:
stop_shifted = int(v: time(v: "2019-11-16T08:20:15Z")) + 1
from(bucket: "my-bucket")
|> range(start: 2019-11-16T08:20:15Z, stop: time(v: stop_shifted))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
There is also a possibility to specify a default value for the start and stop parameters. This is useful when you need to include data with future timestamps and no time bounds are explicitly set.
var settings = new QueryableOptimizerSettings
{
RangeStartValue = DateTime.UtcNow.AddHours(-24),
RangeStopValue = DateTime.UtcNow.AddHours(1)
};
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi, settings)
select s;
TL;DR
Supported LINQ operators
Equal
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.SensorId == "id-1"
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> filter(fn: (r) => (r["sensor_id"] == "id-1"))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Not Equal
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.SensorId != "id-1"
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> filter(fn: (r) => (r["sensor_id"] != "id-1"))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Less Than
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value < 28
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] < 28))
Less Than Or Equal
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value <= 28
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] <= 28))
Greater Than
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value > 28
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] > 28))
Greater Than Or Equal
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value >= 28
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] >= 28))
And
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value >= 28 && s.SensorId != "id-1"
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> filter(fn: (r) => (r["sensor_id"] != "id-1"))
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["data"] >= 28))
Or
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Value >= 28 || s.Value <= 5
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => ((r["data"] >= 28) or (r["data"] <= 5)))
Any
The following code demonstrates how to use the Any operator to determine whether a collection contains any elements.
By default the InfluxDB.Client doesn't support storing a subcollection in your DomainObject.
Imagine that you have the following entities:
class SensorCustom
{
public Guid Id { get; set; }
public float Data { get; set; }
public DateTimeOffset Time { get; set; }
public virtual ICollection<SensorAttribute> Attributes { get; set; }
}
class SensorAttribute
{
public string Name { get; set; }
public string Value { get; set; }
}
To be able to store the SensorCustom entity in InfluxDB and retrieve it from the database, you should implement IDomainObjectMapper.
The converter tells the client how to map a DomainObject into PointData and how to map a FluxRecord to a DomainObject.
Entity Converter:
private class SensorEntityConverter : IDomainObjectMapper
{
//
// Parse incoming FluxRecord to DomainObject
//
public T ConvertToEntity<T>(FluxRecord fluxRecord)
{
if (typeof(T) != typeof(SensorCustom))
{
throw new NotSupportedException($"This converter doesn't support: {typeof(SensorCustom)}");
}
//
// Create SensorCustom entity and parse `SeriesId`, `Value` and `Time`
//
var customEntity = new SensorCustom
{
Id = Guid.Parse(Convert.ToString(fluxRecord.GetValueByKey("series_id"))!),
Data = Convert.ToSingle(fluxRecord.GetValueByKey("data")),
Time = fluxRecord.GetTime().GetValueOrDefault().ToDateTimeUtc(),
Attributes = new List<SensorAttribute>()
};
foreach (var (key, value) in fluxRecord.Values)
{
//
// Parse SubCollection values
//
if (key.StartsWith("property_"))
{
var attribute = new SensorAttribute
{
Name = key.Replace("property_", string.Empty), Value = Convert.ToString(value)
};
customEntity.Attributes.Add(attribute);
}
}
return (T) Convert.ChangeType(customEntity, typeof(T));
}
//
// Convert DomainObject into PointData
//
public PointData ConvertToPointData<T>(T entity, WritePrecision precision)
{
if (!(entity is SensorCustom ce))
{
throw new NotSupportedException($"This converter doesn't support: {typeof(SensorCustom)}");
}
//
// Map `SeriesId`, `Value` and `Time` to Tag, Field and Timestamp
//
var point = PointData
.Measurement("custom_measurement")
.Tag("series_id", ce.Id.ToString())
.Field("data", ce.Data)
.Timestamp(ce.Time, precision);
//
// Map subattributes to Fields
//
foreach (var attribute in ce.Attributes ?? new List<SensorAttribute>())
{
point = point.Field($"property_{attribute.Name}", attribute.Value);
}
return point;
}
}
The converter can be passed to QueryApiSync, QueryApi or WriteApi by:
// Create Converter
var converter = new SensorEntityConverter();
// Get Query and Write API
var queryApi = client.GetQueryApiSync(converter);
var writeApi = client.GetWriteApi(converter);
The LINQ provider needs to know how the properties of the DomainObject are stored in InfluxDB - their names and types (tag, field, timestamp).
If you use an IDomainObjectMapper instead of InfluxDB attributes, you should implement IMemberNameResolver:
private class SensorMemberResolver: IMemberNameResolver
{
//
// Tell the LINQ provider how a property of the DomainObject is mapped - Tag, Field, Timestamp, ...?
//
public MemberType ResolveMemberType(MemberInfo memberInfo)
{
//
// Mapping of subcollection
//
if (memberInfo.DeclaringType == typeof(SensorAttribute))
{
return memberInfo.Name switch
{
"Name" => MemberType.NamedField,
"Value" => MemberType.NamedFieldValue,
_ => MemberType.Field
};
}
//
// Mapping of "root" domain
//
return memberInfo.Name switch
{
"Time" => MemberType.Timestamp,
"Id" => MemberType.Tag,
_ => MemberType.Field
};
}
//
// Tell the LINQ provider how a property of the DomainObject is named
//
public string GetColumnName(MemberInfo memberInfo)
{
return memberInfo.Name switch
{
"Id" => "series_id",
"Data" => "data",
_ => memberInfo.Name
};
}
//
// Tell the LINQ provider how a flattened (named-field) property is named
//
public string GetNamedFieldName(MemberInfo memberInfo, object value)
{
return "attribute_" + Convert.ToString(value);
}
}
Now we are able to provide the required information to the LINQ provider via the memberResolver parameter:
var memberResolver = new SensorMemberResolver();
var query = from s in InfluxDBQueryable<SensorCustom>.Queryable("my-bucket", "my-org", queryApi, memberResolver)
where s.Attributes.Any(a => a.Name == "quality" && a.Value == "good")
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => (r["attribute_quality"] == "good"))
For more info see CustomDomainMappingAndLinq example.
Take
var query = (from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
select s)
.Take(10);
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> limit(n: 10)
Note: the limit() function can be aligned before the pivot() function by:
var optimizerSettings =
new QueryableOptimizerSettings
{
AlignLimitFunctionAfterPivot = false
};
Performance: The pivot() is a “heavy” function. Using limit() before pivot() is much faster, but it works only if you have consistent data series. See #318 for more details.
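For completeness, a hedged usage sketch that passes these settings to the queryable (the same overload used in the earlier examples):

var query = (from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi, optimizerSettings)
    select s)
    .Take(10);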
TakeLast
var query = (from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
select s)
.TakeLast(10);
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> tail(n: 10)
Note: the tail() function can be aligned before the pivot() function by:
var optimizerSettings =
new QueryableOptimizerSettings
{
AlignLimitFunctionAfterPivot = false
};
Performance: The pivot() is a “heavy” function. Using tail() before pivot() is much faster, but it works only if you have consistent data series. See #318 for more details.
Skip
var query = (from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
select s)
.Take(10)
.Skip(50);
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> limit(n: 10, offset: 50)
OrderBy
Example 1:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
orderby s.Deployment
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> sort(columns: ["deployment"], desc: false)
Example 2:
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
orderby s.Timestamp descending
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> sort(columns: ["_time"], desc: true)
Count
Possibility of partial client side evaluation
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
select s;
var sensors = query.Count();
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> stateCount(fn: (r) => true, column: "linq_result_column")
|> last(column: "linq_result_column")
|> keep(columns: ["linq_result_column"])
LongCount
Possibility of partial client side evaluation
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
select s;
var sensors = query.LongCount();
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> stateCount(fn: (r) => true, column: "linq_result_column")
|> last(column: "linq_result_column")
|> keep(columns: ["linq_result_column"])
Contains
float[] values = {15, 28};
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where values.Contains(s.Value)
select s;
var sensors = query.ToList();
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
|> filter(fn: (r) => contains(value: r["data"], set: [15, 28]))
Custom LINQ operators
AggregateWindow
The AggregateWindow operator applies an aggregate function to fixed windows of time.
It can be used only on the property that is defined as the timestamp - [Column(IsTimestamp = true)].
For more info about the aggregateWindow() function see Flux's documentation - https://docs.influxdata.com/flux/v0.x/stdlib/universe/aggregatewindow/.
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
where s.Timestamp.AggregateWindow(TimeSpan.FromSeconds(20), TimeSpan.FromSeconds(40), "mean")
select s;
Flux Query:
from(bucket: "my-bucket")
|> range(start: 0)
|> aggregateWindow(every: 20s, period: 40s, fn: mean)
|> pivot(rowKey:["_time"], columnKey: ["_field"], valueColumn: "_value")
|> drop(columns: ["_start", "_stop", "_measurement"])
Domain Converter
There is also a possibility to use a custom domain converter to transform data from/to your DomainObject.
Instead of the following Influx attributes:
[Measurement("temperature")]
private class Temperature
{
[Column("location", IsTag = true)] public string Location { get; set; }
[Column("value")] public double Value { get; set; }
[Column(IsTimestamp = true)] public DateTime Time { get; set; }
}
you can create your own instance of IDomainObjectMapper and use it with QueryApiSync, QueryApi and WriteApi.
var converter = new DomainEntityConverter();
var queryApi = client.GetQueryApiSync(converter);
To satisfy the LINQ Query Provider you have to implement IMemberNameResolver:
var resolver = new MemberNameResolver();
var query = from s in InfluxDBQueryable<SensorCustom>.Queryable("my-bucket", "my-org", queryApi, resolver)
where s.Attributes.Any(a => a.Name == "quality" && a.Value == "good")
select s;
For more details see the Any operator; for a full example see CustomDomainMappingAndLinq.
How to debug output Flux Query
var query = (from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", _queryApi)
where s.SensorId == "id-1"
where s.Value > 12
where s.Timestamp > new DateTime(2019, 11, 16, 8, 20, 15, DateTimeKind.Utc)
where s.Timestamp < new DateTime(2021, 01, 10, 5, 10, 0, DateTimeKind.Utc)
orderby s.Timestamp
select s)
.Take(2)
.Skip(2);
Console.WriteLine("==== Debug LINQ Queryable Flux output ====");
var influxQuery = ((InfluxDBQueryable<Sensor>) query).ToDebugQuery();
foreach (var statement in influxQuery.Extern.Body)
{
var os = statement as OptionStatement;
var va = os?.Assignment as VariableAssignment;
var name = va?.Id.Name;
var value = va?.Init.GetType().GetProperty("Value")?.GetValue(va.Init, null);
Console.WriteLine($"{name}={value}");
}
Console.WriteLine();
Console.WriteLine(influxQuery._Query);
How to filter by Measurement
By default, as an optimization step, Flux queries generated by LINQ will automatically drop the Start, Stop and Measurement columns:
from(bucket: "my-bucket")
|> range(start: 0)
|> drop(columns: ["_start", "_stop", "_measurement"])
...
This is because typical POCO classes do not include them:
[Measurement("temperature")]
private class Temperature
{
[Column("location", IsTag = true)] public string Location { get; set; }
[Column("value")] public double Value { get; set; }
[Column(IsTimestamp = true)] public DateTime Time { get; set; }
}
It is, however, possible to utilize the Measurement column in LINQ queries by enabling it in the query optimization settings:
var optimizerSettings =
new QueryableOptimizerSettings
{
DropMeasurementColumn = false,
// Note we can also enable the start and stop columns
//DropStartColumn = false,
//DropStopColumn = false
};
var queryable =
new InfluxDBQueryable<InfluxPoint>("my-bucket", "my-org", queryApi, new DefaultMemberNameResolver(), optimizerSettings);
var latest =
await queryable.Where(p => p.Measurement == "temperature")
.OrderByDescending(p => p.Time)
.ToInfluxQueryable()
.GetAsyncEnumerator()
.FirstOrDefaultAsync();
private class InfluxPoint
{
[Column(IsMeasurement = true)] public string Measurement { get; set; }
[Column("location", IsTag = true)] public string Location { get; set; }
[Column("value")] public double Value { get; set; }
[Column(IsTimestamp = true)] public DateTime Time { get; set; }
}
Asynchronous Queries
The LINQ driver also supports asynchronous querying. For asynchronous queries you have to initialize InfluxDBQueryable with the asynchronous version of QueryApi and transform the IQueryable<T> into an IAsyncEnumerable<T>:
var client = new InfluxDBClient("http://localhost:8086", "my-token");
var queryApi = client.GetQueryApi();
var query = from s in InfluxDBQueryable<Sensor>.Queryable("my-bucket", "my-org", queryApi)
select s;
IAsyncEnumerable<Sensor> enumerable = query
.ToInfluxQueryable()
.GetAsyncEnumerator();
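The resulting IAsyncEnumerable<Sensor> can then be consumed with await foreach, for example:

await foreach (var sensor in enumerable)
{
    Console.WriteLine($"{sensor.Timestamp:O} {sensor.SensorId} {sensor.Value}");
}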
Product | Versions (compatible and additional computed target frameworks) |
---|---|
.NET | net5.0 was computed. net5.0-windows was computed. net6.0 was computed. net6.0-android was computed. net6.0-ios was computed. net6.0-maccatalyst was computed. net6.0-macos was computed. net6.0-tvos was computed. net6.0-windows was computed. net7.0 was computed. net7.0-android was computed. net7.0-ios was computed. net7.0-maccatalyst was computed. net7.0-macos was computed. net7.0-tvos was computed. net7.0-windows was computed. net8.0 was computed. net8.0-android was computed. net8.0-browser was computed. net8.0-ios was computed. net8.0-maccatalyst was computed. net8.0-macos was computed. net8.0-tvos was computed. net8.0-windows was computed. |
.NET Core | netcoreapp2.0 was computed. netcoreapp2.1 was computed. netcoreapp2.2 was computed. netcoreapp3.0 was computed. netcoreapp3.1 was computed. |
.NET Standard | netstandard2.0 is compatible. netstandard2.1 is compatible. |
.NET Framework | net461 was computed. net462 was computed. net463 was computed. net47 was computed. net471 was computed. net472 was computed. net48 was computed. net481 was computed. |
MonoAndroid | monoandroid was computed. |
MonoMac | monomac was computed. |
MonoTouch | monotouch was computed. |
Tizen | tizen40 was computed. tizen60 was computed. |
Xamarin.iOS | xamarinios was computed. |
Xamarin.Mac | xamarinmac was computed. |
Xamarin.TVOS | xamarintvos was computed. |
Xamarin.WatchOS | xamarinwatchos was computed. |
.NETStandard 2.0
- InfluxDB.Client (>= 4.18.0-dev.14769)
- Remotion.Linq (>= 2.2.0)

.NETStandard 2.1
- InfluxDB.Client (>= 4.18.0-dev.14769)
- Remotion.Linq (>= 2.2.0)
NuGet packages (4)
Showing the top 4 NuGet packages that depend on InfluxDB.Client.Linq:
Package | Downloads |
---|---|
SpmisNet.Data - Package Description | |
DeerNet.InfluxDb2 - Package Description | |
MicroHeart.InfluxDB - Package Description | |
ToolNET.InfluxDB.SDK - InfluxDB time-series database operations SDK | |
GitHub repositories
This package is not used by any popular GitHub repositories.
Version | Downloads | Last updated |
---|---|---|
4.19.0-dev.15190 | 57 | 12/5/2024 |
4.19.0-dev.15189 | 44 | 12/5/2024 |
4.19.0-dev.15188 | 40 | 12/5/2024 |
4.19.0-dev.15178 | 45 | 12/5/2024 |
4.19.0-dev.15177 | 45 | 12/5/2024 |
4.19.0-dev.14906 | 107 | 10/2/2024 |
4.19.0-dev.14897 | 54 | 10/2/2024 |
4.19.0-dev.14896 | 47 | 10/2/2024 |
4.19.0-dev.14895 | 49 | 10/2/2024 |
4.19.0-dev.14811 | 68 | 9/13/2024 |
4.18.0 | 13,567 | 9/13/2024 |
4.18.0-dev.14769 | 66 | 9/4/2024 |
4.18.0-dev.14743 | 60 | 9/3/2024 |
4.18.0-dev.14694 | 57 | 9/3/2024 |
4.18.0-dev.14693 | 54 | 9/3/2024 |
4.18.0-dev.14692 | 52 | 9/3/2024 |
4.18.0-dev.14618 | 53 | 9/2/2024 |
4.18.0-dev.14609 | 51 | 9/2/2024 |
4.18.0-dev.14592 | 53 | 9/2/2024 |
4.18.0-dev.14446 | 79 | 8/19/2024 |
4.18.0-dev.14414 | 70 | 8/12/2024 |
4.17.0 | 6,530 | 8/12/2024 |
4.17.0-dev.headers.read.1 | 83 | 7/22/2024 |
4.17.0-dev.14350 | 51 | 8/5/2024 |
4.17.0-dev.14333 | 46 | 8/5/2024 |
4.17.0-dev.14300 | 42 | 8/5/2024 |
4.17.0-dev.14291 | 42 | 8/5/2024 |
4.17.0-dev.14189 | 60 | 7/23/2024 |
4.17.0-dev.14179 | 56 | 7/22/2024 |
4.17.0-dev.14101 | 133 | 7/1/2024 |
4.17.0-dev.14100 | 65 | 7/1/2024 |
4.17.0-dev.14044 | 66 | 6/24/2024 |
4.16.0 | 6,931 | 6/24/2024 |
4.16.0-dev.13990 | 68 | 6/3/2024 |
4.16.0-dev.13973 | 59 | 6/3/2024 |
4.16.0-dev.13972 | 58 | 6/3/2024 |
4.16.0-dev.13963 | 66 | 6/3/2024 |
4.16.0-dev.13962 | 62 | 6/3/2024 |
4.16.0-dev.13881 | 64 | 6/3/2024 |
4.16.0-dev.13775 | 77 | 5/17/2024 |
4.16.0-dev.13702 | 68 | 5/17/2024 |
4.15.0 | 2,639 | 5/17/2024 |
4.15.0-dev.13674 | 76 | 5/14/2024 |
4.15.0-dev.13567 | 83 | 4/2/2024 |
4.15.0-dev.13558 | 63 | 4/2/2024 |
4.15.0-dev.13525 | 74 | 4/2/2024 |
4.15.0-dev.13524 | 64 | 4/2/2024 |
4.15.0-dev.13433 | 75 | 3/7/2024 |
4.15.0-dev.13432 | 74 | 3/7/2024 |
4.15.0-dev.13407 | 72 | 3/7/2024 |
4.15.0-dev.13390 | 68 | 3/7/2024 |
4.15.0-dev.13388 | 66 | 3/7/2024 |
4.15.0-dev.13282 | 74 | 3/6/2024 |
4.15.0-dev.13257 | 74 | 3/6/2024 |
4.15.0-dev.13113 | 235 | 2/1/2024 |
4.15.0-dev.13104 | 70 | 2/1/2024 |
4.15.0-dev.13081 | 71 | 2/1/2024 |
4.15.0-dev.13040 | 69 | 2/1/2024 |
4.15.0-dev.13039 | 72 | 2/1/2024 |
4.15.0-dev.12863 | 119 | 1/8/2024 |
4.15.0-dev.12846 | 87 | 1/8/2024 |
4.15.0-dev.12837 | 79 | 1/8/2024 |
4.15.0-dev.12726 | 160 | 12/1/2023 |
4.15.0-dev.12725 | 81 | 12/1/2023 |
4.15.0-dev.12724 | 78 | 12/1/2023 |
4.15.0-dev.12691 | 82 | 12/1/2023 |
4.15.0-dev.12658 | 77 | 12/1/2023 |
4.15.0-dev.12649 | 80 | 12/1/2023 |
4.15.0-dev.12624 | 77 | 12/1/2023 |
4.15.0-dev.12471 | 104 | 11/7/2023 |
4.15.0-dev.12462 | 78 | 11/7/2023 |
4.14.0 | 51,214 | 11/7/2023 |
4.14.0-dev.12437 | 80 | 11/7/2023 |
4.14.0-dev.12343 | 92 | 11/2/2023 |
4.14.0-dev.12310 | 79 | 11/2/2023 |
4.14.0-dev.12284 | 82 | 11/1/2023 |
4.14.0-dev.12235 | 81 | 11/1/2023 |
4.14.0-dev.12226 | 79 | 11/1/2023 |
4.14.0-dev.11972 | 215 | 8/8/2023 |
4.14.0-dev.11915 | 116 | 7/31/2023 |
4.14.0-dev.11879 | 125 | 7/28/2023 |
4.13.0 | 22,062 | 7/28/2023 |
4.13.0-dev.11854 | 97 | 7/28/2023 |
4.13.0-dev.11814 | 109 | 7/21/2023 |
4.13.0-dev.11771 | 100 | 7/19/2023 |
4.13.0-dev.11770 | 108 | 7/19/2023 |
4.13.0-dev.11728 | 96 | 7/18/2023 |
4.13.0-dev.11686 | 97 | 7/17/2023 |
4.13.0-dev.11685 | 93 | 7/17/2023 |
4.13.0-dev.11676 | 111 | 7/17/2023 |
4.13.0-dev.11479 | 96 | 6/27/2023 |
4.13.0-dev.11478 | 98 | 6/27/2023 |
4.13.0-dev.11477 | 102 | 6/27/2023 |
4.13.0-dev.11396 | 103 | 6/19/2023 |
4.13.0-dev.11395 | 88 | 6/19/2023 |
4.13.0-dev.11342 | 99 | 6/15/2023 |
4.13.0-dev.11330 | 108 | 6/12/2023 |
4.13.0-dev.11305 | 101 | 6/12/2023 |
4.13.0-dev.11296 | 101 | 6/12/2023 |
4.13.0-dev.11217 | 104 | 6/6/2023 |
4.13.0-dev.11089 | 94 | 5/30/2023 |
4.13.0-dev.11064 | 101 | 5/30/2023 |
4.13.0-dev.10998 | 98 | 5/29/2023 |
4.13.0-dev.10989 | 101 | 5/29/2023 |
4.13.0-dev.10871 | 104 | 5/8/2023 |
4.13.0-dev.10870 | 86 | 5/8/2023 |
4.13.0-dev.10819 | 114 | 4/28/2023 |
4.12.0 | 13,375 | 4/28/2023 |
4.12.0-dev.10777 | 105 | 4/27/2023 |
4.12.0-dev.10768 | 110 | 4/27/2023 |
4.12.0-dev.10759 | 106 | 4/27/2023 |
4.12.0-dev.10742 | 102 | 4/27/2023 |
4.12.0-dev.10685 | 95 | 4/27/2023 |
4.12.0-dev.10684 | 96 | 4/27/2023 |
4.12.0-dev.10643 | 98 | 4/27/2023 |
4.12.0-dev.10642 | 102 | 4/27/2023 |
4.12.0-dev.10569 | 98 | 4/27/2023 |
4.12.0-dev.10193 | 140 | 2/23/2023 |
4.11.0 | 20,394 | 2/23/2023 |
4.11.0-dev.10176 | 109 | 2/23/2023 |
4.11.0-dev.10059 | 214 | 1/26/2023 |
4.10.0 | 6,423 | 1/26/2023 |
4.10.0-dev.10033 | 129 | 1/25/2023 |
4.10.0-dev.10032 | 129 | 1/25/2023 |
4.10.0-dev.10031 | 126 | 1/25/2023 |
4.10.0-dev.9936 | 2,203 | 12/26/2022 |
4.10.0-dev.9935 | 123 | 12/26/2022 |
4.10.0-dev.9881 | 119 | 12/21/2022 |
4.10.0-dev.9880 | 114 | 12/21/2022 |
4.10.0-dev.9818 | 123 | 12/16/2022 |
4.10.0-dev.9773 | 113 | 12/12/2022 |
4.10.0-dev.9756 | 120 | 12/12/2022 |
4.10.0-dev.9693 | 115 | 12/6/2022 |
4.9.0 | 9,549 | 12/6/2022 |
4.9.0-dev.9684 | 117 | 12/6/2022 |
4.9.0-dev.9666 | 124 | 12/6/2022 |
4.9.0-dev.9617 | 117 | 12/6/2022 |
4.9.0-dev.9478 | 112 | 12/5/2022 |
4.9.0-dev.9469 | 127 | 12/5/2022 |
4.9.0-dev.9444 | 109 | 12/5/2022 |
4.9.0-dev.9411 | 104 | 12/5/2022 |
4.9.0-dev.9350 | 114 | 12/1/2022 |
4.8.0 | 1,598 | 12/1/2022 |
4.8.0-dev.9324 | 116 | 11/30/2022 |
4.8.0-dev.9232 | 120 | 11/28/2022 |
4.8.0-dev.9223 | 116 | 11/28/2022 |
4.8.0-dev.9222 | 124 | 11/28/2022 |
4.8.0-dev.9117 | 129 | 11/21/2022 |
4.8.0-dev.9108 | 114 | 11/21/2022 |
4.8.0-dev.9099 | 126 | 11/21/2022 |
4.8.0-dev.9029 | 116 | 11/16/2022 |
4.8.0-dev.8971 | 120 | 11/15/2022 |
4.8.0-dev.8961 | 126 | 11/14/2022 |
4.8.0-dev.8928 | 124 | 11/14/2022 |
4.8.0-dev.8899 | 130 | 11/14/2022 |
4.8.0-dev.8898 | 122 | 11/14/2022 |
4.8.0-dev.8839 | 136 | 11/14/2022 |
4.8.0-dev.8740 | 112 | 11/7/2022 |
4.8.0-dev.8725 | 117 | 11/7/2022 |
4.8.0-dev.8648 | 116 | 11/3/2022 |
4.7.0 | 24,226 | 11/3/2022 |
4.7.0-dev.8625 | 124 | 11/2/2022 |
4.7.0-dev.8594 | 126 | 10/31/2022 |
4.7.0-dev.8579 | 125 | 10/31/2022 |
4.7.0-dev.8557 | 116 | 10/31/2022 |
4.7.0-dev.8540 | 108 | 10/31/2022 |
4.7.0-dev.8518 | 112 | 10/31/2022 |
4.7.0-dev.8517 | 121 | 10/31/2022 |
4.7.0-dev.8509 | 119 | 10/31/2022 |
4.7.0-dev.8377 | 123 | 10/26/2022 |
4.7.0-dev.8360 | 130 | 10/25/2022 |
4.7.0-dev.8350 | 129 | 10/24/2022 |
4.7.0-dev.8335 | 126 | 10/24/2022 |
4.7.0-dev.8334 | 127 | 10/24/2022 |
4.7.0-dev.8223 | 167 | 10/19/2022 |
4.7.0-dev.8178 | 121 | 10/17/2022 |
4.7.0-dev.8170 | 119 | 10/17/2022 |
4.7.0-dev.8148 | 128 | 10/17/2022 |
4.7.0-dev.8133 | 125 | 10/17/2022 |
4.7.0-dev.8097 | 115 | 10/17/2022 |
4.7.0-dev.8034 | 131 | 10/11/2022 |
4.7.0-dev.8025 | 119 | 10/11/2022 |
4.7.0-dev.8009 | 137 | 10/10/2022 |
4.7.0-dev.8001 | 142 | 10/10/2022 |
4.7.0-dev.7959 | 119 | 10/4/2022 |
4.7.0-dev.7905 | 124 | 9/30/2022 |
4.7.0-dev.7875 | 115 | 9/29/2022 |
4.6.0 | 2,703 | 9/29/2022 |
4.6.0-dev.7832 | 129 | 9/29/2022 |
4.6.0-dev.7817 | 128 | 9/29/2022 |
4.6.0-dev.7779 | 143 | 9/27/2022 |
4.6.0-dev.7778 | 139 | 9/27/2022 |
4.6.0-dev.7734 | 130 | 9/26/2022 |
4.6.0-dev.7733 | 130 | 9/26/2022 |
4.6.0-dev.7677 | 133 | 9/20/2022 |
4.6.0-dev.7650 | 137 | 9/16/2022 |
4.6.0-dev.7626 | 191 | 9/14/2022 |
4.6.0-dev.7618 | 182 | 9/14/2022 |
4.6.0-dev.7574 | 123 | 9/13/2022 |
4.6.0-dev.7572 | 122 | 9/13/2022 |
4.6.0-dev.7528 | 118 | 9/12/2022 |
4.6.0-dev.7502 | 129 | 9/9/2022 |
4.6.0-dev.7479 | 146 | 9/8/2022 |
4.6.0-dev.7471 | 133 | 9/8/2022 |
4.6.0-dev.7447 | 125 | 9/7/2022 |
4.6.0-dev.7425 | 120 | 9/7/2022 |
4.6.0-dev.7395 | 118 | 9/6/2022 |
4.6.0-dev.7344 | 123 | 8/31/2022 |
4.6.0-dev.7329 | 117 | 8/31/2022 |
4.6.0-dev.7292 | 109 | 8/30/2022 |
4.6.0-dev.7240 | 125 | 8/29/2022 |
4.5.0 | 2,507 | 8/29/2022 |
4.5.0-dev.7216 | 121 | 8/27/2022 |
4.5.0-dev.7147 | 127 | 8/22/2022 |
4.5.0-dev.7134 | 126 | 8/17/2022 |
4.5.0-dev.7096 | 133 | 8/15/2022 |
4.5.0-dev.7070 | 137 | 8/11/2022 |
4.5.0-dev.7040 | 157 | 8/10/2022 |
4.5.0-dev.7011 | 135 | 8/3/2022 |
4.5.0-dev.6987 | 138 | 8/1/2022 |
4.5.0-dev.6962 | 141 | 7/29/2022 |
4.4.0 | 14,733 | 7/29/2022 |
4.4.0-dev.6901 | 141 | 7/25/2022 |
4.4.0-dev.6843 | 133 | 7/19/2022 |
4.4.0-dev.6804 | 137 | 7/19/2022 |
4.4.0-dev.6789 | 135 | 7/19/2022 |
4.4.0-dev.6760 | 133 | 7/19/2022 |
4.4.0-dev.6705 | 145 | 7/14/2022 |
4.4.0-dev.6663 | 171 | 6/24/2022 |
4.4.0-dev.6655 | 131 | 6/24/2022 |
4.3.0 | 11,548 | 6/24/2022 |
4.3.0-dev.multiple.buckets3 | 161 | 6/21/2022 |
4.3.0-dev.multiple.buckets2 | 125 | 6/17/2022 |
4.3.0-dev.multiple.buckets1 | 132 | 6/17/2022 |
4.3.0-dev.6631 | 126 | 6/22/2022 |
4.3.0-dev.6623 | 134 | 6/22/2022 |
4.3.0-dev.6374 | 137 | 6/13/2022 |
4.3.0-dev.6286 | 139 | 5/20/2022 |
4.2.0 | 2,413 | 5/20/2022 |
4.2.0-dev.6257 | 141 | 5/13/2022 |
4.2.0-dev.6248 | 138 | 5/12/2022 |
4.2.0-dev.6233 | 143 | 5/12/2022 |
4.2.0-dev.6194 | 140 | 5/10/2022 |
4.2.0-dev.6193 | 134 | 5/10/2022 |
4.2.0-dev.6158 | 2,849 | 5/6/2022 |
4.2.0-dev.6135 | 145 | 5/6/2022 |
4.2.0-dev.6091 | 146 | 4/28/2022 |
4.2.0-dev.6048 | 146 | 4/28/2022 |
4.2.0-dev.6047 | 146 | 4/28/2022 |
4.2.0-dev.5966 | 148 | 4/25/2022 |
4.2.0-dev.5938 | 149 | 4/19/2022 |
4.1.0 | 3,400 | 4/19/2022 |
4.1.0-dev.5910 | 342 | 4/13/2022 |
4.1.0-dev.5888 | 148 | 4/13/2022 |
4.1.0-dev.5887 | 154 | 4/13/2022 |
4.1.0-dev.5794 | 154 | 4/6/2022 |
4.1.0-dev.5725 | 156 | 3/18/2022 |
4.0.0 | 7,896 | 3/18/2022 |
4.0.0-rc3 | 399 | 3/4/2022 |
4.0.0-rc2 | 549 | 2/25/2022 |
4.0.0-rc1 | 210 | 2/18/2022 |
4.0.0-dev.5709 | 148 | 3/18/2022 |
4.0.0-dev.5684 | 158 | 3/15/2022 |
4.0.0-dev.5630 | 158 | 3/4/2022 |
4.0.0-dev.5607 | 150 | 3/3/2022 |
4.0.0-dev.5579 | 155 | 2/25/2022 |
4.0.0-dev.5556 | 158 | 2/24/2022 |
4.0.0-dev.5555 | 146 | 2/24/2022 |
4.0.0-dev.5497 | 144 | 2/23/2022 |
4.0.0-dev.5489 | 155 | 2/23/2022 |
4.0.0-dev.5460 | 151 | 2/23/2022 |
4.0.0-dev.5444 | 145 | 2/22/2022 |
4.0.0-dev.5333 | 150 | 2/17/2022 |
4.0.0-dev.5303 | 145 | 2/16/2022 |
4.0.0-dev.5280 | 157 | 2/16/2022 |
4.0.0-dev.5279 | 158 | 2/16/2022 |
4.0.0-dev.5241 | 252 | 2/15/2022 |
4.0.0-dev.5225 | 148 | 2/15/2022 |
4.0.0-dev.5217 | 153 | 2/15/2022 |
4.0.0-dev.5209 | 144 | 2/15/2022 |
4.0.0-dev.5200 | 148 | 2/14/2022 |
4.0.0-dev.5188 | 148 | 2/10/2022 |
4.0.0-dev.5180 | 147 | 2/10/2022 |
4.0.0-dev.5172 | 150 | 2/10/2022 |
4.0.0-dev.5130 | 144 | 2/10/2022 |
4.0.0-dev.5122 | 150 | 2/9/2022 |
4.0.0-dev.5103 | 157 | 2/9/2022 |
4.0.0-dev.5097 | 156 | 2/9/2022 |
4.0.0-dev.5091 | 151 | 2/9/2022 |
4.0.0-dev.5084 | 151 | 2/8/2022 |
3.4.0-dev.5263 | 159 | 2/15/2022 |
3.4.0-dev.4986 | 151 | 2/7/2022 |
3.4.0-dev.4968 | 167 | 2/4/2022 |
3.3.0 | 8,680 | 2/4/2022 |
3.3.0-dev.4889 | 156 | 2/3/2022 |
3.3.0-dev.4865 | 162 | 2/1/2022 |
3.3.0-dev.4823 | 165 | 1/19/2022 |
3.3.0-dev.4691 | 163 | 1/7/2022 |
3.3.0-dev.4557 | 1,373 | 11/26/2021 |
3.2.0 | 5,901 | 11/26/2021 |
3.2.0-dev.4533 | 4,868 | 11/24/2021 |
3.2.0-dev.4484 | 230 | 11/11/2021 |
3.2.0-dev.4475 | 202 | 11/10/2021 |
3.2.0-dev.4387 | 178 | 10/26/2021 |
3.2.0-dev.4363 | 193 | 10/22/2021 |
3.2.0-dev.4356 | 191 | 10/22/2021 |
3.1.0 | 1,795 | 10/22/2021 |
3.1.0-dev.4303 | 195 | 10/18/2021 |
3.1.0-dev.4293 | 197 | 10/15/2021 |
3.1.0-dev.4286 | 174 | 10/15/2021 |
3.1.0-dev.4240 | 213 | 10/12/2021 |
3.1.0-dev.4202 | 170 | 10/11/2021 |
3.1.0-dev.4183 | 213 | 10/11/2021 |
3.1.0-dev.4131 | 182 | 10/8/2021 |
3.1.0-dev.3999 | 189 | 10/5/2021 |
3.1.0-dev.3841 | 267 | 9/29/2021 |
3.1.0-dev.3798 | 188 | 9/17/2021 |
3.0.0 | 1,208 | 9/17/2021 |
3.0.0-dev.3726 | 530 | 8/31/2021 |
3.0.0-dev.3719 | 175 | 8/31/2021 |
3.0.0-dev.3671 | 189 | 8/20/2021 |
2.2.0-dev.3652 | 183 | 8/20/2021 |
2.1.0 | 1,562 | 8/20/2021 |
2.1.0-dev.3605 | 188 | 8/17/2021 |
2.1.0-dev.3584 | 189 | 8/16/2021 |
2.1.0-dev.3558 | 179 | 8/16/2021 |
2.1.0-dev.3527 | 226 | 7/29/2021 |
2.1.0-dev.3519 | 227 | 7/29/2021 |
2.1.0-dev.3490 | 178 | 7/20/2021 |
2.1.0-dev.3445 | 201 | 7/12/2021 |
2.1.0-dev.3434 | 236 | 7/9/2021 |
2.0.0 | 9,022 | 7/9/2021 |
2.0.0-dev.3401 | 217 | 6/25/2021 |
2.0.0-dev.3368 | 202 | 6/23/2021 |
2.0.0-dev.3361 | 213 | 6/23/2021 |
2.0.0-dev.3330 | 209 | 6/17/2021 |
2.0.0-dev.3291 | 212 | 6/16/2021 |
1.20.0-dev.3218 | 229 | 6/4/2021 |
1.19.0 | 934 | 6/4/2021 |
1.19.0-dev.3204 | 199 | 6/3/2021 |
1.19.0-dev.3160 | 183 | 6/2/2021 |
1.19.0-dev.3159 | 179 | 6/2/2021 |
1.19.0-dev.3084 | 840 | 5/7/2021 |
1.19.0-dev.3051 | 204 | 5/5/2021 |
1.19.0-dev.3044 | 201 | 5/5/2021 |
1.19.0-dev.3008 | 195 | 4/30/2021 |
1.18.0 | 1,243 | 4/30/2021 |
1.18.0-dev.2973 | 214 | 4/27/2021 |
1.18.0-dev.2930 | 194 | 4/16/2021 |
1.18.0-dev.2919 | 191 | 4/13/2021 |
1.18.0-dev.2893 | 177 | 4/12/2021 |
1.18.0-dev.2880 | 196 | 4/12/2021 |
1.18.0-dev.2856 | 190 | 4/7/2021 |
1.18.0-dev.2830 | 286 | 4/1/2021 |
1.18.0-dev.2816 | 192 | 4/1/2021 |
1.17.0 | 771 | 4/1/2021 |
1.17.0-dev.linq.17 | 805 | 3/18/2021 |
1.17.0-dev.linq.16 | 184 | 3/16/2021 |
1.17.0-dev.linq.15 | 218 | 3/15/2021 |
1.17.0-dev.linq.14 | 222 | 3/12/2021 |
1.17.0-dev.linq.13 | 251 | 3/11/2021 |
1.17.0-dev.linq.12 | 202 | 3/10/2021 |
1.17.0-dev.linq.11 | 197 | 3/8/2021 |
1.17.0-dev.2776 | 221 | 3/26/2021 |
1.17.0-dev.2713 | 234 | 3/25/2021 |
1.16.0-dev.linq.10 | 1,239 | 2/4/2021 |
1.15.0-dev.linq.9 | 218 | 2/4/2021 |
1.15.0-dev.linq.8 | 191 | 1/28/2021 |
1.15.0-dev.linq.7 | 208 | 1/27/2021 |
1.15.0-dev.linq.6 | 225 | 1/20/2021 |
1.15.0-dev.linq.5 | 244 | 1/19/2021 |
1.15.0-dev.linq.4 | 207 | 1/15/2021 |
1.15.0-dev.linq.3 | 183 | 1/14/2021 |
1.15.0-dev.linq.2 | 199 | 1/13/2021 |
1.15.0-dev.linq.1 | 223 | 1/12/2021 |