Quick start
using ClickHouse.Driver;
// Create a client (typically as a singleton)
using var client = new ClickHouseClient("Host=my.clickhouse;Protocol=https;Port=8443;Username=user");
// Execute a query
var version = await client.ExecuteScalarAsync("SELECT version()");
Console.WriteLine(version);
Creating a table
await client.ExecuteNonQueryAsync(
    "CREATE TABLE IF NOT EXISTS default.my_table (id Int64, name String) ENGINE = Memory"
);
Inserting data
Parameterized inserts
using ClickHouse.Driver;
using ClickHouse.Driver.ADO.Parameters;
var parameters = new ClickHouseParameterCollection();
parameters.Add("id", 1L);
parameters.Add("name", "Alice");
await client.ExecuteNonQueryAsync(
    "INSERT INTO default.my_table (id, name) VALUES ({id:Int64}, {name:String})",
    parameters
);
Bulk inserts
// Prepare data as IEnumerable<object[]>
var rows = Enumerable.Range(0, 1_000_000)
    .Select(i => new object[] { (long)i, $"value{i}" });
var columns = new[] { "id", "name" };
// Basic insert
long rowsInserted = await client.InsertBinaryAsync("default.my_table", columns, rows);
Console.WriteLine($"Rows inserted: {rowsInserted}");
For large datasets, configure batching and parallelism with InsertOptions:
var options = new InsertOptions
{
    BatchSize = 100_000,        // Rows per batch (default: 100,000)
    MaxDegreeOfParallelism = 4  // Parallel batch uploads (default: 1)
};
Reading data
using ClickHouse.Driver.ADO.Parameters;
var parameters = new ClickHouseParameterCollection();
parameters.Add("max_id", 100L);
using var reader = await client.ExecuteReaderAsync(
    "SELECT * FROM default.my_table WHERE id < {max_id:Int64}",
    parameters
);
while (reader.Read())
{
    Console.WriteLine($"Id: {reader.GetInt64(0)}, Name: {reader.GetString(1)}");
}
Raw streaming
using var result = await client.ExecuteRawResultAsync(
    "SELECT * FROM default.my_table LIMIT 100 FORMAT JSONEachRow"
);
await using var stream = await result.ReadAsStreamAsync();
using var reader = new StreamReader(stream);
var json = await reader.ReadToEndAsync();
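Since JSONEachRow emits one JSON object per line, the stream can also be parsed incrementally with System.Text.Json instead of buffering the whole response. A sketch, assuming the two-column table from the examples above:

```csharp
using System.Text.Json;

using var result = await client.ExecuteRawResultAsync(
    "SELECT * FROM default.my_table LIMIT 100 FORMAT JSONEachRow"
);
await using var stream = await result.ReadAsStreamAsync();
using var reader = new StreamReader(stream);
string? line;
while ((line = await reader.ReadLineAsync()) != null)
{
    using var doc = JsonDocument.Parse(line);
    var root = doc.RootElement;
    // Note: ClickHouse quotes 64-bit integers in JSON formats by default
    // (output_format_json_quote_64bit_integers = 1), so "id" may arrive as a
    // JSON string; printing the JsonElement directly handles either case.
    Console.WriteLine($"Id: {root.GetProperty("id")}, Name: {root.GetProperty("name").GetString()}");
}
```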
Raw stream insert
Insert data directly from a file or memory stream in CSV, JSON, Parquet, or any other input format supported by ClickHouse.
await using var fileStream = File.OpenRead("data.csv");
using var response = await client.InsertRawStreamAsync(
    table: "my_table",
    stream: fileStream,
    format: "CSV",
    columns: ["id", "product", "price"] // Optional: specify columns
);
Sessions
var settings = new ClickHouseClientSettings
{
    Host = "localhost",
    UseSession = true,
    SessionId = "my-session", // Optional -- will be auto-generated if not provided
};
using var client = new ClickHouseClient(settings);
await client.ExecuteNonQueryAsync("CREATE TEMPORARY TABLE temp_ids (id UInt64)");
await client.ExecuteNonQueryAsync("INSERT INTO temp_ids VALUES (1), (2), (3)");
using var reader = await client.ExecuteReaderAsync(
    "SELECT * FROM users WHERE id IN (SELECT id FROM temp_ids)"
);
More examples
For additional practical usage examples, see the examples directory in the GitHub repository.