This engine provides an integration with existing Delta Lake tables in S3, GCP and Azure storage and supports both reads and writes (from v25.10).
Create a DeltaLake table
To create a DeltaLake table, the Delta Lake table must already exist in S3, GCP or Azure storage; the commands below do not accept DDL parameters to create a new one.

- S3
- GCP
- Azure
Syntax

Engine parameters

- url — Bucket URL with the path to the existing Delta Lake table.
- aws_access_key_id, aws_secret_access_key — Long-term credentials for the AWS account user, used to authenticate your requests. Optional; if credentials are not specified, they are taken from the configuration file.
- extra_credentials — Optional. Used to pass a role_arn for role-based access in ClickHouse Cloud. See Secure S3 for configuration steps.

Using named collections:
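The original code samples for this section are not present here; a minimal sketch of what the engine clause looks like, with a placeholder table name, bucket URL, and credentials (the named collection `deltalake_conf` and the `filename` key are likewise assumptions):

```sql
-- Direct parameters (placeholder URL and credentials):
CREATE TABLE deltalake
ENGINE = DeltaLake('http://mybucket.s3.amazonaws.com/clickhouse/', 'ABC123', 'Abc+123');

-- Using a named collection declared in the server configuration:
CREATE TABLE deltalake
ENGINE = DeltaLake(deltalake_conf, filename = 'test_table');
```

With a named collection, the bucket URL and credentials live in the server configuration, so the DDL only references the collection name.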
Write data using a DeltaLake table
Once you have created a table using the DeltaLake table engine, you can insert data into it with an INSERT statement. Writing through the table engine is supported only via the delta kernel.

Writes to Azure are not yet supported, but they work for S3 and GCS.
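As a sketch, assuming a table named `deltalake` created as above with two columns (the name and schema are placeholders, and the underlying Delta Lake table must live in S3 or GCS, since Azure writes are not yet supported):

```sql
-- Insert rows through the table engine (handled by the delta kernel):
INSERT INTO deltalake VALUES (1, 'first'), (2, 'second');
```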
Data cache
The DeltaLake table engine and table function support data caching, the same as the S3, AzureBlobStorage, and HDFS storages. See "S3 table engine" for more details.
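A sketch of enabling the cache on a query, assuming a filesystem cache named `cache_for_s3` has been declared in the server configuration, as described for the S3 table engine (the cache name and table name are placeholders):

```sql
-- Read through the filesystem cache; subsequent reads of the same
-- data are served from the local cache instead of object storage:
SELECT count()
FROM deltalake
SETTINGS filesystem_cache_name = 'cache_for_s3', enable_filesystem_cache = 1;
```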