
Prerequisites

You should have completed the Obtain your Cloud connection details quickstart, which gives you the hostname, username, and password used below.

What you’ll build

In this quickstart you’ll use clickhouse-client - the official ClickHouse CLI client - to insert data from a local CSV file into a ClickHouse Cloud service. You’ll install clickhouse-client, prepare a sample dataset, connect to your Cloud service, create a table, and insert data from your local machine. By the end, you’ll know how to use clickhouse-client to load local files into ClickHouse Cloud, a workflow that works with CSV, Parquet, JSON, and many other formats.
Step 1: Install clickhouse-client

clickhouse-client is the official CLI for connecting to ClickHouse; it ships as part of the single clickhouse binary. Install it using the universal installer:
curl https://clickhouse.com/ | sh
This downloads the latest clickhouse binary into your current directory. Verify the installation:
./clickhouse client --version
You should see output showing the ClickHouse version number, confirming clickhouse-client is ready to use.
Step 2: Prepare a sample CSV file

Create a small sample CSV file so this quickstart is self-contained. Run the following in your terminal:
cat <<'EOF' > sample_data.csv
timestamp,event_type,user_id,duration_ms,status
2024-01-15 10:30:00,page_view,1001,120,success
2024-01-15 10:31:15,click,1002,45,success
2024-01-15 10:32:00,page_view,1003,200,success
2024-01-15 10:33:30,purchase,1001,1500,success
2024-01-15 10:34:00,click,1004,60,error
2024-01-15 10:35:45,page_view,1002,95,success
2024-01-15 10:36:10,purchase,1005,2200,success
2024-01-15 10:37:00,click,1003,30,success
2024-01-15 10:38:20,page_view,1004,150,error
2024-01-15 10:39:00,purchase,1002,1800,success
EOF
Step 3: Connect to your Cloud service

The commands below assume you’ve exported CLICKHOUSE_HOST, CLICKHOUSE_USER, and CLICKHOUSE_PASSWORD as environment variables, as described in the Obtain your Cloud connection details quickstart. If you haven’t, replace them with your actual values. Test connectivity by running a simple query against your Cloud service:
./clickhouse client \
  --host $CLICKHOUSE_HOST \
  --port 9440 \
  --user $CLICKHOUSE_USER \
  --password $CLICKHOUSE_PASSWORD \
  --secure \
  -q "SELECT 1"
If the connection succeeds you’ll see 1 printed to the terminal. If you get a connection error, check that your service is awake in the Cloud console and that your hostname and password are correct.
Port 9440 is the secure native protocol port for ClickHouse Cloud. The --secure flag enables TLS encryption. These are required for all Cloud connections.
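If you haven’t exported the environment variables yet, a sketch of what that looks like (the values here are placeholders, not real connection details; take yours from the Cloud console):

```shell
# Replace these placeholder values with your own service's
# connection details from the ClickHouse Cloud console.
export CLICKHOUSE_HOST="your-service.region.clickhouse.cloud"
export CLICKHOUSE_USER="default"
export CLICKHOUSE_PASSWORD="your_password"
```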
Step 4: Create a target table and insert data

First, create a table on your Cloud service to receive the data:
./clickhouse client \
  --host $CLICKHOUSE_HOST \
  --port 9440 \
  --user $CLICKHOUSE_USER \
  --password $CLICKHOUSE_PASSWORD \
  --secure \
  -q "
    CREATE TABLE IF NOT EXISTS events (
      timestamp DateTime,
      event_type LowCardinality(String),
      user_id UInt32,
      duration_ms UInt32,
      status LowCardinality(String)
    )
    ENGINE = MergeTree
    ORDER BY (event_type, timestamp)
  "
Now insert the data from your local CSV file. This is the key step - clickhouse-client reads the file from stdin and streams it to the remote service:
./clickhouse client \
  --host $CLICKHOUSE_HOST \
  --port 9440 \
  --user $CLICKHOUSE_USER \
  --password $CLICKHOUSE_PASSWORD \
  --secure \
  -q "INSERT INTO events FORMAT CSVWithNames" < sample_data.csv
clickhouse-client sends the contents of the CSV file directly to your Cloud service. The CSVWithNames format tells ClickHouse that the first row contains column headers.
Step 5: Verify the data was inserted

Query your Cloud service to confirm the rows arrived:
./clickhouse client \
  --host $CLICKHOUSE_HOST \
  --port 9440 \
  --user $CLICKHOUSE_USER \
  --password $CLICKHOUSE_PASSWORD \
  --secure \
  -q "SELECT count() FROM events"
You should see 10, the number of data rows in the sample CSV. Preview the data:
./clickhouse client \
  --host $CLICKHOUSE_HOST \
  --port 9440 \
  --user $CLICKHOUSE_USER \
  --password $CLICKHOUSE_PASSWORD \
  --secure \
  -q "SELECT * FROM events ORDER BY timestamp LIMIT 5"
This same workflow works with Parquet, TSV, JSON, Avro, ORC, and many other supported formats - just change the FORMAT clause and pipe in the appropriate file.

Next steps

In this quickstart you installed clickhouse-client, connected it to a ClickHouse Cloud service, and used it to stream data from a local CSV file into the Cloud. The same approach works for Parquet, JSON, and dozens of other formats. From here, explore the other quickstarts or go deeper with the reference documentation.