Overview

Exporting data from Oxla to a CSV file can be accomplished using the COPY TO command. This guide outlines the methods for exporting data, including specifying delimiters, handling headers, and controlling how NULL values are represented.

Syntax

The syntax for this command is as follows:

COPY table_name (column_name) TO {'file_path' | STDOUT} WITH (option, …);

Parameters

  • table_name: existing table from which the data will be exported
  • column_name: optional list of columns to export. If omitted, all columns are exported
  • TO: destination for the exported data
    • file_path: path to the CSV file where the data will be written
    • STDOUT: exports the data to the standard output stream
  • option: available options below
    • FORMAT: output format (currently only CSV is supported)
    • DELIMITER: delimiter character, given as a single-quoted string literal (a comma by default)
    • NULL: string used to represent NULL values (e.g. NULL 'unknown')
    • HEADER: boolean value indicating whether to include a header row with column names (values can be TRUE or FALSE, with default set to FALSE)
    • ENDPOINT: object storage credentials (used when exporting to cloud storage)
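To make the effect of the DELIMITER, HEADER and NULL options concrete, here is a small Python sketch (not Oxla code, just an illustration) that writes rows the way the corresponding option values would shape the exported file:

```python
import csv
import io

def write_like_copy(rows, columns, delimiter=",", header=False, null=""):
    """Illustrate how DELIMITER, HEADER and NULL shape the exported file."""
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter=delimiter)
    if header:
        writer.writerow(columns)  # HEADER TRUE adds a column-name row
    for row in rows:
        # The NULL option substitutes a string for missing values
        writer.writerow(null if value is None else value for value in row)
    return buf.getvalue()

print(write_like_copy(
    [("SAINTS BRIDE", 125, None)],
    ["title", "length", "rating"],
    delimiter=";", header=True, null="unknown",
))
# title;length;rating
# SAINTS BRIDE;125;unknown
```

The function name and signature here are invented for illustration; in Oxla the same choices are expressed purely through the WITH (...) options of COPY TO.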

Examples

Exporting Data via STDOUT

This example demonstrates how to export data from the film table to a CSV file named film_export.csv, including a header row and using a semicolon as the delimiter.

1. Creating a table

Ensure that the table exists in your Oxla instance. If the table does not exist, create one using the following command:

CREATE TABLE film (
    title text NOT NULL,
    length int,
    rating text
);

2. Inserting Data

INSERT INTO film(title, length, rating) VALUES
    ('ATTRACTION NEWTON', 83, 'PG-13'),
    ('CHRISTMAS MOONSHINE', 150, 'NC-17'),
    ('DANGEROUS UPTOWN', 121, 'PG'),
    ('KILL BROTHERHOOD', 54, 'G'),
    ('HALLOWEEN NUTS', 47, 'PG-13'),
    ('HOURS RAGE', 122, 'NC-17'),
    ('PIANIST OUTFIELD', 136, 'NC-17'),
    ('PICKUP DRIVING', 77, 'G'),
    ('INDEPENDENCE HOTEL', 157, 'NC-17'),
    ('PRIVATE DROP', 106, 'PG'),
    ('SAINTS BRIDE', 125, 'G'),
    ('FOREVER CANDIDATE', 131, 'NC-17'),
    ('MILLION ACE', 142, 'PG-13'),
    ('SLEEPY JAPANESE', 137, 'PG'),
    ('WRATH MILE', 176, 'NC-17'),
    ('YOUTH KICK', 179, 'NC-17'),
    ('CLOCKWORK PARADISE', 143, 'PG-13');

3. Executing the Export Command

COPY table_name TO 'file_path' WITH (FORMAT CSV, HEADER TRUE, DELIMITER ';');
  • Replace table_name with your target table (e.g. film)
  • Replace file_path with the full path where the data will be written
  • Use HEADER to include a header row with column names

4. Verifying Export

Now let’s check the contents of film_export.csv to ensure the data has been exported successfully:

public.film.title;public.film.length;public.film.rating
"ATTRACTION NEWTON";83;PG-13
"CHRISTMAS MOONSHINE";150;NC-17
"DANGEROUS UPTOWN";121;PG
"KILL BROTHERHOOD";54;G
"HALLOWEEN NUTS";47;PG-13
"HOURS RAGE";122;NC-17
"PIANIST OUTFIELD";136;NC-17
"PICKUP DRIVING";77;G
"INDEPENDENCE HOTEL";157;NC-17
"PRIVATE DROP";106;PG
"SAINTS BRIDE";125;G
"FOREVER CANDIDATE";131;NC-17
"MILLION ACE";142;PG-13
"SLEEPY JAPANESE";137;PG
"WRATH MILE";176;NC-17
"YOUTH KICK";179;NC-17
"CLOCKWORK PARADISE";143;PG-13
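A quick way to sanity-check the export programmatically is to parse it back with a semicolon delimiter. The sketch below (plain Python, unrelated to Oxla itself) reads a couple of lines in the format shown above; note that the header uses fully qualified column names such as public.film.title:

```python
import csv
import io

# A few lines from film_export.csv (semicolon-delimited, header row included)
sample = """public.film.title;public.film.length;public.film.rating
"ATTRACTION NEWTON";83;PG-13
"CHRISTMAS MOONSHINE";150;NC-17
"""

reader = csv.reader(io.StringIO(sample), delimiter=";")
header = next(reader)
rows = [dict(zip(header, row)) for row in reader]

print(rows[0]["public.film.title"])   # ATTRACTION NEWTON
print(rows[1]["public.film.length"])  # 150
```

The csv reader strips the double quotes around the quoted title fields automatically; all values come back as strings, so numeric columns such as length need an explicit int() conversion if required.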

Exporting Data to Cloud Storage

To export data from an Oxla table to object storage, use the COPY TO command with object storage credentials. This lets you transfer data directly from your Oxla instance to cloud storage services such as AWS S3, Google Cloud Storage or Azure Blob Storage.

COPY table_name TO 'cloud_storage_file_path' (object_storage(object_storage_credentials));
  • object_storage: AWS_CRED, AZURE_CRED or GCS_CRED, depending on your provider
  • object_storage_credentials: credentials for accessing your cloud storage

You need to provide provider-specific credentials to authenticate access to your files. Use the following authentication parameters to access your cloud storage:

AWS S3 Bucket

  • aws_region: AWS region associated with the storage service
  • key_id: key identifier for authentication
  • access_key: access key for authentication
  • endpoint_url: URL endpoint for the storage service
COPY table_name TO 's3://your-bucket/file_name' WITH (AWS_CRED(AWS_REGION 'us-west-1', AWS_KEY_ID 'key_id', AWS_PRIVATE_KEY 'access_key', ENDPOINT 's3.us-west-1.amazonaws.com'), FORMAT CSV, HEADER ON, NULL 'unknown');
In the exported file, NULL values will be represented as 'unknown', as specified in the NULL option.

Google Cloud Storage

  • <path_to_credentials>: path to JSON credentials file
  • <json_credentials_string>: contents of the GCS credentials file
COPY table_name TO 'gs://your-bucket/file_name' (GCS_CRED('/path/to/credentials.json'));
For Google Cloud Storage, it’s recommended to use HMAC keys for authentication. You can find more details about that on the HMAC keys - Cloud Storage page.

Azure Blob Storage

  • tenant_id: tenant identifier representing your organization’s identity in Azure
  • client_id: client identifier used for authentication
  • client_secret: secret used as a password for authentication
COPY table_name TO 'wasbs://container-name/your_blob' (AZURE_CRED(TENANT_ID 'your_tenant_id', CLIENT_ID 'your_client_id', CLIENT_SECRET 'your_client_secret'));
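When the credentials come from application configuration, it can be convenient to assemble the COPY statement in client code. The helper below is a hypothetical client-side sketch (not part of Oxla or any driver); it takes the credential clause name (AWS_CRED, GCS_CRED or AZURE_CRED) and a mapping of credential parameters, and produces a statement in the shape shown above:

```python
def build_copy_statement(table, uri, cred_name, creds):
    """Assemble a COPY ... TO statement with object-storage credentials.

    cred_name is AWS_CRED, GCS_CRED or AZURE_CRED; creds maps credential
    parameter names (e.g. TENANT_ID) to their values. This is a plain
    string-building sketch; it does no quoting/escaping of the values.
    """
    cred_args = ", ".join(f"{key} '{value}'" for key, value in creds.items())
    return f"COPY {table} TO '{uri}' ({cred_name}({cred_args}));"

stmt = build_copy_statement(
    "film",
    "wasbs://container-name/your_blob",
    "AZURE_CRED",
    {"TENANT_ID": "your_tenant_id",
     "CLIENT_ID": "your_client_id",
     "CLIENT_SECRET": "your_client_secret"},
)
print(stmt)
```

In real code, keep the secret values out of source control and interpolate them from a secrets manager or environment variables rather than hard-coding them as in this sketch.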