Overview
When running COPY TO queries, you have the option to include the endpoint URL. This is especially useful when you need to provide credentials and a specific endpoint for the target storage service.

Syntax
The syntax for using the COPY TO statement is as follows:
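The original code block did not survive extraction, so the following is a minimal sketch of the statement shape. The WITH wrapper and the parameter order inside AWS_CRED are assumptions reconstructed from the parameter lists below; check your version's reference for the exact form:

```sql
COPY table_name TO 'file_path'
WITH (
    AWS_CRED('aws_region', 'key_id', 'access_key', 'endpoint_url')
);
```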
Replace AWS_CRED with AZURE_CRED or GCS_CRED when copying to Azure Blob Storage or Google Cloud Storage.
- Shared parameters:
  - table_name: the table containing the data to be exported
  - file_path: the CSV file location accessible from the server
- Parameters in AWS_CRED:
  - aws_region: AWS region associated with the storage service (e.g. 'region1')
  - key_id: key identifier used for authentication
  - access_key: access key used for authentication
  - endpoint_url: URL endpoint for the storage service
- Parameters in GCS_CRED:
  - <path_to_credentials>: path to the JSON credentials file
  - <json_credentials_string>: contents of the GCS credentials file
- Parameters in AZURE_CRED:
  - tenant_id: tenant identifier representing your organization's identity in Azure
  - client_id: client identifier used for authentication
  - client_secret: secret identifier acting as a password when authenticating
Examples
COPY TO with AWS S3 Bucket
In this example, we use the COPY TO statement to export data from the students table to a CSV file named students_file.
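A sketch of that statement, reusing the syntax shape from above; the bucket path, region, and credential values are placeholders:

```sql
-- Export the students table to a CSV file in an S3 bucket.
COPY students TO 's3://your-bucket/students_file.csv'
WITH (
    AWS_CRED(
        'us-east-1',                          -- aws_region
        'YOUR_KEY_ID',                        -- key_id
        'YOUR_ACCESS_KEY',                    -- access_key
        'https://s3.us-east-1.amazonaws.com'  -- endpoint_url
    )
);
```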
The students table data is copied to students_file on AWS S3.
COPY TO with Google Cloud Storage
This example shows how to use the COPY TO statement to export data, but this time the data is stored on Google Cloud Storage.
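A sketch assuming GCS_CRED accepts the path to the credentials file, with a placeholder bucket and path:

```sql
-- Export the project table to a CSV file in a GCS bucket,
-- authenticating with a service-account credentials file.
COPY project TO 'gs://your-bucket/project_file.csv'
WITH (
    GCS_CRED('/path/to/credentials.json')  -- <path_to_credentials>
);
```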
Instead of specifying the path to the credentials.json file, you can also pass its contents as a string in the following way:
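A sketch with an abbreviated service-account JSON; in practice the string must contain the full contents of the credentials file:

```sql
-- Pass the credentials file contents inline instead of a path.
COPY project TO 'gs://your-bucket/project_file.csv'
WITH (
    GCS_CRED('{"type": "service_account", "project_id": "your-project", "private_key": "...", "client_email": "sa@your-project.iam.gserviceaccount.com"}')
);
```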
Make sure that the string is in valid JSON format.
Because Google Cloud Storage offers S3-compatible interoperability, you can also use AWS_CRED like below:
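A sketch assuming the S3-compatible GCS endpoint with HMAC keys; the region value, key names, and endpoint are placeholders for whatever your setup uses:

```sql
-- Use the S3-compatible interface of GCS with HMAC credentials.
COPY project TO 's3://your-bucket/project_file.csv'
WITH (
    AWS_CRED(
        'auto',                           -- aws_region
        'YOUR_GCS_HMAC_KEY_ID',           -- key_id
        'YOUR_GCS_HMAC_SECRET',           -- access_key
        'https://storage.googleapis.com'  -- endpoint_url
    )
);
```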
The project table is copied to project_file on Google Cloud Storage.
COPY TO with Azure Blob Storage
It's a similar story for storing data in Azure Blob Storage: the sketch below copies the taxi_data table to your_blob on Azure Blob Storage.
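A sketch assuming AZURE_CRED takes the three parameters listed above, with a placeholder container path:

```sql
-- Export the taxi_data table to a blob in an Azure container.
COPY taxi_data TO 'azure://your-container/your_blob.csv'
WITH (
    AZURE_CRED(
        'YOUR_TENANT_ID',     -- tenant_id
        'YOUR_CLIENT_ID',     -- client_id
        'YOUR_CLIENT_SECRET'  -- client_secret
    )
);
```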