public static final class AlterDatasourceRequest.DatasourceUpdatesMap extends Object
LOCATION
: Location of the remote storage in
'storage_provider_type://[storage_path[:storage_port]]' format.
Supported storage provider types are 'azure', 'gcs', 'hdfs', 'kafka', and 's3'.
USER_NAME
: Name of the remote system user; may be an empty string
PASSWORD
: Password for the remote system user; may be an empty string
SKIP_VALIDATION
: Bypass validation of connection to remote source.
Supported values: TRUE, FALSE.
The default value is FALSE.
CONNECTION_TIMEOUT
: Timeout in seconds for connecting to this storage
provider
WAIT_TIMEOUT
: Timeout in seconds for reading from this storage provider
CREDENTIAL
: Name of the credential object to be used in data source
S3_BUCKET_NAME
: Name of the Amazon S3 bucket to use as the data source
S3_REGION
: Name of the Amazon S3 region where the given bucket is
located
S3_AWS_ROLE_ARN
: Amazon IAM Role ARN which has required S3 permissions
that can be assumed for the given S3 IAM user
S3_ENCRYPTION_CUSTOMER_ALGORITHM
: Customer encryption algorithm used for encrypting data
S3_ENCRYPTION_CUSTOMER_KEY
: Customer encryption key to encrypt or
decrypt data
HDFS_KERBEROS_KEYTAB
: Kerberos keytab file location for the given HDFS
user. This may be a KIFS file.
HDFS_DELEGATION_TOKEN
: Delegation token for the given HDFS user
HDFS_USE_KERBEROS
: Use Kerberos authentication for the given HDFS cluster.
Supported values: TRUE, FALSE.
The default value is FALSE.
AZURE_STORAGE_ACCOUNT_NAME
: Name of the Azure storage account to use as the data source; this is
valid only if tenant_id is specified
AZURE_CONTAINER_NAME
: Name of the Azure storage container to use as the
data source
AZURE_TENANT_ID
: Active Directory tenant ID (or directory ID)
AZURE_SAS_TOKEN
: Shared access signature token for Azure storage
account to use as the data source
AZURE_OAUTH_TOKEN
: OAuth token to access given storage container
GCS_BUCKET_NAME
: Name of the Google Cloud Storage bucket to use as the
data source
GCS_PROJECT_ID
: Name of the Google Cloud project to use as the data
source
GCS_SERVICE_ACCOUNT_KEYS
: Google Cloud service account keys to use for
authenticating the data source
KAFKA_URL
: The publicly-accessible full path URL to the Kafka broker,
e.g., 'http://172.123.45.67:9300'.
KAFKA_TOPIC_NAME
: Name of the Kafka topic to use as the data source
JDBC_DRIVER_JAR_PATH
: JDBC driver jar file location. This may be a
KIFS file.
JDBC_DRIVER_CLASS_NAME
: Name of the JDBC driver class
ANONYMOUS
: Create an anonymous connection to the storage provider. DEPRECATED:
this is now the default. Specify use_managed_credentials for a
non-anonymous connection.
Supported values: TRUE, FALSE.
The default value is TRUE.
USE_MANAGED_CREDENTIALS
: When no credentials are supplied, we use
anonymous access by default. If this is set, we will use cloud provider
user settings.
Supported values: TRUE, FALSE.
The default value is FALSE.
USE_HTTPS
: Use HTTPS to connect to the data source if true; otherwise use HTTP.
Supported values: TRUE, FALSE.
The default value is TRUE.
SCHEMA_NAME
: Updates the schema name. If schema_name doesn't exist, an error will
be thrown. If schema_name is empty, then the user's default schema will
be used.
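Taken together, these keys are supplied as string entries in the datasourceUpdatesMap passed to an alter-datasource call. A minimal sketch of assembling such a map is below; the lowercase key literals and the example values are illustrative assumptions. In real code, prefer the `AlterDatasourceRequest.DatasourceUpdatesMap` constants over literals and verify the exact key values against the installed API jar:

```java
import java.util.HashMap;
import java.util.Map;

public class AlterDatasourceExample {
    // Builds a hypothetical updates map for an S3-backed data source.
    // Key strings here mirror (but are not taken from) the
    // DatasourceUpdatesMap constants described above.
    public static Map<String, String> buildUpdates() {
        Map<String, String> updates = new HashMap<>();
        // 'storage_provider_type://[storage_path[:storage_port]]' format
        updates.put("location", "s3://my-bucket-path");
        updates.put("s3_bucket_name", "my-bucket");        // example bucket
        updates.put("s3_region", "us-east-1");             // example region
        updates.put("skip_validation", "false");           // default is FALSE
        updates.put("use_https", "true");                  // default is TRUE
        return updates;
    }

    public static void main(String[] args) {
        Map<String, String> updates = buildUpdates();
        // The map would then be passed to the alter-datasource endpoint.
        System.out.println(updates.size());
    }
}
```

The resulting map would typically be handed to the GPUdb client's alter-datasource method along with the data source name; check the client's method signature before use.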
| Modifier and Type | Field and Description |
|---|---|
| static String | ANONYMOUS: Create an anonymous connection to the storage provider. DEPRECATED: this is now the default. |
| static String | AZURE_CONTAINER_NAME: Name of the Azure storage container to use as the data source |
| static String | AZURE_OAUTH_TOKEN: OAuth token to access given storage container |
| static String | AZURE_SAS_TOKEN: Shared access signature token for Azure storage account to use as the data source |
| static String | AZURE_STORAGE_ACCOUNT_NAME: Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified |
| static String | AZURE_TENANT_ID: Active Directory tenant ID (or directory ID) |
| static String | CONNECTION_TIMEOUT: Timeout in seconds for connecting to this storage provider |
| static String | CREDENTIAL: Name of the credential object to be used in data source |
| static String | GCS_BUCKET_NAME: Name of the Google Cloud Storage bucket to use as the data source |
| static String | GCS_PROJECT_ID: Name of the Google Cloud project to use as the data source |
| static String | GCS_SERVICE_ACCOUNT_KEYS: Google Cloud service account keys to use for authenticating the data source |
| static String | HDFS_DELEGATION_TOKEN: Delegation token for the given HDFS user |
| static String | HDFS_KERBEROS_KEYTAB: Kerberos keytab file location for the given HDFS user. |
| static String | HDFS_USE_KERBEROS: Use Kerberos authentication for the given HDFS cluster |
| static String | JDBC_DRIVER_CLASS_NAME: Name of the JDBC driver class |
| static String | JDBC_DRIVER_JAR_PATH: JDBC driver jar file location. |
| static String | KAFKA_TOPIC_NAME: Name of the Kafka topic to use as the data source |
| static String | KAFKA_URL: The publicly-accessible full path URL to the Kafka broker, e.g., 'http://172.123.45.67:9300'. |
| static String | LOCATION: Location of the remote storage in 'storage_provider_type://[storage_path[:storage_port]]' format. |
| static String | PASSWORD: Password for the remote system user; may be an empty string |
| static String | S3_AWS_ROLE_ARN: Amazon IAM Role ARN which has required S3 permissions that can be assumed for the given S3 IAM user |
| static String | S3_BUCKET_NAME: Name of the Amazon S3 bucket to use as the data source |
| static String | S3_ENCRYPTION_CUSTOMER_ALGORITHM: Customer encryption algorithm used for encrypting data |
| static String | S3_ENCRYPTION_CUSTOMER_KEY: Customer encryption key to encrypt or decrypt data |
| static String | S3_REGION: Name of the Amazon S3 region where the given bucket is located |
| static String | SCHEMA_NAME: Updates the schema name. |
| static String | SKIP_VALIDATION: Bypass validation of connection to remote source. |
| static String | TRUE |
| static String | USE_HTTPS: Use HTTPS to connect to the data source if true; otherwise use HTTP |
| static String | USE_MANAGED_CREDENTIALS: When no credentials are supplied, we use anonymous access by default. |
| static String | USER_NAME: Name of the remote system user; may be an empty string |
| static String | WAIT_TIMEOUT: Timeout in seconds for reading from this storage provider |
public static final String LOCATION
Supported storage provider types are 'azure', 'gcs', 'hdfs', 'kafka', and 's3'.
public static final String USER_NAME
public static final String PASSWORD
public static final String SKIP_VALIDATION
The default value is FALSE.
public static final String TRUE
public static final String FALSE
public static final String CONNECTION_TIMEOUT
public static final String WAIT_TIMEOUT
public static final String CREDENTIAL
public static final String S3_BUCKET_NAME
public static final String S3_REGION
public static final String S3_AWS_ROLE_ARN
public static final String S3_ENCRYPTION_CUSTOMER_ALGORITHM
public static final String S3_ENCRYPTION_CUSTOMER_KEY
public static final String HDFS_KERBEROS_KEYTAB
public static final String HDFS_DELEGATION_TOKEN
public static final String HDFS_USE_KERBEROS
The default value is FALSE.
public static final String AZURE_STORAGE_ACCOUNT_NAME
public static final String AZURE_CONTAINER_NAME
public static final String AZURE_TENANT_ID
public static final String AZURE_SAS_TOKEN
public static final String AZURE_OAUTH_TOKEN
public static final String GCS_BUCKET_NAME
public static final String GCS_PROJECT_ID
public static final String GCS_SERVICE_ACCOUNT_KEYS
public static final String KAFKA_URL
public static final String KAFKA_TOPIC_NAME
public static final String JDBC_DRIVER_JAR_PATH
public static final String JDBC_DRIVER_CLASS_NAME
public static final String ANONYMOUS
The default value is TRUE.
public static final String USE_MANAGED_CREDENTIALS
The default value is FALSE.
public static final String USE_HTTPS
The default value is TRUE.
public static final String SCHEMA_NAME
Updates the schema name. If schema_name doesn't exist, an error will be
thrown. If schema_name is empty, then the user's default schema will be
used.