Kinetica C# API  Version 7.1.10.0
kinetica.AlterDatasourceRequest.DatasourceUpdatesMap Struct Reference

Map containing the properties of the data source to be updated.

Public Attributes

const string LOCATION = "location"
 Location of the remote storage in 'storage_provider_type://[storage_path[:storage_port]]' format.

const string USER_NAME = "user_name"
 Name of the remote system user; may be an empty string.

const string PASSWORD = "password"
 Password for the remote system user; may be an empty string.

const string SKIP_VALIDATION = "skip_validation"
 Bypass validation of connection to remote source.

const string TRUE = "true"

const string FALSE = "false"

const string CONNECTION_TIMEOUT = "connection_timeout"
 Timeout in seconds for connecting to this storage provider.

const string WAIT_TIMEOUT = "wait_timeout"
 Timeout in seconds for reading from this storage provider.

const string CREDENTIAL = "credential"
 Name of the credential object to be used in the data source.

const string S3_BUCKET_NAME = "s3_bucket_name"
 Name of the Amazon S3 bucket to use as the data source.

const string S3_REGION = "s3_region"
 Name of the Amazon S3 region where the given bucket is located.

const string S3_AWS_ROLE_ARN = "s3_aws_role_arn"
 Amazon IAM Role ARN that has the required S3 permissions and can be assumed for the given S3 IAM user.

const string S3_ENCRYPTION_CUSTOMER_ALGORITHM = "s3_encryption_customer_algorithm"
 Customer encryption algorithm used for encrypting data.

const string S3_ENCRYPTION_CUSTOMER_KEY = "s3_encryption_customer_key"
 Customer encryption key to encrypt or decrypt data.

const string HDFS_KERBEROS_KEYTAB = "hdfs_kerberos_keytab"
 Kerberos keytab file location for the given HDFS user.

const string HDFS_DELEGATION_TOKEN = "hdfs_delegation_token"
 Delegation token for the given HDFS user.

const string HDFS_USE_KERBEROS = "hdfs_use_kerberos"
 Use Kerberos authentication for the given HDFS cluster. Supported values: TRUE, FALSE. The default value is FALSE.

const string AZURE_STORAGE_ACCOUNT_NAME = "azure_storage_account_name"
 Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified.

const string AZURE_CONTAINER_NAME = "azure_container_name"
 Name of the Azure storage container to use as the data source.

const string AZURE_TENANT_ID = "azure_tenant_id"
 Active Directory tenant ID (or directory ID).

const string AZURE_SAS_TOKEN = "azure_sas_token"
 Shared access signature token for the Azure storage account to use as the data source.

const string AZURE_OAUTH_TOKEN = "azure_oauth_token"
 OAuth token to access the given storage container.

const string GCS_BUCKET_NAME = "gcs_bucket_name"
 Name of the Google Cloud Storage bucket to use as the data source.

const string GCS_PROJECT_ID = "gcs_project_id"
 Name of the Google Cloud project to use as the data source.

const string GCS_SERVICE_ACCOUNT_KEYS = "gcs_service_account_keys"
 Google Cloud service account keys to use for authenticating the data source.

const string KAFKA_URL = "kafka_url"
 The publicly-accessible full path URL to the Kafka broker, e.g., 'http://172.123.45.67:9300'.

const string KAFKA_TOPIC_NAME = "kafka_topic_name"
 Name of the Kafka topic to use as the data source.

const string JDBC_DRIVER_JAR_PATH = "jdbc_driver_jar_path"
 JDBC driver jar file location.

const string JDBC_DRIVER_CLASS_NAME = "jdbc_driver_class_name"
 Name of the JDBC driver class.

const string ANONYMOUS = "anonymous"
 Create an anonymous connection to the storage provider. DEPRECATED: this is now the default.

const string USE_MANAGED_CREDENTIALS = "use_managed_credentials"
 When no credentials are supplied, anonymous access is used by default.

const string USE_HTTPS = "use_https"
 Use HTTPS to connect to the data source if true; otherwise use HTTP. Supported values: TRUE, FALSE. The default value is TRUE.

const string SCHEMA_NAME = "schema_name"
 Updates the schema name.
 

Detailed Description

Map containing the properties of the data source to be updated.

An error is returned if the map is empty.

  • LOCATION: Location of the remote storage in 'storage_provider_type://[storage_path[:storage_port]]' format. Supported storage provider types are 'azure', 'gcs', 'hdfs', 'kafka', and 's3'.
  • USER_NAME: Name of the remote system user; may be an empty string.
  • PASSWORD: Password for the remote system user; may be an empty string.
  • SKIP_VALIDATION: Bypass validation of connection to remote source. Supported values: TRUE, FALSE. The default value is FALSE.
  • CONNECTION_TIMEOUT: Timeout in seconds for connecting to this storage provider.
  • WAIT_TIMEOUT: Timeout in seconds for reading from this storage provider.
  • CREDENTIAL: Name of the credential object to be used in the data source.
  • S3_BUCKET_NAME: Name of the Amazon S3 bucket to use as the data source.
  • S3_REGION: Name of the Amazon S3 region where the given bucket is located.
  • S3_AWS_ROLE_ARN: Amazon IAM Role ARN that has the required S3 permissions and can be assumed for the given S3 IAM user.
  • S3_ENCRYPTION_CUSTOMER_ALGORITHM: Customer encryption algorithm used for encrypting data.
  • S3_ENCRYPTION_CUSTOMER_KEY: Customer encryption key to encrypt or decrypt data.
  • HDFS_KERBEROS_KEYTAB: Kerberos keytab file location for the given HDFS user. This may be a KIFS file.
  • HDFS_DELEGATION_TOKEN: Delegation token for the given HDFS user.
  • HDFS_USE_KERBEROS: Use Kerberos authentication for the given HDFS cluster. Supported values: TRUE, FALSE. The default value is FALSE.
  • AZURE_STORAGE_ACCOUNT_NAME: Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified.
  • AZURE_CONTAINER_NAME: Name of the Azure storage container to use as the data source.
  • AZURE_TENANT_ID: Active Directory tenant ID (or directory ID).
  • AZURE_SAS_TOKEN: Shared access signature token for the Azure storage account to use as the data source.
  • AZURE_OAUTH_TOKEN: OAuth token to access the given storage container.
  • GCS_BUCKET_NAME: Name of the Google Cloud Storage bucket to use as the data source.
  • GCS_PROJECT_ID: Name of the Google Cloud project to use as the data source.
  • GCS_SERVICE_ACCOUNT_KEYS: Google Cloud service account keys to use for authenticating the data source.
  • KAFKA_URL: The publicly-accessible full path URL to the Kafka broker, e.g., 'http://172.123.45.67:9300'.
  • KAFKA_TOPIC_NAME: Name of the Kafka topic to use as the data source.
  • JDBC_DRIVER_JAR_PATH: JDBC driver jar file location. This may be a KIFS file.
  • JDBC_DRIVER_CLASS_NAME: Name of the JDBC driver class.
  • ANONYMOUS: Create an anonymous connection to the storage provider. DEPRECATED: this is now the default; specify use_managed_credentials for a non-anonymous connection. Supported values: TRUE, FALSE. The default value is TRUE.
  • USE_MANAGED_CREDENTIALS: When no credentials are supplied, anonymous access is used by default. If this is set, the cloud provider user settings will be used. Supported values: TRUE, FALSE. The default value is FALSE.
  • USE_HTTPS: Use HTTPS to connect to the data source if true; otherwise use HTTP. Supported values: TRUE, FALSE. The default value is TRUE.
  • SCHEMA_NAME: Updates the schema name. If schema_name doesn't exist, an error will be thrown. If schema_name is empty, then the user's default schema will be used.


A set of string constants for the parameter datasource_updates_map.

Definition at line 292 of file AlterDatasource.cs.
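As a usage illustration, the following is a minimal sketch of altering a data source with this map. It assumes a reachable Kinetica instance and an existing data source; the connection URL, data source name, bucket, and region are placeholder values, and the alterDatasource call is shown with the generated overload that takes the data source name, the updates map, and an options map.

    using System.Collections.Generic;
    using kinetica;

    public static class AlterDatasourceExample
    {
        public static void Main()
        {
            // Connect to the database (placeholder URL).
            Kinetica db = new Kinetica("http://localhost:9191");

            // Build the updates map from the struct's constants;
            // an empty map is an error.
            IDictionary<string, string> updates = new Dictionary<string, string>
            {
                { AlterDatasourceRequest.DatasourceUpdatesMap.S3_BUCKET_NAME, "my-bucket" },
                { AlterDatasourceRequest.DatasourceUpdatesMap.S3_REGION, "us-east-1" },
                { AlterDatasourceRequest.DatasourceUpdatesMap.CONNECTION_TIMEOUT, "30" }
            };

            // Apply the updates to the existing data source "my_s3_datasource"
            // (placeholder name); the last argument is the options map.
            db.alterDatasource("my_s3_datasource", updates,
                               new Dictionary<string, string>());
        }
    }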

Member Data Documentation

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.ANONYMOUS = "anonymous"

Create an anonymous connection to the storage provider. DEPRECATED: this is now the default.

Specify use_managed_credentials for a non-anonymous connection. Supported values: TRUE, FALSE.

The default value is TRUE.

Definition at line 446 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.AZURE_CONTAINER_NAME = "azure_container_name"

Name of the Azure storage container to use as the data source.

Definition at line 390 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.AZURE_OAUTH_TOKEN = "azure_oauth_token"

OAuth token to access the given storage container.

Definition at line 401 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.AZURE_SAS_TOKEN = "azure_sas_token"

Shared access signature token for the Azure storage account to use as the data source.

Definition at line 397 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.AZURE_STORAGE_ACCOUNT_NAME = "azure_storage_account_name"

Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified.

Definition at line 386 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.AZURE_TENANT_ID = "azure_tenant_id"

Active Directory tenant ID (or directory ID).

Definition at line 393 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.CONNECTION_TIMEOUT = "connection_timeout"

Timeout in seconds for connecting to this storage provider.

Definition at line 330 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.CREDENTIAL = "credential"

Name of the credential object to be used in the data source.

Definition at line 339 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.FALSE = "false"

Definition at line 326 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.GCS_BUCKET_NAME = "gcs_bucket_name"

Name of the Google Cloud Storage bucket to use as the data source.

Definition at line 405 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.GCS_PROJECT_ID = "gcs_project_id"

Name of the Google Cloud project to use as the data source.

Definition at line 409 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.GCS_SERVICE_ACCOUNT_KEYS = "gcs_service_account_keys"

Google Cloud service account keys to use for authenticating the data source.

Definition at line 413 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.HDFS_DELEGATION_TOKEN = "hdfs_delegation_token"

Delegation token for the given HDFS user.

Definition at line 366 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.HDFS_KERBEROS_KEYTAB = "hdfs_kerberos_keytab"

Kerberos keytab file location for the given HDFS user.

This may be a KIFS file.

Definition at line 363 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.HDFS_USE_KERBEROS = "hdfs_use_kerberos"

Use Kerberos authentication for the given HDFS cluster. Supported values: TRUE, FALSE.

The default value is FALSE.

Definition at line 382 of file AlterDatasource.cs.
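As an illustration of combining this flag with the keytab property, here is a minimal sketch that enables Kerberos for an HDFS data source; the data source name and keytab path are placeholders, and db is assumed to be a connected Kinetica instance as in the example above.

    IDictionary<string, string> hdfsUpdates = new Dictionary<string, string>
    {
        // Enable Kerberos via the struct's TRUE constant ("true").
        { AlterDatasourceRequest.DatasourceUpdatesMap.HDFS_USE_KERBEROS,
          AlterDatasourceRequest.DatasourceUpdatesMap.TRUE },
        // The keytab may be a KIFS file; this path is a placeholder.
        { AlterDatasourceRequest.DatasourceUpdatesMap.HDFS_KERBEROS_KEYTAB,
          "kifs://security/hdfs.keytab" }
    };
    db.alterDatasource("my_hdfs_datasource", hdfsUpdates,
                       new Dictionary<string, string>());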

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.JDBC_DRIVER_CLASS_NAME = "jdbc_driver_class_name"

Name of the JDBC driver class.

Definition at line 428 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.JDBC_DRIVER_JAR_PATH = "jdbc_driver_jar_path"

JDBC driver jar file location.

This may be a KIFS file.

Definition at line 425 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.KAFKA_TOPIC_NAME = "kafka_topic_name"

Name of the Kafka topic to use as the data source.

Definition at line 421 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.KAFKA_URL = "kafka_url"

The publicly-accessible full path URL to the Kafka broker, e.g., 'http://172.123.45.67:9300'.

Definition at line 417 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.LOCATION = "location"

Location of the remote storage in 'storage_provider_type://[storage_path[:storage_port]]' format.

Supported storage provider types are 'azure', 'gcs', 'hdfs', 'kafka', and 's3'.

Definition at line 300 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.PASSWORD = "password"

Password for the remote system user; may be an empty string.

Definition at line 308 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.S3_AWS_ROLE_ARN = "s3_aws_role_arn"

Amazon IAM Role ARN that has the required S3 permissions and can be assumed for the given S3 IAM user.

Definition at line 351 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.S3_BUCKET_NAME = "s3_bucket_name"

Name of the Amazon S3 bucket to use as the data source.

Definition at line 343 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.S3_ENCRYPTION_CUSTOMER_ALGORITHM = "s3_encryption_customer_algorithm"

Customer encryption algorithm used for encrypting data.

Definition at line 355 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.S3_ENCRYPTION_CUSTOMER_KEY = "s3_encryption_customer_key"

Customer encryption key to encrypt or decrypt data.

Definition at line 359 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.S3_REGION = "s3_region"

Name of the Amazon S3 region where the given bucket is located.

Definition at line 347 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.SCHEMA_NAME = "schema_name"

Updates the schema name.

If schema_name doesn't exist, an error will be thrown. If schema_name is empty, then the user's default schema will be used.

Definition at line 487 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.SKIP_VALIDATION = "skip_validation"

Bypass validation of connection to remote source.

Supported values: TRUE, FALSE.

The default value is FALSE.

Definition at line 324 of file AlterDatasource.cs.
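For example, a minimal sketch (placeholder names, as above) that points a data source at a new location while bypassing the connectivity check:

    IDictionary<string, string> relocate = new Dictionary<string, string>
    {
        // New location in 'storage_provider_type://[storage_path[:storage_port]]'
        // format (placeholder host and port).
        { AlterDatasourceRequest.DatasourceUpdatesMap.LOCATION,
          "hdfs://namenode-host:8020" },
        // Skip validation of the connection to the remote source.
        { AlterDatasourceRequest.DatasourceUpdatesMap.SKIP_VALIDATION,
          AlterDatasourceRequest.DatasourceUpdatesMap.TRUE }
    };
    db.alterDatasource("my_datasource", relocate,
                       new Dictionary<string, string>());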

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.TRUE = "true"

Definition at line 325 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.USE_HTTPS = "use_https"

Use HTTPS to connect to the data source if true; otherwise use HTTP. Supported values: TRUE, FALSE.

The default value is TRUE.

Definition at line 481 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.USE_MANAGED_CREDENTIALS = "use_managed_credentials"

When no credentials are supplied, anonymous access is used by default.

If this is set, the cloud provider user settings will be used. Supported values: TRUE, FALSE.

The default value is FALSE.

Definition at line 464 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.USER_NAME = "user_name"

Name of the remote system user; may be an empty string.

Definition at line 304 of file AlterDatasource.cs.

const string kinetica.AlterDatasourceRequest.DatasourceUpdatesMap.WAIT_TIMEOUT = "wait_timeout"

Timeout in seconds for reading from this storage provider.

Definition at line 334 of file AlterDatasource.cs.


The documentation for this struct was generated from the following file:
  • AlterDatasource.cs