Kinetica C# API
Version 7.1.10.0
kinetica.CreateDatasourceRequest.Options Class Reference
Optional parameters.
Public Attributes

const string SKIP_VALIDATION = "skip_validation"
    Bypass validation of connection to remote source.
const string TRUE = "true"
const string FALSE = "false"
const string CONNECTION_TIMEOUT = "connection_timeout"
    Timeout in seconds for connecting to this storage provider.
const string WAIT_TIMEOUT = "wait_timeout"
    Timeout in seconds for reading from this storage provider.
const string CREDENTIAL = "credential"
    Name of the credential object to be used in data source.
const string S3_BUCKET_NAME = "s3_bucket_name"
    Name of the Amazon S3 bucket to use as the data source.
const string S3_REGION = "s3_region"
    Name of the Amazon S3 region where the given bucket is located.
const string S3_VERIFY_SSL = "s3_verify_ssl"
    Set to false for testing purposes or when necessary to bypass TLS errors (e.g. self-signed certificates).
const string S3_USE_VIRTUAL_ADDRESSING = "s3_use_virtual_addressing"
    Whether to use virtual addressing when referencing the Amazon S3 source.
const string S3_AWS_ROLE_ARN = "s3_aws_role_arn"
    Amazon IAM Role ARN which has required S3 permissions that can be assumed for the given S3 IAM user.
const string S3_ENCRYPTION_CUSTOMER_ALGORITHM = "s3_encryption_customer_algorithm"
    Customer encryption algorithm used to encrypt data.
const string S3_ENCRYPTION_CUSTOMER_KEY = "s3_encryption_customer_key"
    Customer encryption key to encrypt or decrypt data.
const string HDFS_KERBEROS_KEYTAB = "hdfs_kerberos_keytab"
    Kerberos keytab file location for the given HDFS user.
const string HDFS_DELEGATION_TOKEN = "hdfs_delegation_token"
    Delegation token for the given HDFS user.
const string HDFS_USE_KERBEROS = "hdfs_use_kerberos"
    Use Kerberos authentication for the given HDFS cluster.
const string AZURE_STORAGE_ACCOUNT_NAME = "azure_storage_account_name"
    Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified.
const string AZURE_CONTAINER_NAME = "azure_container_name"
    Name of the Azure storage container to use as the data source.
const string AZURE_TENANT_ID = "azure_tenant_id"
    Active Directory tenant ID (or directory ID).
const string AZURE_SAS_TOKEN = "azure_sas_token"
    Shared access signature token for the Azure storage account to use as the data source.
const string AZURE_OAUTH_TOKEN = "azure_oauth_token"
    OAuth token to access the given storage container.
const string GCS_BUCKET_NAME = "gcs_bucket_name"
    Name of the Google Cloud Storage bucket to use as the data source.
const string GCS_PROJECT_ID = "gcs_project_id"
    Name of the Google Cloud project to use as the data source.
const string GCS_SERVICE_ACCOUNT_KEYS = "gcs_service_account_keys"
    Google Cloud service account keys to use for authenticating the data source.
const string IS_STREAM = "is_stream"
    Whether to load from Azure/GCS/S3 continuously as a stream.
const string KAFKA_TOPIC_NAME = "kafka_topic_name"
    Name of the Kafka topic to use as the data source.
const string JDBC_DRIVER_JAR_PATH = "jdbc_driver_jar_path"
    JDBC driver jar file location.
const string JDBC_DRIVER_CLASS_NAME = "jdbc_driver_class_name"
    Name of the JDBC driver class.
const string ANONYMOUS = "anonymous"
    Use anonymous connection to storage provider. DEPRECATED: this is now the default.
const string USE_MANAGED_CREDENTIALS = "use_managed_credentials"
    When no credentials are supplied, we use anonymous access by default.
const string USE_HTTPS = "use_https"
    Use HTTPS to connect to the data source if true, otherwise use HTTP.
const string SCHEMA_REGISTRY_LOCATION = "schema_registry_location"
    Location of Confluent Schema Registry in '[storage_path[:storage_port]]' format.
const string SCHEMA_REGISTRY_CREDENTIAL = "schema_registry_credential"
    Confluent Schema Registry credential object name.
const string SCHEMA_REGISTRY_PORT = "schema_registry_port"
    Confluent Schema Registry port (optional).
Optional parameters. A set of string constants for the parameter options.
The default value is an empty Dictionary.
Definition at line 337 of file CreateDatasource.cs.
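As a usage sketch only (the CreateDatasourceRequest constructor shape and the Kinetica.createDatasource() wrapper shown below are assumptions, not confirmed by this page, and the data source name, location, and credential name are hypothetical), these constants are intended to be used as keys of the request's options dictionary:

    using System.Collections.Generic;
    using kinetica;

    public static class DatasourceOptionsExample
    {
        public static void Run()
        {
            // Optional parameters keyed by the string constants documented on this page.
            var options = new Dictionary<string, string>
            {
                { CreateDatasourceRequest.Options.CONNECTION_TIMEOUT, "30" },    // seconds to connect
                { CreateDatasourceRequest.Options.WAIT_TIMEOUT, "60" },          // seconds to read
                { CreateDatasourceRequest.Options.CREDENTIAL, "example_cred" },  // hypothetical credential object
                { CreateDatasourceRequest.Options.SKIP_VALIDATION,
                  CreateDatasourceRequest.Options.FALSE }
            };

            // Assumed constructor order: name, location, user name, password, options.
            var request = new CreateDatasourceRequest(
                "example_datasource",   // hypothetical data source name
                "s3://",                // storage provider location
                "",                     // user name (unused when a credential object is supplied)
                "",                     // password
                options);

            // Assumed endpoint wrapper on an existing connection:
            // Kinetica db = new Kinetica("http://localhost:9191");
            // CreateDatasourceResponse response = db.createDatasource(request);
        }
    }

The provider-specific option groups documented below follow the same pattern; only the dictionary contents change.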
const string kinetica.CreateDatasourceRequest.Options.ANONYMOUS = "anonymous"
Use anonymous connection to storage provider. DEPRECATED: this is now the default.
Specify use_managed_credentials for a non-anonymous connection. Supported values: TRUE, FALSE.
The default value is TRUE.
Definition at line 528 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.AZURE_CONTAINER_NAME = "azure_container_name"
Name of the Azure storage container to use as the data source.
Definition at line 460 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.AZURE_OAUTH_TOKEN = "azure_oauth_token"
OAuth token to access the given storage container.
Definition at line 471 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.AZURE_SAS_TOKEN = "azure_sas_token"
Shared access signature token for the Azure storage account to use as the data source.
Definition at line 467 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.AZURE_STORAGE_ACCOUNT_NAME = "azure_storage_account_name"
Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified.
Definition at line 456 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.AZURE_TENANT_ID = "azure_tenant_id"
Active Directory tenant ID (or directory ID).
Definition at line 463 of file CreateDatasource.cs.
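A minimal sketch of the Azure-specific options, reusing the imports from the sketch above; the account, container, tenant ID, and token values are hypothetical placeholders:

    var azureOptions = new Dictionary<string, string>
    {
        { CreateDatasourceRequest.Options.AZURE_STORAGE_ACCOUNT_NAME, "examplestorageacct" },         // hypothetical account
        { CreateDatasourceRequest.Options.AZURE_CONTAINER_NAME, "example-container" },                // hypothetical container
        { CreateDatasourceRequest.Options.AZURE_TENANT_ID, "00000000-0000-0000-0000-000000000000" },  // placeholder tenant ID
        { CreateDatasourceRequest.Options.AZURE_SAS_TOKEN, "<sas-token>" }                            // placeholder SAS token
    };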
const string kinetica.CreateDatasourceRequest.Options.CONNECTION_TIMEOUT = "connection_timeout"
Timeout in seconds for connecting to this storage provider.
Definition at line 360 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.CREDENTIAL = "credential"
Name of the credential object to be used in data source.
Definition at line 369 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.FALSE = "false"
Definition at line 356 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.GCS_BUCKET_NAME = "gcs_bucket_name"
Name of the Google Cloud Storage bucket to use as the data source.
Definition at line 475 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.GCS_PROJECT_ID = "gcs_project_id"
Name of the Google Cloud project to use as the data source.
Definition at line 479 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.GCS_SERVICE_ACCOUNT_KEYS = "gcs_service_account_keys"
Google Cloud service account keys to use for authenticating the data source.
Definition at line 483 of file CreateDatasource.cs.
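A comparable sketch for Google Cloud Storage; the bucket, project, and key-file path are hypothetical, and passing the keys as the raw JSON key text is an assumption about the expected value format:

    // Hypothetical path to a downloaded service account key file.
    string serviceAccountJson = System.IO.File.ReadAllText("service_account_keys.json");

    var gcsOptions = new Dictionary<string, string>
    {
        { CreateDatasourceRequest.Options.GCS_BUCKET_NAME, "example-gcs-bucket" },        // hypothetical bucket
        { CreateDatasourceRequest.Options.GCS_PROJECT_ID, "example-gcp-project" },        // hypothetical project
        { CreateDatasourceRequest.Options.GCS_SERVICE_ACCOUNT_KEYS, serviceAccountJson }  // key JSON as a string
    };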
const string kinetica.CreateDatasourceRequest.Options.HDFS_DELEGATION_TOKEN = "hdfs_delegation_token"
Delegation token for the given HDFS user.
Definition at line 436 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.HDFS_KERBEROS_KEYTAB = "hdfs_kerberos_keytab"
Kerberos keytab file location for the given HDFS user.
This may be a KIFS file.
Definition at line 433 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.HDFS_USE_KERBEROS = "hdfs_use_kerberos"
Use Kerberos authentication for the given HDFS cluster. Supported values: TRUE, FALSE.
The default value is FALSE.
Definition at line 452 of file CreateDatasource.cs.
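A sketch of Kerberos-secured HDFS options; the keytab location is a hypothetical KIFS path (the documentation above only states that a KIFS file may be used):

    var hdfsOptions = new Dictionary<string, string>
    {
        { CreateDatasourceRequest.Options.HDFS_USE_KERBEROS, CreateDatasourceRequest.Options.TRUE },
        { CreateDatasourceRequest.Options.HDFS_KERBEROS_KEYTAB, "kifs://security/hdfs_user.keytab" }  // hypothetical keytab path
    };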
const string kinetica.CreateDatasourceRequest.Options.IS_STREAM = "is_stream"
Whether to load from Azure/GCS/S3 continuously as a stream.
Supported values: TRUE, FALSE.
The default value is FALSE.
Definition at line 499 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.JDBC_DRIVER_CLASS_NAME = "jdbc_driver_class_name"
Name of the JDBC driver class.
Definition at line 510 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.JDBC_DRIVER_JAR_PATH = "jdbc_driver_jar_path"
JDBC driver jar file location.
This may be a KIFS file.
Definition at line 507 of file CreateDatasource.cs.
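A sketch of the JDBC options; the jar path is a hypothetical KIFS location, and the class name shown is simply the standard PostgreSQL driver class used as an illustration:

    var jdbcOptions = new Dictionary<string, string>
    {
        { CreateDatasourceRequest.Options.JDBC_DRIVER_JAR_PATH, "kifs://drivers/postgresql.jar" },  // hypothetical jar location
        { CreateDatasourceRequest.Options.JDBC_DRIVER_CLASS_NAME, "org.postgresql.Driver" }        // driver class for the target source
    };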
const string kinetica.CreateDatasourceRequest.Options.KAFKA_TOPIC_NAME = "kafka_topic_name"
Name of the Kafka topic to use as the data source.
Definition at line 503 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_AWS_ROLE_ARN = "s3_aws_role_arn"
Amazon IAM Role ARN which has required S3 permissions that can be assumed for the given S3 IAM user.
Definition at line 421 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_BUCKET_NAME = "s3_bucket_name"
Name of the Amazon S3 bucket to use as the data source.
Definition at line 373 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_ENCRYPTION_CUSTOMER_ALGORITHM = "s3_encryption_customer_algorithm"
Customer encryption algorithm used to encrypt data.
Definition at line 425 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_ENCRYPTION_CUSTOMER_KEY = "s3_encryption_customer_key"
Customer encryption key to encrypt or decrypt data.
Definition at line 429 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_REGION = "s3_region"
Name of the Amazon S3 region where the given bucket is located.
Definition at line 377 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_USE_VIRTUAL_ADDRESSING = "s3_use_virtual_addressing"
Whether to use virtual addressing when referencing the Amazon S3 source. Supported values:
TRUE: Use virtual-hosted-style addressing for requests.
FALSE: Use path-style URI for requests.
The default value is TRUE.
Definition at line 417 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.S3_VERIFY_SSL = "s3_verify_ssl"
Set to false for testing purposes or when necessary to bypass TLS errors (e.g. self-signed certificates).
This value is true by default. Supported values: TRUE, FALSE.
The default value is TRUE.
Definition at line 395 of file CreateDatasource.cs.
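Pulling the S3 options together into one sketch; the bucket, region, role ARN, and credential name are hypothetical:

    var s3Options = new Dictionary<string, string>
    {
        { CreateDatasourceRequest.Options.S3_BUCKET_NAME, "example-bucket" },                          // hypothetical bucket
        { CreateDatasourceRequest.Options.S3_REGION, "us-east-1" },                                    // region hosting the bucket
        { CreateDatasourceRequest.Options.S3_AWS_ROLE_ARN, "arn:aws:iam::123456789012:role/example" }, // hypothetical role ARN
        { CreateDatasourceRequest.Options.S3_USE_VIRTUAL_ADDRESSING, CreateDatasourceRequest.Options.TRUE },
        { CreateDatasourceRequest.Options.S3_VERIFY_SSL, CreateDatasourceRequest.Options.TRUE },
        { CreateDatasourceRequest.Options.CREDENTIAL, "example_s3_cred" }                              // hypothetical credential object
    };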
const string kinetica.CreateDatasourceRequest.Options.SCHEMA_REGISTRY_CREDENTIAL = "schema_registry_credential"
Confluent Schema Registry credential object name.
Definition at line 572 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.SCHEMA_REGISTRY_LOCATION = "schema_registry_location"
Location of Confluent Schema Registry in '[storage_path[:storage_port]]' format.
Definition at line 567 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.SCHEMA_REGISTRY_PORT = "schema_registry_port"
Confluent Schema Registry port (optional).
Definition at line 575 of file CreateDatasource.cs.
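A sketch combining the Kafka topic option with the Confluent Schema Registry options; the topic, registry host, and credential names are hypothetical:

    var kafkaOptions = new Dictionary<string, string>
    {
        { CreateDatasourceRequest.Options.KAFKA_TOPIC_NAME, "example_topic" },                      // hypothetical topic
        { CreateDatasourceRequest.Options.SCHEMA_REGISTRY_LOCATION, "registry.example.com:8081" },  // '[storage_path[:storage_port]]' format
        { CreateDatasourceRequest.Options.SCHEMA_REGISTRY_CREDENTIAL, "registry_cred" }             // hypothetical credential object name
    };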
const string kinetica.CreateDatasourceRequest.Options.SKIP_VALIDATION = "skip_validation"
Bypass validation of connection to remote source.
Supported values: TRUE, FALSE.
The default value is FALSE.
Definition at line 354 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.TRUE = "true"
Definition at line 355 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.USE_HTTPS = "use_https"
Use HTTPS to connect to the data source if true, otherwise use HTTP. Supported values: TRUE, FALSE.
The default value is TRUE.
Definition at line 563 of file CreateDatasource.cs.
const string kinetica.CreateDatasourceRequest.Options.USE_MANAGED_CREDENTIALS = "use_managed_credentials"
When no credentials are supplied, we use anonymous access by default.
If this is set, we will use cloud provider user settings. Supported values: TRUE, FALSE.
The default value is FALSE.
Definition at line 546 of file CreateDatasource.cs.
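A short sketch of opting into managed credentials instead of the default anonymous access:

    var managedOptions = new Dictionary<string, string>
    {
        // Anonymous access is the default when no credentials are supplied;
        // TRUE switches to the cloud provider's user settings instead.
        { CreateDatasourceRequest.Options.USE_MANAGED_CREDENTIALS, CreateDatasourceRequest.Options.TRUE }
    };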
const string kinetica.CreateDatasourceRequest.Options.WAIT_TIMEOUT = "wait_timeout"
Timeout in seconds for reading from this storage provider.
Definition at line 364 of file CreateDatasource.cs.