Kinetica C# API  Version 7.1.10.0
kinetica.AlterDatasourceRequest Class Reference

A set of parameters for Kinetica.alterDatasource(string,IDictionary{string, string},IDictionary{string, string}). More...

Inheritance diagram for kinetica.AlterDatasourceRequest (diagram omitted)
Collaboration diagram for kinetica.AlterDatasourceRequest (diagram omitted)

Classes

struct  DatasourceUpdatesMap
 Map containing the properties of the data source to be updated. More...
 

Public Member Functions

 AlterDatasourceRequest ()
 Constructs an AlterDatasourceRequest object with default parameters. More...
 
 AlterDatasourceRequest (string name, IDictionary< string, string > datasource_updates_map, IDictionary< string, string > options)
 Constructs an AlterDatasourceRequest object with the specified parameters. More...
 
Public Member Functions inherited from kinetica.KineticaData
 KineticaData (KineticaType type)
 Constructor from Kinetica Type More...
 
 KineticaData (System.Type type=null)
 Default constructor, with optional System.Type More...
 
object Get (int fieldPos)
 Retrieve a specific property from this object More...
 
void Put (int fieldPos, object fieldValue)
 Write a specific property to this object More...
 

Properties

string name [get, set]
 Name of the data source to be altered. More...
 
IDictionary< string, string > datasource_updates_map [get, set]
 Map containing the properties of the data source to be updated. More...
 
IDictionary< string, string > options = new Dictionary<string, string>() [get, set]
 Optional parameters. More...
 
Properties inherited from kinetica.KineticaData
Schema Schema [get]
 Avro Schema for this class More...
 

Additional Inherited Members

Static Public Member Functions inherited from kinetica.KineticaData
static RecordSchema SchemaFromType (System.Type t, KineticaType ktype=null)
 Create an Avro Schema from a System.Type and a KineticaType. More...
 

Detailed Description

A set of parameters for Kinetica.alterDatasource(string,IDictionary{string, string},IDictionary{string, string}).


Alters the properties of an existing data source

Definition at line 21 of file AlterDatasource.cs.
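
For orientation, here is a minimal usage sketch calling the Kinetica.alterDatasource(string, IDictionary<string, string>, IDictionary<string, string>) overload named above. The connection URL, data source name, and map key string are placeholder assumptions, not values taken from this page.

    using System.Collections.Generic;
    using kinetica;

    // Placeholder URL; point this at a running Kinetica instance.
    Kinetica db = new Kinetica("http://localhost:9191");

    // Raise the read timeout of an existing data source.
    // 'wait_timeout' is the assumed lowercase key form; see DatasourceUpdatesMap below.
    var updates = new Dictionary<string, string> { { "wait_timeout", "120" } };

    db.alterDatasource("my_datasource",                   // must be an existing data source
                       updates,
                       new Dictionary<string, string>()); // no optional parameters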

Constructor & Destructor Documentation

kinetica.AlterDatasourceRequest.AlterDatasourceRequest ( ) [inline]

Constructs an AlterDatasourceRequest object with default parameters.

Definition at line 769 of file AlterDatasource.cs.
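
When the default constructor is used, the request is populated through the [get, set] properties documented below. A brief sketch with placeholder values (usings as in the earlier sketch; the map key form is an assumption):

    var request = new AlterDatasourceRequest();
    request.name = "my_datasource";
    request.datasource_updates_map =
        new Dictionary<string, string> { { "connection_timeout", "30" } }; // assumed key form
    request.options = new Dictionary<string, string>();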

kinetica.AlterDatasourceRequest.AlterDatasourceRequest ( string name,
        IDictionary< string, string > datasource_updates_map,
        IDictionary< string, string > options ) [inline]

Constructs an AlterDatasourceRequest object with the specified parameters.

Parameters
name: Name of the data source to be altered. Must be an existing data source.
datasource_updates_map: Map containing the properties of the data source to be updated; error if empty. (A construction sketch follows this parameter list.)
  • LOCATION: Location of the remote storage in 'storage_provider_type://[storage_path[:storage_port]]' format. Supported storage provider types are 'azure', 'gcs', 'hdfs', 'kafka', and 's3'.
  • USER_NAME: Name of the remote system user; may be an empty string
  • PASSWORD: Password for the remote system user; may be an empty string
  • SKIP_VALIDATION: Bypass validation of the connection to the remote source. Supported values: TRUE, FALSE. The default value is FALSE.
  • CONNECTION_TIMEOUT: Timeout in seconds for connecting to this storage provider
  • WAIT_TIMEOUT: Timeout in seconds for reading from this storage provider
  • CREDENTIAL: Name of the credential object to be used in the data source
  • S3_BUCKET_NAME: Name of the Amazon S3 bucket to use as the data source
  • S3_REGION: Name of the Amazon S3 region where the given bucket is located
  • S3_AWS_ROLE_ARN: Amazon IAM Role ARN which has the required S3 permissions and can be assumed by the given S3 IAM user
  • S3_ENCRYPTION_CUSTOMER_ALGORITHM: Customer encryption algorithm used for encrypting data
  • S3_ENCRYPTION_CUSTOMER_KEY: Customer encryption key to encrypt or decrypt data
  • HDFS_KERBEROS_KEYTAB: Kerberos keytab file location for the given HDFS user. This may be a KIFS file.
  • HDFS_DELEGATION_TOKEN: Delegation token for the given HDFS user
  • HDFS_USE_KERBEROS: Use Kerberos authentication for the given HDFS cluster. Supported values: TRUE, FALSE. The default value is FALSE.
  • AZURE_STORAGE_ACCOUNT_NAME: Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified
  • AZURE_CONTAINER_NAME: Name of the Azure storage container to use as the data source
  • AZURE_TENANT_ID: Active Directory tenant ID (or directory ID)
  • AZURE_SAS_TOKEN: Shared access signature token for the Azure storage account to use as the data source
  • AZURE_OAUTH_TOKEN: OAuth token to access the given storage container
  • GCS_BUCKET_NAME: Name of the Google Cloud Storage bucket to use as the data source
  • GCS_PROJECT_ID: Name of the Google Cloud project to use as the data source
  • GCS_SERVICE_ACCOUNT_KEYS: Google Cloud service account keys to use for authenticating the data source
  • KAFKA_URL: The publicly-accessible full path URL to the Kafka broker, e.g., 'http://172.123.45.67:9300'.
  • KAFKA_TOPIC_NAME: Name of the Kafka topic to use as the data source
  • JDBC_DRIVER_JAR_PATH: JDBC driver jar file location. This may be a KIFS file.
  • JDBC_DRIVER_CLASS_NAME: Name of the JDBC driver class
  • ANONYMOUS: Create an anonymous connection to the storage provider. DEPRECATED: this is now the default; specify USE_MANAGED_CREDENTIALS for a non-anonymous connection. Supported values: TRUE, FALSE. The default value is TRUE.
  • USE_MANAGED_CREDENTIALS: When no credentials are supplied, anonymous access is used by default. If this is set, the cloud provider's user settings will be used instead. Supported values: TRUE, FALSE. The default value is FALSE.
  • USE_HTTPS: Use HTTPS to connect to the data source if TRUE, otherwise use HTTP. Supported values: TRUE, FALSE. The default value is TRUE.
  • SCHEMA_NAME: Updates the schema name. If schema_name doesn't exist, an error will be thrown. If schema_name is empty, the user's default schema will be used.
options: Optional parameters.

Definition at line 1043 of file AlterDatasource.cs.
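
As a construction sketch for this parameterized form, assuming the lowercase strings shown are the key forms the server expects (the uppercase names in the list above are the corresponding DatasourceUpdatesMap constants):

    // Retarget an existing data source at a different S3 bucket (placeholder values).
    var updates = new Dictionary<string, string>
    {
        { "location",       "s3://my-bucket/path" },
        { "s3_bucket_name", "my-bucket" },
        { "s3_region",      "us-east-1" }
    };

    var request = new AlterDatasourceRequest("my_datasource",
                                             updates,
                                             new Dictionary<string, string>());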

Property Documentation

IDictionary<string, string> kinetica.AlterDatasourceRequest.datasource_updates_map [get, set]

Map containing the properties of the data source to be updated.

Error if empty.

  • LOCATION: Location of the remote storage in 'storage_provider_type://[storage_path[:storage_port]]' format. Supported storage provider types are 'azure', 'gcs', 'hdfs', 'kafka', and 's3'.
  • USER_NAME: Name of the remote system user; may be an empty string
  • PASSWORD: Password for the remote system user; may be an empty string
  • SKIP_VALIDATION: Bypass validation of the connection to the remote source. Supported values: TRUE, FALSE. The default value is FALSE.
  • CONNECTION_TIMEOUT: Timeout in seconds for connecting to this storage provider
  • WAIT_TIMEOUT: Timeout in seconds for reading from this storage provider
  • CREDENTIAL: Name of the credential object to be used in the data source
  • S3_BUCKET_NAME: Name of the Amazon S3 bucket to use as the data source
  • S3_REGION: Name of the Amazon S3 region where the given bucket is located
  • S3_AWS_ROLE_ARN: Amazon IAM Role ARN which has the required S3 permissions and can be assumed by the given S3 IAM user
  • S3_ENCRYPTION_CUSTOMER_ALGORITHM: Customer encryption algorithm used for encrypting data
  • S3_ENCRYPTION_CUSTOMER_KEY: Customer encryption key to encrypt or decrypt data
  • HDFS_KERBEROS_KEYTAB: Kerberos keytab file location for the given HDFS user. This may be a KIFS file.
  • HDFS_DELEGATION_TOKEN: Delegation token for the given HDFS user
  • HDFS_USE_KERBEROS: Use Kerberos authentication for the given HDFS cluster. Supported values: TRUE, FALSE. The default value is FALSE.
  • AZURE_STORAGE_ACCOUNT_NAME: Name of the Azure storage account to use as the data source; this is valid only if tenant_id is specified
  • AZURE_CONTAINER_NAME: Name of the Azure storage container to use as the data source
  • AZURE_TENANT_ID: Active Directory tenant ID (or directory ID)
  • AZURE_SAS_TOKEN: Shared access signature token for the Azure storage account to use as the data source
  • AZURE_OAUTH_TOKEN: OAuth token to access the given storage container
  • GCS_BUCKET_NAME: Name of the Google Cloud Storage bucket to use as the data source
  • GCS_PROJECT_ID: Name of the Google Cloud project to use as the data source
  • GCS_SERVICE_ACCOUNT_KEYS: Google Cloud service account keys to use for authenticating the data source
  • KAFKA_URL: The publicly-accessible full path URL to the Kafka broker, e.g., 'http://172.123.45.67:9300'.
  • KAFKA_TOPIC_NAME: Name of the Kafka topic to use as the data source
  • JDBC_DRIVER_JAR_PATH: JDBC driver jar file location. This may be a KIFS file.
  • JDBC_DRIVER_CLASS_NAME: Name of the JDBC driver class
  • ANONYMOUS: Create an anonymous connection to the storage provider. DEPRECATED: this is now the default; specify USE_MANAGED_CREDENTIALS for a non-anonymous connection. Supported values: TRUE, FALSE. The default value is TRUE.
  • USE_MANAGED_CREDENTIALS: When no credentials are supplied, anonymous access is used by default. If this is set, the cloud provider's user settings will be used instead. Supported values: TRUE, FALSE. The default value is FALSE.
  • USE_HTTPS: Use HTTPS to connect to the data source if TRUE, otherwise use HTTP. Supported values: TRUE, FALSE. The default value is TRUE.
  • SCHEMA_NAME: Updates the schema name. If schema_name doesn't exist, an error will be thrown. If schema_name is empty, the user's default schema will be used.

Definition at line 761 of file AlterDatasource.cs.
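
Rather than hand-writing key strings, the constants in the nested DatasourceUpdatesMap struct (listed under Classes above) can populate this map. A sketch, assuming those constants resolve to the key strings the server expects and that boolean values are passed as "true"/"false" strings:

    var request = new AlterDatasourceRequest();
    request.datasource_updates_map = new Dictionary<string, string>
    {
        { AlterDatasourceRequest.DatasourceUpdatesMap.CREDENTIAL, "my_credential" },
        { AlterDatasourceRequest.DatasourceUpdatesMap.SKIP_VALIDATION, "true" } // assumed string form
    };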

string kinetica.AlterDatasourceRequest.name [get, set]

Name of the data source to be altered.

Must be an existing data source.

Definition at line 493 of file AlterDatasource.cs.

IDictionary<string, string> kinetica.AlterDatasourceRequest.options = new Dictionary<string, string>() [get, set]

Optional parameters.

Definition at line 764 of file AlterDatasource.cs.


The documentation for this class was generated from the following file: AlterDatasource.cs