Data Sinks

A data sink is a reference object for a data target that is external to the database. It consists of the location and connection information for that external target. A data sink can make use of a credential object for storing remote authentication information.

A data sink name must adhere to the standard naming criteria. Each data sink exists within a schema and follows the standard name resolution rules for tables.

The following data sink types are supported:

  • CData (CData Software source-specific JDBC driver; see the driver list for the full set of supported JDBC drivers)
  • JDBC
  • Apache Kafka
  • HTTP/HTTPS Webhook

Data sinks perform no function by themselves, but act as proxies for transmitting data when referenced as a destination in the creation of a table monitor (see also the CREATE STREAM command in SQL).

Note

  • CData data sinks can use a JDBC credential for authentication.
  • Kafka data sinks are validated upon creation by default and will fail to be created if an authorized connection cannot be established.

Managing Data Sinks

A data sink can be managed using the following API endpoint calls. For managing data sinks in SQL, see CREATE DATA SINK.

API Call            Description
/create/datasink    Creates a data sink, given a location and connection information
/alter/datasink     Modifies the properties of a data sink, validating the new connection
/drop/datasink      Removes the data sink reference from the database, optionally removing all dependent table monitors as well
/show/datasink      Outputs the data sink properties
/grant/permission   Grants the permission for a user to connect to a data sink
/revoke/permission  Revokes the permission for a user to connect to a data sink
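
For example, the following sketch shows and then removes a data sink via the Python API, assuming a data sink named kin_dsink (created in the next section) and a connection object h_db; the method names mirror the endpoint paths, though exact signatures and option names may vary by API version:

# Output the properties of the data sink
print(h_db.show_datasink(name = 'kin_dsink', options = {}))

# Remove the data sink reference; dependent table monitors can optionally
# be removed as well via a drop option
h_db.drop_datasink(name = 'kin_dsink', options = {})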

Creating a Data Sink

To create a data sink named kin_dsink that targets Apache Kafka, in Python:

h_db.create_datasink(
    name = 'kin_dsink',
    destination = 'kafka://kafka.abc.com:9092',
    options = {
        'credential': 'kafka_credential',
        'kafka_topic_name': 'kafka_topic'
    }
)
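
The data sink performs no work until referenced; for instance, a table monitor can stream inserted records to it. The following is a minimal sketch, assuming a source table named order_stream exists and that the table monitor options accept an event type and a datasink_name (these option names are assumptions and may differ by version):

h_db.create_table_monitor(
    table_name = 'order_stream',         # hypothetical source table
    options = {
        'event': 'insert',               # monitor inserted records
        'datasink_name': 'kin_dsink'     # assumed option naming the data sink as the destination
    }
)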

Provider-Specific Syntax

Several authentication schemes across multiple providers are supported.

CData

Credential
h_db.create_datasink(
    name = '[<data sink schema name>.]<data sink name>',
    destination = '<cdata jdbc url>',
    options = {'credential': '[<credential schema name>.]<credential name>'}
)
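
As an illustrative sketch, a CData data sink for a hypothetical Salesforce target might look like the following; the data sink name, JDBC URL, and credential name are placeholders, not values from this document:

h_db.create_datasink(
    name = 'salesforce_dsink',                            # hypothetical data sink name
    destination = 'jdbc:cdata:salesforce:',               # hypothetical CData JDBC URL
    options = {'credential': 'salesforce_credential'}     # hypothetical JDBC credential
)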

JDBC

Credential
h_db.create_datasink(
    name = '[<data sink schema name>.]<data sink name>',
    destination = '<jdbc url>',
    options = {
        'credential': '[<credential schema name>.]<credential name>',
        'jdbc_driver_class_name': '<jdbc driver class full path>',
        'jdbc_driver_jar_path': 'kifs://<jdbc driver jar path>'
    }
)
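
For instance, a JDBC data sink targeting a hypothetical PostgreSQL database might be created as follows; the host, database, credential, and KiFS jar path are placeholders, while org.postgresql.Driver is the standard PostgreSQL JDBC driver class:

h_db.create_datasink(
    name = 'pg_dsink',                                              # hypothetical data sink name
    destination = 'jdbc:postgresql://db.example.com:5432/orders',   # hypothetical JDBC URL
    options = {
        'credential': 'pg_credential',                              # hypothetical JDBC credential
        'jdbc_driver_class_name': 'org.postgresql.Driver',
        'jdbc_driver_jar_path': 'kifs://drivers/postgresql.jar'     # hypothetical KiFS path to the driver jar
    }
)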

Kafka

Credential
h_db.create_datasink(
    name = '[<data sink schema name>.]<data sink name>',
    destination = 'kafka://<kafka.host>:<kafka.port>',
    options = {
        'credential': '[<credential schema name>.]<credential name>',
        'kafka_topic_name': '<kafka topic name>'
    }
)
Public (No Auth)
h_db.create_datasink(
    name = '[<schema name>.]<data sink name>',
    destination = 'kafka://<kafka.host>:<kafka.port>',
    options = {
        'kafka_topic_name': '<kafka topic name>'
    }
)

Webhook

Credential (with HTTPS)
h_db.create_datasink(
    name = '[<data sink schema name>.]<data sink name>',
    destination = 'https://<webhook.host>:<webhook.port>',
    options = {
        'credential': '[<credential schema name>.]<credential name>'
    }
)
HTTP
h_db.create_datasink(
    name = '[<schema name>.]<data sink name>',
    destination = 'http://<webhook.host>:<webhook.port>'
)
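
As a concrete sketch, an unauthenticated HTTP webhook data sink might be created as follows; the sink name and endpoint are hypothetical:

h_db.create_datasink(
    name = 'webhook_dsink',                         # hypothetical data sink name
    destination = 'http://hooks.example.com:8080'   # hypothetical webhook endpoint
)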