7.1 Release Notes

Publish Date: 2022.03.20

Features

Kinetica is now available as a managed application in the AWS Marketplace!

  • Streamlined provisioning process simplifies setup and gets you up and running in 1 hour
  • Pay-as-you-go option allows you to pay for what you use, billed through AWS
  • Infrastructure provisioned into your own AWS account
  • Kinetica Workbench simplifies SQL analysis and collaboration
  • Easily ingest data from a developer's local environment into your Kinetica marketplace instance
  • Easily set up streaming ingestion from Kafka, AWS S3, and Azure Blob Storage from within Kinetica
  • User-Defined Functions (UDFs), Graphs, ML models, and more can be managed & executed via SQL
  • Integrated performance monitoring with AWS CloudWatch
  • One-click upgrades

Data and Analytics Capabilities

SQL Analytics

Kinetica supports:

  • Standard SQL data types
  • Standard DDL, DML, subqueries, joins, and set/pivot operations, as well as date/time, math, & string manipulation functions
  • Logical & materialized variants of views and external tables
  • Out-of-the-box SQL analytics: aggregation, grouping, distribution, window, machine learning, graph, and GIS functions (see the sketch following this list)
  • User-Defined Functions (UDFs) written in Python, Java, or C++
    • Scalar or table functions
    • Both distributed & non-distributed execution modes
    • Management & execution supported in SQL
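
The sketch below gives a flavor of the analytics listed above, combining a standard window function with a materialized view; the trades table and its columns are hypothetical.

    -- Hypothetical table: trades(symbol, trade_ts, price, qty)
    -- Materialize a 20-row moving average of price per symbol using a
    -- standard window function.
    CREATE MATERIALIZED VIEW trade_price_ma AS
    SELECT
        symbol,
        trade_ts,
        price,
        AVG(price) OVER (
            PARTITION BY symbol
            ORDER BY trade_ts
            ROWS BETWEEN 19 PRECEDING AND CURRENT ROW
        ) AS price_ma_20
    FROM trades;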

Key-Value Lookups

Kinetica can perform high-performance, high-throughput key-value lookups from tables and views, available through the C++, C#, Java, and Python native APIs.

Temporal Analytics

Kinetica enables temporal analytics:

  • Date, Time, DateTime, and Timestamp data types
  • Date/time formatting codes to assist with data ingestion
  • Date/Time SQL functions and expression support
  • Inexact temporal joins using the ASOF join function (see the sketch following this list)
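
A sketch of the ASOF join mentioned above, pairing each trade with the most recent quote in the preceding minute. The tables are hypothetical, and the argument order shown for ASOF (left timestamp, right timestamp, relative range begin, relative range end, MIN/MAX) should be confirmed against the SQL reference.

    -- Hypothetical tables: trades(symbol, trade_ts, price), quotes(symbol, quote_ts, bid, ask)
    -- Inexact temporal join: match each trade to the latest quote within the prior minute.
    SELECT
        t.symbol,
        t.trade_ts,
        t.price,
        q.bid,
        q.ask
    FROM trades t
    JOIN quotes q
      ON t.symbol = q.symbol
     AND ASOF(t.trade_ts, q.quote_ts, INTERVAL '-1' MINUTE, INTERVAL '0' SECOND, MAX);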

Geospatial Analytics

Support for ingesting geospatial data from a variety of formats, including:

  • WKTs within text-based files
  • GeoJSON
  • Shapefiles

Store a variety of geospatial data objects in Kinetica, including points, polygons, linestrings, tracks (GPS positions), and labels

A library of more than 130 geospatial SQL functions, ported from PostGIS, enables the following (see the sketch after this list):

  • Spatial filtering
  • Spatial aggregation
  • GPS tracking
  • Geospatial joins
  • Spatial relations
  • Geometry transformation
  • Merge, dissolve, and simplify operations
  • Measurement
  • Isochrones and isodistance
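
The sketch below shows the PostGIS-style functions in use for a spatial filter with aggregation; the deliveries table is hypothetical, and exact function names, argument order, and distance units should be verified in the geospatial function reference.

    -- Hypothetical table: deliveries(vehicle_id, event_ts, dropoff_point)
    -- Spatial filter + aggregation: count drop-offs near a depot.
    -- Note: whether the distance argument is interpreted in meters or in
    -- coordinate units depends on the function variant used.
    SELECT
        vehicle_id,
        COUNT(*) AS dropoffs_near_depot
    FROM deliveries
    WHERE ST_DWITHIN(dropoff_point, ST_MAKEPOINT(-77.037, 38.898), 5000)
    GROUP BY vehicle_id;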

Tracking streaming GPS data through space and time:

  • Native time-series track tables
  • Special track functions for understanding a track’s relationship to other geospatial objects
  • Track visualization rendering mode that shows the path of travel

Endpoints for server-side and client-side visualization:

  • Web Map Service (WMS) for large-scale, server-side visualization in a variety of styles:
    • Raster
    • Class-break Raster
    • Tracks
    • Heatmap
    • Contours
    • Labels
  • Vector tile service for client-side visualization

Graph Analytics

Model graphs from relational data using an intuitive identifier syntax that maps data to nodes, edges, restrictions, and weights. Weights and restrictions can be statically associated with the graph model or applied at query time to retain flexibility in your data model. Graphs can be geospatial or non-geospatial (property graph) in nature.
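
A heavily simplified sketch of that identifier mapping follows; the road_segments table is hypothetical, and the CREATE GRAPH clause and identifier names are illustrative only, so consult the graph SQL reference for the exact syntax.

    -- Hypothetical table: road_segments(seg_id, seg_wkt, travel_time)
    -- Illustrative only: map relational columns to graph edges and edge weights.
    CREATE GRAPH road_network (
        EDGES => INPUT_TABLE(
            SELECT
                seg_id      AS ID,
                seg_wkt     AS WKTLINE,
                travel_time AS WEIGHT_VALUESPECIFIED
            FROM road_segments
        )
    );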

Kinetica supports a wide variety of graph analytic functions, including:

  • Single Source Shortest Path
  • Inverse Shortest Path
  • Multiple Routing (Traveling Salesman)
  • Multiple Supply/Demand
  • Cycle Detection
  • Page Rank
  • Probability Rank
  • All Paths
  • Betweenness Centrality
  • Backhaul Routing
  • Hidden Markov Model (HMM) Map Matching
  • Match Origin/Destination Pairs
  • Model Statistical Analysis

Property graphs can be queried using criteria applied to their attributes.

Geospatial graphs can be displayed on a map using Kinetica’s Web Map Service (WMS).

ML Analytics

Kinetica supports several built-in SQL functions for common algorithms like linear regression and outlier detection.
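
As an example of the built-in functions, the sketch below uses SQL-standard regression aggregates to fit a simple linear model; the trips table is hypothetical, and the availability of these exact function names should be verified against the function reference.

    -- Hypothetical table: trips(distance_km, duration_min)
    -- Simple linear regression of duration against distance using
    -- SQL-standard regression aggregates.
    SELECT
        REGR_SLOPE(duration_min, distance_km)     AS slope,
        REGR_INTERCEPT(duration_min, distance_km) AS intercept
    FROM trips;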

Build your own ML models and deploy them in Kinetica:

  • Models from any ML framework are compatible
  • Load the model and invoke inferences with SQL
  • Inference in either batch (static data) or continuous (streaming data) mode
  • Scale inference capacity by defining the number of replicas, running on Kinetica’s infrastructure or your own Kubernetes cluster

Real-time Decisioning

Kinetica enables end-to-end real-time decisioning with streaming data ingestion, high-performance analytics, and data stream egress.

Ingress / Egress

Programmatic Interfaces

Import and export data using one of Kinetica’s programmatic interfaces:

  • SQL
  • Native APIs: Python, Java, C++, C#, Node.js, JavaScript
  • Drivers: ODBC, JDBC
  • PostgreSQL Wire Protocol

Data Sources

Import data from various sources:

  • Message queue topics on Confluent Kafka, Apache Kafka, and Azure Event Hubs (see the data source sketch after this list)
  • SQL and NoSQL databases
  • SaaS data stores
  • CRM (e.g., Salesforce)
  • Marketing (e.g., Eloqua)
  • E-Commerce (e.g., Shopify)
  • Collaboration (e.g., Slack)
  • Files hosted on:
    • Kinetica Server (KiFS)
    • External storage: AWS S3, Azure Blob Storage, & HDFS
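
For example, streaming ingestion from Kafka begins by registering the topic as an external data source. The sketch below is illustrative: the data source name, broker address, and topic are hypothetical, and the exact option names should be checked against the CREATE DATA SOURCE reference.

    -- Illustrative only: register a Kafka topic as an external data source
    -- that tables can subscribe to for streaming ingest.
    CREATE DATA SOURCE orders_kafka
    LOCATION 'kafka://broker.example.com:9092'
    WITH OPTIONS (kafka_topic_name = 'orders');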

Import a variety of data formats, including:

  • CSV and other delimited text formats
  • Parquet
  • JSON/GeoJSON
  • Shapefiles
  • Avro

Import data server-side:

  • Batch or streaming modes
  • SQL and native API support (see the sketch below)
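
A sketch of a server-side batch load from external storage follows; the table, file path, and data source name are hypothetical, and the exact LOAD INTO options should be confirmed in the SQL reference.

    -- Illustrative only: batch-load Parquet files through a previously
    -- registered S3 data source.
    LOAD DATA INTO orders
    FROM FILE PATHS 'orders/2022/03/orders.parquet'
    FORMAT PARQUET
    WITH OPTIONS (DATA SOURCE = 'orders_s3');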

Ingest data client-side:

  • Batch import from a text file on the client
  • Upload a local file to the server for later ingestion
  • SQL and native API support

Data Sinks

Export data to:

  • Message queue topics on Confluent Kafka, Apache Kafka, & Azure Event Hubs
  • CRM (e.g., Salesforce)
  • Marketing (e.g., Eloqua)
  • E-Commerce (e.g., Shopify)
  • Collaboration (e.g., Slack)

Data Streams

Send streaming output to a data sink to enable real-time decisioning.

Interfaces

Native APIs

Kinetica provides native APIs in a variety of languages, including:

  • C++
  • C#
  • Java
  • JavaScript / Node.js
  • Python

JDBC / ODBC

Connect to a variety of tools and frameworks with Kinetica’s JDBC and ODBC interfaces.

Tools

KiSQL

A CLI client for Kinetica that enables users to:

  • Run queries remotely
  • Insert data into a Kinetica server
  • Upload files to and download files from a Kinetica server

Connectors

  • Spark
  • NiFi
  • Storm
  • FME
  • R
  • Beam
  • Mapbox

UI

Workbench

Kinetica deploys with a Workbench user interface that includes:

  • A data object explorer to help you manage all data objects in your system
  • A data import wizard that helps you set up and initiate data loading procedures
  • A file system that allows you to drag and drop files into Kinetica
  • SQL workbooks that store and run SQL commands, and help you visualize data
  • Easy-to-find connection strings for all APIs and supported connections
  • Management capabilities to monitor and cancel running jobs
  • User administration
  • One-click upgrades and backup snapshots

Reveal

Kinetica Reveal is a lightweight business intelligence tool that helps you build dashboards with a wide array of data visualizations in Kinetica, secured with role-based access controls (RBAC).

Security

  • SSL encryption
  • Role-based authentication
  • Row-level and column-level security
  • Data masking and obfuscation

Resilience

  • Complete backups through Workbench, which persists snapshots to cold storage such as AWS S3 or Azure Blob Storage
  • Automatic cluster suspension after a set period of inactivity to save operational costs
  • Intracluster resilience through Kubernetes, which automatically restarts nodes after a crash

Monitoring

Monitor the performance of Kinetica (metrics and logs) in AWS CloudWatch.