7.1 Release Notes

Publish Date: 2022.03.20


Kinetica is now available as a managed application in the AWS Marketplace!

  • Streamlined provisioning process simplifies setup and gets you up and running in 1 hour
  • Pay-as-you-go option allows you to pay for what you use, billed through AWS
  • Infrastructure provisioned into your AWS account
  • Kinetica Workbench simplifies SQL analysis and collaboration
  • Easily ingest data from a developer's local environment into a Kinetica marketplace instance
  • Easily set up streaming ingestion from Kafka, AWS S3, and Azure Blob Storage from within Kinetica
  • User-Defined Functions (UDFs), Graphs, ML models, and more can be managed & executed via SQL
  • Integrated performance monitoring with AWS CloudWatch
  • One-click upgrades

Data and Analytics Capabilities

SQL Analytics

Kinetica supports:

  • Standard SQL data types
  • Standard DDL, DML, subqueries, joins, and set/pivot operations, including date/time, math, & string manipulation functions
  • Logical & materialized variants of views and external tables
  • Out of the box SQL analytics: aggregation, grouping, distribution, window, machine learning, graph, and GIS functions
  • User-Defined Functions (UDF) written in Python, Java, or C++
    • Scalar or table functions
    • Both distributed & non-distributed execution modes
    • Management & execution supported in SQL

Key-Value Lookups

Kinetica can perform high-performance, high-throughput key-value lookups from tables and views; this capability is available within the C++, C#, Java, and Python native APIs.

Temporal Analytics

Kinetica enables temporal analytics:

  • Date, Time, DateTime, and Timestamp data types
  • Date/time formatting codes to assist with data ingestion
  • Date/Time SQL functions and expression support
  • Inexact temporal joins using the ASOF join function
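
For instance, an ASOF join can match each record with the nearest reading from another table within a relative time window. The tables and columns below are hypothetical, and the exact ASOF argument order should be verified against the SQL reference:

```sql
-- Match each trade with the latest quote in the preceding 5 seconds
-- ("trade", "quote", and their columns are hypothetical)
SELECT t.symbol, t.price, q.bid, q.ask
FROM trade t
JOIN quote q
    ON t.symbol = q.symbol
    AND ASOF(t.ts, q.ts, INTERVAL '-5' SECOND, INTERVAL '0' SECOND, MAX)
```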

Geospatial Analytics

Support for ingest of geospatial data from a variety of data formats, including:

  • WKTs within text-based files
  • GeoJSON
  • Shapefiles

Store a variety of geospatial data objects in Kinetica, including: Points, Polygons, Linestrings, Tracks (GPS Positions), and Labels

A library of over 130 geospatial SQL functions, ported from PostGIS, enables:

  • Spatial filtering
  • Spatial aggregation
  • GPS tracking
  • Geospatial joins
  • Spatial relations
  • Geometry transformation
  • Merge, dissolve, and simplification
  • Measurement
  • Isochrones and isodistance
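
As a sketch of spatial filtering, the query below counts hypothetical landmark records inside a bounding polygon. ST_WITHIN and ST_GEOMFROMTEXT follow the PostGIS-style naming convention noted above; verify exact names and signatures in the geospatial SQL reference:

```sql
-- Spatial filter: count records whose geometry falls inside a polygon
-- ("landmark" and its "geom" column are hypothetical)
SELECT COUNT(*) AS landmarks_in_area
FROM landmark
WHERE ST_WITHIN(
    geom,
    ST_GEOMFROMTEXT('POLYGON((-77.1 38.8, -77.1 39.0, -76.9 39.0, -76.9 38.8, -77.1 38.8))')
)
```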

Track streaming GPS data through space and time:

  • Native time-series track tables
  • Special track functions for understanding a track’s relationship to other geospatial objects
  • Track visualization rendering mode that shows the path of travel

Endpoints for server-side and client-side visualization:

  • Web Map Service (WMS) for large-scale, server-side visualization in a variety of styles:
    • Raster
    • Class-break Raster
    • Tracks
    • Heatmap
    • Contours
    • Labels
  • Vector tile service for client-side visualization

Graph Analytics

Model graphs from relational data using an intuitive identifier syntax that maps data to nodes, edges, restrictions, and weights. Weights and restrictions can be statically associated with the graph model or applied at query-time to retain flexibility in your data model. Graphs can be geospatial or non-geospatial (property graph) in nature.

Kinetica supports a wide variety of graph analytic functions, including:

  • Single Source Shortest Path
  • Inverse Shortest Path
  • Multiple Routing (Traveling Salesman)
  • Multiple Supply/Demand
  • Cycle Detection
  • Page Rank
  • Probability Rank
  • All Paths
  • Betweenness Centrality
  • Backhaul Routing
  • Hidden Markov Model (HMM) Map Matching
  • Match Origin/Destination Pairs
  • Model Statistical Analysis

Kinetica supports the ability to query property graphs based on given criteria applied to graph attributes.

Geospatial graphs can be displayed on a map using Kinetica’s Web Map Service (WMS).
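
As an illustration of the identifier syntax, the sketch below builds a road graph from a hypothetical weights table, mapping a WKT linestring column to edges and a travel-time column to weights. The table, columns, and aliases are illustrative only; take the exact CREATE GRAPH parameters from the SQL graph documentation rather than this sketch:

```sql
-- Illustrative sketch only: names and aliases are hypothetical
CREATE OR REPLACE DIRECTED GRAPH road_graph (
    EDGES => INPUT_TABLE (
        SELECT wkt AS WKTLINE, travel_time AS WEIGHT_VALUESPECIFIED
        FROM road_weights
    )
)
```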

ML Analytics

Kinetica supports several built-in SQL functions for common algorithms like linear regression and outlier detection.

Build your own ML models and deploy them in Kinetica:

  • Any model and ML framework is compatible
  • Load the model and invoke inferences with SQL
  • Run inference in either batch (static data) or continuous (streaming data) mode
  • Scale inference horsepower by defining the number of replicas, on Kinetica’s infrastructure or your own Kubernetes cluster

Real-time Decisioning

Kinetica enables end-to-end real-time decisioning with streaming data ingestion, high-performance analytics, and data stream egress.

Ingress / Egress

Programmatic Interfaces

Import and export data using one of Kinetica’s programmatic interfaces:

  • SQL
  • Native APIs: Python, Java, C++, C#, Node.js, JavaScript
  • Drivers: ODBC, JDBC
  • PostgreSQL Wire Protocol

Data Sources

Import data from various sources:

  • Message queue topics on Confluent Kafka, Apache Kafka, and Azure Event Hub
  • SQL and NoSQL databases
  • SaaS data stores
  • CRM (e.g., Salesforce)
  • Marketing (e.g., Eloqua)
  • E-Commerce (e.g., Shopify)
  • Collaboration (e.g., Slack)
  • Files hosted on:
    • Kinetica Server (KiFS)
    • External storage: AWS S3, Azure Blob Storage, & HDFS

Import a variety of data formats, including:

  • CSV and other delimited text formats
  • Parquet
  • Shapefiles
  • Avro

Import data server-side:

  • Batch or streaming modes
  • SQL and native API support
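
A server-side batch load might look like the following; the table, file, and data source names are hypothetical, and the LOAD INTO options should be checked against the SQL ingestion reference:

```sql
-- Batch-load a Parquet file through a pre-defined data source
-- ("trips", "yellow_tripdata.parquet", and "s3_ds" are hypothetical)
LOAD DATA INTO trips
FROM FILE PATHS 'yellow_tripdata.parquet'
FORMAT PARQUET
WITH OPTIONS (DATA SOURCE = 's3_ds')
```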

Ingest data client-side:

  • Batch import from a text file on the client
  • Upload a local file to the server for later ingestion
  • SQL and native API support

Data Sinks

Export data to:

  • Message queue topics on Confluent Kafka, Apache Kafka, & Azure Event Hub
  • CRM (e.g., Salesforce)
  • Marketing (e.g., Eloqua)
  • E-Commerce (e.g., Shopify)
  • Collaboration (e.g., Slack)

Data Streams

Send streaming output to a data sink to enable real-time decisioning.


Native APIs

Kinetica supports a wide variety of native API languages, including:

  • C++
  • C#
  • Java
  • JavaScript / Node.js
  • Python


ODBC / JDBC

Connect to a variety of tools and frameworks with Kinetica’s JDBC and ODBC interfaces.



KiSQL

A CLI client for Kinetica that enables users to:

  • Run queries remotely
  • Insert data into a Kinetica server
  • Upload files to and download files from a Kinetica server


Connectors

  • Spark
  • NiFi
  • Storm
  • FME
  • R
  • Beam
  • Mapbox



Workbench

Kinetica includes a Workbench user interface that comes with:

  • A data object explorer to help you manage all data objects in your system
  • A data import wizard that helps you set up and initiate data loading procedures
  • A file system that allows you to drag and drop files into Kinetica
  • SQL workbooks that store and run SQL commands, and help you visualize data
  • Easy to find connection strings for all APIs and supported connections
  • Management capabilities to monitor and cancel running jobs
  • User administration
  • One-click upgrades and backup snapshots


Reveal

Kinetica Reveal is a lightweight business intelligence tool that helps you build dashboards with a wide array of data visualizations in Kinetica, secured with role-based access controls (RBAC).


Security

  • SSL encryption
  • Role-based authentication
  • Row-level and column-level security
  • Data masking and obfuscation


Resilience

  • Complete backups through Workbench, which persists snapshots to cold storage like AWS S3 or Azure Blob Storage
  • Automatic cluster suspension after a set period of inactivity, to save operational costs
  • Intracluster resilience through Kubernetes, which automatically restarts nodes upon crashes


Monitoring

Monitor the performance of Kinetica (metrics and logs) in AWS CloudWatch

Version 7.1.8

Build Date: 2022.10.16


New Features

  • Data egress from Kinetica to CSV and Parquet
  • Boolean data type support
  • Regular expression matching via SQL
  • KIO deprecated; replaced by native ingress & egress capabilities


SQL

  • New functions:
    • REGEXP_LIKE - regular expression filtering
  • Geospatial indexes for WKT or latitude/longitude columns
  • Added support for multi-head fast record retrieval with KI_HINT_KEY_LOOKUP hint
  • SQL support for applying the k-means algorithm
  • LIST DATASOURCE command for listing remote tables via data source
  • Materialized views can be moved from one schema to another
  • Support for extracting the epoch from a timestamp using EXTRACT
  • Simplified syntax for loading data from a remote table via LOAD INTO...FROM REMOTE TABLE
  • Improved performance for queries using primary or partition key columns
  • Reduced-memory index for low-cardinality columns
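
For example, the new regular expression and epoch-extraction support might be used together as follows; the events table and its columns are hypothetical:

```sql
-- REGEXP_LIKE filtering plus epoch extraction via EXTRACT
-- ("events", "msg", and "event_ts" are hypothetical)
SELECT id,
       EXTRACT(EPOCH FROM event_ts) AS epoch_secs
FROM events
WHERE REGEXP_LIKE(msg, '^ERROR: .*timeout')
```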


I/O

  • Native data egress in CSV & Parquet formats
  • Support for limiting KiFS directory sizes
  • Support for altering the number of CPU & GPU data processors, as well as the maximum number of concurrent running GPU kernels, at runtime

Geospatial/Network Graph/Visualization

  • Graph support for all WKT types, including discontinuous ones
  • Support for running the supply/demand solve as a batch traveling salesman solve via BATCH_TSM_MODE option
  • Support for filtering out demand sites beyond a given distance from a transport's starting location when using the supply/demand solver, via TRUCK_SERVICE_RADIUS
  • Support for a crowd-sourcing type of supply/demand solve, with DEPOT_ID used as a grouping mechanism rather than assumed to be the location of the transports
  • Added optional supply/demand unit unloading penalties via DEMAND_PENALTY & SUPPLY_TRUCK_PENALTY
  • Added optional threshold on supply side sequencing via MAX_SUPPLY_COMBINATIONS
  • Sequencing on the supply side of supply/demand solvers is now multithreaded
  • Support for querying feature information at a given point via the WMS GetFeatureInfo function


API

  • Java API
    • Boolean type support
    • Added retry handler for requests that fail under heavy load
    • Improved bulk ingestion error handling/reporting
    • Improved support for compiling & running API client code under different Java versions
  • JDBC
    • BLOB/CLOB support
    • Added support for multi-head fast record retrieval with KI_HINT_KEY_LOOKUP hint
    • Added support for disabling multi-head inserts
    • Improved INSERT error handling/reporting
    • Added support for JDBC clients that require transactions to use the non-transactional driver
  • Python API
    • Improved HTTP protocol compliance


UI

  • Workbench
    • Added CData import UI
    • Added usage metrics UI
    • Updated example workbooks
    • Enhanced home screen

Version 7.1.7

Build Date: 2022.07.03


New Features

  • Kinetica Workbench web interface for developing and collaborating with interactive SQL workbooks
  • JDBC ingress and egress
  • Import and export data to/from other data sources and enterprise applications
  • Reveal filter slice allows additional map layers to be used to filter the base layer
  • New track SQL functions


SQL

  • New functions:
  • Other new functions:
  • Added support for ILIKE operator
  • New SQL command to upload a file to KiFS from a URL
  • Support for renaming schemas
  • Support for renaming and moving materialized views
  • Added SQL Support for generating isochrones
  • SQL Procedures can now invoke UDFs
  • Added support for pg_roles and pg_type catalog tables
  • Statistics and errors now saved for data imports and exports
  • Track rendering now supported on joins and materialized views
  • Track rendering now supports alternate ID and timestamp column names
  • Logical views, logical external tables, and catalog tables are now accessible from the native API
  • Materialized views are now updated, when needed, after multi-head inserts
  • PostgreSQL Wire Protocol support now includes Extended Query Protocol
  • KiSQL now supports line editing and line history
  • Improved performance of ingestion and copying of data with init_with_uuid column property
  • Improved UUID generation algorithm to produce more unique values for large batch inserts
  • Added support for chunk-skipping in series partitions, when query includes the partition key
  • Input UUIDs do not require hyphens
  • Support for optional AM/PM in default date format
  • Added LLL (whole milliseconds) and $ (start of optional section) code to TO_CHAR family of functions
  • Updated CAST function to work more consistently with other databases
  • K-means clustering algorithm now supported on CPUs, join views, and materialized views
  • Support for multiple subscriptions to same Kafka topic, if done by different users


I/O

  • Google Cloud Storage support
    • Direct ingestion from files in GCS
    • External tables from files in GCS
    • GCS-based cold storage tier
  • JDBC ingress support
    • Direct ingestion from queries on JDBC-accessible databases
    • External tables from queries on JDBC-accessible databases
  • JDBC Egress support
    • Export Kinetica query results into JDBC-accessible databases
  • Broader support for SQL & NoSQL databases and enterprise applications via Kinetica data sources
  • Support for KiFS files in configuration settings and credential, data source, & data sink properties that reference files
  • Support for customer-managed keys for AWS S3 access
  • Allow an S3 data source or cold storage user to specify the server-side encryption method and key
  • Enable S3 IAM role parameter in configuration settings for cold storage

Geospatial/Network Graph/Visualization

  • New /match/graph solver, match_charging_stations, for finding an optimal route involving multiple electric-vehicle charging stations
  • Distributed graph support for /query/graph


API

  • Java API
    • Added capability to pass in self-signed certificates & passwords as options
    • Updated dependencies to more secure versions
    • Fixed unhandled exception when an Avro encoding error occurs
    • Fixed error where a type's column order would not match a table created from it
    • Removed client-side primary key check, to improve performance and make returned errors more consistently delivered
  • Python API
    • Made the API more Python3 compatible
    • Prevented client hanging when connection IP/URL does not match any known to the server; client will operate in degraded mode (no multi-head, etc.)
    • Removed client-side primary key check, to improve performance and make returned errors more consistently delivered
    • Fixed a formatting issue in expression building for keyed lookups that caused failures on Python 2.7.x
    • Corrected some string/null comparisons
  • C++/C#
    • Removed client-side primary key check, to improve performance and make returned errors more consistently delivered


UI

  • Kinetica Workbench web interface for developing and collaborating with interactive SQL workbooks
  • GAdmin viewer for SVG animation results in Graph Match UI
  • Allow additional Reveal map layers to be used for filtering the base layer
  • Allow Reveal map tracks to be clicked and highlighted
  • Allow Reveal to be embedded in an iframe