deephaven_enterprise.data_ingestion.dis¶
This module provides a way to consume a Kafka topic and persistently write the data to a Deephaven Data Import Server, making the table available to workers throughout your Deephaven Enterprise cluster. It also includes utilities for accessing DIS by name and creating PartitionedTables from last-by views.
- class DataImportServer(j_dis)[source]¶
Bases:
JObjectWrapper

A class that represents a Data Import Server (DIS) in Deephaven. An instance of this class is needed to ingest data from external sources, such as Kafka, into Deephaven. This class also provides convenience methods to create PartitionedTables from last-by views.
Note: users should not create this class directly. Instead, use the
get_dis_by_name() function to get a Data Import Server (DIS) by name and, optionally, its private storage path.- consume(opts)[source]¶
Consumes a Kafka topic and persistently writes the data to this Deephaven Data Import Server (DIS), making the table available to workers throughout the Deephaven Enterprise cluster.
- Parameters:
opts (KafkaTableWriterOptions) – the configuration for ingesting data from Kafka
- Raises:
DHError –
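A minimal sketch of the ingestion workflow described above. This assumes a DIS named "Ingester1" with a private storage path, and it elides the Kafka options; the DIS name, storage path, and option values shown are illustrative, not part of this API's documentation, and the code requires a Deephaven Enterprise cluster to run.

```python
# Hypothetical sketch; names and paths are placeholders for your cluster.
from deephaven_enterprise import data_ingestion as di

# Look up the DIS by name; the second argument (an assumption here)
# is the DIS's private storage directory.
dis = di.get_dis_by_name("Ingester1", "/db/dataImportServers/Ingester1")

# Build the Kafka ingestion options; see KafkaTableWriterOptions for
# the available settings (topic, namespace/table, column specs, ...).
opts = di.KafkaTableWriterOptions(...)

# Start consuming: data is written persistently by the DIS and becomes
# available to workers throughout the cluster.
dis.consume(opts)
```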
- j_object_type¶
alias of
DataImportServer
- partitioned_table_from_last_by_view(namespace, table_name, partition_value, last_by_view=None)[source]¶
Creates a PartitionedTable from a last-by view created by an ingester on this Data Import Server (DIS).
- Parameters:
namespace (str) – The namespace of the table.
table_name (str) – The name of the table.
partition_value (str) – The column partition value; must not be None.
last_by_view (str) – The last-by view to use; defaults to None, meaning the anonymous last-by view created on the specified table is used.
- Return type:
PartitionedTable
- Returns:
a PartitionedTable for the specified last-by view
- Raises:
DHError –
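A hedged sketch of the last-by workflow: wrapping an ingester's last-by view in a PartitionedTable. The namespace, table name, and partition value below are illustrative assumptions, and the code requires a Deephaven Enterprise cluster to run.

```python
# Hypothetical sketch; "MarketData"/"Quotes" and the partition value
# are placeholders for your own table.
from deephaven_enterprise import data_ingestion as di

dis = di.get_dis_by_name("Ingester1", "/db/dataImportServers/Ingester1")

# last_by_view=None selects the anonymous last-by view the ingester
# created on the specified table.
pt = dis.partitioned_table_from_last_by_view(
    namespace="MarketData",
    table_name="Quotes",
    partition_value="2024-01-02",
    last_by_view=None,
)
```

Each constituent of the resulting PartitionedTable is a last-by snapshot, so downstream queries can fetch a constituent by key instead of re-aggregating the full append-only table.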