Package io.deephaven.kafka.ingest
Class ProtobufConsumerRecordToTableWriterAdapter.Builder
java.lang.Object
io.deephaven.kafka.ingest.ProtobufConsumerRecordToTableWriterAdapter.Builder
- Enclosing class:
- ProtobufConsumerRecordToTableWriterAdapter
A builder to map key and value fields to table columns.
-
Constructor Summary
Constructors
Builder - Creates a new builder instance which will use a pre-compiled ProtoBuf descriptor file for parsing.
Builder - Creates a new builder instance which will use a user-defined function for ProtoBuf parsing.
-
Method Summary
All methods return ProtobufConsumerRecordToTableWriterAdapter.Builder.
addColumnToValueField(String column, String field) - Map a column to a field in the value record.
addColumnToValueFunction(String column, Function&lt;ProtobufRecord, ?&gt; function) - Map a column to a function of the value record.
allowMissingKeys() - Identify that a request for a value using a key that is not found in the record will receive a null value.
allowMissingKeys(boolean allowMissingKeys) - If allowMissingKeys is set, then a request for a value using a key that is not found in the record will receive a null value.
allowNullValues() - Identify that records with a null value will have their columns null filled; otherwise an exception is thrown on receipt of a null value.
allowNullValues(boolean allowNullValues) - If allowNullValues is set, then records with a null value will have their columns null filled; otherwise an exception is thrown on receipt of a null value.
allowUnmapped(String allowUnmapped) - Allow the column with the given name to be unmapped in the output.
autoValueMapping() - Identifies that all unused columns are mapped to a field of the same name in the ProtoBuf record.
autoValueMapping(boolean autoValueMapping) - If autoValueMapping is set, then all unused columns are mapped to a field of the same name in the ProtoBuf record.
caseInsensitiveSearch(boolean caseInsensitiveSearch) - If autoValueMapping(boolean) is set to true, then all unused columns are mapped to a field of the same name in the record without regard to case.
kafkaKeyColumnName(String kafkaKeyColumnName) - Set the name of the column which stores the Kafka key of the record.
kafkaPartitionColumnName(String kafkaPartitionColumnName) - Set the name of the column which stores the Kafka partition identifier of the record.
offsetColumnName(String offsetColumnName) - Set the name of the column which stores the Kafka offset of the record.
parallelParsers(int parallelThreads) - Define the maximum number of dedicated threads to use for parsing inbound messages.
parseErrorIsFatal(boolean parseErrorIsFatal) - If set to false, failure to parse a specific ProtoBuf record will not be considered fatal.
recvTimeColumnName(String recvTimeColumnName) - Set the name of the column which stores the time that the record was received by the ingester.
setFilter(BiPredicate&lt;ConsumerRecord&lt;?, ?&gt;, ProtobufRecord&gt; recordPredicate) - If a filter is added, then all records will be passed through the predicate before being consumed.
timestampColumnName(String timestampColumnName) - Set the name of the column which stores the Kafka timestamp of the record.
withParallelValueField(String parallelField) - Identifies a repeating field within the ProtoBuf message, which will be used to potentially determine a 1 to N mapping.
-
Constructor Details
-
Builder
Creates a new builder instance which will use a pre-compiled ProtoBuf descriptor file for parsing. To create a descriptor file, execute the following command against your .proto file:
protoc --descriptor_set_out=${protoName}.desc ${protoName}.proto
- Parameters:
protoDescriptor
- filesystem path to a pre-compiled ProtoBuf descriptor file
-
Builder
Creates a new builder instance which will use a user-defined function for ProtoBuf parsing. This allows the message to be cast to the pre-defined Java class created by executing the following command against your .proto file:
protoc --java_out=. ${protoName}.proto
- Parameters:
protoConstructor
- a method which is provided with the message as a byte[] and returns a Message. If the appropriate class is on the classpath, this can be a basic `.parseFrom(...)` call on the byte[] within the pre-defined class (e.g. `return MyClass.parseFrom(pBufBytes)`)
-
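The parse-function constructor expects a function from the raw message bytes to a parsed message. A minimal sketch of that shape, using a stand-in String "parser" so the snippet runs without protobuf on the classpath; with a real protoc-generated class (here the hypothetical MyClass from the example above), the body would delegate to its parseFrom:

```java
import java.nio.charset.StandardCharsets;
import java.util.function.Function;

class ParseFunctionSketch {
    // The adapter is handed a function from the raw Kafka payload to a parsed message.
    // Stand-in: decode UTF-8 bytes to a String instead of a protobuf Message.
    static Function<byte[], String> standInParser() {
        return bytes -> new String(bytes, StandardCharsets.UTF_8);
        // With a generated class the same shape would be:
        // return bytes -> {
        //     try {
        //         return MyClass.parseFrom(bytes);           // hypothetical generated class
        //     } catch (InvalidProtocolBufferException e) {   // extends IOException
        //         throw new java.io.UncheckedIOException(e);
        //     }
        // };
    }

    public static void main(String[] args) {
        System.out.println(standInParser().apply("hello".getBytes(StandardCharsets.UTF_8)));
    }
}
```

Wrapping the checked InvalidProtocolBufferException is needed because Function.apply cannot throw checked exceptions.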
-
Method Details
-
kafkaPartitionColumnName
@NotNull @ScriptApi public ProtobufConsumerRecordToTableWriterAdapter.Builder kafkaPartitionColumnName(@NotNull String kafkaPartitionColumnName)
Set the name of the column which stores the Kafka partition identifier of the record.
- Parameters:
kafkaPartitionColumnName - the name of the column in the output table
- Returns:
- this builder
-
kafkaKeyColumnName
@NotNull @ScriptApi public ProtobufConsumerRecordToTableWriterAdapter.Builder kafkaKeyColumnName(@NotNull String kafkaKeyColumnName)
Set the name of the column which stores the Kafka key of the record. Key must be a String.
- Parameters:
kafkaKeyColumnName - the name of the column in the output table
- Returns:
- this builder
-
offsetColumnName
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder offsetColumnName(@NotNull String offsetColumnName)
Set the name of the column which stores the Kafka offset of the record.
- Parameters:
offsetColumnName - the name of the column in the output table
- Returns:
- this builder
-
recvTimeColumnName
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder recvTimeColumnName(@NotNull String recvTimeColumnName)
Set the name of the column which stores the time that the record was received by the ingester.
- Parameters:
recvTimeColumnName - the name of the column in the output table
- Returns:
- this builder
-
timestampColumnName
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder timestampColumnName(@NotNull String timestampColumnName)
Set the name of the column which stores the Kafka timestamp of the record.
- Parameters:
timestampColumnName - the name of the column in the output table
- Returns:
- this builder
-
withParallelValueField
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder withParallelValueField(@NotNull String parallelField)
Identifies a repeating field within the ProtoBuf message, which will be used to potentially determine a 1 to N mapping.
- Parameters:
parallelField - a ProtoBuf repeated field
- Returns:
- this builder
-
allowUnmapped
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder allowUnmapped(@NotNull String allowUnmapped)
Allow the column with the given name to be unmapped in the output. Unmapped columns will have no setter, and will thus be null filled in the output.
- Parameters:
allowUnmapped - the column name to allow to be unmapped
- Returns:
- this builder
-
addColumnToValueField
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder addColumnToValueField(@NotNull String column, @NotNull String field)
Map a column to a field in the value record.
- Parameters:
column - the name of the output column
field - the name of the field in the value record
- Returns:
- this builder
-
addColumnToValueFunction
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder addColumnToValueFunction(@NotNull String column, @NotNull Function&lt;ProtobufRecord, ?&gt; function)
Map a column to a function of the value record.
- Parameters:
column - the name of the output column
function - the function to apply to the value record
- Returns:
- this builder
-
allowMissingKeys
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder allowMissingKeys(boolean allowMissingKeys)
If allowMissingKeys is set, then a request for a value using a key that is not found in the record will receive a null value. Otherwise, an exception is thrown when a key is not found in the record.
- Parameters:
allowMissingKeys - allow quietly continuing if a requested value's key is not in the current record
- Returns:
- this builder
-
allowMissingKeys
Identify that a request for a value using a key that is not found in the record will receive a null value. If not set, an exception is thrown when a key is not found in the record.
- Returns:
- this builder
-
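The allowMissingKeys behavior can be illustrated with a plain Map lookup; this mimics the documented semantics (null for an absent key when the flag is set, an exception otherwise), not the adapter's internals:

```java
import java.util.Map;

class MissingKeyDemo {
    // Look up a field in a record, following the allowMissingKeys contract:
    // with the flag set, an absent key quietly yields null; without it, throw.
    static Object lookup(Map<String, ?> record, String key, boolean allowMissingKeys) {
        if (!record.containsKey(key)) {
            if (!allowMissingKeys) {
                throw new IllegalArgumentException("Key not found in record: " + key);
            }
            return null; // missing key maps to a null column value
        }
        return record.get(key);
    }

    public static void main(String[] args) {
        Map<String, Object> record = Map.of("price", 9.99);
        System.out.println(lookup(record, "quantity", true)); // absent key, flag set -> null
        System.out.println(lookup(record, "price", false));   // present key -> 9.99
    }
}
```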
allowNullValues
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder allowNullValues(boolean allowNullValues)
If allowNullValues is set, then records with a null value will have their columns null filled; otherwise an exception is thrown on receipt of a null value. If no value fields are set, then no columns are taken from the value, so this flag has no effect.
- Parameters:
allowNullValues - if null values are allowed
- Returns:
- this builder
-
allowNullValues
Identify that records with a null value will have their columns null filled; otherwise an exception is thrown on receipt of a null value. If no value fields are set, then no columns are taken from the value, so this flag has no effect.
- Returns:
- this builder
-
parseErrorIsFatal
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder parseErrorIsFatal(boolean parseErrorIsFatal)
If parseErrorIsFatal is set to false, then failure to parse a specific ProtoBuf record will not be considered fatal. Otherwise, a non-parsable record will cause an exception to be thrown.
- Parameters:
parseErrorIsFatal - if parse errors are fatal; defaults to true
- Returns:
- this builder
-
autoValueMapping
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder autoValueMapping(boolean autoValueMapping)
If autoValueMapping is set, then all unused columns are mapped to a field of the same name in the ProtoBuf record.
- Parameters:
autoValueMapping - if unused columns should be automatically mapped to a field of the same name
- Returns:
- this builder
-
autoValueMapping
Identifies that all unused columns are mapped to a field of the same name in the ProtoBuf record.
- Returns:
- this builder
-
caseInsensitiveSearch
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder caseInsensitiveSearch(boolean caseInsensitiveSearch)
If autoValueMapping(boolean) is set to true, then all unused columns are mapped to a field of the same name in the record without regard to case. For example, a record field of widgetId would be matched to a table column named WidgetID.
- Parameters:
caseInsensitiveSearch - should a case-insensitive search for matching columns be performed?
- Returns:
- this builder
-
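The widgetId/WidgetID matching above amounts to a case-insensitive name comparison. A sketch of that matching rule (hypothetical column names, not the adapter's actual implementation):

```java
import java.util.List;
import java.util.Optional;

class CaseInsensitiveMatchDemo {
    // Find the table column that matches a record field name, optionally
    // ignoring case, as caseInsensitiveSearch(true) would.
    static Optional<String> findColumn(List<String> columns, String field, boolean caseInsensitive) {
        for (String column : columns) {
            if (caseInsensitive ? column.equalsIgnoreCase(field) : column.equals(field)) {
                return Optional.of(column);
            }
        }
        return Optional.empty();
    }

    public static void main(String[] args) {
        List<String> columns = List.of("WidgetID", "Price");
        System.out.println(findColumn(columns, "widgetId", true));  // matches WidgetID
        System.out.println(findColumn(columns, "widgetId", false)); // exact match fails
    }
}
```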
parallelParsers
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder parallelParsers(int parallelThreads)
Define the maximum number of dedicated threads to use for parsing inbound messages.
- Parameters:
parallelThreads - maximum number of dedicated threads to use for parsing inbound messages
- Returns:
- this builder
-
setFilter
@ScriptApi @NotNull public ProtobufConsumerRecordToTableWriterAdapter.Builder setFilter(BiPredicate&lt;org.apache.kafka.clients.consumer.ConsumerRecord&lt;?, ?&gt;, ProtobufRecord&gt; recordPredicate)
If a filter is added, then all records will be passed through the predicate before being consumed. Only records that match the predicate will be ingested.
- Parameters:
recordPredicate - returns true if the record should be ingested, otherwise false
- Returns:
- this builder
-
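The real predicate receives a ConsumerRecord&lt;?, ?&gt; and a ProtobufRecord; the sketch below substitutes stand-in types (a topic name and a field map) so it runs without Kafka on the classpath, and the "status"/"ACTIVE" values are hypothetical:

```java
import java.util.Map;
import java.util.function.BiPredicate;

class FilterSketch {
    // Build a predicate that keeps only records from the expected topic whose
    // (hypothetical) "status" field equals "ACTIVE"; everything else is dropped
    // before ingestion, mirroring setFilter's contract.
    static BiPredicate<String, Map<String, String>> activeOnly(String topic) {
        return (recordTopic, fields) ->
                topic.equals(recordTopic) && "ACTIVE".equals(fields.get("status"));
    }

    public static void main(String[] args) {
        BiPredicate<String, Map<String, String>> filter = activeOnly("orders");
        System.out.println(filter.test("orders", Map.of("status", "ACTIVE"))); // ingested
        System.out.println(filter.test("orders", Map.of("status", "CLOSED"))); // dropped
    }
}
```

A real filter would read the topic from ConsumerRecord and the field from the parsed ProtobufRecord, but the composition of conditions is the same.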