Interface AppendableSink<TYPE,TARRAY>
- Type Parameters:
TYPE - The column data type
TARRAY - The stored values array data type (for example, an Integer DataType column would have an int[] array data type); on occasion this may differ, for example AppendableBooleanAsByteColumnSink
- All Superinterfaces:
PartitionUpdatesObserver, RowUpdateObservable, io.deephaven.csv.sinks.Sink<TARRAY>
- All Known Subinterfaces:
AppendableColumnSink<DATA_TYPE,TARRAY>, ColumnSinkHolder<DATA_TYPE,TARRAY>
- All Known Implementing Classes:
AppendableColumnSinkHolder, BaseAppendableColumnSink
public interface AppendableSink<TYPE,TARRAY>
extends io.deephaven.csv.sinks.Sink<TARRAY>, PartitionUpdatesObserver, RowUpdateObservable
Sink interface used in CSV import.
-
Method Summary
- getConstantValue() - Returns the defined constant value in case of a constant column value.
- CustomSetterSinkDataProcessor getCustomSinkDataProcessor() - Returns the CustomSetterSinkDataProcessor initialized in the constructor.
- default Object getUnderlying() - Default implementation for Sink.getUnderlying() is to return the current object instance.
- boolean isCustomSetterColumn() - True if the schema import column section of this column defines a class to be used as a CustomSetter.
- void nullFlagsToValues(TARRAY values, boolean[] isNull, int size) - Allows the appropriate null values to be populated in the chunk.
- void publishToCustomSetter(TARRAY values, boolean[] isNull, int size) - Publishes the values chunk to CustomSetterSinkDataProcessor if it is present and applicable.
- void updateCustomSetterData(int size, long destEnd) - Allows custom setter columns to process updates to the next chunk of rows.
- default void updateNonSourceColRowChunk(int size, long destEnd) - Allows columns not in the CSV source file to apply updates to the next chunk of rows.
- void updateRowChunk(int size, long destEnd) - Allows columns not in the CSV source file to apply updates to the next chunk of rows.
- void validateForProcessingErrors() - Verifies whether column processing encountered errors during the processing of the current chunk.
- default void write(TARRAY src, boolean[] isNull, long destBegin, long destEnd, boolean appending) - Default implementation for Sink.write(Object, boolean[], long, long, boolean).
- void writeToLocal(TARRAY values, int size, long destEnd) - Saves the values chunk to disk.
Methods inherited from interface com.illumon.iris.importers.csv.PartitionUpdatesObserver:
onPartitionParserUpdate
Methods inherited from interface com.illumon.iris.importers.csv.RowUpdateObservable:
registerRowUpdateObserver
-
Method Details
-
getUnderlying
default Object getUnderlying()
Default implementation for Sink.getUnderlying() is to return the current object instance.
- Specified by:
getUnderlying in interface io.deephaven.csv.sinks.Sink<TARRAY>
-
write
default void write(TARRAY src, boolean[] isNull, long destBegin, long destEnd, boolean appending)
Default implementation for Sink.write(Object, boolean[], long, long, boolean).
- Specified by:
write in interface io.deephaven.csv.sinks.Sink<TARRAY>
- Parameters:
src - The chunk of data, a typed array (short[], double[], etc.) with valid elements in the half-open interval [0..(destEnd - destBegin)).
isNull - A boolean array with the same range of valid elements. A "true" value at position i means that src[i] should be ignored and the element should be considered the "null value", whose representation depends on the target data structure. A "false" value means that src[i] should be interpreted normally.
destBegin - The inclusive start index of the destination range.
destEnd - The exclusive end index of the destination range.
appending - A hint to the destination which indicates whether the system is appending to the data structure (if appending is true) or overwriting previously-written values (if appending is false). The caller promises never to span these two cases: i.e., it will never pass a chunk of data which partially overwrites values and then partially appends values. This flag is convenient but technically redundant, because code can also determine which case it is in by comparing destEnd to the data structure's current size.
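This Javadoc does not show the body of the default write() implementation; the following is a hypothetical sketch of how it could compose the other documented methods on this interface (nullFlagsToValues, writeToLocal, publishToCustomSetter). The MiniAppendableSink and IntColumnSink types, and the use of Integer.MIN_VALUE as the null sentinel, are illustrative stand-ins, not the real Deephaven/Illumon classes.

```java
import java.util.Arrays;

// Simplified stand-in for AppendableSink; only the documented method shapes are kept.
interface MiniAppendableSink<TARRAY> {
    void nullFlagsToValues(TARRAY values, boolean[] isNull, int size);
    void writeToLocal(TARRAY values, int size, long destEnd);
    void publishToCustomSetter(TARRAY values, boolean[] isNull, int size);

    // Plausible composition of the other methods; the real default body may differ.
    default void write(TARRAY src, boolean[] isNull, long destBegin, long destEnd, boolean appending) {
        int size = (int) (destEnd - destBegin);   // valid elements are [0..(destEnd - destBegin))
        nullFlagsToValues(src, isNull, size);     // replace flagged cells with the type's null value
        writeToLocal(src, size, destEnd);         // persist the chunk
        publishToCustomSetter(src, isNull, size); // forward to a custom setter, if configured
    }
}

// Toy int-column sink: "disk" is just a growable in-memory array.
class IntColumnSink implements MiniAppendableSink<int[]> {
    static final int NULL_INT = Integer.MIN_VALUE; // stand-in for a null sentinel
    int[] stored = new int[0];

    @Override public void nullFlagsToValues(int[] values, boolean[] isNull, int size) {
        for (int i = 0; i < size; i++) {
            if (isNull[i]) values[i] = NULL_INT;
        }
    }

    @Override public void writeToLocal(int[] values, int size, long destEnd) {
        // Grow storage so its length matches destEnd, then copy the chunk into the tail.
        int[] next = Arrays.copyOf(stored, (int) destEnd);
        System.arraycopy(values, 0, next, (int) destEnd - size, size);
        stored = next;
    }

    @Override public void publishToCustomSetter(int[] values, boolean[] isNull, int size) {
        // No custom setter configured in this sketch.
    }
}
```
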
-
nullFlagsToValues
The method allows the appropriate null values to be populated in the chunk.- Parameters:
values
- The chunk to populate null values if 'isNull' param is true for the indexisNull
- Indicates if the cell should be null valuesize
- The size of the values array that should be persisted
-
writeToLocal
void writeToLocal(TARRAY values, int size, long destEnd)
Saves the values chunk to disk.
- Parameters:
values - The current chunk ready to be persisted
size - The size of the values array that should be persisted
destEnd - The exclusive end index of the destination range.
-
publishToCustomSetter
void publishToCustomSetter(TARRAY values, boolean[] isNull, int size)
Publishes the values chunk to CustomSetterSinkDataProcessor if it is present and applicable.
- Parameters:
values - The current chunk ready to be persisted
isNull - A boolean array with the same range of valid elements. A "true" value at position i means that values[i] should be ignored and the element should be considered the "null value".
size - The size of the values array that should be persisted
-
getConstantValue
Returns the defined constant value in case of a constant column value. The constant value definition is present in the importColumn xml element. The value will be accessible through ImportDataTransformer.
-
updateRowChunk
void updateRowChunk(int size, long destEnd)
Allows columns not in the CSV source file to apply updates to the next chunk of rows. For example, constant column values are not part of the CSV source file; such columns receive the current row chunk details by registering with a column that is in the CSV source file.
- Parameters:
size - The current chunk length
destEnd - The exclusive end index of the destination range.
-
updateNonSourceColRowChunk
default void updateNonSourceColRowChunk(int size, long destEnd)
Allows columns not in the CSV source file to apply updates to the next chunk of rows. For example, constant column values are not part of the CSV source file; such columns receive the current row chunk details by registering with a column that is in the CSV source file.
- Parameters:
size - The current chunk length
destEnd - The exclusive end index of the destination range.
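The registration pattern described above (a non-source column observing a source column's chunk progress) can be sketched as follows. All types here are illustrative stand-ins for RowUpdateObservable and its observers, not the real Deephaven/Illumon interfaces.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical observer contract: receives the chunk geometry (size, destEnd)
// each time the observed source column advances a chunk.
interface RowUpdateObserver {
    void updateRowChunk(int size, long destEnd);
}

// Stand-in for a column that IS present in the CSV source file.
class SourceColumn {
    private final List<RowUpdateObserver> observers = new ArrayList<>();

    void registerRowUpdateObserver(RowUpdateObserver o) {
        observers.add(o);
    }

    // Called once per parsed chunk; fans the row range out to non-source columns.
    void onChunkWritten(int size, long destEnd) {
        for (RowUpdateObserver o : observers) {
            o.updateRowChunk(size, destEnd);
        }
    }
}

// Stand-in for a constant column, which has no data in the CSV file itself:
// it fills one copy of its constant per row in every chunk it is told about.
class ConstantColumn implements RowUpdateObserver {
    final String constantValue;
    final List<String> stored = new ArrayList<>();

    ConstantColumn(String constantValue) {
        this.constantValue = constantValue;
    }

    @Override public void updateRowChunk(int size, long destEnd) {
        for (int i = 0; i < size; i++) {
            stored.add(constantValue);
        }
    }
}
```
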
-
updateCustomSetterData
void updateCustomSetterData(int size, long destEnd)
Allows custom setter columns to process updates to the next chunk of rows. These are the steps that are expected to be done in implementing classes:
- Pull data from CustomSetterSinkDataProcessor.getAllColumnDataMap(String)
- Loop across individual rows and invoke BaseCsvFieldWriter.processValues(Map, int, long)
- Pull and populate the column's value array at the index using the processed value from BaseCsvFieldWriter.getSetterValue()
- Invoke the appropriate type-based add call, with the singleValue flag set to false
- Parameters:
size - The current chunk length
destEnd - The exclusive end index of the destination range.
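The steps above can be sketched as a single loop. This is a hypothetical, simplified rendering of the control flow only: the allColumnData map and computeSetterValue stand in for CustomSetterSinkDataProcessor.getAllColumnDataMap(String) and the BaseCsvFieldWriter processValues/getSetterValue pair, and the returned array represents the values handed to the type-based add call.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

class CustomSetterColumnSketch {
    // Step 1 input: all source-column data for the current chunk, keyed by column name.
    static long[] updateCustomSetterData(Map<String, List<Object>> allColumnData, int size) {
        long[] values = new long[size];
        for (int i = 0; i < size; i++) {          // Step 2: loop across individual rows
            // Assemble one row's view of every source column.
            Map<String, Object> row = new HashMap<>();
            for (Map.Entry<String, List<Object>> e : allColumnData.entrySet()) {
                row.put(e.getKey(), e.getValue().get(i));
            }
            // Steps 2-3: process the row and populate the value array at this index.
            values[i] = computeSetterValue(row);
        }
        // Step 4: the caller would now invoke the type-based add call, singleValue = false.
        return values;
    }

    // Stand-in for user-supplied custom setter logic: sums two source columns.
    static long computeSetterValue(Map<String, Object> row) {
        return ((Number) row.get("a")).longValue() + ((Number) row.get("b")).longValue();
    }
}
```
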
-
isCustomSetterColumn
boolean isCustomSetterColumn()
This is true if the schema import column section of this column defines a class to be used as a CustomSetter.
In addition, the property getCustomSinkDataProcessor() should return a non-null value for all AppendableColumnSink of the table.
In terms of processing, the flag will be consulted while updates are being written into the sink for source columns. See write(Object, boolean[], long, long, boolean) and updateNonSourceColRowChunk(int, long) for usage.
- Returns:
true if the Column uses a CustomSetter
-
getCustomSinkDataProcessor
CustomSetterSinkDataProcessor getCustomSinkDataProcessor()
Returns the CustomSetterSinkDataProcessor initialized in the constructor.
The presence of a non-null value indicates the presence of a custom setter in the schema, which means all non-custom-setter columns will publish data to the Custom Setter Processor.
- Returns:
The CustomSetterSinkDataProcessor initialized in the constructor
-
validateForProcessingErrors
void validateForProcessingErrors()
Verifies whether column processing encountered errors during the processing of the current chunk.
This is relevant when the column is a RowUpdateObservable and non-source columns encountered an error. In that event the import will exit, throwing an appropriate error.
-