Uses of Interface
io.debezium.relational.Column
Packages that use Column:

- io.debezium.jdbc
- io.debezium.pipeline.source.snapshot.incremental
- io.debezium.processors.reselect
- io.debezium.relational
- io.debezium.relational.ddl
- io.debezium.relational.history
- io.debezium.relational.mapping
- io.debezium.schema
- io.debezium.util
Uses of Column in io.debezium.jdbc

Methods in io.debezium.jdbc that return types with arguments of type Column:

- JdbcConnection.getColumnsDetails(String databaseCatalog, String schemaNamePattern, String tableName, Tables.TableFilter tableFilter, Tables.ColumnNameFilter columnFilter, DatabaseMetaData metadata, Set<TableId> viewIds)

Methods in io.debezium.jdbc with parameters of type Column:

- protected Object JdbcValueConverters.convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BIGINT.
- protected Object JdbcValueConverters.convertBinary(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, CommonConnectorConfig.BinaryHandlingMode mode)
- protected Object JdbcValueConverters.convertBinaryToBase64(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, or Types.LONGVARBINARY.
- protected Object JdbcValueConverters.convertBinaryToBase64UrlSafe(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, or Types.LONGVARBINARY.
- protected Object JdbcValueConverters.convertBinaryToBytes(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, or Types.LONGVARBINARY.
- protected Object JdbcValueConverters.convertBinaryToHex(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, or Types.LONGVARBINARY.
- protected Object JdbcValueConverters.convertBit(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BIT.
- protected ValueConverter JdbcValueConverters.convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn)
- protected Object JdbcValueConverters.convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, int numBytes)
  Converts a value object for an expected JDBC type of Types.BIT of length 2+.
- protected Object JdbcValueConverters.convertBoolean(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.BOOLEAN.
- protected Object JdbcValueConverters.convertDateToEpochDays(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch.
- protected Object JdbcValueConverters.convertDateToEpochDaysAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch, but represented as a Date value at midnight on the date.
- protected Object JdbcValueConverters.convertDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.NUMERIC.
- protected Object JdbcValueConverters.convertDouble(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.DOUBLE.
- protected Object JdbcValueConverters.convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.FLOAT.
- protected Object JdbcValueConverters.convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.INTEGER.
- protected Object JdbcValueConverters.convertNumeric(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.NUMERIC.
- protected Object JdbcValueConverters.convertReal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.REAL.
- protected Object JdbcValueConverters.convertRowId(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.ROWID.
- protected Object JdbcValueConverters.convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.SMALLINT.
- protected Object JdbcValueConverters.convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.CHAR, Types.VARCHAR, Types.LONGVARCHAR, Types.CLOB, Types.NCHAR, Types.NVARCHAR, Types.LONGNVARCHAR, Types.NCLOB, Types.DATALINK, and Types.SQLXML.
- protected Object JdbcValueConverters.convertTime(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object JdbcValueConverters.convertTimestampToEpochMicros(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIMESTAMP to MicroTimestamp values, or microseconds past epoch.
- protected Object JdbcValueConverters.convertTimestampToEpochMillis(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIMESTAMP to Timestamp values, or milliseconds past epoch.
- protected Object JdbcValueConverters.convertTimestampToEpochMillisAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIMESTAMP to Date values representing milliseconds past epoch.
- protected Object JdbcValueConverters.convertTimestampToEpochNanos(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIMESTAMP to NanoTimestamp values, or nanoseconds past epoch.
- protected Object JdbcValueConverters.convertTimestampWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIMESTAMP_WITH_TIMEZONE.
- protected Object JdbcValueConverters.convertTimeToMicrosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIME to MicroTime values, or microseconds past midnight.
- protected Object JdbcValueConverters.convertTimeToMillisPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIME to Time values, or milliseconds past midnight.
- protected Object JdbcValueConverters.convertTimeToMillisPastMidnightAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIME to Date values representing the milliseconds past midnight on the epoch day.
- protected Object JdbcValueConverters.convertTimeToNanosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIME to NanoTime values, or nanoseconds past midnight.
- protected Object JdbcValueConverters.convertTimeWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TIME_WITH_TIMEZONE.
- protected Object JdbcValueConverters.convertTinyInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts a value object for an expected JDBC type of Types.TINYINT.
- protected Object JdbcValueConverters.convertValue(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, Object fallback, ValueConversionCallback callback)
  Converts the given value for the given column/field.
- JdbcConnection.getColumnValue(ResultSet rs, int columnIndex, Column column, Table table)
  Reads a value from a JDBC result set and executes per-connector conversion if needed.
- protected int JdbcValueConverters.getTimePrecision(Column column)
- protected Object JdbcValueConverters.handleUnknownData(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
  Converts an unknown data value.
- protected byte[] JdbcValueConverters.normalizeBinaryData(Column column, byte[] data)
  Converts the given byte array value into a normalized byte array.
- org.apache.kafka.connect.data.SchemaBuilder JdbcValueConverters.schemaBuilder(Column column)
- void JdbcConnection.setQueryColumnValue(PreparedStatement statement, Column column, int pos, Object value)
  Sets a value on the PreparedStatement, with an explicit SQL type for it if necessary.
- protected Object JdbcValueConverters.toBigDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected ByteBuffer JdbcValueConverters.toByteBuffer(Column column, byte[] data)
  Converts the given byte array value into a byte buffer as preferred by Kafka Connect.
- protected BigDecimal JdbcValueConverters.withScaleAdjustedIfNeeded(Column column, BigDecimal data)
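The temporal converters listed above all reduce to simple arithmetic against the Unix epoch or midnight. A plain-Java sketch of that arithmetic using java.time (class and method names here are illustrative, not Debezium APIs):

```java
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.ZoneOffset;
import java.time.temporal.ChronoUnit;

public class TemporalConversionSketch {

    // Days between the Unix epoch and the given date,
    // as in convertDateToEpochDays.
    static int toEpochDays(LocalDate date) {
        return (int) date.toEpochDay();
    }

    // Microseconds since the Unix epoch, as in convertTimestampToEpochMicros.
    static long toEpochMicros(LocalDateTime timestamp) {
        LocalDateTime epoch = LocalDateTime.ofEpochSecond(0, 0, ZoneOffset.UTC);
        return ChronoUnit.MICROS.between(epoch, timestamp);
    }

    // Microseconds past midnight, as in convertTimeToMicrosPastMidnight.
    static long toMicrosPastMidnight(LocalTime time) {
        return time.toNanoOfDay() / 1_000;
    }

    public static void main(String[] args) {
        System.out.println(toEpochDays(LocalDate.of(1970, 1, 11)));               // 10
        System.out.println(toEpochMicros(LocalDateTime.of(1970, 1, 1, 0, 0, 1))); // 1000000
        System.out.println(toMicrosPastMidnight(LocalTime.of(0, 0, 1)));          // 1000000
    }
}
```

The milli- and nano-precision variants differ only in the divisor applied to the same nanosecond values.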
Uses of Column in io.debezium.pipeline.source.snapshot.incremental

Methods in io.debezium.pipeline.source.snapshot.incremental that return types with arguments of type Column:

- AbstractChunkQueryBuilder.getQueryColumns(IncrementalSnapshotContext<T> context, Table table)
- ChunkQueryBuilder.getQueryColumns(IncrementalSnapshotContext<T> context, Table table)
  Returns the columns that are used for paginating the incremental snapshot chunks.

Method parameters in io.debezium.pipeline.source.snapshot.incremental with type arguments of type Column:

- private boolean RowValueConstructorChunkQueryBuilder.fallbackToSuper(List<Column> pkColumns)
- private void RowValueConstructorChunkQueryBuilder.rowValueComparison(List<Column> pkColumns, String operator, StringBuilder sql)
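A row-value-constructor chunk query compares all key columns at once, e.g. a predicate of the shape (a, b) > (?, ?). A minimal sketch of building such a predicate from a list of key column names (the helper below is hypothetical and only illustrates the SQL shape, not Debezium's exact query text):

```java
import java.util.Collections;
import java.util.List;

public class RowValuePredicateSketch {

    // Builds a row-value-constructor comparison such as "(id, region) > (?, ?)"
    // with one bind parameter per key column.
    static String rowValueComparison(List<String> keyColumns, String operator) {
        String cols = String.join(", ", keyColumns);
        String params = String.join(", ", Collections.nCopies(keyColumns.size(), "?"));
        return "(" + cols + ") " + operator + " (" + params + ")";
    }

    public static void main(String[] args) {
        System.out.println(rowValueComparison(List.of("id", "region"), ">"));
        // (id, region) > (?, ?)
    }
}
```

Not every database supports row-value constructors, which is why a fallback path (fallbackToSuper) exists for expanding the predicate into nested AND/OR comparisons instead.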
Uses of Column in io.debezium.processors.reselect

Methods in io.debezium.processors.reselect with parameters of type Column:

- private Object ReselectColumnsPostProcessor.getConvertedValue(Column column, org.apache.kafka.connect.data.Field field, Object value)
Uses of Column in io.debezium.relational

Classes in io.debezium.relational that implement Column

Fields in io.debezium.relational with type parameters of type Column:

- TableImpl.columnDefs
- TableImpl.columnsByLowercaseName
- private final FieldNameSelector.FieldNamer<Column> RelationalDatabaseConnectorConfig.fieldNamer
- private final FieldNameSelector.FieldNamer<Column> TableSchemaBuilder.fieldNamer
- private LinkedHashMap<String,Column> TableEditorImpl.sortedColumns

Methods in io.debezium.relational that return Column:

- NoOpTableEditorImpl.columnWithName(String name)
- Table.columnWithName(String name)
  Get the definition for the column in this table with the supplied name.
- TableEditor.columnWithName(String name)
  Get the definition for the column in this table with the supplied name.
- TableEditorImpl.columnWithName(String name)
- TableImpl.columnWithName(String name)
- ColumnEditor.create()
  Obtain an immutable column definition representing the current state of this editor.
- ColumnEditorImpl.create()

Methods in io.debezium.relational that return types with arguments of type Column:

- NoOpTableEditorImpl.columns()
- Table.columns()
  Get the definitions for the columns in this table, in the same order in which the table defines them.
- TableEditor.columns()
  Get the definitions for the columns in this table.
- TableEditorImpl.columns()
- TableImpl.columns()
  Create the map of predicate functions that specify which columns are to be included.
- Table.filterColumns(Predicate<Column> predicate)
  Utility to obtain a copy of a list of the columns that satisfy the specified predicate.
- RelationalDatabaseConnectorConfig.getFieldNamer()
- Key.KeyMapper.getKeyKolumns(Table table)
- Key.keyColumns()
- Table.primaryKeyColumns()
  Get the columns that make up the primary key for this table.

Methods in io.debezium.relational with parameters of type Column:

- NoOpTableEditorImpl.addColumns(Column... columns)
- default TableEditor TableEditor.addColumns(Column... columns)
  Add one or more columns to this table, regardless of the position of the supplied columns.
- TableEditorImpl.addColumns(Column... columns)
- protected void TableSchemaBuilder.addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
  Add to the supplied SchemaBuilder a field for the column with the given information.
- ValueConverterProvider.converter(Column columnDefinition, org.apache.kafka.connect.data.Field fieldDefn)
  Returns a ValueConverter that can be used to convert the JDBC values corresponding to the given JDBC temporal type into literal values described by the schema.
- protected ValueConverter TableSchemaBuilder.createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
  Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition.
- private String CustomConverterRegistry.fullColumnName(TableId table, Column column)
- CustomConverterRegistry.getValueConverter(TableId table, Column column)
  Obtain a pre-registered converter for a given column.
- private static boolean Key.matchColumn(Selectors.TableIdToStringMapper tableIdMapper, Table table, Predicate<ColumnId> predicate, Column c)
- DefaultValueConverter.parseDefaultValue(Column column, String defaultValueExpression)
  Parses the string representation of the default value to an object. This interface is used to convert the default value literal to a Java type recognized by value converters for a subset of types.
- private Object TableSchemaBuilder.parseDefaultValue(TableId tableId, Column column)
- Optional<org.apache.kafka.connect.data.SchemaBuilder> CustomConverterRegistry.registerConverterFor(TableId table, Column column, Object defaultValue)
  Create and register a converter for a given database column.
- org.apache.kafka.connect.data.SchemaBuilder ValueConverterProvider.schemaBuilder(Column columnDefinition)
  Returns a SchemaBuilder for a Schema describing literal values of the given JDBC type.
- NoOpTableEditorImpl.setColumns(Column... columns)
- TableEditor.setColumns(Column... columns)
  Set this table's column definitions.
- TableEditorImpl.setColumns(Column... columns)
- NoOpTableEditorImpl.updateColumn(Column column)
- TableEditor.updateColumn(Column column)
  Update the column with the given name.
- TableEditorImpl.updateColumn(Column newColumn)
- private ValueConverter TableSchemaBuilder.wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)

Method parameters in io.debezium.relational with type arguments of type Column:

- NoOpTableEditorImpl.addColumns(Iterable<Column> columns)
- TableEditor.addColumns(Iterable<Column> columns)
  Add one or more columns to the end of this table's list of columns, regardless of the position of the supplied columns.
- TableEditorImpl.addColumns(Iterable<Column> columns)
- protected ValueConverter[] TableSchemaBuilder.convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
  Obtain the array of converters for each column in a row.
- protected StructGenerator TableSchemaBuilder.createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns, TopicNamingStrategy topicNamingStrategy)
  Creates the function that produces a Kafka Connect key object for a row of data.
- protected StructGenerator TableSchemaBuilder.createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
  Creates the function that produces a Kafka Connect value object for a row of data.
- protected org.apache.kafka.connect.data.Field[] TableSchemaBuilder.fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)
- Table.filterColumns(Predicate<Column> predicate)
  Utility to obtain a copy of a list of the columns that satisfy the specified predicate.
- protected int[] TableSchemaBuilder.indexesForColumns(List<Column> columns)
- Tables.overwriteTable(TableId tableId, List<Column> columnDefs, List<String> primaryKeyColumnNames, String defaultCharsetName, List<Attribute> attributes)
  Add or update the definition for the identified table.
- NoOpTableEditorImpl.setColumns(Iterable<Column> columns)
- TableEditor.setColumns(Iterable<Column> columns)
  Set this table's column definitions.
- TableEditorImpl.setColumns(Iterable<Column> columns)

Constructor parameters in io.debezium.relational with type arguments of type Column:

- (package private) TableImpl(TableId id, List<Column> sortedColumns, List<String> pkColumnNames, String defaultCharsetName, String comment, List<Attribute> attributes)
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
  Create a new instance of the builder.
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode, CommonConnectorConfig.EventConvertingFailureHandlingMode eventConvertingFailureHandlingMode)
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, org.apache.kafka.connect.data.Schema transactionSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode, CommonConnectorConfig.EventConvertingFailureHandlingMode eventConvertingFailureHandlingMode)
  Create a new instance of the builder.
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
  Create a new instance of the builder.
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, org.apache.kafka.connect.data.Schema transactionSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
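The members above reflect an editor pattern: Table and Column values are immutable, while TableEditor/ColumnEditor accumulate mutations until create() produces an immutable snapshot. A minimal plain-Java analog of that pattern (SimpleColumn and SimpleTableEditor are illustrative stand-ins, not Debezium classes):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Immutable column definition, in the spirit of io.debezium.relational.Column.
record SimpleColumn(String name, int jdbcType) {}

// Mutable editor whose create() returns an immutable snapshot,
// mirroring the TableEditor/ColumnEditor style.
class SimpleTableEditor {
    private final List<SimpleColumn> columns = new ArrayList<>();

    // Fluent, like TableEditor.addColumns(Column...).
    SimpleTableEditor addColumns(SimpleColumn... cols) {
        Collections.addAll(columns, cols);
        return this;
    }

    // Like Table.columnWithName(String): look up a column definition by name.
    SimpleColumn columnWithName(String name) {
        return columns.stream()
                .filter(c -> c.name().equals(name))
                .findFirst()
                .orElse(null);
    }

    // Immutable snapshot of the current editor state, like ColumnEditor.create().
    List<SimpleColumn> create() {
        return List.copyOf(columns);
    }
}
```

Because create() copies, later edits to the editor never leak into snapshots already handed out, which is the property the real Table/Column immutability guarantees.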
Uses of Column in io.debezium.relational.ddl

Methods in io.debezium.relational.ddl that return Column:

- protected Column AbstractDdlParser.createColumnFromConstant(String columnName, String constantValue)
Uses of Column in io.debezium.relational.history

Methods in io.debezium.relational.history with parameters of type Column:

- private Document JsonTableChangeSerializer.toDocument(Column column)
- private org.apache.kafka.connect.data.Struct
Uses of Column in io.debezium.relational.mapping

Fields in io.debezium.relational.mapping with type parameters of type Column:

- private final Function<Column,ValueConverter> MaskStrings.converterFromColumn
- protected final BiPredicate<TableId,Column> ColumnMappers.MapperRule.predicate

Methods in io.debezium.relational.mapping with parameters of type Column:

- default void ColumnMapper.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder)
  Optionally annotate the schema with properties to better capture the mapping behavior.
- void MaskStrings.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder)
- void PropagateSourceMetadataToSchemaParameter.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder)
- void TruncateStrings.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder)
- ColumnMapper.create(Column column)
  Create for the given column a function that maps values.
- ColumnMappers.Builder.fullyQualifiedColumnDatatype(TableId tableId, Column column)
- ColumnMappers.Builder.fullyQualifiedColumnName(TableId tableId, Column column)
- protected boolean TruncateStrings.isTruncationPossible(Column column)
- ColumnMappers.Builder.mappedTableColumnDatatype(TableId tableId, Column column)
- ColumnMappers.Builder.mappedTableColumnName(TableId tableId, Column column)
- ColumnMappers.mappingConverterFor(TableId tableId, Column column)
  Get the value mapping function for the given column.
- ColumnMappers.mappingConverterFor(Table table, Column column)
  Get the value mapping function for the given column.

Constructor parameters in io.debezium.relational.mapping with type arguments of type Column:

- protected MapperRule(BiPredicate<TableId,Column> predicate, ColumnMapper mapper)
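Conceptually, mappers such as MaskStrings and TruncateStrings each yield a per-column value-mapping function, and ColumnMappers selects the right one for a given column. That idea can be sketched with a plain map of functions (the registry class, its key format, and its method names below are illustrative only, not Debezium's API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

public class ColumnMappingSketch {

    // Per-column value mappers, keyed by a fully-qualified column name.
    private final Map<String, UnaryOperator<String>> mappers = new HashMap<>();

    // In the spirit of MaskStrings: replace every non-null value with a fixed mask.
    void maskStringsIn(String fullyQualifiedColumn, String mask) {
        mappers.put(fullyQualifiedColumn, value -> value == null ? null : mask);
    }

    // In the spirit of TruncateStrings: cap the value at a maximum length.
    void truncateStringsIn(String fullyQualifiedColumn, int maxLength) {
        mappers.put(fullyQualifiedColumn, value ->
                value != null && value.length() > maxLength ? value.substring(0, maxLength) : value);
    }

    // In the spirit of ColumnMappers.mappingConverterFor: unmapped columns pass through.
    String apply(String fullyQualifiedColumn, String value) {
        return mappers.getOrDefault(fullyQualifiedColumn, UnaryOperator.identity()).apply(value);
    }
}
```

The real implementation additionally lets alterFieldSchema annotate the emitted Kafka Connect schema (e.g. recording that a field is masked or truncated), which a bare value function cannot express.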
Uses of Column in io.debezium.schema

Methods in io.debezium.schema that return types with arguments of type Column:

- static FieldNameSelector.FieldNamer<Column> FieldNameSelector.defaultSelector(SchemaNameAdjuster fieldNameAdjuster)
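A FieldNamer<Column> derives the Kafka Connect field name from a column, adjusting names the serialization format (e.g. Avro) cannot represent. The kind of adjustment involved can be sketched as follows; this is an illustration of the idea, not SchemaNameAdjuster's exact rules:

```java
public class FieldNameSketch {

    // Replaces characters that are not ASCII letters, digits, or '_' with '_',
    // and prefixes '_' when the name starts with a digit. Illustrative of what
    // an adjuster-backed FieldNamer does for Avro-style field names.
    static String adjust(String columnName) {
        StringBuilder sb = new StringBuilder(columnName.length());
        for (int i = 0; i < columnName.length(); i++) {
            char c = columnName.charAt(i);
            boolean valid = (c < 128 && Character.isLetterOrDigit(c)) || c == '_';
            sb.append(valid ? c : '_');
        }
        if (sb.length() > 0 && Character.isDigit(sb.charAt(0))) {
            sb.insert(0, '_');
        }
        return sb.toString();
    }
}
```

Applying the selector per column (rather than hard-coding names) is what lets connectors handle source columns like "order-id" that would otherwise produce invalid schema field names.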
Uses of Column in io.debezium.util

Fields in io.debezium.util declared as Column

Fields in io.debezium.util with type parameters of type Column

Methods in io.debezium.util that return Column

Methods in io.debezium.util that return types with arguments of type Column

Constructors in io.debezium.util with parameters of type Column

Constructor parameters in io.debezium.util with type arguments of type Column:

- MappedColumns(Map<String,Column> sourceTableColumns, int greatestColumnPosition)