Package io.debezium.relational
Class TableSchemaBuilder
java.lang.Object
io.debezium.relational.TableSchemaBuilder
Builder that constructs
TableSchema instances for Table definitions.
This builder is responsible for mapping table columns to fields in Kafka Connect Schemas,
and this is necessarily dependent upon the database's supported types. Although mappings are defined for standard types,
this class may need to be subclassed for each DBMS to add support for DBMS-specific types by overriding any of the
"add*Field" methods.
See the Java SE "Mapping SQL and Java Types" guide for details about how JDBC types map to Java value types.
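As the class description notes, DBMS-specific types are handled by subclassing and overriding one of the "add*Field" methods. A minimal sketch of such a subclass follows; the "GEOMETRY" type name, the choice of an optional-bytes schema for it, and the subclass name are illustrative assumptions, not Debezium code, while the constructor and `addField` signatures come from this page:

```java
// Hypothetical DBMS-specific subclass. Only the overridden branch for the
// (assumed) vendor type "GEOMETRY" differs from the standard behavior.
public class MyDbmsTableSchemaBuilder extends TableSchemaBuilder {

    public MyDbmsTableSchemaBuilder(ValueConverterProvider valueConverterProvider,
                                    SchemaNameAdjuster schemaNameAdjuster,
                                    CustomConverterRegistry customConverterRegistry,
                                    org.apache.kafka.connect.data.Schema sourceInfoSchema,
                                    FieldNameSelector.FieldNamer<Column> fieldNamer,
                                    boolean multiPartitionMode) {
        super(valueConverterProvider, schemaNameAdjuster, customConverterRegistry,
              sourceInfoSchema, fieldNamer, multiPartitionMode);
    }

    @Override
    protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder,
                            Table table, Column column, ColumnMapper mapper) {
        if ("GEOMETRY".equalsIgnoreCase(column.typeName())) {
            // Map the vendor-specific type to an optional byte-array field
            // (an illustrative choice, not Debezium's actual mapping).
            builder.field(column.name(),
                    org.apache.kafka.connect.data.Schema.OPTIONAL_BYTES_SCHEMA);
        }
        else {
            // Fall back to the standard type mappings.
            super.addField(builder, table, column, mapper);
        }
    }
}
```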
Author:
Randall Hauch
Field Summary
Fields
- private final CustomConverterRegistry customConverterRegistry
- private final DefaultValueConverter defaultValueConverter
- private final FieldNameSelector.FieldNamer<Column> fieldNamer
- private static final org.slf4j.Logger LOGGER
- private final boolean multiPartitionMode
- private final SchemaNameAdjuster schemaNameAdjuster
- private final org.apache.kafka.connect.data.Schema sourceInfoSchema
- private final ValueConverterProvider valueConverterProvider
Constructor Summary
Constructors
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
  Create a new instance of the builder.
- TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
  Create a new instance of the builder.
Method Summary
- protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
  Add to the supplied SchemaBuilder a field for the column with the given information.
- protected ValueConverter[] convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
  Obtain the array of converters for each column in a row.
- public TableSchema create(TopicNamingStrategy topicNamingStrategy, Table table, Tables.ColumnNameFilter filter, ColumnMappers mappers, Key.KeyMapper keysMapper)
  Create a TableSchema from the given table definition.
- protected StructGenerator createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns, TopicNamingStrategy topicNamingStrategy)
  Creates the function that produces a Kafka Connect key object for a row of data.
- protected ValueConverter createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
  Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition.
- protected StructGenerator createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
  Creates the function that produces a Kafka Connect value object for a row of data.
- protected org.apache.kafka.connect.data.Field[] fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)
- protected int[] indexesForColumns(List<Column> columns)
- public boolean isMultiPartitionMode()
- private void validateIncomingRowToInternalMetadata(int[] recordIndexes, org.apache.kafka.connect.data.Field[] fields, ValueConverter[] converters, Object[] row, int position)
- private ValueConverter wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)
Field Details
- LOGGER
  private static final org.slf4j.Logger LOGGER
- schemaNameAdjuster
  private final SchemaNameAdjuster schemaNameAdjuster
- valueConverterProvider
  private final ValueConverterProvider valueConverterProvider
- defaultValueConverter
  private final DefaultValueConverter defaultValueConverter
- sourceInfoSchema
  private final org.apache.kafka.connect.data.Schema sourceInfoSchema
- fieldNamer
  private final FieldNameSelector.FieldNamer<Column> fieldNamer
- customConverterRegistry
  private final CustomConverterRegistry customConverterRegistry
- multiPartitionMode
  private final boolean multiPartitionMode
Constructor Details
- TableSchemaBuilder
  public TableSchemaBuilder(ValueConverterProvider valueConverterProvider, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
  Create a new instance of the builder.
  Parameters:
  valueConverterProvider - the provider for obtaining ValueConverters and SchemaBuilders; may not be null
  schemaNameAdjuster - the adjuster for schema names; may not be null
- TableSchemaBuilder
  public TableSchemaBuilder(ValueConverterProvider valueConverterProvider, DefaultValueConverter defaultValueConverter, SchemaNameAdjuster schemaNameAdjuster, CustomConverterRegistry customConverterRegistry, org.apache.kafka.connect.data.Schema sourceInfoSchema, FieldNameSelector.FieldNamer<Column> fieldNamer, boolean multiPartitionMode)
  Create a new instance of the builder.
  Parameters:
  valueConverterProvider - the provider for obtaining ValueConverters and SchemaBuilders; may not be null
  defaultValueConverter - used to convert default value literals to Java types recognized by value converters, for a subset of types; may be null
  schemaNameAdjuster - the adjuster for schema names; may not be null
Method Details
- create
  public TableSchema create(TopicNamingStrategy topicNamingStrategy, Table table, Tables.ColumnNameFilter filter, ColumnMappers mappers, Key.KeyMapper keysMapper)
  Create a TableSchema from the given table definition. The resulting TableSchema will have a key schema that contains all of the columns that make up the table's primary key, and a value schema that contains only those columns that are not in the table's primary key. This is equivalent to calling create(table, false).
  Parameters:
  topicNamingStrategy - the topic naming strategy
  table - the table definition; may not be null
  filter - the filter that specifies whether columns in the table should be included; may be null if all columns are to be included
  mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
  Returns:
  the table schema that can be used for sending rows of data for this table to Kafka Connect; never null
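A typical call to create() can be sketched as follows, assuming the topic naming strategy, table definition, filter, mappers, and key mapper have already been set up by the connector (the variable names here are illustrative):

```java
// Illustrative use of create(); all inputs are assumed to come from the
// connector's configuration and the captured table metadata.
TableSchema tableSchema = tableSchemaBuilder.create(
        topicNamingStrategy, // names the destination topic for this table
        table,               // the Table definition; may not be null
        filter,              // null means "include all columns"
        mappers,             // null means "no column value mapping"
        keysMapper);         // customizes which columns form the record key

// The resulting key/value schemas are then used to build Kafka Connect records.
org.apache.kafka.connect.data.Schema keySchema = tableSchema.keySchema();
org.apache.kafka.connect.data.Schema valueSchema = tableSchema.valueSchema();
```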
- isMultiPartitionMode
  public boolean isMultiPartitionMode()
- createKeyGenerator
  protected StructGenerator createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns, TopicNamingStrategy topicNamingStrategy)
  Creates the function that produces a Kafka Connect key object for a row of data.
  Parameters:
  schema - the Kafka Connect schema for the key; may be null if there is no known schema, in which case the generator will be null
  columnSetName - the name for the set of columns, used in error messages; may not be null
  columns - the column definitions for the table that defines the row; may not be null
  topicNamingStrategy - the topic naming strategy
  Returns:
  the key-generating function, or null if there is no key schema
- validateIncomingRowToInternalMetadata
  private void validateIncomingRowToInternalMetadata(int[] recordIndexes, org.apache.kafka.connect.data.Field[] fields, ValueConverter[] converters, Object[] row, int position)
- createValueGenerator
  protected StructGenerator createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Tables.ColumnNameFilter filter, ColumnMappers mappers)
  Creates the function that produces a Kafka Connect value object for a row of data.
  Parameters:
  schema - the Kafka Connect schema for the value; may be null if there is no known schema, in which case the generator will be null
  tableId - the table identifier; may not be null
  columns - the column definitions for the table that defines the row; may not be null
  filter - the filter that specifies whether columns in the table should be included; may be null if all columns are to be included
  mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
  Returns:
  the value-generating function, or null if there is no value schema
- indexesForColumns
  protected int[] indexesForColumns(List<Column> columns)
- fieldsForColumns
  protected org.apache.kafka.connect.data.Field[] fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns)
- convertersForColumns
  protected ValueConverter[] convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, ColumnMappers mappers)
  Obtain the array of converters for each column in a row. A converter might be null if the column is not to be included in the records.
  Parameters:
  schema - the schema; may not be null
  tableId - the identifier of the table that contains the columns
  columns - the columns in the row; may not be null
  mappers - the mapping functions for columns; may be null if none of the columns are to be mapped to different values
  Returns:
  the converters for each column in the rows; never null
- wrapInMappingConverterIfNeeded
  private ValueConverter wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter)
- addField
  protected void addField(org.apache.kafka.connect.data.SchemaBuilder builder, Table table, Column column, ColumnMapper mapper)
  Add to the supplied SchemaBuilder a field for the column with the given information.
  Parameters:
  builder - the schema builder; never null
  table - the table definition; never null
  column - the column definition
  mapper - the mapping function for the column; may be null if the column is not to be mapped to different values
- createValueConverterFor
  protected ValueConverter createValueConverterFor(TableId tableId, Column column, org.apache.kafka.connect.data.Field fieldDefn)
  Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition. This uses the supplied ValueConverterProvider object.
  Parameters:
  tableId - the id of the table containing the column; never null
  column - the column describing the input values; never null
  fieldDefn - the definition for the field in a Kafka Connect Schema describing the output of the function; never null
  Returns:
  the value conversion function; never null