| Package | Description |
|---|---|
| io.debezium.jdbc | |
| io.debezium.relational | |
| io.debezium.relational.ddl | |
| io.debezium.relational.history | |
| io.debezium.relational.mapping | |
| Modifier and Type | Method and Description |
|---|---|
| protected Object | JdbcValueConverters.convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.BIGINT. |
| protected Object | JdbcValueConverters.convertBinary(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY. |
| protected Object | JdbcValueConverters.convertBit(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.BIT. |
| protected ValueConverter | JdbcValueConverters.convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn) |
| protected Object | JdbcValueConverters.convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, int numBytes) Converts a value object for an expected JDBC type of Types.BIT of length 2+. |
| protected Object | JdbcValueConverters.convertBoolean(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.BOOLEAN. |
| protected ByteBuffer | JdbcValueConverters.convertByteArray(Column column, byte[] data) Converts the given byte array value into a byte buffer as preferred by Kafka Connect. |
| protected Object | JdbcValueConverters.convertDateToEpochDays(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch. |
| protected Object | JdbcValueConverters.convertDateToEpochDaysAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch, but represented as a Date value at midnight on the date. |
| protected Object | JdbcValueConverters.convertDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.DECIMAL. |
| protected Object | JdbcValueConverters.convertDouble(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.DOUBLE. |
| ValueConverter | JdbcValueConverters.converter(Column column, org.apache.kafka.connect.data.Field fieldDefn) |
| protected Object | JdbcValueConverters.convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.FLOAT. |
| protected Object | JdbcValueConverters.convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.INTEGER. |
| protected Object | JdbcValueConverters.convertNumeric(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.NUMERIC. |
| protected Object | JdbcValueConverters.convertReal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.REAL. |
| protected Object | JdbcValueConverters.convertRowId(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.ROWID. |
| protected Object | JdbcValueConverters.convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.SMALLINT. |
| protected Object | JdbcValueConverters.convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.CHAR, Types.VARCHAR, Types.LONGVARCHAR, Types.CLOB, Types.NCHAR, Types.NVARCHAR, Types.LONGNVARCHAR, Types.NCLOB, Types.DATALINK, and Types.SQLXML. |
| protected Object | JdbcValueConverters.convertTimestampToEpochMicros(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIMESTAMP to MicroTimestamp values, or microseconds past epoch. |
| protected Object | JdbcValueConverters.convertTimestampToEpochMillis(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIMESTAMP to Timestamp values, or milliseconds past epoch. |
| protected Object | JdbcValueConverters.convertTimestampToEpochMillisAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIMESTAMP to Date values representing milliseconds past epoch. |
| protected Object | JdbcValueConverters.convertTimestampToEpochNanos(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIMESTAMP to NanoTimestamp values, or nanoseconds past epoch. |
| protected Object | JdbcValueConverters.convertTimestampWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIMESTAMP_WITH_TIMEZONE. |
| protected Object | JdbcValueConverters.convertTimeToMicrosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIME to MicroTime values, or microseconds past midnight. |
| protected Object | JdbcValueConverters.convertTimeToMillisPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIME to Time values, or milliseconds past midnight. |
| protected Object | JdbcValueConverters.convertTimeToMillisPastMidnightAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIME to Date values representing the milliseconds past midnight on the epoch day. |
| protected Object | JdbcValueConverters.convertTimeToNanosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIME to NanoTime values, or nanoseconds past midnight. |
| protected Object | JdbcValueConverters.convertTimeWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TIME_WITH_TIMEZONE. |
| protected Object | JdbcValueConverters.convertTinyInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Converts a value object for an expected JDBC type of Types.TINYINT. |
| protected int | JdbcValueConverters.getTimePrecision(Column column) |
| protected Object | JdbcValueConverters.handleUnknownData(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) Convert an unknown data value. |
| org.apache.kafka.connect.data.SchemaBuilder | JdbcValueConverters.schemaBuilder(Column column) |
| protected Object | JdbcValueConverters.toBigDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) |
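The table above lists a family of per-JDBC-type convert* methods, with converter(Column, Field) selecting among them based on the column's type. As a rough, standalone illustration of that dispatch pattern only (the class and conversions below are illustrative stand-ins, not Debezium's implementation):

```java
import java.sql.Types;
import java.util.function.Function;

// Simplified analogue of the converter(...) dispatch: pick a per-type
// conversion function based on the column's java.sql.Types code.
public class ConverterDispatch {
    public static Function<Object, Object> converterFor(int jdbcType) {
        switch (jdbcType) {
            case Types.BOOLEAN:
                return data -> data instanceof Boolean ? data : Boolean.parseBoolean(data.toString());
            case Types.BIGINT:
                return data -> data instanceof Long ? data : Long.parseLong(data.toString());
            case Types.VARCHAR:
                return Object::toString;
            default:
                // Mirrors the spirit of handleUnknownData: pass the value through.
                return data -> data;
        }
    }

    public static void main(String[] args) {
        System.out.println(converterFor(Types.BIGINT).apply("42"));   // a Long
        System.out.println(converterFor(Types.BOOLEAN).apply("true"));
    }
}
```

The real converters also consult the column definition and the Kafka Connect field schema; this sketch keys off the type code alone.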
| Modifier and Type | Method and Description |
|---|---|
| static void | JdbcConnection.columnsFor(ResultSet resultSet, Consumer<Column> consumer) Determine the column definitions for the supplied result set and add each column to the specified consumer. |
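columnsFor uses a callback style: each column definition derived from the result set is pushed into the supplied Consumer<Column> rather than returned in a collection. A minimal standalone sketch of the same pattern (the Column record and array inputs here are stand-ins; the real method reads ResultSet metadata):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Stand-in for io.debezium.relational.Column: just a name and JDBC type code.
record Column(String name, int jdbcType) {}

public class ColumnsFor {
    // Same shape as JdbcConnection.columnsFor: push each derived column
    // into the supplied consumer instead of returning a collection.
    public static void columnsFor(String[] names, int[] types, Consumer<Column> consumer) {
        for (int i = 0; i < names.length; i++) {
            consumer.accept(new Column(names[i], types[i]));
        }
    }

    public static void main(String[] args) {
        List<Column> cols = new ArrayList<>();
        columnsFor(new String[] {"id", "name"},
                   new int[] {java.sql.Types.BIGINT, java.sql.Types.VARCHAR},
                   cols::add);
        System.out.println(cols.size()); // 2
    }
}
```

Passing `cols::add` as the consumer recovers the collect-into-a-list behavior when that is what the caller wants.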
| Modifier and Type | Class and Description |
|---|---|
| (package private) class | ColumnImpl |
| Modifier and Type | Field and Description |
|---|---|
| private List<Column> | TableImpl.columnDefs |
| private Map<String,Column> | TableImpl.columnsByLowercaseName |
| private LinkedHashMap<String,Column> | TableEditorImpl.sortedColumns |
| Modifier and Type | Method and Description |
|---|---|
| Column | Table.columnWithName(String name) Get the definition for the column in this table with the supplied name. |
| Column | NoOpTableEditorImpl.columnWithName(String name) |
| Column | TableEditorImpl.columnWithName(String name) |
| Column | TableEditor.columnWithName(String name) Get the definition for the column in this table with the supplied name. |
| Column | TableImpl.columnWithName(String name) |
| Column | ColumnEditor.create() Obtain an immutable column definition representing the current state of this editor. |
| Column | ColumnEditorImpl.create() |
| private Column[] | HistorizedRelationalSnapshotChangeEventSource.getColumnsForResultSet(Table table, ResultSet rs) |
| Modifier and Type | Method and Description |
|---|---|
| List<Column> | Table.columns() Get the definitions for the columns in this table, in the same order in which the table defines them. |
| List<Column> | NoOpTableEditorImpl.columns() |
| List<Column> | TableEditorImpl.columns() |
| List<Column> | TableEditor.columns() Get the definitions for the columns in this table. |
| List<Column> | TableImpl.columns() |
| static Map<TableId,Predicate<Column>> | ColumnId.filter(String columnBlacklist) Create the map of predicate functions that specify which columns are to be included. |
| default List<Column> | Table.filterColumns(Predicate<Column> predicate) Utility to obtain a copy of a list of the columns that satisfy the specified predicate. |
| default List<Column> | Table.nonPrimaryKeyColumns() Get the columns that are not included in the primary key for this table. |
| default List<Column> | Table.primaryKeyColumns() Get the columns that make up the primary key for this table. |
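The default methods above (filterColumns, primaryKeyColumns, nonPrimaryKeyColumns) can all be derived from columns() plus the table's primary-key column names. A minimal sketch under that assumption (Column and Table here are standalone stand-ins, not the Debezium interfaces):

```java
import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Stand-in column type: only a name, for illustration.
record Column(String name) {}

interface Table {
    List<Column> columns();
    List<String> primaryKeyColumnNames();

    // Copy of the columns satisfying the predicate, as in Table.filterColumns.
    default List<Column> filterColumns(Predicate<Column> predicate) {
        return columns().stream().filter(predicate).collect(Collectors.toList());
    }

    default List<Column> primaryKeyColumns() {
        return filterColumns(c -> primaryKeyColumnNames().contains(c.name()));
    }

    default List<Column> nonPrimaryKeyColumns() {
        return filterColumns(c -> !primaryKeyColumnNames().contains(c.name()));
    }
}

public class TableDemo {
    public static void main(String[] args) {
        Table t = new Table() {
            public List<Column> columns() { return List.of(new Column("id"), new Column("name")); }
            public List<String> primaryKeyColumnNames() { return List.of("id"); }
        };
        System.out.println(t.primaryKeyColumns());
        System.out.println(t.nonPrimaryKeyColumns());
    }
}
```

Since every implementation (TableImpl, the editors) only needs to supply columns() and the key names, the filtering helpers live once in the interface.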
| Modifier and Type | Method and Description |
|---|---|
| protected void | TableEditorImpl.add(Column defn) |
| default TableEditor | TableEditor.addColumn(Column column) Add one column to this table, regardless of the position of the supplied column. |
| TableEditor | NoOpTableEditorImpl.addColumns(Column... columns) |
| TableEditor | TableEditorImpl.addColumns(Column... columns) |
| TableEditor | TableEditor.addColumns(Column... columns) Add one or more columns to this table, regardless of the position of the supplied columns. |
| protected void | TableSchemaBuilder.addField(org.apache.kafka.connect.data.SchemaBuilder builder, Column column, ColumnMapper mapper) Add to the supplied SchemaBuilder a field for the column with the given information. |
| default int | Column.compareTo(Column that) |
| ValueConverter | ValueConverterProvider.converter(Column columnDefinition, org.apache.kafka.connect.data.Field fieldDefn) Returns a ValueConverter that can be used to convert the JDBC values corresponding to the given JDBC temporal type into literal values described by the schema. |
| protected ValueConverter | TableSchemaBuilder.createValueConverterFor(Column column, org.apache.kafka.connect.data.Field fieldDefn) Create a ValueConverter that can be used to convert row values for the given column into the Kafka Connect value object described by the field definition. |
| private Object | HistorizedRelationalSnapshotChangeEventSource.getColumnValue(ResultSet rs, int columnIndex, Column column) |
| org.apache.kafka.connect.data.SchemaBuilder | ValueConverterProvider.schemaBuilder(Column columnDefinition) Returns a SchemaBuilder for a Schema describing literal values of the given JDBC type. |
| TableEditor | NoOpTableEditorImpl.setColumns(Column... columns) |
| TableEditor | TableEditorImpl.setColumns(Column... columns) |
| TableEditor | TableEditor.setColumns(Column... columns) Set this table's column definitions. |
| private ValueConverter | TableSchemaBuilder.wrapInMappingConverterIfNeeded(ColumnMappers mappers, TableId tableId, Column column, ValueConverter converter) |
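TableEditor follows a fluent editor pattern: addColumn, addColumns, and setColumns each return the editor itself so calls can be chained, and TableEditorImpl keys its columns by lower-cased name in insertion order (its sortedColumns field is a LinkedHashMap<String,Column>). A simplified standalone analogue of that pattern (not Debezium's actual implementation):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;

// Stand-in column type for the sketch.
record Column(String name) {}

// Simplified analogue of TableEditorImpl: columns kept in insertion order,
// keyed case-insensitively, with fluent add/set methods.
public class TableEditor {
    private final LinkedHashMap<String, Column> sortedColumns = new LinkedHashMap<>();

    public TableEditor addColumns(Column... columns) {
        for (Column c : columns) {
            // Re-adding a column with the same name replaces its definition.
            sortedColumns.put(c.name().toLowerCase(), c);
        }
        return this; // fluent: enables editor.addColumns(...).setColumns(...)
    }

    public TableEditor setColumns(Column... columns) {
        sortedColumns.clear(); // "set" replaces all existing definitions
        return addColumns(columns);
    }

    public List<Column> columns() {
        return new ArrayList<>(sortedColumns.values());
    }
}
```

The fluent return type is why the interface methods above are all declared to return TableEditor rather than void.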
| Modifier and Type | Method and Description |
|---|---|
| TableEditor | NoOpTableEditorImpl.addColumns(Iterable<Column> columns) |
| TableEditor | TableEditorImpl.addColumns(Iterable<Column> columns) |
| TableEditor | TableEditor.addColumns(Iterable<Column> columns) Add one or more columns to the end of this table's list of columns, regardless of the position of the supplied columns. |
| protected ValueConverter[] | TableSchemaBuilder.convertersForColumns(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Predicate<ColumnId> filter, ColumnMappers mappers) Obtain the array of converters for each column in a row. |
| protected Function<Object[],Object> | TableSchemaBuilder.createKeyGenerator(org.apache.kafka.connect.data.Schema schema, TableId columnSetName, List<Column> columns) Creates the function that produces a Kafka Connect key object for a row of data. |
| protected Function<Object[],org.apache.kafka.connect.data.Struct> | TableSchemaBuilder.createValueGenerator(org.apache.kafka.connect.data.Schema schema, TableId tableId, List<Column> columns, Predicate<ColumnId> filter, ColumnMappers mappers) Creates the function that produces a Kafka Connect value object for a row of data. |
| protected org.apache.kafka.connect.data.Field[] | TableSchemaBuilder.fieldsForColumns(org.apache.kafka.connect.data.Schema schema, List<Column> columns) |
| default List<String> | Table.filterColumnNames(Predicate<Column> predicate) Utility to obtain a copy of a list of the names of those columns that satisfy the specified predicate. |
| default List<Column> | Table.filterColumns(Predicate<Column> predicate) Utility to obtain a copy of a list of the columns that satisfy the specified predicate. |
| protected int[] | TableSchemaBuilder.indexesForColumns(List<Column> columns) |
| Table | Tables.overwriteTable(TableId tableId, List<Column> columnDefs, List<String> primaryKeyColumnNames, String defaultCharsetName) Add or update the definition for the identified table. |
| TableEditor | NoOpTableEditorImpl.setColumns(Iterable<Column> columns) |
| TableEditor | TableEditorImpl.setColumns(Iterable<Column> columns) |
| TableEditor | TableEditor.setColumns(Iterable<Column> columns) Set this table's column definitions. |
| Constructor and Description |
|---|
| TableImpl(TableId id, List<Column> sortedColumns, List<String> pkColumnNames, String defaultCharsetName) |
| Modifier and Type | Method and Description |
|---|---|
| protected Column | AbstractDdlParser.createColumnFromConstant(String columnName, String constantValue) |
| Modifier and Type | Method and Description |
|---|---|
| protected Map<String,Column> | LegacyDdlParser.parseColumnsInSelectClause(TokenStream.Marker start) Parse the column information in the SELECT clause. |
| Modifier and Type | Method and Description |
|---|---|
| private Document | TableChanges.TableChange.toDocument(Column column) |
| Modifier and Type | Method and Description |
|---|---|
| default void | ColumnMapper.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder) Optionally annotate the schema with properties to better capture the mapping behavior. |
| void | PropagateSourceTypeToSchemaParameter.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder) |
| void | TruncateStrings.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder) |
| void | MaskStrings.alterFieldSchema(Column column, org.apache.kafka.connect.data.SchemaBuilder schemaBuilder) |
| ValueConverter | ColumnMapper.create(Column column) Create for the given column a function that maps values. |
| ValueConverter | PropagateSourceTypeToSchemaParameter.create(Column column) |
| ValueConverter | TruncateStrings.create(Column column) |
| ValueConverter | MaskStrings.create(Column column) |
| protected boolean | TruncateStrings.isTruncationPossible(Column column) |
| ColumnMapper | ColumnMappers.mapperFor(TableId tableId, Column column) Get the value mapping function for the given column. |
| ValueConverter | ColumnMappers.mappingConverterFor(Table table, Column column) Get the value mapping function for the given column. |
| ValueConverter | ColumnMappers.mappingConverterFor(TableId tableId, Column column) Get the value mapping function for the given column. |
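A ColumnMapper pairs a per-column value-mapping function (create) with an optional schema annotation (alterFieldSchema). The value-mapping half of a TruncateStrings-style mapper might look like the following standalone sketch (the types and names here are stand-ins, not Debezium's code):

```java
import java.util.function.Function;

// Stand-in for Debezium's ValueConverter: maps a raw row value to the
// value that will be emitted in the Kafka Connect record.
interface ValueConverter extends Function<Object, Object> {}

public class TruncateStringsSketch {
    private final int maxLength;

    public TruncateStringsSketch(int maxLength) {
        this.maxLength = maxLength;
    }

    // Analogue of ColumnMapper.create(Column): returns the mapping function
    // applied to every value of the mapped column.
    public ValueConverter create() {
        return value -> {
            if (value instanceof String s && s.length() > maxLength) {
                return s.substring(0, maxLength); // truncate oversized strings
            }
            return value; // non-strings and short-enough strings pass through
        };
    }
}
```

The real TruncateStrings additionally consults the column definition (see isTruncationPossible above) so that truncation is only attempted on string-typed columns.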
Copyright © 2018 JBoss by Red Hat. All rights reserved.