Class BinlogValueConverters
java.lang.Object
io.debezium.jdbc.JdbcValueConverters
io.debezium.connector.binlog.jdbc.BinlogValueConverters
- All Implemented Interfaces:
ValueConverterProvider
Binlog-based connector-specific customizations for converting JDBC values obtained from the binlog client
library.
This class uses UTC as the default time zone when converting values without timezone details to values that
require timezones. This is because MariaDB
TIMESTAMP values are always stored in UTC, unlike types
such as DATETIME, and are replicated as such. Meanwhile, the Binlog Client deserializes these as
Timestamp values, which have no timezone and are therefore presumed to be UTC.
If a column is Types.TIMESTAMP_WITH_TIMEZONE, the converters need to convert the value
from a Timestamp to an OffsetDateTime using the default time zone, which
is always UTC.
- Author:
- Randall Hauch, Chris Cranford
- See Also:
-
AbstractRowsEventDataDeserializer
-
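The UTC handling described in the class description can be sketched with plain java.time types. The helper below is hypothetical (not part of this class) and shows how a zoneless Timestamp from the binlog client would be promoted to an OffsetDateTime by presuming UTC:

```java
import java.sql.Timestamp;
import java.time.Instant;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

public class UtcConversionSketch {
    // Hypothetical helper mirroring the behavior described above: a zoneless
    // Timestamp deserialized by the binlog client is presumed to be UTC.
    static OffsetDateTime toOffsetDateTime(Timestamp ts) {
        return ts.toInstant().atOffset(ZoneOffset.UTC);
    }

    public static void main(String[] args) {
        Timestamp ts = Timestamp.from(Instant.parse("2024-01-15T10:30:00Z"));
        System.out.println(toOffsetDateTime(ts)); // 2024-01-15T10:30Z
    }
}
```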
Nested Class Summary
Nested classes/interfaces inherited from class io.debezium.jdbc.JdbcValueConverters
JdbcValueConverters.BigIntUnsignedMode, JdbcValueConverters.DecimalMode -
Field Summary
Fields
- private static final Pattern DATE_FIELD_PATTERN
Used to parse values of DATE columns.
- private final CommonConnectorConfig.EventConvertingFailureHandlingMode eventConvertingFailureHandlingMode
- private static final org.slf4j.Logger INVALID_VALUE_LOGGER
- private static final org.slf4j.Logger LOGGER
- private static final Pattern TIME_FIELD_PATTERN
Used to parse values of TIME columns.
- private static final Pattern TIMESTAMP_FIELD_PATTERN
Used to parse values of TIMESTAMP columns.
Fields inherited from class io.debezium.jdbc.JdbcValueConverters
adaptiveTimeMicrosecondsPrecisionMode, adaptiveTimePrecisionMode, adjuster, bigIntUnsignedMode, binaryMode, decimalMode, defaultOffset, fallbackTimestampWithTimeZone, logger -
Constructor Summary
Constructors
- BinlogValueConverters(JdbcValueConverters.DecimalMode decimalMode, TemporalPrecisionMode temporalPrecisionMode, JdbcValueConverters.BigIntUnsignedMode bigIntUnsignedMode, CommonConnectorConfig.BinaryHandlingMode binaryHandlingMode, TemporalAdjuster adjuster, CommonConnectorConfig.EventConvertingFailureHandlingMode eventConvertingFailureHandlingMode)
Create a new instance of the value converters that always uses UTC for the default time zone when converting values without timezone information to values that require timezones.
Method Summary
- static Temporal adjustTemporal(Temporal temporal)
A utility method that adjusts ambiguous 2-digit year values of DATETIME, DATE, and TIMESTAMP types using these database-specific rules: Year values in the range 00-69 are converted to 2000-2069. Year values in the range 70-99 are converted to 1970-1999.
- protected ByteOrder byteOrderOfBitType()
- protected Charset charsetFor(Column column)
Return the Charset instance with the database-specific character set name used by the given column.
- static boolean containsZeroValuesInDatePart(String timestampString, Column column, Table table)
Checks if the TIMESTAMP_FIELD_PATTERN timestamp field pattern contains 0 in the year, month, or day parts of the parsed value.
- protected Object convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object convertDurationToMicroseconds(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object convertEnumToString(List<String> options, Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an ENUM, which is represented in the binlog events as an integer value containing the index of the enum option.
- protected Object convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
FLOAT(p) values are reported as FLOAT and DOUBLE.
- protected Object convertGeometry(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing a GEOMETRY byte[] value to a Geometry value used in a SourceRecord.
- protected Object convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object convertJson(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object convertPoint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing a POINT byte[] value to a Point value used in a SourceRecord.
- protected Object convertSetToString(List<String> options, Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for a SET, which is represented in the binlog events as a long number in which every bit corresponds to a different option.
- protected String convertSetValue(Column column, long indexes, List<String> options)
- protected Object convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Charset columnCharset, Object data)
- protected Object convertTimestampToLocalDateTime(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected Object convertUnsignedBigint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned BIGINT value to the correct Unsigned INT representation.
- protected Object convertUnsignedInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned INT value to the correct Unsigned INT representation.
- protected Object convertUnsignedMediumint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned MEDIUMINT value to the correct Unsigned SMALLINT representation.
- protected Object convertUnsignedSmallint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned SMALLINT value to the correct Unsigned SMALLINT representation.
- protected Object convertUnsignedTinyint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned TINYINT value to the correct Unsigned TINYINT representation.
- protected Object convertYearToInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for a YEAR, which appears in the binlog as an integer but is returned from the JDBC driver as either a short or a Date.
- extractEnumAndSetOptions(Column column)
- protected String extractEnumAndSetOptionsAsString(Column column)
- protected abstract String getJavaEncodingForCharSet(String charSetName)
- protected Object handleUnknownData(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- protected boolean isGeometryCollection(String upperCaseTypeName)
Determine if the uppercase form of a column's type is geometry collection, independent of JDBC driver or server version.
- protected boolean matches(String upperCaseTypeName, String upperCaseMatch)
Determine if the uppercase form of a column's type exactly matches or begins with the specified prefix.
- protected byte[] normalizeBinaryData(Column column, byte[] data)
- org.apache.kafka.connect.data.SchemaBuilder schemaBuilder(Column column)
- static Duration stringToDuration(String timeString)
Converts a TIME_FIELD_PATTERN time field pattern string to a Duration.
- static LocalDate stringToLocalDate(String dateString, Column column, Table table)
Converts a DATE_FIELD_PATTERN date string field pattern to a LocalDate value.
Methods inherited from class io.debezium.jdbc.JdbcValueConverters
convertBinary, convertBinaryToBase64, convertBinaryToBase64UrlSafe, convertBinaryToBytes, convertBinaryToHex, convertBit, convertBits, convertBits, convertBoolean, convertDateToEpochDays, convertDateToEpochDaysAsDate, convertDecimal, convertDouble, convertNumeric, convertReal, convertRowId, convertString, convertTime, convertTimestampToEpochMicros, convertTimestampToEpochMillis, convertTimestampToEpochMillisAsDate, convertTimestampToEpochNanos, convertTimestampWithZone, convertTimeToMicrosPastMidnight, convertTimeToMillisPastMidnight, convertTimeToMillisPastMidnightAsDate, convertTimeToNanosPastMidnight, convertTimeWithZone, convertTinyInt, convertValue, getTimePrecision, padLittleEndian, toBigDecimal, toByteBuffer, unexpectedBinary, withScaleAdjustedIfNeeded
-
Field Details
-
LOGGER
private static final org.slf4j.Logger LOGGER -
INVALID_VALUE_LOGGER
private static final org.slf4j.Logger INVALID_VALUE_LOGGER -
TIME_FIELD_PATTERN
Used to parse values of TIME columns. Format: 000:00:00.000000. -
DATE_FIELD_PATTERN
Used to parse values of DATE columns. Format: 000-00-00. -
TIMESTAMP_FIELD_PATTERN
Used to parse values of TIMESTAMP columns. Format: 000-00-00 00:00:00.000. -
eventConvertingFailureHandlingMode
private final CommonConnectorConfig.EventConvertingFailureHandlingMode eventConvertingFailureHandlingMode
-
-
Constructor Details
-
BinlogValueConverters
public BinlogValueConverters(JdbcValueConverters.DecimalMode decimalMode, TemporalPrecisionMode temporalPrecisionMode, JdbcValueConverters.BigIntUnsignedMode bigIntUnsignedMode, CommonConnectorConfig.BinaryHandlingMode binaryHandlingMode, TemporalAdjuster adjuster, CommonConnectorConfig.EventConvertingFailureHandlingMode eventConvertingFailureHandlingMode)
Create a new instance of the value converters that always uses UTC for the default time zone when converting values without timezone information to values that require timezones.
- Parameters:
decimalMode - how DECIMAL and NUMERIC values are treated; can be null if JdbcValueConverters.DecimalMode.PRECISE is used
temporalPrecisionMode - temporal precision mode
bigIntUnsignedMode - how BIGINT UNSIGNED values are treated; may be null if JdbcValueConverters.BigIntUnsignedMode.PRECISE is used
binaryHandlingMode - how binary columns should be treated
adjuster - a temporal adjuster to make a database-specific time adjustment before conversion
eventConvertingFailureHandlingMode - how to handle conversion failures
-
-
Method Details
-
schemaBuilder
- Specified by:
schemaBuilder in interface ValueConverterProvider
- Overrides:
schemaBuilder in class JdbcValueConverters
-
converter
- Specified by:
converter in interface ValueConverterProvider
- Overrides:
converter in class JdbcValueConverters
-
byteOrderOfBitType
- Overrides:
byteOrderOfBitType in class JdbcValueConverters
-
convertSmallInt
protected Object convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) - Overrides:
convertSmallInt in class JdbcValueConverters
-
convertInteger
protected Object convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) - Overrides:
convertInteger in class JdbcValueConverters
-
convertBigInt
protected Object convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) - Overrides:
convertBigInt in class JdbcValueConverters
-
convertFloat
protected Object convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
FLOAT(p) values are reported as FLOAT and DOUBLE. A value with precision 0 to 23 results in a 4-byte, single-precision FLOAT column. A value with precision 24 to 53 results in an 8-byte, double-precision DOUBLE column. MySQL 8.0.17 deprecated the non-standard FLOAT(M,D) and DOUBLE(M,D) syntax, and support for it should be expected to be removed in a future release. TODO: check the above deprecation with regard to MariaDB.
- Overrides:
convertFloat in class JdbcValueConverters
-
normalizeBinaryData
- Overrides:
normalizeBinaryData in class JdbcValueConverters
-
handleUnknownData
protected Object handleUnknownData(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) - Overrides:
handleUnknownData in class JdbcValueConverters
-
convertString
protected Object convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Charset columnCharset, Object data)
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
columnCharset - the Java character set in which column byte[] values are encoded; may not be null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertJson
protected Object convertJson(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertYearToInt
protected Object convertYearToInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for a YEAR, which appears in the binlog as an integer but is returned from the JDBC driver as either a short or a Date.
- Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a year literal integer value; never null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertEnumToString
protected Object convertEnumToString(List<String> options, Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an ENUM, which is represented in the binlog events as an integer value containing the index of the enum option. The JDBC driver returns a string containing the option, so this method calculates the same.
- Parameters:
options - the characters that appear in the same order as defined in the column; may not be null
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into an ENUM literal String value
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
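The index-to-option lookup this method performs can be illustrated with a hypothetical sketch (not the connector's actual code), assuming MySQL-style 1-based ENUM indexes where 0 is the empty/invalid value:

```java
import java.util.List;

public class EnumConversionSketch {
    // Hypothetical re-implementation of the lookup described above: binlog
    // events carry the integer index of the ENUM option; options are 1-based,
    // and index 0 (MySQL's empty/invalid value) yields null here.
    static String enumToString(List<String> options, int index) {
        if (index < 1 || index > options.size()) {
            return null;
        }
        return options.get(index - 1);
    }

    public static void main(String[] args) {
        List<String> options = List.of("small", "medium", "large");
        System.out.println(enumToString(options, 2)); // medium
        System.out.println(enumToString(options, 0)); // null
    }
}
```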
-
convertSetToString
protected Object convertSetToString(List<String> options, Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for a SET, which is represented in the binlog events as a long number in which every bit corresponds to a different option. The JDBC driver returns a string containing the comma-separated options, so this method calculates the same.
- Parameters:
options - the characters that appear in the same order as defined in the column; may not be null
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a SET literal String value; never null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
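The bit-to-option mapping described for SET values can be sketched as follows; this is a hypothetical re-implementation of the idea, not the connector's actual code:

```java
import java.util.List;
import java.util.StringJoiner;

public class SetConversionSketch {
    // Each bit of the long corresponds to one SET option, in the order the
    // options are defined in the column; set bits are joined with commas.
    static String setToString(List<String> options, long indexes) {
        StringJoiner joiner = new StringJoiner(",");
        for (int i = 0; i < options.size(); i++) {
            if ((indexes & (1L << i)) != 0) {
                joiner.add(options.get(i));
            }
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        List<String> options = List.of("a", "b", "c", "d");
        System.out.println(setToString(options, 0b0101)); // a,c
    }
}
```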
-
charsetFor
Return the Charset instance with the database-specific character set name used by the given column.
- Parameters:
column - the column in which the character set is used; never null
- Returns:
- the Java Charset, or null if there is no mapping
-
matches
Determine if the uppercase form of a column's type exactly matches or begins with the specified prefix. Note that this logic works when the column's type contains the type name followed by parentheses.
- Parameters:
upperCaseTypeName - the upper case form of the column's type name
upperCaseMatch - the upper case form of the expected type or prefix of the type; may not be null
- Returns:
- true if the type matches the specified type, or false otherwise
-
isGeometryCollection
Determine if the uppercase form of a column's type is geometry collection, independent of JDBC driver or server version.
- Parameters:
upperCaseTypeName - the upper case form of the column's type name
- Returns:
- true if the type is geometry collection
-
extractEnumAndSetOptionsAsString
-
convertSetValue
-
convertPoint
protected Object convertPoint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing a POINT byte[] value to a Point value used in a SourceRecord.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertGeometry
protected Object convertGeometry(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing a GEOMETRY byte[] value to a Geometry value used in a SourceRecord.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertUnsignedTinyint
protected Object convertUnsignedTinyint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned TINYINT value to the correct Unsigned TINYINT representation.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
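The kind of correction this family of unsigned conversions performs can be illustrated for TINYINT. The sketch below is hypothetical and assumes the raw value arrives as a signed byte, so unsigned column values above 127 show up negative and must be shifted back into the 0-255 range:

```java
public class UnsignedTinyintSketch {
    // A signed byte's two's-complement bit pattern, masked to 8 bits, yields
    // the unsigned 0-255 value; a short is wide enough to hold the result.
    static short toUnsignedTinyint(byte signed) {
        return (short) (signed & 0xFF);
    }

    public static void main(String[] args) {
        System.out.println(toUnsignedTinyint((byte) -1));  // 255
        System.out.println(toUnsignedTinyint((byte) 100)); // 100
    }
}
```

The wider unsigned types follow the same pattern with larger masks and result types (e.g. an unsigned BIGINT needs a BigDecimal or similar to hold values above Long.MAX_VALUE).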
-
convertUnsignedSmallint
protected Object convertUnsignedSmallint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned SMALLINT value to the correct Unsigned SMALLINT representation.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertUnsignedMediumint
protected Object convertUnsignedMediumint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned MEDIUMINT value to the correct Unsigned SMALLINT representation.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertUnsignedInt
protected Object convertUnsignedInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned INT value to the correct Unsigned INT representation.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertUnsignedBigint
protected Object convertUnsignedBigint(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert a value representing an Unsigned BIGINT value to the correct Unsigned INT representation.
- Parameters:
column - the column in which the value appears
fieldDefn - the field definition for the SourceRecord's Schema; never null
data - the data; may be null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
-
convertDurationToMicroseconds
protected Object convertDurationToMicroseconds(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected type of Duration to Long values representing the time in microseconds. Per the JDBC specification, databases should return Time instances, but that does not work here because Time can only handle the daytime range 00:00:00-23:59:59. We use Duration instead, which can handle the full range of -838:59:59.000000 to 838:59:59.000000 of a TIME type, and transfer the data as a signed INT64 reflecting the database value converted to microseconds.
- Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Duration type; never null
- Returns:
- the converted value, or null if the conversion could not be made and the column allows nulls
- Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
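The Duration-to-microseconds mapping described above can be sketched as follows; a negative TIME such as -838:59:59 survives as a negative signed INT64:

```java
import java.time.Duration;

public class TimeToMicrosSketch {
    // A Duration (which, unlike java.sql.Time, may exceed 24 hours and may
    // be negative) is reduced to a signed microsecond count.
    static long toMicroseconds(Duration duration) {
        return duration.toNanos() / 1_000;
    }

    public static void main(String[] args) {
        // The lower bound of the TIME range: -(838h 59m 59s).
        Duration time = Duration.ofHours(-838).minusMinutes(59).minusSeconds(59);
        System.out.println(toMicroseconds(time)); // -3020399000000
    }
}
```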
-
convertTimestampToLocalDateTime
-
extractEnumAndSetOptions
-
getJavaEncodingForCharSet
-
adjustTemporal
A utility method that adjusts ambiguous 2-digit year values of DATETIME, DATE, and TIMESTAMP types using these database-specific rules:
- Year values in the range 00-69 are converted to 2000-2069.
- Year values in the range 70-99 are converted to 1970-1999.
- Parameters:
temporal - the temporal instance to adjust; may not be null
- Returns:
- the possibly adjusted temporal instance; never null
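A minimal sketch of the 2-digit year rules using java.time (hypothetical, not the actual implementation):

```java
import java.time.LocalDate;
import java.time.temporal.ChronoField;
import java.time.temporal.Temporal;

public class TwoDigitYearSketch {
    // 00-69 maps to 2000-2069, 70-99 maps to 1970-1999; years of 100 or
    // more are assumed unambiguous and pass through unchanged.
    static Temporal adjust(Temporal temporal) {
        int year = temporal.get(ChronoField.YEAR);
        if (year >= 0 && year <= 69) {
            return temporal.with(ChronoField.YEAR, year + 2000);
        }
        if (year >= 70 && year <= 99) {
            return temporal.with(ChronoField.YEAR, year + 1900);
        }
        return temporal;
    }

    public static void main(String[] args) {
        System.out.println(adjust(LocalDate.of(69, 1, 1))); // 2069-01-01
        System.out.println(adjust(LocalDate.of(70, 1, 1))); // 1970-01-01
    }
}
```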
-
stringToDuration
Converts a TIME_FIELD_PATTERN time field pattern string to a Duration.
- Parameters:
timeString - the time field string value; should not be null
- Returns:
- the parsed duration
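A hypothetical parser for this TIME format can be sketched as below. The regex is an assumption, not the class's actual TIME_FIELD_PATTERN: it allows an optional sign, an hour field that may exceed 24, and an optional fraction of up to six digits:

```java
import java.time.Duration;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TimeStringSketch {
    // Assumed shape of the TIME string, e.g. "-838:59:59.000000".
    private static final Pattern TIME =
            Pattern.compile("(-?)(\\d+):(\\d+):(\\d+)(?:\\.(\\d{1,6}))?");

    static Duration stringToDuration(String timeString) {
        Matcher m = TIME.matcher(timeString);
        if (!m.matches()) {
            throw new IllegalArgumentException("Unexpected TIME value: " + timeString);
        }
        Duration d = Duration.ofHours(Long.parseLong(m.group(2)))
                .plusMinutes(Long.parseLong(m.group(3)))
                .plusSeconds(Long.parseLong(m.group(4)));
        if (m.group(5) != null) {
            // Right-pad the fraction to microseconds, e.g. ".5" -> 500000 us.
            long micros = Long.parseLong(String.format("%-6s", m.group(5)).replace(' ', '0'));
            d = d.plusNanos(micros * 1_000);
        }
        return "-".equals(m.group(1)) ? d.negated() : d;
    }

    public static void main(String[] args) {
        System.out.println(stringToDuration("10:15:30").toSeconds());   // 36930
        System.out.println(stringToDuration("-838:59:59").toSeconds()); // -3020399
    }
}
```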
-
stringToLocalDate
Converts a DATE_FIELD_PATTERN date string field pattern to a LocalDate value.
- Parameters:
dateString - the date string to be parsed; may be null or empty
column - the relational column; should not be null
table - the relational table; should not be null
- Returns:
- the parsed local date or null if the date's year, month, or day are zero
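The zero-date handling can be sketched as follows. The parsing here is a simplified assumption (a plain split on the hyphen) rather than the class's DATE_FIELD_PATTERN, and the column/table arguments used for error reporting are omitted:

```java
import java.time.LocalDate;

public class ZeroDateSketch {
    // MySQL/MariaDB permit "0000-00-00", which has no LocalDate equivalent;
    // a zero year, month, or day therefore yields null.
    static LocalDate stringToLocalDate(String dateString) {
        if (dateString == null || dateString.isEmpty()) {
            return null;
        }
        String[] parts = dateString.split("-");
        int year = Integer.parseInt(parts[0]);
        int month = Integer.parseInt(parts[1]);
        int day = Integer.parseInt(parts[2]);
        if (year == 0 || month == 0 || day == 0) {
            return null;
        }
        return LocalDate.of(year, month, day);
    }

    public static void main(String[] args) {
        System.out.println(stringToLocalDate("2024-06-01")); // 2024-06-01
        System.out.println(stringToLocalDate("0000-00-00")); // null
    }
}
```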
-
containsZeroValuesInDatePart
public static boolean containsZeroValuesInDatePart(String timestampString, Column column, Table table)
Checks if the TIMESTAMP_FIELD_PATTERN timestamp field pattern contains 0 in the year, month, or day parts of the parsed value.
- Parameters:
timestampString - the timestamp string to be parsed; may be null or empty
column - the relational column; should not be null
table - the relational table; should not be null
- Returns:
- true if the timestamp's year, month, or day are zero; false otherwise
-