@Immutable public class JdbcValueConverters extends Object implements ValueConverterProvider

ValueConverters and SchemaBuilders for various column types. This implementation is aware of the most common JDBC types and values. Specializations for specific DBMSes can be provided in subclasses.

Although values will usually correspond closely to the expected JDBC types, this class assumes that some variation can occur when values originate in libraries that are not JDBC drivers. Specifically, the conversion logic for JDBC temporal types with timezones (e.g., Types.TIMESTAMP_WITH_TIMEZONE) does support converting values that don't have timezones (e.g., Timestamp) by assuming a default time zone offset for values that lack (but are expected to have) timezones. Again, when the values correspond closely to the expected SQL/JDBC types, this default timezone offset is not needed.
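As a sketch of the fallback behavior described above, the following stand-alone snippet (not Debezium code; the class and method names are invented for illustration) shows how a timezone-less java.sql.Timestamp can be promoted to an OffsetDateTime by assuming a default offset:

```java
import java.sql.Timestamp;
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.ZoneOffset;

// Illustrative sketch: promote a value that has no zone information to a
// zone-aware value by assuming a default offset, in the spirit of what this
// class describes for Types.TIMESTAMP_WITH_TIMEZONE inputs.
public class DefaultOffsetSketch {

    // java.sql.Timestamp carries no zone; attach the assumed default offset.
    public static OffsetDateTime withDefaultOffset(Timestamp ts, ZoneOffset defaultOffset) {
        LocalDateTime local = ts.toLocalDateTime();
        return local.atOffset(defaultOffset);
    }

    public static void main(String[] args) {
        Timestamp ts = Timestamp.valueOf("2021-03-01 12:00:00");
        // Interpret the zone-less value as UTC (the documented default).
        System.out.println(withDefaultOffset(ts, ZoneOffset.UTC));
    }
}
```

When the driver already returns an OffsetDateTime, no such assumption is necessary; the default offset only matters for zone-less inputs.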
| Modifier and Type | Class and Description |
|---|---|
| static class | JdbcValueConverters.BigIntUnsignedMode |
| static class | JdbcValueConverters.DecimalMode |
| Modifier and Type | Field and Description |
|---|---|
| protected boolean | adaptiveTimeMicrosecondsPrecisionMode |
| protected boolean | adaptiveTimePrecisionMode |
| private TemporalAdjuster | adjuster |
| protected JdbcValueConverters.BigIntUnsignedMode | bigIntUnsignedMode |
| protected CommonConnectorConfig.BinaryHandlingMode | binaryMode |
| protected JdbcValueConverters.DecimalMode | decimalMode |
| private ZoneOffset | defaultOffset |
| private String | fallbackTimestampWithTimeZone: Fallback value for TIMESTAMP WITH TZ is epoch |
| private String | fallbackTimeWithTimeZone: Fallback value for TIME WITH TZ is 00:00 |
| protected org.slf4j.Logger | logger |
| Constructor and Description |
|---|
| JdbcValueConverters(): Create a new instance that always uses UTC for the default time zone when converting values without timezone information to values that require timezones, and that adapts time and timestamp values based upon the precision of the database columns. |
| JdbcValueConverters(JdbcValueConverters.DecimalMode decimalMode, TemporalPrecisionMode temporalPrecisionMode, ZoneOffset defaultOffset, TemporalAdjuster adjuster, JdbcValueConverters.BigIntUnsignedMode bigIntUnsignedMode, CommonConnectorConfig.BinaryHandlingMode binaryMode): Create a new instance, and specify the time zone offset that should be used only when converting values without timezone information to values that require timezones. |
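The adjuster constructor parameter can be illustrated with the JDK's own TemporalAdjuster: the local date is passed through the adjuster before the epoch day is obtained. The snippet below is a hypothetical sketch; AdjusterSketch and the two-digit-year rule are invented for illustration and are not part of this class:

```java
import java.time.LocalDate;
import java.time.temporal.TemporalAdjuster;

// Hypothetical sketch of what the constructor's `adjuster` parameter does:
// adjust the local date before computing the epoch day.
public class AdjusterSketch {

    public static long adjustedEpochDay(LocalDate date, TemporalAdjuster adjuster) {
        LocalDate adjusted = adjuster == null ? date : LocalDate.from(adjuster.adjustInto(date));
        return adjusted.toEpochDay();
    }

    public static void main(String[] args) {
        // Invented adjuster: map years 0-69 into 2000-2069, the kind of
        // two-digit-year normalization some databases apply.
        TemporalAdjuster twoDigitYears = temporal -> {
            LocalDate d = LocalDate.from(temporal);
            return d.getYear() < 70 ? d.plusYears(2000) : d;
        };
        // Year "21" is adjusted to 2021 before the epoch day is taken.
        System.out.println(adjustedEpochDay(LocalDate.of(21, 3, 1), twoDigitYears));
    }
}
```

With a null adjuster the date is used unchanged, matching the "may be null if no adjustment is necessary" contract.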
| Modifier and Type | Method and Description |
|---|---|
| protected ByteOrder | byteOrderOfBitType() |
| protected Object | convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.BIGINT. |
| protected Object | convertBinary(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, CommonConnectorConfig.BinaryHandlingMode mode) |
| protected Object | convertBinaryToBase64(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY. |
| protected Object | convertBinaryToBytes(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY. |
| protected Object | convertBinaryToHex(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY. |
| protected Object | convertBit(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.BIT. |
| protected ValueConverter | convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn) |
| protected Object | convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, int numBytes): Converts a value object for an expected JDBC type of Types.BIT of length 2+. |
| protected Object | convertBoolean(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.BOOLEAN. |
| protected Object | convertDateToEpochDays(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch. |
| protected Object | convertDateToEpochDaysAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch, but represented as a Date value at midnight on the date. |
| protected Object | convertDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.DECIMAL. |
| protected Object | convertDouble(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.DOUBLE. |
| ValueConverter | converter(Column column, org.apache.kafka.connect.data.Field fieldDefn): Returns a ValueConverter that can be used to convert the JDBC values corresponding to the given JDBC temporal type into literal values described by the schema. |
| protected Object | convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.FLOAT. |
| protected Object | convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.INTEGER. |
| protected Object | convertNumeric(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.NUMERIC. |
| protected Object | convertReal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.REAL. |
| protected Object | convertRowId(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.ROWID. |
| protected Object | convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.SMALLINT. |
| protected Object | convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.CHAR, Types.VARCHAR, Types.LONGVARCHAR, Types.CLOB, Types.NCHAR, Types.NVARCHAR, Types.LONGNVARCHAR, Types.NCLOB, Types.DATALINK, and Types.SQLXML. |
| protected Object | convertTime(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) |
| protected Object | convertTimestampToEpochMicros(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIMESTAMP to MicroTimestamp values, or microseconds past epoch. |
| protected Object | convertTimestampToEpochMillis(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIMESTAMP to Timestamp values, or milliseconds past epoch. |
| protected Object | convertTimestampToEpochMillisAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIMESTAMP to Date values representing milliseconds past epoch. |
| protected Object | convertTimestampToEpochNanos(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIMESTAMP to NanoTimestamp values, or nanoseconds past epoch. |
| protected Object | convertTimestampWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIMESTAMP_WITH_TIMEZONE. |
| protected Object | convertTimeToMicrosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIME to MicroTime values, or microseconds past midnight. |
| protected Object | convertTimeToMillisPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIME to Time values, or milliseconds past midnight. |
| protected Object | convertTimeToMillisPastMidnightAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIME to Date values representing the milliseconds past midnight on the epoch day. |
| protected Object | convertTimeToNanosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIME to NanoTime values, or nanoseconds past midnight. |
| protected Object | convertTimeWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TIME_WITH_TIMEZONE. |
| protected Object | convertTinyInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Converts a value object for an expected JDBC type of Types.TINYINT. |
| protected Object | convertValue(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, Object fallback, ValueConversionCallback callback): Converts the given value for the given column/field. |
| protected int | getTimePrecision(Column column) |
| protected Object | handleUnknownData(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data): Convert an unknown data value. |
| protected byte[] | padLittleEndian(int numBytes, byte[] data) |
| org.apache.kafka.connect.data.SchemaBuilder | schemaBuilder(Column column): Returns a SchemaBuilder for a Schema describing literal values of the given JDBC type. |
| private boolean | supportsLargeTimeValues() |
| protected Object | toBigDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data) |
| private byte[] | toByteArray(char[] chars) |
| private ByteBuffer | toByteBuffer(char[] chars) |
| protected ByteBuffer | toByteBuffer(Column column, byte[] data): Converts the given byte array value into a byte buffer as preferred by Kafka Connect. |
| private ByteBuffer | toByteBuffer(String string) |
| protected byte[] | unexpectedBinary(Object value, org.apache.kafka.connect.data.Field fieldDefn): Handle the unexpected value from a row with a column type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY. |
| protected BigDecimal | withScaleAdjustedIfNeeded(Column column, BigDecimal data) |
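Two of the simpler conversions in the table above boil down to java.time arithmetic. The following sketch (invented helper names, not the actual implementations) shows the math behind DATE as days past epoch and TIME as milliseconds past midnight:

```java
import java.time.LocalDate;
import java.time.LocalTime;

// Illustrative sketch of the arithmetic behind convertDateToEpochDays and
// convertTimeToMillisPastMidnight; names are invented for this example.
public class EpochMathSketch {

    // DATE: count of days since 1970-01-01.
    public static int daysPastEpoch(LocalDate date) {
        return (int) date.toEpochDay();
    }

    // TIME: milliseconds elapsed since midnight.
    public static int millisPastMidnight(LocalTime time) {
        return (int) (time.toNanoOfDay() / 1_000_000L);
    }

    public static void main(String[] args) {
        System.out.println(daysPastEpoch(LocalDate.of(1970, 1, 2)));   // 1
        System.out.println(millisPastMidnight(LocalTime.of(0, 0, 1))); // 1000
    }
}
```

The micro- and nano-precision variants differ only in the divisor applied to toNanoOfDay().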
protected final org.slf4j.Logger logger
private final ZoneOffset defaultOffset
private final String fallbackTimestampWithTimeZone
private final String fallbackTimeWithTimeZone
protected final boolean adaptiveTimePrecisionMode
protected final boolean adaptiveTimeMicrosecondsPrecisionMode
protected final JdbcValueConverters.DecimalMode decimalMode
private final TemporalAdjuster adjuster
protected final JdbcValueConverters.BigIntUnsignedMode bigIntUnsignedMode
protected final CommonConnectorConfig.BinaryHandlingMode binaryMode
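The binaryMode field selects among three representations of binary column data. The helpers below are an illustrative JDK-only stand-in (not the connector's code) for the bytes, base64, and hex handling modes:

```java
import java.nio.ByteBuffer;
import java.util.Base64;

// Illustrative stand-in for the three binary representations suggested by
// BinaryHandlingMode: raw bytes, base64 text, and hex text.
public class BinaryModeSketch {

    // BYTES: wrap the raw array in a ByteBuffer (Kafka Connect's preference).
    public static ByteBuffer asBytes(byte[] data) {
        return ByteBuffer.wrap(data);
    }

    // BASE64: encode the bytes as a base64 string.
    public static String asBase64(byte[] data) {
        return Base64.getEncoder().encodeToString(data);
    }

    // HEX: encode the bytes as lowercase hexadecimal.
    public static String asHex(byte[] data) {
        StringBuilder sb = new StringBuilder(data.length * 2);
        for (byte b : data) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        byte[] data = {0x01, (byte) 0xff};
        System.out.println(asBase64(data)); // Af8=
        System.out.println(asHex(data));    // 01ff
    }
}
```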
public JdbcValueConverters()
public JdbcValueConverters(JdbcValueConverters.DecimalMode decimalMode, TemporalPrecisionMode temporalPrecisionMode, ZoneOffset defaultOffset, TemporalAdjuster adjuster, JdbcValueConverters.BigIntUnsignedMode bigIntUnsignedMode, CommonConnectorConfig.BinaryHandlingMode binaryMode)
Parameters:
decimalMode - how DECIMAL and NUMERIC values should be treated; may be null if JdbcValueConverters.DecimalMode.PRECISE is to be used
temporalPrecisionMode - temporal precision mode based on TemporalPrecisionMode
defaultOffset - the zone offset that is to be used when converting non-timezone related values to values that do have timezones; may be null if UTC is to be used
adjuster - the optional component that adjusts the local date value before obtaining the epoch day; may be null if no adjustment is necessary
bigIntUnsignedMode - how BIGINT UNSIGNED values should be treated; may be null if JdbcValueConverters.BigIntUnsignedMode.PRECISE is to be used
binaryMode - how binary columns should be represented

public org.apache.kafka.connect.data.SchemaBuilder schemaBuilder(Column column)
Returns a SchemaBuilder for a Schema describing literal values of the given JDBC type.
Specified by:
schemaBuilder in interface ValueConverterProvider
Parameters:
column - the column definition; never null

public ValueConverter converter(Column column, org.apache.kafka.connect.data.Field fieldDefn)
Returns a ValueConverter that can be used to convert the JDBC values corresponding to the given JDBC temporal type into literal values described by the schema.
This method is only called when ValueConverterProvider.schemaBuilder(Column) returns a non-null SchemaBuilder for the same column definition.
Specified by:
converter in interface ValueConverterProvider
Parameters:
column - the column definition; never null
fieldDefn - the definition for the field in a Kafka Connect Schema describing the output of the function; never null

protected ValueConverter convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn)
protected Object convertTimestampWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIMESTAMP_WITH_TIMEZONE.
The standard ANSI to Java 8 type mappings specify that the preferred mapping (when using JDBC's getObject(...) methods) in Java 8 is to return OffsetDateTime for these values.
This method handles several types of objects, including OffsetDateTime, Timestamp, Date, LocalTime, and LocalDateTime.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimeWithZone(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIME_WITH_TIMEZONE.
The standard ANSI to Java 8 type mappings specify that the preferred mapping (when using JDBC's getObject(...) methods) in Java 8 is to return OffsetTime for these values.
This method handles several types of objects, including OffsetTime, Time, Date, LocalTime, and LocalDateTime. If any of the types have date components, those date components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTime(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
protected Object convertTimestampToEpochMillis(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIMESTAMP to Timestamp values, or milliseconds past epoch.
Per the JDBC specification, databases should return Timestamp instances, which have date and time info but no time zone info. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimestampToEpochMicros(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIMESTAMP to MicroTimestamp values, or microseconds past epoch.
Per the JDBC specification, databases should return Timestamp instances, which have date and time info but no time zone info. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimestampToEpochNanos(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIMESTAMP to NanoTimestamp values, or nanoseconds past epoch.
Per the JDBC specification, databases should return Timestamp instances, which have date and time info but no time zone info. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimestampToEpochMillisAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIMESTAMP to Date values representing milliseconds past epoch.
Per the JDBC specification, databases should return Timestamp instances, which have date and time info but no time zone info. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimeToMillisPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIME to Time values, or milliseconds past midnight.
Per the JDBC specification, databases should return Time instances that have no notion of date or time zones. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime. If any of the types might have date components, those date components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimeToMicrosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIME to MicroTime values, or microseconds past midnight.
Per the JDBC specification, databases should return Time instances that have no notion of date or time zones. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime. If any of the types might have date components, those date components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimeToNanosPastMidnight(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIME to NanoTime values, or nanoseconds past midnight.
Per the JDBC specification, databases should return Time instances that have no notion of date or time zones. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime. If any of the types might have date components, those date components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertTimeToMillisPastMidnightAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TIME to Date values representing the milliseconds past midnight on the epoch day.
Per the JDBC specification, databases should return Time instances that have no notion of date or time zones. This method handles Date objects plus any other standard date-related objects such as Date, LocalTime, and LocalDateTime. If any of the types might have date components, those date components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertDateToEpochDays(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch.
Per the JDBC specification, databases should return Date instances that have no notion of time or time zones. This method handles Date objects plus any other standard date-related objects such as Date, LocalDate, and LocalDateTime. If any of the types might have time components, those time components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertDateToEpochDaysAsDate(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.DATE to the number of days past epoch, but represented as a Date value at midnight on the date.
Per the JDBC specification, databases should return Date instances that have no notion of time or time zones. This method handles Date objects plus any other standard date-related objects such as Date, LocalDate, and LocalDateTime. If any of the types might have time components, those time components are ignored.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertBinary(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, CommonConnectorConfig.BinaryHandlingMode mode)
protected Object convertBinaryToBytes(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertBinaryToBase64(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertBinaryToHex(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected ByteBuffer toByteBuffer(Column column, byte[] data)
Converts the given byte array value into a byte buffer as preferred by Kafka Connect.

protected byte[] unexpectedBinary(Object value, org.apache.kafka.connect.data.Field fieldDefn)
Handle the unexpected value from a row with a column type of Types.BLOB, Types.BINARY, Types.VARBINARY, Types.LONGVARBINARY.
Parameters:
value - the binary value for which no conversion was found; never null
fieldDefn - the field definition in the Kafka Connect schema; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls
See Also:
convertBinaryToBytes(Column, Field, Object)

protected Object convertTinyInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.TINYINT.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertSmallInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.SMALLINT.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertInteger(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.INTEGER.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertBigInt(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.BIGINT.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertFloat(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.FLOAT.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertDouble(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.DOUBLE.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertReal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.REAL.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertNumeric(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.NUMERIC.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.DECIMAL.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object toBigDecimal(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)

protected BigDecimal withScaleAdjustedIfNeeded(Column column, BigDecimal data)

protected Object convertString(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.CHAR, Types.VARCHAR, Types.LONGVARCHAR, Types.CLOB, Types.NCHAR, Types.NVARCHAR, Types.LONGNVARCHAR, Types.NCLOB, Types.DATALINK, and Types.SQLXML.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertRowId(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.ROWID.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertBit(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.BIT.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object convertBits(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, int numBytes)
Converts a value object for an expected JDBC type of Types.BIT of length 2+.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
numBytes - the number of bytes that should be included in the resulting byte[]
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected byte[] padLittleEndian(int numBytes, byte[] data)

protected ByteOrder byteOrderOfBitType()
Determine whether the byte[] values for columns of type BIT(n) are big-endian or little-endian. All values for BIT(n) columns are to be returned in little-endian byte order.
By default, this method returns ByteOrder.LITTLE_ENDIAN.
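Because BIT(n) values are kept little-endian, padding a value out to a wider byte count appends zero bytes rather than prepending them. The following is a minimal sketch in the spirit of padLittleEndian(int, byte[]); it is not the actual implementation:

```java
import java.util.Arrays;

// Illustrative sketch of little-endian padding for BIT(n) values: the
// least-significant byte comes first, so high-order zero bytes go at the end.
public class PadLittleEndianSketch {

    public static byte[] padLittleEndian(int numBytes, byte[] data) {
        if (data.length >= numBytes) {
            return data; // already wide enough
        }
        // Arrays.copyOf keeps existing bytes in place and zero-fills the tail.
        return Arrays.copyOf(data, numBytes);
    }

    public static void main(String[] args) {
        byte[] padded = padLittleEndian(4, new byte[] {0x05});
        System.out.println(Arrays.toString(padded)); // [5, 0, 0, 0]
    }
}
```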
protected Object convertBoolean(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Converts a value object for an expected JDBC type of Types.BOOLEAN.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected Object handleUnknownData(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data)
Convert an unknown data value.
Parameters:
column - the column definition describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type; never null
Throws:
IllegalArgumentException - if the value could not be converted but the column does not allow nulls

protected int getTimePrecision(Column column)

protected Object convertValue(Column column, org.apache.kafka.connect.data.Field fieldDefn, Object data, Object fallback, ValueConversionCallback callback)
Converts the given value for the given column/field.
Parameters:
column - describing the data value; never null
fieldDefn - the field definition; never null
data - the data object to be converted into a Kafka Connect date type
fallback - value that will be applied in case the column is defined as NOT NULL without a default value, but we still received no value; may happen e.g. when enabling MySQL's non-strict mode
callback - conversion routine that will be invoked in case the value is not null
Returns:
null if the inbound value was null and the column is optional. Will be the column's default value (converted to the corresponding KC type) if the inbound value was null, the column is non-optional and has a default value. Will be fallback if the inbound value was null, the column is non-optional and has no default value. Otherwise, it will be the value produced by callback and lastly the result returned by handleUnknownData(Column, Field, Object).

private boolean supportsLargeTimeValues()

private byte[] toByteArray(char[] chars)

private ByteBuffer toByteBuffer(char[] chars)

private ByteBuffer toByteBuffer(String string)
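The decision order documented for convertValue can be sketched with plain Java. In this illustration, the Column and Field types are replaced by hypothetical primitives (an optional flag and a default value); it is not the actual method:

```java
import java.util.function.Function;

// Illustrative sketch of convertValue's documented decision order:
// 1. null input + optional column        -> null
// 2. null input + non-optional + default -> the column's default value
// 3. null input + non-optional, no default -> the supplied fallback
// 4. non-null input                      -> result of the conversion callback
public class ConvertValueSketch {

    public static Object convertValue(Object data,
                                      boolean optional,
                                      Object defaultValue, // null if the column has none
                                      Object fallback,
                                      Function<Object, Object> callback) {
        if (data == null) {
            if (optional) {
                return null;
            }
            return defaultValue != null ? defaultValue : fallback;
        }
        return callback.apply(data);
    }

    public static void main(String[] args) {
        // NOT NULL column with no default: a null input yields the fallback.
        System.out.println(convertValue(null, false, null, 0L, v -> v));
        // A non-null input is converted by the callback.
        System.out.println(convertValue("42", false, null, 0L, v -> Long.parseLong((String) v)));
    }
}
```

In the real method a non-null value that the callback cannot handle is finally passed to handleUnknownData(Column, Field, Object); that last step is omitted here for brevity.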
Copyright © 2021 JBoss by Red Hat. All rights reserved.