Package io.debezium.connector.jdbc.util
Interface SinkRecordFactory
- All Known Implementing Classes:
DebeziumSinkRecordFactory, FlatSinkRecordFactory
public interface SinkRecordFactory
- Author:
- Chris Cranford
-
Method Summary
Modifier and Type, Method, and Description:
default org.apache.kafka.connect.data.Schema allKafkaSchemaTypesSchema()
default org.apache.kafka.connect.data.Schema allKafkaSchemaTypesSchemaWithDefaults()
default org.apache.kafka.connect.data.Schema allKafkaSchemaTypesSchemaWithOptionalDefaultValues()
default org.apache.kafka.connect.data.Schema basicKeySchema() Returns a single field key schema.
default org.apache.kafka.connect.data.Schema basicKeySchema(UnaryOperator<String> columnNameTransformation) Returns a single field key schema.
default org.apache.kafka.connect.data.Schema basicRecordSchema()
default org.apache.kafka.connect.data.Schema basicRecordSchema(UnaryOperator<String> columnNameTransformation)
default org.apache.kafka.connect.data.Schema basicSourceSchema() Returns a simple source info block schema.
default org.apache.kafka.connect.sink.SinkRecord cloudEventRecord(String topicName, io.debezium.converters.spi.SerializerType serializerType, String cloudEventsSchemaName)
createBuilder() Returns a create SinkRecordBuilder instance.
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName)
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName, byte key)
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName, byte key, String database, String schema, String table)
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName, byte key, UnaryOperator<String> columnNameTransformation)
default org.apache.kafka.connect.sink.SinkRecord createRecordMultipleKeyColumns(String topicName)
default org.apache.kafka.connect.sink.SinkRecord createRecordNoKey(String topicName)
default org.apache.kafka.connect.sink.SinkRecord createRecordWithSchemaValue(String topicName, byte key, String fieldName, org.apache.kafka.connect.data.Schema fieldSchema, Object value)
default org.apache.kafka.connect.sink.SinkRecord createRecordWithSchemaValue(String topicName, byte key, List<String> fieldNames, List<org.apache.kafka.connect.data.Schema> fieldSchemas, List<Object> values)
deleteBuilder() Returns a delete SinkRecordBuilder instance.
default org.apache.kafka.connect.sink.SinkRecord deleteRecord(String topicName)
default org.apache.kafka.connect.sink.SinkRecord deleteRecordMultipleKeyColumns(String topicName)
boolean isFlattened() Returns whether the factory constructs flattened records or complex Debezium payloads.
default org.apache.kafka.connect.data.Schema keySchema(UnaryOperator<String> columnNameTransformation, org.apache.kafka.connect.data.Schema schema) Returns a single field key schema.
default org.apache.kafka.connect.data.Schema multipleKeyRecordSchema()
default org.apache.kafka.connect.data.Schema multipleKeySchema() Returns a multiple field key schema.
default org.apache.kafka.connect.data.Schema nickNameFieldSchema(UnaryOperator<String> columnNameTransformation)
default org.apache.kafka.connect.data.Schema primitiveKeySchema() Returns a primitive key schema.
default org.apache.kafka.connect.sink.SinkRecord tombstoneRecord(String topicName)
default org.apache.kafka.connect.sink.SinkRecord truncateRecord(String topicName)
updateBuilder() Returns an update SinkRecordBuilder instance.
default org.apache.kafka.connect.sink.SinkRecord updateRecord(String topicName)
default org.apache.kafka.connect.sink.SinkRecord updateRecordWithSchemaValue(String topicName, byte key, String fieldName, org.apache.kafka.connect.data.Schema fieldSchema, Object value)
-
Method Details
-
isFlattened
boolean isFlattened() Returns whether the factory constructs flattened records or complex Debezium payloads. -
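The difference between the two record shapes that `isFlattened()` distinguishes can be sketched with plain maps. This is an illustrative sketch only: the envelope field names (`before`, `after`, `op`) are assumptions based on Debezium's conventional change-event format, not taken from this interface.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PayloadShapes {
    public static void main(String[] args) {
        // Flattened record value: the row fields appear directly
        // (the FlatSinkRecordFactory style, isFlattened() == true).
        Map<String, Object> flattened = new LinkedHashMap<>();
        flattened.put("id", 1);
        flattened.put("name", "John Doe");

        // Complex Debezium payload: the row is wrapped in a change-event
        // envelope (the DebeziumSinkRecordFactory style). Field names here
        // are assumed from Debezium's conventional envelope.
        Map<String, Object> complex = new LinkedHashMap<>();
        complex.put("before", null);
        complex.put("after", flattened);
        complex.put("op", "c"); // "c" conventionally marks a create event

        System.out.println(flattened.containsKey("id"));
        System.out.println(complex.containsKey("after"));
    }
}
```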
createBuilder
Returns a create SinkRecordBuilder instance. -
updateBuilder
Returns an update SinkRecordBuilder instance. -
deleteBuilder
Returns a delete SinkRecordBuilder instance. -
primitiveKeySchema
default org.apache.kafka.connect.data.Schema primitiveKeySchema() Returns a primitive key schema. -
basicKeySchema
default org.apache.kafka.connect.data.Schema basicKeySchema() Returns a single field key schema. -
basicKeySchema
default org.apache.kafka.connect.data.Schema basicKeySchema(UnaryOperator<String> columnNameTransformation) Returns a single field key schema.
- Parameters:
columnNameTransformation - transformation for the field name
-
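The `columnNameTransformation` parameter is a plain `java.util.function.UnaryOperator<String>`, so any string-to-string function can be passed. A minimal sketch of transformations a test might hand to `basicKeySchema(columnNameTransformation)`; the upper-casing example is hypothetical, e.g. to mimic a database that folds identifiers to upper case:

```java
import java.util.function.UnaryOperator;

public class ColumnNameTransformationExample {
    public static void main(String[] args) {
        // Hypothetical transformation: fold every column name to upper case.
        UnaryOperator<String> toUpper = name -> name.toUpperCase();
        // UnaryOperator.identity() leaves field names unchanged.
        UnaryOperator<String> identity = UnaryOperator.identity();

        System.out.println(toUpper.apply("id"));
        System.out.println(identity.apply("id"));
    }
}
```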
keySchema
default org.apache.kafka.connect.data.Schema keySchema(UnaryOperator<String> columnNameTransformation, org.apache.kafka.connect.data.Schema schema) Returns a single field key schema.
- Parameters:
columnNameTransformation - transformation for the field name
schema - the schema used for the key field
-
multipleKeySchema
default org.apache.kafka.connect.data.Schema multipleKeySchema() Returns a multiple field key schema. -
basicSourceSchema
default org.apache.kafka.connect.data.Schema basicSourceSchema() Returns a simple source info block schema. -
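The exact fields of the source info block are not documented on this page; the sketch below only illustrates the kind of metadata such a block carries. The `db`, `schema`, and `table` names are assumptions inferred from the `createRecord(topicName, key, database, schema, table)` overload in this interface, not the actual fields returned by `basicSourceSchema()`.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SourceInfoBlockExample {
    public static void main(String[] args) {
        // Illustrative source info block: identifies where a change event
        // originated. Field names are assumed, not taken from the schema.
        Map<String, String> source = new LinkedHashMap<>();
        source.put("db", "test_db");
        source.put("schema", "public");
        source.put("table", "customers");

        System.out.println(source.size());
    }
}
```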
basicRecordSchema
default org.apache.kafka.connect.data.Schema basicRecordSchema() -
basicRecordSchema
default org.apache.kafka.connect.data.Schema basicRecordSchema(UnaryOperator<String> columnNameTransformation) -
nickNameFieldSchema
default org.apache.kafka.connect.data.Schema nickNameFieldSchema(UnaryOperator<String> columnNameTransformation) -
multipleKeyRecordSchema
default org.apache.kafka.connect.data.Schema multipleKeyRecordSchema() -
allKafkaSchemaTypesSchema
default org.apache.kafka.connect.data.Schema allKafkaSchemaTypesSchema() -
allKafkaSchemaTypesSchemaWithDefaults
default org.apache.kafka.connect.data.Schema allKafkaSchemaTypesSchemaWithDefaults() -
allKafkaSchemaTypesSchemaWithOptionalDefaultValues
default org.apache.kafka.connect.data.Schema allKafkaSchemaTypesSchemaWithOptionalDefaultValues() -
createRecordNoKey
default org.apache.kafka.connect.sink.SinkRecord createRecordNoKey(String topicName) -
createRecord
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName) -
createRecord
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName, byte key) -
createRecord
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName, byte key, UnaryOperator<String> columnNameTransformation) -
createRecordWithSchemaValue
default org.apache.kafka.connect.sink.SinkRecord createRecordWithSchemaValue(String topicName, byte key, String fieldName, org.apache.kafka.connect.data.Schema fieldSchema, Object value) -
createRecordWithSchemaValue
default org.apache.kafka.connect.sink.SinkRecord createRecordWithSchemaValue(String topicName, byte key, List<String> fieldNames, List<org.apache.kafka.connect.data.Schema> fieldSchemas, List<Object> values) -
createRecord
default org.apache.kafka.connect.sink.SinkRecord createRecord(String topicName, byte key, String database, String schema, String table) -
createRecordMultipleKeyColumns
default org.apache.kafka.connect.sink.SinkRecord createRecordMultipleKeyColumns(String topicName) -
updateRecord
default org.apache.kafka.connect.sink.SinkRecord updateRecord(String topicName) -
updateRecordWithSchemaValue
default org.apache.kafka.connect.sink.SinkRecord updateRecordWithSchemaValue(String topicName, byte key, String fieldName, org.apache.kafka.connect.data.Schema fieldSchema, Object value) -
deleteRecord
default org.apache.kafka.connect.sink.SinkRecord deleteRecord(String topicName) -
deleteRecordMultipleKeyColumns
default org.apache.kafka.connect.sink.SinkRecord deleteRecordMultipleKeyColumns(String topicName) -
tombstoneRecord
default org.apache.kafka.connect.sink.SinkRecord tombstoneRecord(String topicName) -
truncateRecord
default org.apache.kafka.connect.sink.SinkRecord truncateRecord(String topicName) -
cloudEventRecord
default org.apache.kafka.connect.sink.SinkRecord cloudEventRecord(String topicName, io.debezium.converters.spi.SerializerType serializerType, String cloudEventsSchemaName) -