Uses of Interface io.debezium.schema.DataCollectionId

Packages that use DataCollectionId
Uses of DataCollectionId in io.debezium.config

Methods in io.debezium.config with parameters of type DataCollectionId:
- boolean CommonConnectorConfig.isSignalDataCollection(DataCollectionId dataCollectionId)
Uses of DataCollectionId in io.debezium.connector.common

Fields in io.debezium.connector.common with type parameters of type DataCollectionId:
- private final Supplier<Collection<? extends DataCollectionId>> CdcSourceTaskContext.collectionsSupplier
  Obtains the data collections captured at the point of invocation.

Constructor parameters in io.debezium.connector.common with type arguments of type DataCollectionId:
- CdcSourceTaskContext(String connectorType, String connectorName, String taskId, Supplier<Collection<? extends DataCollectionId>> collectionsSupplier)
- CdcSourceTaskContext(String connectorType, String connectorName, Supplier<Collection<? extends DataCollectionId>> collectionsSupplier)
Uses of DataCollectionId in io.debezium.heartbeat

Classes in io.debezium.heartbeat with type parameters of type DataCollectionId:
- class HeartbeatFactory<T extends DataCollectionId>
  A factory for creating the appropriate Heartbeat implementation based on the connector type and its configured properties.
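The idea behind HeartbeatFactory is a small configuration-driven factory: when heartbeats are not configured, callers get a shared no-op instance instead of null. The following is a minimal stand-in sketch of that pattern; the `Heartbeat` interface and `create` method here are simplified illustrations, not Debezium's actual API.

```java
import java.util.function.Supplier;

// Simplified stand-in types for illustration; not Debezium's real classes.
interface Heartbeat {
    // Shared no-op instance used when heartbeats are disabled.
    Heartbeat NOOP = () -> { };

    void emit();
}

public class HeartbeatFactorySketch {
    /**
     * Mirrors the factory idea: choose a Heartbeat implementation from
     * configuration. A non-positive interval yields the no-op instance,
     * so callers never have to null-check.
     */
    static Heartbeat create(long intervalMs, Supplier<Heartbeat> realImpl) {
        return intervalMs > 0 ? realImpl.get() : Heartbeat.NOOP;
    }
}
```

A caller can then unconditionally invoke `heartbeat.emit()`; the disabled case costs nothing.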
Uses of DataCollectionId in io.debezium.pipeline

Classes in io.debezium.pipeline with type parameters of type DataCollectionId:
- class EventDispatcher<P extends Partition, T extends DataCollectionId>
  Central dispatcher for data change and schema change events.
- static interface EventDispatcher.InconsistentSchemaHandler<P extends Partition, T extends DataCollectionId>
  Reaction to an incoming change event for which no schema is found.

Method parameters in io.debezium.pipeline with type arguments of type DataCollectionId:
- void EventDispatcher.setIncrementalSnapshotChangeEventSource(Optional<IncrementalSnapshotChangeEventSource<P, ? extends DataCollectionId>> incrementalSnapshotChangeEventSource)
  Enables support for incremental snapshotting.
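The InconsistentSchemaHandler contract lets a connector decide what to do when a change event arrives for a collection whose schema is unknown: supply a resolved schema, or signal that the event should be skipped. A minimal stand-in sketch of that control flow (all types and method names here are simplified illustrations, not Debezium's real signatures):

```java
import java.util.Optional;

// Simplified stand-ins; the real EventDispatcher is far richer.
interface DataCollectionSchema { }

@FunctionalInterface
interface InconsistentSchemaHandler {
    // Return the schema to use for the event, or empty to skip it.
    Optional<DataCollectionSchema> handle(String collectionId, Object key, Object value);
}

public class DispatcherSketch {
    private final InconsistentSchemaHandler handler;

    DispatcherSketch(InconsistentSchemaHandler handler) {
        this.handler = handler;
    }

    /** Dispatches one event; returns true if it was emitted. */
    boolean dispatch(String collectionId, DataCollectionSchema known, Object key, Object value) {
        DataCollectionSchema schema = known != null
                ? known
                : handler.handle(collectionId, key, value).orElse(null);
        if (schema == null) {
            return false; // handler declined; event is dropped
        }
        // ... serialize with the schema and hand off to the sink ...
        return true;
    }
}
```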
Uses of DataCollectionId in io.debezium.pipeline.meters

Methods in io.debezium.pipeline.meters with parameters of type DataCollectionId:
- void SnapshotMeter.dataCollectionSnapshotCompleted(DataCollectionId dataCollectionId, long numRows)
- void CommonEventMeter.onEvent(DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value, Envelope.Operation operation)
- void StreamingMeter.onEvent(DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value)

Method parameters in io.debezium.pipeline.meters with type arguments of type DataCollectionId:
- void SnapshotMeter.monitoredDataCollectionsDetermined(Iterable<? extends DataCollectionId> dataCollectionIds)
Uses of DataCollectionId in io.debezium.pipeline.metrics

Methods in io.debezium.pipeline.metrics with parameters of type DataCollectionId:
- void DefaultSnapshotChangeEventSourceMetrics.dataCollectionSnapshotCompleted(P partition, DataCollectionId dataCollectionId, long numRows)
- void DefaultStreamingChangeEventSourceMetrics.onEvent(P partition, DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value, Envelope.Operation operation)
- void PipelineMetrics.onEvent(P partition, DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value, Envelope.Operation operation)

Method parameters in io.debezium.pipeline.metrics with type arguments of type DataCollectionId:
- void DefaultSnapshotChangeEventSourceMetrics.monitoredDataCollectionsDetermined(P partition, Iterable<? extends DataCollectionId> dataCollectionIds)
Uses of DataCollectionId in io.debezium.pipeline.signal

Fields in io.debezium.pipeline.signal with type parameters of type DataCollectionId:
- private final EventDispatcher<P, ? extends DataCollectionId> ExecuteSnapshot.dispatcher

Constructor parameters in io.debezium.pipeline.signal with type arguments of type DataCollectionId:
- ExecuteSnapshot(EventDispatcher<P, ? extends DataCollectionId> dispatcher)
- SchemaChanges(EventDispatcher<P, ? extends DataCollectionId> dispatcher, boolean useCatalogBeforeSchema)
- Signal(CommonConnectorConfig connectorConfig, EventDispatcher<P, ? extends DataCollectionId> eventDispatcher)
Uses of DataCollectionId in io.debezium.pipeline.source

Methods in io.debezium.pipeline.source with type parameters of type DataCollectionId:
- protected <T extends DataCollectionId> Stream<T> AbstractSnapshotChangeEventSource.determineDataCollectionsToBeSnapshotted(Collection<T> allDataCollections)
Uses of DataCollectionId in io.debezium.pipeline.source.snapshot.incremental

Classes in io.debezium.pipeline.source.snapshot.incremental with type parameters of type DataCollectionId:
- class AbstractIncrementalSnapshotChangeEventSource<P extends Partition, T extends DataCollectionId>
  An incremental snapshot change event source that emits events from a DB log interleaved with snapshot events.
- interface IncrementalSnapshotChangeEventSource<P extends Partition, T extends DataCollectionId>
  A contract t…
- class SignalBasedIncrementalSnapshotChangeEventSource<P extends Partition, T extends DataCollectionId>

Fields in io.debezium.pipeline.source.snapshot.incremental with type parameters of type DataCollectionId:
- private final EventDispatcher<P, ? extends DataCollectionId> CloseIncrementalSnapshotWindow.dispatcher

Methods in io.debezium.pipeline.source.snapshot.incremental with parameters of type DataCollectionId:
- protected void AbstractIncrementalSnapshotChangeEventSource.deduplicateWindow(DataCollectionId dataCollectionId, Object key)
- void IncrementalSnapshotChangeEventSource.processMessage(P partition, DataCollectionId dataCollectionId, Object key, OffsetContext offsetContext)
- void SignalBasedIncrementalSnapshotChangeEventSource.processMessage(Partition partition, DataCollectionId dataCollectionId, Object key, OffsetContext offsetContext)
- void AbstractIncrementalSnapshotChangeEventSource.processSchemaChange(P partition, DataCollectionId dataCollectionId)
- default void IncrementalSnapshotChangeEventSource.processSchemaChange(P partition, DataCollectionId dataCollectionId)

Constructor parameters in io.debezium.pipeline.source.snapshot.incremental with type arguments of type DataCollectionId:
- CloseIncrementalSnapshotWindow(EventDispatcher<P, ? extends DataCollectionId> dispatcher)
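The deduplicateWindow method reflects the core trick of incremental snapshotting: while a snapshot "window" is open, rows read from the snapshot chunk are buffered, and any streamed change arriving for the same key evicts the buffered row, so the newer streamed event wins. A minimal stand-in sketch of that bookkeeping (class and method names are illustrative, not Debezium's API; keys stand in for primary keys):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch of the snapshot-window deduplication idea behind
// AbstractIncrementalSnapshotChangeEventSource.deduplicateWindow().
public class SnapshotWindowSketch {
    // Chunk rows buffered while the window is open, keyed by primary key.
    private final Map<Object, Object> windowBuffer = new LinkedHashMap<>();
    private boolean windowOpen;

    /** Opens the window over one snapshot chunk. */
    void openWindow(Map<Object, Object> chunkRows) {
        windowBuffer.putAll(chunkRows);
        windowOpen = true;
    }

    /** A streamed change for the same key supersedes the buffered snapshot row. */
    void deduplicateWindow(Object key) {
        if (windowOpen) {
            windowBuffer.remove(key);
        }
    }

    /** Closing the window flushes only rows not superseded by streamed events. */
    Map<Object, Object> closeWindow() {
        windowOpen = false;
        return windowBuffer;
    }
}
```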
Uses of DataCollectionId in io.debezium.pipeline.source.spi

Methods in io.debezium.pipeline.source.spi that return types with arguments of type DataCollectionId:
- default Optional<IncrementalSnapshotChangeEventSource<P, ? extends DataCollectionId>> ChangeEventSourceFactory.getIncrementalSnapshotChangeEventSource(O offsetContext, SnapshotProgressListener<P> snapshotProgressListener, DataChangeEventListener<P> dataChangeEventListener)
  Returns an incremental snapshot change event source that can run in parallel with streaming and read and send data collection content in chunks.

Methods in io.debezium.pipeline.source.spi with parameters of type DataCollectionId:
- void SnapshotProgressListener.dataCollectionSnapshotCompleted(P partition, DataCollectionId dataCollectionId, long numRows)
- EventMetadataProvider.getEventSourcePosition(DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value)
- EventMetadataProvider.getEventTimestamp(DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value)
- EventMetadataProvider.getTransactionId(DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value)
- void DataChangeEventListener.onEvent(P partition, DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value, Envelope.Operation operation)
  Invoked if an event is processed for a captured table.
- default String EventMetadataProvider.toSummaryString(DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value)

Method parameters in io.debezium.pipeline.source.spi with type arguments of type DataCollectionId:
- void SnapshotProgressListener.monitoredDataCollectionsDetermined(P partition, Iterable<? extends DataCollectionId> dataCollectionIds)
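The EventMetadataProvider contract extracts per-event metadata (timestamp, transaction id, source position) from a change event's source block. A minimal stand-in sketch of that extraction, using a plain map in place of the Kafka Connect Struct; the field names "ts_ms" and "txId" are assumptions for illustration, as actual field names vary by connector:

```java
import java.time.Instant;
import java.util.Map;

// Illustrative stand-in for the EventMetadataProvider idea; not Debezium's API.
public class MetadataProviderSketch {
    /** Extracts the event timestamp, or null if the source block lacks one. */
    static Instant getEventTimestamp(Map<String, ?> sourceInfo) {
        Long tsMs = (Long) sourceInfo.get("ts_ms"); // field name is an assumption
        return tsMs == null ? null : Instant.ofEpochMilli(tsMs);
    }

    /** Extracts the transaction id, or null if the event is non-transactional. */
    static String getTransactionId(Map<String, ?> sourceInfo) {
        Object txId = sourceInfo.get("txId"); // field name is an assumption
        return txId == null ? null : txId.toString();
    }
}
```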
Uses of DataCollectionId in io.debezium.pipeline.spi

Methods in io.debezium.pipeline.spi with parameters of type DataCollectionId:
- void OffsetContext.event(DataCollectionId collectionId, Instant timestamp)
  Records the name of the collection and the timestamp of the last event.
Uses of DataCollectionId in io.debezium.pipeline.txmetadata

Methods in io.debezium.pipeline.txmetadata with parameters of type DataCollectionId:
- void TransactionMonitor.dataEvent(Partition partition, DataCollectionId source, OffsetContext offset, Object key, org.apache.kafka.connect.data.Struct value)
- long TransactionContext.event(DataCollectionId source)
- private void TransactionMonitor.transactionEvent(OffsetContext offsetContext, DataCollectionId source, org.apache.kafka.connect.data.Struct value)
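TransactionContext.event returns a long, which suggests a running per-collection event counter within the current transaction (such counters feed the transaction metadata topic). A minimal stand-in sketch of that bookkeeping; the class, its String key, and the reset method are illustrative assumptions, not Debezium's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of per-collection event counting within one transaction.
public class TransactionContextSketch {
    private final Map<String, Long> perCollectionEventCount = new HashMap<>();

    /** Registers one event for the collection and returns its running count. */
    long event(String collectionId) {
        return perCollectionEventCount.merge(collectionId, 1L, Long::sum);
    }

    /** Clears the counters when the next transaction begins. */
    void reset() {
        perCollectionEventCount.clear();
    }
}
```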
Uses of DataCollectionId in io.debezium.relational

Classes in io.debezium.relational that implement DataCollectionId
Uses of DataCollectionId in io.debezium.schema

Classes in io.debezium.schema with type parameters of type DataCollectionId:
- interface DatabaseSchema<I extends DataCollectionId>
  The schema of a database.
- static interface …
- interface HistorizedDatabaseSchema<I extends DataCollectionId>
  A database schema that is historized, i.e. …
- class TopicSelector<I extends DataCollectionId>
  Implementations return names for Kafka topics (data and meta-data).
- static interface TopicSelector.DataCollectionTopicNamer<I extends DataCollectionId>
  Implementations determine the topic name corresponding to a given data collection.
- private static class TopicSelector.TopicNameCache<I extends DataCollectionId>
  A topic namer that caches names it has obtained from a delegate.
- private static class TopicSelector.TopicNameSanitizer<I extends DataCollectionId>
  A topic namer that replaces any characters invalid in a topic name with '_'.

Methods in io.debezium.schema with type parameters of type DataCollectionId:
- static <I extends DataCollectionId> TopicSelector<I> TopicSelector.defaultSelector(CommonConnectorConfig connectorConfig, TopicSelector.DataCollectionTopicNamer<I> dataCollectionTopicNamer)
- static <I extends DataCollectionId> TopicSelector<I> TopicSelector.defaultSelector(String prefix, String heartbeatPrefix, String delimiter, TopicSelector.DataCollectionTopicNamer<I> dataCollectionTopicNamer)

Methods in io.debezium.schema that return DataCollectionId
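The defaultSelector(prefix, heartbeatPrefix, delimiter, namer) overload hints at the naming scheme: a topic name is the configured prefix joined to the collection's identifier by the delimiter, then sanitized as TopicNameSanitizer describes, replacing characters invalid in a Kafka topic name with '_'. A stand-in sketch of that scheme (method names and the String-based collection id are illustrative, not Debezium's implementation):

```java
// Sketch of default topic naming plus the TopicNameSanitizer idea.
public class TopicSelectorSketch {
    /** prefix + delimiter + collection id, then sanitized. */
    static String topicNameFor(String prefix, String delimiter, String collectionId) {
        return sanitize(prefix + delimiter + collectionId);
    }

    /** Kafka topic names may contain only ASCII letters, digits, '.', '_' and '-'. */
    static String sanitize(String topicName) {
        StringBuilder sb = new StringBuilder(topicName.length());
        for (char c : topicName.toCharArray()) {
            boolean valid = (c >= 'a' && c <= 'z') || (c >= 'A' && c <= 'Z')
                    || (c >= '0' && c <= '9') || c == '.' || c == '_' || c == '-';
            sb.append(valid ? c : '_'); // replace anything invalid with '_'
        }
        return sb.toString();
    }
}
```

For example, prefix "server1", delimiter ".", and collection "inventory.customers" yield the topic "server1.inventory.customers".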