Package io.debezium.kafka
Class KafkaCluster.Usage

- java.lang.Object
  - io.debezium.kafka.KafkaCluster.Usage

Enclosing class:
- KafkaCluster

public class KafkaCluster.Usage extends Object

A set of methods to use a running KafkaCluster.
-
Constructor Summary
- Usage()
-
Method Summary
All methods are concrete instance methods; continueIfNotExpired is protected, all others are public.

- <K,V> void consume(String groupId, String clientId, org.apache.kafka.clients.consumer.OffsetResetStrategy autoOffsetReset, org.apache.kafka.common.serialization.Deserializer<K> keyDeserializer, org.apache.kafka.common.serialization.Deserializer<V> valueDeserializer, BooleanSupplier continuation, org.apache.kafka.clients.consumer.OffsetCommitCallback offsetCommitCallback, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<K,V>> consumerFunction)
  Use the supplied function to asynchronously consume messages from the cluster.
- void consumeDocuments(String topicName, int count, long timeout, TimeUnit unit, Runnable completion)
  Asynchronously consume all messages on the given topic from the cluster.
- void consumeDocuments(String topicName, int count, long timeout, TimeUnit unit, Runnable completion, BiPredicate<String,io.debezium.document.Document> consumer)
  Asynchronously consume all messages on the given topic from the cluster.
- void consumeDocuments(BooleanSupplier continuation, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<String,io.debezium.document.Document>> consumerFunction)
  Asynchronously consume all messages from the cluster.
- void consumeIntegers(String topicName, int count, long timeout, TimeUnit unit, Runnable completion)
  Asynchronously consume all messages on the given topic from the cluster.
- void consumeIntegers(String topicName, int count, long timeout, TimeUnit unit, Runnable completion, BiPredicate<String,Integer> consumer)
  Asynchronously consume all messages on the given topic from the cluster.
- void consumeIntegers(BooleanSupplier continuation, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<String,Integer>> consumerFunction)
  Asynchronously consume all messages from the cluster.
- void consumeStrings(String topicName, int count, long timeout, TimeUnit unit, Runnable completion)
  Asynchronously consume all messages on the given topic from the cluster.
- void consumeStrings(String topicName, int count, long timeout, TimeUnit unit, Runnable completion, BiPredicate<String,String> consumer)
  Asynchronously consume all messages on the given topic from the cluster.
- void consumeStrings(BooleanSupplier continuation, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<String,String>> consumerFunction)
  Asynchronously consume all messages from the cluster.
- protected BooleanSupplier continueIfNotExpired(BooleanSupplier continuation, long timeout, TimeUnit unit)
- KafkaCluster.InteractiveConsumer<String,io.debezium.document.Document> createConsumer(String groupId, String clientId, String topicName, Runnable completion)
  Create a simple consumer that can be used to read messages from the cluster.
- <K,V> KafkaCluster.InteractiveConsumer<K,V> createConsumer(String groupId, String clientId, String topicName, org.apache.kafka.common.serialization.Deserializer<K> keyDeserializer, org.apache.kafka.common.serialization.Deserializer<V> valueDeserializer, Runnable completion)
  Create a simple consumer that can be used to read messages from the cluster.
- KafkaCluster.InteractiveConsumer<String,io.debezium.document.Document> createConsumer(String groupId, String clientId, Set<String> topicNames, Runnable completion)
  Create a simple consumer that can be used to read messages from the cluster.
- <K,V> KafkaCluster.InteractiveConsumer<K,V> createConsumer(String groupId, String clientId, Set<String> topicNames, org.apache.kafka.common.serialization.Deserializer<K> keyDeserializer, org.apache.kafka.common.serialization.Deserializer<V> valueDeserializer, Runnable completion)
  Create a simple consumer that can be used to read messages from the cluster.
- KafkaCluster.InteractiveProducer<String,io.debezium.document.Document> createProducer(String producerName)
  Create a simple producer that can be used to write Document messages to the cluster.
- <K,V> KafkaCluster.InteractiveProducer<K,V> createProducer(String producerName, org.apache.kafka.common.serialization.Serializer<K> keySerializer, org.apache.kafka.common.serialization.Serializer<V> valueSerializer)
  Create a simple producer that can be used to write messages to the cluster.
- Properties getConsumerProperties(String groupId, String clientId, org.apache.kafka.clients.consumer.OffsetResetStrategy autoOffsetReset)
  Get a new set of properties for consumers that want to talk to this server.
- Properties getProducerProperties(String clientId)
  Get a new set of properties for producers that want to talk to this server.
- <K,V> void produce(String producerName, int messageCount, org.apache.kafka.common.serialization.Serializer<K> keySerializer, org.apache.kafka.common.serialization.Serializer<V> valueSerializer, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<K,V>> messageSupplier)
  Use the supplied function to asynchronously produce messages and write them to the cluster.
- <K,V> void produce(String producerName, Consumer<KafkaCluster.InteractiveProducer<String,io.debezium.document.Document>> producer)
  Use the supplied function to asynchronously produce Document messages and write them to the cluster.
- <K,V> void produce(String producerName, org.apache.kafka.common.serialization.Serializer<K> keySerializer, org.apache.kafka.common.serialization.Serializer<V> valueSerializer, Consumer<KafkaCluster.InteractiveProducer<K,V>> producer)
  Use the supplied function to asynchronously produce messages and write them to the cluster.
- void produceDocuments(int messageCount, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<String,io.debezium.document.Document>> messageSupplier)
  Use the supplied function to asynchronously produce messages with String keys and Document values, and write them to the cluster.
- void produceDocuments(String topic, int messageCount, Runnable completionCallback, Supplier<io.debezium.document.Document> valueSupplier)
  Asynchronously produce messages with monotonically increasing String keys and values obtained from the supplied function, and write them to the cluster.
- void produceIntegers(int messageCount, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<String,Integer>> messageSupplier)
  Use the supplied function to asynchronously produce messages with String keys and Integer values, and write them to the cluster.
- void produceIntegers(String topic, int messageCount, int initialValue, Runnable completionCallback)
  Asynchronously produce messages with String keys and sequential Integer values, and write them to the cluster.
- void produceStrings(int messageCount, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<String,String>> messageSupplier)
  Use the supplied function to asynchronously produce messages with String keys and values, and write them to the cluster.
- void produceStrings(String topic, int messageCount, Runnable completionCallback, Supplier<String> valueSupplier)
  Asynchronously produce messages with monotonically increasing String keys and values obtained from the supplied function, and write them to the cluster.
-
Method Detail
-
getConsumerProperties
public Properties getConsumerProperties(String groupId, String clientId, org.apache.kafka.clients.consumer.OffsetResetStrategy autoOffsetReset)
Get a new set of properties for consumers that want to talk to this server.
- Parameters:
  - groupId - the group ID for the consumer; may not be null
  - clientId - the optional identifier for the client; may be null if not needed
  - autoOffsetReset - how to pick a starting offset when there is no initial offset in ZooKeeper or if an offset is out of range; may be null for the default to be used
- Returns:
  - the mutable consumer properties
- See Also:
  - getProducerProperties(String)
-
getProducerProperties
public Properties getProducerProperties(String clientId)
Get a new set of properties for producers that want to talk to this server.
- Parameters:
  - clientId - the optional identifier for the client; may be null if not needed
- Returns:
  - the mutable producer properties
- See Also:
  - getConsumerProperties(String, String, OffsetResetStrategy)
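
A minimal sketch of obtaining both property sets. It assumes the enclosing KafkaCluster is created, configured with one broker, and started in a test, and that Usage is obtained via the cluster's useTo() method; the group and client IDs are illustrative:

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.OffsetResetStrategy;

import io.debezium.kafka.KafkaCluster;

public class PropertiesExample {
    public static void main(String[] args) throws Exception {
        // Embedded single-broker cluster (test-scoped utility).
        KafkaCluster cluster = new KafkaCluster().addBrokers(1).startup();
        try {
            KafkaCluster.Usage usage = cluster.useTo();

            // Consumer properties: pre-populated with the cluster's bootstrap
            // servers, plus the given group/client IDs and offset reset policy.
            Properties consumerProps = usage.getConsumerProperties(
                    "test-group", "client-1", OffsetResetStrategy.EARLIEST);

            // Producer properties for the same cluster.
            Properties producerProps = usage.getProducerProperties("client-1");

            System.out.println(consumerProps.getProperty("group.id"));
        } finally {
            cluster.shutdown();
        }
    }
}
```

Because both methods return mutable Properties, tests can tweak individual settings before handing them to a KafkaConsumer or KafkaProducer.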
-
createProducer
public <K,V> KafkaCluster.InteractiveProducer<K,V> createProducer(String producerName, org.apache.kafka.common.serialization.Serializer<K> keySerializer, org.apache.kafka.common.serialization.Serializer<V> valueSerializer)
Create a simple producer that can be used to write messages to the cluster.
- Parameters:
  - producerName - the name of the producer; may not be null
  - keySerializer - the serializer for the keys; may not be null
  - valueSerializer - the serializer for the values; may not be null
- Returns:
  - the object that can be used to produce messages; never null
-
createProducer
public KafkaCluster.InteractiveProducer<String,io.debezium.document.Document> createProducer(String producerName)
Create a simple producer that can be used to write Document messages to the cluster.
- Parameters:
  - producerName - the name of the producer; may not be null
- Returns:
  - the object that can be used to produce messages; never null
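
As a sketch, a Document producer might be used like this. It assumes the cluster is started in a test and Usage comes from useTo(); it also assumes InteractiveProducer exposes a write(topic, key, value) method and can be closed, and that Document.create(field, value) builds a one-field document — treat those calls as assumptions, not the confirmed API:

```java
import io.debezium.document.Document;
import io.debezium.kafka.KafkaCluster;

public class ProducerExample {
    public static void main(String[] args) throws Exception {
        KafkaCluster cluster = new KafkaCluster().addBrokers(1).startup();
        try {
            KafkaCluster.Usage usage = cluster.useTo();

            // Interactive producer with the built-in String/Document serializers.
            KafkaCluster.InteractiveProducer<String, Document> producer =
                    usage.createProducer("doc-producer");

            // Assumed API: write one record to the "events" topic, then close.
            producer.write("events", "key-1", Document.create("id", 1));
            producer.close();
            System.out.println("wrote 1 record");
        } finally {
            cluster.shutdown();
        }
    }
}
```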
-
createConsumer
public <K,V> KafkaCluster.InteractiveConsumer<K,V> createConsumer(String groupId, String clientId, String topicName, org.apache.kafka.common.serialization.Deserializer<K> keyDeserializer, org.apache.kafka.common.serialization.Deserializer<V> valueDeserializer, Runnable completion)
Create a simple consumer that can be used to read messages from the cluster.
- Parameters:
  - groupId - the name of the group; may not be null
  - clientId - the name of the client; may not be null
  - topicName - the name of the topic to read; may not be null and may not be empty
  - keyDeserializer - the deserializer for the keys; may not be null
  - valueDeserializer - the deserializer for the values; may not be null
  - completion - the function to call when the consumer terminates; may be null
- Returns:
  - the running interactive consumer; never null
-
createConsumer
public <K,V> KafkaCluster.InteractiveConsumer<K,V> createConsumer(String groupId, String clientId, Set<String> topicNames, org.apache.kafka.common.serialization.Deserializer<K> keyDeserializer, org.apache.kafka.common.serialization.Deserializer<V> valueDeserializer, Runnable completion)
Create a simple consumer that can be used to read messages from the cluster.
- Parameters:
  - groupId - the name of the group; may not be null
  - clientId - the name of the client; may not be null
  - topicNames - the names of the topics to read; may not be null and may not be empty
  - keyDeserializer - the deserializer for the keys; may not be null
  - valueDeserializer - the deserializer for the values; may not be null
  - completion - the function to call when the consumer terminates; may be null
- Returns:
  - the running interactive consumer; never null
-
createConsumer
public KafkaCluster.InteractiveConsumer<String,io.debezium.document.Document> createConsumer(String groupId, String clientId, String topicName, Runnable completion)
Create a simple consumer that can be used to read messages from the cluster.
- Parameters:
  - groupId - the name of the group; may not be null
  - clientId - the name of the client; may not be null
  - topicName - the name of the topic to read; may not be null and may not be empty
  - completion - the function to call when the consumer terminates; may be null
- Returns:
  - the running interactive consumer; never null
-
createConsumer
public KafkaCluster.InteractiveConsumer<String,io.debezium.document.Document> createConsumer(String groupId, String clientId, Set<String> topicNames, Runnable completion)
Create a simple consumer that can be used to read messages from the cluster.
- Parameters:
  - groupId - the name of the group; may not be null
  - clientId - the name of the client; may not be null
  - topicNames - the names of the topics to read; may not be null and may not be empty
  - completion - the function to call when the consumer terminates; may be null
- Returns:
  - the running interactive consumer; never null
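
A sketch of the single-topic variant. It assumes the cluster is started in a test, that produceDocuments accepts a null completion callback (the detail below documents it as optional), and that InteractiveConsumer exposes a blocking nextRecord(timeout, unit) accessor — the last of these is an assumption about the inner class's API, not something this page documents:

```java
import java.util.concurrent.TimeUnit;

import org.apache.kafka.clients.consumer.ConsumerRecord;

import io.debezium.document.Document;
import io.debezium.kafka.KafkaCluster;

public class ConsumerExample {
    public static void main(String[] args) throws Exception {
        KafkaCluster cluster = new KafkaCluster().addBrokers(1).startup();
        try {
            KafkaCluster.Usage usage = cluster.useTo();

            // Write one Document so there is something to read back.
            usage.produceDocuments("events", 1, null, () -> Document.create("id", 1));

            // Interactive consumer on one topic; the Runnable fires on termination.
            KafkaCluster.InteractiveConsumer<String, Document> consumer =
                    usage.createConsumer("group-1", "client-1", "events",
                                         () -> System.out.println("consumer done"));

            // Assumed API: block for the next record, waiting up to 10 seconds.
            ConsumerRecord<String, Document> record =
                    consumer.nextRecord(10, TimeUnit.SECONDS);
            System.out.println(record == null ? "no record" : "got " + record.key());
        } finally {
            cluster.shutdown();
        }
    }
}
```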
-
produce
public <K,V> void produce(String producerName, Consumer<KafkaCluster.InteractiveProducer<String,io.debezium.document.Document>> producer)
Use the supplied function to asynchronously produce Document messages and write them to the cluster.
- Parameters:
  - producerName - the name of the producer; may not be null
  - producer - the function that will asynchronously use the supplied interactive producer to write messages; may not be null
-
produce
public <K,V> void produce(String producerName, org.apache.kafka.common.serialization.Serializer<K> keySerializer, org.apache.kafka.common.serialization.Serializer<V> valueSerializer, Consumer<KafkaCluster.InteractiveProducer<K,V>> producer)
Use the supplied function to asynchronously produce messages and write them to the cluster.
- Parameters:
  - producerName - the name of the producer; may not be null
  - keySerializer - the serializer for the keys; may not be null
  - valueSerializer - the serializer for the values; may not be null
  - producer - the function that will asynchronously use the supplied interactive producer to write messages; may not be null
-
produce
public <K,V> void produce(String producerName, int messageCount, org.apache.kafka.common.serialization.Serializer<K> keySerializer, org.apache.kafka.common.serialization.Serializer<V> valueSerializer, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<K,V>> messageSupplier)
Use the supplied function to asynchronously produce messages and write them to the cluster.
- Parameters:
  - producerName - the name of the producer; may not be null
  - messageCount - the number of messages to produce; must be positive
  - keySerializer - the serializer for the keys; may not be null
  - valueSerializer - the serializer for the values; may not be null
  - completionCallback - the function to be called when the producer is completed; may be null
  - messageSupplier - the function to produce messages; may not be null
-
produceStrings
public void produceStrings(int messageCount, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<String,String>> messageSupplier)
Use the supplied function to asynchronously produce messages with String keys and values, and write them to the cluster.
- Parameters:
  - messageCount - the number of messages to produce; must be positive
  - completionCallback - the function to be called when the producer is completed; may be null
  - messageSupplier - the function to produce messages; may not be null
-
produceDocuments
public void produceDocuments(int messageCount, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<String,io.debezium.document.Document>> messageSupplier)
Use the supplied function to asynchronously produce messages with String keys and Document values, and write them to the cluster.
- Parameters:
  - messageCount - the number of messages to produce; must be positive
  - completionCallback - the function to be called when the producer is completed; may be null
  - messageSupplier - the function to produce messages; may not be null
-
produceIntegers
public void produceIntegers(int messageCount, Runnable completionCallback, Supplier<org.apache.kafka.clients.producer.ProducerRecord<String,Integer>> messageSupplier)
Use the supplied function to asynchronously produce messages with String keys and Integer values, and write them to the cluster.
- Parameters:
  - messageCount - the number of messages to produce; must be positive
  - completionCallback - the function to be called when the producer is completed; may be null
  - messageSupplier - the function to produce messages; may not be null
-
produceIntegers
public void produceIntegers(String topic, int messageCount, int initialValue, Runnable completionCallback)
Asynchronously produce messages with String keys and sequential Integer values, and write them to the cluster.
- Parameters:
  - topic - the name of the topic to which the messages should be written; may not be null
  - messageCount - the number of messages to produce; must be positive
  - initialValue - the first integer value to produce
  - completionCallback - the function to be called when the producer is completed; may be null
-
produceStrings
public void produceStrings(String topic, int messageCount, Runnable completionCallback, Supplier<String> valueSupplier)
Asynchronously produce messages with monotonically increasing String keys and values obtained from the supplied function, and write them to the cluster.
- Parameters:
  - topic - the name of the topic to which the messages should be written; may not be null
  - messageCount - the number of messages to produce; must be positive
  - completionCallback - the function to be called when the producer is completed; may be null
  - valueSupplier - the value supplier; may not be null
-
produceDocuments
public void produceDocuments(String topic, int messageCount, Runnable completionCallback, Supplier<io.debezium.document.Document> valueSupplier)
Asynchronously produce messages with monotonically increasing String keys and values obtained from the supplied function, and write them to the cluster.
- Parameters:
  - topic - the name of the topic to which the messages should be written; may not be null
  - messageCount - the number of messages to produce; must be positive
  - completionCallback - the function to be called when the producer is completed; may be null
  - valueSupplier - the value supplier; may not be null
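
The topic-based convenience producers above can be combined in one test. A sketch, assuming the cluster is started in a test and Usage comes from useTo(); the topic names are illustrative, and the CountDownLatch is just one way to wait for both optional completion callbacks:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

import io.debezium.kafka.KafkaCluster;

public class ProduceConvenienceExample {
    public static void main(String[] args) throws Exception {
        KafkaCluster cluster = new KafkaCluster().addBrokers(1).startup();
        try {
            KafkaCluster.Usage usage = cluster.useTo();
            CountDownLatch done = new CountDownLatch(2);

            // Three sequential integers 100, 101, 102 onto "numbers".
            usage.produceIntegers("numbers", 3, 100, done::countDown);

            // Three generated strings onto "words"; keys increase monotonically.
            AtomicInteger n = new AtomicInteger();
            usage.produceStrings("words", 3, done::countDown,
                                 () -> "value-" + n.getAndIncrement());

            // Both completion callbacks fire when the async writes finish.
            if (done.await(30, TimeUnit.SECONDS)) {
                System.out.println("all writes completed");
            }
        } finally {
            cluster.shutdown();
        }
    }
}
```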
-
consume
public <K,V> void consume(String groupId, String clientId, org.apache.kafka.clients.consumer.OffsetResetStrategy autoOffsetReset, org.apache.kafka.common.serialization.Deserializer<K> keyDeserializer, org.apache.kafka.common.serialization.Deserializer<V> valueDeserializer, BooleanSupplier continuation, org.apache.kafka.clients.consumer.OffsetCommitCallback offsetCommitCallback, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<K,V>> consumerFunction)
Use the supplied function to asynchronously consume messages from the cluster.
- Parameters:
  - groupId - the name of the group; may not be null
  - clientId - the name of the client; may not be null
  - autoOffsetReset - how to pick a starting offset when there is no initial offset in ZooKeeper or if an offset is out of range; may be null for the default to be used
  - keyDeserializer - the deserializer for the keys; may not be null
  - valueDeserializer - the deserializer for the values; may not be null
  - continuation - the function that determines if the consumer should continue; may not be null
  - offsetCommitCallback - the callback that should be used after committing offsets; may be null if offsets are not to be committed
  - completion - the function to call when the consumer terminates; may be null
  - topics - the set of topics to consume; may not be null or empty
  - consumerFunction - the function to consume the messages; may not be null
-
consumeDocuments
public void consumeDocuments(BooleanSupplier continuation, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<String,io.debezium.document.Document>> consumerFunction)
Asynchronously consume all messages from the cluster.
- Parameters:
  - continuation - the function that determines if the consumer should continue; may not be null
  - completion - the function to call when all messages have been consumed; may be null
  - topics - the set of topics to consume; may not be null or empty
  - consumerFunction - the function to consume the messages; may not be null
-
consumeStrings
public void consumeStrings(BooleanSupplier continuation, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<String,String>> consumerFunction)
Asynchronously consume all messages from the cluster.
- Parameters:
  - continuation - the function that determines if the consumer should continue; may not be null
  - completion - the function to call when all messages have been consumed; may be null
  - topics - the set of topics to consume; may not be null or empty
  - consumerFunction - the function to consume the messages; may not be null
-
consumeIntegers
public void consumeIntegers(BooleanSupplier continuation, Runnable completion, Collection<String> topics, Consumer<org.apache.kafka.clients.consumer.ConsumerRecord<String,Integer>> consumerFunction)
Asynchronously consume all messages from the cluster.
- Parameters:
  - continuation - the function that determines if the consumer should continue; may not be null
  - completion - the function to call when all messages have been consumed; may be null
  - topics - the set of topics to consume; may not be null or empty
  - consumerFunction - the function to consume the messages; may not be null
-
consumeStrings
public void consumeStrings(String topicName, int count, long timeout, TimeUnit unit, Runnable completion, BiPredicate<String,String> consumer)
Asynchronously consume all messages on the given topic from the cluster.
- Parameters:
  - topicName - the name of the topic; may not be null
  - count - the expected number of messages to read before terminating; must be positive
  - timeout - the maximum time that this consumer should run before terminating; must be positive
  - unit - the unit of time for the timeout; may not be null
  - completion - the function to call when all messages have been consumed; may be null
  - consumer - the function to consume the messages; may not be null
-
consumeDocuments
public void consumeDocuments(String topicName, int count, long timeout, TimeUnit unit, Runnable completion, BiPredicate<String,io.debezium.document.Document> consumer)
Asynchronously consume all messages on the given topic from the cluster.
- Parameters:
  - topicName - the name of the topic; may not be null
  - count - the expected number of messages to read before terminating; must be positive
  - timeout - the maximum time that this consumer should run before terminating; must be positive
  - unit - the unit of time for the timeout; may not be null
  - completion - the function to call when all messages have been consumed; may be null
  - consumer - the function to consume the messages; may not be null
-
consumeIntegers
public void consumeIntegers(String topicName, int count, long timeout, TimeUnit unit, Runnable completion, BiPredicate<String,Integer> consumer)
Asynchronously consume all messages on the given topic from the cluster.
- Parameters:
  - topicName - the name of the topic; may not be null
  - count - the expected number of messages to read before terminating; must be positive
  - timeout - the maximum time that this consumer should run before terminating; must be positive
  - unit - the unit of time for the timeout; may not be null
  - completion - the function to call when all messages have been consumed; may be null
  - consumer - the function to consume the messages; may not be null
-
consumeStrings
public void consumeStrings(String topicName, int count, long timeout, TimeUnit unit, Runnable completion)
Asynchronously consume all messages on the given topic from the cluster.
- Parameters:
  - topicName - the name of the topic; may not be null
  - count - the expected number of messages to read before terminating; must be positive
  - timeout - the maximum time that this consumer should run before terminating; must be positive
  - unit - the unit of time for the timeout; may not be null
  - completion - the function to call when all messages have been consumed; may be null
-
consumeDocuments
public void consumeDocuments(String topicName, int count, long timeout, TimeUnit unit, Runnable completion)
Asynchronously consume all messages on the given topic from the cluster.
- Parameters:
  - topicName - the name of the topic; may not be null
  - count - the expected number of messages to read before terminating; must be positive
  - timeout - the maximum time that this consumer should run before terminating; must be positive
  - unit - the unit of time for the timeout; may not be null
  - completion - the function to call when all messages have been consumed; may be null
-
consumeIntegers
public void consumeIntegers(String topicName, int count, long timeout, TimeUnit unit, Runnable completion)
Asynchronously consume all messages on the given topic from the cluster.
- Parameters:
  - topicName - the name of the topic; may not be null
  - count - the expected number of messages to read before terminating; must be positive
  - timeout - the maximum time that this consumer should run before terminating; must be positive
  - unit - the unit of time for the timeout; may not be null
  - completion - the function to call when all messages have been consumed; may be null
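
The count/timeout consumers pair naturally with the convenience producers: write a known number of messages, then wait until that many are read back or the timeout expires. A sketch, assuming the cluster is started in a test and that a null completion callback is acceptable for produceIntegers (the detail above documents it as optional); the topic name is illustrative:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

import io.debezium.kafka.KafkaCluster;

public class ConsumeWithTimeoutExample {
    public static void main(String[] args) throws Exception {
        KafkaCluster cluster = new KafkaCluster().addBrokers(1).startup();
        try {
            KafkaCluster.Usage usage = cluster.useTo();

            // Write three integers so there is something to read.
            usage.produceIntegers("numbers", 3, 1, null);

            // Read exactly three messages, or stop after 15 seconds.
            CountDownLatch read = new CountDownLatch(1);
            usage.consumeIntegers("numbers", 3, 15, TimeUnit.SECONDS, read::countDown);

            // The completion Runnable fires once all three messages are consumed;
            // wait with a margin in case the consumer times out instead.
            read.await(30, TimeUnit.SECONDS);
            System.out.println("consumption finished");
        } finally {
            cluster.shutdown();
        }
    }
}
```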
-
continueIfNotExpired
protected BooleanSupplier continueIfNotExpired(BooleanSupplier continuation, long timeout, TimeUnit unit)
-