Markus Günther
Freelance Software Engineer / Architect
mail@mguenther.net | mguenther.net | @markus_guenther
Apache Kafka
Testing Kafka-enabled software components
There are several approaches to testing Kafka-enabled
components or services.
Problem
▪ Testing Kafka-based components or services is not trivial
▪ Requires a working cluster
▪ Must be able to instrument the cluster
▪ Mocking is not an option for integration or system tests
Existing approaches
▪ Spring Kafka provides an embeddable cluster
▪ Lacks the possibility to work with external clusters
▪ Leads to boilerplate code in your tests
▪ There are (costly) testing tools that are able to inject records into topics
▪ Lack the possibility to work with embedded clusters
▪ Suitable for system testing, not integration testing
Kafka for JUnit is suitable for integration testing as well as
system testing.
Solution ▪ Kafka for JUnit
▪ Works with embedded Kafka clusters – suitable for integration testing
▪ Works with external Kafka clusters – suitable for system testing
▪ Features a concise and readable API
▪ Features fault injection to trigger error handling
▪ Integrates well with Spring Kafka
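To make the setup concrete: Kafka for JUnit is added as a test-scoped dependency. A minimal Maven sketch follows; the coordinates `net.mguenther.kafka:kafka-junit` and the version shown here are assumptions, so check the project's GitHub page for the current release:

```xml
<!-- test-scoped dependency on Kafka for JUnit (coordinates and version assumed) -->
<dependency>
    <groupId>net.mguenther.kafka</groupId>
    <artifactId>kafka-junit</artifactId>
    <version>3.6.0</version>
    <scope>test</scope>
</dependency>
```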
Kafka for JUnit allows you to write integration tests that work
against an embedded Kafka cluster.
[Diagram: a JUnit Jupiter integration test instruments and interacts with a Kafka-based component (consumer/producer) and provisions an embedded Apache Kafka cluster; the component exchanges records with the cluster via the topic my.test.topic.]
Kafka for JUnit allows you to write system tests that work against
an external Kafka cluster.
[Diagram: a JUnit Jupiter system test triggers a use case on Service A (event producer), which publishes events to the topic topic.for.events on an Apache Kafka cluster; Service B (event consumer) consumes those events, while the test observes the topic for the expected records.]
Kafka for JUnit provides abstractions for interacting with a Kafka cluster
through EmbeddedKafkaCluster and ExternalKafkaCluster.
[Diagram: EmbeddedKafkaCluster and ExternalKafkaCluster both provide the RecordProducer, RecordConsumer, and TopicManager abstractions for interacting with an Apache Kafka cluster; EmbeddedKafkaCluster additionally supports fault injection.]
A RecordProducer provides the means to send key-value pairs
or non-keyed values to a Kafka topic.
public interface RecordProducer {

    <V> List<RecordMetadata> send(SendValues<V> sendRequest) throws ...;
    <V> List<RecordMetadata> send(SendValuesTransactional<V> sendRequest) throws ...;
    <K, V> List<RecordMetadata> send(SendKeyValues<K, V> sendRequest) throws ...;
    <K, V> List<RecordMetadata> send(SendKeyValuesTransactional<K, V> sendRequest) throws ...;

    /* overloaded methods that accept builder instances
     * for the respective type have been omitted for brevity
     */
}
Kafka for JUnit provides builders
for these requests!
Interface definition of RecordProducer
Publishing data to a Kafka topic is as simple as contributing a
one-liner in the default case.
Sending non-keyed values using defaults

kafka.send(SendValues.to("my.test.topic", "a", "b", "c"));

Sending non-keyed values transactionally

kafka.send(SendValuesTransactional.inTransaction(
    "my.test.topic",
    Arrays.asList("a", "b", "c")));

Sending non-keyed values using overrides

kafka.send(SendValues.to("my.test.topic", "a", "b", "c")
    .with(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true")
    .with(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, "1"));
A RecordConsumer provides the means to read data from a topic or
observe a topic until certain criteria are met or a timeout elapses.
public interface RecordConsumer {

    <V> List<V> readValues(ReadKeyValues<String, V> readRequest) throws ...;
    <K, V> List<KeyValue<K, V>> read(ReadKeyValues<K, V> readRequest) throws ...;
    <V> List<V> observeValues(ObserveKeyValues<String, V> observeRequest) throws ...;
    <K, V> List<KeyValue<K, V>> observe(ObserveKeyValues<K, V> observeRequest) throws ...;

    /* overloaded methods that accept builder instances
     * for the respective type have been omitted for brevity
     */
}
Kafka for JUnit provides builders
for these requests!
Interface definition of RecordConsumer
Consuming records is just as easy as producing them.
Consuming only values using defaults

val values = kafka.readValues(ReadKeyValues.from("my.test.topic"));

Consuming key-value pairs using defaults

val records = kafka.read(ReadKeyValues.from("my.test.topic"));

Consuming key-value pairs using overrides

List<KeyValue<String, Long>> records = kafka.read(ReadKeyValues
    .from("my.test.topic", Long.class)
    .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ...));
Observations can be used to let a test fail unless given criteria
are met.
Observing a topic until n values have been consumed

kafka.observeValues(ObserveKeyValues.on("my.test.topic", 3));

Observing a topic until n records have been consumed

List<KeyValue<String, String>> observedRecords = kafka
    .observe(ObserveKeyValues.on("my.test.topic", 3));

Using filters when consuming or observing a topic

Predicate<String> keyFilter = k -> Integer.parseInt(k) % 2 == 0;
kafka.observe(ObserveKeyValues.on("my.test.topic", 3, Integer.class)
    .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ...)
    .filterOnKeys(keyFilter));
A TopicManager provides the means to manage Kafka
topics.
public interface TopicManager {

    void createTopic(TopicConfig config);
    void deleteTopic(String topic);
    boolean exists(String topic);
    Map<Integer, LeaderAndIsr> fetchLeaderAndIsr(String topic);
    Properties fetchTopicConfig(String topic);

    /* overloaded methods that accept builder instances
     * for the respective type have been omitted for brevity
     */
}
Kafka for JUnit provides builders
for these requests!
Interface definition of TopicManager
The default TopicManager implementation leverages the
AdminClient implementation of the Kafka Client library.
Creating a topic using defaults

kafka.createTopic(TopicConfig.withName("my.test.topic"));

Deleting a topic

kafka.deleteTopic("my.test.topic");

Creating a topic with specific properties

kafka.createTopic(TopicConfig.withName("my.test.topic")
    .withNumberOfPartitions(5)
    .withNumberOfReplicas(1));
Writing integration tests with
Kafka for JUnit
Let’s write a test that exercises the write-path of a Kafka-based
component.
[Diagram: TurbineEventPublisherTest (JUnit Jupiter) instruments and interacts with TurbineEventPublisher, part of the Turbine Registry, and provisions an embedded Apache Kafka cluster with the topic turbine.lifecycle.events.]
Let’s write a simple test that verifies that TurbineEventPublisher
is able to write events to the designated topic.
public class TurbineEventPublisherTest {

    private EmbeddedKafkaCluster kafka;

    @BeforeEach
    void setupKafka() {
        kafka = provisionWith(defaultClusterConfig());
        kafka.start();
    }

    @AfterEach
    void tearDownKafka() {
        kafka.stop();
    }
}

Provide the skeleton for the component test incl. a workable Kafka cluster.
provisionWith is statically imported from EmbeddedKafkaCluster,
defaultClusterConfig from EmbeddedKafkaClusterConfig.
The observe method throws an AssertionError once a
certain configurable amount of time has elapsed.
var config = Map.<String, Object>of(
    ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka.getBrokerList());
var publisher = new TurbineEventPublisher("turbine.lifecycle.events", config);
var event = new TurbineRegisteredEvent("1a5c6012", 49.875114, 8.978702);
publisher.log(event);
Create an instance of the subject-under-test and publish test data
The observe method throws an AssertionError once a
certain configurable amount of time has elapsed. (cont.)
kafka.observe(ObserveKeyValues
    .on("turbine.lifecycle.events", 1, TurbineEvent.class)  // topic to observe, number of
                                                            // expected records, value type
    .observeFor(15, TimeUnit.SECONDS)
    .filterOnKeys(key -> key.equals("1a5c6012"))            // filters add observation criteria
    .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,   // override arbitrary
        TurbineEventDeserializer.class));                   // consumer properties

Observe the designated topic for new data
The observe method returns all records that it obtained from
watching the topic.
var record = kafka.observe(
        on("turbine.lifecycle.events", 1, TurbineEvent.class)
            .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                TurbineEventDeserializer.class))
    .stream()
    .findFirst()
    .orElseThrow(AssertionError::new);

assertThat(record.getKey()).isEqualTo("1a5c6012");
assertThat(record.getValue()).isInstanceOf(TurbineRegisteredEvent.class);
Fetch observed records and assert that their data is what you expect it to be
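Putting the three steps together, the whole write-path test reads as a single method. This is a sketch assembled from the snippets above (class and method names are the ones used on the previous slides; it requires the Kafka for JUnit and AssertJ test dependencies, so it is not runnable standalone):

```java
@Test
void publishedEventsShouldAppearOnTheDesignatedTopic() throws Exception {
    // create the subject under test against the embedded cluster
    var config = Map.<String, Object>of(
        ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka.getBrokerList());
    var publisher = new TurbineEventPublisher("turbine.lifecycle.events", config);

    // publish test data
    publisher.log(new TurbineRegisteredEvent("1a5c6012", 49.875114, 8.978702));

    // observe the topic until the record shows up, or fail after 15 seconds
    var record = kafka.observe(ObserveKeyValues
            .on("turbine.lifecycle.events", 1, TurbineEvent.class)
            .observeFor(15, TimeUnit.SECONDS)
            .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                TurbineEventDeserializer.class))
        .stream()
        .findFirst()
        .orElseThrow(AssertionError::new);

    assertThat(record.getKey()).isEqualTo("1a5c6012");
    assertThat(record.getValue()).isInstanceOf(TurbineRegisteredEvent.class);
}
```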
Writing systems tests with
Kafka for JUnit
We’ll use a simple client interface to provide the means to exert an
external stimulus on the system.

public interface GettingThingsDone {

    @RequestLine("POST /items")
    @Headers("Content-Type: application/json")
    Response createItem(CreateItem payload);

    @RequestLine("GET /items/{itemId}")
    @Headers("Accept: application/json")
    Item getItem(@Param("itemId") String itemId);

    /* additional methods omitted for brevity */
}

Example of a Feign-based HTTP client that interacts with the system
We’ll use a simple client interface to provide the means to exert an
external stimulus on the system. (cont.)

Gain programmatic access to the cluster

val kafka = ExternalKafkaCluster.at("http://localhost:9092");

Trigger a use case using the client interface

val gtd = createGettingThingsDoneClient();
val payload = new CreateItem("Buy groceries!");
val response = gtd.createItem(payload);

Extract the ID of the newly created item from the response

val itemId = extractItemId(response);
Leverage Kafka for JUnit to observe the designated topic and apply
assertions on the returned records.
List<AvroItemEvent> publishedEvents = kafka
    .observeValues(on("item.events", 1, AvroItemEvent.class)
        .observeFor(10, TimeUnit.SECONDS)   // throws an AssertionError if the timeout elapses
        .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
            ItemEventDeserializer.class)
        .filterOnKeys(aggregateId -> aggregateId.equals(itemId)));

Observe the designated topic for the expected records
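The system-test flow then mirrors the integration test: stimulate the system over HTTP, then observe the topic. A sketch assembling the previous snippets (the client factory, event types, and deserializer are the example names from the slides; it requires Feign and Kafka for JUnit, so it is not runnable standalone):

```java
@Test
void creatingAnItemShouldPublishAnItemEvent() throws Exception {
    // gain programmatic access to the external cluster
    val kafka = ExternalKafkaCluster.at("http://localhost:9092");

    // trigger the use case through the HTTP client and grab the new item's ID
    val gtd = createGettingThingsDoneClient();
    val response = gtd.createItem(new CreateItem("Buy groceries!"));
    val itemId = extractItemId(response);

    // observe the topic; fails with an AssertionError after 10 seconds
    List<AvroItemEvent> publishedEvents = kafka
        .observeValues(on("item.events", 1, AvroItemEvent.class)
            .observeFor(10, TimeUnit.SECONDS)
            .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG,
                ItemEventDeserializer.class)
            .filterOnKeys(aggregateId -> aggregateId.equals(itemId)));

    assertThat(publishedEvents).isNotEmpty();
}
```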
Want to know more?
Blog
▪ Günther, M., Writing component tests for Kafka producers, https://bit.ly/39NpoCU
▪ Günther, M., Writing component tests for Kafka consumers, https://bit.ly/36KrXoV
▪ Günther, M., Writing system tests for a Kafka-enabled microservice, https://bit.ly/2OUeEMs
▪ Günther, M., Using Kafka for JUnit with Spring Kafka, https://bit.ly/3c61WSx

GitHub
▪ Kafka for JUnit on GitHub, https://github.com/mguenther/kafka-junit
▪ User Guide to Kafka for JUnit, https://mguenther.github.io/kafka-junit/
Questions?
mguenther.net | @markus_guenther | mail@mguenther.net

The Real-World Challenges of Medical Device Cybersecurity- Mitigating Vulnera...
 
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptxKnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
KnowAPIs-UnknownPerf-jaxMainz-2024 (1).pptx
 
What is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWhat is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need It
 

Testing Kafka components with Kafka for JUnit

  • 1. Markus Günther Freelance Software Engineer / Architect mail@mguenther.net | mguenther.net | @markus_guenther Apache Kafka Testing Kafka-enabled software components
  • 2. 2 There are several approaches to testing Kafka-enabled components or services. Problem ▪ Testing Kafka-based components or services is not trivial ▪ Need to have a working cluster ▪ Must be able to instrument the cluster ▪ Mocking is not an option for integration or system tests Solution ▪ Spring Kafka provides an embeddable cluster ▪ Lacks the possibility to work with external clusters ▪ Leads to boilerplate code in your test ▪ There are (costly) testing tools that are able to inject records into topics ▪ Lacks the possibility to work with embedded clusters ▪ Suitable for system testing, not integration testing
  • 3. 3 Kafka for JUnit is suitable for integration testing as well as system testing. Solution ▪ Kafka for JUnit ▪ Works with embedded Kafka clusters – suitable for integration testing ▪ Works with external Kafka clusters – suitable for system testing ▪ Features a concise and readable API ▪ Features fault injection to trigger error handling ▪ Integrates well with Spring Kafka
  • 4. 4 Kafka for JUnit allows you to write integration tests that work against an embedded Kafka cluster. Integration Test JUNIT JUPITER Consumer / Producer KAFKA-BASED COMPONENT my.test.topic EMBEDDED APACHE KAFKA CLUSTER VERTICALS instruments provisions interacts with
  • 5. 6 Kafka for JUnit allows you to write system tests that work against an external Kafka cluster. VERTICALS Event Producer SERVICE A Event Consumer SERVICE B topic.for.events APACHE KAFKA CLUSTER System Test JUNIT JUPITER trigger use case observes topic for expected records publishes events consumes events
  • 6. 7 Kafka for JUnit provides abstractions for interacting with a Kafka cluster through EmbeddedKafkaCluster and ExternalKafkaCluster. my.test.topic APACHE KAFKA CLUSTER RecordProducer EmbeddedKafkaCluster RecordConsumer TopicManager FaultInjection my.test.topic APACHE KAFKA CLUSTER RecordProducer ExternalKafkaCluster RecordConsumer TopicManager
  • 7. 8 A RecordProducer provides the means to send key-value pairs or non-keyed values to a Kafka topic. public interface RecordProducer { <V> List<RecordMetadata> send(SendValues<V> sendRequest) throws ...; <V> List<RecordMetadata> send(SendValuesTransactional<V> sendRequest) throws ...; <K,V> List<RecordMetadata> send(SendKeyValues<K,V> sendRequest) throws ...; <K,V> List<RecordMetadata> send(SendKeyValuesTransactional<K,V> sendRequest) ...; /* overloaded methods that accept builder instances * for the resp. type have been omitted for brevity */ } Kafka for JUnit provides builders for these requests! Interface definition of RecordProducer
  • 8. 9 Publishing data to a Kafka topic is as simple as contributing a one-liner in the default case. kafka.send(SendValues.to(“my.test.topic“, “a“, “b“, “c“)); Sending non-keyed values using defaults kafka.send(SendValuesTransactional.inTransaction( “my.test.topic“, Arrays.asList(“a“, “b“, “c“))); kafka.send(SendValues.to(“my.test.topic“, “a“, “b“, “c“) .with(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, “true“) .with(ProducerConfig.MAX_IN_FLIGHT_REQUESTS_PER_CONNECTION, “1“)); Sending non-keyed values using overrides Sending non-keyed values transactionally
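The fluent `SendValues.to(...).with(...)` style shown on this slide follows a plain builder pattern. As an illustration only (the class and method names below are hypothetical, not the library's actual types), such a fluent request builder can be sketched in plain Java:

```java
import java.util.List;
import java.util.Properties;

// Hypothetical sketch of a fluent send-request builder, illustrating the
// pattern behind SendValues.to(...).with(...); not the library's actual code.
final class SendRequest {
    private final String topic;
    private final List<String> values;
    private final Properties props = new Properties();

    private SendRequest(String topic, List<String> values) {
        this.topic = topic;
        this.values = values;
    }

    // Entry point mirroring SendValues.to("topic", "a", "b", "c")
    static SendRequest to(String topic, String... values) {
        return new SendRequest(topic, List.of(values));
    }

    // Fluent producer-property override mirroring .with(key, value)
    SendRequest with(String key, String value) {
        props.setProperty(key, value);
        return this;
    }

    String topic() { return topic; }
    List<String> values() { return values; }
    Properties props() { return props; }
}

public class BuilderSketch {
    public static void main(String[] args) {
        SendRequest req = SendRequest.to("my.test.topic", "a", "b", "c")
                .with("enable.idempotence", "true");
        System.out.println(req.topic());
        System.out.println(req.values());
        System.out.println(req.props().getProperty("enable.idempotence"));
    }
}
```

The payoff of this design is visible on the slide: defaults stay a one-liner, while overrides chain onto the same expression instead of requiring a separate configuration object.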
  • 9. 1 A RecordConsumer provides the means to read data from a topic or observe a topic until certain criteria are met or a timeout elapses. public interface RecordConsumer { <V> List<V> readValues(ReadKeyValues<String,V> readRequest) throws ...; <K,V> List<KeyValue<K,V>> read(ReadKeyValues<K,V> readRequest) throws ...; <V> List<V> observeValues(ObserveKeyValues<String,V> observeRequest) throws ...; <K,V> List<KeyValue<K,V>> observe(ObserveKeyValues<K,V> observeRequest) throws ...; /* overloaded methods that accept builder instances * for the resp. type have been omitted for brevity */ } Kafka for JUnit provides builders for these requests! Interface definition of RecordConsumer
  • 10. 1 Consuming records is just as easy as producing them. val values = kafka.readValues(ReadKeyValues.from(“my.test.topic“)); Consuming only values using defaults List<KeyValue<String, Long>> records = kafka.read(ReadKeyValues .from(“my.test.topic“, Long.class) .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ...)); val records = kafka.read(ReadKeyValues.from(“my.test.topic“)); Consuming key-value pairs using defaults Consuming key-value pairs using overrides
  • 11. 1 Observations can be used to let a test fail unless given criteria are met. kafka.observeValues(ObserveKeyValues.on(“my.test.topic“, 3)); Observing a topic until n values have been consumed Predicate<String> keyFilter = k -> Integer.parseInt(k) % 2 == 0; kafka.observe(ObserveKeyValues.on(“my.test.topic“, 3, Integer.class) .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ...) .filterOnKeys(keyFilter)); List<KeyValue<String,String>> observedRecords = kafka .observe(ObserveKeyValues.on(“my.test.topic“, 3)); Observing a topic until n records have been consumed Using filters when consuming or observing a topic
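The semantics of such an observation — keep polling until the expected number of records arrives, or fail with an `AssertionError` when the timeout elapses — can be sketched in self-contained plain Java. This is a conceptual illustration of the behaviour, not the library's implementation; a `BlockingQueue` stands in for the topic:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

// Conceptual sketch of observe(...): poll a record source until the expected
// number of records has been seen, or throw an AssertionError on timeout.
public class ObserveSketch {

    static List<String> observe(BlockingQueue<String> source, int expected, long timeoutMs)
            throws InterruptedException {
        List<String> observed = new ArrayList<>();
        long deadline = System.currentTimeMillis() + timeoutMs;
        while (observed.size() < expected) {
            long remaining = deadline - System.currentTimeMillis();
            if (remaining <= 0) {
                // Mirrors the library's contract: the test fails if the
                // observation criteria are not met within the timeout.
                throw new AssertionError(
                        "Expected " + expected + " records, but got " + observed.size());
            }
            String record = source.poll(remaining, TimeUnit.MILLISECONDS);
            if (record != null) observed.add(record);
        }
        return observed;
    }

    public static void main(String[] args) throws Exception {
        BlockingQueue<String> topic = new LinkedBlockingQueue<>(List.of("a", "b", "c"));
        System.out.println(observe(topic, 3, 1000)); // completes: 3 records available
        try {
            observe(topic, 1, 100);                  // queue is now empty: times out
        } catch (AssertionError e) {
            System.out.println("timed out as expected");
        }
    }
}
```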
  • 12. 1 A TopicManager provides the means to manage Kafka topics. public interface TopicManager { void createTopic(TopicConfig config); void deleteTopic(String topic); boolean exists(String topic); Map<Integer, LeaderAndIsr> fetchLeaderAndIsr(String topic); Properties fetchTopicConfig(String topic); /* overloaded methods that accept builder instances for the resp. type have been omitted for brevity */ } Kafka for JUnit provides builders for these requests! Interface definition of TopicManager
  • 13. 1 The default TopicManager implementation leverages the AdminClient implementation of the Kafka Client library. kafka.createTopic(TopicConfig.withName(“my.test.topic“)); Creating a topic using defaults kafka.deleteTopic(“my.test.topic“); Deleting a topic kafka.createTopic(TopicConfig.withName(“my.test.topic“) .withNumberOfPartitions(5) .withNumberOfReplicas(1)); Creating a topic with specific properties
  • 14. 1 Writing integration tests with Kafka for JUnit
  • 15. 1 Let’s write a test that exercises the write-path of a Kafka-based component. TurbineEventPublisherTest JUNIT JUPITER TurbineEventPublisher TURBINE REGISTRY turbine.lifecycle.events EMBEDDED APACHE KAFKA CLUSTER instruments provisions interacts with
  • 16. 1
  • 17. 1 Let’s write a simple test that verifies that TurbineEventPublisher is able to write events to the designated topic. public class TurbineEventPublisherTest { private EmbeddedKafkaCluster kafka; @BeforeEach void setupKafka() { kafka = provisionWith(defaultClusterConfig()); kafka.start(); } @AfterEach void tearDownKafka() { kafka.stop(); } } Provide the skeleton for the component test incl. a workable Kafka cluster. static import from EmbeddedKafkaCluster static import from EmbeddedKafkaClusterConfig
  • 18. 1 The observe method throws an AssertionError once a certain configurable amount of time has elapsed. var config = Map.<String, Object>of( ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafka.getBrokerList()); var publisher = new TurbineEventPublisher(“turbine.lifecycle.events“, config); var event = new TurbineRegisteredEvent(“1a5c6012“, 49.875114, 8.978702); publisher.log(event); Create an instance of the subject-under-test and publish test data
  • 19. 2 The observe method throws an AssertionError once a certain configurable amount of time has elapsed. (cont.) kafka.observe(ObserveKeyValues .on(“turbine.lifecycle.events“, 1, TurbineEvent.class) .observeFor(15, TimeUnit.SECONDS) .filterOnKeys(key -> key.equals(“1a5c6012“)) .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, TurbineEventDeserializer.class)); Observe the designated topic for new data override arbitrary consumer properties the topic we want to observe the number of records we expect the value type of the expected records use filters to add observation criteria
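The key filter passed to `filterOnKeys(...)` above is an ordinary `java.util.function.Predicate` over the record key: only records whose key satisfies it count towards the observation. A self-contained plain-Java illustration of such a filter (record keys and values here are made up for the example):

```java
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;
import java.util.stream.Collectors;

// Shows how a key filter like the one passed to filterOnKeys(...) works:
// only records whose key matches the predicate are retained.
public class KeyFilterSketch {
    public static void main(String[] args) {
        Predicate<String> keyFilter = key -> key.equals("1a5c6012");

        // Key-value pairs standing in for records read from a topic.
        Map<String, String> records = Map.of(
                "1a5c6012", "TurbineRegisteredEvent",
                "deadbeef", "SomeOtherEvent");

        List<String> matching = records.entrySet().stream()
                .filter(e -> keyFilter.test(e.getKey()))
                .map(Map.Entry::getValue)
                .collect(Collectors.toList());

        System.out.println(matching); // only the event keyed by "1a5c6012" survives
    }
}
```

Because the filter is evaluated per record, an observation with a filter only completes once enough *matching* records have been seen, which is exactly what the test on this slide relies on.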
  • 20. 2 The observe method returns all records that it obtained from watching the topic. var record = kafka.observe( on(“turbine.lifecycle.events“, 1, TurbineEvent.class) .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, TurbineEventDeserializer.class)) .stream() .findFirst() .orElseThrow(AssertionError::new); assertThat(record.getKey()).isEqualTo(“1a5c6012“); assertThat(record.getValue()).isInstanceOf(TurbineRegisteredEvent.class); Fetch observed records and assert that their data is what you expect it to be
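The `stream().findFirst().orElseThrow(AssertionError::new)` idiom on this slide works on any `List`, so it can be demonstrated without the library. A self-contained sketch (the key and event name are made up for the example):

```java
import java.util.List;
import java.util.Map;

// Demonstrates the findFirst().orElseThrow(AssertionError::new) idiom:
// take the first observed record, or fail the test if nothing was observed.
public class FirstRecordSketch {
    public static void main(String[] args) {
        // A non-empty observation: the first record is returned.
        List<Map.Entry<String, String>> observed =
                List.of(Map.entry("1a5c6012", "TurbineRegisteredEvent"));

        Map.Entry<String, String> record = observed.stream()
                .findFirst()
                .orElseThrow(AssertionError::new);
        System.out.println(record.getKey()); // prints 1a5c6012

        // An empty observation fails fast with an AssertionError.
        try {
            List.<String>of().stream().findFirst().orElseThrow(AssertionError::new);
        } catch (AssertionError expected) {
            System.out.println("empty observation fails the test");
        }
    }
}
```

Using `orElseThrow(AssertionError::new)` keeps the failure inside the assertion machinery of the test framework instead of surfacing as a `NoSuchElementException`.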
  • 21. 2 Writing system tests with Kafka for JUnit
  • 22. 2
  • 23. 2 We’ll use a simple client interface to provide the means to exert an external stimulus on the system. (cont.) Listing 1: Example for Feign-based HTTP client that interacts with the system public interface GettingThingsDone { @RequestLine(“POST /items“) @Headers(“Content-Type: application/json“) Response createItem(CreateItem payload); @RequestLine(“GET /items/{itemId}“) @Headers(“Accept: application/json“) Item getItem(@Param(“itemId“) String itemId); /* additional methods omitted for brevity */ }
  • 24. 2 We’ll use a simple client interface to provide the means to exert an external stimulus on the system. (cont.) val kafka = ExternalKafkaCluster.at(“http://localhost:9092“); Gain programmatic access to the cluster val gtd = createGettingThingsDoneClient(); val payload = new CreateItem(“Buy groceries!“); val response = gtd.createItem(payload); Trigger a use case using the client interface val itemId = extractItemId(response); Extract the ID of the newly created item from the response
  • 25. 2 Leverage Kafka for JUnit to observe the designated topic and apply assertions on the returned records. List<AvroItemEvent> publishedEvents = kafka .observeValues(on(“item.events“, 1, AvroItemEvent.class) .observeFor(10, TimeUnit.SECONDS) .with(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ItemEventDeserializer.class) .filterOnKeys(aggregateId -> aggregateId.equals(itemId))); Observe the designated topic for the expected records; throws an AssertionError if the timeout elapses
  • 26. 2 Want to know more? GitHub Blog ▪ Günther, M., Writing component tests for Kafka producers, https://bit.ly/39NpoCU ▪ Günther, M., Writing component tests for Kafka consumers, https://bit.ly/36KrXoV ▪ Günther, M., Writing system tests for a Kafka-enabled microservice, https://bit.ly/2OUeEMs ▪ Günther, M., Using Kafka for JUnit with Spring Kafka, https://bit.ly/3c61WSx ▪ Kafka for JUnit on GitHub, https://github.com/mguenther/kafka-junit ▪ User Guide to Kafka for JUnit, https://mguenther.github.io/kafka-junit/