REST API with Apache Kafka

Sajith Mohanan
3 min read · Feb 25, 2021


Integrate your API with Apache Kafka

Introduction: This article is all about integration. We will ingest data into Apache Kafka through a REST API.

Assumptions: You are familiar with, and have ready, the following setup:

Apache ZooKeeper

Apache Kafka

Spring Boot

Maven

And, of course, Postman to call the API

What are we going to do:

Kafka Topic creation

Kafka Producer configuration

Build an API (GET and POST) to publish data/payload to the topic

Consume from Kafka Topic

Khalas (that's it)!

A REST API will be exposed from the application and used to post data into it. The API is connected to Kafka via a topic, so we will configure the application to push the incoming data into Kafka.

Step 1: Set up the Spring Boot application. The required dependencies are Spring Web and Spring for Apache Kafka.

Generate the project (for example, from Spring Initializr), download it, and import it into your IDE.
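As a sketch, the two dependencies in the Maven pom.xml would look like this (versions are managed by the Spring Boot parent, so they are omitted here):

```xml
<!-- Spring Web for the REST endpoints -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
</dependency>
<!-- Spring for Apache Kafka (provides KafkaTemplate) -->
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```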

Step 2: A REST controller is defined with two endpoints:

GET → /createUser/{userName}

POST → /createUser

We can use either of them; I have pointed both to the same topic.
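A minimal controller sketch for these two endpoints could look like the following. The class name and the plain confirmation strings are my assumptions (the article's actual code is in the linked repository); it assumes a KafkaTemplate<String, String> bean and the topic kafka-user-topic:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class UserController {

    private static final String TOPIC = "kafka-user-topic";

    private final KafkaTemplate<String, String> kafkaTemplate;

    // Spring injects the auto-configured KafkaTemplate
    public UserController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // GET /createUser/{userName} publishes the path variable to the topic
    @GetMapping("/createUser/{userName}")
    public String createUserGet(@PathVariable String userName) {
        kafkaTemplate.send(TOPIC, userName);
        return "Published: " + userName;
    }

    // POST /createUser publishes the raw JSON body to the same topic
    @PostMapping("/createUser")
    public String createUserPost(@RequestBody String payload) {
        kafkaTemplate.send(TOPIC, payload);
        return "Published payload to " + TOPIC;
    }
}
```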

The API consumes the data and pushes the payload into a Kafka topic (we are yet to define the topic; we will do it shortly). The producer configuration is done within the application. I have Kafka and ZooKeeper running on the same machine, hence the IP is defined as localhost. If the Kafka broker runs on a separate machine, its address would go here instead.
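With Spring Boot, this producer configuration can be expressed in application.properties; the port 9092 is Kafka's default, and the String serializers match the String payloads used above (a sketch, not necessarily the article's exact configuration):

```properties
# Kafka broker address; localhost because Kafka runs on the same machine
spring.kafka.bootstrap-servers=localhost:9092
# Serialize both key and value as plain strings
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
```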

Step 3: Start ZooKeeper. The important thing to note here is that ZooKeeper has to be started first, and the Kafka server should be started only once ZooKeeper is up and running.

Starting zookeeper
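Assuming you run it from the Kafka installation directory with the bundled default config, the start command is typically:

```shell
# Start ZooKeeper with the default config shipped with Kafka
bin/zookeeper-server-start.sh config/zookeeper.properties
# On Windows: bin\windows\zookeeper-server-start.bat config\zookeeper.properties
```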

Once ZooKeeper is up, it is the Kafka server's turn.

Step 4: Start the Kafka server.

Start kafka server
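Again assuming the Kafka installation directory and default config, the command is typically:

```shell
# Run only after ZooKeeper is up and running
bin/kafka-server-start.sh config/server.properties
```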

Step 5: Create the topic. This is required to push data into Kafka. Here the Spring Boot application / REST API acts as the producer.

Here we have created the topic kafka-user-topic. This is the same topic we configured in the Spring Boot application.
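A typical creation command looks like this; single partition and replication factor 1 are assumptions suitable for a local single-broker setup:

```shell
# Older Kafka versions (pre-2.2) register topics via ZooKeeper
bin/kafka-topics.sh --create --topic kafka-user-topic \
  --zookeeper localhost:2181 --partitions 1 --replication-factor 1

# Newer versions talk to the broker directly instead:
#   bin/kafka-topics.sh --create --topic kafka-user-topic \
#     --bootstrap-server localhost:9092 --partitions 1 --replication-factor 1
```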

Step 6: Now it is time to call the API and push the payload to the Kafka topic. We send JSON through the API; once the data reaches the Spring Boot application, the KafkaTemplate is used to send it to the Kafka topic.
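Instead of Postman, the same request can be made with curl; the port 8080 (Spring Boot's default) and the JSON field name are assumptions:

```shell
# POST a JSON payload to the application (field name is illustrative)
curl -X POST http://localhost:8080/createUser \
  -H "Content-Type: application/json" \
  -d '{"userName": "sajith"}'

# Or use the GET variant with the user name in the path
curl http://localhost:8080/createUser/sajith
```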

Step 7: We will create a Kafka consumer to check whether the data we pushed has reached it. Below we have created the consumer, and the message can be seen at the consumer's end.

Kafka Consumer
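For a quick check like this, Kafka's console consumer is enough; --from-beginning replays everything already on the topic:

```shell
# Print every message on the topic, from the earliest offset
bin/kafka-console-consumer.sh --topic kafka-user-topic \
  --bootstrap-server localhost:9092 --from-beginning
```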

We can go ahead and use a Kafka Connect connector to fetch data from the topic and publish it to another system such as AWS S3 or Apache Cassandra. I will cover that in the next article.

To summarize, I have demonstrated how data can be brought into Kafka using a REST API with Spring Boot integration. Kafka is built around Java, and there is a little bit of coding involved in the whole process.

Kafka has much more capability to ingest streaming data.

The code can be forked from here:

sajithgit/kafka (github.com)

Thank you,

Regards!
