Wednesday, October 6, 2021

Apache Kafka: Setting It Up

A short introduction to Kafka

Kafka is an event-based messaging system that safely moves data between systems.

Related Post
Before you start, I suggest reading my previous post, "What is Apache Kafka".

Let's talk about the setup.
You need Java running on your system. Let's check the current Java version: just open a terminal and execute "java -version".

  $ java -version
  java version "1.8.0_301"
  Java(TM) SE Runtime Environment (build 1.8.0_301-b09)
  Java HotSpot(TM) 64-Bit Server VM (build 25.301-b09, mixed mode)
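
If "java -version" says the command is not found, install a JDK first. A minimal sketch, assuming a Debian/Ubuntu system with apt (the exact package name varies by distribution and Java version):

  $ sudo apt-get update
  $ sudo apt-get install openjdk-8-jdk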
  
Once you have Java running on your system, follow these steps to set up and start Kafka:

1. Download & start the services
a. Get the latest Kafka release from the official Kafka downloads page (https://kafka.apache.org/downloads).
b. Unzip the archive and navigate to the extracted directory in a terminal.
c. Run the following commands to start all services in the correct order (use a new terminal for each command).

    Start the ZooKeeper service
    $ bin/zookeeper-server-start.sh config/zookeeper.properties

    Start the Kafka broker service
    $ bin/kafka-server-start.sh config/server.properties
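
    Optional sanity check: once both services are up, list the topics from yet another terminal (on a fresh install the list will simply be empty)
    $ bin/kafka-topics.sh --list --bootstrap-server localhost:9092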

2. Create a topic to store your events.
Kafka is a distributed event streaming platform that lets you read, write, store, and process events (also called records or messages in the documentation) across many machines.
Example events are payment transactions, geolocation updates from mobile phones, shipping orders, sensor measurements from IoT devices or medical equipment, and much more. These events are organized and stored in topics. Very simplified, a topic is similar to a folder in a filesystem, and the events are the files in that folder.
So before you can write your first events, you must create a topic (named "quickstart-events"). Open another terminal session and run:
$ bin/kafka-topics.sh --create --topic quickstart-events --bootstrap-server localhost:9092
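
To confirm the topic was created, you can describe it, which also shows details such as the partition count and leader:

$ bin/kafka-topics.sh --describe --topic quickstart-events --bootstrap-server localhost:9092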

3. Write some events into the topic
A Kafka client communicates with the Kafka brokers via the network for writing (or reading) events. Once received, the brokers will store the events in a durable and fault-tolerant manner for as long as you need—even forever.
Run the console producer client to write a few events into your topic. By default, each line you enter will result in a separate event being written to the topic.

$ bin/kafka-console-producer.sh --topic quickstart-events --bootstrap-server localhost:9092
  This is my first event
  This is my second event
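
You can stop the producer client with Ctrl-C at any time. The "as long as you need" part above is governed by the broker's retention settings; as an illustrative sketch, the shipped config/server.properties typically contains a line like this (you do not need to change it for this quickstart):

  # keep events for 7 days (168 hours) before they become eligible for deletion
  log.retention.hours=168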
    
4. Read the events
Open another terminal session and run the console consumer client to read the events you just created:

$ bin/kafka-console-consumer.sh --topic quickstart-events --from-beginning --bootstrap-server localhost:9092
This is my first event
This is my second event
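
Because the events are stored durably in Kafka, you can stop the consumer with Ctrl-C and re-run the command to replay them from the beginning. As an optional sketch, you can also attach the consumer to a named consumer group (the name "my-test-group" here is only an illustration) so that Kafka tracks which offsets the group has already consumed:

$ bin/kafka-console-consumer.sh --topic quickstart-events --group my-test-group --bootstrap-server localhost:9092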



5. Higher-Level Diagram
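
At a high level, the pieces from the steps above fit together like this (a simplified text sketch, arrows showing the flow of events):

    console producer  -->  Kafka broker (topic: quickstart-events)  -->  console consumer
                                   |
                                   +--> ZooKeeper (cluster coordination and metadata)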



So this was about the setup of Apache Kafka. In the next post, we will use Kafka in a NodeJS application.

