Apache Kafka Foundation Course - Quick Start Demo


Welcome to the Apache Kafka tutorial at Learning Journal. In this video, I will provide a quick start demo. We will cover the following things.

  1. Download and Install Apache Kafka
  2. Start Kafka server
  3. Create a topic
  4. Start a console producer
  5. Start a console consumer
  6. Send and receive messages

In fact, I am going to follow the quick start guide from the Apache Kafka documentation. I will also explain a few things along the way, and this demo will give you a good sense of some of the command line tools that Kafka provides.
If you want to play with Apache Kafka and follow the example discussed in this tutorial, you will need a Linux machine. I have a virtual machine installed on my Windows box, and I am going to use it for this demo.

Download and Install Apache Kafka

The first thing is to download and install Kafka. You can find the download link in the Kafka documentation. So, go ahead and download Apache Kafka using the link. I downloaded the Kafka 0.10.1 release because that's the latest version at the time of recording this video. However, you can follow the latest Kafka quick start from the official Kafka documentation.
The download will give you a tar file. I created a directory named Kafka in my home directory and placed the tar file in that directory. Now I need to uncompress this file. So, let's do it.
The below command will uncompress the tar file in the current directory.
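A command along these lines should do it. The exact archive name depends on the Scala build and Kafka release you downloaded; the file name below is an assumption based on the Scala 2.11 build of the 0.10.1.0 release.

```shell
# Extract the Kafka archive into the current directory.
# The file name assumes the Scala 2.11 build of Kafka 0.10.1.0;
# adjust it to match the file you actually downloaded.
tar -xzf kafka_2.11-0.10.1.0.tgz
```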

That's it. You have installed Apache Kafka. That's all there is to the installation.
If you list your current directory, you will see that a new directory is present. Change to that directory.
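For example, assuming the 0.10.1.0 archive name used above:

```shell
# Move into the newly extracted Kafka directory.
cd kafka_2.11-0.10.1.0
```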


You can refer to this new directory as the Kafka home.
If you list your Kafka home, you will see several folders.
The bin folder contains some command line Kafka tools. We will use most of those tools in this tutorial, and I will explain those tools when we use them for the first time.
The config directory contains all the configuration files. We will explore some configuration files as well in this tutorial.
The libs directory contains all the jar files.
The logs directory is the data directory. Kafka stores all the messages in that folder.

Start Kafka Server

The next step is to start the first Kafka broker. But Kafka uses ZooKeeper. For those who don't know about ZooKeeper, let me give you a quick introduction.
ZooKeeper is another open source project that came out of the Hadoop project. ZooKeeper provides coordination services for a distributed system. Since Kafka is a distributed system with multiple brokers, we need a system to coordinate various things among these brokers. That's why we need ZooKeeper.
So, before we start a Kafka broker, we have to start ZooKeeper.
Kafka provides a command line tool to start ZooKeeper. It's available in your bin directory. So let's go ahead and start ZooKeeper.
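Based on the Kafka quick start, the command looks like this. The paths assume your current directory is the Kafka home.

```shell
# Start a single-node ZooKeeper with the default configuration.
# By default, it listens on port 2181.
bin/zookeeper-server-start.sh config/zookeeper.properties
```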

The zookeeper-server-start.sh is a shell script, and it takes one parameter: a configuration file name. We can use the default config.
After executing the above command, you should see a message indicating that ZooKeeper is running on port 2181. Minimize the current terminal and start another one for starting a Kafka broker.
Kafka provides a command line tool to start the broker. Use the below shell command to start a broker. The shell script takes some broker configurations, and we provide a config file with all the default values.
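Per the quick start, and again assuming you are in the Kafka home directory, the command is:

```shell
# Start a Kafka broker with the default configuration.
# The default broker listens on port 9092 and
# connects to ZooKeeper on localhost:2181.
bin/kafka-server-start.sh config/server.properties
```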

The server.properties file is part of the default Kafka installation, so you don't have to change anything; just use the default file.
Once you execute the above command, you will see a lot of messages scrolling by. Hopefully, you will see a message like "started" or "startup complete". If you don't see such a message, scroll up a little and try to find it. We just need to make sure that there are no errors and the broker started successfully. In my case, it has started.


Create Kafka Topic

The next step is to create a Kafka topic. In fact, you can skip this step because, by default, Kafka will create a topic automatically. So, when a producer sends data to a non-existent topic, Kafka will create the topic and accept the message. But let's create the topic using the topic management tool.
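Based on the 0.10.1 quick start, the command should look something like this. The topic name, partition count, and replication factor match the values discussed in this section.

```shell
# Create a topic named MyFirstTopic1 with two partitions
# and a replication factor of one.
bin/kafka-topics.sh --create --zookeeper localhost:2181 \
    --replication-factor 1 --partitions 2 --topic MyFirstTopic1
```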

The first parameter is the ZooKeeper address and port. The next item is the create command. The topic management tool provides many functions; in this example, we are using it to create a topic, so we give the create command. The next parameter is the topic name. So, I name it MyFirstTopic1. The other two parameters give the number of partitions and the replication factor. Let's ignore the replication factor for the time being; we will cover it in the next video. You already know partitions. I am creating two partitions on this topic. You might be wondering how we can create two partitions when we have a single broker. That's not a problem. Kafka will try to distribute partitions evenly over the available brokers, but in our case, we have just a single broker, so Kafka has no option other than creating both partitions on the same machine.

Start Kafka Producer and Consumer

Now, we have the last thing. Create a console producer in one terminal and a console consumer in another terminal. Then we will send some messages from the producer, and they should appear at the consumer.
So, let's start the producer using the command listed below.
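Per the quick start, the command should look like this; the topic name assumes the topic created earlier in this tutorial.

```shell
# Start a console producer that sends each typed line
# as a message to the MyFirstTopic1 topic.
bin/kafka-console-producer.sh --broker-list localhost:9092 \
    --topic MyFirstTopic1
```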

To send a message to Kafka, you need a broker address. That's what the first parameter specifies. We have a broker running on localhost at the default port 9092. You submit the above command, and a producer starts running. It is a console producer, so whatever we type on the console, it will send to the broker.
Before we start sending messages, open another terminal and start a consumer. Use the below command to start a console consumer.
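Per the quick start, the command should look like this. The --from-beginning flag is an optional extra from the official quick start; it makes the consumer also print messages that were sent before it started.

```shell
# Start a console consumer that prints messages from MyFirstTopic1.
# --from-beginning reads the topic from the start of the log.
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 \
    --topic MyFirstTopic1 --from-beginning
```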

All the parameters are the same as we used for the producer. The parameter bootstrap-server has the same meaning as broker-list.

Great, you should have the consumer as well as the producer running in two different terminals. You can send some text from the producer by typing messages at the console. You should see those messages arriving at the consumer terminal.
The console producer and console consumer are of no use other than for simple demonstrations and testing. In a real-life project, you must use the Kafka producer and consumer APIs and code your producers and consumers according to your requirements.

