Integrating LVM with Hadoop and Providing Elasticity to DataNode Storage

Kanishka Shakya
3 min readMar 14, 2021

What is Hadoop?

Hadoop is an open-source framework that lets you store and process big data in a distributed environment across a cluster of computers. It provides massive storage for any kind of data, enormous processing power, and the ability to handle virtually limitless concurrent tasks or jobs.

What is Logical Volume Management (LVM)?

LVM is a tool for logical volume management, which includes allocating disks, striping, mirroring, and resizing logical volumes. With LVM, a hard drive or set of hard drives is allocated to one or more physical volumes. LVM physical volumes can be placed on other block devices, which might span two or more disks. The physical volumes are combined into volume groups, from which logical volumes are carved out.

Let’s go:-

Step 1:- First, I created an EBS volume.
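The volume can also be created from the AWS CLI instead of the web console. A minimal sketch, assuming a 10 GiB gp2 volume in the ap-south-1a Availability Zone (the size, AZ, and tag are illustrative, not the exact values used here):

```shell
# Create a 10 GiB gp2 EBS volume (size, AZ, and Name tag are assumptions)
aws ec2 create-volume \
    --availability-zone ap-south-1a \
    --size 10 \
    --volume-type gp2 \
    --tag-specifications 'ResourceType=volume,Tags=[{Key=Name,Value=datanode-extra}]'
```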

Step 2:- Confirm the volume in the graphical console.

Step 3:- Now, I attached the EBS volume to the instance.
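Attaching can likewise be done from the CLI. A sketch with placeholder IDs (replace the volume ID, instance ID, and device name with your own):

```shell
# Attach the volume to the instance as /dev/xvdf
# (the IDs below are hypothetical placeholders)
aws ec2 attach-volume \
    --volume-id vol-0123456789abcdef0 \
    --instance-id i-0123456789abcdef0 \
    --device /dev/xvdf
```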

Step 4:- With the fdisk -l command, we can confirm whether the volume is attached.
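Inside the instance, the new disk typically shows up as /dev/xvdf (the device name depends on the instance type):

```shell
# List all disks and partitions; look for the newly attached device
fdisk -l

# lsblk gives a more compact tree view of block devices
lsblk
```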

Step 5:- Create a physical volume from that disk with the pvcreate command.
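Assuming the disk appeared as /dev/xvdf (check lsblk for the actual name on your instance):

```shell
# Initialize the attached disk as an LVM physical volume
pvcreate /dev/xvdf

# Verify the physical volume was created
pvdisplay /dev/xvdf
```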

Step 6:- To create the volume group, we use the vgcreate command.
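A sketch, using an illustrative volume-group name "dnvg" (not necessarily the name used in the screenshots):

```shell
# Combine the physical volume into a volume group named dnvg
vgcreate dnvg /dev/xvdf

# Show the volume group's size and free extents
vgdisplay dnvg
```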

Step 7:- To create the logical volume, we use the lvcreate command.
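Continuing with the assumed names, carving a 5 GiB logical volume (the size and the name "dnlv" are illustrative):

```shell
# Create a 5 GiB logical volume named dnlv inside volume group dnvg
lvcreate --size 5G --name dnlv dnvg

# Verify the logical volume
lvdisplay dnvg/dnlv
```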

Step 8:- To format the logical volume, we use the mkfs command.
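For example, with an ext4 filesystem (the filesystem type is a common choice, not mandated by Hadoop):

```shell
# Format the logical volume with ext4
mkfs.ext4 /dev/dnvg/dnlv
```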

Step 9:- Mount the logical volume on the DataNode directory.
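A sketch, assuming the DataNode directory is /dn (it must match the dfs.datanode.data.dir property configured in hdfs-site.xml):

```shell
# Create the mount point if it doesn't exist and mount the LV on it
# (/dn is an assumed directory; use your DataNode's configured data dir)
mkdir -p /dn
mount /dev/dnvg/dnlv /dn

# Confirm the mount and its size
df -h /dn
```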

Finally, we were able to change the storage contribution of the DataNode to the Hadoop cluster by integrating it with LVM.
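The elasticity comes from the fact that the logical volume can be resized online, without unmounting it or stopping the DataNode. A sketch using the names assumed in the steps above:

```shell
# Grow the logical volume by 3 GiB while it stays mounted
lvextend --size +3G /dev/dnvg/dnlv

# Grow the ext4 filesystem online to fill the new space
resize2fs /dev/dnvg/dnlv

# The extra capacity is then reflected in the cluster report:
# hadoop dfsadmin -report
```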

THANK YOU ALL!!!
