HOW TO INSTALL AND USE MICROSOFT HDINSIGHT (HADOOP ON WINDOWS)

HDInsight is Microsoft’s 100% Apache-compatible Hadoop distribution, supported by Microsoft. HDInsight, available both on Windows Server and as a Windows Azure service, empowers organizations with new insights into previously untouched unstructured data, while connecting to the most widely used Business Intelligence (BI) tools on the planet. In this post we'll jump directly into the hands-on part. But if you want more on HDInsight, you can visit my other post here.

NOTE : OS used - Windows 7

So let's get started.

First of all, go to the Microsoft Big Data page and click on the Download HDInsight Server link (shown in the blue ellipse). You will see something like this :



Once you click the link, it will take you to the Download Center. Now go to the Instructions heading and click on Microsoft Web Platform Installer.



This will automatically download and install all the required components.

Once the installation is over, open the Microsoft Web Platform Installer and go to the top-right corner of its UI, where you will find a search box. Type Hadoop in there. This will show you the Microsoft HDInsight for Windows Server Community Technology Preview bar. Select it, click on Install, and you are done.

NOTE : It may take some time to install all the necessary components depending upon your connection speed.

On successful completion of the HDInsight installation you will find the Hadoop Command Line icon on your desktop. You will also find a brand new directory named Hadoop inside your C drive. This indicates that everything went OK and you are good to go.
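If you would like to confirm this from a command prompt as well, you can simply list the new directory (the hadoop-1.1.0-SNAPSHOT folder name inside it may differ slightly depending on the CTP build you installed):

c:\>dir C:\Hadoop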

TESTING TIME

It's time now to test HDInsight.

Step 1. Go to the C:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin directory :
c:\>cd Hadoop\hadoop-1.1.0-SNAPSHOT\bin

Step 2. Now start the daemons using start_daemons.cmd :
c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>start_daemons.cmd

It will show you something like this on your terminal :


This means that your Hadoop processes have been started successfully and you are all set.
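If you want to double-check that HDFS is really up before moving on, one simple option (assuming you are still in the bin directory, where the hadoop command lives) is to ask for a filesystem report; it prints the configured capacity, the DFS space used and the number of live datanodes:

c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop dfsadmin -report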

Let us use a few of the Hadoop commands to get ourselves familiar with Hadoop.

1. List all the directories, sub-directories and files present in HDFS. We do it using fs -ls :
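For example, the following lists whatever is sitting under the HDFS root (the actual listing will of course depend on what is already in your cluster):

c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop fs -ls /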



2. Create a new directory inside HDFS. We use fs -mkdir to do that :
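A minimal sketch, where the directory name demo is just an illustrative example, followed by an fs -ls to confirm it was created:

c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop fs -mkdir /user/demo
c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop fs -ls /user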



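3. A few other commands you are likely to need straight away: copying a local file into HDFS with fs -put, printing its contents with fs -cat, and removing a directory with fs -rmr. The file and directory names below are purely illustrative:

c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop fs -put C:\data\sample.txt /user/demo
c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop fs -cat /user/demo/sample.txt
c:\Hadoop\hadoop-1.1.0-SNAPSHOT\bin>hadoop fs -rmr /user/demo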
You should be fairly familiar with the Hadoop shell by now, but I would suggest going to the official Hadoop page and trying more commands in order to get a good grip. HAPPY HADOOPING..!!

