Hadoop 1.0 Installation on Windows 7 Using Cygwin


This article explains the installation of Hadoop 1.0 on Windows 7. Follow the step-by-step process to install Hadoop 1.0.
In case you get any error, you can ask our Hadoop experts directly in the Forum.

Download Prerequisite Software:

There is a set of prerequisite software packages which you need to install before you set up Hadoop.

Cygwin:
You need to download the Unix command-line tool Cygwin. Cygwin is a large collection of GNU and open-source tools which provide functionality similar to a Linux distribution on Windows. It is needed to run the scripts supplied with Hadoop, because they are all written for the Unix platform.
Download the Cygwin installer from the link below. Select either 32 bit or 64 bit as per your operating system.
Java:
You need to download Java 1.6 or a later version. Download JDK 1.6 from the link Java SE 6 Downloads.
You will have to create an Oracle account to download it.

Important Note:
By default, programs get installed in Program Files.
It is strongly advisable to create folders directly in the C drive for Cygwin and Java, so that you do not run into the problems caused by folder names that include spaces.

Install Cygwin:
    1. Run the downloaded Cygwin setup file
    2. On the Choose Installation Type screen, select Install from Internet, then click Next.

    3. On the Choose Installation Directory screen, change the Root Directory from the default ("Program Files") to "C:\Cygwin"

    4. On the Select Local Package Directory screen, choose the same folder - C:\Cygwin

    5. On the Select Connection Type screen, select appropriate settings to connect to the internet, then click Next.

    6. On the Choose Download Site(s) screen, select any site from the available list, then click Next.

    7. Keep pressing Next until you see the Select Packages screen.
    8. Now type "open" in the search textbox. Expand "Net" and click on "Skip" to select openssh and openssl.

    9. Next, search for "dos2unix" and select that package too.

    10. After you have selected all the packages, press the Next button; the download and installation will take some time.

    11. Click Finish to complete the installation process.

Cygwin is installed successfully.



Configure OpenSSH in Cygwin:
    1. Once Cygwin is installed and running properly, you'll need to configure the ssh components in order to execute Hadoop scripts.
    2. Right-click on your Cygwin shortcut, and click on "Run as administrator". This will make sure we have the proper privileges for everything. You'll see an empty Cygwin window come up.

    3. Execute the following command to start the configuration wizard for openssh.
ssh-host-config
  • You'll see the script generate some default files, and then you'll be prompted for whether or not you want to enable "Privilege Separation." It's on by default in standard installations of OpenSSH on other systems, so go ahead and say "yes" to the prompt.

  • Next, you'll be asked if you want sshd to run as a service. This will allow you to get SSH access regardless of whether or not Cygwin is currently running, which is what we want. Go ahead and hit "yes" to continue.

  • Next, you'll be asked to enter a value for the daemon. Enter the value ntsec.
  • You'll see the script give you some information on your system and then it will ask you to create a privileged account with the default username "cyg_server". The default works well, so type "no" when it asks you if you want to use a different account name, although you can change this if you really like.

  • Enter an easy-to-remember password. After some information, you'll get a success message:
    "Host configuration finished. Have fun!"
  • You can either restart, or enter the following command to start the sshd service:
  • net start sshd



    User Configuration of SSH:
      1. Now we'll create the appropriate SSH keys for your user account. Enter the following command:
    ssh-user-config
  • You'll be asked to create specific keys for your user account. SSH2 is more secure, so we recommend answering as below.
  • Shall I create a SSH1 RSA identity file for you? : no
    Shall I create a SSH2 RSA identity file for you? : yes
  • Enter a passphrase and answer yes to the question
  • Do you want to use this identity to login to this machine? (yes/no) - yes


  • Next, you'll be asked to create an SSH2 DSA ID file. You may answer no to that. Finally you will see a success message.

  • That's it! Configuration is done. If you want to test your configuration really quickly, enter the following command in your Cygwin window.
  • ssh -v localhost

    Note: If you find yourself stuck at any of the configuration steps, make sure that the Windows User Account you're running has Administrative access. You may get weird errors if you try to run the host configuration as a normal user, so make sure you run Cygwin with admin privileges during that step.
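    If you want to double-check that the sshd service itself is registered and running, Cygwin's service manager can query it. A minimal sketch, assuming the default service name sshd:

    $ cygrunsrv -Q sshd    # query the service; the State line should report Running
    $ net start sshd       # start it manually if it is not running yet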



    Install Java:
    We need to install 64-bit Java, minimum version 1.6.

    Double click on the JDK file you have downloaded.

    Important Note:
    By default, programs get installed in Program Files. It is strongly advisable to create a folder directly in the C drive for Java, so that you do not run into the problems caused by folder names that include spaces.

    If you already have Java installed in the "Program Files" folder, we recommend uninstalling it and installing into the C:\Java folder.

    Set Environment Variable for Cygwin and Java:
    The next step is to set up the PATH environment variable. PATH is an environment variable used by the operating system to locate executables.
      1. Right-click on "My Computer" and select the Properties item from the menu.
      2. Click on "Advanced System Settings".
        In the "System Properties" window, click on the "Environment Variables" button and locate the PATH variable in the "System Variables" section.

      3. Append the bin folder path of the installed Cygwin, C:\cygwin64\bin;, and click OK.
        By default the path will be C:\cygwin64\bin; in case your installation path is different, you may have to use that path instead.

      4. Add a new System Variable JAVA_HOME and set it to the installed Java path C:\JAVA (no semicolon at the end).
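        To confirm the variables took effect, open a new Cygwin terminal (it only picks up Windows environment changes on startup) and try the quick checks below; a sketch assuming Cygwin was installed to C:\cygwin64 and Java to C:\JAVA, as above:

        $ which dos2unix           # should resolve under the Cygwin bin folder via PATH
        $ echo $JAVA_HOME          # Cygwin inherits Windows variables; should print C:\JAVA
        $ cygpath -u "$JAVA_HOME"  # the same location as a Unix-style path, /cygdrive/c/JAVA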


    Download Hadoop 1.0:
    Now that you have successfully installed and configured all prerequisite software, let us download and install Hadoop.
      1. Download Hadoop 1.0 from this link
      2. Open the Cygwin terminal (Run as Administrator)
      3. Execute the command "explorer ." to locate your home directory. It will open the Cygwin home directory folder.

      4. Copy the Hadoop archive (which you just downloaded) and place it in the home directory folder (which was opened in the previous step)


    Unzip The Hadoop Package:
    Now we need to unzip the downloaded Hadoop package and save it to the Cygwin home folder.
      1. Open the Cygwin terminal (Run as Administrator)
      2. Execute the tar command below to start unpacking the Hadoop package.
    tar -xzf hadoop-x.y.z.tar.gz

    Here x.y.z is your Hadoop version.
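    For example, for the 1.2.1 release (an assumed version; substitute whatever release you actually downloaded):

    $ tar -xzf hadoop-1.2.1.tar.gz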

    This process might take some time, and after it completes you will see the Cygwin command prompt again. There is no success message.

  • Execute the command ls -l to list the contents of the home directory. You should see a new hadoop-x.y.z folder in the directory. Enter the command cd hadoop-x.y.z to go to the Hadoop root folder

  • Execute ls -l again to see the contents of the Hadoop folder. If the listing shows the Hadoop files and folders, it means Hadoop was unpacked successfully and all required files are present.


  • Configurations:
      1. Open Cygwin and type the command "explorer ." to open Home folder.

        Create a folder with the name "hadoop-dir". Inside the "hadoop-dir" folder, create two folders with the names "datadir" and "namedir".
      2. In Cygwin, execute the chmod command to change the folder permissions so that the folders can be accessed by Hadoop.
    $ chmod 755 hadoop-dir
    $ cd hadoop-dir
    $ chmod 755 datadir
    $ chmod 755 namedir

    You need to change into the hadoop-dir folder using the cd command (as above) before running chmod on datadir and namedir.
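    If you prefer to do this whole step from the terminal instead of Windows Explorer, a minimal sketch run from the Cygwin home directory:

    $ cd ~
    $ mkdir -p hadoop-dir/datadir hadoop-dir/namedir    # creates all three folders in one go
    $ chmod 755 hadoop-dir hadoop-dir/datadir hadoop-dir/namedir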

  • Open the Cygwin terminal (Run as Administrator) and execute the following commands
  • $ cd hadoop-x.y.z
    $ cd conf
    $ explorer .

    It will open conf folder in Windows explorer window.

  • Open the hadoop-env.sh file to set the Java home, as you did before for the environment variable setup.
    Uncomment the line which contains "export JAVA_HOME" and provide your Java path. Use forward slashes, with no space after the equals sign and no quotes or trailing slash; otherwise the Hadoop scripts fail with errors such as "C:\Program: command not found".
  • export JAVA_HOME=C:/Java


  • Open the core-site.xml file and add the code below.
  • <configuration>
      <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost:50000</value>
      </property>
    </configuration>


  • Open the mapred-site.xml file and add the code below.
  • <configuration>
      <property>
        <name>mapred.job.tracker</name>
        <value>localhost:50001</value>
      </property>
    </configuration>


  • Open the hdfs-site.xml file and add the code below.
    Change USER_FOLDER_NAME in the code to your own user folder name (see the sketch after the snippet).
  • <configuration>
      <property>
        <name>dfs.data.dir</name>
        <value>/home/USER_FOLDER_NAME/hadoop-dir/datadir</value>
      </property>
      <property>
        <name>dfs.name.dir</name>
        <value>/home/USER_FOLDER_NAME/hadoop-dir/namedir</value>
      </property>
    </configuration>
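    If you are unsure of your exact user folder name, you can let the shell substitute it for you. A sketch assuming the GNU sed and coreutils tools from the default Cygwin install, run from the conf folder:

    $ whoami                                       # prints your Cygwin user name
    $ sed -i "s/USER_FOLDER_NAME/$(whoami)/g" hdfs-site.xml
    $ grep hadoop-dir hdfs-site.xml                # confirm the substituted paths look right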


  • For every file we changed, run the dos2unix command so that Windows line endings do not break the Hadoop scripts.
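    A sketch of that pass over the files we touched, run from the conf folder:

    $ cd ~/hadoop-x.y.z/conf
    $ dos2unix hadoop-env.sh core-site.xml mapred-site.xml hdfs-site.xml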


  • Congratulations!! We have successfully installed Hadoop 1.0 on Windows.

    Format the NameNode and Run the Hadoop Daemons:
    Format the NameNode:
    We first need to format the NameNode to create a Hadoop Distributed File System (HDFS).

    Open the Cygwin terminal (Run as Administrator) and execute the following commands
    $ cd hadoop-x.y.z
    $ bin/hadoop namenode -format
    This command will run for some time. You should be able to see the message "Storage Directory has been successfully formatted".

    Start Hadoop Daemons:
    Once the filesystem has been created, the next step is to check and start the Hadoop cluster daemons: NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker.

    Restart the Cygwin terminal and execute the command below to start all daemons on the Hadoop cluster.
    $ bin/start-all.sh

    This command will start all the services in the cluster, and you now have your Hadoop cluster running.
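    To verify that the daemons actually came up, you can list the running Java processes with the JDK's jps tool. Note that this tutorial does not put the Java bin folder on the PATH, so invoke jps by its full path; a sketch assuming Java is installed in C:\Java:

    $ /cygdrive/c/Java/bin/jps
    # should list NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker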

    Stop Hadoop Daemons: To stop all the daemons, execute the command
    $ bin/stop-all.sh


    Web interface for the NameNode and the JobTracker:
    After you have started the Hadoop daemons using the command bin/start-all.sh, you can open and check the NameNode and JobTracker in a browser.
    By default they are available at the addresses below. The web interface for each of these services provides information on and the status of that component. They are the first entry point for obtaining a view of the state of a Hadoop cluster.

    NameNode Web Interface:
    The NameNode web interface can be accessed via the URL http://<host>:50070/
    In this case, http://localhost:50070/
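    If the page does not load, a quick command-line probe helps distinguish a dead daemon from a browser problem. A sketch assuming you selected the curl package during Cygwin setup (it is not in the base install):

    $ curl -s http://localhost:50070/ | head    # NameNode; should return some HTML
    $ curl -s http://localhost:50030/ | head    # JobTracker, covered below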

      1. The first section of the web page displays the name of the server running the NameNode, here 127.0.0.1 with the RPC port configured in core-site.xml (50000), when it was started, and version information.
      2. In the next section you see the Cluster Summary, which represents a high-level view of the state of the cluster.

        Files and directories, blocks: counts of the filesystem metadata items; each such item consumes memory on the NameNode.

        Configured Capacity: represents the total capacity of HDFS.

        DFS Used: represents the space used in HDFS.

        Non DFS Used: the space used by non-HDFS items, such as other applications running on the system.

    JobTracker Web Interface:
    The JobTracker web interface can be accessed via the URL http://<namenode_host>:50030/
    In this case, http://localhost:50030/

    79 thoughts on “Hadoop Installation on Windows 7 Using Cygwin”

    1. PeterP says:

      The tasktracker never comes up. I’m sure you have seen this as well

      ERROR mapred.TaskTracker: Can not start task tracker because java.io.IOException:
      Failed to set permissions of path: \tmp\hadoop-xxx\mapred\local\taskTracker to 0755

      I am not sure if that yak-shave is needed here, or whether to move to a 2.x version or a 0.xx version.

      • Admin says:

        Hi Peter,
        Please post your technical question to the Forum

      • Manish Agrawal says:

        Hi Peter,

        Check the JAVA_HOME path in hadoop-env.sh.
        The root cause of this problem is the JAVA_HOME path. It should be correct both in this file (hadoop-env.sh) and in the Environment variable.
        As per the admin, make sure your Java is installed in the C:\ directory (so that JAVA_HOME will be C:/Java/jdk_version).

        If your Java is installed in Program Files, then make sure the JAVA_HOME path is:
        JAVA_HOME = C:/Progra~1/Java/jdk_version

        Example: JAVA_HOME=C:/Progra~1/Java/jdk1.7.0_60

        Note: The Linux environment needs a forward slash (/) as the separator, unlike Windows (\). So make the changes as written in the example above.

        • Priyanka says:

          Hi Manish,

          I have checked the JAVA_HOME setting. It is right… still my tasktracker is not starting. Can you provide me with any other probable solution?

          Thank you.

          • Admin says:

            Hi Priyanka,
            What is the error message you are getting? Could you please check if there is any space within the Java JDK’s file path?

            Regards,
            Raja

      • Shadab Bigdel says:

        For resolving the problem “ERROR mapred.TaskTracker: Can not start task tracker because java.io.IOException:
        Failed to set permissions of path: \tmp\hadoop-xxx\mapred\local\taskTracker to 0755”,
        see this link: https://github.com/congainc/patch-hadoop_7682-1.0.x-win

        Download the pre-built JAR, patch-hadoop_7682-1.0.x-win.jar, from the Downloads section (or build it yourself).
        Copy patch-hadoop_7682-1.0.x-win.jar to the ${HADOOP_HOME}/lib directory

        Modify ${HADOOP_HOME}/conf/core-site.xml to enable the overridden implementation as shown below:

        <property>
        <name>fs.file.impl</name>
        <value>com.conga.services.hadoop.patch.HADOOP_7682.WinLocalFileSystem</value>
        <description>Enables patch for issue HADOOP-7682 on Windows</description>
        </property>

        Run your job as usual (using Cygwin). You should see some logging to System.err if the patch is enabled and working.

        • Chandan says:

          Thanks for the info. The patch worked like a champ.
          To elaborate:
          Add this property; don't replace the existing one.

          <property>
          <name>fs.file.impl</name>
          <value>com.conga.services.hadoop.patch.HADOOP_7682.WinLocalFileSystem</value>
          <description>Enables patch for issue HADOOP-7682 on Windows</description>
          </property>

          <property>
          <name>fs.default.name</name>
          <value>hdfs://localhost:50000</value>
          </property>

    2. Jack Yim says:

      I’m facing some problems in that I can’t access http://localhost:50070 and http://localhost:50030. May I know what the problem is?

      • Admin says:

        Hi Jack,
        Could you please post the error message. What exactly is the problem?

        • Zahia says:

          Hi. Same here. When I finished the installation I accessed the 2 sites normally, but the second time I tried it didn’t work. It either shows the (connection failed) page, or it loads the first lines of the page and then shows the (connection failed) page again… I tried more, and sometimes I got one page and not the other; then I try again and get the second but not the first. Can you help me please?

          • Hariharan says:

            I am also getting stuck at the same point. The first time, both worked just fine. But afterwards only the Jobtracker page is displayed.

    3. sanjeev says:

      thanks a lot… I was able to configure successfully… 🙂 🙂

    4. sanjeev says:

      can anyone help with how to configure the eclipse juno plugins for hadoop 1.2.1. It will be of great help, if link to the plugin is provided.

      Thanks in advance… 🙂

    5. Prayas says:

      Hi, I am using Windows 7 Enterprise 32-Bit Operating System.
      Which installer of Java should I download for setting up Hadoop.

      I see following version for windows in Java SE Development Kit 6u45 category-

      Windows x86
      Windows x64

      Thanks!!

    6. Prayas says:

      Thanks Admin, I am using Windows 7 Enterprise 32-Bit Operating System on LAN.
      When I issue command- ssh-host-config after starting Cygwin, it gives me following prompt. How should I respond?
      I want to setup Hadoop env on my machine only without disturbing rest of the machines on LAN. Please help.

      *** Info: StrictModes is set to ‘yes’ by default.
      *** Info: This is the recommended setting, but it requires that the POSIX
      *** Info: permissions of the user’s home directory, the user’s .ssh
      *** Info: directory, and the user’s ssh key files are tight so that
      *** Info: only the user has write permissions.
      *** Info: On the other hand, StrictModes don’t work well with default
      *** Info: Windows permissions of a home directory mounted with the
      *** Info: ‘noacl’ option, and they don’t work at all if the home
      *** Info: directory is on a FAT or FAT32 partition.
      *** Query: Should StrictModes be used? (yes/no)

      Regards
      Prayas

    7. nagamani k says:

      I followed the same procedure as you said; when I type the command bin/hadoop namenode -format I am getting the following error. Please give me a reply as soon as possible.
      error:
      bin/hadoop: line 58: 54.: command not found
      bin/hadoop: line 60: syntax error near unexpected token `then’
      bin/hadoop: line 60: `55. if [ -e “$bin”/../libexec/hadoop-config.sh ]; then’

    8. Anusha says:

      Hi,
      I followed the same steps as you said above. I get an error while extracting the tar file of Hadoop.
      When ls -l is executed I am not able to see the jar files in it. What can I do?

      • Admin says:

        Hi Anusha,
        Could you please post the error message you got while extracting the tar file of Hadoop.

        Regards,
        Admin

    9. Raj says:

      I’m facing a problem in that I can’t access http://localhost:50030. When I try to access it, I am getting an error, while I am able to access the namenode web interface… please help. Thank you.

      • Admin says:

        Hi Raj,
        Could you please provide the error details.

        Regards,
        Raja

        • Raj says:

          Thank you.
          here is the error log in the file hadoop-cyg_server-jobtracker-hp-pc.out file. Btw, pls advise if am I referring to the correct error file?

          /home/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
          /home/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
          [Fatal Error] hdfs-site.xml:8:41: The element type “user” must be terminated by the matching end-tag “”.
          unlimited
          file size (blocks, -f) unlimited
          open files (-n) 256
          pipe size (512 bytes, -p) 8
          stack size (kbytes, -s) 2031
          cpu time (seconds, -t) unlimited
          max user processes (-u) 256
          virtual memory (kbytes, -v) unlimited

          Thank you for your guidance.

          • pranav says:

            i know this is a tad too late…. but may help someone who tries to set this up in the future.
            ensure there’s no space in the path to your JAVA_HOME dir.
            you may also need to run dos2unix against the files you modified

        • Mahender says:

          I am facing the same error. What is wrong in hadoop-env.sh? I didn’t change anything; I only changed the java_home attribute value.

      • pranav says:

        look at your hadoop-jobtracker–PC.log file
        you may see a permission issue like:
        ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:Tatter cause:java.io.IOException: Failed to set permissions of path: D:\hadoop\hadoop-1.2.1\logs\history to 0755

        fix it manually and retry….. worked for me

      • Mukesh Pandey says:

        Please add the following setting in core-site.xml @ ..hadoop-1.2.1\conf\

        <property>
        <name>hadoop.tmp.dir</name>
        <value>/tmp/hadoop-${user.name}</value>
        <description>A base for other temporary directories.</description>
        </property>

    10. saurabh asthana says:

      i am unable to run /bin/hadoop namenode -format

      ~/hadoop-2.7.0
      $ bin/hadoop namenode -format
      bin/hadoop: line 2: $’\r’: command not found
      bin/hadoop: line 17: $’\r’: command not found
      bin/hadoop: line 19: $’\r’: command not found
      : No such file or directoryome/ASTHANAS/hadoop-2.7.0/bin
      bin/hadoop: line 23: $’\r’: command not found
      bin/hadoop: line 25: $’\r’: command not found
      bin/hadoop: line 27: syntax error near unexpected token `$’in\r”
      ‘in/hadoop: line 27: `case “$(uname)” in

    11. Naidu says:

      hi, I followed the same procedure as you said; when I used the command

      bin/hadoop namenode -format

      I am getting the following error.
      error:
      bin/hadoop: line 350: C:\Program: command not found
      bin/hadoop: line 434: C:\Program Files (x86)\Java/bin/java: No such file or directory

      any help would be great.thanks!

    12. Mahender says:

      I got this error when i am trying to hit this command: $ bin/hadoop namenode –format

      $ /home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
      > /home/MAHI/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
      > 15/06/28 19:01:21 INFO namenode.NameNode: STARTUP_MSG:
      > /************************************************************
      > STARTUP_MSG: Starting NameNode
      > STARTUP_MSG: host = MAHI-PC/10.10.34.94
      > STARTUP_MSG: args = [▒format]
      > STARTUP_MSG: version = 1.2.1
      > STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/b ranch-1.2 -r 1503152; compiled by ‘mattf’ on Mon Jul 22 15:23:09 PDT 2013
      > STARTUP_MSG: java = 1.7.0_79
      > ************************************************************/
      > Usage: java NameNode [-format [-force ] [-nonInteractive]] | [-upgrade] | [-roll back] | [-finalize] | [-importCheckpoint] | [-recover [ -force ] ]
      > 15/06/28 19:01:21 INFO namenode.NameNode: SHUTDOWN_MSG:
      > /************************************************************
      > SHUTDOWN_MSG: Shutting down NameNode at MAHI-PC/10.10.34.94
      > ************************************************************/

      I ONLY CHANGED THE VALUE OF JAVA_HOME

    13. chandan says:

      Hi,

      I have completed the setup, but while running the command “bin/hadoop namenode- format” I am getting an error. The error is “Error: Could not find or load main class Mishra\hadoop-1.2.1\logs”. I have installed JDK 1.8 and it is working fine. Can you please help me out to resolve it?

    14. Sravanthi says:

      Hello. When I enter the command "Explorer" in Cygwin it is not taking me to the Cygwin directory location; instead it is opening the Documents library folder. How can I fix the mapping issue? Kindly assist.

    15. shwteha says:

      Hi,

      can anyone help to configure eclipse with hadoop

    16. shwteha says:

      I have completed the setup by following the instructions, but while executing bin:/hadoopnamenode-format I’m getting the error
      -bash: bin/hadoopnamenode-format: No such file or directory

      Please help me complete the installation by giving a solution to my problem.

      • Harish Narayanan says:

        try ./hadoop – this means it is a path issue and the PATH is not set
        naru@IS-131605N ~/hadoop-1.2.1/bin
        $ pwd
        /home/naru/hadoop-1.2.1/bin

        naru@IS-131605N ~/hadoop-1.2.1/bin
        $

        naru@IS-131605N ~/hadoop-1.2.1/bin
        $ hadoop – version
        -bash: hadoop: command not found

        naru@IS-131605N ~/hadoop-1.2.1/bin
        $ ./hadoop -version
        java version “1.7.0_79”
        Java(TM) SE Runtime Environment (build 1.7.0_79-b15)
        Java HotSpot(TM) 64-Bit Server VM (build 24.79-b02, mixed mode)

      • Abbas says:

        Setting JAVA_HOME inside hadoop-env.sh — VERY IMPORTANT
        In case you need to fix the environment variable in Hadoop because you are getting
        the error message: -bash: hadoop: command not found

        I have installed java into the following directory on my local C drive.
        C:/Java64/jdk1.7.0_80

        However in the file hadoop-env.sh you should write it this way:
        export JAVA_HOME=C:/Java64/jdk1.7.0_80

        Unix understands forward slashes for directories, or else you will get errors.

        Here is what you need to do:
        1- Open Cygwin terminal (Run as Administration) and execute following command
        2- $ cd hadoop-1.2.1 //I am using hadoop version 1.2.1
        3- $ cd conf
        4- $ explorer .
        5- Open hadoop-env.sh file to set Java home as above environmental variable setup
        6- Uncomment or change the line which contains “export JAVA_HOME” and provide the path with forward slashes, similar to the above
        7- $ dos2unix hadoop-env.sh
        8- close Cygwin and then open Cygwin As Administrator

        ————————–

        The confusion comes from two versions of the directory:
        The one explained here is the hadoop-env.sh version
        the other version is the one inside your System Properties -> Environment Variables
        inside your Windows which is accessed from the control panel -> system -> Advanced system Setting

        This version is written:
        JAVA_HOME=C:\Java64\jdk1.7.0_80

        By the way thanks for GREAT TUTORIAL – I have learned a lot and it works!

    17. Hadoop user says:

      unable to run hadoop fs -ls command

    18. Mads says:

      Hi, so I followed all your steps (thanks for that, BTW. 😀 ) and my NameNode page is turning up fine. However, I can’t seem to access my JobTracker page. The browser’s saying “Connection refused” for it. And I can’t seem to find out what’s wrong.
      This is what I’m getting when I run: bin/start-all.sh:

      starting namenode, logging to /home/Admin/hadoop-1.2.1/libexec/../logs/hadoop-Admin-namenode-Admin-PC.out
      /home/Admin/hadoop-1.2.1/libexec/../bin/hadoop: line 350: C:\Program: command not found
      localhost: Connection closed by 127.0.0.1
      localhost: Connection closed by 127.0.0.1
      starting jobtracker, logging to /home/Admin/hadoop-1.2.1/libexec/../logs/hadoop-Admin-jobtracker-Admin-PC.out
      /home/Admin/hadoop-1.2.1/libexec/../bin/hadoop: line 350: C:\Program: command not found
      localhost: Connection closed by 127.0.0.1

      I don’t know why it’s saying ‘command not found’. Can I please know what to do about it? Thanks.

      • Ms. zin mar myo says:

        Dear, I cannot reach the name node page. However, I get messages like this when starting the Hadoop daemons:
        starting namenode, logging to home/dell/…
        localhost: authentication fail
        localhost: authentication fail
        starting jobtracker, logging to home/dell…
        localhost: authentication fail

        Please help me if possible. How can I solve this?
        best regards
        best regards

    19. i am unable to start Job Tracker
      Error msg – Unable to connect

    20. revati says:

      It was a wonderful experience for me; this tutorial is really fruitful for the installation of Hadoop.
      Thanks

    21. Veeresh P says:

      in Name node url (http://localhost:50070/dfshealth.jsp) when I click on ‘Browse file system’ I am getting below error
      HTTP ERROR 404
      Problem accessing /browseDirectory.jsp. Reason:
      /browseDirectory.jsp

      Please help me .

    22. Yogendra says:

      Finally, for the 1st time in the last 2 weeks,
      I could run Hadoop on Windows 7 PRO,
      thanks to this article of yours.

      Yes, except for the very last part: for the JobTracker Web Interface
      I got the response: This page can’t be displayed.

      thanks again

    23. deepak says:

      The JobTracker web interface can’t be accessed via the URL http://localhost:50030/.
      The error message I am facing is:

      [Fiddler] The socket connection to localhost failed.
      ErrorCode: 10061.
      No connection could be made because the target machine actively refused it 127.0.0.1:50030

    24. Adeoye Adewale says:

      Good day Admin,

      Your tutorial on Hadoop is very informative and makes installation easy.

      I am trying to install the Hadoop 2.7.1 version with the steps you gave, and I had this issue:
      “Adeoye Adewale@AdeoyeAdewale ~/hadoop-2.7.1
      $ bin/hadoop namenode -format
      bin/hadoop: line 22: cd: /home
      Adewale/hadoop-2.7.1/bin: No such file or directory
      bin/hadoop: line 27: /home/Adeoye: No such file or directory
      DEPRECATED: Use of this script to execute hdfs command is deprecated.
      Instead use the hdfs command for it.

      HADOOP_HDFS_HOME not found!”

      Your swift response will be highly appreciated, as my undergraduate project depends on it.

    25. Arpit SINHA says:

      What would be the size of this CYGWIN file…?

      • Admin says:

        Hello Arpit,

        It would be around 100 MB, we guess; it could be a little less or more depending upon the distribution and environment.

        Thanks,
        Admin

    26. Nathan says:

      JobTracker Web Interface is not working for me ..pls help on it ( ERR_CONNECTION_REFUSED)

    27. Nathan says:

      Hi Admin, I am getting this error …Please help on it

      NoRuleZ@NoRuleZ-PC ~/hadoop-1.2.1
      $ bin/start-all.sh

      localhost: /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
      localhost: /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
      localhost: /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
      localhost: /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
      localhost: starting datanode, logging to /home/NoRuleZ/hadoop-1.2.1/libexec/../logs/hadoop-NoRuleZ-datanode-NoRuleZ-PC.out
      localhost: /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
      localhost: /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
      /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
      /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file
      /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 31: unexpected EOF while looking for matching `”‘
      /home/NoRuleZ/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 58: syntax error: unexpected end of file

    28. Hari says:

      Hi,

      http://localhost:50070/dfshealth.jsp. I can see the screen but when I click on Browse File System, an error is thrown saying UNKNOWN HOST

    29. Gitanjali Sharma says:

      lg@lg-PC /cygdrive/c/Users/lg/hadoop-1.2.1
      $ bin/hadoop namenode -format
      /cygdrive/c/Users/lg/hadoop-1.2.1/libexec/../conf/hadoop-env.sh: line 10: export: `C:\ProgramFiles\Java\jdk1.7.0_79′: not a valid identifier
      Error: JAVA_HOME is not set.

      getting this error again and again. please help.

    30. Ahmed Attia says:

      First of all, this tutorial is perfect (A+), but I need a bit of help.
      I’m trying to use Hadoop (2.7.1) for the first time using Cygwin on Win 8. I followed the configuration and I think everything is set up well. However, when I issue sbin/start-all.sh, I get the following error:

      WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
      Also, when I try to use the browser and open http://localhost:50030/ or http://localhost:50070/, I get the following message:

      This site can’t be reached
      localhost refused to connect.
      How can I get my computer to stop refusing the connection? I don’t know much about networking or servers. Thanks in advance

    31. Ankit says:

      Thank you so much. I have done the installation correctly.

    32. Saurav Sircar says:

      I performed the installation as above. But whenever I execute “start-all.sh”, it starts only the ResourceManager. So I tried executing the scripts individually. When I executed “start-dfs.sh”, it gives me the warning: Unable to load native-hadoop library on your platform… using builtin-java classes where applicable.
      And then none of the daemons start.
      Also, I occasionally get an error which says the Winutils binary was not found.
      The only difference is that I had Java installed before trying this tutorial. I’m using Java 8 and Hadoop 2.7.2.

      • mac says:

        Saurav, I think you have to check the path for that particular command. If you want to run a command like start-all.sh, you should first change into the hadoop-2.7.2 directory. Whichever Java and Hadoop you used, be sure about the directories.

    33. Tushar Dey says:

      After completing all the installation steps (viz. Cygwin, Java, Hadoop 1.2.1), when I open http://localhost:50070 or http:/localhost:50030 I am getting the message “The webpage is not available”.

      Before that, when I ran $ bin/hadoop namenode -format, I got the following error messages.
      /home/tkd/hadoop-1.2.1/libexec/../bin/hadoop: line 350: C:/JAVA/bin/java: cannot execute binary file: Exec format error
      /home/tkd/hadoop-1.2.1/libexec/../bin/hadoop: line 434: C:/JAVA/bin/java: cannot execute binary file: Exec format error
      /home/tkd/hadoop-1.2.1/libexec/../bin/hadoop: line 434: exec: C:/JAVA/bin/java: cannot execute: Permission denied

      Can you tell me how to remove this error when running $ bin/hadoop namenode -format?

    34. Anubha says:

      Wonderful tutorial. Thank you so much 🙂

    35. Madhu says:

      Hi,
      I followed the procedure as given and everything went well, except that I am not able to open the localhost page for the job tracker. A message comes up like ‘site can’t be reached’ and ‘localhost refused to connect’. Please help me out with this.
      Thanks in advance

      Madhu

    36. Madhu says:

      Hi all,
      For those who are facing issues opening the localhost page for the job tracker, please use the command “bin/hadoop jobtracker -format” after executing “bin/hadoop namenode-format”, and then try to open it again.

      Good luck

    37. lakshman says:

      Hi all,
      I have followed the procedure as given above; when I use the command bin/hadoop namenode -format, I am getting this error:
      bin/hadoop: line 350: C:\Program: command not found
      16/08/29 15:43:39 INFO namenode.NameNode: STARTUP_MSG:
      /************************************************************
      STARTUP_MSG: Starting NameNode
      STARTUP_MSG: host = ADMINIB-NGBQQUT/9.109.48.38
      STARTUP_MSG: args = [-format]
      STARTUP_MSG: version = 1.2.1
      STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by ‘mattf’ on Mon Jul 22 15:23:09 PDT 2013
      STARTUP_MSG: java = 1.7.0_71
      ************************************************************/
      Re-format filesystem in \home\USER_FOLDER_NAME\hadoop-dir\namedir ? (Y or N)

    38. Nitin says:

      When I run the command ” bin/hadoop namenode -format ”
      the terminal shows an error:

      bin/hadoop namenode -format
      bin/hadoop: line 350: C:javajdk1.6.0_45/bin/java: No such file or directory
      bin/hadoop: line 434: C:javajdk1.6.0_45/bin/java: No such file or directory

      Please help me to solve this problem.

    39. Kumaravel says:

      hi guys, I just configured Hadoop by following the post. I was able to format the name node successfully, but I am not able to get the job tracker URL; it shows "site can not be reached"

    40. Pramod Pant says:

      when I run the command bin/hadoop namenode -format I got the following error
      17/01/17 16:43:10 INFO namenode.NameNode: STARTUP_MSG:
      /************************************************************
      STARTUP_MSG: Starting NameNode
      STARTUP_MSG: host = P_Pant/10.18.1.173
      STARTUP_MSG: args = [-format]
      STARTUP_MSG: version = 1.2.1
      STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by ‘mattf’ on Mon Jul 22 15:23:09 PDT 2013
      STARTUP_MSG: java = 1.8.0_25
      ************************************************************/
      17/01/17 16:43:10 ERROR conf.Configuration: Failed to set setXIncludeAware(true) for parser org.apache.xerces.jaxp.DocumentBuilderFactoryImpl@1e7a66b:java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:584)
      at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1131)
      at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
      at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
      at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
      17/01/17 16:43:10 ERROR conf.Configuration: Failed to set setXIncludeAware(true) for parser org.apache.xerces.jaxp.DocumentBuilderFactoryImpl@e45eb6:java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:584)
      at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1131)
      at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
      at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
      at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
      17/01/17 16:43:10 ERROR conf.Configuration: Failed to set setXIncludeAware(true) for parser org.apache.xerces.jaxp.DocumentBuilderFactoryImpl@a3ffec:java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:584)
      at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1131)
      at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
      at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
      at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
      17/01/17 16:43:10 ERROR conf.Configuration: Failed to set setXIncludeAware(true) for parser org.apache.xerces.jaxp.DocumentBuilderFactoryImpl@11161c7:java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      java.lang.UnsupportedOperationException: setXIncludeAware is not supported on this JAXP implementation or earlier: class org.apache.xerces.jaxp.DocumentBuilderFactoryImpl
      at javax.xml.parsers.DocumentBuilderFactory.setXIncludeAware(DocumentBuilderFactory.java:584)
      at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:1131)
      at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:1107)
      at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:1053)
      at org.apache.hadoop.conf.Configuration.set(Configuration.java:420)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.setStartupOption(NameNode.java:1374)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1463)
      at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1488)
      Re-format filesystem in \home\USER_FOLDER_NAME\hadoop-dir\namedir ? (Y or N)

      Can anyone help me ??

    41. sadhana says:

      When I run the ssh commands I always get a ‘command not found’ error, as follows. Please help me to configure the ssh host and user.
      sadhana@sadhana-PC ~
      $ ssh-host-config
      -bash: ssh-host-config: command not found

      sadhana@sadhana-PC ~
      $ ssh-user-config
      -bash: ssh-user-config: command not found

    42. Sreeni says:

      Hi,
      I am getting the below error when I run the below command. Can someone please advise?

      $ bin/hadoop namenode -format
      bin/hadoop: line 350: C:JAVA/bin/java: No such file or directory
      bin/hadoop: line 434: C:JAVA/bin/java: No such file or directory

    43. Sreeni says:

      This is a great effort. It looks like I am able to open the Name Node & Job Tracker using the URLs provided above.
      Now, please advise: how do I install Hive and practice Hive on it?
      Your swift response is appreciated.

    44. Sreeni says:

      Home Directory~/hadoop-1.2.1
      $ jps
      -bash: jps: command not found

      But, I am able to open
      NameNode: http://localhost:50070/
      JobTracker : http://localhost:50030/

    45. Jay Bibodi says:

      It seems like I have some problem with starting the Job tracker and Task tracker. Could anyone please help?

      Requested by: Bibodi
      2017-04-07 02:36:57,470 INFO org.apache.hadoop.mapred.JobTracker: Cleaning up the system directory
      2017-04-07 02:36:57,563 INFO org.apache.hadoop.mapred.JobHistory: Creating DONE folder at file:/C:/cygwin/home/Bibodi/hadoop-1.2.1/logs/history/done
      2017-04-07 02:36:57,563 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
      2017-04-07 02:36:57,563 ERROR org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:Bibodi cause:java.io.IOException: Failed to set permissions of path: C:\cygwin\home\Bibodi\hadoop-1.2.1\logs\history\done to 0755
      2017-04-07 02:36:57,563 FATAL org.apache.hadoop.mapred.JobTracker: java.io.IOException: Failed to set permissions of path: C:\cygwin\home\Bibodi\hadoop-1.2.1\logs\history\done to 0755
      at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
      at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:672)
      at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
      at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:349)
      at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:193)
      at org.apache.hadoop.mapred.JobHistory.initDone(JobHistory.java:564)
      at org.apache.hadoop.mapred.JobHistory.initDone(JobHistory.java:540)
      at org.apache.hadoop.mapred.JobTracker$4.run(JobTracker.java:1823)
      at org.apache.hadoop.mapred.JobTracker$4.run(JobTracker.java:1821)
      at java.security.AccessController.doPrivileged(Native Method)
      at javax.security.auth.Subject.doAs(Subject.java:422)
      at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
      at org.apache.hadoop.mapred.JobTracker.initialize(JobTracker.java:1820)
      at org.apache.hadoop.mapred.JobTracker.offerService(JobTracker.java:2147)
      at org.apache.hadoop.mapred.JobTracker.main(JobTracker.java:4711)
      2017-04-07 02:36:57,563 INFO org.apache.hadoop.mapred.JobTracker: SHUTDOWN_MSG:
      /************************************************************
      SHUTDOWN_MSG: Shutting down JobTracker at Bibodi-PC/

      STARTUP_MSG: Starting TaskTracker
      STARTUP_MSG: host = Bibodi-PC/
      STARTUP_MSG: args = []
      STARTUP_MSG: version = 1.2.1
      STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/branch-1.2 -r 1503152; compiled by ‘mattf’ on Mon Jul 22 15:23:09 PDT 2013
      STARTUP_MSG: java = 1.8.0_74
      ************************************************************/
      2017-04-07 02:37:02,255 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
      2017-04-07 02:37:02,442 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source MetricsSystem,sub=Stats registered.
      2017-04-07 02:37:02,442 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
      2017-04-07 02:37:02,442 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: TaskTracker metrics system started
      2017-04-07 02:37:03,602 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform… using builtin-java classes where applicable
      2017-04-07 02:37:03,711 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapter: MBean for source ugi registered.
      2017-04-07 02:37:03,727 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Source name ugi already exists!
      2017-04-07 02:37:04,055 INFO org.mortbay.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.mortbay.log) via org.mortbay.log.Slf4jLog
      2017-04-07 02:37:04,211 INFO org.apache.hadoop.http.HttpServer: Added global filtersafety (class=org.apache.hadoop.http.HttpServer$QuotingInputFilter)
      2017-04-07 02:37:04,274 INFO org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs’ truncater with mapRetainSize=-1 and reduceRetainSize=-1
      2017-04-07 02:37:04,289 INFO org.apache.hadoop.mapred.TaskTracker: Starting tasktracker with owner as Bibodi
      2017-04-07 02:37:04,289 INFO org.apache.hadoop.mapred.TaskTracker: Good mapred local directories are: /tmp/hadoop-Bibodi/mapred/local
      2017-04-07 02:37:04,305 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.io.IOException: Failed to set permissions of path: \tmp\hadoop-Bibodi\mapred\local\taskTracker to 0755
      at org.apache.hadoop.fs.FileUtil.checkReturnValue(FileUtil.java:691)
      at org.apache.hadoop.fs.FileUtil.setPermission(FileUtil.java:672)
      at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
      at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:349)
      at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:193)
      at org.apache.hadoop.mapred.TaskTracker.initialize(TaskTracker.java:823)
      at org.apache.hadoop.mapred.TaskTracker.(TaskTracker.java:1573)
      at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3937)
      2017-04-07 02:37:04,336 INFO org.apache.hadoop.mapred.TaskTracker: SHUTDOWN_MSG:
      /************************************************************
      SHUTDOWN_MSG: Shutting down TaskTracker at Bibodi-PC/

      Thanks
      Jay

    46. Nagesh Wadhera says:

      Hey admin, I have set the path for JAVA_HOME, but while executing the “bin/hadoop namenode -format” command, I’m getting an error, i.e.
      bin/hadoop: line 350: C:\Program: command not found
      Error: Could not find or load main class Dhir\hadoop-1.2.1\logs

    47. sridhar says:

      Installation was done successfully. But when I started all components using the bin/start-all.sh command, it asks for a password; when I enter the password it says permission denied. Please help. Thanks.

      Sri.
