Install Your Own Hadoop on Windows and Run MapReduce Programs
Course Overview
1. Introduction
2. Installation of Hadoop 1.0
3. MapReduce Programs in Hadoop 1.0
4. Installation of Hadoop 2.0
5. MapReduce Programs in Hadoop 2.0
6. What is Next?
3.2. Web Interfaces for the NameNode and the JobTracker

After you have started the Hadoop daemons with the command bin/start-all.sh, you can open the NameNode and JobTracker web interfaces in a browser.
By default they are available at the addresses below. These web interfaces provide status information for each component and are the first entry point for obtaining a view of the state of a Hadoop cluster.

NameNode Web Interface:

The NameNode web interface can be accessed via the URL http://<host>:50070/
In this case, http://localhost:50070/
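Both interfaces follow the same http://&lt;host&gt;:&lt;port&gt;/ pattern, differing only in the port. A minimal sketch of how the default URLs are formed (ports 50070 and 50030 are the Hadoop 1.x defaults; the helper function name is illustrative, not part of Hadoop):

```python
# Minimal sketch: build the Hadoop 1.x web UI URLs for a given host.
# 50070 (NameNode) and 50030 (JobTracker) are the Hadoop 1.x default ports.

NAMENODE_UI_PORT = 50070
JOBTRACKER_UI_PORT = 50030

def web_ui_url(host, port):
    """Return the HTTP URL of a Hadoop daemon's web interface."""
    return "http://%s:%d/" % (host, port)

print(web_ui_url("localhost", NAMENODE_UI_PORT))    # http://localhost:50070/
print(web_ui_url("localhost", JOBTRACKER_UI_PORT))  # http://localhost:50030/
```

For a remote cluster, replace "localhost" with the hostname of the machine running the daemon.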

(Screenshot: NameNode web interface)

    1. The first section of the page shows the server running the NameNode (here 127.0.0.1, port 9100), when it was started, and version information.
    2. The next section, Cluster Summary, gives a high-level view of the state of the cluster:

      Files and directories, blocks: the number of filesystem metadata items; each item consumes memory in the NameNode's heap.

      Configured Capacity: the total storage capacity of HDFS.

      DFS Used: the space currently used by HDFS.

      Non DFS Used: the space on the DataNode disks occupied by non-HDFS files, such as other applications running on the same system.
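These figures are related by simple arithmetic: Non DFS Used is effectively derived from the other values rather than measured directly. A sketch of the relationship (the function name and the sample numbers are illustrative only):

```python
# Sketch of how the Cluster Summary figures relate (all values in bytes):
#   Non DFS Used = Configured Capacity - DFS Used - DFS Remaining

def non_dfs_used(configured_capacity, dfs_used, dfs_remaining):
    """Space on the DataNode disks taken by non-HDFS files."""
    return configured_capacity - dfs_used - dfs_remaining

# Hypothetical numbers for illustration: on a 100 GB volume, 20 GB of HDFS
# blocks and 50 GB still free for HDFS leave 30 GB occupied by other files.
GB = 1024 ** 3
print(non_dfs_used(100 * GB, 20 * GB, 50 * GB) // GB)  # 30
```

If Non DFS Used grows unexpectedly, something other than HDFS (logs, temp files, other applications) is consuming the DataNode disks.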

JobTracker Web Interface:

The JobTracker web interface can be accessed via the URL http://<namenode_host>:50030/
In this case, http://localhost:50030/
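Before opening the browser, you can check whether a daemon's web interface is actually listening. The sketch below does a plain TCP connect, so it needs no Hadoop libraries; the helper name is illustrative:

```python
# Hedged sketch: test whether a daemon's web interface is accepting
# connections by attempting a plain TCP connect (no HTTP request is sent).
import socket

def web_ui_is_up(host, port, timeout=0.5):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("NameNode UI up:  ", web_ui_is_up("localhost", 50070))
    print("JobTracker UI up:", web_ui_is_up("localhost", 50030))
```

A False result usually means the daemon is not running (or a firewall is blocking the port), which is worth ruling out before debugging the browser side.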

(Screenshot: JobTracker web interface)



 © 2017 : saphanatutorial.com, All rights reserved.  Privacy Policy