This course is targeted at people who have a basic understanding of Hadoop and want to learn:
- How to install Hadoop 1.0 and Hadoop 2.0 on Windows
- How to start and stop the various Hadoop services (NameNode, JobTracker, YARN Resource Manager, Node Manager, etc.)
- The different web interfaces of the NameNode, JobTracker, Resource Manager, etc.
- How to execute MapReduce programs
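As a preview of what the course covers, starting the daemons and running a bundled example job typically looks like the following. This is a sketch assuming a Unix-like shell (e.g. Cygwin on Windows) and default installation paths; exact jar names vary by release:

```shell
# Hadoop 1.0: start the HDFS and MapReduce daemons (NameNode, DataNode, JobTracker, TaskTracker)
start-dfs.sh
start-mapred.sh

# Hadoop 2.0: start the HDFS and YARN daemons (NameNode, DataNode, Resource Manager, Node Manager)
start-dfs.sh
start-yarn.sh

# Run the bundled WordCount example (jar location differs between versions)
hadoop jar hadoop-examples-*.jar wordcount /input /output                                    # Hadoop 1.0
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount /input /output   # Hadoop 2.0

# Stop the daemons when done
stop-mapred.sh && stop-dfs.sh   # Hadoop 1.0
stop-yarn.sh && stop-dfs.sh     # Hadoop 2.0
```

The course walks through each of these steps in detail.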
You should have a basic knowledge of Hadoop. If you do not, refer to the Hadoop Basics Course first.
About The Course:
This course is the next step after learning Hadoop Basics.
After getting an idea of Hadoop, its architecture, and its components, let's deepen our knowledge of the various aspects of Hadoop by actually installing Apache Hadoop on Windows and running MapReduce programs.
In this course we will cover the installation of both Hadoop 1.0 and Hadoop 2.0.
With Hadoop 1.0 we will also look at the web browser interfaces of the NameNode, JobTracker, and TaskTracker, whereas with Hadoop 2.0 we will see the newly introduced YARN framework and its core components, the Resource Manager and the Node Manager.
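Each of these daemons exposes a web UI on a well-known default port. Assuming a single-node setup on localhost with stock configuration (the ports can be changed in the configuration files), you can open them in a browser or probe them with curl:

```shell
# Hadoop 1.0 default web UI ports
curl -s http://localhost:50070/   # NameNode
curl -s http://localhost:50030/   # JobTracker
curl -s http://localhost:50060/   # TaskTracker

# Hadoop 2.0 / YARN default web UI ports
curl -s http://localhost:8088/    # Resource Manager
curl -s http://localhost:8042/    # Node Manager
```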
No Course Fee - Totally Free:
This course is free of charge.
Just one note: if you really like the course, please share it with your friends and provide your valuable feedback/comments through Contact Us.
You can also share information, ask questions, provide feedback, and make suggestions through The Forum.
The course duration is approximately 12 hours. This includes downloading the required software, installing the prerequisites, installing Hadoop, and running MapReduce programs.
Important Note:
In the very first course, Hadoop Basics, we installed the sandbox version of Hadoop provided by Hortonworks to run a few sample examples. With the sandbox, without much effort or detailed knowledge, we were able to load files into HDFS, query the data using the Pig and Hive languages, and analyze the query results in a Business Intelligence tool such as Excel.
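For reference, that sandbox workflow boils down to a few commands. This is a sketch with hypothetical file and table names, assuming the Hadoop, Pig, and Hive clients are on the PATH:

```shell
# Load a local file into HDFS (file name and HDFS path are hypothetical)
hadoop fs -put sales.csv /user/hue/sales.csv

# Query the file with Pig ...
pig -e "A = LOAD '/user/hue/sales.csv' USING PigStorage(','); DUMP A;"

# ... or with Hive (assumes a table has already been defined over the file)
hive -e "SELECT * FROM sales LIMIT 10;"
```

The query results can then be exported (e.g. as CSV) and opened in Excel for analysis.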
In this course we will no longer work with the sandbox version; instead, we will install our own Hadoop cluster.