Hadoop Basics

Wikipedia definition of Hadoop: Apache Hadoop is an open-source, Java-based software project that supports the processing of large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation.

As per Forrester analyst Mike Gualtieri, it is a platform that makes big data easier to manage.
Hadoop was created by Doug Cutting and Mike Cafarella in 2005. It was initially developed to support distribution for the Nutch search engine project. Doug's son was two years old at the time and had a yellow toy elephant named Hadoop, so Doug named the project Hadoop.

Hadoop has four main components: Hadoop Common, HDFS, YARN, and MapReduce.
Hadoop Common contains the libraries and utilities needed by the other Hadoop modules.
Hadoop Distributed File System (HDFS) is a distributed file system that stores data on commodity machines, providing very high aggregate bandwidth across the cluster. It enables storing files that are far larger than any single node or server could hold.
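To make the idea concrete, here is an illustrative sketch (not the HDFS API) of how a large file is divided into fixed-size blocks that can then be spread across many machines; the 128 MB block size is a common HDFS default, but it is configurable.

```python
# Illustrative sketch (NOT the real HDFS API): splitting a large file
# into fixed-size blocks that can be distributed across a cluster.

BLOCK_SIZE = 128 * 1024 * 1024  # common HDFS default block size, in bytes

def split_into_blocks(file_size, block_size=BLOCK_SIZE):
    """Return (offset, length) pairs describing each block of a file."""
    blocks = []
    offset = 0
    while offset < file_size:
        length = min(block_size, file_size - offset)
        blocks.append((offset, length))
        offset += length
    return blocks

# A 300 MB file splits into two full 128 MB blocks plus one smaller block:
blocks = split_into_blocks(300 * 1024 * 1024)
print(len(blocks))   # 3
print(blocks[-1])    # the last block is smaller than 128 MB
```

Because each block is placed (and replicated) independently, a file larger than any one disk can still be stored and read in parallel.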
Hadoop YARN is a resource-management platform responsible for managing compute resources in the cluster and scheduling users' applications on them.
Hadoop MapReduce is a programming model for large-scale data processing. It provides the framework Hadoop uses to process data in parallel across the cluster.
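The model can be sketched in plain Python with the canonical word-count example (no Hadoop required). In a real cluster, Hadoop runs the map tasks on many nodes and shuffles the intermediate (key, value) pairs across the network before reducing; here all three phases are simulated in one process.

```python
# A minimal pure-Python sketch of the MapReduce model: word count.
# Real Hadoop distributes these phases across a cluster; this simulation
# only illustrates the data flow (map -> shuffle -> reduce).

from collections import defaultdict

def map_phase(line):
    # Map step: emit a (word, 1) pair for every word in a line of input.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle/sort step: group all values by key, as Hadoop does
    # between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(key, values):
    # Reduce step: sum the counts emitted for one word.
    return key, sum(values)

lines = ["big data big cluster", "big data"]
intermediate = [pair for line in lines for pair in map_phase(line)]
counts = dict(reduce_phase(k, v) for k, v in shuffle(intermediate).items())
print(counts)  # {'big': 3, 'data': 2, 'cluster': 1}
```

The key design point is that map and reduce are pure functions of their inputs, which is what lets Hadoop run thousands of them in parallel and rerun any task that fails.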

[Figure: Hadoop architecture]

September 9, 2015

2014 © Netscitus Corporation. All Rights Reserved.