BIG DATA - Hadoop

What are the concepts used in the Hadoop Framework?  

The Hadoop Framework functions on two core concepts:

HDFS: Short for Hadoop Distributed File System, it is a Java-based file system for scalable and reliable storage of large datasets.

HDFS itself follows a master-slave architecture and stores all of its data in the form of blocks.
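As a minimal sketch of how an application talks to HDFS, the Java example below writes a small file through the Hadoop FileSystem API and checks that it exists. The NameNode address (hdfs://localhost:9000) and the file path are assumed placeholders and would differ on a real cluster.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsWriteExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address; adjust for your cluster.
        conf.set("fs.defaultFS", "hdfs://localhost:9000");
        FileSystem fs = FileSystem.get(conf);

        // Hypothetical file path used only for illustration.
        Path path = new Path("/user/demo/hello.txt");
        try (FSDataOutputStream out = fs.create(path)) {
            out.writeUTF("Hello, HDFS!");
        }
        System.out.println("File exists: " + fs.exists(path));
        fs.close();
    }
}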

MapReduce: This is the programming model and the associated implementation for processing and generating large data sets. Hadoop jobs are basically divided into two types of tasks. The map task breaks the data set down into key-value pairs or tuples, and the reduce task then takes the output of the map task and combines those data tuples into a smaller set of tuples.
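A minimal sketch of this model is the classic word-count job shown below: the map task emits a (word, 1) pair for every word, and the reduce task sums the pairs for each word. The input and output paths taken from the command line are assumptions about how the job would be launched.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map task: break each input line into (word, 1) key-value pairs.
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce task: combine all (word, 1) tuples for a word into one (word, count) tuple.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // assumed input path argument
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // assumed output path argument
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}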

What are the Name Node and Data Node in Hadoop?

Both are related to storage in Hadoop. The Name Node acts as the master node and is responsible for maintaining the metadata of the file system, such as which blocks make up each file and where those blocks are located. Data Nodes are the slave nodes and are responsible for actually storing and managing the data blocks themselves.
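To illustrate the split between metadata and data, the sketch below asks for the block locations of a file through the Java FileSystem API; the Name Node answers the metadata query, while the hosts it reports are the Data Nodes holding each block. The NameNode address and file path are assumed placeholders.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class BlockLocationExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000"); // assumed NameNode address
        FileSystem fs = FileSystem.get(conf);

        Path path = new Path("/user/demo/hello.txt");      // hypothetical file path
        FileStatus status = fs.getFileStatus(path);

        // Metadata query answered by the Name Node: which blocks make up the file,
        // and which Data Nodes hold each block replica.
        BlockLocation[] blocks = fs.getFileBlockLocations(status, 0, status.getLen());
        for (BlockLocation block : blocks) {
            System.out.println("Block offset " + block.getOffset()
                    + ", length " + block.getLength()
                    + ", hosts: " + String.join(", ", block.getHosts()));
        }
        fs.close();
    }
}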