General syntax:
hadoop command [generic options] [command options]
1. ls - to list the files in hdfs
Ex: hadoop fs -ls /user/
hadoop fs -ls -R /user/ # -R indicates recursive listing
hadoop fs -ls hdfs://localhost/user/root
2. mkdir - to create a new directory
Ex: hadoop fs -mkdir /user/root
hadoop fs -mkdir -p hdfs://localhost/user/root/hadoop
3. put - to copy files from the local file system to hdfs
Ex: hadoop fs -put hello.txt /subhash
hadoop fs -put localfile1 localfile2 /usr/local/
4. get - to copy files from hdfs to the local file system
Ex: hadoop fs -get /user/root/localfile3 /home/documents
5. copyFromLocal - same as the put command
Ex: hadoop fs -copyFromLocal sourcepath destinationpath
6. copyToLocal - same as the get command
Ex: hadoop fs -copyToLocal sourcepath destinationpath
7. cat - to view the contents of a file
Ex: hadoop fs -cat /subhash/sample.txt
8. cp - to copy files from one location to another within hdfs
Ex: hadoop fs -cp sourcepath destinationpath
9. df - displays free space and usage statistics of the file system
Ex: hadoop fs -df /user
10. du - displays the sizes of files and directories on hdfs
Ex: hadoop fs -du /user/root/
11. moveFromLocal - moves a local file to hdfs (the local copy is deleted after the copy succeeds)
Ex: hadoop fs -moveFromLocal sourcepath destinationpath
12. moveToLocal - moves a file from hdfs to the local file system
Ex: hadoop fs -moveToLocal sourcepath destinationpath
13. mv - to move files within hdfs
Ex: hadoop fs -mv sourcepath destinationpath
14. rm - to remove files from hdfs
Ex: hadoop fs -rm /subhash/sample.txt
15. text - outputs a source file in text format
Ex: hadoop fs -text sourcepath
16. appendToFile - appends data from the local file system to a file in hdfs
Ex: hadoop fs -appendToFile localsourcepath destinationpath
17. checksum - to show checksum of a file
Ex: hadoop fs -checksum /subhash/hello.txt
18. setrep - to set the replication factor of a file
Ex: hadoop fs -setrep -w 3 /user/root/dir1
-w indicates waiting for the replication to complete (the -w option can take a long time to finish)
Ex: hadoop fs -setrep 3 /user/root/dir1/sample.txt
19. stat - prints statistics about a file or directory in the specified format
Ex: hadoop fs -stat "%F" filepath # %F prints the type (file or directory)
20. count - counts the number of directories, files, and bytes under the given path
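For example (an illustrative path; count reports the directory, file, and byte totals it finds):
Ex: hadoop fs -count /user/root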
21. getmerge - concatenates all files in an hdfs directory into a single file on the local file system
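For example (illustrative paths; merged.txt is just a name chosen here for the combined local output file):
Ex: hadoop fs -getmerge /subhash/dir1 ./merged.txt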
22. help - displays help about commands
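For example (ls is used here only as a sample command name):
Ex: hadoop fs -help ls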
23. fsck - checks the health of all files under the specified path
Ex: hadoop fsck /
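A further illustration (the -files and -blocks options print per-file and per-block details; adjust the path as needed):
Ex: hadoop fsck / -files -blocks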
24. report - complete report of the cluster
Ex: hadoop dfsadmin -report
25. chmod, chown, chgrp - similar to the Unix commands
Ex: hadoop fs -chmod 777 filename
hadoop fs -chown user:group filename
26. safemode - to enter, leave, or check hdfs safe mode
Ex: hadoop dfsadmin -safemode leave #come out of the safe mode
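A further example (the get argument reports whether safe mode is currently on):
Ex: hadoop dfsadmin -safemode get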
27. list - to list the currently running jobs
Ex: hadoop job -list
28. status - to check the status of a job using its job id
Ex: hadoop job -status jobid
Exercise on Hadoop Commands:
To see help for any command on hdfs, you may type hadoop fs -help command_name
Answers:
1) hadoop fs -mkdir /subhash
2) hadoop fs -mkdir /subhash/dir1 /subhash/dir2
3) hadoop fs -ls -R /
4) vi sample1.txt (or) nano sample1.txt
5) hadoop fs -put ./sample1.txt /subhash/dir1
6) hadoop fs -ls -R /subhash
7) hadoop fs -put ./sample2.txt /subhash/dir2
8) hadoop fs -get /subhash/dir1/sample1.txt
9) hadoop fs -cat /subhash/dir1/sample1.txt
10) hadoop fs -cp /subhash/dir1/sample1.txt /subhash/dir2
11) hadoop fs -mv /subhash/dir1/sample1.txt /subhash/dir2
12) hadoop fs -copyFromLocal ./sample1.txt /subhash/dir1
hadoop fs -copyToLocal /subhash/dir1/sample1.txt
13) hadoop fs -tail /subhash/dir1/sample1.txt
14) hadoop fs -du /subhash/dir1/sample1.txt
15) hadoop fs -appendToFile /subhash/dir1/sample1.txt /subhash/dir2/sample2.txt
16) hadoop fs -getmerge /subhash/dir1 /subhash/dir2/sample3.txt
17) hadoop fs -cp /subhash/dir1 /subhash/dir2
18) hadoop fs -setrep 4 /subhash/dir1/sample1.txt
19) hadoop fs -rm /subhash/dir1/sample1.txt
20) hadoop fs -rm -R /subhash/dir1