Hadoop Update
Optionally restart Hadoop Cluster
Optionally repeat above commands
Upgrading Namenode, Secondary Namenode, Datanodes
Before you start, make sure all machines are up to date. Run yum update on all machines.
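To run the update on every machine in one pass, something like the following sketch works; the node names passed to it are placeholders for your site's hosts, and `run_on_nodes` is a hypothetical helper, not part of Hadoop or Bigtop.

```shell
# Run a command on every cluster node over ssh.
# Node names are placeholders -- substitute your site's hosts.
run_on_nodes() {
  cmd="$1"; shift
  for host in "$@"; do
    ssh "$host" "$cmd"
  done
}

# usage: run_on_nodes 'yum -y update' namenode secondarynn datanode1 datanode2
```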
https://cwiki.apache.org/confluence/display/HADOOP2/Hadoop+Upgrade
These instructions are for upgrading to Bigtop 1.5 (hdfs-2.10.0), but the procedure is the same for all versions of Hadoop. You could probably skip straight to Bigtop 3.2.1; in that case, point the wget bigtop.repo command at the 3.2.1 repository and use this xrootd-hdfs build instead:
http://tquark.colorado.edu/computing/xrootd-hdfs/xrootd-hdfs-2.2.2-1.el7.x86_64.rpm
Run fsck and fix HDFS to the point there are no errors:
hdfs fsck / -files -blocks -locations > /tmp/hadoop-fsck-2.6.0.log 2>&1
The resulting file will contain a complete block map of the file system. The last line in the log file should be:
The filesystem under path '/' is HEALTHY
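This check can be scripted; a minimal sketch, assuming the log path used above (`fsck_healthy` is a hypothetical helper name):

```shell
# Check that a saved fsck report declares the filesystem healthy.
fsck_healthy() {
  grep -q "The filesystem under path '/' is HEALTHY" "$1"
}

if fsck_healthy /tmp/hadoop-fsck-2.6.0.log; then
  echo "HDFS is healthy - safe to proceed"
else
  echo "fix HDFS errors before upgrading"
fi
```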
The resulting file will contain the complete namespace of the file system.
hdfs dfs -ls -R / > /tmp/hadoop-ls-R-2.6.0.log 2>&1
Create a list of data nodes participating in the cluster.
hdfs dfsadmin -report > /tmp/hadoop-datanodes-2.6.0.lst 2>&1
Optionally, stop and restart Hadoop cluster, in order to create an up-to-date namespace checkpoint of the old version.
Optionally repeat the above commands, and compare the results with the previous run to ensure the state of the file system remained unchanged.
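The comparison is just a diff of the saved reports; a sketch, where the ".post" filename for the second snapshot is only an example:

```shell
# Compare two saved reports (fsck, ls -R, or dfsadmin output)
# taken before and after the restart.
reports_match() {
  diff -q "$1" "$2" > /dev/null 2>&1
}

if reports_match /tmp/hadoop-fsck-2.6.0.log /tmp/hadoop-fsck-2.6.0.log.post; then
  echo "file system state unchanged"
else
  echo "reports differ - investigate before upgrading"
fi
```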
Copy the following checkpoint files into a backup directory:
dfs.name.dir/edits
dfs.name.dir/image/fsimage
Stop Hadoop on the NameNode, Secondary NameNode, and all datanodes, and unmount /mnt/hadoop:
/etc/init.d/hadoop-hdfs-namenode stop; umount /mnt/hadoop
/etc/init.d/hadoop-hdfs-secondarynamenode stop; umount /mnt/hadoop
/etc/init.d/hadoop-hdfs-datanode stop; umount /mnt/hadoop
On the namenode, make a copy of the file system image (same procedure as when moving to a new server; ${hadoop.tmp.dir} is unique to your site):
tar -czpvf /nfs/SomeDisk/checkpoint.tar.gz /mnt/hadoopnn/scratch/ ${hadoop.tmp.dir}
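Before proceeding, it is worth confirming the archive is actually readable; a sketch, assuming the example archive path above (`archive_ok` is a hypothetical helper name):

```shell
# Sanity-check the checkpoint archive: a readable gzip tar lists without error.
archive_ok() {
  tar -tzf "$1" > /dev/null 2>&1
}

if archive_ok /nfs/SomeDisk/checkpoint.tar.gz; then
  echo "checkpoint archive is readable"
else
  echo "checkpoint archive is missing or corrupt"
fi
```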
Update Hadoop on NN
cd /etc/yum.repos.d
wget https://archive.apache.org/dist/bigtop/bigtop-1.5.0/repos/centos-7/bigtop.repo
yum clean all
yum update
/etc/init.d/hadoop-hdfs-namenode upgrade
Update datanodes
cd /etc/yum.repos.d
wget https://archive.apache.org/dist/bigtop/bigtop-1.5.0/repos/centos-7/bigtop.repo
yum clean all
yum update
/etc/init.d/hadoop-hdfs-datanode start
mount -a
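After the datanodes come back, rerunning hdfs dfsadmin -report and checking the live-node count against the pre-upgrade list confirms they all rejoined. A small parsing helper, assuming the report was saved to a file as above (Hadoop 2.x prints a "Live datanodes (N):" header):

```shell
# Parse the live-datanode count out of a saved `hdfs dfsadmin -report`.
live_datanodes() {
  sed -n 's/^Live datanodes (\([0-9]*\)).*/\1/p' "$1"
}

# usage: live_datanodes /tmp/hadoop-datanodes-2.6.0.lst
```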
Update secondary namenode
cd /etc/yum.repos.d
wget https://archive.apache.org/dist/bigtop/bigtop-1.5.0/repos/centos-7/bigtop.repo
yum clean all
yum update
/etc/init.d/hadoop-hdfs-secondarynamenode start
mount -a
At this point, you have an updated Hadoop. When you are happy with the update, on the namenode run:
hdfs dfsadmin -finalizeUpgrade
Update xrootd-hdfs
systemctl stop xrootd@[standalone or clustered]
yum update
yum remove xrootd-hdfs
cd /etc/yum.repos.d/
wget https://archive.apache.org/dist/bigtop/bigtop-1.5.0/repos/centos-7/bigtop.repo
yum update
yum install \
http://tquark.colorado.edu/computing/xrootd-hdfs/xrootd-hdfs-2.2.0-1.el7.x86_64.rpm
systemctl start xrootd@[standalone or clustered]
Check the installed Java:
rpm -qa | grep java
The output you want looks like this:
lnxfarm323> rpm -qa | grep java
tzdata-java-2024b-4.el8.noarch
java-1.8.0-openjdk-headless-1.8.0.432.b06-2.el8.x86_64
javapackages-filesystem-5.3.0-2.module_el8.6.0+3333+6f2999f0.noarch
One of your packages (I think it was java-1.8.0-openjdk-headless) had the version number 23 in it, and that is not supported by Hadoop. Perhaps you did:
yum install java-latest
I think you have to remove your current Java and install the correct version. Be careful if you run:
yum remove java...
yum may also remove packages that depend on Java, including the Hadoop packages themselves, so review the transaction summary before confirming.
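To confirm an installed Java rpm is a version Hadoop 2.x supports (Java 8), a small parsing sketch; `java_major` is a hypothetical helper name, not a real tool:

```shell
# Extract the Java major version from an rpm package name.
# Handles both old-style (java-1.8.0-...) and new-style (java-23-...) names.
java_major() {
  case "$1" in
    java-1.*) echo "$1" | sed -n 's/^java-1\.\([0-9]*\)\..*/\1/p' ;;
    java-*)   echo "$1" | sed -n 's/^java-\([0-9]*\)[.-].*/\1/p' ;;
  esac
}

# java-1.8.0-openjdk-headless-... -> 8  (supported by Hadoop 2.x)
# java-23-openjdk-...             -> 23 (not supported)
```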