Problem#1: MapReduce app not submitted to the YARN resource manager.
Sol: I was trying to launch a new Job from within another Java app. What worked: submitting from the command line:
hadoop jar <jar-name>.jar [MainClass] arguments
Sol#2> Make sure loggers are configured properly; if not, most of the helpful logs are not displayed anywhere.
Sol#3> Check whether the jar name is set on the job.
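A minimal driver sketch for Sol#3 (class, jar, and job names here are hypothetical, not taken from the notes), assuming Hadoop 2.x client jars on the classpath:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical driver; submit with: hadoop jar my-job.jar MyDriver <in> <out>
public class MyDriver {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "my-job");
        // Sol#3: setJarByClass tells the framework which jar to ship to the
        // cluster; if no jar is set, the job's classes are not found at runtime.
        job.setJarByClass(MyDriver.class);
        // mapper/reducer classes and output key/value types would be set here
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```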
Problem#2: Put validation error when the value in a put command is huge (300 MB in my case).
Sol: Setting the property hbase.client.keyvalue.maxsize greater than 300 MB worked (the default is 10 MB).
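The property goes in hbase-site.xml and is expressed in bytes; 335544320 (320 MB) is one choice comfortably above the 300 MB value used here:

```xml
<!-- hbase-site.xml: allow individual KeyValues up to 320 MB (default is 10 MB) -->
<property>
  <name>hbase.client.keyvalue.maxsize</name>
  <value>335544320</value>
</property>
```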
Problem#3: MapReduce unable to map a large input source; terminated with error "OutOfOrderScannerNextException: was there a rpc timeout?"
Sol: set scan caching to 1 (scan.setCaching(1));
increase hbase.rpc.timeout to 900000;
increase hbase.client.scanner.timeout.period to 900000.
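The two timeout increases correspond to these hbase-site.xml entries (values are in milliseconds, i.e. 15 minutes; the caching change is made on the Scan object in the job code, not here):

```xml
<!-- hbase-site.xml: raise RPC and scanner lease timeouts for slow scans -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>900000</value>
</property>
<property>
  <name>hbase.client.scanner.timeout.period</name>
  <value>900000</value>
</property>
```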
Problem#3: Error on execution of the MR job: Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:467)
Sol: Removed the guava module from every dependency that pulls it in, then pinned an older version (LimitInputStream was removed from Guava in release 15, so a newer transitive guava breaks Hadoop's JobSubmitter).
gradle dependencies                    # executed; lists the dependency tree
exclude module: 'guava'                # added in build.gradle; removes guava
exclude group: 'com.google.guava'      # added in build.gradle; alone this did not remove guava, hence the module exclude above
compile 'com.google.guava:guava:13.0'  # added in build.gradle; pinning this version worked
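Put together, the build.gradle fragment might look like this (the Hadoop artifact and version are illustrative; the exclude is applied on each dependency that drags in a newer guava):

```groovy
dependencies {
    // exclude the transitively pulled-in guava (Guava 15+ dropped LimitInputStream)
    compile('org.apache.hadoop:hadoop-mapreduce-client-core:2.4.0') {
        exclude module: 'guava'
    }
    // pin a guava release that still contains com.google.common.io.LimitInputStream
    compile 'com.google.guava:guava:13.0'
}
```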
Problem#4: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
Sol:
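(The solution for this one was not written down. A commonly reported cause, not a confirmed fix from these notes, is that the submitting app's classpath lacks the matching mapred-site.xml or the MapReduce/YARN client jars, so the framework named by mapreduce.framework.name cannot be initialized. For YARN the property is:)

```xml
<!-- mapred-site.xml: run MapReduce jobs on YARN -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
```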
Problem#5: Exception in thread "main" java.lang.IllegalArgumentException: No enum constant org.apache.hadoop.mapreduce.JobCounter.MB_MILLIS_MAPS while importing table data.
Sol: Despite the error, the data was imported. (This counter was added in later Hadoop 2.x releases, so the exception usually indicates a version mismatch between the client jars and the cluster.)
Problem#6: While executing MapReduce using java -jar, got error: org.apache.hadoop.hbase.util.DynamicClassLoader - Failed to identify the fs of dir hdfs://
Sol: Executed using hadoop jar instead; worked. (hadoop jar puts the cluster configuration and the HDFS filesystem implementation on the classpath, which plain java -jar does not.)
Problem#7: Mapper not progressing