I am trying to compile the HBase Spark connector for Spark 3.5, but I am getting the errors below. (The same build works for Spark 3.1.)
Is there another way to connect to HBase from Spark 3.5?
Main repo:
https://github.com/apache/hbase-connectors.git
Command:
mvn -Dspark.version=3.5.0 -Dscala.version=2.12.17 -Dscala.binary.version=2.12 -Dhbase.version=2.4.17 -Dhadoop.profile=3.3 -Dhadoop-three.version=3.3.6 -DskipTests -Dcheckstyle.skip -U clean package
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala:33: object mapred is not a member of package org.apache.hadoop
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala:212: not found: type JobConf
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala:212: not found: type JobConf
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala:256: Symbol 'type org.apache.hadoop.mapred.JobConf' is missing from the classpath.
This symbol is required by 'value org.apache.spark.rdd.PairRDDFunctions.conf'.
Make sure that type JobConf is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'PairRDDFunctions.class' was compiled against an incompatible version of org.apache.hadoop.mapred.
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:39: object mapred is not a member of package org.apache.hadoop
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:40: object mapreduce is not a member of package org.apache.hadoop
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:69: not found: value Job
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:70: Class org.apache.hadoop.mapreduce.Job not found - continuing with a stub.
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:423: not found: type Job
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:423: not found: value Job
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:426: Class org.apache.hadoop.mapreduce.Mapper not found - continuing with a stub.
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:434: not found: type JobConf
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/NewHBaseRDD.scala:21: object mapreduce is not a member of package org.apache.hadoop
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/NewHBaseRDD.scala:29: not found: type InputFormat
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/NewHBaseRDD.scala:34: Symbol 'term org.apache.hadoop.mapreduce' is missing from the classpath.
This symbol is required by 'type org.apache.spark.rdd.NewHadoopRDD._$1'.
Make sure that term mapreduce is in your classpath and check for conflicting dependencies with `-Ylog-classpath`.
A full rebuild may help if 'NewHadoopRDD.class' was compiled against an incompatible version of org.apache.hadoop.
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/NewHBaseRDD.scala:34: type mismatch;
found : Class[_$1(in value <local spark>)] where type _$1(in value <local spark>)
required: Class[_ <: org.apache.hadoop.mapreduce.InputFormat[?,?]]
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/HBaseContext.scala:441: Class org.apache.hadoop.mapreduce.InputFormat not found - continuing with a stub.
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/Logging.scala:23: object impl is not a member of package org.slf4j
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/Logging.scala:109: not found: value StaticLoggerBinder
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/NewHBaseRDD.scala:34: no arguments allowed for nullary constructor Object: ()Object
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/NewHBaseRDD.scala:38: value compute is not a member of AnyRef
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/example/hbasecontext/HBaseBulkPutExampleFromFile.scala:27: object mapred is not a member of package org.apache.hadoop
spark/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/example/hbasecontext/HBaseBulkPutExampleFromFile.scala:57: not found: type TextInputFormat
[ERROR] 23 errors found
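For context, the connector-free route I know of is reading the table directly through HBase's `TableInputFormat` with `newAPIHadoopRDD`, which only needs the `hbase-client` and `hbase-mapreduce` jars on the Spark classpath. A minimal sketch of what I mean (the table name `my_table` is just a placeholder) — is this still the recommended fallback, or is there something better for Spark 3.5?

```scala
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.sql.SparkSession

object HBaseReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hbase-read").getOrCreate()

    // Standard HBase client configuration (picks up hbase-site.xml from the classpath);
    // "my_table" is a placeholder table name.
    val conf = HBaseConfiguration.create()
    conf.set(TableInputFormat.INPUT_TABLE, "my_table")

    // Read the table as an RDD of (row key, Result) pairs via the mapreduce InputFormat.
    val rdd = spark.sparkContext.newAPIHadoopRDD(
      conf,
      classOf[TableInputFormat],
      classOf[ImmutableBytesWritable],
      classOf[Result])

    println(rdd.count())
    spark.stop()
  }
}
```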