I want to write Iceberg-format tables using a directory-based (Hadoop) catalog.
I followed the configuration in the documentation:
https://iceberg.apache.org/docs/nightly/spark-configuration/#catalogs
My SparkSession configuration is below. I get this error only with Spark versions greater than 3.2.x; on 3.2.x everything works fine.
val spark = SparkSession.builder()
  .appName("Spark Iceberg Example")
  // Enable Iceberg SQL extensions
  .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
  // Wrap the built-in session catalog
  .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
  .config("spark.sql.catalog.spark_catalog.type", "hadoop")
  // Separate directory-based (Hadoop) catalog
  .config("spark.sql.catalog.hadoop_prod", "org.apache.iceberg.spark.SparkCatalog")
  .config("spark.sql.catalog.hadoop_prod.type", "hadoop")
  .config("spark.sql.catalog.hadoop_prod.warehouse", "hdfs://xxxx")
  .getOrCreate()
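For reference, a minimal write that exercises the hadoop_prod catalog looks roughly like the sketch below (the namespace and table names, db.sample, are placeholders I'm using for illustration, not part of my actual setup):

```scala
// Create and write an Iceberg table through the hadoop_prod catalog.
// db.sample is a hypothetical namespace/table; the data files land under
// the path set in spark.sql.catalog.hadoop_prod.warehouse above.
spark.sql("CREATE TABLE IF NOT EXISTS hadoop_prod.db.sample (id BIGINT, data STRING) USING iceberg")
spark.sql("INSERT INTO hadoop_prod.db.sample VALUES (1, 'a'), (2, 'b')")
```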