I have a Docker project for running Spark locally, set up as follows:
- Ubuntu 20.04 (WSL)
- openjdk:17.0.2
- Scala 2.12
- Spark 3.4.0
- Delta Lake 2.4.0
- JupyterLab
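
For context, the session is created in JupyterLab roughly like this (a minimal sketch; the app name and master URL are placeholders, not my exact values):

```python
import pyspark
from delta import configure_spark_with_delta_pip

# Placeholder master URL for the standalone cluster defined in docker-compose;
# the error trace below shows a separate executor (172.20.0.4), so this is not local[*].
builder = (
    pyspark.sql.SparkSession.builder
    .appName("delta-lab")  # placeholder app name
    .master("spark://spark-master:7077")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)

# configure_spark_with_delta_pip adds the Delta jars matching the installed
# delta-spark pip package to spark.jars.packages.
spark = configure_spark_with_delta_pip(builder).getOrCreate()
```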
Everything works fine, but when I upgraded Spark to 3.5.0 and Delta Lake to 3.1.0 (the version that works with Spark 3.5.0), I got the following error whenever I tried to create or query a Delta table:
```
Py4JJavaError: An error occurred while calling o57.sql.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 5.0 failed 4 times, most recent failure: Lost task 0.3 in stage 5.0 (TID 26) (172.20.0.4 executor 0): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.sql.catalyst.expressions.ScalaUDF.f of type scala.Function1 in instance of org.apache.spark.sql.catalyst.expressions.ScalaUDF
```
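
Any statement that actually runs a task on the executor is enough to trigger it, for example (hypothetical table name):

```python
# A trivial Delta DDL plus a query reproduces the ClassCastException:
spark.sql("CREATE TABLE demo (id INT, name STRING) USING delta")
spark.sql("SELECT * FROM demo").show()
```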
Is this a compatibility issue?