I am encountering a ShutdownHookManager error when running Spark with a custom temporary directory configuration. My directory structure and configuration details are as follows:
Directory Structure:
```text
C:
├── Spark
│   └── conf
│       └── spark-defaults.conf
└── SparkCourses
    ├── SparkCourseTEMP
    └── SparkCourseSparkCourseTEMP
```
Configuration Details:
In spark-defaults.conf:
```properties
spark.local.dir=C:/SparkCourse/SparkCourseTEMP
```
In Spark code:
```python
conf.set("spark.local.dir", "C:/SparkCourse/SparkCourseTEMP")
```
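As a sanity check on the path itself (not Spark-specific), `pathlib` confirms that the forward-slash and backslash spellings name the same Windows location, so the separator style should not be the problem:

```python
from pathlib import PureWindowsPath

# Both spellings name the same directory; forward slashes avoid
# accidental backslash-escape issues in Python string literals.
cfg = PureWindowsPath("C:/SparkCourse/SparkCourseTEMP")
print(cfg)  # C:\SparkCourse\SparkCourseTEMP
assert cfg == PureWindowsPath(r"C:\SparkCourse\SparkCourseTEMP")
```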
Error:
A ShutdownHookManager error related to temporary directory cleanup is reported on shutdown. It doesn't stop execution, but it is noisy and may leave stale temporary files consuming disk space.
Steps Taken:
- Verified the `spark.local.dir` configuration.
- Cleaned up residual temporary directories.
- Ensured proper permissions.
- Checked for conflicting configurations.
- Considered updating Spark or falling back to the default temp directory.
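The residual-directory cleanup in the steps above can be sketched as follows (a hypothetical helper of my own, assuming leftover scratch directories all match Spark's `spark-*` naming):

```python
import shutil
import tempfile
from pathlib import Path

def clean_spark_temp(base_dir: str) -> list[str]:
    """Delete leftover spark-* scratch directories under base_dir."""
    removed = []
    for d in Path(base_dir).glob("spark-*"):
        if d.is_dir():
            shutil.rmtree(d, ignore_errors=True)  # tolerate files already gone
            removed.append(d.name)
    return removed

# Demo against a throwaway directory instead of the real temp dir.
base = tempfile.mkdtemp()
(Path(base) / "spark-b6c2dc5e" / "pyspark-e68da28b").mkdir(parents=True)
print(clean_spark_temp(base))  # ['spark-b6c2dc5e']
```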
Questions:
- What could be causing this error?
- How can I ensure temporary directories are cleaned up properly?
- Are there other configurations or updates that might resolve this issue?
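I know I could probably just silence the logger rather than fix the root cause (a sketch below, assuming Spark 3.3+ where log4j2 is used, in `conf/log4j2.properties`), but I'd prefer a real fix:

```properties
# Hypothetical snippet: mute only ShutdownHookManager's cleanup errors
logger.shutdownhook.name = org.apache.spark.util.ShutdownHookManager
logger.shutdownhook.level = off
```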
Full stack trace:

```text
24/08/04 07:33:14 ERROR ShutdownHookManager: Exception while deleting Spark temp dir: C:SparkCourseSparkCourseSparkCourseTEMPspark-b6c2dc5e-8586-45a3-9982-268bcb911152pyspark-e68da28b-3665-4d34-8e1d-ba2cca5b76d1
java.nio.file.NoSuchFileException: C:SparkCourseSparkCourseSparkCourseTEMPspark-b6c2dc5e-8586-45a3-9982-268bcb911152pyspark-e68da28b-3665-4d34-8e1d-ba2cca5b76d1
	at java.base/sun.nio.fs.WindowsException.translateToIOException(WindowsException.java:85)
	at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:103)
	at java.base/sun.nio.fs.WindowsException.rethrowAsIOException(WindowsException.java:108)
	at java.base/sun.nio.fs.WindowsFileAttributeViews$Basic.readAttributes(WindowsFileAttributeViews.java:53)
	at java.base/sun.nio.fs.WindowsFileAttributeViews$Basic.readAttributes(WindowsFileAttributeViews.java:38)
	at java.base/sun.nio.fs.WindowsFileSystemProvider.readAttributes(WindowsFileSystemProvider.java:199)
	at java.base/java.nio.file.Files.readAttributes(Files.java:1851)
	at org.apache.spark.network.util.JavaUtils.deleteRecursivelyUsingJavaIO(JavaUtils.java:124)
	at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:117)
	at org.apache.spark.network.util.JavaUtils.deleteRecursively(JavaUtils.java:90)
	at org.apache.spark.util.SparkFileUtils.deleteRecursively(SparkFileUtils.scala:121)
	at org.apache.spark.util.SparkFileUtils.deleteRecursively$(SparkFileUtils.scala:120)
	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:1126)
	at org.apache.spark.util.ShutdownHookManager$.$anonfun$new$4(ShutdownHookManager.scala:65)
	at org.apache.spark.util.ShutdownHookManager$.$anonfun$new$4$adapted(ShutdownHookManager.scala:62)
	at scala.collection.IndexedSeqOptimized.foreach(IndexedSeqOptimized.scala:36)
	at scala.collection.IndexedSeqOptimized.foreach$(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:198)
	at org.apache.spark.util.ShutdownHookManager$.$anonfun$new$2(ShutdownHookManager.scala:62)
	at org.apache.spark.util.SparkShutdownHook.run(ShutdownHookManager.scala:214)
	at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$2(ShutdownHookManager.scala:188)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1928)
	at org.apache.spark.util.SparkShutdownHookManager.$anonfun$runAll$1(ShutdownHookManager.scala:188)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at scala.util.Try$.apply(Try.scala:213)
	at org.apache.spark.util.SparkShutdownHookManager.runAll(ShutdownHookManager.scala:188)
	at org.apache.spark.util.SparkShutdownHookManager$$anon$2.run(ShutdownHookManager.scala:178)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:842)
```