I have a Java/Spring repo and a Spark/Java repo. The Spring one uses Spark to write data to external sources.
I have code similar to the following:
String errorCause = "";
String errorMessage = "";
try {
    dataset.write()...save();
} catch (Exception e) {
    // getCause() can return null, so guard before calling toString()
    errorCause = (e.getCause() != null) ? e.getCause().toString() : "";
    errorMessage = e.getMessage();
}
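Since `e.getCause()` can be null on the driver (which would itself throw a NullPointerException inside the catch block), a defensive pattern is to walk the cause chain and fall back to the exception itself. This is a minimal sketch; the class and method names (`CauseChain`, `rootCauseMessage`) are hypothetical helpers, not part of Spark or Spring:

```java
public class CauseChain {

    // Walks the cause chain to the deepest non-null cause.
    // Guards against both a missing cause and a self-referential cause.
    static String rootCauseMessage(Throwable t) {
        Throwable root = t;
        while (root.getCause() != null && root.getCause() != root) {
            root = root.getCause();
        }
        return root.toString();
    }

    public static void main(String[] args) {
        Exception inner = new IllegalStateException(
                "value too large for column A actual size 60, maximum 30");
        Exception outer = new RuntimeException("Job aborted due to stage failure", inner);

        // Prints the root cause when one is attached
        System.out.println(rootCauseMessage(outer));

        // Falls back to the exception itself when no cause is attached
        System.out.println(rootCauseMessage(new RuntimeException("no cause attached")));
    }
}
```

This does not fix the cluster-side problem by itself, but it makes the catch block safe regardless of whether the cause was propagated.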
When Spark runs on local[*], I can get both the cause and the message. When it runs on a remote cluster, I get the message but not the cause. The message contains something like "Worker ... executor 6 failed", which is not useful, whereas the cause would give me "value too large for column A actual size 60, maximum 30".
I suspect the cause is not propagated to the driver because of this line: https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/TaskResultGetter.scala#L154
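If the cause object is indeed dropped during task-result deserialization, one workaround is to fall back to scanning the driver-side message text, which often embeds the executor's stack trace as plain text. This is a best-effort sketch under that assumption; `bestEffortCause` and the "Caused by:" parsing heuristic are my own, not a guaranteed Spark behavior:

```java
import java.util.Optional;

public class SparkErrorExtract {

    // Hypothetical fallback: prefer the real cause when present; otherwise
    // look for a "Caused by:" line embedded in the message text.
    static String bestEffortCause(Throwable e) {
        if (e.getCause() != null) {
            return e.getCause().toString();
        }
        String msg = Optional.ofNullable(e.getMessage()).orElse("");
        // Assumption: the first "Caused by:" line, if present, names the real error.
        for (String line : msg.split("\n")) {
            if (line.trim().startsWith("Caused by:")) {
                return line.trim();
            }
        }
        return msg;
    }

    public static void main(String[] args) {
        String driverMsg = "Job aborted due to stage failure: ...\n"
                + "Caused by: java.sql.SQLException: value too large for column A"
                + " actual size 60, maximum 30";
        System.out.println(bestEffortCause(new RuntimeException(driverMsg)));
    }
}
```

Whether the useful line actually appears in the message depends on the Spark version and how the task failure is serialized, so this should be treated as a fallback, not a primary error-reporting path.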