"Authorized committer failed; but task commit success, data duplication may happen" when using Spark 3.4 with spot instances
I am working with Spark on Dataiku, using spot instances (not on-demand instances) with data on S3. I had no problems until I moved from Spark 3.3 to Spark 3.4.
I keep hitting this failure and cannot figure out which configuration change in the new Spark version leads to it.
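For context, this is roughly the kind of job that fails. It is only a minimal sketch: the paths and app name are placeholders, not the actual Dataiku recipe code, and the comments reflect my suspicion rather than a confirmed cause.

```python
from pyspark.sql import SparkSession

# Placeholder app name; in practice the session is created by Dataiku.
spark = (
    SparkSession.builder
    .appName("spot-instance-write-repro")
    .getOrCreate()
)

# Hypothetical input path; the real data also lives on S3.
df = spark.read.parquet("s3a://my-bucket/input/")

# The "Authorized committer failed; but task commit success" error shows up
# during a write like this one. I suspect it happens when a spot executor is
# reclaimed while its task is the one authorized to commit the output.
df.write.mode("overwrite").parquet("s3a://my-bucket/output/")
```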