PySpark streaming from Event Hubs fails with concurrent CompletionException (receiver disconnected)
I have a PySpark application that streams data from Azure Event Hubs. It runs on Azure Databricks using instance pools and uses a checkpoint folder in ADLS. Data read from Event Hubs is transformed first, then written to:

- an ADLS Delta table,
- two SQL tables (based on some filters), and
- another ADLS Delta table that receives timestamp information from the dataframe.

I get a streaming query exception while writing the data. The error shows a concurrent CompletionException with a "receiver disconnected" cause, and it does not always occur on the same write operation. Sometimes I also see a "could not add containers" error in the cluster event log.

When the error persists across multiple runs, I have to reset the checkpoint, after which the job runs fine. Sometimes, however, it resolves on its own without resetting the checkpoint.

Can anyone suggest how to resolve this? It does not occur consistently, and not at the same line of code.
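For context, a simplified sketch of the setup described above is shown below. All names, paths, filters, and the `build_eh_conf` helper are illustrative placeholders (not the actual job), assuming the azure-event-hubs-spark connector on Databricks. Only the options-dict builder is runnable as-is; the streaming calls are shown as comments because they need a live Event Hubs namespace. One relevant detail: a "receiver disconnected" error from Event Hubs is typically raised when two receivers read the same partition under the same consumer group, which is why the sketch gives the query its own consumer group and consolidates all sinks into a single foreachBatch query.

```python
# Illustrative sketch only -- names, paths, and tables are placeholders,
# assuming the azure-event-hubs-spark connector on Azure Databricks.

def build_eh_conf(connection_string: str, consumer_group: str) -> dict:
    """Build the options dict for spark.readStream.format("eventhubs").

    A "receiver disconnected" error is commonly caused by two readers
    sharing one consumer group on the same partition, so each streaming
    query should get its own dedicated consumer group.
    """
    return {
        # In a real job the connection string is encrypted via
        # sc._jvm.org.apache.spark.eventhubs.EventHubsUtils.encrypt(...)
        "eventhubs.connectionString": connection_string,
        "eventhubs.consumerGroup": consumer_group,
    }

# The streaming part (needs a live Event Hubs namespace, hence comments):
#
# eh_conf = build_eh_conf(encrypted_conn_str, "my-dedicated-group")
# raw = spark.readStream.format("eventhubs").options(**eh_conf).load()
# transformed = transform(raw)  # the transformation step described above
#
# def write_all(batch_df, batch_id):
#     batch_df.persist()  # reused by four sinks below
#     batch_df.write.format("delta").mode("append").save(delta_path)
#     batch_df.filter(filter_1).write.jdbc(jdbc_url, "sql_table_1", mode="append")
#     batch_df.filter(filter_2).write.jdbc(jdbc_url, "sql_table_2", mode="append")
#     batch_df.select("event_ts").write.format("delta").mode("append").save(ts_path)
#     batch_df.unpersist()
#
# (transformed.writeStream
#     .foreachBatch(write_all)  # one query -> one Event Hubs receiver
#     .option("checkpointLocation", checkpoint_path)
#     .start())
```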