I’m using Databricks Connect to supply data to a web app built with Dash. My use case is similar to this example: https://github.com/databricks-demos/dbconnect-examples/tree/main/python/Plotly
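For context, my setup follows the same pattern as that example: one Spark session created when the app starts and then reused inside the Dash callbacks. A simplified sketch (the table name and callback are just illustrative, not my exact code):

from databricks.connect import DatabricksSession
from dash import Dash, dcc, html, Input, Output
import plotly.express as px

# One session is created at startup and shared by every callback
spark = DatabricksSession.builder.serverless().getOrCreate()

app = Dash(__name__)
app.layout = html.Div([
    dcc.Dropdown(id="table", options=["samples.nyctaxi.trips"], value="samples.nyctaxi.trips"),
    dcc.Graph(id="chart"),
])

@app.callback(Output("chart", "figure"), Input("table", "value"))
def update_chart(table_name):
    # After enough idle time, this query is what raises the gRPC error below
    pdf = spark.table(table_name).limit(100).toPandas()
    return px.scatter(pdf, x=pdf.columns[0], y=pdf.columns[1])

if __name__ == "__main__":
    app.run(debug=True)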
After a certain amount of inactivity, I get the following GRPC exception:
Exception: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.FAILED_PRECONDITION
    details = "BAD_REQUEST: session_id is no longer usable. Generate a new session_id by detaching and reattaching the compute and then try again. (requestId=7e1394be-cf27-47d0-9c6a-f176af85522a)"
    debug_error_string = "UNKNOWN:Error received from peer {grpc_message:"BAD_REQUEST: session_id is no longer usable. Generate a new session_id by detaching and reattaching the compute and then try again
I think that the session is expiring, but I’m not sure how to reestablish the connection without restarting the entire app.
I’ve tried the following commands:
spark.stop()
spark.newSession()
from databricks.connect import DatabricksSession
spark = DatabricksSession.builder.serverless().getOrCreate()
But when I call spark.newSession() I get:
pyspark.errors.exceptions.base.PySparkAttributeError: [JVM_ATTRIBUTE_NOT_SUPPORTED] Attribute `newSession` is not supported in Spark Connect as it depends on the JVM. If you need to use this attribute, do not use Spark Connect when creating your session. Visit https://spark.apache.org/docs/latest/sql-getting-started.html#starting-point-sparksession for creating regular Spark Session in detail.
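What I'm hoping for is something I can do inside the callback: catch the stale-session error, rebuild the session in place, and retry the query, roughly like the sketch below. load_table is a hypothetical helper of mine, and I don't know whether getOrCreate() after stop() actually hands back a usable new session, which is essentially my question:

from databricks.connect import DatabricksSession

_spark = DatabricksSession.builder.serverless().getOrCreate()

def load_table(table_name):
    # Hypothetical helper: retry once if the cached session has gone stale
    global _spark
    try:
        return _spark.table(table_name).limit(100).toPandas()
    except Exception as e:
        if "session_id is no longer usable" not in str(e):
            raise
        # Discard the stale session, build a fresh one, and retry the query once
        try:
            _spark.stop()
        except Exception:
            pass
        _spark = DatabricksSession.builder.serverless().getOrCreate()
        return _spark.table(table_name).limit(100).toPandas()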
Any ideas on how to restore the Databricks Connect session from within the Dash app? Thank you!