I have a Java app in the form of a jar file that is installed on a Databricks cluster. The app reads from and writes to tables in Databricks, so it needs a Spark session to perform these actions. I need to somehow retrieve the Spark session already running in Databricks from the Java code.
None of the usual SparkSession methods, such as SparkSession.builder, SparkSession.active, and SparkSession.getActiveSession, are available for me to use. How can I access the Spark session inside the Java code?
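For context, this is roughly what I expected to be able to do from the jar's entry point (the class and table names are just placeholders):

```java
import org.apache.spark.sql.SparkSession;

public class MyDatabricksApp {
    public static void main(String[] args) {
        // What I would like: pick up the session that Databricks has
        // already created for the cluster, instead of building a new one.
        // These are exactly the builder/active methods that I cannot use in my setup.
        SparkSession spark = SparkSession.builder().getOrCreate();

        // Then read from / write to Databricks tables with that session.
        spark.sql("SELECT * FROM my_schema.my_table").show();
    }
}
```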
I have tried passing the session directly as a function argument, and that did run, but I also need to pass some arguments to the main function, so I cannot do both things together.
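Here is a sketch of that attempt (method, class, and argument names are made up for illustration):

```java
import org.apache.spark.sql.SparkSession;

public class MyDatabricksApp {
    // Passing the session in explicitly works when the caller
    // already has a SparkSession in scope to hand over...
    public static void run(SparkSession spark, String inputTable) {
        spark.table(inputTable).show();
    }

    // ...but the jar's entry point only receives String[] args,
    // so there is no SparkSession to pass along with them.
    public static void main(String[] args) {
        String inputTable = args[0];
        // run(???, inputTable);  // nothing to supply as the first argument
    }
}
```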