I have installed a Spark cluster in Docker. The Spark master container's ports are 0.0.0.0:7077->7077/tcp, 6066/tcp, 0.0.0.0:8080->8080/tcp, and its container ID is 894e4b6f96bb. The cluster is on the same Docker network where Hadoop and Hive are installed. How do I create a Spark session against this master? I am unable to resolve the host name. Below is how I am currently creating the Spark session.
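
For reference, my understanding (an assumption on my part, not something I have confirmed) is that if the driver also runs in a container attached to the same user-defined Docker network, the master container's name should be resolvable through Docker's embedded DNS, so the master URL could use the container name directly:

from pyspark.sql import SparkSession

# Minimal sketch, assuming the driver container shares the master's
# user-defined Docker network, so the name "spark-master" resolves via
# Docker's embedded DNS.
spark = (
    SparkSession.builder
    .appName("SparkSessionExample")
    .master("spark://spark-master:7077")
    .getOrCreate()
)

My actual attempt, which instead tries to resolve the master's hostname with docker inspect, is below.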
import subprocess
from pyspark.sql import SparkSession

def get_container_id(container_name):
    # Ask Docker for the hostname configured inside the container
    command = ["docker", "inspect", "-f", "{{ .Config.Hostname }}", container_name]
    result = subprocess.run(command, capture_output=True, text=True)
    return result.stdout.strip()

spark_container_name = "spark-master"
hostname = get_container_id(spark_container_name)
print(hostname)

# The chained builder calls must be wrapped in parentheses (or use trailing
# backslashes) so they form a single statement
spark = (
    SparkSession.builder
    .appName("SparkSessionExample")
    .master(f"spark://{hostname}:7077")
    .getOrCreate()
)
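
A variation I am considering (just a sketch, not verified): since .Config.Hostname defaults to the short container ID and may not be resolvable from where the driver runs, I could instead ask Docker for the master container's IP address on the shared network and use that in the master URL:

import subprocess
from pyspark.sql import SparkSession

def get_container_ip(container_name):
    # Read the container's IP on its Docker network(s); assumes the container
    # is attached to a single network, otherwise the IPs are concatenated
    command = [
        "docker", "inspect", "-f",
        "{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}",
        container_name,
    ]
    result = subprocess.run(command, capture_output=True, text=True)
    return result.stdout.strip()

master_ip = get_container_ip("spark-master")
spark = (
    SparkSession.builder
    .appName("SparkSessionExample")
    .master(f"spark://{master_ip}:7077")
    .getOrCreate()
)

I am not sure whether either approach is the recommended one.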
I need help to solve this issue; I am new to Spark and Docker.