Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
24/05/01 19:03:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is
`com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
Traceback (most recent call last):
  File "C:\Users\Administrator\PycharmProjects\pythonProject\connect_db.py", line 6, in <module>
    db = spark.read.format("jdbc").option("url","host").option("user","username").option("password","mypass").option("dbtable","emp").option("driver","com.mysql.jdbc.Driver").load()
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\spark\spark-3.5.1-bin-hadoop3\python\lib\pyspark.zip\pyspark\sql\readwriter.py", line 314, in load
  File "C:\spark\spark-3.5.1-bin-hadoop3\python\lib\py4j-0.10.9.7-src.zip\py4j\java_gateway.py", line 1322, in __call__
  File "C:\spark\spark-3.5.1-bin-hadoop3\python\lib\pyspark.zip\pyspark\errors\exceptions\captured.py", line 185, in deco
pyspark.errors.exceptions.captured.IllegalArgumentException: requirement failed: The driver could not open a JDBC connection. Check the URL: host
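
The "requirement failed" message usually means the driver did not recognize the URL string, which it won't unless the url option is a full JDBC URL starting with jdbc:mysql://. For comparison, here is a minimal sketch of the same read with a complete URL; the host db-host, port 3306, database mydb, the credentials, and the Connector/J jar path are assumed placeholders, not values from the setup above:

    from pyspark.sql import SparkSession

    # The MySQL Connector/J jar must be on the classpath; the path below is a placeholder.
    spark = (
        SparkSession.builder
        .appName("mysql-jdbc-read")
        .config("spark.jars", "C:/spark/jars/mysql-connector-j-8.4.0.jar")
        .getOrCreate()
    )

    # "url" must be a full JDBC URL of the form jdbc:mysql://<host>:<port>/<database>,
    # not just a hostname. Host, port, database, and credentials here are placeholders.
    db = (
        spark.read.format("jdbc")
        .option("url", "jdbc:mysql://db-host:3306/mydb")
        .option("dbtable", "emp")
        .option("user", "username")
        .option("password", "mypass")
        .option("driver", "com.mysql.cj.jdbc.Driver")  # new driver class named in the warning above
        .load()
    )

    db.show()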