I am trying to migrate the Hive metastore to Unity Catalog, so I have enabled Unity Catalog on my existing cluster. However, one of our notebooks uses the code below, which is no longer supported and now throws an error. Kindly advise how we should handle this existing code after enabling Unity Catalog on the current cluster.
Please advise: is there any alternative code that could achieve the same result?
jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2};UserName={3};Password={4}".format(jdbcHostname, jdbcPort, jdbcDatabase, dbr_PrincipalId, dbr_PrincipalSecret)
connectionProperties = {
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "authentication": "ActiveDirectoryServicePrincipal"
}
# Reach into the JVM through the py4j gateway to build a
# java.util.Properties object and open a raw JDBC connection
props = spark._sc._gateway.jvm.java.util.Properties()
props.putAll(connectionProperties)
driver_manager = spark._sc._gateway.jvm.java.sql.DriverManager
con = driver_manager.getConnection(jdbcUrl, props)
Error:
py4j.security.Py4JSecurityException: Method public synchronized void java.util.Hashtable.putAll(java.util.Map) is not whitelisted on class class java.util.Properties
On Unity Catalog-enabled clusters, py4j access to arbitrary JVM methods is restricted for security, which is why the java.util.Properties / DriverManager approach above is no longer allowed. So, as @Ganesh Chandrasekaran mentioned, use one of the following DataFrame-based alternatives instead.
remote_table1 = (spark.read
.format("sqlserver")
.option("host", "hostName")
.option("port", 1433)
.option("user", "userName")
.option("password", "password")
.option("database", "databaseName")
.option("dbtable", "tableName")
.load()
)
# OR
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
remote_table2 = (spark.read
.format("jdbc")
.option("driver", driver)
.option("url", "jdbc:sqlserver://<hostName>:1433;database=master")
.option("dbtable", "tableName")
.option("user", "userName")
.option("password", "password")
.load()
)
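Since the original code authenticated with an Azure AD service principal rather than a SQL login, the JDBC options can carry that authentication mode over as well. The sketch below uses hypothetical host, database, and credential names (in practice you would pull the secret from a secret scope); it stays entirely on the supported DataFrame API, so no py4j whitelisting is involved:

```python
# Hypothetical values for illustration; replace with your own host,
# database, and service-principal credentials (e.g. from a secret scope).
jdbc_hostname = "myserver.database.windows.net"
jdbc_port = 1433
jdbc_database = "mydb"
sp_client_id = "service-principal-client-id"
sp_client_secret = "service-principal-secret"

# Credentials are passed as options, not embedded in the URL.
jdbc_url = "jdbc:sqlserver://{0}:{1};database={2}".format(
    jdbc_hostname, jdbc_port, jdbc_database
)

# Options for spark.read.format("jdbc"); the authentication mode matches
# the original ActiveDirectoryServicePrincipal setup.
jdbc_options = {
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    "url": jdbc_url,
    "dbtable": "dbo.tableName",          # placeholder table name
    "user": sp_client_id,
    "password": sp_client_secret,
    "authentication": "ActiveDirectoryServicePrincipal",
}

def read_table(spark):
    # Supported DataFrame read; works on Unity Catalog clusters.
    return spark.read.format("jdbc").options(**jdbc_options).load()
```

The resulting DataFrame can then be used like any other Spark DataFrame, including writing it out as a Unity Catalog managed table with saveAsTable.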
Also, check the experimental query federation features, which let you take advantage of Unity Catalog syntax and data governance tools when querying external databases.