I used the code below to save a Microsoft SQL Server table as a Databricks table.
driver = "com.microsoft.sqlserver.jdbc.SQLServerDriver"
database_host = "myservername"
database_port = "1433"  # update if you use a non-default port
database_name = "mydbname"
table = "mytablename"
user = "username"
password = "password"

url = f"jdbc:sqlserver://{database_host}:{database_port};database={database_name}"

# Read the SQL Server table into a Spark DataFrame over JDBC
remote_table = (
    spark.read.format("jdbc")
    .option("driver", driver)
    .option("url", url)
    .option("dbtable", table)
    .option("user", user)
    .option("password", password)
    .load()
)
display(remote_table)
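As a sanity check (a debugging sketch, not part of my original notebook), the schema can be inspected right after the JDBC read; printSchema is standard Spark API:

# Lists every column Spark read from SQL Server; the column named in
# the error further down should appear here if the read itself is fine.
remote_table.printSchema()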
import pandas as pd

# Round trip through pandas and back to Spark
# (df1 = pd.DataFrame(pandas_df) in my original code was redundant,
# since toPandas() already returns a pandas DataFrame)
pandas_df = remote_table.toPandas()
spark_df = spark.createDataFrame(pandas_df)
spark_df.write.saveAsTable("databricks_database_name.test")
%sql
select * from edl_dev_app_ent_deat_src.test2
After running the last SQL cell, I got the error below. How can I save the table to a Databricks table?
IllegalStateException: Couldn’t find UMishandled#13586 in [Id#13513,RunId#13514,ParentDisplayValue#13515,ParentLink#13516,MadeSla#13517,UServiceofferingDisplayValue#13518,UServiceofferingLink#13519,WatchList#13520,ScCatalog#13521,SnEsignDocument#13522,UponReject#13523,SysUpdatedOn#13524,TaskEffectiveNumber#13525,UMultipleComments#13526,ApprovalHistory#13527,Skills#13528,Number#13529,SysUpdatedBy#13530,OpenedByDisplayValue#13531,OpenedByLink#13532,UserInput#13533,SysCreatedOn#13534,SysDomainDisplayValue#13535,SysDomainLink#13536,State#13537,RouteReason#13538,SysCreatedBy#13539,Order#13540,CalendarStc#13541,ClosedAt#13542,CmdbCiDisplayValue#13543,CmdbCiLink#13544,CmdbCiBusinessApp#13545,Contract#13546,Impact#13547,Active#13548,WorkNotesList#13549,BusinessServiceDisplayValue#13550,BusinessServiceLink#13551,Priority#13552,SysDomainPath#13553,TimeWorked#13554,ExpectedStart#13555,OpenedAt#13556,BusinessDuration#13557,GroupList#13558,WorkEnd#13559,ApprovalSet#13560,WorkNotes#13561,UniversalRequest#13562,RequestDisplayValue#13563,RequestLink#13564,ShortDescription#13565,CorrelationDisplay#13566,WorkStart#13567,AssignmentGroupDisplayValue#13568,AssignmentGroupLink#13569,AdditionalAssigneeList#13570,Description#13571,UAtt#13572,CalendarDuration#13573,CloseNotes#13574,ServiceOfferingDisplayValue#13575,ServiceOfferingLink#13576,SysClassName#13577,ClosedByDisplayValue#13578,ClosedByLink#13579,FollowUp#13580,URptDuration#13581,SysId#13582,ContactType#13583,SnEsignEsignatureConfiguration#13584,Urgency#13585,Company#13587,ReassignmentCount#13588,ActivityDue#13589,AssignedToDisplayValue#13590,AssignedToLink#13591,Comments#13592,Approval#13593,SlaDue#13594,CommentsAndWorkNotes#13595,DueDate#13596,SysModCount#13597,RequestItemDisplayValue#13598,RequestItemLink#13599,UEbondSrCreate#13600,SysTags#13601,CatItemDisplayValue#13603,CatItemLink#13604,Escalation#13605,UponApproval#13606,CorrelationId#13607,Location#13608,VarRequestedFor#13609,VarProjectTitle#13610,VarPrimaryContact#13611,VarApplication#13612,VarWhichSiteTemplateIsIt#13613,VarDescriptionOfRequest#13614,VarSolutionURL#13615,VarHowAutomatedIsYourCurrentProcess#13616,VarWhatValueWouldAutomatingYourProcessProvide#13617,VarHowManyFTEsDoesYourCurrentProcessRequire#13618,VarHowManyEWsDoesYourCurrentProcessRequire#13619,VarHowMuchTimeDoesYourCurrentProcessRequire#13620,VarWhatIsYourAnnualizedProcessFrequency#13621,VarAreYouEstimatingAnyCostSavings#13622,VarLBenefitAnalysis#13623,VarLProductivityHoursSaved#13624,VarLComplianceRiskPercentage#13625,VarLDidYouAttachTheScreenshotOfTheErrorOrIssue#13626,VarLPointOfContact#13627,VarLOverviewOfTheRequest#13628,VarLStakeholders#13629,VarLRequirements#13630,VarLPermissions#13631,VarLDidYouAttachTheRequirementDetails#13632]
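Since remote_table is already a Spark DataFrame, my understanding is that the pandas round trip should not be needed at all; a direct write would look like the sketch below (untested on my side; the schema and table names are just the ones from my setup):

# Write the JDBC result straight to a managed Databricks table,
# skipping the Spark -> pandas -> Spark round trip entirely.
remote_table.write.mode("overwrite").saveAsTable("edl_dev_app_ent_deat_src.test")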
I also tried going straight from the pandas DataFrame, but that did not work either; I got the same error when I ran the SQL SELECT after the code below, which also printed the FutureWarning shown underneath.
spark.createDataFrame(pandas_df).write.saveAsTable("edl_dev_app_ent_deat_src.test")
/databricks/spark/python/pyspark/sql/pandas/conversion.py:626: FutureWarning: iteritems is deprecated and will be removed in a future version. Use .items instead.
[(c, t) for (_, c), t in zip(pdf_slice.iteritems(), arrow_types)]
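The FutureWarning comes from PySpark's Arrow-based pandas conversion. One workaround I have seen suggested (an assumption on my part; I have not verified that it fixes the IllegalStateException) is to disable Arrow before converting:

# Fall back to PySpark's non-Arrow conversion path for
# toPandas() / createDataFrame(); the Arrow path is what emits
# the iteritems FutureWarning above.
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "false")

pandas_df = remote_table.toPandas()
spark.createDataFrame(pandas_df).write.mode("overwrite").saveAsTable("edl_dev_app_ent_deat_src.test")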