Using a PySpark DataFrame, I read records from SQL Server (via spark.read.format) and insert them into a Postgres table via dataframe.write. During the insert, it throws the following error.
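For reference, the pipeline looks roughly like this (a minimal sketch; the connection URLs, table names, and credentials are placeholders, not my actual values):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mssql-to-postgres").getOrCreate()

# Read the source records from SQL Server over JDBC
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://<host>:1433;databaseName=<db>")  # placeholder
      .option("dbtable", "dbo.SourceTable")                             # placeholder
      .option("user", "<user>")
      .option("password", "<password>")
      .load())

# Write the records into the Postgres table
(df.write.format("jdbc")
   .option("url", "jdbc:postgresql://<host>:5432/<db>")                 # placeholder
   .option("dbtable", "test")
   .option("user", "<user>")
   .option("password", "<password>")
   .mode("append")
   .save())
```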
INSERT INTO test ("Num","DateCreated","DateLastModified","RecordType","RecordSubType","coName","RunDateTime","RunIdent","Payer","ProcessYear","etlInsertId","etlUpdateId") VALUES (1,'2019-06-18 19:45:36.8716624 -06:00','2019-06-18 19:45:36.8716624 -06:00','H',' ','DataBricks','202106181943','CONSUMER','111023',2022,0,0) was aborted: ERROR: column "DateCreated" is of type timestamp with time zone but expression is of type character varying
The "DateCreated" column is defined as TIMESTAMP WITH TIME ZONE in the Postgres table.
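Since the error says the expression is of type character varying, I suspect DateCreated arrives in the DataFrame as a string rather than a timestamp. Would an explicit cast before the write, along these lines, be the right approach? (A sketch only; I am assuming the column names from the INSERT above.)

```python
from pyspark.sql import functions as F

# Cast the string timestamps to Spark TimestampType before the JDBC write
# (assuming DateCreated/DateLastModified currently come across as strings)
df = (df
      .withColumn("DateCreated", F.col("DateCreated").cast("timestamp"))
      .withColumn("DateLastModified", F.col("DateLastModified").cast("timestamp")))
```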
What am I doing wrong here?
Thanks in advance.
I expect the records to be inserted into the Postgres table.