I’m working on a project where we are trying to load multiple tables from Delta Lake into a Synapse dedicated SQL pool. The reason we need the data in SQL is that the apps that read it use .NET libraries, and we do not intend to change any code there.
I’m aware that we can write data from Databricks straight into Synapse using the Spark connector, but I was exploring whether I can instead read the Delta tables with a Synapse pipeline and write them to the dedicated SQL pool. I tried the native Delta Lake connector and was able to read and write using a Synapse pipeline, with the staged copy option pointing to an ADLS storage location.
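For context, the Databricks-to-Synapse path I mentioned looks roughly like the snippet below (the server, database, table, and storage names are placeholders, not our real values); it works, but I would prefer to keep the data movement inside a Synapse pipeline:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the source Delta table from ADLS (path is a placeholder)
df = spark.read.format("delta").load(
    "abfss://data@mystorageaccount.dfs.core.windows.net/delta/my_table"
)

# Write to the dedicated SQL pool via the Azure Synapse (Spark) connector,
# which stages the data in ADLS before loading it into the pool
(df.write
    .format("com.databricks.spark.sqldw")
    .option("url", "jdbc:sqlserver://myworkspace.sql.azuresynapse.net:1433;"
                   "database=mydedicatedpool;encrypt=true;loginTimeout=30;")
    .option("forwardSparkAzureStorageCredentials", "true")  # reuse the cluster's storage credentials for the staging dir
    .option("dbTable", "dbo.my_table")                      # target table in the dedicated pool
    .option("tempDir", "abfss://staging@mystorageaccount.dfs.core.windows.net/tmp")
    .mode("overwrite")
    .save())
```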
My question is: how can I use PolyBase to copy data efficiently from my Delta Lake tables to the Synapse dedicated SQL pool? The documentation says the PolyBase option is available and is the best option when writing data to a Synapse pool.
However, I do not see the PolyBase option when configuring my copy activity. What I see is shown below.
Am I missing something here? How do I enable PolyBase so I can copy my data from source to sink?
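For reference, this is roughly the fragment of the copy activity's typeProperties I expected to end up with, based on the documentation for the Azure Synapse Analytics sink (the linked service name and staging path are placeholders), but I can't find where to switch this on in the UI:

```json
{
    "sink": {
        "type": "SqlDWSink",
        "allowPolyBase": true,
        "polyBaseSettings": {
            "rejectType": "percentage",
            "rejectValue": 10.0,
            "rejectSampleValue": 100,
            "useTypeDefault": true
        }
    },
    "enableStaging": true,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "AdlsStagingLinkedService",
            "type": "LinkedServiceReference"
        },
        "path": "staging-container/copy-staging"
    }
}
```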