I am using the Python binding of the Spark Docker image for Kubernetes, and this image already contains the PySpark library. I need to use conda in this image. Is there a way to make the existing PySpark installation visible to conda's pip?
My initial idea was to take the dist-info directory from the version you get when installing PySpark via conda's pip and modify its paths, but maybe there is a better way to do it?
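For context, here is a minimal sketch of the workaround I have considered so far: dropping a .pth file into the conda environment's site-packages so that the PySpark shipped with the image becomes importable. This assumes SPARK_HOME is set in the image (e.g. /opt/spark) and that PySpark lives under $SPARK_HOME/python with Py4J as a zip under python/lib; the file name pyspark_from_image.pth is just something I made up.

```python
# Hypothetical sketch: make the image's bundled PySpark importable
# from an active conda environment by writing a .pth file into its
# site-packages. Paths are assumptions based on the standard Spark image.
import glob
import os
import site

spark_home = os.environ.get("SPARK_HOME", "/opt/spark")  # assumed location
spark_python = os.path.join(spark_home, "python")
# Py4J ships as a zip next to PySpark; the exact file name varies by image.
py4j_zips = glob.glob(os.path.join(spark_python, "lib", "py4j-*-src.zip"))

site_packages = site.getsitepackages()[0]
pth_path = os.path.join(site_packages, "pyspark_from_image.pth")

# Each line of a .pth file that points at an existing path is appended
# to sys.path when the interpreter starts.
with open(pth_path, "w") as f:
    f.write(spark_python + "\n")
    for zip_path in py4j_zips:
        f.write(zip_path + "\n")

print("wrote", pth_path)
```

This makes `import pyspark` work inside the conda environment, but pip still does not consider PySpark installed (there is no dist-info), so it would reinstall it if another package depends on it. That is why I was thinking about fabricating the dist-info metadata, and why I am asking whether there is a cleaner approach.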