I have the above file structure. Within src/jobs there is a file common.py that contains code to be reused across all the different jobs, and each job lives in its own sub-directory. I want to import common.py into a job using an absolute import such as from src.jobs.common import *, but I am getting error messages. I want absolute imports because I will be testing the jobs, and relative imports can cause the tests to fail. How can I structure this as a Python package to achieve this?
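To make the testing requirement concrete, this is roughly the kind of test I want to be able to write. The test file name and the no-argument call are just placeholders for illustration; create_spark_session is the helper from common.py that the traceback below refers to:

# tests/test_common.py  (illustrative test module, not an existing file)
from src.jobs.common import create_spark_session  # same absolute import the job scripts use

def test_create_spark_session_returns_a_session():
    # Assumes create_spark_session can be called with no arguments
    spark = create_spark_session()
    assert spark is not None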
I have tried running the file from different locations and with __init__.py files in every folder, but I keep getting a variant of this message:
$ python src/jobs/most_popular_products/most_popular_product.py
Traceback (most recent call last):
File "C:gitspark-jobs-spikesrcjobsmost_popular_productsmost_popular_product.py", line 1, in <module>
from src.jobs.common import create_spark_session
ModuleNotFoundError: No module named 'src'
I am running this in VS Code on Windows.
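For completeness, here is a stripped-down sketch of the two files involved. The function bodies are placeholders I have written purely to make the example self-contained (the real common.py does more), but the import at the top of the job script is exactly the line that fails:

# src/jobs/common.py  (simplified placeholder)
from pyspark.sql import SparkSession

def create_spark_session(app_name="spark-jobs-spike"):
    # Shared helper reused by every job
    return SparkSession.builder.appName(app_name).getOrCreate()

# src/jobs/most_popular_products/most_popular_product.py  (simplified placeholder)
from src.jobs.common import create_spark_session  # this is the line that raises ModuleNotFoundError

def main():
    spark = create_spark_session()
    # ... job logic goes here ...

if __name__ == "__main__":
    main()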