This is my first larger project that requires the use of many packages and modules. I’ve been reading the official Python documentation and answers here that discuss the “proper” way to organize packages, but there seems to be no consensus. Suppose I have this directory:
application
├── src
│   └── package
│       ├── __init__.py
│       ├── module.py
│       └── module2.py
├── scripts
│   └── some_script.ipynb
└── more_scripts
    └── some_script.ipynb
My goal is to import functions from the modules inside src/package into all of these scripts. I’ve seen a lot of discussion about setting PYTHONPATH, modifying sys.path, using importlib, or using relative imports – each of these strategies is seemingly frowned upon by some group of users.
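Concretely, what I’d like to write at the top of each script is something like the following (some_function is a hypothetical name standing in for whatever the module exports); run from the scripts directory, this currently fails because src is not on sys.path:

```python
# Attempt the import I actually want in scripts/some_script.ipynb.
# (some_function is a hypothetical placeholder name.)
try:
    from package.module import some_function  # noqa: F401
    imported = True
except ModuleNotFoundError:
    imported = False

print("import succeeded" if imported else "import failed")
```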
My question is: what is the best practice for explicitly importing these modules into all scripts? In case it is relevant, I’m planning to push this to GitHub for an audience of non-CS people. Ideally they would not have to set any paths themselves.
Thanks for your help in advance.
I have experimented with setting PYTHONPATH, but I recognize that this might be too demanding for the non-CS end users of the repo.
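For reference, the PYTHONPATH approach I tried looks roughly like this (a sketch; the path is a hypothetical location of the clone, and end users would have to substitute their own):

```shell
# Prepend the repo's src directory to PYTHONPATH so that
# `import package` resolves in every script. The $HOME/application
# path is hypothetical -- each user would need to adjust it.
export PYTHONPATH="$HOME/application/src${PYTHONPATH:+:$PYTHONPATH}"
echo "$PYTHONPATH"
```

This works, but it has to be set in every new shell (or a shell profile), which is exactly the kind of setup step I was hoping to spare the end users.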
I have experimented with modifying sys.path, but Stack Overflow seems to dislike this.
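The sys.path workaround I experimented with looks roughly like this at the top of each notebook (a sketch; it assumes the notebook runs with scripts/ as its working directory, so the package root sits one level up in src/):

```python
import sys
from pathlib import Path

# Assumed layout: this notebook lives in application/scripts/, so the
# importable package root is one level up, in application/src.
src_dir = Path.cwd().parent / "src"

# Prepend it so `from package.module import ...` resolves.
sys.path.insert(0, str(src_dir))

print(sys.path[0])
```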
Stack Overflow seems to dislike relative imports as well, and in any case they don’t work here, since scripts is not a package.