I’m working on a Python codebase whose modularity we are trying to improve. Both the top-level scripts and the modules they depend on live in one main repo. We have another repo containing utility scripts for various runtime actions, including fixing problems.
We complement these utility scripts with runbooks describing how to deploy and use them. Because a separate DevOps team uses what we create, we try to keep the instructions as simple as possible to avoid confusion.
To help with this, we’ve tried to keep each utility script self-contained, even if that means duplicating code from the modules in our main repo. Obviously this is not ideal, and we’ve reached the point where the utility scripts should start using the main modules rather than duplicating their code.
I know we could, and maybe should, refactor the entire main repo to move the modules into a separate repo that both the main and utilities repos could use. We could also employ CI/CD to test and package deliverables from both repos (it is currently used only on the main repo).
However, that could get extremely complex and will take time and development effort, so it is something we will have to move towards gradually. My main problem at the moment is therefore how to package and deploy a utility together with its dependent modules in a straightforward way: one that makes it simple for DevOps to deploy and run the utilities, while letting us continue refactoring our code as a first step towards re-organising the repos.
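One low-effort interim option is to copy the needed main-repo modules next to each utility at build time and bundle the result into a single-file archive with the standard-library `zipapp` module, which DevOps can then run as one file. A minimal sketch, where the utility name, module names, and layout are all illustrative:

```python
import pathlib
import subprocess
import sys
import tempfile
import zipapp

# Hypothetical build layout: one utility plus the main-repo modules it
# needs, gathered into a single directory (all names are illustrative).
build = pathlib.Path(tempfile.mkdtemp()) / "fix_disk_usage"
(build / "mainrepo").mkdir(parents=True)
(build / "mainrepo" / "__init__.py").write_text("")
(build / "mainrepo" / "helpers.py").write_text(
    "def status():\n    return 'utility ran OK'\n"
)
# __main__.py is the entry point that zipapp executes.
(build / "__main__.py").write_text(
    "from mainrepo.helpers import status\n"
    "print(status())\n"
)

# Bundle the directory into one runnable .pyz file.
pyz = build.parent / "fix_disk_usage.pyz"
zipapp.create_archive(build, target=pyz, interpreter="/usr/bin/env python3")

# DevOps would simply run: python fix_disk_usage.pyz
result = subprocess.run([sys.executable, str(pyz)],
                        capture_output=True, text=True)
print(result.stdout.strip())
```

Tools such as `shiv` or `pex` automate the same single-file idea and can also bundle third-party dependencies, if the plain-stdlib approach proves too limited.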
Are there any tools that would, say, let us identify the necessary Python files/modules by repo and branch/SHA (everything is git-based) in a manifest, and produce an easily deployable package that our DevOps team can use?
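For what it's worth, pip itself can serve as that kind of manifest: a requirements file can pin a dependency to an exact repo and commit SHA, and `pip install --target` can materialise everything into one directory for packaging. A sketch, assuming the main repo gains a minimal `pyproject.toml` so it is pip-installable (the URL, package name, and SHA below are placeholders):

```
# manifest.txt — each line pins a dependency to an exact repo + commit
mainrepo @ git+https://git.example.com/org/mainrepo.git@1a2b3c4d5e6f

# Build step: collect the utility's dependencies into one directory,
# then bundle that directory (e.g. with zipapp, shiv, or pex):
#   pip install --target build/ -r manifest.txt
```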