I am generating documentation for my project for GitHub Pages, but I am doing it by running Jekyll/Doxygen and committing the results back to the repo.
This feels wrong: the files are all auto-generated, so why am I tracking them in git when the source files are already in the repo and I can rebuild them at any time?
I cannot seem to figure out a GitHub workflow that builds my documentation with Jekyll/Doxygen and then deploys it without committing anything back to the repo.
Is this necessary? Do I have to commit the output of Jekyll/Doxygen back to the repo in order to host it on GitHub Pages?
To further clarify my situation: I have a main branch, `master`, which contains the core software, and several 'derivative' branches, `branch1`, `branch2`, `branch3`, which branch off `master` but will never be merged back into it because they serve different purposes. Each of these branches has its own Doxygen documentation because their source code differs slightly.
I want to generate documentation for the repo as a whole with Jekyll; the Markdown sources for that live in a `/docs` folder on a `gh-pages` branch. Within that Jekyll wiki, I also want to expose the Doxygen docs for each derivative branch.
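
Concretely, the published site layout I am aiming for is something like this (the `doxygen/<branch>` subfolder names are just placeholders I made up):

```
_site/                      # Jekyll output, built from /docs on gh-pages
├── index.html              # Jekyll wiki pages
├── ...
└── doxygen/
    ├── branch1/index.html  # Doxygen HTML for branch1
    ├── branch2/index.html  # ... for branch2
    └── branch3/index.html  # ... for branch3
```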
I want a workflow which can do the following (a rough sketch of what I have in mind follows the list):
- take the Jekyll md source files and generate the Jekyll wiki
- check out each derivative branch and generate its Doxygen docs
- store each derivative branch’s Doxygen docs in a subfolder of the Jekyll wiki
- deploy the Jekyll wiki and Doxygen docs to GitHub Pages
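
For concreteness, here is the rough shape of the workflow I have in mind, adapted from the standard "build and deploy Jekyll to Pages" starter workflow. It is untested; the derivative branch names, the `Doxyfile` location, and the `html` output folder are placeholders for my actual setup:

```yaml
name: Build and deploy docs

on:
  push:
    branches: [gh-pages]
  workflow_dispatch:

# Needed to deploy to GitHub Pages without committing anything back
permissions:
  contents: read
  pages: write
  id-token: write

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Jekyll sources live in /docs on the gh-pages branch;
      # fetch-depth: 0 pulls all branches so their Doxygen docs can be built too
      - uses: actions/checkout@v4
        with:
          ref: gh-pages
          fetch-depth: 0

      - name: Install Doxygen
        run: sudo apt-get update && sudo apt-get install -y doxygen

      # Build each derivative branch's Doxygen docs into a subfolder of /docs,
      # so the Jekyll build below copies them into the final site as static files.
      # Branch names, the Doxyfile location, and the 'html' output folder
      # (HTML_OUTPUT) are placeholders for my real setup.
      - name: Build Doxygen docs for each derivative branch
        run: |
          for branch in branch1 branch2 branch3; do
            git worktree add "../$branch" "origin/$branch"
            ( cd "../$branch" && doxygen Doxyfile )
            mkdir -p "docs/doxygen/$branch"
            cp -r "../$branch/html/." "docs/doxygen/$branch/"
          done

      - name: Build the Jekyll wiki (including the copied Doxygen docs)
        uses: actions/jekyll-build-pages@v1
        with:
          source: ./docs
          destination: ./_site

      - name: Upload the combined site as a Pages artifact
        uses: actions/upload-pages-artifact@v3
        with:
          path: ./_site

  deploy:
    needs: build
    runs-on: ubuntu-latest
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    steps:
      - id: deployment
        uses: actions/deploy-pages@v4
```

The idea is to copy each branch's Doxygen output into `/docs` before the Jekyll build, so the generated site contains everything and nothing is ever committed back to the repo.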
Do I need to commit Jekyll/Doxygen outputs back to my repo in order to host them on GitHub Pages? Or can I just “build and deploy” them in a single workflow?