I have a rather simple question: I am generating documentation for my project for GitHub Pages, but I am doing it by running Jekyll/Doxygen locally and committing the results back to the repo.
This feels wrong: all these files are auto-generated, so why am I tracking them in Git?
For the life of me, I cannot figure out a GitHub workflow that builds my documentation with Jekyll/Doxygen and then deploys it without committing anything back to the repo.
Is this necessary? Do I have to commit the output of Jekyll/Doxygen back to the repo in order to host it on GitHub Pages?
I.e., do I have to track all these auto-generated files in my repo even though the source files are already there and I can rebuild them at any time?
To clarify my situation further: I have a main branch, 'master', which is the core software, and several 'derivative' branches (branch1, branch2, branch3) that are derived from master but will never be merged back into it, because they serve different purposes. Each of these branches has its own Doxygen documentation because their source code differs slightly.
I want to generate documentation for the repo as a whole with Jekyll; the Markdown files and such for that live in a /docs folder on a gh-pages branch. Then, within that Jekyll wiki, I also want to expose the Doxygen docs for each branch.
I want a workflow that can:
- take the Jekyll Markdown source files and build the Jekyll wiki
- check out each derivative branch and generate its Doxygen docs
- put each derivative branch's Doxygen docs in a subfolder of the Jekyll wiki
- deploy the Jekyll wiki plus the Doxygen docs to my GitHub Pages site
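For reference, this is the rough shape of what I am imagining, using the artifact-based Pages deployment so nothing gets committed back. This is only a sketch: the action versions, the branch names, and the assumption that each branch has a `Doxyfile` writing HTML to `html/` are all placeholders for my setup, not something I have working.

```yaml
name: Build and deploy docs

on:
  push:
    branches: [master]

permissions:
  contents: read
  pages: write
  id-token: write   # required by actions/deploy-pages

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          ref: gh-pages      # branch holding the Jekyll /docs sources
          fetch-depth: 0     # fetch all branches so the worktrees below resolve

      - name: Build Jekyll wiki
        uses: actions/jekyll-build-pages@v1
        with:
          source: ./docs
          destination: ./_site

      - name: Install Doxygen
        run: sudo apt-get update && sudo apt-get install -y doxygen

      - name: Build Doxygen docs for each derivative branch
        run: |
          for BRANCH in branch1 branch2 branch3; do
            git worktree add "/tmp/$BRANCH" "origin/$BRANCH"
            # assumes each branch's Doxyfile emits HTML into html/
            (cd "/tmp/$BRANCH" && doxygen Doxyfile)
            sudo mkdir -p "./_site/doxygen/$BRANCH"
            sudo cp -r "/tmp/$BRANCH/html/." "./_site/doxygen/$BRANCH/"
          done

      - uses: actions/upload-pages-artifact@v3
        with:
          path: ./_site

  deploy:
    needs: build
    runs-on: ubuntu-latest
    environment:
      name: github-pages
      url: ${{ steps.deployment.outputs.page_url }}
    steps:
      - id: deployment
        uses: actions/deploy-pages@v4
```

(The `sudo` on the copy steps is there because the Jekyll build action runs in a container and may leave `_site` owned by root; I am not sure whether that is actually needed.)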
Do I need to commit the Jekyll/Doxygen output back to my repo in order to host it on GitHub Pages? Or can I just "build and deploy" in a single workflow?