I have a complex build:
- Many components; one git repository for each
- Parallel development: many branches that eventually merge into a single release branch.
At a Jenkins conference a year ago, a colleague was advised to give each component (git repo) its own pipeline, and then use another pipeline to 'glue' them together.
The best path forward I can see is a sort of "driver" Jenkinsfile that does something along the lines of:
unarchive <a+b objects>
... and so on
... and then figure out a way to pass binaries between the different multibranch pipeline jobs (not via Nexus/Artifactory, because of $REASONS I can't discuss).
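For what it's worth, here is a rough sketch of what I imagine such a driver could look like, assuming the Copy Artifact plugin is installed. The job names (`component-a`, `component-b`), paths, and the assemble script are placeholders, not my real setup:

```groovy
// Hypothetical "driver" Jenkinsfile. Job and path names are placeholders.
// Requires the Copy Artifact plugin for the copyArtifacts step.
pipeline {
    agent any
    stages {
        stage('Build components') {
            steps {
                script {
                    // Multibranch jobs are addressed as <folder>/<branch>.
                    // Note: branch names with slashes must be URL-encoded.
                    parallel(
                        a: { build job: "component-a/${env.BRANCH_NAME}" },
                        b: { build job: "component-b/${env.BRANCH_NAME}" }
                    )
                }
            }
        }
        stage('Assemble') {
            steps {
                // Pull archived binaries out of the component branch jobs,
                // so nothing has to go through Nexus/Artifactory.
                copyArtifacts projectName: "component-a/${env.BRANCH_NAME}",
                              selector: lastSuccessful(),
                              target: 'inputs/a'
                copyArtifacts projectName: "component-b/${env.BRANCH_NAME}",
                              selector: lastSuccessful(),
                              target: 'inputs/b'
                sh './assemble-release.sh inputs/'   // placeholder
            }
        }
    }
}
```

(The `build` step also returns the triggered run, whose build number could pin `copyArtifacts` to the exact build via the `specific(...)` selector instead of `lastSuccessful()`, which would avoid races. I'm not sure this is the idiomatic approach, hence the question.)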
Is there a better way to go about the task?
- There doesn't seem to be a branch-aware "upstream/downstream" relationship that I can leverage.
- I'm not sure whether it's good practice to have one multibranch pipeline checking out many git repositories (one per stage).
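In case it clarifies the binary-passing part: the component side would presumably just archive its output so the driver can copy it, e.g. (build command and artifact glob are placeholders):

```groovy
// Sketch of a component Jenkinsfile. 'make' and the artifact
// pattern are placeholders for the real build.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make'   // placeholder build command
            }
        }
    }
    post {
        success {
            // Artifacts are stored on the Jenkins controller, keyed by job
            // and build number, so no external artifact repository is needed.
            archiveArtifacts artifacts: 'build/**', fingerprint: true
        }
    }
}
```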