Pipeline of Pipelines?

I have a complex build:

  • Many components; one git repository for each
  • Parallel branches - many branches merge into one release.

At a Jenkins conference a year ago, a colleague was advised to give each component (git repo) its own pipeline, and then use another pipeline to 'glue' them together.

The best path forward that I can see is to have a sort of "driver" Jenkinsfile which then uses something along the lines of:

stage('a') {
    // build component a, then save its outputs for later stages
    archive '<objects>'
}
stage('b') {
    unarchive mapping: ['<a objects>': '.']
    build "../repo-b/${env.BRANCH_NAME}"   // double quotes, so the branch name interpolates
    archive '<objects>'
}
stage('c') {
    unarchive mapping: ['<a+b objects>': '.']
    build "../repo-c/${env.BRANCH_NAME}"
    archive '<objects>'
}
... and so on

... and then figure out a way to pass binaries between the different multibranch pipeline jobs (not via Nexus/Artifactory, because of $REASONS I can't discuss).
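One option for the binary hand-off without an external artifact store might be the Copy Artifact plugin: the upstream job archives its outputs on the Jenkins controller and the downstream job copies them back in. A rough sketch, assuming that plugin is installed and that the job names and artifact paths (`repo-a`, `build/output/**`, `deps/a`) stand in for the real ones:

```groovy
// In repo-a's Jenkinsfile: publish the built objects so that other
// jobs on the same controller can fetch them later.
archiveArtifacts artifacts: 'build/output/**', fingerprint: true

// In repo-b's Jenkinsfile: pull the artifacts from the same branch's
// run of repo-a's multibranch job into a local directory.
copyArtifacts projectName: "repo-a/${env.BRANCH_NAME}",
              selector: lastSuccessful(),
              target: 'deps/a'
```

Because `projectName` includes `env.BRANCH_NAME`, each branch of the downstream job picks up artifacts from the matching branch of the upstream job, which is the closest thing I know of to a branch-aware upstream/downstream link.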

Is there a better way to go about the task?

  • There doesn't seem to be a branch-aware "upstream/downstream" relationship that I can leverage.
  • I'm not sure if it's a good practice to have one multibranch pipeline using many git repositories (each for a different stage).
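On the second point, the alternative I keep coming back to is a single multibranch Jenkinsfile that checks out every component repo side by side and builds them in dependency order, so no binaries cross job boundaries at all. A minimal sketch, with placeholder URLs and build commands, assuming all repos share the same branch names:

```groovy
node {
    stage('checkout') {
        // Check out each component repo on the same branch, side by side.
        dir('repo-a') { git url: 'https://example.com/repo-a.git', branch: env.BRANCH_NAME }
        dir('repo-b') { git url: 'https://example.com/repo-b.git', branch: env.BRANCH_NAME }
    }
    stage('build') {
        // Build in dependency order inside one workspace; later components
        // can consume earlier outputs directly from disk.
        dir('repo-a') { sh 'make' }
        dir('repo-b') { sh 'make' }
    }
}
```

The obvious downside is that a change to any one repo rebuilds everything, and branch indexing only watches the repo that hosts the Jenkinsfile unless extra triggers are wired up.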
