How does Jenkins determine downstream projects?
It doesn't change the overall job status, but it does notify us when the downstream job fails. Jenkins Prerequisite Build Step Plugin. We set a post-build step "Trigger parameterized build on other projects" in all three of the original downstream jobs. The parameter passed into the new job depends on the three jobs' statuses, and it causes the new job to react accordingly. Create a new job which contains one simple class and one simple test.
Both parameters are status-dependent, i.e. set "Trigger parameterized build on other projects" for the three original downstream jobs with the relevant configuration. I have an upstream job that executes 4 downstream jobs. If the upstream job finishes successfully, the downstream jobs start their execution.
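In a pipeline, the same idea — passing the triggering job's result into an aggregate job — can be sketched with the `build` step. This is a hedged sketch, not the setup from the posts above: the job name `aggregate-job` and the parameter name `UPSTREAM_STATUS` are made-up placeholders.

```groovy
// Hedged sketch: run this from each of the three downstream jobs
// (e.g. in a post/always block). 'aggregate-job' and 'UPSTREAM_STATUS'
// are hypothetical names, not from the discussion above.
build(
    job: 'aggregate-job',
    wait: false,  // fire-and-forget, like the post-build trigger
    parameters: [
        // currentBuild.currentResult is SUCCESS, UNSTABLE or FAILURE
        string(name: 'UPSTREAM_STATUS', value: currentBuild.currentResult)
    ]
)
```

The triggered job can then branch on `params.UPSTREAM_STATUS` to react to each upstream outcome.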
This approach is very useful; see the top-rated solution: stackoverflow. The last maintained version seems incompatible with Jenkins 2. It also helps to reduce the dependency tree as much as possible. I initially had ours reflect the source code dependencies one by one, but it pays off to reduce it to only the absolutely necessary paths. No, there are good reasons to split this into multiple jobs.
And maintainability: every job has a Jenkinsfile that is about a hundred lines long, and combining them into one job makes it unreadable. Also, the parallel step mixes the output of all nodes, which renders it close to unreadable. Finally, being able to start the intermediate jobs individually isn't possible with one single pipeline job.
Shall I continue? Sorry if that's a dumb question, but I cannot yet see how these orchestration pipelines help address the issue at hand. I do get that they help define the build sequence nicely and without the need for the two options that spawned this discussion thread. However, as far as I understood it, I would either. The first scenario is not desirable, as it tends to become a significant waste of resources once we are looking at more than just some five to ten jobs.
And with the second scenario, we're back at the point where having these blocking checkboxes becomes a bit of a necessity. Of course, one could start breaking things down into several smaller job groups so that the orchestration jobs make more sense, but then again this is extra configuration effort that simply wasn't necessary when only using freestyle and matrix jobs, and it therefore feels like a bit of overhead.
Actually it is possible, though we do not yet provide a convenient framework for it; possibly Declarative will in the future. When build D is successful, it'll trigger a build of A, B and C.
But you don't want A to build straight away, as it'll get triggered again as soon as B and C are built. So A should block until both B and C are built. Freestyle allows doing this and it works very well; I don't understand why this is not possible for pipeline projects. We have exactly the situation Dario described, with about 60 interdependent jobs. The excessive triggering of downstream jobs makes pipeline jobs practically unusable for our builds.
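For reference, the freestyle behaviour being described is the "Block build when upstream project is building" option. Roughly, it ends up in the downstream job's config.xml like this — a sketch of core's options (element names from core's `ReverseBuildTrigger`), not a complete config:

```xml
<!-- $JENKINS_HOME/jobs/A/config.xml (freestyle project, excerpt) -->
<project>
  <!-- A waits in the queue while any of its upstream jobs is still building -->
  <blockBuildWhenUpstreamBuilding>true</blockBuildWhenUpstreamBuilding>
  <triggers>
    <!-- "Build after other projects are built": rebuild A after B and C -->
    <jenkins.triggers.ReverseBuildTrigger>
      <upstreamProjects>B, C</upstreamProjects>
      <threshold>
        <name>SUCCESS</name>
        <ordinal>0</ordinal>
        <color>BLUE</color>
      </threshold>
    </jenkins.triggers.ReverseBuildTrigger>
  </triggers>
</project>
```

With both set, A queued by B's completion stays blocked until C also finishes, so the queue collapses the duplicates into one build of A.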
In Dario's example, A is triggered 3 times instead of once. The total build time is about 6 hours with pipeline jobs instead of less than 1 hour with freestyle jobs, because of the missing blocking feature. Same here; we sometimes end up in a situation where there are 20 projects in the queue, most of which are duplicates. I think I'm going to work on a fix for this in the next few weeks and report back here. That person is me, haha! But thanks for the heads up, and apologies for the confusion!
In that case I should apologize as well: StuporMundi is my nickname. Thanks a lot for looking into this!!! Hi Andras Gaal, unfortunately I haven't yet found time to work on this. Obviously, if anyone else wants to work on this, please feel free and let people know here.
Generally, this part of core has always been rather fragile and should arguably be deprecated en masse in favor of some plugin with a fresh design. Just a heads up for people who also have the issue that too many builds are triggered when using pipelines: we implemented something similar to what the old Maven project type does in the Pipeline Maven Plugin, as part of JENKINS. You will need to set up your pipelines to use the withMaven step in conjunction with its PipelineGraphPublisher feature to trigger pipeline builds when a snapshot dependency has been built.
Then your pipelines will trigger each other, but the plugin will notice when a job will be triggered later in the dependency chain, so the scenario described by Dario Simonetti won't happen anymore. Of course, this is not a general solution, since it only applies to Maven and not to other build tools such as Gradle, but it might be useful for people who are still using the Maven project type and are looking into migrating to pipeline.
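A minimal sketch of the setup described above. The `withMaven` step name is from the Pipeline Maven Plugin itself, but the stage layout and the `mvn` goals are assumptions:

```groovy
// Hedged Jenkinsfile sketch using the Pipeline Maven Plugin.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // withMaven sets up the Maven environment; its
                // PipelineGraphPublisher records which snapshot artifacts
                // this build produces and consumes, so downstream pipelines
                // can be triggered when a snapshot dependency is rebuilt.
                withMaven() {
                    sh 'mvn clean deploy'
                }
            }
        }
    }
}
```

The dependency-graph-based triggering then happens plugin-side; no explicit `build` steps between the pipelines are needed.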
This would be useful for us as well. I'm wondering if this could be integrated in the trigger part. Would this be possible? I can even try to provide a patch if someone can point me in the right direction.
Where would such a change be placed? This is unlikely to be a simple patch. A major chunk of Jenkins core APIs would need to be refactored. I do not think it is worth doing anyway. Where does Jenkins store Downstream Projects information in the file system?
Asked 5 years, 8 months ago. Active 4 years, 7 months ago. Viewed 6k times.
As far as I know, there are 2 ways Jenkins can set up a downstream project. Thanks mainframer.
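To the actual question of where this lives on disk: core's "Build other projects" post-build action is saved in the upstream job's config.xml, under `$JENKINS_HOME/jobs/<jobname>/config.xml`. A sketch of the relevant excerpt — the element names come from core's `hudson.tasks.BuildTrigger`, while `upstream-job` and `downstream-job` are placeholder names:

```xml
<!-- $JENKINS_HOME/jobs/upstream-job/config.xml (excerpt) -->
<publishers>
  <hudson.tasks.BuildTrigger>
    <!-- comma-separated list of downstream job names -->
    <childProjects>downstream-job</childProjects>
    <!-- only trigger when this build's result meets the threshold -->
    <threshold>
      <name>SUCCESS</name>
      <ordinal>0</ordinal>
      <color>BLUE</color>
    </threshold>
  </hudson.tasks.BuildTrigger>
</publishers>
```

The other direction — the downstream job's "Build after other projects are built" trigger — is stored in the downstream job's own config.xml instead, as a `jenkins.triggers.ReverseBuildTrigger` entry under `<triggers>`.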