How can I get multiple Jenkins builds to work from one local git repo?

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA terms and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/9914664/

How can I get multiple Jenkins builds to work from one local git repo?

git jenkins

Asked by Malcolm Box

I have a large GitHub repo that contains several independently buildable parts. If I configure Jenkins with a job (or two) for each of these, I end up having to pull gigabytes of data multiple times (one clone of the repo for each job).

This takes both disk space and bandwidth.

What I'd like to do is have a "Refresh local repo" job that clones GitHub once, then configure each of the other jobs to clone from that local repo and build. Then, by setting up the sub-jobs as dependent builds, I can run "Refresh local repo", have it pull all the latest stuff from GitHub, and then have each of the builds run.
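
In case a concrete sketch helps, this is roughly what the "Refresh local repo" job's build step could look like as a freestyle job with an "Execute shell" step, using a bare mirror rather than a normal workspace clone; the mirror path and repository URL below are only placeholders:

    # Keep one local mirror of the big GitHub repo up to date.
    # The first run clones it; later runs only fetch what has changed.
    MIRROR=/var/lib/jenkins/mirrors/bigrepo.git        # hypothetical path
    if [ ! -d "$MIRROR" ]; then
        git clone --mirror https://github.com/example/bigrepo.git "$MIRROR"
    else
        git --git-dir="$MIRROR" fetch --prune
    fi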

So far I've got the "Refresh local repo" working - it clones successfully, and if I go to the workspace, I see that it has the HEAD commit of origin/master.

The problem is the other jobs - these don't seem to be picking up updates. Here's how I've got one of them configured:

Git
  Repository URL:    file:////Users/malcolmbox/.jenkins/jobs/Refresh Local repo/workspace
  Branches to build: master
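
As a quick diagnostic (just a sketch, using the workspace path from the question), it may be worth checking from a shell on the Jenkins machine which refs that local repository actually advertises to the dependent jobs:

    # List the refs the dependent jobs see when they fetch from the upstream
    # job's workspace; master should match origin/master on GitHub.
    git ls-remote "/Users/malcolmbox/.jenkins/jobs/Refresh Local repo/workspace"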

Instead of picking up the latest commit, it's stuck several days in the past.

How can I get it to pull the tip and do the right thing?

To clarify: the .../Refresh Local repo/workspace has commit 6b20268389064590147d5c73d2b6aceb6ba5fe70 submitted 28/3

The dependent build, after running a build (so presumably doing a git clone/pull step), is checked out at 79a25992cc192376522bcb634ee0f7eb3033fc7e, submitted 26/3 - so it's a couple of days behind.
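
A rough way to confirm which commit each workspace is actually sitting on (the dependent job's path below is hypothetical) is to ask git directly in each one:

    # Print the checked-out commit and its date in each workspace.
    (cd "/Users/malcolmbox/.jenkins/jobs/Refresh Local repo/workspace" && git log -1 --format='%H %ci')
    (cd "/Users/malcolmbox/.jenkins/jobs/Dependent job/workspace" && git log -1 --format='%H %ci')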

Answered by sti

If you open the job configuration and click on the Advanced button of the git SCM configuration, you will see a place to specify "Path of the reference repo to use during clone (optional)".

If you have a local clone of your repository, add the path to the reference repo field.

Git will then use the local clone, share most of the git objects on disk, and pull from GitHub only what is missing from the local clone, resulting in lightning-fast clones and saved disk space.
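
For what it's worth, this option appears to correspond to git's --reference flag; a hand-run equivalent (repository URL and paths below are only examples) would be something like:

    # Clone from GitHub, but borrow objects from an existing local clone
    # instead of downloading them all again.
    git clone --reference /var/lib/jenkins/mirrors/bigrepo.git \
        https://github.com/example/bigrepo.git workspace

One caveat: a clone made with --reference borrows objects from the reference repository, so that reference repository should not be deleted or aggressively pruned while clones that depend on it still exist.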

Or is this exactly how you have configured your job and it is not picking up latest commits? If that is so, please provide more details. Consider publishing your job configuration.

Answered by Lars Kotthoff

Have a look at the Clone Workspace plugin. You can either use that or configure a job to update a local repository from Github and then have all the other jobs pull from that local repo.

This won't help with the problem that the workspaces still need the disk space, but as far as I know there's no simple solution for that. You could have the build steps change into a shared directory outside the workspace, but that's hacky and might break other things. Alternatively, you could use a filesystem that provides deduplication.

Answered by joel truher

I had the same experience.

I have one job to pull the real remote repo, which is github.

Each of the other jobs (there are many) has a "Repository URL" like this:

file:///C:/Program Files (x86)/Jenkins/jobs/webtest-local-repo/workspace/.git

It clones fine, but subsequent fetches don't notice any changes.

The same issue presents in gitbash, so I guess this is a git issue, not a Jenkins issue.
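
For what it's worth, a rough way to try to reproduce this outside Jenkins (using the repository path from the job configuration above) would be something like the following in Git Bash:

    # Clone from the other job's workspace the same way Jenkins does.
    git clone "file:///C:/Program Files (x86)/Jenkins/jobs/webtest-local-repo/workspace/.git" testclone
    cd testclone
    # ...later, after the upstream job has pulled new commits from GitHub...
    git fetch origin
    git log -1 --oneline origin/master    # does this show the new commit?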

My horrific workaround was to make the dependent jobs delete their workspaces when finished building, so that every git operation is a "clone." It's ridiculous, but maybe less ridiculous than having a zillion jobs banging away at the same github repo.

ZOMG! That didn't work either, because while git could successfully clone the repo, Jenkins would remember the previous revision and build the very same one again. Perhaps it's related to this issue, I dunno, I'm pretty fed up. We gave up, and now all the jobs poll github again. Maybe I'll get a hook working instead.
