FOSS CI/CD with GitHub Actions

If you are a passionate open-source developer, you are probably not only contributing to existing free and open source software (FOSS) projects started by others, but have already started your own. If not, the time will come when you will start one.

A bunch of activities arise around such a project, from pure project management and release planning to managing the source code and publishing the deliverable artifacts. In the last decade, the process of delivering the artifacts from the source code has usually been performed automatically through a delivery pipeline: a set of tools integrated into a single chain responsible for continuous delivery (CD). On the other hand, if your software project is not running anywhere but is a library used by others, your delivery pipeline aims to publish an artifact into an artifact repository.

There are a number of source code hosting services available on the Internet, but I'll focus on github.com, being the most popular. In the following article, I'll summarize the steps needed to set up a continuous integration/delivery (CI/CD) pipeline using GitHub Actions to build a Java library and publish it to the Sonatype Maven Central repository, one of the largest public repositories for Java artifacts.

For demonstration purposes, I created a sample reference project on GitHub: https://github.com/toolisticon/foss-github-actions-java

Build pipeline

First of all, let us set up a small build pipeline. Its purpose is to build the software, run all tests, and run any kind of code analysis on every change committed by a developer. I assume you are using Apache Maven as a build tool and will focus on that (I'm aware of Gradle, but this article is about old-school Maven).

The steps you want to invoke are:

  • compile production code
  • compile test code
  • run unit tests

Apache Maven provides so-called Maven plugins for all those steps. Even better, since those steps are very common, you don't even need to declare the usage of these plugins in your pom.xml.

Writing unit tests for the relevant places in code is an advanced skill. The so-called unit test coverage can be an indicator that helps to find untested code. In this tutorial, I'm using the free library JaCoCo, which has become the de facto standard free solution for coverage metering. JaCoCo is shipped as a Maven plugin; it instruments the code before execution and creates a report after the test run, indicating which places in the code have been invoked during the tests.

Here is how the setup of the plugin works:

JaCoCo configuration to prepare the agent
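A minimal sketch of such a configuration; the plugin version and the execution ids are assumptions, while prepare-agent and report are the standard JaCoCo goals:

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.5</version>
  <executions>
    <!-- Prepares the JaCoCo agent and stores its JVM arguments in the argLine property -->
    <execution>
      <id>prepare-agent</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <!-- Creates the coverage report after the tests have run -->
    <execution>
      <id>report</id>
      <phase>test</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>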

By default, JaCoCo creates the binding to the Java agent for the unit test run and stores it in the argLine property. To use it during the run of the JUnit tests, you need to reference it in the configuration of the Surefire plugin:

Surefire configuration to meter test coverage
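A sketch of the corresponding Surefire configuration; the version and the extra JVM option are assumptions, the important part is the late-evaluated @{argLine} placeholder filled by JaCoCo:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.22.2</version>
  <configuration>
    <!-- @{argLine} is resolved at execution time and picks up the JaCoCo agent -->
    <argLine>@{argLine} -Xmx512m</argLine>
  </configuration>
</plugin>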

Apache Maven itself is required to be present on the build machine, and ideally it should be a pre-defined version in order to produce repeatable results. For this purpose, the so-called Maven Wrapper can be used, a small tool for jump-starting the correct version of Maven inside your project.
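If the wrapper is not yet part of your project, it can be generated once from a locally installed Maven. A hedged example, assuming the Takari wrapper plugin and a pinned Maven version:

mvn -N io.takari:maven:wrapper -Dmaven=3.6.3

The generated mvnw and mvnw.cmd scripts and the .mvn/wrapper directory are then committed to the repository.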

If you already have a recent version of Java on your machine, you only need to run ./mvnw clean verify from your command line. GitHub Actions runs the build inside a Docker container, so we will need to prepare it first.

Usually, your library depends on other libraries and frameworks which need to be available for compilation from an artifact repository. In large projects we often joke about “downloading the Internet”, since the total time consumed by this process becomes a problem. To mitigate it, Apache Maven uses a local file cache on the build node. If the build node is set up from scratch every time (a fresh Docker container), we cannot directly benefit from this cache. Luckily, there is a GitHub Action for saving and restoring files used as a cache to and from the created build node.
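A minimal sketch of such a step, assuming the official actions/cache action and the default local repository location under ~/.m2:

- name: Cache Maven repository
  uses: actions/cache@v2
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
    restore-keys: ${{ runner.os }}-m2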

Since we are collecting test coverage metrics, it is a good idea to track them along the development history of the project. There are many tools on the Internet offering free access for FOSS projects; I'll demonstrate the use of the CodeCov platform.

Now that we have discussed all required ingredients, the first build pipeline can be created. The steps to be executed are:

  • checkout
  • setup JDK
  • restore cache
  • prepare Maven Wrapper
  • run build
  • upload CodeCov metrics

The steps look as follows using the GitHub Actions pipeline syntax:

Default build pipeline steps
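As a rough sketch (action versions, the JDK version and the job skeleton are assumptions), the build job could look like this:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      # Checkout the sources
      - uses: actions/checkout@v2

      # Setup JDK
      - name: Set up JDK
        uses: actions/setup-java@v1
        with:
          java-version: 11

      # Restore the local Maven repository cache
      - name: Cache Maven repository
        uses: actions/cache@v2
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
          restore-keys: ${{ runner.os }}-m2

      # Prepare Maven Wrapper
      - name: Make Maven Wrapper executable
        run: chmod +x ./mvnw

      # Run build
      - name: Build with Maven
        run: ./mvnw clean verify -B

      # Upload coverage metrics
      - name: Upload coverage to CodeCov
        uses: codecov/codecov-action@v1
        with:
          token: ${{ secrets.CODECOV_TOKEN }}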

We want to make sure that the build pipeline runs on every commit on every branch. To do so, add the following trigger:

on:
  push:
    branches:
      - '*'
      - '**/*'

Secrets in a public repository

You probably noticed the expression ${{secrets.CODECOV_TOKEN}} in the steps listing above. GitHub Actions provides a convenient mechanism to store secrets (like credentials, access codes and others) and use them inside the build scripts and pipelines. Much effort is spent on making sure that a secret remains a secret and can't be revealed by a user. I don't want to focus on how safe they are; my assumption here is that they are safe enough, as long as you control the code base itself (for example, by reviewing pull requests from others).

A secret has a publicly visible name and a hidden value, which can be entered once and not be viewed again later. There are two scopes of secrets, per repository and per organization, and the former overrides the latter if the name of the secret is the same.

As you will see later, several credentials are required during the release pipeline and all of them will be stored in GitHub Secrets.

Sonatype OSS Requirements

After the creation of a build pipeline, let us collect all requirements Sonatype imposes for publication to the Maven Central repository.

  • You need a Sonatype account
  • Your account needs permissions to publish using the specified Group Id
  • For every Java artifact (jar file), there must be sources and javadoc files
  • Every artifact must be supplied with an additional signature file (this includes pom.xml, any binary jar, sources jar and javadoc jar)
  • There are additional requirements on the pom.xml (artifact name, artifact description, url, scm, distribution management, license), as sketched below
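A hedged sketch of the corresponding pom.xml metadata, with placeholder values (the license, URLs and the ossrh server id are illustrative and must match your project and your settings.xml):

<name>foss-github-actions-java</name>
<description>Sample Java library released via GitHub Actions</description>
<url>https://github.com/toolisticon/foss-github-actions-java</url>

<licenses>
  <license>
    <name>Apache License, Version 2.0</name>
    <url>https://www.apache.org/licenses/LICENSE-2.0</url>
  </license>
</licenses>

<scm>
  <connection>scm:git:git@github.com:toolisticon/foss-github-actions-java.git</connection>
  <url>https://github.com/toolisticon/foss-github-actions-java</url>
</scm>

<distributionManagement>
  <repository>
    <id>ossrh</id>
    <url>https://oss.sonatype.org/service/local/staging/deploy/maven2/</url>
  </repository>
  <snapshotRepository>
    <id>ossrh</id>
    <url>https://oss.sonatype.org/content/repositories/snapshots</url>
  </snapshotRepository>
</distributionManagement>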

That's it, sounds easy, right? In the following sections I'll explain how to set all of this up.

Sonatype account and Group Id permissions

As described in the OSSRH Guide, you need to create a JIRA account and file a ticket to get permission to publish under a certain group id. I usually do the verification of the group id by creating a DNS TXT record in the domain used. This works very fast, and you should get a response from Joel (the Sonatype employee) almost immediately.

If you already have permissions to publish the artifact, you will need to configure Apache Maven to use your credentials. To do so, the pom.xml must declare a repository in the distribution management section, specifying the server id of the repository to use, and the credentials for this server must be configured in the settings.xml used during the build. There is a GitHub Action for doing so:

Setup settings.xml with credentials for the server OSSRH
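This is the same step that appears in the release pipeline later in this article; the server id ossrh must match the id used in the distribution management of the pom.xml:

- name: Create settings.xml
  uses: s4u/maven-settings-action@v1
  with:
    servers: '[{"id": "ossrh", "username": "${{secrets.SONATYPE_USERNAME}}", "password": "${{secrets.SONATYPE_PASSWORD}}"}]'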

I've seen examples of configuring settings.xml during the setup of the JDK, but I never got that running because of Maven security, so I stick to the example above in my projects.

In order to publish an artifact to a Maven repository, Maven provides a default deploy plugin (enabled by default). I'm not using it, but rely on the Sonatype release plugin instead, because of the better automation of the deployment and the release process. Here is the configuration in pom.xml:

Disabled deploy plugin and nexus-staging plugin used instead
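A sketch of both plugin definitions; the plugin version and the autoReleaseAfterClose flag are assumptions, while the serverId must match the ossrh entry from settings.xml:

<!-- Skip the default deploy plugin -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-deploy-plugin</artifactId>
  <configuration>
    <skip>true</skip>
  </configuration>
</plugin>

<!-- Deploy via the Sonatype staging workflow instead -->
<plugin>
  <groupId>org.sonatype.plugins</groupId>
  <artifactId>nexus-staging-maven-plugin</artifactId>
  <version>1.6.8</version>
  <extensions>true</extensions>
  <configuration>
    <serverId>ossrh</serverId>
    <nexusUrl>https://oss.sonatype.org/</nexusUrl>
    <!-- Close and release the staging repository automatically after a successful upload -->
    <autoReleaseAfterClose>true</autoReleaseAfterClose>
  </configuration>
</plugin>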

Create JavaDoc and Sources archives

As we use Apache Maven as the build tool, the corresponding Maven plugins are responsible for creating the JavaDoc and sources archives. Here is the relevant part of the configuration in your pom.xml:

JavaDoc and Sources plugin configurations
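A sketch of the two plugin definitions (versions are assumptions); both attach an additional archive to the build:

<!-- Attach a -sources.jar to the build -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-source-plugin</artifactId>
  <version>3.2.1</version>
  <executions>
    <execution>
      <id>attach-sources</id>
      <goals>
        <goal>jar-no-fork</goal>
      </goals>
    </execution>
  </executions>
</plugin>

<!-- Attach a -javadoc.jar to the build -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>3.2.0</version>
  <executions>
    <execution>
      <id>attach-javadoc</id>
      <goals>
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>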

Signing artifacts

Signing of artifacts is performed using the GPG (GPG2) tool. Again, a special Maven plugin is used to integrate the signing into your build process, but in contrast to other activities, it just passes the configuration to the native program. This means that GPG must be installed on your machine (standard on *nix and macOS) to test the signing locally. For the run in GitHub Actions, GPG is already installed in the Docker container.

In order to sign the artifacts, GPG requires your private key. Your public key should be publicly available on a PGP key server, so everyone can verify that the artifact has been created by you. I won't spend time discussing how to generate a private/public key pair and how to distribute the public key; I just assume you have managed to generate a key pair and that your key is protected by a passphrase.

Here are the options of the corresponding Maven plugin:

        <!-- To sign the artifacts -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-gpg-plugin</artifactId>
          <version>1.6</version>
          <configuration>
            <gpgArguments>
              <arg>--batch</arg>
              <arg>--yes</arg>
              <arg>--pinentry-mode</arg>
              <arg>loopback</arg>
            </gpgArguments>
          </configuration>
          <executions>
            <execution>
              <id>sign-artifacts</id>
              <phase>verify</phase>
              <goals>
                <goal>sign</goal>
              </goals>
            </execution>
          </executions>
        </plugin>

If you now run your build (remember the batch mode), the build will fail, since it requires gpg.keyname and gpg.passphrase to be set. If you supply the required parameters, the plugin will invoke GPG, which will then try to find the key by the given name. This works locally, but it is challenging to achieve in a freshly created Docker container.

The idea is to import the key(s) from a secret prior to invoking Maven during the build of the artifact. First, let us export the key from your local machine. Since the key is provided in a binary format, we need to encode it using base64 encoding. In addition, GPG will reject a key generated on another machine and will accept it only if the so-called owner trust is present as well.

To export the owner trust, run:

gpg --export-ownertrust | base64

To export the key with the name A1B2C3D4, run:

gpg --export-secret-keys A1B2C3D4 | base64

Both commands produce base64-encoded strings which you need to store in your repository secrets (GPG_OWNERTRUST and GPG_SECRET_KEYS). Make sure the strings don't contain newlines or line feeds produced by your console (join them manually into a single-line string).

Now let's see how GPG can be prepared for signing:

- name: Import GPG Owner Trust
  run: echo ${{secrets.GPG_OWNERTRUST}} | base64 --decode | gpg --import-ownertrust

- name: Import GPG key
  run: echo ${{secrets.GPG_SECRET_KEYS}} | base64 --decode | gpg --import --no-tty --batch --yes

After the execution of these steps, GPG in the GitHub Actions container will have the key for signing the artifact. The key name and passphrase are passed as command line parameters.

Move it to a profile

We have now defined a bunch of plugins which will be executed during the artifact release, but which drastically slow down the execution of a standard build locally (or in the build pipeline). To avoid this waste of time, we should consider moving all the release-relevant plugins into a Maven profile (or moving the definitions into a pluginManagement section and referencing them from the profile only). Here is what the default build block looks like:

<plugins>
  <plugin>
    <groupId>org.jacoco</groupId>
    <artifactId>jacoco-maven-plugin</artifactId>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
  </plugin>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
  </plugin>
</plugins>

and here is the block relevant for the publication, defined in a separate profile and executed in addition to the default build block:

<profiles>
    <!--
      Profile creating all artifacts: JARs, POMs, Sources, JavaDoc and all signatures.
    -->
    <profile>
      <id>release</id>
      <activation>
        <property>
          <name>release</name>
        </property>
      </activation>
      <build>
        <plugins>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-javadoc-plugin</artifactId>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-source-plugin</artifactId>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-gpg-plugin</artifactId>
          </plugin>
          <plugin>
            <groupId>org.sonatype.plugins</groupId>
            <artifactId>nexus-staging-maven-plugin</artifactId>
          </plugin>
        </plugins>
      </build>
    </profile>
</profiles>

Putting it all together

Here are the relevant GitHub Actions steps to release to Maven Central:

- name: Import GPG Owner Trust
  run: echo ${{secrets.GPG_OWNERTRUST}} | base64 --decode | gpg --import-ownertrust

- name: Import GPG key
  run: echo ${{secrets.GPG_SECRET_KEYS}} | base64 --decode | gpg --import --no-tty --batch --yes

- name: Clean settings.xml
  run: rm -rf ~/.m2/settings.xml

- name: Create settings.xml
  uses: s4u/maven-settings-action@v1
  with:
    servers: '[{"id": "ossrh", "username": "${{secrets.SONATYPE_USERNAME}}", "password": "${{secrets.SONATYPE_PASSWORD}}"}]'

- name: Deploy a new version to central
  run: ./mvnw clean deploy -B -DskipTests -DskipExamples -Prelease -Dgpg.keyname=${{secrets.GPG_KEYNAME}} -Dgpg.passphrase=${{secrets.GPG_PASSPHRASE}}

Please note that we use -DskipTests and -DskipExamples during the build and pass the -Prelease profile along with the credentials for GPG signing. The target phase is deploy, in order to activate the publication plugins.

We can skip the tests, because the build including the tests has just passed (we release right after a build, so the beginning of the release pipeline is a copy of your build pipeline).

I usually skip example projects, because I don't want to include them in the publication. That is why the example Maven module is not included directly in the parent POM, but inside a Maven profile, which can be deactivated by the flag above.

Finally, the release profile activates all required plugins, and the key name and passphrase used for signing are stored in secrets.

Some words on code / release management

After getting all of this technically working, the question arises of how to trigger the release and how the versioning of the source code plays together with release management.

There are different approaches to this, but I like to use gitflow as a git branching model. In doing so, features are developed on feature branches and merged via pull requests (reviewed by other developers) into the develop branch. Finally, if a release needs to be created, the changes go (via a release branch) to the master branch. This way, every commit to the master branch can be used as a trigger for the creation of a release and is manifested in the artifact version.

Of course, you can execute the steps of gitflow manually (with Git and Maven versioning), but there is a nice plugin called gitflow-maven-plugin which automates these steps for you. Its configuration is pretty straightforward:

        <!-- gitflow -->
        <plugin>
          <groupId>com.amashchenko.maven.plugin</groupId>
          <artifactId>gitflow-maven-plugin</artifactId>
          <version>1.14.0</version>
          <configuration>
            <gitFlowConfig>
              <productionBranch>master</productionBranch>
              <developmentBranch>develop</developmentBranch>
              <featureBranchPrefix>feature/</featureBranchPrefix>
              <releaseBranchPrefix>release/</releaseBranchPrefix>
              <hotfixBranchPrefix>hotfix/</hotfixBranchPrefix>
              <supportBranchPrefix>support/</supportBranchPrefix>
              <origin>origin</origin>
            </gitFlowConfig>
            <useSnapshotInHotfix>true</useSnapshotInHotfix>
            <useSnapshotInRelease>true</useSnapshotInRelease>
            <keepBranch>false</keepBranch>
            <pushRemote>true</pushRemote>
          </configuration>
        </plugin>

To create a new release, update the version in the pom and finally merge everything to master, you will need to run the following command:

./mvnw -B gitflow:release-start gitflow:release-finish

You can do this manually from your developer machine, or even go a step further and trigger it from an additional GitHub Action (for example, if a release is announced and published from a milestone via the GitHub user interface).
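As a sketch of such an additional workflow (everything below is an assumption and not part of the reference project), a manually triggered workflow_dispatch job could run the gitflow goals. Note that a push performed with the default GITHUB_TOKEN does not trigger other workflows, so a personal access token would be needed for the resulting push to master to start the release pipeline:

name: Start release
on:
  workflow_dispatch:

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          # Full history and both branches are needed by the gitflow plugin
          ref: develop
          fetch-depth: 0
          # Hypothetical personal access token stored as a secret, so the push triggers the release pipeline
          token: ${{ secrets.RELEASE_TOKEN }}

      - name: Set up JDK
        uses: actions/setup-java@v1
        with:
          java-version: 11

      - name: Configure git user
        run: |
          git config user.name "github-actions"
          git config user.email "github-actions@users.noreply.github.com"

      - name: Run gitflow release
        run: ./mvnw -B gitflow:release-start gitflow:release-finish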

In the end, the plugin will push to the master branch of your repository, so you can use this as the trigger inside the release pipeline:

on:
  push:
    branches:
      - master

The trigger will first build your software, then create all artifacts and finally upload them to Sonatype Nexus (OSSRH). If the upload was successful and all checks executed by Sonatype pass, the staging repository will be closed and all artifacts released to the release repository automatically.

The last steps of the output of your GitHub Action should look like this:

[INFO]  * Upload of locally staged artifacts finished.
[INFO] * Closing staging repository with ID "iotoolisticon-1070".
Waiting for operation to complete...
..........
[INFO] Remote staged 1 repositories, finished with success.
[INFO] Remote staging repositories are being released...
Waiting for operation to complete...
..........
[INFO] Remote staging repositories released.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for github-actions-java-parent 0.0.2:
[INFO]
[INFO] github-actions-java-parent ......................... SUCCESS
[INFO] github-actions-java ................................ SUCCESS
[INFO] -------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] -------------------------------------------------------------
[INFO] Total time: 01:31 min
[INFO] Finished at: 2020-08-13T15:25:10Z
[INFO] -------------------------------------------------------------

Please note that it can take several hours until the artifact is accessible and searchable (and findable in the artifact index). If any errors occur during the publication, have a look at the Sonatype Nexus report displayed in the Staging Repositories section.

Translated from: https://medium.com/@zambrovski/foss-ci-cd-with-github-actions-c65c37236c19
