IDF CI

General Workflow

  1. Push to a remote branch.
  2. Create an MR and choose related labels (optional).
  3. A detached pipeline will be created.
  4. If you push a new commit, a new pipeline will be created automatically.

What if Expected Jobs ARE NOT Created?

  1. Check the file patterns.

    If you find a job that is not running as expected for certain file changes, a commit to improve the pattern would be appreciated.

  2. Add MR labels to run additional tests. Currently this is needed only for target-test jobs; please use labels as sparingly as possible. Our final goal is to remove all the labels and let the file changes decide everything!

MR labels for additional jobs

Supported MR Labels

  • build
  • build_docs
  • component_ut[_esp32/esp32s2/...]
  • custom_test[_esp32/esp32s2/...]
  • docker
  • docs
  • docs_full: triggers a full docs build, regardless of the files changed
  • example_test[_esp32/esp32s2/...]
  • fuzzer_test
  • host_test
  • integration_test
  • iperf_stress_test
  • macos
  • macos_test
  • nvs_coverage
  • submodule
  • windows

There are also two general labels (not recommended, since they trigger a lot of jobs):

  • target_test: includes all targets for example_test, custom_test, component_ut, integration_test
  • all_test: includes all test labels

How to Trigger a Detached Pipeline Without Pushing New Commits?

Go to the MR web page -> Pipelines tab -> click the Run pipeline button.

In very rare cases, this tab will not show up because no merge_request pipeline has been created before. In that case, use the web API:

curl -X POST --header "PRIVATE-TOKEN: [YOUR PERSONAL ACCESS TOKEN]" [GITLAB_SERVER]/api/v4/projects/103/merge_requests/[MERGE_REQUEST_IID]/pipelines

How to Develop With rules.yml?

General Concepts

  • pattern: Defined in an array. A GitLab job will be created if the changed files in this MR match one of the patterns. For example:

    .patterns-python-files: &patterns-python-files
      - "**/*.py"
    
  • label: Defined in an if clause, similar to the previous bot command. A GitLab job will be created if the pipeline variables contain a variable in BOT_LABEL_xxx format (DEPRECATED) or if the label is included in the MR labels. For example:

    .if-label-build_docs: &if-label-build_docs
      if: '$BOT_LABEL_BUILD_DOCS || $CI_MERGE_REQUEST_LABELS =~ /^(?:[^,\n\r]+,)*build_docs(?:,[^,\n\r]+)*$/i'
    
  • rule: A combination of various patterns and labels. It is used with the GitLab YAML extends keyword to tell GitLab under what conditions this job will be created. For example:

    .rules:build:docs:
      rules:
        - <<: *if-protected
        - <<: *if-label-build_docs
        - <<: *if-label-docs
        - <<: *if-dev-push
          changes: *patterns-docs
    

    An example of a GitLab job using extends:

    check_docs_lang_sync:
      extends:
        - .pre_check_template
        - .rules:build:docs
      script:
        - cd docs
        - ./check_lang_folder_sync.sh
    

How to Add a New Job?

Check if there's a suitable .rules:<rules-you-need> template.

  1. If there is, put it in the job's extends (extends can be an array or a string). All done, now you can close this window. A minimal sketch is shown after this list.
  2. If there isn't:
    1. Check How to Add a New Rules Template? and create a suitable one.
    2. Follow step 1.
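
Since extends can be either a string or an array, a job that only reuses the .rules:build:docs template shown above could look roughly like this (the job name and script are hypothetical):

my_docs_check:
  extends: .rules:build:docs   # extends as a string instead of an array
  script:
    - cd docs
    - ./run_my_check.sh        # hypothetical check script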

How to Add a New Rules Template?

Check if this rule is related to labels or patterns.

  1. If it is, please refer to dependencies/README.md and add new rules via auto-generation.
  2. If it isn't, please continue reading.

Check if there's a suitable .if-<if-anchor-you-need> anchor.

  1. If there is, create a rule following rules Template Naming Rules. For detailed information, please refer to the GitLab documentation on rules:if. Here's an example:

    .rules:patterns:python-files:
      rules:
        - <<: *if-protected
        - <<: *if-dev-push
          changes: *patterns-python-files
    
  2. If there isn't:

    1. Check How to Add a New if Anchor? and create a suitable one.
    2. Follow step 1.

How to Add a New if Anchor?

Create an if anchor following if Anchors Naming Rules. For detailed information about how to write the condition clause, please refer to the GitLab documentation on only/except (advanced). Here's an example:

.if-schedule: &if-schedule
  if: '$CI_PIPELINE_SOURCE == "schedule"'

Naming Rules

Common Naming Rules

If a phrase has multiple words, use _ to concatenate them.

e.g. regular_test

If a name has multiple phrases, use - to concatenate them.

e.g. regular_test-example_test

if Anchors Naming Rules

  • if it's a label: .if-label-<label_name>

  • if it's a ref: .if-ref-<ref_name>

  • if it's a branch: .if-branch-<branch_name>

  • if it's a tag: .if-tag-<tag_name>

  • if it's a multi-type combination: .if-ref-<release_name>-branch-<branch_name>

    Common Phrases/Abbreviations

    • no_label

      $BOT_TRIGGER_WITH_LABEL == null

    • protected (see the anchor sketch after this list)

      ($CI_COMMIT_REF_NAME == "master" || $CI_COMMIT_BRANCH =~ /^release\/v/ || $CI_COMMIT_TAG =~ /^v\d+\.\d+(\.\d+)?($|-)/)

    • target_test

      a combination of example_test, custom_test, component_ut, integration_test and all targets
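
    For example, a hedged sketch of how the protected condition above appears as an anchor (the actual definition lives in rules.yml):

      .if-protected: &if-protected
        if: '($CI_COMMIT_REF_NAME == "master" || $CI_COMMIT_BRANCH =~ /^release\/v/ || $CI_COMMIT_TAG =~ /^v\d+\.\d+(\.\d+)?($|-)/)'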

rules Template Naming Rules

  • if it's tag related: .rules:tag:<tag_1>-<tag_2>
  • if it's label related: .rules:labels:<label_1>-<label_2>
  • if it's test related: .rules:test:<test_type>
  • if it's build related: .rules:build:<build_type>
  • if it's pattern related: .rules:patterns:<patterns>

Reusable Shell Script tools/ci/utils.sh

It collects reusable shell snippets as small functions. If you want to set before_script: [] for your job, you can set extends: .before_script_slim instead; it will only run source tools/ci/utils.sh.

If you're developing CI shell scripts, you can use these functions without sourcing them, since they're already included in all before_script.

To run these commands in a shell script locally, place source tools/ci/utils.sh at the very beginning.
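
For example, a hedged sketch of a job that uses the slim before_script together with a couple of the helpers listed below (the job name and script contents are hypothetical):

my_ci_script_job:
  extends:
    - .before_script_slim                   # before_script only runs `source tools/ci/utils.sh`
  script:
    - info "running my custom step"         # green log helper from utils.sh
    - run_cmd python tools/ci/my_script.py  # run a (hypothetical) script and log its duration in seconds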

Functions

  • add_gitlab_ssh_keys
  • add_github_ssh_keys
  • add_doc_server_ssh_keys
  • fetch_submodules
  • get_all_submodules
  • error: log in red color
  • warning: log in orange color
  • info: log in green color
  • run_cmd: run the command and log its duration in seconds
  • retry_failed: run the command and log its duration in seconds, retrying when it fails

Manifest File to Control the Build/Test Apps

The .build-test-rules.yml file is a manifest that controls whether CI runs the build and test jobs for an app. The Supported Targets table in an app's README.md is auto-generated by pre-commit from the app's .build-test-rules.yml.

Grammar

We're using the latest version of idf-build-apps. Please refer to its documentation for the full manifest grammar.
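
As a rough illustration only (the app path, targets, and reasons below are made up; the authoritative grammar is in the idf-build-apps documentation):

examples/my_feature/my_app:
  enable:
    - if: IDF_TARGET in ["esp32", "esp32s2"]
      reason: the feature is only supported on these targets
  disable_test:
    - if: IDF_TARGET == "esp32s2"
      temporary: true
      reason: lack of runners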

Special Rules

In ESP-IDF CI, a few additional special rules are supported to disable the check app dependencies feature:

  • Add the MR label BUILD_AND_TEST_ALL_APPS
  • Run in protected branches

Upload/Download Artifacts to Internal Minio Server

Users Without Access to Minio

If you don't have access to the internal Minio server, you can still download the artifacts from the shared link in the job log.

The log will look like this:

Pipeline ID    : 587355
Job name       : build_clang_test_apps_esp32
Job ID         : 40272275
Created archive file: 40272275.zip, uploading as 587355/build_dir_without_map_and_elf_files/build_clang_test_apps_esp32/40272275.zip
Please download the archive file includes build_dir_without_map_and_elf_files from [INTERNAL_URL]

Users With Access to Minio

Env Vars for Minio

Minio takes these env vars to connect to the server:

  • IDF_S3_SERVER
  • IDF_S3_ACCESS_KEY
  • IDF_S3_SECRET_KEY
  • IDF_S3_BUCKET

Artifacts Types and File Patterns

The artifacts types and corresponding file patterns are defined in tools/ci/artifacts_handler.py, inside ArtifactType and TYPE_PATTERNS_DICT.

Upload

python tools/ci/artifacts_handler.py upload

will upload the files that match the file patterns to the Minio object storage with the name:

<pipeline_id>/<artifact_type>/<job_name>/<job_id>.zip

For example, job 39043328 will upload these four files:

  • 575500/map_and_elf_files/build_pytest_examples_esp32/39043328.zip
  • 575500/build_dir_without_map_and_elf_files/build_pytest_examples_esp32/39043328.zip
  • 575500/logs/build_pytest_examples_esp32/39043328.zip
  • 575500/size_reports/build_pytest_examples_esp32/39043328.zip

Download

You may run

python tools/ci/artifacts_handler.py download --pipeline_id <pipeline_id>

to download all files of the pipeline, or

python tools/ci/artifacts_handler.py download --pipeline_id <pipeline_id> --job_name <job_name_or_pattern>

to download all files with the specified job name or pattern, or

python tools/ci/artifacts_handler.py download --pipeline_id <pipeline_id> --job_name <job_name_or_pattern> --type <artifact_type> <artifact_type> ...

to download all files with the specified job name or pattern and artifact type(s).

You may check the detailed documentation with python tools/ci/artifacts_handler.py download -h