feat(pingcap/tidb): support ci for release-8.2 by wuhuizuo · Pull Request #3011 · PingCAP-QE/ci · GitHub

feat(pingcap/tidb): support ci for release-8.2 #3011

Merged

ti-chi-bot merged 4 commits into main from feature/support-release-8.2 on Jun 25, 2024

Conversation

wuhuizuo (Collaborator) commented Jun 25, 2024

User description

Signed-off-by: wuhuizuo <wuhuizuo@126.com>


PR Type

enhancement, tests


Description

  • Added multiple Jenkins pipeline scripts for various tests, including SQL logic, additional checks, JDBC integration, common integration, DDL integration, MySQL integration, BR integration, Lightning integration, Python ORM integration, Node.js integration, MySQL client, code checks, unit tests, TiFlash tests, common tests, coprocessor integration, and end-to-end tests.
  • Configured Kubernetes agents, environment variables, and stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test parameters, directories, and stores (see the pipeline sketch after this list).
  • Added Jenkins job DSL scripts for the corresponding pipelines, configuring job parameters, properties, and SCM settings.
  • Added a Jenkins folder configuration for the release-8.2 pipelines.
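
All of the pipelines in this PR share the same declarative shape. As a rough orientation (this is a sketch, not code copied from the PR: the namespace, pod template path, axis values, and run-tests.sh script are illustrative placeholders), a Kubernetes-agent pipeline with a test matrix looks roughly like this:

    // Illustrative sketch only; K8S_NAMESPACE, POD_TEMPLATE_FILE, the axis values,
    // and run-tests.sh are placeholders, not taken from the release-8.2 pipelines.
    pipeline {
        agent {
            kubernetes {
                namespace K8S_NAMESPACE        // Kubernetes namespace the test pods run in
                yamlFile POD_TEMPLATE_FILE     // pod spec checked into the repo
                defaultContainer 'golang'
            }
        }
        environment {
            FILE_SERVER_URL = 'http://fileserver.pingcap.net'
        }
        options { timeout(time: 60, unit: 'MINUTES') }
        stages {
            stage('Checkout') {
                // The real jobs check out refs through the shared prow library
                // (prow.checkoutRefs(REFS)); a plain `checkout scm` stands in here.
                steps { checkout scm }
            }
            stage('Tests') {
                matrix {
                    axes {
                        axis {
                            name 'TEST_STORE'
                            values 'unistore', 'tikv'
                        }
                        axis {
                            name 'TEST_PART'
                            values '1', '2', '3', '4'
                        }
                    }
                    stages {
                        stage('Test') {
                            options { timeout(time: 40, unit: 'MINUTES') }
                            steps {
                                // One matrix cell per (TEST_STORE, TEST_PART) combination;
                                // axis values are exported to the shell as environment variables.
                                sh './run-tests.sh --store ${TEST_STORE} --part ${TEST_PART}'
                            }
                        }
                    }
                }
            }
        }
    }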

Changes walkthrough 📝

Relevant files
Enhancement
35 files
pull_sqllogic_test.groovy
Add Jenkins pipeline for SQL logic tests                                 

pipelines/pingcap/tidb/release-8.2/pull_sqllogic_test.groovy

  • Added a new Jenkins pipeline script for SQL logic tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test paths and cache
    settings.
  • +199/-0 
    pull_check2.groovy
    Add Jenkins pipeline for additional checks                             

    pipelines/pingcap/tidb/release-8.2/pull_check2.groovy

  • Added a new Jenkins pipeline script for additional checks.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and checks.
  • Implemented matrix testing for various script and argument
    combinations.
  • +178/-0 
    pull_integration_jdbc_test.groovy
    Add Jenkins pipeline for JDBC integration tests                   

    pipelines/pingcap/tidb/release-8.2/pull_integration_jdbc_test.groovy

  • Added a new Jenkins pipeline script for JDBC integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different JDBC test parameters and
    stores.
  • +170/-0 
    pull_build.groovy
    Add Jenkins pipeline for project build                                     

    pipelines/pingcap/tidb/release-8.2/pull_build.groovy

  • Added a new Jenkins pipeline script for building the project.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, and building.
  • Implemented post-build steps for artifact upload and archiving.
  • +148/-0 
    pull_integration_common_test.groovy
    Add Jenkins pipeline for common integration tests               

    pipelines/pingcap/tidb/release-8.2/pull_integration_common_test.groovy

  • Added a new Jenkins pipeline script for common integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test directories and stores.
  • +160/-0 
    pull_integration_ddl_test.groovy
    Add Jenkins pipeline for DDL integration tests                     

    pipelines/pingcap/tidb/release-8.2/pull_integration_ddl_test.groovy

  • Added a new Jenkins pipeline script for DDL integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different DDL test cases.
  • +156/-0 
    pull_integration_mysql_test.groovy
    Add Jenkins pipeline for MySQL integration tests                 

    pipelines/pingcap/tidb/release-8.2/pull_integration_mysql_test.groovy

  • Added a new Jenkins pipeline script for MySQL integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test parts and stores.
  • +152/-0 
    pull_integration_br_test.groovy
    Add Jenkins pipeline for BR integration tests                       

    pipelines/pingcap/tidb/release-8.2/pull_integration_br_test.groovy

  • Added a new Jenkins pipeline script for BR integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test groups.
  • +153/-0 
    pull_integration_lightning_test.groovy
    Add Jenkins pipeline for Lightning integration tests         

    pipelines/pingcap/tidb/release-8.2/pull_integration_lightning_test.groovy

  • Added a new Jenkins pipeline script for Lightning integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test groups.
  • +151/-0 
    pull_integration_python_orm_test.groovy
    Add Jenkins pipeline for Python ORM integration tests       

    pipelines/pingcap/tidb/release-8.2/pull_integration_python_orm_test.groovy

  • Added a new Jenkins pipeline script for Python ORM integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different Python ORM test parameters
    and stores.
  • +150/-0 
    pull_integration_nodejs_test.groovy
    Add Jenkins pipeline for Node.js integration tests             

    pipelines/pingcap/tidb/release-8.2/pull_integration_nodejs_test.groovy

  • Added a new Jenkins pipeline script for Node.js integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different Node.js test directories.
  • +146/-0 
    pull_mysql_test.groovy
    Add Jenkins pipeline for MySQL tests                                         

    pipelines/pingcap/tidb/release-8.2/pull_mysql_test.groovy

  • Added a new Jenkins pipeline script for MySQL tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test parts.
  • +139/-0 
    pull_unit_test.groovy
    Add Jenkins pipeline for unit tests                                           

    pipelines/pingcap/tidb/release-8.2/pull_unit_test.groovy

  • Added a new Jenkins pipeline script for unit tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, and testing.
  • Implemented post-test steps for coverage upload and artifact
    archiving.
  • +126/-0 
    pull_tiflash_test.groovy
    Add Jenkins pipeline for TiFlash tests                                     

    pipelines/pingcap/tidb/release-8.2/pull_tiflash_test.groovy

  • Added a new Jenkins pipeline script for TiFlash tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented post-test steps for log collection and artifact archiving.

  • +121/-0 
    pull_common_test.groovy
    Add Jenkins pipeline for common tests                                       

    pipelines/pingcap/tidb/release-8.2/pull_common_test.groovy

  • Added a new Jenkins pipeline script for common tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented matrix testing for different test directories and
    commands.
  • +127/-0 
    pull_integration_copr_test.groovy
    Add Jenkins pipeline for coprocessor integration tests     

    pipelines/pingcap/tidb/release-8.2/pull_integration_copr_test.groovy

  • Added a new Jenkins pipeline script for coprocessor integration tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • +110/-0 
    pull_e2e_test.groovy
    Add Jenkins pipeline for end-to-end tests                               

    pipelines/pingcap/tidb/release-8.2/pull_e2e_test.groovy

  • Added a new Jenkins pipeline script for end-to-end tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • Implemented post-test steps for log collection and artifact archiving.

  • +103/-0 
    pull_mysql_client_test.groovy
    Add Jenkins pipeline for MySQL client tests                           

    pipelines/pingcap/tidb/release-8.2/pull_mysql_client_test.groovy

  • Added a new Jenkins pipeline script for MySQL client tests.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, preparation, and testing.
  • +96/-0   
    pull_check.groovy
    Add Jenkins pipeline for code checks                                         

    pipelines/pingcap/tidb/release-8.2/pull_check.groovy

  • Added a new Jenkins pipeline script for code checks.
  • Configured Kubernetes agents and environment variables.
  • Defined stages for debugging, checkout, and checks.
  • Implemented post-check steps for coverage upload and artifact
    archiving.
  • +96/-0   
    pull_build.groovy
    Add Jenkins job DSL for build pipeline                                     

    jobs/pingcap/tidb/release-8.2/pull_build.groovy

  • Added a new Jenkins job DSL script for the build pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +39/-0   
    pull_check2.groovy
    Add Jenkins job DSL for additional checks pipeline             

    jobs/pingcap/tidb/release-8.2/pull_check2.groovy

  • Added a new Jenkins job DSL script for the additional checks pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +38/-0   
    pull_integration_python_orm_test.groovy
    Add Jenkins job DSL for Python ORM integration tests pipeline

    jobs/pingcap/tidb/release-8.2/pull_integration_python_orm_test.groovy

  • Added a new Jenkins job DSL script for Python ORM integration tests
    pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_mysql_test.groovy
    Add Jenkins job DSL for MySQL tests pipeline                         

    jobs/pingcap/tidb/release-8.2/pull_mysql_test.groovy

  • Added a new Jenkins job DSL script for MySQL tests pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +38/-0   
    pull_unit_test.groovy
    Add Jenkins job DSL for unit tests pipeline                           

    jobs/pingcap/tidb/release-8.2/pull_unit_test.groovy

  • Added a new Jenkins job DSL script for unit tests pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +38/-0   
    pull_check.groovy
    Add Jenkins job DSL for code checks pipeline                         

    jobs/pingcap/tidb/release-8.2/pull_check.groovy

  • Added a new Jenkins job DSL script for code checks pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +38/-0   
    pull_e2e_test.groovy
    Add Jenkins job DSL for end-to-end tests pipeline               

    jobs/pingcap/tidb/release-8.2/pull_e2e_test.groovy

  • Added a new Jenkins job DSL script for end-to-end tests pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_copr_test.groovy
    Add Jenkins job DSL for coprocessor integration tests pipeline

    jobs/pingcap/tidb/release-8.2/pull_integration_copr_test.groovy

  • Added a new Jenkins job DSL script for coprocessor integration tests
    pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_ddl_test.groovy
    Add Jenkins job DSL for DDL integration tests pipeline     

    jobs/pingcap/tidb/release-8.2/pull_integration_ddl_test.groovy

  • Added a new Jenkins job DSL script for DDL integration tests pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_jdbc_test.groovy
    Add Jenkins job DSL for JDBC integration tests pipeline   

    jobs/pingcap/tidb/release-8.2/pull_integration_jdbc_test.groovy

  • Added a new Jenkins job DSL script for JDBC integration tests
    pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_lightning_test.groovy
    Add Jenkins job DSL for Lightning integration tests pipeline

    jobs/pingcap/tidb/release-8.2/pull_integration_lightning_test.groovy

  • Added a new Jenkins job DSL script for Lightning integration tests
    pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_nodejs_test.groovy
    Add Jenkins job DSL for Node.js integration tests pipeline

    jobs/pingcap/tidb/release-8.2/pull_integration_nodejs_test.groovy

  • Added a new Jenkins job DSL script for Node.js integration tests
    pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_tiflash_test.groovy
    Add Jenkins job DSL for TiFlash tests pipeline                     

    jobs/pingcap/tidb/release-8.2/pull_tiflash_test.groovy

  • Added a new Jenkins job DSL script for TiFlash tests pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_br_test.groovy
    Add Jenkins job DSL for BR integration tests pipeline       

    jobs/pingcap/tidb/release-8.2/pull_integration_br_test.groovy

  • Added a new Jenkins job DSL script for BR integration tests pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    pull_integration_common_test.groovy
    Add Jenkins job DSL for common integration tests pipeline

    jobs/pingcap/tidb/release-8.2/pull_integration_common_test.groovy

  • Added a new Jenkins job DSL script for common integration tests
    pipeline.
  • Configured job parameters and properties.
  • Defined the pipeline script path and SCM settings.
  • +37/-0   
    aa_folder.groovy
    Add Jenkins folder configuration for release-8.2 pipelines

    jobs/pingcap/tidb/release-8.2/aa_folder.groovy

  • Added a new Jenkins folder configuration for the release-8.2
    pipelines (a folder/job DSL sketch follows this walkthrough).
  • +3/-0     
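
    The jobs/pingcap/tidb/release-8.2/*.groovy files listed above are Jenkins Job DSL seed scripts. A minimal sketch of such a folder-plus-pipeline-job definition is shown below; the folder description, parameter, repository URL, and branch are assumptions for illustration and are not copied from this PR:

    // Hypothetical Job DSL sketch; descriptions, parameters, and URLs are illustrative.
    folder('pingcap/tidb/release-8.2') {
        description('CI pipelines for the tidb release-8.2 branch')
    }

    pipelineJob('pingcap/tidb/release-8.2/pull_unit_test') {
        logRotator { daysToKeep(30) }
        parameters {
            // The real jobs are prow-triggered; a REFS parameter is assumed here for illustration.
            stringParam('REFS', '', 'JSON describing the refs to build')
        }
        definition {
            cpsScm {
                lightweight(true)
                // Points the job at the pipeline script added under pipelines/.
                scriptPath('pipelines/pingcap/tidb/release-8.2/pull_unit_test.groovy')
                scm {
                    git {
                        remote { url('https://github.com/PingCAP-QE/ci.git') }
                        branch('main')
                    }
                }
            }
        }
    }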

    💡 PR-Agent usage:
    Comment /help on the PR to get a list of all available PR-Agent tools and their descriptions

    ti-chi-bot bot commented Jun 25, 2024

    I Skip it since the diff size(196601 bytes > 80000 bytes) is too large

    Contributor

    PR Reviewer Guide 🔍

    ⏱️ Estimated effort to review [1-5]: 5
    🧪 Relevant tests: Yes
    🔒 Security concerns: No
    ⚡ Key issues to review:
    • Possible Bug: The PR introduces a large number of Jenkins pipeline scripts and configurations for integration, unit, and other test types. These scripts should be reviewed carefully for syntax errors, logical errors, and potential inefficiencies; for example, the extensive use of shell scripts inside the pipeline code can be error-prone and hard to maintain.
    • Performance Concerns: The extensive matrix configurations in the Jenkins pipelines may lead to significant resource allocation and prolonged execution time, so resources should be used optimally and execution time kept to a minimum.
    • Hardcoded Values: Several values are hardcoded in the scripts, such as namespace names, file paths, and environment variables. These should ideally be parameterized to make the pipelines more flexible and environment-agnostic (a parameterization sketch follows below).
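
    As an illustration of that last point only (this sketch is not part of the PR; the parameter names and default values below are assumptions), hardcoded settings can be lifted into declared pipeline parameters and read back through the environment:

    // Hypothetical sketch: parameter names and defaults are illustrative, not from this PR.
    pipeline {
        agent any
        parameters {
            string(name: 'K8S_NAMESPACE', defaultValue: 'jenkins-tidb',
                   description: 'Kubernetes namespace to schedule test pods in')
            string(name: 'FILE_SERVER_URL', defaultValue: 'http://fileserver.pingcap.net',
                   description: 'Artifact file server base URL')
        }
        environment {
            // Downstream sh steps read these instead of repeating literals.
            FILE_SERVER_URL = "${params.FILE_SERVER_URL}"
        }
        stages {
            stage('Show config') {
                steps {
                    // Declarative build parameters are also exposed as environment variables.
                    sh 'echo "namespace=${K8S_NAMESPACE} file_server=${FILE_SERVER_URL}"'
                }
            }
        }
    }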

    Contributor

    PR Code Suggestions ✨

    Suggestions (grouped by category, each scored 1-10):
    Security
    Use withCredentials block to handle GIT_CREDENTIALS_ID securely during Git operations

    Use withCredentials block to handle GIT_CREDENTIALS_ID securely when setting the SSH key
    for Git operations.

    pipelines/pingcap/tidb/release-8.2/pull_build.groovy [49-53]

    -script {
    -    git.setSshKey(GIT_CREDENTIALS_ID)
    -    retry(2) {
    -        prow.checkoutRefs(REFS, timeout = 5, credentialsId = '', gitBaseUrl = 'https://github.com', withSubmodule=true)
    +withCredentials([sshUserPrivateKey(credentialsId: GIT_CREDENTIALS_ID, keyFileVariable: 'SSH_KEY')]) {
    +    script {
    +        git.setSshKey(SSH_KEY)
    +        retry(2) {
    +            prow.checkoutRefs(REFS, timeout = 5, credentialsId = '', gitBaseUrl = 'https://github.com', withSubmodule=true)
    +        }
         }
     }
     
    Suggestion importance[1-10]: 9

    Why: Using withCredentials for handling credentials securely is crucial for security best practices, especially in CI/CD pipelines where sensitive data is involved.

    Best practice
    Add a post block to handle failures and always archive logs for better debugging

    Consider adding a post block to the TestsGroup1 and TestsGroup2 stages to handle potential
    failures and always archive logs or artifacts for better debugging and traceability.

    pipelines/pingcap/tidb/release-8.2/pull_sqllogic_test.groovy [84-139]

     stage('TestsGroup1') {
         matrix {
             axes {
                 axis {
                     name 'CACHE_ENABLED'
                     values '0', "1"
                 }
                 axis {
                     name 'TEST_PATH_STRING'
                     values 'index/between/1 index/between/10 index/between/100', 'index/between/100 index/between/1000 index/commute/10',
                         'index/commute/100 index/commute/1000_n1 index/commute/1000_n2',
                         'index/delete/1 index/delete/10 index/delete/100 index/delete/1000 index/delete/10000',
                         'random/aggregates_n1 random/aggregates_n2 random/expr' , 'random/expr random/select_n1 random/select_n2', 
                         'select random/groupby'
                 }
             }
             agent{
                 kubernetes {
                     namespace K8S_NAMESPACE
                     yamlFile POD_TEMPLATE_FILE
                     defaultContainer 'golang'
                 }
             } 
             stages {
                 stage("Test") {
                     options { timeout(time: 40, unit: 'MINUTES') }
                     steps {
                         dir('tidb-test') {
                             cache(path: "./sqllogic_test", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                                 sh label: "print version", script: """
                                     ls -alh sqllogic_test/
                                     ./sqllogic_test/tidb-server -V
                                 """
                                 container("golang") {
                                     sh label: "test_path: ${TEST_PATH_STRING}, cache_enabled:${CACHE_ENABLED}", script: """#!/usr/bin/env bash
                                         cd sqllogic_test/
                                         env
                                         ulimit -n
                                         sed -i '3i\\set -x' test.sh
     
                                         path_array=(${TEST_PATH_STRING})
                                         for path in \${path_array[@]}; do
                                             echo "test path: \${path}"
                                             export SQLLOGIC_TEST_PATH="/git/sqllogictest/test/\${path}"
                                             export TIDB_PARALLELISM=8
                                             export TIDB_SERVER_PATH="${WORKSPACE}/tidb-test/sqllogic_test/tidb-server"
                                             export CACHE_ENABLED="${CACHE_ENABLED}"
                                             ./test.sh
                                         done
                                     """
                                 }
                             }
                         }
                     }
                 }
             }
         }
    +    post {
    +        always {
    +            archiveArtifacts artifacts: '**/sqllogic_test/*.log', allowEmptyArchive: true
    +        }
    +        failure {
    +            archiveArtifacts artifacts: '**/sqllogic_test/*.log', allowEmptyArchive: true
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 8

    Why: Adding a post block for handling failures and archiving logs in TestsGroup1 and TestsGroup2 stages is crucial for debugging and traceability, especially in complex CI environments.

    Add a post block to handle cleanup actions after the pipeline execution

    Consider adding a post block to the pipeline to handle cleanup actions, such as deleting
    temporary files or directories, to ensure the workspace is clean for the next build.

    pipelines/pingcap/tidb/release-8.2/pull_integration_mysql_test.groovy [11-151]

     pipeline {
         agent {
             kubernetes {
                 namespace K8S_NAMESPACE
                 yamlFile POD_TEMPLATE_FILE
                 defaultContainer 'golang'
             }
         }
         environment {
             FILE_SERVER_URL = 'http://fileserver.pingcap.net'
             GITHUB_TOKEN = credentials('github-bot-token')
         }
         options {
             timeout(time: 40, unit: 'MINUTES')
             parallelsAlwaysFailFast()
         }
         stages {
             ...
         }
    +    post {
    +        always {
    +            cleanWs()
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 8

    Why: Adding a post block for cleanup is crucial to maintain a clean workspace, which is important for the reliability of subsequent builds.

    Add a post block with always condition to ensure cleanup steps are executed regardless of the build result

    Consider adding a post block with always condition to the pipeline to ensure that
    important cleanup steps or notifications are executed regardless of the build result.

    pipelines/pingcap/tidb/release-8.2/pull_integration_python_orm_test.groovy [10-150]

     pipeline {
         agent {
             kubernetes {
                 namespace K8S_NAMESPACE
                 yamlFile POD_TEMPLATE_FILE
                 defaultContainer 'golang'
             }
         }
         environment {
             FILE_SERVER_URL = 'http://fileserver.pingcap.net'
         }
         options {
             timeout(time: 60, unit: 'MINUTES')
         }
         stages {
             ...
         }
    +    post {
    +        always {
    +            echo 'This will always run'
    +            // Add any necessary cleanup or notification steps here
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 8

    Why: Adding a post block with always condition is crucial for ensuring that cleanup and necessary final steps are executed, which is important for maintaining the environment clean and ready for subsequent builds.

    Add a post block to archive test results and logs after the Test stage

    Add a post block to the Test stage to archive test results and logs, ensuring that they
    are available for review even if the tests fail.

    pipelines/pingcap/tidb/release-8.2/pull_integration_mysql_test.groovy [117-143]

     stage("Test") {
         options { timeout(time: 40, unit: 'MINUTES') }
         steps {
             dir('tidb-test') {
                 cache(path: "./", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                     sh label: 'print version', script: """
                         ls -alh bin/
                         ./bin/tidb-server -V
                         ./bin/tikv-server -V
                         ./bin/pd-server -V
                     """
                     container("golang") {
                         sh label: "test_store=${TEST_STORE} test_part=${TEST_PART}", script: """#!/usr/bin/env bash
                             if [[ "${TEST_STORE}" == "tikv" ]]; then
                                 echo '[storage]\nreserve-space = "0MB"'> tikv_config.toml
                                 bash ${WORKSPACE}/scripts/PingCAP-QE/tidb-test/start_tikv.sh
                                 export TIDB_SERVER_PATH="${WORKSPACE}/tidb-test/bin/tidb-server"
                                 export TIKV_PATH="127.0.0.1:2379"
                                 export TIDB_TEST_STORE_NAME="tikv"
                                 cd mysql_test/ && ./test.sh -blacklist=1 -part=${TEST_PART}
                             else
                                 export TIDB_SERVER_PATH="${WORKSPACE}/tidb-test/bin/tidb-server"
                                 export TIDB_TEST_STORE_NAME="unistore"
                                 cd mysql_test/ && ./test.sh -blacklist=1 -part=${TEST_PART}
                             fi
                         """
                     }
                 }
             }
         }
    +    post {
    +        always {
    +            archiveArtifacts artifacts: '**/test-results/*.xml', allowEmptyArchive: true
    +            archiveArtifacts artifacts: '**/logs/*.log', allowEmptyArchive: true
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 7

    Why: Archiving test results and logs is essential for debugging and auditing purposes, especially when tests fail. This suggestion enhances the traceability and review process of test executions.

    Add a post block with cleanup condition to ensure temporary files and resources are cleaned up after tests

    Add a post block with cleanup condition to the Test stage to ensure that any temporary
    files or resources are cleaned up after the tests are executed.

    pipelines/pingcap/tidb/release-8.2/pull_integration_python_orm_test.groovy [113-145]

     stage("Test") {
         options { timeout(time: 40, unit: 'MINUTES') }
         steps {
             dir('tidb-test') {
                 cache(path: "./", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                     sh label: 'print version', script: """
                         ls -alh bin/
                         ./bin/tidb-server -V
                         ./bin/tikv-server -V
                         ./bin/pd-server -V
                     """
                     sh label: "test_params=${TEST_PARAMS} ", script: """#!/usr/bin/env bash
                         export TIDB_SERVER_PATH="${WORKSPACE}/tidb-test/bin/tidb-server"
                         export TIDB_TEST_STORE_NAME="\$TEST_STORE"
                         set -- \${TEST_PARAMS}
                         TEST_DIR=\$1
                         TEST_SCRIPT=\$2
                         echo "TEST_DIR=\${TEST_DIR}"
                         echo "TEST_SCRIPT=\${TEST_SCRIPT}"
                         cd \${TEST_DIR} && chmod +x *.sh && \${TEST_SCRIPT}
                     """
                 }
             }
         }
         post{
             failure {
                 script {
                     println "Test failed, archive the log"
                 }
             }
    +        cleanup {
    +            script {
    +                println "Cleaning up temporary files and resources"
    +                // Add cleanup commands here
    +            }
    +        }
         }
     }
     
    Suggestion importance[1-10]: 7

    Why: The suggestion to add a post block with cleanup condition is valid and improves resource management by ensuring that temporary files are cleaned up after tests, which is good practice.

    Add a post block with success condition to perform actions only when tests pass successfully

    Add a post block with success condition to the Test stage to perform actions such as
    sending notifications or updating statuses only when the tests pass successfully.

    pipelines/pingcap/tidb/release-8.2/pull_integration_python_orm_test.groovy [113-145]

     stage("Test") {
         options { timeout(time: 40, unit: 'MINUTES') }
         steps {
             dir('tidb-test') {
                 cache(path: "./", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                     sh label: 'print version', script: """
                         ls -alh bin/
                         ./bin/tidb-server -V
                         ./bin/tikv-server -V
                         ./bin/pd-server -V
                     """
                     sh label: "test_params=${TEST_PARAMS} ", script: """#!/usr/bin/env bash
                         export TIDB_SERVER_PATH="${WORKSPACE}/tidb-test/bin/tidb-server"
                         export TIDB_TEST_STORE_NAME="\$TEST_STORE"
                         set -- \${TEST_PARAMS}
                         TEST_DIR=\$1
                         TEST_SCRIPT=\$2
                         echo "TEST_DIR=\${TEST_DIR}"
                         echo "TEST_SCRIPT=\${TEST_SCRIPT}"
                         cd \${TEST_DIR} && chmod +x *.sh && \${TEST_SCRIPT}
                     """
                 }
             }
         }
         post{
             failure {
                 script {
                     println "Test failed, archive the log"
                 }
             }
    +        success {
    +            script {
    +                println "Tests passed successfully, performing post-success actions"
    +                // Add success actions here
    +            }
    +        }
         }
     }
     
    Suggestion importance[1-10]: 7

    Why: Adding a post block with success condition is a good practice as it allows specific actions to be performed only when tests pass, which can include notifications or other success-dependent tasks.

    Add a post block with always condition to the "Checkout" stage for workspace cleanup

    Add a post block with always condition to the "Checkout" stage to ensure workspace cleanup
    in case of failure or success.

    pipelines/pingcap/tidb/release-8.2/pull_integration_common_test.groovy [42-63]

     stage('Checkout') {
         options { timeout(time: 10, unit: 'MINUTES') }
         steps {
             dir("tidb") {
                 cache(path: "./", includes: '**/*', key: prow.getCacheKey('git', REFS), restoreKeys: prow.getRestoreKeys('git', REFS)) {
                     retry(2) {
                         script {
                             prow.checkoutRefs(REFS)
                         }
                     }
                 }
             }
             dir("tidb-test") {
                 cache(path: "./", includes: '**/*', key: "git/PingCAP-QE/tidb-test/rev-${REFS.pulls[0].sha}", restoreKeys: ['git/PingCAP-QE/tidb-test/rev-']) {
                     retry(2) {
                         script {
                             component.checkout('git@github.com:PingCAP-QE/tidb-test.git', 'tidb-test', REFS.base_ref, REFS.pulls[0].title, GIT_CREDENTIALS_ID)
                         }
                     }
                 }
             }
         }
    +    post {
    +        always {
    +            cleanWs()
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 6

    Why: Adding a post block with an always condition for workspace cleanup is a good practice to ensure that resources are properly cleaned up, enhancing maintainability.

    Add a post block with always condition to the "Prepare" stage for workspace cleanup

    Add a post block with always condition to the "Prepare" stage to ensure workspace cleanup
    in case of failure or success.

    pipelines/pingcap/tidb/release-8.2/pull_integration_ddl_test.groovy [68-97]

     stage('Prepare') {
         steps {
             dir('tidb') {
                 container("golang") {
                     sh label: 'tidb-server', script: '[ -f bin/tidb-server ] || make'
                     sh label: 'ddl-test', script: 'ls bin/ddltest || make ddltest'
                     retry(3) {
                         sh label: 'download binary', script: """
                             chmod +x ${WORKSPACE}/scripts/PingCAP-QE/tidb-test/*.sh
                             ${WORKSPACE}/scripts/PingCAP-QE/tidb-test/download_pingcap_artifact.sh --pd=${REFS.base_ref} --tikv=${REFS.base_ref}
                             mv third_bin/tikv-server bin/
                             mv third_bin/pd-server bin/
                             ls -alh bin/
                         """
                     }
                 }
             }
             dir('tidb-test') {
                 cache(path: "./", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                     sh label: "prepare", script: """
                         touch ws-${BUILD_TAG}
                         mkdir -p bin
                         cp ${WORKSPACE}/tidb/bin/* bin/ && chmod +x bin/*
                         ls -alh bin/
                         ./bin/pd-server -V
                         ./bin/tikv-server -V
                         ./bin/tidb-server -V
                     """
                 }
             }
         }
    +    post {
    +        always {
    +            cleanWs()
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 6

    Why: Implementing a post block with an always condition for workspace cleanup in the "Prepare" stage is beneficial for maintaining a clean state in the CI environment.

    Add a post block with unstable condition to handle cases where tests are marked as unstable

    Add a post block with unstable condition to the Test stage to handle cases where tests are
    marked as unstable, allowing for specific actions such as notifications or retries.

    pipelines/pingcap/tidb/release-8.2/pull_integration_python_orm_test.groovy [113-145]

     stage("Test") {
         options { timeout(time: 40, unit: 'MINUTES') }
         steps {
             dir('tidb-test') {
                 cache(path: "./", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                     sh label: 'print version', script: """
                         ls -alh bin/
                         ./bin/tidb-server -V
                         ./bin/tikv-server -V
                         ./bin/pd-server -V
                     """
                     sh label: "test_params=${TEST_PARAMS} ", script: """#!/usr/bin/env bash
                         export TIDB_SERVER_PATH="${WORKSPACE}/tidb-test/bin/tidb-server"
                         export TIDB_TEST_STORE_NAME="\$TEST_STORE"
                         set -- \${TEST_PARAMS}
                         TEST_DIR=\$1
                         TEST_SCRIPT=\$2
                         echo "TEST_DIR=\${TEST_DIR}"
                         echo "TEST_SCRIPT=\${TEST_SCRIPT}"
                         cd \${TEST_DIR} && chmod +x *.sh && \${TEST_SCRIPT}
                     """
                 }
             }
         }
         post{
             failure {
                 script {
                     println "Test failed, archive the log"
                 }
             }
    +        unstable {
    +            script {
    +                println "Tests marked as unstable, performing post-unstable actions"
    +                // Add unstable actions here
    +            }
    +        }
         }
     }
     
    Suggestion importance[1-10]: 6

    Why: The suggestion to add a post block with unstable condition is useful for handling special cases where tests do not fail but are not fully stable, allowing for specific actions like additional logging or notifications.

    Possible issue
    Add a retry block around the artifact download commands to handle transient failures

    Add a retry block around the sh command in the Prepare stage to handle transient failures
    when downloading artifacts.

    pipelines/pingcap/tidb/release-8.2/pull_check2.groovy [57-75]

     stage("Prepare") {
         steps {
             dir('tidb') {
                 cache(path: "./bin", includes: '**/*', key: "binary/pingcap/tidb/tidb-server/rev-${REFS.base_sha}-${REFS.pulls[0].sha}") {
                     sh label: 'tidb-server', script: 'ls bin/tidb-server || make server'
                 }
                 script {
    -                 component.fetchAndExtractArtifact(FILE_SERVER_URL, 'tikv', REFS.base_ref, REFS.pulls[0].title, 'centos7/tikv-server.tar.gz', 'bin', trunkBranch="master", artifactVerify=true)
    -                 component.fetchAndExtractArtifact(FILE_SERVER_URL, 'pd', REFS.base_ref, REFS.pulls[0].title, 'centos7/pd-server.tar.gz', 'bin', trunkBranch="master", artifactVerify=true)
    +                retry(3) {
    +                    component.fetchAndExtractArtifact(FILE_SERVER_URL, 'tikv', REFS.base_ref, REFS.pulls[0].title, 'centos7/tikv-server.tar.gz', 'bin', trunkBranch="master", artifactVerify=true)
    +                    component.fetchAndExtractArtifact(FILE_SERVER_URL, 'pd', REFS.base_ref, REFS.pulls[0].title, 'centos7/pd-server.tar.gz', 'bin', trunkBranch="master", artifactVerify=true)
    +                }
                 }
                 // cache it for other pods
                 cache(path: "./", includes: '**/*', key: "ws/${BUILD_TAG}") {
                     sh """
                         mv bin/tidb-server bin/integration_test_tidb-server
                         touch rev-${REFS.pulls[0].sha}
                     """
                 }
             }
         }
     }
     
    Suggestion importance[1-10]: 7

    Why: Adding a retry block around artifact download commands is a good practice to handle potential transient network or service issues, ensuring robustness in the pipeline.

    Add a retry block around the make command to handle transient failures

    Add a retry block around the sh command in the Prepare stage to handle transient failures
    when running the make command.

    pipelines/pingcap/tidb/release-8.2/pull_sqllogic_test.groovy [66-79]

     stage('Prepare') {
         steps {
             dir('tidb') {
                 container("golang") {
    -                sh label: 'tidb-server', script: '[ -f bin/tidb-server ] || make'
    +                retry(3) {
    +                    sh label: 'tidb-server', script: '[ -f bin/tidb-server ] || make'
    +                }
                 }
             }
             dir('tidb-test') {
                 cache(path: "./sqllogic_test", includes: '**/*', key: "ws/${BUILD_TAG}/tidb-test") {
                     sh label: 'prepare tidb-test', script: """
                         touch ws-${BUILD_TAG}
                         cd sqllogic_test && ./build.sh
                         cp ${WORKSPACE}/tidb/bin/tidb-server ./
                         chmod +x tidb-server && ./tidb-server -V
                     """
                 }
             }
         }
     }
     
    Suggestion importance[1-10]: 7

    Why: Implementing a retry mechanism for the make command in the Prepare stage is beneficial to handle possible transient issues during the build process, improving the reliability of the pipeline setup.

    Add a timeout to the sh steps in the "Debug info" stage to prevent hanging

    Consider adding a timeout option to the sh steps in the "Debug info" stage to prevent
    potential hanging if any command takes too long to execute.

    pipelines/pingcap/tidb/release-8.2/pull_build.groovy [29-36]

     sh label: 'Debug info', script: """
         printenv
         echo "-------------------------"
         go env
         echo "-------------------------"
         ls -l /dev/null
         echo "debug command: kubectl -n ${K8S_NAMESPACE} exec -ti ${NODE_NAME} bash"
    -"""
    +""", timeout: 5
     
    Suggestion importance[1-10]: 7

    Why: Adding a timeout to the sh steps in the "Debug info" stage is a good practice to prevent potential hanging, which can improve the robustness of the pipeline.

    Add a retry block around the sh step to handle transient errors during the download of third-party binaries

    Add a retry block around the sh step in the Prepare stage to handle transient errors that
    might occur during the download of third-party binaries.

    pipelines/pingcap/tidb/release-8.2/pull_integration_lightning_test.groovy [59-71]

     stage('Prepare') {
         steps {
             dir("third_party_download") {
    -            retry(2) {
    +            retry(3) {
                     sh label: "download third_party", script: """
                         chmod +x ../tidb/lightning/tests/*.sh
                         ${WORKSPACE}/tidb/lightning/tests/download_integration_test_binaries.sh master
                         mkdir -p bin && mv third_bin/* bin/
                         ls -alh bin/
                         ./bin/pd-server -V
                         ./bin/tikv-server -V
                         ./bin/tiflash --version
                     """
                 }
             }
             ...
         }
     }
     
    Suggestion importance[1-10]: 6

    Why: Adding more retries can help mitigate transient network or server issues during binary downloads, although the existing retry mechanism is already in place.

    Enhancement
    Add a post block to handle notifications after the pipeline execution

    Add a post block to the pipeline to handle notifications, such as sending an email or
    Slack message, to inform stakeholders about the build status.

    pipelines/pingcap/tidb/release-8.2/pull_integration_br_test.groovy [12-152]

     pipeline {
         agent {
             kubernetes {
                 namespace K8S_NAMESPACE
                 yamlFile POD_TEMPLATE_FILE
                 defaultContainer 'golang'
             }
         }
         environment {
             FILE_SERVER_URL = 'http://fileserver.pingcap.net'
         }
         options {
             timeout(time: 60, unit: 'MINUTES')
             // parallelsAlwaysFailFast()
         }
         stages {
             ...
         }
    +    post {
    +        always {
    +            mail to: 'team@example.com',
    +                 subject: "Build ${currentBuild.fullDisplayName}",
    +                 body: "Build ${currentBuild.fullDisplayName} finished with status: ${currentBuild.currentResult}"
    +        }
    +    }
     }
     
    Suggestion importance[1-10]: 7

    Why: Implementing notifications for build status is a good practice for keeping stakeholders informed, enhancing the communication and monitoring of build processes.


    Signed-off-by: wuhuizuo <wuhuizuo@126.com>
    @wuhuizuo wuhuizuo force-pushed the feature/support-release-8.2 branch from ad2cc01 to a20bf43 on June 25, 2024 08:10
    ti-chi-bot bot commented Jun 25, 2024

    I Skip it since the diff size(201812 bytes > 80000 bytes) is too large

    ti-chi-bot bot commented Jun 25, 2024

    I Skip it since the diff size(202027 bytes > 80000 bytes) is too large

    @purelind purelind (Collaborator) left a comment

    /lgtm

    @ti-chi-bot ti-chi-bot bot added the lgtm label Jun 25, 2024
    ti-chi-bot bot commented Jun 25, 2024

    [APPROVALNOTIFIER] This PR is APPROVED

    This pull-request has been approved by: purelind

    The full list of commands accepted by this bot can be found here.

    The pull request process is described here

    Needs approval from an approver in each of these files:

    Approvers can indicate their approval by writing /approve in a comment
    Approvers can cancel approval by writing /approve cancel in a comment

    @ti-chi-bot ti-chi-bot bot added the approved label Jun 25, 2024
    ti-chi-bot bot commented Jun 25, 2024

    I Skip it since the diff size(198336 bytes > 80000 bytes) is too large

    @ti-chi-bot ti-chi-bot bot removed the lgtm label Jun 25, 2024
    ti-chi-bot bot commented Jun 25, 2024

    [LGTM Timeline notifier]

    Timeline:

    • 2024-06-25 08:54:14.308724694 +0000 UTC m=+709780.794213525: ☑️ agreed by purelind.
    • 2024-06-25 08:55:11.330950999 +0000 UTC m=+709837.816439829: ✖️🔁 reset by purelind.

    ti-chi-bot bot commented Jun 25, 2024

    New changes are detected. LGTM label has been removed.

    @purelind (Collaborator)

    /hold

    ti-chi-bot bot commented Jun 25, 2024

    I Skip it since the diff size(202149 bytes > 80000 bytes) is too large

    @purelind (Collaborator)

    /unhold

    @purelind purelind added the lgtm label Jun 25, 2024
    @ti-chi-bot ti-chi-bot bot merged commit 2c61e60 into main Jun 25, 2024
    3 checks passed
    @ti-chi-bot ti-chi-bot bot deleted the feature/support-release-8.2 branch June 25, 2024 09:41