Running with gitlab-runner 13.8.0 (775dd39d)
  on dsop-shared-gitlab-runner-f887cbcbd-srgz6 E82_g8RG
section_start:1630492675:resolve_secrets
Resolving secrets
section_end:1630492675:resolve_secrets
section_start:1630492675:prepare_executor
Preparing the "kubernetes" executor
Using Kubernetes namespace: gitlab-runner-ironbank-dsop
WARNING: Pulling GitLab Runner helper image from Docker Hub. Helper image is migrating to registry.gitlab.com, for more information see https://docs.gitlab.com/runner/configuration/advanced-configuration.html#migrating-helper-image-to-registrygitlabcom
Using Kubernetes executor with image registry1.dso.mil/ironbank/ironbank-pipelines/rootless-podman:0.2 ...
section_end:1630492675:prepare_executor
section_start:1630492675:prepare_script
Preparing environment
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0wwwcl to be running, status is Pending
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0wwwcl to be running, status is Pending
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0wwwcl to be running, status is Pending
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0wwwcl to be running, status is Pending
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Running on runner-e82g8rg-project-4877-concurrent-0wwwcl via dsop-shared-gitlab-runner-f887cbcbd-srgz6...
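The pod above only becomes ready once the istio-proxy sidecar is up, and the job script that follows gates source checkout on the sidecar's health endpoint with a curl polling loop. A minimal standalone sketch of that gate, assuming a hypothetical helper name (`wait_for_ready` is not part of the pipeline) and parameterizing the probe so it can be stubbed without a running sidecar:

```shell
#!/bin/sh
# Sketch of the sidecar-readiness gate the job runs before fetching sources.
# wait_for_ready is a hypothetical helper (not part of the pipeline); the
# probe command is parameterized so it can be stubbed. In the real job the
# probe is: curl --fail --silent --output /dev/stderr \
#   --write-out "%{http_code}" localhost:15020/healthz/ready
wait_for_ready() {
  probe=$1          # command that prints an HTTP status code
  interval=${2:-3}  # seconds to sleep between attempts
  until [ "$($probe)" -eq 200 ]; do
    echo "Waiting for Sidecar"
    sleep "$interval"
  done
  echo "Sidecar available"
}
```

Stubbing the probe (e.g. `probe_ok() { echo 200; }`) lets the loop be exercised in isolation, without an Istio sidecar listening on port 15020.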
section_end:1630492687:prepare_script
section_start:1630492687:get_sources
Getting source from Git repository
$ until [ $(curl --fail --silent --output /dev/stderr --write-out "%{http_code}" localhost:15020/healthz/ready) -eq 200 ]; do echo Waiting for Sidecar; sleep 3 ; done ; echo Sidecar available;
Sidecar available
Fetching changes with git depth set to 50...
Initialized empty Git repository in /builds/dsop/opensource/spark-operator/spark-py/.git/
Created fresh repository.
Checking out 7a878b13 as master...
Skipping object checkout, Git LFS is not installed.
Skipping Git submodules setup
section_end:1630492688:get_sources
section_start:1630492688:download_artifacts
Downloading artifacts
Downloading artifacts for hardening-manifest (6141519)...
Downloading artifacts from coordinator... ok  id=6141519 responseStatus=200 OK token=VqZs8Wth
WARNING: ci-artifacts/preflight/: lchown ci-artifacts/preflight/: operation not permitted (suppressing repeats)
Downloading artifacts for import-artifacts (6141521)...
Downloading artifacts from coordinator... ok  id=6141521 responseStatus=200 OK token=jzuHdyid
WARNING: ci-artifacts/import-artifacts/: lchown ci-artifacts/import-artifacts/: operation not permitted (suppressing repeats)
Downloading artifacts for load-scripts (6141516)...
Downloading artifacts from coordinator... ok  id=6141516 responseStatus=200 OK token=vPPfWgfX
WARNING: ci-artifacts/[MASKED]/: lchown ci-artifacts/[MASKED]/: operation not permitted (suppressing repeats)
Downloading artifacts for wl-compare-lint (6141520)...
Downloading artifacts from coordinator...
ok  id=6141520 responseStatus=200 OK token=xBqbj5XE
WARNING: ci-artifacts/lint/: lchown ci-artifacts/lint/: operation not permitted (suppressing repeats)
section_end:1630492721:download_artifacts
section_start:1630492721:step_script
Executing "step_script" stage of the job script
$ "${PIPELINE_REPO_DIR}/stages/build/build-run.sh"
Determine source registry based on branch
Load any images used in Dockerfile build
loading image ci-artifacts/import-artifacts/images/spark-operator-spark-py-2.4.5.tar
Getting image source signatures
Copying blob sha256:0907fceccb5df6ebf2f03516d3866266866445583e9e8732908de258215fbaac
Copying blob sha256:4b7c55f086f067e6fffd121662a56b717e22c938782b7266e79fd12ecca1a788
Copying blob sha256:5a98df2206b31324864168aace1281c7336c358856b150376486ab360148799d
Copying blob sha256:017dfbffecc1851a4ac270fcc9127e712ad9648016a027c587df30e7bf15de85
Copying blob sha256:b67d19e65ef653823ed62a5835399c610a40e8205c16f839c5cc567954fcf594
Copying blob sha256:beed6f7f1c12dc709b670022d788cd5df8dba56df1c4a4a83c985cb6d63ce7cf
Copying blob sha256:507302fb1640ec70ce24b5c789139534de379450fd38a623cc616979ca0420f0
Copying blob sha256:375a95b78f3bad944afda156a1e18e7ba0df2d76a7a09fc2de76c523bb4fd62c
Copying blob sha256:43556d4e6d23274a80067d870489a0ae9215ff09ff142776797c54afdff20f5f
Copying blob sha256:6752f14edc0850ce0e548d054031db5d2ac420d6fb7014ceaa9fa95f0aa222c2
Copying blob sha256:9a7b83551c84d9dd4d547119833769b12e3e92c30ec9b60cd60ffb8c9838e56d
Copying blob sha256:bc5557987564c445d4cac35d7e3a6c7e45169970455c3fc543574227e0c57a9c
Copying blob sha256:85aa33aa52348589eb215f274d19193e0a2d8e36cbf374c75e8a1189b37a95b3
Copying blob sha256:795e6fa7e1d883c0f5752a01ca55021e7d24776892a8b711fad528acda204e61
Copying blob sha256:68c373d60139c952a928d297ab36ca1494831438618269c8f640c9d9be687047
Copying config sha256:579644ff878d4148e707951134f0b3758a6d0ca431173572e9d93489032929cc
Writing manifest to image destination
Storing signatures
Loaded image(s):
localhost/spark-operator/spark-py:2.4.5
Load HTTP and S3 external resources
'ci-artifacts/import-artifacts/external-resources/pip-21.2.4-py3-none-any.whl' -> './pip-21.2.4-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/setuptools-57.4.0-py3-none-any.whl' -> './setuptools-57.4.0-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/tini' -> './tini'
Converting labels from hardening manifest into command line args
Converting build args from hardening manifest into command line args
Build the image
STEP 1: FROM spark-operator/spark-py:2.4.5 AS base
Getting image source signatures
Copying blob sha256:b67d19e65ef653823ed62a5835399c610a40e8205c16f839c5cc567954fcf594
Copying blob sha256:4b7c55f086f067e6fffd121662a56b717e22c938782b7266e79fd12ecca1a788
Copying blob sha256:beed6f7f1c12dc709b670022d788cd5df8dba56df1c4a4a83c985cb6d63ce7cf
Copying blob sha256:017dfbffecc1851a4ac270fcc9127e712ad9648016a027c587df30e7bf15de85
Copying blob sha256:5a98df2206b31324864168aace1281c7336c358856b150376486ab360148799d
Copying blob sha256:0907fceccb5df6ebf2f03516d3866266866445583e9e8732908de258215fbaac
Copying blob sha256:507302fb1640ec70ce24b5c789139534de379450fd38a623cc616979ca0420f0
Copying blob sha256:375a95b78f3bad944afda156a1e18e7ba0df2d76a7a09fc2de76c523bb4fd62c
Copying blob sha256:6752f14edc0850ce0e548d054031db5d2ac420d6fb7014ceaa9fa95f0aa222c2
Copying blob sha256:43556d4e6d23274a80067d870489a0ae9215ff09ff142776797c54afdff20f5f
Copying blob sha256:9a7b83551c84d9dd4d547119833769b12e3e92c30ec9b60cd60ffb8c9838e56d
Copying blob sha256:bc5557987564c445d4cac35d7e3a6c7e45169970455c3fc543574227e0c57a9c
Copying blob sha256:85aa33aa52348589eb215f274d19193e0a2d8e36cbf374c75e8a1189b37a95b3
Copying blob sha256:795e6fa7e1d883c0f5752a01ca55021e7d24776892a8b711fad528acda204e61
Copying blob sha256:68c373d60139c952a928d297ab36ca1494831438618269c8f640c9d9be687047
Copying config sha256:b1a470252f1965434230aa81a00e29232054e510f0b2ba003f02e072a1b2e107
Writing manifest
to image destination
Storing signatures
--> b1a470252f1
STEP 2: FROM registry1.dso.mil/ironbank/opensource/python/python38:3.8
Trying to pull registry1.dso.mil/ironbank/opensource/python/python38:3.8...
Getting image source signatures
Copying blob sha256:612731bd4073ce3a4b81a4cba66d261924ba75b5ce327bb37f3e0e3f4aced605
Copying blob sha256:db2f50b75fc09a20e8d9c497d96ad384fbeacf7e77df71e4c7b578d4c07fccce
Copying blob sha256:96476a77b28db43fcb170401c287700d91d95cdff9c06e5ea7b48289d40a8e57
Copying blob sha256:55a436000dbf9921b880d0fd54fd2214eb4613978b782e98f6e0e2d1ac6e4fe7
Copying config sha256:df8e1c0e200b1c16f6d033833b08aa87a230ccc4f46a695a43e94acb24114d06
Writing manifest to image destination
Storing signatures
STEP 3: ARG spark_uid=185
STEP 4: USER root
STEP 5: COPY *.whl ./
STEP 6: RUN dnf -y update && dnf -y upgrade && pip3 install --upgrade ./*.whl && rm -rf /var/cache/dnf && mkdir -p /opt/spark/python/pyspark && mkdir -p /opt/spark/python/lib && chown -R 185:185 /opt/spark/
Red Hat Universal Base Image 8 (RPMs) - BaseOS   79 kB/s | 790 kB  00:10
Red Hat Universal Base Image 8 (RPMs) - AppStre 244 kB/s | 2.4 MB  00:10
Red Hat Universal Base Image 8 (RPMs) - CodeRea 1.5 kB/s |  14 kB  00:09
Dependencies resolved.
Nothing to do.
Complete!
Last metadata expiration check: 0:00:02 ago on Wed Sep 1 10:43:04 2021.
Dependencies resolved.
Nothing to do.
Complete!
Processing /pip-21.2.4-py3-none-any.whl
Processing /setuptools-57.4.0-py3-none-any.whl
setuptools is already installed with the same version as the provided wheel. Use --force-reinstall to force an installation of the wheel.
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 21.1.3
    Uninstalling pip-21.1.3:
      Successfully uninstalled pip-21.1.3
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager.
It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Successfully installed pip-21.2.4
STEP 7: COPY --from=base /opt/spark/python/lib /opt/spark/python/lib
STEP 8: COPY scripts/entrypoint.sh /opt/entrypoint.sh
STEP 9: COPY tini /usr/bin/
STEP 10: ENV SPARK_HOME /opt/spark
STEP 11: WORKDIR /opt/spark/work-dir
STEP 12: RUN chmod g+w /opt/spark/work-dir && chmod a+x /opt/entrypoint.sh && chmod a+x /usr/bin/tini
STEP 13: ENTRYPOINT [ "/opt/entrypoint.sh" ]
STEP 14: USER ${spark_uid}
STEP 15: COMMIT registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py
Getting image source signatures
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:f221949c4d4c872ec15a3aece5a3367c1730ec2c25d8d3a1173c8d8f4f630adb
Copying blob sha256:c768fef9bed9bb9aeb773afe08d93460f752923fdfd765fe1da614f7b20f75ba
Copying blob sha256:f579f2768bd270ba56ae0eb6bd318c2c9783e2181ecaa7b0d96b6b1015e9b0d6
Copying config sha256:ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
Writing manifest to image destination
Storing signatures
--> ee9ae7d13bf
Successfully tagged registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-455099
+ buildah push --storage-driver=vfs --authfile staging_auth.json --digestfile=ci-artifacts/build/digest registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-455099
Getting image source signatures
Copying blob sha256:f579f2768bd270ba56ae0eb6bd318c2c9783e2181ecaa7b0d96b6b1015e9b0d6
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob
sha256:f221949c4d4c872ec15a3aece5a3367c1730ec2c25d8d3a1173c8d8f4f630adb
Copying blob sha256:c768fef9bed9bb9aeb773afe08d93460f752923fdfd765fe1da614f7b20f75ba
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying config sha256:ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
Writing manifest to image destination
Storing signatures
Read the tags
+ echo 'Read the tags'
+ tags_file=ci-artifacts/preflight/tags.txt
+ test -f ci-artifacts/preflight/tags.txt
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:v3.1.1
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:v3.1.1
Getting image source signatures
Copying blob sha256:f221949c4d4c872ec15a3aece5a3367c1730ec2c25d8d3a1173c8d8f4f630adb
Copying blob sha256:f579f2768bd270ba56ae0eb6bd318c2c9783e2181ecaa7b0d96b6b1015e9b0d6
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:c768fef9bed9bb9aeb773afe08d93460f752923fdfd765fe1da614f7b20f75ba
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying config sha256:ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
Getting image source signatures
Copying blob sha256:c768fef9bed9bb9aeb773afe08d93460f752923fdfd765fe1da614f7b20f75ba
Copying blob
sha256:f579f2768bd270ba56ae0eb6bd318c2c9783e2181ecaa7b0d96b6b1015e9b0d6
Copying blob sha256:f221949c4d4c872ec15a3aece5a3367c1730ec2c25d8d3a1173c8d8f4f630adb
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying config sha256:ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
++ podman inspect --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py --format '{{.Id}}'
+ IMAGE_ID=sha256:ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
+ echo IMAGE_ID=sha256:ee9ae7d13bf9e4d213af5f60bc3f71e2bac3735856973321ab4750a4b1f70c25
+ IMAGE_PODMAN_SHA=sha256:18e9f1eebae14eb25184ab620023bd569ace6ec91a3deb920241db566da92544
+ echo IMAGE_PODMAN_SHA=sha256:18e9f1eebae14eb25184ab620023bd569ace6ec91a3deb920241db566da92544
+ echo IMAGE_FULLTAG=registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-455099
+ echo IMAGE_NAME=opensource/spark-operator/spark-py
+ branches=("master" "development")
+ [[ master development =~ master ]]
A tarball of the built image can be retrieved from the documentation job artifacts.
+ msg='A tarball of the built image can be retrieved from the documentation job artifacts.'
+ echo 'A tarball of the built image can be retrieved from the documentation job artifacts.'
section_end:1630493023:step_script
section_start:1630493023:upload_artifacts_on_success
Uploading artifacts for successful job
Uploading artifacts...
ci-artifacts/build/: found 2 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=6141523 responseStatus=201 Created token=YaQpekWt
Uploading artifacts...
build.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator...
ok id=6141523 responseStatus=201 Created token=YaQpekWt
section_end:1630493024:upload_artifacts_on_success
section_start:1630493024:cleanup_file_variables
Cleaning up file based variables
section_end:1630493024:cleanup_file_variables
Job succeeded
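The tag fan-out traced in the step_script above (`read -r tag` over ci-artifacts/preflight/tags.txt, then `buildah tag` and `buildah push` per entry) can be sketched as a reusable function. This is a sketch under assumptions, not the pipeline's actual helper: the function name is invented, and the runner command is parameterized purely so the loop can be exercised without a container runtime; the real job invokes buildah with `--storage-driver=vfs` and `--authfile staging_auth.json`.

```shell
#!/bin/sh
# Sketch of the per-tag tag-and-push loop seen in the job trace.
#   image:     fully qualified staging image name
#   tags_file: one tag per line (ci-artifacts/preflight/tags.txt in the job)
#   runner:    command to invoke; "buildah" in the real job, stubbable here
tag_and_push_all() {
  image=$1
  tags_file=$2
  runner=${3:-buildah}
  # Mirror the job's guard: do nothing when no tags file was produced.
  test -f "$tags_file" || return 0
  while IFS= read -r tag; do
    [ -n "$tag" ] || continue              # skip blank lines
    "$runner" tag "$image" "$image:$tag"   # add the tag locally
    "$runner" push "$image:$tag"           # push that tag to the registry
  done < "$tags_file"
}
```

Substituting `echo` for the runner (third argument) prints the tag/push commands instead of executing them, which makes the loop's behavior easy to verify before wiring it to a real registry.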