Running with gitlab-runner 13.8.0 (775dd39d)
  on dsop-shared-gitlab-runner-f887cbcbd-srgz6 E82_g8RG
section_start:1630319901:resolve_secrets
Resolving secrets
section_end:1630319901:resolve_secrets
section_start:1630319901:prepare_executor
Preparing the "kubernetes" executor
Using Kubernetes namespace: gitlab-runner-ironbank-dsop
WARNING: Pulling GitLab Runner helper image from Docker Hub. Helper image is migrating to registry.gitlab.com, for more information see https://docs.gitlab.com/runner/configuration/advanced-configuration.html#migrating-helper-image-to-registrygitlabcom
Using Kubernetes executor with image registry1.dso.mil/ironbank/ironbank-pipelines/rootless-podman:0.2 ...
section_end:1630319901:prepare_executor
section_start:1630319901:prepare_script
Preparing environment
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0bvwpb to be running, status is Pending
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0bvwpb to be running, status is Pending
ContainersNotInitialized: "containers with incomplete status: [istio-init]"
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0bvwpb to be running, status is Pending
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0bvwpb to be running, status is Pending
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Waiting for pod gitlab-runner-ironbank-dsop/runner-e82g8rg-project-4877-concurrent-0bvwpb to be running, status is Pending
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Running on runner-e82g8rg-project-4877-concurrent-0bvwpb via dsop-shared-gitlab-runner-f887cbcbd-srgz6...
section_end:1630319917:prepare_script
section_start:1630319917:get_sources
Getting source from Git repository
$ until [ $(curl --fail --silent --output /dev/stderr --write-out "%{http_code}" localhost:15020/healthz/ready) -eq 200 ]; do echo Waiting for Sidecar; sleep 3 ; done ; echo Sidecar available;
Waiting for Sidecar
Sidecar available
Fetching changes with git depth set to 50...
Initialized empty Git repository in /builds/dsop/opensource/spark-operator/spark-py/.git/
Created fresh repository.
Checking out 7a878b13 as master...
Skipping object checkout, Git LFS is not installed.
Skipping Git submodules setup
section_end:1630319921:get_sources
section_start:1630319921:download_artifacts
Downloading artifacts
Downloading artifacts for hardening-manifest (6069128)...
Downloading artifacts from coordinator... ok  id=6069128 responseStatus=200 OK token=5V3usPnN
WARNING: ci-artifacts/preflight/: lchown ci-artifacts/preflight/: operation not permitted (suppressing repeats)
Downloading artifacts for import-artifacts (6069130)...
Downloading artifacts from coordinator... ok  id=6069130 responseStatus=200 OK token=sU1JPJha
WARNING: ci-artifacts/import-artifacts/: lchown ci-artifacts/import-artifacts/: operation not permitted (suppressing repeats)
Downloading artifacts for load-scripts (6069125)...
Downloading artifacts from coordinator... ok  id=6069125 responseStatus=200 OK token=TLrV1xdw
WARNING: ci-artifacts/[MASKED]/: lchown ci-artifacts/[MASKED]/: operation not permitted (suppressing repeats)
Downloading artifacts for wl-compare-lint (6069129)...
Downloading artifacts from coordinator...
ok  id=6069129 responseStatus=200 OK token=ze2S3K7k
WARNING: ci-artifacts/lint/: lchown ci-artifacts/lint/: operation not permitted (suppressing repeats)
section_end:1630319962:download_artifacts
section_start:1630319962:step_script
Executing "step_script" stage of the job script
$ "${PIPELINE_REPO_DIR}/stages/build/build-run.sh"
Determine source registry based on branch
Load any images used in Dockerfile build
loading image ci-artifacts/import-artifacts/images/spark-operator-spark-py-2.4.5.tar
Getting image source signatures
Copying blob sha256:0907fceccb5df6ebf2f03516d3866266866445583e9e8732908de258215fbaac
Copying blob sha256:4b7c55f086f067e6fffd121662a56b717e22c938782b7266e79fd12ecca1a788
Copying blob sha256:b67d19e65ef653823ed62a5835399c610a40e8205c16f839c5cc567954fcf594
Copying blob sha256:beed6f7f1c12dc709b670022d788cd5df8dba56df1c4a4a83c985cb6d63ce7cf
Copying blob sha256:5a98df2206b31324864168aace1281c7336c358856b150376486ab360148799d
Copying blob sha256:017dfbffecc1851a4ac270fcc9127e712ad9648016a027c587df30e7bf15de85
Copying blob sha256:507302fb1640ec70ce24b5c789139534de379450fd38a623cc616979ca0420f0
Copying blob sha256:375a95b78f3bad944afda156a1e18e7ba0df2d76a7a09fc2de76c523bb4fd62c
Copying blob sha256:6752f14edc0850ce0e548d054031db5d2ac420d6fb7014ceaa9fa95f0aa222c2
Copying blob sha256:9a7b83551c84d9dd4d547119833769b12e3e92c30ec9b60cd60ffb8c9838e56d
Copying blob sha256:43556d4e6d23274a80067d870489a0ae9215ff09ff142776797c54afdff20f5f
Copying blob sha256:bc5557987564c445d4cac35d7e3a6c7e45169970455c3fc543574227e0c57a9c
Copying blob sha256:85aa33aa52348589eb215f274d19193e0a2d8e36cbf374c75e8a1189b37a95b3
Copying blob sha256:795e6fa7e1d883c0f5752a01ca55021e7d24776892a8b711fad528acda204e61
Copying blob sha256:68c373d60139c952a928d297ab36ca1494831438618269c8f640c9d9be687047
Copying config sha256:579644ff878d4148e707951134f0b3758a6d0ca431173572e9d93489032929cc
Writing manifest to image destination
Storing signatures
Loaded image(s): localhost/spark-operator/spark-py:2.4.5
Load HTTP and S3 external resources
'ci-artifacts/import-artifacts/external-resources/pip-21.2.4-py3-none-any.whl' -> './pip-21.2.4-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/setuptools-57.4.0-py3-none-any.whl' -> './setuptools-57.4.0-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/tini' -> './tini'
Converting labels from hardening manifest into command line args
Converting build args from hardening manifest into command line args
Build the image
STEP 1: FROM spark-operator/spark-py:2.4.5 AS base
Getting image source signatures
Copying blob sha256:b67d19e65ef653823ed62a5835399c610a40e8205c16f839c5cc567954fcf594
Copying blob sha256:4b7c55f086f067e6fffd121662a56b717e22c938782b7266e79fd12ecca1a788
Copying blob sha256:beed6f7f1c12dc709b670022d788cd5df8dba56df1c4a4a83c985cb6d63ce7cf
Copying blob sha256:017dfbffecc1851a4ac270fcc9127e712ad9648016a027c587df30e7bf15de85
Copying blob sha256:5a98df2206b31324864168aace1281c7336c358856b150376486ab360148799d
Copying blob sha256:0907fceccb5df6ebf2f03516d3866266866445583e9e8732908de258215fbaac
Copying blob sha256:507302fb1640ec70ce24b5c789139534de379450fd38a623cc616979ca0420f0
Copying blob sha256:375a95b78f3bad944afda156a1e18e7ba0df2d76a7a09fc2de76c523bb4fd62c
Copying blob sha256:6752f14edc0850ce0e548d054031db5d2ac420d6fb7014ceaa9fa95f0aa222c2
Copying blob sha256:43556d4e6d23274a80067d870489a0ae9215ff09ff142776797c54afdff20f5f
Copying blob sha256:9a7b83551c84d9dd4d547119833769b12e3e92c30ec9b60cd60ffb8c9838e56d
Copying blob sha256:bc5557987564c445d4cac35d7e3a6c7e45169970455c3fc543574227e0c57a9c
Copying blob sha256:85aa33aa52348589eb215f274d19193e0a2d8e36cbf374c75e8a1189b37a95b3
Copying blob sha256:795e6fa7e1d883c0f5752a01ca55021e7d24776892a8b711fad528acda204e61
Copying blob sha256:68c373d60139c952a928d297ab36ca1494831438618269c8f640c9d9be687047
Copying config sha256:a1c047ab7dfffc5339c16582c87c09bc6a242578cb1d565d55aa7cf8b76a2426
Writing manifest
to image destination
Storing signatures
--> a1c047ab7df
STEP 2: FROM registry1.dso.mil/ironbank/opensource/python/python38:3.8
Trying to pull registry1.dso.mil/ironbank/opensource/python/python38:3.8...
Getting image source signatures
Copying blob sha256:8e491b598e6168125170adb84b9fe7768b4496dcd7857f1b8b5410d94fb40b59
Copying blob sha256:7f75f7936d6bed6d671febf0c75c895150a9bf46c2682b230ef2face89b21916
Copying blob sha256:96476a77b28db43fcb170401c287700d91d95cdff9c06e5ea7b48289d40a8e57
Copying blob sha256:db2f50b75fc09a20e8d9c497d96ad384fbeacf7e77df71e4c7b578d4c07fccce
Copying config sha256:ed526ff17c200878aa01f2c6b059d6d3e70ad8c38e8bd226bb0ffa429b92af38
Writing manifest to image destination
Storing signatures
STEP 3: ARG spark_uid=185
STEP 4: USER root
STEP 5: COPY *.whl ./
STEP 6: RUN dnf -y update && dnf -y upgrade && pip3 install --upgrade ./*.whl && rm -rf /var/cache/dnf && mkdir -p /opt/spark/python/pyspark && mkdir -p /opt/spark/python/lib && chown -R 185:185 /opt/spark/
Red Hat Universal Base Image 8 (RPMs) - BaseOS 79 kB/s | 790 kB 00:10
Red Hat Universal Base Image 8 (RPMs) - AppStre 253 kB/s | 2.4 MB 00:09
Red Hat Universal Base Image 8 (RPMs) - CodeRea 1.5 kB/s | 14 kB 00:09
Dependencies resolved.
Nothing to do.
Complete!
Last metadata expiration check: 0:00:02 ago on Mon Aug 30 10:42:35 2021.
Dependencies resolved.
Nothing to do.
Complete!
Processing /pip-21.2.4-py3-none-any.whl
Processing /setuptools-57.4.0-py3-none-any.whl
setuptools is already installed with the same version as the provided wheel. Use --force-reinstall to force an installation of the wheel.
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 21.1.3
    Uninstalling pip-21.1.3:
      Successfully uninstalled pip-21.1.3
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Successfully installed pip-21.2.4
STEP 7: COPY --from=base /opt/spark/python/lib /opt/spark/python/lib
STEP 8: COPY scripts/entrypoint.sh /opt/entrypoint.sh
STEP 9: COPY tini /usr/bin/
STEP 10: ENV SPARK_HOME /opt/spark
STEP 11: WORKDIR /opt/spark/work-dir
STEP 12: RUN chmod g+w /opt/spark/work-dir && chmod a+x /opt/entrypoint.sh && chmod a+x /usr/bin/tini
STEP 13: ENTRYPOINT [ "/opt/entrypoint.sh" ]
STEP 14: USER ${spark_uid}
STEP 15: COMMIT registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py
Getting image source signatures
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:fe9a5cd9c83e50c9c3173b075db624c3055266e42eb6b3c062846f05fbb80b32
Copying blob sha256:c2f1c0919d60923a04dca28029d60baaef5eaf7ed1056d3fa152ea8aa5390587
Copying blob sha256:600dca3bd55bdbf53b4063fd9e970322f394d42993728a125b120a6535f9b30f
Copying config sha256:eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
Writing manifest to image destination
Storing signatures
--> eb63c4e594a
Successfully tagged registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-450477
+ buildah push --storage-driver=vfs --authfile staging_auth.json --digestfile=ci-artifacts/build/digest registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-450477
Getting image source signatures
Copying blob sha256:600dca3bd55bdbf53b4063fd9e970322f394d42993728a125b120a6535f9b30f
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob
sha256:c2f1c0919d60923a04dca28029d60baaef5eaf7ed1056d3fa152ea8aa5390587
Copying blob sha256:fe9a5cd9c83e50c9c3173b075db624c3055266e42eb6b3c062846f05fbb80b32
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying config sha256:eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
Writing manifest to image destination
Storing signatures
Read the tags
+ echo 'Read the tags'
+ tags_file=ci-artifacts/preflight/tags.txt
+ test -f ci-artifacts/preflight/tags.txt
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:v3.1.1
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:v3.1.1
Getting image source signatures
Copying blob sha256:c2f1c0919d60923a04dca28029d60baaef5eaf7ed1056d3fa152ea8aa5390587
Copying blob sha256:fe9a5cd9c83e50c9c3173b075db624c3055266e42eb6b3c062846f05fbb80b32
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:600dca3bd55bdbf53b4063fd9e970322f394d42993728a125b120a6535f9b30f
Copying config sha256:eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
Getting image source signatures
Copying blob sha256:fe9a5cd9c83e50c9c3173b075db624c3055266e42eb6b3c062846f05fbb80b32
Copying blob sha256:600dca3bd55bdbf53b4063fd9e970322f394d42993728a125b120a6535f9b30f
Copying blob sha256:c2f1c0919d60923a04dca28029d60baaef5eaf7ed1056d3fa152ea8aa5390587
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying config sha256:eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
++ podman inspect --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py --format '{{.Id}}'
+ IMAGE_ID=sha256:eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
+ echo IMAGE_ID=sha256:eb63c4e594a92271f0d50f91b3876e3828810bd936ddee544a07b280a11d2b13
A tarball of the built image can be retrieved from the documentation job artifacts.
+ IMAGE_PODMAN_SHA=sha256:f8af1a37551f4ca54fdd1a71e9ae665c51f46b58483063aca7de7bd884078801
+ echo IMAGE_PODMAN_SHA=sha256:f8af1a37551f4ca54fdd1a71e9ae665c51f46b58483063aca7de7bd884078801
+ echo IMAGE_FULLTAG=registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-450477
+ echo IMAGE_NAME=opensource/spark-operator/spark-py
+ branches=("master" "development")
+ [[ master development =~ master ]]
+ msg='A tarball of the built image can be retrieved from the documentation job artifacts.'
+ echo 'A tarball of the built image can be retrieved from the documentation job artifacts.'
section_end:1630320226:step_script
section_start:1630320226:upload_artifacts_on_success
Uploading artifacts for successful job
Uploading artifacts...
ci-artifacts/build/: found 2 matching files and directories
Uploading artifacts as "archive" to coordinator... ok id=6069132 responseStatus=201 Created token=x8v3wtF1
Uploading artifacts...
build.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator...
ok id=6069132 responseStatus=201 Created token=x8v3wtF1
section_end:1630320228:upload_artifacts_on_success
section_start:1630320228:cleanup_file_variables
Cleaning up file based variables
section_end:1630320228:cleanup_file_variables
Job succeeded
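The STEP 1 through STEP 14 lines in the build trace correspond to a Dockerfile roughly like the following. This is a reconstruction from the logged steps (instruction content as logged; line continuations and layout are mine), not the repository's actual file:

```dockerfile
FROM spark-operator/spark-py:2.4.5 AS base

FROM registry1.dso.mil/ironbank/opensource/python/python38:3.8
ARG spark_uid=185
USER root
COPY *.whl ./
RUN dnf -y update && \
    dnf -y upgrade && \
    pip3 install --upgrade ./*.whl && \
    rm -rf /var/cache/dnf && \
    mkdir -p /opt/spark/python/pyspark && \
    mkdir -p /opt/spark/python/lib && \
    chown -R 185:185 /opt/spark/
COPY --from=base /opt/spark/python/lib /opt/spark/python/lib
COPY scripts/entrypoint.sh /opt/entrypoint.sh
COPY tini /usr/bin/
ENV SPARK_HOME /opt/spark
WORKDIR /opt/spark/work-dir
RUN chmod g+w /opt/spark/work-dir && \
    chmod a+x /opt/entrypoint.sh && \
    chmod a+x /usr/bin/tini
ENTRYPOINT [ "/opt/entrypoint.sh" ]
USER ${spark_uid}
```

The multi-stage shape explains the log: the imported 2.4.5 tarball serves only as the `base` stage that donates /opt/spark/python/lib, while the shipped image is rebuilt on the Iron Bank python38 base and dropped to the non-root `spark_uid` at the end.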
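The get_sources stage above begins by polling the Istio sidecar's readiness endpoint (localhost:15020/healthz/ready) with curl until it returns HTTP 200; nothing else in the pod can reach the network until the proxy is up. A runnable sketch of that gate, where `ready` is a hypothetical stand-in that reports healthy after three polls so the loop can run without a service mesh:

```shell
attempts=0
max_polls=3

# Stand-in for the real readiness probe. The job's actual check is:
#   [ "$(curl --fail --silent --output /dev/null \
#       --write-out '%{http_code}' localhost:15020/healthz/ready)" -eq 200 ]
ready() {
  [ "$attempts" -ge "$max_polls" ]
}

# Same loop shape as the job: poll until ready, announcing each wait.
until ready; do
  attempts=$((attempts + 1))
  echo "Waiting for Sidecar"
  # the real job sleeps 3 seconds between polls; omitted here
done
echo "Sidecar available"
```

Note the job's version writes the curl body to /dev/stderr and compares the `%{http_code}` write-out; capping the number of polls (as `max_polls` hints) would keep a dead sidecar from hanging the job forever.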
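Before the build, the script reports "Converting labels from hardening manifest into command line args". The real parser reads hardening_manifest.yaml and is not shown in the log; the sketch below is hypothetical: it assumes the labels have already been flattened to KEY=VALUE lines (file name and label values are illustrative only) and simply prefixes each with `--label` for the build command.

```shell
# Hypothetical flattened label list standing in for hardening_manifest.yaml.
labels_file=$(mktemp)
printf 'org.opencontainers.image.title=spark-py\norg.opencontainers.image.version=v3.1.1\n' > "$labels_file"

# Accumulate one --label flag per KEY=VALUE line.
label_args=""
while IFS= read -r kv; do
  [ -n "$kv" ] || continue
  label_args="$label_args --label $kv"
done < "$labels_file"
rm -f "$labels_file"

echo "buildah bud$label_args ..."  # illustrative; the real invocation carries more flags
```

The same pattern would cover the "Converting build args" message with `--build-arg` in place of `--label`.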
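The `+ IFS= / + read -r tag` trace in step_script is a loop that reads one tag per line from ci-artifacts/preflight/tags.txt and runs `buildah tag` plus `buildah push` for each (v3.1.1, then latest). A minimal runnable sketch of that fan-out, with the buildah calls replaced by echo and a temp file standing in for the artifact:

```shell
image=registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py

# Stand-in for ci-artifacts/preflight/tags.txt, matching the tags in the log.
tags_file=$(mktemp)
printf 'v3.1.1\nlatest\n' > "$tags_file"

pushed=""
while IFS= read -r tag; do
  [ -n "$tag" ] || continue
  # Real commands (not run here):
  #   buildah tag --storage-driver=vfs "$image" "$image:$tag"
  #   buildah push --storage-driver=vfs --authfile staging_auth.json "$image:$tag"
  echo "would push $image:$tag"
  pushed="$pushed $tag"
done < "$tags_file"
rm -f "$tags_file"
```

Setting `IFS=` with `read -r` preserves each tag byte-for-byte (no field splitting, no backslash interpretation), which is why those two lines appear before every iteration in the trace.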