Running with gitlab-runner 13.12.0 (7a6612da)
  on dsop-shared-gitlab-runner-5fcd8977b8-m6qmr JrExJ6yx
  feature flags: FF_USE_LEGACY_KUBERNETES_EXECUTION_STRATEGY:false
section_start:1631097524:resolve_secrets
Resolving secrets
section_end:1631097524:resolve_secrets
section_start:1631097524:prepare_executor
Preparing the "kubernetes" executor
Using Kubernetes namespace: gitlab-runner-ironbank-dsop
Using Kubernetes executor with image registry1.dso.mil/ironbank/ironbank-pipelines/rootless-podman:0.2 ...
Using attach strategy to execute scripts...
section_end:1631097524:prepare_executor
section_start:1631097524:prepare_script
Preparing environment
Waiting for pod gitlab-runner-ironbank-dsop/runner-jrexj6yx-project-4877-concurrent-0m25m8 to be running, status is Pending
Waiting for pod gitlab-runner-ironbank-dsop/runner-jrexj6yx-project-4877-concurrent-0m25m8 to be running, status is Pending
	ContainersNotInitialized: "containers with incomplete status: [istio-init]"
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
	ContainersNotReady: "containers with unready status: [build helper istio-proxy]"
Running on runner-jrexj6yx-project-4877-concurrent-0m25m8 via dsop-shared-gitlab-runner-5fcd8977b8-m6qmr...
section_end:1631097530:prepare_script
section_start:1631097530:get_sources
Getting source from Git repository
$ until [ $(curl --fail --silent --output /dev/stderr --write-out "%{http_code}" localhost:15020/healthz/ready) -eq 200 ]; do echo Waiting for Sidecar; sleep 3 ; done ; echo Sidecar available;
Sidecar available
Fetching changes with git depth set to 50...
Initialized empty Git repository in /builds/JrExJ6yx/0/dsop/opensource/spark-operator/spark-py/.git/
Created fresh repository.
Checking out 7a878b13 as master...
Skipping Git submodules setup
section_end:1631097531:get_sources
section_start:1631097531:download_artifacts
Downloading artifacts
Downloading artifacts for hardening-manifest (6314387)...
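The get_sources stage above gates the job on the Istio sidecar: an `until` loop polls `localhost:15020/healthz/ready` with `curl --write-out "%{http_code}"` every 3 seconds until it returns 200. The same readiness-gate pattern can be sketched as a reusable function with an injectable probe command; the function name `wait_for_ready` and its parameters are illustrative, not part of the pipeline:

```shell
#!/bin/sh
# wait_for_ready PROBE_CMD MAX_ATTEMPTS SLEEP_SECONDS
# Runs PROBE_CMD until it exits 0 (ready) or MAX_ATTEMPTS is exhausted.
wait_for_ready() {
    probe=$1 max=$2 interval=$3
    i=0
    while [ "$i" -lt "$max" ]; do
        # Unquoted expansion lets the probe be a multi-word command.
        if $probe; then
            echo "ready"
            return 0
        fi
        echo "waiting" >&2
        sleep "$interval"
        i=$((i + 1))
    done
    return 1
}

# For the job above the probe would be (illustrative invocation, same curl flags):
# wait_for_ready 'curl --fail --silent --output /dev/null localhost:15020/healthz/ready' 20 3
```

Note that `--fail` already makes curl exit non-zero on any non-2xx response, so comparing the written-out `%{http_code}` against 200, as the log's one-liner does, is a belt-and-braces check.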
Downloading artifacts from coordinator... ok        id=6314387 responseStatus=200 OK token=rMkYNze8
WARNING: ci-artifacts/preflight/: lchown ci-artifacts/preflight/: operation not permitted (suppressing repeats)
Downloading artifacts for import-artifacts (6314389)...
Downloading artifacts from coordinator... ok        id=6314389 responseStatus=200 OK token=U386WpB9
WARNING: ci-artifacts/import-artifacts/: lchown ci-artifacts/import-artifacts/: operation not permitted (suppressing repeats)
Downloading artifacts for load-scripts (6314384)...
Downloading artifacts from coordinator... ok        id=6314384 responseStatus=200 OK token=qy4ut81Q
WARNING: ci-artifacts/[MASKED]/: lchown ci-artifacts/[MASKED]/: operation not permitted (suppressing repeats)
Downloading artifacts for wl-compare-lint (6314388)...
Downloading artifacts from coordinator... ok        id=6314388 responseStatus=200 OK token=cKkYsLKz
WARNING: ci-artifacts/lint/: lchown ci-artifacts/lint/: operation not permitted (suppressing repeats)
section_end:1631097550:download_artifacts
section_start:1631097550:step_script
Executing "step_script" stage of the job script
$ "${PIPELINE_REPO_DIR}/stages/build/build-run.sh"
Determine source registry based on branch
Load any images used in Dockerfile build
loading image ci-artifacts/import-artifacts/images/spark-operator-spark-py-2.4.5.tar
Getting image source signatures
Copying blob sha256:0907fceccb5df6ebf2f03516d3866266866445583e9e8732908de258215fbaac
Copying blob sha256:017dfbffecc1851a4ac270fcc9127e712ad9648016a027c587df30e7bf15de85
Copying blob sha256:4b7c55f086f067e6fffd121662a56b717e22c938782b7266e79fd12ecca1a788
Copying blob sha256:5a98df2206b31324864168aace1281c7336c358856b150376486ab360148799d
Copying blob sha256:b67d19e65ef653823ed62a5835399c610a40e8205c16f839c5cc567954fcf594
Copying blob sha256:beed6f7f1c12dc709b670022d788cd5df8dba56df1c4a4a83c985cb6d63ce7cf
Copying blob sha256:507302fb1640ec70ce24b5c789139534de379450fd38a623cc616979ca0420f0
Copying blob sha256:375a95b78f3bad944afda156a1e18e7ba0df2d76a7a09fc2de76c523bb4fd62c
Copying blob sha256:6752f14edc0850ce0e548d054031db5d2ac420d6fb7014ceaa9fa95f0aa222c2
Copying blob sha256:43556d4e6d23274a80067d870489a0ae9215ff09ff142776797c54afdff20f5f
Copying blob sha256:9a7b83551c84d9dd4d547119833769b12e3e92c30ec9b60cd60ffb8c9838e56d
Copying blob sha256:bc5557987564c445d4cac35d7e3a6c7e45169970455c3fc543574227e0c57a9c
Copying blob sha256:85aa33aa52348589eb215f274d19193e0a2d8e36cbf374c75e8a1189b37a95b3
Copying blob sha256:795e6fa7e1d883c0f5752a01ca55021e7d24776892a8b711fad528acda204e61
Copying blob sha256:68c373d60139c952a928d297ab36ca1494831438618269c8f640c9d9be687047
Copying config sha256:579644ff878d4148e707951134f0b3758a6d0ca431173572e9d93489032929cc
Writing manifest to image destination
Storing signatures
Loaded image(s): localhost/spark-operator/spark-py:2.4.5
Load HTTP and S3 external resources
'ci-artifacts/import-artifacts/external-resources/pip-21.2.4-py3-none-any.whl' -> './pip-21.2.4-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/setuptools-57.4.0-py3-none-any.whl' -> './setuptools-57.4.0-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/tini' -> './tini'
Converting labels from hardening manifest into command line args
Converting build args from hardening manifest into command line args
Build the image
STEP 1: FROM spark-operator/spark-py:2.4.5 AS base
Getting image source signatures
Copying blob sha256:b67d19e65ef653823ed62a5835399c610a40e8205c16f839c5cc567954fcf594
Copying blob sha256:4b7c55f086f067e6fffd121662a56b717e22c938782b7266e79fd12ecca1a788
Copying blob sha256:beed6f7f1c12dc709b670022d788cd5df8dba56df1c4a4a83c985cb6d63ce7cf
Copying blob sha256:017dfbffecc1851a4ac270fcc9127e712ad9648016a027c587df30e7bf15de85
Copying blob sha256:5a98df2206b31324864168aace1281c7336c358856b150376486ab360148799d
Copying blob sha256:0907fceccb5df6ebf2f03516d3866266866445583e9e8732908de258215fbaac
Copying blob sha256:507302fb1640ec70ce24b5c789139534de379450fd38a623cc616979ca0420f0
Copying blob sha256:375a95b78f3bad944afda156a1e18e7ba0df2d76a7a09fc2de76c523bb4fd62c
Copying blob sha256:6752f14edc0850ce0e548d054031db5d2ac420d6fb7014ceaa9fa95f0aa222c2
Copying blob sha256:43556d4e6d23274a80067d870489a0ae9215ff09ff142776797c54afdff20f5f
Copying blob sha256:9a7b83551c84d9dd4d547119833769b12e3e92c30ec9b60cd60ffb8c9838e56d
Copying blob sha256:bc5557987564c445d4cac35d7e3a6c7e45169970455c3fc543574227e0c57a9c
Copying blob sha256:85aa33aa52348589eb215f274d19193e0a2d8e36cbf374c75e8a1189b37a95b3
Copying blob sha256:795e6fa7e1d883c0f5752a01ca55021e7d24776892a8b711fad528acda204e61
Copying blob sha256:68c373d60139c952a928d297ab36ca1494831438618269c8f640c9d9be687047
Copying config sha256:d3e3218da17e6a4759d442196d3c60a6861c5d6dda15a865f938d9401bb92e67
Writing manifest to image destination
Storing signatures
--> d3e3218da17
STEP 2: FROM registry1.dso.mil/ironbank/opensource/python/python38:3.8
Trying to pull registry1.dso.mil/ironbank/opensource/python/python38:3.8...
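Before STEP 1 the build script reports "Converting labels from hardening manifest into command line args". The actual script is not shown in the log, but one plausible shape for that conversion, sketched under the assumption that the manifest has already been flattened to `KEY=VALUE` lines (the function name `labels_to_args` is illustrative):

```shell
#!/bin/sh
# Read newline-separated KEY=VALUE pairs on stdin and emit a repeated
# --label argument for each, suitable for appending to a buildah/podman
# command line. Caveat: values containing spaces would need bash arrays
# or careful quoting; this sketch assumes simple values.
labels_to_args() {
    args=""
    while IFS= read -r pair; do
        [ -n "$pair" ] && args="$args --label $pair"
    done
    printf '%s\n' "$args"
}
```

For example, `printf 'maintainer=ironbank\nversion=2.4.5\n' | labels_to_args` emits `--label maintainer=ironbank --label version=2.4.5` (with a leading space); the same pattern would cover the "Converting build args" step with `--build-arg` in place of `--label`.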
Getting image source signatures
Copying blob sha256:361ab352f363f6ec6a580e3cfcf5dca838fb183986a9850dba165161b137d3bb
Copying blob sha256:96476a77b28db43fcb170401c287700d91d95cdff9c06e5ea7b48289d40a8e57
Copying blob sha256:7e9523461232e9a8f06fb30c474bcf3f7fa7dd493076bf8f3c0be657bce43490
Copying blob sha256:db2f50b75fc09a20e8d9c497d96ad384fbeacf7e77df71e4c7b578d4c07fccce
Copying config sha256:530d179ae49e56087597154bc85c20a30085fceb9294694f0e27b047da2331f1
Writing manifest to image destination
Storing signatures
STEP 3: ARG spark_uid=185
STEP 4: USER root
STEP 5: COPY *.whl ./
STEP 6: RUN dnf -y update && dnf -y upgrade && pip3 install --upgrade ./*.whl && rm -rf /var/cache/dnf && mkdir -p /opt/spark/python/pyspark && mkdir -p /opt/spark/python/lib && chown -R 185:185 /opt/spark/
Red Hat Universal Base Image 8 (RPMs) - BaseOS   79 kB/s | 790 kB     00:10
Red Hat Universal Base Image 8 (RPMs) - AppStre 247 kB/s | 2.4 MB     00:09
Red Hat Universal Base Image 8 (RPMs) - CodeRea 1.5 kB/s |  14 kB     00:09
Dependencies resolved.
Nothing to do.
Complete!
Last metadata expiration check: 0:00:01 ago on Wed Sep  8 10:41:22 2021.
Dependencies resolved.
Nothing to do.
Complete!
Processing /pip-21.2.4-py3-none-any.whl
Processing /setuptools-57.4.0-py3-none-any.whl
  setuptools is already installed with the same version as the provided wheel. Use --force-reinstall to force an installation of the wheel.
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 21.1.3
    Uninstalling pip-21.1.3:
      Successfully uninstalled pip-21.1.3
Successfully installed pip-21.2.4
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
STEP 7: COPY --from=base /opt/spark/python/lib /opt/spark/python/lib
STEP 8: COPY scripts/entrypoint.sh /opt/entrypoint.sh
STEP 9: COPY tini /usr/bin/
STEP 10: ENV SPARK_HOME /opt/spark
STEP 11: WORKDIR /opt/spark/work-dir
STEP 12: RUN chmod g+w /opt/spark/work-dir && chmod a+x /opt/entrypoint.sh && chmod a+x /usr/bin/tini
STEP 13: ENTRYPOINT [ "/opt/entrypoint.sh" ]
STEP 14: USER ${spark_uid}
STEP 15: COMMIT registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py
Getting image source signatures
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:a7289086c95c733fa09dcfd84e354b7c88469dc5dc470dc53a2dfa9e83cae53b
Copying blob sha256:f26e38742ba1d4bde0dbebfe5040a5c40b88c4351eab6e9113b67feeaf5dfb9c
Copying blob sha256:36035eaff48f95ad27650dec5fdd97860d435fbba5dca20503010b1948fe57fe
Copying config sha256:5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
Writing manifest to image destination
Storing signatures
--> 5720a086743
Successfully tagged registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-465784
+ buildah push --storage-driver=vfs --authfile staging_auth.json --digestfile=ci-artifacts/build/digest registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-465784
Getting image source signatures
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:36035eaff48f95ad27650dec5fdd97860d435fbba5dca20503010b1948fe57fe
Copying blob sha256:a7289086c95c733fa09dcfd84e354b7c88469dc5dc470dc53a2dfa9e83cae53b
Copying blob sha256:f26e38742ba1d4bde0dbebfe5040a5c40b88c4351eab6e9113b67feeaf5dfb9c
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying config sha256:5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
Writing manifest to image destination
Storing signatures
+ echo 'Read the tags'
Read the tags
+ tags_file=ci-artifacts/preflight/tags.txt
+ test -f ci-artifacts/preflight/tags.txt
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:v3.1.1
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:v3.1.1
Getting image source signatures
Copying blob sha256:a7289086c95c733fa09dcfd84e354b7c88469dc5dc470dc53a2dfa9e83cae53b
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:36035eaff48f95ad27650dec5fdd97860d435fbba5dca20503010b1948fe57fe
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:f26e38742ba1d4bde0dbebfe5040a5c40b88c4351eab6e9113b67feeaf5dfb9c
Copying config sha256:5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:latest
Getting image source signatures
Copying blob sha256:98469092e6042f8c9cc81dcb1a710957fb5ef27817c9b178f7b71c4f242cb2ed
Copying blob sha256:f26e38742ba1d4bde0dbebfe5040a5c40b88c4351eab6e9113b67feeaf5dfb9c
Copying blob sha256:bfb9caafb0fc0d8496a27709f1698ac90d1a306556387a75b92a86063544f4c8
Copying blob sha256:a7289086c95c733fa09dcfd84e354b7c88469dc5dc470dc53a2dfa9e83cae53b
Copying blob sha256:36035eaff48f95ad27650dec5fdd97860d435fbba5dca20503010b1948fe57fe
Copying config sha256:5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
++ podman inspect --storage-driver=vfs registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py --format '{{.Id}}'
+ IMAGE_ID=sha256:5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
+ echo IMAGE_ID=sha256:5720a086743837656ec37f3e006ed8dfa1d7de1a26fd2d7b132a8a739e475c14
+ IMAGE_PODMAN_SHA=sha256:ef6e2672125dea01a4bdb6cba952236ca8732d4afd685772919dc8f24acfc8f8
+ echo IMAGE_PODMAN_SHA=sha256:ef6e2672125dea01a4bdb6cba952236ca8732d4afd685772919dc8f24acfc8f8
+ echo IMAGE_FULLTAG=registry1.dso.mil/ironbank-staging/opensource/spark-operator/spark-py:ibci-465784
+ echo IMAGE_NAME=opensource/spark-operator/spark-py
+ branches=("master" "development")
+ [[ master development =~ master ]]
+ msg='A tarball of the built image can be retrieved from the documentation job artifacts.'
+ echo 'A tarball of the built image can be retrieved from the documentation job artifacts.'
A tarball of the built image can be retrieved from the documentation job artifacts.
section_end:1631097716:step_script
section_start:1631097716:upload_artifacts_on_success
Uploading artifacts for successful job
Uploading artifacts...
ci-artifacts/build/: found 2 matching files and directories
Uploading artifacts as "archive" to coordinator... ok  id=6314391 responseStatus=201 Created token=y3Q8ahFD
Uploading artifacts...
build.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok  id=6314391 responseStatus=201 Created token=y3Q8ahFD
section_end:1631097718:upload_artifacts_on_success
section_start:1631097718:cleanup_file_variables
Cleaning up file based variables
section_end:1631097718:cleanup_file_variables
Job succeeded
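After the digest push, the trace above shows the script reading `ci-artifacts/preflight/tags.txt` line by line (`IFS=` with `read -r tag`) and issuing one `buildah tag` plus one `buildah push` per tag. That loop can be reconstructed roughly as follows; `push_tags` and the injectable runner parameter are illustrative, added so the loop can be dry-run without buildah installed:

```shell
#!/bin/sh
# push_tags IMAGE TAGS_FILE [RUNNER]
# Re-tag IMAGE once per line of TAGS_FILE and push each tag, mirroring the
# buildah invocations in the log. Pass "echo" as RUNNER to dry-run.
push_tags() {
    image=$1 tags_file=$2 run=${3:-buildah}
    test -f "$tags_file" || return 0      # no tags file: nothing to do
    while IFS= read -r tag; do
        [ -z "$tag" ] && continue
        "$run" tag --storage-driver=vfs "$image" "$image:$tag"
        "$run" push --storage-driver=vfs --authfile staging_auth.json "$image:$tag"
    done < "$tags_file"
}
```

With a tags file containing `v3.1.1` and `latest` this reproduces the four tag/push calls seen in the trace. Setting `IFS=` together with `read -r` is what makes the loop safe for arbitrary tag text: it preserves leading/trailing whitespace and stops backslashes being interpreted as escapes.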