Running with gitlab-runner 13.8.0 (775dd39d)
  on global-shared-gitlab-runner-5dc56b4ff7-zm7kn XosKzEsy
section_start:1614197967:resolve_secrets
Resolving secrets
section_end:1614197967:resolve_secrets
section_start:1614197967:prepare_executor
Preparing the "kubernetes" executor
Using Kubernetes namespace: gitlab-runner-isolated
WARNING: Pulling GitLab Runner helper image from Docker Hub. Helper image is migrating to registry.gitlab.com, for more information see https://docs.gitlab.com/runner/configuration/advanced-configuration.html#migrating-helper-image-to-registrygitlabcom
Using Kubernetes executor with image ${GITLAB_INTERNAL_REGISTRY}/ironbank-tools/ironbank-pipeline/ib-pipeline-image:0.1 ...
section_end:1614197967:prepare_executor
section_start:1614197967:prepare_script
Preparing environment
Waiting for pod gitlab-runner-isolated/runner-xoskzesy-project-4877-concurrent-0ljf92 to be running, status is Pending
Running on runner-xoskzesy-project-4877-concurrent-0ljf92 via global-shared-gitlab-runner-5dc56b4ff7-zm7kn...
section_end:1614197970:prepare_script
section_start:1614197970:get_sources
Getting source from Git repository
Fetching changes with git depth set to 50...
Initialized empty Git repository in /builds/dsop/opensource/spark-operator/spark-py/.git/
Created fresh repository.
Checking out 1976363e as development...
Skipping Git submodules setup
section_end:1614197970:get_sources
section_start:1614197970:download_artifacts
Downloading artifacts
Downloading artifacts for hardening_manifest (2154657)...
Downloading artifacts from coordinator... ok  id=2154657 responseStatus=200 OK token=LPS9ibGn
Downloading artifacts for import artifacts (2154659)...
Downloading artifacts from coordinator... ok  id=2154659 responseStatus=200 OK token=sDfxU9es
Downloading artifacts for load scripts (2154654)...
Downloading artifacts from coordinator... ok  id=2154654 responseStatus=200 OK token=-qJzxSSz
Downloading artifacts for wl compare lint (2154658)...
Downloading artifacts from coordinator... ok  id=2154658 responseStatus=200 OK token=Vs3DrWHA
section_end:1614197979:download_artifacts
section_start:1614197979:step_script
Executing "step_script" stage of the job script
$ "${PIPELINE_REPO_DIR}/stages/build/build-run.sh"
Determine source registry based on branch
Load any images used in Dockerfile build
loading image ci-artifacts/import-artifacts/images/spark-operator-spark-py-2.4.4.tar
time="2021-02-24T20:19:39Z" level=error msg="unable to write system event: \"write unixgram @1aaad->/run/systemd/journal/socket: sendmsg: no such file or directory\""
Getting image source signatures
Copying blob sha256:f1b5933fe4b5f49bbe8258745cf396afe07e625bdab3168e364daf7c956b6b81
Copying blob sha256:ceaf9e1ebef5f9eaa707a838848a3c13800fcf32d7757be10d4b08fb85f1bc8a
Copying blob sha256:219f11f20cf205af7a1b07bb6075357be82a5ba95c9b1b48759ef62a890f5dee
Copying blob sha256:9b9b7f3d56a01e3d9076874990c62e7a516cc4032f784f421574d06b18ef9aa4
Copying blob sha256:382ad4215084d72badfb0bd395ba5654d2bbc363a8a44b3546045e9154e37d5b
Copying blob sha256:4bffdb237130695db25cec651b6fc9a7461f517e510ccb1c4645e0f3d423e2f2
Copying blob sha256:a68de98d2593c736b5ef11b2822587763928ed0c82d19dfe036f093e9753126b
Copying blob sha256:cc8bf859269b7ca30c30c08b477495227b570c69573066ac990b37e964e053a7
Copying blob sha256:5f2a42df8591cb9f128f5b1410a5239963dd4bd7354402c0cc7d92fb8ed2f474
Copying blob sha256:1cb7685e594f871b314054c0a385af960a0371c377d601fe19eb7ff522eb5a48
Copying blob sha256:e5492e23a9b02566f657ff378c5659a3538b72bb1e1b7c7bf9691499366eaf88
Copying blob sha256:5e9bf8f7c450e44d585305964f2207d0bbb119a56546a7fade173f4aca9f9c51
Copying blob sha256:557315e59be33a4e698e3c59e188a683372f9d22c3fe0f4fd748c9c4e5e150f1
Copying blob sha256:14a31e48ee105773d6f7cbd2af03a9733684d04489e4a5153d8a6cc374a18bf6
Copying blob sha256:cc4207c05b6cf7304dd543256a78bc18ca91e00e693163112710076b368245c9
Copying blob sha256:79d681921fa324899b9b16506369827f8ac003eb7b530a34c902b9ae3386e84b
Copying config sha256:d06b1fc033ea946a68c8e9c523dd02e665170ce25a68652cc5d9e62979b443ee
Writing manifest to image destination
Storing signatures
Loaded image(s): localhost/spark-operator/spark-py:2.4.4
Load HTTP and S3 external resources
'ci-artifacts/import-artifacts/external-resources/pip-21.0.1-py3-none-any.whl' -> './pip-21.0.1-py3-none-any.whl'
'ci-artifacts/import-artifacts/external-resources/setuptools-53.0.0-py3-none-any.whl' -> './setuptools-53.0.0-py3-none-any.whl'
Converting labels from hardening manifest into command line args
Converting build args from hardening manifest into command line args
Build the image
STEP 1: FROM spark-operator/spark-py:2.4.4 AS base
Getting image source signatures
Copying blob sha256:f1b5933fe4b5f49bbe8258745cf396afe07e625bdab3168e364daf7c956b6b81
Copying blob sha256:9b9b7f3d56a01e3d9076874990c62e7a516cc4032f784f421574d06b18ef9aa4
Copying blob sha256:ceaf9e1ebef5f9eaa707a838848a3c13800fcf32d7757be10d4b08fb85f1bc8a
Copying blob sha256:4bffdb237130695db25cec651b6fc9a7461f517e510ccb1c4645e0f3d423e2f2
Copying blob sha256:382ad4215084d72badfb0bd395ba5654d2bbc363a8a44b3546045e9154e37d5b
Copying blob sha256:219f11f20cf205af7a1b07bb6075357be82a5ba95c9b1b48759ef62a890f5dee
Copying blob sha256:a68de98d2593c736b5ef11b2822587763928ed0c82d19dfe036f093e9753126b
Copying blob sha256:cc8bf859269b7ca30c30c08b477495227b570c69573066ac990b37e964e053a7
Copying blob sha256:5f2a42df8591cb9f128f5b1410a5239963dd4bd7354402c0cc7d92fb8ed2f474
Copying blob sha256:1cb7685e594f871b314054c0a385af960a0371c377d601fe19eb7ff522eb5a48
Copying blob sha256:e5492e23a9b02566f657ff378c5659a3538b72bb1e1b7c7bf9691499366eaf88
Copying blob sha256:5e9bf8f7c450e44d585305964f2207d0bbb119a56546a7fade173f4aca9f9c51
Copying blob sha256:557315e59be33a4e698e3c59e188a683372f9d22c3fe0f4fd748c9c4e5e150f1
Copying blob sha256:14a31e48ee105773d6f7cbd2af03a9733684d04489e4a5153d8a6cc374a18bf6
Copying blob sha256:cc4207c05b6cf7304dd543256a78bc18ca91e00e693163112710076b368245c9
Copying blob sha256:79d681921fa324899b9b16506369827f8ac003eb7b530a34c902b9ae3386e84b
Copying config sha256:af0e015af8607426b7b78cae7be4a484df3bf58999d1c447d4c48b598b9caf9d
Writing manifest to image destination
Storing signatures
af0e015af8607426b7b78cae7be4a484df3bf58999d1c447d4c48b598b9caf9d
STEP 2: FROM registry1.dsop.io/ironbank/opensource/python/python38:3.8
Getting image source signatures
Copying blob sha256:9fd15ceb6af1254a7efcd960902db632e6d82abc9860f7469fe37eb89b1c7095
Copying blob sha256:3408f0c0f53ed01637f896fc4b2dfa3a0062cfccce5ee1b267dba08f90e18174
Copying blob sha256:eced0fee1cb302a6286f0191a70189c0d510a976f5ebd256d2e8b2c962c7a8e2
Copying blob sha256:8e2ac6b804ac5cb5a99d108663f112ee9b5c7af09431a69dbdf1efa19ca24345
Copying config sha256:57c16fbebfddee33417b8d67baee6398be25f2258a096454eb2845d6331c09ec
Writing manifest to image destination
Storing signatures
STEP 3: ARG spark_uid=185
STEP 4: USER root
STEP 5: COPY pip-21.0.1-py3-none-any.whl setuptools-53.0.0-py3-none-any.whl ./
STEP 6: RUN dnf -y update && dnf -y upgrade && pip3 install --upgrade ./pip-21.0.1-py3-none-any.whl ./setuptools-53.0.0-py3-none-any.whl && rm -rf /var/cache/dnf && mkdir -p /opt/spark/python/pyspark && mkdir -p /opt/spark/python/lib && chown -R 185:185 /opt/spark/
Red Hat Universal Base Image 8 (RPMs) - BaseOS   77 kB/s | 774 kB  00:10
Red Hat Universal Base Image 8 (RPMs) - AppStre 517 kB/s | 5.1 MB  00:10
Red Hat Universal Base Image 8 (RPMs) - CodeRea 1.3 kB/s |  13 kB  00:10
Dependencies resolved.
Nothing to do.
Complete!
Last metadata expiration check: 0:00:01 ago on Wed Feb 24 20:21:30 2021.
Dependencies resolved.
Nothing to do.
Complete!
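STEP 5 and STEP 6 above follow a common hardened-build pattern: the pip and setuptools wheels are copied into the build context and installed by path, so the image build never reaches out to PyPI. A minimal dry-run sketch of that RUN line (commands are printed, not executed, since dnf and the wheels only exist inside the build container):

```shell
# Dry-run reconstruction of STEP 6: run_step prints each command instead of
# executing it, because dnf and the local wheels are not available here.
run_step() { echo "+ $*"; }

run_step dnf -y update     # apply all available RPM updates to the base image
run_step dnf -y upgrade
# Install pip/setuptools from wheels shipped in the build context (no PyPI access needed)
run_step pip3 install --upgrade ./pip-21.0.1-py3-none-any.whl ./setuptools-53.0.0-py3-none-any.whl
run_step rm -rf /var/cache/dnf   # drop the package cache to keep the layer small
```

Chaining these with `&&` in a single RUN, as the Dockerfile does, keeps the update, install, and cache cleanup in one layer so the removed cache does not persist in an earlier layer.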
Processing /pip-21.0.1-py3-none-any.whl
Processing /setuptools-53.0.0-py3-none-any.whl
Installing collected packages: pip, setuptools
  Attempting uninstall: pip
    Found existing installation: pip 20.2.2
    Uninstalling pip-20.2.2:
      Successfully uninstalled pip-20.2.2
  Attempting uninstall: setuptools
    Found existing installation: setuptools 49.2.1
    Uninstalling setuptools-49.2.1:
      Successfully uninstalled setuptools-49.2.1
Successfully installed pip-21.0.1 setuptools-53.0.0
STEP 7: COPY --from=base /opt/spark/python/lib /opt/spark/python/lib
STEP 8: ENV SPARK_HOME /opt/spark
STEP 9: WORKDIR /opt/spark/work-dir
STEP 10: RUN chmod g+w /opt/spark/work-dir
STEP 11: ENTRYPOINT [ "/opt/entrypoint.sh" ]
STEP 12: USER ${spark_uid}
STEP 13: COMMIT registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py
Getting image source signatures
Copying blob sha256:72e7d306c279696257fb1796dac29d887127c53479403f8687d3aa4c560edff2
Copying blob sha256:9624be4353eb5540a948608d314a5690c6a0b841aa04979810b61fd190e04769
Copying blob sha256:2e57dce4718369776245aa1d8f4dc935aa4ae688b1676b0a9d55b7539e8decaf
Copying blob sha256:e14bd7890c3e8d12b6b84dda3596a3078227ad71ea31c9ce3c7f5354a5a0b61b
Copying blob sha256:6f45e916d3695d91cd27dc183499ca7c06ec038c02b453bbdceb3da99aa5f1f5
Copying config sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
Writing manifest to image destination
Storing signatures
fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
+ buildah tag --storage-driver=vfs registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:175719
+ buildah push --storage-driver=vfs --authfile staging_auth.json --digestfile=ci-artifacts/build/digest registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:175719
Getting image source signatures
Copying blob sha256:6f45e916d3695d91cd27dc183499ca7c06ec038c02b453bbdceb3da99aa5f1f5
Copying blob sha256:2e57dce4718369776245aa1d8f4dc935aa4ae688b1676b0a9d55b7539e8decaf
Copying blob sha256:9624be4353eb5540a948608d314a5690c6a0b841aa04979810b61fd190e04769
Copying blob sha256:72e7d306c279696257fb1796dac29d887127c53479403f8687d3aa4c560edff2
Copying blob sha256:e14bd7890c3e8d12b6b84dda3596a3078227ad71ea31c9ce3c7f5354a5a0b61b
Copying config sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
Writing manifest to image destination
Storing signatures
Read the tags
+ echo 'Read the tags'
+ tags_file=ci-artifacts/preflight/tags.txt
+ test -f ci-artifacts/preflight/tags.txt
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:2.4.4
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:2.4.4
Getting image source signatures
Copying blob sha256:6f45e916d3695d91cd27dc183499ca7c06ec038c02b453bbdceb3da99aa5f1f5
Copying blob sha256:72e7d306c279696257fb1796dac29d887127c53479403f8687d3aa4c560edff2
Copying blob sha256:2e57dce4718369776245aa1d8f4dc935aa4ae688b1676b0a9d55b7539e8decaf
Copying blob sha256:9624be4353eb5540a948608d314a5690c6a0b841aa04979810b61fd190e04769
Copying blob sha256:e14bd7890c3e8d12b6b84dda3596a3078227ad71ea31c9ce3c7f5354a5a0b61b
Copying config sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
+ buildah tag --storage-driver=vfs registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:latest
+ buildah push --storage-driver=vfs --authfile staging_auth.json registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:latest
Getting image source signatures
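The `+ IFS=` / `+ read -r tag` trace lines above come from a loop that reads `ci-artifacts/preflight/tags.txt` and tags and pushes the staging image once per line. A minimal sketch of that loop, as a dry-run that prints the buildah commands instead of running them (buildah and the staging registry are assumed unavailable here; the tag values `2.4.4` and `latest` mirror the log):

```shell
STAGING="registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py"

# Stand-in for ci-artifacts/preflight/tags.txt: one tag per line.
tags_file=$(mktemp)
printf '2.4.4\nlatest\n' > "$tags_file"

# Same read pattern as the traced loop: IFS= and -r preserve each line verbatim.
# Dry-run: the buildah commands are echoed, not executed.
out=$(while IFS= read -r tag; do
  echo "buildah tag --storage-driver=vfs ${STAGING} ${STAGING}:${tag}"
  echo "buildah push --storage-driver=vfs --authfile staging_auth.json ${STAGING}:${tag}"
done < "$tags_file")
printf '%s\n' "$out"
rm -f "$tags_file"
```

Tagging the committed image once per entry and pushing each tag separately is why the same five blobs appear three times in the log: the `:175719`, `:2.4.4`, and `:latest` pushes each re-verify the layers against the registry.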
Copying blob sha256:9624be4353eb5540a948608d314a5690c6a0b841aa04979810b61fd190e04769
Copying blob sha256:e14bd7890c3e8d12b6b84dda3596a3078227ad71ea31c9ce3c7f5354a5a0b61b
Copying blob sha256:2e57dce4718369776245aa1d8f4dc935aa4ae688b1676b0a9d55b7539e8decaf
Copying blob sha256:6f45e916d3695d91cd27dc183499ca7c06ec038c02b453bbdceb3da99aa5f1f5
Copying blob sha256:72e7d306c279696257fb1796dac29d887127c53479403f8687d3aa4c560edff2
Copying config sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
Writing manifest to image destination
Storing signatures
+ IFS=
+ read -r tag
+ skopeo copy --src-authfile staging_auth.json docker://registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:175719 docker-archive:ci-artifacts/build/spark-py-175719.tar
Getting image source signatures
Copying blob sha256:9fd15ceb6af1254a7efcd960902db632e6d82abc9860f7469fe37eb89b1c7095
Copying blob sha256:3408f0c0f53ed01637f896fc4b2dfa3a0062cfccce5ee1b267dba08f90e18174
Copying blob sha256:8e2ac6b804ac5cb5a99d108663f112ee9b5c7af09431a69dbdf1efa19ca24345
Copying blob sha256:eced0fee1cb302a6286f0191a70189c0d510a976f5ebd256d2e8b2c962c7a8e2
Copying blob sha256:1f899be16a2a34cfd1e4cfaea4f66cb264eb743fdeebcdcb4a79929ad08985de
Copying config sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
Writing manifest to image destination
Storing signatures
++ podman inspect --storage-driver=vfs registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py --format '{{.Id}}'
+ IMAGE_ID=sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
+ echo IMAGE_ID=sha256:fa20339954ab88e124cf87f0db1be7d29dfde1bf60d1bea9d44ccb7482520864
++ sha256sum ci-artifacts/build/spark-py-175719.tar
++ grep -E '^[a-zA-Z0-9]+' -o
+ IMAGE_TAR_SHA=ea95ca38b2ce7fd86144bad321d98b8023bfb9eba82134f77af6071f1bd35803
+ echo IMAGE_TAR_SHA=ea95ca38b2ce7fd86144bad321d98b8023bfb9eba82134f77af6071f1bd35803
+ IMAGE_PODMAN_SHA=sha256:0cef6bb33820927ba839560568354b5798e1c7766497426a2206a44805f08f61
+ echo IMAGE_PODMAN_SHA=sha256:0cef6bb33820927ba839560568354b5798e1c7766497426a2206a44805f08f61
+ echo IMAGE_FILE=spark-py-175719
+ echo IMAGE_FULLTAG=registry1.dsop.io/ironbank-staging/opensource/spark-operator/spark-py:175719
+ echo IM_NAME=opensource/spark-operator/spark-py
section_end:1614198180:step_script
section_start:1614198180:upload_artifacts_on_success
Uploading artifacts for successful job
Uploading artifacts...
ci-artifacts/build/: found 3 matching files and directories
Uploading artifacts as "archive" to coordinator... ok  id=2154661 responseStatus=201 Created token=d3vSGFPn
Uploading artifacts...
build.env: found 1 matching files and directories
Uploading artifacts as "dotenv" to coordinator... ok  id=2154661 responseStatus=201 Created token=d3vSGFPn
section_end:1614198211:upload_artifacts_on_success
section_start:1614198211:cleanup_file_variables
Cleaning up file based variables
section_end:1614198211:cleanup_file_variables
Job succeeded
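The `IMAGE_TAR_SHA` trace above extracts the tarball digest by piping `sha256sum` through `grep -E '^[a-zA-Z0-9]+' -o`, which keeps only the leading hex digest and drops the trailing filename column. The same pattern can be exercised against any file (a throwaway temp file here, since the image tar itself is not available outside the job):

```shell
# Digest-extraction pattern from the job trace, run against a temp file.
tmp=$(mktemp)
printf 'hello\n' > "$tmp"
# sha256sum prints "<digest>  <filename>"; the anchored grep keeps only the digest.
IMAGE_TAR_SHA=$(sha256sum "$tmp" | grep -E '^[a-zA-Z0-9]+' -o)
echo "IMAGE_TAR_SHA=${IMAGE_TAR_SHA}"
rm -f "$tmp"
```

The anchored match is what makes this safe: without `^`, `grep -o` would also emit alphanumeric runs from the filename portion of the `sha256sum` output.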