
copying sources: uploading build context: exit status 1 #3944

Closed · jeusdi opened this issue Apr 11, 2020 · 9 comments · Fixed by #4023

jeusdi commented Apr 11, 2020

This is my skaffold.yaml. It's very straightforward:

apiVersion: skaffold/v2beta1
kind: Config
metadata:
  name: spring-boot-slab
build:
  artifacts:
  - image: skaffold-covid-backend
    kaniko: {}
  cluster: {}
deploy:
  kubectl:
    manifests:
    - k8s/*

When I try to build it:

$ skaffold build --default-repo=registry.local:5000 --verbosity=debug
INFO[0000] Skaffold &{Version:v1.7.0 ConfigVersion:skaffold/v2beta1 GitVersion: GitCommit:145f59579470eb1f0a7f40d8e0924f8716c6f05b GitTreeState:clean BuildDate:2020-04-02T21:49:58Z GoVersion:go1.14 Compiler:gc Platform:linux/amd64} 
DEBU[0000] validating yamltags of struct SkaffoldConfig 
DEBU[0000] validating yamltags of struct Metadata       
DEBU[0000] validating yamltags of struct Pipeline       
DEBU[0000] validating yamltags of struct BuildConfig    
DEBU[0000] validating yamltags of struct Artifact       
DEBU[0000] validating yamltags of struct ArtifactType   
DEBU[0000] validating yamltags of struct KanikoArtifact 
DEBU[0000] validating yamltags of struct TagPolicy      
DEBU[0000] validating yamltags of struct GitTagger      
DEBU[0000] validating yamltags of struct BuildType      
DEBU[0000] validating yamltags of struct ClusterDetails 
DEBU[0000] validating yamltags of struct DeployConfig   
DEBU[0000] validating yamltags of struct DeployType     
DEBU[0000] validating yamltags of struct KubectlDeploy  
DEBU[0000] validating yamltags of struct KubectlFlags   
INFO[0000] Using kubectl context: k3s-traefik-v2        
DEBU[0000] Using builder: cluster                       
DEBU[0000] setting Docker user agent to skaffold-v1.7.0 
Generating tags...
 - skaffold-covid-backend -> DEBU[0000] Running command: [git describe --tags --always] 
DEBU[0000] Command output: [c5dfd81
]                   
DEBU[0000] Running command: [git status . --porcelain]  
DEBU[0000] Command output: [ M skaffold.yaml
?? k8s/configmap.yaml
] 
registry.local:5000/skaffold-covid-backend:c5dfd81-dirty
INFO[0000] Tags generated in 3.135415ms                 
Checking cache...
 - skaffold-covid-backend: WARN[0016] error checking cache, caching may not work as expected: getting hash for artifact skaffold-covid-backend: getting dependencies for "skaffold-covid-backend": file pattern [target/*.jar] must match at least one file 
Error checking cache. Rebuilding.
INFO[0016] Cache check complete in 16.225444103s        
Building [skaffold-covid-backend]...
DEBU[0016] getting client config for kubeContext: ``    
INFO[0016] Waiting for kaniko-lmf99 to be initialized   
DEBU[0017] Running command: [kubectl --context k3s-traefik-v2 exec -i kaniko-lmf99 -c kaniko-init-container -n skaffold -- tar -xf - -C /kaniko/buildcontext] 
FATA[0017] build failed: building [skaffold-covid-backend]: copying sources: uploading build context: exit status 1

As you can see:

WARN[0016] error checking cache, caching may not work as expected: getting hash for artifact skaffold-covid-backend: getting dependencies for "skaffold-covid-backend": file pattern [target/*.jar] must match at least one file
Error checking cache. Rebuilding.

And

DEBU[0017] Running command: [kubectl --context k3s-traefik-v2 exec -i kaniko-lmf99 -c kaniko-init-container -n skaffold -- tar -xf - -C /kaniko/buildcontext]
FATA[0017] build failed: building [skaffold-covid-backend]: copying sources: uploading build context: exit status 1

My project is a Maven project, and my Dockerfile is:

FROM openjdk:8-jdk-alpine
RUN addgroup -S spring && adduser -S spring -G spring
USER spring:spring
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java","-jar","/app.jar"]

Any ideas?
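
Note on the cache warning above: skaffold derives the artifact's file dependencies from the Dockerfile, so the COPY pattern target/*.jar has to match at least one existing file. In other words, the jar must be built before skaffold runs. A minimal pre-build, assuming a standard Maven layout:

$ mvn -DskipTests package   # produces target/<artifact>-<version>.jar
$ skaffold build --default-repo=registry.local:5000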

@tstromberg changed the title from "[HELP] kaniko build failed since uploading build context is failing" to "copying sources: uploading build context: exit status 1" on Apr 15, 2020
@tstromberg (Contributor)

The error message here is useless. Shouldn't we be including stderr in the log?

jturi commented Apr 15, 2020

UPDATE:
Changed my container base image to

FROM oraclelinux:7

Now everything works.
If anybody could explain why it works with oraclelinux but not with alpine or ubuntu, that would be great.

ISSUE:

Skaffold version: v1.7.0

DEBU[0000] getting client config for kubeContext: ``
INFO[0000] Waiting for kaniko-pxfjg to be initialized
DEBU[0003] Running command: [kubectl --context kworker02 --namespace jenkins-kubernetes exec -i kaniko-pxfjg -c kaniko-init-container -n jenkins-kubernetes -- tar -xf - -C /kaniko/buildcontext]
DEBU[0003] FIXME: Got an status-code for which error does not match any expected type!!!: -1  module=api status_code=-1
FATA[0005] failed to build: build failed: building [private_repo/flask_app_img]: copying sources: uploading build context: exit status 1

I'm getting the same copying sources: uploading build context: exit status 1 error as well.
My flask_app repository and skaffold.yaml deploy to the same namespace on the same cluster, with the same secrets, without any problem from our bare-metal host (it has Docker installed, but the daemon is not running).

But if I build and run a Docker image with skaffold in it (experimenting with the Jenkins Kubernetes plugin; the container has an environment similar to the host's) and try to do the same from that container, I get this error. How can I get a more detailed error message? I can't debug it further.

  • I run: SKAFFOLD_UPDATE_CHECK=false skaffold --verbosity DEBUG --cache-artifacts=false run -n jenkins-kubernetes
  • Container has kubectl, tar, skaffold, ~/.kube/config
  • kubectl exec, kubectl cp, kubectl get secrets, and kubectl get pods all run fine from this container, but there has to be some difference between it and the host (Oracle Linux).

I tried to build the container with

FROM jenkins-jnlp-slave:3.10-1-alpine
and
FROM ubuntu:latest

No change. I don't understand why it is working on the host but not in a container running on the host.

Maybe Skaffold is missing a dependency or some environment variable in the container?

@tstromberg added the kind/question label on Apr 20, 2020
@dgageot self-assigned this on Apr 24, 2020
dgageot added a commit to dgageot/skaffold that referenced this issue Apr 24, 2020
Fix GoogleContainerTools#3944

Signed-off-by: David Gageot <david@gageot.net>
dgageot (Contributor) commented Apr 24, 2020

@jturi The problem is that we rely on tar to upload the sources to your container. I'll improve the error message to make that much clearer.
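
In the meantime, the failing step can be reproduced by hand to surface tar's stderr. A sketch based on the command in the debug log above (pod name, context, and namespace are taken from that log; substitute your own):

$ tar -cf - . | kubectl --context k3s-traefik-v2 exec -i kaniko-lmf99 -c kaniko-init-container -n skaffold -- tar -xf - -C /kaniko/buildcontext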

jturi commented Apr 25, 2020

@dgageot Hi David, thanks for looking into this!

We just got server updates and Skaffold no longer works on some of our machines:

Not working (updated server oracle7)
DEBU[0000] getting client config for kubeContext: ``
INFO[0000] Waiting for kaniko-kkqvn to be initialized
DEBU[0001] Running command: [kubectl --context dgi-it-kwmvpm02 exec -i kaniko-kkqvn -c kaniko-init-container -n jarvis-dev -- tar -xf - -C /kaniko/buildcontext]
DEBU[0001] FIXME: Got an status-code for which error does not match any expected type!!!: -1  module=api status_code=-1
FATA[0002] failed to build: build failed: building [private_registry/jarvis_api_dev]: copying sources: uploading build context: exit status 1
Working (non-updated server oracle7)
DEBU[0001] getting client config for kubeContext: ``
INFO[0001] Waiting for kaniko-c6tgx to be initialized
DEBU[0003] Running command: [kubectl --context dgi-it-kwmvpm02 exec -i kaniko-c6tgx -c kaniko-init-container -n jarvis-dev -- tar -xf - -C /kaniko/buildcontext]
DEBU[0003] Found dependencies for dockerfile: [{src/app.py /jarvis/jarvis_api/src true} {src/check_pv_rbac.py /jarvis/jarvis_api/src true} {src/create_pv.py /jarvis/jarvis_api/src true} {src/create_pvc.py /jarvis/jarvis_api/src true} {src/delete_pv.py /jarvis/jarvis_api/src true} {src/requirements.txt /jarvis/jarvis_api/src/requirements.txt false}]

I have no idea what is causing Skaffold to fail to upload the files, so I'm really curious. Two weeks ago we deployed Jenkins in our cluster with the Jenkins Kubernetes Plugin, using jenkins/jnlp-slave and a custom oracle7 image with skaffold in it. It clones projects and executes skaffold run to build and deploy our applications. Skaffold with Jenkins pipelines is fantastic; it works like a charm even with strict PSP configurations, so thank you for all the efforts.

dgageot added a commit to dgageot/skaffold that referenced this issue Apr 26, 2020
Fix GoogleContainerTools#3944

Signed-off-by: David Gageot <david@gageot.net>
dgageot added a commit that referenced this issue Apr 26, 2020
* [kaniko] Better error message when upload fails

Fix #3944

Signed-off-by: David Gageot <david@gageot.net>

* Feedback

Signed-off-by: David Gageot <david@gageot.net>

thetume commented Apr 30, 2020

Did you find a fix for this issue? We are having similar problems running in-cluster builds with Kaniko, with the same errors ending with:

copying sources: uploading build context: exit status 1

jturi commented Apr 30, 2020

Hi @thetume,

Yes, dgageot's new changes fixed our issue; everything is running fine with the latest bleeding-edge binary.

If you replace your old skaffold binary with this one and run skaffold version, you should see the latest commit hash, like 8dd4570 at the moment.
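
For reference, a minimal way to fetch and verify the bleeding-edge build, assuming a Linux amd64 host and the usual publish location for latest builds:

$ curl -Lo skaffold https://storage.googleapis.com/skaffold/builds/latest/skaffold-linux-amd64
$ chmod +x skaffold && sudo mv skaffold /usr/local/bin/
$ skaffold version   # should print the latest commit hash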

@Bonehead5338

I am getting this error too, though I don't see the "exit status 1", maybe because I am on Windows.

skaffold.exe build -p dev-kaniko
Generating tags...

  • registry/app -> registry/app:tag
    Checking cache...
  • registry/app: Not found. Building
    Starting build...
    Checking for kaniko secret [docker-registry/kaniko-pull-secret]...
    time="2021-06-03T09:57:55-05:00" level=warning msg="Assuming the secret kaniko-pull-secret is mounted inside Kaniko pod with the filename kaniko-secret. If your secret is mounted at different path, please specify using config key pullSecretPath.\nSee https://skaffold.dev/docs/references/yaml/#build-cluster-pullSecretPath"
    Creating docker config secret [kaniko-docker-cfg]...
    Building [registry/app]...
    uploading build context:   <-- output ends here, shown in red

Any ideas how I can further troubleshoot the root cause?

@Bonehead5338

I managed to get it down to a single file, and it looks to be a filename-length issue:
./src/app/<filename> is 201 characters long
C:\fullpath is 261
/fullpath/ is 259
Which makes sense if the tar limit is 256; a similar file 3 characters shorter is OK. Anyway...
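
For anyone hitting the same wall, one way to list offending paths in the build context, assuming a POSIX shell and the ~256-character limit observed here:

$ find . -type f | awk 'length($0) > 256 { print length($0), $0 }'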

Is there a way to make this error a bit clearer?

Thanks

nkubala (Contributor) commented Aug 2, 2021

@dthomastx we could probably add some checks to this code to validate filename lengths before adding and either skip or error out.

func addFileToTar(root string, src string, dst string, tw *tar.Writer, hm headerModifier) error {

feel free to open a separate issue to track this, or, if you're feeling ambitious, submit a fix yourself!
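
A minimal sketch of such a check, as a hypothetical helper next to addFileToTar (the 256-character cap mirrors the limit observed above; a real patch might differ):

package tarutil

import "fmt"

// maxTarPathLen mirrors the ~256-character path cap observed in this thread.
// This is a hypothetical constant; the exact limit depends on the tar format in use.
const maxTarPathLen = 256

// checkTarPath is a hypothetical pre-flight check that addFileToTar could run
// before writing a header: it rejects in-archive paths that are too long for
// tar and names the offending file in the error instead of a bare exit status.
func checkTarPath(dst string) error {
	if len(dst) > maxTarPathLen {
		return fmt.Errorf("cannot add %q to build context: path is %d characters, tar supports at most %d", dst, len(dst), maxTarPathLen)
	}
	return nil
}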
