PR | dims: Bump dependencies and go version (in go.mod)
Result | FAILURE
Tests | 0 failed / 0 succeeded
Started |
Elapsed | 16m13s
Revision | b784cc42d806db67b32806c14fb93a72ac2b6092
Refs | 547
... skipping 426 lines ...
go build \
    -o=/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/_output/bin/aws-iam-authenticator \
    -ldflags="-w -s -X sigs.k8s.io/aws-iam-authenticator/pkg.Version= -X sigs.k8s.io/aws-iam-authenticator/pkg.BuildDate=2023-01-21T00:30:59Z -X sigs.k8s.io/aws-iam-authenticator/pkg.CommitID=fe6f3c99413588f6a96f1d495cf2ad8a019ed45e" \
    ./cmd/aws-iam-authenticator/
make[2]: Leaving directory '/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator'
make[1]: Leaving directory '/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator'
Error: cluster not found "test-cluster-20382.k8s.local"
###
## Setting up roles
#
An error occurred (NoSuchEntity) when calling the GetRole operation: The role with name aws-iam-authenticator-test-role-KubernetesAdmin cannot be found.
###
## Creating aws-iam-authenticator-test-role-KubernetesAdmin role
#
admin role: arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesAdmin
An error occurred (NoSuchEntity) when calling the GetRole operation: The role with name aws-iam-authenticator-test-role-KubernetesUsers cannot be found.
###
## Creating aws-iam-authenticator-test-role-KubernetesUsers role
#
user role: arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesUsers
###
## Generating SSH key /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/id_rsa
... skipping 12 lines ...
| + S / B * |
| . B O @ X o |
| . * O B E .|
| + o = |
| . |
+----[SHA256]-----+
Error: cluster not found "test-cluster-20382.k8s.local"
###
## Creating cluster test-cluster-20382.k8s.local with /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/test-cluster-20382.k8s.local.json (dry run)
#
I0121 00:31:08.436066   17766 new_cluster.go:248] Inferred "aws" cloud provider from zone "us-west-2a"
I0121 00:31:08.436298   17766 new_cluster.go:1102] Cloud Provider ID = aws
I0121 00:31:08.767060   17766 subnets.go:182] Assigned CIDR 172.20.32.0/19 to subnet us-west-2a
... skipping 79 lines ...
Unable to connect to the server: dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Unable to connect to the server: dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Unable to connect to the server: dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Unable to connect to the server: dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Unable to connect to the server: dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
Unable to connect to the server: dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host
error: Get "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api?timeout=32s": dial tcp: lookup api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com on 10.63.240.10:53: no such host - error from a previous attempt: EOF
Unable to connect to the server: EOF
error: the server doesn't have a resource type "nodes"
###
## Cluster is up!
#
###
## Applying testing roles
#
... skipping 42 lines ...
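Note on the go build line above: the -ldflags "-X ..." flags stamp version metadata into the binary at link time. A minimal sketch of the mechanism, using hypothetical main-package variables rather than the authenticator's actual sigs.k8s.io/aws-iam-authenticator/pkg symbols:

    // version_sketch.go -- illustrative only; variable names are hypothetical.
    package main

    import "fmt"

    // Overridden at link time, e.g.:
    //   go build -ldflags "-X main.version=v0.6.2 -X main.buildDate=2023-01-21T00:30:59Z" .
    var (
        version   = "unknown"
        buildDate = "unknown"
    )

    func main() {
        // Prints the values injected by -ldflags, or the defaults if built plainly.
        fmt.Printf("version=%s buildDate=%s\n", version, buildDate)
    }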
ip-172-20-97-134.us-west-2.compute.internal   node   True

VALIDATION ERRORS
KIND   NAME                                       MESSAGE
Pod    kube-system/aws-iam-authenticator-nqbl5    system-node-critical pod "aws-iam-authenticator-nqbl5" is pending

Validation Failed
W0121 00:37:10.325453   18274 validate_cluster.go:232] (will retry): cluster not yet healthy

INSTANCE GROUPS
NAME                ROLE    MACHINETYPE   MIN   MAX   SUBNETS
master-us-west-2a   Master  t3.medium     1     1     us-west-2a
nodes-us-west-2a    Node    c5.large      1     1     us-west-2a
nodes-us-west-2b    Node    c5.large      1     1     us-west-2b
... skipping 38 lines ...
Will run 8 of 8 specs

•could not get token: AccessDenied: User: arn:aws:iam::607362164682:user/awstester is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesAdmin
	status code: 403, request id: ab458a48-c1a7-4d72-ae05-f5ad21f0fca7
------------------------------
• [FAILED] [0.214 seconds]
[iam-auth-e2e] a kubernetes client
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:37
  when assuming the KubernetesAdmin role
  /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:62
    [It] should send requests successfully
    /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:79

    Unexpected error:
        <*url.Error | 0xc0001b34d0>: {
            Op: "Get",
            URL: "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api/v1/namespaces/kube-system/pods",
            Err: <*errors.errorString | 0xc000550540>{
                s: "getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1",
            },
        }
        Get "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api/v1/namespaces/kube-system/pods": getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1
    occurred
    In [It] at: /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:84
------------------------------
could not get token: AccessDenied: User: arn:aws:iam::607362164682:user/awstester is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesAdmin
	status code: 403, request id: 11056746-893f-465b-a74a-87dacb7d1b85
• [FAILED] [0.251 seconds]
[iam-auth-e2e] a kubernetes client
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:37
  when assuming the KubernetesAdmin role
  /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:62
    [It] should be authorized to do everything
    /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:87

    Unexpected error:
        <*url.Error | 0xc00058cc90>: {
            Op: "Post",
            URL: "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/apis/authorization.k8s.io/v1/selfsubjectaccessreviews",
            Err: <*errors.errorString | 0xc000456690>{
                s: "getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1",
            },
        }
        Post "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/apis/authorization.k8s.io/v1/selfsubjectaccessreviews": getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1
    occurred
    In [It] at: /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:103
------------------------------
could not get token: AccessDenied: User: arn:aws:iam::607362164682:user/awstester is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesUsers
	status code: 403, request id: c37259d4-a625-4380-b20b-f6c5973fc4da
• [FAILED] [0.224 seconds]
[iam-auth-e2e] a kubernetes client
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:37
  when assuming the KubernetesUsers role
  /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:108
    [It] should send a request successfully
    /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:125

    Unexpected error:
        <*url.Error | 0xc00021c060>: {
            Op: "Get",
            URL: "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api/v1/namespaces/kube-system/pods",
            Err: <*errors.errorString | 0xc00076e020>{
                s: "getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1",
            },
        }
        Get "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api/v1/namespaces/kube-system/pods": getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1
    occurred
    In [It] at: /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:132
------------------------------
could not get token: AccessDenied: User: arn:aws:iam::607362164682:user/awstester is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesUsers
	status code: 403, request id: 6e17c909-11d1-4fc9-82fe-7c49f5fb91a5
• [FAILED] [0.232 seconds]
[iam-auth-e2e] a kubernetes client
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:37
  when assuming the KubernetesUsers role
  /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:108
    [It] should be mapped to the right permissions
    /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:135

    Unexpected error:
        <*url.Error | 0xc000161110>: {
            Op: "Post",
            URL: "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/apis/authorization.k8s.io/v1/selfsubjectaccessreviews",
            Err: <*errors.errorString | 0xc00076fa80>{
                s: "getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1",
            },
        }
        Post "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/apis/authorization.k8s.io/v1/selfsubjectaccessreviews": getting credentials: exec: executable /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/hack/e2e/e2e-test-artifacts/bin/aws-iam-authenticator failed with exit code 1
    occurred
    In [It] at: /home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:152
------------------------------
••Waiting for apiserver to go down...
  after 10s: Get "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api/v1/nodes": net/http: TLS handshake timeout
  after 21s: Get "https://api-test-cluster-20382-k8-8l1uvh-1694432600.us-west-2.elb.amazonaws.com/api/v1/nodes": EOF
... skipping 5 lines ...
[apiserver] [Disruptive] the apiserver when the manifest changes restarts successfully
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/apiserver_test.go:88
------------------------------

Summarizing 4 Failures:
[FAIL] [iam-auth-e2e] a kubernetes client when assuming the KubernetesAdmin role [It] should send requests successfully
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:84
[FAIL] [iam-auth-e2e] a kubernetes client when assuming the KubernetesAdmin role [It] should be authorized to do everything
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:103
[FAIL] [iam-auth-e2e] a kubernetes client when assuming the KubernetesUsers role [It] should send a request successfully
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:132
[FAIL] [iam-auth-e2e] a kubernetes client when assuming the KubernetesUsers role [It] should be mapped to the right permissions
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator/tests/e2e/tests.go:152

Ran 8 of 8 Specs in 70.634 seconds
FAIL! -- 4 Passed | 4 Failed | 0 Pending | 0 Skipped

You're using deprecated Ginkgo functionality:
=============================================
Support for custom reporters has been removed in V2. Please read the documentation linked to below for Ginkgo's new behavior and for a migration path:
Learn more at: https://onsi.github.io/ginkgo/MIGRATING_TO_V2#removed-custom-reporters
To silence deprecations that can be silenced set the following environment variable:
ACK_GINKGO_DEPRECATIONS=2.4.0

--- FAIL: TestE2E (70.64s)
FAIL
Ginkgo ran 1 suite in 1m36.105477607s
Test Suite Failed
+ TEST_PASSED=1
+ popd
/home/prow/go/src/github.com/kubernetes-sigs/aws-iam-authenticator
+ set -e
+ set +x
###
... skipping 342 lines ...
vpc:vpc-01d7cc5fa88bc3000   ok
dhcp-options:dopt-0715684ce463fc85b   ok
Deleted kubectl config for test-cluster-20382.k8s.local
Deleted cluster: "test-cluster-20382.k8s.local"
An error occurred (NoSuchEntity) when calling the DeleteRole operation: The role with name aws-iam-authenticator-test-role-KubernetesAdmin cannot be found.
make: *** [Makefile:110: e2e] Error 255
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
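Note on the four failures above: they all reduce to the same AccessDenied. The awstester user cannot sts:AssumeRole into the test roles, so the aws-iam-authenticator binary exits 1 when the kubeconfig exec plugin asks it for a token. A minimal sketch (assuming aws-sdk-go v1 and AWS credentials for that user in the environment) that reproduces just the AssumeRole call outside the test harness:

    // assume_role_check.go -- debugging sketch, not part of the test suite.
    package main

    import (
        "fmt"
        "log"

        "github.com/aws/aws-sdk-go/aws"
        "github.com/aws/aws-sdk-go/aws/session"
        "github.com/aws/aws-sdk-go/service/sts"
    )

    func main() {
        // Uses whatever credentials the environment provides (the awstester user in CI).
        sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-west-2")))
        svc := sts.New(sess)

        // Role ARN taken from the failure output above.
        out, err := svc.AssumeRole(&sts.AssumeRoleInput{
            RoleArn:         aws.String("arn:aws:iam::607362164682:role/aws-iam-authenticator-test-role-KubernetesAdmin"),
            RoleSessionName: aws.String("iam-auth-e2e-debug"),
        })
        if err != nil {
            // An AccessDenied here reproduces the error reported by the e2e specs.
            log.Fatalf("AssumeRole failed: %v", err)
        }
        fmt.Println("assumed:", aws.StringValue(out.AssumedRoleUser.Arn))
    }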
================================================================================
Cleaning up after docker
Stopping Docker: docker
Program process in pidfile '/var/run/docker-ssd.pid', 1 process(es), refused to die.
... skipping 3 lines ...