Result: FAILURE
Tests: 1 failed / 3400 succeeded
Started: 2021-06-09 22:28
Elapsed: 40m49s
Revision: master

Test Failures


k8s.io/kubernetes/test/integration/examples TestAggregatedAPIServer 23s

go test -v k8s.io/kubernetes/test/integration/examples -run TestAggregatedAPIServer$
=== RUN   TestAggregatedAPIServer
    testserver.go:380: Resolved testserver package path to: "/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kube-apiserver/app/testing"
I0609 22:55:34.716260  127241 serving.go:341] Generated self-signed cert (/tmp/kubernetes-kube-apiserver066160511/apiserver.crt, /tmp/kubernetes-kube-apiserver066160511/apiserver.key)
I0609 22:55:34.716440  127241 server.go:541] external host was not specified, using 127.0.0.1
W0609 22:55:34.716561  127241 authentication.go:524] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
    testserver.go:215: runtime-config=map[api/all:true]
    testserver.go:216: Starting kube-apiserver on port 44513...
W0609 22:55:36.517163  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.517216  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.517236  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.518033  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.520253  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.520320  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.520405  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.520455  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.520542  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.520625  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.523087  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.523453  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.523583  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:36.523624  127241 plugins.go:158] Loaded 11 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I0609 22:55:36.523641  127241 plugins.go:161] Loaded 10 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
W0609 22:55:36.523778  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:36.523793  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:36.525689  127241 plugins.go:158] Loaded 11 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I0609 22:55:36.525716  127241 plugins.go:161] Loaded 10 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
I0609 22:55:36.528070  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.528112  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.552835  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.552887  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.553993  127241 client.go:360] parsed scheme: "passthrough"
I0609 22:55:36.554078  127241 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0609 22:55:36.554094  127241 clientconn.go:948] ClientConn switching balancer to "pick_first"
I0609 22:55:36.554974  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.555034  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0609 22:55:36.633379  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:36.634423  127241 instance.go:278] Using reconciler: lease
I0609 22:55:36.634649  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.634683  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.656860  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.657034  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.660726  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.660778  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.663009  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.663042  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.665996  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.666053  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.674407  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.674450  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.680409  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.680592  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.685934  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.685998  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.688088  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.688145  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.693413  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.693449  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.695915  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.696043  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.698983  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.699039  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.700396  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.700451  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.702562  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.702624  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.703974  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.704018  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.705557  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.705598  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.706912  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.707089  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.709765  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.709948  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.717399  127241 rest.go:130] the default service ipfamily for this cluster is: IPv4
I0609 22:55:36.970478  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.970529  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.979841  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.979886  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.981379  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.981415  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.982938  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.982980  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.985209  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.985290  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.987938  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.987989  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.990023  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.990065  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.992483  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.992518  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:36.993752  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:36.993790  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.003188  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.003407  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.006926  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.006995  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.010571  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.010633  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.014295  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.014333  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.028130  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.028166  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.051124  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.051360  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.054133  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.054313  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.104103  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.104148  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.107621  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.107663  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.110694  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.110742  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.114202  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.114237  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.115365  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.115395  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.117218  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.117267  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.118696  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.118738  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.119903  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.120118  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.121773  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.121963  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.151488  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.151540  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.153494  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.153530  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.160870  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.166971  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.175090  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.175139  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.188336  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.189528  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.190920  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.190953  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.193087  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.193123  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.197138  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.197245  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.199236  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.199527  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.204896  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.205201  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.207589  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.207642  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.252643  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.252692  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.256642  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.256877  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.259431  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.259620  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.268526  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.268579  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.271100  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.271136  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.273061  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.273106  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.274627  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.274654  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.276869  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.276923  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.277953  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.277993  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.279453  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.279490  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.282184  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.282228  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.283184  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.283232  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.285177  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.285222  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.288882  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.288968  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.290863  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.290908  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.294976  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.295454  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.297919  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.297991  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.302241  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.302306  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.303878  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.303930  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.312427  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.312497  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.314885  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.314943  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.316553  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.316598  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.319002  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.319042  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.322382  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.352128  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.354980  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.355036  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.357317  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.357377  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.359256  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.359285  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.362360  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.362407  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.363925  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.363964  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.366884  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.366934  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:37.517031  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:37.517161  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0609 22:55:38.912144  127241 genericapiserver.go:437] Skipping API apps/v1beta2 because it has no resources.
W0609 22:55:38.912281  127241 genericapiserver.go:437] Skipping API apps/v1beta1 because it has no resources.
I0609 22:55:38.961524  127241 plugins.go:158] Loaded 11 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,ServiceAccount,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I0609 22:55:38.961827  127241 plugins.go:161] Loaded 10 validating admission controller(s) successfully in the following order: LimitRanger,ServiceAccount,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
W0609 22:55:38.964225  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:38.964608  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:38.964741  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:38.969476  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:38.969710  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0609 22:55:39.009250  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
    testserver.go:235: Waiting for /healthz to be ok...
I0609 22:55:47.610285  127241 dynamic_cafile_content.go:155] "Starting controller" name="request-header::/tmp/kubernetes-kube-apiserver066160511/proxy-ca.crt"
I0609 22:55:47.610409  127241 dynamic_cafile_content.go:155] "Starting controller" name="client-ca-bundle::/tmp/kubernetes-kube-apiserver066160511/client-ca.crt"
I0609 22:55:47.611079  127241 dynamic_serving_content.go:129] "Starting controller" name="serving-cert::/tmp/kubernetes-kube-apiserver066160511/apiserver.crt::/tmp/kubernetes-kube-apiserver066160511/apiserver.key"
I0609 22:55:47.612244  127241 secure_serving.go:195] Serving securely on 127.0.0.1:44513
I0609 22:55:47.612365  127241 tlsconfig.go:240] "Starting DynamicServingCertificateController"
W0609 22:55:47.613905  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:47.614374  127241 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0609 22:55:47.614395  127241 shared_informer.go:240] Waiting for caches to sync for cluster_authentication_trust_controller
I0609 22:55:47.614466  127241 apf_controller.go:295] Starting API Priority and Fairness config controller
I0609 22:55:47.614588  127241 dynamic_cafile_content.go:155] "Starting controller" name="client-ca-bundle::/tmp/kubernetes-kube-apiserver066160511/client-ca.crt"
I0609 22:55:47.614709  127241 dynamic_cafile_content.go:155] "Starting controller" name="request-header::/tmp/kubernetes-kube-apiserver066160511/proxy-ca.crt"
I0609 22:55:47.614858  127241 autoregister_controller.go:141] Starting autoregister controller
I0609 22:55:47.614878  127241 cache.go:32] Waiting for caches to sync for autoregister controller
I0609 22:55:47.616117  127241 crdregistration_controller.go:111] Starting crd-autoregister controller
I0609 22:55:47.616136  127241 shared_informer.go:240] Waiting for caches to sync for crd-autoregister
I0609 22:55:47.616222  127241 customresource_discovery_controller.go:209] Starting DiscoveryController
I0609 22:55:47.616277  127241 controller.go:85] Starting OpenAPI controller
I0609 22:55:47.616303  127241 naming_controller.go:291] Starting NamingConditionController
I0609 22:55:47.616337  127241 establishing_controller.go:76] Starting EstablishingController
I0609 22:55:47.616375  127241 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
I0609 22:55:47.616399  127241 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
I0609 22:55:47.616435  127241 crd_finalizer.go:266] Starting CRDFinalizer
I0609 22:55:47.616490  127241 apiservice_controller.go:97] Starting APIServiceRegistrationController
I0609 22:55:47.616501  127241 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0609 22:55:47.618925  127241 controller.go:83] Starting OpenAPI AggregationController
I0609 22:55:47.619113  127241 available_controller.go:491] Starting AvailableConditionController
I0609 22:55:47.619122  127241 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
E0609 22:55:47.624288  127241 controller.go:152] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /dfc39766-b1da-4a1b-9b4e-720ce493e9df/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
W0609 22:55:47.711981  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.714548  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:47.714871  127241 shared_informer.go:247] Caches are synced for cluster_authentication_trust_controller 
W0609 22:55:47.714893  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:47.715235  127241 cache.go:39] Caches are synced for autoregister controller
I0609 22:55:47.716551  127241 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0609 22:55:47.716555  127241 shared_informer.go:247] Caches are synced for crd-autoregister 
I0609 22:55:47.720415  127241 cache.go:39] Caches are synced for AvailableConditionController controller
W0609 22:55:47.750487  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.756933  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.784275  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:47.788902  127241 controller.go:611] quota admission added evaluator for: namespaces
W0609 22:55:47.790328  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.801609  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:47.814730  127241 apf_controller.go:300] Running API Priority and Fairness config worker
W0609 22:55:47.879854  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.908452  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.915758  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.961130  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.962312  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:47.990421  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.002683  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.017697  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.056022  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.071447  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.089848  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.111697  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.119548  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.125707  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.154603  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.156821  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.164077  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.176545  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.186374  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.194014  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.201463  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.210736  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.224229  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.224902  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.253193  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.256344  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.260002  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.277780  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.280105  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.285465  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.293674  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.306003  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.306024  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.327528  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.351384  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.358645  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.362460  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.370000  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.384124  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.387449  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.396486  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.402329  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.405610  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.413603  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.413830  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.420458  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.422268  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.453414  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.455882  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.464045  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.466222  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.471392  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.479564  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.479607  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.496071  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.499912  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.505874  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.514363  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.514481  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.518979  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.524336  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.552715  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.555843  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.562535  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.569759  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.588131  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.603595  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.608140  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:48.609670  127241 controller.go:132] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0609 22:55:48.609699  127241 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
W0609 22:55:48.611777  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.630529  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.655421  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.660609  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:48.663461  127241 storage_scheduling.go:132] created PriorityClass system-node-critical with value 2000001000
W0609 22:55:48.672251  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.676757  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
I0609 22:55:48.679669  127241 storage_scheduling.go:132] created PriorityClass system-cluster-critical with value 2000000000
I0609 22:55:48.679705  127241 storage_scheduling.go:148] all system priority classes are created successfully or already exist.
W0609 22:55:48.687240  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.699276  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.714465  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.723382  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.753894  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.756853  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.764418  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.767810  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.774627  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:48.786723  127241 lease.go:233] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
E0609 22:55:48.799580  127241 controller.go:223] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8, ::1/128)
I0609 22:55:48.822179  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:48.822252  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
    apiserver_test.go:265: Client: Server public key is "127.0.0.1@1531467593" [serving] validServingFor=[127.0.0.1,10.0.0.1,kubernetes.default.svc,kubernetes.default,kubernetes,localhost] issuer="127.0.0.1-ca@1531467593" (2018-07-13 06:39:53 +0000 UTC to 2118-06-19 06:39:53 +0000 UTC (now=2021-06-09 22:55:48.858058826 +0000 UTC))
    apiserver_test.go:265: Client: Server public key is "127.0.0.1-ca@1531467593" [] issuer="<self>" (2018-07-13 06:39:53 +0000 UTC to 2118-06-19 06:39:53 +0000 UTC (now=2021-06-09 22:55:48.858145906 +0000 UTC))
    apiserver_test.go:272: CA bundle "127.0.0.1@1531467593" [serving] validServingFor=[127.0.0.1,10.0.0.1,kubernetes.default.svc,kubernetes.default,kubernetes,localhost] issuer="127.0.0.1-ca@1531467593" (2018-07-13 06:39:53 +0000 UTC to 2118-06-19 06:39:53 +0000 UTC (now=2021-06-09 22:55:48.859317324 +0000 UTC))
    apiserver_test.go:272: CA bundle "127.0.0.1-ca@1531467593" [] issuer="<self>" (2018-07-13 06:39:53 +0000 UTC to 2118-06-19 06:39:53 +0000 UTC (now=2021-06-09 22:55:48.859493956 +0000 UTC))
I0609 22:55:49.619406  127241 serving.go:341] Generated self-signed cert (/tmp/test-integration-wardle-server557205460/apiserver.crt, /tmp/test-integration-wardle-server557205460/apiserver.key)
I0609 22:55:50.622539  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:50.622672  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0609 22:55:51.366456  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.368220  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.368810  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.427591  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.428245  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.428567  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.428622  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:51.428644  127241 plugins.go:158] Loaded 3 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,MutatingAdmissionWebhook,BanFlunder.
I0609 22:55:51.428649  127241 plugins.go:161] Loaded 1 validating admission controller(s) successfully in the following order: ValidatingAdmissionWebhook.
W0609 22:55:51.429131  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:55:51.429160  127241 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:55:51.431171  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:51.431216  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:51.458645  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:51.458750  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:51.461953  127241 client.go:360] parsed scheme: "endpoint"
I0609 22:55:51.462004  127241 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0609 22:55:51.711848  127241 secure_serving.go:195] Serving securely on 127.0.0.1:38107
I0609 22:55:51.712455  127241 configmap_cafile_content.go:201] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
I0609 22:55:51.716869  127241 shared_informer.go:240] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0609 22:55:51.712509  127241 tlsconfig.go:240] "Starting DynamicServingCertificateController"
I0609 22:55:51.712667  127241 dynamic_serving_content.go:129] "Starting controller" name="serving-cert::/tmp/test-integration-wardle-server557205460/apiserver.crt::/tmp/test-integration-wardle-server557205460/apiserver.key"
I0609 22:55:51.713226  127241 requestheader_controller.go:169] Starting RequestHeaderAuthRequestController
I0609 22:55:51.717864  127241 shared_informer.go:240] Waiting for caches to sync for RequestHeaderAuthRequestController
I0609 22:55:51.713276  127241 configmap_cafile_content.go:201] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file"
I0609 22:55:51.718484  127241 shared_informer.go:240] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
I0609 22:55:51.723030  127241 apf_controller.go:295] Starting API Priority and Fairness config controller
W0609 22:55:51.754222  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:51.756033  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:51.768680  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0609 22:55:51.770311  127241 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
    apiserver_test.go:357: {"kind":"APIGroupList","groups":[{"name":"wardle.example.com","versions":[{"groupVersion":"wardle.example.com/v1beta1","version":"v1beta1"},{"groupVersion":"wardle.example.com/v1alpha1","version":"v1alpha1"}],"preferredVersion":{"groupVersion":"wardle.example.com/v1beta1","version":"v1beta1"},"serverAddressByClientCIDRs":[{"clientCIDR":"0.0.0.0/0","serverAddress":":38107"}]}]}
        
    apiserver_test.go:386: {"kind":"APIGroup","apiVersion":"v1","name":"wardle.example.com","versions":[{"groupVersion":"wardle.example.com/v1beta1","version":"v1beta1"},{"groupVersion":"wardle.example.com/v1alpha1","version":"v1alpha1"}],"preferredVersion":{"groupVersion":"wardle.example.com/v1beta1","version":"v1beta1"}}
        
    apiserver_test.go:404: {"kind":"APIResourceList","apiVersion":"v1","groupVersion":"wardle.example.com/v1alpha1","resources":[{"name":"fischers","singularName":"","namespaced":false,"kind":"Fischer","verbs":["create","delete","deletecollection","get","list","patch","update","watch"],"storageVersionHash":"YS5qRiSxzNM="},{"name":"flunders","singularName":"","namespaced":true,"kind":"Flunder","verbs":["create","delete","deletecollection","get","list","patch","update","watch"],"storageVersionHash":"UHqNx5H3K7A="}]}
        
I0609 22:55:51.817543  127241 shared_informer.go:247] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file 
I0609 22:55:51.818606  127241 shared_informer.go:247] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file 
I0609 22:55:51.818782  127241 shared_informer.go:247] Caches are synced for RequestHeaderAuthRequestController 
I0609 22:55:51.853528  127241 apf_controller.go:300] Running API Priority and Fairness config worker
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
    apiserver_test.go:305: Unexpected failed group version error: unable to retrieve the complete list of server APIs: wardle.example.com/v1alpha1: the server could not find the requested resource
        (the line above repeats 22 times while the test polls discovery)
    apiserver_test.go:139: timed out waiting for the condition
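The failure above is a discovery wait loop giving up: the test repeatedly asks the kube-apiserver for the complete list of server APIs, the aggregated wardle.example.com/v1alpha1 group never becomes resolvable, and apiserver_test.go:139 times out. A minimal sketch of such a wait loop, assuming client-go and an illustrative kubeconfig path and timeouts (not taken from the test itself):

package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/discovery"
	"k8s.io/client-go/tools/clientcmd"
)

func waitForGroupVersion(kubeconfig, gv string) error {
	config, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		return err
	}
	dc, err := discovery.NewDiscoveryClientForConfig(config)
	if err != nil {
		return err
	}
	// Poll until the aggregated group/version shows up in discovery; if it
	// never does, the poll ends in "timed out waiting for the condition",
	// the error seen above.
	return wait.PollImmediate(500*time.Millisecond, 20*time.Second, func() (bool, error) {
		_, lists, err := dc.ServerGroupsAndResources()
		if err != nil {
			return false, nil // aggregation not ready yet; keep polling
		}
		for _, l := range lists {
			if l.GroupVersion == gv {
				return true, nil
			}
		}
		return false, nil
	})
}

func main() {
	fmt.Println(waitForGroupVersion("/tmp/kubeconfig", "wardle.example.com/v1alpha1"))
}

When one aggregated group cannot be listed, ServerGroupsAndResources returns partial results plus an ErrGroupDiscoveryFailed, which renders as the "unable to retrieve the complete list of server APIs" message repeated above.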
W0609 22:55:56.991068  127241 cacher.go:149] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0609 22:55:56.991616  127241 cacher.go:149] Terminating all watchers from cacher *core.LimitRange
W0609 22:55:56.991920  127241 cacher.go:149] Terminating all watchers from cacher *core.ResourceQuota
W0609 22:55:56.992360  127241 cacher.go:149] Terminating all watchers from cacher *core.Secret
W0609 22:55:56.992907  127241 cacher.go:149] Terminating all watchers from cacher *core.ConfigMap
W0609 22:55:56.993214  127241 cacher.go:149] Terminating all watchers from cacher *core.Namespace
W0609 22:55:56.993502  127241 cacher.go:149] Terminating all watchers from cacher *core.Endpoints
W0609 22:55:56.993816  127241 cacher.go:149] Terminating all watchers from cacher *core.Pod
W0609 22:55:56.994073  127241 cacher.go:149] Terminating all watchers from cacher *core.ServiceAccount
W0609 22:55:56.994654  127241 cacher.go:149] Terminating all watchers from cacher *core.Service
W0609 22:55:56.997510  127241 cacher.go:149] Terminating all watchers from cacher *networking.IngressClass
W0609 22:55:56.998368  127241 cacher.go:149] Terminating all watchers from cacher *node.RuntimeClass
W0609 22:55:57.008228  127241 cacher.go:149] Terminating all watchers from cacher *scheduling.PriorityClass
W0609 22:55:57.009498  127241 cacher.go:149] Terminating all watchers from cacher *storage.StorageClass
W0609 22:55:57.010011  127241 cacher.go:149] Terminating all watchers from cacher *flowcontrol.FlowSchema
W0609 22:55:57.010194  127241 cacher.go:149] Terminating all watchers from cacher *flowcontrol.PriorityLevelConfiguration
W0609 22:55:57.013377  127241 cacher.go:149] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0609 22:55:57.014114  127241 cacher.go:149] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0609 22:55:57.015048  127241 cacher.go:149] Terminating all watchers from cacher *apiregistration.APIService
W0609 22:55:57.015416  127241 cacher.go:149] Terminating all watchers from cacher *wardle.Fischer
--- FAIL: TestAggregatedAPIServer (23.38s)

				from junit_20210609-224540.xml

3400 tests passed, 8 tests skipped (lists omitted).

Error lines from build-log.txt

... skipping 70 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 157: bogus-expected-to-fail: command not found
!!! [0609 22:33:26] Call tree:
!!! [0609 22:33:26]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0609 22:33:26]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0609 22:33:26]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:133 juLog(...)
!!! [0609 22:33:26]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:161 record_command(...)
!!! [0609 22:33:26]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
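This first failure is expected: record_command_canary deliberately invokes a nonexistent command ("bogus-expected-to-fail") to prove that the shell2junit wrappers catch and record failing commands, so the exit code 1 here is the canary working. A stand-alone sketch of the same idea, using only the Go standard library (the command name is reused purely for illustration):

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// Run a command that cannot exist; the canary passes only if this fails.
	err := exec.Command("bogus-expected-to-fail").Run()
	if err != nil {
		fmt.Println("canary failed as expected:", err)
		return
	}
	fmt.Println("canary unexpectedly succeeded; failure recording is broken")
}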
+++ [0609 22:33:26] Running kubeadm tests
+++ [0609 22:33:31] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0609 22:34:25] Running tests without code coverage
{"Time":"2021-06-09T22:35:29.096666044Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t45.453s\n"}
✓  cmd/kubeadm/test/cmd (45.458s)
... skipping 356 lines ...
+++ [0609 22:38:01] Building kube-controller-manager
+++ [0609 22:38:07] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [0609 22:38:39] Generate kubeconfig for controller-manager
+++ [0609 22:38:39] Starting controller-manager
I0609 22:38:40.226505   61795 serving.go:347] Generated self-signed cert in-memory
W0609 22:38:40.713441   61795 authentication.go:419] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0609 22:38:40.713552   61795 authentication.go:316] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0609 22:38:40.713561   61795 authentication.go:340] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0609 22:38:40.713578   61795 authorization.go:225] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0609 22:38:40.713597   61795 authorization.go:193] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0609 22:38:40.713620   61795 controllermanager.go:186] Version: v1.22.0-alpha.3.51+e6922078577c5c
I0609 22:38:40.715269   61795 secure_serving.go:195] Serving securely on [::]:10257
I0609 22:38:40.715410   61795 tlsconfig.go:240] "Starting DynamicServingCertificateController"
I0609 22:38:40.715770   61795 leaderelection.go:248] attempting to acquire leader lease kube-system/kube-controller-manager...
+++ [0609 22:38:40] On try 2, controller-manager: ok
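The controller-manager comes up, serves securely on :10257, and then blocks on acquiring the kube-system/kube-controller-manager leader lease before starting any controllers. A hedged sketch of the same pattern with client-go's leaderelection package (kubeconfig path, holder identity, and timings are illustrative, not taken from the log):

package main

import (
	"context"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/tools/leaderelection"
	"k8s.io/client-go/tools/leaderelection/resourcelock"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // illustrative path
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(config)

	lock := &resourcelock.LeaseLock{
		LeaseMeta:  metav1.ObjectMeta{Name: "kube-controller-manager", Namespace: "kube-system"},
		Client:     client.CoordinationV1(),
		LockConfig: resourcelock.ResourceLockConfig{Identity: "example-holder"},
	}

	// Blocks until the lease is acquired, then runs the callbacks; this is
	// the "attempting to acquire leader lease" step in the log above.
	leaderelection.RunOrDie(context.Background(), leaderelection.LeaderElectionConfig{
		Lock:          lock,
		LeaseDuration: 15 * time.Second,
		RenewDeadline: 10 * time.Second,
		RetryPeriod:   2 * time.Second,
		Callbacks: leaderelection.LeaderCallbacks{
			OnStartedLeading: func(ctx context.Context) { /* start controllers here */ },
			OnStoppedLeading: func() { /* lost the lease; shut down */ },
		},
	})
}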
... skipping 36 lines ...
I0609 22:38:40.841895   61795 shared_informer.go:240] Waiting for caches to sync for service account
I0609 22:38:40.854722   61795 controllermanager.go:577] Started "namespace"
I0609 22:38:40.855184   61795 controllermanager.go:577] Started "replicaset"
W0609 22:38:40.855198   61795 controllermanager.go:556] "tokencleaner" is disabled
W0609 22:38:40.855456   61795 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0609 22:38:40.855519   61795 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
E0609 22:38:40.855537   61795 core.go:91] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0609 22:38:40.855544   61795 controllermanager.go:569] Skipping "service"
I0609 22:38:40.855631   61795 namespace_controller.go:200] Starting namespace controller
I0609 22:38:40.855663   61795 shared_informer.go:240] Waiting for caches to sync for namespace
I0609 22:38:40.855811   61795 node_lifecycle_controller.go:76] Sending events to api server
I0609 22:38:40.855930   61795 replica_set.go:181] Starting replicaset controller
E0609 22:38:40.858599   61795 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0609 22:38:40.858684   61795 controllermanager.go:569] Skipping "cloud-node-lifecycle"
W0609 22:38:40.859759   61795 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0609 22:38:40.859809   61795 controllermanager.go:577] Started "replicationcontroller"
I0609 22:38:40.859967   61795 replica_set.go:181] Starting replicationcontroller controller
I0609 22:38:40.859990   61795 shared_informer.go:240] Waiting for caches to sync for ReplicationController
W0609 22:38:40.860164   61795 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
... skipping 149 lines ...
I0609 22:38:41.063985   61795 shared_informer.go:247] Caches are synced for endpoint_slice 
I0609 22:38:41.066387   61795 shared_informer.go:247] Caches are synced for persistent volume 
I0609 22:38:41.070980   61795 shared_informer.go:247] Caches are synced for attach detach 
I0609 22:38:41.183443   61795 shared_informer.go:247] Caches are synced for disruption 
I0609 22:38:41.183483   61795 disruption.go:371] Sending events to api server.
node/127.0.0.1 created
W0609 22:38:41.255790   61795 actual_state_of_world.go:534] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I0609 22:38:41.265030   61795 shared_informer.go:247] Caches are synced for endpoint 
I0609 22:38:41.267545   61795 shared_informer.go:247] Caches are synced for endpoint_slice_mirroring 
+++ [0609 22:38:41] Checking kubectl version
I0609 22:38:41.283151   61795 shared_informer.go:247] Caches are synced for resource quota 
I0609 22:38:41.328387   61795 shared_informer.go:247] Caches are synced for resource quota 
Client Version: version.Info{Major:"1", Minor:"22+", GitVersion:"v1.22.0-alpha.3.51+e6922078577c5c", GitCommit:"e6922078577c5c8896b3619857dd744f1c1ecf6b", GitTreeState:"clean", BuildDate:"2021-06-09T21:55:26Z", GoVersion:"go1.16.4", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"22+", GitVersion:"v1.22.0-alpha.3.51+e6922078577c5c", GitCommit:"e6922078577c5c8896b3619857dd744f1c1ecf6b", GitTreeState:"clean", BuildDate:"2021-06-09T21:55:26Z", GoVersion:"go1.16.4", Compiler:"gc", Platform:"linux/amd64"}
The Service "kubernetes" is invalid: spec.clusterIPs: Invalid value: []string{"10.0.0.1"}: failed to allocated ip:10.0.0.1 with error:provided IP is already allocated
I0609 22:38:41.736406   61795 shared_informer.go:247] Caches are synced for garbage collector 
I0609 22:38:41.740648   61795 shared_informer.go:247] Caches are synced for garbage collector 
I0609 22:38:41.740742   61795 garbagecollector.go:151] Garbage collector: all resource monitors have synced. Proceeding to collect garbage
NAME         TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
kubernetes   ClusterIP   10.0.0.1     <none>        443/TCP   40s
Recording: run_kubectl_version_tests
... skipping 106 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0609 22:38:46] Creating namespace namespace-1623278326-29132
namespace/namespace-1623278326-29132 created
Context "test" modified.
+++ [0609 22:38:46] Testing RESTMapper
+++ [0609 22:38:46] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
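The RESTMapper test checks that an unknown resource type is rejected with a clear error rather than a silent miss. kubectl resolves resource names through a discovery-backed RESTMapper; a minimal sketch of the same lookup (kubeconfig path illustrative):

package main

import (
	"fmt"

	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/client-go/discovery"
	"k8s.io/client-go/restmapper"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // illustrative
	if err != nil {
		panic(err)
	}
	dc, err := discovery.NewDiscoveryClientForConfig(config)
	if err != nil {
		panic(err)
	}
	groups, err := restmapper.GetAPIGroupResources(dc)
	if err != nil {
		panic(err)
	}
	mapper := restmapper.NewDiscoveryRESTMapper(groups)

	// A kind absent from discovery yields a "no matches" error, which
	// kubectl surfaces as: the server doesn't have a resource type "...".
	_, err = mapper.RESTMapping(schema.GroupKind{Kind: "UnknownResourceType"})
	fmt.Println(err)
}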
NAME                              SHORTNAMES   APIVERSION                             NAMESPACED   KIND
bindings                                       v1                                     true         Binding
componentstatuses                 cs           v1                                     false        ComponentStatus
configmaps                        cm           v1                                     true         ConfigMap
endpoints                         ep           v1                                     true         Endpoints
... skipping 61 lines ...
namespace/namespace-1623278331-18694 created
Context "test" modified.
+++ [0609 22:38:52] Testing clusterroles
rbac.sh:29: Successful get clusterroles/cluster-admin {{.metadata.name}}: cluster-admin
rbac.sh:30: Successful get clusterrolebindings/cluster-admin {{.metadata.name}}: cluster-admin
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created (dry run)
clusterrole.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created
rbac.sh:42: Successful get clusterrole/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "pod-admin" deleted
... skipping 18 lines ...
clusterrole.rbac.authorization.k8s.io/url-reader created
rbac.sh:61: Successful get clusterrole/url-reader {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: get:
rbac.sh:62: Successful get clusterrole/url-reader {{range.rules}}{{range.nonResourceURLs}}{{.}}:{{end}}{{end}}: /logs/*:/healthz/*:
clusterrole.rbac.authorization.k8s.io/aggregation-reader created
rbac.sh:64: Successful get clusterrole/aggregation-reader {{.metadata.name}}: aggregation-reader
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created
rbac.sh:77: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (server dry run)
rbac.sh:80: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
... skipping 64 lines ...
rbac.sh:102: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:foo:test-all-user:
rbac.sh:103: Successful get clusterrolebinding/super-group {{range.subjects}}{{.name}}:{{end}}: the-group:foo:test-all-user:
rbac.sh:104: Successful get clusterrolebinding/super-sa {{range.subjects}}{{.name}}:{{end}}: sa-name:foo:test-all-user:
rolebinding.rbac.authorization.k8s.io/admin created (dry run)
rolebinding.rbac.authorization.k8s.io/admin created (server dry run)
Successful
message:Error from server (NotFound): rolebindings.rbac.authorization.k8s.io "admin" not found
has: not found
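The "(dry run)" and "(server dry run)" lines above, each followed by a NotFound on the subsequent get, show that dry-run requests run validation and admission but persist nothing. A hedged client-go equivalent of the server-side variant (all names illustrative):

package main

import (
	"context"
	"fmt"

	rbacv1 "k8s.io/api/rbac/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // illustrative
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(config)

	cr := &rbacv1.ClusterRole{
		ObjectMeta: metav1.ObjectMeta{Name: "pod-admin"},
		Rules: []rbacv1.PolicyRule{{
			Verbs:     []string{"*"},
			APIGroups: []string{""},
			Resources: []string{"pods"},
		}},
	}
	// DryRunAll runs validation and admission but persists nothing, which
	// is why a get right after a "(server dry run)" create reports NotFound.
	_, err = client.RbacV1().ClusterRoles().Create(context.TODO(), cr,
		metav1.CreateOptions{DryRun: []string{metav1.DryRunAll}})
	fmt.Println(err)
}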
rolebinding.rbac.authorization.k8s.io/admin created
rbac.sh:113: Successful get rolebinding/admin {{.roleRef.kind}}: ClusterRole
rbac.sh:114: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:
rolebinding.rbac.authorization.k8s.io/admin subjects updated
rbac.sh:116: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:foo:
... skipping 152 lines ...
namespace/namespace-1623278340-4487 created
Context "test" modified.
+++ [0609 22:39:00] Testing role
role.rbac.authorization.k8s.io/pod-admin created (dry run)
role.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): roles.rbac.authorization.k8s.io "pod-admin" not found
has: not found
role.rbac.authorization.k8s.io/pod-admin created
rbac.sh:159: Successful get role/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
rbac.sh:160: Successful get role/pod-admin {{range.rules}}{{range.resources}}{{.}}:{{end}}{{end}}: pods:
rbac.sh:161: Successful get role/pod-admin {{range.rules}}{{range.apiGroups}}{{.}}:{{end}}{{end}}: :
Successful
... skipping 440 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name was specified
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:210: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:214: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:219: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 30 lines ...
I0609 22:39:14.316179   66500 round_trippers.go:454] GET https://127.0.0.1:6443/apis/policy/v1/namespaces/test-kubectl-describe-pod/poddisruptionbudgets/test-pdb-2 200 OK in 2 milliseconds
I0609 22:39:14.323488   66500 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/test-kubectl-describe-pod/events?fieldSelector=involvedObject.name%3Dtest-pdb-2%2CinvolvedObject.namespace%3Dtest-kubectl-describe-pod%2CinvolvedObject.kind%3DPodDisruptionBudget%2CinvolvedObject.uid%3D096ad1dc-d4c3-4fb7-b487-829fb7e34dac&limit=500 200 OK in 6 milliseconds
poddisruptionbudget.policy/test-pdb-3 created
core.sh:271: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:275: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
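That error is kubectl's guard that a PodDisruptionBudget sets only one eviction threshold. A minimal sketch of a spec that passes the check, with illustrative names:

package main

import (
	"fmt"

	policyv1 "k8s.io/api/policy/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

func main() {
	maxUnavailable := intstr.FromString("50%")
	pdb := policyv1.PodDisruptionBudget{
		ObjectMeta: metav1.ObjectMeta{Name: "test-pdb-4"},
		Spec: policyv1.PodDisruptionBudgetSpec{
			// Exactly one of MinAvailable / MaxUnavailable may be set;
			// setting both is what kubectl rejects above.
			MaxUnavailable: &maxUnavailable,
			Selector: &metav1.LabelSelector{
				MatchLabels: map[string]string{"app": "example"},
			},
		},
	}
	fmt.Printf("%+v\n", pdb.Spec)
}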
core.sh:281: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 237 lines ...
core.sh:542: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.5:
Successful
message:kubectl-create kubectl-patch
has:kubectl-patch
pod/valid-pod patched
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0609 22:39:31] "kubectl patch with resourceVersion 602" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
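This Conflict is optimistic concurrency at work: the patch carried a stale resourceVersion (602), so the server rejected it. The standard client-go remedy is to re-read and retry, e.g. with retry.RetryOnConflict (object name and namespace here are illustrative):

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/util/retry"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // illustrative
	if err != nil {
		panic(err)
	}
	pods := kubernetes.NewForConfigOrDie(config).CoreV1().Pods("default")

	// Re-read the object on every Conflict so the resourceVersion is fresh
	// before the mutation is re-applied; this is the standard remedy for
	// the stale-resourceVersion failure above.
	err = retry.RetryOnConflict(retry.DefaultRetry, func() error {
		pod, err := pods.Get(context.TODO(), "valid-pod", metav1.GetOptions{})
		if err != nil {
			return err
		}
		if pod.Labels == nil {
			pod.Labels = map[string]string{}
		}
		pod.Labels["touched"] = "true"
		_, err = pods.Update(context.TODO(), pod, metav1.UpdateOptions{})
		return err
	})
	if err != nil {
		panic(err)
	}
}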
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:586: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:kubectl-replace
has:kubectl-replace
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W0609 22:39:32.962689   61795 actual_state_of_world.go:534] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
core.sh:614: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced (server dry run)
node/node-v1-test replaced (dry run)
core.sh:639: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced
core.sh:655: Successful get node node-v1-test {{.metadata.annotations.a}}: b
... skipping 29 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:3.5
    name: kubernetes-pause
has:localonlyvalue
core.sh:691: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:695: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:699: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:703: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:707: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 83 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0609 22:39:44] Creating namespace namespace-1623278384-19959
namespace/namespace-1623278384-19959 created
Context "test" modified.
+++ [0609 22:39:44] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 44 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0609 22:39:44] "kubectl create with empty string list" returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

+++ Running case: test-cmd.run_kubectl_apply_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 29 lines ...
I0609 22:39:47.591126   61795 event.go:291] "Event occurred" object="namespace-1623278384-26816/test-deployment-retainkeys-8695b756f8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-deployment-retainkeys-8695b756f8-rw6d8"
deployment.apps "test-deployment-retainkeys" deleted
apply.sh:88: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
apply.sh:92: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
apply.sh:101: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W0609 22:39:48.657938   70278 helpers.go:569] --dry-run=true is deprecated (boolean value) and can be replaced with --dry-run=client.
pod/test-pod created (dry run)
pod/test-pod created (dry run)
... skipping 31 lines ...
pod/b created
apply.sh:208: Successful get pods a {{.metadata.name}}: a
apply.sh:209: Successful get pods b -n nsb {{.metadata.name}}: b
pod "a" deleted
pod "b" deleted
Successful
message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
has:all resources selected for prune without explicitly passing --all
pod/a created
pod/b created
service/prune-svc created
I0609 22:39:58.287424   61795 horizontal.go:361] Horizontal Pod Autoscaler frontend has been deleted in namespace-1623278381-12214
I0609 22:40:01.070350   58123 client.go:360] parsed scheme: "passthrough"
... skipping 38 lines ...
apply.sh:262: Successful get pods b -n nsb {{.metadata.name}}: b
pod/b unchanged
pod/a pruned
apply.sh:266: Successful get pods -n nsb {{range.items}}{{.metadata.name}}:{{end}}: b:
namespace "nsb" deleted
Successful
message:error: the namespace from the provided object "nsb" does not match the namespace "foo". You must pass '--namespace=nsb' to perform this operation.
has:the namespace from the provided object "nsb" does not match the namespace "foo".
apply.sh:277: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
service/a created
apply.sh:281: Successful get services a {{.metadata.name}}: a
Successful
message:The Service "a" is invalid: spec.clusterIPs[0]: Invalid value: []string{"10.0.0.12"}: may not change once set
... skipping 26 lines ...
apply.sh:303: Successful get deployment test-the-deployment {{.metadata.name}}: test-the-deployment
apply.sh:304: Successful get service test-the-service {{.metadata.name}}: test-the-service
configmap "test-the-map" deleted
service "test-the-service" deleted
deployment.apps "test-the-deployment" deleted
Successful
message:Error from server (NotFound): namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
apply.sh:312: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:namespace/multi-resource-ns created
Error from server (NotFound): error when creating "hack/testdata/multi-resource-1.yaml": namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
Successful
message:Error from server (NotFound): pods "test-pod" not found
has:pods "test-pod" not found
pod/test-pod created
namespace/multi-resource-ns unchanged
apply.sh:320: Successful get pods test-pod -n multi-resource-ns {{.metadata.name}}: test-pod
(Bpod "test-pod" deleted
namespace "multi-resource-ns" deleted
I0609 22:40:31.387646   58123 client.go:360] parsed scheme: "passthrough"
I0609 22:40:31.387731   58123 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0609 22:40:31.387744   58123 clientconn.go:948] ClientConn switching balancer to "pick_first"
apply.sh:326: Successful get configmaps --field-selector=metadata.name=foo {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:configmap/foo created
error: unable to recognize "hack/testdata/multi-resource-2.yaml": no matches for kind "Bogus" in version "example.com/v1"
has:no matches for kind "Bogus" in version "example.com/v1"
apply.sh:332: Successful get configmaps foo {{.metadata.name}}: foo
(Bconfigmap "foo" deleted
apply.sh:338: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BSuccessful
message:pod/pod-a created
... skipping 5 lines ...
(Bpod "pod-a" deleted
pod "pod-c" deleted
apply.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apply.sh:350: Successful get crds {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:customresourcedefinition.apiextensions.k8s.io/widgets.example.com created
error: unable to recognize "hack/testdata/multi-resource-4.yaml": no matches for kind "Widget" in version "example.com/v1"
has:no matches for kind "Widget" in version "example.com/v1"
I0609 22:40:37.161039   58123 client.go:360] parsed scheme: "endpoint"
I0609 22:40:37.161094   58123 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
Successful
message:Error from server (NotFound): widgets.example.com "foo" not found
has:widgets.example.com "foo" not found
apply.sh:356: Successful get crds widgets.example.com {{.metadata.name}}: widgets.example.com
I0609 22:40:39.436930   58123 controller.go:611] quota admission added evaluator for: widgets.example.com
widget.example.com/foo created
customresourcedefinition.apiextensions.k8s.io/widgets.example.com unchanged
apply.sh:359: Successful get widget foo {{.metadata.name}}: foo
... skipping 32 lines ...
message:882
has:882
pod "test-pod" deleted
apply.sh:415: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ [0609 22:40:42] Testing upgrade kubectl client-side apply to server-side apply
pod/test-pod created
error: Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using v1: .metadata.labels.name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the `--force-conflicts` flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
... skipping 77 lines ...
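The conflict above is server-side apply's field management refusing to let a new field manager silently take over .metadata.labels.name from "kubectl-client-side-apply". A hedged sketch of forcing ownership with an apply patch via client-go (manager name and object are illustrative):

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // illustrative
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(config)

	patch := []byte(`{"apiVersion":"v1","kind":"Pod","metadata":{"name":"test-pod","labels":{"name":"owned-by-ssa"}}}`)
	force := true
	// Force transfers ownership of the conflicting fields from the previous
	// manager ("kubectl-client-side-apply" above) to this one, which is
	// what the log's --force-conflicts hint refers to.
	_, err = client.CoreV1().Pods("default").Patch(context.TODO(), "test-pod",
		types.ApplyPatchType, patch, metav1.PatchOptions{
			FieldManager: "example-manager",
			Force:        &force,
		})
	if err != nil {
		panic(err)
	}
}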
(Bpod "nginx-extensions" deleted
Successful
message:pod/test1 created
has:pod/test1 created
pod "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
Recording: run_kubectl_create_filter_tests
Running command: run_kubectl_create_filter_tests

+++ Running case: test-cmd.run_kubectl_create_filter_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 3 lines ...
Context "test" modified.
+++ [0609 22:40:45] Testing kubectl create filter
create.sh:50: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:54: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 18 lines ...
apps.sh:136: Successful get deployments my-depl {{.spec.template.metadata.labels.l1}}: l1
apps.sh:137: Successful get deployments my-depl {{.spec.selector.matchLabels.l1}}: l1
apps.sh:138: Successful get deployments my-depl {{.metadata.labels.l1}}: <no value>
deployment.apps "my-depl" deleted
replicaset.apps "my-depl-84fb47b469" deleted
pod "my-depl-84fb47b469-25jjg" deleted
E0609 22:40:48.177758   61795 replica_set.go:531] sync "namespace-1623278446-3150/my-depl-84fb47b469" failed with Operation cannot be fulfilled on replicasets.apps "my-depl-84fb47b469": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1623278446-3150/my-depl-84fb47b469, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 2c9dd30c-d407-4418-8022-2e1406f70dad, UID in object meta: 
apps.sh:144: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:145: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:146: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:150: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I0609 22:40:48.746587   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-9bb9c4878 to 3"
I0609 22:40:48.751546   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-x64t5"
I0609 22:40:48.755358   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-z46bv"
I0609 22:40:48.755433   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-fc2vx"
apps.sh:154: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1623278446-3150\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1623278446-3150"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
deployment.apps/nginx configured
I0609 22:40:57.345895   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-6dd6cfdb57 to 3"
I0609 22:40:57.351390   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-jwjkk"
I0609 22:40:57.358693   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-zlm9k"
I0609 22:40:57.359458   61795 event.go:291] "Event occurred" object="namespace-1623278446-3150/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-tqr94"
Successful
... skipping 300 lines ...
+++ [0609 22:41:05] Creating namespace namespace-1623278465-2070
namespace/namespace-1623278465-2070 created
Context "test" modified.
+++ [0609 22:41:05] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1623278465-2070 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1623278465-2070 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0609 22:41:07.200703   73718 loader.go:372] Config loaded from file:  /tmp/tmp.YOkNUCWEff/.kube/config
I0609 22:41:07.208419   73718 round_trippers.go:454] GET https://127.0.0.1:6443/version?timeout=32s 200 OK in 6 milliseconds
I0609 22:41:07.239453   73718 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I0609 22:41:07.241885   73718 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/default/replicationcontrollers 200 OK in 2 milliseconds
... skipping 597 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2021-06-09T22:41:14Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl-create", "operation":"Update", "time":"2021-06-09T22:41:14Z"}}, "name":"valid-pod", "namespace":"namespace-1623278474-30602", "resourceVersion":"1048", "uid":"2bceee7f-bfec-4995-a815-7d85c8255a2f"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "preemptionPolicy":"PreemptLowerPriority", "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
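The jsonpath error above comes from the same k8s.io/client-go/util/jsonpath package kubectl embeds: a template referencing a key absent from the object parses fine but fails at execution time. A minimal reproduction, with the object trimmed down for illustration:

package main

import (
	"fmt"
	"os"

	"k8s.io/client-go/util/jsonpath"
)

func main() {
	obj := map[string]interface{}{
		"apiVersion": "v1",
		"kind":       "Pod",
		"metadata":   map[string]interface{}{"name": "valid-pod"},
	}

	jp := jsonpath.New("output")
	if err := jp.Parse("{.missing}"); err != nil {
		panic(err)
	}
	// Parsing succeeds; execution fails because ".missing" is not a key of
	// the object, matching the "missing is not found" error above.
	if err := jp.Execute(os.Stdout, obj); err != nil {
		fmt.Println("error:", err)
	}
}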
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2021-06-09T22:41:14Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl-create","operation":"Update","time":"2021-06-09T22:41:14Z"}],"name":"valid-pod","namespace":"namespace-1623278474-30602","resourceVersion":"1048","uid":"2bceee7f-bfec-4995-a815-7d85c8255a2f"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"preemptionPolicy":"PreemptLowerPriority","priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2021-06-09T22:41:14Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl-create operation:Update time:2021-06-09T22:41:14Z]] name:valid-pod namespace:namespace-1623278474-30602 resourceVersion:1048 uid:2bceee7f-bfec-4995-a815-7d85c8255a2f] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true preemptionPolicy:PreemptLowerPriority priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
... skipping 84 lines ...
  terminationGracePeriodSeconds: 30
status:
  phase: Pending
  qosClass: Guaranteed
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 36 lines ...
+++ [0609 22:41:23] Creating namespace namespace-1623278483-15640
namespace/namespace-1623278483-15640 created
Context "test" modified.
+++ [0609 22:41:23] Testing kubectl exec POD COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 3 lines ...
+++ [0609 22:41:24] Creating namespace namespace-1623278484-10418
namespace/namespace-1623278484-10418 created
Context "test" modified.
+++ [0609 22:41:24] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: the server doesn't have a resource type "foo"
has:error:
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0609 22:41:25.328398   61795 event.go:291] "Event occurred" object="namespace-1623278484-10418/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-mbcml"
I0609 22:41:25.334451   61795 event.go:291] "Event occurred" object="namespace-1623278484-10418/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-w9bsw"
I0609 22:41:25.334494   61795 event.go:291] "Event occurred" object="namespace-1623278484-10418/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-p9jgr"
configmap/test-set-env-config created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod, type/name or --filename must be specified
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-mbcml does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-mbcml does not have a host assigned
has not:pod, type/name or --filename must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"6e3acb26-a1db-480c-b03e-32540017cce3","resourceVersion":"1128","creationTimestamp":"2021-06-09T22:41:26Z"}}
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"6e3acb26-a1db-480c-b03e-32540017cce3","resourceVersion":"1129","creationTimestamp":"2021-06-09T22:41:26Z"},"data":{"key1":"config1"}}
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"6e3acb26-a1db-480c-b03e-32540017cce3","resourceVersion":"1129","creationTimestamp":"2021-06-09T22:41:26Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"6e3acb26-a1db-480c-b03e-32540017cce3"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 73 lines ...
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:apps/v1beta1
deployment.apps "nginx" deleted
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
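The decode failure is deliberate: the broken manifest spells the type field "ind" instead of "kind", so the deserializer cannot determine the object's Kind. A minimal reproduction with the apimachinery universal deserializer:

package main

import (
	"fmt"

	"k8s.io/client-go/kubernetes/scheme"
)

func main() {
	// "kind" is misspelled "ind", as in busybox-broken.yaml above, so the
	// deserializer cannot determine the object's type.
	broken := []byte(`{"apiVersion":"v1","ind":"Pod","metadata":{"name":"busybox2"}}`)
	_, _, err := scheme.Codecs.UniversalDeserializer().Decode(broken, nil, nil)
	fmt.Println(err) // Object 'Kind' is missing in ...
}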
Successful
message:nginx:
has:nginx:
+++ exit code: 0
Recording: run_kubectl_delete_allnamespaces_tests
... skipping 104 lines ...
has:Timeout
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 155 lines ...
foo.company.com/test patched
crd.sh:280: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:282: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:284: Successful get foos/test {{.patched}}: <no value>
+++ [0609 22:41:40] "kubectl patch --local" returns error as expected for CustomResource: error: strategic merge patch is not supported for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 328 lines ...
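The "kubectl patch --local" failure above reflects that strategic merge patching needs patch-strategy metadata compiled into Go types, which custom resources like Foo lack; a plain JSON merge patch works instead, which is why kubectl suggests --type merge. A hedged dynamic-client sketch (GVR, namespace, and patch body are illustrative):

package main

import (
	"context"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/runtime/schema"
	"k8s.io/apimachinery/pkg/types"
	"k8s.io/client-go/dynamic"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	config, err := clientcmd.BuildConfigFromFlags("", "/tmp/kubeconfig") // illustrative
	if err != nil {
		panic(err)
	}
	client := dynamic.NewForConfigOrDie(config)

	foos := client.Resource(schema.GroupVersionResource{
		Group: "company.com", Version: "v1", Resource: "foos",
	}).Namespace("default") // namespace illustrative

	// RFC 7386 merge patch; a strategic merge patch would be rejected for a
	// custom resource, exactly as "kubectl patch --local" reported above.
	_, err = foos.Patch(context.TODO(), "test",
		types.MergePatchType, []byte(`{"patched":"value3"}`), metav1.PatchOptions{})
	if err != nil {
		panic(err)
	}
}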
crd.sh:510: Successful get bars {{len .items}}: 1
(Bnamespace "non-native-resources" deleted
I0609 22:41:49.658433   58123 client.go:360] parsed scheme: "passthrough"
I0609 22:41:49.658502   58123 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0609 22:41:49.658515   58123 clientconn.go:948] ClientConn switching balancer to "pick_first"
crd.sh:513: Successful get bars {{len .items}}: 0
(BError from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
+++ [0609 22:41:55] Testing recursive resources
... skipping 2 lines ...
Context "test" modified.
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
W0609 22:41:55.726544   58123 cacher.go:149] Terminating all watchers from cacher *unstructured.Unstructured
E0609 22:41:55.728474   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
W0609 22:41:55.825112   58123 cacher.go:149] Terminating all watchers from cacher *unstructured.Unstructured
E0609 22:41:55.827052   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0609 22:41:55.933290   58123 cacher.go:149] Terminating all watchers from cacher *unstructured.Unstructured
E0609 22:41:55.935419   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
W0609 22:41:56.040861   58123 cacher.go:149] Terminating all watchers from cacher *unstructured.Unstructured
E0609 22:41:56.043184   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:Name:         busybox0
Namespace:    namespace-1623278515-20152
Priority:     0
Node:         <none>
... skipping 159 lines ...
has:Object 'Kind' is missing
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0609 22:41:56.882892   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0609 22:41:56.924960   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: resource pods/busybox0 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox0 configured
Warning: resource pods/busybox1 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E0609 22:41:57.205613   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0609 22:41:57.242486   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:273: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:278: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:288: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:293: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:297: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
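Every "Object 'Kind' is missing" failure above comes from the same intentionally broken fixture: the JSON echoed in the error shows "ind" where "kind" should be, so kubectl cannot resolve the object type. A minimal sketch of that fixture and the recursive invocation that surfaces it (the heredoc is illustrative scaffolding; the real file is hack/testdata/recursive/pod/pod/busybox-broken.yaml):

    # Reconstruction of the broken manifest, matching the JSON in the errors:
    cat <<'EOF' > busybox-broken.yaml
    apiVersion: v1
    ind: Pod   # deliberately misspelled; should be "kind: Pod"
    metadata:
      name: busybox2
      labels:
        app: busybox2
    spec:
      restartPolicy: Always
      containers:
      - name: busybox
        image: busybox
        imagePullPolicy: IfNotPresent
        command: ["sleep", "3600"]
    EOF
    # Recursive processing reports the decode error but keeps going, which is
    # why busybox0 and busybox1 are still created, labeled, and patched above:
    kubectl apply -f hack/testdata/recursive/pod --recursive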
generic-resources.sh:302: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0609 22:41:58.392522   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-sngs7"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0609 22:41:58.401327   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-2lvnd"
generic-resources.sh:306: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:311: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0609 22:41:58.581418   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:312: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:313: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:318: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
E0609 22:41:58.989642   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:319: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:328: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:329: Successful get rc busybox1 {{.spec.replicas}}: 1
I0609 22:41:59.455647   61795 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:333: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:334: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:340: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0609 22:41:59.817652   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:341: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:342: Successful get rc busybox1 {{.spec.replicas}}: 1
I0609 22:42:00.040123   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-q8ncv"
I0609 22:42:00.056069   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-t6krl"
generic-resources.sh:346: Successful get rc busybox0 {{.spec.replicas}}: 2
E0609 22:42:00.184219   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:347: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:356: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:361: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
I0609 22:42:00.751792   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/nginx1-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx1-deployment-758b5949b6 to 2"
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0609 22:42:00.757766   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-x44w4"
I0609 22:42:00.759772   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/nginx0-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx0-deployment-75db9cdfd9 to 2"
I0609 22:42:00.767017   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-ntghr"
I0609 22:42:00.772433   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-h8gn2"
I0609 22:42:00.782480   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-zrght"
generic-resources.sh:365: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:366: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:370: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:378: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
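The three checks above match different lines of the same output, which looks like what a recursive rollout-history call over this tree produces; a sketch under that assumption:

    # History for every deployment found under the directory; the broken
    # manifest fails to decode but the walk continues.
    kubectl rollout history -f hack/testdata/recursive/deployment --recursive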
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0609 22:42:02.203097   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0609 22:42:02.771877   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:400: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0609 22:42:03.041687   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-k8fdp"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0609 22:42:03.160999   61795 event.go:291] "Event occurred" object="namespace-1623278515-20152/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-k9vzx"
generic-resources.sh:404: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
... skipping 2 lines ...
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0609 22:42:03.821536   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0609 22:42:04] Testing kubectl(v1:namespaces)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created (dry run)
namespace/my-namespace created (server dry run)
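The "(dry run)" and "(server dry run)" suffixes correspond to kubectl's two dry-run modes, neither of which persists the object, which is why the next lookup still reports NotFound. A sketch:

    # Client-side dry run: the object is only printed, never sent for persistence.
    kubectl create namespace my-namespace --dry-run=client
    # Server-side dry run: admitted and validated by the API server, not stored.
    kubectl create namespace my-namespace --dry-run=server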
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1471: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
query for namespaces had limit param
query for resourcequotas had limit param
query for limitranges had limit param
... skipping 123 lines ...
I0609 22:42:05.486315   78838 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278496-17539/resourcequotas?limit=500 200 OK in 1 milliseconds
I0609 22:42:05.488198   78838 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278496-17539/limitranges?limit=500 200 OK in 1 milliseconds
I0609 22:42:05.490160   78838 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278515-20152 200 OK in 1 milliseconds
I0609 22:42:05.491963   78838 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278515-20152/resourcequotas?limit=500 200 OK in 1 milliseconds
I0609 22:42:05.493427   78838 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278515-20152/limitranges?limit=500 200 OK in 1 milliseconds
(Bnamespace "my-namespace" deleted
E0609 22:42:06.022264   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1482: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0609 22:42:11.505974   61795 shared_informer.go:240] Waiting for caches to sync for resource quota
I0609 22:42:11.506030   61795 shared_informer.go:247] Caches are synced for resource quota 
Successful
... skipping 33 lines ...
namespace "namespace-1623278491-19867" deleted
namespace "namespace-1623278491-757" deleted
namespace "namespace-1623278492-10603" deleted
namespace "namespace-1623278494-3759" deleted
namespace "namespace-1623278496-17539" deleted
namespace "namespace-1623278515-20152" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1623278322-29176" deleted
... skipping 29 lines ...
namespace "namespace-1623278491-19867" deleted
namespace "namespace-1623278491-757" deleted
namespace "namespace-1623278492-10603" deleted
namespace "namespace-1623278494-3759" deleted
namespace "namespace-1623278496-17539" deleted
namespace "namespace-1623278515-20152" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
namespace/quotas created
I0609 22:42:11.924531   61795 shared_informer.go:240] Waiting for caches to sync for garbage collector
I0609 22:42:11.924604   61795 shared_informer.go:247] Caches are synced for garbage collector 
core.sh:1489: Successful get namespaces/quotas {{.metadata.name}}: quotas
core.sh:1490: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
... skipping 9 lines ...
I0609 22:42:12.669007   79043 round_trippers.go:454] GET https://127.0.0.1:6443/version?timeout=32s 200 OK in 7 milliseconds
I0609 22:42:12.689737   79043 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/quotas/resourcequotas?limit=500 200 OK in 2 milliseconds
I0609 22:42:12.693020   79043 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/quotas/resourcequotas/test-quota 200 OK in 2 milliseconds
I0609 22:42:12.844680   61795 resource_quota_controller.go:307] Resource quota has been deleted quotas/test-quota
resourcequota "test-quota" deleted
namespace "quotas" deleted
E0609 22:42:13.319964   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0609 22:42:13.426144   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0609 22:42:13.828498   61795 horizontal.go:361] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1623278515-20152
I0609 22:42:13.834953   61795 horizontal.go:361] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1623278515-20152
E0609 22:42:14.698022   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0609 22:42:17.153583   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1511: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1515: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1519: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
core.sh:1523: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1525: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1532: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1536: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
(Bnamespace "other" deleted
... skipping 123 lines ...
(Bsecret "test-secret" deleted
secret/secret-string-data created
core.sh:919: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:920: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
core.sh:921: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
secret "secret-string-data" deleted
E0609 22:42:28.030100   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:930: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
(Bsecret "test-secret" deleted
namespace "test-secrets" deleted
I0609 22:42:29.580401   61795 namespace_controller.go:185] Namespace has been deleted other
E0609 22:42:31.858919   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 5 lines ...
configmap/test-configmap created
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
(Bconfigmap "test-configmap" deleted
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
E0609 22:42:34.447579   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created (dry run)
configmap/test-configmap created (server dry run)
core.sh:46: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
... skipping 14 lines ...
I0609 22:42:35.518201   80233 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/test-configmaps/configmaps/test-configmap 200 OK in 2 milliseconds
I0609 22:42:35.520618   80233 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/test-configmaps/events?fieldSelector=involvedObject.name%3Dtest-configmap%2CinvolvedObject.namespace%3Dtest-configmaps%2CinvolvedObject.kind%3DConfigMap%2CinvolvedObject.uid%3D510afa1c-87f8-44cb-9fa8-6e3359364dfd&limit=500 200 OK in 2 milliseconds
(Bconfigmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
I0609 22:42:38.566524   61795 namespace_controller.go:185] Namespace has been deleted test-secrets
E0609 22:42:40.068990   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0609 22:42:41] Creating namespace namespace-1623278561-17224
namespace/namespace-1623278561-17224 created
Context "test" modified.
+++ [0609 22:42:41] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
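Each check above pins one failure mode of client configuration loading; sketches of invocations that produce these messages, using the same placeholder names as the test:

    kubectl get pods --kubeconfig=missing        # stat missing: no such file or directory
    kubectl get pods --context=missing-context   # context was not found
    kubectl get pods --cluster=missing-cluster   # no server found for cluster
    kubectl get pods --user=missing-user         # auth info does not exist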
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
... skipping 58 lines ...
Labels:                        <none>
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  <none>
... skipping 51 lines ...
                  job-name=test-job
Annotations:      cronjob.kubernetes.io/instantiate: manual
Parallelism:      1
Completions:      1
Completion Mode:  NonIndexed
Start Time:       Wed, 09 Jun 2021 22:42:50 +0000
Pods Statuses:    1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=2b421418-a50e-4034-8843-ef9cd9a2bfd5
           job-name=test-job
  Containers:
   pi:
    Image:      k8s.gcr.io/perl
... skipping 447 lines ...
  type: ClusterIP
status:
  loadBalancer: {}
Successful
message:kubectl-create kubectl-set
has:kubectl-set
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
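That error is kubectl refusing --local without an explicit file input, since --local means nothing is fetched from the server. A sketch of the failing and working shapes (the file name is illustrative):

    # Fails: with --local there is no resource unless one is supplied via -f.
    kubectl set selector service/redis-master role=padawan --local -o yaml
    # Works: the object comes from the file and the server is never contacted.
    kubectl set selector -f redis-master-service.yaml role=padawan --local -o yaml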
core.sh:1034: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
service/redis-master selector updated
I0609 22:43:01.228228   61795 namespace_controller.go:185] Namespace has been deleted test-jobs
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
core.sh:1047: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
(Bservice "redis-master" deleted
core.sh:1054: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1058: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E0609 22:43:01.700907   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
core.sh:1062: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
core.sh:1066: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service/service-v1-test created
core.sh:1087: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
service/service-v1-test replaced
core.sh:1094: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
(Bservice "redis-master" deleted
service "service-v1-test" deleted
core.sh:1102: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1106: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
E0609 22:43:03.239372   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-slave created
core.sh:1111: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
Successful
message:NAME           RSRC
kubernetes     202
redis-master   1924
... skipping 41 lines ...
Recording: run_daemonset_tests
Running command: run_daemonset_tests

+++ Running case: test-cmd.run_daemonset_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_tests
E0609 22:43:05.880897   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [0609 22:43:05] Creating namespace namespace-1623278585-29044
namespace/namespace-1623278585-29044 created
Context "test" modified.
+++ [0609 22:43:05] Testing kubectl(v1:daemonsets)
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I0609 22:43:06.274163   58123 controller.go:611] quota admission added evaluator for: daemonsets.apps
... skipping 66 lines ...
apps.sh:90: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:91: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
apps.sh:94: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:95: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
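The rollback checks around daemonset/bind are consistent with undo calls of this shape (a sketch):

    # Roll back to the previous revision, then request one that does not
    # exist; the second call fails without modifying the daemonset.
    kubectl rollout undo daemonset/bind
    kubectl rollout undo daemonset/bind --to-revision=1000000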
apps.sh:99: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:100: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind rolled back
apps.sh:103: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:104: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
... skipping 36 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 27 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1623278590-20860
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 23 lines ...
I0609 22:43:12.678114   83780 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278590-20860/pods?labelSelector=app%3Dguestbook%2Ctier%3Dfrontend&limit=500 200 OK in 4 milliseconds
I0609 22:43:12.687223   83780 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/namespace-1623278590-20860/events?fieldSelector=involvedObject.kind%3DReplicationController%2CinvolvedObject.uid%3Db92ba0bb-ff90-404f-b478-44aa8726b83f%2CinvolvedObject.name%3Dfrontend%2CinvolvedObject.namespace%3Dnamespace-1623278590-20860&limit=500 200 OK in 3 milliseconds
core.sh:1240: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0609 22:43:12.962367   61795 replica_set.go:200] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1623278590-20860  b92ba0bb-ff90-404f-b478-44aa8726b83f 2044 2 2021-06-09 22:43:11 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kubectl Update v1 <nil> FieldsV1 {"f:spec":{"f:replicas":{}}} scale} {kube-controller-manager Update v1 2021-06-09 22:43:11 +0000 UTC FieldsV1 {"f:status":{"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:replicas":{}}} status} {kubectl-create Update v1 2021-06-09 22:43:11 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{"f:selector":{},"f:template":{".":{},"f:metadata":{".":{},"f:creationTimestamp":{},"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{".":{},"f:containers":{".":{},"k:{\"name\":\"php-redis\"}":{".":{},"f:env":{".":{},"k:{\"name\":\"GET_HOSTS_FROM\"}":{".":{},"f:name":{},"f:value":{}}},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:ports":{".":{},"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":{".":{},"f:containerPort":{},"f:protocol":{}}},"f:resources":{".":{},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}} }]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002e922f8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] [] <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0609 22:43:12.969887   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: frontend-rwc8j"
E0609 22:43:12.971853   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1244: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1248: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
core.sh:1252: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1256: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller/frontend scaled
I0609 22:43:13.522064   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-qv47r"
core.sh:1260: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1264: Successful get rc frontend {{.spec.replicas}}: 3
... skipping 31 lines ...
(Bdeployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I0609 22:43:15.638629   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-76b5cd66f5 to 3"
I0609 22:43:15.645183   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-z6v6z"
I0609 22:43:15.651143   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-9cwfk"
... skipping 20 lines ...
(Bpod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
... skipping 30 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1403: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1407: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
core.sh:1416: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0609 22:43:20.467991   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-748ddcb48b to 3"
I0609 22:43:20.475500   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-vwp9d"
I0609 22:43:20.481990   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-njtxn"
I0609 22:43:20.484181   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-vssx5"
core.sh:1422: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1423: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1424: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0609 22:43:20.823386   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-7bfb7d56b6 to 1"
I0609 22:43:20.834205   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources-7bfb7d56b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-7bfb7d56b6-9jp45"
core.sh:1427: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1428: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0609 22:43:21.168786   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-resources-748ddcb48b to 2"
I0609 22:43:21.178464   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-resources-748ddcb48b-vwp9d"
I0609 22:43:21.179885   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-75dbcccf44 to 1"
I0609 22:43:21.184995   61795 event.go:291] "Event occurred" object="namespace-1623278590-20860/nginx-deployment-resources-75dbcccf44" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-75dbcccf44-2pv69"
core.sh:1433: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
... skipping 155 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
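That usage error occurs because --local only renders the change client-side, so the object must come from an input manifest rather than from the server; a minimal sketch, assuming a local file named nginx-deployment-resources.yaml (the filename is illustrative):

   kubectl set resources --local --limits=cpu=200m -o yaml                                      # fails: no --filename given
   kubectl set resources -f nginx-deployment-resources.yaml --local --limits=cpu=200m -o yaml   # prints the modified manifest without contacting the server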
core.sh:1444: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1445: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1446: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 46 lines ...
                pod-template-hash=69dd6dcd84
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=69dd6dcd84
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 123 lines ...
apps.sh:311: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
deployment.apps/nginx rolled back (server dry run)
apps.sh:315: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
apps.sh:319: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:322: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
apps.sh:326: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
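The revision-related errors above correspond to ordinary kubectl rollout subcommands; a minimal, illustrative sketch (deployment name from this test, revision numbers illustrative):

   kubectl rollout history deployment/nginx                 # list the revisions recorded in history
   kubectl rollout undo deployment/nginx --to-revision=2    # roll back to a specific revision; an unknown revision fails as above
   kubectl rollout resume deployment/nginx                  # required first when the deployment is paused
   kubectl rollout status deployment/nginx --revision=3     # appears to be the source of the "desired revision ... is different" error when revision 3 is not the one running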
deployment.apps/nginx restarted
I0609 22:43:32.334537   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-54785cbcb8 to 2"
I0609 22:43:32.342867   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-54785cbcb8-ms7ch"
I0609 22:43:32.347754   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-bc466c859 to 1"
I0609 22:43:32.354530   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx-bc466c859" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-bc466c859-8vtb8"
Successful
... skipping 84 lines ...
apps.sh:370: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0609 22:43:35.166520   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-6dd48b9849 to 1"
I0609 22:43:35.175413   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx-deployment-6dd48b9849" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-6dd48b9849-t4mrm"
apps.sh:373: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:374: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
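As with set resources earlier, kubectl set image resolves its name=image pairs against the pod template, so an unknown container name is rejected; a minimal sketch (names from the surrounding assertions):

   kubectl set image deployment/nginx-deployment redis=k8s.gcr.io/nginx:1.7.9   # fails: unable to find container named "redis"
   kubectl set image deployment/nginx-deployment nginx=k8s.gcr.io/nginx:1.7.9   # succeeds: nginx is container 0 in the template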
apps.sh:379: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:380: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:383: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:384: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
... skipping 49 lines ...
I0609 22:43:39.246091   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-7584fc66fd to 1"
deployment.apps/nginx-deployment env updated
I0609 22:43:39.302610   61795 event.go:291] "Event occurred" object="namespace-1623278602-31006/nginx-deployment-b8c4df945" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-b8c4df945-nsmtm"
deployment.apps/nginx-deployment env updated
deployment.apps "nginx-deployment" deleted
configmap "test-set-env-config" deleted
E0609 22:43:39.545137   61795 replica_set.go:531] sync "namespace-1623278602-31006/nginx-deployment-68d657fb6" failed with replicasets.apps "nginx-deployment-68d657fb6" not found
secret "test-set-env-secret" deleted
+++ exit code: 0
E0609 22:43:39.646195   61795 replica_set.go:531] sync "namespace-1623278602-31006/nginx-deployment-7584fc66fd" failed with Operation cannot be fulfilled on replicasets.apps "nginx-deployment-7584fc66fd": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1623278602-31006/nginx-deployment-7584fc66fd, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 6b8c5b08-2a77-4b64-ba8f-18f3fbada953, UID in object meta: 
Recording: run_rs_tests
Running command: run_rs_tests
E0609 22:43:39.695724   61795 replica_set.go:531] sync "namespace-1623278602-31006/nginx-deployment-b8c4df945" failed with replicasets.apps "nginx-deployment-b8c4df945" not found

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0609 22:43:39] Creating namespace namespace-1623278619-5500
E0609 22:43:39.745468   61795 replica_set.go:531] sync "namespace-1623278602-31006/nginx-deployment-57ddd474c4" failed with replicasets.apps "nginx-deployment-57ddd474c4" not found
namespace/namespace-1623278619-5500 created
Context "test" modified.
+++ [0609 22:43:39] Testing kubectl(v1:replicasets)
apps.sh:550: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0609 22:43:40.191296   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-j7vv2"
... skipping 7 lines ...
I0609 22:43:40.655796   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-xrnts"
I0609 22:43:40.662870   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-hgmlj"
I0609 22:43:40.663556   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-hnws4"
apps.sh:564: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0609 22:43:40] Deleting rs
replicaset.apps "frontend" deleted
E0609 22:43:40.896282   61795 replica_set.go:531] sync "namespace-1623278619-5500/frontend" failed with Operation cannot be fulfilled on replicasets.apps "frontend": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1623278619-5500/frontend, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: f0ffbb50-753f-4b44-b789-9a11b12282d2, UID in object meta: 
apps.sh:568: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:570: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
(Bpod "frontend-hgmlj" deleted
pod "frontend-hnws4" deleted
pod "frontend-xrnts" deleted
apps.sh:573: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:577: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E0609 22:43:41.497810   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I0609 22:43:41.504631   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-khnvv"
I0609 22:43:41.508685   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-rcw8x"
I0609 22:43:41.511093   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-wtqcm"
apps.sh:581: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
matched Name:
... skipping 8 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1623278619-5500
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 139 lines ...
deployment.apps/scale-2 scaled
deployment.apps/scale-3 scaled
deployment.apps/scale-1 scaled
deployment.apps/scale-2 scaled
deployment.apps/scale-3 scaled
apps.sh:625: Successful get deploy scale-1 {{.spec.replicas}}: 1
E0609 22:43:44.744886   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:626: Successful get deploy scale-2 {{.spec.replicas}}: 1
apps.sh:627: Successful get deploy scale-3 {{.spec.replicas}}: 1
deployment.apps/scale-1 scaled
I0609 22:43:44.906829   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/scale-1" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set scale-1-6865bdcf4d to 2"
deployment.apps/scale-2 scaled
I0609 22:43:44.911182   61795 event.go:291] "Event occurred" object="namespace-1623278619-5500/scale-1-6865bdcf4d" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: scale-1-6865bdcf4d-9hf8n"
... skipping 36 lines ...
replicaset.apps/frontend resource requirements updated (server dry run)
apps.sh:662: Successful get rs frontend {{.metadata.generation}}: 3
replicaset.apps/frontend resource requirements updated
apps.sh:664: Successful get rs frontend {{.metadata.generation}}: 4
replicaset.apps/frontend serviceaccount updated (dry run)
replicaset.apps/frontend serviceaccount updated (server dry run)
E0609 22:43:47.780466   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:667: Successful get rs frontend {{.metadata.generation}}: 4
replicaset.apps/frontend serviceaccount updated
apps.sh:669: Successful get rs frontend {{.metadata.generation}}: 5
Successful
message:kube-controller-manager kubectl-create kubectl-set
has:kubectl-set
... skipping 25 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:713: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
Successful
message:kubectl-autoscale
has:kubectl-autoscale
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
+++ [0609 22:43:50] Creating namespace namespace-1623278630-4292
namespace/namespace-1623278630-4292 created
Context "test" modified.
+++ [0609 22:43:50] Testing kubectl(v1:statefulsets)
apps.sh:506: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
E0609 22:43:50.597578   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0609 22:43:50.792728   58123 controller.go:611] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
query for statefulsets had limit param
query for pods had limit param
query for events had limit param
query for statefulsets had user-specified limit param
... skipping 59 lines ...
apps.sh:472: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:473: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
apps.sh:476: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:477: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:481: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:482: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:485: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:486: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 61 lines ...
Name:         mock
Namespace:    namespace-1623278635-8279
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.5
    Port:         9949/TCP
... skipping 59 lines ...
Name:         mock
Namespace:    namespace-1623278635-8279
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.5
    Port:         9949/TCP
... skipping 59 lines ...
Name:         mock
Namespace:    namespace-1623278635-8279
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.5
    Port:         9949/TCP
... skipping 41 lines ...
Namespace:    namespace-1623278635-8279
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.5
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1623278635-8279
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.5
    Port:         9949/TCP
... skipping 113 lines ...
+++ [0609 22:44:08] Creating namespace namespace-1623278648-1323
namespace/namespace-1623278648-1323 created
Context "test" modified.
+++ [0609 22:44:08] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
E0609 22:44:08.891580   61795 pv_protection_controller.go:118] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
persistentvolume "pv0001" deleted
persistentvolume/pv0002 created
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
... skipping 107 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 32 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 39 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 09 Jun 2021 22:38:41 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 09 Jun 2021 22:38:41 +0000   Wed, 09 Jun 2021 22:39:45 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 38 lines ...
I0609 22:44:14.698504   91873 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/events?fieldSelector=involvedObject.uid%3D127.0.0.1%2CinvolvedObject.name%3D127.0.0.1%2CinvolvedObject.namespace%3D%2CinvolvedObject.kind%3DNode&limit=500 200 OK in 13 milliseconds
I0609 22:44:14.702193   91873 round_trippers.go:454] GET https://127.0.0.1:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/127.0.0.1 404 Not Found in 2 milliseconds
core.sh:1573: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 patched
core.sh:1576: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
node/127.0.0.1 patched
E0609 22:44:15.241778   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1579: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
Recording: run_exec_credentials_tests
Running command: run_exec_credentials_tests

... skipping 59 lines ...
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
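kubectl auth can-i treats its second argument either as a resource (which may take --subresource) or as a non-resource URL, but never both at once; a minimal sketch of the two forms (subresource and URL are illustrative):

   kubectl auth can-i get pods --subresource=log   # resource form: may the pod log subresource be read?
   kubectl auth can-i get /logs                    # non-resource URL form: --subresource is rejected here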
Successful
Successful
message:yes
0
has:0
... skipping 59 lines ...
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:847: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:848: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:849: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:850: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
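This restriction appears to come from kubectl auth reconcile, which accepts RBAC objects only at rbac.authorization.k8s.io/v1. A minimal conforming manifest matching the rule printed above (filename illustrative):

   # clusterrole.yaml -- apply with: kubectl auth reconcile -f clusterrole.yaml
   apiVersion: rbac.authorization.k8s.io/v1   # a v1beta1 apiVersion here reproduces the error above
   kind: ClusterRole
   metadata:
     name: testing-CR
   rules:
   - apiGroups: [""]
     resources: ["configmaps"]
     verbs: ["get", "list", "watch"]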
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 24 lines ...
discovery.sh:91: Successful get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}: cassandra:cassandra:cassandra:cassandra:
I0609 22:44:19.826584   61795 event.go:291] "Event occurred" object="namespace-1623278659-6528/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-b2kmn"
pod "cassandra-gpwwx" deleted
pod "cassandra-sjlgh" deleted
I0609 22:44:19.845753   61795 event.go:291] "Event occurred" object="namespace-1623278659-6528/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-9ldwc"
replicationcontroller "cassandra" deleted
E0609 22:44:19.860090   61795 replica_set.go:531] sync "namespace-1623278659-6528/cassandra" failed with replicationcontrollers "cassandra" not found
service "cassandra" deleted
+++ exit code: 0
Recording: run_kubectl_explain_tests
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
... skipping 120 lines ...

+++ Running case: test-cmd.run_kubectl_sort_by_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_sort_by_tests
+++ [0609 22:44:20] Testing kubectl --sort-by
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E0609 22:44:21.020626   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
No resources found in namespace-1623278659-6528 namespace.
No resources found in namespace-1623278659-6528 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
... skipping 243 lines ...
namespace-1623278648-1323    default   0         16s
namespace-1623278650-4       default   0         14s
namespace-1623278659-6528    default   0         5s
some-other-random            default   0         6s
has:all-ns-test-2
namespace "all-ns-test-1" deleted
E0609 22:44:29.378511   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-2" deleted
E0609 22:44:30.932953   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0609 22:44:35.132073   61795 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
get.sh:392: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:396: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:400: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
... skipping 10 lines ...
message:Warning: policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
No resources found
has:PodSecurityPolicy is deprecated
Successful
message:Warning: policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
No resources found
Error: 1 warning received
has:PodSecurityPolicy is deprecated
Successful
message:Warning: policy/v1beta1 PodSecurityPolicy is deprecated in v1.21+, unavailable in v1.25+
No resources found
Error: 1 warning received
has:Error: 1 warning received
Recording: run_template_output_tests
Running command: run_template_output_tests

+++ Running case: test-cmd.run_template_output_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_template_output_tests
... skipping 558 lines ...
node/127.0.0.1 cordoned (server dry run)
WARNING: deleting Pods not managed by ReplicationController, ReplicaSet, Job, DaemonSet or StatefulSet: namespace-1623278684-22066/test-pod-1
evicting pod namespace-1623278684-22066/test-pod-1 (server dry run)
node/127.0.0.1 drained (server dry run)
node-management.sh:140: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
WARNING: deleting Pods not managed by ReplicationController, ReplicaSet, Job, DaemonSet or StatefulSet: namespace-1623278684-22066/test-pod-1
E0609 22:44:54.618308   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0609 22:45:06.836557   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0609 22:45:18.916897   58123 client.go:360] parsed scheme: "passthrough"
I0609 22:45:18.916992   58123 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0609 22:45:18.917004   58123 clientconn.go:948] ClientConn switching balancer to "pick_first"
Successful
message:node/127.0.0.1 cordoned
evicting pod namespace-1623278684-22066/test-pod-1
... skipping 17 lines ...
message:node/127.0.0.1 already uncordoned (server dry run)
has:already uncordoned
node-management.sh:161: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
node-management.sh:166: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
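kubectl drain resolves its targets either from positional node names or from a --selector, and refuses a mix of the two; a minimal sketch (node name from this test, label illustrative):

   kubectl drain 127.0.0.1 --selector=env=test   # fails: cannot specify both a node name and a --selector option
   kubectl drain --selector=env=test             # selector form: drains every node matching the label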
node-management.sh:172: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
node-management.sh:174: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:176: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
Successful
... skipping 78 lines ...
WARNING: deleting Pods not managed by ReplicationController, ReplicaSet, Job, DaemonSet or StatefulSet: namespace-1623278684-22066/test-pod-1, namespace-1623278684-22066/test-pod-2
evicting pod namespace-1623278684-22066/test-pod-1 (dry run)
evicting pod namespace-1623278684-22066/test-pod-2 (dry run)
node/127.0.0.1 drained (dry run)
has:/v1/pods?fieldSelector=spec.nodeName%3D127.0.0.1&limit=500 200 OK
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 18 lines ...
+++ [0609 22:45:22] Testing kubectl plugins
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"
error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable to read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
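The plugin discovery behind these messages is purely PATH-based: kubectl treats any executable named kubectl-<name> found on PATH as the subcommand <name>; a minimal sketch (install location illustrative):

   cat > /usr/local/bin/kubectl-foo <<'EOF'
#!/usr/bin/env bash
echo "I am plugin foo"
EOF
   chmod +x /usr/local/bin/kubectl-foo
   kubectl foo    # kubectl dispatches to the kubectl-foo executable and prints: I am plugin foo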
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 10 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0609 22:45:23] Testing impersonation
Successful
message:error: requesting groups or user-extra for test-admin without impersonating a user
has:without impersonating a user
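The impersonation error above is raised when group or extra impersonation flags are supplied without a user; a minimal sketch (user and group names illustrative):

   kubectl get pods --as-group=system:masters              # fails: groups requested without impersonating a user
   kubectl get pods --as=user1 --as-group=system:masters   # impersonates user1 with an additional group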
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:57: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:58: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificatesigningrequest.certificates.k8s.io/foo created
... skipping 15 lines ...
deployment.apps/test-1 created
I0609 22:45:24.286510   61795 event.go:291] "Event occurred" object="namespace-1623278724-17705/test-1-7487ff9cbb" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-1-7487ff9cbb-rzn5q"
I0609 22:45:24.351304   61795 event.go:291] "Event occurred" object="namespace-1623278724-17705/test-2" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set test-2-646997777c to 1"
I0609 22:45:24.356505   61795 event.go:291] "Event occurred" object="namespace-1623278724-17705/test-2-646997777c" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-2-646997777c-gfg2h"
deployment.apps/test-2 created
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E0609 22:45:24.738230   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0609 22:45:26.070517   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
... skipping 37 lines ...
Running command: run_kubectl_debug_node_tests

+++ Running case: test-cmd.run_kubectl_debug_node_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_debug_node_tests
+++ [0609 22:45:28] Creating namespace namespace-1623278728-18151
E0609 22:45:28.609787   61795 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1623278728-18151 created
Context "test" modified.
+++ [0609 22:45:28] Testing kubectl debug (pod tests)
debug.sh:80: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
debug.sh:84: Successful get pod {{(len .items)}}: 1
Successful
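These assertions exercise kubectl debug against a node; a minimal sketch of the command shape being tested (node name from this cluster, image illustrative):

   kubectl debug node/127.0.0.1 -it --image=busybox   # runs a debugging pod on the node, with the host filesystem mounted at /host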
... skipping 44 lines ...
I0609 22:45:29.868908   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.869128   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping repeated "blockingPicker" and "createTransport failed" lines ...
E0609 22:45:29.870253   58123 controller.go:184] rpc error: code = Unavailable desc = transport is closing
... skipping repeated "blockingPicker" and "createTransport failed" lines ...
I0609 22:45:29.871755   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.871773   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871825   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871495   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871680   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.871858   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.871872   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871876   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871826   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871910   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871928   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.871947   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.871950   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.871967   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872018   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.872085   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872118   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872149   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872181   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872204   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872254   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872263   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872353   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872355   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0609 22:45:29.872463   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.872479   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872510   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872539   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872584   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.872645   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.872654   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872702   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872748   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.872746   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.872786   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872792   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872838   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872848   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.872856   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.872882   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872903   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872929   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.872946   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.872951   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.873001   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0609 22:45:29.873047   58123 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0609 22:45:29.873092   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873095   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873146   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873181   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873262   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873313   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873323   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873355   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873368   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873422   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873437   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0609 22:45:29.873618   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
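The warnings above all come from grpc-go's clientconn.go: once the local etcd on 127.0.0.1:2379 has been torn down during cleanup, every cached ClientConn keeps cycling between CONNECTING and TRANSIENT_FAILURE until the test process exits. A minimal standalone sketch of that reconnect loop, assuming google.golang.org/grpc of roughly the vintage in this log (only the address comes from the log; everything else is illustrative):

// Sketch only, not code from the Kubernetes tree: dial a port with no
// listener and watch the channel cycle through CONNECTING and
// TRANSIENT_FAILURE, which is what emits the "createTransport failed to
// connect ... Reconnecting..." warnings above.
package main

import (
	"context"
	"fmt"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/connectivity"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()

	// Non-blocking dial: it succeeds even though nothing listens on the port.
	conn, err := grpc.DialContext(ctx, "127.0.0.1:2379", grpc.WithInsecure())
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	conn.Connect() // force the channel out of IDLE so it starts dialing

	// Observe the reconnect loop until the context deadline expires.
	for state := conn.GetState(); state != connectivity.Ready; state = conn.GetState() {
		fmt.Println("state:", state)
		if !conn.WaitForStateChange(ctx, state) {
			return // deadline reached; the backend never came back, as in the log
		}
	}
}

With nothing listening, the states alternate with exponential backoff and the channel never reaches Ready, which matches the createTransport/repick churn in the lines above.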
junit report dir: /logs/artifacts
+++ [0609 22:45:29] Clean up complete
+ make test-integration
W0609 22:45:30.870488   58123 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 76 lines ...
+++ [0609 22:45:34] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [0609 22:45:34] Starting etcd instance
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.MvvDWYNwRO --listen-client-urls http://127.0.0.1:2379 --log-level=debug > "/logs/artifacts/etcd.c0f14e1d-c971-11eb-a09e-ca7f85cb5ec2.root.log.DEBUG.20210609-224534.98105" 2>/dev/null
Waiting for etcd to come up.
+++ [0609 22:45:35] On try 2, etcd: : {"health":"true"}
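The "On try 2" line is the harness polling etcd's /health endpoint once per second until it reports {"health":"true"}. A minimal sketch of that readiness check (the real wait loop is a shell helper in the repo; only the URL, port, and output format are taken from the log):

// Sketch only: poll http://127.0.0.1:2379/health until etcd answers healthy,
// printing a line in the same shape as the harness output above.
package main

import (
	"fmt"
	"io"
	"net/http"
	"strings"
	"time"
)

func main() {
	for try := 1; ; try++ {
		resp, err := http.Get("http://127.0.0.1:2379/health")
		if err == nil {
			body, _ := io.ReadAll(resp.Body)
			resp.Body.Close()
			if strings.Contains(string(body), `"health":"true"`) {
				fmt.Printf("On try %d, etcd: %s\n", try, strings.TrimSpace(string(body)))
				return
			}
		}
		time.Sleep(time.Second) // etcd not up yet; retry, as the script does
	}
}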
{"header":{"cluster_id":"14841639068965178418","member_id":"10276657743932975437","revision":"2","raft_term":"2"}}+++ [0609 22:45:35] Running integration test cases
+++ [0609 22:45:40] Running tests without code coverage
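Everything below is the raw go test -json event stream: one JSON object per line with Time/Action/Package/Test/Output fields, where Output carries the escaped goroutine stack traces. A minimal sketch that folds such a stream back into readable text, assuming only the standard test2json event schema (the type name testEvent is illustrative):

// Sketch only: read go test -json events from stdin and reprint the raw
// test output, undoing the per-line JSON escaping seen below.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

type testEvent struct {
	Time    string `json:"Time"`
	Action  string `json:"Action"`
	Package string `json:"Package"`
	Test    string `json:"Test,omitempty"`
	Output  string `json:"Output,omitempty"`
}

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // stack-trace lines can be long
	for sc.Scan() {
		var ev testEvent
		if err := json.Unmarshal(sc.Bytes(), &ev); err != nil {
			continue // not a JSON event line (e.g. harness chatter); skip it
		}
		if ev.Action == "output" {
			fmt.Print(ev.Output)
		}
	}
}

Piping the lines that follow through it reprints each package's output in order, which is usually easier to scan than the escaped JSON records themselves.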
{"Time":"2021-06-09T22:48:55.254578142Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/podlogs","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/podlogs\t7.331s\n"}
{"Time":"2021-06-09T22:49:03.817404716Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"tes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:227 +0xb2\\nnet/http.Error(0x7fa4ec3866d8, 0xc005bcd450, 0xc0031edf80, 0x60, 0x1f4)\\n\\t/usr/local/go/src/net/http/server.go:2081 +0x1f6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.InternalError(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00, 0x54d1ac0, 0xc006856048)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/errors.go:75 +0x11a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:69 +0x53a\\nnet/http.HandlerFunc.ServeHTTP(0xc006b0d4c0, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStar"}
{"Time":"2021-06-09T22:49:03.817422244Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"s.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006649e00, 0xc0015a4cb7, 0xc000498af0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006649e00, 0xc000498af0, 0xc005ea2820)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc006b08b00, 0x552e398, 0xc006488600, 0xc"}
{"Time":"2021-06-09T22:49:03.817440435Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"bcd450, 0xc004063e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc006b21d10, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x23b6\\nnet/http.HandlerFunc.ServeHTTP(0xc006b0d5c0, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1("}
{"Time":"2021-06-09T22:49:03.817459675Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"ts/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc006b0d640, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc006b21da0, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc006b0a780, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063d00)\\n\\t/usr/local/go/src/net/htt"}
{"Time":"2021-06-09T22:49:03.817467559Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"p/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc005bcd450, 0xc004063c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc006b0d680, 0x7fa4ec3866d8, 0xc005bcd450, 0xc004063c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0031edf20, 0xc0069ed518, 0x552efd8, 0xc005bcd450, 0xc004063c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nloggi"}
{"Time":"2021-06-09T22:49:13.504396898Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":"/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc011b17b60, 0x1f7)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0118e5b60, 0xc0093ec000, 0xa3, 0x11c1, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc011b99288, 0x4c88c60, 0xc0118d9540, 0x81285a, 0x4d2c236)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0006be280, 0x54da978, 0xc0118d9540, 0x54cd9a0, 0xc0118e5b60, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s"}
{"Time":"2021-06-09T22:49:13.504406254Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestAuthModeAlwaysAllow","Output":".io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0006be280, 0x54da978, 0xc0118d9540, 0x54cd9a0, 0xc0118e5b60, 0x3dc5e32, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0118d95e0, 0x54da978, 0xc0118d9540, 0x54cd9a0, 0xc0118e5b60, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0118d95e0, 0x54da978, 0xc0118d9540, 0x54cd9a0, 0xc0118e5b60, 0xc00018aa80, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/a"}
... skipping 35 lines ...
{"Time":"2021-06-09T22:49:40.886942302Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"26260450, 0xc026203a40)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0179b9d40, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d439c4, 0xe, 0xc0179b9d40, 0xc01871dab0, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterl"}
{"Time":"2021-06-09T22:49:40.886963026Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"iserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc019bdfe80, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc018faa660, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuber"}
{"Time":"2021-06-09T22:49:40.886971787Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"netes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc025ddd8c0, 0xc0255cacb7, 0xc0262039d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc025ddd8c0, 0xc0262039d0, 0xc026253320)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc018f1df00, 0x552e398, 0xc0262602d0, 0xc02620ebb0, 0x552ee18, 0xc02622bb40, 0x1, 0xc026253240, 0xc026253250, 0xc026203960)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.i"}
{"Time":"2021-06-09T22:49:40.886981335Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"o/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc019bdfec0, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc019bdff00, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trac"}
{"Time":"2021-06-09T22:49:40.886998872Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"g/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc019bdff80, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc018faa6c0, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc019bdffc0, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/usr/local/"}
{"Time":"2021-06-09T22:49:40.887007295Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc018faa720, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc026262000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc018f44f60, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026245f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc0257898b8, 0xc02"}
{"Time":"2021-06-09T22:49:40.887016685Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"6245e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc011e68000, 0x7fa4ec3866d8, 0xc0257898b8, 0xc026245e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc026254540, 0xc01a5f8cf0, 0x552efd8, 0xc0257898b8, 0xc026245e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\n\u0026{bob 2 [system:authenticated] map[]} is acting as \u0026{alice  [system:authenticated] map[]}\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVe"}
{"Time":"2021-06-09T22:49:41.291737779Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"apiserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0269d64b0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0269c48a0, 0xc019f25800, 0xbb, 0x37e4, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc024444ef8, 0x4c88c60, 0xc0269aeb40, 0x81285a, 0x4d2c236)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0006be280, 0x54da978, 0xc0269aeb40, 0x54cd9a0, 0xc0269c48a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/loc"}
{"Time":"2021-06-09T22:49:41.291750218Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"al/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0006be280, 0x54da978, 0xc0269aeb40, 0x54cd9a0, 0xc0269c48a0, 0x3dc5e32, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0269aebe0, 0x54da978, 0xc0269aeb40, 0x54cd9a0, 0xc0269c48a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0269aebe0, 0x54da978, 0xc0269aeb40, 0x54cd9a0, 0xc0269c48a0, 0xc00018aa80, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/ve"}
{"Time":"2021-06-09T22:49:41.291758446Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"ndor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4d4a11c, 0x10, 0x7fa4ec467e38, 0xc0269aebe0, 0x5529988, 0xc02538bcf8, 0xc0269cb200, 0x1f4, 0x54da978, 0xc0269aeb40)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x552c3b8, 0xc011e688c0, 0x552c5c8, 0x78bb2c0, 0x0, 0x0, 0x4d2c236, 0x2, 0x5529988, 0xc02538bcf8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x54c6ec0, 0xc02698ba20, 0x552c3b8, 0xc011e688c0, 0x0, 0x0, 0x4d2c236, 0x2, 0x5529988, 0xc025"}
{"Time":"2021-06-09T22:49:41.291769584Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"38bcf8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x5529988, 0xc02538bcf8, 0xc0269cb200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:97 +0x1c5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc0269d6420, 0xc0269952d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1192 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0"}
{"Time":"2021-06-09T22:49:41.291789325Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"269d6420, 0xc0269952d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0179b9d40, 0x7fa4ec3866d8, 0xc02538bce0, 0xc0269cb200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d439c4, 0xe, 0xc0179b9d40, 0xc01871dab0, 0x7fa4ec3866d8, 0xc02538bce0, 0xc0269cb200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterl"}
{"Time":"2021-06-09T22:49:41.291808502Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"iserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc019bdfe80, 0x7fa4ec3866d8, 0xc02538bce0, 0xc0269cb200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc02538bce0, 0xc0269cb200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc018faa660, 0x7fa4ec3866d8, 0xc02538bce0, 0xc0269cb200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuber"}
... skipping 10 lines ...
{"Time":"2021-06-09T22:50:06.750874679Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":", 0xc039fec700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc033d5d680, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d439c4, 0xe, 0xc033d5d680, 0xc033d1ee00, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.t"}
{"Time":"2021-06-09T22:50:06.750894143Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6e040, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6c090, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_o"}
{"Time":"2021-06-09T22:50:06.75090502Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"utput/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc039ffc0c0, 0xc034ef6cb7, 0xc039fec690)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc039ffc0c0, 0xc039fec690, 0xc039fd3a40)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc033d50100, 0x552e398, 0xc039f72570, 0xc039ff8210, 0x552ee18, 0xc039e75f80, 0x1, 0xc039fd38e0, 0xc039fd38f0, 0xc039fec5b0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubern"}
{"Time":"2021-06-09T22:50:06.750930326Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"etes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6e080, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6e0c0, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackComplet"}
{"Time":"2021-06-09T22:50:06.750955467Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"nts/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6e140, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6c0f0, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6e180, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/n"}
{"Time":"2021-06-09T22:50:06.750966048Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"et/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6c150, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc033d4d2c0, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7800)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7700)"}
{"Time":"2021-06-09T22:50:06.750974764Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc033d6e1c0, 0x7fa4ec3866d8, 0xc0391a3830, 0xc039ff7700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc039fe6f60, 0xc02cc8d1b8, 0x552efd8, 0xc0391a3830, 0xc039ff7700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"couldn'"}
{"Time":"2021-06-09T22:50:13.062856686Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"r/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc011726a80, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0182660c0, 0xc005544000, 0xbb, 0x4bc5, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc00d84aef8, 0x4c88c60, 0xc00ce33cc0, 0x81285a, 0x4d2c236)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0006be280, 0x54da978, 0xc00ce33cc0, 0x54cd9a0, 0xc0182660c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/sr"}
{"Time":"2021-06-09T22:50:13.06286802Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"c/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0006be280, 0x54da978, 0xc00ce33cc0, 0x54cd9a0, 0xc0182660c0, 0x3dc5e32, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc00ce33d60, 0x54da978, 0xc00ce33cc0, 0x54cd9a0, 0xc0182660c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc00ce33d60, 0x54da978, 0xc00ce33cc0, 0x54cd9a0, 0xc0182660c0, 0xc00018aa80, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s"}
{"Time":"2021-06-09T22:50:13.062876414Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":".io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4d4a11c, 0x10, 0x7fa4ec467e38, 0xc00ce33d60, 0x5529988, 0xc02ba88490, 0xc02b0c6500, 0x1f4, 0x54da978, 0xc00ce33cc0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x552c3b8, 0xc03a6c3a00, 0x552c5c8, 0x78bb2c0, 0x0, 0x0, 0x4d2c236, 0x2, 0x5529988, 0xc02ba88490, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x54c6ec0, 0xc00d542790, 0x552c3b8, 0xc03a6c3a00, 0x0, 0x0, 0x4d2c236, 0x2, 0x5529988, 0xc02ba88490, "}
{"Time":"2021-06-09T22:50:13.062884389Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x5529988, 0xc02ba88490, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:97 +0x1c5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc0117269f0, 0xc006394a10)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1192 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0117269f0"}
{"Time":"2021-06-09T22:50:13.062891832Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":", 0xc006394a10)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc03a6e61b0, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d439c4, 0xe, 0xc03a6e61b0, 0xc03a6b8700, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.t"}
{"Time":"2021-06-09T22:50:13.063084883Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6c3080, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6e4180, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_o"}
{"Time":"2021-06-09T22:50:13.0631041Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"utput/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc011c4ef00, 0xc00d84ccb7, 0xc0063949a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc011c4ef00, 0xc0063949a0, 0xc00d542670)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc03a507a00, 0x552e398, 0xc0117265a0, 0xc018d9eb00, 0x552ee18, 0xc00c3c7cc0, 0x1, 0xc00d542510, 0xc00d542520, 0xc0063948c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubern"}
{"Time":"2021-06-09T22:50:13.063200966Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"etes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6c30c0, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6c3100, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackComplet"}
{"Time":"2021-06-09T22:50:13.063222628Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"nts/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6c3180, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6e41e0, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6c31c0, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/n"}
{"Time":"2021-06-09T22:50:13.063235833Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"et/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6e4240, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc03a61faa0, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6300)"}
{"Time":"2021-06-09T22:50:13.063251633Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc03a6c3200, 0x7fa4ec3866d8, 0xc02ba88468, 0xc02b0c6300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc01992a060, 0xc03a4a9560, 0x552efd8, 0xc02ba88468, 0xc02b0c6300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"couldn'"}
{"Time":"2021-06-09T22:50:26.103146411Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/configmap","Output":"ok  \tk8s.io/kubernetes/test/integration/configmap\t8.094s\n"}
{"Time":"2021-06-09T22:50:39.207381119Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"utput/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0202d4f30, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc023caec00, 0xc034728000, 0x7d, 0x37ee8e, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0015b2b38, 0x4c9c1c0, 0xc019a985a0, 0x60ffda, 0x4d3f79e)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc00052a1e0, 0x54f5a78, 0xc019a985a0, 0x54e8b20, 0xc023caec00, 0x0, 0x"}
{"Time":"2021-06-09T22:50:39.207390329Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc00052a1e0, 0x54f5a78, 0xc019a985a0, 0x54e8b20, 0xc023caec00, 0x3ddadfa, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc019a98640, 0x54f5a78, 0xc019a985a0, 0x54e8b20, 0xc023caec00, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc019a98640, 0x54f5a78, 0xc019a985a0, 0x54e8b20, 0xc023caec00, 0xc000187500, 0x3)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-06-09T22:50:39.207399481Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4d5d922, 0x10, 0x7f699001f248, 0xc019a98640, 0x5544ac8, 0xc048b55680, 0xc03ce88b00, 0x1f4, 0x54f5a78, 0xc019a985a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x55474f8, 0xc045cc6300, 0x5547708, 0x78d2120, 0x0, 0x0, 0x4d3f79e, 0x2, 0x5544ac8, 0xc048b55680, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x54ec120, 0xc0194d24f8, 0x55474f8, 0xc0"}
{"Time":"2021-06-09T22:50:39.207419284Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"kg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0202d4d20, 0xc0002644d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc045ba4cf0, 0x7f697920f8e0, 0xc048b55668, 0xc03ce88b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d57115, 0xe, 0xc045ba4cf0, 0xc044e94e70, 0x7f697920f8e0, 0xc048b55668, 0xc03ce88b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kuber"}
{"Time":"2021-06-09T22:50:39.207456358Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"put/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc045ab9940, 0x7f697920f8e0, 0xc048b55668, 0xc03ce88b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f697920f8e0, 0xc048b55668, 0xc03ce88b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc045bb43c0, 0x7f697920f8e0, 0xc048b55668, 0xc03ce88b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configControlle"}
... skipping 200 lines ...
{"Time":"2021-06-09T22:58:05.660445714Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"72c3620, 0xc004540bd0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterla"}
{"Time":"2021-06-09T22:58:05.660463703Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"server/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubern"}
{"Time":"2021-06-09T22:58:05.66048152Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"etes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc00647bc80, 0xc00663ccb7, 0xc004540b60)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc00647bc80, 0xc004540b60, 0xc007a65bf0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc0072c3500, 0xc006b14a50, 0x5229040, 0xc0060bf400, 0x1, 0xc007a65aa0, 0xc007a65ab0, 0xc004540af0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io"}
{"Time":"2021-06-09T22:58:05.660490103Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.track"}
{"Time":"2021-06-09T22:58:05.660509414Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/g"}
{"Time":"2021-06-09T22:58:05.660519887Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"o/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc004f21b18, 0xc007e92c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc004f21b18, 0xc007e92b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc004f21b18, 0xc007"}
{"Time":"2021-06-09T22:58:05.660528573Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"e92a00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc004f21b18, 0xc007e92a00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc006429980, 0xc0006b1710, 0x5229158, 0xc004f21b18, 0xc007e92a00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\""}
{"Time":"2021-06-09T22:58:05.660569182Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00723ec90, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004b36e40, 0xc005cd1600, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc00592ae60, 0x49d8be0, 0xc0062b39a0, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0062b39a0, 0x51cbde0, 0xc004b36e40, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2021-06-09T22:58:05.660613227Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0062b39a0, 0x51cbde0, 0xc004b36e40, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0062b3a40, 0x51d8078, 0xc0062b39a0, 0x51cbde0, 0xc004b36e40, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0062b3a40, 0x51d8078, 0xc0062b39a0, 0x51cbde0, 0xc004b36e40, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vend"}
{"Time":"2021-06-09T22:58:05.66062325Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"or/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0062b3a40, 0x5224238, 0xc00544bab8, 0xc008018900, 0x1f4, 0x51d8078, 0xc0062b39a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00544bab8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0062b3900, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00544"}
{"Time":"2021-06-09T22:58:05.660633886Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"bab8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc00544bab8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00723ec00, 0xc0045cd110)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc007"}
{"Time":"2021-06-09T22:58:05.66064313Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"23ec00, 0xc0045cd110)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlat"}
{"Time":"2021-06-09T22:58:05.660667494Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"erver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuberne"}
{"Time":"2021-06-09T22:58:05.660676526Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"tes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc00635fbc0, 0xc00592ccb7, 0xc0045cd0a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc00635fbc0, 0xc0045cd0a0, 0xc007f0e7f0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc0072e5e60, 0xc006f488f0, 0x5229040, 0xc006200e00, 0x1, 0xc006f471a0, 0xc006f471b0, 0xc004582f50)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/"}
{"Time":"2021-06-09T22:58:05.660686453Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackC"}
{"Time":"2021-06-09T22:58:05.660702347Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go"}
{"Time":"2021-06-09T22:58:05.660720084Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc00800e0f8, 0xc008018900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc00800e0f8, 0xc008018800)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e0f8, 0xc0080"}
{"Time":"2021-06-09T22:58:05.660728617Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"18700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc00800e0f8, 0xc008018700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00646faa0, 0xc0006b1710, 0x5229158, 0xc00800e0f8, 0xc008018700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"I"}
{"Time":"2021-06-09T22:58:05.66084167Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"iserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc007347470, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004b36c00, 0xc005cd1600, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0056aae60, 0x49d8be0, 0xc0062b3720, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0062b3720, 0x51cbde0, 0xc004b36c00, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/"}
{"Time":"2021-06-09T22:58:05.66085073Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0062b3720, 0x51cbde0, 0xc004b36c00, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0062b37c0, 0x51d8078, 0xc0062b3720, 0x51cbde0, 0xc004b36c00, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0062b37c0, 0x51d8078, 0xc0062b3720, 0x51cbde0, 0xc004b36c00, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendo"}
{"Time":"2021-06-09T22:58:05.660859347Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"r/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0062b37c0, 0x5224238, 0xc004f21bf8, 0xc007e93e00, 0x1f4, 0x51d8078, 0xc0062b3720)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f21bf8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0062b3680, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f21"}
{"Time":"2021-06-09T22:58:05.660875488Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"bf8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc004f21bf8, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc0073473e0, 0xc004541180)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0073"}
{"Time":"2021-06-09T22:58:05.660883563Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"473e0, 0xc004541180)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlate"}
{"Time":"2021-06-09T22:58:05.660899385Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"rver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernet"}
{"Time":"2021-06-09T22:58:05.66090752Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"es/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc007ea2000, 0xc0056accb7, 0xc004541110)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc007ea2000, 0xc004541110, 0xc007e9a400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc0073472c0, 0xc006b14e70, 0x5229040, 0xc0060bfa00, 0x1, 0xc007e9a2b0, 0xc007e9a2c0, 0xc004541030)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/k"}
{"Time":"2021-06-09T22:58:05.660930133Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCo"}
{"Time":"2021-06-09T22:58:05.660957793Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ndpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/"}
{"Time":"2021-06-09T22:58:05.660969388Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc004f21be0, 0xc007e93e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc004f21be0, 0xc007e93d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc004f21be0, 0xc007e9"}
{"Time":"2021-06-09T22:58:05.660979846Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"3c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc004f21be0, 0xc007e93c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc006429f20, 0xc0006b1710, 0x5229158, 0xc004f21be0, 0xc007e93c00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"In"}
{"Time":"2021-06-09T22:58:05.661494453Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00723fdd0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004db8f00, 0xc00017c800, 0xfb, 0x160b, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc00597ae60, 0x49d8be0, 0xc00667f180, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc00667f180, 0x51cbde0, 0xc004db8f00, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/loca"}
{"Time":"2021-06-09T22:58:05.661525483Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"l/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc00667f180, 0x51cbde0, 0xc004db8f00, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc00667f220, 0x51d8078, 0xc00667f180, 0x51cbde0, 0xc004db8f00, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc00667f220, 0x51d8078, 0xc00667f180, 0x51cbde0, 0xc004db8f00, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/ven"}
{"Time":"2021-06-09T22:58:05.661533901Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"dor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc00667f220, 0x5224238, 0xc00544bb08, 0xc008019200, 0x1f4, 0x51d8078, 0xc00667f180)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00544bb08, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc00667f0e0, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc0054"}
{"Time":"2021-06-09T22:58:05.661544151Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"4bb08, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc00544bb08, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00723fd40, 0xc0045cd340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00"}
{"Time":"2021-06-09T22:58:05.66155188Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"723fd40, 0xc0045cd340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterla"}
{"Time":"2021-06-09T22:58:05.661566677Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"server/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubern"}
{"Time":"2021-06-09T22:58:05.66157797Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"etes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc00635fd40, 0xc00597ccb7, 0xc0045cd2d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc00635fd40, 0xc0045cd2d0, 0xc007f0eb10)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc0073541e0, 0xc006f489a0, 0x5229040, 0xc006200f40, 0x1, 0xc006f47290, 0xc006f472a0, 0xc004583030)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io"}
{"Time":"2021-06-09T22:58:05.661585662Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.track"}
{"Time":"2021-06-09T22:58:05.661600322Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/g"}
{"Time":"2021-06-09T22:58:05.661611102Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"o/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc00800e108, 0xc008019200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc00800e108, 0xc008019100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e108, 0xc008"}
{"Time":"2021-06-09T22:58:05.661618863Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"019000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc00800e108, 0xc008019000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00646fc20, 0xc0006b1710, 0x5229158, 0xc00800e108, 0xc008019000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\""}
{"Time":"2021-06-09T22:58:05.661650658Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00724f7d0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004db91a0, 0xc00017c800, 0xfb, 0x160b, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc007f40e60, 0x49d8be0, 0xc00667f400, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc00667f400, 0x51cbde0, 0xc004db91a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/loca"}
{"Time":"2021-06-09T22:58:05.661660283Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"l/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc00667f400, 0x51cbde0, 0xc004db91a0, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc00667f4a0, 0x51d8078, 0xc00667f400, 0x51cbde0, 0xc004db91a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc00667f4a0, 0x51d8078, 0xc00667f400, 0x51cbde0, 0xc004db91a0, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/ven"}
{"Time":"2021-06-09T22:58:05.661667669Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"dor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc00667f4a0, 0x5224238, 0xc004f21c98, 0xc007f92700, 0x1f4, 0x51d8078, 0xc00667f400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f21c98, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc00667f360, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f"}
{"Time":"2021-06-09T22:58:05.66171023Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"21c98, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc004f21c98, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00724f740, 0xc004541570)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00"}
{"Time":"2021-06-09T22:58:05.661718996Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"724f740, 0xc004541570)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterla"}
{"Time":"2021-06-09T22:58:05.661738392Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"server/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubern"}
{"Time":"2021-06-09T22:58:05.661749454Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"etes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006613380, 0xc007f42cb7, 0xc004583180)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006613380, 0xc004583180, 0xc006f47970)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc007164fc0, 0xc006a676b0, 0x5229040, 0xc0051ae900, 0x1, 0xc006a37ce0, 0xc006a37cf0, 0xc00452fb20)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io"}
{"Time":"2021-06-09T22:58:05.66175799Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.track"}
{"Time":"2021-06-09T22:58:05.66177675Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/g"}
{"Time":"2021-06-09T22:58:05.66178712Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"o/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc006a24600, 0xc007f92700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc006a24600, 0xc007f92600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006a24600, 0xc007"}
{"Time":"2021-06-09T22:58:05.661794537Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"f92500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc006a24600, 0xc007f92500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc006de1ce0, 0xc0006b1710, 0x5229158, 0xc006a24600, 0xc007f92500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\""}
{"Time":"2021-06-09T22:58:05.661830844Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc007524f90, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004b36840, 0xc005cd1600, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0073e0e60, 0x49d8be0, 0xc0062b30e0, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0062b30e0, 0x51cbde0, 0xc004b36840, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2021-06-09T22:58:05.661839367Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0062b30e0, 0x51cbde0, 0xc004b36840, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0062b3180, 0x51d8078, 0xc0062b30e0, 0x51cbde0, 0xc004b36840, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0062b3180, 0x51d8078, 0xc0062b30e0, 0x51cbde0, 0xc004b36840, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vend"}
{"Time":"2021-06-09T22:58:05.661847408Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"or/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0062b3180, 0x5224238, 0xc007cd7390, 0xc007e5e600, 0x1f4, 0x51d8078, 0xc0062b30e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc007cd7390, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0062b3040, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc007cd"}
{"Time":"2021-06-09T22:58:05.661857546Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"7390, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc007cd7390, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc007524f00, 0xc0045b7490)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc007"}
{"Time":"2021-06-09T22:58:05.661864719Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"524f00, 0xc0045b7490)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlat"}
{"Time":"2021-06-09T22:58:05.661879179Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"erver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuberne"}
{"Time":"2021-06-09T22:58:05.661889252Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"tes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc00672e480, 0xc0073e2cb7, 0xc0045b7420)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc00672e480, 0xc0045b7420, 0xc007e44a20)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc007524de0, 0xc007e5a2c0, 0x5229040, 0xc006181980, 0x1, 0xc007e448d0, 0xc007e448e0, 0xc0045b7340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/"}
{"Time":"2021-06-09T22:58:05.661897761Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackC"}
{"Time":"2021-06-09T22:58:05.662304396Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go"}
{"Time":"2021-06-09T22:58:05.662337503Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc007cd7378, 0xc007e5e600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc007cd7378, 0xc007e5e500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc007cd7378, 0xc007e"}
{"Time":"2021-06-09T22:58:05.662347295Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"5e400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc007cd7378, 0xc007e5e400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0073d0120, 0xc0006b1710, 0x5229158, 0xc007cd7378, 0xc007e5e400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"I"}
{"Time":"2021-06-09T22:58:05.662389097Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00721d020, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0049e9da0, 0xc0047d2000, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0057a4e60, 0x49d8be0, 0xc0066db540, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0066db540, 0x51cbde0, 0xc0049e9da0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2021-06-09T22:58:05.662403152Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0066db540, 0x51cbde0, 0xc0049e9da0, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0066db9a0, 0x51d8078, 0xc0066db540, 0x51cbde0, 0xc0049e9da0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0066db9a0, 0x51d8078, 0xc0066db540, 0x51cbde0, 0xc0049e9da0, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vend"}
{"Time":"2021-06-09T22:58:05.662411287Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"or/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0066db9a0, 0x5224238, 0xc00800e1a0, 0xc007f93000, 0x1f4, 0x51d8078, 0xc0066db540)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00800e1a0, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0066db4a0, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00800"}
{"Time":"2021-06-09T22:58:05.66242038Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"e1a0, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc00800e1a0, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00721cf90, 0xc004583340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc007"}
{"Time":"2021-06-09T22:58:05.662434433Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"21cf90, 0xc004583340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlat"}
{"Time":"2021-06-09T22:58:05.662451708Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"erver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuberne"}
{"Time":"2021-06-09T22:58:05.662501215Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"tes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006613500, 0xc0057a6cb7, 0xc0045832d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006613500, 0xc0045832d0, 0xc006f47c90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc0071652f0, 0xc006a67760, 0x5229040, 0xc0051aea40, 0x1, 0xc006a37dd0, 0xc006a37de0, 0xc00452fb90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/"}
{"Time":"2021-06-09T22:58:05.662509451Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackC"}
{"Time":"2021-06-09T22:58:05.662524491Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go"}
{"Time":"2021-06-09T22:58:05.662533087Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc006a24610, 0xc007f93000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc006a24610, 0xc007f92f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006a24610, 0xc007f"}
{"Time":"2021-06-09T22:58:05.662540498Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"92e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc006a24610, 0xc007f92e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc006de1e60, 0xc0006b1710, 0x5229158, 0xc006a24610, 0xc007f92e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"I"}
{"Time":"2021-06-09T22:58:05.662574333Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0072dbe30, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004b36660, 0xc005cd1600, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0057a0e60, 0x49d8be0, 0xc0062b2a00, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0062b2a00, 0x51cbde0, 0xc004b36660, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2021-06-09T22:58:05.662582368Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0062b2a00, 0x51cbde0, 0xc004b36660, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0062b2aa0, 0x51d8078, 0xc0062b2a00, 0x51cbde0, 0xc004b36660, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0062b2aa0, 0x51d8078, 0xc0062b2a00, 0x51cbde0, 0xc004b36660, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vend"}
{"Time":"2021-06-09T22:58:05.662589863Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"or/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0062b2aa0, 0x5224238, 0xc00800e010, 0xc00800ac00, 0x1f4, 0x51d8078, 0xc0062b2a00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00800e010, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0062b2960, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc00800"}
{"Time":"2021-06-09T22:58:05.662599467Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"e010, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc00800e010, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc0072dbd70, 0xc004582930)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc007"}
{"Time":"2021-06-09T22:58:05.662607463Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"2dbd70, 0xc004582930)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlat"}
{"Time":"2021-06-09T22:58:05.662627668Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"erver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuberne"}
{"Time":"2021-06-09T22:58:05.662636371Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"tes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006613080, 0xc0057a2cb7, 0xc0045828c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006613080, 0xc0045828c0, 0xc006f46980)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc0072dbbf0, 0xc006f482c0, 0x5229040, 0xc006200540, 0x1, 0xc006f46830, 0xc006f46840, 0xc004582770)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/"}
{"Time":"2021-06-09T22:58:05.662643833Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackC"}
{"Time":"2021-06-09T22:58:05.662662491Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go"}
{"Time":"2021-06-09T22:58:05.66267362Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc006191ff0, 0xc00800ac00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc006191ff0, 0xc00800ab00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc006191ff0, 0xc0080"}
{"Time":"2021-06-09T22:58:05.662681584Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0aa00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc006191ff0, 0xc00800aa00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00646f0e0, 0xc0006b1710, 0x5229158, 0xc006191ff0, 0xc00800aa00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"I"}
{"Time":"2021-06-09T22:58:05.66849666Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"iserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00724e750, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004b36a20, 0xc005cd1600, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc007eaae60, 0x49d8be0, 0xc0062b3360, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0062b3360, 0x51cbde0, 0xc004b36a20, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/"}
{"Time":"2021-06-09T22:58:05.668504Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0062b3360, 0x51cbde0, 0xc004b36a20, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0062b3400, 0x51d8078, 0xc0062b3360, 0x51cbde0, 0xc004b36a20, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0062b3400, 0x51d8078, 0xc0062b3360, 0x51cbde0, 0xc004b36a20, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendo"}
{"Time":"2021-06-09T22:58:05.668512871Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"r/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0062b3400, 0x5224238, 0xc004f21c48, 0xc008019b00, 0x1f4, 0x51d8078, 0xc0062b3360)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f21c48, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0062b32c0, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f21"}
{"Time":"2021-06-09T22:58:05.668532279Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"c48, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc004f21c48, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00724e6c0, 0xc004541420)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0072"}
{"Time":"2021-06-09T22:58:05.668541135Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"4e6c0, 0xc004541420)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlate"}
{"Time":"2021-06-09T22:58:05.668561548Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"rver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernet"}
{"Time":"2021-06-09T22:58:05.668572526Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"es/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc007ea2180, 0xc007eaccb7, 0xc004541340)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc007ea2180, 0xc004541340, 0xc007e9a720)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc007354870, 0xc006f48a50, 0x5229040, 0xc006201080, 0x1, 0xc006f47380, 0xc006f47390, 0xc0045830a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/k"}
{"Time":"2021-06-09T22:58:05.668582482Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCo"}
{"Time":"2021-06-09T22:58:05.668604482Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ndpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/"}
{"Time":"2021-06-09T22:58:05.668614612Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc00800e118, 0xc008019b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc00800e118, 0xc008019a00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e118, 0xc00801"}
{"Time":"2021-06-09T22:58:05.668623836Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"9900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc00800e118, 0xc008019900)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00646fda0, 0xc0006b1710, 0x5229158, 0xc00800e118, 0xc008019900)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"In"}
{"Time":"2021-06-09T22:58:05.669547163Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00732d920, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004db89c0, 0xc00017c800, 0xfb, 0x160b, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0070dee60, 0x49d8be0, 0xc00667ec80, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc00667ec80, 0x51cbde0, 0xc004db89c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/loca"}
{"Time":"2021-06-09T22:58:05.669564306Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"l/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc00667ec80, 0x51cbde0, 0xc004db89c0, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc00667ed20, 0x51d8078, 0xc00667ec80, 0x51cbde0, 0xc004db89c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc00667ed20, 0x51d8078, 0xc00667ec80, 0x51cbde0, 0xc004db89c0, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/ven"}
{"Time":"2021-06-09T22:58:05.669573585Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"dor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc00667ed20, 0x5224238, 0xc004f21b88, 0xc007c71d00, 0x1f4, 0x51d8078, 0xc00667ec80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f21b88, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc00667ebe0, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc004f"}
{"Time":"2021-06-09T22:58:05.670410491Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"21b88, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc004f21b88, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00732d7d0, 0xc004540e70)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00"}
{"Time":"2021-06-09T22:58:05.670443702Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"732d7d0, 0xc004540e70)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterla"}
{"Time":"2021-06-09T22:58:05.670466143Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"server/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubern"}
{"Time":"2021-06-09T22:58:05.670483626Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"etes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc00647be00, 0xc0070e0cb7, 0xc004540d90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc00647be00, 0xc004540d90, 0xc007a65f10)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc007524ae0, 0xc007e5a210, 0x5229040, 0xc006181840, 0x1, 0xc007e447e0, 0xc007e447f0, 0xc0045b72d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io"}
{"Time":"2021-06-09T22:58:05.670492375Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.track"}
{"Time":"2021-06-09T22:58:05.670509771Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/g"}
{"Time":"2021-06-09T22:58:05.670518694Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"o/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc007cd7368, 0xc007c71d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc007cd7368, 0xc007c71c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc007cd7368, 0xc007"}
{"Time":"2021-06-09T22:58:05.670530508Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"c71b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc007cd7368, 0xc007c71b00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc006ed9ec0, 0xc0006b1710, 0x5229158, 0xc007cd7368, 0xc007c71b00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\""}
{"Time":"2021-06-09T22:58:05.670854736Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00726e750, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004b371a0, 0xc005cd1600, 0xfb, 0x565, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0073e4e60, 0x49d8be0, 0xc0062b3c20, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc0062b3c20, 0x51cbde0, 0xc004b371a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2021-06-09T22:58:05.670865753Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc0062b3c20, 0x51cbde0, 0xc004b371a0, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0062b3cc0, 0x51d8078, 0xc0062b3c20, 0x51cbde0, 0xc004b371a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0062b3cc0, 0x51d8078, 0xc0062b3c20, 0x51cbde0, 0xc004b371a0, 0xc000383680, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vend"}
{"Time":"2021-06-09T22:58:05.67087695Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"or/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a97b11, 0x10, 0x7f2092f8c880, 0xc0062b3cc0, 0x5224238, 0xc007e040b8, 0xc00802c700, 0x1f4, 0x51d8078, 0xc0062b3c20)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5226878, 0xc000adbb40, 0x5226a58, 0x746bf60, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc007e040b8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x51cb9c0, 0xc0062b3b80, 0x5226878, 0xc000adbb40, 0x0, 0x0, 0x4a7aa12, 0x2, 0x5224238, 0xc007e0"}
{"Time":"2021-06-09T22:58:05.670887114Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"40b8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:111\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x5224238, 0xc007e040b8, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:191 +0x1cc5\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00726e6c0, 0xc0045f07e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1186 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc007"}
{"Time":"2021-06-09T22:58:05.670898544Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"26e6c0, 0xc0045f07e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:483 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc000ae55f0, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a9174d, 0xe, 0xc000ae55f0, 0xc000aab880, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlat"}
{"Time":"2021-06-09T22:58:05.670929393Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"erver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb1c0, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8f60, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:161 +0x25a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuberne"}
{"Time":"2021-06-09T22:58:05.670942709Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"tes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006f9c540, 0xc0073e6cb7, 0xc0045f0770)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:339 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006f9c540, 0xc0045f0770, 0xc007adf090)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:340 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e6e00, 0x52286d8, 0xc00721cd20, 0xc006f48b00, 0x5229040, 0xc006201340, 0x1, 0xc006f478a0, 0xc006f478b0, 0xc004583110)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/"}
{"Time":"2021-06-09T22:58:05.670956455Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x974\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:169 +0x504\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb200, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb240, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackC"}
{"Time":"2021-06-09T22:58:05.670975954Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb2c0, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae8fc0, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:79 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb300, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go"}
{"Time":"2021-06-09T22:58:05.670993316Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:103 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc000ae9020, 0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7f209271e790, 0xc00800e128, 0xc00802c700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00020cfc0, 0x7f209271e790, 0xc00800e128, 0xc00802c600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7f209271e790, 0xc00800e128, 0xc0080"}
{"Time":"2021-06-09T22:58:05.671003842Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"2c500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:88 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000adb340, 0x7f209271e790, 0xc00800e128, 0xc00802c500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc00646ff20, 0xc0006b1710, 0x5229158, 0xc00800e128, 0xc00802c500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:108 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:94 +0x1fa\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"I"}
{"Time":"2021-06-09T22:58:05.671298785Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"piserver/pkg/server/filters/timeout.go:227 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00752da10, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:592 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004db8720, 0xc00017c800, 0xfb, 0x160b, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc006480e60, 0x49d8be0, 0xc00667ea00, 0xb062ba, 0x4a7aa12)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000112be0, 0x51d8078, 0xc00667ea00, 0x51cbde0, 0xc004db8720, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/loca"}
{"Time":"2021-06-09T22:58:05.67130793Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"l/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000112be0, 0x51d8078, 0xc00667ea00, 0x51cbde0, 0xc004db8720, 0x3b71001, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc00667eaa0, 0x51d8078, 0xc00667ea00, 0x51cbde0, 0xc004db8720, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimac