Result: FAILURE
Tests: 1 failed / 3332 succeeded
Started: 2021-04-07 14:19
Elapsed: 36m11s
Revision: master

Test Failures


k8s.io/kubernetes/test/integration/apiserver/admissionwebhook TestMutatingWebhookResetsInvalidManagedFields 9.80s

go test -v k8s.io/kubernetes/test/integration/apiserver/admissionwebhook -run TestMutatingWebhookResetsInvalidManagedFields$
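To reproduce the failure locally, the command above can be run from a Kubernetes checkout. A minimal sketch follows; it assumes the standard main-repo layout, where integration tests require a local etcd on 127.0.0.1:2379 (the `hack/install-etcd.sh` path is the repo's usual helper, assumed here). Note the trailing `$`: `-run` takes a regex, and the anchor prevents other tests sharing the `TestMutatingWebhookResetsInvalidManagedFields` prefix from running.

```shell
# From the root of a k8s.io/kubernetes checkout.
# Integration tests talk to a real etcd; install a pinned copy and put it on PATH
# (hack/install-etcd.sh and third_party/etcd are the repo's conventional locations).
./hack/install-etcd.sh
export PATH="$(pwd)/third_party/etcd:${PATH}"

# Run only the failing test; the trailing $ anchors the -run regex.
go test -v k8s.io/kubernetes/test/integration/apiserver/admissionwebhook \
  -run 'TestMutatingWebhookResetsInvalidManagedFields$'
```

Without the anchor, `-run TestMutatingWebhookResetsInvalidManagedFields` would also match any test whose name merely begins with that string.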
=== RUN   TestMutatingWebhookResetsInvalidManagedFields
I0407 14:42:12.743650  125810 controller.go:181] Shutting down kubernetes service endpoint reconciler
I0407 14:42:12.743786  125810 controller.go:89] Shutting down OpenAPI AggregationController
I0407 14:42:12.743859  125810 dynamic_cafile_content.go:182] Shutting down request-header::/tmp/kubernetes-kube-apiserver872517318/proxy-ca.crt
I0407 14:42:12.743871  125810 dynamic_cafile_content.go:182] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver872517318/client-ca.crt
I0407 14:42:12.743942  125810 dynamic_cafile_content.go:182] Shutting down request-header::/tmp/kubernetes-kube-apiserver872517318/proxy-ca.crt
I0407 14:42:12.744043  125810 secure_serving.go:241] Stopped listening on 127.0.0.1:41033
I0407 14:42:12.744052  125810 tlsconfig.go:255] Shutting down DynamicServingCertificateController
I0407 14:42:12.744066  125810 dynamic_serving_content.go:145] Shutting down serving-cert::/tmp/kubernetes-kube-apiserver872517318/apiserver.crt::/tmp/kubernetes-kube-apiserver872517318/apiserver.key
I0407 14:42:12.744782  125810 customresource_discovery_controller.go:245] Shutting down DiscoveryController
I0407 14:42:12.744875  125810 controller.go:123] Shutting down OpenAPI controller
I0407 14:42:12.744889  125810 apiservice_controller.go:131] Shutting down APIServiceRegistrationController
I0407 14:42:12.744902  125810 nonstructuralschema_controller.go:204] Shutting down NonStructuralSchemaConditionController
I0407 14:42:12.744918  125810 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I0407 14:42:12.744933  125810 establishing_controller.go:87] Shutting down EstablishingController
I0407 14:42:12.744944  125810 naming_controller.go:302] Shutting down NamingConditionController
I0407 14:42:12.744956  125810 available_controller.go:487] Shutting down AvailableConditionController
I0407 14:42:12.744969  125810 apiapproval_controller.go:198] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
I0407 14:42:12.744983  125810 crd_finalizer.go:278] Shutting down CRDFinalizer
I0407 14:42:12.744997  125810 apf_controller.go:303] Shutting down API Priority and Fairness config worker
I0407 14:42:12.745253  125810 autoregister_controller.go:165] Shutting down autoregister controller
I0407 14:42:12.745272  125810 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I0407 14:42:12.760675  125810 dynamic_cafile_content.go:182] Shutting down client-ca-bundle::/tmp/kubernetes-kube-apiserver872517318/client-ca.crt
E0407 14:42:12.765417  125810 controller.go:184] Get "https://127.0.0.1:41033/api/v1/namespaces/default/endpoints/kubernetes": dial tcp 127.0.0.1:41033: connect: connection refused
    testserver.go:355: Resolved testserver package path to: "/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/cmd/kube-apiserver/app/testing"
I0407 14:42:13.180544  125810 serving.go:341] Generated self-signed cert (/tmp/kubernetes-kube-apiserver725875565/apiserver.crt, /tmp/kubernetes-kube-apiserver725875565/apiserver.key)
I0407 14:42:13.180671  125810 server.go:629] external host was not specified, using 127.0.0.1
W0407 14:42:13.180760  125810 authentication.go:507] AnonymousAuth is not allowed with the AlwaysAllow authorizer. Resetting AnonymousAuth to false. You should use a different authorizer
    testserver.go:190: runtime-config=map[api/all:true]
    testserver.go:191: Starting kube-apiserver on port 41507...
W0407 14:42:15.109358  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.109386  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.109398  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.109561  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110312  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110346  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110367  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110394  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110435  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110482  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110734  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110884  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.110947  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:42:15.110965  125810 plugins.go:158] Loaded 10 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I0407 14:42:15.110975  125810 plugins.go:161] Loaded 9 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
W0407 14:42:15.111050  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:42:15.111073  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:42:15.112428  125810 plugins.go:158] Loaded 10 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I0407 14:42:15.112448  125810 plugins.go:161] Loaded 9 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
I0407 14:42:15.114020  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.114062  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.114855  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.114893  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0407 14:42:15.153179  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:42:15.154667  125810 instance.go:283] Using reconciler: lease
I0407 14:42:15.154927  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.155090  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.158259  125810 instance.go:387] Could not construct pre-rendered responses for ServiceAccountIssuerDiscovery endpoints. Endpoints will not be enabled. Error: empty issuer URL
I0407 14:42:15.158750  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.158794  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.159909  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.159943  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.161283  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.161336  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.162389  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.162419  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.163234  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.163264  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.164601  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.164629  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.166507  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.166530  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.167670  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.167701  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.169332  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.169361  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.170186  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.170218  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.171149  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.171179  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.173978  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.174012  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.174818  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.174843  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.175801  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.175834  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.176450  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.176477  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.177239  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.177268  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.178102  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.178138  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.179080  125810 rest.go:130] the default service ipfamily for this cluster is: IPv4
I0407 14:42:15.293321  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.293371  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.294814  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.294864  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.296023  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.296048  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.297395  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.297435  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.298427  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.298458  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.299472  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.299510  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.300885  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.300921  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.302102  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.302140  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.309712  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.309752  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.310799  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.310841  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.311940  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.311972  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.313121  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.313157  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.313892  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.313926  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.314752  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.314784  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.315696  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.315717  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.316767  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.316799  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.318356  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.318392  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.319468  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.319507  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.320616  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.320648  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.322223  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.322256  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.323278  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.323308  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.324578  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.324610  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.325422  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.325464  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.326549  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.326597  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.327456  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.327497  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.328679  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.328713  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.329567  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.329611  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.330447  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.330535  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.331439  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.331469  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.332231  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.332271  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.333458  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.333494  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.334173  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.334311  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.335307  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.335339  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.336172  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.336257  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.342565  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.342617  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.343668  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.343701  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.345060  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.345101  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.347849  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.348016  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.348871  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.348903  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.349672  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.349783  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.350822  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.350855  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.351480  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.351511  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.352440  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.352472  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.353381  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.353459  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.354840  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.354875  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.355700  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.355820  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.356470  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.356502  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.360987  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.361087  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.361988  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.362019  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.362900  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.362936  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.363997  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.364204  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.365504  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.365535  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.366127  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.366156  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.367247  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.367281  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.368380  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.368411  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.369608  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.369716  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.370413  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.370459  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.371466  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.371491  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.372460  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.372497  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.373320  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.373422  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.374527  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.374572  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.375579  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.375671  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.377395  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.377427  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.378211  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.378242  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.379427  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.379462  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.380235  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.380271  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0407 14:42:15.742709  125810 genericapiserver.go:425] Skipping API apps/v1beta2 because it has no resources.
W0407 14:42:15.742731  125810 genericapiserver.go:425] Skipping API apps/v1beta1 because it has no resources.
I0407 14:42:15.759333  125810 plugins.go:158] Loaded 10 mutating admission controller(s) successfully in the following order: NamespaceLifecycle,LimitRanger,TaintNodesByCondition,Priority,DefaultTolerationSeconds,DefaultStorageClass,StorageObjectInUseProtection,RuntimeClass,DefaultIngressClass,MutatingAdmissionWebhook.
I0407 14:42:15.759359  125810 plugins.go:161] Loaded 9 validating admission controller(s) successfully in the following order: LimitRanger,Priority,PersistentVolumeClaimResize,RuntimeClass,CertificateApproval,CertificateSigning,CertificateSubjectRestriction,ValidatingAdmissionWebhook,ResourceQuota.
W0407 14:42:15.761020  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:42:15.761277  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.761311  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:15.762266  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:15.762298  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
W0407 14:42:15.791886  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
    testserver.go:210: Waiting for /healthz to be ok...
I0407 14:42:16.109703  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:16.109786  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
I0407 14:42:21.140370  125810 dynamic_cafile_content.go:167] Starting request-header::/tmp/kubernetes-kube-apiserver725875565/proxy-ca.crt
I0407 14:42:21.140429  125810 dynamic_cafile_content.go:167] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver725875565/client-ca.crt
I0407 14:42:21.140710  125810 dynamic_serving_content.go:130] Starting serving-cert::/tmp/kubernetes-kube-apiserver725875565/apiserver.crt::/tmp/kubernetes-kube-apiserver725875565/apiserver.key
I0407 14:42:21.141473  125810 secure_serving.go:197] Serving securely on 127.0.0.1:41507
I0407 14:42:21.141536  125810 tlsconfig.go:240] Starting DynamicServingCertificateController
I0407 14:42:21.146355  125810 autoregister_controller.go:141] Starting autoregister controller
I0407 14:42:21.146383  125810 cache.go:32] Waiting for caches to sync for autoregister controller
I0407 14:42:21.146428  125810 apf_controller.go:294] Starting API Priority and Fairness config controller
I0407 14:42:21.146446  125810 crdregistration_controller.go:111] Starting crd-autoregister controller
I0407 14:42:21.146454  125810 shared_informer.go:240] Waiting for caches to sync for crd-autoregister
I0407 14:42:21.160828  125810 customresource_discovery_controller.go:209] Starting DiscoveryController
I0407 14:42:21.160886  125810 apiservice_controller.go:97] Starting APIServiceRegistrationController
I0407 14:42:21.160896  125810 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller
I0407 14:42:21.160918  125810 available_controller.go:475] Starting AvailableConditionController
I0407 14:42:21.160923  125810 cache.go:32] Waiting for caches to sync for AvailableConditionController controller
I0407 14:42:21.160954  125810 controller.go:86] Starting OpenAPI controller
I0407 14:42:21.160975  125810 naming_controller.go:291] Starting NamingConditionController
I0407 14:42:21.160991  125810 establishing_controller.go:76] Starting EstablishingController
I0407 14:42:21.161008  125810 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController
I0407 14:42:21.161023  125810 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController
I0407 14:42:21.161040  125810 crd_finalizer.go:266] Starting CRDFinalizer
W0407 14:42:21.163352  125810 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:42:21.163552  125810 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller
I0407 14:42:21.163564  125810 shared_informer.go:240] Waiting for caches to sync for cluster_authentication_trust_controller
I0407 14:42:21.164632  125810 controller.go:83] Starting OpenAPI AggregationController
I0407 14:42:21.164680  125810 dynamic_cafile_content.go:167] Starting client-ca-bundle::/tmp/kubernetes-kube-apiserver725875565/client-ca.crt
I0407 14:42:21.164709  125810 dynamic_cafile_content.go:167] Starting request-header::/tmp/kubernetes-kube-apiserver725875565/proxy-ca.crt
E0407 14:42:21.170904  125810 controller.go:152] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /84da2f19-cca7-4179-8d63-67a7f5b593f9/registry/masterleases/127.0.0.1, ResourceVersion: 0, AdditionalErrorMsg: 
W0407 14:42:21.193100  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.194220  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.197243  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.197797  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.199325  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.202393  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.221722  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
I0407 14:42:21.229806  125810 controller.go:611] quota admission added evaluator for: namespaces
I0407 14:42:21.260691  125810 shared_informer.go:247] Caches are synced for crd-autoregister 
I0407 14:42:21.260774  125810 cache.go:39] Caches are synced for autoregister controller
I0407 14:42:21.260830  125810 apf_controller.go:299] Running API Priority and Fairness config worker
I0407 14:42:21.261610  125810 cache.go:39] Caches are synced for APIServiceRegistrationController controller
I0407 14:42:21.261676  125810 cache.go:39] Caches are synced for AvailableConditionController controller
I0407 14:42:21.263698  125810 shared_informer.go:247] Caches are synced for cluster_authentication_trust_controller 
W0407 14:42:21.265558  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.273412  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.273568  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.286114  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.286284  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.304871  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.305390  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.316372  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.321905  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.325476  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.331554  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.370392  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.370645  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.380715  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.381072  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.384477  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.397351  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.413692  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.414164  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.417879  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.418837  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.419386  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.423075  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.423269  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.430716  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.431007  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.434945  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.464122  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.464130  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.467321  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.473881  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.475261  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.485304  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.485440  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 PriorityLevelConfiguration is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.492179  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.502391  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.507775  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
W0407 14:42:21.513522  125810 warnings.go:70] flowcontrol.apiserver.k8s.io/v1beta1 FlowSchema is deprecated in v1.23+, unavailable in v1.26+
I0407 14:42:22.141132  125810 controller.go:132] OpenAPI AggregationController: action for item : Nothing (removed from the queue).
I0407 14:42:22.141504  125810 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue).
I0407 14:42:22.181211  125810 storage_scheduling.go:132] created PriorityClass system-node-critical with value 2000001000
I0407 14:42:22.186277  125810 storage_scheduling.go:132] created PriorityClass system-cluster-critical with value 2000000000
I0407 14:42:22.186298  125810 storage_scheduling.go:148] all system priority classes are created successfully or already exist.
I0407 14:42:22.408975  125810 client.go:360] parsed scheme: "endpoint"
I0407 14:42:22.409159  125810 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
    invalid_managedFields_test.go:144: expected one warning, got: 0
    invalid_managedFields_test.go:147: unexpected warning, expected: 
        .metadata.managedFields was in an invalid state after admission; this could be caused by an outdated mutating admission controller; please fix your requests: 
        , got: 
    invalid_managedFields_test.go:208: corrupting managedFields [{admissionwebhook.test Update v1 2021-04-07 14:42:22 +0000 UTC FieldsV1 {"f:spec":{"f:containers":{"k:{\"name\":\"fake-name\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}]
W0407 14:42:22.504741  125810 lease.go:233] Resetting endpoints for master service "kubernetes" to [127.0.0.1]
    invalid_managedFields_test.go:161: expected two warnings, got: 1
E0407 14:42:22.508373  125810 controller.go:223] unable to sync kubernetes service: Endpoints "kubernetes" is invalid: subsets[0].addresses[0].ip: Invalid value: "127.0.0.1": may not be in the loopback range (127.0.0.0/8)
W0407 14:42:22.523763  125810 cacher.go:148] Terminating all watchers from cacher *apiextensions.CustomResourceDefinition
W0407 14:42:22.524024  125810 cacher.go:148] Terminating all watchers from cacher *core.LimitRange
W0407 14:42:22.524291  125810 cacher.go:148] Terminating all watchers from cacher *core.ResourceQuota
W0407 14:42:22.524461  125810 cacher.go:148] Terminating all watchers from cacher *core.Secret
W0407 14:42:22.524801  125810 cacher.go:148] Terminating all watchers from cacher *core.ConfigMap
W0407 14:42:22.524949  125810 cacher.go:148] Terminating all watchers from cacher *core.Namespace
W0407 14:42:22.525111  125810 cacher.go:148] Terminating all watchers from cacher *core.Endpoints
W0407 14:42:22.525315  125810 cacher.go:148] Terminating all watchers from cacher *core.Pod
W0407 14:42:22.525497  125810 cacher.go:148] Terminating all watchers from cacher *core.ServiceAccount
W0407 14:42:22.525816  125810 cacher.go:148] Terminating all watchers from cacher *core.Service
W0407 14:42:22.527098  125810 cacher.go:148] Terminating all watchers from cacher *networking.IngressClass
W0407 14:42:22.527569  125810 cacher.go:148] Terminating all watchers from cacher *node.RuntimeClass
W0407 14:42:22.533525  125810 cacher.go:148] Terminating all watchers from cacher *scheduling.PriorityClass
W0407 14:42:22.538747  125810 cacher.go:148] Terminating all watchers from cacher *storage.StorageClass
W0407 14:42:22.539510  125810 cacher.go:148] Terminating all watchers from cacher *flowcontrol.FlowSchema
W0407 14:42:22.541085  125810 cacher.go:148] Terminating all watchers from cacher *flowcontrol.PriorityLevelConfiguration
W0407 14:42:22.543398  125810 cacher.go:148] Terminating all watchers from cacher *admissionregistration.ValidatingWebhookConfiguration
W0407 14:42:22.545239  125810 cacher.go:148] Terminating all watchers from cacher *admissionregistration.MutatingWebhookConfiguration
W0407 14:42:22.545576  125810 cacher.go:148] Terminating all watchers from cacher *apiregistration.APIService
--- FAIL: TestMutatingWebhookResetsInvalidManagedFields (9.80s)

				from junit_20210407-143707.xml


Error lines from build-log.txt

... skipping 70 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 156: bogus-expected-to-fail: command not found
!!! [0407 14:24:53] Call tree:
!!! [0407 14:24:53]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [0407 14:24:53]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [0407 14:24:53]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:132 juLog(...)
!!! [0407 14:24:53]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:160 record_command(...)
!!! [0407 14:24:53]  5: hack/make-rules/test-cmd.sh:35 source(...)
+++ exit code: 1
+++ error: 1
+++ [0407 14:24:54] Running kubeadm tests
+++ [0407 14:24:58] Building go targets for linux/amd64:
    cmd/kubeadm
+++ [0407 14:25:48] Running tests without code coverage
{"Time":"2021-04-07T14:27:23.182706331Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t48.098s\n"}
✓  cmd/kubeadm/test/cmd (48.1s)
... skipping 352 lines ...
I0407 14:29:54.730070   60095 client.go:360] parsed scheme: "passthrough"
I0407 14:29:54.730136   60095 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0407 14:29:54.730152   60095 clientconn.go:948] ClientConn switching balancer to "pick_first"
+++ [0407 14:30:00] Generate kubeconfig for controller-manager
+++ [0407 14:30:00] Starting controller-manager
I0407 14:30:01.504465   63831 serving.go:347] Generated self-signed cert in-memory
W0407 14:30:02.203393   63831 authentication.go:410] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0407 14:30:02.203453   63831 authentication.go:307] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W0407 14:30:02.203461   63831 authentication.go:331] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W0407 14:30:02.203525   63831 authorization.go:216] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W0407 14:30:02.203539   63831 authorization.go:184] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
I0407 14:30:02.203561   63831 controllermanager.go:175] Version: v1.22.0-alpha.0.30+b0abe89ae259d5
I0407 14:30:02.204832   63831 secure_serving.go:197] Serving securely on [::]:10257
I0407 14:30:02.205000   63831 tlsconfig.go:240] Starting DynamicServingCertificateController
I0407 14:30:02.205555   63831 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I0407 14:30:02.205967   63831 leaderelection.go:243] attempting to acquire leader lease kube-system/kube-controller-manager...
... skipping 154 lines ...
I0407 14:30:02.819193   63831 node_lifecycle_controller.go:377] Sending events to api server.
I0407 14:30:02.819400   63831 taint_manager.go:163] "Sending events to api server"
I0407 14:30:02.819481   63831 node_lifecycle_controller.go:505] Controller will reconcile labels.
I0407 14:30:02.819517   63831 controllermanager.go:574] Started "nodelifecycle"
I0407 14:30:02.819646   63831 node_lifecycle_controller.go:539] Starting node controller
I0407 14:30:02.819669   63831 shared_informer.go:240] Waiting for caches to sync for taint
E0407 14:30:02.819875   63831 core.go:91] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W0407 14:30:02.819899   63831 controllermanager.go:566] Skipping "service"
I0407 14:30:02.830057   63831 controllermanager.go:574] Started "namespace"
I0407 14:30:02.830241   63831 namespace_controller.go:200] Starting namespace controller
I0407 14:30:02.830267   63831 shared_informer.go:240] Waiting for caches to sync for namespace
I0407 14:30:02.830530   63831 controllermanager.go:574] Started "replicaset"
I0407 14:30:02.830563   63831 replica_set.go:182] Starting replicaset controller
I0407 14:30:02.830575   63831 shared_informer.go:240] Waiting for caches to sync for ReplicaSet
I0407 14:30:02.830843   63831 node_lifecycle_controller.go:76] Sending events to api server
E0407 14:30:02.830890   63831 core.go:231] failed to start cloud node lifecycle controller: no cloud provider provided
W0407 14:30:02.830901   63831 controllermanager.go:566] Skipping "cloud-node-lifecycle"
W0407 14:30:02.830914   63831 controllermanager.go:566] Skipping "csrsigning"
W0407 14:30:02.830920   63831 controllermanager.go:553] "tokencleaner" is disabled
I0407 14:30:02.834302   63831 shared_informer.go:240] Waiting for caches to sync for resource quota
W0407 14:30:02.865647   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.865929   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.866792   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.867714   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.868053   63831 actual_state_of_world.go:534] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
W0407 14:30:02.868380   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.868562   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.868609   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.868812   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.868850   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
W0407 14:30:02.869201   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
... skipping 26 lines ...
I0407 14:30:03.091438   63831 shared_informer.go:247] Caches are synced for cronjob 
I0407 14:30:03.091522   63831 shared_informer.go:247] Caches are synced for job 
I0407 14:30:03.099108   63831 shared_informer.go:247] Caches are synced for ClusterRoleAggregator 
I0407 14:30:03.120271   63831 shared_informer.go:247] Caches are synced for TTL after finished 
I0407 14:30:03.218526   63831 shared_informer.go:247] Caches are synced for stateful set 
I0407 14:30:03.292924   63831 shared_informer.go:247] Caches are synced for daemon sets 
The Service "kubernetes" is invalid: spec.clusterIPs: Invalid value: []string{"10.0.0.1"}: failed to allocated ip:10.0.0.1 with error:provided IP is already allocated
I0407 14:30:03.316244   63831 shared_informer.go:247] Caches are synced for resource quota 
I0407 14:30:03.320722   63831 shared_informer.go:247] Caches are synced for taint 
I0407 14:30:03.320827   63831 node_lifecycle_controller.go:1398] Initializing eviction metric for zone: 
I0407 14:30:03.320888   63831 taint_manager.go:187] "Starting NoExecuteTaintManager"
I0407 14:30:03.320905   63831 node_lifecycle_controller.go:1164] Controller detected that all Nodes are not-Ready. Entering master disruption mode.
I0407 14:30:03.320997   63831 event.go:291] "Event occurred" object="127.0.0.1" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node 127.0.0.1 event: Registered Node 127.0.0.1 in Controller"
... skipping 107 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [0407 14:30:08] Creating namespace namespace-1617805808-2840
namespace/namespace-1617805808-2840 created
Context "test" modified.
+++ [0407 14:30:08] Testing RESTMapper
+++ [0407 14:30:08] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
NAME                              SHORTNAMES   APIVERSION                             NAMESPACED   KIND
bindings                                       v1                                     true         Binding
componentstatuses                 cs           v1                                     false        ComponentStatus
configmaps                        cm           v1                                     true         ConfigMap
endpoints                         ep           v1                                     true         Endpoints
... skipping 63 lines ...
namespace/namespace-1617805814-20116 created
Context "test" modified.
+++ [0407 14:30:15] Testing clusterroles
rbac.sh:29: Successful get clusterroles/cluster-admin {{.metadata.name}}: cluster-admin
rbac.sh:30: Successful get clusterrolebindings/cluster-admin {{.metadata.name}}: cluster-admin
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created (dry run)
clusterrole.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterroles.rbac.authorization.k8s.io "pod-admin" not found
has:clusterroles.rbac.authorization.k8s.io "pod-admin" not found
clusterrole.rbac.authorization.k8s.io/pod-admin created
rbac.sh:42: Successful get clusterrole/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "pod-admin" deleted
... skipping 18 lines ...
clusterrole.rbac.authorization.k8s.io/url-reader created
rbac.sh:61: Successful get clusterrole/url-reader {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: get:
rbac.sh:62: Successful get clusterrole/url-reader {{range.rules}}{{range.nonResourceURLs}}{{.}}:{{end}}{{end}}: /logs/*:/healthz/*:
clusterrole.rbac.authorization.k8s.io/aggregation-reader created
rbac.sh:64: Successful get clusterrole/aggregation-reader {{.metadata.name}}: aggregation-reader
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin created (server dry run)
Successful
message:Error from server (NotFound): clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
has:clusterrolebindings.rbac.authorization.k8s.io "super-admin" not found
clusterrolebinding.rbac.authorization.k8s.io/super-admin created
rbac.sh:77: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (dry run)
clusterrolebinding.rbac.authorization.k8s.io/super-admin subjects updated (server dry run)
rbac.sh:80: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:
... skipping 64 lines ...
rbac.sh:102: Successful get clusterrolebinding/super-admin {{range.subjects}}{{.name}}:{{end}}: super-admin:foo:test-all-user:
rbac.sh:103: Successful get clusterrolebinding/super-group {{range.subjects}}{{.name}}:{{end}}: the-group:foo:test-all-user:
rbac.sh:104: Successful get clusterrolebinding/super-sa {{range.subjects}}{{.name}}:{{end}}: sa-name:foo:test-all-user:
rolebinding.rbac.authorization.k8s.io/admin created (dry run)
rolebinding.rbac.authorization.k8s.io/admin created (server dry run)
Successful
message:Error from server (NotFound): rolebindings.rbac.authorization.k8s.io "admin" not found
has: not found
rolebinding.rbac.authorization.k8s.io/admin created
rbac.sh:113: Successful get rolebinding/admin {{.roleRef.kind}}: ClusterRole
rbac.sh:114: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:
rolebinding.rbac.authorization.k8s.io/admin subjects updated
rbac.sh:116: Successful get rolebinding/admin {{range.subjects}}{{.name}}:{{end}}: default-admin:foo:
... skipping 29 lines ...
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1617805823-24850 namespace.
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1617805823-24850 namespace.
Error: 1 warning received
has:Role is deprecated
Successful
message:Warning: rbac.authorization.k8s.io/v1beta1 Role is deprecated in v1.17+, unavailable in v1.22+; use rbac.authorization.k8s.io/v1 Role
No resources found in namespace-1617805823-24850 namespace.
Error: 1 warning received
has:Error: 1 warning received
role.rbac.authorization.k8s.io/pod-admin created (dry run)
role.rbac.authorization.k8s.io/pod-admin created (server dry run)
Successful
message:Error from server (NotFound): roles.rbac.authorization.k8s.io "pod-admin" not found
has: not found
role.rbac.authorization.k8s.io/pod-admin created
rbac.sh:163: Successful get role/pod-admin {{range.rules}}{{range.verbs}}{{.}}:{{end}}{{end}}: *:
rbac.sh:164: Successful get role/pod-admin {{range.rules}}{{range.resources}}{{.}}:{{end}}{{end}}: pods:
rbac.sh:165: Successful get role/pod-admin {{range.rules}}{{range.apiGroups}}{{.}}:{{end}}{{end}}: :
Successful
... skipping 412 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name was specified
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:210: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:215: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
... skipping 24 lines ...
Warning: policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
poddisruptionbudget.policy/test-pdb-3 created
core.sh:265: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
Warning: policy/v1beta1 PodDisruptionBudget is deprecated in v1.21+, unavailable in v1.25+; use policy/v1 PodDisruptionBudget
poddisruptionbudget.policy/test-pdb-4 created
core.sh:269: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
core.sh:275: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 224 lines ...
core.sh:534: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.4.1:
Successful
message:kubectl-create kubectl-patch
has:kubectl-patch
pod/valid-pod patched
core.sh:554: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [0407 14:30:56] "kubectl patch with resourceVersion 597" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:578: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:kubectl-replace
has:kubectl-replace
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
W0407 14:30:57.549268   63831 actual_state_of_world.go:534] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test created
core.sh:606: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
node/node-v1-test replaced (server dry run)
node/node-v1-test replaced (dry run)
core.sh:631: Successful get node node-v1-test {{range.items}}{{if .metadata.annotations.a}}found{{end}}{{end}}:: :
I0407 14:30:58.329899   63831 event.go:291] "Event occurred" object="node-v1-test" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node node-v1-test event: Registered Node node-v1-test in Controller"
... skipping 31 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:3.4.1
    name: kubernetes-pause
has:localonlyvalue
core.sh:683: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:687: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:691: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:695: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
core.sh:699: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 84 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [0407 14:31:09] Creating namespace namespace-1617805869-29858
namespace/namespace-1617805869-29858 created
Context "test" modified.
+++ [0407 14:31:09] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 44 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [0407 14:31:09] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

+++ Running case: test-cmd.run_kubectl_apply_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 29 lines ...
I0407 14:31:13.049857   63831 event.go:291] "Event occurred" object="namespace-1617805869-27058/test-deployment-retainkeys-8695b756f8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: test-deployment-retainkeys-8695b756f8-h7pn8"
deployment.apps "test-deployment-retainkeys" deleted
apply.sh:88: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
apply.sh:92: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
apply.sh:101: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W0407 14:31:14.240678   72253 helpers.go:571] --dry-run=true is deprecated (boolean value) and can be replaced with --dry-run=client.
pod/test-pod created (dry run)
pod/test-pod created (dry run)
... skipping 37 lines ...
pod/b created
apply.sh:196: Successful get pods a {{.metadata.name}}: a
apply.sh:197: Successful get pods b -n nsb {{.metadata.name}}: b
pod "a" deleted
pod "b" deleted
Successful
message:error: all resources selected for prune without explicitly passing --all. To prune all resources, pass the --all flag. If you did not mean to prune all resources, specify a label selector
has:all resources selected for prune without explicitly passing --all
pod/a created
pod/b created
service/prune-svc created
Warning: batch/v1beta1 CronJob is deprecated in v1.21+, unavailable in v1.25+; use batch/v1 CronJob
I0407 14:31:23.593628   63831 horizontal.go:361] Horizontal Pod Autoscaler frontend has been deleted in namespace-1617805866-925
... skipping 41 lines ...
pod/b unchanged
pod/a pruned
Warning: batch/v1beta1 CronJob is deprecated in v1.21+, unavailable in v1.25+; use batch/v1 CronJob
apply.sh:254: Successful get pods -n nsb {{range.items}}{{.metadata.name}}:{{end}}: b:
namespace "nsb" deleted
Successful
message:error: the namespace from the provided object "nsb" does not match the namespace "foo". You must pass '--namespace=nsb' to perform this operation.
has:the namespace from the provided object "nsb" does not match the namespace "foo".
apply.sh:265: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
service/a created
apply.sh:269: Successful get services a {{.metadata.name}}: a
Successful
message:The Service "a" is invalid: spec.clusterIPs[0]: Invalid value: []string{"10.0.0.12"}: may not change once set
... skipping 29 lines ...
apply.sh:291: Successful get deployment test-the-deployment {{.metadata.name}}: test-the-deployment
apply.sh:292: Successful get service test-the-service {{.metadata.name}}: test-the-service
configmap "test-the-map" deleted
service "test-the-service" deleted
deployment.apps "test-the-deployment" deleted
Successful
message:Error from server (NotFound): namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
apply.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:namespace/multi-resource-ns created
Error from server (NotFound): error when creating "hack/testdata/multi-resource-1.yaml": namespaces "multi-resource-ns" not found
has:namespaces "multi-resource-ns" not found
Successful
message:Error from server (NotFound): pods "test-pod" not found
has:pods "test-pod" not found
pod/test-pod created
namespace/multi-resource-ns unchanged
apply.sh:308: Successful get pods test-pod -n multi-resource-ns {{.metadata.name}}: test-pod
pod "test-pod" deleted
namespace "multi-resource-ns" deleted
apply.sh:314: Successful get configmaps --field-selector=metadata.name=foo {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:configmap/foo created
error: unable to recognize "hack/testdata/multi-resource-2.yaml": no matches for kind "Bogus" in version "example.com/v1"
has:no matches for kind "Bogus" in version "example.com/v1"
apply.sh:320: Successful get configmaps foo {{.metadata.name}}: foo
configmap "foo" deleted
apply.sh:326: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:pod/pod-a created
... skipping 6 lines ...
pod "pod-c" deleted
apply.sh:334: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
apply.sh:338: Successful get crds {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/widgets.example.com created
error: unable to recognize "hack/testdata/multi-resource-4.yaml": no matches for kind "Widget" in version "example.com/v1"
has:no matches for kind "Widget" in version "example.com/v1"
W0407 14:32:03.367772   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:32:03.367828   63831 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for widgets.example.com
I0407 14:32:03.367903   63831 shared_informer.go:240] Waiting for caches to sync for resource quota
I0407 14:32:03.391113   60095 client.go:360] parsed scheme: "endpoint"
I0407 14:32:03.391164   60095 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379  <nil> 0 <nil>}]
Successful
message:Error from server (NotFound): widgets.example.com "foo" not found
has:widgets.example.com "foo" not found
I0407 14:32:03.468820   63831 shared_informer.go:247] Caches are synced for resource quota 
apply.sh:344: Successful get crds widgets.example.com {{.metadata.name}}: widgets.example.com
I0407 14:32:03.809872   63831 shared_informer.go:240] Waiting for caches to sync for garbage collector
I0407 14:32:03.809935   63831 shared_informer.go:247] Caches are synced for garbage collector 
I0407 14:32:05.718283   60095 controller.go:611] quota admission added evaluator for: widgets.example.com
... skipping 20 lines ...
pod/test-pod serverside-applied
apply.sh:368: Successful get pods test-pod {{.metadata.labels.name}}: test-pod-label
Successful
message:kubectl
has:kubectl
W0407 14:32:06.926363   60095 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0407 14:32:06.927816   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod serverside-applied
Successful
message:kubectl my-field-manager
has:my-field-manager
pod "test-pod" deleted
apply.sh:381: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/test-pod serverside-applied (server dry run)
apply.sh:386: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/test-pod serverside-applied
E0407 14:32:07.944251   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/test-pod serverside-applied (server dry run)
apply.sh:393: Successful get pods test-pod {{.metadata.labels.name}}: test-pod-label
Successful
message:883
has:883
pod "test-pod" deleted
apply.sh:403: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
+++ [0407 14:32:08] Testing upgrade kubectl client-side apply to server-side apply
pod/test-pod created
error: Apply failed with 1 conflict: conflict with "kubectl-client-side-apply" using v1: .metadata.labels.name
Please review the fields above--they currently have other managers. Here
are the ways you can resolve this warning:
* If you intend to manage all of these fields, please re-run the apply
  command with the `--force-conflicts` flag.
* If you do not intend to manage all of the fields, please edit your
  manifest to remove references to the fields that should keep their
... skipping 46 lines ...
    ]
  }
}
has:"name": "test-pod-label"
pod/test-pod configured
pod "test-pod" deleted
E0407 14:32:10.303807   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
Successful
message:resources.mygroup.example.com
has:resources.mygroup.example.com
Warning: apiextensions.k8s.io/v1beta1 CustomResourceDefinition is deprecated in v1.16+, unavailable in v1.22+; use apiextensions.k8s.io/v1 CustomResourceDefinition
... skipping 21 lines ...
(Bpod "nginx-extensions" deleted
Successful
message:pod/test1 created
has:pod/test1 created
pod "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
Recording: run_kubectl_create_filter_tests
Running command: run_kubectl_create_filter_tests

+++ Running case: test-cmd.run_kubectl_create_filter_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 3 lines ...
Context "test" modified.
+++ [0407 14:32:12] Testing kubectl create filter
create.sh:50: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:54: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 3 lines ...
+++ [0407 14:32:13] Creating namespace namespace-1617805933-31701
namespace/namespace-1617805933-31701 created
Context "test" modified.
+++ [0407 14:32:13] Testing kubectl apply deployments
apps.sh:119: Successful get deployments {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:120: Successful get replicasets {{range.items}}{{.metadata.name}}:{{end}}: 
E0407 14:32:14.048014   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:121: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/my-depl created
I0407 14:32:14.338089   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/my-depl" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set my-depl-84fb47b469 to 1"
I0407 14:32:14.348478   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/my-depl-84fb47b469" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: my-depl-84fb47b469-tdpmj"
apps.sh:125: Successful get deployments my-depl {{.metadata.name}}: my-depl
apps.sh:127: Successful get deployments my-depl {{.spec.template.metadata.labels.l1}}: l1
... skipping 14 lines ...
I0407 14:32:15.986145   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-9bb9c4878 to 3"
I0407 14:32:15.989805   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-lqqmk"
I0407 14:32:15.995764   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-wdr97"
I0407 14:32:15.997087   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/nginx-9bb9c4878" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-9bb9c4878-kn5sz"
apps.sh:152: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1617805933-31701\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1617805933-31701"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
E0407 14:32:23.605147   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0407 14:32:24.339780   60095 client.go:360] parsed scheme: "passthrough"
I0407 14:32:24.339838   60095 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0407 14:32:24.339848   60095 clientconn.go:948] ClientConn switching balancer to "pick_first"
deployment.apps/nginx configured
I0407 14:32:24.632212   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-6dd6cfdb57 to 3"
I0407 14:32:24.638303   63831 event.go:291] "Event occurred" object="namespace-1617805933-31701/nginx-6dd6cfdb57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6dd6cfdb57-x62st"
... skipping 311 lines ...
+++ [0407 14:32:33] Creating namespace namespace-1617805953-26004
namespace/namespace-1617805953-26004 created
Context "test" modified.
+++ [0407 14:32:33] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
I0407 14:32:33.478683   63831 shared_informer.go:240] Waiting for caches to sync for resource quota
I0407 14:32:33.478734   63831 shared_informer.go:247] Caches are synced for resource quota 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 25 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1617805953-26004 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1617805953-26004 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I0407 14:32:35.148816   75759 loader.go:372] Config loaded from file:  /tmp/tmp.L7pLYKUw1K/.kube/config
I0407 14:32:35.155444   75759 round_trippers.go:454] GET https://127.0.0.1:6443/version?timeout=32s 200 OK in 6 milliseconds
I0407 14:32:35.182968   75759 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/default/pods 200 OK in 2 milliseconds
I0407 14:32:35.185512   75759 round_trippers.go:454] GET https://127.0.0.1:6443/api/v1/namespaces/default/replicationcontrollers 200 OK in 2 milliseconds
... skipping 524 lines ...
message:NAME               DATA   AGE
kube-root-ca.crt   1      7s
one                0      0s
three              0      0s
two                0      0s
has not:watch is only supported on individual resources
E0407 14:32:41.482186   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:
has not:watch is only supported on individual resources
+++ [0407 14:32:42] Creating namespace namespace-1617805962-4204
namespace/namespace-1617805962-4204 created
Context "test" modified.
... skipping 58 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2021-04-07T14:32:42Z", "labels":map[string]interface {}{"name":"valid-pod"}, "managedFields":[]interface {}{map[string]interface {}{"apiVersion":"v1", "fieldsType":"FieldsV1", "fieldsV1":map[string]interface {}{"f:metadata":map[string]interface {}{"f:labels":map[string]interface {}{".":map[string]interface {}{}, "f:name":map[string]interface {}{}}}, "f:spec":map[string]interface {}{"f:containers":map[string]interface {}{"k:{\"name\":\"kubernetes-serve-hostname\"}":map[string]interface {}{".":map[string]interface {}{}, "f:image":map[string]interface {}{}, "f:imagePullPolicy":map[string]interface {}{}, "f:name":map[string]interface {}{}, "f:resources":map[string]interface {}{".":map[string]interface {}{}, "f:limits":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}, "f:requests":map[string]interface {}{".":map[string]interface {}{}, "f:cpu":map[string]interface {}{}, "f:memory":map[string]interface {}{}}}, "f:terminationMessagePath":map[string]interface {}{}, "f:terminationMessagePolicy":map[string]interface {}{}}}, "f:dnsPolicy":map[string]interface {}{}, "f:enableServiceLinks":map[string]interface {}{}, "f:restartPolicy":map[string]interface {}{}, "f:schedulerName":map[string]interface {}{}, "f:securityContext":map[string]interface {}{}, "f:terminationGracePeriodSeconds":map[string]interface {}{}}}, "manager":"kubectl-create", "operation":"Update", "time":"2021-04-07T14:32:42Z"}}, "name":"valid-pod", "namespace":"namespace-1617805962-4204", "resourceVersion":"1050", "uid":"140bea0d-2bd1-4cc8-87ec-ce3c9a3599f1"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", 
"memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "preemptionPolicy":"PreemptLowerPriority", "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2021-04-07T14:32:42Z","labels":{"name":"valid-pod"},"managedFields":[{"apiVersion":"v1","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:labels":{".":{},"f:name":{}}},"f:spec":{"f:containers":{"k:{\"name\":\"kubernetes-serve-hostname\"}":{".":{},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:resources":{".":{},"f:limits":{".":{},"f:cpu":{},"f:memory":{}},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:enableServiceLinks":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}},"manager":"kubectl-create","operation":"Update","time":"2021-04-07T14:32:42Z"}],"name":"valid-pod","namespace":"namespace-1617805962-4204","resourceVersion":"1050","uid":"140bea0d-2bd1-4cc8-87ec-ce3c9a3599f1"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"preemptionPolicy":"PreemptLowerPriority","priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2021-04-07T14:32:42Z labels:map[name:valid-pod] managedFields:[map[apiVersion:v1 fieldsType:FieldsV1 fieldsV1:map[f:metadata:map[f:labels:map[.:map[] f:name:map[]]] f:spec:map[f:containers:map[k:{"name":"kubernetes-serve-hostname"}:map[.:map[] f:image:map[] f:imagePullPolicy:map[] f:name:map[] f:resources:map[.:map[] f:limits:map[.:map[] f:cpu:map[] f:memory:map[]] f:requests:map[.:map[] f:cpu:map[] f:memory:map[]]] f:terminationMessagePath:map[] f:terminationMessagePolicy:map[]]] f:dnsPolicy:map[] f:enableServiceLinks:map[] f:restartPolicy:map[] f:schedulerName:map[] f:securityContext:map[] f:terminationGracePeriodSeconds:map[]]] manager:kubectl-create operation:Update time:2021-04-07T14:32:42Z]] name:valid-pod namespace:namespace-1617805962-4204 resourceVersion:1050 uid:140bea0d-2bd1-4cc8-87ec-ce3c9a3599f1] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true preemptionPolicy:PreemptLowerPriority priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
... skipping 84 lines ...
  terminationGracePeriodSeconds: 30
status:
  phase: Pending
  qosClass: Guaranteed
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
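The `{{.missing}}` failure above matches the behavior of Go's `text/template` executed with `missingkey=error`, which produces the `map has no entry for key "missing"` text seen in the log. A minimal reproduction (the template name `output` is chosen only to match the log's error string):

```go
package main

import (
	"fmt"
	"io"
	"text/template"
)

// execMissing executes {{.missing}} against data that lacks the key,
// with missingkey=error set, and returns the resulting error text.
func execMissing() string {
	t := template.Must(template.New("output").Option("missingkey=error").Parse("{{.missing}}"))
	err := t.Execute(io.Discard, map[string]interface{}{"kind": "Pod"})
	if err == nil {
		return ""
	}
	return err.Error()
}

func main() {
	fmt.Println(execMissing())
}
```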
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 36 lines ...
+++ [0407 14:32:51] Creating namespace namespace-1617805971-32333
namespace/namespace-1617805971-32333 created
Context "test" modified.
+++ [0407 14:32:52] Testing kubectl exec POD COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 3 lines ...
+++ [0407 14:32:52] Creating namespace namespace-1617805972-25348
namespace/namespace-1617805972-25348 created
Context "test" modified.
+++ [0407 14:32:52] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: the server doesn't have a resource type "foo"
has:error:
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I0407 14:32:53.665206   63831 event.go:291] "Event occurred" object="namespace-1617805972-25348/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-94mfg"
I0407 14:32:53.669171   63831 event.go:291] "Event occurred" object="namespace-1617805972-25348/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-bprxg"
I0407 14:32:53.669925   63831 event.go:291] "Event occurred" object="namespace-1617805972-25348/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-p46ks"
configmap/test-set-env-config created
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod, type/name or --filename must be specified
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-94mfg does not have a host assigned
has not:not found
Successful
message:kubectl exec [POD] [COMMAND] is DEPRECATED and will be removed in a future version. Use kubectl exec [POD] -- [COMMAND] instead.
Error from server (BadRequest): pod frontend-94mfg does not have a host assigned
has not:pod, type/name or --filename must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"3d3c6827-afec-4dd1-bc9d-cccc75fc2bbc","resourceVersion":"1131","creationTimestamp":"2021-04-07T14:32:54Z"}}
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"3d3c6827-afec-4dd1-bc9d-cccc75fc2bbc","resourceVersion":"1132","creationTimestamp":"2021-04-07T14:32:54Z"},"data":{"key1":"config1"}}
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","uid":"3d3c6827-afec-4dd1-bc9d-cccc75fc2bbc","resourceVersion":"1132","creationTimestamp":"2021-04-07T14:32:54Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"3d3c6827-afec-4dd1-bc9d-cccc75fc2bbc"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 73 lines ...
      securityContext: {}
      terminationGracePeriodSeconds: 30
status: {}
has:apps/v1beta1
deployment.apps "nginx" deleted
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
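The `Object 'Kind' is missing` error above is triggered by the typo `"ind":"Pod"` in `busybox-broken.yaml`: the decoder inspects only `apiVersion`/`kind` before anything else, and rejects the object when `Kind` decodes as empty. A simplified stdlib-only sketch of that check (the `typeMeta`/`kindOf` names are illustrative, not kubectl's actual code):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// typeMeta mirrors the two fields the decoder checks first.
type typeMeta struct {
	APIVersion string `json:"apiVersion"`
	Kind       string `json:"kind"`
}

// kindOf decodes just apiVersion/kind from a manifest; a typo such as
// "ind" leaves Kind empty, which is reported as a missing Kind.
func kindOf(raw string) (string, error) {
	var tm typeMeta
	if err := json.Unmarshal([]byte(raw), &tm); err != nil {
		return "", err
	}
	if tm.Kind == "" {
		return "", fmt.Errorf("Object 'Kind' is missing in '%s'", raw)
	}
	return tm.Kind, nil
}

func main() {
	_, err := kindOf(`{"apiVersion":"v1","ind":"Pod"}`)
	fmt.Println(err)
}
```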
Successful
message:nginx:
has:nginx:
+++ exit code: 0
Recording: run_kubectl_delete_allnamespaces_tests
... skipping 104 lines ...
has:Timeout
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          2s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 164 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
foo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
foo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [0407 14:33:10] "kubectl patch --local" returns error as expected for CustomResource: error: strategic merge patch is not supported for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 205 lines ...
done
/home/prow/go/src/k8s.io/kubernetes/test/cmd/../../test/cmd/crd.sh: line 294: 78598 Killed                  kubectl "${kube_flags[@]}" get bars --request-timeout=1m --watch-only -o name
Successful
message:bar.company.com/test
has:bar.company.com/test
bar.company.com "test" deleted
E0407 14:33:17.909320   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
W0407 14:33:33.496803   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:33:33.496854   63831 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for resources.mygroup.example.com
W0407 14:33:33.496886   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:33:33.496900   63831 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for bars.company.com
W0407 14:33:33.496916   63831 mutation_detector.go:53] Mutation detector is enabled, this will result in memory leakage.
I0407 14:33:33.496928   63831 resource_quota_monitor.go:229] QuotaMonitor created object count evaluator for validfoos.company.com
... skipping 71 lines ...
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
I0407 14:33:42.466662   60095 client.go:360] parsed scheme: "passthrough"
I0407 14:33:42.466716   60095 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0407 14:33:42.466726   60095 clientconn.go:948] ClientConn switching balancer to "pick_first"
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
+++ [0407 14:33:47] Testing recursive resources
... skipping 2 lines ...
Context "test" modified.
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
W0407 14:33:47.907639   60095 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0407 14:33:47.909885   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
W0407 14:33:48.014676   60095 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0407 14:33:48.016155   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W0407 14:33:48.115676   60095 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0407 14:33:48.117266   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
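[editor's note] The "Object 'Kind' is missing" decode failures above all trace back to the busybox-broken.yaml fixture, whose manifest deliberately spells the `kind` key as `ind` (visible verbatim in the quoted JSON). A minimal sketch of detecting such a file before a recursive apply, using a hypothetical copy of the manifest under /tmp rather than the real fixture path:

```shell
# Hypothetical reproduction of the broken fixture: note "ind" where
# "kind" should be, exactly as quoted in the decode errors above.
cat > /tmp/busybox-broken.json <<'EOF'
{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"}}
EOF

# Crude pre-flight check: does the manifest declare a top-level kind at all?
if grep -q '"kind"' /tmp/busybox-broken.json; then
  echo "kind present"
else
  echo "kind missing"    # this branch fires for the broken fixture
fi
```

Note the two distinct failure modes in the log: the *validation* error suggests `--validate=false` as an escape hatch, while the *decode* error ("Object 'Kind' is missing") cannot be bypassed, since kubectl needs a kind to route the object at all.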
W0407 14:33:48.227488   60095 cacher.go:148] Terminating all watchers from cacher *unstructured.Unstructured
E0407 14:33:48.229651   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0407 14:33:48.938967   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1617806027-10339
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 153 lines ...
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E0407 14:33:49.031758   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0407 14:33:49.037956   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E0407 14:33:49.434546   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0407 14:33:49.452343   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: resource pods/busybox0 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox0 configured
Warning: resource pods/busybox1 is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by kubectl apply. kubectl apply should only be used on resources created declaratively by either kubectl create --save-config or kubectl apply. The missing annotation will be patched automatically.
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:273: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:278: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
generic-resources.sh:288: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:293: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:297: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
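[editor's note] Throughout the run above, kubectl's `--recursive` handling keeps going after the broken manifest: busybox0 and busybox1 are created, labeled, patched, and force-deleted on every pass while busybox-broken.yaml fails each time. A self-contained sketch of that continue-on-error traversal (plain shell, hypothetical temp-dir layout, no kubectl involved):

```shell
# Emulate a recursive manifest walk that reports bad files but still
# processes the good ones, as the tests above expect. Paths and file
# names are illustrative, not the real hack/testdata layout.
dir=$(mktemp -d)
mkdir -p "$dir/pod"
printf '%s\n' '{"kind":"Pod","metadata":{"name":"busybox0"}}' > "$dir/busybox0.json"
printf '%s\n' '{"kind":"Pod","metadata":{"name":"busybox1"}}' > "$dir/busybox1.json"
printf '%s\n' '{"ind":"Pod","metadata":{"name":"busybox2"}}'  > "$dir/pod/busybox-broken.json"

rc=0
for f in "$dir"/busybox0.json "$dir"/busybox1.json "$dir"/pod/busybox-broken.json; do
  if grep -q '"kind"' "$f"; then
    echo "processed $(basename "$f")"
  else
    echo "error: unable to decode $f: Object 'Kind' is missing" >&2
    rc=1   # remember the failure, but keep walking the tree
  fi
done
echo "exit status would be $rc"
```

This mirrors why each `Successful` block above contains both the per-object success messages and a trailing `error:` line: the command aggregates failures instead of aborting on the first one.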
generic-resources.sh:302: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/busybox0 created
I0407 14:33:51.044403   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-vrsr9"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0407 14:33:51.052042   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-tzvzr"
E0407 14:33:51.136590   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:306: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:311: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:312: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:313: Successful get rc busybox1 {{.spec.replicas}}: 1
I0407 14:33:51.616255   63831 namespace_controller.go:185] Namespace has been deleted non-native-resources
generic-resources.sh:318: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
E0407 14:33:51.656312   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0407 14:33:51.686886   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:319: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:327: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:328: Successful get rc busybox0 {{.spec.replicas}}: 1
E0407 14:33:52.176483   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:329: Successful get rc busybox1 {{.spec.replicas}}: 1
generic-resources.sh:333: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
generic-resources.sh:334: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:340: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:341: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:342: Successful get rc busybox1 {{.spec.replicas}}: 1
I0407 14:33:52.952417   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-c6745"
I0407 14:33:52.961719   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-5vb48"
generic-resources.sh:346: Successful get rc busybox0 {{.spec.replicas}}: 2
generic-resources.sh:347: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:352: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:356: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:361: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx1-deployment created
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0407 14:33:53.759939   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/nginx1-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx1-deployment-758b5949b6 to 2"
I0407 14:33:53.774543   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-xt5p8"
I0407 14:33:53.775300   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/nginx0-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx0-deployment-75db9cdfd9 to 2"
I0407 14:33:53.778706   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-7ssck"
I0407 14:33:53.781791   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/nginx1-deployment-758b5949b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx1-deployment-758b5949b6-gb4qr"
I0407 14:33:53.786887   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/nginx0-deployment-75db9cdfd9" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx0-deployment-75db9cdfd9-fhgm6"
generic-resources.sh:365: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
generic-resources.sh:366: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:370: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:378: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E0407 14:33:55.577021   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:400: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E0407 14:33:56.095478   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I0407 14:33:56.179936   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/busybox0" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox0-wcl92"
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I0407 14:33:56.188886   63831 event.go:291] "Event occurred" object="namespace-1617806027-10339/busybox1" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: busybox1-xlrrj"
E0407 14:33:56.278799   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:404: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:no rollbacker has been implemented for "ReplicationController"
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
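[editor's note] The pause/resume failures above are expected: the log shows rollout pause/resume succeeding for Deployments earlier in the run but reporting "pausing is not supported" / "resuming is not supported" for ReplicationControllers. A toy dispatch sketch of that behavior (illustrative only; this is not kubectl's actual code):

```shell
# Illustrative mapping of which kinds accept rollout pause, mirroring
# the log: deployments pause cleanly, replicationcontrollers refuse.
rollout_pause() {
  kind=$1; name=$2
  case "$kind" in
    deployment)
      echo "deployment.apps/$name paused" ;;
    replicationcontroller)
      echo "error: replicationcontrollers \"$name\" pausing is not supported" >&2
      return 1 ;;
  esac
}

rollout_pause deployment nginx1-deployment
rollout_pause replicationcontroller busybox0 || true
```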
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E0407 14:33:57.674098   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [0407 14:33:57] Testing kubectl(v1:namespaces)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created (dry run)
namespace/my-namespace created (server dry run)
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1455: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
namespace "my-namespace" deleted
namespace/my-namespace condition met
I0407 14:34:03.606935   63831 shared_informer.go:240] Waiting for caches to sync for resource quota
I0407 14:34:03.606986   63831 shared_informer.go:247] Caches are synced for resource quota 
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
core.sh:1464: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
I0407 14:34:03.967204   63831 shared_informer.go:240] Waiting for caches to sync for garbage collector
I0407 14:34:03.967296   63831 shared_informer.go:247] Caches are synced for garbage collector 
Successful
... skipping 33 lines ...
namespace "namespace-1617805979-21333" deleted
namespace "namespace-1617805979-2684" deleted
namespace "namespace-1617805981-21237" deleted
namespace "namespace-1617805983-16726" deleted
namespace "namespace-1617805984-30881" deleted
namespace "namespace-1617806027-10339" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1617805804-15134" deleted
... skipping 29 lines ...
namespace "namespace-1617805979-21333" deleted
namespace "namespace-1617805979-2684" deleted
namespace "namespace-1617805981-21237" deleted
namespace "namespace-1617805983-16726" deleted
namespace "namespace-1617805984-30881" deleted
namespace "namespace-1617806027-10339" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
namespace/quotas created
core.sh:1471: Successful get namespaces/quotas {{.metadata.name}}: quotas
core.sh:1472: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
resourcequota/test-quota created (dry run)
resourcequota/test-quota created (server dry run)
core.sh:1476: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: :
resourcequota/test-quota created
core.sh:1479: Successful get quota --namespace=quotas {{range.items}}{{ if eq .metadata.name \"test-quota\" }}found{{end}}{{end}}:: found:
I0407 14:34:05.000882   63831 resource_quota_controller.go:307] Resource quota has been deleted quotas/test-quota
resourcequota "test-quota" deleted
namespace "quotas" deleted
E0407 14:34:05.849683   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0407 14:34:06.529891   63831 horizontal.go:361] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1617806027-10339
I0407 14:34:06.534073   63831 horizontal.go:361] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1617806027-10339
E0407 14:34:07.812528   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0407 14:34:08.557288   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E0407 14:34:09.686475   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1491: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1495: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1499: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
core.sh:1503: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:1505: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
core.sh:1512: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1516: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
namespace "other" deleted
... skipping 117 lines ...
core.sh:911: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
secret "secret-string-data" deleted
core.sh:920: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret "test-secret" deleted
namespace "test-secrets" deleted
I0407 14:34:21.617966   63831 namespace_controller.go:185] Namespace has been deleted other
E0407 14:34:24.593212   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0407 14:34:25.123192   60095 client.go:360] parsed scheme: "passthrough"
I0407 14:34:25.123252   60095 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0407 14:34:25.123263   60095 clientconn.go:948] ClientConn switching balancer to "pick_first"
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests
... skipping 17 lines ...
configmap/test-configmap created (server dry run)
core.sh:46: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
core.sh:51: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
core.sh:52: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E0407 14:34:27.594370   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E0407 14:34:28.848361   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0407 14:34:30.748829   63831 namespace_controller.go:185] Namespace has been deleted test-secrets
E0407 14:34:31.698667   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [0407 14:34:33] Creating namespace namespace-1617806073-31806
namespace/namespace-1617806073-31806 created
Context "test" modified.
+++ [0407 14:34:33] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests

+++ Running case: test-cmd.run_service_accounts_tests 
... skipping 11 lines ...
serviceaccount/test-service-account created (server dry run)
core.sh:953: Successful get serviceaccount --namespace=test-service-accounts {{range.items}}{{ if eq .metadata.name \"test-service-account\" }}found{{end}}{{end}}:: :
serviceaccount/test-service-account created
core.sh:957: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
serviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E0407 14:34:37.783131   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I0407 14:34:38.069865   63831 namespace_controller.go:185] Namespace has been deleted test-configmaps
+++ exit code: 0
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
... skipping 21 lines ...
Labels:                        <none>
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  <none>
... skipping 41 lines ...
Labels:         controller-uid=c176b5a8-6d78-4b2b-a64e-c4ea251fa32f
                job-name=test-job
Annotations:    cronjob.kubernetes.io/instantiate: manual
Parallelism:    1
Completions:    1
Start Time:     Wed, 07 Apr 2021 14:34:42 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=c176b5a8-6d78-4b2b-a64e-c4ea251fa32f
           job-name=test-job
  Containers:
   pi:
    Image:      k8s.gcr.io/perl
... skipping 417 lines ...
status:
  loadBalancer: {}
Successful
message:kubectl-create kubectl-set
has:kubectl-set
I0407 14:34:53.126116   63831 namespace_controller.go:185] Namespace has been deleted test-jobs
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1020: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
core.sh:1033: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service "redis-master" deleted
core.sh:1040: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:1044: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
... skipping 122 lines ...
 (dry run)
daemonset.apps/bind rolled back (server dry run)
apps.sh:87: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E0407 14:35:03.209234   63831 daemon_controller.go:320] namespace-1617806100-21529/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1617806100-21529", SelfLink:"", UID:"02bddb6f-8221-4ee5-8447-fae005174571", ResourceVersion:"2021", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63753402901, loc:(*time.Location)(0x72f7400)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1617806100-21529\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=https://127.0.0.1:6443 --insecure-skip-tls-verify=true --match-server-version=true"}, 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry{v1.ManagedFieldsEntry{Manager:"kube-controller-manager", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0015b3578), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0015b3590)}, v1.ManagedFieldsEntry{Manager:"kubectl-client-side-apply", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0015b35a8), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0015b35c0)}, v1.ManagedFieldsEntry{Manager:"kubectl", Operation:"Update", APIVersion:"apps/v1", Time:(*v1.Time)(0xc0015b35d8), FieldsType:"FieldsV1", FieldsV1:(*v1.FieldsV1)(0xc0015b35f0)}}}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc00154f180), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", 
ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc002bc1748), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0003b7180), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc0015b3608), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil), SetHostnameAsFQDN:(*bool)(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc0032ae2c0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc002bc179c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:92: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:93: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:98: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
daemonset.apps/bind rolled back
apps.sh:101: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:102: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E0407 14:35:04.028765   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:103: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
Running command: run_rc_tests

... skipping 30 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 12 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 27 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1617806104-30921
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1224: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E0407 14:35:06.500195   63831 replica_set.go:201] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1617806104-30921  6d8c73e6-ac00-4bd7-b291-71113d53c3be 2059 2 2021-04-07 14:35:05 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  [{kube-controller-manager Update v1 2021-04-07 14:35:05 +0000 UTC FieldsV1 {"f:status":{"f:fullyLabeledReplicas":{},"f:observedGeneration":{},"f:replicas":{}}}} {kubectl-create Update v1 2021-04-07 14:35:05 +0000 UTC FieldsV1 {"f:metadata":{"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{"f:replicas":{},"f:selector":{".":{},"f:app":{},"f:tier":{}},"f:template":{".":{},"f:metadata":{".":{},"f:creationTimestamp":{},"f:labels":{".":{},"f:app":{},"f:tier":{}}},"f:spec":{".":{},"f:containers":{".":{},"k:{\"name\":\"php-redis\"}":{".":{},"f:env":{".":{},"k:{\"name\":\"GET_HOSTS_FROM\"}":{".":{},"f:name":{},"f:value":{}}},"f:image":{},"f:imagePullPolicy":{},"f:name":{},"f:ports":{".":{},"k:{\"containerPort\":80,\"protocol\":\"TCP\"}":{".":{},"f:containerPort":{},"f:protocol":{}}},"f:resources":{".":{},"f:requests":{".":{},"f:cpu":{},"f:memory":{}}},"f:terminationMessagePath":{},"f:terminationMessagePolicy":{}}},"f:dnsPolicy":{},"f:restartPolicy":{},"f:schedulerName":{},"f:securityContext":{},"f:terminationGracePeriodSeconds":{}}}}}}]},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002f17d38 <nil> ClusterFirst map[]   <nil>  false false false <nil> 
PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,FSGroupChangePolicy:nil,SeccompProfile:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] [] <nil>}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I0407 14:35:06.508652   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: frontend-7rttv"
core.sh:1228: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1232: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
core.sh:1236: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1240: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller/frontend scaled
I0407 14:35:07.083258   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/frontend" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-xrpp9"
core.sh:1244: Successful get rc frontend {{.spec.replicas}}: 3
(Bcore.sh:1248: Successful get rc frontend {{.spec.replicas}}: 3
... skipping 31 lines ...
deployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
deployment.apps/nginx-deployment created
I0407 14:35:09.511168   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-76b5cd66f5 to 3"
I0407 14:35:09.515775   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-lgslz"
I0407 14:35:09.522311   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-76b5cd66f5" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-76b5cd66f5-pqp6r"
... skipping 20 lines ...
pod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1349: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
core.sh:1350: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
E0407 14:35:12.213806   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "etcd-server" deleted
core.sh:1356: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicationcontroller "frontend" deleted
core.sh:1360: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1364: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
... skipping 17 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1387: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
core.sh:1391: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicationcontroller "frontend" deleted
core.sh:1400: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apiVersion: apps/v1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 24 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I0407 14:35:15.195595   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-748ddcb48b to 3"
I0407 14:35:15.201743   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-f8h9g"
I0407 14:35:15.204913   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-bsvk4"
I0407 14:35:15.206533   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-748ddcb48b-67kt6"
core.sh:1406: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
core.sh:1407: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1408: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment-resources resource requirements updated
I0407 14:35:15.607783   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-7bfb7d56b6 to 1"
I0407 14:35:15.614095   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources-7bfb7d56b6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-7bfb7d56b6-587q2"
core.sh:1411: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1412: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I0407 14:35:16.031308   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-resources-748ddcb48b to 2"
I0407 14:35:16.040447   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources-748ddcb48b" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-resources-748ddcb48b-f8h9g"
I0407 14:35:16.045139   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-resources-75dbcccf44 to 1"
I0407 14:35:16.049803   63831 event.go:291] "Event occurred" object="namespace-1617806104-30921/nginx-deployment-resources-75dbcccf44" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-resources-75dbcccf44-dcr75"
core.sh:1417: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
... skipping 155 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
core.sh:1428: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1429: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1430: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
... skipping 46 lines ...
                pod-template-hash=69dd6dcd84
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=69dd6dcd84
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 33 lines ...
    Mounts:       <none>
Volumes:          <none>
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
E0407 14:35:18.786256   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-nginx-apps" deleted
apps.sh:218: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-with-command created (dry run)
deployment.apps/nginx-with-command created (server dry run)
apps.sh:222: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
I0407 14:35:19.452157   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-with-command" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-with-command-978c69cb6 to 1"
I0407 14:35:19.458354   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-with-command-978c69cb6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-with-command-978c69cb6-b8wfx"
deployment.apps/nginx-with-command created
apps.sh:226: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
deployment.apps "nginx-with-command" deleted
apps.sh:232: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E0407 14:35:19.833980   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/deployment-with-unixuserid created
I0407 14:35:19.983855   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/deployment-with-unixuserid" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set deployment-with-unixuserid-6859c8dfcb to 1"
I0407 14:35:19.990971   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/deployment-with-unixuserid-6859c8dfcb" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: deployment-with-unixuserid-6859c8dfcb-n29n6"
apps.sh:236: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
apps.sh:243: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 46 lines ...
apps.sh:305: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
    Image:	k8s.gcr.io/nginx:test-cmd
deployment.apps/nginx rolled back (server dry run)
apps.sh:309: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
apps.sh:313: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:316: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
apps.sh:320: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx paused
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I0407 14:35:28.082358   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-54785cbcb8 to 2"
I0407 14:35:28.092368   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-54785cbcb8" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-54785cbcb8-trkzk"
I0407 14:35:28.099891   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-6b8f89f6b4 to 1"
I0407 14:35:28.108628   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-6b8f89f6b4" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-6b8f89f6b4-98cpc"
I0407 14:35:29.062900   63831 horizontal.go:361] Horizontal Pod Autoscaler frontend has been deleted in namespace-1617806104-30921
... skipping 81 lines ...
apps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
I0407 14:35:31.116926   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-6dd48b9849 to 1"
I0407 14:35:31.122085   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment-6dd48b9849" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-6dd48b9849-fh79z"
apps.sh:367: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:368: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
error: unable to find container named "redis"
deployment.apps/nginx-deployment image updated
apps.sh:373: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:374: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
deployment.apps/nginx-deployment image updated
apps.sh:377: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
apps.sh:378: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
... skipping 41 lines ...
I0407 14:35:35.228715   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment-5fbc8fbcbf" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-5fbc8fbcbf-w9xqs"
deployment.apps/nginx-deployment env updated
I0407 14:35:35.341498   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-b8c4df945 to 0"
I0407 14:35:35.358614   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment-b8c4df945" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-b8c4df945-m84d2"
I0407 14:35:35.363910   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-68d657fb6 to 1"
I0407 14:35:35.371523   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment-68d657fb6" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-68d657fb6-ks28c"
E0407 14:35:35.443414   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I0407 14:35:35.518151   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled down replica set nginx-deployment-59b7fccd97 to 0"
deployment.apps/nginx-deployment env updated
I0407 14:35:35.624744   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment-59b7fccd97" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulDelete" message="Deleted pod: nginx-deployment-59b7fccd97-2s4hp"
I0407 14:35:35.672079   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set nginx-deployment-57ddd474c4 to 1"
deployment.apps/nginx-deployment env updated
deployment.apps "nginx-deployment" deleted
I0407 14:35:35.779301   63831 event.go:291] "Event occurred" object="namespace-1617806117-9591/nginx-deployment-57ddd474c4" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: nginx-deployment-57ddd474c4-g287s"
configmap "test-set-env-config" deleted
E0407 14:35:35.971383   63831 replica_set.go:532] sync "namespace-1617806117-9591/nginx-deployment-59b7fccd97" failed with replicasets.apps "nginx-deployment-59b7fccd97" not found
secret "test-set-env-secret" deleted
+++ exit code: 0
E0407 14:35:36.019768   63831 replica_set.go:532] sync "namespace-1617806117-9591/nginx-deployment-57ddd474c4" failed with replicasets.apps "nginx-deployment-57ddd474c4" not found
Recording: run_rs_tests
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [0407 14:35:36] Creating namespace namespace-1617806136-6274
namespace/namespace-1617806136-6274 created
Context "test" modified.
+++ [0407 14:35:36] Testing kubectl(v1:replicasets)
E0407 14:35:36.229604   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:541: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
replicaset.apps/frontend created
I0407 14:35:36.545685   63831 event.go:291] "Event occurred" object="namespace-1617806136-6274/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-586sg"
+++ [0407 14:35:36] Deleting rs
I0407 14:35:36.550749   63831 event.go:291] "Event occurred" object="namespace-1617806136-6274/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-x4j8c"
I0407 14:35:36.550785   63831 event.go:291] "Event occurred" object="namespace-1617806136-6274/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-f2zgn"
... skipping 4 lines ...
I0407 14:35:37.071923   63831 event.go:291] "Event occurred" object="namespace-1617806136-6274/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-8hzf6"
I0407 14:35:37.076381   63831 event.go:291] "Event occurred" object="namespace-1617806136-6274/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-b4jks"
I0407 14:35:37.076418   63831 event.go:291] "Event occurred" object="namespace-1617806136-6274/frontend" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: frontend-h75lk"
apps.sh:555: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
+++ [0407 14:35:37] Deleting rs
replicaset.apps "frontend" deleted
E0407 14:35:37.289615   63831 replica_set.go:532] sync "namespace-1617806136-6274/frontend" failed with Operation cannot be fulfilled on replicasets.apps "frontend": StorageError: invalid object, Code: 4, Key: /registry/replicasets/namespace-1617806136-6274/frontend, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 8d53cdeb-0ac9-4956-a9c1-b02666116c25, UID in object meta: 
apps.sh:559: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:561: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
pod "frontend-8hzf6" deleted
pod "frontend-b4jks" deleted
pod "frontend-h75lk" deleted
apps.sh:564: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 16 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 11 lines ...
Namespace:    namespace-1617806136-6274
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 216 lines ...
horizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:702: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
Successful
message:kubectl-autoscale
has:kubectl-autoscale
horizontalpodautoscaler.autoscaling "frontend" deleted
Error: required flag(s) "max" not set
replicaset.apps "frontend" deleted
+++ exit code: 0
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
... skipping 61 lines ...
apps.sh:466: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:467: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
statefulset.apps/nginx rolled back
apps.sh:470: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:471: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:475: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
apps.sh:476: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
statefulset.apps/nginx rolled back
apps.sh:479: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
apps.sh:480: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
... skipping 61 lines ...
Name:         mock
Namespace:    namespace-1617806152-18699
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.4.1
    Port:         9949/TCP
... skipping 59 lines ...
Name:         mock
Namespace:    namespace-1617806152-18699
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.4.1
    Port:         9949/TCP
... skipping 59 lines ...
Name:         mock
Namespace:    namespace-1617806152-18699
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.4.1
    Port:         9949/TCP
... skipping 41 lines ...
Namespace:    namespace-1617806152-18699
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.4.1
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1617806152-18699
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:3.4.1
    Port:         9949/TCP
... skipping 71 lines ...
Events:            <none>
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
E0407 14:36:04.267468   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
service/mock edited
service/mock2 edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
generic-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
service/mock labeled
... skipping 27 lines ...
+++ [0407 14:36:06] Creating namespace namespace-1617806166-29506
namespace/namespace-1617806166-29506 created
Context "test" modified.
+++ [0407 14:36:06] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
E0407 14:36:07.216517   63831 pv_protection_controller.go:118] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
persistentvolume "pv0001" deleted
persistentvolume/pv0002 created
E0407 14:36:07.643901   63831 pv_protection_controller.go:118] PV pv0002 failed with : Operation cannot be fulfilled on persistentvolumes "pv0002": the object has been modified; please apply your changes to the latest version and try again
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
persistentvolume "pv0002" deleted
persistentvolume/pv0003 created
E0407 14:36:08.047110   63831 pv_protection_controller.go:118] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
persistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
persistentvolume/pv0001 created
E0407 14:36:08.556145   63831 pv_protection_controller.go:118] PV pv0001 failed with : Operation cannot be fulfilled on persistentvolumes "pv0001": the object has been modified; please apply your changes to the latest version and try again
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
persistentvolume "pv0001" deleted
has:warning: deleting cluster-scoped resources
Successful
... skipping 68 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 32 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 22 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
E0407 14:36:11.938949   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1539: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 39 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 31 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 30 lines ...
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
                    save-managers: true
CreationTimestamp:  Wed, 07 Apr 2021 14:30:02 +0000
Taints:             node.kubernetes.io/unreachable:NoSchedule
Unschedulable:      false
Lease:              Failed to get lease: leases.coordination.k8s.io "127.0.0.1" not found
Conditions:
  Type             Status    LastHeartbeatTime                 LastTransitionTime                Reason                   Message
  ----             ------    -----------------                 ------------------                ------                   -------
  Ready            Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  MemoryPressure   Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
  DiskPressure     Unknown   Wed, 07 Apr 2021 14:30:02 +0000   Wed, 07 Apr 2021 14:31:03 +0000   NodeStatusNeverUpdated   Kubelet never posted node status.
... skipping 118 lines ...
  "status": {
    "allowed": true,
    "reason": "RBAC: allowed by ClusterRoleBinding \"super-group\" of ClusterRole \"admin\" to Group \"the-group\""
  }
}
+++ exit code: 0
E0407 14:36:13.825068   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
Successful
message:yes
has:yes
... skipping 2 lines ...
yes
has:the server doesn't have a resource type
Successful
message:yes
has:yes
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
... skipping 59 lines ...
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
legacy-script.sh:846: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
legacy-script.sh:847: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
legacy-script.sh:848: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
legacy-script.sh:849: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
Successful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
... skipping 15 lines ...

+++ Running case: test-cmd.run_resource_aliasing_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [0407 14:36:16] Creating namespace namespace-1617806176-13625
namespace/namespace-1617806176-13625 created
E0407 14:36:17.092122   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [0407 14:36:17] Testing resource aliasing
replicationcontroller/cassandra created
I0407 14:36:17.360023   63831 event.go:291] "Event occurred" object="namespace-1617806176-13625/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-mrgqh"
I0407 14:36:17.365520   63831 event.go:291] "Event occurred" object="namespace-1617806176-13625/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-nbn6z"
service/cassandra created
... skipping 464 lines ...
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E0407 14:36:35.265279   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
... skipping 7 lines ...
message:pi:
has:pi:
Successful
message:127.0.0.1:
has:127.0.0.1:
node/127.0.0.1 untainted
E0407 14:36:36.112688   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I0407 14:36:36.345402   63831 event.go:291] "Event occurred" object="namespace-1617806194-5170/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-gnpbf"
I0407 14:36:36.352506   63831 event.go:291] "Event occurred" object="namespace-1617806194-5170/cassandra" kind="ReplicationController" apiVersion="v1" type="Normal" reason="SuccessfulCreate" message="Created pod: cassandra-wtb2b"
Successful
message:cassandra:
has:cassandra:
... skipping 422 lines ...
node-management.sh:89: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=<no value>:PreferNoSchedule
Successful
message:kubectl-create kube-controller-manager kubectl-taint
has:kubectl-taint
node/127.0.0.1 untainted
node/127.0.0.1 untainted
E0407 14:36:45.742291   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:96: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=<no value>:PreferNoSchedule
node/127.0.0.1 untainted
node-management.sh:100: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node-management.sh:104: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 cordoned (server dry run)
... skipping 6 lines ...
node-management.sh:115: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
node-management.sh:116: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:120: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:122: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
E0407 14:36:47.429918   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned (server dry run)
node/127.0.0.1 drained (server dry run)
node-management.sh:126: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
node/127.0.0.1 cordoned
node/127.0.0.1 drained
node-management.sh:130: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
... skipping 8 lines ...
message:node/127.0.0.1 already uncordoned (server dry run)
has:already uncordoned
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
node-management.sh:150: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
... skipping 14 lines ...
+++ [0407 14:36:49] Testing kubectl plugins
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"
error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo
error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable to read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 10 lines ...

+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [0407 14:36:49] Testing impersonation
Successful
message:error: requesting groups or user-extra for test-admin without impersonating a user
has:without impersonating a user
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
Warning: certificates.k8s.io/v1beta1 CertificateSigningRequest is deprecated in v1.19+, unavailable in v1.22+; use certificates.k8s.io/v1 CertificateSigningRequest
... skipping 40 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_debug_pod_tests
+++ [0407 14:36:53] Creating namespace namespace-1617806213-7896
namespace/namespace-1617806213-7896 created
Context "test" modified.
+++ [0407 14:36:53] Testing kubectl debug (pod tests)
E0407 14:36:53.850726   63831 reflector.go:138] k8s.io/client-go/metadata/metadatainformer/informer.go:90: Failed to watch *v1.PartialObjectMetadata: failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/target created
debug.sh:32: Successful get pod {{range.items}}{{.metadata.name}}:{{end}}: target:
I0407 14:36:54.003725   60095 client.go:360] parsed scheme: "passthrough"
I0407 14:36:54.003797   60095 passthrough.go:48] ccResolverWrapper: sending update to cc: {[{http://127.0.0.1:2379  <nil> 0 <nil>}] <nil> <nil>}
I0407 14:36:54.003811   60095 clientconn.go:948] ClientConn switching balancer to "pick_first"
debug.sh:36: Successful get pod {{range.items}}{{.metadata.name}}:{{end}}: target:target-copy:
... skipping 66 lines ...
I0407 14:36:57.367154   60095 establishing_controller.go:87] Shutting down EstablishingController
I0407 14:36:57.367166   60095 naming_controller.go:302] Shutting down NamingConditionController
I0407 14:36:57.367311   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.367311   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.367401   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.367449   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0407 14:36:57.367487   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0407 14:36:57.367500   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0407 14:36:57.367506   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 60 lines ...
E0407 14:36:57.369285   60095 controller.go:184] rpc error: code = Unavailable desc = transport is closing
I0407 14:36:57.369388   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.369420   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.369443   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.369455   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.369469   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0407 14:36:57.369487   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:57.369491   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0407 14:36:57.369596   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
I0407 14:36:57.369611   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
W0407 14:36:57.369617   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:57.369679   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
I0407 14:36:57.369685   60095 clientconn.go:897] blockingPicker: the picked transport is not ready, loop back to repick
... (the two messages above repeated verbatim at 14:36:57; duplicates elided)
junit report dir: /logs/artifacts
+++ [0407 14:36:57] Clean up complete
+ make test-integration
W0407 14:36:58.368599   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368599   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368659   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368686   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368599   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368715   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368731   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368745   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368771   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368788   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368803   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368823   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368960   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368975   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.368986   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369193   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369464   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369478   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369483   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369505   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369548   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369604   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369616   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369628   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.369602   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370425   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370437   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370446   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370446   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370512   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370515   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370554   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370571   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370604   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370626   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370660   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370680   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370722   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370731   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370745   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370781   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370786   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370830   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370848   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370883   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370891   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370891   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370917   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370942   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370944   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370971   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370978   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.370848   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.371032   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.371045   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.371069   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W0407 14:36:58.371086   60095 clientconn.go:1223] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379  <nil> 0 <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 22 lines ...
+++ [0407 14:37:02] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [0407 14:37:02] Starting etcd instance
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.0DHczciXJJ --listen-client-urls http://127.0.0.1:2379 --log-level=debug > "/logs/artifacts/etcd.226a5348-97ac-11eb-a23f-1264af243897.root.log.DEBUG.20210407-143702.99512" 2>/dev/null
Waiting for etcd to come up.
+++ [0407 14:37:02] On try 2, etcd: : {"health":"true"}
{"header":{"cluster_id":"14841639068965178418","member_id":"10276657743932975437","revision":"2","raft_term":"2"}}
+++ [0407 14:37:02] Running integration test cases
+++ [0407 14:37:07] Running tests without code coverage
{"Time":"2021-04-07T14:40:08.940235019Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/podlogs","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/podlogs\t7.668s\n"}
{"Time":"2021-04-07T14:40:32.989435741Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"meout.go:222 +0xb2\\nnet/http.Error(0x7ff3f43ff3b0, 0xc004e01f28, 0xc004f41320, 0x60, 0x1f4)\\n\\t/usr/local/go/src/net/http/server.go:2081 +0x1f6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.InternalError(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdd00, 0x5615e60, 0xc00531e4f8)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/errors.go:75 +0x11a\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdd00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:69 +0x53a\\nnet/http.HandlerFunc.ServeHTTP(0xc0068d2f80, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdd00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bd"}
{"Time":"2021-04-07T14:40:32.989445209Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0068d2fc0, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdd00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdd00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0068e6d50, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdd00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/a"}
{"Time":"2021-04-07T14:40:32.989472087Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"piserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc0099dfa20, 0xc009342cef, 0xc001e5cf50)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc0099dfa20, 0xc001e5cf50, 0xc00557f4d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0068c3100, 0x5674bd8, 0xc00580ac90, 0xc0099df8c0, 0x56756c8, 0xc0049e3500, 0xc00557f3a0, "}
{"Time":"2021-04-07T14:40:32.989481769Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"0xc00557f3b0, 0xc005ac7020)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc0068e6d80, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0068d3000, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/usr/local/go/src/net/http/ser"}
{"Time":"2021-04-07T14:40:32.989490716Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"ver.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0068e6db0, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc0068d3040, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/pro"}
{"Time":"2021-04-07T14:40:32.98950024Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestSelfSubjectAccessReview","Output":"w/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0068d3080, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0068e6de0, 0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc004e01f28, 0xc0015bdc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.Han"}
{"Time":"2021-04-07T14:40:34.902963948Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"er/pkg/server/filters/timeout.go:222 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc02488c510, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc024866c60, 0xc00f556000, 0xc0, 0x2dcf, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc022cb71d8, 0x69355c0, 0xc02486c820, 0x232b69a, 0x69dbe60)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0000d4370, 0x72932c8, 0xc02486c820, 0x72851c0, 0xc024866c60, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/"}
... skipping 3 lines ...
{"Time":"2021-04-07T14:40:34.902999963Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"2488c480, 0xc00da364d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0111f9830, 0x7fa4baee3d60, 0xc01379e180, 0xc024865700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x69f411e, 0xe, 0xc0111f9830, 0xc00ccfdf80, 0x7fa4baee3d60, 0xc01379e180, 0xc024865700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterl"}
{"Time":"2021-04-07T14:40:34.903010366Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"server/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc009f7d440, 0x7fa4baee3d60, 0xc01379e180, 0xc024865700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0111f5ec0, 0x7fa4baee3d60, 0xc01379e180, 0xc024865700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuberne"}
{"Time":"2021-04-07T14:40:34.90301723Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"tes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc0248628f0, 0xc022cb8cef, 0xc00da36460)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc0248628f0, 0xc00da36460, 0xc02483dc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc011322000, 0x72eb9f8, 0xc02488c300, 0xc024862790, 0x72eca60, 0xc0248de040, 0xc02483dad0, 0xc02483dae0, 0xc0248669c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kuber"}
{"Time":"2021-04-07T14:40:34.903025444Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"netes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc0111f5ef0, 0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc009f7d480, 0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackComple"}
{"Time":"2021-04-07T14:40:34.903045217Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"nts/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc009f7d500, 0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0111f5f50, 0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc009f7d540, 0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/usr/local/go/src/ne"}
{"Time":"2021-04-07T14:40:34.903053747Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"t/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0111f5fb0, 0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc00a092720, 0x7fa4baee3d60, 0xc01379e180, 0xc024865500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa4baee3d60, 0xc01379e180, 0xc024865400)\\n"}
{"Time":"2021-04-07T14:40:34.903063124Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=","Output":"\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc009f7d580, 0x7fa4baee3d60, 0xc01379e180, 0xc024865400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0244ebe00, 0xc0111da3c0, 0x72ecb40, 0xc01379e180, 0xc024865400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"resourceV"}
{"Time":"2021-04-07T14:40:34.904858176Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"o/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:222 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc02488d0b0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0248672c0, 0xc00f556000, 0xc0, 0x2dcf, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc022cb71d8, 0x69355c0, 0xc02486ce60, 0x232b69a, 0x69dbe60)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0000d4370, 0x72932c8, 0xc02486ce60, 0x72851c0, 0xc0248672c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k"}
{"Time":"2021-04-07T14:40:34.904867124Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0000d4370, 0x72932c8, 0xc02486ce60, 0x72851c0, 0xc0248672c0, 0x5a1b458, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc02486cf00, 0x72932c8, 0xc02486ce60, 0x72851c0, 0xc0248672c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc02486cf00, 0x72932c8, 0xc02486ce60, 0x72851c0, 0xc0248672c0, 0xc000103380, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local"}
{"Time":"2021-04-07T14:40:34.904874858Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x69fab88, 0x10, 0x7fa4d88addc0, 0xc02486cf00, 0x72e6b58, 0xc01379e250, 0xc0249c0300, 0x1f4, 0x72932c8, 0xc02486ce60)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x72e9738, 0xc01aee2080, 0x72e9948, 0x9787970, 0x69dd5e3, 0x4, 0x69dbe60, 0x2, 0x72e6b58, 0xc01379e250, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x7289e00, 0xc024830f78, 0x72e9738, 0xc01aee2080, 0x69dd5e3"}
{"Time":"2021-04-07T14:40:34.90489839Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"ics.InstrumentRouteFunc.func1(0xc02488d020, 0xc00da36a80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0111f9830, 0x7fa4baee3d60, 0xc01379e238, 0xc0249c0300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x69f411e, 0xe, 0xc0111f9830, 0xc00ccfdf80, 0x7fa4baee3d60, 0xc01379e238, 0xc0249c0300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s."}
{"Time":"2021-04-07T14:40:34.904914867Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc009f7d440, 0x7fa4baee3d60, 0xc01379e238, 0xc0249c0300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa4baee3d60, 0xc01379e238, 0xc0249c0300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0111f5ec0, 0x7fa4baee3d60, 0xc01379e238, 0xc0249c0300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n"}
{"Time":"2021-04-07T14:40:34.904924209Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Test":"TestListOptions/watchCacheEnabled=true/limit=0_continue=empty_rv=invalid_rvMatch=NotOlderThan","Output":"\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc024862c60, 0xc022cb8cef, 0xc00da36a10)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc024862c60, 0xc00da36a10, 0xc02499c680)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc011322000, 0x72eb9f8, 0xc02488cea0, 0xc024862b00, 0x72eca60, 0xc0248de400, 0xc02499c550, 0xc02499c560, 0xc024867020)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
... skipping 28 lines ...
{"Time":"2021-04-07T14:41:06.603190635Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"o/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc006098ea0, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4da52a1, 0xe, 0xc006098ea0, 0xc01b9dc5b0, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc011"}
{"Time":"2021-04-07T14:41:06.60319804Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"813520, 0xc01b897100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d79e0, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f380, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.g"}
{"Time":"2021-04-07T14:41:06.603210026Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"o:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f3c0, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d7a10, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/"}
{"Time":"2021-04-07T14:41:06.603217763Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc01b54c580, 0xc01a1d8cef, 0xc01aa7caf0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc01b54c580, 0xc01aa7caf0, 0xc01f5bd050)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc020b11c00, 0x5674bd8, 0xc01fe5b830, 0xc01b54c4d0, 0x56756c8, 0xc01f051300, 0xc01f5bcf70, 0xc01f5bcf80, 0xc01b89c6c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol"}
{"Time":"2021-04-07T14:41:06.603224744Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b897000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d7a40, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b897000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f400, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b8970"}
{"Time":"2021-04-07T14:41:06.603232376Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d7a70, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b897000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b896f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:175 +0xcd0\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f440, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b896f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc011813520, 0xc01b896f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/"}
{"Time":"2021-04-07T14:41:06.603264624Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"l/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f500, 0x7ff3f43ff3b0, 0xc011813520, 0xc01b896d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc01b578f00, 0xc011add7b8, 0x56757a8, 0xc011813520, 0xc01b896d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\n\u0026{bob 2 [system:authenticated] map[]} is acting as \u0026{alice  [system:authenticated] map[]}\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\""}
{"Time":"2021-04-07T14:41:06.808543882Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0206ad6e0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc01f27d200, 0xc005452a80, 0xbb, 0x9e4, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc000126fa0, 0x4ce7a60, 0xc01f730c80, 0x8138fa, 0x4d8cca6)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000388370, 0x561f0e0, 0xc01f730c80, 0x5611ba0, 0xc01f27d200, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachi"}
{"Time":"2021-04-07T14:41:06.808551985Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"nery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000388370, 0x561f0e0, 0xc01f730c80, 0x5611ba0, 0xc01f27d200, 0x3df7931, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc01f730d20, 0x561f0e0, 0xc01f730c80, 0x5611ba0, 0xc01f27d200, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc01f730d20, 0x561f0e0, 0xc01f730c80, 0x5611ba0, 0xc01f27d200, 0xc000583380, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ve"}
{"Time":"2021-04-07T14:41:06.808559063Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"rsioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4dabc2e, 0x10, 0x7ff3f4d078b8, 0xc01f730d20, 0x566ff78, 0xc019326f78, 0xc01f737100, 0x1f4, 0x561f0e0, 0xc01f730c80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5672b28, 0xc00db3fe00, 0x5672d38, 0x7ad6340, 0x0, 0x0, 0x4d8cca6, 0x2, 0x566ff78, 0xc019326f78, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x560b740, 0xc01fcdb200, 0x5672b28, 0xc00db3fe00, 0x0, 0x0, 0x4d8cca6, 0x2, 0x566ff78, 0xc019326f78, ...)\\n\\t/home/prow/go/src/k8s.io/kubernete"}
{"Time":"2021-04-07T14:41:06.808568121Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"s/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x566ff78, 0xc019326f78, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:96 +0x1b45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc0206ad650, 0xc01f81a000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1209 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0206ad650, 0xc01f81a000)\\n\\t/home/prow/go/src/k8s.i"}
{"Time":"2021-04-07T14:41:06.808583242Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"o/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc006098ea0, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4da52a1, 0xe, 0xc006098ea0, 0xc01b9dc5b0, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc019"}
{"Time":"2021-04-07T14:41:06.808622822Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"326f58, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d79e0, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f380, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.g"}
{"Time":"2021-04-07T14:41:06.808631469Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"o:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f3c0, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d7a10, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737100)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/"}
{"Time":"2021-04-07T14:41:06.808639484Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc01f009290, 0xc000128cef, 0xc01ee29f80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc01f009290, 0xc01ee29f80, 0xc01fcdb0e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc020b11c00, 0x5674bd8, 0xc0206ad440, 0xc01f0091e0, 0x56756c8, 0xc01f45bb00, 0xc01fcdaff0, 0xc01fcdb000, 0xc01f27cfc0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol"}
{"Time":"2021-04-07T14:41:06.808646307Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d7a40, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f400, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f7370"}
{"Time":"2021-04-07T14:41:06.808654512Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0113d7a70, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f737000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f736f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:175 +0xcd0\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f440, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f736f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc019326f58, 0xc01f736f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/"}
{"Time":"2021-04-07T14:41:06.808680358Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestImpersonateIsForbidden","Output":"l/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc00db3f500, 0x7ff3f43ff3b0, 0xc019326f58, 0xc01f736d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc01f273020, 0xc011add7b8, 0x56757a8, 0xc019326f58, 0xc01f736d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\n\u0026{bob 2 [system:authenticated] map[]} is acting as \u0026{system:serviceaccount:default:default  [system:serviceaccounts system:serviceaccounts:default system:authenticated] map[]}\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\"}
{"Time":"2021-04-07T14:41:16.679675944Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/configmap","Output":"ok  \tk8s.io/kubernetes/test/integration/configmap\t4.834s\n"}
{"Time":"2021-04-07T14:41:28.071817697Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"o/apiserver/pkg/server/filters/timeout.go:222 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc06026eae0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc05f35bc20, 0xc03c77e000, 0x7d, 0x4e655f, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0013d0ba8, 0x4c4d960, 0xc05f34d2c0, 0x60fc3a, 0x549b2c6)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0003ae140, 0x5576d20, 0xc05f34d2c0, 0x5569c00, 0xc05f35bc20, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output"}
{"Time":"2021-04-07T14:41:28.07182598Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0003ae140, 0x5576d20, 0xc05f34d2c0, 0x5569c00, 0xc05f35bc20, 0x3d83250, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc05f34d360, 0x5576d20, 0xc05f34d2c0, 0x5569c00, 0xc05f35bc20, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc05f34d360, 0x5576d20, 0xc05f34d2c0, 0x5569c00, 0xc05f35bc20, 0xc0006e4300, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernete"}
{"Time":"2021-04-07T14:41:28.071833991Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"s/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4d0fcfe, 0x10, 0x7fa0f9e5f680, 0xc05f34d360, 0x55c66f8, 0xc05ea6c840, 0xc061b7a000, 0x1f4, 0x5576d20, 0xc05f34d2c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x55c9158, 0xc0540971c0, 0x55c9368, 0x7936900, 0x0, 0x0, 0x4cf1cda, 0x2, 0x55c66f8, 0xc05ea6c840, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x556d180, 0xc05f363008, 0x55c9158, 0xc0540971c0, 0x0, 0x0, 0x4cf1cda, 0x2, 0x55c66f8, 0x"}
{"Time":"2021-04-07T14:41:28.071844047Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"c05ea6c840, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.PatchResource.func1(0x55c66f8, 0xc05ea6c840, 0xc061b7a000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/patch.go:225 +0x211f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulPatchResource.func1(0xc06026ea50, 0xc061b76150)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1227 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0x"}
{"Time":"2021-04-07T14:41:28.071865684Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"c06026ea50, 0xc061b76150)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc054d94ab0, 0x7fa0f948bb38, 0xc05ea6c828, 0xc061b7a000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d094c4, 0xe, 0xc054d94ab0, 0xc011d8f260, 0x7fa0f948bb38, 0xc05ea6c828, 0xc061b7a000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filte"}
{"Time":"2021-04-07T14:41:28.071881608Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"piserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc054096500, 0x7fa0f948bb38, 0xc05ea6c828, 0xc061b7a000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc061b7a000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc04b4f0780, 0x7fa0f948bb38, 0xc05ea6c828, 0xc061b7a000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuber"}
{"Time":"2021-04-07T14:41:28.071890257Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"netes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc05f18f1e0, 0xc0013d2cef, 0xc061b760e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc05f18f1e0, 0xc061b760e0, 0xc061b781a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc049cb8a00, 0x55cb1f8, 0xc06026e900, 0xc05f18f080, 0x55cbce8, 0xc05f1b4b80, 0xc061b78060, 0xc061b78070, 0xc05f1bf9e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kub"}
{"Time":"2021-04-07T14:41:28.071898206Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"ernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc04b4f08a0, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc054096540, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackComp"}
{"Time":"2021-04-07T14:41:28.075038349Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"oints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0540965c0, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc04b4f0a50, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc054096640, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/usr/local/go/src/"}
{"Time":"2021-04-07T14:41:28.075129245Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc04b4f0ae0, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3f00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc032e673e0, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3d00)"}
{"Time":"2021-04-07T14:41:28.075140978Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestCreateVeryLargeObject","Output":"\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc054096680, 0x7fa0f948bb38, 0xc05ea6c828, 0xc05f1b3d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc05f1a2c60, 0xc02e2de678, 0x55cbdc8, 0xc05ea6c828, 0xc05f1b3d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"etcdser"}
{"Time":"2021-04-07T14:41:30.030210489Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc038ce9fb0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc038ceed80, 0xc01fc28000, 0xbb, 0x12bd, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0357f6fa0, 0x4ce7a60, 0xc038cfcc80, 0x8138fa, 0x4d8cca6)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000388370, 0x561f0e0, 0xc038cfcc80, 0x5611ba0, 0xc038ceed80, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pk"}
{"Time":"2021-04-07T14:41:30.030234069Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"g/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000388370, 0x561f0e0, 0xc038cfcc80, 0x5611ba0, 0xc038ceed80, 0x3df7931, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc038cfcd20, 0x561f0e0, 0xc038cfcc80, 0x5611ba0, 0xc038ceed80, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc038cfcd20, 0x561f0e0, 0xc038cfcc80, 0x5611ba0, 0xc038ceed80, 0xc000583380, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versionin"}
{"Time":"2021-04-07T14:41:30.030242452Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"g/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4dabc2e, 0x10, 0x7ff3f4d078b8, 0xc038cfcd20, 0x566ff78, 0xc038662b10, 0xc038e2b700, 0x1f4, 0x561f0e0, 0xc038cfcc80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5672b28, 0xc02a4dff40, 0x5672d38, 0x7ad6340, 0x0, 0x0, 0x4d8cca6, 0x2, 0x566ff78, 0xc038662b10, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x560b740, 0xc038cf2e50, 0x5672b28, 0xc02a4dff40, 0x0, 0x0, 0x4d8cca6, 0x2, 0x566ff78, 0xc038662b10, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_outp"}
{"Time":"2021-04-07T14:41:30.030261197Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"ut/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x566ff78, 0xc038662b10, 0xc038e2b700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:96 +0x1b45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc038ce9f20, 0xc038cf68c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1209 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc038ce9f20, 0xc038cf68c0)\\n\\t/home/prow/go/src/k8s.io/kuber"}
{"Time":"2021-04-07T14:41:30.030272346Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"netes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc032e227e0, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4da52a1, 0xe, 0xc032e227e0, 0xc02afc2930, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc038662af8,"}
{"Time":"2021-04-07T14:41:30.030280609Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":" 0xc038e2b700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc032e20ed0, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc02a4ded00, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b700)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b700)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0"}
{"Time":"2021-04-07T14:41:30.030302354Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc038cada20, 0xc0357f8cef, 0xc038cf6850)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc038cada20, 0xc038cf6850, 0xc038cf2d30)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc032d4dd00, 0x5674bd8, 0xc038ce9da0, 0xc038cad8c0, 0x56756c8, 0xc038d76280, 0xc038cf2bf0, 0xc038cf2c00, 0xc038ceeb40)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_fi"}
{"Time":"2021-04-07T14:41:30.030311069Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"lter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc032e20f30, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc02a4dee40, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t"}
{"Time":"2021-04-07T14:41:30.0303229Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc032e20f60, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc02a4def80, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.Ha"}
{"Time":"2021-04-07T14:41:30.030350352Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticator","Output":"c/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc02a4df440, 0x7ff3f43ff3b0, 0xc038662af8, 0xc038e2b400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc038ce5020, 0xc032d43e78, 0x56757a8, 0xc038662af8, 0xc038e2b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"couldn't get version/kind; json parse error: invalid character '%'"}
{"Time":"2021-04-07T14:41:34.44840351Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"o/apiserver/pkg/server/filters/timeout.go:222 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc053dc3ce0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc03b4dc3c0, 0xc04c7e0000, 0x7d, 0x3c2bb7, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc065510ba8, 0x4c4d960, 0xc03a4e34a0, 0x60fc3a, 0x4cf1cda)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0003ae140, 0x5576d20, 0xc03a4e34a0, 0x5569c00, 0xc03b4dc3c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output"}
{"Time":"2021-04-07T14:41:34.448413056Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0003ae140, 0x5576d20, 0xc03a4e34a0, 0x5569c00, 0xc03b4dc3c0, 0x3d83250, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc03a4e3540, 0x5576d20, 0xc03a4e34a0, 0x5569c00, 0xc03b4dc3c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc03a4e3540, 0x5576d20, 0xc03a4e34a0, 0x5569c00, 0xc03b4dc3c0, 0xc0006e4300, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernete"}
{"Time":"2021-04-07T14:41:34.448421896Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"s/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4d0fcfe, 0x10, 0x7fa0f9e5f680, 0xc03a4e3540, 0x55c66f8, 0xc0272383d8, 0xc05ef3b500, 0x1f4, 0x5576d20, 0xc03a4e34a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x55c9158, 0xc064216940, 0x55c9368, 0x7936900, 0x0, 0x0, 0x4cf1cda, 0x2, 0x55c66f8, 0xc0272383d8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x556d180, 0xc05daf9938, 0x55c9158, 0xc064216940, 0x0, 0x0, 0x4cf1cda, 0x2, 0x55c66f8, 0x"}
{"Time":"2021-04-07T14:41:34.448432147Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"c0272383d8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.PatchResource.func1(0x55c66f8, 0xc0272383d8, 0xc05ef3b500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/patch.go:225 +0x211f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulPatchResource.func1(0xc053dc3c50, 0xc04605a5b0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1227 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0x"}
{"Time":"2021-04-07T14:41:34.448440343Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"c053dc3c50, 0xc04605a5b0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0642e1440, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d094c4, 0xe, 0xc0642e1440, 0xc05f3d9180, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filte"}
{"Time":"2021-04-07T14:41:34.448460782Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"piserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0642d8a80, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0642dba70, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kuber"}
{"Time":"2021-04-07T14:41:34.448469685Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"netes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc05453e2c0, 0xc065512cef, 0xc04605a540)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc05453e2c0, 0xc04605a540, 0xc03b4baad0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc05f6eed00, 0x55cb1f8, 0xc053dc3b00, 0xc05453e160, 0x55cbce8, 0xc0492aa200, 0xc03b4ba990, 0xc03b4ba9a0, 0xc03a4e5a40)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kub"}
{"Time":"2021-04-07T14:41:34.448477802Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"ernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc0642dbaa0, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0642d8ac0, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackComp"}
{"Time":"2021-04-07T14:41:34.448494921Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"oints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0642d8b40, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0642dbb00, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc0642d8b80, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/usr/local/go/src/"}
{"Time":"2021-04-07T14:41:34.448503424Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc0642dbb60, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc0642aa900, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b200)"}
{"Time":"2021-04-07T14:41:34.448514543Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestUpdateVeryLargeObject","Output":"\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc0642d8bc0, 0x7fa0f948bb38, 0xc0272383c0, 0xc05ef3b200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc05819ce40, 0xc0642d5980, 0x55cbdc8, 0xc0272383c0, 0xc05ef3b200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"etcdser"}
{"Time":"2021-04-07T14:41:34.853355199Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc03f554de0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc03f52df80, 0xc017a4d500, 0xbb, 0x1329, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc03ed26fa0, 0x4ce7a60, 0xc03f540c80, 0x8138fa, 0x4d8cca6)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc000388370, 0x561f0e0, 0xc03f540c80, 0x5611ba0, 0xc03f52df80, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pk"}
{"Time":"2021-04-07T14:41:34.853364688Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"g/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc000388370, 0x561f0e0, 0xc03f540c80, 0x5611ba0, 0xc03f52df80, 0x3df7931, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc03f540d20, 0x561f0e0, 0xc03f540c80, 0x5611ba0, 0xc03f52df80, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc03f540d20, 0x561f0e0, 0xc03f540c80, 0x5611ba0, 0xc03f52df80, 0xc000583380, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versionin"}
{"Time":"2021-04-07T14:41:34.853373362Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"g/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4dabc2e, 0x10, 0x7ff3f4d078b8, 0xc03f540d20, 0x566ff78, 0xc03eac9128, 0xc03f55ab00, 0x1f4, 0x561f0e0, 0xc03f540c80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x5672b28, 0xc039511b00, 0x5672d38, 0x7ad6340, 0x0, 0x0, 0x4d8cca6, 0x2, 0x566ff78, 0xc03eac9128, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x560b740, 0xc03f550990, 0x5672b28, 0xc039511b00, 0x0, 0x0, 0x4d8cca6, 0x2, 0x566ff78, 0xc03eac9128, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_outp"}
{"Time":"2021-04-07T14:41:34.853381593Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"ut/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.DeleteResource.func1(0x566ff78, 0xc03eac9128, 0xc03f55ab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/delete.go:96 +0x1b45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulDeleteResource.func1(0xc03f554d50, 0xc03f36de30)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1209 +0x83\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc03f554d50, 0xc03f36de30)\\n\\t/home/prow/go/src/k8s.io/kuber"}
{"Time":"2021-04-07T14:41:34.8533914Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"netes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0395278c0, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55ab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4da52a1, 0xe, 0xc0395278c0, 0xc0394e7810, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55ab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc03eac9110,"}
{"Time":"2021-04-07T14:41:34.853399553Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":" 0xc03f55ab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc03952e510, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55ab00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55ab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc039511180, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55ab00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55ab00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0"}
{"Time":"2021-04-07T14:41:34.853475917Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc03f54a630, 0xc03ed28cef, 0xc03f36ddc0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc03f54a630, 0xc03f36ddc0, 0xc03f550870)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0391e3f00, 0x5674bd8, 0xc03f554bd0, 0xc03f54a4d0, 0x56756c8, 0xc03f586480, 0xc03f550730, 0xc03f550740, 0xc03f52dd40)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_fi"}
{"Time":"2021-04-07T14:41:34.853484513Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"lter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc03952e570, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc039511200, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t"}
{"Time":"2021-04-07T14:41:34.853493519Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc03952e5a0, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc039511240, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55aa00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.Ha"}
{"Time":"2021-04-07T14:41:34.853530632Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/auth","Test":"TestWebhookTokenAuthenticatorCustomDial","Output":"c/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc039511300, 0x7ff3f43ff3b0, 0xc03eac9110, 0xc03f55a800)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc03f5528a0, 0xc0391db2f0, 0x56757a8, 0xc03eac9110, 0xc03f55a800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"couldn't get version/kind; json parse error: invalid character '%'"}
{"Time":"2021-04-07T14:41:46.477860132Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"/apiserver/pkg/server/filters/timeout.go:222 +0xb2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc007ac0960, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc06e7e2e40, 0xc07960e000, 0x7d, 0x111e93, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0383b4ba8, 0x4c4d960, 0xc058f770e0, 0x60fc3a, 0x4cf1cda)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0003ae140, 0x5576d20, 0xc058f770e0, 0x5569c00, 0xc06e7e2e40, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/"}
{"Time":"2021-04-07T14:41:46.477871282Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0003ae140, 0x5576d20, 0xc058f770e0, 0x5569c00, 0xc06e7e2e40, 0x3d83250, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc058f77400, 0x5576d20, 0xc058f770e0, 0x5569c00, 0xc06e7e2e40, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc058f77400, 0x5576d20, 0xc058f770e0, 0x5569c00, 0xc06e7e2e40, 0xc0006e4300, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:41:46.477879067Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4d0fcfe, 0x10, 0x7fa0f9e5f680, 0xc058f77400, 0x55c66f8, 0xc06df48030, 0xc002f48800, 0x1f4, 0x5576d20, 0xc058f770e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x55c9158, 0xc05f3011c0, 0x55c9368, 0x7936900, 0x0, 0x0, 0x4cf1cda, 0x2, 0x55c66f8, 0xc06df48030, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x556d180, 0xc010b0c000, 0x55c9158, 0xc05f3011c0, 0x0, 0x0, 0x4cf1cda, 0x2, 0x55c66f8, 0xc"}
{"Time":"2021-04-07T14:41:46.477887907Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"06df48030, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.PatchResource.func1(0x55c66f8, 0xc06df48030, 0xc002f48800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/patch.go:225 +0x211f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulPatchResource.func1(0xc007ac08d0, 0xc06dc3e150)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1227 +0x99\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc"}
{"Time":"2021-04-07T14:41:46.477896309Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"007ac08d0, 0xc06dc3e150)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc05f35c000, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4d094c4, 0xe, 0xc05f35c000, 0xc044af0bd0, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filter"}
{"Time":"2021-04-07T14:41:46.477919101Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"iserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc05f3006c0, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48800)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48800)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc053fa42a0, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48800)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubern"}
{"Time":"2021-04-07T14:41:46.477931526Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"etes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc014dee160, 0xc0383b6cef, 0xc06dc3e0e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc014dee160, 0xc06dc3e0e0, 0xc00dcbc2e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc03a4ee800, 0x55cb1f8, 0xc007ac06f0, 0xc014dee000, 0x55cbce8, 0xc06e604100, 0xc00dcbc100, 0xc00dcbc110, 0xc06e7e2300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kube"}
{"Time":"2021-04-07T14:41:46.477940363Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"rnetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc053fa42d0, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc05f300740, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompl"}
{"Time":"2021-04-07T14:41:46.477957673Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"ints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc05f3007c0, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc053fa4330, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc05f300800, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/usr/local/go/src/n"}
{"Time":"2021-04-07T14:41:46.477966823Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"et/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc053fa4390, 0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.withAuthentication.func1(0x7fa0f948bb38, 0xc06df48018, 0xc002f48600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authentication.go:80 +0x75c\\nnet/http.HandlerFunc.ServeHTTP(0xc03a4e9da0, 0x7fa0f948bb38, 0xc06df48018, 0xc008283c00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fa0f948bb38, 0xc06df48018, 0xc008283600)\\"}
{"Time":"2021-04-07T14:41:46.477977297Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/apply","Test":"TestPatchVeryLargeObject","Output":"n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc05f300840, 0x7fa0f948bb38, 0xc06df48018, 0xc008283600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc06e7fe1e0, 0xc05d910390, 0x55cbdc8, 0xc06df48018, 0xc008283600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"etcdserv"}
{"Time":"2021-04-07T14:42:02.54244296Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/cronjob","Output":"ok  \tk8s.io/kubernetes/test/integration/cronjob\t35.736s\n"}
{"Time":"2021-04-07T14:42:17.992654544Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/flowcontrol","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/flowcontrol\t137.697s\n"}
{"Time":"2021-04-07T14:42:44.329053977Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver\t157.741s\n"}
{"Time":"2021-04-07T14:42:52.362812145Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/certificates","Output":"ok  \tk8s.io/kubernetes/test/integration/certificates\t140.798s\n"}
{"Time":"2021-04-07T14:43:00.443449004Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/defaulttolerationseconds","Output":"ok  \tk8s.io/kubernetes/test/integration/defaulttolerationseconds\t5.163s\n"}
{"Time":"2021-04-07T14:43:02.032982777Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/client","Test":"TestPatch","Output":"        \u0004name\u0012\u0005image*\u0000B\u0000j\u0014/dev/termination-logr\u0006Always\ufffd\u0001\u0000\ufffd\u0001\u0000\ufffd\u0001\u0000\ufffd\u0001\u0004File\u001a\u0006Always \u001e2\u000cClusterFirstB\u0000J\u0000R\u0000X\u0000`\u0000h\u0000r\u0000\ufffd\u0001\u0000\ufffd\u0001\u0000\ufffd\u0001\u0011default-scheduler\ufffd\u00016\n"}
... skipping 172 lines ...
{"Time":"2021-04-07T14:46:13.582984489Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0084"}
{"Time":"2021-04-07T14:46:13.582995598Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"711c0, 0xc009136600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go"}
{"Time":"2021-04-07T14:46:13.583005405Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":":71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc0084711c0, 0xc009136600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k"}
{"Time":"2021-04-07T14:46:13.583017891Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc00644a000, 0xc0091a0cef, 0xc001a64d90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc00644a000, 0xc001a64d90, 0xc0053054a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc009133170, 0xc006edfe40, 0x52827a8, 0xc00504d340, 0xc005305370, 0xc005305380, 0xc0023c7200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/"}
{"Time":"2021-04-07T14:46:13.583027379Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0084711c0, 0xc00913650"}
{"Time":"2021-04-07T14:46:13.583039255Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0084711c0, 0xc009136500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/h"}
{"Time":"2021-04-07T14:46:13.583199462Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc0084711c0, 0xc009136300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0090133e0, 0xc00051bfe0, 0x5282888, 0xc0084711c0, 0xc009136300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out"}
{"Time":"2021-04-07T14:46:13.586864801Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0080fb650, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0077ff380, 0xc00679c000, 0xfb, 0x37d, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0080d0ea8, 0x4963c20, 0xc0047c2a00, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc0047c2a00, 0x5224d80, 0xc0077ff380, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachin"}
{"Time":"2021-04-07T14:46:13.586877522Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc0047c2a00, 0x5224d80, 0xc0077ff380, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0047c2be0, 0x52310d0, 0xc0047c2a00, 0x5224d80, 0xc0077ff380, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0047c2be0, 0x52310d0, 0xc0047c2a00, 0x5224d80, 0xc0077ff380, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ver"}
{"Time":"2021-04-07T14:46:13.586886951Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"sioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc0047c2be0, 0x527d9c8, 0xc0044c3870, 0xc006da5600, 0x1f4, 0x52310d0, 0xc0047c2a00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0044c3870, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc0047c2820, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0044c3870, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:46:13.586896893Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc0044c3870, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc0080fb5c0, 0xc0004ffd50)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc0080fb5c0, 0xc0004ffd50)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-04-07T14:46:13.586906398Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0044"}
{"Time":"2021-04-07T14:46:13.58691586Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"c3858, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go"}
{"Time":"2021-04-07T14:46:13.586925598Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":":71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5600)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k"}
{"Time":"2021-04-07T14:46:13.587331089Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc007d618c0, 0xc0080d2cef, 0xc0004ffce0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc007d618c0, 0xc0004ffce0, 0xc0054566c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc0080fb470, 0xc007d61760, 0x52827a8, 0xc005bc3d40, 0xc005456590, 0xc0054565a0, 0xc007367740)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/"}
{"Time":"2021-04-07T14:46:13.587355761Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da550"}
{"Time":"2021-04-07T14:46:13.587366326Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0044c3858, 0xc006da5500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/h"}
{"Time":"2021-04-07T14:46:13.587460526Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc0044c3858, 0xc006da5300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc007ff6900, 0xc00051bfe0, 0x5282888, 0xc0044c3858, 0xc006da5300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out"}
{"Time":"2021-04-07T14:46:13.587879059Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0091108d0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0055842a0, 0xc008daa580, 0xfb, 0x55f, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc00911aea8, 0x4963c20, 0xc004a70aa0, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc004a70aa0, 0x5224d80, 0xc0055842a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachine"}
{"Time":"2021-04-07T14:46:13.587889251Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ry/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc004a70aa0, 0x5224d80, 0xc0055842a0, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc004a70be0, 0x52310d0, 0xc004a70aa0, 0x5224d80, 0xc0055842a0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc004a70be0, 0x52310d0, 0xc004a70aa0, 0x5224d80, 0xc0055842a0, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/vers"}
{"Time":"2021-04-07T14:46:13.587898621Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc004a70be0, 0x527d9c8, 0xc008471118, 0xc0055da300, 0x1f4, 0x52310d0, 0xc004a70aa0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc008471118, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc004a708c0, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc008471118, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/"}
{"Time":"2021-04-07T14:46:13.587908289Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc008471118, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc009110840, 0xc001a64620)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc009110840, 0xc001a64620)\\n\\t/home/prow/go/src/k8s.io/"}
{"Time":"2021-04-07T14:46:13.587918683Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc00847"}
{"Time":"2021-04-07T14:46:13.587930097Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"1100, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:"}
{"Time":"2021-04-07T14:46:13.587939579Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc008471100, 0xc0055da300)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8"}
{"Time":"2021-04-07T14:46:13.587950022Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006edf8c0, 0xc00911ccef, 0xc001a645b0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006edf8c0, 0xc001a645b0, 0xc005304b80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc0091106f0, 0xc006edf760, 0x52827a8, 0xc00504ca00, 0xc005304a50, 0xc005304a60, 0xc007432f60)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/a"}
{"Time":"2021-04-07T14:46:13.587959245Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"pf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da200"}
{"Time":"2021-04-07T14:46:13.587968762Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":")\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc008471100, 0xc0055da200)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/ht"}
{"Time":"2021-04-07T14:46:13.588000502Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc008471100, 0xc0055da000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc009012c60, 0xc00051bfe0, 0x5282888, 0xc008471100, 0xc0055da000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out\\"}
{"Time":"2021-04-07T14:46:13.588033106Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc009111e00, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc0077ff560, 0xc00679c000, 0xfb, 0x37d, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc00862cea8, 0x4963c20, 0xc0047c3180, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc0047c3180, 0x5224d80, 0xc0077ff560, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachin"}
{"Time":"2021-04-07T14:46:13.588042428Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc0047c3180, 0x5224d80, 0xc0077ff560, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0047c32c0, 0x52310d0, 0xc0047c3180, 0x5224d80, 0xc0077ff560, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0047c32c0, 0x52310d0, 0xc0047c3180, 0x5224d80, 0xc0077ff560, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ver"}
{"Time":"2021-04-07T14:46:13.588052717Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"sioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc0047c32c0, 0x527d9c8, 0xc008471178, 0xc0055db400, 0x1f4, 0x52310d0, 0xc0047c3180)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc008471178, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc0047c2fa0, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc008471178, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:46:13.588072913Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc008471178, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc009111d70, 0xc001a649a0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc009111d70, 0xc001a649a0)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-04-07T14:46:13.588082678Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0068"}
{"Time":"2021-04-07T14:46:13.588117995Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"002e8, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go"}
{"Time":"2021-04-07T14:46:13.588127625Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":":71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc0068002e8, 0xc0055db400)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k"}
{"Time":"2021-04-07T14:46:13.588155026Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006d67b80, 0xc00862ecef, 0xc001a648c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006d67b80, 0xc001a648c0, 0xc005304f90)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc00aa76930, 0xc006a41e40, 0x52827a8, 0xc007471a80, 0xc004ec2720, 0xc004ec2730, 0xc0036f0a80)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/"}
{"Time":"2021-04-07T14:46:13.588165261Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0068002e8, 0xc00787900"}
{"Time":"2021-04-07T14:46:13.588177034Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0068002e8, 0xc007879000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/h"}
{"Time":"2021-04-07T14:46:13.588206844Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc0068002e8, 0xc007878e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc007d85800, 0xc00051bfe0, 0x5282888, 0xc0068002e8, 0xc007878e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out"}
{"Time":"2021-04-07T14:46:13.592917358Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc0086073b0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc005584480, 0xc008daa580, 0xfb, 0x55f, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc00851aea8, 0x4963c20, 0xc004a70dc0, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc004a70dc0, 0x5224d80, 0xc005584480, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachin"}
{"Time":"2021-04-07T14:46:13.592929995Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc004a70dc0, 0x5224d80, 0xc005584480, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc004a70e60, 0x52310d0, 0xc004a70dc0, 0x5224d80, 0xc005584480, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc004a70e60, 0x52310d0, 0xc004a70dc0, 0x5224d80, 0xc005584480, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ver"}
{"Time":"2021-04-07T14:46:13.592940726Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"sioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc004a70e60, 0x527d9c8, 0xc002218d68, 0xc0002e8e00, 0x1f4, 0x52310d0, 0xc004a70dc0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc002218d68, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc004a70d20, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc002218d68, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:46:13.59295156Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc002218d68, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc008607320, 0xc000da96c0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc008607320, 0xc000da96c0)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-04-07T14:46:13.592961767Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0022"}
{"Time":"2021-04-07T14:46:13.592972583Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"18d48, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go"}
{"Time":"2021-04-07T14:46:13.592982712Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":":71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8e00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k"}
{"Time":"2021-04-07T14:46:13.592995933Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc006d67550, 0xc00851ccef, 0xc000da9650)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc006d67550, 0xc000da9650, 0xc00592ecd0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc0086071d0, 0xc006d673f0, 0x52827a8, 0xc005a165c0, 0xc00592eba0, 0xc00592ebb0, 0xc006323ec0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/"}
{"Time":"2021-04-07T14:46:13.593006879Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8d0"}
{"Time":"2021-04-07T14:46:13.593016883Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc002218d48, 0xc0002e8d00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/h"}
{"Time":"2021-04-07T14:46:13.593048129Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc002218d48, 0xc0033abf00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc0078916e0, 0xc00051bfe0, 0x5282888, 0xc002218d48, 0xc0033abf00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out"}
{"Time":"2021-04-07T14:46:13.593197862Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00854e3f0, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc005a3d5c0, 0xc0085bc500, 0xfb, 0x4f8, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0080d4ea8, 0x4963c20, 0xc004c06500, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc004c06500, 0x5224d80, 0xc005a3d5c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachin"}
{"Time":"2021-04-07T14:46:13.593207753Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc004c06500, 0x5224d80, 0xc005a3d5c0, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc004c068c0, 0x52310d0, 0xc004c06500, 0x5224d80, 0xc005a3d5c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc004c068c0, 0x52310d0, 0xc004c06500, 0x5224d80, 0xc005a3d5c0, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ver"}
{"Time":"2021-04-07T14:46:13.593223935Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"sioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc004c068c0, 0x527d9c8, 0xc0067dcc78, 0xc00538ff00, 0x1f4, 0x52310d0, 0xc004c06500)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0067dcc78, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc004c06140, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0067dcc78, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:46:13.593234636Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc0067dcc78, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00854e360, 0xc0018cb260)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00854e360, 0xc0018cb260)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-04-07T14:46:13.593245495Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0067"}
{"Time":"2021-04-07T14:46:13.593257326Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"dcc60, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go"}
{"Time":"2021-04-07T14:46:13.593268332Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":":71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538ff00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k"}
{"Time":"2021-04-07T14:46:13.593279248Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc0084eefd0, 0xc0080d6cef, 0xc0018cb1f0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc0084eefd0, 0xc0018cb1f0, 0xc00516ca20)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc00854e210, 0xc0084eee70, 0x52827a8, 0xc0074f5b80, 0xc00516c8f0, 0xc00516c900, 0xc0067e4720)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/"}
{"Time":"2021-04-07T14:46:13.593289815Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe0"}
{"Time":"2021-04-07T14:46:13.593331861Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0067dcc60, 0xc00538fe00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/h"}
{"Time":"2021-04-07T14:46:13.593365783Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc0067dcc60, 0xc00538fc00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc005629aa0, 0xc00051bfe0, 0x5282888, 0xc0067dcc60, 0xc00538fc00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out"}
{"Time":"2021-04-07T14:46:13.593398344Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00854f950, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc005584660, 0xc008daa580, 0xfb, 0x55f, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc009120ea8, 0x4963c20, 0xc004a71220, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc004a71220, 0x5224d80, 0xc005584660, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachin"}
{"Time":"2021-04-07T14:46:13.59340815Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc004a71220, 0x5224d80, 0xc005584660, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc004a71400, 0x52310d0, 0xc004a71220, 0x5224d80, 0xc005584660, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc004a71400, 0x52310d0, 0xc004a71220, 0x5224d80, 0xc005584660, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ver"}
{"Time":"2021-04-07T14:46:13.593418563Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"sioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc004a71400, 0x527d9c8, 0xc0067dcd40, 0xc00565f000, 0x1f4, 0x52310d0, 0xc004a71220)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0067dcd40, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc004a71040, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0067dcd40, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:46:13.59343087Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc0067dcd40, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00854f8c0, 0xc0018cb650)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00854f8c0, 0xc0018cb650)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-04-07T14:46:13.593440784Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:288 +0xa7d\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).Dispatch(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/github.com/emicklei/go-restful/container.go:199\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x4a1ce04, 0xe, 0xc0009f2240, 0xc000896850, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:146 +0x63e\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0067"}
{"Time":"2021-04-07T14:46:13.59719501Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"dcd28, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd10, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAuthorization.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/authorization.go:64 +0x603\\nnet/http.HandlerFunc.ServeHTTP(0xc000086940, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go"}
{"Time":"2021-04-07T14:46:13.597224577Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":":71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086980, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd40, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565f000)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1.4()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:127 +0x1ba\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle.func2()\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k"}
{"Time":"2021-04-07T14:46:13.597237076Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"8s.io/apiserver/pkg/util/flowcontrol/apf_filter.go:176 +0x222\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish.func1(0xc0084ef340, 0xc009122cef, 0xc0018cb5e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:329 +0x62\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset.(*request).Finish(0xc0084ef340, 0xc0018cb5e0, 0xc00516d1e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/fairqueuing/queueset/queueset.go:330 +0x5d\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol.(*configController).Handle(0xc0002e8f00, 0x5281e78, 0xc00854f770, 0xc0084ef1e0, 0x52827a8, 0xc00067af80, 0xc00516d090, 0xc00516d0b0, 0xc0067e4de0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/util/flowcontrol/"}
{"Time":"2021-04-07T14:46:13.597250294Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"apf_filter.go:166 +0x907\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.WithPriorityAndFairness.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/priority-and-fairness.go:130 +0x606\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bd70, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/http.HandlerFunc.ServeHTTP(0xc000086a80, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackCompleted.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef0"}
{"Time":"2021-04-07T14:46:13.597268481Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:95 +0x193\\nnet/http.HandlerFunc.ServeHTTP(0xc00093bda0, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithImpersonation.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/impersonation.go:50 +0x240d\\nnet/http.HandlerFunc.ServeHTTP(0xc000086ac0, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency.trackStarted.func1(0x7fdaf4329080, 0xc0067dcd28, 0xc00565ef00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:71 +0x186\\nnet/h"}
{"Time":"2021-04-07T14:46:13.597304317Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filterlatency/filterlatency.go:80 +0x38c\\nnet/http.HandlerFunc.ServeHTTP(0xc000086bc0, 0x7fdaf4329080, 0xc0067dcd28, 0xc00565ed00)\\n\\t/usr/local/go/src/net/http/server.go:2069 +0x44\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1(0xc005629f80, 0xc00051bfe0, 0x5282888, 0xc0067dcd28, 0xc00565ed00)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0xb8\\ncreated by k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:93 +0x1f4\\n\" addedInfo=\"\\nlogging error output: \\\"{\\\\\\\"kind\\\\\\\":\\\\\\\"Status\\\\\\\",\\\\\\\"apiVersion\\\\\\\":\\\\\\\"v1\\\\\\\",\\\\\\\"metadata\\\\\\\":{},\\\\\\\"status\\\\\\\":\\\\\\\"Failure\\\\\\\",\\\\\\\"message\\\\\\\":\\\\\\\"Internal error occurred: resource quota evaluation timed out"}
{"Time":"2021-04-07T14:46:13.597384705Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.(*ResponseWriterDelegator).WriteHeader(0xc00855ce70, 0x1f4)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:537 +0x45\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.(*deferredResponseWriter).Write(0xc004a343c0, 0xc005f82580, 0xfb, 0x55f, 0x0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:228 +0x2fd\\nencoding/json.(*Encoder).Encode(0xc0090c0ea8, 0x4963c20, 0xc0048041e0, 0xb76c1a, 0x4a062ca)\\n\\t/usr/local/go/src/encoding/json/stream.go:231 +0x1df\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).doEncode(0xc0005ba050, 0x52310d0, 0xc0048041e0, 0x5224d80, 0xc004a343c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachin"}
{"Time":"2021-04-07T14:46:13.597396863Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"ery/pkg/runtime/serializer/json/json.go:327 +0x2e9\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json.(*Serializer).Encode(0xc0005ba050, 0x52310d0, 0xc0048041e0, 0x5224d80, 0xc004a343c0, 0x3afffe9, 0x6)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/json/json.go:301 +0x169\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).doEncode(0xc0048043c0, 0x52310d0, 0xc0048041e0, 0x5224d80, 0xc004a343c0, 0x0, 0x0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning/versioning.go:228 +0x3b6\\nk8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/versioning.(*codec).Encode(0xc0048043c0, 0x52310d0, 0xc0048041e0, 0x5224d80, 0xc004a343c0, 0xc000603b00, 0x3)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apimachinery/pkg/runtime/serializer/ver"}
{"Time":"2021-04-07T14:46:13.597409837Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"sioning/versioning.go:184 +0x170\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.SerializeObject(0x4a2319a, 0x10, 0x7fdaf4db69f0, 0xc0048043c0, 0x527d9c8, 0xc0067dcdf8, 0xc008560100, 0x1f4, 0x52310d0, 0xc0048041e0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:106 +0x457\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.WriteObjectNegotiated(0x527ffa8, 0xc000a10600, 0x5280188, 0x7493400, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0067dcdf8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:275 +0x5cd\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters.ErrorNegotiated(0x5224980, 0xc004804000, 0x527ffa8, 0xc000a10600, 0x0, 0x0, 0x4a062ca, 0x2, 0x527d9c8, 0xc0067dcdf8, ...)\\n\\t/home/prow/go/src/k8s.io/kubernetes"}
{"Time":"2021-04-07T14:46:13.597420437Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/responsewriters/writers.go:294 +0x16f\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.(*RequestScope).err(...)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/rest.go:106\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers.createHandler.func1(0x527d9c8, 0xc0067dcdf8, 0xc008560100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/handlers/create.go:188 +0x1bc6\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints.restfulCreateResource.func1(0xc00855cde0, 0xc0018cb9d0)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/installer.go:1203 +0xe2\\nk8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics.InstrumentRouteFunc.func1(0xc00855cde0, 0xc0018cb9d0)\\n\\t/home/prow/go/src/k8s.io"}
{"Time":"2021-04-07T14:46:13.597430778Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/quota","Test":"TestQuota","Output":"/kubernetes/_output/local/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/metrics/metrics.go:428 +0x2d5\\nk8s.io/kubernetes/vendor/github.com/emicklei/go-restful.(*Container).dispatch(0xc0009f2240, 0x7fdaf4329080, 0xc0067dcdc0, 0xc008560100)\\n\\t/home/prow/go/src/k8s.io/kubernetes/_output/local/go/