PR: tanjunchen: Fix golint issues in test/e2e/lifecycle/
Result: FAILURE
Tests: 0 failed / 2867 succeeded
Started: 2019-11-29 06:37
Elapsed: 25m27s
Revision: c5f74beef5088310c6e163f7adf7d0136b057384
Refs: 85744

No Test Failures!


2867 Passed Tests

4 Skipped Tests

Error lines from build-log.txt

... skipping 56 lines ...
Recording: record_command_canary
Running command: record_command_canary

+++ Running case: test-cmd.record_command_canary 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: record_command_canary
/home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh: line 155: bogus-expected-to-fail: command not found
!!! [1129 06:41:13] Call tree:
!!! [1129 06:41:13]  1: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:47 record_command_canary(...)
!!! [1129 06:41:13]  2: /home/prow/go/src/k8s.io/kubernetes/test/cmd/../../third_party/forked/shell2junit/sh2ju.sh:112 eVal(...)
!!! [1129 06:41:13]  3: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:131 juLog(...)
!!! [1129 06:41:13]  4: /home/prow/go/src/k8s.io/kubernetes/test/cmd/legacy-script.sh:159 record_command(...)
!!! [1129 06:41:13]  5: hack/make-rules/test-cmd.sh:27 source(...)
+++ exit code: 1
+++ error: 1
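
The failing canary above runs a deliberately nonexistent command (bogus-expected-to-fail); the name and the "canary" label suggest its purpose is to prove that a failing command is captured and reported by the shell2junit wrapper rather than aborting the suite. A minimal sketch of that record-and-report pattern in shell, with illustrative function names (not the actual sh2ju.sh implementation):

  # hypothetical sketch: run a command, capture its exit status, and emit a
  # jUnit-style record instead of letting the failure stop the run
  record_command() {
    local name rc
    name="$1"; shift
    "$@"                 # run the test command
    rc=$?                # capture its exit status
    if [ "$rc" -ne 0 ]; then
      echo "<testcase name=\"${name}\"><failure>exit code ${rc}</failure></testcase>"
    else
      echo "<testcase name=\"${name}\"/>"
    fi
    return "$rc"
  }
  record_command canary bogus-expected-to-fail   # failure is recorded, not fatal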
+++ [1129 06:41:13] Running kubeadm tests
+++ [1129 06:41:18] Building go targets for linux/amd64:
    cmd/kubeadm
Running tests for APIVersion: v1,admissionregistration.k8s.io/v1,admissionregistration.k8s.io/v1beta1,admission.k8s.io/v1,admission.k8s.io/v1beta1,apps/v1,apps/v1beta1,apps/v1beta2,auditregistration.k8s.io/v1alpha1,authentication.k8s.io/v1,authentication.k8s.io/v1beta1,authorization.k8s.io/v1,authorization.k8s.io/v1beta1,autoscaling/v1,autoscaling/v2beta1,autoscaling/v2beta2,batch/v1,batch/v1beta1,batch/v2alpha1,certificates.k8s.io/v1beta1,coordination.k8s.io/v1beta1,coordination.k8s.io/v1,discovery.k8s.io/v1alpha1,discovery.k8s.io/v1beta1,extensions/v1beta1,events.k8s.io/v1beta1,imagepolicy.k8s.io/v1alpha1,networking.k8s.io/v1,networking.k8s.io/v1beta1,node.k8s.io/v1alpha1,node.k8s.io/v1beta1,policy/v1beta1,rbac.authorization.k8s.io/v1,rbac.authorization.k8s.io/v1beta1,rbac.authorization.k8s.io/v1alpha1,scheduling.k8s.io/v1alpha1,scheduling.k8s.io/v1beta1,scheduling.k8s.io/v1,settings.k8s.io/v1alpha1,storage.k8s.io/v1beta1,storage.k8s.io/v1,storage.k8s.io/v1alpha1,flowcontrol.apiserver.k8s.io/v1alpha1,
+++ [1129 06:41:58] Running tests without code coverage
{"Time":"2019-11-29T06:43:08.3374658Z","Action":"output","Package":"k8s.io/kubernetes/cmd/kubeadm/test/cmd","Output":"ok  \tk8s.io/kubernetes/cmd/kubeadm/test/cmd\t35.100s\n"}
... skipping 285 lines ...
+++ [1129 06:44:43] Building kube-controller-manager
+++ [1129 06:44:47] Building go targets for linux/amd64:
    cmd/kube-controller-manager
+++ [1129 06:45:14] Starting controller-manager
Flag --port has been deprecated, see --secure-port instead.
I1129 06:45:15.312793   54238 serving.go:312] Generated self-signed cert in-memory
W1129 06:45:15.808766   54238 authentication.go:409] failed to read in-cluster kubeconfig for delegated authentication: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W1129 06:45:15.808937   54238 authentication.go:267] No authentication-kubeconfig provided in order to lookup client-ca-file in configmap/extension-apiserver-authentication in kube-system, so client certificate authentication won't work.
W1129 06:45:15.808985   54238 authentication.go:291] No authentication-kubeconfig provided in order to lookup requestheader-client-ca-file in configmap/extension-apiserver-authentication in kube-system, so request-header client certificate authentication won't work.
W1129 06:45:15.809043   54238 authorization.go:177] failed to read in-cluster kubeconfig for delegated authorization: open /var/run/secrets/kubernetes.io/serviceaccount/token: no such file or directory
W1129 06:45:15.809099   54238 authorization.go:146] No authorization-kubeconfig provided, so SubjectAccessReview of authorization tokens won't work.
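
These authentication and authorization warnings are expected in this standalone test run: there is no in-cluster service-account token and no delegated kubeconfig, so the controller-manager simply disables those paths. A hedged sketch of the flags that would enable delegated authn/authz against a real API server (paths are placeholders):

  kube-controller-manager \
    --kubeconfig=/path/to/kubeconfig \
    --authentication-kubeconfig=/path/to/kubeconfig \
    --authorization-kubeconfig=/path/to/kubeconfig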
I1129 06:45:15.809147   54238 controllermanager.go:161] Version: v1.18.0-alpha.0.1307+875c5f6dd8410a
I1129 06:45:15.810544   54238 secure_serving.go:178] Serving securely on [::]:10257
I1129 06:45:15.810678   54238 tlsconfig.go:219] Starting DynamicServingCertificateController
I1129 06:45:15.811139   54238 deprecated_insecure_serving.go:53] Serving insecurely on [::]:10252
I1129 06:45:15.811218   54238 leaderelection.go:242] attempting to acquire leader lease  kube-system/kube-controller-manager...
... skipping 123 lines ...
I1129 06:45:16.860520   54238 node_lifecycle_controller.go:554] Starting node controller
I1129 06:45:16.860533   54238 shared_informer.go:197] Waiting for caches to sync for taint
I1129 06:45:16.861014   54238 controllermanager.go:533] Started "endpoint"
I1129 06:45:16.861138   54238 endpoints_controller.go:181] Starting endpoint controller
I1129 06:45:16.861148   54238 shared_informer.go:197] Waiting for caches to sync for endpoint
I1129 06:45:16.861942   54238 controllermanager.go:533] Started "ttl"
E1129 06:45:16.862216   54238 core.go:91] Failed to start service controller: WARNING: no cloud provider provided, services of type LoadBalancer will fail
W1129 06:45:16.862232   54238 controllermanager.go:525] Skipping "service"
I1129 06:45:16.862466   54238 ttl_controller.go:116] Starting TTL controller
I1129 06:45:16.862492   54238 node_lifecycle_controller.go:77] Sending events to api server
I1129 06:45:16.862492   54238 shared_informer.go:197] Waiting for caches to sync for TTL
E1129 06:45:16.862520   54238 core.go:232] failed to start cloud node lifecycle controller: no cloud provider provided
W1129 06:45:16.862529   54238 controllermanager.go:525] Skipping "cloud-node-lifecycle"
I1129 06:45:16.862831   54238 controllermanager.go:533] Started "pvc-protection"
I1129 06:45:16.863051   54238 pvc_protection_controller.go:100] Starting PVC protection controller
I1129 06:45:16.863070   54238 shared_informer.go:197] Waiting for caches to sync for PVC protection
I1129 06:45:16.870178   54238 controllermanager.go:533] Started "namespace"
W1129 06:45:16.870237   54238 controllermanager.go:525] Skipping "csrsigning"
I1129 06:45:16.870660   54238 controllermanager.go:533] Started "daemonset"
I1129 06:45:16.871031   54238 namespace_controller.go:200] Starting namespace controller
I1129 06:45:16.871045   54238 shared_informer.go:197] Waiting for caches to sync for namespace
I1129 06:45:16.871067   54238 daemon_controller.go:255] Starting daemon sets controller
I1129 06:45:16.871072   54238 shared_informer.go:197] Waiting for caches to sync for daemon sets
W1129 06:45:16.900673   54238 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="127.0.0.1" does not exist
I1129 06:45:16.959006   54238 shared_informer.go:204] Caches are synced for deployment 
I1129 06:45:16.959882   54238 shared_informer.go:204] Caches are synced for stateful set 
I1129 06:45:16.962669   54238 shared_informer.go:204] Caches are synced for TTL 
I1129 06:45:16.969007   54238 shared_informer.go:204] Caches are synced for PVC protection 
I1129 06:45:16.983638   54238 shared_informer.go:204] Caches are synced for PV protection 
I1129 06:45:16.984627   54238 shared_informer.go:204] Caches are synced for HPA 
... skipping 3 lines ...
I1129 06:45:17.001105   54238 shared_informer.go:204] Caches are synced for disruption 
I1129 06:45:17.001129   54238 disruption.go:338] Sending events to api server.
I1129 06:45:17.002336   54238 shared_informer.go:204] Caches are synced for ReplicaSet 
I1129 06:45:17.002618   54238 shared_informer.go:204] Caches are synced for GC 
I1129 06:45:17.003239   54238 shared_informer.go:204] Caches are synced for persistent volume 
I1129 06:45:17.003494   54238 shared_informer.go:204] Caches are synced for ReplicationController 
E1129 06:45:17.014470   54238 clusterroleaggregation_controller.go:180] view failed with : Operation cannot be fulfilled on clusterroles.rbac.authorization.k8s.io "view": the object has been modified; please apply your changes to the latest version and try again
Successful: the flag '--client' shows correct client info
Successful: the flag '--client' correctly has no server version info
+++ [1129 06:45:17] Testing kubectl version: verify json output
I1129 06:45:17.158817   54238 shared_informer.go:204] Caches are synced for job 
I1129 06:45:17.160704   54238 shared_informer.go:204] Caches are synced for taint 
I1129 06:45:17.160794   54238 node_lifecycle_controller.go:1443] Initializing eviction metric for zone: 
... skipping 69 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_RESTMapper_evaluation_tests
+++ [1129 06:45:20] Creating namespace namespace-1575009920-32696
namespace/namespace-1575009920-32696 created
Context "test" modified.
+++ [1129 06:45:20] Testing RESTMapper
+++ [1129 06:45:20] "kubectl get unknownresourcetype" returns error as expected: error: the server doesn't have a resource type "unknownresourcetype"
+++ exit code: 0
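
The RESTMapper check only asserts that an unknown resource type surfaces the expected error, and the resource table that follows is the standard discovery listing. A hedged reproduction against any cluster with a working kubeconfig:

  kubectl get unknownresourcetype   # expected: error: the server doesn't have a resource type "unknownresourcetype"
  kubectl api-resources             # prints the NAME/SHORTNAMES/APIGROUP table shown below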
NAME                              SHORTNAMES   APIGROUP                       NAMESPACED   KIND
bindings                                                                      true         Binding
componentstatuses                 cs                                          false        ComponentStatus
configmaps                        cm                                          true         ConfigMap
endpoints                         ep                                          true         Endpoints
... skipping 601 lines ...
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
core.sh:186: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: resource(s) were provided, but no name, label selector, or --all flag specified
core.sh:190: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:194: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
error: setting 'all' parameter but found a non empty selector. 
core.sh:198: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
core.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:206: Successful get pods -l'name in (valid-pod)' {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:211: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-kubectl-describe-pod\" }}found{{end}}{{end}}:: :
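
The core.sh assertion lines above and below ("Successful get pods {{range.items}}...") are shell helpers that render a resource through a Go template and compare the output to an expected string; roughly equivalent to this hedged sketch:

  # render pod names with a go-template; the harness compares the result
  # against the expected value, e.g. "valid-pod:"
  kubectl get pods -o go-template='{{range .items}}{{.metadata.name}}:{{end}}'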
... skipping 12 lines ...
poddisruptionbudget.policy/test-pdb-2 created
core.sh:245: Successful get pdb/test-pdb-2 --namespace=test-kubectl-describe-pod {{.spec.minAvailable}}: 50%
poddisruptionbudget.policy/test-pdb-3 created
core.sh:251: Successful get pdb/test-pdb-3 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 2
poddisruptionbudget.policy/test-pdb-4 created
core.sh:255: Successful get pdb/test-pdb-4 --namespace=test-kubectl-describe-pod {{.spec.maxUnavailable}}: 50%
error: min-available and max-unavailable cannot be both specified
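
The three PodDisruptionBudgets above exercise minAvailable and maxUnavailable, and the final error confirms the two flags are mutually exclusive. A hedged example of the underlying command (the selector is illustrative):

  kubectl create poddisruptionbudget test-pdb-2 \
    --namespace=test-kubectl-describe-pod \
    --selector=app=rails --min-available=50%
  # passing both --min-available and --max-unavailable is rejected, as logged above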
core.sh:261: Successful get pods --namespace=test-kubectl-describe-pod {{range.items}}{{.metadata.name}}:{{end}}: 
pod/env-test-pod created
matched TEST_CMD_1
matched <set to the key 'key-1' in secret 'test-secret'>
matched TEST_CMD_2
matched <set to the key 'key-2' of config map 'test-configmap'>
... skipping 188 lines ...
pod/valid-pod patched
core.sh:470: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: changed-with-yaml:
pod/valid-pod patched
core.sh:475: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:3.1:
pod/valid-pod patched
core.sh:491: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
+++ [1129 06:45:55] "kubectl patch with resourceVersion 519" returns error as expected: Error from server (Conflict): Operation cannot be fulfilled on pods "valid-pod": the object has been modified; please apply your changes to the latest version and try again
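
The Conflict above comes from a patch that pins a stale resourceVersion, which the API server rejects with 409. A hedged sketch of that kind of patch (the spec field is illustrative):

  kubectl patch pod valid-pod --type=merge \
    -p '{"metadata":{"resourceVersion":"519"},"spec":{"activeDeadlineSeconds":30}}'
  # -> Operation cannot be fulfilled on pods "valid-pod": the object has been modified ...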
pod "valid-pod" deleted
pod/valid-pod replaced
core.sh:515: Successful get pod valid-pod {{(index .spec.containers 0).name}}: replaced-k8s-serve-hostname
Successful
message:error: --grace-period must have --force specified
has:\-\-grace-period must have \-\-force specified
Successful
message:error: --timeout must have --force specified
has:\-\-timeout must have \-\-force specified
node/node-v1-test created
W1129 06:45:56.490135   54238 actual_state_of_world.go:506] Failed to update statusUpdateNeeded field in actual state of world: Failed to set statusUpdateNeeded to needed true, because nodeName="node-v1-test" does not exist
node/node-v1-test replaced
core.sh:552: Successful get node node-v1-test {{.metadata.annotations.a}}: b
node "node-v1-test" deleted
core.sh:559: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: nginx:
core.sh:562: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: k8s.gcr.io/serve_hostname:
Edit cancelled, no changes made.
... skipping 22 lines ...
spec:
  containers:
  - image: k8s.gcr.io/pause:2.0
    name: kubernetes-pause
has:localonlyvalue
core.sh:585: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
error: 'name' already has a value (valid-pod), and --overwrite is false
core.sh:589: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
core.sh:593: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod
pod/valid-pod labeled
core.sh:597: Successful get pod valid-pod {{.metadata.labels.name}}: valid-pod-super-sayan
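
The label error and the subsequent successful relabel show kubectl's overwrite protection for existing label keys; a hedged reproduction:

  kubectl label pod valid-pod name=valid-pod-super-sayan              # rejected: key already set
  kubectl label pod valid-pod name=valid-pod-super-sayan --overwrite  # succeeds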
core.sh:601: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
... skipping 85 lines ...
+++ Running case: test-cmd.run_kubectl_create_error_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_create_error_tests
+++ [1129 06:46:05] Creating namespace namespace-1575009965-29031
namespace/namespace-1575009965-29031 created
Context "test" modified.
+++ [1129 06:46:05] Testing kubectl create with error
Error: must specify one of -f and -k

Create a resource from a file or from stdin.

 JSON and YAML formats are accepted.

Examples:
... skipping 41 lines ...

Usage:
  kubectl create -f FILENAME [options]

Use "kubectl <command> --help" for more information about a given command.
Use "kubectl options" for a list of global command-line options (applies to all commands).
+++ [1129 06:46:05] "kubectl create with empty string list returns error as expected: error: error validating "hack/testdata/invalid-rc-with-empty-args.yaml": error validating data: ValidationError(ReplicationController.spec.template.spec.containers[0].args): unknown object type "nil" in ReplicationController.spec.template.spec.containers[0].args[0]; if you choose to ignore these errors, turn validation off with --validate=false
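
The failure above is client-side schema validation of the manifest; as the message itself notes, it can be bypassed (not recommended) with --validate=false. A hedged example using the fixture path quoted in the error:

  kubectl create -f hack/testdata/invalid-rc-with-empty-args.yaml --validate=false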
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
+++ exit code: 0
Recording: run_kubectl_apply_tests
Running command: run_kubectl_apply_tests

... skipping 17 lines ...
pod "test-pod" deleted
customresourcedefinition.apiextensions.k8s.io/resources.mygroup.example.com created
I1129 06:46:08.443238   50776 client.go:361] parsed scheme: "endpoint"
I1129 06:46:08.443284   50776 endpoint.go:68] ccResolverWrapper: sending new addresses to cc: [{http://127.0.0.1:2379 0  <nil>}]
I1129 06:46:08.446883   50776 controller.go:606] quota admission added evaluator for: resources.mygroup.example.com
kind.mygroup.example.com/myobj serverside-applied (server dry run)
Error from server (NotFound): resources.mygroup.example.com "myobj" not found
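
The custom resource reports serverside-applied under a server dry run, and the NotFound on the following get confirms nothing was persisted. A hedged sketch using current kubectl spellings (exact flags vary slightly across releases; myobj.yaml is a placeholder):

  kubectl apply --server-side --dry-run=server -f myobj.yaml   # validated and admitted, never stored
  kubectl get resources.mygroup.example.com myobj              # NotFound, as logged above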
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
+++ exit code: 0
Recording: run_kubectl_run_tests
Running command: run_kubectl_run_tests

+++ Running case: test-cmd.run_kubectl_run_tests 
... skipping 102 lines ...
Context "test" modified.
+++ [1129 06:46:10] Testing kubectl create filter
create.sh:30: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/selector-test-pod created
create.sh:34: Successful get pods selector-test-pod {{.metadata.labels.name}}: selector-test-pod
Successful
message:Error from server (NotFound): pods "selector-test-pod-dont-apply" not found
has:pods "selector-test-pod-dont-apply" not found
pod "selector-test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_apply_deployments_tests
Running command: run_kubectl_apply_deployments_tests

... skipping 29 lines ...
I1129 06:46:13.479729   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575009971-28694", Name:"nginx", UID:"d56265f5-1c64-4e35-9864-13ae14284787", APIVersion:"apps/v1", ResourceVersion:"607", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-8484dd655 to 3
I1129 06:46:13.484066   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575009971-28694", Name:"nginx-8484dd655", UID:"1e7b6c73-8fb5-4639-aeb7-66c7b3656b83", APIVersion:"apps/v1", ResourceVersion:"608", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-bvl72
I1129 06:46:13.487625   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575009971-28694", Name:"nginx-8484dd655", UID:"1e7b6c73-8fb5-4639-aeb7-66c7b3656b83", APIVersion:"apps/v1", ResourceVersion:"608", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-w5tmz
I1129 06:46:13.487987   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575009971-28694", Name:"nginx-8484dd655", UID:"1e7b6c73-8fb5-4639-aeb7-66c7b3656b83", APIVersion:"apps/v1", ResourceVersion:"608", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-8484dd655-tdnmz
apps.sh:148: Successful get deployment nginx {{.metadata.name}}: nginx
Successful
message:Error from server (Conflict): error when applying patch:
{"metadata":{"annotations":{"kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"Deployment\",\"metadata\":{\"annotations\":{},\"labels\":{\"name\":\"nginx\"},\"name\":\"nginx\",\"namespace\":\"namespace-1575009971-28694\",\"resourceVersion\":\"99\"},\"spec\":{\"replicas\":3,\"selector\":{\"matchLabels\":{\"name\":\"nginx2\"}},\"template\":{\"metadata\":{\"labels\":{\"name\":\"nginx2\"}},\"spec\":{\"containers\":[{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"nginx\",\"ports\":[{\"containerPort\":80}]}]}}}}\n"},"resourceVersion":"99"},"spec":{"selector":{"matchLabels":{"name":"nginx2"}},"template":{"metadata":{"labels":{"name":"nginx2"}}}}}
to:
Resource: "apps/v1, Resource=deployments", GroupVersionKind: "apps/v1, Kind=Deployment"
Name: "nginx", Namespace: "namespace-1575009971-28694"
for: "hack/testdata/deployment-label-change2.yaml": Operation cannot be fulfilled on deployments.apps "nginx": the object has been modified; please apply your changes to the latest version and try again
has:Error from server (Conflict)
I1129 06:46:19.921318   54238 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1575009963-14012
deployment.apps/nginx configured
I1129 06:46:23.009631   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575009971-28694", Name:"nginx", UID:"86aae467-91ff-4fbc-b1ef-b8ccc55f8b5d", APIVersion:"apps/v1", ResourceVersion:"651", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-668b6c7744 to 3
I1129 06:46:23.014542   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575009971-28694", Name:"nginx-668b6c7744", UID:"eb3eb93b-f766-4548-a087-cf95c2a2bafb", APIVersion:"apps/v1", ResourceVersion:"652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-kmdsc
I1129 06:46:23.018026   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575009971-28694", Name:"nginx-668b6c7744", UID:"eb3eb93b-f766-4548-a087-cf95c2a2bafb", APIVersion:"apps/v1", ResourceVersion:"652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-h7qcc
I1129 06:46:23.019057   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575009971-28694", Name:"nginx-668b6c7744", UID:"eb3eb93b-f766-4548-a087-cf95c2a2bafb", APIVersion:"apps/v1", ResourceVersion:"652", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-668b6c7744-xfvvq
... skipping 142 lines ...
+++ [1129 06:46:30] Creating namespace namespace-1575009990-25858
namespace/namespace-1575009990-25858 created
Context "test" modified.
+++ [1129 06:46:30] Testing kubectl get
get.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:37: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
get.sh:45: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:{
    "apiVersion": "v1",
    "items": [],
... skipping 23 lines ...
has not:No resources found
Successful
message:NAME
has not:No resources found
get.sh:73: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:error: the server doesn't have a resource type "foobar"
has not:No resources found
Successful
message:No resources found in namespace-1575009990-25858 namespace.
has:No resources found
Successful
message:
has not:No resources found
Successful
message:No resources found in namespace-1575009990-25858 namespace.
has:No resources found
get.sh:93: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
Successful
message:Error from server (NotFound): pods "abc" not found
has not:List
Successful
message:I1129 06:46:31.970265   64668 loader.go:375] Config loaded from file:  /tmp/tmp.BYbdlG6rFv/.kube/config
I1129 06:46:31.971747   64668 round_trippers.go:443] GET http://127.0.0.1:8080/version?timeout=32s 200 OK in 1 milliseconds
I1129 06:46:31.994562   64668 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/pods 200 OK in 1 milliseconds
I1129 06:46:31.996017   64668 round_trippers.go:443] GET http://127.0.0.1:8080/api/v1/namespaces/default/replicationcontrollers 200 OK in 1 milliseconds
... skipping 479 lines ...
Successful
message:NAME    DATA   AGE
one     0      0s
three   0      0s
two     0      0s
STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
Successful
message:STATUS    REASON          MESSAGE
Failure   InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has not:watch is only supported on individual resources
+++ [1129 06:46:38] Creating namespace namespace-1575009998-4740
namespace/namespace-1575009998-4740 created
Context "test" modified.
get.sh:153: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
... skipping 56 lines ...
}
get.sh:158: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
<no value>Successful
message:valid-pod:
has:valid-pod:
Successful
message:error: error executing jsonpath "{.missing}": Error executing template: missing is not found. Printing more information for debugging the template:
	template was:
		{.missing}
	object given to jsonpath engine was:
		map[string]interface {}{"apiVersion":"v1", "kind":"Pod", "metadata":map[string]interface {}{"creationTimestamp":"2019-11-29T06:46:38Z", "labels":map[string]interface {}{"name":"valid-pod"}, "name":"valid-pod", "namespace":"namespace-1575009998-4740", "resourceVersion":"737", "selfLink":"/api/v1/namespaces/namespace-1575009998-4740/pods/valid-pod", "uid":"5db4dfdc-6cd0-4a36-8134-cfe131c7c87c"}, "spec":map[string]interface {}{"containers":[]interface {}{map[string]interface {}{"image":"k8s.gcr.io/serve_hostname", "imagePullPolicy":"Always", "name":"kubernetes-serve-hostname", "resources":map[string]interface {}{"limits":map[string]interface {}{"cpu":"1", "memory":"512Mi"}, "requests":map[string]interface {}{"cpu":"1", "memory":"512Mi"}}, "terminationMessagePath":"/dev/termination-log", "terminationMessagePolicy":"File"}}, "dnsPolicy":"ClusterFirst", "enableServiceLinks":true, "priority":0, "restartPolicy":"Always", "schedulerName":"default-scheduler", "securityContext":map[string]interface {}{}, "terminationGracePeriodSeconds":30}, "status":map[string]interface {}{"phase":"Pending", "qosClass":"Guaranteed"}}
has:missing is not found
error: error executing template "{{.missing}}": template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing"
Successful
message:Error executing template: template: output:1:2: executing "output" at <.missing>: map has no entry for key "missing". Printing more information for debugging the template:
	template was:
		{{.missing}}
	raw data was:
		{"apiVersion":"v1","kind":"Pod","metadata":{"creationTimestamp":"2019-11-29T06:46:38Z","labels":{"name":"valid-pod"},"name":"valid-pod","namespace":"namespace-1575009998-4740","resourceVersion":"737","selfLink":"/api/v1/namespaces/namespace-1575009998-4740/pods/valid-pod","uid":"5db4dfdc-6cd0-4a36-8134-cfe131c7c87c"},"spec":{"containers":[{"image":"k8s.gcr.io/serve_hostname","imagePullPolicy":"Always","name":"kubernetes-serve-hostname","resources":{"limits":{"cpu":"1","memory":"512Mi"},"requests":{"cpu":"1","memory":"512Mi"}},"terminationMessagePath":"/dev/termination-log","terminationMessagePolicy":"File"}],"dnsPolicy":"ClusterFirst","enableServiceLinks":true,"priority":0,"restartPolicy":"Always","schedulerName":"default-scheduler","securityContext":{},"terminationGracePeriodSeconds":30},"status":{"phase":"Pending","qosClass":"Guaranteed"}}
	object given to template engine was:
		map[apiVersion:v1 kind:Pod metadata:map[creationTimestamp:2019-11-29T06:46:38Z labels:map[name:valid-pod] name:valid-pod namespace:namespace-1575009998-4740 resourceVersion:737 selfLink:/api/v1/namespaces/namespace-1575009998-4740/pods/valid-pod uid:5db4dfdc-6cd0-4a36-8134-cfe131c7c87c] spec:map[containers:[map[image:k8s.gcr.io/serve_hostname imagePullPolicy:Always name:kubernetes-serve-hostname resources:map[limits:map[cpu:1 memory:512Mi] requests:map[cpu:1 memory:512Mi]] terminationMessagePath:/dev/termination-log terminationMessagePolicy:File]] dnsPolicy:ClusterFirst enableServiceLinks:true priority:0 restartPolicy:Always schedulerName:default-scheduler securityContext:map[] terminationGracePeriodSeconds:30] status:map[phase:Pending qosClass:Guaranteed]]
has:map has no entry for key "missing"
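
The two failures above show how the jsonpath and go-template printers each report a reference to a field that does not exist on the object; a hedged reproduction:

  kubectl get pod valid-pod -o jsonpath='{.missing}'       # "missing is not found"
  kubectl get pod valid-pod -o go-template='{{.missing}}'  # map has no entry for key "missing"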
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:STATUS
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:valid-pod
Successful
message:pod/valid-pod
status/<unknown>
has not:STATUS
Successful
... skipping 45 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has not:STATUS
... skipping 42 lines ...
      (Client.Timeout exceeded while reading body)'
    reason: UnexpectedServerResponse
  - message: 'unable to decode an event from the watch stream: net/http: request canceled
      (Client.Timeout exceeded while reading body)'
    reason: ClientWatchDecoding
kind: Status
message: 'an error on the server ("unable to decode an event from the watch stream:
  net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented
  the request from succeeding'
metadata: {}
reason: InternalError
status: Failure
has:name: valid-pod
Successful
message:Error from server (NotFound): pods "invalid-pod" not found
has:"invalid-pod" not found
pod "valid-pod" deleted
get.sh:196: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
pod/redis-master created
pod/valid-pod created
Successful
... skipping 35 lines ...
+++ command: run_kubectl_exec_pod_tests
+++ [1129 06:46:44] Creating namespace namespace-1575010004-17183
namespace/namespace-1575010004-17183 created
Context "test" modified.
+++ [1129 06:46:44] Testing kubectl exec POD COMMAND
Successful
message:Error from server (NotFound): pods "abc" not found
has:pods "abc" not found
pod/test-pod created
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pods "test-pod" not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
+++ exit code: 0
Recording: run_kubectl_exec_resource_name_tests
Running command: run_kubectl_exec_resource_name_tests

... skipping 2 lines ...
+++ command: run_kubectl_exec_resource_name_tests
+++ [1129 06:46:45] Creating namespace namespace-1575010005-13669
namespace/namespace-1575010005-13669 created
Context "test" modified.
+++ [1129 06:46:45] Testing kubectl exec TYPE/NAME COMMAND
Successful
message:error: the server doesn't have a resource type "foo"
has:error:
Successful
message:Error from server (NotFound): deployments.apps "bar" not found
has:"bar" not found
pod/test-pod created
replicaset.apps/frontend created
I1129 06:46:46.488085   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010005-13669", Name:"frontend", UID:"70dcc0c5-5b85-415b-8af4-23c37915e43b", APIVersion:"apps/v1", ResourceVersion:"795", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-68k7d
I1129 06:46:46.493242   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010005-13669", Name:"frontend", UID:"70dcc0c5-5b85-415b-8af4-23c37915e43b", APIVersion:"apps/v1", ResourceVersion:"795", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-98z84
I1129 06:46:46.496012   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010005-13669", Name:"frontend", UID:"70dcc0c5-5b85-415b-8af4-23c37915e43b", APIVersion:"apps/v1", ResourceVersion:"795", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-zp7zr
configmap/test-set-env-config created
Successful
message:error: cannot attach to *v1.ConfigMap: selector for *v1.ConfigMap not implemented
has:not implemented
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod test-pod does not have a host assigned
has not:pod or type/name must be specified
Successful
message:Error from server (BadRequest): pod frontend-68k7d does not have a host assigned
has not:not found
Successful
message:Error from server (BadRequest): pod frontend-68k7d does not have a host assigned
has not:pod or type/name must be specified
pod "test-pod" deleted
replicaset.apps "frontend" deleted
configmap "test-set-env-config" deleted
+++ exit code: 0
Recording: run_create_secret_tests
Running command: run_create_secret_tests

+++ Running case: test-cmd.run_create_secret_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_secret_tests
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:Error from server (NotFound): secrets "mysecret" not found
has:secrets "mysecret" not found
Successful
message:user-specified
has:user-specified
Successful
{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"b7f736c7-78fb-4028-8b98-8d3170dd4aa5","resourceVersion":"815","creationTimestamp":"2019-11-29T06:46:47Z"}}
... skipping 2 lines ...
has:uid
Successful
message:{"kind":"ConfigMap","apiVersion":"v1","metadata":{"name":"tester-update-cm","namespace":"default","selfLink":"/api/v1/namespaces/default/configmaps/tester-update-cm","uid":"b7f736c7-78fb-4028-8b98-8d3170dd4aa5","resourceVersion":"816","creationTimestamp":"2019-11-29T06:46:47Z"},"data":{"key1":"config1"}}
has:config1
{"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"tester-update-cm","kind":"configmaps","uid":"b7f736c7-78fb-4028-8b98-8d3170dd4aa5"}}
Successful
message:Error from server (NotFound): configmaps "tester-update-cm" not found
has:configmaps "tester-update-cm" not found
+++ exit code: 0
Recording: run_kubectl_create_kustomization_directory_tests
Running command: run_kubectl_create_kustomization_directory_tests

+++ Running case: test-cmd.run_kubectl_create_kustomization_directory_tests 
... skipping 110 lines ...
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
STATUS      REASON          MESSAGE
Failure     InternalError   an error on the server ("unable to decode an event from the watch stream: net/http: request canceled (Client.Timeout exceeded while reading body)") has prevented the request from succeeding
has:Timeout exceeded while reading body
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          1s
has:valid-pod
Successful
message:error: Invalid timeout value. Timeout must be a single integer in seconds, or an integer followed by a corresponding time unit (e.g. 1s | 2m | 3h)
has:Invalid timeout value
pod "valid-pod" deleted
+++ exit code: 0
Recording: run_crd_tests
Running command: run_crd_tests

... skipping 158 lines ...
foo.company.com/test patched
crd.sh:236: Successful get foos/test {{.patched}}: value1
(Bfoo.company.com/test patched
crd.sh:238: Successful get foos/test {{.patched}}: value2
(Bfoo.company.com/test patched
crd.sh:240: Successful get foos/test {{.patched}}: <no value>
+++ [1129 06:46:57] "kubectl patch --local" returns error as expected for CustomResource: error: cannot apply strategic merge patch for company.com/v1, Kind=Foo locally, try --type merge
{
    "apiVersion": "company.com/v1",
    "kind": "Foo",
    "metadata": {
        "annotations": {
            "kubernetes.io/change-cause": "kubectl patch foos/test --server=http://127.0.0.1:8080 --match-server-version=true --patch={\"patched\":null} --type=merge --record=true"
... skipping 193 lines ...
crd.sh:450: Successful get bars {{range.items}}{{.metadata.name}}:{{end}}: 
namespace/non-native-resources created
bar.company.com/test created
crd.sh:455: Successful get bars {{len .items}}: 1
namespace "non-native-resources" deleted
crd.sh:458: Successful get bars {{len .items}}: 0
Error from server (NotFound): namespaces "non-native-resources" not found
customresourcedefinition.apiextensions.k8s.io "foos.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "bars.company.com" deleted
customresourcedefinition.apiextensions.k8s.io "resources.mygroup.example.com" deleted
customresourcedefinition.apiextensions.k8s.io "validfoos.company.com" deleted
+++ exit code: 0
Recording: run_cmd_with_img_tests
... skipping 11 lines ...
I1129 06:47:31.500134   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-6910", Name:"test1-6cdffdb5b8", UID:"25b9e1b5-b09a-4a04-9af4-0374f7432a05", APIVersion:"apps/v1", ResourceVersion:"996", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test1-6cdffdb5b8-xsnml
Successful
message:deployment.apps/test1 created
has:deployment.apps/test1 created
deployment.apps "test1" deleted
Successful
message:error: Invalid image name "InvalidImageName": invalid reference format
has:error: Invalid image name "InvalidImageName": invalid reference format
+++ exit code: 0
+++ [1129 06:47:31] Testing recursive resources
+++ [1129 06:47:31] Creating namespace namespace-1575010051-13958
namespace/namespace-1575010051-13958 created
Context "test" modified.
W1129 06:47:31.893130   50776 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1129 06:47:31.894323   54238 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:202: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
W1129 06:47:31.993136   50776 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1129 06:47:31.994386   54238 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W1129 06:47:32.078961   50776 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1129 06:47:32.080081   54238 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
W1129 06:47:32.171276   50776 cacher.go:162] Terminating all watchers from cacher *unstructured.Unstructured
E1129 06:47:32.172418   54238 reflector.go:320] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to watch *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:206: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:pod/busybox0 created
pod/busybox1 created
error: error validating "hack/testdata/recursive/pod/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:211: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:220: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: busybox:busybox:
Successful
message:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
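
Throughout this recursive-resources section the fixture directory deliberately contains one manifest whose top-level key is "ind" instead of "kind" (visible in the quoted JSON above), so every recursive operation applies the valid siblings and reports the broken file. A hedged sketch of the pattern being exercised:

  # -R/--recursive walks the directory tree; valid manifests succeed and the
  # broken one is reported as "Object 'Kind' is missing"
  kubectl apply -R -f hack/testdata/recursive/pod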
generic-resources.sh:227: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:231: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:pod/busybox0 replaced
pod/busybox1 replaced
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
E1129 06:47:32.895505   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:236: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1129 06:47:32.995313   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Name:         busybox0
Namespace:    namespace-1575010051-13958
Priority:     0
Node:         <none>
Labels:       app=busybox0
... skipping 153 lines ...
QoS Class:        BestEffort
Node-Selectors:   <none>
Tolerations:      <none>
Events:           <none>
unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E1129 06:47:33.081271   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:246: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1129 06:47:33.173525   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:250: Successful get pods {{range.items}}{{.metadata.annotations.annotatekey}}:{{end}}: annotatevalue:annotatevalue:
Successful
message:pod/busybox0 annotated
pod/busybox1 annotated
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:255: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:259: Successful get pods {{range.items}}{{.metadata.labels.status}}:{{end}}: replaced:replaced:
Successful
message:Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
pod/busybox0 configured
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
pod/busybox1 configured
error: error validating "hack/testdata/recursive/pod-modify/pod/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
has:error validating data: kind not set
generic-resources.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I1129 06:47:33.858372   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010051-13958", Name:"nginx", UID:"2a1517de-2e83-4088-aae9-16bffcb83649", APIVersion:"apps/v1", ResourceVersion:"1022", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I1129 06:47:33.862291   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx-f87d999f7", UID:"5ead830c-c56c-4010-9379-904cca2ebfa8", APIVersion:"apps/v1", ResourceVersion:"1023", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-t77z9
I1129 06:47:33.866012   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx-f87d999f7", UID:"5ead830c-c56c-4010-9379-904cca2ebfa8", APIVersion:"apps/v1", ResourceVersion:"1023", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-8m92r
I1129 06:47:33.866704   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx-f87d999f7", UID:"5ead830c-c56c-4010-9379-904cca2ebfa8", APIVersion:"apps/v1", ResourceVersion:"1023", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-7gvhg
E1129 06:47:33.896561   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:269: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E1129 06:47:33.996408   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:270: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1129 06:47:34.082506   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
E1129 06:47:34.174689   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:274: Successful get deployment nginx {{ .apiVersion }}: apps/v1
Successful
message:apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  creationTimestamp: null
... skipping 38 lines ...
deployment.apps "nginx" deleted
generic-resources.sh:281: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:285: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:290: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:busybox0:busybox1:
Successful
message:busybox0:busybox1:error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:299: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E1129 06:47:34.897675   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:304: Successful get pods {{range.items}}{{.metadata.labels.mylabel}}:{{end}}: myvalue:myvalue:
Successful
message:pod/busybox0 labeled
pod/busybox1 labeled
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
E1129 06:47:34.997870   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:309: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1129 06:47:35.083694   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
E1129 06:47:35.175768   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:314: Successful get pods {{range.items}}{{(index .spec.containers 0).image}}:{{end}}: prom/busybox:prom/busybox:
Successful
message:pod/busybox0 patched
pod/busybox1 patched
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:319: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:323: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "busybox0" force deleted
pod "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/pod/pod/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"Pod","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}'
has:Object 'Kind' is missing
generic-resources.sh:328: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
I1129 06:47:35.639160   54238 namespace_controller.go:185] Namespace has been deleted non-native-resources
replicationcontroller/busybox0 created
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1129 06:47:35.722420   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010051-13958", Name:"busybox0", UID:"cfc20f64-8eef-463b-b4d0-4182d86b373e", APIVersion:"v1", ResourceVersion:"1054", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-j4fwd
I1129 06:47:35.727863   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010051-13958", Name:"busybox1", UID:"e5ecd577-9ea7-4fb0-8e87-acfc4e11cfdb", APIVersion:"v1", ResourceVersion:"1055", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-kghjr
generic-resources.sh:332: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1129 06:47:35.898833   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:337: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1129 06:47:35.998851   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:338: Successful get rc busybox0 {{.spec.replicas}}: 1
E1129 06:47:36.084885   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:339: Successful get rc busybox1 {{.spec.replicas}}: 1
E1129 06:47:36.176794   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:344: Successful get hpa busybox0 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
(Bgeneric-resources.sh:345: Successful get hpa busybox1 {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 80
Successful
message:horizontalpodautoscaler.autoscaling/busybox0 autoscaled
horizontalpodautoscaler.autoscaling/busybox1 autoscaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
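
The autoscaler assertions above (min 1, max 2, target 80% CPU) correspond to a recursive autoscale over the same replication-controller fixtures; a hedged example of the single-resource form:

  kubectl autoscale rc busybox0 --min=1 --max=2 --cpu-percent=80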
horizontalpodautoscaler.autoscaling "busybox0" deleted
horizontalpodautoscaler.autoscaling "busybox1" deleted
generic-resources.sh:353: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:354: Successful get rc busybox0 {{.spec.replicas}}: 1
(Bgeneric-resources.sh:355: Successful get rc busybox1 {{.spec.replicas}}: 1
E1129 06:47:36.899789   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:359: Successful get service busybox0 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
E1129 06:47:37.000038   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:360: Successful get service busybox1 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
Successful
message:service/busybox0 exposed
service/busybox1 exposed
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E1129 06:47:37.086024   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:366: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
E1129 06:47:37.177938   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:367: Successful get rc busybox0 {{.spec.replicas}}: 1
generic-resources.sh:368: Successful get rc busybox1 {{.spec.replicas}}: 1
I1129 06:47:37.394244   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010051-13958", Name:"busybox0", UID:"cfc20f64-8eef-463b-b4d0-4182d86b373e", APIVersion:"v1", ResourceVersion:"1076", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-mg4vq
I1129 06:47:37.402712   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010051-13958", Name:"busybox1", UID:"e5ecd577-9ea7-4fb0-8e87-acfc4e11cfdb", APIVersion:"v1", ResourceVersion:"1081", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-2f5jh
generic-resources.sh:372: Successful get rc busybox0 {{.spec.replicas}}: 2
generic-resources.sh:373: Successful get rc busybox1 {{.spec.replicas}}: 2
Successful
message:replicationcontroller/busybox0 scaled
replicationcontroller/busybox1 scaled
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
generic-resources.sh:378: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
generic-resources.sh:382: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
Successful
message:warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
E1129 06:47:37.900976   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:387: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:47:38.001252   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx1-deployment created
I1129 06:47:38.062819   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010051-13958", Name:"nginx1-deployment", UID:"141c9f6b-3b59-4e93-9628-b3721591ef30", APIVersion:"apps/v1", ResourceVersion:"1096", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx1-deployment-7bdbbfb5cf to 2
deployment.apps/nginx0-deployment created
error: error validating "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1129 06:47:38.065289   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx1-deployment-7bdbbfb5cf", UID:"9c3e6e11-3138-4124-a65e-118b427a6698", APIVersion:"apps/v1", ResourceVersion:"1097", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-wl8w4
I1129 06:47:38.068265   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010051-13958", Name:"nginx0-deployment", UID:"327234ec-e8d6-4698-9d7b-dea4a2ae9499", APIVersion:"apps/v1", ResourceVersion:"1098", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx0-deployment-57c6bff7f6 to 2
I1129 06:47:38.068969   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx1-deployment-7bdbbfb5cf", UID:"9c3e6e11-3138-4124-a65e-118b427a6698", APIVersion:"apps/v1", ResourceVersion:"1097", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx1-deployment-7bdbbfb5cf-fj2wf
I1129 06:47:38.070521   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx0-deployment-57c6bff7f6", UID:"23fd2f47-ae5a-4421-80b0-3006054b9d7e", APIVersion:"apps/v1", ResourceVersion:"1101", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-2plrz
I1129 06:47:38.073117   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010051-13958", Name:"nginx0-deployment-57c6bff7f6", UID:"23fd2f47-ae5a-4421-80b0-3006054b9d7e", APIVersion:"apps/v1", ResourceVersion:"1101", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx0-deployment-57c6bff7f6-h8h6f
E1129 06:47:38.086641   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:391: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx0-deployment:nginx1-deployment:
E1129 06:47:38.180224   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:392: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
generic-resources.sh:396: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:k8s.gcr.io/nginx:1.7.9:
Successful
message:deployment.apps/nginx1-deployment skipped rollback (current template already matches revision 1)
deployment.apps/nginx0-deployment skipped rollback (current template already matches revision 1)
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
deployment.apps/nginx1-deployment paused
deployment.apps/nginx0-deployment paused
generic-resources.sh:404: Successful get deployment {{range.items}}{{.spec.paused}}:{{end}}: true:true:
Successful
message:unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
... skipping 10 lines ...
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx0-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:nginx1-deployment
Successful
message:deployment.apps/nginx1-deployment 
REVISION  CHANGE-CAUSE
1         <none>

deployment.apps/nginx0-deployment 
REVISION  CHANGE-CAUSE
1         <none>

error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
has:Object 'Kind' is missing
E1129 06:47:38.902131   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
deployment.apps "nginx1-deployment" force deleted
deployment.apps "nginx0-deployment" force deleted
error: unable to decode "hack/testdata/recursive/deployment/deployment/nginx-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"apps/v1","ind":"Deployment","metadata":{"labels":{"app":"nginx2-deployment"},"name":"nginx2-deployment"},"spec":{"replicas":2,"selector":{"matchLabels":{"app":"nginx2"}},"template":{"metadata":{"labels":{"app":"nginx2"}},"spec":{"containers":[{"image":"k8s.gcr.io/nginx:1.7.9","name":"nginx","ports":[{"containerPort":80}]}]}}}}'
E1129 06:47:39.001985   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:39.087814   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:39.181445   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:39.903272   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:40.003169   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:426: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:47:40.088891   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:40.182728   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/busybox0 created
I1129 06:47:40.219385   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010051-13958", Name:"busybox0", UID:"0c54a8d1-679b-4238-94ec-5cf80209c3fc", APIVersion:"v1", ResourceVersion:"1146", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox0-7m49h
replicationcontroller/busybox1 created
error: error validating "hack/testdata/recursive/rc/rc/busybox-broken.yaml": error validating data: kind not set; if you choose to ignore these errors, turn validation off with --validate=false
I1129 06:47:40.224640   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010051-13958", Name:"busybox1", UID:"35bef492-8bcd-40f0-ae26-097970360b9e", APIVersion:"v1", ResourceVersion:"1148", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: busybox1-k6l62
generic-resources.sh:430: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: busybox0:busybox1:
Successful
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
... skipping 2 lines ...
message:no rollbacker has been implemented for "ReplicationController"
no rollbacker has been implemented for "ReplicationController"
unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox0" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" pausing is not supported
error: replicationcontrollers "busybox1" pausing is not supported
has:replicationcontrollers "busybox1" pausing is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:Object 'Kind' is missing
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox0" resuming is not supported
Successful
message:unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
error: replicationcontrollers "busybox0" resuming is not supported
error: replicationcontrollers "busybox1" resuming is not supported
has:replicationcontrollers "busybox1" resuming is not supported
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
replicationcontroller "busybox0" force deleted
replicationcontroller "busybox1" force deleted
error: unable to decode "hack/testdata/recursive/rc/rc/busybox-broken.yaml": Object 'Kind' is missing in '{"apiVersion":"v1","ind":"ReplicationController","metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"replicas":1,"selector":{"app":"busybox2"},"template":{"metadata":{"labels":{"app":"busybox2"},"name":"busybox2"},"spec":{"containers":[{"command":["sleep","3600"],"image":"busybox","imagePullPolicy":"IfNotPresent","name":"busybox"}],"restartPolicy":"Always"}}}}'
E1129 06:47:40.904462   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:41.004234   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:41.090021   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:41.183836   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_namespace_tests
Running command: run_namespace_tests

+++ Running case: test-cmd.run_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_namespace_tests
+++ [1129 06:47:41] Testing kubectl(v1:namespaces)
namespace/my-namespace created
core.sh:1314: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
E1129 06:47:41.905537   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "my-namespace" deleted
E1129 06:47:42.005308   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:42.091166   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:42.184884   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:42.906725   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:43.006442   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:43.092289   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:43.186164   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:43.907940   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:44.007620   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:44.093487   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:44.187263   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:44.909102   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:45.008737   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:45.094591   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:45.188351   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:45.910211   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:46.009991   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:46.095746   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:46.189575   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:46.911511   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:47.010850   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/my-namespace condition met
E1129 06:47:47.096766   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Error from server (NotFound): namespaces "my-namespace" not found
has: not found
namespace/my-namespace created
E1129 06:47:47.190525   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1323: Successful get namespaces/my-namespace {{.metadata.name}}: my-namespace
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1575009917-1842" deleted
... skipping 27 lines ...
namespace "namespace-1575010009-46" deleted
namespace "namespace-1575010010-17733" deleted
namespace "namespace-1575010012-22065" deleted
namespace "namespace-1575010013-15179" deleted
namespace "namespace-1575010051-13958" deleted
namespace "namespace-1575010051-6910" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:warning: deleting cluster-scoped resources
Successful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
namespace "kube-node-lease" deleted
namespace "my-namespace" deleted
namespace "namespace-1575009917-1842" deleted
... skipping 27 lines ...
namespace "namespace-1575010009-46" deleted
namespace "namespace-1575010010-17733" deleted
namespace "namespace-1575010012-22065" deleted
namespace "namespace-1575010013-15179" deleted
namespace "namespace-1575010051-13958" deleted
namespace "namespace-1575010051-6910" deleted
Error from server (Forbidden): namespaces "default" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-public" is forbidden: this namespace may not be deleted
Error from server (Forbidden): namespaces "kube-system" is forbidden: this namespace may not be deleted
has:namespace "my-namespace" deleted
core.sh:1335: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"other\" }}found{{end}}{{end}}:: :
namespace/other created
core.sh:1339: Successful get namespaces/other {{.metadata.name}}: other
core.sh:1343: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
pod/valid-pod created
E1129 06:47:47.912430   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1347: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E1129 06:47:48.011845   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1349: Successful get pods -n other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
E1129 06:47:48.097726   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: a resource cannot be retrieved by name across all namespaces
has:a resource cannot be retrieved by name across all namespaces
E1129 06:47:48.191601   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1356: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
core.sh:1360: Successful get pods --namespace=other {{range.items}}{{.metadata.name}}:{{end}}: 
(Bnamespace "other" deleted
E1129 06:47:48.913584   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:49.012921   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:49.098910   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:49.192783   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:49.899385   54238 shared_informer.go:197] Waiting for caches to sync for garbage collector
I1129 06:47:49.899449   54238 shared_informer.go:204] Caches are synced for garbage collector 
E1129 06:47:49.914402   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:49.964493   54238 shared_informer.go:197] Waiting for caches to sync for resource quota
I1129 06:47:49.964532   54238 shared_informer.go:204] Caches are synced for resource quota 
E1129 06:47:50.013969   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:50.099969   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:50.193852   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:50.915622   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:51.015139   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:51.101168   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:51.177809   54238 horizontal.go:341] Horizontal Pod Autoscaler busybox0 has been deleted in namespace-1575010051-13958
I1129 06:47:51.180898   54238 horizontal.go:341] Horizontal Pod Autoscaler busybox1 has been deleted in namespace-1575010051-13958
E1129 06:47:51.194986   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:51.916865   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:52.015872   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:52.101988   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:52.196010   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:52.918462   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:53.016869   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:53.102798   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:53.197367   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_secrets_test
Running command: run_secrets_test

+++ Running case: test-cmd.run_secrets_test 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 36 lines ...
  key1: dmFsdWUx
kind: Secret
metadata:
  creationTimestamp: null
  name: test
has not:example.com
E1129 06:47:53.919751   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:725: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-secrets\" }}found{{end}}{{end}}:: :
namespace/test-secrets created
E1129 06:47:54.017672   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:729: Successful get namespaces/test-secrets {{.metadata.name}}: test-secrets
E1129 06:47:54.103854   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:733: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:47:54.198444   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-secret created
core.sh:737: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:738: Successful get secret/test-secret --namespace=test-secrets {{.type}}: test-type
(Bsecret "test-secret" deleted
core.sh:748: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
core.sh:752: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:753: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/dockerconfigjson
E1129 06:47:54.920987   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:55.018851   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "test-secret" deleted
E1129 06:47:55.104739   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:763: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
secret/test-secret created
E1129 06:47:55.199574   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:766: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:767: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
(Bsecret "test-secret" deleted
secret/test-secret created
core.sh:773: Successful get secret/test-secret --namespace=test-secrets {{.metadata.name}}: test-secret
core.sh:774: Successful get secret/test-secret --namespace=test-secrets {{.type}}: kubernetes.io/tls
secret "test-secret" deleted
secret/secret-string-data created
E1129 06:47:55.921952   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:796: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E1129 06:47:56.020005   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:797: Successful get secret/secret-string-data --namespace=test-secrets  {{.data}}: map[k1:djE= k2:djI=]
E1129 06:47:56.105826   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:798: Successful get secret/secret-string-data --namespace=test-secrets  {{.stringData}}: <no value>
E1129 06:47:56.203592   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret "secret-string-data" deleted
core.sh:807: Successful get secrets --namespace=test-secrets {{range.items}}{{.metadata.name}}:{{end}}: 
(Bsecret "test-secret" deleted
namespace "test-secrets" deleted
E1129 06:47:56.923059   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:57.021198   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:57.088770   54238 namespace_controller.go:185] Namespace has been deleted my-namespace
E1129 06:47:57.106904   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:57.204666   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:57.499667   54238 namespace_controller.go:185] Namespace has been deleted kube-node-lease
I1129 06:47:57.507434   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009920-32696
I1129 06:47:57.511263   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009933-21066
I1129 06:47:57.521615   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009933-9837
I1129 06:47:57.533350   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009917-1842
I1129 06:47:57.534532   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009937-30605
... skipping 9 lines ...
I1129 06:47:57.757915   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009966-7793
I1129 06:47:57.764873   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009962-23587
I1129 06:47:57.770454   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009965-29031
I1129 06:47:57.786774   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009963-14012
I1129 06:47:57.800099   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009968-3305
I1129 06:47:57.918981   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009988-29960
E1129 06:47:57.924233   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:57.925072   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009970-263
I1129 06:47:57.929431   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009989-23922
I1129 06:47:57.967431   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010009-23173
I1129 06:47:57.969899   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009990-25858
I1129 06:47:57.980669   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010004-17183
I1129 06:47:57.983904   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009971-28694
I1129 06:47:58.000815   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575009998-4740
I1129 06:47:58.003872   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010009-46
I1129 06:47:58.006366   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010005-13669
E1129 06:47:58.022250   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:58.087461   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010010-17733
I1129 06:47:58.089347   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010013-15179
I1129 06:47:58.100064   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010012-22065
E1129 06:47:58.107951   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:58.114558   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010051-6910
I1129 06:47:58.145446   54238 namespace_controller.go:185] Namespace has been deleted namespace-1575010051-13958
E1129 06:47:58.205814   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:47:58.558349   54238 namespace_controller.go:185] Namespace has been deleted other
E1129 06:47:58.925281   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:59.023412   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:59.109037   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:59.206898   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:47:59.926491   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:00.024486   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:00.110230   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:00.208003   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:00.927605   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:01.025617   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:01.111342   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:01.209222   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_configmap_tests
Running command: run_configmap_tests

+++ Running case: test-cmd.run_configmap_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_configmap_tests
+++ [1129 06:48:01] Creating namespace namespace-1575010081-8840
namespace/namespace-1575010081-8840 created
Context "test" modified.
+++ [1129 06:48:01] Testing configmaps
E1129 06:48:01.928945   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:02.026721   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-configmap created
E1129 06:48:02.112495   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:28: Successful get configmap/test-configmap {{.metadata.name}}: test-configmap
E1129 06:48:02.210389   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
core.sh:33: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-configmaps\" }}found{{end}}{{end}}:: :
namespace/test-configmaps created
core.sh:37: Successful get namespaces/test-configmaps {{.metadata.name}}: test-configmaps
core.sh:41: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-configmap\" }}found{{end}}{{end}}:: :
core.sh:42: Successful get configmaps {{range.items}}{{ if eq .metadata.name \"test-binary-configmap\" }}found{{end}}{{end}}:: :
configmap/test-configmap created
configmap/test-binary-configmap created
core.sh:48: Successful get configmap/test-configmap --namespace=test-configmaps {{.metadata.name}}: test-configmap
(Bcore.sh:49: Successful get configmap/test-binary-configmap --namespace=test-configmaps {{.metadata.name}}: test-binary-configmap
E1129 06:48:02.929991   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:03.027794   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:03.119261   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-configmap" deleted
E1129 06:48:03.211506   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap "test-binary-configmap" deleted
namespace "test-configmaps" deleted
E1129 06:48:03.931121   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:04.028983   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:04.120411   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:04.212556   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:04.932210   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:05.030164   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:05.121704   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:05.213621   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:05.933339   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:06.031409   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:06.122887   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:06.214818   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:48:06.638501   54238 namespace_controller.go:185] Namespace has been deleted test-secrets
E1129 06:48:06.934646   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:07.032683   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:07.124053   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:07.215868   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:07.935826   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:08.033966   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:08.125269   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:08.216978   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_client_config_tests
Running command: run_client_config_tests

+++ Running case: test-cmd.run_client_config_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_client_config_tests
+++ [1129 06:48:08] Creating namespace namespace-1575010088-22776
namespace/namespace-1575010088-22776 created
Context "test" modified.
+++ [1129 06:48:08] Testing client config
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:error: stat missing: no such file or directory
has:missing: no such file or directory
Successful
message:Error in configuration: context was not found for specified context: missing-context
has:context was not found for specified context: missing-context
Successful
message:error: no server found for cluster "missing-cluster"
has:no server found for cluster "missing-cluster"
E1129 06:48:08.937070   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: auth info "missing-user" does not exist
has:auth info "missing-user" does not exist
E1129 06:48:09.035059   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: error loading config file "/tmp/newconfig.yaml": no kind "Config" is registered for version "v-1" in scheme "k8s.io/client-go/tools/clientcmd/api/latest/latest.go:50"
has:error loading config file
E1129 06:48:09.126516   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: stat missing-config: no such file or directory
has:no such file or directory
+++ exit code: 0
Recording: run_service_accounts_tests
Running command: run_service_accounts_tests
E1129 06:48:09.218279   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_service_accounts_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_accounts_tests
+++ [1129 06:48:09] Creating namespace namespace-1575010089-30628
namespace/namespace-1575010089-30628 created
... skipping 3 lines ...
namespace/test-service-accounts created
core.sh:832: Successful get namespaces/test-service-accounts {{.metadata.name}}: test-service-accounts
serviceaccount/test-service-account created
core.sh:838: Successful get serviceaccount/test-service-account --namespace=test-service-accounts {{.metadata.name}}: test-service-account
(Bserviceaccount "test-service-account" deleted
namespace "test-service-accounts" deleted
E1129 06:48:09.937965   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 15 lines of the same reflector error ...
I1129 06:48:13.371634   54238 namespace_controller.go:185] Namespace has been deleted test-configmaps
E1129 06:48:13.942654   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:14.041137   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:14.132347   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:14.224146   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:14.943382   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
E1129 06:48:15.042278   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_job_tests
Running command: run_job_tests

+++ Running case: test-cmd.run_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_job_tests
+++ [1129 06:48:15] Creating namespace namespace-1575010095-8767
E1129 06:48:15.133427   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1575010095-8767 created
E1129 06:48:15.225281   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1129 06:48:15] Testing job
batch.sh:30: Successful get namespaces {{range.items}}{{ if eq .metadata.name \"test-jobs\" }}found{{end}}{{end}}:: :
namespace/test-jobs created
batch.sh:34: Successful get namespaces/test-jobs {{.metadata.name}}: test-jobs
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
... skipping 6 lines ...
Labels:                        run=pi
Annotations:                   <none>
Schedule:                      59 23 31 2 *
Concurrency Policy:            Allow
Suspend:                       False
Successful Job History Limit:  3
Failed Job History Limit:      1
Starting Deadline Seconds:     <unset>
Selector:                      <unset>
Parallelism:                   <unset>
Completions:                   <unset>
Pod Template:
  Labels:  run=pi
... skipping 16 lines ...
Last Schedule Time:  <unset>
Active Jobs:         <none>
Events:              <none>
Successful
message:job.batch/test-job
has:job.batch/test-job
E1129 06:48:15.944582   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:48: Successful get jobs {{range.items}}{{.metadata.name}}{{end}}: 
I1129 06:48:16.035432   54238 event.go:281] Event(v1.ObjectReference{Kind:"Job", Namespace:"test-jobs", Name:"test-job", UID:"06ce7187-52e6-4c1f-8892-7c5739fc4c29", APIVersion:"batch/v1", ResourceVersion:"1486", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-tw8vv
job.batch/test-job created
E1129 06:48:16.043072   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
batch.sh:53: Successful get job/test-job --namespace=test-jobs {{.metadata.name}}: test-job
E1129 06:48:16.134570   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME       COMPLETIONS   DURATION   AGE
test-job   0/1           0s         0s
E1129 06:48:16.226354   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Name:           test-job
Namespace:      test-jobs
Selector:       controller-uid=06ce7187-52e6-4c1f-8892-7c5739fc4c29
Labels:         controller-uid=06ce7187-52e6-4c1f-8892-7c5739fc4c29
                job-name=test-job
                run=pi
Annotations:    cronjob.kubernetes.io/instantiate: manual
Controlled By:  CronJob/pi
Parallelism:    1
Completions:    1
Start Time:     Fri, 29 Nov 2019 06:48:16 +0000
Pods Statuses:  1 Running / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  controller-uid=06ce7187-52e6-4c1f-8892-7c5739fc4c29
           job-name=test-job
           run=pi
  Containers:
   pi:
... skipping 15 lines ...
  Type    Reason            Age   From            Message
  ----    ------            ----  ----            -------
  Normal  SuccessfulCreate  0s    job-controller  Created pod: test-job-tw8vv
job.batch "test-job" deleted
cronjob.batch "pi" deleted
namespace "test-jobs" deleted
E1129 06:48:16.945628   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 12 lines of the same reflector error ...
I1129 06:48:20.006142   54238 namespace_controller.go:185] Namespace has been deleted test-service-accounts
E1129 06:48:20.047874   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:20.138879   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:20.230609   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:20.949900   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:21.048999   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:21.140219   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:21.231763   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
Recording: run_create_job_tests
Running command: run_create_job_tests

+++ Running case: test-cmd.run_create_job_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_create_job_tests
+++ [1129 06:48:21] Creating namespace namespace-1575010101-11345
namespace/namespace-1575010101-11345 created
Context "test" modified.
I1129 06:48:21.864385   54238 event.go:281] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1575010101-11345", Name:"test-job", UID:"dd9528a2-a97c-439c-9466-e7d0554aeef8", APIVersion:"batch/v1", ResourceVersion:"1507", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-67h54
job.batch/test-job created
E1129 06:48:21.950881   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:86: Successful get job test-job {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/nginx:test-cmd
job.batch "test-job" deleted
E1129 06:48:22.049926   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:48:22.100324   54238 event.go:281] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1575010101-11345", Name:"test-job-pi", UID:"bf5f9cf5-08a7-4a6e-b511-0d00b5608ac3", APIVersion:"batch/v1", ResourceVersion:"1514", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-job-pi-xdjhm
job.batch/test-job-pi created
E1129 06:48:22.141020   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
create.sh:92: Successful get job test-job-pi {{(index .spec.template.spec.containers 0).image}}: k8s.gcr.io/perl
E1129 06:48:22.232873   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
job.batch "test-job-pi" deleted
kubectl run --generator=cronjob/v1beta1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
cronjob.batch/test-pi created
I1129 06:48:22.412044   54238 event.go:281] Event(v1.ObjectReference{Kind:"Job", Namespace:"namespace-1575010101-11345", Name:"my-pi", UID:"3db20591-918e-41b6-9580-6b195adeb9e2", APIVersion:"batch/v1", ResourceVersion:"1522", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: my-pi-mg5pb
job.batch/my-pi created
Successful
... skipping 10 lines ...
+++ command: run_pod_templates_tests
+++ [1129 06:48:22] Creating namespace namespace-1575010102-9416
namespace/namespace-1575010102-9416 created
Context "test" modified.
+++ [1129 06:48:22] Testing pod templates
core.sh:1421: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:22.951879   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:23.051139   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:48:23.103129   50776 controller.go:606] quota admission added evaluator for: podtemplates
podtemplate/nginx created
E1129 06:48:23.142052   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1425: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
E1129 06:48:23.233978   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
NAME    CONTAINERS   IMAGES   POD LABELS
nginx   nginx        nginx    name=nginx
core.sh:1433: Successful get podtemplates {{range.items}}{{.metadata.name}}:{{end}}: nginx:
podtemplate "nginx" deleted
core.sh:1437: Successful get podtemplate {{range.items}}{{.metadata.name}}:{{end}}: 
+++ exit code: 0
... skipping 3 lines ...
+++ Running case: test-cmd.run_service_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_service_tests
Context "test" modified.
+++ [1129 06:48:23] Testing kubectl(v1:services)
core.sh:858: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
E1129 06:48:23.953069   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master created
E1129 06:48:24.052238   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:862: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E1129 06:48:24.143148   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Labels:
matched Selector:
matched IP:
matched Port:
matched Endpoints:
... skipping 10 lines ...
IP:                10.0.0.192
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
E1129 06:48:24.235077   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:866: Successful describe
Name:              redis-master
Namespace:         default
Labels:            app=redis
                   role=master
                   tier=backend
... skipping 161 lines ...
Port:              <unset>  6379/TCP
TargetPort:        6379/TCP
Endpoints:         <none>
Session Affinity:  None
Events:            <none>
(Bcore.sh:882: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
E1129 06:48:24.954144   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: null
  labels:
    app: redis
... skipping 5 lines ...
  - port: 6379
    targetPort: 6379
  selector:
    role: padawan
status:
  loadBalancer: {}
E1129 06:48:25.053847   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2019-11-29T06:48:23Z"
  labels:
    app: redis
... skipping 14 lines ...
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
service/redis-master selector updated
E1129 06:48:25.144134   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:890: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: padawan:
E1129 06:48:25.236253   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/redis-master selector updated
core.sh:894: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
apiVersion: v1
kind: Service
metadata:
  creationTimestamp: "2019-11-29T06:48:23Z"
... skipping 15 lines ...
  selector:
    role: padawan
  sessionAffinity: None
  type: ClusterIP
status:
  loadBalancer: {}
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
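The error above is kubectl's --local guard: with --local it never contacts the server, so the object has to be supplied via --filename. A sketch of the two variants being contrasted here (the manifest path is a placeholder):

  kubectl set selector svc redis-master role=padawan --local --dry-run -o yaml               # fails: resource given by name, not by -f
  kubectl set selector -f redis-master-service.yaml role=padawan --local --dry-run -o yaml   # prints the modified object without persisting it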
core.sh:898: Successful get services redis-master {{range.spec.selector}}{{.}}:{{end}}: redis:master:backend:
service/redis-master selector updated
Successful
message:Error from server (Conflict): Operation cannot be fulfilled on services "redis-master": the object has been modified; please apply your changes to the latest version and try again
has:Conflict
E1129 06:48:25.955277   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:26.054874   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:911: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
E1129 06:48:26.146217   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
E1129 06:48:26.239278   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:918: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:922: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
I1129 06:48:26.574976   54238 namespace_controller.go:185] Namespace has been deleted test-jobs
core.sh:926: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
core.sh:930: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:
service/service-v1-test created
core.sh:951: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
E1129 06:48:26.957464   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:27.055956   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/service-v1-test replaced
E1129 06:48:27.147264   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:958: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:service-v1-test:
E1129 06:48:27.240532   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
service "service-v1-test" deleted
core.sh:966: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:970: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/redis-master created
service/redis-slave created
core.sh:975: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E1129 06:48:27.958523   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME           RSRC
kubernetes     144
redis-master   1561
redis-slave    1564
has:redis-master
E1129 06:48:28.056973   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:985: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:redis-master:redis-slave:
E1129 06:48:28.148443   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "redis-master" deleted
service "redis-slave" deleted
E1129 06:48:28.241841   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:992: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
core.sh:996: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: kubernetes:
service/beep-boop created
core.sh:1000: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
core.sh:1004: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: beep-boop:kubernetes:
service "beep-boop" deleted
... skipping 2 lines ...
kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
I1129 06:48:28.914406   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"default", Name:"testmetadata", UID:"35491869-f8e5-4931-aec7-bb361e8f7707", APIVersion:"apps/v1", ResourceVersion:"1578", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set testmetadata-bd968f46 to 2
I1129 06:48:28.921280   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"66ce2f99-2910-4c42-800d-7fca09187440", APIVersion:"apps/v1", ResourceVersion:"1579", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-c5rw7
I1129 06:48:28.925526   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"default", Name:"testmetadata-bd968f46", UID:"66ce2f99-2910-4c42-800d-7fca09187440", APIVersion:"apps/v1", ResourceVersion:"1579", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: testmetadata-bd968f46-95b84
service/testmetadata created
deployment.apps/testmetadata created
E1129 06:48:28.959408   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1019: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: testmetadata:
E1129 06:48:29.058042   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1020: Successful get service testmetadata {{.metadata.annotations}}: map[zone-context:home]
E1129 06:48:29.149561   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/exposemetadata exposed
E1129 06:48:29.242891   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1026: Successful get service exposemetadata {{.metadata.annotations}}: map[zone-context:work]
(Bservice "exposemetadata" deleted
service "testmetadata" deleted
deployment.apps "testmetadata" deleted
+++ exit code: 0
Recording: run_daemonset_tests
... skipping 7 lines ...
Context "test" modified.
+++ [1129 06:48:29] Testing kubectl(v1:daemonsets)
apps.sh:30: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
I1129 06:48:29.909788   50776 controller.go:606] quota admission added evaluator for: daemonsets.apps
daemonset.apps/bind created
I1129 06:48:29.921129   50776 controller.go:606] quota admission added evaluator for: controllerrevisions.apps
E1129 06:48:29.960782   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:34: Successful get daemonsets bind {{.metadata.generation}}: 1
E1129 06:48:30.059166   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:30.150893   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
E1129 06:48:30.244205   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:37: Successful get daemonsets bind {{.metadata.generation}}: 1
daemonset.apps/bind image updated
apps.sh:40: Successful get daemonsets bind {{.metadata.generation}}: 2
daemonset.apps/bind env updated
apps.sh:42: Successful get daemonsets bind {{.metadata.generation}}: 3
daemonset.apps/bind resource requirements updated
apps.sh:44: Successful get daemonsets bind {{.metadata.generation}}: 4
daemonset.apps/bind restarted
apps.sh:48: Successful get daemonsets bind {{.metadata.generation}}: 5
E1129 06:48:30.961857   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps "bind" deleted
+++ exit code: 0
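Each generation bump recorded by apps.sh:40-48 above corresponds to one mutating kubectl command against the daemonset; a sketch of the likely sequence (the image, env, and resource values are assumptions):

  kubectl set image daemonset/bind '*=k8s.gcr.io/pause:latest'
  kubectl set env daemonset/bind foo=bar
  kubectl set resources daemonset/bind --limits=cpu=200m,memory=512Mi
  kubectl rollout restart daemonset/bind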
E1129 06:48:31.060250   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_daemonset_history_tests
Running command: run_daemonset_history_tests

+++ Running case: test-cmd.run_daemonset_history_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_daemonset_history_tests
+++ [1129 06:48:31] Creating namespace namespace-1575010111-27727
E1129 06:48:31.152006   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1575010111-27727 created
Context "test" modified.
E1129 06:48:31.245393   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1129 06:48:31] Testing kubectl(v1:daemonsets, v1:controllerrevisions)
apps.sh:66: Successful get daemonsets {{range.items}}{{.metadata.name}}:{{end}}: 
daemonset.apps/bind created
apps.sh:70: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1575010111-27727"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind skipped rollback (current template already matches revision 1)
apps.sh:73: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:74: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E1129 06:48:31.962949   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind configured
E1129 06:48:32.061224   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:77: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
E1129 06:48:32.153115   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:78: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1129 06:48:32.246491   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:79: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bapps.sh:80: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[deprecated.daemonset.template.generation:1 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1575010111-27727"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:2.0","name":"kubernetes-pause"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[deprecated.daemonset.template.generation:2 kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"DaemonSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"service":"bind"},"name":"bind","namespace":"namespace-1575010111-27727"},"spec":{"selector":{"matchLabels":{"service":"bind"}},"template":{"metadata":{"labels":{"service":"bind"}},"spec":{"affinity":{"podAntiAffinity":{"requiredDuringSchedulingIgnoredDuringExecution":[{"labelSelector":{"matchExpressions":[{"key":"service","operator":"In","values":["bind"]}]},"namespaces":[],"topologyKey":"kubernetes.io/hostname"}]}},"containers":[{"image":"k8s.gcr.io/pause:latest","name":"kubernetes-pause"},{"image":"k8s.gcr.io/nginx:test-cmd","name":"app"}]}},"updateStrategy":{"rollingUpdate":{"maxUnavailable":"10%"},"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
daemonset.apps/bind will roll back to Pod Template:
  Labels:	service=bind
... skipping 7 lines ...
  Volumes:	<none>
 (dry run)
apps.sh:83: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:84: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:85: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps/bind rolled back
E1129 06:48:32.791324   54238 daemon_controller.go:290] namespace-1575010111-27727/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1575010111-27727", SelfLink:"/apis/apps/v1/namespaces/namespace-1575010111-27727/daemonsets/bind", UID:"b0c484af-66d3-4bed-b48f-30e212b01661", ResourceVersion:"1644", Generation:3, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63710606911, loc:(*time.Location)(0x6b7e260)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"3", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1575010111-27727\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc00175a280), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:2.0", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, 
EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc001b3b288), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc000a03860), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc00175a2c0), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc001b56a78)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc001b3b2dc)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:2, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:88: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
apps.sh:89: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E1129 06:48:32.967632   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
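The rollback steps above (dry run, rollback, bad revision) map onto kubectl rollout; a sketch of the corresponding calls, with the revision numbers taken from the log and everything else standard:

  kubectl rollout undo daemonset bind --dry-run
  kubectl rollout undo daemonset bind
  kubectl rollout undo daemonset bind --to-revision=1000000   # error: unable to find specified revision 1000000 in history
  kubectl rollout history daemonset bind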
E1129 06:48:33.062187   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:93: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:2.0:
E1129 06:48:33.154168   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:94: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
E1129 06:48:33.247564   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
daemonset.apps/bind rolled back
E1129 06:48:33.338640   54238 daemon_controller.go:290] namespace-1575010111-27727/bind failed with : error storing status for daemon set &v1.DaemonSet{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"bind", GenerateName:"", Namespace:"namespace-1575010111-27727", SelfLink:"/apis/apps/v1/namespaces/namespace-1575010111-27727/daemonsets/bind", UID:"b0c484af-66d3-4bed-b48f-30e212b01661", ResourceVersion:"1649", Generation:4, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:63710606911, loc:(*time.Location)(0x6b7e260)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string{"deprecated.daemonset.template.generation":"4", "kubectl.kubernetes.io/last-applied-configuration":"{\"apiVersion\":\"apps/v1\",\"kind\":\"DaemonSet\",\"metadata\":{\"annotations\":{\"kubernetes.io/change-cause\":\"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true\"},\"labels\":{\"service\":\"bind\"},\"name\":\"bind\",\"namespace\":\"namespace-1575010111-27727\"},\"spec\":{\"selector\":{\"matchLabels\":{\"service\":\"bind\"}},\"template\":{\"metadata\":{\"labels\":{\"service\":\"bind\"}},\"spec\":{\"affinity\":{\"podAntiAffinity\":{\"requiredDuringSchedulingIgnoredDuringExecution\":[{\"labelSelector\":{\"matchExpressions\":[{\"key\":\"service\",\"operator\":\"In\",\"values\":[\"bind\"]}]},\"namespaces\":[],\"topologyKey\":\"kubernetes.io/hostname\"}]}},\"containers\":[{\"image\":\"k8s.gcr.io/pause:latest\",\"name\":\"kubernetes-pause\"},{\"image\":\"k8s.gcr.io/nginx:test-cmd\",\"name\":\"app\"}]}},\"updateStrategy\":{\"rollingUpdate\":{\"maxUnavailable\":\"10%\"},\"type\":\"RollingUpdate\"}}}\n", "kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-daemonset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"}, OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.DaemonSetSpec{Selector:(*v1.LabelSelector)(0xc000a14e20), Template:v1.PodTemplateSpec{ObjectMeta:v1.ObjectMeta{Name:"", GenerateName:"", Namespace:"", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"service":"bind"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v1.PodSpec{Volumes:[]v1.Volume(nil), InitContainers:[]v1.Container(nil), Containers:[]v1.Container{v1.Container{Name:"kubernetes-pause", Image:"k8s.gcr.io/pause:latest", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}, v1.Container{Name:"app", 
Image:"k8s.gcr.io/nginx:test-cmd", Command:[]string(nil), Args:[]string(nil), WorkingDir:"", Ports:[]v1.ContainerPort(nil), EnvFrom:[]v1.EnvFromSource(nil), Env:[]v1.EnvVar(nil), Resources:v1.ResourceRequirements{Limits:v1.ResourceList(nil), Requests:v1.ResourceList(nil)}, VolumeMounts:[]v1.VolumeMount(nil), VolumeDevices:[]v1.VolumeDevice(nil), LivenessProbe:(*v1.Probe)(nil), ReadinessProbe:(*v1.Probe)(nil), StartupProbe:(*v1.Probe)(nil), Lifecycle:(*v1.Lifecycle)(nil), TerminationMessagePath:"/dev/termination-log", TerminationMessagePolicy:"File", ImagePullPolicy:"IfNotPresent", SecurityContext:(*v1.SecurityContext)(nil), Stdin:false, StdinOnce:false, TTY:false}}, EphemeralContainers:[]v1.EphemeralContainer(nil), RestartPolicy:"Always", TerminationGracePeriodSeconds:(*int64)(0xc001d41e48), ActiveDeadlineSeconds:(*int64)(nil), DNSPolicy:"ClusterFirst", NodeSelector:map[string]string(nil), ServiceAccountName:"", DeprecatedServiceAccount:"", AutomountServiceAccountToken:(*bool)(nil), NodeName:"", HostNetwork:false, HostPID:false, HostIPC:false, ShareProcessNamespace:(*bool)(nil), SecurityContext:(*v1.PodSecurityContext)(0xc0014f9da0), ImagePullSecrets:[]v1.LocalObjectReference(nil), Hostname:"", Subdomain:"", Affinity:(*v1.Affinity)(0xc000a14e80), SchedulerName:"default-scheduler", Tolerations:[]v1.Toleration(nil), HostAliases:[]v1.HostAlias(nil), PriorityClassName:"", Priority:(*int32)(nil), DNSConfig:(*v1.PodDNSConfig)(nil), ReadinessGates:[]v1.PodReadinessGate(nil), RuntimeClassName:(*string)(nil), EnableServiceLinks:(*bool)(nil), PreemptionPolicy:(*v1.PreemptionPolicy)(nil), Overhead:v1.ResourceList(nil), TopologySpreadConstraints:[]v1.TopologySpreadConstraint(nil)}}, UpdateStrategy:v1.DaemonSetUpdateStrategy{Type:"RollingUpdate", RollingUpdate:(*v1.RollingUpdateDaemonSet)(0xc001b56cb0)}, MinReadySeconds:0, RevisionHistoryLimit:(*int32)(0xc001d41e9c)}, Status:v1.DaemonSetStatus{CurrentNumberScheduled:0, NumberMisscheduled:0, DesiredNumberScheduled:0, NumberReady:0, ObservedGeneration:3, UpdatedNumberScheduled:0, NumberAvailable:0, NumberUnavailable:0, CollisionCount:(*int32)(nil), Conditions:[]v1.DaemonSetCondition(nil)}}: Operation cannot be fulfilled on daemonsets.apps "bind": the object has been modified; please apply your changes to the latest version and try again
apps.sh:97: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/pause:latest:
apps.sh:98: Successful get daemonset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
apps.sh:99: Successful get daemonset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
daemonset.apps "bind" deleted
+++ exit code: 0
Recording: run_rc_tests
... skipping 3 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rc_tests
+++ [1129 06:48:33] Creating namespace namespace-1575010113-23696
namespace/namespace-1575010113-23696 created
Context "test" modified.
+++ [1129 06:48:33] Testing kubectl(v1:replicationcontrollers)
E1129 06:48:33.968542   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1052: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:34.063348   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I1129 06:48:34.129424   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"a0517095-2a2b-4003-b9b8-12a499313ace", APIVersion:"v1", ResourceVersion:"1657", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-96m2j
I1129 06:48:34.131845   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"a0517095-2a2b-4003-b9b8-12a499313ace", APIVersion:"v1", ResourceVersion:"1657", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gmmbz
I1129 06:48:34.132081   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"a0517095-2a2b-4003-b9b8-12a499313ace", APIVersion:"v1", ResourceVersion:"1657", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dnq2q
E1129 06:48:34.154954   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "frontend" deleted
E1129 06:48:34.248539   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1057: Successful get pods -l "name=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1061: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I1129 06:48:34.531889   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"fe824177-81ad-4e1c-a8ce-fd83c7c49d73", APIVersion:"v1", ResourceVersion:"1673", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-j4rds
I1129 06:48:34.535202   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"fe824177-81ad-4e1c-a8ce-fd83c7c49d73", APIVersion:"v1", ResourceVersion:"1673", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k2sbh
I1129 06:48:34.535719   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"fe824177-81ad-4e1c-a8ce-fd83c7c49d73", APIVersion:"v1", ResourceVersion:"1673", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-55lz8
... skipping 11 lines ...
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 18 lines ...
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 4 lines ...
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
E1129 06:48:34.969593   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1073: Successful describe
Name:         frontend
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 10 lines ...
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-j4rds
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-k2sbh
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-55lz8
E1129 06:48:35.064358   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
matched Pods Status:
matched Volumes:
E1129 06:48:35.155930   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched GET_HOSTS_FROM:
Successful describe rc:
Name:         frontend
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 17 lines ...
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                    Message
  ----    ------            ----  ----                    -------
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-j4rds
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-k2sbh
  Normal  SuccessfulCreate  1s    replication-controller  Created pod: frontend-55lz8
E1129 06:48:35.249633   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 11 lines ...
Namespace:    namespace-1575010113-23696
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v4
... skipping 15 lines ...
core.sh:1085: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E1129 06:48:35.617073   54238 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1575010113-23696 /api/v1/namespaces/namespace-1575010113-23696/replicationcontrollers/frontend fe824177-81ad-4e1c-a8ce-fd83c7c49d73 1684 2 2019-11-29 06:48:34 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc002a3b5b8 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I1129 06:48:35.622852   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"fe824177-81ad-4e1c-a8ce-fd83c7c49d73", APIVersion:"v1", ResourceVersion:"1684", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-j4rds
core.sh:1089: Successful get rc frontend {{.spec.replicas}}: 2
core.sh:1093: Successful get rc frontend {{.spec.replicas}}: 2
error: Expected replicas to be 3, was 2
E1129 06:48:35.971485   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1097: Successful get rc frontend {{.spec.replicas}}: 2
E1129 06:48:36.065701   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1101: Successful get rc frontend {{.spec.replicas}}: 2
E1129 06:48:36.157039   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend scaled
I1129 06:48:36.174978   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"fe824177-81ad-4e1c-a8ce-fd83c7c49d73", APIVersion:"v1", ResourceVersion:"1690", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-b46tb
E1129 06:48:36.250777   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1105: Successful get rc frontend {{.spec.replicas}}: 3
core.sh:1109: Successful get rc frontend {{.spec.replicas}}: 3
replicationcontroller/frontend scaled
E1129 06:48:36.429163   54238 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1575010113-23696 /api/v1/namespaces/namespace-1575010113-23696/replicationcontrollers/frontend fe824177-81ad-4e1c-a8ce-fd83c7c49d73 1695 4 2019-11-29 06:48:34 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v4 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0023f9698 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:3,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I1129 06:48:36.436105   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"fe824177-81ad-4e1c-a8ce-fd83c7c49d73", APIVersion:"v1", ResourceVersion:"1695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-b46tb
core.sh:1113: Successful get rc frontend {{.spec.replicas}}: 2
replicationcontroller "frontend" deleted
replicationcontroller/redis-master created
I1129 06:48:36.753923   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-master", UID:"6cbc9919-bb52-4859-9390-cb988b67ad22", APIVersion:"v1", ResourceVersion:"1706", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-nss9s
replicationcontroller/redis-slave created
I1129 06:48:36.913816   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-slave", UID:"dbdf3732-15dd-4d86-88da-d818ce2064e3", APIVersion:"v1", ResourceVersion:"1713", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-pzhzk
I1129 06:48:36.916657   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-slave", UID:"dbdf3732-15dd-4d86-88da-d818ce2064e3", APIVersion:"v1", ResourceVersion:"1713", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-722tl
E1129 06:48:36.972522   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-master scaled
I1129 06:48:37.002369   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-master", UID:"6cbc9919-bb52-4859-9390-cb988b67ad22", APIVersion:"v1", ResourceVersion:"1720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-k46mb
I1129 06:48:37.004821   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-master", UID:"6cbc9919-bb52-4859-9390-cb988b67ad22", APIVersion:"v1", ResourceVersion:"1720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-7ch7n
replicationcontroller/redis-slave scaled
I1129 06:48:37.006838   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-master", UID:"6cbc9919-bb52-4859-9390-cb988b67ad22", APIVersion:"v1", ResourceVersion:"1720", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-master-h5nfz
I1129 06:48:37.007825   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-slave", UID:"dbdf3732-15dd-4d86-88da-d818ce2064e3", APIVersion:"v1", ResourceVersion:"1722", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-9q77x
I1129 06:48:37.011679   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-slave", UID:"dbdf3732-15dd-4d86-88da-d818ce2064e3", APIVersion:"v1", ResourceVersion:"1722", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-mbnp4
E1129 06:48:37.066887   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1123: Successful get rc redis-master {{.spec.replicas}}: 4
E1129 06:48:37.157991   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1124: Successful get rc redis-slave {{.spec.replicas}}: 4
E1129 06:48:37.251843   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "redis-master" deleted
replicationcontroller "redis-slave" deleted
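For reference, the scale assertions above (core.sh:1085-1124) exercise kubectl scale against replication controllers; the exact flags used by core.sh are not shown in this excerpt, so the following invocations are an illustrative sketch only:
  kubectl scale rc frontend --replicas=2
  kubectl scale rc frontend --current-replicas=3 --replicas=3   # precondition form; the "error: Expected replicas to be 3, was 2" line above is this check failing
  kubectl scale rc/redis-master rc/redis-slave --replicas=4     # several controllers scaled in one call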
deployment.apps/nginx-deployment created
I1129 06:48:37.426897   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment", UID:"5b8af72e-d0bd-4d7b-b04b-60d7093c15cb", APIVersion:"apps/v1", ResourceVersion:"1754", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1129 06:48:37.429139   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-6986c7bc94", UID:"8b46a399-b737-47d9-b1cd-a09d3da63fdd", APIVersion:"apps/v1", ResourceVersion:"1755", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-glbxm
I1129 06:48:37.431432   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-6986c7bc94", UID:"8b46a399-b737-47d9-b1cd-a09d3da63fdd", APIVersion:"apps/v1", ResourceVersion:"1755", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-xggqz
... skipping 6 lines ...
deployment.apps "nginx-deployment" deleted
Successful
message:service/expose-test-deployment exposed
has:service/expose-test-deployment exposed
service "expose-test-deployment" deleted
Successful
message:error: couldn't retrieve selectors via --selector flag or introspection: invalid deployment: no selectors, therefore cannot be exposed
See 'kubectl expose -h' for help and examples
has:invalid deployment: no selectors
E1129 06:48:37.973729   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:38.068046   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1129 06:48:38.079426   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment", UID:"c6574451-fcf9-4e36-87b2-156b67d4f63e", APIVersion:"apps/v1", ResourceVersion:"1792", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1129 06:48:38.085935   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-6986c7bc94", UID:"8ced7aee-cb16-4099-9565-1c1a8d99de8d", APIVersion:"apps/v1", ResourceVersion:"1793", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-p5lc9
I1129 06:48:38.088073   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-6986c7bc94", UID:"8ced7aee-cb16-4099-9565-1c1a8d99de8d", APIVersion:"apps/v1", ResourceVersion:"1793", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-4qrhl
I1129 06:48:38.089148   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-6986c7bc94", UID:"8ced7aee-cb16-4099-9565-1c1a8d99de8d", APIVersion:"apps/v1", ResourceVersion:"1793", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-7r5gf
E1129 06:48:38.159598   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1152: Successful get deployment nginx-deployment {{.spec.replicas}}: 3
service/nginx-deployment exposed
E1129 06:48:38.252645   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1156: Successful get service nginx-deployment {{(index .spec.ports 0).port}}: 80
deployment.apps "nginx-deployment" deleted
service "nginx-deployment" deleted
replicationcontroller/frontend created
I1129 06:48:38.578938   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"b0ca1a84-abe8-40f5-b27a-f4b0cb85697d", APIVersion:"v1", ResourceVersion:"1820", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-487rz
I1129 06:48:38.581217   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"b0ca1a84-abe8-40f5-b27a-f4b0cb85697d", APIVersion:"v1", ResourceVersion:"1820", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-v9fkt
I1129 06:48:38.583872   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"b0ca1a84-abe8-40f5-b27a-f4b0cb85697d", APIVersion:"v1", ResourceVersion:"1820", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-tmzwd
core.sh:1163: Successful get rc frontend {{.spec.replicas}}: 3
service/frontend exposed
core.sh:1167: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
service/frontend-2 exposed
E1129 06:48:38.974793   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1171: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 443
E1129 06:48:39.069193   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E1129 06:48:39.160650   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend-3 exposed
E1129 06:48:39.253460   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1176: Successful get service frontend-3 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 444
service/frontend-4 exposed
core.sh:1180: Successful get service frontend-4 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
service/frontend-5 exposed
core.sh:1184: Successful get service frontend-5 {{(index .spec.ports 0).port}}: 80
pod "valid-pod" deleted
service "frontend" deleted
service "frontend-2" deleted
service "frontend-3" deleted
service "frontend-4" deleted
service "frontend-5" deleted
Successful
message:error: cannot expose a Node
has:cannot expose
E1129 06:48:39.975925   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:The Service "invalid-large-service-name-that-has-more-than-sixty-three-characters" is invalid: metadata.name: Invalid value: "invalid-large-service-name-that-has-more-than-sixty-three-characters": must be no more than 63 characters
has:metadata.name: Invalid value
E1129 06:48:40.070405   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
has:kubernetes-serve-hostname-testing-sixty-three-characters-in-len exposed
E1129 06:48:40.161927   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "kubernetes-serve-hostname-testing-sixty-three-characters-in-len" deleted
E1129 06:48:40.254610   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:service/etcd-server exposed
has:etcd-server exposed
core.sh:1214: Successful get service etcd-server {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: port-1 2380
core.sh:1215: Successful get service etcd-server {{(index .spec.ports 1).name}} {{(index .spec.ports 1).port}}: port-2 2379
service "etcd-server" deleted
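The expose checks above (core.sh:1163-1215) exercise kubectl expose against an rc, a pod, and a multi-port resource; the exact flags in core.sh are not shown in this excerpt, so the following are illustrative shapes only:
  kubectl expose rc frontend --port=80
  kubectl expose rc frontend --name=frontend-2 --port=443
  kubectl expose pod valid-pod --name=frontend-3 --port=444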
core.sh:1221: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
replicationcontroller "frontend" deleted
core.sh:1225: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1229: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:40.976917   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/frontend created
I1129 06:48:41.024388   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"79e4286b-2b2b-46d5-b217-953255d0b6fd", APIVersion:"v1", ResourceVersion:"1885", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-dxxjn
I1129 06:48:41.027080   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"79e4286b-2b2b-46d5-b217-953255d0b6fd", APIVersion:"v1", ResourceVersion:"1885", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-xp8vg
I1129 06:48:41.028326   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"79e4286b-2b2b-46d5-b217-953255d0b6fd", APIVersion:"v1", ResourceVersion:"1885", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-sdw5r
E1129 06:48:41.071372   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:41.163001   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/redis-slave created
I1129 06:48:41.178862   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-slave", UID:"d65f2751-4abe-42b6-a73d-ce4efebf7404", APIVersion:"v1", ResourceVersion:"1894", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-6h8wh
I1129 06:48:41.181160   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"redis-slave", UID:"d65f2751-4abe-42b6-a73d-ce4efebf7404", APIVersion:"v1", ResourceVersion:"1894", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-dmpkk
E1129 06:48:41.255565   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1234: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
core.sh:1238: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
replicationcontroller "frontend" deleted
replicationcontroller "redis-slave" deleted
core.sh:1242: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
core.sh:1246: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
replicationcontroller/frontend created
I1129 06:48:41.778871   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"df9f1602-720a-4cf6-bde0-a05dfaa0bc80", APIVersion:"v1", ResourceVersion:"1915", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rwb2p
I1129 06:48:41.780952   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"df9f1602-720a-4cf6-bde0-a05dfaa0bc80", APIVersion:"v1", ResourceVersion:"1915", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-285c8
I1129 06:48:41.783644   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010113-23696", Name:"frontend", UID:"df9f1602-720a-4cf6-bde0-a05dfaa0bc80", APIVersion:"v1", ResourceVersion:"1915", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-gbpdw
core.sh:1249: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: frontend:
horizontalpodautoscaler.autoscaling/frontend autoscaled
E1129 06:48:41.978087   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1252: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
E1129 06:48:42.072572   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E1129 06:48:42.164128   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/frontend autoscaled
E1129 06:48:42.256659   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1256: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
horizontalpodautoscaler.autoscaling "frontend" deleted
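The autoscale checks above (core.sh:1249-1256) create and delete HorizontalPodAutoscalers for the rc via kubectl autoscale; the exact invocations are not shown in this excerpt, so these are illustrative only (the 'required flag(s) "max" not set' error below is what an invocation without --max produces):
  kubectl autoscale rc frontend --min=1 --max=2 --cpu-percent=70
  kubectl autoscale rc frontend --min=2 --max=3 --cpu-percent=80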
Error: required flag(s) "max" not set


Examples:
  # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
  kubectl autoscale deployment foo --min=2 --max=10
  
... skipping 54 lines ...
          limits:
            cpu: 300m
          requests:
            cpu: 300m
      terminationGracePeriodSeconds: 0
status: {}
Error from server (NotFound): deployments.apps "nginx-deployment-resources" not found
deployment.apps/nginx-deployment-resources created
I1129 06:48:42.882363   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources", UID:"2d8eb1c0-76a0-4f00-a7c0-c5da234dd9ea", APIVersion:"apps/v1", ResourceVersion:"1936", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-67f8cfff5 to 3
I1129 06:48:42.889163   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-67f8cfff5", UID:"1cb98f5a-3c52-4e30-a4b3-ab432346ddfc", APIVersion:"apps/v1", ResourceVersion:"1937", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-dkjbm
I1129 06:48:42.892168   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-67f8cfff5", UID:"1cb98f5a-3c52-4e30-a4b3-ab432346ddfc", APIVersion:"apps/v1", ResourceVersion:"1937", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-8rgn4
I1129 06:48:42.892360   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-67f8cfff5", UID:"1cb98f5a-3c52-4e30-a4b3-ab432346ddfc", APIVersion:"apps/v1", ResourceVersion:"1937", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-67f8cfff5-s52mw
E1129 06:48:42.979133   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1271: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment-resources:
E1129 06:48:43.073666   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1272: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
core.sh:1273: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
E1129 06:48:43.165185   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment-resources resource requirements updated
I1129 06:48:43.245130   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources", UID:"2d8eb1c0-76a0-4f00-a7c0-c5da234dd9ea", APIVersion:"apps/v1", ResourceVersion:"1952", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-55c547f795 to 1
I1129 06:48:43.250092   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-55c547f795", UID:"9d544083-263e-4996-aa60-a7aba280e22c", APIVersion:"apps/v1", ResourceVersion:"1953", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-55c547f795-zkx4c
E1129 06:48:43.257369   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1276: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 100m:
core.sh:1277: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
error: unable to find container named redis
deployment.apps/nginx-deployment-resources resource requirements updated
I1129 06:48:43.578666   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources", UID:"2d8eb1c0-76a0-4f00-a7c0-c5da234dd9ea", APIVersion:"apps/v1", ResourceVersion:"1962", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 2
I1129 06:48:43.584069   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources", UID:"2d8eb1c0-76a0-4f00-a7c0-c5da234dd9ea", APIVersion:"apps/v1", ResourceVersion:"1965", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6d86564b45 to 1
I1129 06:48:43.586591   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-67f8cfff5", UID:"1cb98f5a-3c52-4e30-a4b3-ab432346ddfc", APIVersion:"apps/v1", ResourceVersion:"1966", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-dkjbm
I1129 06:48:43.589034   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-6d86564b45", UID:"b9e0e78a-6cc0-4a3f-9c7f-ba5953cde820", APIVersion:"apps/v1", ResourceVersion:"1969", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6d86564b45-hjcqm
core.sh:1282: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1283: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 100m:
deployment.apps/nginx-deployment-resources resource requirements updated
I1129 06:48:43.838878   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources", UID:"2d8eb1c0-76a0-4f00-a7c0-c5da234dd9ea", APIVersion:"apps/v1", ResourceVersion:"1983", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-resources-67f8cfff5 to 1
I1129 06:48:43.843756   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources", UID:"2d8eb1c0-76a0-4f00-a7c0-c5da234dd9ea", APIVersion:"apps/v1", ResourceVersion:"1985", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-resources-6c478d4fdb to 1
I1129 06:48:43.846675   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-67f8cfff5", UID:"1cb98f5a-3c52-4e30-a4b3-ab432346ddfc", APIVersion:"apps/v1", ResourceVersion:"1987", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-resources-67f8cfff5-8rgn4
I1129 06:48:43.849372   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010113-23696", Name:"nginx-deployment-resources-6c478d4fdb", UID:"e622e097-b428-4206-8eb3-a5b1dd696406", APIVersion:"apps/v1", ResourceVersion:"1990", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-resources-6c478d4fdb-tdpqt
core.sh:1286: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
E1129 06:48:43.980395   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
E1129 06:48:44.074789   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1288: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
E1129 06:48:44.166199   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apiVersion: apps/v1
kind: Deployment
metadata:
  annotations:
    deployment.kubernetes.io/revision: "4"
  creationTimestamp: "2019-11-29T06:48:42Z"
... skipping 65 lines ...
    status: "True"
    type: Progressing
  observedGeneration: 4
  replicas: 4
  unavailableReplicas: 4
  updatedReplicas: 1
error: you must specify resources by --filename when --local is set.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
E1129 06:48:44.258340   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1292: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).resources.limits.cpu}}:{{end}}: 200m:
core.sh:1293: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.limits.cpu}}:{{end}}: 300m:
core.sh:1294: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).resources.requests.cpu}}:{{end}}: 300m:
deployment.apps "nginx-deployment-resources" deleted
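The resource-requirement checks above (core.sh:1271-1294) drive kubectl set resources against the deployment; the exact commands in core.sh are not shown here, and the -c value below is only inferred from the "unable to find container named redis" error, so treat these as an illustrative sketch:
  kubectl set resources deployment nginx-deployment-resources --limits=cpu=200m
  kubectl set resources deployment nginx-deployment-resources -c=redis --limits=cpu=300m    # fails: no container named redis in the pod template
  kubectl set resources deployment nginx-deployment-resources --local -o yaml --limits=cpu=100m    # fails without -f/--filename, as the --local error above shows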
+++ exit code: 0
Recording: run_deployment_tests
... skipping 7 lines ...
Context "test" modified.
+++ [1129 06:48:44] Testing deployments
deployment.apps/test-nginx-extensions created
I1129 06:48:44.862017   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"test-nginx-extensions", UID:"83e47ced-baf6-4d0c-a17a-6f11fe16d4bc", APIVersion:"apps/v1", ResourceVersion:"2019", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-extensions-5559c76db7 to 1
I1129 06:48:44.868680   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"test-nginx-extensions-5559c76db7", UID:"6c4955d0-2bcc-4cf5-8c1b-36aaeaa03a1e", APIVersion:"apps/v1", ResourceVersion:"2020", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-extensions-5559c76db7-ntds4
apps.sh:185: Successful get deploy test-nginx-extensions {{(index .spec.template.spec.containers 0).name}}: nginx
E1129 06:48:44.981315   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:10
has not:2
E1129 06:48:45.075791   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apps/v1
has:apps/v1
E1129 06:48:45.167272   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-nginx-extensions" deleted
deployment.apps/test-nginx-apps created
I1129 06:48:45.243657   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"test-nginx-apps", UID:"1e93f990-d360-4d9d-9936-130c14634020", APIVersion:"apps/v1", ResourceVersion:"2035", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-nginx-apps-79b9bd9585 to 1
I1129 06:48:45.246398   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"test-nginx-apps-79b9bd9585", UID:"772c4c38-a49b-473d-8ec0-06fb113fd0d8", APIVersion:"apps/v1", ResourceVersion:"2036", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-nginx-apps-79b9bd9585-mvj8l
E1129 06:48:45.259040   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:198: Successful get deploy test-nginx-apps {{(index .spec.template.spec.containers 0).name}}: nginx
Successful
message:10
has:10
Successful
message:apps/v1
... skipping 14 lines ...
                pod-template-hash=79b9bd9585
Annotations:    deployment.kubernetes.io/desired-replicas: 1
                deployment.kubernetes.io/max-replicas: 2
                deployment.kubernetes.io/revision: 1
Controlled By:  Deployment/test-nginx-apps
Replicas:       1 current / 1 desired
Pods Status:    0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=test-nginx-apps
           pod-template-hash=79b9bd9585
  Containers:
   nginx:
    Image:        k8s.gcr.io/nginx:test-cmd
... skipping 38 lines ...
Events:           <none>
deployment.apps "test-nginx-apps" deleted
apps.sh:214: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-with-command created
I1129 06:48:45.961368   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-with-command", UID:"e10e988c-ae7c-49c0-818e-2e278c8cbc90", APIVersion:"apps/v1", ResourceVersion:"2049", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-with-command-757c6f58dd to 1
I1129 06:48:45.966023   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-with-command-757c6f58dd", UID:"84456b69-e38a-4700-8369-f22c88c324e0", APIVersion:"apps/v1", ResourceVersion:"2050", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-with-command-757c6f58dd-vgc54
E1129 06:48:45.982118   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:218: Successful get deploy nginx-with-command {{(index .spec.template.spec.containers 0).name}}: nginx
E1129 06:48:46.076878   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx-with-command" deleted
E1129 06:48:46.168182   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:224: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:46.268051   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/deployment-with-unixuserid created
I1129 06:48:46.380724   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"deployment-with-unixuserid", UID:"929e6a03-230b-4b91-b1f8-505e46b1526d", APIVersion:"apps/v1", ResourceVersion:"2060", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deployment-with-unixuserid-8fcdfc94f to 1
I1129 06:48:46.383994   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"deployment-with-unixuserid-8fcdfc94f", UID:"a34fe633-10fc-499a-b7c0-2af58e04a8b6", APIVersion:"apps/v1", ResourceVersion:"2061", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deployment-with-unixuserid-8fcdfc94f-ppfcs
apps.sh:228: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: deployment-with-unixuserid:
deployment.apps "deployment-with-unixuserid" deleted
apps.sh:235: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx-deployment created
I1129 06:48:46.783028   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"d2063711-c70a-4fe0-baf7-6820e8ce9d81", APIVersion:"apps/v1", ResourceVersion:"2077", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1129 06:48:46.787435   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6986c7bc94", UID:"62d14a87-d316-4883-befe-15e02f02bc1e", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-v4lnz
I1129 06:48:46.790860   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6986c7bc94", UID:"62d14a87-d316-4883-befe-15e02f02bc1e", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-8sn5h
I1129 06:48:46.792288   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6986c7bc94", UID:"62d14a87-d316-4883-befe-15e02f02bc1e", APIVersion:"apps/v1", ResourceVersion:"2078", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-npjck
apps.sh:239: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 3
deployment.apps "nginx-deployment" deleted
E1129 06:48:46.983062   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:242: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:47.077921   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:246: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:47.169365   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:247: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:47.269102   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1129 06:48:47.278270   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"7dd816a8-2619-41ea-bc59-ecbda05c6fd1", APIVersion:"apps/v1", ResourceVersion:"2101", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7f6fc565b9 to 1
I1129 06:48:47.282606   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-7f6fc565b9", UID:"12a3baef-68f5-492a-88d0-423dd99edd12", APIVersion:"apps/v1", ResourceVersion:"2102", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7f6fc565b9-jjblk
apps.sh:251: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
deployment.apps "nginx-deployment" deleted
apps.sh:256: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
apps.sh:257: Successful get rs {{range.items}}{{.spec.replicas}}{{end}}: 1
replicaset.apps "nginx-deployment-7f6fc565b9" deleted
apps.sh:265: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:48:47.984711   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment created
I1129 06:48:48.017445   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"56a3009c-6ef8-4c32-9f93-93417ba68dfb", APIVersion:"apps/v1", ResourceVersion:"2119", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6986c7bc94 to 3
I1129 06:48:48.022431   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6986c7bc94", UID:"cb0cc037-0be7-43ea-9f65-0e3262483142", APIVersion:"apps/v1", ResourceVersion:"2120", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-czdbt
I1129 06:48:48.026836   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6986c7bc94", UID:"cb0cc037-0be7-43ea-9f65-0e3262483142", APIVersion:"apps/v1", ResourceVersion:"2120", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-tsdd4
I1129 06:48:48.026878   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6986c7bc94", UID:"cb0cc037-0be7-43ea-9f65-0e3262483142", APIVersion:"apps/v1", ResourceVersion:"2120", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6986c7bc94-srzvm
E1129 06:48:48.078817   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:268: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
E1129 06:48:48.170317   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling/nginx-deployment autoscaled
apps.sh:271: Successful get hpa nginx-deployment {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
E1129 06:48:48.270041   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "nginx-deployment" deleted
deployment.apps "nginx-deployment" deleted
apps.sh:279: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
deployment.apps/nginx created
I1129 06:48:48.658428   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx", UID:"22cab508-bb20-4977-b550-28e3b2ecbe87", APIVersion:"apps/v1", ResourceVersion:"2143", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-f87d999f7 to 3
I1129 06:48:48.660629   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-f87d999f7", UID:"8a464c7f-42bf-4ba6-aab7-946f798cc3c7", APIVersion:"apps/v1", ResourceVersion:"2144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-z666l
I1129 06:48:48.665123   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-f87d999f7", UID:"8a464c7f-42bf-4ba6-aab7-946f798cc3c7", APIVersion:"apps/v1", ResourceVersion:"2144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-glzwq
I1129 06:48:48.665350   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-f87d999f7", UID:"8a464c7f-42bf-4ba6-aab7-946f798cc3c7", APIVersion:"apps/v1", ResourceVersion:"2144", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-f87d999f7-p2fcm
apps.sh:283: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx:
apps.sh:284: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx skipped rollback (current template already matches revision 1)
E1129 06:48:48.985584   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:287: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
E1129 06:48:49.079853   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Warning: kubectl apply should be used on resource created by either kubectl create --save-config or kubectl apply
deployment.apps/nginx configured
I1129 06:48:49.150845   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx", UID:"22cab508-bb20-4977-b550-28e3b2ecbe87", APIVersion:"apps/v1", ResourceVersion:"2159", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-78487f9fd7 to 1
I1129 06:48:49.154671   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-78487f9fd7", UID:"97317170-8a30-4526-8383-1de4af8e58e9", APIVersion:"apps/v1", ResourceVersion:"2160", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-78487f9fd7-gjzvl
E1129 06:48:49.171027   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:290: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E1129 06:48:49.271289   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
    Image:	k8s.gcr.io/nginx:test-cmd
apps.sh:293: Successful get deployment.apps {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
deployment.apps/nginx rolled back
E1129 06:48:49.986755   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:50.080932   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:50.171797   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:50.272220   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:297: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
error: unable to find specified revision 1000000 in history
apps.sh:300: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
deployment.apps/nginx rolled back
E1129 06:48:50.987544   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:51.082103   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:51.172828   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:51.273453   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:51.988574   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:304: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
E1129 06:48:52.083271   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx paused
E1129 06:48:52.173894   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: you cannot rollback a paused deployment; resume it first with 'kubectl rollout resume deployment/nginx' and try again
E1129 06:48:52.274571   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: deployments.apps "nginx" can't restart paused deployment (run rollout resume first)
deployment.apps/nginx resumed
deployment.apps/nginx rolled back
    deployment.kubernetes.io/revision-history: 1,3
error: desired revision (3) is different from the running revision (5)
deployment.apps/nginx restarted
I1129 06:48:52.906941   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx", UID:"22cab508-bb20-4977-b550-28e3b2ecbe87", APIVersion:"apps/v1", ResourceVersion:"2190", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-f87d999f7 to 2
I1129 06:48:52.911347   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx", UID:"22cab508-bb20-4977-b550-28e3b2ecbe87", APIVersion:"apps/v1", ResourceVersion:"2193", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-545c5dbc58 to 1
I1129 06:48:52.914279   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-f87d999f7", UID:"8a464c7f-42bf-4ba6-aab7-946f798cc3c7", APIVersion:"apps/v1", ResourceVersion:"2194", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-f87d999f7-z666l
I1129 06:48:52.914352   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-545c5dbc58", UID:"68779a48-760e-4955-b155-56253cca7405", APIVersion:"apps/v1", ResourceVersion:"2197", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-545c5dbc58-np4lq
E1129 06:48:52.989283   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:53.084363   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:53.175226   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:53.275727   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:53.990213   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:apiVersion: apps/v1
kind: ReplicaSet
metadata:
  annotations:
    deployment.kubernetes.io/desired-replicas: "3"
... skipping 48 lines ...
      terminationGracePeriodSeconds: 30
status:
  fullyLabeledReplicas: 1
  observedGeneration: 2
  replicas: 1
has:deployment.kubernetes.io/revision: "6"
E1129 06:48:54.085458   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:48:54.176426   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx2 created
I1129 06:48:54.230595   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx2", UID:"36f3f1b0-7639-4b6e-994b-9e21e45e91f9", APIVersion:"apps/v1", ResourceVersion:"2213", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx2-57b7865cd9 to 3
I1129 06:48:54.233452   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx2-57b7865cd9", UID:"eb31cd73-2644-4ea0-8d69-034e23b447b3", APIVersion:"apps/v1", ResourceVersion:"2214", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-8ps2x
I1129 06:48:54.237604   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx2-57b7865cd9", UID:"eb31cd73-2644-4ea0-8d69-034e23b447b3", APIVersion:"apps/v1", ResourceVersion:"2214", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-8wvfx
I1129 06:48:54.238584   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx2-57b7865cd9", UID:"eb31cd73-2644-4ea0-8d69-034e23b447b3", APIVersion:"apps/v1", ResourceVersion:"2214", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx2-57b7865cd9-2gh2t
E1129 06:48:54.276704   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "nginx2" deleted
deployment.apps "nginx" deleted
apps.sh:334: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(Bdeployment.apps/nginx-deployment created
I1129 06:48:54.632455   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"6cab82ba-c8cd-451f-8476-c4fbbb4b7953", APIVersion:"apps/v1", ResourceVersion:"2247", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I1129 06:48:54.635623   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"39dfa5b8-29f4-4fe6-b26d-803269d5aca5", APIVersion:"apps/v1", ResourceVersion:"2248", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-5rxqd
... skipping 2 lines ...
apps.sh:337: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
(Bapps.sh:338: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:339: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(Bdeployment.apps/nginx-deployment image updated
I1129 06:48:54.968860   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"6cab82ba-c8cd-451f-8476-c4fbbb4b7953", APIVersion:"apps/v1", ResourceVersion:"2261", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-59df9b5f5b to 1
I1129 06:48:54.971502   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-59df9b5f5b", UID:"858e27e7-7c11-4a35-a012-f5216e49ceed", APIVersion:"apps/v1", ResourceVersion:"2262", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-59df9b5f5b-9z4kg
E1129 06:48:54.990848   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:342: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(BE1129 06:48:55.086469   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:343: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(BE1129 06:48:55.177554   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
error: unable to find container named "redis"
E1129 06:48:55.277856   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
apps.sh:348: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:349: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(Bdeployment.apps/nginx-deployment image updated
apps.sh:352: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(Bapps.sh:353: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(Bapps.sh:356: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:1.7.9:
(BE1129 06:48:55.991809   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:357: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/perl:
(BE1129 06:48:56.087605   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment image updated
I1129 06:48:56.111229   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"6cab82ba-c8cd-451f-8476-c4fbbb4b7953", APIVersion:"apps/v1", ResourceVersion:"2281", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 2
I1129 06:48:56.118555   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"6cab82ba-c8cd-451f-8476-c4fbbb4b7953", APIVersion:"apps/v1", ResourceVersion:"2283", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-7d758dbc54 to 1
I1129 06:48:56.123537   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"39dfa5b8-29f4-4fe6-b26d-803269d5aca5", APIVersion:"apps/v1", ResourceVersion:"2285", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-5rxqd
I1129 06:48:56.123753   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-7d758dbc54", UID:"59890a0a-0694-43ab-98bf-1f8c438c34f9", APIVersion:"apps/v1", ResourceVersion:"2287", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-7d758dbc54-jfp57
E1129 06:48:56.178472   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:360: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(BE1129 06:48:56.278857   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:361: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:364: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bapps.sh:365: Successful get deployment {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/nginx:test-cmd:
(Bdeployment.apps "nginx-deployment" deleted
apps.sh:371: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: 
(Bdeployment.apps/nginx-deployment created
I1129 06:48:56.838935   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2314", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-598d4d68b4 to 3
I1129 06:48:56.842772   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"8970e0f7-1e04-4c60-b426-bf9a685f1c8e", APIVersion:"apps/v1", ResourceVersion:"2315", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-9dkqh
I1129 06:48:56.846808   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"8970e0f7-1e04-4c60-b426-bf9a685f1c8e", APIVersion:"apps/v1", ResourceVersion:"2315", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-7vn6b
I1129 06:48:56.847041   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"8970e0f7-1e04-4c60-b426-bf9a685f1c8e", APIVersion:"apps/v1", ResourceVersion:"2315", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-598d4d68b4-gn7d4
I1129 06:48:56.943357   54238 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1575010113-23696
E1129 06:48:56.993112   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
configmap/test-set-env-config created
E1129 06:48:57.088734   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
secret/test-set-env-secret created
E1129 06:48:57.179554   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:376: Successful get deployment {{range.items}}{{.metadata.name}}:{{end}}: nginx-deployment:
(BE1129 06:48:57.280079   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:378: Successful get configmaps/test-set-env-config {{.metadata.name}}: test-set-env-config
(Bapps.sh:379: Successful get secret {{range.items}}{{.metadata.name}}:{{end}}: test-set-env-secret:
(Bdeployment.apps/nginx-deployment env updated
I1129 06:48:57.492105   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2332", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-6b9f7756b4 to 1
I1129 06:48:57.496536   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6b9f7756b4", UID:"cc1700f6-50e2-4001-bda2-be1476956641", APIVersion:"apps/v1", ResourceVersion:"2333", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-6b9f7756b4-sltbg
apps.sh:383: Successful get deploy nginx-deployment {{ (index (index .spec.template.spec.containers 0).env 0).name}}: KEY_2
... skipping 6 lines ...
apps.sh:389: Successful get deploy nginx-deployment {{ len (index .spec.template.spec.containers 0).env }}: 2
(Bdeployment.apps/nginx-deployment env updated
I1129 06:48:57.937717   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2362", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 1
I1129 06:48:57.944398   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"8970e0f7-1e04-4c60-b426-bf9a685f1c8e", APIVersion:"apps/v1", ResourceVersion:"2366", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-7vn6b
I1129 06:48:57.945888   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2365", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-c6d5c5c7b to 1
I1129 06:48:57.949427   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-c6d5c5c7b", UID:"9816de0d-25cd-4669-9b2f-a928f2c477cd", APIVersion:"apps/v1", ResourceVersion:"2370", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-c6d5c5c7b-9wvtf
E1129 06:48:57.994055   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I1129 06:48:58.037962   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2382", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-598d4d68b4 to 0
I1129 06:48:58.043172   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2384", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-5958f7687 to 1
I1129 06:48:58.045186   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-598d4d68b4", UID:"8970e0f7-1e04-4c60-b426-bf9a685f1c8e", APIVersion:"apps/v1", ResourceVersion:"2386", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-598d4d68b4-gn7d4
I1129 06:48:58.048989   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-5958f7687", UID:"f67e32cb-fb17-4081-b78e-6317e1e647f1", APIVersion:"apps/v1", ResourceVersion:"2389", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-5958f7687-xzkbg
E1129 06:48:58.089721   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
E1129 06:48:58.180503   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:48:58.193205   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2403", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled down replica set nginx-deployment-6b9f7756b4 to 0
deployment.apps/nginx-deployment env updated
E1129 06:48:58.281051   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/nginx-deployment env updated
I1129 06:48:58.310723   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-6b9f7756b4", UID:"cc1700f6-50e2-4001-bda2-be1476956641", APIVersion:"apps/v1", ResourceVersion:"2406", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: nginx-deployment-6b9f7756b4-sltbg
I1129 06:48:58.343077   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment", UID:"09d24fde-bdc1-4d3d-bb63-7f4cad06fe23", APIVersion:"apps/v1", ResourceVersion:"2408", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set nginx-deployment-d74969475 to 1
deployment.apps "nginx-deployment" deleted
I1129 06:48:58.459588   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010124-6636", Name:"nginx-deployment-d74969475", UID:"6cd623fd-94ac-458a-ad4d-b2e524718bfb", APIVersion:"apps/v1", ResourceVersion:"2413", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: nginx-deployment-d74969475-ph87k
configmap "test-set-env-config" deleted
... skipping 3 lines ...
Running command: run_rs_tests

+++ Running case: test-cmd.run_rs_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_rs_tests
+++ [1129 06:48:58] Creating namespace namespace-1575010138-15588
E1129 06:48:58.658087   54238 replica_set.go:534] sync "namespace-1575010124-6636/nginx-deployment-6b9f7756b4" failed with replicasets.apps "nginx-deployment-6b9f7756b4" not found
namespace/namespace-1575010138-15588 created
E1129 06:48:58.713669   54238 replica_set.go:534] sync "namespace-1575010124-6636/nginx-deployment-d74969475" failed with replicasets.apps "nginx-deployment-d74969475" not found
Context "test" modified.
+++ [1129 06:48:58] Testing kubectl(v1:replicasets)
apps.sh:511: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:48:58.995286   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I1129 06:48:59.024739   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"109c22c2-ce9e-4e4a-8bf5-36f5a81cc03e", APIVersion:"apps/v1", ResourceVersion:"2441", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-k5lpz
I1129 06:48:59.026889   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"109c22c2-ce9e-4e4a-8bf5-36f5a81cc03e", APIVersion:"apps/v1", ResourceVersion:"2441", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-rd9w6
I1129 06:48:59.030335   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"109c22c2-ce9e-4e4a-8bf5-36f5a81cc03e", APIVersion:"apps/v1", ResourceVersion:"2441", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-ghxmd
+++ [1129 06:48:59] Deleting rs
E1129 06:48:59.090667   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "frontend" deleted
E1129 06:48:59.157827   54238 replica_set.go:534] sync "namespace-1575010138-15588/frontend" failed with replicasets.apps "frontend" not found
E1129 06:48:59.181742   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:517: Successful get pods -l "tier=frontend" {{range.items}}{{.metadata.name}}:{{end}}: 
(Bapps.sh:521: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:48:59.282072   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend-no-cascade created
I1129 06:48:59.430584   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend-no-cascade", UID:"c18220f8-a7b0-4e13-b4e4-7dcf7c43e05f", APIVersion:"apps/v1", ResourceVersion:"2456", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-jb8pw
I1129 06:48:59.433058   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend-no-cascade", UID:"c18220f8-a7b0-4e13-b4e4-7dcf7c43e05f", APIVersion:"apps/v1", ResourceVersion:"2456", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-fbkhd
I1129 06:48:59.434100   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend-no-cascade", UID:"c18220f8-a7b0-4e13-b4e4-7dcf7c43e05f", APIVersion:"apps/v1", ResourceVersion:"2456", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-no-cascade-9t47j
apps.sh:527: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
(B+++ [1129 06:48:59] Deleting rs
replicaset.apps "frontend-no-cascade" deleted
E1129 06:48:59.708527   54238 replica_set.go:534] sync "namespace-1575010138-15588/frontend-no-cascade" failed with replicasets.apps "frontend-no-cascade" not found
apps.sh:531: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Bapps.sh:533: Successful get pods -l "tier=frontend" {{range.items}}{{(index .spec.containers 0).name}}:{{end}}: php-redis:php-redis:php-redis:
(Bpod "frontend-no-cascade-9t47j" deleted
pod "frontend-no-cascade-fbkhd" deleted
pod "frontend-no-cascade-jb8pw" deleted
apps.sh:536: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:48:59.996358   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:540: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:00.091728   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:00.182863   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I1129 06:49:00.213924   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"e1a543bb-7e3b-484d-bf75-f15c987676b5", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-fx6xp
I1129 06:49:00.215869   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"e1a543bb-7e3b-484d-bf75-f15c987676b5", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jrh4s
I1129 06:49:00.218679   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"e1a543bb-7e3b-484d-bf75-f15c987676b5", APIVersion:"apps/v1", ResourceVersion:"2477", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-jnvdk
E1129 06:49:00.283157   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:544: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Bmatched Name:
matched Pod Template:
matched Labels:
matched Selector:
matched Replicas:
... skipping 4 lines ...
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 18 lines ...
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 12 lines ...
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 25 lines ...
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 17 lines ...
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-fx6xp
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-jrh4s
  Normal  SuccessfulCreate  0s    replicaset-controller  Created pod: frontend-jnvdk
(BE1129 06:49:00.997259   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 3 lines ...
      cpu:     100m
      memory:  100Mi
    Environment:
      GET_HOSTS_FROM:  dns
    Mounts:            <none>
  Volumes:             <none>
(BE1129 06:49:01.092604   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful describe
Name:         frontend
Namespace:    namespace-1575010138-15588
Selector:     app=guestbook,tier=frontend
Labels:       app=guestbook
              tier=frontend
Annotations:  <none>
Replicas:     3 current / 3 desired
Pods Status:  0 Running / 3 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=guestbook
           tier=frontend
  Containers:
   php-redis:
    Image:      gcr.io/google_samples/gb-frontend:v3
... skipping 9 lines ...
Events:
  Type    Reason            Age   From                   Message
  ----    ------            ----  ----                   -------
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-fx6xp
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-jrh4s
  Normal  SuccessfulCreate  1s    replicaset-controller  Created pod: frontend-jnvdk
(BE1129 06:49:01.183844   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
matched Image:
matched Node:
matched Labels:
matched Status:
matched Controlled By
... skipping 81 lines ...
Volumes:               <none>
QoS Class:             Burstable
Node-Selectors:        <none>
Tolerations:           <none>
Events:                <none>
(Bapps.sh:566: Successful get rs frontend {{.spec.replicas}}: 3
(BE1129 06:49:01.284222   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend scaled
E1129 06:49:01.360279   54238 replica_set.go:199] ReplicaSet has no controller: &ReplicaSet{ObjectMeta:{frontend  namespace-1575010138-15588 /apis/apps/v1/namespaces/namespace-1575010138-15588/replicasets/frontend e1a543bb-7e3b-484d-bf75-f15c987676b5 2488 2 2019-11-29 06:49:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []},Spec:ReplicaSetSpec{Replicas:*2,Selector:&v1.LabelSelector{MatchLabels:map[string]string{app: guestbook,tier: frontend,},MatchExpressions:[]LabelSelectorRequirement{},},Template:{{      0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app:guestbook tier:frontend] map[] [] []  []} {[] [] [{php-redis gcr.io/google_samples/gb-frontend:v3 [] []  [{ 0 80 TCP }] [] [{GET_HOSTS_FROM dns nil}] {map[] map[cpu:{{100 -3} {<nil>} 100m DecimalSI} memory:{{104857600 0} {<nil>} 100Mi BinarySI}]} [] [] nil nil nil nil /dev/termination-log File IfNotPresent nil false false false}] [] Always 0xc0019ee258 <nil> ClusterFirst map[]   <nil>  false false false <nil> PodSecurityContext{SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,SupplementalGroups:[],FSGroup:nil,RunAsGroup:nil,Sysctls:[]Sysctl{},WindowsOptions:nil,} []   nil default-scheduler [] []  <nil> nil [] <nil> <nil> <nil> map[] []}},MinReadySeconds:0,},Status:ReplicaSetStatus{Replicas:3,FullyLabeledReplicas:3,ObservedGeneration:1,ReadyReplicas:0,AvailableReplicas:0,Conditions:[]ReplicaSetCondition{},},}
I1129 06:49:01.365916   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"e1a543bb-7e3b-484d-bf75-f15c987676b5", APIVersion:"apps/v1", ResourceVersion:"2488", FieldPath:""}): type: 'Normal' reason: 'SuccessfulDelete' Deleted pod: frontend-fx6xp
apps.sh:570: Successful get rs frontend {{.spec.replicas}}: 2
(Bdeployment.apps/scale-1 created
I1129 06:49:01.597981   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010138-15588", Name:"scale-1", UID:"66a8dd8e-a0f8-42a4-9461-d3ec9cfc836a", APIVersion:"apps/v1", ResourceVersion:"2494", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 1
I1129 06:49:01.603336   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-1-5c5565bcd9", UID:"72852191-7d8f-44f3-ae32-3209e93de6e7", APIVersion:"apps/v1", ResourceVersion:"2495", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-z668t
deployment.apps/scale-2 created
I1129 06:49:01.760622   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010138-15588", Name:"scale-2", UID:"c2a55c3a-3586-423d-85f1-dcd9f80cad1a", APIVersion:"apps/v1", ResourceVersion:"2504", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 1
I1129 06:49:01.764051   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-2-5c5565bcd9", UID:"b37a4c91-c750-41ed-b0e2-6aa13ad4577a", APIVersion:"apps/v1", ResourceVersion:"2505", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-zcd5n
deployment.apps/scale-3 created
I1129 06:49:01.921125   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010138-15588", Name:"scale-3", UID:"7699ab92-f640-4c4d-8d1a-c8e759cf795e", APIVersion:"apps/v1", ResourceVersion:"2514", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-3-5c5565bcd9 to 1
I1129 06:49:01.926277   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-3-5c5565bcd9", UID:"5c0c440b-07cc-4c54-9b1a-bad1821221c9", APIVersion:"apps/v1", ResourceVersion:"2515", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-s67hm
E1129 06:49:01.998389   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:576: Successful get deploy scale-1 {{.spec.replicas}}: 1
(BE1129 06:49:02.093772   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:577: Successful get deploy scale-2 {{.spec.replicas}}: 1
(BE1129 06:49:02.184822   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:578: Successful get deploy scale-3 {{.spec.replicas}}: 1
(Bdeployment.apps/scale-1 scaled
I1129 06:49:02.264240   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010138-15588", Name:"scale-1", UID:"66a8dd8e-a0f8-42a4-9461-d3ec9cfc836a", APIVersion:"apps/v1", ResourceVersion:"2524", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 2
deployment.apps/scale-2 scaled
I1129 06:49:02.269321   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010138-15588", Name:"scale-2", UID:"c2a55c3a-3586-423d-85f1-dcd9f80cad1a", APIVersion:"apps/v1", ResourceVersion:"2526", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-2-5c5565bcd9 to 2
I1129 06:49:02.269889   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-1-5c5565bcd9", UID:"72852191-7d8f-44f3-ae32-3209e93de6e7", APIVersion:"apps/v1", ResourceVersion:"2525", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-1-5c5565bcd9-cf5lw
I1129 06:49:02.273171   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-2-5c5565bcd9", UID:"b37a4c91-c750-41ed-b0e2-6aa13ad4577a", APIVersion:"apps/v1", ResourceVersion:"2530", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-2-5c5565bcd9-76p2z
E1129 06:49:02.285508   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:581: Successful get deploy scale-1 {{.spec.replicas}}: 2
(Bapps.sh:582: Successful get deploy scale-2 {{.spec.replicas}}: 2
(Bapps.sh:583: Successful get deploy scale-3 {{.spec.replicas}}: 1
(Bdeployment.apps/scale-1 scaled
I1129 06:49:02.597018   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010138-15588", Name:"scale-1", UID:"66a8dd8e-a0f8-42a4-9461-d3ec9cfc836a", APIVersion:"apps/v1", ResourceVersion:"2544", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set scale-1-5c5565bcd9 to 3
deployment.apps/scale-2 scaled
... skipping 5 lines ...
I1129 06:49:02.614189   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-3-5c5565bcd9", UID:"5c0c440b-07cc-4c54-9b1a-bad1821221c9", APIVersion:"apps/v1", ResourceVersion:"2558", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-fzcp9
apps.sh:586: Successful get deploy scale-1 {{.spec.replicas}}: 3
(BI1129 06:49:02.709291   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"scale-3-5c5565bcd9", UID:"5c0c440b-07cc-4c54-9b1a-bad1821221c9", APIVersion:"apps/v1", ResourceVersion:"2558", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: scale-3-5c5565bcd9-tb4s6
apps.sh:587: Successful get deploy scale-2 {{.spec.replicas}}: 3
(Bapps.sh:588: Successful get deploy scale-3 {{.spec.replicas}}: 3
(Breplicaset.apps "frontend" deleted
E1129 06:49:02.999316   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "scale-1" deleted
deployment.apps "scale-2" deleted
deployment.apps "scale-3" deleted
E1129 06:49:03.057391   54238 replica_set.go:534] sync "namespace-1575010138-15588/scale-3-5c5565bcd9" failed with replicasets.apps "scale-3-5c5565bcd9" not found
E1129 06:49:03.094794   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:49:03.180225   54238 horizontal.go:341] Horizontal Pod Autoscaler nginx-deployment has been deleted in namespace-1575010124-6636
E1129 06:49:03.186029   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend created
I1129 06:49:03.194904   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"07729dbc-f4a4-435c-bf10-30a47151ad42", APIVersion:"apps/v1", ResourceVersion:"2606", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-9gpbp
I1129 06:49:03.196855   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"07729dbc-f4a4-435c-bf10-30a47151ad42", APIVersion:"apps/v1", ResourceVersion:"2606", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-qk96x
I1129 06:49:03.208645   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"07729dbc-f4a4-435c-bf10-30a47151ad42", APIVersion:"apps/v1", ResourceVersion:"2606", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-pkr4b
apps.sh:596: Successful get rs frontend {{.spec.replicas}}: 3
(BE1129 06:49:03.286454   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/frontend exposed
apps.sh:600: Successful get service frontend {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: <no value> 80
(Bservice/frontend-2 exposed
apps.sh:604: Successful get service frontend-2 {{(index .spec.ports 0).name}} {{(index .spec.ports 0).port}}: default 80
(Bservice "frontend" deleted
service "frontend-2" deleted
apps.sh:610: Successful get rs frontend {{.metadata.generation}}: 1
(Breplicaset.apps/frontend image updated
apps.sh:612: Successful get rs frontend {{.metadata.generation}}: 2
(BE1129 06:49:04.000553   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend env updated
E1129 06:49:04.095915   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:614: Successful get rs frontend {{.metadata.generation}}: 3
(BE1129 06:49:04.187241   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps/frontend resource requirements updated
E1129 06:49:04.287741   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:616: Successful get rs frontend {{.metadata.generation}}: 4
(Bapps.sh:620: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Breplicaset.apps "frontend" deleted
apps.sh:624: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Bapps.sh:628: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicaset.apps/frontend created
I1129 06:49:04.785785   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"85d89fe1-a52b-465c-979c-e8ebcb18b3df", APIVersion:"apps/v1", ResourceVersion:"2640", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-txsbh
I1129 06:49:04.788120   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"85d89fe1-a52b-465c-979c-e8ebcb18b3df", APIVersion:"apps/v1", ResourceVersion:"2640", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-7tzfn
I1129 06:49:04.789403   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"85d89fe1-a52b-465c-979c-e8ebcb18b3df", APIVersion:"apps/v1", ResourceVersion:"2640", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-2jh4d
replicaset.apps/redis-slave created
I1129 06:49:04.942534   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"redis-slave", UID:"a1f75760-44e7-4c3f-9fbc-c634783b810f", APIVersion:"apps/v1", ResourceVersion:"2649", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-xzjpn
I1129 06:49:04.945538   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"redis-slave", UID:"a1f75760-44e7-4c3f-9fbc-c634783b810f", APIVersion:"apps/v1", ResourceVersion:"2649", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: redis-slave-n7l8q
E1129 06:49:05.001653   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:633: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(BE1129 06:49:05.096869   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:637: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:redis-slave:
(Breplicaset.apps "frontend" deleted
E1129 06:49:05.188376   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicaset.apps "redis-slave" deleted
apps.sh:641: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:05.288775   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:646: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: 
(Breplicaset.apps/frontend created
I1129 06:49:05.536933   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"de3759ba-6af4-4370-8e0d-42b32069f294", APIVersion:"apps/v1", ResourceVersion:"2670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-24fbw
I1129 06:49:05.539623   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"de3759ba-6af4-4370-8e0d-42b32069f294", APIVersion:"apps/v1", ResourceVersion:"2670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-hxj99
I1129 06:49:05.543784   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010138-15588", Name:"frontend", UID:"de3759ba-6af4-4370-8e0d-42b32069f294", APIVersion:"apps/v1", ResourceVersion:"2670", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: frontend-h99xv
apps.sh:649: Successful get rs {{range.items}}{{.metadata.name}}:{{end}}: frontend:
(Bhorizontalpodautoscaler.autoscaling/frontend autoscaled
apps.sh:652: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 1 2 70
(Bhorizontalpodautoscaler.autoscaling "frontend" deleted
horizontalpodautoscaler.autoscaling/frontend autoscaled
E1129 06:49:06.002667   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:656: Successful get hpa frontend {{.spec.minReplicas}} {{.spec.maxReplicas}} {{.spec.targetCPUUtilizationPercentage}}: 2 3 80
(BE1129 06:49:06.098065   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
horizontalpodautoscaler.autoscaling "frontend" deleted
E1129 06:49:06.189475   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Error: required flag(s) "max" not set


Examples:
  # Auto scale a deployment "foo", with the number of pods between 2 and 10, no target CPU utilization specified so a default autoscaling policy will be used:
  kubectl autoscale deployment foo --min=2 --max=10
  
... skipping 20 lines ...
  kubectl autoscale (-f FILENAME | TYPE NAME | TYPE/NAME) [--min=MINPODS] --max=MAXPODS [--cpu-percent=CPU] [options]

Use "kubectl options" for a list of global command-line options (applies to all commands).

replicaset.apps "frontend" deleted
+++ exit code: 0
E1129 06:49:06.289581   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Recording: run_stateful_set_tests
Running command: run_stateful_set_tests

+++ Running case: test-cmd.run_stateful_set_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_stateful_set_tests
... skipping 3 lines ...
+++ [1129 06:49:06] Testing kubectl(v1:statefulsets)
apps.sh:470: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
(BI1129 06:49:06.729429   50776 controller.go:606] quota admission added evaluator for: statefulsets.apps
statefulset.apps/nginx created
apps.sh:476: Successful get statefulset nginx {{.spec.replicas}}: 0
(Bapps.sh:477: Successful get statefulset nginx {{.status.observedGeneration}}: 1
(BE1129 06:49:07.004109   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx scaled
I1129 06:49:07.014595   54238 event.go:281] Event(v1.ObjectReference{Kind:"StatefulSet", Namespace:"namespace-1575010146-2216", Name:"nginx", UID:"beb44412-6f8a-457e-89e8-9735bcfc6a90", APIVersion:"apps/v1", ResourceVersion:"2695", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' create Pod nginx-0 in StatefulSet nginx successful
E1129 06:49:07.099046   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:481: Successful get statefulset nginx {{.spec.replicas}}: 1
(Bapps.sh:482: Successful get statefulset nginx {{.status.observedGeneration}}: 2
(BE1129 06:49:07.190433   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:07.290745   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx restarted
apps.sh:490: Successful get statefulset nginx {{.status.observedGeneration}}: 3
(Bstatefulset.apps "nginx" deleted
I1129 06:49:07.512522   54238 stateful_set.go:420] StatefulSet has been deleted namespace-1575010146-2216/nginx
+++ exit code: 0
Recording: run_statefulset_history_tests
... skipping 4 lines ...
+++ command: run_statefulset_history_tests
+++ [1129 06:49:07] Creating namespace namespace-1575010147-19742
namespace/namespace-1575010147-19742 created
Context "test" modified.
+++ [1129 06:49:07] Testing kubectl(v1:statefulsets, v1:controllerrevisions)
apps.sh:418: Successful get statefulset {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:08.005261   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx created
E1129 06:49:08.099976   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:422: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1575010147-19742"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE1129 06:49:08.191603   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx skipped rollback (current template already matches revision 1)
E1129 06:49:08.291941   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:425: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
(Bapps.sh:426: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(Bstatefulset.apps/nginx configured
apps.sh:429: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(Bapps.sh:430: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(Bapps.sh:431: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(Bapps.sh:432: Successful get controllerrevisions {{range.items}}{{.metadata.annotations}}:{{end}}: map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1575010147-19742"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.7","name":"nginx","ports":[{"containerPort":80,"name":"web"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:map[kubectl.kubernetes.io/last-applied-configuration:{"apiVersion":"apps/v1","kind":"StatefulSet","metadata":{"annotations":{"kubernetes.io/change-cause":"kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true"},"labels":{"app":"nginx-statefulset"},"name":"nginx","namespace":"namespace-1575010147-19742"},"spec":{"replicas":0,"selector":{"matchLabels":{"app":"nginx-statefulset"}},"serviceName":"nginx","template":{"metadata":{"labels":{"app":"nginx-statefulset"}},"spec":{"containers":[{"command":["sh","-c","while true; do sleep 1; done"],"image":"k8s.gcr.io/nginx-slim:0.8","name":"nginx","ports":[{"containerPort":80,"name":"web"}]},{"image":"k8s.gcr.io/pause:2.0","name":"pause","ports":[{"containerPort":81,"name":"web-2"}]}],"terminationGracePeriodSeconds":5}},"updateStrategy":{"type":"RollingUpdate"}}}
 kubernetes.io/change-cause:kubectl apply --filename=hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true --server=http://127.0.0.1:8080 --match-server-version=true]:
(BE1129 06:49:09.006122   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx will roll back to Pod Template:
  Labels:	app=nginx-statefulset
  Containers:
   nginx:
    Image:	k8s.gcr.io/nginx-slim:0.7
    Port:	80/TCP
... skipping 3 lines ...
      -c
      while true; do sleep 1; done
    Environment:	<none>
    Mounts:	<none>
  Volumes:	<none>
 (dry run)
E1129 06:49:09.100879   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:435: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(Bapps.sh:436: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(BE1129 06:49:09.192737   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:437: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE1129 06:49:09.293084   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps/nginx rolled back
apps.sh:440: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
(Bapps.sh:441: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(BSuccessful
message:error: unable to find specified revision 1000000 in history
has:unable to find specified revision
apps.sh:445: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.7:
(Bapps.sh:446: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 1
(Bstatefulset.apps/nginx rolled back
apps.sh:449: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}: k8s.gcr.io/nginx-slim:0.8:
(BE1129 06:49:10.007246   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:450: Successful get statefulset {{range.items}}{{(index .spec.template.spec.containers 1).image}}:{{end}}: k8s.gcr.io/pause:2.0:
(BE1129 06:49:10.102025   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
apps.sh:451: Successful get statefulset {{range.items}}{{(len .spec.template.spec.containers)}}{{end}}: 2
(BE1129 06:49:10.194071   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
statefulset.apps "nginx" deleted
I1129 06:49:10.222914   54238 stateful_set.go:420] StatefulSet has been deleted namespace-1575010147-19742/nginx
E1129 06:49:10.294210   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ exit code: 0
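The apps.sh:425-451 checks above exercise kubectl rollout undo against the nginx StatefulSet built from hack/testdata/rollingupdate-statefulset.yaml. A rough sketch of the commands behind those checks, with flags inferred from the recorded change-cause annotations and the "(dry run)" line rather than copied from the script:

# apply revision 1, then revision 2 (paths taken from the change-cause output above)
kubectl apply -f hack/testdata/rollingupdate-statefulset.yaml --record=true
kubectl apply -f hack/testdata/rollingupdate-statefulset-rv2.yaml --record=true

# preview the rollback, then perform it ("statefulset.apps/nginx rolled back")
kubectl rollout undo statefulset/nginx --dry-run
kubectl rollout undo statefulset/nginx

# a nonexistent revision fails with "unable to find specified revision 1000000 in history"
kubectl rollout undo statefulset/nginx --to-revision=1000000

# the image assertions read the pod template back with a go-template, e.g.
kubectl get statefulset -o go-template='{{range.items}}{{(index .spec.template.spec.containers 0).image}}:{{end}}'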
Recording: run_lists_tests
Running command: run_lists_tests

+++ Running case: test-cmd.run_lists_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
... skipping 17 lines ...
+++ command: run_multi_resources_tests
+++ [1129 06:49:10] Creating namespace namespace-1575010150-11326
namespace/namespace-1575010150-11326 created
Context "test" modified.
+++ [1129 06:49:11] Testing kubectl(v1:multiple resources)
Testing with file hack/testdata/multi-resource-yaml.yaml and replace with file hack/testdata/multi-resource-yaml-modify.yaml
E1129 06:49:11.008395   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:11.102903   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:11.195158   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:11.295327   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I1129 06:49:11.324607   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"c7c6844c-dffb-465a-82d9-15c4b0147d59", APIVersion:"v1", ResourceVersion:"2757", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-g52zs
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BNAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
... skipping 18 lines ...
Name:         mock
Namespace:    namespace-1575010150-11326
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 8 lines ...
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I1129 06:49:11.917233   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"a0f5abb0-76ca-4006-aa9e-37e4de7c6236", APIVersion:"v1", ResourceVersion:"2771", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-vbxk4
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:12.009485   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:12.103891   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:12.196208   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
E1129 06:49:12.296243   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(Bservice/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bservice/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
E1129 06:49:13.010281   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Testing with file hack/testdata/multi-resource-list.json and replace with file hack/testdata/multi-resource-list-modify.json
E1129 06:49:13.104669   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(Bgeneric-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:13.197248   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:13.297444   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I1129 06:49:13.344426   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"d7d30608-4f30-4891-8126-ebdcb3816d1d", APIVersion:"v1", ResourceVersion:"2797", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-28gdm
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BNAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
... skipping 18 lines ...
Name:         mock
Namespace:    namespace-1575010150-11326
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 7 lines ...
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-28gdm
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I1129 06:49:13.933566   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"e108fcb4-c434-4201-bd3c-15f8410c1c8c", APIVersion:"v1", ResourceVersion:"2811", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-2fr9k
E1129 06:49:14.011237   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(Bgeneric-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:14.105718   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:14.198212   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
E1129 06:49:14.298401   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(Bservice/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bservice/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(BE1129 06:49:15.012365   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service "mock" deleted
replicationcontroller "mock" deleted
Testing with file hack/testdata/multi-resource-json.json and replace with file hack/testdata/multi-resource-json-modify.json
E1129 06:49:15.106556   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:15.199084   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:15.299688   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I1129 06:49:15.355492   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"3d15c019-df88-48bb-bc9d-564852134aed", APIVersion:"v1", ResourceVersion:"2835", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-q8np6
generic-resources.sh:72: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:80: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(BNAME           TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
... skipping 18 lines ...
Name:         mock
Namespace:    namespace-1575010150-11326
Selector:     app=mock
Labels:       app=mock
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 7 lines ...
  Normal  SuccessfulCreate  0s    replication-controller  Created pod: mock-q8np6
service "mock" deleted
replicationcontroller "mock" deleted
service/mock replaced
replicationcontroller/mock replaced
I1129 06:49:15.994968   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"048f6dbe-6704-4bd9-93d4-a48865b90bb9", APIVersion:"v1", ResourceVersion:"2847", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-8j8vh
E1129 06:49:16.013185   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:16.107442   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:16.200014   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:16.300950   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
replicationcontroller/mock edited
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(Bservice/mock labeled
replicationcontroller/mock labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bservice/mock annotated
replicationcontroller/mock annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(BE1129 06:49:17.014277   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
replicationcontroller "mock" deleted
E1129 06:49:17.108872   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Testing with file hack/testdata/multi-resource-rclist.json and replace with file hack/testdata/multi-resource-rclist-modify.json
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:17.201029   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:17.302168   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock created
I1129 06:49:17.435615   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"cebf2b83-f8b2-4062-98bd-c0cb1b646729", APIVersion:"v1", ResourceVersion:"2870", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-q2kxd
replicationcontroller/mock2 created
I1129 06:49:17.441705   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock2", UID:"8950fc46-ac07-4e6b-b1f1-af57d7934f91", APIVersion:"v1", ResourceVersion:"2872", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-s7q2l
generic-resources.sh:78: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
(BNAME    DESIRED   CURRENT   READY   AGE
... skipping 3 lines ...
Namespace:    namespace-1575010150-11326
Selector:     app=mock
Labels:       app=mock
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 11 lines ...
Namespace:    namespace-1575010150-11326
Selector:     app=mock2
Labels:       app=mock2
              status=replaced
Annotations:  <none>
Replicas:     1 current / 1 desired
Pods Status:  0 Running / 1 Waiting / 0 Succeeded / 0 Failed
Pod Template:
  Labels:  app=mock2
  Containers:
   mock-container:
    Image:        k8s.gcr.io/pause:2.0
    Port:         9949/TCP
... skipping 8 lines ...
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
replicationcontroller/mock replaced
I1129 06:49:17.950618   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"66b3f0c7-3a43-461c-98e2-9e4fa8963352", APIVersion:"v1", ResourceVersion:"2886", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-97z5b
replicationcontroller/mock2 replaced
I1129 06:49:17.956011   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock2", UID:"01534223-3e1f-4015-b327-063b092121f6", APIVersion:"v1", ResourceVersion:"2888", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock2-tlvqs
E1129 06:49:18.015424   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:102: Successful get rc mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:18.109844   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:104: Successful get rc mock2 {{.metadata.labels.status}}: replaced
(BE1129 06:49:18.201943   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock edited
E1129 06:49:18.303287   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/mock2 edited
generic-resources.sh:120: Successful get rc mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:122: Successful get rc mock2 {{.metadata.labels.status}}: edited
(Breplicationcontroller/mock labeled
replicationcontroller/mock2 labeled
generic-resources.sh:140: Successful get rc mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:142: Successful get rc mock2 {{.metadata.labels.labeled}}: true
(Breplicationcontroller/mock annotated
replicationcontroller/mock2 annotated
generic-resources.sh:159: Successful get rc mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:161: Successful get rc mock2 {{.metadata.annotations.annotated}}: true
(BE1129 06:49:19.016533   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller "mock" deleted
replicationcontroller "mock2" deleted
Testing with file hack/testdata/multi-resource-svclist.json and replace with file hack/testdata/multi-resource-svclist-modify.json
E1129 06:49:19.110880   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:63: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:19.203031   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:64: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:19.304638   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
service/mock2 created
generic-resources.sh:70: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:mock2:
(BNAME    TYPE        CLUSTER-IP   EXTERNAL-IP   PORT(S)   AGE
mock    ClusterIP   10.0.0.79    <none>        99/TCP    0s
mock2   ClusterIP   10.0.0.98    <none>        99/TCP    0s
... skipping 25 lines ...
Events:            <none>
service "mock" deleted
service "mock2" deleted
service/mock replaced
service/mock2 replaced
generic-resources.sh:96: Successful get services mock {{.metadata.labels.status}}: replaced
(BE1129 06:49:20.017671   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:98: Successful get services mock2 {{.metadata.labels.status}}: replaced
(BE1129 06:49:20.111942   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:20.204167   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock edited
service/mock2 edited
E1129 06:49:20.305791   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:114: Successful get services mock {{.metadata.labels.status}}: edited
(Bgeneric-resources.sh:116: Successful get services mock2 {{.metadata.labels.status}}: edited
(Bservice/mock labeled
service/mock2 labeled
generic-resources.sh:134: Successful get services mock {{.metadata.labels.labeled}}: true
(Bgeneric-resources.sh:136: Successful get services mock2 {{.metadata.labels.labeled}}: true
(BI1129 06:49:20.706014   54238 horizontal.go:341] Horizontal Pod Autoscaler frontend has been deleted in namespace-1575010138-15588
service/mock annotated
service/mock2 annotated
generic-resources.sh:153: Successful get services mock {{.metadata.annotations.annotated}}: true
(Bgeneric-resources.sh:155: Successful get services mock2 {{.metadata.annotations.annotated}}: true
(Bservice "mock" deleted
service "mock2" deleted
E1129 06:49:21.018594   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:173: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:21.113093   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:174: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:21.205214   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:21.307177   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/mock created
replicationcontroller/mock created
I1129 06:49:21.587174   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010150-11326", Name:"mock", UID:"7db22a53-68cb-43a6-a658-dfe8d6d63b4c", APIVersion:"v1", ResourceVersion:"2949", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: mock-8pfls
generic-resources.sh:180: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bgeneric-resources.sh:181: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: mock:
(Bservice "mock" deleted
replicationcontroller "mock" deleted
E1129 06:49:22.019729   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:187: Successful get services {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:22.114105   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
generic-resources.sh:188: Successful get rc {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
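Each "Testing with file ... and replace with file ..." block above runs the same generic-resources.sh cycle against a multi-resource manifest (service mock plus replication controller mock): create, replace from the -modify variant, edit, label, annotate, delete. Approximately, using the file names printed in the log (the "edited" step, presumably kubectl edit driven by a scripted EDITOR, is omitted here):

F=hack/testdata/multi-resource-yaml.yaml
M=hack/testdata/multi-resource-yaml-modify.yaml

kubectl create -f "$F"                   # service/mock + replicationcontroller/mock created
kubectl replace -f "$M"                  # sets the status=replaced label asserted above
kubectl label -f "$M" labeled=true       # {{.metadata.labels.labeled}} becomes true
kubectl annotate -f "$M" annotated=true  # {{.metadata.annotations.annotated}} becomes true
kubectl delete -f "$M"                   # service "mock" deleted, replicationcontroller "mock" deleted

# the assertions themselves are go-template reads, e.g.
kubectl get rc mock -o go-template='{{.metadata.labels.status}}'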
Recording: run_persistent_volumes_tests
Running command: run_persistent_volumes_tests

+++ Running case: test-cmd.run_persistent_volumes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volumes_tests
E1129 06:49:22.206177   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1129 06:49:22] Creating namespace namespace-1575010162-12969
namespace/namespace-1575010162-12969 created
E1129 06:49:22.308183   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1129 06:49:22] Testing persistent volumes
storage.sh:30: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpersistentvolume/pv0001 created
storage.sh:33: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(Bpersistentvolume "pv0001" deleted
persistentvolume/pv0002 created
storage.sh:36: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0002:
(BE1129 06:49:23.021005   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0002" deleted
E1129 06:49:23.115328   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:23.207381   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume/pv0003 created
E1129 06:49:23.218392   54238 pv_protection_controller.go:116] PV pv0003 failed with : Operation cannot be fulfilled on persistentvolumes "pv0003": the object has been modified; please apply your changes to the latest version and try again
storage.sh:39: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0003:
(BE1129 06:49:23.309283   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolume "pv0003" deleted
storage.sh:42: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpersistentvolume/pv0001 created
storage.sh:45: Successful get pv {{range.items}}{{.metadata.name}}:{{end}}: pv0001:
(BSuccessful
message:warning: deleting cluster-scoped resources, not scoped to the provided namespace
... skipping 9 lines ...
Running command: run_persistent_volume_claims_tests

+++ Running case: test-cmd.run_persistent_volume_claims_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_persistent_volume_claims_tests
+++ [1129 06:49:23] Creating namespace namespace-1575010163-15133
E1129 06:49:24.022259   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1575010163-15133 created
Context "test" modified.
+++ [1129 06:49:24] Testing persistent volumes claims
E1129 06:49:24.116578   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:64: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:24.208457   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:24.310486   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
persistentvolumeclaim/myclaim-1 created
I1129 06:49:24.330787   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-1", UID:"4255b8d5-74e1-41ef-97b2-ce6fe5a65b30", APIVersion:"v1", ResourceVersion:"2987", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I1129 06:49:24.333888   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-1", UID:"4255b8d5-74e1-41ef-97b2-ce6fe5a65b30", APIVersion:"v1", ResourceVersion:"2989", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
storage.sh:67: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-1:
(BI1129 06:49:24.501191   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-1", UID:"4255b8d5-74e1-41ef-97b2-ce6fe5a65b30", APIVersion:"v1", ResourceVersion:"2991", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim "myclaim-1" deleted
... skipping 3 lines ...
storage.sh:71: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-2:
(Bpersistentvolumeclaim "myclaim-2" deleted
I1129 06:49:24.835328   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-2", UID:"a48b6db2-533d-4985-9bd0-0f219cf5811d", APIVersion:"v1", ResourceVersion:"2998", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
I1129 06:49:24.994087   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-3", UID:"05c7077e-7ad6-44c4-8f62-15b395e509dd", APIVersion:"v1", ResourceVersion:"3001", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim/myclaim-3 created
I1129 06:49:24.997451   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-3", UID:"05c7077e-7ad6-44c4-8f62-15b395e509dd", APIVersion:"v1", ResourceVersion:"3003", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
E1129 06:49:25.023385   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:75: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: myclaim-3:
(BE1129 06:49:25.117714   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:49:25.163727   54238 event.go:281] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"namespace-1575010163-15133", Name:"myclaim-3", UID:"05c7077e-7ad6-44c4-8f62-15b395e509dd", APIVersion:"v1", ResourceVersion:"3007", FieldPath:""}): type: 'Normal' reason: 'FailedBinding' no persistent volumes available for this claim and no storage class is set
persistentvolumeclaim "myclaim-3" deleted
E1129 06:49:25.209690   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
storage.sh:78: Successful get pvc {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
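The storage.sh checks above create and delete PersistentVolumes pv0001-pv0003 and PersistentVolumeClaims myclaim-1..3; the FailedBinding events are expected because no persistent volume matches and no storage class is set. The assertions are plain go-template reads of the object names, along the lines of:

# how storage.sh verifies which claims exist at each step
kubectl get pvc -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
# e.g. "myclaim-3:" right after myclaim-3 is created, empty once it is deleted

# the persistent volume checks use the same template against pv
kubectl get pv -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'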
Recording: run_storage_class_tests
Running command: run_storage_class_tests
E1129 06:49:25.311633   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_storage_class_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_storage_class_tests
+++ [1129 06:49:25] Testing storage class
storage.sh:92: Successful get storageclass {{range.items}}{{.metadata.name}}:{{end}}: 
... skipping 6 lines ...
Recording: run_nodes_tests
Running command: run_nodes_tests

+++ Running case: test-cmd.run_nodes_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_nodes_tests
E1129 06:49:26.024562   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ [1129 06:49:26] Testing kubectl(v1:nodes)
core.sh:1375: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
(BE1129 06:49:26.118865   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Name:
E1129 06:49:26.210659   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
matched Labels:
matched CreationTimestamp:
matched Conditions:
matched Addresses:
matched Capacity:
matched Pods:
... skipping 40 lines ...
  Resource           Requests  Limits
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
(BE1129 06:49:26.312600   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1379: Successful describe
Name:               127.0.0.1
Roles:              <none>
Labels:             <none>
Annotations:        node.alpha.kubernetes.io/ttl: 0
CreationTimestamp:  Fri, 29 Nov 2019 06:45:16 +0000
... skipping 319 lines ...
  --------           --------  ------
  cpu                0 (0%)    0 (0%)
  memory             0 (0%)    0 (0%)
  ephemeral-storage  0 (0%)    0 (0%)
Events:              <none>
(Bcore.sh:1395: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
(BE1129 06:49:27.025688   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
E1129 06:49:27.120267   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1398: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
(BE1129 06:49:27.211639   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 patched
E1129 06:49:27.313713   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
core.sh:1401: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
(Btokenreview.authentication.k8s.io/<unknown> created
tokenreview.authentication.k8s.io/<unknown> created
+++ exit code: 0
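core.sh:1395-1401 above flip the unschedulable flag on the only node and read it back. The two "node/127.0.0.1 patched" lines correspond to patches along these lines (the payloads are not printed in the log, so they are assumed):

kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":true}}'   # core.sh:1398 then sees true
kubectl patch node 127.0.0.1 -p '{"spec":{"unschedulable":null}}'   # core.sh:1401 sees <no value> again

# the assertion itself is a go-template read
kubectl get nodes 127.0.0.1 -o go-template='{{.spec.unschedulable}}'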
Recording: run_authorization_tests
Running command: run_authorization_tests
... skipping 60 lines ...
Successful
message:yes
has:yes
Successful
message:yes
has:yes
E1129 06:49:28.026837   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:28.121445   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Warning: the server doesn't have a resource type 'invalid_resource'
yes
has:the server doesn't have a resource type
E1129 06:49:28.212709   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has:yes
E1129 06:49:28.314839   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: --subresource can not be used with NonResourceURL
has:subresource can not be used with NonResourceURL
Successful
Successful
message:yes
0
has:0
... skipping 15 lines ...
message:yes
has not:Warning
Successful
message:Warning: resource 'nodes' is not namespace scoped
yes
has:Warning: resource 'nodes' is not namespace scoped
E1129 06:49:29.027858   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:yes
has not:Warning: resource 'nodes' is not namespace scoped
E1129 06:49:29.122511   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
clusterrole.rbac.authorization.k8s.io/testing-CR reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[create delete deletecollection get list patch update watch] APIGroups:[] Resources:[pods] ResourceNames:[] NonResourceURLs:[]}
clusterrolebinding.rbac.authorization.k8s.io/testing-CRB reconciled
	reconciliation required create
... skipping 4 lines ...
	missing subjects added:
		{Kind:Group APIGroup:rbac.authorization.k8s.io Name:system:masters Namespace:}
role.rbac.authorization.k8s.io/testing-R reconciled
	reconciliation required create
	missing rules added:
		{Verbs:[get list watch] APIGroups:[] Resources:[configmaps] ResourceNames:[] NonResourceURLs:[]}
E1129 06:49:29.213685   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:821: Successful get rolebindings -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-RB:
(BE1129 06:49:29.315896   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
legacy-script.sh:822: Successful get roles -n some-other-random -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-R:
(Blegacy-script.sh:823: Successful get clusterrolebindings -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CRB:
(Blegacy-script.sh:824: Successful get clusterroles -l test-cmd=auth {{range.items}}{{.metadata.name}}:{{end}}: testing-CR:
(BSuccessful
message:error: only rbac.authorization.k8s.io/v1 is supported: not *v1beta1.ClusterRole
has:only rbac.authorization.k8s.io/v1 is supported
rolebinding.rbac.authorization.k8s.io "testing-RB" deleted
role.rbac.authorization.k8s.io "testing-R" deleted
warning: deleting cluster-scoped resources, not scoped to the provided namespace
clusterrole.rbac.authorization.k8s.io "testing-CR" deleted
clusterrolebinding.rbac.authorization.k8s.io "testing-CRB" deleted
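The yes/no blocks above are kubectl auth can-i probes (including the --subresource/NonResourceURL error and the namespace-scope warnings), and the "reconciliation required create" output comes from kubectl auth reconcile creating the testing-CR/CRB/RB/R objects. Illustrative invocations, with resource names taken from the log and the fixture path left as a placeholder:

kubectl auth can-i create pods                  # prints yes or no
kubectl auth can-i get pods --subresource=log
kubectl auth can-i get /logs/                   # NonResourceURL form; cannot be combined with --subresource
kubectl auth can-i list nodes                   # the "not namespace scoped" warning above comes from probing a cluster-scoped resource

# reconcile creates the missing RBAC objects and reports the rules/subjects it added
kubectl auth reconcile -f <rbac-fixture.yaml>   # placeholder path, not the script's actual file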
... skipping 11 lines ...
Running command: run_resource_aliasing_tests

+++ Running case: test-cmd.run_resource_aliasing_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_resource_aliasing_tests
+++ [1129 06:49:30] Creating namespace namespace-1575010170-18359
E1129 06:49:30.028903   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1575010170-18359 created
E1129 06:49:30.123590   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
+++ [1129 06:49:30] Testing resource aliasing
E1129 06:49:30.214807   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
replicationcontroller/cassandra created
I1129 06:49:30.298458   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010170-18359", Name:"cassandra", UID:"20f49609-04da-4adf-bd88-560b780a87d0", APIVersion:"v1", ResourceVersion:"3032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-9s8kh
I1129 06:49:30.301330   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicationController", Namespace:"namespace-1575010170-18359", Name:"cassandra", UID:"20f49609-04da-4adf-bd88-560b780a87d0", APIVersion:"v1", ResourceVersion:"3032", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: cassandra-r4fck
E1129 06:49:30.316643   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
service/cassandra created
Waiting for Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}} : expected: cassandra:cassandra:cassandra:cassandra::, got: cassandra:cassandra:cassandra:cassandra:

discovery.sh:91: FAIL!
Get all -l'app=cassandra' {{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}
  Expected: cassandra:cassandra:cassandra:cassandra::
  Got:      cassandra:cassandra:cassandra:cassandra:
55 /home/prow/go/src/k8s.io/kubernetes/hack/lib/test.sh
... skipping 9 lines ...
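discovery.sh:91 above is the "all" alias check: after creating the cassandra replication controller and service it lists everything labelled app=cassandra and joins the label values with a go-template, retrying (the "Waiting for Get all ..." line) until the string matches. The underlying query is along the lines of:

kubectl get all -l 'app=cassandra' \
  -o go-template='{{range.items}}{{range .metadata.labels}}{{.}}:{{end}}{{end}}'
# expected: cassandra:cassandra:cassandra:cassandra::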
Running command: run_kubectl_explain_tests

+++ Running case: test-cmd.run_kubectl_explain_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_explain_tests
+++ [1129 06:49:30] Testing kubectl(v1:explain)
E1129 06:49:31.030016   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E1129 06:49:31.124638   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

DESCRIPTION:
     Pod is a collection of containers that can run on a host. This resource is
     created by clients and scheduled onto hosts.
... skipping 21 lines ...

   status	<Object>
     Most recently observed status of the pod. This data may not be up to date.
     Populated by the system. Read-only. More info:
     https://git.k8s.io/community/contributors/devel/sig-architecture/api-conventions.md#spec-and-status

E1129 06:49:31.215617   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:31.317824   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
KIND:     Pod
VERSION:  v1

FIELD:    message <string>

DESCRIPTION:
... skipping 48 lines ...
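The two Pod summaries and the FIELD: message block above are kubectl explain output, first for the Pod kind and then for a single field path (only the field output is visible here, so the exact path is an assumption):

kubectl explain pods                    # KIND: Pod, VERSION: v1, top-level spec/status fields
kubectl explain pods.status.message     # assumed path for the "FIELD: message <string>" block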
+++ command: run_kubectl_sort_by_tests
+++ [1129 06:49:31] Testing kubectl --sort-by
get.sh:256: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BNo resources found in namespace-1575010170-18359 namespace.
No resources found in namespace-1575010170-18359 namespace.
get.sh:264: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(BE1129 06:49:32.031021   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:32.125833   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
E1129 06:49:32.216744   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:268: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
(BE1129 06:49:32.318894   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAME        READY   STATUS    RESTARTS   AGE
valid-pod   0/1     Pending   0          0s
has:valid-pod
Successful
message:I1129 06:49:32.397538   85963 loader.go:375] Config loaded from file:  /tmp/tmp.BYbdlG6rFv/.kube/config
... skipping 29 lines ...
(Bwarning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:283: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bget.sh:288: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpod/sorted-pod1 created
get.sh:292: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:
(BE1129 06:49:33.032211   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:33.127037   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod2 created
E1129 06:49:33.217950   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:296: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:
(BE1129 06:49:33.320009   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/sorted-pod3 created
get.sh:300: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
(BSuccessful
message:sorted-pod1:sorted-pod2:sorted-pod3:
has:sorted-pod1:sorted-pod2:sorted-pod3:
Successful
... skipping 27 lines ...
has not:Table
get.sh:325: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: sorted-pod1:sorted-pod2:sorted-pod3:
(Bwarning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "sorted-pod1" force deleted
pod "sorted-pod2" force deleted
pod "sorted-pod3" force deleted
E1129 06:49:34.033021   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:34.128046   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:329: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(B+++ exit code: 0
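The sorted-pod1/2/3 checks above exercise kubectl get --sort-by on pod names; roughly:

kubectl get pods --sort-by=.metadata.name \
  -o go-template='{{range.items}}{{.metadata.name}}:{{end}}'
# expected: sorted-pod1:sorted-pod2:sorted-pod3: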
Recording: run_kubectl_all_namespace_tests
Running command: run_kubectl_all_namespace_tests

+++ Running case: test-cmd.run_kubectl_all_namespace_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_kubectl_all_namespace_tests
+++ [1129 06:49:34] Testing kubectl --all-namespace
E1129 06:49:34.219066   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:342: Successful get namespaces {{range.items}}{{if eq .metadata.name \"default\"}}{{.metadata.name}}:{{end}}{{end}}: default:
(BE1129 06:49:34.321079   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:346: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
(Bpod/valid-pod created
get.sh:350: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
(BNAMESPACE                    NAME        READY   STATUS    RESTARTS   AGE
namespace-1575010170-18359   valid-pod   0/1     Pending   0          0s
namespace/all-ns-test-1 created
serviceaccount/test created
namespace/all-ns-test-2 created
serviceaccount/test created
E1129 06:49:35.034136   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         1s
all-ns-test-1                test      0         1s
all-ns-test-2                default   0         1s
all-ns-test-2                test      0         1s
... skipping 50 lines ...
namespace-1575010150-11326   default   0         25s
namespace-1575010162-12969   default   0         13s
namespace-1575010163-15133   default   0         11s
namespace-1575010170-18359   default   0         5s
some-other-random            default   0         6s
has:all-ns-test-2
E1129 06:49:35.129283   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:NAMESPACE                    NAME      SECRETS   AGE
all-ns-test-1                default   0         1s
all-ns-test-1                test      0         1s
all-ns-test-2                default   0         1s
all-ns-test-2                test      0         1s
... skipping 50 lines ...
namespace-1575010150-11326   default   0         25s
namespace-1575010162-12969   default   0         13s
namespace-1575010163-15133   default   0         11s
namespace-1575010170-18359   default   0         5s
some-other-random            default   0         6s
has:all-ns-test-2
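The two service-account listings above come from kubectl get with --all-namespaces after a "test" service account is created in all-ns-test-1 and all-ns-test-2; the check only requires that the expected rows appear in the output. For example:

kubectl get serviceaccounts --all-namespaces
# output includes rows such as:
#   all-ns-test-1   test   0   1s
#   all-ns-test-2   test   0   1s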
E1129 06:49:35.219869   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace "all-ns-test-1" deleted
E1129 06:49:35.322233   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 20 lines ...
namespace "all-ns-test-2" deleted
E1129 06:49:41.041178   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
... skipping 18 lines ...
I1129 06:49:45.297440   54238 namespace_controller.go:185] Namespace has been deleted all-ns-test-1
E1129 06:49:45.333776   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
get.sh:376: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: valid-pod:
warning: Immediate deletion does not wait for confirmation that the running resource has been terminated. The resource may continue to run on the cluster indefinitely.
pod "valid-pod" force deleted
get.sh:380: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
get.sh:384: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
Successful
... skipping 6 lines ...

+++ Running case: test-cmd.run_template_output_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_template_output_tests
+++ [1129 06:49:45] Testing --template support on commands
+++ [1129 06:49:45] Creating namespace namespace-1575010185-21619
E1129 06:49:46.047190   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
namespace/namespace-1575010185-21619 created
Context "test" modified.
E1129 06:49:46.142352   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
template-output.sh:29: Successful get pods {{range.items}}{{.metadata.name}}:{{end}}: 
E1129 06:49:46.231018   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:46.335025   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
pod/valid-pod created
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "v1",
... skipping 59 lines ...
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E1129 06:49:47.048508   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E1129 06:49:47.143357   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:scale-1:
has:scale-1:
Successful
message:redis-slave:
has:redis-slave:
E1129 06:49:47.232026   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl convert is DEPRECATED and will be removed in a future version.
In order to convert, kubectl apply the object to the cluster, then kubectl get at the desired version.
Successful
message:nginx:
has:nginx:
E1129 06:49:47.336112   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
kubectl run --generator=job/v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead.
Successful
message:pi:
has:pi:
Successful
message:127.0.0.1:
... skipping 23 lines ...
Successful
message:myclusterrole:
has:myclusterrole:
Successful
message:foo:
has:foo:
E1129 06:49:48.049593   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:cm:
has:cm:
E1129 06:49:48.144465   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
I1129 06:49:48.156814   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010185-21619", Name:"deploy", UID:"d51a9e80-7191-4b35-a642-7cb2e31c5d65", APIVersion:"apps/v1", ResourceVersion:"3127", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set deploy-74bcc58696 to 1
Successful
message:deploy:
has:deploy:
I1129 06:49:48.161963   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010185-21619", Name:"deploy-74bcc58696", UID:"7dd4a537-5445-427f-b1b5-b495408c96f3", APIVersion:"apps/v1", ResourceVersion:"3128", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: deploy-74bcc58696-8j8lb
E1129 06:49:48.232944   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
cronjob.batch/pi created
E1129 06:49:48.337200   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:bar:
has:bar:
... skipping 15 lines ...
Successful
message:foo:
has:foo:
Successful
message:valid-pod:
has:valid-pod:
E1129 06:49:49.050684   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
Successful
message:valid-pod:
has:valid-pod:
E1129 06:49:49.145483   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:kubernetes:
has:kubernetes:
E1129 06:49:49.234142   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:valid-pod:
has:valid-pod:
E1129 06:49:49.338202   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:foo:
has:foo:
Successful
message:foo:
has:foo:
... skipping 33 lines ...
preferences: {}
users: null
has:kind: Config
Successful
message:deploy:
has:deploy:
E1129 06:49:50.051691   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E1129 06:49:50.146676   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E1129 06:49:50.235148   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:deploy:
has:deploy:
E1129 06:49:50.339366   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:Config:
has:Config
Successful
message:apiVersion: v1
kind: ConfigMap
... skipping 19 lines ...
Running command: run_certificates_tests

+++ Running case: test-cmd.run_certificates_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_certificates_tests
+++ [1129 06:49:51] Testing certificates
E1129 06:49:51.052767   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:51.147814   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E1129 06:49:51.236151   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:29: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo approved
E1129 06:49:51.340381   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 80 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E1129 06:49:52.053892   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:40: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Approved
E1129 06:49:52.148909   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io "foo" deleted
E1129 06:49:52.237245   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:42: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
E1129 06:49:52.341420   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
certificate.sh:46: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo denied
{
    "apiVersion": "v1",
    "items": [
... skipping 38 lines ...
    }
}
certificate.sh:49: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
(Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:51: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
certificatesigningrequest.certificates.k8s.io/foo created
E1129 06:49:53.055013   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:54: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: 
E1129 06:49:53.149974   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo denied
E1129 06:49:53.238254   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
{
    "apiVersion": "v1",
    "items": [
        {
            "apiVersion": "certificates.k8s.io/v1beta1",
            "kind": "CertificateSigningRequest",
... skipping 31 lines ...
    "kind": "List",
    "metadata": {
        "resourceVersion": "",
        "selfLink": ""
    }
}
E1129 06:49:53.342768   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificate.sh:57: Successful get csr/foo {{range.status.conditions}}{{.type}}{{end}}: Denied
(Bcertificatesigningrequest.certificates.k8s.io "foo" deleted
certificate.sh:59: Successful get csr {{range.items}}{{.metadata.name}}{{end}}: 
+++ exit code: 0
Recording: run_cluster_management_tests
Running command: run_cluster_management_tests
... skipping 2 lines ...
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_cluster_management_tests
+++ [1129 06:49:53] Testing cluster-management commands
node-management.sh:27: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
pod/test-pod-1 created
pod/test-pod-2 created
E1129 06:49:54.056159   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:76: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node/127.0.0.1 tainted
E1129 06:49:54.150923   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:79: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: dedicated=foo:PreferNoSchedule
E1129 06:49:54.239192   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 untainted
E1129 06:49:54.343808   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:83: Successful get nodes 127.0.0.1 {{range .spec.taints}}{{if eq .key \"dedicated\"}}{{.key}}={{.value}}:{{.effect}}{{end}}{{end}}: 
node-management.sh:87: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node-management.sh:89: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:93: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 cordoned (dry run)
node/127.0.0.1 drained (dry run)
node-management.sh:96: Successful get nodes {{range.items}}{{.metadata.name}}:{{end}}: 127.0.0.1:
node-management.sh:97: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:101: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
E1129 06:49:55.057109   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:103: Successful get pods {{range .items}}{{.metadata.name}},{{end}}: test-pod-1,test-pod-2,
E1129 06:49:55.151998   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 cordoned
node/127.0.0.1 drained
E1129 06:49:55.240171   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node-management.sh:106: Successful get pods/test-pod-2 {{.metadata.name}}: test-pod-2
(Bpod "test-pod-2" deleted
E1129 06:49:55.344820   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
node/127.0.0.1 uncordoned
node-management.sh:111: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node-management.sh:115: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
Successful
message:node/127.0.0.1 already uncordoned (dry run)
has:already uncordoned
node-management.sh:119: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: <no value>
node/127.0.0.1 labeled
node-management.sh:124: Successful get nodes 127.0.0.1 {{.metadata.labels.test}}: label
Successful
message:error: cannot specify both a node name and a --selector option
See 'kubectl drain -h' for help and examples
has:cannot specify both a node name
E1129 06:49:56.058133   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: USAGE: cordon NODE [flags]
See 'kubectl cordon -h' for help and examples
has:error\: USAGE\: cordon NODE
node/127.0.0.1 already uncordoned
E1129 06:49:56.153059   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:error: You must provide one or more resources by argument or filename.
Example resource specifications include:
   '-f rsrc.yaml'
   '--filename=rsrc.json'
   '<resource> <name>'
   '<resource>'
has:must provide one or more resources
E1129 06:49:56.241242   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:node/127.0.0.1 cordoned
has:node/127.0.0.1 cordoned
E1129 06:49:56.345839   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Successful
message:
has not:cordoned
node-management.sh:145: Successful get nodes 127.0.0.1 {{.spec.unschedulable}}: true
+++ exit code: 0
Recording: run_plugins_tests
... skipping 6 lines ...
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/version/kubectl-version
  - warning: kubectl-version overwrites existing command: "kubectl version"

error: one plugin warning was found
has:kubectl-version overwrites existing command: "kubectl version"
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo
  - warning: test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin: test/fixtures/pkg/kubectl/plugins/kubectl-foo

error: one plugin warning was found
has:test/fixtures/pkg/kubectl/plugins/foo/kubectl-foo is overshadowed by a similarly named plugin
Successful
message:The following compatible plugins are available:

test/fixtures/pkg/kubectl/plugins/kubectl-foo
has:plugins are available
Successful
message:Unable read directory "test/fixtures/pkg/kubectl/plugins/empty" from your PATH: open test/fixtures/pkg/kubectl/plugins/empty: no such file or directory. Skipping...
error: unable to find any kubectl plugins in your PATH
has:unable to find any kubectl plugins in your PATH
Successful
message:I am plugin foo
has:plugin foo
Successful
message:I am plugin bar called with args test/fixtures/pkg/kubectl/plugins/bar/kubectl-bar arg1
... skipping 7 lines ...
Server Version: version.Info{Major:"1", Minor:"18+", GitVersion:"v1.18.0-alpha.0.1307+875c5f6dd8410a", GitCommit:"875c5f6dd8410a71cde417ece2df22057f43aa72", GitTreeState:"clean", BuildDate:"2019-11-28T19:17:17Z", GoVersion:"go1.13.4", Compiler:"gc", Platform:"linux/amd64"}
has not:overshadows an existing plugin
+++ exit code: 0
Recording: run_impersonation_tests
Running command: run_impersonation_tests

E1129 06:49:57.059229   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
+++ Running case: test-cmd.run_impersonation_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_impersonation_tests
+++ [1129 06:49:57] Testing impersonation
Successful
message:error: requesting groups or user-extra for  without impersonating a user
has:without impersonating a user
E1129 06:49:57.154170   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:57.242290   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
certificatesigningrequest.certificates.k8s.io/foo created
E1129 06:49:57.346742   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
authorization.sh:68: Successful get csr/foo {{.spec.username}}: user1
authorization.sh:69: Successful get csr/foo {{range .spec.groups}}{{.}}{{end}}: system:authenticated
certificatesigningrequest.certificates.k8s.io "foo" deleted
certificatesigningrequest.certificates.k8s.io/foo created
authorization.sh:74: Successful get csr/foo {{len .spec.groups}}: 3
authorization.sh:75: Successful get csr/foo {{range .spec.groups}}{{.}} {{end}}: group2 group1 ,,,chameleon 
certificatesigningrequest.certificates.k8s.io "foo" deleted
+++ exit code: 0
Recording: run_wait_tests
Running command: run_wait_tests
E1129 06:49:58.060358   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource

+++ Running case: test-cmd.run_wait_tests 
+++ working dir: /home/prow/go/src/k8s.io/kubernetes
+++ command: run_wait_tests
+++ [1129 06:49:58] Testing kubectl wait
+++ [1129 06:49:58] Creating namespace namespace-1575010198-31783
namespace/namespace-1575010198-31783 created
E1129 06:49:58.156078   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
Context "test" modified.
E1129 06:49:58.243347   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-1 created
I1129 06:49:58.284495   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010198-31783", Name:"test-1", UID:"62c05ee6-21b7-4019-b32b-55fffdd4854a", APIVersion:"apps/v1", ResourceVersion:"3218", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-1-6d98955cc9 to 1
I1129 06:49:58.290876   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010198-31783", Name:"test-1-6d98955cc9", UID:"55cae8d0-340a-4222-82b1-6258068af9ca", APIVersion:"apps/v1", ResourceVersion:"3219", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-1-6d98955cc9-qxz2r
E1129 06:49:58.347982   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps/test-2 created
I1129 06:49:58.359243   54238 event.go:281] Event(v1.ObjectReference{Kind:"Deployment", Namespace:"namespace-1575010198-31783", Name:"test-2", UID:"139acb43-2b80-411f-9095-ed8e8a8d9c2d", APIVersion:"apps/v1", ResourceVersion:"3228", FieldPath:""}): type: 'Normal' reason: 'ScalingReplicaSet' Scaled up replica set test-2-65897ff84d to 1
I1129 06:49:58.363023   54238 event.go:281] Event(v1.ObjectReference{Kind:"ReplicaSet", Namespace:"namespace-1575010198-31783", Name:"test-2-65897ff84d", UID:"cc4a6f1a-3795-4a40-8f36-f929189f4ef6", APIVersion:"apps/v1", ResourceVersion:"3229", FieldPath:""}): type: 'Normal' reason: 'SuccessfulCreate' Created pod: test-2-65897ff84d-zm29w
wait.sh:36: Successful get deployments {{range .items}}{{.metadata.name}},{{end}}: test-1,test-2,
E1129 06:49:59.061520   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:59.157142   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:59.244443   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:49:59.348871   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:50:00.062642   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:50:00.158356   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:50:00.245603   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
E1129 06:50:00.350022   54238 reflector.go:156] k8s.io/client-go/metadata/metadatainformer/informer.go:89: Failed to list *v1.PartialObjectMetadata: the server could not find the requested resource
deployment.apps "test-1" deleted
deployment.apps "test-2" deleted
Successful
message:deployment.apps/test-1 condition met
deployment.apps/test-2 condition met
has:test-1 condition met
... skipping 24 lines ...
I1129 06:50:00.759930   50776 crdregistration_controller.go:142] Shutting down crd-autoregister controller
I1129 06:50:00.759949   50776 autoregister_controller.go:164] Shutting down autoregister controller
I1129 06:50:00.759951   50776 secure_serving.go:222] Stopped listening on 127.0.0.1:6443
I1129 06:50:00.759971   50776 cluster_authentication_trust_controller.go:463] Shutting down cluster_authentication_trust_controller controller
I1129 06:50:00.760398   50776 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1129 06:50:00.760464   50776 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
W1129 06:50:00.760635   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 28 lines ...
I1129 06:50:00.762940   50776 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
I1129 06:50:00.763262   50776 clientconn.go:825] blockingPicker: the picked transport is not ready, loop back to repick
... skipping 37 lines ...
junit report dir: /logs/artifacts
+++ [1129 06:50:00] Clean up complete
+ make test-integration
W1129 06:50:01.761032   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761081   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761081   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761483   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761493   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761497   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761560   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761562   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761595   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.761631   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
W1129 06:50:01.762279   50776 clientconn.go:1120] grpc: addrConn.createTransport failed to connect to {http://127.0.0.1:2379 0  <nil>}. Err :connection error: desc = "transport: Error while dialing dial tcp 127.0.0.1:2379: connect: connection refused". Reconnecting...
... skipping 121 lines (repeated grpc addrConn.createTransport "connection refused" warnings while dialing 127.0.0.1:2379) ...
+++ [1129 06:50:04] Checking etcd is on PATH
/home/prow/go/src/k8s.io/kubernetes/third_party/etcd/etcd
+++ [1129 06:50:04] Starting etcd instance
etcd --advertise-client-urls http://127.0.0.1:2379 --data-dir /tmp/tmp.hJe083gM1n --listen-client-urls http://127.0.0.1:2379 --debug > "/logs/artifacts/etcd.6f5eb711-1272-11ea-bad1-0265daafe4f5.root.log.DEBUG.20191129-065004.90266" 2>/dev/null
Waiting for etcd to come up.
+++ [1129 06:50:05] On try 1, etcd: : {"health":"true"}
{"header":{"cluster_id":"14841639068965178418","member_id":"10276657743932975437","revision":"2","raft_term":"2"}}+++ [1129 06:50:05] Running integration test cases
E1129 06:50:06.573478   50776 controller.go:183] StorageError: key not found, Code: 1, Key: /registry/masterleases/10.60.85.199, ResourceVersion: 0, AdditionalErrorMsg: 
Running tests for APIVersion: v1,admissionregistration.k8s.io/v1,admissionregistration.k8s.io/v1beta1,admission.k8s.io/v1,admission.k8s.io/v1beta1,apps/v1,apps/v1beta1,apps/v1beta2,auditregistration.k8s.io/v1alpha1,authentication.k8s.io/v1,authentication.k8s.io/v1beta1,authorization.k8s.io/v1,authorization.k8s.io/v1beta1,autoscaling/v1,autoscaling/v2beta1,autoscaling/v2beta2,batch/v1,batch/v1beta1,batch/v2alpha1,certificates.k8s.io/v1beta1,coordination.k8s.io/v1beta1,coordination.k8s.io/v1,discovery.k8s.io/v1alpha1,discovery.k8s.io/v1beta1,extensions/v1beta1,events.k8s.io/v1beta1,imagepolicy.k8s.io/v1alpha1,networking.k8s.io/v1,networking.k8s.io/v1beta1,node.k8s.io/v1alpha1,node.k8s.io/v1beta1,policy/v1beta1,rbac.authorization.k8s.io/v1,rbac.authorization.k8s.io/v1beta1,rbac.authorization.k8s.io/v1alpha1,scheduling.k8s.io/v1alpha1,scheduling.k8s.io/v1beta1,scheduling.k8s.io/v1,settings.k8s.io/v1alpha1,storage.k8s.io/v1beta1,storage.k8s.io/v1,storage.k8s.io/v1alpha1,flowcontrol.apiserver.k8s.io/v1alpha1,
+++ [1129 06:50:09] Running tests without code coverage
# k8s.io/kubernetes/test/integration/auth [k8s.io/kubernetes/test/integration/auth.test]
test/integration/auth/bootstraptoken_test.go:50:18: undefined: "k8s.io/kubernetes/test/e2e/lifecycle/bootstrap".GenerateTokenId
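
[editor's note] This compile error is the sole failure behind the "FAIL ... [build failed]" lines further down: the caller in test/integration/auth/bootstraptoken_test.go still references GenerateTokenId, while the golint cleanup in test/e2e/lifecycle/bootstrap presumably renamed the exported helper to GenerateTokenID (Go initialism style). A minimal caller-side sketch of the kind of update that would resolve it is shown below; the test body and the (string, error) signature are assumptions for illustration, only the import path is taken from the error message above.

    // Illustrative excerpt only; the real fix belongs in
    // test/integration/auth/bootstraptoken_test.go.
    package auth

    import (
        "testing"

        bootstraputil "k8s.io/kubernetes/test/e2e/lifecycle/bootstrap"
    )

    func TestBootstrapTokenIDRename(t *testing.T) {
        // The lint fix renames the Id suffix to ID, so callers must follow suit:
        //   before: bootstraputil.GenerateTokenId()
        //   after:  bootstraputil.GenerateTokenID()
        tokenID, err := bootstraputil.GenerateTokenID() // assumed post-rename name and signature
        if err != nil {
            t.Fatalf("generating bootstrap token id: %v", err)
        }
        if tokenID == "" {
            t.Fatal("expected a non-empty token id")
        }
    }
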
{"Time":"2019-11-29T06:51:17.515683894Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/podlogs","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/podlogs\t7.393s\n"}
{"Time":"2019-11-29T06:51:22.165506026Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/configmap","Output":"ok  \tk8s.io/kubernetes/test/integration/configmap\t3.743s\n"}
{"Time":"2019-11-29T06:51:37.617475539Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/certreload","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/certreload\t28.402s\n"}
{"Time":"2019-11-29T06:51:38.151352829Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/client","Test":"TestPatch","Output":"        \u0008patchpod\u0012\u0000\u001a\u0007default\"(/api/v1/namespaces/default/pods/patchpod*$9b163016-5035-4fca-9631-73a2af0456b42\u000419528\u0000B\u0008\u0008\ufffd\ufffd\ufffd\ufffd\u0005\u0010\u0000Z\n"}
{"Time":"2019-11-29T06:51:38.151402087Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/client","Test":"TestPatch","Output":"        \u0004name\u0012\u0005image*\u0000B\u0000j\u0014/dev/termination-logr\u0006Always\ufffd\u0001\u0000\ufffd\u0001\u0000\ufffd\u0001\u0000\ufffd\u0001\u0004File\u001a\u0006Always \u001e2\u000cClusterFirstB\u0000J\u0000R\u0000X\u0000`\u0000h\u0000r\u0000\ufffd\u0001\u0000\ufffd\u0001\u0000\ufffd\u0001\u0011default-scheduler\ufffd\u00016\n"}
{"Time":"2019-11-29T06:51:38.151414417Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/client","Test":"TestPatch","Output":"        \u001cnode.kubernetes.io/not-ready\u0012\u0006Exists\u001a\u0000\"\tNoExecute(\ufffd\u0002\ufffd\u00018\n"}
... skipping 193 lines ...
{"Time":"2019-11-29T06:59:32.698184375Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/master","Test":"TestObjectSizeResponses/1_MB_finalizers","Output":" [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:45428]\n"}
{"Time":"2019-11-29T06:59:33.279772122Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/master","Test":"TestObjectSizeResponses/2_MB_finalizers","Output":" [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:45428]\n"}
{"Time":"2019-11-29T06:59:56.063883479Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/master","Test":"TestServiceAlloc","Output":" [master.test/v0.0.0 (linux/amd64) kubernetes/$Format 127.0.0.1:52112]\n"}
{"Time":"2019-11-29T06:59:57.106750225Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/master","Output":"ok  \tk8s.io/kubernetes/test/integration/master\t371.720s\n"}
{"Time":"2019-11-29T07:00:20.559331229Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/scheduler","Output":"ok  \tk8s.io/kubernetes/test/integration/scheduler\t276.384s\n"}
{"Time":"2019-11-29T07:00:44.789442656Z","Action":"output","Package":"k8s.io/kubernetes/test/integration/apiserver/admissionwebhook","Output":"ok  \tk8s.io/kubernetes/test/integration/apiserver/admissionwebhook\t571.849s\n"}
FAIL	k8s.io/kubernetes/test/integration/auth [build failed]
{"Time":"2019-11-29T07:01:03.331054857Z","Action":"output","Package":"k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration","Output":"ok  \tk8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration\t232.044s\n"}
{"Time":"2019-11-29T07:02:51.388503755Z","Action":"output","Package":"k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration/conversion","Output":"ok  \tk8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/test/integration/conversion\t335.706s\n"}
✓  test/integration/apiserver/podlogs (7.393s)
✓  test/integration/configmap (3.743s)
✓  test/integration/apiserver/certreload (28.402s)
✓  test/integration/defaulttolerationseconds (3.641s)
... skipping 32 lines ...
✓  test/integration/serving (46.933s)
✓  test/integration/volume (1m34.625s)
✓  test/integration/volumescheduling (2m18.331s)
✓  test/integration/master (6m11.722s)
✓  test/integration/scheduler (4m36.388s)
✓  test/integration/apiserver/admissionwebhook (9m31.85s)
time="2019-11-29T07:02:57Z" level=warning msg="FAIL\tk8s.io/kubernetes/test/integration/auth [build failed]"
bad output from test2json: FAIL	k8s.io/kubernetes/test/integration/auth [build failed]
✓  vendor/k8s.io/apiextensions-apiserver/test/integration (3m52.045s)
✓  vendor/k8s.io/apiextensions-apiserver/test/integration/conversion (5m35.707s)

=== Skipped
=== SKIP: test/integration/client TestMultiWatch (0.00s)
I1129 06:51:57.806393  105574 apiapproval_controller.go:197] Shutting down KubernetesAPIApprovalPolicyConformantConditionController
... skipping 13 lines ...
=== SKIP: test/integration/scheduler_perf TestSchedule100Node3KPods (0.00s)
    scheduler_test.go:73: Skipping because we want to run short tests


DONE 2745 tests, 4 skipped in 5.243s
+++ [1129 07:02:57] Saved JUnit XML test report to /logs/artifacts/junit_304dbea7698c16157bb4586f231ea1f94495b046_20191129-065009.xml
make[1]: *** [Makefile:185: test] Error 1
!!! [1129 07:02:57] Call tree:
!!! [1129 07:02:57]  1: hack/make-rules/test-integration.sh:89 runTests(...)
+++ [1129 07:02:57] Cleaning up etcd
+++ [1129 07:02:57] Integration test cleanup complete
make: *** [Makefile:204: test-integration] Error 1
+ EXIT_VALUE=2
+ set +o xtrace
Cleaning up after docker in docker.
================================================================================
[Barnacle] 2019/11/29 07:02:57 Cleaning up Docker data root...
[Barnacle] 2019/11/29 07:02:57 Removing all containers.
... skipping 12 lines ...