PR: andyzhangx: fix: set ForceAttemptHTTP2 as false to increase ARM throttling limit
Result: FAILURE
Tests: 0 failed / 0 succeeded
Started: 2023-01-04 08:35
Elapsed: 1h14m
Revision: 9b2e9a6ab846434f409b84cbf98266f0ae9a8ee3
Refs: 1155
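
For context, the change under test is a Go HTTP transport setting. The sketch below shows what setting ForceAttemptHTTP2 to false on an http.Transport looks like. ForceAttemptHTTP2 is a real field in Go's net/http package, but the surrounding transport values and the rationale in the comments are assumptions inferred from the PR title only, not taken from the PR's actual diff.

package main

import (
	"net"
	"net/http"
	"time"
)

// newARMTransport builds a client transport that does not upgrade to HTTP/2.
// Per net/http, when a custom DialContext (or TLSClientConfig) is set, HTTP/2
// is only attempted if ForceAttemptHTTP2 is true; leaving it false keeps
// requests on HTTP/1.1, spread over multiple TCP connections rather than
// multiplexed onto a single HTTP/2 connection. The other field values mirror
// http.DefaultTransport and are illustrative only.
func newARMTransport() *http.Transport {
	return &http.Transport{
		Proxy: http.ProxyFromEnvironment,
		DialContext: (&net.Dialer{
			Timeout:   30 * time.Second,
			KeepAlive: 30 * time.Second,
		}).DialContext,
		MaxIdleConns:          100,
		IdleConnTimeout:       90 * time.Second,
		TLSHandshakeTimeout:   10 * time.Second,
		ExpectContinueTimeout: 1 * time.Second,
		ForceAttemptHTTP2:     false, // the setting named in the PR title
	}
}

func main() {
	client := &http.Client{Transport: newARMTransport()}
	_ = client
}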

No Test Failures!


Error lines from build-log.txt

... skipping 764 lines ...
certificate.cert-manager.io "selfsigned-cert" deleted
# Create secret for AzureClusterIdentity
./hack/create-identity-secret.sh
make[2]: Entering directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
make[2]: Nothing to be done for 'kubectl'.
make[2]: Leaving directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
Error from server (NotFound): secrets "cluster-identity-secret" not found
secret/cluster-identity-secret created
secret/cluster-identity-secret labeled
# Create customized cloud provider configs
./hack/create-custom-cloud-provider-config.sh
make[2]: Entering directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
make[2]: Nothing to be done for 'kubectl'.
... skipping 137 lines ...
# Wait for the kubeconfig to become available.
timeout --foreground 300 bash -c "while ! /home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/hack/tools/bin/kubectl-v1.22.4 get secrets | grep capz-d35khr-kubeconfig; do sleep 1; done"
capz-d35khr-kubeconfig                 cluster.x-k8s.io/secret   1      0s
# Get kubeconfig and store it locally.
/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/hack/tools/bin/kubectl-v1.22.4 get secrets capz-d35khr-kubeconfig -o json | jq -r .data.value | base64 --decode > ./kubeconfig
timeout --foreground 600 bash -c "while ! /home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/hack/tools/bin/kubectl-v1.22.4 --kubeconfig=./kubeconfig get nodes | grep control-plane; do sleep 1; done"
error: the server doesn't have a resource type "nodes"
capz-d35khr-control-plane-ngxp5   NotReady   control-plane,master   10s   v1.23.16-rc.0.37+548d5d2298c4a5
run "/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/hack/tools/bin/kubectl-v1.22.4 --kubeconfig=./kubeconfig ..." to work with the new target cluster
make[1]: Leaving directory '/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure'
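For reference, the same kubeconfig lookup can be done from Go with client-go; this is only a sketch of the shell pipeline above, with the namespace ("default") assumed since the log does not show it. Note that client-go returns Secret.Data already base64-decoded, which is why the shell version needs the explicit base64 --decode step.

package main

import (
	"context"
	"fmt"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Build a client for the management cluster from the local kubeconfig.
	cfg, err := clientcmd.BuildConfigFromFlags("", os.Getenv("KUBECONFIG"))
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Cluster API stores the workload cluster's kubeconfig in a secret named
	// "<cluster-name>-kubeconfig" under the key "value" (as in the jq filter above).
	sec, err := cs.CoreV1().Secrets("default").Get(context.TODO(), "capz-d35khr-kubeconfig", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Secret.Data is already base64-decoded by client-go.
	fmt.Print(string(sec.Data["value"]))
}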
CCM cluster CIDR: 192.168.0.0/16
Waiting for 1 control plane machine(s), 0 worker machine(s), and 2 windows machine(s) to become Ready
node/capz-d35k-8w5hq condition met
... skipping 1031 lines ...
          image: "mcr.microsoft.com/oss/kubernetes-csi/csi-resizer:v1.6.0"
          args:
            - "-csi-address=$(ADDRESS)"
            - "-v=2"
            - "-leader-election"
            - "--leader-election-namespace=kube-system"
            - '-handle-volume-inuse-error=false'
            - '-timeout=120s'
            - '-feature-gates=RecoverVolumeExpansionFailure=true'
          env:
            - name: ADDRESS
              value: /csi/csi.sock
          imagePullPolicy: IfNotPresent
... skipping 190 lines ...
Jan  4 09:06:07.943: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-7wjn2] to have phase Bound
Jan  4 09:06:07.975: INFO: PersistentVolumeClaim pvc-7wjn2 found and phase=Bound (32.152145ms)
STEP: checking the PVC 01/04/23 09:06:07.975
STEP: validating provisioned PV 01/04/23 09:06:08.006
STEP: checking the PV 01/04/23 09:06:08.038
STEP: deploying the pod 01/04/23 09:06:08.038
STEP: checking that the pods command exits with an error 01/04/23 09:06:08.077
Jan  4 09:06:08.077: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-dv66l" in namespace "azurefile-8081" to be "Error status code"
Jan  4 09:06:08.109: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 32.358441ms
Jan  4 09:06:10.141: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064539799s
Jan  4 09:06:12.142: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 4.064821117s
Jan  4 09:06:14.143: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065805865s
Jan  4 09:06:16.143: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 8.065743151s
Jan  4 09:06:18.142: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 10.064641126s
... skipping 191 lines ...
Jan  4 09:12:42.145: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 6m34.067646308s
Jan  4 09:12:44.145: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m36.067954642s
Jan  4 09:12:46.144: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m38.067054516s
Jan  4 09:12:48.143: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m40.066347758s
Jan  4 09:12:50.217: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m42.139689677s
Jan  4 09:12:52.145: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=false. Elapsed: 6m44.068207799s
Jan  4 09:12:54.144: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Failed", Reason="", readiness=false. Elapsed: 6m46.06665329s
STEP: Saw pod failure 01/04/23 09:12:54.144
Jan  4 09:12:54.144: INFO: Pod "azurefile-volume-tester-dv66l" satisfied condition "Error status code"
STEP: checking that pod logs contain expected message 01/04/23 09:12:54.144
Jan  4 09:12:54.204: INFO: deleting Pod "azurefile-8081"/"azurefile-volume-tester-dv66l"
Jan  4 09:12:54.238: INFO: Pod azurefile-volume-tester-dv66l has the following logs: Out-File : Access to the path 'C:\mnt\test-1\data.txt' is denied.
At line:1 char:22
+ ... cho 'hello world' | Out-File -FilePath C:\mnt\test-1\data.txt; Get-Co ...
+                         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
... skipping 73 lines ...
    Jan  4 09:06:07.943: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-7wjn2] to have phase Bound
    Jan  4 09:06:07.975: INFO: PersistentVolumeClaim pvc-7wjn2 found and phase=Bound (32.152145ms)
    STEP: checking the PVC 01/04/23 09:06:07.975
    STEP: validating provisioned PV 01/04/23 09:06:08.006
    STEP: checking the PV 01/04/23 09:06:08.038
    STEP: deploying the pod 01/04/23 09:06:08.038
    STEP: checking that the pods command exits with an error 01/04/23 09:06:08.077
    Jan  4 09:06:08.077: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-dv66l" in namespace "azurefile-8081" to be "Error status code"
    Jan  4 09:06:08.109: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 32.358441ms
    Jan  4 09:06:10.141: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064539799s
    Jan  4 09:06:12.142: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 4.064821117s
    Jan  4 09:06:14.143: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065805865s
    Jan  4 09:06:16.143: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 8.065743151s
    Jan  4 09:06:18.142: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 10.064641126s
... skipping 191 lines ...
    Jan  4 09:12:42.145: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Pending", Reason="", readiness=false. Elapsed: 6m34.067646308s
    Jan  4 09:12:44.145: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m36.067954642s
    Jan  4 09:12:46.144: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m38.067054516s
    Jan  4 09:12:48.143: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m40.066347758s
    Jan  4 09:12:50.217: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=true. Elapsed: 6m42.139689677s
    Jan  4 09:12:52.145: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Running", Reason="", readiness=false. Elapsed: 6m44.068207799s
    Jan  4 09:12:54.144: INFO: Pod "azurefile-volume-tester-dv66l": Phase="Failed", Reason="", readiness=false. Elapsed: 6m46.06665329s
    STEP: Saw pod failure 01/04/23 09:12:54.144
    Jan  4 09:12:54.144: INFO: Pod "azurefile-volume-tester-dv66l" satisfied condition "Error status code"
    STEP: checking that pod logs contain expected message 01/04/23 09:12:54.144
    Jan  4 09:12:54.204: INFO: deleting Pod "azurefile-8081"/"azurefile-volume-tester-dv66l"
    Jan  4 09:12:54.238: INFO: Pod azurefile-volume-tester-dv66l has the following logs: Out-File : Access to the path 'C:\mnt\test-1\data.txt' is denied.
    At line:1 char:22
    + ... cho 'hello world' | Out-File -FilePath C:\mnt\test-1\data.txt; Get-Co ...
    +                         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
... skipping 49 lines ...
Jan  4 09:12:56.789: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-twrjx] to have phase Bound
Jan  4 09:12:56.823: INFO: PersistentVolumeClaim pvc-twrjx found and phase=Bound (33.847474ms)
STEP: checking the PVC 01/04/23 09:12:56.823
STEP: validating provisioned PV 01/04/23 09:12:56.854
STEP: checking the PV 01/04/23 09:12:56.885
STEP: deploying the pod 01/04/23 09:12:56.885
STEP: checking that the pods command exits with no error 01/04/23 09:12:56.92
Jan  4 09:12:56.920: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-qszbw" in namespace "azurefile-2540" to be "Succeeded or Failed"
Jan  4 09:12:56.952: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 31.728549ms
Jan  4 09:12:58.985: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064704327s
Jan  4 09:13:00.984: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 4.063731449s
Jan  4 09:13:02.986: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065719391s
Jan  4 09:13:04.984: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 8.064321914s
Jan  4 09:13:06.986: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 10.065507s
... skipping 182 lines ...
Jan  4 09:19:12.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=true. Elapsed: 6m16.066572764s
Jan  4 09:19:14.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=true. Elapsed: 6m18.067124732s
Jan  4 09:19:16.986: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=true. Elapsed: 6m20.066224309s
Jan  4 09:19:18.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=false. Elapsed: 6m22.067030288s
Jan  4 09:19:20.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Succeeded", Reason="", readiness=false. Elapsed: 6m24.066563136s
STEP: Saw pod success 01/04/23 09:19:20.987
Jan  4 09:19:20.987: INFO: Pod "azurefile-volume-tester-qszbw" satisfied condition "Succeeded or Failed"
STEP: setting up the PV 01/04/23 09:19:20.987
STEP: creating a PV 01/04/23 09:19:20.987
STEP: setting up the PVC 01/04/23 09:19:21.023
STEP: creating a PVC 01/04/23 09:19:21.023
STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:19:21.06
Jan  4 09:19:21.060: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-lw84c] to have phase Bound
Jan  4 09:19:21.092: INFO: PersistentVolumeClaim pvc-lw84c found but phase is Pending instead of Bound.
Jan  4 09:19:23.124: INFO: PersistentVolumeClaim pvc-lw84c found and phase=Bound (2.064089201s)
STEP: checking the PVC 01/04/23 09:19:23.124
STEP: validating provisioned PV 01/04/23 09:19:23.155
STEP: checking the PV 01/04/23 09:19:23.187
STEP: deploying the pod 01/04/23 09:19:23.187
STEP: checking that the pods command exits with no error 01/04/23 09:19:23.229
Jan  4 09:19:23.229: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-sqkv6" in namespace "azurefile-2540" to be "Succeeded or Failed"
Jan  4 09:19:23.259: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 30.464254ms
Jan  4 09:19:25.292: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 2.063479412s
Jan  4 09:19:27.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065369471s
Jan  4 09:19:29.295: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 6.066009436s
Jan  4 09:19:31.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=true. Elapsed: 8.065184586s
Jan  4 09:19:33.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=true. Elapsed: 10.065402081s
Jan  4 09:19:35.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=true. Elapsed: 12.065454042s
Jan  4 09:19:37.296: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=false. Elapsed: 14.066668452s
Jan  4 09:19:39.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.065079212s
STEP: Saw pod success 01/04/23 09:19:39.294
Jan  4 09:19:39.294: INFO: Pod "azurefile-volume-tester-sqkv6" satisfied condition "Succeeded or Failed"
STEP: setting up the PV 01/04/23 09:19:39.294
STEP: creating a PV 01/04/23 09:19:39.294
STEP: setting up the PVC 01/04/23 09:19:39.335
STEP: creating a PVC 01/04/23 09:19:39.335
STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:19:39.369
Jan  4 09:19:39.369: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-n5wfr] to have phase Bound
Jan  4 09:19:39.417: INFO: PersistentVolumeClaim pvc-n5wfr found but phase is Pending instead of Bound.
Jan  4 09:19:41.450: INFO: PersistentVolumeClaim pvc-n5wfr found and phase=Bound (2.080673969s)
STEP: checking the PVC 01/04/23 09:19:41.45
STEP: validating provisioned PV 01/04/23 09:19:41.481
STEP: checking the PV 01/04/23 09:19:41.513
STEP: deploying the pod 01/04/23 09:19:41.513
STEP: checking that the pods command exits with no error 01/04/23 09:19:41.551
Jan  4 09:19:41.551: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-8fls2" in namespace "azurefile-2540" to be "Succeeded or Failed"
Jan  4 09:19:41.582: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 30.592759ms
Jan  4 09:19:43.615: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064101548s
Jan  4 09:19:45.616: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 4.064916853s
Jan  4 09:19:47.617: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065386502s
Jan  4 09:19:49.617: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=true. Elapsed: 8.065562795s
Jan  4 09:19:51.616: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=true. Elapsed: 10.064677172s
Jan  4 09:19:53.618: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=true. Elapsed: 12.066358861s
Jan  4 09:19:55.615: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=false. Elapsed: 14.06411045s
Jan  4 09:19:57.616: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.065020585s
STEP: Saw pod success 01/04/23 09:19:57.616
Jan  4 09:19:57.617: INFO: Pod "azurefile-volume-tester-8fls2" satisfied condition "Succeeded or Failed"
STEP: setting up the PV 01/04/23 09:19:57.617
STEP: creating a PV 01/04/23 09:19:57.617
STEP: setting up the PVC 01/04/23 09:19:57.652
STEP: creating a PVC 01/04/23 09:19:57.652
STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:19:57.696
Jan  4 09:19:57.696: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-q5c28] to have phase Bound
Jan  4 09:19:57.728: INFO: PersistentVolumeClaim pvc-q5c28 found and phase=Bound (32.30484ms)
STEP: checking the PVC 01/04/23 09:19:57.728
STEP: validating provisioned PV 01/04/23 09:19:57.759
STEP: checking the PV 01/04/23 09:19:57.79
STEP: deploying the pod 01/04/23 09:19:57.79
STEP: checking that the pods command exits with no error 01/04/23 09:19:57.825
Jan  4 09:19:57.825: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-9q868" in namespace "azurefile-2540" to be "Succeeded or Failed"
Jan  4 09:19:57.856: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 30.630786ms
Jan  4 09:19:59.887: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 2.0619848s
Jan  4 09:20:01.891: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065654892s
Jan  4 09:20:03.893: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 6.067818104s
Jan  4 09:20:05.890: INFO: Pod "azurefile-volume-tester-9q868": Phase="Running", Reason="", readiness=true. Elapsed: 8.064604562s
Jan  4 09:20:07.891: INFO: Pod "azurefile-volume-tester-9q868": Phase="Running", Reason="", readiness=true. Elapsed: 10.066006452s
Jan  4 09:20:09.890: INFO: Pod "azurefile-volume-tester-9q868": Phase="Running", Reason="", readiness=false. Elapsed: 12.065251688s
Jan  4 09:20:11.890: INFO: Pod "azurefile-volume-tester-9q868": Phase="Succeeded", Reason="", readiness=false. Elapsed: 14.064689553s
STEP: Saw pod success 01/04/23 09:20:11.89
Jan  4 09:20:11.890: INFO: Pod "azurefile-volume-tester-9q868" satisfied condition "Succeeded or Failed"
STEP: setting up the PV 01/04/23 09:20:11.89
STEP: creating a PV 01/04/23 09:20:11.89
STEP: setting up the PVC 01/04/23 09:20:11.926
STEP: creating a PVC 01/04/23 09:20:11.927
STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:20:11.963
Jan  4 09:20:11.963: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-vv48b] to have phase Bound
Jan  4 09:20:11.995: INFO: PersistentVolumeClaim pvc-vv48b found but phase is Pending instead of Bound.
Jan  4 09:20:14.029: INFO: PersistentVolumeClaim pvc-vv48b found and phase=Bound (2.065282776s)
STEP: checking the PVC 01/04/23 09:20:14.029
STEP: validating provisioned PV 01/04/23 09:20:14.06
STEP: checking the PV 01/04/23 09:20:14.092
STEP: deploying the pod 01/04/23 09:20:14.092
STEP: checking that the pods command exits with no error 01/04/23 09:20:14.126
Jan  4 09:20:14.126: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-qcmxd" in namespace "azurefile-2540" to be "Succeeded or Failed"
Jan  4 09:20:14.157: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 30.824581ms
Jan  4 09:20:16.189: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062755985s
Jan  4 09:20:18.193: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.066248598s
Jan  4 09:20:20.192: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065365702s
Jan  4 09:20:22.192: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=true. Elapsed: 8.065403893s
Jan  4 09:20:24.193: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=true. Elapsed: 10.066674616s
Jan  4 09:20:26.194: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=true. Elapsed: 12.067833616s
Jan  4 09:20:28.191: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=false. Elapsed: 14.06457357s
Jan  4 09:20:30.191: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.064286038s
STEP: Saw pod success 01/04/23 09:20:30.191
Jan  4 09:20:30.191: INFO: Pod "azurefile-volume-tester-qcmxd" satisfied condition "Succeeded or Failed"
STEP: setting up the PV 01/04/23 09:20:30.191
STEP: creating a PV 01/04/23 09:20:30.191
STEP: setting up the PVC 01/04/23 09:20:30.228
STEP: creating a PVC 01/04/23 09:20:30.228
STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:20:30.265
Jan  4 09:20:30.265: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-5skb7] to have phase Bound
Jan  4 09:20:30.295: INFO: PersistentVolumeClaim pvc-5skb7 found and phase=Bound (30.707234ms)
STEP: checking the PVC 01/04/23 09:20:30.295
STEP: validating provisioned PV 01/04/23 09:20:30.326
STEP: checking the PV 01/04/23 09:20:30.358
STEP: deploying the pod 01/04/23 09:20:30.358
STEP: checking that the pods command exits with no error 01/04/23 09:20:30.395
Jan  4 09:20:30.395: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-tt9c4" in namespace "azurefile-2540" to be "Succeeded or Failed"
Jan  4 09:20:30.426: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 30.388434ms
Jan  4 09:20:32.458: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062432011s
Jan  4 09:20:34.461: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065737369s
Jan  4 09:20:36.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 6.064789135s
Jan  4 09:20:38.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Running", Reason="", readiness=true. Elapsed: 8.064517803s
Jan  4 09:20:40.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Running", Reason="", readiness=true. Elapsed: 10.064213722s
Jan  4 09:20:42.461: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Running", Reason="", readiness=false. Elapsed: 12.065608415s
Jan  4 09:20:44.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Succeeded", Reason="", readiness=false. Elapsed: 14.064600699s
STEP: Saw pod success 01/04/23 09:20:44.46
Jan  4 09:20:44.460: INFO: Pod "azurefile-volume-tester-tt9c4" satisfied condition "Succeeded or Failed"
Jan  4 09:20:44.460: INFO: deleting Pod "azurefile-2540"/"azurefile-volume-tester-tt9c4"
Jan  4 09:20:44.513: INFO: Pod azurefile-volume-tester-tt9c4 has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-tt9c4 in namespace azurefile-2540 01/04/23 09:20:44.513
Jan  4 09:20:44.555: INFO: deleting PVC "azurefile-2540"/"pvc-5skb7"
Jan  4 09:20:44.555: INFO: Deleting PersistentVolumeClaim "pvc-5skb7"
... skipping 94 lines ...
    Jan  4 09:12:56.789: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-twrjx] to have phase Bound
    Jan  4 09:12:56.823: INFO: PersistentVolumeClaim pvc-twrjx found and phase=Bound (33.847474ms)
    STEP: checking the PVC 01/04/23 09:12:56.823
    STEP: validating provisioned PV 01/04/23 09:12:56.854
    STEP: checking the PV 01/04/23 09:12:56.885
    STEP: deploying the pod 01/04/23 09:12:56.885
    STEP: checking that the pods command exits with no error 01/04/23 09:12:56.92
    Jan  4 09:12:56.920: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-qszbw" in namespace "azurefile-2540" to be "Succeeded or Failed"
    Jan  4 09:12:56.952: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 31.728549ms
    Jan  4 09:12:58.985: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064704327s
    Jan  4 09:13:00.984: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 4.063731449s
    Jan  4 09:13:02.986: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065719391s
    Jan  4 09:13:04.984: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 8.064321914s
    Jan  4 09:13:06.986: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Pending", Reason="", readiness=false. Elapsed: 10.065507s
... skipping 182 lines ...
    Jan  4 09:19:12.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=true. Elapsed: 6m16.066572764s
    Jan  4 09:19:14.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=true. Elapsed: 6m18.067124732s
    Jan  4 09:19:16.986: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=true. Elapsed: 6m20.066224309s
    Jan  4 09:19:18.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Running", Reason="", readiness=false. Elapsed: 6m22.067030288s
    Jan  4 09:19:20.987: INFO: Pod "azurefile-volume-tester-qszbw": Phase="Succeeded", Reason="", readiness=false. Elapsed: 6m24.066563136s
    STEP: Saw pod success 01/04/23 09:19:20.987
    Jan  4 09:19:20.987: INFO: Pod "azurefile-volume-tester-qszbw" satisfied condition "Succeeded or Failed"
    STEP: setting up the PV 01/04/23 09:19:20.987
    STEP: creating a PV 01/04/23 09:19:20.987
    STEP: setting up the PVC 01/04/23 09:19:21.023
    STEP: creating a PVC 01/04/23 09:19:21.023
    STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:19:21.06
    Jan  4 09:19:21.060: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-lw84c] to have phase Bound
    Jan  4 09:19:21.092: INFO: PersistentVolumeClaim pvc-lw84c found but phase is Pending instead of Bound.
    Jan  4 09:19:23.124: INFO: PersistentVolumeClaim pvc-lw84c found and phase=Bound (2.064089201s)
    STEP: checking the PVC 01/04/23 09:19:23.124
    STEP: validating provisioned PV 01/04/23 09:19:23.155
    STEP: checking the PV 01/04/23 09:19:23.187
    STEP: deploying the pod 01/04/23 09:19:23.187
    STEP: checking that the pods command exits with no error 01/04/23 09:19:23.229
    Jan  4 09:19:23.229: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-sqkv6" in namespace "azurefile-2540" to be "Succeeded or Failed"
    Jan  4 09:19:23.259: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 30.464254ms
    Jan  4 09:19:25.292: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 2.063479412s
    Jan  4 09:19:27.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065369471s
    Jan  4 09:19:29.295: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Pending", Reason="", readiness=false. Elapsed: 6.066009436s
    Jan  4 09:19:31.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=true. Elapsed: 8.065184586s
    Jan  4 09:19:33.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=true. Elapsed: 10.065402081s
    Jan  4 09:19:35.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=true. Elapsed: 12.065454042s
    Jan  4 09:19:37.296: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Running", Reason="", readiness=false. Elapsed: 14.066668452s
    Jan  4 09:19:39.294: INFO: Pod "azurefile-volume-tester-sqkv6": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.065079212s
    STEP: Saw pod success 01/04/23 09:19:39.294
    Jan  4 09:19:39.294: INFO: Pod "azurefile-volume-tester-sqkv6" satisfied condition "Succeeded or Failed"
    STEP: setting up the PV 01/04/23 09:19:39.294
    STEP: creating a PV 01/04/23 09:19:39.294
    STEP: setting up the PVC 01/04/23 09:19:39.335
    STEP: creating a PVC 01/04/23 09:19:39.335
    STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:19:39.369
    Jan  4 09:19:39.369: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-n5wfr] to have phase Bound
    Jan  4 09:19:39.417: INFO: PersistentVolumeClaim pvc-n5wfr found but phase is Pending instead of Bound.
    Jan  4 09:19:41.450: INFO: PersistentVolumeClaim pvc-n5wfr found and phase=Bound (2.080673969s)
    STEP: checking the PVC 01/04/23 09:19:41.45
    STEP: validating provisioned PV 01/04/23 09:19:41.481
    STEP: checking the PV 01/04/23 09:19:41.513
    STEP: deploying the pod 01/04/23 09:19:41.513
    STEP: checking that the pods command exits with no error 01/04/23 09:19:41.551
    Jan  4 09:19:41.551: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-8fls2" in namespace "azurefile-2540" to be "Succeeded or Failed"
    Jan  4 09:19:41.582: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 30.592759ms
    Jan  4 09:19:43.615: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064101548s
    Jan  4 09:19:45.616: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 4.064916853s
    Jan  4 09:19:47.617: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065386502s
    Jan  4 09:19:49.617: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=true. Elapsed: 8.065562795s
    Jan  4 09:19:51.616: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=true. Elapsed: 10.064677172s
    Jan  4 09:19:53.618: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=true. Elapsed: 12.066358861s
    Jan  4 09:19:55.615: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Running", Reason="", readiness=false. Elapsed: 14.06411045s
    Jan  4 09:19:57.616: INFO: Pod "azurefile-volume-tester-8fls2": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.065020585s
    STEP: Saw pod success 01/04/23 09:19:57.616
    Jan  4 09:19:57.617: INFO: Pod "azurefile-volume-tester-8fls2" satisfied condition "Succeeded or Failed"
    STEP: setting up the PV 01/04/23 09:19:57.617
    STEP: creating a PV 01/04/23 09:19:57.617
    STEP: setting up the PVC 01/04/23 09:19:57.652
    STEP: creating a PVC 01/04/23 09:19:57.652
    STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:19:57.696
    Jan  4 09:19:57.696: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-q5c28] to have phase Bound
    Jan  4 09:19:57.728: INFO: PersistentVolumeClaim pvc-q5c28 found and phase=Bound (32.30484ms)
    STEP: checking the PVC 01/04/23 09:19:57.728
    STEP: validating provisioned PV 01/04/23 09:19:57.759
    STEP: checking the PV 01/04/23 09:19:57.79
    STEP: deploying the pod 01/04/23 09:19:57.79
    STEP: checking that the pods command exits with no error 01/04/23 09:19:57.825
    Jan  4 09:19:57.825: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-9q868" in namespace "azurefile-2540" to be "Succeeded or Failed"
    Jan  4 09:19:57.856: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 30.630786ms
    Jan  4 09:19:59.887: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 2.0619848s
    Jan  4 09:20:01.891: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065654892s
    Jan  4 09:20:03.893: INFO: Pod "azurefile-volume-tester-9q868": Phase="Pending", Reason="", readiness=false. Elapsed: 6.067818104s
    Jan  4 09:20:05.890: INFO: Pod "azurefile-volume-tester-9q868": Phase="Running", Reason="", readiness=true. Elapsed: 8.064604562s
    Jan  4 09:20:07.891: INFO: Pod "azurefile-volume-tester-9q868": Phase="Running", Reason="", readiness=true. Elapsed: 10.066006452s
    Jan  4 09:20:09.890: INFO: Pod "azurefile-volume-tester-9q868": Phase="Running", Reason="", readiness=false. Elapsed: 12.065251688s
    Jan  4 09:20:11.890: INFO: Pod "azurefile-volume-tester-9q868": Phase="Succeeded", Reason="", readiness=false. Elapsed: 14.064689553s
    STEP: Saw pod success 01/04/23 09:20:11.89
    Jan  4 09:20:11.890: INFO: Pod "azurefile-volume-tester-9q868" satisfied condition "Succeeded or Failed"
    STEP: setting up the PV 01/04/23 09:20:11.89
    STEP: creating a PV 01/04/23 09:20:11.89
    STEP: setting up the PVC 01/04/23 09:20:11.926
    STEP: creating a PVC 01/04/23 09:20:11.927
    STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:20:11.963
    Jan  4 09:20:11.963: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-vv48b] to have phase Bound
    Jan  4 09:20:11.995: INFO: PersistentVolumeClaim pvc-vv48b found but phase is Pending instead of Bound.
    Jan  4 09:20:14.029: INFO: PersistentVolumeClaim pvc-vv48b found and phase=Bound (2.065282776s)
    STEP: checking the PVC 01/04/23 09:20:14.029
    STEP: validating provisioned PV 01/04/23 09:20:14.06
    STEP: checking the PV 01/04/23 09:20:14.092
    STEP: deploying the pod 01/04/23 09:20:14.092
    STEP: checking that the pods command exits with no error 01/04/23 09:20:14.126
    Jan  4 09:20:14.126: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-qcmxd" in namespace "azurefile-2540" to be "Succeeded or Failed"
    Jan  4 09:20:14.157: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 30.824581ms
    Jan  4 09:20:16.189: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062755985s
    Jan  4 09:20:18.193: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.066248598s
    Jan  4 09:20:20.192: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065365702s
    Jan  4 09:20:22.192: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=true. Elapsed: 8.065403893s
    Jan  4 09:20:24.193: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=true. Elapsed: 10.066674616s
    Jan  4 09:20:26.194: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=true. Elapsed: 12.067833616s
    Jan  4 09:20:28.191: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Running", Reason="", readiness=false. Elapsed: 14.06457357s
    Jan  4 09:20:30.191: INFO: Pod "azurefile-volume-tester-qcmxd": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.064286038s
    STEP: Saw pod success 01/04/23 09:20:30.191
    Jan  4 09:20:30.191: INFO: Pod "azurefile-volume-tester-qcmxd" satisfied condition "Succeeded or Failed"
    STEP: setting up the PV 01/04/23 09:20:30.191
    STEP: creating a PV 01/04/23 09:20:30.191
    STEP: setting up the PVC 01/04/23 09:20:30.228
    STEP: creating a PVC 01/04/23 09:20:30.228
    STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:20:30.265
    Jan  4 09:20:30.265: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-5skb7] to have phase Bound
    Jan  4 09:20:30.295: INFO: PersistentVolumeClaim pvc-5skb7 found and phase=Bound (30.707234ms)
    STEP: checking the PVC 01/04/23 09:20:30.295
    STEP: validating provisioned PV 01/04/23 09:20:30.326
    STEP: checking the PV 01/04/23 09:20:30.358
    STEP: deploying the pod 01/04/23 09:20:30.358
    STEP: checking that the pods command exits with no error 01/04/23 09:20:30.395
    Jan  4 09:20:30.395: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-tt9c4" in namespace "azurefile-2540" to be "Succeeded or Failed"
    Jan  4 09:20:30.426: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 30.388434ms
    Jan  4 09:20:32.458: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062432011s
    Jan  4 09:20:34.461: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065737369s
    Jan  4 09:20:36.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Pending", Reason="", readiness=false. Elapsed: 6.064789135s
    Jan  4 09:20:38.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Running", Reason="", readiness=true. Elapsed: 8.064517803s
    Jan  4 09:20:40.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Running", Reason="", readiness=true. Elapsed: 10.064213722s
    Jan  4 09:20:42.461: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Running", Reason="", readiness=false. Elapsed: 12.065608415s
    Jan  4 09:20:44.460: INFO: Pod "azurefile-volume-tester-tt9c4": Phase="Succeeded", Reason="", readiness=false. Elapsed: 14.064600699s
    STEP: Saw pod success 01/04/23 09:20:44.46
    Jan  4 09:20:44.460: INFO: Pod "azurefile-volume-tester-tt9c4" satisfied condition "Succeeded or Failed"
    Jan  4 09:20:44.460: INFO: deleting Pod "azurefile-2540"/"azurefile-volume-tester-tt9c4"
    Jan  4 09:20:44.513: INFO: Pod azurefile-volume-tester-tt9c4 has the following logs: hello world

    STEP: Deleting pod azurefile-volume-tester-tt9c4 in namespace azurefile-2540 01/04/23 09:20:44.513
    Jan  4 09:20:44.555: INFO: deleting PVC "azurefile-2540"/"pvc-5skb7"
    Jan  4 09:20:44.555: INFO: Deleting PersistentVolumeClaim "pvc-5skb7"
... skipping 174 lines ...
Jan  4 09:20:50.110: INFO: PersistentVolumeClaim pvc-6dqhl found but phase is Pending instead of Bound.
Jan  4 09:20:52.143: INFO: PersistentVolumeClaim pvc-6dqhl found and phase=Bound (2.063468795s)
STEP: checking the PVC 01/04/23 09:20:52.143
STEP: validating provisioned PV 01/04/23 09:20:52.174
STEP: checking the PV 01/04/23 09:20:52.205
STEP: deploying the pod 01/04/23 09:20:52.205
STEP: checking that the pods command exits with no error 01/04/23 09:20:52.24
Jan  4 09:20:52.240: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-xphtd" in namespace "azurefile-5466" to be "Succeeded or Failed"
Jan  4 09:20:52.278: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 37.666426ms
Jan  4 09:20:54.310: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 2.069287777s
Jan  4 09:20:56.316: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.075967962s
Jan  4 09:20:58.312: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.072036444s
Jan  4 09:21:00.311: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=true. Elapsed: 8.071038269s
Jan  4 09:21:02.311: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=true. Elapsed: 10.070958846s
Jan  4 09:21:04.312: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=true. Elapsed: 12.071345481s
Jan  4 09:21:06.312: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=false. Elapsed: 14.071781148s
Jan  4 09:21:08.311: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.071144949s
STEP: Saw pod success 01/04/23 09:21:08.311
Jan  4 09:21:08.312: INFO: Pod "azurefile-volume-tester-xphtd" satisfied condition "Succeeded or Failed"
Jan  4 09:21:08.312: INFO: deleting Pod "azurefile-5466"/"azurefile-volume-tester-xphtd"
Jan  4 09:21:08.353: INFO: Pod azurefile-volume-tester-xphtd has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-xphtd in namespace azurefile-5466 01/04/23 09:21:08.353
Jan  4 09:21:08.409: INFO: deleting PVC "azurefile-5466"/"pvc-6dqhl"
Jan  4 09:21:08.409: INFO: Deleting PersistentVolumeClaim "pvc-6dqhl"
Jan  4 09:21:08.455: INFO: Waiting up to 5m0s for PersistentVolumeClaim azurefile-5466 to be removed
Jan  4 09:21:08.486: INFO: Claim "azurefile-5466" in namespace "pvc-6dqhl" doesn't exist in the system
Jan  4 09:21:08.486: INFO: deleting StorageClass azurefile-5466-file.csi.azure.com-pre-provisioned-sc-r7bnx
W0104 09:21:08.644349   38721 azurefile.go:802] DeleteFileShare(pre-provisioned-existing-credentials) on account(f46c311439d5b4b38903e84) failed with error(storage.FileSharesClient#Delete: Failure responding to request: StatusCode=429 -- Original Error: autorest/azure: Service returned an error. Status=429 Code="TooManyRequests" Message="The request is being throttled as the limit has been reached for operation type - Write_ObservationWindow_00:00:01. For more information, see - https://aka.ms/srpthrottlinglimits"), waiting for retrying
W0104 09:21:08.644400   38721 azurefile.go:804] switch to use data plane API instead for account f46c311439d5b4b38903e84 since it's throttled
STEP: dump namespace information after failure 01/04/23 09:21:08.644
STEP: Destroying namespace "azurefile-5466" for this suite. 01/04/23 09:21:08.645
------------------------------
• [FAILED] [19.687 seconds]
Pre-Provisioned [AfterEach]
/home/prow/go/src/sigs.k8s.io/azurefile-csi-driver/test/e2e/pre_provisioning_test.go:67
  should use existing credentials in k8s cluster [file.csi.azure.com] [Windows]
  /home/prow/go/src/sigs.k8s.io/azurefile-csi-driver/test/e2e/pre_provisioning_test.go:186

  Begin Captured GinkgoWriter Output >>
... skipping 15 lines ...
    Jan  4 09:20:50.110: INFO: PersistentVolumeClaim pvc-6dqhl found but phase is Pending instead of Bound.
    Jan  4 09:20:52.143: INFO: PersistentVolumeClaim pvc-6dqhl found and phase=Bound (2.063468795s)
    STEP: checking the PVC 01/04/23 09:20:52.143
    STEP: validating provisioned PV 01/04/23 09:20:52.174
    STEP: checking the PV 01/04/23 09:20:52.205
    STEP: deploying the pod 01/04/23 09:20:52.205
    STEP: checking that the pods command exits with no error 01/04/23 09:20:52.24
    Jan  4 09:20:52.240: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-xphtd" in namespace "azurefile-5466" to be "Succeeded or Failed"
    Jan  4 09:20:52.278: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 37.666426ms
    Jan  4 09:20:54.310: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 2.069287777s
    Jan  4 09:20:56.316: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.075967962s
    Jan  4 09:20:58.312: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.072036444s
    Jan  4 09:21:00.311: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=true. Elapsed: 8.071038269s
    Jan  4 09:21:02.311: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=true. Elapsed: 10.070958846s
    Jan  4 09:21:04.312: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=true. Elapsed: 12.071345481s
    Jan  4 09:21:06.312: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Running", Reason="", readiness=false. Elapsed: 14.071781148s
    Jan  4 09:21:08.311: INFO: Pod "azurefile-volume-tester-xphtd": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.071144949s
    STEP: Saw pod success 01/04/23 09:21:08.311
    Jan  4 09:21:08.312: INFO: Pod "azurefile-volume-tester-xphtd" satisfied condition "Succeeded or Failed"
    Jan  4 09:21:08.312: INFO: deleting Pod "azurefile-5466"/"azurefile-volume-tester-xphtd"
    Jan  4 09:21:08.353: INFO: Pod azurefile-volume-tester-xphtd has the following logs: hello world

    STEP: Deleting pod azurefile-volume-tester-xphtd in namespace azurefile-5466 01/04/23 09:21:08.353
    Jan  4 09:21:08.409: INFO: deleting PVC "azurefile-5466"/"pvc-6dqhl"
    Jan  4 09:21:08.409: INFO: Deleting PersistentVolumeClaim "pvc-6dqhl"
    Jan  4 09:21:08.455: INFO: Waiting up to 5m0s for PersistentVolumeClaim azurefile-5466 to be removed
    Jan  4 09:21:08.486: INFO: Claim "azurefile-5466" in namespace "pvc-6dqhl" doesn't exist in the system
    Jan  4 09:21:08.486: INFO: deleting StorageClass azurefile-5466-file.csi.azure.com-pre-provisioned-sc-r7bnx
    W0104 09:21:08.644349   38721 azurefile.go:802] DeleteFileShare(pre-provisioned-existing-credentials) on account(f46c311439d5b4b38903e84) failed with error(storage.FileSharesClient#Delete: Failure responding to request: StatusCode=429 -- Original Error: autorest/azure: Service returned an error. Status=429 Code="TooManyRequests" Message="The request is being throttled as the limit has been reached for operation type - Write_ObservationWindow_00:00:01. For more information, see - https://aka.ms/srpthrottlinglimits"), waiting for retrying
    W0104 09:21:08.644400   38721 azurefile.go:804] switch to use data plane API instead for account f46c311439d5b4b38903e84 since it's throttled
    STEP: dump namespace information after failure 01/04/23 09:21:08.644
    STEP: Destroying namespace "azurefile-5466" for this suite. 01/04/23 09:21:08.645
  << End Captured GinkgoWriter Output

  create volume "capz-d35khr#f46c311439d5b4b38903e84#pre-provisioned-existing-credentials##pre-provisioned-existing-credentials#azurefile-5466" error: rpc error: code = Internal desc = DeleteFileShare pre-provisioned-existing-credentials under account(f46c311439d5b4b38903e84) rg(capz-d35khr) failed with error: storage.FileSharesClient#Delete: Failure responding to request: StatusCode=429 -- Original Error: autorest/azure: Service returned an error. Status=429 Code="TooManyRequests" Message="The request is being throttled as the limit has been reached for operation type - Write_ObservationWindow_00:00:01. For more information, see - https://aka.ms/srpthrottlinglimits"
  In [AfterEach] at: /home/prow/go/src/sigs.k8s.io/azurefile-csi-driver/test/e2e/pre_provisioning_test.go:74

  There were additional failures detected after the initial failure:
    [PANICKED]
    Test Panicked
    In [DeferCleanup (Each)] at: /usr/local/go/src/runtime/panic.go:260

    runtime error: invalid memory address or nil pointer dereference

    Full Stack Trace
      k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo.func1()
      	/home/prow/go/src/sigs.k8s.io/azurefile-csi-driver/vendor/k8s.io/kubernetes/test/e2e/framework/framework.go:274 +0x5c
      k8s.io/kubernetes/test/e2e/framework.(*Framework).dumpNamespaceInfo(0xc0005da0f0)
      	/home/prow/go/src/sigs.k8s.io/azurefile-csi-driver/vendor/k8s.io/kubernetes/test/e2e/framework/framework.go:271 +0x179
... skipping 26 lines ...
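The failure above is an ARM management-plane 429 on DeleteFileShare, after which the driver logs that it will retry and switch that storage account to the data-plane API. The sketch below only illustrates that fallback shape; viaARM and viaDataPlane are hypothetical stand-ins, not the driver's actual functions, and the string check mirrors the "TooManyRequests" code visible in the warning.

package main

import (
	"context"
	"errors"
	"fmt"
	"strings"
)

// deleteFileShare tries the ARM (management-plane) delete first and, when the
// account is throttled, falls back to the data-plane file API, which has a
// separate request quota. Both delete functions are injected so the sketch
// stays self-contained.
func deleteFileShare(ctx context.Context, account, share string,
	viaARM, viaDataPlane func(ctx context.Context, account, share string) error) error {
	err := viaARM(ctx, account, share)
	if err == nil {
		return nil
	}
	if strings.Contains(err.Error(), "TooManyRequests") {
		fmt.Printf("account %s throttled on management plane, retrying %s via data-plane API\n", account, share)
		return viaDataPlane(ctx, account, share)
	}
	return err
}

func main() {
	throttled := func(ctx context.Context, account, share string) error {
		return errors.New(`Status=429 Code="TooManyRequests"`)
	}
	ok := func(ctx context.Context, account, share string) error { return nil }
	// Simulate a throttled ARM call followed by a successful data-plane delete.
	if err := deleteFileShare(context.Background(), "f46c311439d5b4b38903e84",
		"pre-provisioned-existing-credentials", throttled, ok); err != nil {
		fmt.Println("delete failed:", err)
	}
}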
Jan  4 09:21:09.877: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-jccsl] to have phase Bound
Jan  4 09:21:09.907: INFO: PersistentVolumeClaim pvc-jccsl found and phase=Bound (30.530954ms)
STEP: checking the PVC 01/04/23 09:21:09.907
STEP: validating provisioned PV 01/04/23 09:21:09.938
STEP: checking the PV 01/04/23 09:21:09.969
STEP: deploying the pod 01/04/23 09:21:09.969
STEP: checking that the pods command exits with no error 01/04/23 09:21:10.004
Jan  4 09:21:10.005: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-5kxs6" in namespace "azurefile-2790" to be "Succeeded or Failed"
Jan  4 09:21:10.036: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 31.782498ms
Jan  4 09:21:12.069: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064135133s
Jan  4 09:21:14.076: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 4.071646885s
Jan  4 09:21:16.072: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 6.067243298s
Jan  4 09:21:18.070: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=true. Elapsed: 8.065183301s
Jan  4 09:21:20.071: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=true. Elapsed: 10.066227197s
Jan  4 09:21:22.072: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=true. Elapsed: 12.067049823s
Jan  4 09:21:24.072: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=false. Elapsed: 14.067028374s
Jan  4 09:21:26.071: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.066260669s
STEP: Saw pod success 01/04/23 09:21:26.071
Jan  4 09:21:26.071: INFO: Pod "azurefile-volume-tester-5kxs6" satisfied condition "Succeeded or Failed"
Jan  4 09:21:26.071: INFO: deleting Pod "azurefile-2790"/"azurefile-volume-tester-5kxs6"
Jan  4 09:21:26.111: INFO: Pod azurefile-volume-tester-5kxs6 has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-5kxs6 in namespace azurefile-2790 01/04/23 09:21:26.111
Jan  4 09:21:26.158: INFO: deleting PVC "azurefile-2790"/"pvc-jccsl"
Jan  4 09:21:26.158: INFO: Deleting PersistentVolumeClaim "pvc-jccsl"
... skipping 31 lines ...
    Jan  4 09:21:09.877: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-jccsl] to have phase Bound
    Jan  4 09:21:09.907: INFO: PersistentVolumeClaim pvc-jccsl found and phase=Bound (30.530954ms)
    STEP: checking the PVC 01/04/23 09:21:09.907
    STEP: validating provisioned PV 01/04/23 09:21:09.938
    STEP: checking the PV 01/04/23 09:21:09.969
    STEP: deploying the pod 01/04/23 09:21:09.969
    STEP: checking that the pods command exits with no error 01/04/23 09:21:10.004
    Jan  4 09:21:10.005: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-5kxs6" in namespace "azurefile-2790" to be "Succeeded or Failed"
    Jan  4 09:21:10.036: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 31.782498ms
    Jan  4 09:21:12.069: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064135133s
    Jan  4 09:21:14.076: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 4.071646885s
    Jan  4 09:21:16.072: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Pending", Reason="", readiness=false. Elapsed: 6.067243298s
    Jan  4 09:21:18.070: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=true. Elapsed: 8.065183301s
    Jan  4 09:21:20.071: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=true. Elapsed: 10.066227197s
    Jan  4 09:21:22.072: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=true. Elapsed: 12.067049823s
    Jan  4 09:21:24.072: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Running", Reason="", readiness=false. Elapsed: 14.067028374s
    Jan  4 09:21:26.071: INFO: Pod "azurefile-volume-tester-5kxs6": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.066260669s
    STEP: Saw pod success 01/04/23 09:21:26.071
    Jan  4 09:21:26.071: INFO: Pod "azurefile-volume-tester-5kxs6" satisfied condition "Succeeded or Failed"
    Jan  4 09:21:26.071: INFO: deleting Pod "azurefile-2790"/"azurefile-volume-tester-5kxs6"
    Jan  4 09:21:26.111: INFO: Pod azurefile-volume-tester-5kxs6 has the following logs: hello world

    STEP: Deleting pod azurefile-volume-tester-5kxs6 in namespace azurefile-2790 01/04/23 09:21:26.111
    Jan  4 09:21:26.158: INFO: deleting PVC "azurefile-2790"/"pvc-jccsl"
    Jan  4 09:21:26.158: INFO: Deleting PersistentVolumeClaim "pvc-jccsl"
... skipping 167 lines ...
Jan  4 09:23:31.644: INFO: PersistentVolumeClaim pvc-c2tpj found but phase is Pending instead of Bound.
Jan  4 09:23:33.676: INFO: PersistentVolumeClaim pvc-c2tpj found and phase=Bound (1m39.606521107s)
STEP: checking the PVC 01/04/23 09:23:33.677
STEP: validating provisioned PV 01/04/23 09:23:33.708
STEP: checking the PV 01/04/23 09:23:33.739
STEP: deploying the pod 01/04/23 09:23:33.74
STEP: checking that the pods command exits with no error 01/04/23 09:23:33.777
Jan  4 09:23:33.777: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-qbr2r" in namespace "azurefile-5194" to be "Succeeded or Failed"
Jan  4 09:23:33.812: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 35.395058ms
Jan  4 09:23:35.844: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 2.066798774s
Jan  4 09:23:37.845: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 4.06801435s
Jan  4 09:23:39.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 6.069636637s
Jan  4 09:23:41.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 8.069677888s
Jan  4 09:23:43.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 10.070010636s
Jan  4 09:23:45.846: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=true. Elapsed: 12.069002445s
Jan  4 09:23:47.846: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=true. Elapsed: 14.06951191s
Jan  4 09:23:49.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=true. Elapsed: 16.070501815s
Jan  4 09:23:51.853: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=false. Elapsed: 18.076567125s
Jan  4 09:23:53.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Succeeded", Reason="", readiness=false. Elapsed: 20.070355937s
STEP: Saw pod success 01/04/23 09:23:53.847
Jan  4 09:23:53.847: INFO: Pod "azurefile-volume-tester-qbr2r" satisfied condition "Succeeded or Failed"
Jan  4 09:23:53.848: INFO: deleting Pod "azurefile-5194"/"azurefile-volume-tester-qbr2r"
Jan  4 09:23:53.896: INFO: Pod azurefile-volume-tester-qbr2r has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-qbr2r in namespace azurefile-5194 01/04/23 09:23:53.896
Jan  4 09:23:53.965: INFO: deleting PVC "azurefile-5194"/"pvc-c2tpj"
Jan  4 09:23:53.965: INFO: Deleting PersistentVolumeClaim "pvc-c2tpj"
... skipping 76 lines ...
    Jan  4 09:23:31.644: INFO: PersistentVolumeClaim pvc-c2tpj found but phase is Pending instead of Bound.
    Jan  4 09:23:33.676: INFO: PersistentVolumeClaim pvc-c2tpj found and phase=Bound (1m39.606521107s)
    STEP: checking the PVC 01/04/23 09:23:33.677
    STEP: validating provisioned PV 01/04/23 09:23:33.708
    STEP: checking the PV 01/04/23 09:23:33.739
    STEP: deploying the pod 01/04/23 09:23:33.74
    STEP: checking that the pods command exits with no error 01/04/23 09:23:33.777
    Jan  4 09:23:33.777: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-qbr2r" in namespace "azurefile-5194" to be "Succeeded or Failed"
    Jan  4 09:23:33.812: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 35.395058ms
    Jan  4 09:23:35.844: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 2.066798774s
    Jan  4 09:23:37.845: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 4.06801435s
    Jan  4 09:23:39.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 6.069636637s
    Jan  4 09:23:41.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 8.069677888s
    Jan  4 09:23:43.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Pending", Reason="", readiness=false. Elapsed: 10.070010636s
    Jan  4 09:23:45.846: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=true. Elapsed: 12.069002445s
    Jan  4 09:23:47.846: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=true. Elapsed: 14.06951191s
    Jan  4 09:23:49.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=true. Elapsed: 16.070501815s
    Jan  4 09:23:51.853: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Running", Reason="", readiness=false. Elapsed: 18.076567125s
    Jan  4 09:23:53.847: INFO: Pod "azurefile-volume-tester-qbr2r": Phase="Succeeded", Reason="", readiness=false. Elapsed: 20.070355937s
    STEP: Saw pod success 01/04/23 09:23:53.847
    Jan  4 09:23:53.847: INFO: Pod "azurefile-volume-tester-qbr2r" satisfied condition "Succeeded or Failed"
    Jan  4 09:23:53.848: INFO: deleting Pod "azurefile-5194"/"azurefile-volume-tester-qbr2r"
    Jan  4 09:23:53.896: INFO: Pod azurefile-volume-tester-qbr2r has the following logs: hello world

    STEP: Deleting pod azurefile-volume-tester-qbr2r in namespace azurefile-5194 01/04/23 09:23:53.896
    Jan  4 09:23:53.965: INFO: deleting PVC "azurefile-5194"/"pvc-c2tpj"
    Jan  4 09:23:53.965: INFO: Deleting PersistentVolumeClaim "pvc-c2tpj"
... skipping 29 lines ...
Jan  4 09:23:59.856: INFO: PersistentVolumeClaim pvc-cjj2r found but phase is Pending instead of Bound.
Jan  4 09:24:01.887: INFO: PersistentVolumeClaim pvc-cjj2r found and phase=Bound (2.061381669s)
STEP: checking the PVC 01/04/23 09:24:01.887
STEP: validating provisioned PV 01/04/23 09:24:01.926
STEP: checking the PV 01/04/23 09:24:01.957
STEP: deploying the pod 01/04/23 09:24:01.957
STEP: checking that the pods command exits with no error 01/04/23 09:24:01.992
Jan  4 09:24:01.992: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-k8ngc" in namespace "azurefile-1353" to be "Succeeded or Failed"
Jan  4 09:24:02.023: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 31.531267ms
Jan  4 09:24:04.057: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 2.065174583s
Jan  4 09:24:06.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 4.066668587s
Jan  4 09:24:08.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065832779s
Jan  4 09:24:10.056: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 8.064788269s
Jan  4 09:24:12.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 10.065818408s
Jan  4 09:24:14.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=true. Elapsed: 12.066463793s
Jan  4 09:24:16.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=true. Elapsed: 14.066773242s
Jan  4 09:24:18.060: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=true. Elapsed: 16.067989289s
Jan  4 09:24:20.057: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=false. Elapsed: 18.065714003s
Jan  4 09:24:22.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Succeeded", Reason="", readiness=false. Elapsed: 20.06654882s
STEP: Saw pod success 01/04/23 09:24:22.058
Jan  4 09:24:22.058: INFO: Pod "azurefile-volume-tester-k8ngc" satisfied condition "Succeeded or Failed"
Jan  4 09:24:22.058: INFO: deleting Pod "azurefile-1353"/"azurefile-volume-tester-k8ngc"
Jan  4 09:24:22.098: INFO: Pod azurefile-volume-tester-k8ngc has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-k8ngc in namespace azurefile-1353 01/04/23 09:24:22.098
Jan  4 09:24:22.146: INFO: deleting PVC "azurefile-1353"/"pvc-cjj2r"
Jan  4 09:24:22.146: INFO: Deleting PersistentVolumeClaim "pvc-cjj2r"
... skipping 28 lines ...
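[Editor's note] The test blocks above all repeat the same pattern: deploy a test pod, then poll it roughly every two seconds for up to 15 minutes until it reaches "Succeeded" (or "Failed"). Below is a minimal sketch of that polling loop using client-go; it is an illustration of the pattern, not the suite's own helper, and the kubeconfig path, namespace, and pod name are placeholders taken from the log.

```go
// Sketch: poll a pod until it terminates, as the log lines above do every ~2s.
// Kubeconfig path, namespace, and pod name are placeholders from the log.
package main

import (
	"context"
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/apimachinery/pkg/util/wait"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "./kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	ns, name := "azurefile-1353", "azurefile-volume-tester-k8ngc" // from the log above
	start := time.Now()
	err = wait.PollImmediate(2*time.Second, 15*time.Minute, func() (bool, error) {
		pod, err := cs.CoreV1().Pods(ns).Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			return false, err
		}
		fmt.Printf("Pod %q: Phase=%q. Elapsed: %v\n", name, pod.Status.Phase, time.Since(start))
		// Stop once the pod has terminated, one way or the other.
		return pod.Status.Phase == corev1.PodSucceeded || pod.Status.Phase == corev1.PodFailed, nil
	})
	if err != nil {
		panic(err)
	}
}
```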
    Jan  4 09:23:59.856: INFO: PersistentVolumeClaim pvc-cjj2r found but phase is Pending instead of Bound.
    Jan  4 09:24:01.887: INFO: PersistentVolumeClaim pvc-cjj2r found and phase=Bound (2.061381669s)
    STEP: checking the PVC 01/04/23 09:24:01.887
    STEP: validating provisioned PV 01/04/23 09:24:01.926
    STEP: checking the PV 01/04/23 09:24:01.957
    STEP: deploying the pod 01/04/23 09:24:01.957
    STEP: checking that the pods command exits with no error 01/04/23 09:24:01.992
    Jan  4 09:24:01.992: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-k8ngc" in namespace "azurefile-1353" to be "Succeeded or Failed"
    Jan  4 09:24:02.023: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 31.531267ms
    Jan  4 09:24:04.057: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 2.065174583s
    Jan  4 09:24:06.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 4.066668587s
    Jan  4 09:24:08.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065832779s
    Jan  4 09:24:10.056: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 8.064788269s
    Jan  4 09:24:12.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Pending", Reason="", readiness=false. Elapsed: 10.065818408s
    Jan  4 09:24:14.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=true. Elapsed: 12.066463793s
    Jan  4 09:24:16.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=true. Elapsed: 14.066773242s
    Jan  4 09:24:18.060: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=true. Elapsed: 16.067989289s
    Jan  4 09:24:20.057: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Running", Reason="", readiness=false. Elapsed: 18.065714003s
    Jan  4 09:24:22.058: INFO: Pod "azurefile-volume-tester-k8ngc": Phase="Succeeded", Reason="", readiness=false. Elapsed: 20.06654882s
    STEP: Saw pod success 01/04/23 09:24:22.058
    Jan  4 09:24:22.058: INFO: Pod "azurefile-volume-tester-k8ngc" satisfied condition "Succeeded or Failed"
    Jan  4 09:24:22.058: INFO: deleting Pod "azurefile-1353"/"azurefile-volume-tester-k8ngc"
    Jan  4 09:24:22.098: INFO: Pod azurefile-volume-tester-k8ngc has the following logs: hello world

    STEP: Deleting pod azurefile-volume-tester-k8ngc in namespace azurefile-1353 01/04/23 09:24:22.098
    Jan  4 09:24:22.146: INFO: deleting PVC "azurefile-1353"/"pvc-cjj2r"
    Jan  4 09:24:22.146: INFO: Deleting PersistentVolumeClaim "pvc-cjj2r"
... skipping 209 lines ...
Jan  4 09:25:40.480: INFO: PersistentVolumeClaim pvc-nrkg5 found but phase is Pending instead of Bound.
Jan  4 09:25:42.512: INFO: PersistentVolumeClaim pvc-nrkg5 found and phase=Bound (40.675514395s)
STEP: checking the PVC 01/04/23 09:25:42.512
STEP: validating provisioned PV 01/04/23 09:25:42.544
STEP: checking the PV 01/04/23 09:25:42.575
STEP: deploying the pod 01/04/23 09:25:42.575
STEP: checking that the pods command exits with an error 01/04/23 09:25:42.612
Jan  4 09:25:42.612: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-dtxml" in namespace "azurefile-156" to be "Error status code"
Jan  4 09:25:42.646: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Pending", Reason="", readiness=false. Elapsed: 34.043914ms
Jan  4 09:25:44.678: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Pending", Reason="", readiness=false. Elapsed: 2.065849242s
Jan  4 09:25:46.680: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Pending", Reason="", readiness=false. Elapsed: 4.06738914s
Jan  4 09:25:48.680: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Pending", Reason="", readiness=false. Elapsed: 6.067586698s
Jan  4 09:25:50.681: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Running", Reason="", readiness=true. Elapsed: 8.068814157s
Jan  4 09:25:52.680: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Running", Reason="", readiness=true. Elapsed: 10.067819106s
Jan  4 09:25:54.682: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Running", Reason="", readiness=true. Elapsed: 12.069231921s
Jan  4 09:25:56.680: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Running", Reason="", readiness=false. Elapsed: 14.067343989s
Jan  4 09:25:58.680: INFO: Pod "azurefile-volume-tester-dtxml": Phase="Failed", Reason="", readiness=false. Elapsed: 16.067356188s
STEP: Saw pod failure 01/04/23 09:25:58.68
Jan  4 09:25:58.680: INFO: Pod "azurefile-volume-tester-dtxml" satisfied condition "Error status code"
STEP: checking that pod logs contain expected message 01/04/23 09:25:58.68
Jan  4 09:25:58.724: INFO: deleting Pod "azurefile-156"/"azurefile-volume-tester-dtxml"
Jan  4 09:25:58.757: INFO: Pod azurefile-volume-tester-dtxml has the following logs: out-file : Access to the path 'C:\mnt\test-1\data' is denied.
At line:1 char:1
+ echo $null >> C:\mnt\test-1\data
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
... skipping 56 lines ...
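[Editor's note] This spec mounts the share read-only on a Windows node, so the write is expected to fail; the suite then confirms the pod ended with an error status and that its logs contain the access-denied message. A hedged sketch of that log check with client-go is below; namespace, pod name, and the expected substring come from the lines above.

```go
// Sketch: read a finished pod's logs and assert they contain an expected message,
// as the "checking that pod logs contain expected message" step above does.
package main

import (
	"context"
	"fmt"
	"strings"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "./kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	ns, name := "azurefile-156", "azurefile-volume-tester-dtxml" // from the log above
	raw, err := cs.CoreV1().Pods(ns).GetLogs(name, &corev1.PodLogOptions{}).DoRaw(context.TODO())
	if err != nil {
		panic(err)
	}
	want := "Access to the path" // fragment of the expected access-denied error
	if !strings.Contains(string(raw), want) {
		panic(fmt.Sprintf("pod logs do not contain %q:\n%s", want, raw))
	}
	fmt.Println("pod logs contain the expected message")
}
```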
... skipping 327 lines ...
Jan  4 09:27:11.881: INFO: PersistentVolumeClaim pvc-xc2bp found but phase is Pending instead of Bound.
Jan  4 09:27:13.913: INFO: PersistentVolumeClaim pvc-xc2bp found and phase=Bound (2.063326811s)
STEP: checking the PVC 01/04/23 09:27:13.913
STEP: validating provisioned PV 01/04/23 09:27:13.944
STEP: checking the PV 01/04/23 09:27:13.978
STEP: deploying the pod 01/04/23 09:27:13.978
STEP: checking that the pods command exits with no error 01/04/23 09:27:14.018
Jan  4 09:27:14.018: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-9n7fs" in namespace "azurefile-9336" to be "Succeeded or Failed"
Jan  4 09:27:14.050: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Pending", Reason="", readiness=false. Elapsed: 32.676472ms
Jan  4 09:27:16.083: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Pending", Reason="", readiness=false. Elapsed: 2.065522509s
Jan  4 09:27:18.085: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Pending", Reason="", readiness=false. Elapsed: 4.066803396s
Jan  4 09:27:20.085: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Pending", Reason="", readiness=false. Elapsed: 6.067685247s
Jan  4 09:27:22.085: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Running", Reason="", readiness=true. Elapsed: 8.067247558s
Jan  4 09:27:24.085: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Running", Reason="", readiness=true. Elapsed: 10.067487058s
Jan  4 09:27:26.084: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Running", Reason="", readiness=true. Elapsed: 12.06630194s
Jan  4 09:27:28.084: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Running", Reason="", readiness=false. Elapsed: 14.065876537s
Jan  4 09:27:30.085: INFO: Pod "azurefile-volume-tester-9n7fs": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.066761324s
STEP: Saw pod success 01/04/23 09:27:30.085
Jan  4 09:27:30.085: INFO: Pod "azurefile-volume-tester-9n7fs" satisfied condition "Succeeded or Failed"
STEP: resizing the pvc 01/04/23 09:27:30.116
STEP: sleep 30s waiting for resize complete 01/04/23 09:27:30.152
STEP: checking the resizing result 01/04/23 09:28:00.153
STEP: checking the resizing PV result 01/04/23 09:28:00.184
STEP: checking the resizing azurefile result 01/04/23 09:28:00.215
Jan  4 09:28:00.737: INFO: deleting Pod "azurefile-9336"/"azurefile-volume-tester-9n7fs"
... skipping 33 lines ...
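[Editor's note] The resize spec above simply raises the PVC's requested storage and then waits for the PV and the Azure file share to follow. A minimal sketch of triggering that expansion with client-go; the namespace, claim name, and new size are placeholders, and the expansion itself is carried out by the external-resizer sidecar rather than this code.

```go
// Sketch: request volume expansion by raising the PVC's storage request,
// mirroring the "resizing the pvc" step above. Names and size are placeholders.
package main

import (
	"context"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/api/resource"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "./kubeconfig")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	ns, claim := "azurefile-9336", "pvc-xc2bp" // from the log above
	pvc, err := cs.CoreV1().PersistentVolumeClaims(ns).Get(context.TODO(), claim, metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// Raise the request; the resizer sidecar observes the change and expands the share.
	pvc.Spec.Resources.Requests[corev1.ResourceStorage] = resource.MustParse("20Gi")
	if _, err := cs.CoreV1().PersistentVolumeClaims(ns).Update(context.TODO(), pvc, metav1.UpdateOptions{}); err != nil {
		panic(err)
	}
}
```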
... skipping 361 lines ...
Jan  4 09:28:22.013: INFO: PersistentVolumeClaim pvc-4kmdz found but phase is Pending instead of Bound.
Jan  4 09:28:24.046: INFO: PersistentVolumeClaim pvc-4kmdz found and phase=Bound (2.063522423s)
STEP: checking the PVC 01/04/23 09:28:24.046
STEP: validating provisioned PV 01/04/23 09:28:24.077
STEP: checking the PV 01/04/23 09:28:24.108
STEP: deploying the pod 01/04/23 09:28:24.109
STEP: checking that the pods command exits with no error 01/04/23 09:28:24.146
Jan  4 09:28:24.146: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-85cb4" in namespace "azurefile-2305" to be "Succeeded or Failed"
Jan  4 09:28:24.180: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Pending", Reason="", readiness=false. Elapsed: 34.623423ms
Jan  4 09:28:26.215: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Pending", Reason="", readiness=false. Elapsed: 2.069221318s
Jan  4 09:28:28.214: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Pending", Reason="", readiness=false. Elapsed: 4.068464262s
Jan  4 09:28:30.214: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Pending", Reason="", readiness=false. Elapsed: 6.068182455s
Jan  4 09:28:32.215: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Pending", Reason="", readiness=false. Elapsed: 8.069599965s
Jan  4 09:28:34.216: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Pending", Reason="", readiness=false. Elapsed: 10.070071453s
Jan  4 09:28:36.214: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Running", Reason="", readiness=true. Elapsed: 12.067893479s
Jan  4 09:28:38.215: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Running", Reason="", readiness=true. Elapsed: 14.068882612s
Jan  4 09:28:40.214: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Running", Reason="", readiness=true. Elapsed: 16.068337325s
Jan  4 09:28:42.214: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Running", Reason="", readiness=false. Elapsed: 18.068594074s
Jan  4 09:28:44.214: INFO: Pod "azurefile-volume-tester-85cb4": Phase="Succeeded", Reason="", readiness=false. Elapsed: 20.068401101s
STEP: Saw pod success 01/04/23 09:28:44.214
Jan  4 09:28:44.214: INFO: Pod "azurefile-volume-tester-85cb4" satisfied condition "Succeeded or Failed"
Jan  4 09:28:44.215: INFO: deleting Pod "azurefile-2305"/"azurefile-volume-tester-85cb4"
Jan  4 09:28:44.259: INFO: Pod azurefile-volume-tester-85cb4 has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-85cb4 in namespace azurefile-2305 01/04/23 09:28:44.26
Jan  4 09:28:44.311: INFO: deleting PVC "azurefile-2305"/"pvc-4kmdz"
Jan  4 09:28:44.311: INFO: Deleting PersistentVolumeClaim "pvc-4kmdz"
... skipping 128 lines ...
... skipping 98 lines ...
check the driver pods if restarts ...
======================================================================================
2023/01/04 09:29:16 Check successfully
Jan  4 09:29:16.832: INFO: >>> kubeConfig: /home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig
2023/01/04 09:29:16 run script: test/utils/get_storage_account_secret_name.sh
2023/01/04 09:29:17 got output: azure-storage-account-f46c311439d5b4b38903e84-secret
, error: <nil>
2023/01/04 09:29:17 got storage account secret name: azure-storage-account-f46c311439d5b4b38903e84-secret
STEP: setting up the StorageClass 01/04/23 09:29:17.071
STEP: creating a StorageClass  01/04/23 09:29:17.071
STEP: setting up the PVC and PV 01/04/23 09:29:17.105
STEP: creating a PVC 01/04/23 09:29:17.105
STEP: waiting for PVC to be in phase "Bound" 01/04/23 09:29:17.142
Jan  4 09:29:17.142: INFO: Waiting up to timeout=5m0s for PersistentVolumeClaims [pvc-t68wm] to have phase Bound
Jan  4 09:29:17.173: INFO: PersistentVolumeClaim pvc-t68wm found but phase is Pending instead of Bound.
Jan  4 09:29:19.205: INFO: PersistentVolumeClaim pvc-t68wm found and phase=Bound (2.063029135s)
STEP: checking the PVC 01/04/23 09:29:19.205
STEP: validating provisioned PV 01/04/23 09:29:19.236
STEP: checking the PV 01/04/23 09:29:19.267
STEP: deploying the pod 01/04/23 09:29:19.267
STEP: checking that the pods command exits with no error 01/04/23 09:29:19.31
Jan  4 09:29:19.310: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-4g79k" in namespace "azurefile-7743" to be "Succeeded or Failed"
Jan  4 09:29:19.340: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Pending", Reason="", readiness=false. Elapsed: 30.316348ms
Jan  4 09:29:21.373: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062968888s
Jan  4 09:29:23.374: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Pending", Reason="", readiness=false. Elapsed: 4.064559933s
Jan  4 09:29:25.374: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Pending", Reason="", readiness=false. Elapsed: 6.063857842s
Jan  4 09:29:27.374: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Running", Reason="", readiness=true. Elapsed: 8.064675527s
Jan  4 09:29:29.374: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Running", Reason="", readiness=true. Elapsed: 10.064728148s
Jan  4 09:29:31.375: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Running", Reason="", readiness=true. Elapsed: 12.065746246s
Jan  4 09:29:33.375: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Running", Reason="", readiness=false. Elapsed: 14.064967411s
Jan  4 09:29:35.373: INFO: Pod "azurefile-volume-tester-4g79k": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.063649522s
STEP: Saw pod success 01/04/23 09:29:35.373
Jan  4 09:29:35.373: INFO: Pod "azurefile-volume-tester-4g79k" satisfied condition "Succeeded or Failed"
Jan  4 09:29:35.374: INFO: deleting Pod "azurefile-7743"/"azurefile-volume-tester-4g79k"
Jan  4 09:29:35.415: INFO: Pod azurefile-volume-tester-4g79k has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-4g79k in namespace azurefile-7743 01/04/23 09:29:35.415
Jan  4 09:29:35.461: INFO: deleting PVC "azurefile-7743"/"pvc-t68wm"
Jan  4 09:29:35.461: INFO: Deleting PersistentVolumeClaim "pvc-t68wm"
... skipping 28 lines ...
... skipping 38 lines ...
Jan  4 09:29:59.642: INFO: PersistentVolumeClaim pvc-kqjfh found but phase is Pending instead of Bound.
Jan  4 09:30:01.674: INFO: PersistentVolumeClaim pvc-kqjfh found and phase=Bound (20.350976545s)
STEP: checking the PVC 01/04/23 09:30:01.674
STEP: validating provisioned PV 01/04/23 09:30:01.705
STEP: checking the PV 01/04/23 09:30:01.735
STEP: deploying the pod 01/04/23 09:30:01.735
STEP: checking that the pods command exits with no error 01/04/23 09:30:01.77
Jan  4 09:30:01.771: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-sg8xd" in namespace "azurefile-3162" to be "Succeeded or Failed"
Jan  4 09:30:01.801: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Pending", Reason="", readiness=false. Elapsed: 30.350567ms
Jan  4 09:30:03.833: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062037218s
Jan  4 09:30:05.836: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Pending", Reason="", readiness=false. Elapsed: 4.065219423s
Jan  4 09:30:07.835: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Pending", Reason="", readiness=false. Elapsed: 6.064008571s
Jan  4 09:30:09.835: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Running", Reason="", readiness=true. Elapsed: 8.064586084s
Jan  4 09:30:11.835: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Running", Reason="", readiness=true. Elapsed: 10.064036933s
Jan  4 09:30:13.835: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Running", Reason="", readiness=true. Elapsed: 12.064126106s
Jan  4 09:30:15.835: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Running", Reason="", readiness=false. Elapsed: 14.064207916s
Jan  4 09:30:17.835: INFO: Pod "azurefile-volume-tester-sg8xd": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.06409321s
STEP: Saw pod success 01/04/23 09:30:17.835
Jan  4 09:30:17.835: INFO: Pod "azurefile-volume-tester-sg8xd" satisfied condition "Succeeded or Failed"
Jan  4 09:30:17.835: INFO: deleting Pod "azurefile-3162"/"azurefile-volume-tester-sg8xd"
Jan  4 09:30:17.886: INFO: Pod azurefile-volume-tester-sg8xd has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-sg8xd in namespace azurefile-3162 01/04/23 09:30:17.886
Jan  4 09:30:17.936: INFO: deleting PVC "azurefile-3162"/"pvc-kqjfh"
Jan  4 09:30:17.936: INFO: Deleting PersistentVolumeClaim "pvc-kqjfh"
... skipping 37 lines ...
... skipping 39 lines ...
Jan  4 09:30:44.194: INFO: PersistentVolumeClaim pvc-44tfg found but phase is Pending instead of Bound.
Jan  4 09:30:46.226: INFO: PersistentVolumeClaim pvc-44tfg found and phase=Bound (22.38514495s)
STEP: checking the PVC 01/04/23 09:30:46.226
STEP: validating provisioned PV 01/04/23 09:30:46.257
STEP: checking the PV 01/04/23 09:30:46.289
STEP: deploying the pod 01/04/23 09:30:46.289
STEP: checking that the pods command exits with no error 01/04/23 09:30:46.324
Jan  4 09:30:46.324: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-wpq5c" in namespace "azurefile-9700" to be "Succeeded or Failed"
Jan  4 09:30:46.355: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Pending", Reason="", readiness=false. Elapsed: 30.377981ms
Jan  4 09:30:48.387: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Pending", Reason="", readiness=false. Elapsed: 2.062785657s
Jan  4 09:30:50.389: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Pending", Reason="", readiness=false. Elapsed: 4.064997739s
Jan  4 09:30:52.388: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Pending", Reason="", readiness=false. Elapsed: 6.063944632s
Jan  4 09:30:54.389: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Running", Reason="", readiness=true. Elapsed: 8.065109398s
Jan  4 09:30:56.390: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Running", Reason="", readiness=true. Elapsed: 10.065323375s
Jan  4 09:30:58.394: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Running", Reason="", readiness=true. Elapsed: 12.069664707s
Jan  4 09:31:00.388: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Running", Reason="", readiness=false. Elapsed: 14.063945053s
Jan  4 09:31:02.388: INFO: Pod "azurefile-volume-tester-wpq5c": Phase="Succeeded", Reason="", readiness=false. Elapsed: 16.063763718s
STEP: Saw pod success 01/04/23 09:31:02.388
Jan  4 09:31:02.388: INFO: Pod "azurefile-volume-tester-wpq5c" satisfied condition "Succeeded or Failed"
Jan  4 09:31:02.388: INFO: deleting Pod "azurefile-9700"/"azurefile-volume-tester-wpq5c"
Jan  4 09:31:02.428: INFO: Pod azurefile-volume-tester-wpq5c has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-wpq5c in namespace azurefile-9700 01/04/23 09:31:02.428
Jan  4 09:31:02.487: INFO: deleting PVC "azurefile-9700"/"pvc-44tfg"
Jan  4 09:31:02.487: INFO: Deleting PersistentVolumeClaim "pvc-44tfg"
... skipping 38 lines ...
... skipping 31 lines ...
I0104 09:31:14.435267   38721 testsuites.go:627] 0/1 replicas in the StatefulSet are ready
I0104 09:31:16.433918   38721 testsuites.go:627] 1/1 replicas in the StatefulSet are ready
STEP: checking that the pod is running 01/04/23 09:31:16.466
STEP: check pod exec 01/04/23 09:31:16.5
Jan  4 09:31:16.500: INFO: Running '/usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8010 exec azurefile-volume-tester-n9cbt-0 -- cmd /c type C:\mnt\test-1\data.txt'
Jan  4 09:31:17.127: INFO: rc: 1
Jan  4 09:31:17.127: INFO: Error waiting for output "hello world\r\n" in pod "azurefile-volume-tester-n9cbt-0": error running /usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8010 exec azurefile-volume-tester-n9cbt-0 -- cmd /c type C:\mnt\test-1\data.txt:
Command stdout:

stderr:
The system cannot find the file specified.
command terminated with exit code 1

error:
exit status 1.
Jan  4 09:31:19.129: INFO: Running '/usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8010 exec azurefile-volume-tester-n9cbt-0 -- cmd /c type C:\mnt\test-1\data.txt'
Jan  4 09:31:19.586: INFO: rc: 1
Jan  4 09:31:19.586: INFO: Error waiting for output "hello world\r\n" in pod "azurefile-volume-tester-n9cbt-0": error running /usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8010 exec azurefile-volume-tester-n9cbt-0 -- cmd /c type C:\mnt\test-1\data.txt:
Command stdout:

stderr:
The system cannot find the file specified.
command terminated with exit code 1

error:
exit status 1.
Jan  4 09:31:21.129: INFO: Running '/usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8010 exec azurefile-volume-tester-n9cbt-0 -- cmd /c type C:\mnt\test-1\data.txt'
Jan  4 09:31:21.655: INFO: stderr: ""
Jan  4 09:31:21.655: INFO: stdout: "hello world\r\n"
STEP: deleting the pod for statefulset 01/04/23 09:31:21.655
Jan  4 09:31:21.655: INFO: Deleting pod "azurefile-volume-tester-n9cbt-0" in namespace "azurefile-8010"
... skipping 31 lines ...
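[Editor's note] The StatefulSet spec verifies the written file by retrying `kubectl exec ... cmd /c type C:\mnt\test-1\data.txt` until the expected output appears; the first attempts above fail because the Windows container has not written the file yet. A standalone sketch of that retry loop using only os/exec; the kubectl path, kubeconfig, namespace, and pod name are placeholders from the log.

```go
// Sketch: retry a kubectl exec until its stdout contains the expected string,
// as the repeated "Running '/usr/local/bin/kubectl ... type C:\mnt\test-1\data.txt'"
// attempts above do. Paths and names are placeholders.
package main

import (
	"fmt"
	"os/exec"
	"strings"
	"time"
)

func main() {
	want := "hello world"
	args := []string{
		"--kubeconfig=./kubeconfig", "--namespace=azurefile-8010",
		"exec", "azurefile-volume-tester-n9cbt-0", "--",
		"cmd", "/c", "type", `C:\mnt\test-1\data.txt`,
	}
	for attempt := 0; attempt < 30; attempt++ {
		out, err := exec.Command("kubectl", args...).CombinedOutput()
		if err == nil && strings.Contains(string(out), want) {
			fmt.Printf("got expected output: %q\n", string(out))
			return
		}
		// Non-zero exit code or missing file yet; wait and try again.
		fmt.Printf("attempt %d failed (err=%v), retrying...\n", attempt+1, err)
		time.Sleep(2 * time.Second)
	}
	panic("timed out waiting for expected output")
}
```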
... skipping 39 lines ...
Jan  4 09:32:29.259: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:1, UpdatedReplicas:1, ReadyReplicas:0, AvailableReplicas:0, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"False", LastUpdateTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), Reason:"MinimumReplicasUnavailable", Message:"Deployment does not have minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"azurefile-volume-tester-tbmv2-746557889c\" is progressing."}}, CollisionCount:(*int32)(nil)}
Jan  4 09:32:31.259: INFO: deployment status: v1.DeploymentStatus{ObservedGeneration:1, Replicas:1, UpdatedReplicas:1, ReadyReplicas:0, AvailableReplicas:0, UnavailableReplicas:1, Conditions:[]v1.DeploymentCondition{v1.DeploymentCondition{Type:"Available", Status:"False", LastUpdateTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), Reason:"MinimumReplicasUnavailable", Message:"Deployment does not have minimum availability."}, v1.DeploymentCondition{Type:"Progressing", Status:"True", LastUpdateTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), LastTransitionTime:time.Date(2023, time.January, 4, 9, 32, 25, 0, time.Local), Reason:"ReplicaSetUpdated", Message:"ReplicaSet \"azurefile-volume-tester-tbmv2-746557889c\" is progressing."}}, CollisionCount:(*int32)(nil)}
STEP: checking that the pod is running 01/04/23 09:32:33.323
STEP: check pod exec 01/04/23 09:32:34.387
Jan  4 09:32:34.387: INFO: Running '/usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8666 exec azurefile-volume-tester-tbmv2-746557889c-n9n5q -- cmd /c type C:\mnt\test-1\data.txt'
Jan  4 09:32:34.880: INFO: rc: 1
Jan  4 09:32:34.880: INFO: Error waiting for output "hello world\r\n" in pod "azurefile-volume-tester-tbmv2-746557889c-n9n5q": error running /usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8666 exec azurefile-volume-tester-tbmv2-746557889c-n9n5q -- cmd /c type C:\mnt\test-1\data.txt:
Command stdout:

stderr:
The system cannot find the file specified.
command terminated with exit code 1

error:
exit status 1.
Jan  4 09:32:36.881: INFO: Running '/usr/local/bin/kubectl --kubeconfig=/home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig --namespace=azurefile-8666 exec azurefile-volume-tester-tbmv2-746557889c-n9n5q -- cmd /c type C:\mnt\test-1\data.txt'
Jan  4 09:32:37.407: INFO: stderr: ""
Jan  4 09:32:37.408: INFO: stdout: "hello world\r\n"
STEP: delete volume capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666 first, make sure pod could still be terminated 01/04/23 09:32:37.408
STEP: check whether capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666 exists 01/04/23 09:32:38.271
STEP: ValidateVolumeCapabilities capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666 returned with error: rpc error: code = NotFound desc = the requested volume(capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666) does not exist. 01/04/23 09:32:38.419
STEP: deleting the pod for deployment 01/04/23 09:32:38.42
Jan  4 09:32:38.420: INFO: Deleting pod "azurefile-volume-tester-tbmv2-746557889c-n9n5q" in namespace "azurefile-8666"
Jan  4 09:32:38.462: INFO: Waiting for pod "azurefile-volume-tester-tbmv2-746557889c-n9n5q" in namespace "azurefile-8666" to be fully deleted
Jan  4 09:32:42.546: INFO: deleting Deployment "azurefile-8666"/"azurefile-volume-tester-tbmv2"
Jan  4 09:32:42.577: INFO: Error getting logs for pod azurefile-volume-tester-tbmv2-746557889c-n9n5q: the server could not find the requested resource (get pods azurefile-volume-tester-tbmv2-746557889c-n9n5q)
Jan  4 09:32:42.612: INFO: deleting PVC "azurefile-8666"/"pvc-kvndv"
Jan  4 09:32:42.612: INFO: Deleting PersistentVolumeClaim "pvc-kvndv"
STEP: waiting for claim's PV "pvc-7820338d-390a-45d3-8a4a-90bcd16d2441" to be deleted 01/04/23 09:32:42.719
Jan  4 09:32:42.719: INFO: Waiting up to 10m0s for PersistentVolume pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 to get deleted
Jan  4 09:32:42.750: INFO: PersistentVolume pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 found and phase=Bound (30.720409ms)
Jan  4 09:32:47.784: INFO: PersistentVolume pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 found and phase=Bound (5.064419782s)
... skipping 56 lines ...
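[Editor's note] In the spec above, the test deletes the Azure file share out from under a running Deployment and then calls the driver's ValidateVolumeCapabilities to confirm the volume is really gone (the NotFound response above) before checking that the pod can still terminate. A hedged sketch of that CSI call over gRPC, assuming the controller socket is reachable at a local UDS path; the socket path is a placeholder and the volume ID is copied from the log.

```go
// Sketch: ask a CSI controller whether a volume still exists via
// ValidateVolumeCapabilities, treating codes.NotFound as "already deleted",
// mirroring the NotFound response in the log above. The socket path is a placeholder.
package main

import (
	"context"
	"fmt"

	"github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/credentials/insecure"
	"google.golang.org/grpc/status"
)

func main() {
	conn, err := grpc.Dial("unix:///csi/csi.sock", grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	client := csi.NewControllerClient(conn)
	volumeID := "capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666"
	_, err = client.ValidateVolumeCapabilities(context.TODO(), &csi.ValidateVolumeCapabilitiesRequest{
		VolumeId: volumeID,
		VolumeCapabilities: []*csi.VolumeCapability{{
			AccessMode: &csi.VolumeCapability_AccessMode{
				Mode: csi.VolumeCapability_AccessMode_MULTI_NODE_MULTI_WRITER,
			},
			AccessType: &csi.VolumeCapability_Mount{Mount: &csi.VolumeCapability_MountVolume{}},
		}},
	})
	if status.Code(err) == codes.NotFound {
		fmt.Println("volume no longer exists (expected after deleting it first)")
		return
	}
	if err != nil {
		panic(err)
	}
	fmt.Println("volume still exists")
}
```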
... skipping 72 lines ...
check the driver pods if restarts ...
======================================================================================
2023/01/04 09:34:44 Check successfully
Jan  4 09:34:44.724: INFO: >>> kubeConfig: /home/prow/go/src/sigs.k8s.io/cluster-api-provider-azure/kubeconfig
2023/01/04 09:34:44 run script: test/utils/get_storage_account_secret_name.sh
2023/01/04 09:34:44 got output: azure-storage-account-f46c311439d5b4b38903e84-secret
, error: <nil>
2023/01/04 09:34:44 got storage account secret name: azure-storage-account-f46c311439d5b4b38903e84-secret
STEP: Successfully provisioned AzureFile volume: "capz-d35khr#f46c311439d5b4b38903e84#csi-inline-smb-volume##csi-inline-smb-volume#azurefile-9103"
 01/04/23 09:34:46.005
STEP: deploying the pod 01/04/23 09:34:46.005
STEP: checking that the pods command exits with no error 01/04/23 09:34:46.042
Jan  4 09:34:46.042: INFO: Waiting up to 15m0s for pod "azurefile-volume-tester-nvvfs" in namespace "azurefile-9103" to be "Succeeded or Failed"
Jan  4 09:34:46.075: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Pending", Reason="", readiness=false. Elapsed: 32.411083ms
Jan  4 09:34:48.107: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Pending", Reason="", readiness=false. Elapsed: 2.064490174s
Jan  4 09:34:50.106: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Pending", Reason="", readiness=false. Elapsed: 4.06402787s
Jan  4 09:34:52.108: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Pending", Reason="", readiness=false. Elapsed: 6.065812281s
Jan  4 09:34:54.109: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Pending", Reason="", readiness=false. Elapsed: 8.067166037s
Jan  4 09:34:56.109: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Running", Reason="", readiness=true. Elapsed: 10.066713437s
Jan  4 09:34:58.109: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Running", Reason="", readiness=true. Elapsed: 12.066536945s
Jan  4 09:35:00.109: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Running", Reason="", readiness=true. Elapsed: 14.066294875s
Jan  4 09:35:02.110: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Running", Reason="", readiness=false. Elapsed: 16.067706858s
Jan  4 09:35:04.109: INFO: Pod "azurefile-volume-tester-nvvfs": Phase="Succeeded", Reason="", readiness=false. Elapsed: 18.066639255s
STEP: Saw pod success 01/04/23 09:35:04.109
Jan  4 09:35:04.109: INFO: Pod "azurefile-volume-tester-nvvfs" satisfied condition "Succeeded or Failed"
Jan  4 09:35:04.109: INFO: deleting Pod "azurefile-9103"/"azurefile-volume-tester-nvvfs"
Jan  4 09:35:04.158: INFO: Pod azurefile-volume-tester-nvvfs has the following logs: hello world

STEP: Deleting pod azurefile-volume-tester-nvvfs in namespace azurefile-9103 01/04/23 09:35:04.158
STEP: Destroying namespace "azurefile-9103" for this suite. 01/04/23 09:35:04.206
------------------------------
... skipping 10 lines ...
    STEP: Waiting for a default service account to be provisioned in namespace 01/04/23 09:34:44.303
    STEP: Waiting for kube-root-ca.crt to be provisioned in namespace 01/04/23 09:34:44.363
  << End Captured GinkgoWriter Output
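[Editor's note] The last spec above exercises a CSI inline (ephemeral) volume: the SMB share is declared directly in the pod spec instead of going through a PVC/PV. A hedged sketch of such a volume using the corev1 Go types; the secretName/shareName volumeAttributes keys are an assumption taken from the azurefile CSI driver's inline-volume examples, so check the driver docs for your version.

```go
// Sketch: a CSI inline (ephemeral) azurefile volume declared in the pod spec,
// as used by the "csi-inline-smb-volume" spec above. The volumeAttributes keys
// (secretName/shareName) are assumptions taken from the driver's examples.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	readOnly := false
	vol := corev1.Volume{
		Name: "persistent-storage",
		VolumeSource: corev1.VolumeSource{
			CSI: &corev1.CSIVolumeSource{
				Driver:   "file.csi.azure.com",
				ReadOnly: &readOnly,
				VolumeAttributes: map[string]string{
					"secretName": "azure-storage-account-f46c311439d5b4b38903e84-secret", // from the log above
					"shareName":  "csi-inline-smb-volume",
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}
```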
... skipping 235 lines ...

Streaming logs below:
I0104 09:05:27.562560       1 azurefile.go:299] driver userAgent: file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
I0104 09:05:27.562687       1 azure.go:65] set QPS(25.000000) and QPS Burst(50) for driver kubeClient
I0104 09:05:27.563032       1 azure.go:80] reading cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:27.579784       1 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/kube-system/secrets/azure-cloud-provider 404 Not Found in 16 milliseconds
I0104 09:05:27.580530       1 azure.go:87] InitializeCloudFromSecret: failed to get cloud config from secret kube-system/azure-cloud-provider: failed to get secret kube-system/azure-cloud-provider: secrets "azure-cloud-provider" not found
I0104 09:05:27.580704       1 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:27.580846       1 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: /etc/kubernetes/azure.json
I0104 09:05:27.581014       1 azure.go:110] read cloud config from file: /etc/kubernetes/azure.json successfully
I0104 09:05:27.581777       1 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:05:27.581823       1 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:05:27.581895       1 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
I0104 09:05:27.581963       1 azure_blobclient.go:67] Azure BlobClient using API version: 2021-09-01
I0104 09:05:27.582050       1 azure.go:1008] attach/detach disk operation rate limit QPS: 6.000000, Bucket: 10
I0104 09:05:27.582069       1 azure.go:140] disable UseInstanceMetadata for controller server
I0104 09:05:27.582078       1 azure.go:143] starting controller server...
I0104 09:05:27.582092       1 azurefile.go:304] cloud: AzurePublicCloud, location: eastus, rg: capz-d35khr, VnetName: capz-d35khr-vnet, VnetResourceGroup: capz-d35khr, SubnetName: node-subnet
I0104 09:05:27.587301       1 mount_linux.go:287] 'umount /tmp/kubelet-detect-safe-umount3670939756' failed with: exit status 32, output: umount: /tmp/kubelet-detect-safe-umount3670939756: must be superuser to unmount.
I0104 09:05:27.587327       1 mount_linux.go:289] Detected umount with unsafe 'not mounted' behavior
I0104 09:05:27.587405       1 driver.go:93] Enabling controller service capability: CREATE_DELETE_VOLUME
I0104 09:05:27.587413       1 driver.go:93] Enabling controller service capability: PUBLISH_UNPUBLISH_VOLUME
I0104 09:05:27.587421       1 driver.go:93] Enabling controller service capability: CREATE_DELETE_SNAPSHOT
I0104 09:05:27.587427       1 driver.go:93] Enabling controller service capability: EXPAND_VOLUME
I0104 09:05:27.587433       1 driver.go:93] Enabling controller service capability: SINGLE_NODE_MULTI_WRITER
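The controller above falls back to /etc/kubernetes/azure.json only because the optional kube-system/azure-cloud-provider secret does not exist in this cluster. If one wanted the driver to read its cloud config from that secret instead, a minimal sketch would be the following (the "cloud-config" key name and the embedded azure.json fields follow the usual cloud-provider-azure conventions and are assumptions, not values taken from this job; resource group and location are the ones logged above):

    # Sketch only: optional cloud-config secret the driver probes for at startup.
    apiVersion: v1
    kind: Secret
    metadata:
      name: azure-cloud-provider
      namespace: kube-system
    type: Opaque
    stringData:
      cloud-config: |
        {
          "cloud": "AzurePublicCloud",
          "tenantId": "<tenant-id>",
          "subscriptionId": "<subscription-id>",
          "aadClientId": "<client-id>",
          "aadClientSecret": "<client-secret>",
          "resourceGroup": "capz-d35khr",
          "location": "eastus"
        }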
... skipping 590 lines ...
X-Ms-Request-Id: 6aa2f1d0-86f9-45ce-9841-c7ae7d14679e
X-Ms-Routing-Request-Id: EASTUS:20230104T092154Z:6aa2f1d0-86f9-45ce-9841-c7ae7d14679e
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
Date: Wed, 04 Jan 2023 09:21:53 GMT
I0104 09:21:54.166893       1 util.go:124] Send.sendRequest got response with ContentLength 250, StatusCode 404 and responseBody length 250
I0104 09:21:54.166911       1 azure_privatednsclient.go:215] Received error in privatednszone.get.request: resourceID: /subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net, error: Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:21:54.167099       1 azure_metrics.go:115] "Observed Request Latency" latency_seconds=0.021682327 request="private_dns_zones_get" resource_group="capz-d35khr" subscription_id="0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e" source="" error_code="ResourceNotFound"
I0104 09:21:54.167123       1 azure_storageaccount.go:234] get private dns zone privatelink.file.core.windows.net returned with Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:21:54.167181       1 azure_storageaccount.go:452] Creating private dns zone(privatelink.file.core.windows.net) in resourceGroup (capz-d35khr)
(2023-01-04T09:21:54.1673326Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net?api-version=2018-09-01
Authorization: **REDACTED**
Content-Type: application/json; charset=utf-8
User-Agent: Go/go1.19.4 (amd64-linux) go-autorest/v14.2.1 file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
(2023-01-04T09:21:54.1673783Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net?api-version=2018-09-01
... skipping 140 lines ...
X-Ms-Correlation-Request-Id: 699ccde6-bb58-4d47-a6ab-5960d2f5deb5
X-Ms-Routing-Request-Id: EASTUS:20230104T092225Z:699ccde6-bb58-4d47-a6ab-5960d2f5deb5
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
Content-Type: application/json; charset=utf-8
I0104 09:22:25.953062       1 util.go:124] Send.sendRequest got response with ContentLength 296, StatusCode 404 and responseBody length 296
I0104 09:22:25.953083       1 azure_virtualnetworklinksclient.go:212] Received error in virtualnetworklink.get.request: resourceID: /subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink, error: Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:22:25.953319       1 azure_metrics.go:115] "Observed Request Latency" latency_seconds=0.022714035 request="virtual_network_links_get" resource_group="capz-d35khr" subscription_id="0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e" source="" error_code="ResourceNotFound"
I0104 09:22:25.953385       1 azure_storageaccount.go:244] get virtual link for vnet(capz-d35khr-vnet) and DNS Zone(privatelink.file.core.windows.net) returned with Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:22:25.953460       1 azure_storageaccount.go:466] Creating virtual link for vnet(capz-d35khr-vnet-vnetlink) and DNS Zone(privatelink.file.core.windows.net) in resourceGroup(capz-d35khr)
(2023-01-04T09:22:25.9536078Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink?api-version=2020-06-01
Authorization: **REDACTED**
Content-Type: application/json; charset=utf-8
User-Agent: Go/go1.19.4 (amd64-linux) go-autorest/v14.2.1 file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
(2023-01-04T09:22:25.9537236Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink?api-version=2020-06-01
... skipping 3474 lines ...

Streaming logs below:
I0104 09:05:42.897550    5552 azurefile.go:299] driver userAgent: file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
I0104 09:05:42.900056    5552 azure.go:65] set QPS(25.000000) and QPS Burst(50) for driver kubeClient
I0104 09:05:42.906051    5552 azure.go:80] reading cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:42.923020    5552 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/kube-system/secrets/azure-cloud-provider 404 Not Found in 14 milliseconds
I0104 09:05:42.924284    5552 azure.go:87] InitializeCloudFromSecret: failed to get cloud config from secret kube-system/azure-cloud-provider: failed to get secret kube-system/azure-cloud-provider: secrets "azure-cloud-provider" not found
I0104 09:05:42.924284    5552 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:42.924284    5552 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: C:\k\azure.json
I0104 09:05:42.924284    5552 azure.go:110] read cloud config from file: C:\k\azure.json successfully
I0104 09:05:42.925889    5552 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:05:42.926272    5552 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:05:42.928765    5552 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
... skipping 76 lines ...
I0104 09:12:57.384358    5552 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:12:57.384358    5552 utils.go:77] GRPC request: {}
I0104 09:12:57.384358    5552 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
I0104 09:12:57.385869    5552 utils.go:76] GRPC call: /csi.v1.Node/NodeStageVolume
I0104 09:12:57.385869    5552 utils.go:77] GRPC request: {"staging_target_path":"\\var\\lib\\kubelet\\plugins\\kubernetes.io\\csi\\pv\\azurefile-2540-file.csi.azure.com-preprovsioned-pv-vhv6t\\globalmount","volume_capability":{"AccessType":{"Mount":{}},"access_mode":{"mode":7}},"volume_id":"capz-d35khr#f46c311439d5b4b38903e84#pre-provisioned-multiple-pods##pre-provisioned-multiple-pods#azurefile-25401"}
I0104 09:12:57.396797    5552 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/azurefile-25401/secrets/azure-storage-account-f46c311439d5b4b38903e84-secret 404 Not Found in 8 milliseconds
W0104 09:12:57.396797    5552 azurefile.go:668] GetStorageAccountFromSecret(azure-storage-account-f46c311439d5b4b38903e84-secret, azurefile-25401) failed with error: could not get secret(azure-storage-account-f46c311439d5b4b38903e84-secret): secrets "azure-storage-account-f46c311439d5b4b38903e84-secret" not found
I0104 09:12:57.396797    5552 azurefile.go:670] use cluster identity to get account key from (0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e, capz-d35khr, f46c311439d5b4b38903e84)
(2023-01-04T09:12:57.3967974Z) INFO: REQUEST: POST https://login.microsoftonline.com/097f89a0-9286-43d2-9a1a-08f1d49b1af8/oauth2/token
User-Agent: Go/go1.19.4 (amd64-windows) go-autorest/adal/v1.0.0
Content-Type: application/x-www-form-urlencoded
(2023-01-04T09:12:57.9473165Z) INFO: RESPONSE: 200 https://login.microsoftonline.com/097f89a0-9286-43d2-9a1a-08f1d49b1af8/oauth2/token
Strict-Transport-Security: max-age=31536000; includeSubDomains
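The NodeStageVolume call above stages a pre-provisioned file share addressed by a volume handle of the form <resource-group>#<storage-account>#<share-name>#...; the driver first looks for a per-namespace storage-account secret and, when that is missing (as here), falls back to cluster identity to list the account keys. A statically provisioned PV for this share would look roughly like the sketch below (capacity, access modes, PV name and the nodeStageSecretRef are illustrative assumptions; the volumeHandle is the one logged above):

    # Sketch only: static PV for the pre-provisioned share staged above.
    apiVersion: v1
    kind: PersistentVolume
    metadata:
      name: pre-provisioned-pv        # assumed name
    spec:
      capacity:
        storage: 10Gi                 # assumed size
      accessModes:
        - ReadWriteMany
      persistentVolumeReclaimPolicy: Retain
      csi:
        driver: file.csi.azure.com
        volumeHandle: capz-d35khr#f46c311439d5b4b38903e84#pre-provisioned-multiple-pods##pre-provisioned-multiple-pods#azurefile-25401
        nodeStageSecretRef:           # optional; cluster identity is used when this secret is absent
          name: azure-storage-account-f46c311439d5b4b38903e84-secret
          namespace: azurefile-25401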
... skipping 167 lines ...
I0104 09:19:23.412617    5552 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:19:23.412617    5552 utils.go:77] GRPC request: {}
I0104 09:19:23.412617    5552 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
I0104 09:19:23.414068    5552 utils.go:76] GRPC call: /csi.v1.Node/NodeStageVolume
I0104 09:19:23.414068    5552 utils.go:77] GRPC request: {"staging_target_path":"\\var\\lib\\kubelet\\plugins\\kubernetes.io\\csi\\pv\\azurefile-2540-file.csi.azure.com-preprovsioned-pv-lgwfd\\globalmount","volume_capability":{"AccessType":{"Mount":{}},"access_mode":{"mode":7}},"volume_id":"capz-d35khr#f46c311439d5b4b38903e84#pre-provisioned-multiple-pods##pre-provisioned-multiple-pods#azurefile-25402"}
I0104 09:19:23.423014    5552 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/azurefile-25402/secrets/azure-storage-account-f46c311439d5b4b38903e84-secret 404 Not Found in 8 milliseconds
W0104 09:19:23.423014    5552 azurefile.go:668] GetStorageAccountFromSecret(azure-storage-account-f46c311439d5b4b38903e84-secret, azurefile-25402) failed with error: could not get secret(azure-storage-account-f46c311439d5b4b38903e84-secret): secrets "azure-storage-account-f46c311439d5b4b38903e84-secret" not found
I0104 09:19:23.423014    5552 azurefile.go:670] use cluster identity to get account key from (0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e, capz-d35khr, f46c311439d5b4b38903e84)
(2023-01-04T09:19:23.4230143Z) INFO: REQUEST: POST https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Storage/storageAccounts/f46c311439d5b4b38903e84/listKeys?api-version=2021-02-01
Content-Type: application/json; charset=utf-8
User-Agent: Go/go1.19.4 (amd64-windows) go-autorest/v14.2.1 file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
Authorization: **REDACTED**
(2023-01-04T09:19:23.4283453Z) INFO: REQUEST: POST https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Storage/storageAccounts/f46c311439d5b4b38903e84/listKeys?api-version=2021-02-01
... skipping 1465 lines ...
I0104 09:34:46.347623    5552 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:34:46.347623    5552 utils.go:77] GRPC request: {}
I0104 09:34:46.347623    5552 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
I0104 09:34:46.348852    5552 utils.go:76] GRPC call: /csi.v1.Node/NodePublishVolume
I0104 09:34:46.348852    5552 utils.go:77] GRPC request: {"target_path":"c:\\var\\lib\\kubelet\\pods\\551c90c6-fc1a-4807-8d8c-e348ebddf955\\volumes\\kubernetes.io~csi\\test-volume-1\\mount","volume_capability":{"AccessType":{"Mount":{}},"access_mode":{"mode":7}},"volume_context":{"csi.storage.k8s.io/ephemeral":"true","csi.storage.k8s.io/pod.name":"azurefile-volume-tester-nvvfs","csi.storage.k8s.io/pod.namespace":"azurefile-9103","csi.storage.k8s.io/pod.uid":"551c90c6-fc1a-4807-8d8c-e348ebddf955","csi.storage.k8s.io/serviceAccount.name":"default","mountOptions":"dir_mode=0755,file_mode=0721,cache=singleclient","secretName":"azure-storage-account-f46c311439d5b4b38903e84-secret","server":"","shareName":"csi-inline-smb-volume"},"volume_id":"csi-5a547fc55af779a2f02968e875d89956732381d22fa91e620287453f285882f9"}
I0104 09:34:46.348852    5552 nodeserver.go:68] NodePublishVolume: ephemeral volume(csi-5a547fc55af779a2f02968e875d89956732381d22fa91e620287453f285882f9) mount on c:\var\lib\kubelet\pods\551c90c6-fc1a-4807-8d8c-e348ebddf955\volumes\kubernetes.io~csi\test-volume-1\mount, VolumeContext: map[csi.storage.k8s.io/ephemeral:true csi.storage.k8s.io/pod.name:azurefile-volume-tester-nvvfs csi.storage.k8s.io/pod.namespace:azurefile-9103 csi.storage.k8s.io/pod.uid:551c90c6-fc1a-4807-8d8c-e348ebddf955 csi.storage.k8s.io/serviceAccount.name:default getaccountkeyfromsecret:true mountOptions:dir_mode=0755,file_mode=0721,cache=singleclient secretName:azure-storage-account-f46c311439d5b4b38903e84-secret secretnamespace:azurefile-9103 server: shareName:csi-inline-smb-volume storageaccount:]
W0104 09:34:46.349453    5552 azurefile.go:595] parsing volumeID(csi-5a547fc55af779a2f02968e875d89956732381d22fa91e620287453f285882f9) return with error: error parsing volume id: "csi-5a547fc55af779a2f02968e875d89956732381d22fa91e620287453f285882f9", should at least contain two #
I0104 09:34:46.361701    5552 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/azurefile-9103/secrets/azure-storage-account-f46c311439d5b4b38903e84-secret 200 OK in 11 milliseconds
I0104 09:34:46.361701    5552 nodeserver.go:302] cifsMountPath(c:\var\lib\kubelet\pods\551c90c6-fc1a-4807-8d8c-e348ebddf955\volumes\kubernetes.io~csi\test-volume-1\mount) fstype() volumeID(csi-5a547fc55af779a2f02968e875d89956732381d22fa91e620287453f285882f9) context(map[csi.storage.k8s.io/ephemeral:true csi.storage.k8s.io/pod.name:azurefile-volume-tester-nvvfs csi.storage.k8s.io/pod.namespace:azurefile-9103 csi.storage.k8s.io/pod.uid:551c90c6-fc1a-4807-8d8c-e348ebddf955 csi.storage.k8s.io/serviceAccount.name:default getaccountkeyfromsecret:true mountOptions:dir_mode=0755,file_mode=0721,cache=singleclient secretName:azure-storage-account-f46c311439d5b4b38903e84-secret secretnamespace:azurefile-9103 server: shareName:csi-inline-smb-volume storageaccount:]) mountflags([]) mountOptions([AZURE\f46c311439d5b4b38903e84]) volumeMountGroup()
I0104 09:34:46.361701    5552 safe_mounter_windows.go:155] IsLikelyNotMountPoint: c:\var\lib\kubelet\pods\551c90c6-fc1a-4807-8d8c-e348ebddf955\volumes\kubernetes.io~csi\test-volume-1\mount
I0104 09:34:46.361701    5552 safe_mounter_windows.go:212] Exists path: c:\var\lib\kubelet\pods\551c90c6-fc1a-4807-8d8c-e348ebddf955\volumes\kubernetes.io~csi\test-volume-1\mount
I0104 09:34:46.363796    5552 safe_mounter_windows.go:212] Exists path: c:\var\lib\kubelet\pods\551c90c6-fc1a-4807-8d8c-e348ebddf955\volumes\kubernetes.io~csi\test-volume-1\mount
I0104 09:34:46.364371    5552 azure_common_windows.go:52] Removing path: c:\var\lib\kubelet\pods\551c90c6-fc1a-4807-8d8c-e348ebddf955\volumes\kubernetes.io~csi\test-volume-1\mount
... skipping 28 lines ...

Streaming logs below:
I0104 09:05:45.983893    6880 azurefile.go:299] driver userAgent: file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
I0104 09:05:45.995533    6880 azure.go:65] set QPS(25.000000) and QPS Burst(50) for driver kubeClient
I0104 09:05:46.003525    6880 azure.go:80] reading cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:46.012956    6880 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/kube-system/secrets/azure-cloud-provider  in 5 milliseconds
I0104 09:05:46.014354    6880 azure.go:87] InitializeCloudFromSecret: failed to get cloud config from secret kube-system/azure-cloud-provider: failed to get secret kube-system/azure-cloud-provider: Get "https://10.96.0.1:443/api/v1/namespaces/kube-system/secrets/azure-cloud-provider": dial tcp 10.96.0.1:443: connectex: An attempt was made to access a socket in a way forbidden by its access permissions.
I0104 09:05:46.014354    6880 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:46.014354    6880 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: C:\k\azure.json
I0104 09:05:46.014591    6880 azure.go:110] read cloud config from file: C:\k\azure.json successfully
I0104 09:05:46.016363    6880 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:05:46.016443    6880 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:05:46.018800    6880 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
... skipping 608 lines ...
I0104 09:32:38.717331    6880 safe_mounter_windows.go:69] SMBMount: remote path: \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 local path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:32:38.717331    6880 safe_mounter_windows.go:212] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441
I0104 09:32:38.717925    6880 safe_mounter_windows.go:96] begin to mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on c:\var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:33:06.141187    6880 utils.go:76] GRPC call: /csi.v1.Identity/Probe
I0104 09:33:06.141187    6880 utils.go:77] GRPC request: {}
I0104 09:33:06.141294    6880 utils.go:83] GRPC response: {"ready":{"value":true}}
E0104 09:33:12.984780    6880 utils.go:81] GRPC error: rpc error: code = Internal desc = volume(capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666) mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount failed with smb mapping failed with error: rpc error: code = Unknown desc = NewSmbGlobalMapping failed. output: "New-SmbGlobalMapping : The network name cannot be found. \r\nAt line:1 char:190\r\n+ ... ser, $PWord;New-SmbGlobalMapping -RemotePath $Env:smbremotepath -Cred ...\r\n+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\r\n    + CategoryInfo          : NotSpecified: (MSFT_SmbGlobalMapping:ROOT/Microsoft/...mbGlobalMapping) [New-SmbGlobalMa \r\n   pping], CimException\r\n    + FullyQualifiedErrorId : Windows System Error 67,New-SmbGlobalMapping\r\n \r\n", err: exit status 1
I0104 09:33:13.499900    6880 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:33:13.499900    6880 utils.go:77] GRPC request: {}
I0104 09:33:13.499900    6880 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
I0104 09:33:13.503534    6880 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:33:13.503731    6880 utils.go:77] GRPC request: {}
I0104 09:33:13.503731    6880 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
... skipping 11 lines ...
I0104 09:33:13.512186    6880 safe_mounter_windows.go:69] SMBMount: remote path: \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 local path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:33:13.512186    6880 safe_mounter_windows.go:212] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441
I0104 09:33:13.512819    6880 safe_mounter_windows.go:96] begin to mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on c:\var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:33:36.136928    6880 utils.go:76] GRPC call: /csi.v1.Identity/Probe
I0104 09:33:36.136973    6880 utils.go:77] GRPC request: {}
I0104 09:33:36.137043    6880 utils.go:83] GRPC response: {"ready":{"value":true}}
E0104 09:33:47.314866    6880 utils.go:81] GRPC error: rpc error: code = Internal desc = volume(capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666) mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount failed with smb mapping failed with error: rpc error: code = Unknown desc = NewSmbGlobalMapping failed. output: "New-SmbGlobalMapping : The network name cannot be found. \r\nAt line:1 char:190\r\n+ ... ser, $PWord;New-SmbGlobalMapping -RemotePath $Env:smbremotepath -Cred ...\r\n+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\r\n    + CategoryInfo          : NotSpecified: (MSFT_SmbGlobalMapping:ROOT/Microsoft/...mbGlobalMapping) [New-SmbGlobalMa \r\n   pping], CimException\r\n    + FullyQualifiedErrorId : Windows System Error 67,New-SmbGlobalMapping\r\n \r\n", err: exit status 1
I0104 09:33:48.394304    6880 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:33:48.394346    6880 utils.go:77] GRPC request: {}
I0104 09:33:48.394382    6880 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
I0104 09:33:48.396663    6880 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:33:48.396663    6880 utils.go:77] GRPC request: {}
I0104 09:33:48.396663    6880 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
... skipping 11 lines ...
I0104 09:33:48.403131    6880 safe_mounter_windows.go:69] SMBMount: remote path: \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 local path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:33:48.403131    6880 safe_mounter_windows.go:212] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441
I0104 09:33:48.403768    6880 safe_mounter_windows.go:96] begin to mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on c:\var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:34:06.145035    6880 utils.go:76] GRPC call: /csi.v1.Identity/Probe
I0104 09:34:06.145035    6880 utils.go:77] GRPC request: {}
I0104 09:34:06.145035    6880 utils.go:83] GRPC response: {"ready":{"value":true}}
E0104 09:34:22.896661    6880 utils.go:81] GRPC error: rpc error: code = Internal desc = volume(capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666) mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount failed with smb mapping failed with error: rpc error: code = Unknown desc = NewSmbGlobalMapping failed. output: "New-SmbGlobalMapping : The network name cannot be found. \r\nAt line:1 char:190\r\n+ ... ser, $PWord;New-SmbGlobalMapping -RemotePath $Env:smbremotepath -Cred ...\r\n+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\r\n    + CategoryInfo          : NotSpecified: (MSFT_SmbGlobalMapping:ROOT/Microsoft/...mbGlobalMapping) [New-SmbGlobalMa \r\n   pping], CimException\r\n    + FullyQualifiedErrorId : Windows System Error 67,New-SmbGlobalMapping\r\n \r\n", err: exit status 1
I0104 09:34:24.988228    6880 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:34:24.988228    6880 utils.go:77] GRPC request: {}
I0104 09:34:24.988228    6880 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
I0104 09:34:24.990638    6880 utils.go:76] GRPC call: /csi.v1.Node/NodeGetCapabilities
I0104 09:34:24.990638    6880 utils.go:77] GRPC request: {}
I0104 09:34:24.990638    6880 utils.go:83] GRPC response: {"capabilities":[{"Type":{"Rpc":{"type":1}}},{"Type":{"Rpc":{"type":5}}},{"Type":{"Rpc":{"type":6}}},{"Type":{"Rpc":{"type":2}}}]}
... skipping 11 lines ...
I0104 09:34:25.000571    6880 safe_mounter_windows.go:69] SMBMount: remote path: \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 local path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:34:25.000571    6880 safe_mounter_windows.go:212] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441
I0104 09:34:25.000876    6880 safe_mounter_windows.go:96] begin to mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on c:\var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount
I0104 09:34:36.135013    6880 utils.go:76] GRPC call: /csi.v1.Identity/Probe
I0104 09:34:36.135013    6880 utils.go:77] GRPC request: {}
I0104 09:34:36.135013    6880 utils.go:83] GRPC response: {"ready":{"value":true}}
E0104 09:34:59.137383    6880 utils.go:81] GRPC error: rpc error: code = Internal desc = volume(capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-7820338d-390a-45d3-8a4a-90bcd16d2441###azurefile-8666) mount \\f373b7e3bdb5549e8ab2c2c.file.core.windows.net\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441 on \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-7820338d-390a-45d3-8a4a-90bcd16d2441\globalmount failed with smb mapping failed with error: rpc error: code = Unknown desc = NewSmbGlobalMapping failed. output: "New-SmbGlobalMapping : The network name cannot be found. \r\nAt line:1 char:190\r\n+ ... ser, $PWord;New-SmbGlobalMapping -RemotePath $Env:smbremotepath -Cred ...\r\n+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\r\n    + CategoryInfo          : NotSpecified: (MSFT_SmbGlobalMapping:ROOT/Microsoft/...mbGlobalMapping) [New-SmbGlobalMa \r\n   pping], CimException\r\n    + FullyQualifiedErrorId : Windows System Error 67,New-SmbGlobalMapping\r\n \r\n", err: exit status 1
I0104 09:35:06.135218    6880 utils.go:76] GRPC call: /csi.v1.Identity/Probe
I0104 09:35:06.135218    6880 utils.go:77] GRPC request: {}
I0104 09:35:06.135218    6880 utils.go:83] GRPC response: {"ready":{"value":true}}
print out cloudprovider_azure metrics ...
======================================================================================
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
... skipping 414 lines ...
# HELP go_gc_heap_objects_objects Number of objects, live or unswept, occupying heap memory.
# TYPE go_gc_heap_objects_objects gauge
go_gc_heap_objects_objects 49737
# HELP go_gc_heap_tiny_allocs_objects_total Count of small allocations that are packed together into blocks. These allocations are counted separately from other allocations because each individual allocation is not tracked by the runtime, only their block. Each block is already accounted for in allocs-by-size and frees-by-size.
# TYPE go_gc_heap_tiny_allocs_objects_total counter
go_gc_heap_tiny_allocs_objects_total 17591
# HELP go_gc_limiter_last_enabled_gc_cycle GC cycle the last time the GC CPU limiter was enabled. This metric is useful for diagnosing the root cause of an out-of-memory error, because the limiter trades memory for CPU time when the GC's CPU time gets too high. This is most likely to occur with use of SetMemoryLimit. The first GC cycle is cycle 1, so a value of 0 indicates that it was never enabled.
# TYPE go_gc_limiter_last_enabled_gc_cycle gauge
go_gc_limiter_last_enabled_gc_cycle 0
# HELP go_gc_pauses_seconds Distribution individual GC-related stop-the-world pause latencies.
# TYPE go_gc_pauses_seconds histogram
go_gc_pauses_seconds_bucket{le="9.999999999999999e-10"} 0
go_gc_pauses_seconds_bucket{le="9.999999999999999e-09"} 0
... skipping 395 lines ...

Streaming logs below:
I0104 09:05:27.562560       1 azurefile.go:299] driver userAgent: file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
I0104 09:05:27.562687       1 azure.go:65] set QPS(25.000000) and QPS Burst(50) for driver kubeClient
I0104 09:05:27.563032       1 azure.go:80] reading cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:27.579784       1 round_trippers.go:553] GET https://10.96.0.1:443/api/v1/namespaces/kube-system/secrets/azure-cloud-provider 404 Not Found in 16 milliseconds
I0104 09:05:27.580530       1 azure.go:87] InitializeCloudFromSecret: failed to get cloud config from secret kube-system/azure-cloud-provider: failed to get secret kube-system/azure-cloud-provider: secrets "azure-cloud-provider" not found
I0104 09:05:27.580704       1 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:05:27.580846       1 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: /etc/kubernetes/azure.json
I0104 09:05:27.581014       1 azure.go:110] read cloud config from file: /etc/kubernetes/azure.json successfully
I0104 09:05:27.581777       1 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:05:27.581823       1 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:05:27.581895       1 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
I0104 09:05:27.581963       1 azure_blobclient.go:67] Azure BlobClient using API version: 2021-09-01
I0104 09:05:27.582050       1 azure.go:1008] attach/detach disk operation rate limit QPS: 6.000000, Bucket: 10
I0104 09:05:27.582069       1 azure.go:140] disable UseInstanceMetadata for controller server
I0104 09:05:27.582078       1 azure.go:143] starting controller server...
I0104 09:05:27.582092       1 azurefile.go:304] cloud: AzurePublicCloud, location: eastus, rg: capz-d35khr, VnetName: capz-d35khr-vnet, VnetResourceGroup: capz-d35khr, SubnetName: node-subnet
I0104 09:05:27.587301       1 mount_linux.go:287] 'umount /tmp/kubelet-detect-safe-umount3670939756' failed with: exit status 32, output: umount: /tmp/kubelet-detect-safe-umount3670939756: must be superuser to unmount.
I0104 09:05:27.587327       1 mount_linux.go:289] Detected umount with unsafe 'not mounted' behavior
I0104 09:05:27.587405       1 driver.go:93] Enabling controller service capability: CREATE_DELETE_VOLUME
I0104 09:05:27.587413       1 driver.go:93] Enabling controller service capability: PUBLISH_UNPUBLISH_VOLUME
I0104 09:05:27.587421       1 driver.go:93] Enabling controller service capability: CREATE_DELETE_SNAPSHOT
I0104 09:05:27.587427       1 driver.go:93] Enabling controller service capability: EXPAND_VOLUME
I0104 09:05:27.587433       1 driver.go:93] Enabling controller service capability: SINGLE_NODE_MULTI_WRITER
... skipping 590 lines ...
X-Ms-Request-Id: 6aa2f1d0-86f9-45ce-9841-c7ae7d14679e
X-Ms-Routing-Request-Id: EASTUS:20230104T092154Z:6aa2f1d0-86f9-45ce-9841-c7ae7d14679e
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
Date: Wed, 04 Jan 2023 09:21:53 GMT
I0104 09:21:54.166893       1 util.go:124] Send.sendRequest got response with ContentLength 250, StatusCode 404 and responseBody length 250
I0104 09:21:54.166911       1 azure_privatednsclient.go:215] Received error in privatednszone.get.request: resourceID: /subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net, error: Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:21:54.167099       1 azure_metrics.go:115] "Observed Request Latency" latency_seconds=0.021682327 request="private_dns_zones_get" resource_group="capz-d35khr" subscription_id="0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e" source="" error_code="ResourceNotFound"
I0104 09:21:54.167123       1 azure_storageaccount.go:234] get private dns zone privatelink.file.core.windows.net returned with Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:21:54.167181       1 azure_storageaccount.go:452] Creating private dns zone(privatelink.file.core.windows.net) in resourceGroup (capz-d35khr)
(2023-01-04T09:21:54.1673326Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net?api-version=2018-09-01
Authorization: **REDACTED**
Content-Type: application/json; charset=utf-8
User-Agent: Go/go1.19.4 (amd64-linux) go-autorest/v14.2.1 file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
(2023-01-04T09:21:54.1673783Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net?api-version=2018-09-01
... skipping 140 lines ...
X-Ms-Correlation-Request-Id: 699ccde6-bb58-4d47-a6ab-5960d2f5deb5
X-Ms-Routing-Request-Id: EASTUS:20230104T092225Z:699ccde6-bb58-4d47-a6ab-5960d2f5deb5
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
Content-Type: application/json; charset=utf-8
I0104 09:22:25.953062       1 util.go:124] Send.sendRequest got response with ContentLength 296, StatusCode 404 and responseBody length 296
I0104 09:22:25.953083       1 azure_virtualnetworklinksclient.go:212] Received error in virtualnetworklink.get.request: resourceID: /subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink, error: Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:22:25.953319       1 azure_metrics.go:115] "Observed Request Latency" latency_seconds=0.022714035 request="virtual_network_links_get" resource_group="capz-d35khr" subscription_id="0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e" source="" error_code="ResourceNotFound"
I0104 09:22:25.953385       1 azure_storageaccount.go:244] get virtual link for vnet(capz-d35khr-vnet) and DNS Zone(privatelink.file.core.windows.net) returned with Retriable: false, RetryAfter: 0s, HTTPStatusCode: 404, RawError: {"error":{"code":"ResourceNotFound","message":"The Resource 'Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink' under resource group 'capz-d35khr' was not found. For more details please go to https://aka.ms/ARMResourceNotFoundFix"}}
I0104 09:22:25.953460       1 azure_storageaccount.go:466] Creating virtual link for vnet(capz-d35khr-vnet-vnetlink) and DNS Zone(privatelink.file.core.windows.net) in resourceGroup(capz-d35khr)
(2023-01-04T09:22:25.9536078Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink?api-version=2020-06-01
Authorization: **REDACTED**
Content-Type: application/json; charset=utf-8
User-Agent: Go/go1.19.4 (amd64-linux) go-autorest/v14.2.1 file.csi.azure.com/e2e-a43df1bddbd87e646edd537bc97b8fccfb98fc15 e2e-test
(2023-01-04T09:22:25.9537236Z) INFO: REQUEST: PUT https://management.azure.com/subscriptions/0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e/resourceGroups/capz-d35khr/providers/Microsoft.Network/privateDnsZones/privatelink.file.core.windows.net/virtualNetworkLinks/capz-d35khr-vnet-vnetlink?api-version=2020-06-01
... skipping 3787 lines ...
Platform: linux/amd64

Streaming logs below:
I0104 09:35:53.458801       1 azurefile.go:299] driver userAgent: file.csi.azure.com/v1.24.0
I0104 09:35:53.458909       1 azure.go:65] set QPS(25.000000) and QPS Burst(50) for driver kubeClient
I0104 09:35:53.459188       1 azure.go:80] reading cloud config from secret kube-system/azure-cloud-provider
I0104 09:35:53.470304       1 azure.go:87] InitializeCloudFromSecret: failed to get cloud config from secret kube-system/azure-cloud-provider: failed to get secret kube-system/azure-cloud-provider: secrets "azure-cloud-provider" not found
I0104 09:35:53.470336       1 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:35:53.470343       1 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: /etc/kubernetes/azure.json
I0104 09:35:53.470369       1 azure.go:110] read cloud config from file: /etc/kubernetes/azure.json successfully
I0104 09:35:53.471233       1 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:35:53.471275       1 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:35:53.471378       1 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
... skipping 42 lines ...
Git Commit: b97b1825130f9837731ff25fd878281b0d2288a3
Go Version: go1.18.3
Platform: windows/amd64

Streaming logs below:
I0104 09:35:53.341785    7608 azurefile.go:299] driver userAgent: file.csi.azure.com/v1.24.0
W0104 09:35:53.342668    7608 azure.go:73] get kubeconfig(C:\\k\\config) failed with error: CreateFile C:\\k\\config: The system cannot find the file specified.
I0104 09:35:53.343281    7608 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:35:53.343817    7608 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: C:\k\azure.json
I0104 09:35:53.343915    7608 azure.go:110] read cloud config from file: C:\k\azure.json successfully
I0104 09:35:53.345959    7608 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:35:53.345959    7608 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:35:53.348954    7608 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
... skipping 27 lines ...
I0104 09:35:54.067130    7608 utils.go:83] GRPC response: {"name":"file.csi.azure.com","vendor_version":"v1.24.0"}
I0104 09:35:54.873639    7608 utils.go:76] GRPC call: /csi.v1.Node/NodeGetInfo
I0104 09:35:54.873703    7608 utils.go:77] GRPC request: {}
I0104 09:35:54.873757    7608 utils.go:83] GRPC response: {"node_id":"capz-d35k-8w5hq"}
I0104 09:36:11.007750    7608 utils.go:76] GRPC call: /csi.v1.Node/NodeStageVolume
I0104 09:36:11.007750    7608 utils.go:77] GRPC request: {"staging_target_path":"\\var\\lib\\kubelet\\plugins\\kubernetes.io\\csi\\pv\\pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813\\globalmount","volume_capability":{"AccessType":{"Mount":{"mount_flags":["dir_mode=0777","file_mode=0777","mfsymlinks","cache=strict","nosharesock","actimeo=30"]}},"access_mode":{"mode":5}},"volume_context":{"csi.storage.k8s.io/pv/name":"pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813","csi.storage.k8s.io/pvc/name":"pvc-azurefile","csi.storage.k8s.io/pvc/namespace":"default","secretnamespace":"default","skuName":"Premium_LRS","storage.kubernetes.io/csiProvisionerIdentity":"1672823128067-8081-file.csi.azure.com"},"volume_id":"capz-d35khr#f8d1792b73c094da0a38f82#pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813###default"}
W0104 09:36:11.009078    7608 azurefile.go:668] GetStorageAccountFromSecret(azure-storage-account-f8d1792b73c094da0a38f82-secret, default) failed with error: could not get account key from secret(azure-storage-account-f8d1792b73c094da0a38f82-secret): KubeClient is nil
I0104 09:36:11.009078    7608 azurefile.go:670] use cluster identity to get account key from (0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e, capz-d35khr, f8d1792b73c094da0a38f82)
I0104 09:36:11.728812    7608 nodeserver.go:302] cifsMountPath(\var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813\globalmount) fstype() volumeID(capz-d35khr#f8d1792b73c094da0a38f82#pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813###default) context(map[csi.storage.k8s.io/pv/name:pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813 csi.storage.k8s.io/pvc/name:pvc-azurefile csi.storage.k8s.io/pvc/namespace:default secretnamespace:default skuName:Premium_LRS storage.kubernetes.io/csiProvisionerIdentity:1672823128067-8081-file.csi.azure.com]) mountflags([dir_mode=0777 file_mode=0777 mfsymlinks cache=strict nosharesock actimeo=30]) mountOptions([AZURE\f8d1792b73c094da0a38f82]) volumeMountGroup()
I0104 09:36:11.729338    7608 safe_mounter_windows.go:163] IsLikelyNotMountPoint: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813\globalmount
I0104 09:36:11.729406    7608 safe_mounter_windows.go:220] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813\globalmount
I0104 09:36:11.732172    7608 safe_mounter_windows.go:220] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813\globalmount
I0104 09:36:11.732386    7608 azure_common_windows.go:73] Removing path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-28b9f7c8-f539-4934-ac3a-5bfff0dea813\globalmount
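The dynamically provisioned volume staged above carries skuName Premium_LRS and the mount flags dir_mode=0777,file_mode=0777,mfsymlinks,cache=strict,nosharesock,actimeo=30 in its volume context; these are normally supplied by the StorageClass the test provisions from. A StorageClass producing exactly those parameters would look roughly like this sketch (the class name, reclaim policy and binding mode are assumptions):

    # Sketch only: StorageClass matching the skuName and mount flags logged above.
    apiVersion: storage.k8s.io/v1
    kind: StorageClass
    metadata:
      name: azurefile-csi-premium     # assumed name
    provisioner: file.csi.azure.com
    parameters:
      skuName: Premium_LRS
    mountOptions:
      - dir_mode=0777
      - file_mode=0777
      - mfsymlinks
      - cache=strict
      - nosharesock
      - actimeo=30
    reclaimPolicy: Delete
    volumeBindingMode: Immediate
    allowVolumeExpansion: true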
... skipping 27 lines ...
Git Commit: b97b1825130f9837731ff25fd878281b0d2288a3
Go Version: go1.18.3
Platform: windows/amd64

Streaming logs below:
I0104 09:35:54.002915    3716 azurefile.go:299] driver userAgent: file.csi.azure.com/v1.24.0
W0104 09:35:54.003759    3716 azure.go:73] get kubeconfig(C:\\k\\config) failed with error: CreateFile C:\\k\\config: The system cannot find the file specified.
I0104 09:35:54.004596    3716 azure.go:92] could not read cloud config from secret kube-system/azure-cloud-provider
I0104 09:35:54.004596    3716 azure.go:102] use default AZURE_CREDENTIAL_FILE env var: C:\k\azure.json
I0104 09:35:54.004596    3716 azure.go:110] read cloud config from file: C:\k\azure.json successfully
I0104 09:35:54.006793    3716 azure_auth.go:245] Using AzurePublicCloud environment
I0104 09:35:54.006930    3716 azure_auth.go:130] azure: using client_id+client_secret to retrieve access token
I0104 09:35:54.009201    3716 azure_diskclient.go:68] Azure DisksClient using API version: 2022-03-02
... skipping 43 lines ...
I0104 09:35:57.520602    3716 nodeserver.go:393] NodeUnstageVolume: CleanupMountPoint volume capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-9c45c4f2-b289-4585-b2c1-338cfe68aa36###azurefile-1719 on \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-9c45c4f2-b289-4585-b2c1-338cfe68aa36\proxy-mount
I0104 09:35:57.520602    3716 safe_mounter_windows.go:133] Remove directory: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-9c45c4f2-b289-4585-b2c1-338cfe68aa36\proxy-mount
I0104 09:35:57.521385    3716 nodeserver.go:397] NodeUnstageVolume: unmount volume capz-d35khr#f373b7e3bdb5549e8ab2c2c#pvc-9c45c4f2-b289-4585-b2c1-338cfe68aa36###azurefile-1719 on \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-9c45c4f2-b289-4585-b2c1-338cfe68aa36\globalmount successfully
I0104 09:35:57.521423    3716 utils.go:83] GRPC response: {}
I0104 09:36:24.015307    3716 utils.go:76] GRPC call: /csi.v1.Node/NodeStageVolume
I0104 09:36:24.015307    3716 utils.go:77] GRPC request: {"staging_target_path":"\\var\\lib\\kubelet\\plugins\\kubernetes.io\\csi\\pv\\pvc-75839412-414e-472c-82ea-ddb556553f4f\\globalmount","volume_capability":{"AccessType":{"Mount":{"mount_flags":["dir_mode=0777","file_mode=0777","mfsymlinks","cache=strict","nosharesock","actimeo=30"]}},"access_mode":{"mode":5}},"volume_context":{"csi.storage.k8s.io/pv/name":"pvc-75839412-414e-472c-82ea-ddb556553f4f","csi.storage.k8s.io/pvc/name":"persistent-storage-statefulset-azurefile-win-0","csi.storage.k8s.io/pvc/namespace":"default","secretnamespace":"default","skuName":"Premium_LRS","storage.kubernetes.io/csiProvisionerIdentity":"1672823128067-8081-file.csi.azure.com"},"volume_id":"capz-d35khr#f8d1792b73c094da0a38f82#pvc-75839412-414e-472c-82ea-ddb556553f4f###default"}
W0104 09:36:24.016774    3716 azurefile.go:668] GetStorageAccountFromSecret(azure-storage-account-f8d1792b73c094da0a38f82-secret, default) failed with error: could not get account key from secret(azure-storage-account-f8d1792b73c094da0a38f82-secret): KubeClient is nil
I0104 09:36:24.016774    3716 azurefile.go:670] use cluster identity to get account key from (0e46bd28-a80f-4d3a-8200-d9eb8d80cb2e, capz-d35khr, f8d1792b73c094da0a38f82)
I0104 09:36:24.635237    3716 nodeserver.go:302] cifsMountPath(\var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-75839412-414e-472c-82ea-ddb556553f4f\globalmount) fstype() volumeID(capz-d35khr#f8d1792b73c094da0a38f82#pvc-75839412-414e-472c-82ea-ddb556553f4f###default) context(map[csi.storage.k8s.io/pv/name:pvc-75839412-414e-472c-82ea-ddb556553f4f csi.storage.k8s.io/pvc/name:persistent-storage-statefulset-azurefile-win-0 csi.storage.k8s.io/pvc/namespace:default secretnamespace:default skuName:Premium_LRS storage.kubernetes.io/csiProvisionerIdentity:1672823128067-8081-file.csi.azure.com]) mountflags([dir_mode=0777 file_mode=0777 mfsymlinks cache=strict nosharesock actimeo=30]) mountOptions([AZURE\f8d1792b73c094da0a38f82]) volumeMountGroup()
I0104 09:36:24.635237    3716 safe_mounter_windows.go:163] IsLikelyNotMountPoint: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-75839412-414e-472c-82ea-ddb556553f4f\globalmount
I0104 09:36:24.635237    3716 safe_mounter_windows.go:220] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-75839412-414e-472c-82ea-ddb556553f4f\globalmount
I0104 09:36:24.636742    3716 safe_mounter_windows.go:220] Exists path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-75839412-414e-472c-82ea-ddb556553f4f\globalmount
I0104 09:36:24.638803    3716 azure_common_windows.go:73] Removing path: \var\lib\kubelet\plugins\kubernetes.io\csi\pv\pvc-75839412-414e-472c-82ea-ddb556553f4f\globalmount
... skipping 435 lines ...
# HELP go_gc_heap_objects_objects Number of objects, live or unswept, occupying heap memory.
# TYPE go_gc_heap_objects_objects gauge
go_gc_heap_objects_objects 52872
# HELP go_gc_heap_tiny_allocs_objects_total Count of small allocations that are packed together into blocks. These allocations are counted separately from other allocations because each individual allocation is not tracked by the runtime, only their block. Each block is already accounted for in allocs-by-size and frees-by-size.
# TYPE go_gc_heap_tiny_allocs_objects_total counter
go_gc_heap_tiny_allocs_objects_total 19054
# HELP go_gc_limiter_last_enabled_gc_cycle GC cycle the last time the GC CPU limiter was enabled. This metric is useful for diagnosing the root cause of an out-of-memory error, because the limiter trades memory for CPU time when the GC's CPU time gets too high. This is most likely to occur with use of SetMemoryLimit. The first GC cycle is cycle 1, so a value of 0 indicates that it was never enabled.
# TYPE go_gc_limiter_last_enabled_gc_cycle gauge
go_gc_limiter_last_enabled_gc_cycle 0
# HELP go_gc_pauses_seconds Distribution individual GC-related stop-the-world pause latencies.
# TYPE go_gc_pauses_seconds histogram
go_gc_pauses_seconds_bucket{le="9.999999999999999e-10"} 0
go_gc_pauses_seconds_bucket{le="9.999999999999999e-09"} 0
... skipping 252 lines ...
    STEP: GetAccountNumByResourceGroup(capz-d35khr) returns 7 accounts 01/04/23 09:36:44.922
  << End Captured GinkgoWriter Output
------------------------------


Summarizing 1 Failure:
  [FAIL] Pre-Provisioned [AfterEach] should use existing credentials in k8s cluster [file.csi.azure.com] [Windows]
  /home/prow/go/src/sigs.k8s.io/azurefile-csi-driver/test/e2e/pre_provisioning_test.go:74

Ran 22 of 38 Specs in 1911.457 seconds
FAIL! -- 21 Passed | 1 Failed | 0 Pending | 16 Skipped
You're using deprecated Ginkgo functionality:
=============================================
  Support for custom reporters has been removed in V2.  Please read the documentation linked to below for Ginkgo's new behavior and for a migration path:
  Learn more at: https://onsi.github.io/ginkgo/MIGRATING_TO_V2#removed-custom-reporters

To silence deprecations that can be silenced set the following environment variable:
  ACK_GINKGO_DEPRECATIONS=2.4.0

--- FAIL: TestE2E (1911.46s)
FAIL
FAIL	sigs.k8s.io/azurefile-csi-driver/test/e2e	1911.538s
FAIL
make: *** [Makefile:85: e2e-test] Error 1
NAME                              STATUS   ROLES                  AGE   VERSION                           INTERNAL-IP   EXTERNAL-IP   OS-IMAGE                         KERNEL-VERSION     CONTAINER-RUNTIME
capz-d35k-8w5hq                   Ready    <none>                 34m   v1.23.16-rc.0.37+548d5d2298c4a5   10.1.0.4      <none>        Windows Server 2019 Datacenter   10.0.17763.3770    containerd://1.6.2
capz-d35k-s45sk                   Ready    <none>                 35m   v1.23.16-rc.0.37+548d5d2298c4a5   10.1.0.5      <none>        Windows Server 2019 Datacenter   10.0.17763.3770    containerd://1.6.2
capz-d35khr-control-plane-ngxp5   Ready    control-plane,master   48m   v1.23.16-rc.0.37+548d5d2298c4a5   10.0.0.4      <none>        Ubuntu 18.04.6 LTS               5.4.0-1098-azure   containerd://1.6.2
NAMESPACE     NAME                                                      READY   STATUS        RESTARTS      AGE   IP                NODE                              NOMINATED NODE   READINESS GATES
default       deployment-azurefile-win-6667b5bb77-k88xl                 1/1     Running       0             36s   192.168.235.139   capz-d35k-8w5hq                   <none>           <none>
... skipping 31 lines ...

Jan  4 09:40:24.231: INFO: Collecting logs for Windows node capz-d35k-8w5hq in cluster capz-d35khr in namespace default

Jan  4 09:42:09.923: INFO: Attempting to copy file /c:/crashdumps.tar on node capz-d35k-8w5hq to /logs/artifacts/clusters/capz-d35khr/machines/capz-d35khr-md-win-55c45585d8-2brbj/crashdumps.tar
Jan  4 09:42:11.753: INFO: Collecting boot logs for AzureMachine capz-d35khr-md-win-8w5hq

Failed to get logs for machine capz-d35khr-md-win-55c45585d8-2brbj, cluster default/capz-d35khr: running command "$p = 'c:\localdumps' ; if (Test-Path $p) { tar.exe -cvzf c:\crashdumps.tar $p *>&1 | %{ Write-Output "$_"} } else { Write-Host "No crash dumps found at $p" }": Process exited with status 1
Jan  4 09:42:12.818: INFO: Collecting logs for Windows node capz-d35k-s45sk in cluster capz-d35khr in namespace default

Jan  4 09:43:59.042: INFO: Attempting to copy file /c:/crashdumps.tar on node capz-d35k-s45sk to /logs/artifacts/clusters/capz-d35khr/machines/capz-d35khr-md-win-55c45585d8-5jq8w/crashdumps.tar
Jan  4 09:44:00.652: INFO: Collecting boot logs for AzureMachine capz-d35khr-md-win-s45sk

Failed to get logs for machine capz-d35khr-md-win-55c45585d8-5jq8w, cluster default/capz-d35khr: running command "$p = 'c:\localdumps' ; if (Test-Path $p) { tar.exe -cvzf c:\crashdumps.tar $p *>&1 | %{ Write-Output "$_"} } else { Write-Host "No crash dumps found at $p" }": Process exited with status 1
STEP: Dumping workload cluster default/capz-d35khr kube-system pod logs
STEP: Fetching kube-system pod logs took 380.491342ms
STEP: Dumping workload cluster default/capz-d35khr Azure activity log
STEP: Creating log watcher for controller kube-system/calico-kube-controllers-85f479877b-dh4lr, container calico-kube-controllers
STEP: Creating log watcher for controller kube-system/calico-node-windows-8n9tp, container calico-node-startup
STEP: Collecting events for Pod kube-system/calico-node-gs8kw
... skipping 53 lines ...