This post is about an incident where I had to troubleshoot an error a few users got when they tried to connect to an Azure file share. These users do not use Azure Storage Explorer or the Azure portal to access the share; instead, they use a script that maps the Azure file share as a network drive using the storage account's shared access key.
I think once they mapped the drive, it stayed mapped. So when one of these users clicked on her U: drive, which was supposed to be mapped to the Azure file share, she got the error message below.
At first glance, this error is about not having enough quota, so the first place anyone would look is the storage account, to try to increase the quota. However, increasing the quota did not help in this case; users kept getting the same error message. This file share was used by a critical process that copies files from an on-premises VM to the share. Rebooting the server that runs this process resolved the error immediately, but the problem would come back a few days later.
This server is in a production environment, and restarting it every few days is not an option. Using the handle.exe Sysinternals tool from Microsoft, we found that the issue occurred because there were too many open handles to this file share. To work around the issue while looking into the root cause, I used the Close-AzStorageFileHandle PowerShell cmdlet.
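If you want to inspect the open handles on the server yourself, handle.exe can search by a substring of the handle's object name. A rough sketch (the endpoint string below is just an example filter; replace it to match your storage account):

```powershell
## Requires handle.exe from the Sysinternals suite, run from an elevated prompt.
## -a includes all handle types, -u shows the owning user name; the last argument
## is a substring matched against handle names.
.\handle.exe -a -u file.core.windows.net
```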
```powershell
## This will prompt you to log in; log in with the account you use for the Azure portal
Connect-AzAccount

$Context = New-AzStorageContext `
    -StorageAccountName "replace this with the name of your storage account" `
    -StorageAccountKey "put in the shared access key for the storage account here"

## This will get all the open handles that are currently connected from the IP specified
Get-AzStorageFileHandle -Context $Context -ShareName "put the name of the share here" -Recursive |
    Where-Object -Property ClientIP -eq "put the IP address that caused this problem here"

## This will close all the handles from this IP
Get-AzStorageFileHandle -Context $Context -ShareName "put the name of the share here" -Recursive |
    Where-Object -Property ClientIP -eq "put the IP of the server that caused this problem here" |
    Close-AzStorageFileHandle -Context $Context -ShareName "put the name of the share here"
```
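If you do not care which client owns the handles, Close-AzStorageFileHandle also has a -CloseAll switch that closes every open handle on the share in one call. A sketch, reusing the $Context from above:

```powershell
## Close every open handle on the share, regardless of which client opened it.
## SMB clients will transparently re-open handles they still need, but unsaved
## state in applications holding those handles can be lost, so use with care.
Close-AzStorageFileHandle -Context $Context -ShareName "put the name of the share here" -CloseAll -Recursive
```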
Running this script every day would at least keep the problem from recurring, so as a workaround we could simply set up a scheduled task to run it daily. However, I decided to look further into the issue.
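Scheduling the cleanup could look something like the sketch below, using the built-in ScheduledTasks module; the script path and task name are hypothetical, so adjust them for your environment:

```powershell
## Run the cleanup script every day at 6:00 AM under the SYSTEM account.
## C:\Scripts\Close-FileHandles.ps1 is a placeholder for wherever you saved it.
$Action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Close-FileHandles.ps1"
$Trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName "Close Azure File Share Handles" `
    -Action $Action -Trigger $Trigger -User "SYSTEM" -RunLevel Highest
```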
It turned out that the servers that use this file share all run a PowerShell script that maps the file share with a shared access key. The script runs as a service from a scheduled task that repeats every two minutes. The problem is that nowhere in the script does it disconnect the mapped drive. That would not be a problem if the script were not run as a service: when it runs as a service, each run establishes a new connection handle to the file share. Since the connection is never disconnected, a new handle is created every time the script runs, that is, every two minutes. The maximum number of concurrent connection handles to any file share is 2,000. I went back to the person in charge of that server and asked him to modify the script so that it disconnects from the file share at the end of each run.
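The fix can be sketched like this, assuming the script maps the share with New-PSDrive (the storage account name, share name, and drive letter below are all placeholders):

```powershell
## Map the Azure file share for the duration of the script, then always
## remove the mapping so no handle is left behind.
$SecureKey  = ConvertTo-SecureString "put the shared access key here" -AsPlainText -Force
$Credential = New-Object System.Management.Automation.PSCredential `
    ("AZURE\storageaccountname", $SecureKey)
try {
    New-PSDrive -Name "U" -PSProvider FileSystem `
        -Root "\\storageaccountname.file.core.windows.net\sharename" `
        -Credential $Credential | Out-Null
    ## ... copy the files here ...
}
finally {
    ## Disconnect even if the copy fails, so the handle count stops growing.
    Remove-PSDrive -Name "U"
}
```

Wrapping the disconnect in a finally block is the important part: it guarantees the handle is released even when the copy step throws.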