Upload Large Files to Azure Blob Storage with Python

Andrew Zhu (Shudong Zhu)
2 min read · Oct 7, 2022

Here is the problem I faced today: when I used the following Python code to upload a CSV file to an Azure Blob container.

blob_client = self.container_client.get_blob_client(blob_file_path)
with open(local_file_path, 'rb') as f:
    blob_client.upload_blob(f, blob_type="BlockBlob")

See the complete sample code here.

A small file of less than 9 MB uploads fine. However, as the file size grows, I get an error message like this:

The operation did not complete (write) (_ssl.c:2471)

At first, I suspected that the connection was timing out due to the long transmission time, so I added a timeout setting after initializing the blob_service_client and also specified the upload chunk size.

self.blob_service_client.max_single_put_size = 4 * 1024 * 1024  # 4 MB
self.blob_service_client.timeout = 60 * 20  # 20 minutes
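For context on what that chunk size implies: in azure-storage-blob v12, a blob larger than max_single_put_size is uploaded as a series of separate block requests of at most max_block_size bytes each, rather than one single request. A rough sketch of the resulting request count, using the 4 MB value from the snippet above (block_count is a hypothetical helper for illustration, not an SDK function):

```python
import math

def block_count(blob_size: int,
                max_single_put_size: int = 4 * 1024 * 1024,
                max_block_size: int = 4 * 1024 * 1024) -> int:
    """Estimate how many upload requests the SDK would issue for a blob."""
    if blob_size <= max_single_put_size:
        return 1  # small enough for a single Put Blob request
    # otherwise the stream is split into blocks of max_block_size bytes
    return math.ceil(blob_size / max_block_size)

# A 9 MB CSV with the 4 MB settings above would be split into 3 blocks:
print(block_count(9 * 1024 * 1024))  # 3
```

Each of those block requests gets its own chance to hit a write timeout, which is why the chunk size alone did not make the error go away.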

However, the error persisted, only greeting me in a different form:

azure.core.exceptions.ServiceResponseError: ('Connection aborted.', timeout('The write operation timed out'))

Since I had already set the timeout to 20 minutes, the error shouldn't have been caused by the timeout setting.

I kept searching until I read this thread: Failing to upload larger blobs to Azure: azure.core.exceptions.ServiceRequestError: The operation did not complete (write) (_ssl.c:2317).
