Batched Media Upload to Azure Blob Storage through WebApi


Assuming your AzureStorageMultipartFormDataStreamProvider is similar to the class described in that blog post, it already processes multiple files when the request contains more than one.

So all you need to do is change UploadFile to return an IEnumerable<string> and have the controller return the collection of paths.

So your MediaService would have:

    var filenames = provider.FileData.Select(x => x.LocalFileName).ToList();
    return filenames;

And your controller would have:

    var mediaPaths = await _mediaService.UploadFile(User.Identity.Name, Request.Content);
    return Ok(mediaPaths);
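
Putting those two lines in context, here is a minimal sketch of what the resulting UploadFile method might look like. The _container field and the provider's constructor signature are assumptions, based on the provider class shown later in this thread:

    public async Task<IEnumerable<string>> UploadFile(string userName, HttpContent content)
    {
        // Assumption: the provider takes the target blob container in its constructor
        var provider = new AzureStorageMultipartFormDataStreamProvider(_container);
        await content.ReadAsMultipartAsync(provider);

        // FileData contains one entry per uploaded part, so a batched request
        // naturally yields one path per file
        var filenames = provider.FileData.Select(x => x.LocalFileName).ToList();
        return filenames;
    }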


Since you didn't post the code for your AzureStorageMultipartFormDataStreamProvider class, I created a custom AzureStorageMultipartFormDataStreamProvider that inherits from MultipartFileStreamProvider so the Web API can handle batched uploads of multiple files.

In AzureStorageMultipartFormDataStreamProvider we can override the ExecutePostProcessingAsync method.

In that override we have access to each uploaded file's data, which we can then push to Azure Storage.

For more detail, refer to the code below. The complete controller, model, and provider:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.IO;
    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;
    using System.Web.Http;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    public class UploadingController : ApiController
    {
        public Task<List<FileItem>> PostFile()
        {
            if (!Request.Content.IsMimeMultipartContent("form-data"))
            {
                throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
            }

            var multipartStreamProvider = new AzureStorageMultipartFormDataStreamProvider(GetWebApiContainer());
            return Request.Content
                .ReadAsMultipartAsync(multipartStreamProvider)
                .ContinueWith(t =>
                {
                    if (t.IsFaulted)
                    {
                        throw t.Exception;
                    }
                    AzureStorageMultipartFormDataStreamProvider provider = t.Result;
                    return provider.Files;
                });
        }

        public static CloudBlobContainer GetWebApiContainer(string containerName = "webapi-file-container")
        {
            // Retrieve the storage account from the connection string
            CloudStorageAccount storageAccount = CloudStorageAccount.Parse("your connection string");

            // Create the blob client
            CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
            CloudBlobContainer container = blobClient.GetContainerReference(containerName);

            // Create the container if it doesn't already exist
            container.CreateIfNotExists();

            // Enable public read access to the blobs
            var permissions = container.GetPermissions();
            if (permissions.PublicAccess == BlobContainerPublicAccessType.Off)
            {
                permissions.PublicAccess = BlobContainerPublicAccessType.Blob;
                container.SetPermissions(permissions);
            }
            return container;
        }
    }

    public class FileItem
    {
        /// <summary>file name</summary>
        public string Name { get; set; }

        /// <summary>size in megabytes</summary>
        public string SizeInMB { get; set; }

        public string ContentType { get; set; }

        public string Path { get; set; }

        public string BlobUploadCostInSeconds { get; set; }
    }

    public class AzureStorageMultipartFormDataStreamProvider : MultipartFileStreamProvider
    {
        private readonly CloudBlobContainer _container;

        public AzureStorageMultipartFormDataStreamProvider(CloudBlobContainer container)
            : base(Path.GetTempPath())
        {
            _container = container;
            Files = new List<FileItem>();
        }

        public List<FileItem> Files { get; set; }

        public override Task ExecutePostProcessingAsync()
        {
            // Upload each file to Azure blob storage, then remove it from local disk
            foreach (var fileData in this.FileData)
            {
                var sp = new Stopwatch();
                sp.Start();

                string fileName = Path.GetFileName(fileData.Headers.ContentDisposition.FileName.Trim('"'));
                CloudBlockBlob blob = _container.GetBlockBlobReference(fileName);
                blob.Properties.ContentType = fileData.Headers.ContentType.MediaType;

                // Set how many blocks may be uploaded simultaneously; blobs above the
                // threshold are split into blocks (32 MB by default, 64 MB maximum)
                var requestOption = new BlobRequestOptions()
                {
                    ParallelOperationThreadCount = 5,
                    SingleBlobUploadThresholdInBytes = 10 * 1024 * 1024
                };

                // Upload the locally buffered file to the blob
                blob.UploadFromFile(fileData.LocalFileName, options: requestOption);
                blob.FetchAttributes();
                File.Delete(fileData.LocalFileName);
                sp.Stop();

                Files.Add(new FileItem
                {
                    ContentType = blob.Properties.ContentType,
                    Name = blob.Name,
                    SizeInMB = string.Format("{0:f2}MB", blob.Properties.Length / (1024.0 * 1024.0)),
                    Path = blob.Uri.AbsoluteUri,
                    BlobUploadCostInSeconds = string.Format("{0:f2}s", sp.ElapsedMilliseconds / 1000.0)
                });
            }
            return base.ExecutePostProcessingAsync();
        }
    }
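
For completeness, here is a hedged example of calling this endpoint from a .NET client with HttpClient and MultipartFormDataContent, assuming the same usings and packages as the server code above. The URL and field name are placeholders for your actual route:

    public static async Task<List<FileItem>> UploadBatchAsync(IEnumerable<string> paths)
    {
        using (var client = new HttpClient())
        using (var form = new MultipartFormDataContent())
        {
            foreach (var path in paths)
            {
                var fileContent = new ByteArrayContent(File.ReadAllBytes(path));
                // The quoted file name here becomes ContentDisposition.FileName on the server
                form.Add(fileContent, "files", Path.GetFileName(path));
            }

            // Placeholder URL; adjust to match your routing configuration
            var response = await client.PostAsync("http://localhost:1234/api/uploading", form);
            response.EnsureSuccessStatusCode();
            return await response.Content.ReadAsAsync<List<FileItem>>();
        }
    }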

The result is a list of FileItem entries, one per uploaded blob, reporting the name, size, content type, blob URI, and upload time.



I would look into uploading the media directly to blob storage after getting SAS tokens for all of your files from the Web API in a single request. Upload the files from your client using promises and HTTP PUT, which parallelizes the uploads. That is the right design and approach here: it increases your upload speed and reduces latency, since the file bytes never pass through the Web API.
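
A rough sketch of the SAS-issuing endpoint, assuming the same storage SDK and container helper as the answer above plus a using for System.Linq; the action shape and the 15-minute expiry are illustrative choices, not a fixed API:

    public class SasController : ApiController
    {
        // Issues one short-lived, write-only SAS URL per requested file name
        public IHttpActionResult Post(List<string> fileNames)
        {
            var container = UploadingController.GetWebApiContainer();
            var urls = fileNames.Select(name =>
            {
                var blob = container.GetBlockBlobReference(name);
                var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
                {
                    Permissions = SharedAccessBlobPermissions.Create | SharedAccessBlobPermissions.Write,
                    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
                });
                return blob.Uri.AbsoluteUri + sas;
            }).ToList();
            return Ok(urls);
        }
    }

The client then uploads each file directly with an HTTP PUT to its SAS URL, setting the x-ms-blob-type: BlockBlob header, so all uploads run in parallel against storage rather than through your Web API.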