Serializing a very large List of items into Azure blob storage using C#


Perhaps you should switch to JSON?

Using the JSON Serializer, you can stream to and from files and serialize/deserialize piecemeal (as the file is read).

Would your objects map to JSON well?

This is what I use to take a network stream and read it into a JObject.

    private static async Task<JObject> ProcessJsonResponse(HttpResponseMessage response)
    {
        // Open the stream from the network
        using (var s = await ProcessResponseStream(response).ConfigureAwait(false))
        {
            using (var sr = new StreamReader(s))
            {
                using (var reader = new JsonTextReader(sr))
                {
                    var serializer = new JsonSerializer { DateParseHandling = DateParseHandling.None };
                    return serializer.Deserialize<JObject>(reader);
                }
            }
        }
    }

Additionally, you could GZip the stream to reduce the file transfer times. We stream directly to GZipped JSON and back again.
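
For example, a minimal sketch of the read side, assuming the incoming stream (s, as in the method above) contains gzip-compressed JSON, just wraps it in a GZipStream before handing it to the reader:

    // A minimal sketch (not the poster's exact code); requires System.IO.Compression.
    // "s" is assumed to be a gzip-compressed stream, e.g. the response/blob stream above
    using (var gzip = new GZipStream(s, CompressionMode.Decompress))
    using (var sr = new StreamReader(gzip))
    using (var reader = new JsonTextReader(sr))
    {
        var serializer = new JsonSerializer { DateParseHandling = DateParseHandling.None };
        var result = serializer.Deserialize<JObject>(reader);
    }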

Edit: although this is a deserialize, the same approach should work for a serialize.
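
A rough sketch of the serialize direction, assuming a writable stream (outputStream, e.g. opened on the target blob) and a list called myLargeList (both names are placeholders), would be:

    // A minimal sketch; "outputStream" and "myLargeList" are placeholder names
    using (var gzip = new GZipStream(outputStream, CompressionMode.Compress))
    using (var sw = new StreamWriter(gzip))
    using (var writer = new JsonTextWriter(sw))
    {
        var serializer = new JsonSerializer();
        // Streams the object graph straight to the (compressed) stream,
        // so the full JSON string is never held in memory
        serializer.Serialize(writer, myLargeList);
    }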


JSON serialization can work, as the previous poster mentioned, although with a large enough list it was still throwing OutOfMemoryException because the serialized string was simply too big to fit in memory. You might be able to get around this by serializing in pieces if your object is a list, but if you're okay with binary serialization, a much faster/lower-memory way is to use Protobuf serialization.
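
As a sketch of the "serializing in pieces" idea (again with placeholder names items and outputStream), you can write the array element by element so only one item is materialized at a time:

    // A minimal sketch; "items" and "outputStream" are placeholder names
    using (var sw = new StreamWriter(outputStream))
    using (var writer = new JsonTextWriter(sw))
    {
        var serializer = new JsonSerializer();
        writer.WriteStartArray();
        foreach (var item in items)
        {
            // Each element is serialized on its own, so the whole list
            // is never turned into one giant string
            serializer.Serialize(writer, item);
        }
        writer.WriteEndArray();
    }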

Protobuf has faster serialization than JSON and requires a smaller memory footprint, at the cost of not being human-readable. Protobuf-net is a great C# implementation of it. Here is a way to set it up with annotations and here is a way to set it up at runtime. In some instances, you can even GZip the Protobuf-serialized bytes and save even more space.
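
As an illustration only (not taken from the linked docs), a minimal protobuf-net setup with attribute annotations might look like this, where Item, items, outputStream and inputStream are placeholder names:

    // A minimal sketch; type and variable names are placeholders
    [ProtoContract]
    public class Item
    {
        [ProtoMember(1)]
        public int Id { get; set; }

        [ProtoMember(2)]
        public string Name { get; set; }
    }

    // protobuf-net serializes straight to/from a stream, so a List<Item> can be
    // written to a blob stream and read back without building a large in-memory string
    Serializer.Serialize(outputStream, items);
    var roundTripped = Serializer.Deserialize<List<Item>>(inputStream);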