Sunday, May 12, 2013

Consumption of large JSON objects from IIS


In my previous post I talked about binary serialization of large objects. Today, I'm going to talk about consuming such objects from IIS.

In this case our recommendation is: always stream large objects. Don't build intermediate strings or any other in-memory representation of the whole payload, because you will quickly find yourself facing OutOfMemoryException.
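To make the difference concrete, here is a minimal sketch of the two approaches using Json.NET (the `largeObject` and `outputStream` parameters are placeholders for your data and the destination stream):

```csharp
using System.IO;
using Newtonsoft.Json;

static class StreamingExample
{
    // Buffering: builds the entire JSON document as one huge string first.
    // For multi-GB payloads this is what throws OutOfMemoryException.
    static void SerializeBuffered(object largeObject, Stream outputStream)
    {
        string json = JsonConvert.SerializeObject(largeObject);
        using (StreamWriter writer = new StreamWriter(outputStream))
        {
            writer.Write(json);
        }
    }

    // Streaming: JSON is written to the output as it is produced,
    // so memory use stays flat regardless of payload size.
    static void SerializeStreamed(object largeObject, Stream outputStream)
    {
        using (StreamWriter streamWriter = new StreamWriter(outputStream))
        using (JsonWriter jsonWriter = new JsonTextWriter(streamWriter))
        {
            new JsonSerializer().Serialize(jsonWriter, largeObject);
        }
    }
}
```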

First, let’s update our client so it will ask for “gzip” stream:
public class SpecialisedWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);

        // Advertise gzip/deflate support and let HttpWebRequest
        // decompress the response transparently.
        request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;

        // Large transfers can take a long time: allow up to 90 minutes.
        request.Timeout = 90 * 60 * 1000;
        return request;
    }
}
Next, an ASP.NET MVC application usually has a controller that returns a JsonResult. We introduced a new class, LargeJsonResult, which is returned instead:
public class LargeJsonResult : JsonResult
{
    public override void ExecuteResult(ControllerContext context)
    {
        HttpResponseBase response = context.HttpContext.Response;
        response.ContentType = "application/json";

        if (ReturnCompressedStream(context))
        {
            // The client asked for gzip: compress on the fly while streaming.
            response.AppendHeader("Content-encoding", "gzip");

            using (GZipStream gZipStream = new GZipStream(response.OutputStream, CompressionMode.Compress))
            {
                SerializeResponse(gZipStream, Data);
            }
        }
        else
        {
            // No compression requested: stream the JSON directly.
            SerializeResponse(response.OutputStream, Data);
        }
    }

    private static bool ReturnCompressedStream(ControllerContext context)
    {
        // Compress only if the client declared gzip support.
        string acceptEncoding = context.HttpContext.Request.Headers["Accept-Encoding"];
        return !string.IsNullOrEmpty(acceptEncoding) && acceptEncoding.ToLowerInvariant().Contains("gzip");
    }

    private static void SerializeResponse(Stream stream, object data)
    {
        using (StreamWriter streamWriter = new StreamWriter(stream))
        using (JsonWriter writer = new JsonTextWriter(streamWriter))
        {
            JsonSerializer serializer = new JsonSerializer();

            streamWriter.AutoFlush = true;
            serializer.Serialize(writer, data);
        }
    }
}
Here we use the Newtonsoft Json.NET serializer. As you can see, the JSON object is streamed to the client, so the last thing we need to do is update the client's code to consume it:
using (SpecialisedWebClient client = new SpecialisedWebClient())
{
      client.Headers.Add("Content-Type: application/json");

      using (Stream stream = client.OpenRead(serviceUri))   
      using (StreamReader reader = new StreamReader(stream, System.Text.Encoding.UTF8))
      using (JsonReader jreader = new JsonTextReader(reader))
      {
           JsonSerializer js = new JsonSerializer();    

           return js.Deserialize<JObject>(jreader);
      }
}
That's it. Now you can transfer gigabytes of JSON data over the wire.
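Note that deserializing into a single JObject still materializes the whole document in client memory. If even that is too much, the JsonTextReader can be consumed incrementally instead. A sketch, assuming the payload is a large JSON array and `MyItem` is a hypothetical DTO for one element:

```csharp
using System.IO;
using Newtonsoft.Json;

// Hypothetical item type; substitute your own DTO.
public class MyItem
{
    public int Id { get; set; }
}

public static class IncrementalReader
{
    public static void ReadItems(TextReader reader)
    {
        using (JsonReader jreader = new JsonTextReader(reader))
        {
            JsonSerializer js = new JsonSerializer();
            while (jreader.Read())
            {
                // Deserialize one array element at a time instead of
                // loading the whole document into a JObject.
                if (jreader.TokenType == JsonToken.StartObject)
                {
                    MyItem item = js.Deserialize<MyItem>(jreader);
                    // process the item here, then let it be collected
                }
            }
        }
    }
}
```

This way only one element is alive at a time, so memory use stays flat no matter how large the array grows.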

Note: In my previous post, I talked about large-object serialization and our custom implementation of ISerializable. It turns out that when the JSON representation of such an object is streamed from IIS, the Newtonsoft serializer honours the ISerializable implementation, so binary data appears in the output instead of JSON. To disable this behavior, add the [JsonObject] attribute on top of the class.
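For example, a sketch of how the attribute is applied (`LargeObject` here stands in for the ISerializable type from the previous post):

```csharp
using System.Runtime.Serialization;
using Newtonsoft.Json;

// [JsonObject] tells Json.NET to serialize the public members as ordinary
// JSON instead of honouring the ISerializable implementation below.
[JsonObject]
public class LargeObject : ISerializable
{
    public string Name { get; set; }

    public LargeObject() { }

    protected LargeObject(SerializationInfo info, StreamingContext context)
    {
        Name = info.GetString("Name");
    }

    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Name", Name);
    }
}
```

The binary serializer continues to use GetObjectData, while Json.NET now produces readable JSON.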
