Monday, May 27, 2013

Apprenticeship Program


All professionals around the world need to be trained, and software engineers are no exception.

Hence, we are proud to announce a unique program (and, we believe, the first of its kind in Israel) that we are kicking off this week: a Software Development Apprenticeship Program.

PicScout will hire and train apprentices. We will focus on (but not limit ourselves to) clean code, reading and writing code, clean architecture, BDD, TDD, simple and business-oriented design, tools and best practices. In a nutshell: everything you need to become a highly competent software engineer who cares about and takes pride in the profession.

We are looking for a couple of candidates to begin the program!

If you feel this is you, please feel free to send us your resume at jobs@picscout.com

Good Luck!



  

Wednesday, May 15, 2013

Building Lightweight Products

Here is a short talk (in Hebrew) about how we build lightweight products at PicScout.
Unfortunately, the video focuses on the speaker instead of the slides... so ping us if you would like to receive them.


Sunday, May 12, 2013

Consumption of Large Json Objects from IIS


In my previous post I talked about binary serialization of large objects. Today, I'm going to talk about consuming such objects from IIS.

In this case our recommendation is: always stream objects; don't create intermediate strings or byte arrays, or you will find yourself with OutOfMemoryException errors.

First, let's update our client so that it asks for a "gzip" stream:
public class SpecialisedWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);

        // Transparently decompress gzip/deflate responses.
        request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;

        // Large payloads can take a long time to download: allow 90 minutes.
        request.Timeout = 90 * 60 * 1000;
        return request;
    }
}
Next, an ASP.NET application usually has a controller that returns some JsonResult. We introduced a new class, LargeJsonResult, which is returned instead:
public class LargeJsonResult : JsonResult
{
    public override void ExecuteResult(ControllerContext context)
    {
        HttpResponseBase response = context.HttpContext.Response;
        response.ContentType = "application/json";

        if (ReturnCompressedStream(context))
        {
            response.AppendHeader("Content-encoding", "gzip");

            // Compress on the fly; the Json is streamed, never built as one big string.
            using (GZipStream gZipStream = new GZipStream(response.OutputStream, CompressionMode.Compress))
            {
                SerializeResponse(gZipStream, Data);
            }
        }
        else
        {
            // The client did not ask for gzip - stream the Json uncompressed.
            SerializeResponse(response.OutputStream, Data);
        }
    }

    private static bool ReturnCompressedStream(ControllerContext context)
    {
        string acceptEncoding = context.HttpContext.Request.Headers["Accept-Encoding"];
        return !string.IsNullOrEmpty(acceptEncoding) &&
               acceptEncoding.ToLowerInvariant().Contains("gzip");
    }

    private static void SerializeResponse(Stream stream, object data)
    {
        using (StreamWriter streamWriter = new StreamWriter(stream))
        using (JsonWriter writer = new JsonTextWriter(streamWriter))
        {
            JsonSerializer serializer = new JsonSerializer();

            streamWriter.AutoFlush = true;
            serializer.Serialize(writer, data);
        }
    }
}
Here we use the Newtonsoft Json serializer. As you can see, the Json object is streamed to the client, so the last thing we need to do is update the client code that consumes it:
using (SpecialisedWebClient client = new SpecialisedWebClient())
{
    client.Headers.Add("Content-Type: application/json");

    using (Stream stream = client.OpenRead(serviceUri))
    using (StreamReader reader = new StreamReader(stream, System.Text.Encoding.UTF8))
    using (JsonReader jreader = new JsonTextReader(reader))
    {
        JsonSerializer js = new JsonSerializer();

        return js.Deserialize<JObject>(jreader);
    }
}
That's it. Now you can transfer gigabytes of Json data over the wire.

Note: In my previous post, I talked about large object serialization and our custom implementation of ISerializable. Apparently, when the Json representation of such an object is streamed from IIS, the Newtonsoft serializer calls into the ISerializable method, and binary data is emitted instead of Json. To disable this behavior, we need to add the [JsonObject] attribute on top of the object.
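A minimal sketch of what that looks like (the type and its members are illustrative, not our actual entity):

```csharp
using System;
using System.Runtime.Serialization;
using Newtonsoft.Json;

// Without [JsonObject], Json.NET would honor ISerializable and emit the
// binary-oriented GetObjectData() output instead of plain Json members.
[JsonObject]
[Serializable]
public class LargeEntity : ISerializable
{
    public int Id { get; set; }
    public string Payload { get; set; }

    public LargeEntity() { }

    // Constructor used by the binary deserializer.
    protected LargeEntity(SerializationInfo info, StreamingContext context)
    {
        Id = info.GetInt32("Id");
        Payload = info.GetString("Payload");
    }

    // Custom binary serialization (see the previous post).
    public void GetObjectData(SerializationInfo info, StreamingContext context)
    {
        info.AddValue("Id", Id);
        info.AddValue("Payload", Payload);
    }
}
```

With the attribute in place, Json.NET serializes the object's public members as ordinary Json instead of calling GetObjectData().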

Wednesday, May 1, 2013

BDD @ PicScout

Lead

It is well known that BDD (Behavior-Driven Development) has been gaining popularity over the last couple of years, both in theory and in practice. Dan North, the acknowledged founder of this idea and creator of its first implementation, describes its main concepts beautifully in his Introducing BDD paper.

Since so many words have already been written in favor of (and against) BDD, I will not go into the details of this methodology and its techniques, but rather try to present PicScout's perspective.


Our Case

We at PicScout came across this methodology while trying to improve our delivery cycles. On one hand, we discovered that our unit tests, though comprehensive, were not reducing the number of bugs reaching QA enough. On the other hand, we noticed ever-growing disparities between what product and business owners imagined and what our developers eventually delivered. Every approach we tried in order to resolve this pitfall failed. What happened eventually is that our QA engineers were obliged to bridge the chasm manually, by spending more time and focus on acceptance and regression testing. Needless to say, this overwhelmed the QA pipeline.

What we found interesting about BDD is that it creates a common language for product owners, QA engineers and developers - describing a feature with a "given-when-then" logical pattern. The developer can then go ahead and create (and test) the logic based on those guidelines. Correspondingly, the QA engineer has a better grasp of the cycle, and the product owner gets clear visibility into features in development.

It was everything we had hoped for!


Our Practice

However, as with any XP-derived methodology, embracing BDD requires an ideological revolution. To be honest, it was not something we were willing to do without giving it some thought. The first "D" in BDD stands for "Driven" - as in TDD - which means design and development are directed entirely by writing tests first. TDD critics argue that developing a real-world system from scratch with TDD is unreasonable, or at least comes with excessive overhead. We at PicScout do not take sides in this theoretical war. We practice TDD not as a must but as a privilege, mostly for autonomous modules. This is why we decided to spare the "Driven" part of BDD, meaning system (or feature) development is mostly written in a code-first approach, followed by must-have unit tests.

But the "B" in BDD is what intrigued us the most. Key features are translated (though not in their entirety) into the most valuable "given-when-then" scenarios, usually written by a QA engineer but always reviewed and edited by a developer. Scenarios are usually implemented during the development of the logic and unit tests (again, not necessarily in a TDD style). They can be implemented by the developer as a system test (for that we use SpecFlow) or by the QA engineer as a UI test (with Selenium).
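To illustrate, here is a minimal sketch of such a scenario as a SpecFlow feature file with its step bindings (the feature, scenario and identifier names are purely illustrative, not taken from our actual codebase):

```gherkin
Feature: Image matching
  Scenario: A crawled image matches a registered picture
    Given a picture registered in the index
    When the crawler submits a visually identical copy of it
    Then the system reports a match for that picture
```

```csharp
using TechTalk.SpecFlow;

[Binding]
public class ImageMatchingSteps
{
    [Given(@"a picture registered in the index")]
    public void GivenAPictureRegisteredInTheIndex() { /* arrange test data */ }

    [When(@"the crawler submits a visually identical copy of it")]
    public void WhenTheCrawlerSubmitsAVisuallyIdenticalCopyOfIt() { /* act */ }

    [Then(@"the system reports a match for that picture")]
    public void ThenTheSystemReportsAMatchForThatPicture() { /* assert */ }
}
```

The same scenario text serves the product owner as documentation, the developer as a system test, and the QA engineer as the basis for a Selenium UI test.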


Our Gain

·         The developers - now fully aware of all the important aspects of a user story, which might not have been apparent from the story text itself - can provide better feature and code coverage.
·         The developers have a common guideline for testing logic beyond the unit scope. As Dan North accurately describes, our developers can now answer the five common questions - where to start, what (not) to test, how much to test, what to call a test, and why a test fails.
·         QA engineers have a comprehensive acceptance and regression layer, assured to cover all the main features of a system. They are no longer a bottleneck.
·         Product and business owners get a quick reference to what is/was delivered, so maintaining and expanding a production system becomes simpler.


Conclusion

BDD is an emerging development methodology which aims to ease the pain of acceptance and regression testing. As with TDD, it has earned criticism along the way, mainly concerning whether it can be fully practiced in real-world, large-scale systems.

At PicScout, we are embracing several aspects and techniques of this approach, especially everything oriented around the "Behavior" principle. What we can already say is that the gain outweighs the extra effort, both in development terms and in business value.