Tuesday, August 7, 2012

Detecting site performance anomalies using cURL

I recently participated in an IBM WebSphere Commerce Server deployment and was tasked with hunting down the source of some performance anomalies. cURL can easily probe a site and collect detailed per-request timing data through its many options. Below are the links I referenced to write the scripts that provided critical information and insight into the problem (a minimal sketch of the approach follows the links). I will add more detail here soon about the exact steps I used.


http://newestindustry.org/2006/10/03/baseline-testing-with-curl-2/

http://josephscott.org/archives/2011/10/timing-details-with-curl/

http://www.jonefox.com/blog/2011/10/10/find-the-time-to-first-byte-using-curl/

http://curl.haxx.se/docs/manual.html
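
For reference, here is a minimal sketch of the kind of probe script those links describe. The URL, interval, and output file below are placeholders, not the actual values from the deployment. curl's --write-out variables (time_namelookup, time_connect, time_pretransfer, time_starttransfer, time_total) do the heavy lifting, with time_starttransfer serving as a rough time-to-first-byte measurement.

#!/bin/bash
# Sketch only: URL, interval, and output file are placeholders.
URL="https://www.example.com/"     # hypothetical target, not the real site
OUT="curl-timings.csv"
INTERVAL=60                        # seconds between probes

# curl --write-out variables expose per-phase timings:
#   time_namelookup, time_connect, time_pretransfer,
#   time_starttransfer (roughly time to first byte), time_total
FORMAT='%{http_code},%{time_namelookup},%{time_connect},%{time_pretransfer},%{time_starttransfer},%{time_total}\n'

echo "timestamp,http_code,namelookup,connect,pretransfer,starttransfer,total" > "$OUT"

while true; do
    printf '%s,' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" >> "$OUT"
    curl -s -o /dev/null -w "$FORMAT" "$URL" >> "$OUT"
    sleep "$INTERVAL"
done

Graphing the columns of the resulting CSV over a day or two makes it fairly obvious whether the slowdowns live in DNS lookup, connection setup, or the back end's time to first byte.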

Front-end engineering: Image compression


I recently participated in an IBM WebSphere Commerce Server deployment and was surprised, when I ran Google's PageSpeed against the site, that it suggested running image optimizers on the stock widget graphics. The analysis for one of our pages was as follows:

The detailed message under the Learn more link was helpful in that it pointed to the specific tools they suggest using - bookmarking it here for reference:


Use an image compressor.

Several tools are available that perform further, lossless compression on JPEG and PNG files, with no effect on image quality. For JPEG, we recommend jpegtran or jpegoptim (available on Linux only; run with the --strip-all option). For PNG, we recommend OptiPNG or PNGOUT.

Tip: When you run Page Speed against a page referencing JPEG and PNG files, it automatically compresses the files and saves the output to a configurable directory.
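
For reference, a rough sketch of running the recommended optimizers over a directory of graphics. The directory path and optimization level here are placeholders, not anything specific to the WebSphere Commerce install:

#!/bin/bash
# Sketch only: IMAGE_DIR is a placeholder path.
IMAGE_DIR="./images"

# Lossless JPEG optimization; --strip-all also drops EXIF and other metadata.
find "$IMAGE_DIR" -iname '*.jpg' -o -iname '*.jpeg' | while read -r f; do
    jpegoptim --strip-all "$f"
done

# Lossless PNG optimization; optipng rewrites each file in place.
find "$IMAGE_DIR" -iname '*.png' | while read -r f; do
    optipng -o2 "$f"
done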

I would expect an off-the-shelf product to apply these kinds of optimizations to its own graphics before shipping, no? Some of the absolute numbers seem small, but every 1.5 KB saved is a packet that doesn't need to be sent, and this is a small step for such an easy gain.