How Fast Is… The US Postal Service?
Our regular video series How Fast Is…? examines real world websites and details the causes of their performance issues, as well as what should be done to solve them. After all, the best way to learn about front-end web performance is to see what other people are doing right and doing wrong. In this edition of How Fast Is…?, we analyze the US Postal Service’s website.
It’s tax time again in the United States. I was at the post office this week mailing in my tax forms and as I left I saw an interesting poster:
One million people visit the US Postal Service’s website a day! That’s a lot of traffic! I imagine that during tax season this number increases as people try to locate post offices and research things like certified mail and return receipts. Since the USPS website is so important, I decided to see which web performance best practices they were implementing, and whether there were additional optimizations to be made.
For this video, I crawled and analyzed just over 1100 pages of the US Postal Service’s website.
I’ve been talking a lot about HTTP compression as part of Zoompf’s Lose the Wait blog series. Often people are not compressing what should be compressed. One of the challenges with properly configuring a web server for HTTP compression is that it is opt-in: you must specify which files should be compressed, either by MIME type or by file extension. If you haven’t conducted an inventory to understand all the different types of content your website serves, you may forget to include something.
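As a sketch of what this opt-in configuration looks like, here is an illustrative Apache mod_deflate directive (the exact list of types is an example, not USPS’s actual configuration). Compression applies only to the MIME types explicitly listed, so a less common type like text/csv is easy to forget:

```apacheconf
# Illustrative example: mod_deflate only compresses the MIME types you
# list. Forgetting to add text/csv here would leave CSV downloads
# uncompressed, even while HTML and CSS are compressed.
AddOutputFilterByType DEFLATE text/html text/css application/javascript text/csv
```

Other web servers (nginx, IIS) have equivalent but differently-spelled settings, which is exactly why a content inventory matters.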
This is exactly what happened to USPS.com. They have a large number of downloadable spreadsheets with shipping rates for various packages based on size, dimensions, and shipping locations. These spreadsheets are saved as CSV files and are served with the MIME type text/csv. USPS is not compressing these files, even though they are uncompressed text files that are ideal candidates for HTTP compression. Additionally, USPS is not compressing any of their 404 pages, another common mistake.
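To get a feel for how much is being left on the table, here is a small sketch that gzips some made-up CSV rate data (the values are invented for illustration, not actual USPS rates). Highly repetitive tabular text like this typically shrinks dramatically:

```python
import gzip

# Invented CSV shipping-rate data, standing in for the kind of rate
# tables served as text/csv (values are illustrative only).
csv_data = "\n".join(
    "Zone %d,%d oz,%.2f" % (zone, oz, 0.50 + 0.17 * oz + 0.05 * zone)
    for zone in range(1, 9)
    for oz in range(1, 71)
).encode("ascii")

compressed = gzip.compress(csv_data)
savings = 1 - len(compressed) / len(csv_data)
print("original: %d bytes, gzipped: %d bytes (%.0f%% smaller)"
      % (len(csv_data), len(compressed), savings * 100))
```

The same transformation happens transparently when a server is configured to compress text/csv responses; the only cost is a little CPU time on each request, or none at all if the compressed copies are cached.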
Zoompf’s crawler found two enormous TIFF image files on USPS.com. TIFF images are typically used for high quality printing. In fact, most web browsers do not natively support TIFF images, and instead rely on plug-ins to display them, as shown in the screenshot below:
This is so odd and unexpected, in fact, that Zoompf does not even have a check for websites which use TIFF images. Zoompf does flag when a BMP image is used on a website. Using TIFF and BMP images is a bad idea, because both formats do not always compress their graphical data. BMP images are not compressed at all. (While the file format does allow for a primitive form of run-length encoding, it is not required and support is almost non-existent.) TIFF images do support different compression modes, from lossless to lossy, but because they are used for high resolution printing they often skip compression entirely to avoid compression artifacts. As soon as I’m done with this post, I’ll add a check to Zoompf for TIFF images.
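To see why uncompressed formats are so wasteful, consider a rough sketch: a 24-bit BMP stores 3 raw bytes per pixel, while a generic DEFLATE pass (the same algorithm PNG uses) collapses the flat-color regions typical of logos and web graphics. The synthetic image below is an assumption for illustration; real photographs would compress less dramatically:

```python
import zlib

# Simulate the pixel data of a 200x100 two-color image, typical of a
# simple logo or banner, at 3 bytes per pixel as in a 24-bit BMP.
width, height = 200, 100
pixels = bytearray()
for y in range(height):
    color = b"\xff\x00\x00" if y < height // 2 else b"\x00\x00\xff"
    pixels += color * width

raw_size = len(pixels)                       # what an uncompressed BMP stores
deflated = zlib.compress(bytes(pixels), 9)   # what PNG-style DEFLATE achieves
print("raw: %d bytes, deflated: %d bytes" % (raw_size, len(deflated)))
```

This is why converting decorative TIFF or BMP images to PNG (lossless) or JPEG (lossy, for photos) is almost always an easy win on the web.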
Another odd finding was that not only were images not cached, they also included a Cache-Control: must-revalidate header. This directive forces the browser to always send a conditional request for a resource before using it. Even if USPS.com started using HTTP caching for its resources, it would also need to remove the must-revalidate directive for caching to function properly. The only place you usually see must-revalidate is on the image beacons used to report into web analytics packages. They use must-revalidate to ensure that each visit causes a request to the analytics package’s tracking script, recording the visit. I don’t think I’ve ever seen a normal web resource like a website’s logo always forcing conditional requests using a