Tools for image optimization

As we saw a few weeks ago, the weight of an average web page is now almost 1.5MB (median ~1MB), with over 50% of this being images. It’s a harsh reminder that many of our pages on the web are still quite fat, a big concern for slower mobile data connections.

BigQuery-calculated medians for an HTTP Archive run, thanks to Ilya Grigorik

There have been plenty of well-documented cases of heavy page weight, with the Oakley site Brad Frost mentioned in April clocking in at ~25MB of images alone. Insanity. Just think of this on mobile: slower data, CPU, GPU… and it’s just ONE page.

Images are a non-trivial problem to solve because they occasionally need to be high-res, but at the same time small enough not to kill your users’ mobile data caps. My hope is that srcset will help us improve this long-term. Thankfully Blink and WebKit have it, and Firefox will soon.

The page cost of using images on the web isn’t a new problem, but we’re at least moving beyond blaming scripts as the main culprit. As a reminder, back in 2012 Adam Sontag suggested “one less JPG” as a solution to our bickering about framework sizes: drop a single JPG from your page and you’ll often save more bytes than switching JavaScript frameworks would.

Tools

Where possible, it’s best to try automating image optimization so that it’s a first-class citizen in your build chain. To help, I thought I’d share some of the tools I use for this.

As a general rule, run lossy optimizers first, then lossless ones. Most developers forget that optimizers work on a particular file rather than the image itself. This means it doesn’t make sense to optimize an image file and then resize, crop or convert it: any change to the file will completely undo lossless optimizations and make lossy ones a lot less effective.

Grunt tasks

Grunt is a fantastic task runner I use daily, and there are a number of reliable tasks that can assist with image weight reduction, including grunt-imageoptim, grunt-spritefiles and grunticon.
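To make this concrete, here’s a minimal Gruntfile sketch using the grunt-contrib-imagemin plugin (which wraps OptiPNG and jpegtran under the hood); the paths and options are illustrative, not prescriptive:

```javascript
// Hypothetical Gruntfile.js sketch: compress everything under src/img/
// into dist/img/ as part of the default build.
module.exports = function (grunt) {
  grunt.initConfig({
    imagemin: {
      dist: {
        options: { optimizationLevel: 3 },   // OptiPNG trial effort
        files: [{
          expand: true,
          cwd: 'src/img/',
          src: ['**/*.{png,jpg,gif}'],
          dest: 'dist/img/'
        }]
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-imagemin');
  grunt.registerTask('default', ['imagemin']);
};
```

Because the task runs on every build, new images get optimized without anyone having to remember a manual step.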

Of course, not everyone uses Grunt so let’s take a look at some individual tools you can use regardless of your tooling choices.

Individual tools

Some of the image compression tools I recommend checking out include:

PNG: OptiPNG
PNG Quantizer: pngquant
JPG: jpegtran
GIF: gifsicle

The Yeoman team have a Node.js wrapper called node-gifsicle that makes gifsicle available as a local dependency on OS X, Linux and Windows, in case you’re interested. We have wrappers for optipng, jpegtran and pngquant too.

SVG: SVGO

You may also find that removing EXIF data and unneeded color profile information from images leads to some gains.
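For JPEGs, jpegtran’s `-copy none` flag does this losslessly. A hedged sketch (the fallback copy is only so the helper degrades gracefully on machines without libjpeg’s tools installed):

```shell
# Drop EXIF and other metadata segments from a JPEG, losslessly.
strip_jpeg_metadata() {
  in="$1"; out="$2"
  if command -v jpegtran >/dev/null 2>&1; then
    # -copy none discards all extra markers; -optimize re-packs Huffman tables
    jpegtran -copy none -optimize -outfile "$out" "$in" || cp "$in" "$out"
  else
    cp "$in" "$out"   # tool unavailable: pass the file through unchanged
  fi
}
```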

List of tools? Argh. What should I use?

Image compression expert Kornel Lesiński was kind enough to reach out with some recommendations based on his research and real-world usage. If you’re opting for your own tooling chain:

For JPEG:

  1. JPEGMini – lossy (30-50% reduction)

JPEGMini sets the quality of your JPEG to the lowest setting the human eye can tolerate, and it’s quite good at doing this. If you’re unable to use it, consider manually adjusting the quality as low as possible. Be careful, though: you shouldn’t just save all JPEGs at “80%”. The quality setting is only a weak approximation, and the quality you actually achieve can vary from image to image.

JPEGMini doesn’t really have an open-source/CLI equivalent (though ImageOptim-CLI scripts it); the closest equivalent is adept-jpg-compressor.

  2. jpegcrush (same as jpegrescan) – lossless (5-10% reduction), beating jpegoptim in 99% of cases. jpegcrush is a Perl script utilizing jpegtran, so there’s little need to use jpegtran separately.
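If you do end up driving quality by hand, jpegoptim’s lossy mode is one open-source way to apply the “cap the quality” approach: `--max` only re-encodes files whose current quality exceeds the threshold, leaving already-lean JPEGs untouched. A hedged sketch (the threshold of 80 is illustrative; inspect the results per image, as noted above):

```shell
# Recompress a JPEG only if its quality exceeds the given ceiling.
cap_jpeg_quality() {
  if command -v jpegoptim >/dev/null 2>&1; then
    jpegoptim --max=80 --quiet "$1" || true
  fi
}
```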

For PNG:

There are three steps involved in PNG compression: first a lossy conversion (50-70% reduction), then a search for optimal filters (5-10%), and finally optimal gzip compression of the data (5-30%).

  1. pngquant2 provides a competitive combination of file size and quality for PNG. Windows users can use Tinypng.org, which is pngquant2+optipng (and Kraken.io is the same thing again). Note that most “stable” Linux distributions ship pngquant 1.0, which is quite old and produces significantly poorer-quality encodes; pngquant is worth using from version 1.6 up.

pngnq-s9, pngnq and Photoshop export (if you don’t have alpha) are also decent options worth trying. I would suggest staying away from RIOT, PHP-libgd and, if at all possible, ImageMagick and IrfanView, as they aren’t great at PNG8 and don’t fully support alpha either.

  2. cryopng is also worth checking out and, if you have time, pngwolf. Alternatively, OptiPNG or pngcrush.

  3. advpng probably has the best speed/compression ratio, and I believe that’s what punypng and Kraken.io use too. If you have time, Zopflipng is also worth considering: it’s quite slow, but beats everything else 95% of the time. PNGOUT is a close second (and pretty slow too).
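Chained together, the three steps look something like the sketch below. It’s hedged, not prescriptive: each tool runs only if installed, and the flags and quality range are illustrative (you could swap advpng for Zopflipng in step 3 if you can afford the time):

```shell
# Run the three PNG compression steps in order on a single file.
optimize_png() {
  f="$1"
  # 1. lossy palette quantization (the big win)
  command -v pngquant >/dev/null 2>&1 && pngquant --force --ext .png --quality=65-90 "$f"
  # 2. search for better PNG filters
  command -v optipng >/dev/null 2>&1 && optipng -quiet -o2 "$f"
  # 3. recompress the DEFLATE (IDAT) stream more thoroughly
  command -v advpng >/dev/null 2>&1 && advpng -z -4 "$f" >/dev/null
  return 0
}
```

Running lossy quantization first matters: doing it after the lossless passes would throw their work away, per the rule mentioned earlier.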

Online tools

There are also a number of free online tools you can use for optimization, including some of those already mentioned: Kraken.io, punypng, Smush.it, TinyPNG and JPEGmini. Also check out SpriteMe for combining background images into CSS sprites.

Desktop tools

If you’re primarily a designer or don’t have a build process set up, please consider at least running your images through tools like ImageOptim or ImageAlpha; they will shave bytes off your images and keep your pages a little leaner.

You might also find this write-up on image compression for web developers by Colt McAnlis of interest.

mod_pagespeed

For those looking for a more automated server-side solution to image optimization, mod_pagespeed is an Apache module created by some of my colleagues at Google that speeds up pages by reducing latency and bandwidth. A list of the image optimization techniques it supports is available and includes inlining and recompression.
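Enabling its image filters comes down to a few directives in your Apache config. A hedged sketch (check the mod_pagespeed documentation for the exact filter names your version supports):

```apache
# Illustrative httpd.conf fragment
ModPagespeed on
# recompress_images re-encodes JPEGs and PNGs; inline_images embeds small ones
ModPagespeedEnableFilters recompress_images,inline_images
```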

Others?

If there are other tools or Grunt tasks you’ve found helpful for image optimization, please feel free to share them. I know that both I and others are always interested in benchmarking new alternatives.

Wrapping up

Mobile users are the biggest victims of image bloat on web pages: heavy images take ages to load on slow connections and, used without any optimization, can make for a costly user experience.


Respect your users’ time, try to keep your pages lean and with some luck we’ll make the web just a little bit faster.

59 Comments

    • Check Paul’s answer lower down, but where possible, integrating this into your build process is better than relying on desktop or online tools. That way it’s a seamless part of your workflow and you don’t have to worry about it.

  1. I still get the best results from pngout with automated trials. Nearly every single time. For years. Occasionally optipng will shave off another couple of bytes.

    Though I don’t believe I’ve tried prepng or pngwolf. Thanks for the summary!

    • You’re very welcome! How have you found pngout speed-wise? Curious if you’ve run into the same issues with slowness that we have.

  2. > Which method is more effective Grunt task tool, Online or Desktop way?

    Automate everything. You don’t want to repeat image optimization yourself, nor do you want to ask your team to do the same six steps you repeat. Set up a build process that smartly optimizes new images. Use lossy compression.

  3. I’m a fan of how CodeKit builds it in. It’s such a no-brainer thing I like to see it integrated into tools developers need to use anyway. Like build processes, deploy scripts, whatever. It would even be cool for an FTP based tool like Coda to just do it whenever it pushes up an image.

  4. A few tools worth mentioning:
    There’s the excellent ScriptPNG for Windows, it makes use of nearly everything you mentioned and even more:
    http://css-ig.net/scriptpng
    It has a JPEG targeted sibling:
    http://css-ig.net/scriptjpg

    When using JPEGmini, run JPEGrescan afterwards. JPEGmini is a really good lossy tool; JPEGrescan losslessly tries different progressive modes to pick the one that produces the smallest file, so they are two completely different and complementary beasts.

    PNGOUT is available for Linux, Mac OS X and FreeBSD:
    http://www.jonof.id.au/kenutils (beware server is down from time to time)

    If you play with PNGOUT -r option (random mode) you could take benefit from Huffmix:
    http://encode.ru/threads/1313-Huffmix-a-PNGOUT-r-catalyst
    Several PNGOUT -r + Huffmix runs can beat ZopfliPNG but may take even more time.

    Some more bits (sometimes turning into bytes or dozens of bytes) can be saved by using Huffman tables optimization tools like Deflopt and Defluff:
    http://www.walbeehm.com/download/
    http://encode.ru/threads/1214-defluff-a-deflate-huffman-optimizer

    Cryopng has been turned into a powerful but slow GUI app for Mac OS X (Google translation to English, if you read French use the second link):
    http://translate.google.com/translate?sl=fr&tl=en&js=n&prev=_t&hl=fr&ie=UTF-8&u=http%3A%2F%2Ffrdx.free.fr%2Flog.htm&act=url
    http://frdx.free.fr/log.htm

    • Thanks for the suggestions, Frederic. I wasn’t aware of the savings Defluff and Deflopt might be able to offer, but I’ll play around with them a little more.

  5. For performance I’d also recommend ImageMagick for its fuzzy image-edge trimming feature (-trim).

    Trimming dead space around images can significantly reduce the amount of in-browser memory consumption, and also makes images a bit lighter down the wire too.

  6. Interesting read with a lot of good tips, thanks!

    I also wanted to point you and others to “Responsive-image” (RIMG): it helps responsive websites optimize images (like CMS content) in a simple and performant way. Pure JavaScript, no server-side code, and two lines of code (library + definition).

    Srcset is good but still only gives you two options: a normal picture or a retina version. Nowadays we also have 1080p screens on mobiles, and that’s more than 1x or 2x… it’s almost 4x bigger, so I used srcset but extended it.

    Have a look at http://github.com/joeyvandijk/rimg or have a look at the demo at http://joeyvandijk.github.io/rimg

    ;)

  7. Very nice and comprehensive write up, Addy!

    Allow me to add iconizr to the list of tools (http://iconizr.com respectively https://github.com/jkphl/iconizr). It does pretty much the same as grunticon (convert a bunch of SVG files into a CSS icon kit), with some differences:

    * Besides outputting SVG and PNG data URI and single image icons, it also produces PNG and even SVG sprites, which can save even more HTTP requests and bandwidth
    * It automatically makes use of several PNG optimization tools (pngcrush, pngquant and optipng) when available on the system
    * It optionally outputs Sass code
    * It doesn’t require grunt, as it’s a PHP CLI script

    iconizr may be installed locally on any Linux box or used online at http://iconizr.com

  8. Great article!

    Before optimizing JPG files through imageoptim, do you save them with 100% quality from Photoshop with Progressive checked?

    The grunt-imageoptim tool is a definite must!

  9. Great list, I always use smush.it and find it amazing that jpegmini saved 2KB on a 34KB “smushed” pic and I cannot see any difference. Thanks a ton…

  10. Thanks for all the ideas! I wanted to add something about sprites:

    I’m not sure about grunt-spritefiles, instead, people should try the tool it’s built from, grunt-spritesmith:

    https://github.com/Ensighten/grunt-spritesmith

    It’s awesome, very flexible, very actively maintained.

    Also, if anybody is interested in creating hidpi-ready spritesheets, I’ve been working on an adaptation of grunt-spritesmith that does so — grunt-spritesmith-hd:

    https://github.com/davidtheclark/grunt-spritesmith-hd

    It’s less flexible than the regular grunt-spritesmith right now, more tailored to the projects I’ve been working on; but I’d love open-source input if anybody’s interested.

    Hope those tools help someone.


  13. SVG graphics can be much smaller than bitmapped graphics for certain uses, and they can now be base64-encoded into CSS to eliminate an HTTP request. See svgeneration.com for some nice examples.

  14. There is a very good plugin for image optimization for all you guys who are using WordPress.

    It does lossless image optimization (after you deploy typical compression), so you can reduce images even more than Photoshop or most other tools will. Google will recognize your images as 100% optimized on the Google PageSpeed test. A few KB is not much, but if you have a lot of images it adds up. Please check out http://www.ngimo.com/demo

    Cheers,

    Mat
