Image Manipulation Series - Binaries For Optimization

April 22, 2014
Updated on: May 02, 2014 at 22:11


Image Manipulation Series
Part One: ImageMagick
Part Two: Binaries For Optimization
Part Three: Keyboard Maestro Image Editing Suite
Part Four: Markdown Image Links


In this second part we deal with the next piece of our standard suite – we’re still in the middle of setting the table for the macros. Our library already includes ImageMagick, which is a great tool for shaping your images. But before our images are ready to be uploaded or shared, what we really want is to get them as small as possible. It’s the year 2014 and size still matters, with all the mobile devices on a limited data “flat-rate” accessing your site. In my book, compression is one of the top three key factors for boosting the load time of your page.

What we will learn in this part:

  1. The basics about popular image formats and when to use which one.
  2. What binaries to use.
  3. How to install them.
  4. Basic commands and how to set the compression to a maximum for best results.


There are a lot of image formats out there. Before we jump into compression, it’s always a good idea to peek behind the curtain. Having a basic knowledge of what exactly we are dealing with is key. You don’t jump into a car and just drive away – so here’s the theoretical test for your driver’s license. This is by no means meant to be a scientific, in-depth explanation; it’s rather me summing it all up in my own words.

I’ll try to cover the most popular formats in this introduction, outline their pros and cons and explain when I tend to use a certain format. This is basic stuff; if you already know how to drive a truck, feel free to proceed to the next section.

Bitmaps And Vectors

Bitmap images are made of pixels, i.e. a lot of small dots in different colors. When you view them at a distance or move away from the picture, these dots blend into one image. Your retina smooths out everything for you.

Here’s a picture I took of Lincoln in Dalivision by Salvador Dalí in the Dalí museum in Figueres (Spain):


The artist used this optical illusion to create two images in one painting. From the corridor on the other side of the hall you can see his version of the classic Abraham Lincoln portrait. The closer you get, the more details of his mosaic collage surface, revealing the naked buttocks of his muse Gala.

Due to this effect, there’s almost no problem with downsizing JPEGs. The big drawback is that they don’t scale up well at all.


Left picture: by unidentified (Christie’s, LotFinder: entry 5176324) [Public domain], via Wikimedia Commons

The left side shows you the normal image; on the right is an enlarged version (just picture yourself standing right in front of a wall where this picture is hanging… only 20 times the original size… you could count the pixels if you get bored).

Vector images are nothing like that. They are pure math, so to speak, and consist of points, lines, curves and shapes. If you enlarge or shrink a vector image only the mathematical co-ordinates get adjusted. The result is that the image stays sharp all the time, no matter the size.


The image above is one of the mockups for the Der Ubercast1 logo. You can see that the text and graphical elements are still sharp as a blade.
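To make the “pure math” nature of vectors concrete, here is a minimal hand-written SVG file (a hypothetical example of my own, not the logo above): the circle isn’t stored as pixels at all, just as a center point and a radius, which is why a renderer can redraw it razor sharp at any size.

```xml
<svg xmlns="http://www.w3.org/2000/svg" width="100" height="100">
  <!-- the circle is described mathematically: a center point plus a radius -->
  <circle cx="50" cy="50" r="40" fill="navy"/>
</svg>
```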

Lossless And Lossy, Images And Compression

Since the utilities we will set up only work with a limited group of file formats, the only lossless image format we’re dealing with is PNG. JPEG and GIF are the lossy file formats we will be able to compress.

A lossless image retains all of a file’s data – every single 0 and every 1 is kept alive. Hence these images tend to be much bigger.

Lossy images don’t keep every “bit” of an image. Different systems and algorithms try to be smart about what to get rid of when saving a file while targeting a certain file size, compression level or overall quality. Do this over and over to a lossy image and it will undergo a metamorphosis… becoming more and more like Frankenstein’s monster every time you hit save. That’s also why you should archive your original artwork in a lossless format.

The same goes for compression. An image which gets compressed losslessly can be recovered at any time. It’s a non-destructive act which only re-encodes the file’s data more compactly. An image which undergoes lossy compression can’t be recovered. Some information is lost for good and your image isn’t the same anymore.
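You can see this “recoverable at any time” property of lossless compression with any general-purpose compressor – here is a quick sketch using gzip on a throwaway text file (the file names are made up for the demo):

```shell
# create a throwaway file and compress it losslessly
printf 'row of pixels, row of pixels, row of pixels\n' > /tmp/demo.txt
gzip -c /tmp/demo.txt > /tmp/demo.txt.gz

# decompressing gives back every single bit of the original
gunzip -c /tmp/demo.txt.gz > /tmp/demo-roundtrip.txt
cmp -s /tmp/demo.txt /tmp/demo-roundtrip.txt && echo "identical"
```

A lossy compressor, by contrast, has no such round trip: the discarded information is simply gone.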

Usually the first step of optimization algorithms is to pass the pixels through a lossless arithmetic transformation: (delta) filtering. Then they are transmitted further as a byte sequence. Filtering itself doesn’t reduce the size of the data (i.e. compress it), but it aims to make your pack of zeros and ones more compressible.
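A tiny experiment illustrates why filtering helps. A smooth ramp of values compresses only so-so as-is, but after delta filtering it turns into a long run of identical numbers, which a compressor like gzip swallows easily. This is a toy stand-in for PNG’s per-pixel filters, not the real thing; the file names are made up:

```shell
# a smooth gradient: 0, 1, 2, ... 255, 0, 1, ... (like a color ramp)
seq 0 9999 | awk '{ print $1 % 256 }' > /tmp/ramp.txt

# delta filter: store each value as the difference to its predecessor
awk 'NR == 1 { prev = $1; print $1; next }
     { print $1 - prev; prev = $1 }' /tmp/ramp.txt > /tmp/ramp-delta.txt

# the filtered file compresses considerably better than the raw one
gzip -c /tmp/ramp.txt       | wc -c
gzip -c /tmp/ramp-delta.txt | wc -c
```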

Tip: the official site of OptiPNG hosts “A guide to PNG optimization”, which goes thoroughly into all possible details of PNG compression and explains it far better than I could here.

In general it can be said that the binaries we will use are smarter than your favorite image editor when it comes to compression. They are really good at stripping out redundant information.
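To check how much the binaries actually save, a small helper that prints a file’s size in bytes is handy. `stat` takes different flags on OS X (BSD) and Linux (GNU), so this sketch tries both; the function name is my own invention:

```shell
# print a file's size in bytes, working on both OS X (BSD stat) and Linux (GNU stat)
filesize() {
    stat -f%z "$1" 2>/dev/null || stat -c%s "$1"
}

# example: measure an image before and after optimization by hand
printf '0123456789' > /tmp/ten-bytes
filesize /tmp/ten-bytes
```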

Use Cases For Bitmaps

Bitmaps are great for photo-realistic images or photos. Essentially, everything rich in detail is suited to be published as a Bitmap. Popular formats are JPG, GIF, TIF, PNG and PSD (although a Photoshop file could contain vectors nowadays).

For web publishing, most images are 72 DPI (dots per inch). Most common displays can’t handle more, so you can save some serious weight. Even with high pixel density displays on the rise, not much has changed so far: most content (read: pictures) is still served as 72 DPI bitmaps. If your website has dozens of pictures which are twice the necessary size, it will impact the loading speed, and visitors on 3G/LTE won’t shake your hand and thank you for eating a big chunk of their limited data “flat-rate”. A good practice is to compress bitmap images before publishing them to the web.

For print a higher DPI Bitmap with at least 300 DPI provides enough density to look good. Images for print don’t need to be compressed.

Aside from the resolution, there are other factors that can sway me towards one format or the other.

GIFs (Graphics Interchange Format) are ancient (if you’re interested in their cultural history, check out this Mashable article). I tend to use them only for animated images, be they pure fun shots or tutorial-like screen shots. I don’t see myself using them for anything web design related anymore. Now that most things can be done via CSS there’s no need for them. GIFs were once a good alternative for decorative elements or fillers due to their small file size and transparency capabilities. In print they have no place – the poor things only sport 256 colors.

JPEGs – even if they are lossy – are great all-rounders. They feel right at home in print and web design, although I wouldn’t pass them on to clients who need something for print: if they accidentally save the file over and over, the quality will suffer.

PNGs (Portable Network Graphics) are somehow my favorites. I guess it’s because text always looks sharper and more readable than in a JPEG. They are ideal for screen shots, and when compressed they are often even smaller than JPEGs. They are also the way to go if you want to preserve the transparency of an image. These days I convert most of my static GIFs to PNG files since they offer a better compression level.

For clients I’d use TIFF (Tagged Image File Format) files if it’s a bitmap and EPS (Encapsulated PostScript) if it’s a vector. They are crowd pleasers: lossless, compressible and usually preferred.

Use Cases For Vectors

Vectors shine at being razor sharp. They aren’t as feature-rich as photographs; they don’t have as many colors and shades. Due to their mathematical nature they are relatively small in size.

In web publishing we have recently started to see them more and more, mostly as logos. This makes sense, since they scale nicely on whatever display you put them on – be it a high-res or a standard 72 DPI monitor.

The advantages in print are obvious: a vector graphic can be scaled to any size. It can end up on a letter, a T-shirt or on a gigantic billboard. You only need that one file, whilst a bitmap must be provided in the respective resolution to still look good.

Most times vector graphics don’t get compressed, but it is possible. Popular formats are EPS, SVG and AI.

This is as deep as I want to get into vectors here, since I see no need for compressing them at this point in time. From now on, everything you read will be addressing bitmaps.

Binaries For Optimization

All of the utilities mentioned here compress your images. They come with only a handful of options… which again translates to “don’t be afraid of the shell”: what these utilities do is complex, but using them isn’t.

Most of them let you influence a few simple factors: whether you prefer better compression over faster compression (or vice versa), what should get appended to the file name (e.g. demo-opt.png) and so on. In the case of lossy compression there’s usually an option to set the quality, too.

The goal of downloading all these binaries is to have a versatile collection that exactly mimics what ImageOptim and ImageAlpha do and bring it to the command-line for easier scripting and macro building.

In the next part of the series we will take JPEGmini with us to have the best of JPEG compression on our team, too.

Like yesterday, to get started grab the demo files download and put it on the desktop.

(1) pngquant

pngquant is a command-line utility and a library for lossy compression of PNG images.

The conversion reduces file sizes significantly (often as much as 70%) and preserves full alpha transparency. Generated images are compatible with all modern web browsers, and have better fallback in IE6 than 24-bit PNGs.

PNG is typically known as a lossless image format which preserves all image details, even minuscule ones. The problem is, those tiny details can cost you several kilobytes.

pngquant is the only tool we will use that offers lossy compression of PNG images. You can grab the latest binary for Mac OS X (v2.2.0) from the official website. However, if you have Homebrew installed, you should put it to use.

brew install pngquant

There’s also a MacPorts package, but it’s at version 2.1.0.

If you copy the file by hand to your /usr/local/bin (or wherever) you might have to make it executable with chmod +x pngquant.

Check out the manual section on the website and/or type pngquant -help in the Terminal which will show you the 7 options it comes with.

A GUI wrapper for pngquant is the app ImageAlpha – it doesn’t support batch processing yet and for this use case defers to pngquant. They do a good job of explaining how it works here. By the way, here’s a thread on Hacker News which also goes into detail on why it’s so good at what it does.

If you plan on combining multiple optimization tools and applying lossy compression, the smart thing to do is to apply the lossy compression first, because then the lossless tools have a solid base to work on.
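As a sketch of that ordering – lossy pngquant first, then the lossless tools (introduced below) on top. The `run` helper and the `DRY_RUN` switch are my own additions so the snippet just prints the commands instead of requiring the binaries to be installed:

```shell
# dry-run helper: with DRY_RUN set, print the command instead of executing it
DRY_RUN=1
run() { if [ -n "$DRY_RUN" ]; then echo "$@"; else "$@"; fi; }

f="$HOME/Desktop/demo/1.png"   # hypothetical input file

# 1. lossy pass first ...
run pngquant --quality=65-80 --ext .png --force "$f"
# 2. ... then the lossless passes on top of it
run pngcrush -reduce -brute "$f" "${f%.png}-opt.png"
run optipng -o7 "${f%.png}-opt.png"
```

Unset DRY_RUN (or drop the helper) to actually run the tools.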

This is a basic pngquant command:

pngquant --quality=65-80 ~/Desktop/demo/1.png

It will add the optimized file, with -fs8 appended to the name, to the source folder. The quality setting can be adjusted to your liking – for some screen shots you can go even lower. You can also run it on a whole folder:

pngquant --quality=65-80 ~/Desktop/demo/*.png

Or, overwrite the original files:

pngquant --quality=65-80 --ext .png --force ~/Desktop/demo/*.png

Documentation: man pngquant or the official site.

(2) pngcrush

Its main purpose is to reduce the size of the PNG IDAT data-stream by trying various compression levels and PNG filter methods. It also can be used to remove unwanted ancillary chunks, or to add certain chunks including gAMA, tRNS, iCCP, and textual chunks.

Our first lossless optimizer, pngcrush, reduces the size of your files by around 60-70% (… if you’re lucky). You can download a pre-compiled build of the latest version (v1.7.73) from this source. Alternatively:

  • brew install pngcrush or
  • sudo port install pngcrush

If you really want to compile it yourself then…

  1. Download it from the source (see above)
  2. Open the Terminal (/Applications/Utilities/) and navigate to the download directory by typing cd and simply dragging said folder onto the Terminal window, for instance: cd ~/Downloads.
  3. Now extract the file with tar -xJf pngcrush-1.7.73.tar.xz and cd into the extracted folder.
  4. The next step: make – this compiles the downloaded source code into a binary.
  5. The default path for binaries under OS X is /usr/local/bin/. Let’s copy our file to that destination – putting it there makes sure you can use it without additional steps (okay, you have to restart your Terminal session, but that’s it):

    sudo mv pngcrush /usr/local/bin/

Since it’s ready now, why not give it a whirl. cd into the demo folder:

cd ~/Desktop/demo
pngcrush -reduce -brute 1.png 1-opt.png

If you want to replace the original file one way to do it is to rename the output file right after the command:

pngcrush -reduce -brute 1.png 1-opt.png && mv 1-opt.png 1.png

But wait… that’s not all pngcrush can do. It might take a bit longer, but when using this tool you really want to crank up the compression.

pngcrush -rem allb -reduce -brute -l9 1.png 1-opt.png

It’s the most drastic setting I know of… if you have other suggestions, don’t hesitate and tell me.

If you’re tired of specifying an output file you can use the -e <ext> option:

pngcrush -rem allb -reduce -brute -l9 -e -opt.png 1.png

If you found a setting you’re comfortable with there’s practically no need to change your command.

Documentation: man pngcrush or the official site.

(3) pngout

PNGOUT optimizes the size of .PNG files losslessly. It uses the same deflate compressor I wrote for KZIP.EXE. With the right options, it can often beat other programs by 5-10%. That includes pngcrush -brute, optipng -o7, advpng -z4, etc.. PNGOUT is a great tool for optimizing the size of web pages or squeezing game content onto mobile devices. Almost all .PNG files on my website were optimized with PNGOUT.

pngout comes pre-compiled; you can download the binary for OS X from here. There’s no Homebrew package for it, so go ahead and just download it from the website.

Drag it into your /usr/local/bin folder (and perhaps make it executable with chmod +x pngout – though I don’t think you need to).

It’s super easy to use:

pngout 1.png 1-opt.png

or, to replace the original file:

pngout 1.png

The option for the best compression is also the default /f0 – no need to change anything.

Documentation: man pngout or the PNGOUT Tutorial by Ken Silverman aka the creator of pngout.

(4) optipng

OptiPNG is a PNG optimizer that recompresses image files to a smaller size, without losing any information. This program also converts external formats (BMP, GIF, PNM and TIFF) to optimized PNG, and performs PNG integrity checks and corrections.

This one doesn’t come pre-compiled, but you can install it via Homebrew:

brew install optipng

Or, if you don’t have Homebrew or simply want to do it the hard way, you can compile it yourself. If you choose to go that route…

  1. Download it from here as a zip or tar file.
  2. In your Finder, click the downloaded file and uncompress it.
  3. Now open the Terminal and navigate to the directory of the extracted folder by typing cd and simply dragging the folder called optipng-0.7.5 onto the Terminal. Press Return. You’re now in said directory.
  4. Type ./configure and press Return again.
  5. As a last step, type sudo make install and press Return.

I picked the “hardcore compression setting” straight from the man page and haven’t added any other options:

optipng -o7 -zm1-9 1.png

Documentation: man optipng

(5) advpng

The main purpose of this utility is to recompress png files to get the smallest possible size.

To compress the files this utility uses the following strategies:

  • Remove all ancillary chunks.
  • Concatenate all the IDAT chunks.
  • Use the 7zip Deflate implementation.

advpng can be downloaded from the website, but you have to compile it just as explained above… or install it with Homebrew:

brew install advancecomp

As far as my default setting goes, I tend to use this:

advpng -z -4 1.png

I don’t use the --iter <number> option because it takes quite some time and the gains often aren’t there.

Documentation: man advpng or the official site.

(6) jpegoptim

Utility to optimize jpeg files. Provides lossless optimization (based on optimizing the Huffman tables) and “lossy” optimization based on setting maximum quality factor.

Now we’re entering the world of JPEGs. Later on we’ll rely mainly on JPEGmini to do most of the work for us (it offers no command-line version), but nonetheless we can use jpegoptim to pre-process our JPEGs a tad and optionally do this lossy, too.

jpegoptim can remove non-essential information like comments, color profiles, extra bytes at the beginning or end of a file, plus assorted other bits and pieces.

Again, this one isn’t pre-compiled. But…

brew install jpegoptim

… will get you up and running.

Since there are no demo JPEGs in the demo folder, we can use ImageMagick from part 1 of this series to generate some:

mogrify -format jpg ~/Desktop/demo/*.png

… and then do a lossless compression…

jpegoptim -pt --strip-all 1.jpg

or a lossy one:

jpegoptim -pt --strip-all -m60 1.jpg

Documentation: man jpegoptim or the online man page.

(7) gifsicle

Gifsicle is a command-line tool for creating, editing, and getting information about GIF images and animations. Making a GIF animation with gifsicle is easy.

The last player in our optimization toolkit is gifsicle, which will work on your GIF files to make them smaller.

brew install gifsicle

I hesitated to add this to the list, since I tend to convert all of my non-animated GIFs to PNGs (and gifsicle doesn’t work on those). Anyway, this way all of your compression needs will be covered.

gifsicle -b --optimize=3 your-gif-i-dont-have-one.gif

Documentation: man gifsicle or the official website.

The Basics Of Building A Script

If you want to edit more than one file at a time, the easiest way is to put it all in a script. Remember the loop from part one? We can use it to run our files through a whole folder like this:

for i in ~/Desktop/demo/*.png; do convert "$i" "${i%.png}.jpg"; done;
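The `${i%.png}.jpg` part is plain shell parameter expansion: `%.png` strips the `.png` suffix from the variable, and `.jpg` is appended in its place. A quick way to convince yourself (with a made-up file name):

```shell
i="$HOME/Desktop/demo/1.png"   # hypothetical file name

# strip the ".png" suffix, then append ".jpg" in its place
echo "${i%.png}.jpg"           # prints the same path ending in 1.jpg
```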

Our script will take all files in the demo directory on your desktop, copy them to a temp folder, optimize them and put the optimized versions in a folder called “optimized images” on your desktop.

#!/bin/bash

# Variables
targetDir=~/Desktop/optimized\ images
backupDir="$(mktemp -d /tmp/image-opt.XXXXXX)"

# Copy the images to the temp folder (silently skip formats that aren't there)
cp ~/Desktop/demo/*.png "$backupDir" 2>/dev/null
cp ~/Desktop/demo/*.jpg "$backupDir" 2>/dev/null
cp ~/Desktop/demo/*.gif "$backupDir" 2>/dev/null

# Optimize PNG's
function optimizePNG {
    # Lossy (good) -- for simple screen shots change the value to 65-80
    pngquant --quality=95-98 --iebug --ext .png --force "$i"
    # Lossless
    pngcrush -reduce -brute -l9 "$i" "${i%.png}-crushed.png" && mv "${i%.png}-crushed.png" "$i"
    pngout "$i"
    optipng -o7 "$i"
    advpng -z -4 "$i"
}

# Optimize JPG's
function optimizeJPG {
    # Lossy
    #jpegoptim -pt --strip-all -m60 "$i"
    # Lossless
    jpegoptim -pt --strip-all "$i"
}

# Optimize GIF's
function optimizeGIF {
    gifsicle -b --optimize=3 "$i"
}

cd "$backupDir" || exit 1

# Check if files exist (ignore directories)
if ! ls -ld *.jpg | grep -v '^d' > /dev/null 2>&1; then
    echo "No JPG's found."
else
    for i in *.jpg; do optimizeJPG; done
fi
if ! ls -ld *.png | grep -v '^d' > /dev/null 2>&1; then
    echo "No PNG's found."
else
    for i in *.png; do optimizePNG; done
fi
if ! ls -ld *.gif | grep -v '^d' > /dev/null 2>&1; then
    echo "No GIF's found."
else
    for i in *.gif; do optimizeGIF; done
fi

# Append "-opt" to all images
for i in "$backupDir"/*; do
    basename="$(basename "$i")"
    ext="${basename##*.}"
    mv "$i" "$backupDir/${basename%.*}-opt.$ext"
done

# Create an optimized images folder on the Desktop and move all optimized images into it
mkdir -p "$targetDir"
mv "$backupDir"/* "$targetDir"/
rmdir "$backupDir"

Save this code as a shell script, make it executable with chmod +x and give it a go in the Terminal app.

App tip: CodeRunner is a pretty nice OS X app for testing all kinds of scripts.

Feel free to send me your optimization scripts and feedback. I’m always looking for ways to improve my workflow, and I’m genuinely curious to see what others build.

Regarding ImageOptim: Jamie has a superb wrapper for image compression and it would be a shame not to mention it here.

It’s an alternative for those who just seek compression. I stumbled upon it while writing my Hazel workflow. I decided to stick to my own set of scripts because I already run them on my server (where there is no AppleScript) and the idea behind the CLI is the same thing I already set up (i.e. bring ImageOptim, ImageAlpha and JPEGmini to the command-line). In addition, it’s easier for me to maintain and expand my scripts this way. Lastly, I never got JPEGmini to work with it: I upgraded from JPEGmini Lite via in-app purchase, which ImageOptim-CLI doesn’t support. Later on I got JPEGmini Pro, which currently isn’t supported either. Jamie is working on fixing this.

In short: if you’re just looking for compression (and don’t want any of the upcoming extras), don’t want to install seven different binaries and would rather remote-control the GUI versions, then ImageOptim-CLI could be the thing for you.

Check out the stats to see why the tools we chose are the best at hand.

  1. A German tech podcast I recently started with Sven Fechner and Andreas Zeitler.
