Thumbnail sizes

What's the done thing these days with thumbnail sizes? How do you make the trade-off between serving images that are small enough to load quickly, but large enough to look good on ridiculously high resolution phones?

Currently, all of the DNA Lounge flyers are encoded at 1920px along the long edge, with thumbnails being 320px wide. Those smaller thumbs are what show up on the flyer index pages and on the calendar pages.

I picked that number 320px a long time ago, before the prevalence of desktop retina displays and the mobile resolution explosion, so it's probably no longer an ideal choice.

For an image to look sharp, you want to give the browser an image that is at least as large as it will be rendered on the screen in real hardware pixels, and those numbers are enormous these days. For example:

An iPhone 13 Pro Max has a 428×746 pixel viewport, but hardware pixels are 3x, so that's really 1284×2238. An iPhone 8 is 375×548 at 2x = 750×1096. And even my lowly 2013 iPad Pro is 1118×1232 at 2x = 2236×2464. On the calendar page it displays the flyer thumb desktop-style at 33% width, though, so it would only need a thumb that was... 745px wide.
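
That arithmetic, as a quick sanity check (the device numbers are the ones quoted above; 33% layout width means dividing by 3):

```shell
# CSS viewport width x device-pixel-ratio = hardware pixels an image must cover.
for device in "iPhone-13-Pro-Max 428 3" "iPhone-8 375 2" "iPad-Pro 1118 2"; do
  set -- $device
  name=$1; css=$2; dpr=$3
  hw=$(( css * dpr ))
  echo "$name: ${css}px CSS x ${dpr}x = ${hw}px wide; at 33% width: $(( hw / 3 ))px"
done
```

The iPad line works out to 745px, matching the number above.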

So I could just throw up my hands and serve the full-sized image all the time. But there's that pesky trade-off between quality and bandwidth. I would still like these pages to load fast over slow connections! Shitty wifi is still a concern! For that Halloween flyer, the big image is 573 KB and the small one is only 57 KB.

So what's the done thing?

(Don't say jQuery.)

BTW, a while back I made a page for testing img srcsets to see what sizes are actually getting loaded on what devices and to help figure out why, which is informative when thinking about this stuff.
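
For context, a srcset/sizes setup along these lines is what that tester exercises. Here's a sketch of generating the markup for a ladder of sizes; the file names, width ladder, and breakpoint are made-up placeholders, not what the site actually serves:

```shell
# Build an <img srcset> attribute from a ladder of thumbnail widths.
widths="320 640 960 1280 1920"
srcset=""
for w in $widths; do
  srcset="${srcset:+$srcset, }flyer-${w}.jpg ${w}w"
done

# "sizes" tells the browser the layout width: 33vw desktop-style,
# full width on narrow screens.
printf '<img src="flyer-320.jpg"\n  srcset="%s"\n  sizes="(min-width: 768px) 33vw, 100vw" alt="flyer">\n' "$srcset"
```

The browser then roughly picks the smallest candidate that covers sizes × devicePixelRatio, which is why the tester sees different files load on different hardware.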



24 Responses:

  1. Ben Hamilton says:

    Sorry, srcset, and looks like you already knew that anyway.

  2. Greg says:

    Have you considered using WebP images? They're surprisingly well supported these days and would cut down on big image sizes if you go that route.

  3. jwm says:

    I can't find any advice better than "double the dimensions", so I hope someone with design clues replies, as I'd like to know too. It does look like the answer may so heavily depend on your images and traffic that no general advice works.

    You may be able to claw back the thumbnail size increase with webp, or even with a more modern jpeg encoder — according to this article MozJPEG delivers a 10% improvement versus the reference encoder, but webp is generally better below 500px. (It's also possible that webp lossless might have a similar size to jpeg at the quality level needed to avoid artifacts near text, but I doubt it.)

    (AVIF is genuinely better than both for lossy compression, but is still at the "come back next year" stage of browser availability.)

    If I were trying to figure this out for my day job I'd make sure my servers were reporting client IP and time taken to deliver each GET, then stuff a few months' worth of data into some analysis tools to look for the slowest 95%-ile, both geo-fenced to the bay area and world-wide, to figure out how slow "shitty wifi" actually is, in practice. I'm willing to bet there will be a messy long tail in that data, but you'll be able to figure out what sort of page size budget you have to play with.

    • jwz says:

      I am already using the more modern jpeg encoder, which involved a custom build of imagemagick, but was worth it. Huge size difference.

      • Derpatron9000 says:

        Details please, should time allow.

        • jwz says:

          I see that MozJPEG 4.x is out now but there's no changelog from 3.2 saying why I should care, so I haven't upgraded. Here's how I build:

          # prerequisite for mozjpeg's SIMD code:
          port install nasm

          # in the mozjpeg source tree (default prefix is /opt/mozjpeg):
          ./configure && make && sudo make install

          # then in the ImageMagick source tree, pointing it at mozjpeg:
          export CC=gcc
          export CPPFLAGS='-I/opt/mozjpeg/include -I/opt/local/include'
          export LDFLAGS='-L/opt/mozjpeg/lib -L/opt/local/lib'
          ./configure --prefix=/opt/local --with-perl=/opt/local/bin/perl \
            --without-x --with-gslib --with-quantum-depth=16 \
            --with-bzlib --with-fontconfig --with-freetype --with-gslib \
            --with-jng --with-jpeg --with-lcms --with-png --with-ps \
            --with-tiff --with-xml --with-zlib --with-heic \
          && make CC=gcc \
          && ( cd PerlMagick && perl Makefile.PL && make CC=gcc ) \
          && sudo make install \
          && ( cd PerlMagick && sudo make install )

          Verify with:

          otool -L /opt/local/bin/convert | grep mozjpeg

  4. Wout says:

    I use client-side js to determine desired density, serving very low res+blur on load and then auto-generate images on the server.

    This code is several years old and doesn't work with CDNs.

    Now I would use srcset but still auto-generate. If I wanted a CDN I'd probably use cloudinary or figure something out with CloudFlare workers.

    As for actual sizes, I just picked some numbers back then. I do tend to pick higher resolutions but lower quality settings.

  5. Big says:

    On my iPhone 13 (not Max), the thumbnails on your linked index page look just fine to me. Even zoomed up to full screen width there’s no way I’d bitch about the image quality there. All the small print for each flyer is readable, with the exception of the url at the bottom of the Toxic Summer one, which isn’t quite legible but I can easily guess what it says.

    If I were in your place, I’d go “Meh, what’s there now does the job just fine.” (Unless, I guess, you’re expecting them to be displayed on a huge 4K screen somewhere?)

    • jwz says:

      If you take a screenshot and zoom in you'll be able to see that the pixels in the image are twice-or-more as large as the pixels in the text. Some people perceive that as being blurry.

      Particularly the kind of people who need to believe that they are making the right choice by spending $1200 on a new phone every year.

      • If you think you need to appeal to the $1200 phone every year, but can’t decide on a $20 gig ticket if the image is totally readable but a little bit blurry, then I guess you chose the right business to be in…

        (Also, image pixels being larger than text pixels has been a thing since way back when I occasionally moonlighted driving PageMaker and Quark back in the 90s. We all sent text to the printers at 300dpi but except for expensive glossy printing images worked just fine at 75dpi… Kids these days…)

  6. Agreeing with the other comments here: high res but low bpp does look better, down to somewhere around "-quality 70" at least, for most people looking at photographs.

    I'm crossing my fingers that AVIF support will become widespread soonish because it does promise to make this better: AIUI, the better intra compression makes larger image sizes cheaper.

    I think about 2x the nominal display pixels looks good to me (i.e. about 200dpi) even though I'm using a phone with a nominal 300dpi display. To a certain extent I think the really high DPIs are more for making text rendering look super nice than for images?

    Might be worth making a copy of one of the gallery pages with all full size images for a side by side comparison. At the moment the images in there look very very slightly blurry to me.

    • P.S. I'll be very embarrassed but not really surprised if it turns out my "the images in there look slightly blurry to me" directly contradicts my "about 200dpi looks fine to my eyes" ;)

  7. Aristotle says:

    Just chiming in to concur with the suggestion to decrease quality while increasing resolution. You don’t want the parameters too far apart – low res but super-high quality is daft, as is high res with super-low quality – but in general, as you go up in res you can afford to knock the quality back further than before. The more real signal you start with (meaning huge res alone is worthless if your S/N is bad, too), the more headroom you have for lossy compression without perceptible degradation. If your first step is to downsample to a low res, you’ve already thrown away most of the signal before the encoder can start picking what to throw away, so you can’t turn its quality knob down as much.

    (That’s all particularly relevant for video. Still images are much more demanding. But the principle still applies.)

    • jwz says:

      Interesting... if I just do "convert -quality 10" on the large Halloween image without changing the dimensions, I get a 98 KB file, which is tolerably larger than the old 57 KB thumb, and when scaled down to the same size, they're pretty similar, I guess. And when the 360px image is scaled up to the original size, it does look somewhat crappier. So maybe this is the way to go? Hard to say.
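
      A sketch of running that comparison across a few quality settings, printed as a dry run with made-up file names rather than executed:

      ```shell
      # Print the ImageMagick commands for a quality ladder at full
      # resolution, so the resulting sizes can be compared against
      # the old small thumb.
      for q in 10 20 30 50; do
        echo "convert flyer-1920.jpg -quality $q flyer-1920-q${q}.jpg"
      done
      echo "ls -l flyer-1920-q*.jpg  # then eyeball size vs. artifacts"
      ```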

      • Eric TF Bat says:

        Weird that the colour of the text "A HALLOWEEN EXTRAVAGANZA!" is clearly different between those two on my screen, but I can't see why. Surely it's not reducing the colour palette and picking a "near enough" choice when there are so many obvious gradations in the hair, for example. Image compression is Deep Magic.

        My only useful contribution is that webp is aggravating if you're supporting old iOS devices, and that the last time I tried I couldn't get the HTML <picture> element to behave itself with my CSS in time for a deadline. That may be irrelevant to you right up until it bites you and makes you waste half a day debugging stuff that should Just Work.

        • frandroid says:

          JPEG is known to be crummy with fine lines/high contrast at low rez. The picture on the right has 50% more pixels or so, and that makes a big difference for those edges.

      • Netluser says:

        Scrutinizing the two by flipping back and forth with them at the same scale, there's two details I see better on the 360px - the fine detail in the illustration actually looks higher res because of the blockiness under the eye shadow, and the skin tone is not banded with red-greenish like it is on the 1280px - the block of red near the nose is visible no matter how much I zoom out.

        The melting black text in the textbox at the bottom is also an eyesore, so 10 quality is probably not enough. 50% zoom (half resolution) leaves the text legible, but chroma subsampling if that background was red or blue might kill any readability that's left, though it's still better than the 360px thumbnail where it's simply impossible to read.

        My worthless opinion: don't leave the quality so low that color banding and block artifacts are visible to the naked eye. 700-something to 800 pixels should be plenty to keep text readable, and hiDPI displays shouldn't need to upscale too much at 800px. So I'd compromise, keep enough quality to prevent artifacts from being offensive to the eye, but raise the resolution enough that tiny text is not painfully obvious at high DPI.

        Alternatively, if bandwidth is cheap enough thanks to the kilo-gigabit technology available in 2021 (bandwidth will never be cheap enough), saying "fuck it" and going full res may be the correct option. If you don't pay by the megabyte, a trial run serving full res, medium quality "thumbnails" is worth trying. Can always change it again.

  8. halcy says:

    What you're writing on the srcset tester page doesn't seem to be strictly true (anymore? always? for some configurations and browsers?) - for my browser (slightly outdated firefox on windows), it actually does dynamically load in different images as I resize, no reloading needed.

    • jwz says:

      Well that's weird, because everything I've read about it (which I hesitate to elevate by calling it "documentation") says that it absolutely does not do that. Safari doesn't. Firefox appears to reload the image as the window is resized. And Chrome appears to load two of them (1024 and 1920) regardless of the size of the window or zoom factor, and always displays the 1920 version no matter what.


    • margaret says:

      on 11.6 chrome and firefox if i make the window small and load the page i get 640 on both then if i expand the page both will switch to 960, 1280, and 1920. resizing to smaller again firefox will go back down but chrome stays at the largest. safari sticks with whatever it got first. good luck with that.

  9. frandroid says:

    1) I see you have progressive JPEG which helps more on the individual large posters, but it's not making a big difference on a page with multiple thumbnails. Wondering if there are more steps to the progressiveness that can be introduced for the posters.

    2) You could use loading="lazy" on your image tags

    If you meant "no JavaScript" by no-jquery, then ignore the following...

    3) You could stack lazy-loading with double-loading, where you first load a lower rez version of the thumbnails for fast loading. Then when all the thumbnails (a number I just picked out of a hat) are loaded, you load a second set of higher res pictures if srcset (or a media query) says you need them. So you're progressive-loading horizontally across elements, in some way.
