On photographic resolution

Another one of my weekend posts about photography, and 2 or 3 times the length I'd have liked. Still, I have managed to get most of the key ideas about resolution into it.

Photography is about more than the amount of recorded detail: it is also about choice of subject, camera position, timing, lighting, lens choice, colour rendition and so on. But photographers get hung up on resolution - the ability to resolve detail - and every time we talk about it in the digital realm things go wrong.

The process of making a photograph divides into three parts.

First, the lens throws an image onto the back of the camera. Lens resolution is usually measured in line pairs per mm - we can multiply this by the width of the image to get image resolution.

Second, the image is captured on film or a digital sensor. The combination of image resolution and the fidelity of the medium gives a recorded resolution. The digital world equates "resolution" with "pixel count", e.g. 3000x2000 pixels - 6MP. This indicates how faithfully the image is digitized, not the overall quality; it's only useful for comparing different recordings made with the same lens.

We can express sensor resolution in line pairs, and you can't reliably record a pair of lines with just two rows of pixels - if the pattern of line pairs is offset by half a pixel, everything records as an even grey - so we need three pixels per line pair.

Common sense tells us that recorded resolution can't exceed the resolution of the recording medium or of the image formed by the lens. The film world could get an indication of resolution in the photograph using the formula 1/R = 1/L + 1/M (where R, L and M are the Recorded resolution, the Lens resolution and the Medium resolution, in line pairs per unit width). Fleshing this out with some numbers: a photograph made with a top-quality 100 lp/mm lens on 100 lp/mm film will have a resolution of 50 lp/mm, or 1800 line pairs over the width of a 35mm film frame. Halving the lens (or film) resolution doesn't halve the final resolution - it only goes down to 33 lp/mm, or 1200 line pairs per frame.
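
As a quick check on that arithmetic, here's a minimal sketch in Python; the numbers are the ones used above, not measurements:

    def combined_resolution(lens_lp_mm, medium_lp_mm):
        """Combine lens and medium resolution: 1/R = 1/L + 1/M."""
        return 1 / (1 / lens_lp_mm + 1 / medium_lp_mm)

    frame_width_mm = 36  # width of a 35mm film frame

    for lens_lp_mm in (100, 50):
        r = combined_resolution(lens_lp_mm, 100)  # on 100 lp/mm film
        print(f"{lens_lp_mm} lp/mm lens on 100 lp/mm film: {r:.0f} lp/mm, "
              f"{r * frame_width_mm:.0f} line pairs across the frame")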

People argue about the application of this formula to digital, but it fits observed behaviour. If we scan an image with 1200 line pairs of resolution on a scanner which has 1800 line pairs of resolution (roughly 4000 DPI, giving a 20MP image), we'll get an image with 720 line pairs of resolution. I've seen it argued (wrongly) that, in order to match film, digital cameras need as many pixels as the best scan. Digital SLR owners find their pictures aren't much worse than film, and in some cases seem a little better, despite having an image area smaller than film. Lens resolution per mm remains the same, but a smaller image has fewer lines across its width: a typical DSLR sensor gets 1200 or 2400 line pairs with a 50 lp/mm or 100 lp/mm lens. Compact digital cameras have sensors in the region of 7mm wide, giving only 700 line pairs of resolution with a 100 lp/mm lens (and compacts' lenses aren't that good). The point here is that, all other things being equal, a bigger sensor produces a better image.
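
Those "line pairs across the frame" figures are just lens resolution multiplied by sensor width. A quick sketch, with sensor widths that are rough assumptions (36mm full frame, 24mm for a typical DSLR, 7mm for a compact):

    # Line pairs across the image = lens resolution (lp/mm) x sensor width (mm).
    # The widths are rough assumptions for illustration.
    sensors = {"35mm film / full frame": 36, "typical DSLR": 24, "compact": 7}

    for name, width_mm in sensors.items():
        for lens_lp_mm in (50, 100):
            print(f"{name}: {lens_lp_mm} lp/mm lens -> "
                  f"{lens_lp_mm * width_mm} line pairs across the frame")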

We can predict the resolution we'll get in the final image.

Sensor resolution | Pixels wide | Lines resolution | Recorded res from a 2400-line image | Recorded res from a 1200-line image
3MP               | 2100        | 700              | 541                                 | 442
6MP               | 3000        | 1000             | 705                                 | 545
10MP              | 3900        | 1300             | 843                                 | 624
15MP              | 4800        | 1600             | 960                                 | 685
22MP              | 5700        | 1900             | 1060                                | 735

This shows that recorded resolution isn't proportional to the number of pixels. It also shows that better lenses benefit more from extra pixels. Further, upgrading the lens can give more improvement than adding pixels.
In short, all pixels are not equal: pixel count alone doesn't tell us how much detail is recorded. (See point #1 in this Popular Photo "Truth Squad" piece.)
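
For anyone who wants to check or extend the table, a short script reproduces the numbers, assuming three pixels per line pair and the combining formula above:

    # Rebuild the table: sensor lines = pixels wide / 3 (three pixels per line pair),
    # then combine with the lens image using 1/R = 1/L + 1/M.
    def combined(l, m):
        return 1 / (1 / l + 1 / m)

    cameras = [("3MP", 2100), ("6MP", 3000), ("10MP", 3900), ("15MP", 4800), ("22MP", 5700)]

    for label, pixels_wide in cameras:
        sensor_lines = pixels_wide // 3
        from_2400 = int(combined(2400, sensor_lines))  # good lens: 2400-line image
        from_1200 = int(combined(1200, sensor_lines))  # lesser lens: 1200-line image
        print(label, pixels_wide, sensor_lines, from_2400, from_1200, sep="\t")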

The third stage in making a photo is printing. With film, an enlarging lens projects the image onto photosensitive paper and the net resolution can be calculated as before. Digital printing doesn't suffer the same losses as printing from film: 35mm film's bigger image holds more detail, but the other steps in the process erode this advantage.

Questions I see about pixels and print sizes boil down to one of two things. The first is "When will my pictures look pixelated?". I want to sink the myth of 300 DPI here. 300 dots per inch was the finest half-tone screen in the printing industry, but today's photo printers can print well in excess of 3000 DPI. Somehow 300 DPI became a wish for 300 pixels per printed inch. In fact the printer will interpolate the number of pixels you give it to produce the (fixed) number of dots it prints. Try making copies of a picture scaled down to give a variety of PPI values in a print of (say) 6x4 size, and compare the results when printed. You'll find that below 100 pixels per printed inch the print looks pixelated; above 150 it doesn't. You can read about someone who tried it.
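
If you want to run the test yourself, something like this will make the scaled-down copies; it's a sketch using the Pillow library, and the filename and PPI values are only examples:

    # Scale a photo to give different pixels-per-printed-inch values
    # on a 6x4 inch print, then print the copies and compare them.
    from PIL import Image

    PRINT_WIDTH_INCHES = 6
    original = Image.open("test.jpg")  # any reasonably detailed photo

    for ppi in (75, 100, 150, 300):
        width = PRINT_WIDTH_INCHES * ppi
        height = round(original.height * width / original.width)
        copy = original.resize((width, height), Image.LANCZOS)
        copy.save(f"test_{ppi}ppi.jpg")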

The second question is "Will I have enough detail to stop my pictures looking 'mushy'?". Small prints hide a lack of recorded detail; pore over a bigger print and it looks fuzzy (although you tend to view a bigger print from further away). Depending on how you view your pictures, you may not see the benefits or shortcomings of your camera. But how much detail do you need per printed inch?

Let's ignore changes in viewing distance and say "X line pairs per printed inch and above looks sharp". Line pairs aren't directly related to pixels, so there's no justification for the 300 DPI myth here (a 300x300-pixel block from a medium format lens's image recorded on a Phase One back holds more detail than the same-size block from a phone camera; they can't both be exactly what's needed for a one-inch square of print).
This is why the latest compact cameras can be disappointing. On an A3-size print (13x9 inches) 10MP gives 300 PPI, and 4MP only 175 PPI. You'd expect the 10MP to look superior, especially if you believe the 300 DPI myth. In practice the 4MP doesn't look pixelated, and the 10MP only wrings a little more detail out of the image formed in the camera. It reinforces the point: print resolution depends on print size, lens quality, sensor size and pixel count combined, not just one of them (like pixels).
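
The arithmetic behind those figures is just pixels across the frame divided by print width; a worked example with approximate pixel counts:

    # Pixels per printed inch = pixels across the image / print width in inches.
    print_width_inches = 13
    for label, pixels_wide in (("10MP", 3900), ("4MP", 2272)):  # 2272 = a typical 4MP frame width
        print(f"{label}: {pixels_wide / print_width_inches:.0f} pixels per printed inch")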

The 300 DPI myth is also to blame for bad resampling, or interpolation: I've encountered people who convert a 2100x1400 (3MP) compressed JPEG into a 10MP uncompressed TIFF before sending it to the lab to make a 13x9 print. They have heard that JPEG can lose detail and introduce artefacts into the picture, but converting JPEG to TIFF won't restore lost detail or remove artefacts.

Printers print at fixed numbers of dots per inch, with even cheap ones going up to 5760x1440, so most images need to be interpolated before hitting the paper. The printer will do this for you, but you can interpolate on your computer using anything from Windows Paint to specialised printing software. Interpolation can't bring back lost detail, but better interpolation can maximise what you get from an image file and give a better result overall. But which does a better job: the software inside your printer (or at the lab), or the software on your PC? The only way to tell is to try both. Interpolating to 300 PPI when the printer's native resolution is a multiple of 240 or 360 may actually produce worse results.
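
As an example of doing the interpolation yourself, here's a sketch using Pillow; the 360 PPI target and the 6x4 print size are assumptions chosen to suit a printer whose driver works in multiples of 360:

    # Resample an image to an assumed printer-native 360 pixels per inch
    # for a 6x4 inch print, so the printer driver has less rescaling to do.
    from PIL import Image

    TARGET_PPI = 360              # assumed native figure for this printer
    PRINT_SIZE_INCHES = (6, 4)    # a 3:2 image fills this exactly

    image = Image.open("photo.jpg")
    target_px = (PRINT_SIZE_INCHES[0] * TARGET_PPI, PRINT_SIZE_INCHES[1] * TARGET_PPI)
    resampled = image.resize(target_px, Image.LANCZOS)
    resampled.save("photo_for_print.tif", dpi=(TARGET_PPI, TARGET_PPI))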

Does all this prove anything? Only the futility of trying to judge much about pictures from numbers of any kind. In the end the photographer needs to stop worrying about pixels and ask "is it a good picture?", which depends on so many other things. I've said before that I can buy the same paints and brushes as a great painter; I can be taught to put paint onto canvas the same way, but that wouldn't make me a great painter. So much of what we do in photography amounts to a study of the paint, not the painting.