Originally Posted by
rhadorn
The first issue relates to the ppi - dpi relationship:
- how much bigger must the dpi be to produce a given picture quality (understood as richness in grey levels)?
This is really a question about the trade-off between resolution and tonal quality (see the response to the second question for much more on this). For those following this thread, also take a look at the tutorial on digital pixels, PPI and print size. (PPI = pixels per inch, DPI = dots per inch)
Dithering is a technique which allows a printer to produce more colors than it actually has (as separate ink cartridges or toner). This is probably most familiar when looking closely at magazine or newspaper images in print, as you will see that colors/tones are actually comprised of tiny patterns using a relatively small number of colors or black dots/shapes.
Continuous tone image (no dither):
Dithered image (magnified 400%):
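If you'd like to see the mechanics, here's a minimal sketch of ordered dithering in Python (the 4x4 Bayer matrix is the textbook one; real printer drivers use their own, more sophisticated patterns). It renders a smooth gray ramp using only black "X" and white "." characters, which is essentially what a dithered print does with dots of ink:

```python
# Minimal ordered-dither sketch: approximate a smooth grayscale ramp
# using only "black" (X) and "white" (.) dots, the way a printer's
# dither pattern simulates tones it cannot print directly.

# Classic 4x4 Bayer threshold matrix, normalized to 0..1
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]
THRESHOLDS = [[(v + 0.5) / 16.0 for v in row] for row in BAYER_4X4]

def dither_ramp(width=64, height=8):
    """Render a left-to-right gray ramp (0 = white, 1 = black) as text."""
    for y in range(height):
        row = []
        for x in range(width):
            gray = x / (width - 1)              # target tone at this position
            threshold = THRESHOLDS[y % 4][x % 4]
            row.append("X" if gray > threshold else ".")
        print("".join(row))

dither_ramp()
```

Run it and you'll see the dot coverage increase smoothly from left to right, even though each position is only ever black or white -- the same trick as in the magnified dithered image above.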
Originally Posted by
rhadorn
- how much better is a 1200/4800 resolution compared with a 1200/1200 resolution? Is it four times 'better'?
- the print raster is not recognizable, even if one looks at the picture through a microscope. Which are the concepts behind the pixel to dot conversion?
The necessary DPI to achieve a given quality PPI primarily depends on how many colors of ink the printer has to work with, and on the level of sophistication of the printer's drivers for dithering and halftoning. There's no one-size-fits-all answer here, but we can make some crude generalizations.
You mention an example of a 1200/4800 PPI/DPI printer, but I'm going to ignore the PPI part of the printer specification because it depends on how they define a pixel. Is this an 8-bit pixel, a 10-bit pixel, a 16-bit pixel...? It all depends. Oftentimes the marketing team gets so involved in determining the PPI for dithered printers that it's not very well standardized between models and manufacturers.
Instead, let's define our own pixel. In a simplified example, let's make this a pixel with 2-bit grayscale (plus white), created from a black/white dither pattern. Each pixel is therefore comprised of a 2x2 arrangement of black ink dots, producing 4 different possible shades, plus all white:
Alternatively, our pixel could be defined such that it was comprised of a 4x4 pattern of black dots-- yielding 4X as many tones but with half the resolution (at that tonality). Here's an example of one such pixel:
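Since it's easier to reason about these cells when you can see them, here's a rough Python sketch that prints an n x n dither cell with a given number of black dots (the simple scan-order fill is purely for illustration; real drivers arrange the dots much more cleverly, as discussed below):

```python
# Print an n x n dither cell with k of its n*n dots inked
# (X = black dot, . = blank paper). Fill order is plain scan order,
# purely for illustration.

def show_cell(n, k):
    for row in range(n):
        print("".join("X" if row * n + col < k else "." for col in range(n)))
    print()

# The 2x2 example above: all white plus 4 shades (1-4 dots inked)
for k in range(5):
    show_cell(2, k)

# The 4x4 example: a mid-gray, with 8 of its 16 dots inked
show_cell(4, 8)
```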
From these two examples we could conclude that for a given dither size/type and number of ink cartridges, twice the DPI yields 4X as many tones. Alternatively, for a given number of tones per pixel, twice the DPI yields twice the resolution. There's a limit to how many tones we can perceive over a given distance, and to our eye's resolution, so there's usually an optimum pixel size for a given DPI. Too much resolution at the expense of tonal quality, or vice versa, may therefore actually harm image quality.
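To put rough numbers on that trade-off, here's a quick sketch (Python; the 1200 DPI engine resolution is just an example figure, not any particular printer) counting tones versus effective pixel resolution for a black-only dither cell:

```python
# Tones vs. effective resolution for an n x n black/white dither cell.
# An n x n cell can hold 0..n*n inked dots, i.e. n*n + 1 distinct tones,
# but each doubling of the cell edge halves the effective PPI.

PRINTER_DPI = 1200  # example engine resolution, dots per inch

for n in (2, 4, 8):
    tones = n * n + 1                 # from all-white up to all-black
    effective_ppi = PRINTER_DPI // n  # one dithered "pixel" spans n dots
    print(f"{n}x{n} cell: {tones:3d} tones at ~{effective_ppi} PPI")
```

Doubling the DPI while keeping the cell the same physical size gives you four times as many dots per cell, which is where the "4X as many tones" figure above comes from.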
However, as with most things, it's not always that straightforward. Many high-DPI printers actually improve their resolution dramatically near the edges of objects within the image. Even though a dithered pixel may be 2x2 or 4x4 as in the above examples, the printer could intelligently place the dither pattern so that it more closely follows the edge of an object-- giving the appearance of higher resolution and certainly making sharper edges (although with fewer tones). Further, if the printer had more possible inks (such as black plus gray instead of just black in the above examples), it could produce many times more tones per pixel using a given number of dots per inch. Additionally, a given tone can often be represented by many different possible dither patterns; many printers are intelligent about randomizing these to make dithering less apparent, or will use one pattern over another to improve resolution. Unfortunately, some of this technology is proprietary or a trade secret, so you will likely not get a printer company like HP, Epson or Canon to fully divulge their dithering secrets.
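As a back-of-the-envelope illustration of the extra-ink point, here's a short Python sketch (the ink densities of 0, 0.5 and 1.0 are made-up values, not a real ink set) counting the distinct average tones a 2x2 cell can produce once each dot can also be laid down in a gray ink:

```python
# Distinct tones from a 2x2 dither cell when each dot can be printed at
# several ink densities (e.g. white / gray / black), not just on or off.
from itertools import product

def distinct_tones(cell_dots, ink_densities):
    """Count the distinct average tones a cell of `cell_dots` dots can make."""
    tones = {sum(combo) / cell_dots
             for combo in product(ink_densities, repeat=cell_dots)}
    return len(tones)

CELL = 2 * 2  # the 2x2 example cell from above

# Black-only: each dot is either blank (0.0) or full black (1.0)
print(distinct_tones(CELL, [0.0, 1.0]))        # -> 5 tones

# Add a hypothetical 50% gray ink: each dot can be 0.0, 0.5 or 1.0
print(distinct_tones(CELL, [0.0, 0.5, 1.0]))   # -> 9 tones
```

That's nearly double the tones from the same 2x2 cell, without giving up any resolution-- which is why light gray and light color inks are so common on photo printers.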
In the real world, there are also diminishing returns. Every time DPI doubles, the perceived tones and resolution improve by less than double-- and increasingly so as the resolution approaches the limits of what we can see. If you're looking at the dithered image under a loupe/microscope, then sure, DPI without limit is best, but most images are not viewed like this. In my mind there comes a point when other image quality factors become MUCH more important, such as color gamut and dynamic range.
Also, the above discussion applies to dithered printers; many photo labs use continuous tone printers, where only PPI matters and DPI is no longer applicable.