Pixel Math

Color Depth and File Size

All other things being equal, the size of a raster-based image (e.g., TIFF) varies directly with its color depth. The simplest case, black and white, requires only a single bit per pixel: each pixel is either white or black, expressible as a 1 or a 0. Assigning shades of gray requires more bits; to get 256 shades from white to black requires 8 bits per pixel. Thus, a grayscale scan of the same image at the same resolution is eight times the size of a black-and-white scan. To achieve full color, you need 256 shades in each of the RGB primaries, tripling the size again to 24 bits per pixel. A CMYK color scheme adds a fourth channel, making the file four times the size of a grayscale file. Thus, the file size for an image of any given physical dimension scanned at a given resolution can vary by a factor of 32 (from 1 bit to 32 bits per pixel).
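The arithmetic above can be sketched in a few lines of Python (the function name and the 1000 x 1000 example image are mine, just for illustration):

```python
def raster_size_bytes(width_px, height_px, bits_per_pixel):
    """Uncompressed size: total pixels times bits per pixel, converted to bytes."""
    return width_px * height_px * bits_per_pixel // 8

# A hypothetical 1000 x 1000 pixel image at the four depths discussed:
bilevel   = raster_size_bytes(1000, 1000, 1)   # black and white: 1 bit per pixel
grayscale = raster_size_bytes(1000, 1000, 8)   # 256 shades of gray
rgb       = raster_size_bytes(1000, 1000, 24)  # 256 shades per RGB primary
cmyk      = raster_size_bytes(1000, 1000, 32)  # four 8-bit channels

print(grayscale // bilevel)  # 8  -- grayscale is 8x black and white
print(cmyk // bilevel)       # 32 -- the full factor-of-32 range
```

Note this models uncompressed data only; real TIFF files carry headers and are often compressed, so actual sizes will differ somewhat.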

Resolution and File Size

10 pixels per inch, 1 inch by 1 inch = 100 pixels.

20 pixels per inch, 1 inch by 1 inch = 400 pixels. (Go ahead, count 'em, I'll wait.) Twice the resolution, four times as many pixels.

File size also varies with the resolution. In this case it varies by the square of the change in resolution. If you have a 1-inch by 1-inch scan made at 10 pixels per inch, it will have 100 pixels in it. If you double the resolution to 20 pixels per inch, it will have 400 pixels in it. The change in resolution was a factor of 2, and the change in the file size was 2 squared (4). If, instead, you had increased the resolution to 30 pixels per inch, the one inch square would have 900 pixels. For a resolution increase of 3X, you achieve a pixel increase of 9X.
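The square law above is easy to check in code. A minimal sketch (the function name is my own, for illustration):

```python
def pixel_count(width_in, height_in, ppi):
    """Total pixels in a scan of the given physical size and resolution."""
    return round(width_in * ppi) * round(height_in * ppi)

print(pixel_count(1, 1, 10))  # 100
print(pixel_count(1, 1, 20))  # 400 -- 2x the resolution, 4x the pixels
print(pixel_count(1, 1, 30))  # 900 -- 3x the resolution, 9x the pixels
```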

Combine 'em

If, for instance, you decide to scan in 24-bit color rather than 8-bit grayscale, and you'd like to go from 100 dpi to 200 dpi, the increase in file size is 3 for the color depth increase times 4 for the resolution increase, for a total of twelve times the file size. If you had a 1 MB grayscale scan, you now have a 12 MB color scan.
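Both effects combine as a single multiplier: the depth ratio times the square of the resolution ratio. A quick sketch (function name mine):

```python
def size_multiplier(old_bits, new_bits, old_ppi, new_ppi):
    """How much larger the new scan is: depth ratio times resolution ratio squared."""
    return (new_bits / old_bits) * (new_ppi / old_ppi) ** 2

# 8-bit grayscale at 100 dpi -> 24-bit color at 200 dpi:
m = size_multiplier(8, 24, 100, 200)
print(m)  # 12.0 -- a 1 MB grayscale scan becomes a 12 MB color scan
```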

So, how much do you really need?

The answer will vary with the output device, and depends not on the dpi of the scan, but on the total pixels needed across the maximum width of the display. As an example, if you need to fill an 800 X 600 display with a PowerPoint image, and the photograph is four inches by six inches, then a scan at 133 dpi would be sufficient to address every pixel on the screen (6 X 133 = 798). For display screens, a resolution sufficient to address every pixel is enough. Other uses require other standards. Generally speaking, professional offset printing requires files with no less than 1.5 times the halftone screen frequency (often 133 lines per inch, though sometimes higher) at their printed size. If an image is to appear in print at 4 inches across, then it must have a width of at least 800 pixels (4 X 133 X 1.5 = 798). Mind you, this is a minimum, and it will be pretty obvious in print. 1600 pixels would be better, and 2400 better still. Much beyond three times the halftone frequency, you begin to waste processing time for no improvement in image quality at any given size.
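Both rules of thumb above can be wrapped in small helpers, sketched here with hypothetical names; the 1.5x quality factor is the minimum the text describes:

```python
import math

def scan_ppi_for_display(display_px_across, original_in_across):
    """Resolution needed so the scan just fills the display width,
    rounded up to the next whole pixel per inch."""
    return math.ceil(display_px_across / original_in_across)

def min_pixels_for_print(printed_in_across, screen_lpi, quality_factor=1.5):
    """Minimum pixel width for offset printing at the given halftone screen."""
    return math.ceil(printed_in_across * screen_lpi * quality_factor)

# Fill an 800-pixel-wide display from a 6-inch-wide photo:
print(scan_ppi_for_display(800, 6))  # 134 -- about the 133 dpi in the text
# Print 4 inches across at a 133 lpi halftone screen:
print(min_pixels_for_print(4, 133))  # 798 -- roughly the 800-pixel minimum
```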