Color on the Web

I. Color Fundamentals

Color theory shows us that any color can be created by combining
the three primary colors. On the Web, as with Television, the primary colors
are Red, Green, and Blue. (Note that print production uses different colors
and a somewhat different process of combining them, but for our purpose we can
ignore print production.) If you were to take a magnifying glass and look at
the screen of a television, or at a monitor, you would be able to see that the
picture is made up of a grid of little dots, some Red, some Green, and some
Blue. You cannot see them without magnification; in normal viewing these
dots blend together into the picture you see, and you are never aware of
them. If you have never seen this, get a magnifying glass and check it out.

I’ll wait.

OK, now that you have seen these little dots, you know how the
system works. By adjusting the intensities of these three colors you can create
any color you can imagine. With Television, this is done in an Analog process,
like using a dial to turn up the intensity or turn it down. Theoretically, this
type of Analog process can produce an infinite number of colors, though at some
point the differences are so minute that the human eye could never distinguish
them. But with computers, everything has to be digital. So instead of an infinite
variation of the dial, the dial clicks through discrete steps, like an old TV
tuner knob. If you make the number of possible steps very large and make the
individual steps very close, the effect is much like the smooth Analog dial,
and you can get just as many colors.

II. Computer Video Displays

A computer displays video using two pieces of equipment: a video
adapter card, and a monitor. Each is designed with certain capacities to display
video data. In most cases, monitors do not pose any kind of problem. Most monitors
sold in the last few years can display as much data as anyone is likely to want
to display on them. Video adapters, though, can prove to be a bottleneck. To
see why, we need to look at how monitors display images.

A monitor is just a higher-quality version of a television set.
It needs to be. Displaying readable text requires much better resolution than
watching “Baywatch”. Find someone with WebTV and get them to show
you why TV is bad for displaying text. You will see why monitors need to have
higher resolutions. If you try the trick with the magnifying glass on a computer
monitor, you will need a much more powerful magnifying glass to see anything.
But if you have such a magnifying glass, you will see that monitors, too, have
little Red, Green, and Blue dots making up the picture. In defining the video
display, we can take one each of the Red, Green, and Blue dots, put them together
as a unit, and we would have what is called a picture element.
Well, if you know anything about computer people, you know they are not going to
keep saying “picture element.” After all, it has five entire syllables.
So it quickly got abbreviated to pixel.

Pixel = A group of three dots, one Red, one Green, and one Blue,
that combine to form one picture element.

III. How Colors Are Described

OK, now we know what a pixel is. How does a color get described
on a computer? Well, we know that it has to be a digital description, which
means we have to able to do it with numbers. The smallest possible unit to describe
a number in computers is a “bit” (short for “binary digit”).
Suppose we used one bit to describe the intensity of each dot. Well, a bit only
has two values, 0 and 1, so the best we could do is say that the dot was either
“on” or “off”. Hmmm… This might work for a “black
and white” type of image, like old monochrome monitors, but it seems a
bit limiting. Using two bits would let us have four settings on each dot: off,
a little bit on, mostly on, and full on. Better, but not very subtle. Using
three bits would let us have 8 settings, and so on. If we go all the way to
a “byte” for each dot, that gives us 256 possible settings. 1 byte
= 8 bits, and 8 bits gives 2^8 (two raised to the eighth power) possible
values = 256.
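To make the arithmetic concrete, here is a quick sketch in Python of the bit-depth math described above:

```python
# Number of intensity settings per dot at various bit depths.
for bits in (1, 2, 3, 8):
    print(f"{bits} bit(s) per dot -> {2 ** bits} settings")

# With one byte (8 bits) per dot and three dots per pixel:
print(f"total colors per pixel: {(2 ** 8) ** 3:,}")  # 16,777,216
```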

Now, there are three dots in each pixel, and using a byte for
each dot means that each pixel can have 256 x 256 x 256 possible colors, and
that comes out to 16,777,216 colors! Not bad, eh?

Well, not good, either. Now we have to consider the memory needed
to handle all of that.

IV. Screen Resolution and Video RAM

Suppose you had a relatively simple VGA monitor, the kind that
was state of the art 10 years ago. VGA monitors were designed to display pixels
in a grid that is 640 pixels wide and 480 pixels tall. That means that you have
307,200 (640 x 480) pixels on the screen. Each pixel requires 3 bytes to describe
it (1 byte for each of the three primary colors), so you have a total of 921,600
bytes (or roughly 1 MB) required to describe the screen at any moment. All of
that needs to be held in Video RAM on your video card. And that doesn’t count all
of the processing the video card has to do to update the picture whenever something
happens, which also requires RAM. You would have to have several MB of video
RAM on your video card to handle this. And a lot of video cards did not have
that much video RAM, particularly when RAM was very expensive, as it was years ago.

Then, higher resolutions were developed, like SVGA, which lets
you have 800 x 600 pixels. An SVGA screen with 24-bit color (i.e., one byte per
dot, or three bytes per pixel) requires 1,440,000 bytes to describe the screen.
Go to the next higher resolution, 1024 x 768, and you need 2,359,296 bytes to
describe the screen. This was quickly getting out of hand.
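The memory figures above are straightforward to check; here is a small Python sketch (screen_bytes is just an illustrative helper name):

```python
# Bytes of video RAM needed to hold one full-color screen:
# width x height pixels, 3 bytes (one per Red, Green, Blue dot) each.
def screen_bytes(width, height, bytes_per_pixel=3):
    return width * height * bytes_per_pixel

for width, height in ((640, 480), (800, 600), (1024, 768)):
    print(f"{width} x {height}: {screen_bytes(width, height):,} bytes")
```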

Of course, you could always use fewer bits for your colors, as
long as you didn’t mind sacrificing some of the range of colors available. That
way you could have “larger”, more detailed screens with lots of pixels
and not exceed what your video card was capable of showing. The standard that
a lot of people settled for, and it is still the default setting on most new
computers, is to have the computer display 256 colors. And that is where the
Web comes in.

V. Web-safe colors

Back in the dark ages when the Web was being developed, the problem
arose of specifying colors in a way that anyone on any computer could see them
properly. Two constraints affected this. The first was that most computers could
only display 256 colors, maximum. The second was that they were already using
different colors. Windows, Macintosh, and the other platforms each had certain
colors that they were already using, called “system colors”. And no
two platforms used the same system colors. Well, if each platform is already
using some colors, and there are a maximum of 256 to work with, how many can
be used in common by all systems? Clearly something less than 256. The engineers
figured out that the optimal solution was to use 216 colors as common colors. This
was sufficiently below the 256 maximum that everyone could have their system
colors. And the number 216 = 6 x 6 x 6. You could give each dot a total of six
settings, and define 216 colors that way. Being engineers, they just took the
256 settings available for each dot in an 8-bit system and split the range into
six evenly spaced values. The range of numbers available went from 0 to 255,
and stepping through it in increments of 51 gave them the following settings:

0, 51, 102, 153, 204, 255
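If you want to see the derivation in code, here is a minimal Python sketch (the step of 51 comes from splitting the 0-to-255 range into five equal intervals):

```python
# The six Web-safe settings per channel: 0 to 255 in steps of 51.
settings = [step * 51 for step in range(6)]
print(settings)            # [0, 51, 102, 153, 204, 255]
print(len(settings) ** 3)  # 216 combinations of (R, G, B)
```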
So, the system of Web-safe colors allowed for any color to be
described by using one of these six numbers for each of the three primary colors.
0 meant none of that color (i.e. the dial turned all the way down), and 255
meant full color. So, the color (0,0,0) would mean none of any color, and that
would be black. (255,255,255) would mean all of every color, which would combine
to be white. (255,0,0) would be pure red, (0,255,0) pure green, and (0,0,255)
would be pure blue. Except that the engineers didn’t use the above numbers,
even though that is a recognized standard called RGB. They thought it would
be more interesting to use a completely different numbering scheme called hexadecimal,
which instead of having ten numerals has 16. Computer engineers love to do stuff
like that, which is why they are so much fun at parties. In hexadecimal you
count like this: 1, 2, 3, 4, 5, 6, 7, 8, 9, A, B, C, D, E, F, 10, 11, 12, 13,
14, 15, 16, 17, 18, 19, 1A, 1B, 1C, 1D, 1E, 1F, 20, 21, and so on.

In this numbering system, the six numbers for colors become:

00, 33, 66, 99, CC, FF

These are the same numbers, just expressed in a different numbering system.
VI. Why this matters

You might wonder what would happen if you used a color that is
not one of the Web-safe colors. Possibly, nothing. If the person viewing the
page has a modern system, with a video card that has 8 MB of RAM, and has their
system set to display full 24-bit color (i.e. 16 million possible colors), you
could use any color you could think of, and it would look just fine. It is the
people that are only displaying 256 colors that you have to worry about. If
you pick a color that is not one of the safe colors that it knows how to display,
the computer will attempt to match your color by combining several of the colors
that it *can* display. And it will do this by using pixels of different colors,
trying to “average out” to your color. From a distance, that may work,
but most people have their eyes about one foot (30 cm) from the monitor, and
at that range your image will look awful. Here is an example:

[Image: a heavily dithered GIF]

Now, this is a very heavily dithered picture, but it illustrates
the problem. If you use a color that is not Web-safe, such as for a background
color for your page, the result could turn out like this. Not quite what you
expected, I imagine.
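One way to avoid the problem is to snap your colors to the safe palette yourself, before the browser has to dither. Here is a minimal Python sketch of the idea (nearest_web_safe is a hypothetical helper for illustration; browsers of the era dithered rather than snapped):

```python
# Snap an RGB color to the nearest Web-safe color by rounding each
# channel to the closest of the six allowed settings (multiples of 51).
def nearest_web_safe(r, g, b):
    def snap(value):
        return round(value / 51) * 51
    return (snap(r), snap(g), snap(b))

print(nearest_web_safe(123, 87, 200))  # (102, 102, 204)
```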

VII. For More Information

No doubt you would like more information on Web-safe colors. A
good starting point is Lynda Weinman’s site, at
She has a section on Web colors at
You can also see some interesting examples of dithered photographs at her site.
I also have a couple of her books, Deconstructing Web Graphics and
Designing Web Graphics.

A chart of the Web safe colors you can work with can be found

For a slightly different approach, at this site you can enter
any color and find the nearest Web-safe color:

Another site that organizes the colors differently can be found
