The graphics card plays an
essential role in the PC.
It takes the digital information that the computer produces and turns it
into something human beings can see. On most computers, the graphics card
converts digital information to analog information for display on the
monitor; on laptops,
the data remains digital because laptop displays are digital.
RADEON™ 64-MB AGP graphics card
If you look at the screen of a typical PC very closely, you can see that
all of the different things on the screen are made up of individual dots.
These dots are called pixels, and each pixel has a color. On some
screens (for example, on the original Macintosh), the pixels could have just
two colors -- black or white. On some screens today, a pixel can be one of
256 colors. On many screens, the pixels are full-color (also known as
true color) and have 16.8 million possible shades. Since the human
eye can only discern about 10 million different colors, 16.8 million colors
is more than enough for most people.
The goal of a graphics card is to create a set of signals that display
the dots on the computer screen. If you have read
How Computer
Monitors Work and How
Television Works, you have a good sense of what those signals are and
how a monitor turns them into
light.
In this edition of
HowStuffWorks, you'll learn all about graphics cards and how they
optimize your PC experience.
What's a Graphics Card?
A modern graphics card is a circuit board with
memory and a
dedicated
processor. The processor is designed specifically to handle the intense
computational requirements of displaying graphics. Most of these graphics
processors have special command sets for graphics manipulation built right
into the chip.
Graphics cards are known by many names, such as:
- Video cards
- Video boards
- Video display boards
- Graphics boards
- Graphics adapter cards
- Video adapter cards
Today's graphics cards are computing systems in their own right. But
these cards started out as very simple devices. By understanding the
evolution of graphics cards, you can begin to see why they are so powerful
today.
Graphics Card Basics
You can better understand the essence of a graphics card by looking at the
simplest possible one. This card would be able to display only black or
white pixels, and it would do that on a 640x480-pixel screen.
Here are the three basic components of a graphics card and what they do:
- Memory - The first thing that a graphics card needs is memory.
The memory holds the color of each pixel. In the simplest case, since each
pixel is only black or white, you need just 1 bit to store each pixel's
color (see How Bits
and Bytes Work for details). Since a byte holds 8 bits, you need
640/8 = 80 bytes to store the pixel colors for one line of pixels on the
display, and 480 x 80 = 38,400 bytes of memory to hold all of the
pixels visible on the display. (This arithmetic is worked through in the
sketch after the figure below.)
- Computer Interface - The second thing a graphics card needs is
a way for the computer to change the graphics card's memory. This is
normally done by connecting the graphics card to the card bus on the
motherboard.
The computer can send signals through the bus to alter the memory.
- Video Interface - The next thing that the graphics card needs
is a way to generate the signals for the monitor. The card must generate
color signals that drive the cathode ray tube (CRT) electron beam, as well
as synchronization signals for horizontal and vertical sync (see
How Television Works
for details). Let's say that the screen is refreshing at 60 frames per
second. This means that the graphics card scans the entire memory array 1
bit at a time and does this 60 times per second. It sends signals to the
monitor for each pixel on each line, and then sends a horizontal sync
pulse; it does this repeatedly for all 480 lines, and then sends a
vertical sync pulse.
The basic parts of a graphics card are the computer
interface, memory and video interface.
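If you like to see the numbers spelled out, here is a small Python sketch of the arithmetic for the simple black-and-white card described above. It is purely illustrative; the variable names are our own, not anything from real hardware.

    # How much memory the frame buffer needs, and how many pixels the
    # video interface has to scan out every second.

    WIDTH, HEIGHT = 640, 480    # pixels per line, lines per frame
    BITS_PER_PIXEL = 1          # each pixel is just black or white
    REFRESH_RATE = 60           # frames sent to the monitor per second

    # 640 pixels / 8 bits per byte = 80 bytes per line;
    # 480 lines x 80 bytes = 38,400 bytes for the whole frame.
    bytes_per_line = WIDTH * BITS_PER_PIXEL // 8
    frame_buffer_bytes = HEIGHT * bytes_per_line

    # Every pixel is read out once per frame, 60 times per second.
    pixels_scanned_per_second = WIDTH * HEIGHT * REFRESH_RATE

    print(bytes_per_line)              # 80
    print(frame_buffer_bytes)          # 38400
    print(pixels_scanned_per_second)   # 18432000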
When a graphics card handles color, it does it in one of two ways. A
true-color card devotes 3 or 4 bytes per pixel (4 bytes allows an extra byte
for an "alpha channel"). On a 1600x1200-pixel display, this adds up to about
8 million bytes of video memory.
The other alternative is to use 1 byte per pixel and then use these bytes
to index a Color Look-Up Table (CLUT). The CLUT contains 256 entries
with 3 or 4 bytes per entry.
In a CLUT, each pixel is assigned a byte value that is 8 bits (1 byte) in
length, giving 256 possible values. Each byte value corresponds to a color
taken from a larger 24-bit (3-byte) palette of about 16.8 million possible
colors.
The CLUT gets loaded with the 256 true colors that the screen will
display.
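Here is a short, purely illustrative Python sketch of how indexed color works. The tiny image and the specific CLUT entries are made up for the example.

    # Indexed color: 1 byte per pixel, looked up in a 256-entry
    # Color Look-Up Table (CLUT).

    clut = [(0, 0, 0)] * 256       # 256 entries, each a 24-bit (R, G, B) color
    clut[1] = (255, 255, 255)      # index 1 -> white
    clut[2] = (200, 30, 30)        # index 2 -> a shade of red

    # A tiny 3x2 "image" stored as one byte per pixel.
    indexed_pixels = [1, 1, 2,
                      0, 2, 1]

    # Displaying the image means looking each byte up in the CLUT.
    true_color_pixels = [clut[i] for i in indexed_pixels]
    print(true_color_pixels)

    # Rough memory comparison for a 1600x1200 display:
    print(1600 * 1200 * 4)             # true color: about 7.7 million bytes
    print(1600 * 1200 * 1 + 256 * 3)   # indexed: about 1.9 million bytes plus the CLUT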
Graphics Coprocessors
A simple graphics card, like the one described previously, is called a
frame buffer. The card simply holds a frame of information that is sent
to the screen. The computer's
microprocessor does the job of updating every byte of video memory.
The problem with frame buffers is that, on complex graphics operations,
the microprocessor ends up spending all of its time updating video memory
and can't get any other work done. For example, if a 3-D image contains
10,000 polygons, the microprocessor has to draw and fill each polygon in the
video memory, 1 pixel at a time. This takes a while.
Modern graphics cards have evolved to take some or all of this load off
the microprocessor. A modern card contains its own high-power central
processing unit (CPU) that is optimized for graphics operations. Depending
on the graphics card, this CPU will be either a graphics coprocessor
or a graphics accelerator.
Think of a coprocessor as a co-worker, and an accelerator as an
assistant. The coprocessor and the CPU work simultaneously, while the
accelerator receives instructions from the CPU and carries them out.
In the coprocessor system, the graphics card driver software sends
graphics-related tasks directly to the graphics coprocessor. The
operating
system sends everything else to the CPU.
With a graphics accelerator, the driver software sends everything to the
computer's CPU. The CPU then directs the graphics accelerator to perform
specific graphics-intensive tasks. For example, the CPU might say to the
accelerator, "Draw a polygon with these three vertices," and the accelerator
would do the work of painting the pixels of the polygon into video memory.
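To give a feel for the kind of work the accelerator takes over, here is a toy Python sketch that fills a triangle into a block of "video memory" (here, just a Python list). The helper names and the inside-the-triangle test are our own illustration, not how a real driver or graphics chip is actually programmed.

    def edge(ax, ay, bx, by, px, py):
        # Which side of the edge (a -> b) the point (px, py) lies on.
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

    def fill_triangle(video_memory, width, v0, v1, v2, color):
        xs = [v0[0], v1[0], v2[0]]
        ys = [v0[1], v1[1], v2[1]]
        for y in range(min(ys), max(ys) + 1):
            for x in range(min(xs), max(xs) + 1):
                w0 = edge(*v1, *v2, x, y)
                w1 = edge(*v2, *v0, x, y)
                w2 = edge(*v0, *v1, x, y)
                # Inside if the pixel is on the same side of all three edges.
                if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
                   (w0 <= 0 and w1 <= 0 and w2 <= 0):
                    video_memory[y * width + x] = color

    WIDTH, HEIGHT = 16, 8
    video_memory = [0] * (WIDTH * HEIGHT)        # one byte per pixel
    fill_triangle(video_memory, WIDTH, (1, 1), (14, 2), (6, 7), color=1)

    for row in range(HEIGHT):
        print("".join("#" if video_memory[row * WIDTH + col] else "."
                      for col in range(WIDTH)))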
More and more complex graphics operations have moved to the graphics
coprocessor or accelerator, including
shading,
texturing and anti-aliasing.
As graphics cards and coprocessors continue to evolve, the capabilities
become more and more amazing. Modern cards can draw millions of polygons per
second. These features make it possible to create extremely realistic games
and simulations.
More on Graphics Card
Components
There are several components on a typical graphics card:
- Graphics Processor - The graphics processor is the brains of
the card, and is typically one of three configurations:
- Graphics co-processor: A card with this type of processor can handle
all of the graphics chores without any assistance from the computer's
CPU. Graphics co-processors are typically found on high-end video cards.
- Graphics accelerator: In this configuration, the chip on the
graphics card renders graphics based on commands from the computer's
CPU. This is the most common configuration used today.
- Framebuffer: This chip simply controls the memory on the card and
sends information to the digital-to-analog converter (DAC) (see
below). It does no processing of the image data and is rarely used
anymore.
- Memory - The type of
RAM used on
graphics cards varies widely, but the most popular types use a
dual-ported configuration. A dual-ported card can write to one section
of memory while reading from another, which decreases the time it
takes to refresh an image.
- Graphics BIOS - Graphics cards have a small
ROM chip
containing basic information that tells the other components of the card
how to function in relation to each other. The BIOS also performs
diagnostic tests on the card's memory and input/output (I/O) to ensure
that everything is functioning correctly.
- Digital-to-Analog Converter (DAC) - The DAC on a graphics card
is commonly known as a RAMDAC because it takes the data it converts
directly from the card's memory. RAMDAC speed greatly affects the image
you see on the monitor, because the refresh rate of the image
depends on how quickly the analog information gets to the monitor.
(A rough speed estimate appears in the sketch after this list.)
- Display Connector - Graphics cards use standard connectors.
Most cards use the 15-pin connector that was introduced with Video
Graphics Array (VGA). You'll learn about VGA on the next page.
- Computer (Bus) Connector - This is usually
Accelerated
Graphics Port (AGP). This port enables the video card to directly
access system memory. Direct memory access helps to make the peak
bandwidth four times higher than that of Peripheral Component Interconnect
(PCI) bus adapter card slots. This allows the central processor to do other
tasks while the graphics chip on the video card accesses system memory.
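As a rough illustration of why RAMDAC speed matters, the sketch below estimates the conversion rate a RAMDAC needs for a given resolution and refresh rate. The 1.3 "blanking overhead" factor is an assumed rule of thumb for the retrace time between lines and frames, not a figure from any particular standard.

    # Every visible pixel must be converted on every refresh, plus extra
    # time for blanking between lines and frames.

    def required_ramdac_mhz(width, height, refresh_hz, blanking_overhead=1.3):
        return width * height * refresh_hz * blanking_overhead / 1_000_000

    print(round(required_ramdac_mhz(1024, 768, 75)))    # about 77 MHz
    print(round(required_ramdac_mhz(1600, 1200, 85)))   # about 212 MHz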
Graphics Card History
and Standards
The first graphics cards, introduced in August of 1981 by
IBM, were monochrome cards designated as Monochrome Display Adapters
(MDAs). The displays that used these cards were typically text-only,
with green or white text on a black background. Often, the graphics card had
a printer port,
since the printer would print the same data shown on the low-resolution
"green" screen. Color for IBM-compatible computers appeared on the scene
with the 4-color Hercules Graphics Card (HGC), followed by the
8-color Color Graphics Adapter (CGA) and 16-color Enhanced
Graphics Adapter (EGA). During the same time, other computer
manufacturers, such as Commodore, were introducing computers with built-in
graphics adapters that could handle a varying number of colors.
When IBM introduced the Video Graphics Array (VGA) in 1987, a new
graphics standard came into being. A VGA display could support up to 256
colors (out of a possible 262,144-color palette) at resolutions up to
720x400. Perhaps the most interesting difference between VGA and the
preceding formats is that VGA was analog, whereas displays had been digital
up to that point. Going from digital to analog may seem like a step
backward, but it actually provided the ability to vary the signal for more
possible combinations than the strict on/off nature of digital. Of course,
the way we manipulate digital display data has changed significantly since
the days of CGA and EGA. Now, graphics-card manufacturers are able to
provide all-digital display solutions that can support the same number of
colors that analog adapters can.
Over the years, VGA gave way to Super Video Graphics Array (SVGA).
SVGA cards were based on VGA, but each card manufacturer added resolutions
and increased color depth in different ways. Eventually, the Video
Electronics Standards Association (VESA) agreed on a standard
implementation of SVGA that provided up to 16.8 million colors and 1280x1024
resolution. Most graphics cards available today support Ultra Extended
Graphics Array (UXGA). UXGA can support a palette of up to 16.8 million
colors and resolutions up to 1600x1200 pixels.
Graphics cards adhere to industry standards so that you can choose from a
variety of cards for your PC. Even though any card you can buy today will
offer more colors and higher resolutions than the basic VGA specification, VGA
mode is the de facto standard for graphics and is the minimum on all cards.
In addition to including VGA, a graphics card must be able to connect to
your computer. While there are still a number of graphics cards that plug
into an Industry Standard Architecture (ISA) or
Peripheral Component
Interconnect (PCI) slot, most current graphics cards use the
Accelerated Graphics
Port (AGP).