Page 8 of the whitepaper below shows the basic architecture of
a modern video card. (The PDF code on that page is complex, so
it may take a few seconds before the diagram appears in
Acrobat Reader.)
http://www.ati.com/products/radeonx1k/whitepapers/X1000_Family_Technology_Overview_Whitepaper.pdf
The two display cores you see on the left of the picture
typically result in two entries in Device Manager. In a sense,
they are like two separate video cards contained within the
one card.
Each core's output can be steered to the various output
connectors. You could do two analog VGA displays, VGA plus a
TV (S-video) display, VGA plus digital DVI, and so on. I don't
know whether three displays at the same time are possible, as
there are only two "cores" on the left of the figure to drive
the outputs at any given resolution.
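If you're curious, you can list those adapter entries from a
script as well. This is just a rough Windows-only sketch, using
the Win32 EnumDisplayDevices call through Python's ctypes; the
device strings it prints will vary with your card and driver:

  import ctypes
  from ctypes import wintypes

  # Same adapter list that Device Manager shows; a dual-core
  # card typically appears as two entries here.
  class DISPLAY_DEVICEW(ctypes.Structure):
      _fields_ = [("cb", wintypes.DWORD),
                  ("DeviceName", wintypes.WCHAR * 32),
                  ("DeviceString", wintypes.WCHAR * 128),
                  ("StateFlags", wintypes.DWORD),
                  ("DeviceID", wintypes.WCHAR * 128),
                  ("DeviceKey", wintypes.WCHAR * 128)]

  dev = DISPLAY_DEVICEW()
  dev.cb = ctypes.sizeof(dev)
  i = 0
  while ctypes.windll.user32.EnumDisplayDevicesW(None, i,
                                         ctypes.byref(dev), 0):
      print("%s: %s" % (dev.DeviceName, dev.DeviceString))
      i += 1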
While I haven't found a similar figure for Nvidia cards, the
same basic idea applies to them.
Just make sure, when you buy a video card, that the connector
types on the faceplate are suited to your monitors. For
example, if you have an expensive 30" LCD that requires
dual-link DVI, you'd want to shop pretty carefully for a card.
If you have a couple of basic $300 1280x1024 LCDs, then a lot
more cards with single-link DVI would be suitable.
If a card has DVI-I connectors, each connector carries two
interfaces. There are DVI-D pins, which form the digital
component; a DVI-D LCD monitor can plug in there. The video
card may also come with a DVI-I to VGA adapter, which simply
grabs the analog signals off the DVI-I connector and routes
them to a 15-pin VGA connector. If you use VGA monitors, make
sure the video card you buy comes with the adapter dongle to
convert a DVI-I connector to VGA.
All manner of output connectors are available. On the cheapest
cards, you might get S-video (TV), a DVI-I, and a VGA. To
drive two VGA monitors, you would need one DVI-I to VGA
adapter in the package to complete the job. Obviously, a card
like that cannot drive two DVI-D LCDs.
More expensive cards may have S-video (TV) plus two DVI-I
connectors. You would need two DVI-I to VGA adapters in the
package to drive two VGA monitors. If driving two LCDs
digitally, the two DVI connectors on that card would be ideal.
For large monitors, you'll want to read up on dual-link DVI.
Basically, more of the pins on the connector are populated,
and the interface has double the bandwidth. (Dual-link does
not mean two connectors; it is a doubling of bandwidth on a
single connector, because the connector carries twice as many
signals.) If you don't have a big monitor, don't worry about
this. Since dual-link typically requires extra Silicon Image
transmitter chips on the video card, don't expect to find
dual-link interfaces on the cheapest of video cards. To drive
two 30" monitors, you'd need a card with "dual dual-link
connectors".
http://en.wikipedia.org/wiki/Dual-link_DVI
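To see why a 30" panel needs dual-link, a back-of-the-envelope
pixel clock check helps. This little Python sketch assumes a
~12% blanking overhead (roughly reduced-blanking timings) and
the 165 MHz single-link TMDS ceiling from the DVI spec, so
treat the numbers as estimates:

  SINGLE_LINK_MAX_MHZ = 165.0  # DVI single-link pixel clock limit

  def pixel_clock_mhz(width, height, refresh=60, blanking=1.12):
      # pixel clock ~= active pixels * refresh rate * blanking
      return width * height * refresh * blanking / 1e6

  for w, h in ((1280, 1024), (1920, 1200), (2560, 1600)):
      clk = pixel_clock_mhz(w, h)
      verdict = "single-link OK" if clk <= SINGLE_LINK_MAX_MHZ \
                else "needs dual-link"
      print("%dx%d @ 60 Hz: ~%.0f MHz -> %s" % (w, h, clk, verdict))

The 2560x1600 case comes out near 275 MHz, well past what a
single link can carry, which is why those monitors need the
extra transmitter silicon.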
When you get your new video card, there is usually a little
trick in the control panel to enable dual displays, so don't
be surprised if the setup method is not obvious at first.
Generally, finding the ATI Catalyst Control Center guide or a
Forceware User Manual will make it clear what must be done.
http://www.visiontek.com/teksupport/pdf/Catalyst_control_center_guide.pdf
ftp://download.nvidia.com/Windows/84.12/84.12_Forceware_Display_Property_User_Guide.pdf
HTH,
Paul