Julian Cassin said:
The problem now with the latest DirectX is that DirectDraw has been removed, so
to make 2D games you have to use polygons to simulate 2D graphics.
Also, there are many, many polygon games on 8-bit systems. Ever played
Elite, Starion, Mercenary, or Cholo, to name a few?
First of all, DirectDraw games still work. The API functionality is
still there in a backward-compatible form, just like many other portions of
Windows that still exist solely to support older software.
DirectDraw is not considered part of the current generation of the API
because it was decided by all concerned, both at Microsoft and among the game
developers who have considerable influence on how DirectX evolves, that the
methodology behind DirectDraw was not in tune with where the hardware was
going, nor with how developers wanted to address that hardware.
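For what it's worth, putting 2D on screen through Direct3D is not much of a hardship. Here is a rough sketch of the usual approach, a pre-transformed textured quad, under the assumption that a Direct3D 9 device and a sprite texture have already been created elsewhere; the DrawSprite function and its setup are mine, not anything out of the SDK:

    // Rough sketch: draw a 2D sprite as a screen-space textured quad in
    // Direct3D 9. Assumes an initialized IDirect3DDevice9 and a loaded
    // IDirect3DTexture9; both are hypothetical here.
    #include <d3d9.h>

    struct SpriteVertex {
        float x, y, z, rhw;   // pre-transformed screen coordinates
        float u, v;           // texture coordinates
    };
    const DWORD kSpriteFVF = D3DFVF_XYZRHW | D3DFVF_TEX1;

    void DrawSprite(IDirect3DDevice9* device, IDirect3DTexture9* texture,
                    float left, float top, float width, float height)
    {
        SpriteVertex quad[4] = {
            { left,         top,          0.0f, 1.0f, 0.0f, 0.0f },
            { left + width, top,          0.0f, 1.0f, 1.0f, 0.0f },
            { left,         top + height, 0.0f, 1.0f, 0.0f, 1.0f },
            { left + width, top + height, 0.0f, 1.0f, 1.0f, 1.0f },
        };
        device->SetTexture(0, texture);
        device->SetFVF(kSpriteFVF);
        // Two triangles in a strip form the "polygon" the sprite sits on.
        device->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(SpriteVertex));
    }

The D3DX library even ships a helper (ID3DXSprite) that wraps exactly this sort of thing, so the "simulate 2D with polygons" part mostly happens below the level you write code at.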
Secondly, I said 'filled polygons.' All of the games you list are
wireframe exercises. Filled polygons were very rare on 8-bit systems due
both to limited color palettes and limited processing power. WayOut was a rarity in
that it used filled polygons in a very simple fashion to allow real-time
movement through a 3D maze. Think Wolfenstein 3D without any enemies except
a strong wind that prevented movement across certain areas.
Filled polygons didn't really become commonplace on home consumer
hardware until the 16-bit generation, most notably with Starglider II by the
same team that went on to produce the SNES Star Fox game for Nintendo.
If you're trying to suggest that the displays of 8-bit systems were
without flaw, you'll have to prepare to be answered with laughter. I spent a
lot of time in game testing back when 8-bit systems were still a market, and a
lot of that time went to isolating and minimizing all sorts of display issues.
Usually the problems had to do with the video hardware for scrolling and
sprites. While tearing as seen on PCs wasn't as common, there were instead
constant issues of flicker, both from managing activity during the VBI and
from attempting to get more sprites on screen than the hardware was intended to
support. The problems set in when you needed more sprites on a single
horizontal portion of the screen. The same sprite could be made to appear
multiple times through the vertical scan; you just had to be careful to
track which occurrence of that sprite was to be acted on when a collision
occurred.
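To give a feel for the bookkeeping involved, here's a toy model in modern C++ rather than real 8-bit code: more game objects than hardware sprite slots, slots reused as the beam moves down the screen, and a lookup that maps a collision report back to the logical object. The slot count, names, and numbers are invented for illustration.

    // Toy model of sprite-multiplexing bookkeeping (not real 8-bit hardware code).
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    struct Object { int id; int y; };            // logical game objects
    const int kSlots = 8;                        // hardware sprite slots

    // Which logical object a hardware slot was showing, starting at which Y.
    struct SlotUse { int slot; int y; int objectId; };

    // Assign objects to slots in Y order, reusing slot (i % kSlots) as the
    // beam moves down the screen, and remember every assignment.
    std::vector<SlotUse> ScheduleSprites(std::vector<Object> objects) {
        std::sort(objects.begin(), objects.end(),
                  [](const Object& a, const Object& b) { return a.y < b.y; });
        std::vector<SlotUse> uses;
        for (size_t i = 0; i < objects.size(); ++i)
            uses.push_back({ static_cast<int>(i % kSlots), objects[i].y, objects[i].id });
        return uses;
    }

    // Given a hardware collision on `slot` at `scanline`, find the logical
    // object that slot was displaying at that point in the frame.
    int ResolveCollision(const std::vector<SlotUse>& uses, int slot, int scanline) {
        int hit = -1;
        for (const SlotUse& u : uses)
            if (u.slot == slot && u.y <= scanline) hit = u.objectId;  // latest reuse above the beam
        return hit;
    }

    int main() {
        std::vector<Object> objects;
        for (int i = 0; i < 16; ++i) objects.push_back({ 100 + i, i * 12 });  // 16 objects, 8 slots
        std::vector<SlotUse> uses = ScheduleSprites(objects);
        // Slot 2 reports a collision while the beam is at scanline 130.
        std::printf("collision belongs to object %d\n", ResolveCollision(uses, 2, 130));
    }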
Which isn't to say you couldn't get tearing on 8-bit machines. If your email
address is an indicator, you probably didn't see much of the Apple ][ system,
but they had a major game market here. (You may have encountered some
popular Apple ports that treated machines like the C-64 as Apples, ignoring
most of the hardware advantages.) These systems had essentially no hardware
assistance for video functions. It all had to be done by hand. Any game that
needed to do a lot of fast scrolling, like the popular Choplifter, would
show a great deal of tearing in the display. It was simply beyond the means
of a sub-2 MHz 6502 to update the display quickly enough to avoid the
problem. We Atari 800 fans were quite annoyed at the number of games that
failed to take advantage of the chipset and just treated it as an Apple.
This problem also prevailed in the PC world. Video hardware acceleration
really didn't come into its own on the PC until a unified API became
standard. A 16 MHz 286 with a 256K VGA card could replicate the Apple ][
version of Choplifter (which was essentially black & white with artifacted
color) with a much smoother display, but by then the audience would expect
much better graphics. Inevitably the improved graphics would be enough to
swamp the system, and the same kinds of display flaws were back in evidence.
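As an aside, once DirectDraw did show up, the standard cure for tearing there was a flipping chain: draw into a back buffer and let the card swap it to the screen during the vertical blank. A condensed sketch, assuming the flipping chain was already created elsewhere and that RenderFrame stands in for the game's own drawing code:

    // Condensed sketch of DirectDraw page flipping. Assumes the flipping
    // chain (primary surface + one back buffer) was created elsewhere with
    // DDSCAPS_PRIMARYSURFACE | DDSCAPS_FLIP | DDSCAPS_COMPLEX.
    #include <ddraw.h>

    void RenderFrame(IDirectDrawSurface7* backBuffer);   // hypothetical game drawing

    void PresentFrame(IDirectDrawSurface7* primary, IDirectDrawSurface7* backBuffer)
    {
        RenderFrame(backBuffer);
        // The flip takes effect during the vertical blank, so the visible
        // surface never changes mid-scan; DDFLIP_WAIT just retries until
        // the hardware can accept the flip.
        primary->Flip(nullptr, DDFLIP_WAIT);
    }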
Much the same could be seen in games produced for both the Amiga and
Atari ST. The Amiga had hardware scrolling, hardware sprites, and hardware
support for more complex graphic objects called MOBs in Amiga-ese, as well
as a bunch of other really useful things from a game developer's
perspective. The Atari lacked all this, being more price-conscious. (A
blitter chip was added to later models but was poorly supported by game
developers who didn't want to lose the installed base of older ST models.)
Games written first for the Amiga and ported to the Atari often lost quite a
bit in the process, not only due to the lesser color range and depth on the
Atari but also due to the greater difficulty of doing all the display
manipulation in software. And of course, games ported quickly from the ST to the
Amiga lacked the full splendor of native Amiga games.
That didn't stop them from being good games on their own merits, but
people like to see their choice of system favored. The moral is that no
matter how much power you add to your video platform, game developers will
soon operate at its limits.
Plus there is a further issue with desktop PCs that I'll get to below.
One other thing: polygons don't seem to be the reason. Didn't you notice that
Windows itself suffers from tearing when you drag windows around?
Or are you going to claim once more that the problem is in my mind, like before,
until you suddenly admit to the problem?
No, I'm just going to suggest that you're desperately ignorant of the
way the Win32 APIs work. That, and you need to be more clear about the
nature of your complaint.
The routines used for the GDI are designed to work independently of
hardware acceleration and are completely separate from the DirectX suite.
They'll use it if available, but otherwise they get by on the most minimal of
systems for the generation in question. The level of hardware acceleration
used by Windows is an easily accessed control panel setting. Perhaps you
should check yours.
On this fairly ancient machine in front of me (dual Celeron 533, Voodoo
3 3000 16 MB, 256 MB RAM, Win2K) I can grab the window containing this
message in progress and move it around while the text remains completely
readable. The edges have a bit of redraw ugliness, but only if I move the
window so quickly its contents are no longer legible. It was just a few
years earlier that this was a big deal for any windowing system, and many
would not even attempt to preserve the display until the system thought you
were done moving things around. At one point they added a control panel
checkbox to Windows to allow user control. Older machines with weaker CPUs
and/or video hardware could slow to a crawl attempting to keep up with the
redraw task. On such a machine it was better to simply allow the window to
remain blank during movement, since it would be unusual to be manipulating
any data while moving the window containing it.
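That checkbox still maps to an ordinary Win32 setting; if I remember right it's the "drag full windows" parameter, which any program can read or change through SystemParametersInfo. A small sketch:

    // Small sketch: query the "show window contents while dragging" setting
    // that the control panel checkbox toggles.
    #include <windows.h>
    #include <cstdio>

    int main()
    {
        BOOL fullDrag = FALSE;
        SystemParametersInfo(SPI_GETDRAGFULLWINDOWS, 0, &fullDrag, 0);
        std::printf("full-window drag is %s\n", fullDrag ? "on" : "off");

        // Turning it off makes Windows drag an outline instead of redrawing
        // the contents, which is what the older, weaker machines needed:
        // SystemParametersInfo(SPI_SETDRAGFULLWINDOWS, FALSE, NULL, SPIF_SENDCHANGE);
        return 0;
    }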
Compare this window, an object created with little hardware
acceleration, to any hardware sprite on an 8-bit system. In terms of data
volume the window is orders of magnitude greater and has few of the
constraints, such as size and color depth, typical of hardware sprites. The
simple text I'm looking at now could just as easily be a Photoshop image measured
in tens of megabytes. That window would bring this system to its knees just
as moving a Word window could on a typical Win95-generation machine.
Now, if you want rock-solid windows with full hardware acceleration on a
consumer system, check out the current-generation Mac OSX systems. Due to
the close control it has over its hardware, Apple was able to dictate that
all machines from a certain date forward would have a specified minimum of
video hardware functionality. (Essentially the DirectX 7 generation, although
Apple would put it in terms of an OpenGL version.) This became part of the
minimum specs for OSX, specifically the Quartz rendering system. (OSX
running as a headless server doesn't care about video hardware, of course.)
Microsoft is doing this also in the next major release of Windows, currently
referred to as Longhorn. If you look around you can find some video clips
demonstrating the fully hardware-accelerated GDI that integrates
with DirectX.
It looks great, but the funny thing is that the windows are no longer
treated as a sort of hardware sprite. They're going to be polygons with the
window controls and contents painted on. The plan is to treat DirectX 7
class video chips as entry level without all the whiz-bangs, and DirectX 9
class chips as the top of the heap, at least until somebody can think of a
must-have application for more advanced video hardware for desktop apps.
Now, you might wonder, why is this coming so long after the hardware has
become common? Because the hardware is less common than you think. Microsoft
needs to sell new versions of Windows to companies that have vast numbers of
systems and don't change them out that often. Those corporate desktops
typically have minimal video hardware compared to consumer systems. Recently
the embedded solution in Intel chipsets has come to match the minimum
requirements for Longhorn, so the single most important customer finally
became ripe to run a fully hardware-driven desktop OS.
Does this mean displays in games will become perfect? Nope. Just as a
window that could bring an older machine to a dead halt while being moved is
now trivial, game developers are going to find the limits of future PCs. The
desktop environment for Longhorn is fairly predictable and should come
nowhere near taxing a system released any time in the near future, since
Longhorn will have to behave reasonably well on older machines.
Game developers are driven by a different set of motivations. While they
don't want to restrict their potential audience, they also need to make use
of the newer hardware to make products more dazzling than their predecessors
already available in the bargain bins. Players will have the standard
options for lowering the game's power needs, but most will try to get as much
as they can before the display becomes completely useless, even if it means
putting up with some flaws like tearing.
The worst part is that this is pretty much impossible to avoid on a PC.
On a console, developers can depend on unit #1 through unit #10,000,000
behaving exactly the same. This allows a great deal of fine tuning and is
why, even though it has much in common with a PC, you don't typically see
tearing in an Xbox game from a company with a good QA operation.
The same cannot be said for a PC. You can have two machines of almost
exactly the same specs, but one has an Intel chipset and the other a VIA
chipset. These machines will produce slightly different behavior when you
push them to any extent. This can be largely invisible except on benchmarks
and in that area where humans are very sensitive: visual pattern recognition.
It isn't just on IBM-descended PCs. We ran into big problems way back
when the Amiga 500 and 2000 came out. For most purposes a first-generation
512K 500/2000 was the same platform as the earlier 1000, but there were little
differences that caused timing nightmares. On one game with a tactical map
display (Lords of the Rising Sun) the update moved at only about 25% of the
1000's speed on a 500/2000 system. It became necessary to test for which
type of Amiga the game was loading on and adjust accordingly. (I'm not sure
if this was ever implemented because I left the company before the game
shipped, and QA testing left me with no interest in playing it on my own
time.)
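I don't have that code anymore, but the general shape of that kind of fix is nothing exotic: time a fixed chunk of work at startup and use the result to decide how often to refresh the expensive display. A generic sketch, with a workload and thresholds invented for illustration rather than anything we actually shipped:

    // Generic startup calibration sketch: slower machine -> refresh the
    // expensive display (e.g. a tactical map) every Nth frame instead of
    // every frame. Numbers are illustrative only.
    #include <chrono>
    #include <cstdio>

    int CalibrateMapRefreshDivisor() {
        volatile long sink = 0;
        auto start = std::chrono::steady_clock::now();
        for (long i = 0; i < 5000000; ++i) sink += i;   // fixed benchmark workload
        auto elapsedMs = std::chrono::duration_cast<std::chrono::milliseconds>(
                             std::chrono::steady_clock::now() - start).count();
        if (elapsedMs > 200) return 4;
        if (elapsedMs > 100) return 2;
        return 1;
    }

    int main() {
        std::printf("refresh map every %d frame(s)\n", CalibrateMapRefreshDivisor());
    }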
This can be avoided if you can convince developers and gamers to settle
for games that treat four-year-old PCs as the current pinnacle. Don't
program for 2004 PCs until 2008. Not very likely, you think?