What's important for video playback?

PC Guy

If we're talking about video-card performance only in terms of video
playback (playing hi-def MKV files using VLC for example) and if we're
NOT talking at all about anything to do with video games, then what sort
of hardware acceleration or processing is done for video decoding /
playback on the video card -> vs being done by the computer's CPU?

Does the amount of RAM (say, 128 MB vs. 512 MB or more) play any role in
HD video playback?

I'm wondering if your typical modern $200 or $300 video card offers
anything in terms of better video playback (as opposed to 3-D gaming)
vs. say a 5-year-old $50 AGP-8X Nvidia 6200 card or a 7-year-old Nvidia
AGP-8X MX440?
 

ATI AGP here. The way it sounds to me, as encoding rates keep climbing,
the distinction gets blurry between the video-processing enhancements on
newer CPUs and a basic, sub-$20-after-rebate PCI-E board. The main
difference versus AGP is going to be a lot more heat and possibly wider
fans to deal with. I used to get stutter or sync issues with the
occasional higher-quality video on my Radeon, although switching from a
2.6 GHz socket-478 Celeron D to a 3 GHz P4 fixed most of that. I dumped
my other AGP Radeon last week when I sold it with a 3.4 GHz socket-775
P4 build, so I'm down to my last board (excluding an old steady-eddy
PCI dual-head Matrox - a good tester board). It's easier just to run
with a motherboard that already has onboard video; I'm too lazy to mess
with these new rebates where they practically interview you for your
life history before crediting anything back to the card. Not that I'm
exactly thrilled, either, with the idea of grabbing hold of hardware
running 200 degrees Fahrenheit, although a PCI-E video board is probably
unavoidably in the works. Once the garden-variety reviewers are done,
what they generally agree on is that non-gamer boards are just fine for
most anything in the way of video. That's something I'd reserve judgment
on with the onboard-video motherboards I've messed with -- I haven't
really run one for entertainment purposes (I'm more into sound
processing at sub-broadcast levels, with gear somewhat below what the
pros are using).
 
PC said:
If we're talking about video-card performance only in terms of video
playback (playing hi-def MKV files using VLC for example) and if we're
NOT talking at all about anything to do with video games, then what sort
of hardware acceleration or processing is done for video decoding /
playback on the video card -> vs being done by the computer's CPU?

Does the amount of RAM (say, 128 MB vs. 512 MB or more) play any role in
HD video playback?

I'm wondering if your typical modern $200 or $300 video card offers
anything in terms of better video playback (as opposed to 3-D gaming)
vs. say a 5-year-old $50 AGP-8X Nvidia 6200 card or a 7-year-old Nvidia
AGP-8X MX440?

http://en.wikipedia.org/wiki/UVD
http://en.wikipedia.org/wiki/Purevideo

Intel has something too, but it may be shader based for all I know.
It seems to be more powerful. The ATI and Nvidia ones may be intended
for one or two decoding streams (like PIP), while the Intel one can
decode at about 5x real time.

http://sourceforge.net/projects/qsdecoder/

You can probably find some web articles comparing quality and
CPU cycle usage for all of those.
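
If you want to see which of those decode paths a given software stack
actually exposes, one quick check (on a far newer system than the cards
above) is to ask ffmpeg. A minimal sketch, assuming an ffmpeg build is
on the PATH:

# Sketch: ask a local ffmpeg build which hardware decode paths it exposes.
# Depending on the build and the hardware, the list can include dxva2,
# d3d11va, cuda and qsv -- the access routes to PureVideo/UVD/Quick Sync.
import subprocess

def list_hwaccels():
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-hwaccels"],
        capture_output=True, text=True, check=True,
    )
    # First line is a header; the rest is one acceleration method per line.
    lines = [ln.strip() for ln in out.stdout.splitlines()]
    return [ln for ln in lines[1:] if ln]

print(list_hwaccels())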

All that an MX440 would have is IDCT, which only helps a bit with
decoding macroblocks in compressed formats. It hardly helps at all in
terms of the total video decoding process.

http://en.wikipedia.org/wiki/IDCT
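
To make that concrete: the IDCT stage is just a 2-D inverse DCT applied
to each 8x8 block of coefficients, and that is essentially the only
piece an IDCT-only chip takes off the CPU -- entropy decoding, motion
compensation and the rest stay in software. A toy sketch with scipy
(real decoders use fast fixed-point integer versions):

# One 8x8 inverse-DCT stage, the part an IDCT-only chip (MX440 era) offloads.
import numpy as np
from scipy.fft import idct

def idct_8x8(coeffs):
    # 2-D inverse DCT = 1-D inverse DCT along rows, then along columns.
    return idct(idct(coeffs, axis=0, norm="ortho"), axis=1, norm="ortho")

# A block holding only a DC coefficient decodes to a flat 8x8 patch.
block = np.zeros((8, 8))
block[0, 0] = 8.0
print(idct_8x8(block).round(2))   # 1.0 in every position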

A bit further along in time, but still before UVD or PureVideo, some
cards supported hardware scaling, so you could take a 320x240 video and
play it full screen. That made quite a difference between one generation
of video card and the next when I was testing Adobe Flash full screen.
I don't really have anything here with UVD or PureVideo, but depending
on the version, they can be pretty effective for the Hollywood formats.

Anandtech seemed to be enamored with core clock rate for things like
UVD. Since the decoder is a dedicated video-decoding block, the feeling
was that even a low-end card with that block is sufficient, as long as
the core clock is fast enough. So you'd be looking for a cheap card with
a high clock rate. Even a $50 card would have an effective decoder now
(say, an HD 5450).

Initially, when UVD and PureVideo came out, some of the "high end
gamer cards" had the video decoder disabled, and the decoding was done
in shaders instead, leaving the people who paid for the high-end cards
feeling a bit ripped off. All new cards should have something thrown in,
and the dedicated video-decoder block probably doesn't differ between a
$50 card and a $500 card.

Paul
 
PC Guy said:
If we're talking about video-card performance only in terms of video
playback (playing hi-def MKV files using VLC for example) and if we're
NOT talking at all about anything to do with video games, then what sort
of hardware acceleration or processing is done for video decoding /
playback on the video card -> vs being done by the computer's CPU?

In nVidia's case it depends on what version of the PureVideo video
processing hardware the card implements and the codec your media player
uses to access it. AMD has its Unified Video Decoder. Both are accessed
through Microsoft's DirectX VA (Video Acceleration) API. Exactly what is
being parsed or handled by the CPU versus the GPU is over my head. All I
know is that hardware-assisted H.264 decoding is supposedly well
supported in graphics cards these days, which is good news if you like
the Matroska video format.

http://en.wikipedia.org/wiki/Nvidia_PureVideo
http://en.wikipedia.org/wiki/Unified_Video_Decoder
Does the amount of RAM (say, 128 MB vs. 512 MB or more) play any role in
HD video playback?

Not so much system memory; it's the size of the video card's own
framebuffer that would matter, and that's not an issue with the amounts
cards ship with these days.
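
For a sense of scale (back-of-the-envelope only, assuming 4:2:0 decode
surfaces and a 32-bit desktop-resolution display surface):

# Rough video-memory budget for decoded 720p playback.
w, h = 1280, 720
decode_surface = w * h * 12 // 8            # one 4:2:0 frame, ~1.3 MiB
display_surface = 1920 * 1080 * 4           # one 32-bit full-screen surface

# H.264 can keep several reference frames around, plus a few surfaces
# queued for display, and the total is still only a few tens of megabytes.
total = 8 * decode_surface + 3 * display_surface
print(f"~{total / 2**20:.0f} MiB")          # roughly 34 MiB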
I'm wondering if your typical modern $200 or $300 video card offers
anything in terms of better video playback (as opposed to 3-D gaming)
vs. say a 5-year-old $50 AGP-8X Nvidia 6200 card or a 7-year-old Nvidia
AGP-8X MX440?

With the right software, you can do full HD playback of H.264 videos
with a $200 or $300 modern card, leaving your CPU free to choke on other
stuff.

Tony.
 
Fierce said:

===========
The first generation PureVideo GPUs (GeForce 6 series) spanned a wide
range of capabilities. On the low-end of GeForce 6 series (6200),
PureVideo was limited to standard-definition content (720x576). The
mainstream and high-end of the GeForce 6 series was split between older
products (6800 GT) which did not accelerate H.264/VC-1 at all, and newer
products (6600 GT) with added VC-1/H.264 offloading capability.
===========

Am I reading this correctly?

Is the above statement saying that the 6800GT is technically inferior to
the 6600GT?
Not so much system memory; it's the size of the video card's own
framebuffer that would matter, and that's not an issue with the amounts
cards ship with these days.

I was actually talking about the amount of RAM on the video card.

Why would having 128 MB vs. 256 MB or 512 MB give a video card better or
smoother video playback?

I was trying to play a video file with these specifications:

Codec ID : V_MPEG4/ISO/AVC
Bit rate : 3550 Kbps
Width : 1280 pixels
Height : 720 pixels
Frame rate : 29.970 fps
Writing library : x264 core 120 r2164 da19765

VLC did a poor job playing that file. I was getting about one or two
seconds' worth of frames that played OK before it stuttered and
pixelated for a second or two, then played again. The audio was AC3
stereo, and it played just fine. This is VLC player version 2.0.0
(Twoflower).

However, when I play the same file using Media Player Classic, the file
plays really well.

This is on a PC with an Nvidia GeForce4 MX440 AGP-8X video card (with,
I think, 64 MB of RAM). The system is based on a Soyo motherboard with
the Intel 845 chipset and has a Pentium 4 CPU running at 2.53 GHz
(533 MHz FSB) with 512 KB of L2 cache. The system has 512 MB of system
RAM and is running Windows 98 SE (enhanced with KernelEx API
extensions).
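
For a rough sense of what that file asks of the machine, here is a
back-of-the-envelope sketch (assuming the decoder emits 4:2:0 frames):

# Decoded output rate for a 1280x720, 29.97 fps, 3550 kbps H.264 stream.
w, h, fps = 1280, 720, 29.97
compressed = 3550e3 / 8 / 1e6               # ~0.44 MB/s off the disk
decoded = w * h * 12 / 8 * fps / 1e6        # ~41 MB/s of raw 4:2:0 pixels

print(f"compressed: ~{compressed:.2f} MB/s, decoded: ~{decoded:.0f} MB/s")
# Every decoded byte comes out of entropy decoding, motion compensation,
# IDCT and deblocking -- all on the CPU when nothing is offloaded.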

Most of the movie and TV video files I've been downloading over the
past 3 years play just fine on this system, but there is an increasing
number of HD video files that VLC can't render smoothly.

I have several Nvidia 6200 AGP-8X video cards (with 256 MB of RAM) that
I can swap into this system, but I was wondering if those cards will do
better than the MX440 that I have now.
 
PC said:
I have several Nvidia 6200 AGP-8X video cards (with 256 MB of RAM) that
I can swap into this system, but I was wondering if those cards will do
better than the MX440 that I have now.

You need this chart. What's funny is that not all the members of the
6200 family have the same tick boxes.

http://www.nvidia.com/docs/CP/11036/PureVideo_Product_Comparison.pdf

And acceleration is only available if the program uses the appropriate
API. If the developer isn't using the video card's acceleration
features, then it's going to take more CPU.
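
One way to see that difference directly on a newer Windows machine is to
time a software-only decode against a DXVA2-assisted one using ffmpeg's
-benchmark option. A hedged sketch, assuming a recent ffmpeg on the PATH
and a hypothetical test file (none of this runs on a Windows 98 box, of
course):

# Sketch: compare CPU time for software decode vs. DXVA2-assisted decode.
import subprocess

def bench(extra_args, path):
    cmd = ["ffmpeg", "-hide_banner", "-benchmark", *extra_args,
           "-i", path, "-f", "null", "-"]
    # -benchmark makes ffmpeg print bench: utime/stime/rtime lines on stderr.
    result = subprocess.run(cmd, capture_output=True, text=True)
    return [ln for ln in result.stderr.splitlines() if ln.startswith("bench:")]

path = "sample_720p.mkv"                     # hypothetical test file
print("software:", bench([], path))
print("dxva2   :", bench(["-hwaccel", "dxva2"], path))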

Some free software doesn't even use SSE when it's available, and that's
why there can be such a wide range of CPU loadings for movie playback.
One of the last things free software developers work on is hardware
acceleration, and that's because they receive such poor support from the
hardware companies. Free software developers fear signing an NDA like it
was "The Plague", and hardware companies don't generally give out
interesting documentation without an NDA.

Paul
 