Hench said:
I have a crappy XF brand (or something like that) Radeon 4350, PCI-E.
It's got 1 GB DDR2 dedicated, and Windows 7 claims it can use another
3 GB shared RAM, although I don't know how.
The card can run up to 650 MHz but likes it around 450 MHz, and the
card memory is at 400 MHz.
I put it in my system to play one game, Civilization 4, because the
Intel HD 4000 wouldn't play that game.
When playing Civ, the card hits 92C for a long time. The case has
plenty of airflow and I haven't even put the cover back on. There are
no obstructions around the card, and there is no fan on the card.
One: how do I get the card to use 4 GB of RAM? (I have 32 GB of RAM.)
Two: is 92C a problem for this card?
System specs just in case:
650W 80% supply
Intel quad-core i7-3770K running around 3800 MHz
32 GB DDR3 RAM at 1600 MHz
SSD reads about 450 MB/sec
2 case fans
Windows 7 Pro 64-bit
A description of how the memory thing works starts here, but
it isn't exactly right.
http://en.wikipedia.org/wiki/Graphics_address_remapping_table
If we go here...
http://msdn.microsoft.com/en-us/library/windows/hardware/gg463285.aspx
"PCIe Graphics & AGP
By definition, AGP requires a chipset with a graphics address relocation
table (GART), which provides a linear view of nonlinear system memory
to the graphics device.
PCIe, however, requires that the memory linearization hardware exist
on the graphics device itself instead of on the chipset. Consequently,
driver support for memory linearization in PCIe must exist in the
video driver, instead of as an AGP-style separate GART miniport driver.
Graphics hardware vendors who want to use nonlocal video memory in
their Windows XP driver model (XPDM) drivers must implement both memory
linearization hardware and the corresponding software. All PCIe graphics
adapters that are compatible with the WDDM must support memory
linearization in hardware and software."
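To picture what "memory linearization" means, here is a toy sketch
(plain Python, all names mine) of a GART-style page table: the device
addresses one contiguous aperture, while the backing pages in system
RAM are scattered. On AGP the chipset held this table; on PCIe, per
the quote above, the equivalent hardware lives on the card itself.

    # Toy model of a GART-style remapping table (illustrative only).
    PAGE_SIZE = 4096

    # Hypothetical scattered physical pages handed out by the OS.
    scattered_pages = [0x1A000, 0x07000, 0x33000, 0x0C000]

    # Remapping table: aperture page index -> physical page base.
    gart = {i: base for i, base in enumerate(scattered_pages)}

    def translate(aperture_addr):
        """Translate a linear aperture address to a physical address."""
        page, offset = divmod(aperture_addr, PAGE_SIZE)
        return gart[page] + offset

    # The device sees 16 KB of "linear" memory starting at aperture 0,
    # even though the backing pages are not contiguous.
    for addr in (0x0000, 0x1000, 0x2FFF):
        print(f"aperture {addr:#07x} -> physical {translate(addr):#07x}")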
These are examples of where ATI and Nvidia cheaped out. It didn't
last too long, once they got customer feedback. Those designs were
leaning on system memory a little too much.
http://en.wikipedia.org/wiki/HyperMemory
http://en.wikipedia.org/wiki/TurboCache
http://www.anandtech.com/show/1679/2
"HyperMemory is ATI Technologies' method of using the motherboard's
main system RAM as part of or all of the video card's framebuffer
memory. It relies on new fast data transfer mechanisms within PCI Express."
"TurboCache - similar technology implemented by NVIDIA."
The terminology no longer comes up with more modern cards.
I don't think that implies it's completely dead, though.
It's just not popular from a marketing perspective.
http://howtotroubleshoot.blogspot.ca/2007/05/how-to-disable-turbocache.html
Someone in the newsgroups mentioned a small utility that reports
how much local video RAM is available, and how much system memory can
be used, but I can't remember the name of it. (Maybe it was John Doe
who mentioned it?)
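I can't name that utility, but you can at least read the dedicated
number yourself. A minimal sketch in Python (assumes the third-party
"wmi" package; note that Win32_VideoController.AdapterRAM is a 32-bit
field, so it caps at 4 GB and does not count the shared portion):

    # Rough check of dedicated video RAM via WMI (Windows only).
    # Requires: pip install wmi pywin32
    # Caveat: AdapterRAM is a uint32 (caps at 4 GB) and reports
    # dedicated memory only -- not the system RAM that HyperMemory/
    # TurboCache style schemes can borrow.
    import wmi

    for gpu in wmi.WMI().Win32_VideoController():
        ram_mb = (gpu.AdapterRAM or 0) / (1024 * 1024)
        print(f"{gpu.Name}: {ram_mb:.0f} MB dedicated")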
On my video card, local video RAM has a bandwidth of 42GB/sec.
For loading structures over PCI Express, someone did a theoretical
calculation for my chipset and claimed it can barely manage 2GB/sec
(due to the limited PCI Express packet size in my chipset). So
accessing system memory for this purpose is definitely inferior
and to be avoided. It's one thing to swap out textures at the
end of a level, in which case the 2GB/sec is fine (level reload).
But if you were trying to render graphics using structures that
stayed in system memory, that would be really, really slow. And
that's why the previous link is so interested in disabling TurboCache.
The processor memory might have an awesome bandwidth rating,
but the chipset is a limit that is seldom measured (if at all)
on enthusiast sites. And a good video card can easily
beat my 42GB/sec figure with respect to local video memory.
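To see roughly where a "barely 2GB/sec" number comes from, here is
the back-of-envelope arithmetic as I understand it (my own sketch;
the 20-byte per-packet overhead and the payload sizes are assumptions
typical of desktop chipsets of that era, not measured values):

    # Back-of-envelope PCIe 1.x efficiency (a sketch, not a benchmark).
    # 2.5 GT/s per lane with 8b/10b encoding -> 250 MB/s per lane per
    # direction, so a x16 link moves 4 GB/s of raw bytes. Each packet
    # (TLP) carries ~20 bytes of header/sequence/CRC/framing, so a
    # small maximum payload eats into the usable fraction.
    LANES = 16
    RAW_PER_LANE = 250e6      # bytes/sec after 8b/10b
    OVERHEAD = 20             # assumed bytes of overhead per TLP

    raw = LANES * RAW_PER_LANE
    for payload in (64, 128, 256):
        eff = payload / (payload + OVERHEAD)
        print(f"{payload:3d}B payload: {eff:6.1%} -> {raw*eff/1e9:.2f} GB/s")

Flow-control and ACK packets, read-completion latency, and chipset
buffering take a further cut, which is how a x16 slot ends up nearer
2GB/sec in practice. Either way it's roughly a twentieth of the
42GB/sec of local video RAM, which is the point. (And 2GB/sec is still
fine for a level reload: a 1GB texture set moves in about half a second.)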
*******
GPU-Z can measure video card temperature.
Put the cover back on your PC, and continue to monitor the card.
Sometimes the video card temp drops after you close the cover.
Long term, you want a temp lower than 92C.
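GPU-Z can also log its sensors to a file while you play ("Log to
file" on the Sensors tab). A small sketch to pull the worst case out
of the log afterwards (the column header varies by version and card,
so adjust TEMP_COL to match the header line in your own log):

    # Summarize a GPU-Z sensor log (comma-separated text file).
    import csv

    TEMP_COL = "GPU Temperature [C]"   # a guess -- check your header

    temps = []
    with open("GPU-Z Sensor Log.txt", newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                temps.append(float(row[TEMP_COL]))
            except (KeyError, TypeError, ValueError):
                continue               # skip malformed lines

    if temps:
        print(f"samples: {len(temps)}  max: {max(temps):.1f}C  "
              f"avg: {sum(temps)/len(temps):.1f}C")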
The chip is simulated by the designers at temps like 105C.
The chip packaging (the materials around the silicon die) may
have a lower limit. In general, you'd like a lower
operating temp than 92C, although plenty of laptop components
run their entire life at awful temps like that. And they seem
to last at least 3 years, until the warranty is gone.
I don't know all the responses a video card and driver can
give, or whether the driver will actually down-clock the
card if it is overheating. If putting the cover back
on the PC does not improve the temperature, then
fit an 80mm fan next to the passive card heatsink
and blow some air on it. I have one passive FX5200 here
that crashes games if you don't blow air on it.
And that's one pathetic card! How could it overheat?
I ended up fitting an 80mm fan next to the card.
To fit a fan like that, examine the computer case where
the faceplates fit. The faceplate screw is a place you
can fit a metal strip. I bought aluminum angle iron ($$$)
at Home Depot, drilled one hole in the end so that the
faceplate screw could go through it, then used nylon ties
to hold an 80mm fan to the L-shaped cross-section aluminum.
Being aluminum, the bar itself doesn't drag down the
assembly too much. That's how I hold the fan in place.
If I have to move that PC, I'd remove my fan assembly
for transport, because any shaking would rip it apart.
http://img404.imageshack.us/img404/7965/fanvideo.jpg
Note that what makes that work is that the angle iron is
butted right against the back of the computer case. That
is what prevents it from drooping. You have to drill the
hole pretty precisely (and I wasn't 100% successful at it).
In that picture, the 80mm fan blows upwards, against the
heatsink on the GPU. The 80mm fan gets power from a motherboard header.
The wiring junk you see in front of that stuff is for running
the fan on the back of the case.
*******
http://www.tomshardware.com/reviews/gdc-2010-borderlands,2580-2.html
"He pointed out that Civilization IV’s graphics engine was actually
CPU-bound, and many systems would start to see serious frame rate
degradation as the unit count on screen approached 100."
That hardly seems likely on a 3770K.
There must be another explanation.
Paul