Jorabi
I have an ATI Radeon 9250, which has a DVI output. The DVI output is
hooked to my Sony HDTV, while the VGA port is hooked to my PC monitor.
The card is in Clone mode.
The problem is that whenever the TV's video source is switched
away from the DVI input, or the TV is shut off, the DVI output
goes dead and can't be seen again!
ATI says to go into the console app and do a 'detect displays'
every time I switch the TV over to the DVI input. Is that
ridiculous or what? I can also bring it back by doing anything
that resets the output, like changing the resolution up and
back.
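For what it's worth, that resolution-reset trick can probably be scripted so it's one double-click from the couch instead of a trip through the control panel. The sketch below is only a guess at automating it on Windows, using Python's ctypes to call the Win32 ChangeDisplaySettingsW function; whether re-applying the registry display mode actually wakes this card's DVI output is an assumption on my part.

```python
# Minimal sketch (assumption: re-applying the registered display mode has the
# same effect as changing the resolution up and back). Per the Win32 docs,
# calling ChangeDisplaySettingsW with a NULL DEVMODE pointer and zero flags
# re-applies the display mode stored in the registry.
import ctypes

DISP_CHANGE_SUCCESSFUL = 0

result = ctypes.windll.user32.ChangeDisplaySettingsW(None, 0)
if result == DISP_CHANGE_SUCCESSFUL:
    print("Display mode re-applied; check whether the DVI output came back.")
else:
    print("ChangeDisplaySettingsW returned", result)
```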
I just want to be able to have that output active ALWAYS, so
I can switch the TV over to the PC feed whenever I feel like
it (and not have to get out of my La-Z-Boy). I have tried the ATI
"Force Display" feature (or something like that), but it had no effect.
Do all makes/models of cards have this limitation? ATI implied
to me that they all do.