Peter Werner
First, my setup:
Macintosh G4/466 "Digital Audio" running OS 9.2.2
ATI Radeon 32 MB DDR video card
Sony SDM-X82 TFT LCD Monitor
This works just fine through its VGA connection, but both my card and
the monitor have a DVI port and I've really been wanting to use this
for my connection. However, when I disconnect the VGA cable and
connect the DVI cable, about halfway through the startup, the screen
goes blank and I get a (Sony) error message on my monitor that says:
Out of Scan Range
Resolution > 1280 X 1024
This is odd, because my resolution is in fact set to 1280 X 1024 at 60
Hz. That's supposed to be well within the range for my monitor and
what I use for VGA.
Some further reading on the subject leads me to believe that it might
have something to do with the fact that the DVI port on the video card
is DVI-I while the port on the monitor is DVI-D. The cable itself is
the one that came with the monitor and is DVI-D to DVI-D. However,
I've also heard that one should be able to plug this kind of cable
into a DVI-I port without any problem.
Anybody know what's going on? If I need to use another resolution/Hz
rate, then what do I use? Should I get some kind of DVI-I to DVI-D
adapter?
Let me know,
Peter
Macintosh G4/466 "Digital Audio" running OS 9.2.2
ATI Radeon 32 Mb DDR video card
Sony SDM-X82 TFT LCD Monitor
This works just fine through its VGA connection, but both my card and
the monitor have a DVI port and I've really been wanting to use this
for my connection. However, when I disconnect the VGA cable and
connect the DVI cable, about halfway through the startup, the screen
goes blank and I get a (Sony) error message on my monitor that says:
Out of Scan Range
Resolution > 1280 X 1024
This is odd, because my resolution is in fact set to 1280 X 1024 at 60
Hz. That's supposed to be well within the range for my monitor and
what I use for VGA.
Some further reading on the subject leads me to believe that it might
have something to do with the fact that the DVI port on the video card
is DVI-I while the port on the monitor is DVI-D. The cable itseld is
the one that came with the monitor and is DVI-D to DVI-D. However,
I've also heard that one should be able to plug this kind of cable
into a DVI-I port without any problem.
Anybody know what's going on? If I need to use another resolution/hz
rate, then what do I use? Should I get some kind of DVI-I to DVI-D
filter?
Let me know,
Peter