A lot of the monitors I look at in the store have a native resolution
of 1440x900. I looked at the possible settings for my video card (ASUS
N6200LE) under start>control panel>Nvidia Control Panel and I see that
there is no 1440x900. The closest settings are 1360x768 and 1600x900.
Does this mean...
1. I have a crappy video card that doesn't have enough resolution
choices to match a reasonable monitor
No.
It means either that the resolution will show up once the monitor
is actually hooked up, or that you can set a custom resolution
with any semi-recent video driver (from http://www.nvidia.com).
There's a button for it in one of the Display Properties menu
items; you have to wade through a few menu choices to reach it,
IIRC.
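If you want to see exactly which modes the driver exposes once the
monitor is plugged in, a quick Python sketch like the one below will
list them. It just walks the standard Win32 EnumDisplaySettings API
through ctypes; the DEVMODEW layout is taken from wingdi.h, with the
printer/position union flattened into a blob since nothing here reads
it. Treat it as a convenience sketch, not anything official.

    # List every display mode the installed driver reports for the
    # primary monitor (Windows only, standard library only).
    import ctypes
    from ctypes import wintypes

    class DEVMODEW(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmUnion", ctypes.c_byte * 16),   # position/orientation union, unused here
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("dmICMMethod", wintypes.DWORD),
            ("dmICMIntent", wintypes.DWORD),
            ("dmMediaType", wintypes.DWORD),
            ("dmDitherType", wintypes.DWORD),
            ("dmReserved1", wintypes.DWORD),
            ("dmReserved2", wintypes.DWORD),
            ("dmPanningWidth", wintypes.DWORD),
            ("dmPanningHeight", wintypes.DWORD),
        ]

    def list_display_modes():
        user32 = ctypes.windll.user32
        dm = DEVMODEW()
        dm.dmSize = ctypes.sizeof(DEVMODEW)
        modes = set()
        i = 0
        # iModeNum = 0, 1, 2, ... walks the driver's mode table; the call
        # returns 0 once we run past the last mode.
        while user32.EnumDisplaySettingsW(None, i, ctypes.byref(dm)):
            modes.add((dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency))
            i += 1
        for w, h, hz in sorted(modes):
            print(f"{w}x{h} @ {hz} Hz")

    if __name__ == "__main__":
        list_display_modes()

If 1440x900 shows up in that list with the new monitor attached, the
card and driver are doing their job.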
2. The monitors I looked at were crappy and didn't have a native
resolution that matches a reasonable video card
No, although 1440x900 may not be such a desirable resolution
these days: it isn't a very large screen and doesn't offer much
detail. Of course, a monitor that size takes up less desk space
and costs less than a larger one.
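To put a rough number on the "not much detail" point, here's a
back-of-the-envelope calculation. The 19-inch diagonal is just an
assumed example, since that's a common size for 1440x900 widescreen
panels:

    import math

    # Assumed example: a 19-inch widescreen panel running its native 1440x900.
    width_px, height_px = 1440, 900
    diagonal_in = 19.0   # hypothetical panel size, not from the original post

    diagonal_px = math.hypot(width_px, height_px)   # pixel count along the diagonal
    ppi = diagonal_px / diagonal_in
    megapixels = width_px * height_px / 1e6

    print(f"{width_px}x{height_px} on a {diagonal_in:.0f}-inch panel is about "
          f"{ppi:.0f} PPI and {megapixels:.2f} megapixels")
    # Prints roughly 89 PPI and 1.30 megapixels.

Around 89 PPI is perfectly usable for desk work, just not especially
sharp.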
3. The process of matching a video card to a monitor is
time-consuming and doesn't always work out
With some brands and/or older drivers this might be true, but you
should have no trouble with your card and any semi-recent full
driver from nVidia's website. A third-party driver supplied by a
motherboard manufacturer may or may not have all the features,
and may or may not actually be recent even if its date makes it
look as new as the others.
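If you're not sure which display driver is currently installed,
something like the sketch below will tell you what to compare against
nVidia's download page. It just shells out to the stock wmic tool
(included with XP Pro and most later versions of Windows), so again
treat it as a convenience sketch:

    import subprocess

    # List the display adapter(s) Windows knows about, with driver
    # version and date, via WMI through the stock wmic tool.
    result = subprocess.run(
        ["wmic", "path", "Win32_VideoController",
         "get", "Name,DriverVersion,DriverDate"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout)

Compare the version and date it prints with what nvidia.com offers
for your card.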
What do you recommend? Do you like my video card or did I buy the
wrong one? Am I supposed to just keep looking until I find a matching
monitor?
Install a recent driver; it should work.
And one final question.
Is the refresh rate something that I can safely change?
Not really, except within the range the monitor's spec sheet
states, which is generally around 60-70Hz. That is fine, because
LCDs don't have the flicker problem CRTs do in that refresh-rate
range. 60Hz is a good value to use until you have a reason to use
another.
I always thought it was dangerous to the monitor to change it, but in
the documentation for my video card there is a control to change the
refresh rate, and it says all it does is help eliminate flicker, as if
it were something you can set to your preference.
Many years ago, CRTs had no facility to check and reject an
out-of-range refresh rate, and running at the wrong rate could
damage them. No recently made CRT has this problem, and LCDs
never did. In other words, with any monitor that accepts the rate
you're trying to set, it is fine to use it, but you should notice
little if any improvement on an LCD from raising it, because it
is a different display technology.
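If you want to be extra careful, Windows can also validate a mode
without ever applying it. The sketch below uses the standard
ChangeDisplaySettingsW call with the CDS_TEST flag (same DEVMODEW
layout as the earlier listing sketch; 1440x900 @ 60Hz is just an
example target), so nothing on screen changes no matter what it
reports:

    # Ask the driver whether a mode would be accepted, without
    # switching to it (Windows only, standard library only).
    import ctypes
    from ctypes import wintypes

    class DEVMODEW(ctypes.Structure):
        _fields_ = [
            ("dmDeviceName", wintypes.WCHAR * 32),
            ("dmSpecVersion", wintypes.WORD),
            ("dmDriverVersion", wintypes.WORD),
            ("dmSize", wintypes.WORD),
            ("dmDriverExtra", wintypes.WORD),
            ("dmFields", wintypes.DWORD),
            ("dmUnion", ctypes.c_byte * 16),   # position/orientation union, unused here
            ("dmColor", ctypes.c_short),
            ("dmDuplex", ctypes.c_short),
            ("dmYResolution", ctypes.c_short),
            ("dmTTOption", ctypes.c_short),
            ("dmCollate", ctypes.c_short),
            ("dmFormName", wintypes.WCHAR * 32),
            ("dmLogPixels", wintypes.WORD),
            ("dmBitsPerPel", wintypes.DWORD),
            ("dmPelsWidth", wintypes.DWORD),
            ("dmPelsHeight", wintypes.DWORD),
            ("dmDisplayFlags", wintypes.DWORD),
            ("dmDisplayFrequency", wintypes.DWORD),
            ("dmICMMethod", wintypes.DWORD),
            ("dmICMIntent", wintypes.DWORD),
            ("dmMediaType", wintypes.DWORD),
            ("dmDitherType", wintypes.DWORD),
            ("dmReserved1", wintypes.DWORD),
            ("dmReserved2", wintypes.DWORD),
            ("dmPanningWidth", wintypes.DWORD),
            ("dmPanningHeight", wintypes.DWORD),
        ]

    ENUM_CURRENT_SETTINGS = -1
    CDS_TEST = 0x00000002
    DISP_CHANGE_SUCCESSFUL = 0
    DM_PELSWIDTH = 0x00080000
    DM_PELSHEIGHT = 0x00100000
    DM_DISPLAYFREQUENCY = 0x00400000

    user32 = ctypes.windll.user32

    # Start from the current mode, then ask about 1440x900 @ 60Hz
    # (substitute whatever mode you are curious about).
    dm = DEVMODEW()
    dm.dmSize = ctypes.sizeof(DEVMODEW)
    user32.EnumDisplaySettingsW(None, ENUM_CURRENT_SETTINGS, ctypes.byref(dm))

    dm.dmPelsWidth = 1440
    dm.dmPelsHeight = 900
    dm.dmDisplayFrequency = 60
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY

    # CDS_TEST only validates; nothing on screen changes either way.
    result = user32.ChangeDisplaySettingsW(ctypes.byref(dm), CDS_TEST)
    if result == DISP_CHANGE_SUCCESSFUL:
        print("1440x900 @ 60 Hz would be accepted")
    else:
        print(f"Mode rejected, return code {result}")

A successful result just means the driver would accept the mode; the
monitor's own spec sheet still has the final say on what it displays
well.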
On an LCD, the bigger improvement usually comes from using DVI
instead of the analog D-sub connection. 1440x900 is only a
moderate resolution, though; as the resolution gets higher, DVI
makes even more of a difference.