OK, I just realized I didn't catch all of your initial post.

> I would have thought the same actually, except that the problem first appeared after a graphics-driver-related BSOD, and the image isn't good on the second monitor (I mean, it's as readable as it is now, writing from my primary monitor over VGA... at 1280x1024 on a widescreen... but it's still not as good as it was). The collections of white vertical lines over the BIOS background are what worry me.
Now I'd agree: it's starting to sound like your vid card is going bad. I'm sure you know they build those things at the factory with a certain amount of smoke, and if you let it out, well, that's just not good, mmmkay?
> I don't have a spare PCIE one unfortunately; certainly that'll be my first instruction once I've got it down at the shop tomorrow.

No spare vid card! Heathen! Get a real cheap one to try this with, because if we find out your card is bad, you can just return it and get a nicer one. I say get a cheap one because there's always the possibility that your mobo is going bad and sent the vid card some juice outside of its electrical tolerances, which caused the problems. You don't want to go buy a real nice $200 card and then have it fried too. (Not like this is the voice of experience talking either.)
> Do VGA connections have higher tolerances to signal fluctuations or something? Because I'm struggling to come up with a reason that the VGA input would work, shoddily, while the DVI didn't (by the way, by 'VGA input' I really mean a VGA-DVI converter sitting on my usual output).

Honestly I don't know for sure, but I *think* a DVI-I port (the kind with the four extra pins around the flat blade) carries the normal analog VGA signal alongside the digital one, and that analog signal is what your VGA-DVI converter is passing through. The digital DVI link, on the other hand, has tighter signal thresholds (strength, voltage, etc.) and is less forgiving of a marginal signal or failing hardware. VGA is old and analog, so as long as you feed the monitor something within its ranges and tolerances it'll "work", even if it looks rough. I'm not 100% sure about all of that, so someone else can correct me/keep me honest.
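One more thought: if the shop (or you) can boot the machine off a Linux live USB, here's a quick and dirty way to see which outputs the card itself still reports, without trusting the Windows driver that BSOD'd. This is just a sketch that reads the kernel's standard DRM sysfs entries; the exact connector names (card0-DVI-D-1, card0-VGA-1, and so on) depend on your card and driver, so treat those as examples, not gospel.

```python
#!/usr/bin/env python3
"""Sketch: list the GPU connectors the kernel sees and whether a monitor
is answering on each. Assumes a Linux live environment; /sys/class/drm is
standard kernel sysfs, but connector names vary by card and driver."""
import glob
import os

for conn in sorted(glob.glob("/sys/class/drm/card*-*")):
    name = os.path.basename(conn)  # e.g. card0-DVI-D-1, card0-VGA-1
    try:
        with open(os.path.join(conn, "status")) as f:
            status = f.read().strip()  # "connected" or "disconnected"
        with open(os.path.join(conn, "edid"), "rb") as f:
            edid = f.read()  # non-empty means the monitor answered on that link
    except OSError:
        continue  # skip entries that aren't real connectors
    print(f"{name}: {status}, EDID bytes: {len(edid)}")
```

If the DVI connector shows "disconnected" or an empty EDID while the VGA one answers fine, that would point at the card's digital output stage rather than the cable or the monitor. Not definitive, but it's free and takes two minutes.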