So, moving today, I had a monitor die, which led me to pull out an older monitor that had given me issues but that I hoped would work.
Power it on, get the Acer logo, then 'No signal' dancing around. So it works on some level! Plug it in DVI to DVI: no signal. Hmm. Take it and try connecting it to a laptop with a DVI-to-HDMI cable: success! The monitor does in fact work.
Back into my room. Unplug my HDMI monitor and plug the trouble monitor into my computer using the DVI-to-HDMI cable (DVI+VGA monitor, HDMI port on the video card). It works! Gasp! Either the cable or the DVI port on my video card is the problem! Except...
I bring up screen resolution to fiddle with it. I change the setting from 1680x1050 down to 1440x900. It adjusts. As I click to confirm the changes, it blanks out. No signal, just like it did with the DVI-to-DVI cable. Now it behaves exactly the same with this cable as it did with the other, i.e. not at all.
So now I'm back on the tiny HDMI TV that I use as a secondary monitor, trying to figure out what my next step should be to get this thing working. The trouble monitor has DVI and VGA inputs. My video card has DVI, HDMI, and DisplayPort ports. The DVI port worked fine with another DVI monitor (the one that died during the move) long after this problem started with this particular monitor.
Thoughts? More information that would help? Etc.? I'm thinking I might try the monitor's VGA input with an adapter and see how that goes, but I'm not sure whether adapting from the DisplayPort or the DVI port on my card would be smarter.
Posts
DisplayPort or DVI, or failing those HDMI, is certainly to be preferred over VGA. However, on most graphics cards with a DVI connector, the analog VGA signals are actually present in the same connector, so a DVI->VGA adapter should give you VGA.
Now to the issue at hand, which is somewhat odd. Here is what I am thinking (making a few assumptions).
While moving, your monitor was of course disconnected from power, so it may have forgotten which input you normally use. Make sure your monitor is set to the input you want at that moment. Some monitors will automatically switch to whatever is connected, but others won't and need to be set manually if you're not using the default connector.
Finding out that this is a common Acer headache also revealed a sometimes-fix: you can do a full reset of the monitor by unplugging it and holding the power button for a while. That got me a screen again, but the second anything changed in the graphics settings (in this case, swapping the monitor arrangement so it was on the right side), it went back to no signal.
I'm thinking I either need to get a DVI-to-VGA adapter so I can try the VGA input, or try to get the screen running, not touch anything finicky, and look into the monitor settings (since, as far as I can tell, I can't bring up the on-screen menu to configure them, for some reason).
Am I correct in assuming DVI-DVI and DVI-HDMI would both be digital by default, while DVI-VGA would give me the analog signal?
Worth noting is that the DVI connectors on most graphics cards carry both the digital and analog signals, so you don't need a gizmo that converts the digital signal to analog; you just need a passive adapter that lets you connect the VGA cable. Those adapters used to be included when one bought a graphics card, so chances are that anyone who has ever replaced a graphics card will have one lying around (I know I have a few somewhere).
Using VGA should be fine. The only real issue you may see is that the computer can't always tell what resolutions the monitor supports, so it may let you select unsupported resolutions, or it may not offer the specific resolution you want (the latter is unlikely, and if it happens there are tools that can fix it). You may also need to adjust the picture position on the monitor, but mostly that is handled automatically.
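On the "tools that can fix it" point: if you happen to be on Linux, a missing resolution can usually be added by hand with `cvt` and `xrandr`. A rough sketch (the output name `VGA-1` is an assumption — run `xrandr` alone first to see what your output is actually called, and check the modeline `cvt` prints on your machine):

```shell
# List outputs and the modes the driver detected
xrandr

# Generate a CVT modeline for the resolution the monitor should support
cvt 1680 1050 60

# Define that mode, attach it to the VGA output, and switch to it
# (modeline values below are what cvt typically prints for 1680x1050 @ 60 Hz)
xrandr --newmode "1680x1050_60.00" 146.25 1680 1784 1960 2240 1050 1053 1059 1089 -hsync +vsync
xrandr --addmode VGA-1 "1680x1050_60.00"
xrandr --output VGA-1 --mode "1680x1050_60.00"
```

On Windows the equivalent is the "custom resolution" option in the NVIDIA/AMD control panel.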