Dual monitor setup issue - both monitors detected but second screen is black

I recently upgraded my computer from using onboard graphics to a GTX 960 (GV-N960OC-4GD).

I previously had a dual monitor setup, with my primary monitor (SyncMaster2233sw) using a DVI cable and the secondary monitor (Syncmaster 173v) using a VGA cable. I purchased a DVI to VGA adapter to connect the secondary monitor, and while it detects the monitor the screen remains black as if it is in standby mode.

(screenshot)

I can even move windows and my mouse onto the second monitor, however it obviously just disappears from sight. I've also tried switching the cables to see if it was a faulty port, however my primary monitor works in both, just the secondary that remains black.

Any help would be much appreciated!

NVIDIA Control Panel:

(screenshot)

GPU drivers:

(screenshot)

UPDATE: The problem was just a faulty VGA-to-DVI adapter. That's what I get for buying it off eBay, I guess.

UPDATE 2: It turns out that I needed a DVI-I adapter, as my secondary screen apparently only accepts an analogue signal, whereas I was using a DVI-D (digital-only) adapter.


7 Answers

Assuming that you are using a Windows machine, try the following:

  1. Check that the cable connections are secure.
  2. Press Windows key + P.
  3. Select the Duplicate or Extend option.
  4. If the screen still does not appear, check that the second monitor is powered on.
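If the Win+P overlay itself is misbehaving, the same projection modes can also be set from a command prompt via `DisplaySwitch.exe` (a sketch assuming Windows 7 or later, where this utility ships with the OS):

```shell
:: Same effect as pressing Windows key + P, driven from cmd.exe.
:: /extend   - extend the desktop onto the second monitor
:: /clone    - duplicate the primary display
:: /internal - use only the primary display (handy for backing out)
DisplaySwitch.exe /extend
```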

I fixed mine by turning off the monitor and unplugging the HDMI cable from the laptop. Then I turned off the monitor's main power switch and waited about 20 seconds. I turned the power switch back on, turned the monitor back on, and plugged the HDMI cable back into the laptop. And suddenly it appeared on the other monitor too.

It's weird that the second monitor is not listed in Device Manager.

Check whether the current screen settings for your second monitor (screen resolution, refresh rate) are actually supported by it.

Then try swapping the monitors (change their 1/2 order in Windows only) and see if the second one actually responds.

If that doesn't help, swap their connections on your PC - maybe the second DVI output is somehow broken.

If the second monitor still stays black - get a new one ;-)

Sometimes my second monitor isn't available either and flickers all around. Then I just turn it off and on again, and it is recognized by Windows.

But there might also be a wrong setting that prevents video output on the second DVI port. Double-check your settings in the NVIDIA Control Panel and, if applicable, the built-in Intel Graphics control panel.


There are "video range extender" solutions that connect signals directly to a different cable style without changing the signal format. This is useful if, for example, you have long VGA cables permanently installed inside the walls of your room, because you can use the existing cable to carry DVI or HDMI data. But you must use the corresponding adapter at the other end to convert the signal back to its original connector type, matching the signal encoding. You can't connect directly to a device that supports the new connector format, since it expects a different signal encoding.

While a DVI-I to VGA adapter is also passive (no re-encoding), the analog pins in the DVI-I connector signal the video source to generate VGA (analog) output instead of DVI, so that the monitor receives a real VGA signal as it expects.
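The DVI-D vs DVI-I distinction above can be summed up in a small lookup table. This is just an illustrative sketch (the helper function is made up, but the pin facts are standard: DVI-D is digital-only, DVI-A is analog-only, DVI-I carries both):

```python
# Which DVI connector variants carry the extra analog (VGA-compatible) pins.
DVI_VARIANTS = {
    "DVI-D": {"digital": True, "analog": False},  # digital only
    "DVI-A": {"digital": False, "analog": True},  # analog only
    "DVI-I": {"digital": True, "analog": True},   # "integrated": both
}

def passive_vga_adapter_works(variant: str) -> bool:
    """A passive DVI-to-VGA adapter only passes a signal through if the
    port actually carries the analog pins; no re-encoding ever happens."""
    return DVI_VARIANTS[variant]["analog"]

# The asker's situation: a DVI-D adapter feeding a VGA monitor fails,
# while a DVI-I adapter works.
print(passive_vga_adapter_works("DVI-D"))  # False
print(passive_vga_adapter_works("DVI-I"))  # True
```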

Another thing to check is whether your second monitor has an "input" or "source" button. If your monitor is set to display from its DVI input but you've hooked up its VGA cable, you won't get any picture.

I just encountered the same problem. The solution was to right-click the 2nd monitor -> Properties -> Adapter tab -> List All Modes, and change it to a 16-bit mode.

Wow, all I did was press the menu buttons on the physical monitor, and it magically started displaying and was no longer blank.

I was about to try resetting it to factory defaults, which I'd read elsewhere on the web might work, but now I don't need to.

Hope this helps someone :).

In the monitor menu, I changed the input port version from HDMI 2 to HDMI 1.4, and it worked.

