Need Help - Computer setup with 3 Screens

Lanchester Registered User regular
edited June 2012 in Help / Advice Forum
I'm trying to set up my desktop computer with 3 screens (2 monitors and my TV). I have a GeForce GTS 250 video card, and it has 2 DVI ports.

I recently purchased a DVI splitter so I could turn 1 of the DVI ports into 2. So I have 1 port on the video card going straight to my primary monitor, and the second DVI port getting split, with one side going to my secondary monitor and the other to my TV. My TV doesn't have a DVI port (only VGA, HDMI, and component), but I have a DVI to VGA converter and a DVI to HDMI converter. I've tried both and still don't get anything to display on the TV.

To make things a little more complicated, I want it to be an extended desktop except for the 2nd monitor and TV. I want those 2 to be clones/mirrors. So when I connect everything, I get a display on my 2 monitors, but nothing on the TV. I've downloaded the latest video card drivers and restarted my computer after everything was connected, and I still get displays on only my monitors...
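
For what it's worth, here's a quick way to see which displays Windows actually detects. It's a minimal sketch in Python using the standard Win32 EnumDisplayDevices call through ctypes (nothing in it is specific to my machine):

    import ctypes
    from ctypes import wintypes

    # Partial DISPLAY_DEVICEW struct from wingdi.h.
    class DISPLAY_DEVICEW(ctypes.Structure):
        _fields_ = [("cb", wintypes.DWORD),
                    ("DeviceName", wintypes.WCHAR * 32),
                    ("DeviceString", wintypes.WCHAR * 128),
                    ("StateFlags", wintypes.DWORD),
                    ("DeviceID", wintypes.WCHAR * 128),
                    ("DeviceKey", wintypes.WCHAR * 128)]

    DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x1
    DISPLAY_DEVICE_PRIMARY_DEVICE = 0x4

    user32 = ctypes.windll.user32
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
    i = 0
    # Walk every display output the OS knows about and print its state.
    while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
        print(dev.DeviceName, dev.DeviceString,
              "attached" if attached else "detached",
              "(primary)" if primary else "")
        i += 1

One thing I've read (not sure if it matters here): a passive splitter clones at the cable level, so Windows would only ever see one device on that port no matter what.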

Help me H/A, what am I doing wrong? Did I buy the wrong thing in the DVI splitter? Can my video card not support a DVI splitter? Tell me how stupid I am and what I need to do in order to get this to work...or if it isn't possible and I'm wasting my time.

Lanchester on

Posts

  • iRevert Tactical Martha Stewart Registered User regular
    edited June 2012
    For starters, that card can only support 2 displays on its own.

    Depending on what your CPU is (say socket 1155 with an i3/i5) you could run one monitor off of it and the other two off the card. Basically that's the only way you're going to get three displays off that card without doing SLI with another card or just getting an AMD card.

    Keep in mind you're going to see an increase in temps and a decrease in performance versus a single display.

    iRevert on
  • BlindZenDriver Registered User regular
    edited June 2012
    I'd start by getting just two displays working, say the primary monitor and the TV. That way you can make sure your cables work, as well as the settings on the TV and the graphics card settings (resolution and refresh rate).

    As for the DVI splitter, I don't think it handles VGA, so I'm pretty sure that you cannot get VGA out of the split end. For that you'd need a DVI to VGA+DVI splitter and also a graphics card that supports that (I'm not sure that exists). Also, I'm not sure all graphics cards will work with a DVI splitter.

    The reason is that what you and lots of others think of as a DVI to VGA converter is nothing more than a connector sorting out which signals to pass through. The DVI connector on your graphics card in essence carries electrical signals for both DVI and VGA, and the "converter" simply connects the analog video (VGA) pins and has a physical shape on the other end that matches a VGA connector.

    I see three ways you can go:

    A. Get an actual DVI to VGA converter, e.g. something that turns the digital (DVI) signal into an analog (VGA) one. This is likely expensive and inflexible.
    B. Replace your graphics card with one that offers three connections. In my experience AMD does this best, but Nvidia should offer something as well.
    C. Get one of those USB "graphics" cards meant mostly for laptops. That will likely be more expensive than a new graphics card.


    EDIT: Sorry, I missed that your TV has HDMI. HDMI is, simply put, DVI+sound, so you should be able to get that going; it could just be a question of having selected a resolution for the second split output that the TV isn't happy about. Try the DVI-to-HDMI without the splitter and see what you get.
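
    If you want to rule the resolution out quickly, something like the following forces a TV-safe 1280x720 @ 60 Hz mode on one output. This is a minimal sketch in Python using the standard Win32 EnumDisplaySettings/ChangeDisplaySettingsEx calls via ctypes, and the device name \\.\DISPLAY2 is only a guess at which output feeds the TV, so adjust to match your setup:

        import ctypes
        from ctypes import wintypes

        # Partial DEVMODEW layout (display fields only), per wingdi.h.
        class DEVMODEW(ctypes.Structure):
            _fields_ = [("dmDeviceName", wintypes.WCHAR * 32),
                        ("dmSpecVersion", wintypes.WORD),
                        ("dmDriverVersion", wintypes.WORD),
                        ("dmSize", wintypes.WORD),
                        ("dmDriverExtra", wintypes.WORD),
                        ("dmFields", wintypes.DWORD),
                        ("dmPositionX", ctypes.c_long),
                        ("dmPositionY", ctypes.c_long),
                        ("dmDisplayOrientation", wintypes.DWORD),
                        ("dmDisplayFixedOutput", wintypes.DWORD),
                        ("dmColor", ctypes.c_short),
                        ("dmDuplex", ctypes.c_short),
                        ("dmYResolution", ctypes.c_short),
                        ("dmTTOption", ctypes.c_short),
                        ("dmCollate", ctypes.c_short),
                        ("dmFormName", wintypes.WCHAR * 32),
                        ("dmLogPixels", wintypes.WORD),
                        ("dmBitsPerPel", wintypes.DWORD),
                        ("dmPelsWidth", wintypes.DWORD),
                        ("dmPelsHeight", wintypes.DWORD),
                        ("dmDisplayFlags", wintypes.DWORD),
                        ("dmDisplayFrequency", wintypes.DWORD)]

        DM_PELSWIDTH, DM_PELSHEIGHT = 0x80000, 0x100000
        DM_DISPLAYFREQUENCY = 0x400000
        ENUM_CURRENT_SETTINGS = -1
        CDS_UPDATEREGISTRY = 0x1
        DISP_CHANGE_SUCCESSFUL = 0

        user32 = ctypes.windll.user32
        target = "\\\\.\\DISPLAY2"  # guess: the output behind the splitter
        mode = DEVMODEW()
        mode.dmSize = ctypes.sizeof(mode)
        # Start from the output's current mode, then override just the resolution.
        user32.EnumDisplaySettingsW(target, ENUM_CURRENT_SETTINGS, ctypes.byref(mode))
        mode.dmPelsWidth, mode.dmPelsHeight, mode.dmDisplayFrequency = 1280, 720, 60
        mode.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY
        ret = user32.ChangeDisplaySettingsExW(target, ctypes.byref(mode), None,
                                              CDS_UPDATEREGISTRY, None)
        print("mode applied" if ret == DISP_CHANGE_SUCCESSFUL else "rejected: %d" % ret)

    A non-zero return means the driver rejected the mode, which is more or less what the TV is doing on its end when it gets a resolution it doesn't like.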

    BlindZenDriver on
    Bones heal, glory is forever.
  • iRevert Tactical Martha Stewart Registered User regular
    edited June 2012
    BlindZenDriver wrote: »
    B. Replace your graphics card with one that offers three connections. In my experience AMD does this best, but Nvidia should offer something as well.
    C. Get one of those USB "graphics" cards meant mostly for laptops. That will likely be more expensive than a new graphics card.

    You're not touching on the fact that Nvidia doesn't support 3 displays off a single card.

    You're also not touching on the fact that if he did get a different card (say a 6850 for around $100) he may need to replace his PSU. We don't know his system specs, so that should be mentioned.

    USB "graphics cards" aren't worth it unless you're only using them for dicking around. If he's trying to do this for gaming, a new card or SLI would be the course of action I'd take. For general browsing and dicking around, a USB graphics card could work, but depending on his CPU he could just run one display off of onboard video and the other two from his card.

    iRevert on
  • amateurhour One day I'll be professionalhour The woods somewhere in Tennessee Registered User regular
    I've got a co-worker who used one of the USB cards, and it works fine to support 3 monitors as long as the third monitor is just an instant messenger window or PC stats or something, but that's about it.

    are YOU on the beer list?
  • bowen Sup? Registered User regular
    Yeah, outside of USB 3, using them for video is an exercise in futility.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • Lanchester Registered User regular
    iRevert wrote: »
    Depending on what your CPU is (say socket 1155 with an i3/i5) you could run one monitor off of it and the other two off the card. Basically that's the only way you're going to get three displays off that card without doing SLI with another card or just getting an AMD card.

    Here is my CPU info - Intel(R) Core(TM) i7-920 processor (2.66GHz, 1MB L2 + 8MB shared L3 cache with QPI Technology)
    BlindZenDriver wrote: »
    I'd start by getting just two displays working, say the primary monitor and the TV. That way you can make sure your cables work, as well as the settings on the TV and the graphics card settings (resolution and refresh rate).

    EDIT: Sorry, I missed that your TV has HDMI. HDMI is, simply put, DVI+sound, so you should be able to get that going; it could just be a question of having selected a resolution for the second split output that the TV isn't happy about. Try the DVI-to-HDMI without the splitter and see what you get.

    I have verified and used 2 displays before this: 2 monitors at one point, and 1 monitor and the TV (using the DVI to HDMI converter). So I've verified that the card can handle 2 displays, and that the cables and the DVI to HDMI converter work.
    iRevert wrote: »
    You're not touching on the fact that Nvidia doesn't support 3 displays off a single card.

    I did not know this...so it sounds like it's a lost cause unless I want to go the USB graphics card route (which I've never even heard of before), or get a new graphics card or other hardware.

    I just wanted to be like J.P. from Grandma's Boy, jamming out to loud techno with 6 screens...and of course speaking in a robot voice.

  • Draygo Registered User regular
    If you want 2 of them to be clones, you can split one output just fine. If it isn't working, it might need to have its signal boosted (basically a powered splitter). The problem is you cannot split a DVI port into both a DVI and a VGA connection. You either go full VGA or full DVI (note that if you use an externally powered converter that takes a DVI input and gives you a VGA output, that will work). You can use the DVI to VGA connector on your card and get a VGA splitter, then put one end on your monitor and one on your TV.

    You also have to make sure the TV and monitor support the same resolution, or one or the other will simply not display anything, or will display a severely shifted or distorted picture.
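
    One way to check is to dump the mode list each output will accept. Here's a minimal sketch in Python using the Win32 EnumDisplaySettings call via ctypes; the \\.\DISPLAY1 and \\.\DISPLAY2 names are guesses, and since Windows only sees one device per physical output, you'll still need the TV's manual for what it supports on the far side of a splitter:

        import ctypes
        from ctypes import wintypes

        # Partial DEVMODEW layout (display fields only), per wingdi.h.
        class DEVMODEW(ctypes.Structure):
            _fields_ = [("dmDeviceName", wintypes.WCHAR * 32),
                        ("dmSpecVersion", wintypes.WORD),
                        ("dmDriverVersion", wintypes.WORD),
                        ("dmSize", wintypes.WORD),
                        ("dmDriverExtra", wintypes.WORD),
                        ("dmFields", wintypes.DWORD),
                        ("dmPositionX", ctypes.c_long),
                        ("dmPositionY", ctypes.c_long),
                        ("dmDisplayOrientation", wintypes.DWORD),
                        ("dmDisplayFixedOutput", wintypes.DWORD),
                        ("dmColor", ctypes.c_short),
                        ("dmDuplex", ctypes.c_short),
                        ("dmYResolution", ctypes.c_short),
                        ("dmTTOption", ctypes.c_short),
                        ("dmCollate", ctypes.c_short),
                        ("dmFormName", wintypes.WCHAR * 32),
                        ("dmLogPixels", wintypes.WORD),
                        ("dmBitsPerPel", wintypes.DWORD),
                        ("dmPelsWidth", wintypes.DWORD),
                        ("dmPelsHeight", wintypes.DWORD),
                        ("dmDisplayFlags", wintypes.DWORD),
                        ("dmDisplayFrequency", wintypes.DWORD)]

        user32 = ctypes.windll.user32
        modes_by_output = {}
        # Guessed device names; EnumDisplayDevices gives you the real ones.
        for name in ("\\\\.\\DISPLAY1", "\\\\.\\DISPLAY2"):
            mode = DEVMODEW()
            mode.dmSize = ctypes.sizeof(mode)
            modes, i = set(), 0
            # Mode index 0, 1, 2... walks every mode the output reports.
            while user32.EnumDisplaySettingsW(name, i, ctypes.byref(mode)):
                modes.add((mode.dmPelsWidth, mode.dmPelsHeight,
                           mode.dmDisplayFrequency))
                i += 1
            modes_by_output[name] = modes
            print(name, sorted(modes))
        print("common:", sorted(set.intersection(*modes_by_output.values())))

    Pick a resolution the TV's manual also lists and set the split output to it before expecting a picture on both screens.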

  • BlindZenDriver Registered User regular
    iRevert wrote: »
    You're not touching on the fact that Nvidia doesn't support 3 displays off a single card.

    OK. I really thought Nvidia had that figured out by now.

    iRevert wrote: »
    You're also not touching on the fact that if he did get a different card (say a 6850 for around $100) he may need to replace his PSU. We don't know his system specs, so that should be mentioned.

    Okay. I wasn't trying to write a complete book about what to do. The form factor could also be something that needs to be touched on :-)

    As for needing to replace the PSU: we know what card the OP has, and we also know that graphics cards have gotten more efficient over time, so with the right choice swapping the PSU should not be needed.

    Bones heal, glory is forever.
  • iRevert Tactical Martha Stewart Registered User regular
    Lanchester wrote: »

    Here is my CPU info - Intel(R) Core(TM) i7-920 processor (2.66GHz, 1MB L2 + 8MB shared L3 cache with QPI Technology)

    I have no idea about socket 1366, but give this a try.

    Hook your two monitors up to your graphics card as you did before, then attach a video cable to the motherboard's onboard video (be it VGA or whatever; just hook a cable up to whatever video output is on your rear I/O) and see if that works out for giving you three.

  • Draygo Registered User regular
    You may have to flip a switch in the BIOS to enable the onboard video while a graphics card is plugged in. Some motherboards disable the onboard connection when a graphics card is present, and some don't even give you that option.
