
Looking for Clarity. [ Monitors ]

azith28 Registered User regular
About 13 years ago I purchased this monitor: https://www.cnet.com/products/dell-ultrasharp-2405fpw/specs/

It's been an amazing monitor, and it's still functioning beautifully, but I've been thinking of getting something larger, or at the least I'll want something of equal quality when it eventually stops working.

The problem is that soon after this one came out, the next model number (the 2407) was a downgrade in many respects. I have not seen a native resolution of 1920x1200 on a monitor since, and even physically larger monitors seem to have a lower native resolution. I realize new technology may have made things like native resolution, image contrast and response rate less of a factor, but I don't know the details of how they compare, so I'm having trouble making an informed decision.

Is native resolution still something to base my evaluation on?

What counts as 'average' vs. 'high quality' for refresh rate? Same question for contrast ratio. Is brightness even a factor anymore, considering the old plasma vs. LCD vs. LED debates?


If I wanted to go all out on a monitor to make sure my bases are covered for 4K, or whatever the current HD baseline is, what would I be looking for?

Are 4K televisions still generally a less optimal choice as a computer monitor? If so, why?

Thanks

Stercus, Stercus, Stercus, Morituri Sum

Posts

  • BahamutZERO Registered User regular
    native resolution still matters, I have no idea about the rest of it

  • a5ehren Atlanta Registered User regular
    Native res still matters. You will want 2560x1440, 3440x1440 (21:9), or 3840x2160 (aka UHD/4K), probably around 27".

    Using a 4K TV as a monitor works, but certain models handle it much better than others. Sony is pretty good here as they tend to actually accept 120Hz input in PC mode. None of these are really "desk-sized", though.

    Right now, the trade-off is 1440p with high refresh (100+Hz with Freesync (AMD card) or Gsync (NV card)) or 4K.

    Look for a VA or IPS panel type to get good colors and contrast - TN is faster (up to 200Hz), but image quality is lower.

  • azith28 Registered User regular
    a5ehren wrote: »
    Native res still matters. You will want 2560x1440, 3440x1440 (21:9), or 3840x2160 (aka UHD/4K), probably around 27".

    Using a 4K TV as a monitor works, but certain models handle it much better than others. Sony is pretty good here as they tend to actually accept 120Hz input in PC mode. None of these are really "desk-sized", though.

    Right now, the trade-off is 1440p with high refresh (100+Hz with Freesync (AMD card) or Gsync (NV card)) or 4K.

    Look for a VA or IPS panel type to get good colors and contrast - TN is faster (up to 200Hz), but image quality is lower.

    When you say a 4K TV could work as a monitor, would I need a stronger video card to run exactly what I'm running now, since it would be a larger screen (say, the same graphics settings and equivalent resolution on the larger screen that I'm used to on my 24")? Do things like 'curved' monitors have any effect on the power of the video card needed to push the image?

    How does the 120Hz input you mentioned factor into this discussion? The monitor I linked only does
    Horizontal Refresh Rate
    81 kHz
    Vertical Refresh Rate
    76 Hz

    at that resolution, so is the 120Hz a 4K thing, for pushing a 4K image?

  • a5ehren Atlanta Registered User regular
    The native resolution has more of an effect on required graphics than the physical size of the monitor - I say the TVs aren't really suitable for desk use because their pixel density is poor compared to a monitor. Games would be OK, but text would look terrible.

    One of the nice things about 4K is that you'd be able to run games at 1080p without any artificial scaling introduced (each 1080p pixel maps directly to 4 4K pixels) and keep the same level of performance you have now, while getting much better text rendering for web/office stuff.
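    That pixel mapping is easy to sanity-check with a little arithmetic. A quick sketch of the scaling math (just an illustration, not anything a real display scaler runs):

```python
# Why 1080p maps cleanly onto a UHD/4K panel: both axes scale by an integer.
def scale_factors(src, dst):
    """Per-axis scale factors going from src to dst resolution."""
    (src_w, src_h), (dst_w, dst_h) = src, dst
    return dst_w / src_w, dst_h / src_h

fx, fy = scale_factors((1920, 1080), (3840, 2160))
print(fx, fy, int(fx * fy))  # 2.0 2.0 4 -> each 1080p pixel is a crisp 2x2 block

# 1440p on the same panel gives fractional factors, so the scaler must interpolate:
print(scale_factors((2560, 1440), (3840, 2160)))  # (1.5, 1.5)
```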

    The refresh rate is a separate axis of performance from resolution. This has more to do with motion smoothness than anything.
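    As for the two spec-sheet numbers you quoted: the horizontal rate is scanlines drawn per second, so dividing it by the total lines in a frame gives the achievable vertical refresh. A rough sketch (the blanking allowance here is an assumed ballpark, not a figure from Dell's datasheet):

```python
# Relating a monitor's horizontal refresh (kHz) to its vertical refresh (Hz).
# Total lines per frame = visible lines + vertical blanking (assumed ~35 here).
def max_vertical_hz(horizontal_khz, visible_lines, blanking_lines=35):
    total_lines = visible_lines + blanking_lines
    return horizontal_khz * 1000 / total_lines

# 81 kHz driving 1920x1200 lands in the mid-60s Hz:
print(round(max_vertical_hz(81, 1200), 1))
```

    So at native 1920x1200 the old panel is effectively a ~60 Hz display, and presumably the quoted 76 Hz maximum only applies at lower resolutions. The 120Hz input on newer displays is a separate panel capability, not something tied to 4K.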

    The only issue would be if your video card is relatively old and does not support HDMI 2.0 or DisplayPort 1.2. We're talking AMD 2xx series or Geforce 9xx or newer for good 4K support.

  • Marty81 Registered User regular
    As for 1920x1200, have a look at the Dell u2412m. I bought one in 2013 and just bought a second one to sit next to it.

  • BlindZenDriver Registered User regular
    One can still find 1920x1200 and maybe even 2560x1600, which was the ultimate resolution back then, but progress has brought new possibilities. While not ideal in every situation, most monitors these days have an aspect ratio of 16:9, just like TVs, and there is really much more choice, and often better monitors for the money, when going with the mainstream 16:9 format.

    Mainstream monitors are 60 Hz, just like the one you already have, and then there are monitors, targeted more at computer gaming, that can go higher and often offer synchronization with the speed at which the computer's graphics card is able to render the action. For non-gaming, or for that matter casual gaming, one can just ignore those things. To put it simply, I can't think of a monitor one can buy today that won't do 60 Hz; the only exception is that some monitors have inputs that only allow 30 Hz, intended for non-computer use.

    Monitors have in general gotten bigger since you last shopped, and there are more choices when it comes to resolution as well. What the best choice is really depends on a lot of factors. Personally I have always favored big monitors and resolutions as high as possible (in the CRT days I ran 1600x1200 on a 17" and 2048x1536 on a 21"), and things like good color uniformity and range are more nice-to-have than need-to-have for me, as I don't work as a photo/video editor.
    My current setup is a 27" Dell U2711 at 2560x1440 and a 40" Philips 4K monitor (not a TV, big difference).

    Now, the 40" will seem huge to most at first, but there is a lot of good to say about a big monitor. A big monitor is like a big desk: it allows having many things going on at once, or doing big things while maintaining a better overview. Also, the 40"-and-above 4K monitors use panels that are also used in TVs, paired with monitor electronics, which makes for some pretty sweet deals, since those panels are relatively cheap thanks to being mass-produced for TVs. In contrast, the small 4K monitors are relatively more expensive, since their panels are produced in smaller numbers.

    When buying a new monitor, I would say one should consider the pixel density. One way of expressing that is pixels per inch (PPI); another is dot pitch, which is the size of the individual pixels. Here are some numbers:

    Size Resolution PPI Dot pitch(mm)
    24" 1920x1200 94.34 0.2692
    27" 2560x1440 108.79 0.2335
    28" 3840x2160 157.35 0.1614
    30" 2560x1600 100.63 0.2524
    31.5" 3840x2160 139.87 0.1816
    39.5" 3840x2160 111.54 0.2277
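    Those figures follow directly from the geometry; here is a small script that reproduces the table (assuming square pixels and the quoted diagonal):

```python
import math

def ppi(diagonal_inches, width_px, height_px):
    """Pixels per inch: diagonal pixel count divided by diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_inches

def dot_pitch_mm(diagonal_inches, width_px, height_px):
    """Size of one pixel in millimetres (25.4 mm per inch)."""
    return 25.4 / ppi(diagonal_inches, width_px, height_px)

for size, w, h in [(24, 1920, 1200), (27, 2560, 1440), (28, 3840, 2160)]:
    print(f'{size}" {w}x{h}: {ppi(size, w, h):.2f} PPI, '
          f'{dot_pitch_mm(size, w, h):.4f} mm')
```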

    If you go much above 100 PPI, then you will either need eagle eyes, run the monitor at a non-native resolution (a bad thing), or have Windows scale the elements on the screen to compensate. Having Windows scale things does not work well with everything, and some software simply ignores the scaling, so in my view that is best avoided. This pretty much leaves going with either a 24", a 27" (1440p resolution) or a 40" 4K monitor, and avoiding the smaller 4K monitors.

    On 4K, it is worth noting that running that resolution requires the right cable as well as the right graphics card. The latter is unlikely to be an issue with anything in use today, but do check just in case, and the same goes for the cable. As an example, there are two types of DVI cable; many monitors do come with cables, which may make it easier.

    If you are planning on playing 3D computer games, then switching your current monitor for a 4K one will require the card to work pretty much four times as hard to maintain the same frame rate, but for non-gaming use anything should be fine (and if not, even a really cheap replacement card will do).
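    That "four times as hard" figure is just the ratio of pixel counts (real GPU load doesn't scale perfectly linearly, so treat it as a rough guide). Coming from 1920x1200 rather than 1080p, it's closer to 3.6x:

```python
def pixel_ratio(src, dst):
    """How many times more pixels dst has than src."""
    (src_w, src_h), (dst_w, dst_h) = src, dst
    return (dst_w * dst_h) / (src_w * src_h)

print(pixel_ratio((1920, 1080), (3840, 2160)))  # 4.0  (1080p -> UHD/4K)
print(pixel_ratio((1920, 1200), (3840, 2160)))  # 3.6  (the OP's 2405FPW -> UHD/4K)
```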

    Bones heal, glory is forever.
  • Synthesis Honda Today! Registered User regular
    edited June 2018
    If you go much above the 100 PPI then you will either need eagle eyes, run the monitor at a non-native resolution(bad thing) or have Windows scale the elements on the screen to compensate. Having Windows scale things does not work well with everything and some software simply ignores the scaling, so in my view that is best avoided. This then pretty much leaves going with either a 24", a 27" (1440p resolution) or a 40" 4K monitor and avoiding the smaller 4K monitors.

    As someone who has run Windows 10 at 150% element scaling for years: if you're using Windows 10, this pretty much becomes a non-issue. Even Windows 8.1 can compensate, though Windows 7 and older are ill-suited to this role.

    It's a non-issue because you have multiple solutions:

    1) Many modern programs (and by design, pretty much all "Windows 10 app" programs) have Windows-integrated scaling support, so they scale perfectly, or close to it.
    2) If your program lacks that support, or is too old, there are still almost-100%-effective override options accessible through the program's shortcut properties. It's extremely easy to override, and there are multiple options for partial overrides. And that's before even considering Nvidia's and AMD's own options.
    3) Use the game's own UI settings, which are becoming more and more common in PC gaming (but, unlike the above, they are by no means universal).
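    For a sense of what that scaling does to working space: at 150% on a 2160p panel, Windows lays out UI as if the desktop were 1440p, while still rendering everything at native sharpness. This is just the layout arithmetic, not an actual Windows API:

```python
def effective_layout(width_px, height_px, scale_percent):
    """Desktop dimensions that UI is effectively laid out against
    at a given display scale factor."""
    factor = scale_percent / 100
    return round(width_px / factor), round(height_px / factor)

print(effective_layout(3840, 2160, 150))  # (2560, 1440)
print(effective_layout(3840, 2160, 100))  # (3840, 2160)
```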

    It really isn't an issue anymore, I think. I use a 27" LG IPS at native 2160p (4K), because 1440p at 27" simply wasn't enough of an improvement over 1080p in gaming to justify the cost of a new monitor (combined with a GTX 1080 and, later, a GTX 1080 Ti). I hardly consider myself to have eagle vision (I wear glasses), but I'm very happy with the result and find 150% to be an extremely suitable scaling for desktop elements (I'm a big fan of clean space arrangement). I can still play 800x600 titles like Warcraft II at full screen, along with 4K Shadow of War.

    Basically, don't trust anyone who says they know definitively what you can or cannot see, because that's a load of crap, frankly. All any of us can do is offer general advice, not "the human eye cannot see over x resolution at y distance". We heard the exact same claims when 1080p became the new resolution standard. I do not need a 40" monitor to use 4K, and frankly, I don't know a single person who has one (and, unsurprisingly, I know more people who bought 2160p monitors than 1440p ones).

    Gaming performance is substantially more complicated: 2160p gaming is not for everyone (it's not as exclusive as, say, virtual reality, but it's still a tall technical order). 1440p gaming isn't for everyone either, for much the same reasons; the vast majority of people I know who built their own PCs still play at 1080p (the current state of GPU prices hasn't helped). Personally, I prefer a native 2160p image at medium over a native 1440p image at medium-high, but that's just me. Also be mindful of anyone who claims that if your PC can run 1080p at high it will have no problem with 1440p: that's nearly double the pixel count, and performance doesn't only start to become an issue above 1440p, despite the stupid memes suggesting otherwise. It's something you need to figure out for yourself.

    That brings me to my recommendation, which I think is a universal truth: if you can, see a monitor in real life before you buy it. It's going to be vastly more useful in informing your decision than any distant chart erroneously presented as mathematical and scientific certainty. Yes, it's a little anachronistic in this age where only "grandpa" buys things at physical outlets, but it can really help with a monitor. You might find that 27" at 2160p isn't remotely worth it; then you've made a more educated decision. Or you'll be like me, and realize 1440p no longer cuts it. If you can, even with the less-than-optimal visual settings at places like Best Buy, try to find a monitor you like, play around with it, look at how it handles scaling and colors, and then buy it online. At the end of the day, you're going to know what's better for your eyes than any so-called "expert" on the internet with a chart and a YouTube channel.
    azith28 wrote: »

    When you say the 4k TV could work as a monitor, Would I need a stronger video card to run exactly what im running now since it would be a larger screen (say same graphic settings/or equvilant resolution on the larger screen that I'm use to on my 24"? Do things like 'curved' monitors have any kind of effect on the power of the video card needed to push the image?

    How does the 120Hz input you mentioned factor in this discussion? the monitor i linked is only
    Horizontal Refresh Rate
    81 kHz
    Vertical Refresh Rate
    76 Hz

    at that resolution, so is the 120Hz a 4k thing for pushing a 4k image?

    As already noted, using a 4K television is possible (technically, using almost any television is possible), but you're going to have "ghosting" issues when it comes to scrolling, simply because of the refresh settings and how the panel handles changing brightness. One thing that might be worth considering: there's a reason the overwhelming majority of games are designed to cap out at 60 FPS on 60 Hz displays (excluding those capped at 30, which still occasionally pop up). With 120 Hz displays, you're not actually going to be playing at 120 FPS. Or 100 FPS. You're going to be playing at 60 (or 40) to 100 (or 80) FPS. Even if you have the power to make it technically possible, lots of games simply were never designed to run consistently over 60. Aside from the "fastest" HDMI or DisplayPort cables (DP being almost exclusive to monitors, sort of how HDR support is almost exclusive to televisions), which are easy to get (just buy one off Amazon; it'll cost $3 more than the typical "any standard" HDMI cable), you'll need the actual machine that can do it. Even a GTX 1080 Ti will struggle to reach 120 FPS with any consistency at 1440p, much less 2160p (even setting the game to low visual settings isn't a guarantee of that).

    To the best of my knowledge, curved panel displays make absolutely no difference here (though they're maybe more likely to have G-Sync or FreeSync?), except that curved panels are often, though not always, "ultrawide" displays, somewhere between "QHD/2K" and "UHD/4K" (they're a really wide 1440p aspect ratio, basically). If you're worried about games not having good 2160p support, just remember it's still way more common than 2560x1080 or 3440x1440. There are curved panels in more "normal" aspect ratios that won't have that issue.
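    On the aspect-ratio point, "21:9" is really marketing shorthand; the common ultrawide resolutions are actually slightly wider than a literal 21:9. Quick arithmetic to check:

```python
from fractions import Fraction

def aspect(width_px, height_px):
    """Exact reduced aspect ratio of a resolution."""
    return Fraction(width_px, height_px)

print(aspect(2560, 1080), float(aspect(2560, 1080)))  # 64/27, ~2.370
print(aspect(3440, 1440), float(aspect(3440, 1440)))  # 43/18, ~2.389
print(Fraction(21, 9), float(Fraction(21, 9)))        # 7/3,   ~2.333
print(aspect(3840, 2160))                             # 16/9, for comparison
```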

  • azith28 Registered User regular
    Synthesis wrote: »
    That brings me to my recommendation, which I think is a universal truth: if you can, see a monitor in real life before you buy it. [...]

    Oh, I stop in at Best Buy regularly to try to get eyes on stuff, but I swear, why they display low-res video to showcase a high-resolution monitor most of the time, instead of something pretty, is beyond my understanding.

    'Cause nothing says "4K monitor" like an 800x600 still-frame image.

    Thanks for the input. Due to some unexpected sales I ran across, my money went into a 4K TV and a PS4 Pro instead of a monitor, but I'll keep this thread in mind when I need a monitor.

  • Synthesis Honda Today! Registered User regular
    azith28 wrote: »

    Oh, I stop in at best buy regularly to try and get some eyes on stuff, but I swear, why the hell are they displaying low-rez video to showcast a High Resolution monitor most of the time instead of something pretty is beyond my understanding.

    Cause nothing says 4k monitor than a 800x600 still frame image.

    Thanks for the input. Due to some unexpected sales i ran across, My money went into a 4k TV and a ps4 pro instead of a monitor, but I'll keep this thread in mind when i need a monitor.

    Yeah, Best Buy will either competently display monitors, or totally fuck it up, or land somewhere in between. Fry's is a little better about it, as is a local PC boutique outlet if you have one. Though I think Best Buy is getting better about it, since they're taking cues from their HDTV displays (which they handle better).

    Enjoy your new TV. I picked up a rather colossal Vizio M discounted 25%, and have loved it since (the lack of HDR notwithstanding). The new console upgrades are good ways to "show them off"--or UHD players, if you're a dinosaur like me and still buy physical copies of films (fuck iTunes).

  • Zilla360Zilla360 21st Century. |She/Her| Trans* Woman In Aviators Firing A Bazooka. ⚛️Registered User regular
    My monitor died today, six years old. RIP Samsung T190. Fails the self test like this:
    http://support-us.samsung.com/cyber/popup/iframe/pop_troubleshooting_fr.jsp?idx=434779&modelname=T190&modelcode=LS19TWHSUV/ZA

    Dead backlight, only works for a few seconds then nothing. Lucky that was enough to perform a blind clean system shutdown in Linux. Phew!

    Ordered an Asus VS248HR 24-incher; seems to have good reviews, and crucially three types of input socket, even VGA. :)

    |Ko-Fi Me! ☕😎|NH844lc.png | PSN | chi-logo-only-favicon.png(C.H.I) Ltd. |🏳️⚧️♥️
  • ThirithThirith Registered User regular
    Can anyone here tell me about the switch from 16:9 to 21:9, in particular when it comes to going curved? I'm about to upgrade to a new PC, and I could imagine getting a new screen within the next year. At present I've got a 16:9 1440p G-Sync screen that I'm pretty happy with, but on the whole it won't exactly tax my new PC and GPU. 4k is an option, but I'm more intrigued by the thought of a wider screen, possibly a curved one - also because I might soon have a bit more horizontal space at my disposal.

    Would those of you who have switched to 21:9 consider the switch worthwhile? How much of a fiddle is it to get older games to work on a 21:9 screen? Also, what are people's experiences with curved screens?

    webp-net-resizeimage.jpg
    "Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
  • SynthesisSynthesis Honda Today! Registered User regular
    edited March 2019
    The basic thing would be: a lot of games--new, modern games--don't support ultrawide resolutions. A minority do, but a majority don't (the disparity is declining though).

    Solution: Some can be modded through something as simple as a CFG edit to do so. If it's doable, it's probably not hard, though it's not always (and sometimes it just results in dumb black bars). Websites like this one do a decent job tracking compatibility and suggesting fixes.

    Many games with formal ultrawide support (and the vast majority of those with unofficial support, as above) do not have properly scaling 2D UI elements. How this is still an issue with official ultrawide support is beyond me. What, you don't think I know the difference between a circle and any other ellipse? *angry noises*

    Solution: Not really one. But it's not the end of the world.

    Pretty much no games before a certain point support ultrawide resolutions. 2160p at least had the benefit of falling within the 16:9 and 16:10 aspect ratios that became the de facto norm with widescreen 1080p HD, and even then many developers dragged their feet. The big releases of 5 to 10 years ago are likely guilty of this.

    Solution: Play "older" games at the aspect ratio they were intended for. If you're like me, and still run 800x600 games, you get used to old games not using all available real estate. Or just play new games, which are much more likely to have the support at launch or in an early patch.

    Of course, none of this applies to general desktop usage. I don't have much experience with curved displays myself, and I generally don't favor their novelty, but your options for non-curved ultrawide displays are a lot more limited. There's probably a reason for that.

    Synthesis on
  • ThirithThirith Registered User regular
    "Just playing new games" is definitely not an option for me, as I still have to replay Ultima VI, Ultima VII and Serpent Isle. :biggrin:

    webp-net-resizeimage.jpg
    "Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
  • SynthesisSynthesis Honda Today! Registered User regular
    They...will probably not work at all on ultrawide resolutions.

    Solution: Play with really big black bars on either side of your screen. Almost half the screen, probably.

    Addendum: 640x480 and 800x600 games do not always respond properly to GPU-set aspect ratio settings (which would take other 4:3 games and insert black bars). Modern video cards have an odd tendency to sometimes just "stretch it out" (in other words, what the monitor would do with aspect ratio handling left unmodified), instead of adding black bars like they would for higher-resolution 4:3 games. Something to do with the lower resolution. This is less noticeable on 16:9, but I have to imagine it'd be damn noticeable at 21:9.
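    To put a rough number on the "almost half the screen" guess above: if a 4:3 game is scaled to full height on a 3440x1440 ultrawide, the bar math is straightforward. A quick sketch (pure arithmetic, not measured from any particular panel):

```python
def pillarbox_bars(screen_w: int, screen_h: int, content_ratio: float):
    """Return (width of each black bar in px, bars' total share of the screen),
    assuming the content is scaled to the full screen height."""
    content_w = round(screen_h * content_ratio)
    bar_total = screen_w - content_w
    return bar_total // 2, bar_total / screen_w

bar, share = pillarbox_bars(3440, 1440, 4 / 3)
print(f"Each bar: {bar}px, bars total: {share:.0%} of the screen")
```

    That works out to 760px per side, with about 44% of the screen given over to bars, so "almost half" is about right.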

  • ThirithThirith Registered User regular
    Oh, don't worry, I don't expect that I can play *everything* on ultrawide fullscreen. As long as I can get everything to work more or less fine, I don't mind black bars. (Though Ultima VII shouldn't be a big problem, thanks to Exult, which should make even 21:9 possible, though at the price of revealing more of the world than the game was designed for.)

    webp-net-resizeimage.jpg
    "Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
  • DurkhanusDurkhanus Commander Registered User regular
    It's been a few years since I last toyed with Exult. I am curious to find out how the game would react to that much area being displayed. Could break it in interesting ways.

  • CantidoCantido Registered User regular
    What is the difference between DisplayPort and HDMI?

    I used to only be able to pull 140Hz on DisplayPort, but my new monitor can pull 240Hz on either.

    3DS Friendcode 5413-1311-3767
  • SoggybiscuitSoggybiscuit Tandem Electrostatic Accelerator Registered User regular
    edited April 2019
    Cantido wrote: »
    What is the difference between DisplayPort and HDMI?

    I used to only be able to pull 140Hz on DisplayPort, but my new monitor can pull 240Hz on either.

    DisplayPort is for champs, HDMI is for chumps.

    (I'm joking!)

    Really, it comes down to features. DisplayPort is *supposed* to support daisy-chaining of monitors and other more PC-oriented features. In reality, HDMI and DisplayPort are basically the same nowadays when it comes to resolution/refresh rate support, so the functional difference (if your gear even supports it) is minuscule at best.
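    For a ballpark sense of why resolution/refresh support has converged: the raw uncompressed pixel rate a mode needs is just width x height x refresh x bits per pixel. The sketch below uses that naive formula (it understates real link requirements, since blanking intervals and encoding overhead add more), but it shows why common modes fit comfortably in both HDMI 2.0-class (~18 Gbps) and DisplayPort 1.4-class (~32 Gbps) links:

```python
def raw_gbps(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Uncompressed pixel data rate in Gbit/s (ignores blanking and encoding)."""
    return width * height * hz * bpp / 1e9

for name, mode in [("1440p/144", (2560, 1440, 144)),
                   ("4K/60", (3840, 2160, 60)),
                   ("4K/120", (3840, 2160, 120))]:
    print(f"{name:>9}: {raw_gbps(*mode):5.1f} Gbit/s raw")
```

    4K/120 is the one that starts pushing past HDMI 2.0's budget, which is roughly where the cable/port generation actually starts to matter.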

    Soggybiscuit on
    Steam - Synthetic Violence | XBOX Live - Cannonfuse | PSN - CastleBravo | Twitch - SoggybiscuitPA
  • SynthesisSynthesis Honda Today! Registered User regular
    Shot in the dark: many monitors limit their variable refresh capabilities to inputs from DisplayPort (or maybe HDMI, but not both). So make sure you don't unintentionally screw yourself over.

    Ironically, this has only become more of an issue with Nvidia's decision to support Freesync, rather than only G-Sync, on its high-end cards (rolling that change out in an update earlier this year). Before that, most people were either using an AMD card paired with a Freesync monitor, or an Nvidia card paired with a G-Sync one, so the question of which input supported variable refresh rarely came up. Once you get it working on the Nvidia side, the improvements are much appreciated (I assume the same is true of a normal AMD-to-Freesync setup).

    Generally speaking, DisplayPort is suitable as a high-end gaming PC to gaming and/or UHD monitor option. My EVGA GTX 1080 ti, I'm pretty sure, only comes with 1 HDMI, 1 DVI, and 3 DisplayPort outputs. Ironically, I use the one HDMI port for VR, because the current generation VR headsets simply aren't as demanding, in this respect, as a 2160p/60hz monitor (I haven't heard of a VR headset that uses DP instead of HDMI). Everything else is DP or DVI.

    On a side note, Surface Pro devices have been using DP out since the first one, though Microsoft has generally marketed Mini-DP to HDMI dongles for exactly that reason. That's more of a space conservation thing.

  • ThirithThirith Registered User regular
    Follow-up on my earlier question concerning ultrawide screens: how do people generally handle browsing (and other activities that are largely about reading) on ultrawide displays?

    webp-net-resizeimage.jpg
    "Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
  • AridholAridhol Daddliest Catch Registered User regular
    edited April 2019
    Thirith wrote: »
    Follow-up on my earlier question concerning ultrawide screens: how do people generally handle browsing (and other activities that are largely about reading) on ultrawide displays?

    You can partition the screen so you can maximize your windows in specific parts without them being stretched to hell.

    DisplayFusion is the best app

    Aridhol on
  • syndalissyndalis Getting Classy On the WallRegistered User, Loves Apple Products regular
    Thirith wrote: »
    Follow-up on my earlier question concerning ultrawide screens: how do people generally handle browsing (and other activities that are largely about reading) on ultrawide displays?

    As a Mac user, I am very used to not using apps in full screen.

    So, on my computer, I often have 3-4 windows visible with tons of horizontal space to have a "full width" 1000-1200px window for browsing and other stuff elsewhere.

    Full-screen apps feel like a total waste on a 27" 16:9 screen, let alone a 34" or 38" ultrawide.

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • ThirithThirith Registered User regular
    Since I could get an LG 34GK950G-B at a relatively good discount (for Switzerland), I'm giving this some serious thought.

    One thing I wanted to ask: as a default, the screen is set to a refresh rate of 100Hz. However, it can be overclocked to 120Hz, with no drawback, as far as I understand. Can someone explain to me why it's not just set to 120Hz to begin with?

    webp-net-resizeimage.jpg
    "Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
  • a5ehrena5ehren AtlantaRegistered User regular
    It probably drops the MTBF on the panel (in aggregate over the production run) to a level they would not be comfortable providing a warranty on.

    Or enough of them failed testing at that spec that they couldn't sell them to the public rated for it, but the failure rate is low enough that the enthusiast community doesn't notice, etc.

    All kinds of business reasons. But even if the overclock fails on your panel, 100Hz covers almost all of the noticeable improvement anyway.
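    The diminishing-returns point is easy to see in frame times: what you actually perceive is the change in milliseconds per refresh, and each step up in Hz buys less of it. A quick sketch:

```python
def frame_time_ms(hz: float) -> float:
    """Duration of one refresh interval in milliseconds."""
    return 1000.0 / hz

steps = [60, 100, 120, 144]
for lo, hi in zip(steps, steps[1:]):
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz: frame time shrinks by {saved:.1f} ms")
```

    Going from 60 to 100 Hz cuts about 6.7 ms per frame; 100 to 120 only cuts another 1.7 ms, which is why the overclock is a nice-to-have rather than a must.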

  • ThirithThirith Registered User regular
    Okay, I've gone and ordered that LG 34GK950G-B. I expect to have about an hour or two of intense regret when it arrives and freaks me out with its sheer ultrawideness, and then I'll be more than happy with it.

    webp-net-resizeimage.jpg
    "Nothing is gonna save us forever but a lot of things can save us today." - Night in the Woods
  • VoodooVVoodooV Registered User regular
    HDMI is proprietary, DisplayPort is not. This is reason enough for me to stick with DP

  • RiusRius Globex CEO Nobody ever says ItalyRegistered User regular
    Well, I was afraid this would happen. My new 2070 Super doesn't have a DVI port, which means my venerated Catleap 2b monitor is going to have to move on to someone else. Now I need a data dump into my brain so I can catch up with the last 5 years of monitor reviews and tech news.

    I'd love to stick with 1440p, I'd love to stick with 100+ hz, and I'd love to dip my toes into g-sync or at least g-sync compatibility. Is that possible without spending more than I spent on the damn 2070 Super?

  • V1mV1m Registered User regular
    A quick smidgen of research seems to indicate that the 2070S does freesync just as well as g-sync, which might broaden your field a bit.

  • Marty81Marty81 Registered User regular
    You know you can get DisplayPort to dvi cables for like $10, right?

  • RiusRius Globex CEO Nobody ever says ItalyRegistered User regular
    Marty81 wrote: »
    You know you can get DisplayPort to dvi cables for like $10, right?

    Yeah, I'm not a moron; that's the first thing I tried. The Catleap 2b needs a dual-link DVI port, so a passive adapter to DP won't work. It needs an active adapter, which is $100+ and doesn't do 1440p/120hz.

  • SynthesisSynthesis Honda Today! Registered User regular
    I'm very much annoyed that GPUs are dropping DVI. Maybe I'd feel differently if I actually used USB-C for video output (or even knew anyone in person who did). For budget video cards I'd understand, but it's not hard to find a video card that comes in at twice the cost of a high-end model a couple generations ago (on Nvidia anyway).

  • DratatooDratatoo Registered User regular
    edited October 2019
    Rius wrote: »
    Marty81 wrote: »
    You know you can get DisplayPort to dvi cables for like $10, right?

    Yeah I'm not a moron, that's the first thing I tried. The catleap 2b needs a dual link DVI port, an adapter to DP won't work. Needs an active adapter which is $100+ and doesn't do 1440p/120hz.

    Some mainboards still have a dual-link DVI port. If the iGPU is enabled alongside a dedicated GPU, Windows 10 offers to route the GPU output through the integrated graphics. That's how people got Freesync support with AMD Ryzen iGPUs and NVIDIA GPUs back when NVIDIA didn't support Freesync.

    Edit: If I remember correctly,
    Windows 10 has a selection screen somewhere in the display settings allowing you to select the output - it might not be available with every iGPU / GPU combo.

    Dratatoo on
  • BouwsTBouwsT Wanna come to a super soft birthday party? Registered User regular
    What's the latest hotness for monitors? My new PC is unbelievably overkill for my current 27" 1080P monitor.

    -1440p no question
    -VESA seems like a feature I will absolutely use. I already miss my other 144hz 1080p monitor (gave to a gamer in need, knowing I'd be upgrading)
    -I'm not competitive so even 100-120 hz would be nice.
    -Freesync (doesn't appear to be hard to find in this price-point)
    -HDR600+ seems appealing to me; deeper blacks are better.
    -Ultrawide? Maybe?
    -Flexible budget, thinking 500-800 USD (though I'm in Canada, I'll be in the US for Thanksgiving so I have a chance to buy and ship to a friend's house).

    I just want something that is good for the next 2-3 PC builds (5-10 years). So spending a little to get a lot means more to me than saving a few dollars here and there. What's good? What's worth splurging for? What's worth avoiding? Examples of monitors I should do some further research into?

    Between you and me, Peggy, I smoked this Juul and it did UNTHINKABLE things to my mind and body...
  • SynthesisSynthesis Honda Today! Registered User regular
    In practical terms, 100 to 120 hz at 1440p is "competitive" territory, if you want any semblance of consistency, and is going to have corresponding power demands (you didn't mention what GPU you're using, so I'm guessing here). 1440p ultrawide comes up close to 2160p (presuming 3440 x 1440, compared to 3840 x 2160). Not that it's not worth pursuing, but that's still something to keep in mind.
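    To make the "comes up close" comparison concrete, here are the raw pixel counts (just arithmetic, nothing monitor-specific):

```python
# Pixel counts for the modes discussed above
pixels = {
    "QHD 2560x1440": 2560 * 1440,
    "UW-QHD 3440x1440": 3440 * 1440,
    "UHD 3840x2160": 3840 * 2160,
}
uhd = pixels["UHD 3840x2160"]
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.1f} MP ({count / uhd:.0%} of UHD)")
```

    Ultrawide 1440p pushes about 60% of UHD's pixels versus about 44% for regular QHD, so the GPU load sits noticeably closer to 4K territory.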

    120 hz WQHD monitors are not that uncommon. But ones with HDR substantially narrow the options (I find this humorous--when I bought my UHD 4K from LG, people here were asking me, "Why don't you wait for HDR?"--fast forward more than three years, and HDR selection for PC monitors still isn't great).

    Do some research in advance on whether that model has any particular history of "screen retention"--all monitors are reviewed for the potential of burn-in (since we're probably not dealing with OLED, that's less of a concern), but a lot of reviews overlook screen retention, especially when it takes a year or more to manifest. Screen retention, at least, isn't permanent...but it's fucking annoying.

  • MugsleyMugsley DelawareRegistered User regular
    My various searches keep bringing me back to this beauty:

    https://www.microcenter.com/product/478859/acer-xf270hu-27-wqhd-144hz-dvi-hdmi-dp-freesync-led-gaming-monitor

    The version on Amazon is the non-IPS SKU, from what I could find.

  • SynthesisSynthesis Honda Today! Registered User regular
    Mugsley wrote: »
    My various searches keep bringing me back to this beauty:

    https://www.microcenter.com/product/478859/acer-xf270hu-27-wqhd-144hz-dvi-hdmi-dp-freesync-led-gaming-monitor

    The version on Amazon is the non-IPS SKU, from what I could find.

    That's a nice monitor. It's 2560 x 1440 if it matters, though.

  • MugsleyMugsley DelawareRegistered User regular
    Yeah I'm looking for a 1440p IPS panel with GSync compatibility

  • SynthesisSynthesis Honda Today! Registered User regular
    Mugsley wrote: »
    Yeah I'm looking for a 1440p IPS panel with GSync compatibility

    Ah, I thought you were referring to BouwsT's search. My bad.

  • BouwsTBouwsT Wanna come to a super soft birthday party? Registered User regular
    Synthesis wrote: »
    In practical terms, 100 to 120 hz at 1440p is "competitive" territory, if you want any semblance of consistency, and is going to have corresponding power demands (you didn't mention what GPU you're using, so I'm guessing here). 1440p ultrawide comes up close to 2160p (presuming 3440 x 1440, compared to 3840 x 2160). Not that it's not worth pursuing, but that's still something to keep in mind.

    120 hz WQHD monitors are not that uncommon. But ones with HDR substantially narrow the options (I find this humorous--when I bought my UHD 4K from LG, people here were asking me, "Why don't you wait for HDR?"--fast forward more than three years, and HDR selection for PC monitors still isn't great).

    Do some research in advance on whether that model has any particular history of "screen retention"--all monitors are reviewed for the potential of burn-in (since we're probably not dealing with OLED, that's less of a concern), but a lot of reviews overlook screen retention, especially when it takes a year or more to manifest. Screen retention, at least, isn't permanent...but it's fucking annoying.

    Ya, sorry. PC is a 3900x w/ a 2080ti. PC Build thread has recommended the Samsung C32HG70, I'm under no delusions that the perfect monitor exists... Just getting some ideas out there as to what is reasonable to expect for features. For instance, the Samsung has good response times, HDR (at the expense of 144hz), but no ultrawide, and no VESA...

    I've just thrown up my hands a little because I don't really know what is worth splurging for, as I've had a 144hz TN panel for the last 5 years. I want better colours, definitely 1440p, maybe ultrawide? I've heard compatibility can be a bitch, so maybe that's not the direction I should go.

    Thanks again for the help, I really appreciate it!

    Between you and me, Peggy, I smoked this Juul and it did UNTHINKABLE things to my mind and body...
  • tsmvengytsmvengy Registered User regular
    BouwsT wrote: »
    Synthesis wrote: »
    In practical terms, 100 to 120 hz at 1440p is "competitive" territory, if you want any semblance of consistency, and is going to have corresponding power demands (you didn't mention what GPU you're using, so I'm guessing here). 1440p ultrawide comes up close to 2160p (presuming 3440 x 1440, compared to 3840 x 2160). Not that it's not worth pursuing, but that's still something to keep in mind.

    120 hz WQHD monitors are not that uncommon. But ones with HDR substantially narrow the options (I find this humorous--when I bought my UHD 4K from LG, people here were asking me, "Why don't you wait for HDR?"--fast forward more than three years, and HDR selection for PC monitors still isn't great).

    Do some research in advance on whether that model has any particular history of "screen retention"--all monitors are reviewed for the potential of burn-in (since we're probably not dealing with OLED, that's less of a concern), but a lot of reviews overlook screen retention, especially when it takes a year or more to manifest. Screen retention, at least, isn't permanent...but it's fucking annoying.

    Ya, sorry. PC is a 3900x w/ a 2080ti. PC Build thread has recommended the Samsung C32HG70, I'm under no delusions that the perfect monitor exists... Just getting some ideas out there as to what is reasonable to expect for features. For instance, the Samsung has good response times, HDR (at the expense of 144hz), but no ultrawide, and no VESA...

    I've just thrown up my hands a little because I don't really know what is worth splurging for, as I've had a 144hz TN panel for the last 5 years. I want better colours, definitely 1440p, maybe ultrawide? I've heard compatibility can be a bitch, so maybe that's not the direction I should go.

    Thanks again for the help, I really appreciate it!

    The C32HG70 does have VESA mounts according to Newegg and the manual. It has an adapter bracket that attaches to the stand attachment point. Looks like it also works with G-Sync.

    steam_sig.png