I registered for the normal "allow me to buy a card" queue on 10/ and 10/8 and still haven't heard anything. Not that I need a card anymore, but I could give that slot to someone who does.
Well, DisplayHDR hasn’t certified it yet, but reviews are filtering out, and the Odyssey Neo G9 hits 2000 nits with over 2000 local dimming zones.
And it costs 2500 dollars. Yikes.
I'd rather get a 55" OLED TV for that kind of money. But then I'm strange, and I don't really like the extra width on those monitors without any extra height. If only they'd slap a couple of inches onto the top and bottom.
For that money, you can nearly get a 55" OLED TV and then buy another one as a spare in case the fretting about burn in really does apply to late-model SKUs.
+1
jungleroomx
For that money, you can nearly get a 55" OLED TV and then buy another one as a spare in case the fretting about burn in really does apply to late-model SKUs.
I mean, it's not going to have 360Hz refresh rates or anything, but otherwise most TVs can be monitors. I use my OLED as a monitor, and really only a monitor, without issues. It's not even that hard to do: just get a slim stand from somewhere and a small cart with wheels to put your keyboard and mouse on so you can be 5 feet away or so. I used to put the stand behind my desk with a 3M keyboard/mouse tray and that worked just fine as well.
I know those super wide monitors allow for FoV that has competitive advantages. But for non-competition games, I'd rather have the better immersion that increased vertical space gives.
jungleroomx
The difference between a TV and a monitor isn’t just frame rates; it’s response time, variable refresh rate tech, white to black (grey) response time, and overall picture.
It’s not like they’re making monitors more expensive for the hell of it.
The difference between a TV and a monitor isn’t just frame rates; it’s response time, variable refresh rate tech, white to black (grey) response time, and overall picture.
It’s not like they’re making monitors more expensive for the hell of it.
OLEDs wreck monitors on all of the things you listed, with the possible exception of VRR where they just do it p much the same.
Yeah, the LG OLEDs operate at 120Hz. While sure, there are monitors at 360, most are still only at 60.
GnomeTank
edited July 2021
My 65" OLED has G-Sync compatible VRR and cost less than 2500 dollars.
That said, OLED 4K TV panels are made at an economy of scale that 5120x1440 panels are not. OLED TV panels also don't generally do 240Hz refresh (mine is 120Hz @ 4K HDR if connected via HDMI 2.1 with a high-bandwidth cable). Also, the response times aren't even close. Even in game mode I'm fairly confident my OLED doesn't get anywhere near 1ms. For most people that won't matter a lick, but there are use cases where it does.
I don't think 2000-ish is asking too much for that display. 2500 is probably a bit over, but people will pay it, so kind of a big shrug from me. I don't think it's outrageous, and I don't think OLED TVs being somewhat in the realm of affordability supersedes its existence. Very different use cases.
The difference between a TV and a monitor isn’t just frame rates; it’s response time, variable refresh rate tech, white to black (grey) response time, and overall picture.
It’s not like they’re making monitors more expensive for the hell of it.
OLEDs wreck monitors on all of the things you listed, with the possible exception of VRR where they just do it p much the same.
Literally -- there is no better commercially available technology for response times than OLEDs, and the CX (2020) and newer LG displays support variable refresh-rate with both G-sync compatibility and Freesync.
GnomeTank
There are two types of response time here. There's input response time, e.g. I move my mouse, how quickly does the screen update. That's what I mean when I say I'm fairly sure my OLED, even in game mode, doesn't approach a good gaming monitor. Then there's black-to-white, gray-to-gray response time... and yes, OLEDs murder that.
I think part of the reason OLED isn't as represented in the monitor space (burn-in is for sure a consideration!) is that LG Display is the only game in town when it comes to making larger OLED panels.
I think that a subsidiary of Samsung also makes OLED panels but they're sized for phones and tablets at the largest.
GnomeTank
edited July 2021
OLED also isn't "the future", as good as it is. It will almost certainly be superseded by some form of miniLED/microLED. That's where most of the other companies are focusing their dollars. Even LG is spending a lot to move that direction.
I'm thinking it will be MicroLED really, which purports to be an emissive display technology without the downsides of the organic elements that OLED has, but it has to develop economies of scale.
MiniLED isn't bad as a tech per se, but it's very similar to the full-array local dimming displays of the past, just scaled up into the thousands. And uh, well, a true 1:1 lit display like OLED has more than 8 million lights. I certainly don't think MiniLED is worth spending several thousand dollars on.
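Just to put that scale gap in perspective, here's a quick back-of-the-envelope comparison; the miniLED zone count is an illustrative round number for a high-end set, not any specific model's spec:

```python
# Scale comparison: miniLED local dimming zones vs. the self-emissive
# pixels of a 4K OLED, where every pixel is its own light source.

miniled_zones = 2000        # assumed: order of magnitude for high-end miniLED sets
oled_pixels = 3840 * 2160   # 4K UHD pixel count

print(f"4K OLED pixels: {oled_pixels:,}")   # 8,294,400 -- the "8 million lights"
print(f"miniLED zones:  {miniled_zones:,}")
print(f"ratio: ~{oled_pixels // miniled_zones:,}x finer dimming granularity")
```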
The new PSU format we previously discussed which would greatly reduce idle power draw looks to be DOA, as the major manufacturers don't want to support it.
it was also said again that the major motherboard manufacturers, along with the major OEM power supply manufacturers, have prevailed and have given a united front of rejection to Intel’s ATX12V0 project. This keeps 24-pin and EPS (currently) and also keeps the cost of the motherboards in check. In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.
What I never understood was, why was the motherboard going to be that much more efficient at changing voltages, and why can't that efficiency be applied to the PSU itself?
What I never understood was, why was the motherboard going to be that much more efficient at changing voltages, and why can't that efficiency be applied to the PSU itself?
Lower transmission losses. But those have been minimized anyways by the very low utilization of the 3.3V and 5V rails.
As for the “generate the 3.3V and 5V rails from the 12V rail” they should be doing that anyways. I *think* they have in the past generated all of that directly from rectified mains, which has obvious disadvantages.
As a hypothetical example, sending 10 amps at 3.3V across 1ft of 22ga wire has a power loss of about 4.5%. Sending that same amount of power at 12V across that wire has about a 0.3% loss. By moving the 12V-to-3.3V conversion from the PSU to the motherboard, you eliminate that wire and its efficiency losses. PSU makers are trying everything they can to bump their efficiency numbers up to meet regulations, and this would have been a simple and relatively large improvement, at the expense of making motherboards more expensive and adding another point of failure that's not as easily replaced.
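To show the arithmetic behind those percentages, here's a quick sketch; the ~16 mΩ/ft resistance for 22 AWG copper is an assumed handbook value, not a figure from the post above:

```python
# I^2*R transmission loss in a supply wire, as a fraction of delivered power.
R_WIRE = 0.016  # ohms per foot, 22 AWG copper (assumed handbook value)

def loss_fraction(power_w: float, voltage_v: float, feet: float = 1.0) -> float:
    """Fraction of delivered power dissipated as heat in the wire."""
    current = power_w / voltage_v          # same power, lower voltage => more amps
    return (current ** 2) * R_WIRE * feet / power_w

power = 33.0  # 10 A at 3.3 V
print(f"3.3 V rail: {loss_fraction(power, 3.3):.1%} lost")   # ~4.8%, close to the post's 4.5%
print(f"12 V rail:  {loss_fraction(power, 12.0):.2%} lost")  # ~0.37%, close to the post's 0.3%
```

The squared current term is why the low-voltage rail loses so much more over the same wire: dropping from 12V to 3.3V roughly quadruples the current, and the heat goes up by that factor squared.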
There are two types of response time here. There's input response time, e.g. I move my mouse, how quickly does the screen update. That's what I mean when I say I'm fairly sure my OLED, even in game mode, doesn't approach a good gaming monitor. Then there's black-to-white, gray-to-gray response time... and yes, OLEDs murder that.
Yeah the response time from controller/kbam to screen is where OLEDs kill it. I don’t know where I got the white/grey but I was obviously wrong.
I also figured LED monitors were dunking on OLEDs when it came to HDR as well. Top-tier OLEDs have 750 nits peak, I think?
That's true for sure with the peak nits! Though it's flipped when looking at true blacks too.
I know super bright screens can be really neat, but just a point of personal preference -- I don't even like it when I see the 750 peak nits on my OLED since I had LASIK several years ago and I'm still quite sensitive to light that bright. I can only imagine that a top LCD would make me cry haha xD
jungleroomx
That's true for sure with the peak nits! Though it's flipped when looking at true blacks too.
I know super bright screens can be really neat, but just a point of personal preference -- I don't even like it when I see the 750 peak nits on my OLED since I had LASIK several years ago and I'm still quite sensitive to light that bright. I can only imagine that a top LCD would make me cry haha xD
I think 1000 nits on a wall mounted screen or one you’re several feet away from is fine. I just can’t imagine having that 2 feet from your face, my 600 nit screen makes me wince
In my new office it's the 48" C1 OLED ~3ft from me, wall mounted with lift and tilt. It's actually pretty neat with PowerToys installed to give me more granular zones for window snapping -- easy to break my desktop down into the equivalent of 4 x 1080p 24" screens or some funky derivatives.
GnomeTank
edited July 2021
Peak brightness is not the only important factor for HDR, dimming zones matter as well. That's where OLED makes up for maybe not having the peakiest brightness. Anything over HDR600 with enough local dimming zones should look great.
In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.
1/3 of the words in this quote aren't real
It's a German website that generally has good content, but they produce enough stuff that it's too expensive to have a translator on staff, but they DO get enough requests that they use a professional-grade machine translator.
The queue thing is now done since the drop is pretty much gone with all videocards. I expect another queue next week. Apparently they randomly assign you a place if you're there before "the event" begins, so it might be a lucky day for someone.
In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.
1/3 of the words in this quote aren't real
It's a German website that generally has good content, but they produce enough stuff that it's too expensive to have a translator on staff, but they DO get enough requests that they use a professional-grade machine translator.
It's not perfect, but it's better than nothing.
Oh I was straight trolling. I figured they used big words to smokescreen the real reasons why they aren't doing it.
I'm still kicking myself. I'm not sure why I registered the warranty on 9/28 but didn't get around to doing Step-Up until 10/20. Still, soon
I'd rather get a 55" OLED tv for that kind of money. But then I'm strange and I don't really like the extra width on those monitors without any extra height. If only they'd slap a couple inches to the top and bottom.
Yeah but that’s a TV, not a monitor
It’s not like they’re making monitors more expensive for the hell of it.
OLEDs wreck monitors on all of the things you listed, with the possible exception of VRR where they just do it p much the same.
Like monitors are 4:4:4 and TVs aren't?
https://www.igorslab.de/en/intel-alder-lake-s-launch-only-enthusiasts-cpu-and-z690-chipset-between-25-october-and-19-november-2021-the-remainder-is-coming/
that link is broken, for me at least. I think it should be this
https://amd.com/en/direct-buy/
Yeaaahhhhhh. I'm at 10/18 as that's when I gave up drop watching and just bought a 2060 super.
Oops, you're right.
Any other details? Because some Sabrents are Gen 4, but the 980 (non Pro) is Gen 3.