[PC Build Thread] Nope, you still can't buy anything


Posts

  • user Registered User regular
    Yeah, I think it's just down to Gigabyte being slow to roll out the latest AGESA

  • Spoit *twitch twitch* Registered User regular
    schuss wrote: »
    As for step-ups - 3080 is up to 10/10 for north america at this point

    JFC

    I'm still kicking myself. I'm not sure why I registered the warranty on 9/28 but didn't get around to doing the step-up until 10/20. Still, soon

  • Cormac Registered User regular
    I signed up for the normal "allow me to buy a card" queue on 10/ and 10/8 and still haven't heard anything. Not that I need a card anymore, but I could give that slot to someone who does.

    Steam: Gridlynk | PSN: Gridlynk | FFXIV: Jarvellis Mika
  • syndalis Getting Classy On the Wall Registered User, Loves Apple Products regular
    GnomeTank wrote: »
    ....What? HDR2000? That's absurd if true. My 1000 nit TV is bright AF from 7 feet away. I can't even imagine.

    e: VESA says Samsung and Acer are full of shit: https://www.pcgamer.com/displayhdr-2000-false-vesa-samsung-acer/

    Well, DisplayHDR hasn’t certified it, but reviews are filtering out and the Odyssey G9 neo is 2000 nits with over 2000 local dimming zones.

    And it costs 2500 dollars. Yikes.

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • Trajan45 Registered User regular
    syndalis wrote: »
    GnomeTank wrote: »
    ....What? HDR2000? That's absurd if true. My 1000 nit TV is bright AF from 7 feet away. I can't even imagine.

    e: VESA says Samsung and Acer are full of shit: https://www.pcgamer.com/displayhdr-2000-false-vesa-samsung-acer/

    Well, DisplayHDR hasn’t certified it, but reviews are filtering out and the Odyssey G9 neo is 2000 nits with over 2000 local dimming zones.

    And it costs 2500 dollars. Yikes.

    I'd rather get a 55" OLED TV for that kind of money. But then, I'm strange and I don't really like the extra width on those monitors without any extra height. If only they'd slap a couple of inches onto the top and bottom.

    Origin ID\ Steam ID: Warder45
  • V1m Registered User regular
    For that money, you can nearly get a 55" OLED TV and then buy another one as a spare in case the fretting about burn in really does apply to late-model SKUs.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    V1m wrote: »
    For that money, you can nearly get a 55" OLED TV and then buy another one as a spare in case the fretting about burn in really does apply to late-model SKUs.

    Yeah but that’s a TV, not a monitor

  • Trajan45 Registered User regular
    I mean, it's not going to have 360 Hz refresh rates or anything, but otherwise most TVs can be monitors. I use my OLED as a monitor, and really only as a monitor, without issues. It's not even that hard to do: just get a slim stand from somewhere and a small cart with wheels to put your keyboard and mouse on so you can be 5 feet away or so. I used to put the stand behind my desk with a 3M keyboard/mouse tray and that worked just fine as well.

    I know those super wide monitors allow for FoV that has competitive advantages. But for non-competition games, I'd rather have the better immersion that increased vertical space gives.

    Origin ID\ Steam ID: Warder45
  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    The difference between a TV and a monitor isn’t just frame rates; it’s response time, variable refresh rate tech, white to black (grey) response time, and overall picture.

    It’s not like they’re making monitors more expensive for the hell of it.

  • V1m Registered User regular
    The difference between a TV and a monitor isn’t just frame rates; it’s response time, variable refresh rate tech, white to black (grey) response time, and overall picture.

    It’s not like they’re making monitors more expensive for the hell of it.

    OLEDs wreck monitors on all of the things you listed, with the possible exception of VRR where they just do it p much the same.

  • Spoit *twitch twitch* Registered User regular
    Yeah, the LG OLEDs operate at 120 Hz. While sure, there are monitors at 360, most are still only at 60.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited July 2021
    My 65" OLED has G-Sync compatible VRR and cost less than 2500 dollars.

    That said, OLED 4K TV panels are made at an economy of scale that 5120x1440 panels are not. OLED TV panels also don't generally do 240 Hz refresh (mine is 120 Hz @ 4K HDR if connected via HDMI 2.1 with a high-bandwidth cable; see the rough bandwidth math after this post). Also, the response times aren't even close. Even in game mode I'm fairly confident my OLED doesn't get anywhere near 1 ms. For most people that won't matter a lick, but there are use cases where it does.

    I don't think 2000-ish is asking too much for that display. 2500 is probably a bit over, but people will also pay it, so it's kind of a big shrug from me. I don't think it's outrageous, and I don't think OLED TVs being somewhat in the realm of affordability supersedes its existence. Very different use cases.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
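    A rough back-of-the-envelope sketch of why the HDMI 2.1 / high-bandwidth-cable caveat above matters for 4K 120 Hz 10-bit HDR (illustrative only; it ignores blanking intervals and link-encoding overhead, so the real requirement is a bit higher):

        # Approximate uncompressed pixel data rate for 4K @ 120 Hz, 10-bit RGB (4:4:4).
        # Blanking and FRL/TMDS encoding overhead are ignored, so actual needs are higher.

        width, height = 3840, 2160
        refresh_hz = 120
        bits_per_channel = 10   # 10-bit HDR
        channels = 3            # R, G, B

        gbits_per_s = width * height * refresh_hz * bits_per_channel * channels / 1e9
        print(f"~{gbits_per_s:.1f} Gbit/s of raw pixel data")  # ~29.9 Gbit/s

        # HDMI 2.0 tops out around 18 Gbit/s, while HDMI 2.1 offers up to 48 Gbit/s,
        # which is why 4K 120 Hz HDR wants a 2.1 port and an Ultra High Speed cable.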
  • user Registered User regular
    V1m wrote: »
    The difference between a TV and a monitor isn’t just frame rates; it’s response time, variable refresh rate tech, white to black (grey) response time, and overall picture.

    It’s not like they’re making monitors more expensive for the hell of it.

    OLEDs wreck monitors on all of the things you listed, with the possible exception of VRR where they just do it p much the same.

    Literally -- there is no better commercially available technology for response times than OLEDs, and the CX (2020) and newer LG displays support variable refresh rate with both G-Sync compatibility and FreeSync.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    There are two types of response time here. There's input response time, e.g. I move my mouse, how quickly does the screen update. That's what I mean when I say I'm fairly sure my OLED, even in game mode, doesn't approach a good gaming monitor. Then there's black-to-white, gray-to-gray response time... and yes, OLEDs murder that.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • Ethea Registered User regular
    Wasn't chroma subsampling the big deal for monitors vs. TVs for the longest time?

    Like monitors are 4:4:4 and TVs aren't?

  • user Registered User regular
    Maybe the very earliest OLEDs? Even my older 2017 OLED TV has 4:4:4.

  • user Registered User regular
    I think part of the reason why OLED isn't as well represented in the monitor space (burn-in is certainly a consideration!) is that LG Display is the only game in town when it comes to making larger OLED panels.

    I think that a subsidiary of Samsung also makes OLED panels but they're sized for phones and tablets at the largest.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited July 2021
    OLED also isn't "the future", as good as it is. It will almost certainly be superseded by some form of miniLED/microLED. That's where most of the other companies are focusing their dollars. Even LG is spending a lot to move in that direction.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • user Registered User regular
    I'm thinking it will really be MicroLED, which purports to be an emissive display technology without the downsides of the organic elements that OLED has, but it still has to develop economies of scale.

    MiniLED isn't bad as a tech per se, but it's very similar to the full-array local dimming displays of the past, just scaled up into the thousands of zones. And, uh, well, a true 1:1-lit display like OLED has more than 8 million lights. I certainly don't think MiniLED is worth spending several thousand dollars on.

  • Jragghen Registered User regular
    The new PSU format we previously discussed, which would greatly reduce idle power draw, looks to be DOA, as the major manufacturers don't want to support it.

    https://www.igorslab.de/en/intel-alder-lake-s-launch-only-enthusiasts-cpu-and-z690-chipset-between-25-october-and-19-november-2021-the-remainder-is-coming/
    it was also said again that the major motherboard manufacturers, along with the major OEM power supply manufacturers, have prevailed and have given a united front of rejection to Intel’s ATX12V0 project. This keeps 24-pin and EPS (currently) and also keeps the cost of the motherboards in check. In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.

  • tsmvengy Registered User regular
    What I never understood was, why was the motherboard going to be that much more efficient at changing voltages, and why can't that efficiency be applied to the PSU itself?

  • Soggybiscuit Tandem Electrostatic Accelerator Registered User regular
    tsmvengy wrote: »
    What I never understood was, why was the motherboard going to be that much more efficient at changing voltages, and why can't that efficiency be applied to the PSU itself?

    Lower transmission losses. But those have been minimized anyways by the very low utilization of the 3.3V and 5V rails.

    As for the “generate the 3.3V and 5V rails from the 12V rail” they should be doing that anyways. I *think* they have in the past generated all of that directly from rectified mains, which has obvious disadvantages.

    Steam - Synthetic Violence | XBOX Live - Cannonfuse | PSN - CastleBravo | Twitch - SoggybiscuitPA
  • SiliconStew Registered User regular
    As a hypothetical example, sending 10 amps at 3.3V across 1 ft of 22 ga wire has a power loss of about 4.5%. Sending that same amount of power at 12V across that wire has about a 0.3% loss (a quick sketch of the math follows this post). By moving the 12V-to-3.3V conversion from the PSU to the motherboard, you eliminate that wire and its efficiency losses. PSU makers are trying everything they can to bump their efficiency numbers up to meet regulations, and this would have been a simple and relatively large improvement, at the expense of making motherboards more expensive and adding another, less easily replaced point of failure to them.

    Just remember that half the people you meet are below average intelligence.
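    A minimal sketch of that loss arithmetic (illustrative only; it assumes roughly 15 mOhm for a foot of 22 AWG copper and ignores connector and return-path resistance):

        # I^2*R loss comparison for delivering ~33 W over one foot of 22 AWG wire.
        # Assumes ~15 mOhm for the wire; connector and return-path resistance ignored.

        WIRE_RESISTANCE_OHMS = 0.015  # roughly 1 ft of 22 AWG copper

        def loss_fraction(rail_volts, delivered_watts, resistance=WIRE_RESISTANCE_OHMS):
            """Fraction of the delivered power burned off in the wire at a given rail voltage."""
            amps = delivered_watts / rail_volts
            return (amps ** 2) * resistance / delivered_watts  # I^2 * R over total power

        power_w = 3.3 * 10  # the 10 A @ 3.3 V example above, i.e. 33 W
        print(f"3.3 V rail: {loss_fraction(3.3, power_w):.1%} lost in the cable")   # ~4.5%
        print(f"12 V rail:  {loss_fraction(12.0, power_w):.1%} lost in the cable")  # ~0.3%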
  • Infidel Heretic Registered User regular
    It makes sense, but someone has to eat the cost here, and the status quo is super heavy.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    GnomeTank wrote: »
    There are two types of response time here. There's input response time, e.g. I move my mouse, how quickly does the screen update. That's what I mean when I say I'm fairly sure my OLED, even in game mode, doesn't approach a good gaming monitor. Then there's black-to-white, gray-to-gray response time... and yes, OLEDs murder that.

    Yeah, the response time from controller/KBAM to screen is where OLEDs kill it. I don't know where I got the white/grey thing, but I was obviously wrong.

    I also figured LED monitors were dunking on OLEDs when it came to HDR as well. Top-tier OLEDs have 750 nits peak, I think?

  • user Registered User regular
    That's true for sure with the peak nits! Though it's flipped when looking at true blacks too.

    I know super bright screens can be really neat, but just a point of personal preference -- I don't even like it when I see the 750 peak nits on my OLED since I had LASIK several years ago and I'm still quite sensitive to light that bright. I can only imagine that a top LCD would make me cry haha xD

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    user wrote: »
    That's true for sure with the peak nits! Though it's flipped when looking at true blacks too.

    I know super bright screens can be really neat, but just a point of personal preference -- I don't even like it when I see the 750 peak nits on my OLED since I had LASIK several years ago and I'm still quite sensitive to light that bright. I can only imagine that a top LCD would make me cry haha xD

    I think 1000 nits on a wall-mounted screen, or one you're several feet away from, is fine. I just can't imagine having that 2 feet from your face; my 600-nit screen makes me wince.

  • user Registered User regular
    In my new office it's the 48" C1 OLED ~3 ft from me, wall mounted with lift and tilt. It's actually pretty neat with PowerToys installed to give me more granular zones for window snapping -- easy to break my desktop down into the equivalent of 4 x 1080p 24" screens or some funky derivatives.
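    A quick sketch of the geometry behind that "four 24-inch 1080p screens" equivalence (illustrative only; it assumes a plain 2x2 split rather than any particular PowerToys layout):

        # Splitting a 48" 16:9 4K panel into a 2x2 grid of window-snapping zones gives
        # four regions, each equivalent to a 24" 1080p monitor at the same pixel density.
        import math

        diag_in, res_w, res_h = 48, 3840, 2160
        aspect = 16 / 9

        height_in = diag_in / math.sqrt(aspect ** 2 + 1)  # physical height from the diagonal
        width_in = height_in * aspect

        zone_diag_in = math.hypot(width_in / 2, height_in / 2)
        zone_w, zone_h = res_w // 2, res_h // 2

        print(f'each zone: {zone_diag_in:.1f}" at {zone_w}x{zone_h}')  # 24.0" at 1920x1080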

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited July 2021
    Peak brightness is not the only important factor for HDR; dimming zones matter as well. That's where OLED makes up for maybe not having the peakiest brightness. Anything over HDR600 with enough local dimming zones should look great.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • Mugsley Delaware Registered User regular
    In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.

    1/3 of the words in this quote aren't real

  • Jragghen Registered User regular
    Mugsley wrote: »
    In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.

    1/3 of the words in this quote aren't real

    It's a German website that generally has good content. They produce enough stuff that it's too expensive to have a translator on staff, but they DO get enough requests that they use a professional-grade machine translator.

    It's not perfect, but it's better than nothing.

  • Drovek Registered User regular
    edited July 2021
    They FINALLY added a queue to AMD's direct buy system.

  • Pailryder Registered User regular
    Drovek wrote: »
    They FINALLY added a queue to AMD's direct buy system.

    That link is broken, for me at least. I think it should be this:
    https://amd.com/en/direct-buy/

  • schuss Registered User regular
    Spoit wrote: »
    schuss wrote: »
    As for step-ups - 3080 is up to 10/10 for north america at this point

    JFC

    Yeaaahhhhhh. I'm at 10/18 as that's when I gave up drop watching and just bought a 2060 super.

  • Drovek Registered User regular
    Pailryder wrote: »
    Drovek wrote: »
    They FINALLY added a queue to AMD's direct buy system.

    that link is broken, for me at least. I think it should be this
    https://amd.com/en/direct-buy/

    Oops, you're right.

    The queue is done for now since the drop is pretty much gone for all video cards. I expect another queue next week. Apparently they randomly assign you a place if you're there before "the event" begins, so it might be a lucky day for someone.

  • Mugsley Delaware Registered User regular
    Jragghen wrote: »
    Mugsley wrote: »
    In our own discussions with board layouters, there were also several expressions of displeasure against the relocation of the extraction of the individual voltages on the motherboard, based on technical imponderables and an unwanted fragmentation. Maybe we can find out more about it here soon.

    1/3 of the words in this quote aren't real

    It's a German website that generally has good content, but they produce enough stuff that it's too expensive to have a translator on staff, but they DO get enough requests that they use a professional-grade machine translator.

    It's not perfect, but it's better than nothing.

    Oh I was straight trolling. I figured they used big words to smokescreen the real reasons why they aren't doing it.

  • Mugsley Delaware Registered User regular
    Sealed-in-box Sabrent 1TB M.2 drive I have sitting at home (got it for $100 shipped), or $110 for a 980 1TB from Amazon?

  • GnomeTank What the what? Portland, Oregon Registered User regular
    All my drives are Sabrent or Samsung and I've had zero issues with either. If you've already got the Sabrent just use it.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • Pixelated Pixie They/Them Registered User regular
    Yeah, I have a Sabrent and a Samsung, no complaints about either.

    ~~ Pixie on Steam ~~
    ironzerg wrote: »
    Chipmunks are like nature's nipple clamps, I guess?
  • user Registered User regular
    Mugsley wrote: »
    Sealed-in-box Sabrent 1TB M.2 drive I have sitting at home (got it for $100 shipped), or $110 for a 980 1TB from Amazon?

    Any other details? Because some Sabrents are Gen 4, but the 980 (non-Pro) is Gen 3.
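    For context, a quick sketch of what Gen 3 vs. Gen 4 means in raw x4 link bandwidth (illustrative only; real drives top out below the link ceiling, and the exact Sabrent model matters):

        # Theoretical PCIe x4 link bandwidth for NVMe drives, after 128b/130b encoding.
        # Actual drive throughput depends on the controller and NAND and sits below these figures.

        def pcie_x4_gb_per_s(transfer_rate_gt_s, lanes=4):
            """Usable GB/s for a PCIe link after 128b/130b encoding overhead."""
            return transfer_rate_gt_s * lanes * (128 / 130) / 8

        print(f"PCIe 3.0 x4: ~{pcie_x4_gb_per_s(8):.1f} GB/s")   # ~3.9 GB/s (Gen 3, e.g. the 980 non-Pro)
        print(f"PCIe 4.0 x4: ~{pcie_x4_gb_per_s(16):.1f} GB/s")  # ~7.9 GB/s (Gen 4 drives)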

This discussion has been closed.