
[PC Build Thread] It's a weird time in Hardwaretown


Posts

  • Incindium Registered User regular
    edited July 2019
    It looks like Gigabyte is acting quickly to address the chipset fan noise concerns with the new X570 boards. Sounds like a new fan profile will be out before I manage to build my system on the X570 Aorus Master.

    https://www.overclock.net/forum/11-amd-motherboards/1728360-gigabyte-x570-aorus-owners-thread-7.html#post28031760

    Incindium on
    Nintendo ID: Incindium
    PSN: IncindiumX
  • Incindium Registered User regular
    edited July 2019
    bowen wrote: »
    Incindium wrote: »
    @bowen I dunno if you are wanting an X570 motherboard but looks like 3900x and X570 Motherboard combos are in stock at Newegg currently.

    @Incindium they're all sold out for me?

    Looks like they sold out again. They were in stock when I posted earlier, for sure, because I bought one of the bundles. I'd recommend finding a motherboard bundle and clicking the link to create an Auto Notify notification. I got an email notification this morning when the bundle I was looking at came back in stock.

    Incindium on
    Nintendo ID: Incindium
    PSN: IncindiumX
  • Al_wat Registered User regular
    I wonder how the 3950x will compare.

  • bowen How you doin'? Registered User regular
    Incindium wrote: »
    bowen wrote: »
    Incindium wrote: »
    @bowen I dunno if you are wanting an X570 motherboard but looks like 3900x and X570 Motherboard combos are in stock at Newegg currently.

    @Incindium they're all sold out for me?

    Looks like they sold out again. They were in stock when I posted earlier, for sure, because I bought one of the bundles. I'd recommend finding a motherboard bundle and clicking the link to create an Auto Notify notification. I got an email notification this morning when the bundle I was looking at came back in stock.

    WELL

    I got the 3800x and the ASRock Taichi at least. A friend expressed interest in my 3800 if I upgrade to the 3950x when it drops.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • Mugsley Delaware Registered User regular
    I'm not trying to accuse you guys of anything, but I've never been a fan of ASRock boards; even their Taichi. Then again, I think a good number of you turn over hardware way more often than me so it's less of an issue.

  • Aridhol Daddliest Catch Registered User regular
    Al_wat wrote: »
    I wonder how the 3950x will compare.

    I think anyone who is editing videos on even a semi-regular basis will have one, and Intel will just go "poof" in this space :)
    16 cores and 32 threads at a consumer price point is insane!

    I think my next system will be based around the 3700x though. Good price, good performance for what I do.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited July 2019
    Al_wat wrote: »
    I wonder how the 3950x will compare.

    I imagine it will be the 3900X with four more cores enabled and slightly slower frequencies. Hopefully they work out the SMT and frequency issues with BIOS patches before it drops, since I would assume any fixes applied for the 3900X will work for the 3950X.

    GnomeTank on
    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • Incindium Registered User regular
    edited July 2019
    Oh man, now the question is going to be whether I can wait to build this 3900x system until the 2080 Super cards are actually released... I've already started looking at 2080 Ti cards, but man, they are so pricey, and there are lots of reviews from people whose cards died.

    I guess I could pull the 970 out of my current system and run it for a while, and just use the integrated graphics in the Ivy Bridge CPU for that one (or just shut that one down for a while).

    Incindium on
    Nintendo ID: Incindium
    PSN: IncindiumX
  • bowen How you doin'? Registered User regular
    Mugsley wrote: »
    I'm not trying to accuse you guys of anything, but I've never been a fan of ASRock boards; even their Taichi. Then again, I think a good number of you turn over hardware way more often than me so it's less of an issue.

    I've used a lot of ASRock boards and I've never been disappointed by them. But I've always been a fan of ASUS in general; they've never really let me down.

    MSI can suck some fat cocks tho.

    That said, the Taichi looks like it's one of the better-rated mid-level X570 boards.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • GnomeTank What the what? Portland, Oregon Registered User regular
    The 2080 Ti card-dying thing was way overblown. Its RMA numbers were never higher than the 1080 Ti's. People made a big deal out of it because the 20 series cards were a bitter launch for a lot of people, so they ranted.

    I've had my 2080 Ti since January with no issues. Obviously a sample size of one, but GN has a video out where they did a more extensive analysis and got real RMA numbers from some insiders; there was never really an epidemic of cards dying. Just a very loud minority of people.

    The price issue is real, though. It was a bitter pill to swallow when I bought mine, but I was able to recoup some of the cost selling my 1080 Ti. I'd probably wait for the 2080 Super, honestly. If it's really using a TU102 GPU like the 2080 Ti, it's going to be probably 90% of the performance for 500 dollars less.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • bowen How you doin'? Registered User regular
    I'm glad Nvidia had to eat their shit sandwich when crypto tanked and that market couldn't support the massive prices on video cards anymore, and they had to rein in their crazy a tiny bit.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • LD50 Registered User regular
    GnomeTank wrote: »
    I think for the 3900X AMD needs to figure out the SMT issue more than frequency issues. You shouldn't need to turn SMT off to get max performance. This was an issue that Intel had years and years ago but mostly solved. I don't think it's entirely AMD's issue, I think they need to work with Microsoft to get Windows to play nice, but I think it's a big issue they need to solve.

    This is likely not an AMD issue but an engine issue. Game engines commonly and intentionally attempt to force core affinity in order to guarantee their heavy workload threads are loaded onto specific (and different) cores. If the engine isn't properly identifying virtual cores vs. physical ones (or if something dumb is happening to its internal scheduler because of the larger core count), it could be forcing two heavy workload threads onto the same physical/virtual pair.

    You likely don't need to turn off SMT to fix this problem, but just set core affinities in task manager.

  • Incindium Registered User regular
    edited July 2019
    LD50 wrote: »
    GnomeTank wrote: »
    I think for the 3900X AMD needs to figure out the SMT issue more than frequency issues. You shouldn't need to turn SMT off to get max performance. This was an issue that Intel had years and years ago but mostly solved. I don't think it's entirely AMD's issue, I think they need to work with Microsoft to get Windows to play nice, but I think it's a big issue they need to solve.

    This is likely not an AMD issue but an engine issue. Game engines commonly and intentionally attempt to force core affinity in order to guarantee their heavy workload threads are loaded onto specific (and different) cores. If the engine isn't properly identifying virtual cores vs. physical ones (or if something dumb is happening to its internal scheduler because of the larger core count), it could be forcing two heavy workload threads onto the same physical/virtual pair.

    You likely don't need to turn off SMT to fix this problem, but just set core affinities in task manager.

    Yeah Steve from GamersNexus was just shutting it off to identify games that were having issues with SMT while he was benching. I think Linus mentioned playing with the task manager core affinities. Are task manager core affinities something that can be automated/configured easily on a per game basis in Windows?

    Incindium on
    Nintendo ID: Incindium
    PSN: IncindiumX
  • GnomeTank What the what? Portland, Oregon Registered User regular
    LD50 wrote: »
    GnomeTank wrote: »
    I think for the 3900X AMD needs to figure out the SMT issue more than frequency issues. You shouldn't need to turn SMT off to get max performance. This was an issue that Intel had years and years ago but mostly solved. I don't think it's entirely AMD's issue, I think they need to work with Microsoft to get Windows to play nice, but I think it's a big issue they need to solve.

    This is likely not an AMD issue but an engine issue. Game engines commonly and intentionally attempt to force core affinity in order to guarantee their heavy workload threads are loaded onto specific (and different) cores. If the engine isn't properly identifying virtual cores vs. physical ones (or if something dumb is happening to its internal scheduler because of the larger core count), it could be forcing two heavy workload threads onto the same physical/virtual pair.

    You likely don't need to turn off SMT to fix this problem, but just set core affinities in task manager.

    If it also happened on Intel CPUs I would agree, but Intel hasn't had this problem in years, and it ended up being a Windows scheduler problem more than anything else. I would imagine AMD's CCX concept is part of the problem as well: having part of the game running on one CCX while the rest is on another causes latency issues. It's one of the reasons they upped the L3 cache to such huge numbers, so the CCXs can communicate via the cache in a lot of cases instead of having to use the Infinity Fabric (which is fast, but still not as fast as direct core-to-core communication via cache).

    Most games don't schedule their own threads in-engine. They schedule tasks to threads, but the threads themselves are scheduled by the OS...because writing a thread scheduler is serious business that will gain you almost nothing on a modern OS with 20+ years of refinement in its preemptive scheduler.
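    A toy sketch of what I mean, in Python (this isn't any real engine's code; simulate_physics and update_ai are made-up stand-ins for per-frame jobs): the "engine" side only picks how many workers to spin up and which tasks go to them, while the OS decides which core each worker thread actually runs on.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_physics(frame):
    # stand-in for a heavy per-frame job
    return sum(i * i for i in range(100_000))

def update_ai(frame):
    # stand-in for another per-frame job
    return sum(i for i in range(100_000))

# The "engine" chooses the worker count and hands out tasks...
with ThreadPoolExecutor(max_workers=8) as pool:
    for frame in range(3):
        jobs = [pool.submit(simulate_physics, frame),
                pool.submit(update_ai, frame)]
        # ...but which physical core each worker lands on is the OS
        # scheduler's call, unless you explicitly set affinity.
        results = [job.result() for job in jobs]
```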

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • bowen How you doin'? Registered User regular
    Incindium wrote: »
    LD50 wrote: »
    GnomeTank wrote: »
    I think for the 3900X AMD needs to figure out the SMT issue more than frequency issues. You shouldn't need to turn SMT off to get max performance. This was an issue that Intel had years and years ago but mostly solved. I don't think it's entirely AMD's issue, I think they need to work with Microsoft to get Windows to play nice, but I think it's a big issue they need to solve.

    This is likely not an AMD issue but an engine issue. Game engines commonly and intentionally attempt to force core affinity in order to guarantee their heavy workload threads are loaded onto specific (and different) cores. If the engine isn't properly identifying virtual cores vs. physical ones (or if something dumb is happening to its internal scheduler because of the larger core count), it could be forcing two heavy workload threads onto the same physical/virtual pair.

    You likely don't need to turn off SMT to fix this problem, but just set core affinities in task manager.

    Yeah Steve from GamersNexus was just shutting it off to identify games that were having issues with SMT while he was benching. I think Linus mentioned playing with the task manager core affinities. Are task manager core affinities something that can be automated/configured easily on a per game basis in Windows?

    Yeah, there are tools you can use to lock programs to cores if you need to.
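    For anyone curious, here's a rough sketch of the idea in Python with the third-party psutil package (the process name and core numbering below are just examples; on most SMT systems the logical CPUs are enumerated in pairs per physical core, but check your own topology, and you may need to run it elevated for some processes):

```python
import psutil  # third-party: pip install psutil

GAME_EXE = "game.exe"  # hypothetical process name, change to match your game
ONE_PER_PHYSICAL_CORE = list(range(0, psutil.cpu_count(logical=True), 2))
# ^ assumes logical CPUs come in SMT pairs (0/1, 2/3, ...), so taking every
#   other one gives roughly one logical CPU per physical core

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == GAME_EXE:
        proc.cpu_affinity(ONE_PER_PHYSICAL_CORE)  # pin the process to those logical CPUs
        print(f"pinned PID {proc.pid} to {proc.cpu_affinity()}")
```

    Same idea as the Task Manager affinity checkboxes, just scriptable so it can be reapplied per game.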

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • agoaj Top Tier One Fear Registered User regular
    A coworker is selling their portable tower.
    https://pcpartpicker.com/list/XPZdBb
    I'm not interested in the portability. Would it be a good idea to buy it and swap in a full-size mobo and case? I was looking to buy a new PC for gaming and game development soonish anyways.
    What's a good way to find the value of the unpriced parts?

  • BlazeFire Registered User regular
    What card is required for 1440p gaming above 60Hz? I bought such a monitor but my 1070 doesn't seem up to the task.

  • Aridhol Daddliest Catch Registered User regular
    BlazeFire wrote: »
    What card is required for 1440p gaming above 60Hz? I bought such a monitor but my 1070 doesn't seem up to the task.

    The 5700XT and 2070 Super are probably the minimums.
    I'm in the same situation; my 1070 Ti did well but not great.

  • Jeep-Eep Registered User regular
    bowen wrote: »
    Mugsley wrote: »
    I'm not trying to accuse you guys of anything, but I've never been a fan of ASRock boards; even their Taichi. Then again, I think a good number of you turn over hardware way more often than me so it's less of an issue.

    I've used a lot of ASRock boards and I've never been disappointed by them. But I've always been a fan of ASUS in general; they've never really let me down.

    MSI can suck some fat cocks tho.

    That said, the Taichi looks like it's one of the better-rated mid-level X570 boards.

    Basically the same story as Gigabyte - good mainboards, crappy GPUs (outside of Aorus).

    I'm feeling rather vindicated about saying the standard 2060 was a bad deal, BTW.

    I would rather be accused of intransigence than tolerating genocide for the sake of everyone getting along. - @Metzger Meister
  • GnomeTank What the what? Portland, Oregon Registered User regular
    I would second the 5700XT or 2070 Super. Basically base your decision on whether you want RT hardware or not. If you can afford it, I would recommend it. Ray tracing is going to be a thing sooner than most people think. Cyberpunk is going to push it hard as a feature and both of the new consoles will have RT hardware. If money is tight the 5700XT will do just fine for 1440p.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • Mugsley Delaware Registered User regular
    These peeps are correct, but I'm going to qualify it by saying it depends on your games. My 1080 easily handles Hellblade, MHW, and Path of Exile at higher frame rates.

    I also play demanding games such as CrossCode and Va-11 Hall-A fully maxed. >.>

  • Jeep-Eep Registered User regular
    GnomeTank wrote: »
    I would second the 5700XT or 2070 Super. Basically base your decision on whether you want RT hardware or not. If you can afford it, I would recommend it. Ray tracing is going to be a thing sooner than most people think. Cyberpunk is going to push it hard as a feature and both of the new consoles will have RT hardware. If money is tight the 5700XT will do just fine for 1440p.

    If ray tracing is that important, it will be worth waiting for the 2020 generation of cards for it; the 20 series just isn't competent enough at it to justify the price tag, and this will only get worse over time. It's still about raster per dollar right now, and right now the 5700XT is the better choice.

    I would rather be accused of intransigence than tolerating genocide for the sake of everyone getting along. - @Metzger Meister
  • Aridhol Daddliest Catch Registered User regular
    edited July 2019
    1440p is hard as hell on current-gen midrange GPUs.
    4K is even worse.

    I won't go back because it looks great, but damn it's rough on hardware.

    I am still slightly leaning towards the 2070 Super but the 5700XT is in the mix for sure. Gonna wait until the playing field is all known in a month, for AIBs and all that shit.

    Either way it's a better landscape for the consumer :)


    Aridhol on
  • Al_wat Registered User regular
    So I guess the 9900k is going to remain the best gaming chip for the foreseeable future eh?

    I've been thinking about making an upgrade from my 7700k later this year. These new Ryzens had me tempted but they don't seem like the thing to go to for pure gaming.

  • jgeis Registered User regular
    edited July 2019
    Uh, I guess that MicroCenter has an additional $50 off the RX 5700XT when you buy a Ryzen 3600 or better CPU at the same time, in addition to $50 off motherboards.

    The 5700XT is currently sold out at my local store, but getting one for $350 seems like a really great deal even if the stock cooling solution isn't great.

    Edit: Regular 5700 is also $50 off with a qualifying CPU.

    jgeis on
  • wunderbar What Have I Done? Registered User regular
    Al_wat wrote: »
    So I guess the 9900k is going to remain the best gaming chip for the foreseeable future eh?

    I've been thinking about making an upgrade from my 7700k later this year. These new Ryzens had me tempted but they don't seem like the thing to go to for pure gaming.

    Honestly, I wouldn't bet on that. The 9900k will be better on existing games that are mostly single- or dual-threaded, but over the next 12-18 months you're going to see a lot more games coming out that can actually take advantage of 6+ cores. AMD parts will likely show better performance over time as that happens.

    XBL: thewunderbar PSN: thewunderbar NNID: thewunderbar Steam: wunderbar87 Twitter: wunderbar
  • Aridhol Daddliest Catch Registered User regular
    Al_wat wrote: »
    So I guess the 9900k is going to remain the best gaming chip for the foreseeable future eh?

    I've been thinking about making an upgrade from my 7700k later this year. These new Ryzens had me tempted but they don't seem like the thing to go to for pure gaming.

    Intel is coming out with the i9-9900KS, which is just a pre-overclocked, good-silicon version, but yes.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited July 2019
    Jeep-Eep wrote: »
    GnomeTank wrote: »
    I would second the 5700XT or 2070 Super. Basically base your decision on whether you want RT hardware or not. If you can afford it, I would recommend it. Ray tracing is going to be a thing sooner than most people think. Cyberpunk is going to push it hard as a feature and both of the new consoles will have RT hardware. If money is tight the 5700XT will do just fine for 1440p.

    If ray tracing is that important, it will be worth waiting for the 2020 generation of cards for it; the 20 series just isn't competent enough at it to justify the price tag, and this will only get worse over time. It's still about raster per dollar right now, and right now the 5700XT is the better choice.

    Not everyone cares about "FPS per dollar" graphs. Some people just want the best card in their price range, and if the 2070 Super is in his price range it is a flat-out better card than the 5700XT. There is no argument to be had there. Not to mention the 2070 Super runs cooler, draws less power, and doesn't sound like a jet engine under load (the last part will be solved by AIB cards, we can hope).

    That doesn't even take into account overclocking, which you can do with the 2070 Super but have much less room to play with on the 5700XT, as it's frequency-locked from the factory. An overclocked 2070 Super competes with the vanilla 2080. It's not even in the same game as the 5700XT at that point.

    GnomeTank on
    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • Aridhol Daddliest Catch Registered User regular
    jgeis wrote: »
    Uh, I guess that MicroCenter has an additional $50 off the RX 5700XT when you buy a Ryzen 3600 or better CPU at the same time, in addition to $50 off motherboards.

    The 5700XT is currently sold out at my local store, but getting one for $350 seems like a really great deal even if the stock cooling solution isn't great.

    God damnit, I need to just drive down to the States and get these deals. That's insane.
    You get $30 off a mobo & CPU combo and then another $50 USD off the XT!?

  • Jeep-Eep Registered User regular
    edited July 2019
    Al_wat wrote: »
    So I guess the 9900k is going to remain the best gaming chip for the foreseeable future eh?

    I've been thinking about making an upgrade from my 7700k later this year. These new Ryzens had me tempted but they don't seem like the thing to go to for pure gaming.

    Only to a picayune degree, and that advantage is likely to be wiped out next console gen, which is next year; not to mention the frankly ridiculous price tag for something that might get worse as further security holes are found, plus you have to pay for an expensive cooler on top of it.

    Edit: There's only one 'gamer' use case that justifies that thing, and that's if you need the proprietary Intel instructions for prosumer work.

    Jeep-Eep on
    I would rather be accused of intransigence than tolerating genocide for the sake of everyone getting along. - @Metzger Meister
  • Aridhol Daddliest Catch Registered User regular
    wunderbar wrote: »
    Al_wat wrote: »
    So I guess the 9900k is going to remain the best gaming chip for the foreseeable future eh?

    I've been thinking about making an upgrade from my 7700k later this year. These new Ryzens had me tempted but they don't seem like the thing to go to for pure gaming.

    Honestly, I wouldn't bet on that. The 9900k will be better on existing games that are mostly single- or dual-threaded, but over the next 12-18 months you're going to see a lot more games coming out that can actually take advantage of 6+ cores. AMD parts will likely show better performance over time as that happens.

    The 9900k is 8 cores and 16 threads. Intel absolutely can make high-core-count chips; they just don't do it at the same price point.
    18 months on a game development cycle is not long. We're talking a few years minimum before the industry lurches into 8+ cores in the PC space.

    Remember, it's fine to do it for consoles since you can guarantee everyone has the hardware, but in the PC world the vast majority are still at 4 cores-ish.

    The fact is that high frequency is still important, and while AMD is closing the gap rapidly, it's not all the way at the top end yet.
    New Ryzen again in a year though, and not much revolutionary on the horizon from Intel.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    I still think counting Intel out is stupid. Yes they are in a slump right now, yes 10nm has been hard for them to spin up...but they still have some incredible strategic advantages, like owning all their own fabs and buckets of Daddy Wobucks money.

    It's great that AMD is competing, very exciting...but counting a company like Intel out is silly. Lest we forget the last time Intel was losing (the P4/Athlon days) and the hammer blow they dealt after that with Core 2 Duo and Core 2 Quad. It will be very interesting to see if Intel re-engages Intel Israel, which designed the P3 and Core architectures.

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • jgeis Registered User regular
    Aridhol wrote: »
    jgeis wrote: »
    Uh, I guess that MicroCenter has an additional $50 off the RX 5700XT when you buy a Ryzen 3600 or better CPU at the same time, in addition to $50 off motherboards.

    The 5700XT is currently sold out at my local store, but getting one for $350 seems like a really great deal even if the stock cooling solution isn't great.

    God damnit, I need to just drive down to the States and get these deals. That's insane.
    You get $30 off a mobo & CPU combo and then another $50 USD off the XT!?

    The full deal is:

    Pay full price for a Ryzen 3000 CPU
    Get $50 off one of the Navi cards
    Also get $50 off an X570 mobo
    Select (3200MHz?) DDR4 RAM is also $7 - $20 off when combined with a Ryzen 3000 CPU

  • Jeep-Eep Registered User regular
    GnomeTank wrote: »
    I still think counting Intel is stupid. Yes they are in a slump right now, yes 10nm has been hard for them to spin up...but they still have some incredible strategic advantages like owning all their own fab and buckets of Daddy Wobucks money.

    It's great that AMD is competing, very exciting...but counting a company like Intel out is silly. Lest we forget the last time Intel was losing (The P4/Athlon days) and the hammer blow they dealt after that with Core 2 Duo and Core 2 Quad. It will be very interesting to see if Intel reengages Intel Israel, which designed the P3 and Core architectures.

    I'm wondering if they're aiming at some kind of angle with Xe.

    I would rather be accused of intransigence than tolerating genocide for the sake of everyone getting along. - @Metzger Meister
  • Mugsley Delaware Registered User regular
    This is crazy talk, but I wonder if Intel says, "fuck it" and rolls out a firmware update that unlocks HT on the 9700k and other 'mid-tier' CPUs as a competitive hedge.

  • bowen How you doin'? Registered User regular
    edited July 2019
    Here's my part list if anyone wants to (in)sanity check me (CPU and motherboard are already ordered)

    bowen on
    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • Jeep-Eep Registered User regular
    bowen wrote: »
    Here's my part list if anyone wants to (in)sanity check me (CPU and motherboard are already ordered)

    Apparently the X570 Taichi has an issue with the design of the fan housing; wait for reviews of the other makes.

    I would rather be accused of intransigence than tolerating genocide for the sake of everyone getting along. - @Metzger Meister
  • bowen How you doin'? Registered User regular
    Jeep-Eep wrote: »
    bowen wrote: »
    Here's my part list if anyone wants to (in)sanity check me (CPU and motherboard are already ordered)

    Apparently the X570 Taichi has an issue with the design of the fan housing; wait for reviews of the other makes.

    Seems like they all do, Gigabyte as well.

    not a doctor, not a lawyer, examples I use may not be fully researched so don't take out of context plz, don't @ me
  • Aridhol Daddliest Catch Registered User regular
    bowen wrote: »
    Here's my part list if anyone wants to (in)sanity check me (CPU and motherboard are already ordered)

    'Tis a good build.
    I prefer EVGA PSUs just for the customer service, but that's a great tier-1 PSU anyways.
    The Taichi is their flagship board, so I'd expect it to be solid.
    The H700i is dreamy :)

  • Aridhol Daddliest Catch Registered User regular
    Tiny-ass proprietary fans on chipsets will always suck. It's a shitty place for a fan, blocked by a GPU or maybe drives, and because they're tiny they have to run at high RPM to be effective, which is noisy. It also means they probably wear out faster, and I can't wait to see if they're even replaceable...

    I thought we left chipset fans where they belonged, in 1998.

This discussion has been closed.