[PC Build Thread] Rumor has it there are GPU's in the wild


  • Brovid Hasselsmof [Growling historic on the fury road] Registered User regular
    PCIe Gen 3 NVMe SSDs do not need heatsinks.

    Samsung’s 960 Pros would like a word.

    Edit: to be clear, almost all *current* Gen 3 drives are fine. But a bunch of early ones had overheating problems.

    I think the one I've ordered is a Gen 3. Its full gibberish name is WD Blue SN550 1TB M.2 PCIe NVMe SSD. Basically, the cheapest 1TB one this supplier had.

    Tbh I don't even really think about stuff like cooling, which is dumb.

    I'm going to have a CPU cooler, my GPU has 2 fans, and my case has 2 fans. But the case is old; I wonder if I should replace the fans (if I even can)?

  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    Brovid Hasselsmof wrote: »
    I'm going to have a CPU cooler, my GPU has 2 fans, and my case has 2 fans. But the case is old; I wonder if I should replace the fans (if I even can)?

    You're going to want more than 2 fans in your case. Components get hot these days. At minimum, I'd say one at the back, and fill out your case's front mounts - usually three 120mm or two 140mm fans. If it comes with 2 fans, I'm guessing they're 120s, so grab another 2 to fill out the front.

  • Absalon Lands of Always Winter Registered User regular
    I have two monitors now and I thought I would set them up, but my motherboard and my 3080 each only provide one HDMI port. The screen plugged into the graphics card is detected, but not the one connected to the motherboard. What do? Should I get a DisplayPort cable so I can connect both screens to the card, or keep one on the motherboard? What gives the most fluid and smooth performance?

  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    Pretty sure the HDMI port on motherboards is for APU/integrated graphics video output. If you've got a discrete GPU, you'll need both monitors plugged into the video card.

    So from the sound of it, yeah: one will need to be DisplayPort and one HDMI.

  • Absalon Lands of Always Winter Registered User regular
    Fair enough, should be easily solved! I think only one monitor has HDR support; that one should get the DisplayPort cable, right?

    Yeah there we go!

  • Mugsley Delaware Registered User regular
    Just grab a few DP cables and be done. I'm speaking from experience.

  • Mugsley Delaware Registered User regular
    Cormac wrote: »
    I'm pretty sure my pc came with a heat sink for my ssd, but if I wanted to get another what brands are good?

    There should be a bunch out there. Just choose the one that looks best to you or matches well enough with your motherboard. There look to be some without any logos or branding too.

    This. It's literally a block of metal so branding doesn't matter.

  • BahamutZERO Registered User regular
    case arrived yesterday, guts should arrive in a few hours, vibrating

  • wunderbar What Have I Done? Registered User regular
    BahamutZERO wrote: »
    case arrived yesterday, guts should arrive in a few hours, vibrating

    I really hope the parts don't arrive vibrating. That would be weird.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    user wrote: »
    GnomeTank wrote: »
    I've never felt the need to undervolt my Ryzen CPUs. They don't go haywire even with PBO turned on.

    My Ampere GPUs are another thing entirely, especially my 3090. I'm not comfortable with my extremely expensive GPU spiking to 350+ W and pushing nearly 90C under max stress workloads. Especially not after we found out that Nvidia's reference power design can't handle some of the nastier spikes on the top tier chips. Nvidia has a very "Over current protection? What's that?" style of design around their power management and I'm not willing to risk it.

    In fairness, is there a game other than New World that's cooked 30*0 GPUs?

    Not to the scale of New World, no... but Buildzoid has done a bunch of testing with Ampere GPUs and it's almost a matter of luck that it hasn't happened in some other game. The OCP on the cards is laughably bad, with limits set so high that it's shocking more cards aren't popping.

  • V1m Registered User regular
    GnomeTank wrote: »
    user wrote: »
    GnomeTank wrote: »
    I've never felt the need to undervolt my Ryzen CPUs. They don't go haywire even with PBO turned on.

    My Ampere GPUs are another thing entirely, especially my 3090. I'm not comfortable with my extremely expensive GPU spiking to 350+ W and pushing nearly 90C under max stress workloads. Especially not after we found out that Nvidia's reference power design can't handle some of the nastier spikes on the top tier chips. Nvidia has a very "Over current protection? What's that?" style of design around their power management and I'm not willing to risk it.

    In fairness, is there a game other than New World that's cooked 30*0 GPUs?

    Not to the scale of New World, no... but Buildzoid has done a bunch of testing with Ampere GPUs and it's almost a matter of luck that it hasn't happened in some other game. The OCP on the cards is laughably bad, with limits set so high that it's shocking more cards aren't popping.

    And if you liked that, then you're gonna love Lovelace, with rumoured power consumption hiking up another 50% or so. RDNA3 is apparently better, but only a little.

  • Trajan45 Registered User regular
    GnomeTank wrote: »
    user wrote: »
    GnomeTank wrote: »
    I've never felt the need to undervolt my Ryzen CPUs. They don't go haywire even with PBO turned on.

    My Ampere GPUs are another thing entirely, especially my 3090. I'm not comfortable with my extremely expensive GPU spiking to 350+ W and pushing nearly 90C under max stress workloads. Especially not after we found out that Nvidia's reference power design can't handle some of the nastier spikes on the top tier chips. Nvidia has a very "Over current protection? What's that?" style of design around their power management and I'm not willing to risk it.

    In fairness, is there a game other than New World that's cooked 30*0 GPUs?

    Not to the scale of New World, no... but Buildzoid has done a bunch of testing with Ampere GPUs and it's almost a matter of luck that it hasn't happened in some other game. The OCP on the cards is laughably bad, with limits set so high that it's shocking more cards aren't popping.

    I really should work on undervolting mine. I've already lost one reference card.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    V1m wrote: »
    GnomeTank wrote: »
    user wrote: »
    GnomeTank wrote: »
    I've never felt the need to undervolt my Ryzen CPUs. They don't go haywire even with PBO turned on.

    My Ampere GPUs are another thing entirely, especially my 3090. I'm not comfortable with my extremely expensive GPU spiking to 350+ W and pushing nearly 90C under max stress workloads. Especially not after we found out that Nvidia's reference power design can't handle some of the nastier spikes on the top tier chips. Nvidia has a very "Over current protection? What's that?" style of design around their power management and I'm not willing to risk it.

    In fairness, is there a game other than New World that's cooked 30*0 GPUs?

    Not to the scale of New World, no... but Buildzoid has done a bunch of testing with Ampere GPUs and it's almost a matter of luck that it hasn't happened in some other game. The OCP on the cards is laughably bad, with limits set so high that it's shocking more cards aren't popping.

    And if you liked that, then you're gonna love Lovelace, with rumoured power consumption hiking up another 50% or so. RDNA3 is apparently better, but only a little.

    Yeah, we're reaching the point where the only way to make GPUs faster is to pack more transistors into a smaller space to do more math in parallel, and even with node shrinks that doesn't scale efficiently. We need a complete rethink of the way GPUs work or we're going to hit a power and heat wall. Rumors are the 4090 will use up to 500W of power, and as you said, RDNA3 won't be much better.

  • V1m Registered User regular
    "going to"?

    I'm hearing 600W peaks for the top Lovelace SKU

    Yes, others will be lower, but cards that 'only' need 400W or 450W aren't exactly good either.

    Of course it's all moot if miners buy 97% of the product.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    600W is what the new PCIe power connector can deliver, but every rumor I've seen says the cards will "only" draw about 500W of that.

  • Spoit *twitch twitch* Registered User regular
    Absalon wrote: »
    Fair enough, should be easily solved! I think only one monitor has HDR support; that one should get the DisplayPort cable, right?

    Yeah there we go!

    If the monitor has HDMI 2.1 (it probably doesn't), that actually has higher throughput than DP 1.4, but yeah, that only matters for 4K120 or 8K level stuff.

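    For the curious, a minimal back-of-the-envelope sketch of the link budget behind that point, using nominal spec figures and ignoring blanking overhead (real signals need somewhat more, which is part of why DP 1.4 leans on DSC for 4K120 at 10-bit):

    # Uncompressed bandwidth needed vs. what each link can carry,
    # using nominal spec numbers; blanking overhead is ignored.
    def gbps(w, h, hz, bpc):
        """Raw RGB video data rate in Gbit/s at bpc bits per channel."""
        return w * h * hz * bpc * 3 / 1e9

    dp14_payload = 4 * 8.1 * 8 / 10      # HBR3: 4 lanes x 8.1 Gbps, 8b/10b coding -> 25.92
    hdmi21_payload = 4 * 12.0 * 16 / 18  # FRL: 4 lanes x 12 Gbps, 16b/18b coding -> 42.67

    for label, bpc in (("4K120 8-bit", 8), ("4K120 10-bit HDR", 10)):
        need = gbps(3840, 2160, 120, bpc)
        print(f"{label}: needs {need:4.1f} Gbps; "
              f"fits DP 1.4: {need < dp14_payload}; fits HDMI 2.1: {need < hdmi21_payload}")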
  • Orca Also known as Espressosaurus Wrex Registered User regular
    GnomeTank wrote: »
    600W is what the new PCIe power connector can deliver, but every rumor I've seen says the cards will "only" draw about 500W of that.

    So you’re saying my 1000 watt PSU actually makes sense! Haha! It wasn’t overbuilding for silence, it was overbuilding because I’m installing a space heater in my computer case!

  • Orca Also known as Espressosaurus Wrex Registered User regular
    Real talk, that is completely bugfuck. 500 watts before talking about overclocking is absolutely nuts.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    I mean my 3090 will easily suck 350W from the wall if I let it. I absolutely do not let it at this point, but even with my good undervolt it can spike to 300W, especially at 4K. I have no idea why Ampere GPUs produce so much more heat and power draw when rendering native 4K, but it's absolutely a real thing. Reported GPU usage doesn't look any different - the game generally wasn't pegged at 120+ fps at 1440p/ultra anyway - but you lose frames going to 4K, gain no GPU usage, and the card goes bananas. I saw mine pull 360W and hit 90C the other day when I accidentally booted a game at 4K. To be clear, I was on the wrong Afterburner profile so my undervolt wasn't in play, but even when it is, 4K still causes the card to spin up to frown town levels. It's one of the primary reasons I just don't run 4K; I run everything at 1440p, aiming for a locked 120/144 depending on which display I'm using.

    Somewhat related: I found another game that makes my undervolt unstable - Cyberpunk with all the RT stuff on. My assumption is it's just not enough voltage for the GPU to stay stable when it ramps up all the RT hardware. I'll probably set up a different profile with a bit higher voltage at slightly lower clocks specifically for RT games.

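    If you want numbers rather than vibes while dialing in an undervolt, here is a minimal logging sketch. It assumes the nvidia-ml-py package (imported as pynvml) is installed; the one-second poll and the formatting are illustrative, not canonical:

    # Log power, temperature, and core clock while a game or
    # benchmark runs, to compare undervolt profiles side by side.
    import time
    import pynvml

    pynvml.nvmlInit()
    gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
    name = pynvml.nvmlDeviceGetName(gpu)
    if isinstance(name, bytes):                  # older bindings return bytes
        name = name.decode()
    print(f"Logging {name} - start your workload, Ctrl+C to stop")

    try:
        while True:
            watts = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0   # NVML reports milliwatts
            temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
            mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
            print(f"{watts:6.1f} W  {temp:3d} C  {mhz:4d} MHz")
            time.sleep(1)
    except KeyboardInterrupt:
        pass
    finally:
        pynvml.nvmlShutdown()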
  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    Got to say, I'm not super impressed with ray tracing. Not even talking about the obscene framerate hit you get; it's just not noticeable enough to make that hit worth it.

    Like, the reflections look nice in Control, but I'm not looking at them when I'm actually playing. So once the wonder wears off and you get back to playing the game, you might as well just turn it back off and get better framerates.

    Same with Doom Eternal - it makes the metal and edge reflections look nice, but it's not a game where you stand still looking at things, so I'd rather have the framerate back for smoother shooting.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    Reflections aren't the party piece; global illumination is. It's not always an in-your-face difference, but good GI looks so much cleaner, and more lifelike, than even the best rasterized lighting techniques. It's most noticeable in scenes with extreme lighting differences, like a dark room with sun streaming in a window, or a cave entrance. The best example right now is Metro Exodus, as it has both a GI and a non-GI version of the game available. There is absolutely no question the GI version looks much, much better. The issue is that you need a 3080+ right now to even consider running a modern game with pure GI, and it can be extremely noisy if the denoiser isn't good.

    A side effect is that it's also much easier for developers. It's much, much faster to build scenes: you just put a light down, set its properties, and simulated light physics does simulated light physics. No need to bake lights or build light maps; changes to a light happen in real time, immediately.

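    For reference, what real-time GI is approximating every frame is the rendering equation; baked lightmaps precompute this integral offline for static lights, while RT GI evaluates it live (noisily, hence the denoiser):

    L_o(x, \omega_o) = L_e(x, \omega_o) + \int_{\Omega} f_r(x, \omega_i, \omega_o) \, L_i(x, \omega_i) \, (\omega_i \cdot n) \, d\omega_i

    Here L_o is outgoing radiance at point x, L_e is emission, f_r is the surface's BRDF, and the integral gathers incoming light L_i over the hemisphere. The recursive L_i term is what makes it expensive, and it is exactly what baking short-circuits.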
  • 3cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    Even WoW with raytracing maxed is noticeably prettier in heavily lit areas.

  • Mugsley Delaware Registered User regular
    So basically we're in late Pentium territory, where they just jack up the power to try to progress because they're not sure what the next step should be.

    And now I'm clutching my 1080. So much for looking forward to a 4080 if I'm going to need an 800W PSU to drive it.

  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    I don't have Metro Exodus, so I went and watched some comparison videos of it.

    The way the levels are lit looks a bit nicer, yeah. But it didn't look '50% of your framerate' nicer - between the on and off scenes, his FPS went from 120+ to under 60. Still feels like we're a generation or so off with the hardware.

    edit - that video was done with a 2080. Might look for some with a 3000 or 6000 series card.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    We are definitely a generation or more off from the hardware being able to do GI at high frame rates. Hell, we're a generation or more from mid-tier and lower hardware being able to do it at consistently playable frame rates. The potential is there, though, and I think it's certainly where real-time rendering is headed.

    I think big AAA studios will put a lot of pressure on it to become the next thing in the coming years because of the production savings. I can't overstate how finicky lighting is in rasterized engines. Even with decent tools that do some of what real-time RT does as a pre-bake, it takes a lot of work with a lot of special, hand-placed primitives to "shape" the lighting.

    e: For comparisons, I think Digital Foundry has the best one. They break it down in pretty deep detail and show many of the scenes where it makes a significant difference in image quality:

    https://www.youtube.com/watch?v=NbpZCSf4_Yk

  • Spoit *twitch twitch* Registered User regular
    Given how weak the consoles are, I kinda doubt we get many RT-only games like Metro until the next console generation.

  • -Loki- Don't pee in my mouth and tell me it's raining. Registered User regular
    Spoit wrote: »
    Given how weak the consoles are, I kinda doubt we get many RT-only games like Metro until the next console generation.

    Yeah, this is a big one. Ray tracing was supposed to be the big thing for this console generation, and they're only managing it at 30fps unless they pull a lot of clever tricks like Insomniac did with Spider-Man to get ray tracing at 60fps.

    I could see a console mid-generation refresh boosting ray tracing so they can more easily manage 60fps, but developers won't be able to push it beyond what the base consoles manage at 30fps without risking alienating customers.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    It's not just hardware, it's also tooling. Things like Unreal Engine 5, and its generation of engines, will help bridge the gap by making it easier to offer a full GI pipeline on hardware that can support it while falling back on a rasterized pipeline when that makes sense. It's extremely cheap to add RT lights to a rasterized scene versus the other way around. The new generation of engines has extremely smart lighting systems that can use the right technique at the right time.

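    A toy sketch of that "right technique at the right time" idea; the Hardware type and the thresholds here are entirely hypothetical, not how any real engine is written:

    # Hypothetical capability-based choice of lighting path, purely
    # illustrative of the hybrid-pipeline idea described above.
    from dataclasses import dataclass

    @dataclass
    class Hardware:
        has_rt_cores: bool
        vram_gb: int

    def choose_lighting_path(hw: Hardware) -> str:
        if hw.has_rt_cores and hw.vram_gb >= 10:
            return "full realtime GI"
        if hw.has_rt_cores:
            # cheap to layer RT lights onto a rasterized scene...
            return "raster + RT effects"
        # ...while bolting raster onto an RT-only scene is not
        return "baked lightmaps"

    print(choose_lighting_path(Hardware(has_rt_cores=True, vram_gb=24)))   # full realtime GI
    print(choose_lighting_path(Hardware(has_rt_cores=False, vram_gb=8)))   # baked lightmaps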
  • BahamutZERO Registered User regular
    yay all the parts work, now I just need to figure out if I want to do a clean windows reinstall or not

  • BahamutZERO Registered User regular
    hahahah minecraft loads chunks so fast now

  • BahamutZERO Registered User regular
    Question: can you plug a USB 3.2 Gen 2 type-C front panel connector into a USB 3.2 Gen 1 type-C motherboard socket and have it work at the lower Gen 1 speeds?
    These things are generally backwards compatible, but I'm not sure if this would be a case of backwards compatibility or forwards compatibility.

  • Trajan45 Registered User regular
    Anyone have a link to a good tutorial on how to undervolt a 3080?

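    Not a tutorial, but a note while waiting for one: a true undervolt is done on the voltage/frequency curve in something like MSI Afterburner, which is GUI-only. As a scriptable stopgap you can cap board power from the command line; a minimal sketch, assuming NVIDIA's nvidia-smi tool is on PATH and you have admin rights - the 270W figure is just an illustrative number:

    # Cap the board power limit as a stopgap for taming spikes.
    # This is NOT an undervolt; it trades clocks for power directly.
    import subprocess

    TARGET_WATTS = 270   # illustrative cap; check your card's supported range

    # Print the card's current limits and supported power range first
    subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

    # Apply the cap (requires admin/root; typically resets on reboot)
    subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)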
  • tsmvengy Registered User regular
    BahamutZERO wrote: »
    Question: can you plug a USB 3.2 Gen 2 type-C front panel connector into a USB 3.2 Gen 1 type-C motherboard socket and have it work at the lower Gen 1 speeds?
    These things are generally backwards compatible, but I'm not sure if this would be a case of backwards compatibility or forwards compatibility.

    Yes, it should work at the slower speeds.

  • BahamutZERO Registered User regular
    tsmvengy wrote: »
    Question: can you plug a USB 3.2 Gen 2 type-C front panel connector into a USB 3.2 Gen 1 type-C motherboard socket and have it work at the lower Gen 1 speeds?
    These things are generally backwards compatible, but I'm not sure if this would be a case of backwards compatibility or forwards compatibility.

    Yes, it should work at the slower speeds.

    Yup, you're right - tested it and it works. It just didn't show up as connected in the BIOS, unlike the USB 2.0 and USB 3.0 connectors.

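    Since the USB-IF naming scheme causes exactly this kind of confusion, a small decoder sketch for the names in play here (signaling rates per the USB 3.2 spec; the negotiation line is the takeaway):

    # USB 3.2 branding vs. actual signaling rate, and why mixing
    # generations just negotiates down to the slower end.
    USB_SPEEDS_GBPS = {
        "USB 3.2 Gen 1": 5,     # same signaling as the old USB 3.0 / 3.1 Gen 1
        "USB 3.2 Gen 2": 10,    # same as USB 3.1 Gen 2
        "USB 3.2 Gen 2x2": 20,  # two 10 Gbps lanes, Type-C only
    }

    front_panel, header = "USB 3.2 Gen 2", "USB 3.2 Gen 1"
    link = min(USB_SPEEDS_GBPS[front_panel], USB_SPEEDS_GBPS[header])
    print(f"Negotiated link speed: {link} Gbps")   # 5 Gbps, the Gen 1 rate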
  • Gilgaron Registered User regular
    Orca wrote: »
    GnomeTank wrote: »
    600W is what the new PCIe power connector can deliver, but every rumor I've seen says the cards will "only" draw about 500W of that.

    So you’re saying my 1000 watt PSU actually makes sense! Haha! It wasn’t overbuilding for silence, it was overbuilding because I’m installing a space heater in my computer case!

    1000W... how much longer until we need a dedicated circuit from the breaker box for a gaming PC?
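    Shorter than you'd think, at least in North America. A rough worked sketch, assuming a common US 15 A / 120 V circuit and a ~90% efficient PSU (both assumptions; pick your own numbers):

    # How much wall power one breaker can actually feed a gaming PC.
    VOLTS, BREAKER_AMPS = 120, 15
    circuit_watts = VOLTS * BREAKER_AMPS     # 1800 W absolute limit
    continuous_watts = 0.8 * circuit_watts   # 1440 W under the NEC's 80% continuous-load rule

    psu_dc_watts, efficiency = 1000, 0.90    # e.g. a 1000 W unit at ~90% efficiency
    wall_watts = psu_dc_watts / efficiency   # ~1111 W drawn at full load

    print(f"Circuit: {circuit_watts} W peak, {continuous_watts:.0f} W continuous")
    print(f"PC at full PSU load pulls ~{wall_watts:.0f} W from the wall")
    print(f"Headroom for monitors/speakers/heater: {continuous_watts - wall_watts:.0f} W")

    By the same math, a 2400 W workstation can't live on a standard 15 A circuit at all, with or without roommates and space heaters.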

  • BahamutZERO Registered User regular
    I wonder how accurate the benchmarks on PassMark are for very old video cards, or if there's a floor below which it's just "bad, who cares." It's funny to think that 20 years ago I had a card that rates a 25 on there now; today I'm using a card that scores 400 times higher, and the highest-end cards benchmark 1000 times higher.

  • Thawmus +Jackface Registered User regular
    Gilgaron wrote: »
    Orca wrote: »
    GnomeTank wrote: »
    600W is what the new PCIe power connector can deliver, but every rumor I've seen says the cards will "only" draw about 500W of that.

    So you’re saying my 1000 watt PSU actually makes sense! Haha! It wasn’t overbuilding for silence, it was overbuilding because I’m installing a space heater in my computer case!

    1000W... how much longer until we need a dedicated circuit from the breaker box for a gaming PC?

    Honestly, I've already been doing this for a while, mostly because I got tired of the breaker tripping when other people in the house plugged space heaters and shit into the same circuit. I lost my NAS to that shit too - it was a bunch of 8 TB spinners from work. When we hired an electrician to put in some canned lights in our basement, I added "dedicated circuit for my PC and server" to the scope of work and got it done.

  • minor incident expert in a dying field nj Registered User regular
    You’d be surprised how many people we would get support calls from at my last job because they had us build them a 2400 watt workstation and then they kept tripping breakers and insisted it was our fault.

  • Gilgaron Registered User regular
    Thawmus wrote: »
    Gilgaron wrote: »
    Orca wrote: »
    GnomeTank wrote: »
    600W is what the new PCIe power connector can deliver, but every rumor I've seen says the cards will "only" draw about 500W of that.

    So you’re saying my 1000 watt PSU actually makes sense! Haha! It wasn’t overbuilding for silence, it was overbuilding because I’m installing a space heater in my computer case!

    1000W... how much longer until we need a dedicated circuit from the breaker box for a gaming PC?

    Honestly, I've already been doing this for a while, mostly because I got tired of the breaker tripping when other people in the house plugged space heaters and shit into the same circuit. I lost my NAS to that shit too - it was a bunch of 8 TB spinners from work. When we hired an electrician to put in some canned lights in our basement, I added "dedicated circuit for my PC and server" to the scope of work and got it done.

    I just finished my basement, and while this had me second-guessing myself for not doing that for the home office beforehand, I did put in conduit and left enough unfinished space below the office that I shouldn't have to unplug the dryer to game in 10 years. Your NAS issue makes me think about how I've been procrastinating on putting my NAS on a mini UPS...

This discussion has been closed.