
[PC Build Thread] Keep your human antivirus up to date


Posts

  • Synthesis Honda Today! Registered User regular
    3clipse wrote: »
    Yeah I mean I am definitely not trying to say "you can't see the difference above 60 frames!!!!" just I personally haven't noticed enough of a difference with very high framerates that it's worth it to me to pay an extra $900 for.

    Some people can't. What this generally means is that if they stare at relevant content for several minutes and have the opportunity to make comparisons, they will eventually tell the difference, but that isn't actually how people play video games.

    Personally, I can't really see anything over 150 Hz, and definitely not over 200 Hz. It literally, without exaggeration, looks the same to me (as far as the scant few games that actually support those framerates properly, like CS:GO, are concerned). I'll never be a pro CS:GO player, boohoo. But more broadly, I'm pretty sure everyone has a point of diminishing returns, and I suspect it's well below 200 Hz for a lot of people, if not most. Of course, I can only speak for my own eyes, though better than any other human being can for them.

    Everyone here is discussing building PCs, which strongly suggests everyone here is old enough to remember people who would say, "You can't see the difference over 720p at this distance! 1080p isn't a real thing!" Except "you" actually meant "I"; they just didn't care to admit it. It's comparable to that. Even with VRR as a fairly standard feature on monitors and even televisions (Samsung ones, anyway), I still get kind of annoyed by a framerate sliding between 60-70 and 120-130. It's distracting for me personally. I'd rather stick with a consistent 60, or even a consistent 50, but I also wear glasses and stare at monitors all day at my job.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    edited September 2020
    Elimination of tearing without capping the refresh rate is, I think, the big win of VRR. It also definitely smooths out the image when your FPS drops below a certain level, provided the VRR panel you have has a large enough refresh range. Some cheapo panels have a very narrow range in which VRR will work.
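To make the refresh-range point concrete, here's a toy sketch of when a VRR panel can actually match a given frame rate. The helper name and all the numbers are made up for illustration (not any real panel's spec); the one real idea is Low Framerate Compensation (LFC), where decent panels repeat frames to multiply a too-low frame rate back into the supported window.

```python
def vrr_effective(fps: float, range_min: float, range_max: float) -> bool:
    """Return True if the panel can match its refresh to this frame rate.

    Below range_min, good panels use LFC: they repeat frames so the
    effective refresh (a multiple of fps) lands inside the window.
    """
    if fps > range_max:
        return False  # frames arrive faster than the panel can refresh
    if fps >= range_min:
        return True   # directly inside the VRR window
    # Try LFC: step up by whole multiples of fps until inside the window
    multiple = fps
    while multiple < range_min:
        multiple += fps
    return multiple <= range_max

# A narrow "cheapo" 48-60 Hz window can't rescue 45 fps (2x45 overshoots):
print(vrr_effective(45, 48, 60))   # False
# A wide 40-144 Hz panel covers 45 fps directly and 30 fps via LFC:
print(vrr_effective(45, 40, 144))  # True
print(vrr_effective(30, 40, 144))  # True (LFC doubles it to 60)
```

This is why the width of the range matters more than the endpoints: LFC only works at all when the max is at least double the min.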

    Sagroth wrote: »
    Oh c'mon FyreWulff, no one's gonna pay to visit Uranus.
    Steam: Brainling, XBL / PSN: GnomeTank, NintendoID: Brainling, FF14: Zillius Rosh SFV: Brainling
  • cardboard delusions USAgent PSN: USAgent31 Registered User regular
    GnomeTank wrote: »
    Yes, the Ryzen 3000 series CPUs are warm. My 3900X regularly hovers at 40-50C during regular usage, with an AIO in play. My 3950X runs a bit cooler because it's a better-binned chip, and is on a 360 AIO instead of a 240...but it still runs much hotter on average than my 8700K did. Of course, in both cases, it's quite literally double (or better) the cores. They just run warm. I've adjusted my mental model to it and don't freak out over the temps anymore.

    Re: the Nvidia cooler design and blowing hot air at your CPU. It shouldn't be that bad provided you have exhaust built into your airflow. The back fan that blows up passes over the heatpipes, but not directly over the die area like the forward fan does (the one that acts as direct GPU exhaust). I doubt it causes much issue for people who already had proper airflow in their case. Also a total non-issue if you're running an AIO. Remember that all dual- and triple-axial cards today basically just exhaust all the GPU waste heat into your case. They may not blow directly at your CPU, but all the waste heat of those cards ends up having to be moved out by your exhaust fans anyway.

    Definitely swaying me toward an FE version, especially if the AIB ones are $100 more.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    GnomeTank wrote: »
    Yes, the Ryzen 3000 series CPUs are warm. My 3900X regularly hovers at 40-50C during regular usage, with an AIO in play. My 3950X runs a bit cooler because it's a better-binned chip, and is on a 360 AIO instead of a 240...but it still runs much hotter on average than my 8700K did. Of course, in both cases, it's quite literally double (or better) the cores. They just run warm. I've adjusted my mental model to it and don't freak out over the temps anymore.

    Re: the Nvidia cooler design and blowing hot air at your CPU. It shouldn't be that bad provided you have exhaust built into your airflow. The back fan that blows up passes over the heatpipes, but not directly over the die area like the forward fan does (the one that acts as direct GPU exhaust). I doubt it causes much issue for people who already had proper airflow in their case. Also a total non-issue if you're running an AIO. Remember that all dual- and triple-axial cards today basically just exhaust all the GPU waste heat into your case. They may not blow directly at your CPU, but all the waste heat of those cards ends up having to be moved out by your exhaust fans anyway.

    Either way it's a talking point so it'll probably get tested by all the usual suspects.

    Unless it's blowing like 100C air, I don't think it's going to be a huge deal.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    GN brought up an interesting point wrt the Tiger Lake release

    Intel talked more about AMD's 4800U than they did about their own product and took a ton of petty jabs (calling AMD an imitator, for instance).

    Nvidia never mentioned anything other than their own.

    And don't get me wrong, AMD's marketing and releases are usually filled to the brim with petty cheap shots too. It's just fucking sad that Nvidia is being the grownup in the room.

  • Orca Also known as Espressosaurus Wrex Registered User regular
    I'm skeptical that anyone can see a difference between 120 Hz and 240 Hz. I can believe that the reduction in latency necessitated by 240 Hz is noticeable.

    But that's a distinction without difference in this context. :)
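For what it's worth, the raw numbers behind the latency side of that argument are simple arithmetic (nothing vendor-specific here): a display refreshes once every 1000/Hz milliseconds, so each doubling of the refresh rate buys half as much latency as the previous one.

```python
# Frame times at common refresh rates: one refresh every 1000/Hz milliseconds.

def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# Going 60 -> 120 Hz shaves ~8.3 ms off worst-case display latency;
# 120 -> 240 Hz shaves only ~4.2 ms more.
print(f"{frame_time_ms(60) - frame_time_ms(120):.2f} ms saved at 120 Hz")
print(f"{frame_time_ms(120) - frame_time_ms(240):.2f} ms saved at 240 Hz")
```

So even if the extra smoothness is invisible, a few milliseconds of input-to-photon latency remain on the table, which is the distinction being drawn above.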

  • Dixon Screwed...possibly doomed Canada Registered User regular
    I can def see a difference between 120 and 240, but it's hard to quantify. 120 was on my TV and 240 was on a buddy's monitor he brought over to game with. It's definitely silky-smooth movement comparatively. I'm not sure if the TV plays into that, though. He did have G-Sync and I did not, so maybe it's that?

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Orca wrote: »
    I'm skeptical that anyone can see a difference between 120 Hz and 240 Hz. I can believe that the reduction in latency necessitated by 240 Hz is noticeable.

    But that's a distinction without difference in this context. :)

    I think LTT did a test on this and found that while people couldn't tell the difference, their actual gameplay results tended to be better on higher refresh rates.

    https://www.youtube.com/watch?v=tV8P6T5tTYs

  • GnomeTank What the what? Portland, Oregon Registered User regular
    I think that's just a sign of the current state of the market. Nvidia is supremely confident they have a winner in the 3080. They just don't have a need to try and cheap shot at AMD right now. Until RDNA 2 is a real thing people can buy the 3080 and 3070 are likely to dominate the hardware news cycle, especially if independent benchmarking verifies Nvidia's claims about how good the cards are.

  • Mvrck Dwarven Mountainhome Registered User regular
    GnomeTank wrote: »
    I think that's just a sign of the current state of the market. Nvidia is supremely confident they have a winner in the 3080. They just don't have a need to try and cheap shot at AMD right now. Until RDNA 2 is a real thing people can buy the 3080 and 3070 are likely to dominate the hardware news cycle, especially if independent benchmarking verifies Nvidia's claims about how good the cards are.

    AMD may really be dead in the water GPU-wise if Nvidia releases a 3060 under $400 that can compete with the 2080 Super.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    GnomeTank wrote: »
    I think that's just a sign of the current state of the market. Nvidia is supremely confident they have a winner in the 3080. They just don't have a need to try and cheap shot at AMD right now. Until RDNA 2 is a real thing people can buy the 3080 and 3070 are likely to dominate the hardware news cycle, especially if independent benchmarking verifies Nvidia's claims about how good the cards are.

    Yeah, they took a ton of cheap shots with the 2000 series launch.

    Just another thing to add to the pile of how bad the 2000 series kinda was, especially in retrospect.

  • 3cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    LTT did a video yesterday where he made an interesting point I hadn't considered: the pricing on the 3xxx cards isn't competing with AMD's PC cards, it's competing with AMD's console GPUs in the PS5 and Xbox Series X, and trying to give people a really compelling reason to stay on PC, where Nvidia dominates, rather than migrate fully to consoles, where AMD dominates.

    This makes reasonable sense to me.

  • Orca Also known as Espressosaurus Wrex Registered User regular
    Orca wrote: »
    I'm skeptical that anyone can see a difference between 120 Hz and 240 Hz. I can believe that the reduction in latency necessitated by 240 Hz is noticeable.

    But that's a distinction without difference in this context. :)

    I think LTT did a test on this and found that while people couldn't tell the difference, their actual gameplay results tended to be better on higher refresh rates.

    https://www.youtube.com/watch?v=tV8P6T5tTYs

    I can buy the small improvement in latency and maybe physics updates and the like improving things, but I'm highly skeptical of the actual display rate mattering. I've done tests on 120 Hz displays with 0 persistence and above around 100 Hz literally no one could tell the difference. Granted, small N test population and not testing in competitive gaming, but anything below 90 Hz bothers me so I'm a bit of an outlier to begin with.

    I'd be curious how that display technology would work in this context...

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Orca wrote: »
    Orca wrote: »
    I'm skeptical that anyone can see a difference between 120 Hz and 240 Hz. I can believe that the reduction in latency necessitated by 240 Hz is noticeable.

    But that's a distinction without difference in this context. :)

    I think LTT did a test on this and found that while people couldn't tell the difference, their actual gameplay results tended to be better on higher refresh rates.

    https://www.youtube.com/watch?v=tV8P6T5tTYs

    I can buy the small improvement in latency and maybe physics updates and the like improving things, but I'm highly skeptical of the actual display rate mattering. I've done tests on 120 Hz displays with 0 persistence and above around 100 Hz literally no one could tell the difference. Granted, small N test population and not testing in competitive gaming, but anything below 90 Hz bothers me so I'm a bit of an outlier to begin with.

    I'd be curious how that display technology would work in this context...

    I will say tho

    If high refresh rate panels come along and have zero drawbacks and no cost penalty, then it's a part of the monitor I don't really care about

  • A duck! Moderator, ClubPA mod
    3clipse wrote: »
    LTT did a video yesterday where he made an interesting point I hadn't considered: the pricing on the 3xxx cards isn't competing with AMD's PC cards, it's competing with AMD's console GPUs in the PS5 and Xbox Series X, and trying to give people a really compelling reason to stay on PC, where Nvidia dominates, rather than migrate fully to consoles, where AMD dominates.

    This makes reasonable sense to me.

    Yeah, I thought this when I saw the 3070 prices. If you have any kind of modern CPU you can spend $???.?? on a console, or you can spend $500 on a GPU and play games at really primo settings.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    3clipse wrote: »
    LTT did a video yesterday where he made an interesting point I hadn't considered: the pricing on the 3xxx cards isn't competing with AMD's PC cards, it's competing with AMD's console GPUs in the PS5 and Xbox Series X, and trying to give people a really compelling reason to stay on PC, where Nvidia dominates, rather than migrate fully to consoles, where AMD dominates.

    This makes reasonable sense to me.

    Yeah, I thought it was a pretty insightful comment. Not that "Big Navi" won't be good, it likely will be very good and great value, just that it's likely not Nvidia's primary competitive thought right now.

    My addition to the thought line would be that it's also a funny symbiotic relationship. Nvidia kind of needs the consoles to start pushing some of this tech to the masses, especially around RT and the fancy storage tech. The more the average Joe Gamer becomes accustomed to that being a thing, and the more that bleeds into the PC space, the more Nvidia's (perceived) lead in those areas on the PC side helps them sell GPUs.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    edited September 2020
    I'm hesitant on Big Navi

    I got a little wrapped up in the surprise pricing of RTX 3000, and certain industry "insiders" saying it was going to be good, and I forgot that this is the song and dance we've gotten for years from AMD's graphics division: how Polaris and Vega and even the Radeon VII would be the moment AMD really socks it to Nvidia.

    Their CPU division has shown that when they have something above and beyond, they do not hide it. We got Ryzen 3000 leaks in December 2018 that turned out to be fairly accurate in July 2019. In fact, these "leaks" seem to be industry standard these days.

    AMD's GPU division hasn't shown shit except for a thinkmoji. I really really believe that if they had something that would truly compete we would've had a "leak" ahead or just after the Nvidia reveal to try and throw water on it.

    Because right now, the industry is all about RTX 3000.

  • Thawmus +Jackface Registered User regular
    I'm holding out for AMD's stuff. The 3070 pricing is pretty good, but I don't want to get one and then have to decide between installing Windows and losing out on RTX features.

    Plus I still think the majority of my build money needs to be on a 4xxx series Ryzen. Guess we'll see! Gonna be hard to keep biting my nails and waiting the next couple months while you guys come in here and let us know the RTX 3000 series is out of stock everywhere.

    Twitch: Thawmus83
  • Stabbity Style He/Him | Warning: Mothership Reporting Kennewick, WA Registered User regular
    edited September 2020
    I'm waffling between waiting for the card comparisons that show how good their cooling solutions are or just jumping in on an FE card first chance because I'm fairly certain they're going to run out of stock almost immediately.

    [image attachment]

    Seems like it should be good in theory. And I don't think I've ever heard them talk up their stock coolers like this before. So maybe they'll be competitive with custom coolers?

    Steam: stabbitystyle | XBL: Stabbity Style | PSN: Stabbity_Style | Twitch: stabbitystyle
  • Dixon Screwed...possibly doomed Canada Registered User regular
    So if I want a 3080 on the first day, what's the best bet overall?

    An FE direct from Nvidia? Does that mean being up at midnight on the 17th, just refreshing until the page shows up on nvidia.com?

    Does me being in Canada change that? I've never ordered direct through Nvidia before.

  • Mulletude Registered User regular
    I'm hesitant on Big Navi

    I got a little wrapped up in the surprise pricing of RTX 3000, and certain industry "insiders" saying it was going to be good, and I forgot that this is the song and dance we've gotten for years from AMD's graphics division: how Polaris and Vega and even the Radeon VII would be the moment AMD really socks it to Nvidia.

    Their CPU division has shown that when they have something above and beyond, they do not hide it. We got Ryzen 3000 leaks in December 2018 that turned out to be fairly accurate in July 2019. In fact, these "leaks" seem to be industry standard these days.

    AMD's GPU division hasn't shown shit except for a thinkmoji. I really really believe that if they had something that would truly compete we would've had a "leak" ahead or just after the Nvidia reveal to try and throw water on it.

    Because right now, the industry is all about RTX 3000.

    Agreed. If Amd had a true competitor they'd want that info out to undercut Nvidia imo.

    XBL-Dug Danger WiiU-DugDanger Steam-http://steamcommunity.com/id/DugDanger/
  • V1m Registered User regular
    V1m wrote: »
    Past me did Just Got Home From Holiday today me a huge solid by making sure there were a few Staropramens and a bottle of white burgundy in the fridge ready for when I got back and IN A TOTALLY UNRELATED EVENT I just blew ~£170 on a fancy-dan PSU

    New PSU arrived and installed (it actually arrived yesterday but my flat is a huge mess and I could not find my screwdriver set to save my life).

    It is quiet. Really quiet. In fact so quiet that I can now easily hear the fan on my GPU! Seriously though, I feel like the investment in the Noctua is now a lot more worthwhile.

    Also, god damn, Corsair don't stint on the cables. If you think you're going to want up to 16 SATA devices in your PC, this PSU kit has you covered. The cables were rather longer than I required and I have a janky old PC case, so no pictures of admirably neat cabling will be provided. Just google some pictures of how many tapeworms baleen whales carry if you want a feel for how it looks.

  • Soggybiscuit Tandem Electrostatic Accelerator Registered User regular
    AMD is definitely in a more complex situation than Nvidia here: they have more product lines to supply (CPUs, GPUs, APUs, etc.) and more variables to consider. I wouldn't treat the lack of a counter-leak as an indication they can't compete. We just won't know until everything has been released.

    Steam - Synthetic Violence | XBOX Live - Cannonfuse | PSN - CastleBravo | Twitch - SoggybiscuitPA
  • V1m Registered User regular
    Mulletude wrote: »
    I'm hesitant on Big Navi

    I got a little wrapped up in the surprise pricing of RTX 3000, and certain industry "insiders" saying it was going to be good, and I forgot that this is the song and dance we've gotten for years from AMD's graphics division: how Polaris and Vega and even the Radeon VII would be the moment AMD really socks it to Nvidia.

    Their CPU division has shown that when they have something above and beyond, they do not hide it. We got Ryzen 3000 leaks in December 2018 that turned out to be fairly accurate in July 2019. In fact, these "leaks" seem to be industry standard these days.

    AMD's GPU division hasn't shown shit except for a thinkmoji. I really really believe that if they had something that would truly compete we would've had a "leak" ahead or just after the Nvidia reveal to try and throw water on it.

    Because right now, the industry is all about RTX 3000.

    Agreed. If Amd had a true competitor they'd want that info out to undercut Nvidia imo.

    AMD's "true" competition will be launched when MS and Sony say so.

    No one is expecting anything to match the 3090, but it's definitely within the realms of possibility that they can produce an 80 CU Navi2 that will at least be in the same ballpark as the 3080.

    Naturally, if and when they do, Nvidia will announce the 16GB 3070 Super and 20GB 3080 Super.

    One thing to keep in mind: AMD make CPUs, APUs and GPUs: discrete GPUs are the smallest part of their business. Nvidia only make GPUs, and they cannot and will not allow AMD or anyone else to have the best GPUs available, because if they don't have the best GPUs, they don't have anything.

    Nvidia are willing to make gigantic GPU dies in order to stay in front if they have to: the 2000 series are huge chunks of silicon, even taking into account the older process. AMD really want to make smaller dies, because every square millimeter of TSMC silicon that goes into making a GPU die is a square millimeter that's not being used to make highly profitable CPUs. Nvidia will always have an advantage in their willingness to throw as many transistors as it takes into the ante.

  • Inquisitor77 2 x Penny Arcade Fight Club Champion A fixed point in space and time Registered User regular
    Orca wrote: »
    Orca wrote: »
    I'm skeptical that anyone can see a difference between 120 Hz and 240 Hz. I can believe that the reduction in latency necessitated by 240 Hz is noticeable.

    But that's a distinction without difference in this context. :)

    I think LTT did a test on this and found that while people couldn't tell the difference, their actual gameplay results tended to be better on higher refresh rates.

    I can buy the small improvement in latency and maybe physics updates and the like improving things, but I'm highly skeptical of the actual display rate mattering. I've done tests on 120 Hz displays with 0 persistence and above around 100 Hz literally no one could tell the difference. Granted, small N test population and not testing in competitive gaming, but anything below 90 Hz bothers me so I'm a bit of an outlier to begin with.

    I'd be curious how that display technology would work in this context...

    I will say tho

    If high refresh rate panels come along and have zero drawbacks and no cost penalty, then it's a part of the monitor I don't really care about

    The experiment in this video was done so poorly and so broadly that it's hard to get any meaningful interpretation of the results. Not only was the population size essentially anecdotal, but each person also did 2 tests at each refresh rate in the exact same order followed by a randomized run of 4 tests. Even if all of those tests were set to a random refresh rate, it would barely pass the threshold for significance (if at all).

    I would not be surprised if, given how they described the experiment, the reason people did better at higher refresh rates was simply because those runs always came later, and the subject was simply more practiced in the actual test. This would be supported by the fact that subject performance improved far beyond anything that could mathematically be explained by mere refresh rate improvement.
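That ordering confound is easy to demonstrate with a toy simulation (all numbers invented for illustration): give simulated subjects a flat practice gain per run and zero actual refresh-rate effect, run the refresh rates in a fixed low-to-high order, and the high-refresh runs still come out on top.

```python
import random

random.seed(0)

def simulate_subject(order):
    """Simulated scores: baseline 50, +2 points of practice per run, plus
    noise. Refresh rate has NO effect at all in this model."""
    return [(hz, 50 + 2 * i + random.gauss(0, 1)) for i, hz in enumerate(order)]

# Fixed ordering like the criticized test: low refresh rates always first.
fixed_order = [60, 60, 144, 144, 240, 240]

scores = {60: [], 144: [], 240: []}
for _ in range(20):  # 20 simulated subjects
    for hz, score in simulate_subject(fixed_order):
        scores[hz].append(score)

means = {hz: sum(s) / len(s) for hz, s in scores.items()}
print(means)
# The 240 Hz runs score highest purely because they were always practiced last.
```

Randomizing the run order (or counterbalancing it across subjects) is exactly what separates a refresh-rate effect from a practice effect, which is the objection being made above.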

  • Six Caches Tweets in the mainframe cyberhex Registered User regular
    PC thread, I come seeking advice.

    My current PC was built from scratch a few years back and is rocking a 970. I don’t need a new card but what the Hell, these new Nvidia cards sound awesome and I’ve got Gamepass now and I’m in. So I’ll probably get a 3080 when they’re readily available.

    The question, then, is do I need a new CPU? And if so, a new motherboard? Am I looking at a full PC build or will I see enough benefit out of just the new GPU for it to be worthwhile?

    can you feel the struggle within?
  • Stabbity Style He/Him | Warning: Mothership Reporting Kennewick, WA Registered User regular
    edited September 2020
    Six wrote: »
    PC thread, I come seeking advice.

    My current PC was built from scratch a few years back and is rocking a 970. I don’t need a new card but what the Hell, these new Nvidia cards sound awesome and I’ve got Gamepass now and I’m in. So I’ll probably get a 3080 when they’re readily available.

    The question, then, is do I need a new CPU? And if so, a new motherboard? Am I looking at a full PC build or will I see enough benefit out of just the new GPU for it to be worthwhile?

    What are your CPU and motherboard? Or really just CPU.

  • Six Caches Tweets in the mainframe cyberhex Registered User regular
    Six wrote: »
    PC thread, I come seeking advice.

    My current PC was built from scratch a few years back and is rocking a 970. I don’t need a new card but what the Hell, these new Nvidia cards sound awesome and I’ve got Gamepass now and I’m in. So I’ll probably get a 3080 when they’re readily available.

    The question, then, is do I need a new CPU? And if so, a new motherboard? Am I looking at a full PC build or will I see enough benefit out of just the new GPU for it to be worthwhile?

    What are your CPU and motherboard? Or really just CPU.

    Weird - I had that written in there and now I feel like I’m in a different dimension.

    It’s an i5 4690K on a Z97A motherboard.

  • Stabbity Style He/Him | Warning: Mothership Reporting Kennewick, WA Registered User regular
    edited September 2020
    Six wrote: »
    Six wrote: »
    PC thread, I come seeking advice.

    My current PC was built from scratch a few years back and is rocking a 970. I don’t need a new card but what the Hell, these new Nvidia cards sound awesome and I’ve got Gamepass now and I’m in. So I’ll probably get a 3080 when they’re readily available.

    The question, then, is do I need a new CPU? And if so, a new motherboard? Am I looking at a full PC build or will I see enough benefit out of just the new GPU for it to be worthwhile?

    What are your CPU and motherboard? Or really just CPU.

    Weird - I had that written in there and now I feel like I’m in a different dimension.

    It’s an i5 4690K on a Z97A motherboard.

    You're in luck, Gamers Nexus made a video just for you!

    https://www.youtube.com/watch?v=D6RsDyMn2gY

    In short, you probably wanna upgrade.

  • V1m Registered User regular
    Six wrote: »
    Six wrote: »
    PC thread, I come seeking advice.

    My current PC was built from scratch a few years back and is rocking a 970. I don’t need a new card but what the Hell, these new Nvidia cards sound awesome and I’ve got Gamepass now and I’m in. So I’ll probably get a 3080 when they’re readily available.

    The question, then, is do I need a new CPU? And if so, a new motherboard? Am I looking at a full PC build or will I see enough benefit out of just the new GPU for it to be worthwhile?

    What are your CPU and motherboard? Or really just CPU.

    Weird - I had that written in there and now I feel like I’m in a different dimension.

    It’s an i5 4690K on a Z97A motherboard.

    A quick internet search indicates that they support PCIe 3.0, so as long as your PSU will deliver the watts, the motherboard should be fine.

    A 4c/4t CPU though... on DDR3? With 6MB of L3 cache? I think you'll notice a significant difference if you upgrade.

    Old Intel CPUs always seem to be absurdly expensive, and while you might get lucky on an eBay deal to upgrade your CPU power by 20-30%, you could also look at getting a Zen 3 and some surprisingly cheap DDR4 on a B550 board (for rather less than even a 3070 will cost), which would upgrade your CPU power by about 200%. And also give you a PCIe 4.0 connection for your swanky new GPU.

  • JusticeforPluto Registered User regular
    All the hype is going to the 3080. I understand why most aren't talking about the 3090, but I haven't heard much talk about the 3070.

  • Orca Also known as Espressosaurus Wrex Registered User regular
    With a 4690K you're definitely going to be CPU limited with whatever you buy. It's time to replace the whole system; it has served admirably.

  • GnomeTank What the what? Portland, Oregon Registered User regular
    All the hype is going to the 3080. I understand why most aren't talking about the 3090, but I haven't heard much talk about the 3070.

    I think you're getting a little confirmation bias from this thread, where almost all of us that post a lot have pretty high end boxes and are ogling the 3080 as the new flagship...but across the internet at large I see a ton of hype for the 3070.
    Dixon wrote: »
    So if I want a 3080 on the first day, what's the best bet overall?

    An FE direct from Nvidia? Does that mean being up at midnight on the 17th, just refreshing until the page shows up on nvidia.com?

    Does me being in Canada change that? I've never ordered direct through Nvidia before.

    We don't know. Neither Nvidia nor their partners have said much about pre-order times, when things go live, etc. My guess is Nvidia is trying to keep scalpers on the back foot by being coy, but I'm not sure how much it will actually help.

  • 3cl1ps3 I will build a labyrinth to house the cheese Registered User regular
    I think your best bet is once the product pages go live and dates/times are announced, plan to be there pretty much at that time and f5 as hard as you can and try your luck. This is looking to be a fairly gnarly launch.

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Shopping online is simultaneously the best and absolute fucking worst

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    All the hype is going to the 3080. I understand why most aren't talking about the 3090, but I haven't heard much talk about the 3070.

    I mean, I'm all about the 3070.

    From where I'm sitting (1440p, 144 Hz, HDR) it's a perfect 60+ fps card for $500.

  • Mvrck Dwarven Mountainhome Registered User regular
    3070 is quite probably the best value per dollar card announced in several generations across every price point. It is an insanely good card. But some of our more active posters are in some niche use cases so things like the 2080ti and now the 3090 do get an outsized focus (Gnome with his VR, me with being into both high end streaming, and doing extensive 4k video editing for work - which has mostly moved to my home computer now, so even more pertinent).

  • Stabbity Style He/Him | Warning: Mothership Reporting Kennewick, WA Registered User regular
    Mvrck wrote: »
    3070 is quite probably the best value per dollar card announced in several generations across every price point. It is an insanely good card. But some of our more active posters are in some niche use cases so things like the 2080ti and now the 3090 do get an outsized focus (Gnome with his VR, me with being into both high end streaming, and doing extensive 4k video editing for work - which has mostly moved to my home computer now, so even more pertinent).

    Since the 970, which was also an insanely good value (that ended up being somewhat less because of the 3.5GB thing, but still good).

  • Six Caches Tweets in the mainframe cyberhex Registered User regular
    Cool. New PC builds are always fun. Maybe I can eBay the old MB/CPU/GPU/RAM

  • jungleroomx It's never too many graves, it's always not enough shovels Registered User regular
    Mvrck wrote: »
    3070 is quite probably the best value per dollar card announced in several generations across every price point. It is an insanely good card. But some of our more active posters are in some niche use cases so things like the 2080ti and now the 3090 do get an outsized focus (Gnome with his VR, me with being into both high end streaming, and doing extensive 4k video editing for work - which has mostly moved to my home computer now, so even more pertinent).

    Since the 970, which was also an insanely good value (that ended up being somewhat less because of the 3.5GB thing, but still good).

    I went from the 970 to the 2060, which was a nice bump, but I was always disappointed with how the 2060 handled games of the time versus how the 970 ate through them with aplomb.

    I should've known it would be similar to how I got the 6600 back in the day and was a little disappointed in how it ran, while the 8800 was an absolute monster.

This discussion has been closed.